WorldWideScience

Sample records for performance requires complex

  1. State analysis requirements database for engineering complex embedded systems

    Science.gov (United States)

    Bennett, Matthew B.; Rasmussen, Robert D.; Ingham, Michel D.

    2004-01-01

It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must translate requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering tool called the State Analysis Database, which captures system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using the State Analysis Database.
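The abstract does not describe the database's actual schema, but the core State Analysis idea of replacing free-text requirements with explicit models of state variables can be sketched roughly as follows. Every class, field, and value here is hypothetical, invented for illustration; it is not the real State Analysis Database structure:

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a requirement is recorded as an explicit model of a
# state variable plus the evidence used to estimate it, rather than as prose.
@dataclass
class StateVariable:
    name: str
    units: str

@dataclass
class StateRequirement:
    state: StateVariable              # the physical state being controlled
    desired_behavior: str             # model of the intended state evolution
    evidence: list = field(default_factory=list)  # measurements estimating the state

db = []
tank_pressure = StateVariable("propellant_tank_pressure", "kPa")
db.append(StateRequirement(
    tank_pressure,
    "maintain between 180 and 220 kPa during cruise",
    evidence=["pressure_sensor_A", "pressure_sensor_B"],
))

# Systems and software engineers read the same explicit model, narrowing the
# requirements-to-code interpretation gap the abstract describes.
```

Because the requirement is structured data rather than prose, a software engineer can query it directly instead of re-interpreting the systems engineer's intent.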

  2. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    Directory of Open Access Journals (Sweden)

    Raj Kumar Chopra

    2016-09-01

Non-functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for prioritizing non-functional requirements using a suitable prioritization technique. This paper performs experimentation on three different complexity versions of an industrial software project using the cost-value prioritization technique under each of the three approaches. The experimentation analyzes the accuracy of the individual approaches and how that accuracy varies with the complexity of the software project. The results indicate that selecting non-functional requirements separately, but in accordance with functionality, has higher accuracy than the other two approaches. Further, like the other approaches, its accuracy decreases as software complexity increases, but the decrease is minimal.
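For illustration, the cost-value prioritization technique named in the abstract ranks requirements by their value-to-cost ratio, so the highest benefit per unit cost is implemented first. A minimal sketch, with invented requirement names and scores:

```python
# Invented non-functional requirements with stakeholder value and cost
# estimates (e.g. on 0-100 scales); only the ratio matters for the ranking.
requirements = {
    "encrypted_storage":      {"value": 60, "cost": 20},
    "response_time_under_2s": {"value": 90, "cost": 45},
    "99.9_percent_uptime":    {"value": 80, "cost": 60},
}

def cost_value_rank(reqs):
    # Higher value per unit cost -> implemented earlier
    return sorted(reqs,
                  key=lambda name: reqs[name]["value"] / reqs[name]["cost"],
                  reverse=True)

ranking = cost_value_rank(requirements)
```

Here `encrypted_storage` (ratio 3.0) outranks `response_time_under_2s` (2.0) and `99.9_percent_uptime` (about 1.33).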

  3. Automated Derivation of Complex System Constraints from User Requirements

    Science.gov (United States)

    Foshee, Mark; Murey, Kim; Marsh, Angela

    2010-01-01

The Payload Operations Integration Center (POIC) located at the Marshall Space Flight Center has the responsibility of integrating US payload science requirements for the International Space Station (ISS). All payload operations must request ISS system resources so that the resource usage will be included in the ISS on-board execution timelines. The scheduling of resources and building of the timeline is performed using the Consolidated Planning System (CPS). The ISS resources are quite complex due to the large number of components that must be accounted for. The planners at the POIC simplify the process for Payload Developers (PDs) by providing them with an application that has the basic functionality PDs need, as well as a list of simplified resources, in the User Requirements Collection (URC) application. The planners maintained a mapping of the URC resources to the CPS resources. Manually converting PDs' science requirements from a simplified representation to the more complex CPS representation is a time-consuming and tedious process. The goal is to provide a software solution that allows the planners to build a mapping of the complex CPS constraints to the basic URC constraints and automatically convert the PDs' requirements into system requirements during export to CPS.

  4. Joining Distributed Complex Objects: Definition and Performance

    NARCIS (Netherlands)

    Teeuw, W.B.; Teeuw, Wouter B.; Blanken, Henk

    1992-01-01

The performance of a non-standard distributed database system is strongly influenced by complex objects. The effective exploitation of parallelism in querying them and a suitable structure to store them are required in order to obtain acceptable response times in these database environments where

  5. Experimentation on accuracy of non functional requirement prioritization approaches for different complexity projects

    OpenAIRE

    Raj Kumar Chopra; Varun Gupta; Durg Singh Chauhan

    2016-01-01

    Non functional requirements must be selected for implementation together with functional requirements to enhance the success of software projects. Three approaches exist for performing the prioritization of non functional requirements using the suitable prioritization technique. This paper performs experimentation on three different complexity versions of the industrial software project using cost-value prioritization technique employing three approaches. Experimentation is conducted to analy...

  6. Development and testing of the methodology for performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

The U.S. Department of Energy (DOE) is in the process of implementing a set of materials control and accountability (MC&A) performance requirements. These graded requirements set a uniform level of performance for similar materials at various facilities against the threat of an insider adversary stealing special nuclear material (SNM). These requirements are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a specified value, and they include defense-in-depth requirements. The DOE has conducted an extensive effort over the last 2 1/2 yr to develop a practical methodology to be used in evaluating facility performance against the performance requirements specified in DOE order 5633.3. The major participants in the development process have been the Office of Safeguards and Security (OSS), Brookhaven National Laboratory, and Los Alamos National Laboratory. The process has included careful reviews of related evaluation systems, a review of the intent of the requirements in the order, and site visits to most of the major facilities in the DOE complex. As a result of this extensive effort to develop guidance for the MC&A performance requirements, OSS was able to provide a practical method that will allow facilities to evaluate the performance of their safeguards systems against the performance requirements. In addition, the evaluations can be validated by the cognizant operations offices in a systematic manner.

  7. Performance-complexity tradeoff in sequential decoding for the unconstrained AWGN channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

In this paper, the performance limits and the computational complexity of the lattice sequential decoder are analyzed for the unconstrained additive white Gaussian noise channel. The performance analysis available in the literature for such a channel has been studied only under the use of the minimum Euclidean distance decoder that is commonly referred to as the lattice decoder. Lattice decoders based on solutions to the NP-hard closest vector problem are very complex to implement, and the search for low-complexity receivers for the detection of lattice codes is considered a challenging problem. However, the low computational complexity that sequential decoding promises makes it an alternative to the lattice decoder. In this work, we characterize the performance-complexity tradeoff, via the error exponent and the decoding complexity respectively, of such a decoder as a function of the decoding parameter, the bias term. For the above channel, we derive the cut-off volume-to-noise ratio that is required to achieve a good error performance with low decoding complexity. © 2013 IEEE.

  8. Complexity factors and prediction of performance

    International Nuclear Information System (INIS)

    Braarud, Per Oeyvind

    1998-03-01

Understanding what makes a control room situation difficult to handle is important when studying operator performance, both with respect to predicting and to improving human performance. A factor-analytic approach identified eight factors from operators' answers to a 39-item questionnaire about the complexity of the operator's task in the control room. A Complexity Profiling Questionnaire was developed, based on the factor-analytic results from the operators' conception of complexity. The validity of the identified complexity factors was studied by predicting crew performance and plant performance from ratings of the complexity of scenarios. The scenarios were rated by both process experts and the operators participating in the scenarios, using the Complexity Profiling Questionnaire. The process experts' complexity ratings predicted both crew performance and plant performance, while the operators' ratings predicted plant performance only. The results reported are from initial studies of complexity and imply a promising potential for further studies of the concept. The approach used in the study as well as the reported results are discussed. A chapter about the structure of the conception of complexity and a chapter about further research conclude the report. (author)

  9. Optimation and Determination of Fe-Oxinate Complex by Using High Performance Liquid Chromatography

    Science.gov (United States)

    Oktavia, B.; Nasra, E.; Sary, R. C.

    2018-04-01

Growing demand for iron expands the industrial processes that use iron as a raw material, so control of industrial iron waste is very important. One method of iron analysis is to analyze iron(III) ions indirectly by complexing them with 8-hydroxyquinoline (oxine). In this research, qualitative and quantitative tests of iron(III) ions in the form of a complex with oxine were performed. The analysis used HPLC at a wavelength of 470 nm with an ODS C18 column. Three methods of analysis were compared: (1) Fe-oxinate complexes were prepared in an ethanol solvent, so no further separation was needed; (2) Fe-oxinate complexes were made in chloroform, so a solvent extraction was required before the complex was injected into the column; and (3) the complex was formed in the column, wherein the eluent contains the oxine and the metal ions are then injected. The resulting chromatograms show that the third method provides a better chromatogram for iron analysis.

  10. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

Present regulations on pollutant emissions and noise, together with economic constraints, require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system, not only isolated components. However, these aspects are still not well captured or understood by numerical approaches, whatever the design stage considered. The main challenge lies in the computational requirements that such complex systems impose if they are to be simulated on supercomputers. This paper shows how these challenges can be addressed by using parallel computing platforms for distinct elements of a more complex system, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the interest of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular interest in the computational time and the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows, such as coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some of the difficulties with grid generation and data analysis are also presented when dealing with these complex industrial applications.

  11. 42 CFR 493.1415 - Condition: Laboratories performing moderate complexity testing; clinical consultant.

    Science.gov (United States)

    2010-10-01

§ 493.1415 Condition: Laboratories performing moderate complexity testing; clinical consultant. The laboratory must have a clinical consultant who meets the qualification requirements of § 493.1417 of this...

  12. Lithography requirements in complex VLSI device fabrication

    International Nuclear Information System (INIS)

    Wilson, A.D.

    1985-01-01

    Fabrication of complex very large scale integration (VLSI) circuits requires continual advances in lithography to satisfy: decreasing minimum linewidths, larger chip sizes, tighter linewidth and overlay control, increasing topography to linewidth ratios, higher yield demands, increased throughput, harsher device processing, lower lithography cost, and a larger part number set with quick turn-around time. Where optical, electron beam, x-ray, and ion beam lithography can be applied to judiciously satisfy the complex VLSI circuit fabrication requirements is discussed and those areas that are in need of major further advances are addressed. Emphasis will be placed on advanced electron beam and storage ring x-ray lithography

  13. Functional Mitochondrial Complex I Is Required by Tobacco Leaves for Optimal Photosynthetic Performance in Photorespiratory Conditions and during Transients

    Science.gov (United States)

    Dutilleul, Christelle; Driscoll, Simon; Cornic, Gabriel; De Paepe, Rosine; Foyer, Christine H.; Noctor, Graham

    2003-01-01

    The importance of the mitochondrial electron transport chain in photosynthesis was studied using the tobacco (Nicotiana sylvestris) mutant CMSII, which lacks functional complex I. Rubisco activities and oxygen evolution at saturating CO2 showed that photosynthetic capacity in the mutant was at least as high as in wild-type (WT) leaves. Despite this, steady-state photosynthesis in the mutant was reduced by 20% to 30% at atmospheric CO2 levels. The inhibition of photosynthesis was alleviated by high CO2 or low O2. The mutant showed a prolonged induction of photosynthesis, which was exacerbated in conditions favoring photorespiration and which was accompanied by increased extractable NADP-malate dehydrogenase activity. Feeding experiments with leaf discs demonstrated that CMSII had a lower capacity than the WT for glycine (Gly) oxidation in the dark. Analysis of the postillumination burst in CO2 evolution showed that this was not because of insufficient Gly decarboxylase capacity. Despite the lower rate of Gly metabolism in CMSII leaves in the dark, the Gly to Ser ratio in the light displayed a similar dependence on photosynthesis to the WT. It is concluded that: (a) Mitochondrial complex I is required for optimal photosynthetic performance, despite the operation of alternative dehydrogenases in CMSII; and (b) complex I is necessary to avoid redox disruption of photosynthesis in conditions where leaf mitochondria must oxidize both respiratory and photorespiratory substrates simultaneously. PMID:12529534

  14. How students process equations in solving quantitative synthesis problems? Role of mathematical complexity in students’ mathematical performance

    Directory of Open Access Journals (Sweden)

    Bashirah Ibrahim

    2017-10-01

We examine students' mathematical performance on quantitative "synthesis problems" with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second-year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students' mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students' simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students' formulation and combination of equations. Several reasons may explain this difference, including the students' different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
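The paper's operational notion of mathematical complexity (how many coupled equations must be manipulated concurrently) can be illustrated with a toy pair of tasks; the equations and numbers below are invented for the example:

```python
import math

# Sequential task: each equation has one unknown, so they are solved one at a
# time and the result is chained forward.
a, t = 9.8, 2.0
v = a * t          # first equation solved on its own: v = a*t
d = v * 3.0        # its result feeds the next equation:  d = v * t2

# Simultaneous task: x + y = 10 and 2x - y = 2 share unknowns, so the
# equations must be combined before either unknown simplifies.
# Adding them eliminates y: 3x = 12, hence x = 4, then y = 10 - x.
x = (10 + 2) / 3
y = 10 - x
```

In the sequential case only one equation is "live" at a time; in the simultaneous case both must be formulated and combined concurrently, which is where the study finds the additional performance cost.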

  15. A Novel Method for Assessing Task Complexity in Outpatient Clinical-Performance Measures.

    Science.gov (United States)

    Hysong, Sylvia J; Amspoker, Amber B; Petersen, Laura A

    2016-04-01

Clinical-performance measurement has helped improve the quality of health care; yet success in attaining high levels of quality across multiple domains simultaneously still varies considerably. Although many sources of variability in care quality have been studied, the difficulty required to complete the clinical work itself has received little attention. We present a task-based methodology for evaluating the difficulty of clinical-performance measures (CPMs) by assessing the complexity of their component requisite tasks. Using Functional Job Analysis (FJA), subject-matter experts (SMEs) generated task lists for 17 CPMs; task lists were rated on ten dimensions of complexity and then aggregated into difficulty composites. Participants: eleven outpatient work SMEs; 133 VA Medical Centers nationwide. Clinical performance: 17 outpatient CPMs (2000-2008) at 133 VA Medical Centers nationwide. Measure difficulty: for each CPM, the number of component requisite tasks and the average rating across ten FJA complexity scales for the set of tasks comprising the measure. Measures varied considerably in the number of component tasks (M = 10.56, SD = 6.25, min = 5, max = 25). Measures of chronic care following acute myocardial infarction exhibited significantly higher measure difficulty ratings compared to diabetes or screening measures, but not to immunization measures ([Formula: see text] = 0.45, -0.04, -0.05, and -0.06 respectively; F(3, 186) = 3.57, p = 0.015). Measure difficulty ratings were not significantly correlated with the number of component tasks (r = -0.30, p = 0.23). Evaluating the difficulty of achieving recommended CPM performance levels requires more than simply counting the tasks involved; using FJA to assess the complexity of CPMs' component tasks presents an alternate means of assessing the difficulty of primary-care CPMs and accounting for performance variation among measures and performers. This in turn could be used in designing
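The difficulty composite described in the abstract (per-task ratings on ten complexity dimensions, aggregated over a measure's tasks) can be sketched as follows; the ratings are invented, and the simple mean-of-means aggregation is an assumption, not necessarily the paper's exact statistic:

```python
import statistics

def measure_difficulty(task_ratings):
    """task_ratings: one list per component task, each holding ten
    complexity-dimension ratings. Returns (difficulty composite, task count)."""
    per_task = [statistics.mean(r) for r in task_ratings]  # average the ten dimensions
    return statistics.mean(per_task), len(task_ratings)    # then average over tasks

# Two hypothetical component tasks of one CPM, rated on ten dimensions each
ratings = [
    [3, 4, 2, 5, 3, 4, 3, 2, 4, 3],   # task 1 -> mean 3.3
    [5, 5, 4, 4, 5, 3, 4, 5, 4, 5],   # task 2 -> mean 4.4
]
difficulty, n_tasks = measure_difficulty(ratings)
```

This separation of task count from average complexity mirrors the paper's finding that the two need not move together.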

  16. Modeling and Performance Considerations for Automated Fault Isolation in Complex Systems

    Science.gov (United States)

    Ferrell, Bob; Oostdyk, Rebecca

    2010-01-01

The purpose of this paper is to document the modeling considerations and performance metrics that were examined in the development of a large-scale Fault Detection, Isolation and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown by using a suite of complementary software tools that alert operators to anomalies and failures in real time. The FDIR team members developed a set of operational requirements for the models that would be used for fault isolation and worked closely with the vendor of the software tools selected for fault isolation to ensure that the software was able to meet the requirements. Once the requirements were established, example models of sufficient complexity were used to test the performance of the software. The results of the performance testing demonstrated the need for enhancements to the software in order to meet the demands of the full-scale ground and vehicle FDIR system. The paper highlights the importance of the development of operational requirements and preliminary performance testing as a strategy for identifying deficiencies in highly scalable systems and rectifying those deficiencies before they imperil the success of the project.

  17. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

The complexity of the work situation during abnormal situations is a major topic in discussions of the safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of complexity of the event are related to operator performance. One aspect of complexity is the operator's subjective experience of the difficulties of the event. Another related aspect is subject-matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full-scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject-matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire, which contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity; this questionnaire was used to develop a measure of subjective complexity. The results from the study indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, were able to predict crew performance quite well. The results further indicated that a measure of subjective complexity could be developed that was related to crew performance. Subjective complexity was found to be related to subjective workload. (author)

  18. Collaborative crew performance in complex operational systems: L'Efficacité du travail en équipage dans des systèmes opérationnel complexes

    National Research Council Canada - National Science Library

    1999-01-01

As we progress towards the next millennium, complex operations will increasingly require consideration and integration of the collaborative element, wherein crew performance becomes a critical factor for success...

  19. A review of human factors challenges of complex adaptive systems: discovering and understanding chaos in human performance.

    Science.gov (United States)

    Karwowski, Waldemar

    2012-12-01

In this paper, the author explores the need for a greater understanding of the true nature of human-system interactions from the perspective of the theory of complex adaptive systems, including the essence of complexity, emergent properties of system behavior, nonlinear systems dynamics, and deterministic chaos. Human performance, more often than not, constitutes complex adaptive phenomena with emergent properties that exhibit nonlinear dynamical (chaotic) behaviors. The complexity challenges in the design and management of contemporary work systems, including service systems, are explored. Examples of selected applications of the concepts of nonlinear dynamics to the study of human physical performance are provided. Understanding and applying the concepts of the theory of complex adaptive and dynamical systems should significantly improve the effectiveness of human-centered design efforts for a large system of systems. Performance of many contemporary work systems and environments may be sensitive to the initial conditions and may exhibit dynamic nonlinear properties and chaotic system behaviors. Human-centered design of emergent human-system interactions requires application of the theories of nonlinear dynamics and complex adaptive systems. The success of future human-systems integration efforts requires the fusion of paradigms, knowledge, design principles, and methodologies of human factors and ergonomics with those of the science of complex adaptive systems as well as modern systems engineering.

  20. Organizing Performance Requirements For Dynamical Systems

    Science.gov (United States)

    Malchow, Harvey L.; Croopnick, Steven R.

    1990-01-01

Paper describes a methodology for establishing performance requirements for complicated dynamical systems, using a top-down approach. In a series of steps, it makes connections between high-level mission requirements and lower-level functional performance requirements, providing a systematic delineation of elements and accommodating design compromises.

  1. Managing teams performing complex innovation projects

    NARCIS (Netherlands)

    Oeij, P.R.A.; Vroome, E.M.M. de; Dhondt, S.; Gaspersz, J.B.R.

    2012-01-01

Complexity of projects is hotly debated and is a factor which affects the innovativeness of team performance. Much attention in the past has been paid to technical complexity, and many issues are related to the natural and physical sciences. A growing awareness of the importance of socio-organisational issues is

  2. Complex performance in construction

    DEFF Research Database (Denmark)

    Bougrain, Frédéric; Forman, Marianne; Gottlieb, Stefan Christoffer

To fulfil the expectations of demanding clients, new project-delivery mechanisms have been developed. Approaches focusing on performance-based building or new procurement processes such as new forms of public-private partnerships are considered as solutions improving the overall performance … to the end users. This report summarises the results from work undertaken in the international collaborative project "Procuring and Operating Complex Products and Systems in Construction" (POCOPSC). POCOPSC was carried out in the period 2010-2014. The project was executed in collaboration between CSTB …

  3. Performance and Complexity Evaluation of Iterative Receiver for Coded MIMO-OFDM Systems

    Directory of Open Access Journals (Sweden)

    Rida El Chall

    2016-01-01

Multiple-input multiple-output (MIMO) technology in combination with channel coding is a promising solution for reliable high-data-rate transmission in future wireless communication systems. However, these technologies pose significant challenges for the design of an iterative receiver. In this paper, an efficient receiver combining soft-input soft-output (SISO) detection based on a low-complexity K-Best (LC-K-Best) decoder with various forward error correction codes, namely the LTE turbo decoder and an LDPC decoder, is investigated. We first investigate the convergence behavior of the iterative MIMO receivers to determine the required inner and outer iterations. Consequently, the performance of the LC-K-Best-based receiver is evaluated in various LTE channel environments and compared with other MIMO detection schemes. Moreover, the computational complexity of the iterative receiver with different channel coding techniques is evaluated and compared for different modulation orders and coding rates. Simulation results show that the LC-K-Best-based receiver achieves satisfactory performance-complexity trade-offs.
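As an illustration of the K-Best idea underlying the paper's detector (not the authors' LC-K-Best implementation), a breadth-first tree search that keeps only the K best partial candidates per antenna level can be sketched with NumPy; the channel matrix and constellation below are invented toy values:

```python
import numpy as np

def k_best_detect(H, y, constellation, K=2):
    """Hard-output K-Best detection for y = H x + noise (real-valued model)."""
    Q, R = np.linalg.qr(H)
    z = Q.T @ y                      # rotate so z = R x + rotated noise
    n = H.shape[1]
    candidates = [(0.0, [])]         # (partial metric, symbols for levels below)
    for level in range(n - 1, -1, -1):
        expanded = []
        for metric, syms in candidates:
            for s in constellation:
                full = [s] + syms    # symbols for levels level .. n-1
                resid = z[level] - sum(R[level, level + j] * full[j]
                                       for j in range(len(full)))
                expanded.append((metric + abs(resid) ** 2, full))
        expanded.sort(key=lambda t: t[0])
        candidates = expanded[:K]    # keep only the K best survivors
    return np.array(candidates[0][1], dtype=float)

# Noiseless toy: the detector should recover the transmitted BPSK vector.
H = np.array([[2.0, 1.0], [1.0, 3.0]])
x = np.array([1.0, -1.0])
detected = k_best_detect(H, H @ x, constellation=[-1, 1], K=2)
```

Capping the survivor list at K is what makes the complexity fixed per level, at the cost of possibly discarding the true maximum-likelihood path when K is small.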

  4. Satisfying positivity requirement in the Beyond Complex Langevin approach

    Directory of Open Access Journals (Sweden)

    Wyrzykowski Adam

    2018-01-01

The problem of finding a positive distribution which corresponds to a given complex density is studied. By requiring that the moments of the positive distribution and of the complex density are equal, one can reduce the problem to solving the matching conditions. These conditions are a set of quadratic equations, so the Groebner basis method was used to find their solutions when the problem is restricted to a few lowest-order moments. For a Gaussian complex density, these approximate solutions are compared with the exact solution, which is known in this special case.
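A heavily simplified, real-valued toy version of such matching conditions (a two-point distribution matched to the first two moments, with one support point fixed; not the paper's complex-density system or its Groebner-basis computation) shows how the conditions pin down the unknowns:

```python
import math

def two_point_match(m1, m2, b):
    """Find (p, a) so that the distribution p*delta(a) + (1-p)*delta(b)
    has first moment m1 and second moment m2, for a fixed position b.
    Eliminating a from the two matching conditions
        p*a + (1-p)*b = m1,   p*a**2 + (1-p)*b**2 = m2
    leaves a linear equation for q = 1 - p."""
    q = (m2 - m1**2) / (b**2 - 2 * m1 * b + m2)
    p = 1.0 - q
    a = (m1 - q * b) / p
    return p, a

# Match moments m1 = 0, m2 = 1 with one point fixed at b = -1
p, a = two_point_match(0.0, 1.0, b=-1.0)
```

In this toy the system collapses to a linear condition; the paper's conditions stay genuinely quadratic, which is why a Groebner basis is needed there.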

  5. Satisfying positivity requirement in the Beyond Complex Langevin approach

    Science.gov (United States)

Wyrzykowski, Adam; Ruba, Błażej

    2018-03-01

The problem of finding a positive distribution which corresponds to a given complex density is studied. By requiring that the moments of the positive distribution and of the complex density are equal, one can reduce the problem to solving the matching conditions. These conditions are a set of quadratic equations, so the Groebner basis method was used to find their solutions when the problem is restricted to a few lowest-order moments. For a Gaussian complex density, these approximate solutions are compared with the exact solution, which is known in this special case.

  6. Flow assurance : complex phase behavior and complex work requires confidence and vigilance

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.D. [ConocoPhillips, Major Projects, Advanced Integrated Simulation, Houston, TX (United States)

    2008-07-01

Petroleum exploration and development projects and operations increasingly rely on flow assurance definition. Flow assurance is an integrating discipline, as it follows the fluid from the reservoir to the market. Flow assurance works across complex technical and non-technical interfaces, including the reservoir, well completions, operation processes, project management, physical/organic chemistry, fluid mechanics, chemical engineering, mechanical engineering and corrosion. The phase behaviour of real fluids also has complex interfaces. The understanding and management of complex phase behaviour must be well communicated in order to enable proper selection, execution, and operation of development concepts designed to manage successful production within the fluid's phase behaviour. Simulation tools facilitate the translation of science into engineering; academic, industrial, and field research is the core of these tools. The author cautioned that vigilance is required to identify the right time to move innovation into the core tools.

  7. Procuring complex performance

    DEFF Research Database (Denmark)

    Hartmann, A.; Roehrich, J.; Frederiksen, Lars

    2014-01-01

    … the transition process. Design/methodology/approach – A multiple, longitudinal case study method is used to examine the transition towards PCP. The study deploys rich qualitative data sets by combining semi-structured interviews, focus group meetings and organisational reports and documents. Findings – … and relational challenges they need to master when facing higher levels of performance and infrastructural complexity. Originality/value – The study adds to the limited empirical and conceptual understanding of the nature of long-term public–private interactions in PCP. It contributes through a rare focus …

  8. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    In this paper, we bring together research on complex problem solving with motivational psychology research on goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown for simple tasks that high, specific performance goals lead to better performance than do-your-best goals. In complex tasks, however, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014), so-called motto-goals (e.g., "I breathe happiness") should activate a person's resources through positive affect, and they have been found effective for unpleasant duties. We therefore tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. Depending on their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, in which subjects worked as managers of a small fictional company. Contrary to expectations, there was no main effect of goal condition on management performance. As hypothesized, motto-goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto-goals reported the lowest levels of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediation analyses, as well as the influence of goal type on perceived goal attainment.

  9. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

    Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95) = 41.1, p < …) and cognitive load (F(2.3,58.5) = 57.7, p < …) … load among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  10. Performance Potential at one Complex, Specific Site

    DEFF Research Database (Denmark)

    Laursen, Bjørn

    2015-01-01

    … disciplines: performance, drama, dance and music. Complex rules of "borders" between audience and actors/performers appeared to be present and active during this long happening. Different narrative genres were active simultaneously during the experimental session. A lot of complex and surprising phenomena … and combinations of spatial, dramaturgical, narrative and interactive challenges, which appear to be of special interest for the kind of experiences an audience might gather in a site like this, originally created with totally different intentions. Or was it? …

  11. Using Common Graphics Paradigms Implemented in a Java Applet to Represent Complex Scheduling Requirements

    Science.gov (United States)

    Jaap, John; Meyer, Patrick; Davis, Elizabeth

    1997-01-01

    The experiments planned for the International Space Station promise to be complex, lengthy and diverse. The scarcity of the space station resources will cause significant competition for resources between experiments. The scheduling job facing the Space Station mission planning software requires a concise and comprehensive description of the experiments' requirements (to ensure a valid schedule) and a good description of the experiments' flexibility (to effectively utilize available resources). In addition, the continuous operation of the station, the wide geographic dispersion of station users, and the budgetary pressure to reduce operations manpower make a low-cost solution mandatory. A graphical representation of the scheduling requirements for station payloads implemented via an Internet-based application promises to be an elegant solution that addresses all of these issues. The graphical representation of experiment requirements permits a station user to describe his experiment by defining "activities" and "sequences of activities". Activities define the resource requirements (with alternatives) and other quantitative constraints of tasks to be performed. Activities definitions use an "outline" graphics paradigm. Sequences define the time relationships between activities. Sequences may also define time relationships with activities of other payloads or space station systems. Sequences of activities are described by a "network" graphics paradigm. The bulk of this paper will describe the graphical approach to representing requirements and provide examples that show the ease and clarity with which complex requirements can be represented. A Java applet, to run in a web browser, is being developed to support the graphical representation of payload scheduling requirements. Implementing the entry and editing of requirements via the web solves the problems introduced by the geographic dispersion of users. Reducing manpower is accomplished by developing a concise
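The two constructs this abstract describes — "activities" (resource requirements) and "sequences of activities" (time relationships) — can be sketched as simple data structures. The class and field names below are assumptions for illustration, not the paper's actual schema.

```python
from dataclasses import dataclass, field

@dataclass
class Activity:
    """A task with quantitative resource requirements (outline paradigm)."""
    name: str
    duration_min: int
    resources: dict = field(default_factory=dict)  # resource name -> amount

@dataclass
class Sequence:
    """Time relationships between activities (network paradigm)."""
    activities: list
    min_gap_min: int = 0  # required gap between consecutive activities

    def total_duration(self) -> int:
        gaps = self.min_gap_min * max(len(self.activities) - 1, 0)
        return sum(a.duration_min for a in self.activities) + gaps

# Hypothetical payload: a warm-up activity followed by a sample run,
# with a required 10-minute gap between them.
warmup = Activity("sensor warm-up", 30, {"power_W": 120})
run = Activity("sample run", 90, {"power_W": 200, "crew": 1})
seq = Sequence([warmup, run], min_gap_min=10)
```

A scheduler consuming such a description would treat the resource dictionaries as constraints to satisfy against station availability, and the sequence gaps as temporal constraints between activities.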

  12. High-Speed, High-Performance DQPSK Optical Links with Reduced Complexity VDFE Equalizers

    Directory of Open Access Journals (Sweden)

    Maki Nanou

    2017-02-01

    Optical transmission technologies optimized for optical network segments that are sensitive to power consumption and cost combine modulation formats with direct-detection technologies. Specifically, non-return-to-zero differential quaternary phase-shift keying (NRZ-DQPSK) in deployed fiber plants, combined with high-performance, low-complexity electronic equalizers that compensate residual impairments at the receiver end, can prove a viable solution for high-performance, high-capacity optical links. Joint processing of the constructive and destructive signals at the single-ended DQPSK receiver improves performance compared to the balanced configuration, but at the expense of higher hardware requirements, a fact that cannot be neglected, especially in the case of high-speed optical links. To overcome this bottleneck, the use of partially joint constructive/destructive DQPSK equalization is investigated in this paper. Symbol-by-symbol equalization is performed by means of Volterra decision-feedback-type equalizers, driven by a reduced subset of signals selected from the constructive and destructive ports of the optical detectors. The proposed approach offers a low-complexity alternative for electronic equalization without sacrificing much of the performance of the fully deployed counterpart. The efficiency of the proposed equalizers is demonstrated by means of computer simulation of a typical optical transmission scenario.
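The symbol-by-symbol decision-feedback idea behind such equalizers can be illustrated on a much simpler, hypothetical channel: noiseless BPSK with a single post-cursor ISI tap, equalized by one feedforward gain and one feedback tap. This is a minimal stand-in for the far richer Volterra DFE of the paper; all names and channel values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 2000)
sym = 2.0 * bits - 1.0                    # BPSK symbols in {-1, +1}
h = np.array([1.0, 0.4])                  # toy channel: main tap + one post-cursor
rx = np.convolve(sym, h)[:len(sym)]       # received signal with ISI

ff, fb = 1.0 / h[0], h[1] / h[0]          # feedforward gain, feedback tap
prev, decisions = 0.0, []
for r in rx:
    y = ff * r - fb * prev                # cancel the post-cursor of the last decision
    d = 1.0 if y >= 0 else -1.0           # symbol-by-symbol hard decision
    decisions.append(d)
    prev = d
errors = int(np.sum(np.array(decisions) != sym))
```

On this noiseless toy channel the feedback tap cancels the ISI exactly, so the equalizer makes no symbol errors; the paper's contribution lies in doing comparably well with a reduced subset of equalizer inputs on realistic optical channels.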

  13. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate complex codes, on several differently configured servers, could run and compute trivial small scale problems in a commercial cloud infrastructure. Phase 2 focused on proving non-trivial large scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  14. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  15. Obesity-specific neural cost of maintaining gait performance under complex conditions in community-dwelling older adults.

    Science.gov (United States)

    Osofundiya, Olufunmilola; Benden, Mark E; Dowdy, Diane; Mehta, Ranjana K

    2016-06-01

    Recent evidence of obesity-related changes in the prefrontal cortex during cognitive and seated motor activities has surfaced; however, the impact of obesity on neural activity during ambulation remains unclear. The purpose of this study was to determine the obesity-specific neural cost of simple and complex ambulation in older adults. Twenty non-obese and obese individuals, 65 years and older, performed three tasks varying in the type of ambulatory complexity (simple walking, walking + cognitive dual-task, and precision walking). Maximum oxygenated hemoglobin, a measure of neural activity, was measured bilaterally using a portable functional near-infrared spectroscopy system, and gait speed and performance on the complex tasks were also obtained. Complex ambulatory tasks were associated with ~2-3.5 times greater cerebral oxygenation levels and ~30-40% slower gait speeds when compared to the simple walking task. Additionally, obesity was associated with three times greater oxygenation levels, particularly during the precision gait task, despite obese adults demonstrating gait speeds and performances on the complex gait tasks similar to those of non-obese adults. Compared to existing studies that focus solely on biomechanical outcomes, the present study is one of the first to examine obesity-related differences in neural activity during ambulation in older adults. In order to maintain gait performance, obesity was associated with higher neural costs, and this was augmented during ambulatory tasks requiring greater precision control. These preliminary findings have clinical implications for identifying individuals who are at greater risk of mobility limitations, particularly when performing complex ambulatory tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Nuclear localization of Schizosaccharomyces pombe Mcm2/Cdc19p requires MCM complex assembly.

    Science.gov (United States)

    Pasion, S G; Forsburg, S L

    1999-12-01

    The minichromosome maintenance (MCM) proteins MCM2-MCM7 are conserved eukaryotic replication factors that assemble in a heterohexameric complex. In fission yeast, these proteins are nuclear throughout the cell cycle. In studying the mechanism that regulates assembly of the MCM complex, we analyzed the cis and trans elements required for nuclear localization of a single subunit, Mcm2p. Mutation of any single mcm gene leads to redistribution of wild-type MCM subunits to the cytoplasm, and this redistribution depends on an active nuclear export system. We identified the nuclear localization signal sequences of Mcm2p and showed that these are required for nuclear targeting of other MCM subunits. In turn, Mcm2p must associate with other MCM proteins for its proper localization; nuclear localization of MCM proteins thus requires assembly of MCM proteins in a complex. We suggest that coupling complex assembly to nuclear targeting and retention ensures that only intact heterohexameric MCM complexes remain nuclear.

  17. Supplemental design requirements document solid waste operations complex

    International Nuclear Information System (INIS)

    Ocampo, V.P.; Boothe, G.F.; Broz, D.R.; Eaton, H.E.; Greager, T.M.; Huckfeldt, R.A.; Kooiker, S.L.; Lamberd, D.L.; Lang, L.L.; Myers, J.B.

    1994-11-01

    This document provides additional and supplemental information to WHC-SD-W112-FDC-001, WHC-SD-W113-FDC-001, and WHC-SD-W100-FDC-001. It provides additional requirements for the design, summarizes Westinghouse Hanford Company key design guidance, and establishes the technical baseline agreements to be used for definitive design common to the Solid Waste Operations Complex (SWOC) facilities (Project W-112, Project W-113, and WRAP 2A)

  18. Cost-optimal levels for energy performance requirements

    DEFF Research Database (Denmark)

    Thomsen, Kirsten Engelund; Aggerholm, Søren; Kluttig-Erhorn, Heike

    2011-01-01

    The CA conducted a study on experiences and challenges in setting cost-optimal levels for energy performance requirements. The results were used as input by the EU Commission in their work of establishing the Regulation on a comparative methodology framework for calculating cost-optimal levels of minimum energy performance requirements. In addition to the summary report released in August 2011, the full detailed report on this study is now also made available, just as the EC is about to publish its proposed Regulation for MS to apply in their process to update national building requirements.

  19. Enabling Requirements-Based Programming for Highly-Dependable Complex Parallel and Distributed Systems

    Science.gov (United States)

    Hinchey, Michael G.; Rash, James L.; Rouff, Christopher A.

    2005-01-01

    The manual application of formal methods in system specification has produced successes, but in the end, despite any claims and assertions by practitioners, there is no provable relationship between a manually derived system specification or formal model and the customer's original requirements. Complex parallel and distributed systems present the worst-case implications of today's dearth of viable approaches for achieving system dependability. No avenue other than formal methods constitutes a serious contender for resolving the problem, and so recognition of requirements-based programming has come at a critical juncture. We describe a new, NASA-developed automated requirements-based programming method that can be applied to certain classes of systems, including complex parallel and distributed systems, to achieve a high degree of dependability.

  20. 77 FR 50726 - Software Requirement Specifications for Digital Computer Software and Complex Electronics Used in...

    Science.gov (United States)

    2012-08-22

    ... Computer Software and Complex Electronics Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear...-1209, ``Software Requirement Specifications for Digital Computer Software and Complex Electronics used... Electronics Engineers (ANSI/IEEE) Standard 830-1998, ``IEEE Recommended Practice for Software Requirements...

  1. Polar localization of Escherichia coli chemoreceptors requires an intact Tol–Pal complex

    Science.gov (United States)

    Santos, Thiago M. A.; Lin, Ti-Yu; Rajendran, Madhusudan; Anderson, Samantha M.; Weibel, Douglas B.

    2014-01-01

    Subcellular biomolecular localization is critical for the metabolic and structural properties of the cell. The functional implications of the spatiotemporal distribution of protein complexes during the bacterial cell cycle have long been acknowledged; however, the molecular mechanisms for generating and maintaining their dynamic localization in bacteria are not completely understood. Here we demonstrate that the trans-envelope Tol–Pal complex, a widely conserved component of the cell envelope of Gram-negative bacteria, is required to maintain the polar positioning of chemoreceptor clusters in Escherichia coli. Localization of the chemoreceptors was independent of phospholipid composition of the membrane and the curvature of the cell wall. Instead, our data indicate that chemoreceptors interact with components of the Tol–Pal complex and that this interaction is required to polarly localize chemoreceptor clusters. We found that disruption of the Tol–Pal complex perturbs the polar localization of chemoreceptors, alters cell motility, and affects chemotaxis. We propose that the E. coli Tol–Pal complex restricts mobility of the chemoreceptor clusters at the cell poles and may be involved in regulatory mechanisms that co-ordinate cell division and segregation of the chemosensory machinery. PMID:24720726

  2. Training requirements and responsibilities for the Buried Waste Integrated Demonstration at the Radioactive Waste Management Complex

    International Nuclear Information System (INIS)

    Vega, H.G.; French, S.B.; Rick, D.L.

    1992-09-01

    The Buried Waste Integrated Demonstration (BWID) is scheduled to conduct intrusive (hydropunch screening tests, bore hole installation, soil sampling, etc.) and nonintrusive (geophysical surveys) studies at the Radioactive Waste Management Complex (RWMC). These studies and activities will be limited to specific locations at the RWMC. The duration of these activities will vary, but most tasks are not expected to exceed 90 days. The BWID personnel requested that the Waste Management Operational Support Group establish the training requirements and training responsibilities for BWID personnel and BWID subcontractor personnel. This document specifies these training requirements and responsibilities. While the responsibilities of BWID and the RWMC are, in general, defined in the interface agreement, the training elements are based on regulatory requirements, DOE orders, DOE-ID guidance, state law, and the nature of the work to be performed

  3. Similarity, Not Complexity, Determines Visual Working Memory Performance

    Science.gov (United States)

    Jackson, Margaret C.; Linden, David E. J.; Roberts, Mark V.; Kriegeskorte, Nikolaus; Haenschel, Corinna

    2015-01-01

    A number of studies have shown that visual working memory (WM) is poorer for complex versus simple items, traditionally accounted for by higher information load placing greater demands on encoding and storage capacity limits. Other research suggests that it may not be complexity that determines WM performance per se, but rather increased…

  4. 14 CFR 171.269 - Marker beacon performance requirements.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Marker beacon performance requirements. 171.269 Section 171.269 Aeronautics and Space FEDERAL AVIATION ADMINISTRATION, DEPARTMENT OF... Landing System (ISMLS) § 171.269 Marker beacon performance requirements. ISMLS marker beacon equipment...

  5. The impact of manufacturing complexity drivers on performance-a preliminary study

    Science.gov (United States)

    Huah Leang, Suh; Mahmood, Wan Hasrulnizzam Wan; Rahman, Muhamad Arfauz A.

    2018-03-01

    Manufacturing systems, in pursuit of cost, time and flexibility optimisation, are becoming more and more complex, exhibiting dynamic and nonlinear behaviour. Unpredictability is a distinct characteristic of such behaviour and affects production planning significantly. This study was therefore undertaken to investigate the priority level and current achievement of manufacturing performance in Malaysia’s manufacturing industry, and the impact of complexity drivers on manufacturing productivity performance. The results showed that Malaysia’s manufacturing industry prioritised product quality and achieved good on-time delivery performance. For the other manufacturing performances, however, current achievement was slightly lower than the priority given to them. For priority status, a strong, significant correlation was observed between efficient production levelling (finished goods) and finished-product management, while for current achievement it was between minimising the number of workstations and the factory transportation system. This indicates that complexity drivers have an impact on manufacturing performance. Consequently, it is necessary to identify complexity drivers to achieve good manufacturing performance.
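The kind of priority-vs-achievement correlation analysis this abstract reports can be sketched with hypothetical survey data. The Likert scores and variable names below are invented for illustration; the study's actual data are not reproduced here.

```python
import numpy as np

# Hypothetical 1-5 Likert responses from ten respondents: the priority a
# firm assigns to a complexity driver vs. its achieved on-time delivery.
priority = np.array([4, 5, 3, 4, 5, 2, 4, 5, 3, 4])
delivery = np.array([3, 5, 3, 4, 4, 2, 4, 5, 2, 4])

# Pearson correlation coefficient between the two score vectors.
r = np.corrcoef(priority, delivery)[0, 1]
```

A coefficient near 1 on such data would be read, as in the abstract, as a strong association between the priority given to a driver and the performance achieved; a significance test (e.g. on the corresponding t statistic) would accompany it in a real analysis.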

  6. Assessing vocal performance in complex birdsong: a novel approach.

    Science.gov (United States)

    Geberzahn, Nicole; Aubin, Thierry

    2014-08-06

    Vocal performance refers to the ability to produce vocal signals close to physical limits. Such motor skills can be used by conspecifics to assess a signaller's competitive potential. For example, it is difficult for birds to produce repeated syllables both rapidly and with a broad frequency bandwidth. Deviation from an upper-bound regression of frequency bandwidth on trill rate has been widely used to assess vocal performance. This approach is, however, only applicable to simple trilled songs, and even then may be affected by differences in syllable complexity. Using skylarks (Alauda arvensis) as a birdsong model with a very complex song structure, we detected another performance trade-off: minimum gap duration between syllables was longer when the frequency ratio between the end of one syllable and the start of the next syllable (inter-syllable frequency shift) was large. This allowed us to apply a novel measure of vocal performance, vocal gap deviation: the deviation from a lower-bound regression of gap duration on inter-syllable frequency shift. We show that skylarks increase vocal performance in an aggressive context, suggesting that this trait might serve as a signal of competitive potential. We suggest using vocal gap deviation in future studies to assess vocal performance in songbird species with complex song structure.
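One simple way to realize the "lower-bound regression" behind vocal gap deviation is to fit an ordinary regression of gap duration on frequency shift and shift the line down until it touches the fastest (shortest-gap) songs. The data below are hypothetical and the fitting choice is an assumption; the paper's own method may differ.

```python
import numpy as np

# Hypothetical measurements: inter-syllable frequency shift (octaves)
# and gap duration between syllables (ms).
shift = np.array([0.1, 0.3, 0.5, 0.8, 1.0, 1.4])
gap = np.array([12.0, 15.0, 22.0, 30.0, 33.0, 45.0])

slope, intercept = np.polyfit(shift, gap, 1)       # ordinary least-squares line
resid = gap - (slope * shift + intercept)
lower_bound = slope * shift + intercept + resid.min()  # line touching the lower envelope

# Vocal gap deviation: distance above the lower bound (smaller = higher performance).
vocal_gap_deviation = gap - lower_bound
```

By construction every deviation is non-negative and the best-performing song sits exactly on the lower bound, mirroring how deviation from an upper-bound regression is used for trilled songs.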

  7. Scaling of neck performance requirements in side impacts

    NARCIS (Netherlands)

    Wismans, J.S.H.M.; Meijer, R.; Rodarius, C.; Been, B.W.

    2008-01-01

    Neck biofidelity performance requirements for different sized crash dummies and human body computer models are usually based on scaling of performance requirements derived for a 50th percentile body size. The objective of this study is to investigate the validity of the currently used scaling laws

  8. The effects of overtime work and task complexity on the performance of nuclear plant operators: A proposed methodology

    International Nuclear Information System (INIS)

    Banks, W.W.; Potash, L.

    1985-01-01

    This document presents a general methodology for determining the effect of overtime work and task complexity on operator performance in response to simulated out-of-limit nuclear plant conditions. The independent variables consist of three levels of overtime work and three levels of task complexity; multiple dependent performance measures are proposed for use and discussion. Overtime work is operationally defined as the number of hours worked by nuclear plant operators beyond the traditional 8 hours per shift. Task complexity is operationalized as the number of operator tasks required to remedy a given anomalous plant condition and bring the plant back to a "within limits", or normal, steady-state condition. The proposed methodology would employ a two-factor repeated-measures design along with the analysis-of-variance (linear) model
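The two-factor repeated-measures design described above can be sketched as the cross of its factor levels; each operator would be measured in every cell. The level labels here are hypothetical.

```python
from itertools import product

# Hypothetical levels for the two within-subject factors.
overtime = ["0 h", "4 h", "8 h"]          # hours worked beyond the 8-hour shift
complexity = ["low", "medium", "high"]    # number of remedial tasks required

# The 3 x 3 design: every operator is observed under all nine conditions.
cells = list(product(overtime, complexity))
```

In the repeated-measures ANOVA, the main effects of overtime and complexity and their interaction would each be tested against subject-by-factor error terms.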

  9. 4D Dynamic Required Navigation Performance Final Report

    Science.gov (United States)

    Finkelsztein, Daniel M.; Sturdy, James L.; Alaverdi, Omeed; Hochwarth, Joachim K.

    2011-01-01

    New advanced four dimensional trajectory (4DT) procedures under consideration for the Next Generation Air Transportation System (NextGen) require an aircraft to precisely navigate relative to a moving reference such as another aircraft. Examples are Self-Separation for enroute operations and Interval Management for in-trail and merging operations. The current construct of Required Navigation Performance (RNP), defined for fixed-reference-frame navigation, is not sufficiently specified to be applicable to defining performance levels of such air-to-air procedures. An extension of RNP to air-to-air navigation would enable these advanced procedures to be implemented with a specified level of performance. The objective of this research effort was to propose new 4D Dynamic RNP constructs that account for the dynamic spatial and temporal nature of Interval Management and Self-Separation, develop mathematical models of the Dynamic RNP constructs, "Required Self-Separation Performance" and "Required Interval Management Performance," and to analyze the performance characteristics of these air-to-air procedures using the newly developed models. This final report summarizes the activities led by Raytheon, in collaboration with GE Aviation and SAIC, and presents the results from this research effort to expand the RNP concept to a dynamic 4D frame of reference.

  10. Performance requirements for the single-shell tank

    International Nuclear Information System (INIS)

    GRENARD, C.E.

    1999-01-01

    This document provides performance requirements for the waste storage and waste feed delivery functions of the Single-Shell Tank (SST) System. The requirements presented herein will be used as a basis for evaluating the ability of the system to complete the single-shell tank waste feed delivery mission. They will also be used to select the technology or technologies for retrieving waste from the tanks selected for the single-shell tank waste feed delivery mission, assumed to be 241-C-102 and 241-C-104. This revision of the performance requirements for the SST System is based on the findings of the SST Functional Analysis, which are reflected in the current System Specification for the SST System

  11. High-performance execution of psychophysical tasks with complex visual stimuli in MATLAB

    Science.gov (United States)

    Asaad, Wael F.; Santhanam, Navaneethan; McClellan, Steven

    2013-01-01

    Behavioral, psychological, and physiological experiments often require the ability to present sensory stimuli, monitor and record subjects' responses, interface with a wide range of devices, and precisely control the timing of events within a behavioral task. Here, we describe our recent progress developing an accessible and full-featured software system for controlling such studies using the MATLAB environment. Compared with earlier reports on this software, key new features have been implemented to allow the presentation of more complex visual stimuli, increase temporal precision, and enhance user interaction. These features greatly improve the performance of the system and broaden its applicability to a wider range of possible experiments. This report describes these new features and improvements, current limitations, and quantifies the performance of the system in a real-world experimental setting. PMID:23034363

  12. Stop: a fast procedure for the exact computation of the performance of complex probabilistic systems

    International Nuclear Information System (INIS)

    Corynen, G.C.

    1982-01-01

    A new set-theoretic method for the exact and efficient computation of the probabilistic performance of complex systems has been developed. The core of the method is a fast algorithm for disjointing a collection of product sets, intended for systems with more than 1000 components and 100,000 cut sets. The method is based on a divide-and-conquer approach, in which a multidimensional problem is progressively decomposed into lower-dimensional subproblems along its dimensions. The method also uses a particular pointer system that eliminates the need to store the subproblems, requiring only the storage of pointers to those problems. Examples of the algorithm and the divide-and-conquer strategy are provided, and comparisons with other significant methods are made. Statistical complexity studies show that the expected time and space complexity of other methods is O(m·e^n), but that our method is O(n·m^3·log(m)). Problems which would require days of Cray-1 computer time with present methods can now be solved in seconds. Large-scale systems that can only be approximated with other techniques can now also be evaluated exactly
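A back-of-envelope comparison of the two complexity bounds quoted in the abstract, O(m·e^n) for earlier methods versus O(n·m^3·log m) for the set-disjointing algorithm, shows the gap at the problem sizes the method targets. Since e^1000 overflows a floating-point number, the sketch compares natural logarithms of the two costs.

```python
import math

# Problem sizes quoted in the abstract: n components, m cut sets.
n, m = 1000, 100_000

# log(m * e^n) = log m + n
log_old = math.log(m) + n
# log(n * m^3 * log m) = log n + 3*log m + log(log m)
log_new = math.log(n) + 3 * math.log(m) + math.log(math.log(m))
```

log_old is on the order of 1000 while log_new is around 44, i.e. the older exponential-cost methods are worse by a factor of roughly e^967 at this scale, which is why days of Cray-1 time collapse to seconds.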

  13. Engineered Barrier System performance requirements systems study report. Revision 02

    International Nuclear Information System (INIS)

    Balady, M.A.

    1997-01-01

    This study evaluates the current design concept for the Engineered Barrier System (EBS), in concert with the current understanding of the geologic setting to assess whether enhancements to the required performance of the EBS are necessary. The performance assessment calculations are performed by coupling the EBS with the geologic setting based on the models (some of which were updated for this study) and assumptions used for the 1995 Total System Performance Assessment (TSPA). The need for enhancements is determined by comparing the performance assessment results against the EBS related performance requirements. Subsystem quantitative performance requirements related to the EBS include the requirement to allow no more than 1% of the waste packages (WPs) to fail before 1,000 years after permanent closure of the repository, as well as a requirement to control the release rate of radionuclides from the EBS. The EBS performance enhancements considered included additional engineered components as well as evaluating additional performance available from existing design features but for which no performance credit is currently being taken

  14. Engineered Barrier System performance requirements systems study report. Revision 02

    Energy Technology Data Exchange (ETDEWEB)

    Balady, M.A.

    1997-01-14

    This study evaluates the current design concept for the Engineered Barrier System (EBS), in concert with the current understanding of the geologic setting to assess whether enhancements to the required performance of the EBS are necessary. The performance assessment calculations are performed by coupling the EBS with the geologic setting based on the models (some of which were updated for this study) and assumptions used for the 1995 Total System Performance Assessment (TSPA). The need for enhancements is determined by comparing the performance assessment results against the EBS related performance requirements. Subsystem quantitative performance requirements related to the EBS include the requirement to allow no more than 1% of the waste packages (WPs) to fail before 1,000 years after permanent closure of the repository, as well as a requirement to control the release rate of radionuclides from the EBS. The EBS performance enhancements considered included additional engineered components as well as evaluating additional performance available from existing design features but for which no performance credit is currently being taken.

  15. Roles of Working Memory Performance and Instructional Strategy in Complex Cognitive Task Performance

    Science.gov (United States)

    Cevik, V.; Altun, A.

    2016-01-01

    This study aims to investigate how working memory (WM) performances and instructional strategy choices affect learners' complex cognitive task performance in online environments. Three different e-learning environments were designed based on Merrill's (2006a) model of instructional strategies. The lack of experimental research on his framework is…

  16. Input data required for specific performance assessment codes

    International Nuclear Information System (INIS)

    Seitz, R.R.; Garcia, R.S.; Starmer, R.J.; Dicke, C.A.; Leonard, P.R.; Maheras, S.J.; Rood, A.S.; Smith, R.W.

    1992-02-01

    The Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory generated this report on input data requirements for computer codes to assist States and compacts in their performance assessments. This report gives generators, developers, operators, and users some guidelines on what input data is required to satisfy 22 common performance assessment codes. Each of the codes is summarized and a matrix table is provided to allow comparison of the various input required by the codes. This report does not determine or recommend which codes are preferable

  17. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  18. Effects of long-term practice and task complexity on brain activities when performing abacus-based mental calculations: a PET study

    International Nuclear Information System (INIS)

    Wu, Tung-Hsin; Chen, Chia-Lin; Huang, Yung-Hui; Liu, Ren-Shyan; Hsieh, Jen-Chuen; Lee, Jason J.S.

    2009-01-01

    The aim of this study was to examine the neural bases for the exceptional mental calculation ability possessed by Chinese abacus experts through PET imaging. We compared the different regional cerebral blood flow (rCBF) patterns using ¹⁵O-water PET in 10 abacus experts and 12 non-experts while they were performing each of the following three tasks: covert reading, simple addition, and complex contiguous addition. All data collected were analyzed using SPM2 and MNI templates. For non-experts during simple addition tasks, the observed brain-region activations were associated with coordination of language (inferior frontal network) and visuospatial processing (left parietal/frontal network). Similar activation patterns but with a larger visuospatial processing involvement were observed during complex contiguous addition tasks, suggesting the recruitment of more visuospatial memory for solving the complex problems. For abacus experts, however, the brain activation patterns showed only slight differences between simple and complex addition tasks, both of which involved visuospatial processing (bilateral parietal/frontal network). These findings supported the notion that the experts were completing the entire calculation process on a virtual mental abacus and relying on this same computational strategy in both simple and complex tasks, which required almost no increase in brain workload for solving the latter. In conclusion, after intensive training and practice, the neural pathways in an abacus expert have been connected more effectively for performing the number encoding and retrieval required in abacus tasks, resulting in exceptional mental computational ability. (orig.)

  19. Functional Requirements for Fab-7 Boundary Activity in the Bithorax Complex

    Science.gov (United States)

    Wolle, Daniel; Cleard, Fabienne; Aoki, Tsutomu; Deshpande, Girish; Karch, Francois

    2015-01-01

    Chromatin boundaries are architectural elements that determine the three-dimensional folding of the chromatin fiber and organize the chromosome into independent units of genetic activity. The Fab-7 boundary from the Drosophila bithorax complex (BX-C) is required for the parasegment-specific expression of the Abd-B gene. We have used a replacement strategy to identify sequences that are necessary and sufficient for Fab-7 boundary function in the BX-C. Fab-7 boundary activity is known to depend on factors that are stage specific, and we describe a novel ∼700-kDa complex, the late boundary complex (LBC), that binds to Fab-7 sequences that have insulator functions in late embryos and adults. We show that the LBC is enriched in nuclear extracts from late, but not early, embryos and that it contains three insulator proteins, GAF, Mod(mdg4), and E(y)2. Its DNA binding properties are unusual in that it requires a minimal sequence of >65 bp; however, other than a GAGA motif, the three Fab-7 LBC recognition elements display few sequence similarities. Finally, we show that mutations which abrogate LBC binding in vitro inactivate the Fab-7 boundary in the BX-C. PMID:26303531

  20. Functional Requirements for Fab-7 Boundary Activity in the Bithorax Complex.

    Science.gov (United States)

    Wolle, Daniel; Cleard, Fabienne; Aoki, Tsutomu; Deshpande, Girish; Schedl, Paul; Karch, Francois

    2015-11-01

    Chromatin boundaries are architectural elements that determine the three-dimensional folding of the chromatin fiber and organize the chromosome into independent units of genetic activity. The Fab-7 boundary from the Drosophila bithorax complex (BX-C) is required for the parasegment-specific expression of the Abd-B gene. We have used a replacement strategy to identify sequences that are necessary and sufficient for Fab-7 boundary function in the BX-C. Fab-7 boundary activity is known to depend on factors that are stage specific, and we describe a novel ∼700-kDa complex, the late boundary complex (LBC), that binds to Fab-7 sequences that have insulator functions in late embryos and adults. We show that the LBC is enriched in nuclear extracts from late, but not early, embryos and that it contains three insulator proteins, GAF, Mod(mdg4), and E(y)2. Its DNA binding properties are unusual in that it requires a minimal sequence of >65 bp; however, other than a GAGA motif, the three Fab-7 LBC recognition elements display few sequence similarities. Finally, we show that mutations which abrogate LBC binding in vitro inactivate the Fab-7 boundary in the BX-C. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  1. Performance of predictive models in phase equilibria of complex associating systems: PC-SAFT and CEOS/GE

    Directory of Open Access Journals (Sweden)

    N. Bender

    2013-03-01

    Cubic equations of state combined with excess Gibbs energy predictive models (like UNIFAC) and equations of state based on applied statistical mechanics are among the main alternatives for phase equilibria prediction involving polar substances over wide temperature and pressure ranges. In this work, the predictive performances of PC-SAFT with an association contribution and of Peng-Robinson (PR) combined with UNIFAC (Do) through mixing rules are compared. Binary and multi-component systems involving polar and non-polar substances were analyzed. Results were also compared to experimental data available in the literature. Results show a similar predictive performance for PC-SAFT with association and for cubic equations combined with UNIFAC (Do) through mixing rules. Although PC-SAFT with association requires fewer parameters, it is more complex and requires more computation time.

  2. Sleep-related offline improvements in gross motor task performance occur under free recall requirements

    Directory of Open Access Journals (Sweden)

    Andreas Malangre

    2016-03-01

    Nocturnal sleep effects on memory consolidation following gross motor sequence learning were examined using a complex arm movement task. This task required participants to produce non-regular spatial patterns in the horizontal plane by successively fitting a small peg into different target-holes on an electronic pegboard. The respective reaching movements typically differed in amplitude and direction. Targets were visualized prior to each transport movement on a computer screen. With this task we tested 18 subjects (22.6 +/- 1.9 years; 8 female) using a between-subjects design. Participants initially learned a 10-element arm movement sequence either in the morning or in the evening. Performance was retested under free recall requirements 15 minutes post training, as well as 12 hrs and 24 hrs later. Thus each group was provided with one sleep-filled and one wake retention interval. Dependent variables were error rate (number of erroneous sequences) and average sequence execution time (correct sequences only). Performance improved during acquisition. Error rate remained stable across retention. Sequence execution time (inverse to execution speed) significantly decreased again during the sleep-filled retention intervals, but remained stable during the respective wake intervals. These results corroborate recent findings on sleep-related enhancement of consolidation in ecologically valid, complex gross motor tasks. At the same time they suggest this effect to be truly memory-based and independent of repeated access to extrinsic sequence information during retests.

  3. The contribution of material control to meeting performance requirements

    International Nuclear Information System (INIS)

    Rivers, J.D.

    1989-01-01

    The U.S. Dept. of Energy (DOE) is in the process of implementing a set of performance requirements for material control and accountability (MC&A). These graded requirements set a uniform level of performance for similar materials at various facilities with respect to the threat of an insider adversary stealing special nuclear material (SNM). These requirements are phrased in terms of detecting the theft of a goal quantity of SNM within a specified time period and with a probability greater than or equal to a specified value, and include defense-in-depth requirements

  4. 40 CFR 180.1022 - Iodine-detergent complex; exemption from the requirement of a tolerance.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Iodine-detergent complex; exemption... FOOD Exemptions From Tolerances § 180.1022 Iodine-detergent complex; exemption from the requirement of a tolerance. The aqueous solution of hydriodic acid and elemental iodine, including one or both of...

  5. Germination and seedling establishment in orchids: a complex of requirements.

    Science.gov (United States)

    Rasmussen, Hanne N; Dixon, Kingsley W; Jersáková, Jana; Těšitelová, Tamara

    2015-09-01

    Seedling recruitment is essential to the sustainability of any plant population. Due to the minute nature of seeds and early-stage seedlings, orchid germination in situ was for a long time practically impossible to observe, creating an obstacle towards understanding seedling site requirements and fluctuations in orchid populations. The introduction of seed packet techniques for sowing and retrieval in natural sites has brought with it important insights, but many aspects of orchid seed and germination biology remain largely unexplored. The germination niche for orchids is extremely complex, because it is defined by requirements not only for seed lodging and germination, but also for presence of a fungal host and its substrate. A mycobiont that the seedling can parasitize is considered an essential element, and a great diversity of Basidiomycota and Ascomycota have now been identified for their role in orchid seed germination, with fungi identifiable as imperfect Rhizoctonia species predominating. Specificity patterns vary from orchid species employing a single fungal lineage to species associating individually with a limited selection of distantly related fungi. A suitable organic carbon source for the mycobiont constitutes another key requirement. Orchid germination also relies on factors that generally influence the success of plant seeds, both abiotic, such as light/shade, moisture, substrate chemistry and texture, and biotic, such as competitors and antagonists. Complexity is furthermore increased when these factors influence seeds/seedling, fungi and fungal substrate differentially. A better understanding of germination and seedling establishment is needed for conservation of orchid populations. Due to the obligate association with a mycobiont, the germination niches in orchid species are extremely complex and varied. Microsites suitable for germination can be small and transient, and direct observation is difficult. An experimental approach using several

  6. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
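The topological entropy (TE) measure used above can be sketched in a few lines: TE is the logarithm of the spectral radius (largest eigenvalue) of the grammar chart's transition, or adjacency, matrix. The following is an illustrative estimate via power iteration, under stated assumptions, not Bollt and Jones's (2000) exact matrix-lift-action procedure:

```python
import math

def topological_entropy(adj, iters=200):
    """Estimate topological entropy as log of the spectral radius of a
    0/1 adjacency matrix (list of lists), using plain power iteration.
    Assumes the matrix is nonnegative and primitive so the iteration
    converges to the dominant eigenvalue."""
    n = len(adj)
    v = [1.0] * n
    lam = 1.0
    for _ in range(iters):
        w = [sum(adj[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = max(abs(x) for x in w) or 1.0   # current eigenvalue estimate
        v = [x / lam for x in w]              # renormalize
    return math.log(lam)

# Full shift on 2 symbols: every transition allowed -> entropy log 2
full = [[1, 1], [1, 1]]
print(round(topological_entropy(full), 4))  # prints 0.6931
```

A grammar whose chart forbids more transitions has a smaller dominant eigenvalue and hence lower TE, which is the complexity ordering the meta-analysis correlates with task performance.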

  7. Complexities and constraints influencing learner performance in physical science

    Directory of Open Access Journals (Sweden)

    Mavhungu Abel Mafukata

    2016-01-01

    This paper explores complexities and constraints affecting the performance and output of physical science learners in Vhembe District, Limpopo Province, South Africa. The study was motivated by the desire of the researcher to establish, profile and characterise the complexities and constraints underlying the poor performance of learners in physical science as measured through end-of-year Grade 12 (final year of high school education) examination results. Twenty-six schools (n=26) were purposively selected from three circuits of education (n=3). From these schools, two learners were randomly selected (n=52) for interviews. In addition, two circuit managers (n=2) were conveniently selected as part of Key Informant Interviews (KII). For the Focus Group Discussions (FGDs), twelve (n=12) parents were randomly selected to form two groups of six members each. Multi-factor complexities and constraints impeding the performance of learners were discovered. An intensive teacher in-service programme is recommended. Community engagement should be encouraged to educate parents on the value of involvement in the education of their children. Free-access learner support structures such as Homework and Extra-lessons Assistance Centres (H&EACs) should be established.

  8. A performance requirements analysis of the SSC control system

    International Nuclear Information System (INIS)

    Hunt, S.M.; Low, K.

    1992-01-01

    This paper presents the results of analysis of the performance requirements of the Superconducting Super Collider Control System. We quantify the performance requirements of the system in terms of response time, throughput and reliability. We then examine the effect of distance and traffic patterns on control system performance and examine how these factors influence the implementation of the control network architecture and compare the proposed system against those criteria. (author)

  9. Comparison of energy performance requirements levels

    DEFF Research Database (Denmark)

    Spiekman, Marleen; Thomsen, Kirsten Engelund; Rose, Jørgen

    This summary report provides a synthesis of the work within the EU SAVE project ASIEPI on developing a method to compare the energy performance (EP) requirement levels among the countries of Europe. Comparing EP requirement levels constitutes a major challenge: from the comparison of, for instance, the present Dutch requirement level (EPC) of 0.8 with the present Flemish level of E80, it can easily be seen that direct comparison is not possible. The conclusions and recommendations of the study are presented in part A. These constitute the most important result of the project. Part B gives an overview of all other project material related to that topic, which allows the most pertinent information to be identified easily. Part C lists the project partners and sponsors.

  10. Cooperative decoding in femtocell networks: Performance-complexity tradeoff

    KAUST Repository

    Benkhelifa, Fatma

    2012-06-01

    Femtocells, which are low-cost, low-power, stand-alone cellular access points, are a potential solution to provide good indoor coverage with high data rates. However, femtocell deployment may also increase the co-channel interference (CCI) by reducing the distance reuse of the spectrum. In this paper, we introduce methods to cancel out the interference among the femtocells while considering that macrocells operate orthogonally to the femtocells. The femtocells may also cooperate through joint detection of the received signal and improve the overall error performance at the expense of an increased computational complexity. In this paper, the performance-complexity tradeoff of cooperative detection is investigated for uplink transmissions. Numerical results show that the cooperative detection gain may reach 10 dB at a Bit-Error Rate (BER) of 10⁻² when compared to the case without cooperation. © 2012 IEEE.

  11. Cooperative decoding in femtocell networks: Performance-complexity tradeoff

    KAUST Repository

    Benkhelifa, Fatma; Rezki, Zouheir; Alouini, Mohamed-Slim

    2012-01-01

    Femtocells, which are low-cost, low-power, stand-alone cellular access points, are a potential solution to provide good indoor coverage with high data rates. However, femtocell deployment may also increase the co-channel interference (CCI) by reducing the distance reuse of the spectrum. In this paper, we introduce methods to cancel out the interference among the femtocells while considering that macrocells operate orthogonally to the femtocells. The femtocells may also cooperate through joint detection of the received signal and improve the overall error performance at the expense of an increased computational complexity. In this paper, the performance-complexity tradeoff of cooperative detection is investigated for uplink transmissions. Numerical results show that the cooperative detection gain may reach 10 dB at a Bit-Error Rate (BER) of 10⁻² when compared to the case without cooperation. © 2012 IEEE.

  12. Complex matrix multiplication operations with data pre-conditioning in a high performance computing architecture

    Science.gov (United States)

    Eichenberger, Alexandre E; Gschwind, Michael K; Gunnels, John A

    2014-02-11

    Mechanisms for performing a complex matrix multiplication operation are provided. A vector load operation is performed to load a first vector operand of the complex matrix multiplication operation to a first target vector register. The first vector operand comprises a real and imaginary part of a first complex vector value. A complex load and splat operation is performed to load a second complex vector value of a second vector operand and replicate the second complex vector value within a second target vector register. The second complex vector value has a real and imaginary part. A cross multiply add operation is performed on elements of the first target vector register and elements of the second target vector register to generate a partial product of the complex matrix multiplication operation. The partial product is accumulated with other partial products and a resulting accumulated partial product is stored in a result vector register.
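The load-splat and cross-multiply-add sequence described above can be modeled lane-by-lane in scalar code. The sketch below is a hypothetical Python model of what each vector lane computes, not the actual vector instruction set:

```python
def complex_load_splat(z, lanes=2):
    """Model of the complex load-and-splat: replicate one (re, im)
    pair across all vector lanes."""
    return [z] * lanes

def cross_multiply_add(acc, a, b_splat):
    """Model of one cross-multiply-add step of a complex
    multiply-accumulate: per lane, accumulate
    (ar + i*ai) * (br + i*bi) into the running partial product."""
    out = []
    for (ar, ai), (br, bi), (cr, ci) in zip(a, b_splat, acc):
        out.append((cr + ar * br - ai * bi,   # real part
                    ci + ar * bi + ai * br))  # imaginary part
    return out

a = [(1.0, 2.0), (3.0, -1.0)]        # first vector operand, (re, im) pairs
b = complex_load_splat((0.5, 0.25))  # second operand splatted across lanes
acc = [(0.0, 0.0)] * 2               # accumulator of partial products
print(cross_multiply_add(acc, a, b)) # prints [(0.0, 1.25), (1.75, 0.25)]
```

Accumulating such partial products over the inner dimension of the matrices yields the full complex matrix product, which is what the result vector register holds at the end.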

  13. 42 CFR 493.1453 - Condition: Laboratories performing high complexity testing; clinical consultant.

    Science.gov (United States)

    2010-10-01

    ... Condition: Laboratories performing high complexity testing; clinical consultant. The laboratory must have a... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing high complexity testing; clinical consultant. 493.1453 Section 493.1453 Public Health CENTERS FOR MEDICARE & MEDICAID...

  14. 14 CFR 171.321 - DME and marker beacon performance requirements.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false DME and marker beacon performance... (MLS) § 171.321 DME and marker beacon performance requirements. (a) The DME equipment must meet the..._regulations/ibr_locations.html. (b) MLS marker beacon equipment must meet the performance requirements...

  15. Investigation of high-alpha lateral-directional control power requirements for high-performance aircraft

    Science.gov (United States)

    Foster, John V.; Ross, Holly M.; Ashley, Patrick A.

    1993-01-01

    Designers of the next-generation fighter and attack airplanes are faced with the requirements of good high angle-of-attack maneuverability as well as efficient high speed cruise capability with low radar cross section (RCS) characteristics. As a result, they are challenged with the task of making critical design trades to achieve the desired levels of maneuverability and performance. This task has highlighted the need for comprehensive, flight-validated lateral-directional control power design guidelines for high angles of attack. A joint NASA/U.S. Navy study has been initiated to address this need and to investigate the complex flight dynamics characteristics and controls requirements for high angle-of-attack lateral-directional maneuvering. A multi-year research program is underway which includes ground-based piloted simulation and flight validation. This paper will give a status update of this program that will include a program overview, description of test methodology and preliminary results.

  16. 14 CFR 151.53 - Performance of construction work: Labor requirements.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Performance of construction work: Labor... § 151.53 Performance of construction work: Labor requirements. A sponsor who is required to include in a... during the performance of work under the contract, to the extent necessary to determine whether the...

  17. High-performance mussel-inspired adhesives of reduced complexity.

    Science.gov (United States)

    Ahn, B Kollbe; Das, Saurabh; Linstadt, Roscoe; Kaufman, Yair; Martinez-Rodriguez, Nadine R; Mirshafian, Razieh; Kesselman, Ellina; Talmon, Yeshayahu; Lipshutz, Bruce H; Israelachvili, Jacob N; Waite, J Herbert

    2015-10-19

    Despite the recent progress in and demand for wet adhesives, practical underwater adhesion remains limited or non-existent for diverse applications. Translation of mussel-inspired wet adhesion typically entails catechol functionalization of polymers and/or polyelectrolytes, and solution processing of many complex components and steps that require optimization and stabilization. Here we reduced the complexity of a wet adhesive primer to synthetic low-molecular-weight catecholic zwitterionic surfactants that show very strong adhesion (∼50 mJ m⁻²) and retain the ability to coacervate. This catecholic zwitterion adheres to diverse surfaces and self-assembles into a molecularly smooth, thin adhesive for nanofabrication. This study significantly simplifies bio-inspired themes for wet adhesion by combining catechol with hydrophobic and electrostatic functional groups in a small molecule.

  18. Product variety, product complexity and manufacturing operational performance: A systematic literature review

    DEFF Research Database (Denmark)

    Trattner, Alexandria Lee; Hvam, Lars; Herbert-Hansen, Zaza Nadja Lee

    Manufacturing in the twenty-first century has been wrought with the struggle to satisfy the rising demand for greater product variety and more complex products while still maintaining efficient manufacturing operations. However, the literature lacks an overview of which operational performance measures are most affected by increased variety and complexity. This study presents a systematic literature review of the recent scholarly literature on variety, complexity and manufacturing operational performance (MOP). Results show that product variety has a consistently negative relationship with MOP across different time, cost, quality and flexibility measures, while product complexity lacks evidence of strong relationships with MOP measures.

  19. The Role of Task Complexity, Modality, and Aptitude in Narrative Task Performance

    Science.gov (United States)

    Kormos, Judit; Trebits, Anna

    2012-01-01

    The study reported in this paper investigated the relationship between components of aptitude and the fluency, lexical variety, syntactic complexity, and accuracy of performance in two types of written and spoken narrative tasks. We also addressed the question of how narrative performance varies in tasks of different cognitive complexity in the…

  20. Working group 4B - human intrusion: Design/performance requirements

    International Nuclear Information System (INIS)

    Channell, J.

    1993-01-01

    There is no summary of the progress made by working group 4B (Human Intrusion: Design/performance Requirements) during the Electric Power Research Institute's EPRI Workshop on the technical basis of EPA HLW Disposal Criteria, March 1993. This group was to discuss the waste disposal standard, 40 CFR Part 191, in terms of the design and performance requirements of human intrusion. Instead, because there were so few members, they combined with working group 4A and studied the three-tier approach to evaluating postclosure performance

  1. Performance Requirements for the Double Shell Tank (DST) System

    International Nuclear Information System (INIS)

    SMITH, D.F.

    2001-01-01

    This document identifies the upper-level Double-Shell Tank (DST) System functions and bounds the associated performance requirements. The functions and requirements are provided along with supporting bases. These functions and requirements, in turn, will be incorporated into specifications for the DST System

  2. Performance objectives for the Hanford Immobilized Low-Activity Waste (ILAW) performance assessment

    International Nuclear Information System (INIS)

    MANN, F.M.

    1999-01-01

    Performance objectives for the disposal of low activity waste from Hanford Waste Tanks have been developed. These objectives have been based on DOE requirements, programmatic requirements, and public involvement. The DOE requirements include regulations that direct the performance assessment and are cited within the Radioactive Waste Management Order (DOE Order 435.1). Performance objectives for other DOE complex performance assessments have been included

  3. Radioactive Waste Management Complex performance assessment: Draft

    Energy Technology Data Exchange (ETDEWEB)

    Case, M.J.; Maheras, S.J.; McKenzie-Carter, M.A.; Sussman, M.E.; Voilleque, P.

    1990-06-01

    A radiological performance assessment of the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory was conducted to demonstrate compliance with appropriate radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the general public. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the general public via air, ground water, and food chain pathways. Projections of doses were made for both offsite receptors and individuals intruding onto the site after closure. In addition, uncertainty analyses were performed. Results of calculations made using nominal data indicate that the radiological doses will be below appropriate radiological criteria throughout operations and after closure of the facility. Recommendations were made for future performance assessment calculations.

  4. Radioactive Waste Management Complex performance assessment: Draft

    International Nuclear Information System (INIS)

    Case, M.J.; Maheras, S.J.; McKenzie-Carter, M.A.; Sussman, M.E.; Voilleque, P.

    1990-06-01

    A radiological performance assessment of the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory was conducted to demonstrate compliance with appropriate radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the general public. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the general public via air, ground water, and food chain pathways. Projections of doses were made for both offsite receptors and individuals intruding onto the site after closure. In addition, uncertainty analyses were performed. Results of calculations made using nominal data indicate that the radiological doses will be below appropriate radiological criteria throughout operations and after closure of the facility. Recommendations were made for future performance assessment calculations

  5. 40 CFR 158.2160 - Microbial pesticides product performance data requirements.

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 23 2010-07-01 2010-07-01 false Microbial pesticides product... AGENCY (CONTINUED) PESTICIDE PROGRAMS DATA REQUIREMENTS FOR PESTICIDES Microbial Pesticides § 158.2160 Microbial pesticides product performance data requirements. Product performance data must be developed for...

  6. Effects of task complexity on rhythmic reproduction performance in adults.

    Science.gov (United States)

    Iannarilli, Flora; Vannozzi, Giuseppe; Iosa, Marco; Pesce, Caterina; Capranica, Laura

    2013-02-01

    The aim of the present study was to investigate the effect of task complexity on the capability to reproduce rhythmic patterns. Sedentary musically illiterate individuals (age: 34.8±4.2 yrs; M±SD) were administered a rhythmic test including three rhythmic patterns to be reproduced by means of finger-tapping, foot-tapping and walking. For the quantification of subjects' ability in the reproduction of rhythmic patterns, qualitative and quantitative parameters were submitted to analysis. A stereophotogrammetric system was used to reconstruct and evaluate individual performances. The findings indicated a good internal stability of the rhythmic reproduction, suggesting that the present experimental design is suitable to discriminate the participants' rhythmic ability. Qualitative aspects of rhythmic reproduction (i.e., speed of execution and temporal ratios between events) varied as a function of the perceptual-motor requirements of the rhythmic reproduction task, with larger reproduction deviations in the walking task. Copyright © 2013 Elsevier B.V. All rights reserved.
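A toy illustration of the kind of qualitative parameter described above: comparing the temporal ratios between successive events in a produced tap sequence with those of the target pattern. The function name, the pattern and the numbers are illustrative, not taken from the study.

```python
# Compare normalised inter-tap intervals (ITIs) of a reproduction with
# those of the target rhythmic pattern; 0.0 means a perfect ratio match.
def ratio_deviation(target_iti, produced_iti):
    """Mean absolute deviation between normalised inter-tap intervals."""
    t_sum, p_sum = sum(target_iti), sum(produced_iti)
    t_norm = [x / t_sum for x in target_iti]   # tempo-independent ratios
    p_norm = [x / p_sum for x in produced_iti]
    return sum(abs(a - b) for a, b in zip(t_norm, p_norm)) / len(t_norm)

# Target long-short-short pattern vs. a slightly distorted reproduction:
dev = ratio_deviation([0.50, 0.25, 0.25], [0.48, 0.26, 0.26])
```

Normalising by the total duration makes the measure insensitive to overall speed of execution, so tempo and ratio errors can be scored separately.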

  7. Procuring complex performance:case: public infrastructure projects

    OpenAIRE

    Leppänen, T. (Tero)

    2015-01-01

    Abstract This research studies procuring complex performance (PCP) in the case of public infrastructure projects. Focus of the research is on the interface between public clients and private sector contractors. Purpose of this research is to find out what are the main challenges of different project delivery methods according to literature (RQ1) and what are the practical challenges of public procurement (RQ2). As an end re...

  8. Assessing students' performance in software requirements engineering education using scoring rubrics

    Science.gov (United States)

    Mkpojiogu, Emmanuel O. C.; Hussain, Azham

    2017-10-01

The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students, and whether its use can lead to improvement in students' development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that the use of scoring rubrics is very helpful in objectively assessing the performance of software requirements or software engineering students. Furthermore, the results revealed that the use of scoring rubrics can also give a good indication of achievement direction, showing whether a student is improving or not across repeated or iterative assessments. In a nutshell, its use leads to the performance improvement of students. The results provided some insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.

  9. Performances and improvement of copper-hydrazine complexation deoxidising resin

    International Nuclear Information System (INIS)

    Liu Fenfen; Zhang Hao; Sun Haijun; Liu Xiaojie

    2012-01-01

Copper-hydrazine complexation deoxidising resin is tested to examine its performance, including effluent water quality and deoxidising capacity. By means of changing the resin type and the regeneration procedure, the deoxidising capacity of the resin can be improved to 13 times its previous value. At the same time, the physical performance of the resin is also greatly improved while maintaining its deoxidisation velocity and effluent quality. (authors)

  10. Complex rectal polyps: other treatment modalities required when offering a transanal endoscopic microsurgery service.

    LENUS (Irish Health Repository)

    Joyce, Myles R

    2011-09-01

    Complex rectal polyps may present a clinical challenge. The study aim was to assess different treatment modalities required in the management of patients referred for transanal endoscopic microsurgery.

  11. Real-Time and Real-Fast Performance of General-Purpose and Real-Time Operating Systems in Multithreaded Physical Simulation of Complex Mechanical Systems

    Directory of Open Access Journals (Sweden)

    Carlos Garre

    2014-01-01

Physical simulation is a valuable tool in many fields of engineering for the tasks of design, prototyping, and testing. General-purpose operating systems (GPOS) are designed for real-fast tasks, such as offline simulation of complex physical models that should finish as soon as possible. Interfacing hardware at a given rate (as in a hardware-in-the-loop test) requires instead maximizing time determinism, for which real-time operating systems (RTOS) are designed. In this paper, real-fast and real-time performance of RTOS and GPOS are compared when simulating models of high complexity with large time steps. This type of application is usually present in the automotive industry and requires a good trade-off between real-fast and real-time performance. The performance of an RTOS and a GPOS is compared by running a tire model scalable on the number of degrees-of-freedom and parallel threads. The benchmark shows that the GPOS presents better performance in real-fast runs but worse in real-time runs due to nonexplicit task switches and to the latency associated with interprocess communication (IPC) and task switches.
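The real-time side of this trade-off can be observed with a simple periodic-loop experiment; a minimal sketch (the function name, rate and cycle count are illustrative). On a GPOS the mean lateness is usually small, but occasional large spikes dominate the worst case, which is what matters for hardware-in-the-loop use.

```python
import time

# Run a sleep-based periodic loop and record how late each activation
# fires relative to its fixed schedule.
def measure_jitter(period_s=0.002, cycles=50):
    """Return per-cycle lateness (seconds) of a periodic activation."""
    lateness = []
    next_t = time.perf_counter() + period_s
    for _ in range(cycles):
        dt = next_t - time.perf_counter()
        if dt > 0:
            time.sleep(dt)                 # a GPOS may oversleep here
        lateness.append(time.perf_counter() - next_t)
        next_t += period_s                 # fixed schedule, no drift
    return lateness

late = measure_jitter()
worst = max(late)                          # worst-case lateness
```

On an RTOS, or under a real-time scheduling class, the same loop shows a much tighter worst-case lateness at the cost of raw throughput.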

  12. Performance in complex motor tasks deteriorates in hyperthermic humans

    DEFF Research Database (Denmark)

    Piil, Jacob Feder; Lundbye-Jensen, Jesper; Trangmar, Steven J

    2017-01-01

Visuo-motor tracking performance was reduced by 10.7 ± 6.5% following exercise-induced hyperthermia when integrated in the multipart protocol and 4.4 ± 5.7% when tested separately (both P ...) ... 1.3% (P ...) ... math tasks ... of information or decision-making prior to responding. We hypothesized that divergences could relate to task complexity and developed a protocol consisting of 1) simple motor task [TARGET_pinch], 2) complex motor task [Visuo-motor tracking], 3) simple math task [MATH_type], 4) combined motor-math task [MATH...

  13. COMPLEX TRAINING: A BRIEF REVIEW

    Directory of Open Access Journals (Sweden)

    William P. Ebben

    2002-06-01

The effectiveness of plyometric training is well supported by research. Complex training has gained popularity as a training strategy combining weight training and plyometric training. Anecdotal reports recommend training in this fashion in order to improve muscular power and athletic performance. Recently, several studies have examined complex training. Despite the fact that questions remain about the potential effectiveness and implementation of this type of training, results of recent studies are useful in guiding practitioners in the development and implementation of complex training programs. In some cases, research suggests that complex training has an acute ergogenic effect on upper body power, and the results of acute and chronic complex training include improved jumping performance. Improved performance may require three to four minutes rest between the weight training and plyometrics sets and the use of heavy weight training loads.

  14. 29 CFR 1620.15 - Jobs requiring equal skill in performance.

    Science.gov (United States)

    2010-07-01

    ... EQUAL PAY ACT § 1620.15 Jobs requiring equal skill in performance. (a) In general. The jobs to which the equal pay standard is applicable are jobs requiring equal skill in their performance. Where the amount... another job, the equal pay standard cannot apply even though the jobs may be equal in all other respects...

  15. 29 CFR 1620.16 - Jobs requiring equal effort in performance.

    Science.gov (United States)

    2010-07-01

    ..., however, that men and women are working side by side on a line assembling parts. Suppose further that one... 29 Labor 4 2010-07-01 2010-07-01 false Jobs requiring equal effort in performance. 1620.16 Section... EQUAL PAY ACT § 1620.16 Jobs requiring equal effort in performance. (a) In general. The jobs to which...

  16. Structuring requirements as necessary premise for customer-oriented development of complex products: A generic approach

    Directory of Open Access Journals (Sweden)

    Sandra Klute

    2011-10-01

Purpose: Complex products like, for example, intra-logistical facilities make high demands on developers and producers and involve high investment and operating costs. When planning and developing, and also when making buying decisions, the facility utilization and the thus ensuing requirements on the facility and its components are inadequately considered to date. Nevertheless, with regard to customer-directed product design, these requirements must all be taken into account, especially as they can contribute to possible savings. In this context, it is necessary to survey and systematically regard requirements from a large number of areas, for example the operator, the facility producer and also requirements of external parties such as the law, and to translate them into adequate product characteristics in order to produce customer-oriented products. This is, however, a difficult task because of the diversity of stakeholders involved and their numerous and often divergent requirements. Therefore, it is essential to structure the requirements so that planners and developers are able to manage the large amount of information. Structure models can be used in this context to cluster requirements. Within the German Collaborative Research Centre 696, a 10-dimensional model has been developed. This model allows structuring of all requirements on intra-logistical facilities or, respectively, complex products in general. In the context of dealing with hundreds of data records, structuring requirements is mandatory to achieve accuracy, clarity and consequently satisfactory results when transforming requirements into product characteristics which fit customer needs. In the paper an excerpt of this model is presented. Design/methodology/approach: In the literature, a multitude of methods exist which deal with the topic of structuring. The methods have been analysed regarding their purpose and their level of specification, i.e. the number of differentiated categories, to check if

  17. Technique of Substantiating Requirements for the Vision Systems of Industrial Robotic Complexes

    Directory of Open Access Journals (Sweden)

    V. Ya. Kolyuchkin

    2015-01-01

The literature lacks approaches for substantiating the technical requirements for the vision systems (VS) of industrial robotic complexes (IRC). Therefore, the objective of this work is to develop a technique for substantiating requirements for the main quality indicators of a VS functioning as part of an IRC. The proposed technique uses a model representation of the VS, which, as a part of the IRC information system, sorts the objects in the work area and measures their linear and angular coordinates. To solve the stated problem, we propose defining the target function of a designed IRC as the dependence of the IRC efficiency indicator on the VS quality indicators. The paper proposes to use, as an indicator of IRC efficiency, the probability of producing no faulty products during manufacturing. Based on the functions the VS performs as a part of the IRC information system, the accepted indicators of VS quality are as follows: the probability of proper recognition of objects in the IRC working area, and the confidence probabilities of measuring the linear and angular orientation coordinates of objects within specified values of permissible error. Specific values of these errors depend on the orientation errors of the working bodies of the manipulators that are a part of the IRC. The paper presents mathematical expressions that determine the functional dependence of the probability of producing no faulty products on the VS quality indicators and on the probability of failures of IRC technological equipment. The proposed technique for substantiating engineering requirements for the VS of an IRC is novel. The results obtained in this work can be useful for professionals involved in IRC VS development and, in particular, in the development of VS algorithms and software.
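As a toy illustration of such a target function: if the recognition, measurement and equipment-failure events are assumed independent (an assumption for illustration, not stated in the abstract), the probability of producing no faulty product is simply a product of the quality indicators. All names and values below are hypothetical.

```python
# Hypothetical composition of the abstract's quality indicators into
# the probability of producing no faulty product, under an assumed
# independence of the individual events.
def p_no_fault(p_recognition, p_linear, p_angular, p_equipment):
    """All arguments are probabilities in [0, 1]."""
    return p_recognition * p_linear * p_angular * p_equipment

# Illustrative values: proper recognition, linear/angular measurement
# within permissible error, and no technological-equipment failure.
p = p_no_fault(0.999, 0.995, 0.995, 0.99)
```

Even with each indicator near one, the product decays quickly, which is why requirements on each VS indicator must be substantiated jointly rather than in isolation.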

  18. Complex Hand Dexterity: A Review of Biomechanical Methods for Measuring Musical Performance

    Directory of Open Access Journals (Sweden)

    Cheryl Diane Metcalf

    2014-05-01

Complex hand dexterity is fundamental to our interactions with the physical, social and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities.

  19. Frequency Control Performance Measurement and Requirements

    Energy Technology Data Exchange (ETDEWEB)

    Illian, Howard F.

    2010-12-20

    Frequency control is an essential requirement of reliable electric power system operations. Determination of frequency control depends on frequency measurement and the practices based on these measurements that dictate acceptable frequency management. This report chronicles the evolution of these measurements and practices. As technology progresses from analog to digital for calculation, communication, and control, the technical basis for frequency control measurement and practices to determine acceptable performance continues to improve. Before the introduction of digital computing, practices were determined largely by prior experience. In anticipation of mandatory reliability rules, practices evolved from a focus primarily on commercial and equity issues to an increased focus on reliability. This evolution is expected to continue and place increased requirements for more precise measurements and a stronger scientific basis for future frequency management practices in support of reliability.
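A minimal sketch of the kind of statistic on which such measurement-based practices build: an RMS frequency error over a window of frequency samples. The 60 Hz nominal, the function name and the sample values are illustrative, not taken from the report.

```python
# Root-mean-square deviation of measured system frequency from nominal,
# a basic building block of frequency-control performance statistics.
def freq_error_rms(samples_hz, nominal_hz=60.0):
    """RMS frequency error (Hz) over a window of measurements."""
    errs = [f - nominal_hz for f in samples_hz]
    return (sum(e * e for e in errs) / len(errs)) ** 0.5

rms = freq_error_rms([60.01, 59.98, 60.00, 59.99])
```

With digital measurement, windows of such statistics can be computed continuously, which is what enables the shift from experience-based to scientifically grounded control practices described above.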

  20. High performance sealing - meeting nuclear and aerospace requirements

    International Nuclear Information System (INIS)

    Wensel, R.; Metcalfe, R.

    1994-11-01

Although high performance sealing is required in many places, two industries lead all others in terms of their demands: nuclear and aerospace. The factors that govern the high reliability and integrity of seals, particularly elastomer seals, for both industries are discussed. Aerospace requirements include low structural weight and a broad range of conditions, from the cold vacuum of space to the hot, high pressures of rocket motors. It is shown, by example, how a seal can be made an integral part of a structure in order to improve performance, rather than using a conventional handbook design. Typical processes are then described for selection, specification and procurement of suitable elastomers, functional and accelerated performance testing, database development and service-life prediction. Methods for quality assurance of elastomer seals are summarized. Potentially catastrophic internal defects are a particular problem for conventional non-destructive inspection techniques. A new method of elastodynamic testing for these is described. (author)

  1. 19 CFR 143.5 - System performance requirements.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false System performance requirements. 143.5 Section 143.5 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF... must demonstrate that his system can interface directly with the Customs computer and ensure accurate...

  2. Evaluation of the Trade Space Between UAS Maneuver Performance and SAA System Performance Requirements

    Science.gov (United States)

    Jack, Devin P.; Hoffler, Keith D.; Johnson, Sally C.

    2014-01-01

A need exists to safely integrate Unmanned Aircraft Systems (UAS) into the National Airspace System. Replacing manned aircraft's see-and-avoid capability in the absence of an onboard pilot is one of the key challenges associated with safe integration. Sense-and-avoid (SAA) systems will have to achieve yet-to-be-determined required separation distances for a wide range of encounters. They will also need to account for the maneuver performance of the UAS they are paired with. The work described in this paper is aimed at developing an understanding of the trade space between UAS maneuver performance and SAA system performance requirements. An assessment of current manned and unmanned aircraft performance was used to establish potential UAS performance test matrix bounds. Then, nearterm UAS integration work was used to narrow down the scope. A simulator was developed with sufficient fidelity to assess SAA system performance requirements for a wide range of encounters. The simulator generates closest-point-of-approach (CPA) data from the wide range of UAS performance models maneuvering against a single intruder with various encounter geometries. The simulator is described herein and has both a graphical user interface and batch interface to support detailed analysis of individual UAS encounters and macro analysis of a very large set of UAS and encounter models, respectively. Results from the simulator using approximate performance data from a well-known manned aircraft is presented to provide insight into the problem and as verification and validation of the simulator. Analysis of climb, descent, and level turn maneuvers to avoid a collision is presented. Noting the diversity of backgrounds in the UAS community, a description of the UAS aerodynamic and propulsive design and performance parameters is included. Initial attempts to model the results made it clear that developing maneuver performance groups is required. Discussion of the performance groups developed and how ...
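For a straight-line, constant-velocity encounter segment, the closest-point-of-approach that such a batch simulator sweeps over reduces to a small geometric calculation; a hedged sketch (the function name and units are illustrative, and real encounters of course involve maneuvering):

```python
# Closest point of approach for two aircraft on constant-velocity
# tracks: minimise |dp + dv * t| over t >= 0, where dp and dv are the
# relative position and velocity.
def cpa(p_own, v_own, p_intr, v_intr):
    """Return (t_cpa, d_cpa): time of closest approach (>= 0) and the
    separation distance at that time, in consistent units."""
    dp = [a - b for a, b in zip(p_own, p_intr)]   # relative position
    dv = [a - b for a, b in zip(v_own, v_intr)]   # relative velocity
    dv2 = sum(c * c for c in dv)
    if dv2 == 0.0:                                # parallel, same speed
        t = 0.0
    else:
        t = max(0.0, -sum(p * v for p, v in zip(dp, dv)) / dv2)
    d2 = sum((p + v * t) ** 2 for p, v in zip(dp, dv))
    return t, d2 ** 0.5

# Head-on encounter, 1 unit apart, each closing at 0.1 unit/s:
t, d = cpa((0.0, 0.0), (0.1, 0.0), (1.0, 0.0), (-0.1, 0.0))
```

Sweeping such a kernel over grids of UAS performance models and encounter geometries is what yields the CPA distributions from which separation requirements can be assessed.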

  3. Input data requirements for performance modelling and monitoring of photovoltaic plants

    DEFF Research Database (Denmark)

    Gavriluta, Anamaria Florina; Spataru, Sergiu; Sera, Dezso

    2018-01-01

This work investigates the input data requirements in the context of performance modeling of thin-film photovoltaic (PV) systems. The analysis focuses on the PVWatts performance model, well suited for on-line performance monitoring of PV strings, due to its low number of parameters and high ... Modelling the performance of the PV modules at high irradiances requires a dataset of only a few hundred samples in order to obtain a power estimation accuracy of ~1-2%.
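The PVWatts DC power model referred to above is indeed small; a sketch of its core equation, scaling the nameplate DC rating by irradiance and a linear cell-temperature correction (the rating and temperature coefficient below are illustrative values, not taken from the paper):

```python
# PVWatts-style DC power estimate: nameplate rating scaled by
# plane-of-array irradiance and a linear temperature derate.
def pvwatts_dc(g_poa, t_cell, p_dc0=3000.0, gamma=-0.0047):
    """DC power [W] from irradiance [W/m^2] and cell temperature [degC].
    p_dc0: nameplate DC rating at 1000 W/m^2 and 25 degC;
    gamma: power temperature coefficient [1/degC]."""
    return (g_poa / 1000.0) * p_dc0 * (1.0 + gamma * (t_cell - 25.0))

p = pvwatts_dc(800.0, 45.0)   # 800 W/m^2 on a hot cell
```

With only two free parameters per string, a few hundred high-irradiance samples suffice to fit the model, which is consistent with the accuracy figure quoted in the abstract.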

  4. Minimum Energy Requirements in Complex Distillation Arrangements

    Energy Technology Data Exchange (ETDEWEB)

    Halvorsen, Ivar J

    2001-07-01

Distillation is the most widely used industrial separation technology and distillation units are responsible for a significant part of the total heat consumption in the world's process industry. In this work we focus on directly (fully thermally) coupled column arrangements for separation of multicomponent mixtures. These systems are also denoted Petlyuk arrangements, where a particular implementation is the dividing wall column. Energy savings in the range of 20-40% have been reported with ternary feed mixtures. In addition to energy savings, such integrated units have also a potential for reduced capital cost, making them extra attractive. However, the industrial use has been limited, and difficulties in design and control have been reported as the main reasons. Minimum energy results have only been available for ternary feed mixtures and sharp product splits. This motivates further research in this area, and this thesis will hopefully give some contributions to better understanding of complex column systems. In the first part we derive the general analytic solution for minimum energy consumption in directly coupled columns for a multicomponent feed and any number of products. To our knowledge, this is a new contribution in the field. The basic assumptions are constant relative volatility, constant pressure and constant molar flows and the derivation is based on Underwood's classical methods. An important conclusion is that the minimum energy consumption in a complex directly integrated multi-product arrangement is the same as for the most difficult split between any pair of the specified products when we consider the performance of a conventional two-product column. We also present the Vmin-diagram, which is a simple graphical tool for visualisation of minimum energy related to feed distribution. The Vmin-diagram provides a simple mean to assess the detailed flow requirements for all parts of a complex directly coupled arrangement. The main purpose in the first ...
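Underwood's classical method mentioned above can be sketched for the simplest case, a sharp split of a binary feed in a conventional two-product column: solve the feed equation for the Underwood root, then evaluate the minimum vapor flow. The feed data below are illustrative, and the bisection solver is a naive stand-in for a proper root finder.

```python
# Underwood's feed equation: sum(alpha_i * z_i / (alpha_i - theta)) = 1 - q,
# with the active root theta bracketed between two adjacent volatilities.
def underwood_theta(alpha, z, q, lo, hi):
    """Solve for the Underwood root by bisection on (lo, hi)."""
    f = lambda th: sum(a * x / (a - th) for a, x in zip(alpha, z)) - (1.0 - q)
    lo, hi = lo + 1e-9, hi - 1e-9        # stay off the singularities
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(lo) * f(mid) <= 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

# Equimolar binary feed (relative volatilities 2 and 1), saturated
# liquid (q = 1), sharp split: all of component A to the distillate.
alpha, z = [2.0, 1.0], [0.5, 0.5]
theta = underwood_theta(alpha, z, q=1.0, lo=1.0, hi=2.0)
# Vmin/F = sum over distillate components of alpha_i * d_i / (alpha_i - theta)
vmin = alpha[0] * z[0] / (alpha[0] - theta)
```

Evaluating this minimum vapor flow for every candidate product split is exactly the computation the Vmin-diagram visualises for a whole directly coupled arrangement.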

  5. Minimum Energy Requirements in Complex Distillation Arrangements

    Energy Technology Data Exchange (ETDEWEB)

    Halvorsen, Ivar J.

    2001-07-01

    Distillation is the most widely used industrial separation technology and distillation units are responsible for a significant part of the total heat consumption in the world's process industry. In this work we focus on directly (fully thermally) coupled column arrangements for separation of multicomponent mixtures. These systems are also denoted Petlyuk arrangements, where a particular implementation is the dividing wall column. Energy savings in the range of 20-40% have been reported with ternary feed mixtures. In addition to energy savings, such integrated units have also a potential for reduced capital cost, making them extra attractive. However, the industrial use has been limited, and difficulties in design and control have been reported as the main reasons. Minimum energy results have only been available for ternary feed mixtures and sharp product splits. This motivates further research in this area, and this thesis will hopefully give some contributions to better understanding of complex column systems. In the first part we derive the general analytic solution for minimum energy consumption in directly coupled columns for a multicomponent feed and any number of products. To our knowledge, this is a new contribution in the field. The basic assumptions are constant relative volatility, constant pressure and constant molar flows and the derivation is based on Underwood's classical methods. An important conclusion is that the minimum energy consumption in a complex directly integrated multi-product arrangement is the same as for the most difficult split between any pair of the specified products when we consider the performance of a conventional two-product column. We also present the Vmin-diagram, which is a simple graphical tool for visualisation of minimum energy related to feed distribution. The Vmin-diagram provides a simple mean to assess the detailed flow requirements for all parts of a complex directly coupled arrangement. The main purpose in

  6. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  7. Performance and Complexity Analysis of Blind FIR Channel Identification Algorithms Based on Deterministic Maximum Likelihood in SIMO Systems

    DEFF Research Database (Denmark)

    De Carvalho, Elisabeth; Omar, Samir; Slock, Dirk

    2013-01-01

We analyze two algorithms that have been introduced previously for Deterministic Maximum Likelihood (DML) blind estimation of multiple FIR channels. The first one is a modification of the Iterative Quadratic ML (IQML) algorithm. IQML gives biased estimates of the channel and performs poorly at low ... to the initialization. Its asymptotic performance does not reach the DML performance though. The second strategy, called Pseudo-Quadratic ML (PQML), is naturally denoised. The denoising in PQML is furthermore more efficient than in DIQML: PQML yields the same asymptotic performance as DML, as opposed to DIQML, but requires a consistent initialization. We furthermore compare DIQML and PQML to the strategy of alternating minimization w.r.t. symbols and channel for solving DML (AQML). An asymptotic performance analysis, a complexity evaluation and simulation results are also presented. The proposed DIQML and PQML ...

  8. Design requirements and performance requirements for reactor fuel recycle manipulator systems

    International Nuclear Information System (INIS)

    Grundmann, J.G.

    1975-01-01

    The development of a new generation of remote handling devices for remote production work in support of reactor fuel recycle systems is discussed. These devices require greater mobility, speed and visual capability than remote handling systems used in research activities. An upgraded manipulator system proposed for a High-Temperature Gas-Cooled Reactor fuel refabrication facility is described. Design and performance criteria for the manipulators, cranes, and TV cameras in the proposed system are enumerated

  9. A simple approach to enhance the performance of complex-coefficient filter-based PLL in grid-connected applications

    DEFF Research Database (Denmark)

    Ramezani, Malek; Golestan, Saeed; Li, Shuhui

    2018-01-01

In recent years, a large number of three-phase phase-locked loops (PLLs) have been developed. One of the most popular ones is the complex-coefficient-filter-based PLL (CCF-PLL). The CCFs benefit from a sequence-selective filtering ability and, hence, enable the CCF-PLL to selectively reject/extract ... disturbances before the PLL control loop while maintaining an acceptable dynamic behavior. The aim of this paper is to present a simple yet effective approach to enhance the standard CCF-PLL performance without requiring any additional computational load.

  10. Task complexity, student perceptions of vocabulary learning in EFL, and task performance.

    Science.gov (United States)

    Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan

    2013-03-01

    The study deepened our understanding of how students' self-efficacy beliefs contribute to the context of teaching English as a foreign language in the framework of cognitive mediational paradigm at a fine-tuned task-specific level. The aim was to examine the relationship among task complexity, self-efficacy beliefs, domain-related prior knowledge, learning strategy use, and task performance as they were applied to English vocabulary learning from reading tasks. Participants were 120 second-year university students (mean age 21) from a Chinese university. This experiment had two conditions (simple/complex). A vocabulary level test was first conducted to measure participants' prior knowledge of English vocabulary. Participants were then randomly assigned to one of the learning tasks. Participants were administered task booklets together with the self-efficacy scales, measures of learning strategy use, and post-tests. Data obtained were submitted to multivariate analysis of variance (MANOVA) and path analysis. Results from the MANOVA model showed a significant effect of vocabulary level on self-efficacy beliefs, learning strategy use, and task performance. Task complexity showed no significant effect; however, an interaction effect between vocabulary level and task complexity emerged. Results from the path analysis showed self-efficacy beliefs had an indirect effect on performance. Our results highlighted the mediating role of self-efficacy beliefs and learning strategy use. Our findings indicate that students' prior knowledge plays a crucial role on both self-efficacy beliefs and task performance, and the predictive power of self-efficacy on task performance may lie in its association with learning strategy use. © 2011 The British Psychological Society.

  11. The effects of physical threat on team processes during complex task performance

    NARCIS (Netherlands)

    Kamphuis, W.; Gaillard, A.W.K.; Vogelaar, A.L.W.

    2011-01-01

    Teams have become the norm for operating in dangerous and complex situations. To investigate how physical threat affects team performance, 27 threeperson teams engaged in a complex planning and problem-solving task, either under physical threat or under normal conditions. Threat consisted of the

  12. Cost optimal building performance requirements. Calculation methodology for reporting on national energy performance requirements on the basis of cost optimality within the framework of the EPBD

    Energy Technology Data Exchange (ETDEWEB)

    Boermans, T.; Bettgenhaeuser, K.; Hermelink, A.; Schimschar, S. [Ecofys, Utrecht (Netherlands)

    2011-05-15

    On the European level, the principles for the requirements for the energy performance of buildings are set by the Energy Performance of Buildings Directive (EPBD). Dating from December 2002, the EPBD has set a common framework from which the individual Member States in the EU developed or adapted their individual national regulations. The EPBD in 2008 and 2009 underwent a recast procedure, with final political agreement having been reached in November 2009. The new Directive was then formally adopted on May 19, 2010. Among other clarifications and new provisions, the EPBD recast introduces a benchmarking mechanism for national energy performance requirements for the purpose of determining cost-optimal levels to be used by Member States for comparing and setting these requirements. The previous EPBD set out a general framework to assess the energy performance of buildings and required Member States to define maximum values for energy delivered to meet the energy demand associated with the standardised use of the building. However it did not contain requirements or guidance related to the ambition level of such requirements. As a consequence, building regulations in the various Member States have been developed by the use of different approaches (influenced by different building traditions, political processes and individual market conditions) and resulted in different ambition levels where in many cases cost optimality principles could justify higher ambitions. The EPBD recast now requests that Member States shall ensure that minimum energy performance requirements for buildings are set 'with a view to achieving cost-optimal levels'. The cost optimum level shall be calculated in accordance with a comparative methodology. The objective of this report is to contribute to the ongoing discussion in Europe around the details of such a methodology by describing possible details on how to calculate cost optimal levels and pointing towards important factors and

  13. A low complexity visualization tool that helps to perform complex systems analysis

    International Nuclear Information System (INIS)

    Beiro, M G; Alvarez-Hamelin, J I; Busch, J R

    2008-01-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; this property is useful, for example, to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the resulting figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.
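The linear-time behavior cited in this record comes from the standard degree-peeling algorithm for the k-core decomposition: repeatedly remove a minimum-degree vertex, and record the degree threshold at which each vertex falls out. A minimal Python sketch of that peeling algorithm (independent of the LaNet-vi implementation, which additionally computes layout coordinates):

```python
from collections import defaultdict

def k_core_decomposition(adj):
    """Return the core number (k-shell index) of every vertex.

    adj: dict mapping vertex -> set of neighbour vertices.
    Peels vertices in order of current degree; each vertex's core
    number is the largest k such that it belongs to the k-core.
    """
    degree = {v: len(nbrs) for v, nbrs in adj.items()}
    # bucket vertices by their current degree
    buckets = defaultdict(set)
    for v, d in degree.items():
        buckets[d].add(v)
    core, removed = {}, set()
    k = 0
    for _ in range(len(adj)):
        # advance to the smallest non-empty bucket; vertices never
        # drop below k, so k only increases
        while not buckets[k]:
            k += 1
        v = buckets[k].pop()
        core[v] = k
        removed.add(v)
        for u in adj[v]:
            if u not in removed and degree[u] > k:
                # u loses one neighbour: move it down one bucket
                buckets[degree[u]].remove(u)
                degree[u] -= 1
                buckets[degree[u]].add(u)
    return core
```

For example, in a triangle with one pendant vertex, the triangle vertices get core number 2 and the pendant gets 1. For production use, `networkx.core_number` implements the same decomposition.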

  14. A low complexity visualization tool that helps to perform complex systems analysis

    Science.gov (United States)

    Beiró, M. G.; Alvarez-Hamelin, J. I.; Busch, J. R.

    2008-12-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core-connectivity analysis, highlighting vertices that are not k-connected; this property is useful, for example, to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the resulting figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  15. The Effect of Performance-Contingent Incentives when Task Complexity is Manipulated through Instruction

    Directory of Open Access Journals (Sweden)

    Monte Wynder

    2010-12-01

    Full Text Available When, and how, performance-contingent incentives improve performance is an important question for organisations. Empirical results have been mixed – performance-contingent incentives sometimes increase performance, sometimes decrease performance, and sometimes have no effect. Theorists have called for further research to identify the effect of various moderating variables, including knowledge and task complexity. This study responds by considering the role of instruction in providing the necessary knowledge to reduce task complexity. The results suggest that a performance-contingent penalty can be a particularly effective means of directing effort for a simple task. For a complex task, performance can be improved through instruction. The type of instruction is important – with rule-based instruction effectively directing effort – however principle-based instruction is necessary to facilitate problem investigation and problem solving.

  16. HIGH PERFORMANCE PIAA CORONAGRAPHY WITH COMPLEX AMPLITUDE FOCAL PLANE MASKS

    International Nuclear Information System (INIS)

    Guyon, Olivier; Martinache, Frantz; Belikov, Ruslan; Soummer, Remi

    2010-01-01

    We describe a coronagraph approach where the performance of a Phase-Induced Amplitude Apodization (PIAA) coronagraph is improved by using a partially transmissive phase-shifting focal plane mask and a Lyot stop. This approach combines the low inner working angle offered by phase mask coronagraphy, the full throughput and uncompromised angular resolution of the PIAA approach, and the design flexibility of the Apodized Pupil Lyot Coronagraph. A PIAA complex mask coronagraph (PIAACMC) is fully described by the focal plane mask size, or, equivalently, its complex transmission, which ranges from 0 (opaque) to -1 (phase shifting). For all values of the transmission, the PIAACMC theoretically offers full on-axis extinction and 100% throughput at large angular separations. With a pure phase focal plane mask (complex transmission = -1), the PIAACMC offers 50% throughput at 0.64 λ/D while providing total extinction of an on-axis point source. This performance is very close to the 'fundamental performance limit' of coronagraphy derived from first principles. For very high contrast levels, imaging performance with PIAACMC is in practice limited by the angular size of the on-axis target (usually a star). We show that this fundamental limitation must be taken into account when choosing the optimal value of the focal plane mask size in the PIAACMC design. We show that the PIAACMC enables visible imaging of Jupiter-like planets at ∼1.2 λ/D from the host star, and can therefore offer almost three times more targets than a PIAA coronagraph optimized for this type of observation. We find that for visible imaging of Earth-like planets, the PIAACMC gain over a PIAA is probably much smaller, as coronagraphic performance is then strongly constrained by stellar angular size. For observations at 'low' contrast (below ∼10^8), the PIAACMC offers significant performance enhancement over PIAA. This is especially relevant for ground-based high contrast imaging systems in the near-IR, where

  17. Variations in task constraints shape emergent performance outcomes and complexity levels in balancing.

    Science.gov (United States)

    Caballero Sánchez, Carla; Barbado Murillo, David; Davids, Keith; Moreno Hernández, Francisco J

    2016-06-01

    This study investigated the extent to which specific interacting constraints of performance might increase or decrease the emergent complexity in a movement system, and whether this could affect the relationship between observed movement variability and the central nervous system's capacity to adapt to perturbations during balancing. Fifty-two healthy volunteers performed eight trials where different performance constraints were manipulated: task difficulty (three levels) and visual biofeedback conditions (with and without the center of pressure (COP) displacement and a target displayed). Balance performance was assessed using COP-based measures: mean velocity magnitude (MVM) and bivariate variable error (BVE). To assess the complexity of COP, fuzzy entropy (FE) and detrended fluctuation analysis (DFA) were computed. ANOVAs showed that MVM and BVE increased when task difficulty increased. During biofeedback conditions, individuals showed higher MVM but lower BVE at the easiest level of task difficulty. Overall, higher FE and lower DFA values were observed when biofeedback was available. On the other hand, FE reduced and DFA increased as difficulty level increased, in the presence of biofeedback. However, when biofeedback was not available, the opposite trend in FE and DFA values was observed. Regardless of changes to task constraints and the variable investigated, balance performance was positively related to complexity in every condition. Data revealed how specificity of task constraints can result in an increase or decrease in complexity emerging in a neurobiological system during balance performance.
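Detrended fluctuation analysis (DFA), one of the two complexity measures this study applies to the COP signal, estimates a scaling exponent as the slope of log fluctuation versus log window size. A minimal sketch follows; the window sizes and first-order detrending are illustrative choices, not necessarily those used by the authors:

```python
import numpy as np

def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
    """Detrended fluctuation analysis: return the scaling exponent alpha.

    x: 1-D signal (e.g. a COP displacement time series).
    alpha ~ 0.5 for white noise, ~ 1.5 for a random walk; higher
    values indicate smoother, more persistent fluctuations.
    """
    x = np.asarray(x, dtype=float)
    y = np.cumsum(x - x.mean())            # integrated profile
    fluct = []
    for n in scales:
        m = len(y) // n                    # number of full windows
        windows = y[:m * n].reshape(m, n)
        t = np.arange(n)
        f2 = []
        for w in windows:
            coef = np.polyfit(t, w, 1)     # local linear trend
            f2.append(np.mean((w - np.polyval(coef, t)) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    # slope of log F(n) vs log n is the DFA exponent
    return np.polyfit(np.log(scales), np.log(fluct), 1)[0]
```

Applied to white noise the estimate should land near 0.5, and near 1.5 for its cumulative sum, which is the sanity check usually run before applying DFA to physiological data.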

  18. Education requirements for nurses working with people with complex neurological conditions: nurses' perceptions.

    Science.gov (United States)

    Baker, Mark

    2012-01-01

    Following a service evaluation methodology, this paper reports on registered nurses' (RNs) and healthcare assistants' (HCAs) perceptions of the education and training required to work with people with complex neurological disabilities. A service evaluation was undertaken to meet the study aim using a non-probability convenience sample of 368 nurses (n=110 RNs, n=258 HCAs) employed between October and November 2008 at one specialist hospital in south-west London in the U.K. The main results show that respondents were clear about the need to develop an education and training programme for RNs and HCAs working in this speciality area (91% of RNs and 94% of HCAs). A variety of topics were identified to be included within a work-based education and training programme, such as positively managing challenging behaviour, moving and handling, and working with families. Adults with complex neurological needs have diverse needs, and thus nurses working with this patient group require diverse education and training in order to deliver quality patient-focused nursing care. Copyright © 2011 Elsevier Ltd. All rights reserved.

  19. 42 CFR 456.245 - Number of studies required to be performed.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Number of studies required to be performed. 456.245 Section 456.245 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN... Ur Plan: Medical Care Evaluation Studies § 456.245 Number of studies required to be performed. The...

  20. 42 CFR 456.145 - Number of studies required to be performed.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Number of studies required to be performed. 456.145 Section 456.145 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN...: Medical Care Evaluation Studies § 456.145 Number of studies required to be performed. The hospital must...

  1. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time, simple conceptual models have also advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) for runoff but not for soil moisture. Furthermore, the most sophisticated model, PREVAH, shows an added value compared to the HBV model only for soil moisture. Focusing on extreme events, we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance at lower altitudes as opposed to (pre-)alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).
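The abstract does not name the specific agreement metrics compared; one metric very commonly applied to runoff simulations is the Nash-Sutcliffe efficiency, shown here purely as an illustration of how model-observation agreement can be scored:

```python
import numpy as np

def nash_sutcliffe(sim, obs):
    """Nash-Sutcliffe efficiency of a simulated series against observations.

    1.0 is a perfect fit, 0.0 means the model is no better than
    predicting the observed mean, and negative values are worse
    than the mean.
    """
    sim = np.asarray(sim, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)
```

Computing such a score per site and per variable (runoff, soil moisture) is one way the kind of model-vs-complexity comparison described above can be made quantitative.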

  2. Atg6/UVRAG/Vps34-Containing Lipid Kinase Complex Is Required for Receptor Downregulation through Endolysosomal Degradation and Epithelial Polarity during Drosophila Wing Development

    Directory of Open Access Journals (Sweden)

    Péter Lőrincz

    2014-01-01

    Full Text Available Atg6 (Beclin 1 in mammals) is a core component of the Vps34 PI3K (III) complex, which promotes multiple vesicle trafficking pathways. Atg6 and Vps34 form two distinct PI3K (III) complexes in yeast and mammalian cells, either with Atg14 or with UVRAG. The functions of these two complexes are not entirely clear, as both Atg14 and UVRAG have been suggested to regulate both endocytosis and autophagy. In this study, we performed a microscopic analysis of UVRAG, Atg14, or Atg6 loss-of-function cells in the developing Drosophila wing. Both autophagy and endocytosis are seriously impaired and defective endolysosomes accumulate upon loss of Atg6. We show that Atg6 is required for the downregulation of the Notch and Wingless signaling pathways; thus it is essential for normal wing development. Moreover, the loss of Atg6 impairs cell polarity. Atg14 depletion results in autophagy defects with no effect on endocytosis or cell polarity, while the silencing of UVRAG phenocopies all but the autophagy defect of Atg6-depleted cells. Thus, our results indicate that the UVRAG-containing PI3K (III) complex is required for receptor downregulation through endolysosomal degradation and for the establishment of proper cell polarity in the developing wing, while the Atg14-containing complex is involved in autophagosome formation.

  3. Application of systems engineering to determine performance requirements for repository waste packages

    International Nuclear Information System (INIS)

    Aitken, E.A.; Stimmell, G.L.

    1987-01-01

    The waste package for a nuclear waste repository in salt must contribute substantially to the performance objectives defined by the Salt Repository Project (SRP) general requirements document governing disposal of high-level waste. The waste package is one of the engineered barriers providing containment. In establishing the performance requirements for a project focused on design and fabrication of the waste package, the systems engineering methodology has been used to translate the hierarchy of requirements for the repository system into specific performance requirements for design and fabrication of the waste package, a subsystem of the repository. This activity is ongoing and requires a methodology that provides traceability and is capable of iteration as baseline requirements are refined or changed. The purpose of this summary is to describe the methodology being used and the way it can be applied to similar activities in the nuclear industry.

  4. Integrating complex functions: coordination of nuclear pore complex assembly and membrane expansion of the nuclear envelope requires a family of integral membrane proteins.

    Science.gov (United States)

    Schneiter, Roger; Cole, Charles N

    2010-01-01

    The nuclear envelope harbors numerous large proteinaceous channels, the nuclear pore complexes (NPCs), through which macromolecular exchange between the cytosol and the nucleoplasm occurs. This double-membrane nuclear envelope is continuous with the endoplasmic reticulum and thus functionally connected to such diverse processes as vesicular transport, protein maturation and lipid synthesis. Recent results obtained from studies in Saccharomyces cerevisiae indicate that assembly of the nuclear pore complex is functionally dependent upon maintenance of lipid homeostasis of the ER membrane. Previous work from one of our laboratories has revealed that an integral membrane protein Apq12 is important for the assembly of functional nuclear pores. Cells lacking APQ12 are viable but cannot grow at low temperatures, have aberrant NPCs and a defect in mRNA export. Remarkably, these defects in NPC assembly can be overcome by supplementing cells with a membrane fluidizing agent, benzyl alcohol, suggesting that Apq12 impacts the flexibility of the nuclear membrane, possibly by adjusting its lipid composition when cells are shifted to a reduced temperature. Our new study now expands these findings and reveals that an essential membrane protein, Brr6, shares at least partially overlapping functions with Apq12 and is also required for assembly of functional NPCs. A third nuclear envelope membrane protein, Brl1, is related to Brr6, and is also required for NPC assembly. Because maintenance of membrane homeostasis is essential for cellular survival, the fact that these three proteins are conserved in fungi that undergo closed mitoses, but are not found in metazoans or plants, may indicate that their functions are performed by proteins unrelated at the primary sequence level to Brr6, Brl1 and Apq12 in cells that disassemble their nuclear envelopes during mitosis.

  5. Elongator complex is required for long-term olfactory memory formation in Drosophila.

    Science.gov (United States)

    Yu, Dinghui; Tan, Ying; Chakraborty, Molee; Tomchik, Seth; Davis, Ronald L

    2018-04-01

    The evolutionarily conserved Elongator complex associates with RNA polymerase II for transcriptional elongation. Elp3 is the catalytic subunit, contains histone acetyltransferase activity, and is associated with neurodegeneration in humans. Elp1 is a scaffolding subunit and, when mutated, causes familial dysautonomia. Here, we show that elp3 and elp1 are required for aversive long-term olfactory memory in Drosophila. RNAi knockdown of elp3 in adult mushroom bodies impairs long-term memory (LTM) without affecting earlier forms of memory. RNAi knockdown with coexpression of elp3 cDNA reverses the impairment. Similarly, RNAi knockdown of elp1 impairs LTM and coexpression of elp1 cDNA reverses this phenotype. The LTM deficit in elp3 and elp1 knockdown flies is accompanied by the abolishment of an LTM trace, which is registered as increased calcium influx in response to the CS+ odor in the α-branch of mushroom body neurons. Coexpression of elp1 or elp3 cDNA rescues the memory trace in parallel with LTM. These data show that the Elongator complex is required in adult mushroom body neurons for long-term behavioral memory and the associated long-term memory trace. © 2018 Yu et al.; Published by Cold Spring Harbor Laboratory Press.

  6. Establishment of design and performance requirements using cost and systems analysis

    International Nuclear Information System (INIS)

    Waganer, L.M.; Carosella, L.A.; Defreece, D.A.

    1977-01-01

    The current uncertainty in design approach and performance requirements for a commercial fusion power plant poses a problem for the designer in configuring the plant and for the utilities in analyzing the attractiveness of a future fusion power plant. To provide direction and insight in this area, a systems analysis model was constructed at McDonnell Douglas, combining fusion subsystem algorithms with subsystem cost estimating relationships into a self-consistent computerized model for several fusion reactor concepts. Cost estimating data were compiled by drawing on McDonnell Douglas' experience in fabricating large, complex metal assemblies and by soliciting the accumulated store of knowledge in existing power plants and new emerging technologies such as the Clinch River Breeder Reactor. Using the computer model, sensitivities to plasma, reactor and plant parameters have been evaluated, among other options, to yield recommended concepts/techniques/solutions. This is a very beneficial tool for assessing the impact of the fusion reactor on the electrical power community and charting the optimum developmental approach.

  7. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    International Nuclear Information System (INIS)

    Jung, Won Dae; Park, Jink Yun

    2012-01-01

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation for managing human performance under emergencies in a nuclear power plant. However, the time to execute an emergency procedural task is highly dependent upon expert judgment due to the lack of actual data. This paper proposes an analytical method to estimate the operator's performance time (OPT) of a procedural task, which is based on a measure of the task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM as a variable, and the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.
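The abstract does not reproduce the TACOM-to-OPT equation itself. As a hedged illustration of the approach, one could calibrate a log-linear relation between complexity scores and observed task times; all numbers below are hypothetical placeholders, not the paper's data or coefficients:

```python
import numpy as np

# Hypothetical calibration data: TACOM scores and observed performance
# times (seconds) for a handful of procedural tasks.
tacom = np.array([2.1, 2.8, 3.4, 4.0, 4.7, 5.3])
opt_obs = np.array([35.0, 52.0, 80.0, 118.0, 180.0, 265.0])

# Fit ln(OPT) = a * TACOM + b, i.e. performance time growing
# exponentially with complexity (one plausible functional form,
# not necessarily the published one).
a, b = np.polyfit(tacom, np.log(opt_obs), 1)

def estimate_opt(score):
    """Estimate the operator's performance time (s) for a given TACOM score."""
    return float(np.exp(a * score + b))
```

Once calibrated, such an equation lets an analyst estimate an OPT for any emergency procedural task whose TACOM score is available, which is the use case the record describes.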

  8. The disruptive effects of pain on complex cognitive performance and executive control.

    Directory of Open Access Journals (Sweden)

    Edmund Keogh

    Full Text Available Pain interferes and disrupts attention. What is less clear is how pain affects performance on complex tasks, and the strategies used to ensure optimal outcomes. The aim of the current study was to examine the effect of pain on higher-order executive control processes involved in managing complex tasks. Sixty-two adult volunteers (40 female) completed two computer-based tasks: a breakfast making task and a word generation puzzle. Both were complex, involving executive control functions, including goal-directed planning and switching. Half of those recruited performed the tasks under conditions of thermal heat pain, and half with no accompanying pain. Whilst pain did not affect central performance on either task, it did have indirect effects. For the breakfast task, pain resulted in a decreased ability to multitask, with performance decrements found on the secondary task. However, no effects of pain were found on the processes thought to underpin this task. For the word generation puzzle, pain did not affect task performance, but did alter subjective accounts of the processes used to complete the task; pain affected the perceived allocation of time to the task, as well as switching perceptions. Sex differences were also found. When studying higher-order cognitive processes, pain-related interference effects are varied, and may result in subtle or indirect changes in cognition.

  9. The disruptive effects of pain on complex cognitive performance and executive control.

    Science.gov (United States)

    Keogh, Edmund; Moore, David J; Duggan, Geoffrey B; Payne, Stephen J; Eccleston, Christopher

    2013-01-01

    Pain interferes and disrupts attention. What is less clear is how pain affects performance on complex tasks, and the strategies used to ensure optimal outcomes. The aim of the current study was to examine the effect of pain on higher-order executive control processes involved in managing complex tasks. Sixty-two adult volunteers (40 female) completed two computer-based tasks: a breakfast making task and a word generation puzzle. Both were complex, involving executive control functions, including goal-directed planning and switching. Half of those recruited performed the tasks under conditions of thermal heat pain, and half with no accompanying pain. Whilst pain did not affect central performance on either task, it did have indirect effects. For the breakfast task, pain resulted in a decreased ability to multitask, with performance decrements found on the secondary task. However, no effects of pain were found on the processes thought to underpin this task. For the word generation puzzle, pain did not affect task performance, but did alter subjective accounts of the processes used to complete the task; pain affected the perceived allocation of time to the task, as well as switching perceptions. Sex differences were also found. When studying higher-order cognitive processes, pain-related interference effects are varied, and may result in subtle or indirect changes in cognition.

  10. Exploration of the Trade Space Between Unmanned Aircraft Systems Descent Maneuver Performance and Sense-and-Avoid System Performance Requirements

    Science.gov (United States)

    Jack, Devin P.; Hoffler, Keith D.; Johnson, Sally C.

    2014-01-01

    A need exists to safely integrate Unmanned Aircraft Systems (UAS) into the United States' National Airspace System. Replacing manned aircraft's see-and-avoid capability in the absence of an onboard pilot is one of the key challenges associated with safe integration. Sense-and-avoid (SAA) systems will have to achieve yet-to-be-determined required separation distances for a wide range of encounters. They will also need to account for the maneuver performance of the UAS they are paired with. The work described in this paper is aimed at developing an understanding of the trade space between UAS maneuver performance and SAA system performance requirements, focusing on a descent avoidance maneuver. An assessment of current manned and unmanned aircraft performance was used to establish potential UAS performance test matrix bounds. Then, near-term UAS integration work was used to narrow down the scope. A simulator was developed with sufficient fidelity to assess SAA system performance requirements. The simulator generates closest-point-of-approach (CPA) data from the wide range of UAS performance models maneuvering against a single intruder with various encounter geometries. Initial attempts to model the results made it clear that developing maneuver performance groups is required. Discussion of the performance groups developed and how to know in which group an aircraft belongs for a given flight condition and encounter is included. The groups are airplane, flight condition, and encounter specific, rather than airplane-only specific. Results and methodology for developing UAS maneuver performance requirements are presented for a descent avoidance maneuver. Results for the descent maneuver indicate that a minimum specific excess power magnitude can assure a minimum CPA for a given time-to-go prediction. However, smaller amounts of specific excess power may achieve or exceed the same CPA if the UAS has sufficient speed to trade for altitude. The results of this study will
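For the constant-velocity encounter geometries the simulator sweeps over, the closest point of approach has a closed form: with relative position r and relative velocity v, the CPA time is t* = −(r·v)/(v·v). A minimal sketch of that calculation (coordinate conventions and units here are assumptions, not taken from the study's simulator):

```python
import numpy as np

def closest_point_of_approach(p_own, v_own, p_intr, v_intr):
    """Time and distance of closest approach for two constant-velocity aircraft.

    Positions and velocities are 3-D vectors (e.g. ft and ft/s).
    Returns (t_cpa, d_cpa); t_cpa is clamped to >= 0 so only future
    approaches count (if the aircraft are already diverging, the
    current separation is the CPA).
    """
    r = np.asarray(p_intr, float) - np.asarray(p_own, float)   # relative position
    v = np.asarray(v_intr, float) - np.asarray(v_own, float)   # relative velocity
    vv = float(np.dot(v, v))
    t_cpa = 0.0 if vv == 0.0 else max(0.0, -float(np.dot(r, v)) / vv)
    d_cpa = float(np.linalg.norm(r + v * t_cpa))
    return t_cpa, d_cpa
```

For a head-on encounter with 500 ft of lateral offset, for instance, the CPA distance is exactly that offset, which is the kind of quantity the study's maneuver-performance groups are built around.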

  11. Orbiter data reduction complex data processing requirements for the OFT mission evaluation team (level C)

    Science.gov (United States)

    1979-01-01

    This document addresses requirements for post-test data reduction in support of the Orbital Flight Tests (OFT) mission evaluation team, specifically those which are planned to be implemented in the ODRC (Orbiter Data Reduction Complex). Only those requirements which have been previously baselined by the Data Systems and Analysis Directorate configuration control board are included. This document serves as the control document between Institutional Data Systems Division and the Integration Division for OFT mission evaluation data processing requirements, and shall be the basis for detailed design of ODRC data processing systems.

  12. Benchmarking in pathology: development of a benchmarking complexity unit and associated key performance indicators.

    Science.gov (United States)

    Neil, Amanda; Pfeffer, Sally; Burnett, Leslie

    2013-01-01

    This paper details the development of a new type of pathology laboratory productivity unit, the benchmarking complexity unit (BCU). The BCU provides a comparative index of laboratory efficiency, regardless of test mix. It also enables estimation of a measure of how much complex pathology a laboratory performs, and the identification of peer organisations for the purposes of comparison and benchmarking. The BCU is based on the theory that wage rates reflect productivity at the margin. A weighting factor for the ratio of medical to technical staff time was dynamically calculated based on actual participant site data. Given this weighting, a complexity value for each test, at each site, was calculated. The median complexity value (number of BCUs) for that test across all participating sites was taken as its complexity value for the Benchmarking in Pathology Program. The BCU allowed implementation of an unbiased comparison unit and test listing that was found to be a robust indicator of the relative complexity for each test. Employing the BCU data, a number of Key Performance Indicators (KPIs) were developed, including three that address comparative organisational complexity, analytical depth and performance efficiency, respectively. Peer groups were also established using the BCU combined with simple organisational and environmental metrics. The BCU has enabled productivity statistics to be compared between organisations. The BCU corrects for differences in test mix and workload complexity of different organisations and also allows for objective stratification into peer groups.

  13. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    There are some groups working on the improvement of the existing Standard and recommendation on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, the IEC 1400-12 (Ref. [9]), to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain and, specifically, to the issue of site calibration and the uncertainties due to it. (Author)

  14. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Cuerva, A.

    1997-10-01

    There are some groups working on the improvement of the existing Standard and recommendation on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, the IEC 1400-12 (Ref. [9]), to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain and, specifically, to the issue of site calibration and the uncertainties due to it. (Author)

  15. γ-Tubulin complex in Trypanosoma brucei: molecular composition, subunit interdependence and requirement for axonemal central pair protein assembly.

    Science.gov (United States)

    Zhou, Qing; Li, Ziyin

    2015-11-01

    γ-Tubulin complex constitutes a key component of the microtubule-organizing center and nucleates microtubule assembly. This complex differs in complexity in different organisms: the budding yeast contains the γ-tubulin small complex (γTuSC) composed of γ-tubulin, gamma-tubulin complex protein (GCP)2 and GCP3, whereas animals contain the γ-tubulin ring complex (γTuRC) composed of γTuSC and three additional proteins, GCP4, GCP5 and GCP6. In Trypanosoma brucei, the composition of the γ-tubulin complex remains elusive, and it is not known whether it also regulates assembly of the subpellicular microtubules and the spindle microtubules. Here we report that the γ-tubulin complex in T. brucei is composed of γ-tubulin and three GCP proteins, GCP2-GCP4, and is primarily localized in the basal body throughout the cell cycle. Depletion of GCP2 and GCP3, but not GCP4, disrupted the axonemal central pair microtubules, but not the subpellicular microtubules and the spindle microtubules. Furthermore, we showed that the γTuSC is required for assembly of two central pair proteins and that γTuSC subunits are mutually required for stability. Together, these results identified an unusual γ-tubulin complex in T. brucei, uncovered an essential role of γTuSC in central pair protein assembly, and demonstrated the interdependence of individual γTuSC components for maintaining a stable complex. © 2015 John Wiley & Sons Ltd.

  16. Executive Functioning and School Performance Among Pediatric Survivors of Complex Congenital Heart Disease

    Science.gov (United States)

    Gerstle, Melissa; Beebe, Dean W.; Drotar, Dennis; Cassedy, Amy; Marino, Bradley S.

    2016-01-01

    Objective: To investigate the presence and severity of real-world impairments in executive functioning – responsible for children’s regulatory skills (metacognition, behavioral regulation) – and its potential impact on school performance among pediatric survivors of complex congenital heart disease (CHD). Study design: Survivors of complex CHD aged 8–16 years (n=143) and their parents/guardians from a regional CHD survivor registry participated (81% participation rate). Parents completed proxy measures of executive functioning, school competency, and school-related quality of life (QOL). Patients also completed a measure of school QOL and underwent IQ testing. Patients were categorized into two groups based on heart lesion complexity: two-ventricle or single-ventricle. Results: Survivors of complex CHD performed significantly worse than norms for executive functioning, IQ, school competency, and school QOL. Metacognition was more severely affected than behavioral regulation, and metacognitive deficits were more often present in older children. Even after taking into account demographic factors, disease severity, and IQ, metacognition uniquely and strongly predicted poorer school performance. In exploratory analyses, patients with single-ventricle lesions were rated as having lower school competency and school QOL, and patients with two-ventricle lesions were rated as having poorer behavioral regulation. Conclusions: Survivors of complex CHD experience greater executive functioning difficulties than healthy peers, with metacognition particularly impacted and particularly relevant for day-to-day school performance. Especially in older children, clinicians should watch for metacognitive deficits, such as problems with organization, planning, self-monitoring, and follow-through on tasks. PMID:26875011

  17. Effects of orientation on Rey complex figure performance.

    Science.gov (United States)

    Ferraro, F Richard; Grossman, Jennifer; Bren, Amy; Hoverson, Allysa

    2002-10-01

    An experiment was performed that examined the impact of stimulus orientation on performance on the Rey complex figure. A total of 48 undergraduates (24 men, 24 women) were randomly assigned to one of four Rey figure orientation groups (0 degrees, 90 degrees, 180 degrees, and 270 degrees). Participants followed standard procedures for the Rey figure, initially copying it in whatever orientation their group was assigned. Next, all participants performed a 15-20 min lexical decision experiment, used as a filler task. Finally, and unbeknownst to them, participants were asked to recall as much of the figure as they could. As expected, results revealed a main effect of task (F = 83.92, p < .001); the effect of orientation was not significant, nor did orientation interact with task (Fs < 1, ps > .57). The results are important from an applied standpoint, especially if testing conditions are less than optimal and a fixed stimulus position is not possible (e.g., testing at the bedside).

  18. Practical experience and lessons learned through implementation of Appendix VIII performance demonstration requirements

    International Nuclear Information System (INIS)

    Ashwin, P.J.; Becker, F.L.; Latiolais, C.L.; Spanner, J.C.

    1996-01-01

    To provide the US nuclear industry with a uniform implementation of the Performance Demonstration requirements within the 1989 edition of ASME Section XI, Appendix VIII, representatives from all US nuclear utilities formed the Performance Demonstration Initiative (PDI). The PDI recognized the potential benefits that Appendix VIII offered the nuclear industry and initiated a proactive approach to implementing its requirements, with the expectation that performance demonstration of ultrasonic examination procedures would improve the efficiency and credibility of inservice inspection. Explicit within the performance demonstration requirements of Appendix VIII is the need for a Performance Demonstration Administrator, a difficult requirement to fulfill: the administrator must not only understand the demonstration requirements, but also have solid technical knowledge and integrity and be able to interface with the industry at all levels, from operations to regulatory. For the nuclear industry, the EPRI NDE Center is an obvious choice to fulfill this position. This paper provides a brief background of the PDI, a nuclear industry-wide initiative to implement the performance demonstration requirements of Appendix VIII. Although the consensus approach adopted by the PDI is discussed, the paper's primary objective is to provide examples of the lessons learned by the Center through the specific requirements of Appendix VIII.

  19. Performance requirements for the double-shell tank system: Phase 1

    International Nuclear Information System (INIS)

    Claghorn, R.D.

    1998-01-01

    This document establishes performance requirements for the double-shell tank system. These requirements, in turn, will be incorporated into the System Specification for the Double-Shell Tank System (Grenard and Claghorn 1998). This version of the document establishes requirements that are applicable to the first phase (Phase 1) of the Tank Waste Remediation System (TWRS) mission described in the TWRS Mission Analysis Report (Acree 1998). It does not specify requirements for either the Phase 2 mission or the double-shell tank system closure period.

  20. Saccharomyces cerevisiae vineyard strains have different nitrogen requirements that affect their fermentation performances.

    Science.gov (United States)

    Lemos Junior, W J F; Viel, A; Bovo, B; Carlot, M; Giacomini, A; Corich, V

    2017-11-01

    In this work the fermentation performances of seven vineyard strains, together with the industrial strain EC1118, were investigated at three different yeast assimilable nitrogen (YAN) concentrations (300 mg N l⁻¹, 150 mg N l⁻¹ and 70 mg N l⁻¹) in synthetic musts. The results indicated that the response to different nitrogen levels is strain dependent. Most of the strains showed a dramatic decrease of fermentation at 70 mg N l⁻¹, but no significant differences in CO2 production were found when fermentations at 300 mg N l⁻¹ and 150 mg N l⁻¹ were compared. Only one of the vineyard strains showed a decrease of fermentation when 150 mg N l⁻¹ was present in the must. These results contribute to shedding light on strain nitrogen requirements and offer new perspectives for managing the fermentation process during winemaking. Selected vineyard Saccharomyces cerevisiae strains can improve the quality and complexity of local wines. Wine quality is also influenced by nitrogen availability, which modulates yeast fermentation activity. In this work, yeast nitrogen assimilation was evaluated to clarify the nitrogen requirements of vineyard strains. Most of the strains needed high nitrogen levels to express their best fermentation performances. The results obtained indicate the critical nitrogen levels: increasing the nitrogen concentration up to the critical level improved the fermentation process, whereas further increases beyond it had no effect on fermentation. © 2017 The Society for Applied Microbiology.

  1. Driver’s Cognitive Workload and Driving Performance under Traffic Sign Information Exposure in Complex Environments: A Case Study of the Highways in China

    Directory of Open Access Journals (Sweden)

    Nengchao Lyu

    2017-02-01

    Complex traffic situations and high driving workload are the leading contributing factors to traffic crashes. There is a strong correlation between driving performance and driving workload, such as visual workload from traffic signs on highway off-ramps. This study aimed to evaluate traffic safety by analyzing drivers’ behavior and performance under the cognitive workload in complex environment areas. First, the driving workload of drivers was tested based on traffic signs with different quantities of information. Forty-four drivers were recruited to conduct a traffic sign cognition experiment under static controlled environment conditions. Different complex traffic signs were used for applying the cognitive workload. The static experiment results reveal that workload is highly related to the amount of information on traffic signs and reaction time increases with the information grade, while driving experience and gender effect are not significant. This shows that the cognitive workload of subsequent driving experiments can be controlled by the amount of information on traffic signs. Second, driving characteristics and driving performance were analyzed under different secondary task driving workload levels using a driving simulator. Drivers were required to drive at the required speed on a designed highway off-ramp scene. The cognitive workload was controlled by reading traffic signs with different information, which were divided into four levels. Drivers had to make choices by pushing buttons after reading traffic signs. Meanwhile, the driving performance information was recorded. Questionnaires on objective workload were collected right after each driving task. The results show that speed maintenance and lane deviations are significantly different under different levels of cognitive workload, and the effects of driving experience and gender groups are significant. The research results can be used to analyze traffic safety in highway

  2. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming; Claudel, Christian

    2017-01-01

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring or Kalman-filter based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.
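Among the workloads listed, Kalman-filter based vehicle trajectory estimation is the easiest to make concrete: per measurement it amounts to a handful of floating-point operations, well within a Cortex M4's budget. The following 1-D constant-velocity sketch is illustrative only; the paper's actual state model, dimensions and noise parameters are not given here.

```python
def kalman_1d(zs, dt=1.0, q=0.01, r=1.0):
    """Minimal 1-D constant-velocity Kalman filter over position
    measurements zs. State is (position, velocity); q is process noise,
    r is measurement noise. Illustrative parameters only."""
    x, v = zs[0], 0.0                 # state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]      # state covariance
    est = []
    for z in zs:
        # predict: x += v*dt, P = F P F^T + Q with F = [[1, dt], [0, 1]]
        x, v = x + dt * v, v
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # update with position measurement z (observation H = [1, 0])
        S = P[0][0] + r
        K0, K1 = P[0][0] / S, P[1][0] / S
        y = z - x
        x, v = x + K0 * y, v + K1 * y
        P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
             [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
        est.append(x)
    return est

# Noisy odometer-style positions of a vehicle moving roughly 1 unit/step:
print(kalman_1d([0.0, 1.2, 1.9, 3.1, 4.0, 5.2]))
```

The filtered positions increase smoothly and lag the noisy measurements slightly, which is the expected behavior for these noise settings.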

  3. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming

    2017-02-02

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring or Kalman-filter based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  4. Atmospheric stability and topography effects on wind turbine performance and wake properties in complex terrain

    DEFF Research Database (Denmark)

    Han, Xingxing; Liu, Deyou; Xu, Chang

    2018-01-01

    This paper evaluates the influence of atmospheric stability and topography on wind turbine performance and wake properties in complex terrain. To assess atmospheric stability effects on wind turbine performance, an equivalent wind speed calculated with the power output and the manufacture power...... and topography have significant influences on wind turbine performance and wake properties. Considering effects of atmospheric stability and topography will benefit the wind resource assessment in complex terrain....

  5. On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy

    Directory of Open Access Journals (Sweden)

    Mikołaj Morzy

    2017-01-01

    One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process that produces the network. Instead, we advocate the use of algorithmic entropy as the basis of a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
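The contrast the paper draws can be made concrete in a few lines of Python (illustrative code, not the authors'): the Shannon entropy of a degree sequence assigns zero complexity to any degree-regular network, however it is wired, while a lossless-compression length, the standard computable stand-in for the uncomputable Kolmogorov complexity, still reflects how hard a chosen representation is to describe.

```python
import math
import zlib

def degree_entropy(degrees):
    """Shannon entropy (in bits) of a network's degree distribution."""
    n = len(degrees)
    counts = {}
    for d in degrees:
        counts[d] = counts.get(d, 0) + 1
    return sum((c / n) * math.log2(n / c) for c in counts.values())

def k_complexity_estimate(representation):
    """Upper bound on Kolmogorov complexity: length in bytes of a lossless
    (zlib) compression of a string representation of the network."""
    return len(zlib.compress(representation.encode()))

# A ring lattice is 2-regular, so its degree entropy is 0 bits -- the
# entropy measure cannot distinguish it from any other 2-regular graph.
print(degree_entropy([2] * 8))                    # 0.0
print(round(degree_entropy([7] + [1] * 7), 3))    # 8-node star: 0.544
# Compression still separates a trivial from a less regular description:
print(k_complexity_estimate("0" * 64))
```

Note the dependence on the chosen representation, which is exactly the caveat the abstract raises: true K-complexity is representation-independent, but any practical compressor-based estimate is not.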

  6. Membranes linked by trans-SNARE complexes require lipids prone to non-bilayer structure for progression to fusion.

    Science.gov (United States)

    Zick, Michael; Stroupe, Christopher; Orr, Amy; Douville, Deborah; Wickner, William T

    2014-01-01

    Like other intracellular fusion events, the homotypic fusion of yeast vacuoles requires a Rab GTPase, a large Rab effector complex, SNARE proteins which can form a 4-helical bundle, and the SNARE disassembly chaperones Sec17p and Sec18p. In addition to these proteins, specific vacuole lipids are required for efficient fusion in vivo and with the purified organelle. Reconstitution of vacuole fusion with all purified components reveals that high SNARE levels can mask the requirement for a complex mixture of vacuole lipids. At lower, more physiological SNARE levels, neutral lipids with small headgroups that tend to form non-bilayer structures (phosphatidylethanolamine, diacylglycerol, and ergosterol) are essential. Membranes without these three lipids can dock and complete trans-SNARE pairing but cannot rearrange their lipids for fusion. DOI: http://dx.doi.org/10.7554/eLife.01879.001.

  7. The Influence of Time Pressure and Case Complexity on Physicians' Diagnostic Performance

    Directory of Open Access Journals (Sweden)

    Dalal A. ALQahtani

    2016-12-01

    Conclusions: Time pressure did not impact the diagnostic performance, whereas the complexity of the clinical case negatively influenced the diagnostic accuracy. Further studies with enhanced experimental manipulation of time pressure are needed to reveal the effect of time pressure, if any, on a physician's diagnostic performance.

  8. 42 CFR 84.103 - Man tests; performance requirements.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Man tests; performance requirements. 84.103 Section 84.103 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES OCCUPATIONAL SAFETY AND HEALTH RESEARCH AND RELATED ACTIVITIES APPROVAL OF RESPIRATORY PROTECTIVE DEVICES Self-Contained Breathing Apparatus § 84.103 Man tests;...

  9. Development of DSRC device and communication system performance measures recommendations for DSRC OBE performance and security requirements.

    Science.gov (United States)

    2016-05-22

    This report presents recommendations for minimum DSRC device communication performance and security : requirements to ensure effective operation of the DSRC system. The team identified recommended DSRC : communications requirements aligned to use cas...

  10. 14 CFR 151.49 - Performance of construction work: Contract requirements.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Performance of construction work: Contract... § 151.49 Performance of construction work: Contract requirements. (a) Contract provisions. In addition to any other provisions necessary to ensure completion of the work in accordance with the grant...

  11. Radioactive material package test standards and performance requirements - public perception

    International Nuclear Information System (INIS)

    Pope, R.B.; Shappert, L.B.; Rawl, R.R.

    1992-01-01

    This paper addresses issues related to the public perception of the regulatory test standards and performance requirements for packaging and transporting radioactive material. Specifically, it addresses the adequacy of the package performance standards and testing for Type B packages, which are those packages designed for transporting the most hazardous quantities and forms of radioactive material. Type B packages are designed to withstand accident conditions in transport. To improve public perception, the public needs to better understand: (a) the regulatory standards and requirements themselves, (b) the extensive history underlying their development, and (c) the soundness of the technical foundation. The public needs to be fully informed on studies, tests, and analyses that have been carried out worldwide and form the basis of the regulatory standards and requirements. This paper provides specific information aimed at improving the public perception of packages test standards

  12. 40 CFR Table 5 to Subpart Xxxx of... - Requirements for Performance Tests

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 12 2010-07-01 2010-07-01 true Requirements for Performance Tests 5 Table 5 to Subpart XXXX of Part 63 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED.... XXXX, Table 5 Table 5 to Subpart XXXX of Part 63—Requirements for Performance Tests As stated in § 63...

  13. 40 CFR 63.344 - Performance test requirements and test methods.

    Science.gov (United States)

    2010-07-01

    ... electroplating tanks or chromium anodizing tanks. The sampling time and sample volume for each run of Methods 306... Chromium Anodizing Tanks § 63.344 Performance test requirements and test methods. (a) Performance test... Emissions From Decorative and Hard Chromium Electroplating and Anodizing Operations,” appendix A of this...

  14. The Visual Orientation Memory of "Drosophila" Requires Foraging (PKG) Upstream of Ignorant (RSK2) in Ring Neurons of the Central Complex

    Science.gov (United States)

    Kuntz, Sara; Poeck, Burkhard; Sokolowski, Marla B.; Strauss, Roland

    2012-01-01

    Orientation and navigation in a complex environment requires path planning and recall to exert goal-driven behavior. Walking "Drosophila" flies possess a visual orientation memory for attractive targets which is localized in the central complex of the adult brain. Here we show that this type of working memory requires the cGMP-dependent protein…

  15. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time-delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, these complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of communication networks. Therefore, it would be interesting to (1) examine how these network-enhanced complexities affect the control or filtering performance and (2) develop suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by using a thorough literature review and some possible future research directions are highlighted.
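Of the three complexities surveyed, the event-triggered mechanism is the most self-contained to illustrate: a sensor transmits its measurement only when it deviates from the last transmitted value by more than a threshold, trading a bounded estimation error for reduced network load. The rule and numbers below are a generic toy sketch, not any specific scheme from the survey.

```python
def event_triggered_transmissions(samples, threshold):
    """Return (indices, values) of the samples actually transmitted.
    A sample is sent only when it differs from the last *transmitted*
    value by more than `threshold`; in between, the remote estimator
    simply holds the last transmitted value."""
    sent_idx, sent_val = [], []
    last = None
    for i, x in enumerate(samples):
        if last is None or abs(x - last) > threshold:
            last = x
            sent_idx.append(i)
            sent_val.append(x)
    return sent_idx, sent_val

# A slowly drifting signal: most samples are suppressed, and the remote
# estimate is never off by more than the threshold.
signal = [0.0, 0.05, 0.12, 0.4, 0.42, 0.45, 0.9, 0.91]
idx, vals = event_triggered_transmissions(signal, threshold=0.3)
print(idx)    # [0, 3, 6]
```

Here only 3 of 8 samples are transmitted, which is the bandwidth saving that motivates event-triggered schemes in the first place.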

  16. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.
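The remeshing bottleneck described above is what a smooth volumetric deformation avoids: every grid point is displaced by a smoothly weighted control displacement, so node connectivity, and hence the mesh, never changes. The Gaussian-falloff sketch below is a toy illustration of that principle, not Optimal Solutions' actual ASD algorithm.

```python
import math

def deform_grid(points, control, displacement, radius):
    """Displace 2-D grid points smoothly: each point moves by `displacement`
    scaled by a Gaussian falloff of its distance to `control`. Connectivity
    is untouched, so no remeshing is needed after the shape change."""
    out = []
    for x, y in points:
        d2 = (x - control[0]) ** 2 + (y - control[1]) ** 2
        w = math.exp(-d2 / (2 * radius ** 2))   # smooth weight in (0, 1]
        out.append((x + w * displacement[0], y + w * displacement[1]))
    return out

# Bulge a 5x5 grid near (1, 1): the point at the control moves the full
# 0.3 units while far corners are essentially unaffected.
grid = [(i * 0.5, j * 0.5) for i in range(5) for j in range(5)]
bent = deform_grid(grid, control=(1.0, 1.0), displacement=(0.0, 0.3), radius=0.5)
```

Because the map is smooth in space, neighboring points move by similar amounts and element quality degrades gracefully instead of requiring a rebuild of the grid.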

  17. Performance of community health workers: situating their intermediary position within complex adaptive health systems

    OpenAIRE

    Kok, Maryse. C; Broerse, Jacqueline E.W; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-01-01

    Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income countries. It examines their unique intermediary position between the communities they serve and actors in the health sector, and the complexity of the health systems in which they operate. The assessment...

  18. 14 CFR 151.45 - Performance of construction work: General requirements.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 3 2010-01-01 2010-01-01 false Performance of construction work: General... § 151.45 Performance of construction work: General requirements. (a) All construction work under a... work under a project until— (1) The sponsor has furnished three conformed copies of the contract to the...

  19. Required performance to the concrete structure of the accelerator facilities

    International Nuclear Information System (INIS)

    Irie, Masaaki; Yoshioka, Masakazu; Miyahara, Masanobu

    2006-01-01

    Many accelerator facilities are built as underground concrete structures, for reasons such as radiation shielding and structural stability. The performance required of the concrete structures of an accelerator facility is broadly the same as that of general social infrastructure, but the target performance differs considerably. This paper describes the differences between the performance required of concrete structures in social infrastructure and in accelerator facilities, and presents an approach to construction management of such concrete structures spanning the ordering, design, supervision and operation of accelerator engineering facilities. A view of future material and structural analysis of concrete using neutron accelerators is also presented. (author)

  20. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Directory of Open Access Journals (Sweden)

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span task and the N-back tasks. Furthermore, performance on the modified listening span was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  1. The Role of Awareness for Complex Planning Task Performance: A Microgaming Study

    Science.gov (United States)

    Lukosch, Heide; Groen, Daan; Kurapati, Shalini; Klemke, Roland; Verbraeck, Alexander

    2016-01-01

    This study introduces the concept of microgames to support situated learning in order to foster situational awareness (SA) of planners in seaport container terminals. In today's complex working environments, it is often difficult to develop the required level of understanding of a given situation, described as situational awareness. A container…

  2. High-level inhibition of mitochondrial complexes III and IV is required to increase glutamate release from the nerve terminal

    Directory of Open Access Journals (Sweden)

    Kilbride Seán M

    2011-07-01

    Background: The activities of mitochondrial complex III (ubiquinol-cytochrome c reductase, EC 1.10.2.2) and complex IV (cytochrome c oxidase, EC 1.9.3.1) are reduced by 30-70% in Huntington's disease and Alzheimer's disease, respectively, and are associated with excitotoxic cell death in these disorders. In this study, we investigated the control that complexes III and IV exert on glutamate release from the isolated nerve terminal. Results: Inhibition of complex III activity by 60-90% was necessary for a major increase in the rate of Ca2+-independent glutamate release to occur from isolated nerve terminals (synaptosomes) depolarized with 4-aminopyridine or KCl. Similarly, an 85-90% inhibition of complex IV activity was required before a major increase in the rate of Ca2+-independent glutamate release from depolarized synaptosomes was observed. Inhibition of complex III and IV activities by ~60% and above was required before rates of glutamate efflux from polarized synaptosomes were increased. Conclusions: These results suggest that nerve terminal mitochondria possess high reserves of complex III and IV activity and that high inhibition thresholds must be reached before excess glutamate is released from the nerve terminal. The implications of the results in the context of the relationship between electron transport chain enzyme deficiencies and excitotoxicity in neurodegenerative disorders are discussed.

  3. High-level inhibition of mitochondrial complexes III and IV is required to increase glutamate release from the nerve terminal

    LENUS (Irish Health Repository)

    Kilbride, Sean M

    2011-07-26

    Background: The activities of mitochondrial complex III (ubiquinol-cytochrome c reductase, EC 1.10.2.2) and complex IV (cytochrome c oxidase, EC 1.9.3.1) are reduced by 30-70% in Huntington's disease and Alzheimer's disease, respectively, and are associated with excitotoxic cell death in these disorders. In this study, we investigated the control that complexes III and complex IV exert on glutamate release from the isolated nerve terminal. Results: Inhibition of complex III activity by 60-90% was necessary for a major increase in the rate of Ca2+-independent glutamate release to occur from isolated nerve terminals (synaptosomes) depolarized with 4-aminopyridine or KCl. Similarly, an 85-90% inhibition of complex IV activity was required before a major increase in the rate of Ca2+-independent glutamate release from depolarized synaptosomes was observed. Inhibition of complex III and IV activities by ~60% and above was required before rates of glutamate efflux from polarized synaptosomes were increased. Conclusions: These results suggest that nerve terminal mitochondria possess high reserves of complex III and IV activity and that high inhibition thresholds must be reached before excess glutamate is released from the nerve terminal. The implications of the results in the context of the relationship between electron transport chain enzyme deficiencies and excitotoxicity in neurodegenerative disorders are discussed.

  4. Good distractions: Testing the effects of listening to an audiobook on driving performance in simple and complex road environments.

    Science.gov (United States)

    Nowosielski, Robert J; Trick, Lana M; Toxopeus, Ryan

    2018-02-01

    Distracted driving (driving while performing a secondary task) causes many collisions. Most research on distracted driving has focused on operating a cell phone, but distracted driving can include eating while driving, conversing with passengers, or listening to music or audiobooks. Although the research has focused on the deleterious effects of distraction, there may be situations where distraction improves driving performance. Fatigue and boredom are also associated with collision risk, and it is possible that secondary tasks can help alleviate their effects. Furthermore, it has been found that individuals with high levels of executive functioning, as measured by the OSPAN (Operation Span) task, show better driving while multitasking. In this study, licensed drivers were tested in a driving simulator (a car body surrounded by screens) that simulated simple or complex roads. Road complexity was manipulated by increasing traffic, scenery, and the number of curves in the drive. Participants either drove, or drove while listening to an audiobook. Driving performance was measured in terms of braking response time to hazards (HRT), i.e. the time required to brake in response to pedestrians or vehicles that suddenly emerged from the periphery into the path of the vehicle, speed, standard deviation of speed, and standard deviation of lateral position (SDLP). Overall, braking times to hazards were longer on the complex drive than the simple one, and the effects of secondary tasks such as audiobooks were especially deleterious on the complex drive. In contrast, on the simple drive, driving while listening to an audiobook led to faster HRTs. We also found evidence that individuals with high OSPAN scores had faster HRTs when listening to an audiobook. These results suggest that there are environmental and individual factors behind differences in the allocation of attention while listening to audiobooks while driving. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. The γ-tubulin complex in Trypanosoma brucei: molecular composition, subunit interdependence and requirement for axonemal central pair protein assembly

    Science.gov (United States)

    Zhou, Qing; Li, Ziyin

    2015-01-01

    The γ-tubulin complex constitutes a key component of the microtubule-organizing center and nucleates microtubule assembly. This complex differs in complexity in different organisms: the budding yeast contains the γ-tubulin small complex (γTuSC) composed of γ-tubulin, GCP2 and GCP3, whereas animals contain the γ-tubulin ring complex (γTuRC) composed of γTuSC and three additional proteins, GCP4, GCP5 and GCP6. In Trypanosoma brucei, the composition of the γ-tubulin complex remains elusive, and it is not known whether it also regulates assembly of the subpellicular microtubules and the spindle microtubules. Here we report that the γ-tubulin complex in T. brucei is composed of γ-tubulin and three GCP proteins, GCP2-GCP4, and is primarily localized in the basal body throughout the cell cycle. Depletion of GCP2 and GCP3, but not GCP4, disrupted the axonemal central pair microtubules, but not the subpellicular microtubules and the spindle microtubules. Furthermore, we showed that the γTuSC is required for assembly of two central pair proteins and that γTuSC subunits are mutually required for stability. Together, these results identified an unusual γ-tubulin complex in T. brucei, uncovered an essential role of γTuSC in central pair protein assembly, and demonstrated the interdependence of individual γTuSC components for maintaining a stable complex. PMID:26224545

  6. Critical evaluation of the JDO API for the persistence and portability requirements of complex biological databases

    Directory of Open Access Journals (Sweden)

    Schwieger Michael

    2005-01-01

    Full Text Available Abstract Background Complex biological database systems have become key computational tools used daily by scientists and researchers. Many of these systems must be capable of executing on multiple different hardware and software configurations and are also often made available to users via the Internet. We have used the Java Data Object (JDO) persistence technology to develop the database layer of such a system, known as the SigPath information management system. SigPath is an example of a complex biological database that needs to store various types of information connected by many relationships. Results Using this system as an example, we perform a critical evaluation of current JDO technology and discuss the suitability of the JDO standard for achieving portability, scalability and performance. We show that JDO supports portability of the SigPath system from a relational database backend to an object database backend and achieves acceptable scalability. To answer the performance question, we have created the SigPath JDO application benchmark, which we distribute under the GNU General Public License. This benchmark can be used as an example of using JDO technology to create a complex biological database and makes it possible for vendors and users of the technology to evaluate the performance of other JDO implementations for similar applications. Conclusions The SigPath JDO benchmark and our discussion of JDO technology in the context of biological databases will be useful to bioinformaticians who design new complex biological databases and aim to create systems that can be ported easily to a variety of database backends.

  7. Multiple domains of fission yeast Cdc19p (MCM2) are required for its association with the core MCM complex.

    Science.gov (United States)

    Sherman, D A; Pasion, S G; Forsburg, S L

    1998-07-01

    The members of the MCM protein family are essential eukaryotic DNA replication factors that form a six-member protein complex. In this study, we use antibodies to four MCM proteins to investigate the structure of and requirements for the formation of fission yeast MCM complexes in vivo, with particular regard to Cdc19p (MCM2). Gel filtration analysis shows that the MCM protein complexes are unstable and can be broken down to subcomplexes. Using coimmunoprecipitation, we find that Mis5p (MCM6) and Cdc21p (MCM4) are tightly associated with one another in a core complex with which Cdc19p loosely associates. Assembly of Cdc19p with the core depends upon Cdc21p. Interestingly, there is no obvious change in Cdc19p-containing MCM complexes through the cell cycle. Using a panel of Cdc19p mutants, we find that multiple domains of Cdc19p are required for MCM binding. These studies indicate that MCM complexes in fission yeast have distinct substructures, which may be relevant for function.

  8. 13 CFR 126.700 - What are the performance of work requirements for HUBZone contracts?

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false What are the performance of work... ADMINISTRATION HUBZONE PROGRAM Contract Performance Requirements § 126.700 What are the performance of work... meet the performance of work requirements set forth in § 125.6(c) of this chapter. (b) In addition to...

  9. Feeding cells induced by phytoparasitic nematodes require γ-tubulin ring complex for microtubule reorganization.

    Directory of Open Access Journals (Sweden)

    Mohamed Youssef Banora

    2011-12-01

    Full Text Available Reorganization of the microtubule network is important for the fast isodiametric expansion of giant-feeding cells induced by root-knot nematodes. The efficiency of microtubule reorganization depends on the nucleation of new microtubules, their elongation rate and the activity of microtubule severing factors. New microtubules in plants are nucleated by cytoplasmic or microtubule-bound γ-tubulin ring complexes. Here we investigate the requirement of γ-tubulin complexes for giant-feeding cell development using the interaction between Arabidopsis and Meloidogyne spp. as a model system. Immunocytochemical analyses demonstrate that γ-tubulin localizes to both the cortical cytoplasm and mitotic microtubule arrays of the giant cells, where it can associate with microtubules. The transcripts of two Arabidopsis γ-tubulin genes (TUBG1 and TUBG2) and two γ-tubulin complex protein genes (GCP3 and GCP4) are upregulated in galls. Electron microscopy demonstrates association of GCP3 and γ-tubulin as part of a complex in the cytoplasm of giant cells. Knockout of either or both γ-tubulin genes results in a gene dose-dependent alteration of the morphology of the feeding site and failure of nematode life cycle completion. We conclude that the γ-tubulin complex is essential for the control of microtubular network remodelling in the course of initiation and development of giant-feeding cells, and for the successful reproduction of nematodes in their plant hosts.

  10. Why performance-based contracting failed in Uganda--an "open-box" evaluation of a complex health system intervention.

    Science.gov (United States)

    Ssengooba, Freddie; McPake, Barbara; Palmer, Natasha

    2012-07-01

    Performance-based contracting (PBC) is a tool that links rewards to the attainment of measurable performance targets. Significant problems remain in the methods used to evaluate this tool. The primary focus of evaluations on the effects of PBC (black-box) and lesser attention to how these effects arise (open-box) generates suboptimal policy learning. A black-box impact evaluation of a PBC pilot by the Development Research Group of the World Bank (DRG) and the Ministry of Health (MOH) concluded that PBC was ineffective. This paper reports a theory-based case study intended to clarify how and why PBC failed to achieve its objectives. To explain the observed PBC implementation and the responses of participants, this case study employed two related theories, complex adaptive systems theory and expectancy theory, respectively. A prospective study trailed the implementation of PBC (2003-2006) while collecting the experiences of participants at district and hospital levels. Significant problems were encountered in the implementation of PBC that reflected its inadequate design. As problems were encountered, hasty adaptations resulted in a de facto intervention distinct from the one implied at the design stage. For example, inadequate time was allowed for the selection of service targets by the health centres, yet they got 'locked in' to these poor choices. The learning curve and workload among performance auditors weakened the validity of audit results. Above all, financial shortfalls led to delays, short-cuts and uncertainty about the size and payment of bonuses. The lesson for those intending to implement similar interventions is that PBC should not be attempted 'on the cheap'. It requires a plan to boost the local institutional and technical capacities of implementers. It also requires careful consideration of the responses of multiple actors - both insiders and outsiders to the intended change process. Given the costs and complexity of PBC implementation, strengthening conventional approaches

  11. 48 CFR 44.302 - Requirements.

    Science.gov (United States)

    2010-10-01

    ... during the next 12 months, perform a review to determine if a CPSR is needed. Sales include those... SUBCONTRACTING POLICIES AND PROCEDURES Contractors' Purchasing Systems Reviews 44.302 Requirements. (a) The ACO..., and the volume, complexity and dollar value of subcontracts. If a contractor's sales to the Government...

  12. 46 CFR 160.037-3 - Materials, workmanship, construction, and performance requirements.

    Science.gov (United States)

    2010-10-01

    ... 46 Shipping 6 2010-10-01 2010-10-01 false Materials, workmanship, construction, and performance...) EQUIPMENT, CONSTRUCTION, AND MATERIALS: SPECIFICATIONS AND APPROVAL LIFESAVING EQUIPMENT Hand Orange Smoke Distress Signals § 160.037-3 Materials, workmanship, construction, and performance requirements. (a...

  13. The effect of two complexity factors on the performance of emergency tasks-An experimental verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Jung, Kwangtae

    2008-01-01

    It is well known that the use of procedures is very important in securing the safety of process systems, since good procedures effectively guide human operators by providing 'what should be done' and 'how to do it', especially under stressful conditions. At the same time, it has been emphasized that the use of complicated procedures can drastically impair operators' performance. This means that a systematic approach that can properly evaluate the complexity of procedures is indispensable for minimizing the side effects of complicated procedures. For this reason, Park et al. have developed a task complexity measure called TACOM that can be used to quantify the complexity of tasks stipulated in the emergency operating procedures (EOPs) of nuclear power plants (NPPs). The TACOM measure consists of five sub-measures that cover five important factors that make the performance of emergency tasks complicated. However, verification of two of these complexity factors - the level of abstraction hierarchy (AH) and engineering decision (ED) - seems to be insufficient. In this study, therefore, an experiment is conducted using a low-fidelity simulator in order to clarify the appropriateness of these complexity factors. As a result, it appears that subjects' performance data are affected by the level of AH as well as ED. It is therefore anticipated that both the level of AH and ED will play an important role in evaluating the complexity of EOPs.

  14. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  15. Use of simple transport equations to estimate waste package performance requirements

    International Nuclear Information System (INIS)

    Wood, B.J.

    1982-01-01

    A method of developing waste package performance requirements for specific nuclides is described. The method is based on: Federal regulations concerning permissible concentrations in solution at the point of discharge to the accessible environment; a simple and conservative transport model; baseline and potential worst-case release scenarios. Use of the transport model enables calculation of maximum permissible release rates within a repository in basalt for each of the scenarios. The maximum permissible release rates correspond to performance requirements for the engineered barrier system. The repository was assumed to be constructed in a basalt layer. For the cases considered, including a well drilled into an aquifer 1750 m from the repository center, little significant advantage is obtained from a 1000-yr as opposed to a 100-yr waste package. A 1000-yr waste package is of importance only for nuclides with half-lives much less than 100 yr which travel to the accessible environment in much less than 1000 yr. Such short travel times are extremely unlikely for a mined repository. Among the actinides, the most stringent maximum permissible release rates are for ²³⁶U and ²³⁴U. A simple solubility calculation suggests, however, that these performance requirements can be readily met by the engineered barrier system. Under the reducing conditions likely to occur in a repository located in basalt, uranium would be sufficiently insoluble that no solution could contain more than about 0.01% of the maximum permissible concentration at saturation. The performance requirements derived from the one-dimensional modeling approach are conservative by at least one to two orders of magnitude. More quantitative three-dimensional modeling at specific sites should enable relaxation of the performance criteria derived in this study. 12 references, 8 figures, 8 tables
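
    The screening logic above rests on first-order radioactive decay during groundwater travel. A minimal sketch, with hypothetical travel times and half-lives (illustrative numbers, not the study's site data), shows why a 1000-yr package matters only for short-lived nuclides:

```python
import math

def surviving_fraction(travel_time_yr: float, half_life_yr: float) -> float:
    """Fraction of a radionuclide remaining after first-order decay
    over a given groundwater travel time."""
    return math.exp(-math.log(2) * travel_time_yr / half_life_yr)

# A nuclide with a 30-yr half-life is essentially gone after 1000 yr of travel...
short_lived = surviving_fraction(1000.0, 30.0)      # ≈ 9e-11
# ...while long-lived 234U (half-life ≈ 2.45e5 yr) is barely attenuated.
long_lived = surviving_fraction(1000.0, 2.45e5)     # ≈ 0.997
print(short_lived, long_lived)
```

    Long travel times thus remove short-lived nuclides from consideration, while the requirements for the uranium isotopes are instead governed by solubility, as the abstract notes.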

  16. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure of whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis … then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guaranties on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark …

  17. Building on the EGIPPS performance assessment: the multipolar framework as a heuristic to tackle the complexity of performance of public service oriented health care organisations.

    Science.gov (United States)

    Marchal, Bruno; Hoerée, Tom; da Silveira, Valéria Campos; Van Belle, Sara; Prashanth, Nuggehalli S; Kegels, Guy

    2014-04-17

    Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature, which easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates 4 key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Using the MPF in a dynamic reiterative mode helps managers to identify not only the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at central level to better anticipate the potential results and side effects of, and the required conditions for, health policies and programmes, and to steer their implementation accordingly.

  18. The Effects of Differential Goal Weights on the Performance of a Complex Financial Task.

    Science.gov (United States)

    Edmister, Robert O.; Locke, Edwin A.

    1987-01-01

    Determined whether people could obtain outcomes on a complex task that would be in line with differential goal weights corresponding to different aspects of the task. Bank lending officers were run through lender-simulation exercises. Five performance goals were weighted. Demonstrated effectiveness of goal setting with complex tasks, using group…

  19. X-ray-enhanced cancer cell migration requires the linker of nucleoskeleton and cytoskeleton complex.

    Science.gov (United States)

    Imaizumi, Hiromasa; Sato, Katsutoshi; Nishihara, Asuka; Minami, Kazumasa; Koizumi, Masahiko; Matsuura, Nariaki; Hieda, Miki

    2018-04-01

    The linker of nucleoskeleton and cytoskeleton (LINC) complex is a multifunctional protein complex that is involved in various processes at the nuclear envelope, including nuclear migration, mechanotransduction, chromatin tethering and DNA damage response. We recently showed that a nuclear envelope protein, Sad1 and UNC84 domain protein 1 (SUN1), a component of the LINC complex, has a critical function in cell migration. Although ionizing radiation activates cell migration and invasion in vivo and in vitro, the underlying molecular mechanism remains unknown. Here, we examined the involvement of the LINC complex in radiation-enhanced cell migration and invasion. A sublethal dose of X-ray radiation promoted human breast cancer MDA-MB-231 cell migration and invasion, whereas carbon ion beam radiation suppressed these processes in a dose-dependent manner. Depletion of SUN1 and SUN2 significantly suppressed X-ray-enhanced cell migration and invasion. Moreover, depletion or overexpression of each SUN1 splicing variant revealed that SUN1_888 containing 888 amino acids of SUN1 but not SUN1_916 was required for X-ray-enhanced migration and invasion. In addition, the results suggested that X-ray irradiation affected the expression level of SUN1 splicing variants and a SUN protein binding partner, nesprins. Taken together, our findings supported that the LINC complex contributed to photon-enhanced cell migration and invasion. © 2018 The Authors. Cancer Science published by John Wiley & Sons Australia, Ltd on behalf of Japanese Cancer Association.

  20. Modeling Complex Workflow in Molecular Diagnostics

    Science.gov (United States)

    Gomah, Mohamed E.; Turley, James P.; Lu, Huimin; Jones, Dan

    2010-01-01

    One of the hurdles to achieving personalized medicine has been implementing the laboratory processes for performing and reporting complex molecular tests. The rapidly changing test rosters and complex analysis platforms in molecular diagnostics have meant that many clinical laboratories still use labor-intensive manual processing and testing without the level of automation seen in high-volume chemistry and hematology testing. We provide here a discussion of design requirements and the results of implementation of a suite of lab management tools that incorporate the many elements required for use of molecular diagnostics in personalized medicine, particularly in cancer. These applications provide the functionality required for sample accessioning and tracking, material generation, and testing that are particular to the evolving needs of individualized molecular diagnostics. On implementation, the applications described here resulted in improvements in the turn-around time for reporting of more complex molecular test sets, and significant changes in the workflow. Therefore, careful mapping of workflow can permit design of software applications that simplify even the complex demands of specialized molecular testing. By incorporating design features for order review, software tools can permit a more personalized approach to sample handling and test selection without compromising efficiency. PMID:20007844

  1. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread-spectrum communication system is to protect communication signals against interference and against the exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data information by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at lag 1, which, with proper design, leads to increased efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of a multi-user synchronous and asynchronous DS-CDMA system is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate improved performance in comparison with conventional spreading codes such as Gold codes, as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation. Besides, the construction of complex sequences with lower average cross-correlation is possible with the proposed method.
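
    To make the spreading idea concrete, the following is a minimal, hypothetical sketch (not the paper's two-dimensional design): chips are generated from the one-dimensional Bernoulli shift map, each data bit is multiplied by the chip sequence, and the receiver recovers the bits by correlating the received signal against the same code. The map seed and spreading factor here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_code(length, x0=0.137):
    """+/-1 chip sequence from the 1-D Bernoulli shift map x -> 2x mod 1
    (illustrative; the paper proposes specially designed 2-D variants)."""
    x, chips = x0, []
    for _ in range(length):
        x = (2.0 * x) % 1.0
        chips.append(1.0 if x >= 0.5 else -1.0)
    return np.array(chips)

def spread(bits, code):
    # Each +/-1 data bit is multiplied by the whole chip sequence.
    return np.concatenate([b * code for b in bits])

def despread(rx, code):
    # Correlate each chip-length segment with the code and take the sign.
    n = len(code)
    return np.sign([rx[i*n:(i+1)*n] @ code for i in range(len(rx) // n)])

bits = np.array([1.0, -1.0, -1.0, 1.0, 1.0])
code = bernoulli_code(31)                     # spreading factor 31
rx = spread(bits, code) + 0.8 * rng.standard_normal(31 * len(bits))  # AWGN
print(despread(rx, code))                     # recovers the transmitted bits
```

    With a spreading factor of 31, the despreading correlation (±31) dwarfs the noise term, so the bits are recovered reliably; a real evaluation would sweep the noise level over many trials to obtain bit-error-rate curves.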

  2. Performance Evaluation and Requirements Assessment for Gravity Gradient Referenced Navigation

    Directory of Open Access Journals (Sweden)

    Jisun Lee

    2015-07-01

    Full Text Available In this study, simulation tests for gravity gradient referenced navigation (GGRN) are conducted to verify the effects of various factors such as database (DB) and sensor errors, flight altitude, DB resolution, initial errors, and measurement update rates on the navigation performance. Based on the simulation results, requirements for GGRN are established for position determination with certain target accuracies. It is found that DB and sensor errors and flight altitude have strong effects on the navigation performance. In particular, a DB and a sensor with accuracies of 0.1 E and 0.01 E, respectively, are required to determine the position more accurately than, or at a level similar to, the navigation performance of terrain referenced navigation (TRN). In most cases, the horizontal position error of GGRN is less than 100 m. However, the navigation performance of GGRN is similar to or worse than that of a pure inertial navigation system when the DB and sensor errors are 3 E or 5 E each and the flight altitude is 3000 m. Considering that the accuracy of currently available gradiometers is about 3 E or 5 E, GGRN does not show much advantage over TRN at present. However, GGRN is expected to exhibit much better performance in the near future when accurate DBs and gravity gradiometers are available.

  3. Cognitive function predicts listening effort performance during complex tasks in normally aging adults

    Directory of Open Access Journals (Sweden)

    Jennine Harvey

    2017-01-01

    Full Text Available Purpose: This study examines whether cognitive function, as measured by the subtests of the Woodcock–Johnson III (WCJ-III) assessment, predicts listening-effort performance during dual tasks across adults of varying ages. Materials and Methods: Participants were divided into two groups. Group 1 consisted of 14 listeners (number of females = 11) who were 41–61 years old [mean = 53.18; standard deviation (SD) = 5.97]. Group 2 consisted of 15 listeners (number of females = 9) who were 63–81 years old (mean = 72.07; SD = 5.11). Participants were administered the WCJ-III Memory for Words, Auditory Working Memory, Visual Matching, and Decision Speed subtests. All participants were tested in each of the following three dual-task experimental conditions, which varied in complexity: (1) auditory word recognition + visual processing, (2) auditory working memory (word) + visual processing, and (3) auditory working memory (sentence) + visual processing in noise. Results: A repeated measures analysis of variance revealed that task complexity significantly affected the performance measures of auditory accuracy, visual accuracy, and processing speed. Linear regression revealed that the cognitive subtests of the WCJ-III significantly predicted performance across dependent variable measures. Conclusion: Listening effort is significantly affected by task complexity, regardless of age. Performance on the WCJ-III may predict listening effort in adults and may assist speech-language pathologists (SLPs) in understanding the challenges faced by participants when subjected to noise.

  4. Complex Analysis of Financial State and Performance of Construction Enterprises

    Directory of Open Access Journals (Sweden)

    Algirdas Krivka

    2015-12-01

    Full Text Available The paper analyses the financial state and performance of large construction enterprises by applying financial indicators. As no single financial indicator enables an objective assessment of enterprise performance, multi-criteria decision making (MCDM) methods are applied, with four groups of financial ratios (profitability, liquidity, solvency and asset turnover) acting as evaluation criteria, while the alternatives assessed are two enterprises compared throughout the reference period of three years, together with the average indicator values of the whole construction sector. The weights of the criteria were estimated by involving competent experts, with the chi-square test employed to check the degree of agreement of the expert estimates. The research methodology contributes to the issue of complex evaluation of enterprise financial state and performance, while the result of the multi-criteria assessment - the ranking of the enterprises and the sector average with respect to financial state and performance - could be considered worth the attention of business owners, potential investors, customers and other possible stakeholders.
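
    A minimal sketch of the kind of weighted multi-criteria aggregation described above, using simple additive weighting (SAW); the indicator values, the expert weights, and the choice of SAW rather than the authors' exact MCDM method are all illustrative assumptions:

```python
import numpy as np

# Hypothetical indicator values (rows: Enterprise A, Enterprise B, sector
# average; columns: profitability, liquidity, solvency, asset turnover).
# All four criteria are treated as benefit-type (higher is better).
scores = np.array([
    [0.12, 1.8, 0.55, 1.1],
    [0.08, 2.1, 0.60, 0.9],
    [0.10, 1.5, 0.50, 1.0],
])
weights = np.array([0.35, 0.25, 0.20, 0.20])  # expert-derived; must sum to 1

norm = scores / scores.max(axis=0)   # normalize each criterion to (0, 1]
saw = norm @ weights                 # simple additive weighting score
ranking = np.argsort(-saw)           # best alternative first
print(saw.round(3), ranking)
```

    With these invented numbers, Enterprise A ranks first; in practice, the expert weights would first be checked for agreement (e.g. with the chi-square test mentioned in the abstract) before aggregation.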

  5. Performance of community health workers : situating their intermediary position within complex adaptive health systems

    NARCIS (Netherlands)

    Kok, Maryse C; Broerse, Jacqueline E W; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-01-01

    Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income

  6. Pitch Sequence Complexity and Long-Term Pitcher Performance

    Directory of Open Access Journals (Sweden)

    Joel R. Bock

    2015-03-01

    Full Text Available Winning one or two games during a Major League Baseball (MLB) season is often the difference between a team advancing to post-season play, or “waiting until next year”. Technology advances have made it feasible to augment historical data with in-game contextual data to provide managers immediate insights regarding an opponent's next move, thereby providing a competitive edge. We developed statistical models of pitcher behavior using pitch sequences thrown during three recent MLB seasons (2011–2013). The purpose of these models was to predict the next pitch type, for each pitcher, based on data available at the immediate moment in each at-bat. Independent models were developed for each player's four most frequent pitches. The overall predictability of next pitch type is 74.5%. Additional analyses on pitcher predictability within specific game situations are discussed. Finally, using linear regression analysis, we show that an index of pitch sequence predictability may be used to project player performance in terms of Earned Run Average (ERA) and Fielding Independent Pitching (FIP) over a longer term. On a restricted range of the independent variable, reduced complexity in the selection of pitches is correlated with higher values of both FIP and ERA for the players represented in the sample. Both models were significant at the α = 0.05 level (ERA: p = 0.022; FIP: p = 0.0114). With further development, such models may reduce the risk faced by management in the evaluation of potential trades, or by scouts assessing unproven emerging talent. Pitchers themselves might benefit from awareness of their individual statistical tendencies, and adapt their behavior on the mound accordingly. To our knowledge, the predictive model relating pitch-wise complexity and long-term performance appears to be novel.
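
    The projection step described above is an ordinary least-squares fit of a long-term performance measure on a predictability index. A minimal sketch with invented data (the numbers below are illustrative, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical sample: predictability index (fraction of correctly predicted
# next pitches) vs. ERA, with a positive underlying association as in the paper.
predictability = rng.uniform(0.60, 0.90, size=50)
era = 2.5 + 3.0 * predictability + rng.normal(0.0, 0.3, size=50)

# Degree-1 polyfit is ordinary least squares; returns (slope, intercept).
slope, intercept = np.polyfit(predictability, era, 1)
r = np.corrcoef(predictability, era)[0, 1]
print(f"ERA ≈ {intercept:.2f} + {slope:.2f} × predictability (r = {r:.2f})")
```

    The fitted slope recovers the positive association built into the synthetic data; on real data, significance of the slope would be tested as in the abstract (p-values against α = 0.05).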

  7. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    Full Text Available The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with a co-evaluation of its performance and complexity. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and of more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework that allow for complexity reduction at the high system level: when the new coding features are combined, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  8. Effect of action verbs on the performance of a complex movement.

    Directory of Open Access Journals (Sweden)

    Tahar Rabahi

Full Text Available The interaction between language and motor action has been approached by studying the effect of action verbs, kinaesthetic imagery and mental subtraction upon the performance of a complex movement, the squat vertical jump (SVJ). The time of flight gave the value of the height of the SVJ and was measured with Optojump® and Myotest® apparatuses. The results obtained by the effects of the cognitive stimuli showed a statistically significant improvement of the SVJ performance after either loudly or silently pronouncing, hearing or reading the verb saute (jump in French). Action verbs specific for other motor actions (pince = pinch, lèche = lick) or non-specific (bouge = move) showed little or no effect. A verb meaningless to the French subjects (tiáo = jump in Chinese) showed no effect, as did rêve (dream), tombe (fall) and stop. The verb gagne (win) improved the SVJ height significantly, as did its antonym perds (lose), suggesting a possible influence of affects on the subjects' performance. The effect of the specific action verb jump upon the heights of SVJ was similar to that obtained after kinaesthetic imagery and after mental subtraction of two-digit numbers from three-digit ones; possibly, in the latter, because of the intervention of language in calculation. It appears that the effects of the specific action verb jump did seem effective but not totally exclusive for the enhancement of the SVJ performance. The results imply an interaction among language and motor brain areas in the performance of a complex movement, resulting in a clear specificity of the corresponding action verb. The effect upon performance may probably be influenced by the subjects' intention, increased attention and emotion produced by cognitive stimuli, among which action verbs.

  9. Simulating the Daylight Performance of Complex Fenestration Systems Using Bidirectional Scattering Distribution Functions within Radiance

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Gregory; Mistrick, Ph.D., Richard; Lee, Eleanor; McNeil, Andrew; Jonsson, Ph.D., Jacob

    2011-01-21

    We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.
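A BSDF dataset is, at its coarsest, a matrix mapping incident-patch radiance to outgoing-patch radiance. The toy sketch below uses an invented 3-patch basis and invented coefficients purely for illustration; Radiance workflows typically use much finer bases such as the 145-patch Klems basis:

```python
# Toy application of a coarse BSDF transmission matrix: outgoing radiance
# per exit patch is the matrix-vector product of the BSDF matrix with the
# incident radiance per patch. Basis size and values are illustrative only.
bsdf = [  # rows: outgoing patch, cols: incident patch
    [0.50, 0.10, 0.00],
    [0.10, 0.40, 0.10],
    [0.00, 0.10, 0.30],
]
incident = [100.0, 50.0, 25.0]  # radiance per incident patch

outgoing = [sum(row[j] * incident[j] for j in range(len(incident)))
            for row in bsdf]
print(outgoing)
```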

  10. The Contradiction Index (CI): A New Metric Combining System Complexity and Robustness for Early Design Stages

    DEFF Research Database (Denmark)

    Göhler, Simon Moritz; Howard, Thomas J.

    2015-01-01

    For complex and integrated products, companies experience difficulties in achieving a satisfactory and consistent functional performance. When a design has “contradicting” parameter/property requirements it often requires fine tuning with numerous design iterations and complex optimizations to fi...

  11. Performance Support Systems: Integrating AI, Hypermedia, and CBT to Enhance User Performance.

    Science.gov (United States)

    McGraw, Karen L.

    1994-01-01

    Examines the use of a performance support system (PSS) to enhance user performance on an operational system. Highlights include background information that describes the stimulus for PSS development; discussion of the major PSS components and the technology they require; and discussion of the design of a PSS for a complex database system.…

  12. Comparison of exertion required to perform standard and active compression-decompression cardiopulmonary resuscitation.

    Science.gov (United States)

    Shultz, J J; Mianulli, M J; Gisch, T M; Coffeen, P R; Haidet, G C; Lurie, K G

    1995-02-01

Active compression-decompression (ACD) cardiopulmonary resuscitation (CPR) utilizes a hand-held suction device with a pressure gauge that enables the operator to compress as well as actively decompress the chest. This new CPR method improves hemodynamic and ventilatory parameters when compared with standard CPR. ACD-CPR is easy to perform but may be more labor intensive. The purpose of this study was to quantify and compare the work required to perform ACD and standard CPR. Cardiopulmonary testing was performed on six basic cardiac life support- and ACD-trained St. Paul, MN fire-fighter personnel during performance of 10 min each of ACD and standard CPR on a mannequin equipped with a compression gauge. The order of CPR techniques was determined randomly with > 1 h between each study. Each CPR method was performed at 80 compressions/min (timed with a metronome), to a depth of 1.5-2 inches, and with a 50% duty cycle. Baseline cardiopulmonary measurements were similar at rest prior to performance of both CPR methods. During standard and ACD-CPR, respectively, the rate-pressure product was 18.2 ± 3.0 vs. 23.8 ± 1.7 (×1000), indicating a higher workload during ACD-CPR compared with standard CPR. Both methods require subanaerobic energy expenditure and can therefore be sustained for a sufficient length of time by most individuals to optimize resuscitation efforts. Due to the slightly higher work requirement, ACD-CPR may be more difficult to perform compared with standard CPR for long periods of time, particularly by individuals unaccustomed to the workload requirement of CPR in general.
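The rate-pressure product quoted above is simply heart rate multiplied by systolic blood pressure. A minimal sketch; the heart-rate and blood-pressure inputs below are assumed values chosen to land near the reported means, not measurements from the study:

```python
def rate_pressure_product(heart_rate_bpm, systolic_bp_mmhg):
    """Rate-pressure product, a common index of myocardial workload."""
    return heart_rate_bpm * systolic_bp_mmhg

# Illustrative operator values (assumptions, not study data):
std_cpr = rate_pressure_product(130, 140)   # 18200, i.e. ~18.2 x 1000
acd_cpr = rate_pressure_product(145, 164)   # 23780, i.e. ~23.8 x 1000
print(std_cpr / 1000, acd_cpr / 1000)
```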

  13. Explaining high and low performers in complex intervention trials: a new model based on diffusion of innovations theory.

    Science.gov (United States)

    McMullen, Heather; Griffiths, Chris; Leber, Werner; Greenhalgh, Trisha

    2015-05-31

    Complex intervention trials may require health care organisations to implement new service models. In a recent cluster randomised controlled trial, some participating organisations achieved high recruitment, whereas others found it difficult to assimilate the intervention and were low recruiters. We sought to explain this variation and develop a model to inform organisational participation in future complex intervention trials. The trial included 40 general practices in a London borough with high HIV prevalence. The intervention was offering a rapid HIV test as part of the New Patient Health Check. The primary outcome was mean CD4 cell count at diagnosis. The process evaluation consisted of several hundred hours of ethnographic observation, 21 semi-structured interviews and analysis of routine documents (e.g., patient leaflets, clinical protocols) and trial documents (e.g., inclusion criteria, recruitment statistics). Qualitative data were analysed thematically using--and, where necessary, extending--Greenhalgh et al.'s model of diffusion of innovations. Narrative synthesis was used to prepare case studies of four practices representing maximum variety in clinicians' interest in HIV (assessed by level of serological testing prior to the trial) and performance in the trial (high vs. low recruiters). High-recruiting practices were, in general though not invariably, also innovative practices. They were characterised by strong leadership, good managerial relations, readiness for change, a culture of staff training and available staff time ('slack resources'). Their front-line staff believed that patients might benefit from the rapid HIV test ('relative advantage'), were emotionally comfortable administering it ('compatibility'), skilled in performing it ('task issues') and made creative adaptations to embed the test in local working practices ('reinvention'). 
Early experience of a positive HIV test ('observability') appeared to reinforce staff commitment to recruiting

  14. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC and DIC were superior to AIC when the true data generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data generating process than with a standard asymmetric data generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
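The criteria compared in this study trade goodness of fit against parameter count, and AIC and BIC can be computed directly from a model's log-likelihood. A sketch with hypothetical fits (the log-likelihoods, parameter counts and sample size are assumptions) showing how BIC can prefer the simpler model while AIC prefers the more complex one:

```python
import math

def aic(log_lik, k):
    """Akaike Information Criterion: 2k - 2 ln L."""
    return 2 * k - 2 * log_lik

def bic(log_lik, k, n):
    """Bayesian Information Criterion: penalizes parameters more as n grows."""
    return k * math.log(n) - 2 * log_lik

# Two hypothetical competing price-transmission models fit to n = 200 points:
simple = {"log_lik": -310.0, "k": 3}
complex_ = {"log_lik": -304.0, "k": 8}
n = 200

for name, m in [("simple", simple), ("complex", complex_)]:
    print(name, round(aic(m["log_lik"], m["k"]), 1),
          round(bic(m["log_lik"], m["k"], n), 1))
```

With these numbers AIC favors the complex model (624 vs. 626) while BIC favors the simple one, mirroring the divergence the abstract reports.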

  15. Unusual Development of Iatrogenic Complex, Mixed Biliary and Duodenal Fistulas Complicating Roux-en-Y Antrectomy for Stenotic Peptic Disease of the Supraampullary Duodenum Requiring Whipple Procedure: An Uncommon Clinical Dilemma.

    Science.gov (United States)

    Polistina, Francesco A; Costantin, Giorgio; Settin, Alessandro; Lumachi, Franco; Ambrosino, Giovanni

    2010-10-23

    Complex fistulas of the duodenum and biliary tree are severe complications of gastric surgery. The association of duodenal and major biliary fistulas occurs rarely and is a major challenge for treatment. They may occur during virtually any kind of operation, but they are more frequent in cases complicated by the presence of difficult duodenal ulcers or cancer, with a mortality rate of up to 35%. Options for treatment are many and range from simple drainage to extended resections and difficult reconstructions. Conservative treatment is the choice for well-drained fistulas, but some cases require reoperation. Very little is known about reoperation techniques and technical selection of the right patients. We present the case of a complex iatrogenic duodenal and biliary fistula. A 42-year-old Caucasian man with a diagnosis of postoperative peritonitis had been operated on 3 days earlier; an antrectomy with a Roux-en-Y reconstruction for stenotic peptic disease was performed. Conservative treatment was attempted with mixed results. Two more operations were required to achieve a definitive resolution of the fistula and related local complications. The decision was made to perform a pancreatoduodenectomy with subsequent reconstruction on a double jejunal loop. The patient did well and was discharged on postoperative day 17. In our experience pancreaticoduodenectomy may be an effective treatment of refractory and complex iatrogenic fistulas involving both the duodenum and the biliary tree.

  16. Unusual Development of Iatrogenic Complex, Mixed Biliary and Duodenal Fistulas Complicating Roux-en-Y Antrectomy for Stenotic Peptic Disease of the Supraampullary Duodenum Requiring Whipple Procedure: An Uncommon Clinical Dilemma

    Directory of Open Access Journals (Sweden)

    Francesco A. Polistina

    2010-10-01

    Full Text Available Complex fistulas of the duodenum and biliary tree are severe complications of gastric surgery. The association of duodenal and major biliary fistulas occurs rarely and is a major challenge for treatment. They may occur during virtually any kind of operation, but they are more frequent in cases complicated by the presence of difficult duodenal ulcers or cancer, with a mortality rate of up to 35%. Options for treatment are many and range from simple drainage to extended resections and difficult reconstructions. Conservative treatment is the choice for well-drained fistulas, but some cases require reoperation. Very little is known about reoperation techniques and technical selection of the right patients. We present the case of a complex iatrogenic duodenal and biliary fistula. A 42-year-old Caucasian man with a diagnosis of postoperative peritonitis had been operated on 3 days earlier; an antrectomy with a Roux-en-Y reconstruction for stenotic peptic disease was performed. Conservative treatment was attempted with mixed results. Two more operations were required to achieve a definitive resolution of the fistula and related local complications. The decision was made to perform a pancreatoduodenectomy with subsequent reconstruction on a double jejunal loop. The patient did well and was discharged on postoperative day 17. In our experience pancreaticoduodenectomy may be an effective treatment of refractory and complex iatrogenic fistulas involving both the duodenum and the biliary tree.

  17. Circadian Effects on Simple Components of Complex Task Performance

    Science.gov (United States)

    Clegg, Benjamin A.; Wickens, Christopher D.; Vieane, Alex Z.; Gutzwiller, Robert S.; Sebok, Angelia L.

    2015-01-01

    The goal of this study was to advance understanding and prediction of the impact of circadian rhythm on aspects of complex task performance during unexpected automation failures, and subsequent fault management. Participants trained on two tasks: a process control simulation, featuring automated support; and a multi-tasking platform. Participants then completed one task in a very early morning (circadian night) session, and the other during a late afternoon (circadian day) session. Small effects of time of day were seen on simple components of task performance, but impacts on more demanding components, such as those that occur following an automation failure, were muted relative to previous studies where circadian rhythm was compounded with sleep deprivation and fatigue. Circadian low participants engaged in compensatory strategies, rather than passively monitoring the automation. The findings and implications are discussed in the context of a model that includes the effects of sleep and fatigue factors.

  18. The complex nature of mixed farming systems requires multidimensional actions supported by integrative research and development efforts

    DEFF Research Database (Denmark)

    González-García, E; Gourdine, J L; Alexandre, G

    2012-01-01

    the requirement for a change in research strategies and initiatives through the development of a complex but necessary multi-/inter-/trans-disciplinary teamwork spirit. We stress as essential the collaboration and active participation of local and regional actors, stakeholders and end-users in the identification...

  19. The dynein regulatory complex is required for ciliary motility and otolith biogenesis in the inner ear.

    Science.gov (United States)

    Colantonio, Jessica R; Vermot, Julien; Wu, David; Langenbacher, Adam D; Fraser, Scott; Chen, Jau-Nian; Hill, Kent L

    2009-01-08

    In teleosts, proper balance and hearing depend on mechanical sensors in the inner ear. These sensors include actin-based microvilli and microtubule-based cilia that extend from the surface of sensory hair cells and attach to biomineralized 'ear stones' (or otoliths). Otolith number, size and placement are under strict developmental control, but the mechanisms that ensure otolith assembly atop specific cells of the sensory epithelium are unclear. Here we demonstrate that cilia motility is required for normal otolith assembly and localization. Using in vivo video microscopy, we show that motile tether cilia at opposite poles of the otic vesicle create fluid vortices that attract otolith precursor particles, thereby biasing an otherwise random distribution to direct localized otolith seeding on tether cilia. Independent knockdown of subunits for the dynein regulatory complex and outer-arm dynein disrupt cilia motility, leading to defective otolith biogenesis. These results demonstrate a requirement for the dynein regulatory complex in vertebrates and show that cilia-driven flow is a key epigenetic factor in controlling otolith biomineralization.

  20. Geometric and Algebraic Approaches in the Concept of Complex Numbers

    Science.gov (United States)

    Panaoura, A.; Elia, I.; Gagatsis, A.; Giatilis, G.-P.

    2006-01-01

    This study explores pupils' performance and processes in tasks involving equations and inequalities of complex numbers requiring conversions from a geometric representation to an algebraic representation and conversions in the reverse direction, and also in complex numbers problem solving. Data were collected from 95 pupils of the final grade from…

  1. 40 CFR 158.2070 - Biochemical pesticides product performance data requirements.

    Science.gov (United States)

    2010-07-01

... Title 40 (Protection of Environment), § 158.2070: Biochemical pesticides product performance data requirements. ... efficacy data unless the pesticide product bears a claim to control public health pests, such as pest...

  2. Economic Complexity and Human Development: DEA performance measurement in Asia and Latin America

    OpenAIRE

    Ferraz, Diogo; Moralles, Hérick Fernando; Suarez Campoli, Jéssica; Ribeiro de Oliveira, Fabíola Cristina; do Nascimento Rebelatto, Daisy Aparecida

    2018-01-01

Economic growth is not the only factor that explains human development. For this reason, many authors have prioritized studies measuring the Human Development Index. However, these indices do not analyze how Economic Complexity can increase Human Development. The aim of this paper is to determine the efficiency of a set of nations from Latin America and Asia, measuring each country's performance in converting Economic Complexity into Human Development between 2010 and 2014. The method used was Data...
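Full DEA solves a linear program per decision-making unit; a deliberately simplified single-input, single-output version (efficiency as the output/input ratio normalized by the best performer) conveys the core idea. Country labels and figures below are invented:

```python
# Simplified efficiency scoring in the spirit of DEA: one input (an economic
# complexity score) and one output (HDI) per country. Real DEA solves a
# linear program per unit; here each output/input ratio is normalized by the
# best ratio. All figures are illustrative assumptions.
countries = {"A": (1.2, 0.78), "B": (0.9, 0.75), "C": (1.8, 0.80)}

def efficiency_scores(units):
    """Score each unit's output/input ratio relative to the best performer."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: round(r / best, 3) for name, r in ratios.items()}

print(efficiency_scores(countries))
```

Country B gets a score of 1.0 (the efficient frontier in this toy setting) even though its raw HDI is the lowest, which is exactly the kind of conversion-efficiency insight the paper pursues.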

  3. Soft plasma electrolysis with complex ions for optimizing electrochemical performance

    Science.gov (United States)

    Kamil, Muhammad Prisla; Kaseem, Mosab; Ko, Young Gun

    2017-03-01

Plasma electrolytic oxidation (PEO) is a promising surface treatment for light metals that tailors an oxide layer with excellent properties. However, PEO coatings generally exhibit a porous structure due to excessive plasma discharges, which restrains their performance. The present work utilized ethylenediaminetetraacetic acid (EDTA) and Cu-EDTA complexing agents as electrolyte additives that alter the plasma discharges to improve the electrochemical properties of Al-1.1Mg alloy coated by PEO. To achieve this purpose, PEO coatings were fabricated under an alternating current in silicate electrolytes containing EDTA and Cu-EDTA. EDTA complexes were found to modify the plasma discharging behaviour during PEO, leading to a lower porosity than that obtained without additives. This was attributed to a more homogeneous electrical field throughout the PEO process, while coating growth was maintained by an excess of dissolved Al due to the EDTA complexes. When Cu-EDTA was used, the number of discharge channels in the coating layer was lower than that with EDTA due to the incorporation of Cu2O and CuO altering the dielectric behaviour. Accordingly, the sample prepared in the electrolyte containing Cu-EDTA exhibited superior corrosion resistance to that with EDTA. The electrochemical mechanism for the excellent corrosion protection was elucidated in the context of an equivalent circuit model.
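Equivalent-circuit analysis of coatings often starts from a Randles-type circuit: a solution resistance in series with a charge-transfer resistance in parallel with a double-layer capacitance. A sketch under that assumption (all parameter values are invented; a higher charge-transfer resistance yields a higher low-frequency impedance modulus, i.e. better corrosion protection):

```python
import math

def randles_impedance(freq_hz, r_s, r_ct, c_dl):
    """Complex impedance of Rs + (Rct || Cdl), a simple Randles-type circuit."""
    omega = 2 * math.pi * freq_hz
    z_c = 1 / (1j * omega * c_dl)          # capacitor impedance
    z_par = (r_ct * z_c) / (r_ct + z_c)    # parallel combination Rct || Cdl
    return r_s + z_par

# Hypothetical fitted parameters (ohm, farad); not values from the paper:
for name, r_ct in [("EDTA", 5e5), ("Cu-EDTA", 5e6)]:
    z = randles_impedance(0.01, 20.0, r_ct, 1e-6)
    print(name, abs(z))
```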

  4. 42 CFR 457.710 - State plan requirements: Strategic objectives and performance goals.

    Science.gov (United States)

    2010-10-01

... Title 42 (Public Health), § 457.710, State plan requirements: Strategic objectives and performance goals (Allotments and Grants to States; Strategic Planning, Reporting, and Evaluation). The State's strategic objectives, performance goals and performance measures must include a common...

  5. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    Science.gov (United States)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

Rotating wind turbine blades generate complex wakes involving vortices (helical tip vortex, root vortex, etc.). These wakes are regions of high velocity deficits and high turbulence intensities, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 times the turbine diameter (10D) is sometimes used in wind-farm layout (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In case of complex terrain, within a short distance (say 10D) itself, the nearby terrain can rise in altitude and be high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially the smaller eddies generated by small ridges within the terrain), or (b) directly, by obstructing the wake region partially or fully in its flow path. Hence, an enhanced understanding of wake development due to wake-terrain interaction will help in wind-farm design. To this end the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, and (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex have shown a much faster decay of the vortex for complex terrain compared to flatter terrain. The potential reasons identified to explain the observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex, and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence.
The implications of

  6. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer prescribed. This paper will provide insight into the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements, and of the functions those requirements support, for highly complex problems.

  7. Is a Responsive Default Mode Network Required for Successful Working Memory Task Performance?

    Science.gov (United States)

    Čeko, Marta; Gracely, John L.; Fitzcharles, Mary-Ann; Seminowicz, David A.; Schweinhardt, Petra

    2015-01-01

In studies of cognitive processing using tasks with externally directed attention, regions showing increased (external-task-positive) and decreased or “negative” [default-mode network (DMN)] fMRI responses during task performance are dynamically responsive to increasing task difficulty. Responsiveness (modulation of fMRI signal by increasing load) has been linked directly to successful cognitive task performance in external-task-positive regions but not in DMN regions. To investigate whether a responsive DMN is required for successful cognitive performance, we compared healthy human subjects (n = 23) with individuals shown to have decreased DMN engagement (chronic pain patients, n = 28). Subjects performed a multilevel working-memory task (N-back) during fMRI. If a responsive DMN is required for successful performance, patients having reduced DMN responsiveness should show worsened performance; if performance is not reduced, their brains should show compensatory activation in external-task-positive regions or elsewhere. All subjects showed decreased accuracy and increased reaction times with increasing task level, with no significant group differences on either measure at any level. Patients had significantly reduced negative fMRI response (deactivation) of DMN regions (posterior cingulate/precuneus, medial prefrontal cortex). Controls showed expected modulation of DMN deactivation with increasing task difficulty. Patients showed significantly reduced modulation of DMN deactivation by task difficulty, despite their successful task performance. We found no evidence of compensatory neural recruitment in external-task-positive regions or elsewhere. Individual responsiveness of the external-task-positive ventrolateral prefrontal cortex, but not of DMN regions, correlated with task accuracy. These findings suggest that a responsive DMN may not be required for successful cognitive performance; a responsive external-task-positive network may be sufficient.
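The N-back task used here asks subjects to flag stimuli that match the one presented n trials earlier. A minimal scoring sketch; the stimulus sequence and response set are invented for illustration:

```python
def nback_targets(stimuli, n):
    """Indices where the stimulus matches the one presented n steps earlier."""
    return [i for i in range(n, len(stimuli)) if stimuli[i] == stimuli[i - n]]

def accuracy(stimuli, n, responses):
    """Fraction of correct target/non-target decisions.

    `responses` is the set of trial indices the subject flagged as targets.
    """
    targets = set(nback_targets(stimuli, n))
    trials = range(n, len(stimuli))
    correct = sum((i in targets) == (i in responses) for i in trials)
    return correct / len(trials)

seq = list("ABACBCB")
print(nback_targets(seq, 2))     # 2-back matches occur at indices 2, 5, 6
print(accuracy(seq, 2, {2}))     # subject flagged only index 2
```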

  8. Ligand effect on the performance of organic light-emitting diodes based on europium complexes

    International Nuclear Information System (INIS)

    Fang Junfeng; You Han; Gao Jia; Lu Wu; Ma Dongge

    2007-01-01

A series of europium complexes were synthesized and their electroluminescent (EL) characteristics were studied. It was found by comparison that different substituent groups, such as methyl, chlorine, and nitryl, on the ligand 1,10-phenanthroline significantly affect the EL performance of devices based on these complexes. More methyl substituents on the ligand 1,10-phenanthroline led to higher device efficiency. A chlorine substituent showed approximately the same EL performance as two methyl substituents, whereas a nitryl substituent significantly reduced the EL luminous efficiency. However, the β-diketonate ligands TTA and DBM exhibited similar EL performance. The improved EL luminous efficiency with proper substituents on the 1,10-phenanthroline was attributed to the reduction of the energy loss caused by light hydrogen atom vibration, as well as of concentration quenching caused by intermolecular interaction, and to the match of energy levels between the ligand and Eu3+.

  9. High-performance liquid chromatography of metal complexes of pheophytins a and b

    International Nuclear Information System (INIS)

    Brykina, G.D.; Lazareva, E.E.; Uvarova, M.I.; Shpigun, O.A.

    1997-01-01

Cu(II), Zn(II), Pb(II), Hg(II), and Ce(IV) complexes of pheophytins a and b were synthesized. The chromatographic retention parameters of pheophytins a and b, chlorophylls a and b, and the above complexes were determined under conditions of normal-phase and reversed-phase high-performance liquid chromatography (HPLC). The adsorption of metal pheophytinates in the hexane-n-butanol (96:4)-Silasorb 600 and acetonitrile-ethanol-acetic acid (40:40:16)-Nucleosil C18 systems was studied by HPLC. Factors that affect the chromatographic and adsorption characteristics of the compounds (structural differences between pheophytinates of the a and b series, the nature of the central metal atom, and the nature of the mobile and stationary phases) are discussed. It is demonstrated that pheophytins a and b and their metal complexes can be identified and quantitatively determined by HPLC in the concentration range (0.6-44.0)×10⁻⁶ M.
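Retention in HPLC is conventionally summarized by the retention factor k = (tR - t0)/t0 and the selectivity α = k2/k1 between two analytes. A sketch with invented retention times (not values from the paper):

```python
def retention_factor(t_r, t_0):
    """Retention factor k = (tR - t0) / t0, the standard HPLC retention measure."""
    return (t_r - t_0) / t_0

def selectivity(k_2, k_1):
    """Selectivity alpha between two analytes (by convention k_2 >= k_1)."""
    return k_2 / k_1

# Illustrative retention times in minutes for two metal pheophytinates:
t0 = 1.5                           # dead time
k_a = retention_factor(6.0, t0)    # 3.0
k_b = retention_factor(9.0, t0)    # 5.0
print(k_a, k_b, selectivity(k_b, k_a))
```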

  10. High performance ultrasonic field simulation on complex geometries

    Science.gov (United States)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations in the 0.1 s range. In this paper, we present recent works that aim at similar performances on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivisions to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1 s range.
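The time-of-flight interpolation mentioned above can be illustrated with a one-dimensional linear interpolation over sparse precomputed samples. The sample table is invented, and CIVA's actual scheme is certainly more elaborate; this only shows the principle of replacing expensive recomputation with interpolation:

```python
import bisect

def interp_tof(samples, x):
    """Linearly interpolate a time of flight from sorted (position, tof) samples."""
    xs = [p for p, _ in samples]
    ts = [t for _, t in samples]
    if x <= xs[0]:
        return ts[0]
    if x >= xs[-1]:
        return ts[-1]
    i = bisect.bisect_right(xs, x)
    frac = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ts[i - 1] + frac * (ts[i] - ts[i - 1])

# Sparse precomputed times of flight (position mm -> time us), values invented:
samples = [(0.0, 10.0), (5.0, 11.0), (10.0, 13.0)]
print(interp_tof(samples, 2.5))   # 10.5
print(interp_tof(samples, 7.5))   # 12.0
```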

  11. Memory controllers for high-performance and real-time MPSoCs : requirements, architectures, and future trends

    NARCIS (Netherlands)

    Akesson, K.B.; Huang, Po-Chun; Clermidy, F.; Dutoit, D.; Goossens, K.G.W.; Chang, Yuan-Hao; Kuo, Tei-Wei; Vivet, P.; Wingard, D.

    2011-01-01

    Designing memory controllers for complex real-time and high-performance multi-processor systems-on-chip is challenging, since sufficient capacity and (real-time) performance must be provided in a reliable manner at low cost and with low power consumption. This special session contains four

  12. 45 CFR 2516.810 - What types of evaluations are grantees and subgrantees required to perform?

    Science.gov (United States)

    2010-10-01

... Title 45 (Public Welfare), Corporation for National and Community Service, School-Based Service-Learning Programs. All grantees and subgrantees are required to perform internal evaluations, which are ongoing efforts to assess performance and improve quality. Grantees and subgrantees may, but are not required to, arrange...

  13. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Science.gov (United States)

    2010-10-01

... Title 42 (Public Health), Centers for Medicare..., § 493.1467, Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  14. Only one ATP-binding DnaX subunit is required for initiation complex formation by the Escherichia coli DNA polymerase III holoenzyme.

    Science.gov (United States)

    Wieczorek, Anna; Downey, Christopher D; Dallmann, H Garry; McHenry, Charles S

    2010-09-17

The DnaX complex (DnaX3δδ′χψ) within the Escherichia coli DNA polymerase III holoenzyme serves to load the dimeric sliding clamp processivity factor, β2, onto DNA. The complex contains three DnaX subunits, which occur in two forms: τ and the shorter γ, produced by translational frameshifting. Ten forms of E. coli DnaX complex containing all possible combinations of wild-type or a Walker A motif K51E variant τ or γ have been reconstituted and rigorously purified. DnaX complexes containing three DnaX K51E subunits do not bind ATP. Comparison of their ability to support formation of initiation complexes, as measured by processive replication by the DNA polymerase III holoenzyme, indicates a minimal requirement for one ATP-binding DnaX subunit. DnaX complexes containing two mutant DnaX subunits support DNA synthesis at about two-thirds the level of their wild-type counterparts. β2 binding (determined functionally) is diminished 12-30-fold for DnaX complexes containing two K51E subunits, suggesting that multiple ATPs must be bound to place the DnaX complex into a conformation with maximal affinity for β2. DNA synthesis activity can be restored by increased concentrations of β2. In contrast, severe defects in ATP hydrolysis are observed upon introduction of a single K51E DnaX subunit. Thus, ATP binding, hydrolysis, and the ability to form initiation complexes are not tightly coupled. These results suggest that although ATP hydrolysis likely enhances β2 loading, it is not absolutely required in a mechanistic sense for formation of functional initiation complexes.

  15. Influence of step complexity and presentation style on step performance of computerized emergency operating procedures

    Energy Technology Data Exchange (ETDEWEB)

    Xu Song [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China); Li Zhizhong [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China)], E-mail: zzli@tsinghua.edu.cn; Song Fei; Luo Wei; Zhao Qianyi; Salvendy, Gavriel [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China)

    2009-02-15

With the development of information technology, computerized emergency operating procedures (EOPs) are taking the place of paper-based ones. However, the ergonomics of computerized EOPs have not been studied adequately, since industrial practice is still quite limited. This study examined the influence of step complexity and presentation style of EOPs on step performance. A simulated computerized EOP system was developed in two presentation styles. Style A combined one- and two-dimensional flowcharts; Style B combined a two-dimensional flowchart with a success logic tree. Step complexity was quantified by a complexity measure model based on an entropy concept. Forty subjects participated in an experiment of EOP execution using the simulated system. Analysis of the experimental data indicates that step complexity and presentation style significantly influence step performance (both step error rate and operation time). Regression models were also developed; the regression results imply that the operation time of a step can be well predicted by step complexity, whereas step error rate can only be partly predicted by it. A questionnaire investigation further implies that step error rate was influenced not only by the operation task itself but also by other human factors. These findings may be useful for the design and assessment of computerized EOPs.
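The entropy-based step-complexity idea mentioned above can be illustrated with a minimal sketch. The element types and their counts below are hypothetical, not the measure from the paper; the sketch only shows how a Shannon-entropy score assigns higher complexity to a step containing more, and more evenly distributed, kinds of elements.

```python
import math
from collections import Counter

def step_complexity(elements):
    """Shannon entropy (in bits) over the element types of a procedure step.

    `elements` is a list of element-type labels (e.g. actions, checks,
    data references). More distinct types, more evenly distributed,
    yield higher entropy. Illustrative stand-in for the paper's measure.
    """
    counts = Counter(elements)
    n = len(elements)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

simple_step = ["check", "check", "act"]               # few, uneven element types
complex_step = ["check", "act", "compare", "record"]  # many, evenly spread types
print(step_complexity(simple_step) < step_complexity(complex_step))  # True
```

A regression of operation time on such a score, as the study reports, would then treat this entropy value as the predictor variable.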

  16. The binding of Xanthophylls to the bulk light-harvesting complex of photosystem II of higher plants. A specific requirement for carotenoids with a 3-hydroxy-beta-end group.

    Science.gov (United States)

    Phillip, Denise; Hobe, Stephan; Paulsen, Harald; Molnar, Peter; Hashimoto, Hideki; Young, Andrew J

    2002-07-12

    The pigment composition of the light-harvesting complexes (LHCs) of higher plants is highly conserved. The bulk complex (LHCIIb) binds three xanthophyll molecules in combination with chlorophyll (Chl) a and b. The structural requirements for binding xanthophylls to LHCIIb have been examined using an in vitro reconstitution procedure. Reassembly of the monomeric recombinant LHCIIb was performed using a wide range of native and nonnative xanthophylls, and a specific requirement for the presence of a hydroxy group at C-3 on a single beta-end group was identified. The presence of additional substituents (e.g. at C-4) did not interfere with xanthophyll binding, but they could not, on their own, support reassembly. cis isomers of zeaxanthin, violaxanthin, and lutein were not bound, whereas all-trans-neoxanthin and different chiral forms of lutein and zeaxanthin were incorporated into the complex. The C-3 and C-3' diols lactucaxanthin (a carotenoid native to many plant LHCs) and eschscholtzxanthin (a retro-carotenoid) both behaved very differently from lutein and zeaxanthin in that they would not support complex reassembly when used alone. Lactucaxanthin could, however, be bound when lutein was also present, and it showed a high affinity for xanthophyll binding site N1. In the presence of lutein, lactucaxanthin was readily bound to at least one lutein-binding site, suggesting that the ability to bind to the complex and initiate protein folding may be dependent on different structural features of the carotenoid molecule. The importance of carotenoid end group structure and ring-to-chain conformation around the C-6-C-7 torsion angle of the carotenoid molecule in binding and complex reassembly is discussed.

  17. Performance Assessment/Composite Analysis Modeling to Support a Holistic Strategy for the Closure of F Area, a Large Nuclear Complex at the Savannah River Site

    International Nuclear Information System (INIS)

    COOK, JAMES

    2004-01-01

A performance-based approach is being used at the Savannah River Site to close the F-Area Complex. F Area consists of a number of large industrial facilities, including plutonium separations, uranium fuel fabrication, tanks for storing high-level waste, and a number of smaller operations. A major part of the overall closure strategy is the use of techniques derived from the Performance Assessment and Composite Analysis requirements for low-level waste disposal at DOE sites. This process will provide a means of demonstrating the basis for deactivation, decommissioning and closure decisions to management, stakeholders and regulators.

  18. 75 FR 37712 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-06-30

    ... Performance Requirements To Support Air Traffic Control (ATC) Service; Technical Amendment AGENCY: Federal... FAA amended its regulations to add equipage requirements and performance standards for Automatic... Approval for deviation was renumbered as Sec. 21.618, effective April 14, 2010. On May 28, 2010, the FAA...

  19. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work carried out to support the basis for this standardisation effort. The work drew on experience from several national and international research projects, as well as contractual and field experience gained within the wind energy community. It was wide-ranging and addressed 'grey' areas of knowledge in existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)

  20. Dosimeter characteristics and service performance requirements

    International Nuclear Information System (INIS)

    Ambrosi, P.; Bartlett, D.T.

    1999-01-01

    The requirements for personal dosimeters and dosimetry services given by ICRP 26, ICRP 35, ICRP 60 and ICRP 75 are summarised and compared with the requirements given in relevant international standards. Most standards could be made more relevant to actual workplace conditions. In some standards, the required tests of energy and angular dependence of the response are not sufficient, or requirements on overall uncertainty are lacking. (author)

  1. Is a Responsive Default Mode Network Required for Successful Working Memory Task Performance?

    Science.gov (United States)

    Čeko, Marta; Gracely, John L; Fitzcharles, Mary-Ann; Seminowicz, David A; Schweinhardt, Petra; Bushnell, M Catherine

    2015-08-19

    In studies of cognitive processing using tasks with externally directed attention, regions showing increased (external-task-positive) and decreased or "negative" [default-mode network (DMN)] fMRI responses during task performance are dynamically responsive to increasing task difficulty. Responsiveness (modulation of fMRI signal by increasing load) has been linked directly to successful cognitive task performance in external-task-positive regions but not in DMN regions. To investigate whether a responsive DMN is required for successful cognitive performance, we compared healthy human subjects (n = 23) with individuals shown to have decreased DMN engagement (chronic pain patients, n = 28). Subjects performed a multilevel working-memory task (N-back) during fMRI. If a responsive DMN is required for successful performance, patients having reduced DMN responsiveness should show worsened performance; if performance is not reduced, their brains should show compensatory activation in external-task-positive regions or elsewhere. All subjects showed decreased accuracy and increased reaction times with increasing task level, with no significant group differences on either measure at any level. Patients had significantly reduced negative fMRI response (deactivation) of DMN regions (posterior cingulate/precuneus, medial prefrontal cortex). Controls showed expected modulation of DMN deactivation with increasing task difficulty. Patients showed significantly reduced modulation of DMN deactivation by task difficulty, despite their successful task performance. We found no evidence of compensatory neural recruitment in external-task-positive regions or elsewhere. Individual responsiveness of the external-task-positive ventrolateral prefrontal cortex, but not of DMN regions, correlated with task accuracy. These findings suggest that a responsive DMN may not be required for successful cognitive performance; a responsive external-task-positive network may be sufficient. 

  2. Generating units performances: power system requirements

    Energy Technology Data Exchange (ETDEWEB)

    Fourment, C; Girard, N; Lefebvre, H

    1994-08-01

The role of generating units within the power system goes beyond providing power and energy, and their performance is not measured only by their energy efficiency and availability. There is a strong interaction between the generating units and the power system, of which the units are essential components: for a given load profile, the frequency variation follows directly from the behaviour of the units and their ability to adapt their power output. In the same way, the voltages at the units' terminals are the key points to which the voltage profile at each node of the network is linked through the active and, especially, the reactive power flows. The customer will therefore experience the frequency and voltage variations induced by the units' behaviour. Moreover, under adverse conditions, if the units do not operate as well as expected, or trip, a portion of the system, possibly the whole system, may collapse. The limitation of a unit's performance has two kinds of consequences. First, it may result in an increased amount of unsupplied energy or a higher loss-of-load probability: for example, if the primary reserve is not sufficient, a generator trip may lead to an abnormal frequency deviation, and load may have to be shed to restore the balance. Second, it results in an economic over-cost for the system: for instance, if not enough 'cheap' units are capable of load-following, other units with higher operating costs have to be started up. We would like to stress the value, for the operators and design teams of the units on the one hand, and the operators and design teams of the system on the other, of dialogue and information exchange, in operation but also at the conception stage, in order to find a satisfactory compromise between the system requirements and the consequences for the generating units. (authors). 11 refs., 4 figs.

  3. Managing project complexity : A study into adapting early project phases to improve project performance in large engineering projects

    NARCIS (Netherlands)

    Bosch-Rekveldt, M.G.C.

    2011-01-01

Engineering projects are becoming increasingly complex, and project complexity is assumed to be one of the causes of projects being delivered late and over budget. However, what this project complexity actually comprises was unclear. To improve overall project performance, this study focuses

  4. Radioactive Waste Management Complex low-level waste radiological performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Maheras, S.J.; Rood, A.S.; Magnuson, S.O.; Sussman, M.E.; Bhatt, R.N.

    1994-04-01

    This report documents the projected radiological dose impacts associated with the disposal of radioactive low-level waste at the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. This radiological performance assessment was conducted to evaluate compliance with applicable radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the public and the environment. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the public via air, groundwater, and food chain pathways. Projections of doses were made for both offsite receptors and individuals inadvertently intruding onto the site after closure. In addition, uncertainty and sensitivity analyses were performed. The results of the analyses indicate compliance with established radiological criteria and provide reasonable assurance that public health and safety will be protected.

  5. Radioactive Waste Management Complex low-level waste radiological performance assessment

    International Nuclear Information System (INIS)

    Maheras, S.J.; Rood, A.S.; Magnuson, S.O.; Sussman, M.E.; Bhatt, R.N.

    1994-04-01

This report documents the projected radiological dose impacts associated with the disposal of radioactive low-level waste at the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. This radiological performance assessment was conducted to evaluate compliance with applicable radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the public and the environment. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the public via air, groundwater, and food chain pathways. Projections of doses were made for both offsite receptors and individuals inadvertently intruding onto the site after closure. In addition, uncertainty and sensitivity analyses were performed. The results of the analyses indicate compliance with established radiological criteria and provide reasonable assurance that public health and safety will be protected.

  6. Improvement Plans of Fermilab’s Proton Accelerator Complex

    Science.gov (United States)

    Shiltsev, Vladimir

    2017-09-01

The flagship of Fermilab’s long-term research program is the Deep Underground Neutrino Experiment (DUNE), located at the Sanford Underground Research Facility (SURF) in Lead, South Dakota, which will study neutrino oscillations with a baseline of 1300 km. The neutrinos will be produced in the Long-Baseline Neutrino Facility (LBNF), a proposed new beam line from Fermilab’s Main Injector. The physics goals of DUNE require a proton beam with a power of some 2.4 MW at 120 GeV, roughly four times the current maximum power. Here I discuss the current performance of the Fermilab proton accelerator complex and our plans for construction of an SRF proton linac as a key part of the Proton Improvement Plan-II (PIP-II), outline the main challenges on the way to multi-MW beam power operation of the Fermilab accelerator complex, and describe the staged plan to achieve the required performance over the next 15 years.

  7. Requirement of the Mre11 complex and exonuclease 1 for activation of the Mec1 signaling pathway.

    Science.gov (United States)

    Nakada, Daisuke; Hirano, Yukinori; Sugimoto, Katsunori

    2004-11-01

    The large protein kinases, ataxia-telangiectasia mutated (ATM) and ATM-Rad3-related (ATR), orchestrate DNA damage checkpoint pathways. In budding yeast, ATM and ATR homologs are encoded by TEL1 and MEC1, respectively. The Mre11 complex consists of two highly related proteins, Mre11 and Rad50, and a third protein, Xrs2 in budding yeast or Nbs1 in mammals. The Mre11 complex controls the ATM/Tel1 signaling pathway in response to double-strand break (DSB) induction. We show here that the Mre11 complex functions together with exonuclease 1 (Exo1) in activation of the Mec1 signaling pathway after DNA damage and replication block. Mec1 controls the checkpoint responses following UV irradiation as well as DSB induction. Correspondingly, the Mre11 complex and Exo1 play an overlapping role in activation of DSB- and UV-induced checkpoints. The Mre11 complex and Exo1 collaborate in producing long single-stranded DNA (ssDNA) tails at DSB ends and promote Mec1 association with the DSBs. The Ddc1-Mec3-Rad17 complex associates with sites of DNA damage and modulates the Mec1 signaling pathway. However, Ddc1 association with DSBs does not require the function of the Mre11 complex and Exo1. Mec1 controls checkpoint responses to stalled DNA replication as well. Accordingly, the Mre11 complex and Exo1 contribute to activation of the replication checkpoint pathway. Our results provide a model in which the Mre11 complex and Exo1 cooperate in generating long ssDNA tracts and thereby facilitate Mec1 association with sites of DNA damage or replication block.

  8. QMU as an approach to strengthening the predictive capabilities of complex models.

    Energy Technology Data Exchange (ETDEWEB)

    Gray, Genetha Anne.; Boggs, Paul T.; Grace, Matthew D.

    2010-09-01

    Complex systems are made up of multiple interdependent parts, and the behavior of the entire system cannot always be directly inferred from the behavior of the individual parts. They are nonlinear and system responses are not necessarily additive. Examples of complex systems include energy, cyber and telecommunication infrastructures, human and animal social structures, and biological structures such as cells. To meet the goals of infrastructure development, maintenance, and protection for cyber-related complex systems, novel modeling and simulation technology is needed. Sandia has shown success using M&S in the nuclear weapons (NW) program. However, complex systems represent a significant challenge and relative departure from the classical M&S exercises, and many of the scientific and mathematical M&S processes must be re-envisioned. Specifically, in the NW program, requirements and acceptable margins for performance, resilience, and security are well-defined and given quantitatively from the start. The Quantification of Margins and Uncertainties (QMU) process helps to assess whether or not these safety, reliability and performance requirements have been met after a system has been developed. In this sense, QMU is used as a sort of check that requirements have been met once the development process is completed. In contrast, performance requirements and margins may not have been defined a priori for many complex systems, (i.e. the Internet, electrical distribution grids, etc.), particularly not in quantitative terms. This project addresses this fundamental difference by investigating the use of QMU at the start of the design process for complex systems. Three major tasks were completed. First, the characteristics of the cyber infrastructure problem were collected and considered in the context of QMU-based tools. Second, UQ methodologies for the quantification of model discrepancies were considered in the context of statistical models of cyber activity. Third
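The margin-and-uncertainty bookkeeping at the heart of QMU can be sketched numerically. The threshold and values below are invented, and the confidence ratio CR = margin / uncertainty is the generic textbook form of the metric, not Sandia's specific tooling.

```python
def confidence_ratio(best_estimate, threshold, uncertainty):
    """QMU-style confidence ratio CR = M / U.

    M is the margin between a performance best estimate and its
    requirement threshold; U is the total uncertainty in that estimate.
    CR > 1 suggests the requirement is met despite the uncertainty.
    All values here are illustrative, not from any real system.
    """
    margin = best_estimate - threshold
    return margin / uncertainty

# Hypothetical requirement: a performance metric must exceed 100 units.
cr = confidence_ratio(best_estimate=130.0, threshold=100.0, uncertainty=15.0)
print(cr)        # 2.0
print(cr > 1.0)  # True
```

For complex systems without a priori requirements, as the abstract notes, the threshold itself is the missing input: the ratio cannot be evaluated until margins are defined.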

  9. Distributed control software of high-performance control-loop algorithm

    CERN Document Server

    Blanc, D

    1999-01-01

    The majority of industrial cooling and ventilation plants require the control of complex processes. All these processes are highly important for the operation of the machines. The stability and reliability of these processes are leading factors identifying the quality of the service provided. The control system architecture and software structure, as well, are required to have high dynamical performance and robust behaviour. The intelligent systems based on PID or RST controllers are used for their high level of stability and accuracy. The design and tuning of these complex controllers require the dynamic model of the plant to be known (generally obtained by identification) and the desired performance of the various control loops to be specified for achieving good performances. The concept of having a distributed control algorithm software provides full automation facilities with well-adapted functionality and good performances, giving methodology, means and tools to master the dynamic process optimization an...

  10. Fluid Ability (Gf) and Complex Problem Solving (CPS)

    Directory of Open Access Journals (Sweden)

    Patrick Kyllonen

    2017-07-01

Complex problem solving (CPS) has emerged over the past several decades as an important construct in education and in the workforce. We examine the relationship between CPS and general fluid ability (Gf) both conceptually and empirically. A review of definitions of the two factors, prototypical tasks, and the information processing analyses of performance on those tasks suggest considerable conceptual overlap. We review three definitions of CPS: a general definition emerging from the human problem solving literature; a more specialized definition from the “German School” emphasizing performance in many-variable microworlds, with high domain-knowledge requirements; and a third definition based on performance in Minimal Complex Systems (MCS), with fewer variables and reduced knowledge requirements. We find a correlation of 0.86 between expert ratings of the importance of CPS and Gf across 691 occupations in the O*NET database. We find evidence that employers value both Gf and CPS skills, but CPS skills more highly, even after controlling for the importance of domain knowledge. We suggest that this may be due to CPS requiring not just cognitive ability but additionally skill in applying that ability in domains. We suggest that a fruitful future direction is to explore the importance of domain knowledge in CPS.
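The 0.86 figure above is a plain Pearson correlation over paired importance ratings. A self-contained sketch, using made-up ratings rather than the actual O*NET data:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Made-up expert importance ratings for a handful of occupations,
# standing in for the 691 O*NET occupations used in the study.
cps_ratings = [3.1, 4.0, 2.2, 4.5, 3.8]
gf_ratings  = [3.0, 4.2, 2.5, 4.4, 3.6]
print(round(pearson_r(cps_ratings, gf_ratings), 2))
```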

  11. A Protein Complex Required for Polymerase V Transcripts and RNA- Directed DNA Methylation in Arabidopsis

    KAUST Repository

Law, Julie A.; Ausín, Israel; Johnson, Lianna M.; Vashisht, Ajay A.; Zhu, Jian-Kang; Wohlschlegel, James A.; Jacobsen, Steven E.

    2010-01-01

    DNA methylation is an epigenetic modification associated with gene silencing. In Arabidopsis, DNA methylation is established by DOMAINS REARRANGED METHYLTRANSFERASE 2 (DRM2), which is targeted by small interfering RNAs through a pathway termed RNA-directed DNA methylation (RdDM) [1, 2]. Recently, RdDM was shown to require intergenic noncoding (IGN) transcripts that are dependent on the Pol V polymerase. These transcripts are proposed to function as scaffolds for the recruitment of downstream RdDM proteins, including DRM2, to loci that produce both siRNAs and IGN transcripts [3]. However, the mechanism(s) through which Pol V is targeted to specific genomic loci remains largely unknown. Through affinity purification of two known RdDM components, DEFECTIVE IN RNA-DIRECTED DNA METHYLATION 1 (DRD1) [4] and DEFECTIVE IN MERISTEM SILENCING 3 (DMS3) [5, 6], we found that they copurify with each other and with a novel protein, RNA-DIRECTED DNA METHYLATION 1 (RDM1), forming a complex we term DDR. We also found that DRD1 copurified with Pol V subunits and that RDM1, like DRD1 [3] and DMS3 [7], is required for the production of Pol V-dependent transcripts. These results suggest that the DDR complex acts in RdDM at a step upstream of the recruitment or activation of Pol V. © 2010 Elsevier Ltd. All rights reserved.

  12. A Protein Complex Required for Polymerase V Transcripts and RNA- Directed DNA Methylation in Arabidopsis

    KAUST Repository

    Law, Julie A.

    2010-05-01

    DNA methylation is an epigenetic modification associated with gene silencing. In Arabidopsis, DNA methylation is established by DOMAINS REARRANGED METHYLTRANSFERASE 2 (DRM2), which is targeted by small interfering RNAs through a pathway termed RNA-directed DNA methylation (RdDM) [1, 2]. Recently, RdDM was shown to require intergenic noncoding (IGN) transcripts that are dependent on the Pol V polymerase. These transcripts are proposed to function as scaffolds for the recruitment of downstream RdDM proteins, including DRM2, to loci that produce both siRNAs and IGN transcripts [3]. However, the mechanism(s) through which Pol V is targeted to specific genomic loci remains largely unknown. Through affinity purification of two known RdDM components, DEFECTIVE IN RNA-DIRECTED DNA METHYLATION 1 (DRD1) [4] and DEFECTIVE IN MERISTEM SILENCING 3 (DMS3) [5, 6], we found that they copurify with each other and with a novel protein, RNA-DIRECTED DNA METHYLATION 1 (RDM1), forming a complex we term DDR. We also found that DRD1 copurified with Pol V subunits and that RDM1, like DRD1 [3] and DMS3 [7], is required for the production of Pol V-dependent transcripts. These results suggest that the DDR complex acts in RdDM at a step upstream of the recruitment or activation of Pol V. © 2010 Elsevier Ltd. All rights reserved.

  13. MYRRHA cryogenic system study on performances and reliability requirements

    International Nuclear Information System (INIS)

    Junquera, T.; Chevalier, N.R.; Thermeau, J.P.; Medeiros Romao, L.; Vandeplassche, D.

    2015-01-01

A precise evaluation of the cryogenic requirements for an accelerator-driven system such as the MYRRHA project has been performed. In particular, the operating temperature, thermal losses, and required cryogenic power have been evaluated. A preliminary architecture of the cryogenic system, including all its major components, as well as the principles for cryogenic fluid distribution, has been proposed. A detailed study of the reliability aspects has also been initiated, based on the reliability of large cryogenic systems used for accelerators such as HERA, the LHC and the SNS linac. The requirements to guarantee good cryogenic system availability can be summarised as follows: 1) Mean Time Between Maintenance (MTBM) should be > 8,000 hours; 2) valves, heat exchangers and turbines are particularly sensitive to impurities (dust, oil, gases), and improvements are necessary to keep the level of impurities in these components minimal; 3) redundancy studies are necessary for all elements containing moving/vibrating parts (turbines, compressors, including their respective bearings and shaft seals); 4) periodic maintenance is mandatory: oil checks, inspection of screw compressors every 10,000-15,000 hours, a vibration surveillance programme, etc.; 5) special control and maintenance of utilities equipment (cooling water, compressed air and electrical supply) is necessary; 6) periodic vacuum checks to detect the appearance of leaks, such as in the insulation vacuum of transfer lines and distribution boxes, are necessary; 7) easily exchangeable cold compressors are required.

  14. Impact of the motion and visual complexity of the background on players' performance in video game-like displays.

    Science.gov (United States)

    Caroux, Loïc; Le Bigot, Ludovic; Vibert, Nicolas

    2013-01-01

    The visual interfaces of virtual environments such as video games often show scenes where objects are superimposed on a moving background. Three experiments were designed to better understand the impact of the complexity and/or overall motion of two types of visual backgrounds often used in video games on the detection and use of superimposed, stationary items. The impact of background complexity and motion was assessed during two typical video game tasks: a relatively complex visual search task and a classic, less demanding shooting task. Background motion impaired participants' performance only when they performed the shooting game task, and only when the simplest of the two backgrounds was used. In contrast, and independently of background motion, performance on both tasks was impaired when the complexity of the background increased. Eye movement recordings demonstrated that most of the findings reflected the impact of low-level features of the two backgrounds on gaze control.

  15. Modeling the Non-functional Requirements in the Context of Usability, Performance, Safety and Security

    OpenAIRE

    Sadiq, Mazhar

    2007-01-01

Requirements engineering is one of the most significant parts of the software development life cycle. Until now, great emphasis has been placed on the maturity of functional requirements, but over time it has become clear that the success of software development does not depend on functional requirements alone: non-functional requirements should also be taken into consideration. Among the non-functional requirements, usability, performance, safety and security are considered important. ...

  16. A Fluorine-18 Radiolabeling Method Enabled by Rhenium(I) Complexation Circumvents the Requirement of Anhydrous Conditions.

    Science.gov (United States)

    Klenner, Mitchell A; Pascali, Giancarlo; Zhang, Bo; Sia, Tiffany R; Spare, Lawson K; Krause-Heuer, Anwen M; Aldrich-Wright, Janice R; Greguric, Ivan; Guastella, Adam J; Massi, Massimiliano; Fraser, Benjamin H

    2017-05-11

    Azeotropic distillation is typically required to achieve fluorine-18 radiolabeling during the production of positron emission tomography (PET) imaging agents. However, this time-consuming process also limits fluorine-18 incorporation, due to radioactive decay of the isotope and its adsorption to the drying vessel. In addressing these limitations, the fluorine-18 radiolabeling of one model rhenium(I) complex is reported here, which is significantly improved under conditions that do not require azeotropic drying. This work could open a route towards the investigation of a simplified metal-mediated late-stage radiofluorination method, which would expand upon the accessibility of new PET and PET-optical probes. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. A Study on Performance Requirements for Advanced Alarm System

    International Nuclear Information System (INIS)

    Seong, Duk Hyun; Jeong, Jae Hoon; Sim, Young Rok; Ko, Jong Hyun; Kim, Jung Seon; Jang, Gwi Sook; Park, Geun Ok

    2005-01-01

A design goal of an advanced alarm system is to provide advanced alarm information to the operators in the main control room. To achieve this, a computer-based system is applied to the alarm system, since it must support data management and advanced alarm processing (i.e., a database management system and software modules for alarm processing), which are not feasible in an analog-based alarm system; previous research prototypes have likewise been built on digital computers. We tested and studied digital systems for the advanced alarm system testbed using test equipment, from the viewpoint of system performance, stability and security. In this paper, we describe the general software architecture of these previous research prototypes, as well as the CPU performance and system software requirements needed to support them, together with stability and security.

  18. Managing today's complex healthcare business enterprise: reflections on distinctive requirements of healthcare management education.

    Science.gov (United States)

    Welton, William E

    2004-01-01

    In early 2001, the community of educational programs offering master's-level education in healthcare management began an odyssey to modernize its approach to the organization and delivery of healthcare management education. The community recognized that cumulative long-term changes within healthcare management practice required a careful examination of healthcare management context and manpower requirements. This article suggests an evidence-based rationale for defining the distinctive elements of healthcare management, thus suggesting a basis for review and transformation of master's-level healthcare management curricula. It also suggests ways to modernize these curricula in a manner that recognizes the distinctiveness of the healthcare business enterprise as well as the changing management roles and careers within these complex organizations and systems. Through such efforts, the healthcare management master's-level education community would be better prepared to meet current and future challenges, to increase its relevance to the management practice community, and to allocate scarce faculty and program resources more effectively.

  19. What model resolution is required in climatological downscaling over complex terrain?

    Science.gov (United States)

    El-Samra, Renalda; Bou-Zeid, Elie; El-Fadel, Mutasem

    2018-05-01

    This study presents results from the Weather Research and Forecasting (WRF) model applied for climatological downscaling simulations over highly complex terrain along the Eastern Mediterranean. We sequentially downscale general circulation model results, for a mild and wet year (2003) and a hot and dry year (2010), to three local horizontal resolutions of 9, 3 and 1 km. Simulated near-surface hydrometeorological variables are compared at different time scales against data from an observational network over the study area comprising rain gauges, anemometers, and thermometers. The overall performance of WRF at 1 and 3 km horizontal resolution was satisfactory, with significant improvement over the 9 km downscaling simulation. The total yearly precipitation from WRF's 1 km and 3 km domains provided a quantitative measure of the potential errors for various hydrometeorological variables.
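
    The model-vs-gauge comparison the abstract describes reduces to simple skill metrics. Below is a minimal, hypothetical Python sketch (the station values and nest outputs are invented, not the study's data) computing bias and RMSE for a coarse and a fine WRF nest:

    ```python
    import numpy as np

    def skill(obs, sim):
        """Bias and RMSE of simulated vs observed values."""
        obs, sim = np.asarray(obs, float), np.asarray(sim, float)
        bias = float(np.mean(sim - obs))
        rmse = float(np.sqrt(np.mean((sim - obs) ** 2)))
        return bias, rmse

    # Hypothetical monthly precipitation (mm) at one gauge vs two nests.
    obs   = [120, 95, 60, 30, 10, 2, 1, 3, 20, 55, 90, 110]
    wrf_9 = [150, 70, 80, 45, 25, 8, 0, 0, 35, 40, 120, 90]   # 9 km nest
    wrf_1 = [125, 90, 65, 33, 12, 3, 1, 2, 22, 52, 95, 105]   # 1 km nest
    for name, sim in [("9 km", wrf_9), ("1 km", wrf_1)]:
        print(name, skill(obs, sim))
    ```

    In this toy case the finer nest scores better on both metrics, mirroring the improvement over the 9 km simulation reported above.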

  20. Computational fluid dynamics: complex flows requiring supercomputers. January 1975-July 1988 (Citations from the INSPEC: Information Services for the Physics and Engineering Communities data base). Report for January 1975-July 1988

    International Nuclear Information System (INIS)

    1988-08-01

    This bibliography contains citations concerning computational fluid dynamics (CFD), a new method in computational science to perform complex flow simulations in three dimensions. Applications include aerodynamic design and analysis for aircraft, rockets, missiles, and automobiles; heat-transfer studies; and combustion processes. Included are references to supercomputers, array processors, and parallel processors where needed for complete, integrated design. Also included are software packages and grid-generation techniques required to apply CFD numerical solutions. Numerical methods for fluid dynamics, not requiring supercomputers, are found in a separate published search. (Contains 83 citations fully indexed and including a title list.)

  1. Dual chromatin recognition by the histone deacetylase complex HCHC is required for proper DNA methylation in Neurospora crassa

    Science.gov (United States)

    Honda, Shinji; Bicocca, Vincent T.; Gessaman, Jordan D.; Rountree, Michael R.; Yokoyama, Ayumi; Yu, Eun Y.; Selker, Jeanne M. L.; Selker, Eric U.

    2016-01-01

    DNA methylation, heterochromatin protein 1 (HP1), histone H3 lysine 9 (H3K9) methylation, histone deacetylation, and highly repeated sequences are prototypical heterochromatic features, but their interrelationships are not fully understood. Prior work showed that H3K9 methylation directs DNA methylation and histone deacetylation via HP1 in Neurospora crassa and that the histone deacetylase complex HCHC is required for proper DNA methylation. The complex consists of the chromodomain proteins HP1 and chromodomain protein 2 (CDP-2), the histone deacetylase HDA-1, and the AT-hook motif protein CDP-2/HDA-1–associated protein (CHAP). We show that the complex is required for proper chromosome segregation, dissect its function, and characterize interactions among its components. Our analyses revealed the existence of an HP1-based DNA methylation pathway independent of its chromodomain. The pathway partially depends on CHAP but not on the CDP-2 chromodomain. CDP-2 serves as a bridge between the recognition of H3K9 trimethylation (H3K9me3) by HP1 and the histone deacetylase activity of HDA-1. CHAP is also critical for HDA-1 localization to heterochromatin. Specifically, the CHAP zinc finger interacts directly with the HDA-1 argonaute-binding protein 2 (Arb2) domain, and the CHAP AT-hook motifs recognize heterochromatic regions by binding to AT-rich DNA. Our data shed light on the interrelationships among the prototypical heterochromatic features and support a model in which dual recognition by the HP1 chromodomain and the CHAP AT-hooks are required for proper heterochromatin formation. PMID:27681634

  2. Predicting sample size required for classification performance

    Directory of Open Access Journals (Sweden)

    Figueroa Rosa L

    2012-02-01

    Full Text Available Abstract Background Supervised learning methods need annotated data in order to generate efficient models. Annotated data, however, is a relatively scarce resource and can be expensive to obtain. For both passive and active learning methods, there is a need to estimate the size of the annotated sample required to reach a performance target. Methods We designed and implemented a method that fits an inverse power law model to points of a given learning curve created using a small annotated training set. Fitting is carried out using nonlinear weighted least squares optimization. The fitted model is then used to predict the classifier's performance and confidence interval for larger sample sizes. For evaluation, the nonlinear weighted curve fitting method was applied to a set of learning curves generated using clinical text and waveform classification tasks with active and passive sampling methods, and predictions were validated using standard goodness-of-fit measures. As a control we used an un-weighted fitting method. Results A total of 568 models were fitted and the model predictions were compared with the observed performances. Depending on the data set and sampling method, it took between 80 and 560 annotated samples to achieve mean absolute and root mean squared error below 0.01. Results also show that our weighted fitting method outperformed the baseline un-weighted method. Conclusions This paper describes a simple and effective sample size prediction algorithm that conducts weighted fitting of learning curves. The algorithm outperformed an un-weighted algorithm described in previous literature. It can help researchers determine annotation sample size for supervised machine learning.
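
    The fitting procedure the abstract describes can be sketched as follows. This is an illustrative numpy implementation, not the authors' code: it assumes the inverse power law form acc(n) = a - b*n^(-c) and a particular weighting scheme (weights proportional to sample size), and fits by grid-searching the exponent with a closed-form weighted linear fit for the other two parameters.

    ```python
    import numpy as np

    def fit_inverse_power_law(n, acc, w):
        """Fit acc(n) ~= a - b * n**(-c) by weighted least squares.

        For a fixed exponent c the model is linear in (a, b), so we
        grid-search c and solve the remaining 2-parameter weighted
        fit in closed form at each grid point.
        """
        best = None
        sqw = np.sqrt(w)
        for c in np.linspace(0.05, 2.0, 400):
            x = n.astype(float) ** (-c)
            A = np.column_stack([np.ones_like(x), -x])      # acc ~= a - b*x
            coef, *_ = np.linalg.lstsq(A * sqw[:, None], acc * sqw, rcond=None)
            sse = np.sum(w * (acc - A @ coef) ** 2)
            if best is None or sse < best[0]:
                best = (sse, coef[0], coef[1], c)
        return best[1], best[2], best[3]                    # a, b, c

    # Synthetic learning curve saturating toward 0.95 (assumed ground truth).
    rng = np.random.default_rng(0)
    n = np.array([50, 100, 200, 400, 800])
    acc = 0.95 - 0.8 * n ** -0.5 + rng.normal(0.0, 0.002, n.size)
    w = n / n.sum()                  # weight larger-sample points more heavily
    a, b, c = fit_inverse_power_law(n, acc, w)
    pred_5000 = a - b * 5000.0 ** (-c)   # extrapolated accuracy at 5000 samples
    print(round(float(pred_5000), 3))
    ```

    Under these assumptions the fitted exponent should land near the true 0.5 and the extrapolated accuracy approaches the 0.95 plateau, which is exactly the kind of prediction the paper validates against observed learning curves.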

  3. Enhancing the Photovoltaic Performance of Perovskite Solar Cells with a Down-Conversion Eu-Complex.

    Science.gov (United States)

    Jiang, Ling; Chen, Wangchao; Zheng, Jiawei; Zhu, Liangzheng; Mo, Li'e; Li, Zhaoqian; Hu, Linhua; Hayat, Tasawar; Alsaedi, Ahmed; Zhang, Changneng; Dai, Songyuan

    2017-08-16

    Organometal halide perovskite solar cells (PSCs) have shown high photovoltaic performance but poor utilization of ultraviolet (UV) irradiation. Lanthanide complexes have a wide absorption range in the UV region and they can down-convert the absorbed UV light into visible light, which provides a possibility for PSCs to utilize UV light for higher photocurrent, efficiency, and stability. In this study, we use a transparent luminescent down-converting layer (LDL) of Eu-4,7-diphenyl-1,10-phenanthroline (Eu-complex) to improve the light utilization efficiency of PSCs. Compared with the uncoated PSC, the PSC coated with Eu-complex LDL on the reverse of the fluorine-doped tin oxide glass displayed an enhancement of 11.8% in short-circuit current density (J sc ) and 15.3% in efficiency due to the Eu-complex LDL re-emitting UV light (300-380 nm) in the visible range. It is indicated that the Eu-complex LDL plays the role of enhancing the power conversion efficiency as well as reducing UV degradation for PSCs.

  4. The Impact of Technology, Job Complexity and Religious Orientation on Managerial Performance

    Directory of Open Access Journals (Sweden)

    Jesmin Islam

    2011-12-01

    Full Text Available This paper explores the impact of technology, job complexity and religious orientation on the performance of managers in the financial services industries. Data were collected from bank managers representing Islamic and conventional banks in Bangladesh. Path models were used to analyse the data. The results provide support for the hypothesis that a management accounting systems (MAS) adequacy gap exists in the financial sector in a developing country such as Bangladesh. These Islamic and conventional banks also experienced varied outcomes regarding the impact of the MAS adequacy gap on managerial effectiveness. Significant results emerged concerning the direct influence of technology and job complexity on managerial effectiveness, although these findings again differed across religious and conventional banks. Significant intervening effects of both the MAS adequacy gap and job complexity on the relationships between contingency factors and managers' effectiveness were also found. Overall the findings showed that the type of religious orientation in Islamic banks wielded an important influence on the sensitivity of the MAS adequacy gap. Religious orientation served as a control device for the relationships between job-related contingency factors and managerial effectiveness.

  5. Task complexity and task, goal, and reward interdependence in group performance : a prescriptive model

    NARCIS (Netherlands)

    Vijfeijken, van H.T.G.A.; Kleingeld, P.A.M.; Tuijl, van H.F.J.M.; Algera, J.A.; Thierry, H.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  6. The GARP complex is required for cellular sphingolipid homeostasis

    DEFF Research Database (Denmark)

    Fröhlich, Florian; Petit, Constance; Kory, Nora

    2015-01-01

    (GARP) complex, which functions in endosome-to-Golgi retrograde vesicular transport, as a critical player in sphingolipid homeostasis. GARP deficiency leads to accumulation of sphingolipid synthesis intermediates, changes in sterol distribution, and lysosomal dysfunction. A GARP complex mutation...... analogous to a VPS53 allele causing progressive cerebello-cerebral atrophy type 2 (PCCA2) in humans exhibits similar, albeit weaker, phenotypes in yeast, providing mechanistic insights into disease pathogenesis. Inhibition of the first step of de novo sphingolipid synthesis is sufficient to mitigate many...

  7. Cognition and procedure representational requirements for predictive human performance models

    Science.gov (United States)

    Corker, K.

    1992-01-01

    Models and modeling environments for human performance are becoming significant contributors to early system design and analysis procedures. Issues of levels of automation, physical environment, informational environment, and manning requirements are being addressed by such man/machine analysis systems. The research reported here investigates the close interaction between models of human cognition and models that describe procedural performance. We describe a methodology for the decomposition of aircrew procedures that supports interaction with models of cognition on the basis of procedures observed; that serves to identify cockpit/avionics information sources and crew information requirements; and that provides the structure to support methods for function allocation among crew and aiding systems. Our approach is to develop an object-oriented, modular, executable software representation of the aircrew, the aircraft, and the procedures necessary to satisfy flight-phase goals. We then encode, in a time-based language, taxonomies of the conceptual, relational, and procedural constraints among the cockpit avionics and control system and the aircrew. We have designed and implemented a goals/procedures hierarchic representation sufficient to describe procedural flow in the cockpit. We then execute the procedural representation in simulation software and calculate the values of the flight instruments, aircraft state variables and crew resources using the constraints available from the relationship taxonomies. The system provides a flexible, extensible, manipulative and executable representation of aircrew and procedures that is generally applicable to crew/procedure task-analysis. The representation supports developed methods of intent inference, and is extensible to include issues of information requirements and functional allocation. We are attempting to link the procedural representation to models of cognitive functions to establish several intent inference methods.

  8. Compressive Sensing Based Bayesian Sparse Channel Estimation for OFDM Communication Systems: High Performance and Low Complexity

    Science.gov (United States)

    Xu, Li; Shan, Lin; Adachi, Fumiyuki

    2014-01-01

    In orthogonal frequency division multiplexing (OFDM) communication systems, channel state information (CSI) is required at the receiver because frequency-selective fading channels cause severe intersymbol interference (ISI) during data transmission. A broadband channel is often described by a few dominant channel taps, which can be probed by compressive sensing based sparse channel estimation (SCE) methods, for example the orthogonal matching pursuit algorithm, which exploit the sparse structure of the channel as prior information. However, these methods are vulnerable to both noise interference and column coherence in the training signal matrix. In other words, the primary objective of these conventional methods is to catch the dominant channel taps without reporting the posterior channel uncertainty. To improve the estimation performance, we propose a compressive sensing based Bayesian sparse channel estimation (BSCE) method which can not only exploit the channel sparsity but also mitigate the unexpected channel uncertainty without adding computational complexity. The proposed method can reveal potential ambiguity among multiple channel estimators that arises from observation noise or correlation among columns of the training matrix. Computer simulations show that the proposed method improves the estimation performance compared with conventional SCE methods. PMID:24983012
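
    The conventional greedy baseline mentioned here, orthogonal matching pursuit (OMP), is straightforward to sketch. The following is a toy numpy example with a synthetic sparse channel and Gaussian training matrix; it illustrates the baseline only, not the paper's Bayesian method:

    ```python
    import numpy as np

    def omp(A, y, k):
        """Orthogonal matching pursuit: greedily select k columns of A
        that best explain y, re-fitting coefficients by least squares."""
        residual = y.copy()
        support = []
        for _ in range(k):
            idx = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
            support.append(idx)
            coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
            residual = y - A[:, support] @ coef
        h_hat = np.zeros(A.shape[1])
        h_hat[support] = coef
        return h_hat

    rng = np.random.default_rng(1)
    n_taps, n_pilots, sparsity = 64, 32, 3
    h = np.zeros(n_taps)
    h[[2, 17, 40]] = [1.0, -0.6, 0.3]            # three dominant channel taps
    A = rng.normal(size=(n_pilots, n_taps)) / np.sqrt(n_pilots)  # training matrix
    y = A @ h + rng.normal(0.0, 0.01, n_pilots)  # noisy received pilots
    h_hat = omp(A, y, sparsity)
    print(sorted(np.flatnonzero(h_hat).tolist()))
    ```

    With low noise and an incoherent training matrix the greedy search recovers the tap positions; the abstract's point is that this baseline degrades under heavier noise or strongly correlated training columns, which motivates the Bayesian treatment.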

  9. The operative management of children with complex perianal Crohn's disease.

    Science.gov (United States)

    Seemann, Natashia M; King, Sebastian K; Elkadri, Abdul; Walters, Thomas; Fish, Joel; Langer, Jacob C

    2016-12-01

    Perianal Crohn's disease (PCD) can affect both quality of life and psychological wellbeing. A subset of pediatric patients with complex PCD require surgical intervention, although appropriate timing and treatment regimens remain unclear. This study aimed to describe a large pediatric cohort in a tertiary center to determine the range of surgical management in children with complex PCD. A retrospective review of children requiring operative intervention for PCD over 13 years (2002-2014) was performed. PCD was divided into simple and complex based on the type of surgical procedure, and the two groups were compared. The 57 children were divided into two groups: the simple group (N=43) underwent abscess drainage ± seton insertion alone, and the complex group (N=14) underwent loop ileostomy ± more extensive surgery. In the complex group, females were more predominant (57% of complex vs 30% of simple), and the average age at diagnosis was lower. Anti-TNF therapy was utilized in 79.1% of simple and 100% of complex PCD. All 14 complex patients underwent a defunctioning ileostomy, with 7 requiring further operations (subtotal colectomy=4, proctocolectomy ± anal sparing=5, plastic surgery reconstruction with perineal flap/graft=4). Complex PCD represents a small but challenging subset of patients in which major surgical intervention may be necessary to alleviate the symptoms of this debilitating condition. Level of evidence: retrospective case study with no control group (level IV). Copyright © 2016 Elsevier Inc. All rights reserved.

  10. Discussion on sealing performance required in disposal system. Hydraulic analysis of tunnel intersections

    International Nuclear Information System (INIS)

    Sugita, Yutaka; Takahashi, Yoshiaki; Uragami, Manabu; Kitayama, Kazumi; Fujita, Tomoo; Kawakami, Susumu; Yui, Mikazu; Umeki, Hiroyuki; Miyamoto, Yoichi

    2005-09-01

    The sealing performance of a repository must be considered in the safety assessment of a geological disposal system for high-level radioactive waste. NUMO and JNC established the 'Technical Commission on Sealing Technology of Repository' based on their cooperation agreement. The objectives of this commission are to present the concept of the sealing performance required in the disposal system and to set the direction of the future R and D programme for design requirements of closure components (backfilling material, clay plug, etc.) within the presented concept. In the first phase of this commission, the current status of domestic and international sealing technologies was reviewed, and repository components and repository environments were summarized. Subsequently, a hydraulic analysis of tunnel intersections, where a main tunnel and a disposal tunnel in a disposal panel meet, was performed, considering components in and around the engineered barrier system (EBS). Since all tunnels are connected in the underground facility, understanding the hydraulic behaviour of tunnel intersections is an important issue for estimating migration of radionuclides from the EBS and for evaluating the required sealing performance of the disposal system. The analytical results showed that the direction of the hydraulic gradient, the hydraulic conductivities of concrete and backfilling materials, and the position of the clay plug had an impact on the flow conditions around the EBS. (author)

  11. Reducing external speedup requirements for input-queued crossbars

    DEFF Research Database (Denmark)

    Berger, Michael Stubert

    2005-01-01

    performance degradation. This implies, that the required bandwidth between port card and switch card is 2 times the actual port speed, adding to cost and complexity. To reduce this bandwidth, a modified architecture is proposed that introduces a small amount of input and output memory on the switch card chip...

  12. L-Band Digital Aeronautical Communications System Engineering - Concepts of Use, Systems Performance, Requirements, and Architectures

    Science.gov (United States)

    Zelkin, Natalie; Henriksen, Stephen

    2010-01-01

    This NASA Contractor Report summarizes and documents the work performed to develop concepts of use (ConUse) and high-level system requirements and architecture for the proposed L-band (960 to 1164 MHz) terrestrial en route communications system. This work was completed as a follow-on to the technology assessment conducted by NASA Glenn Research Center and ITT for the Future Communications Study (FCS). ITT assessed air-to-ground (A/G) communications concepts of use and operations presented in relevant NAS-level, international, and NAS-system-level documents to derive the appropriate ConUse relevant to potential A/G communications applications and services for domestic continental airspace. ITT also leveraged prior concepts of use developed during the earlier phases of the FCS. A middle-out functional architecture was adopted by merging the functional system requirements identified in the bottom-up assessment of existing requirements with those derived as a result of the top-down analysis of ConUse and higher level functional requirements. Initial end-to-end system performance requirements were derived to define system capabilities based on the functional requirements and on NAS-SR-1000 and the Operational Performance Assessment conducted as part of the COCR. A high-level notional architecture of the L-DACS supporting A/G communication was derived from the functional architecture and requirements.

  13. Airflow patterns in complex workplaces

    International Nuclear Information System (INIS)

    Mishima, J.; Selby, J.M.; Lynch, T.P.; Langer, G.; Vallario, E.J.

    1987-01-01

    There are many considerations in obtaining an accurate evaluation of aerosols. One aspect that has been neglected is the study of airflow patterns within the workplace. In many nuclear facilities, the operations performed require extensive equipment (e.g., glove boxes, piping) that creates complex arrangements of physical barriers to flow. To provide samples of the airborne materials, particularly particles, knowledge of these complex airflow patterns is required for sampler placement. Recent studies have shown that materials introduced into the airflow within a workplace act as plumes embedded in major airflow streams. Portions of the plumes can recycle through the ventilated area, be lost to dead air pockets, or exhaust through unusual, unexpected outlets. Unusual flow patterns are observed even in relatively uncomplicated arrangements of equipment. This behavior must be factored into sampling/monitoring programs for evaluation of the airborne hazard to personnel within the workplace, consistent with the objectives of the program. Other factors that must also be considered to provide valid samples of airborne particulate materials are the objectives of the sampling program, the characteristics of the airborne particulate materials, nonsegregatory transport for the extracted materials, and the requirements of the measurement techniques used.

  14. Control of complex physically simulated robot groups

    Science.gov (United States)

    Brogan, David C.

    2001-10-01

    Actuated systems such as robots take many forms and sizes but each requires solving the difficult task of utilizing available control inputs to accomplish desired system performance. Coordinated groups of robots provide the opportunity to accomplish more complex tasks, to adapt to changing environmental conditions, and to survive individual failures. Similarly, groups of simulated robots, represented as graphical characters, can test the design of experimental scenarios and provide autonomous interactive counterparts for video games. The complexity of writing control algorithms for these groups currently hinders their use. A combination of biologically inspired heuristics, search strategies, and optimization techniques serve to reduce the complexity of controlling these real and simulated characters and to provide computationally feasible solutions.

  15. Air Force Officials did not Consistently Comply with Requirements for Assessing Contractor Performance

    Science.gov (United States)

    2016-01-29

    [Extraction residue: table-of-contents and appendix fragments from report DODIG-2016-043, including Appendix B, 'Improvement in PAR Completion Statistics,' a past performance 'Reporting Program' policy dated August 13, 2011, and a truncated mention of the Senate Armed Services Committee. One recoverable passage states that agencies must perform frequent evaluation of compliance with reporting requirements so they can readily identify delinquent past performance efforts.]

  16. Sensor Performance Requirements for the Retrieval of Atmospheric Aerosols by Airborne Optical Remote Sensing

    Directory of Open Access Journals (Sweden)

    Klaus I. Itten

    2008-03-01

    Full Text Available This study explores performance requirements for the retrieval of the atmospheric aerosol optical depth (AOD) by airborne optical remote sensing instruments. Independent of any retrieval technique, the calculated AOD retrieval requirements are compared with the expected performance parameters of the upcoming hyperspectral sensor APEX at the reference wavelength of 550 nm. The AOD accuracy requirements are defined to be capable of resolving transmittance differences of 0.01 to 0.04, according to the demands of atmospheric correction for remote sensing applications. For the purposes of this analysis, the signal at the sensor level is simulated by radiative transfer equations. The resulting radiances are translated into the AOD retrieval sensitivity (Δτλ,aer) and compared to the available measuring sensitivity of the sensor (NEΔLλ,sensor). This is done for multiple signal-to-noise ratios (SNR) and surface reflectance values. It is shown that an SNR of 100 is adequate for AOD retrieval at 550 nm under typical remote sensing conditions and a surface reflectance of 10% or less. Such dark surfaces require the lowest SNR values and therefore offer the best sensitivity for measuring AOD. Brighter surfaces with up to 30% reflectance require an SNR of around 300. AOD retrieval for targets above 50% surface reflectance is more problematic with the current sensor performance, as it may require an SNR larger than 1000. In general, feasibility is proven for the analyzed cases under simulated conditions.
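
    A toy numerical sketch of the reported trend: the required SNR grows with surface reflectance because the at-sensor radiance rises while the aerosol signal (the radiance change per unit AOD) shrinks toward a critical reflectance. All coefficients below are invented illustrative assumptions, chosen only to reproduce the abstract's orders of magnitude (~100 for dark surfaces, ~300 at 30% reflectance, >1000 above 50%); they are not the study's radiative transfer results.

    ```python
    def required_snr(surface_reflectance, delta_tau=0.02, path_radiance=20.0,
                     gain=100.0, s0=20.0, rho_crit=0.65):
        """Illustrative SNR requirement: the sensor's noise-equivalent
        radiance (total radiance / SNR) must not exceed the radiance
        change produced by an AOD change of delta_tau. All coefficients
        are invented for shape, not taken from the study."""
        total_radiance = path_radiance + gain * surface_reflectance
        # aerosol signal fades as reflectance approaches the critical value
        dl_dtau = s0 * max(rho_crit - surface_reflectance, 1e-6)
        return total_radiance / (dl_dtau * delta_tau)

    for rho in (0.05, 0.30, 0.60):
        print(f"reflectance {rho:.2f}: SNR ~ {required_snr(rho):.0f}")
    ```

    The qualitative behaviour matches the abstract: dark surfaces need the lowest SNR, and retrieval over bright targets quickly becomes impractical.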

  17. Performance of community health workers: situating their intermediary position within complex adaptive health systems.

    Science.gov (United States)

    Kok, Maryse C; Broerse, Jacqueline E W; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-09-02

    Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income countries. It examines their unique intermediary position between the communities they serve and actors in the health sector, and the complexity of the health systems in which they operate. The assessment combines evidence from the international literature on CHW programmes with research outcomes from the 5-year REACHOUT consortium, undertaking implementation research to improve CHW performance in six contexts (two in Asia and four in Africa). A conceptual framework on CHW performance, which explicitly conceptualizes the interface role of CHWs, is presented. Various categories of factors influencing CHW performance are distinguished in the framework: the context, the health system and intervention hardware, and the health system and intervention software. Hardware elements of CHW interventions comprise the supervision systems, training, accountability and communication structures, incentives, supplies and logistics. Software elements relate to the ideas, interests, relationships, power, values and norms of the health system actors. They influence CHWs' feelings of connectedness, familiarity, self-fulfilment and serving the same goals, and CHWs' perceptions of support received, respect, competence, honesty, fairness and recognition. The framework shines a spotlight on the need for programmes to pay more attention to the ideas, interests, relationships, power, values and norms of CHWs, communities, health professionals and other actors in the health system, if CHW performance is to improve.

  18. KDM2B recruitment of the Polycomb group complex, PRC1.1, requires cooperation between PCGF1 and BCORL1

    OpenAIRE

    Wong, Sarah J.; Gearhart, Micah D.; Taylor, Alexander B.; Nanyes, David R.; Ha, Daniel J.; Robinson, Angela K.; Artigas, Jason A.; Lee, Oliver J.; Demeler, Borries; Hart, P. John; Bardwell, Vivian J.; Kim, Chongwoo A.

    2016-01-01

    KDM2B recruits H2A-ubiquitinating activity of a non-canonical Polycomb Repression Complex 1 (PRC1.1) to CpG islands, facilitating gene repression. We investigated the molecular basis of recruitment using in vitro assembly assays to identify minimal components, subcomplexes and domains required for recruitment. A minimal four-component PRC1.1 complex can be assembled by combining two separately isolated subcomplexes: the DNA binding KDM2B/SKP1 heterodimer and the heterodimer of BCORL1 and the ...

  19. Recruitment of a SAP18-HDAC1 complex into HIV-1 virions and its requirement for viral replication.

    Directory of Open Access Journals (Sweden)

    Masha Sorin

    2009-06-01

    Full Text Available HIV-1 integrase (IN) is a virally encoded protein required for integration of viral cDNA into host chromosomes. INI1/hSNF5 is a component of the SWI/SNF complex that interacts with HIV-1 IN, is selectively incorporated into HIV-1 (but not other retroviral) virions, and modulates multiple steps, including particle production and infectivity. To gain further insight into the role of INI1 in HIV-1 replication, we screened for INI1-interacting proteins using the yeast two-hybrid system. We found that SAP18 (Sin3a-associated protein, 18 kDa), a component of the Sin3a-HDAC1 complex, directly binds to INI1 in yeast, in vitro and in vivo. Interestingly, we found that IN also binds to SAP18 in vitro and in vivo. SAP18 and components of a Sin3A-HDAC1 complex were specifically incorporated into HIV-1 (but not SIV and HTLV-1) virions in an HIV-1 IN-dependent manner. Using a fluorescence-based assay, we found that HIV-1 (but not SIV) virion preparations harbour significant deacetylase activity, indicating the specific recruitment of catalytically active HDAC into the virions. To determine the requirement of virion-associated HDAC1 for HIV-1 replication, an inactive, transdominant negative mutant of HDAC1 (HDAC1(H141A)) was utilized. Incorporation of HDAC1(H141A) decreased the virion-associated histone deacetylase activity. Furthermore, incorporation of HDAC1(H141A) decreased the infectivity of HIV-1 (but not SIV) virions. The block in infectivity due to virion-associated HDAC1(H141A) occurred specifically at the early reverse transcription stage, while entry of the virions was unaffected. RNA-interference mediated knock-down of HDAC1 in producer cells resulted in decreased virion-associated HDAC1 activity and a reduction in infectivity of these virions. These studies indicate that HIV-1 IN and INI1/hSNF5 bind SAP18 and selectively recruit components of the Sin3a-HDAC1 complex into HIV-1 virions.
Furthermore, HIV-1 virion-associated HDAC1 is required for efficient early post

  20. 42 CFR 493.53 - Notification requirements for laboratories issued a certificate for provider-performed microscopy...

    Science.gov (United States)

    2010-10-01

    ... certificate for provider-performed microscopy (PPM) procedures. 493.53 Section 493.53 Public Health CENTERS... CERTIFICATION LABORATORY REQUIREMENTS Registration Certificate, Certificate for Provider-performed Microscopy... certificate for provider-performed microscopy (PPM) procedures. Laboratories issued a certificate for PPM...

  1. Applying Required Navigation Performance Concept for Traffic Management of Small Unmanned Aircraft Systems

    Science.gov (United States)

    Jung, Jaewoo; D'Souza, Sarah N.; Johnson, Marcus A.; Ishihara, Abraham K.; Modi, Hemil C.; Nikaido, Ben; Hasseeb, Hashmatullah

    2016-01-01

    In anticipation of a rapid increase in the number of civil Unmanned Aircraft System (UAS) operations, NASA is researching prototype technologies for a UAS Traffic Management (UTM) system that will investigate airspace integration requirements for enabling safe, efficient low-altitude operations. One aspect a UTM system must consider is the correlation between UAS operations (such as vehicles, operation areas, and durations), UAS performance requirements, and the risk to people and property in the operational area. This paper investigates the potential application of the International Civil Aviation Organization's (ICAO) Required Navigation Performance (RNP) concept to relate operational risk to trajectory conformance requirements. The approach is first to define a method to quantify operational risk and then to define the RNP level requirement as a function of that risk. Greater operational risk corresponds to a more accurate RNP level, i.e., a smaller tolerable Total System Error (TSE). Data from 19 small UAS flights are used to develop and validate a formula that defines this relationship. An approach to assessing UAS-RNP conformance capability using vehicle modeling and wind-field simulation is developed to investigate how this formula may be applied in a future UTM system. The results indicate that the modeled vehicle's flight path is robust to the simulated wind variation and can meet the RNP level requirements calculated by the formula. The results also indicate how vehicle-modeling fidelity may be improved to adequately verify the assessed RNP level.
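
    The paper's risk-to-RNP formula is not reproduced in this abstract, but the RNP containment check itself is standard: the Total System Error must stay within the RNP level for at least 95% of flight time. A minimal sketch with hypothetical cross-track error samples (not the paper's flight data):

    ```python
    def rnp_conformance(cross_track_errors_m, rnp_level_nm):
        """RNP-style containment check: Total System Error must stay
        within the RNP level (nautical miles) at least 95% of the time."""
        bound_m = rnp_level_nm * 1852.0              # 1 NM = 1852 m
        within = sum(abs(e) <= bound_m for e in cross_track_errors_m)
        frac = within / len(cross_track_errors_m)
        return frac, frac >= 0.95

    # Hypothetical cross-track errors (m) sampled along a small-UAS flight.
    errors = [5.0, -8.0, 12.0, 3.0, -20.0, 120.0, 7.0, -4.0, 9.0, 2.0,
              15.0, -11.0, 6.0, 1.0, -3.0, 8.0, 25.0, -6.0, 4.0, 10.0]
    frac, ok = rnp_conformance(errors, rnp_level_nm=0.05)   # 0.05 NM ~ 93 m
    print(frac, ok)
    ```

    A tighter RNP level (e.g. 0.01 NM) would fail this hypothetical track, which is the abstract's point: higher operational risk demands a smaller tolerable TSE.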

  2. Experience of two trauma-centers with pancreatic injuries requiring immediate surgery.

    Science.gov (United States)

    Ouaïssi, Mehdi; Sielezneff, Igor; Chaix, Jean Baptiste; Mardion, Remi Bon; Pirrò, Nicolas; Berdah, Stéphane; Emungania, Olivier; Consentino, Bernard; Cresti, Silvia; Dahan, Laetitia; Orsoni, Pierre; Moutardier, Vincent; Brunet, C; Sastre, Bernard

    2008-01-01

Pancreatic injury from blunt trauma is infrequent. The aim of the present study was to evaluate a simplified approach to the management of pancreatic trauma injuries requiring immediate surgery, consisting of either drainage in complex situations or pancreatectomy in the other cases. From January 1986 to December 2006, 40 operations for pancreatic trauma requiring immediate surgery were performed. Mechanism of trauma and clinical and laboratory findings were noted upon admission, and pancreatic injuries were classified according to Lucas' classification. Fifteen (100%) drainages were performed for stage I (n=15); 60% splenopancreatectomies and 40% drainages were performed for stage II (n=18); and 3 pancreaticoduodenectomies, 2 exclusions of the duodenum with drainage, and 2 packings were performed for stage IV (n=7). There were 30 men and 10 women with a mean age of 29+/-13 years (range 15-65). Thirty-eight patients had multiple trauma. Overall mortality and morbidity rates were 17% and 65%, respectively, and both rates increased with Lucas' pancreatic trauma stage. Distal pancreatectomy is indicated for distal injuries with duct involvement, and complex procedures such as pancreaticoduodenectomy should be performed in hemodynamically stable patients.

  3. A survey of clinical performance skills requirements in medical radiation technology

    International Nuclear Information System (INIS)

    Rowntree, P.A.; Veitch, J.D.

    1993-01-01

This paper outlines the reasons behind carrying out a study of clinical performance skills requirements and the method used to gather data. It describes the changes that have occurred in radiographer education in Queensland, the broader impact brought about by changes in professional body requirements, and the development of a Competency Based Standards Document for the profession. The paper provides examples of the survey design and layout being developed for distribution to third-year students in the Medical Imaging Technology major of the Bachelor of Applied Science (Medical Radiation Technology) at Queensland University of Technology, graduates, and clinical departments in Queensland. 1 tab., 1 fig

  4. Performance-scalable volumetric data classification for online industrial inspection

    Science.gov (United States)

    Abraham, Aby J.; Sadki, Mustapha; Lea, R. M.

    2002-03-01

    Non-intrusive inspection and non-destructive testing of manufactured objects with complex internal structures typically requires the enhancement, analysis and visualization of high-resolution volumetric data. Given the increasing availability of fast 3D scanning technology (e.g. cone-beam CT), enabling on-line detection and accurate discrimination of components or sub-structures, the inherent complexity of classification algorithms inevitably leads to throughput bottlenecks. Indeed, whereas typical inspection throughput requirements range from 1 to 1000 volumes per hour, depending on density and resolution, current computational capability is one to two orders-of-magnitude less. Accordingly, speeding up classification algorithms requires both reduction of algorithm complexity and acceleration of computer performance. A shape-based classification algorithm, offering algorithm complexity reduction, by using ellipses as generic descriptors of solids-of-revolution, and supporting performance-scalability, by exploiting the inherent parallelism of volumetric data, is presented. A two-stage variant of the classical Hough transform is used for ellipse detection and correlation of the detected ellipses facilitates position-, scale- and orientation-invariant component classification. Performance-scalability is achieved cost-effectively by accelerating a PC host with one or more COTS (Commercial-Off-The-Shelf) PCI multiprocessor cards. Experimental results are reported to demonstrate the feasibility and cost-effectiveness of the data-parallel classification algorithm for on-line industrial inspection applications.
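The two-stage ellipse Hough transform used by the classifier is not detailed in the abstract. As a hedged illustration of the underlying voting principle, the following minimal single-radius circle Hough sketch (all names and grid parameters are my own, and circles stand in for the paper's ellipses) accumulates center votes from edge points:

```python
import numpy as np

def hough_circle_center(points, radius, grid=64, extent=1.0):
    """Minimal Hough voting: each edge point votes for all centers at the
    given radius; the accumulator peak is the estimated circle center.
    points: (N, 2) array of edge coordinates in [0, extent)^2."""
    acc = np.zeros((grid, grid))
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    for x, y in points:
        cx = x - radius * np.cos(thetas)       # candidate center x-coords
        cy = y - radius * np.sin(thetas)       # candidate center y-coords
        ix = np.round(cx / extent * (grid - 1)).astype(int)
        iy = np.round(cy / extent * (grid - 1)).astype(int)
        ok = (ix >= 0) & (ix < grid) & (iy >= 0) & (iy < grid)
        np.add.at(acc, (ix[ok], iy[ok]), 1)    # accumulate votes
    peak = np.unravel_index(acc.argmax(), acc.shape)
    return peak[0] / (grid - 1) * extent, peak[1] / (grid - 1) * extent

# Synthetic ring of edge points around (0.5, 0.5) with radius 0.2
t = np.linspace(0, 2 * np.pi, 120, endpoint=False)
pts = np.c_[0.5 + 0.2 * np.cos(t), 0.5 + 0.2 * np.sin(t)]
cx, cy = hough_circle_center(pts, radius=0.2)
```

The per-point voting loops are independent, which is exactly the data parallelism the paper exploits for performance scalability.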

  5. Model Complexity and Out-of-Sample Performance: Evidence from S&P 500 Index Returns

    NARCIS (Netherlands)

    Kaeck, Andreas; Rodrigues, Paulo; Seeger, Norman J.

    We apply a range of out-of-sample specification tests to more than forty competing stochastic volatility models to address how model complexity affects out-of-sample performance. Using daily S&P 500 index returns, model confidence set estimations provide strong evidence that the most important model

  6. A complex symbol signal-to-noise ratio estimator and its performance

    Science.gov (United States)

    Feria, Y.

    1994-01-01

    This article presents an algorithm for estimating the signal-to-noise ratio (SNR) of signals that contain data on a downconverted suppressed carrier or the first harmonic of a square-wave subcarrier. This algorithm can be used to determine the performance of the full-spectrum combiner for the Galileo S-band (2.2- to 2.3-GHz) mission by measuring the input and output symbol SNR. A performance analysis of the algorithm shows that the estimator can estimate the complex symbol SNR using 10,000 symbols at a true symbol SNR of -5 dB with a mean of -4.9985 dB and a standard deviation of 0.2454 dB, and these analytical results are checked by simulations of 100 runs with a mean of -5.06 dB and a standard deviation of 0.2506 dB.
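The article's own estimator is not reproduced in the abstract. As a hedged illustration of blind SNR estimation from complex symbols, the standard second- and fourth-moment (M2M4) estimator can be sketched as follows, assuming constant-modulus symbols in complex Gaussian noise (this is a well-known textbook estimator, not necessarily the article's algorithm):

```python
import numpy as np

def m2m4_snr_db(r):
    """Blind M2/M4 SNR estimate for constant-modulus complex symbols in
    complex Gaussian noise: S^2 = 2*M2^2 - M4, N = M2 - S."""
    m2 = np.mean(np.abs(r) ** 2)
    m4 = np.mean(np.abs(r) ** 4)
    s = np.sqrt(max(2 * m2 ** 2 - m4, 1e-12))   # estimated signal power
    n = max(m2 - s, 1e-12)                      # estimated noise power
    return 10 * np.log10(s / n)

# Simulate 10,000 BPSK symbols at a true symbol SNR of -5 dB (as in the text)
rng = np.random.default_rng(0)
n_sym = 10_000
s_amp = 10 ** (-5.0 / 20)                       # noise power normalized to 1
symbols = rng.choice([-1.0, 1.0], n_sym)
noise = (rng.standard_normal(n_sym) + 1j * rng.standard_normal(n_sym)) / np.sqrt(2)
est_db = m2m4_snr_db(s_amp * symbols + noise)
```

Run repeatedly with different seeds, the spread of `est_db` around -5 dB gives the estimator's empirical mean and standard deviation, the same figures of merit the article reports for its estimator.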

  7. BETWEEN PARCIMONY AND COMPLEXITY: COMPARING PERFORMANCE MEASURES FOR ROMANIAN BANKING INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    ANCA MUNTEANU

    2012-01-01

The main objective of this study is to establish the relationship between traditional measures of performance (ROE, ROA and NIM) and EVA in order to gain some insight into the relevance of using more sophisticated performance measurement tools. Towards this end the study uses two acknowledged statistical measures: Kendall's tau and the Spearman rank correlation index. Using data from 12 Romanian banking institutions that report under IFRS for the period 2006-2010, the results suggest that EVA is generally highly correlated with residual income in the years with positive operational profits, whereas for the years with a negative outcome the correlation is low. ROA and ROE are the measures that best correlate with EVA over the entire period and thus, applying Occam's razor, could be used as a substitute for more complex shareholder earnings measures.
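Both rank statistics the study relies on are simple to compute for tie-free data; a minimal sketch (the EVA/ROA figures below are invented for illustration, not the study's data) might look like:

```python
def kendall_tau(x, y):
    """Kendall's tau-a: (concordant - discordant) / total pairs (no ties)."""
    n = len(x)
    conc = disc = 0
    for i in range(n):
        for j in range(i + 1, n):
            s = (x[i] - x[j]) * (y[i] - y[j])
            if s > 0:
                conc += 1
            elif s < 0:
                disc += 1
    return (conc - disc) / (n * (n - 1) / 2)

def spearman_rho(x, y):
    """Spearman rank correlation via the rank-difference formula (no ties)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, 1):
            r[i] = rank
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical yearly EVA vs. ROA figures for one bank (illustrative only)
eva = [1.2, 3.4, 2.8, -0.5, 0.9]
roa = [0.8, 1.9, 1.6, -0.2, 0.7]
```

Because both statistics depend only on ranks, a high value indicates that the cheap measure (ROA/ROE) orders the banks the same way EVA does, which is the study's practical argument for substitution.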

  8. IC-tagged proteins are able to interact with each other and perform complex reactions when integrated into muNS-derived inclusions.

    Science.gov (United States)

    Brandariz-Nuñez, Alberto; Otero-Romero, Iria; Benavente, Javier; Martinez-Costas, Jose M

    2011-09-20

We have recently developed a versatile tagging system (IC-tagging) that causes relocation of the tagged proteins to ARV muNS-derived intracellular globular inclusions. In the present study we demonstrate (i) that the IC-tag can be successfully fused to either the amino or the carboxyl terminus of the protein to be tagged and (ii) that IC-tagged proteins are able to interact with each other and perform complex reactions that require such interactions while integrated into muNS inclusions, increasing the versatility of the IC-tagging system. Also, our studies with the DsRed protein shed some light on the structure/function relationship in the evolution of the DsRed chromophore. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Turboelectric Aircraft Drive Key Performance Parameters and Functional Requirements

    Science.gov (United States)

    Jansen, Ralph H.; Brown, Gerald V.; Felder, James L.; Duffy, Kirsten P.

    2016-01-01

    The purpose of this paper is to propose specific power and efficiency as the key performance parameters for a turboelectric aircraft power system and investigate their impact on the overall aircraft. Key functional requirements are identified that impact the power system design. Breguet range equations for a base aircraft and a turboelectric aircraft are found. The benefits and costs that may result from the turboelectric system are enumerated. A break-even analysis is conducted to find the minimum allowable electric drive specific power and efficiency that can preserve the range, initial weight, operating empty weight, and payload weight of the base aircraft.
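The abstract does not reproduce the Breguet range equations or the break-even numbers; a hedged sketch of the jet-aircraft Breguet form and a simple break-even check (all values below are hypothetical placeholders, not the paper's aircraft data) could look like:

```python
import math

def breguet_range_km(v_kmh, tsfc_per_h, l_over_d, w_initial, w_final):
    """Jet Breguet range: R = (V / TSFC) * (L/D) * ln(Wi / Wf)."""
    return (v_kmh / tsfc_per_h) * l_over_d * math.log(w_initial / w_final)

# Hypothetical base aircraft (illustrative numbers only)
base_range = breguet_range_km(v_kmh=900, tsfc_per_h=0.55, l_over_d=18,
                              w_initial=70_000, w_final=50_000)

def breakeven_tsfc(target_range_km, v_kmh, l_over_d, w_initial, w_final):
    """Effective TSFC the turboelectric chain must match (via its specific
    power and efficiency) to preserve the base range at the same weights."""
    return (v_kmh * l_over_d * math.log(w_initial / w_final)) / target_range_km
```

Inverting the range equation this way is the essence of a break-even analysis: holding range and weights fixed exposes the minimum electric-drive efficiency (here folded into an effective TSFC) that the new power system must deliver.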

  10. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

This paper considers a method for evaluating the Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for the rate, distortion and slope of the R-D curve for inter and intra frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding and confirmed by experimental results. The complexity framework is applied to integer motion estimation.

  11. Construction products performances and basic requirements for fire safety of facades in energy rehabilitation of buildings

    Directory of Open Access Journals (Sweden)

    Laban Mirjana Đ.

    2015-01-01

A construction product is any product or kit which is produced and placed on the market for incorporation in a permanent manner in construction works, or parts thereof, and whose performance has an effect on the performance of the construction works with respect to their basic requirements. Safety in case of fire and energy economy and heat retention are two of the seven basic requirements a building has to meet according to contemporary technical rules on planning and construction. The performance of external wall building materials (particularly their reaction to fire) can significantly affect fire spread on the façade and other building parts. Therefore, façade shaping and materialization in the building renewal process have to meet the fire safety requirement as well as the energy requirement. A brief survey of the development of fire protection regulations in Serbia is presented in the paper. Preventive measures for fire risk reduction in building façade energy renewal are proposed according to contemporary fire safety requirements.

  12. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.

    2017-07-04

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  13. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.; Sicat, R.; Baum, D.; Wodo, O.; Hadwiger, Markus

    2017-01-01

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  14. Innovation in user-centered skills and performance improvement for sustainable complex service systems.

    Science.gov (United States)

    Karwowski, Waldemar; Ahram, Tareq Z

    2012-01-01

In order to leverage individual and organizational learning and to remain competitive in today's turbulent markets, it is important for employees, managers, planners and leaders to perform at high levels over time. Employee competence and skills are extremely important in view of the general shortage of talent and the mobility of talented employees. Two factors emerged as having the greatest impact on the competitiveness of complex service systems: improving managerial and employee knowledge attainment for skills, and improving the training and development of the workforce. This paper introduces a knowledge-based, user-centered service design approach for sustainable skill and performance improvement in the education, design and modeling of the next generation of complex service systems. The rest of the paper covers topics in human factors and sustainable business process modeling for the service industry, and illustrates the user-centered service system development cycle with the integration of systems engineering concepts in service systems. A roadmap for designing the service systems of the future is discussed. The framework introduced in this paper is based on key user-centered design principles and systems engineering applications to support service competitiveness.

  15. On improving the performance of nonphotochemical quenching in CP29 light-harvesting antenna complex

    Science.gov (United States)

    Berman, Gennady P.; Nesterov, Alexander I.; Sayre, Richard T.; Still, Susanne

    2016-03-01

    We model and simulate the performance of charge-transfer in nonphotochemical quenching (NPQ) in the CP29 light-harvesting antenna-complex associated with photosystem II (PSII). The model consists of five discrete excitonic energy states and two sinks, responsible for the potentially damaging processes and charge-transfer channels, respectively. We demonstrate that by varying (i) the parameters of the chlorophyll-based dimer, (ii) the resonant properties of the protein-solvent environment interaction, and (iii) the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ. Our analysis suggests strategies for improving the performance of the NPQ in response to environmental changes, and may stimulate experimental verification.

  16. Task complexity and task, goal, and reward interdependence in group performance management : A prescriptive model

    NARCIS (Netherlands)

    van Vijfeijken, H.; Kleingeld, A.; van Tuijl, H.; Algera, J.A.; Thierry, Hk.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  17. Verification and Examination Management of Complex Systems

    Directory of Open Access Journals (Sweden)

    Stian Ruud

    2014-10-01

As ship systems become more complex, with an increasing number of safety-critical functions, many interconnected subsystems, tight integration to other systems, and a large number of potential failure modes, several industry parties have identified the need for improved methods for managing the verification and examination efforts of such complex systems. Such needs are even more prominent now that the marine and offshore industries are targeting more activities and operations in the Arctic environment. In this paper, a set of requirements and a method for verification and examination management are proposed for allocating examination efforts to selected subsystems. The method is based on a definition of a verification risk function for a given system topology and given requirements. The marginal verification risks for the subsystems may then be evaluated, so that examination efforts for the subsystems can be allocated. Two cases of requirements and systems are used to demonstrate the proposed method. The method establishes a systematic relationship between the verification loss, the logic system topology, verification method performance, examination stop criterion, the required examination effort, and a proposed sequence of examinations to reach the examination stop criterion.

  18. Performance requirements of an inertial-fusion-energy source for hydrogen production

    International Nuclear Information System (INIS)

    Hovingh, J.

    1983-01-01

Performance of an inertial fusion system for the production of hydrogen is compared to a tandem-mirror-system hydrogen producer. Both systems use the General Atomic sulfur-iodine hydrogen-production cycle and produce no net electric power to the grid. An ICF-driven hydrogen producer will have higher system gains and lower electrical-consumption ratios than the design point for the tandem-mirror system if the inertial-fusion-energy gain ηQ > 8.8. For the ICF system to have a higher hydrogen production rate per unit fusion power than the tandem-mirror system requires that ηQ > 17. These can be achieved utilizing realistic laser and pellet performances

  19. Front-office/back-office configurations and operational performance in complex health services.

    Science.gov (United States)

    Gemmel, Paul; van Steenis, Thomas; Meijboom, Bert

    2014-01-01

    Acquired brain injury (ABI) occurs from various causes at different ages and leads to many different types of healthcare needs. Several Dutch ABI-networks installed a local co-ordination and contact point (CCP) which functions as a central and easily accessible service for people to consult when they have questions related to ABI. To explore the relationship between front/back office design and operational performance by investigating the particular enquiry service provided by different CCPs for people affected by an ABI. In-depth interviews with 14 FO/BO employees from three case organizations, complemented with information from desk research and three one-day field visits. The CCPs applied different FO/BO configurations in terms of customer contact and in terms of grouping of front and/or back office activities into tasks for one employee. It is the complexity of the enquiry that determines which approach is more appropriate. For complex enquiries, the level of decoupling is high in all CCPs. This allows multiple experts to be involved in the process. For regular enquiries, CCPs have a choice: either working in the same way as in the complex enquiries or coupling FO/BO activities to be able to serve clients faster and without handovers.

  20. Influence of complexing agents on the mechanical performances of the cement conditioning matrix

    International Nuclear Information System (INIS)

    Nicu, M.; Mihai, F.; Turcanu, C.

    1998-01-01

The safety of radioactive waste disposal is a priority requirement for the protection of the environment and population. For this reason, an engineered multi-barrier system is studied in order to be improved. This study aims to establish the influence of complexing agents on the mechanical performances of the cement conditioning matrix. Radioactive effluents containing agents such as oxalic and citric acids are generated during radioactive decontamination operations using chemical methods. The conditioning of these wastes by the cementing process required the experimental determination of the mechanical performances of the matrix and of the upper permissible level of complexing agent concentration. To determine the influence of complexing agents on the mechanical performances of the cement conditioning matrix, cubic samples (20 mm x 20 mm x 20 mm) were prepared using commercial Portland cement and solutions of organic complexing acids or salts (citric acid, oxalic acid, tartaric acid, sodium citrate and ammonium oxalate). The complexing agent concentration varied between 0.25% and 1% in distilled and drinking water, respectively. The selected water/cement ratio was 0.5.
The experiments were focused on: - establishing the firmness of the Pa 35 cement pastes and mortars in dependence on the water/cement ratio, by classical methods (Tetmeyer probe for pastes and standard cone for mortars) and by trickling time through a funnel with a 15 mm aperture; - studying the influence of tartaric, oxalic and citric acid, ammonium oxalate and sodium citrate solution concentrations on the water quantities used to obtain pastes with normal firmness and on Pa 35 cement setting; - the influence of oxalic acid, tartaric acid and ammonium oxalate solution concentrations on the compressive strength of the pastes with normal firmness; - for testing, standard test cubes with 20 mm sides were used and the compressive strength was tested at 28 days; - establishing the behaviour in time of

  1. Methodological approach to organizational performance improvement process

    OpenAIRE

    Buble, Marin; Dulčić, Želimir; Pavić, Ivan

    2017-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.

  2. Engineering Complex Embedded Systems with State Analysis and the Mission Data System

    Science.gov (United States)

    Ingham, Michel D.; Rasmussen, Robert D.; Bennett, Matthew B.; Moncada, Alex C.

    2004-01-01

It has become clear that spacecraft system complexity is reaching a threshold where customary methods of control are no longer affordable or sufficiently reliable. At the heart of this problem are the conventional approaches to systems and software engineering based on subsystem-level functional decomposition, which fail to scale in the tangled web of interactions typically encountered in complex spacecraft designs. Furthermore, there is a fundamental gap between the requirements on software specified by systems engineers and the implementation of these requirements by software engineers. Software engineers must perform the translation of requirements into software code, hoping to accurately capture the systems engineer's understanding of the system behavior, which is not always explicitly specified. This gap opens up the possibility for misinterpretation of the systems engineer's intent, potentially leading to software errors. This problem is addressed by a systems engineering methodology called State Analysis, which provides a process for capturing system and software requirements in the form of explicit models. This paper describes how requirements for complex aerospace systems can be developed using State Analysis and how these requirements inform the design of the system software, using representative spacecraft examples.

  3. Can Knowledge of the Characteristics of "High Performers" Be Generalised?

    Science.gov (United States)

    McKenna, Stephen

    2002-01-01

Two managers described as high-performing constructed complexity maps of their organization/world. The maps suggested that high performance is socially constructed and negotiated in specific contexts and that the management competencies associated with it are context-specific. Development of high performers thus requires personalized coaching more than…

  4. Integration plan required by performance agreement SM 7.2.1

    International Nuclear Information System (INIS)

    Diediker, L.P.

    1997-01-01

Fluor Daniel Hanford, Inc. and its major subcontractors are in agreement that environmental monitoring performed under the Project Hanford Management Contract is to be done in accordance with a single, integrated program. The purpose of this Integration Plan for Environmental Monitoring is to document the policies, systems, and processes being put in place to meet one key objective: manage and integrate a technically competent, multi-media ambient environmental monitoring program, in an efficient, cost effective manner. Fluor Daniel Hanford, Inc. and its major subcontractors also commit to conducting business in a manner consistent with the International Standards Organization 14000 Environmental Management System concepts. Because the integration of sitewide groundwater monitoring activities is managed by the Environmental Restoration Contractor, groundwater monitoring is outside the scope of this document. Therefore, for the purpose of this Integration Plan for Environmental Monitoring, the Integrated Environmental Monitoring Program is defined as applicable to all environmental media except groundwater. This document provides recommendations on future activities to better integrate the overall environmental monitoring program, with emphasis on the near-field program. In addition, included is the Fluor Daniel Hanford, Inc. team review of the environmental monitoring activities on the Hanford Site, with concurrence of Pacific Northwest National Laboratory and Bechtel Hanford, Inc. (The narrative provided later in the Discussion Section describes the review and consideration given to each topic.) This document was developed to meet the requirements of the Project Hanford Management Contract performance agreement (SM7.2) and the tenets of the U.S. Department of Energy's Effluent and Environmental Monitoring Planning Process. This Plan is prepared for the U.S.
Department of Energy, Richland Operations Office, Environmental Assurance, Permits, and Policy Division

  5. Design of low complexity sharp MDFT filter banks with perfect reconstruction using hybrid harmony-gravitational search algorithm

    Directory of Open Access Journals (Sweden)

    V. Sakthivel

    2015-12-01

The design of a low-complexity sharp-transition-width Modified Discrete Fourier Transform (MDFT) filter bank with perfect reconstruction (PR) is proposed in this work. Current trends in technology require high data rates and fast processing along with reduced power consumption, implementation complexity and chip area. Filters with sharp transition widths are required for various applications in wireless communication. The frequency response masking (FRM) technique is used to reduce the implementation complexity of sharp MDFT filter banks with PR. To further reduce the implementation complexity, the continuous coefficients of the filters in the MDFT filter bank are represented in discrete space using the canonic signed digit (CSD) representation. The multipliers in the filters are replaced by shifters and adders. The number of non-zero bits is reduced in the conversion process to minimize the number of adders and shifters required for the filter implementation. As a consequence, the performance of the MDFT filter bank with PR may degrade. In this work, the performance of the MDFT filter bank with PR is improved using a hybrid Harmony-Gravitational search algorithm.
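The CSD recoding at the heart of the multiplierless implementation can be sketched as follows (a standard textbook algorithm, not necessarily the authors' exact procedure). A fractional coefficient such as 7/16, quantized to 4 fractional bits, becomes the integer 7; its CSD form 8 - 1 replaces the multiply with one shift-and-subtract:

```python
def to_csd(n):
    """Canonic signed digit (CSD) digits of a positive integer, LSB first.
    Each digit is -1, 0 or +1, and no two adjacent digits are nonzero."""
    digits = []
    while n:
        if n % 2:
            d = 2 - (n % 4)        # +1 if n ≡ 1 (mod 4), -1 if n ≡ 3 (mod 4)
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

# Coefficient 0.4375 = 7/16 quantized to 4 fractional bits: integer 7
csd = to_csd(7)                          # 7 = 8 - 1, digits [-1, 0, 0, 1]
adders = sum(d != 0 for d in csd) - 1    # one adder/subtractor per extra term
```

Each nonzero CSD digit corresponds to one shifted copy of the input, so minimizing nonzero digits (the goal the hybrid search algorithm optimizes over) directly minimizes the adders and shifters in the filter.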

  6. THE MANAGEMENT METHODS IN PERFORMANCE SPORTS

    Directory of Open Access Journals (Sweden)

    Silvia GRĂDINARU

    2015-12-01

Sports are a widespread phenomenon, capable of raising human energies and mobilizing financial and material resources on a scale difficult to match in other areas of social life. The management of sports organizations is influenced and determined by the compliance requirements arising from documents issued by international organizations with authority in the field. Organizational development is considered essentially a strategy to increase organizational effectiveness by determining changes that consider both human resources and organizations. Within society as a whole, the sports industry is evolving at an accelerated pace, with distinctive features; its development is conditioned by macroeconomics and technology. The complexity of the activities of performance sports organizations, the main laboratories of national and international sports performance, requires a more thorough investigation to enable knowledge of the complex mechanisms of their management and, at the same time, to identify optimization solutions for economic, financial and human resources.

  7. Satellite Ocean Color Sensor Design Concepts and Performance Requirements

    Science.gov (United States)

    McClain, Charles R.; Meister, Gerhard; Monosmith, Bryan

    2014-01-01

In late 1978, the National Aeronautics and Space Administration (NASA) launched the Nimbus-7 satellite with the Coastal Zone Color Scanner (CZCS) and several other sensors, all of which provided major advances in Earth remote sensing. The inspiration for the CZCS is usually attributed to an article in Science by Clarke et al., who demonstrated that large changes in open ocean spectral reflectance are correlated to chlorophyll-a concentrations. Chlorophyll-a is the primary photosynthetic pigment in green plants (marine and terrestrial) and is used in estimating primary production, i.e., the amount of carbon fixed into organic matter during photosynthesis. Thus, accurate estimates of global and regional primary production are key to studies of the Earth's carbon cycle. Because the investigators used an airborne radiometer, they were able to demonstrate the increased radiance contribution of the atmosphere with altitude that would be a major issue for spaceborne measurements. Since 1978, there has been much progress in satellite ocean color remote sensing, such that the technique is well established and is used for climate change science and routine operational environmental monitoring. Also, the science objectives and accompanying methodologies have expanded and evolved through a succession of global missions, e.g., the Ocean Color and Temperature Sensor (OCTS), the Sea-viewing Wide Field-of-view Sensor (SeaWiFS), the Moderate Resolution Imaging Spectroradiometer (MODIS), the Medium Resolution Imaging Spectrometer (MERIS), and the Global Imager (GLI). With each advance in science objectives, new and more stringent requirements for sensor capabilities (e.g., spectral coverage) and performance (e.g., signal-to-noise ratio, SNR) are established. The CZCS had four bands for chlorophyll and aerosol corrections.
The Ocean Color Imager (OCI) recommended for the NASA Pre-Aerosol, Cloud, and Ocean Ecosystems (PACE) mission includes 5 nanometers hyperspectral coverage from 350 to

  8. Thermal performance envelopes for MHTGRs - Reliability by design

    International Nuclear Information System (INIS)

    Etzel, K.T.; Howard, W.W.; Zgliczynski, J.

    1992-01-01

    Thermal performance envelopes are used to specify steady-state design requirements for the systems of the modular high-temperature gas-cooled reactor (MHTGR) to maximize plant performance reliability with optimized design. The thermal performance envelopes are constructed around the expected operating point to account for uncertainties in actual plant as-built parameters and plant operation. The components are then designed to perform successfully at all points within the envelope. As a result, plant reliability is maximized by accounting for component thermal performance variation in the design. The design is optimized by providing a means to determine required margins in a disciplined and visible fashion. This is accomplished by coordinating these requirements with the various system and component designers in the early stages of the design, applying the principles of total quality management. The design is challenged by the more complex requirements associated with a range of operating conditions, but in return, high probability of delivering reliable performance throughout the plant life is ensured
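    The corner-checking idea behind a thermal performance envelope can be sketched in a few lines: enumerate every corner of the uncertainty box around the expected operating point and verify a component acceptance criterion at each one. A minimal illustration, assuming a toy single-equation outlet-temperature check and loosely MHTGR-like numbers; all parameter names, values, and the `design_ok` criterion are hypothetical:

```python
from itertools import product

def envelope_corners(nominal, tolerances):
    """Yield every corner of the rectangular performance envelope.

    nominal   : dict of parameter -> expected operating value
    tolerances: dict of parameter -> +/- uncertainty band
    """
    names = list(nominal)
    spans = [(nominal[n] - tolerances[n], nominal[n] + tolerances[n]) for n in names]
    for corner in product(*spans):
        yield dict(zip(names, corner))

def design_ok(point):
    # Hypothetical acceptance criterion: core outlet temperature stays below a
    # design limit, using a simple energy balance with helium cp ~ 5.19 kJ/(kg K).
    outlet_t = point["inlet_t_c"] + point["power_mw"] * 1000.0 / (point["flow_kg_s"] * 5.19)
    return outlet_t <= 750.0

nominal = {"inlet_t_c": 259.0, "power_mw": 350.0, "flow_kg_s": 157.0}
tols    = {"inlet_t_c": 5.0,   "power_mw": 7.0,   "flow_kg_s": 3.0}

# The design succeeds only if it performs at ALL points of the envelope.
ok = all(design_ok(p) for p in envelope_corners(nominal, tols))
```

With three uncertain parameters the envelope has 2^3 = 8 corners; a real envelope analysis would also sample interior points and handle correlated uncertainties.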

  9. The Influence of Business Environmental Dynamism, Complexity and Munificence on Performance of Small and Medium Enterprises in Kenya

    Directory of Open Access Journals (Sweden)

    Washington Oduor Okeyo

    2014-08-01

    The main purpose of this article is to examine how the business environment affects small and medium enterprises. The paper is motivated by the important contributions small and medium enterprises make in many countries, especially Kenya, towards job creation, poverty reduction and economic development. The literature, however, argues that the effectiveness of these contributions is conditioned by the state of business environmental factors such as politics, economy, socio-culture, technology, ecology and laws/regulations. Dynamism, complexity and munificence of these factors are therefore vital to the achievement of organizational objectives and overall performance. Even so, a review of the literature reveals contradictory views regarding the effect of these factors on the performance of organizations. Furthermore, studies focusing on these factors in the Kenyan context, particularly with regard to their effect on the performance of small and medium firms, are scarce. This article bridges this gap based on a study focusing on 800 manufacturing organizations in Nairobi, Kenya. A sample of 150 enterprises was selected through stratification by business sector followed by simple random sampling. The research design was a cross-sectional survey in which data was collected using a structured questionnaire over a period of one month, at the end of which 95 organizations responded, giving a response rate of 64%. Reliability and validity of the instrument were determined through Cronbach's alpha tests and expert reviews. The Statistical Package for the Social Sciences was used to determine normality through descriptive statistics, and study hypotheses were tested using inferential statistics. The study established that the business environment had an overall impact on organizational performance. Specifically, dynamism, complexity and munificence each had a direct influence on the enterprises in the study. Furthermore, the combined effect on performance was found to be greater than that of dynamism and

  10. Effects of Maize Source and Complex Enzymes on Performance and Nutrient Utilization of Broilers

    Directory of Open Access Journals (Sweden)

    Defu Tang

    2014-12-01

    The objective of this study was to investigate the effect of maize source and complex enzymes containing amylase, xylanase and protease on performance and nutrient utilization of broilers. The experiment was a 4×3 factorial design with diets containing maize samples from four sources (M1, M2, M3, and M4), without enzymes or with one of two kinds of complex enzyme, A (Axtra XAP) or B (Avizyme 1502). Nine hundred and sixty day-old Arbor Acres broiler chicks were used in the trial (12 treatments with 8 replicate pens of 10 chicks). Birds fed the M1 diet had better body weight gain (BWG) and a lower feed/gain ratio compared with those fed the M3 diet and M4 diet (p<0.05), respectively. The fresh feces output was significantly decreased by the addition of enzyme B (p<0.05). Maize source affects the nutrient digestibility and performance of broilers, and a combination of amylase, xylanase and protease is effective in improving the growth profiles of broilers fed maize-soybean-rapeseed-cotton mixed diets.

  11. Determining required valve performance for discrete control of PTO cylinders for wave energy

    DEFF Research Database (Denmark)

    Hansen, Rico Hjerm; Andersen, Torben Ole; Pedersen, Henrik C.

    2012-01-01

    investigates the required valve performance to achieve this energy-efficient operation, while meeting basic dynamic requirements. The components making up the total energy loss during shifting are identified by analytically expressing the losses from the governing differential equations. From the analysis...... a framework for evaluating the adequacy of a valve’s response is established, and the analysis shows that the results may be normalised for a wider range of systems. Finally, the framework is successfully applied to the Wavestar converter....

  12. A series of copper complexes with carbazole and oxadiazole moieties: Synthesis, characterization and luminescence performance

    Energy Technology Data Exchange (ETDEWEB)

    Bai Weiyang, E-mail: baiwy02@163.com [College of Chemistry and Chemical Engineering, Chongqing University of Technology, Chongqing 400054 (China); Sun Li [Graduate University of Chinese Academy of Sciences, Beijing 100049 (China)

    2012-10-15

    In this paper, various moieties of ethyl, carbazole and oxadiazole are attached to 2-thiazol-4-yl-1H-benzoimidazole to form a series of diamine ligands. Their corresponding Cu(I) complexes are also synthesized using bis(2-(diphenylphosphanyl)phenyl) ether as the auxiliary ligand. Crystal structures, thermal properties, electronic nature and luminescence properties of these Cu(I) complexes are discussed in detail. These Cu(I) complexes are found to be efficient green emitters in solution, and their emissive parameters are improved largely by the incorporation of substituent moieties. Detailed analysis suggests that the effective suppression of solvent-induced exciplex quenching is responsible for this phenomenon. On the other hand, the introduction of substituent moieties exerts no obvious influence on the molecular structure, thermal stability and emitting energy of the Cu(I) complexes, owing to their absence from the inner coordination sphere. - Highlights: • Diamine ligands with various moieties and their Cu(I) complexes are synthesized. • Crystal structures and photophysical properties are discussed in detail. • The incorporation of substituent moieties improves luminescence performance. • Solvent-induced exciplex quenching is suppressed by substituent moieties.

  13. Position requirements for space station personnel and linkages to portable microcomputer performance assessment

    Science.gov (United States)

    Jeanneret, P. R.

    1988-01-01

    The development and use of a menu of performance tests that can be self-administered on a portable microcomputer are investigated. In order to identify, develop, or otherwise select the relevant human capabilities/attributes to measure and hence include in the performance battery, it is essential that an analysis be conducted of the jobs or functions that will be performed throughout a space shuttle mission. The primary job analysis instrument, the Position Analysis Questionnaire (PAQ), is discussed in detail so the reader will have sufficient background for understanding the application of the instrument to the various work activities included within the scope of the study, and the derivation of the human requirements (abilities/attributes) from the PAQ analyses. The research methodology is described and includes the procedures used for gathering the PAQ data. The results are presented in detail with specific emphasis on identifying critical requirements that can be measured with a portable computerized assessment battery. A discussion of the results is given with implications for future research.

  14. Common display performance requirements for military and commercial aircraft product lines

    Science.gov (United States)

    Hoener, Steven J.; Behrens, Arthur J.; Flint, John R.; Jacobsen, Alan R.

    2001-09-01

    Obtaining high quality Active Matrix Liquid Crystal Display (AMLCD) glass to meet the needs of the commercial and military aerospace business is a major challenge, at best. With the demise of all domestic sources of AMLCD substrate glass, the industry is now focused on overseas sources, which are primarily producing glass for consumer electronics. Previous experience with ruggedizing commercial glass leads to the expectation that the aerospace industry can leverage off the commercial market. The problem remains that, while the commercial industry is continually changing and improving its products, the commercial and military aerospace industries require stable and affordable supplies of AMLCD glass for upwards of 20 years to support production and maintenance operations. The Boeing Engineering and Supplier Management Process Councils have chartered a group of display experts from multiple aircraft product divisions within the Boeing Company, the Displays Process Action Team (DPAT), to address this situation from an overall corporate perspective. The DPAT has formulated a set of Common Display Performance Requirements for use across the corporate line of commercial and military aircraft products. Though focused on the AMLCD problem, the proposed common requirements are largely independent of display technology. This paper describes the strategy being pursued within the Boeing Company to address the AMLCD supply problem and details the proposed implementation process, centered on common requirements for both commercial and military aircraft displays. Highlighted in this paper are proposed common, or standard, display sizes and the other major requirements established by the DPAT, along with the rationale for these requirements.

  15. Using Project Complexity Determinations to Establish Required Levels of Project Rigor

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Thomas D.

    2015-10-01

    This presentation discusses the project complexity determination process that was developed by National Security Technologies, LLC, for the U.S. Department of Energy, National Nuclear Security Administration Nevada Field Office for implementation at the Nevada National Security Site (NNSS). The complexity determination process was developed to address the diversity of NNSS project types, sizes, and complexity; to fill the need for a single procedure with provision for tailoring the level of rigor to the project type, size, and complexity; to provide consistent, repeatable, effective application of project management processes across the enterprise; and to achieve higher levels of efficiency in project delivery. These needs are illustrated by the wide diversity of NNSS projects: Defense Experimentation, Global Security, weapons tests, military training areas, sensor development and testing, training in realistic environments, intelligence community support, environmental restoration/waste management, and disposal of radioactive waste, among others.

  16. Virtual machine performance benchmarking.

    Science.gov (United States)

    Langer, Steve G; French, Todd

    2011-10-01

    The attractions of virtual computing are many: reduced costs, reduced resources and simplified maintenance. Any one of these would be compelling for a medical imaging professional attempting to support a complex practice on limited resources in an era of ever-tightening reimbursement. In particular, the ability to run multiple operating systems optimized for different tasks (computational image processing on Linux versus office tasks on Microsoft operating systems) on a single physical machine is compelling. However, there are also potential drawbacks. High performance requirements need to be carefully considered if they are to be executed in an environment where the running software has to execute through multiple layers of device drivers before reaching the real disk or network interface. Our lab has attempted to gain insight into the impact of virtualization on performance by benchmarking the following metrics on both physical and virtual platforms: local memory and disk bandwidth, network bandwidth, and integer and floating point performance. The virtual performance metrics are compared to baseline performance on "bare metal." The results are complex, and indeed somewhat surprising.
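    A stripped-down version of such a benchmark suite is easy to sketch: time a few kernels (integer arithmetic, floating point, memory touching) on both the physical host and the guest, then compare. This is an illustrative stdlib-only sketch, not the authors' actual benchmark; kernel sizes and repeat counts are arbitrary:

```python
import time

def timeit_once(fn):
    """Wall-clock time of a single run of one benchmark kernel."""
    t0 = time.perf_counter()
    fn()
    return time.perf_counter() - t0

def bench(fn, repeats=3):
    """Best-of-N timing to damp out scheduler noise."""
    return min(timeit_once(fn) for _ in range(repeats))

def int_kernel(n=200_000):
    # Integer performance: a tight arithmetic loop.
    acc = 0
    for i in range(n):
        acc += i * 3 // 2
    return acc

def float_kernel(n=200_000):
    # Floating point performance: harmonic-sum loop.
    acc = 0.0
    for i in range(1, n):
        acc += 1.0 / i
    return acc

def mem_kernel(mb=16):
    # Local memory bandwidth proxy: allocate a buffer and stride through pages.
    buf = bytearray(mb * 1024 * 1024)
    view = memoryview(buf)
    return bytes(view[::4096])

# Run the same dict of kernels on bare metal and inside the VM, then compare.
results = {name: bench(fn) for name, fn in
           [("int", int_kernel), ("float", float_kernel), ("mem", mem_kernel)]}
```

Comparing the `results` dict from a physical run against a virtual run gives a per-metric virtualization overhead ratio.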

  17. The association of students requiring remediation in the internal medicine clerkship with poor performance during internship.

    Science.gov (United States)

    Hemann, Brian A; Durning, Steven J; Kelly, William F; Dong, Ting; Pangaro, Louis N; Hemmer, Paul A

    2015-04-01

    To determine whether the Uniformed Services University (USU) system of workplace performance assessment for students in the internal medicine clerkship at the USU continues to be a sensitive predictor of subsequent poor performance during internship, when compared with assessments in other USU third year clerkships. Utilizing Program Director survey results from 2007 through 2011 and U.S. Medical Licensing Examination (USMLE) Step 3 examination results as the outcomes of interest, we compared performance during internship for students who had less than passing performance in the internal medicine clerkship and required remediation, against students whose performance in the internal medicine clerkship was successful. We further analyzed internship ratings for students who received less than passing grades during the same time period on other third year clerkships such as general surgery, pediatrics, obstetrics and gynecology, family medicine, and psychiatry to evaluate whether poor performance on other individual clerkships were associated with future poor performance at the internship level. Results for this recent cohort of graduates were compared with previously published findings. The overall survey response rate for this 5 year cohort was 81% (689/853). Students who received a less than passing grade in the internal medicine clerkship and required further remediation were 4.5 times more likely to be given poor ratings in the domain of medical expertise and 18.7 times more likely to demonstrate poor professionalism during internship. Further, students requiring internal medicine remediation were 8.5 times more likely to fail USMLE Step 3. No other individual clerkship showed any statistically significant associations with performance at the intern level. On the other hand, 40% of students who successfully remediated and did graduate were not identified during internship as having poor performance. 
Unsuccessful clinical performance which requires remediation in
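    The "times more likely" figures above are odds ratios, which come straight from a 2x2 remediation-by-outcome table. A sketch of the computation; the counts below are invented for illustration, since the abstract does not report the raw tables:

```python
def odds_ratio(exposed_events, exposed_total, control_events, control_total):
    """Odds ratio from a 2x2 table: odds of the outcome in the exposed
    group divided by the odds in the control group."""
    a = exposed_events                    # exposed, outcome present
    b = exposed_total - exposed_events    # exposed, outcome absent
    c = control_events                    # control, outcome present
    d = control_total - control_events    # control, outcome absent
    return (a / b) / (c / d)

# Hypothetical counts for illustration: 9 of 20 remediated students versus
# 50 of 500 non-remediated students rated poorly during internship.
or_poor = odds_ratio(9, 20, 50, 500)
```

An odds ratio above 1 indicates the outcome is more likely in the remediated group; confidence intervals (not shown) are needed before claiming significance.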

  18. Finding optimal interaction interface alignments between biological complexes

    KAUST Repository

    Cui, Xuefeng

    2015-06-13

    Motivation: Biological molecules perform their functions through interactions with other molecules. Structure alignment of interaction interfaces between biological complexes is an indispensable step in detecting their structural similarities, which are keys to understanding their evolutionary histories and functions. Although various structure alignment methods have been developed to successfully assess the similarities of protein structures or certain types of interaction interfaces, existing alignment tools cannot directly align arbitrary types of interfaces formed by protein, DNA or RNA molecules. Specifically, they require a 'blackbox preprocessing' to standardize interface types and chain identifiers. Yet their performance is limited and sometimes unsatisfactory. Results: Here we introduce a novel method, PROSTA-inter, that automatically determines and aligns interaction interfaces between two arbitrary types of complex structures. Our method uses sequentially remote fragments to search for the optimal superimposition. The optimal residue matching problem is then formulated as a maximum weighted bipartite matching problem to detect the optimal sequence order-independent alignment. Benchmark evaluation on all non-redundant protein-DNA complexes in PDB shows significant performance improvement of our method over TM-align and iAlign (with the 'blackbox preprocessing'). Two case studies where our method discovers, for the first time, structural similarities between two pairs of functionally related protein-DNA complexes are presented. We further demonstrate the power of our method on detecting structural similarities between a protein-protein complex and a protein-RNA complex, which is biologically known as a protein-RNA mimicry case. © The Author 2015. Published by Oxford University Press.
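    The residue-matching step described above, maximum weighted bipartite matching over a residue-pair score matrix, can be illustrated on a toy input. The sketch below brute-forces the assignment over all permutations, which is only feasible for tiny cases (a production tool would use an efficient algorithm such as the Hungarian method); the score matrix here is invented:

```python
from itertools import permutations

def max_weight_matching(weights):
    """Brute-force maximum weighted bipartite matching for small inputs.

    weights[i][j] = similarity score between residue i of interface A
    and residue j of interface B (equal-size toy case).
    """
    n = len(weights)
    best_score, best_perm = float("-inf"), None
    for perm in permutations(range(n)):
        score = sum(weights[i][perm[i]] for i in range(n))
        if score > best_score:
            best_score, best_perm = score, perm
    return best_score, best_perm

# Toy 3x3 score matrix (hypothetical residue-pair similarities).
w = [[0.9, 0.1, 0.2],
     [0.3, 0.8, 0.1],
     [0.2, 0.4, 0.7]]
score, perm = max_weight_matching(w)
```

Because the matching maximizes total weight rather than following chain order, the resulting alignment is sequence order-independent, as the abstract describes.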

  19. Partial QoS-Aware Opportunistic Relay Selection Over Two-Hop Channels: End-to-End Performance Under Spectrum-Sharing Requirements

    KAUST Repository

    Yuli Yang,

    2014-10-01

    In this paper, we propose a partial quality-of-service (QoS)-oriented relay selection scheme with a decode-and-forward (DF) relaying protocol, to reduce the feedback amount required for relay selection. In the proposed scheme, the activated relay is the one with the maximum signal-to-noise power ratio (SNR) in the second hop among those whose packet loss rates (PLRs) in the first hop achieve a predetermined QoS level. For the purpose of evaluating the performance of the proposed scheme, we exploit it with transmission constraints imposed on the transmit power budget and interference to other users. By analyzing the statistics of received SNRs in the first and second hops, we obtain the end-to-end PLR of this scheme in closed form under the considered scenario. Moreover, to compare the proposed scheme with popular relay selection schemes, we also derive the closed-form PLR expressions for partial relay selection (PRS) and opportunistic relay selection (ORS) criteria in the same scenario under study. Illustrative numerical results demonstrate the accuracy of our derivations and substantiate that the proposed relay selection scheme is a promising alternative with respect to the tradeoff between performance and complexity.
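    The selection rule itself is compact: keep only the relays whose first-hop PLR meets the predetermined QoS level, then activate the one with the maximum second-hop SNR. A minimal sketch; relay names, PLR values, and SNR values are hypothetical:

```python
def select_relay(relays, plr_threshold):
    """Partial QoS-aware selection: among relays whose first-hop packet
    loss rate meets the QoS target, pick the best second-hop SNR.

    relays: list of (name, first_hop_plr, second_hop_snr_db)
    """
    qualified = [r for r in relays if r[1] <= plr_threshold]
    if not qualified:
        return None  # no relay meets the QoS level
    return max(qualified, key=lambda r: r[2])

# R2 has the best second-hop SNR but fails the first-hop PLR check.
relays = [("R1", 0.02, 11.0), ("R2", 0.08, 17.5), ("R3", 0.01, 14.2)]
chosen = select_relay(relays, plr_threshold=0.05)
```

Note the feedback saving the paper targets: only second-hop SNRs of the QoS-qualified subset need to be reported, not full channel state for every relay.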

  20. A (Small) complexity performance contest: SPT versus LBFS

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2002-01-01

    When discussing the nature of loading rules relevant to continuous dynamic job/flow shop systems, the general understanding is that results obtained in simpler structures of 2 to 4 machines are invariant to scaling and can be generalised without problems to more complex structures. The SPT...... results on pure re-entrant flow shop structures emerges. It now seems that alternative loading rules as for instance the LBFS (Last Buffer First Served) due to its strong long run stabilising property attracts quite some interest. To be more precise about the complexity aspect, complexity in job...... are not entirely only of theoretical interest, as well as results from a standard serial job/flow shop set-up, but with resource limitations that prevent the independent operations of the individual stations in the system....


  2. A Low-Complexity and High-Performance 2D Look-Up Table for LDPC Hardware Implementation

    Science.gov (United States)

    Chen, Jung-Chieh; Yang, Po-Hui; Lain, Jenn-Kaie; Chung, Tzu-Wen

    In this paper, we propose a low-complexity, high-efficiency two-dimensional look-up table (2D LUT) for carrying out the sum-product algorithm in the decoding of low-density parity-check (LDPC) codes. Instead of employing adders for the core operation when updating check node messages, in the proposed scheme, the main term and correction factor of the core operation are successfully merged into a compact 2D LUT. Simulation results indicate that the proposed 2D LUT not only attains close-to-optimal bit error rate performance but also enjoys a low complexity advantage that is suitable for hardware implementation.
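    The core operation being tabulated is the standard check-node "boxplus" of two log-likelihood ratios: a min-sum main term plus a correction factor built from two exponential terms. A software sketch of merging both into one quantized 2D table, assuming uniform quantization and an illustrative LLR range (the paper's actual quantization scheme may differ):

```python
import math

def boxplus_exact(a, b):
    """Exact check-node core operation in Jacobian form:
    min-sum main term plus correction factor."""
    main = math.copysign(1.0, a) * math.copysign(1.0, b) * min(abs(a), abs(b))
    corr = math.log1p(math.exp(-abs(a + b))) - math.log1p(math.exp(-abs(a - b)))
    return main + corr

def build_lut(max_llr=8.0, steps=64):
    """Precompute a steps x steps table merging main term and correction."""
    q = [-max_llr + (2 * max_llr) * i / (steps - 1) for i in range(steps)]
    table = [[boxplus_exact(a, b) for b in q] for a in q]
    return q, table

def lut_lookup(q, table, a, b):
    # Clamp to the quantized range and round to the nearest grid point.
    def idx(x):
        lo, hi = q[0], q[-1]
        x = min(max(x, lo), hi)
        return round((x - lo) / (hi - lo) * (len(q) - 1))
    return table[idx(a)][idx(b)]

q, table = build_lut()
approx = lut_lookup(q, table, 2.3, -1.1)
exact = boxplus_exact(2.3, -1.1)
```

The table trades a small quantization error (bounded by the grid step) for removing the adders from the check-node update datapath, which is the hardware saving the paper targets.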

  3. Methodological approach to organizational performance improvement process

    Directory of Open Access Journals (Sweden)

    Marin Buble

    2001-01-01

    Organizational performance improvement is one of the fundamental enterprise tasks. This especially applies to the case when the term “performance improvement” implies efficiency improvement measured by indicators, such as ROI, ROE, ROA, or ROVA/ROI. Such tasks are very complex, requiring implementation by means of project management. In this paper, the authors propose a methodological approach to improving the organizational performance of a large enterprise.
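    The efficiency indicators mentioned have standard textbook definitions (a net return divided by the relevant capital base), which makes a before/after project comparison straightforward to express. A sketch with hypothetical figures:

```python
def roi(net_profit, investment):
    """Return on investment: profit generated per unit of capital invested."""
    return net_profit / investment

def roe(net_income, shareholders_equity):
    """Return on equity: income per unit of shareholders' equity."""
    return net_income / shareholders_equity

def roa(net_income, total_assets):
    """Return on assets: income per unit of total assets."""
    return net_income / total_assets

# Illustrative (hypothetical) figures before and after an improvement project.
before = roi(net_profit=1.2e6, investment=10e6)   # 12% ROI
after  = roi(net_profit=1.5e6, investment=10e6)   # 15% ROI
improvement = after - before                      # gain in ROI
```

In practice such indicators would be tracked per project milestone so the improvement program's contribution can be isolated from other effects.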

  4. Combined effect of using near-infrared spectroscopy for nutritional evaluation of feed ingredients and non-starch polysaccharide carbohydrase complex on performance of broiler chickens.

    Science.gov (United States)

    Montanhini Neto, Roberto; N'Guetta, Eric; Gady, Cecile; Francesch, Maria; Preynat, Aurélie

    2017-12-01

    This study was carried out to evaluate the combined effect of using near-infrared spectroscopy (NIRS) for nutritional evaluation of feed ingredients and the addition of non-starch polysaccharide carbohydrase complex (NSP enzymes) on the growth performance of broilers fed diets produced with low-quality wheat and soybean meal. A 2 × 2 trial design was performed, with seven replicates of 40 male Ross 308 broilers per treatment, evaluating the effect of the addition of NSP enzymes and the ingredients' nutritional matrix based on table values or NIRS values. Diets without added enzymes were formulated to reach nutritional requirements, whereas diets with enzymes were reformulated, reducing the apparent metabolizable energy (AME) by 85 kcal/kg. In the overall period (days 0-35), broilers fed diets formulated using NIRS values had higher (P < 0.05) growth performance. Both nutritional approaches are efficient in improving broiler performance on their own, and even more so when they are combined. © 2017 Japanese Society of Animal Science.

  5. Leading healthcare in complexity.

    Science.gov (United States)

    Cohn, Jeffrey

    2014-12-01

    Healthcare institutions and providers are in complexity. Networks of interconnections from relationships and technology create conditions in which interdependencies and non-linear dynamics lead to surprising, unpredictable outcomes. Previous effective approaches to leadership, focusing on top-down bureaucratic methods, are no longer effective. Leading in complexity requires leaders to accept the complexity, create an adaptive space in which innovation and creativity can flourish and then integrate the successful practices that emerge into the formal organizational structure. Several methods for doing adaptive space work will be discussed. Readers will be able to contrast traditional leadership approaches with leading in complexity. They will learn new behaviours that are required of complexity leaders, along with challenges they will face, often from other leaders within the organization.

  6. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were

  7. 77 FR 11995 - Passenger Vessel Operator Financial Responsibility Requirements for Non-Performance of...

    Science.gov (United States)

    2012-02-28

    ... Vessel Operator Financial Responsibility Requirements for Non-Performance of Transportation AGENCY..., 2011, the Commission issued its Notice of Proposed Rulemaking (NPRM) to update its financial... cost of financial responsibility coverage because of the use of alternative coverage options. However...

  8. Evidence that Mediator is essential for Pol II transcription, but is not a required component of the preinitiation complex in vivo.

    Science.gov (United States)

    Petrenko, Natalia; Jin, Yi; Wong, Koon Ho; Struhl, Kevin

    2017-07-12

    The Mediator complex has been described as a general transcription factor, but it is unclear if it is essential for Pol II transcription and/or is a required component of the preinitiation complex (PIC) in vivo. Here, we show that depletion of individual subunits, even those essential for cell growth, causes a general but only modest decrease in transcription. In contrast, simultaneous depletion of all Mediator modules causes a drastic decrease in transcription. Depletion of head or middle subunits, but not tail subunits, causes a downstream shift in the Pol II occupancy profile, suggesting that Mediator at the core promoter inhibits promoter escape. Interestingly, a functional PIC and Pol II transcription can occur when Mediator is not detected at core promoters. These results provide strong evidence that Mediator is essential for Pol II transcription and stimulates PIC formation, but it is not a required component of the PIC in vivo.

  9. Toward High Performance in Industrial Refrigeration Systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, Roozbeh; Niemann, H.

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  10. Towards high performance in industrial refrigeration systems

    DEFF Research Database (Denmark)

    Thybo, C.; Izadi-Zamanabadi, R.; Niemann, Hans Henrik

    2002-01-01

    Achieving high performance in complex industrial systems requires information manipulation at different system levels. The paper shows how different models of same subsystems, but using different quality of information/data, are used for fault diagnosis as well as robust control design...

  11. Multi-criteria decision making under uncertainty in building performance assessment

    NARCIS (Netherlands)

    Hopfe, C.J.; Augenbroe, G.; Hensen, J.L.M.

    2013-01-01

    Building performance assessment is complex, as it has to respond to multiple criteria. Objectives originating from the demands that are put on energy consumption, acoustical performance, thermal occupant comfort, indoor air quality and many other issues must all be reconciled. An assessment requires

  12. The Impact of Environmental Complexity and Team Training on Team Processes and Performance in Multi-Team Environments

    National Research Council Canada - National Science Library

    Cobb, Marshall

    1999-01-01

    This study examined how manipulating the level of environmental complexity and the type of team training given to subject volunteers impacted important team process behaviors and performance outcomes...

  13. Profiles of Motor Laterality in Young Athletes' Performance of Complex Movements: Merging the MOTORLAT and PATHoops Tools

    Science.gov (United States)

    Castañer, Marta; Andueza, Juan; Hileno, Raúl; Puigarnau, Silvia; Prat, Queralt; Camerino, Oleguer

    2018-01-01

    Laterality is a key aspect of the analysis of basic and specific motor skills. It is relevant to sports because it involves motor laterality profiles beyond left-right preference and spatial orientation of the body. The aim of this study was to obtain the laterality profiles of young athletes, taking into account the synergies between the support and precision functions of limbs and body parts in the performance of complex motor skills. We applied two instruments: (a) MOTORLAT, a motor laterality inventory comprising 30 items of basic, specific, and combined motor skills, and (b) the Precision and Agility Tapping over Hoops (PATHoops) task, in which participants had to perform a path by stepping in each of 14 hoops arranged on the floor, allowing the observation of their feet, left-right preference and spatial orientation. A total of 96 young athletes performed the PATHoops task and the 30 MOTORLAT items, allowing us to obtain data about limb dominance and spatial orientation of the body in the performance of complex motor skills. Laterality profiles were obtained by means of a cluster analysis and a correlational analysis and a contingency analysis were applied between the motor skills and spatial orientation actions performed. The results obtained using MOTORLAT show that the combined motor skills criterion (for example, turning while jumping) differentiates athletes' uses of laterality, showing a clear tendency toward mixed laterality profiles in the performance of complex movements. In the PATHoops task, the best spatial orientation strategy was “same way” (same foot and spatial wing) followed by “opposite way” (opposite foot and spatial wing), in keeping with the research assumption that actions unfolding in a horizontal direction in front of an observer's eyes are common in a variety of sports. PMID:29930527

  14. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing solutions of complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting of high-frequency data. We intend to use a CEP platform for processing of complex time series and to integrate a solution for a newly proposed method of decision making. The decision making process will be described by a formal grammar. As there are many CEP solutions, we will take the following characteristics into consideration: processing in real time, the possibility of processing high-volume data from multiple sources, platform independence, a platform allowing integration with a user solution, and an open license. At first we will talk about existing CEP tools and their specific ways of use in practice. Then we will mention the design of a method for the formalization of business rules used for decision making. Afterwards, we focus on the two platforms which seem to be the best fit for integration of our solution and list the main pros and cons of each approach. The next part is devoted to benchmarking platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated method for decision support.

  15. Commercially-driven human interplanetary propulsion systems: Rationale, concept, technology, and performance requirements

    International Nuclear Information System (INIS)

    Williams, C.H.; Borowski, S.K.

    1996-01-01

    Previous studies of human interplanetary missions are largely characterized by long trip times, limited performance capabilities, and enormous costs. Until these missions become dramatically more "commercial-friendly", their funding source and rationale will be restricted to national governments and their political/scientific interests respectively. A rationale is discussed for human interplanetary space exploration predicated on the private sector. Space propulsion system requirements are identified for interplanetary transfer times of no more than a few weeks/months to and between the major outer planets. Nuclear fusion is identified as the minimum requisite space propulsion technology. A conceptual design is described and evolutionary catalyzed-DD to D-3He fuel cycles are proposed. Magnetic nozzles for direct thrust generation and quantifying the operational aspects of the energy exchange mechanisms between high energy reaction products and neutral propellants are identified as two of the many key supporting technologies essential to satisfying system performance requirements. Government support of focused, breakthrough technologies is recommended at funding levels appropriate to other ongoing federal research. copyright 1996 American Institute of Physics

  16. Efficient nuclear export of p65-IkappaBalpha complexes requires 14-3-3 proteins.

    Science.gov (United States)

    Aguilera, Cristina; Fernández-Majada, Vanessa; Inglés-Esteve, Julia; Rodilla, Verónica; Bigas, Anna; Espinosa, Lluís

    2006-09-01

    IkappaB are responsible for maintaining p65 in the cytoplasm under non-stimulating conditions and promoting the active export of p65 from the nucleus following NFkappaB activation to terminate the signal. We now show that 14-3-3 proteins regulate the NFkappaB signaling pathway by physically interacting with p65 and IkappaBalpha proteins. We identify two functional 14-3-3 binding domains in the p65 protein involving residues 38-44 and 278-283, and map the interaction region of IkappaBalpha in residues 60-65. Mutation of these 14-3-3 binding domains in p65 or IkappaBalpha results in a predominantly nuclear distribution of both proteins. TNFalpha treatment promotes recruitment of 14-3-3 and IkappaBalpha to NFkappaB-dependent promoters and enhances the binding of 14-3-3 to p65. Disrupting 14-3-3 activity by transfection with a dominant-negative 14-3-3 leads to the accumulation of nuclear p65-IkappaBalpha complexes and the constitutive association of p65 with the chromatin. In this situation, NFkappaB-dependent genes become unresponsive to TNFalpha stimulation. Together our results indicate that 14-3-3 proteins facilitate the nuclear export of IkappaBalpha-p65 complexes and are required for the appropriate regulation of NFkappaB signaling.

  17. 40 CFR 80.815 - What are the gasoline toxics performance requirements for refiners and importers?

    Science.gov (United States)

    2010-07-01

    ... toxics requirements of this subpart apply separately for each of the following types of gasoline produced...) The gasoline toxics performance requirements of this subpart apply to gasoline produced at a refinery... not apply to gasoline produced by a refinery approved under § 80.1334, pursuant to § 80.1334(c). (2...

  18. Three propositions on why characteristics of performance management systems converge across policy areas with different levels of task complexity

    DEFF Research Database (Denmark)

    Bjørnholt, Bente; Lindholst, Andrej Christian; Agger Nielsen, Jeppe

    2014-01-01

    This article investigates the differences and similarities between performance management systems across public services. We offer three propositions as to why the characteristics of performance management systems may still converge across policy areas in the public sector with different levels of task complexity amidst a lack of formal and overarching, government-wide policies. We advance our propositions from a case study comparing the characteristics of performance management systems across social services (eldercare) and technical services (park services) in Denmark. Contrary to expectations for divergence due to differences in task complexity, the characteristics of performance management systems in the two policy areas are observed to converge. On the basis of the case study, we propose that convergence has occurred due to 1) similarities in policy-specific reforms, 2) institutional pressures, and 3...

  19. Product Complexity Impact on Quality and Delivery Performance

    OpenAIRE

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2011-01-01

    Existing literature on product portfolio complexity is mainly focused on cost-related aspects. It is widely acknowledged that an increase in a company’s product portfolio will lead to an increase in complexity-related costs such as order management, procurement and inventory. The objective of this article is to examine which other factors might be affected when a company expands its product portfolio, if initiatives are not taken to accommodate this increase. Empirical work carried ...

  20. Conflict Resolution for Product Performance Requirements Based on Propagation Analysis in the Extension Theory

    Directory of Open Access Journals (Sweden)

    Yanwei Zhao

    2014-01-01

    Full Text Available Traditional product data mining methods are mainly focused on static data. Performance requirements are generally met by finding suitable cases and changing their structures. However, when one requirement is satisfied by the changed structures, the effects on the other requirements are not taken into account by analyzing the correlations; that is, design conflicts are not identified and resolved. An approach to resolving such conflict problems is proposed based on propagation analysis in Extension Theory. Firstly, the extension distance is improved to better evaluate the similarity among cases, and a case retrieval method is developed. Secondly, the transformations that can be made on selected cases are formulated by understanding the nature of the conflicts between the different performance requirements, which leads to an extension transformation strategy for coordinating conflicts using propagation analysis. Thirdly, the effects and levels of propagation are determined by analyzing the performance values before and after the transformations; thus the coexisting-conflict coordination strategy for multiple performances is developed. The method has been implemented in a working prototype system for supporting decision-making, and it has been demonstrated to be feasible and effective by resolving the conflicts of noise, exhaust, weight and intake pressure in screw air compressor performance design.
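    As a sketch of the classical extension distance that the paper sets out to improve (the case names, feature intervals, and query values below are invented), case retrieval can rank stored cases by how far the query's performance values fall from each case's feature intervals; the distance is negative inside an interval and positive outside:

```python
def extension_distance(x, a, b):
    """Classical extension (dependent-function) distance of point x from the
    interval [a, b]: negative when x lies inside, positive when outside."""
    return abs(x - (a + b) / 2.0) - (b - a) / 2.0

def retrieve(cases, query):
    """Return the stored case with the smallest summed extension distance
    between the query values and the case's feature intervals (toy data)."""
    def score(case):
        return sum(extension_distance(q, lo, hi)
                   for q, (lo, hi) in zip(query, case["intervals"]))
    return min(cases, key=score)

cases = [
    {"name": "case-A", "intervals": [(60, 80), (1.0, 2.0)]},
    {"name": "case-B", "intervals": [(90, 120), (2.5, 4.0)]},
]
print(retrieve(cases, query=(70, 1.5))["name"])  # → case-A
```

The paper's improved distance and transformation strategies go further, but this is the baseline similarity measure extension-based retrieval starts from.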

  1. Monitoring of performance management using Quality Assurance Indicators and ISO requirement

    Directory of Open Access Journals (Sweden)

    Dargahi H

    2007-06-01

    Full Text Available Background: Quality assurance is a prevention-oriented system that can be used to improve the quality of care, increase productivity and monitor the performance management in clinical laboratories. ISO 9001: 2000 requirements are a collection of management and technical systems designed to implement quality assurance and monitor performance management in organizations. Methods: A checklist was prepared to monitor the preanalytical, analytical and postanalytical stages of laboratory performance management in 16 areas and all laboratory activities in 14 of the clinical laboratories of the Tehran University of Medical Sciences (TUMS hospitals. Collected data were stored and statistically analyzed using SPSS software. Results: The best performance, in which 77.73% of quality assurance indicators were observed, was found in Sina Hospital. However, only 57.56% of these indicators were fulfilled at Farabi Hospital, with the lowest-level performance among the clinical laboratories of TUMS hospitals. The highest level of compliance with quality assurance indicators was in the hematology departments and for facility demands in management areas. Overall, quality assurance indicators were appropriately followed in only 7% of the clinical laboratories. Conclusion: The average quality assurance observation rate in the clinical laboratories studied was 67.22%, which is insufficient and must be remedied with stricter enforcement of the ISO 9001: 2000 regulations.

  2. Complex Correspondence Principle

    International Nuclear Information System (INIS)

    Bender, Carl M.; Meisinger, Peter N.; Hook, Daniel W.; Wang Qinghai

    2010-01-01

    Quantum mechanics and classical mechanics are distinctly different theories, but the correspondence principle states that quantum particles behave classically in the limit of high quantum number. In recent years much research has been done on extending both quantum and classical mechanics into the complex domain. These complex extensions continue to exhibit a correspondence, and this correspondence becomes more pronounced in the complex domain. The association between complex quantum mechanics and complex classical mechanics is subtle and demonstrating this relationship requires the use of asymptotics beyond all orders.

  3. Effects of Enzyme Complex Supplementation to a Paddy-based Diet on Performance and Nutrient Digestibility of Meat-type Ducks

    Directory of Open Access Journals (Sweden)

    P. Kang

    2013-02-01

    Full Text Available Paddy rice is rarely used as a feed because of its high fiber content. In this study, two experiments were conducted to study the effects of supplementing an enzyme complex consisting of xylanase, beta-glucanase and cellulase to paddy-based diets on the performance and nutrient digestibility of meat-type ducks. In both experiments, meat-type ducks (Cherry Valley) were randomly assigned to four treatments. Treatment 1 was a corn-soybean basal diet; treatment 2 was a corn-paddy-soybean basal diet; treatment 3 had the enzyme complex added to the corn-paddy-soybean basal diet at 0.5 g/kg of diet; and treatment 4 had the enzyme complex added to the corn-paddy-soybean diet at 1.0 g/kg of diet. The results showed that the enzyme complex increased the ADG and decreased the ADFI and F/G significantly (p<0.05). The outcome of this research indicates that the application of an enzyme complex made up of xylanase, beta-glucanase, and cellulase in a corn-paddy-soybean diet can improve performance and nutrient digestibility in meat-type ducks.

  4. Rickettsia parkeri invasion of diverse host cells involves an Arp2/3 complex, WAVE complex and Rho-family GTPase-dependent pathway.

    Science.gov (United States)

    Reed, Shawna C O; Serio, Alisa W; Welch, Matthew D

    2012-04-01

    Rickettsiae are obligate intracellular pathogens that are transmitted to humans by arthropod vectors and cause diseases such as spotted fever and typhus. Although rickettsiae require the host cell actin cytoskeleton for invasion, the cytoskeletal proteins that mediate this process have not been completely described. To identify the host factors important during cell invasion by Rickettsia parkeri, a member of the spotted fever group (SFG), we performed an RNAi screen targeting 105 proteins in Drosophila melanogaster S2R+ cells. The screen identified 21 core proteins important for invasion, including the GTPases Rac1 and Rac2, the WAVE nucleation-promoting factor complex and the Arp2/3 complex. In mammalian cells, including endothelial cells, the natural targets of R. parkeri, the Arp2/3 complex was also crucial for invasion, while requirements for WAVE2 as well as Rho GTPases depended on the particular cell type. We propose that R. parkeri invades S2R+ arthropod cells through a primary pathway leading to actin nucleation, whereas invasion of mammalian endothelial cells occurs via redundant pathways that converge on the host Arp2/3 complex. Our results reveal a key role for the WAVE and Arp2/3 complexes, as well as a higher degree of variation than previously appreciated in actin nucleation pathways activated during Rickettsia invasion. © 2011 Blackwell Publishing Ltd.

  5. Shifting effects in randomised controlled trials of complex interventions: a new kind of performance bias?

    Science.gov (United States)

    Gold, C; Erkkilä, J; Crawford, M J

    2012-11-01

    Randomised controlled trials (RCTs) aim to provide unbiased estimates of treatment effects. However, the process of implementing trial procedures may have an impact on the performance of complex interventions that rely strongly on the intuition and confidence of therapists. We aimed to examine whether shifting effects over the recruitment period can be observed that might indicate such impact. Three RCTs investigating music therapy vs. standard care were included. The intervention was performed by experienced therapists and based on established methods. We examined outcomes of participants graphically, analysed cumulative effects and tested for differences between first vs. later participants. We tested for potential confounding population shifts through multiple regression models. Cumulative differences suggested trends over the recruitment period. Effect sizes tended to be less favourable among the first participants than later participants. In one study, effects even changed direction. Age, gender and baseline severity did not account for these shifting effects. Some trials of complex interventions have shifting effects over the recruitment period that cannot be explained by therapist experience or shifting demographics. Replication and further research should aim to find out which interventions and trial designs are most vulnerable to this new kind of performance bias. © 2012 John Wiley & Sons A/S.

  6. Functional High Performance Financial IT

    DEFF Research Database (Denmark)

    Berthold, Jost; Filinski, Andrzej; Henglein, Fritz

    2011-01-01

    The world of finance faces the computational performance challenge of massively expanding data volumes, extreme response time requirements, and compute-intensive complex (risk) analyses. Simultaneously, new international regulatory rules require considerably more transparency and external auditability of financial institutions, including their software systems. To top it off, increased product variety and customisation necessitates shorter software development cycles and higher development productivity. In this paper, we report on HIPERFIT, a recently established strategic research center at the University of Copenhagen that attacks this triple challenge of increased performance, transparency and productivity in the financial sector by a novel integration of financial mathematics, domain-specific language technology, parallel functional programming, and emerging massively parallel hardware.

  7. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan; Abrar, Shafayat

    2017-01-01

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.
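    The core trick the abstract describes, computing the gain off-line so the on-line loop needs no matrix inversion, can be illustrated with a one-dimensional fixed-gain (alpha-beta) tracker. The gains and measurement sequence below are invented for illustration and are not the paper's filter bank:

```python
def fixed_gain_track(measurements, dt=1.0, alpha=0.85, beta=0.3):
    """Track position from noisy position measurements with precomputed
    gains: predict with constant velocity, then correct with fixed alpha
    (position) and beta (velocity) weights -- no on-line covariance update
    or matrix inversion, which is the source of the complexity saving."""
    pos, vel = measurements[0], 0.0
    estimates = []
    for z in measurements[1:]:
        pos_pred = pos + vel * dt          # predict
        r = z - pos_pred                   # innovation
        pos = pos_pred + alpha * r         # fixed-gain position correction
        vel = vel + (beta / dt) * r        # fixed-gain velocity correction
        estimates.append(pos)
    return estimates

truth = [float(i) for i in range(10)]      # constant-velocity target
est = fixed_gain_track(truth)
print(est[-1])  # converges toward the true final position of 9.0
```

An IMM variant would run a bank of such filters, one per mobility mode, mixing only the mean estimates as the abstract describes.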

  8. A low-complexity interacting multiple model filter for maneuvering target tracking

    KAUST Repository

    Khalid, Syed Safwan

    2017-01-22

    In this work, we address the target tracking problem for a coordinate-decoupled Markovian jump-mean-acceleration based maneuvering mobility model. A novel low-complexity alternative to the conventional interacting multiple model (IMM) filter is proposed for this class of mobility models. The proposed tracking algorithm utilizes a bank of interacting filters where the interactions are limited to the mixing of the mean estimates, and it exploits a fixed off-line computed Kalman gain matrix for the entire filter bank. Consequently, the proposed filter does not require matrix inversions during on-line operation which significantly reduces its complexity. Simulation results show that the performance of the low-complexity proposed scheme remains comparable to that of the traditional (highly-complex) IMM filter. Furthermore, we derive analytical expressions that iteratively evaluate the transient and steady-state performance of the proposed scheme, and establish the conditions that ensure the stability of the proposed filter. The analytical findings are in close accordance with the simulated results.

  9. Will hypertension performance measures used for pay-for-performance programs penalize those who care for medically complex patients?

    Science.gov (United States)

    Petersen, Laura A; Woodard, Lechauncy D; Henderson, Louise M; Urech, Tracy H; Pietz, Kenneth

    2009-06-16

    There is concern that performance measures, patient ratings of their care, and pay-for-performance programs may penalize healthcare providers of patients with multiple chronic coexisting conditions. We examined the impact of coexisting conditions on the quality of care for hypertension and patient perception of overall quality of their health care. We classified 141 609 veterans with hypertension into 4 condition groups: those with hypertension-concordant (diabetes mellitus, ischemic heart disease, dyslipidemia) and/or -discordant (arthritis, depression, chronic obstructive pulmonary disease) conditions or neither. We measured blood pressure control at the index visit, overall good quality of care for hypertension, including a follow-up interval, and patient ratings of satisfaction with their care. Associations between condition type and number of coexisting conditions on receipt of overall good quality of care were assessed with logistic regression. The relationship between patient assessment and objective measures of quality was assessed. Of the cohort, 49.5% had concordant-only comorbidities, 8.7% had discordant-only comorbidities, 25.9% had both, and 16.0% had none. Odds of receiving overall good quality after adjustment for age were higher for those with concordant comorbidities (odds ratio, 1.78; 95% confidence interval, 1.70 to 1.87), discordant comorbidities (odds ratio, 1.32; 95% confidence interval, 1.23 to 1.41), or both (odds ratio, 2.25; 95% confidence interval, 2.13 to 2.38) compared with neither. Findings did not change after adjustment for illness severity and/or number of primary care and specialty care visits. Patient assessment of quality did not vary by the presence of coexisting conditions and was not related to objective ratings of quality of care. Contrary to expectations, patients with greater complexity had higher odds of receiving high-quality care for hypertension. Subjective ratings of care did not vary with the presence or absence of
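    The odds ratios reported above come from logistic regression; for a single 2x2 exposure-outcome table the same quantity and a Wald 95% confidence interval can be computed directly. The counts below are invented for illustration, not the study's data:

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Wald 95% CI from a 2x2 table:
         exposed:   a with outcome, b without
         unexposed: c with outcome, d without
    CI is exp(ln(OR) +/- z * SE), SE = sqrt(1/a + 1/b + 1/c + 1/d)."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

print(odds_ratio_ci(600, 400, 450, 550))  # OR ≈ 1.83 with CI ≈ (1.53, 2.19)
```

The study's estimates additionally adjust for age and other covariates, which requires the full regression model rather than this single-table calculation.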

  10. Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system

    Directory of Open Access Journals (Sweden)

    Malaz A Boustani

    2010-05-01

    Full Text Available Malaz A Boustani, Stephanie Munger, Rajesh Gulati, Mickey Vogel, Robin A Beck, Christopher M Callahan (Indiana University Center for Aging Research; Regenstrief Institute Inc.; Indiana University School of Medicine, Department of Medicine, Division of General Internal Medicine and Geriatrics; Indiana University Medical Group-Primary Care; Indianapolis, IN, USA). Abstract: Complexity science suggests that our current health care delivery system acts as a complex adaptive system (CAS). Such systems represent a dynamic and flexible network of individuals who can coevolve with their ever-changing environment. CAS performance fluctuates, and its members’ interactions continuously change over time in response to the stress generated by the surrounding environment. This paper reviews the challenges of intervening and introducing a planned change into a complex adaptive health care delivery system. We explore the role of the “reflective adaptive process” in developing delivery interventions and suggest different evaluation methodologies to study the impact of such interventions on the performance of the entire system. Finally, we describe the implementation of a new program, the Aging Brain Care Medical Home, as a case study of our proposed evaluation process. Keywords: complexity, aging brain, implementation, complex adaptive system, sustained change, care delivery

  11. Electrochemically fabricated polypyrrole-cobalt-oxygen coordination complex as high-performance lithium-storage materials.

    Science.gov (United States)

    Guo, Bingkun; Kong, Qingyu; Zhu, Ying; Mao, Ya; Wang, Zhaoxiang; Wan, Meixiang; Chen, Liquan

    2011-12-23

    Current lithium-ion battery (LIB) technologies are all based on inorganic electrode materials, though organic materials have been used as electrodes for years. Disadvantages such as limited thermal stability and low specific capacity hinder their applications. On the other hand, the transition metal oxides that provide high lithium-storage capacity by way of electrochemical conversion reaction suffer from poor cycling stability. Here we report a novel high-performance organic lithium-storage material, a polypyrrole-cobalt-oxygen (PPy-Co-O) coordination complex, with high lithium-storage capacity and excellent cycling stability. Extended X-ray absorption fine structure and Raman spectroscopy and other physical and electrochemical characterizations demonstrate that this coordination complex can be electrochemically fabricated by cycling PPy-coated Co3O4 between 0.0 V and 3.0 V versus Li+/Li. Density functional theory (DFT) calculations indicate that each cobalt atom coordinates with two nitrogen atoms within the PPy-Co coordination layer and the layers are connected by oxygen atoms between them. Coordination weakens the C-H bonds on PPy and makes the complex a novel lithium-storage material with high capacity and high cycling stability. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. PNN NGC 246: A Complex Photometric Behaviour That Requires Wet

    Directory of Open Access Journals (Sweden)

    Pérez J. M. González

    2003-03-01

    Full Text Available We present a study based on three single-site campaigns investigating the photometric behaviour of the PNN NGC 246. We observed this object in 2000 and 2001. The analysis of the light curves indicates complex and variable temporal spectra. Using wavelet analysis, we have found evidence of changes on time scales of hours in the 2000 dataset. The temporal spectra obtained during 2001 are quite different from the results of the previous year. The modulations in the light curve are more noticeable and the temporal spectra present a higher number of modulation frequencies. One peculiar characteristic is the presence of a variable harmonic structure related to one of these modulation frequencies. This complex photometric behaviour may be explained by a more complicated unresolved combination of modulation frequencies, but more likely by a combination of pulsations of the star plus modulations related to interaction with a close companion, maybe indicating a disc. However, these characteristics cannot be confirmed from single-site observations. The complex and variable behaviour of NGC 246 requires the WET co-operation in order to completely resolve its light curve.

  13. KDM2B Recruitment of the Polycomb Group Complex, PRC1.1, Requires Cooperation between PCGF1 and BCORL1.

    Science.gov (United States)

    Wong, Sarah J; Gearhart, Micah D; Taylor, Alexander B; Nanyes, David R; Ha, Daniel J; Robinson, Angela K; Artigas, Jason A; Lee, Oliver J; Demeler, Borries; Hart, P John; Bardwell, Vivian J; Kim, Chongwoo A

    2016-10-04

    KDM2B recruits H2A-ubiquitinating activity of a non-canonical Polycomb Repression Complex 1 (PRC1.1) to CpG islands, facilitating gene repression. We investigated the molecular basis of recruitment using in vitro assembly assays to identify minimal components, subcomplexes, and domains required for recruitment. A minimal four-component PRC1.1 complex can be assembled by combining two separately isolated subcomplexes: the DNA-binding KDM2B/SKP1 heterodimer and the heterodimer of BCORL1 and PCGF1, a core component of PRC1.1. The crystal structure of the KDM2B/SKP1/BCORL1/PCGF1 complex illustrates the crucial role played by the PCGF1/BCORL1 heterodimer. The BCORL1 PUFD domain positions residues preceding the RAWUL domain of PCGF1 to create an extended interface for interaction with KDM2B, which is unique to the PCGF1-containing PRC1.1 complex. The structure also suggests how KDM2B might simultaneously function in PRC1.1 and an SCF ubiquitin ligase complex and the possible molecular consequences of BCOR PUFD internal tandem duplications found in pediatric kidney and brain tumors. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Estimation of waste package performance requirements for a nuclear waste repository in basalt

    International Nuclear Information System (INIS)

    Wood, B.J.

    1980-07-01

    A method of developing waste package performance requirements for specific nuclides is described. It is based on federal regulations concerning permissible concentrations in solution at the point of discharge to the accessible environment, a simple and conservative transport model, and baseline and potential worst-case release scenarios.

  15. Does strategy instruction on the Rey-Osterrieth Complex Figure task lead to transferred performance improvement on the Modified Taylor Complex Figure task? A randomized controlled trial in school-aged children.

    Science.gov (United States)

    Resch, Christine; Keulers, Esther; Martens, Rosa; van Heugten, Caroline; Hurks, Petra

    2018-04-05

    Providing children with organizational strategy instruction on the Rey Osterrieth Complex Figure (ROCF) has previously been found to improve organizational and accuracy performance on this task. It is unknown whether strategy instruction on the ROCF would also transfer to performance improvement on copying and the recall of another complex figure. Participants were 98 typically developing children (aged 9.5-12.6 years, M = 10.6). Children completed the ROCF (copy and recall) as a pretest. Approximately a month later, they were randomized to complete the ROCF with strategy instruction in the form of a stepwise administration of the ROCF or again in the standard format. All children then copied and recalled the Modified Taylor Complex Figure (MTCF). All productions were assessed in terms of organization, accuracy and completion time. Organization scores for the MTCF did not differ for the two groups for the copy production, but did differ for the recall production, indicating transfer. Accuracy and completion times did not differ between groups. Performance on all measures, except copy accuracy, improved between pretest ROCF and posttest MTCF production for both groups, suggesting practice effects. Findings indicate that transfer of strategy instruction from one complex figure to another is only present for organization of recalled information. The increase in RCF-OSS scores did not lead to a higher accuracy or a faster copy or recall.

  16. An evaluation of the performance of two binaural beamformers in complex and dynamic multitalker environments.

    Science.gov (United States)

    Best, Virginia; Mejia, Jorge; Freeston, Katrina; van Hoesel, Richard J; Dillon, Harvey

    2015-01-01

    Binaural beamformers are super-directional hearing aids created by combining microphone outputs from each side of the head. While they offer substantial improvements in SNR over conventional directional hearing aids, the benefits (and possible limitations) of these devices in realistic, complex listening situations have not yet been fully explored. In this study we evaluated the performance of two experimental binaural beamformers. Testing was carried out using a horizontal loudspeaker array. Background noise was created using recorded conversations. Performance measures included speech intelligibility, localization in noise, acceptable noise level, subjective ratings, and a novel dynamic speech intelligibility measure. Participants were 27 listeners with bilateral hearing loss, fitted with BTE prototypes that could be switched between conventional directional or binaural beamformer microphone modes. Relative to the conventional directional microphones, both binaural beamformer modes were generally superior for tasks involving fixed frontal targets, but not always for situations involving dynamic target locations. Binaural beamformers show promise for enhancing listening in complex situations when the location of the source of interest is predictable.

  17. Enhanced fluorescence sensitivity by coupling yttrium-analyte complexes and three-way fast high-performance liquid chromatography data modeling

    Energy Technology Data Exchange (ETDEWEB)

    Alcaraz, Mirta R.; Culzoni, María J., E-mail: mculzoni@fbcb.unl.edu.ar; Goicoechea, Héctor C., E-mail: hgoico@fbcb.unl.edu.ar

    2016-01-01

    The present study reports a sensitive chromatographic method for the analysis of seven fluoroquinolones (FQs) in environmental water samples, coupling yttrium-analyte complexation and three-way chromatographic data modeling. The method, based on HPLC-FSFD, does not require complex or tedious sample treatments or enrichment processes before analysis, owing to the significant fluorescence increments of the analytes in the presence of Y³⁺. The enhancement achieved for the FQ signals after Y³⁺ addition reaches 103- to 1743-fold. Prediction results from the application of MCR-ALS to the validation set showed relative error of prediction (REP%) values below 10% in all cases. A recovery study including the simultaneous determination of the seven FQs in three different environmental aqueous matrices was conducted; the recovery results assert the efficiency and accuracy of the proposed method. The calculated LOD values are on the order of parts per trillion (below 0.5 ng mL⁻¹ for all the FQs except enoxacin). It is noteworthy that the method proposed here, which does not include pre-concentration steps, reaches LOD values of the same order of magnitude as those achieved by more sophisticated methods based on SPE and UHPLC-MS/MS. - Highlights: • Highly sensitive method for the analysis of seven fluoroquinolones. • Coupling of yttrium-analyte complexation and three-way modeling. • Complex or tedious sample treatments or enrichment processes are not required. • Accurate quantitation of fluoroquinolones in real river water samples.
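    One common chemometric definition of REP% (assumed here; the authors may use a variant) is the root of the summed squared prediction errors relative to the root of the summed squared reference values, expressed as a percentage. The concentrations below are invented for illustration:

```python
import math

def rep_percent(actual, predicted):
    """Relative error of prediction (%):
    100 * sqrt(sum((y - yhat)^2) / sum(y^2)), one common definition."""
    num = sum((y, p) == (y, p) and (y - p) ** 2 for y, p in zip(actual, predicted))
    den = sum(y ** 2 for y in actual)
    return 100.0 * math.sqrt(num / den)

actual = [10.0, 20.0, 30.0]      # reference concentrations (made up)
predicted = [9.5, 21.0, 29.0]    # model predictions (made up)
print(round(rep_percent(actual, predicted), 2))  # → 4.01
```

A REP% below 10%, as reported for the MCR-ALS validation set, thus corresponds to prediction errors an order of magnitude smaller than the signal itself.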

  18. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    Science.gov (United States)

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is a high probability (with 95% CI) that the NMT system can achieve a positive net power output between 15.8 and 35 W. A sensitivity study reveals that the system's power performance is mostly affected by the SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements to the system design and operational strategy, and this probabilistic assessment framework can also be applied to similar complex engineering systems.

  19. Complexity in practice: understanding primary care as a complex adaptive system

    Directory of Open Access Journals (Sweden)

    Beverley Ellis

    2010-06-01

    Conclusions: The results are real-world exemplars of the emergent properties of complex adaptive systems. Improving clinical governance in primary care requires both complex social interactions and underpinning informatics. The socio-technical lessons learned from this research should inform future management approaches.

  20. Neurological surgery: the influence of physical and mental demands on humans performing complex operations.

    Science.gov (United States)

    Bourne, Sarah K; Walcott, Brian P; Sheth, Sameer A; Coumans, Jean-Valery C E

    2013-03-01

    Performing neurological surgery is an inherently demanding task on the human body, both physically and mentally. Neurosurgeons routinely perform "high stakes" operations in the setting of mental and physical fatigue. These conditions may be not only the result of demanding operations, but also influential to their outcome. Similar to other performance-based endurance activities, training is paramount to successful outcomes. The inflection point, where training reaches the point of diminishing returns, is intensely debated. For the neurosurgeon, this point must be exploited to the maximum, as patients require both the best-trained and best-performing surgeon. In this review, we explore the delicate balance of training and performance, as well as some routinely used adjuncts to improve human performance. Copyright © 2012 Elsevier Ltd. All rights reserved.

  1. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study regarding a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends the current knowledge on business interoperability and addresses an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the wider network to which the two companies belong (the network effect). In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and to better understand how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.

  2. Performance analysis and prediction in triathlon.

    Science.gov (United States)

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.

  3. On improving the performance of nonphotochemical quenching in CP29 light-harvesting antenna complex

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P. [Theoretical Division, T-4, Los Alamos National Laboratory, and the New Mexico Consortium, Los Alamos, NM 87544 (United States); Nesterov, Alexander I., E-mail: nesterov@cencar.udg.mx [Departamento de Física, CUCEI, Universidad de Guadalajara, Av. Revolución 1500, Guadalajara, CP 44420, Jalisco (Mexico); Sayre, Richard T. [Biological Division, B-11, Los Alamos National Laboratory, and the New Mexico Consortium, Los Alamos, NM 87544 (United States); Still, Susanne [Department of Information and Computer Sciences, and Department of Physics and Astronomy, University of Hawaii at Mānoa, 1860 East–West Road, Honolulu, HI 96822 (United States)

    2016-03-22

    We model and simulate the performance of charge-transfer in nonphotochemical quenching (NPQ) in the CP29 light-harvesting antenna-complex associated with photosystem II (PSII). The model consists of five discrete excitonic energy states and two sinks, responsible for the potentially damaging processes and charge-transfer channels, respectively. We demonstrate that by varying (i) the parameters of the chlorophyll-based dimer, (ii) the resonant properties of the protein-solvent environment interaction, and (iii) the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ. Our analysis suggests strategies for improving the performance of the NPQ in response to environmental changes, and may stimulate experimental verification. - Highlights: • Improvement of the efficiency of the charge-transfer nonphotochemical quenching in CP29. • Strategy for restoring the NPQ efficiency when the environment changes. • By changing the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ.
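
    The competition between the two sinks can be caricatured by a two-channel kinetic sketch: an excited population drains into a damaging channel and a protective charge-transfer channel, and the quenching efficiency is the fraction routed to the latter. The rates are hypothetical, and this toy omits the five excitonic states and resonance effects of the actual model:

```python
def npq_efficiency(k_ct, k_damage, dt=1e-3, steps=20000):
    """Forward-Euler integration of dp/dt = -(k_ct + k_damage) * p,
    tracking how much population each sink captures."""
    p, to_ct, to_damage = 1.0, 0.0, 0.0
    for _ in range(steps):
        dct = k_ct * p * dt        # flux into the charge-transfer sink
        ddm = k_damage * p * dt    # flux into the damaging sink
        p -= dct + ddm
        to_ct += dct
        to_damage += ddm
    return to_ct / (to_ct + to_damage)
```

    For this linear model the efficiency reduces to k_ct / (k_ct + k_damage), which is why tuning the transfer rates to the sinks directly controls NPQ performance.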

  4. SANS with contrast variation study of the bacteriorhodopsin-octyl glucoside complex

    Science.gov (United States)

    Mo, Yiming; Heller, William T.

    2010-11-01

    Membrane proteins (MPs), which play vital roles in trans-membrane trafficking and signalling between cells and their external environment, comprise a major fraction of the expressed proteomes of many organisms. MP production for biophysical characterization requires detergents for extracting MPs from their native membrane and to solubilize the MP in solution for purification and study. In a proper detergent solution, the detergent-associated MPs retain their native fold and oligomerization state, key requirements for biophysical characterization and crystallization. SANS with contrast variation was performed to characterize BR in complex with OG to better understand the MP-detergent complex. Contrast variation makes it possible to not only probe the conformation of the entire structure but also investigate the conformation of the polypeptide chain within the BR-OG complex. The BR-OG SANS contrast variation series is not consistent with a compact structure, such as a trimeric BR complex surrounded by a belt of detergent. The data strongly suggest that the protein is partially unfolded through its association with the detergent micelles.

  5. SANS with contrast variation study of the bacteriorhodopsin-octyl glucoside complex

    International Nuclear Information System (INIS)

    Mo Yiming; Heller, William T

    2010-01-01

    Membrane proteins (MPs), which play vital roles in trans-membrane trafficking and signalling between cells and their external environment, comprise a major fraction of the expressed proteomes of many organisms. MP production for biophysical characterization requires detergents for extracting MPs from their native membrane and to solubilize the MP in solution for purification and study. In a proper detergent solution, the detergent-associated MPs retain their native fold and oligomerization state, key requirements for biophysical characterization and crystallization. SANS with contrast variation was performed to characterize BR in complex with OG to better understand the MP-detergent complex. Contrast variation makes it possible to not only probe the conformation of the entire structure but also investigate the conformation of the polypeptide chain within the BR-OG complex. The BR-OG SANS contrast variation series is not consistent with a compact structure, such as a trimeric BR complex surrounded by a belt of detergent. The data strongly suggest that the protein is partially unfolded through its association with the detergent micelles.

  6. Performance of the new automated Abbott RealTime MTB assay for rapid detection of Mycobacterium tuberculosis complex in respiratory specimens.

    Science.gov (United States)

    Chen, J H K; She, K K K; Kwong, T-C; Wong, O-Y; Siu, G K H; Leung, C-C; Chang, K-C; Tam, C-M; Ho, P-L; Cheng, V C C; Yuen, K-Y; Yam, W-C

    2015-09-01

    The automated high-throughput Abbott RealTime MTB real-time PCR assay has been recently launched for Mycobacterium tuberculosis complex (MTBC) clinical diagnosis. This study evaluated its performance. We first compared its diagnostic performance with the Roche Cobas TaqMan MTB assay on 214 clinical respiratory specimens. Prospective analysis of a total of 520 specimens was then performed to further evaluate the Abbott assay. The Abbott assay showed a lower limit of detection at 22.5 AFB/ml, which was more sensitive than the Cobas assay (167.5 AFB/ml). The two assays demonstrated a significant difference in diagnostic performance (McNemar's test; P = 0.0034), in which the Abbott assay presented a significantly higher area under the curve (AUC) than the Cobas assay (1.000 vs 0.880; P = 0.0002). The Abbott assay demonstrated extremely low PCR inhibition on clinical respiratory specimens. The automated Abbott assay required only a very short manual handling time (0.5 h), which could help improve laboratory management. In the prospective analysis, the overall estimates of sensitivity and specificity of the Abbott assay were both 100 % for smear-positive specimens, whereas for smear-negative specimens they were 96.7 % and 96.1 %, respectively. No cross-reactivity with non-tuberculosis mycobacterial species was observed. The superior sensitivity of the Abbott assay for detecting MTBC in smear-negative specimens could further minimize the risk of false-negative MTBC detection. The new Abbott RealTime MTB assay has good diagnostic performance and can be a useful diagnostic tool for rapid MTBC detection in clinical laboratories.
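
    McNemar's test, used here for the paired comparison of the two assays, depends only on the discordant pairs. A sketch with hypothetical counts (not the study's data):

```python
def mcnemar_chi2(b, c):
    """McNemar's chi-squared statistic with continuity correction for
    paired binary outcomes: b = specimens positive only by assay 1,
    c = specimens positive only by assay 2 (concordant pairs drop out)."""
    return (abs(b - c) - 1) ** 2 / (b + c)

# Hypothetical discordant counts: 12 specimens detected only by the more
# sensitive assay, 2 detected only by the other.
chi2 = mcnemar_chi2(12, 2)
significant = chi2 > 3.84  # chi-squared critical value, df = 1, alpha = 0.05
```

    An imbalance this large in the discordant cells is what yields a small P value even though both assays agree on most specimens.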

  7. Electronic Nose Testing Procedure for the Definition of Minimum Performance Requirements for Environmental Odor Monitoring

    Directory of Open Access Journals (Sweden)

    Lidia Eusebio

    2016-09-01

    Full Text Available Despite initial enthusiasm towards electronic noses and their possible application in different fields, and many promising results, several criticalities emerge from most published research studies, and, as a matter of fact, the diffusion of electronic noses in real-life applications is still very limited. In general, a first step towards large-scale diffusion of an analysis method is standardization. The aim of this paper is to describe the experimental procedure adopted in order to evaluate electronic nose performance, with the final purpose of establishing minimum performance requirements, which is considered to be a first crucial step towards standardization of the specific case of electronic nose application for environmental odor monitoring at receptors. Based on the experimental results of the performance testing of a commercialized electronic nose type with respect to three criteria (i.e., response invariability to variable atmospheric conditions, instrumental detection limit, and odor classification accuracy), it was possible to hypothesize a logic that could be adopted for the definition of minimum performance requirements, according to the idea that these are technologically achievable.
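
    One of the three criteria, the instrumental detection limit, is commonly estimated from replicate blank measurements. A sketch under the widespread mean-plus-three-sigma convention (hypothetical readings; not necessarily the exact definition adopted in this work):

```python
import statistics

# Hypothetical electronic nose responses to blank (odor-free) air,
# in arbitrary sensor units.
blank = [0.8, 1.1, 0.9, 1.0, 1.2, 0.9, 1.0, 1.1]

# Common convention: the smallest response reliably distinguishable from
# blank noise is the blank mean plus three sample standard deviations.
idl = statistics.mean(blank) + 3 * statistics.stdev(blank)
```

    A minimum performance requirement on the detection limit then amounts to demanding that this threshold fall below the odor concentrations expected at receptors.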

  8. High Performance Computing and Storage Requirements for Nuclear Physics: Target 2017

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wasserman, Harvey [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-04-30

    In April 2014, NERSC, ASCR, and the DOE Office of Nuclear Physics (NP) held a review to characterize high performance computing (HPC) and storage requirements for NP research through 2017. This review is the 12th in a series of reviews held by NERSC and Office of Science program offices that began in 2009. It is the second for NP, and the final in the second round of reviews that covered the six Office of Science program offices. This report is the result of that review.

  9. Complex versus simple models: ion-channel cardiac toxicity prediction.

    Science.gov (United States)

    Mistry, Hitesh B

    2018-01-01

    There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate exists as to whether such complex models are required. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest data-set. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.
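
    Leave-one-out cross validation, used for the assessment, fits on all compounds but one and tests on the held-out compound, repeating over every compound. A generic sketch with a toy one-feature threshold classifier (hypothetical data, not the cardiac models themselves):

```python
def loocv_accuracy(data, labels, fit, predict):
    """Leave-one-out cross validation: train on n-1 points, test on the
    held-out point, and repeat so every point is held out once."""
    hits = 0
    for i in range(len(data)):
        train_x = data[:i] + data[i + 1:]
        train_y = labels[:i] + labels[i + 1:]
        model = fit(train_x, train_y)
        hits += predict(model, data[i]) == labels[i]
    return hits / len(data)

def fit(xs, ys):
    # Toy model: threshold halfway between the two class means.
    m0 = sum(x for x, y in zip(xs, ys) if y == 0) / ys.count(0)
    m1 = sum(x for x, y in zip(xs, ys) if y == 1) / ys.count(1)
    return (m0 + m1) / 2

def predict(threshold, x):
    return int(x > threshold)

acc = loocv_accuracy([0.1, 0.2, 0.3, 0.8, 0.9, 1.0],
                     [0, 0, 0, 1, 1, 1], fit, predict)
```

    Because each test point is excluded from its own training set, LOOCV gives a near-unbiased estimate of out-of-sample accuracy, which is what makes the simple-versus-complex comparison fair.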

  10. Complex versus simple models: ion-channel cardiac toxicity prediction

    Directory of Open Access Journals (Sweden)

    Hitesh B. Mistry

    2018-02-01

    Full Text Available There is growing interest in applying detailed mathematical models of the heart for ion-channel related cardiac toxicity prediction. However, a debate exists as to whether such complex models are required. Here an assessment of the predictive performance of two established large-scale biophysical cardiac models and a simple linear model, Bnet, was conducted. Three ion-channel data-sets were extracted from the literature. Each compound was designated a cardiac risk category using two different classification schemes based on information within CredibleMeds. The predictive performance of each model within each data-set for each classification scheme was assessed via a leave-one-out cross validation. Overall the Bnet model performed equally as well as the leading cardiac models in two of the data-sets and outperformed both cardiac models on the latest data-set. These results highlight the importance of benchmarking complex versus simple models but also encourage the development of simple models.

  11. Predicting timing performance of advanced mechatronics control systems

    NARCIS (Netherlands)

    Voeten, J.P.M.; Hendriks, T.; Theelen, B.D.; Schuddemat, J.; Tabingh Suermondt, W.; Gemei, J.; Kotterink, C.; Huet, van J.; Eichler, G.; Kuepper, A.; Schau, V.; Fouchal, H.; Unger, H.

    2011-01-01

    Embedded control is a key product technology differentiator for many high-tech industries, including ASML. The strong increase in complexity of embedded control systems, combined with the occurrence of late changes in control requirements, results in many timing performance problems showing up only

  12. Childhood school performance, education and occupational complexity: a life-course study of dementia in the Kungsholmen Project.

    Science.gov (United States)

    Dekhtyar, Serhiy; Wang, Hui-Xin; Fratiglioni, Laura; Herlitz, Agneta

    2016-08-01

    The cognitive reserve hypothesis predicts that intellectually demanding activities over the life course protect against dementia. We investigate whether childhood school performance remains associated with dementia once education and occupational complexity are taken into account. A cohort of 440 individuals aged 75+ from the Kungsholmen Project was followed up for 9 years to detect dementia. To measure early-life contributors to reserve, we used grades at age 9-10 extracted from the school archives. Data on formal education and occupational complexity were collected at baseline and first follow-up. Dementia was ascertained through comprehensive clinical examination. Cox models estimated the relationship between life-course cognitive reserve measures and dementia. Dementia risk was elevated [hazard ratio (HR): 1.54, 95% confidence interval (CI): 1.03 to 2.29] in individuals with low early-life school grades after adjustment for formal educational attainment and occupational complexity. Secondary education was associated with a lower risk of dementia (HR: 0.72, 95% CI: 0.50 to 1.03), although the effects of post-secondary and university degrees were indistinguishable from baseline. Occupational complexity with data and things was not related to dementia. However, an association with lower dementia risk was found for high occupational complexity with people, albeit only in women (HR: 0.39, 95% CI: 0.14 to 0.99). The pattern of results remained unchanged after adjustment for genetic susceptibility, comorbidities and depressive symptoms. Low early-life school performance is associated with an elevated risk of dementia, independent of subsequent educational and occupational attainment. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  13. Modular Software Performance Monitoring

    CERN Document Server

    Kruse, D F

    2011-01-01

    CPU clock frequency is not likely to be increased significantly in the coming years, and data analysis speed can be improved by using more processors or buying new machines, only if one is willing to change the paradigm to a parallel one. Therefore, performance monitoring procedures and tools are needed to help programmers to optimize existing software running on current and future hardware. Low-level information from hardware performance counters is vital to spot specific performance problems slowing program execution. HEP software is often huge and complex, and existing tools are unable to give results with the required granularity. We will report on the approach we have chosen to solve this problem, which involves decomposing the application into parts and monitoring each of them separately. Both counting and sampling methods are used to allow an analysis with the required custom granularity: from the global level down to the function level. A set of tools (based on perfmon2 – a software interface to hardware co...
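
    The "counting" side of such decomposition-based monitoring can be sketched at the language level: instrument each part of the application and accumulate per-function statistics. This is an illustrative wall-clock sketch only; the tools described above read hardware performance counters through perfmon2 rather than timing in software:

```python
import collections
import functools
import time

# Per-function accumulator: name -> [call count, total seconds].
stats = collections.defaultdict(lambda: [0, 0.0])

def monitored(fn):
    """Decorator that counts calls and accumulates wall-clock time,
    giving function-level granularity for a later report."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        t0 = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            entry = stats[fn.__name__]
            entry[0] += 1
            entry[1] += time.perf_counter() - t0
    return wrapper

@monitored
def hot_loop(n):
    return sum(i * i for i in range(n))

for _ in range(3):
    hot_loop(10000)
```

    Sampling-based monitoring, by contrast, periodically inspects where execution currently is, trading exactness for negligible overhead on hot paths.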

  14. A Low-Complexity Joint Detection-Decoding Algorithm for Nonbinary LDPC-Coded Modulation Systems

    OpenAIRE

    Wang, Xuepeng; Bai, Baoming; Ma, Xiao

    2010-01-01

    In this paper, we present a low-complexity joint detection-decoding algorithm for nonbinary LDPC coded modulation systems. The algorithm combines hard-decision decoding using the message-passing strategy with the signal detector in an iterative manner. It requires low computational complexity, offers good system performance and has a fast rate of decoding convergence. Compared to the q-ary sum-product algorithm (QSPA), it provides an attractive candidate for practical applications of q-ary LDP...

  15. Recruitment of Mediator Complex by Cell Type and Stage-Specific Factors Required for Tissue-Specific TAF Dependent Gene Activation in an Adult Stem Cell Lineage.

    Science.gov (United States)

    Lu, Chenggang; Fuller, Margaret T

    2015-12-01

    Onset of terminal differentiation in adult stem cell lineages is commonly marked by robust activation of new transcriptional programs required to make the appropriate differentiated cell type(s). In the Drosophila male germ line stem cell lineage, the switch from proliferating spermatogonia to spermatocyte is accompanied by one of the most dramatic transcriptional changes in the fly, as over 1000 new transcripts turn on in preparation for meiosis and spermatid differentiation. Here we show that function of the coactivator complex Mediator is required for activation of hundreds of new transcripts in the spermatocyte program. Mediator appears to act in a sequential hierarchy, with the testis activating complex (tMAC), a cell type specific form of the Mip/dREAM general repressor, required to recruit Mediator subunits to the chromatin, and Mediator function required to recruit the testis TAFs (tTAFs), spermatocyte-specific homologs of subunits of TFIID. Mediator, tMAC and the tTAFs co-regulate expression of a major set of spermatid differentiation genes. The Mediator subunit Med22 binds the tMAC component Topi when the two are coexpressed in S2 cells, suggesting direct recruitment. Loss of Med22 function in spermatocytes causes meiosis I maturation arrest and male infertility, similar to loss of function of the tMAC subunits or the tTAFs. Our results illuminate how cell type specific versions of the Mip/dREAM complex and the general transcription machinery cooperate to drive selective gene activation during differentiation in stem cell lineages.

  16. SIMPL enhancement of tumor necrosis factor-α dependent p65-MED1 complex formation is required for mammalian hematopoietic stem and progenitor cell function.

    Directory of Open Access Journals (Sweden)

    Weina Zhao

    Full Text Available Significant insight into the signaling pathways leading to activation of the Rel transcription factor family, collectively termed NF-κB, has been gained. Less well understood is how subsets of NF-κB-dependent genes are regulated in a signal-specific manner. The SIMPL protein (signaling molecule that interacts with mouse pelle-like kinase) is required for full Tumor Necrosis Factor-α (TNFα)-induced NF-κB activity. We show that SIMPL is required for steady-state hematopoiesis and the expression of a subset of TNFα-induced genes whose products regulate hematopoietic cell activity. To gain insight into the mechanism through which SIMPL modulates gene expression we focused on the Tnf gene, an immune response regulator required for steady-state hematopoiesis. In response to TNFα, SIMPL localizes to the Tnf gene promoter where it modulates the initiation of Tnf gene transcription. SIMPL binding partners identified by mass spectrometry include proteins involved in transcription, and the interaction between SIMPL and MED1 was characterized in more detail. In response to TNFα, SIMPL is found in p65-MED1 complexes where SIMPL enhances p65/MED1/SIMPL complex formation. Together our results indicate that SIMPL functions as a TNFα-dependent p65 co-activator by facilitating the recruitment of MED1 to p65-containing transcriptional complexes to control the expression of a subset of TNFα-induced genes.

  17. Automated Planning Enables Complex Protocols on Liquid-Handling Robots.

    Science.gov (United States)

    Whitehead, Ellis; Rudolf, Fabian; Kaltenbach, Hans-Michael; Stelling, Jörg

    2018-03-16

    Robotic automation in synthetic biology is especially relevant for liquid handling to facilitate complex experiments. However, research tasks that are not highly standardized are still rarely automated in practice. Two main reasons for this are the substantial investments required to translate molecular biological protocols into robot programs, and the fact that the resulting programs are often too specific to be easily reused and shared. Recent developments of standardized protocols and dedicated programming languages for liquid-handling operations addressed some aspects of ease-of-use and portability of protocols. However, either they focus on simplicity, at the expense of enabling complex protocols, or they entail detailed programming, with corresponding skills and efforts required from the users. To reconcile these trade-offs, we developed Roboliq, a software system that uses artificial intelligence (AI) methods to integrate (i) generic formal, yet intuitive, protocol descriptions, (ii) complete, but usually hidden, programming capabilities, and (iii) user-system interactions to automatically generate executable, optimized robot programs. Roboliq also enables high-level specifications of complex tasks with conditional execution. To demonstrate the system's benefits for experiments that are difficult to perform manually because of their complexity, duration, or time-critical nature, we present three proof-of-principle applications for the reproducible, quantitative characterization of GFP variants.
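
    At its core, turning a declarative protocol description into an executable robot program involves ordering steps by their dependencies. A minimal sketch using topological sorting (the step names are hypothetical; Roboliq's AI planner additionally handles labware, liquid classes, conditional execution and optimization):

```python
from graphlib import TopologicalSorter

# Toy protocol: each step maps to the set of steps that must finish first.
deps = {
    "prepare_dilutions": set(),
    "dispense_buffer": set(),
    "dispense_gfp": {"prepare_dilutions"},
    "seal_plate": {"dispense_gfp", "dispense_buffer"},
    "measure": {"seal_plate"},
}

# A valid execution order for the liquid-handling robot.
order = list(TopologicalSorter(deps).static_order())
```

    A real planner searches among the many valid orders for one that minimizes tip changes, travel and timing violations, which is where the AI methods mentioned above come in.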

  18. The Effects of Enzyme Complex on Performance, Intestinal Health and Nutrient Digestibility of Weaned Pigs

    Directory of Open Access Journals (Sweden)

    J. Q. Yi

    2013-08-01

    Full Text Available Two experiments were conducted to evaluate the effect of supplementing a corn-soybean meal-based diet with an enzyme complex containing amylase, protease and xylanase on the performance, intestinal health, apparent ileal digestibility of amino acids and nutrient digestibility of weaned pigs. In Exp. 1, 108 piglets weaned at 28 d of age were fed one of three diets containing 0 (control), 100, or 150 ppm enzyme complex for 4 wks, based on a two-phase feeding program, namely 1 to 7 d (phase 1) and 8 to 28 d (phase 2). At the end of the experiment, six pigs from the control group and the group supplemented with 150 ppm enzyme complex were chosen to collect digesta samples from the intestine to measure viscosity and pH in the stomach, ileum, and cecum, as well as volatile fatty acid concentrations and composition of the microflora in the cecum and colon. There were linear increases (p<0.01) in weight gain, gain:feed ratio and digestibility of gross energy with the increasing dose rate of enzyme supplementation during the whole experiment. Supplementation with enzyme complex increased the digesta viscosity in the stomach (p<0.05) and significantly increased (p<0.01) the concentrations of acetic, propionic and butyric acid in the cecum and colon. Enzyme supplementation also significantly increased the population of Lactobacilli (p<0.01) in the cecum and decreased the population of E. coli (p<0.05) in the colon. In Exp. 2, six crossbred barrows (initial body weight: 18.26±1.21 kg), fitted with a simple T-cannula at the distal ileum, were assigned to three dietary treatments according to a replicated 3×3 Latin Square design. The experimental diets were the same as the diets used in phase 2 in Exp. 1. Apparent ileal digestibility of isoleucine (p<0.01), valine (p<0.05) and aspartic acid (p<0.05) linearly increased with the increasing dose rate of enzyme supplementation. In conclusion, supplementation of the diet with an enzyme complex containing amylase, protease and

  19. NERSC Cyber Security Challenges That Require DOE Development andSupport

    Energy Technology Data Exchange (ETDEWEB)

    Draney, Brent; Campbell, Scott; Walter, Howard

    2007-01-16

    Traditional security approaches do not adequately address all the requirements of open, scientific computing facilities. Many of the methods used for more restricted environments, including almost all corporate/commercial systems, do not meet the needs of today's science. Use of only the available "state of the practice" commercial methods will have adverse impact on the ability of DOE to accomplish its science goals, and impacts the productivity of the DOE Science community. In particular, NERSC and other high performance computing (HPC) centers have special security challenges that are unlikely to be met unless DOE funds development and support of reliable and effective tools designed to meet the cyber security needs of High Performance Science. The security challenges facing NERSC can be collected into three basic problem sets: network performance and dynamics, application complexity and diversity, and a complex user community that can have transient affiliations with actual institutions. To address these problems, NERSC proposes the following four general solutions: auditing user and system activity across sites; firewall port configuration in real time; cross-site/virtual organization identity management and access control; and detecting security issues in application middleware. Solutions are also proposed for three general long term issues: data volume, application complexity, and information integration.

  20. Self-Efficacy, Task Complexity and Task Performance: Exploring Interactions in Two Versions of Vocabulary Learning Tasks

    Science.gov (United States)

    Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan

    2012-01-01

    The present study aimed for better understanding of the interactions between task complexity and students' self-efficacy beliefs and students' use of learning strategies, and finally their interacting effects on task performance. This investigation was carried out in the context of Chinese students learning English as a foreign language in a…

  1. Assessment of Performance-based Requirements for Structural Design

    DEFF Research Database (Denmark)

    Hertz, Kristian Dahl

    2005-01-01

    and for a detailed assessment of the requirements. The design requirements to be used for a factory producing elements for industrial housing for unknown costumers are discussed, and a fully developed fire is recommended as a common requirement for domestic houses, hotels, offices, schools and hospitals. In addition...

  2. The Complexity of Mitochondrial Complex IV: An Update of Cytochrome c Oxidase Biogenesis in Plants

    Science.gov (United States)

    Mansilla, Natanael; Racca, Sofia; Gras, Diana E.; Gonzalez, Daniel H.

    2018-01-01

    Mitochondrial respiration is an energy producing process that involves the coordinated action of several protein complexes embedded in the inner membrane to finally produce ATP. Complex IV or Cytochrome c Oxidase (COX) is the last electron acceptor of the respiratory chain, involved in the reduction of O2 to H2O. COX is a multimeric complex formed by multiple structural subunits encoded in two different genomes, prosthetic groups (heme a and heme a3), and metallic centers (CuA and CuB). Tens of accessory proteins are required for mitochondrial RNA processing, synthesis and delivery of prosthetic groups and metallic centers, and for the final assembly of subunits to build a functional complex. In this review, we perform a comparative analysis of COX composition and biogenesis factors in yeast, mammals and plants. We also describe possible external and internal factors controlling the expression of structural proteins and assembly factors at the transcriptional and post-translational levels, and the effect of deficiencies in different steps of COX biogenesis to infer the role of COX in different aspects of plant development. We conclude that COX assembly in plants has conserved and specific features, probably due to the incorporation of a different set of subunits during evolution. PMID:29495437

  3. Formal verification of complex properties on PLC programs

    CERN Document Server

    Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M

    2014-01-01

    Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
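The Cone of Influence reduction mentioned above can be illustrated with a small sketch (variable names are invented; NuSMV's actual COI implementation operates on its internal model representation): starting from the variables that appear in the temporal-logic property, keep only those that can influence them transitively through the transition relation, and prune the rest before model checking.

```python
from collections import deque

def cone_of_influence(dependencies, property_vars):
    """Return the set of variables that can influence the property.

    dependencies maps each variable to the set of variables its
    next-state value depends on; property_vars are the variables
    mentioned in the temporal-logic property.
    """
    cone = set(property_vars)
    queue = deque(property_vars)
    while queue:
        var = queue.popleft()
        for dep in dependencies.get(var, ()):
            if dep not in cone:
                cone.add(dep)
                queue.append(dep)
    return cone

# Toy model: 'alarm' depends on 'sensor', 'sensor' on 'input';
# 'lamp'/'clock' cannot affect the property and are pruned.
deps = {
    "alarm": {"sensor"},
    "sensor": {"input"},
    "lamp": {"clock"},
}
print(sorted(cone_of_influence(deps, {"alarm"})))  # -> ['alarm', 'input', 'sensor']
```

Dropping the pruned variables shrinks the state space exponentially in the number of removed bits, which is why COI can make verification orders of magnitude faster without changing the truth value of the property.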

  4. Quality Management and Key Performance Indicators in Oncologic Esophageal Surgery.

    Science.gov (United States)

    Gockel, Ines; Ahlbrand, Constantin Johannes; Arras, Michael; Schreiber, Elke Maria; Lang, Hauke

    2015-12-01

    Ranking systems and comparisons of quality and performance indicators will be of increasing relevance for complex "high-risk" procedures such as esophageal cancer surgery. The identification of evidence-based standards relevant for key performance indicators in esophageal surgery is essential for establishing monitoring systems and furthermore a requirement to enhance treatment quality. In the course of this review, we analyze the key performance indicators case volume, radicality of resection, and postoperative morbidity and mortality, leading to continuous quality improvement. Ranking systems established on this basis will gain increased relevance in highly complex procedures within the national and international comparison and furthermore improve the treatment of patients with esophageal carcinoma.

  5. License plate localization in complex scenes based on oriented FAST and rotated BRIEF feature

    Science.gov (United States)

    Wang, Ran; Xia, Yuanchun; Wang, Guoyou; Tian, Jiangmin

    2015-09-01

    Within intelligent transportation systems, fast and robust license plate localization (LPL) in complex scenes is still a challenging task. Real-world scenes introduce complexities such as variation in license plate size and orientation, uneven illumination, background clutter, and nonplate objects. These complexities lead to poor performance using traditional LPL features, such as color, edge, and texture. Recently, state-of-the-art performance in LPL has been achieved by applying the scale invariant feature transform (SIFT) descriptor to LPL for visual matching. However, for applications that require fast processing, such as mobile phones, SIFT does not meet the efficiency requirement due to its relatively slow computational speed. To address this problem, a new approach for LPL, which uses the oriented FAST and rotated BRIEF (ORB) feature detector, is proposed. The feature extraction in ORB is much more efficient than in SIFT and is invariant to scale and grayscale as well as rotation changes, and hence is able to provide superior performance for LPL. The potential regions of a license plate are detected by considering spatial and color information simultaneously, which is different from previous approaches. The experimental results on a challenging dataset demonstrate the effectiveness and efficiency of the proposed method.
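Part of why ORB meets the efficiency requirement is that its rotated BRIEF descriptors are binary strings compared by Hamming distance rather than Euclidean distance on float vectors. A minimal stdlib sketch of that matching step (toy 8-bit descriptors; real ORB descriptors are 256-bit and typically handled via OpenCV):

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary descriptors stored as ints."""
    return bin(a ^ b).count("1")

def match(query, database, max_dist=10):
    """Index of the nearest descriptor by Hamming distance, or None if all
    candidates exceed max_dist."""
    best_idx, best_dist = None, max_dist + 1
    for i, descriptor in enumerate(database):
        dist = hamming(query, descriptor)
        if dist < best_dist:
            best_idx, best_dist = i, dist
    return best_idx

db = [0b10110010, 0b01001101, 0b11110000]
print(match(0b10110011, db))  # -> 0 (only one bit differs)
```

Because the distance is a XOR plus a popcount, matching thousands of descriptors per frame stays cheap even on mobile-class hardware, which is the efficiency argument the paper makes against SIFT.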

  6. 40 CFR 80.820 - What gasoline is subject to the toxics performance requirements of this subpart?

    Science.gov (United States)

    2010-07-01

    ... 40 Protection of Environment 16 2010-07-01 2010-07-01 false What gasoline is subject to the toxics... PROTECTION AGENCY (CONTINUED) AIR PROGRAMS (CONTINUED) REGULATION OF FUELS AND FUEL ADDITIVES Gasoline Toxics Gasoline Toxics Performance Requirements § 80.820 What gasoline is subject to the toxics performance...

  7. High performance in software development

    CERN Multimedia

    CERN. Geneva; Haapio, Petri; Liukkonen, Juha-Matti

    2015-01-01

    What are the ingredients of high-performing software? Software development, especially for large high-performance systems, is one of the most complex tasks mankind has ever tried. Technological change leads to huge opportunities but challenges our old ways of working. Processing large data sets, possibly in real time or with other tight computational constraints, requires an efficient solution architecture. Efficiency requirements span from the distributed storage and large-scale organization of computation and data onto the lowest level of processor and data bus behavior. Integrating performance behavior over these levels is especially important when the computation is resource-bounded, as it is in numerics: physical simulation, machine learning, estimation of statistical models, etc. For example, memory locality and utilization of vector processing are essential for harnessing the computing power of modern processor architectures due to the deep memory hierarchies of modern general-purpose computers. As a r...

  8. Anharmonic Vibrational Spectroscopy on Metal Transition Complexes

    Science.gov (United States)

    Latouche, Camille; Bloino, Julien; Barone, Vincenzo

    2014-06-01

    Advances in hardware performance and the availability of efficient and reliable computational models have made possible the application of computational spectroscopy to ever larger molecular systems. The systematic interpretation of experimental data and the full characterization of complex molecules can then be facilitated. Focusing on vibrational spectroscopy, several approaches have been proposed to simulate spectra beyond the double harmonic approximation, so that more details become available. However, a routine use of such tools requires the preliminary definition of a valid protocol with the most appropriate combination of electronic structure and nuclear calculation models. Several benchmarks of anharmonic frequency calculations have been performed on organic molecules. Nevertheless, benchmarks at this level are strongly lacking for organometallic and inorganic metal complexes, despite the interest of these systems due to their strong emission and vibrational properties. Herein we report a benchmark study of anharmonic calculations on simple metal complexes, along with some pilot applications on systems of direct technological or biological interest.
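As a reminder of what "beyond the double harmonic approximation" entails, second-order vibrational perturbation theory (VPT2) is one widely used scheme (the record does not specify which anharmonic treatment this benchmark employs): each harmonic wavenumber \(\omega_i\) is corrected by anharmonicity constants \(\chi_{ij}\) to give the fundamental band position

```latex
\nu_i = \omega_i + 2\chi_{ii} + \frac{1}{2}\sum_{j \neq i} \chi_{ij}
```

so the computed spectrum shifts from the harmonic positions toward the experimentally observed fundamentals.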

  9. The cognitive complexity of concurrent cognitive-motor tasks reveals age-related deficits in motor performance

    DEFF Research Database (Denmark)

    Oliveira, Anderson Souza; Reiche, Mikkel Staall; Vinescu, Cristina Ioana

    2018-01-01

    Aging reduces cognitive functions, and such impairments have implications in mental and motor performance. Cognitive function has been recently linked to the risk of falls in older adults. Physical activities have been used to attenuate the declines in cognitive functions and reduce fall incidence......, but little is known whether a physically active lifestyle can maintain physical performance under cognitively demanding conditions. The aim of this study was to verify whether physically active older adults present similar performance deficits during upper limb response time and precision stepping walking...... tasks when compared to younger adults. Both upper limb and walking tasks involved simple and complex cognitive demands through decision-making. For both tasks, decision-making was assessed by including a distracting factor to the execution. The results showed that older adults were substantially slower...

  10. Improving Operational Risk Management Using Business Performance Management Technologies

    OpenAIRE

    Bram Pieket Weeserik; Marco Spruit

    2018-01-01

    Operational Risk Management (ORM) comprises the continuous management of risks resulting from: human actions, internal processes, systems, and external events. With increasing requirements, complexity and a growing volume of risks, information systems provide benefits for integrating risk management activities and optimizing performance. Business Performance Management (BPM) technologies are believed to provide a solution for effective Operational Risk Management by offering several combined ...

  11. The NSL Complex Regulates Housekeeping Genes in Drosophila

    Science.gov (United States)

    Raja, Sunil Jayaramaiah; Holz, Herbert; Luscombe, Nicholas M.; Manke, Thomas; Akhtar, Asifa

    2012-01-01

    MOF is the major histone H4 lysine 16-specific (H4K16) acetyltransferase in mammals and Drosophila. In flies, it is involved in the regulation of X-chromosomal and autosomal genes as part of the MSL and the NSL complexes, respectively. While the function of the MSL complex as a dosage compensation regulator is fairly well understood, the role of the NSL complex in gene regulation is still poorly characterized. Here we report a comprehensive ChIP–seq analysis of four NSL complex members (NSL1, NSL3, MBD-R2, and MCRS2) throughout the Drosophila melanogaster genome. Strikingly, the majority (85.5%) of NSL-bound genes are constitutively expressed across different cell types. We find that an increased abundance of the histone modifications H4K16ac, H3K4me2, H3K4me3, and H3K9ac in gene promoter regions is characteristic of NSL-targeted genes. Furthermore, we show that these genes have a well-defined nucleosome free region and broad transcription initiation patterns. Finally, by performing ChIP–seq analyses of RNA polymerase II (Pol II) in NSL1- and NSL3-depleted cells, we demonstrate that both NSL proteins are required for efficient recruitment of Pol II to NSL target gene promoters. The observed Pol II reduction coincides with compromised binding of TBP and TFIIB to target promoters, indicating that the NSL complex is required for optimal recruitment of the pre-initiation complex on target genes. Moreover, genes that undergo the most dramatic loss of Pol II upon NSL knockdowns tend to be enriched in DNA Replication–related Element (DRE). Taken together, our findings show that the MOF-containing NSL complex acts as a major regulator of housekeeping genes in flies by modulating initiation of Pol II transcription. PMID:22723752

  12. Westinghouse loading pattern search methodology for complex core designs

    International Nuclear Information System (INIS)

    Chao, Y.A.; Alsop, B.H.; Johansen, B.J.; Morita, T.

    1991-01-01

    Pressurized water reactor core designs have become more complex and must meet a plethora of design constraints. Trends have been toward longer cycles with increased discharge burnup, increased burnable absorber (BA) number, mixed BA types, reduced radial leakage, axially blanketed fuel, and multiple-batch feed fuel regions. Obtaining economical reload core loading patterns (LPs) that meet design criteria is a difficult task to do manually. Automated LP search tools are needed. An LP search tool cannot possibly perform an exhaustive search because of the sheer size of the combinatorial problem. On the other hand, evolving complexity of the design features and constraints often invalidates expert rules based on past design experiences. Westinghouse has developed a sophisticated loading pattern search methodology. This methodology is embodied in the LPOP code, which Westinghouse nuclear designers use extensively. The LPOP code generates a variety of LPs meeting design constraints and performs a two-cycle economic evaluation of the generated LPs. The designer selects the most appropriate patterns for fine tuning and evaluation by the design codes. This paper describes the major features of the LPOP methodology that are relevant to fulfilling the aforementioned requirements. Data and examples are also provided to demonstrate the performance of LPOP in meeting the complex design needs

  13. The Effect of Focus on Form and Task Complexity on L2 Learners' Oral Task Performance

    Science.gov (United States)

    Salimi, Asghar

    2015-01-01

    Second Language learners' oral task performance has been one of interesting and research generating areas of investigations in the field of second language acquisition specially, task-based language teaching and learning. The main purpose of the present study is to investigate the effect of focus on form and task complexity on L2 learners' oral…

  14. Transforming Multidisciplinary Customer Requirements to Product Design Specifications

    Science.gov (United States)

    Ma, Xiao-Jie; Ding, Guo-Fu; Qin, Sheng-Feng; Li, Rong; Yan, Kai-Yin; Xiao, Shou-Ne; Yang, Guang-Wu

    2017-09-01

    With the increasing complexity of mechatronic products, it is necessary to involve multidisciplinary design teams; traditional customer requirements modeling for a single-discipline team thus becomes difficult to apply in a multidisciplinary team and project, since team members with various disciplinary backgrounds may have different interpretations of the customers' requirements. A new synthesized multidisciplinary customer requirements modeling method is provided for obtaining and describing a common understanding of customer requirements (CRs) and, more importantly, transferring them into detailed and accurate product design specifications (PDS) so as to interact with different team members effectively. A case study of designing a high speed train verifies the rationality and feasibility of the proposed multidisciplinary requirement modeling method for complex mechatronic product development. The proposed research offers guidance for realizing customer-driven personalized customization of complex mechatronic products.

  15. Complex Functions with GeoGebra

    Science.gov (United States)

    Breda, Ana Maria D'azevedo; Dos Santos, José Manuel Dos Santos

    2016-01-01

    Complex functions, generally feature some interesting peculiarities, seen as extensions of real functions. The visualization of complex functions properties usually requires the simultaneous visualization of two-dimensional spaces. The multiple Windows of GeoGebra, combined with its ability of algebraic computation with complex numbers, allow the…

  16. Defense Organization Officials Did Not Consistently Comply With Requirements for Assessing Contractor Performance

    Science.gov (United States)

    2017-02-01

    must evaluate compliance with reporting requirements frequently so they can readily identify delinquent past performance reports. The FAR also... problems the contractor recovered from without impact to the contract/order. There should have been no significant weaknesses identified. A... contractor had trouble overcoming and state how it impacted the Government. A Marginal rating should be supported by referencing the management tool

  17. Requirements management at Westinghouse Electric Company

    International Nuclear Information System (INIS)

    Gustavsson, Henrik

    2014-01-01

    Field studies and surveys made in various industry branches support the Westinghouse opinion that qualitative systems engineering and requirements management have a high value in the development of complex systems and products. Two key issues causing overspending and schedule delays in projects are underestimation of complexity and misunderstandings between the different sub-project teams. These issues often arise when a project jumps too early into detail design. Good requirements management practice before detail design helps the project teams avoid such issues. Westinghouse therefore puts great effort into requirements management. The requirements management methodology at Westinghouse rests primarily on four key cornerstones: 1 - Iterative team work when developing requirements specifications, 2 - Id number tags on requirements, 3 - Robust change routine, and 4 - Requirements Traceability Matrix. (authors)
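The fourth cornerstone, the Requirements Traceability Matrix, can be sketched as a mapping from design elements to the requirement ID tags (cornerstone two) they satisfy; the IDs and element names below are invented for illustration, not Westinghouse artifacts:

```python
def untraced(requirements, trace_matrix):
    """Requirement IDs that no design element traces back to.

    requirements: iterable of requirement ID tags (e.g. "REQ-001").
    trace_matrix: maps a design-element name to the set of requirement
    IDs it satisfies (a minimal Requirements Traceability Matrix).
    """
    covered = set()
    for req_ids in trace_matrix.values():
        covered.update(req_ids)
    return sorted(set(requirements) - covered)

reqs = ["REQ-001", "REQ-002", "REQ-003"]
rtm = {"PumpController": {"REQ-001"}, "ValveInterlock": {"REQ-003"}}
print(untraced(reqs, rtm))  # -> ['REQ-002']
```

Running such a coverage check before detail design is exactly the kind of practice that surfaces misunderstandings between sub-project teams while they are still cheap to fix.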

  18. Recruitment of Mediator Complex by Cell Type and Stage-Specific Factors Required for Tissue-Specific TAF Dependent Gene Activation in an Adult Stem Cell Lineage.

    Directory of Open Access Journals (Sweden)

    Chenggang Lu

    2015-12-01

    Full Text Available Onset of terminal differentiation in adult stem cell lineages is commonly marked by robust activation of new transcriptional programs required to make the appropriate differentiated cell type(s). In the Drosophila male germ line stem cell lineage, the switch from proliferating spermatogonia to spermatocyte is accompanied by one of the most dramatic transcriptional changes in the fly, as over 1000 new transcripts turn on in preparation for meiosis and spermatid differentiation. Here we show that function of the coactivator complex Mediator is required for activation of hundreds of new transcripts in the spermatocyte program. Mediator appears to act in a sequential hierarchy, with the testis activating Complex (tMAC), a cell type specific form of the Mip/dREAM general repressor, required to recruit Mediator subunits to the chromatin, and Mediator function required to recruit the testis TAFs (tTAFs, spermatocyte specific homologs of subunits of TFIID). Mediator, tMAC and the tTAFs co-regulate expression of a major set of spermatid differentiation genes. The Mediator subunit Med22 binds the tMAC component Topi when the two are coexpressed in S2 cells, suggesting direct recruitment. Loss of Med22 function in spermatocytes causes meiosis I maturation arrest and male infertility, similar to loss of function of the tMAC subunits or the tTAFs. Our results illuminate how cell type specific versions of the Mip/dREAM complex and the general transcription machinery cooperate to drive selective gene activation during differentiation in stem cell lineages.

  19. Typical Complexity Numbers

    Indian Academy of Sciences (India)

    Typical Complexity Numbers. Say 1000 tones, 100 users, transmission every 10 msec. Full crosstalk cancellation requires a matrix multiplication of order 100*100 for all the tones: 1000*100*100*100 operations every second for the ...
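The arithmetic behind these numbers can be checked directly (treating the 100*100 cancellation matrix as ~100*100 multiply-accumulates per tone, with one transmission every 10 msec, as the figures above state):

```python
tones = 1000
users = 100
transmissions_per_second = 1000 // 10  # one transmission every 10 msec

# Applying a users x users cancellation matrix costs ~users*users
# multiply-accumulates per tone; repeat for every tone, every transmission.
ops_per_second = tones * users * users * transmissions_per_second
print(ops_per_second)  # -> 1000000000, i.e. 10^9 operations per second
```

A sustained 10^9 multiply-accumulates per second is why full crosstalk cancellation at this scale is considered computationally demanding.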

  20. A high throughput architecture for a low complexity soft-output demapping algorithm

    Science.gov (United States)

    Ali, I.; Wasenmüller, U.; Wehn, N.

    2015-11-01

    Iterative channel decoders such as Turbo-Code and LDPC decoders show exceptional performance and therefore they are a part of many wireless communication receivers nowadays. These decoders require a soft input, i.e., the logarithmic likelihood ratio (LLR) of the received bits with a typical quantization of 4 to 6 bits. For computing the LLR values from a received complex symbol, a soft demapper is employed in the receiver. The implementation cost of traditional soft-output demapping methods is relatively large in high order modulation systems, and therefore low complexity demapping algorithms are indispensable in low power receivers. In the presence of multiple wireless communication standards where each standard defines multiple modulation schemes, there is a need to have an efficient demapper architecture covering all the flexibility requirements of these standards. Another challenge associated with hardware implementation of the demapper is to achieve a very high throughput in double iterative systems, for instance, MIMO and Code-Aided Synchronization. In this paper, we present a comprehensive communication and hardware performance evaluation of low complexity soft-output demapping algorithms to select the best algorithm for implementation. The main goal of this work is to design a high throughput, flexible, and area efficient architecture. We describe architectures to execute the investigated algorithms. We implement these architectures on an FPGA device to evaluate their hardware performance. The work has resulted in a hardware architecture based on the best low-complexity algorithm identified, delivering a high throughput of 166 Msymbols/second for Gray mapped 16-QAM modulation on Virtex-5. This efficient architecture occupies only 127 slice registers, 248 slice LUTs and 2 DSP48Es.
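The max-log approximation is one common low-complexity soft-demapping scheme of the kind surveyed here (the record does not say which algorithm the authors ultimately selected). For Gray-mapped 16-QAM, each axis independently carries two bits of a 4-PAM constellation, and the per-bit LLR reduces to a difference of minimum squared distances:

```python
# Gray-mapped 4-PAM levels on one axis of 16-QAM: bits (b0, b1) -> level.
PAM = {(0, 0): -3.0, (0, 1): -1.0, (1, 1): 1.0, (1, 0): 3.0}

def maxlog_llrs(y, noise_var=1.0):
    """Max-log LLRs of the two bits carried by one 16-QAM axis.

    LLR(b) = (min over s with b=1 of |y-s|^2
              - min over s with b=0 of |y-s|^2) / noise_var,
    so a positive LLR favours bit value 0 (one common sign convention).
    """
    llrs = []
    for bit_pos in (0, 1):
        d0 = min((y - lvl) ** 2 for bits, lvl in PAM.items() if bits[bit_pos] == 0)
        d1 = min((y - lvl) ** 2 for bits, lvl in PAM.items() if bits[bit_pos] == 1)
        llrs.append((d1 - d0) / noise_var)
    return llrs

llr0, llr1 = maxlog_llrs(-2.7)  # received near level -3, i.e. bits (0, 0)
print(llr0 > 0, llr1 > 0)  # -> True True: demapper favours 0 for both bits
```

Replacing the exact log-sum-exp with per-bit minimum distances is what removes the exponentials and logarithms from the datapath, which is the main source of the hardware savings such demappers target.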

  1. The role of human performance in safe operation of complex plants

    International Nuclear Information System (INIS)

    Preda, Irina Aida; Lazar, Roxana Elena; Croitoru, Cornelia

    1999-01-01

    According to statistics, about 20-30% of the failures occurring in plants are caused directly or indirectly by human errors. Furthermore, it was established that 10-15 percent of global failures are related to human errors, mainly due to wrong actions, maintenance errors, and misinterpretation of instruments. Human performance is influenced by: professional ability, complexity and danger of the plant, experience in the same working place, level of skills, events in personal and/or professional life, discipline, social ambience and somatic health. Human performance assessment in probabilistic safety assessment offers the possibility of evaluating the human contribution to the outcome of event sequences. A human error may be recovered before unwanted consequences occur in the system. This paper presents the possibilities of using probabilistic methods (event tree, fault tree) to identify solutions for human reliability improvement in order to minimise the risk in industrial plant operation. The human error types and their causes are also defined, and the 'decision tree method' is presented as the technique used in our analyses for human reliability assessment. The human error analysis method is exemplified using operational data from the Valcea heavy water pilot plant. (authors)
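The event-tree style of analysis described here multiplies conditional branch probabilities along each accident sequence; a toy sketch with invented probabilities (not Valcea plant data) shows how a recovery branch reduces the unrecovered-error probability:

```python
def sequence_probability(branch_probs):
    """Probability of one event-tree sequence: the product of the
    conditional probabilities along its branches."""
    p = 1.0
    for prob in branch_probs:
        p *= prob
    return p

# Hypothetical decision-tree branches for one operator action:
# P(initial error) * P(misdiagnosis) * P(recovery fails in time).
p_error, p_misdiag, p_no_recovery = 0.01, 0.2, 0.5
p_unrecovered = sequence_probability([p_error, p_misdiag, p_no_recovery])
print(p_unrecovered)  # ~0.001: recovery paths cut the raw error rate tenfold
```

This is the quantitative sense in which "a human error may be recovered before unwanted consequences occur": each recovery branch multiplies the sequence probability by a factor well below one.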

  2. 20 CFR 641.879 - What are the fiscal and performance reporting requirements for recipients?

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 3 2010-04-01 2010-04-01 false What are the fiscal and performance reporting requirements for recipients? 641.879 Section 641.879 Employees' Benefits EMPLOYMENT AND TRAINING ADMINISTRATION... Quarterly Progress Report (QPR) to the Department in electronic format via the Internet within 30 days after...

  3. An experimental study of the effect of octane number higher than engine requirement on the engine performance and emissions

    Energy Technology Data Exchange (ETDEWEB)

    Sayin, Cenk; Kilicaslan, Ibrahim; Canakci, Mustafa; Ozsezen, Necati [Kocaeli Univ., Dept. of Mechanical Education, Izmit (Turkey)

    2005-06-01

    In this study, the effect of using gasoline with a higher octane rating than the engine requires on performance and exhaust emissions was experimentally investigated. The test engine chosen has a carburettor fuel system because 60% of the vehicles in Turkey are equipped with carburettors. The engine, which required 91-RON (Research Octane Number) gasoline, was tested using 95-RON and 91-RON. Results show that using octane ratings higher than an engine's requirement not only decreases engine performance but also increases exhaust emissions. (Author)

  4. Uncertainty Requirement Analysis for the Orbit, Attitude, and Burn Performance of the 1st Lunar Orbit Insertion Maneuver

    Directory of Open Access Journals (Sweden)

    Young-Joo Song

    2016-12-01

    Full Text Available In this study, the uncertainty requirements for orbit, attitude, and burn performance were estimated and analyzed for the execution of the 1st lunar orbit insertion (LOI) maneuver of the Korea Pathfinder Lunar Orbiter (KPLO) mission. During the early design phase of the system, the associated analysis is an essential design factor, as the 1st LOI maneuver is the largest burn that utilizes the onboard propulsion system; the success of the lunar capture is directly affected by the performance achieved. For the analysis, the spacecraft is assumed to have already approached the periselene with a hyperbolic arrival trajectory around the moon. In addition, diverse arrival conditions and mission constraints were considered, such as varying periselene approach velocity, altitude, and orbital period of the capture orbit after execution of the 1st LOI maneuver. The current analysis assumed an impulsive LOI maneuver, and two-body equations of motion were adapted to simplify the problem for a preliminary analysis. Monte Carlo simulations were performed for the statistical analysis of the diverse uncertainties that might arise at the moment the maneuver is executed. As a result, three major requirements were analyzed and estimated for the early design phase. First, the minimum requirements for the burn performance needed to be captured around the moon were estimated. Second, the requirements for orbit, attitude, and maneuver burn performances were simultaneously estimated and analyzed to maintain the 1st elliptical orbit achieved around the moon within the specified orbital period. Finally, the dispersion requirements on the B-plane aiming at target points to meet the target insertion goal were analyzed and can be utilized as reference target guidelines for a mid-course correction (MCC) maneuver during the transfer.
More detailed system requirements for the KPLO mission, particularly for the spacecraft bus itself and for the flight dynamics subsystem at the ground
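Under the impulsive-burn, two-body assumptions stated in the abstract, the LOI ΔV is simply the gap between the hyperbolic periselene speed and the vis-viva speed of the target capture ellipse, and burn-magnitude dispersions can be Monte Carlo sampled around it. The numbers below are illustrative only, not KPLO design values:

```python
import math
import random

MU_MOON = 4902.8  # km^3/s^2, lunar gravitational parameter

def loi_delta_v(v_inf, r_peri, a_capture):
    """Impulsive LOI delta-V at periselene (two-body, tangential burn).

    v_inf: hyperbolic excess speed [km/s]; r_peri: periselene radius [km];
    a_capture: semi-major axis of the target capture ellipse [km].
    """
    v_hyp = math.sqrt(v_inf ** 2 + 2.0 * MU_MOON / r_peri)  # hyperbolic speed
    v_ell = math.sqrt(MU_MOON * (2.0 / r_peri - 1.0 / a_capture))  # vis-viva
    return v_hyp - v_ell

# Illustrative Monte Carlo on burn-magnitude dispersion (2% 1-sigma,
# a made-up figure for the sketch): how often does the burn fall more
# than 5% short of nominal?
random.seed(1)
nominal = loi_delta_v(v_inf=0.4, r_peri=1837.4, a_capture=6000.0)
trials = 10_000
shortfalls = sum(
    1 for _ in range(trials)
    if nominal * random.gauss(1.0, 0.02) < nominal * 0.95
)
print(nominal > 0, shortfalls / trials)  # capture needs a decelerating burn
```

Repeating this with dispersions on approach state and attitude as well, and checking the resulting capture-orbit period against its tolerance, is the shape of the statistical requirement analysis the abstract describes.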

  5. Organization structures for dealing with complexity

    NARCIS (Netherlands)

    Meijer, B.R.

    2006-01-01

    "Complexity is in the eye of the beholder" is a well known quote in the research field of complexity. In the world of managers the word complex is often a synonym for difficult, complicated, involving many factors and highly uncertain. A complex business decision requires careful preparation and

  6. Reciprocal Modulation of Cognitive and Emotional Aspects in Pianistic Performances

    OpenAIRE

    Higuchi, Marcia K. Kodama; Fornari, José; Del Ben, Cristina M.; Graeff, Frederico G.; Leite, João Pereira

    2011-01-01

    Background: High level piano performance requires complex integration of perceptual, motor, cognitive and emotive skills. Observations in psychology and neuroscience studies have suggested reciprocal inhibitory modulation of the cognition by emotion and emotion by cognition. However, it is still unclear how cognitive states may influence the pianistic performance. The aim of the present study is to verify the influence of cognitive and affective attention in the piano performances. Methods an...

  7. Performance-oriented Architecture and the Spatial and Material Organisation Complex. Rethinking the Definition, Role and Performative Capacity of the Spatial and Material Boundaries of the Built Environment

    Directory of Open Access Journals (Sweden)

    Michael Ulrich Hensel

    2011-03-01

    Full Text Available This article is based on the proposition that performance-oriented design is characterised by four domains of ‘active agency’: the human subject, the spatial and material organisation complex and the environment (Hensel, 2010). While these four domains are seen to be interdependent and interacting with one another, it is nevertheless necessary to examine each in its own right. However, the spatial and material organisation complex contains both the spatial and material domains, which are interdependent to such a degree that these need to be examined in relation to one another and also in relation to the specific environment they are set within and interacting with. To explore this combined domain within the context of performance-oriented design is the aim of this article, particularly in relation to the question of the definition and performative capacity of spatial and material boundaries. The various sections are accompanied by research-by-design efforts undertaken in specified academic contexts, which are intended as examples of modes and areas of inquiry relative to the purpose of this article.

  8. Sequestration Coating Performance Requirements for ...

    Science.gov (United States)

    Symposium paper. The EPA’s National Homeland Security Research Center (NHSRC), in collaboration with ASTM International, developed performance standards for materials which could be applied to exterior surfaces contaminated by an RDD to mitigate the spread and migration of radioactive contamination.

  9. Requirements for high performance computing for lattice QCD. Report of the ECFA working panel

    International Nuclear Information System (INIS)

    Jegerlehner, F.; Kenway, R.D.; Martinelli, G.; Michael, C.; Pene, O.; Petersson, B.; Petronzio, R.; Sachrajda, C.T.; Schilling, K.

    2000-01-01

    This report, prepared at the request of the European Committee for Future Accelerators (ECFA), contains an assessment of the High Performance Computing resources which will be required in coming years by European physicists working in Lattice Field Theory and a review of the scientific opportunities which these resources would open. (orig.)

  10. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Directory of Open Access Journals (Sweden)

    Heinz-Martin Süß

    2018-05-01

    Full Text Available The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly

  11. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Science.gov (United States)

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the

  12. Base technology development enhances state-of-the-art in meeting performance requirements

    International Nuclear Information System (INIS)

    Freedman, J.M.; Allen, G.C. Jr.; Luna, R.E.

    1987-01-01

    Sandia National Laboratories (SNL) has responsibility to the United States Department of Energy (DOE) for baseline technology to support the design of radioactive material transportation packages. To fulfill this responsibility, SNL works with industry, government agencies, and national laboratories to identify and develop state-of-the-art technology required to design and test safe, cost-effective radioactive materials packages. Principal elements of the base technology program include: 1) analysis techniques, 2) testing, 3) subsystem and component development, 4) packaging systems development support, and 5) technical support for policy development. These program elements support a systems approach for meeting performance requirements and assure that there is a sound underlying technical basis for both transportation packaging design and associated policy decisions. Highlights from the base technology program included in this paper are testing, design and analysis methods, advanced materials, risk assessment and logistics models, and transportation package support

  13. Y-12 National Security Complex Emergency Management Hazards Assessment (EMHA) Process; FINAL

    International Nuclear Information System (INIS)

    Bailiff, E.F.; Bolling, J.D.

    2001-01-01

    This document establishes requirements and standard methods for the development and maintenance of the Emergency Management Hazards Assessment (EMHA) process used by the lead and all event contractors at the Y-12 Complex for emergency planning and preparedness. The EMHA process provides the technical basis for the Y-12 emergency management program. The instructions provided in this document include methods and requirements for performing the following emergency management activities at Y-12: (1) hazards identification; (2) hazards survey; and (3) hazards assessment.

  14. Construction and clinical application of complex utility programs in the SEGAMS-80 system

    International Nuclear Information System (INIS)

    Mate, E.; Csirik, J.; Csernay, L.; Makay, A.

    1981-01-01

    SEGAMS-80 is a system that allows physicians to process isotope-diagnostic pictures easily and safely. The functions built into the system form a tree structure. In certain stages of processing, tables compiled according to the medical point of view show the identification and a short description of the currently performable functions. The functions available allow an interactive performance of diagnostic processes for different purposes. Interactivity is undesirable when processing routine examinations, since the functions to be performed, their sequence and parameters could be identical in all cases. SEGAMS-80 makes it possible to construct complex programs, to put them into the system and execute them. During the complex program the desired functions are executed automatically. The operator's intervention is needed only where the author of the complex program has stated that it is necessary from a medical standpoint. Experience gained with several SEGAMS-80 systems has shown that they can be successfully used in isotope diagnostics, without requiring any training in computing techniques from the physicians. A schematic description is given of the structure of SEGAMS-80 together with a detailed account of how to construct complex utility programs. (author)

  15. A Hybrid Testbed for Performance Evaluation of Large-Scale Datacenter Networks

    DEFF Research Database (Denmark)

    Pilimon, Artur; Ruepp, Sarah Renée

    2018-01-01

    Datacenters (DC) as well as their network interconnects are growing in scale and complexity. They are constantly being challenged in terms of energy and resource utilization efficiency, scalability, availability, reliability and performance requirements. Therefore, these resource-intensive environments must be properly tested and analyzed in order to make timely upgrades and transformations. However, a limited number of academic institutions and Research and Development companies have access to production-scale DC Network (DCN) testing facilities, and resource-limited studies can produce misleading or inaccurate results. To address this problem, we introduce an alternative solution, which forms a solid base for a more realistic and comprehensive performance evaluation of different aspects of DCNs. It is based on the System-in-the-loop (SITL) concept, where real commercial DCN equipment...

  16. Postural adjustments are modulated by manual task complexity

    Directory of Open Access Journals (Sweden)

    Luis Augusto Teixeira

    2009-09-01

    Daily life activities of humans are characterized by dual tasks, in which a manual task is performed concomitantly with a postural task. Based on the assumption that both manual and postural tasks require attentional resources, no consensus exists as to how the central nervous system modulates postural adjustments in dual tasks. The aim of the present study was to analyze the effect of a manual task requiring attentional resources on shoulder and ankle adjustments as a function of the direction and predictability of postural perturbation. The participants (n=6) were evaluated during the performance of a simple and a complex manual task, while the base of support was moved backward or forward. Latency of activation of the tibialis anterior and gastrocnemius muscles and angular acceleration of the shoulder were analyzed. The results showed that execution of the complex manual task delayed postural adjustment. Moreover, this delay occurred differently depending on the direction of postural perturbation. The delay in postural adjustment occurred proximally in the case of anterior displacement of the platform, and distally in the case of posterior displacement. Postural adjustments were more affected by the attentional task than by the predictability of platform displacement. These results are consistent with the concept of an integrated control between manual actions and the maintenance of static posture.

  17. Analytic network process model for sustainable lean and green manufacturing performance indicator

    Science.gov (United States)

    Aminuddin, Adam Shariff Adli; Nawawi, Mohd Kamal Mohd; Mohamed, Nik Mohd Zuki Nik

    2014-09-01

    Sustainable manufacturing is regarded as the most complex manufacturing paradigm to date, as it holds the widest scope of requirements. In addition, its three major pillars of economy, environment and society, though distinct, overlap in several of their elements. Even though the concept of sustainability is not new, the development of the performance indicator still needs a lot of improvement due to its multifaceted nature, which requires an integrated approach to solve the problem. This paper proposes the best combination of criteria for forming a robust sustainable manufacturing performance indicator via the Analytic Network Process (ANP). The integrated lean, green and sustainable ANP model can be used to comprehend the complex decision system of the sustainability assessment. The finding shows that green manufacturing is more sustainable than lean manufacturing. It also illustrates that procurement practice is the most important criterion in the sustainable manufacturing performance indicator.
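
A core building block of ANP (as of AHP) is deriving priority weights from a pairwise comparison matrix. The sketch below shows this step via power iteration (the normalized principal eigenvector); the 3x3 judgment matrix comparing hypothetical "lean", "green" and "procurement" criteria is invented for illustration, since the record does not report the authors' actual comparisons.

```python
# Derive priority weights from a positive pairwise comparison matrix
# by power iteration; the result is the principal eigenvector,
# normalized so the weights sum to 1. Matrix values are invented.

def priority_weights(matrix, iterations=100):
    """Return normalized principal-eigenvector weights of a positive
    reciprocal comparison matrix."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [x / total for x in w]
    return w

comparisons = [
    [1.0, 1 / 3, 1 / 2],   # lean vs (lean, green, procurement)
    [3.0, 1.0, 2.0],       # green judged more important than both
    [2.0, 1 / 2, 1.0],     # procurement
]
weights = priority_weights(comparisons)
print([round(w, 3) for w in weights])
```

In a full ANP model, local priority vectors like this populate the supermatrix, whose limit powers yield the global criterion weights.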

  18. Complexities in innovation management in companies from the European industry. A path model of innovation project performance determinants

    NARCIS (Netherlands)

    Tepic, M.; Kemp, R.G.M.; Omta, S.W.F.; Fortuin, F.T.J.M.

    2013-01-01

    Purpose – The purpose of this paper is to provide an integrated framework of complex relations among innovation characteristics, organizational capabilities, innovation potential and innovation performance. Design/methodology/approach – The model is tested using partial least squares (PLS) modeling

  19. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    NoC-specific parameters have a huge impact on the performance and implementation costs of NoCs. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design-stages, but the requirements on performance analysis differ from stage to stage. In an early design-stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design-stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  20. Assessment of the potential human health risks from exposure to complex substances in accordance with REACH requirements. "White spirit" as a case study.

    Science.gov (United States)

    McKee, Richard H; Tibaldi, Rosalie; Adenuga, Moyinoluwa D; Carrillo, Juan-Carlos; Margary, Alison

    2018-02-01

    The European chemical control regulation (REACH) requires that data on physical/chemical, toxicological and environmental hazards be compiled. Additionally, REACH requires formal assessments to ensure that substances can be safely used for their intended purposes. For health hazard assessments, reference values (Derived No Effect levels, DNELs) are calculated from toxicology data and compared to estimated exposure levels. If the ratio of the predicted exposure level to the DNEL, i.e. the Risk Characterization Ratio (RCR), is less than 1, the risk is considered controlled; otherwise, additional Risk Management Measures (RMM) must be applied. These requirements pose particular challenges for complex substances. Herein, "white spirit", a complex hydrocarbon solvent, is used as an example to illustrate how these procedures were applied. Hydrocarbon solvents were divided into categories of similar substances. Representative substances were identified for DNEL determinations. Adjustment factors were applied to the no effect levels to calculate the DNELs. Exposure assessments utilized a standardized set of generic exposure scenarios (GES) which incorporated exposure predictions for solvent handling activities. Computer-based tools were developed to automate RCR calculations and identify appropriate RMMs, allowing consistent communications to users via safety data sheets. Copyright © 2017 ExxonMobil Biomedical Sciences Inc. Published by Elsevier Inc. All rights reserved.
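
The RCR decision rule described in this record is simple enough to sketch. The numeric exposure level and DNEL below are invented placeholders; real DNELs come from toxicology data and real exposure estimates from the generic exposure scenarios (GES).

```python
# Hedged sketch of the REACH risk characterization step: compare a
# predicted exposure level to the Derived No Effect Level (DNEL).

def risk_characterization_ratio(exposure_level: float, dnel: float) -> float:
    """RCR = predicted exposure level / Derived No Effect Level (DNEL)."""
    return exposure_level / dnel

def risk_is_controlled(rcr: float) -> bool:
    """Risk is considered controlled when RCR < 1; otherwise additional
    Risk Management Measures (RMM) must be applied."""
    return rcr < 1.0

# Hypothetical worker inhalation scenario for a solvent:
exposure = 25.0   # mg/m^3, predicted exposure (invented value)
dnel = 100.0      # mg/m^3, long-term inhalation DNEL (invented value)
rcr = risk_characterization_ratio(exposure, dnel)
print(rcr, risk_is_controlled(rcr))  # 0.25 True
```

The computer-based tools the record mentions automate exactly this comparison across many substance/use combinations, then map any RCR ≥ 1 to appropriate RMMs.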

  1. Software complex for geophysical data visualization

    Science.gov (United States)

    Kryukov, Ilya A.; Tyugin, Dmitry Y.; Kurkin, Andrey A.; Kurkina, Oxana E.

    2013-04-01

    The effectiveness of current research in geophysics is largely determined by the degree of implementation of data processing and visualization procedures using modern information technology. Realistic and informative visualization of the results of three-dimensional modeling of geophysical processes contributes significantly to the naturalness of physical modeling and a detailed view of the phenomena. The main difficulty in this case is to interpret the results of the calculations: it is necessary to be able to observe the various parameters of the three-dimensional models, build sections on different planes to evaluate certain characteristics and make a rapid assessment. Programs for interpretation and visualization of simulations are widespread all over the world, for example, software systems such as ParaView, Golden Software Surfer, Voxler, Flow Vision and others. However, it is not always possible to solve the problem of visualization with the help of a single software package. Preprocessing, data transfer between the packages and setting up a uniform visualization style can turn into long and routine work. In addition, special display modes are sometimes required for specific data, and existing products tend to offer more common features and are not always fully applicable to certain special cases. Rendering of dynamic data may require scripting languages, which do not relieve the user from writing code. Therefore, the task was to develop a new and original software complex for the visualization of simulation results. Let us briefly list the primary features that were developed. The software complex is a graphical application with a convenient and simple user interface that displays the results of the simulation. The complex is also able to interactively manage the image, resize the image without loss of quality, apply a two-dimensional and three-dimensional regular grid, set the coordinate axes with data labels and perform slices of the data. The

  2. Comparison of surface extraction techniques performance in computed tomography for 3D complex micro-geometry dimensional measurements

    DEFF Research Database (Denmark)

    Torralba, Marta; Jiménez, Roberto; Yagüe-Fabra, José A.

    2018-01-01

    The number of industrial applications of computed tomography (CT) for dimensional metrology in the 10⁰–10³ mm range has been continuously increasing, especially in the last years. Due to its specific characteristics, CT has the potential to be employed as a viable solution for measuring 3D complex micro-geometries as well (i.e., in the sub-mm dimensional range). However, there are different factors that may influence the CT process performance, being one of them the surface extraction technique used. In this paper, two different extraction techniques are applied to measure a complex miniaturized dental file by CT in order to analyze its contribution to the final measurement uncertainty in complex geometries at the mm to sub-mm scales. The first method is based on a similarity analysis: the threshold determination; while the second one is based on a gradient or discontinuity analysis: the 3D...

  3. Low Complexity V-BLAST MIMO-OFDM Detector by Successive Iterations Reduction

    Directory of Open Access Journals (Sweden)

    AHMED, K.

    2015-02-01

    V-BLAST detection suffers from large computational complexity due to its successive detection of symbols. In this paper, we propose a modified V-BLAST algorithm that decreases the computational complexity by reducing the number of detection iterations required in MIMO communication systems. We begin by showing the existence of a maximum number of iterations, beyond which no significant improvement is obtained. We establish a criterion for the maximum number of effective iterations. We propose a modified algorithm that uses the measured SNR to dynamically set the number of iterations to achieve an acceptable bit-error rate. Then, we replace the feedback algorithm with an approximate linear function to reduce the complexity. Simulations show that a significant reduction in computational complexity is achieved compared to the ordinary V-BLAST, while maintaining a good BER performance.
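
The SNR-driven choice of iteration count that the abstract describes can be sketched as follows. This is an illustrative reading only, not the authors' exact algorithm; the slope, intercept and iteration cap are invented placeholders.

```python
# Map measured SNR to a number of V-BLAST detection iterations using a
# clipped approximate-linear function, in the spirit of the abstract:
# beyond a maximum number of iterations, no significant BER
# improvement is assumed. All parameter values are invented.

def effective_iterations(snr_db: float, slope: float = 0.5,
                         intercept: float = 1.0, max_iters: int = 8) -> int:
    """Return the number of successive detection iterations for a
    measured SNR (dB), clipped to the range [1, max_iters]."""
    n = round(slope * snr_db + intercept)
    return max(1, min(max_iters, n))

for snr in (0.0, 6.0, 12.0, 30.0):
    print(snr, effective_iterations(snr))
```

The point of the linear surrogate is that it replaces the per-symbol feedback computation with a single cheap evaluation per SNR measurement.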

  4. Enhanced Massive Visualization of Engines Performance

    International Nuclear Information System (INIS)

    Rostand, N D; Eglantine, H; Jerôme, L

    2012-01-01

    Transport systems are becoming increasingly complex in order to meet requirements of safety, security, reliability and efficiency. Such systems are generally equipped with drive systems, and it falls to engine manufacturers to meet energy-efficiency performance requirements throughout their operation. To this end, this article proposes a performance monitoring solution for a large fleet of engines in operation. It uses as a reference a pre-calibrated physical model developed by the engine manufacturer against the performance objectives. The physical model is first decomposed into critical performance modules, then updated with current observations extracted at specific predefined operating conditions in order to derive the residual-error status of each engine tested. By standardizing the remaining contextual differences, the solution offers a synthesis mapping to visualize the evolution of each engine's performance throughout its operations. The article describes the theoretical methodology, based mainly on universal mathematical foundations, and argues for its industrialization in light of the proactive findings.

  5. Product Complexity Impact on Quality and Delivery Performance

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2011-01-01

    Existing literature on product portfolio complexity is mainly focused on cost related aspects. It is widely acknowledged that an increase in a company’s product portfolio will lead to an increase in complexity related costs such as order management, procurement and inventory. The objective of this article is to examine which other factors might be affected when a company is expanding its product portfolio, if initiatives are not taken to accommodate this increase. Empirical work carried out in a large international engineering company having a market leader position confirms that cost is increased, but it is not the only factor affected. We can document that there is a tendency towards increasing lead times as well as a drop in on time delivery and quality for newly introduced product variants. This means that the company experiences a reduced ability to deliver on time while also receiving...

  6. Complexity Control of Fast Motion Estimation in H.264/MPEG-4 AVC with Rate-Distortion-Complexity optimization

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren; Aghito, Shankar Manuel

    2007-01-01

    A complexity control algorithm for H.264 advanced video coding is proposed. The algorithm can control the complexity of integer inter motion estimation for a given target complexity. The Rate-Distortion-Complexity performance is improved by a complexity prediction model, simple analysis of the past statistics and a control scheme. The algorithm also works well for scene change condition. Test results for coding interlaced video (720x576 PAL) are reported.

  7. 75 FR 76254 - Official Performance and Procedural Requirements for Grain Weighing Equipment and Related Grain...

    Science.gov (United States)

    2010-12-08

    ... DEPARTMENT OF AGRICULTURE Grain Inspection, Packers and Stockyards Administration 7 CFR Part 802 [Docket GIPSA-2010-FGIS-0012] RIN 0580-AB19 Official Performance and Procedural Requirements for Grain Weighing Equipment and Related Grain Handling Systems AGENCY: Grain Inspection, Packers and Stockyards...

  8. Modulation of recognition memory performance by light requires both melanopsin and classical photoreceptors

    Science.gov (United States)

    Tam, Shu K. E.; Hasan, Sibah; Hughes, Steven; Hankins, Mark W.; Foster, Russell G.; Bannerman, David M.

    2016-01-01

    Acute light exposure exerts various effects on physiology and behaviour. Although the effects of light on brain network activity in humans are well demonstrated, the effects of light on cognitive performance are inconclusive, with the size, as well as direction, of the effect depending on the nature of the task. Similarly, in nocturnal rodents, bright light can either facilitate or disrupt performance depending on the type of task employed. Crucially, it is unclear whether the effects of light on behavioural performance are mediated via the classical image-forming rods and cones or the melanopsin-expressing photosensitive retinal ganglion cells. Here, we investigate the modulatory effects of light on memory performance in mice using the spontaneous object recognition task. Importantly, we examine which photoreceptors are required to mediate the effects of light on memory performance. By using a cross-over design, we show that object recognition memory is disrupted when the test phase is conducted under a bright light (350 lux), regardless of the light level in the sample phase (10 or 350 lux), demonstrating that exposure to a bright light at the time of test, rather than at the time of encoding, impairs performance. Strikingly, the modulatory effect of light on memory performance is completely abolished in both melanopsin-deficient and rodless–coneless mice. Our findings provide direct evidence that melanopsin-driven and rod/cone-driven photoresponses are integrated in order to mediate the effect of light on memory performance. PMID:28003454

  9. Subjective task complexity in the control room

    International Nuclear Information System (INIS)

    Braarud, Per Oeivind

    2000-05-01

    Understanding what makes a control room situation difficult to handle is important when studying operator performance, both with respect to prediction and improvement of human performance. Previous exploratory work on complexity showed a potential for prediction and explanation of operator performance. This report investigates in further detail the theoretical background and the structure of operator-rated task complexity. The report complements the previous work on complexity to provide a basis for the development of operator performance analysis tools. The first part of the report outlines an approach for studying the complexity of the control room crew's work. The approach draws upon man-machine research as well as problem solving research. The approach identifies five complexity-shaping components: 'task work characteristics', 'teamwork characteristics', 'individual skill', 'teamwork skill', and 'interface and support systems'. The crew's work complexity is related to concepts of human performance quality and human error. The second part of the report is a post-hoc exploratory analysis of four empirical HRP studies, where operators' conception of the complexity of control room work is assessed by questionnaires. The analysis deals with the structure of complexity questionnaire ratings, and the relationship between complexity ratings and human performance measures. The main findings from the analysis of structure were the identification of three task work factors, named Masking, Information load and Temporal demand, and of one interface factor, named Navigation. Post-hoc analysis suggests that operators' subjective complexity, as assessed by questionnaires, is related to workload, task and system performance, and operators' self-rated performance. (Author). 28 refs., 47 tabs

  10. Reference Proteome Extracts for Mass Spec Instrument Performance Validation and Method Development

    Science.gov (United States)

    Rosenblatt, Mike; Urh, Marjeta; Saveliev, Sergei

    2014-01-01

    Biological samples of high complexity are required to test protein mass spec sample preparation procedures and validate mass spec instrument performance. Total cell protein extracts provide the needed sample complexity. However, to be compatible with mass spec applications, such extracts should meet a number of design requirements: (1) compatibility with LC/MS (free of detergents, etc.); (2) high protein integrity (minimal level of protein degradation and non-biological PTMs); (3) compatibility with common sample preparation methods such as proteolysis, PTM enrichment and mass-tag labeling; and (4) lot-to-lot reproducibility. Here we describe total protein extracts from yeast and human cells that meet the above criteria. Two extract formats have been developed: (1) intact protein extracts, with primary use for sample preparation method development and optimization; and (2) pre-digested extracts (peptides), with primary use for instrument validation and performance monitoring.

  11. Improvements to optical performance in diffractive elements used for off-axis illumination

    Science.gov (United States)

    Welch, Kevin; Fedor, Adam; Felder, Daniel; Childers, John; Emig, Tim

    2009-08-01

    As photolithographic tools are pressed to print the ever shrinking features required in today's devices, complex off-axis illumination is taking an ever increasing role in meeting this challenge. This, in turn, is driving tighter, more stringent requirements on the diffractive elements used in these illumination systems. Specifically, any imbalance in the poles of an off-axis illuminator will contribute to reductions in the ultimate imaging performance of a lithographic tool and increased complexity in tool-to-tool matching. The article will focus on improvements to the manufacturing process that achieve substantially better pole balance. The modeling of the possible process contributors will be discussed. Challenges resulting from the manufacturing methodology will be shared. Finally, the improvement in manufacturing process performance will be reported by means of a pole balance capability index.

  12. Cross-linking mass spectrometry identifies new interfaces of Augmin required to localise the γ-tubulin ring complex to the mitotic spindle

    Directory of Open Access Journals (Sweden)

    Jack W. C. Chen

    2017-05-01

    The hetero-octameric protein complex, Augmin, recruits the γ-Tubulin ring complex (γ-TuRC) to pre-existing microtubules (MTs) to generate branched MTs during mitosis, facilitating robust spindle assembly. However, despite a recent partial reconstitution of the human Augmin complex in vitro, the molecular basis of this recruitment remains unclear. Here, we used immuno-affinity purification of in vivo Augmin from Drosophila and cross-linking/mass spectrometry to identify distance restraints between residues within the eight Augmin subunits in the absence of any other structural information. The results allowed us to predict potential interfaces between Augmin and γ-TuRC. We tested these predictions biochemically and in the Drosophila embryo, demonstrating that specific regions of the Augmin subunits, Dgt3, Dgt5 and Dgt6 all directly bind the γ-TuRC protein, Dgp71WD, and are required for the accumulation of γ-TuRC, but not Augmin, to the mitotic spindle. This study therefore substantially increases our understanding of the molecular mechanisms underpinning MT-dependent MT nucleation.

  13. 40 CFR Table 3 to Subpart Mmmmm of... - Performance Test Requirements for New or Reconstructed Flame Lamination Affected Sources

    Science.gov (United States)

    2010-07-01

    ... data only required for venturi scrubbers) every 15 minutes during the entire duration of each 1-hour... (pressure drop data only required for Venturi scrubbers) over the period of the performance test by... liquid flow rate, scrubber effluent pH, and pressure drop (pressure drop data only required for venturi...

  14. The Search Performance Evaluation and Prediction in Exploratory Search

    OpenAIRE

    LIU, FEI

    2016-01-01

    The exploratory search for complex search tasks requires an effective search behavior model to evaluate and predict user search performance. Few studies have investigated the relationship between user search behavior and search performance in exploratory search. This research adopts a mixed approach combining search system development, user search experiment, search query log analysis, and multivariate regression analysis to resolve the knowledge gap. Through this study, it is shown that expl...

  15. Multi-scalar agent-based complex design systems - the case of CECO (Climatic -Ecologies) Studio; informed generative design systems and performance-driven design workflows

    NARCIS (Netherlands)

    Mostafavi, S.; Yu, S.; Biloria, N.M.

    2014-01-01

    This paper illustrates the application of different types of complex systems for digital form finding and design decision making with underlying methodological and pedagogical aims to emphasize performance-driven design solutions via combining generative methods of complex systems with simulation

  16. Robotic partial nephrectomy for complex renal tumors: surgical technique.

    Science.gov (United States)

    Rogers, Craig G; Singh, Amar; Blatt, Adam M; Linehan, W Marston; Pinto, Peter A

    2008-03-01

    Laparoscopic partial nephrectomy requires advanced training to accomplish tumor resection and renal reconstruction while minimizing warm ischemia times. Complex renal tumors add an additional challenge to a minimally invasive approach to nephron-sparing surgery. We describe our technique, illustrated with video, of robotic partial nephrectomy for complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance was used to resect 14 tumors in eight patients (mean age: 50.3 yr; range: 30-68 yr). Three patients had hereditary kidney cancer. All patients had complex tumor features, including hilar tumors (n=5), endophytic tumors (n=4), and/or multiple tumors (n=3). Robotic partial nephrectomy procedures were performed successfully without complications. Hilar clamping was used with a mean warm ischemia time of 31 min (range: 24-45 min). Mean blood loss was 230 ml (range: 100-450 ml). Histopathology confirmed clear-cell renal cell carcinoma (n=3), hybrid oncocytic tumor (n=2), chromophobe renal cell carcinoma (n=2), and oncocytoma (n=1). All patients had negative surgical margins. Mean index tumor size was 3.6 cm (range: 2.6-6.4 cm). Mean hospital stay was 2.6 d. At 3-mo follow-up, no patients experienced a statistically significant change in serum creatinine or estimated glomerular filtration rate and there was no evidence of tumor recurrence. Robotic partial nephrectomy is safe and feasible for select patients with complex renal tumors, including hilar, endophytic, and multiple tumors. Robotic assistance may facilitate a minimally invasive, nephron-sparing approach for select patients with complex renal tumors who might otherwise require open surgery or total nephrectomy.

  17. Risk Management Capability Maturity and Performance of Complex Product and System (CoPS) Projects with an Asian Perspective

    Directory of Open Access Journals (Sweden)

    Ren, Y.

    2014-07-01

    Full Text Available Complex Products and Systems (CoPS) are high value, technology- and engineering-intensive capital goods. The motivation of this study is the persistent high failure rate of CoPS projects, Asian CoPS providers' weak capability, and the lack of specific research on CoPS risk management. This paper evaluates the risk management maturity level of CoPS projects against a general CoPS risk management capability maturity model (RM-CMM) developed by the authors. An Asian-based survey was conducted to investigate the value of RM to project performance, and Asian (non-Japanese) CoPS implementers' perceived application of RM practices, their strengths and weaknesses. The survey results show that a higher RM maturity level leads to higher CoPS project performance. They also show that project complexity and uncertainty moderate the relationship between some RM practices and project performance, which implies that a contingency approach should be adopted to manage CoPS risks effectively. In addition, the results show that Asian CoPS implementers are weak in RM process and that there is also room for improvement in the softer aspects of organizational capabilities and robustness.

  18. Identification of chromatin-associated regulators of MSL complex targeting in Drosophila dosage compensation.

    Directory of Open Access Journals (Sweden)

    Erica Larschan

    Full Text Available Sex chromosome dosage compensation in Drosophila provides a model for understanding how chromatin organization can modulate coordinate gene regulation. Male Drosophila increase the transcript levels of genes on the single male X approximately two-fold to equal the gene expression in females, which have two X-chromosomes. Dosage compensation is mediated by the Male-Specific Lethal (MSL) histone acetyltransferase complex. Five core components of the MSL complex were identified by genetic screens for genes that are specifically required for male viability and are dispensable for females. However, because dosage compensation must interface with the general transcriptional machinery, it is likely that identifying additional regulators that are not strictly male-specific will be key to understanding the process at a mechanistic level. Such regulators would not have been recovered from previous male-specific lethal screening strategies. Therefore, we have performed a cell culture-based, genome-wide RNAi screen to search for factors required for MSL targeting or function. Here we focus on the discovery of proteins that function to promote MSL complex recruitment to "chromatin entry sites," which are proposed to be the initial sites of MSL targeting. We find that components of the NSL (Non-Specific Lethal) complex, and a previously unstudied zinc-finger protein, facilitate MSL targeting and display a striking enrichment at MSL entry sites. Identification of these factors provides new insight into how the MSL complex establishes the specialized hyperactive chromatin required for dosage compensation in Drosophila.

  19. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    Science.gov (United States)

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    To construct CFA, MCFA, and maximum MCFA with LISREL v.8 and below, we provide iMCFA (integrated Multilevel Confirmatory Analysis) to examine the potential multilevel factorial structure in complex survey data. Modeling multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate the potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses and generate the outputs for the respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax of different models for researchers' future use. An empirical and a simulated multilevel dataset with complex and simple structures in the within or between level were used to illustrate the usability and the effectiveness of the iMCFA procedure in analyzing complex survey data. The analytic results of iMCFA using Muthén's limited-information estimator were compared with those of Mplus using Full Information Maximum Likelihood regarding the effectiveness of different estimation methods.
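
    The multilevel decomposition the abstract describes can be illustrated with a small sketch. This is not the iMCFA tool itself: it only shows, on invented two-level data, how between- and within-cluster variance combine into the intraclass correlation that motivates fitting an MCFA rather than a conventional CFA.

```python
import numpy as np

def intraclass_correlation(groups):
    """One-way random-effects ICC(1): the between-cluster share of variance.

    `groups` is a list of 1-D arrays, one per cluster (e.g. one per school).
    """
    k = len(groups)
    n = np.mean([len(g) for g in groups])        # average cluster size
    grand = np.mean(np.concatenate(groups))
    # Mean squares from a one-way ANOVA decomposition
    ms_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups) / (k - 1)
    ms_within = sum(((g - g.mean()) ** 2).sum() for g in groups) / (
        sum(len(g) for g in groups) - k)
    return (ms_between - ms_within) / (ms_between + (n - 1) * ms_within)

rng = np.random.default_rng(0)
# Simulated data: cluster-effect sd = 1, residual sd = 2, so true ICC = 1/(1+4) = 0.2
clusters = [rng.normal(rng.normal(0, 1), 2, size=30) for _ in range(100)]
icc = intraclass_correlation(clusters)
print(round(icc, 2))
```

    A non-negligible ICC like this is the usual signal that single-level CFA standard errors would be biased and a multilevel model is worth fitting.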

  20. Flash memories economic principles of performance, cost and reliability optimization

    CERN Document Server

    Richter, Detlev

    2014-01-01

    The subject of this book is to introduce a model-based quantitative performance indicator methodology applicable for performance, cost and reliability optimization of non-volatile memories. The complex example of flash memories is used to introduce and apply the methodology. It has been developed by the author based on an industrial 2-bit to 4-bit per cell flash development project. For the first time, design and cost aspects of 3D integration of flash memory are treated in this book. Cell, array, performance and reliability effects of flash memories are introduced and analyzed. Key performance parameters are derived to handle the flash complexity. A performance and array memory model is developed and a set of performance indicators characterizing architecture, cost and durability is defined.   Flash memories are selected to apply the Performance Indicator Methodology to quantify design and technology innovation. A graphical representation based on trend lines is introduced to support a requirement based pr...

  1. Foreign currency-related translation complexities in cross-border healthcare applications.

    Science.gov (United States)

    Kumar, Anand; Rodrigues, Jean M

    2009-01-01

    International cross-border private hospital chains need to apply the standards for foreign currency translation in order to consolidate the balance sheet and income statements. This not only exposes such chains to exchange rate fluctuations in different ways, but also creates added requirements for enterprise-level IT systems especially when they produce parameters which are used to measure the financial and operational performance of the foreign subsidiary or the parent hospital. Such systems would need to come to terms with the complexities involved in such currency-related translations in order to provide the correct data for performance benchmarking.

  2. FADES: A tool for automated fault analysis of complex systems

    International Nuclear Information System (INIS)

    Wood, C.

    1990-01-01

    FADES is an Expert System for performing fault analyses on complex connected systems. By using a graphical editor to draw components and link them together, the FADES system allows the analyst to describe a given system. The knowledge base created is used to qualitatively simulate the system behaviour. By inducing all possible component failures in the system and determining their effects, a set of facts is built up. These facts are then used to create Fault Trees or FMEA tables. The facts may also be used for explanation purposes and to generate diagnostic rules allowing system instrumentation to be optimised. The prototype system has been built and tested and is presently undergoing testing by users. All comments from these trials will be used to tailor the system to the requirements of the user so that the end product performs the exact task required.
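
    The exhaustive what-if step described above, inducing each component failure and recording its effects, can be sketched in a few lines. This is a hypothetical miniature rather than FADES code; the component names and dependency graph are invented.

```python
# Toy component dependency graph: component -> components it supplies.
feeds = {
    "power": ["pump", "sensor"],
    "pump": ["cooler"],
    "cooler": ["reactor_out"],
    "sensor": ["alarm_out"],
}

def effects_of(failure):
    """Qualitatively propagate one failed component to everything downstream."""
    lost, frontier = set(), [failure]
    while frontier:
        c = frontier.pop()
        if c in lost:
            continue
        lost.add(c)
        frontier.extend(feeds.get(c, []))
    return sorted(lost)

# Induce every single failure and collect its effects, FMEA-table style.
fmea = {c: effects_of(c) for c in feeds}
print(fmea["power"])  # a power failure cascades through pumps, sensors and both outputs
```

    Each row of `fmea` corresponds to one FMEA entry; turning the same facts into a fault tree amounts to inverting the map, asking which failures reach a given top event.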

  3. Implementation of Complex Biological Logic Circuits Using Spatially Distributed Multicellular Consortia

    Science.gov (United States)

    Urrios, Arturo; de Nadal, Eulàlia; Solé, Ricard; Posas, Francesc

    2016-01-01

    Engineered synthetic biological devices have been designed to perform a variety of functions from sensing molecules and bioremediation to energy production and biomedicine. Notwithstanding, a major limitation of in vivo circuit implementation is the constraint associated with the use of standard methodologies for circuit design. Thus, future success of these devices depends on obtaining circuits with scalable complexity and reusable parts. Here we show how to build complex computational devices using multicellular consortia and space as key computational elements. This spatial modular design grants scalability since its general architecture is independent of the circuit’s complexity, minimizes wiring requirements and allows component reusability with minimal genetic engineering. The potential use of this approach is demonstrated by implementation of complex logical functions with up to six inputs, thus demonstrating the scalability and flexibility of this method. The potential implications of our results are outlined. PMID:26829588

  4. Issues in performance assessments for disposal of US Department of Energy low-level waste

    International Nuclear Information System (INIS)

    Wood, D.E.; Wilhite, E.L.; Duggan, G.J.

    1994-12-01

    The US Department of Energy (DOE) and its contractors have long been pioneers in the field of radiological performance assessment (PA). Much effort has been expended in developing technology and acquiring data to facilitate the assessment process. This is reflected in DOE Order 5820.2A, Radioactive Waste Management. Chapter III of the Order lists policy and requirements to manage the DOE's low-level waste; performance objectives for low-level waste management are stated to ensure the protection of public health and the environment. A radiological PA is also required to demonstrate compliance with the performance objectives. DOE Order 5820.2A further requires that an Oversight and Peer Review Panel be established to ensure consistency and technical quality across the DOE complex in the development and application of PA models that include site-specific geohydrology and waste composition. The DOE has also established a Performance Assessment Task Team (PATT) to integrate the activities of sites that are preparing PAs. The PATT's purpose is to recommend policy and guidance to DOE on issues that impact PAs so that the approaches taken are as consistent as possible across the DOE complex.

  5. The EARP Complex and Its Interactor EIPR-1 Are Required for Cargo Sorting to Dense-Core Vesicles.

    Directory of Open Access Journals (Sweden)

    Irini Topalidou

    2016-05-01

    Full Text Available The dense-core vesicle is a secretory organelle that mediates the regulated release of peptide hormones, growth factors, and biogenic amines. Dense-core vesicles originate from the trans-Golgi of neurons and neuroendocrine cells, but it is unclear how this specialized organelle is formed and acquires its specific cargos. To identify proteins that act in dense-core vesicle biogenesis, we performed a forward genetic screen in Caenorhabditis elegans for mutants defective in dense-core vesicle function. We previously reported the identification of two conserved proteins that interact with the small GTPase RAB-2 to control normal dense-core vesicle cargo sorting. Here we identify several additional conserved factors important for dense-core vesicle cargo sorting: the WD40 domain protein EIPR-1 and the endosome-associated recycling protein (EARP) complex. By assaying behavior and the trafficking of dense-core vesicle cargos, we show that mutants that lack EIPR-1 or EARP have defects in dense-core vesicle cargo sorting similar to those of mutants in the RAB-2 pathway. Genetic epistasis data indicate that RAB-2, EIPR-1 and EARP function in a common pathway. In addition, using a proteomic approach in rat insulinoma cells, we show that EIPR-1 physically interacts with the EARP complex. Our data suggest that EIPR-1 is a new interactor of the EARP complex and that dense-core vesicle cargo sorting depends on the EARP-dependent trafficking of cargo through an endosomal sorting compartment.

  6. The Test of Logical Thinking as a predictor of first-year pharmacy students' performance in required first-year courses.

    Science.gov (United States)

    Etzler, Frank M; Madden, Michael

    2014-08-15

    To investigate the correlation of scores on the Test of Logical Thinking (TOLT) with first-year pharmacy students' performance in selected courses. The TOLT was administered to 130 first-year pharmacy students. The examination was administered during the first quarter in a single session. The TOLT scores correlated with grades earned in Pharmaceutical Calculations, Physical Pharmacy, and Basic Pharmacokinetics courses. Performance on the TOLT correlated with performance in courses that require the ability to use quantitative reasoning to complete required tasks. In the future, it may be possible to recommend remediation, retention, and/or admission based in part on the results from the TOLT.

  7. Phonological similarity effect in complex span task.

    Science.gov (United States)

    Camos, Valérie; Mora, Gérôme; Barrouillet, Pierre

    2013-01-01

    The aim of our study was to test the hypothesis that two systems are involved in verbal working memory; one is specifically dedicated to the maintenance of phonological representations through verbal rehearsal while the other would maintain multimodal representations through attentional refreshing. This theoretical framework predicts that phonologically related phenomena such as the phonological similarity effect (PSE) should occur when the domain-specific system is involved in maintenance, but should disappear when concurrent articulation hinders its use. Impeding maintenance in the domain-general system by a concurrent attentional demand should impair recall performance without affecting PSE. In three experiments, we manipulated the concurrent articulation and the attentional demand induced by the processing component of complex span tasks in which participants had to maintain lists of either similar or dissimilar words. Confirming our predictions, PSE affected recall performance in complex span tasks. Although both the attentional demand and the articulatory requirement of the concurrent task impaired recall, only the induction of an articulatory suppression during maintenance made the PSE disappear. These results suggest a duality in the systems devoted to verbal maintenance in the short term, constraining models of working memory.

  8. A Study on Relationships between Functional Performance and Task Performance Measure through Experiments in NPP MCR

    International Nuclear Information System (INIS)

    Jang, In Seok; Seong, Poong Hyun; Park, Jin Kyun

    2011-01-01

    Further improvements in levels of organization, management, man-machine interfaces, education, training, etc. are required if high operating reliability of operators in huge and complex plants, such as chemical plants and electrical power generating plants, is to be maintained. Improvement requires a good understanding of operators' behavior, including defining what good performance is for operators, especially in emergency situations. Human performance measures, therefore, are important to enhance performance and to reduce the probability of incidents and accidents in Nuclear Power Plants (NPPs). Operators' performance measures are used for multiple objectives such as control room design, human-system interface evaluation, training, procedures and so on. There are two representative kinds of methods to measure operators' performance, known as the functional performance measure and the task performance measure. Functional performance measures are basically based on plant process parameters and indicate how well the operators controlled selected critical parameters. The parameters selected in this paper are derived from the four Critical Safety Functions (CSFs) identified in the emergency operating procedures: achievement of subcriticality, maintenance of core cooling, maintenance of heat sink and maintenance of containment integrity. Task performance measures are based on task analysis, which determines the tasks required and how operators perform them. In this paper, task analysis is done with an ideal path for an accident, compiled by experts, and the Emergency Operation Procedure (EOP). However, most of the literature related to operators' performance uses only one of these measures, and there is no research on the relationships between the two. In this paper, the relationships between the functional performance measure and the task performance measure are investigated using experiments.

  9. Camp Verde Adult Reading Program. Final Performance Report.

    Science.gov (United States)

    Maynard, David A.

    This document begins with a four-page performance report describing how the Camp Verde Adult Reading Program site was relocated to the Community Center Complex, and the Town Council contracted directly with the Friends of the Camp Verde Library to provide for the requirements of the program. The U.S. Department of Education grant allowed the…

  10. Complexity in Picture Books

    Science.gov (United States)

    Sierschynski, Jarek; Louie, Belinda; Pughe, Bronwyn

    2015-01-01

    One of the key requirements of Common Core State Standards (CCSS) in English Language Arts is that students are able to read and access complex texts across all grade levels. The CCSS authors emphasize both the limitations and lack of accuracy in the current CCSS model of text complexity, calling for the development of new frameworks. In response…

  11. Social behavior of bacteria: from physics to complex organization

    Science.gov (United States)

    Ben-Jacob, E.

    2008-10-01

    I describe how bacteria develop complex colonial patterns by utilizing intricate communication capabilities, such as quorum sensing, chemotactic signaling and exchange of genetic information (plasmids). Bacteria do not store genetically all the information required for generating the patterns for all possible environments. Instead, additional information is cooperatively generated as required for the colonial organization to proceed. Each bacterium is, by itself, a biotic autonomous system with its own internal cellular informatics capabilities (storage, processing and assessment of information). These afford the cell a certain plasticity to select its response to biochemical messages it receives, including self-alteration and broadcasting messages to initiate alterations in other bacteria. Hence, new features can collectively emerge during self-organization from the intra-cellular level to the whole colony. Collectively, bacteria store information, make decisions (e.g. to sporulate) and even learn from past experience (e.g. exposure to antibiotics), features we begin to associate with bacterial social behavior and even rudimentary intelligence. I also take Schrödinger's "feeding on negative entropy" criterion further and propose that, in addition, organisms have to extract latent information embedded in the environment. By latent information we refer to the non-arbitrary spatio-temporal patterns of regularities and variations that characterize the environmental dynamics. In other words, bacteria must be able to sense the environment and perform internal information processing to thrive on the latent information embedded in the complexity of their environment. I then propose that by acting together, bacteria can perform this most elementary cognitive function more efficiently, as can be illustrated by their cooperative behavior.

  12. Unified analysis of ensemble and single-complex optical spectral data from light-harvesting complex-2 chromoproteins for gaining deeper insight into bacterial photosynthesis

    Science.gov (United States)

    Pajusalu, Mihkel; Kunz, Ralf; Rätsep, Margus; Timpmann, Kõu; Köhler, Jürgen; Freiberg, Arvi

    2015-11-01

    Bacterial light-harvesting pigment-protein complexes are very efficient at converting photons into excitons and transferring them to reaction centers, where the energy is stored in a chemical form. Optical properties of the complexes are known to change significantly in time and also vary from one complex to another; therefore, a detailed understanding of the variations on the level of single complexes and how they accumulate into effects that can be seen on the macroscopic scale is required. While experimental and theoretical methods exist to study the spectral properties of light-harvesting complexes on both individual complex and bulk ensemble levels, they have been developed largely independently of each other. To fill this gap, we simultaneously analyze experimental low-temperature single-complex and bulk ensemble optical spectra of the light-harvesting complex-2 (LH2) chromoproteins from the photosynthetic bacterium Rhodopseudomonas acidophila in order to find a unique theoretical model consistent with both experimental situations. The model, which satisfies most of the observations, combines strong exciton-phonon coupling with significant disorder, characteristic of the proteins. We establish a detailed disorder model that, in addition to containing a C2-symmetrical modulation of the site energies, distinguishes between static intercomplex and slow conformational intracomplex disorders. The model evaluations also verify that, despite best efforts, the single-LH2-complex measurements performed so far may be biased toward complexes with higher Huang-Rhys factors.

  13. Performance of a full-scale ITER metal hydride storage bed in comparison with requirements

    International Nuclear Information System (INIS)

    Beloglazov, S.; Glugla, M.; Fanghaenel, E.; Perevezentsev, A.; Wagner, R.

    2008-01-01

    The storage of hydrogen isotopes as metal hydride is the technique chosen for the ITER Tritium Plant Storage and Delivery System (SDS). A full-scale prototype storage bed has been designed, manufactured and intensively tested at the Tritium Laboratory, addressing the main performance parameters specified for the ITER application. The main requirements for the hydrogen storage bed are a strict physical limitation of the tritium storage capacity (currently 70 g T2), a high supply flow rate of hydrogen isotopes, in-situ calorimetry capabilities with an accuracy of 1 g and a fully tritium-compatible design. The pressure-composition isotherm of the ZrCo-hydrogen system, the reference material for ITER, is characterised by a significant slope. As a result, technical implementation of the ZrCo hydride bed in the SDS system requires further consideration. The paper presents the experience from the operation of the ZrCo getter bed including loading/de-loading operation, calorimetric loop performance, and active gas cooling of the bed for fast absorption operation. The implications of the hydride material characteristics on the SDS system configuration and design are discussed. (authors)

  14. Distributed coding/decoding complexity in video sensor networks.

    Science.gov (United States)

    Cordeiro, Paulo J; Assunção, Pedro

    2012-01-01

    Video Sensor Networks (VSNs) are recent communication infrastructures used to capture and transmit dense visual information from an application context. In such large scale environments which include video coding, transmission and display/storage, there are several open problems to overcome in practical implementations. This paper addresses the most relevant challenges posed by VSNs, namely stringent bandwidth usage and processing time/power constraints. In particular, the paper proposes a novel VSN architecture where large sets of visual sensors with embedded processors are used for compression and transmission of coded streams to gateways, which in turn transrate the incoming streams and adapt them to the variable complexity requirements of both the sensor encoders and end-user decoder terminals. Such gateways provide real-time transcoding functionalities for bandwidth adaptation and coding/decoding complexity distribution by transferring the most complex video encoding/decoding tasks to the transcoding gateway at the expense of a limited increase in bit rate. Then, a method to reduce the decoding complexity, suitable for system-on-chip implementation, is proposed to operate at the transcoding gateway whenever decoders with constrained resources are targeted. The results show that the proposed method achieves good performance and its inclusion into the VSN infrastructure provides an additional level of complexity control functionality.

  15. Tools and techniques for developing policies for complex and uncertain systems.

    Science.gov (United States)

    Bankes, Steven C

    2002-05-14

    Agent-based models (ABM) are examples of complex adaptive systems, which can be characterized as those systems for which no model less complex than the system itself can accurately predict in detail how the system will behave at future times. Consequently, the standard tools of policy analysis, based as they are on devising policies that perform well on some best estimate model of the system, cannot be reliably used for ABM. This paper argues that policy analysis by using ABM requires an alternative approach to decision theory. The general characteristics of such an approach are described, and examples are provided of its application to policy analysis.
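
    One concrete form of the alternative decision-theoretic approach the abstract alludes to is robust (minimax-regret) policy selection: score each candidate policy by its worst-case regret across an ensemble of plausible models, instead of optimizing against a single best-estimate model. The sketch below uses invented toy models and a one-dimensional policy knob purely for illustration.

```python
# Three plausible models of how a policy knob `x` maps to an outcome (higher = better).
models = [
    lambda x: 10 - (x - 3) ** 2,
    lambda x: 8 - abs(x - 5),
    lambda x: 9 - 0.5 * (x - 4) ** 2,
]
policies = range(0, 8)

def worst_case_regret(policy):
    # Regret of `policy` under model m = best achievable under m minus what policy gets;
    # take the maximum over the whole ensemble.
    return max(max(m(p) for p in policies) - m(policy) for m in models)

# The robust choice minimizes worst-case regret rather than maximizing any one model.
robust = min(policies, key=worst_case_regret)
print(robust)
```

    Note that no single model's optimum (3, 5 or 4 here) is chosen for its own sake; the robust policy is the one that loses least under whichever model turns out to be right.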

  16. Study by magnetic resonance and relaxation of carbon 13 of some paramagnetic coordination complexes

    International Nuclear Information System (INIS)

    Ronfard-Haret, Jean-Claude

    1977-01-01

    This research thesis reports the study of coordination complexes by using NMR. After a brief recall of the theoretical background required for the processing of experimental data (hyperfine coupling and magnetic resonance, spin density distribution, chemical shift, dipolar, scalar and electronic relaxation), the author describes the conditions in which experiments have been performed and presents measurement methods (pulsed nuclear magnetic resonance, relaxation time measurement, determination of hyperfine coupling constants, spectrometers and reactants). The next chapters address the study of different coordination complexes: [(pyridine-N-oxide)2Ni(acetylacetonate)2], carbon 13 in alkyl-anilines-Ni(II), complexation of 1- and 2-aminonaphthalene by transition ions, and complexation of pyridine-N-oxide by the nickel Ni2+ ion in the presence of water.

  17. The role of human performance in the safety of complex plants' operation

    International Nuclear Information System (INIS)

    Preda, Irina Aida; Lazar, Roxana Elena; Croitoru, Cornelia

    1999-01-01

    According to statistics, about 20-30% of the failures occurring in plants are caused directly or indirectly by human errors. Furthermore, it was established that 10-15% of global failures are related to human errors. These are mainly due to wrong actions, maintenance errors, and misinterpretation of instruments. Human performance is influenced by: professional ability, complexity and danger of the plant, experience in the working place, level of skills, events in personal and/or professional life, discipline, social ambience, and somatic health. Human performance assessment in probabilistic safety assessment offers the possibility of evaluating the human contribution to the outcome of event sequences. Not all human errors have an impact on the system. A human error may be recovered before unwanted consequences have occurred in the system. This paper presents the possibilities of using probabilistic methods (event tree, fault tree) to identify solutions for improved human reliability in order to minimize risk in industrial plants' operation. Also, the human error types and their causes are defined, and the 'decision tree method' is presented as the technique used in our analysis for human reliability assessment. The exemplification of the human error analysis method was achieved based on operation data for the Valcea Heavy Water Pilot Plant. The 'steam supply interruption' event has been considered as the initiating event for the accident state. The human errors' contribution was analysed for the accident sequence with the worst consequences. (authors)

  18. 75 FR 37711 - Automatic Dependent Surveillance-Broadcast (ADS-B) Out Performance Requirements To Support Air...

    Science.gov (United States)

    2010-06-30

    ... preamble and cross references in the preamble and rule text of that final rule. DATES: The final rule and..., Performance and Interoperability Requirements Document for Enhanced Air Traffic in Radar-Controlled Areas.... 21.618 to Sec. 21.609 to reflect the correct section number. The FAA also is issuing a separate...

  19. "One Task Fits All"? The Roles of Task Complexity, Modality, and Working Memory Capacity in L2 Performance

    Science.gov (United States)

    Zalbidea, Janire

    2017-01-01

    The present study explores the independent and interactive effects of task complexity and task modality on linguistic dimensions of second language (L2) performance and investigates how these effects are modulated by individual differences in working memory capacity. Thirty-two intermediate learners of L2 Spanish completed less and more complex…

  20. A cross-sectional retrospective analysis of the regionalization of complex surgery.

    Science.gov (United States)

    Studnicki, James; Craver, Christopher; Blanchette, Christopher M; Fisher, John W; Shahbazi, Sara

    2014-08-16

    The Veterans Health Administration (VHA) system has assigned a surgical complexity level to each of its medical centers by specifying requirements to perform standard, intermediate or complex surgical procedures. No study to similarly describe the patterns of relative surgical complexity among a population of United States (U.S.) civilian hospitals has been completed. Design: single-year, retrospective, cross-sectional. Data: the study used Florida Inpatient Discharge Data from short-term acute hospitals for calendar year 2009. Two hundred hospitals with 2,542,920 discharges were organized into four quartiles (Q1, Q2, Q3, Q4) based on the number of complex procedures per hospital. The VHA surgical complexity matrix was applied to assign relative complexity to each procedure. The Clinical Classification Software (CCS) system assigned complex procedures to clinically meaningful groups. For outcome comparisons, propensity score matching methods adjusted for the surgical procedure, age, gender, race, comorbidities, mechanical ventilator use and type of admission. Outcomes: in-hospital mortality and length-of-stay (LOS). Only 5.2% of all inpatient discharges involve a complex procedure. The highest volume complex procedure hospitals (Q4) have 49.8% of all discharges but 70.1% of all complex procedures. In the 133,436 discharges with a primary complex procedure, 374 separate specific procedures are identified, only about one third of which are performed in the lowest volume complex procedure (Q1) hospitals. Complex operations of the digestive, respiratory, integumentary and musculoskeletal systems are the least concentrated and proportionately more likely to occur in the lower volume hospitals. Operations of the cardiovascular system and certain technology dependent miscellaneous diagnostic and therapeutic procedures are the most concentrated in high volume hospitals. Organ transplants are only done in Q4 hospitals. There were no significant differences in in-hospital mortality rates and the
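
    The grouping step in the study design, ranking hospitals by complex-procedure volume and splitting them into quartiles, can be sketched as follows. The hospital names and volumes below are invented for illustration, not the Florida data.

```python
import statistics

# Hypothetical annual complex-procedure counts per hospital.
volumes = {f"H{i:02d}": v for i, v in enumerate(
    [3, 7, 12, 20, 35, 50, 80, 120, 200, 400, 15, 60], start=1)}

# Q1/Q2/Q3 cut points (default 'exclusive' method of statistics.quantiles).
cuts = statistics.quantiles(volumes.values(), n=4)

def quartile(v):
    return 1 + sum(v > c for c in cuts)   # 1 = lowest-volume group, 4 = highest

groups = {h: quartile(v) for h, v in volumes.items()}
print(groups["H10"], groups["H01"])
```

    With the groups assigned, outcome comparisons such as the mortality analysis above would then be run within propensity-matched pairs across quartiles.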

  1. Performance modeling of network data services

    Energy Technology Data Exchange (ETDEWEB)

    Haynes, R.A.; Pierson, L.G.

    1997-01-01

    Networks at major computational organizations are becoming increasingly complex. The introduction of large massively parallel computers and supercomputers with gigabyte memories is requiring greater and greater bandwidth for network data transfers to widely dispersed clients. For networks to provide adequate data transfer services to high performance computers and the remote users connected to them, the networking components must be optimized from a combination of internal and external performance criteria. This paper describes research done at Sandia National Laboratories to model network data services and to visualize the flow of data from source to sink when using the data services.
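
    As a toy illustration of the kind of modelling involved, one classic external performance criterion for bulk data transfer is that throughput is capped both by the link rate and by one window per round trip (the bandwidth-delay product limit). This generic sketch is not Sandia's model; the figures are illustrative.

```python
def throughput_limit_bps(window_bytes, rtt_seconds, link_bps):
    """Achievable transfer rate for a window-based protocol: the sender
    can push at most one window of data per round trip, and can never
    exceed the link bandwidth."""
    return min(link_bps, 8 * window_bytes / rtt_seconds)

# 64 KiB window over a 50 ms wide-area path on a gigabit link:
# the window, not the link, is the bottleneck.
print(throughput_limit_bps(65536, 0.050, 1_000_000_000) / 1e6, "Mbit/s")
```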

  2. Neutrosophy for software requirement prioritization

    Directory of Open Access Journals (Sweden)

    Ronald Barriga Dias

    2017-09-01

    Full Text Available Software engineers are involved in complex decisions that require multiple viewpoints. A specific case is the requirement prioritization process, which is used to decide which software requirements to develop in a given release from a group of candidate requirements. The criteria involved in this process can involve indeterminacy. In this paper a software requirement prioritization model is developed based on SVN (single-valued neutrosophic) numbers. Finally, an illustrative example is presented in order to show the proposed model.
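
    As a sketch of how SVN numbers can drive a ranking: one score function found in the neutrosophic decision-making literature maps a single-valued neutrosophic number (T, I, F) to (2 + T - I - F)/3, so higher truth and lower indeterminacy/falsity rank higher. Other score functions exist, and the requirement names and ratings below are hypothetical.

```python
def svn_score(t, i, f):
    """One common score function for a single-valued neutrosophic
    number (T, I, F); several variants appear in the literature."""
    return (2 + t - i - f) / 3

# Hypothetical (T, I, F) expert ratings per candidate requirement
requirements = {
    "REQ-1 login audit":   (0.8, 0.2, 0.1),
    "REQ-2 export to PDF": (0.6, 0.4, 0.3),
    "REQ-3 dark mode":     (0.4, 0.5, 0.4),
}
ranked = sorted(requirements, key=lambda r: svn_score(*requirements[r]),
                reverse=True)
print(ranked)  # ['REQ-1 login audit', 'REQ-2 export to PDF', 'REQ-3 dark mode']
```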

  3. [Precautions of physical performance requirements and test methods during product standard drafting process of medical devices].

    Science.gov (United States)

    Song, Jin-Zi; Wan, Min; Xu, Hui; Yao, Xiu-Jun; Zhang, Bo; Wang, Jin-Hong

    2009-09-01

    The major idea of this article is to discuss standardization and normalization of product standards for medical devices. We analyze problems related to physical performance requirements and test methods that arise during the product standard drafting process and make corresponding suggestions.

  4. Risk Modeling of Interdependent Complex Systems of Systems: Theory and Practice.

    Science.gov (United States)

    Haimes, Yacov Y

    2018-01-01

    The emergence of the complexity characterizing our systems of systems (SoS) requires a reevaluation of the way we model, assess, manage, communicate, and analyze the risk thereto. Current models for risk analysis of emergent complex SoS are insufficient because too often they rely on the same risk functions and models used for single systems. These models commonly fail to incorporate the complexity derived from the networks of interdependencies and interconnectedness (I-I) characterizing SoS. There is a need to reevaluate currently practiced risk analysis to respond to this reality by examining, and thus comprehending, what makes emergent SoS complex. The key to evaluating the risk to SoS lies in understanding the genesis of characterizing I-I of systems manifested through shared states and other essential entities within and among the systems that constitute SoS. The term "essential entities" includes shared decisions, resources, functions, policies, decisionmakers, stakeholders, organizational setups, and others. This undertaking can be accomplished by building on state-space theory, which is fundamental to systems engineering and process control. This article presents a theoretical and analytical framework for modeling the risk to SoS with two case studies performed with the MITRE Corporation and demonstrates the pivotal contributions made by shared states and other essential entities to modeling and analysis of the risk to complex SoS. A third case study highlights the multifarious representations of SoS, which require harmonizing the risk analysis process currently applied to single systems when applied to complex SoS. © 2017 Society for Risk Analysis.

  5. Kinetic and spectroscopic studies of cytochrome b-563 in isolated cytochrome b/f complex and in thylakoid membranes

    Energy Technology Data Exchange (ETDEWEB)

    Hind, G.; Clark, R.D.; Houchins, J.P.

    1983-01-01

    Extensive studies, performed principally by Hauska, Hurt and collaborators, have shown that a cytochrome (cyt) b/f complex isolated from photosynthetic membranes of spinach or Anabaena catalyzes electron transport from plastoquinol (PQH/sub 2/) to plastocyanin or algal cyt c-552. The complex from spinach thylakoids generated a membrane potential when reconstituted into liposomes, and although the electrogenic mechanism remains unknown, a key role for cyt b-563 is widely accepted. Electrogenesis by a Q-cycle mechanism requires a plastoquinone (PQ) reductase to be associated with the stromal side of the thylakoid b/f complex though this activity has yet to be demonstrated. It seemed possible that more gentle isolation of the complex might yield a form containing additional polypeptides, perhaps including a PQ reductase or a component involved in returning electrons from reduced ferredoxin to the complex in cyclic electron flow. Optimization of the isolation of cyt b/f complex for Hybrid 424 spinach from a growth room was also required. The procedure we devised is compared to the protocol of Hurt and Hauska (1982). 13 references.

  6. Microchip Immunoaffinity Electrophoresis of Antibody-Thymidine Kinase 1 Complex

    Science.gov (United States)

    Pagaduan, Jayson V.; Ramsden, Madison; O’Neill, Kim; Woolley, Adam T.

    2015-01-01

    Thymidine kinase-1 (TK1) is an important cancer biomarker whose serum levels are elevated in early cancer development. We developed a microchip electrophoresis immunoaffinity assay to measure recombinant purified TK1 (pTK1) using an antibody that binds to human TK1. We fabricated poly(methyl methacrylate) microfluidic devices to test the feasibility of detecting antibody (Ab)-pTK1 immune complexes as a step towards TK1 analysis in clinical serum samples. We were able to separate immune complexes from unbound antibodies using 0.5X phosphate buffer saline (pH 7.4) containing 0.01% Tween-20, with 1% w/v methylcellulose that acts as a dynamic surface coating and sieving matrix. Separation of the antibody and Ab-pTK1 complex was observed within a 5 mm effective separation length. This method of detecting pTK1 is easy to perform, requires only a 10 μL sample volume, and takes just 1 minute for separation. PMID:25486911

  7. Optimized design of embedded DSP system hardware supporting complex algorithms

    Science.gov (United States)

    Li, Yanhua; Wang, Xiangjun; Zhou, Xinling

    2003-09-01

    The paper presents an optimized design method for a flexible and economical embedded DSP system that can implement complex processing algorithms such as biometric recognition, real-time image processing, etc. It consists of a floating-point DSP, 512 Kbytes of data RAM, 1 Mbyte of FLASH program memory, a CPLD for flexible logic control of the input channel, and an RS-485 transceiver for local network communication. Because the design employs the high performance-price-ratio DSP TMS320C6712 and a large FLASH, the system permits loading and performing complex algorithms with little algorithm optimization and code reduction. The CPLD provides flexible logic control for the whole DSP board, especially the input channel, and allows convenient interfacing between different sensors and the DSP system. The transceiver circuit transfers data between the DSP and a host computer. The paper also introduces some key technologies that make the whole system work efficiently. Owing to these characteristics, the hardware is a suitable platform for multi-channel data collection, image processing, and other signal processing with high performance and adaptability. The application section of this paper presents how this hardware is adapted for a biometric identification system with high identification precision. The results reveal that this hardware interfaces easily with a CMOS imager and is capable of carrying out complex biometric identification algorithms that require real-time processing.

  8. On the “cost-optimal levels” of energy performance requirements and its economic evaluation in Italy

    Directory of Open Access Journals (Sweden)

    Lamberto Tronchin

    2014-10-01

    Full Text Available The European climate and energy package, known as the "20-20-20" targets, defines ambitious but achievable national energy objectives. As regards the Directives closely related to the 2020 targets, the EU Energy Performance of Buildings Directive (EPBD Recast, DIR 2010/31/EU) is the main European legislative instrument for improving the energy performance of buildings, taking into account outdoor climatic and local conditions, as well as indoor climate requirements and cost-effectiveness. The EPBD recast now requests that Member States shall ensure that minimum energy performance requirements for buildings are set "with a view to achieving cost-optimal levels". The cost-optimal level shall be calculated in accordance with a comparative methodology framework, leaving the Member States to determine which of these calculations is to become the national benchmark against which national minimum energy performance requirements will be assessed. The European standards (ENs, Umbrella Document V7, prCEN/TR 15615) are intended to support the EPBD by providing the calculation methods and associated material to obtain the overall energy performance of a building. For Italy, the Energy Performance of Building Simulations (EPBS) must be calculated with the UNI TS 11300 standards. The calculated building energy behaviour refers to standard rather than real use, without local climate data or dynamic energy evaluation. Since retrofitting of existing buildings offers significant opportunities for reducing energy consumption and greenhouse gas emissions, a case study of retrofitting is described and the cost-optimal level EU procedure in an Italian context is analysed. Following this procedure, it is shown not only that the energy cost depends on several conditions, most of which are not indexed at national level, but also that the cost of improvement depends on local variables and contract tender. The case study highlights the difficulties of applying the EU rules, and
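
    The comparative methodology behind "cost-optimal levels" ranks measure packages by a discounted global cost over the calculation period. A simplified sketch with hypothetical figures follows; the full procedure (Regulation (EU) No 244/2012 with EN 15459) also tracks replacement cycles, energy price development, and disposal costs.

```python
def global_cost(investment, annual_cost, years, discount_rate, residual=0.0):
    """Simplified global-cost figure: initial investment plus discounted
    annual (energy + maintenance) costs minus the discounted residual
    value at the end of the calculation period."""
    npv_running = sum(annual_cost / (1 + discount_rate) ** t
                      for t in range(1, years + 1))
    npv_residual = residual / (1 + discount_rate) ** years
    return investment + npv_running - npv_residual

# Two hypothetical retrofit packages over a 30-year calculation period
print(round(global_cost(20000, 900, 30, 0.04), 2))  # cheap to build, dear to run
print(round(global_cost(35000, 400, 30, 0.04), 2))  # dear to build, cheap to run
```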

  9. Distributed redundancy and robustness in complex systems

    KAUST Repository

    Randles, Martin

    2011-03-01

    The uptake and increasing prevalence of Web 2.0 applications, promoting new large-scale and complex systems such as Cloud computing and the emerging Internet of Services/Things, requires tools and techniques to analyse and model methods to ensure the robustness of these new systems. This paper reports on assessing and improving complex system resilience using distributed redundancy, termed degeneracy in biological systems, to endow large-scale complicated computer systems with the same robustness that emerges in complex biological and natural systems. However, in order to promote an evolutionary approach, through emergent self-organisation, it is necessary to specify the systems in an 'open-ended' manner where not all states of the system are prescribed at design-time. In particular an observer system is used to select robust topologies, within system components, based on a measurement of the first non-zero eigenvalue in the Laplacian spectrum of the components' network graphs, also known as the algebraic connectivity. It is shown, through experimentation on a simulation, that increasing the average algebraic connectivity across the components in a network leads to an increase in the variety of individual components, termed distributed redundancy: the capacity for structurally distinct components to perform an identical function in a particular context. The results are applied to a specific application where active clustering of like services is used to aid load balancing in a highly distributed network. Using the described procedure is shown to improve performance and distribute redundancy. © 2010 Elsevier Inc.
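
    The robustness measure referred to above, the algebraic connectivity, is the second-smallest eigenvalue of the graph Laplacian L = D - A; better-connected topologies score higher. A minimal sketch (assuming NumPy is available; the two four-node graphs are illustrative):

```python
import numpy as np

def algebraic_connectivity(adj):
    """Second-smallest eigenvalue of the graph Laplacian L = D - A."""
    A = np.array(adj, dtype=float)
    L = np.diag(A.sum(axis=1)) - A
    eigenvalues = np.linalg.eigvalsh(L)  # returned in ascending order
    return eigenvalues[1]

# Path graph on 4 nodes: 0-1-2-3 (sparsely connected)
path = [[0, 1, 0, 0], [1, 0, 1, 0], [0, 1, 0, 1], [0, 0, 1, 0]]
# Complete graph on 4 nodes (densely connected)
complete = [[0, 1, 1, 1], [1, 0, 1, 1], [1, 1, 0, 1], [1, 1, 1, 0]]

print(algebraic_connectivity(path))      # ~0.586, i.e. 2 - sqrt(2)
print(algebraic_connectivity(complete))  # 4.0
```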

  10. Comparative Performance of Complex-Valued B-Spline and Polynomial Models Applied to Iterative Frequency-Domain Decision Feedback Equalization of Hammerstein Channels.

    Science.gov (United States)

    Chen, Sheng; Hong, Xia; Khalaf, Emad F; Alsaadi, Fuad E; Harris, Chris J

    2017-12-01

    Complex-valued (CV) B-spline neural network approach offers a highly effective means for identifying and inverting practical Hammerstein systems. Compared with its conventional CV polynomial-based counterpart, a CV B-spline neural network has superior performance in identifying and inverting CV Hammerstein systems, while imposing a similar complexity. This paper reviews the optimality of the CV B-spline neural network approach. Advantages of B-spline neural network approach as compared with the polynomial based modeling approach are extensively discussed, and the effectiveness of the CV neural network-based approach is demonstrated in a real-world application. More specifically, we evaluate the comparative performance of the CV B-spline and polynomial-based approaches for the nonlinear iterative frequency-domain decision feedback equalization (NIFDDFE) of single-carrier Hammerstein channels. Our results confirm the superior performance of the CV B-spline-based NIFDDFE over its CV polynomial-based counterpart.

  11. 5 CFR 9901.405 - Performance management system requirements.

    Science.gov (United States)

    2010-01-01

    ...) Holds supervisors and managers accountable for effectively managing the performance of employees under... and communicating performance expectations, monitoring performance and providing feedback, and... (b) of this section, supervisors and managers will— (1) Clearly communicate performance expectations...

  12. Interpersonal Perception: Cognitive Complexity and Trait Implication

    Science.gov (United States)

    Halverson, Charles F., Jr.

    1970-01-01

    Demonstrates that evaluative connotations of personality characteristics have more persuasive effect on interpersonal judgment for persons low in cognitive complexity than for cognitively complex persons. Stresses need for conceptualizing interpersonal judgment as function of interaction between cognitive complexity and evaluative requirements of…

  13. Musical theatre: the hazards of the performer's workplace.

    Science.gov (United States)

    Morton, Jennie

    2015-03-01

    Being a musical theatre performer requires excellence in the combined skills of dancing, singing, and acting, and artists undergo rigorous training in these disciplines in order to achieve the professional standards expected by a discerning audience. However, the performer has more to do than just execute the choreography, vocal repertoire, and dialogue--he or she will also be navigating the often highly complex on-stage and off-stage areas which are fraught with hazards. This article seeks to highlight the challenges that lie beyond the visible part of the performance and to raise questions of how best to equip our musical theatre performers to safely negotiate these issues.

  14. 5 CFR 9701.405 - Performance management system requirements.

    Science.gov (United States)

    2010-01-01

    ... feedback, and developing, rating, and rewarding performance; and (6) Specify the criteria and procedures to... 5 Administrative Personnel 3 2010-01-01 2010-01-01 false Performance management system... HOMELAND SECURITY HUMAN RESOURCES MANAGEMENT SYSTEM Performance Management § 9701.405 Performance...

  15. A Simple and High Performing Rate Control Initialization Method for H.264 AVC Coding Based on Motion Vector Map and Spatial Complexity at Low Bitrate

    Directory of Open Access Journals (Sweden)

    Yalin Wu

    2014-01-01

    Full Text Available The temporal complexity of video sequences can be characterized by a motion vector map consisting of the motion vectors of each macroblock (MB). In order to obtain the optimal initial QP (quantization parameter) for video sequences with different spatial and temporal complexities, this paper proposes a simple and high-performance method, based on the motion vector map and temporal complexity, to decide the initial QP for a given target bit rate. The proposed algorithm produces reconstructed video sequences with outstanding and stable quality. For any video sequence, the initial QP can easily be determined from matrices by target bit rate and mapped spatial complexity using the proposed mapping method. Experimental results show that the proposed algorithm achieves better objective and subjective performance than other conventional determining methods.
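
    The idea of picking an initial QP from the target bit budget and a complexity measure can be sketched as follows. The bits-per-pixel thresholds and the complexity adjustment are illustrative, in the spirit of (but not identical to) H.264 reference rate control; they are not the paper's matrices.

```python
def initial_qp(target_bps, frame_rate, width, height, complexity):
    """Pick an initial quantization parameter from target bits per pixel,
    nudged by a normalized complexity measure in [0, 1]. Thresholds and
    the +4 adjustment span are illustrative assumptions."""
    bpp = target_bps / (frame_rate * width * height)
    if bpp <= 0.15:
        qp = 35
    elif bpp <= 0.45:
        qp = 25
    elif bpp <= 0.9:
        qp = 20
    else:
        qp = 10
    # more complex content -> coarser quantization at the same budget
    qp += round(4 * complexity)
    return max(0, min(51, qp))  # clamp to the H.264 QP range

print(initial_qp(256_000, 30, 352, 288, 0.2))    # 36: low bitrate, simple scene
print(initial_qp(2_000_000, 30, 352, 288, 0.8))  # 23: higher bitrate, busy scene
```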

  16. Argument Complexity: Teaching Undergraduates to Make Better Arguments

    Science.gov (United States)

    Kelly, Matthew A.; West, Robert L.

    2017-01-01

    The task of turning undergrads into academics requires teaching them to reason about the world in a more complex way. We present the Argument Complexity Scale, a tool for analysing the complexity of argumentation, based on the Integrative Complexity and Conceptual Complexity Scales from, respectively, political psychology and personality theory.…

  17. QM/MM studies of cisplatin complexes with DNA dimer and octamer

    KAUST Repository

    Gkionis, Konstantinos

    2012-08-01

    Hybrid QM/MM calculations on adducts of cisplatin with DNA dimer and octamer are reported. Starting from the crystal structure of a cisplatin-DNA dimer complex and an NMR structure of a cisplatin-DNA octamer complex, several variants of the ONIOM approach are tested, all employing BHandH for the QM part and AMBER for MM. We demonstrate that a generic set of molecular mechanics parameters for description of Pt-coordination can be used within the subtractive ONIOM scheme without loss of accuracy, such that dedicated parameters for new platinum complexes may not be required. Comparison of optimised structures obtained with different strategies indicates that electrostatic embedding is vital for proper description of the complex, while inclusion of water molecules as explicit solvent further improves performance. The resulting DNA structural parameters are in good general agreement with the experimental structure obtained, particularly when the inherent variability in NMR-derived parameters is taken into account. © 2012 Elsevier B.V.
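
    The subtractive ONIOM scheme used here combines three energies: a QM calculation on the model (platinated) region, an MM calculation on the whole system, and an MM calculation on the model region. Because the MM description of the model region enters twice with opposite signs, its errors largely cancel, which is why generic Pt parameters can suffice. A one-line sketch with hypothetical energy values:

```python
def oniom2_energy(e_qm_model, e_mm_real, e_mm_model):
    """Two-layer subtractive ONIOM: E = E_QM(model) + E_MM(real) - E_MM(model).
    Errors in the MM treatment of the model region appear in both MM
    terms and largely cancel in the subtraction."""
    return e_qm_model + e_mm_real - e_mm_model

# Hypothetical energies in hartree, not values from the paper
print(oniom2_energy(-1250.0, -3.5, -1.5))  # -1252.0
```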

  18. The NDUFB6 subunit of the mitochondrial respiratory chain complex I is required for electron transfer activity: A proof of principle study on stable and controlled RNA interference in human cell lines

    International Nuclear Information System (INIS)

    Loublier, Sandrine; Bayot, Aurelien; Rak, Malgorzata; El-Khoury, Riyad; Benit, Paule; Rustin, Pierre

    2011-01-01

    Highlights: → NDUFB6 is required for activity of mitochondrial complex I in human cell lines. → Lentivirus based RNA interference results in frequent off target insertions. → Flp-In recombinase mediated miRNA insertion allows gene-specific extinction. -- Abstract: Molecular bases of inherited deficiencies of mitochondrial respiratory chain complex I are still unknown in a high proportion of patients. Among 45 subunits making up this large complex, more than half has unknown function(s). Understanding the function of these subunits would contribute to our knowledge on mitochondrial physiology but might also reveal that some of these subunits are not required for the catalytic activity of the complex. A direct consequence of this finding would be the reduction of the number of candidate genes to be sequenced in patients with decreased complex I activity. In this study, we tested two different methods to stably extinct complex I subunits in cultured cells. We first found that lentivirus-mediated shRNA expression frequently resulted in the unpredicted extinction of additional gene(s) beside targeted ones. This can be ascribed to uncontrolled genetic material insertions in the genome of the host cell. This approach thus appeared inappropriate to study unknown functions of a gene. Next, we found it possible to specifically extinct a CI subunit gene by direct insertion of a miR targeting CI subunits in a Flp site (HEK293 Flp-In cells). By using this strategy we unambiguously demonstrated that the NDUFB6 subunit is required for complex I activity, and defined conditions suitable to undertake a systematic and stable extinction of the different supernumerary subunits in human cells.

  19. The NDUFB6 subunit of the mitochondrial respiratory chain complex I is required for electron transfer activity: A proof of principle study on stable and controlled RNA interference in human cell lines

    Energy Technology Data Exchange (ETDEWEB)

    Loublier, Sandrine; Bayot, Aurelien; Rak, Malgorzata; El-Khoury, Riyad; Benit, Paule [Inserm U676, Hopital Robert Debre, F-75019 Paris (France); Universite Paris 7, Faculte de medecine Denis Diderot, IFR02 Paris (France); Rustin, Pierre, E-mail: pierre.rustin@inserm.fr [Inserm U676, Hopital Robert Debre, F-75019 Paris (France); Universite Paris 7, Faculte de medecine Denis Diderot, IFR02 Paris (France)

    2011-10-22

    Highlights: → NDUFB6 is required for activity of mitochondrial complex I in human cell lines. → Lentivirus based RNA interference results in frequent off target insertions. → Flp-In recombinase mediated miRNA insertion allows gene-specific extinction. -- Abstract: Molecular bases of inherited deficiencies of mitochondrial respiratory chain complex I are still unknown in a high proportion of patients. Among 45 subunits making up this large complex, more than half has unknown function(s). Understanding the function of these subunits would contribute to our knowledge on mitochondrial physiology but might also reveal that some of these subunits are not required for the catalytic activity of the complex. A direct consequence of this finding would be the reduction of the number of candidate genes to be sequenced in patients with decreased complex I activity. In this study, we tested two different methods to stably extinct complex I subunits in cultured cells. We first found that lentivirus-mediated shRNA expression frequently resulted in the unpredicted extinction of additional gene(s) beside targeted ones. This can be ascribed to uncontrolled genetic material insertions in the genome of the host cell. This approach thus appeared inappropriate to study unknown functions of a gene. Next, we found it possible to specifically extinct a CI subunit gene by direct insertion of a miR targeting CI subunits in a Flp site (HEK293 Flp-In cells). By using this strategy we unambiguously demonstrated that the NDUFB6 subunit is required for complex I activity, and defined conditions suitable to undertake a systematic and stable extinction of the different supernumerary subunits in human cells.

  20. Whirlin and PDZ domain-containing 7 (PDZD7) proteins are both required to form the quaternary protein complex associated with Usher syndrome type 2.

    Science.gov (United States)

    Chen, Qian; Zou, Junhuang; Shen, Zuolian; Zhang, Weiping; Yang, Jun

    2014-12-26

    Usher syndrome (USH) is the leading genetic cause of combined hearing and vision loss. Among the three USH clinical types, type 2 (USH2) occurs most commonly. USH2A, GPR98, and WHRN are three known causative genes of USH2, whereas PDZD7 is a modifier gene found in USH2 patients. The proteins encoded by these four USH genes have been proposed to form a multiprotein complex, the USH2 complex, due to interactions found among some of these proteins in vitro, their colocalization in vivo, and mutual dependence of some of these proteins for their normal in vivo localizations. However, evidence showing the formation of the USH2 complex is missing, and details on how this complex is formed remain elusive. Here, we systematically investigated interactions among the intracellular regions of the four USH proteins using colocalization, yeast two-hybrid, and pull-down assays. We show that multiple domains of the four USH proteins interact among one another. Importantly, both WHRN and PDZD7 are required for the complex formation with USH2A and GPR98. In this USH2 quaternary complex, WHRN prefers to bind to USH2A, whereas PDZD7 prefers to bind to GPR98. Interaction between WHRN and PDZD7 is the bridge between USH2A and GPR98. Additionally, the USH2 quaternary complex has a variable stoichiometry. These findings suggest that a non-obligate, short term, and dynamic USH2 quaternary protein complex may exist in vivo. Our work provides valuable insight into the physiological role of the USH2 complex in vivo and informs possible reconstruction of the USH2 complex for future therapy. © 2014 by The American Society for Biochemistry and Molecular Biology, Inc.

  1. Performing the processing required for automatically get a PDF/A version of the CERN Library documentation

    CERN Document Server

    Molina Garcia-Retamero, Antonio

    2015-01-01

    The aim of the project was to perform the processing required to automatically produce a PDF/A version of the CERN Library documentation. For this, it is necessary to extract as much metadata as possible from the source files and to inject the required data into the original source files, creating new ones ready to be compiled with all related dependencies. Besides this, I proposed the creation of an HTML version consistent with the PDF and navigable for easy access; I experimented with Natural Language Processing for extracting metadata; and I proposed injecting the CERN Library documentation into the HTML version of the long write-ups where it is referenced (for instance, when a CERN Library function is referenced in a sample code). Finally, I designed and implemented a Graphical User Interface in order to simplify the process for the user.

  2. Task Complexity Modulates Sleep-Related Offline Learning in Sequential Motor Skills

    Directory of Open Access Journals (Sweden)

    Klaus Blischke

    2017-07-01

    Full Text Available Recently, a number of authors have advocated the introduction of gross motor tasks into research on sleep-related motor offline learning. Such tasks are often designed to be more complex than traditional key-pressing tasks. However, until now, little effort has been undertaken to scrutinize the role of task complexity in any systematic way. Therefore, the effect of task complexity on the consolidation of gross motor sequence memory was examined by our group in a series of three experiments. Criterion tasks always required participants to produce unrestrained arm movement sequences by successively fitting a small peg into target holes on a pegboard. The sequences always followed a certain spatial pattern in the horizontal plane. The targets were visualized prior to each transport movement on a computer screen. The tasks differed with respect to sequence length and structural complexity. In each experiment, half of the participants initially learned the task in the morning and were retested 12 h later following a wake retention interval. The other half of the subjects underwent practice in the evening and were retested 12 h later following a night of sleep. The dependent variables were the error rate and the total sequence execution time (the inverse of the sequence execution speed). Performance generally improved during acquisition. The error rate was always low and remained stable during retention. The sequence execution time significantly decreased again following sleep but not after waking when the sequence length was long and structural complexity was high. However, sleep-related offline improvements were absent when the sequence length was short or when subjects performed a highly regular movement pattern. It is assumed that the occurrence of sleep-related offline performance improvements in sequential motor tasks is associated with a sufficient amount of motor task complexity.

  3. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...
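
    The density/decoding-cost trade-off that TSNC tunes can be seen in a toy random linear network coding sketch over GF(2), where each coded packet XORs a random subset of source packets and decoding is Gaussian elimination. This illustrates the underlying mechanism only; it is not the authors' tuning functions.

```python
import random

def encode(packets, density, rng):
    """One coded packet: XOR of a random subset of source packets.
    `density` is the inclusion probability per packet; lower density
    means cheaper decoding but more risk of dependent packets."""
    n = len(packets)
    coeffs = []
    while not any(coeffs):  # reject the all-zero coefficient vector
        coeffs = [1 if rng.random() < density else 0 for _ in range(n)]
    payload = 0
    for c, p in zip(coeffs, packets):
        if c:
            payload ^= p
    return coeffs, payload

def decode(coded, n):
    """Gaussian elimination over GF(2). Returns the n source packets
    once the coded packets reach full rank, else None."""
    basis = {}  # leading column -> (coeffs, payload)
    for coeffs, payload in coded:
        coeffs = list(coeffs)
        while True:  # reduce the row by its leading coefficient
            lead = next((i for i, c in enumerate(coeffs) if c), None)
            if lead is None or lead not in basis:
                break
            bc, bp = basis[lead]
            coeffs = [a ^ b for a, b in zip(coeffs, bc)]
            payload ^= bp
        if lead is not None:
            basis[lead] = (coeffs, payload)
    if len(basis) < n:
        return None
    out = [0] * n
    for col in sorted(basis, reverse=True):  # back-substitution
        coeffs, payload = basis[col]
        for j in range(col + 1, n):
            if coeffs[j]:
                payload ^= out[j]
        out[col] = payload
    return out

rng = random.Random(1)
source = [0x41, 0x42, 0x43, 0x44]  # four one-byte "packets"
coded, recovered = [], None
while recovered is None:           # collect coded packets until decodable
    coded.append(encode(source, 0.5, rng))
    recovered = decode(coded, len(source))
print(recovered == source)  # True
```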

  4. Cdt1p, through its interaction with Mcm6p, is required for the formation, nuclear accumulation and chromatin loading of the MCM complex.

    Science.gov (United States)

    Wu, Rentian; Wang, Jiafeng; Liang, Chun

    2012-01-01

    Regulation of DNA replication initiation is essential for the faithful inheritance of genetic information. Replication initiation is a multi-step process involving many factors including ORC, Cdt1p, Mcm2-7p and other proteins that bind to replication origins to form a pre-replicative complex (pre-RC). As a prerequisite for pre-RC assembly, Cdt1p and the Mcm2-7p heterohexameric complex accumulate in the nucleus in G1 phase in an interdependent manner in budding yeast. However, the nature of this interdependence is not clear, nor is it known whether Cdt1p is required for the assembly of the MCM complex. In this study, we provide the first evidence that Cdt1p, through its interaction with Mcm6p via the C-terminal regions of the two proteins, is crucial for the formation of the MCM complex in both the cytoplasm and nucleoplasm. We demonstrate that disruption of the interaction between Cdt1p and Mcm6p prevents the formation of the MCM complex, excludes Mcm2-7p from the nucleus, and inhibits pre-RC assembly and DNA replication. Our findings suggest a function for Cdt1p in promoting the assembly of the MCM complex and maintaining its integrity by interacting with Mcm6p.

  5. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites to confirm their safety. In this regard, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is evinced from the comparisons between SC scores and operators' performance data. To this end, the comparisons between operators' performance data with their behavior in conducting prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable in confirming the appropriateness of cognitive complexity factors, it is strongly believed that the change of an operator's performance can be more authentically explained if they are taken into consideration.

  6. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea

    2004-01-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the prerequisites for ensuring safety. In this regard, the step complexity (SC) measure, which can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP), was previously suggested. However, the necessity of additional complexity factors that can capture a cognitive aspect in evaluating the complexity of procedural steps is evident from comparisons between SC scores and operators' performance data. To this end, this study compares operators' performance data with their behavior in conducting the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of these cognitive complexity factors, it is strongly believed that changes in operators' performance can be explained more faithfully if they are taken into consideration.

  7. ICDF Complex Remedial Action Work Plan

    Energy Technology Data Exchange (ETDEWEB)

    W. M. Heileson

    2006-12-01

    This Remedial Action Work Plan provides the framework for operation of the Idaho Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Disposal Facility Complex (ICDF). This facility includes (a) an engineered landfill that meets the substantial requirements of DOE Order 435.1, Resource Conservation and Recovery Act Subtitle C, Idaho Hazardous Waste Management Act, and Toxic Substances Control Act polychlorinated biphenyl landfill requirements; (b) centralized receiving, inspections, administration, storage/staging, and treatment facilities necessary for CERCLA investigation-derived, remedial, and removal waste at the Idaho National Laboratory (INL) prior to final disposition in the disposal facility or shipment off-Site; and (c) an evaporation pond that has been designated as a corrective action management unit. The ICDF Complex, including a buffer zone, will cover approximately 40 acres, with a landfill disposal capacity of approximately 510,000 yd3. The ICDF Complex is designed and authorized to accept INL CERCLA-generated wastes, and includes the necessary subsystems and support facilities to provide a complete waste management system. This Remedial Action Work Plan presents the operational approach and requirements for the various components that are part of the ICDF Complex. Summaries of the remedial action work elements are presented herein, with supporting information and documents provided as appendixes to this work plan that contain specific detail about the operation of the ICDF Complex. This document presents the planned operational process based upon an evaluation of the remedial action requirements set forth in the Operable Unit 3-13 Final Record of Decision.

  8. Decision paths in complex tasks

    Science.gov (United States)

    Galanter, Eugene

    1991-01-01

    Complex real world action and its prediction and control have escaped analysis by the classical methods of psychological research. The reason is that psychologists have no procedures to parse complex tasks into their constituents. Where such a division can be made, based say on expert judgment, there is no natural scale to measure the positive or negative values of the components. Even if we could assign numbers to task parts, we lack rules, i.e., a theory, to combine them into a total task representation. We compare here two plausible theories for the amalgamation of the value of task components. Both of these theories require a numerical representation of motivation, for motivation is the primary variable that guides choice and action in well-learned tasks. We address this problem of motivational quantification and performance prediction by developing psychophysical scales of the desirability or aversiveness of task components based on utility scaling methods (Galanter 1990). We modify methods used originally to scale sensory magnitudes (Stevens and Galanter 1957), and that have been applied recently to the measure of task 'workload' by Gopher and Braune (1984). Our modification uses utility comparison scaling techniques which avoid the unnecessary assumptions made by Gopher and Braune. Formulas for the utility of complex tasks based on the theoretical models are used to predict decision and choice of alternate paths to the same goal.
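    The contrast between the two amalgamation theories can be illustrated with a toy computation. The additive and multiplicative rules below are generic candidates for combining component values into a total-task utility, not Galanter's actual formulas, and the utility numbers are invented.

```python
from math import prod

# Hypothetical component utilities for three parts of a task,
# on an arbitrary ratio scale (invented numbers, for illustration only).
components = [0.8, 0.6, 0.9]

# Two candidate amalgamation rules: a simple sum, where each part
# contributes independently, and a simple product, where a single
# low-value component drags down the whole task.
additive = sum(components)
multiplicative = prod(components)

print(additive, multiplicative)
```

    Under the additive rule the task scores 2.3; under the multiplicative rule it scores 0.432, showing how strongly the choice of combination rule changes the predicted total-task value.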

  9. The JPL functional requirements tool

    Science.gov (United States)

    Giffin, Geoff; Skinner, Judith; Stoller, Richard

    1987-01-01

    Planetary spacecraft are complex vehicles which are built according to many thousands of requirements. Problems encountered in documenting and maintaining these requirements led to the current attempt to reduce or eliminate these problems with a computer-automated database, the Functional Requirements Tool. The tool developed at JPL and in use on several JPL projects is described. The organization and functionality of the Tool, together with an explanation of the database inputs, their relationships, and use are presented. Methods of interfacing with external documents, representation of tables and figures, and methods of approval and change processing are discussed. The options available for disseminating information from the Tool are identified. The implementation of the Requirements Tool is outlined, and its operation is summarized. The conclusions drawn from this work are that the Requirements Tool is a useful addition to the system engineer's tool kit, that it is not currently available elsewhere, and that a clear development path exists to expand the capabilities of the Tool to serve larger and more complex projects.

  10. 42 CFR 433.123 - Notification of changes in system requirements, performance standards or other conditions for...

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Notification of changes in system requirements, performance standards or other conditions for approval or reapproval. 433.123 Section 433.123 Public Health... ASSISTANCE PROGRAMS STATE FISCAL ADMINISTRATION Mechanized Claims Processing and Information Retrieval...

  11. Metric-based method of software requirements correctness improvement

    Directory of Open Access Journals (Sweden)

    Yaremchuk Svitlana

    2017-01-01

    Full Text Available The work highlights the most important principles of software reliability management (SRM). The SRM concept construes a basis for developing a method of requirements correctness improvement. The method assumes that complicated requirements contain more actual and potential design faults/defects. The method applies a newer metric to evaluate the requirements complexity and a double-sorting technique that evaluates the priority and complexity of a particular requirement. The method enables improvement of requirements correctness by identifying a higher number of defects with restricted resources. Practical application of the proposed method in the course of requirements review yielded a tangible technical and economic effect.
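    As a sketch of how a double-sorting pass over requirements might look (the field names, the priority scale, and the ordering rule here are our assumptions, not the paper's actual metric):

```python
# Hypothetical requirements, each with an analyst-assigned priority
# (1 = most important) and a computed complexity score.
requirements = [
    {"id": "R1", "priority": 2, "complexity": 7.5},
    {"id": "R2", "priority": 1, "complexity": 9.1},
    {"id": "R3", "priority": 1, "complexity": 4.2},
    {"id": "R4", "priority": 2, "complexity": 3.0},
]

# Double sort: priority first, then complexity descending, so that
# review effort under restricted resources goes to complicated,
# high-priority requirements, where more defects are assumed to hide.
ordered = sorted(requirements, key=lambda r: (r["priority"], -r["complexity"]))
print([r["id"] for r in ordered])
```

    With these invented values the review order comes out as R2, R3, R1, R4: the high-priority, high-complexity requirement is inspected first.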

  12. Remediation management of complex sites using an adaptive site management approach.

    Science.gov (United States)

    Price, John; Spreng, Carl; Hawley, Elisabeth L; Deeb, Rula

    2017-12-15

    Complex sites require a disproportionate amount of resources for environmental remediation and long timeframes to achieve remediation objectives, due to their complex geologic conditions, hydrogeologic conditions, geochemical conditions, contaminant-related conditions, large scale of contamination, and/or non-technical challenges. State and federal environmental regulators, federal agency representatives, industry experts, community stakeholders, and academics recently worked together as an Interstate Technology & Regulatory Council (ITRC) team to compile resources and create new guidance on the remediation management of complex sites. This article summarizes the ITRC team's recommended process for addressing complex sites through an adaptive site management approach. The team provided guidance for site managers and other stakeholders to evaluate site complexities and determine site remediation potential, i.e., whether an adaptive site management approach is warranted. Adaptive site management was described as a comprehensive, flexible approach to iteratively evaluate and adjust the remedial strategy in response to remedy performance. Key aspects of adaptive site management were described, including tools for revising and updating the conceptual site model (CSM), the importance of setting interim objectives to define short-term milestones on the journey to achieving site objectives, establishing a performance model and metrics to evaluate progress towards meeting interim objectives, comparing actual with predicted progress during scheduled periodic evaluations, and establishing decision criteria for when and how to adapt/modify/revise the remedial strategy in response to remedy performance. Key findings will be published in an ITRC Technical and Regulatory guidance document in 2017 and free training webinars will be conducted. More information is available at www.itrc-web.org. Copyright © 2017 Elsevier Ltd. All rights reserved.

  13. CONSIDERATIONS REGARDING THE IMPLEMENTATION OF A PERFORMANCE MANAGEMENT SYSTEM IN PRIVATE HOSPITALS

    Directory of Open Access Journals (Sweden)

    Marian TAICU

    2013-09-01

    Full Text Available Obtaining performance in private hospitals requires proper cost management and the implementation of a statement for performance monitoring. The implementation of a cost calculation method in hospitals is a complex process that must take into account the particularities of activity in the health care system. This paper presents a comparative analysis of four costing methods and a model performance monitoring statement, adapted to the specifics of hospitals.

  14. Analysis of complex networks using aggressive abstraction.

    Energy Technology Data Exchange (ETDEWEB)

    Colbaugh, Richard; Glass, Kristin.; Willard, Gerald

    2008-10-01

    This paper presents a new methodology for analyzing complex networks in which the network of interest is first abstracted to a much simpler (but equivalent) representation, the required analysis is performed using the abstraction, and analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit abstractions that are simultaneously dramatically simplifying and property preserving -- we call these aggressive abstractions -- and which can therefore be analyzed using the proposed approach. We then introduce and develop two forms of aggressive abstraction: 1.) finite state abstraction, in which dynamical networks with uncountable state spaces are modeled using finite state systems, and 2.) one-dimensional abstraction, whereby high dimensional network dynamics are captured in a meaningful way using a single scalar variable. In each case, the property preserving nature of the abstraction process is rigorously established and efficient algorithms are presented for computing the abstraction. The considerable potential of the proposed approach to complex networks analysis is illustrated through case studies involving vulnerability analysis of technological networks and predictive analysis for social processes.
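    The finite-state abstraction idea can be illustrated in miniature: coarse-grain a continuous state space into finitely many cells and record which cells the dynamics can reach from each one. This toy coarse-graining of a scalar map is our own sketch under assumed dynamics, not the paper's construction.

```python
# Abstract a scalar dynamical system x' = f(x) on [0, 1] into N
# intervals (finite states) and record, by sampling, which intervals
# are reachable from each one. The map f is a hypothetical example.
N = 4
f = lambda x: 3.2 * x * (1 - x)

def bin_of(x, n=N):
    """Index of the interval [k/n, (k+1)/n) containing x, clamped to n-1."""
    return min(int(x * n), n - 1)

# Build the finite-state transition relation by sampling the interval.
transitions = {i: set() for i in range(N)}
for k in range(1001):
    x = k / 1000
    transitions[bin_of(x)].add(bin_of(f(x)))

print(transitions)
```

    Questions about the original system (e.g., which regions are reachable) can then be asked of the small transition relation instead of the uncountable state space; the paper's contribution is establishing when such an abstraction provably preserves the properties of interest.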

  15. Sequestration Coating Performance Requirements for Mitigation of Contamination from a Radiological Dispersion Device

    International Nuclear Information System (INIS)

    Drake, J.

    2009-01-01

    Immediate action would be necessary to minimize the effects of a radiological 'dirty bomb' detonation in a major city. After a dirty bomb has been detonated, vehicular and pedestrian traffic, as well as weather effects, would increase the spread of loose contamination, making control and recovery more difficult and costly. While contaminant migration and chemical binding into surface materials can be relatively rapid, the immediate treatment of surfaces with large quantities of an appropriate compound could alleviate much of the difficulty in decontamination. The EPA's National Homeland Security Research Center (NHSRC), in collaboration with ASTM International, is currently developing performance standards for materials which could be applied to exterior surfaces contaminated by an RDD to mitigate the spread and migration of radioactive contamination. These performance standards are being promulgated via an ASTM Standard Specification to be published by ASTM International. Test methods will be developed to determine if candidate coatings meet the performance requirements stipulated in the ASTM performance standard. These test methods will be adapted from existing standard methods, or will be devised through laboratory research. The final set of test methods will be codified in an ASTM or other standard test method. The principal market for products described in the ASTM performance standard would be federal, state and local government emergency responders and response planners, decontamination service providers and those whose interests include protection and recovery of real estate potentially at risk from radiological terrorism. (authors)

  16. A framework for the analysis of cognitive reliability in complex systems: a recovery centred approach

    International Nuclear Information System (INIS)

    Kontogiannis, Tom

    1997-01-01

    Managing complex industrial systems requires reliable performance of cognitive tasks undertaken by operating crews. The infrequent practice of cognitive skills and the reliance on operator performance in novel situations have raised cognitive reliability into an urgent and essential aspect of system design and risk analysis. The aim of this article is to contribute to the development of methods for the analysis of cognitive tasks in complex man-machine interactions. A practical framework is proposed for analysing cognitive errors and enhancing error recovery through interface design. Cognitive errors are viewed as failures in problem solving which are difficult to recover under the task constraints imposed by complex systems. In this sense, the interaction between context and cognition, on the one hand, and the process of error recovery, on the other hand, become the focal points of the proposed framework, which is illustrated in an analysis of a simulated emergency.

  17. Systematic handling of requirements and conditions (in compliance with waste acceptance requirements for a radioactive waste disposal facility)

    International Nuclear Information System (INIS)

    Keyser, Peter; Helander, Anita

    2012-01-01

    environment; iv) Aiding decision making on the authorization / licensing of radioactive waste disposal; and v) Facilitating communication amongst stakeholders on issues relating to the disposal facility. How can we ensure and control compliance with WAC during pre-disposal activities? The link between the safety cases of pre-disposal activities and the disposal facility is primarily the Waste Acceptance Criteria (WAC), defined as 'those requirements that are to be met by conditioned radioactive wastes, forming packages, to be accepted at an Interim Storage or a Disposal Facility'. It is advised that WAC also be set up for each stage of the pre-disposal activities in the Waste Management Plan or Strategy. Waste characterization requirements are typically developed from disposal performance assessment in addition to waste acceptance criteria (WAC), process control and quality assurance requirements, transportation requirements, and worker safety requirements. A matrix showing where each WAC originates can greatly assist with understanding the philosophy behind the overall characterization program and put the elements into context. The complexity of waste categorization requires systematic handling of requirements and conditions during pre-disposal activities. How can we ensure the fulfillment of WAC for a radioactive waste disposal facility? Requirements management, sometimes called configuration management, is an area that has recently received increasing attention in the project management context. International guidelines exist on the use of configuration management within an organization, applicable to the support of products from concept to disposal. They first outline the responsibilities and authorities before describing the configuration management process, which includes configuration management planning, configuration identification, change control, configuration status accounting and configuration audit.
The methodology develops a

  18. Multiple Stressors and Ecological Complexity Require A New Approach to Coral Reef Research

    Directory of Open Access Journals (Sweden)

    Linwood Hagan Pendleton

    2016-03-01

    Full Text Available Ocean acidification, climate change, and other environmental stressors threaten coral reef ecosystems and the people who depend upon them. New science reveals that these multiple stressors interact and may affect a multitude of physiological and ecological processes in complex ways. The interaction of multiple stressors and ecological complexity may mean that the negative effects on coral reef ecosystems will happen sooner and be more severe than previously thought. Yet, most research on the effects of global change on coral reefs focuses on one or a few stressors and pathways or outcomes (e.g., bleaching). Based on a critical review of the literature, we call for a regionally targeted strategy of mesocosm-level research that addresses this complexity and provides more realistic projections about coral reef impacts in the face of global environmental change. We believe similar approaches are needed for other ecosystems that face global environmental change.

  19. Complex analysis

    CERN Document Server

    Freitag, Eberhard

    2005-01-01

    The guiding principle of this presentation of ``Classical Complex Analysis'' is to proceed as quickly as possible to the central results while using a small number of notions and concepts from other fields. Thus the prerequisites for understanding this book are minimal; only elementary facts of calculus and algebra are required. The first four chapters cover the essential core of complex analysis: - differentiation in C (including elementary facts about conformal mappings) - integration in C (including complex line integrals, Cauchy's Integral Theorem, and the Integral Formulas) - sequences and series of analytic functions, (isolated) singularities, Laurent series, calculus of residues - construction of analytic functions: the gamma function, Weierstrass' Factorization Theorem, Mittag-Leffler Partial Fraction Decomposition, and -as a particular highlight- the Riemann Mapping Theorem, which characterizes the simply connected domains in C. Further topics included are: - the theory of elliptic functions based on...

  20. Improvement of a three-dimensional atmospheric dynamic model and examination of its performance over complex terrain

    International Nuclear Information System (INIS)

    Nagai, Haruyasu; Yamazawa, Hiromi

    1994-11-01

    A three-dimensional atmospheric dynamic model (PHYSIC) was improved and its performance was examined using meteorological data observed at a coastal area with complex terrain. To introduce synoptic meteorological conditions into the model, the initial and boundary conditions were improved. With this improvement, the model can predict the temporal change of the wind field for more than 24 hours. Moreover, the model successfully simulates the land and sea breeze observed in the Shimokita area in the summer of 1992. (author)

  1. Technical requirements for the actinide source-term waste test program

    Energy Technology Data Exchange (ETDEWEB)

    Phillips, M.L.F.; Molecke, M.A.

    1993-10-01

    This document defines the technical requirements for a test program designed to measure time-dependent concentrations of actinide elements from contact-handled transuranic (CH TRU) waste immersed in brines similar to those found in the underground workings of the Waste Isolation Pilot Plant (WIPP). This test program will determine the influences of TRU waste constituents on the concentrations of dissolved and suspended actinides relevant to the performance of the WIPP. These influences (which include pH, Eh, complexing agents, sorbent phases, and colloidal particles) can affect solubilities and colloidal mobilization of actinides. The test concept involves fully inundating several TRU waste types with simulated WIPP brines in sealed containers and monitoring the concentrations of actinide species in the leachate as a function of time. The results from this program will be used to test numeric models of actinide concentrations derived from laboratory studies. The model is required for WIPP performance assessment with respect to the Environmental Protection Agency's 40 CFR Part 191B.

  2. Technical requirements for the actinide source-term waste test program

    International Nuclear Information System (INIS)

    Phillips, M.L.F.; Molecke, M.A.

    1993-10-01

    This document defines the technical requirements for a test program designed to measure time-dependent concentrations of actinide elements from contact-handled transuranic (CH TRU) waste immersed in brines similar to those found in the underground workings of the Waste Isolation Pilot Plant (WIPP). This test program will determine the influences of TRU waste constituents on the concentrations of dissolved and suspended actinides relevant to the performance of the WIPP. These influences (which include pH, Eh, complexing agents, sorbent phases, and colloidal particles) can affect solubilities and colloidal mobilization of actinides. The test concept involves fully inundating several TRU waste types with simulated WIPP brines in sealed containers and monitoring the concentrations of actinide species in the leachate as a function of time. The results from this program will be used to test numeric models of actinide concentrations derived from laboratory studies. The model is required for WIPP performance assessment with respect to the Environmental Protection Agency's 40 CFR Part 191B.

  3. Diabetic retinopathy and complexity of retinal surgery in a general hospital.

    Science.gov (United States)

    Mijangos-Medina, Laura Fanny; Hurtado-Noriega, Blanca Esmeralda; Lima-Gómez, Virgilio

    2012-01-01

    Usual retinal surgery (vitrectomy or surgery for retinal detachment) may require additional procedures to deal with complex cases, which increase time and resource use and delay access to treatment. We undertook this study to identify the proportion of primary retinal surgeries that required complex procedures and the associated causes. We carried out an observational, descriptive, cross-sectional, retrospective study. Patients with primary retinal surgery were evaluated (January 2007-December 2010). The proportion and 95% confidence intervals (CI) of preoperative diagnosis and cause of the disease requiring retinal surgery, as well as the causes for complex retinal surgery, were identified. Complex retinal surgery was defined as that requiring lens extraction, intraocular lens implantation, heavy perfluorocarbon liquids, silicone oil tamponade or intravitreal drugs, in addition to the usual surgical retinal procedure. The proportion of complex retinal surgeries was compared among preoperative diagnoses and among causes (χ(2), odds ratio [OR]). We studied 338 eyes. Mean age of subjects was 53.7 years, and 49% were female. The most common diagnoses were vitreous hemorrhage (27.2%) and rhegmatogenous retinal detachment (24.6%). The most common cause was diabetes (50.6%); 273 eyes required complex surgery (80.8%, 95% CI: 76.6-85). The proportion did not differ among diagnoses but was higher in eyes with diabetic retinopathy (89%); diabetic retinopathy increased by 3-fold the probability of requiring these complex procedures. Early treatment of diabetic retinopathy may reduce the proportion of complex retinal surgery by 56%.
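    The reported proportion of complex surgeries (273 of 338 eyes, 80.8%, 95% CI 76.6-85) is consistent with a standard normal-approximation interval; the quick check below is ours, not the authors' code.

```python
import math

def proportion_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# 273 of 338 eyes required complex surgery (figures from the abstract).
p, lo, hi = proportion_ci(273, 338)
print(f"{p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

    This reproduces the quoted 80.8% with a 95% CI of 76.6% to 85.0%, matching the abstract to one decimal place.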

  4. Endosomal sorting complexes required for ESCRTing cells toward death during neurogenesis, neurodevelopment and neurodegeneration.

    Science.gov (United States)

    Kaul, Zenia; Chakrabarti, Oishee

    2018-03-25

    The endosomal sorting complexes required for transport (ESCRT) proteins help in the recognition, sorting and degradation of ubiquitinated cargoes from the cell surface, long-lived proteins or aggregates, and aged organelles present in the cytosol. These proteins take part in the endo-lysosomal system of degradation. The ESCRT proteins also play an integral role in cytokinesis, viral budding and mRNA transport. Many neurodegenerative diseases are caused by toxic accumulation of cargo in the cell, which causes stress and ultimately leads to neuronal death. This accumulation of cargo occurs because of defects in the endo-lysosomal degradative pathway-loss of function of ESCRTs has been implicated in this mechanism. ESCRTs also take part in many survival processes, lack of which can culminate in neuronal cell death. While the role played by the ESCRT proteins in maintaining healthy neurons is known, their role in neurodegenerative diseases is still poorly understood. In this review, we highlight the importance of ESCRTs in maintaining healthy neurons and then suggest how perturbations in many of the survival mechanisms governed by these proteins could eventually lead to cell death; quite often these correlations are not so obviously laid out. Extensive neuronal death eventually culminates in neurodegeneration. © 2018 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Rybp, a polycomb complex-associated protein, is required for mouse eye development

    Directory of Open Access Journals (Sweden)

    Schreiber-Agus Nicole

    2007-04-01

    Full Text Available Abstract Background Rybp (Ring1 and YY1 binding protein is a zinc finger protein which interacts with the members of the mammalian polycomb complexes. Previously we have shown that Rybp is critical for early embryogenesis and that haploinsufficiency of Rybp in a subset of embryos causes failure of neural tube closure. Here we investigated the requirement for Rybp in ocular development using four in vivo mouse models which resulted in either the ablation or overexpression of Rybp. Results Our results demonstrate that loss of a single Rybp allele in conventional knockout mice often resulted in retinal coloboma, an incomplete closure of the optic fissure, characterized by perturbed localization of Pax6 but not of Pax2. In addition, about one half of Rybp-/- Rybp+/+ chimeric embryos also developed retinal colobomas and malformed lenses. Tissue-specific transgenic overexpression of Rybp in the lens resulted in abnormal fiber cell differentiation and severe lens opacification with increased levels of AP-2α and Sox2, and reduced levels of βA4-crystallin gene expression. Ubiquitous transgenic overexpression of Rybp in the entire eye caused abnormal retinal folds, corneal neovascularization, and lens opacification. Additional changes included defects in anterior eye development. Conclusion These studies establish Rybp as a novel gene that has been associated with coloboma. Other genes linked to coloboma encode various classes of transcription factors such as BCOR, CBP, Chx10, Pax2, Pax6, Six3, Ski, Vax1 and Vax2. We propose that the multiple functions for Rybp in regulating mouse retinal and lens development are mediated by genetic, epigenetic and physical interactions between these genes and proteins.

  6. Measuring acute rehabilitation needs in trauma: preliminary evaluation of the Rehabilitation Complexity Scale.

    Science.gov (United States)

    Hoffman, Karen; West, Anita; Nott, Philippa; Cole, Elaine; Playford, Diane; Liu, Clarence; Brohi, Karim

    2013-01-01

    Injury severity, disability and care dependency are frequently used as surrogate measures for rehabilitation requirements following trauma. The true rehabilitation needs of patients may be different, but there are no validated tools for the measurement of rehabilitation complexity in acute trauma care. The aim of the study was to evaluate the potential utility of the Rehabilitation Complexity Scale (RCS) version 2 in measuring acute rehabilitation needs in trauma patients. A prospective observational study of 103 patients with traumatic injuries in a Major Trauma Centre. Rehabilitation complexity was measured using the RCS and disability was measured using the Barthel Index. Demographic information and injury characteristics were obtained from the trauma database. The RCS was closely correlated with injury severity (r=0.69, p<0.001) and the Barthel Index (r=0.91, p<0.001). However, the Barthel was poor at discriminating between patients' rehabilitation needs, especially for patients with higher injury severities. Of 58 patients classified as 'very dependent' by the Barthel, 21 (36%) had low or moderate rehabilitation complexity. The RCS correlated with acute hospital length of stay (r=0.64, p<0.001) and patients with a low RCS were more likely to be discharged home. The Barthel, in contrast, had a flooring effect (56% of patients classified as very dependent were discharged home) and lacked discrimination despite close statistical correlation. The RCS outperformed the ISS and the Barthel in its ability to identify rehabilitation requirements in relation to injury severity, rehabilitation complexity, length of stay and discharge destination. The RCS is potentially a feasible and useful tool for the assessment of rehabilitation complexity in acute trauma care by providing specific measurement of patients' rehabilitation requirements. A larger longitudinal study is needed to evaluate the RCS in the assessment of patient need, service provision and trauma system performance.
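    Pearson's r, the statistic behind the quoted correlations, is straightforward to compute from paired scores; the numbers below are synthetic, not the study's data.

```python
from math import sqrt
from statistics import mean

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient between two paired sequences."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rcs = [2, 5, 7, 9, 12]          # hypothetical RCS scores
barthel = [90, 75, 50, 45, 20]  # hypothetical Barthel scores (higher = more independent)
print(pearson_r(rcs, barthel))
```

    With these invented values the coefficient is strongly negative, as one would expect if higher rehabilitation complexity accompanies greater dependency.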

  7. Intracellular Transport of Vaccinia Virus in HeLa Cells Requires WASH-VPEF/FAM21-Retromer Complexes and Recycling Molecules Rab11 and Rab22

    Science.gov (United States)

    Hsiao, Jye-Chian; Chu, Li-Wei; Lo, Yung-Tsun; Lee, Sue-Ping; Chen, Tzu-Jung; Huang, Cheng-Yen

    2015-01-01

    ABSTRACT Vaccinia virus, the prototype of the Orthopoxvirus genus in the family Poxviridae, infects a wide range of cell lines and animals. Vaccinia mature virus particles of the WR strain reportedly enter HeLa cells through fluid-phase endocytosis. However, the intracellular trafficking process of the vaccinia mature virus between cellular uptake and membrane fusion remains unknown. We used live imaging of single virus particles with a combination of various cellular vesicle markers, to track fluorescent vaccinia mature virus particle movement in cells. Furthermore, we performed functional interference assays to perturb distinct vesicle trafficking processes in order to delineate the specific route undertaken by vaccinia mature virus prior to membrane fusion and virus core uncoating in cells. Our results showed that vaccinia virus traffics to early endosomes, where recycling endosome markers Rab11 and Rab22 are recruited to participate in subsequent virus trafficking prior to virus core uncoating in the cytoplasm. Furthermore, we identified WASH-VPEF/FAM21-retromer complexes that mediate endosome fission and sorting of virus-containing vesicles prior to virus core uncoating in the cytoplasm. IMPORTANCE Vaccinia mature virions of the WR strain enter HeLa cells through fluid phase endocytosis. We previously demonstrated that virus-containing vesicles are internalized into phosphatidylinositol 3-phosphate positive macropinosomes, which are then fused with Rab5-positive early endosomes. However, the subsequent process of sorting the virion-containing vesicles prior to membrane fusion remains unclear. We dissected the intracellular trafficking pathway of vaccinia mature virions in cells up to virus core uncoating in cytoplasm. We show that vaccinia mature virions first travel to early endosomes. Subsequent trafficking events require the important endosome-tethered protein VPEF/FAM21, which recruits WASH and retromer protein complexes to the endosome. 
There, the complex

  8. Low-Complexity Interference-Free Downlink Channel Assignment with Improved Performance in Coordinated Small Cells

    KAUST Repository

    Radaydeh, Redha M.

    2015-05-01

    This paper proposes a low-complexity interference-free channel assignment scheme with improved desired downlink performance in coordinated multi-antenna small-coverage access points (APs) that employ the open-access control strategy. The adopted system treats the case when each user can be granted access to one of the available channels at a time. Moreover, each receive terminal can suppress a limited number of resolvable interfering sources via its highly correlated receive array. On the other hand, the operation of the deployed APs can be coordinated to serve active users, and the availability of multiple physical channels and the use of uncorrelated transmit antennas at each AP are exploited to improve the performance of supported users. The analysis provides new approaches that use the transmit antenna array at each AP, the multiple physical channels, and the receive antenna array at each user in order to identify interference-free channels for each user, and then to select the downlink channel that provides the best possible performance. The event of concurrent interference-free channel identification by different users is also treated to further improve the desired link associated with the scheduled user. The analysis considers the practical scenario of imperfect identification of an interference-free channel by an active user and/or imperfection in scheduling concurrent users' requests on the same channel. The developed formulations can be used to study any performance metric, and they are applicable to any statistical and geometric channel models. © 2015 IEEE.
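    The two-stage selection the abstract describes (first identify interference-free channels per user, then pick the best-performing candidate) can be sketched as follows. This is a minimal illustrative model, not the paper's formulation: the `max_suppress` limit stands in for the receive array's ability to suppress a limited number of resolvable interferers, and the per-channel gain table is an assumed stand-in for whatever link metric the analysis uses.

    ```python
    def interference_free_channels(user, channels, interferers, max_suppress=2):
        """Channels whose interfering sources can all be suppressed by the
        user's receive array (illustrative interference model)."""
        return [ch for ch in channels if interferers.get(ch, 0) <= max_suppress]

    def assign_channel(user, channels, interferers, gains):
        """Among the user's interference-free channels, pick the one with
        the best link gain; return None if no channel qualifies."""
        candidates = interference_free_channels(user, channels, interferers)
        if not candidates:
            return None
        return max(candidates, key=lambda ch: gains[(user, ch)])

    # Hypothetical example: channel c3 carries too many interferers to
    # suppress, so the choice falls to the better of c1 and c2.
    gains = {(0, "c1"): 0.3, (0, "c2"): 0.9, (0, "c3"): 0.5}
    interferers = {"c1": 0, "c2": 1, "c3": 5}
    best = assign_channel(0, ["c1", "c2", "c3"], interferers, gains)  # "c2"
    ```

    The paper's contribution lies in doing this identification and selection with low complexity and under imperfect identification; the sketch only shows the idealized two-stage structure.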

  9. Arc Requires PSD95 for Assembly into Postsynaptic Complexes Involved with Neural Dysfunction and Intelligence

    Directory of Open Access Journals (Sweden)

    Esperanza Fernández

    2017-10-01

    Full Text Available Arc is an activity-regulated neuronal protein, but little is known about its interactions, its assembly into multiprotein complexes, and its role in human disease and cognition. We applied an integrated proteomic and genetic strategy by targeting a tandem affinity purification (TAP) tag and Venus fluorescent protein into the endogenous Arc gene in mice. This allowed biochemical and proteomic characterization of native complexes in wild-type and knockout mice. We identified many Arc-interacting proteins, of which PSD95 was the most abundant. PSD95 was essential for Arc assembly into 1.5-MDa complexes and for activity-dependent recruitment to excitatory synapses. Integrating human genetic data with the proteomic data showed that Arc-PSD95 complexes are enriched in mutations associated with schizophrenia, intellectual disability, autism, and epilepsy, and in normal variants associated with intelligence. We propose that Arc-PSD95 postsynaptic complexes potentially affect human cognitive function.

  10. Rhenium carbene complexes and their applications; Rhenium-Carben-Komplexe und ihre Anwendungen

    Energy Technology Data Exchange (ETDEWEB)

    Hille, Claudia Heidi

    2016-01-25

    New pharmaceutically suitable metal complexes play an important role in the development of diagnostic and therapeutic agents for cancer treatment. One option for new radiopharmaceuticals is the application of the rhenium isotopes {sup 186}Re and {sup 188}Re. This requires complexes that are water soluble yet stable, and that can be synthesized straightforwardly. In this thesis, several synthetic pathways to such rhenium complexes bearing N-heterocyclic carbenes are presented, and applicability tests of literature-known complexes are conducted. The selected target structures based on monocarbenes turned out to be inappropriate for radiopharmaceutical applications because of their long reaction times and purification issues. Additionally, steric and electronic effects of the carbene ligands on complex formation have been investigated. Possibilities of functionalization at different positions on the heterocycle, as well as hydrophilic wingtips to achieve better stability in aqueous media, have been examined to gain information about the chemical and physical properties of the resulting complexes. Furthermore, experiments on coordinating various biscarbene ligands, which besides stable chelation also offer the possibility of varying the linking bridge, to rhenium(I/V) precursors have been performed. Dioxo-bis-(1,1{sup '}-methylene-bis(3,3{sup '}-diisopropylimidazolium-2-ylidene)) rhenium(V)-hexafluorophosphate was synthesized via a transmetalation reaction of the corresponding silver carbene with ReOCl{sub 3}(PPh{sub 3}){sub 2} and silver hexafluorophosphate. This complex later provided the basis for the first radiolabeled {sup 188}Re NHC complex. An enhancement of the kinetic and thermodynamic stability of potential rhenium biscarbene complexes based on modifications concerning the length and character of the bridging moiety between the chelating NHC rings as well as the nature of

  11. ASF1 is required to load histones on the HIRA complex in preparation of paternal chromatin assembly at fertilization.

    Science.gov (United States)

    Horard, Béatrice; Sapey-Triomphe, Laure; Bonnefoy, Emilie; Loppin, Benjamin

    2018-05-11

    Anti-Silencing Factor 1 (ASF1) is a conserved H3-H4 histone chaperone involved in both Replication-Coupled and Replication-Independent (RI) nucleosome assembly pathways. At DNA replication forks, ASF1 plays an important role in regulating the supply of H3.1/2 and H4 to the CAF-1 chromatin assembly complex. ASF1 also provides H3.3-H4 dimers to the HIRA and DAXX chaperones for RI nucleosome assembly. The early Drosophila embryo is an attractive system to study chromatin assembly in a developmental context. The formation of a diploid zygote begins with the unique, genome-wide RI assembly of paternal chromatin following sperm protamine eviction. Then, within the same cytoplasm, syncytial embryonic nuclei undergo a series of rapid, synchronous S and M phases to form the blastoderm embryo. Here, we have investigated the involvement of ASF1 in these two distinct assembly processes. We show that depletion of the maternal pool of ASF1 with a specific shRNA induces a fully penetrant, maternal-effect embryonic lethal phenotype. Unexpectedly, despite the depletion of ASF1 protein to undetectable levels, we show that asf1 knocked-down (KD) embryos can develop to various stages, thus demonstrating that ASF1 is not absolutely required for the amplification of cleavage nuclei. Remarkably, we found that ASF1 is required for the formation of the male pronucleus, although ASF1 protein does not reside in the decondensing sperm nucleus. In asf1 KD embryos, HIRA localizes to the male nucleus but is only capable of limited and insufficient chromatin assembly. Finally, we show that the conserved HIRA B domain, which is involved in the ASF1-HIRA interaction, is dispensable for female fertility. We conclude that ASF1 is critically required to load H3.3-H4 dimers on the HIRA complex prior to histone deposition on paternal DNA. This separation of tasks could optimize the rapid assembly of paternal chromatin within the gigantic volume of the egg cell. 
In contrast, ASF1 is surprisingly dispensable for the

  12. Towards performance requirements for structural connections

    NARCIS (Netherlands)

    Stark, J.W.B.

    1999-01-01

    There is a tendency in the construction industry to move from solution-driven specifications towards performance specifications. Traditionally, structural specifications, including those for steel construction, were mainly solution driven. In this paper the position of the draft European

  13. Pancreaticoduodenectomy: a rare procedure for the management of complex pancreaticoduodenal injuries.

    Science.gov (United States)

    Asensio, Juan A; Petrone, Patrizio; Roldán, Gustavo; Kuncir, Eric; Demetriades, Demetrios

    2003-12-01

    Pancreaticoduodenectomy (Whipple's procedure) is a formidable procedure when undertaken for severe pancreaticoduodenal injury. The purposes of this study were to review our experience with this procedure for trauma; to classify injury grades for both pancreatic and duodenal injuries in patients undergoing pancreaticoduodenectomy according to the American Association for the Surgery of Trauma-Organ Injury Scale for pancreatic and duodenal injury; and to validate existing indications for performance of this procedure. We performed a retrospective 126-month study (May 1992 to December 2002) of all patients admitted with proven complex pancreaticoduodenal injuries requiring pancreaticoduodenectomy. Eighteen patients were included; mean age was 32 +/- 12 years (SD), mean Revised Trauma Score was 6.84 +/- 2.13 (SD), and mean Injury Severity Score was 27 +/- 8 (SD). There were 17 penetrating injuries (94%) and 1 blunt injury (6%). One of 18 patients had an emergency department thoracotomy and died (100% mortality); 5 of the remaining 17 patients required operating room thoracotomies, and only 1 survived (80% mortality). There was 1 AAST-OIS pancreas grade IV injury, and there were 17 pancreas grade V injuries and 18 AAST-OIS duodenum grade V injuries. Indications for pancreaticoduodenectomy were: massive uncontrollable retropancreatic hemorrhage, 13 patients (72%); massive unreconstructable injury to the head of the pancreas/main pancreatic duct and intrapancreatic portion/distal common bile duct, 18 patients (100%); and massive unreconstructable injury, 18 patients (100%). Mean estimated blood loss was 6,888 +/- 7,866 mL, and overall survival was 67% (12 of 18 patients). Complex pancreaticoduodenal injuries requiring pancreaticoduodenectomy (Whipple's procedure) are uncommon but highly lethal; virtually all are classified as AAST-OIS grade V for both pancreas and duodenum. Current indications for performance of pancreaticoduodenectomy are valid and should be strictly

  14. A method for work modeling at complex systems: towards applying information systems in family health care units.

    Science.gov (United States)

    Jatobá, Alessandro; de Carvalho, Paulo Victor R; da Cunha, Amauri Marques

    2012-01-01

    Work in organizations requires a minimum level of consensus on how its practices are understood. When technological devices are adopted to support activities in complex work environments, characterized by interdependence among a large number of variables, understanding how work is actually done becomes both more important and more difficult. This study therefore presents a method for modeling work in complex systems, one that improves knowledge of how activities are performed in settings where they do not reduce to simply following procedures. Combining Cognitive Task Analysis techniques with the concept of the Work Process, the method aims to provide a detailed and accurate picture of how people perform their tasks, so that information systems can be applied to support work in organizations.

  15. Performance Demonstration Initiative: U.S. implementation of ASME B and PV Code Section XI, Appendix 8

    International Nuclear Information System (INIS)

    Becker, F.L.; Ammirato, F.; Huffman, K.

    1994-01-01

    New requirements have now been added to Section XI as mandatory Appendix 8, ''Performance Demonstration Requirements for Ultrasonic Examination Systems''. The appendix was recently published and incorporates performance demonstration requirements for ultrasonic examination equipment, procedures, and personnel. These new requirements will have a far-reaching and significant impact on the conduct of ISI at all nuclear power plants. For the first time since Section XI was issued in 1970, the effectiveness of ultrasonic examination procedures and the proficiency of examiners must be demonstrated on reactor pressure vessel (RPV), piping, and bolting mockups containing real flaws. Recognizing the importance and complexity of Appendix 8 implementation, representatives from all US nuclear utilities have formed the Performance Demonstration Initiative (PDI) to implement Appendix 8 and to provide for uniform implementation

  16. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-01-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of < 100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost

  17. A high performance architecture for accelerator controls

    International Nuclear Information System (INIS)

    Allen, M.; Hunt, S.M.; Lue, H.; Saltmarsh, C.G.; Parker, C.R.C.B.

    1991-03-01

    The demands placed on the Superconducting Super Collider (SSC) control system due to large distances, high bandwidth and fast response time required for operation will require a fresh approach to the data communications architecture of the accelerator. The prototype design effort aims at providing deterministic communication across the accelerator complex with a response time of <100 ms and total bandwidth of 2 Gbits/sec. It will offer a consistent interface for a large number of equipment types, from vacuum pumps to beam position monitors, providing appropriate communications performance for each equipment type. It will consist of highly parallel links to all equipment: those with computing resources, non-intelligent direct control interfaces, and data concentrators. This system will give each piece of equipment a dedicated link of fixed bandwidth to the control system. Application programs will have access to all accelerator devices, which will be memory mapped into a global virtual addressing scheme. Links to devices in the same geographical area will be multiplexed using commercial Time Division Multiplexing equipment. Low-level access will use reflective memory techniques, eliminating the processing overhead and complexity of traditional data communication protocols. The use of commercial standards and equipment will enable a high performance system to be built at low cost. 1 fig
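    The "dedicated link of fixed bandwidth" over Time Division Multiplexing described in these two records can be illustrated with a toy TDM frame builder: each device gets the same fixed number of slots in a repeating frame, so its share of the link bandwidth is guaranteed and its worst-case latency is bounded by one frame. The device names and slot counts below are invented for illustration; they are not from the SSC design.

    ```python
    def build_tdm_frame(devices, frame_slots):
        """Allocate an equal, fixed number of slots per device in a
        repeating TDM frame (deterministic, fixed-bandwidth sharing)."""
        if frame_slots % len(devices) != 0:
            raise ValueError("frame slots must divide evenly among devices")
        slots_per_device = frame_slots // len(devices)
        frame = []
        # Interleave devices round-robin so each device is serviced once
        # per sub-cycle, bounding its access latency.
        for _ in range(slots_per_device):
            frame.extend(devices)
        return frame

    # Hypothetical equipment sharing one multiplexed link.
    frame = build_tdm_frame(["vacuum_pump", "bpm", "magnet_ps"], 6)
    # Each device owns exactly 2 of the 6 slots, i.e. 1/3 of the bandwidth.
    ```

    The deterministic-response property the records emphasize follows directly from this structure: slot ownership is static, so no device's traffic can delay another's.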

  18. Study of the layout plan in the tokamak complex building for ITER

    International Nuclear Information System (INIS)

    Sato, Kazuyoshi; Yagenji, Akira; Sekiya, Shigeki; Takahashi, Hideo; Tamura, Kousaku; Neyatani, Yuzuru; Hashimoto, Masayoshi; Ogino, Shunji; Nagamatsu, Nobuhide; Motohashi, Keiichi; Uehara, Masaharu; Kataoka, Takahiro; Ohashi, Hironori

    2006-03-01

    This report summarizes a study of the layout plan for the ITER Tokamak complex building, prepared in support of the proposal to site the plant in Japan. To draw up this layout plan, the Final Design Report (FDR), which covers the main components on a non-site-specific basis, was systematically adapted to the Japanese site. Supplementary design work was performed for the parts not covered by the FDR. An additional study addressed adaptation to the Japanese regulatory framework, including technical safety requirements. We proposed a seismically isolated tokamak complex building combined with the hot cell building. Through these studies, a layout plan was constructed, including a maintenance plan for personnel access and component routes within the building from the assembly through the operation period. This layout plan would serve as a basis for the construction period, although the final decision will be made by the ITER Organization. (author)

  19. Study of In-Pile test facility for fast reactor safety research: performance requirements and design features

    Energy Technology Data Exchange (ETDEWEB)

    Nonaka, N.; Kawatta, N.; Niwa, H.; Kondo, S.; Maeda, K.

    1996-12-31

    This paper describes the program and the main design features of a new in-pile safety facility, SERAPH, planned for future fast reactor safety research. The current status of R and D on technical developments is given in relation to the research objectives and the performance requirements for the facility design.

  20. The complexities of complex span: explaining individual differences in working memory in children and adults.

    Science.gov (United States)

    Bayliss, Donna M; Jarrold, Christopher; Gunn, Deborah M; Baddeley, Alan D

    2003-03-01

    Two studies are presented that investigated the constraints underlying working memory performance in children and adults. In each case, independent measures of processing efficiency and storage capacity were assessed to determine their relative importance in predicting performance on complex span tasks, which measure working memory capacity. Results show that complex span performance was independently constrained by individual differences in domain-general processing efficiency and domain-specific storage capacity. Residual variance, which may reflect the ability to coordinate storage and processing, also predicted academic achievement. These results challenge the view that complex span taps a limited-capacity resource pool shared between processing and storage operations. Rather, they are consistent with a multiple-component model in which separate resource pools support the processing and storage functions of working memory.
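    The claim that processing efficiency and storage capacity each contribute independent variance to complex span is the kind of result a multiple regression exposes directly. The sketch below uses synthetic data (the coefficients 0.5 and 0.7 and the noise level are made-up assumptions, not the paper's estimates) to show how fitting both predictors at once recovers their separate contributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    processing = rng.normal(size=n)   # domain-general processing efficiency
    storage = rng.normal(size=n)      # domain-specific storage capacity

    # Simulated complex span score: each predictor contributes
    # independently, plus unexplained residual variance.
    span = 0.5 * processing + 0.7 * storage + rng.normal(scale=0.3, size=n)

    # Ordinary least squares with an intercept column.
    X = np.column_stack([np.ones(n), processing, storage])
    coef, *_ = np.linalg.lstsq(X, span, rcond=None)
    # coef[1] and coef[2] estimate the independent contributions of
    # processing efficiency and storage capacity (near 0.5 and 0.7 here).
    ```

    Because both predictors enter the model simultaneously, each coefficient reflects variance unique to that predictor, which mirrors the abstract's conclusion that the two constraints are separable rather than drawn from one shared resource pool.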