WorldWideScience

Sample records for complexes final performance

  1. 10 CFR 603.890 - Final performance report.

    Science.gov (United States)

    2010-01-01

    10 CFR 603.890 (Title 10, Energy, 2010 edition), under Financial and Programmatic Reporting and Other Administrative Matters: "Final performance report. A TIA must require a final performance report that addresses all major accomplishments under the..."

  2. Complexities and constraints influencing learner performance in physical science

    Directory of Open Access Journals (Sweden)

    Mavhungu Abel Mafukata

    2016-01-01

    This paper explores complexities and constraints affecting the performance and output of physical science learners in Vhembe District, Limpopo Province, South Africa. The study was motivated by the researcher's desire to establish, profile and characterise the complexities and constraints underlying the poor performance of learners in physical science, as measured through end-of-year Grade 12 (final year of high school education) examination results. Twenty-six schools (n=26) were purposively selected from three circuits of education (n=3). From these schools, two learners each were randomly selected (n=52) for interviews. In addition, two circuit managers (n=2) were conveniently selected as part of Key Informant Interviews (KII). For the Focus Group Discussions (FGDs), twelve parents (n=12) were randomly selected to form two groups of six members each. Multi-factor complexities and constraints impeding the performance of learners were discovered. An intensive teacher in-service programme is recommended. Community engagement should be encouraged to educate parents on the value of involvement in the education of their children. Free-access learner support structures such as Homework and Extra-lessons Assistance Centres (H&EACs) should be established.

  3. Complexity factors and prediction of performance

    International Nuclear Information System (INIS)

    Braarud, Per Oeyvind

    1998-03-01

    Understanding what makes a control room situation difficult to handle is important when studying operator performance, with respect both to predicting and to improving human performance. A factor-analytic approach identified eight factors from operators' answers to a 39-item questionnaire about the complexity of the operator's task in the control room. A Complexity Profiling Questionnaire was developed, based on the factor-analytic results from the operators' conception of complexity. The validity of the identified complexity factors was studied by predicting crew performance and plant performance from ratings of the complexity of scenarios. The scenarios were rated by both process experts and the operators participating in the scenarios, using the Complexity Profiling Questionnaire. The process experts' complexity ratings predicted both crew performance and plant performance, while the operators' ratings predicted plant performance only. The results reported are from initial studies of complexity, and imply a promising potential for further studies of the concept. The approach used in the study as well as the reported results are discussed. A chapter about the structure of the conception of complexity and a chapter about further research conclude the report. (author)
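
    The analysis outlined above has two stages: a factor analysis of questionnaire items, followed by regression of performance measures on complexity ratings. The sketch below illustrates that general style of analysis with generic scikit-learn tools and random placeholder data; the Halden study's actual instruments, data and statistical choices are not reproduced.

```python
# Illustrative sketch only: factor-analyse 39 questionnaire items into 8 factors,
# then regress a performance score on scenario-level complexity ratings.
# All data below are random placeholders, not the Halden study's data.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
item_responses = rng.integers(1, 8, size=(120, 39)).astype(float)   # hypothetical 39-item answers
fa = FactorAnalysis(n_components=8, random_state=0)                 # extract 8 complexity factors
factor_scores = fa.fit_transform(item_responses)
print("factor scores shape:", factor_scores.shape)

# Hypothetical expert ratings of 16 scenarios on the 8 factors, and crew performance scores
scenario_complexity = rng.normal(size=(16, 8))
crew_performance = rng.normal(size=16)
model = LinearRegression().fit(scenario_complexity, crew_performance)
print("R^2 of complexity ratings predicting crew performance:",
      model.score(scenario_complexity, crew_performance))
```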

  4. SLC Final Performance and Lessons

    International Nuclear Information System (INIS)

    Phinney, Nan

    2000-01-01

    The Stanford Linear Collider (SLC) was the first prototype of a new type of accelerator, the electron-positron linear collider. Many years of dedicated effort were required to understand the physics of this new technology and to develop the techniques for maximizing performance. Key issues were emittance dilution, stability, final beam optimization and background control. Precision, non-invasive diagnostics were required to measure and monitor the beams throughout the machine. Beam-based feedback systems were needed to stabilize energy, trajectory, intensity and the final beam size at the interaction point. A variety of new tuning techniques were developed to correct for residual optical or alignment errors. The final focus system underwent a series of refinements in order to deliver sub-micron size beams. It also took many iterations to understand the sources of backgrounds and develop the methods to control them. The benefit from this accumulated experience was seen in the performance of the SLC during its final run in 1997-98. The luminosity increased by a factor of three to 3×10³⁰, and the 350,000 Z data sample delivered was nearly double that from all previous runs combined.

  5. Power performance assessment. Final report

    International Nuclear Information System (INIS)

    Frandsen, S.

    1998-12-01

    In the increasingly commercialised wind power marketplace, the lack of precise assessment methods for the output of an investment is becoming a barrier to wider penetration of wind power. Thus, addressing this problem, the overall objective of the project is to reduce the financial risk in investment in wind power projects by significantly improving the power performance assessment methods. Ultimately, if this objective is successfully met, the project may also result in improved tuning of the individual wind turbines and in optimisation methods for wind farm operation. The immediate, measurable objectives of the project are: to prepare a review of existing contractual aspects of power performance verification procedures of wind farms; to provide information on production sensitivity to specific terrain characteristics and wind turbine parameters by analyses of a larger number of wind farm power performance data available to the proposers; to improve the understanding of the physical parameters connected to power performance in complex environments by comparing real-life wind farm power performance data with 3D computational flow models and 3D-turbulence wind turbine models; to develop the statistical framework, including uncertainty analysis, for power performance assessment in complex environments; and to propose one or more procedures for power performance evaluation of wind power plants in complex environments to be applied in contractual agreements between purchasers and manufacturers on production warranties. Although the focus in this project is on power performance assessment, the possible results will also be of benefit to energy yield forecasting, since the two tasks are strongly related. (au) JOULE III. 66 refs.; In co-operation with Renewable Energy System Ltd. (GB); Centre for Renewable Energy (GR); Aeronautic Research Centre (SE); National Engineering Lab. (GB); Public Power Corporation (GR)

  6. A low complexity visualization tool that helps to perform complex systems analysis

    International Nuclear Information System (INIS)

    Beiro, M G; Alvarez-Hamelin, J I; Busch, J R

    2008-01-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core connectivity analysis, highlighting vertices that are not k-connected; this property is useful, e.g., to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.
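
    As a rough illustration of the k-core/k-shell decomposition on which the tool is based (LaNet-vi's own angular-placement and clique-decomposition code is not reproduced here), the shell index of every vertex can be computed with networkx; the graph below is an arbitrary heavy-tailed test network, not one of the networks studied by the authors.

```python
# Sketch: k-shell decomposition underlying LaNet-vi-style visualizations.
# networkx's core_number assigns each vertex its k-shell index.
import networkx as nx

G = nx.barabasi_albert_graph(1000, 3, seed=42)   # a heavy-tailed test network
shell_index = nx.core_number(G)                  # vertex -> k-shell index

# Group vertices by shell; the innermost (highest) shell is the network's core
max_k = max(shell_index.values())
shells = {k: [v for v, c in shell_index.items() if c == k] for k in range(1, max_k + 1)}
print("highest shell:", max_k, "with", len(shells[max_k]), "vertices")
```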

  7. A low complexity visualization tool that helps to perform complex systems analysis

    Science.gov (United States)

    Beiró, M. G.; Alvarez-Hamelin, J. I.; Busch, J. R.

    2008-12-01

    In this paper, we present an extension of large network visualization (LaNet-vi), a tool to visualize large-scale networks using the k-core decomposition. One of the new features is how vertices compute their angular position. While in the previous version this was done using shell clusters, in this version we use the angular coordinate of vertices in higher k-shells, and arrange the highest shell according to a clique decomposition. The time complexity goes from O(n√n) to O(n) under bounds on a heavy-tailed degree distribution. The tool also performs a k-core connectivity analysis, highlighting vertices that are not k-connected; this property is useful, e.g., to measure robustness or quality of service (QoS) capabilities in communication networks. Finally, the current version of LaNet-vi can draw labels and all the edges using transparencies, yielding an accurate visualization. Based on the obtained figure, it is possible to distinguish different sources and types of complex networks at a glance, in a sort of 'network iris-print'.

  8. Effects of orientation on Rey complex figure performance.

    Science.gov (United States)

    Ferraro, F Richard; Grossman, Jennifer; Bren, Amy; Hoverson, Allysa

    2002-10-01

    An experiment was performed that examined the impact of stimulus orientation on performance on the Rey complex figure. A total of 48 undergraduates (24 men, 24 women) were randomly assigned to one of four Rey figure orientation groups (0 degrees, 90 degrees, 180 degrees, and 270 degrees). Participants followed standard procedures for the Rey figure, initially copying it in whichever orientation they were assigned. Next, all participants performed a 15-20 min lexical decision experiment, used as a filler task. Finally, and unbeknownst to them, participants were asked to recall as much of the figure as they could. As expected, results revealed a significant main effect of task (F = 83.92); the effect of orientation was not significant, nor did orientation interact with task (Fs ≤ .57). The results are important in an applied setting, especially if testing conditions are less than optimal and a fixed stimulus position is not possible (e.g., testing at the bedside).

  9. Decontamination and decommissioning of the EBR-I Complex. Final report

    International Nuclear Information System (INIS)

    Kendall, E.W.; Wang, D.K.

    1975-07-01

    This final report covers the Decontamination and Decommissioning (D and D) of the Experimental Breeder Reactor No. 1 (EBR-I) Complex funded under Contract No. AT(10-1)-1375. The major effort consisted of removal and processing of 5500 gallons of sodium/potassium (NaK) coolant from the EBR-I reactor system. Tests were performed to assess the explosive hazards of NaK and KO₂ in various environments and in contact with various contaminants likely to be encountered in the removal and processing operations. A NaK process plant was designed and constructed and the operation was successfully completed. Lesser effort was required for D and D of the Zero Power Reactor (ZPR-III) Facility, the Argonne Fast Source Reactor (AFSR) Shielding, and removal of contaminated NaK from the storage pit. The D and D effort was completed by 13 June 1975, ahead of schedule. (auth)

  10. Final Report: Performance Engineering Research Institute

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    This document is a final report about the work performed for cooperative agreement DE-FC02-06ER25764, the Rice University effort of Performance Engineering Research Institute (PERI). PERI was an Enabling Technologies Institute of the Scientific Discovery through Advanced Computing (SciDAC-2) program supported by the Department of Energy's Office of Science Advanced Scientific Computing Research (ASCR) program. The PERI effort at Rice University focused on (1) research and development of tools for measurement and analysis of application program performance, and (2) engagement with SciDAC-2 application teams.

  11. Complexity rating of abnormal events and operator performance

    International Nuclear Information System (INIS)

    Oeivind Braarud, Per

    1998-01-01

    The complexity of the work situation during abnormal situations is a major topic in a discussion of safety aspects of nuclear power plants. An understanding of complexity and its impact on operator performance in abnormal situations is important. One way to enhance understanding is to look at the dimensions that constitute complexity for NPP operators, and how those dimensions can be measured. A further step is to study how dimensions of complexity of the event are related to the performance of operators. One aspect of complexity is the operator's subjective experience of the given difficulties of the event. Another related aspect of complexity is subject matter experts' ratings of the complexity of the event. A definition and a measure of this part of complexity are being investigated at the OECD Halden Reactor Project in Norway. This paper focuses on the results from a study of simulated scenarios carried out in the Halden Man-Machine Laboratory, which is a full scope PWR simulator. Six crews of two licensed operators each performed in 16 scenarios (simulated events). Before the experiment, subject matter experts rated the complexity of the scenarios using a Complexity Profiling Questionnaire. The Complexity Profiling Questionnaire contains eight previously identified dimensions associated with complexity. After completing the scenarios, the operators received a questionnaire containing 39 questions about perceived complexity. This questionnaire was used for the development of a measure of subjective complexity. The results from the study indicated that process experts' ratings of scenario complexity, using the Complexity Profiling Questionnaire, were able to predict crew performance quite well. The results further indicated that a measure of subjective complexity could be developed that was related to crew performance. Subjective complexity was found to be related to subjective workload. (author)

  12. Controlling Initial and Final Radii to Achieve a Low-Complexity Sphere Decoding Technique in MIMO Channels

    Directory of Open Access Journals (Sweden)

    Fatemeh Eshagh Hosseini

    2012-01-01

    In order to apply the sphere decoding algorithm in multiple-input multiple-output communication systems and to make it feasible for real-time applications, its computational complexity should be decreased. To achieve this goal, this paper provides some useful insights into the effect of the initial and final sphere radii and into estimating them with little effort. It also discusses practical ways of initiating the algorithm properly and terminating it before the normal end of the process, as well as the cost of these methods. In addition, a novel algorithm is introduced which utilizes the presented techniques according to a threshold factor defined in terms of the number of transmit antennas and the noise variance. Simulation results show that the proposed algorithm offers a desirable performance and reasonable complexity, satisfying practical constraints.
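
    One common, generic heuristic for a finite initial sphere radius, which guarantees that at least one lattice point lies inside the initial sphere, is the distance from the received vector to the Babai (rounded zero-forcing) point. The numpy sketch below illustrates only that heuristic; the estimators, threshold factor and termination rules proposed in the paper are not reproduced, and the channel and constellation are arbitrary assumptions.

```python
# Sketch: choosing an initial sphere-decoding radius from the Babai (rounded
# zero-forcing) point. Generic heuristic, not the paper's estimator.
import numpy as np

rng = np.random.default_rng(1)
n_tx, n_rx = 4, 4
H = (rng.normal(size=(n_rx, n_tx)) + 1j * rng.normal(size=(n_rx, n_tx))) / np.sqrt(2)
x_true = rng.choice([-1, 1], size=n_tx) + 1j * rng.choice([-1, 1], size=n_tx)  # QPSK symbols
noise = 0.1 * (rng.normal(size=n_rx) + 1j * rng.normal(size=n_rx))
y = H @ x_true + noise

# Babai point: round the unconstrained least-squares solution to the symbol grid
x_zf = np.linalg.lstsq(H, y, rcond=None)[0]
x_babai = np.sign(x_zf.real) + 1j * np.sign(x_zf.imag)

initial_radius = np.linalg.norm(y - H @ x_babai)   # the sphere is guaranteed non-empty
print("initial search radius:", initial_radius)
```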

  13. CARE-HHH-APD Workshop on Finalizing the Roadmap for the Upgrade of the CERN and GSI Accelerator Complex

    CERN Document Server

    Zimmermann, Frank; BEAM'07; BEAM 2007; Finalizing the Roadmap for the Upgrade of the LHC and GSI Accelerator Complex

    2008-01-01

    This report contains the Proceedings of the CARE-HHH-APD Event BEAM’07, “Finalizing the Roadmap for the Upgrade of the CERN & GSI Accelerator Complex,” which was held at CERN in Geneva, Switzerland, from 1 to 5 October 2007. BEAM’07 was primarily devoted to beam dynamics limitations for the two, or three, alternative baseline scenarios of the LHC luminosity upgrade and to critical design choices for the upgrade of the LHC injector complex at CERN and for the FAIR complex at GSI. It comprised five parts: (1) a Mini-Workshop on LHC+ Beam Performance, (2) a CERN-GSI Meeting on Collective Effects, (3) the Francesco Ruggiero Memorial Symposium, (4) a Mini-Workshop on the LHC Injectors Upgrade, and (5) the BEAM’07 Summaries. Topics addressed in the first mini-workshop of BEAM’07 ranged from the luminosity performance reach of the upgraded LHC in different scenarios, over the generation and stability of the future LHC beams, the turnaround time, beam–beam effects, luminosity levelling methods, and ...

  14. Self-Efficacy, Task Complexity and Task Performance: Exploring Interactions in Two Versions of Vocabulary Learning Tasks

    Science.gov (United States)

    Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan

    2012-01-01

    The present study aimed for better understanding of the interactions between task complexity and students' self-efficacy beliefs and students' use of learning strategies, and finally their interacting effects on task performance. This investigation was carried out in the context of Chinese students learning English as a foreign language in a…

  15. Managing teams performing complex innovation projects

    NARCIS (Netherlands)

    Oeij, P.R.A.; Vroome, E.M.M. de; Dhondt, S.; Gaspersz, J.B.R.

    2012-01-01

    Complexity of projects is hotly debated and is a factor which affects the innovativeness of team performance. Much attention in the past was paid to technical complexity, and many issues were related to the natural and physical sciences. A growing awareness of the importance of socio-organisational issues is

  16. Complex performance in construction

    DEFF Research Database (Denmark)

    Bougrain, Frédéric; Forman, Marianne; Gottlieb, Stefan Christoffer

    To fulfil the expectations of demanding clients, new project-delivery mechanisms have been developed. Approaches focusing on performance-based building or new procurement processes, such as new forms of public-private partnerships, are considered solutions for improving the overall performance...... to the end users. This report summarises the results from work undertaken in the international collaborative project “Procuring and Operating Complex Products and Systems in Construction” (POCOPSC). POCOPSC was carried out in the period 2010-2014. The project was executed in collaboration between CSTB...

  17. Kinetic intermediates en route to the final serpin-protease complex: studies of complexes of α1-protease inhibitor with trypsin.

    Science.gov (United States)

    Maddur, Ashoka A; Swanson, Richard; Izaguirre, Gonzalo; Gettins, Peter G W; Olson, Steven T

    2013-11-01

    Serpin protein protease inhibitors inactivate their target proteases through a unique mechanism in which a major serpin conformational change, resulting in a 70-Å translocation of the protease from its initial reactive center loop docking site to the opposite pole of the serpin, kinetically traps the acyl-intermediate complex. Although the initial Michaelis and final trapped acyl-intermediate complexes have been well characterized structurally, the intermediate stages involved in this remarkable transformation are not well understood. To better characterize such intermediate steps, we undertook rapid kinetic studies of the FRET and fluorescence perturbation changes of site-specific fluorophore-labeled derivatives of the serpin, α1-protease inhibitor (α1PI), which report the serpin and protease conformational changes involved in transforming the Michaelis complex to the trapped acyl-intermediate complex in reactions with trypsin. Two kinetically resolvable conformational changes were observed in the reactions, ascribable to (i) serpin reactive center loop insertion into sheet A with full protease translocation but incomplete protease distortion followed by, (ii) full conformational distortion and movement of the protease and coupled serpin conformational changes involving the F helix-sheet A interface. Kinetic studies of calcium effects on the labeled α1PI-trypsin reactions demonstrated both inactive and low activity states of the distorted protease in the final complex that were distinct from the intermediate distorted state. These studies provide new insights into the nature of the serpin and protease conformational changes involved in trapping the acyl-intermediate complex in serpin-protease reactions and support a previously proposed role for helix F in the trapping mechanism.

  18. Bell Inequalities for Complex Networks

    Science.gov (United States)

    2015-10-26

    AFRL-AFOSR-VA-TR-2015-0355. Final performance report for the Young Investigator Program (YIP) award "Bell Inequalities for Complex Networks" (grant FA9550-12-1-0417), PI Greg Ver Steeg, University of Southern California, Los Angeles, dated October 2015. Abstract: This effort studied new methods to understand the effect

  19. Procuring complex performance

    DEFF Research Database (Denmark)

    Hartmann, A.; Roehrich, J.; Frederiksen, Lars

    2014-01-01

    the transition process. Design/methodology/approach – A multiple, longitudinal case study method is used to examine the transition towards PCP. The study deploys rich qualitative data sets by combining semi-structured interviews, focus group meetings and organisational reports and documents. Findings...... and relational challenges they need to master when facing higher levels of performance and infrastructural complexity. Originality/value – The study adds to the limited empirical and conceptual understanding on the nature of long-term public-private interactions in PCP. It contributes through a rare focus...

  20. Sleep and Final Exam Performance in Introductory Physics

    Science.gov (United States)

    Coletta, Vincent; Wikholm, Colin; Pascoe, Daniel

    2018-03-01

    Most physics instructors believe that adequate sleep is important in order for students to perform well on problem solving, and many instructors advise students to get plenty of sleep the night before an exam. After years of giving such advice to students at Loyola Marymount University (LMU), one of us decided to find out how many hours students actually do sleep the night before an exam, and how that would relate to their performance. The effect of inadequate sleep on exam performance was explored in a second-semester introductory physics course. At the end of the final exam, students reported the number of hours they slept the night before. Sleep deprivation corresponded to lower final exam scores. The main purpose of this study is to provide evidence that instructors can provide to their students to convince them that their time is better spent sleeping rather than studying all night before an exam.

  1. Factors Influencing Student Nurses' Performance in the Final ...

    African Journals Online (AJOL)

    Factors Influencing Student Nurses' Performance in the Final Practical Examination ... Staff development courses can be held to coordinate the work of the school ... to authentic individual nursing care of patients so that they use the individual ...

  2. Performance Potential at one Complex, Specific Site

    DEFF Research Database (Denmark)

    Laursen, Bjørn

    2015-01-01

    disciplines: performance, drama, dance and music. Complex rules of “borders” between audience and actors/performers appeared to be present and active during this long happening. Different narrative genres were active simultaneously during the experimental session. A lot of complex and surprising phenomena...... and combinations of spatial, dramaturgical, narrative and interactive challenges, which appear to be of special interest for the kind of experiences an audience might gather in a site like this, originally created with totally different intentions. Or was it?...

  3. Pitch Sequence Complexity and Long-Term Pitcher Performance

    Directory of Open Access Journals (Sweden)

    Joel R. Bock

    2015-03-01

    Winning one or two games during a Major League Baseball (MLB) season is often the difference between a team advancing to post-season play and "waiting until next year". Technology advances have made it feasible to augment historical data with in-game contextual data to provide managers immediate insights regarding an opponent's next move, thereby providing a competitive edge. We developed statistical models of pitcher behavior using pitch sequences thrown during three recent MLB seasons (2011–2013). The purpose of these models was to predict the next pitch type, for each pitcher, based on data available at the immediate moment in each at-bat. Independent models were developed for each player's most frequent four pitches. The overall predictability of next pitch type is 74.5%. Additional analyses on pitcher predictability within specific game situations are discussed. Finally, using linear regression analysis, we show that an index of pitch sequence predictability may be used to project player performance in terms of Earned Run Average (ERA) and Fielding Independent Pitching (FIP) over a longer term. On a restricted range of the independent variable, reducing complexity in the selection of pitches is correlated with higher values of both FIP and ERA for the players represented in the sample. Both models were significant at the α = 0.05 level (ERA: p = 0.022; FIP: p = 0.0114). With further development, such models may reduce the risk faced by management in the evaluation of potential trades, or help scouts assess unproven emerging talent. Pitchers themselves might benefit from awareness of their individual statistical tendencies, and adapt their behavior on the mound accordingly. To our knowledge, the predictive model relating pitch-wise complexity and long-term performance appears to be novel.
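
    The long-term projection step described above is an ordinary linear regression of ERA (or FIP) on a pitch-sequence predictability index. The sketch below shows that regression with made-up illustrative numbers; the study's actual index values, player sample and fitted coefficients are not reproduced.

```python
# Sketch: regress a season-level pitching outcome (ERA) on a pitch-sequence
# predictability index. The numbers below are illustrative placeholders only.
import numpy as np

predictability = np.array([0.62, 0.68, 0.71, 0.74, 0.78, 0.81, 0.85])  # fraction of next pitches predicted
era            = np.array([3.10, 3.25, 3.60, 3.55, 3.90, 4.05, 4.30])  # hypothetical ERA values

slope, intercept = np.polyfit(predictability, era, deg=1)
print(f"ERA ~ {intercept:.2f} + {slope:.2f} * predictability")
# A positive slope would mirror the reported finding that less complex
# (more predictable) pitch selection is associated with higher ERA/FIP.
```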

  4. Particle-induced amorphization of complex ceramics. Final report

    International Nuclear Information System (INIS)

    Ewing, R.C.; Wang, L.M.

    1998-01-01

    The crystalline-to-amorphous (c-a) phase transition is of fundamental importance. Particle irradiations provide an important, highly controlled means of investigating this phase transformation and the structure of the amorphous state. The interaction of heavy particles with ceramics is complex because these materials have a wide range of structure types, complex compositions, and because chemical bonding is variable. Radiation damage and annealing can produce diverse results, but most commonly, single crystals become aperiodic or break down into a polycrystalline aggregate. The authors continued the studies of the transition from the periodic to the aperiodic state in natural materials that have been damaged by α-recoil nuclei in the uranium and thorium decay series and in synthetic, analogous structures. The transition from the periodic to the aperiodic state was followed by detailed x-ray diffraction analysis, in-situ irradiation/transmission electron microscopy, high resolution transmission electron microscopy, extended x-ray absorption fine structure spectroscopy/x-ray absorption near edge spectroscopy and other spectroscopic techniques. These studies were completed in conjunction with bulk irradiations that can be completed at Los Alamos National Laboratory or Sandia National Laboratories. Principal questions addressed in this research program included: (1) What is the process at the atomic level by which a ceramic material is transformed into a disordered or aperiodic state? (2) What are the controlling effects of structural topology, bond type, dose rate, and irradiation temperature on the final state of the irradiated material? (3) What is the structure of the damaged material? (4) What are the mechanisms and kinetics for the annealing of interstitial and aggregate defects in these irradiated ceramic materials? (5) What general criteria may be applied to the prediction of amorphization in complex ceramics?

  5. Mastoidectomy performance assessment of virtual simulation training using final-product analysis

    DEFF Research Database (Denmark)

    Andersen, Steven A W; Cayé-Thomasen, Per; Sørensen, Mads S

    2015-01-01

    a modified Welling scale. The simulator gathered basic metrics on time, steps, and volumes in relation to the on-screen tutorial and collisions with vital structures. RESULTS: Substantial inter-rater reliability (kappa = 0.77) for virtual simulation and moderate inter-rater reliability (kappa = 0.59) for dissection final-product assessment were found. The simulation and dissection performance scores had a significant correlation (P = .014). None of the basic simulator metrics correlated significantly with the final-product score except for the number of steps completed in the simulator. CONCLUSIONS: A modified version of a validated final-product performance assessment tool can be used to assess mastoidectomy on virtual temporal bones. Performance assessment of virtual mastoidectomy could potentially save the use of cadaveric temporal bones for more advanced training when a basic level of competency...

  6. Phase V storage (Project W-112) Central Waste Complex operational readiness review, final report

    International Nuclear Information System (INIS)

    Wight, R.H.

    1997-01-01

    This document is the final report for the RFSH-conducted Contractor Operational Readiness Review (ORR) for the Central Waste Complex (CWC) Project W-112 and Interim Safety Basis implementation. As appendices, all findings, observations, lines of inquiry and the implementation plan are included.

  7. Phase 5 storage (Project W-112) Central Waste Complex operational readiness review, final report

    Energy Technology Data Exchange (ETDEWEB)

    Wight, R.H.

    1997-05-30

    This document is the final report for the RFSH-conducted Contractor Operational Readiness Review (ORR) for the Central Waste Complex (CWC) Project W-112 and Interim Safety Basis implementation. As appendices, all findings, observations, lines of inquiry and the implementation plan are included.

  8. Poor academic performance: A perspective of final year diagnostic radiography students

    International Nuclear Information System (INIS)

    Gqweta, Ntokozo

    2012-01-01

    Introduction: A study was conducted on final-year diagnostic radiography students at a University of Technology in Durban. The aim of the study was to investigate the final-year diagnostic radiography students' opinions and views on academic performance in order to inform teaching and learning methods. The objectives were: to explore the students' opinions regarding poor performance, and to identify strategies to improve academic performance. Method: A qualitative, interpretive approach was used to explain and understand the students' lived experiences of their academic performance. A short open-ended questionnaire was administered to a cohort of final-year diagnostic radiography students following feedback on a written assessment. Questionnaire responses were then manually captured and analyzed. Results: Five themes were identified that could possibly be associated with poor academic performance: poor preparation, lack of independent study, difficulty in understanding learning content and misinterpretation of assessment questions, inefficient studying techniques, as well as perceived improvement strategies. Conclusion: Students identified their inadequate preparation and the lack of dedicated independent studying as the main reasons for poor performance. Students preferred to be taught in an assessment-oriented manner. However, their identified improvement strategies were aligned with the learner-centred approach.

  9. Value assessment aid to complex decision making. Final report

    International Nuclear Information System (INIS)

    Humphress, G.; Lewis, E.

    1982-07-01

    Value assessment (VA) is a new decision aid that can improve the performance of decisionmakers confronted with multiple attributes and conflicting objectives. Managers who are not supported by formal decision aids turn to various "satisficing" or effort-reducing biases that can lead to serious errors in the decisionmaking process. Value assessment, on the other hand, is an optimizing approach to problem-solving behavior. VA helps decisionmakers overcome the tendency to turn to effort-reducing biases by reducing the complexity of making tradeoffs and weighing all available information. Many of the issues which confront modern electric utility managements are complex, multiple-attribute problems which must be viewed from engineering, financial and socio-political perspectives simultaneously. Added to this are the complications contributed by factors like uncertainty, risk, incomplete information and conflicting objectives among the public it serves. This is the complex decisionmaking arena which VA is intended to support

  10. Cloud Computing for Complex Performance Codes.

    Energy Technology Data Exchange (ETDEWEB)

    Appel, Gordon John [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Hadgu, Teklu [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Klein, Brandon Thorin [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miner, John Gifford [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-02-01

    This report describes the use of cloud computing services for running complex public domain performance assessment problems. The work consisted of two phases: Phase 1 was to demonstrate that complex codes, on several differently configured servers, could run and compute trivial small-scale problems in a commercial cloud infrastructure. Phase 2 focused on proving that non-trivial large-scale problems could be computed in the commercial cloud environment. The cloud computing effort was successfully applied using codes of interest to the geohydrology and nuclear waste disposal modeling community.

  11. Peak and ceiling effects in final-product analysis of mastoidectomy performance

    DEFF Research Database (Denmark)

    West, N; Konge, L; Cayé-Thomasen, P

    2015-01-01

    BACKGROUND: Virtual reality surgical simulation of mastoidectomy is a promising training tool for novices. Final-product analysis for assessing novice mastoidectomy performance could be limited by a peak or ceiling effect. These may be countered by simulator-integrated tutoring. METHODS: Twenty-two participants completed a single session of self-directed practice of the mastoidectomy procedure in a virtual reality simulator. Participants were randomised for additional simulator-integrated tutoring. Performances were assessed at 10-minute intervals using final-product analysis. RESULTS: In all, 45.5 per cent...

  12. Short communication: final year students' deficits in physical examination skills performance in Germany.

    Science.gov (United States)

    Krautter, Markus; Diefenbacher, Katja; Koehl-Hackert, Nadja; Buss, Beate; Nagelmann, Lars; Herzog, Wolfgang; Jünger, Jana; Nikendei, Christoph

    2015-01-01

    The physical examination of patients is an important diagnostic competence, but little is known about the examination skills of final-year medical students. To investigate physical examination skills of final-year medical students. In a cross-sectional study, 40 final-year students were asked to perform a detailed physical examination on standardized patients. Their performances were video-recorded and rated by independent video assessors. Video ratings showed a mean success rate of 40.1 % (SD 8.2). As regards accompanying doctor-patient communication, final-year students achieved a mean of no more than 36.7 % (SD 8.9) in the appropriate use of the corresponding communication items. Our study revealed severe deficits among final-year medical students in performing a detailed physical examination on a standardized patient. Thus, physical examination skills training should aim to improve these deficits while also paying attention to communicative aspects. Copyright © 2015. Published by Elsevier GmbH.

  13. Performance Measurement of Complex Event Platforms

    Directory of Open Access Journals (Sweden)

    Eva Zámečníková

    2016-12-01

    The aim of this paper is to find and compare existing solutions of complex event processing (CEP) platforms. CEP platforms generally serve for processing and/or predicting high frequency data. We intend to use a CEP platform for processing complex time series and to integrate a solution for a newly proposed method of decision making. The decision making process will be described by a formal grammar. As there are lots of CEP solutions, we take the following characteristics under consideration: processing in real time, the possibility of processing high volume data from multiple sources, platform independence, a platform allowing integration with a user solution, and an open license. At first we talk about existing CEP tools and their specific ways of use in practice. Then we mention the design of a method for the formalization of business rules used for decision making. Afterwards, we focus on two platforms which seem to be the best fit for integration of our solution and we list the main pros and cons of each approach. The next part is devoted to benchmark platforms for CEP. The final part is devoted to experimental measurements of the platform with the integrated method for decision support.

  14. Performance-complexity tradeoff in sequential decoding for the unconstrained AWGN channel

    KAUST Repository

    Abediseid, Walid

    2013-06-01

    In this paper, the performance limits and the computational complexity of the lattice sequential decoder are analyzed for the unconstrained additive white Gaussian noise channel. The performance analysis available in the literature for such a channel has been studied only under the use of the minimum Euclidean distance decoder that is commonly referred to as the lattice decoder. Lattice decoders based on solutions to the NP-hard closest vector problem are very complex to implement, and the search for low complexity receivers for the detection of lattice codes is considered a challenging problem. However, the low computational complexity advantage that sequential decoding promises, makes it an alternative solution to the lattice decoder. In this work, we characterize the performance and complexity tradeoff via the error exponent and the decoding complexity, respectively, of such a decoder as a function of the decoding parameter - the bias term. For the above channel, we derive the cut-off volume-to-noise ratio that is required to achieve a good error performance with low decoding complexity. © 2013 IEEE.

  15. Similarity, Not Complexity, Determines Visual Working Memory Performance

    Science.gov (United States)

    Jackson, Margaret C.; Linden, David E. J.; Roberts, Mark V.; Kriegeskorte, Nikolaus; Haenschel, Corinna

    2015-01-01

    A number of studies have shown that visual working memory (WM) is poorer for complex versus simple items, traditionally accounted for by higher information load placing greater demands on encoding and storage capacity limits. Other research suggests that it may not be complexity that determines WM performance per se, but rather increased…

  16. Total System Performance Assessment Sensitivity Analyses for Final Nuclear Regulatory Commission Regulations

    International Nuclear Information System (INIS)

    Bechtel SAIC Company

    2001-01-01

    This Letter Report presents the results of supplemental evaluations and analyses designed to assess long-term performance of the potential repository at Yucca Mountain. The evaluations were developed in the context of the Nuclear Regulatory Commission (NRC) final public regulation, or rule, 10 CFR Part 63 (66 FR 55732 [DIRS 156671]), which was issued on November 2, 2001. This Letter Report addresses the issues identified in the Department of Energy (DOE) technical direction letter dated October 2, 2001 (Adams 2001 [DIRS 156708]). The main objective of this Letter Report is to evaluate performance of the potential Yucca Mountain repository using assumptions consistent with performance-assessment-related provisions of 10 CFR Part 63. The incorporation of the final Environmental Protection Agency (EPA) standard, 40 CFR Part 197 (66 FR 32074 [DIRS 155216]), and the analysis of the effect of the 40 CFR Part 197 EPA final rule on long-term repository performance are presented in the Total System Performance Assessment--Analyses for Disposal of Commercial and DOE Waste Inventories at Yucca Mountain--Input to Final Environmental Impact Statement and Site Suitability Evaluation (BSC 2001 [DIRS 156460]), referred to hereafter as the FEIS/SSE Letter Report. The Total System Performance Assessment (TSPA) analyses conducted and documented prior to promulgation of the NRC final rule 10 CFR Part 63 (66 FR 55732 [DIRS 156671]) were based on the NRC proposed rule (64 FR 8640 [DIRS 101680]). Slight differences exist between the NRC's proposed and final rules, which were not within the scope of the FEIS/SSE Letter Report (BSC 2001 [DIRS 156460]), the Preliminary Site Suitability Evaluation (PSSE) (DOE 2001 [DIRS 155743]), and supporting documents for these reports. These differences include (1) the possible treatment of "unlikely" features, events and processes (FEPs) in evaluation of both the groundwater protection standard and the human-intrusion scenario of the individual

  17. Operations-oriented performance measures for freeway management systems : final report.

    Science.gov (United States)

    2008-12-01

    This report describes the second and final year activities of the project titled "Using Operations-Oriented Performance Measures to Support Freeway Management Systems." Work activities included developing a prototype system architecture for testi...

  18. The impact of manufacturing complexity drivers on performance-a preliminary study

    Science.gov (United States)

    Huah Leang, Suh; Mahmood, Wan Hasrulnizzam Wan; Rahman, Muhamad Arfauz A.

    2018-03-01

    Manufacturing systems, in pursuit of cost, time and flexibility optimisation, are becoming more and more complex, exhibiting a dynamic and nonlinear behaviour. Unpredictability is a distinct characteristic of such behaviour and affects production planning significantly. Therefore, this study was undertaken to investigate the priority level and current achievement of manufacturing performance in Malaysia's manufacturing industry and the effect of complexity drivers on manufacturing productivity performance. The results showed that Malaysia's manufacturing industry prioritised product quality, and that it managed to achieve good on-time delivery performance. However, for the other manufacturing performance measures there was a gap: the current achievement of manufacturing performance in Malaysia's manufacturing industry is slightly lower than the priority given to it. For priority status, a strong, significant correlation was observed between efficient production levelling (finished goods) and finished-product management, while for current achievement a strong, significant correlation was observed between minimising the number of workstations and the factory transportation system. This indicates that complexity drivers have an impact on manufacturing performance. Consequently, it is necessary to identify complexity drivers to achieve good manufacturing performance.

  19. Assessing vocal performance in complex birdsong: a novel approach.

    Science.gov (United States)

    Geberzahn, Nicole; Aubin, Thierry

    2014-08-06

    Vocal performance refers to the ability to produce vocal signals close to physical limits. Such motor skills can be used by conspecifics to assess a signaller's competitive potential. For example, it is difficult for birds to produce repeated syllables both rapidly and with a broad frequency bandwidth. Deviation from an upper-bound regression of frequency bandwidth on trill rate has been widely used to assess vocal performance. This approach is, however, only applicable to simple trilled songs, and even then may be affected by differences in syllable complexity. Using skylarks (Alauda arvensis) as a birdsong model with a very complex song structure, we detected another performance trade-off: minimum gap duration between syllables was longer when the frequency ratio between the end of one syllable and the start of the next syllable (inter-syllable frequency shift) was large. This allowed us to apply a novel measure of vocal performance, vocal gap deviation: the deviation from a lower-bound regression of gap duration on inter-syllable frequency shift. We show that skylarks increase vocal performance in an aggressive context, suggesting that this trait might serve as a signal for competitive potential. We suggest using vocal gap deviation in future studies to assess vocal performance in songbird species with complex structure.
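
    The proposed measure scores each inter-syllable gap by its deviation from a lower-bound regression of gap duration on inter-syllable frequency shift. The sketch below estimates such a bound in one simple way (binned minima followed by a linear fit) on synthetic data; the authors' exact fitting procedure, units and sign convention are assumptions here.

```python
# Sketch: estimate a lower-bound regression of inter-syllable gap duration on
# frequency shift, then score each gap by its deviation from that bound.
# Synthetic data; the published fitting procedure may differ in detail.
import numpy as np

rng = np.random.default_rng(7)
freq_shift = rng.uniform(0, 5, 500)                    # kHz, inter-syllable frequency shift
gap = 20 + 8 * freq_shift + rng.exponential(15, 500)   # ms, gaps can only exceed a physical floor

# Lower bound: minimum gap within bins of frequency shift, then a linear fit
bins = np.linspace(0, 5, 11)
idx = np.digitize(freq_shift, bins)
bin_centers = [freq_shift[idx == i].mean() for i in range(1, len(bins)) if np.any(idx == i)]
bin_minima  = [gap[idx == i].min()          for i in range(1, len(bins)) if np.any(idx == i)]
slope, intercept = np.polyfit(bin_centers, bin_minima, deg=1)

# Vocal gap deviation: distance of each produced gap above the estimated bound
vocal_gap_deviation = gap - (intercept + slope * freq_shift)
print("mean vocal gap deviation (ms):", vocal_gap_deviation.mean())
```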

  20. Comparison of surface extraction techniques performance in computed tomography for 3D complex micro-geometry dimensional measurements

    DEFF Research Database (Denmark)

    Torralba, Marta; Jiménez, Roberto; Yagüe-Fabra, José A.

    2018-01-01

    micro-geometries as well (i.e., in the sub-mm dimensional range). However, there are different factors that may influence the CT process performance, one of them being the surface extraction technique used. In this paper, two different extraction techniques are applied to measure a complex miniaturized......The number of industrial applications of computed tomography (CT) for dimensional metrology in the 10⁰–10³ mm range has been continuously increasing, especially in the last years. Due to its specific characteristics, CT has the potential to be employed as a viable solution for measuring 3D complex...... dental file by CT in order to analyze its contribution to the final measurement uncertainty in complex geometries at the mm to sub-mm scales. The first method is based on a similarity analysis: the threshold determination; while the second one is based on a gradient or discontinuity analysis: the 3D...
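
    The two extraction principles named above can be illustrated on a one-dimensional gray-value profile: a similarity/threshold approach places the surface where the profile crosses a threshold (here the midpoint between background and material levels, in the spirit of an ISO-50%-style threshold), while a gradient/discontinuity approach places it at the maximum gray-value gradient. The profile and numbers below are synthetic assumptions, not the paper's CT data or its 3D implementation.

```python
# Sketch: two surface-extraction principles on a synthetic 1D CT gray-value profile.
import numpy as np

x = np.linspace(0, 1, 1000)                              # mm along a scan line
profile = 100 + 900 / (1 + np.exp(-(x - 0.4) / 0.01))    # air (~100) rising to material (~1000)

# (1) similarity / threshold analysis: midpoint between background and material levels
threshold = 0.5 * (profile.min() + profile.max())
edge_threshold = x[np.argmax(profile >= threshold)]

# (2) gradient / discontinuity analysis: location of maximum gray-value gradient
edge_gradient = x[np.argmax(np.gradient(profile, x))]

print(f"threshold-based edge at {edge_threshold:.4f} mm, gradient-based at {edge_gradient:.4f} mm")
```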

  1. VI-G, Sec. 661, P.L. 91-230. Final Performance Report.

    Science.gov (United States)

    1976

    Presented is the final performance report of the CSDC model which is designed to provide services for learning disabled high school students. Sections cover the following program aspects: organizational structure, inservice sessions, identification of students, materials and equipment, evaluation of student performance, evaluation of the model,…

  2. Camp Verde Adult Reading Program. Final Performance Report.

    Science.gov (United States)

    Maynard, David A.

    This document begins with a four-page performance report describing how the Camp Verde Adult Reading Program site was relocated to the Community Center Complex, and the Town Council contracted directly with the Friends of the Camp Verde Library to provide for the requirements of the program. The U.S. Department of Education grant allowed the…

  3. Selecting a change and evaluating its impact on the performance of a complex adaptive health care delivery system

    Directory of Open Access Journals (Sweden)

    Malaz A Boustani

    2010-05-01

    Complexity science suggests that our current health care delivery system acts as a complex adaptive system (CAS). Such systems represent a dynamic and flexible network of individuals who can coevolve with their ever-changing environment. The CAS performance fluctuates and its members' interactions continuously change over time in response to the stress generated by its surrounding environment. This paper will review the challenges of intervening and introducing a planned change into a complex adaptive health care delivery system. We explore the role of the "reflective adaptive process" in developing delivery interventions and suggest different evaluation methodologies to study the impact of such interventions on the performance of the entire system. We finally describe the implementation of a new program, the Aging Brain Care Medical Home, as a case study of our proposed evaluation process. Keywords: complexity, aging brain, implementation, complex adaptive system, sustained change, care delivery

  4. Chemical Frustration. A Design Principle for the Discovery of New Complex Alloy and Intermetallic Phases, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Fredrickson, Daniel C [Univ. of Wisconsin, Madison, WI (United States)

    2015-06-23

    Final technical report for "Chemical Frustration: A Design Principle for the Discovery of New Complex Alloy and Intermetallic Phases" funded by the Office of Science through the Materials Chemistry Program of the Office of Basic Energy Sciences.

  5. Roles of Working Memory Performance and Instructional Strategy in Complex Cognitive Task Performance

    Science.gov (United States)

    Cevik, V.; Altun, A.

    2016-01-01

    This study aims to investigate how working memory (WM) performances and instructional strategy choices affect learners' complex cognitive task performance in online environments. Three different e-learning environments were designed based on Merrill's (2006a) model of instructional strategies. The lack of experimental research on his framework is…

  6. Joining Distributed Complex Objects: Definition and Performance

    NARCIS (Netherlands)

    Teeuw, W.B.; Teeuw, Wouter B.; Blanken, Henk

    1992-01-01

    The performance of a non-standard distributed database system is strongly influenced by complex objects. The effective exploitation of parallelism in querying them and a suitable structure to store them are required in order to obtain acceptable response times in these database environments where

  7. Procurement of complex performance in public infrastructure: a process perspective

    OpenAIRE

    Hartmann, Andreas; Roehrich, Jens; Davies, Andrew; Frederiksen, Lars; Davies, J.; Harrington, T.; Kirkwood, D.; Holweg, M.

    2011-01-01

    The paper analyzes the process of transitioning from procuring single products and services to procuring complex performance in public infrastructure. The aim is to examine the change in the interactions between buyer and supplier, the emergence of value co-creation and the capability development during the transition process. Based on a multiple, longitudinal case study the paper proposes three generic transition stages towards increased performance and infrastructural complexity. These stag...

  8. 4D Dynamic Required Navigation Performance Final Report

    Science.gov (United States)

    Finkelsztein, Daniel M.; Sturdy, James L.; Alaverdi, Omeed; Hochwarth, Joachim K.

    2011-01-01

    New advanced four dimensional trajectory (4DT) procedures under consideration for the Next Generation Air Transportation System (NextGen) require an aircraft to precisely navigate relative to a moving reference such as another aircraft. Examples are Self-Separation for enroute operations and Interval Management for in-trail and merging operations. The current construct of Required Navigation Performance (RNP), defined for fixed-reference-frame navigation, is not sufficiently specified to be applicable to defining performance levels of such air-to-air procedures. An extension of RNP to air-to-air navigation would enable these advanced procedures to be implemented with a specified level of performance. The objective of this research effort was to propose new 4D Dynamic RNP constructs that account for the dynamic spatial and temporal nature of Interval Management and Self-Separation, develop mathematical models of the Dynamic RNP constructs, "Required Self-Separation Performance" and "Required Interval Management Performance," and to analyze the performance characteristics of these air-to-air procedures using the newly developed models. This final report summarizes the activities led by Raytheon, in collaboration with GE Aviation and SAIC, and presents the results from this research effort to expand the RNP concept to a dynamic 4D frame of reference.

  9. Does complexity matter? Meta-analysis of learner performance in artificial grammar tasks.

    Science.gov (United States)

    Schiff, Rachel; Katan, Pesia

    2014-01-01

    Complexity has been shown to affect performance on artificial grammar learning (AGL) tasks (categorization of test items as grammatical/ungrammatical according to the implicitly trained grammar rules). However, previously published AGL experiments did not utilize consistent measures to investigate the comprehensive effect of grammar complexity on task performance. The present study focused on computerizing Bollt and Jones's (2000) technique of calculating topological entropy (TE), a quantitative measure of AGL charts' complexity, with the aim of examining associations between grammar systems' TE and learners' AGL task performance. We surveyed the literature and identified 56 previous AGL experiments based on 10 different grammars that met the sampling criteria. Using the automated matrix-lift-action method, we assigned a TE value for each of these 10 previously used AGL systems and examined its correlation with learners' task performance. The meta-regression analysis showed a significant correlation, demonstrating that the complexity effect transcended the different settings and conditions in which the categorization task was performed. The results reinforced the importance of using this new automated tool to uniformly measure grammar systems' complexity when experimenting with and evaluating the findings of AGL studies.
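
    For a finite-state grammar of the kind used in AGL experiments, topological entropy is conventionally computed as the logarithm of the largest eigenvalue (spectral radius) of the grammar's state-transition adjacency matrix. The sketch below uses that standard definition on an arbitrary example matrix; the automated "matrix-lift-action" variant referred to in the abstract is not reproduced.

```python
# Sketch: topological entropy of a finite-state (artificial) grammar as the log
# of the spectral radius of its state-transition adjacency matrix.
# The matrix below is an arbitrary example, not one of the 10 surveyed grammars.
import numpy as np

# adjacency[i, j] = number of legal transitions from grammar state i to state j
adjacency = np.array([
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [1, 0, 0, 1],
    [1, 0, 0, 0],
], dtype=float)

spectral_radius = max(abs(np.linalg.eigvals(adjacency)))
topological_entropy = np.log(spectral_radius)
print(f"topological entropy ≈ {topological_entropy:.3f} nats")
```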

  10. French Modular Impoundment: Final Cost and Performance Evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Drown, Peter [French Development Enterprises, LLC, North Billerica, MA (United States); French, Bill [French Development Enterprises, LLC, North Billerica, MA (United States)

    2017-05-17

    This report comprises the Final Cost and Performance Report for the Department of Energy Award # EE0007244, the French Modular Impoundment (aka the “French Dam”.) The French Dam is a system of applying precast modular construction to water control structures. The “French Dam” is a term used to cover the construction means/methods used to construct or rehabilitate dams, diversion structures, powerhouses, and other hydraulic structures which impound water and are covered under FDE’s existing IP (Patents # US8414223B2; US9103084B2.)

  11. Predictive-property-ranked variable reduction in partial least squares modelling with final complexity adapted models: comparison of properties for ranking.

    Science.gov (United States)

    Andries, Jan P M; Vander Heyden, Yvan; Buydens, Lutgarde M C

    2013-01-14

    The calibration performance of partial least squares regression for one response (PLS1) can be improved by eliminating uninformative variables. Many variable-reduction methods are based on so-called predictor-variable properties or predictive properties, which are functions of various PLS-model parameters, and which may change during the steps of the variable-reduction process. Recently, a new predictive-property-ranked variable reduction method with final complexity adapted models, denoted as PPRVR-FCAM or simply FCAM, was introduced. It is a backward variable elimination method applied on the predictive-property-ranked variables. The variable number is first reduced, with constant PLS1 model complexity A, until A variables remain, followed by a further decrease in PLS complexity, allowing the final selection of small numbers of variables. In this study, the utility and effectiveness of six individual and nine combined predictor-variable properties, when used in the FCAM method, are investigated for three data sets. The individual properties include the absolute value of the PLS1 regression coefficient (REG), the significance of the PLS1 regression coefficient (SIG), the norm of the loading weight (NLW) vector, the variable importance in the projection (VIP), the selectivity ratio (SR), and the squared correlation coefficient of a predictor variable with the response y (COR). The selective and predictive performances of the models resulting from the use of these properties are statistically compared using the one-tailed Wilcoxon signed rank test. The results indicate that the models resulting from variable reduction with the FCAM method, using individual or combined properties, have similar or better predictive abilities than the full spectrum models. After mean-centring of the data, REG and SIG provide low numbers of informative variables (lower than the other individual properties), with a meaning relevant to the response, while the predictive abilities are
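
    A minimal sketch of the general backward-elimination idea behind such methods: rank variables by a predictive property (here the absolute PLS regression coefficient, i.e. the REG property), drop the weakest, and refit at constant complexity A. The data, drop fraction and stopping rule below are assumptions; the FCAM-specific final reduction of model complexity is only indicated in a comment, not implemented.

```python
# Sketch: predictive-property-ranked backward variable elimination for PLS1,
# ranking by |PLS regression coefficient| (the REG property). Illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
X = rng.normal(size=(60, 200))                       # e.g. mean-centred spectra
y = X[:, :5] @ rng.normal(size=5) + 0.1 * rng.normal(size=60)

keep = np.arange(X.shape[1])
A = 5                                                # fixed PLS complexity during reduction
while keep.size > A:
    pls = PLSRegression(n_components=A).fit(X[:, keep], y)
    coefs = np.abs(np.ravel(pls.coef_))              # REG property for each kept variable
    n_drop = max(1, keep.size // 10)                 # drop the 10% least informative variables
    keep = keep[np.argsort(coefs)[n_drop:]]

# FCAM would now additionally reduce the number of PLS components below A
score = cross_val_score(PLSRegression(n_components=min(A, keep.size)), X[:, keep], y, cv=5).mean()
print(f"{keep.size} variables retained, CV R^2 ≈ {score:.2f}")
```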

  13. Cooperative decoding in femtocell networks: Performance-complexity tradeoff

    KAUST Repository

    Benkhelifa, Fatma; Rezki, Zouheir; Alouini, Mohamed-Slim

    2012-01-01

    Femtocells, which are low-cost, low-power, stand-alone cellular access points, are a potential solution for providing good indoor coverage with high data rates. However, femtocell deployment may also increase co-channel interference (CCI) by reducing the distance reuse of the spectrum. In this paper, we introduce methods to cancel the interference among femtocells while considering that macrocells operate orthogonally to the femtocells. The femtocells may also cooperate through joint detection of the received signal and improve the overall error performance at the expense of increased computational complexity. The performance-complexity tradeoff of cooperative detection is investigated for uplink transmissions. Numerical results show that the cooperative detection gain may reach 10 dB at a bit-error rate (BER) of 10⁻² when compared to the case without cooperation. © 2012 IEEE.
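
    A toy illustration of the underlying tradeoff: with two interfering uplinks, a receiver can either slice its own observation directly (low complexity) or, if the femtocells cooperate by sharing observations, search jointly over both users' symbols, with the hypothesis count growing with the number of cooperating cells. The channel model, gains and SNR below are illustrative assumptions, not the system model of the paper.

```python
import numpy as np

# Two interfering BPSK "femtocell" uplinks: non-cooperative slicing vs.
# cooperative joint ML detection over both observations. Illustrative values.
rng = np.random.default_rng(3)
N, snr_db, g = 200_000, 8.0, 0.6            # symbols, SNR, cross-channel gain
sigma = 10 ** (-snr_db / 20)

s1 = rng.choice([-1.0, 1.0], N)             # user 1 symbols
s2 = rng.choice([-1.0, 1.0], N)             # interfering user 2 symbols
y1 = s1 + g * s2 + sigma * rng.standard_normal(N)   # femtocell 1 observation
y2 = g * s1 + s2 + sigma * rng.standard_normal(N)   # femtocell 2 observation

# Non-cooperative detection: each femtocell slices its own observation.
ber_indep = np.mean(np.sign(y1) != s1)

# Cooperative joint ML detection over both observations and all 4 symbol pairs.
pairs = np.array([(a, b) for a in (-1.0, 1.0) for b in (-1.0, 1.0)])
metric = ((y1[:, None] - (pairs[:, 0] + g * pairs[:, 1])) ** 2
          + (y2[:, None] - (g * pairs[:, 0] + pairs[:, 1])) ** 2)
s1_hat = pairs[np.argmin(metric, axis=1), 0]
ber_coop = np.mean(s1_hat != s1)

print(f"BER without cooperation ≈ {ber_indep:.4f}, with joint detection ≈ {ber_coop:.4f}")
```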

  14. Complex matrix multiplication operations with data pre-conditioning in a high performance computing architecture

    Science.gov (United States)

    Eichenberger, Alexandre E; Gschwind, Michael K; Gunnels, John A

    2014-02-11

    Mechanisms for performing a complex matrix multiplication operation are provided. A vector load operation is performed to load a first vector operand of the complex matrix multiplication operation to a first target vector register. The first vector operand comprises a real and imaginary part of a first complex vector value. A complex load and splat operation is performed to load a second complex vector value of a second vector operand and replicate the second complex vector value within a second target vector register. The second complex vector value has a real and imaginary part. A cross multiply add operation is performed on elements of the first target vector register and elements of the second target vector register to generate a partial product of the complex matrix multiplication operation. The partial product is accumulated with other partial products and a resulting accumulated partial product is stored in a result vector register.
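
    The operation described above reduces each complex product to real and imaginary partial products. The NumPy sketch below illustrates that cross multiply-add decomposition at scalar level; it is not the patented vector-register implementation, and the matrices are arbitrary examples.

```python
import numpy as np

# Each complex product a*b is accumulated from two partial products:
# real part  a.re*b.re - a.im*b.im, imaginary part  a.re*b.im + a.im*b.re.
def complex_matmul(A: np.ndarray, B: np.ndarray) -> np.ndarray:
    m, k = A.shape
    k2, n = B.shape
    assert k == k2
    C = np.zeros((m, n), dtype=complex)
    for i in range(m):
        for j in range(n):
            acc_re = acc_im = 0.0
            for p in range(k):
                a, b = A[i, p], B[p, j]      # b plays the role of the "splatted" operand
                acc_re += a.real * b.real - a.imag * b.imag   # first partial product
                acc_im += a.real * b.imag + a.imag * b.real   # cross multiply-add
            C[i, j] = complex(acc_re, acc_im)
    return C

A = np.array([[1 + 2j, 3 - 1j], [0 + 1j, 2 + 2j]])
B = np.array([[2 - 1j, 1 + 0j], [1 + 1j, -1 + 2j]])
assert np.allclose(complex_matmul(A, B), A @ B)   # matches the built-in product
```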

  15. 42 CFR 493.1453 - Condition: Laboratories performing high complexity testing; clinical consultant.

    Science.gov (United States)

    2010-10-01

    Condition: Laboratories performing high complexity testing; clinical consultant. 42 CFR 493.1453 (Public Health, Centers for Medicare & Medicaid Services, 2010-10-01): the laboratory must have a clinical consultant.

  16. Ground Combat Training Squadron Complex Final Environmental Assessment

    Science.gov (United States)

    2011-08-01

    [Figure captions from the final Environmental Assessment, Eglin Air Force Base, FL: Figure 1-2, Resources Not Carried Forward for Detailed A…; Figure 3-1, Water Resources within or near the Proposed Action or Alternative Locations; Figure 3-3, Biological Resources within or…]

  17. Product variety, product complexity and manufacturing operational performance: A systematic literature review

    DEFF Research Database (Denmark)

    Trattner, Alexandria Lee; Hvam, Lars; Herbert-Hansen, Zaza Nadja Lee

    Manufacturing in the twenty-first century has been fraught with the struggle to satisfy rising demand for greater product variety and more complex products while still maintaining efficient manufacturing operations. However, the literature lacks an overview of which operational performance measures are most affected by increased variety and complexity. This study presents a systematic literature review of the recent scholarly literature on variety, complexity and manufacturing operational performance (MOP). Results show that product variety has a consistently negative relationship with MOP across different time, cost, quality and flexibility measures, while product complexity lacks evidence of strong relationships with MOP measures.

  18. Adaptive Beamforming Based on Complex Quaternion Processes

    Directory of Open Access Journals (Sweden)

    Jian-wu Tao

    2014-01-01

    Motivated by the benefits of array signal processing in the quaternion domain, we investigate the problem of adaptive beamforming based on complex quaternion processes in this paper. First, a complex quaternion least-mean squares (CQLMS) algorithm is proposed and its performance is analyzed. The CQLMS algorithm is suitable for adaptive beamforming of a vector-sensor array. The weight vector update of the CQLMS algorithm is derived from the complex gradient, leading to lower computational complexity. Because the complex quaternion can exhibit the orthogonal structure of an electromagnetic vector-sensor in a natural way, a complex quaternion model in the time domain is provided for a 3-component vector-sensor array, and the normalized adaptive beamformer using CQLMS is presented. Finally, simulation results are given to validate the performance of the proposed adaptive beamformer.
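
    For readers unfamiliar with the LMS family, the sketch below shows the ordinary complex-valued LMS weight update (w ← w + μ·x·e*) on a plain uniform linear array; the quaternion algebra and the vector-sensor signal model of the CQLMS algorithm are omitted, and the array geometry, training signal and step size are illustrative assumptions.

```python
import numpy as np

# Complex LMS beamformer on a uniform linear array (non-quaternion analogue).
rng = np.random.default_rng(1)
M, N, mu = 8, 2000, 0.01                 # sensors, snapshots, step size
theta = np.deg2rad(20.0)                 # assumed direction of the desired source
steering = np.exp(1j * np.pi * np.arange(M) * np.sin(theta))

s = np.sign(rng.standard_normal(N)) + 0j          # desired (BPSK-like) training signal
noise = (rng.standard_normal((M, N)) + 1j * rng.standard_normal((M, N))) / np.sqrt(2)
X = np.outer(steering, s) + 0.3 * noise           # received array snapshots

w = np.zeros(M, dtype=complex)
for n in range(N):
    x = X[:, n]
    y = np.vdot(w, x)                    # beamformer output w^H x
    e = s[n] - y                         # error against the training signal
    w = w + mu * x * np.conj(e)          # complex LMS weight update

response = abs(np.vdot(w, steering)) ** 2 / np.linalg.norm(w) ** 2
print("beamformer response toward the source (|w^H a|^2 / ||w||^2):", round(response, 2))
```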

  19. The Role of Task Complexity, Modality, and Aptitude in Narrative Task Performance

    Science.gov (United States)

    Kormos, Judit; Trebits, Anna

    2012-01-01

    The study reported in this paper investigated the relationship between components of aptitude and the fluency, lexical variety, syntactic complexity, and accuracy of performance in two types of written and spoken narrative tasks. We also addressed the question of how narrative performance varies in tasks of different cognitive complexity in the…

  20. FINAL IMPLEMENTATION AND PERFORMANCE OF THE LHC COLLIMATOR CONTROL SYSTEM

    CERN Document Server

    Redaelli, S; Masi, A; Losito, R

    2009-01-01

    The 2008 collimation system of the CERN Large Hadron Collider (LHC) included 80 movable collimators for a total of 316 degrees of freedom. Before beam operation, the final controls implementation was deployed and commissioned. The control system enabled remote control and appropriate diagnostics of the relevant parameters. The collimator motion is driven with time-functions, synchronized with other accelerator systems, which allows controlling the collimator jaw positions with a micrometer accuracy during all machine phases. The machine protection functionality of the system, which also relies on function-based tolerance windows, was also fully validated. The collimator control challenges are reviewed and the final system architecture is presented. The results of the remote system commissioning and the overall performance are discussed.

  1. Effect of Repeated/Spaced Formative Assessments on Medical School Final Exam Performance

    Directory of Open Access Journals (Sweden)

    Edward K. Chang

    2017-06-01

    Discussion: Performance on weekly formative assessments was predictive of final exam scores. Struggling medical students will benefit from extra cumulative practice exams while students who are excelling do not need extra practice.

  2. Radioactive Waste Management Complex performance assessment: Draft

    Energy Technology Data Exchange (ETDEWEB)

    Case, M.J.; Maheras, S.J.; McKenzie-Carter, M.A.; Sussman, M.E.; Voilleque, P.

    1990-06-01

    A radiological performance assessment of the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory was conducted to demonstrate compliance with appropriate radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the general public. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the general public via air, ground water, and food chain pathways. Projections of doses were made for both offsite receptors and individuals intruding onto the site after closure. In addition, uncertainty analyses were performed. Results of calculations made using nominal data indicate that the radiological doses will be below appropriate radiological criteria throughout operations and after closure of the facility. Recommendations were made for future performance assessment calculations.

  4. The final story on the ALA3/ALIS1 complex

    DEFF Research Database (Denmark)

    Lopez Marques, Rosa Laura

    The final story on the ALA3/ALIS1 complex. Lisbeth R. Poulsen, Rosa L. López-Marqués, Alexander Schultz, Stephen C. McDowell, Juha Okkeri, Dirk Licht, Thomas Pomorski, Jeffrey F. Harper, and Michael G. Palmgren (Centre for Membrane Pumps in Cells and Disease - PUMPKIN, Danish National …). Through a database search we have previously identified five Cdc50p/Lem3p homologues in Arabidopsis (ALIS1-5, for ALA Interacting Subunit). We investigated the capacity of ALA3, alone and in combination with expressed ALIS proteins, to functionally complement a battery of yeast mutants carrying deletions … in endogenous P4-ATPases. Our results indicated that ALIS1 functions as a true β-subunit for the Arabidopsis putative flippase ALA3, being required for ATP-dependent phospholipid transport and for genetic complementation of the yeast P4-ATPase gene Drs2, which is involved in vesicle budding from the late Golgi…

  5. Complex Multi-Chamber Airbag Performance Simulation Final Report CRADA No. TSB-961-94

    Energy Technology Data Exchange (ETDEWEB)

    Kay, Gregory [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kithil, Philip [Advanced Safety Concepts, Inc. (ASCI), Santa Fe, NM (United States)

    2018-01-22

    The purpose of this small business CRADA was to evaluate the performance of new airbag concepts which were developed by the Advanced Safety Concepts, Inc. (ASCI). These new airbag concepts, if successful, could have major potential savings to society in terms of fewer injuries, lost time and lives.

  6. Procuring complex performance:case: public infrastructure projects

    OpenAIRE

    Leppänen, T. (Tero)

    2015-01-01

    This research studies procuring complex performance (PCP) in the case of public infrastructure projects. The focus of the research is on the interface between public clients and private-sector contractors. The purpose of this research is to find out what the main challenges of different project delivery methods are according to the literature (RQ1) and what the practical challenges of public procurement are (RQ2). As an end re…

  7. Emotional intelligence and academic performance in first and final year medical students: a cross-sectional study.

    Science.gov (United States)

    Chew, Boon How; Zain, Azhar Md; Hassan, Faezah

    2013-03-27

    Research on emotional intelligence (EI) suggests that it is associated with more pro-social behavior, better academic performance and improved empathy towards patients. In medical education and clinical practice, EI has been related to higher academic achievement and improved doctor-patient relationships. This study examined the effect of EI on academic performance in first- and final-year medical students in Malaysia. This was a cross-sectional study using an objectively-scored measure of EI, the Mayer-Salovey-Caruso Emotional Intelligence Test (MSCEIT). Academic performance of medical school students was measured using continuous assessment (CA) and final examination (FE) results. The first- and final-year students were invited to participate during their second semester. Students answered a paper-based demographic questionnaire and completed the online MSCEIT on their own. Relationships between the total MSCEIT score and academic performance were examined using multivariate analyses. A total of 163 (84 year one and 79 year five) medical students participated (response rate of 66.0%). The gender and ethnic distribution were representative of the student population. The total EI score was a predictor of a good overall CA result (OR 1.01), a negative predictor of a poor overall CA result (OR 0.97), a predictor of a good overall FE result (OR 1.07) and was significantly related to the final-year FE marks (adjusted R² = 0.43). Medical students who were more emotionally intelligent performed better in both the continuous assessments and the final professional examination. Therefore, it is possible that emotional skill development may enhance medical students' academic performance.

  8. Performances and improvement of copper-hydrazine complexation deoxidising resin

    International Nuclear Information System (INIS)

    Liu Fenfen; Zhang Hao; Sun Haijun; Liu Xiaojie

    2012-01-01

    The copper-hydrazine complexation deoxidising resin was tested to examine its performance, including effluent water quality and deoxidising capacity. By changing the resin type and the regeneration procedure, the deoxidising capacity of the resin was improved to 13 times its previous value. At the same time, the physical performance of the resin was also greatly improved while maintaining its deoxidisation rate and effluent quality. (authors)

  9. Orchestra Festival Evaluations: Interjudge Agreement and Relationships between Performance Categories and Final Ratings.

    Science.gov (United States)

    Garman, Barry R.; And Others

    1991-01-01

    Band, orchestra, and choir festival evaluations are a regular part of many secondary school music programs, and most such festivals engage adjudicators who rate each group's performance. Because music ensemble performance is complex and multi-dimensional, it does not lend itself readily to precise measurement; generally, musical performances are…

  10. High performance parallel computing of flows in complex geometries: II. Applications

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Staffelbach, G; Vermorel, O; Duchaine, F; Boussuge, J-F; Poinsot, T

    2009-01-01

    Present regulations on pollutant emissions and noise, together with economic constraints, require new approaches and designs in the fields of energy supply and transportation. It is now well established that the next breakthrough will come from a better understanding of unsteady flow effects and from considering the entire system rather than only isolated components. However, these aspects are still not well captured by numerical approaches, nor well understood, at any design stage. The main challenge essentially stems from the computational requirements that such complex systems impose when they are simulated on supercomputers. This paper shows how these challenges can be addressed by using parallel computing platforms for distinct elements of more complex systems, as encountered in aeronautical applications. Based on numerical simulations performed with modern aerodynamic and reactive flow solvers, this work underlines the value of high-performance computing for solving flows in complex industrial configurations such as aircraft, combustion chambers and turbomachines. Performance indicators related to parallel computing efficiency are presented, showing that establishing fair criteria is a difficult task for complex industrial applications. Examples of numerical simulations performed in industrial systems are also described, with particular attention to computational time and to the potential design improvements obtained with high-fidelity and multi-physics computing methods. These simulations use either unsteady Reynolds-averaged Navier-Stokes methods or large eddy simulation and deal with turbulent unsteady flows and coupled flow phenomena (thermo-acoustic instabilities, buffet, etc.). Some examples of the difficulties with grid generation and data analysis when dealing with these complex industrial applications are also presented.
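
    The parallel-efficiency indicators mentioned here are typically speedup and efficiency, optionally compared against an Amdahl-law bound. The snippet below computes them for hypothetical wall-clock times; the numbers are illustrative and not taken from the paper.

```python
# Strong-scaling indicators: speedup, parallel efficiency, and the Amdahl bound.

def speedup(t_serial: float, t_parallel: float) -> float:
    return t_serial / t_parallel

def efficiency(t_serial: float, t_parallel: float, n_cores: int) -> float:
    return speedup(t_serial, t_parallel) / n_cores

def amdahl_speedup(serial_fraction: float, n_cores: int) -> float:
    """Upper bound on speedup when a fraction of the work cannot be parallelised."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_cores)

t1, t512 = 3600.0, 9.5        # hypothetical wall-clock times on 1 and 512 cores
print(f"speedup    = {speedup(t1, t512):.0f}x")
print(f"efficiency = {efficiency(t1, t512, 512):.2f}")
print(f"Amdahl bound (1% serial, 512 cores) = {amdahl_speedup(0.01, 512):.0f}x")
```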

  11. Performance in complex motor tasks deteriorates in hyperthermic humans

    DEFF Research Database (Denmark)

    Piil, Jacob Feder; Lundbye-Jensen, Jesper; Trangmar, Steven J

    2017-01-01

    Visuo-motor tracking performance was reduced by 10.7 ± 6.5% following exercise-induced hyperthermia when integrated in the multipart protocol and by 4.4 ± 5.7% when tested separately (both P […]); […] 1.3% […] math tasks […] of information or decision-making prior to responding. We hypothesized that divergences could relate to task complexity and developed a protocol consisting of 1) a simple motor task [TARGET_pinch], 2) a complex motor task [Visuo-motor tracking], 3) a simple math task [MATH_type], and 4) a combined motor-math task [MATH…

  12. Performance Engineering Research Institute SciDAC-2 Enabling Technologies Institute Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Mary [University of Utah

    2014-09-19

    Enhancing the performance of SciDAC applications on petascale systems has high priority within DOE SC. As we look to the future, achieving expected levels of performance on high-end computing (HEC) systems is growing ever more challenging due to enormous scale, increasing architectural complexity, and increasing application complexity. To address these challenges, PERI has implemented a unified, tripartite research plan encompassing: (1) performance modeling and prediction; (2) automatic performance tuning; and (3) performance engineering of high profile applications. The PERI performance modeling and prediction activity is developing and refining performance models, significantly reducing the cost of collecting the data upon which the models are based, and increasing model fidelity, speed and generality. Our primary research activity is automatic tuning (autotuning) of scientific software. This activity is spurred by the strong user preference for automatic tools and is based on previous successful activities such as ATLAS, which has automatically tuned components of the LAPACK linear algebra library, and other recent work on autotuning domain-specific libraries. Our third major component is application engagement, to which we are devoting approximately 30% of our effort to work directly with SciDAC-2 applications. This last activity not only helps DOE scientists meet their near-term performance goals, but also helps keep PERI research focused on the real challenges facing DOE computational scientists as they enter the Petascale Era.

  13. Influence of year-on-year performance on final degree classification in a chiropractic master's degree program.

    Science.gov (United States)

    Dewhurst, Philip; Rix, Jacqueline; Newell, David

    2016-03-01

    We explored if any predictors of success could be identified from end-of-year grades in a chiropractic master's program and whether these grades could predict final-year grade performance and year-on-year performance. End-of-year average grades and module grades for a single cohort of students covering all academic results for years 1-4 of the 2013 graduating class were used for this analysis. Analysis consisted of within-year correlations of module grades with end-of-year average grades, linear regression models for continuous data, and logistic regression models for predicting final degree classifications. In year 1, 140 students were enrolled; 85.7% of students completed the program 4 years later. End-of-year average grades for years 1-3 were correlated (Pearson r values ranging from .75 to .87), but the end-of-year grades for years 1-3 were poorly correlated with clinic internship performance. In linear regression, several modules were predictive of end-of-year average grades for each year. For year 1, logistic regression showed that the modules Physiology and Pharmacology and Investigative Imaging were predictive of year 1 performance (odds ratio [OR] = 1.15 and 0.9, respectively). In year 3, the modules Anatomy and Histopathology 3 and Problem Solving were predictors of the difference between a pass/merit or distinction final degree classification (OR = 1.06 and 1.12, respectively). Early academic performance is weakly correlated with final-year clinic internship performance. The modules of Anatomy and Histopathology year 3 and Problem Solving year 3 emerged more consistently than other modules as being associated with final-year classifications.

  14. Influence of complexing agents on the mechanical performances of the cement conditioning matrix

    International Nuclear Information System (INIS)

    Nicu, M.; Mihai, F.; Turcanu, C.

    1998-01-01

    The safety of the radioactive waste disposal is a priority demand concerning the protection of the environment and population. For this reason, an engineering multi-barrier system is studied in order to be improved. This study aims to establish the influence of the complexing agents on the mechanical performances of the cement conditioning matrix. Radioactive effluents which contain agents as oxalic and citric acids are generated during the radioactive decontamination operation using chemical methods. The conditioning of these wastes by cementing process imposed the experimental determination of the mechanical performances of the matrix and the upper permissible level of complexing agent concentration. To determine the influence of complexing agents on the mechanical performances of cement conditioning matrix, cubic samples (20 mm x 20 mm x 20 mm) were prepared using commercial Portland cement and solutions of organic complexing acids or salts (citric acid, oxalic acid, tartaric acid, sodium citrate and ammonium oxalate). The complexation concentration varied between 0.25% and 1% in distilled and drinking water, respectively. The selected cement/water ratio was 0.5. The experiments were focused on: - establishing the firmness of the Pa 35 cement pastes and mortars in dependence on the water/cement ratio, by classical methods (Tetmeyer probe for pastes and standard cone for mortars) and by triclinic time through a funnel with 15 mm aperture; - studying the influence of the tartaric, oxalic, citric acids, ammonium oxalate and sodium citrate solution concentrations on water quantities used to obtain pastes with normal firmness and on Pa 35 cement setting; - the influence of oxalic acid, tartaric acid and ammonium oxalate solution concentrations on the strength of compression of the pastes with normal firmness; - for testing, standard test bar cubes with 20 mm sides were used and the strength of compression was tested at 28 days; - establishing the behaviour in time of

  15. Complex Hand Dexterity: A Review of Biomechanical Methods for Measuring Musical Performance

    Directory of Open Access Journals (Sweden)

    Cheryl Diane Metcalf

    2014-05-01

    Complex hand dexterity is fundamental to our interactions with the physical, social and cultural environment. Dexterity can be an expression of creativity and precision in a range of activities, including musical performance. Little is understood about complex hand dexterity or how virtuoso expertise is acquired, due to the versatility of movement combinations available to complete any given task. This has historically limited progress of the field because of difficulties in measuring movements of the hand. Recent developments in methods of motion capture and analysis mean it is now possible to explore the intricate movements of the hand and fingers. These methods allow us insights into the neurophysiological mechanisms underpinning complex hand dexterity and motor learning. They also allow investigation into the key factors that contribute to injury, recovery and functional compensation. The application of such analytical techniques within musical performance provides a multidisciplinary framework for purposeful investigation into the process of learning and skill acquisition in instrumental performance. These highly skilled manual and cognitive tasks present the ultimate achievement in complex hand dexterity. This paper will review methods of assessing instrumental performance in music, focusing specifically on biomechanical measurement and the associated technical challenges faced when measuring highly dexterous activities.

  16. Clinical observed performance evaluation: a prospective study in final year students of surgery.

    LENUS (Irish Health Repository)

    Markey, G C

    2010-06-24

    We report a prospective study of clinical observed performance evaluation (COPE) for 197 medical students in the pre-qualification year of clinical education. Psychometric quality was the main endpoint. Students were assessed in groups of 5 in 40-min patient encounters, with each student the focus of evaluation for 8 min. Each student had a series of assessments in a 25-week teaching programme. Over time, several clinicians from a pool of 16 surgical consultants and registrars evaluated each student by direct observation. A structured rating form was used for assessment data. Variance component analysis (VCA), internal consistency and inter-rater agreement were used to estimate reliability. The predictive and convergent validity of COPE in relation to summative OSCE, long case, and overall final examination was estimated. Median number of COPE assessments per student was 7. Generalisability of a mean score over 7 COPE assessments was 0.66, equal to that of an 8 x 7.5 min station final OSCE. Internal consistency was 0.88-0.97 and inter-rater agreement 0.82. Significant correlations were observed with OSCE performance (R = 0.55 disattenuated) and long case (R = 0.47 disattenuated). Convergent validity was 0.81 by VCA. Overall final examination performance was linearly related to mean COPE score with standard error 3.7%. COPE permitted efficient serial assessment of a large cohort of final year students in a real world setting. Its psychometric quality compared well with conventional assessments and with other direct observation instruments as reported in the literature. Effect on learning, and translation to clinical care, are directions for future research.

  17. Geometric and Algebraic Approaches in the Concept of Complex Numbers

    Science.gov (United States)

    Panaoura, A.; Elia, I.; Gagatsis, A.; Giatilis, G.-P.

    2006-01-01

    This study explores pupils' performance and processes in tasks involving equations and inequalities of complex numbers requiring conversions from a geometric representation to an algebraic representation and conversions in the reverse direction, and also in complex numbers problem solving. Data were collected from 95 pupils of the final grade from…

  18. Development of High-Performance Cast Crankshafts. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, Mark E [General Motors, Detroit, MI (United States)

    2017-03-31

    The objective of this project was to develop technologies that would enable the production of cast crankshafts that can replace high performance forged steel crankshafts. To achieve this, the Ultimate Tensile Strength (UTS) of the new material needs to be 850 MPa with a desired minimum Yield Strength (YS; 0.2% offset) of 615 MPa and at least 10% elongation. Perhaps more challenging, the cast material needs to be able to achieve sufficient local fatigue properties to satisfy the durability requirements in today’s high performance gasoline and diesel engine applications. The project team focused on the development of cast steel alloys for application in crankshafts to take advantage of the higher stiffness over other potential material choices. The material and process developed should be able to produce high-performance crankshafts at no more than 110% of the cost of current production cast units, perhaps the most difficult objective to achieve. To minimize costs, the primary alloy design strategy was to design compositions that can achieve the required properties with minimal alloying and post-casting heat treatments. An Integrated Computational Materials Engineering (ICME) based approach was utilized, rather than relying only on traditional trial-and-error methods, which has been proven to accelerate alloy development time. Prototype melt chemistries designed using ICME were cast as test specimens and characterized iteratively to develop an alloy design within a stage-gate process. Standard characterization and material testing was done to validate the alloy performance against design targets and provide feedback to material design and manufacturing process models. Finally, the project called for Caterpillar and General Motors (GM) to develop optimized crankshaft designs using the final material and manufacturing processing path developed. A multi-disciplinary effort was to integrate finite element analyses by engine designers and geometry-specific casting

  19. Optimation and Determination of Fe-Oxinate Complex by Using High Performance Liquid Chromatography

    Science.gov (United States)

    Oktavia, B.; Nasra, E.; Sary, R. C.

    2018-04-01

    Growing demand for iron expands the industrial processes that use iron as a raw material, so control of industrial iron waste is very important. One method of iron analysis is indirect analysis of iron(III) ions by complexing them with 8-hydroxyquinoline (oxine). In this research, qualitative and quantitative tests of iron(III) ions were carried out in the form of a complex with oxine. The analysis was performed using HPLC at a wavelength of 470 nm with an ODS C18 column. Three methods of analysis were compared: (1) Fe-oxinate complexes were prepared in an ethanol solvent, so no further separation was needed; (2) Fe-oxinate complexes were made in chloroform, so that a solvent extraction was required before the complex was injected into the column; and (3) the complex was formed in the column, where the eluent contains the oxine and the metal ions are then injected. The resulting chromatograms show that the third method provides the best chromatogram for iron analysis.

  20. Where to from here? Future applications of mental models of complex performance

    International Nuclear Information System (INIS)

    Hahn, H.A.; Nelson, W.R.; Blackman, H.S.

    1988-01-01

    The purpose of this paper is to raise issues for discussion regarding the applications of mental models in the study of complex performance. Applications for training, expert systems and decision aids, job selection, workstation design, and other complex environments are considered. 1 ref

  1. Task complexity, student perceptions of vocabulary learning in EFL, and task performance.

    Science.gov (United States)

    Wu, Xiaoli; Lowyck, Joost; Sercu, Lies; Elen, Jan

    2013-03-01

    The study deepened our understanding of how students' self-efficacy beliefs contribute to the context of teaching English as a foreign language in the framework of cognitive mediational paradigm at a fine-tuned task-specific level. The aim was to examine the relationship among task complexity, self-efficacy beliefs, domain-related prior knowledge, learning strategy use, and task performance as they were applied to English vocabulary learning from reading tasks. Participants were 120 second-year university students (mean age 21) from a Chinese university. This experiment had two conditions (simple/complex). A vocabulary level test was first conducted to measure participants' prior knowledge of English vocabulary. Participants were then randomly assigned to one of the learning tasks. Participants were administered task booklets together with the self-efficacy scales, measures of learning strategy use, and post-tests. Data obtained were submitted to multivariate analysis of variance (MANOVA) and path analysis. Results from the MANOVA model showed a significant effect of vocabulary level on self-efficacy beliefs, learning strategy use, and task performance. Task complexity showed no significant effect; however, an interaction effect between vocabulary level and task complexity emerged. Results from the path analysis showed self-efficacy beliefs had an indirect effect on performance. Our results highlighted the mediating role of self-efficacy beliefs and learning strategy use. Our findings indicate that students' prior knowledge plays a crucial role on both self-efficacy beliefs and task performance, and the predictive power of self-efficacy on task performance may lie in its association with learning strategy use. © 2011 The British Psychological Society.

  2. The effects of physical threat on team processes during complex task performance

    NARCIS (Netherlands)

    Kamphuis, W.; Gaillard, A.W.K.; Vogelaar, A.L.W.

    2011-01-01

    Teams have become the norm for operating in dangerous and complex situations. To investigate how physical threat affects team performance, 27 three-person teams engaged in a complex planning and problem-solving task, either under physical threat or under normal conditions. Threat consisted of the

  3. Final anatomic and visual outcomes appear independent of duration of silicone oil intraocular tamponade in complex retinal detachment surgery.

    Science.gov (United States)

    Rhatigan, Maedbh; McElnea, Elizabeth; Murtagh, Patrick; Stephenson, Kirk; Harris, Elaine; Connell, Paul; Keegan, David

    2018-01-01

    To report anatomic and visual outcomes following silicone oil removal in a cohort of patients with complex retinal detachment, to determine the association between duration of tamponade and outcomes, and to compare patients with oil removed and those with oil in situ in terms of demographic, surgical and visual factors. We report a four-year retrospective case series of 143 patients with complex retinal detachments who underwent intraocular silicone oil tamponade. Analyses of anatomic and visual outcomes, baseline demographics, duration of tamponade and number of surgical procedures were carried out using Fisher's exact test and the unpaired two-tailed t-test. One hundred and six patients (76.2%) had undergone silicone oil removal at the time of review, with 96 patients (90.6%) showing retinal reattachment following oil removal. Duration of tamponade was not associated with the final reattachment rate or with a deterioration in best corrected visual acuity (BCVA). Patients with oil removed had significantly better baseline and final BCVA compared to those under oil tamponade (P=0.0001 and P<0.0001, respectively). Anatomic and visual outcomes in this cohort are in keeping with those reported in the literature. Favorable outcomes were seen with oil removal; duration of oil tamponade does not affect the final attachment rate with modern surgical techniques and should be managed on a case-by-case basis.

  4. Performance analysis and prediction in triathlon.

    Science.gov (United States)

    Ofoghi, Bahadorreza; Zeleznikow, John; Macmahon, Clare; Rehula, Jan; Dwyer, Dan B

    2016-01-01

    Performance in triathlon is dependent upon factors that include somatotype, physiological capacity, technical proficiency and race strategy. Given the multidisciplinary nature of triathlon and the interaction between each of the three race components, the identification of target split times that can be used to inform the design of training plans and race pacing strategies is a complex task. The present study uses machine learning techniques to analyse a large database of performances in Olympic distance triathlons (2008-2012). The analysis reveals patterns of performance in five components of triathlon (three race "legs" and two transitions) and the complex relationships between performance in each component and overall performance in a race. The results provide three perspectives on the relationship between performance in each component of triathlon and the final placing in a race. These perspectives allow the identification of target split times that are required to achieve a certain final place in a race and the opportunity to make evidence-based decisions about race tactics in order to optimise performance.
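
    One simple way to turn such race data into target split times is to summarize the splits of athletes who finished at or better than the desired place. The sketch below does this with synthetic data and median splits; it is only a stand-in for the machine learning analysis used in the study, and all column names and values are assumptions.

```python
import numpy as np
import pandas as pd

# Derive illustrative target split times for a desired finishing place from
# synthetic Olympic-distance results (five components: swim, T1, bike, T2, run).
rng = np.random.default_rng(4)
n = 300
races = pd.DataFrame({
    "swim": rng.normal(1100, 60, n),      # seconds
    "t1": rng.normal(40, 8, n),
    "bike": rng.normal(3500, 120, n),
    "t2": rng.normal(30, 6, n),
    "run": rng.normal(1900, 150, n),
})
races["total"] = races.sum(axis=1)
races["place"] = races["total"].rank(method="first").astype(int)

target_place = 10
targets = races.loc[races["place"] <= target_place,
                    ["swim", "t1", "bike", "t2", "run"]].median()
print("median split times (s) among top-10 finishers:\n", targets.round(1))
```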

  5. The Effect of Performance-Contingent Incentives when Task Complexity is Manipulated through Instruction

    Directory of Open Access Journals (Sweden)

    Monte Wynder

    2010-12-01

    When, and how, performance-contingent incentives improve performance is an important question for organisations. Empirical results have been mixed – performance-contingent incentives sometimes increase performance, sometimes decrease performance, and sometimes have no effect. Theorists have called for further research to identify the effect of various moderating variables, including knowledge and task complexity. This study responds by considering the role of instruction in providing the necessary knowledge to reduce task complexity. The results suggest that a performance-contingent penalty can be a particularly effective means of directing effort for a simple task. For a complex task, performance can be improved through instruction. The type of instruction is important – with rule-based instruction effectively directing effort – however principle-based instruction is necessary to facilitate problem investigation and problem solving.

  6. HIGH PERFORMANCE PIAA CORONAGRAPHY WITH COMPLEX AMPLITUDE FOCAL PLANE MASKS

    International Nuclear Information System (INIS)

    Guyon, Olivier; Martinache, Frantz; Belikov, Ruslan; Soummer, Remi

    2010-01-01

    We describe a coronagraph approach where the performance of a Phase-Induced Amplitude Apodization (PIAA) coronagraph is improved by using a partially transmissive phase-shifting focal plane mask and a Lyot stop. This approach combines the low inner working angle offered by phase mask coronagraphy, the full throughput and uncompromised angular resolution of the PIAA approach, and the design flexibility of the Apodized Pupil Lyot Coronagraph. A PIAA complex mask coronagraph (PIAACMC) is fully described by the focal plane mask size, or, equivalently, its complex transmission which ranges from 0 (opaque) to -1 (phase shifting). For all values of the transmission, the PIAACMC theoretically offers full on-axis extinction and 100% throughput at large angular separations. With a pure phase focal plane mask (complex transmission = -1), the PIAACMC offers 50% throughput at 0.64 λ/D while providing total extinction of an on-axis point source. This performance is very close to the 'fundamental performance limit' of coronagraphy derived from first principles. For very high contrast levels, imaging performance with PIAACMC is in practice limited by the angular size of the on-axis target (usually a star). We show that this fundamental limitation must be taken into account when choosing the optimal value of the focal plane mask size in the PIAACMC design. We show that the PIAACMC enables visible imaging of Jupiter-like planets at ∼1.2 λ/D from the host star, and can therefore offer almost three times more targets than a PIAA coronagraph optimized for this type of observation. We find that for visible imaging of Earth-like planets, the PIAACMC gain over a PIAA is probably much smaller, as coronagraphic performance is then strongly constrained by stellar angular size. For observations at 'low' contrast (below ∼10⁸), the PIAACMC offers significant performance enhancement over PIAA. This is especially relevant for ground-based high contrast imaging systems in the near-IR, where

  7. Self-management support by final year nursing students: A correlational study of performance and person-related associated factors.

    Science.gov (United States)

    Duprez, Veerle; Beeckman, Dimitri; Verhaeghe, Sofie; Van Hecke, Ann

    2017-09-01

    Chronic conditions put a heavy burden on healthcare in every country. Supporting persons with a chronic illness to take an active role in the management of their condition is a core component in the Chronic Care Model. It implies confidence and good skills from professionals. To date, there is no evidence on final year nursing students' performance in supporting patients' self-management, nor on factors associated with this performance. To explore self-reported performance of supporting patients' self-management by final year nursing students, and person-related factors associated with this performance. A correlational multi-centre study of final year nursing students (N=256) from eight nursing schools. Students were recruited from a convenience sample of eight nursing schools. All final year students were invited to participate. Data were collected between January 2015 and May 2016 using self-administered validated questionnaires. Theoretical behavioural frameworks were used to select hypothesized associated factors for self-management support: self-efficacy to perform self-management support and socio-structural factors (Social Cognitive Theory); needs for autonomy, competence and relatedness, and patient-invested contingent self-esteem (Self-Determination Theory); and attitudes towards supporting patients' self-management (Theory of Planned Behaviour). Final year nursing students (N=256) reported an overall low level of performance in delivering self-management support during internship. Students lacked mainly competencies in collaborative goal setting and shared decision making. Students reported a significant gap between their confidence and their actual performance in self-management support (p […]). Learning opportunities can be introduced in classroom activities and on internship. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Variations in task constraints shape emergent performance outcomes and complexity levels in balancing.

    Science.gov (United States)

    Caballero Sánchez, Carla; Barbado Murillo, David; Davids, Keith; Moreno Hernández, Francisco J

    2016-06-01

    This study investigated the extent to which specific interacting constraints of performance might increase or decrease the emergent complexity in a movement system, and whether this could affect the relationship between observed movement variability and the central nervous system's capacity to adapt to perturbations during balancing. Fifty-two healthy volunteers performed eight trials where different performance constraints were manipulated: task difficulty (three levels) and visual biofeedback conditions (with and without the center of pressure (COP) displacement and a target displayed). Balance performance was assessed using COP-based measures: mean velocity magnitude (MVM) and bivariate variable error (BVE). To assess the complexity of COP, fuzzy entropy (FE) and detrended fluctuation analysis (DFA) were computed. ANOVAs showed that MVM and BVE increased when task difficulty increased. During biofeedback conditions, individuals showed higher MVM but lower BVE at the easiest level of task difficulty. Overall, higher FE and lower DFA values were observed when biofeedback was available. On the other hand, FE reduced and DFA increased as difficulty level increased, in the presence of biofeedback. However, when biofeedback was not available, the opposite trend in FE and DFA values was observed. Regardless of changes to task constraints and the variable investigated, balance performance was positively related to complexity in every condition. Data revealed how specificity of task constraints can result in an increase or decrease in complexity emerging in a neurobiological system during balance performance.
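
    Of the two complexity measures used here, detrended fluctuation analysis (DFA) is the easier to sketch: integrate the mean-removed COP series, detrend it in windows of increasing size, and take the slope of log fluctuation versus log window size as the scaling exponent. The implementation below is a minimal illustration with a synthetic test signal, not the study's analysis pipeline (fuzzy entropy is omitted).

```python
import numpy as np

# Minimal detrended fluctuation analysis (DFA); window sizes and the test
# signal are illustrative assumptions.
def dfa_alpha(x: np.ndarray, scales=(16, 32, 64, 128, 256)) -> float:
    y = np.cumsum(x - np.mean(x))                      # integrated profile
    flucts = []
    for n in scales:
        n_windows = len(y) // n
        segments = y[: n_windows * n].reshape(n_windows, n)
        t = np.arange(n)
        rms = []
        for seg in segments:
            coeffs = np.polyfit(t, seg, 1)             # linear detrending per window
            rms.append(np.sqrt(np.mean((seg - np.polyval(coeffs, t)) ** 2)))
        flucts.append(np.mean(rms))
    # scaling exponent alpha = slope of log F(n) versus log n
    return float(np.polyfit(np.log(scales), np.log(flucts), 1)[0])

signal = np.cumsum(np.random.default_rng(2).standard_normal(4096))  # Brownian-like test series
print("DFA alpha ≈", round(dfa_alpha(signal), 2))      # ~1.5 expected for integrated white noise
```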

  9. Does model performance improve with complexity? A case study with three hydrological models

    Science.gov (United States)

    Orth, Rene; Staudinger, Maria; Seneviratne, Sonia I.; Seibert, Jan; Zappa, Massimiliano

    2015-04-01

    In recent decades considerable progress has been made in climate model development. Following the massive increase in computational power, models became more sophisticated. At the same time also simple conceptual models have advanced. In this study we validate and compare three hydrological models of different complexity to investigate whether their performance varies accordingly. For this purpose we use runoff and also soil moisture measurements, which allow a truly independent validation, from several sites across Switzerland. The models are calibrated in similar ways with the same runoff data. Our results show that the more complex models HBV and PREVAH outperform the simple water balance model (SWBM) in case of runoff but not for soil moisture. Furthermore the most sophisticated PREVAH model shows an added value compared to the HBV model only in case of soil moisture. Focusing on extreme events we find generally improved performance of the SWBM during drought conditions and degraded agreement with observations during wet extremes. For the more complex models we find the opposite behavior, probably because they were primarily developed for prediction of runoff extremes. As expected given their complexity, HBV and PREVAH have more problems with over-fitting. All models show a tendency towards better performance in lower altitudes as opposed to (pre-) alpine sites. The results vary considerably across the investigated sites. In contrast, the different metrics we consider to estimate the agreement between models and observations lead to similar conclusions, indicating that the performance of the considered models is similar at different time scales as well as for anomalies and long-term means. We conclude that added complexity does not necessarily lead to improved performance of hydrological models, and that performance can vary greatly depending on the considered hydrological variable (e.g. runoff vs. soil moisture) or hydrological conditions (floods vs. droughts).

  10. Frequency of chest pain in primary care, diagnostic tests performed and final diagnoses.

    Science.gov (United States)

    Hoorweg, Beatrijs Bn; Willemsen, Robert Ta; Cleef, Lotte E; Boogaerts, Tom; Buntinx, Frank; Glatz, Jan Fc; Dinant, Geert Jan

    2017-11-01

    Observational study of patients with chest pain in primary care: determination of incidence, referral rate, diagnostic tests and (agreement between) working and final diagnoses. 118 general practitioners (GPs) in the Netherlands and Belgium recorded all patient contacts during 2 weeks. Furthermore, patients presenting with chest pain were registered extensively. A follow-up form was filled in after 30 days. 22 294 patient contacts were registered. In 281 (1.26%), chest pain was a reason for consulting the GP (mean age for men 54.4/women 53 years). In this cohort of 281 patients, in 38.1% of patients, acute coronary syndrome (ACS) was suspected at least temporarily during consultation, 40.2% of patients were referred to secondary care and 512 diagnostic tests were performed by GPs and consulted specialists. Musculoskeletal pain was the most frequent working (26.1%) and final diagnoses (33.1%). Potentially life-threatening diseases as final diagnosis (such as myocardial infarction) accounted for 8.4% of all chest pain cases. In 23.1% of cases, a major difference between working and final diagnoses was found; in 0.7% a severe disease was initially missed by the GP. Chest pain was present in 281 patients (1.26% of all consultations). Final diagnoses were mostly non-life-threatening. Nevertheless, in 8.4% of patients with chest pain, life-threatening underlying causes were identified. This seems reflected in the magnitude and wide variety of diagnostic tests performed in these patients by GPs and specialists, in the (safe) overestimation of life-threatening diseases by GPs at initial assessment and in the high referral rate we found. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.

  11. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    In this paper, we bring together research on complex problem solving with that on motivational psychology about goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown with simple tasks that high, specific performance goals lead to better performance outcomes than do-your-best goals. However, in complex tasks, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014), so-called motto-goals (e.g., "I breathe happiness") should activate a person’s resources through positive affect. It was found that motto-goals are effective with unpleasant duties. Therefore, we tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. Depending on their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, where subjects worked as managers in a small fictional company. Contrary to expectations, there was no main effect of goal condition on management performance. As hypothesized, motto-goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto-goals reported the lowest rates of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediational analyses and the influence of goal type on perceived goal attainment.

  12. A Novel Method for Assessing Task Complexity in Outpatient Clinical-Performance Measures.

    Science.gov (United States)

    Hysong, Sylvia J; Amspoker, Amber B; Petersen, Laura A

    2016-04-01

    Clinical-performance measurement has helped improve the quality of health-care; yet success in attaining high levels of quality across multiple domains simultaneously still varies considerably. Although many sources of variability in care quality have been studied, the difficulty required to complete the clinical work itself has received little attention. We present a task-based methodology for evaluating the difficulty of clinical-performance measures (CPMs) by assessing the complexity of their component requisite tasks. Using Functional Job Analysis (FJA), subject-matter experts (SMEs) generated task lists for 17 CPMs; task lists were rated on ten dimensions of complexity, and then aggregated into difficulty composites. Setting and participants: Eleven outpatient work SMEs; 133 VA Medical Centers nationwide. Clinical Performance: 17 outpatient CPMs (2000-2008) at 133 VA Medical Centers nationwide. Measure Difficulty: for each CPM, the number of component requisite tasks and the average rating across ten FJA complexity scales for the set of tasks comprising the measure. Measures varied considerably in the number of component tasks (M = 10.56, SD = 6.25, min = 5, max = 25). Measures of chronic care following acute myocardial infarction exhibited significantly higher measure difficulty ratings compared to diabetes or screening measures, but not to immunization measures (mean difficulty ratings = 0.45, -0.04, -0.05, and -0.06, respectively; F(3, 186) = 3.57, p = 0.015). Measure difficulty ratings were not significantly correlated with the number of component tasks (r = -0.30, p = 0.23). Evaluating the difficulty of achieving recommended CPM performance levels requires more than simply counting the tasks involved; using FJA to assess the complexity of CPMs' component tasks presents an alternate means of assessing the difficulty of primary-care CPMs and accounting for performance variation among measures and performers. This in turn could be used in designing

  13. Comment Response on the Final Report: Peer Review of the Total System Performance Assessment-Viability Assessment (TSPA-VA)

    International Nuclear Information System (INIS)

    Pendleton, M. W.

    1999-01-01

    The Management and Operating Contractor established a Performance Assessment Peer Review Panel (hereinafter "the Panel") at the request of the U.S. Department of Energy Yucca Mountain Site Characterization Office. The objectives of the peer review were to provide: (1) A formal, independent evaluation and critique of Viability Assessment of a Repository at Yucca Mountain: Total System Performance Assessment, Volume 3 (DOE 1998a; hereinafter "Total System Performance Assessment-Viability Assessment") that was conducted in support of the Viability Assessment of a Repository at Yucca Mountain (DOE 1998b). (2) Suggestions for improvements as the U.S. Department of Energy prepares to develop the documentation for a Total System Performance Assessment to support a potential License Application. The Panel conducted a phased review over a two-year period to observe the development and, ultimately, to review the Total System Performance Assessment-Viability Assessment (DOE 1998a). During the development of the Total System Performance Assessment-Viability Assessment (DOE 1998a), the Panel submitted three Interim Reports (Whipple et al., 1997a, 1997b, and 1998) to the Management and Operating Contractor with recommendations and comments on the process models, model abstractions, and draft documentation for the Total System Performance Assessment-Viability Assessment (DOE 1998a). The Panel's Final Report Total System Performance Assessment Peer Review Panel (Whipple et al. 1999; hereinafter "Final Report") on the Total System Performance Assessment-Viability Assessment (DOE 1998a) is based primarily on the completed Total System Performance Assessment-Viability Assessment (DOE 1998a), the Total System Performance Assessment-Viability Assessment (TSPA-VA) Analyses Technical Basis Document (CRWMS M and O 1998), and the cited references. The Final Report (Whipple et al. 1999) includes the major points from the three Interim Reports (Whipple et al. 1997a, 1997b, and 1998

  15. The disruptive effects of pain on complex cognitive performance and executive control.

    Science.gov (United States)

    Keogh, Edmund; Moore, David J; Duggan, Geoffrey B; Payne, Stephen J; Eccleston, Christopher

    2013-01-01

    Pain interferes and disrupts attention. What is less clear is how pain affects performance on complex tasks, and the strategies used to ensure optimal outcomes. The aim of the current study was to examine the effect of pain on higher-order executive control processes involved in managing complex tasks. Sixty-two adult volunteers (40 female) completed two computer-based tasks: a breakfast making task and a word generation puzzle. Both were complex, involving executive control functions, including goal-directed planning and switching. Half of those recruited performed the tasks under conditions of thermal heat pain, and half with no accompanying pain. Whilst pain did not affect central performance on either task, it did have indirect effects. For the breakfast task, pain resulted in a decreased ability to multitask, with performance decrements found on the secondary task. However, no effects of pain were found on the processes thought to underpin this task. For the word generation puzzle, pain did not affect task performance, but did alter subjective accounts of the processes used to complete the task; pain affected the perceived allocation of time to the task, as well as switching perceptions. Sex differences were also found. When studying higher-order cognitive processes, pain-related interference effects are varied, and may result in subtle or indirect changes in cognition.

  16. Y-12 National Security Complex Emergency Management Hazards Assessment (EMHA) Process; FINAL

    International Nuclear Information System (INIS)

    Bailiff, E.F.; Bolling, J.D.

    2001-01-01

    This document establishes requirements and standard methods for the development and maintenance of the Emergency Management Hazards Assessment (EMHA) process used by the lead and all event contractors at the Y-12 Complex for emergency planning and preparedness. The EMHA process provides the technical basis for the Y-12 emergency management program. The instructions provided in this document include methods and requirements for performing the following emergency management activities at Y-12: (1) hazards identification; (2) hazards survey; and (3) hazards assessment.

  17. Relationship between push phase and final race time in skeleton performance.

    Science.gov (United States)

    Zanoletti, Costanza; La Torre, Antonio; Merati, Giampiero; Rampinini, Ermanno; Impellizzeri, Franco M

    2006-08-01

    The aim of this study was to examine the relationship between push-time and final race time in skeleton participants during a series of major international competitions to determine the importance of the push phase in skeleton performance. Correlations were computed from the first and second heat split data measured during 24 men and 24 women skeleton competitions. Body mass, height, age, and years of experience of the first 30 men and women athletes of the skeleton, bobsleigh and luge 2003-2004 World Cup ranking were used for the comparison between sliding sports. Moderate but significant correlations were found between push-time and final race time in men (r(mean) = 0.48) and women (r(mean) = 0.63). No correlations were found between changes in the individual push-time between the first and second heat with the corresponding changes in final race time. The bobsleigh sliders are heavier than the athletes of the other sliding disciplines. Luge athletes have more experience and are younger than bobsleigh and skeleton sliders. The results of this study suggest that a fast push phase is a prerequisite to success in competition and confirm that the selection of skeleton athletes based on the ability to accelerate to a maximum speed quickly could be valid. However, a good or improved push-time does not ensure a placement in the top finishing positions. On the basis of these results, we suggest that strength and power training is necessary to maintain a short push-time but additional physical training aimed to enhance the push phase might not reflect performance improvements. The recruitment of younger athletes and an increase of youthful competitive activity may be another effective way to reach international competitive results.
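
    The analysis described above is essentially a bivariate correlation between heat push-times and final race times, repeated for change scores between heats. A minimal Python sketch of that computation is given below; the arrays are illustrative placeholders, not data from the study.

      import numpy as np

      # Hypothetical paired split data for one heat (seconds); illustrative only.
      push_times = np.array([4.85, 4.92, 5.01, 4.78, 4.95, 5.10, 4.88, 4.99])
      final_times = np.array([57.2, 57.9, 58.4, 56.8, 58.1, 58.9, 57.5, 58.3])

      # Pearson correlation between push-time and final race time,
      # analogous to the r(mean) values reported for men and women.
      r = np.corrcoef(push_times, final_times)[0, 1]
      print(f"r(push-time, final time) = {r:.2f}")

      # The same computation on heat-to-heat change scores tests whether an
      # improved push-time translates into an improved final time.
      push_change = np.array([-0.05, 0.02, -0.01, 0.03, -0.04, 0.00, 0.01, -0.02])
      final_change = np.array([-0.3, 0.4, 0.1, 0.2, -0.1, 0.3, 0.0, -0.2])
      r_change = np.corrcoef(push_change, final_change)[0, 1]
      print(f"r(change in push-time, change in final time) = {r_change:.2f}")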

  18. Benchmarking in pathology: development of a benchmarking complexity unit and associated key performance indicators.

    Science.gov (United States)

    Neil, Amanda; Pfeffer, Sally; Burnett, Leslie

    2013-01-01

    This paper details the development of a new type of pathology laboratory productivity unit, the benchmarking complexity unit (BCU). The BCU provides a comparative index of laboratory efficiency, regardless of test mix. It also enables estimation of a measure of how much complex pathology a laboratory performs, and the identification of peer organisations for the purposes of comparison and benchmarking. The BCU is based on the theory that wage rates reflect productivity at the margin. A weighting factor for the ratio of medical to technical staff time was dynamically calculated based on actual participant site data. Given this weighting, a complexity value for each test, at each site, was calculated. The median complexity value (number of BCUs) for that test across all participating sites was taken as its complexity value for the Benchmarking in Pathology Program. The BCU allowed implementation of an unbiased comparison unit and test listing that was found to be a robust indicator of the relative complexity for each test. Employing the BCU data, a number of Key Performance Indicators (KPIs) were developed, including three that address comparative organisational complexity, analytical depth and performance efficiency, respectively. Peer groups were also established using the BCU combined with simple organisational and environmental metrics. The BCU has enabled productivity statistics to be compared between organisations. The BCU corrects for differences in test mix and workload complexity of different organisations and also allows for objective stratification into peer groups.
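
    The record describes the BCU as a wage-weighted combination of technical and medical staff time per test, with the median value across participating sites taken as the benchmark complexity value. The exact formula is not reproduced in the record, so the sketch below is an assumption-laden illustration: the wage ratio, staff minutes, site names, and test names are all hypothetical.

      from statistics import median

      # Hypothetical staff minutes per test at each participating site:
      # {site: {test: (technical_minutes, medical_minutes)}}
      site_data = {
          "site_A": {"FBC": (3.0, 0.5), "histopathology": (20.0, 15.0)},
          "site_B": {"FBC": (2.5, 0.4), "histopathology": (25.0, 12.0)},
          "site_C": {"FBC": (3.2, 0.6), "histopathology": (22.0, 18.0)},
      }

      # Assumed weighting factor for medical relative to technical time,
      # reflecting the ratio of wage rates (illustrative value only).
      wage_ratio = 3.5

      def complexity_value(tech_min, med_min):
          """Wage-weighted staff time for one test at one site (assumed form)."""
          return tech_min + wage_ratio * med_min

      # Benchmarking complexity unit per test = median complexity value across sites.
      tests = {t for site in site_data.values() for t in site}
      bcu = {
          test: median(complexity_value(*site[test])
                       for site in site_data.values() if test in site)
          for test in tests
      }
      print(bcu)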

  19. The Complexity of Mitochondrial Complex IV: An Update of Cytochrome c Oxidase Biogenesis in Plants

    Science.gov (United States)

    Mansilla, Natanael; Racca, Sofia; Gras, Diana E.; Gonzalez, Daniel H.

    2018-01-01

    Mitochondrial respiration is an energy producing process that involves the coordinated action of several protein complexes embedded in the inner membrane to finally produce ATP. Complex IV or Cytochrome c Oxidase (COX) is the last electron acceptor of the respiratory chain, involved in the reduction of O2 to H2O. COX is a multimeric complex formed by multiple structural subunits encoded in two different genomes, prosthetic groups (heme a and heme a3), and metallic centers (CuA and CuB). Tens of accessory proteins are required for mitochondrial RNA processing, synthesis and delivery of prosthetic groups and metallic centers, and for the final assembly of subunits to build a functional complex. In this review, we perform a comparative analysis of COX composition and biogenesis factors in yeast, mammals and plants. We also describe possible external and internal factors controlling the expression of structural proteins and assembly factors at the transcriptional and post-translational levels, and the effect of deficiencies in different steps of COX biogenesis to infer the role of COX in different aspects of plant development. We conclude that COX assembly in plants has conserved and specific features, probably due to the incorporation of a different set of subunits during evolution. PMID:29495437

  20. Executive Functioning and School Performance Among Pediatric Survivors of Complex Congenital Heart Disease

    Science.gov (United States)

    Gerstle, Melissa; Beebe, Dean W.; Drotar, Dennis; Cassedy, Amy; Marino, Bradley S.

    2016-01-01

    Objective To investigate the presence and severity of real-world impairments in executive functioning (responsible for children's regulatory skills: metacognition and behavioral regulation) and their potential impact on school performance among pediatric survivors of complex congenital heart disease (CHD). Study design Survivors of complex CHD aged 8-16 years (n=143) and their parents/guardians from a regional CHD survivor registry participated (81% participation rate). Parents completed proxy measures of executive functioning, school competency, and school-related quality of life (QOL). Patients also completed a measure of school QOL and underwent IQ testing. Patients were categorized into two groups based on heart lesion complexity: two-ventricle or single-ventricle. Results Survivors of complex CHD performed significantly worse than norms for executive functioning, IQ, school competency, and school QOL. Metacognition was more severely affected than behavioral regulation, and metacognitive deficits were more often present in older children. Even after taking into account demographic factors, disease severity, and IQ, metacognition uniquely and strongly predicted poorer school performance. In exploratory analyses, patients with single-ventricle lesions were rated as having lower school competency and school QOL, and patients with two-ventricle lesions were rated as having poorer behavioral regulation. Conclusions Survivors of complex CHD experience greater executive functioning difficulties than healthy peers, with metacognition particularly impacted and particularly relevant for day-to-day school performance. Especially in older children, clinicians should watch for metacognitive deficits, such as problems with organization, planning, self-monitoring, and follow-through on tasks. PMID:26875011

  1. Measuring cognitive load: performance, mental effort and simulation task complexity.

    Science.gov (United States)

    Haji, Faizal A; Rojas, David; Childs, Ruth; de Ribaupierre, Sandrine; Dubrowski, Adam

    2015-08-01

    Interest in applying cognitive load theory in health care simulation is growing. This line of inquiry requires measures that are sensitive to changes in cognitive load arising from different instructional designs. Recently, mental effort ratings and secondary task performance have shown promise as measures of cognitive load in health care simulation. We investigate the sensitivity of these measures to predicted differences in intrinsic load arising from variations in task complexity and learner expertise during simulation-based surgical skills training. We randomly assigned 28 novice medical students to simulation training on a simple or complex surgical knot-tying task. Participants completed 13 practice trials, interspersed with computer-based video instruction. On trials 1, 5, 9 and 13, knot-tying performance was assessed using time and movement efficiency measures, and cognitive load was assessed using subjective rating of mental effort (SRME) and simple reaction time (SRT) on a vibrotactile stimulus-monitoring secondary task. Significant improvements in knot-tying performance (F(1.04,24.95) = 41.1) and significant changes in the cognitive load measures (F(2.3,58.5) = 57.7) were observed over the practice trials, indicating that both measures are sensitive to changes in cognitive load among novices engaged in simulation-based learning. These measures can be used to track cognitive load during skills training. Mental effort ratings are also sensitive to small differences in intrinsic load arising from variations in the physical complexity of a simulation task. The complementary nature of these subjective and objective measures suggests their combined use is advantageous in simulation instructional design research. © 2015 John Wiley & Sons Ltd.

  2. 42 CFR 493.1415 - Condition: Laboratories performing moderate complexity testing; clinical consultant.

    Science.gov (United States)

    2010-10-01

    ... § 493.1415 Condition: Laboratories performing moderate complexity testing; clinical consultant. The laboratory must have a clinical consultant who meets the qualification requirements of § 493.1417 of this... 42 Public Health 5 2010-10-01 2010-10-01 false Condition: Laboratories performing moderate...

  3. Final hazard classification and auditable safety analysis for the 308 Building Complex during post-deactivation surveillance and maintenance mode

    International Nuclear Information System (INIS)

    Dexheimer, D.

    1996-11-01

    This document summarizes the inventories of radioactive and hazardous materials present within the 308 Building Complex, and presents the hazard evaluation methodology used to prepare the hazard classification for the Complex. The complex includes the 308 Building (process area and office facilities) and the 308 Building Annex, which includes the former Neutron Radiography Facility containing a shutdown (and partially decommissioned) reactor. This document applies to the post-deactivation surveillance and maintenance mode only, and provides an authorization basis limited to surveillance and maintenance activities. This document does not authorize decommissioning and decontamination activities, movement of fissile materials, modification to facility confinement structures, or the introduction or storage of additional radionuclides in the 308 Building Complex. This document establishes a final hazard classification and identifies appropriate and adequate safety functions and controls to reduce or mitigate the risk associated with the surveillance and maintenance mode. The most consequential hazard event scenario is a postulated unmitigated release from an earthquake event involving the entire complex. That release is equivalent to 30% of the Nuclear Category 3 threshold adjusted as allowed by DOE-STD-1027-92 (DOE 1992). The dominant isotopes are 239Pu, 240Pu, and 241Am in the gloveboxes.

  4. Atmospheric stability and topography effects on wind turbine performance and wake properties in complex terrain

    DEFF Research Database (Denmark)

    Han, Xingxing; Liu, Deyou; Xu, Chang

    2018-01-01

    This paper evaluates the influence of atmospheric stability and topography on wind turbine performance and wake properties in complex terrain. To assess atmospheric stability effects on wind turbine performance, an equivalent wind speed calculated with the power output and the manufacturer power...... and topography have significant influences on wind turbine performance and wake properties. Considering effects of atmospheric stability and topography will benefit the wind resource assessment in complex terrain....

  5. 7X performance results - final report : ASCI Red vs Red Storm.

    Energy Technology Data Exchange (ETDEWEB)

    Dinge, Dennis C. (Cray Inc., Albuquerque, NM); Davis, Michael E. (Cray Inc., Albuquerque, NM); Haskell, Karen H.; Ballance, Robert A.; Gardiner, Thomas Anthony; Stevenson, Joel O.; Noe, John P.

    2011-04-01

    The goal of the 7X performance testing was to assure Sandia National Laboratories, Cray Inc., and the Department of Energy that Red Storm would achieve its performance requirements, which were defined as a comparison between ASCI Red and Red Storm. Our approach was to identify one or more problems for each application in the 7X suite, run those problems at multiple processor sizes in the capability computing range, and compare the results between ASCI Red and Red Storm. The first part of this report describes the two computer systems, the applications in the 7X suite, the test problems, and the results of the performance tests on ASCI Red and Red Storm. During the course of the testing on Red Storm, we had the opportunity to run the test problems in both single-core mode and dual-core mode, and the second part of this report describes those results. Finally, we reflect on lessons learned in undertaking a major head-to-head benchmark comparison.

  6. Mining Important Nodes in Directed Weighted Complex Networks

    Directory of Open Access Journals (Sweden)

    Yunyun Yang

    2017-01-01

    Full Text Available In complex networks, mining important nodes has been a matter of concern for scholars. In recent years, scholars have focused on mining important nodes in undirected unweighted complex networks, but most of those methods are not applicable to directed weighted complex networks. Therefore, this paper proposes a Two-Way-PageRank method based on PageRank for mining important nodes in directed weighted complex networks. We mainly consider the frequency of contact between nodes and the length of contact time between nodes, and we take into account the sources of a node's links (in-degree) and their destinations (out-degree) simultaneously. We define node importance indicators and, through numerical examples, analyze the impact of variations in some parameters on these indicators. Finally, the accuracy and validity of the method are verified on empirical network data.
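
    The record indicates that node importance is scored from both the sources of a node's links (the in-degree side) and their destinations (the out-degree side) in a weighted directed graph. The paper's exact formulation is not given here, so the sketch below simply runs weighted PageRank on the graph and on its reverse and mixes the two scores; the example graph, edge weights, and the convex combination rule are assumptions for illustration.

      import networkx as nx

      # Hypothetical directed weighted graph; a weight could encode contact
      # frequency and contact duration between two nodes, as the record suggests.
      G = nx.DiGraph()
      G.add_weighted_edges_from([
          ("a", "b", 3.0), ("b", "c", 1.0), ("c", "a", 2.0),
          ("a", "c", 0.5), ("d", "a", 4.0), ("c", "d", 1.5),
      ])

      # PageRank on the original graph reflects where a node's links come from;
      # PageRank on the reversed graph reflects where its links go.
      pr_in = nx.pagerank(G, alpha=0.85, weight="weight")
      pr_out = nx.pagerank(G.reverse(copy=True), alpha=0.85, weight="weight")

      # Assumed combination rule: a convex mix of the two scores.
      mix = 0.5
      importance = {n: mix * pr_in[n] + (1 - mix) * pr_out[n] for n in G.nodes}
      for node, score in sorted(importance.items(), key=lambda kv: -kv[1]):
          print(node, round(score, 4))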

  7. The Influence of Time Pressure and Case Complexity on Physicians' Diagnostic Performance

    Directory of Open Access Journals (Sweden)

    Dalal A. ALQahtani

    2016-12-01

    Conclusions: Time pressure did not impact the diagnostic performance, whereas the complexity of the clinical case negatively influenced the diagnostic accuracy. Further studies with the enhanced experimental manipulation of time pressure are needed to reveal the effect of time pressure, if any, on a physician's diagnostic performance.

  8. Radiological Characterization and Final Facility Status Report Tritium Research Laboratory

    International Nuclear Information System (INIS)

    Garcia, T.B.; Gorman, T.P.

    1996-08-01

    This document contains the specific radiological characterization information on Building 968, the Tritium Research Laboratory (TRL) Complex and Facility. We performed the characterization as outlined in its Radiological Characterization Plan. The Radiological Characterization and Final Facility Status Report (RC&FFSR) provides historic background information on each laboratory within the TRL complex as related to its original and present radiological condition. Along with the work outlined in the Radiological Characterization Plan (RCP), we performed a Radiological Soils Characterization, Radiological and Chemical Characterization of the Waste Water Hold-up System including all drains, and a Radiological Characterization of the Building 968 roof ventilation system. These characterizations will provide the basis for the Sandia National Laboratory, California (SNL/CA) Site Termination Survey Plan, when appropriate.

  9. Performance and Complexity Evaluation of Iterative Receiver for Coded MIMO-OFDM Systems

    Directory of Open Access Journals (Sweden)

    Rida El Chall

    2016-01-01

    Full Text Available Multiple-input multiple-output (MIMO) technology in combination with channel coding techniques is a promising solution for reliable high data rate transmission in future wireless communication systems. However, these technologies pose significant challenges for the design of an iterative receiver. In this paper, an efficient receiver combining soft-input soft-output (SISO) detection based on a low-complexity K-Best (LC-K-Best) decoder with various forward error correction codes, namely, the LTE turbo decoder and the LDPC decoder, is investigated. We first investigate the convergence behaviors of the iterative MIMO receivers to determine the required inner and outer iterations. Consequently, the performance of the LC-K-Best-based receiver is evaluated in various LTE channel environments and compared with other MIMO detection schemes. Moreover, the computational complexity of the iterative receiver with different channel coding techniques is evaluated and compared for different modulation orders and coding rates. Simulation results show that the LC-K-Best-based receiver achieves satisfactory performance-complexity trade-offs.

  10. Endogeneity in Strategy-Performance Analysis

    DEFF Research Database (Denmark)

    Rocha, Vera; Van Praag, Mirjam; B. Folta, Timothy

    2018-01-01

    , such as employees, strategic partners, customers, or investors, whose choices and preferences also affect the final decision. We discuss how endogeneity can plague the measurement of the performance effects of these two-sided strategic decisions—which are more complex, but more realistic, than prior representations...

  11. Performance Analysis with Network-Enhanced Complexities: On Fading Measurements, Event-Triggered Mechanisms, and Cyber Attacks

    Directory of Open Access Journals (Sweden)

    Derui Ding

    2014-01-01

    Full Text Available Nowadays, real-world systems are usually subject to various complexities such as parameter uncertainties, time-delays, and nonlinear disturbances. For networked systems, especially large-scale systems such as multiagent systems and systems over sensor networks, the complexities are inevitably enhanced in terms of their degrees or intensities because of the usage of the communication networks. Therefore, it would be interesting to (1) examine how this kind of network-enhanced complexity affects the control or filtering performance; and (2) develop some suitable approaches for controller/filter design problems. In this paper, we aim to survey some recent advances on the performance analysis and synthesis with three sorts of fashionable network-enhanced complexities, namely, fading measurements, event-triggered mechanisms, and attack behaviors of adversaries. First, these three kinds of complexities are introduced in detail according to their engineering backgrounds, dynamical characteristics, and modelling techniques. Then, the developments of the performance analysis and synthesis issues for various networked systems are systematically reviewed. Furthermore, some challenges are illustrated by using a thorough literature review and some possible future research directions are highlighted.

  12. Software Performs Complex Design Analysis

    Science.gov (United States)

    2008-01-01

    Designers use computational fluid dynamics (CFD) to gain greater understanding of the fluid flow phenomena involved in components being designed. They also use finite element analysis (FEA) as a tool to help gain greater understanding of the structural response of components to loads, stresses and strains, and the prediction of failure modes. Automated CFD and FEA engineering design has centered on shape optimization, which has been hindered by two major problems: 1) inadequate shape parameterization algorithms, and 2) inadequate algorithms for CFD and FEA grid modification. Working with software engineers at Stennis Space Center, a NASA commercial partner, Optimal Solutions Software LLC, was able to utilize its revolutionary, one-of-a-kind arbitrary shape deformation (ASD) capability, a major advancement in solving these two aforementioned problems, to optimize the shapes of complex pipe components that transport highly sensitive fluids. The ASD technology solves the problem of inadequate shape parameterization algorithms by allowing the CFD designers to freely create their own shape parameters, therefore eliminating the restriction of only being able to use the computer-aided design (CAD) parameters. The problem of inadequate algorithms for CFD grid modification is solved by the fact that the new software performs a smooth volumetric deformation. This eliminates the extremely costly process of having to remesh the grid for every shape change desired. The program can perform a design change in a markedly reduced amount of time, a process that would traditionally involve the designer returning to the CAD model to reshape and then remesh the shapes, something that has been known to take hours, days, even weeks or months, depending upon the size of the model.

  13. Performance of community health workers:situating their intermediary position within complex adaptive health systems

    OpenAIRE

    Kok, Maryse C.; Broerse, Jacqueline E.W.; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-01-01

    Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income countries. It examines their unique intermediary position between the communities they serve and actors in the health sector, and the complexity of the health systems in which they operate. The assessment...

  14. Cordilleran metamorphic core complexes and their uranium favorability. Final report

    International Nuclear Information System (INIS)

    Coney, P.J.; Reynolds, S.J.

    1980-11-01

    The objective of this report is to provide a descriptive body of knowledge on Cordilleran metamorphic core complexes including their lithologic and structural characteristics, their distribution within the Cordillera, and their evolutionary history and tectonic setting. The occurrence of uranium in the context of possibility for uranium concentration is also examined. Chapter 1 is an overview of Cordilleran metamorphic core complexes which describes their physical characteristics, tectonic setting and geologic history. This overview is accompanied by a tectonic map. Chapter 2 is a discussion of the mantled gneiss dome concept. The purpose of including this work is to provide a basic history of this concept and to describe the characteristics and distribution of gneiss domes throughout the world to enable one to compare and contrast them with the metamorphic core complexes as discussed in this report. Some gneiss domes are known producers of uranium (as are also some core complexes). Chapter 3 is an examination of the effects of the core complex process on adjacent sedimentary and volcanic cover terranes. Also included is a discussion of the kinematic significance of these cover terranes as they are related to process within the cores of the complexes. Some of the cover terranes have uranium prospects in them. Chapter 4 is a detailed discussion of uranium in Cordilleran metamorphic core complexes and includes the conceptual basis for the various types of occurrences and the processes that might favor concentration of uranium. The report is supported by a 5-part Appendix. The majority of the core complexes discussed in this report either do not appear or are not recognizable on existing published geologic maps

  15. High Performance Building Facade Solutions - PIER Final Project Report

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Eleanor; Selkowitz, Stephen

    2009-12-31

    Building facades directly influence heating and cooling loads and indirectly influence lighting loads when daylighting is considered, and are therefore a major determinant of annual energy use and peak electric demand. Facades also significantly influence occupant comfort and satisfaction, making the design optimization challenge more complex than for many other building systems. This work focused on addressing significant near-term opportunities to reduce energy use in the California commercial building stock by a) targeting voluntary, design-based opportunities derived from the use of better design guidelines and tools, and b) developing and deploying more efficient glazings, shading systems, daylighting systems, facade systems and integrated controls. This two-year project, supported by the California Energy Commission PIER program and the US Department of Energy, initiated a collaborative effort between The Lawrence Berkeley National Laboratory (LBNL) and major stakeholders in the facades industry to develop, evaluate, and accelerate market deployment of emerging, high-performance, integrated facade solutions. The LBNL Windows Testbed Facility acted as the primary catalyst and mediator on both sides of the building industry supply-user business transaction by a) aiding component suppliers to create and optimize cost effective, integrated systems that work, and b) demonstrating and verifying to the owner, designer, and specifier community that these integrated systems reliably deliver the required energy performance. An industry consortium was initiated amongst approximately seventy disparate stakeholders in an industry which, unlike the HVAC or lighting industry, has no single representative, multi-disciplinary body or organized means of communicating and collaborating. The consortium provided guidance on the project and, more importantly, began to mutually work out and agree on the goals, criteria, and pathways needed to attain the ambitious net zero energy goals defined by California and

  16. Measuring working memory in aphasia: Comparing performance on complex span and N-back tasks

    Directory of Open Access Journals (Sweden)

    Maria Ivanova

    2014-04-01

    No significant correlations were observed between performance on the complex span task and the N-back tasks. Furthermore, performance on the modified listening span was related to performance on the comprehension subtest of the QASA, while no relationship was found for the 2-back and 0-back tasks. Our results mirror studies in healthy controls that demonstrated no relationship between performance on the two tasks (Jaeggi et al., 2010; Kane et al., 2007). Thus, although N-back tasks seem similar to traditional complex span measures and may also index abilities related to cognitive processing, the evidence to date does not warrant their direct association with the construct of WM. Implications for future investigation of cognitive deficits in aphasia will be discussed.

  17. Behind the Final Grade in Hybrid v. Traditional Courses: Comparing Student Performance by Assessment Type, Core Competency, and Course Objective

    Science.gov (United States)

    Bain, Lisa Z.

    2012-01-01

    There are many different delivery methods used by institutions of higher education. These include traditional, hybrid, and online course offerings. The comparisons of these typically use final grade as the measure of student performance. This research study looks behind the final grade and compares student performance by assessment type, core…

  18. Falling with Style: Bats Perform Complex Aerial Rotations by Adjusting Wing Inertia.

    Directory of Open Access Journals (Sweden)

    Attila J Bergou

    Full Text Available The remarkable maneuverability of flying animals results from precise movements of their highly specialized wings. Bats have evolved an impressive capacity to control their flight, in large part due to their ability to modulate wing shape, area, and angle of attack through many independently controlled joints. Bat wings, however, also contain many bones and relatively large muscles, and thus the ratio of bats' wing mass to their body mass is larger than it is for all other extant flyers. Although the inertia in bat wings would typically be associated with decreased aerial maneuverability, we show that bat maneuvers challenge this notion. We use a model-based tracking algorithm to measure the wing and body kinematics of bats performing complex aerial rotations. Using a minimal model of a bat with only six degrees of kinematic freedom, we show that bats can perform body rolls by selectively retracting one wing during the flapping cycle. We also show that this maneuver does not rely on aerodynamic forces, and furthermore that a fruit fly, with nearly massless wings, would not exhibit this effect. Similar results are shown for a pitching maneuver. Finally, we combine high-resolution kinematics of wing and body movements during landing and falling maneuvers with a 52-degree-of-freedom dynamical model of a bat to show that modulation of wing inertia plays the dominant role in reorienting the bat during landing and falling maneuvers, with minimal contribution from aerodynamic forces. Bats can, therefore, use their wings as multifunctional organs, capable of sophisticated aerodynamic and inertial dynamics not previously observed in other flying animals. This may also have implications for the control of aerial robotic vehicles.

  19. Performance Prediction for Large-Scale Nuclear Waste Repositories: Final Report

    International Nuclear Information System (INIS)

    Glassley, W E; Nitao, J J; Grant, W; Boulos, T N; Gokoffski, M O; Johnson, J W; Kercher, J R; Levatin, J A; Steefel, C I

    2001-01-01

    The goal of this project was development of a software package capable of utilizing terascale computational platforms for solving subsurface flow and transport problems important for disposal of high level nuclear waste materials, as well as for DOE-complex clean-up and stewardship efforts. We sought to develop a tool that would diminish reliance on abstracted models and realistically represent the coupling between subsurface fluid flow, thermal effects and chemical reactions that both modify the physical framework of the rock materials and change the rock mineralogy and the chemistry of the migrating fluid. Providing such a capability would enhance realism in models and increase confidence in long-term predictions of performance. Achieving this goal also allows more cost-effective design and execution of monitoring programs needed to evaluate model results. This goal was successfully accomplished through the development of a new simulation tool (NUFT-C). This capability allows high resolution modeling of complex coupled thermal-hydrological-geochemical processes in the saturated and unsaturated zones of the Earth's crust. The code allows consideration of a virtually unlimited number of chemical species and minerals in a multi-phase, non-isothermal environment. Because the code is constructed to utilize the computational power of the tera-scale IBM ASCI computers, simulations that encompass large rock volumes and complex chemical systems can now be done without sacrificing spatial or temporal resolution. The code is capable of doing one-, two-, and three-dimensional simulations, allowing unprecedented evaluation of the evolution of rock properties and mineralogical and chemical change as a function of time. The code has been validated by comparing results of simulations to laboratory-scale experiments, other benchmark codes, field scale experiments, and observations in natural systems. The results of these exercises demonstrate that the physics and chemistry

  20. School Performance

    Directory of Open Access Journals (Sweden)

    Héctor A. Lamas

    2015-03-01

    Full Text Available The study of students' school performance is, due to its relevance and complexity, one of the most controversial issues in educational research, and it has received special attention in recent decades. This study is intended to present a conceptual approach to the school performance construct, contextualized in regular basic education classrooms. The construct of learning approaches is presented as one of the factors that influence the school performance of students. In addition, an overview is given of empirical research on variables considered relevant for explaining why students achieve a specific level of performance. Finally, some models and techniques allowing an appropriate study of school performance are presented.

  1. The effect of two complexity factors on the performance of emergency tasks-An experimental verification

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jung, Wondea; Jung, Kwangtae

    2008-01-01

    It is well known that the use of procedures is very important in securing the safety of process systems, since good procedures effectively guide human operators by providing 'what should be done' and 'how to do it', especially under stressful conditions. At the same time, it has been emphasized that the use of complicated procedures could drastically impair operators' performance. This means that a systematic approach that can properly evaluate the complexity of procedures is indispensable for minimizing the side effects of complicated procedures. For this reason, Park et al. have developed a task complexity measure called TACOM that can be used to quantify the complexity of tasks stipulated in emergency operating procedures (EOPs) of nuclear power plants (NPPs). The TACOM measure consists of five sub-measures that cover five important factors making the performance of emergency tasks complicated. However, the verification activity for two kinds of complexity factors, the level of abstraction hierarchy (AH) and engineering decision (ED), seems to be insufficient. In this study, therefore, an experiment is conducted using a low-fidelity simulator in order to clarify the appropriateness of these complexity factors. As a result, it appears that subjects' performance data are affected by the level of AH as well as by ED. It is therefore anticipated that both the level of AH and ED will play an important role in evaluating the complexity of EOPs.

  2. Hospitality Industry Technology Training (HITT). Final Performance Report, April 1, 1989-December 31, 1990.

    Science.gov (United States)

    Mount Hood Community Coll., Gresham, OR.

    This final performance report includes a third-party evaluation and a replication guide. The first section describes a project to develop and implement an articulated curriculum for grades 8-14 to prepare young people for entry into hospitality/tourism-related occupations. It discusses the refinement of existing models, pilot test, curriculum…

  3. Obesity-specific neural cost of maintaining gait performance under complex conditions in community-dwelling older adults.

    Science.gov (United States)

    Osofundiya, Olufunmilola; Benden, Mark E; Dowdy, Diane; Mehta, Ranjana K

    2016-06-01

    Recent evidence of obesity-related changes in the prefrontal cortex during cognitive and seated motor activities has surfaced; however, the impact of obesity on neural activity during ambulation remains unclear. The purpose of this study was to determine the obesity-specific neural cost of simple and complex ambulation in older adults. Twenty non-obese and obese individuals, 65 years and older, performed three tasks varying in the type of complexity of ambulation (simple walking, walking+cognitive dual-task, and precision walking). Maximum oxygenated hemoglobin, a measure of neural activity, was measured bilaterally using a portable functional near infrared spectroscopy system, and gait speed and performance on the complex tasks were also obtained. Complex ambulatory tasks were associated with ~2-3.5 times greater cerebral oxygenation levels and ~30-40% slower gait speeds when compared to the simple walking task. Additionally, obesity was associated with three times greater oxygenation levels, particularly during the precision gait task, despite obese adults demonstrating similar gait speeds and performances on the complex gait tasks as non-obese adults. Compared to existing studies that focus solely on biomechanical outcomes, the present study is one of the first to examine obesity-related differences in neural activity during ambulation in older adults. In order to maintain gait performance, obesity was associated with higher neural costs, and this was augmented during ambulatory tasks requiring greater precision control. These preliminary findings have clinical implications in identifying individuals who are at greater risk of mobility limitations, particularly when performing complex ambulatory tasks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. High-Speed, High-Performance DQPSK Optical Links with Reduced Complexity VDFE Equalizers

    Directory of Open Access Journals (Sweden)

    Maki Nanou

    2017-02-01

    Full Text Available Optical transmission technologies optimized for optical network segments sensitive to power consumption and cost comprise modulation formats with direct detection technologies. Specifically, non-return to zero differential quaternary phase shift keying (NRZ-DQPSK) in deployed fiber plants, combined with high-performance, low-complexity electronic equalizers to compensate residual impairments at the receiver end, can prove to be a viable solution for high-performance, high-capacity optical links. Joint processing of the constructive and the destructive signals at the single-ended DQPSK receiver provides improved performance compared to the balanced configuration, however, at the expense of higher hardware requirements, a fact that may not be neglected especially in the case of high-speed optical links. To overcome this bottleneck, the use of partially joint constructive/destructive DQPSK equalization is investigated in this paper. Symbol-by-symbol equalization is performed by means of Volterra decision feedback-type equalizers, driven by a reduced subset of signals selected from the constructive and the destructive ports of the optical detectors. The proposed approach offers a low-complexity alternative for electronic equalization, without sacrificing much of the performance compared to the fully-deployed counterpart. The efficiency of the proposed equalizers is demonstrated by means of computer simulation in a typical optical transmission scenario.

  5. Performance and evaluation of a coupled prognostic model TAPM over a mountainous complex terrain industrial area

    Science.gov (United States)

    Matthaios, Vasileios N.; Triantafyllou, Athanasios G.; Albanis, Triantafyllos A.; Sakkas, Vasileios; Garas, Stelios

    2018-05-01

    Atmospheric modeling is considered an important tool with several applications such as prediction of air pollution levels, air quality management, and environmental impact assessment studies. Therefore, evaluation studies must be performed continuously in order to improve the accuracy and the approaches of air quality models. In the present work, an attempt is made to examine the efficiency of the air pollution model TAPM in simulating the surface meteorology, as well as the SO2 concentrations, in a mountainous complex terrain industrial area. Three configurations under different circumstances, firstly with default datasets, secondly with data assimilation, and thirdly with updated land use, were run in order to investigate the surface meteorology for a 3-year period (2009-2011), and one configuration was applied to predict SO2 concentration levels for the year 2011. The modeled hourly averaged meteorological and SO2 concentration values were statistically compared with those from five monitoring stations across the domain to evaluate the model's performance. Statistical measures showed that the surface temperature and relative humidity are predicted well in all three simulations, with index of agreement (IOA) higher than 0.94 and 0.70, respectively, at all monitoring sites, while an overprediction of extreme low temperature values is noted, with mountain altitudes playing an important role. However, the results also showed that the model's performance regarding the wind depends on the configuration. The TAPM default dataset predicted the wind variables better in the center of the simulation domain than at the boundaries, while the updated land use improved the horizontal winds at the boundaries. TAPM with data assimilation predicted the wind variables fairly well across the whole domain, with IOA higher than 0.83 for the wind speed and higher than 0.85 for the horizontal wind components. Finally, the SO2 concentrations were assessed by the model with IOA varying from 0

  6. Swimming performance changes during the final 3 weeks of training leading to the Sydney 2000 Olympic Games.

    Science.gov (United States)

    Mujika, I; Padilla, S; Pyne, D

    2002-11-01

    The purpose of this study was to determine the magnitude of the swimming performance change during the final 3 weeks of training (F3T) leading to the Sydney 2000 Olympic Games. Olympic swimmers who took part in the same event or events at the Telstra 2000 Grand Prix Series in Melbourne, Australia (26-27 August 2000), and 21-28 d later at the Sydney 2000 Olympic Games (16-23 September 2000) were included in this analysis. A total of 99 performances (50 male, 49 female) were analysed. The overall performance improvement between pre- and post-F3T conditions for all swimmers was 2.18 +/- 1.50%. The pre-Olympic F3T elicited a significant performance improvement of 2.57% for male and 1.78% for female swimmers at the Sydney 2000 Olympic Games. The magnitude was similar for all competition events, and was achieved by swimmers from different countries and performance levels. These data provide a quantitative framework for coaches and swimmers to set realistic performance goals based on individual performance levels before the final training phase leading to important competitions.

  7. Wind turbine power performance verification in complex terrain and wind farms

    DEFF Research Database (Denmark)

    Friis Pedersen, Troels; Gjerding, S.; Enevoldsen, P.

    2002-01-01

    is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure of whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was made to support the basis......The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first one of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second one... then been investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark...

  8. The Effects of Differential Goal Weights on the Performance of a Complex Financial Task.

    Science.gov (United States)

    Edmister, Robert O.; Locke, Edwin A.

    1987-01-01

    Determined whether people could obtain outcomes on a complex task that would be in line with differential goal weights corresponding to different aspects of the task. Bank lending officers were run through lender-simulation exercises. Five performance goals were weighted. Demonstrated effectiveness of goal setting with complex tasks, using group…

  9. Improving performance of DS-CDMA systems using chaotic complex Bernoulli spreading codes

    Science.gov (United States)

    Farzan Sabahi, Mohammad; Dehghanfard, Ali

    2014-12-01

    The most important goal of a spread spectrum communication system is to protect communication signals against interference and against exploitation of information by unintended listeners. In fact, low probability of detection and low probability of intercept are two important parameters for increasing the performance of the system. In Direct Sequence Code Division Multiple Access (DS-CDMA) systems, these properties are achieved by multiplying the data information by spreading sequences. Chaotic sequences, with their particular properties, have numerous applications in constructing spreading codes. The use of a one-dimensional Bernoulli chaotic sequence as a spreading code has been proposed previously in the literature. The main feature of this sequence is its negative auto-correlation at a lag of 1, which, with proper design, leads to an increase in the efficiency of communication systems based on these codes. On the other hand, employing complex chaotic sequences as spreading sequences has also been discussed in several papers. In this paper, the use of two-dimensional Bernoulli chaotic sequences as spreading codes is proposed. The performance of a multi-user synchronous and asynchronous DS-CDMA system is evaluated by applying these sequences under Additive White Gaussian Noise (AWGN) and fading channels. Simulation results indicate an improvement in performance in comparison with conventional spreading codes such as Gold codes as well as with similar complex chaotic spreading sequences. Similar to one-dimensional Bernoulli chaotic sequences, the proposed sequences also have negative auto-correlation.
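
    To illustrate how a Bernoulli-type chaotic map can generate a binary spreading code and spread a BPSK symbol stream, a minimal Python sketch is given below. The record does not specify the map parameters, quantization rule, or spreading factor used in the paper, so the values here (the map gain beta, the 0.5 threshold, and the length-31 code) are assumptions.

      import numpy as np

      def bernoulli_code(x0, length, beta=1.99):
          """Iterate a Bernoulli-type map x <- beta*x mod 1 and quantize to +/-1 chips."""
          x = np.empty(length)
          x[0] = x0
          for n in range(1, length):
              x[n] = (beta * x[n - 1]) % 1.0
          return np.where(x >= 0.5, 1.0, -1.0)

      spreading_factor = 31
      code = bernoulli_code(x0=0.37, length=spreading_factor)

      # Spread a short BPSK data stream (+1/-1 symbols) with the chaotic code.
      data = np.array([1, -1, 1, 1, -1])
      tx = np.concatenate([d * code for d in data])

      # Despreading: correlate each chip block against the code and take the sign.
      rx_blocks = tx.reshape(len(data), spreading_factor)
      decisions = np.sign(rx_blocks @ code)
      print(decisions)  # recovers the original data in the noiseless case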

  10. High performance parallel computing of flows in complex geometries: I. Methods

    International Nuclear Information System (INIS)

    Gourdain, N; Gicquel, L; Montagnac, M; Vermorel, O; Staffelbach, G; Garcia, M; Boussuge, J-F; Gazaix, M; Poinsot, T

    2009-01-01

    Efficient numerical tools coupled with high-performance computers have become a key element of the design process in the fields of energy supply and transportation. However, flow phenomena that occur in complex systems such as gas turbines and aircraft are still not fully understood, mainly because of the models that are needed. In fact, most computational fluid dynamics (CFD) predictions as found today in industry focus on a reduced or simplified version of the real system (such as a periodic sector) and are usually solved with a steady-state assumption. This paper shows how to overcome such barriers and how such a new challenge can be addressed by developing flow solvers running on high-end computing platforms, using thousands of computing cores. Parallel strategies used by modern flow solvers are discussed with particular emphasis on mesh-partitioning, load balancing and communication. Two examples are used to illustrate these concepts: a multi-block structured code and an unstructured code. Parallel computing strategies used with both flow solvers are detailed and compared. This comparison indicates that mesh-partitioning and load balancing are more straightforward with unstructured grids than with multi-block structured meshes. However, the mesh-partitioning stage can be challenging for unstructured grids, mainly due to memory limitations of the newly developed massively parallel architectures. Finally, detailed investigations show that the impact of mesh-partitioning on the numerical CFD solutions, due to rounding errors and block splitting, may be of importance and should be accurately addressed before qualifying massively parallel CFD tools for routine industrial use.

  11. Generalized Combination Complex Synchronization for Fractional-Order Chaotic Complex Systems

    Directory of Open Access Journals (Sweden)

    Cuimei Jiang

    2015-07-01

    Full Text Available Based on two fractional-order chaotic complex drive systems and one fractional-order chaotic complex response system with different dimensions, we propose generalized combination complex synchronization. In this new synchronization scheme, there are two complex scaling matrices that are non-square matrices. On the basis of the stability theory of fractional-order linear systems, we design a general controller via active control. Additionally, by virtue of two complex scaling matrices, generalized combination complex synchronization between fractional-order chaotic complex systems and real systems is investigated. Finally, three typical examples are given to demonstrate the effectiveness and feasibility of the schemes.
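
    The record does not reproduce the error definition, but in combination synchronization schemes of this kind the error is typically formed from scaled drive states minus the response state. A plausible form, written in LaTeX and assuming complex, non-square scaling matrices D_1 and D_2, is sketched below; the symbols are illustrative rather than taken from the paper.

      % x_1, x_2: states of the two fractional-order complex drive systems
      % y: state of the fractional-order complex response system
      % D_1, D_2: complex (non-square) scaling matrices; u: active controller
      \[
        e \;=\; D_{1}\,x_{1} \;+\; D_{2}\,x_{2} \;-\; y ,
        \qquad
        \lim_{t\to\infty}\lVert e(t)\rVert \;=\; 0 ,
      \]
      % where the controller u in the response system is designed, via the stability
      % theory of fractional-order linear systems, so that the error dynamics decay.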

  12. Final waste forms project: Performance criteria for phase I treatability studies

    International Nuclear Information System (INIS)

    Gilliam, T.M.; Hutchins, D.A.; Chodak, P. III

    1994-06-01

    This document defines the product performance criteria to be used in Phase I of the Final Waste Forms Project. In Phase I, treatability studies will be performed to provide ''proof-of-principle'' data to establish the viability of stabilization/solidification (S/S) technologies. This information is required by March 1995. In Phase II, further treatability studies, some at the pilot scale, will be performed to provide sufficient data to allow treatment alternatives identified in Phase I to be more fully developed and evaluated, as well as to reduce performance uncertainties for those methods chosen to treat a specific waste. Three main factors influence the development and selection of an optimum waste form formulation and hence affect selection of performance criteria. These factors are regulatory, process-specific, and site-specific waste form standards or requirements. Clearly, the optimum waste form formulation will require consideration of performance criteria constraints from each of the three categories. Phase I will focus only on the regulatory criteria. These criteria may be considered the minimum criteria for an acceptable waste form. In other words, an S/S technology is considered viable only if it meets applicable regulatory criteria. The criteria to be utilized in the Phase I treatability studies were primarily taken from Environmental Protection Agency regulations addressed in 40 CFR 260 through 265 and 268, and Nuclear Regulatory Commission regulations addressed in 10 CFR 61. Thus the majority of the identified criteria are independent of waste form matrix composition (i.e., applicable to cement, glass, organic binders, etc.).

  13. Final waste forms project: Performance criteria for phase I treatability studies

    Energy Technology Data Exchange (ETDEWEB)

    Gilliam, T.M. [Oak Ridge National Lab., TN (United States); Hutchins, D.A. [Martin Marietta Energy Systems, Inc., Oak Ridge, TN (United States); Chodak, P. III [Massachusetts Institute of Technology (United States)

    1994-06-01

    This document defines the product performance criteria to be used in Phase I of the Final Waste Forms Project. In Phase I, treatability studies will be performed to provide ''proof-of-principle'' data to establish the viability of stabilization/solidification (S/S) technologies. This information is required by March 1995. In Phase II, further treatability studies, some at the pilot scale, will be performed to provide sufficient data to allow treatment alternatives identified in Phase I to be more fully developed and evaluated, as well as to reduce performance uncertainties for those methods chosen to treat a specific waste. Three main factors influence the development and selection of an optimum waste form formulation and hence affect selection of performance criteria. These factors are regulatory, process-specific, and site-specific waste form standards or requirements. Clearly, the optimum waste form formulation will require consideration of performance criteria constraints from each of the three categories. Phase I will focus only on the regulatory criteria. These criteria may be considered the minimum criteria for an acceptable waste form. In other words, an S/S technology is considered viable only if it meets applicable regulatory criteria. The criteria to be utilized in the Phase I treatability studies were primarily taken from Environmental Protection Agency regulations addressed in 40 CFR 260 through 265 and 268, and Nuclear Regulatory Commission regulations addressed in 10 CFR 61. Thus the majority of the identified criteria are independent of waste form matrix composition (i.e., applicable to cement, glass, organic binders, etc.).

  14. Cognitive function predicts listening effort performance during complex tasks in normally aging adults

    Directory of Open Access Journals (Sweden)

    Jennine Harvey

    2017-01-01

    Full Text Available Purpose: This study examines whether cognitive function, as measured by the subtests of the Woodcock–Johnson III (WCJ-III) assessment, predicts listening-effort performance during dual tasks across adults of varying ages. Materials and Methods: Participants were divided into two groups. Group 1 consisted of 14 listeners (number of females = 11) who were 41–61 years old [mean = 53.18; standard deviation (SD) = 5.97]. Group 2 consisted of 15 listeners (number of females = 9) who were 63–81 years old (mean = 72.07; SD = 5.11). Participants were administered the WCJ-III Memory for Words, Auditory Working Memory, Visual Matching, and Decision Speed subtests. All participants were tested in each of the following three dual-task experimental conditions, which varied in complexity: (1) auditory word recognition + visual processing, (2) auditory working memory (word) + visual processing, and (3) auditory working memory (sentence) + visual processing in noise. Results: A repeated measures analysis of variance revealed that task complexity significantly affected the performance measures of auditory accuracy, visual accuracy, and processing speed. Linear regression revealed that the cognitive subtests of the WCJ-III test significantly predicted performance across dependent variable measures. Conclusion: Listening effort is significantly affected by task complexity, regardless of age. Performance on the WCJ-III test may predict listening effort in adults and may assist speech-language pathologists (SLPs) in understanding challenges faced by participants when subjected to noise.

  15. Complex Analysis of Financial State and Performance of Construction Enterprises

    Directory of Open Access Journals (Sweden)

    Algirdas Krivka

    2015-12-01

    Full Text Available The paper analyses the financial state and performance of large construction enterprises by applying financial indicators. As there is no single decisive financial indicator that enables an objective assessment of enterprise performance, multi-criteria decision making (MCDM) methods are applied, with four groups of financial ratios (profitability, liquidity, solvency, and asset turnover) acting as evaluation criteria; the alternatives assessed are two enterprises compared throughout a three-year reference period, as well as against the average indicator values of the whole construction sector. The weights of the criteria were estimated by involving competent experts, with the chi-square test employed to check the degree of agreement of the expert estimates. The research methodology contributes to the issue of complex evaluation of enterprise financial state and performance, while the result of the multi-criteria assessment – the ranking of enterprises and the sector average with respect to financial state and performance – should be of interest to business owners, potential investors, customers, and other stakeholders.
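
    The abstract does not name the specific MCDM method applied, so the sketch below illustrates the general idea with a simple weighted-sum ranking over normalized criteria; the enterprises, ratio values, weights, and the benefit/cost labelling of each criterion are hypothetical placeholders, not data from the study.

        import numpy as np

        # Hypothetical decision matrix: rows = alternatives (two enterprises and the
        # sector average), columns = criteria (one ratio per group for brevity).
        alternatives = ["Enterprise A", "Enterprise B", "Sector average"]
        criteria = ["profitability", "liquidity", "solvency", "asset turnover"]
        X = np.array([
            [0.08, 1.6, 0.45, 1.1],
            [0.05, 1.9, 0.60, 0.9],
            [0.06, 1.4, 0.55, 1.0],
        ])
        weights = np.array([0.35, 0.25, 0.20, 0.20])    # expert-derived weights (placeholder values)
        benefit = np.array([True, True, False, True])   # placeholder: solvency treated here as a cost-type ratio

        # Normalize each criterion to [0, 1]; invert cost-type criteria so that
        # "larger is better" holds for every column.
        norm = (X - X.min(axis=0)) / (X.max(axis=0) - X.min(axis=0))
        norm[:, ~benefit] = 1.0 - norm[:, ~benefit]

        # Weighted-sum score and ranking (higher score = better financial state).
        scores = norm @ weights
        for name, s in sorted(zip(alternatives, scores), key=lambda t: -t[1]):
            print(f"{name}: {s:.3f}")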

  16. Mathematical approaches for complexity/predictivity trade-offs in complex system models : LDRD final report.

    Energy Technology Data Exchange (ETDEWEB)

    Goldsby, Michael E.; Mayo, Jackson R.; Bhattacharyya, Arnab (Massachusetts Institute of Technology, Cambridge, MA); Armstrong, Robert C.; Vanderveen, Keith

    2008-09-01

    The goal of this research was to examine foundational methods, both computational and theoretical, that can improve the veracity of entity-based complex system models and increase confidence in their predictions for emergent behavior. The strategy was to seek insight and guidance from simplified yet realistic models, such as cellular automata and Boolean networks, whose properties can be generalized to production entity-based simulations. We have explored the usefulness of renormalization-group methods for finding reduced models of such idealized complex systems. We have prototyped representative models that are both tractable and relevant to Sandia mission applications, and quantified the effect of computational renormalization on the predictive accuracy of these models, finding good predictivity from renormalized versions of cellular automata and Boolean networks. Furthermore, we have theoretically analyzed the robustness properties of certain Boolean networks, relevant for characterizing organic behavior, and obtained precise mathematical constraints on systems that are robust to failures. In combination, our results provide important guidance for more rigorous construction of entity-based models, which currently are often devised in an ad-hoc manner. Our results can also help in designing complex systems with the goal of predictable behavior, e.g., for cybersecurity.
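
    The report itself concerns production entity-based simulations, but the basic renormalization workflow it describes can be illustrated on a toy system. The sketch below coarse-grains an elementary cellular automaton with a majority rule and checks how well a naively renormalized model tracks the coarse-grained dynamics; the rule number, block size, and agreement metric are illustrative choices, not taken from the report.

        import numpy as np

        def step(state, rule=110):
            # One synchronous update of an elementary cellular automaton (periodic boundaries).
            rule_bits = [(rule >> i) & 1 for i in range(8)]
            left, right = np.roll(state, 1), np.roll(state, -1)
            neighborhood = 4 * left + 2 * state + right       # encode each 3-cell neighborhood as 0..7
            return np.array([rule_bits[n] for n in neighborhood], dtype=np.uint8)

        def coarse_grain(state, block=3):
            # Majority-rule block renormalization: each block of cells becomes one supercell.
            usable = len(state) // block * block
            blocks = state[:usable].reshape(-1, block)
            return (blocks.sum(axis=1) > block // 2).astype(np.uint8)

        rng = np.random.default_rng(0)
        fine = rng.integers(0, 2, size=243, dtype=np.uint8)

        # "True" coarse dynamics: evolve the fine system for `block` steps, then coarse-grain.
        fine_t = fine
        for _ in range(3):
            fine_t = step(fine_t)
        coarse_of_fine = coarse_grain(fine_t)

        # Naive reduced model: apply the same rule directly on the coarse lattice for one step.
        coarse_model = step(coarse_grain(fine))

        agreement = np.mean(coarse_of_fine == coarse_model)
        print(f"reduced-model agreement with coarse-grained dynamics: {agreement:.2f}")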

  17. Performance of community health workers : situating their intermediary position within complex adaptive health systems

    NARCIS (Netherlands)

    Kok, Maryse C; Broerse, Jacqueline E W; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-01-01

    Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income countries.

  18. Performance and Complexity Co-evaluation of the Advanced Video Coding Standard for Cost-Effective Multimedia Communications

    Directory of Open Access Journals (Sweden)

    Saponara Sergio

    2004-01-01

    Full Text Available The advanced video codec (AVC) standard, recently defined by a joint video team (JVT) of ITU-T and ISO/IEC, is introduced in this paper together with its performance and complexity co-evaluation. While the basic framework is similar to the motion-compensated hybrid scheme of previous video coding standards, additional tools improve the compression efficiency at the expense of an increased implementation cost. As a first step to bridge the gap between the algorithmic design of a complex multimedia system and its cost-effective realization, a high-level co-evaluation approach is proposed and applied to a real-life AVC design. An exhaustive analysis of the codec compression efficiency versus complexity (memory and computational costs) project space is carried out at the early algorithmic design phase. If all new coding features are used, the improved AVC compression efficiency (up to 50% compared to current video coding technology) comes with a complexity increase of a factor of 2 for the decoder and of more than one order of magnitude for the encoder. This represents a challenge for resource-constrained multimedia systems such as wireless devices or high-volume consumer electronics. The analysis also highlights important properties of the AVC framework allowing for complexity reduction at the high system level: when combining the new coding features, the implementation complexity accumulates, while the global compression efficiency saturates. Thus, a proper use of the AVC tools maintains the same performance as the most complex configuration while considerably reducing complexity. The reported results provide inputs to assist the profile definition in the standard, highlight the AVC bottlenecks, and select optimal trade-offs between algorithmic performance and complexity.

  19. Effect of action verbs on the performance of a complex movement.

    Directory of Open Access Journals (Sweden)

    Tahar Rabahi

    Full Text Available The interaction between language and motor action has been approached by studying the effect of action verbs, kinaesthetic imagery, and mental subtraction upon the performance of a complex movement, the squat vertical jump (SVJ). The time of flight gave the value of the height of the SVJ and was measured with Optojump® and Myotest® apparatuses. The results obtained with the cognitive stimuli showed a statistically significant improvement of SVJ performance after either loudly or silently pronouncing, hearing, or reading the verb saute (jump in French). Action verbs specific to other motor actions (pince = pinch, lèche = lick) or non-specific (bouge = move) showed little or no effect. A verb meaningless to the French subjects (tiáo = jump in Chinese) showed no effect, as did rêve (dream), tombe (fall), and stop. The verb gagne (win) significantly improved the SVJ height, as did its antonym perds (lose), suggesting a possible influence of affect on the subjects' performance. The effect of the specific action verb jump upon the SVJ heights was similar to that obtained after kinaesthetic imagery and after mental subtraction of two-digit numbers from three-digit ones; possibly, in the latter case, because of the involvement of language in calculation. It appears that the specific action verb jump did enhance SVJ performance, although not exclusively. The results imply an interaction between language and motor brain areas in the performance of a complex movement, with a clear specificity of the corresponding action verb. The effect upon performance is probably also influenced by the subjects' intention, increased attention, and emotion produced by cognitive stimuli, among which are action verbs.

  20. Managing airlines: the cost of complexity

    Energy Technology Data Exchange (ETDEWEB)

    Trapote-Barreira, C.; Deutschmann, A.; Robuste, F.

    2016-07-01

    This paper addresses the structure of airline networks as a sink of efficiency in airline operations. Parameters of complexity were derived and mirrored against level-of-service and efficiency parameters. Airlines usually consider an operational overhead when predicting the total flight operation cost. This parameter includes the expected cost of disruptions and delays. When an airline has to mobilize an aircraft at a base to recover the service or to break an emergent dynamic, it incurs extra costs. The cost of managing complexity in the airline industry has a direct impact on the profit and loss account. Therefore, this paper presents an integrated approach to evaluate this cost, based on padding and aircraft dedicated to recovering from disruptions. Finally, some additional indicators are derived to evaluate reliability improvement as part of complex performance. (Author)

  1. Study of the layout plan in the tokamak complex building for ITER

    International Nuclear Information System (INIS)

    Sato, Kazuyoshi; Yagenji, Akira; Sekiya, Shigeki; Takahashi, Hideo; Tamura, Kousaku; Neyatani, Yuzuru; Hashimoto, Masayoshi; Ogino, Shunji; Nagamatsu, Nobuhide; Motohashi, Keiichi; Uehara, Masaharu; Kataoka, Takahiro; Ohashi, Hironori

    2006-03-01

    This report summarizes the study of the layout plan for the ITER Tokamak complex building in support of the invitation to site the plant in Japan. To draw up this arrangement plan, the final design report (FDR), which covered the main components and was prepared as a non-site-specific design, was systematically reworked for the Japanese site. A supplementary design was performed for the parts of the FDR that were insufficient. An additional study was also performed on the adaptation of the regulatory framework, including technical safety requirements in Japan. We proposed a tokamak complex building with seismic isolation, combined with the hot cell building. Through these studies, a layout plan has been constructed, including a maintenance plan for personnel access and component routes within the building from assembly to the operation period. This layout plan would be used as a basis during the construction period, although the final decision will be made by the ITER Organization. (author)

  2. Simulating the Daylight Performance of Complex Fenestration Systems Using Bidirectional Scattering Distribution Functions within Radiance

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Gregory; Mistrick, Ph.D., Richard; Lee, Eleanor; McNeil, Andrew; Jonsson, Ph.D., Jacob

    2011-01-21

    We describe two methods which rely on bidirectional scattering distribution functions (BSDFs) to model the daylighting performance of complex fenestration systems (CFS), enabling greater flexibility and accuracy in evaluating arbitrary assemblies of glazing, shading, and other optically-complex coplanar window systems. Two tools within Radiance enable a) efficient annual performance evaluations of CFS, and b) accurate renderings of CFS despite the loss of spatial resolution associated with low-resolution BSDF datasets for inhomogeneous systems. Validation, accuracy, and limitations of the methods are discussed.

  3. Final report: A Broad Research Project in the Sciences of Complexity; FINAL

    International Nuclear Information System (INIS)

    None

    2000-01-01

    Previous DOE support for "A Broad Research Program in the Sciences of Complexity" permitted the Santa Fe Institute to initiate new collaborative research within its Integrative Core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of general principles of complex systems. The critical aspect of this support is its effectiveness in seeding new areas of research. Indeed, this Integrative Core has been the birthplace of dozens of projects that later became more specifically focused and then won direct grant support independent of the core grants. At early stages, however, most of this multidisciplinary research was unable to win grant support as individual projects, both because it did not match well with existing grant program guidelines and because the amount of funding needed was often too modest to justify a formal proposal to an agency. In fact, one of the attributes of core support has been that it permitted SFI to encourage high-risk activities because the cost was quite low. What is significant is how many of those initial efforts have been productive in the SFI environment. Many of SFI's current research foci began with a short visit from a researcher new to the SFI community, or as small working groups that brought together carefully selected experts from a variety of fields. As mentioned above, many of the ensuing research projects are now being supported by other funding agencies or private foundations. Some of these successes are described.

  4. A review of human factors challenges of complex adaptive systems: discovering and understanding chaos in human performance.

    Science.gov (United States)

    Karwowski, Waldemar

    2012-12-01

    In this paper, the author explores the need for a greater understanding of the true nature of human-system interactions from the perspective of the theory of complex adaptive systems, including the essence of complexity, emergent properties of system behavior, nonlinear systems dynamics, and deterministic chaos. Human performance, more often than not, constitutes complex adaptive phenomena with emergent properties that exhibit nonlinear dynamical (chaotic) behaviors. The complexity challenges in the design and management of contemporary work systems, including service systems, are explored. Examples of selected applications of the concepts of nonlinear dynamics to the study of human physical performance are provided. Understanding and application of the concepts of the theory of complex adaptive and dynamical systems should significantly improve the effectiveness of human-centered design efforts for a large system of systems. The performance of many contemporary work systems and environments may be sensitive to the initial conditions and may exhibit dynamic nonlinear properties and chaotic system behaviors. Human-centered design of emergent human-system interactions requires application of the theories of nonlinear dynamics and complex adaptive systems. The success of future human-systems integration efforts requires the fusion of paradigms, knowledge, design principles, and methodologies of human factors and ergonomics with those of the science of complex adaptive systems as well as modern systems engineering.
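
    The deterministic chaos and sensitivity to initial conditions mentioned above can be illustrated with the classic logistic map; the sketch below is a generic textbook example, not a model from the paper.

        # Logistic map x_{n+1} = r * x_n * (1 - x_n) in its chaotic regime (r = 4.0).
        r = 4.0
        x_a, x_b = 0.200000, 0.200001          # two almost identical initial conditions

        for n in range(1, 41):
            x_a = r * x_a * (1.0 - x_a)
            x_b = r * x_b * (1.0 - x_b)
            if n % 10 == 0:
                # The trajectories separate rapidly despite fully deterministic
                # dynamics: a hallmark of chaotic system behavior.
                print(f"n={n:2d}  |x_a - x_b| = {abs(x_a - x_b):.6f}")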

  5. Semantic Segmentation of Real-time Sensor Data Stream for Complex Activity Recognition

    OpenAIRE

    Triboan, Darpan; Chen, Liming; Chen, Feng; Wang, Zumin

    2016-01-01

    Data segmentation plays a critical role in performing human activity recognition (HAR) in ambient assisted living (AAL) systems. It is particularly important for complex activity recognition when the events occur in short bursts with attributes of multiple sub-tasks. Althou...

  6. ASYMMETRIC PRICE TRANSMISSION MODELING: THE IMPORTANCE OF MODEL COMPLEXITY AND THE PERFORMANCE OF THE SELECTION CRITERIA

    Directory of Open Access Journals (Sweden)

    Henry de-Graft Acquah

    2013-01-01

    Full Text Available Information criteria provide an attractive basis for selecting the best model from a set of competing asymmetric price transmission models or theories. However, little is understood about the sensitivity of the model selection methods to model complexity. This study therefore fits competing asymmetric price transmission models that differ in complexity to simulated data and evaluates the ability of the model selection methods to recover the true model. The results of Monte Carlo experimentation suggest that, in general, BIC, CAIC, and DIC were superior to AIC when the true data-generating process was the standard error correction model, whereas AIC was more successful when the true model was the complex error correction model. It is also shown that the model selection methods performed better in large samples for a complex asymmetric data-generating process than for a standard asymmetric data-generating process. Except for complex models, AIC's performance did not make substantial gains in recovery rates as sample size increased. The research findings demonstrate the influence of model complexity in asymmetric price transmission model comparison and selection.
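
    For reference, the criteria compared in the study have the standard closed forms AIC = 2k - 2 ln L and BIC = k ln(n) - 2 ln L, where k is the number of parameters, n the sample size, and L the maximized likelihood. The minimal sketch below scores two hypothetical fitted models with these formulas; the log-likelihood values and parameter counts are placeholders, not results from the paper.

        import math

        def aic(log_lik, k):
            # Akaike information criterion: 2k - 2 ln L (lower is better).
            return 2 * k - 2 * log_lik

        def bic(log_lik, k, n):
            # Bayesian information criterion: k ln n - 2 ln L (lower is better).
            return k * math.log(n) - 2 * log_lik

        # Hypothetical fits: a standard ECM with 4 parameters and a more complex
        # asymmetric ECM with 7 parameters, both estimated on n = 200 observations.
        n = 200
        candidates = {"standard ECM": (-310.4, 4), "complex asymmetric ECM": (-305.9, 7)}

        for name, (ll, k) in candidates.items():
            print(f"{name}: AIC = {aic(ll, k):.1f}, BIC = {bic(ll, k, n):.1f}")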

  7. Sewage Treatment Plants: Standards of Performance for New Stationary Sources 1977 Final Rule (42 FR 58520)

    Science.gov (United States)

    This document includes a copy of the Federal Register publication of the November 10, 1977 Final Rule for the Standards of Performance for New Stationary Sources for 40 CFR 60 Subparts O. This document is provided courtesy of HeinOnline.

  8. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    Science.gov (United States)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  9. Circadian Effects on Simple Components of Complex Task Performance

    Science.gov (United States)

    Clegg, Benjamin A.; Wickens, Christopher D.; Vieane, Alex Z.; Gutzwiller, Robert S.; Sebok, Angelia L.

    2015-01-01

    The goal of this study was to advance understanding and prediction of the impact of circadian rhythm on aspects of complex task performance during unexpected automation failures, and subsequent fault management. Participants trained on two tasks: a process control simulation, featuring automated support; and a multi-tasking platform. Participants then completed one task in a very early morning (circadian night) session, and the other during a late afternoon (circadian day) session. Small effects of time of day were seen on simple components of task performance, but impacts on more demanding components, such as those that occur following an automation failure, were muted relative to previous studies where circadian rhythm was compounded with sleep deprivation and fatigue. Circadian low participants engaged in compensatory strategies, rather than passively monitoring the automation. The findings and implications are discussed in the context of a model that includes the effects of sleep and fatigue factors.

  10. High performance MEAs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2012-07-15

    The aim of the present project is, through modeling, material, and process development, to obtain significantly better MEA performance and to attain the technology necessary to fabricate stable catalyst materials, thereby providing a viable alternative to the current industry standard. This project primarily focused on the development and characterization of novel catalyst materials for use in high temperature (HT) and low temperature (LT) proton-exchange membrane fuel cells (PEMFC). New catalysts are needed in order to improve fuel cell performance and reduce the cost of fuel cell systems. Additional tasks were the development of new, durable sealing materials to be used in PEMFC as well as the computational modeling of heat and mass transfer processes, predominantly in LT PEMFC, in order to improve fundamental understanding of the multi-phase flow issues and liquid water management in fuel cells. An improved fundamental understanding of these processes will lead to improved fuel cell performance and hence will also result in a reduced catalyst loading to achieve the same performance. The consortium has obtained significant research results and progress for new catalyst materials and substrates with promising enhanced performance, fabricating the materials using novel methods. However, the new materials and synthesis methods explored are still in the early research and development phase. The project has contributed to improved MEA performance using less precious metal, demonstrated for LT-PEM, DMFC, and HT-PEM applications. The novel approach and the progress of the modelling activities have been extremely satisfactory, with numerous conference and journal publications along with two potential inventions concerning the catalyst layer. (LN)

  11. Economic Complexity and Human Development: DEA performance measurement in Asia and Latin America

    OpenAIRE

    Ferraz, Diogo; Moralles, Hérick Fernando; Suarez Campoli, Jéssica; Ribeiro de Oliveira, Fabíola Cristina; do Nascimento Rebelatto, Daisy Aparecida

    2018-01-01

    Economic growth is not the only factor that explains human development. For this reason, many authors have prioritized studies that measure the Human Development Index. However, these indices do not analyze how Economic Complexity can increase Human Development. The aim of this paper is to determine the efficiency of a set of nations from Latin America and Asia, measuring each country's performance in converting Economic Complexity into Human Development between 2010 and 2014. The method used was Data Envelopment Analysis (DEA).
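
    Data Envelopment Analysis reduces to solving one small linear program per decision-making unit (here, per country). The sketch below implements an input-oriented CCR model in multiplier form with scipy, purely as an illustration of the technique; the single input (an economic complexity score), the single output (a human development score), and their values are hypothetical placeholders, not the study's data or its exact DEA specification.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical data: one input and one output per country.
        X = np.array([[1.2], [0.8], [1.5], [1.0]])      # inputs, shape (n_dmu, n_in)
        Y = np.array([[0.78], [0.70], [0.82], [0.75]])  # outputs, shape (n_dmu, n_out)
        n_dmu, n_in = X.shape
        n_out = Y.shape[1]

        def ccr_efficiency(o):
            # Input-oriented CCR efficiency of DMU o (multiplier form):
            # maximize u.y_o subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for all j.
            c = np.concatenate([-Y[o], np.zeros(n_in)])            # linprog minimizes, so negate u.y_o
            A_ub = np.hstack([Y, -X])                              # u.y_j - v.x_j <= 0 for every DMU j
            b_ub = np.zeros(n_dmu)
            A_eq = np.concatenate([np.zeros(n_out), X[o]]).reshape(1, -1)   # v.x_o = 1
            b_eq = np.array([1.0])
            res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                          bounds=[(0, None)] * (n_out + n_in), method="highs")
            return -res.fun                                        # efficiency score in (0, 1]

        for o in range(n_dmu):
            print(f"country {o}: efficiency = {ccr_efficiency(o):.3f}")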

  12. Reduced-Complexity Deterministic Annealing for Vector Quantizer Design

    Directory of Open Access Journals (Sweden)

    Ortega Antonio

    2005-01-01

    Full Text Available This paper presents a reduced-complexity deterministic annealing (DA) approach for vector quantizer (VQ) design by using soft information processing with simplified assignment measures. Low-complexity distributions are designed to mimic the Gibbs distribution, where the latter is the optimal distribution used in the standard DA method. These low-complexity distributions are simple enough to facilitate fast computation, but at the same time they can closely approximate the Gibbs distribution to result in near-optimal performance. We have also derived the theoretical performance loss at a given system entropy due to using the simple soft measures instead of the optimal Gibbs measure. We use the derived result to obtain optimal annealing schedules for the simple soft measures that approximate the annealing schedule for the optimal Gibbs distribution. The proposed reduced-complexity DA algorithms have significantly improved the quality of the final codebooks compared to the generalized Lloyd algorithm and standard stochastic relaxation techniques, both with and without the pairwise nearest neighbor (PNN) codebook initialization. The proposed algorithms are able to evade local minima, and the results show that they are not sensitive to the choice of the initial codebook. Compared to the standard DA approach, the reduced-complexity DA algorithms can operate over 100 times faster with negligible performance difference. For example, for the design of a 16-dimensional vector quantizer having a rate of 0.4375 bit/sample for a Gaussian source, the standard DA algorithm achieved 3.60 dB performance in 16,483 CPU seconds, whereas the reduced-complexity DA algorithm achieved the same performance in 136 CPU seconds. Other than VQ design, the DA techniques are applicable to problems such as classification, clustering, and resource allocation.
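
    The standard DA method referenced above replaces hard nearest-neighbour assignments with Gibbs-distributed soft assignments whose temperature is gradually lowered. The minimal sketch below implements that generic idea for codebook design; it does not reproduce the paper's simplified assignment measures or its derived annealing schedules.

        import numpy as np

        def da_vq(data, codebook_size, t_init=1.0, t_min=1e-3, alpha=0.9, iters=20):
            # Deterministic-annealing vector quantizer design with Gibbs soft assignments.
            rng = np.random.default_rng(0)
            codebook = data[rng.choice(len(data), codebook_size, replace=False)].copy()
            T = t_init
            while T > t_min:
                for _ in range(iters):
                    # Squared distortion between every sample and every codevector.
                    d2 = ((data[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
                    # Gibbs (softmax) association probabilities at temperature T.
                    logits = -d2 / T
                    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
                    p = np.exp(logits)
                    p /= p.sum(axis=1, keepdims=True)
                    # Centroid update weighted by the soft assignments.
                    codebook = (p.T @ data) / p.sum(axis=0)[:, None]
                T *= alpha                                             # annealing schedule
            return codebook

        data = np.random.default_rng(1).normal(size=(2000, 2))
        print(da_vq(data, codebook_size=8))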

  13. Soft plasma electrolysis with complex ions for optimizing electrochemical performance

    Science.gov (United States)

    Kamil, Muhammad Prisla; Kaseem, Mosab; Ko, Young Gun

    2017-03-01

    Plasma electrolytic oxidation (PEO) is a promising surface treatment for light metals to tailor an oxide layer with excellent properties. However, a porous coating structure is generally exhibited due to excessive plasma discharges, restraining its performance. The present work utilized ethylenediaminetetraacetic acid (EDTA) and Cu-EDTA complexing agents as electrolyte additives that alter the plasma discharges to improve the electrochemical properties of an Al-1.1Mg alloy coated by PEO. To achieve this purpose, PEO coatings were fabricated under an alternating current in silicate electrolytes containing EDTA and Cu-EDTA. EDTA complexes were found to modify the plasma discharging behaviour during PEO, leading to a lower porosity than that obtained without additives. This was attributed to a more homogeneous electrical field throughout the PEO process, while coating growth was maintained by an excess of dissolved Al due to the EDTA complexes. When Cu-EDTA was used, the number of discharge channels in the coating layer was lower than with EDTA due to the incorporation of Cu2O and CuO altering the dielectric behaviour. Accordingly, the sample prepared in the electrolyte containing Cu-EDTA showed superior corrosion resistance to that with EDTA. The electrochemical mechanism for the excellent corrosion protection is elucidated in the context of an equivalent circuit model.

  14. Final report for the 190-D complex decontamination and decommissioning

    International Nuclear Information System (INIS)

    Thoren, S.D.

    1996-09-01

    This report documents the decontamination and decommissioning (D&D) of the 190-D complex (located on the Hanford Site in Richland, Washington). D&D of the 190-D complex included decontaminating and removing hazardous and radiologically contaminated materials; dismantling equipment, piping, and utility infrastructure; demolishing the structure; and restoring the site. The 100-D Area contains two of the nine inactive plutonium production reactors. The reactor sites are located along the south shore of the Columbia River, where the sites cover the northern part of the Hanford Site. The 190-D complex is located in the 100-D Area and is composed of the following seven buildings: 185-D De-aeration Building, 189-D Refrigeration Building, 190-D Tank Room Highbay, 190-D Process Pump Room, 190-DA Process Pump Room Annex, 195-D Vertical Safety Rod Test Tower, and 1724-D Underwater Test Facility.

  15. Analyzing complex wake-terrain interactions and its implications on wind-farm performance.

    Science.gov (United States)

    Tabib, Mandar; Rasheed, Adil; Fuchs, Franz

    2016-09-01

    Rotating wind turbine blades generate complex wakes involving vortices (helical tip vortex, root vortex, etc.). These wakes are regions of high velocity deficits and high turbulence intensities, and they tend to degrade the performance of downstream turbines. Hence, a conservative inter-turbine distance of up to 10 turbine diameters (10D) is sometimes used in wind-farm layouts (particularly in cases of flat terrain). This ensures that wake effects will not reduce the overall wind-farm performance, but it leads to a larger land footprint for establishing a wind farm. In the case of complex terrain, within a short distance (say 10D), the nearby terrain can rise in altitude and be high enough to influence the wake dynamics. This wake-terrain interaction can happen either (a) indirectly, through an interaction of the wake (both the near tip vortex and the far-wake large-scale vortex) with terrain-induced turbulence (especially the smaller eddies generated by small ridges within the terrain), or (b) directly, by the terrain obstructing the wake region partially or fully in its flow path. Hence, an enhanced understanding of wake development due to wake-terrain interaction will help in wind-farm design. To this end, the current study involves: (1) understanding the numerics for successful simulation of vortices, (2) understanding the fundamental vortex-terrain interaction mechanism through studies devoted to the interaction of a single vortex with different terrains, and (3) relating the influence of vortex-terrain interactions to the performance of a wind farm by studying a multi-turbine wind-farm layout under different terrains. The results on the interaction of terrain and vortex have shown a much faster decay of the vortex for complex terrain compared to a flatter terrain. The potential reasons identified to explain this observation are (a) the formation of secondary vortices in the flow and their interaction with the primary vortex and (b) enhanced vorticity diffusion due to increased terrain-induced turbulence. The implications of

  16. Safety KPIs - Monitoring of safety performance

    Directory of Open Access Journals (Sweden)

    Andrej Lališ

    2014-09-01

    Full Text Available This paper aims to provide a brief overview of aviation safety development, focusing on modern trends represented by the implementation of Safety Key Performance Indicators. Even though aviation is perceived as a safe means of transport, it still struggles with the complexity that results from the long-term growth and robustness it has reached today. Thus, safety issues nowadays are much more complex and harder to handle than ever before. We are more and more concerned about organizational factors and control mechanisms which have the potential to further increase the level of aviation safety. Within this paper we not only introduce the concept of Key Performance Indicators in the area of aviation safety as an efficient control mechanism, but also analyse the available legislation and documentation. Finally, we propose a comprehensive set of indicators that could be applied to the Czech Air Navigation Service Provider.

  17. Ligand effect on the performance of organic light-emitting diodes based on europium complexes

    International Nuclear Information System (INIS)

    Fang Junfeng; You Han; Gao Jia; Lu Wu; Ma Dongge

    2007-01-01

    A series of europium complexes was synthesized and their electroluminescent (EL) characteristics were studied. It was found by comparison that different substituent groups, such as methyl, chlorine, and nitryl, on the ligand 1,10-phenanthroline significantly affect the EL performance of devices based on these complexes. More methyl substituents on the 1,10-phenanthroline ligand led to higher device efficiency. A chlorine substituent showed approximately the same EL performance as two methyl substituents, whereas a nitryl substituent significantly reduced the EL luminous efficiency. However, the β-diketonate ligands TTA and DBM exhibited similar EL performance. The improved EL luminous efficiency obtained with suitable substituents on the 1,10-phenanthroline was attributed to the reduction of energy loss caused by light hydrogen atom vibration and by concentration quenching due to intermolecular interaction, as well as to the match of energy levels between the ligand and Eu3+.

  18. High-performance liquid chromatography of metal complexes of pheophytins a and b

    International Nuclear Information System (INIS)

    Brykina, G.D.; Lazareva, E.E.; Uvarova, M.I.; Shpigun, O.A.

    1997-01-01

    Cu(II), Zn(II), Pb(II), Hg(II), and Ce(IV) complexes of pheophytins a and b were synthesized. The chromatographic retention parameters of pheophytins a and b, chlorophylls a and b, and the above complexes were determined under conditions of normal-phase and reversed-phase high-performance liquid chromatography (HPLC). The adsorption of metal pheophytinates in the hexane-n-butanol (96:4)-Silasorb 600 and acetonitrile-ethanol-acetic acid (40:40:16)-Nucleosil C18 systems was studied by HPLC. Factors that affect the chromatographic and adsorption characteristics of the compounds (structural differences between pheophytinates of the a and b series, the nature of the central metal atom, and the nature of the mobile and stationary phases) are discussed. It is demonstrated that pheophytins a and b and their metal complexes can be identified and quantitatively determined by HPLC in the concentration range (0.6-44.0) x 10^-6 M.

  19. High performance ultrasonic field simulation on complex geometries

    Science.gov (United States)

    Chouh, H.; Rougeron, G.; Chatillon, S.; Iehl, J. C.; Farrugia, J. P.; Ostromoukhov, V.

    2016-02-01

    Ultrasonic field simulation is a key ingredient for the design of new testing methods as well as a crucial step for NDT inspection simulation. As presented in a previous paper [1], CEA-LIST has worked on the acceleration of these simulations focusing on simple geometries (planar interfaces, isotropic materials). In this context, significant accelerations were achieved on multicore processors and GPUs (Graphics Processing Units), bringing the execution time of realistic computations in the 0.1 s range. In this paper, we present recent works that aim at similar performances on a wider range of configurations. We adapted the physical model used by the CIVA platform to design and implement a new algorithm providing a fast ultrasonic field simulation that yields nearly interactive results for complex cases. The improvements over the CIVA pencil-tracing method include adaptive strategies for pencil subdivisions to achieve a good refinement of the sensor geometry while keeping a reasonable number of ray-tracing operations. Also, interpolation of the times of flight was used to avoid time consuming computations in the impulse response reconstruction stage. To achieve the best performance, our algorithm runs on multi-core superscalar CPUs and uses high performance specialized libraries such as Intel Embree for ray-tracing, Intel MKL for signal processing and Intel TBB for parallelization. We validated the simulation results by comparing them to the ones produced by CIVA on identical test configurations including mono-element and multiple-element transducers, homogeneous, meshed 3D CAD specimens, isotropic and anisotropic materials and wave paths that can involve several interactions with interfaces. We show performance results on complete simulations that achieve computation times in the 1s range.

  20. Mixed Waste Salt Encapsulation Using Polysiloxane - Final Report

    International Nuclear Information System (INIS)

    Miller, C.M.; Loomis, G.G.; Prewett, S.W.

    1997-01-01

    A proof-of-concept experimental study was performed to investigate the use of Orbit Technologies' polysiloxane grouting material for encapsulation of U.S. Department of Energy mixed waste salts, leading to a final waste form for disposal. Evaporator pond salt residues and other salt-like materials contaminated with both radioactive isotopes and hazardous components are ubiquitous in the DOE complex and may exceed 250,000,000 kg of material. Current treatment involves mixing low waste percentages (less than 10% salt by mass) with cement, or costly thermal treatment followed by cementation of the ash residue. The proposed technology involves simple mixing of the granular salt material (with relatively high waste loadings, greater than 50%) into a polysiloxane-based system that polymerizes to form a silicon-based polymer material. This study involved a mixing study to determine optimum waste loadings and compressive strengths of the resultant monoliths. Following the mixing study, durability testing was performed on promising waste forms. Leaching studies, including the accelerated leach test and the toxicity characteristic leaching procedure, were also performed on a high-nitrate salt waste form. In addition to this testing, the waste form was examined by scanning electron microscopy. Preliminary cost estimates for applying this technology to the DOE complex mixed waste salt problem are also given.

  1. 42 CFR 493.1467 - Condition: Laboratories performing high complexity testing; cytology general supervisor.

    Science.gov (United States)

    2010-10-01

    42 CFR 493.1467 Condition: Laboratories performing high complexity testing; cytology general supervisor. For the subspecialty of cytology, the laboratory must have a general supervisor who meets the qualification...

  2. Influence of step complexity and presentation style on step performance of computerized emergency operating procedures

    Energy Technology Data Exchange (ETDEWEB)

    Xu Song [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China); Li Zhizhong [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China)], E-mail: zzli@tsinghua.edu.cn; Song Fei; Luo Wei; Zhao Qianyi; Salvendy, Gavriel [Department of Industrial Engineering, Tsinghua University, Beijing 100084 (China)

    2009-02-15

    With the development of information technology, computerized emergency operating procedures (EOPs) are taking the place of paper-based ones. However, the ergonomics issues of computerized EOPs have not been studied adequately, since industrial practice is still quite limited. This study examined the influence of step complexity and presentation style of EOPs on step performance. A simulated computerized EOP system was developed in two presentation styles: Style A, a combination of one- and two-dimensional flowcharts; Style B, a combination of a two-dimensional flowchart and a success logic tree. Step complexity was quantified by a complexity measure model based on an entropy concept. Forty subjects participated in the experiment of EOP execution using the simulated system. The analysis of the experimental data indicates that step complexity and presentation style can significantly influence step performance (both step error rate and operation time). Regression models were also developed. The regression analysis results imply that the operation time of a step can be well predicted by step complexity, while the step error rate can be only partly predicted by it. The results of a questionnaire investigation imply that step error rate was influenced not only by the operation task itself but also by other human factors. These findings may be useful for the design and assessment of computerized EOPs.
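
    The paper's entropy-based complexity measure is not spelled out in the abstract; as a generic illustration of the underlying idea, the sketch below scores a procedure step by the Shannon entropy of the distribution of its elementary action types. The step contents and action categories are hypothetical, not taken from the study.

        import math
        from collections import Counter

        def step_entropy(actions):
            # Shannon entropy (bits) of the distribution of action types within a step.
            counts = Counter(actions)
            total = sum(counts.values())
            return -sum((c / total) * math.log2(c / total) for c in counts.values())

        # Hypothetical procedure steps, each listed as a sequence of elementary action types.
        simple_step = ["check", "check", "check", "record"]
        complex_step = ["check", "compare", "calculate", "select", "operate", "record"]

        print(f"simple step entropy:  {step_entropy(simple_step):.2f} bits")
        print(f"complex step entropy: {step_entropy(complex_step):.2f} bits")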

  3. Ethical and legal issues arising from complex genetic disorders. DOE final report

    Energy Technology Data Exchange (ETDEWEB)

    Andrews, Lori

    2002-10-09

    The project analyzed the challenges raised by complex genetic disorders for genetic counselling, clinical practice, public health, quality assurance, and protection against discrimination. The research found that, in some settings, solutions created in the context of single-gene disorders are more difficult to apply to complex disorders. In other settings, the single-gene solutions actually backfired and created additional problems when applied to complex genetic disorders. The literature on five common, complex genetic disorders (Alzheimer's disease, asthma, coronary heart disease, diabetes, and psychiatric illnesses) was evaluated in depth.

  4. Towards the final MRPC design. Performance test with heavy ion beam

    Energy Technology Data Exchange (ETDEWEB)

    Deppner, Ingo; Herrmann, Norbert [Physikalisches Institut Uni. Heidelberg, Heidelberg (Germany)

    2015-07-01

    The Compressed Baryonic Matter spectrometer (CBM) is a future heavy ion experiment located at the Facility for Antiproton and Ion Research (FAIR) in Darmstadt, Germany. The key element in CBM providing hadron identification at incident energies between 2 and 35 AGeV will be a 120 m^2 Time-of-Flight (ToF) wall composed of Multi-gap Resistive Plate Chambers (MRPC) with a system time resolution better than 80 ps. Aiming for an interaction rate of 10 MHz for Au+Au collisions, the MRPCs have to cope with an incident particle flux between 0.1 kHz/cm^2 and 25 kHz/cm^2 depending on their location. Characterized by granularity and rate capability, the current conceptual design of the ToF wall foresees four different counter types, called MRPC1 - MRPC4. In order to finalize the MRPC design of these counters, a heavy ion test beam time was performed at GSI. In this contribution we present performance test results for two different MRPC3 full-size prototypes developed at Heidelberg University and Tsinghua University, Beijing.

  5. Predictive modelling of complex agronomic and biological systems.

    Science.gov (United States)

    Keurentjes, Joost J B; Molenaar, Jaap; Zwaan, Bas J

    2013-09-01

    Biological systems are tremendously complex in their functioning and regulation. Studying the multifaceted behaviour and describing the performance of such complexity has challenged the scientific community for years. The reduction of real-world intricacy into simple descriptive models has therefore convinced many researchers of the usefulness of introducing mathematics into biological sciences. Predictive modelling takes such an approach another step further in that it takes advantage of existing knowledge to project the performance of a system in alternating scenarios. The ever growing amounts of available data generated by assessing biological systems at increasingly higher detail provide unique opportunities for future modelling and experiment design. Here we aim to provide an overview of the progress made in modelling over time and the currently prevalent approaches for iterative modelling cycles in modern biology. We will further argue for the importance of versatility in modelling approaches, including parameter estimation, model reduction and network reconstruction. Finally, we will discuss the difficulties in overcoming the mathematical interpretation of in vivo complexity and address some of the future challenges lying ahead. © 2013 John Wiley & Sons Ltd.
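
    Parameter estimation, one of the modelling ingredients named above, typically means fitting a mechanistic model to observations. The minimal sketch below fits a logistic growth curve to noisy synthetic data with scipy, purely as a generic illustration rather than an example from the review.

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, r, K, n0):
            # Logistic growth: population at time t given rate r, capacity K, initial size n0.
            return K / (1.0 + ((K - n0) / n0) * np.exp(-r * t))

        # Synthetic "observations": a known parameter set plus measurement noise.
        rng = np.random.default_rng(42)
        t_obs = np.linspace(0, 10, 25)
        y_obs = logistic(t_obs, 0.9, 100.0, 5.0) + rng.normal(0, 3.0, t_obs.size)

        # Estimate the parameters back from the noisy data.
        popt, pcov = curve_fit(logistic, t_obs, y_obs, p0=[0.5, 80.0, 1.0])
        perr = np.sqrt(np.diag(pcov))
        for name, est, err in zip(["r", "K", "n0"], popt, perr):
            print(f"{name} = {est:.2f} +/- {err:.2f}")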

  6. Wind turbine power performance verification in complex terrain and wind farms

    Energy Technology Data Exchange (ETDEWEB)

    Friis Pedersen, T.; Gjerding, S.; Ingham, P.; Enevoldsen, P.; Kjaer Hansen, J.; Kanstrup Joergensen, H.

    2002-04-01

    The IEC/EN 61400-12 Ed 1 standard for wind turbine power performance testing is being revised. The standard will be divided into four documents. The first of these is more or less a revision of the existing document on power performance measurements on individual wind turbines. The second is a power performance verification procedure for individual wind turbines. The third is a power performance measurement procedure for whole wind farms, and the fourth is a power performance measurement procedure for non-grid (small) wind turbines. This report presents work that was carried out to support the basis for this standardisation work. The work drew on experience from several national and international research projects as well as contractual and field experience gained within the wind energy community on this matter. The work was wide-ranging and addressed 'grey' areas of knowledge regarding existing methodologies, which were then investigated in more detail. The work has given rise to a range of conclusions and recommendations regarding: guarantees on power curves in complex terrain; investors' and bankers' experience with verification of power curves; power performance in relation to regional correction curves for Denmark; and anemometry and the influence of inclined flow. (au)

  7. Managing project complexity : A study into adapting early project phases to improve project performance in large engineering projects

    NARCIS (Netherlands)

    Bosch-Rekveldt, M.G.C.

    2011-01-01

    Engineering projects are becoming increasingly complex, and project complexity is assumed to be one of the causes of projects being delivered late and over budget. However, what this project complexity actually comprises was unclear. To improve overall project performance, this study focuses

  8. Radioactive Waste Management Complex low-level waste radiological performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Maheras, S.J.; Rood, A.S.; Magnuson, S.O.; Sussman, M.E.; Bhatt, R.N.

    1994-04-01

    This report documents the projected radiological dose impacts associated with the disposal of radioactive low-level waste at the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. This radiological performance assessment was conducted to evaluate compliance with applicable radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the public and the environment. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the public via air, groundwater, and food chain pathways. Projections of doses were made for both offsite receptors and individuals inadvertently intruding onto the site after closure. In addition, uncertainty and sensitivity analyses were performed. The results of the analyses indicate compliance with established radiological criteria and provide reasonable assurance that public health and safety will be protected.

  9. Radioactive Waste Management Complex low-level waste radiological performance assessment

    International Nuclear Information System (INIS)

    Maheras, S.J.; Rood, A.S.; Magnuson, S.O.; Sussman, M.E.; Bhatt, R.N.

    1994-04-01

    This report documents the projected radiological dose impacts associated with the disposal of radioactive low-level waste at the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. This radiological performance assessment was conducted to evaluate compliance with applicable radiological criteria of the US Department of Energy and the US Environmental Protection Agency for protection of the public and the environment. The calculations involved modeling the transport of radionuclides from buried waste, to surface soil and subsurface media, and eventually to members of the public via air, groundwater, and food chain pathways. Projections of doses were made for both offsite receptors and individuals inadvertently intruding onto the site after closure. In addition, uncertainty and sensitivity analyses were performed. The results of the analyses indicate compliance with established radiological criteria and provide reasonable assurance that public health and safety will be protected

  10. Macrocyclic ligands for uranium complexation. Final report, August 1, 1986--March 31, 1993

    International Nuclear Information System (INIS)

    Potts, K.T.

    1993-01-01

    Macrocycles, designed by computer modeling studies for complexation of the uranyl ion and utilizing six ligating atoms in the equatorial plane of the uranyl ion, have been prepared and their complexation of the uranyl ion evaluated. The ligating atoms, either oxygen or sulfur, were part of acylurea, biuret, or thiobiuret subunits, with alkane chains or pyridine units completing the macrocyclic periphery. These macrocycles, with only partial preorganization, formed uranyl complexes in solution, but no crystalline complexes were isolated. Refinement of the cavity diameter by variation of the peripheral functional groups is currently being studied to achieve an optimized cavity diameter of 4.7-5.2 angstroms. Acyclic ligands containing the same ligating atoms in equivalent functional entities were found to form a crystalline 1:1 uranyl-ligand complex (stability constant log K = 10.7) whose structure was established by X-ray data. This complex underwent a facile, DMSO-induced rearrangement to a 2:1 uranyl-ligand complex whose structure was also established by X-ray data. The intermediates to the macrocycles all behaved as excellent ligands for the complexation of transition metals. Acylthiourea complexes of copper and nickel, as well as intermolecular, binuclear copper and nickel complexes of bidentate carbonyl thioureas, formed readily, and their structures were established in several representative instances by X-ray structural determinations. Tetradentate bis(carbonylthioureas) were found to be very efficient, selective reagents for the complexation of copper in the presence of nickel ions. Several preorganized macrocycles were also prepared, but in most instances these macrocycles underwent ring-opening under complexation conditions.

  11. The Institute for Sustained Performance, Energy, and Resilience, University of North Carolina, Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Robert [Univ. of North Carolina, Chapel Hill, NC (United States)

    2018-01-20

    This is the final report for the UNC component of the SciDAC Institute for Sustained Performance, Energy, and Resilience (SUPER). In this report, we describe activities on the SUPER project at RENCI at the University of North Carolina at Chapel Hill. While we focus particularly on UNC, we touch on project-wide activities as well as on interactions with, and impacts on, other projects.

  12. Impact of the motion and visual complexity of the background on players' performance in video game-like displays.

    Science.gov (United States)

    Caroux, Loïc; Le Bigot, Ludovic; Vibert, Nicolas

    2013-01-01

    The visual interfaces of virtual environments such as video games often show scenes where objects are superimposed on a moving background. Three experiments were designed to better understand the impact of the complexity and/or overall motion of two types of visual backgrounds often used in video games on the detection and use of superimposed, stationary items. The impact of background complexity and motion was assessed during two typical video game tasks: a relatively complex visual search task and a classic, less demanding shooting task. Background motion impaired participants' performance only when they performed the shooting game task, and only when the simpler of the two backgrounds was used. In contrast, and independently of background motion, performance on both tasks was impaired when the complexity of the background increased. Eye movement recordings demonstrated that most of the findings reflected the impact of low-level features of the two backgrounds on gaze control.

  13. Comparative Study of the Tuning Performances of the Nominal and Long L* CLIC Final Focus System at √s = 380 GeV

    CERN Document Server

    Plassard, F; Marin, E; Tomás, R

    2017-01-01

    Mitigation of static imperfections for emittance preservation is one of the most important and challenging tasks faced by the Compact Linear Collider (CLIC) beam delivery system. A simulation campaign has been performed to recover the nominal luminosity by means of different alignment procedures. The state of the art of the tuning studies is drawn up. Comparative studies of the tuning performances and a tuning-based final focus system design optimization for two L* options are presented. The effectiveness of the tuning techniques applied to these different lattices will be decisive for the final layout of the CLIC final focus system at √s = 380 GeV.

  14. Photochemical activation and reactivity of polynuclear transition metal complex molecules. Final report

    International Nuclear Information System (INIS)

    Endicott, J.F.; Lintvedt, R.L.

    1982-06-01

    Several bi- and trinuclear metal complexes containing ligands derived from β-polyketonates have been synthesized and characterized, including homo- and hetero-polynuclear complexes. New synthetic approaches to the preparation of heterobi- and trinuclear complexes have been developed that allow the preparation of a large number of molecules containing heavy-metal ions such as Pd^2+ or UO2^2+ together with a first-row transition-metal ion. The electrochemical properties of these complexes have been investigated, and many exhibit the ability to transfer two electrons at very nearly the same potential. Photochemical studies on binuclear Cu(II) and Ni(II) complexes showed that these compounds yielded reduced metal species and decomposition upon irradiation. Luminescence of hetero-complexes of uranyl polyketonates is observed at 77 K, with the UO2^2+ moiety functioning as an isolated chromophore in which emission is observed only on direct excitation of UO2^2+; energy transfer to lower states in the molecule is not observed.

  15. Final Phase Flight Performance and Touchdown Time Assessment of TDV in RLV-TD HEX-01 Mission

    Science.gov (United States)

    Yadav, Sandeep; Jayakumar, M.; Nizin, Aziya; Kesavabrahmaji, K.; Shyam Mohan, N.

    2017-12-01

    The RLV-TD HEX-01 mission was configured as a precursor flight to the actual two-stage-to-orbit vehicle. In this mission, RLV-TD was designed as a two-stage vehicle for demonstrating the hypersonic flight of a winged-body vehicle at Mach 5. One of the main objectives of this mission was to generate data for a better understanding of the new technologies required to design the future vehicle. The RLV-TD vehicle was heavily instrumented to obtain data related to the performance of different subsystems. As per the mission design, RLV-TD was to land in the sea after a flight duration of 700 s, travelling a distance of nearly 500 km from the launch site into the Bay of Bengal for a nominal trajectory. Visibility studies for the vehicle telemetry data for the nominal and off-nominal trajectories were carried out. Based on these, three ground stations were proposed for telemetry data reception (including one at sea). Even with this scheme, it was seen that during the final phase of the flight no ground station would be visible to the vehicle due to low elevation. To obtain the mission-critical data during the final phase of the flight, a telemetry-through-INSAT scheme was introduced. At the end of the mission, RLV-TD was to land in the sea on a hypothetical runway. There was no direct measurement available to determine the exact time of touchdown in the sea. At the same time, there was every chance of losing ground station visibility just before touchdown, making it difficult to assess flight performance during that phase. In this work, the telemetry and instrumentation scheme of the RLV-TD HEX-01 mission is discussed with the objective of determining the flight performance during the final phase. Further, the touchdown time of the TDV is assessed for this mission using various flight sensor data.

  16. How students process equations in solving quantitative synthesis problems? Role of mathematical complexity in students’ mathematical performance

    Directory of Open Access Journals (Sweden)

    Bashirah Ibrahim

    2017-10-01

    Full Text Available We examine students’ mathematical performance on quantitative “synthesis problems” with varying mathematical complexity. Synthesis problems are tasks comprising multiple concepts typically taught in different chapters. Mathematical performance refers to the formulation, combination, and simplification of equations. Generally speaking, formulation and combination of equations require conceptual reasoning; simplification of equations requires manipulation of equations as computational tools. Mathematical complexity is operationally defined by the number and the type of equations to be manipulated concurrently due to the number of unknowns in each equation. We use two types of synthesis problems, namely, sequential and simultaneous tasks. Sequential synthesis tasks require a chronological application of pertinent concepts, and simultaneous synthesis tasks require a concurrent application of the pertinent concepts. A total of 179 physics major students from a second year mechanics course participated in the study. Data were collected from written tasks and individual interviews. Results show that mathematical complexity negatively influences the students’ mathematical performance on both types of synthesis problems. However, for the sequential synthesis tasks, it interferes only with the students’ simplification of equations. For the simultaneous synthesis tasks, mathematical complexity additionally impedes the students’ formulation and combination of equations. Several reasons may explain this difference, including the students’ different approaches to the two types of synthesis problems, cognitive load, and the variation of mathematical complexity within each synthesis type.
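
    To make the sequential/simultaneous distinction concrete, the sketch below solves two toy kinematics set-ups symbolically: in the first, the equations can be handled one unknown at a time; in the second, the coupled equations must be manipulated concurrently. The equations and numbers are generic placeholders, not the tasks used with the students.

        import sympy as sp

        v, t, d = sp.symbols("v t d", positive=True)

        # Sequential synthesis: each equation is solved for one unknown in turn.
        t_val = sp.solve(sp.Eq(v, 9.8 * t), t)[0].subs(v, 19.6)   # first find t from v = g*t
        d_val = 0.5 * 9.8 * t_val**2                              # then substitute into d = g*t^2/2
        print("sequential:", t_val, d_val)

        # Simultaneous synthesis: coupled equations manipulated concurrently.
        solution = sp.solve(
            [sp.Eq(d, v * t), sp.Eq(d, 0.5 * 9.8 * t**2), sp.Eq(v, 19.6)],
            [d, v, t], dict=True)
        print("simultaneous:", solution)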

  17. Collaborative crew performance in complex operational systems: L'Efficacité du travail en équipage dans des systèmes opérationnel complexes

    National Research Council Canada - National Science Library

    1999-01-01

    .... As we progress towards the next millennium, complex operations will increasingly require consideration and integration of the collaborative element wherein crew performance becomes a critical factor for success...

  18. Enhancing the Photovoltaic Performance of Perovskite Solar Cells with a Down-Conversion Eu-Complex.

    Science.gov (United States)

    Jiang, Ling; Chen, Wangchao; Zheng, Jiawei; Zhu, Liangzheng; Mo, Li'e; Li, Zhaoqian; Hu, Linhua; Hayat, Tasawar; Alsaedi, Ahmed; Zhang, Changneng; Dai, Songyuan

    2017-08-16

    Organometal halide perovskite solar cells (PSCs) have shown high photovoltaic performance but poor utilization of ultraviolet (UV) irradiation. Lanthanide complexes have a wide absorption range in the UV region and can down-convert the absorbed UV light into visible light, which provides a possibility for PSCs to utilize UV light for higher photocurrent, efficiency, and stability. In this study, we use a transparent luminescent down-converting layer (LDL) of Eu-4,7-diphenyl-1,10-phenanthroline (Eu-complex) to improve the light utilization efficiency of PSCs. Compared with the uncoated PSC, the PSC coated with the Eu-complex LDL on the reverse of the fluorine-doped tin oxide glass displayed an enhancement of 11.8% in short-circuit current density (Jsc) and 15.3% in efficiency due to the Eu-complex LDL re-emitting UV light (300-380 nm) in the visible range. It is indicated that the Eu-complex LDL plays the role of enhancing the power conversion efficiency as well as reducing UV degradation for PSCs.

  19. The Impact of Technology, Job Complexity and Religious Orientation on Managerial Performance

    Directory of Open Access Journals (Sweden)

    Jesmin Islam

    2011-12-01

Full Text Available This paper explores the impact of technology, job complexity and religious orientation on the performance of managers in the financial services industries. Data were collected from bank managers representing Islamic and conventional banks in Bangladesh. Path models were used to analyse the data. The results provide support for the hypothesis that a management accounting systems (MAS) adequacy gap exists in the financial sector in a developing country such as Bangladesh. These Islamic and conventional banks also experienced varied outcomes regarding the impact of the MAS adequacy gap on managerial effectiveness. Significant results emerged concerning the direct influence of technology and job complexity on managerial effectiveness, although these findings again differed across religious and conventional banks. Significant intervening effects of both MAS adequacy gap and job complexity on the relationships between contingency factors and managers' effectiveness were also found. Overall the findings showed that the type of religious orientation in Islamic banks wielded an important influence on the sensitivity of the MAS adequacy gap. Religious orientation served as a control device for the relationships between job-related contingency factors and managerial effectiveness.

  20. Task complexity and task, goal, and reward interdependence in group performance : a prescriptive model

    NARCIS (Netherlands)

    Vijfeijken, van H.T.G.A.; Kleingeld, P.A.M.; Tuijl, van H.F.J.M.; Algera, J.A.; Thierry, H.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  1. Final-state interactions and relativistic effects in the quasielastic (e,e') reaction

    International Nuclear Information System (INIS)

    Chinn, C.R.; Physics Division, Los Alamos National Laboratory, Los Alamos, New Mexico 87545); Picklesimer, A.; Van Orden, J.W.

    1989-01-01

The longitudinal and transverse response functions for the inclusive quasielastic (e,e') reaction are analyzed in detail. A microscopic theoretical framework for the many-body reaction provides a clear conceptual (nonrelativistic) basis for treating final-state interactions and goes far beyond simple plane-wave or Hermitean potential models. The many-body physics of inelastic final-state channels as described by optical and multiple scattering theories is properly included by incorporating a full complex optical potential. Explicit nonrelativistic and relativistic momentum-space calculations quantitatively demonstrate the importance of such a treatment of final-state interactions for both the transverse and longitudinal response. Nonrelativistic calculations are performed using final-state interactions based on phenomenology, local density models, and microscopic multiple scattering theory. Relativistic calculations span a similar range of models and employ Dirac bound-state wave functions. The theoretical extension to relativistic dynamics is of course not clear, but is done in obvious parallel to elastic proton scattering. Extensive calculations are performed for 40Ca at momentum transfers of 410, 550, and 700 MeV/c. A number of interesting physical effects are observed, including significant relativistic suppressions (especially for R_L), large off-shell and virtual pair effects, enhancement of the tails of the response by the final-state interactions, and large qualitative and even shape distinctions between the predictions of the various models of the final-state interactions. None of the models is found to be able to simultaneously predict the data for both response functions. This strongly suggests that additional physical mechanisms are of qualitative importance in inclusive quasielastic electron scattering.

  2. Post-weaning and whole-of-life performance of pigs is determined by live weight at weaning and the complexity of the diet fed after weaning

    Directory of Open Access Journals (Sweden)

    Cherie L. Collins

    2017-12-01

Full Text Available The production performance and financial outcomes associated with weaner diet complexity for pigs of different weight classes at weaning were examined in this experiment. A total of 720 weaner pigs (360 entire males and 360 females) were selected at weaning (27 ± 3 d) and allocated to pens of 10 based on individual weaning weight (light weaning weight: pigs below 6.5 kg; medium weaning weight: 6.5 to 8 kg; heavy weaning weight: above 8.5 kg). Pens were then allocated in a 3 × 2 × 2 factorial arrangement of treatments with the respective factors being weaning weight (heavy, medium and light; H, M and L, respectively), weaner diet complexity (high complexity/cost, HC; low complexity/cost, LC), and gender (male and female). Common diets were fed to both treatment groups during the final 4 weeks of the weaner period (a period of 39 days). In the first 6 d after weaning, pigs offered the HC diets gained weight faster and used feed more efficiently than those offered the LC diets (P = 0.031). Pigs fed a HC diet after weaning tended to be heavier at the sale live weight of 123 d of age compared with pigs fed the LC diet (P = 0.056). There were no other main effects of the feeding program on growth performance through to slaughter. Weaning weight had a profound influence on lifetime growth performance and weight at 123 d of age, with H pigs at weaning increasing their weight advantage over the M and L pigs (101.3, 97.1 and 89.6 kg, respectively; P < 0.001). Cost-benefit analyses suggested there was a minimal benefit in terms of cost per unit live weight gain over lifetime when L pigs were offered a HC feeding program, with a lower feed cost/kg gain. The results from this investigation confirm the impact of weaning weight on lifetime growth performance, and suggest that a HC feeding program should be focused on L weaner pigs (i.e., weaning weight less than 6.5 kg at 27 d of age) in order to maximise financial returns.

  3. Practical reliability and uncertainty quantification in complex systems : final report.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Ringland, James T.; Marzouk, Youssef M. (Massachusetts Institute of Technology, Cambridge, MA); Boggs, Paul T.; Zurn, Rena M.; Diegert, Kathleen V. (Sandia National Laboratories, Albuquerque, NM); Pebay, Philippe Pierre; Red-Horse, John Robert (Sandia National Laboratories, Albuquerque, NM)

    2009-09-01

The purpose of this project was to investigate the use of Bayesian methods for the estimation of the reliability of complex systems. The goals were to find methods for dealing with continuous data, rather than simple pass/fail data; to avoid assumptions of specific probability distributions, especially Gaussian, or normal, distributions; to compute not only an estimate of the reliability of the system, but also a measure of the confidence in that estimate; to develop procedures to address time-dependent or aging aspects in such systems, and to use these models and results to derive optimal testing strategies. The system is assumed to be a system of systems, i.e., a system with discrete components that are themselves systems. Furthermore, the system is 'engineered' in the sense that each node is designed to do something and that we have a mathematical description of that process. In the time-dependent case, the assumption is that we have a general, nonlinear, time-dependent function describing the process. The major results of the project are described in this report. In summary, we developed a sophisticated mathematical framework based on modern probability theory and Bayesian analysis. This framework encompasses all aspects of epistemic uncertainty and easily incorporates steady-state and time-dependent systems. Based on Markov chain Monte Carlo methods, we devised a computational strategy for general probability density estimation in the steady-state case. This enabled us to compute a distribution of the reliability from which many questions, including confidence, could be addressed. We then extended this to the time domain and implemented procedures to estimate the reliability over time, including the use of the method to predict the reliability at a future time. Finally, we used certain aspects of Bayesian decision analysis to create a novel method for determining an optimal testing strategy, e.g., we can estimate the 'best' location to
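    As an illustration only, the sketch below shows how a Bayesian reliability distribution can be obtained from continuous test data with a Metropolis (Markov chain Monte Carlo) sampler. The Gaussian margin model, the priors, the proposal step sizes and the synthetic data are all hypothetical assumptions; the report's actual framework is considerably more general.

```python
# Minimal sketch: Bayesian estimation of a reliability distribution from
# continuous margin data (capacity minus demand) using a Metropolis sampler.
# The Gaussian margin model, priors, and step sizes are illustrative choices,
# not the framework developed in the report.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
data = rng.normal(1.2, 0.8, size=30)          # observed margins (synthetic)

def log_post(mu, log_sigma):
    sigma = np.exp(log_sigma)
    # Weak priors: mu ~ N(0, 10^2), log_sigma ~ N(0, 2^2)
    lp = norm.logpdf(mu, 0, 10) + norm.logpdf(log_sigma, 0, 2)
    return lp + norm.logpdf(data, mu, sigma).sum()

samples = []
mu, ls = 0.0, 0.0
lp = log_post(mu, ls)
for _ in range(20000):
    mu_p, ls_p = mu + rng.normal(0, 0.2), ls + rng.normal(0, 0.2)
    lp_p = log_post(mu_p, ls_p)
    if np.log(rng.uniform()) < lp_p - lp:      # Metropolis accept/reject
        mu, ls, lp = mu_p, ls_p, lp_p
    samples.append((mu, ls))

burned = np.array(samples[5000:])
# Reliability = P(margin > 0) for each posterior draw -> a distribution,
# from which a point estimate and a credible (confidence-like) interval follow
rel = norm.cdf(burned[:, 0] / np.exp(burned[:, 1]))
print(f"posterior mean reliability: {rel.mean():.3f}")
print(f"90% credible interval: {np.percentile(rel, [5, 95])}")
```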

  4. Modeling and Performance Considerations for Automated Fault Isolation in Complex Systems

    Science.gov (United States)

    Ferrell, Bob; Oostdyk, Rebecca

    2010-01-01

    The purpose of this paper is to document the modeling considerations and performance metrics that were examined in the development of a large-scale Fault Detection, Isolation and Recovery (FDIR) system. The FDIR system is envisioned to perform health management functions for both a launch vehicle and the ground systems that support the vehicle during checkout and launch countdown by using suite of complimentary software tools that alert operators to anomalies and failures in real-time. The FDIR team members developed a set of operational requirements for the models that would be used for fault isolation and worked closely with the vendor of the software tools selected for fault isolation to ensure that the software was able to meet the requirements. Once the requirements were established, example models of sufficient complexity were used to test the performance of the software. The results of the performance testing demonstrated the need for enhancements to the software in order to meet the demands of the full-scale ground and vehicle FDIR system. The paper highlights the importance of the development of operational requirements and preliminary performance testing as a strategy for identifying deficiencies in highly scalable systems and rectifying those deficiencies before they imperil the success of the project

  5. Complexities of Social Capital in Boards of Directors

    DEFF Research Database (Denmark)

    Sulinska, Iwona Magdalena

The aim of the dissertation is to disentangle complexities of social capital in boards of directors through proposing new theoretical perspectives and methodological approaches. Although extant previous research has discussed various aspects of social capital and its association with numerous... and firm performance. Chapter 3 explores social capital of board chair, which has been overlooked in previous studies. It suggests that individual social capital of board chair is as important for organizational performance as social capital of CEO and directors. Therefore, performance effect derives from... and external networks of social relationships created by board members. Evolution paths are consequently proposed for diversity and strength of external network ties, and for internal network cohesion. In light of the overarching research question, the final chapter summarizes the findings.

  6. Molding cork sheets to complex shapes

    Science.gov (United States)

    Sharpe, M. H.; Simpson, W. G.; Walker, H. M.

    1977-01-01

    Partially cured cork sheet is easily formed to complex shapes and then final-cured. Temperature and pressure levels required for process depend upon resin system used and final density and strength desired. Sheet can be bonded to surface during final cure, or can be first-formed in mold and bonded to surface in separate step.

  7. Fourier Transform Microwave Spectroscopy of Multiconformational Molecules and Van Der Waals Complexes.

    Science.gov (United States)

    Hight Walker, Angela Renee

    1995-01-01

With the use of a Fourier transform microwave (FTM) spectrometer, structural determinations of two types of species, multiconformational molecules and van der Waals complexes, have been performed. Presented in this thesis are three sections summarizing this research effort. The first section contains a detailed explanation of the FTM instrument. In Section II, the study of three multiconformational molecules is presented as two chapters. Finally, three chapters in Section III outline the work still in progress on many van der Waals complexes. Section I was written to be a "manual" for the FTM spectrometer and to aid new additions to the group in their understanding of the instrument. An instruction guide is necessary for home-built instruments such as this one due to their unique design and application. Vital techniques and theories are discussed and machine operation is outlined. A brief explanation of general microwave spectroscopy as performed on an FTM spectrometer is also given. Section II is composed of two chapters pertaining to multiconformational molecules. In Chapter 2, a complete structural analysis of dipropyl ether is reported. The only conformer assigned had C_s symmetry. Many transitions are yet unassigned. Chapter 3 summarizes an investigation of two nitrosamines: methyl ethyl and methyl propyl nitrosamine. Only one conformer was observed for methyl ethyl nitrosamine, but two were assigned to methyl propyl nitrosamine. Nuclear hyperfine structure and internal methyl rotation complicated the spectra. The final section, Section III, contains the ongoing progress on weakly bound van der Waals complexes. The analysis of the OCS-HBr complex identified the structure as quasi-linear with large amplitude bending motions. Five separate isotopomers were assigned. Transitions originating from the HBr-DBr complex were measured and presented in Chapter 5. Although early in the analysis, the structure was determined to be bent and deuterium bonded. The

  8. Innovative subsurface stabilization project -- Final Report

    International Nuclear Information System (INIS)

    Loomis, G.G.; Zdinak, A.P.; Bishop, C.W.

    1996-11-01

This is a report of results of applying four innovative grouting materials and one commercially available material for creating monoliths out of buried waste sites using jet grouting. The four innovative materials included a proprietary water-based epoxy, an Idaho National Engineering Laboratory-developed two-component grout that resembles hematite when cured with soil, molten low-temperature paraffin, and a proprietary iron oxide cement-based grout called TECT. The commercial grout was Type-H high-sulfate-resistant cement. These materials were tested in specially designed cold test pits that simulate buried transuranic waste at the Idaho National Engineering Laboratory. In addition to the grouting studies, specially designed field-scale permeameters were constructed to perform full-scale controlled mass balance hydraulic conductivity studies. An ungrouted field-scale permeameter contained simulated buried waste and soil and was left ungrouted, and a second identical field-scale permeameter was grouted with commercial-grade Type-H cement. The field demonstrations were performed in an area referred to as the Cold Test Pit at the Idaho National Engineering Laboratory. The Cold Test Pit is adjacent to the laboratory's Radioactive Waste Management Complex. At the complex, 2 million ft³ of transuranic waste is commingled with 6-8 million ft³ of soil in shallow land burial, and improving the confinement of this waste is one of the options for final waste disposition. This report gives results of grouting, coring, hydraulic conductivity, and destructive examination of the grouted buried waste matrix.

  9. Performance of community health workers: situating their intermediary position within complex adaptive health systems.

    Science.gov (United States)

    Kok, Maryse C; Broerse, Jacqueline E W; Theobald, Sally; Ormel, Hermen; Dieleman, Marjolein; Taegtmeyer, Miriam

    2017-09-02

Health systems are social institutions, in which health worker performance is shaped by transactional processes between different actors. This analytical assessment unravels the complex web of factors that influence the performance of community health workers (CHWs) in low- and middle-income countries. It examines their unique intermediary position between the communities they serve and actors in the health sector, and the complexity of the health systems in which they operate. The assessment combines evidence from the international literature on CHW programmes with research outcomes from the 5-year REACHOUT consortium, undertaking implementation research to improve CHW performance in six contexts (two in Asia and four in Africa). A conceptual framework on CHW performance, which explicitly conceptualizes the interface role of CHWs, is presented. Various categories of factors influencing CHW performance are distinguished in the framework: the context, the health system and intervention hardware and the health system and intervention software. Hardware elements of CHW interventions comprise the supervision systems, training, accountability and communication structures, incentives, supplies and logistics. Software elements relate to the ideas, interests, relationships, power, values and norms of the health system actors. They influence CHWs' feelings of connectedness, familiarity, self-fulfilment and serving the same goals and CHWs' perceptions of support received, respect, competence, honesty, fairness and recognition. The framework shines a spotlight on the need for programmes to pay more attention to ideas, interests, relationships, power, values and norms of CHWs, communities, health professionals and other actors in the health system, if CHW performance is to improve.

  10. Final Performance Report

    Energy Technology Data Exchange (ETDEWEB)

    Hsieh, S. T. [Tulane Univ., New Orleans, LA (United States)

    2013-08-31

U.S./China Energy and Environmental Technology Center (EETC), Payson Center for International Development, Law School of Tulane University was officially established in 1997 with initial funds from the private sector, the US Environmental Protection Agency and the US Department of Energy (DOE). Lately, DOE has provided EETC funds for operations with cost share from the Ministry of Science and Technology, China. EETC was created to facilitate the development of friendly, broad-based U.S./China relations. Tulane University signed the Memorandum of Understanding (MOU) with the Chinese People's Institute of Foreign Affairs (1995) to promote the formation of Chinese partners for EETC. EETC's original goal is to enhance the competitiveness of US clean fossil energy technology in China so that, as its economy expands, the local and global environment are well protected, specifically through the demonstration and broad deployment of US-developed clean coal technology for power generation, transmission, and emission reduction in China. EETC is also focused on US industry partnerships for local economic development. One of the main objectives of the EETC is to promote the efficient, responsible production and utilization of energy with a focus on clean fossil energy, promote US clean energy and environmental technologies, and encourage environmental performance while improving the quality of life in China. Another objective is to assist China with environmental and energy policy development and provide support for China's development with expertise (best practices) from US industry.

  11. Planning for risk-informed/performance-based fire protection at nuclear power plants. Final report

    International Nuclear Information System (INIS)

    Najafi, B.; Parkinson, W.J.; Lee, J.A.

    1997-12-01

    This document presents a framework for discussing issues and building consensus towards use of fire modeling and risk technology in nuclear power plant fire protection program implementation. The plan describes a three-phase approach: development of core technologies, implementation of methods, and finally, case studies and pilot applications to verify viability of such methods. The core technologies are defined as fire modeling, fire and system tests, use of operational data, and system and risk techniques. The implementation phase addresses the programmatic issues involved in implementing a risk-informed/performance-based approach in an integrated approach with risk/performance measures. The programmatic elements include: (1) a relationship with fire codes and standards development as defined by the ongoing effort of NFPA for development of performance-based standards; (2) the ability for NRC to undertake inspection and enforcement; and (3) the benefit to utilities in terms of cost versus safety. The case studies are intended to demonstrate applicability of single issue resolution while pilot applications are intended to check the applicability of the integrated program as a whole

  12. Model Complexity and Out-of-Sample Performance: Evidence from S&P 500 Index Returns

    NARCIS (Netherlands)

    Kaeck, Andreas; Rodrigues, Paulo; Seeger, Norman J.

    We apply a range of out-of-sample specification tests to more than forty competing stochastic volatility models to address how model complexity affects out-of-sample performance. Using daily S&P 500 index returns, model confidence set estimations provide strong evidence that the most important model

  13. A complex symbol signal-to-noise ratio estimator and its performance

    Science.gov (United States)

    Feria, Y.

    1994-01-01

    This article presents an algorithm for estimating the signal-to-noise ratio (SNR) of signals that contain data on a downconverted suppressed carrier or the first harmonic of a square-wave subcarrier. This algorithm can be used to determine the performance of the full-spectrum combiner for the Galileo S-band (2.2- to 2.3-GHz) mission by measuring the input and output symbol SNR. A performance analysis of the algorithm shows that the estimator can estimate the complex symbol SNR using 10,000 symbols at a true symbol SNR of -5 dB with a mean of -4.9985 dB and a standard deviation of 0.2454 dB, and these analytical results are checked by simulations of 100 runs with a mean of -5.06 dB and a standard deviation of 0.2506 dB.
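    For orientation, the sketch below applies the classical second- and fourth-moment (M2M4) estimator to simulated complex BPSK symbols at the quoted operating point (10,000 symbols, a true SNR of -5 dB, 100 runs). It only illustrates the kind of measurement described; it is not the article's estimator, and the simulation setup is an assumption.

```python
# Illustrative moment-based (M2M4) SNR estimator for constant-modulus complex
# symbols in circular complex Gaussian noise. This is the classical textbook
# estimator, not necessarily the algorithm described in the article.
import numpy as np

def m2m4_snr_db(r):
    m2 = np.mean(np.abs(r) ** 2)               # E[|r|^2] = S + N
    m4 = np.mean(np.abs(r) ** 4)               # E[|r|^4] = S^2 + 4SN + 2N^2
    s = np.sqrt(max(2.0 * m2 ** 2 - m4, 0.0))  # signal power estimate
    n = m2 - s                                 # noise power estimate
    return 10.0 * np.log10(s / n)

rng = np.random.default_rng(0)
true_snr_db, n_sym = -5.0, 10_000
noise_power = 10 ** (-true_snr_db / 10)        # unit-power symbols assumed

runs = []
for _ in range(100):                           # echo the article's 100 runs
    data = rng.choice([-1.0, 1.0], n_sym)
    noise = (rng.normal(0, np.sqrt(noise_power / 2), n_sym)
             + 1j * rng.normal(0, np.sqrt(noise_power / 2), n_sym))
    runs.append(m2m4_snr_db(data + noise))

runs = np.array(runs)
print(f"mean estimate over 100 runs: {runs.mean():.2f} dB "
      f"(std {runs.std(ddof=1):.2f} dB)")
```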

  14. BETWEEN PARCIMONY AND COMPLEXITY: COMPARING PERFORMANCE MEASURES FOR ROMANIAN BANKING INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    ANCA MUNTEANU

    2012-01-01

Full Text Available The main objective of this study is to establish the relationship between traditional measures of performance (ROE, ROA and NIM) and EVA in order to gain some insight about the relevance of using more sophisticated performance measurement tools. Towards this end the study uses two acknowledged statistical measures: Kendall's tau and Spearman's rank correlation. Using data from 12 Romanian banking institutions that report under IFRS for the period 2006-2010, the results suggest that generally EVA is highly correlated with Residual Income in the years that present positive operational profits whereas for the years with negative outcome the correlation is low. ROA and ROE are the measures that best correlate with EVA for the entire period and thus (applying Occam's razor) could be used as a substitute for more complex shareholder earnings measures.
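    A small sketch of how the two rank-correlation measures named above can be computed with scipy; the EVA and ROA vectors below are synthetic placeholders, not the banks' data.

```python
# Rank correlations of the kind used in the study; the figures are synthetic.
import numpy as np
from scipy.stats import kendalltau, spearmanr

eva = np.array([12.1, -3.4, 5.6, 8.9, -1.2, 4.4, 7.8, 0.9, -2.5, 6.1, 3.3, 9.7])
roa = np.array([1.8, -0.4, 0.9, 1.2, -0.1, 0.7, 1.1, 0.2, -0.3, 0.8, 0.5, 1.4])

tau, tau_p = kendalltau(eva, roa)
rho, rho_p = spearmanr(eva, roa)
print(f"Kendall's tau: {tau:.3f} (p = {tau_p:.3f})")
print(f"Spearman's rho: {rho:.3f} (p = {rho_p:.3f})")
```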

  15. Synchronization coupled systems to complex networks

    CERN Document Server

    Boccaletti, Stefano; del Genio, Charo I; Amann, Andreas

    2018-01-01

    A modern introduction to synchronization phenomena, this text presents recent discoveries and the current state of research in the field, from low-dimensional systems to complex networks. The book describes some of the main mechanisms of collective behaviour in dynamical systems, including simple coupled systems, chaotic systems, and systems of infinite-dimension. After introducing the reader to the basic concepts of nonlinear dynamics, the book explores the main synchronized states of coupled systems and describes the influence of noise and the occurrence of synchronous motion in multistable and spatially-extended systems. Finally, the authors discuss the underlying principles of collective dynamics on complex networks, providing an understanding of how networked systems are able to function as a whole in order to process information, perform coordinated tasks, and respond collectively to external perturbations. The demonstrations, numerous illustrations and application examples will help advanced graduate s...

  16. Integrated modeling tool for performance engineering of complex computer systems

    Science.gov (United States)

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  17. A novel hybrid color image encryption algorithm using two complex chaotic systems

    Science.gov (United States)

    Wang, Leyuan; Song, Hongjun; Liu, Ping

    2016-02-01

Based on complex Chen and complex Lorenz systems, a novel color image encryption algorithm is proposed. Compared with real-valued chaotic systems, the larger chaotic ranges and more complex behaviors of complex chaotic systems additionally enhance the security and enlarge the key space of color image encryption. The encryption algorithm comprises three steps. In the permutation process, the pixels of the plain image are scrambled via two-dimensional and one-dimensional permutation processes among the RGB channels individually. In the diffusion process, the exclusive-or (XOR for short) operation is employed to conceal pixel information. Finally, the RGB channels are mixed to achieve a multilevel encryption. The security analysis and experimental simulations demonstrate that the key space of the proposed algorithm is large enough to resist brute-force attack and that the algorithm has excellent encryption performance.
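    To make the permutation-plus-diffusion structure concrete, the sketch below scrambles and XOR-masks a random RGB image. A logistic map stands in for the complex Chen/Lorenz systems as the keystream generator, and the key, map parameter and image size are arbitrary assumptions, so this is only a structural illustration, not the proposed cipher.

```python
# Structural sketch of a permutation + XOR-diffusion image cipher. A logistic
# map replaces the complex chaotic systems purely for brevity.
import numpy as np

def logistic_stream(x0, n, r=3.99):
    x, out = x0, np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        out[i] = x
    return out

def encrypt(img, key=0.4321):
    h, w, c = img.shape
    n = h * w * c
    stream = logistic_stream(key, 2 * n)
    perm = np.argsort(stream[:n])                      # permutation stage
    keystream = (stream[n:] * 255).astype(np.uint8)    # diffusion stage
    return (img.reshape(-1)[perm] ^ keystream).reshape(h, w, c)

def decrypt(cipher, key=0.4321):
    h, w, c = cipher.shape
    n = h * w * c
    stream = logistic_stream(key, 2 * n)
    perm = np.argsort(stream[:n])
    keystream = (stream[n:] * 255).astype(np.uint8)
    flat = cipher.reshape(-1) ^ keystream
    out = np.empty_like(flat)
    out[perm] = flat                                   # invert the permutation
    return out.reshape(h, w, c)

rng = np.random.default_rng(0)
plain = rng.integers(0, 256, (64, 64, 3), dtype=np.uint8)
cipher = encrypt(plain)
assert np.array_equal(decrypt(cipher), plain)          # round trip succeeds

hist = np.bincount(cipher.reshape(-1), minlength=256) / cipher.size
entropy = -(hist[hist > 0] * np.log2(hist[hist > 0])).sum()
print(f"round-trip OK; ciphertext entropy ≈ {entropy:.2f} bits/byte")
```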

  18. A model of R-D performance evaluation for Rate-Distortion-Complexity evaluation of H.264 video coding

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren

    2007-01-01

This paper considers a method for evaluation of Rate-Distortion-Complexity (R-D-C) performance of video coding. A statistical model of the transformed coefficients is used to estimate the Rate-Distortion (R-D) performance. A model framework for rate, distortion and slope of the R-D curve for inter and intra frames is presented. Assumptions are given for analyzing an R-D model for fast R-D-C evaluation. The theoretical expressions are combined with H.264 video coding, and confirmed by experimental results. The complexity framework is applied to the integer motion estimation.

  19. Do Work Placements Improve Final Year Academic Performance or Do High-Calibre Students Choose to Do Work Placements?

    Science.gov (United States)

    Jones, C. M.; Green, J. P.; Higson, H. E.

    2017-01-01

    This study investigates whether the completion of an optional sandwich work placement enhances student performance in final year examinations. Using Propensity Score Matching, our analysis departs from the literature by controlling for self-selection. Previous studies may have overestimated the impact of sandwich work placements on performance…
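    A minimal propensity-score-matching sketch on synthetic data, showing how self-selection into placements can be controlled for. The single covariate (prior ability), the selection mechanism and the +2-mark placement effect are invented assumptions; the study's actual covariates and matching specification are not reproduced here.

```python
# Propensity score matching sketch: estimate a "placement" effect on final
# marks when high-ability students are more likely to choose a placement.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(42)
n = 2000
ability = rng.normal(0, 1, n)                                    # prior ability
placement = rng.binomial(1, 1 / (1 + np.exp(-1.5 * ability)))    # self-selection
final_mark = 60 + 5 * ability + 2 * placement + rng.normal(0, 5, n)

X = ability.reshape(-1, 1)
ps = LogisticRegression().fit(X, placement).predict_proba(X)[:, 1]

treated = np.where(placement == 1)[0]
control = np.where(placement == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control].reshape(-1, 1))
_, idx = nn.kneighbors(ps[treated].reshape(-1, 1))
matched_control = control[idx.ravel()]

naive = final_mark[treated].mean() - final_mark[control].mean()
matched = final_mark[treated].mean() - final_mark[matched_control].mean()
print(f"naive difference:  {naive:.2f} marks (inflated by self-selection)")
print(f"matched estimate:  {matched:.2f} marks (closer to the simulated +2)")
```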

  20. Technologies and tools for high-performance distributed computing. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Karonis, Nicholas T.

    2000-05-01

In this project we studied the practical use of the MPI message-passing interface in advanced distributed computing environments. We built on the existing software infrastructure provided by the Globus Toolkit™, the MPICH portable implementation of MPI, and the MPICH-G integration of MPICH with Globus. As a result of this project we have replaced MPICH-G with its successor MPICH-G2, which is also an integration of MPICH with Globus. MPICH-G2 delivers significant improvements in message passing performance when compared to its predecessor MPICH-G and was based on superior software design principles resulting in a software base that was much easier to make the functional extensions and improvements we did. Using Globus services we replaced the default implementation of MPI's collective operations in MPICH-G2 with more efficient multilevel topology-aware collective operations which, in turn, led to the development of a new timing methodology for broadcasts [8]. MPICH-G2 was extended to include client/server functionality from the MPI-2 standard [23] to facilitate remote visualization applications and, through the use of MPI idioms, MPICH-G2 provided application-level control of quality-of-service parameters as well as application-level discovery of underlying Grid-topology information. Finally, MPICH-G2 was successfully used in a number of applications including an award-winning record-setting computation in numerical relativity. In the sections that follow we describe in detail the accomplishments of this project, we present experimental results quantifying the performance improvements, and conclude with a discussion of our applications experiences. This project resulted in a significant increase in the utility of MPICH-G2.
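    For readers unfamiliar with MPI collectives, the sketch below times a broadcast using mpi4py as a stand-in. It only illustrates the kind of collective-operation measurement mentioned above; it is not MPICH-G2 code and performs none of its Grid-topology-aware optimization.

```python
# Illustrative MPI broadcast timing in mpi4py
# (run e.g. with: mpiexec -n 4 python bcast_timing.py).
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

payload = np.zeros(1 << 20, dtype=np.float64)   # 8 MB buffer
if rank == 0:
    payload[:] = np.random.random(payload.size)

comm.Barrier()
t0 = MPI.Wtime()
comm.Bcast(payload, root=0)                     # collective broadcast
comm.Barrier()
elapsed = MPI.Wtime() - t0

if rank == 0:
    print(f"broadcast of {payload.nbytes / 1e6:.1f} MB across "
          f"{comm.Get_size()} ranks took {elapsed * 1e3:.2f} ms")
```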

  1. Building and Running the Yucca Mountain Total System Performance Model in a Quality Environment

    International Nuclear Information System (INIS)

    D.A. Kalinich; K.P. Lee; J.A. McNeish

    2005-01-01

A Total System Performance Assessment (TSPA) model has been developed to support the Safety Analysis Report (SAR) for the Yucca Mountain High-Level Waste Repository. The TSPA model forecasts repository performance over a 20,000-year simulation period. It has a high degree of complexity due to the complexity of its underlying process and abstraction models. This is reflected in the size of the model (a 27,000 element GoldSim file), its use of dynamic-linked libraries (14 DLLs), the number and size of its input files (659 files totaling 4.7 GB), and the number of model input parameters (2541 input database entries). TSPA model development and subsequent simulations with the final version of the model were performed to a set of Quality Assurance (QA) procedures. Due to the complexity of the model, comments on previous TSPAs, and the number of analysts involved (22 analysts in seven cities across four time zones), additional controls for the entire life-cycle of the TSPA model, including management, physical, model change, and input controls were developed and documented. These controls did not replace the QA procedures; rather, they provided guidance for implementing the requirements of the QA procedures with the specific intent of ensuring that the model development process and the simulations performed with the final version of the model had sufficient checking, traceability, and transparency. Management controls were developed to ensure that only management-approved changes were implemented into the TSPA model and that only management-approved model runs were performed. Physical controls were developed to track the use of prototype software and preliminary input files, and to ensure that only qualified software and inputs were used in the final version of the TSPA model. In addition, a system was developed to name, file, and track development versions of the TSPA model as well as simulations performed with the final version of the model.

  2. Participation of the Black Community in Selected Aspects of the Educational Institution of Newark, 1958-1972. Final Report.

    Science.gov (United States)

    Phillips, W. M., Jr.; And Others

    This document is the final report of a two-year study of the interdependency of race and education in Newark, New Jersey. The report is organized into sections describing how the research was performed and presents the results on a set of topics defined as central for providing a useful understanding of the complex interrelationships of race and…

  3. Complex Hollow Nanostructures: Synthesis and Energy-Related Applications.

    Science.gov (United States)

    Yu, Le; Hu, Han; Wu, Hao Bin; Lou, Xiong Wen David

    2017-04-01

    Hollow nanostructures offer promising potential for advanced energy storage and conversion applications. In the past decade, considerable research efforts have been devoted to the design and synthesis of hollow nanostructures with high complexity by manipulating their geometric morphology, chemical composition, and building block and interior architecture to boost their electrochemical performance, fulfilling the increasing global demand for renewable and sustainable energy sources. In this Review, we present a comprehensive overview of the synthesis and energy-related applications of complex hollow nanostructures. After a brief classification, the design and synthesis of complex hollow nanostructures are described in detail, which include hierarchical hollow spheres, hierarchical tubular structures, hollow polyhedra, and multi-shelled hollow structures, as well as their hybrids with nanocarbon materials. Thereafter, we discuss their niche applications as electrode materials for lithium-ion batteries and hybrid supercapacitors, sulfur hosts for lithium-sulfur batteries, and electrocatalysts for oxygen- and hydrogen-involving energy conversion reactions. The potential superiorities of complex hollow nanostructures for these applications are particularly highlighted. Finally, we conclude this Review with urgent challenges and further research directions of complex hollow nanostructures for energy-related applications. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.

    2017-07-04

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  5. Comparative Visual Analysis of Structure-Performance Relations in Complex Bulk-Heterojunction Morphologies

    KAUST Repository

    Aboulhassan, A.; Sicat, R.; Baum, D.; Wodo, O.; Hadwiger, Markus

    2017-01-01

    The structure of Bulk-Heterojunction (BHJ) materials, the main component of organic photovoltaic solar cells, is very complex, and the relationship between structure and performance is still largely an open question. Overall, there is a wide spectrum of fabrication configurations resulting in different BHJ morphologies and correspondingly different performances. Current state-of-the-art methods for assessing the performance of BHJ morphologies are either based on global quantification of morphological features or simply on visual inspection of the morphology based on experimental imaging. This makes finding optimal BHJ structures very challenging. Moreover, finding the optimal fabrication parameters to get an optimal structure is still an open question. In this paper, we propose a visual analysis framework to help answer these questions through comparative visualization and parameter space exploration for local morphology features. With our approach, we enable scientists to explore multivariate correlations between local features and performance indicators of BHJ morphologies. Our framework is built on shape-based clustering of local cubical regions of the morphology that we call patches. This enables correlating the features of clusters with intuition-based performance indicators computed from geometrical and topological features of charge paths.

  6. Schedulability Analysis for Java Finalizers

    DEFF Research Database (Denmark)

    Bøgholm, Thomas; Hansen, Rene Rydhof; Søndergaard, Hans

    2010-01-01

    Java finalizers perform clean-up and finalisation of objects at garbage collection time. In real-time Java profiles the use of finalizers is either discouraged (RTSJ, Ravenscar Java) or even disallowed (JSR-302), mainly because of the unpredictability of finalizers and in particular their impact...... on the schedulability analysis. In this paper we show that a controlled scoped memory model results in a structured and predictable execution of finalizers, more reminiscent of C++ destructors than Java finalizers. Furthermore, we incorporate finalizers into a (conservative) schedulability analysis for Predictable Java...... programs. Finally, we extend the SARTS tool for automated schedulability analysis of Java bytecode programs to handle finalizers in a fully automated way....

  7. Innovation in user-centered skills and performance improvement for sustainable complex service systems.

    Science.gov (United States)

    Karwowski, Waldemar; Ahram, Tareq Z

    2012-01-01

In order to leverage individual and organizational learning and to remain competitive in current turbulent markets it is important for employees, managers, planners and leaders to perform at high levels over time. Employee competence and skills are extremely important matters in view of the general shortage of talent and the mobility of employees with talent. Two factors emerged to have the greatest impact on the competitiveness of complex service systems: improving managerial and employee's knowledge attainment for skills, and improving the training and development of the workforce. This paper introduces the knowledge-based user-centered service design approach for sustainable skill and performance improvement in education, design and modeling of the next generation of complex service systems. The rest of the paper covers topics in human factors and sustainable business process modeling for the service industry, and illustrates the user-centered service system development cycle with the integration of systems engineering concepts in service systems. A roadmap for designing service systems of the future is discussed. The framework introduced in this paper is based on key user-centered design principles and systems engineering applications to support service competitiveness.

  8. On improving the performance of nonphotochemical quenching in CP29 light-harvesting antenna complex

    Science.gov (United States)

    Berman, Gennady P.; Nesterov, Alexander I.; Sayre, Richard T.; Still, Susanne

    2016-03-01

    We model and simulate the performance of charge-transfer in nonphotochemical quenching (NPQ) in the CP29 light-harvesting antenna-complex associated with photosystem II (PSII). The model consists of five discrete excitonic energy states and two sinks, responsible for the potentially damaging processes and charge-transfer channels, respectively. We demonstrate that by varying (i) the parameters of the chlorophyll-based dimer, (ii) the resonant properties of the protein-solvent environment interaction, and (iii) the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ. Our analysis suggests strategies for improving the performance of the NPQ in response to environmental changes, and may stimulate experimental verification.
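    A generic rate-equation sketch of a model of this shape: a handful of excitonic states exchanging population and draining into a damage sink and a charge-transfer (quenching) sink. The chain topology and all rate constants below are hypothetical placeholders, not the paper's fitted parameters.

```python
# Toy population dynamics: 5 excitonic states, hopping between neighbours,
# leakage into a damage sink from state 0 and a charge-transfer sink from
# state 4. Rates are arbitrary illustrative values (units ~ 1/ps).
import numpy as np
from scipy.integrate import solve_ivp

n_states = 5
k_hop = 1.0        # nearest-neighbour transfer rate
k_damage = 0.05    # leakage from state 0 into the damage sink
k_ct = 0.5         # drain from state 4 into the charge-transfer sink

def rhs(t, y):
    p = y[:n_states]
    dp = np.zeros(n_states)
    for i in range(n_states - 1):        # excitation hopping along the chain
        dp[i] += k_hop * (p[i + 1] - p[i])
        dp[i + 1] += k_hop * (p[i] - p[i + 1])
    dp[0] -= k_damage * p[0]
    dp[-1] -= k_ct * p[-1]
    return np.concatenate([dp, [k_damage * p[0], k_ct * p[-1]]])

y0 = np.zeros(n_states + 2)
y0[0] = 1.0                              # excitation starts on state 0
sol = solve_ivp(rhs, (0.0, 100.0), y0)

damage, quenched = sol.y[n_states, -1], sol.y[n_states + 1, -1]
print(f"fraction lost to damage channel:    {damage:.3f}")
print(f"fraction routed to charge transfer: {quenched:.3f}")
print(f"quenched share (NPQ 'performance'): {quenched / (damage + quenched):.3f}")
```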

  9. Task complexity and task, goal, and reward interdependence in group performance management : A prescriptive model

    NARCIS (Netherlands)

    van Vijfeijken, H.; Kleingeld, A.; van Tuijl, H.; Algera, J.A.; Thierry, Hk.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  10. Ready for practice? A study of confidence levels of final year dental students at Cardiff University and University College Cork.

    Science.gov (United States)

    Honey, J; Lynch, C D; Burke, F M; Gilmour, A S M

    2011-05-01

The aim of this study was to describe the self-reported confidence levels of final year students at the School of Dentistry, Cardiff University and at the University Dental School & Hospital, Cork, Ireland in performing a variety of dental procedures commonly completed in primary dental care settings. A questionnaire was distributed to 61 final year students at Cardiff and 34 final year students at Cork. Information requested related to the respondents' confidence in performing a variety of routine clinical tasks, using a five-point scale (1=very little confidence, 5=very confident). Comparisons were made between the two schools, gender of the respondent, and whether or not a student intended completing a year of vocational training after graduation. A response rate of 74% was achieved (n=70). The greatest self-reported confidence scores were for 'scale and polish' (4.61), fissure sealants (4.54) and delivery of oral hygiene instruction (4.51). Areas with the least confidence were placement of stainless steel crowns (2.83), vital tooth bleaching (2.39) and surgical extractions (2.26). Students at Cardiff were more confident than those at Cork in performing simple extractions (Cardiff: 4.31; Cork: 3.76) and surgical extractions (Cardiff: 2.61; Cork: 1.88), whilst students in Cork were more confident in caries diagnosis (Cork: 4.24; Cardiff: 3.89), fissure sealing (Cork: 4.76; Cardiff: 4.33) and placement of preventive resin restorations (Cork: 4.68; Cardiff: 4.22). Final year students at Cardiff and Cork were most confident in simpler procedures and procedures in which they had had most clinical experience. They were least confident in more complex procedures and procedures in which they had the least clinical experience. Increased clinical time in complex procedures may help in increasing final year students' confidence in those areas. © 2011 John Wiley & Sons A/S.

  11. Front-office/back-office configurations and operational performance in complex health services.

    Science.gov (United States)

    Gemmel, Paul; van Steenis, Thomas; Meijboom, Bert

    2014-01-01

    Acquired brain injury (ABI) occurs from various causes at different ages and leads to many different types of healthcare needs. Several Dutch ABI-networks installed a local co-ordination and contact point (CCP) which functions as a central and easily accessible service for people to consult when they have questions related to ABI. To explore the relationship between front/back office design and operational performance by investigating the particular enquiry service provided by different CCPs for people affected by an ABI. In-depth interviews with 14 FO/BO employees from three case organizations, complemented with information from desk research and three one-day field visits. The CCPs applied different FO/BO configurations in terms of customer contact and in terms of grouping of front and/or back office activities into tasks for one employee. It is the complexity of the enquiry that determines which approach is more appropriate. For complex enquiries, the level of decoupling is high in all CCPs. This allows multiple experts to be involved in the process. For regular enquiries, CCPs have a choice: either working in the same way as in the complex enquiries or coupling FO/BO activities to be able to serve clients faster and without handovers.

  12. Self-perceived versus objectively measured competence in performing clinical practical procedures by final year medical students

    OpenAIRE

    Katowa-Mukwato, Patricia; Banda, Sekelani

    2016-01-01

Objectives To determine and compare the self-perceived and objectively measured competence in performing 14 core-clinical practical procedures by Final Year Medical Students of the University of Zambia. Methods The study included 56 out of 60 graduating University of Zambia Medical Students of the 2012/2013 academic year. Self-perceived competence: students rated their competence on 14 core-clinical practical procedures using a self-administered questionnaire on a 5-point Likert scale. Objec...

  13. The effects of overtime work and task complexity on the performance of nuclear plant operators: A proposed methodology

    International Nuclear Information System (INIS)

    Banks, W.W.; Potash, L.

    1985-01-01

This document presents a very general methodology for determining the effect of overtime work and task complexity on operator performance in response to simulated out-of-limit nuclear plant conditions. The independent variables consist of three levels of overtime work and three levels of task complexity. Multiple dependent performance measures are proposed for use and discussion. Overtime work is operationally defined in terms of the number of hours worked by nuclear plant operators beyond the traditional 8 hours per shift. Task complexity is operationalized in terms of the number of operator tasks required to remedy a given plant anomalous condition and bring the plant back to a "within limits" or "normal" steady-state condition. The proposed methodology would employ a two-factor repeated-measures design along with the analysis of variance (linear) model.
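    A sketch of how the proposed 3 x 3 repeated-measures analysis could be run on synthetic operator scores using statsmodels' AnovaRM. The data-generating assumptions, factor levels and effect sizes below are invented purely to show the analysis structure, not results from any plant study.

```python
# Two-factor (overtime x task complexity) repeated-measures ANOVA on
# synthetic operator performance scores; every operator appears in all 9 cells.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(7)
rows = []
for op in range(1, 21):                       # 20 operators
    skill = rng.normal(0, 2)
    for overtime in (0, 4, 8):                # extra hours beyond an 8-h shift
        for complexity in (1, 2, 3):          # low / medium / high task count
            score = (80 + skill - 1.0 * overtime - 3.0 * complexity
                     - 0.5 * overtime * (complexity - 1) + rng.normal(0, 3))
            rows.append((op, overtime, complexity, score))

df = pd.DataFrame(rows, columns=["operator", "overtime", "complexity", "score"])
res = AnovaRM(df, depvar="score", subject="operator",
              within=["overtime", "complexity"]).fit()
print(res)                                    # F tests for both factors and interaction
```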

  14. Task versus relationship conflict, team performance, and team member satisfaction: a meta-analysis.

    Science.gov (United States)

    De Dreu, Carsten K W; Weingart, Laurie R

    2003-08-01

This study provides a meta-analysis of research on the associations between relationship conflict, task conflict, team performance, and team member satisfaction. Consistent with past theorizing, results revealed strong and negative correlations between relationship conflict, team performance, and team member satisfaction. In contrast to what has been suggested in both academic research and introductory textbooks, however, results also revealed strong and negative (instead of the predicted positive) correlations between task conflict, team performance, and team member satisfaction. As predicted, conflict had stronger negative relations with team performance in highly complex (decision making, project, mixed) than in less complex (production) tasks. Finally, task conflict was less negatively related to team performance when task conflict and relationship conflict were weakly, rather than strongly, correlated.

  15. Random walk-based similarity measure method for patterns in complex object

    Directory of Open Access Journals (Sweden)

    Liu Shihu

    2017-04-01

Full Text Available This paper discusses the similarity of patterns in complex objects. A complex object is composed both of the attribute information of patterns and the relational information between patterns. Bearing in mind the specificity of complex objects, a random walk-based similarity measurement method for patterns is constructed. In this method, the reachability of any two patterns with respect to the relational information is fully studied, and on this basis the similarity of patterns with respect to the relational information can be calculated. An integrated similarity measurement method is then proposed, and Algorithms 1 and 2 show the calculation procedure. This method makes full use of both the attribute information and the relational information. Finally, a synthetic example validates the proposed similarity measurement method.
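    A small numpy sketch of the general idea: a random-walk reachability score between patterns built from the relational (adjacency) information, blended with an attribute-based similarity. The toy graph, attribute vectors, walk length, restart factor and equal weighting are illustrative assumptions; this is not the paper's Algorithms 1 and 2.

```python
# Random-walk-based similarity between 5 "patterns" described by both
# relational (adjacency) and attribute information.
import numpy as np

A = np.array([[0, 1, 1, 0, 0],               # adjacency of 5 patterns
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
X = np.array([[1.0, 0.2], [0.9, 0.1], [0.8, 0.4],   # attribute vectors
              [0.1, 0.9], [0.0, 1.0]])

P = A / A.sum(axis=1, keepdims=True)         # row-stochastic transition matrix

def walk_similarity(P, steps=4, restart=0.2):
    """Average t-step reachability with damping, symmetrized."""
    n = P.shape[0]
    acc, term = np.zeros_like(P), np.eye(n)
    for _ in range(steps):
        term = (1 - restart) * term @ P
        acc += term
    S = acc / steps
    return (S + S.T) / 2

rel_sim = walk_similarity(P)                 # relational similarity
norms = np.linalg.norm(X, axis=1)
att_sim = (X @ X.T) / (norms[:, None] * norms[None, :])   # attribute (cosine)
combined = 0.5 * rel_sim + 0.5 * att_sim     # integrate both information types
print(np.round(combined, 3))
```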

  16. The Influence of Business Environmental Dynamism, Complexity and Munificence on Performance of Small and Medium Enterprises in Kenya

    Directory of Open Access Journals (Sweden)

    Washington Oduor Okeyo

    2014-08-01

Full Text Available The main purpose of this article is to examine how the business environment affects small and medium enterprises. The paper is motivated by the important contributions that small and medium enterprises make in many countries, especially Kenya, towards job creation, poverty reduction and economic development. The literature, however, argues that the effectiveness of these contributions is conditioned by the state of business environmental factors such as politics, the economy, socio-culture, technology, ecology and laws/regulations. Dynamism, complexity and munificence of these factors are therefore vital to the achievement of organizational objectives and overall performance. Even so, a review of the literature reveals contradictory views regarding the effect of these factors on the performance of organizations. Furthermore, studies focusing on these factors in the Kenyan context, particularly with regard to their effect on the performance of small and medium firms, are scarce. This article bridges this gap based on a study focusing on 800 manufacturing organizations in Nairobi, Kenya. A sample of 150 enterprises was selected through stratification by business sector followed by simple random sampling. The research design was a cross-sectional survey in which data were collected using a structured questionnaire over a period of one month, at the end of which 95 organizations responded, giving a response rate of 64%. Reliability and validity of the instrument were determined through Cronbach's alpha tests and expert reviews. The Statistical Package for Social Sciences was used to determine normality through descriptive statistics, and study hypotheses were tested using inferential statistics. The study established that the business environment had an overall impact on organizational performance. Specifically, dynamism, complexity and munificence each had a direct influence on the enterprises in the study. Furthermore the combined effect on performance was found to be greater than that of dynamism and

  17. Effects of Maize Source and Complex Enzymes on Performance and Nutrient Utilization of Broilers

    Directory of Open Access Journals (Sweden)

    Defu Tang

    2014-12-01

Full Text Available The objective of this study was to investigate the effect of maize source and complex enzymes containing amylase, xylanase and protease on performance and nutrient utilization of broilers. The experiment was a 4×3 factorial design with diets containing maize from four sources (M1, M2, M3, and M4), without or with two kinds of complex enzymes, A (Axtra XAP) and B (Avizyme 1502). Nine hundred and sixty day-old Arbor Acres broiler chicks were used in the trial (12 treatments with 8 replicate pens of 10 chicks). Birds fed the M1 diet had better body weight gain (BWG) and lower feed/gain ratio compared with those fed the M3 and M4 diets (p0.05, respectively). The fresh feces output was significantly decreased by the addition of enzyme B (p<0.05). Maize source affects the nutrient digestibility and performance of broilers, and a combination of amylase, xylanase and protease is effective in improving the growth profiles of broilers fed maize-soybean-rapeseed-cotton mixed diets.

  18. Radio elements / bottom salts separation by nano-filtration aided by complexation in a highly saline environment

    International Nuclear Information System (INIS)

    Gaubert, Eric

    1997-01-01

This research thesis addresses the use of a membrane-based technique, nano-filtration, with or without the aid of complexation, for the processing of highly saline liquid effluents produced by radio-chemical decontamination. The objective is to separate non-radioactive elements (sodium nitrate) from radio-elements (caesium, strontium and actinides) in order to reduce the volume of wastes. Within the perspective of an industrial application, a system to concentrate the effluent is firstly defined. Different nano-filtration membranes are tested and prove to be insufficient in a highly saline environment. A stage of selective complexation of radio-elements is therefore considered before nano-filtration. The main factors affecting the performance of nano-filtration-complexation (for a given membrane system) are identified: ionic strength, pH, ligand content, and trans-membrane pressure. Finally, a nano-filtration pilot is implemented to perform nano-filtration-complexation operations by remote handling on radioactive substances [fr]

  19. Market-Based Adult Lifelong Learning Performance Measures for Public Libraries Serving Lower Income and Majority-Minority Markets. Final Performance Report. September 1, 1996-August 31, 1999.

    Science.gov (United States)

    Koontz, Christine; Jue, Dean K.; Lance, Keith Curry

    This document is the final performance report for a Field Initiated Studies (FIS) project that addressed the need for a better assessment of public library services for adult lifelong learning in majority-minority and lower income library market areas. After stating the major educational problem addressed by the FIS project, the report lists the…

  20. A series of copper complexes with carbazole and oxadiazole moieties: Synthesis, characterization and luminescence performance

    Energy Technology Data Exchange (ETDEWEB)

    Bai Weiyang, E-mail: baiwy02@163.com [College of Chemistry and Chemical Engineering, Chongqing University of Technology, Chongqing 400054 (China); Sun Li [Graduate University of Chinese Academy of Sciences, Beijing 100049 (China)

    2012-10-15

In this paper, various moieties of ethyl, carbazole and oxadiazole are attached to 2-thiazol-4-yl-1H-benzoimidazole to form a series of diamine ligands. Their corresponding Cu(I) complexes are also synthesized using bis(2-(diphenylphosphanyl)phenyl) ether as the auxiliary ligand. Crystal structures, thermal property, electronic nature and luminescence property of these Cu(I) complexes are discussed in detail. These Cu(I) complexes are found to be efficient green emitters in solution, and their emissive parameters are improved largely by the incorporation of substituent moieties. Detailed analysis suggests that the effective suppression of solvent-induced exciplex quenching is responsible for this phenomenon. On the other hand, the introduction of substituent moieties exerts no obvious influence on molecular structure, thermal stability and emitting energy of the Cu(I) complexes, owing to their absence from the inner coordination sphere. - Highlights: • Diamine ligands with various moieties and Cu(I) complexes are synthesized. • Crystal structures and photophysical properties are discussed in detail. • The incorporation of substituent moieties improves luminescence performance. • Solvent-induced exciplex quenching is suppressed by substituent moieties.

  1. High-performance execution of psychophysical tasks with complex visual stimuli in MATLAB

    Science.gov (United States)

    Asaad, Wael F.; Santhanam, Navaneethan; McClellan, Steven

    2013-01-01

    Behavioral, psychological, and physiological experiments often require the ability to present sensory stimuli, monitor and record subjects' responses, interface with a wide range of devices, and precisely control the timing of events within a behavioral task. Here, we describe our recent progress developing an accessible and full-featured software system for controlling such studies using the MATLAB environment. Compared with earlier reports on this software, key new features have been implemented to allow the presentation of more complex visual stimuli, increase temporal precision, and enhance user interaction. These features greatly improve the performance of the system and broaden its applicability to a wider range of possible experiments. This report describes these new features and improvements, current limitations, and quantifies the performance of the system in a real-world experimental setting. PMID:23034363

  2. Cyclomatic Complexity: theme and variations

    Directory of Open Access Journals (Sweden)

    Brian Henderson-Sellers

    1993-11-01

    Full Text Available Focussing on the "McCabe family" of measures for the decision/logic structure of a program leads to an evaluation of extensions to modularization, nesting and, potentially, to object-oriented program structures. A comparison of rated, operating and essential complexities of programs suggests two new metrics: "inessential complexity" as a measure of unstructuredness and "product complexity" as a potential objective measure of structural complexity. Finally, nesting and abstraction levels are considered, especially as to how metrics from the "McCabe family" might be applied in an object-oriented systems development environment.
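
    To make the baseline concrete, the sketch below computes McCabe's original cyclomatic complexity, V(G) = number of decision points + 1, by walking a Python abstract syntax tree. It is an illustrative approximation only (the function name, the node set counted and the example snippet are this editor's assumptions, not taken from the paper), and it does not implement the paper's proposed "inessential" or "product" complexity metrics.

        # Illustrative sketch: McCabe's V(G) approximated as decisions + 1,
        # counted from a Python AST. The node set counted is an assumption;
        # tools differ on whether `and`/`or`, `with` or `assert` add a decision.
        import ast
        import textwrap

        DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)

        def cyclomatic_complexity(source: str) -> int:
            """Return V(G) = decision points + 1 for one function's source."""
            decisions = 0
            for node in ast.walk(ast.parse(source)):
                if isinstance(node, ast.BoolOp):
                    decisions += len(node.values) - 1    # each extra and/or operand
                elif isinstance(node, DECISION_NODES):
                    decisions += 1
            return decisions + 1

        example = textwrap.dedent("""
            def classify(x):
                if x < 0:
                    return "negative"
                for _ in range(3):
                    if x > 10 and x < 100:
                        return "mid"
                return "other"
        """)
        print(cyclomatic_complexity(example))            # 4 decisions + 1 = 5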

  3. The Homogeneity Research of Urban Rail Transit Network Performance

    Directory of Open Access Journals (Sweden)

    Wang Fu-jian

    2016-01-01

    Full Text Available Urban Rail Transit is an important part of public transit, so it is necessary to carry out the corresponding network function analysis. Previous studies have mainly analysed the network performance of a single city's rail transit, lacking horizontal comparison across cities, which makes it difficult to identify the underlying commonalities of different Urban Rail Transit network functions. Since an Urban Rail Transit network is a typical complex network, this paper applies complex network theory to study the homogeneity of Urban Rail Transit network performance. The rail networks of Beijing, Shanghai and Guangzhou are selected as cases and mapped to complex networks through the L-space and P-space methods, and a static topological analysis is carried out using complex network theory; network characteristics of the three cities are calculated and analysed in terms of node degree distribution and node connection preference. Finally, the paper studies how the network efficiency of the Urban Rail Transit systems changes under different attack modes. The results show that, although the network size, construction mode and planning of the three cities differ, their network performance exhibits a high degree of homogeneity in many respects.
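
    For readers unfamiliar with this kind of analysis, the sketch below applies the same ingredients (L-space mapping of lines to a station graph, degree distribution, and global efficiency under a targeted highest-degree attack) to a small invented toy network; the graph, line layout and metric choices are illustrative assumptions, not the Beijing, Shanghai or Guangzhou data.

        # Illustrative sketch on a toy L-space network (stations = nodes,
        # consecutive stations on a line = edges); not the real metro data.
        import networkx as nx

        g = nx.Graph()
        lines = [["A", "B", "C", "D", "E"],      # invented line 1
                 ["C", "F", "G", "H"],           # invented line 2
                 ["B", "G", "I"]]                # invented line 3
        for line in lines:
            nx.add_path(g, line)

        print("degree histogram:", nx.degree_histogram(g))
        print("baseline global efficiency:", round(nx.global_efficiency(g), 3))

        # Targeted attack: remove the highest-degree nodes first and track
        # how the network efficiency degrades.
        attacked = g.copy()
        for node, _ in sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:2]:
            attacked.remove_node(node)
            eff = nx.global_efficiency(attacked)
            print(f"after removing {node}: efficiency = {eff:.3f}")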

  4. Final Report

    DEFF Research Database (Denmark)

    Heiselberg, Per; Brohus, Henrik; Nielsen, Peter V.

    This final report for the Hybrid Ventilation Centre at Aalborg University describes the activities and research achievements in the project period from August 2001 to August 2006. The report summarises the work performed and the results achieved, with reference to articles and reports published...

  5. Molecular dynamics simulations of ter-pyridine, BTP, and their complexes with La3+, Eu3+ and Lu3+

    International Nuclear Information System (INIS)

    Guilbaud, P.; Dognon, J.P.

    2000-01-01

    This poster presents molecular dynamics simulations performed to study ter-pyridine and bis-triazinyl-pyridine (BTP) with lanthanide cations in the gas phase and in aqueous solution. Different counter-ions have been tested in order to assess their influence on the structures and stabilities of the complexes in both phases. For stable complexes, Gibbs free energy calculations have been performed to estimate the selectivity of these complexes towards the lanthanide cations. Finally, some tests have been done adding a polarization term to the potential energy in order to obtain a more precise description of the interaction energies. (authors)

  6. Libraries of Middlesex, Final Performance Report for Library Services and Construction Act (LSCA) Title VI, Library Literacy Program.

    Science.gov (United States)

    Director, Elissa

    This final performance report for the Libraries of Middlesex literacy project begins with a section that compares actual accomplishments to the following objectives for 1992-93: (1) recruit and enroll at least 150 new volunteers in Basic Reading of English as a Second Language (ESL) tutor training; (2) have at least 125 volunteers successfully…

  7. A (Small) complexity performance contest: SPT versus LBFS

    DEFF Research Database (Denmark)

    Nielsen, Erland Hejn

    2002-01-01

    When discussing the nature of loading rules relevant to continuous dynamic job/flow shop systems, the general understanding is that results obtained in simpler structures of 2 to 4 machines are invariant to scaling and can be generalised without problems to more complex structures. The SPT...... results on pure re-entrant flow shop structures emerges. It now seems that alternative loading rules as for instance the LBFS (Last Buffer First Served) due to its strong long run stabilising property attracts quite some interest. To be more precise about the complexity aspect, complexity in job...... are not entirely only of theoretical interest, as well as results from a standard serial job/flow shop set-up, but with resource limitations that prevent the independent operations of the individual stations in the system....

  9. A Low-Complexity and High-Performance 2D Look-Up Table for LDPC Hardware Implementation

    Science.gov (United States)

    Chen, Jung-Chieh; Yang, Po-Hui; Lain, Jenn-Kaie; Chung, Tzu-Wen

    In this paper, we propose a low-complexity, high-efficiency two-dimensional look-up table (2D LUT) for carrying out the sum-product algorithm in the decoding of low-density parity-check (LDPC) codes. Instead of employing adders for the core operation when updating check node messages, in the proposed scheme, the main term and correction factor of the core operation are successfully merged into a compact 2D LUT. Simulation results indicate that the proposed 2D LUT not only attains close-to-optimal bit error rate performance but also enjoys a low complexity advantage that is suitable for hardware implementation.
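
    As a point of reference, the sketch below shows the pairwise "box-plus" core operation of sum-product decoding with the min term and the correction terms folded into a single pre-computed two-dimensional table, which is the general idea behind such a 2D LUT; the quantization step, table size and test values are this editor's assumptions rather than the authors' design.

        # Illustrative sketch (assumed quantization, not the authors' table):
        # a ⊞ b = sign(a)sign(b)·min(|a|,|b|) + log(1+e^-|a+b|) - log(1+e^-|a-b|),
        # with the magnitude part pre-computed as a 2D look-up table.
        import numpy as np

        STEP, SIZE = 0.25, 64                      # quantization (assumption)
        mags = np.arange(SIZE) * STEP
        A, B = np.meshgrid(mags, mags, indexing="ij")
        LUT = (np.minimum(A, B)
               + np.log1p(np.exp(-(A + B)))
               - np.log1p(np.exp(-np.abs(A - B))))

        def box_plus(a: float, b: float) -> float:
            """Approximate a ⊞ b with one table read instead of adders/logs."""
            i = min(int(abs(a) / STEP), SIZE - 1)
            j = min(int(abs(b) / STEP), SIZE - 1)
            return np.sign(a) * np.sign(b) * LUT[i, j]

        def check_node_update(llrs):
            """Combine incoming LLRs at a check node by repeated box-plus."""
            acc = llrs[0]
            for v in llrs[1:]:
                acc = box_plus(acc, v)
            return acc

        print(check_node_update([1.2, -0.8, 2.5]))   # a small negative LLR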

  10. Effects of Task Performance and Task Complexity on the Validity of Computational Models of Attention

    NARCIS (Netherlands)

    Koning, L. de; Maanen, P.P. van; Dongen, K. van

    2008-01-01

    Computational models of attention can be used as a component of decision support systems. For accurate support, a computational model of attention has to be valid and robust. The effects of task performance and task complexity on the validity of three different computational models of attention were investigated…

  11. Larval Performance in the Context of Ecological Diversification and Speciation in Lycaeides Butterflies

    Directory of Open Access Journals (Sweden)

    Cynthia F. Scholl

    2012-01-01

    Full Text Available The role of ecology in diversification has been widely investigated, though few groups have been studied in enough detail to allow comparisons of different ecological traits that potentially contribute to reproductive isolation. We investigated larval performance within a species complex of Lycaeides butterflies. Caterpillars from seven populations were reared on five host plants, asking if host-specific, adaptive larval traits exist. We found large differences in performance across plants and fewer differences among populations. The patterns of performance are complex and suggest both conserved traits (i.e., plant effects across populations and more recent dynamics of local adaptation, in particular for L. melissa that has colonized an exotic host. We did not find a relationship between oviposition preference and larval performance, suggesting that preference did not evolve to match performance. Finally, we put larval performance within the context of several other traits that might contribute to ecologically based reproductive isolation in the Lycaeides complex. This larger context, involving multiple ecological and behavioral traits, highlights the complexity of ecological diversification and emphasizes the need for detailed studies on the strength of putative barriers to gene flow in order to fully understand the process of ecological speciation.

  12. The Impact of Environmental Complexity and Team Training on Team Processes and Performance in Multi-Team Environments

    National Research Council Canada - National Science Library

    Cobb, Marshall

    1999-01-01

    This study examined how manipulating the level of environmental complexity and the type of team training given to subject volunteers impacted important team process behaviors and performance outcomes...

  13. Profiles of Motor Laterality in Young Athletes' Performance of Complex Movements: Merging the MOTORLAT and PATHoops Tools

    Science.gov (United States)

    Castañer, Marta; Andueza, Juan; Hileno, Raúl; Puigarnau, Silvia; Prat, Queralt; Camerino, Oleguer

    2018-01-01

    Laterality is a key aspect of the analysis of basic and specific motor skills. It is relevant to sports because it involves motor laterality profiles beyond left-right preference and spatial orientation of the body. The aim of this study was to obtain the laterality profiles of young athletes, taking into account the synergies between the support and precision functions of limbs and body parts in the performance of complex motor skills. We applied two instruments: (a) MOTORLAT, a motor laterality inventory comprising 30 items of basic, specific, and combined motor skills, and (b) the Precision and Agility Tapping over Hoops (PATHoops) task, in which participants had to perform a path by stepping in each of 14 hoops arranged on the floor, allowing the observation of their feet, left-right preference and spatial orientation. A total of 96 young athletes performed the PATHoops task and the 30 MOTORLAT items, allowing us to obtain data about limb dominance and spatial orientation of the body in the performance of complex motor skills. Laterality profiles were obtained by means of a cluster analysis and a correlational analysis and a contingency analysis were applied between the motor skills and spatial orientation actions performed. The results obtained using MOTORLAT show that the combined motor skills criterion (for example, turning while jumping) differentiates athletes' uses of laterality, showing a clear tendency toward mixed laterality profiles in the performance of complex movements. In the PATHoops task, the best spatial orientation strategy was “same way” (same foot and spatial wing) followed by “opposite way” (opposite foot and spatial wing), in keeping with the research assumption that actions unfolding in a horizontal direction in front of an observer's eyes are common in a variety of sports. PMID:29930527

  14. Three propositions on why characteristics of performance management systems converge across policy areas with different levels of task complexity

    DEFF Research Database (Denmark)

    Bjørnholt, Bente; Lindholst, Andrej Christian; Agger Nielsen, Jeppe

    2014-01-01

    of task complexity amidst a lack of formal and overarching, government-wide policies. We advance our propositions from a case study comparing the characteristics of performance management systems across social services (eldercare) and technical services (park services) in Denmark. Contrary to expectations......This article investigates the differences and similarities between performance management systems across public services. We offer three propositions as to why the characteristics of performance management systems may still converge across policy areas in the public sector with different levels...... for divergence due to differences in task complexity, the characteristics of performance management systems in the two policy areas are observed to converge. On the basis of a case study, we propose that convergence has occurred due to 1) similarities in policy-specific reforms, 2) institutional pressures, and 3...

  15. Final report of the project performance assessment and economic evaluation of nuclear waste management

    International Nuclear Information System (INIS)

    Rasilainen, K.; Anttila, M.; Hautojaervi, A.

    1993-05-01

    The publication is the final report of the project Performance Assessment and Economic Evaluation of Nuclear Waste Management (TOKA) at the Nuclear Engineering Laboratory of VTT (Technical Research Centre of Finland), forming part of the Publicly Financed Nuclear Waste Management Research Programme (JYT). The project covers safety and cost aspects of all phases of nuclear waste management. The main emphasis has been on developing an integrated system of models for performance assessment of nuclear waste repositories. During the four years the project has so far been in progress, the total amount of work has been around 14 person-years. Computer codes are the main tools in the project; they are either developed by the project team or acquired from abroad. In-house model development has been especially active in groundwater flow, near-field and migration modelling. The quantitative interpretation of Finnish tracer experiments in the laboratory and of natural analogue studies at Palmottu supports performance assessments via increased confidence in the migration concepts used. The performance assessment philosophy adopted by the team consists of deterministic modelling and pragmatic scenario analysis. This is supported by the team's long-term experience in practical performance assessment and in theoretical probabilistic modelling exercises. The radiological risks of spent fuel transportation from the Loviisa nuclear power plant to Russia have been analysed using a probabilistic computer code and Finnish traffic accident statistics. The project assists the authorities in the annual assessment of utility estimates of funding needs for future nuclear waste management operations. The models and methods used within the project are tested in international verification/validation projects

  16. Product Complexity Impact on Quality and Delivery Performance

    OpenAIRE

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2011-01-01

    Existing literature on product portfolio complexity is mainly focused on cost-related aspects. It is widely acknowledged that an increase in a company's product portfolio will lead to an increase in complexity-related costs such as order management, procurement and inventory. The objective of this article is to examine which other factors might be affected when a company expands its product portfolio, if initiatives are not taken to accommodate this increase. Empirical work carried ...

  17. Traffic Management Systems Performance Measurement: Final Report

    OpenAIRE

    Banks, James H.; Kelly, Gregory

    1997-01-01

    This report documents a study of performance measurement for Transportation Management Centers (TMCs). Performance measurement requirements were analyzed, data collection and management techniques were investigated, and case study traffic data system improvement plans were prepared for two Caltrans districts.

  18. Fourier domain optical coherence tomography achieves full range complex imaging in vivo by introducing a carrier frequency during scanning

    International Nuclear Information System (INIS)

    Wang, Ruikang K

    2007-01-01

    The author describes a Fourier domain optical coherence tomography (FDOCT) system that is capable of full-range complex imaging in vivo. This is achieved by introducing a constant carrier frequency into the OCT spectral interferograms at the time when imaging is performed. The complex functions of the spatial interferograms formed at each single wavelength are constructed before performing the Fourier transformation that localizes the scatterers within the sample. Two algorithms, based on Fourier filtering and Hilbert transformation, respectively, are described to achieve full-range complex FDOCT imaging. It is shown that the Hilbert transformation approach delivers better performance than the Fourier filtering method in terms of tolerating sample movement in vivo. The author finally demonstrates the system and algorithms experimentally for true in vivo imaging at a rate of 20 000 axial scans per second
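
    The sketch below illustrates the Hilbert-transform variant of this idea on synthetic data: a lateral carrier is added to the spectral fringes, the analytic (complex) interferogram is built by a Hilbert transform along the lateral direction, and the subsequent Fourier transform along wavenumber then yields a depth profile without the complex-conjugate mirror image. The array sizes, carrier frequency and reflector depth are invented for illustration and are not taken from the paper.

        # Illustrative sketch with synthetic fringes (not the paper's data or code).
        import numpy as np
        from scipy.signal import hilbert

        n_k, n_x = 512, 256                        # spectral samples, lateral A-lines
        k = np.linspace(0, 1, n_k)[:, None]        # normalized wavenumber axis
        x = np.arange(n_x)[None, :]                # lateral scan index

        z0, fc = 80.0, 0.1                         # reflector depth, lateral carrier
        fringes = (np.cos(2 * np.pi * (k * z0 + fc * x))
                   + 0.05 * np.random.randn(n_k, n_x))

        # Hilbert transform along x builds the complex (analytic) interferogram;
        # the FFT along k then maps positive and negative depths without overlap.
        analytic = hilbert(fringes, axis=1)
        image = np.fft.fftshift(np.fft.fft(analytic, axis=0), axes=0)

        depth_profile = np.abs(image).mean(axis=1)
        print("peak depth bin:", int(np.argmax(depth_profile)))   # single peak, no mirror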

  19. Effects of Enzyme Complex Supplementation to a Paddy-based Diet on Performance and Nutrient Digestibility of Meat-type Ducks

    Directory of Open Access Journals (Sweden)

    P. Kang

    2013-02-01

    Full Text Available Paddy rice is rarely used as a feed because of its high fiber content. In this study, two experiments were conducted to study the effects of supplementing paddy-based diets with an enzyme complex consisting of xylanase, beta-glucanase and cellulase on the performance and nutrient digestibility of meat-type ducks. In both experiments, meat-type ducks (Cherry Valley) were randomly assigned to four treatments. Treatment 1 was a basal corn-soybean diet; treatment 2 was a basal corn-paddy-soybean diet; treatment 3 had the enzyme complex added to the corn-paddy-soybean basal diet at 0.5 g/kg of diet; and treatment 4 had the enzyme complex added to the corn-paddy-soybean diet at 1.0 g/kg of diet. The results showed that the enzyme complex increased the ADG and decreased the ADFI and F/G significantly (p<0.05). The outcome of this research indicates that the application of an enzyme complex made up of xylanase, beta-glucanase and cellulase in a corn-paddy-soybean diet can improve performance and nutrient digestibility in meat-type ducks.

  20. Shifting effects in randomised controlled trials of complex interventions: a new kind of performance bias?

    Science.gov (United States)

    Gold, C; Erkkilä, J; Crawford, M J

    2012-11-01

    Randomised controlled trials (RCTs) aim to provide unbiased estimates of treatment effects. However, the process of implementing trial procedures may have an impact on the performance of complex interventions that rely strongly on the intuition and confidence of therapists. We aimed to examine whether shifting effects over the recruitment period can be observed that might indicate such impact. Three RCTs investigating music therapy vs. standard care were included. The intervention was performed by experienced therapists and based on established methods. We examined outcomes of participants graphically, analysed cumulative effects and tested for differences between first vs. later participants. We tested for potential confounding population shifts through multiple regression models. Cumulative differences suggested trends over the recruitment period. Effect sizes tended to be less favourable among the first participants than later participants. In one study, effects even changed direction. Age, gender and baseline severity did not account for these shifting effects. Some trials of complex interventions have shifting effects over the recruitment period that cannot be explained by therapist experience or shifting demographics. Replication and further research should aim to find out which interventions and trial designs are most vulnerable to this new kind of performance bias. © 2012 John Wiley & Sons A/S.
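
    The cumulative inspection described here is straightforward to reproduce; the sketch below, run on simulated data rather than the music-therapy trials, orders participants by recruitment date and tracks the running standardized mean difference between arms, which is one simple way to make such shifting effects visible.

        # Illustrative sketch with simulated data (not the included RCTs):
        # running standardized mean difference (SMD) in recruitment order.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 120                                    # participants, recruitment order
        arm = rng.integers(0, 2, n)                # 0 = standard care, 1 = intervention
        drift = np.linspace(0.0, 0.8, n)           # effect grows over recruitment
        outcome = rng.normal(0, 1, n) + arm * drift

        def running_smd(outcome, arm, start=20):
            smd = []
            for i in range(start, len(outcome) + 1):
                o, a = outcome[:i], arm[:i]
                diff = o[a == 1].mean() - o[a == 0].mean()
                pooled = np.sqrt((o[a == 1].var(ddof=1) + o[a == 0].var(ddof=1)) / 2)
                smd.append(diff / pooled)
            return np.array(smd)

        smd = running_smd(outcome, arm)
        print("SMD after first 20 participants:", round(float(smd[0]), 2))
        print("SMD at end of recruitment:      ", round(float(smd[-1]), 2))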

  1. Low-level waste disposal site performance assessment with the RQ/PQ methodology. Final report

    International Nuclear Information System (INIS)

    Rogers, V.C.; Grant, M.W.; Sutherland, A.A.

    1982-12-01

    A methodology called RQ/PQ (retention quotient/performance quotient) has been developed for relating the potential hazard of radioactive waste to the natural and man-made barriers provided by a disposal facility. The methodology utilizes a systems approach to quantify the safety of low-level waste disposed in a near-surface facility. The main advantages of the RQ/PQ methodology are its simplicity of analysis and clarity of presentation while still allowing a comprehensive set of nuclides and pathways to be treated. Site performance and facility designs for low-level waste disposal can be easily investigated with relatively few parameters needed to define the problem. Application of the methodology has revealed that the key factor affecting the safety of low-level waste disposal in near surface facilities is the potential for intrusion events. Food, inhalation and well water pathways dominate in the analysis of such events. While the food and inhalation pathways are not strongly site-dependent, the well water pathway is. Finally, burial at depths of 5 m or more was shown to reduce the impacts from intrusion events

  2. Final Performance Progress Report

    Energy Technology Data Exchange (ETDEWEB)

    Houldin, Joseph [Delaware Valley Industrial Resource Center, Philadelphia, PA (United States); Saboor, Veronica [Delaware Valley Industrial Resource Center, Philadelphia, PA (United States)

    2016-03-30

    about assessing a company’s technical assets, broadening our view of the business to go beyond what they make or what NAICS code they have…to better understand their capacity, capability, and expertise, and to learn more about THEIR customers. Knowing more about the markets they serve can often provide insight into their level of technical knowledge and sophistication. Finally, in the spirit of realizing the intent of the Accelerator we strove to align and integrate the work and activities supported by the five funding agencies to leverage each effort. To that end, we include in the Integrated Work Plan a graphic that illustrates that integration. What follows is our summary report of the project, aggregated from prior reports.

  3. Will hypertension performance measures used for pay-for-performance programs penalize those who care for medically complex patients?

    Science.gov (United States)

    Petersen, Laura A; Woodard, Lechauncy D; Henderson, Louise M; Urech, Tracy H; Pietz, Kenneth

    2009-06-16

    There is concern that performance measures, patient ratings of their care, and pay-for-performance programs may penalize healthcare providers of patients with multiple chronic coexisting conditions. We examined the impact of coexisting conditions on the quality of care for hypertension and patient perception of overall quality of their health care. We classified 141 609 veterans with hypertension into 4 condition groups: those with hypertension-concordant (diabetes mellitus, ischemic heart disease, dyslipidemia) and/or -discordant (arthritis, depression, chronic obstructive pulmonary disease) conditions or neither. We measured blood pressure control at the index visit, overall good quality of care for hypertension, including a follow-up interval, and patient ratings of satisfaction with their care. Associations between condition type and number of coexisting conditions on receipt of overall good quality of care were assessed with logistic regression. The relationship between patient assessment and objective measures of quality was assessed. Of the cohort, 49.5% had concordant-only comorbidities, 8.7% had discordant-only comorbidities, 25.9% had both, and 16.0% had none. Odds of receiving overall good quality after adjustment for age were higher for those with concordant comorbidities (odds ratio, 1.78; 95% confidence interval, 1.70 to 1.87), discordant comorbidities (odds ratio, 1.32; 95% confidence interval, 1.23 to 1.41), or both (odds ratio, 2.25; 95% confidence interval, 2.13 to 2.38) compared with neither. Findings did not change after adjustment for illness severity and/or number of primary care and specialty care visits. Patient assessment of quality did not vary by the presence of coexisting conditions and was not related to objective ratings of quality of care. Contrary to expectations, patients with greater complexity had higher odds of receiving high-quality care for hypertension. Subjective ratings of care did not vary with the presence or absence of

  4. Electrochemically fabricated polypyrrole-cobalt-oxygen coordination complex as high-performance lithium-storage materials.

    Science.gov (United States)

    Guo, Bingkun; Kong, Qingyu; Zhu, Ying; Mao, Ya; Wang, Zhaoxiang; Wan, Meixiang; Chen, Liquan

    2011-12-23

    Current lithium-ion battery (LIB) technologies are all based on inorganic electrode materials, though organic materials have been used as electrodes for years. Disadvantages such as limited thermal stability and low specific capacity hinder their applications. On the other hand, the transition metal oxides that provide high lithium-storage capacity by way of electrochemical conversion reaction suffer from poor cycling stability. Here we report a novel high-performance, organic, lithium-storage material, a polypyrrole-cobalt-oxygen (PPy-Co-O) coordination complex, with high lithium-storage capacity and excellent cycling stability. Extended X-ray absorption fine structure and Raman spectroscopy and other physical and electrochemical characterizations demonstrate that this coordination complex can be electrochemically fabricated by cycling PPy-coated Co(3)O(4) between 0.0 V and 3.0 V versus Li(+)/Li. Density functional theory (DFT) calculations indicate that each cobalt atom coordinates with two nitrogen atoms within the PPy-Co coordination layer and the layers are connected with oxygen atoms between them. Coordination weakens the C-H bonds on PPy and makes the complex a novel lithium-storage material with high capacity and high cycling stability. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Integrated approaches for assessment of cellular performance in industrially relevant filamantous fungi

    DEFF Research Database (Denmark)

    Workman, Mhairi; Andersen, Mikael Rørdam; Thykær, Jette

    2013-01-01

    The performance of filamentous fungi in submerged cultivation determines their suitability for large-scale industrial biotechnology processes and is the result of complex interplay between the physical and chemical parameters of the process and the cellular biology of the fungi. Filamentous fungi...... of these organisms. Increased future focus on multicellular physiology and relevant assays will lead to fungal cells and processes that are customizable to a greater degree, finally allowing the full potential of these complex organisms and their product diversity to unfold....

  6. Does strategy instruction on the Rey-Osterrieth Complex Figure task lead to transferred performance improvement on the Modified Taylor Complex Figure task? A randomized controlled trial in school-aged children.

    Science.gov (United States)

    Resch, Christine; Keulers, Esther; Martens, Rosa; van Heugten, Caroline; Hurks, Petra

    2018-04-05

    Providing children with organizational strategy instruction on the Rey Osterrieth Complex Figure (ROCF) has previously been found to improve organizational and accuracy performance on this task. It is unknown whether strategy instruction on the ROCF would also transfer to performance improvement on copying and the recall of another complex figure. Participants were 98 typically developing children (aged 9.5-12.6 years, M = 10.6). Children completed the ROCF (copy and recall) as a pretest. Approximately a month later, they were randomized to complete the ROCF with strategy instruction in the form of a stepwise administration of the ROCF or again in the standard format. All children then copied and recalled the Modified Taylor Complex Figure (MTCF). All productions were assessed in terms of organization, accuracy and completion time. Organization scores for the MTCF did not differ for the two groups for the copy production, but did differ for the recall production, indicating transfer. Accuracy and completion times did not differ between groups. Performance on all measures, except copy accuracy, improved between pretest ROCF and posttest MTCF production for both groups, suggesting practice effects. Findings indicate that transfer of strategy instruction from one complex figure to another is only present for organization of recalled information. The increase in RCF-OSS scores did not lead to a higher accuracy or a faster copy or recall.

  7. An evaluation of the performance of two binaural beamformers in complex and dynamic multitalker environments.

    Science.gov (United States)

    Best, Virginia; Mejia, Jorge; Freeston, Katrina; van Hoesel, Richard J; Dillon, Harvey

    2015-01-01

    Binaural beamformers are super-directional hearing aids created by combining microphone outputs from each side of the head. While they offer substantial improvements in SNR over conventional directional hearing aids, the benefits (and possible limitations) of these devices in realistic, complex listening situations have not yet been fully explored. In this study we evaluated the performance of two experimental binaural beamformers. Testing was carried out using a horizontal loudspeaker array. Background noise was created using recorded conversations. Performance measures included speech intelligibility, localization in noise, acceptable noise level, subjective ratings, and a novel dynamic speech intelligibility measure. Participants were 27 listeners with bilateral hearing loss, fitted with BTE prototypes that could be switched between conventional directional or binaural beamformer microphone modes. Relative to the conventional directional microphones, both binaural beamformer modes were generally superior for tasks involving fixed frontal targets, but not always for situations involving dynamic target locations. Binaural beamformers show promise for enhancing listening in complex situations when the location of the source of interest is predictable.

  8. Adaptive control for a class of nonlinear complex dynamical systems with uncertain complex parameters and perturbations.

    Directory of Open Access Journals (Sweden)

    Jian Liu

    Full Text Available In this paper, adaptive control is extended from real space to complex space, resulting in a new control scheme for a class of n-dimensional time-dependent strict-feedback complex-variable chaotic (hyperchaotic) systems (CVCSs) in the presence of uncertain complex parameters and perturbations, which has not been previously reported in the literature. In detail, we have developed a unified framework for designing the adaptive complex scalar controller to ensure this type of CVCSs asymptotically stable and for selecting complex update laws to estimate unknown complex parameters. In particular, combining Lyapunov functions dependent on complex-valued vectors and back-stepping technique, sufficient criteria on stabilization of CVCSs are derived in the sense of Wirtinger calculus in complex space. Finally, numerical simulation is presented to validate our theoretical results.

  9. Adaptive control for a class of nonlinear complex dynamical systems with uncertain complex parameters and perturbations.

    Science.gov (United States)

    Liu, Jian; Liu, Kexin; Liu, Shutang

    2017-01-01

    In this paper, adaptive control is extended from real space to complex space, resulting in a new control scheme for a class of n-dimensional time-dependent strict-feedback complex-variable chaotic (hyperchaotic) systems (CVCSs) in the presence of uncertain complex parameters and perturbations, which has not been previously reported in the literature. In detail, we have developed a unified framework for designing the adaptive complex scalar controller to ensure this type of CVCSs asymptotically stable and for selecting complex update laws to estimate unknown complex parameters. In particular, combining Lyapunov functions dependent on complex-valued vectors and back-stepping technique, sufficient criteria on stabilization of CVCSs are derived in the sense of Wirtinger calculus in complex space. Finally, numerical simulation is presented to validate our theoretical results.
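
    To give a flavour of what such a complex adaptive scheme looks like, the generic Lyapunov-based design below is written out for a simpler (non-backstepping, vector-controller) system; it illustrates the idea in Wirtinger-calculus form and is not the authors' actual controller or update law.

        % Generic illustration (not the paper's exact design): for a complex system
        %   \dot{z} = F(z)\,\Theta + f(z) + u,  z \in \mathbb{C}^n,
        % with unknown constant complex parameter vector \Theta, choose
        \begin{align}
          u &= -f(z) - F(z)\,\hat{\Theta} - k\,z, & k &> 0, \\
          \dot{\hat{\Theta}} &= \gamma\,\overline{F(z)}^{\mathsf T} z, & \gamma &> 0 .
        \end{align}
        % With \tilde{\Theta} = \Theta - \hat{\Theta}, the Lyapunov candidate
        \begin{equation}
          V = \tfrac{1}{2}\,\bar{z}^{\mathsf T} z
            + \tfrac{1}{2\gamma}\,\bar{\tilde{\Theta}}^{\mathsf T} \tilde{\Theta}
        \end{equation}
        % satisfies \dot{V} = -k\,\bar{z}^{\mathsf T} z \le 0, so z \to 0 while the
        % parameter estimate stays bounded.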

  10. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    Science.gov (United States)

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
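
    The non-intrusive workflow described here (a limited number of deterministic runs, an ANN surrogate, then Monte Carlo sampling) can be sketched in a few lines; the toy "simulator", the input ranges and the scikit-learn model below are placeholders chosen by this editor, not the NMT process model or its actual uncertain variables.

        # Illustrative sketch of a non-intrusive probabilistic assessment:
        # deterministic runs -> ANN surrogate -> Monte Carlo simulation (MCS).
        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        def simulator(heater_temp, moisture):
            """Stand-in for one expensive thermodynamic process simulation."""
            return 0.12 * heater_temp - 40.0 * moisture + rng.normal(0, 0.5)

        # 1) a small design of experiments over assumed uncertain input ranges
        temps = rng.uniform(500, 700, 60)          # SE heater temperature (assumed)
        moist = rng.uniform(0.6, 0.8, 60)          # feed moisture fraction (assumed)
        X = np.column_stack([temps, moist])
        y = np.array([simulator(t, m) for t, m in X])

        # 2) ANN approximation model fitted to the deterministic results
        surrogate = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                                 random_state=0).fit(X, y)

        # 3) MCS through the cheap surrogate gives the output distribution
        samples = np.column_stack([rng.uniform(500, 700, 100_000),
                                   rng.uniform(0.6, 0.8, 100_000)])
        power = surrogate.predict(samples)
        print("net power, 95% interval:", np.percentile(power, [2.5, 97.5]).round(1))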

  11. Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Aristos Aristidou (NatureWorks); Robert Kean (NatureWorks); Tom Schechinger (IronHorse Farms, Mat); Stuart Birrell (Iowa State); Jill Euken (Wallace Foundation & Iowa State)

    2007-10-01

    The two main objectives of this project were: 1) to develop and test technologies to harvest, transport, store, and separate corn stover to supply a clean raw material to the bioproducts industry, and 2) to engineer fermentation systems to meet performance targets for lactic acid and ethanol manufacturers. Significant progress was made in testing methods to harvest corn stover in a “single pass” harvest mode (collect corn grain and stover at the same time). This is technically feasible on a small scale, but additional equipment refinements will be needed to facilitate cost-effective harvest on a larger scale. Transportation models were developed, which indicate that at a corn stover yield of 2.8 tons/acre and a purchase price of $35/ton of stover, it would be unprofitable to transport stover more than about 25 miles; this suggests the development of many regional collection centers. Therefore, collection centers should be located within about 30 miles of the farm to keep transportation costs at an acceptable level. These collection centers could then potentially do some preprocessing (to fractionate or increase bulk density) and/or ship the biomass by rail or barge to the final customers. Wet storage of stover via ensilage was tested, but no clear economic advantages were evident. Wet storage eliminates fire risk, but increases the complexity of component separation and may result in a small loss of carbohydrate content (fermentation potential). A study of possible supplier-producer relationships concluded that a “quasi-vertical” integration model would be best suited for new bioproducts industries based on stover. In this model, the relationship would involve a multiyear supply contract (processor with purchase guarantees, producer group with supply guarantees). Price will likely be fixed or calculated based on some formula (possibly a cost plus). Initial quality requirements will be specified (but subject to refinement). Producers would invest in harvest

  12. Final disposal room structural response calculations

    International Nuclear Information System (INIS)

    Stone, C.M.

    1997-08-01

    Finite element calculations have been performed to determine the structural response of waste-filled disposal rooms at the WIPP for a period of 10,000 years after emplacement of the waste. The calculations were performed to generate the porosity surface data for the final set of compliance calculations. The most recent reference data for the stratigraphy, waste characterization, gas generation potential, and nonlinear material response have been brought together for this final set of calculations

  13. Impact of Business Interoperability on the Performance of Complex Cooperative Supply Chain Networks: A Case Study

    Directory of Open Access Journals (Sweden)

    Izunildo Cabral

    2018-01-01

    Full Text Available This paper proposes an agent-based model for evaluating the effect of business interoperability on the performance of cooperative supply chain networks. The model is based on insights from the Industrial Marketing and Purchasing network approach and the complex systems theory perspective. To demonstrate its applicability, an explanatory case study of a Portuguese reverse logistics cooperative supply chain network is presented. Face-to-face interviews and forms were used to collect data. The findings show that the establishment of appropriate levels of business interoperability has helped to reduce several non-value-added interaction processes and consequently improve the operational performance of the Valorpneu network. Regarding the research implications, this paper extends current knowledge on business interoperability and on an important problem in business: how business interoperability gaps in dyadic organizational relationships affect the wider network of companies to which the two companies belong (the network effect). In terms of practical implications, managers can use the proposed model as a starting point to simulate complex interactions between supply chain network partners and to better understand how the performance of their networks emerges from these interactions and from the adoption of different levels of business interoperability.
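
    As a minimal illustration of how such an agent-based simulation can be set up, the sketch below lets each dyadic link carry an interoperability level that determines how often an interaction needs non-value-added rework; the link names, numbers and cost rule are invented for illustration and bear no relation to the Valorpneu case data.

        # Minimal agent-based sketch (invented links and numbers, not the case study):
        # low interoperability on a dyadic link => more rework => higher network cost.
        import random

        random.seed(1)

        links = {("collector", "recycler"): 0.3,        # interoperability in [0, 1]
                 ("recycler", "manufacturer"): 0.8,
                 ("manufacturer", "collector"): 0.5}

        def total_interaction_cost(links, orders=1000, base_cost=1.0):
            """Sum interaction costs over randomly routed orders."""
            total = 0.0
            for _ in range(orders):
                pair = random.choice(list(links))
                interop = links[pair]
                rework = 1.0 if random.random() > interop else 0.0
                total += base_cost * (1.0 + rework)
            return total

        print("baseline network cost:", total_interaction_cost(links))
        improved = {pair: min(1.0, level + 0.3) for pair, level in links.items()}
        print("after raising interoperability:", total_interaction_cost(improved))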

  14. Synthesis, crystal structure, spectroscopic characterization and nonlinear optical properties of Co(II)- picolinate complex

    Energy Technology Data Exchange (ETDEWEB)

    Tamer, Ömer, E-mail: omertamer@sakarya.edu.tr; Avcı, Davut; Atalay, Yusuf

    2015-11-15

    A cobalt(II) complex of picolinate was synthesized, and its structure was fully characterized by X-ray diffraction as well as FT-IR, FT-Raman and UV–vis spectroscopies. To support the experimental results and take the study further, density functional theory calculations were performed at the B3LYP level. Single-crystal X-ray structural analysis shows that the cobalt(II) ion is located at the center of a distorted octahedral geometry. The C=O, C=C and C=N stretching vibrations were found to be strong, highly active peaks, indicating molecular charge transfer within the Co(II) complex. The small gap between the frontier molecular orbital energies is another indicator of molecular charge transfer interactions within the Co(II) complex. The nonlinear optical properties of the Co(II) complex were investigated at the DFT/B3LYP level, and the hyperpolarizability parameter was found to be reduced owing to the presence of inversion symmetry. Natural bond orbital (NBO) analysis was performed to investigate molecular stability, hyperconjugative interactions, intramolecular charge transfer (ICT) and bond strength for the Co(II) complex. Finally, the molecular electrostatic potential (MEP) and spin density distributions for the Co(II) complex were evaluated. - Highlights: • A Co(II) complex of picolinate was prepared. • Its FT-IR, FT-Raman and UV–vis spectra were measured. • DFT calculations were performed to support the experimental results. • The small HOMO-LUMO energy gap is an indicator of molecular charge transfer. • Spin density is localized on Co(II) as well as the O and N atoms.

  15. On improving the performance of nonphotochemical quenching in CP29 light-harvesting antenna complex

    Energy Technology Data Exchange (ETDEWEB)

    Berman, Gennady P. [Theoretical Division, T-4, Los Alamos National Laboratory, and the New Mexico Consortium, Los Alamos, NM 87544 (United States); Nesterov, Alexander I., E-mail: nesterov@cencar.udg.mx [Departamento de Física, CUCEI, Universidad de Guadalajara, Av. Revolución 1500, Guadalajara, CP 44420, Jalisco (Mexico); Sayre, Richard T. [Biological Division, B-11, Los Alamos National Laboratory, and the New Mexico Consortium, Los Alamos, NM 87544 (United States); Still, Susanne [Department of Information and Computer Sciences, and Department of Physics and Astronomy, University of Hawaii at Mānoa, 1860 East–West Road, Honolulu, HI 96822 (United States)

    2016-03-22

    We model and simulate the performance of charge-transfer in nonphotochemical quenching (NPQ) in the CP29 light-harvesting antenna-complex associated with photosystem II (PSII). The model consists of five discrete excitonic energy states and two sinks, responsible for the potentially damaging processes and charge-transfer channels, respectively. We demonstrate that by varying (i) the parameters of the chlorophyll-based dimer, (ii) the resonant properties of the protein-solvent environment interaction, and (iii) the energy transfer rates to the sinks, one can significantly improve the performance of the NPQ. Our analysis suggests strategies for improving the performance of the NPQ in response to environmental changes, and may stimulate experimental verification. - Highlights: • Improvement of the efficiency of the charge-transfer nonphotochemical quenching in CP29. • Strategy for restoring the NPQ efficiency when the environment changes. • By changing of energy transfer rates to the sinks, one can significantly improve the performance of the NPQ.
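
    A toy version of such a discrete-state model with two sinks can be written as a linear rate system, as sketched below; the number of states matches the description above, but all rate constants and the initial condition are invented for illustration and are not the paper's parameters.

        # Illustrative sketch (invented rates): five excitonic states coupled by
        # nearest-neighbour transfer, one state drains to a charge-transfer (CT)
        # sink and another to a damaging channel; NPQ performance ~ CT fraction.
        import numpy as np
        from scipy.integrate import solve_ivp

        k_hop, k_ct, k_dam = 1.0, 0.5, 0.05        # inverse-time units (assumed)

        def rhs(t, y):
            p = y[:5]                              # excitonic populations
            dp = np.zeros(5)
            for i in range(4):                     # nearest-neighbour exchange
                dp[i] += k_hop * (p[i + 1] - p[i])
                dp[i + 1] += k_hop * (p[i] - p[i + 1])
            dp[4] -= k_ct * p[4]                   # drain to CT sink
            dp[0] -= k_dam * p[0]                  # drain to damaging channel
            return np.concatenate([dp, [k_ct * p[4], k_dam * p[0]]])

        y0 = np.array([1, 0, 0, 0, 0, 0, 0], dtype=float)   # excitation on state 0
        sol = solve_ivp(rhs, (0, 200), y0, rtol=1e-8)
        ct, dam = sol.y[5, -1], sol.y[6, -1]
        print(f"routed to charge transfer: {ct:.2f}, lost to damaging channel: {dam:.2f}")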

  16. The Curvilinear Relationship between State Neuroticism and Momentary Task Performance

    Science.gov (United States)

    Debusscher, Jonas; Hofmans, Joeri; De Fruyt, Filip

    2014-01-01

    A daily diary and two experience sampling studies were carried out to investigate curvilinearity of the within-person relationship between state neuroticism and task performance, as well as the moderating effects of within-person variation in momentary job demands (i.e., work pressure and task complexity). In one, results showed that under high work pressure, the state neuroticism–task performance relationship was best described by an exponentially decreasing curve, whereas an inverted U-shaped curve was found for tasks low in work pressure, while in another study, a similar trend was visible for task complexity. In the final study, the state neuroticism–momentary task performance relationship was a linear one, and this relationship was moderated by momentary task complexity. Together, results from all three studies showed that it is important to take into account the moderating effects of momentary job demands because within-person variation in job demands affects the way in which state neuroticism relates to momentary levels of task performance. Specifically, we found that experiencing low levels of state neuroticism may be most beneficial in high demanding tasks, whereas more moderate levels of state neuroticism are optimal under low momentary job demands. PMID:25238547

  17. Structural response of Paks NPP WWER-440 MW main building complex to blast input motion. Final report

    International Nuclear Information System (INIS)

    1999-01-01

    The Soviet standard-design WWER-440/213 units installed at Paks NPP were not originally designed for a Safe Shutdown Earthquake. At the time the Paks site was selected, it was assumed on the basis of historical earthquake data that the maximum earthquake would be of intensity V on the MSK-64 scale. This seismicity level did not require any special measures to account for the effects of seismic events on the Main Building Complex structure. Current site seismicity studies reveal that the seismic hazard for the site significantly exceeds the original estimate. In addition, the safety rules and seismic code requirements have become more stringent. As part of the activities to increase the seismic safety of Paks NPP, a study of the dynamic behaviour of the Main Building Complex structure was performed with the support of the IAEA. Full-scale explosion tests were carried out to determine the dynamic behaviour of the structure and to assess the Soil-Structure Interaction (SSI) effects in the modelling and analysis procedures used in the dynamic response analyses. The objective of the project was to evaluate the blast response of the WWER-440/213 Main Building Complex at Paks NPP, based on the data available for the soil properties, the recorded free-field blast input motion, and the structural design. The scope of the EQE-Bulgaria study was to conduct a state-of-the-art SSI analysis with a model of the Main Building Complex supported on multiple foundations in order to assess the structure's blast response. The analysis focused on a modelling technique that realistically assesses the SSI effects on the dynamic response of a structure supported on multiple foundations, instead of simplified but more conservative techniques. The research was carried out in the following steps: development of a twin-unit model of the Main Building Complex structure; development of a Low Strain Soil Properties Model; development of SSI Parameters consisting of a Multiple Foundations System

  18. Is performance in pre-clinical assessment a good predictor of the final Doctor of Medicine grade?

    Science.gov (United States)

    Al-Wardy, Nadia M; Rizvi, Syed G; Bayoumi, Riad A

    2009-12-01

    To investigate if any correlation exists between students' grades on their final Doctor of Medicine (MD) assessment and their overall preclinical grade point average (GPA) and its component parts. Student data available from the Deanship of Admissions and Registration were analyzed. Pearson correlation coefficients were obtained to assess the degree of linear relationship between performance in the preclinical and the MD assessment of 529 students who graduated from the College of Medicine and Health Sciences, Sultan Qaboos University, Al-Khoud, Oman from June 1998 to June 2005. Simple and multiple regression analyses were performed to evaluate the individual and combined impact of the preclinical courses' grades on MD grades. Preclinical GPA correlated highly with MD GPA (r=0.641). The science component taught early in the preclinical phase correlated more strongly (r=0.457) than student electives (r=0.246). This correlation was better in the good English group. Students' performance, however, was best in electives, but worst in English. Most students who had a low MD GPA (<2.5)… and limiting the credit hour requirement of electives by the College seems to be justified.

  19. Childhood school performance, education and occupational complexity: a life-course study of dementia in the Kungsholmen Project.

    Science.gov (United States)

    Dekhtyar, Serhiy; Wang, Hui-Xin; Fratiglioni, Laura; Herlitz, Agneta

    2016-08-01

    The cognitive reserve hypothesis predicts that intellectually demanding activities over the life course protect against dementia. We investigate whether childhood school performance remains associated with dementia once education and occupational complexity are taken into account. A cohort of 440 individuals aged 75+ from the Kungsholmen Project was followed up for 9 years to detect dementia. To measure early-life contributors to reserve, we used grades at age 9-10 extracted from the school archives. Data on formal education and occupational complexity were collected at baseline and at the first follow-up. Dementia was ascertained through comprehensive clinical examination. Cox models estimated the relationship between life-course cognitive reserve measures and dementia. Dementia risk was elevated [hazard ratio (HR): 1.54, 95% confidence interval (CI): 1.03 to 2.29] in individuals with low early-life school grades after adjustment for formal educational attainment and occupational complexity. Secondary education was associated with a lower risk of dementia (HR: 0.72, 95% CI: 0.50 to 1.03), although the effects of post-secondary and university degrees were indistinguishable from baseline. Occupational complexity with data and things was not related to dementia. However, an association was found between high occupational complexity with people and dementia, albeit only in women (HR: 0.39, 95% CI: 0.14 to 0.99). The pattern of results remained unchanged after adjustment for genetic susceptibility, comorbidities and depressive symptoms. Low early-life school performance is associated with an elevated risk of dementia, independent of subsequent educational and occupational attainment. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  20. Photon final states at the Tevatron

    Energy Technology Data Exchange (ETDEWEB)

    Campanelli, Mario; /University Coll. London

    2008-04-01

    The authors present here several recent measurements involving the associated production of photons and jets at the Tevatron. In particular, inclusive photon + MET from D0, and photon + b-jets and photon + b-jet + leptons + MET from CDF are described in some detail. These measurements offer a good test of QCD predictions in rather complex final states.

  1. The Effects of Enzyme Complex on Performance, Intestinal Health and Nutrient Digestibility of Weaned Pigs

    Directory of Open Access Journals (Sweden)

    J. Q. Yi

    2013-08-01

    Full Text Available Two experiments were conducted to evaluate the effect of supplementing a corn-soybean meal-based diet with an enzyme complex containing amylase, protease and xylanase on the performance, intestinal health, apparent ileal digestibility of amino acids and nutrient digestibility of weaned pigs. In Exp. 1, 108 piglets weaned at 28 d of age were fed one of three diets containing 0 (control), 100, or 150 ppm enzyme complex for 4 wks, based on a two-phase feeding program, namely 1 to 7 d (phase 1) and 8 to 28 d (phase 2). At the end of the experiment, six pigs from the control group and the group supplemented with 150 ppm enzyme complex were chosen to collect digesta samples from the intestine to measure viscosity and pH in the stomach, ileum, and cecum, as well as volatile fatty acid concentrations and the composition of the microflora in the cecum and colon. There were linear increases (p<0.01) in weight gain, gain:feed ratio and digestibility of gross energy with the increasing dose rate of enzyme supplementation during the whole experiment. Supplementation with the enzyme complex increased the digesta viscosity in the stomach (p<0.05) and significantly increased (p<0.01) the concentrations of acetic, propionic and butyric acid in the cecum and colon. Enzyme supplementation also significantly increased the population of Lactobacilli (p<0.01) in the cecum and decreased the population of E. coli (p<0.05) in the colon. In Exp. 2, six crossbred barrows (initial body weight: 18.26±1.21 kg), fitted with a simple T-cannula at the distal ileum, were assigned to three dietary treatments according to a replicated 3×3 Latin Square design. The experimental diets were the same as the diets used in phase 2 of Exp. 1. Apparent ileal digestibility of isoleucine (p<0.01), valine (p<0.05) and aspartic acid (p<0.05) increased linearly with the increasing dose rate of enzyme supplementation. In conclusion, supplementation of the diet with an enzyme complex containing amylase, protease and

  2. Finalizing a measurement framework for the burden of treatment in complex patients with chronic conditions

    Directory of Open Access Journals (Sweden)

    Eton DT

    2015-03-01

    % were coping with multiple chronic conditions. A preliminary conceptual framework using data from the first 32 interviews was evaluated and was modified using narrative data from 18 additional interviews with a racially and socioeconomically diverse sample of patients. The final framework features three overarching themes with associated subthemes. These themes included: 1) work patients must do to care for their health (eg, taking medications, keeping medical appointments, monitoring health); 2) challenges/stressors that exacerbate perceived burden (eg, financial, interpersonal, provider obstacles); and 3) impacts of burden (eg, role limitations, mental exhaustion). All themes and subthemes were subsequently confirmed in focus groups. Conclusion: The final conceptual framework can be used as a foundation for building a patient self-report measure to systematically study treatment burden for research and analytical purposes, as well as to promote meaningful clinic-based dialogue between patients and providers about the challenges inherent in maintaining complex self-management of health. Keywords: treatment burden, conceptual framework, adherence, questionnaire, self-management, multi-morbidity

  3. Final Report: Optimal Model Complexity in Geological Carbon Sequestration: A Response Surface Uncertainty Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Ye [Univ. of Wyoming, Laramie, WY (United States)

    2018-01-17

    equivalency, all the stratigraphic models were successfully upscaled from the reference heterogeneous model for bulk flow and transport predictions (Zhang & Zhang, 2015). GCS was then simulated with all models, yielding insights into the level of parameterization complexity that is needed for the accurate simulation of reservoir pore pressure, CO2 storage, leakage, footprint, and dissolution over both short (i.e., injection) and longer (monitoring) time scales. Important uncertainty parameters that impact these key performance metrics were identified for the stratigraphic models as well as for the heterogeneous model, leading to the development of reduced/simplified models at lower characterization cost that can be used for the reservoir uncertainty analysis. All the CO2 modeling was conducted using PFLOTRAN – a massively parallel, multiphase, multi-component, and reactive transport simulator developed by a multi-laboratory DOE/SciDAC (Scientific Discovery through Advanced Computing) project (Zhang et al., 2017, in review). Within the uncertainty analysis framework, increasing reservoir depths were investigated to explore their effect on the uncertainty outcomes and the potential for developing gravity-stable injection with increased storage security (Dai et al., 2016; Dai et al., 2017, in review). Finally, to accurately model CO2 fluid-rock reactions and the resulting long-term storage as secondary carbonate minerals, a modified kinetic rate law for general mineral dissolution and precipitation was proposed and verified that is invariant to a scale transformation of the mineral formula weight. This new formulation will lead to more accurate assessment of mineral storage over geologic time scales (Lichtner, 2016).

  4. A Memristor-Based Hyperchaotic Complex Lü System and Its Adaptive Complex Generalized Synchronization

    Directory of Open Access Journals (Sweden)

    Shibing Wang

    2016-02-01

    Full Text Available This paper introduces a new memristor-based hyperchaotic complex Lü system (MHCLS) and investigates its adaptive complex generalized synchronization (ACGS). Firstly, the complex system is constructed based on a memristor-based hyperchaotic real Lü system, and its properties are analyzed theoretically. Secondly, its dynamical behaviors, including hyperchaos, chaos, transient phenomena, as well as periodic behaviors, are explored numerically by means of bifurcation diagrams, Lyapunov exponents, phase portraits, and time history diagrams. Thirdly, an adaptive controller and a parameter estimator are proposed to realize complex generalized synchronization and parameter identification of two identical MHCLSs with unknown parameters based on Lyapunov stability theory. Finally, the numerical simulation results of ACGS and its applications to secure communication are presented to verify the feasibility and effectiveness of the proposed method.

  5. High performance yellow organic electroluminescent devices by doping iridium(III) complex into host materials with stepwise energy levels

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Rongzhen; Zhou, Liang, E-mail: zhoul@ciac.ac.cn; Jiang, Yunlong; Li, Yanan; Zhao, Xuesen; Zhang, Hongjie, E-mail: hongjie@ciac.ac.cn

    2015-10-15

    In this work, we aim to further improve the electroluminescent (EL) performance of a yellow light-emitting iridium(III) complex by designing double light-emitting layer (EML) devices having stepwise energy levels. Compared with single-EML devices, these designed double-EML devices showed improved EL efficiency and brightness attributed to a better balance of carriers. In addition, the stepwise distribution in energy levels of the host materials is instrumental in broadening the recombination zone, thus delaying the roll-off of EL efficiency. Based on the investigation of the carriers' distribution, the device structure was further optimized by adjusting the thickness of the deposited layers. Finally, a yellow EL device (Commission Internationale de l'Eclairage (CIE) coordinates of (0.446, 0.542)) with maximum current efficiency, power efficiency and brightness up to 78.62 cd/A (external quantum efficiency (EQE) of 21.1%), 82.28 lm/W and 72,713 cd/m², respectively, was obtained. Even at the high brightness of 1000 cd/m², an EL efficiency as high as 65.54 cd/A (EQE=17.6%) can be retained. - Highlights: • Yellow electroluminescent devices were designed and fabricated. • P-type and n-type materials having stepwise energy levels were chosen as host materials. • Better balance of holes and electrons causes the enhanced efficiencies. • Improved carriers' trapping suppresses the emission of host material.

  6. The Evaluation of Preprocessing Choices in Single-Subject BOLD fMRI Using NPAIRS Performance Metrics

    DEFF Research Database (Denmark)

    Stephen, LaConte; Rottenberg, David; Strother, Stephen

    2003-01-01

    to obtain cross-validation-based model performance estimates of prediction accuracy and global reproducibility for various degrees of model complexity. We rely on the concept of an analysis chain meta-model in which all parameters of the preprocessing steps along with the final statistical model are treated...

  7. Effects of task complexity on rhythmic reproduction performance in adults.

    Science.gov (United States)

    Iannarilli, Flora; Vannozzi, Giuseppe; Iosa, Marco; Pesce, Caterina; Capranica, Laura

    2013-02-01

    The aim of the present study was to investigate the effect of task complexity on the capability to reproduce rhythmic patterns. Sedentary musically illiterate individuals (age: 34.8±4.2 yrs; M±SD) were administered a rhythmic test including three rhythmic patterns to be reproduced by means of finger-tapping, foot-tapping and walking. For the quantification of subjects' ability in the reproduction of rhythmic patterns, qualitative and quantitative parameters were submitted to analysis. A stereophotogrammetric system was used to reconstruct and evaluate individual performances. The findings indicated a good internal stability of the rhythmic reproduction, suggesting that the present experimental design is suitable to discriminate the participants' rhythmic ability. Qualitative aspects of rhythmic reproduction (i.e., speed of execution and temporal ratios between events) varied as a function of the perceptual-motor requirements of the rhythmic reproduction task, with larger reproduction deviations in the walking task. Copyright © 2013 Elsevier B.V. All rights reserved.

  8. Energy Impact Illinois - Final Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Olson, Daniel [Senior Energy Efficiency Planner; Plagman, Emily [Senior Energy Planner; Silberhorn, Joey-Lin [Energy Efficiency Program Assistant

    2014-02-18

    Energy Impact Illinois (EI2) is an alliance of government organizations, nonprofits, and regional utility companies led by the Chicago Metropolitan Agency for Planning (CMAP) that is dedicated to helping communities in the Chicago metropolitan area become more energy efficient. Originally organized as the Chicago Region Retrofit Ramp-Up (CR3), EI2 became part of the nationwide Better Buildings Neighborhood Program (BBNP) in May 2010 after receiving a $25 million award from the U.S. Department of Energy (DOE) authorized through the American Recovery and Reinvestment Act of 2009 (ARRA). The program’s primary goal was to fund initiatives that mitigate barriers to energy efficiency retrofitting activities across residential, multifamily, and commercial building sectors in the seven-county CMAP region and to help to build a sustainable energy efficiency marketplace. The EI2 Final Technical Report provides a detailed review of the strategies, implementation methods, challenges, lessons learned, and final results of the EI2 program during the initial grant period from 2010-2013. During the program period, EI2 successfully increased direct retrofit activity in the region and was able to make a broader impact on the energy efficiency market in the Chicago region. As the period of performance for the initial grant comes to an end, EI2’s legacy raises the bar for the region in terms of helping homeowners and building owners to take action on the continually complex issue of energy efficiency.

  9. Development of [103Pd]-labeled-bis(N4-methylthiosemicarbazone) complexes as possible therapeutic agents

    International Nuclear Information System (INIS)

    Jalilian, A.R.; Sadeghi, M.; Kamrani, Y.Y.

    2006-01-01

    Due to the interesting tumor seeking properties of bis-thiosemicarbazones, two radio palladium-bis-thiosemicarbazone complexes, i.e., [103Pd]-pyruvaldehyde-bis(N4-methylthiosemicarbazone) ([103Pd]PTSM) and [103Pd]-diacetyl-bis(N4-methylthiosemicarbazone) ([103Pd]ATSM), were prepared in analogy to their radio copper homologs. Palladium-103 (t1/2 = 16.96 d) was produced via the 103Rh(p,n)103Pd nuclear reaction with a proton energy of 18 MeV. The final activity was eluted in the form of Pd(NH3)2Cl2 in order to react with the bis-thiosemicarbazones to yield the [103Pd]-labeled compounds. The chemical purity of the product was confirmed to be below the accepted limits by polarography. [103Pd]-labeled bis-thiosemicarbazones were prepared with a radiochemical yield of more than 80% at room temperature after 60-90 min by vortexing a mixture of the thiosemicarbazones and Pd activity in ethanol. The purification of the labeled compounds was performed by reverse phase column chromatography using a C18 plus Sep-Pak. A radiochemical purity of more than 99% and a specific activity of about 12,500-13,000 Ci/mol were obtained. The stability of the complexes was checked in the final product and in the presence of human serum at 37 °C up to 48 h. The partition coefficients of the final complexes were determined. The initial physico-chemical properties of the labeled compounds were compared to those of their copper homologues. (orig.)

  10. Discovering functional interdependence relationship in PPI networks for protein complex identification.

    Science.gov (United States)

    Lam, Winnie W M; Chan, Keith C C

    2012-04-01

    Protein molecules interact with each other in protein complexes to perform many vital functions, and different computational techniques have been developed to identify protein complexes in protein-protein interaction (PPI) networks. These techniques are developed to search for subgraphs of high connectivity in PPI networks under the assumption that the proteins in a protein complex are highly interconnected. While these techniques have been shown to be quite effective, the matching rate between the protein complexes they discover and those previously determined experimentally can be relatively low, and the "false-alarm" rate can be relatively high. This is especially the case when the assumption that proteins in protein complexes are more highly interconnected does not hold. To increase the matching rate and reduce the false-alarm rate, we have developed a technique that can work effectively without having to make this assumption. The technique, called protein complex identification by discovering functional interdependence (PCIFI), searches for protein complexes in PPI networks by taking into consideration both the functional interdependence relationship between protein molecules and the network topology of the network. The PCIFI works in several steps. The first step is to construct a multiple-function protein network graph by labeling each vertex with one or more of the molecular functions it performs. The second step is to filter out protein interactions between protein pairs that are not functionally interdependent of each other in the statistical sense. The third step is to make use of an information-theoretic measure to determine the strength of the functional interdependence between all remaining interacting protein pairs. Finally, the last step is to try to form protein complexes based on the measure of the strength of functional interdependence and the connectivity between proteins. For performance evaluation
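
    The four steps above translate naturally into a small prototype. The sketch below (not the authors' implementation) builds a function-labeled PPI graph with networkx, keeps only edges whose endpoints score above a crude interdependence threshold (a Jaccard overlap of function labels standing in for the paper's information-theoretic measure), and reports connected components of the filtered graph as candidate complexes; the proteins, annotations, and threshold are all hypothetical.

```python
# Sketch of a PCIFI-style pipeline on toy data (hypothetical proteins, annotations, threshold).
import networkx as nx

# Step 1: multiple-function protein network -- each vertex is labeled with molecular functions.
functions = {
    "P1": {"kinase"}, "P2": {"kinase", "ATP-binding"},
    "P3": {"transport"}, "P4": {"transport", "membrane"},
    "P5": {"kinase"},
}
ppi_edges = [("P1", "P2"), ("P2", "P5"), ("P1", "P5"), ("P3", "P4"), ("P2", "P3")]

G = nx.Graph()
for protein, funcs in functions.items():
    G.add_node(protein, functions=funcs)
G.add_edges_from(ppi_edges)

# Steps 2-3: score each interacting pair by a crude stand-in for functional interdependence
# (Jaccard overlap of function labels instead of the paper's information-theoretic measure).
def interdependence(u, v):
    fu, fv = G.nodes[u]["functions"], G.nodes[v]["functions"]
    return len(fu & fv) / len(fu | fv)

THRESHOLD = 0.3  # hypothetical cut-off
H = nx.Graph()
H.add_nodes_from(G.nodes)
H.add_edges_from((u, v) for u, v in G.edges if interdependence(u, v) >= THRESHOLD)

# Step 4: connected components of the filtered graph become candidate complexes.
candidates = [sorted(c) for c in nx.connected_components(H) if len(c) > 1]
print(candidates)  # expected: [['P1', 'P2', 'P5'], ['P3', 'P4']]
```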

  11. Effects of enzyme complex SSF (solid state fermentation) in pellet diets for Nile tilapia

    Directory of Open Access Journals (Sweden)

    Guilherme de Souza Moura

    2012-10-01

    Full Text Available The effects of the enzyme complex SSF (solid state fermentation) on growth performance and the availability of sucrose and monosaccharides in the chyme of Nile tilapia were evaluated. The study included 360 fish (70 g±4.43) in a completely randomized design with six dietary treatments (0, 50, 100, 150, 200 and 250 ppm of SSF) arranged in six replicates, with 10 fish per replicate. Every 15 days, one tilapia of each experimental unit was sacrificed for analyses of carbohydrate in the chyme. On day 60 of the experiment, the performance parameters were measured. There was a linear effect according to treatment for final weight and weight gain. For the other performance parameters, there were no differences. There was a quadratic effect for sucrose and glucose as a function of the treatment, whereas the fructose levels increased linearly. The addition of 150 ppm of the enzyme complex SSF in the feed improves the performance of Nile tilapia and increases the availability of sucrose and monosaccharides in the chyme.

  12. Chemical and mechanical performance properties for various final waste forms -- PSPI scoping study

    International Nuclear Information System (INIS)

    Farnsworth, R.K.; Larsen, E.D.; Sears, J.W.; Eddy, T.L.; Anderson, G.L.

    1996-09-01

    The US DOE is obtaining data on the performance properties of the various final waste forms that may be chosen as primary treatment products for the alpha-contaminated low-level and transuranic waste at the INEL's Transuranic Storage Area. This report collects and compares selected properties that are key indicators of mechanical and chemical durability for Portland cement concrete, concrete formed under elevated temperature and pressure, sulfur polymer cement, borosilicate glass, and various forms of alumino-silicate glass, including in situ vitrification glass and various compositions of iron-enriched basalt (IEB) and iron-enriched basalt IV (IEB4). Compressive strength and impact resistance properties were used as performance indicators in comparative evaluation of the mechanical durability of each waste form, while various leachability data were used in comparative evaluation of each waste form's chemical durability. The vitrified waste forms were generally more durable than the non-vitrified waste forms, with the iron-enriched alumino-silicate glasses and glass/ceramics exhibiting the most favorable chemical and mechanical durabilities. It appears that the addition of zirconia and titania to IEB (forming IEB4) increases the leach resistance of the lanthanides. The large compositional ranges for IEB and IEB4 more easily accommodate the compositions of the waste stored at the INEL than does the composition of borosilicate glass. It appears, however, that the large potential variation in IEB and IEB4 compositions resulting from differing waste feed compositions can impact waste form durability. Further work is needed to determine the range of waste stream feed compositions and rates of waste form cooling that will result in acceptable and optimized IEB or IEB4 waste form performance. 43 refs

  13. Testing a Firefly-Inspired Synchronization Algorithm in a Complex Wireless Sensor Network.

    Science.gov (United States)

    Hao, Chuangbo; Song, Ping; Yang, Cheng; Liu, Xiongjun

    2017-03-08

    Data acquisition is the foundation of soft sensor and data fusion. Distributed data acquisition and its synchronization are the important technologies to ensure the accuracy of soft sensors. As a research topic in bionic science, the firefly-inspired algorithm has attracted widespread attention as a new synchronization method. Aiming at reducing the design difficulty of firefly-inspired synchronization algorithms for Wireless Sensor Networks (WSNs) with complex topologies, this paper presents a firefly-inspired synchronization algorithm based on a multiscale discrete phase model that can optimize the performance tradeoff between the network scalability and synchronization capability in a complex wireless sensor network. The synchronization process can be regarded as a Markov state transition, which ensures the stability of this algorithm. Compared with the Mirollo and Strogatz model and the Reachback Firefly Algorithm, the proposed algorithm obtains better stability and performance. Finally, its practicality has been experimentally confirmed using 30 nodes in a real multi-hop topology with low quality links.
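
    The discrete phase model underlying such algorithms is easy to prototype. The sketch below is a generic pulse-coupled ("firefly") oscillator simulation on a ring topology, not the authors' multiscale model: each node advances its phase, fires when the phase wraps, and nudges its neighbours forward by a small coupling factor. Topology, coupling strength, and step size are assumed values.

```python
# Generic pulse-coupled ("firefly") oscillator simulation on a ring -- assumed parameters,
# not the multiscale discrete phase model of the paper.
import random

N_NODES = 30
NEIGHBORS = {i: [(i - 1) % N_NODES, (i + 1) % N_NODES] for i in range(N_NODES)}  # multi-hop ring
PERIOD, COUPLING, DT = 1.0, 0.05, 0.01

random.seed(1)
phase = [random.uniform(0.0, PERIOD) for _ in range(N_NODES)]

def step():
    fired = []
    for i in range(N_NODES):
        phase[i] += DT
        if phase[i] >= PERIOD:             # node i fires and resets its phase
            phase[i] -= PERIOD
            fired.append(i)
    for i in fired:                        # a firing node nudges its neighbours closer to firing
        for j in NEIGHBORS[i]:
            phase[j] = min(PERIOD, phase[j] * (1.0 + COUPLING))

for _ in range(20000):
    step()

spread = max(phase) - min(phase)           # small spread (or a wrap-around near PERIOD) = in sync
print(f"phase spread after simulation: {spread:.3f}")
```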

  14. Effects of long-term practice and task complexity on brain activities when performing abacus-based mental calculations: a PET study

    International Nuclear Information System (INIS)

    Wu, Tung-Hsin; Chen, Chia-Lin; Huang, Yung-Hui; Liu, Ren-Shyan; Hsieh, Jen-Chuen; Lee, Jason J.S.

    2009-01-01

    The aim of this study was to examine the neural bases for the exceptional mental calculation ability possessed by Chinese abacus experts through PET imaging. We compared the different regional cerebral blood flow (rCBF) patterns using 15 O-water PET in 10 abacus experts and 12 non-experts while they were performing each of the following three tasks: covert reading, simple addition, and complex contiguous addition. All data collected were analyzed using SPM2 and MNI templates. For non-experts during the tasks of simple addition, the observed activation of brain regions were associated with coordination of language (inferior frontal network) and visuospatial processing (left parietal/frontal network). Similar activation patterns but with a larger visuospatial processing involvement were observed during complex contiguous addition tasks, suggesting the recruitment of more visuospatial memory for solving the complex problems. For abacus experts, however, the brain activation patterns showed slight differences when they were performing simple and complex addition tasks, both of which involve visuospatial processing (bilateral parietal/frontal network). These findings supported the notion that the experts were completing all the calculation process on a virtual mental abacus and relying on this same computational strategy in both simple and complex tasks, which required almost no increasing brain workload for solving the latter. In conclusion, after intensive training and practice, the neural pathways in an abacus expert have been connected more effectively for performing the number encoding and retrieval that are required in abacus tasks, resulting in exceptional mental computational ability. (orig.)

  15. Phosphorylation site on yeast pyruvate dehydrogenase complex

    International Nuclear Information System (INIS)

    Uhlinger, D.J.

    1986-01-01

    The pyruvate dehydrogenase complex was purified to homogeneity from baker's yeast (Saccharomyces cerevisiae). Yeast cells were disrupted in a Manton-Gaulin laboratory homogenizer. The pyruvate dehydrogenase complex was purified by fractionation with polyethylene glycol, isoelectric precipitation, ultracentrifugation and chromatography on hydroxylapatite. Final purification of the yeast pyruvate dehydrogenase complex was achieved by cation-exchange high pressure liquid chromatography (HPLC). No endogenous pyruvate dehydrogenase kinase activity was detected during the purification. However, the yeast pyruvate dehydrogenase complex was phosphorylated and inactivated with purified pyruvate dehydrogenase kinase from bovine kidney. Tryptic digestion of the 32P-labeled complex yielded a single phosphopeptide which was purified to homogeneity. The tryptic digest was subjected to chromatography on a C-18 reverse phase HPLC column with a linear gradient of acetonitrile. Radioactive fractions were pooled, concentrated, and subjected to anion-exchange HPLC. The column was developed with a linear gradient of ammonium acetate. Final purification of the phosphopeptide was achieved by chromatography on a C-18 reverse phase HPLC column developed with a linear gradient of acetonitrile. The amino acid sequence of the homogeneous peptide was determined by manual modified Edman degradation

  16. 3D complex: a structural classification of protein complexes.

    Directory of Open Access Journals (Sweden)

    Emmanuel D Levy

    2006-11-01

    Full Text Available Most of the proteins in a cell assemble into complexes to carry out their function. It is therefore crucial to understand the physicochemical properties as well as the evolution of interactions between proteins. The Protein Data Bank represents an important source of information for such studies, because more than half of the structures are homo- or heteromeric protein complexes. Here we propose the first hierarchical classification of whole protein complexes of known 3-D structure, based on representing their fundamental structural features as a graph. This classification provides the first overview of all the complexes in the Protein Data Bank and allows nonredundant sets to be derived at different levels of detail. This reveals that between one-half and two-thirds of known structures are multimeric, depending on the level of redundancy accepted. We also analyse the structures in terms of the topological arrangement of their subunits and find that they form a small number of arrangements compared with all theoretically possible ones. This is because most complexes contain four subunits or less, and the large majority are homomeric. In addition, there is a strong tendency for symmetry in complexes, even for heteromeric complexes. Finally, through comparison of Biological Units in the Protein Data Bank with the Protein Quaternary Structure database, we identified many possible errors in quaternary structure assignments. Our classification, available as a database and Web server at http://www.3Dcomplex.org, will be a starting point for future work aimed at understanding the structure and evolution of protein complexes.
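
    The graph representation described above can be mimicked in a few lines: each complex becomes a graph whose nodes are subunits labeled by family and whose edges are subunit contacts, and two complexes fall into the same topology class when their labeled graphs are isomorphic. The sketch below uses networkx for the isomorphism test; the toy complexes and family labels are assumptions, not entries from the 3D complex database.

```python
# Group toy "complexes" by the topology of their subunit-contact graphs (assumed data).
import networkx as nx
from networkx.algorithms.isomorphism import categorical_node_match

def complex_graph(contacts, families):
    g = nx.Graph()
    for chain, family in families.items():
        g.add_node(chain, family=family)
    g.add_edges_from(contacts)
    return g

# Two homodimers of the same family and one heterotrimer (hypothetical examples).
c1 = complex_graph([("A", "B")], {"A": "globin", "B": "globin"})
c2 = complex_graph([("X", "Y")], {"X": "globin", "Y": "globin"})
c3 = complex_graph([("A", "B"), ("B", "C")], {"A": "f1", "B": "f2", "C": "f3"})

match = categorical_node_match("family", default=None)
classes = []                       # each class holds graphs with identical labeled topology
for g in (c1, c2, c3):
    for cls in classes:
        if nx.is_isomorphic(cls[0], g, node_match=match):
            cls.append(g)
            break
    else:
        classes.append([g])

print(len(classes), "topology classes")   # expected: 2
```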

  17. Course Syllabi and Their Effects on Students' Final Grade Performance.

    Science.gov (United States)

    Serafin, Ana Gil

    This study examined the relationship between the changes introduced in a course syllabus for a course titled "Instructional Strategies" and the final grades obtained by freshman and sophomore students in three successive academic periods. A sample of 150 subjects was randomly selected from students enrolled in the course at the…

  18. Morphological inversion of complex diffusion

    Science.gov (United States)

    Nguyen, V. A. T.; Vural, D. C.

    2017-09-01

    Epidemics, neural cascades, power failures, and many other phenomena can be described by a diffusion process on a network. To identify the causal origins of a spread, it is often necessary to identify the triggering initial node. Here, we define a new morphological operator and use it to detect the origin of a diffusive front, given the final state of a complex network. Our method performs better than algorithms based on distance (closeness) and Jordan centrality. More importantly, our method is applicable regardless of the specifics of the forward model, and therefore can be applied to a wide range of systems such as identifying the patient zero in an epidemic, pinpointing the neuron that triggers a cascade, identifying the original malfunction that causes a catastrophic infrastructure failure, and inferring the ancestral species from which a heterogeneous population evolves.
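
    For a concrete point of reference, the distance-based baseline mentioned above, the Jordan center (the affected node minimizing its maximum distance to all other affected nodes), can be written compactly; the morphological operator introduced in the paper is not reproduced here, and the graph and infected set below are synthetic.

```python
# Jordan-center baseline for locating the origin of a spread on a network
# (the paper's morphological operator is not reproduced; graph and infected set are synthetic).
import networkx as nx

G = nx.erdos_renyi_graph(60, 0.08, seed=1)
infected = set(nx.single_source_shortest_path_length(G, 0, cutoff=2))  # pretend node 0 started it

def jordan_center(graph, infected_nodes):
    best, best_ecc = None, float("inf")
    for u in infected_nodes:
        dist = nx.single_source_shortest_path_length(graph, u)
        ecc = max(dist[v] for v in infected_nodes)  # eccentricity restricted to the infected set
        if ecc < best_ecc:
            best, best_ecc = u, ecc
    return best

print("estimated origin:", jordan_center(G, infected), "(true origin: 0)")
```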

  19. Losses, Expansions, and Novel Subunit Discovery of Adaptor Protein Complexes in Haptophyte Algae.

    Science.gov (United States)

    Lee, Laura J Y; Klute, Mary J; Herman, Emily K; Read, Betsy; Dacks, Joel B

    2015-11-01

    The phylum Haptophyta (Diaphoratickes) contains marine algae that perform biomineralization, extruding large, distinctive calcium carbonate scales (coccoliths) that completely cover the cell. Coccolith production is an important part of global carbon cycling; however, the membrane trafficking pathway by which they are secreted has not yet been elucidated. In most eukaryotes, post-Golgi membrane trafficking involves five heterotetrameric adaptor protein (AP) complexes, which impart cargo selection specificity. To better understand coccolith secretion, we performed comparative genomic, phylogenetic, and transcriptomic analyses of the AP complexes in Emiliania huxleyi strains 92A, Van556, EH2, and CCMP1516, and related haptophytes Gephyrocapsa oceanica and Isochrysis galbana; the latter has lost the ability to biomineralize. We show that haptophytes have a modified membrane trafficking system (MTS), as we found both AP subunit losses and duplications. Additionally, we identified a single conserved subunit of the AP-related TSET complex, whose expression suggests a functional role in membrane trafficking. Finally, we detected novel alpha adaptin ear and gamma adaptin ear proteins, the first of their kind to be described outside of opisthokonts. These novel ear proteins and the sculpting of the MTS may support the capacity for biomineralization in haptophytes, enhancing their ability to perform this highly specialized form of secretion. Copyright © 2015 Elsevier GmbH. All rights reserved.

  20. The cognitive complexity of concurrent cognitive-motor tasks reveals age-related deficits in motor performance

    DEFF Research Database (Denmark)

    Oliveira, Anderson Souza; Reiche, Mikkel Staall; Vinescu, Cristina Ioana

    2018-01-01

    Aging reduces cognitive functions, and such impairments have implications in mental and motor performance. Cognitive function has been recently linked to the risk of falls in older adults. Physical activities have been used to attenuate the declines in cognitive functions and reduce fall incidence, but little is known whether a physically active lifestyle can maintain physical performance under cognitively demanding conditions. The aim of this study was to verify whether physically active older adults present similar performance deficits during upper limb response time and precision stepping walking tasks when compared to younger adults. Both upper limb and walking tasks involved simple and complex cognitive demands through decision-making. For both tasks, decision-making was assessed by including a distracting factor to the execution. The results showed that older adults were substantially slower

  1. The Effect of Focus on Form and Task Complexity on L2 Learners' Oral Task Performance

    Science.gov (United States)

    Salimi, Asghar

    2015-01-01

    Second Language learners' oral task performance has been one of interesting and research generating areas of investigations in the field of second language acquisition specially, task-based language teaching and learning. The main purpose of the present study is to investigate the effect of focus on form and task complexity on L2 learners' oral…

  2. Analysis of the dynamics of movement of the landing vehicle with an inflatable braking device on the final trajectory under the influence of wind load

    Science.gov (United States)

    Koryanov, V.; Kazakovtsev, V.; Harri, A.-M.; Heilimo, J.; Haukka, H.; Aleksashkin, S.

    2015-10-01

    This research work is devoted to the analysis of the angular motion of a landing vehicle (LV) with an inflatable braking device (IBD), taking into account the influence of wind load at the final stage of the movement. Using methods for calculating the parameters of the angular motion of a landing vehicle with an inflatable braking device in the presence of small asymmetries, which can give rise to complex dynamic phenomena, the motion of the landing vehicle at the final stage of its flight in the atmosphere is analyzed.

  3. High-performance mussel-inspired adhesives of reduced complexity.

    Science.gov (United States)

    Ahn, B Kollbe; Das, Saurabh; Linstadt, Roscoe; Kaufman, Yair; Martinez-Rodriguez, Nadine R; Mirshafian, Razieh; Kesselman, Ellina; Talmon, Yeshayahu; Lipshutz, Bruce H; Israelachvili, Jacob N; Waite, J Herbert

    2015-10-19

    Despite the recent progress in and demand for wet adhesives, practical underwater adhesion remains limited or non-existent for diverse applications. Translation of mussel-inspired wet adhesion typically entails catechol functionalization of polymers and/or polyelectrolytes, and solution processing of many complex components and steps that require optimization and stabilization. Here we reduced the complexity of a wet adhesive primer to synthetic low-molecular-weight catecholic zwitterionic surfactants that show very strong adhesion (∼50 mJ m⁻²) and retain the ability to coacervate. This catecholic zwitterion adheres to diverse surfaces and self-assembles into a molecularly smooth, thin adhesive layer of interest for nanofabrication. This study significantly simplifies bio-inspired themes for wet adhesion by combining catechol with hydrophobic and electrostatic functional groups in a small molecule.

  4. Exterior insulating shutter final prototype design. Final report, Phase II

    Energy Technology Data Exchange (ETDEWEB)

    Dike, G.A.; Kinney, L.F.

    1982-12-01

    The final prototype shutter described uses sliding panels composed of inch-thick Thermax sandwiched between 60 mil thick ultraviolet-resistant plastic on the outside, and 20 mil styrene on the inside. The shutter system was shown to have an effective R-value of 6 using ASHRAE procedures to convert from still air conditions to 15 mph wind conditions in a simulated cold environment. Tests were performed for cyclical operation, vulnerability to ice and wind, thermal performance, and air infiltration. Marketing efforts are described. Cost effectiveness is determined via present value analysis. (LEW)

  5. Organic Donor-Acceptor Complexes as Novel Organic Semiconductors.

    Science.gov (United States)

    Zhang, Jing; Xu, Wei; Sheng, Peng; Zhao, Guangyao; Zhu, Daoben

    2017-07-18

    systematically controlled by changing the components. Finally, theoretical calculations based on cocrystals with unique stacking could widen our understanding of structure-property relationships and in turn help us design high-performance semiconductors based on DA complexes. In this Account, we focus on discussing organic DA complexes as a new class of semiconducting materials, including their design, growth methods, packing modes, charge-transport properties, and structure-property relationships. We have also fabricated and investigated devices based on these binary crystals. This interdisciplinary work combines techniques from the fields of self-assembly, crystallography, condensed-matter physics, and theoretical chemistry. Researchers have designed new complex systems, including donor and acceptor compounds that self-assemble in feasible ways into highly ordered cocrystals. We demonstrate that using this crystallization method can easily realize ambipolar or unipolar transport. To further improve device performance, we propose several design strategies, such as using new kinds of donors and acceptors, modulating the energy alignment of the donor (ionization potential, IP) and acceptor (electron affinity, EA) components, and extending the π-conjugated backbones. In addition, we have found that when we use molecular "doping" (2:1 cocrystallization), the charge-transport nature of organic semiconductors can be switched from hole-transport-dominated to electron-transport-dominated. We expect that the formation of cocrystals through the complexation of organic donor and acceptor species will serve as a new strategy to develop semiconductors for organic electronics with superior performances over their corresponding individual components.

  6. The role of human performance in safe operation of complex plants

    International Nuclear Information System (INIS)

    Preda, Irina Aida; Lazar, Roxana Elena; Croitoru, Cornelia

    1999-01-01

    According to statistics, about 20-30% of the failures occurring in plants are caused directly or indirectly by human errors. Furthermore, it has been established that 10-15 percent of global failures are related to human errors. These are mainly due to wrong actions, maintenance errors, and misinterpretation of instruments. Human performance is influenced by: professional ability, complexity and danger of the plant, experience in the same working place, level of skills, events in personal and/or professional life, discipline, social ambience and somatic health. Human performance assessment in probabilistic safety assessment offers the possibility of evaluating the human contribution to the outcome of event sequences. A human error may be recovered before the unwanted consequences have occurred in the system. This paper presents the possibilities of using probabilistic methods (event tree, fault tree) to identify solutions for human reliability improvement in order to minimise the risk in industrial plant operation. Also, the human error types and their causes are defined, and the 'decision tree method' is presented as the technique used in our analyses for human reliability assessment. The exemplification of the human error analysis method was achieved based on operation data for the Valcea heavy water pilot plant. (authors)

  7. Complexity and dynamics of switched human balance control during quiet standing.

    Science.gov (United States)

    Nema, Salam; Kowalczyk, Piotr; Loram, Ian

    2015-10-01

    In this paper, we use a combination of numerical simulations, time series analysis, and complexity measures to investigate the dynamics of switched systems with noise, which are often used as models of human balance control during quiet standing. We link the results with complexity measures found in experimental data of human sway motion during quiet standing. The control model ensuring balance, which we use, is based on an act-and-wait control concept, that is, a human controller is switched on when a certain sway angle is reached; otherwise, there is no active control present. Given time series data, we determine what a typical pattern of the control strategy looks like in our model system. We detect the switched nonlinearity in the system using a frequency analysis method in the absence of noise. We also analyse the effect of time delay on the existence of limit cycles in the system in the absence of noise. We perform the entropy and detrended fluctuation analyses with a view to linking the switchings (and the dead zone) with the occurrences of complexity in the model system in the presence of noise. Finally, we perform the entropy and detrended fluctuation analyses on experimental data and link the results with the numerical findings in our model example.
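
    The act-and-wait (dead-zone) idea is straightforward to simulate. The sketch below integrates a noisy, unstable sway model in which proportional-derivative control is switched on only when the sway angle leaves a dead zone; the plant and all parameter values are illustrative assumptions, not the paper's fitted model.

```python
# Switched (dead-zone) balance control of an unstable sway model with noise -- illustrative only.
import random

random.seed(0)
DT, T_END = 0.001, 30.0
A = 1.5                     # destabilizing gain of the plant (assumed)
KP, KD = 4.0, 1.0           # PD gains applied only outside the dead zone (assumed)
DEAD_ZONE = 0.005           # rad: no active control while the sway angle stays inside this band
NOISE = 0.02

theta, omega, t = 0.002, 0.0, 0.0
trace = []
while t < T_END:
    if abs(theta) > DEAD_ZONE:             # controller "acts"
        torque = -KP * theta - KD * omega
    else:                                  # controller "waits"
        torque = 0.0
    accel = A * theta + torque + NOISE * random.gauss(0.0, 1.0)
    omega += accel * DT
    theta += omega * DT
    trace.append(theta)
    t += DT

rms = (sum(x * x for x in trace) / len(trace)) ** 0.5
print(f"sway RMS: {rms:.4f} rad, max |angle|: {max(abs(x) for x in trace):.4f} rad")
```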

  8. Wavelet evolutionary network for complex-constrained portfolio rebalancing

    Science.gov (United States)

    Suganya, N. C.; Vijayalakshmi Pai, G. A.

    2012-07-01

    Portfolio rebalancing problem deals with resetting the proportion of different assets in a portfolio with respect to changing market conditions. The constraints included in the portfolio rebalancing problem are basic, cardinality, bounding, class and proportional transaction cost. In this study, a new heuristic algorithm named wavelet evolutionary network (WEN) is proposed for the solution of the complex-constrained portfolio rebalancing problem. Initially, the empirical covariance matrix, one of the key inputs to the problem, is estimated using the wavelet shrinkage denoising technique to obtain better optimal portfolios. Secondly, the complex cardinality constraint is eliminated using k-means cluster analysis. Finally, the WEN strategy with logical procedures is employed to find the initial proportion of investment in the portfolio of assets and also to rebalance them after a certain period. Experimental studies of WEN are undertaken on Bombay Stock Exchange, India (BSE200 index, period: July 2001-July 2006) and Tokyo Stock Exchange, Japan (Nikkei225 index, period: March 2002-March 2007) data sets. The result obtained using WEN is compared with its only existing counterpart, the Hopfield evolutionary network (HEN) strategy, and the comparison verifies that WEN performs better than HEN. In addition, different performance metrics and data envelopment analysis are carried out to prove the robustness and efficiency of WEN over the HEN strategy.
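
    Two of the building blocks named above are easy to sketch: wavelet soft-thresholding of the return series before estimating the covariance matrix, and k-means clustering of assets so that a cardinality constraint can be met by picking one representative per cluster. The sketch below uses PyWavelets and scikit-learn on synthetic returns; the WEN optimizer itself is not reproduced, and all data and parameters are made up.

```python
# Wavelet-denoised covariance estimate + k-means asset selection on synthetic returns.
# The WEN optimizer itself is not shown; data, wavelet, and cluster count are assumptions.
import numpy as np
import pywt
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
returns = rng.normal(0.0, 0.01, size=(500, 20))        # 500 days x 20 assets, synthetic

def wavelet_denoise(series, wavelet="db4", level=3):
    coeffs = pywt.wavedec(series, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745      # robust noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(series)))   # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(series)]

denoised = np.column_stack([wavelet_denoise(returns[:, j]) for j in range(returns.shape[1])])
cov = np.cov(denoised, rowvar=False)                    # covariance input for the optimizer
print("denoised covariance matrix shape:", cov.shape)

# Cardinality constraint of K assets: cluster the assets and keep one representative per cluster
# (here simply the first member of each cluster).
K = 5
labels = KMeans(n_clusters=K, n_init=10, random_state=0).fit_predict(denoised.T)
selected = [int(np.flatnonzero(labels == k)[0]) for k in range(K)]
print("selected asset indices:", selected)
```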

  9. A Novel Final Focus Design for Future Linear Colliders

    Energy Technology Data Exchange (ETDEWEB)

    Seryi, Andrei

    2000-05-30

    The length, complexity and cost of the present Final Focus designs for linear colliders grows very quickly with the beam energy. In this letter, a novel final focus system is presented and compared with the one proposed for NLC. This new design is simpler, shorter and cheaper, with comparable bandwidth, tolerances and tunability. Moreover, the length scales slower than linearly with energy allowing for a more flexible design which is applicable over a much larger energy range.

  10. The CERN accelerator complex

    CERN Multimedia

    Mobs, Esma Anais

    2016-01-01

    The LHC is the last ring (dark blue line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  11. The CERN accelerator complex

    CERN Multimedia

    Christiane Lefèvre

    2008-01-01

    The LHC is the last ring (dark grey line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  12. The CERN accelerator complex

    CERN Multimedia

    Haffner, Julie

    2013-01-01

    The LHC is the last ring (dark grey line) in a complex chain of particle accelerators. The smaller machines are used in a chain to help boost the particles to their final energies and provide beams to a whole set of smaller experiments, which also aim to uncover the mysteries of the Universe.

  13. Iterative optimization of performance libraries by hierarchical division of codes

    International Nuclear Information System (INIS)

    Donadio, S.

    2007-09-01

    The increasing complexity of hardware features incorporated in modern processors makes high performance code generation very challenging. Library generators such as ATLAS, FFTW and SPIRAL overcome this issue by empirically searching the space of possible program versions for the one that performs the best. This thesis explores a fully automatic solution for adapting a compute-intensive application to the target architecture. By mimicking complex sequences of transformations useful for optimizing real codes, we show that generative programming is a practical tool to implement a new hierarchical compilation approach for the generation of high performance code relying on the use of state-of-the-art compilers. As opposed to ATLAS, this approach is not application-dependent but can be applied to fairly generic loop structures. Our approach relies on the decomposition of the original loop nest into simpler kernels. These kernels are much simpler to optimize and, furthermore, using such codes makes the performance trade-off problem much simpler to express and to solve. Finally, we propose a new approach for the generation of performance libraries based on this decomposition method. We show that our method generates high-performance libraries, in particular for BLAS. (author)
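
    The decomposition idea can be illustrated on the textbook case of a blocked matrix multiplication: the full loop nest is split so that a small fixed-size kernel does all the arithmetic and can be tuned in isolation. The Python sketch below only conveys that structure (block size and kernel are assumptions); an actual generator along the lines of the thesis would emit and time many kernel variants in a compiled language.

```python
# Blocked matrix multiply: the outer loops iterate over tiles, a small kernel does the work.
# Structure only -- block size and kernel are assumptions, not output of the thesis' generator.
import numpy as np

BLOCK = 32

def kernel(c_tile, a_tile, b_tile):
    """The simple, separately tunable unit: accumulate one tile product."""
    c_tile += a_tile @ b_tile

def blocked_matmul(a, b):
    n, k = a.shape
    k2, m = b.shape
    assert k == k2
    c = np.zeros((n, m))
    for i in range(0, n, BLOCK):
        for j in range(0, m, BLOCK):
            for p in range(0, k, BLOCK):
                kernel(c[i:i + BLOCK, j:j + BLOCK],
                       a[i:i + BLOCK, p:p + BLOCK],
                       b[p:p + BLOCK, j:j + BLOCK])
    return c

a, b = np.random.rand(96, 96), np.random.rand(96, 96)
assert np.allclose(blocked_matmul(a, b), a @ b)
```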

  14. Performance-oriented Architecture and the Spatial and Material Organisation Complex. Rethinking the Definition, Role and Performative Capacity of the Spatial and Material Boundaries of the Built Environment

    Directory of Open Access Journals (Sweden)

    Michael Ulrich Hensel

    2011-03-01

    Full Text Available This article is based on the proposition that performance-oriented design is characterised by four domains of ‘active agency’: the human subject, the spatial and material organisation complex and the environment (Hensel, 2010). While these four domains are seen to be interdependent and interacting with one another, it is nevertheless necessary to examine each in its own right. However, the spatial and material organisation complex contains both the spatial and material domains, which are interdependent to such a degree that these need to be examined in relation to one another and also in relation to the specific environment they are set within and interacting with. To explore this combined domain within the context of performance-oriented design is the aim of this article, in particular in relation to the question of the definition and performative capacity of spatial and material boundaries. The various sections are accompanied by research by design efforts undertaken in specified academic contexts, which are intended as examples of modes and areas of inquiry relative to the purpose of this article.

  15. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Directory of Open Access Journals (Sweden)

    Heinz-Martin Süß

    2018-05-01

    Full Text Available The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly

  16. Impact of Cognitive Abilities and Prior Knowledge on Complex Problem Solving Performance – Empirical Results and a Plea for Ecologically Valid Microworlds

    Science.gov (United States)

    Süß, Heinz-Martin; Kretzschmar, André

    2018-01-01

    The original aim of complex problem solving (CPS) research was to bring the cognitive demands of complex real-life problems into the lab in order to investigate problem solving behavior and performance under controlled conditions. Up until now, the validity of psychometric intelligence constructs has been scrutinized with regard to its importance for CPS performance. At the same time, different CPS measurement approaches competing for the title of the best way to assess CPS have been developed. In the first part of the paper, we investigate the predictability of CPS performance on the basis of the Berlin Intelligence Structure Model and Cattell’s investment theory as well as an elaborated knowledge taxonomy. In the first study, 137 students managed a simulated shirt factory (Tailorshop; i.e., a complex real life-oriented system) twice, while in the second study, 152 students completed a forestry scenario (FSYS; i.e., a complex artificial world system). The results indicate that reasoning – specifically numerical reasoning (Studies 1 and 2) and figural reasoning (Study 2) – are the only relevant predictors among the intelligence constructs. We discuss the results with reference to the Brunswik symmetry principle. Path models suggest that reasoning and prior knowledge influence problem solving performance in the Tailorshop scenario mainly indirectly. In addition, different types of system-specific knowledge independently contribute to predicting CPS performance. The results of Study 2 indicate that working memory capacity, assessed as an additional predictor, has no incremental validity beyond reasoning. We conclude that (1) cognitive abilities and prior knowledge are substantial predictors of CPS performance, and (2) in contrast to former and recent interpretations, there is insufficient evidence to consider CPS a unique ability construct. In the second part of the paper, we discuss our results in light of recent CPS research, which predominantly utilizes the

  17. Superconducting quadrupoles for the SLC final focus

    International Nuclear Information System (INIS)

    Erickson, R.; Fieguth, T.; Murray, J.J.

    1987-01-01

    The final focus system of the SLC will be upgraded by replacing the final quadrupoles with higher gradient superconducting magnets positioned closer to the interaction point. The parameters of the new system have been chosen to be compatible with the experimental detectors with a minimum of changes to other final focus components. These parameter choices are discussed along with the expected improvement in SLC performance

  18. Superconducting quadrupoles for the SLC final focus

    International Nuclear Information System (INIS)

    Erickson, R.; Fieguth, T.; Murray, J.J.

    1987-01-01

    The final focus system of the SLC will be upgraded by replacing the final quadrupoles with higher gradient superconducting magnets positioned closer to the interaction point. The parameters of the new system have been chosen to be compatible with the experimental detectors with a minimum of changes to other final focus components. These parameter choices are discussed along with the expected improvement in SLC performance

  19. Decay dynamics of neutral and charged excitonic complexes in single InAs/GaAs QDs

    International Nuclear Information System (INIS)

    Feucker, Max; Seguin, Robert; Rodt, Sven; Poetschke, Konstantin; Bimberg, Dieter

    2008-01-01

    Across the inhomogeneously broadened lineshape of a quantum dot (QD) ensemble the decay times are expected to vary since the wavefunctions and the oscillator strengths are sensitive to the actual geometry of the QD. We performed time-resolved cathodoluminescence spectroscopy of 26 different single InAs/GaAs QDs to investigate the decay dynamics of neutral and charged excitonic complexes. The largest decay rate was found for the XX+, followed by XX, X+ and finally the X. We will show that the ratios of lifetimes of the different excitonic complexes are mainly governed by the number of involved recombination channels. There is excellent agreement between the measured and predicted values for the lifetime ratios of the neutral (X/XX) and the positively charged (X+/XX+) complexes. Surprisingly the lifetime of the exciton (X) shows a much larger yet unexplained scatter than that of all the other complexes

  20. Stop: a fast procedure for the exact computation of the performance of complex probabilistic systems

    International Nuclear Information System (INIS)

    Corynen, G.C.

    1982-01-01

    A new set-theoretic method for the exact and efficient computation of the probabilistic performance of complex systems has been developed. The core of the method is a fast algorithm for disjointing a collection of product sets which is intended for systems with more than 1000 components and 100,000 cut sets. The method is based on a divide-and-conquer approach, in which a multidimensional problem is progressively decomposed into lower-dimensional subproblems along its dimensions. The method also uses a particular pointer system that eliminates the need to store the subproblems by only requiring the storage of pointers to those problems. Examples of the algorithm and the divide-and-conquer strategy are provided, and comparisons with other significant methods are made. Statistical complexity studies show that the expected time and space complexity of other methods is O(me^n), but that of our method is O(nm^3 log(m)). Problems which would require days of Cray-1 computer time with present methods can now be solved in seconds. Large-scale systems that can only be approximated with other techniques can now also be evaluated exactly
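
    For a handful of cut sets, the probability of their union can be computed exactly by brute-force inclusion-exclusion, which is the baseline that disjointing algorithms such as the one described above are designed to outgrow. The toy sketch below shows that baseline for independent components with made-up failure probabilities; the paper's divide-and-conquer algorithm and pointer scheme are not reproduced.

```python
# Exact P(union of cut sets) for independent components via inclusion-exclusion.
# A brute-force baseline only -- not the paper's divide-and-conquer disjointing algorithm.
from itertools import combinations

p_fail = {"A": 0.01, "B": 0.02, "C": 0.05, "D": 0.03}      # assumed component failure probabilities
cut_sets = [{"A", "B"}, {"B", "C"}, {"A", "D"}]             # assumed minimal cut sets

def prob_all_fail(components):
    out = 1.0
    for comp in components:
        out *= p_fail[comp]
    return out

def union_probability(cuts):
    total = 0.0
    for r in range(1, len(cuts) + 1):
        for combo in combinations(cuts, r):
            merged = set().union(*combo)                    # event: every component in the merge fails
            total += (-1) ** (r + 1) * prob_all_fail(merged)
    return total

print("system failure probability: %.6e" % union_probability(cut_sets))
```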

  1. Complexities in innovation management in companies from the European industry. A path model of innovation project performance determinants

    NARCIS (Netherlands)

    Tepic, M.; Kemp, R.G.M.; Omta, S.W.F.; Fortuin, F.T.J.M.

    2013-01-01

    Purpose – The purpose of this paper is to provide an integrated framework of complex relations among innovation characteristics, organizational capabilities, innovation potential and innovation performance. Design/methodology/approach – The model is tested using partial least squares (PLS) modeling

  2. Deriving force field parameters for coordination complexes

    DEFF Research Database (Denmark)

    Norrby, Per-Ola; Brandt, Peter

    2001-01-01

    The process of deriving molecular mechanics force fields for coordination complexes is outlined. Force field basics are introduced with an emphasis on special requirements for metal complexes. The review is then focused on how to set up the initial model, define the target, refine the parameters, and validate the final force field. Alternatives to force field derivation are discussed briefly.

  3. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    Science.gov (United States)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  4. Final Exam Weighting as Part of Course Design

    Science.gov (United States)

    Franke, Matthew

    2018-01-01

    The weighting of a final exam or a final assignment is an essential part of course design that is rarely discussed in pedagogical literature. Depending on the weighting, a final exam or assignment may provide unequal benefits to students depending on their prior performance in the class. Consequently, uncritical grade weighting can discount…

  5. Nanoscale Dewetting Transition in Protein Complex Folding

    Science.gov (United States)

    Hua, Lan; Huang, Xuhui; Liu, Pu; Zhou, Ruhong; Berne, Bruce J.

    2011-01-01

    In a previous study, a surprising drying transition was observed to take place inside the nanoscale hydrophobic channel in the tetramer of the protein melittin. The goal of this paper is to determine if there are other protein complexes capable of displaying a dewetting transition during their final stage of folding. We searched the entire protein data bank (PDB) for all possible candidates, including protein tetramers, dimers, and two-domain proteins, and then performed the molecular dynamics (MD) simulations on the top candidates identified by a simple hydrophobic scoring function based on aligned hydrophobic surface areas. Our large scale MD simulations found several more proteins, including three tetramers, six dimers, and two two-domain proteins, which display a nanoscale dewetting transition in their final stage of folding. Even though the scoring function alone is not sufficient (i.e., a high score is necessary but not sufficient) in identifying the dewetting candidates, it does provide useful insights into the features of complex interfaces needed for dewetting. All top candidates have two features in common: (1) large aligned (matched) hydrophobic areas between two corresponding surfaces, and (2) large connected hydrophobic areas on the same surface. We have also studied the effect on dewetting of different water models and different treatments of the long-range electrostatic interactions (cutoff vs PME), and found the dewetting phenomena is fairly robust. This work presents a few proteins other than melittin tetramer for further experimental studies of the role of dewetting in the end stages of protein folding. PMID:17608515

  6. Studies of complexity in fluid systems

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, Sidney R.

    2000-06-12

    This is the final report of Grant DE-FG02-92ER25119, "Studies of Complexity in Fluids". We have investigated turbulence, flow in granular materials, singularities in the evolution of fluid surfaces, and selective withdrawal fluid flows. We have studied numerical methods for dealing with complex phenomena, and done simulations on the formation of river networks. We have also studied contact-line deposition that occurs in a drying drop.

  7. Fuzzy Modeling and Synchronization of a New Hyperchaotic Complex System with Uncertainties

    Directory of Open Access Journals (Sweden)

    Hadi Delavari

    2015-07-01

    Full Text Available In this paper, the synchronization of a new hyperchaotic complex system based on T-S fuzzy model is proposed. First the considered hyperchaotic system is represented by T-S fuzzy model equivalently. Then by using the parallel distributed compensation (PDC) method and by applying linear system theory and exact linearization (EL) technique, a fuzzy controller is designed to realize the synchronization. Finally, simulation results are carried out to demonstrate the performance of our proposed control scheme, and also the robustness of the designed fuzzy controller to uncertainties.

  8. Synchronization in complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Arenas, A.; Diaz-Guilera, A.; Moreno, Y.; Zhou, C.; Kurths, J.

    2007-12-12

    Synchronization processes in populations of locally interacting elements are in the focus of intense research in physical, biological, chemical, technological and social systems. The many efforts devoted to understand synchronization phenomena in natural systems take now advantage of the recent theory of complex networks. In this review, we report the advances in the comprehension of synchronization phenomena when oscillating elements are constrained to interact in a complex network topology. We also overview the new emergent features coming out from the interplay between the structure and the function of the underlying pattern of connections. Extensive numerical work as well as analytical approaches to the problem are presented. Finally, we review several applications of synchronization in complex networks to different disciplines: biological systems and neuroscience, engineering and computer science, and economy and social sciences.
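
    A standard workhorse in this literature is the Kuramoto model on a network, in which each oscillator's phase is pulled toward those of its neighbours. The sketch below (assumed topology, natural frequencies, and coupling strength) integrates the model and reports the order parameter r, which approaches 1 as the population synchronizes.

```python
# Kuramoto oscillators on a small-world network: the order parameter r -> 1 indicates
# synchronization. Topology, frequencies, and coupling strength are assumed values.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)
G = nx.watts_strogatz_graph(100, k=6, p=0.1, seed=0)
A = nx.to_numpy_array(G)
degrees = A.sum(axis=1)

N, K, DT, STEPS = G.number_of_nodes(), 2.0, 0.01, 5000
omega = rng.normal(0.0, 0.5, N)                  # natural frequencies
theta = rng.uniform(0.0, 2 * np.pi, N)           # initial phases

for _ in range(STEPS):
    pairwise = np.sin(theta[None, :] - theta[:, None])      # sin(theta_j - theta_i)
    coupling = (A * pairwise).sum(axis=1)
    theta += DT * (omega + (K / degrees) * coupling)

r = abs(np.exp(1j * theta).mean())               # Kuramoto order parameter
print(f"order parameter r = {r:.3f}")
```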

  9. Product Complexity Impact on Quality and Delivery Performance

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2011-01-01

    Existing literature on product portfolio complexity is mainly focused on cost related aspects. It is widely acknowledged that an increase in a company’s product portfolio will lead to an increase in complexity related costs such as order management, procurement and inventory. The objective of this article is to examine which other factors might be affected when a company is expanding its product portfolio, if initiatives are not taken to accommodate this increase. Empirical work carried out in a large international engineering company having a market leader position confirms that cost is increased, but it is not the only factor affected. We can document that there is a tendency towards increasing lead times as well as a drop in on time delivery and quality for newly introduced product variants. This means that the company experiences a reduced ability to deliver on time while also receiving

  10. Complexity Control of Fast Motion Estimation in H.264/MPEG-4 AVC with Rate-Distortion-Complexity optimization

    DEFF Research Database (Denmark)

    Wu, Mo; Forchhammer, Søren; Aghito, Shankar Manuel

    2007-01-01

    A complexity control algorithm for H.264 advanced video coding is proposed. The algorithm can control the complexity of integer inter motion estimation for a given target complexity. The Rate-Distortion-Complexity performance is improved by a complexity prediction model, simple analysis of the past statistics and a control scheme. The algorithm also works well for scene change conditions. Test results for coding interlaced video (720x576 PAL) are reported.

  11. Subjective task complexity in the control room

    International Nuclear Information System (INIS)

    Braarud, Per Oeivind

    2000-05-01

    Understanding of what makes a control room situation difficult to handle is important when studying operator performance, both with respect to prediction and improvement of human performance. Previous exploratory work on complexity showed a potential for prediction and explanation of operator performance. This report investigates in further detail the theoretical background and the structure of operator-rated task complexity. The report complements the previous work on complexity to provide a basis for the development of operator performance analysis tools. The first part of the report outlines an approach for studying the complexity of the control room crew's work. The approach draws upon man-machine research as well as problem solving research. The approach identifies five complexity-shaping components: 'task work characteristics', 'teamwork characteristics', 'individual skill', 'teamwork skill', and 'interface and support systems'. The crew's work complexity is related to concepts of human performance quality and human error. The second part of the report is a post-hoc exploratory analysis of four empirical HRP studies, in which operators' conception of the complexity of control room work was assessed by questionnaires. The analysis deals with the structure of complexity questionnaire ratings, and the relationship between complexity ratings and human performance measures. The main findings from the analysis of structure were the identification of three task work factors, named Masking, Information load and Temporal demand, and of one interface factor, named Navigation. Post-hoc analysis suggests that operators' subjective complexity, as assessed by questionnaires, is related to workload, task and system performance, and operators' self-rated performance. (Author). 28 refs., 47 tabs

  12. COMPLEX TRAINING: A BRIEF REVIEW

    Directory of Open Access Journals (Sweden)

    William P. Ebben

    2002-06-01

    Full Text Available The effectiveness of plyometric training is well supported by research. Complex training has gained popularity as a training strategy combining weight training and plyometric training. Anecdotal reports recommend training in this fashion in order to improve muscular power and athletic performance. Recently, several studies have examined complex training. Despite the fact that questions remain about the potential effectiveness and implementation of this type of training, results of recent studies are useful in guiding practitioners in the development and implementation of complex training programs. In some cases, research suggests that complex training has an acute ergogenic effect on upper body power, and the results of acute and chronic complex training include improved jumping performance. Improved performance may require three to four minutes of rest between the weight training and plyometric sets and the use of heavy weight training loads.

  13. Complexity attack resistant flow lookup schemes for IPv6: a measurement based comparison

    OpenAIRE

    Malone, David; Tobin, R. Joshua

    2008-01-01

    In this paper we look at the problem of choosing a good flow state lookup scheme for IPv6 firewalls. We want to choose a scheme which is fast when dealing with typical traffic, but whose performance will not degrade unnecessarily when subject to a complexity attack. We demonstrate the existing problem and, using captured traffic, assess a number of replacement schemes that are hash and tree based. Our aim is to improve FreeBSD’s ipfw firewall, and so finally we implement the most pro...
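
    The record above is about choosing a flow-state lookup structure that stays fast under a complexity attack. A minimal sketch of the general idea (not FreeBSD ipfw code) is a bucketed table whose index comes from a keyed hash of the IPv6 5-tuple, so an attacker cannot precompute colliding flows; the key, table size and tuple layout below are illustrative assumptions.

      # Sketch: keyed-hash flow-state lookup for IPv6 5-tuples.
      import hashlib, os
      from collections import defaultdict

      SECRET = os.urandom(16)          # per-boot secret defeats offline collision search
      N_BUCKETS = 1 << 16

      def bucket(flow):
          """flow = (src_ip, dst_ip, proto, src_port, dst_port)."""
          data = "|".join(map(str, flow)).encode()
          h = hashlib.blake2b(data, key=SECRET, digest_size=8).digest()
          return int.from_bytes(h, "little") % N_BUCKETS

      table = defaultdict(list)        # bucket index -> list of (flow, state) entries

      def insert(flow, state):
          table[bucket(flow)].append((flow, state))

      def lookup(flow):
          for f, state in table[bucket(flow)]:
              if f == flow:
                  return state
          return None

      insert(("2001:db8::1", "2001:db8::2", 6, 443, 50000), {"pkts": 1})
      print(lookup(("2001:db8::1", "2001:db8::2", 6, 443, 50000)))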

  14. Relationship Between Final Performance and Block Times with the Traditional and the New Starting Platforms with A Back Plate in International Swimming Championship 50-M and 100-M Freestyle Events

    Directory of Open Access Journals (Sweden)

    Antonio Garcia-Hermoso

    2013-12-01

    Full Text Available The purpose of this study was to investigate the association between block time and final performance for each sex in the 50-m and 100-m individual freestyle, distinguishing between classification (1st to 3rd, 4th to 8th, 9th to 16th) and type of starting platform (old and new) in international competitions. Twenty-six international competitions covering a 13-year period (2000-2012) were analysed retrospectively. The data corresponded to a total of 1657 swimmers’ competition histories. A two-way ANOVA (sex x classification) was performed for each event and starting platform with the Bonferroni post-hoc test, and another two-way ANOVA for sex and starting platform (sex x starting platform). Pearson’s simple correlation coefficient was used to determine correlations between the block time and the final performance. Finally, a simple linear regression analysis was done between the final time and the block time for each sex and platform. The men had shorter starting block times than the women in both events and from both platforms. For the 50-m event, medalists had shorter block times than semi-finalists with the old starting platforms. Block times were directly related to performance with the old starting platforms. With the new starting platforms, however, the relationship was inverse, notably in the women’s 50-m event. The block time was related to final performance in the men’s 50-m event with the old starting platform, but with the new platform it was critical only for the women’s 50-m event.
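
    As a small illustration of the statistical treatment described above (Pearson correlation plus a simple linear regression of final time on block time), the following sketch uses made-up numbers, not the study's data:

      # Sketch: correlation and linear regression of final time on block (reaction) time.
      import numpy as np

      block_time = np.array([0.68, 0.71, 0.66, 0.74, 0.70, 0.69, 0.72, 0.67])  # s, hypothetical
      final_time = np.array([21.9, 22.4, 21.7, 22.9, 22.3, 22.1, 22.6, 21.8])  # s, hypothetical

      r = np.corrcoef(block_time, final_time)[0, 1]               # Pearson's r
      slope, intercept = np.polyfit(block_time, final_time, 1)    # simple linear regression
      print(f"r = {r:.2f}; final_time ~= {slope:.1f} * block_time + {intercept:.1f}")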

  15. Improvements to optical performance in diffractive elements used for off-axis illumination

    Science.gov (United States)

    Welch, Kevin; Fedor, Adam; Felder, Daniel; Childers, John; Emig, Tim

    2009-08-01

    As photolithographic tools are pressed to print the ever shrinking features required in today's devices, complex off-axis illumination is taking an ever increasing role in meeting this challenge. This, in turn, is driving tighter, more stringent requirements on the diffractive elements used in these illumination systems. Specifically, any imbalance in the poles of an off-axis illuminator will contribute to reductions in the ultimate imaging performance of a lithographic tool and increased complexity in tool-to-tool matching. The article will focus on improvements to the manufacturing process that achieve substantially better pole balance. The modeling of the possible process contributors will be discussed. Challenges resulting from the manufacturing methodology will be shared. Finally, the improvement in manufacturing process performance will be reported by means of a pole balance capability index.

  16. Multi-scalar agent-based complex design systems - the case of CECO (Climatic -Ecologies) Studio; informed generative design systems and performance-driven design workflows

    NARCIS (Netherlands)

    Mostafavi, S.; Yu, S.; Biloria, N.M.

    2014-01-01

    This paper illustrates the application of different types of complex systems for digital form finding and design decision making with underlying methodological and pedagogical aims to emphasize performance-driven design solutions via combining generative methods of complex systems with simulation

  17. Risk Management Capability Maturity and Performance of Complex Product and System (CoPS Projects with an Asian Perspective

    Directory of Open Access Journals (Sweden)

    Ren, Y.

    2014-07-01

    Full Text Available Complex Products and Systems (CoPS) are high-value, technology- and engineering-intensive capital goods. The motivation of this study is the persistent high failure rate of CoPS projects, Asian CoPS providers’ weak capability and the lack of specific research on CoPS risk management. This paper evaluates the risk management maturity level of CoPS projects against a general CoPS risk management capability maturity model (RM-CMM) developed by the authors. An Asia-based survey was conducted to investigate the value of RM to project performance, and Asian (non-Japanese) CoPS implementers’ perceived application of RM practices, their strengths and weaknesses. The survey results show that a higher RM maturity level leads to higher CoPS project performance. They also show that project complexity and uncertainty moderate the relationship between some RM practices and project performance, which implies that a contingency approach should be adopted to manage CoPS risks effectively. In addition, the results show that Asian CoPS implementers are weak in RM process and that there is also room for improvement in the softer aspects of organizational capabilities and robustness.

  18. Performance life of HMA mixes : final report.

    Science.gov (United States)

    2016-01-01

    A number of hot mix asphalt (HMA) types, such as permeable friction course (PFC), stone mastic asphalt (SMA), performance design mixes and conventional dense graded mixes are currently used to construct or overlay roads. One of the important inp...

  19. Tunnel Boring Machine Performance Study. Final Report

    Science.gov (United States)

    1984-06-01

    Full-face tunnel boring machine (TBM) performance during the excavation of 6 tunnels in sedimentary rock is considered in terms of utilization, penetration rates and cutter wear. The construction records are analyzed and the results are used to inves...

  20. Complex networks-based energy-efficient evolution model for wireless sensor networks

    Energy Technology Data Exchange (ETDEWEB)

    Zhu Hailin [Beijing Key Laboratory of Intelligent Telecommunications Software and Multimedia, Beijing University of Posts and Telecommunications, P.O. Box 106, Beijing 100876 (China)], E-mail: zhuhailin19@gmail.com; Luo Hong [Beijing Key Laboratory of Intelligent Telecommunications Software and Multimedia, Beijing University of Posts and Telecommunications, P.O. Box 106, Beijing 100876 (China); Peng Haipeng; Li Lixiang; Luo Qun [Information Secure Center, State Key Laboratory of Networking and Switching Technology, Beijing University of Posts and Telecommunications, P.O. Box 145, Beijing 100876 (China)

    2009-08-30

    Based on complex networks theory, we present two self-organized energy-efficient models for wireless sensor networks in this paper. The first model constructs the wireless sensor network according to the connectivity and remaining energy of each sensor node; thus it can produce scale-free networks, which exhibit good tolerance to random errors. In the second model, we not only consider the remaining energy, but also introduce a constraint on the number of links per node. This model can make the energy consumption of the whole network more balanced. Finally, we present numerical experiments for the two models.

  1. Complex networks-based energy-efficient evolution model for wireless sensor networks

    International Nuclear Information System (INIS)

    Zhu Hailin; Luo Hong; Peng Haipeng; Li Lixiang; Luo Qun

    2009-01-01

    Based on complex networks theory, we present two self-organized energy-efficient models for wireless sensor networks in this paper. The first model constructs the wireless sensor network according to the connectivity and remaining energy of each sensor node; thus it can produce scale-free networks, which exhibit good tolerance to random errors. In the second model, we not only consider the remaining energy, but also introduce a constraint on the number of links per node. This model can make the energy consumption of the whole network more balanced. Finally, we present numerical experiments for the two models.
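
    A minimal sketch of the first evolution model described above, under the assumption that a new sensor node attaches to m existing nodes with probability proportional to the product of a node's degree and its remaining energy; the paper's exact attachment and energy-update rules may differ.

      # Sketch: energy-aware preferential attachment growth of a sensor network.
      import random

      def grow_network(n_nodes=200, m=2, seed=1):
          rng = random.Random(seed)
          energy = {0: rng.uniform(0.5, 1.0), 1: rng.uniform(0.5, 1.0)}
          degree = {0: 1, 1: 1}
          edges = [(0, 1)]
          for new in range(2, n_nodes):
              nodes = sorted(degree)
              weights = [degree[i] * energy[i] for i in nodes]   # degree x remaining energy
              targets = set()
              while len(targets) < m:
                  targets.add(rng.choices(nodes, weights=weights, k=1)[0])
              energy[new] = rng.uniform(0.5, 1.0)
              degree[new] = 0
              for t in targets:
                  edges.append((new, t))
                  degree[new] += 1
                  degree[t] += 1
                  energy[t] *= 0.99      # illustrative: accepting a link costs a little energy
          return edges, degree

      edges, degree = grow_network()
      print("edges:", len(edges), "max degree:", max(degree.values()))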

  2. ASBESTOS PIPE-INSULATION REMOVAL ROBOT SYSTEM; FINAL

    International Nuclear Information System (INIS)

    Unknown

    2000-01-01

    This final topical report details the development, experimentation and field-testing activities for a robotic asbestos pipe-insulation removal system developed for use within the DOE's weapons complex as part of its ER and WM program, as well as in industrial abatement. The engineering development, regulatory compliance, cost-benefit and field-trial experiences gathered through this program are summarized.

  3. Modeling and Simulation of Project Management through the PMBOK® Standard Using Complex Networks

    Directory of Open Access Journals (Sweden)

    Luz Stella Cardona-Meza

    2017-01-01

    Full Text Available Discussion about project management, in both the academic literature and industry, is predominantly based on theories of control, many of which have been developed since the 1950s. However, issues arise when these ideas are applied unilaterally to all types of projects and in all contexts. In complex environments, management problems arise from assuming that results, predicted at the start of a project, can be sufficiently described and delivered as planned. Thus, once a project reaches a critical size, schedule, and a certain level of ambiguity and interconnection, analysis centered on control does not function adequately. Projects that involve complex situations can be described as adaptive complex systems, consisting of multiple interdependent dynamic components, multiple feedback processes, nonlinear relations, and the management of hard data (process dynamics) and soft data (executive team dynamics). In this study, through a complex network, the dynamic structure of a project and its trajectories are simulated using inference processes. Finally, some numerical simulations are described, leading to a decision-making tool that identifies critical processes, thereby obtaining better project performance outcomes.

  4. Good distractions: Testing the effects of listening to an audiobook on driving performance in simple and complex road environments.

    Science.gov (United States)

    Nowosielski, Robert J; Trick, Lana M; Toxopeus, Ryan

    2018-02-01

    Distracted driving (driving while performing a secondary task) causes many collisions. Most research on distracted driving has focused on operating a cell-phone, but distracted driving can include eating while driving, conversing with passengers or listening to music or audiobooks. Although the research has focused on the deleterious effects of distraction, there may be situations where distraction improves driving performance. Fatigue and boredom are also associated with collision risk and it is possible that secondary tasks can help alleviate the effects of fatigue and boredom. Furthermore, it has been found that individuals with high levels of executive functioning as measured by the OSPAN (Operation Span) task show better driving while multitasking. In this study, licensed drivers were tested in a driving simulator (a car body surrounded by screens) that simulated simple or complex roads. Road complexity was manipulated by increasing traffic, scenery, and the number of curves in the drive. Participants either drove, or drove while listening to an audiobook. Driving performance was measured in terms of braking response time to hazards (HRT): the time required to brake in response to pedestrians or vehicles that suddenly emerged from the periphery into the path of the vehicle, speed, standard deviation of speed, and standard deviation of lateral position (SDLP). Overall, braking times to hazards were higher on the complex drive than the simple one, and the effects of secondary tasks such as audiobooks were especially deleterious on the complex drive. In contrast, on the simple drive, driving while listening to an audiobook led to faster HRTs. We found evidence that individuals with high OSPAN scores had faster HRTs when listening to an audiobook. These results suggest that there are environmental and individual factors behind differences in the allocation of attention when listening to audiobooks while driving. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Norbadione A: synthetic approach and cesium complexation studies

    International Nuclear Information System (INIS)

    Desage - El Murr, M.

    2003-10-01

    This work was dedicated to the synthesis and complexation studies of norbadione A, a pigment originating from a mushroom. A synthetic approach, based on a double Suzuki-Miyaura coupling, was developed. This strategy was applied with high yields to the synthesis of various norbadione A analogues, as well as to the synthesis of simple pulvinic acids. Access to functionalized precursors of the molecule was also studied, and the final coupling remains to be done. Besides, a speciation study based on electrospray ionization mass spectrometry was conducted with norbadione A and one of the analogues. This study allowed the assessment of the cesium complexation abilities of each molecule. Structural data were also obtained and complexation constants were calculated. Finally, norbadione A and various synthetic products have been tested via high-throughput screening methods and strong antioxidant properties were observed. Other biological results are also reported. (author)

  6. Using iMCFA to Perform the CFA, Multilevel CFA, and Maximum Model for Analyzing Complex Survey Data.

    Science.gov (United States)

    Wu, Jiun-Yu; Lee, Yuan-Hsuan; Lin, John J H

    2018-01-01

    To construct CFA, MCFA, and maximum MCFA with LISREL v.8 and below, we provide iMCFA (integrated Multilevel Confirmatory Analysis) to examine the potential multilevel factorial structure in the complex survey data. Modeling multilevel structure for complex survey data is complicated because building a multilevel model is not an infallible statistical strategy unless the hypothesized model is close to the real data structure. Methodologists have suggested using different modeling techniques to investigate potential multilevel structure of survey data. Using iMCFA, researchers can visually set the between- and within-level factorial structure to fit MCFA, CFA and/or MAX MCFA models for complex survey data. iMCFA can then yield between- and within-level variance-covariance matrices, calculate intraclass correlations, perform the analyses and generate the outputs for respective models. The summary of the analytical outputs from LISREL is gathered and tabulated for further model comparison and interpretation. iMCFA also provides LISREL syntax of different models for researchers' future use. An empirical and a simulated multilevel dataset with complex and simple structures in the within or between level was used to illustrate the usability and the effectiveness of the iMCFA procedure on analyzing complex survey data. The analytic results of iMCFA using Muthen's limited information estimator were compared with those of Mplus using Full Information Maximum Likelihood regarding the effectiveness of different estimation methods.
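
    For reference, the between/within decomposition that such multilevel CFA tools work with can be written in the usual generic notation (these are standard MCFA equations, not iMCFA-specific output):

      \Sigma_T = \Sigma_B + \Sigma_W, \qquad
      \Sigma_B = \Lambda_B \Phi_B \Lambda_B^{\top} + \Theta_B, \qquad
      \Sigma_W = \Lambda_W \Phi_W \Lambda_W^{\top} + \Theta_W, \qquad
      \mathrm{ICC} = \frac{\sigma_B^2}{\sigma_B^2 + \sigma_W^2}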

  7. Quantitative comparison of performance analysis techniques for modular and generic network-on-chip

    Directory of Open Access Journals (Sweden)

    M. C. Neuenhahn

    2009-05-01

    Full Text Available NoC-specific parameters have a huge impact on the performance and implementation costs of NoCs. Hence, performance and cost evaluation of these parameter-dependent NoCs is crucial in different design stages, but the requirements on performance analysis differ from stage to stage. In an early design stage an analysis technique featuring reduced complexity and limited accuracy can be applied, whereas in subsequent design stages more accurate techniques are required.

    In this work several performance analysis techniques at different levels of abstraction are presented and quantitatively compared. These techniques include a static performance analysis using timing-models, a Colored Petri Net-based approach, VHDL- and SystemC-based simulators and an FPGA-based emulator. Conducting NoC-experiments with NoC-sizes from 9 to 36 functional units and various traffic patterns, characteristics of these experiments concerning accuracy, complexity and effort are derived.

    The performance analysis techniques discussed here are quantitatively evaluated and finally assigned to the appropriate design-stages in an automated NoC-design-flow.

  8. Complexity is simple!

    Science.gov (United States)

    Cottrell, William; Montero, Miguel

    2018-02-01

    In this note we investigate the role of Lloyd's computational bound in holographic complexity. Our goal is to translate the assumptions behind Lloyd's proof into the bulk language. In particular, we discuss the distinction between orthogonalizing and `simple' gates and argue that these notions are useful for diagnosing holographic complexity. We show that large black holes constructed from series circuits necessarily employ simple gates, and thus do not satisfy Lloyd's assumptions. We also estimate the degree of parallel processing required in this case for elementary gates to orthogonalize. Finally, we show that for small black holes at fixed chemical potential, the orthogonalization condition is satisfied near the phase transition, supporting a possible argument for the Weak Gravity Conjecture first advocated in [1].

  9. The NSL Complex Regulates Housekeeping Genes in Drosophila

    Science.gov (United States)

    Raja, Sunil Jayaramaiah; Holz, Herbert; Luscombe, Nicholas M.; Manke, Thomas; Akhtar, Asifa

    2012-01-01

    MOF is the major histone H4 lysine 16-specific (H4K16) acetyltransferase in mammals and Drosophila. In flies, it is involved in the regulation of X-chromosomal and autosomal genes as part of the MSL and the NSL complexes, respectively. While the function of the MSL complex as a dosage compensation regulator is fairly well understood, the role of the NSL complex in gene regulation is still poorly characterized. Here we report a comprehensive ChIP–seq analysis of four NSL complex members (NSL1, NSL3, MBD-R2, and MCRS2) throughout the Drosophila melanogaster genome. Strikingly, the majority (85.5%) of NSL-bound genes are constitutively expressed across different cell types. We find that an increased abundance of the histone modifications H4K16ac, H3K4me2, H3K4me3, and H3K9ac in gene promoter regions is characteristic of NSL-targeted genes. Furthermore, we show that these genes have a well-defined nucleosome free region and broad transcription initiation patterns. Finally, by performing ChIP–seq analyses of RNA polymerase II (Pol II) in NSL1- and NSL3-depleted cells, we demonstrate that both NSL proteins are required for efficient recruitment of Pol II to NSL target gene promoters. The observed Pol II reduction coincides with compromised binding of TBP and TFIIB to target promoters, indicating that the NSL complex is required for optimal recruitment of the pre-initiation complex on target genes. Moreover, genes that undergo the most dramatic loss of Pol II upon NSL knockdowns tend to be enriched in DNA Replication–related Element (DRE). Taken together, our findings show that the MOF-containing NSL complex acts as a major regulator of housekeeping genes in flies by modulating initiation of Pol II transcription. PMID:22723752

  10. Complex dynamics in the distribution of players’ scoring performance in Rugby Union world cups

    Science.gov (United States)

    Seuront, Laurent

    2013-09-01

    The evolution of the scoring performance of Rugby Union players is investigated over the seven Rugby World Cups (RWC) that took place from 1987 to 2011, and specific attention is given to how they may have been impacted by the switch from amateurism to professionalism that occurred in 1995. The distribution of the points scored by individual players, Ps, ranked in order of performance, was well described by the simplified canonical law Ps ∝ (r + ϕ)^(−α), where r is the rank, and ϕ and α are the parameters of the distribution. The parameter α did not significantly change from 1987 to 2007 (α=0.92±0.03), indicating a negligible effect of professionalism on players’ scoring performance. In contrast, the parameter ϕ significantly increased from ϕ=1.32 for the 1987 RWC to ϕ=2.30 for the 1999 to 2003 RWCs and ϕ=5.60 for the 2007 RWC, suggesting a progressive decrease in the relative performance of the best players. Finally, the sharp decreases observed in both α (α=0.38) and ϕ (ϕ=0.70) in the 2011 RWC indicate a more even distribution of the performance of individuals among scorers, compared to the more heterogeneous distributions observed from 1987 to 2007, and suggest a sharp increase in the level of competition leading to an increase in the average quality of players and a decrease in the relative skills of the top players. Note that neither α nor ϕ significantly correlates with traditional performance indicators such as the number of points scored by the best players, the number of games played by the best players, the number of points scored by the team of the best players or the total number of points scored over each RWC. This indicates that the dynamics of the scoring performance of Rugby Union players are influenced by hidden processes hitherto inaccessible through standard performance metrics; this suggests that players’ scoring performance is connected to ubiquitous phenomena such as anomalous diffusion.
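
    A brief sketch of how such a ranked-score law can be fitted by non-linear least squares; the score values below are synthetic, not the RWC data:

      # Sketch: fit Ps ~ c * (r + phi)^(-alpha) to ranked scoring data.
      import numpy as np
      from scipy.optimize import curve_fit

      def canonical_law(r, c, phi, alpha):
          return c * (r + phi) ** (-alpha)

      rank = np.arange(1, 51)
      points = 120 * (rank + 2.0) ** (-0.9) + np.random.default_rng(0).normal(0, 1, rank.size)

      popt, _ = curve_fit(canonical_law, rank, points, p0=(100.0, 1.0, 1.0), maxfev=10000)
      c, phi, alpha = popt
      print(f"phi = {phi:.2f}, alpha = {alpha:.2f}")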

  11. Pinning Synchronization of Switched Complex Dynamical Networks

    Directory of Open Access Journals (Sweden)

    Liming Du

    2015-01-01

    Full Text Available Network topology and node dynamics play a key role in forming synchronization of complex networks. Unfortunately there is no effective synchronization criterion for pinning synchronization of complex dynamical networks with switching topology. In this paper, pinning synchronization of complex dynamical networks with switching topology is studied. Two basic problems are considered: one is pinning synchronization of switched complex networks under arbitrary switching; the other is pinning synchronization of switched complex networks by design of switching when synchronization cannot be achieved by using any individual connection topology alone. For the two problems, the common Lyapunov function method and the single Lyapunov function method are used, respectively; some global synchronization criteria are proposed and the designed switching law is given. Finally, simulation results verify the validity of the results.
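
    For orientation, a commonly used formulation of a pinning-controlled switched network is sketched below in generic notation; the paper's exact model and controller may differ.

      \dot{x}_i(t) = f\bigl(x_i(t)\bigr) + c \sum_{j=1}^{N} a_{ij}^{\sigma(t)}\,\Gamma\,x_j(t)
                     - c\,d_i\,\Gamma\bigl(x_i(t) - s(t)\bigr), \qquad i = 1,\dots,N,

      \text{where } \sigma(t) \text{ is the switching signal selecting the connection topology, } \Gamma \text{ the inner}
      \text{coupling matrix, } s(t) \text{ the target trajectory with } \dot{s} = f(s), \text{ and } d_i > 0 \text{ only for pinned nodes.}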

  12. Confidence level in performing endodontic treatment among final year undergraduate dental students from the University of Medical Science and Technology, Sudan (2014)

    Directory of Open Access Journals (Sweden)

    Elhadi Mohieldin Awooda

    2016-01-01

    Full Text Available Aim: This study aimed to evaluate the confidence level of undergraduate final-year dental students in performing root canal treatment (RCT) and how it may affect their performance and perception regarding endodontics. Materials and Methods: A self-administered questionnaire was distributed to the final-year dental students at the University of Medical Sciences and Technology, Khartoum, Sudan (2013–2014). A total of 21 students were requested to participate voluntarily and were asked to score their level of confidence using a 5-point Likert scale. Results: The response rate was 100%, all students (100%) stated that the requirements set were enough, and 66.7% rated endodontics as average in terms of difficulty. The mean self-confidence for performing RCT was higher for maxillary teeth (2.43 ± 0.51), followed by mandibular teeth (2.71 ± 0.64), whereas molars scored the lowest. Higher scores of self-confidence were found for administering local anesthesia (4.24 ± 0.70), followed by root canal shaping with hand instruments (3.76 ± 0.54). No association was found between overall confidence level and the number of performed RCTs (P = 0.721). No association was found between the overall confidence level of students who experienced instrument fracture and their frequency of fracture (P = 0.507), supervisors' reaction (P = 0.587), or willingness to specialize in endodontics (P = 0.530). Conclusion: Students displayed high confidence in performing basic endodontics and treating single-rooted teeth. More exposure is recommended to enhance the students' self-confidence.

  13. Fundamental Processes in Plasmas. Final report

    International Nuclear Information System (INIS)

    O'Neil, Thomas M.; Driscoll, C. Fred

    2009-01-01

    This research focuses on fundamental processes in plasmas, and emphasizes problems for which precise experimental tests of theory can be obtained. Experiments are performed on non-neutral plasmas, utilizing three electron traps and one ion trap with a broad range of operating regimes and diagnostics. Theory is focused on fundamental plasma and fluid processes underlying collisional transport and fluid turbulence, using both analytic techniques and medium-scale numerical simulations. The simplicity of these systems allows a depth of understanding and a precision of comparison between theory and experiment which is rarely possible for neutral plasmas in complex geometry. The recent work has focused on three areas in basic plasma physics. First, experiments and theory have probed fundamental characteristics of plasma waves: from the low-amplitude thermal regime, to inviscid damping and fluid echoes, to cold fluid waves in cryogenic ion plasmas. Second, the wide-ranging effects of dissipative separatrices have been studied experimentally and theoretically, finding novel wave damping and coupling effects and important plasma transport effects. Finally, correlated systems have been investigated experimentally and theoretically: UCSD experiments have now measured the Salpeter correlation enhancement, and theory work has characterized the 'guiding center atoms' of antihydrogen created at CERN.

  14. Verbal Final Exam in Introductory Biology Yields Gains in Student Content Knowledge and Longitudinal Performance

    Science.gov (United States)

    Luckie, Douglas B.; Rivkin, Aaron M.; Aubry, Jacob R.; Marengo, Benjamin J.; Creech, Leah R.; Sweeder, Ryan D.

    2013-01-01

    We studied gains in student learning over eight semesters in which an introductory biology course curriculum was changed to include optional verbal final exams (VFs). Students could opt to demonstrate their mastery of course material via structured oral exams with the professor. In a quantitative assessment of cell biology content knowledge, students who passed the VF outscored their peers on the medical assessment test (MAT), an exam built with 40 Medical College Admissions Test (MCAT) questions (66.4% [n = 160] and 62% [n = 285], respectively). Students who passed the VF performed better on MCAT questions in all topic categories tested; the greatest gain occurred on the topic of cellular respiration. Because the VF focused on a conceptually parallel topic, photosynthesis, there may have been authentic knowledge transfer. In longitudinal tracking studies, passing the VF also correlated with higher performance in a range of upper-level science courses, with greatest significance in physiology, biochemistry, and organic chemistry. Participation had a wide range but not equal representation in academic standing, gender, and ethnicity. Yet students nearly unanimously (92%) valued the option. Our findings suggest oral exams at the introductory level may allow instructors to assess and aid students striving to achieve higher-level learning. PMID:24006399

  15. Complexity-aware high efficiency video coding

    CERN Document Server

    Correa, Guilherme; Agostini, Luciano; Cruz, Luis A da Silva

    2016-01-01

    This book discusses the computational complexity of High Efficiency Video Coding (HEVC) encoders, with coverage extending from the analysis of HEVC compression efficiency and computational complexity to the reduction and scaling of its encoding complexity. After an introduction to the topic and a review of the state-of-the-art research in the field, the authors provide a detailed analysis of the HEVC encoding tools' compression efficiency and computational complexity. Readers will benefit from a set of algorithms for scaling the computational complexity of HEVC encoders, all of which take advantage of the flexibility of the frame partitioning structures allowed by the standard. The authors also provide a set of early termination methods based on data mining and machine learning techniques, which are able to reduce the computational complexity required to find the best frame partitioning structures. The applicability of the proposed methods is finally exemplified with an encoding time control system that emplo...

  16. The role of human performance in the safety complex plants' operation

    International Nuclear Information System (INIS)

    Preda, Irina Aida; Lazar, Roxana Elena; Croitoru, Cornelia

    1999-01-01

    According to statistics, about 20-30% of the failures occurring in plants are caused directly or indirectly by human errors. Furthermore, it has been established that 10-15% of global failures are related to human errors. These are mainly due to wrong actions, maintenance errors, and misinterpretation of instruments. Human performance is influenced by: professional ability, complexity and danger of the plant, experience in the working place, level of skills, events in personal and/or professional life, discipline, social ambience, and somatic health. The assessment of human performance in probabilistic safety assessment offers the possibility of evaluating the human contribution to the outcome of event sequences. Not all human errors have an impact on the system; a human error may be recovered before unwanted consequences occur in the system. This paper presents the possibilities of using probabilistic methods (event tree, fault tree) to identify solutions for improved human reliability in order to minimize the risk in the operation of industrial plants. The human error types and their causes are also defined, and the 'decision tree method' is presented as the technique used in our analysis for human reliability assessment. The human error analysis method is exemplified using operating data for the Valcea Heavy Water Pilot Plant. The 'steam supply interruption' event was considered as the initiating event for the accident state. The contribution of human errors was analysed for the accident sequence with the worst consequences. (authors)

  17. ICDF Complex Remedial Action Work Plan

    Energy Technology Data Exchange (ETDEWEB)

    W. M. Heileson

    2006-12-01

    This Remedial Action Work Plan provides the framework for operation of the Idaho Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) Disposal Facility Complex (ICDF). This facility includes (a) an engineered landfill that meets the substantial requirements of DOE Order 435.1, Resource Conservation and Recovery Act Subtitle C, Idaho Hazardous Waste Management Act, and Toxic Substances Control Act polychlorinated biphenyl landfill requirements; (b) centralized receiving, inspections, administration, storage/staging, and treatment facilities necessary for CERCLA investigation-derived, remedial, and removal waste at the Idaho National Laboratory (INL) prior to final disposition in the disposal facility or shipment off-Site; and (c) an evaporation pond that has been designated as a corrective action management unit. The ICDF Complex, including a buffer zone, will cover approximately 40 acres, with a landfill disposal capacity of approximately 510,000 yd3. The ICDF Complex is designed and authorized to accept INL CERCLA-generated wastes, and includes the necessary subsystems and support facilities to provide a complete waste management system. This Remedial Action Work Plan presents the operational approach and requirements for the various components that are part of the ICDF Complex. Summaries of the remedial action work elements are presented herein, with supporting information and documents provided as appendixes to this work plan that contain specific detail about the operation of the ICDF Complex. This document presents the planned operational process based upon an evaluation of the remedial action requirements set forth in the Operable Unit 3-13 Final Record of Decision.

  18. Final report: A Broad Research Project in the Sciences of Complexity

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-02-01

    Previous DOE support for "A Broad Research Program in the Sciences of Complexity" permitted the Santa Fe Institute to initiate new collaborative research within its Integrative Core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing-ground for the study of general principles of complex systems. The critical aspect of this support is its effectiveness in seeding new areas of research. Indeed, this Integrative Core has been the birthplace of dozens of projects that later became more specifically focused and then won direct grant support independent of the core grants. But at early stages most of this multidisciplinary research was unable to win grant support as individual projects--both because it did not match well with existing grant program guidelines, and because the amount of funding needed was often too modest to justify a formal proposal to an agency. In fact, one of the attributes of core support has been that it permitted SFI to encourage high-risk activities because the cost was quite low. What is significant is how many of those initial efforts have been productive in the SFI environment. Many of SFI's current research foci began with a short visit from a researcher new to the SFI community, or as small working groups that brought together carefully selected experts from a variety of fields. As mentioned above, many of the ensuing research projects are now being supported by other funding agencies or private foundations. Some of these successes are described.

  19. "One Task Fits All"? The Roles of Task Complexity, Modality, and Working Memory Capacity in L2 Performance

    Science.gov (United States)

    Zalbidea, Janire

    2017-01-01

    The present study explores the independent and interactive effects of task complexity and task modality on linguistic dimensions of second language (L2) performance and investigates how these effects are modulated by individual differences in working memory capacity. Thirty-two intermediate learners of L2 Spanish completed less and more complex…

  20. Complex endoscopic treatment of acute gastrointestinal bleeding of ulcer origin

    Directory of Open Access Journals (Sweden)

    V. V. Izbitsky

    2013-06-01

    Full Text Available Gastrointestinal bleeding (GIB) is found in 20-30% of patients with peptic ulcer disease. Acute gastrointestinal bleeding ranks first among the causes of death from peptic ulcer, ahead of the other complications. Rebleeding occurs in 30-38% of patients. Materials and Methods: To obtain an objective endoscopic picture in patients with bleeding gastroduodenal ulcers, we used the classification of J.A. Forrest in our study: Type I - active bleeding: • I a - pulsating jet; • I b - stream. Type II - signs of recent bleeding: • II a - visible (non-bleeding) vessel; • II b - fixed thrombus (clot); • II c - flat black spot (black ulcer bottom). Type III - ulcer with a clean (white) bottom. Integrated endoscopic hemostasis included: irrigation of the ulcer defect and the area around it with 3% hydrogen peroxide solution in a volume of 10-30 ml; injection of 2-4 ml of diluted epinephrine (1:10000) for hemostasis; and use of argon plasma coagulation. Results and Discussion: Integrated endoscopic stopping of bleeding was performed in 57 patients who were examined and treated at the Department of Surgery from 2006 to 2012. In 16 patients bleeding was caused by gastric ulcer. Gastric ulcer of type I localization according to the classification (H.D. Johnson, 1965) was determined in 9 patients, type II in 2 patients, and type III in 5 patients. In 31 patients bleeding was caused by duodenal peptic ulcer, in 4 patients by erosive gastritis, in 1 by erosive esophagitis, and in 5 patients by peptic ulcer in the gastroenteroanastomosis area. Final hemostasis was achieved in 55 (96.5%) patients. In 50 (87.7%) patients a single session of complex endoscopic treatment was sufficient; in 5 (8.8%) patients it was performed twice. In 2 (3.5%) cases an operation was performed due to recurrent bleeding. The source of major bleeding in these patients was a chronic duodenal ulcer penetrating into the head of the pancreas, in one case complicated by subcompensated

  1. Environmental-performance research priorities: Wood products. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-15

    This report describes a research plan to establish environmental, energy, and economic performance measures for renewable building materials, and to identify management and technology alternatives to improve environmental performance in a cost-effective manner. The research plan is designed to: (1) collect environmental and economic data on all life-cycle stages of the materials, (2) ensure that the data follows consistent definitions and collection procedures, and (3) develop analytical procedures for life-cycle analysis to address environmental performance questions. The research will be subdivided into a number of individual project modules. The five processing stages of wood used to organize the research plan are: (1) resource management and harvesting; (2) processing; (3) design and construction of structures; (4) use, maintenance, and disposal; and (5) waste recycling. Individual research module descriptions are provided in the report, as well as assessment techniques, research standards and protocol, and research management. 13 refs., 5 figs., 3 tabs.

  2. Self-perceived versus objectively measured competence in performing clinical practical procedures by final year medical students.

    Science.gov (United States)

    Katowa-Mukwato, Patricia; Banda, Sekelani

    2016-04-30

    To determine and compare the self-perceived and objectively measured competence in performing 14 core clinical practical procedures among final-year medical students of the University of Zambia. The study included 56 out of 60 graduating University of Zambia medical students of the 2012/2013 academic year. Self-perceived competence: students rated their competence on 14 core clinical practical procedures using a self-administered questionnaire on a 5-point Likert scale. Objective competence: this was measured by an Objective Structured Clinical Examination (OSCE) rated by faculty using predetermined rating scales. A rank-order correlation test was performed for self-perceived and objectively measured competence. Two-thirds (36; 66.7%) of the participants perceived themselves as moderately competent, 15 (27.8%) rated themselves as highly competent, while 3 (5.6%) had low self-perception. For objective competence, the majority (52; 92.8%) were barely competent while 4 (7.2%) were absolutely competent. When overall self-perception was compared to objectively measured competence, there was a discordance, demonstrated by a negative correlation (Spearman rho -.123). Significant numbers of students reported low self-competence in performing procedures such as endotracheal intubation, gastric lavage and cardiopulmonary resuscitation, which most had never performed during the clinical years of medical education. In addition, the negative correlation between self-perceived and objectively measured competence demonstrated the inability of students to assess and rate themselves objectively due to fear that others may know their weaknesses and realize that they are not as competent as expected at a specific level of training.

  3. Pollution going multimodal: the complex impact of the human-altered sensory environment on animal perception and performance.

    Science.gov (United States)

    Halfwerk, Wouter; Slabbekoorn, Hans

    2015-04-01

    Anthropogenic sensory pollution is affecting ecosystems worldwide. Human actions generate acoustic noise, emanate artificial light and emit chemical substances. All of these pollutants are known to affect animals. Most studies on anthropogenic pollution address the impact of pollutants in unimodal sensory domains. High levels of anthropogenic noise, for example, have been shown to interfere with acoustic signals and cues. However, animals rely on multiple senses, and pollutants often co-occur. Thus, a full ecological assessment of the impact of anthropogenic activities requires a multimodal approach. We describe how sensory pollutants can co-occur and how covariance among pollutants may differ from natural situations. We review how animals combine information that arrives at their sensory systems through different modalities and outline how sensory conditions can interfere with multimodal perception. Finally, we describe how sensory pollutants can affect the perception, behaviour and endocrinology of animals within and across sensory modalities. We conclude that sensory pollution can affect animals in complex ways due to interactions among sensory stimuli, neural processing and behavioural and endocrinal feedback. We call for more empirical data on covariance among sensory conditions, for instance, data on correlated levels of noise and light pollution. Furthermore, we encourage researchers to test animal responses to a full-factorial set of sensory pollutants in the presence or the absence of ecologically important signals and cues. We realize that such an approach is often time- and energy-consuming, but we think this is the only way to fully understand the multimodal impact of sensory pollution on animal performance and perception. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  4. Explaining high and low performers in complex intervention trials: a new model based on diffusion of innovations theory.

    Science.gov (United States)

    McMullen, Heather; Griffiths, Chris; Leber, Werner; Greenhalgh, Trisha

    2015-05-31

    Complex intervention trials may require health care organisations to implement new service models. In a recent cluster randomised controlled trial, some participating organisations achieved high recruitment, whereas others found it difficult to assimilate the intervention and were low recruiters. We sought to explain this variation and develop a model to inform organisational participation in future complex intervention trials. The trial included 40 general practices in a London borough with high HIV prevalence. The intervention was offering a rapid HIV test as part of the New Patient Health Check. The primary outcome was mean CD4 cell count at diagnosis. The process evaluation consisted of several hundred hours of ethnographic observation, 21 semi-structured interviews and analysis of routine documents (e.g., patient leaflets, clinical protocols) and trial documents (e.g., inclusion criteria, recruitment statistics). Qualitative data were analysed thematically using--and, where necessary, extending--Greenhalgh et al.'s model of diffusion of innovations. Narrative synthesis was used to prepare case studies of four practices representing maximum variety in clinicians' interest in HIV (assessed by level of serological testing prior to the trial) and performance in the trial (high vs. low recruiters). High-recruiting practices were, in general though not invariably, also innovative practices. They were characterised by strong leadership, good managerial relations, readiness for change, a culture of staff training and available staff time ('slack resources'). Their front-line staff believed that patients might benefit from the rapid HIV test ('relative advantage'), were emotionally comfortable administering it ('compatibility'), skilled in performing it ('task issues') and made creative adaptations to embed the test in local working practices ('reinvention'). Early experience of a positive HIV test ('observability') appeared to reinforce staff commitment to recruiting

  5. Complexity management in engineering design a primer

    CERN Document Server

    Maurer, Maik

    2017-01-01

    The treatise supports understanding the phenomena of complexity in engineering, distinguishes complexity from other challenges and presents an overview of definitions and applied approaches. The historical background of complexity management is explained by highlighting the important epochs, their key actors and their discoveries, findings and developments. Knowing about the appearance of early system awareness in ancient Greece, the creation of mechanical philosophy in the 17th century and the discovery of classic physics enables the reader to better comprehend modern system sciences and management approaches. A classification of complexity management approaches by research fields indicates current focus areas and starting points for future discussions. In a comprehensive map, the classification points out mutual overlaps between engineering disciplines in terms of similar complexity management approaches. Finally, the treatise introduces a generic complexity management framework, which is based on structura...

  6. How to include frequency dependent complex permeability Into SPICE models to improve EMI filters design?

    Science.gov (United States)

    Sixdenier, Fabien; Yade, Ousseynou; Martin, Christian; Bréard, Arnaud; Vollaire, Christian

    2018-05-01

    Electromagnetic interference (EMI) filter design is a rather difficult task in which engineers have to choose adequate magnetic materials, design the magnetic circuit and choose the size and number of turns. The final design must achieve the attenuation requirements (constraints) and has to be as compact as possible (goal). Alternating current (AC) analysis is a powerful tool to predict the global impedance or attenuation of any filter. However, AC analyses are generally performed without taking into account the frequency-dependent complex permeability behaviour of soft magnetic materials. That is why we developed two frequency-dependent complex permeability models that can be included in SPICE models. After an identification process, the performance of each model is compared to measurements made on a realistic EMI filter prototype in common mode (CM) and differential mode (DM) to see the benefit of the approach. Simulation results are in good agreement with the measured ones, especially in the middle frequency range.
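
    One simple way to bring a frequency-dependent complex permeability into a SPICE netlist is to fit a single-relaxation (Debye-type) model and map it onto a parallel R-L cell, as sketched below; the material parameters, the subcircuit name and the single-pole assumption are illustrative, not taken from the paper. Multi-pole fits would simply stack several such cells in series.

      # Sketch: Debye-type complex permeability mapped exactly onto a parallel R-L cell.
      # Assumption: mu_r(f) = mu_s / (1 + j*f/f_rel).
      import numpy as np

      mu_s, f_rel = 2000.0, 1.5e6      # static permeability and relaxation frequency (assumed)
      L0 = 50e-9                       # air-core inductance of the winding geometry, H (assumed)

      # Z(w) = j*w*L0*mu_r(w) = j*w*(L0*mu_s) / (1 + j*w*tau), tau = 1/(2*pi*f_rel),
      # which is exactly a resistor R in parallel with an inductor L:
      #   L = L0*mu_s,  R = L/tau = 2*pi*f_rel*L0*mu_s
      tau = 1.0 / (2 * np.pi * f_rel)
      L = L0 * mu_s
      R = L / tau
      print(".SUBCKT CORE_L 1 2   * hypothetical SPICE netlist fragment")
      print(f"L1 1 2 {L:.3e}")
      print(f"R1 1 2 {R:.3e}")
      print(".ENDS")

      # Quick check against the analytic permeability model at a few frequencies
      f = np.array([1e4, 1e6, 1e7])
      w = 2 * np.pi * f
      Z_model = 1j * w * L0 * (mu_s / (1 + 1j * w * tau))
      Z_rl = 1j * w * L * R / (R + 1j * w * L)
      print(np.allclose(Z_model, Z_rl))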

  7. Traffic Dynamics on Complex Networks: A Survey

    Directory of Open Access Journals (Sweden)

    Shengyong Chen

    2012-01-01

    Full Text Available Traffic dynamics on complex networks have attracted much interest in recent years due to their practical implications for real communication networks. In this survey, we give a brief review of studies on traffic routing dynamics on complex networks. Strategies for improving transport efficiency, including designing efficient routing strategies and making appropriate adjustments to the underlying network structure, are introduced. Finally, a few open problems are discussed.

  8. Complex-wide review of DOE's Low-Level Waste Management ES&H vulnerabilities. Volume II. Final report

    International Nuclear Information System (INIS)

    1996-05-01

    Volume I of this report presents a summary of DOE's complex-wide review of its low-level waste management system, including the assessment scope and methodology, site-specific and complex-wide vulnerabilities, and DOE's conclusions and recommendations. Volume II presents a more detailed discussion of the assessment methodology and evaluation instruments developed by the Assessment Working Group for identifying site-specific vulnerabilities, categorizing and classifying vulnerabilities, and identifying and analyzing complex-wide vulnerabilities. Attachments A and B of this volume contain, respectively, the Site Evaluation Survey and the Vulnerability Assessment Form used in those processes. Volume III contains the site-specific assessment reports for the 36 sites (38 facilities) assessed in the complex-wide review from which the complex-wide vulnerabilities were drawn

  9. Comparative Performance of Complex-Valued B-Spline and Polynomial Models Applied to Iterative Frequency-Domain Decision Feedback Equalization of Hammerstein Channels.

    Science.gov (United States)

    Chen, Sheng; Hong, Xia; Khalaf, Emad F; Alsaadi, Fuad E; Harris, Chris J

    2017-12-01

    The complex-valued (CV) B-spline neural network approach offers a highly effective means for identifying and inverting practical Hammerstein systems. Compared with its conventional CV polynomial-based counterpart, a CV B-spline neural network has superior performance in identifying and inverting CV Hammerstein systems, while imposing a similar complexity. This paper reviews the optimality of the CV B-spline neural network approach. Advantages of the B-spline neural network approach as compared with the polynomial-based modeling approach are extensively discussed, and the effectiveness of the CV neural network-based approach is demonstrated in a real-world application. More specifically, we evaluate the comparative performance of the CV B-spline and polynomial-based approaches for the nonlinear iterative frequency-domain decision feedback equalization (NIFDDFE) of single-carrier Hammerstein channels. Our results confirm the superior performance of the CV B-spline-based NIFDDFE over its CV polynomial-based counterpart.
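
    As a rough illustration of modelling a complex-valued static nonlinearity with B-splines, the sketch below builds a tensor product of real B-spline bases on the real and imaginary parts of the input and fits complex weights by least squares. This is a simplified stand-in, not the authors' De Boor-based algorithm; the knot choices and the test nonlinearity are assumptions.

      # Sketch: tensor-product B-spline model of a complex-valued static nonlinearity.
      import numpy as np

      def bspline_basis(x, knots, degree):
          """Cox-de Boor recursion; returns a (len(x), n_basis) matrix."""
          x = np.asarray(x, dtype=float)
          n = len(knots) - degree - 1
          B = np.zeros((len(x), len(knots) - 1))
          for j in range(len(knots) - 1):                       # degree-0 indicator functions
              B[:, j] = (x >= knots[j]) & (x < knots[j + 1])
          for d in range(1, degree + 1):
              B_new = np.zeros((len(x), len(knots) - d - 1))
              for j in range(len(knots) - d - 1):
                  d1 = knots[j + d] - knots[j]
                  d2 = knots[j + d + 1] - knots[j + 1]
                  t1 = ((x - knots[j]) / d1) * B[:, j] if d1 > 0 else 0.0
                  t2 = ((knots[j + d + 1] - x) / d2) * B[:, j + 1] if d2 > 0 else 0.0
                  B_new[:, j] = t1 + t2
              B = B_new
          return B[:, :n]

      rng = np.random.default_rng(0)
      x = rng.uniform(-1, 1, 2000) + 1j * rng.uniform(-1, 1, 2000)   # CV input
      y = x / (1.0 + 0.3 * np.abs(x) ** 2)                           # assumed saturating nonlinearity

      degree = 3
      knots = np.concatenate(([-1.0] * degree, np.linspace(-1, 1, 8), [1.0] * degree))
      Br = bspline_basis(np.clip(x.real, -1, 1 - 1e-9), knots, degree)
      Bi = bspline_basis(np.clip(x.imag, -1, 1 - 1e-9), knots, degree)
      Phi = np.einsum('np,nq->npq', Br, Bi).reshape(len(x), -1).astype(complex)
      w, *_ = np.linalg.lstsq(Phi, y, rcond=None)                    # complex weights
      print("max fit error:", np.abs(Phi @ w - y).max())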

  10. Performance and Complexity Analysis of Blind FIR Channel Identification Algorithms Based on Deterministic Maximum Likelihood in SIMO Systems

    DEFF Research Database (Denmark)

    De Carvalho, Elisabeth; Omar, Samir; Slock, Dirk

    2013-01-01

    We analyze two algorithms that have been introduced previously for Deterministic Maximum Likelihood (DML) blind estimation of multiple FIR channels. The first one is a modification of the Iterative Quadratic ML (IQML) algorithm. IQML gives biased estimates of the channel and performs poorly at low...... to the initialization. Its asymptotic performance does not reach the DML performance though. The second strategy, called Pseudo-Quadratic ML (PQML), is naturally denoised. The denoising in PQML is furthermore more efficient than in DIQML: PQML yields the same asymptotic performance as DML, as opposed to DIQML......, but requires a consistent initialization. We furthermore compare DIQML and PQML to the strategy of alternating minimization w.r.t. symbols and channel for solving DML (AQML). An asymptotic performance analysis, a complexity evaluation and simulation results are also presented. The proposed DIQML and PQML...

  11. Final design and performance of in situ testing in Grimsel

    International Nuclear Information System (INIS)

    Fuentes-Cantillana, J.L.; Garcia-SiNeriz, J.L.

    1998-01-01

    This report is focused on the design, engineering, and construction aspects of the in situ test carried out at the Grimsel underground laboratory in Switzerland. This reproduces the AGP-granite concept of ENRESA for HLW repositories in crystalline rock. Two heaters, similar in dimensions and weight to the canisters in the reference concept, have been placed in a horizontal drift with a 2.28-m diameter, a total test length of 17.4 m, and backfilled with a total of 115.7 t of highly-compacted bentonite blocks. The backfilled area has been closed with a concrete plug which is 2.7 m thick. More than 600 sensors have been installed in the test to monitor different parameters such as temperature, pressures, humidity, etc., within both the buffer material and the host rock. The installation was completed and commissioned in February 1997, and then the heating phase, which will last for at least 3 years, was started. During this period, the test will basically be operated in an automatic mode, controlled and monitored from Spain via modem. The report is the Final Report from AITEMIN for Phase 4 of the project and includes a description of the test configuration and layout; the design, engineering, and manufacturing aspects of the different test components and equipment; the emplacement operation; and the as built information regarding the final position of the main components and the sensors. (Author)

  12. A Simple and High Performing Rate Control Initialization Method for H.264 AVC Coding Based on Motion Vector Map and Spatial Complexity at Low Bitrate

    Directory of Open Access Journals (Sweden)

    Yalin Wu

    2014-01-01

    Full Text Available The temporal complexity of video sequences can be characterized by the motion vector map, which consists of the motion vectors of each macroblock (MB). In order to obtain the optimal initial QP (quantization parameter) for video sequences with different spatial and temporal complexities, this paper proposes a simple and high-performance initial QP determination method based on the motion vector map and temporal complexity to decide an initial QP for a given target bit rate. The proposed algorithm produces reconstructed video sequences with outstanding and stable quality. For any video sequence, the initial QP can be easily determined from matrices by target bit rate and mapped spatial complexity using the proposed mapping method. Experimental results show that the proposed algorithm achieves better objective and subjective performance than other conventional determination methods.
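
    A sketch of the general idea only: measure a temporal complexity from the motion vector map and a spatial complexity from the frame, then map them to an initial QP through a lookup table indexed by the target bit rate. The thresholds and table values below are hypothetical, not those derived in the paper.

      # Sketch: initial QP from motion-vector-map and spatial-complexity measures.
      import numpy as np

      def temporal_complexity(mv_map):
          """mv_map: array of shape (n_mb, 2) with per-macroblock motion vectors."""
          return float(np.mean(np.hypot(mv_map[:, 0], mv_map[:, 1])))

      def spatial_complexity(frame):
          gy, gx = np.gradient(frame.astype(float))
          return float(np.mean(np.hypot(gx, gy)))

      # rows: low/medium/high temporal complexity; cols: low/medium/high spatial complexity
      QP_TABLE_LOW_BITRATE = np.array([[30, 32, 34],
                                       [32, 35, 37],
                                       [34, 37, 40]])            # hypothetical values

      def initial_qp(frame, mv_map, tc_thresh=(1.0, 4.0), sc_thresh=(5.0, 15.0)):
          tc, sc = temporal_complexity(mv_map), spatial_complexity(frame)
          row = int(np.digitize(tc, tc_thresh))
          col = int(np.digitize(sc, sc_thresh))
          return int(QP_TABLE_LOW_BITRATE[row, col])

      rng = np.random.default_rng(0)
      frame = rng.integers(0, 256, (288, 352))                     # CIF-sized luma frame
      mv_map = rng.normal(0, 2, ((288 // 16) * (352 // 16), 2))
      print("initial QP:", initial_qp(frame, mv_map))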

  13. Development of Higher Education teaching: visibility and professional performance

    Directory of Open Access Journals (Sweden)

    Marilda Aparecida Behrens

    2010-07-01

    This text presents some reflections based on a thorough review of studies carried out by the PEFOP (Educational Paradigms and Teacher Education) group on teaching performance in higher education. The complex scenario of teaching activity and the challenges imposed by daily academic tasks were investigated in order to specify some of the indicators used by students to qualify teaching activity at university. It could be observed that such indicators interfere with the evaluation of the professors' performance. The investigation also showed how professors deal with the required institutional evaluations, whose indicators and metric indices cause concern. Finally, a qualitative field study based on a semi-open-ended questionnaire was carried out with 89 students from a community university.

  14. Complexation of Eu(III) with a polymeric cement additive as a potential carrier of actinides

    Energy Technology Data Exchange (ETDEWEB)

    Lippold, Holger; Becker, Michael [Helmholtz-Zentrum Dresden-Rossendorf e.V., Dresden (Germany). Reactive Transport

    2017-06-01

    In the long term, cementitious materials in a final repository will be exposed to leaching processes generating highly alkaline solutions. Polymeric additives, so-called superplasticizers, are considered as potential mobilizing agents for released radionuclides, since it is uncertain whether complete degradation will take place under the evolving aqueous conditions. Regarding the complexing properties of superplasticizers, there are only indirect assessments so far. In this study, first systematic investigations on complexation with Eu(III) as an analogue of trivalent actinides were performed at variable pH and electrolyte content (NaCl, CaCl₂) using ultrafiltration as a separation method. A stability constant was derived according to the charge neutralization model. For this purpose, the proton exchange capacity was determined by potentiometric titration.

  15. IMPER: Characterization of the wind field over a large wind turbine rotor - final report; Improved performance

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt Paulsen, U.; Wagner, R.

    2012-01-15

    A modern wind turbine rotor of contemporary size sweeps, with its blade tips, the air between roughly 30 m and 116 m above ground and thereby experiences the effects of different winds across that range. Under current rules on power performance measurements, such as IEC 61400-12-1, the reference wind speed is measured at hub height, which oversimplifies the wind power available over the rotor disk area. The project comprised a number of innovative and coordinated measurements on a full-scale turbine with remote sensing technology, together with simulations on a 500 kW wind turbine, to characterize the effects of the wind field. The objective of the present report is to give a short overview of the different experiments carried out and the results obtained within the final phase of this project. (Author)

  16. Test planning and performance

    International Nuclear Information System (INIS)

    Zola, Maurizio

    2001-01-01

    The testing plan should take into account Safety Guide Q4 - Inspection and Testing. A testing plan should be prepared that includes the following information: general information (facility name, item or system reference, procurement document reference, document reference number and status, associated procedures and drawings); a sequential listing of all testing activities; the procedure, work instruction, specification or standard to be followed for each operation and test; acceptance criteria; identification of who is performing the tests; identification of hold points; the type of records to be prepared for each test; and the persons and organizations having authority for final acceptance. The proposed sequence of activities is: visual, electrical and mechanical checks; environmental tests (thermal aging, vibration aging, radiation aging); performance evaluation in extreme conditions; dynamic tests with functional checks; and final electrical and mechanical checks. The planning of the tests should always be performed with an interpretative model in mind: very close cooperation is advisable between the experimental staff and the numerical analysts dealing with more or less complex models for the seismic assessment of structures and components. In the preparatory phase, the choice of the following items should be agreed upon with the final user of the tests: excitation points, excitation types, excitation amplitude as a function of frequency, and measuring points. Data acquisition, recording and storage should take into account the characteristics of the subsequent data processing: too much data can be cumbersome to process, but too little data can make the experimental results unusable. The parameters for time history acquisition should be chosen taking data processing into account; for Shock Response Spectrum calculation some special requirements should be met: frequency-bounded signal, high-frequency sampling, shock noise. For stationary random-like excitation, the sample length

  17. Protein complex detection in PPI networks based on data integration and supervised learning method.

    Science.gov (United States)

    Yu, Feng; Yang, Zhi; Hu, Xiao; Sun, Yuan; Lin, Hong; Wang, Jian

    2015-01-01

    Revealing protein complexes is important for understanding the principles of cellular organization and function. High-throughput experimental techniques have produced a large amount of protein interactions, which makes it possible to predict protein complexes from protein-protein interaction (PPI) networks. However, the small amount of known physical interactions may limit protein complex detection. New PPI networks are constructed by integrating PPI datasets with the large and readily available PPI data from the biomedical literature, and the less reliable PPIs between two proteins are then filtered out based on the semantic similarity and topological similarity of the two proteins. Finally, supervised learning protein complex detection (SLPC), which can make full use of the information in available known complexes, is applied to detect protein complexes on the new PPI networks. The experimental results of SLPC on two different categories of yeast PPI networks demonstrate the effectiveness of the approach: compared with the original PPI networks, the best average improvements of 4.76, 6.81 and 15.75 percentage units in F-score, accuracy and maximum matching ratio (MMR) are achieved, respectively; compared with the denoised PPI networks, the best average improvements of 3.91, 4.61 and 12.10 percentage units in F-score, accuracy and MMR are achieved, respectively; compared with ClusterONE, the state-of-the-art complex detection method, on the denoised extended PPI networks, average improvements of 26.02 and 22.40 percentage units in F-score and MMR are achieved, respectively. The experimental results show that the performance of SLPC improves substantially when newly available PPI data from the biomedical literature are integrated into the original and denoised PPI networks. In addition, our protein complex detection method achieves better performance than ClusterONE.
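
    As a rough illustration of the filtering step described above, the sketch below combines a GO-based semantic similarity (supplied externally) with a Jaccard-style topological similarity and removes low-scoring edges. The weighting, threshold and similarity definitions are assumptions made for illustration and may differ from the scoring actually used by the authors.

      import networkx as nx

      def topological_similarity(g, u, v):
          """Jaccard overlap of the neighbourhoods of u and v (a common proxy)."""
          nu, nv = set(g[u]), set(g[v])
          union = nu | nv
          return len(nu & nv) / len(union) if union else 0.0

      def filter_unreliable_edges(g, semantic_sim, alpha=0.5, threshold=0.2):
          """Drop PPI edges whose combined reliability score falls below a threshold.

          semantic_sim: dict mapping (u, v) pairs to a GO-based semantic similarity in [0, 1].
          """
          filtered = g.copy()
          for u, v in list(g.edges()):
              sem = semantic_sim.get((u, v), semantic_sim.get((v, u), 0.0))
              topo = topological_similarity(g, u, v)
              score = alpha * sem + (1.0 - alpha) * topo
              if score < threshold:
                  filtered.remove_edge(u, v)
          return filtered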

  18. Performance and complexity of tunable sparse network coding with gradual growing tuning functions over wireless networks

    OpenAIRE

    Garrido Ortiz, Pablo; Sørensen, Chres W.; Lucani Roetter, Daniel Enrique; Agüero Calvo, Ramón

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and comp...

  19. Final Report - Certifying the Performance of Small Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    Sherwood, Larry [Small Wind Certification Council, Clifton Park, NY (United States)

    2015-08-28

    The Small Wind Certification Council (SWCC) created a successful accredited certification program for small and medium wind turbines using the funding from this grant. SWCC certifies small turbines (200 square meters of swept area or less) to the American Wind Energy Association (AWEA) Small Wind Turbine Performance and Safety Standard (AWEA Standard 9.1 – 2009). SWCC also certifies medium wind turbines to the International Electrotechnical Commission (IEC) Power Performance Standard (IEC 61400-12-1) and Acoustic Performance Standard (IEC 61400-11).

  20. Fast Flux Test Facility performance monitoring management information: [Final report

    International Nuclear Information System (INIS)

    Newland, D.J.

    1987-09-01

    The purpose of this report is to provide management with performance data on key performance indicators for the month of July, 1987. This report contains the results for key performance indicators divided into two categories, "overall" and "other". The "overall" performance indicators, when considered in the aggregate, provide one means of monitoring overall plant performance.

  1. Monitoring performance of a highly distributed and complex computing infrastructure in LHCb

    Science.gov (United States)

    Mathe, Z.; Haen, C.; Stagni, F.

    2017-10-01

    In order to ensure an optimal performance of the LHCb Distributed Computing, based on LHCbDIRAC, it is necessary to be able to inspect the behavior over time of many components: firstly the agents and services on which the infrastructure is built, but also all the computing tasks and data transfers that are managed by this infrastructure. This consists of recording and then analyzing time series of a large number of observables, for which the usage of SQL relational databases is far from optimal. Therefore, within DIRAC we have been studying novel possibilities based on NoSQL databases (ElasticSearch, OpenTSDB and InfluxDB); as a result of this study, we developed a new monitoring system based on ElasticSearch. It has been deployed on the LHCb Distributed Computing infrastructure, for which it collects data from all the components (agents, services, jobs) and allows creating reports through Kibana and a web user interface based on the DIRAC web framework. In this paper we describe this new implementation of the DIRAC monitoring system. We give details on the ElasticSearch implementation within the DIRAC general framework, as well as an overview of the advantages of the pipeline aggregation used for creating a dynamic bucketing of the time series. We present the advantages of using the ElasticSearch DSL high-level library for creating and running queries. Finally, we present the performance of the system.
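
    To give a feel for the kind of query building the abstract refers to, the sketch below uses the elasticsearch-dsl Python library to build a time-bucketed aggregation with a metric and a pipeline (derivative) aggregation on top of it. The index name, field names and interval are hypothetical and do not come from the LHCbDIRAC schema; a reachable Elasticsearch instance is assumed.

      from elasticsearch import Elasticsearch
      from elasticsearch_dsl import Search

      # Hypothetical index and field names; requires a reachable Elasticsearch instance.
      client = Elasticsearch(["http://localhost:9200"])

      s = (Search(using=client, index="dirac-monitoring")
           .filter("term", component="WorkloadManagement")
           .extra(size=0))                      # aggregations only, no raw hits

      # Dynamic time bucketing: hourly buckets, an average metric per bucket,
      # and a pipeline (derivative) aggregation computed on top of that metric.
      hourly = s.aggs.bucket("per_hour", "date_histogram",
                             field="timestamp", fixed_interval="1h")
      hourly.metric("running_jobs", "avg", field="jobs_running")
      hourly.pipeline("jobs_trend", "derivative", buckets_path="running_jobs")

      response = s.execute()
      for b in response.aggregations.per_hour.buckets:
          trend = getattr(b, "jobs_trend", None)
          print(b.key_as_string, b.running_jobs.value, trend.value if trend else None)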

  2. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea [KAERI, Taejon (Korea, Republic of)

    2004-07-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites for confirming their safety. In this regard, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the necessity of additional complexity factors that can account for a cognitive aspect in evaluating the complexity of procedural steps is evident from comparisons between SC scores and operators' performance data. To this end, comparisons of operators' performance data with their behavior in conducting the prescribed activities of procedural steps are carried out in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of these cognitive complexity factors, it is strongly believed that changes in operator performance can be explained more faithfully if they are taken into consideration.

  3. A study on the identification of cognitive complexity factors related to the complexity of procedural steps

    International Nuclear Information System (INIS)

    Park, Jin Kyun; Jeong, Kwang Sup; Jung, Won Dea

    2004-01-01

    In complex systems, it is well recognized that the provision of understandable procedures that allow operators to clarify 'what needs to be done' and 'how to do it' is one of the requisites for confirming their safety. In this regard, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, the necessity of additional complexity factors that can account for a cognitive aspect in evaluating the complexity of procedural steps is evident from comparisons between SC scores and operators' performance data. To this end, comparisons of operators' performance data with their behavior in conducting the prescribed activities of procedural steps are carried out in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect operators' cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of these cognitive complexity factors, it is strongly believed that changes in operator performance can be explained more faithfully if they are taken into consideration.

  4. Research on Design and Simulation of Biaxial Tensile-Bending Complex Mechanical Performance Test Apparatus

    Directory of Open Access Journals (Sweden)

    Hailian Li

    2017-09-01

    In order to realize a micro-mechanical performance test under biaxial tensile-bending combined loading and to solve the problem of incompatibility between the test apparatus and the observation apparatus, a novel biaxial combined tensile-bending micro-mechanical performance test apparatus was designed. The working principle and major functions of the key constituent parts of the test apparatus, including the servo drive unit, the clamping unit and the test system, are introduced. Based on the finite element method, the biaxial tensile and tension-bending combined mechanical performance of the test-piece was studied as guidance for learning the distribution of elastic and plastic deformation over all sites of the test-piece and for better planning of the test regions. Finally, this test apparatus was used to conduct a biaxial tensile test under different pre-bending loads and a tensile test at different rates; images of the fracture of the test-piece were acquired by a scanning electron microscope and analyzed. The results indicate that as the pre-bending force rises, the elastic deformation phase gradually shortens and the slope of the elastic-deformation curve rises slightly, so that the yield limit is reached earlier. Bending speed exerts a positive and beneficial influence on tensile strength but weakens fracture elongation: if the bending speed is appropriately raised, higher tensile strength can be obtained, but fracture elongation declines.

  5. On Measuring the Complexity of Networks: Kolmogorov Complexity versus Entropy

    Directory of Open Access Journals (Sweden)

    Mikołaj Morzy

    2017-01-01

    One of the most popular methods of estimating the complexity of networks is to measure the entropy of network invariants, such as adjacency matrices or degree sequences. Unfortunately, entropy and all entropy-based information-theoretic measures have several vulnerabilities. These measures are neither independent of a particular representation of the network nor able to capture the properties of the generative process which produces the network. Instead, we advocate the use of algorithmic entropy as the basis for a complexity definition for networks. Algorithmic entropy (also known as Kolmogorov complexity, or K-complexity for short) evaluates the complexity of the description required for a lossless recreation of the network. This measure is not affected by a particular choice of network features and it does not depend on the method of network representation. We perform experiments on Shannon entropy and K-complexity for gradually evolving networks. The results of these experiments point to K-complexity as the more robust and reliable measure of network complexity. The original contribution of the paper includes the introduction of several new entropy-deceiving networks and the empirical comparison of entropy and K-complexity as fundamental quantities for constructing complexity measures for networks.
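
    For readers who want to experiment with the two quantities compared above, the sketch below computes the Shannon entropy of a graph's degree distribution and a crude compression-based proxy for K-complexity (the compressed size of the serialized adjacency matrix). True Kolmogorov complexity is uncomputable, and the paper's own estimator may well differ, so this is only an illustrative stand-in.

      import math
      import zlib
      from collections import Counter

      import networkx as nx

      def degree_entropy(g):
          """Shannon entropy (bits) of the degree distribution."""
          degrees = [d for _, d in g.degree()]
          counts = Counter(degrees)
          n = len(degrees)
          return -sum((c / n) * math.log2(c / n) for c in counts.values())

      def kolmogorov_proxy(g):
          """Crude upper-bound proxy for K-complexity: compressed size (bytes) of the
          adjacency matrix serialized as a bit string."""
          order = sorted(g.nodes())
          bits = "".join("1" if g.has_edge(u, v) else "0" for u in order for v in order)
          return len(zlib.compress(bits.encode("ascii"), level=9))

      g = nx.erdos_renyi_graph(100, 0.05, seed=1)
      print(degree_entropy(g), kolmogorov_proxy(g))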

  6. Review of Public Safety in Viewpoint of Complex Networks

    International Nuclear Information System (INIS)

    Gai Chengcheng; Weng Wenguo; Yuan Hongyong

    2010-01-01

    In this paper, a brief review of public safety from the viewpoint of complex networks is presented. Public safety incidents are divided into four categories: natural disasters, industrial accidents, public health and social security, in which complex network approaches and theories are needed. We review how complex network methods have been developed and used in studies of these kinds of public safety incidents. The typical public safety incidents studied by complex network methods in this paper are introduced, including natural disaster chains, blackouts on electric power grids and epidemic spreading. Finally, we look ahead to the application prospects of complex network theory in public safety.

  7. Final focus systems for linear colliders

    International Nuclear Information System (INIS)

    Erickson, R.A.

    1987-11-01

    The final focus system of a linear collider must perform two primary functions: it must focus the two opposing beams so that their transverse dimensions at the interaction point are small enough to yield acceptable luminosity, and it must steer the beams together to maintain collisions. In addition, the final focus system must transport the outgoing beams to a location where they can be recycled or safely dumped. Elementary optical considerations for linear collider final focus systems are discussed, followed by a treatment of chromatic aberrations. The design of the final focus system of the SLAC Linear Collider (SLC) is described. Tuning, diagnostics and steering to collision are discussed. Most of the examples illustrating the concepts covered are drawn from the SLC, but the principles and conclusions are said to be generally applicable to other linear collider designs as well. 26 refs., 17 figs

  8. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  9. Stability analysis of impulsive parabolic complex networks

    Energy Technology Data Exchange (ETDEWEB)

    Wang Jinliang, E-mail: wangjinliang1984@yahoo.com.cn [Science and Technology on Aircraft Control Laboratory, School of Automation Science and Electrical Engineering, Beihang University, XueYuan Road, No. 37, HaiDian District, Beijing 100191 (China); Wu Huaining [Science and Technology on Aircraft Control Laboratory, School of Automation Science and Electrical Engineering, Beihang University, XueYuan Road, No. 37, HaiDian District, Beijing 100191 (China)

    2011-11-15

    Highlights: > Two impulsive parabolic complex network models are proposed. > The global exponential stability of impulsive parabolic complex networks is considered. > The robust global exponential stability of impulsive parabolic complex networks is considered. - Abstract: In the present paper, two kinds of impulsive parabolic complex networks (IPCNs) are considered. In the first one, all nodes have the same time-varying delay. In the second one, different nodes have different time-varying delays. Using the Lyapunov functional method combined with inequality techniques, some global exponential stability criteria are derived for the IPCNs. Furthermore, several robust global exponential stability conditions are proposed to take uncertainties in the parameters of the IPCNs into account. Finally, numerical simulations are presented to illustrate the effectiveness of the results obtained here.

  10. Modeling Musical Complexity: Commentary on Eerola (2016

    Directory of Open Access Journals (Sweden)

    Joshua Albrecht

    2016-07-01

    In his paper, "Expectancy violation and information-theoretic models of melodic complexity," Eerola compares a number of models that correlate musical features of monophonic melodies with participant ratings of perceived melodic complexity. He finds that fairly strong results can be achieved using several different approaches to modeling perceived melodic complexity. The data used in this study are gathered from several previously published studies that use widely different types of melodies, including isochronous folk melodies, isochronous 12-tone rows, and rhythmically complex African folk melodies. This commentary first briefly reviews the article's method and main findings, then suggests a rethinking of the theoretical framework of the study. Finally, some of the methodological issues of the study are discussed.

  11. Stability analysis of impulsive parabolic complex networks

    International Nuclear Information System (INIS)

    Wang Jinliang; Wu Huaining

    2011-01-01

    Highlights: → Two impulsive parabolic complex network models are proposed. → The global exponential stability of impulsive parabolic complex networks is considered. → The robust global exponential stability of impulsive parabolic complex networks is considered. - Abstract: In the present paper, two kinds of impulsive parabolic complex networks (IPCNs) are considered. In the first one, all nodes have the same time-varying delay. In the second one, different nodes have different time-varying delays. Using the Lyapunov functional method combined with inequality techniques, some global exponential stability criteria are derived for the IPCNs. Furthermore, several robust global exponential stability conditions are proposed to take uncertainties in the parameters of the IPCNs into account. Finally, numerical simulations are presented to illustrate the effectiveness of the results obtained here.

  12. Improvement of a three-dimensional atmospheric dynamic model and examination of its performance over complex terrain

    International Nuclear Information System (INIS)

    Nagai, Haruyasu; Yamazawa, Hiromi

    1994-11-01

    A three-dimensional atmospheric dynamic model (PHYSIC) was improved and its performance was examined using meteorological data observed at a coastal area with complex terrain. To introduce synoptic meteorological conditions into the model, the initial and boundary conditions were improved. With this improvement, the model can predict the temporal change of the wind field for more than 24 hours. Moreover, the model successfully simulates the land and sea breezes observed in the Shimokita area in the summer of 1992. (author)

  13. Low-Complexity Interference-Free Downlink Channel Assignment with Improved Performance in Coordinated Small Cells

    KAUST Repository

    Radaydeh, Redha M.

    2015-05-01

    This paper proposes a low-complexity interference-free channel assignment scheme with improved desired downlink performance in coordinated multi-antenna small-coverage access points (APs) that employ the open-access control strategy. The adopted system treats the case in which each user can be granted access to one of the available channels at a time. Moreover, each receive terminal can suppress a limited number of resolvable interfering sources via its highly-correlated receive array. On the other hand, the operation of the deployed APs can be coordinated to serve active users, and the availability of multiple physical channels and the use of uncorrelated transmit antennas at each AP are exploited to improve the performance of supported users. The analysis provides new approaches for using the transmit antenna array at each AP, the multiple physical channels, and the receive antenna array at each user in order to identify interference-free channels for each user, and then to select a downlink channel that provides the best possible performance. The event of concurrent interference-free channel identification by different users is also treated, to further improve the desired link associated with the scheduled user. The analysis considers the practical scenario of imperfect identification of an interference-free channel by an active user and/or imperfect scheduling of concurrent user requests on the same channel. The developed formulations can be used to study any performance metric and are applicable to any statistical and geometric channel models. © 2015 IEEE.
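
    As a simplified, hypothetical illustration of the selection logic described above (not the paper's actual formulation), the sketch below marks a channel as interference-free for a user when the number of active interferers it sees does not exceed what its receive array can suppress, and then picks the admissible channel with the strongest desired-link gain. All dimensions and inputs are invented.

      import numpy as np

      rng = np.random.default_rng(0)

      n_users, n_channels = 4, 6
      max_suppressible = 2          # interferers a user's receive array can null (assumption)

      # Hypothetical inputs: desired-link gain and number of active interferers
      # seen by each user on each channel.
      gain = rng.rayleigh(scale=1.0, size=(n_users, n_channels))
      interferers = rng.integers(0, 5, size=(n_users, n_channels))

      assignment = {}
      for u in range(n_users):
          free = np.flatnonzero(interferers[u] <= max_suppressible)   # "interference-free" channels
          if free.size == 0:
              assignment[u] = None                                    # no admissible channel
          else:
              assignment[u] = int(free[np.argmax(gain[u, free])])     # best desired link among them
      print(assignment)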

  14. Estimating the operator's performance time of emergency procedural tasks based on a task complexity measure

    International Nuclear Information System (INIS)

    Jung, Won Dae; Park, Jink Yun

    2012-01-01

    It is important to understand the amount of time required to execute an emergency procedural task in a high-stress situation in order to manage human performance under emergencies in a nuclear power plant. However, the time to execute an emergency procedural task is highly dependent upon expert judgment due to the lack of actual data. This paper proposes an analytical method to estimate the operator's performance time (OPT) for a procedural task, based on a measure of task complexity (TACOM). The proposed method for estimating an OPT is an equation that uses the TACOM score as a variable, and the OPT of a procedural task can be calculated if its relevant TACOM score is available. The validity of the proposed equation is demonstrated by comparing the estimated OPTs with the observed OPTs for emergency procedural tasks in a steam generator tube rupture scenario.
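
    The record does not give the actual equation or its coefficients, so the sketch below is purely illustrative: it assumes a log-linear relation between TACOM score and performance time and fits it to made-up observations, which mirrors the general idea of estimating OPT from a TACOM score but is not the authors' published equation.

      import numpy as np

      # Hypothetical data; the paper's actual equation and coefficients are not given here.
      # Assumed form: ln(OPT) = a * TACOM + b, fitted by ordinary least squares.
      tacom = np.array([3.1, 3.8, 4.2, 4.9, 5.5])          # hypothetical TACOM scores
      opt_s = np.array([35.0, 60.0, 95.0, 170.0, 260.0])   # hypothetical observed times (s)

      a, b = np.polyfit(tacom, np.log(opt_s), deg=1)

      def estimate_opt(tacom_score):
          """Estimated performance time (seconds) for a task with the given TACOM score."""
          return float(np.exp(a * tacom_score + b))

      print(round(estimate_opt(4.5), 1))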

  15. Selenophene transition metal complexes

    Energy Technology Data Exchange (ETDEWEB)

    White, Carter James [Iowa State Univ., Ames, IA (United States)

    1994-07-27

    This research shows that selenophene transition metal complexes have a chemistry that is similar to their thiophene analogs. Selenophene coordination has been demonstrated and confirmed by molecular structure in both the η5- and the η1(Se)-coordination modes. The reaction chemistry of selenophene complexes closely resembles that of the analogous thiophene complexes. One major difference, however, is that selenophene is a better donor ligand than thiophene making the selenophene complexes more stable than the corresponding thiophene complexes. The 77Se NMR chemical shift values for selenophene complexes fall within distinct regions primarily depending on the coordination mode of the selenophene ligand. In the final paper, the C-H bond activation of η1(S)-bound thiophenes, η1(S)-benzothiophene and η1(Se)-bound selenophenes has been demonstrated. The deprotonation and rearrangement of the η1(E)-bound ligand to the carbon bound L-yl complex readily occurs in the presence of base. Reprotonation with a strong acid gives a carbene complex that is unreactive towards nucleophilic attack at the carbene carbon and is stable towards exposure to air. The molecular structure of [Cp(NO)(PPh3)Re(2-benzothioenylcarbene)]O3SCF3 was determined and contains a Re-C bond with substantial double bond character. Methyl substitution for the thienylcarbene or selenylcarbene gives a carbene that rearranges thermally to give back the η1(E)-bound complex. Based on these model reactions, a new mechanism for the H/D exchange of thiophene over the hydrodesulfurization catalyst has been proposed.

  16. Quantum Google in a Complex Network

    Science.gov (United States)

    Paparo, Giuseppe Davide; Müller, Markus; Comellas, Francesc; Martin-Delgado, Miguel Angel

    2013-01-01

    We investigate the behaviour of the recently proposed Quantum PageRank algorithm in large complex networks. We find that the algorithm is able to univocally reveal the underlying topology of the network and to identify and order the most relevant nodes. Furthermore, it is capable of clearly highlighting the structure of secondary hubs and of resolving the degeneracy in importance of the low-lying part of the list of rankings. The quantum algorithm displays an increased stability with respect to a variation of the damping parameter, present in the Google algorithm, and a more clearly pronounced power-law behaviour in the distribution of importance, as compared to the classical algorithm. We test the performance and confirm the listed features by applying it to real-world examples from the WWW. Finally, we raise and partially address the question of whether the increased sensitivity of the quantum algorithm persists under coordinated attacks in scale-free and random networks. PMID:24091980

  17. Quantum Google in a Complex Network

    Science.gov (United States)

    Paparo, Giuseppe Davide; Müller, Markus; Comellas, Francesc; Martin-Delgado, Miguel Angel

    2013-10-01

    We investigate the behaviour of the recently proposed Quantum PageRank algorithm in large complex networks. We find that the algorithm is able to univocally reveal the underlying topology of the network and to identify and order the most relevant nodes. Furthermore, it is capable of clearly highlighting the structure of secondary hubs and of resolving the degeneracy in importance of the low-lying part of the list of rankings. The quantum algorithm displays an increased stability with respect to a variation of the damping parameter, present in the Google algorithm, and a more clearly pronounced power-law behaviour in the distribution of importance, as compared to the classical algorithm. We test the performance and confirm the listed features by applying it to real-world examples from the WWW. Finally, we raise and partially address the question of whether the increased sensitivity of the quantum algorithm persists under coordinated attacks in scale-free and random networks.

  18. Evaluating supplier quality performance using analytical hierarchy process

    Science.gov (United States)

    Kalimuthu Rajoo, Shanmugam Sundram; Kasim, Maznah Mat; Ahmad, Nazihah

    2013-09-01

    This paper elaborates on the importance of evaluating supplier quality performance to an organization. Supplier quality performance evaluation reflects the actual performance of the supplier exhibited at the customer's end. It is critical in enabling the organization to determine areas of improvement and thereafter work with the supplier to close the gaps. The success of the customer partly depends on the supplier's quality performance. Key criteria such as quality, cost, delivery, technology support and customer service are categorized as the main factors contributing to a supplier's quality performance. Eighteen suppliers manufacturing automotive application parts were evaluated in 2010 using a weighted-point system. A few suppliers received a common rating, which led to tied rankings among them. The Analytical Hierarchy Process (AHP), a user-friendly decision-making tool for complex and multi-criteria problems, was used to evaluate the suppliers' quality performance, challenging the weighted-point system that had been used for the 18 suppliers. The consistency ratio was checked for the criteria and sub-criteria. The final AHP results were obtained with no overlapping ratings and therefore yielded a better decision-making methodology compared to the weighted-point rating system.
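
    To make the AHP mechanics above concrete, the sketch below derives priority weights as the principal eigenvector of a reciprocal pairwise-comparison matrix and checks Saaty's consistency ratio. The comparison values for the five criteria named in the abstract are invented for illustration and are not the study's data.

      import numpy as np

      # Saaty's random consistency index (RI) for matrix sizes 1..10.
      RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
            6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

      def ahp_weights(pairwise):
          """Priority weights (principal eigenvector) and consistency ratio of a
          reciprocal pairwise-comparison matrix."""
          a = np.asarray(pairwise, dtype=float)
          n = a.shape[0]
          eigvals, eigvecs = np.linalg.eig(a)
          k = np.argmax(eigvals.real)
          w = np.abs(eigvecs[:, k].real)
          w /= w.sum()
          ci = (eigvals[k].real - n) / (n - 1)
          cr = ci / RI[n] if RI[n] > 0 else 0.0
          return w, cr

      # Hypothetical comparison of the five criteria named in the abstract
      # (quality, cost, delivery, technology support, customer service).
      criteria = [[1,   3,   5,   7,   7],
                  [1/3, 1,   3,   5,   5],
                  [1/5, 1/3, 1,   3,   3],
                  [1/7, 1/5, 1/3, 1,   1],
                  [1/7, 1/5, 1/3, 1,   1]]
      weights, cr = ahp_weights(criteria)
      print(np.round(weights, 3), round(cr, 3))  # CR below ~0.10 is usually deemed consistent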

  19. Does doctors’ workload impact supervision and ward activities of final-year students? A prospective study

    Directory of Open Access Journals (Sweden)

    Celebi Nora

    2012-06-01

    Background: Hospital doctors face constantly increasing workloads. Besides caring for patients, their duties also comprise the education of future colleagues. The aim of this study was to objectively investigate whether the workload arising from increased patient care interferes with student supervision and is associated with more non-medical activities of final-year medical students. Methods: A total of 54 final-year students were asked to keep a diary of their daily activities over a three-week period at the beginning of their internship in Internal Medicine. Students categorized their activities, both medical and non-medical, according to whether they had (1) only watched, (2) assisted the ward resident, (3) performed the activity themselves under supervision of the ward resident, or (4) performed the activity without supervision. The activities reported on a particular day were matched with a ward-specific workload index derived from the hospital information system, including the number of patients treated on the corresponding ward on that day, a correction factor according to the patient comorbidity complexity level (PCCL), and the number of admissions and discharges. Both students and ward residents were blinded to the study question. Results: A total of 32 diaries (59%, 442 recorded working days) were handed back. Overall, the students reported 1.2 ± 1.3 supervised, 1.8 ± 1.6 medical and 3.6 ± 1.7 non-medical activities per day. The more supervised activities were reported, the more the number of reported medical activities increased. Conclusions: There was a significant association between ward doctors' supervision of students and the number of medical activities performed by medical students. The workload had no significant effect on supervision or on the number of medical or non-medical activities of final-year students.

  20. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming; Claudel, Christian

    2017-01-01

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring or Kalman-filter based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  1. A high performance, low power computational platform for complex sensing operations in smart cities

    KAUST Repository

    Jiang, Jiming

    2017-02-02

    This paper presents a new wireless platform designed for an integrated traffic/flash flood monitoring system. The sensor platform is built around a 32-bit ARM Cortex M4 microcontroller and a 2.4 GHz 802.15.4 ISM compliant radio module. It can be interfaced with fixed traffic sensors, or receive data from vehicle transponders. This platform is specifically designed for solar-powered, low bandwidth, high computational performance wireless sensor network applications. A self-recovering unit is designed to increase reliability and allow periodic hard resets, an essential requirement for sensor networks. Radio monitoring circuitry is proposed to monitor incoming and outgoing transmissions, simplifying software debugging. We illustrate the performance of this wireless sensor platform on complex problems arising in smart cities, such as traffic flow monitoring, machine-learning-based flash flood monitoring or Kalman-filter based vehicle trajectory estimation. All design files have been uploaded and shared in an open science framework, and can be accessed from [1]. The hardware design is under CERN Open Hardware License v1.2.

  2. Interpretation of stream programs: characterizing type 2 polynomial time complexity

    OpenAIRE

    Férée , Hugo; Hainry , Emmanuel; Hoyrup , Mathieu; Péchoux , Romain

    2010-01-01

    We study polynomial time complexity of type 2 functionals. For that purpose, we introduce a first order functional stream language. We give criteria, named well-founded, on such programs relying on second order interpretation that characterize two variants of type 2 polynomial complexity including the Basic Feasible Functions (BFF). These characterizations provide a new insight on the complexity of stream programs. Finally, we adapt these results to functions over th...

  3. Research on image complexity evaluation method based on color information

    Science.gov (United States)

    Wang, Hao; Duan, Jin; Han, Xue-hui; Xiao, Bo

    2017-11-01

    In order to evaluate the complexity of a color image more effectively and to find the connection between image complexity and image information, this paper presents a method to compute the complexity of an image based on color information. The theoretical analysis first divides complexity at the subjective level into three levels: low complexity, medium complexity and high complexity. It then carries out image feature extraction and finally establishes the function between the complexity value and the color characteristic model. The experimental results show that this kind of evaluation method can objectively reconstruct the complexity of the image from the extracted image features. The complexity values obtained by the proposed method are in good agreement with human visual perception of complexity, so the color-image complexity measure has a certain reference value.
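
    A toy version of such a color-based complexity score is sketched below: the entropy of a quantized RGB histogram is used as the score and then mapped to the three subjective levels mentioned above. The feature, thresholds and level mapping are assumptions for illustration and are not the paper's color characteristic model.

      import numpy as np

      def color_complexity(rgb, bins=32):
          """Entropy (bits) of the quantized joint RGB histogram, used here as a simple
          color-information proxy for image complexity."""
          pixels = (rgb.reshape(-1, 3) // (256 // bins)).astype(np.int64)   # quantize channels
          codes = pixels[:, 0] * bins * bins + pixels[:, 1] * bins + pixels[:, 2]
          counts = np.bincount(codes, minlength=bins ** 3).astype(float)
          p = counts[counts > 0] / counts.sum()
          return float(-(p * np.log2(p)).sum())

      def complexity_level(score, thresholds=(4.0, 8.0)):
          """Map the score to the three subjective levels (low, medium, high)."""
          return ["low", "medium", "high"][int(np.digitize(score, thresholds))]

      rng = np.random.default_rng(0)
      flat = np.full((64, 64, 3), 128, dtype=np.uint8)                      # uniform gray patch
      noisy = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)        # random color noise
      for img in (flat, noisy):
          s = color_complexity(img)
          print(round(s, 2), complexity_level(s))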

  4. Performance of predictive models in phase equilibria of complex associating systems: PC-SAFT and CEOS/GE

    Directory of Open Access Journals (Sweden)

    N. Bender

    2013-03-01

    Cubic equations of state combined with excess Gibbs energy predictive models (like UNIFAC) and equations of state based on applied statistical mechanics are among the main alternatives for phase equilibria prediction involving polar substances over wide temperature and pressure ranges. In this work, the predictive performances of PC-SAFT with an association contribution and of Peng-Robinson (PR) combined with UNIFAC (Do) through mixing rules are compared. Binary and multi-component systems involving polar and non-polar substances were analyzed. Results were also compared to experimental data available in the literature. Results show a similar predictive performance for PC-SAFT with association and for cubic equations combined with UNIFAC (Do) through mixing rules. Although PC-SAFT with association requires fewer parameters, it is more complex and requires more computation time.

  5. The complexities of complex span: explaining individual differences in working memory in children and adults.

    Science.gov (United States)

    Bayliss, Donna M; Jarrold, Christopher; Gunn, Deborah M; Baddeley, Alan D

    2003-03-01

    Two studies are presented that investigated the constraints underlying working memory performance in children and adults. In each case, independent measures of processing efficiency and storage capacity are assessed to determine their relative importance in predicting performance on complex span tasks, which measure working memory capacity. Results show that complex span performance was independently constrained by individual differences in domain-general processing efficiency and domain-specific storage capacity. Residual variance, which may reflect the ability to coordinate storage and processing, also predicted academic achievement. These results challenge the view that complex span taps a limited-capacity resource pool shared between processing and storage operations. Rather, they are consistent with a multiple-component model in which separate resource pools support the processing and storage functions of working memory.

  6. Benchmark and Continuous Improvement of Performance

    Directory of Open Access Journals (Sweden)

    Alina Alecse Stanciu

    2017-12-01

    The present economic environment challenges us to perform and to think and re-think our personal strategies in accordance with our entities' strategies, whether we are employees or entrepreneurs. It is an environment characterised by volatility, uncertainty, complexity and ambiguity (a VUCA world) in which entities must fight for the position they have gained in the market, disrupt new markets and new economies, and develop their client portfolio, with performance as the final goal. Added to this is the pressure of the driving forces known as the 2030 megatrends: Globalization 2.0, the environmental crisis and the scarcity of resources, individualism and value pluralism, and demographic change. This paper examines whether benchmarking is an opportunity to increase the competitiveness of Romanian SMEs; the results show that benchmarking is a powerful instrument, combining a reduced negative impact on the environment with a positive impact on the economy and society.

  7. Sewage Sludge Incinerators: Final Standards of Performance for New Stationary Sources and Emission Guidelines for Existing Sources Final Rule Fact Sheets

    Science.gov (United States)

    This page contains a February 2011 fact sheet with information regarding the final NSPS and Emission Guidelines for Existing Sources for Sewage Sludge Incinerators (SSI). This document provides a summary of the information for these regulations.

  8. Methods of characterization of salt formations in view of spent fuel final disposal

    International Nuclear Information System (INIS)

    Diaconu, Daniela; Balan, Valeriu; Mirion, Ilie

    2002-01-01

    Deep disposal in geological formations of salt, granite and clay seems at present to be the most appropriate and commonly adopted solution for the final disposal of high-level radioactive waste and spent fuel. Disposing of such wastes is a top-priority issue for the European research community in the field of nuclear power. Although it may seem premature for the Romanian power system, the interest in final disposal of spent fuel is justified by the long duration of the studies targeting this objective. At the same time, these studies represent the Romanian nuclear research contribution to the efforts of integration within the European research area. Although Romania has not yet made a decision favoring a given geological formation for the final disposal of the spent fuel resulting from the Cernavoda NPP, the salt formation appears to be the one most generally taken into consideration. The final decision will be made following the evaluation of its performance for spent fuel disposal, based on the values of the specific parameters of the geological formation. In order to supply the data required as input parameters for the codes evaluating the performance of the geological formation, INR Pitesti initiated a package of modern and complex methodologies for such determinations. The studies developed so far have addressed the special phenomenon of salt convergence, characteristic of this kind of rock only, as well as radionuclide migration. These studies allow a better understanding of these processes, which are of utmost importance for the disposal's safety. The methods and the experimental installation designed and built at INR Pitesti are aimed at the determination of the thermal expansion coefficient, thermal conductivity and specific heat, all parameters of high interest for high-level radioactive waste or spent fuel disposal. The paper presents the results of these studies as well as the methodologies, the experimental installations and the findings.

  9. Heat pumps for geothermal applications: availability and performance. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Reistad, G.M.; Means, P.

    1980-05-01

    A study of the performance and availability of water-source heat pumps was carried out. The primary purposes were to obtain the necessary basic information required for proper evaluation of the role of water-source heat pumps in geothermal energy utilization and/or to identify the research needed to provide this information. The Search of Relevant Literature considers the historical background, applications, achieved and projected performance evaluations and performance improvement techniques. The commercial water-source heat pump industry is considered in regard to both the present and projected availability and performance of units. Performance evaluations are made for units that use standard components but are redesigned for use in geothermal heating.

  10. High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides

    Energy Technology Data Exchange (ETDEWEB)

    Daniel A. Mosher; Xia Tang; Ronald J. Brown; Sarah Arsenault; Salvatore Saitta; Bruce L. Laube; Robert H. Dold; Donald L. Anton

    2007-07-27

    This final report describes the motivations, activities and results of the hydrogen storage independent project "High Density Hydrogen Storage System Demonstration Using NaAlH4 Based Complex Compound Hydrides" performed by the United Technologies Research Center under the Department of Energy Hydrogen Program, contract # DE-FC36-02AL67610. The objectives of the project were to identify and address the key systems technologies associated with applying complex hydride materials, particularly ones which differ from those for conventional metal hydride based storage. This involved the design, fabrication and testing of two prototype systems based on the hydrogen storage material NaAlH4. Safety testing, catalysis studies, heat exchanger optimization, reaction kinetics modeling, thermochemical finite element analysis, powder densification development and material neutralization were elements included in the effort.

  11. Performance Measures for Public Participation Methods : Final Report

    Science.gov (United States)

    2018-01-01

    Public engagement is an important part of transportation project development, but measuring its effectiveness is typically piecemealed. Performance measurement, described by the Urban Institute as the measurement on a regular basis of the results (o...

  12. Sixty-five-year old final clarifier performance rivals that of modern designs.

    Science.gov (United States)

    Barnard, James L; Kunetz, Thomas E; Sobanski, Joseph P

    2008-01-01

    The Stickney plant of the Metropolitan Wastewater Reclamation District of Greater Chicago (MWRDGC), one of the largest wastewater treatment plants in the world, treats an average dry weather flow of 22 m3/s and a sustained wet weather flow of 52 m3/s that can peak to 63 m3/s. Most of the inner city of Chicago has combined sewers, and in order to reduce pollution through combined sewer overflows (CSO), the 175 km Tunnel and Reservoir Plan (TARP) tunnels, up to 9.1 m in diameter, were constructed to receive and convey CSO to a reservoir from where it will be pumped to the Stickney treatment plant. Pumping back storm flows will result in sustained wet weather flows over periods of weeks. Much of the success of the plant will depend on the ability of 96 circular final clarifiers to produce an effluent of acceptable quality. The nitrifying activated sludge plant is arranged in a plug-flow configuration, and some denitrification takes place as a result of the high oxygen demand in the first pass of the four-pass aeration basins that have a length to width ratio of 18:1. The SVI of the mixed liquor varies between 60 and 80 ml/g. The final clarifiers, which were designed by the District's design office in 1938, have functioned for more than 65 years without major changes and are still producing very high-quality effluent. This paper will discuss the design and operation of these final clarifiers and compare the design with more modern design practices. (c) IWA Publishing 2008.

  13. Modelling and transmission-line calculations of the final superconducting dipole and quadrupole chains of CERN's LHC collider methods and results

    CERN Document Server

    Dahlerup-Petersen, K

    2001-01-01

    Summary form only given, as follows. A long chain of superconducting magnets represents a complex load impedance for the powering and turns into a complex generator during the energy extraction. Detailed information about the circuit is needed for the calculation of a number of parameters and features, which are of vital importance for the choice of powering and extraction equipment and for the prediction of the circuit performance under normal and fault conditions. Constitution of the complex magnet chain impedance is based on a synthesized, electrical model of the basic magnetic elements. This is derived from amplitude and phase measurements of coil and ground impedances from d.c. to 50 kHz and the identification of poles and zeros of the impedance and transfer functions. An electrically compatible RLC model of each magnet type was then synthesized by means of a combination of conventional algorithms. Such models have been elaborated for the final, 15-m long LHC dipole (both apertures in series) as well as ...

  14. A performance assessment methodology for low-level radioactive waste disposal

    International Nuclear Information System (INIS)

    Deering, L.R.; Kozak, M.W.

    1990-01-01

    To demonstrate compliance with the performance objectives governing protection of the general population in 10 CFR 61.41, applicants for land disposal of low-level radioactive waste are required to conduct a pathways analysis, or quantitative evaluation of radionuclide release, transport through environmental media, and dose to man. The Nuclear Regulatory Commission staff defined a strategy and initiated a project at Sandia National Laboratories to develop a methodology for independently evaluating an applicant's analysis of postclosure performance. This performance assessment methodology was developed in five stages: (1) identification of environmental pathways, (2) ranking the significance of the pathways, (3) identification and integration of models for pathway analyses, (4) identification and selection of computer codes and techniques for the methodology, and (5) implementation of the codes and documentation of the methodology. The final methodology implements analytical and simple numerical solutions for source term, ground-water flow and transport, surface water transport, air transport, food chain, and dosimetry analyses, as well as more complex numerical solutions for multidimensional or transient analyses when more detailed assessments are needed. The capability to perform both simple and complex analyses is accomplished through modular modeling, which permits substitution of various models and codes to analyze system components.

  15. Sleep spindle and K-complex detection using tunable Q-factor wavelet transform and morphological component analysis

    Directory of Open Access Journals (Sweden)

    Tarek eLajnef

    2015-07-01

    A novel framework for the joint detection of sleep spindles and K-complex events, two hallmarks of sleep stage S2, is proposed. Sleep electroencephalography (EEG) signals are split into oscillatory (spindle) and transient (K-complex) components. This decomposition is conveniently achieved by applying morphological component analysis (MCA) to a sparse representation of EEG segments obtained by the recently introduced discrete tunable Q-factor wavelet transform (TQWT). Tuning the Q-factor provides a convenient and elegant tool to naturally decompose the signal into an oscillatory and a transient component. The actual detection step relies on thresholding (i) the transient component to reveal K-complexes and (ii) the time-frequency representation of the oscillatory component to identify sleep spindles. Optimal thresholds are derived from ROC-like curves (sensitivity versus FDR) on training sets, and the performance of the method is assessed on test data sets. We assessed the performance of our method using full-night sleep EEG data we collected from 14 participants. In comparison to visual scoring (Expert 1), the proposed method detected spindles with a sensitivity of 83.18% and a false discovery rate (FDR) of 39%, while K-complexes were detected with a sensitivity of 81.57% and an FDR of 29.54%. Similar performances were obtained when using a second expert as benchmark. In addition, when the TQWT and MCA steps were excluded from the pipeline, the detection sensitivities dropped down to 70% for spindles and to 76.97% for K-complexes, while the FDR rose up to 43.62% and 49.09%, respectively. Finally, we also evaluated the performance of the proposed method on a set of publicly available sleep EEG recordings. Overall, the results we obtained suggest that the TQWT-MCA method may be a valuable alternative to existing spindle and K-complex detection methods. Paths for improvements and further validations with large-scale standard open-access benchmarking data sets are
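
    The thresholding step described above can be illustrated, in a much-simplified form, by the sketch below: it thresholds the sigma-band (roughly 11-16 Hz) power of a spectrogram to flag spindle candidates on a synthetic signal. It deliberately omits the TQWT/MCA decomposition and the K-complex branch, and the band, window and threshold values are assumptions rather than the paper's settings.

      import numpy as np
      from scipy.signal import spectrogram

      def detect_spindles(eeg, fs, band=(11.0, 16.0), rel_threshold=3.0):
          """Flag spindle candidates where sigma-band power exceeds a multiple of its median.

          Simplified stand-in for the paper's pipeline: no TQWT/MCA decomposition here,
          just thresholding of a time-frequency representation of the raw signal."""
          f, t, sxx = spectrogram(eeg, fs=fs, nperseg=int(fs), noverlap=int(fs * 0.75))
          sel = (f >= band[0]) & (f <= band[1])
          sigma_power = sxx[sel].mean(axis=0)
          threshold = rel_threshold * np.median(sigma_power)
          return t[sigma_power > threshold]          # centre times (s) of candidate windows

      fs = 200
      t = np.arange(0, 30, 1 / fs)
      eeg = np.random.default_rng(0).normal(0, 10, t.size)                    # synthetic background EEG
      eeg[int(10 * fs):int(10.8 * fs)] += 30 * np.sin(2 * np.pi * 13 * t[:int(0.8 * fs)])
      print(detect_spindles(eeg, fs))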

  16. General Synthesis of Transition-Metal Oxide Hollow Nanospheres/Nitrogen-Doped Graphene Hybrids by Metal-Ammine Complex Chemistry for High-Performance Lithium-Ion Batteries.

    Science.gov (United States)

    Chen, Jiayuan; Wu, Xiaofeng; Gong, Yan; Wang, Pengfei; Li, Wenhui; Mo, Shengpeng; Peng, Shengpan; Tan, Qiangqiang; Chen, Yunfa

    2018-02-09

    We present a general and facile synthesis strategy, based on metal-ammine complex chemistry, for synthesizing hollow transition-metal oxide (Co3O4, NiO, CuO-Cu2O, and ZnO)/nitrogen-doped graphene hybrids, potentially applicable in high-performance lithium-ion batteries. The oxygen-containing functional groups of graphene oxide play a prerequisite role in the formation of hollow transition-metal oxides on the graphene nanosheets, and a significant hollowing process occurs only when metal (Co2+, Ni2+, Cu2+, or Zn2+)-ammine complex ions are formed. Moreover, the hollowing process correlates well with the complexing capacity between the metal ions and NH3 molecules. A significant hollowing process occurs for strongly complexing metal-ammine ions, including Co2+, Ni2+, Cu2+, and Zn2+, whereas no hollow structures form for the weakly complexing or non-complexing Mn2+ and Fe3+ ions. Simultaneously, this novel strategy can also achieve the direct doping of nitrogen atoms into the graphene framework. The electrochemical performance of two typical hollow Co3O4 or NiO/nitrogen-doped graphene hybrids was evaluated by their use as anodic materials. It was demonstrated that these unique nanostructured hybrids, in contrast with their bare counterparts, solid transition-metal oxide/nitrogen-doped graphene hybrids, exhibit significantly improved specific capacity, superior rate capability, and excellent capacity retention. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. CubiCal - Fast radio interferometric calibration suite exploiting complex optimisation

    Science.gov (United States)

    Kenyon, J. S.; Smirnov, O. M.; Grobler, T. L.; Perkins, S. J.

    2018-05-01

    It has recently been shown that radio interferometric gain calibration can be expressed succinctly in the language of complex optimisation. In addition to providing an elegant framework for further development, it exposes properties of the calibration problem which can be exploited to accelerate traditional non-linear least squares solvers such as Gauss-Newton and Levenberg-Marquardt. We extend existing derivations to chains of Jones terms: products of several gains which model different aberrant effects. In doing so, we find that the useful properties found in the single term case still hold. We also develop several specialised solvers which deal with complex gains parameterised by real values. The newly developed solvers have been implemented in a Python package called CubiCal, which uses a combination of Cython, multiprocessing and shared memory to leverage the power of modern hardware. We apply CubiCal to both simulated and real data, and perform both direction-independent and direction-dependent self-calibration. Finally, we present the results of some rudimentary profiling to show that CubiCal is competitive with respect to existing calibration tools such as MeqTrees.
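
    As a rough illustration of the kind of gain solution such tools compute, the NumPy sketch below implements the standard alternating least-squares update for direction-independent diagonal gains, the iteration often used to motivate complex-optimisation calibration. It is not CubiCal's implementation: the damping factor, the convergence test and the single-channel, single-correlation data layout are simplifying assumptions.

```python
import numpy as np

def solve_gains(V, M, n_iter=50, tol=1e-8):
    """Minimal diagonal-gain solver for the model V_pq ≈ g_p * M_pq * conj(g_q).

    V, M: (n_ant, n_ant) complex observed and predicted visibility matrices
    (single channel, single correlation). This is the textbook alternating
    least-squares update, not CubiCal's code.
    """
    g = np.ones(V.shape[0], dtype=complex)
    for _ in range(n_iter):
        z = M * np.conj(g)[None, :]              # z_pq = M_pq * conj(g_q)
        g_new = np.sum(V * np.conj(z), axis=1) / np.sum(np.abs(z) ** 2, axis=1)
        g_next = 0.5 * (g + g_new)               # damping stabilises convergence
        if np.max(np.abs(g_next - g)) < tol:
            return g_next
        g = g_next
    return g
```

    On a toy problem (random gains applied to a random model matrix) this iteration recovers the gains up to the usual overall phase ambiguity.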

  18. Relationship Between Final Performance and Block Times with the Traditional and the New Starting Platforms with A Back Plate in International Swimming Championship 50-M and 100-M Freestyle Events

    Science.gov (United States)

    Garcia-Hermoso, Antonio; Escalante, Yolanda; Arellano, Raul; Navarro, Fernando; Domínguez, Ana M.; Saavedra, Jose M.

    2013-01-01

    The purpose of this study was to investigate the association between block time and final performance for each sex in 50-m and 100-m individual freestyle, distinguishing between classification (1st to 3rd, 4th to 8th, 9th to 16th) and type of starting platform (old and new) in international competitions. Twenty-six international competitions covering a 13-year period (2000-2012) were analysed retrospectively. The data corresponded to a total of 1657 swimmers’ competition histories. A two-way ANOVA (sex x classification) was performed for each event and starting platform with the Bonferroni post-hoc test, and another two-way ANOVA for sex and starting platform (sex x starting platform). Pearson’s simple correlation coefficient was used to determine correlations between the block time and the final performance. Finally, a simple linear regression analysis was done between the final time and the block time for each sex and platform. The men had shorter starting block times than the women in both events and from both platforms. For the 50-m event, medalists had shorter block times than semi-finalists with the old starting platforms. Block times were directly related to performance with the old starting platforms. With the new starting platforms, however, the relationship was inverse, notably in the women’s 50-m event. The block time was related to final performance in the men’s 50-m event with the old starting platform, but with the new platform it was critical only for the women’s 50-m event. Key Points The men had shorter block times than the women in both events and with both platforms. For both distances, the swimmers had shorter block times in their starts from the new starting platform with a back plate than with the old platform. For the 50-m event with the old starting platform, the medalists had shorter block times than the semi-finalists. The new starting platform block time was only determinant in the women’s 50-m event. In order to improve
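
    For readers who want to reproduce the association analysis on their own data, the snippet below shows the two basic steps the abstract names, Pearson correlation and a simple linear regression of final time on block time, using SciPy. The numeric arrays are purely hypothetical placeholders, not data from the study.

```python
import numpy as np
from scipy import stats

# Hypothetical values (seconds) for one event / sex / platform combination
block_time = np.array([0.68, 0.71, 0.74, 0.70, 0.76, 0.73, 0.69, 0.75])
final_time = np.array([21.9, 22.3, 22.8, 22.1, 23.0, 22.6, 22.0, 22.9])

r, p = stats.pearsonr(block_time, final_time)          # association
fit = stats.linregress(block_time, final_time)          # final_time ~ block_time
print(f"r = {r:.2f} (p = {p:.3f}); final_time = "
      f"{fit.slope:.1f} * block_time + {fit.intercept:.1f}")
```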

  19. Pecan Street Grid Demonstration Program. Final technology performance report

    Energy Technology Data Exchange (ETDEWEB)

    None, None

    2015-02-10

    This document represents the final Regional Demonstration Project Technical Performance Report (TPR) for Pecan Street Inc.’s (Pecan Street) Smart Grid Demonstration Program, DE-OE-0000219. Pecan Street is a 501(c)(3) smart grid/clean energy research and development organization headquartered at The University of Texas at Austin (UT). Pecan Street worked in collaboration with Austin Energy, UT, Environmental Defense Fund (EDF), the City of Austin, the Austin Chamber of Commerce and selected consultants, contractors, and vendors to take a more detailed look at the energy load of residential and small commercial properties while the power industry is undergoing modernization. The Pecan Street Smart Grid Demonstration Program signed up over 1,000 participants who are sharing their home or business’s electricity consumption data with the project via green button protocols, smart meters, and/or a home energy monitoring system (HEMS). Pecan Street completed the installation of HEMS in 750 homes and 25 commercial properties. The program provided incentives to increase the installed base of roof-top solar photovoltaic (PV) systems, plug-in electric vehicles with Level 2 charging, and smart appliances. Over 200 participants within a one square mile area took advantage of Austin Energy and Pecan Street’s joint PV incentive program and installed roof-top PV as part of this project. Of these homes, 69 purchased or leased an electric vehicle through Pecan Street’s PV rebate program and received a Level 2 charger from Pecan Street. Pecan Street studied the impacts of these technologies along with a variety of consumer behavior interventions, including pricing models, real-time feedback on energy use, incentive programs, and messaging, as well as the corresponding impacts on Austin Energy’s distribution assets. The primary demonstration site was the Mueller community in Austin, Texas. The Mueller development, located less than three miles from the Texas State Capitol

  20. MARKETING MIX IN OLTENIA ENERGY COMPLEX

    Directory of Open Access Journals (Sweden)

    Păunescu Alberto Nicolae

    2012-12-01

    Full Text Available About 30% of Romania's electricity generation is produced by the OLTENIA ENERGY COMPLEX, the country's biggest producer of energy and coal. The marketing mix is therefore very important to ensure that the company grows, the final objective being growth in sales volume and market share.

  1. Performance and Complexity of Tunable Sparse Network Coding with Gradual Growing Tuning Functions over Wireless Networks

    DEFF Research Database (Denmark)

    Garrido, Pablo; Sørensen, Chres Wiant; Roetter, Daniel Enrique Lucani

    2016-01-01

    Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, in particular when applied over wireless mesh networks, since it provides robustness against packet losses. On the other hand, Tunable Sparse Network Coding (TSNC) is a promising concept, which leverages...... a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational cost. In addition, it would be difficult to implement, due to the feedback delay. In this work...
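
    To make the density/complexity trade-off concrete, here is a small Python sketch of sparse RLNC encoding over GF(2) with a "gradually growing" density schedule. The schedule, the field choice and the packet sizes are illustrative assumptions, not the tuning functions analysed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def tuning_function(received, generation_size):
    """Toy 'gradually growing' density schedule: start sparse, finish dense.
    The schedules studied in the paper are more refined; this is only a sketch."""
    return min(0.5, 0.05 + 0.45 * received / generation_size)

def encode_packet(generation, density):
    """Sparse RLNC over GF(2): XOR a random sparse subset of the source packets."""
    coeffs = rng.random(len(generation)) < density
    if not coeffs.any():                      # avoid the useless all-zero packet
        coeffs[rng.integers(len(generation))] = True
    payload = np.zeros_like(generation[0])
    for packet, used in zip(generation, coeffs):
        if used:
            payload ^= packet                 # GF(2) addition is XOR
    return coeffs.astype(np.uint8), payload

# Usage: a generation of 16 source packets, 64 bytes each, 4 packets already received
generation = [rng.integers(0, 256, 64, dtype=np.uint8) for _ in range(16)]
coding_vector, coded_payload = encode_packet(
    generation, tuning_function(received=4, generation_size=16))
```

    Lower densities mean fewer XOR operations per coded packet (lower complexity) but a higher chance of linearly dependent packets, which is exactly the trade-off the tuning function tries to balance.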

  2. Building on the EGIPPS performance assessment: the multipolar framework as a heuristic to tackle the complexity of performance of public service oriented health care organisations.

    Science.gov (United States)

    Marchal, Bruno; Hoerée, Tom; da Silveira, Valéria Campos; Van Belle, Sara; Prashanth, Nuggehalli S; Kegels, Guy

    2014-04-17

    Performance of health care systems is a key concern of policy makers and health service managers all over the world. It is also a major challenge, given its multidimensional nature that easily leads to conceptual and methodological confusion. This is reflected by a scarcity of models that comprehensively analyse health system performance. In health, one of the most comprehensive performance frameworks was developed by the team of Leggat and Sicotte. Their framework integrates 4 key organisational functions (goal attainment, production, adaptation to the environment, and values and culture) and the tensions between these functions. We modified this framework to better fit the assessment of the performance of health organisations in the public service domain and propose an analytical strategy that takes into account the social complexity of health organisations. The resulting multipolar performance framework (MPF) is a meta-framework that facilitates the analysis of the relations and interactions between the multiple actors that influence the performance of health organisations. Using the MPF in a dynamic reiterative mode not only helps managers to identify the bottlenecks that hamper performance, but also the unintended effects and feedback loops that emerge. Similarly, it helps policymakers and programme managers at central level to better anticipate the potential results and side effects of and required conditions for health policies and programmes and to steer their implementation accordingly.

  3. Synthesis, structure and catalytic activities of nickel(II) complexes bearing N4 tetradentate Schiff base ligand

    Science.gov (United States)

    Sarkar, Saikat; Nag, Sanat Kumar; Chattopadhyay, Asoke Prasun; Dey, Kamalendu; Islam, Sk. Manirul; Sarkar, Avijit; Sarkar, Sougata

    2018-05-01

    Two new nickel(II) complexes [Ni(L)Cl2] (1) and [Ni(L)(NCS)2] (2) of a neutral tetradentate mono-condensed Schiff base ligand, 3-(2-(2-aminoethylamino)ethylimino)butan-2-one oxime (L), have been synthesized and characterized using different physicochemical techniques, e.g. elemental analyses, spectroscopic (IR, electronic, NMR) methods, and conductivity and molecular measurements. The crystal structure of complex (2) has been determined by the single-crystal X-ray diffraction method and suggests a distorted octahedral geometry around nickel(II) with a NiN6 coordination environment. The non-coordinated O–H group on the ligand L remains engaged in H-bonding interactions with the S end of the coordinated thiocyanate moiety. These H-bonding interactions lead to O–S separations of 3.132 Å and play a prominent role in crystal packing. It is observed that the mononuclear units are glued together by such O–H…S interactions, finally resulting in a 1D supramolecular sheet-like arrangement. DFT/TDDFT-based theoretical calculations were also performed on the ligand and the complexes to gain insight into their optimized geometries, electronic transitions and molecular energy levels. Finally, the catalytic behavior of the complexes towards the oxidation of styrene was also investigated. A variety of reaction conditions, such as the solvent, temperature and time, and the substrate-to-oxidant ratio, were thoroughly studied to judge the catalytic efficiency of the Ni(II) coordination entity.

  4. Resource-Constrained Low-Complexity Video Coding for Wireless Transmission

    DEFF Research Database (Denmark)

    Ukhanova, Ann

    of video quality. We proposed a new metric for objective quality assessment that considers frame rate. As many applications deal with wireless video transmission, we performed an analysis of compression and transmission systems with a focus on power-distortion trade-off. We proposed an approach...... for rate-distortion-complexity optimization of the upcoming video compression standard HEVC. We also provided a new method allowing a decrease of power consumption on mobile devices in 3G networks. Finally, we proposed low-delay and low-power approaches for video transmission over wireless personal area networks, including......Constrained resources like memory, power, bandwidth and delay requirements in many mobile systems pose limitations for video applications. Standard approaches for video compression and transmission do not always satisfy system requirements. In this thesis we have shown that it is possible to modify

  5. Expectancy-Violation and Information-Theoretic Models of Melodic Complexity

    Directory of Open Access Journals (Sweden)

    Tuomas Eerola

    2016-07-01

    Full Text Available The present study assesses two types of models for melodic complexity: one based on expectancy violations and the other one related to an information-theoretic account of redundancy in music. Seven different datasets spanning artificial sequences, folk and pop songs were used to refine and assess the models. The refinement eliminated unnecessary components from both types of models. The final analysis pitted three variants of the two model types against each other and could explain 46-74% of the variance in the ratings across the datasets. The most parsimonious models were identified with an information-theoretic criterion. This suggested that the simplified expectancy-violation models were the most efficient for these sets of data. However, the differences between all optimized models were subtle in terms both of performance and simplicity.
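
    As a toy illustration of the information-theoretic side of such models, the snippet below computes the Shannon entropy of a melody's pitch-interval distribution, one of the simplest redundancy-based complexity proxies. It is not the specific feature set optimized in the study, and the example melody is made up.

```python
import numpy as np
from collections import Counter

def interval_entropy(midi_pitches):
    """Shannon entropy (bits) of a melody's pitch-interval distribution --
    a very simple information-theoretic (redundancy-based) complexity proxy."""
    counts = np.array(list(Counter(np.diff(midi_pitches)).values()), dtype=float)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

print(interval_entropy([60, 62, 64, 65, 67, 65, 64, 62, 60]))  # made-up scale-like tune
```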

  6. Predictive performance simulations for a sustainable lecture building complex

    CSIR Research Space (South Africa)

    Conradie, Dirk CU

    2012-06-01

    Full Text Available the site and building are not ideally oriented regarding the prevailing wind directions that generally follow the coast. From the perspective of the design team, the commitment to use a building information management (BIM) system at inception needed a... far more integrated approach to design development. Engineers typically wait for the architects to design the whole building, and then only drill down to final calculated structural design configurations and sizes. With BIM, these activities should...

  7. Automatic decomposition of a complex hologram based on the virtual diffraction plane framework

    International Nuclear Information System (INIS)

    Jiao, A S M; Tsang, P W M; Lam, Y K; Poon, T-C; Liu, J-P; Lee, C-C

    2014-01-01

    Holography is a technique for capturing the hologram of a three-dimensional scene. In many applications, it is often pertinent to retain specific items of interest in the hologram, rather than retaining the full information, which may cause distraction in the analytical process that follows. For a real optical image that is captured with a camera or scanner, this process can be realized by applying image segmentation algorithms to decompose an image into its constituent entities. However, because it is different from an optical image, classic image segmentation methods cannot be applied directly to a hologram, as each pixel in the hologram carries holistic, rather than local, information of the object scene. In this paper, we propose a method to perform automatic decomposition of a complex hologram based on a recently proposed technique called the virtual diffraction plane (VDP) framework. Briefly, a complex hologram is back-propagated to a hypothetical plane known as the VDP. Next, the image on the VDP is automatically decomposed, through the use of the segmentation on the magnitude of the VDP image, into multiple sub-VDP images, each representing the diffracted waves of an isolated entity in the scene. Finally, each sub-VDP image is reverted back to a hologram. As such, a complex hologram can be decomposed into a plurality of subholograms, each representing a discrete object in the scene. We have demonstrated the successful performance of our proposed method by decomposing a complex hologram that is captured through the optical scanning holography (OSH) technique. (papers)
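
    The VDP pipeline (back-propagate, segment the magnitude, split, re-propagate) can be prototyped compactly. The sketch below uses an angular-spectrum propagator and a simple threshold-plus-connected-components segmentation as stand-ins; the propagation distance, pixel pitch, wavelength and threshold are assumed parameters, and the segmentation used in the published method is more sophisticated than this.

```python
import numpy as np
from scipy import ndimage

def angular_spectrum(field, wavelength, dx, z):
    """Propagate a square complex field over a distance z (angular-spectrum method,
    evanescent components simply suppressed)."""
    fx = np.fft.fftfreq(field.shape[0], d=dx)
    FX, FY = np.meshgrid(fx, fx)
    kz = 2 * np.pi * np.sqrt(np.maximum(0.0, 1 / wavelength**2 - FX**2 - FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def decompose_hologram(hologram, wavelength, dx, z_vdp, mag_thresh):
    """Back-propagate to the VDP, segment the magnitude, split the field into
    per-object parts, and propagate each part back to the hologram plane."""
    vdp = angular_spectrum(hologram, wavelength, dx, -z_vdp)     # to the VDP
    labels, n_obj = ndimage.label(np.abs(vdp) > mag_thresh)      # crude segmentation
    return [angular_spectrum(np.where(labels == k, vdp, 0), wavelength, dx, z_vdp)
            for k in range(1, n_obj + 1)]                        # sub-holograms
```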

  8. Cognitive performance modeling based on general systems performance theory.

    Science.gov (United States)

    Kondraske, George V

    2010-01-01

    General Systems Performance Theory (GSPT) was initially motivated by problems associated with quantifying different aspects of human performance. It has proved to be invaluable for measurement development and understanding quantitative relationships between human subsystem capacities and performance in complex tasks. It is now desired to bring focus to the application of GSPT to modeling of cognitive system performance. Previous studies involving two complex tasks (i.e., driving and performing laparoscopic surgery) and incorporating measures that are clearly related to cognitive performance (information processing speed and short-term memory capacity) were revisited. A GSPT-derived method of task analysis and performance prediction termed Nonlinear Causal Resource Analysis (NCRA) was employed to determine the demand on basic cognitive performance resources required to support different levels of complex task performance. This approach is presented as a means to determine a cognitive workload profile and the subsequent computation of a single number measure of cognitive workload (CW). Computation of CW may be a viable alternative to measuring it. Various possible "more basic" performance resources that contribute to cognitive system performance are discussed. It is concluded from this preliminary exploration that a GSPT-based approach can contribute to defining cognitive performance models that are useful for both individual subjects and specific groups (e.g., military pilots).

  9. Trauma to the nail complex

    Directory of Open Access Journals (Sweden)

    Jefferson Braga Silva

    2014-04-01

    Full Text Available OBJECTIVE: to analyze the results from surgical intervention to treat trauma of the nail complex. METHODS: we retrospectively reviewed a series of 94 consecutive patients with trauma of the nail complex who were treated between 2000 and 2009. In 42 patients, nail bed suturing was performed. In 27 patients, nail bed suturing was performed subsequent to osteosynthesis of the distal phalanx. In 15, immediate grafting was performed, and in 10, late-stage grafting of the nail bed. The growth, size and shape of the nail were evaluated in comparison with the contralateral finger. The results were obtained by summing scores and classifying them as good, fair or poor. RESULTS: the results were considered to be good particularly in the patients who underwent nail bed suturing or nail bed suturing with osteosynthesis of the distal phalanx. Patients who underwent immediate or late-stage nail grafting had poor results. CONCLUSION: trauma of the nail complex without loss of substance presented better results than did deferred treatment for reconstruction of the nail complex.

  10. The complex patient: A concept clarification.

    Science.gov (United States)

    Manning, Eli; Gagnon, Marilou

    2017-03-01

    Over the last decade, the concept of the "complex patient" has not only been more widely used in multidisciplinary healthcare teams and across various healthcare disciplines, but it has also become more vacuous in meaning. The uptake of the concept of the "complex patient" spans across disciplines, such as medicine, nursing, and social work, with no consistent definition. We review the chronological evolution of this concept and its surrogate terms, namely "comorbidity," "multimorbidity," "polypathology," "dual diagnosis," and "multiple chronic conditions." Drawing on key principles of concept clarification, we highlight disciplinary usage in the literature published between 2005 and 2015 in health sciences, attending to overlaps and revealing nuances of the complex patient concept. Finally, we discuss the implications of this concept for practice, research, and theory. © 2017 John Wiley & Sons Australia, Ltd.

  11. Salt-induced phase separation for the determination of metals as their diethyldithiocarbamate complexes by high-performance liquid chromatography

    International Nuclear Information System (INIS)

    Mueller, B.J.; Lovett, R.J.

    1987-01-01

    Reversed-phase high-performance liquid chromatography with ultraviolet detection can be used to determine trace levels of Pt(II), Pd(II), Rh(III), Co(III), Ru(III), and Ir in aqueous solution following complexation with diethyldithiocarbamate. The metal complexes are extracted into acetonitrile from aqueous solution by the addition of a saturated salt solution. Quantitative metal recovery from aqueous solution is achievable for most metals over a wide solution pH range. Detection limits for the metals are <3 ng of metal/mL of original aqueous sample. Analyses of real samples are highly reproducible and sensitive. Ir can interfere in the determination of Pt(II) and Rh(III). A general protocol for chromatographic separation and determination of Pt(II), Pd(II), Rh(III), Ru(III), and Ir in aqueous solution is presented.

  12. Complexation of buffer constituents with neutral complexation agents: part II. Practical impact in capillary zone electrophoresis.

    Science.gov (United States)

    Beneš, Martin; Riesová, Martina; Svobodová, Jana; Tesařová, Eva; Dubský, Pavel; Gaš, Bohuslav

    2013-09-17

    This article elucidates the practical impact of the complexation of buffer constituents with complexation agents on electrophoretic results, namely, complexation constant determination, system peak development, and proper separation of analytes. Several common buffers, which were selected based on the pH study in Part I of this paper series (Riesová, M.; Svobodová, J.; Tošner, Z.; Beneš, M.; Tesařová, E.; Gaš, B. Anal. Chem., 2013, DOI: 10.1021/ac4013804); e.g., CHES, MES, MOPS, Tricine were used to demonstrate behavior of such complex separation systems. We show that the value of a complexation constant determined in the interacting buffers environment depends not only on the analyte and complexation agent but it is also substantially affected by the type and concentration of buffer constituents. As a result, the complexation parameters determined in the interacting buffers cannot be regarded as thermodynamic ones and may provide misleading information about the strength of complexation of the compound of interest. We also demonstrate that the development of system peaks in interacting buffer systems significantly differs from the behavior known for noncomplexing systems, as the mobility of system peaks depends on the concentration and type of neutral complexation agent. Finally, we show that the use of interacting buffers can totally ruin the results of electrophoretic separation because the buffer properties change as the consequence of the buffer constituents' complexation. As a general conclusion, the interaction of buffer constituents with the complexation agent should always be considered in any method development procedures.

  13. Assessment of the potential consequences of a large primary to secondary leakage accident. Final report

    International Nuclear Information System (INIS)

    D'Auria, F.S.; Sartmadjiev, A.; Spalj, S.; Macek, J.; Kantee, H.; Elter, J.; Kostka, P.; Bukin, N.; Alexandrov, A.G.; Kristof, M.; Kvizda, B.; Matejovic, P.; Makihara, Y.

    2006-01-01

    The present paper discusses one of the IAEA's Coordinated Research Projects (CRPs). The CRP was started in 2003 to evaluate complex phenomena of primary to secondary leakage (PRISE) accidents for WWER-440 reactors. The first Research Coordination Meeting (RCM), held in March 2003, identified the possible consequences of PRISE accidents (radioactive release to the atmosphere, pressurized thermal shock, boron dilution, loss of integrity of secondary systems and severe accidents) and designated six task groups to evaluate these, as well as uncertainties associated with PRISE analyses. The second RCM, held in March 2004, discussed the preliminary results of each task group and addressed the main safety concerns related to PRISE phenomena as well as providing recommendations on modelling for PRISE analyses and on operator actions. The third RCM, held in March 2005, discussed the results of the work performed in 2004. The CRP was concluded in 2005. Publication of the final results of the CRP is planned as an IAEA TECDOC. The paper provides a review of the final results of the project. (author)

  14. Human performance in nondestructive inspections and functional tests: Final report

    International Nuclear Information System (INIS)

    Harris, D.H.

    1988-10-01

    Human performance plays a vital role in the inspections and tests conducted to assure the physical integrity of nuclear power plants. Even when technically-sophisticated equipment is employed, the outcome is highly dependent on human control actions, calibrations, observations, analyses, and interpretations. The principal consequences of inadequate performance are missed or falsely-reported defects. However, the cost-avoidance that stems from addressing potential risks promptly, and the increasing costs likely with aging plants, emphasize that timeliness and efficiency are important inspection-performance considerations also. Human performance issues were studied in a sample of inspections and tests regularly conducted in nuclear power plants. These tasks, selected by an industry advisory panel, were: eddy-current inspection of steam-generator tubes; ultrasonic inspection of pipe welds; inservice testing of pumps and valves; and functional testing of shock suppressors. Information was obtained for the study from industry and plant procedural documents; training materials; research reports and related documents; interviews with training specialists, inspectors, supervisory personnel, and equipment designers; and first-hand observations of task performance. Eleven recommendations are developed for improving human performance on nondestructive inspections and functional tests. Two recommendations were for the more-effective application of existing knowledge; nine recommendations were for research projects that should be undertaken to assure continuing improvements in human performance on these tasks. 25 refs., 9 figs., 1 tab

  15. The complexity of wine: clarifying the role of microorganisms.

    Science.gov (United States)

    Tempère, Sophie; Marchal, Axel; Barbe, Jean-Christophe; Bely, Marina; Masneuf-Pomarede, Isabelle; Marullo, Philippe; Albertin, Warren

    2018-05-01

    The concept of wine complexity has gained considerable interest in recent years, both for wine consumers and wine scientists. As a consequence, some research programs concentrate on the factors that could improve the perceived complexity of a wine. Notably, the possible influence of microbiological factors is particularly investigated. However, wine complexity is a multicomponent concept not easily defined. In this review, we first describe the actual knowledge regarding wine complexity, its perception, and wine chemical composition. In particular, we emphasize that, contrary to expectations, the perception of wine complexity is not related to wine chemical complexity. Then, we review the impact of wine microorganisms on wine complexity, with a specific focus on publications including sensory analyses. While microorganisms definitively can impact wine complexity, the underlying mechanisms and molecules are far from being deciphered. Finally, we discuss some prospective research fields that will help improving our understanding of wine complexity, including perceptive interactions, microbial interactions, and other challenging phenomena.

  16. Deterministic ripple-spreading model for complex networks.

    Science.gov (United States)

    Hu, Xiao-Bing; Wang, Ming; Leeson, Mark S; Hines, Evor L; Di Paolo, Ezequiel

    2011-04-01

    This paper proposes a deterministic complex network model, which is inspired by the natural ripple-spreading phenomenon. The motivations and main advantages of the model are the following: (i) The establishment of many real-world networks is a dynamic process, where it is often observed that the influence of a few local events spreads out through nodes, and then largely determines the final network topology. Obviously, this dynamic process involves many spatial and temporal factors. By simulating the natural ripple-spreading process, this paper reports a very natural way to set up a spatial and temporal model for such complex networks. (ii) Existing relevant network models are all stochastic models, i.e., with a given input, they cannot output a unique topology. Differently, the proposed ripple-spreading model can uniquely determine the final network topology, and at the same time, the stochastic feature of complex networks is captured by randomly initializing ripple-spreading related parameters. (iii) The proposed model can use an easily manageable number of ripple-spreading related parameters to precisely describe a network topology, which is more memory efficient when compared with traditional adjacency matrix or similar memory-expensive data structures. (iv) The ripple-spreading model has a very good potential for both extensions and applications.
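
    To convey the flavour of the model, here is a deliberately simplified Python sketch of a ripple-spreading network generator: activated nodes emit ripples whose amplitude decays with distance, and a link is created whenever a ripple's amplitude at an unvisited node exceeds that node's responding threshold. The decay law, the parameter values and the activation rule are loose assumptions for illustration and differ from the published formulation.

```python
import numpy as np

def ripple_spreading_network(points, init_energy, decay, thresholds, seed_nodes):
    """Toy deterministic ripple-spreading generator: a ripple's amplitude decays
    with distance as energy / (1 + decay * r); when it exceeds an inactive node's
    threshold, a link is created and that node emits its own ripple."""
    dist = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    queue, activated, edges = list(seed_nodes), set(seed_nodes), set()
    while queue:
        src = queue.pop(0)
        amplitude = init_energy / (1.0 + decay * dist[src])
        for node in np.argsort(dist[src]):          # reach nearer nodes first
            if node == src or node in activated:
                continue
            if amplitude[node] >= thresholds[node]:
                edges.add((src, int(node)))
                activated.add(int(node))
                queue.append(int(node))
    return edges

# Usage: 50 nodes on the unit square, ripples seeded at nodes 0 and 1
rng = np.random.default_rng(1)
pts = rng.random((50, 2))
edges = ripple_spreading_network(pts, init_energy=1.0, decay=5.0,
                                 thresholds=np.full(50, 0.25), seed_nodes=[0, 1])
```

    Given the same node positions, seeds and ripple parameters the output is identical, which is the deterministic property the abstract emphasises; randomness enters only through how those inputs are initialised.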

  17. Vacuum insulation - Panel properties and building applications. HiPTI - High Performance Thermal Insulation - IEA/ECBCS Annex 39 - Final report

    Energy Technology Data Exchange (ETDEWEB)

    Erb, M. (ed.)

    2005-12-15

    This paper takes a look at the properties of vacuum insulation panels (VIP) that have already been developed some time ago for use in appliances such as refrigerators and deep-freezers. Their insulation performance is a factor of five to ten times better than that of conventional insulation. The paper discusses the use of such panels in buildings to provide thin, highly-insulating constructions for walls, roofs and floors. The motivation for examining the applicability of high performance thermal insulation in buildings is discussed, including solutions where severe space limitations and other technical and aesthetic considerations exist. The use of nano-structured materials and laminated foils is examined and discussed. The questions arising from the use of such panels in buildings are discussed and the open questions and risks involved are examined. Finally, an outlook on the introduction of VIP technology is presented and quality assurance aspects are examined. This work was done within the framework of the Task 39 'High Performance Thermal Insulation' of the 'Energy Conservation in Buildings and Community Systems ECBCS' programme of the International Energy Agency IEA.

  18. Pediatric echocardiograms performed at primary centers: Diagnostic errors and missing links!

    International Nuclear Information System (INIS)

    Saraf, Rahul P; Suresh, PV; Maheshwari, Sunita; Shah, Sejal S

    2015-01-01

    The present study was undertaken to assess the accuracy of pediatric echocardiograms done at non-tertiary centers and to evaluate the relationship of inaccurate interpretations with age, echocardiogram performer and complexity of congenital heart disease (CHD). The echocardiogram reports of 182 consecutive children with CHD (5 days-16 years) who were evaluated at a non-tertiary center and subsequently referred to our center were reviewed. Age of the child at echocardiogram, echocardiogram performer and complexity of CHD were noted. These reports were compared with echocardiogram done at our center. Discrepancies were noted and categorized. To assess our own error rate, we compared our echocardiogram reports with the findings obtained during surgery (n = 172), CT scan (n = 9) or cardiac catheterization reports (n = 1). Most of the children at the non-tertiary center (92%) underwent echocardiogram by personnel other than a pediatric cardiologist. Overall, diagnostic errors were found in 69/182 (38%) children. Moderate and major discrepancies affecting the final management were found in 42/182 (23%) children. Discrepancies were higher when the echocardiogram was done by personnel other than pediatric cardiologist (P < 0.01) and with moderate and high complexity lesions (P = 0.0001). There was no significant difference in proportion of these discrepancies in children ≤ 1 year vs. >1 year of age. A significant number of pediatric echocardiograms done at non-tertiary centers had discrepancies that affected the management of these children. More discrepancies were seen when the echocardiogram performer was not a pediatric cardiologist and with complex CHD

  19. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs) as well as develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. There are three kinds of main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, to support the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  20. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors-related problems on the safety of nuclear power plants (NPPs) as well as develop a technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error, constructing a technical basis for human reliability analysis. There are three kinds of main results of this study. The first result is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human error caused by communication errors in the phase of event diagnosis. The task complexity (TACOM) measure and the methodology of optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of software, K-HRA, to support the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique and K-HRA, which can be used to analyze EOC (errors of commission) and EOO (errors of omission). These research results, such as OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  1. Fabrication of advanced Bragg gratings with complex apodization profiles by use of the polarization control method

    DEFF Research Database (Denmark)

    Deyerl, Hans-Jürgen; Plougmann, Nikolai; Jensen, Jesper Bo Damm

    2004-01-01

    The polarization control method offers a flexible, robust, and low-cost route for the parallel fabrication of gratings with complex apodization profiles including several discrete phase shifts and chirp. The performance of several test gratings is evaluated in terms of their spectral response...... and compared with theoretical predictions. Short gratings with sidelobe-suppression levels in excess of 32 dB and transmission dips lower than 80 dB have been realized. Finally, most of the devices fabricated by the polarization control method show comparable quality to gratings manufactured by far more...

  2. [Complex posttraumatic stress disorder].

    Science.gov (United States)

    Green, Tamar; Kotler, Moshe

    2007-11-01

    The characteristic symptoms resulting from exposure to an extreme trauma include three clusters of symptoms: persistent experience of the traumatic event, persistent avoidance of stimuli associated with the trauma and persistent symptoms of increased arousal. Beyond the accepted clusters of symptoms for posttraumatic stress disorder exists a constellation of symptoms related to exposure to extreme or prolonged stress, e.g. childhood abuse, physical violence, rape, and confinement within a concentration camp. As evidence of these symptoms accumulated, attempts began to classify a more complex syndrome, which included, but was not confined to, the symptoms of posttraumatic stress disorder. This review addresses several subjects for study in complex posttraumatic stress disorder, which is a complicated and controversial topic. Firstly, the concept of complex posttraumatic stress disorder is presented. Secondly, the professional literature relevant to this disturbance is reviewed, and finally the authors present the polemic being conducted between the researchers of posttraumatic disturbances regarding validity, reliability and the need for a separate diagnosis for these symptoms.

  3. Impact of automation: Measurement of performance, workload and behaviour in a complex control environment.

    Science.gov (United States)

    Balfe, Nora; Sharples, Sarah; Wilson, John R

    2015-03-01

    This paper describes an experiment that was undertaken to compare three levels of automation in rail signalling: a high level in which an automated agent set routes for trains using timetable information, a medium level in which trains were routed along pre-defined paths, and a low level where the operator (signaller) was responsible for the movement of all trains. These levels are described in terms of a Rail Automation Model based on previous automation theory (Parasuraman et al., 2000). Performance, subjective workload, and signaller activity were measured for each level of automation running under both normal operating conditions and abnormal, or disrupted, conditions. The results indicate that perceived workload, during both normal and disrupted phases of the experiment, decreased as the level of automation increased and performance was most consistent (i.e. showed the least variation between participants) with the highest level of automation. The results give a strong case in favour of automation, particularly in terms of demonstrating the potential for automation to reduce workload, but also suggest much benefit can be achieved from a mid-level of automation, potentially at lower cost and complexity. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  4. Complex Systems and Dependability

    CERN Document Server

    Zamojski, Wojciech; Sugier, Jaroslaw

    2012-01-01

    A typical contemporary complex system is a multifaceted amalgamation of technical, information, organization, software and human (users, administrators and management) resources. The complexity of such a system comes not only from its involved technical and organizational structure but mainly from the complexity of the information processes that must be implemented in the operational environment (data processing, monitoring, management, etc.). In such cases, traditional methods of reliability analysis, focused mainly on the technical level, are usually insufficient for performance evaluation, and more innovative methods

  5. Structure identification and adaptive synchronization of uncertain general complex dynamical networks

    International Nuclear Information System (INIS)

    Xu Yuhua; Zhou Wuneng; Fang Jian'an; Lu Hongqian

    2009-01-01

    This Letter proposes an approach to identify the topological structure and unknown parameters for uncertain general complex networks simultaneously. By designing effective adaptive controllers, we achieve synchronization between two complex networks. The unknown network topological structure and system parameters of uncertain general complex dynamical networks are identified simultaneously in the process of synchronization. Several useful criteria for synchronization are given. Finally, an illustrative example is presented to demonstrate the application of the theoretical results.

  6. Structure identification and adaptive synchronization of uncertain general complex dynamical networks

    Energy Technology Data Exchange (ETDEWEB)

    Xu Yuhua, E-mail: yuhuaxu2004@163.co [College of Information Science and Technology, Donghua University, Shanghai 201620 (China) and Department of Maths, Yunyang Teacher' s College, Hubei 442000 (China); Zhou Wuneng, E-mail: wnzhou@163.co [College of Information Science and Technology, Donghua University, Shanghai 201620 (China); Fang Jian' an [College of Information Science and Technology, Donghua University, Shanghai 201620 (China); Lu Hongqian [Shandong Institute of Light Industry, Shandong Jinan 250353 (China)

    2009-12-28

    This Letter proposes an approach to identify the topological structure and unknown parameters for uncertain general complex networks simultaneously. By designing effective adaptive controllers, we achieve synchronization between two complex networks. The unknown network topological structure and system parameters of uncertain general complex dynamical networks are identified simultaneously in the process of synchronization. Several useful criteria for synchronization are given. Finally, an illustrative example is presented to demonstrate the application of the theoretical results.
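
    For intuition, the following Python sketch implements one common scheme of this type: a response network driven by the states of the (unknown) drive network, with Lyapunov-style adaptive laws that update both the topology estimate and the feedback gains. The node dynamics (Lorenz), the coupling through the x-component only, the network size and all numerical parameters are assumptions for illustration; the Letter's general formulation covers a much broader class of networks.

```python
import numpy as np

rng = np.random.default_rng(2)

# Unknown drive network: N Lorenz oscillators coupled through their x-components
N = 5
C = (rng.random((N, N)) < 0.4).astype(float)     # unknown topology to be identified
np.fill_diagonal(C, 0.0)

def lorenz(S, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = S[:, 0], S[:, 1], S[:, 2]
    return np.column_stack([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

dt, steps = 1e-3, 200_000
X = rng.standard_normal((N, 3))                  # drive states
Y = rng.standard_normal((N, 3))                  # response states
C_hat = np.zeros((N, N))                         # topology estimate
k = np.ones(N)                                   # adaptive feedback gains

for _ in range(steps):
    E = Y - X
    dX = lorenz(X) + np.hstack([C @ X[:, :1], np.zeros((N, 2))])
    dY = lorenz(Y) + np.hstack([C_hat @ X[:, :1], np.zeros((N, 2))]) - k[:, None] * E
    C_hat -= dt * np.outer(E[:, 0], X[:, 0])     # Lyapunov-based adaptive laws
    k += dt * np.sum(E ** 2, axis=1)
    X += dt * dX
    Y += dt * dY

print(np.round(C_hat, 2))   # approaches C when the drive signals stay sufficiently rich
```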

  7. Evaluating the response of complex systems to environmental threats: the Σ II method

    International Nuclear Information System (INIS)

    Corynen, G.C.

    1983-05-01

    The Σ II method was developed to model and compute the probabilistic performance of systems that operate in a threatening environment. Although we emphasize the vulnerability of complex systems to earthquakes and to electromagnetic threats such as EMP (electromagnetic pulse), the method applies in general to most large-scale systems or networks that are embedded in a potentially harmful environment. Other methods exist for obtaining system vulnerability, but their complexity increases exponentially as the size of systems is increased. The complexity of the Σ II method is polynomial, and accurate solutions are now possible for problems for which current methods require the use of rough statistical bounds, confidence statements, and other approximations. For super-large problems, where the costs of precise answers may be prohibitive, a desired accuracy can be specified, and the Σ II algorithms will halt when that accuracy has been reached. We summarize the results of a theoretical complexity analysis - which is reported elsewhere - and validate the theory with computer experiments conducted both on worst-case academic problems and on more reasonable problems occurring in practice. Finally, we compare our method with the exact methods of Abraham and Nakazawa, and with current bounding methods, and we demonstrate the computational efficiency and accuracy of Σ II

  8. Real-Time and Real-Fast Performance of General-Purpose and Real-Time Operating Systems in Multithreaded Physical Simulation of Complex Mechanical Systems

    Directory of Open Access Journals (Sweden)

    Carlos Garre

    2014-01-01

    Full Text Available Physical simulation is a valuable tool in many fields of engineering for the tasks of design, prototyping, and testing. General-purpose operating systems (GPOS) are designed for real-fast tasks, such as offline simulation of complex physical models that should finish as soon as possible. Interfacing hardware at a given rate (as in a hardware-in-the-loop test) requires instead maximizing time determinism, for which real-time operating systems (RTOS) are designed. In this paper, real-fast and real-time performance of RTOS and GPOS are compared when simulating models of high complexity with large time steps. This type of application is usually present in the automotive industry and requires a good trade-off between real-fast and real-time performance. The performance of an RTOS and a GPOS is compared by running a tire model scalable in the number of degrees of freedom and parallel threads. The benchmark shows that the GPOS presents better performance in real-fast runs but worse in real-time due to nonexplicit task switches and to the latency associated with interprocess communication (IPC) and task switches.

  9. Structural systematics of some metal complexes with 4,5 ...

    Indian Academy of Sciences (India)

    study reveals that each metal(II) centre in the four complexes adopts distorted octahedral geometry with MN6 ... The final reaction product is soluble in water, methanol, acetonitrile, etc.

  10. 76 FR 39015 - Contractor Performance Information

    Science.gov (United States)

    2011-07-05

    ...] Contractor Performance Information AGENCY: Environmental Protection Agency (EPA), ACTION: Direct final rule... contractor performance information. EPA is issuing a final rule because the changes are procedural in nature... Institutes of Health's Contractor Performance System (CPS) to the Department of Defense's Contractor...

  11. Final Exam Weighting as Part of Course Design

    Directory of Open Access Journals (Sweden)

    Matthew Franke

    2018-03-01

    Full Text Available The weighting of a final exam or a final assignment is an essential part of course design that is rarely discussed in pedagogical literature. Depending on the weighting, a final exam or assignment may provide unequal benefits to students depending on their prior performance in the class. Consequently, uncritical grade weighting can discount student learning, by ensuring that improved mastery of material at the semester’s end is not reflected in the course grade. Problems related to several common final exam weights are explored, as are potential solutions to unequal student outcomes made possible by uncritical grade weighting. Ultimately, this essay argues that choosing a weight for a final exam or a final assignment determines what types of student success ought to be possible in the class; therefore, instructors should assign exam weights intentionally, being fully aware of the potential benefits and problems of the weights that they choose.

  12. UCSD Performance in the Edge Plasma Simulation (EPSI) Project. Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Tynan, George Robert [Univ. of California, San Diego, CA (United States). Center for Energy Research

    2017-12-12

    This report describes the activities of UC San Diego PI G.R. Tynan and his collaborators as part of the EPSI Project, which was led by Dr. C.S. Chang from PPPL. As part of our work, we carried out several experiments on the ALCATOR C-MOD tokamak device, aimed at unraveling the “trigger” or cause of the spontaneous transition from low-mode confinement (L-mode) to high confinement (H-mode) that is universally observed in tokamak devices and is planned for use in ITER.

  13. Information geometric methods for complexity

    Science.gov (United States)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.
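
    Two of the IG ingredients mentioned here, the Kullback-Leibler divergence and the Fisher metric, are easy to connect numerically: to second order, the KL divergence between nearby members of a parametric family equals one half of the squared line element given by the Fisher metric. The short Python check below does this for the univariate Gaussian family; it is only a didactic illustration, not material from the review.

```python
import numpy as np

def kl_gaussian(mu1, s1, mu2, s2):
    """KL divergence D(N(mu1, s1^2) || N(mu2, s2^2)) for univariate Gaussians."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def fisher_metric_gaussian(mu, s):
    """Fisher information metric of the (mu, sigma) Gaussian manifold."""
    return np.array([[1.0 / s**2, 0.0],
                     [0.0, 2.0 / s**2]])

# Locally, D(p_theta || p_{theta + d_theta}) ≈ 0.5 * d_theta^T g(theta) d_theta
mu, s, eps = 0.0, 1.0, 1e-3
numeric_g_mumu = 2 * kl_gaussian(mu, s, mu + eps, s) / eps**2
print(numeric_g_mumu, fisher_metric_gaussian(mu, s)[0, 0])   # both ≈ 1/s^2 = 1.0
```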

  14. Characterization of aspartame-cyclodextrin complexation.

    Science.gov (United States)

    Sohajda, Tamás; Béni, Szabolcs; Varga, Erzsébet; Iványi, Róbert; Rácz, Akos; Szente, Lajos; Noszál, Béla

    2009-12-05

    The inclusion complex formation of aspartame (guest) and various cyclodextrins (host) was examined using 1H NMR titration and capillary electrophoresis. Initially the protonation constants of aspartame were determined by NMR-pH titration with in situ pH measurement to yield log K1=7.83 and log K2=2.96. Based on these values the stability of the complexes formed by aspartame and 21 different cyclodextrins (CDs) was studied at pH 2.5, pH 5.2 and pH 9.0, values where aspartame exists predominantly in monocationic, zwitterionic and monoanionic form, respectively. The host cyclodextrin derivatives differed in various sidechains, degree of substitution, charge and purity so that the effect of these properties could be examined systematically. Concerning size, the seven-membered beta-cyclodextrin and its derivatives have been found to be the most suitable host molecules for complexation. Highest stability was observed for the acetylated derivative with a degree of substitution of 7. The purity of the CD enhanced the complexation while the degree of substitution did not have an obvious effect. Finally, geometric aspects of the inclusion complex were assessed by 2D ROESY NMR and molecular modelling, which proved that the guest's aromatic ring enters the wider end of the host cavity.
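
    Stability constants of this kind are typically extracted by fitting a 1:1 binding isotherm to the titration curve. The sketch below fits the exact 1:1 host-guest model to an NMR-style chemical-shift titration with SciPy; the concentrations, shifts and starting guesses are invented placeholders, not the aspartame-cyclodextrin data.

```python
import numpy as np
from scipy.optimize import curve_fit

def bound_fraction(H0, G0, K):
    """Fraction of guest bound for 1:1 host-guest complexation (exact solution)."""
    b = H0 + G0 + 1.0 / K
    HG = 0.5 * (b - np.sqrt(b**2 - 4 * H0 * G0))
    return HG / G0

def delta_obs(H0, K, d_free, d_complex, G0=1e-3):
    """Observed shift for a fast-exchange 1:1 complex at fixed guest concentration G0."""
    return d_free + (d_complex - d_free) * bound_fraction(H0, G0, K)

# Hypothetical titration: total CD concentration (M) vs. observed shift (ppm)
H0_tot = np.array([0.0, 0.5e-3, 1e-3, 2e-3, 4e-3, 8e-3, 16e-3])
d_meas = np.array([7.300, 7.310, 7.319, 7.331, 7.346, 7.360, 7.369])

popt, _ = curve_fit(delta_obs, H0_tot, d_meas, p0=[300.0, 7.30, 7.38])
print(f"K ≈ {popt[0]:.0f} M^-1")
```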

  15. Complex-wide review of DOE's Low-Level Waste Management ES ampersand H vulnerabilities. Volume I. Final report

    International Nuclear Information System (INIS)

    1996-05-01

    The Department of Energy (DOE) conducted a comprehensive complex-wide review of its management of low-level waste (LLW) and the radioactive component of mixed low-level waste (MLLW). This review was conducted in response to a recommendation from the Defense Nuclear Facilities Safety Board (DNFSB) which was established and authorized by Congress to oversee DOE. The DNFSB's recommendation concerning conformance with safety standards at DOE LLW sites was issued on September 8, 1994 and is referred to as Recommendation 94-2. DOE's Implementation Plan for its response to Recommendation 94-2 was submitted to the DNFSB on March 31, 1995. The DNFSB recommended that a complex-wide review of DOE's LLW management be initiated. The goal of the complex-wide review of DOE's LLW management system was to identify both programmatic and physical vulnerabilities that could lead to unnecessary radiation exposure of workers or the public or unnecessary releases of radioactive materials to the environment. Additionally, the DNFSB stated that an objective of the complex-wide review should be to establish the dimensions of the DOE LLW problem and support the identification of corrective actions to address safe disposition of past, present, and future volumes of LLW. The complex-wide review involved an evaluation of LLW management activities at 38 DOE facilities at 36 sites that actively manage LLW and MLLW

  16. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    International Nuclear Information System (INIS)

    Park, Jinkyun; Jeong, Kwangsup; Jung, Wondea

    2005-01-01

    In complex systems such as a nuclear and chemical plant, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites to secure their safety. As a previous study in providing understandable procedures, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is raised. To this end, comparisons between operators' performance data, measured in the form of step performance times, and their behavior in carrying out the prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that the change of operators' performance data can be more authentically explained if the additional complexity factors are taken into consideration.

  17. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jinkyun [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)]. E-mail: kshpjk@kaeri.re.kr; Jeong, Kwangsup [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of); Jung, Wondea [Integrated Safety Assessment Division, Korea Atomic Energy Research Institute, P.O. Box 105, Duckjin-Dong, Yusong-Ku, Taejon 305-600 (Korea, Republic of)

    2005-08-01

    In complex systems such as a nuclear and chemical plant, it is well known that the provision of understandable procedures that allow operators to clarify what needs to be done and how to do it is one of the requisites to secure their safety. As a previous study in providing understandable procedures, the step complexity (SC) measure that can quantify the complexity of procedural steps in emergency operating procedures (EOPs) of a nuclear power plant (NPP) was suggested. However, the necessity of additional complexity factors that can consider a cognitive aspect in evaluating the complexity of procedural steps is raised. To this end, comparisons between operators' performance data, measured in the form of step performance times, and their behavior in carrying out the prescribed activities of procedural steps are conducted in this study. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of the additional complexity factors, it is strongly believed that the change of operators' performance data can be more authentically explained if the additional complexity factors are taken into consideration.

  18. Identifying cognitive complexity factors affecting the complexity of procedural steps in emergency operating procedures of a nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Jinkyun Park; Kwangsup Jeong; Wondea Jung [Korea Atomic Energy Research Institute, Taejon (Korea). Integrated Safety Assessment Division

    2005-08-15

    In complex systems such as nuclear and chemical plants, it is well known that the provision of understandable procedures, which allow operators to clarify what needs to be done and how to do it, is one of the requisites for securing safety. In a previous study on providing understandable procedures, the step complexity (SC) measure, which can quantify the complexity of procedural steps in the emergency operating procedures (EOPs) of a nuclear power plant (NPP), was suggested. However, additional complexity factors that capture the cognitive aspect of evaluating the complexity of procedural steps appear to be necessary. To this end, this study compares operators' performance data, measured in the form of step performance times, with their behavior in carrying out the prescribed activities of procedural steps. As a result, two kinds of complexity factors (the abstraction level of knowledge and the level of engineering decision) that could affect an operator's cognitive burden are identified. Although a well-designed experiment is indispensable for confirming the appropriateness of these additional complexity factors, it is strongly believed that changes in operators' performance data can be explained more faithfully if the additional complexity factors are taken into consideration. (author)

  19. Study of complex modes

    International Nuclear Information System (INIS)

    Pastrnak, J.W.

    1986-01-01

    This eighteen-month study has been successful in providing the designer and analyst with qualitative guidelines on the occurrence of complex modes in the dynamics of linear structures, and also in developing computer codes for determining quantitatively which vibration modes are complex and to what degree. The presence of complex modes in a test structure has been verified. Finite element analysis of a structure with non-proportional damping has been performed. A partial differential equation has been formed to eliminate possible modeling errors.

  20. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    International Nuclear Information System (INIS)

    Cuerva, A.

    1997-01-01

    There are several groups working on the improvement of the existing Standard and recommendation on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 Ref. [9], to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain and, specifically, to the issue of site calibration and the uncertainties due to it. (Author)

  1. Analysis of the existing Standard on Power performance measurement and its application in complex terrain

    Energy Technology Data Exchange (ETDEWEB)

    Cuerva, A.

    1997-10-01

    There are several groups working on the improvement of the existing Standard and recommendation on WECS power performance measurement and analysis. One of them, besides the one working in this project, is the MEASNET expert group, which is trying to adapt the main reference, IEC 1400-12 Ref. [9], to the current requirements on technical quality and trueness. Within this group and the MEASNET one, many deficiencies have been detected in the procedure followed up to now. Several of them concern general aspects of the method (calculations, assumptions, etc.), but the most critical ones relate to the inherent characteristics of complex terrain and, specifically, to the issue of site calibration and the uncertainties due to it. (Author)

  2. Sleep and Final Exam Performance in Introductory Physics

    Science.gov (United States)

    Coletta, Vincent; Wikholm, Colin; Pascoe, Daniel

    2018-01-01

    Most physics instructors believe that adequate sleep is important in order for students to perform well on problem solving, and many instructors advise students to get plenty of sleep the night before an exam. After years of giving such advice to students at Loyola Marymount University (LMU), one of us decided to find out how many hours students…

  3. A simple approach to enhance the performance of complex-coefficient filter-based PLL in grid-connected applications

    DEFF Research Database (Denmark)

    Ramezani, Malek; Golestan, Saeed; Li, Shuhui

    2018-01-01

    In recent years, a large number of three-phase phase-locked loops (PLLs) have been developed. One of the most popular ones is the complex-coefficient-filter-based PLL (CCF-PLL). CCFs benefit from a sequence-selective filtering ability and, hence, enable the CCF-PLL to selectively reject/extract ... disturbances before the PLL control loop while maintaining an acceptable dynamic behavior. The aim of this paper is to present a simple yet effective approach to enhance the standard CCF-PLL performance without requiring any additional computational load.

  4. Coal-fired high performance power generating system. Final report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-08-31

    As a result of the investigations carried out during Phase 1 of the Engineering Development of Coal-Fired High-Performance Power Generation Systems (Combustion 2000), the UTRC-led Combustion 2000 Team is recommending the development of an advanced high performance power generation system (HIPPS) whose high efficiency and minimal pollutant emissions will enable the US to use its abundant coal resources to satisfy current and future demand for electric power. The high efficiency of the power plant, which is the key to minimizing the environmental impact of coal, can only be achieved using a modern gas turbine system. Minimization of emissions can be achieved by combustor design, and advanced air pollution control devices. The commercial plant design described herein is a combined cycle using either a frame-type gas turbine or an intercooled aeroderivative with clean air as the working fluid. The air is heated by a coal-fired high temperature advanced furnace (HITAF). The best performance from the cycle is achieved by using a modern aeroderivative gas turbine, such as the intercooled FT4000. A simplified schematic is shown. In the UTRC HIPPS, the conversion efficiency for the heavy frame gas turbine version will be 47.4% (HHV) compared to the approximately 35% that is achieved in conventional coal-fired plants. This cycle is based on a gas turbine operating at turbine inlet temperatures approaching 2,500 F. Using an aeroderivative type gas turbine, efficiencies of over 49% could be realized in advanced cycle configuration (Humid Air Turbine, or HAT). Performance of these power plants is given in a table.
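
    As a quick illustration of what the quoted HHV efficiencies imply, the short sketch below converts them into approximate net plant heat rates, assuming the standard 3412 Btu/kWh conversion; the numbers are illustrative only and are not taken from the report.

```python
# Quick arithmetic (illustrative only): converting the quoted HHV efficiencies
# into approximate net plant heat rates, using 3412 Btu per kWh of electricity.
BTU_PER_KWH = 3412
for label, eta in [("conventional coal plant", 0.35),
                   ("HIPPS, heavy-frame gas turbine", 0.474),
                   ("HIPPS, aeroderivative/HAT cycle", 0.49)]:
    print(f"{label}: heat rate ~ {BTU_PER_KWH / eta:,.0f} Btu/kWh (HHV)")
```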

  5. Complex concentrate pretreatment

    International Nuclear Information System (INIS)

    Lokken, R.O.; Scheele, R.D.; Strachan, D.M.; Toste, A.P.

    1991-03-01

    After removal of the transuranics (TRU) by the TRUEX process, complex concentrate waste will be grouted for final storage. The purpose of this project, conducted at the Pacific Northwest Laboratory, is to support a future decision to grout the complexant waste without destruction of the organic contents. It has been demonstrated that grouts with acceptable parameters for the Transportable Grout Facility can be made using actual waste. The acceptability of these grouts from a regulatory view seems to be less of a problem than previously. None of the organics found in the waste have been found on the EPA hazardous chemicals list. Two potential problems with the processing of the complex concentrate wastes were identified during the use of the TRUEX process on samples of several milliliters. One was the amount of foam that is generated during acid addition to the alkaline waste. Some of this foam appears to be of a waxy nature but does redissolve when the waste is strongly acid. The second potential problem is that noticeable amounts of NOx gases are generated. No quantitative measure of the NOx gas generation was made. The problem relates to processing the waste in B-plant where there are no facilities to handle NOx gases. 5 refs., 4 figs., 4 tabs.

  6. Final summarizing report on Grant DE-SC0001014 "Separation of Highly Complex Mixtures by Two-dimension Liquid Chromatography"

    Energy Technology Data Exchange (ETDEWEB)

    Guiochon, Georges [Univ. of Tennessee, Knoxville, TN (United States)

    2013-09-16

    The goal of our research was a fundamental investigation of methods available for the coupling of two separate chromatographic separations that would considerably enhance the individual separation power of each of these two separations. This gain arises from the combination of two independent retention mechanisms, one of them separating the components that coelute on the other column, making possible the separation of many more compounds in a given time. The two separation mechanisms used must be very different. This is possible because many retention mechanisms are available, using different kinds of molecular interactions, hydrophobic or hydrophilic interactions, polar interactions, hydrogen bonding, complex formation, ionic interactions, steric exclusion. Two methods can be used, allowing separations to be performed in space (spreading the bands of sample components on a plate covered with stationary phase layer) or in time (eluting the sample components through a column and detecting the bands leaving the column). Both offer a wide variety of possible combinations and were studied.

  7. Foundations for Improvements to Passive Detection Systems - Final Report

    International Nuclear Information System (INIS)

    Labov, S E; Pleasance, L; Sokkappa, P; Craig, W; Chapline, G; Frank, M; Gronberg, J; Jernigan, J G; Johnson, S; Kammeraad, J; Lange, D; Meyer, A; Nelson, K; Pohl, B; Wright, D; Wurtz, R

    2004-01-01

    This project explores the scientific foundation and approach for improving passive detection systems for plutonium and highly enriched uranium in real applications. Sources of gamma-ray radiation of interest were chosen to represent a range of national security threats, naturally occurring radioactive materials, industrial and medical radiation sources, and natural background radiation. The gamma-ray flux emerging from these sources, which include unclassified criticality experiment configurations as surrogates for nuclear weapons, was modeled in detail. The performance of several types of gamma-ray imaging systems using Compton scattering was modeled and compared. A mechanism was created to model the combined sources and background emissions and have the simulated radiation "scene" impinge on a model of a detector. These modeling tools are now being used in various projects to optimize detector performance and model detector sensitivity in complex measuring environments. This study also developed several automated algorithms for isotope identification from gamma-ray spectra and compared these to each other and to algorithms already in use. Verification testing indicates that these alternative isotope identification algorithms produced fewer false positive and false negative results than the "GADRAS" algorithms currently in use. In addition to these algorithms that used binned spectra, a new approach to isotope identification using "event mode" analysis was developed. Finally, a technique using muons to detect nuclear material was explored.

  8. A finite element framework for multiscale/multiphysics analysis of structures with complex microstructures

    Science.gov (United States)

    Varghese, Julian

    This research work has contributed in various ways to help develop a better understanding of textile composites and materials with complex microstructures in general. An instrumental part of this work was the development of an object-oriented framework that made it convenient to perform multiscale/multiphysics analyses of advanced materials with complex microstructures such as textile composites. In addition to the studies conducted in this work, this framework lays the groundwork for continued research of these materials. This framework enabled a detailed multiscale stress analysis of a woven DCB specimen that revealed the effect of the complex microstructure on the stress and strain energy release rate distribution along the crack front. In addition to implementing an oxidation model, the framework was also used to implement strategies that expedited the simulation of oxidation in textile composites so that it would take only a few hours. The simulation showed that the tow architecture played a significant role in the oxidation behavior in textile composites. Finally, a coupled diffusion/oxidation and damage progression analysis was implemented that was used to study the mechanical behavior of textile composites under mechanical loading as well as oxidation. A parametric study was performed to determine the effect of material properties and the number of plies in the laminate on its mechanical behavior. The analyses indicated a significant effect of the tow architecture and other parameters on the damage progression in the laminates.

  9. Optimization-based topology identification of complex networks

    International Nuclear Information System (INIS)

    Tang Sheng-Xue; Chen Li; He Yi-Gang

    2011-01-01

    In many cases, the topological structures of a complex network are unknown or uncertain, and it is of significance to identify the exact topological structure. An optimization-based method of identifying the topological structure of a complex network is proposed in this paper. Identification of the exact network topological structure is converted into a minimization problem by using the estimated network. Then, an improved quantum-behaved particle swarm optimization algorithm is used to solve the optimization problem. Compared with the previous adaptive synchronization-based method, the proposed method is simple and effective and is particularly well suited to identifying the topological structure of synchronization complex networks. In some cases where the states of a complex network are only partially observable, the exact topological structure of a network can also be identified by using the proposed method. Finally, numerical simulations are provided to show the effectiveness of the proposed method. (general)
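
    A minimal sketch of the general idea of casting topology identification as a minimization problem is given below. It assumes linear diffusive coupling and fully observed states and derivatives, and it swaps the paper's improved quantum-behaved particle swarm optimizer for an ordinary least-squares fit, so it illustrates the problem formulation rather than the authors' algorithm.

```python
import numpy as np

# Sketch only: linear diffusive coupling, fully observed states/derivatives,
# and a least-squares fit in place of the quantum-behaved PSO used in the paper.
rng = np.random.default_rng(0)
n, T, c = 6, 400, 0.5                          # nodes, samples, coupling strength
A_true = (rng.random((n, n)) < 0.3).astype(float)
np.fill_diagonal(A_true, 0.0)

def f(x):                                      # local node dynamics (assumed known)
    return -x + np.tanh(x)

X = rng.normal(size=(T, n))                    # observed node states
# Observed derivatives under x_i' = f(x_i) + c * sum_j a_ij (x_j - x_i)
dX = f(X) + c * (X @ A_true.T - X * A_true.sum(axis=1))

# Topology identification as a minimisation problem: fit each row a_i* by
# least squares on the residual dx_i - f(x_i).
A_est = np.zeros_like(A_true)
for i in range(n):
    R = dX[:, i] - f(X[:, i])                  # part of dx_i explained by coupling
    Phi = c * (X - X[:, [i]])                  # regressors (x_j - x_i)
    coef, *_ = np.linalg.lstsq(Phi, R, rcond=None)
    A_est[i] = coef
A_est[np.abs(A_est) < 0.5] = 0.0               # threshold back to a 0/1 topology
print("recovered exactly:", np.array_equal(A_est.astype(bool), A_true.astype(bool)))
```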

  10. TSA waste stream and final waste form composition

    International Nuclear Information System (INIS)

    Grandy, J.D.; Eddy, T.L.; Anderson, G.L.

    1993-01-01

    A final vitrified waste form composition, based upon the chemical compositions of the input waste streams, is recommended for the transuranic-contaminated waste stored at the Transuranic Storage Area of the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. The quantities of waste are large with a considerable uncertainty in the distribution of various waste materials. It is therefore impractical to mix the input waste streams into an "average" transuranic-contaminated waste. As a result, waste stream input to a melter could vary widely in composition, with the potential of affecting the composition and properties of the final waste form. This work examines the extent of the variation in the input waste streams, as well as the final waste form under conditions of adding different amounts of soil. Five prominent Rocky Flats Plant 740 waste streams are considered, as well as nonspecial metals and the "average" transuranic-contaminated waste streams. The metals waste stream is the most extreme variation and results indicate that if an average of approximately 60 wt% of the mixture is soil, the final waste form will be predominantly silica, alumina, alkaline earth oxides, and iron oxide. This composition will have consistent properties in the final waste form, including high leach resistance, irrespective of the variation in waste stream. For other waste streams, much less or no soil could be required to yield a leach resistant waste form but with varying properties.

  11. Moessbauer study of iron-sugar complexes

    International Nuclear Information System (INIS)

    Tonkovic, M.; Music, S.; Hadzija, O.; Nagy-Czako, I.; Vertes, A.

    1982-01-01

    Ferric-fructose complex has been prepared using FeCl3 and Fe(NO3)3 solutions. Molecular weight determination and Moessbauer spectroscopic measurements proved that the ferric-fructose complex is polymeric in solid state and also in aqueous solution. The synthesis of a new iron-sorbose complex has been performed. Its Moessbauer spectra indicate a structure similar to that of the iron-fructose complex. (author)

  12. Adaptative synchronization in multi-output fractional-order complex dynamical networks and secure communications

    Science.gov (United States)

    Mata-Machuca, Juan L.; Aguilar-López, Ricardo

    2018-01-01

    This work deals with the adaptative synchronization of complex dynamical networks with fractional-order nodes and its application in secure communications employing chaotic parameter modulation. The complex network is composed of multiple fractional-order systems with mismatch parameters and the coupling functions are given to realize the network synchronization. We introduce a fractional algebraic synchronizability condition (FASC) and a fractional algebraic identifiability condition (FAIC) which are used to know if the synchronization and parameters estimation problems can be solved. To overcome these problems, an adaptative synchronization methodology is designed; the strategy consists in proposing multiple receiver systems which tend to follow asymptotically the uncertain transmitters systems. The coupling functions and parameters of the receiver systems are adjusted continually according to a convenient sigmoid-like adaptative controller (SLAC), until the measurable output errors converge to zero, hence, synchronization between transmitter and receivers is achieved and message signals are recovered. Indeed, the stability analysis of the synchronization error is based on the fractional Lyapunov direct method. Finally, numerical results corroborate the satisfactory performance of the proposed scheme by means of the synchronization of a complex network consisting of several fractional-order unified chaotic systems.

  13. Use of activity-based probes to develop high throughput screening assays that can be performed in complex cell extracts.

    Directory of Open Access Journals (Sweden)

    Edgar Deu

    2010-08-01

    Full Text Available High throughput screening (HTS) is one of the primary tools used to identify novel enzyme inhibitors. However, its applicability is generally restricted to targets that can either be expressed recombinantly or purified in large quantities. Here, we describe a method to use activity-based probes (ABPs) to identify substrates that are sufficiently selective to allow HTS in complex biological samples. Because ABPs label their target enzymes through the formation of a permanent covalent bond, we can correlate labeling of target enzymes in a complex mixture with inhibition of turnover of a substrate in that same mixture. Thus, substrate specificity can be determined and substrates with sufficiently high selectivity for HTS can be identified. In this study, we demonstrate this method by using an ABP for dipeptidyl aminopeptidases to identify (Pro-Arg)2-Rhodamine as a specific substrate for DPAP1 in Plasmodium falciparum lysates and Cathepsin C in rat liver extracts. We then used this substrate to develop highly sensitive HTS assays (Z' > 0.8) that are suitable for use in screening large collections of small molecules (i.e., >300,000) for inhibitors of these proteases. Finally, we demonstrate that it is possible to use broad-spectrum ABPs to identify target-specific substrates. We believe that this approach will have value for many enzymatic systems where access to large amounts of active enzyme is problematic.
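
    The assay-quality criterion quoted above (Z' > 0.8) refers to the standard Z'-factor screening-window statistic, Z' = 1 - 3(sd_pos + sd_neg)/|mean_pos - mean_neg|; a minimal sketch with synthetic control values is shown below.

```python
import numpy as np

# Minimal sketch (synthetic values) of the Z'-factor screening-window statistic:
# Z' = 1 - 3*(sd_pos + sd_neg) / |mean_pos - mean_neg|
rng = np.random.default_rng(0)
positive_controls = rng.normal(1000.0, 40.0, 48)   # e.g. uninhibited substrate turnover
negative_controls = rng.normal(100.0, 15.0, 48)    # e.g. fully inhibited / no enzyme

z_prime = 1 - 3 * (positive_controls.std(ddof=1) + negative_controls.std(ddof=1)) \
              / abs(positive_controls.mean() - negative_controls.mean())
print(f"Z' = {z_prime:.2f}")   # values above ~0.5 are conventionally taken as excellent
```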

  14. Teacher Transformation from Complex Thinking

    Directory of Open Access Journals (Sweden)

    Johana Carolina Peña Lozada

    2018-02-01

    Full Text Available The present article is a qualitative, documentary investigation within the phenomenological-interpretative paradigm; it seeks to analyze the transformation of the teacher from the perspective of complex thinking, centered on the teacher-student benefit, through a bibliographic-documentary triangulation of the authors Edgar Morin and Matthew Lipman on complex thinking and the knowledge necessary for education, David Ausubel, Jean Piaget and Lev Vygotsky on educational psychology, Pérez Esclarín on the humanization of education, and finally contributions from Honore Bernard and UNESCO in the field of teacher transformation. The crisis and the evolution of education in Latin America call for an educational reform in which innovation, creativity, training, vocation and love of teaching practice are contemplated, looking towards a profile adjusted to the current reality of learners, continuously taking up the challenge of breaking the barriers that obstruct the goal pursued in the multidimensional, professional, spiritual and human spheres, immersed in the complexity of the teacher's work, and interacting with all the interior and exterior elements of their humanity that are exposed in the challenge of complex thinking.

  15. Environmental management at Nuclear Fuel Complex

    International Nuclear Information System (INIS)

    Choudhary, S.; Kalidas, R.

    2005-01-01

    The Nuclear Fuel Complex (NFC), a unit of the Department of Atomic Energy (DAE), manufactures and supplies fuel assemblies and structurals for atomic power reactors, seamless stainless steel/special alloy tubes, and high-purity/special materials for various industries including atomic energy, space and electronics. NFC is spread over an area of about 200 acres. It consists of various chemical, metallurgical, fabrication and assembly plants engaged in processing uranium from concentrate to final fuel assembly, processing zirconium from ore to metallic products, and processing various special high-purity materials from ore or intermediate level to the final product. The plants were commissioned in the early seventies, and their capacities have been periodically enhanced to cater to the growing demands of the Indian nuclear industry. In the two streams of plants processing uranium and zirconium, various types and categories of waste, including low-level radioactive wastes, are generated; these require proper handling and disposal. The overall management of radioactive and other waste aims at minimizing generation and release to the environment. In this presentation, the environmental management methodologies as practiced at the Nuclear Fuel Complex are discussed. (author)

  16. Metric for Calculation of System Complexity based on its Connections

    Directory of Open Access Journals (Sweden)

    João Ricardo Braga de Paiva

    2017-02-01

    Full Text Available This paper proposes a methodology based on system connections to calculate its complexity. Two study cases are proposed: the dining Chinese philosophers' problem and the distribution center. Both studies are modeled using the theory of Discrete Event Systems and simulations in different contexts were performed in order to measure their complexities. The obtained results present (i) the static complexity as a limiting factor for the dynamic complexity, (ii) the lowest cost in terms of complexity for each unit of measure of the system performance and (iii) the output sensitivity to the input parameters. The associated complexity and performance measures aggregate knowledge about the system.

  17. Team dynamics in complex projects

    NARCIS (Netherlands)

    Oeij, P.; Vroome, E.E.M. de; Dhondt, S.; Gaspersz, J.B.R.

    2012-01-01

    The complexity of projects is hotly debated and is a factor which affects the innovativeness of team performance. Much attention in the past was paid to technical complexity, and many issues are related to the natural and physical sciences. A growing awareness of the importance of socio-organisational issues is

  18. A new approach to performance assessment of barriers in a repository. Executive summary, draft, technical appendices. Final report

    International Nuclear Information System (INIS)

    Mueller-Hoeppe, N.; Krone, J.; Niehues, N.; Poehler, M.; Raitz von Frentz, R.; Gauglitz, R.

    1999-06-01

    Multi-barrier systems are accepted as the basic approach for long term environmental safe isolation of radioactive waste in geological repositories. Assessing the performance of natural and engineered barriers is one of the major difficulties in producing evidence of environmental safety for any radioactive waste disposal facility, due to the enormous complexity of scenarios and uncertainties to be considered. This report outlines a new methodological approach originally developed primarily for a repository in salt, but which can be transferred with minor modifications to any other host rock formation. The approach is based on the integration of the following elements: (1) implementation of a simple method and efficient criteria to assess and prove the tightness of geological and engineered barriers; (2) use of the method of Partial Safety Factors in order to assess barrier performance at a certain reasonable level of confidence; (3) integration of a diverse geochemical barrier in the near field of waste emplacement, systematically limiting the radiological consequences of any radionuclide release in safety investigations; and (4) a risk-based approach for the assessment of radionuclide releases. Indicative calculations performed with extremely conservative assumptions make it possible to exclude any radiological health consequences from an HLW repository in salt to a reference person with a safety level of 99.9999% per year. (orig.)

  19. Thoughts toward a clinical database of architecture: evidence, complexity, and impact

    Directory of Open Access Journals (Sweden)

    Leonard R. Bachman

    2012-10-01

    Full Text Available This paper examines how architecture is building a clinical database similar to that of law and medicine and is developing this database for the purposes of acquiring complex design insight. This emerging clinical branch of architectural knowledge exceeds the scope of everyday experience of physical form and can thus be shown to enable a more satisfying scale of design thinking. It is argued that significant transformational kinds of professional transparency and accountability are thus intensifying. The tactics and methods of this paper are to connect previously disparate historical and contemporary events that mark the evolution of this database and then to fold those events into an explanatory narrative concerning clinical design practice. Beginning with architecture’s use of precedent (Collins 1971), the formulation of design as complex problems (Rittel and Webber 1973), high performance buildings to meet the crisis of climate change, social mandates of postindustrial society (Bell 1973), and other roots of evidence, the paper then elaborates the themes in which this database is evolving. Such themes include post-occupancy evaluation (Bordass and Leaman 2005), continuous commissioning, performance simulation, digital instrumentation, automation, and other modes of data collection in buildings. Finally, the paper concludes with some anticipated impacts that such a clinical database might have on design practice and how their benefits can be achieved through new interdisciplinary relations between academia and practice.

  20. Kinetics of the reactions of hydrated electrons with metal complexes

    International Nuclear Information System (INIS)

    Korsse, J.

    1983-01-01

    The reactivity of the hydrated electron towards metal complexes is considered. Experiments are described involving metal EDTA and similar complexes. The metal ions studied are mainly Ni2+, Co2+ and Cu2+. Rates of the reactions of the complexes with e-(aq) were measured using the pulse radiolysis technique. It is shown that the reactions of e-(aq) with the copper complexes display unusually small kinetic salt effects. The results suggest long-range electron transfer by tunneling. A tunneling model is presented and the experimental results are discussed in terms of this model. Results of approximate molecular orbital calculations of some redox potentials are given, for EDTA chelates as well as for series of hexacyano and hexaquo complexes. Finally, equilibrium constants for the formation of ternary complexes are reported. (Auth./G.J.P.)

  1. Markov counting and reward processes for analysing the performance of a complex system subject to random inspections

    International Nuclear Information System (INIS)

    Ruiz-Castro, Juan Eloy

    2016-01-01

    In this paper, a discrete complex reliability system subject to internal failures and external shocks is modelled algorithmically. Two types of internal failure are considered: repairable and non-repairable. When a repairable failure occurs, the unit goes to corrective repair. In addition, the unit is subject to external shocks that may produce an aggravation of the internal degradation level, cumulative damage or extreme failure. When a damage threshold is reached, the unit must be removed. When a non-repairable failure occurs, the device is replaced by a new, identical one. The internal performance and the external damage are partitioned into performance levels. Random inspections are carried out. When an inspection takes place, the internal performance of the system and the damage caused by external shocks are observed and if necessary the unit is sent to preventive maintenance. If the inspection observes a minor state for the internal performance and/or external damage, then these states remain in memory when the unit goes to corrective or preventive maintenance. Transient and stationary analyses are performed. Markov counting and reward processes are developed in computational form to analyse the performance and profitability of the system with and without preventive maintenance. These aspects are implemented computationally with Matlab. - Highlights: • A multi-state device is modelled in an algorithmic and computational form. • The performance is partitioned into multiple states and degradation levels. • Several types of failures with repair times according to degradation levels. • Preventive maintenance as a response to random inspections is introduced. • Performance and profitability are analysed through Markov counting and reward processes.
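
    A minimal sketch of a discrete-time Markov reward computation of the kind referred to above is given below; it uses a generic three-state transition matrix and reward vector of our own choosing, not the multi-state shock/inspection model of the paper.

```python
import numpy as np

# Generic discrete-time Markov reward process (not the paper's model):
# stationary distribution and long-run expected reward per step.
P = np.array([[0.90, 0.08, 0.02],      # transition matrix: OK, degraded, down
              [0.00, 0.85, 0.15],
              [0.60, 0.00, 0.40]])     # repair returns the unit to OK
r = np.array([10.0, 4.0, -20.0])       # reward (profit) earned in each state

# Stationary distribution: solve pi P = pi together with sum(pi) = 1
A = np.vstack([P.T - np.eye(3), np.ones(3)])
b = np.array([0.0, 0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("stationary distribution:", np.round(pi, 4))
print("long-run reward per step:", float(pi @ r))
```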

  2. Iron complexes of pharmaceutical interest: Antianemics

    Directory of Open Access Journals (Sweden)

    Cakić Milorad D.

    2003-01-01

    Full Text Available Preparations based on different compounds of bi- and trivalent iron are curently used for the prevention and therapy of sideropenic anemia in human and veterinary medicine. The application of preparations based on dextran started about 1950. Up to now, synthesis and production of preparations were performed with the purpose of improving pharmacological performance by using dextran oligosaccharides with different weight - average molar masses and their oxidized and hydrogenated derivatives. Synthesis of polynuclear iron(lll complexs with other oligosaccharides (inulin and pullulan and their derivatives was developed, with potential or valid pharmacological activity for sideropenic anemia treatment.A Review of iron(lll complexes with different oligosaccharides, their physico-chemical characterization pharmaco-biological performance, global structure, further research and possible applications of then complexes, are presented in this paper.

  3. Performance of four Dactylorhiza species over a complex trophic gradient

    NARCIS (Netherlands)

    Dijk, E; Grootjans, AB

    Spontaneous distribution and survival in experimental plots of four marsh orchids (Dactylorhiza spp.) in a hay-meadow complex were related to the mineral composition of the groundwater, soil nutrient availability and the species composition of the vegetation. Differences in Ca2+ contents of the groundwater

  4. Driver’s Cognitive Workload and Driving Performance under Traffic Sign Information Exposure in Complex Environments: A Case Study of the Highways in China

    Directory of Open Access Journals (Sweden)

    Nengchao Lyu

    2017-02-01

    Full Text Available Complex traffic situations and high driving workload are the leading contributing factors to traffic crashes. There is a strong correlation between driving performance and driving workload, such as visual workload from traffic signs on highway off-ramps. This study aimed to evaluate traffic safety by analyzing drivers’ behavior and performance under the cognitive workload in complex environment areas. First, the driving workload of drivers was tested based on traffic signs with different quantities of information. Forty-four drivers were recruited to conduct a traffic sign cognition experiment under static controlled environment conditions. Different complex traffic signs were used for applying the cognitive workload. The static experiment results reveal that workload is highly related to the amount of information on traffic signs and reaction time increases with the information grade, while driving experience and gender effect are not significant. This shows that the cognitive workload of subsequent driving experiments can be controlled by the amount of information on traffic signs. Second, driving characteristics and driving performance were analyzed under different secondary task driving workload levels using a driving simulator. Drivers were required to drive at the required speed on a designed highway off-ramp scene. The cognitive workload was controlled by reading traffic signs with different information, which were divided into four levels. Drivers had to make choices by pushing buttons after reading traffic signs. Meanwhile, the driving performance information was recorded. Questionnaires on objective workload were collected right after each driving task. The results show that speed maintenance and lane deviations are significantly different under different levels of cognitive workload, and the effects of driving experience and gender groups are significant. The research results can be used to analyze traffic safety in highway

  5. FY2014 FES (Fusion Energy Sciences) Theory & Simulation Performance Target, Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Fu, Guoyong [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Budny, Robert [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Gorelenkov, Nikolai [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Poli, Francesca [Princeton Plasma Physics Lab. (PPPL), Princeton, NJ (United States); Chen, Yang [Univ. of Colorado, Boulder, CO (United States); McClenaghan, Joseph [Univ. of California, Irvine, CA (United States); Lin, Zhihong [Univ. of California, Irvine, CA (United States); Spong, Don [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bass, Eric [Univ. of California, San Diego, CA (United States); Waltz, Ron [General Atomics, San Diego, CA (United States)

    2014-10-14

    We report here the work done for the FY14 OFES Theory Performance Target as given below: "Understanding alpha particle confinement in ITER, the world's first burning plasma experiment, is a key priority for the fusion program. In FY 2014, determine linear instability trends and thresholds of energetic particle-driven shear Alfven eigenmodes in ITER for a range of parameters and profiles using a set of complementary simulation models (gyrokinetic, hybrid, and gyrofluid). Carry out initial nonlinear simulations to assess the effects of the unstable modes on energetic particle transport". In the past year (FY14), a systematic study of the alpha-driven Alfven modes in ITER has been carried out jointly by researchers from six institutions involving seven codes including the transport simulation code TRANSP (R. Budny and F. Poli, PPPL), three gyrokinetic codes: GEM (Y. Chen, Univ. of Colorado), GTC (J. McClenaghan, Z. Lin, UCI), and GYRO (E. Bass, R. Waltz, UCSD/GA), the hybrid code M3D-K (G.Y. Fu, PPPL), the gyro-fluid code TAEFL (D. Spong, ORNL), and the linear kinetic stability code NOVA-K (N. Gorelenkov, PPPL). A range of ITER parameters and profiles are specified by TRANSP simulation of a hybrid scenario case and a steady-state scenario case. Based on the specified ITER equilibria linear stability calculations are done to determine the stability boundary of alpha-driven high-n TAEs using the five initial value codes (GEM, GTC, GYRO, M3D-K, and TAEFL) and the kinetic stability code (NOVA-K). Both the effects of alpha particles and beam ions have been considered. Finally, the effects of the unstable modes on energetic particle transport have been explored using GEM and M3D-K.

  6. Annual Performance Assessment of Complex Fenestration Systems in Sunny Climates Using Advanced Computer Simulations

    Directory of Open Access Journals (Sweden)

    Chantal Basurto

    2015-12-01

    Full Text Available Complex Fenestration Systems (CFS) are advanced daylighting systems that are placed on the upper part of a window to improve the indoor daylight distribution within rooms. Due to their double function of daylight redirection and solar protection, they are considered a solution to mitigate the unfavorable effects of admitting direct sunlight into buildings located in prevailingly sunny climates (risk of glare and overheating). Accordingly, an adequate assessment of their performance should include an annual evaluation of the main aspects relevant to the use of daylight in such regions: the indoor illuminance distribution, thermal comfort, and visual comfort of the occupants. Such an evaluation is possible with the use of computer simulations combined with the bi-directional scattering distribution function (BSDF) data of these systems. This study explores the use of available methods to assess the visible and thermal annual performance of five different CFS using advanced computer simulations. To obtain results, on-site daylight monitoring was carried out in a building located in a predominantly sunny location, and the collected data were used to create and calibrate a virtual model used to carry out the simulations. The results can be employed to select the CFS which best improves the visual and thermal interior environment for the occupants.

  7. Topology identification of the complex networks with non-delayed and delayed coupling

    International Nuclear Information System (INIS)

    Guo Wanli; Chen Shihua; Sun Wen

    2009-01-01

    In practical situations, much of the information about a complex network, such as its topological structure, is uncertain or unknown, so topology identification is an important issue in the research on complex networks. Based on LaSalle's invariance principle, an adaptive control method is proposed in this Letter to identify the topology of a weighted general complex network model with non-delayed and delayed coupling. Finally, simulation results show that the method is effective.

  8. Social dimension and complexity differentially influence brain responses during feedback processing.

    Science.gov (United States)

    Pfabigan, Daniela M; Gittenberger, Marianne; Lamm, Claus

    2017-10-30

    Recent research emphasizes the importance of social factors during performance monitoring. Thus, the current study investigated the impact of social stimuli -such as communicative gestures- on feedback processing. Moreover, it addressed a shortcoming of previous studies, which failed to consider stimulus complexity as potential confounding factor. Twenty-four volunteers performed a time estimation task while their electroencephalogram was recorded. Either social complex, social non-complex, non-social complex, or non-social non-complex stimuli were used to provide performance feedback. No effects of social dimension or complexity were found for task performance. In contrast, Feedback-Related Negativity (FRN) and P300 amplitudes were sensitive to both factors, with larger FRN and P300 amplitudes after social compared to non-social stimuli, and larger FRN amplitudes after complex positive than non-complex positive stimuli. P2 amplitudes were solely sensitive to feedback valence and social dimension. Subjectively, social complex stimuli were rated as more motivating than non-social complex ones. Independently of each other, social dimension and visual complexity influenced amplitude variation during performance monitoring. Social stimuli seem to be perceived as more salient, which is corroborated by P2, FRN and P300 results, as well as by subjective ratings. This could be explained due to their given relevance during every day social interactions.

  9. Complex manifolds in relativity

    International Nuclear Information System (INIS)

    Flaherty, E.J. Jr.

    1975-01-01

    Complex manifold theory is applied to the study of certain problems in general relativity. The first half of the work is devoted to the mathematical theory of complex manifolds. Then a brief review of general relativity is given. It is shown that any spacetime admits locally an almost Hermitian structure, suitably modified to be compatible with the indefinite metric of spacetime. This structure is integrable if and only if the spacetime admits two geodesic and shearfree null congruences, thus in particular if the spacetime is type D vacuum or electrified. The structure is "half-integrable" in a suitable sense if and only if the spacetime admits one geodesic and shearfree null congruence, thus in particular for all algebraically special vacuum spacetimes. Conditions for the modified Hermitian spacetime to be Kahlerian are presented. The most general metric for such a modified Kahlerian spacetime is found. It is shown that the type D vacuum and electrified spacetimes are conformally related to modified Kahlerian spacetimes by a generally complex conformal factor. These latter are shown to possess a very rich structure, including the existence of Killing tensors and Killing vectors. A new "explanation" of Newman's complex coordinate transformations is given. It is felt to be superior to previous "explanations" on several counts. For example, a physical interpretation in terms of a symmetry group is given. The existence of new complex coordinate transformations is established: it is shown that any type D vacuum spacetime is obtainable from either Schwarzschild spacetime or "C" spacetime by a complex coordinate transformation. Finally, some related topics are discussed and areas for future work are outlined. (Diss. Abstr. Int., B)

  10. Reducing the Complexity of Genetic Fuzzy Classifiers in Highly-Dimensional Classification Problems

    Directory of Open Access Journals (Sweden)

    DimitrisG. Stavrakoudis

    2012-04-01

    Full Text Available This paper introduces the Fast Iterative Rule-based Linguistic Classifier (FaIRLiC), a Genetic Fuzzy Rule-Based Classification System (GFRBCS) which aims at reducing the structural complexity of the resulting rule base, as well as its learning algorithm's computational requirements, especially when dealing with high-dimensional feature spaces. The proposed methodology follows the principles of the iterative rule learning (IRL) approach, whereby a rule extraction algorithm (REA) is invoked in an iterative fashion, producing one fuzzy rule at a time. The REA is performed in two successive steps: the first one selects the relevant features of the currently extracted rule, whereas the second one decides the antecedent part of the fuzzy rule, using the previously selected subset of features. The performance of the classifier is finally optimized through a genetic tuning post-processing stage. Comparative results in a hyperspectral remote sensing classification as well as in 12 real-world classification datasets indicate the effectiveness of the proposed methodology in generating high-performing and compact fuzzy rule-based classifiers, even for very high-dimensional feature spaces.

  11. Analysis of the low-level waste radionuclide inventory for the Radioactive Waste Management Complex performance assessment

    Energy Technology Data Exchange (ETDEWEB)

    Plansky, L.E.; Hoiland, S.A.

    1992-02-01

    This report summarizes the results of a study to improve the estimates of the radionuclides in the low-level radioactive waste (LLW) inventory which is buried in the Idaho National Engineering Laboratory (INEL) Radioactive Waste Management Complex (RWMC) Subsurface Disposal Area (SDA). The work is done to support the RWMC draft performance assessment (PA). Improved radionuclide inventory estimates are provided for the INEL LLW generators. Engineering, environmental assessment or other research areas may find use for the information in this report. It may also serve as a LLW inventory baseline for data quality assurance. The individual INEL LLW generators, their history and their activities are also described in detail.

  12. Analysis of the low-level waste radionuclide inventory for the Radioactive Waste Management Complex performance assessment

    International Nuclear Information System (INIS)

    Plansky, L.E.; Hoiland, S.A.

    1992-02-01

    This report summarizes the results of a study to improve the estimates of the radionuclides in the low-level radioactive waste (LLW) inventory which is buried in the Idaho National Engineering Laboratory (INEL) Radioactive Waste Management Complex (RWMC) Subsurface Disposal Area (SDA). The work is done to support the RWMC draft performance assessment (PA). Improved radionuclide inventory estimates are provided for the INEL LLW generators. Engineering, environmental assessment or other research areas may find use for the information in this report. It may also serve as a LLW inventory baseline for data quality assurance. The individual INEL LLW generators, their history and their activities are also described in detail

  13. Noether charge, black hole volume, and complexity

    Energy Technology Data Exchange (ETDEWEB)

    Couch, Josiah; Fischler, Willy; Nguyen, Phuc H. [Theory Group, Department of Physics and Texas Cosmology Center,University of Texas at Austin, 2515 Speedway, C1600, Austin, TX 78712-1192 (United States)

    2017-03-23

    In this paper, we study the physical significance of the thermodynamic volumes of AdS black holes using the Noether charge formalism of Iyer and Wald. After applying this formalism to study the extended thermodynamics of a few examples, we discuss how the extended thermodynamics interacts with the recent complexity = action proposal of Brown et al. (CA-duality). We, in particular, discover that their proposal for the late time rate of change of complexity has a nice decomposition in terms of thermodynamic quantities reminiscent of the Smarr relation. This decomposition strongly suggests a geometric, and via CA-duality holographic, interpretation for the thermodynamic volume of an AdS black hole. We go on to discuss the role of thermodynamics in complexity = action for a number of black hole solutions, and then point out the possibility of an alternate proposal, which we dub “complexity = volume 2.0”. In this alternate proposal the complexity would be thought of as the spacetime volume of the Wheeler-DeWitt patch. Finally, we provide evidence that, in certain cases, our proposal for complexity is consistent with the Lloyd bound whereas CA-duality is not.

  14. Different Epidemic Models on Complex Networks

    International Nuclear Information System (INIS)

    Zhang Haifeng; Small, Michael; Fu Xinchu

    2009-01-01

    Models for disease spreading are not limited to SIS or SIR. For instance, for the spreading of AIDS/HIV, susceptible individuals can be classified into different cases according to their immunity, and similarly, infected individuals can be sorted into different classes according to their infectivity. Moreover, some diseases may develop through several stages. Many authors have shown that the relations among individuals can be viewed as a complex network. So in this paper, in order to better explain the dynamical behavior of epidemics, we consider different epidemic models on complex networks and obtain the epidemic threshold for each case. Finally, we present numerical simulations for each case to verify our results.
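
    As an illustration of the kind of epidemic threshold referred to above, the sketch below computes two textbook SIS threshold estimates on a synthetic network: the heterogeneous mean-field value <k>/<k^2> and the quenched mean-field value 1/lambda_max. These are standard formulas, not the specific thresholds derived in the paper.

```python
import networkx as nx
import numpy as np

# Standard SIS epidemic threshold estimates on a synthetic scale-free network
# (textbook formulas, not the paper's model-specific results).
G = nx.barabasi_albert_graph(n=1000, m=3, seed=1)
k = np.array([d for _, d in G.degree()], dtype=float)

hmf_threshold = k.mean() / (k ** 2).mean()           # heterogeneous mean field: <k>/<k^2>

# Quenched mean-field alternative: inverse of the adjacency matrix's largest eigenvalue
lam_max = max(np.linalg.eigvalsh(nx.to_numpy_array(G)))
qmf_threshold = 1.0 / lam_max

print(f"HMF threshold  <k>/<k^2>:   {hmf_threshold:.4f}")
print(f"QMF threshold  1/lambda_max: {qmf_threshold:.4f}")
```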

  15. Two-phase flow in a saliniferous final repository using the example of ERAM. Final report; Zweiphasenfluss in einem salinaren Endlager am Beispiel des ERAM. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Kock, Ingo; Frieling, Gerd; Navarro, Martin

    2016-10-15

    Within the research project ZIESEL, GRS enhanced the state of science and technology for performing and evaluating long-term safety cases for the final disposal of radioactive waste. The overarching aim was an improved understanding of two-phase flow processes in a complex final repository system. Considering two-phase processes in the modeling of final repository systems introduces processes and effects that significantly affect the transport behavior of fluids and radionuclides. Two-phase processes include not only capillary pressures and relative permeabilities but also a basic competition between the phases for pore volume for storage and transport, and density-driven vertical separation of the phases. In particular, seals have been shown to be essential for the system behavior because of their gas-pressure-dependent control function. The system behavior is also influenced by the model geometry.

  16. Describing joint air defence within operations other than war context as a complex system

    CSIR Research Space (South Africa)

    Oosthuizen, R

    2009-10-01

    Full Text Available This paper will firstly investigate the theory of complexity and identify its main characteristics. This will be applied to Systems Engineering and modelling techniques, to propose a method of implementation in the real world. The application of complexity in warfare is discussed to form a foundation for the discussion of JAD. Finally, the sources of complexity in JAD are identified and an approach to address these is proposed.

  17. Complexity measures of music

    Science.gov (United States)

    Pease, April; Mahmoodi, Korosh; West, Bruce J.

    2018-03-01

    We present a technique to search for the presence of crucial events in music, based on the analysis of the music volume. Earlier work on this issue was based on the assumption that crucial events correspond to the change of music notes, with the interesting result that the complexity index of the crucial events is μ ≈ 2, which is the same inverse power-law index of the dynamics of the brain. The search technique analyzes music volume and confirms the results of the earlier work, thereby contributing to the explanation as to why the brain is sensitive to music, through the phenomenon of complexity matching. Complexity matching has recently been interpreted as the transfer of multifractality from one complex network to another. For this reason we also examine the multifractality of music, with the observation that the multifractal spectrum of a computer performance is significantly narrower than the multifractal spectrum of a human performance of the same musical score. We conjecture that although crucial events are demonstrably important for information transmission, they alone are not sufficient to define musicality, which is more adequately measured by the multifractality spectrum.

  18. The BRST complex of homological Poisson reduction

    Science.gov (United States)

    Müller-Lennert, Martin

    2017-02-01

    BRST complexes are differential graded Poisson algebras. They are associated with a coisotropic ideal J of a Poisson algebra P and provide a description of the Poisson algebra (P/J)^J as their cohomology in degree zero. Using the notion of stable equivalence introduced in Felder and Kazhdan (Contemporary Mathematics 610, Perspectives in representation theory, 2014), we prove that any two BRST complexes associated with the same coisotropic ideal are quasi-isomorphic in the case P = R[V] where V is a finite-dimensional symplectic vector space and the bracket on P is induced by the symplectic structure on V. As a corollary, the cohomology of the BRST complexes is canonically associated with the coisotropic ideal J in the symplectic case. We do not require any regularity assumptions on the constraints generating the ideal J. We finally quantize the BRST complex rigorously in the presence of infinitely many ghost variables and discuss the uniqueness of the quantization procedure.

  19. Characterizing time series via complexity-entropy curves

    Science.gov (United States)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q-complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
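
    A minimal sketch of the ordinary (Shannon-based) complexity-entropy plane built from Bandt-Pompe ordinal patterns is given below; the paper's actual contribution, the parametric Tsallis-q generalization, is not reproduced, and the two test signals (white noise and the chaotic logistic map) are our own choice.

```python
import math
from itertools import permutations
from collections import Counter
import numpy as np

# Shannon complexity-entropy plane from Bandt-Pompe ordinal patterns
# (the q-parametric curve of the paper is not implemented here).
def ordinal_probs(x, d=4):
    """Probabilities of ordinal patterns of embedding dimension d."""
    patterns = [tuple(int(v) for v in np.argsort(x[i:i + d]))
                for i in range(len(x) - d + 1)]
    counts = Counter(patterns)
    return np.array([counts.get(p, 0) for p in permutations(range(d))]) / len(patterns)

def shannon(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def complexity_entropy(x, d=4):
    p = ordinal_probs(x, d)
    n = len(p)
    u = np.full(n, 1.0 / n)                      # uniform reference distribution
    H = shannon(p) / math.log(n)                 # normalised permutation entropy
    js = shannon((p + u) / 2) - shannon(p) / 2 - shannon(u) / 2
    delta = np.zeros(n); delta[0] = 1.0          # JS divergence is maximal at a delta
    js_max = shannon((delta + u) / 2) - shannon(u) / 2
    return H, (js / js_max) * H                  # (entropy, statistical complexity)

rng = np.random.default_rng(0)
noise = rng.normal(size=20000)
logistic = np.empty(20000); logistic[0] = 0.4
for i in range(1, len(logistic)):                # chaotic logistic map, r = 4
    logistic[i] = 4.0 * logistic[i - 1] * (1.0 - logistic[i - 1])

print("white noise  (H, C):", np.round(complexity_entropy(noise), 3))
print("logistic map (H, C):", np.round(complexity_entropy(logistic), 3))
```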

  20. CEO performance appraisal: review and recommendations.

    Science.gov (United States)

    Newman, J F; Tyler, L; Dunbar, D M

    2001-01-01

    CEO performance appraisal (PA) is very valuable to an organization, but the chances of obtaining a PA of high quality decrease as executive responsibility increases. The realities of CEO PA are that it: (1) is inevitable; (2) is creative and complex; (3) involves politics; and (4) has a significant effect on the organization and the executive. PA is conducted for legal and social requirements, to enhance communication, to provide opportunities for improvement, and to relate performance to compensation. This article discusses several problems with chief executive officer (CEO) PA and the contemporary approaches that seek to improve it. Three fundamental areas for evaluation are identified: (1) organizational success; (2) areawide health status; and (3) professional role fulfillment. These provide an outline for successful healthcare PA. In addition to a discussion of the strategic considerations behind a successful CEO PA system, several recommendations are offered for the implementation of the annual evaluation process. The final goal of CEO PA is to link its results to CEO incentive compensation. It is strongly recommended that some portion of the CEO's salary directly hinge on his performance in two critical areas: organizational effectiveness and community health status.

  1. Novel algorithm by low complexity filter on retinal vessel segmentation

    Science.gov (United States)

    Rostampour, Samad

    2011-10-01

    This article presents a new method to detect blood vessels in the retina from digital images. Retinal vessel segmentation is important for detecting side effects of diabetic disease, because diabetes can form new capillaries which are very brittle. The research has been done in two phases: preprocessing and processing. The preprocessing phase consists of applying a new filter that produces a suitable output: it shows vessels in dark color on a white background and creates a clear contrast between vessels and background. The complexity is very low and extraneous image content is eliminated. The second phase is processing, and the method used is Bayesian, a supervised classification method. This method uses the mean and variance of pixel intensities to calculate probabilities. Finally, the pixels of the image are divided into two classes: vessels and background. The images used come from the DRIVE database. After performing this operation, the calculation gives an average efficiency of 95 percent. The method was also applied to a sample from outside the DRIVE database which has retinopathy, and a good result was obtained.
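
    A minimal sketch of the kind of two-class Bayesian pixel classifier described above is given below, using hypothetical intensity data: each class (vessel/background) is summarized by the mean and variance of its pixel intensities, and a pixel is assigned to the class with the larger posterior.

```python
import numpy as np

# Two-class Gaussian (Bayesian) pixel classifier on hypothetical intensity data;
# each class is described only by the mean and variance of its pixel intensities.
def gaussian_log_pdf(x, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def fit(vessel_intensities, background_intensities):
    params = {}
    for label, sample in (("vessel", vessel_intensities),
                          ("background", background_intensities)):
        params[label] = (sample.mean(), sample.var(), len(sample))
    total = sum(n for _, _, n in params.values())
    return {k: (m, v, n / total) for k, (m, v, n) in params.items()}  # add priors

def classify(pixels, params):
    scores = {label: gaussian_log_pdf(pixels, m, v) + np.log(prior)
              for label, (m, v, prior) in params.items()}
    return np.where(scores["vessel"] > scores["background"], 1, 0)

# Toy intensities: vessels darker than background after the preprocessing filter
rng = np.random.default_rng(0)
vessel_train = rng.normal(60, 15, 500)
background_train = rng.normal(180, 25, 5000)
model = fit(vessel_train, background_train)
test_pixels = np.array([55.0, 90.0, 170.0, 200.0])
print(classify(test_pixels, model))   # 1 = vessel, 0 = background
```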

  2. A Low-Complexity ESPRIT-Based DOA Estimation Method for Co-Prime Linear Arrays.

    Science.gov (United States)

    Sun, Fenggang; Gao, Bin; Chen, Lizhen; Lan, Peng

    2016-08-25

    The problem of direction-of-arrival (DOA) estimation is investigated for a co-prime array, which consists of two uniform sparse linear subarrays with extended inter-element spacing. For each sparse subarray, the true DOAs are mapped into several equivalent angles impinging on a traditional uniform linear array with half-wavelength spacing. Then, by applying the estimation of signal parameters via rotational invariance technique (ESPRIT), the equivalent DOAs are estimated, and the candidate DOAs are recovered according to the relationship between equivalent and true DOAs. Finally, the true DOAs are estimated by combining the results of the two subarrays. The proposed method achieves a better complexity-performance tradeoff than other existing methods.
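
    For reference, the sketch below implements standard ESPRIT on a half-wavelength uniform linear array, the building block onto which the record maps each sparse subarray. The co-prime mapping and ambiguity-resolution steps of the proposed method are not reproduced, and the simulation parameters are illustrative.

```python
import numpy as np

def esprit_doa(X, n_sources):
    """Standard ESPRIT for a uniform linear array with half-wavelength spacing.
    X: (n_sensors, n_snapshots) complex snapshot matrix. Returns DOAs in degrees."""
    R = X @ X.conj().T / X.shape[1]                            # sample covariance
    eigvals, eigvecs = np.linalg.eigh(R)
    Es = eigvecs[:, np.argsort(eigvals)[::-1][:n_sources]]     # signal subspace
    Psi = np.linalg.pinv(Es[:-1]) @ Es[1:]                     # rotational invariance
    phases = np.angle(np.linalg.eigvals(Psi))
    return np.degrees(np.arcsin(phases / np.pi))               # d = lambda/2 => phase = pi*sin(theta)

# Illustrative simulation: two narrowband sources at -20 and 35 degrees, 8 sensors, 500 snapshots.
rng = np.random.default_rng(1)
n_sensors, n_snap, doas = 8, 500, np.radians([-20.0, 35.0])
A = np.exp(1j * np.pi * np.outer(np.arange(n_sensors), np.sin(doas)))   # steering matrix
S = (rng.standard_normal((2, n_snap)) + 1j * rng.standard_normal((2, n_snap))) / np.sqrt(2)
noise = 0.1 * (rng.standard_normal((n_sensors, n_snap)) + 1j * rng.standard_normal((n_sensors, n_snap)))
print(np.sort(esprit_doa(A @ S + noise, 2)))                   # expected to be close to [-20, 35]
```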

  3. Conversation, coupling and complexity

    DEFF Research Database (Denmark)

    Fusaroli, Riccardo; Abney, Drew; Bahrami, Bahador

    We investigate the linguistic co-construction of interpersonal synergies. By applying a measure of coupling between complex systems to an experimentally elicited corpus of joint decision dialogues, we show that interlocutors' linguistic behavior displays an increasing signature of multi-scale coupling, known as complexity matching, over the course of interaction. Furthermore, we show that stronger coupling corresponds with more effective interaction, as measured by collective task performance.

  4. Technical Efficiency and Organ Transplant Performance: A Mixed-Method Approach

    Science.gov (United States)

    de-Pablos-Heredero, Carmen; Fernández-Renedo, Carlos; Medina-Merodio, Jose-Amelio

    2015-01-01

    Mixed-methods research is useful for understanding complex processes. Organ transplants are complex processes in need of improved final performance in times of budgetary restrictions. The main objective of this article is to use a mixed-method approach to quantify the technical efficiency and excellence achieved in organ transplant systems and to assess the influence of organizational structures and internal processes on the observed technical efficiency. The results show that it is possible to implement mechanisms for measuring the different components by making use of quantitative and qualitative methodologies. The analysis shows a positive relationship between the levels of the Baldrige indicators and the observed technical efficiency in the donation and transplant units of the 11 analyzed hospitals. It is therefore possible to conclude that high levels on the Baldrige indexes are a necessary condition for reaching an increased level of the service offered. PMID:25950653

  5. Final Report for ALCC Allocation: Predictive Simulation of Complex Flow in Wind Farms

    Energy Technology Data Exchange (ETDEWEB)

    Barone, Matthew F. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ananthan, Shreyas [National Renewable Energy Lab. (NREL), Golden, CO (United States); Churchfield, Matt [National Renewable Energy Lab. (NREL), Golden, CO (United States); Domino, Stefan P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Henry de Frahan, Marc [National Renewable Energy Lab. (NREL), Golden, CO (United States); Knaus, Robert C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melvin, Jeremy [Univ. of Texas, Austin, TX (United States); Moser, Robert [Univ. of Texas, Austin, TX (United States); Sprague, Michael [National Renewable Energy Lab. (NREL), Golden, CO (United States); Thomas, Stephen [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-08-01

    This report documents work performed using ALCC computing resources granted under a proposal submitted in February 2016, with the resource allocation spanning July 2016 through June 2017. The award allocation was 10.7 million processor-hours at the National Energy Research Scientific Computing Center. The simulations performed were in support of two projects: the Atmosphere to Electrons (A2e) project, supported by the DOE EERE office; and the Exascale Computing Project (ECP), supported by the DOE Office of Science. The project team for both efforts consists of staff scientists and postdocs from Sandia National Laboratories and the National Renewable Energy Laboratory. At the heart of these projects is the open-source computational-fluid-dynamics (CFD) code, Nalu. Nalu solves the low-Mach-number Navier-Stokes equations using an unstructured-grid discretization. Nalu leverages the open-source Trilinos solver library and the Sierra Toolkit (STK) for parallelization and I/O. This report documents baseline computational performance of the Nalu code on problems of direct relevance to the wind plant physics application - namely, Large Eddy Simulation (LES) of an atmospheric boundary layer (ABL) flow and wall-modeled LES of a flow past a static wind turbine rotor blade. Parallel performance of Nalu and its constituent solver routines residing in the Trilinos library has been assessed previously under various campaigns. However, both Nalu and Trilinos have been, and remain, in active development, and resources have not been available previously to rigorously track code performance over time. With the initiation of the ECP, it is important to establish and document baseline code performance on the problems of interest. This will allow the project team to identify and target any deficiencies in performance, as well as highlight any performance bottlenecks as we exercise the code on a greater variety of platforms and at larger scales. The current study is

  6. An Ab Initio MP2 Study of HCN-HX Hydrogen Bonded Complexes

    Directory of Open Access Journals (Sweden)

    Araújo Regiane C.M.U.

    1998-01-01

    An ab initio MP2/6-311++G** study has been performed to obtain geometries, binding energies and vibrational properties of HCN-HX H-bonded complexes with X = F, Cl, NC, CN and CCH. These MP2/6-311++G** results have revealed that: (i) the calculated H-bond lengths are in very good agreement with the experimental ones; (ii) the H-bond strength is associated with the intermolecular charge transfer and follows the order HCN-HNC ~ HCN-HF > HCN-HCl ~ HCN-HCN > HCN-HCCH; (iii) the BSSE correction introduces an average reduction of 2.4 kJ/mol in the MP2/6-311++G** binding energies, i.e. 11% of the uncorrected binding energy; (iv) the calculated zero-point energies reduce the stability of these complexes and show good agreement with the available experimental values; (v) the H-X stretching frequency is shifted downward upon H-bond formation, and this displacement is associated with the H-bond length; (vi) the most pronounced effect on the infrared intensities occurs for the H-X stretching intensity, which is much enhanced after complexation due to the charge-flux term; (vii) the calculated intermolecular stretching frequencies are in very good agreement with the experimental ones; and, finally, (viii) the results obtained for the HCN-HX complexes follow the same profile as those found for the acetylene-HX series, but in the latter case the effects of complexation on the properties of the free molecules are less pronounced than those in HCN-HX.
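
    For readers unfamiliar with this kind of calculation, the sketch below shows how an MP2/6-311++G** binding energy for the HCN-HF complex could be estimated with the open-source PySCF package. This is not the methodology of the cited work: the linear geometry is a rough, unoptimized guess used purely for illustration, and the counterpoise (BSSE) and zero-point corrections discussed in the record are omitted.

```python
from pyscf import gto, mp, scf

BASIS = "6-311++g**"

def mp2_energy(atom_spec):
    """Total MP2 energy (hartree) for a given geometry specification."""
    mol = gto.M(atom=atom_spec, basis=BASIS, verbose=0)
    mf = scf.RHF(mol).run()
    return mp.MP2(mf).run().e_tot

# Rough, unoptimized linear geometry of the HCN...HF hydrogen-bonded complex (angstrom).
complex_geom = """
H 0.000 0.000 -1.065
C 0.000 0.000  0.000
N 0.000 0.000  1.156
H 0.000 0.000  2.990
F 0.000 0.000  3.910
"""
hcn_geom = "H 0 0 -1.065; C 0 0 0.0; N 0 0 1.156"
hf_geom = "H 0 0 0.0; F 0 0 0.92"

# Supermolecular binding energy without BSSE or zero-point corrections.
e_bind = mp2_energy(complex_geom) - mp2_energy(hcn_geom) - mp2_energy(hf_geom)
print(f"MP2 binding energy (no BSSE correction): {e_bind * 2625.5:.1f} kJ/mol")
```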

  7. Final Report: A Broad Research Project on the Sciences of Complexity, September 15, 1994 - November 15, 1999

    Energy Technology Data Exchange (ETDEWEB)

    None

    2000-02-01

    DOE support for a broad research program in the sciences of complexity permitted the Santa Fe Institute to initiate new collaborative research within its integrative core activities as well as to host visitors to participate in research on specific topics that serve as motivation and testing ground for the study of the general principles of complex systems. Results are presented on computational biology, biodiversity and ecosystem research, and advanced computing and simulation.

  8. Recent advances in improving performances of the lightweight complex hydrides Li-Mg-N-H system

    Directory of Open Access Journals (Sweden)

    Bao Zhang

    2017-02-01

    A brief review of state-of-the-art advances in improving the performance of the lightweight complex hydride Li-Mg-N-H system is reported. Among hydrogen storage materials, Li-Mg-N-H combination systems are regarded as among the most promising candidates for vehicular applications owing to their high hydrogen storage capacity (>5 wt% H) and appropriate thermodynamic properties of hydrogen absorption and desorption. In the Li-Mg-N-H systems, tremendous efforts have been devoted to improving the hydrogen storage properties by adjusting the composition, revealing the reaction mechanisms, adding catalysts, refining the microstructures, etc. In these studies, different mechanisms, such as the coordinated two-molecule or multimolecule reaction mechanism and the ammonia-mediated mechanism, have been proposed and applied under certain conditions. Catalysis and nanosizing are very effective in enhancing the kinetic properties and thermodynamic destabilization of Li-Mg-N-H systems. Owing to nano effects, space confinement and nanoconfinement seem to be even more effective for improving the hydrogen storage performance, and it is of great significance to develop hydrogen storage materials by studying nanoconfinement effects on the Li-Mg-N-H systems.

  9. Clean Coal Technology III: 10 MW Demonstration of Gas Suspension Absorption final project performance and economics report

    Energy Technology Data Exchange (ETDEWEB)

    Hsu, F.E.

    1995-08-01

    The 10 MW Demonstration of the Gas Suspension Absorption (GSA) program is a government and industry co-funded technology development. The objective of the project is to demonstrate the performance of the GSA system in treating a 10 MW slipstream of flue gas resulting from the combustion of a high sulfur coal. The project involves design, fabrication, construction and testing of the GSA system. The Project Performance and Economics Report provides the nonproprietary information for the "10 MW Demonstration of the Gas Suspension Absorption (GSA) Project" installed at the Tennessee Valley Authority's (TVA) Shawnee Power Station, Center for Emissions Research (CER) at Paducah, Kentucky. The program demonstrated that the GSA flue-gas-desulfurization (FGD) technology is capable of achieving high SO₂ removal efficiencies (greater than 90%) while maintaining particulate emissions below the New Source Performance Standards (NSPS), without any negative environmental impact (section 6). A 28-day test demonstrated the reliability and operability of the GSA system during continuous operation. The test results and detailed discussions of the test data can be obtained from TVA's Final Report (Appendix A). The Air Toxics Report (Appendix B), prepared by Energy and Environmental Research Corporation (EERC), characterizes air toxic emissions of selected hazardous air pollutants (HAP) from the GSA process. The results of this testing show that the GSA system can substantially reduce the emission of these HAP. With its lower capital and maintenance costs (section 7) compared to conventional semi-dry scrubbers, the GSA technology has a high potential for further commercialization in the United States. For detailed information refer to the Economic Evaluation Report (Appendix C) prepared by Raytheon Engineers and Constructors.

  10. Complexation of the actinides (III, IV and V) with organic acids

    International Nuclear Information System (INIS)

    Leguay, S.

    2012-01-01

    A thorough knowledge of the chemical properties of actinides is now required in a wide variety of fields: extraction processes involved in spent fuel reprocessing, groundwater in the vicinity of radioactive waste packages, and environmental and biological media in the case of accidental release of radionuclides. In this context, the present work has been focused on the complexation of Am(III), Cm(III), Cf(III), Pu(IV) and Pa(V) with organic ligands: DTPA, NTA and citric acid. The complexation of pentavalent protactinium with citric and nitrilotriacetic acids was studied using liquid-liquid extraction with the element at tracer scale (C_Pa ≤ 10⁻¹⁰ M). The order and the mean charge of each complex were determined from the analysis of the systematic variations of the distribution coefficient of Pa(V) as a function of ligand and proton concentration. Then, the apparent formation constants of the so-identified complexes were calculated. The complexation of trivalent actinides with DTPA was studied by time-resolved laser fluorescence spectroscopy (TRLFS) and capillary electrophoresis (CE-ICP-MS). The coexistence of the mono-protonated and non-protonated complexes (AnHDTPA⁻ and AnDTPA²⁻) in acidic media (1.5 ≤ pH ≤ 3.5) was shown unambiguously. Literature data have been reinterpreted by taking into account both complexes, and a consistent set of formation constants of An(III)-DTPA has been obtained. The experimental study was completed by theoretical calculations (DFT) on the Cm-DTPA system. The coordination geometry of Cm in CmDTPA²⁻ and CmHDTPA⁻, including water molecules in the first coordination sphere, has been determined, as well as the interatomic distances. Finally, a study on the complexation of Pu(IV) with DTPA was initiated in order to more closely mimic physiological conditions. A three-step approach was proposed to avoid plutonium hydrolysis: i/ complexation of Pu(IV) with NTA in order to protect Pu(IV) from hydrolysis (at low pH); ii/ increase of pH toward neutral conditions
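
    The slope-analysis step mentioned above (determining the order of a complex from the variation of the distribution coefficient with ligand concentration) can be illustrated with a minimal least-squares sketch. All numerical values below are invented placeholders, and the single-dominant-complex assumption is a simplification of the actual treatment.

```python
import numpy as np

# Hypothetical distribution coefficients of Pa(V) measured at fixed pH for
# increasing ligand (e.g. citrate) concentrations; values are invented placeholders.
ligand_conc = np.array([1e-4, 3e-4, 1e-3, 3e-3, 1e-2])   # mol/L
D = np.array([8.1, 2.9, 0.84, 0.29, 0.086])              # distribution coefficient

# Under the simplifying assumption that a single aqueous complex dominates,
# log10 D decreases linearly with log10[L] and the slope approximates -n,
# where n is the mean number of ligands bound to Pa(V).
slope, intercept = np.polyfit(np.log10(ligand_conc), np.log10(D), 1)
print(f"slope = {slope:.2f}  =>  approx. {-slope:.1f} ligand(s) per complex")
```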

  11. Safety assessment of reactor components under complex multiaxial cyclic loading. Final report; Sicherheitsbewertung kerntechnischer Komponenten bei komplexer, mehrachsiger Schwingbeanspruchung. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Fesich, Thomas M.; Herter, Karl-Heinz; Schuler, Xaver

    2012-12-15

    The objective of the project was the experimental validation of investigations on the theoretical basis of multiaxial fatigue loading. The applicability of existing hypotheses was reviewed, and the corresponding data base was extended, by means of experimental fatigue tests under complex multiaxial loading on a ferritic and an austenitic material. To investigate the influence of complex multiaxial stress conditions on the fatigue behavior, notched cylindrical specimens were examined in this project under alternating tensile/compressive loading and alternating torsional loading. The notch generates inhomogeneous, multiaxial stress states in the notched section. Uniaxial alternating tests on unnotched specimens and a further series of alternating torsional tests on unnotched specimens made it possible to evaluate the influence of the notch stress on the fatigue behavior. A series of experiments with superimposed alternating torsional and alternating tensile/compressive loading permits verification of the effect of phase-shifted stresses and a rotating principal coordinate system. All experiments were performed at room temperature. As part of the research project, the experimental results for the ferritic and austenitic materials were evaluated in terms of material behavior (hardening or softening) under cyclic loading. These comprised uniaxial alternating tensile/compressive tests, alternating torsional tests (unnotched cylindrical specimens), alternating tensile/compressive tests on notched cylindrical specimens, alternating torsional tests on notched cylindrical specimens, alternating tension-torsion tests with complex proportional stresses on unnotched cylindrical specimens (superposition of normal and shear stress components), as well as alternating tension-torsion tests with complex non-proportional strain on unnotched cylindrical specimens (superposition of normal and shear stress components with 90° phase
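
    As a small, generic illustration of why phase-shifted (non-proportional) loading matters, and not as the assessment method used in the project, the sketch below builds sinusoidal normal and shear stress histories and compares the von Mises equivalent stress for in-phase and 90°-out-of-phase loading of equal amplitudes. The amplitudes are illustrative; for non-proportional loading the von Mises measure alone is known to be insufficient, which is exactly why such hypotheses need experimental checking.

```python
import numpy as np

def von_mises_history(sigma_a, tau_a, phase_deg, n=1000):
    """Von Mises equivalent stress history for combined alternating tension/compression
    (amplitude sigma_a) and torsion (amplitude tau_a) with a given phase shift."""
    t = np.linspace(0.0, 2 * np.pi, n)
    sigma = sigma_a * np.sin(t)                      # normal stress history
    tau = tau_a * np.sin(t + np.radians(phase_deg))  # shear stress history
    return np.sqrt(sigma**2 + 3.0 * tau**2)

# Illustrative amplitudes (MPa): proportional (in-phase) vs. 90 deg out-of-phase loading.
for phase in (0.0, 90.0):
    eq = von_mises_history(200.0, 120.0, phase)
    print(f"phase {phase:5.1f} deg: max equivalent stress = {eq.max():6.1f} MPa")
```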

  12. Studies concerning the preparation of the 153Sm complex with EDTMP (ethylenediaminetetra methylenephosphonic acid) and other 153Sm complexes with other phosphonates, at room temperature

    International Nuclear Information System (INIS)

    Gasiglia, Haroldo Taurian

    2000-01-01

    This work presents a study on the preparation of the complexes 153Sm-EDTMP, 153Sm-HEDP, 153Sm-NTMP, 153Sm-DTPMP and 153Sm-HDTMP at room temperature. The preparation of the complex 153Sm-HDTMP under heating (70-72 deg C) was also studied. Several factors affecting the 153Sm-EDTMP complexing yield were studied, owing to its importance for use in Nuclear Medicine. These factors were: the molar ratio [ligand]/[metal], the ligand concentration and the incubation time of the ligand-metal mixture. The preparation of this complex at low molar ratios was also investigated. The 'in vitro' stability of 153Sm-EDTMP prepared at low radioactive concentration was studied, as was the influence of temperature on its degradation when the complex was obtained at higher radioactive concentrations. The preparation of the complexes 153Sm-HEDP, 153Sm-NTMP, 153Sm-DTPMP and 153Sm-HDTMP was investigated by preparing the complexes in two situations: high molar ratio and ligand concentration, and low molar ratio and ligand concentration. The 'in vitro' stability of each complex obtained at low radioactive concentration was studied. In the specific case of the complex 153Sm-HDTMP, its biological distribution in mice was determined. All the complexes were investigated by high performance liquid chromatography (HPLC) and their complexing yields were determined by three other chromatographic processes: ionic exchange, thin layer chromatography (TLC-SG) and paper chromatography. The chromatographic processes were performed in association with specific radiochemical techniques. This work also presents a comparative study of the chromatograms obtained by thin layer chromatography (TLC-SG) and paper chromatography, when evaluated by the technique of cutting the strips into pieces and by running the chromatograms directly on a radiochromatograph. The shape of the chromatograms and R

  13. Chitosan–Zinc(II) Complexes as a Bio-Sorbent for the Adsorptive Abatement of Phosphate: Mechanism of Complexation and Assessment of Adsorption Performance

    Directory of Open Access Journals (Sweden)

    Maryam Roza Yazdani

    2017-12-01

    This study examines zinc(II)–chitosan complexes as a bio-sorbent for phosphate removal from aqueous solutions. The bio-sorbent is prepared and characterized via Fourier Transform Infrared Spectroscopy (FT-IR), Scanning Electron Microscopy (SEM), and the Point of Zero Charge (pHPZC) drift method. The adsorption capacity of the zinc(II)–chitosan bio-sorbent is compared with those of chitosan and of ZnO–chitosan and nano-ZnO–chitosan composites. The effect of operational parameters including pH, temperature, and competing ions is explored in batch adsorption mode. A rapid phosphate uptake is observed within the first three hours of contact time. Phosphate removal by zinc(II)–chitosan is favored when the surface charge of the bio-sorbent is positive or neutral, i.e., within the pH range at or below its pHPZC of 7. Phosphate abatement is enhanced with decreasing temperature. The study of background ions indicates a minor effect of chloride, whereas nitrate and sulfate compete with phosphate for the adsorptive sites. The adsorption kinetics is best described by the pseudo-second-order model. The Sips (R² > 0.96) and Freundlich (R² ≥ 0.95) models fit the adsorption isotherm. The phosphate reaction with zinc(II)–chitosan is exothermic, favorable and spontaneous. The complexation of zinc(II) and chitosan, along with the corresponding mechanisms of phosphate removal, is presented. This study indicates that the introduction of zinc(II) ions into chitosan improves its phosphate uptake performance from 1.45 to 6.55 mg/g and provides fundamental information for developing bio-based materials for water remediation.
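
    The isotherm and kinetic models named above are typically fitted by least squares; the sketch below shows such a fit for the Freundlich isotherm and the linearized pseudo-second-order model using invented placeholder data (not the study's measurements).

```python
import numpy as np
from scipy.optimize import curve_fit

# Invented placeholder data: equilibrium concentration Ce (mg/L) and uptake qe (mg/g).
Ce = np.array([1.0, 2.5, 5.0, 10.0, 20.0, 40.0])
qe = np.array([1.6, 2.4, 3.2, 4.2, 5.4, 6.4])

def freundlich(Ce, KF, n):
    """Freundlich isotherm: qe = KF * Ce**(1/n)."""
    return KF * Ce ** (1.0 / n)

(KF, n), _ = curve_fit(freundlich, Ce, qe, p0=(1.0, 2.0))
print(f"Freundlich: KF = {KF:.2f}, n = {n:.2f}")

# Pseudo-second-order kinetics, fitted in the linearized form t/qt = 1/(k2*qe^2) + t/qe.
t = np.array([10, 30, 60, 120, 180, 300], dtype=float)      # contact time, min
qt = np.array([2.1, 3.9, 5.0, 5.9, 6.2, 6.4])               # uptake, mg/g
slope, intercept = np.polyfit(t, t / qt, 1)
qe_fit, k2 = 1.0 / slope, slope**2 / intercept
print(f"pseudo-second-order: qe = {qe_fit:.2f} mg/g, k2 = {k2:.4f} g/(mg*min)")
```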

  14. Phosphorescence Imaging of Living Cells with Amino Acid-Functionalized Tris(2-phenylpyridine)iridium(III) Complexes

    NARCIS (Netherlands)

    Steunenberg, P.; Ruggi, A.; Berg, van den N.S.; Buckle, T.; Kuil, J.; Leeuwen, van F.W.B.; Velders, A.H.

    2012-01-01

    A series of nine luminescent cyclometalated octahedral iridium(III) tris(2-phenylpyridine) complexes has been synthesized, functionalized with three different amino acids (glycine, alanine, and lysine), on one, two, or all three of the phenylpyridine ligands. All starting complexes and final

  15. Geosynthetic wall performance : facing pressure and deformation : final report.

    Science.gov (United States)

    2017-02-01

    The objective of the study was to validate the performance of a blocked-faced Geosynthetic Reinforced Soil (GRS) wall and to validate the Colorado Department of Transportation's (CDOT) decision to waive the positive block connection for closely-space...

  16. Database management for an electrical distribution network of intermediate complexity CERN

    CERN Document Server

    De Ruschi, Daniele; Burdet, Georges

    2010-01-01

    This thesis is submitted as the final work for the degree of Master of Science in Engineering of Information obtained by the writer at the University of Bergamo, Italy. The report is based on the work conducted by the writer from September 2009 through June 2010 on a project assignment given by the Engineering Electrical Control department at CERN, Genève. The work performed is a contribution to CERN's GESMAR system. GESMAR is a complex platform made at CERN for the support and management of the electrical network. In this work, an information system for an ETL process is developed. The report presents the design, implementation and evaluation carried out; prototypes of applications which take advantage of the new information inserted in GESMAR are also presented.
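
    The record does not describe GESMAR's schema, so the sketch below only illustrates the generic extract-transform-load pattern such an information system follows, with a hypothetical CSV source, invented column names, and a SQLite target standing in for the real database.

```python
import csv
import sqlite3

def extract(path):
    """Extract: read raw records from a hypothetical CSV export of the network inventory."""
    with open(path, newline="", encoding="utf-8") as f:
        yield from csv.DictReader(f)

def transform(rows):
    """Transform: normalize units and drop incomplete records (rules are invented examples)."""
    for row in rows:
        if not row.get("equipment_id"):
            continue
        yield (row["equipment_id"].strip().upper(),
               row["substation"].strip(),
               float(row["rated_power_kw"]) / 1000.0)   # kW -> MW

def load(records, db_path="network.db"):
    """Load: insert the cleaned records into a hypothetical reporting table."""
    with sqlite3.connect(db_path) as conn:
        conn.execute("""CREATE TABLE IF NOT EXISTS equipment
                        (equipment_id TEXT PRIMARY KEY, substation TEXT, rated_power_mw REAL)""")
        conn.executemany("INSERT OR REPLACE INTO equipment VALUES (?, ?, ?)", records)

# Hypothetical usage: load(transform(extract("equipment_export.csv")))
```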

  17. Sequence complexity and work extraction

    International Nuclear Information System (INIS)

    Merhav, Neri

    2015-01-01

    We consider a simplified version of a solvable model by Mandal and Jarzynski, which constructively demonstrates the interplay between work extraction and the increase of the Shannon entropy of an information reservoir which is in contact with a physical system. We extend Mandal and Jarzynski’s main findings in several directions: first, we allow sequences of correlated bits rather than just independent bits. Secondly, at least for the case of binary information, we show that, in fact, the Shannon entropy is only one measure of complexity of the information that must increase in order for work to be extracted. The extracted work can also be upper bounded in terms of the increase in other quantities that measure complexity, like the predictability of future bits from past ones. Third, we provide an extension to the case of non-binary information (i.e. a larger alphabet), and finally, we extend the scope to the case where the incoming bits (before the interaction) form an individual sequence, rather than a random one. In this case, the entropy before the interaction can be replaced by the Lempel–Ziv (LZ) complexity of the incoming sequence, a fact that gives rise to an entropic meaning of the LZ complexity, not only in information theory, but also in physics. (paper)
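
    One concrete way to compute the LZ complexity of an individual binary sequence mentioned above is the classical LZ76 phrase-counting parse (in the Kaspar-Schuster formulation), normalized by the asymptotic phrase count of a random sequence. The sketch below is a generic implementation, not code from the paper.

```python
import numpy as np

def lz76_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) exhaustive parsing of sequence s."""
    n = len(s)
    i, k, l = 0, 1, 1
    k_max, c = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:                    # no longer match found: close the phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

def normalized_lz76(bits):
    """Normalize by n / log2(n), the asymptotic phrase count of a random binary sequence."""
    n = len(bits)
    return lz76_complexity(bits) * np.log2(n) / n

rng = np.random.default_rng(0)
print(normalized_lz76(rng.integers(0, 2, 10_000).tolist()))   # random bits: close to 1
print(normalized_lz76([0, 1] * 5_000))                        # periodic bits: close to 0
```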

  18. Macrocyclic ligands for uranium complexation: Progress report, August 15, 1987-present

    International Nuclear Information System (INIS)

    Potts, K.T.

    1988-03-01

    The synthesis of several macrocyclic ligands, designed by a computer modeling approach for the complexation of the uranyl ion, has now been completed and their structures established. Preliminary results indicate that these macrocycles successfully complex the uranyl ion. Other synthetic efforts have led to a variety of intermediates suitable for final ring closure to the desired macrocycles, providing appreciable potential for variation of the macrocyclic peripheral atoms. A 1:1 uranyl ion complex of one of these precursor products has been shown to undergo a DMSO-induced rearrangement to a 2:1 uranyl ion to ligand complex, both structures having been established by single-crystal x-ray data. 10 refs

  19. Investigation of modern methods of probabilistic sensitivity analysis of final repository performance assessment models (MOSEL)

    International Nuclear Information System (INIS)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-01

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation
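
    As a toy illustration of sampling-based global sensitivity analysis, and not of the specific MOSEL method set, the sketch below runs a simple three-parameter model with a right-skewed output on a random sample and ranks the parameters by Spearman rank correlation with the output; the model and parameter names are invented.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(42)
n = 10_000

# Toy "repository-like" model: three uncertain parameters, strongly right-skewed output.
x1 = rng.uniform(0.0, 1.0, n)     # e.g. a sorption coefficient (strong influence)
x2 = rng.uniform(0.0, 1.0, n)     # e.g. a flow-path length (moderate influence)
x3 = rng.uniform(0.0, 1.0, n)     # e.g. a nuisance parameter (little influence)
y = np.exp(4.0 * x1) * (1.0 + 0.5 * x2) + 0.1 * x3   # toy dose-like indicator

# Simple global sensitivity measure: Spearman rank correlation between each
# parameter sample and the model output (robust to the skewed, nonlinear response).
for name, x in (("x1", x1), ("x2", x2), ("x3", x3)):
    rho, _ = spearmanr(x, y)
    print(f"{name}: Spearman rank correlation with output = {rho:+.2f}")
```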

  20. Investigation of modern methods of probabilistic sensitivity analysis of final repository performance assessment models (MOSEL)

    Energy Technology Data Exchange (ETDEWEB)

    Spiessl, Sabine; Becker, Dirk-Alexander

    2017-06-15

    Sensitivity analysis is a mathematical means for analysing the sensitivities of a computational model to variations of its input parameters. Thus, it is a tool for managing parameter uncertainties. It is often performed probabilistically as global sensitivity analysis, running the model a large number of times with different parameter value combinations. Going along with the increase of computer capabilities, global sensitivity analysis has been a field of mathematical research for some decades. In the field of final repository modelling, probabilistic analysis is regarded a key element of a modern safety case. An appropriate uncertainty and sensitivity analysis can help identify parameters that need further dedicated research to reduce the overall uncertainty, generally leads to better system understanding and can thus contribute to building confidence in the models. The purpose of the project described here was to systematically investigate different numerical and graphical techniques of sensitivity analysis with typical repository models, which produce a distinctly right-skewed and tailed output distribution and can exhibit a highly nonlinear, non-monotonic or even non-continuous behaviour. For the investigations presented here, three test models were defined that describe generic, but typical repository systems. A number of numerical and graphical sensitivity analysis methods were selected for investigation and, in part, modified or adapted. Different sampling methods were applied to produce various parameter samples of different sizes and many individual runs with the test models were performed. The results were evaluated with the different methods of sensitivity analysis. On this basis the methods were compared and assessed. This report gives an overview of the background and the applied methods. The results obtained for three typical test models are presented and explained; conclusions in view of practical applications are drawn. At the end, a recommendation