An interactive videodisc system for training
International Nuclear Information System (INIS)
Cadwell, J.J.
1987-01-01
Under the sponsorship of the U.S. Department of Energy's Office of Classification (DOE/OC), Brookhaven National Laboratory/Technical Support Organization (BNL/TSO) has prepared a level-three interactive-laserdisc program for the training of authorized classifiers in the Department of Energy. This training program consists of six modules presented in several formats. The material is presented in a highly interactive manner, with various tests to reinforce and evaluate the trainee's progress in learning the material. A lengthy qualification test is presented at the end of the educational material. The instructional techniques of scenario presentation, ''talking heads'', graphics, textual material, and combinations of the above are used to ensure that the training material attracts the trainee's interest and motivates them to understand and use the material. The state-of-the-art interactive laser videodisc, with its storage capacity, speed, flexibility, and superior training capacity, was the logical choice for the training of authorized classifiers in the Department of Energy.
Stevenson, Kimberly
This master's thesis describes the development of an expert system and interactive videodisc computer-based instructional job aid used for assisting in the integration of electron beam lithography devices. Comparable to all comprehensive training, expert system and job aid development require a criterion-referenced systems approach treatment to…
International Nuclear Information System (INIS)
Russell, C.M.
1988-01-01
The purpose of this study was twofold. The first part was to describe the development and evaluation of an interactive videodisc system to train radiation therapy technology students how to treat malignancies using a linear accelerator. The second part of the study was to evaluate the effectiveness of the interactive videodisc system as a simulation. The Gagne-Briggs instructional model was adapted to develop the interactive videodisc system. A model emerged as part of the project to conduct the formative evaluation of the prototype. A quasi-experimental research design was used to conduct the summative evaluation with two groups of first-year radiation therapy technology students who entered the program in consecutive years. All testing and evaluation instruments were developed for the study with the exception of the clinical evaluation form, which was already being used at the clinical sites. T-tests were used to analyze all data. A significant difference in cognitive achievement was evidenced between students exposed to the interactive videodisc system and students who were not. There was no significant difference found in clinical performance achievement or in attitude toward the clinical experience between the two groups. Instructor time was reduced by one and one-half hours for students on the interactive videodisc system. In conclusion, the interactive videodisc system was found to be more effective as an instructional method for cognitive achievement and an equally effective method for preparing students for clinical performance.
Interactive videodisc in maintenance
International Nuclear Information System (INIS)
Zwingelstein, G.; Nguyen Van Nghi, B.
1986-01-01
After a recall of videodisc characteristics, this paper presents its utilization by Electricité de France in the framework of training and maintenance. The SICMA (Interactive Communication System in Maintenance), developed and tested by Electricité de France, is presented, along with its utilization. It has been tested at the Dampierre and Paluel sites in the contexts of training and maintenance (disconnection of the drive rods of control elements); the conclusions of this experimentation are finally given. 4 refs (in French)
Leonard, William H.
This study was designed to learn whether students perceived an interactive computer/videodisc learning system to represent a viable alternative to (or extension of) the conventional laboratory for learning biology skills and concepts normally taught under classroom laboratory conditions. Data were collected by questionnaire for introductory biology classes at a large midwestern university, where students were randomly assigned either to two interactive videodisc/computer lessons titled Respiration and Climate and Life or to traditional laboratory investigations with the same titles and concepts. The interactive videodisc system consisted of a TRS-80 Model III microcomputer interfaced to a Pioneer laser-disc player and a color TV monitor. Students indicated an overall level of satisfaction with this strategy very similar to that of conventional laboratory instruction. Students frequently remarked that videodisc instruction gave them more experimental and procedural options and more efficient use of instructional time than did the conventional laboratory mode. These two results are consistent with past CAI research. Students also had a strong perception that the images on the videodisc were not real, a factor perceived as having both advantages and disadvantages. Students found the two approaches equivalent in the areas of general interest, understanding of basic principles, help on examinations, and attitude toward science. The student-opinion data in this study do not suggest that interactive videodisc technology should serve as a substitute for the wet laboratory experience, but rather that this medium may enrich instruction with educational experiences usually not possible in typical classroom settings.
Comparing interactive videodisc training effectiveness to traditional training methods
International Nuclear Information System (INIS)
Kenworthy, N.W.
1987-01-01
Videodisc skills training programs developed by Industrial Training Corporation are being used and evaluated by major industrial facilities. In one such study, interactive videodisc training programs were compared to videotape and instructor-based training to determine the effectiveness of videodisc in terms of performance, training time, and trainee attitudes. Results showed that when initial training was done using the interactive videodisc system, trainee performance was superior to that of trainees using videotape, and approximately equal to that of those trained by an instructor. When each method was used in follow-up training, interactive videodisc was definitely the most effective. Results also indicate that training time can be reduced using interactive videodisc. Attitudes of both trainees and instructors toward the interactive videodisc training were positive.
Virginia Power's computer-based interactive videodisc training: a prototype for the future
International Nuclear Information System (INIS)
Seigler, G.G.; Adams, R.H.
1987-01-01
Virginia Power has developed a system and internally produced a prototype for computer-based interactive videodisc (CBIV) training. Two programs have been developed using the CBIV instructional methodology: Fire Team Retraining and General Employee Training (practical factors). In addition, the company developed a related program for conducting a videodisc tour of their nuclear power stations using a videodisc information management system (VIMS).
Whither Interactive Videodisc?
Geber, Beverly
1989-01-01
Probably within the next 10 years, current videodisc technology will be surpassed by something even more useful to corporate trainers. However, those with no vested interest in selling the technology recommend that if the need is there, corporations should invest in it now. (JOW)
Karpisek, Marian; And Others
1995-01-01
Presents five articles and a company resource directory to help librarians successfully incorporate technology into school libraries. Discusses actual situations, examines student needs, and gives advice to help librarians with library automation systems, videodiscs, library security systems, media retrieval, networking CD-ROMs, and locating…
Maintaining consistency in distributed systems
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs such as semaphores and monitors. Where data is persistent and/or sets of operations are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group-oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems, often within the same application. This leads us to propose an integrated approach that permits applications using virtual synchrony to interoperate with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
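The first of the three consistency styles named above, mutual exclusion, can be sketched in a few lines. This is a generic Python illustration (not code from the paper): a lock serializes concurrent updates so that no increment is lost.

```python
import threading

# Shared counter guarded by a mutual-exclusion construct (a lock).
counter = 0
lock = threading.Lock()

def increment(n):
    global counter
    for _ in range(n):
        with lock:  # only one thread mutates the shared data at a time
            counter += 1  # read-modify-write is unsafe without the lock

threads = [threading.Thread(target=increment, args=(10_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 40000 -- without the lock, concurrent updates could be lost
```

Transactions and linearizability generalize this per-program discipline to persistent data and to related sets of operations, which is where the paper's integrated approach comes in.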
Artificial Intelligence Applications to Videodisc Technology
Vries, John K.; Banks, Gordon; McLinden, Sean; Moossy, John; Brown, Melanie
1985-01-01
Much of medical information is visual in nature. Since it is not easy to describe pictorial information in linguistic terms, it has been difficult to store and retrieve this type of information. Coupling videodisc technology with artificial intelligence programming techniques may provide a means for solving this problem.
A Comparative Evaluation of Videodiscs for General Biology.
Ralph, Charles L.
1995-01-01
Provides a brief profile of the currently available videodiscs for general biology, with comparable information for each. An introduction discusses benefits and problems associated with videodisc use in the classroom. Profiles contain information on description, good and bad features, still images, animations and movies, audio, software,…
Sticky continuous processes have consistent price systems
DEFF Research Database (Denmark)
Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan
Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...
Von Der Linn, Robert Christopher
A needs assessment of the Grumman E-Beam Systems Group identified the requirement for additional skill mastery for the engineers who assemble, integrate, and maintain devices used to manufacture integrated circuits. Further analysis of the tasks involved led to the decision to develop interactive videodisc, computer-based job aids to enable…
On the existence of consistent price systems
DEFF Research Database (Denmark)
Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan
2014-01-01
We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
Using Interactive Videodisc To Teach Psychomotor Skills to Nursing Students
Renshaw, Sharon M.; Beadenkopf, F. Scott; Murray, Rodney
1989-01-01
An interactive videodisc program on the process of administering medications to clients will be demonstrated. Discussion will center on the strengths and limitations of interactive video for teaching psychomotor skills to healthcare professionals as well as design modifications that will facilitate this process. Interactive videodisc technology provides an exciting new medium for teaching psychomotor clinical skills to health care professionals. It is a particularly valuable approach for complex skills which involve visualization of motor activities and extensive client assessments.
Self-consistent nuclear energy systems
International Nuclear Information System (INIS)
Shimizu, A.; Fujiie, Y.
1995-01-01
A concept of self-consistent energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium elements through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium elements and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and for experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)
Consistent thermodynamic properties of lipids systems
DEFF Research Database (Denmark)
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data...
Sanford, M K; Hazelwood, S E; Bridges, A J; Cutts, J H; Mitchell, J A; Reid, J C; Sharp, G
1996-01-01
A computer-assisted interactive videodisc instructional program, HP-RHEUM was designed to teach clinical findings in arthritis to occupational and physical therapy students. Using the Rheumatology Image Library videodisc produced by the National Library of Medicine, HP-RHEUM consists of instructional modules which employ advance organizers, examples/nonexamples, summaries, and immediate feedback. To see if HP-RHEUM would be as effective as traditional classroom instruction, control data were collected in 1991 from 52 OT and PT students. Treatment data were collected from 61 students in 1992 when HP-RHEUM entirely replaced lectures. Identical pre- and post-tests consisted of 70 multiple choice questions, with 24 matched to slides. On the slide questions the HP-RHEUM group had significantly higher scores. Otherwise, there was no significant difference in performance between groups. HP-RHEUM provided an independent learning method and enhanced visual comprehension of rheumatologic disease concepts.
International Nuclear Information System (INIS)
Shiplett, D.W.
1990-01-01
This presentation discussed the use of computer-aided training at the Savannah River Site using a computer-assisted interactive videodisc system. This system was used in situations where there was a high frequency of required training, a large number of people to be trained, and a rigid work schedule. The system was used to support classroom training by emphasizing major points and presenting graphics of flowpaths, simulations, and video of actual equipment.
Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D
This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
Interactive videodisc training for power plant operations
International Nuclear Information System (INIS)
Nolan, R.; Nolan, J.; Campos, M.; Haukom, R.; Quentin, G.
1990-01-01
During the last several years, professionals in the personal computer and video fields have seen their two technologies coming together. This merging has created a new medium called multimedia. Multimedia provides the user with the interactivity of the personal computer and the realism of live-action television. It appears to be a perfect marriage for education, training, and selling applications. As multimedia productions continue to be produced and tested with high marks, business and industry are becoming interested. The Interactive Videodisc Trainer (IVT) is a demonstration of how multimedia technology can be used by the electric power industry for operator training. Although the subject for this pilot program is the Claus sulfur recovery unit at the Cool Water Integrated Gasification Combined Cycle plant, similar courseware can be put to use for training at any type of power plant. The goal is to show many of the features and capabilities inherent in this powerful new training tool, so that utilities can begin to see how it could work for them.
A Consistent System for Coding Laboratory Samples
Sih, John C.
1996-07-01
A formal laboratory coding system is presented to keep track of laboratory samples. Preliminary useful information regarding the sample (origin and history) is gained without consulting a research notebook. Since this system uses and retains the same research notebook page number for each new experiment (reaction), finding and distinguishing products (samples) of the same or different reactions becomes an easy task. Using this system multiple products generated from a single reaction can be identified and classified in a uniform fashion. Samples can be stored and filed according to stage and degree of purification, e.g. crude reaction mixtures, recrystallized samples, chromatographed or distilled products.
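A scheme of this kind is easy to mechanize. The sketch below is a hypothetical illustration only (the article's exact code format is not reproduced here): a sample code is built from researcher initials, the retained notebook page number, a product letter distinguishing multiple products of one reaction, and a purification-stage digit.

```python
# Hypothetical sample-code format in the spirit of the scheme described above;
# the article's actual format may differ. Stage digits label purification state.
STAGES = {0: "crude", 1: "recrystallized", 2: "chromatographed", 3: "distilled"}

def make_code(initials: str, page: int, product: str, stage: int) -> str:
    """Build a code like 'JS-0123B-2' from initials, notebook page,
    product letter, and purification-stage digit."""
    if stage not in STAGES:
        raise ValueError("unknown purification stage")
    return f"{initials}-{page:04d}{product.upper()}-{stage}"

def parse_code(code: str) -> dict:
    initials, rest, stage = code.split("-")
    return {
        "initials": initials,
        "page": int(rest[:4]),   # same notebook page for every sample of the reaction
        "product": rest[4:],     # letter distinguishing products of one reaction
        "stage": STAGES[int(stage)],
    }

code = make_code("JS", 123, "b", 2)
print(code)                        # JS-0123B-2
print(parse_code(code)["stage"])   # chromatographed
```

Because the notebook page number is retained in every code, all samples from one reaction sort together, and the stage digit lets crude mixtures, recrystallized samples, and chromatographed or distilled products be filed uniformly.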
Self-consistent phonons in disordered systems
International Nuclear Information System (INIS)
Das, M.P.
1990-01-01
The time is now ripe for the development of a microscopic theory of the disordered systems in the context of phonons. The adiabatic approximation has helped to separate the electronic motion from that of the ions. In the microscopic dielectric formulation we have been able to obtain the interatomic forces for ordered systems by incorporating the effect of the electronic motion. The nature of the electronic states in disordered systems is now better understood with realistic coherent potential approximation calculations. Therefore, it will not be too ambitious to construct an average dielectric function for a disordered system. Then we can obtain a properly screened pair potential in terms of this dielectric function. In view of the availability of super fast computers, the development of the microscopic theories are expected to get a new direction. (author). 36 refs
Consistency of a system of equations: What does that mean?
Still, Georg J.; Kern, Walter; Koelewijn, Jaap; Bomhoff, M.J.
2010-01-01
The concept of (structural) consistency also called structural solvability is an important basic tool for analyzing the structure of systems of equations. Our aim is to provide a sound and practically relevant meaning to this concept. The implications of consistency are expressed in terms of
An approach to a self-consistent nuclear energy system
International Nuclear Information System (INIS)
Fujii-e, Yoichi; Arie, Kazuo; Endo, Hiroshi
1992-01-01
A nuclear energy system should provide a stable supply of energy without endangering the environment or humans. If there is fear about exhausting world energy resources, accumulating radionuclides, and nuclear reactor safety, tension is created in human society. Nuclear energy systems of the future should be able to eliminate fear from people's minds. In other words, the whole system, including the nuclear fuel cycle, should be self-consistent. This is the ultimate goal of nuclear energy. If it can be realized, public acceptance of nuclear energy will increase significantly. In a self-consistent nuclear energy system, misunderstandings between experts on nuclear energy and the public should be minimized. The way to achieve this goal is to explain using simple logic. This paper proposes specific targets for self-consistent nuclear energy systems and shows that the fast breeder reactor (FBR) lies on the route to attaining the final goal.
The consistency service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2011-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
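The reported workflow can be modeled in miniature. The sketch below is a toy illustration with hypothetical data structures (the real DQ2 service's interfaces are not described in the abstract): tools report a bad file at a site, the service recovers it from a surviving replica if one exists, and otherwise records an irrecoverable loss to be reported to users.

```python
# Toy model of the reported workflow; data and function names are hypothetical.
replicas = {  # file -> sites still holding a good copy
    "data1.root": ["SITE_A", "SITE_B"],
    "data2.root": [],
}

notifications = []  # messages for users about irrecoverable losses

def report_bad_file(name: str, site: str) -> str:
    """Handle a corrupted/lost-file report: recover from a replica if possible,
    otherwise flag the loss as irrecoverable and queue a user notification."""
    sources = [s for s in replicas.get(name, []) if s != site]
    if sources:
        return f"re-replicated {name} from {sources[0]}"
    notifications.append(f"{name}: irrecoverable loss, users informed")
    return f"{name} lost"

print(report_bad_file("data1.root", "SITE_B"))  # re-replicated data1.root from SITE_A
print(report_bad_file("data2.root", "SITE_C"))  # data2.root lost
```

The essential design point survives the simplification: correction is automatic when any good copy remains, and human notification is reserved for the irrecoverable case.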
The Consistency Service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2010-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data losses, due to software and hardware failures, is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
Consistent approach to air-cleaning system duct design
International Nuclear Information System (INIS)
Miller, W.H.; Ornberg, S.C.; Rooney, K.L.
1981-01-01
Nuclear power plant air-cleaning system effectiveness is dependent on the capability of a duct system to safely convey contaminated gas to a filtration unit and subsequently to a point of discharge. This paper presents a logical and consistent design approach for selecting sheet metal ductwork construction to meet applicable criteria. The differences in design engineers' duct construction specifications are acknowledged. Typical duct construction details and suggestions for their effective use are presented. Improvements in duct design sections of ANSI/ASME N509-80 are highlighted. A detailed leakage analysis of a control room HVAC system is undertaken to illustrate the effects of conceptual design variations on duct construction requirements. Shortcomings of previously published analyses and interpretations of a current standard are included
Thermodynamically consistent Bayesian analysis of closed biochemical reaction systems
Directory of Open Access Journals (Sweden)
Goutsias John
2010-11-01
Background: Estimating the rate constants of a biochemical reaction system with known stoichiometry from noisy time series measurements of molecular concentrations is an important step for building predictive models of cellular function. Inference techniques currently available in the literature may produce rate constant values that defy necessary constraints imposed by the fundamental laws of thermodynamics. As a result, these techniques may lead to biochemical reaction systems whose concentration dynamics could not possibly occur in nature. Therefore, development of a thermodynamically consistent approach for estimating the rate constants of a biochemical reaction system is highly desirable. Results: We introduce a Bayesian analysis approach for computing thermodynamically consistent estimates of the rate constants of a closed biochemical reaction system with known stoichiometry given experimental data. Our method employs an appropriately designed prior probability density function that effectively integrates fundamental biophysical and thermodynamic knowledge into the inference problem. Moreover, it takes into account experimental strategies for collecting informative observations of molecular concentrations through perturbations. The proposed method employs a maximization-expectation-maximization algorithm that provides thermodynamically feasible estimates of the rate constant values and computes appropriate measures of estimation accuracy. We demonstrate various aspects of the proposed method on synthetic data obtained by simulating a subset of a well-known model of the EGF/ERK signaling pathway, and examine its robustness under conditions that violate key assumptions. Software, coded in MATLAB®, which implements all Bayesian analysis techniques discussed in this paper, is available free of charge at http://www.cis.jhu.edu/~goutsias/CSS%20lab/software.html. Conclusions: Our approach provides an attractive statistical methodology for…
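One concrete constraint of the kind the abstract alludes to: for a closed system, detailed balance requires that the product of forward rate constants around any reaction cycle equal the product of the reverse ones (a Wegscheider condition). The check below illustrates the sort of constraint a thermodynamically consistent prior must respect; it is a generic illustration, not the paper's actual prior or algorithm.

```python
import math

def wegscheider_ok(k_fwd, k_rev, tol=1e-9):
    """Check the cycle condition prod(k_fwd) == prod(k_rev) that detailed
    balance imposes on rate constants around a closed reaction cycle."""
    return math.isclose(math.prod(k_fwd), math.prod(k_rev), rel_tol=tol)

# A three-reaction cycle A->B->C->A with thermodynamically consistent constants:
print(wegscheider_ok([2.0, 3.0, 0.5], [1.0, 1.5, 2.0]))  # True: 3.0 == 3.0
# An inconsistent set, which no assignment of equilibrium free energies can produce:
print(wegscheider_ok([2.0, 3.0, 0.5], [1.0, 1.0, 1.0]))  # False
```

A naive estimator fit to noisy data can easily land in the second, infeasible case; building the constraint into the prior rules such estimates out from the start.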
Improving risk assessment by defining consistent and reliable system scenarios
Directory of Open Access Journals (Sweden)
B. Mazzorana
2009-02-01
During the entire procedure of risk assessment for hydrologic hazards, the selection of consistent and reliable scenarios, constructed in a strictly systematic way, is fundamental for the quality and reproducibility of the results. However, subjective assumptions on relevant impact variables, such as sediment transport intensity on the system loading side and weak point response mechanisms, repeatedly cause biases in the results, and consequently affect transparency and required quality standards. Furthermore, the system response of mitigation measures to extreme event loadings represents another key variable in hazard assessment, as does integral risk management including intervention planning. Formative Scenario Analysis, as a supplement to conventional risk assessment methods, is a technique to construct well-defined sets of assumptions to gain insight into a specific case and the potential system behaviour. Two case studies, carried out (1) to analyse sediment transport dynamics in a torrent section equipped with control measures, and (2) to identify hazards induced by woody debris transport at hydraulic weak points, demonstrate the applicability of the Formative Scenario Analysis technique. It is argued that during scenario planning in general, and with respect to integral risk management in particular, Formative Scenario Analysis allows for the development of reliable and reproducible scenarios in order to design more specifically an application framework for the sustainable assessment of natural hazards impact. The overall aim is to optimise the hazard mapping and zoning procedure by methodologically integrating quantitative and qualitative knowledge.
A self-consistent nuclear energy supply system
International Nuclear Information System (INIS)
Fujii-e, Y.; Morita, T.; Kawakami, H.; Arie, K.; Suzuki, M.; Iida, M.; Yamazaki, H.
1992-01-01
A self-consistent nuclear energy supply system (SCNESS) is investigated for a fast reactor. SCNESS is proposed as a future stable energy supplier with no harmful influence on humans or the environment, the ultimate goal of nuclear energy development. SCNESS should be inherently safe, be able to breed fissionable material, and transmute long-lived radioactive nuclides (i.e., minor actinides and long-lived fission products). The relationship between these characteristics and the spatial assignment of excess neutrons (ν-1) for each characteristic is analyzed. The analysis shows that excess neutrons play an intrinsic role in realizing SCNESS. The reactor concept of SCNESS is investigated by considering the utilization of excess neutrons. Results show that a small-size, axially double-layered annular core with metal fuel is a promising candidate for SCNESS. SCNESS is concluded to be feasible. (author). 4 refs., 9 figs
Consistent Steering System using SCTP for Bluetooth Scatternet Sensor Network
Dhaya, R.; Sadasivam, V.; Kanthavel, R.
2012-12-01
Wireless communication is the best way to convey information from source to destination with flexibility and mobility, and Bluetooth is a wireless technology suitable for short distances. A wireless sensor network (WSN), on the other hand, consists of spatially distributed autonomous sensors that cooperatively monitor physical or environmental conditions, such as temperature, sound, vibration, pressure, motion, or pollutants. Using the Bluetooth piconet wireless technique in sensor nodes creates limitations in network depth and placement. The introduction of the Scatternet solves these network restrictions, but with a lack of reliability in data transmission. When the depth of the network increases, routing becomes more difficult. No authors so far have focused on the reliability factors of Scatternet sensor network routing. This paper illustrates the proposed system architecture and routing mechanism to increase reliability. Another objective is to use a reliable transport protocol that uses the multi-homing concept and supports multiple streams to prevent head-of-line blocking. The results show that the Scatternet sensor network has lower packet loss than the existing system, even in a congested environment, making it suitable for surveillance applications.
Consistency conditions for data base systems: a new problem of systems analysis
International Nuclear Information System (INIS)
Schlageter, G.
1976-01-01
A data base can be seen as a model of a system in the real world. During the systems analysis, conditions must be derived which guarantee a close correspondence between the real system and the data base. These conditions are called consistency constraints. The notion of consistency is analyzed; different types of consistency constraints are presented. (orig.) (in German)
Student Consistency and Implications for Feedback in Online Assessment Systems
Madhyastha, Tara M.; Tanimoto, Steven
2009-01-01
Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…
The World Center for Computing's Pilot Videodisc Project for French Language Instruction.
Eastmond, J. Nicholls, Jr.; Mosenthal, Richard
1985-01-01
Describes a pilot videodisc project for French language instruction. Unique features include (1) learner control of instruction by a mouse or touch-sensitive screen, (2) extensive cultural interaction, and (3) an elaborate lexicon of word meanings portrayed visually for selected key words. (Author/SED)
Understanding and Improving the Performance Consistency of Distributed Computing Systems
Yigitbasi, M.N.
2012-01-01
With the increasing adoption of distributed systems in both academia and industry, and with the increasing computational and storage requirements of distributed applications, users inevitably demand more from these systems. Moreover, users also depend on these systems for latency and throughput
An Evaluation of Information Consistency in Grid Information Systems
Field, Laurence
2017-01-01
A Grid information system resolves queries that may need to consider all information sources (Grid services), which are widely distributed geographically, in order to enable efficient Grid functions that may utilise multiple cooperating services. Fundamentally this can be achieved by either moving the query to the data (query shipping) or moving the data to the query (data shipping). Existing Grid information system implementations have adopted one of the two approaches. This paper explores the two approaches in further detail by evaluating them to the best possible extent with respect to Grid information system benchmarking metrics. A Grid information system that follows the data shipping approach, based on the replication of information and aiming to improve currency for highly-mutable information, is presented. An implementation of this, based on an Enterprise Messaging System, is evaluated using the benchmarking method, and the consequence of the results for the design of Grid information systems is discussed.
An interactive histology image-barcode manual for a videodisc image library.
Ogilvie, R W
1995-01-01
Cell Biology and HISTOLOGY (alias Microanatomy, alias Microscopic Anatomy) is a required course for first-year medical and dental students in most health science centers. The traditional approach to teaching this discipline is to present photomicrographic images of structures to students in lecture using 35 mm slides of fields seen through the microscope. The students then spend many hours viewing and studying tissue specimens with a light microscope in a laboratory setting. Students in traditional histology courses spend an inordinate amount of time learning component structures by attempting to find and identify them in tissue sections under a microscope, where the structure being sought is surrounded by a multitude of other structures with which they are also unfamiliar. With the recent availability of videodisc-stored image libraries of histological samples, it is now possible to study histological principles without the microscope as the primary learning tool. A videodisc entitled "A Photographic Atlas" by S. Downing (published by Image Premastering Services Limited, Minneapolis, MN, 1991) has been incorporated into our histology course. Fifteen videodisc player stations are provided for 150 students. Images are retrieved by students using a bar code scanner attached to a videodisc player (Pioneer CLD-2400). Using this kind of image library, students can now learn basic histological structure, such as cell and tissue types, without a microscope, or use it as a tool to facilitate microscopy. A videodisc library of randomly accessible images simplifies learning the basic components of which all organs are composed by presenting the learner with clear-cut examples, avoiding confusion with other structures. However, videodisc players and TV monitors are still not appropriately priced for every student to own. This presents a problem in that the same images studied in class are not available to study and review outside
Promoting consistent use of the communication function classification system (CFCS).
Cunningham, Barbara Jane; Rosenbaum, Peter; Hidecker, Mary Jo Cooley
2016-01-01
We developed a Knowledge Translation (KT) intervention to standardize the way speech-language pathologists working in Ontario Canada's Preschool Speech and Language Program (PSLP) used the Communication Function Classification System (CFCS). This tool was being used as part of a provincial program evaluation, and standardizing its use was critical for establishing reliability and validity within the provincial dataset. Two theoretical foundations, Diffusion of Innovations and the Communication Persuasion Matrix, were used to develop and disseminate the intervention to standardize use of the CFCS among a cohort of speech-language pathologists. A descriptive pre-test/post-test study was used to evaluate the intervention. Fifty-two participants completed an electronic pre-test survey, reviewed intervention materials online, and then immediately completed an electronic post-test survey. The intervention improved clinicians' understanding of how the CFCS should be used, their intentions to use the tool in the standardized way, and their abilities to make correct classifications using the tool. Findings from this work will be shared with representatives of the Ontario PSLP. The intervention may be disseminated to all speech-language pathologists working in the program. This study can be used as a model for developing and disseminating KT interventions for clinicians in paediatric rehabilitation. The Communication Function Classification System (CFCS) is a new tool that allows speech-language pathologists to classify children's skills into five meaningful levels of function. There is uncertainty and inconsistent practice in the field about the methods for using this tool. This study combined two theoretical frameworks to develop an intervention to standardize use of the CFCS among a cohort of speech-language pathologists. The intervention effectively increased clinicians' understanding of the methods for using the CFCS, ability to make correct classifications, and
1987-01-01
with non-emotional material . . . . P5. Students who are able to choose from a 'menu' of topics to provide the general context of the exercise...smaller version of the videodisc encoded digitally and capable of storing vast numbers of still frames and text files, presents yet another opportunity for...37. En el restaurante, Ramiro pide . a. chorizo y tinto. b. sardinas y vino. c. tortilla y vino. 38. Cuando está comiendo en el restaurante, Ramiro
Development of low-cost digital subtraction angiography system
International Nuclear Information System (INIS)
Ando, Yutaka; Kobayashi, Takeshi; Imai, Yutaka; Yagishita, Akira; Kunieda, Etsuo.
1983-01-01
We developed a simple and low-cost DSA system. The system consists of a conventional fluoroscopic unit for the GI tract and a mini-computer (GAMMA-11), connected to each other through a video-disc recorder. The unique features of our system are: 1. low cost, 2. low radiation dose, 3. off-line processing, 4. flexibility of software. Analysis of the time-density curve and image processing will bring more useful information than DSA alone. (author)
DEFF Research Database (Denmark)
Toldbod, Thomas; Israelsen, Poul
2014-01-01
Companies rely on multiple Management Control Systems to obtain their short- and long-term objectives. When applying a multifaceted perspective on Management Control Systems, the concept of internal consistency has been found to be important in obtaining goal congruency in the company. However, to d...... management is aware of this shortcoming they use the cybernetic controls more interactively to overcome it, whereby the cybernetic controls are also used as a learning platform and not just for performance control.
Branck, Charles E.; And Others
1987-01-01
This study of 87 veterinary medical students at Auburn University tests the effectiveness and student acceptance of interactive videodisc as an alternative to animal experimentation and other traditional teaching methods in analyzing canine cardiovascular sounds. Results of the questionnaire used are presented, and benefits of interactive video…
Techniques for Reducing Consistency-Related Communication in Distributed Shared Memory System
Zwaenepoel, W; Bennett, J.K.; Carter, J.B.
1995-01-01
Distributed shared memory (DSM) is an abstraction of shared memory on a distributed memory machine. Hardware DSM systems support this abstraction at the architecture level; software DSM systems support the abstraction within the runtime system. One of the key problems in building an efficient software DSM system is to reduce the amount of communication needed to keep the distributed memories consistent. In this paper we present four techniques for doing so: 1) software release consistency; 2)...
Solution of degenerate hypergeometric system of Horn consisting of three equations
Tasmambetov, Zhaksylyk N.; Zhakhina, Ryskul U.
2017-09-01
The possibilities of constructing normal-regular solutions of a system consisting of three second-order partial differential equations are studied by the Frobenius-Latysheva method. The method of determining the unknown coefficients is shown, and the relationship of the studied system to the system whose solution is Laguerre's polynomial of three variables is indicated. Generalizing the Frobenius-Latysheva method to the case of a system consisting of three equations makes it possible to clarify the relationships among such systems whose solutions are special functions of three variables. These include the functions of Whittaker and Bessel, 205 special functions of three variables from the list of M. Srivastava and P.W. Carlsson, as well as orthogonal polynomials of three variables. All this contributes to the further development of the analytic theory of systems consisting of three second-order partial differential equations.
Vujačić, Ivan; Dattner, Itai
In this paper we use the sieve framework to prove consistency of the ‘direct integral estimator’ of parameters for partially observed systems of ordinary differential equations, which are commonly used for modeling dynamic processes.
Generation of static solutions of the self-consistent system of Einstein-Maxwell equations
International Nuclear Information System (INIS)
Anchikov, A.M.; Daishev, R.A.
1988-01-01
A theorem is proved according to which to each solution of the Einstein equations with an arbitrary energy-momentum tensor on the right-hand side there corresponds a static solution of the self-consistent system of Einstein-Maxwell equations. As a consequence of this theorem, a method is established for generating static solutions of the self-consistent system of Einstein-Maxwell equations, with a charged grain as a source, from vacuum solutions of the Einstein equations.
Method used to test the imaging consistency of binocular camera's left-right optical system
Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui
2016-09-01
For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of binocular optical imaging systems, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera has been established. First, the left and right optical systems capture images at normal exposure time under the same conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of the contour lines near the pseudo-contour line. Third, a gray-level constraint based on the corresponding coordinates of the left and right images is established, and imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging consistency testing of binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems of the binocular camera does not exceed 5%, the design requirements are considered to have been achieved. This method is effective and paves the way for imaging consistency testing of binocular cameras.
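The evaluation step can be sketched numerically (a minimal illustration with NumPy; the image sizes, gray levels and noise model are assumptions, and the 5% criterion is interpreted here against an assumed 8-bit full scale of 255):

```python
import numpy as np

def consistency_sigma(left, right):
    """Standard deviation of the grayscale difference D(x, y)."""
    d = left.astype(float) - right.astype(float)
    return float(np.std(d))

# Hypothetical test images: the right channel differs from the left
# by small zero-mean noise (a stand-in for real captures of the
# integrating-sphere target).
rng = np.random.default_rng(0)
left = rng.uniform(100.0, 200.0, size=(64, 64))
right = left + rng.normal(0.0, 2.0, size=(64, 64))

sigma = consistency_sigma(left, right)
# Acceptance criterion in the spirit of the paper: 3*sigma within 5%
# of the (assumed) 8-bit full scale.
passes = 3.0 * sigma <= 0.05 * 255.0
```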
Consistency properties of chaotic systems driven by time-delayed feedback
Jüngling, T.; Soriano, M. C.; Oliver, N.; Porte, X.; Fischer, I.
2018-04-01
Consistency refers to the property of an externally driven dynamical system to respond in similar ways to similar inputs. In a delay system, the delayed feedback can be considered as an external drive to the undelayed subsystem. We analyze the degree of consistency in a generic chaotic system with delayed feedback by means of the auxiliary system approach. In this scheme an identical copy of the nonlinear node is driven by exactly the same signal as the original, allowing us to verify complete consistency via complete synchronization. In the past, the phenomenon of synchronization in delay-coupled chaotic systems has been widely studied using correlation functions. Here, we analytically derive relationships between characteristic signatures of the correlation functions in such systems and unequivocally relate them to the degree of consistency. The analytical framework is illustrated and supported by numerical calculations of the logistic map with delayed feedback for different replica configurations. We further apply the formalism to time series from an experiment based on a semiconductor laser with a double fiber-optical feedback loop. The experiment constitutes a high-quality replica scheme for studying consistency of the delay-driven laser and confirms the general theoretical results.
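The auxiliary system approach described above can be sketched numerically (the parameters r, eps and tau are illustrative choices, not taken from the paper): a replica of the logistic map is driven by exactly the same delayed signal as the original, and complete consistency shows up as the synchronization error shrinking to zero.

```python
# Auxiliary-system check of consistency for a logistic map with delayed
# feedback (illustrative parameters, not from the paper).
def f(x, r=3.9):
    return r * x * (1.0 - x)

eps, tau, n_steps = 0.9, 5, 400     # feedback strength, delay, iterations
hist = [0.3] * (tau + 1)            # sliding window: x[n-tau] .. x[n]
y = 0.6                             # replica starts from a different state
diff = []
for _ in range(n_steps):
    drive = hist[0]                             # common drive x[n - tau]
    x_next = (1 - eps) * f(hist[-1]) + eps * drive
    y = (1 - eps) * f(y) + eps * drive          # replica, identical drive
    hist.append(x_next)
    hist.pop(0)
    diff.append(abs(hist[-1] - y))
# The error contracts by at most (1 - eps) * r per step (0.39 here), so
# the replica synchronizes completely: complete consistency.
```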
Bowman, Kaye; McKenna, Suzy
2016-01-01
This occasional paper provides an overview of the development of Australia's national training system and is a key knowledge document of a wider research project "Consistency with flexibility in the Australian national training system." This research project investigates the various approaches undertaken by each of the jurisdictions to…
A consistent description of kinetics and hydrodynamics of quantum Bose-systems
Directory of Open Access Journals (Sweden)
P.A.Hlushak
2004-01-01
A consistent approach to the description of the kinetics and hydrodynamics of many-boson systems is proposed. The generalized transport equations for strongly and weakly nonequilibrium Bose systems are obtained using the method of the nonequilibrium statistical operator of D.N. Zubarev. New equations for the time distribution function of the quantum Bose system, with separate contributions from the kinetic and potential energies of particle interactions, are obtained. The generalized transport coefficients are determined, providing a consistent description of kinetic and hydrodynamic processes.
Multi-component nuclear energy system to meet requirement of self-consistency
International Nuclear Information System (INIS)
Saito, Masaki; Artisyuk, Vladimir; Shmelev, Anotolii; Korovin, Yorii
2000-01-01
Environmental harmonization of nuclear energy technology is considered an absolutely necessary condition for its future successful development for peaceful use. Establishment of a Self-Consistent Nuclear Energy System, which simultaneously meets four requirements - energy production, fuel production, burning of radionuclides and safety - relies strongly on neutron excess generation. Implementation of external, non-fission-based neutron sources in a fission energy system would open the possibility of approaching a Multicomponent Self-Consistent Nuclear Energy System with unlimited fuel resources, zero radioactivity release and high protection against uncontrolled proliferation of nuclear materials. (author)
A proposed grading system for standardizing tumor consistency of intracranial meningiomas.
Zada, Gabriel; Yashar, Parham; Robison, Aaron; Winer, Jesse; Khalessi, Alexander; Mack, William J; Giannotta, Steven L
2013-12-01
Tumor consistency plays an important and underrecognized role in the surgeon's ability to resect meningiomas, especially with evolving trends toward minimally invasive and keyhole surgical approaches. Aside from descriptors such as "hard" or "soft," no objective criteria exist for grading, studying, and conveying the consistency of meningiomas. The authors designed a practical 5-point scale for intraoperative grading of meningiomas based on the surgeon's ability to internally debulk the tumor and on the subsequent resistance to folding of the tumor capsule. Tumor consistency grades and features are as follows: 1) extremely soft tumor, internal debulking with suction only; 2) soft tumor, internal debulking mostly with suction, and remaining fibrous strands resected with easily folded capsule; 3) average consistency, tumor cannot be freely suctioned and requires mechanical debulking, and the capsule then folds with relative ease; 4) firm tumor, high degree of mechanical debulking required, and capsule remains difficult to fold; and 5) extremely firm, calcified tumor, approaches density of bone, and capsule does not fold. Additional grading categories included tumor heterogeneity (with minimum and maximum consistency scores) and a 3-point vascularity score. This grading system was prospectively assessed in 50 consecutive patients undergoing craniotomy for meningioma resection by 2 surgeons in an independent fashion. Grading scores were subjected to a linear weighted kappa analysis for interuser reliability. Fifty patients (100 scores) were included in the analysis. The mean maximal tumor diameter was 4.3 cm. The distribution of overall tumor consistency scores was as follows: Grade 1, 4%; Grade 2, 9%; Grade 3, 43%; Grade 4, 44%; and Grade 5, 0%. Regions of Grade 5 consistency were reported only focally in 14% of heterogeneous tumors. Tumors were designated as homogeneous in 68% and heterogeneous in 32% of grades. The kappa analysis score for overall tumor consistency
Non-linear phenomena in electronic systems consisting of coupled single-electron oscillators
International Nuclear Information System (INIS)
Kikombo, Andrew Kilinga; Hirose, Tetsuya; Asai, Tetsuya; Amemiya, Yoshihito
2008-01-01
This paper describes the non-linear dynamics of electronic systems consisting of single-electron oscillators. A single-electron oscillator is a circuit made up of a tunneling junction and a resistor, and produces simple relaxation oscillation. Coupled with one another, single-electron oscillators exhibit complex behavior described by a combination of continuous differential equations and discrete difference equations. Computer simulation shows that a double-oscillator system consisting of two coupled oscillators produces multi-periodic oscillation with a single attractor, and that a quadruple-oscillator system consisting of four oscillators also produces multi-periodic oscillation but has a number of possible attractors and settles into one of them depending on the initial conditions
Generation of static solutions of self-consistent system of Einstein-Maxwell equations
International Nuclear Information System (INIS)
Anchikov, A.M.; Daishev, R.A.
1988-01-01
A theorem is proved according to which to each static solution of the Einstein equations with an arbitrary energy-momentum tensor on the right-hand side there corresponds a static solution of the self-consistent system of Einstein-Maxwell equations. As a consequence of this theorem, a way of generating static solutions of the self-consistent system of Einstein-Maxwell equations, with charged dust as a source, from vacuum solutions of the Einstein equations is shown
Self-consistent cluster theory for systems with off-diagonal disorder
International Nuclear Information System (INIS)
Kaplan, T.; Leath, P.L.; Gray, L.J.; Diehl, H.W.
1980-01-01
A self-consistent cluster theory for elementary excitations in systems with diagonal, off-diagonal, and environmental disorder is presented. The theory is developed in augmented space where the configurational average over the disorder is replaced by a ground-state matrix element in a translationally invariant system. The analyticity of the resulting approximate Green's function is proved. Numerical results for the self-consistent single-site and pair approximations are presented for the vibrational and electronic properties of disordered linear chains with diagonal, off-diagonal, and environmental disorder
Energy Technology Data Exchange (ETDEWEB)
Myrzakulov, R.; Mamyrbekova, G.K.; Nugmanova, G.N.; Yesmakhanova, K.R. [Eurasian International Center for Theoretical Physics and Department of General and Theoretical Physics, Eurasian National University, Astana 010008 (Kazakhstan); Lakshmanan, M., E-mail: lakshman@cnld.bdu.ac.in [Centre for Nonlinear Dynamics, School of Physics, Bharathidasan University, Tiruchirapalli 620 024 (India)
2014-06-13
Motion of curves and surfaces in R{sup 3} leads to nonlinear evolution equations which are often integrable. They are also intimately connected to the dynamics of spin chains in the continuum limit and integrable soliton systems through geometric and gauge symmetric connections/equivalence. Here we point out the fact that a more general situation in which the curves evolve in the presence of additional self-consistent vector potentials can lead to interesting generalized spin systems with self-consistent potentials or soliton equations with self-consistent potentials. We obtain the general form of the evolution equations of the underlying curves and report specific examples of generalized spin chains and soliton equations. These include the principal chiral model and various Myrzakulov spin equations in (1+1) dimensions and their geometrically equivalent generalized nonlinear Schrödinger (NLS) family of equations, including Hirota–Maxwell–Bloch equations, all in the presence of self-consistent potential fields. The associated gauge equivalent Lax pairs are also presented to confirm their integrability. - Highlights: • Geometry of continuum spin chain with self-consistent potentials explored. • Mapping on moving space curves in R{sup 3} in the presence of potential fields carried out. • Equivalent generalized nonlinear Schrödinger (NLS) family of equations identified. • Integrability of identified nonlinear systems proved by deducing appropriate Lax pairs.
On dynamically consistent Jacobian inverse for non-holonomic robotic systems
Directory of Open Access Journals (Sweden)
Ratajczak Joanna
2017-12-01
This paper presents the dynamically consistent Jacobian inverse for non-holonomic robotic systems, and its application to solving the motion planning problem. The system's kinematics are represented by a driftless control system and defined in terms of its input-output map, in accordance with the endogenous configuration space approach. The dynamically consistent Jacobian inverse (DCJI) is introduced by means of a Riemannian metric in the endogenous configuration space, exploiting the reduced inertia matrix of the system's dynamics. The consistency condition is formulated as the commutativity property of a diagram of maps. Singular configurations of DCJI are studied and shown to coincide with the kinematic singularities. A parametric form of DCJI is derived and used to solve example motion planning problems for the trident snake mobile robot. Some advantages in performance of DCJI in comparison to the Jacobian pseudoinverse are discovered.
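The idea of weighting the Jacobian inverse by the system's inertia can be sketched with the standard operational-space form J# = M⁻¹Jᵀ(J M⁻¹ Jᵀ)⁻¹ (an illustrative assumption; the paper's endogenous configuration space construction is more general, and the J and M below are hypothetical):

```python
import numpy as np

def dc_jacobian_inverse(J, M):
    """Inertia-weighted right inverse: J# = M^-1 J^T (J M^-1 J^T)^-1."""
    Minv = np.linalg.inv(M)
    return Minv @ J.T @ np.linalg.inv(J @ Minv @ J.T)

J = np.array([[1.0, 0.5]])    # hypothetical 1x2 task Jacobian
M = np.diag([2.0, 1.0])       # hypothetical reduced inertia matrix
Jsharp = dc_jacobian_inverse(J, M)
# Right-inverse property: J @ Jsharp equals the identity on task space,
# while joint motion is distributed according to the inertia weighting.
```

Unlike the plain Moore-Penrose pseudoinverse, this inverse minimizes the inertia-weighted norm of the joint velocity, which is what makes it "dynamically consistent".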
Consistent adoption of the International System of Units (SI) in nuclear science and technology
Energy Technology Data Exchange (ETDEWEB)
Klumpar, J; Kovar, Z [Ceskoslovenska Akademie Ved, Prague. Laborator Radiologicke Dozimetrie; Sacha, J [Slovenska Akademia Vied, Bratislava (Czechoslovakia). Fyzikalny Ustav
1975-11-01
The principles behind a consistent introduction of the International System of Units (SI) in Czechoslovakia are stressed, in compliance with the latest edition of the Czechoslovak Standard CSN 01 1300 on the prescribed system of national and international units. The use of special and auxiliary units in nuclear physics and technology is discussed, with particular attention devoted to the units of activity and to the time units applied in radiology. A conversion graph and tables are annexed.
Discretizing LTI Descriptor (Regular) Differential Input Systems with Consistent Initial Conditions
Directory of Open Access Journals (Sweden)
Athanasios D. Karageorgos
2010-01-01
A technique for efficiently discretizing the solution of a linear descriptor (regular) differential input system with consistent initial conditions and time-invariant coefficients (LTI) is introduced and fully discussed. Additionally, an upper bound for the error ‖x̄(kT) − x̄_k‖ that derives from the discretization procedure is provided. Practically speaking, we are interested in this kind of system, since such systems are inherent in many physical, economic and engineering phenomena.
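For the special case of an ordinary (non-descriptor) LTI system x′ = Ax + Bu, exact zero-order-hold discretization via a single augmented matrix exponential can be sketched as follows (an illustrative simplification; the paper treats the more general descriptor case with consistent initial conditions):

```python
import numpy as np

def expm_series(M, terms=30):
    """Matrix exponential by Taylor series (adequate for small ||M||)."""
    E = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k
        E = E + term
    return E

def zoh_discretize(A, B, T):
    """Zero-order-hold discretization of x' = A x + B u with period T."""
    n, m = A.shape[0], B.shape[1]
    M = np.zeros((n + m, n + m))
    M[:n, :n] = A
    M[:n, n:] = B
    E = expm_series(M * T)
    return E[:n, :n], E[:n, n:]   # Ad, Bd with x[k+1] = Ad x[k] + Bd u[k]

# Hypothetical example: a double integrator sampled at T = 0.1.
A = np.array([[0.0, 1.0], [0.0, 0.0]])
B = np.array([[0.0], [1.0]])
Ad, Bd = zoh_discretize(A, B, 0.1)
```

For this nilpotent example the series is exact, giving Ad = [[1, 0.1], [0, 1]] and Bd = [[0.005], [0.1]], which matches the closed-form double-integrator discretization.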
Chaotic synchronization of vibrations of a coupled mechanical system consisting of a plate and beams
Directory of Open Access Journals (Sweden)
J. Awrejcewicz
In this paper a mathematical model of a mechanical system consisting of a plate and either one or two beams is derived. The obtained PDEs are reduced to ODEs and then studied mainly using the fast Fourier and wavelet transforms. A few examples of chaotic synchronization are illustrated and discussed.
Energy Technology Data Exchange (ETDEWEB)
Tingjin, Liu; Zhengjun, Sun [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
To meet the requirements of nuclear engineering, and especially of nuclear fusion reactors, the data in the major evaluated libraries are now given not only for natural elements but also for their isotopes. Inconsistency between element and isotope data is one of the main problems in present evaluated neutron libraries. Formulas for adjusting the data to satisfy the two kinds of consistency relationships simultaneously were derived by means of the least squares method, and the program system CABEI was developed. The program was tested by calculating the Fe data in CENDL-2.1. The results show that the adjusted values satisfy the two kinds of consistency relationships.
Bindoff, I; Stafford, A; Peterson, G; Kang, B H; Tenni, P
2012-08-01
Drug-related problems (DRPs) are of serious concern worldwide, particularly for the elderly who often take many medications simultaneously. Medication reviews have been demonstrated to improve medication usage, leading to reductions in DRPs and potential savings in healthcare costs. However, medication reviews are not always of a consistently high standard, and there is often room for improvement in the quality of their findings. Our aim was to produce computerized intelligent decision support software that can improve the consistency and quality of medication review reports, by helping to ensure that DRPs relevant to a patient are overlooked less frequently. A system that largely achieved this goal was previously published, but refinements have been made. This paper examines the results of both the earlier and newer systems. Two prototype multiple-classification ripple-down rules medication review systems were built, the second being a refinement of the first. Each of the systems was trained incrementally using a human medication review expert. The resultant knowledge bases were analysed and compared, showing factors such as accuracy, time taken to train, and potential errors avoided. The two systems performed well, achieving accuracies of approximately 80% and 90%, after being trained on only a small number of cases (126 and 244 cases, respectively). Through analysis of the available data, it was estimated that without the system intervening, the expert training the first prototype would have missed approximately 36% of potentially relevant DRPs, and the second 43%. However, the system appeared to prevent the majority of these potential expert errors by correctly identifying the DRPs for them, leaving only an estimated 8% error rate for the first expert and 4% for the second. These intelligent decision support systems have shown a clear potential to substantially improve the quality and consistency of medication reviews, which should in turn translate into
Holzinger, Andreas; Stickel, Christian; Fassold, Markus; Ebner, Martin
Interface consistency is an important basic concept in web design and affects the performance and satisfaction of end users. Consistency also has significant effects on the learning performance of both expert and novice end users. Consequently, evaluating consistency within an e-learning system, and then eradicating irritating discrepancies in the user interface redesign, is a big issue. In this paper, we report on our experiences with the Shadow Expert Technique (SET) during the evaluation of the consistency of the user interface of a large university learning management system. The main objective of this new usability evaluation method is to understand the interaction processes of end users with a specific system interface. Two teams of usability experts worked independently from each other in order to maximize the objectivity of the results. The outcome of the SET method is a list of recommended changes to improve the user interaction processes and hence facilitate high consistency.
RELIABILITY ASSESSMENT OF ENTROPY METHOD FOR SYSTEM CONSISTED OF IDENTICAL EXPONENTIAL UNITS
Institute of Scientific and Technical Information of China (English)
Sun Youchao; Shi Jun
2004-01-01
Reliability assessment across the two adjacent levels of unit and system is the most important content in the multi-level reliability synthesis of complex systems. Introducing information theory into system reliability assessment, and using the additive property of information quantity and the principle of equivalence of information quantity, an entropy method of data information conversion is presented for systems consisting of identical exponential units. The basic conversion formulae of the entropy method for unit test data are derived from the principle of information quantity equivalence. General models for entropy-method synthesis assessment of approximate lower limits of system reliability are established according to the fundamental principle of unit reliability assessment. Applications of the entropy method are discussed by way of practical examples. Compared with the traditional methods, the entropy method is found to be valid and practicable and the assessment results are very satisfactory.
Feng, Lian-Li; Tian, Shou-Fu; Zhang, Tian-Tian; Zhou, Jun
2017-07-01
Under investigation in this paper is the variant Boussinesq system, which describes the propagation of long surface waves in two directions in a certain deep trough. With the help of the truncated Painlevé expansion, we construct its nonlocal symmetry, Bäcklund transformation, and Schwarzian form, respectively. The nonlocal symmetries can be localised to provide the corresponding nonlocal group, and finite symmetry transformations and similarity reductions are computed. Furthermore, we verify that the variant Boussinesq system is solvable via the consistent Riccati expansion (CRE). By considering the consistent tan-function expansion (CTE), which is a special form of the CRE, the interaction solutions between solitons and cnoidal periodic waves are explicitly studied.
Ring retroreflector system consisting of cube-corner reflectors with special coating
International Nuclear Information System (INIS)
Burmistrov, V B; Sadovnikov, M A; Sokolov, A L; Shargorodskiy, V D
2013-01-01
The ring retroreflector system (RS), consisting of cube-corner reflectors (CCRs) with a special coating on the reflecting surfaces and intended for uniaxially Earth-oriented navigation satellites, is considered. The error in distance measurement caused both by the laser pulse delay in the CCR and by its spatial position (CCR configuration) is studied. It is shown that a ring RS formed by CCRs with a double-spot radiation pattern allows the distance-measurement error to be substantially reduced.
International Nuclear Information System (INIS)
Schrader, Heinrich
2000-01-01
Calibration, in terms of activity, of the ionization-chamber secondary standard measuring systems at the PTB is described. The measurement results of a Centronic IG12/A20 chamber, a Vinten ISOCAL IV chamber and a radionuclide calibrator chamber for nuclear-medicine applications are discussed, their energy-dependent efficiency curves are established, and their consistency is checked using recently evaluated radionuclide decay data. Criteria for evaluating and transferring calibration factors (or efficiencies) are given.
Directory of Open Access Journals (Sweden)
Jan Zavadsky
2014-07-01
Purpose: The performance management system (PMS) is a metasystem over all business processes at the strategic and operational levels. The effectiveness of the various management systems depends on many factors, one of which is the consistent definition of each system's elements. The main purpose of this study is to explore whether the performance management systems of the sample companies are consistent and how companies can create such a system. Consistency here is based on a homogeneous definition of the attributes of the performance indicator, the basic element of a PMS. Methodology: At the beginning, we used an affinity diagram that helped us to clarify and group the various attributes of performance indicators. The main research results were obtained through an empirical study carried out on a sample of Slovak companies. The criterion for selection was the existence of a certified management system according to ISO 9001. Representativeness of the sample companies was confirmed by application of Pearson's chi-squared (χ²) test against the above standards. Findings: From a review of the literature, we defined four groups of attributes relating to the performance indicator: formal attributes, attributes of the target value, informational attributes and attributes of evaluation. The whole set contains 21 attributes. The consistency of a PMS is based not on a maximum or minimum number of attributes, but on the same types of attributes for each performance indicator used in the PMS at both the operational and strategic levels. The main findings are: companies use various financial and non-financial indicators at the strategic and operational levels; companies determine various attributes of performance indicators, but most performance indicators are defined differently across companies; we identified the attributes common to the whole sample of companies. Practical implications: The research results have got an implication for
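The representativeness check mentioned in the abstract can be reproduced in outline with Pearson's χ² goodness-of-fit test, here comparing a sample's industry breakdown against known population proportions (all counts and proportions below are illustrative, not figures from the study):

```python
from scipy.stats import chisquare

# Illustrative counts: sample companies per industry, versus counts
# expected from hypothetical population proportions.
observed = [34, 21, 15, 10]
population_share = [0.40, 0.27, 0.20, 0.13]
n = sum(observed)
expected = [p * n for p in population_share]

stat, p_value = chisquare(observed, f_exp=expected)
# A large p-value means there is no evidence the sample deviates from
# the population structure, i.e. representativeness is not rejected.
```

Note that `chisquare` requires the observed and expected totals to match, which holds here because the shares sum to 1.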
Boson system with finite repulsive interaction: self-consistent field method
International Nuclear Information System (INIS)
Renatino, M.M.B.
1983-01-01
Some static properties of a boson system at T = 0 K under the action of a repulsive potential are studied. For the repulsive potential, a model was adopted consisting of a region where it is constant (r < r_c) and a 1/r decay for r > r_c. The self-consistent field approximation used takes short-range correlations into account through a local field correction, which leads to an effective field. The static structure factor S(q) and the effective potential ψ(q) are obtained through a self-consistent calculation. The pair-correlation function g(r) and the energy of the collective excitations E(q) are also obtained from the structure factor. The density of the system and the parameters of the repulsive potential, that is, its height and the size of the constant region, are used as variables of the problem. The results obtained for S(q), g(r) and E(q) at a fixed ratio r_0/r_c and variable λ indicate the emergence of structure in the system, which becomes more noticeable as the potential becomes more repulsive. (author)
Synchronization in nodes of complex networks consisting of complex chaotic systems
Energy Technology Data Exchange (ETDEWEB)
Wei, Qiang, E-mail: qiangweibeihua@163.com [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Faculty of Electronic Information and Electrical Engineering, Dalian University of Technology, Dalian, 116024 (China); Xie, Cheng-jun [Beihua University computer and technology College, BeiHua University, Jilin, 132021, Jilin (China); Digital Images Processing Institute of Beihua University, BeiHua University, Jilin, 132011, Jilin (China); Liu, Hong-jun [School of Information Engineering, Weifang Vocational College, Weifang, 261041 (China); Li, Yan-hui [The Library, Weifang Vocational College, Weifang, 261041 (China)
2014-07-15
A new synchronization method is investigated for nodes of complex networks consisting of complex chaotic systems. When the complex networks achieve synchronization, different components of the complex state variable synchronize up to different scaling complex functions via a designed complex feedback controller. This paper extends the synchronization scaling function from the real field to the complex field for synchronization in nodes of complex networks with complex chaotic systems. Synchronization in complex networks with constant delay and with time-varying coupling delay is investigated, respectively. Numerical simulations are provided to show the effectiveness of the proposed method.
The self-consistent field model for Fermi systems with account of three-body interactions
Directory of Open Access Journals (Sweden)
Yu.M. Poluektov
2015-12-01
On the basis of a microscopic self-consistent field model, the thermodynamics of a many-particle Fermi system at finite temperatures with account of three-body interactions is constructed, and the quasiparticle equations of motion are obtained. It is shown that a delta-like three-body interaction gives no contribution to the self-consistent field, so the description of three-body forces requires their nonlocality to be taken into account. The spatially uniform system is considered in detail, and on the basis of the developed microscopic approach general formulas are derived for the fermion effective mass and the system's equation of state with the contribution of three-body forces taken into account. The effective mass and pressure are numerically calculated for a potential of the "semi-transparent sphere" type at zero temperature. Expansions of the effective mass and pressure in powers of density are obtained. It is shown that, when only pair forces are taken into account, a repulsive interaction reduces the quasiparticle effective mass relative to the mass of a free particle, while an attractive interaction raises the effective mass. The question of thermodynamic stability of the Fermi system is considered, and the three-body repulsive interaction is shown to extend the region of stability of a system with interparticle pair attraction. The quasiparticle energy spectrum is calculated with account of three-body forces.
International Nuclear Information System (INIS)
Hamm, L.L.; Van Brunt, V.
1982-08-01
A comparison of implicit Runge-Kutta and orthogonal collocation methods is made for the numerical solution of the ordinary differential equation that describes the high-pressure vapor-liquid equilibria of a binary system. The systems of interest are limited to binary solubility systems in which one of the components is supercritical and exists as a noncondensable gas in the pure state. Of the two methods, this paper presents preliminary, though not necessarily conclusive, evidence that the implicit Runge-Kutta method is superior for the solution of the ordinary differential equation used in thermodynamic consistency testing of binary solubility systems. Because of the extreme nonlinearity of thermodynamic properties in the region near the critical locus, an extended cubic-spline fitting technique is devised for correlating the P-x data. The least-squares criterion is employed in smoothing the experimental data. Although the derivation is presented specifically for the correlation of P-x data, the technique could easily be applied to any thermodynamic data by changing the endpoint requirements. The volumetric behavior of the systems must be given or predicted in order to perform thermodynamic consistency tests. A general procedure is developed for predicting the required volumetric behavior, and some indication of the expected limit of accuracy is given.
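The least-squares spline smoothing step can be sketched with a generic cubic smoothing spline. The synthetic P-x data below are illustrative and the plain smoothing spline is a stand-in for the report's endpoint-constrained technique, which it does not reproduce:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Synthetic isothermal P-x data (illustrative, not from the report):
# pressure rising steeply toward the supercritical-solute end.
x = np.linspace(0.0, 0.3, 25)                 # solute mole fraction
p_true = 1.0 + 40.0 * x + 300.0 * x ** 3      # bar, smooth "truth"
rng = np.random.default_rng(1)
p_obs = p_true + rng.normal(scale=0.05, size=x.size)

# Cubic smoothing spline: s budgets the sum of squared residuals,
# implementing the least-squares smoothing criterion.
spl = UnivariateSpline(x, p_obs, k=3, s=x.size * 0.05 ** 2)
dp_dx = spl.derivative()(x)   # slope dP/dx, the quantity fed into the
                              # consistency-test ODE
```

The smoothed derivative is what matters: the consistency-test ODE is driven by dP/dx, so noise amplification by naive differencing is exactly what the spline fit avoids.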
McClelland, James L
2013-11-01
The complementary learning systems theory of the roles of hippocampus and neocortex (McClelland, McNaughton, & O'Reilly, 1995) holds that the rapid integration of arbitrary new information into neocortical structures is avoided to prevent catastrophic interference with structured knowledge representations stored in synaptic connections among neocortical neurons. Recent studies (Tse et al., 2007, 2011) showed that neocortical circuits can rapidly acquire new associations that are consistent with prior knowledge. These findings challenge the complementary learning systems theory as previously presented. However, new simulations extending those reported in McClelland et al. (1995) show that new information consistent with knowledge previously acquired by a putatively cortex-like artificial neural network can be learned rapidly and without interfering with existing knowledge; it is when inconsistent new knowledge is acquired quickly that catastrophic interference ensues. Several important features of the findings of Tse et al. (2007, 2011) are captured in these simulations, indicating that the neural network model used in McClelland et al. has characteristics in common with neocortical learning mechanisms. An additional simulation generalizes beyond the network model previously used, showing how the rate of change of cortical connections can depend on prior knowledge in an arguably more biologically plausible network architecture. In sum, the findings of Tse et al. are fully consistent with the idea that hippocampus and neocortex are complementary learning systems. Taken together, these findings and the simulations reported here advance our knowledge by bringing out the role of the consistency of new experience with existing knowledge and by demonstrating that the rate of change of connections in real and artificial neural networks can be strongly prior-knowledge dependent.
Design of micro distribution systems consisting of long channels with arbitrary cross sections
International Nuclear Information System (INIS)
Misdanitis, S; Valougeorgis, D
2012-01-01
Gas flows through long micro-channels of various cross sections have been extensively investigated over the years, both numerically and experimentally. In various technological applications, including microfluidics, these micro-channels are combined to form a micro-channel network. Computational algorithms for solving gas pipe networks in the hydrodynamic regime are well developed; however, corresponding tools for solving networks consisting of micro-channels under any degree of gas rarefaction are very limited. Recently, a kinetic algorithm was developed to simulate gas distribution systems consisting of long circular channels under any vacuum conditions. In the present work this algorithm is generalized and extended to micro-channels of arbitrary cross section etched by KOH in silicon (triangular and trapezoidal channels with an acute angle of 54.74°). Since a kinetic approach is implemented, the analysis is valid and the results are accurate over the whole range of the Knudsen number, while the computational effort involved is very small. This is achieved by integrating the kinetic results for the corresponding single channels into the general solver for designing the gas pipe network. To demonstrate the feasibility of the approach, two typical systems consisting of long rectangular and trapezoidal micro-channels are solved.
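The network step can be sketched as a standard nodal solve: given per-channel conductances (which kinetic single-channel solutions would supply as functions of rarefaction), mass conservation at each junction yields a linear system for the node pressures. This is a hedged, hydrodynamic-style sketch with made-up conductances; the paper's kinetic coupling is more involved:

```python
import numpy as np

# Nodes: 0 = inlet (fixed pressure), 3 = outlet (fixed), 1-2 internal.
# Channels as (node_a, node_b, conductance G): flow = G * (p_a - p_b).
channels = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 0.5), (2, 3, 1.5)]
p_fixed = {0: 100.0, 3: 10.0}
internal = [1, 2]

# Assemble mass conservation, sum_j G_ij * (p_i - p_j) = 0, at each
# internal node i; fixed-pressure neighbors move to the right-hand side.
idx = {n: k for k, n in enumerate(internal)}
A = np.zeros((len(internal), len(internal)))
b = np.zeros(len(internal))
for node_a, node_b, g in channels:
    for i, j in ((node_a, node_b), (node_b, node_a)):
        if i in idx:
            A[idx[i], idx[i]] += g
            if j in idx:
                A[idx[i], idx[j]] -= g
            else:
                b[idx[i]] += g * p_fixed[j]

p_internal = np.linalg.solve(A, b)  # pressures at nodes 1 and 2
```

In the rarefied case the only change in this picture is that each G would itself depend on the local Knudsen number, so the solve becomes iterative.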
International Nuclear Information System (INIS)
Tsventoukh, M. M.
2010-01-01
A study is made of the convective (interchange, or flute) plasma stability consistent with equilibrium in magnetic confinement systems with a magnetic field decreasing outward and a large curvature of the magnetic field lines. Algorithms are developed that calculate convective plasma stability from the Kruskal-Oberman kinetic criterion and in which the convective stability is iteratively made consistent with MHD equilibrium for a given pressure and a given type of anisotropy in the actual magnetic geometry. Vacuum and equilibrium convectively stable configurations in systems with a decreasing, highly curved magnetic field are calculated. It is shown that, in convectively stable equilibrium, the possibility of achieving high plasma pressures in the central region is restricted either by the expansion of the separatrix (when there are large regions of weak magnetic field) or by the filamentation of the gradient plasma current (when there are small regions of weak magnetic field, in which case the pressure drops mainly near the separatrix). It is found that, from the standpoint of equilibrium and of the onset of nonpotential ballooning modes, a kinetic description of convective stability yields better plasma confinement parameters in systems with a decreasing, highly curved magnetic field than a simpler MHD model and makes it possible to substantially improve the confinement parameters for a given type of anisotropy. For the Magnetor experimental compact device, the maximum central pressure consistent with equilibrium and stability is calculated to be as high as β ∼ 30%. It is shown that, for the anisotropy of the distribution function that is typical of a background ECR plasma, the limiting pressure gradient is about two times steeper than that for an isotropic plasma. From a practical point of view, the possibility is demonstrated of achieving better confinement parameters of a hot collisionless plasma in systems with a decreasing, highly curved magnetic field than those
International Nuclear Information System (INIS)
Ane, J.M.; Grandgirard, V.; Albajar, F.; Johner, J.
2001-01-01
A consistent and simple approach to deriving plasma scenarios for next-step tokamak design is presented. It is based on successive plasma equilibrium snapshots from plasma breakdown to the end of ramp-down. Temperature and density profiles for each equilibrium are derived from a 2D plasma model. The time interval between two successive equilibria is then computed from the toroidal-field magnetic energy balance, the resistive term of which depends on the n and T profiles. This approach provides a consistent analysis of plasma performance, flux consumption and the PF system, including average voltage waveforms across the PF coils. The plasma model and the Poynting theorem for the toroidal magnetic energy are presented. Applications to ITER-FEAT and to M2, a Q=5 machine designed at CEA, are shown. (author)
Self-consistent theory of finite Fermi systems and radii of nuclei
International Nuclear Information System (INIS)
Saperstein, E. E.; Tolokonnikov, S. V.
2011-01-01
Present-day self-consistent approaches in nuclear theory were analyzed from the point of view of describing distributions of nuclear densities. The generalized method of the energy density functional due to Fayans and his coauthors (the most successful version of the self-consistent theory of finite Fermi systems) was the first among the approaches under comparison. The second was the most successful version of the Skyrme-Hartree-Fock method with the HFB-17 functional due to Goriely and his coauthors. Charge radii of spherical nuclei were analyzed in detail. Several isotopic chains of deformed nuclei were also considered. Charge-density distributions ρ_ch(r) were calculated for several spherical nuclei and compared with model-independent data extracted from an analysis of elastic electron scattering on nuclei.
Self-consistent field theory based molecular dynamics with linear system-size scaling
Energy Technology Data Exchange (ETDEWEB)
Richters, Dorothee [Institute of Mathematics and Center for Computational Sciences, Johannes Gutenberg University Mainz, Staudinger Weg 9, D-55128 Mainz (Germany); Kühne, Thomas D., E-mail: kuehne@uni-mainz.de [Institute of Physical Chemistry and Center for Computational Sciences, Johannes Gutenberg University Mainz, Staudinger Weg 7, D-55128 Mainz (Germany); Technical and Macromolecular Chemistry, University of Paderborn, Warburger Str. 100, D-33098 Paderborn (Germany)
2014-04-07
We present an improved field-theoretic approach to the grand-canonical potential suitable for linear-scaling molecular dynamics simulations using forces from self-consistent electronic structure calculations. It is based on an exact decomposition of the grand-canonical potential for independent fermions and relies neither on the ability to localize the orbitals nor on the Hamilton operator being well-conditioned. Hence, this scheme enables highly accurate all-electron linear-scaling calculations even for metallic systems. The inherent energy drift of Born-Oppenheimer molecular dynamics simulations, arising from an incomplete convergence of the self-consistent field cycle, is circumvented by means of a properly modified Langevin equation. The predictive power of the present approach is illustrated using the example of liquid methane under extreme conditions.
On optimization of an experimental system consisting of beam guidance and nuclear detectors
International Nuclear Information System (INIS)
Lehr, H.; Hinderer, G.; Maier, K.H.
1978-02-01
This report deals with the optimization of the resolution in nuclear physics experiments with a beam of accelerated particles. The complete system, consisting of the beam handling, the nuclear reaction, and the particle detection, is described with a linear matrix formalism. This allows analytic expressions to be given for the linewidth of any physically interesting quantity, such as Q-values or the scattering angle in the center-of-mass system, as a function of beam-line, nuclear-reaction, and spectrometer parameters. From this, general prescriptions for optimizing the resolution by matching the beam handling and the detector system are derived. Explicitly treated are measurements of Q-values and CM scattering angles with an energy-sensitive detector, a time-of-flight spectrometer, and a magnetic spectrometer. (orig.)
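The linear matrix formalism referred to above can be illustrated with the simplest case: 2×2 transport matrices in one transverse plane, where a drift-lens-drift line images the source when the (1,2) element of the product vanishes. This is a thin-lens sketch with illustrative numbers, not the report's full beam-line and spectrometer treatment:

```python
import numpy as np

def drift(L):
    """Field-free drift of length L: x -> x + L*x'."""
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_lens(f):
    """Thin focusing lens of focal length f: x' -> x' - x/f."""
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

f, obj = 0.5, 1.5                  # focal length, object distance (m)
img = 1.0 / (1.0 / f - 1.0 / obj)  # thin-lens imaging: 1/o + 1/i = 1/f
M = drift(img) @ thin_lens(f) @ drift(obj)

# Point-to-point imaging: M[0, 1] == 0, so the final position is
# independent of the initial angle and does not dilute the resolution;
# M[0, 0] is the magnification, -i/o for a thin lens.
```

Matching the beam line to the detector in the report's sense amounts to choosing parameters so that such off-diagonal terms (and their chromatic analogues) vanish for the measured quantity.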
General variational many-body theory with complete self-consistency for trapped bosonic systems
International Nuclear Information System (INIS)
Streltsov, Alexej I.; Alon, Ofir E.; Cederbaum, Lorenz S.
2006-01-01
In this work we develop a complete variational many-body theory for a system of N trapped bosons interacting via a general two-body potential. The many-body solution of this system is expanded over orthogonal many-body basis functions (configurations). In this theory both the many-body basis functions and the respective expansion coefficients are treated as variational parameters. The optimal variational parameters are obtained self-consistently by solving a coupled system of noneigenvalue, generally integro-differential, equations to get the one-particle functions and by diagonalizing the secular matrix problem to find the expansion coefficients. We call this theory multiconfigurational Hartree theory for bosons, or MCHB(M), where M specifies explicitly the number of one-particle functions used to construct the configurations. General rules for evaluating the matrix elements of one- and two-particle operators are derived and applied to construct the secular Hamiltonian matrix. We discuss properties of the derived equations. We show that in the limiting case of one configuration the theory boils down to the well-known Gross-Pitaevskii theory and the recently developed multi-orbital mean fields. The invariance of the complete solution with respect to unitary transformations of the one-particle functions is utilized to find the solution with the minimal number of contributing configurations. In the second part of our work we implement and apply the developed theory. It is demonstrated that for any practical computation, where the configurational space is restricted, the description of trapped bosonic systems strongly depends on the choice of the many-body basis set used, i.e., self-consistency is of great relevance. As illustrative examples we consider bosonic systems trapped in one- and two-dimensional symmetric and asymmetric double-well potentials. We demonstrate that self-consistency has great impact on the predicted physical properties of the ground and excited states.
A feasibility study on FP transmutation for Self-Consistent Nuclear Energy System (SCNES)
International Nuclear Information System (INIS)
Fujita, Reiko; Kawashima, Masatoshi; Ueda, Hiroaki; Takagi, Ryuzo; Matsuura, Haruaki; Fujii-e, Yoichi
1997-01-01
A fast-reactor core and fuel-cycle concept is discussed for the future Self-Consistent Nuclear Energy System (SCNES). The present study mainly discusses the burning capability for long-lived fission products (LLFPs) and the recycle scheme within a metallic-fuel fast-reactor cycle, aiming at the goals of fuel-breeding capability and confinement of TRU and radioactive FPs within the system. In the present paper, the burning capability for Cs-135 and Zr-93 is mainly discussed from neutronic and chemical viewpoints, assuming a metallic fuel cycle system. Recent experimental results indicate that Cs can be separated in the pyroprocess for the metal-fuel recycle system, as previously designed for a candidate fuel-cycle system. Combining neutron spectrum shifting for target subassemblies with isotope separation using tunable lasers, the LLFP burning capability is enhanced. This result indicates that the major LLFPs can be treated in additional recycle schemes to avoid LLFP accumulation during energy production. In total, the proposed fuel cycle is a candidate for realizing the SCNES concept. (author)
Simplified DFT methods for consistent structures and energies of large systems
Caldeweyher, Eike; Gerit Brandenburg, Jan
2018-05-01
Kohn–Sham density functional theory (DFT) is routinely used for the fast electronic structure computation of large systems and will most likely continue to be the method of choice for the generation of reliable geometries in the foreseeable future. Here, we present a hierarchy of simplified DFT methods designed for consistent structures and non-covalent interactions of large systems, with particular focus on molecular crystals. The covered methods are a minimal-basis-set Hartree–Fock method (HF-3c), a small-basis-set screened-exchange hybrid functional (HSE-3c), and a generalized-gradient-approximation functional evaluated in a medium-sized basis set (B97-3c), all augmented with semi-classical correction potentials. We give an overview of the methods' design, a comprehensive evaluation on established benchmark sets for geometries and lattice energies of molecular crystals, and highlight some realistic applications to large organic crystals with several hundred atoms in the primitive unit cell.
Directory of Open Access Journals (Sweden)
Hans-Jörg Rheinberger
2011-06-01
It is generally accepted that the development of the modern sciences is rooted in experiment. Yet for a long time, experimentation did not occupy a prominent role, either in philosophy or in the history of science. With the 'practical turn' in studying the sciences and their history, this has begun to change. This paper is concerned with systems and cultures of experimentation and the consistencies that are generated within such systems and cultures. The first part of the paper exposes the forms of historical and structural coherence that characterize the experimental exploration of epistemic objects. In the second part, a particular experimental culture in the life sciences is briefly described as an example. A survey is given of what it means and what it takes to analyze biological functions in the test tube.
Consistent Probabilistic Description of the Neutral Kaon System: Novel Observable Effects
Bernabeu, J.; Villanueva-Perez, P.
2013-01-01
The neutral Kaon system has both CP violation in the mass matrix and a non-vanishing lifetime difference in the width matrix. This leads to an effective Hamiltonian which is not a normal operator, with incompatible (non-commuting) masses and widths. In the Weisskopf-Wigner Approach (WWA), by diagonalizing the entire Hamiltonian, the unphysical non-orthogonal "stationary" states $K_{L,S}$ are obtained. These states have complex eigenvalues whose real (imaginary) part does not coincide with the eigenvalues of the mass (width) matrix. In this work we describe the system as an open Lindblad-type quantum mechanical system due to Kaon decays. This approach, in terms of density matrices for initial and final states, provides a consistent probabilistic description, avoiding the standard problems because the width matrix becomes a composite operator not included in the Hamiltonian. We consider the dominant-decay channel to two pions, so that one of the Kaon states with definite lifetime becomes stable. This new approa...
Neutron excess generation by fusion neutron source for self-consistency of nuclear energy system
International Nuclear Information System (INIS)
Saito, Masaki; Artisyuk, V.; Chmelev, A.
1999-01-01
Present-day fission energy technology faces the problem of transmutation of dangerous radionuclides, which requires neutron excess generation. A nuclear energy system based on fission reactors needs fuel breeding and therefore suffers from a lack of excess neutrons to support a large-scale transmutation option, including elimination of fission products. A fusion neutron source (FNS) was proposed to improve the neutron balance of the nuclear energy system. The energy associated with operating the FNS should be small enough for it to remain a neutron-excess generator, leaving the role of dominant energy producer to the fission reactors. The present paper deals with the development of a general methodology to estimate the effect of neutron excess generation by the FNS on the performance of the nuclear energy system as a whole. Multiplication of fusion neutrons in both non-fissionable and fissionable multipliers is considered. Based on the present methodology, it is concluded that neutron self-consistency with respect to fuel breeding and transmutation of fission products can be attained with a small fraction of the energy associated with the innovative fusion facilities. (author)
Self-consistent study of space-charge-dominated beams in a misaligned transport system
International Nuclear Information System (INIS)
Sing Babu, P.; Goswami, A.; Pandit, V.S.
2013-01-01
A self-consistent particle-in-cell (PIC) simulation method is developed to investigate the dynamics of space-charge-dominated beams through a misaligned solenoid-based transport system. The evolution of the beam centroid, beam envelope and emittance is studied as a function of the misalignment parameters for various types of beam distributions. Simulations performed for proton beams of up to 40 mA indicate that the centroid oscillations induced by the displacement and rotational misalignments of the solenoids do not depend on the beam distribution. It is shown that the beam envelope around the centroid is independent of the centroid motion for small centroid oscillations. In addition, we estimate the beam loss during transport caused by the misalignment for various beam distributions.
Genetic Algorithm-Based Model Order Reduction of Aeroservoelastic Systems with Consistent States
Zhu, Jin; Wang, Yi; Pant, Kapil; Suh, Peter M.; Brenner, Martin J.
2017-01-01
This paper presents a model order reduction framework to construct linear parameter-varying reduced-order models of flexible aircraft for aeroservoelasticity analysis and control synthesis in broad two-dimensional flight parameter space. Genetic algorithms are used to automatically determine physical states for reduction and to generate reduced-order models at grid points within parameter space while minimizing the trial-and-error process. In addition, balanced truncation for unstable systems is used in conjunction with the congruence transformation technique to achieve locally optimal realization and weak fulfillment of state consistency across the entire parameter space. Therefore, aeroservoelasticity reduced-order models at any flight condition can be obtained simply through model interpolation. The methodology is applied to the pitch-plant model of the X-56A Multi-Use Technology Testbed currently being tested at NASA Armstrong Flight Research Center for flutter suppression and gust load alleviation. The present studies indicate that the reduced-order model with more than 12× reduction in the number of states relative to the original model is able to accurately predict system response among all input-output channels. The genetic-algorithm-guided approach exceeds manual and empirical state selection in terms of efficiency and accuracy. The interpolated aeroservoelasticity reduced order models exhibit smooth pole transition and continuously varying gains along a set of prescribed flight conditions, which verifies consistent state representation obtained by congruence transformation. The present model order reduction framework can be used by control engineers for robust aeroservoelasticity controller synthesis and novel vehicle design.
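The model-interpolation step described above can be sketched as elementwise blending of state-space matrices at neighboring grid points, which is only meaningful once the congruence transformation has put the local reduced-order models in consistent state coordinates. This is a generic sketch; the matrices are illustrative and not from the X-56A model:

```python
import numpy as np

def interpolate_rom(rom_a, rom_b, w):
    """Linearly interpolate two ROMs, given as (A, B, C) tuples, with
    weight w in [0, 1].  Valid only if both models share the same
    (congruent) state basis, as enforced by congruence transformation."""
    return tuple((1.0 - w) * Ma + w * Mb for Ma, Mb in zip(rom_a, rom_b))

# Two illustrative stable 2-state models at neighboring flight conditions
rom1 = (np.diag([-1.0, -2.0]), np.eye(2), np.eye(2))
rom2 = (np.diag([-1.5, -3.0]), np.eye(2), np.eye(2))

# ROM at an intermediate flight condition, 40% of the way to rom2
A, B, C = interpolate_rom(rom1, rom2, w=0.4)
```

Without the consistent basis, the same elementwise blend can mix unrelated states and produce meaningless or even unstable interpolants, which is why the paper emphasizes state consistency across the parameter space.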
Choi, H. J.; Lee, S. B.; Lee, H. G.; Y Back, S.; Kim, S. H.; Kang, H. S.
2017-07-01
Several parts that comprise a large scientific device must be installed and operated at accurate three-dimensional location coordinates (X, Y, and Z), for which they are subject to survey and alignment. The locations of the aligned parts must not change, in order to ensure that the electron-beam parameters of PAL-XFEL (the X-ray Free Electron Laser of the Pohang Accelerator Laboratory: energy 10 GeV, charge 200 pC, bunch length 60 fs, emittance X/Y 0.481 μm/0.256 μm) remain stable and the machine can be operated without problems. As time goes by, however, the ground undergoes uplift and subsidence, which deforms building floors. The deformation of the ground and buildings changes the location of several devices, including magnets and RF accelerator tubes, which eventually leads to alignment errors (ΔX, ΔY, and ΔZ). Once alignment errors occur in these parts, the electron beam deviates from its course and the beam parameters change accordingly. PAL-XFEL has installed the Hydrostatic Leveling System (HLS) to measure and record the vertical change of buildings and ground consistently and systematically, and the Wire Position System (WPS) to measure the two-dimensional changes of the girders. This paper introduces the operating principle and design concept of the WPS and discusses the current status of its installation and operation.
Multi-component Self-Consistent Nuclear Energy System: On proliferation resistance aspect
International Nuclear Information System (INIS)
Shmelev, A.; Saito, M.; Artisyuk, V.
2000-01-01
The Self-Consistent Nuclear Energy System (SCNES), which simultaneously meets four requirements (energy production, fuel production, burning of radionuclides, and safety), is targeted at harmonization of nuclear energy technology with the human environment. The main bulk of SCNES studies focuses on the potential of the fast reactor (FR) to generate the neutron excess needed to maintain a suitable neutron balance. Proliferation resistance was implicitly anticipated in a fuel cycle with co-processing of Pu, minor actinides (MA) and some relatively short-lived fission products (FP). In contrast to such a mono-component system, the present paper highlights the advantage of incorporating accelerator- and fusion-driven neutron sources, which could drastically improve the characteristics of nuclear waste incineration. Importantly, they could help in creating advanced Np- and Pa-containing fuels with double protection against uncontrolled proliferation. The first level of protection is the possibility of approaching a long-life core (LLC) in fission reactors. Extending the core lifetime to the reactor lifetime is beneficial from the proliferation resistance viewpoint, since an LLC would not necessarily require fuel management at the energy-producing site, with the potential advantage of being moved to the vendor site for spent fuel refabrication. The second level is provided by the presence of substantial amounts of ²³⁸Pu and ²³²U in these fuels, which makes the fissile nuclides in them isotopically protected. All this reveals an important advantage of a multi-component SCNES that could draw in developing countries without an elaborated technological infrastructure. (author)
[Consistency and Reliability of MDK Expertise Examining the Encoding in the German DRG System].
Gaertner, T.; Lehr, F.; Blum, B.; van Essen, J.
2015-09-01
Hospital inpatient stays are reimbursed on the basis of German diagnosis-related groups (G-DRG). The G-DRG classification system is based on complex coding guidelines. The Medical Review Board of the Statutory Health Insurance Funds (MDK) examines the encoding by hospitals and delivers individual expertises on behalf of the German statutory health insurance companies in cases in which irregularities are suspected. A study was conducted on the inter-rater reliability of the MDK expertises regarding the scope of the assessment. A representative sample of 212 MDK expertises was taken from a selected pool of 1,392 MDK expertises in May 2013. This representative sample underwent a double examination by 2 independent MDK experts using specialized software based on the 3M™ G-DRG Grouper 2013 of 3M Medica, Germany. The following items encoded by the hospitals were examined: DRG, principal diagnosis, secondary diagnoses, procedures and additional payments. It was analysed whether the results of the MDK expertises were consistent, reliable and correct. 202 expertises were eligible for evaluation, containing a total of 254 questions regarding one or more of the 5 items encoded by hospitals. The double examination by 2 independent MDK experts showed matching results in 187 questions (73.6%), meaning they had been examined consistently and correctly. 59 questions (23.2%) did not show matching results; nevertheless, they had been examined correctly with regard to the scope of the assessment. None of the principal diagnoses was significantly affected by inconsistent or wrong judgment. A representative sample of MDK expertises examining the DRG encoding by hospitals showed a very high percentage of correct examination by the MDK experts. Identical MDK expertises cannot be achieved in all cases due to the scope of the assessment. Further improvement and simplification of codes and coding guidelines are required to reduce the scope of assessment with regard to correct DRG encoding and its
Dental students' consistency in applying the ICDAS system within paediatric dentistry.
Foley, J I
2012-12-01
To examine dental students' consistency in utilising the International Caries Detection and Assessment System (ICDAS) one and three months after training. A prospective study. All clinical dental students (Year Two: BDS2; Year Three: BDS3; Year Four: BDS4), as part of their education in Paediatric Dentistry at Aberdeen Dental School (n = 56), received baseline training by two "gold-standard" examiners and were advised to complete the 90-minute ICDAS e-learning program. Study One: One month later, the occlusal surfaces of 40 extracted primary and permanent molar teeth were examined and assigned both a caries code (0-6 scale) and a restorative code (0-9 scale). Study Two: The same teeth were examined three months later. Kappa statistics were used to determine inter- and intra-examiner reliability at baseline and after three months. In total, 31 students (BDS2: n = 9; BDS3: n = 8; BDS4: n = 14) completed both examinations. The inter-examiner reliability kappa scores for restoration codes for Study One and Study Two were: BDS2: 0.47 and 0.38; BDS3: 0.61 and 0.52; and BDS4: 0.56 and 0.52. The caries scores for the two studies were: BDS2: 0.31 and 0.20; BDS3: 0.45 and 0.32; and BDS4: 0.35 and 0.34. The intra-examiner reliability ranges for restoration codes were: BDS2: 0.20 to 0.55; BDS3: 0.34 to 0.72; and BDS4: 0.28 to 0.80. The intra-examiner reliability ranges for caries codes were: BDS2: 0.35 to 0.62; BDS3: 0.22 to 0.53; and BDS4: 0.22 to 0.65. The consistency of ICDAS codes varied between students and also between year groups. In general, consistency was greater for restoration codes.
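The kappa statistic used above for examiner reliability can be computed in a few lines. This is a generic sketch of Cohen's kappa; the codes below are made up for illustration, not the study's data:

```python
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    # Agreement expected if both raters assigned codes independently at random
    p_expected = sum(counts_a[c] * counts_b[c] for c in counts_a) / n**2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical ICDAS caries codes from a student and a gold-standard examiner
student = [0, 2, 2, 4, 0, 1, 3, 0]
gold    = [0, 2, 1, 4, 0, 1, 3, 2]
kappa = cohen_kappa(student, gold)   # 0.68 for these made-up scores
```

Kappa of 1 indicates perfect agreement; 0 indicates agreement no better than chance, which is why it is preferred over raw percent agreement for ordinal codes such as ICDAS.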
Self-consistent spectral function for non-degenerate Coulomb systems and analytic scaling behaviour
International Nuclear Information System (INIS)
Fortmann, Carsten
2008-01-01
Novel results for the self-consistent single-particle spectral function and self-energy are presented for non-degenerate one-component Coulomb systems at various densities and temperatures. The GW⁽⁰⁾ method for the dynamical self-energy is used to include many-particle correlations beyond the quasi-particle approximation. The self-energy is analysed over a broad range of densities and temperatures (n = 10¹⁷ cm⁻³ to 10²⁷ cm⁻³, T = 10² eV/k_B to 10⁴ eV/k_B). The spectral function shows a systematic behaviour, which is determined by collective plasma modes at small wavenumbers and converges towards a quasi-particle resonance at higher wavenumbers. In the low-density limit, the numerical results comply with an analytic scaling law that is presented for the first time. It predicts a power-law behaviour of the imaginary part of the self-energy, Im Σ ∼ −n^(1/4). This resolves a long-standing problem of the quasi-particle approximation, which yields a finite self-energy at vanishing density
International Nuclear Information System (INIS)
Kita, Toshihiro
2005-01-01
A simple system consisting of a second-order lag element (a damped linear pendulum) and two first-order lag elements with piecewise-linear static feedback, derived from a power system model, is presented. It exhibits chaotic behavior for a wide range of parameter values. An analysis of the bifurcations and the chaotic behavior is presented, with qualitative estimation of the parameter values for which the chaotic behavior is observed. Several characteristics, such as the scalability of the attractor and the globality of the attractor basin, are also discussed
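The described structure, a damped second-order element cascaded with two first-order lags and closed through a piecewise-linear static feedback, can be sketched as an ODE system. The equations and all parameter values below are illustrative guesses (the abstract does not give the model), so this sketch only shows the wiring, not the paper's chaotic regime:

```python
import numpy as np
from scipy.integrate import solve_ivp

def piecewise_linear(v, slope=5.0, limit=1.0):
    """Saturating piecewise-linear static feedback element (assumed form)."""
    return np.clip(slope * v, -limit, limit)

def rhs(t, x, zeta=0.1, omega=1.0, tau1=0.5, tau2=0.3):
    x1, x2, x3, x4 = x            # pendulum state (x1, x2) and two lag states
    u = piecewise_linear(x4)      # feedback taken from the second lag
    dx1 = x2                      # second-order lag: damped linear pendulum
    dx2 = -2 * zeta * omega * x2 - omega**2 * x1 - u
    dx3 = (x1 - x3) / tau1        # first first-order lag, driven by the pendulum
    dx4 = (x3 - x4) / tau2        # second first-order lag
    return [dx1, dx2, dx3, dx4]

sol = solve_ivp(rhs, (0.0, 100.0), [0.1, 0.0, 0.0, 0.0], max_step=0.05)
```

Bifurcation studies of such systems typically sweep a parameter (e.g. the feedback slope) and inspect the long-time behavior of trajectories like `sol.y`.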
Hierarchical fault diagnosis for discrete-event systems under local consistency
Su, Rong; Wonham, W.M.
2006-01-01
In previous work the authors proposed a distributed diagnosis approach consisting of two phases—preliminary diagnosis in each local diagnoser and inter-diagnoser communication. The objective of communication is to achieve either global or local consistency among local diagnoses, where global
Assessment of the Degree of Consistency of the System of Fuzzy Rules
Directory of Open Access Journals (Sweden)
Pospelova Lyudmila Yakovlevna
2013-12-01
Full Text Available The article analyses recent achievements and publications and shows that difficulties in explaining the nature of fuzziness and equivocation arise in socio-economic models that use the traditional paradigm of classical rationalism (computational, agent-based and econometric models). The accumulated collective experience of developing optimal models confirms the promise of applying the fuzzy-set approach to modelling society. The article justifies the need to study the nature of inconsistency in fuzzy knowledge bases, both at the generalised ontology level and at the pragmatic functional level of logical inference. It offers a method of searching for logical and conceptual contradictions that combines abduction and modus ponens. It discusses the key issue of the proposed method: what properties the membership function of the secondary fuzzy set should have, where this set describes, in fuzzy inference models, a resulting state of the object of management that combines empirically incompatible properties with high probability. The degree of membership of the object of management in several incompatible classes with respect to the fuzzy output variable is the degree of fuzziness of the statement "the intersection of all results of the fuzzy inference of the rule set, applied at some input, is an empty set". The article describes an algorithm for assessing the degree of consistency and provides an example of step-by-step detection of contradictions in statistical fuzzy knowledge bases at the pragmatic functional level of logical inference. The obtained test results, in the form of sets of incompatible facts, inference chains, sets of non-intersecting intervals and computed degrees of inconsistency, allow experts to eliminate inadmissible contradictions in a timely manner and, at the same time, to improve the quality of recommendations and the assessment of fuzzy expert systems.
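The consistency measure described, based on how empty the intersection of all inferred output fuzzy sets is, can be sketched numerically. The membership functions and values below are made up for illustration and are only a schematic reading of the idea, not the article's algorithm:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def inconsistency_degree(inferred_sets):
    """1 minus the height of the min-intersection of all inferred output sets.

    Height 1 means the rules infer fully compatible results; height 0 means
    the intersection is empty, i.e. empirically incompatible conclusions."""
    intersection = np.minimum.reduce(inferred_sets)
    return 1.0 - float(intersection.max())

x = np.linspace(0.0, 10.0, 1001)          # discretised output universe
# Two rules inferring overlapping output sets: consistent
compatible = inconsistency_degree([tri(x, 2, 4, 6), tri(x, 3, 4, 7)])
# Two rules inferring disjoint output sets: fully contradictory
contradictory = inconsistency_degree([tri(x, 0, 1, 2), tri(x, 6, 8, 10)])
```

A threshold on this degree lets an expert flag rule subsets whose simultaneous firing signals a logical or conceptual contradiction in the knowledge base.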
Elastic constants of the hard disc system in the self-consistent free volume approximation
International Nuclear Information System (INIS)
Wojciechowski, K.W.
1990-09-01
Elastic moduli of the two dimensional hard disc crystal are determined exactly within the Kirkwood self-consistent free volume approximation and compared with the Monte Carlo simulation results. (author). 22 refs, 1 fig., 1 tab
Adjoint-consistent formulations of slip models for coupled electroosmotic flow systems
Garg, Vikram V; Prudhomme, Serge; van der Zee, Kris G; Carey, Graham F
2014-01-01
Models based on the Helmholtz `slip' approximation are often used for the simulation of electroosmotic flows. The objectives of this paper are to construct adjoint-consistent formulations of such models, and to develop adjoint
Adjoint-consistent formulations of slip models for coupled electroosmotic flow systems
Garg, Vikram V
2014-09-27
Background: Models based on the Helmholtz 'slip' approximation are often used for the simulation of electroosmotic flows. The objectives of this paper are to construct adjoint-consistent formulations of such models, and to develop adjoint-based numerical tools for adaptive mesh refinement and parameter sensitivity analysis. Methods: We show that the direct formulation of the 'slip' model is adjoint inconsistent, and leads to an ill-posed adjoint problem. We propose a modified formulation of the coupled 'slip' model, which is shown to be well-posed, and therefore automatically adjoint-consistent. Results: Numerical examples are presented to illustrate the computation and use of the adjoint solution in two-dimensional microfluidics problems. Conclusions: An adjoint-consistent formulation for Helmholtz 'slip' models of electroosmotic flows has been proposed. This formulation provides adjoint solutions that can be reliably used for mesh refinement and sensitivity analysis.
Global and local consistencies in distributed fault diagnosis for discrete-event systems
Su, R.; Wonham, W.M.
2005-01-01
In this paper, we present a unified framework for distributed diagnosis. We first introduce the concepts of global and local consistency in terms of supremal global and local supports, then present two distributed diagnosis problems based on them. After that, we provide algorithms to achieve
Self-consistency condition and high-density virial theorem in relativistic many-particle systems
International Nuclear Information System (INIS)
Kalman, G.; Canuto, V.; Datta, B.
1976-01-01
In order for the thermodynamic and kinetic definitions of the chemical potential and the pressure to lead to identical results, a nontrivial self-consistency criterion has to be satisfied. This, in turn, leads to a virial-like theorem in the high-density limit
A discussion of coupling and resonance effects for integrated systems consisting of subsystems
International Nuclear Information System (INIS)
Lin, C.W.; Liu, T.H.
1975-01-01
Three representative cases are studied to evaluate the interaction effect and to establish the need to include both the stiffness and the mass of the interacting systems in the system model. The first case is a supported system carried by a two-degree-of-freedom supporting system. The second case represents two single-degree-of-freedom systems, each supported independently but interconnected by a spring. The third case is a single-degree-of-freedom system supported by another single-degree-of-freedom supporting system. In each of the three cases studied, the interaction effect is first measured by the difference in natural frequencies between the coupled system and the uncoupled systems. Although natural frequencies are important to the dynamic analysis of a system, the ultimate decision of whether the mathematical model is realistic depends on the system response it predicts. With this in mind, case three is then studied with a white noise input. It is found that the root mean square responses of both systems are substantially lower when coupled than when the systems are analyzed separately. Based on the results of this study, guidelines are provided for the subdivision into subsystems. (orig./HP) [de
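The white-noise RMS comparison in the third case can be sketched with stationary covariance (Lyapunov) analysis. All parameter values below are hypothetical, and the "uncoupled" model is represented here as a one-way cascade in which the supporting system ignores the supported mass, one common reading of analyzing the systems separately:

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

def rms_response(A, B, idx):
    """Stationary RMS of state `idx` for dx = A x dt + B dW (unit white noise)."""
    P = solve_continuous_lyapunov(A, -B @ B.T)   # stationary state covariance
    return float(np.sqrt(P[idx, idx]))

m1, k1, c1 = 10.0, 100.0, 2.0   # supporting system (hypothetical values)
m2, k2, c2 = 1.0, 50.0, 0.5     # supported system (hypothetical values)

# Coupled two-degree-of-freedom model: interaction force acts on both masses.
# State vector: [x1, v1, x2, v2]
A_coupled = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [-(k1 + k2) / m1, -(c1 + c2) / m1, k2 / m1, c2 / m1],
    [0.0, 0.0, 0.0, 1.0],
    [k2 / m2, c2 / m2, -k2 / m2, -c2 / m2],
])
# Cascade (uncoupled) model: the supporting system is analyzed alone and its
# motion is fed one-way into the supported system.
A_cascade = A_coupled.copy()
A_cascade[1, :] = [-k1 / m1, -c1 / m1, 0.0, 0.0]

B = np.array([[0.0], [1.0 / m1], [0.0], [0.0]])   # white-noise force on support
rms_coupled = rms_response(A_coupled, B, idx=2)   # supported-mass displacement
rms_cascade = rms_response(A_cascade, B, idx=2)
```

Comparing `rms_coupled` with `rms_cascade` for a given mass ratio is exactly the kind of check the study uses to decide when subsystem decoupling is acceptable.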
Self-consistent quasi-particle RPA for the description of superfluid Fermi systems
Rahbi, A.; Chanfray, G.; Schuck, P.
2002-01-01
The Self-Consistent Quasi-Particle RPA (SCQRPA) is applied for the first time to a multi-level pairing case. Various filling situations and values of the coupling constant are considered. Very encouraging results are obtained in comparison with the exact solution of the model. The nature of the low-lying mode in SCQRPA is identified. The strong reduction of the number fluctuation in SCQRPA vs. BCS is pointed out. The transition from superfluidity to the normal-fluid case is carefully investigated.
STUDY OF TRANSIENT AND STATIONARY OPERATION MODES OF A SYNCHRONOUS SYSTEM CONSISTING OF TWO MACHINES
Directory of Open Access Journals (Sweden)
V. S. Safaryan
2017-01-01
Full Text Available The reliable functioning of an electric power system (EPS) in steady-state and transient regimes, the prevention of the transition of an EPS into an asynchronous regime, and the maintenance and restoration of stability of post-emergency processes rest on the formulation and use of mathematical models of EPS processes. When an electric power system operates in an asynchronous regime, the currents and voltages contain, besides the main frequencies, harmonic components whose frequencies are multiples of the difference of the main frequencies. In the two-frequency asynchronous regime, the electric power system is reduced to an equivalent two-machine system operating on a generalized load. The article presents mathematical models of the transient process of a two-machine system in natural form and in the d-q coordinate system, and considers the mathematical model of a two-machine system with two excitation windings on the rotors. Varieties of mathematical models of EPS transient regimes (trivial, simple, complete) are also presented; the transient process of a synchronous two-machine system is described by the complete model. The quality of the transient processes of a synchronous machine depends on the number of rotor excitation windings. When there are two excitation windings on the rotor (a dual excitation system), the mathematical model of the electromagnetic transient processes of a synchronous machine is represented in complex form, i.e., in the d, q coordinate system, with the rotor current represented by a generalized vector. In asynchronous operation of a synchronous two-machine system with two excitation windings on the rotor, the current and voltage systems contain only harmonics of two frequencies. A mathematical model of the synchronous steady-state process of a two-machine system is also provided, and steady-state regimes with different structures of initial information are considered.
Consistency Analysis and Data Consultation of Gas System of Gas-Electricity Network of Latvia
Zemite, L.; Kutjuns, A.; Bode, I.; Kunickis, M.; Zeltins, N.
2018-02-01
In the present research, the main critical points of gas transmission and storage system of Latvia have been determined to ensure secure and reliable gas supply among the Baltic States to fulfil the core objectives of the EU energy policies. Technical data of critical points of the gas transmission and storage system of Latvia have been collected and analysed with the SWOT method and solutions have been provided to increase the reliability of the regional natural gas system.
Mans, R.S.; van der Aalst, W.M.P.; Russell, N.C.; Bakker, P.J.M.; Moleman, A.J.; Rinderle-Ma, S.; Sadiq, S.; Leymann, F.
2010-01-01
Optimal support for complex healthcare processes cannot be provided by a single out-of-the-box Process-Aware Information System and necessitates the construction of customized applications based on these systems. In order to allow for the seamless integration of the new technology into the existing
McKay, Joshua A A; McCulloch, Cara L; Querido, Jordan S; Foster, Glen E; Koehle, Michael S; Sheel, A William
2016-11-01
The purpose of this investigation was to quantify the cardiovascular, respiratory, and cerebrovascular effects of two common yogic breathing exercises (YBE), bhastrika and chaturbhuj, and to determine the effect of their consistent practice on chemosensitivity. The first study was cross-sectional and compared experienced yogic breathers (YB) with matched controls, whereas the second was a 10-week longitudinal training study. The results support four major findings. First, chaturbhuj resulted in a hypoxic stimulus in experienced YB compared to control [end-tidal oxygen tension (P_ETO2), YB: 77.5±5.7 mmHg, P<0.05]. Breath-hold values of 90.8±12.1 mmHg compared to rest (100.1±7.4 mmHg), 96.7±13.0 mmHg compared to rest (83.0±6.6 mmHg), and 87.4±23.0 cm/s compared to rest (55.8±26.3 cm/s) were also observed (P<0.05). Fourth, experienced YB had lower central chemosensitivity than controls (YB: 3.4±0.4; control: 4.6±1.2 L/min/mmHg; P<0.05). In conclusion, YBE significantly alter end-tidal gases, resulting in complex oscillations of cardiovascular and cerebrovascular variables, and, if practiced consistently, may reduce chemosensitivity. Copyright © 2016. Published by Elsevier B.V.
Yang, Yuyi; Wei, Buqing; Zhao, Yuhua; Wang, Jun
2013-02-01
Azo dyes are toxic and carcinogenic and are often present in industrial effluents. In this research, azoreductase and glucose 1-dehydrogenase were coupled for both continuous generation of the cofactor NADH and azo dye removal. The results show that 85% of the maximum relative activity of azoreductase in the integrated enzyme system was obtained under the following conditions: 1 U azoreductase : 10 U glucose 1-dehydrogenase, 250 mM glucose, 1.0 mM NAD⁺ and 150 μM methyl red. Sensitivity analysis of the factors affecting dye removal in the enzyme system, examined with an artificial neural network model, shows that the relative importance of the enzyme ratio between azoreductase and glucose 1-dehydrogenase was 22%, while dye concentration, NAD⁺ concentration and glucose concentration accounted for 27%, 23% and 22%, respectively, indicating that none of the variables could be ignored in the enzyme system. Batch results show that the enzyme system has application potential for dye removal. Copyright © 2012 Elsevier Ltd. All rights reserved.
Self-consistent theory of steady-state lamellar solidification in binary eutectic systems
International Nuclear Information System (INIS)
Nash, G.E.; Glicksman, M.E.
1976-01-01
The potential theoretic methods developed recently at NRL for solving the diffusion equation are applied to the free-boundary problem describing lamellar eutectic solidification. Using these techniques, the original boundary value problem is reduced to a set of coupled integro-differential equations for the shape of the solid/liquid interface and various quantities defined on the interface. The behavior of the solutions is discussed in a qualitative fashion, leading to some interesting inferences regarding the nature of the eutectic solidification process. Using the information obtained from the analysis referred to above, an approximate theory of the lamellar-rod transition is formulated. The predictions of the theory are shown to be in qualitative agreement with experimental observations of this transition. In addition, a simplified version of the general integro-differential equations is developed and is used to assess the effect of interface curvature on the interfacial solute concentrations, and to check the new theory for consistency with experiment
Folt, Brian; Donnelly, Maureen A; Guyer, Craig
2018-03-01
The conspecific attraction hypothesis predicts that individuals are attracted to conspecifics because conspecifics may be cues to quality habitat and/or colonists may benefit from living in aggregations. Poison frogs (Dendrobatidae) are aposematic, territorial, and visually oriented: three characteristics that make dendrobatids an appropriate model to test for conspecific attraction. In this study, we tested this hypothesis using an extensive mark-recapture dataset of the strawberry poison frog (Oophaga pumilio) from La Selva Biological Station, Costa Rica. Data were collected from replicate populations in a relatively homogeneous Theobroma cacao plantation, which provided a unique opportunity to test how conspecifics influence the spatial ecology of migrants in a controlled habitat with homogeneous structure. We predicted that (1) individuals entering a population would aggregate with resident adults, (2) migrants would share sites with residents at a greater frequency than expected by chance, and (3) migrant home ranges would have shorter nearest-neighbor distances (NND) to residents than expected by chance. The results were consistent with these three predictions: relative to random simulations, we observed significant aggregation, home-range overlap, and NND distribution functions in four, five, and six, respectively, of the six migrant-resident groups analyzed. Conspecific attraction may benefit migrant O. pumilio by providing cues to suitable home sites and/or increasing the potential for social interactions with conspecifics; if true, these benefits should outweigh the negative effects of other factors associated with aggregation. The observed aggregation between migrant and resident O. pumilio is consistent with conspecific attraction in dendrobatid frogs, and our study provides rare support from a field setting that conspecific attraction may be a relevant mechanism for models of anuran spatial ecology.
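The comparison of observed nearest-neighbor distances against random simulations can be sketched as a simple Monte Carlo randomization test. The coordinates below are synthetic points in a unit square, not the La Selva data, and the null model (uniform random migrant placement) is one plausible choice among several:

```python
import numpy as np

def mean_nnd(migrants, residents):
    """Mean distance from each migrant to its nearest resident."""
    d = np.linalg.norm(migrants[:, None, :] - residents[None, :, :], axis=2)
    return d.min(axis=1).mean()

def nnd_p_value(migrants, residents, n_sim=999, seed=1):
    """Fraction of random migrant placements with mean NND <= observed.

    A small p-value means migrants sit closer to residents than expected
    by chance, consistent with conspecific attraction."""
    rng = np.random.default_rng(seed)
    observed = mean_nnd(migrants, residents)
    hits = sum(mean_nnd(rng.random(migrants.shape), residents) <= observed
               for _ in range(n_sim))
    return (hits + 1) / (n_sim + 1)   # +1 correction includes the observed value

rng = np.random.default_rng(0)
residents = rng.random((20, 2))                                   # resident sites
migrants = residents[:8] + 0.01 * rng.standard_normal((8, 2))     # clustered nearby
p = nnd_p_value(migrants, residents)
```

The same machinery extends to the aggregation and home-range-overlap statistics by swapping the test statistic while keeping the randomization scheme.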
How to consistently make your product, technology or system more environmentally-sustainable?
DEFF Research Database (Denmark)
Laurent, Alexis; Cosme, Nuno Miguel Dias; Molin, Christine
Human activities are currently unsustainable, causing much damage to ecosystems, human health and natural resources. In this setting, the development of new products and technologies has increasingly been required to relate to sustainability and ensure that such development goes hand-in-hand w... of the system. We rely on state-of-the-art science in the food sector, the aquaculture sector and the energy sector to showcase and illustrate the potential of LCA to undertake the environmental sustainability challenge and support product/technology/system development.
Direct calculation of self-consistent π bond orders in conjugated systems and pairing relations
International Nuclear Information System (INIS)
Castro, A.F.
1982-01-01
Pairing relations in excited states of conjugated systems that satisfy a given symmetry are studied with a Pariser-Parr-Pople-like (PPP) calculation. Six π-electron systems having a symmetry axis that does not cross π centers are considered, following a treatment that permits direct calculation of the bond order matrix based on Hall's method. Pairing relations are also sought using particular solutions when the U(3) group is applied. Pyridazine molecules are used to test the results. (L.C.) [pt
Simulation of distributed parameter system consisting of charged and neutral particles
International Nuclear Information System (INIS)
Grover, P.S.; Sinha, K.V.
1986-01-01
The time-dependent behavior of positively charged light particles has been simulated in an assembly of heavy gas atoms. The system is formulated in terms of a partial differential equation. The stability and convergence of the numerical algorithm have been examined. Using this formulation, the effects of external electric field and temperature on the lifetime and distribution-function characteristics of the charged particles have been investigated
Cosmological evolution and Solar System consistency of massive scalar-tensor gravity
de Pirey Saint Alby, Thibaut Arnoulx; Yunes, Nicolás
2017-09-01
The scalar-tensor theory of Damour and Esposito-Farèse recently gained some renewed interest because of its ability to suppress modifications to general relativity in the weak field, while introducing large corrections in the strong field of compact objects through a process called scalarization. A large sector of this theory that allows for scalarization, however, has been shown to be in conflict with Solar System observations when accounting for the cosmological evolution of the scalar field. We here study an extension of this theory by endowing the scalar field with a mass to determine whether this allows the theory to pass Solar System constraints upon cosmological evolution for a larger sector of coupling parameter space. We show that the cosmological scalar field goes first through a quiescent phase, similar to the behavior of a massless field, but then it enters an oscillatory phase, with an amplitude (and frequency) that decays (and grows) exponentially. We further show that after the field enters the oscillatory phase, its effective energy density and pressure are approximately those of dust, as expected from previous cosmological studies. Due to these oscillations, we show that the scalar field cannot be treated as static today on astrophysical scales, and so we use time-dependent perturbation theory to compute the scalar-field-induced modifications to Solar System observables. We find that these modifications are suppressed when the mass of the scalar field and the coupling parameter of the theory are in a wide range, allowing the theory to pass Solar System constraints, while in principle possibly still allowing for scalarization.
Self-consistent random phase approximation - application to systems of strongly correlated fermions
International Nuclear Information System (INIS)
Jemai, M.
2004-07-01
In the present thesis we have applied the self-consistent random phase approximation (SCRPA) to the Hubbard model with a small number of sites (chains of 2, 4, 6, ... sites). Earlier, SCRPA had produced very good results in other models, such as the Richardson pairing model. It was therefore interesting to see what kind of results the method is able to produce for a more complex model such as the Hubbard model. To our great satisfaction, the case of two sites with two electrons (half-filling) is solved exactly by the SCRPA. This may seem a little trivial, but the fact is that other respectable approximations, such as 'GW' or the approach with the Gutzwiller wave function, yield results still far from exact. With this promising starting point, the case of 6 sites at half-filling was considered next. For that case, evidently, SCRPA no longer gives exact results. However, they are still excellent for a wide range of values of the coupling constant U, covering for instance the phase-transition region towards a state with non-zero magnetisation. We consider this a good success of the theory. Nonetheless, the case of 4 sites (a plaquette), as indeed all cases with 4n sites at half-filling, turned out to pose a problem because of degeneracies at the Hartree-Fock level. A generalisation of the present method, including, in addition to the pairs, quadruples of fermion operators (called second RPA), is proposed to also treat the plaquette case exactly in our approach. This is therefore a very interesting perspective of the present work. (author)
DEFF Research Database (Denmark)
Staunstrup, Jørgen
1998-01-01
This paper proposes that interface consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
Energy Technology Data Exchange (ETDEWEB)
Ohmacht, Martin
2017-08-15
In a multiprocessor system, a central memory synchronization module coordinates memory synchronization requests responsive to memory access requests in flight, a generation counter, and a reclaim pointer. The central module communicates via point-to-point communication. The module includes a global OR reduce tree for each memory access requesting device, for detecting memory access requests in flight. An interface unit is implemented associated with each processor requesting synchronization. The interface unit includes multiple generation completion detectors. The generation count and reclaim pointer do not pass one another.
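The interaction between the generation counter, reclaim pointer, and completion detectors can be illustrated with a toy software model. This is a schematic analogy only; the patent describes a hardware design with OR-reduce trees and point-to-point links, and the class and method names below are invented for illustration:

```python
class GenerationSync:
    """Toy model of generation-based memory synchronization.

    Accesses are tagged with the generation current at their start; a sync
    for generation g completes once no access tagged <= g remains in flight."""

    def __init__(self, n_slots=8):
        self.in_flight = [0] * n_slots   # in-flight access count per slot
        self.generation = 0              # generation counter
        self.reclaim = 0                 # oldest generation that may still drain

    def begin_access(self):
        self.in_flight[self.generation % len(self.in_flight)] += 1
        return self.generation           # tag carried by the access

    def end_access(self, tag):
        self.in_flight[tag % len(self.in_flight)] -= 1
        # Advance the reclaim pointer past fully drained generations; it
        # never passes the generation counter (mirroring the abstract).
        while (self.reclaim < self.generation
               and self.in_flight[self.reclaim % len(self.in_flight)] == 0):
            self.reclaim += 1

    def advance_generation(self):
        self.generation += 1

    def sync_complete(self, up_to):
        """Completion detector: nothing tagged <= up_to is still in flight."""
        return all(self.in_flight[g % len(self.in_flight)] == 0
                   for g in range(self.reclaim, min(up_to, self.generation) + 1))

s = GenerationSync()
tag = s.begin_access()        # a memory access in flight, tagged generation 0
s.advance_generation()        # a sync request opens generation 1
pending = s.sync_complete(0)  # False: the tagged access has not drained
s.end_access(tag)
done = s.sync_complete(0)     # True: generation 0 has fully drained
```

The bounded slot array is why, in the real design, the generation counter and reclaim pointer must never pass one another: a slot can only be reused once it has drained.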
Ternary systems consisting of erbium nitrate, water and nitrates of pyridine and quinoline
International Nuclear Information System (INIS)
Starikova, L.I.; Zhuravlev, E.F.; Khalfina, L.R.
1979-01-01
Solubility of solid phases in the ternary water-salt systems erbium nitrate-pyridine nitrate-water and erbium nitrate-quinoline nitrate-water was investigated at 25 and 50 deg C. The formation of congruently soluble compounds of composition Er(NO₃)₃·2C₅H₅N·HNO₃ and Er(NO₃)₃·2C₉H₇N·HNO₃·4H₂O is established. X-ray phase and thermogravimetric analyses have been carried out
An Approach to Verifying Completeness and Consistency in a Rule-Based Expert System.
1982-08-01
...people with the same knowledge base by observing... While thorough testing is an essential part of verifying the consistency and completeness of a... physicians at Stanford's Oncology Day Care Center on the management of patients who are on experimental treatment protocols. These protocols serve to... for oncology protocol management. Proceedings of 7th IJCAI, pp. 876-881, Vancouver, B.C., August 1981. van Melle, W. A Domain-Independent System
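The kind of completeness-and-consistency check described for a rule-based expert system can be sketched over a finite rule table. This is a generic illustration of the idea (enumerate every condition combination, flag gaps and conflicting conclusions), not the report's actual system; the toy rules are invented:

```python
from itertools import product

def check_rule_base(domains, rules):
    """Enumerate all condition combinations; report gaps and conflicts.

    `domains` maps each attribute to its finite set of values; `rules` is a
    list of (conditions, conclusion) pairs, where `conditions` is a dict that
    may omit attributes (an omitted attribute matches any value)."""
    attrs = sorted(domains)
    missing, conflicts = [], []
    for combo in product(*(domains[a] for a in attrs)):
        case = dict(zip(attrs, combo))
        conclusions = {concl for cond, concl in rules
                       if all(case[a] == v for a, v in cond.items())}
        if not conclusions:
            missing.append(case)                    # completeness gap: no rule fires
        elif len(conclusions) > 1:
            conflicts.append((case, conclusions))   # inconsistency: rules disagree
    return missing, conflicts

# Hypothetical toy protocol rules
domains = {"fever": ["yes", "no"], "count": ["low", "normal"]}
rules = [
    ({"fever": "yes"}, "treat"),
    ({"fever": "yes", "count": "low"}, "defer"),    # conflicts with the rule above
    ({"fever": "no", "count": "normal"}, "continue"),
]
missing, conflicts = check_rule_base(domains, rules)
```

Running the checker on this table finds one uncovered case (no fever, low count) and one conflicting case, exactly the two classes of defect such verification aims to surface before deployment.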
Ohmacht, Martin
2014-09-09
In a multiprocessor system, a central memory synchronization module coordinates memory synchronization requests responsive to memory access requests in flight, a generation counter, and a reclaim pointer. The central module communicates via point-to-point communication. The module includes a global OR reduce tree for each memory access requesting device, for detecting memory access requests in flight. An interface unit is implemented associated with each processor requesting synchronization. The interface unit includes multiple generation completion detectors. The generation count and reclaim pointer do not pass one another.
Directory of Open Access Journals (Sweden)
Shuichiro Yazawa
2014-06-01
Full Text Available The role of surface-protective additives becomes vital when operating conditions become severe and moving components operate in a boundary lubrication regime. After the protective film is slowly removed by rubbing, it can regenerate through the tribochemical reaction of the additives at the contact. However, this regeneration is limited once the additives are totally consumed. On the other hand, there are many hard coatings that protect the steel surface from wear. These can keep tribological systems functioning even in adverse lubrication conditions. However, hard coatings usually raise the friction coefficient because of their high interfacial shear strength. Amongst hard coatings, diamond-like carbon (DLC) is widely used because of its relatively low friction and superior wear resistance. In practice, conventional lubricants that are essentially formulated for steel/steel surfaces are still used for lubricating machine component surfaces provided with protective coatings, such as DLCs, despite the fact that the surface properties of coatings are quite different from those of steel. It is therefore important that the design of additive molecules and their interaction with coatings be re-considered. The main aim of this paper is to discuss the DLC and additive combinations that enable tribofilm formation and effective lubrication of tribological systems.
Tile drainage phosphorus loss with long-term consistent cropping systems and fertilization.
Zhang, T Q; Tan, C S; Zheng, Z M; Drury, C F
2015-03-01
Phosphorus (P) loss in tile drainage water may vary with agricultural practices, and the impacts are often hard to detect with short-term studies. We evaluated the effects of long-term (≥43 yr) cropping systems (continuous corn [CC], corn-oats-alfalfa-alfalfa rotation [CR], and continuous grass [CS]) and fertilization (fertilization [F] vs. no-fertilization [NF]) on P loss in tile drainage water from a clay loam soil over a 4-yr period. Compared with NF, long-term fertilization increased concentrations and losses of dissolved reactive P (DRP), dissolved unreactive P (DURP), and total P (TP) in tile drainage water, with the increments following the order: CS > CR > CC. Dissolved P (dissolved reactive P [DRP] and dissolved unreactive P [DURP]) was the dominant P form in drainage outflow, accounting for 72% of TP loss under F-CS, whereas particulate P (PP) was the major form of TP loss under F-CC (72%), F-CR (62%), NF-CS (66%), NF-CC (74%), and NF-CR (72%). Dissolved unreactive P played nearly equal roles as DRP in P losses in tile drainage water. Stepwise regression analysis showed that the concentration of P (DRP, DURP, and PP) in tile drainage flow, rather than event flow volume, was the most important factor contributing to P loss in tile drainage water, although event flow volume was more important in PP loss than in dissolved P loss. Continuous grass significantly increased P loss by increasing P concentration and flow volume of tile drainage water, especially under the fertilization treatment. Long-term grasslands may become a significant P source in tile-drained systems when they receive regular P addition. Copyright © by the American Society of Agronomy, Crop Science Society of America, and Soil Science Society of America, Inc.
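The stepwise-regression finding above, that P concentration rather than event flow volume drives load in drainage water, can be illustrated with a minimal forward-selection sketch. The data below are synthetic and the variable names, coefficients, and units are hypothetical; this is only a sketch of the statistical technique, not the study's dataset.

```python
import numpy as np

def rss(A, y):
    """Residual sum of squares of an OLS fit with intercept."""
    A1 = np.column_stack([np.ones(len(y)), A])
    beta, *_ = np.linalg.lstsq(A1, y, rcond=None)
    r = y - A1 @ beta
    return float(r @ r)

def forward_stepwise(X, y, names, max_vars=2):
    """Greedy forward selection: at each step add the predictor that
    most reduces the residual sum of squares."""
    selected, remaining = [], list(range(X.shape[1]))
    for _ in range(max_vars):
        best = min(remaining, key=lambda j: rss(X[:, selected + [j]], y))
        selected.append(best)
        remaining.remove(best)
    return [names[j] for j in selected]

rng = np.random.default_rng(0)
n = 200
conc = rng.lognormal(0.0, 0.5, n)   # P concentration (hypothetical units)
flow = rng.lognormal(1.0, 0.5, n)   # event flow volume (hypothetical units)
loss = 3.0 * conc + 0.3 * flow + rng.normal(0, 0.2, n)  # P load per event
order = forward_stepwise(np.column_stack([conc, flow]), loss,
                         ["concentration", "flow"])
print(order)  # concentration is selected first: it explains more variance
```

Because the synthetic load is dominated by the concentration term, forward selection enters "concentration" before "flow", mirroring the ordering the abstract reports.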
Newburg, D S
2009-04-01
This review discusses the role of human milk glycans in protecting infants, but the conclusion that the human milk glycans constitute an innate immune system whereby the mother protects her offspring may have general applicability in all mammals, including species of commercial importance. Infants that are not breastfed have a greater incidence of severe diarrhea and respiratory diseases than those who are breastfed. In the past, this had been attributed primarily to human milk secretory antibodies. However, the oligosaccharides are major components of human milk, and milk is also rich in other glycans, including glycoproteins, mucins, glycosaminoglycans, and glycolipids. These milk glycans, especially the oligosaccharides, are composed of thousands of components. The milk factor that promotes gut colonization by Bifidobacterium bifidum was found to be a glycan, and such prebiotic characteristics may contribute to protection against infectious agents. However, the ability of human milk glycans to protect the neonate seems primarily to be due to their inhibition of pathogen binding to their host cell target ligands. Many such examples include specific fucosylated oligosaccharides and glycans that inhibit specific pathogens. Most human milk oligosaccharides are fucosylated, and their production depends on fucosyltransferase enzymes; mutations in these fucosyltransferase genes are common and underlie the various Lewis blood types in humans. Variable expression of specific fucosylated oligosaccharides in milk, also a function of these genes (and maternal Lewis blood type), is significantly associated with the risk of infectious disease in breastfed infants. Human milk also contains major quantities and large numbers of sialylated oligosaccharides, many of which are also present in bovine colostrum. These could similarly inhibit several common viral pathogens. Moreover, human milk oligosaccharides strongly attenuate inflammatory processes in the intestinal mucosa. These
Final Scientific/Technical Report "Arc Tube Coating System for Color Consistency"
Energy Technology Data Exchange (ETDEWEB)
Buelow, Roger [Energy Focus, Inc., Solon, OH (United States); Jenson, Chris [Energy Focus, Inc., Solon, OH (United States); Kazenski, Keith [Energy Focus, Inc., Solon, OH (United States)
2013-03-21
DOE has enabled the use of coating materials, applied with low-cost methods, on light sources to positively affect their output. The coating and light source combinations have shown increased lumen output for LED fixtures (1.5%-2.0%), LED arrays (1.4%), and an LED-powered remote-phosphor system, the Philips L-Prize lamp (0.9%). We have also demonstrated lifetime enhancements (3000 hrs vs. 8000 hrs) and a shift to higher CRI (51 to 65) in metal halide high intensity discharge lamps with metal oxide coatings. The coatings on LEDs and LED products are significant as the market is moving increasingly towards LED technology. Enhancements in LED performance are demonstrated in this work through the use of available materials and low-cost application processes. EFOI used low refractive index fluoropolymers and low-cost dipping processes to apply the material to surfaces involved in light transmission of LEDs and LED products. Materials included Teflon AF, an amorphous fluorinated polymer, and fluorinated acrylic monomers. The DOE SSL Roadmap sets goals for LED performance moving into the future. EFOI's coating technology is a means to shift the performance curve for LEDs; it is not limited to one type of LED but is relevant across LED technologies. The metal halide work included the use of sol-gel solutions resulting in silicon dioxide and titanium dioxide coatings on the quartz substrates of the metal halide arc tubes. The coatings were applied using low-cost dipping processes.
Mutual Inductance Problem for a System Consisting of a Current Sheet and a Thin Metal Plate
Fulton, J. P.; Wincheski, B.; Nath, S.; Namkung, M.
1993-01-01
Rapid inspection of aircraft structures for flaws is of vital importance to the commercial and defense aircraft industry. In particular, inspecting thin aluminum structures for flaws is the focus of a large-scale R&D effort in the nondestructive evaluation (NDE) community. Traditional eddy current methods used today are effective but require long inspection times. New electromagnetic techniques, which monitor the normal component of the magnetic field above a sample excited by a sheet of current, seem promising. This paper is an attempt to understand and analyze the magnetic field distribution due to a current sheet above an aluminum test sample. A simple theoretical model, coupled with a two-dimensional finite element model (FEM) and experimental data, will be presented in the next few sections. A current sheet above a conducting sample generates eddy currents in the material, while a sensor above the current sheet, or between the two plates, monitors the normal component of the magnetic field. A rivet, or a surface flaw near a rivet, in an aircraft aluminum skin will disturb the magnetic field, which is imaged by the sensor. Initial results showed a strong dependence of the flaw-induced normal magnetic field strength on the thickness and conductivity of the current sheet that could not be accounted for by skin depth attenuation alone. It was initially believed that the eddy current imaging method explained this dependence of the flaw-induced normal magnetic field on current-sheet thickness and conductivity. Further investigation suggested that the mutual inductance of the system needed to be studied. The next section gives an analytical model to better understand the phenomenon.
Consistent analysis of peripheral reaction channels and fusion for the 16,18O+58Ni systems
International Nuclear Information System (INIS)
Alves, J.J.S.; Gomes, P.R.S.; Lubian, J.; Chamon, L.C.; Pereira, D.; Anjos, R.M.; Rossi, E.S.; Silva, C.P.; Alvarez, M.A.G.; Nobre, G.P.A.; Gasques, L.R.
2005-01-01
We have measured elastic scattering and peripheral reaction channel cross sections for the 16,18O + 58Ni systems at E(lab) = 46 MeV. The data were analyzed through extensive coupled-channel calculations. The consistency of the present analysis with a previous one at sub-barrier energies was investigated. Experimental fusion cross sections for these systems are also compared with the corresponding predictions of the coupled-channel calculations
Sean P. Healey; Paul L. Patterson; Sassan S. Saatchi; Michael A. Lefsky; Andrew J. Lister; Elizabeth A. Freeman
2012-01-01
Lidar height data collected by the Geosciences Laser Altimeter System (GLAS) from 2002 to 2008 has the potential to form the basis of a globally consistent sample-based inventory of forest biomass. GLAS lidar return data were collected globally in spatially discrete full waveform "shots," which have been shown to be strongly correlated with aboveground forest...
International Nuclear Information System (INIS)
Tadmor, E.
1988-07-01
A convergence theory for semi-discrete approximations to nonlinear systems of conservation laws is developed. It is shown, by a series of scalar counter-examples, that consistency with the conservation law alone does not guarantee convergence. Instead, a notion of consistency which takes into account both the conservation law and its augmenting entropy condition is introduced. In this context it is concluded that consistency and L^infinity-stability guarantee, for a relevant class of admissible entropy functions, that their entropy production rate belongs to a compact subset of H^-1_loc(x,t). One can now use compensated compactness arguments in order to turn this conclusion into a convergence proof. The current state of the art for these arguments includes the scalar case and a wide class of 2 x 2 systems of conservation laws. The general framework of the vanishing viscosity method is studied as an effective way to meet the consistency and L^infinity-stability requirements. How this method is utilized to enforce consistency and stability for scalar conservation laws is shown. In this context we prove, under the appropriate assumptions, the convergence of finite difference approximations (e.g., the high resolution TVD and UNO methods), finite element approximations (e.g., the Streamline-Diffusion methods), and spectral and pseudospectral approximations (e.g., the Spectral Viscosity methods)
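A concrete member of the class of conservative, L^infinity-stable approximations discussed above is the classical Lax-Friedrichs scheme, shown here for the inviscid Burgers equation. This is a standard textbook scheme used purely as an illustration, not one of the paper's constructions; its built-in numerical viscosity is what makes it consistent with the entropy condition, so the discrete solution stays bounded and conserves the mean even after a shock forms.

```python
import numpy as np

def lax_friedrichs_burgers(u0, dx, dt, steps):
    """Conservative Lax-Friedrichs update for u_t + (u^2/2)_x = 0 with
    periodic boundaries. The averaging term supplies numerical viscosity
    that selects the entropy solution."""
    u = u0.copy()
    f = lambda v: 0.5 * v * v          # Burgers flux
    for _ in range(steps):
        up = np.roll(u, -1)            # u_{j+1}
        um = np.roll(u, 1)             # u_{j-1}
        u = 0.5 * (up + um) - 0.5 * (dt / dx) * (f(up) - f(um))
    return u

N = 200
x = np.linspace(0.0, 1.0, N, endpoint=False)
u0 = np.sin(2 * np.pi * x)
dx = 1.0 / N
dt = 0.4 * dx                          # CFL condition holds since |u| <= 1
u = lax_friedrichs_burgers(u0, dx, dt, steps=200)
```

By t = 200*dt a shock has formed, yet the sup-norm has not grown (L^infinity-stability) and the spatial mean is conserved exactly up to rounding, the two discrete properties the convergence theory builds on.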
Directory of Open Access Journals (Sweden)
Jeong Ran Park
2007-12-01
Full Text Available We tried to develop itemized evaluation criteria and a clinical rater qualification system through rating training for inter-rater consistency among experienced clinical dental hygienists and dental hygiene clinical educators. A total of 15 clinical dental hygienists with 1-year careers participated as clinical examination candidates, while 5 dental hygienists with 3 or more years of education and clinical experience participated as clinical raters. They all took the clinical examination as examinees. The results were compared, and the consistency of competence was measured. The comparison of clinical competence between candidates and clinical raters showed that the candidate group's mean clinical competence ranged from 2.96 to 3.55 on a 5-point scale across a total of 3 instruments (Probe, Explorer, Curet), while the clinical rater group's mean clinical competence ranged from 4.05 to 4.29. There was a higher inter-rater consistency after education of raters in the following 4 items: Probe, Explorer, Curet, and insertion on the distal surface. The mean score distribution of clinical raters ranged from 75% to 100%, which was more uniform in the competence to detect an artificial calculus than that of candidates (25% to 100%). According to the above results, there is a need to operate a clinical rater qualification system for comprehensive dental hygiene clinicians. Furthermore, in order to execute the clinical rater qualification system, it will be necessary to keep conducting a series of studies on educational content, time, frequency, and educator level.
Bitcoin Meets Strong Consistency
Decker, Christian; Seidel, Jochen; Wattenhofer, Roger
2014-01-01
The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...
International Nuclear Information System (INIS)
Woafo, P.
1999-12-01
This paper deals with the dynamics of a model describing systems consisting of the classical Van der Pol oscillator coupled gyroscopically to a linear oscillator. Both the forced and autonomous cases are considered. Harmonic response is investigated along with its stability boundaries. Condition for quenching phenomena in the autonomous case is derived. Neimark bifurcation is observed and it is found that our model shows period doubling and period-m sudden transitions to chaos. Synchronization of two and more systems in their chaotic regime is presented. (author)
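The autonomous case of such a coupled system is easy to reproduce numerically. The sketch below integrates a Van der Pol oscillator coupled through the velocities (with opposite signs, in a gyroscopic-style antisymmetric form) to a linear oscillator; the specific coupling form and every parameter value are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rhs(s, mu=1.0, lam=0.1, w2=1.2):
    """State s = (x, xd, y, yd): Van der Pol oscillator in x coupled
    through the velocities (gyroscopic-style, opposite signs) to a
    linear oscillator in y. Coupling form and parameters are
    illustrative assumptions."""
    x, xd, y, yd = s
    return np.array([xd,
                     mu * (1.0 - x * x) * xd - x + lam * yd,
                     yd,
                     -w2 * y - lam * xd])

def simulate(steps=20000, dt=0.01, keep=5000):
    """Classical fourth-order Runge-Kutta integration; returns the
    x-trajectory over the final `keep` steps, after transients decay."""
    s = np.array([0.1, 0.0, 0.0, 0.0])
    xs = []
    for i in range(steps):
        k1 = rhs(s)
        k2 = rhs(s + 0.5 * dt * k1)
        k3 = rhs(s + 0.5 * dt * k2)
        k4 = rhs(s + dt * k3)
        s = s + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        if i >= steps - keep:
            xs.append(s[0])
    return np.array(xs)

xs = simulate()
```

For weak coupling the self-excited oscillation survives: starting from a small perturbation, x grows onto a bounded orbit with amplitude near the classical Van der Pol limit-cycle value of about 2, rather than decaying (no quenching at these assumed parameters).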
International Nuclear Information System (INIS)
Arie, K.; Suzuki, M.; Kawashima, M.; Igashira, M.; Shimizu, A.; Fujii-e, Y.
1995-01-01
Feasibility of FP burning while maintaining fuel breeding capability in the Self-Consistent Nuclear Energy System is evaluated through neutron balance considerations and a fast reactor core design. It is shown that all radioactive FPs produced by the system itself can be burnt in a fast reactor while maintaining breeding capability, assuming separation of radioactive and stable FP isotopes. Assuming that the recovery system for fuel and the FPs to be burnt is based on a pyro-chemical process, the major long-lived FPs (I, Pd, Tc, Sn, Se) can be burnt while keeping breeding capability by suitably arranging materials in the fast reactor core. (Author)
Energy Technology Data Exchange (ETDEWEB)
Parlak, Koray Sener; Oezdemir, Mehmet [Dept. of Electrical and Electronic Engineering, Firat University, Elazig, 23119 (Turkey); Aydemir, M. Timur [Dept. of Electrical and Electronic Engineering, Gazi University, Maltepe-Ankara 06570 (Turkey)
2009-06-15
A distributed power system consisting of two uninterruptible power supplies (UPSs) is investigated in this paper. Parallel operation of the two sources increases the installed power rating of the system. One of the sources can supply the system even when the other is disconnected due to a fault, which is an important feature. The control algorithm makes sure that the total load is shared between the supplies in accordance with their rated power levels, and the frequency of the supplies is restored to the rated value after transitions. As the UPSs operate at an optimum power level, losses and faults due to overloading are prevented. The units operate safely without any means of communication between each other. The focus of the work is on the inverter stages of the UPSs. Simulations performed in the Matlab Simulink environment have been verified with experimental work via a DS1103 controller card. (author)
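Communication-free proportional load sharing between paralleled inverters is usually achieved with P-f droop: each unit lowers its frequency slightly as its output power rises, and since paralleled units must settle at a common frequency, the load splits in proportion to the ratings. The sketch below shows only this steady-state droop arithmetic, not the paper's controller; the droop depth and ratings are hypothetical round numbers (a secondary action, as the abstract notes, then restores frequency to the rated value).

```python
def droop_share(total_load, ratings, df_max=0.5):
    """Steady-state P-f droop sharing for paralleled UPS inverters.
    Each unit i droops as f = f0 - k_i * P_i with k_i = df_max / P_rated_i,
    so every unit reaches full rating at the same frequency deviation.
    In steady state all units run at one frequency, hence loads split in
    proportion to the ratings. Illustrative sketch; numbers hypothetical."""
    ks = [df_max / r for r in ratings]
    # Common frequency deviation df solves: total_load = sum(df / k_i)
    df = total_load / sum(1.0 / k for k in ks)
    return [df / k for k in ks]

shares = droop_share(9.0, [10.0, 5.0])  # 9 kW load, 10 kW and 5 kW units
print(shares)  # load splits ~2:1, matching the 10:5 ratings
```

Choosing each droop gain inversely proportional to the unit's rating is what makes the per-unit loading equal, so neither supply is overloaded while the other idles.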
DEFF Research Database (Denmark)
Yang, Laurence; Tan, Justin; O'Brien, Edward J.
2015-01-01
Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood at the systems level, and provides a basis for computing essential cell functions is lacking. Here, we use a systems biology-based genome-scale model of metabolism and expression to define a functional core proteome consisting of 356 gene products, accounting for 44% of the Escherichia coli proteome by mass based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma genitalium)…
Directory of Open Access Journals (Sweden)
Xiaoxiao Meng
2018-01-01
Full Text Available AC microgrid mainly comprise inverter-interfaced distributed generators (IIDGs, which are nonlinear complex systems with multiple time scales, including frequency control, time delay measurements, and electromagnetic transients. The droop control-based IIDG in an AC microgrid is selected as the research object in this study, which comprises power droop controller, voltage- and current-loop controllers, and filter and line. The multi-time scale characteristics of the detailed IIDG model are divided based on singular perturbation theory. In addition, the IIDG model order is reduced by neglecting the system fast dynamics. The static and transient stability consistency of the IIDG model order reduction are demonstrated by extracting features of the IIDG small signal model and using the quadratic approximation method of the stability region boundary, respectively. The dynamic response consistencies of the IIDG model order reduction are evaluated using the frequency, damping and amplitude features extracted by the Prony transformation. Results are applicable to provide a simplified model for the dynamic characteristic analysis of IIDG systems in AC microgrid. The accuracy of the proposed method is verified by using the eigenvalue comparison, the transient stability index comparison and the dynamic time-domain simulation.
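The Prony-based feature extraction mentioned above (frequency, damping, and amplitude of a response) can be illustrated in a few lines. The signal below is synthetic and the model order is fixed at two; this is a sketch of classical Prony analysis only, not the paper's evaluation pipeline.

```python
import numpy as np

def prony_modes(y, dt, order=2):
    """Classical Prony analysis: fit an order-p linear prediction
    y[n] = a1*y[n-1] + ... + ap*y[n-p], then map the roots z of the
    characteristic polynomial to continuous-time damping sigma and
    angular frequency omega via z = exp((sigma + 1j*omega) * dt)."""
    p = order
    # Linear-prediction data matrix: column k holds y[n-1-k] for n = p..N-1
    A = np.column_stack([y[p - 1 - k: len(y) - 1 - k] for k in range(p)])
    a, *_ = np.linalg.lstsq(A, y[p:], rcond=None)
    roots = np.roots(np.concatenate(([1.0], -a)))
    s = np.log(roots.astype(complex)) / dt
    return s.real, np.abs(s.imag)      # damping sigma, angular frequency

dt = 0.01
t = np.arange(0.0, 2.0, dt)
y = np.exp(-0.8 * t) * np.cos(2 * np.pi * 3.0 * t)  # sigma = -0.8, f = 3 Hz
sig, w = prony_modes(y, dt)
```

Comparing (sigma, omega) extracted from the full and the reduced model's responses is exactly the kind of dynamic-consistency check the abstract describes; on this clean two-exponential signal the fit recovers the true damping and frequency.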
DEFF Research Database (Denmark)
Cachorro, Irene Albacete; Daraban, Iulia Maria; Lainé, Guillaume
2013-01-01
…with natural gas. The natural gas is first converted to a mixture of H2 and CO, which feeds the anode after a preheating step. The cathode is supplied with preheated air and gives, as output, electrical energy. The anode output is the exhaust gas, which represents the thermal energy reservoir for heating… The heat pump is a heat-driven system and runs on the heat recovered by a heat exchanger from the exhaust gases of the SOFC. The working fluid pair is NH3-H2O and is driven through two evaporators working at two different pressures. Thus, the heat pump will operate at three pressure levels…
International Nuclear Information System (INIS)
Kawashima, Masatoshi; Arie, Kazuo; Araki, Yoshio; Sato, Mitsuyoshi; Mori, Kenji; Nakayama, Yoshiyuki; Nakazono, Ryuichi; Kuroda, Yuji; Ishiguma, Kazuo; Fujii-e, Yoichi
2008-01-01
A sustainable nuclear energy system was developed based on the concept of the Self-Consistent Nuclear Energy System (SCNES). Our study shows that a trans-uranium (TRU) metallic fuel fast reactor cycle, coupled with recycling of five long-lived fission products (LLFPs) as well as actinides, is the most promising system for sustainable nuclear utilization. Efficient utilization of uranium-238 through the SCNES concept opens the door to prolonging the lifetime of nuclear energy systems towards several tens of thousands of years. Recent evolution of the concept revealed the compatibility of fuel sustainability, minor actinide (MA) minimization, and non-proliferation aspects for the peaceful use of nuclear energy systems. For TRU compositions stabilized under fast neutron spectra, the plutonium isotope fractions remain within the reactor-grade classification, with a high fraction of the Pu-240 isotope. Recent evolution of the SCNES concept has revealed that TRU recycling can cope with enhanced non-proliferation efforts in peaceful use through the 'no-blanket and multi-zoning core' concept. Therefore, the realization of SCNES is most important. In addition, along the path to these goals, a three-step approach is proposed to solve concurrent problems raised in LWR systems. We discussed possible roles and contributions to near-future demand along the worldwide expansion of LWR capacities by applying the 1st-generation SCNES. MA fractions in TRU are more than 10% in LWR discharged fuels, and even higher, up to 20%, in fuels from long interim storage. TRU recycling in the 1st-generation SCNES system can reduce the MA fractions down to 4-5% in a few decades. This capability significantly relieves MA pressures downstream of LWR systems. Current efforts to enhance energy generation capacity with LWR systems are effective against the global warming crisis. In parallel to those movements, early realization of the SCNES concept can be the most viable decision
Time-dependent restricted-active-space self-consistent-field theory for bosonic many-body systems
International Nuclear Information System (INIS)
Lévêque, Camille; Madsen, Lars Bojer
2017-01-01
We develop an ab initio time-dependent wavefunction-based theory for the description of a many-body system of cold interacting bosons. Like the multi-configurational time-dependent Hartree method for bosons (MCTDHB), the theory is based on a configuration interaction Ansatz for the many-body wavefunction with time-dependent self-consistent-field orbitals. The theory generalizes the MCTDHB method by incorporating restrictions on the active space of the orbital excitations. The restrictions are specified based on the physical situation at hand. The equations of motion of this time-dependent restricted-active-space self-consistent-field (TD-RASSCF) theory are derived. The similarity between the formal development of the theory for bosons and fermions is discussed. The restrictions on the active space allow the theory to be evaluated under conditions where other wavefunction-based methods cannot be, due to exponential scaling of the numerical effort, and to clearly identify the excitations that are important for an accurate description, significantly beyond the mean-field approach. For ground state calculations we find it important to allow a few particles the freedom to move in many orbitals, an insight facilitated by the flexibility of the restricted-active-space Ansatz. Moreover, we find that high accuracy can be obtained by including only even excitations in the many-body self-consistent-field wavefunction. Time-dependent simulations of harmonically trapped bosons subject to a quenching of their noncontact interaction show failure of the mean-field Gross-Pitaevskii approach within a fraction of a harmonic oscillation period. The TD-RASSCF theory remains accurate at much reduced computational cost compared to the MCTDHB method. Exploring the effect of changes of the restricted active space allows us to identify that the even self-consistent-field excitations are mainly responsible for the accuracy of the method. (paper)
Energy Technology Data Exchange (ETDEWEB)
Johnson, B. C.; Melosh, H. J. [Department of Physics, Purdue University, 525 Northwestern Avenue, West Lafayette, IN 47907 (United States); Lisse, C. M. [JHU-APL, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Chen, C. H. [STScI, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Wyatt, M. C. [Institute of Astronomy, University of Cambridge, Madingley Road, Cambridge CB3 0HA (United Kingdom); Thebault, P. [LESIA, Observatoire de Paris, F-92195 Meudon Principal Cedex (France); Henning, W. G. [NASA Goddard Space Flight Center, 8800 Greenbelt Road, Greenbelt, MD 20771 (United States); Gaidos, E. [Department of Geology and Geophysics, University of Hawaii at Manoa, Honolulu, HI 96822 (United States); Elkins-Tanton, L. T. [Department of Terrestrial Magnetism, Carnegie Institution for Science, Washington, DC 20015 (United States); Bridges, J. C. [Department of Physics and Astronomy, University of Leicester, Leicester LE1 7RH (United Kingdom); Morlok, A., E-mail: johns477@purdue.edu [Department of Physical Sciences, Open University, Walton Hall, Milton Keynes MK7 6AA (United Kingdom)
2012-12-10
Spectral modeling of the large infrared excess in the Spitzer IRS spectra of HD 172555 suggests that there is more than 10^19 kg of submicron dust in the system. Using physical arguments and constraints from observations, we rule out the possibility of the infrared excess being created by a magma ocean planet or a circumplanetary disk or torus. We show that the infrared excess is consistent with a circumstellar debris disk or torus, located at ~6 AU, that was created by a planetary scale hypervelocity impact. We find that radiation pressure should remove submicron dust from the debris disk in less than one year. However, the system's mid-infrared photometric flux, dominated by submicron grains, has been stable within 4% over the last 27 years, from the Infrared Astronomical Satellite (1983) to WISE (2010). Our new spectral modeling work and calculations of the radiation pressure on fine dust in HD 172555 provide a self-consistent explanation for this apparent contradiction. We also explore the unconfirmed claim that ~10^47 molecules of SiO vapor are needed to explain an emission feature at ~8 μm in the Spitzer IRS spectrum of HD 172555. We find that unless there are ~10^48 atoms or 0.05 M⊕ of atomic Si and O vapor in the system, SiO vapor should be destroyed by photo-dissociation in less than 0.2 years. We argue that a second plausible explanation for the ~8 μm feature can be emission from solid SiO, which naturally occurs in submicron silicate "smokes" created by quickly condensing vaporized silicate.
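The claim that radiation pressure should expel submicron dust within a year rests on the standard ratio beta of radiation pressure to stellar gravity: grains released on circular orbits with beta > 0.5 are unbound. The back-of-envelope sketch below uses the textbook geometric-optics formula; the stellar luminosity and mass are assumed round values for an HD 172555-like A-type star, not numbers from the paper.

```python
import math

# Ratio of radiation pressure to gravity for a spherical grain
# (geometric optics, radiation-pressure efficiency Q_pr ~ 1):
#   beta = 3 * L * Q_pr / (16 * pi * G * M * c * rho * s)
# Grains released from a circular orbit with beta > 0.5 are unbound.
G = 6.674e-11                 # m^3 kg^-1 s^-2
c = 2.998e8                   # m/s
Lsun, Msun = 3.828e26, 1.989e30

def beta(s, L=9.5 * Lsun, M=1.8 * Msun, rho=3000.0, Qpr=1.0):
    """s: grain radius [m]. L and M are assumed HD 172555-like values
    (hypothetical round numbers); rho is an assumed silicate density."""
    return 3.0 * L * Qpr / (16.0 * math.pi * G * M * c * rho * s)

b = beta(0.5e-6)   # a 0.5 micron grain: beta of order unity
```

Under these assumptions a 0.5 micron grain has beta well above 0.5 and leaves on a hyperbolic orbit within roughly an orbital time, while a 100 micron grain stays bound, which is why continuous replenishment is needed to keep the submicron-dominated flux stable for 27 years.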
Yang, Laurence; Tan, Justin; O'Brien, Edward J; Monk, Jonathan M; Kim, Donghyuk; Li, Howard J; Charusanti, Pep; Ebrahim, Ali; Lloyd, Colton J; Yurkovich, James T; Du, Bin; Dräger, Andreas; Thomas, Alex; Sun, Yuekai; Saunders, Michael A; Palsson, Bernhard O
2015-08-25
Finding the minimal set of gene functions needed to sustain life is of both fundamental and practical importance. Minimal gene lists have been proposed by using comparative genomics-based core proteome definitions. A definition of a core proteome that is supported by empirical data, is understood at the systems-level, and provides a basis for computing essential cell functions is lacking. Here, we use a systems biology-based genome-scale model of metabolism and expression to define a functional core proteome consisting of 356 gene products, accounting for 44% of the Escherichia coli proteome by mass based on proteomics data. This systems biology core proteome includes 212 genes not found in previous comparative genomics-based core proteome definitions, accounts for 65% of known essential genes in E. coli, and has 78% gene function overlap with minimal genomes (Buchnera aphidicola and Mycoplasma genitalium). Based on transcriptomics data across environmental and genetic backgrounds, the systems biology core proteome is significantly enriched in nondifferentially expressed genes and depleted in differentially expressed genes. Compared with the noncore, core gene expression levels are also similar across genetic backgrounds (two times higher Spearman rank correlation) and exhibit significantly more complex transcriptional and posttranscriptional regulatory features (40% more transcription start sites per gene, 22% longer 5'UTR). Thus, genome-scale systems biology approaches rigorously identify a functional core proteome needed to support growth. This framework, validated by using high-throughput datasets, facilitates a mechanistic understanding of systems-level core proteome function through in silico models; it de facto defines a paleome.
International Nuclear Information System (INIS)
Zhang, Bo; Ye, Xianggui; Edwards, Brian J.
2013-01-01
A combination of self-consistent field theory and density functional theory was used to examine the stable, 3-dimensional equilibrium morphologies formed by diblock copolymers with a tethered nanoparticle attached either between the two blocks or at the end of one of the blocks. Both neutral and interacting particles were examined, with and without favorable/unfavorable energetic potentials between the particles and the block segments. The phase diagrams of the various systems were constructed, allowing the identification of three types of ordered mesophases composed of lamellae, hexagonally packed cylinders, and spheroids. In particular, we examined the conditions under which the mesophases could be generated wherein the tethered particles were primarily located within the interface between the two blocks of the copolymer. Key factors influencing these properties were determined to be the particle position along the diblock chain, the interaction potentials of the blocks and particles, the block copolymer composition, and molecular weight of the copolymer
Hoteit, Ibrahim
2010-03-02
An eddy-permitting adjoint-based assimilation system has been implemented to estimate the state of the tropical Pacific Ocean. The system uses the Massachusetts Institute of Technology's general circulation model and its adjoint. The adjoint method is used to adjust the model to observations by controlling the initial temperature and salinity; temperature, salinity, and horizontal velocities at the open boundaries; and surface fluxes of momentum, heat, and freshwater. The model is constrained with most of the available data sets in the tropical Pacific, including Tropical Atmosphere and Ocean, ARGO, expendable bathythermograph, and satellite SST and sea surface height data, and climatologies. Results of hindcast experiments in 2000 suggest that the iterated adjoint-based descent is able to significantly improve the model consistency with the multivariate data sets, providing a dynamically consistent realization of the tropical Pacific circulation that generally matches the observations to within specified errors. The estimated model state is evaluated both by comparisons with observations and by checking the controls, the momentum balances, and the representation of small-scale features that were not well sampled by the observations used in the assimilation. As part of these checks, the estimated controls are smoothed and applied in independent model runs to check that small changes in the controls do not greatly change the model hindcast. This is a simple ensemble-based uncertainty analysis. In addition, the original and smoothed controls are applied to a version of the model with doubled horizontal resolution resulting in a broadly similar “downscaled” hindcast, showing that the adjustments are not tuned to a single configuration (meaning resolution, topography, and parameter settings). The time-evolving model state and the adjusted controls should be useful for analysis or to supply the forcing, initial, and boundary conditions for runs of other models.
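The core of the adjoint method described above (adjust a control, here the initial condition, by descending the gradient of a model-observation misfit computed with a reverse sweep) can be sketched on a toy scalar model. The decay model, observation noise, step size, and iteration count are all illustrative assumptions, not the MITgcm configuration:

```python
import numpy as np

# Toy 4D-Var: estimate the initial condition x0 of the scalar decay model
# x_{k+1} = a * x_k so that the trajectory fits noisy observations.
a, n = 0.95, 40
rng = np.random.default_rng(1)
x0_true = 3.0
truth = x0_true * a ** np.arange(n)
obs = truth + rng.normal(0, 0.05, n)

def forward(x0):
    return x0 * a ** np.arange(n)

def adjoint_gradient(x0):
    # For J = 0.5 * sum_k (x_k - y_k)^2, the adjoint (chain rule backwards
    # through the linear model) gives dJ/dx0 = sum_k a^k * (x_k - y_k).
    misfit = forward(x0) - obs
    return np.sum(misfit * a ** np.arange(n))

x0 = 0.0                       # first guess
for _ in range(200):           # steepest-descent iterations
    x0 -= 0.01 * adjoint_gradient(x0)
print(f"estimated x0 = {x0:.3f} (truth {x0_true})")
```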
SU-E-J-211: Assessing the Consistency of the ViewRay 0.35 T MRI System
International Nuclear Information System (INIS)
Yan, Y; Saenz, D; Bayouth, J; Paliwal, B
2015-01-01
Purpose: ViewRay is a novel image-guided radiotherapy approach with an integrated 0.35 T MR unit and three Cobalt-60 heads. In order to continuously ensure the high resolution and high soft tissue contrast available in MR images, we quantified a multitude of relevant imaging parameters over a period of six months to establish its stability and imaging quality. In the assessment process, consideration is also given to the need to establish the number of tests required to have confidence in the performance of the system for radiation therapy planning applications. Methods: Daily, weekly, monthly and annual imaging tests were performed over a period of six months using standardized phantoms (a 24 cm diameter sphere and the ACR phantom) to quantify the performance of the system. In addition to the ACR and NEMA recommended tests, we also included element testing, spatial integrity, eddy current and magnetic field inhomogeneity measurements. The ACR test is used for assessing the following parameters for T1 and T2: geometric accuracy, high contrast spatial resolution, slice thickness accuracy, slice position accuracy, percent signal ghosting, and low contrast object detectability. It also includes percent image uniformity (PIU) for T1. The NEMA test is primarily designed to check SNR and PIU. Results: Over the period of six months, all the parameters were maintained within the recommendations provided in the ACR and NEMA standards. PIU and SNR were found to be sensitive to malfunctions in the components of the multileaf collimators. Details of the findings will be presented. Conclusion: The data suggest that the ViewRay imaging system has functioned in a consistent and reliable manner. MR imaging from the ViewRay 0.35 T system complies with the ACR and NEMA recommended acceptance standards.
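Two of the metrics tracked here, percent image uniformity (PIU = 100·(1 − (Smax − Smin)/(Smax + Smin))) and SNR via the difference-image method, can be sketched on synthetic phantom data. The 8×8 block averaging below is a crude stand-in for the standard's ROI procedure, and the signal and noise levels are assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two repeated acquisitions of a uniform phantom slice (synthetic data).
signal = 1000.0
img1 = signal + rng.normal(0, 20, (128, 128))
img2 = signal + rng.normal(0, 20, (128, 128))

# Percent image uniformity: 100 * (1 - (max - min)/(max + min)), measured
# on a block-averaged mean image to suppress per-pixel noise.
mean_img = (img1 + img2) / 2
blocks = mean_img.reshape(16, 8, 16, 8).mean(axis=(1, 3))   # 8x8 averaging
piu = 100 * (1 - (blocks.max() - blocks.min()) / (blocks.max() + blocks.min()))

# SNR via the difference-image method: noise std from (img1 - img2)/sqrt(2),
# so that structured (non-random) signal cancels out.
noise_sd = (img1 - img2).std() / np.sqrt(2)
snr = mean_img.mean() / noise_sd
print(f"PIU = {piu:.1f}%  SNR = {snr:.1f}")
```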
Hamm, L. L.; Vanbrunt, V.
1982-08-01
The numerical solution of the ordinary differential equation which describes the high-pressure vapor-liquid equilibria of a binary system, where one of the components is supercritical and exists as a noncondensable gas in the pure state, is considered with emphasis on the implicit Runge-Kutta and orthogonal collocation methods. Some preliminary results indicate that the implicit Runge-Kutta method is superior. Due to the extreme nonlinearity of thermodynamic properties in the region near the critical locus, an extended cubic spline fitting technique is devised for correlating the P-x data. The least-squares criterion is employed in smoothing the experimental data. The technique could easily be applied to any thermodynamic data by changing the endpoint requirements. The volumetric behavior of the systems must be given or predicted in order to perform thermodynamic consistency tests. A general procedure is developed for predicting the volumetric behavior required, and some indication as to the expected limit of accuracy is given.
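An implicit Runge-Kutta step of the kind favored here can be sketched with the one-stage implicit midpoint rule on a stiff test equation; the problem, step size, and Newton tolerance below are illustrative, not the vapor-liquid equilibrium ODE of the report:

```python
import numpy as np

# Implicit midpoint rule (a 1-stage implicit Runge-Kutta method) applied to
# the stiff test problem y' = lam*(y - sin t) + cos t, y(0) = 1, whose exact
# solution y(t) = sin t + exp(lam*t) rapidly collapses onto sin t.
lam = -1000.0
f = lambda t, y: lam * (y - np.sin(t)) + np.cos(t)

def implicit_midpoint(y, t, h, newton_iters=8):
    # Solve the stage equation k = f(t + h/2, y + h/2 * k) by Newton's method.
    k = f(t, y)                     # initial guess: explicit slope
    for _ in range(newton_iters):
        g = k - f(t + h / 2, y + h / 2 * k)
        dg = 1 - lam * h / 2        # dg/dk for this right-hand side
        k -= g / dg
    return y + h * k

# h = 0.05 is far above the explicit-Euler stability limit ~ 2/|lam| = 0.002,
# yet the A-stable implicit method remains well behaved.
t, y, h = 0.0, 1.0, 0.05
for _ in range(40):
    y = implicit_midpoint(y, t, h)
    t += h
print(f"y({t:.1f}) = {y:.4f}, exact = {np.sin(t):.4f}")
```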
Consistency of orthodox gravity
Energy Technology Data Exchange (ETDEWEB)
Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)
1997-01-01
A recent proposal for quantizing gravity is investigated for self-consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of the argument to more complex systems.
Internally consistent thermodynamic data for aqueous species in the system Na-K-Al-Si-O-H-Cl
Miron, George D.; Wagner, Thomas; Kulik, Dmitrii A.; Heinrich, Christoph A.
2016-08-01
A large amount of critically evaluated experimental data on mineral solubility, covering the entire Na-K-Al-Si-O-H-Cl system over wide ranges in temperature and pressure, was used to simultaneously refine the standard state Gibbs energies of aqueous ions and complexes in the framework of the revised Helgeson-Kirkham-Flowers equation of state. The thermodynamic properties of the solubility-controlling minerals were adopted from the internally consistent dataset of Holland and Powell (2002; Thermocalc dataset ds55). The global optimization of Gibbs energies of aqueous species, performed with the GEMSFITS code (Miron et al., 2015), was set up in such a way that the association equilibria for ion pairs and complexes, independently derived from conductance and potentiometric data, are always maintained. This was achieved by introducing reaction constraints into the parameter optimization that adjust Gibbs energies of complexes by their respective Gibbs energy effects of reaction, whenever the Gibbs energies of reactant species (ions) are changed. The optimized thermodynamic dataset is reported with confidence intervals for all parameters evaluated by Monte Carlo trial calculations. The new thermodynamic dataset is shown to reproduce all available fluid-mineral phase equilibria and mineral solubility data with good accuracy and precision over wide ranges in temperature (25-800 °C), pressure (1 bar to 5 kbar) and composition (salt concentrations up to 5 molal). The global data optimization process adopted in this study can be readily repeated any time when extensions to new chemical elements and species are needed, when new experimental data become available, or when a different aqueous activity model or equation of state should be used. This work serves as a proof of concept that our optimization strategy is feasible and successful in generating a thermodynamic dataset reproducing all fluid-mineral and aqueous speciation equilibria in the Na-K-Al-Si-O-H-Cl system within
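The reaction-constraint scheme described above can be sketched in a few lines: whenever an ion's standard Gibbs energy is adjusted during optimization, every complex formed from it is shifted by the stoichiometric amount so that the Gibbs energy of its formation reaction (and hence the independently derived logK) is preserved. The species, values, and stoichiometries below are illustrative, not the GEMSFITS dataset's:

```python
# Illustrative species and standard Gibbs energies (kJ/mol); the numbers
# are placeholders, not the refined dataset values.
formation = {                      # complex: {reactant ion: stoichiometry}
    "NaCl(aq)": {"Na+": 1, "Cl-": 1},
    "KCl(aq)":  {"K+": 1, "Cl-": 1},
}
G = {"Na+": -261.9, "K+": -282.5, "Cl-": -131.2,
     "NaCl(aq)": -388.0, "KCl(aq)": -400.0}

def apply_ion_update(G, ion, dG):
    """Shift an ion's Gibbs energy and propagate the shift to every complex
    containing it, keeping each formation reaction's delta-G fixed."""
    G = dict(G)
    G[ion] += dG
    for cplx, stoich in formation.items():
        if ion in stoich:
            G[cplx] += stoich[ion] * dG
    return G

dG_rxn_before = G["NaCl(aq)"] - G["Na+"] - G["Cl-"]
G2 = apply_ion_update(G, "Na+", +0.5)
dG_rxn_after = G2["NaCl(aq)"] - G2["Na+"] - G2["Cl-"]
print(f"delta-G(Na+ + Cl- = NaCl(aq)): {dG_rxn_before:.2f} -> {dG_rxn_after:.2f}")
```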
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way in which human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
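A consistency rule of the kind the approach automates can be sketched as a check between two UML views; the rule, diagram encoding, and names below are hypothetical illustrations, not the paper's rule set:

```python
# Illustrative inter-diagram consistency rule: every message sent in a
# sequence diagram must be an operation declared on the receiving class
# in the class diagram. Model encoding and names are hypothetical.
class_diagram = {
    "Order":   {"operations": {"addItem", "total"}},
    "Payment": {"operations": {"authorize"}},
}
sequence_diagram = [                 # (sender, receiver, message)
    ("Customer", "Order", "addItem"),
    ("Customer", "Order", "total"),
    ("Order", "Payment", "authorize"),
    ("Order", "Payment", "refund"),  # not declared -> inconsistency
]

def check_message_consistency(classes, messages):
    errors = []
    for sender, receiver, msg in messages:
        ops = classes.get(receiver, {}).get("operations", set())
        if msg not in ops:
            errors.append(f"{receiver} has no operation '{msg}'")
    return errors

print(check_message_consistency(class_diagram, sequence_diagram))
```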
Bullis, Michael; And Others
1994-01-01
The Transition Competence Battery for Deaf Adolescents and Young Adults consists of six subtests related to employment and independent living, and uses booklets and videotapes/videodisks with questions presented in sign language. Comparison of alternative formats indicated that group administration of the multiple-choice format produced acceptable…
International Nuclear Information System (INIS)
Guest, Geoffrey; Bright, Ryan M.; Cherubini, Francesco; Strømman, Anders H.
2013-01-01
Temporary and permanent carbon storage from biogenic sources is seen as a way to mitigate climate change. The aim of this work is to illustrate the need to harmonize the quantification of such mitigation across all possible storage pools in the bio- and anthroposphere. We investigate nine alternative storage cases and a wide array of bio-resource pools: from annual crops, short rotation woody crops, medium rotation temperate forests, and long rotation boreal forests. For each feedstock type and biogenic carbon storage pool, we quantify the carbon cycle climate impact due to the skewed time distribution between emission and sequestration fluxes in the bio- and anthroposphere. Additional consideration of the climate impact from albedo changes in forests is also illustrated for the boreal forest case. When characterizing climate impact with global warming potentials (GWP), we find a large variance in results which is attributed to different combinations of biomass storage and feedstock systems. The storage of biogenic carbon in any storage pool does not always confer climate benefits: even when biogenic carbon is stored long-term in durable product pools, the climate outcome may still be undesirable when the carbon is sourced from slow-growing biomass feedstock. For example, when biogenic carbon from Norway Spruce from Norway is stored in furniture with a mean lifetime of 43 years, a climate change impact of 0.08 kg CO₂-eq per kg CO₂ stored (100 year time horizon (TH)) would result. It was also found that when biogenic carbon is stored in a pool with negligible leakage to the atmosphere, the resulting GWP factor is not necessarily −1 kg CO₂-eq per kg CO₂ stored. As an example, when biogenic CO₂ from Norway Spruce biomass is stored in geological reservoirs with no leakage, we estimate a GWP of −0.56 kg CO₂-eq per kg CO₂ stored (100 year TH) when albedo effects are also included. The large variance in GWPs across the range of resource and carbon storage…
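The basic mechanism behind a nonzero GWP for temporarily stored carbon (re-emission after a delay removes only part of the integrated radiative forcing within the time horizon) can be sketched with a strongly simplified single-exponential CO₂ impulse response. The real Bern/Joos response is a multi-term sum and the decay time used here is only an illustrative assumption, so the numbers are not comparable to the paper's:

```python
import math

# Hedged sketch: GWP-style factor for CO2 re-emitted after a storage delay,
# assuming a SINGLE-exponential atmospheric impulse response (a deliberate
# simplification of the multi-term Joos/Bern response).
TH = 100.0     # time horizon, years
tau = 172.9    # illustrative decay time for the slow CO2 pool, years

def agwp(duration):
    # integral of exp(-t/tau) from 0 to duration; the constant radiative
    # efficiency cancels when taking ratios, so it is omitted
    return tau * (1 - math.exp(-duration / tau))

def storage_gwp_factor(delay):
    """Impact of 1 kg CO2 re-emitted after `delay` years, relative to
    1 kg emitted immediately, both evaluated over the horizon TH."""
    if delay >= TH:
        return 0.0
    return agwp(TH - delay) / agwp(TH)

for d in (0, 43, 100):
    print(f"delay {d:>3} yr -> impact factor {storage_gwp_factor(d):.2f}")
```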
Interactive Videodisc Technology: Applications to the Air Command and Staff College Curriculum.
1988-04-01
Objectives for Executive and NSC system; Congress; Military; Intelligence community; Media; National environment; Transcultural communications; Global challenges … Cuban missile crisis. REGIONAL STUDIES: USSR AND EUROPE: Superpower global objectives; The Soviet Union: background; The Soviet political-economic system … summary; National security affairs review; The crisis game. WARFARE STUDIES; MILITARY HISTORY AND THEORY: Overview of thinking about war; Sun Tzu; Great …
Serfon, Cedric; The ATLAS collaboration
2016-01-01
One of the biggest challenges in large-scale data management systems is to ensure consistency between the global file catalog and what is physically present on the storage elements. To tackle this issue, the Rucio software used by the ATLAS Distributed Data Management system has been extended to automatically handle lost or unregistered files (so-called dark data). The system automatically detects these inconsistencies and takes actions, such as recovery or deletion of unneeded files, in a central manner. In this talk, we present this system, explain its internals and give some results.
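The core of such a catalog-versus-storage consistency check is a set difference between a catalog dump and a storage dump; the file names and layout below are illustrative, not the actual Rucio schema:

```python
# Sketch of a consistency check between a file catalog and a storage dump.
# Files registered but absent from storage are "lost" (candidates for
# recovery); files on storage but unregistered are "dark" (candidates
# for deletion). Paths are hypothetical.
catalog = {"data16/f1.root", "data16/f2.root", "data16/f3.root"}
storage_dump = {"data16/f2.root", "data16/f3.root", "data16/orphan.root"}

lost_files = catalog - storage_dump        # registered but missing
dark_files = storage_dump - catalog        # on disk but unregistered

print("lost:", sorted(lost_files))
print("dark:", sorted(dark_files))
```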
International Nuclear Information System (INIS)
Singh, Shakti; Singh, Mukesh; Kaushik, Subhash Chandra
2016-01-01
Highlights: • A cost-effective hybrid PV-wind-biomass energy system with storage is proposed. • Mathematical modeling and the operational strategy of the proposed system are discussed. • Optimal sizing of components is evaluated using evolutionary algorithms. • The results obtained are compared with the software tool HOMER. • The performance of the hybrid system in a critical case is presented. - Abstract: Renewable energy systems are proving to be promising and environmentally friendly sources of electricity generation, particularly in countries with inadequate fossil fuel resources. In recent years, wind, solar photovoltaic (PV) and biomass based systems have been drawing more attention to provide electricity to isolated or energy deficient regions. This paper presents a hybrid PV-wind generation system along with biomass and storage to fulfill the electrical load demand of a small area. For optimal sizing of components, a recently introduced swarm based artificial bee colony (ABC) algorithm is applied. To verify the strength of the proposed technique, the results are compared with the results obtained from the standard software tool, hybrid optimization model for electric renewable (HOMER), and the particle swarm optimization (PSO) algorithm. It has been verified from the results that the ABC algorithm has good convergence properties and the ability to provide good quality results. Further, for a critical case such as the failure of any source, the behavior of the proposed system has been observed. It is evident from the results that the proposed scheme is able to manage a smooth power flow with the same optimal configuration.
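A minimal artificial bee colony (ABC) loop, of the kind applied here for component sizing, can be sketched as follows. The sphere function stands in for the hybrid-system cost model, and all parameters (colony size, scout limit, bounds) are illustrative assumptions:

```python
import numpy as np

# Minimal ABC sketch: employed, onlooker, and scout bee phases minimizing
# a placeholder cost function over a 2-D search box.
rng = np.random.default_rng(3)
dim, n_food, limit, iters = 2, 10, 20, 200
lo, hi = -5.0, 5.0
cost = lambda x: float(np.sum(x ** 2))   # stand-in for the sizing cost model

foods = rng.uniform(lo, hi, (n_food, dim))
fits = np.array([cost(x) for x in foods])
trials = np.zeros(n_food, dtype=int)

def neighbor(i):
    k = rng.integers(n_food - 1)
    k = k if k < i else k + 1            # random partner index != i
    phi = rng.uniform(-1, 1, dim)
    return np.clip(foods[i] + phi * (foods[i] - foods[k]), lo, hi)

def greedy(i, cand):
    c = cost(cand)
    if c < fits[i]:
        foods[i], fits[i], trials[i] = cand, c, 0
    else:
        trials[i] += 1

for _ in range(iters):
    for i in range(n_food):              # employed bee phase
        greedy(i, neighbor(i))
    probs = 1 / (1 + fits)               # onlookers prefer better sources
    probs /= probs.sum()
    for _ in range(n_food):              # onlooker bee phase
        i = rng.choice(n_food, p=probs)
        greedy(i, neighbor(i))
    worst = trials.argmax()              # scout bee phase
    if trials[worst] > limit:
        foods[worst] = rng.uniform(lo, hi, dim)
        fits[worst] = cost(foods[worst])
        trials[worst] = 0

best = fits.min()
print(f"best cost after {iters} iterations: {best:.4f}")
```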
Structural Consistency, Consistency, and Sequential Rationality.
Kreps, David M; Ramey, Garey
1987-01-01
Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru…
Weißenberger, Barbara E.; Angelkort, Hendrik
2009-01-01
To provide accounting information for management control purposes, two fundamental options exist: (a) The financial records can be used as a database for management accounting (integrated accounting system design), or (b) the management accounting system used by controllers can be based upon a so-called third set of books besides the financial and tax accounting records. Whereas the latter approach had been typical for firms in German-speaking countries until the 1980s, since then an increasi...
Directory of Open Access Journals (Sweden)
van Dijk Arie PJ
2008-08-01
Full Text Available Abstract Background The method used to delineate the boundary of the right ventricle (RV), relative to the trabeculations and papillary muscles, in cardiovascular magnetic resonance (CMR) ventricular volume analysis may matter more when these structures are hypertrophied than in individuals with normal cardiovascular anatomy. This study aimed to compare two methods of cavity delineation in patients with a systemic RV. Methods Twenty-nine patients (mean age 34.7 ± 12.4 years) with a systemic RV (12 with congenitally corrected transposition of the great arteries (ccTGA) and 17 with atrially switched TGA) underwent CMR. We compared measurements of systemic RV volumes and function using two analysis protocols. The RV trabeculations and papillary muscles were either included in the calculated blood volume, with the boundary drawn immediately within the apparently compacted myocardial layer, or they were manually outlined and excluded. RV stroke volume (SV) calculated using each method was compared with the corresponding left ventricular (LV) SV. Additionally, we compared the differences in analysis time, and in intra- and inter-observer variability, between the two methods. The paired-samples t-test was used to test for differences in volumes, function and analysis time between the two methods. Differences in intra- and inter-observer reproducibility were tested using an extension of the Bland-Altman method. Results The inclusion of trabeculations and papillary muscles in the ventricular volume resulted in higher values for systemic RV end-diastolic volume (mean difference 28.7 ± 10.6 ml, p … Conclusion The choice of method for systemic RV cavity delineation significantly affected volume measurements, given the CMR acquisition and analysis systems used. We recommend delineation outside the trabeculations for routine clinical measurements of systemic RV volumes as this approach took less time and gave more reproducible measurements.
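The Bland-Altman agreement analysis used in this comparison (bias as the mean paired difference, with 95% limits of agreement at bias ± 1.96 SD) can be sketched on synthetic paired volumes. The data below are simulated to mimic the reported mean difference, not the study's measurements:

```python
import numpy as np

# Bland-Altman sketch for two cavity-delineation methods (synthetic, mL).
rng = np.random.default_rng(4)
method_incl = rng.normal(180, 30, 29)                        # trabeculae included
method_excl = method_incl - 28.7 + rng.normal(0, 10.6, 29)   # trabeculae excluded

diff = method_incl - method_excl
bias = diff.mean()
sd = diff.std(ddof=1)
loa = (bias - 1.96 * sd, bias + 1.96 * sd)    # 95% limits of agreement
print(f"bias = {bias:.1f} mL, limits of agreement = "
      f"({loa[0]:.1f}, {loa[1]:.1f}) mL")
```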
The report, a review of the literature on heat flow through powders, was motivated by the use of fine powder systems to produce high thermal resistivities (thermal resistance per unit thickness). The term "superinsulations" has been used to describe this type of material, which ha...
W. Cohen; H. Andersen; S. Healey; G. Moisen; T. Schroeder; C. Woodall; G. Domke; Z. Yang; S. Stehman; R. Kennedy; C. Woodcock; Z. Zhu; J. Vogelmann; D. Steinwand; C. Huang
2014-01-01
The authors are developing a REDD+ MRV system that tests different biomass estimation frameworks and components. Design-based inference from a costly field plot network was compared to sampling with LiDAR strips and a smaller set of plots in combination with Landsat for disturbance monitoring. Biomass estimation uncertainties associated with these different data sets...
Wu, Mengxue; Li, Chen; Yao, Wu
2017-01-11
In cement-based pastes, the relationship between the complex phase assemblage and mechanical properties is usually described by the "gel/space ratio" descriptor. The gel/space ratio is defined as the volume ratio of the gel to the available space in the composite system, and it has been widely studied in the cement unary system. This work determines the gel/space ratio in the cement-silica fume-fly ash ternary system (C-SF-FA system) by measuring the reaction degrees of the cement, SF, and FA. The effects that the supplementary cementitious material (SCM) replacements exert on the evolution of the gel/space ratio are discussed both theoretically and practically. The relationship between the gel/space ratio and compressive strength is then explored, and the relationship disparities for different mix proportions are analyzed in detail. The results demonstrate that the SCM replacements promote the gel/space ratio evolution only when the SCM reaction degree is higher than a certain value, which is calculated and defined as the critical reaction degree (CRD). The effects of the SCM replacements can be predicted based on the CRD, and the theoretical predictions agree with the test results quite well. At low gel/space ratios, disparities in the relationship between the gel/space ratio and the compressive strength are caused by porosity, which has also been studied in cement unary systems. The ratio of cement-produced gel to SCM-produced gel (G_C to G_SCM ratio) is introduced for use in analyzing high gel/space ratios, in which it plays a major role in creating relationship disparities.
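For the unary (plain cement) case mentioned above, the classical Powers gel/space ratio and a Powers-type strength law can be sketched directly; the strength coefficients below are illustrative assumptions, and the paper's ternary extension (with measured SF/FA reaction degrees) is not reproduced here:

```python
# Powers' gel/space ratio for a plain cement paste and a Powers-type
# strength law sigma = sigma0 * X**n. sigma0 and n are illustrative.
def gel_space_ratio(alpha, w_c):
    """X = 0.68*alpha / (0.32*alpha + w/c), alpha = degree of hydration."""
    return 0.68 * alpha / (0.32 * alpha + w_c)

def strength(alpha, w_c, sigma0=120.0, n=3.0):
    return sigma0 * gel_space_ratio(alpha, w_c) ** n

for alpha in (0.5, 0.7, 0.9):
    X = gel_space_ratio(alpha, 0.4)
    print(f"alpha={alpha:.1f}  X={X:.3f}  strength ~ {strength(alpha, 0.4):.0f} MPa")
```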
International Nuclear Information System (INIS)
Zhang, Yuan; Yang, Ke; Li, Xuemei; Xu, Jianzhong
2014-01-01
A simulation model consisting of wind speed, wind turbine and AA-CAES (advanced adiabatic compressed air energy storage) system is developed in this paper, and thermodynamic analysis on energy conversion and transfer in hybrid system is carried out. The impacts of stable wind speed and unstable wind speed on the hybrid system are analyzed and compared from the viewpoint of energy conversion and system efficiency. Besides, energy conversion relationship between wind turbine and AA-CAES system is investigated on the basis of process analysis. The results show that there are several different forms of energy in hybrid system, which have distinct conversion relationship. As to wind turbine, power coefficient determines wind energy utilization efficiency, and in AA-CAES system, it is compressor efficiency that mainly affects energy conversion efficiencies of other components. The strength and fluctuation of wind speed have a direct impact on energy conversion efficiencies of components of hybrid system, and within proper wind speed scope, the maximum of system efficiency could be expected. - Highlights: • A hybrid system consisting of wind, wind turbine and AA-CAES system is established. • Energy conversion in hybrid system with stable and unstable wind speed is analyzed. • Maximum efficiency of hybrid system can be reached within proper wind speed scope. • Thermal energy change in hybrid system is more sensitive to wind speed change. • Compressor efficiency can affect other efficiencies in AA-CAES system
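The energy-conversion chain discussed above (wind kinetic power to turbine electrical power via the power coefficient, then into storage via component efficiencies) can be put in back-of-envelope form. All numbers below are illustrative assumptions, not the paper's AA-CAES model:

```python
import numpy as np

# Wind power: P = 0.5 * rho * A * Cp * v^3, with Cp below the Betz
# limit 16/27; storage round trip approximated by a product of
# compressor, heat-recovery, and expander efficiencies.
rho, radius = 1.225, 40.0           # air density (kg/m3), rotor radius (m)
area = np.pi * radius ** 2

def turbine_power(v, cp=0.42):
    return 0.5 * rho * area * cp * v ** 3   # W

def round_trip(eta_comp=0.85, heat_recovery=0.9, eta_exp=0.88):
    # fraction of turbine electrical energy returned after a full
    # charge-discharge cycle of the AA-CAES store (crude product model)
    return eta_comp * heat_recovery * eta_exp

for v in (6.0, 9.0, 12.0):
    p = turbine_power(v) / 1e6
    print(f"v = {v:4.1f} m/s  turbine {p:6.2f} MW  "
          f"returned after storage ~ {p * round_trip():6.2f} MW")
```

The cubic dependence on wind speed is why fluctuation in wind speed propagates so strongly into the downstream conversion efficiencies discussed in the abstract.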
Comparison of working length control consistency between hand K-files and Mtwo NiTi rotary system.
Krajczár, Károly; Varga, Enikő; Marada, Gyula; Jeges, Sára; Tóth, Vilmos
2016-04-01
The purpose of this study was to investigate the consistency of working length control with hand instrumentation in comparison to engine-driven Mtwo nickel-titanium rotary files. Forty extracted maxillary molars were selected and divided into two parallel groups. The working lengths of the mesiobuccal root canals were estimated. The teeth were fixed in a phantom head. The root canal preparation was carried out in group 1 (n=20) with hand K-files (VDW, Munich, Germany) and in group 2 (n=20) with Mtwo instruments (VDW, Munich, Germany). Vestibulo-oral and mesio-distal directional x-ray images were taken before the preparation, with a #10 K-file inserted into the mesiobuccal root canal to the working length, and after preparation with #25, #30 and #40 files. Working length changes were detected with measurements between the radiological apex and the instrument tips. In the Mtwo group a difference in working length control was detected (p … rotary files). The Mtwo NiTi rotary file therefore proved to be more accurate in comparison to conventional hand instrumentation. Keywords: working length, Mtwo, nickel-titanium, hand preparation, engine-driven preparation.
Yu, Xiaochu; Jiang, Jingmei; Liu, Changwei; Shen, Keng; Wang, Zixing; Han, Wei; Liu, Xingrong; Lin, Guole; Zhang, Ye; Zhang, Ying; Ma, Yufen; Bo, Haixin; Zhao, Yupei
2017-06-15
Surgical safety has emerged as a crucial global health issue in the past two decades. Although several safety-enhancing tools are available, the pace of large-scale improvement remains slow, especially in developing countries such as China. The present project (Modern Surgery and Anesthesia Safety Management System Construction and Promotion) aims to develop and validate system-based integrated approaches for reducing perioperative deaths and complications using a multicentre, multistage design. The project involves collection of clinical and outcome information for 120 000 surgical inpatients at four regionally representative academic/teaching general hospitals in China during three sequential stages: preparation and development, effectiveness validation, and improvement of implementation for promotion. These big data will provide the evidence base for the formulation, validation and improvement processes of a system-based stratified safety intervention package covering the entire surgical pathway. Attention will be directed to managing inherent patient risks and regulating medical safety behaviour. Information technology will facilitate data collection and intervention implementation, provide supervision mechanisms and guarantee transfer of key patient safety messages between departments and personnel. Changes in rates of deaths, surgical complications during hospitalisation, length of stay, system adoption and implementation rates will be analysed to evaluate effectiveness and efficiency. This study was approved by the institutional review boards of Peking Union Medical College Hospital, First Hospital of China Medical University, Qinghai Provincial People's Hospital, Xiangya Hospital Central South University and the Institute of Basic Medical Sciences, Chinese Academy of Medical Sciences. Study findings will be disseminated via peer-reviewed journals, conference presentations and patent papers.
International Nuclear Information System (INIS)
Halbach, K.
1978-01-01
A description is given of some unfinished work that may have a bearing on the problem of producing a small beam spot on a target for heavy ion fusion. One of the important results obtained so far is an existence proof that shows that it is possible, at least in principle, to design systems, containing only quadrupoles and/or solenoids, with vanishing first and second derivatives of the spot size with respect to momentum both at the target and at the exit of the last lens
Directory of Open Access Journals (Sweden)
Yuanbin Yu
2016-01-01
This paper presents a new method for battery degradation estimation using a power-energy (PE) function in a battery/ultracapacitor hybrid energy storage system (HESS), together with an integrated optimization that addresses both parameter matching and control of the HESS. A semiactive topology, with the electric double-layer capacitor (EDLC) coupled directly to the DC-link, is adopted for a hybrid electric city bus (HECB). To quantify the relationship between system parameters and battery service life, data from a 37-minute driving cycle were collected and first decomposed into discharging/charging fragments; an optimal control strategy that maximally uses the available EDLC energy is then presented to split the power demand between the battery and the EDLC. Furthermore, based on a battery degradation model, the conversion of power demand by the PE function and PE matrix is applied to evaluate the relationship between the energy available in the HESS and the service life of the battery pack. The resulting approach, which decouples parameter matching from optimal control of the HESS, is summarized as a procedure for estimating battery degradation and service life in a HESS.
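The split of the cycle power demand between battery and EDLC can be sketched as a simple rule-based decomposition. Everything below (function name, limits, units) is hypothetical and stands in for the paper's optimal control strategy, which it does not reproduce:

```python
# Illustrative sketch (not the paper's algorithm): a greedy split that uses
# available EDLC energy first, so the battery sees a smoother load profile.

def split_power(p_demand, e_edlc, e_edlc_max, p_edlc_max, dt=1.0):
    """Return (p_battery, p_edlc, new EDLC energy) for one time step.

    p_demand   : vehicle power demand in kW (positive = discharge)
    e_edlc     : current EDLC energy in kWh
    e_edlc_max : EDLC capacity in kWh
    p_edlc_max : EDLC power limit in kW
    dt         : step length in seconds
    """
    if p_demand >= 0:  # traction: drain the EDLC first
        available = e_edlc * 3600.0 / dt          # kW deliverable this step
        p_edlc = min(p_edlc_max, p_demand, available)
    else:              # regenerative braking: recharge the EDLC first
        headroom = (e_edlc_max - e_edlc) * 3600.0 / dt
        p_edlc = max(-p_edlc_max, p_demand, -headroom)
    p_batt = p_demand - p_edlc                     # battery covers the rest
    e_edlc_new = e_edlc - p_edlc * dt / 3600.0     # update stored energy
    return p_batt, p_edlc, e_edlc_new
```

Run over the discharging/charging fragments of a drive cycle, the resulting battery power series is what a degradation model would then consume.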
International Nuclear Information System (INIS)
Leyendecker, Sigrid; Betsch, Peter; Steinmann, Paul
2008-01-01
In the present work, the unified framework for the computational treatment of rigid bodies and nonlinear beams developed by Betsch and Steinmann (Multibody Syst. Dyn. 8, 367-391, 2002) is extended to the realm of nonlinear shells. In particular, a specific constrained formulation of shells is proposed which leads to semi-discrete equations of motion characterized by a set of differential-algebraic equations (DAEs). The DAEs provide a uniform description for rigid bodies, semi-discrete beams and shells and, consequently, flexible multibody systems. The constraints may be divided into two classes: (i) internal constraints, which are intimately connected with the assumption of rigidity of the bodies, and (ii) external constraints, related to the presence of joints in a multibody framework. The present approach thus circumvents the use of rotational variables throughout the whole time discretization, facilitating the design of energy-momentum methods for flexible multibody dynamics. After the discretization has been completed, a size reduction of the discrete system is performed by eliminating the constraint forces. Numerical examples dealing with a spatial slider-crank mechanism and with intersecting shells illustrate the performance of the proposed method.
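Schematically, the semi-discrete equations of motion of such a constrained formulation take the standard index-3 DAE form (a generic sketch, not the authors' exact notation):

```latex
\begin{aligned}
\dot{q} &= v,\\
M\,\dot{v} &= f(q,v) - G^{\mathsf{T}}(q)\,\lambda,\\
0 &= g(q),
\end{aligned}
```

where $g(q)$ collects the internal (rigidity) and external (joint) constraints, $G = \partial g/\partial q$ is the constraint Jacobian, and $\lambda$ are the Lagrange multipliers associated with the constraint forces that are eliminated in the size-reduction step.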
Dziedzic, Adam; Mulawka, Jan
2014-11-01
NoSQL is a new approach to data storage and manipulation. The aim of this paper is to gain more insight into NoSQL databases, as we are still in the early stages of understanding when and how to use them appropriately. Descriptions of selected NoSQL databases are presented; each database is analysed with primary focus on its data model, data access, architecture and practical usage in real applications. Furthermore, the NoSQL databases are compared with respect to how they handle data references: relational databases offer foreign keys, whereas NoSQL databases provide only limited reference mechanisms. An intermediate model between graph theory and relational algebra, which could address this problem, should be created. Finally, a proposal for a new approach to the problem of inconsistent references in Big Data storage systems is introduced.
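The limited-reference point can be illustrated with a minimal sketch: in a document store there is no foreign-key constraint, so the application must resolve references itself and is responsible for their consistency. The data and names below are invented for illustration:

```python
# Two "collections" of a hypothetical document store, held as plain dicts.
# A relational database would enforce author_id via a foreign key; here a
# dangling reference simply resolves to None ("application-side join").

authors = {"a1": {"name": "Adam"}, "a2": {"name": "Jan"}}
papers = [
    {"title": "NoSQL survey", "author_id": "a1"},
    {"title": "Graph model", "author_id": "a2"},
]

def resolve_refs(papers, authors):
    """Attach the referenced author document to each paper."""
    return [
        {**p, "author": authors.get(p["author_id"])}  # None if reference dangles
        for p in papers
    ]
```

A relational JOIN would perform this resolution declaratively and reject inconsistent references at write time; the document-store version silently tolerates them, which is exactly the inconsistency problem the paper targets.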
Directory of Open Access Journals (Sweden)
Pedro Vasconcellos Eisenlohr
2014-06-01
Rigorous and well-defined criteria for the classification of vegetation constitute a prerequisite for effective biodiversity conservation strategies. In 2009, a new classification system was proposed for vegetation types in extra-Andean tropical and subtropical South America. The new system expanded upon the criteria established in the existing Brazilian Institute of Geography and Statistics classification system. Here, we attempted to determine whether the tree species composition of the formations within the Atlantic Forest Biome of Brazil is consistent with this new classification system. We compiled floristic surveys of 394 sites in southeastern Brazil (between 15° and 25°S, and between the Atlantic coast and 55°W). To assess the floristic consistency of the vegetation types, we performed non-metric multidimensional scaling (NMDS) ordination analysis, followed by multifactorial ANOVA. The vegetation types, especially in terms of their thermal regimes, elevational belts and top-tier vegetation categories, were consistently discriminated along the first NMDS axis, and all assessed attributes showed at least one significant difference along the second axis. As expected on the basis of the theoretical background, we found that tree species composition in the areas of Atlantic Forest studied was highly consistent with the new classification system. Our findings not only help solidify the position of this new classification system but also contribute to expanding the knowledge of the patterns and underlying driving forces of the distribution of vegetation in the region.
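NMDS ordinations of floristic tables start from a site-by-site dissimilarity matrix. The abstract does not state which measure was used, so the sketch below uses Bray-Curtis dissimilarity, a common choice for species-abundance data in vegetation ecology:

```python
import numpy as np

def bray_curtis(x):
    """Pairwise Bray-Curtis dissimilarity for a sites x species abundance
    matrix: 0 = identical composition, 1 = no shared species."""
    x = np.asarray(x, dtype=float)
    n = x.shape[0]
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            num = np.abs(x[i] - x[j]).sum()   # summed abundance differences
            den = (x[i] + x[j]).sum()         # total abundance of the pair
            d[i, j] = num / den if den else 0.0
    return d
```

The resulting matrix would then be fed to an NMDS routine, which seeks a low-dimensional configuration whose inter-site distances preserve the rank order of these dissimilarities.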
Directory of Open Access Journals (Sweden)
Susanne H Landis
Extreme climate events such as heat waves are expected to increase in frequency under global change. As one indirect effect, they can alter the magnitude and direction of species interactions, for example those between hosts and parasites. We simulated a summer heat wave to investigate how a changing environment affects the interaction between the broad-nosed pipefish (Syngnathus typhle) as a host and its digenean trematode parasite (Cryptocotyle lingua). In a fully reciprocal laboratory infection experiment, pipefish from three different coastal locations were exposed to sympatric and allopatric trematode cercariae. In order to examine whether an extreme climatic event disrupts patterns of locally adapted host-parasite combinations, we measured the parasite's transmission success as well as the host's adaptive and innate immune defence under control and heat wave conditions. Independent of temperature, sympatric cercariae were always more successful than allopatric ones, indicating that parasites are locally adapted to their hosts. Hosts suffered from heat stress, as suggested by fewer cells of the adaptive immune system (lymphocytes) compared to the same groups kept at 18°C. However, the proportion of innate immune cells (monocytes) was higher in the 18°C water. Contrary to our expectations, no interaction between host immune defence, parasite infectivity and temperature stress was found, nor did the pattern of local adaptation change with increased water temperature. Thus, in this host-parasite interaction, the sympatric parasite keeps ahead of the coevolutionary dynamics across sites, even under increasing temperatures as expected under marine global warming.
International Nuclear Information System (INIS)
Winter, H.; Stocks, G.M.
1983-01-01
Previous Korringa-Kohn-Rostoker coherent-potential-approximation electronic-structure calculations for substitutionally random alloys have been based on ad hoc potentials. The lack of procedures suitable to provide self-consistent, parameter-free potentials prevented computations for systems consisting of dissimilar atoms and is also the reason why quantities such as cohesive energies or lattice constants have not so far been evaluated for systems of similar constituents. We present in full detail a generally applicable scheme devised for calculating the self-consistent electronic structures of substitutionally disordered systems. Its feasibility is demonstrated by presenting the results obtained for the Ag(x)Pd(1-x) alloy series. They are compared with those of former non-self-consistent calculations, which used Mattheiss-prescription potentials and the α = 1 Slater exchange, whereas the von Barth-Hedin expression is employed in our work. The differences are perceptible and have to be understood as combined self-consistency and exchange-correlation effects.
Measuring process and knowledge consistency
DEFF Research Database (Denmark)
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire...
International Nuclear Information System (INIS)
Rasulova, M.Yu
1998-01-01
A study has been made of a system of charged particles and inhomogeneities randomly distributed, in accordance with the same law, in the neighborhoods of corresponding sites of a planar crystal lattice. The existence and uniqueness of the solution of the generalized Poisson-Boltzmann equation for the average self-consistent potential and average density of surface charges are proved. (author)
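For orientation, the classical Poisson-Boltzmann equation for a self-consistent mean potential $\varphi$ has the form below; the generalized, lattice-averaged version studied in the paper differs in detail:

```latex
\nabla^{2}\varphi(\mathbf{r}) \;=\; -\frac{4\pi}{\varepsilon}\,
\sum_{i} q_{i}\, n_{i}\,
\exp\!\left(-\frac{q_{i}\,\varphi(\mathbf{r})}{k_{B}T}\right)
```

Here $q_i$ and $n_i$ are the charge and reference density of species $i$, $\varepsilon$ the permittivity, and $k_B T$ the thermal energy; existence and uniqueness questions arise because the exponential makes the equation nonlinear in $\varphi$.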
長岡, 雅美; 赤松, 喜久; Masami, Nagaoka; Yoshihisa, Akamatsu
2008-01-01
The purpose of this study is to clarify the concept of Guidance and Support in community sports and to specify the directionality of organization and support for the achievement of a lifelong-sports society. The authors have stressed that, to achieve a society of lifelong sports, it is necessary to cooperate with other groups and to construct a consistent support system. This study also explores the conditions of community sports club management through analyzing the Japan Juni...
Energy Technology Data Exchange (ETDEWEB)
Gupta, Mukesh [URS Professional Solutions LLC, Aiken, SC (United States); Niemi, Belinda [Washington River Protection Solutions, LLC, Richland, WA (United States); Paik, Ingle [Washington River Protection Solutions, LLC, Richland, WA (United States)
2015-09-02
In 2012, One System Nuclear Safety performed a comparison of the safety bases for the Tank Farms Operations Contractor (TOC) and the Hanford Tank Waste Treatment and Immobilization Plant (WTP) (RPP-RPT-53222 / 24590-WTP-RPT-MGT-12-018, "One System Report of Comparative Evaluation of Safety Bases for Hanford Waste Treatment and Immobilization Plant Project and Tank Operations Contract"), and identified 25 recommendations that required further evaluation for consensus disposition. This report documents ten NSSC-approved consistent methodologies and guides, and the results of the additional evaluation process using a new set of criteria developed for evaluating the new methodologies.
International Nuclear Information System (INIS)
Kaganovich, Igor D.; Polomarov, Oleg
2003-01-01
In low-pressure discharges, when the electron mean free path is larger than or comparable to the discharge length, the electron dynamics is essentially non-local. Moreover, the electron energy distribution function (EEDF) deviates considerably from a Maxwellian. Therefore, an accurate kinetic description of low-pressure discharges requires knowledge of the non-local conductivity operator and calculation of the non-Maxwellian EEDF. Previous treatments made use of simplifying assumptions: a uniform density profile and a Maxwellian EEDF. In the present study a self-consistent system of equations for the kinetic description of non-local, non-uniform, nearly collisionless plasmas of low-pressure discharges is derived. It consists of the non-local conductivity operator and the averaged kinetic equation for calculation of the non-Maxwellian EEDF. The importance of accounting for the non-uniform plasma density profile in both the current density profile and the EEDF is demonstrated.
Forster, Hannah; Walsh, Marianne C; O'Donovan, Clare B; Woolhead, Clara; McGirr, Caroline; Daly, E J; O'Riordan, Richard; Celis-Morales, Carlos; Fallaize, Rosalind; Macready, Anna L; Marsaux, Cyril F M; Navas-Carretero, Santiago; San-Cristobal, Rodrigo; Kolossa, Silvia; Hartwig, Kai; Mavrogianni, Christina; Tsirigoti, Lydia; Lambrinou, Christina P; Godlewska, Magdalena; Surwiłło, Agnieszka; Gjelstad, Ingrid Merethe Fange; Drevon, Christian A; Manios, Yannis; Traczyk, Iwona; Martinez, J Alfredo; Saris, Wim H M; Daniel, Hannelore; Lovegrove, Julie A; Mathers, John C; Gibney, Michael J; Gibney, Eileen R; Brennan, Lorraine
2016-06-30
Despite numerous healthy eating campaigns, the prevalence of diets high in saturated fatty acids, sugar, and salt and low in fiber, fruit, and vegetables remains high. With more people than ever accessing the Internet, Web-based dietary assessment instruments have the potential to promote healthier dietary behaviors via personalized dietary advice. The objectives of this study were to develop a dietary feedback system for the delivery of consistent personalized dietary advice in a multicenter study and to examine the impact of automating the advice system. The development of the dietary feedback system included 4 components: (1) designing a system for categorizing nutritional intakes; (2) creating a method for prioritizing 3 nutrient-related goals for subsequent targeted dietary advice; (3) constructing decision tree algorithms linking data on nutritional intake to feedback messages; and (4) developing personal feedback reports. The system was used manually by researchers to provide personalized nutrition advice based on dietary assessment to 369 participants during the Food4Me randomized controlled trial, with an automated version developed on completion of the study. Saturated fatty acid, salt, and dietary fiber were most frequently selected as nutrient-related goals across the 7 centers. Average agreement between the manual and automated systems, in selecting 3 nutrient-related goals for personalized dietary advice across the centers, was highest for nutrient-related goals 1 and 2 and lower for goal 3, averaging at 92%, 87%, and 63%, respectively. Complete agreement between the 2 systems for feedback advice message selection averaged at 87% across the centers. The dietary feedback system was used to deliver personalized dietary advice within a multi-country study. Overall, there was good agreement between the manual and automated feedback systems, giving promise to the use of automated systems for personalizing dietary advice. Clinicaltrials.gov NCT01530139
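The two measurable pieces described above, categorising intakes against targets and scoring agreement between the manual and automated systems, can be sketched as follows. The thresholds, tolerance band, and goal names are hypothetical, not the Food4Me decision-tree rules:

```python
# Sketch only: illustrative categorisation and agreement scoring, with made-up
# thresholds; the study's actual decision trees were nutrient-specific.

def categorise(intake, target, tol=0.10):
    """Label an intake relative to its target with a +/-10% tolerance band."""
    if intake < target * (1 - tol):
        return "low"
    if intake > target * (1 + tol):
        return "high"
    return "ok"

def percent_agreement(manual, automated):
    """Share (%) of participants for whom both systems selected the same
    nutrient-related goal."""
    same = sum(m == a for m, a in zip(manual, automated))
    return 100.0 * same / len(manual)
```

Applied per goal rank across the seven centres, the second function yields figures of the kind reported (92%, 87%, 63% for goals 1-3).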
Consistent classical supergravity theories
International Nuclear Information System (INIS)
Muller, M.
1989-01-01
This book offers a presentation of both conformal and Poincaré supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included.
Quasiparticles and thermodynamical consistency
International Nuclear Information System (INIS)
Shanenko, A.A.; Biro, T.S.; Toneev, V.D.
2003-01-01
A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
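The type of relation the abstract refers to can be sketched as follows: the finite-temperature Hellmann-Feynman theorem (left) implies, for a quasiparticle model whose energies depend on a medium-dependent effective mass $m^{*}(T,\mu)$, a stationarity condition on the pressure (right). This is a generic sketch of the standard consistency condition, not the paper's exact equations:

```latex
\frac{\partial \Omega(T,\mu;\lambda)}{\partial \lambda}
  = \left\langle \frac{\partial \hat{H}(\lambda)}{\partial \lambda} \right\rangle,
\qquad
\left.\frac{\partial p}{\partial m^{*}}\right|_{T,\mu} = 0 .
```

Violating the right-hand condition makes entropy and particle number computed from the pressure disagree with their statistical definitions, which is precisely the inconsistency a quasiparticle model must avoid.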
International Nuclear Information System (INIS)
Kang, Yu; Zhang, Xiaoyan; Jiang, Wei; Wu, Chaoqun; Chen, Chunmei; Zheng, Yufang; Gu, Jianren; Xu, Congjian
2009-01-01
Compared with viral vectors, nonviral vectors are less immunogenic, more stable, safer and easier to replicate for application in cancer gene therapy. However, nonviral gene delivery systems have not been extensively used because of their low transfection efficiency and short transgene expression, especially in vivo. It is desirable to develop a nonviral gene delivery system that can support stable genomic integration and persistent gene expression in vivo. Here, we used a composite nonviral gene delivery system consisting of the piggyBac (PB) transposon and polyethylenimine (PEI) for long-term transgene expression in mouse ovarian tumors. A recombinant plasmid PB[Act-RFP, HSV-tk] encoding both the herpes simplex virus thymidine kinase (HSV-tk) and the monomeric red fluorescent protein (mRFP1) under PB transposon elements was constructed. This plasmid and the PBase plasmid were injected into ovarian cancer tumor xenografts in mice using the in vivo PEI system. The antitumor effects of the HSV-tk/ganciclovir (GCV) system were observed after intraperitoneal injection of GCV. Histological analysis and TUNEL assays were performed on cryostat sections of the tumor tissue. Plasmid construction was confirmed by PCR analysis combined with restriction enzyme digestion. mRFP1 expression could be visualized three weeks after the last transfection of pPB/TK under fluorescence microscopy. After GCV administration, the tumor volume of the PB/TK group was significantly reduced, with a tumor inhibition rate of 81.96% versus 43.07% in the TK group. Histological analysis showed extensive necrosis and lymphocyte infiltration in the tumor tissue of the PB/TK group but only limited changes in the control group. TUNEL assays suggested that the transfected cells underwent apoptosis after GCV administration in vivo. Our results show that a nonviral gene delivery system coupling the PB transposon with PEI can be used as an efficient tool for gene therapy in ovarian cancer.
Energy Technology Data Exchange (ETDEWEB)
Jemai, M
2004-07-01
In the present thesis we have applied the self-consistent random phase approximation (SCRPA) to the Hubbard model with a small number of sites (a chain of 2, 4, 6, ... sites). Earlier, SCRPA had produced very good results in other models, such as the Richardson pairing model. It was therefore interesting to see what kind of results the method is able to produce for a more complex model like the Hubbard model. To our great satisfaction, the case of two sites with two electrons (half-filling) is solved exactly by the SCRPA. This may seem a little trivial, but the fact is that other respectable approximations, such as 'GW' or the approach with the Gutzwiller wave function, yield results still far from exact. With this promising starting point, the case of 6 sites at half-filling was considered next. For that case, evidently, SCRPA no longer gives exact results. However, the results are still excellent for a wide range of values of the coupling constant U, covering for instance the phase-transition region towards a state with non-zero magnetisation. We consider this a good success of the theory. Nonetheless, the case of 4 sites (a plaquette), as indeed all cases with 4n sites at half-filling, turned out to be problematic because of degeneracies at the Hartree-Fock level. A generalisation of the present method, including, in addition to pairs, quadruples of fermion operators (called second RPA), is proposed to treat the plaquette case exactly within our approach. This is therefore a very interesting perspective of the present work. (author)
Vaz, Sharmila; Parsons, Richard; Passmore, Anne Elizabeth; Andreou, Pantelis; Falkmer, Torbjörn
2013-01-01
The social skills rating system (SSRS) is used to assess social skills and competence in children and adolescents. While its characteristics based on United States samples (US) are published, corresponding Australian figures are unavailable. Using a 4-week retest design, we examined the internal consistency, retest reliability and measurement error (ME) of the SSRS secondary student form (SSF) in a sample of Year 7 students (N = 187), from five randomly selected public schools in Perth, western Australia. Internal consistency (IC) of the total scale and most subscale scores (except empathy) on the frequency rating scale was adequate to permit independent use. On the importance rating scale, most IC estimates for girls fell below the benchmark. Test-retest estimates of the total scale and subscales were insufficient to permit reliable use. ME of the total scale score (frequency rating) for boys was equivalent to the US estimate, while that for girls was lower than the US error. ME of the total scale score (importance rating) was larger than the error using the frequency rating scale. The study finding supports the idea of using multiple informants (e.g. teacher and parent reports), not just student as recommended in the manual. Future research needs to substantiate the clinical meaningfulness of the MEs calculated in this study by corroborating them against the respective Minimum Clinically Important Difference (MCID).
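The reliability statistics reported in this abstract are standard. A minimal sketch of two of them, internal consistency (Cronbach's alpha) and measurement error as the standard error of measurement, might look like this (generic textbook formulas, not the authors' code):

```python
import numpy as np

def cronbach_alpha(items):
    """Internal consistency of a participants x items score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    x = np.asarray(items, dtype=float)
    k = x.shape[1]
    item_var = x.var(axis=0, ddof=1).sum()
    total_var = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def sem(sd, reliability):
    """Standard error of measurement: one common way to express the 'ME'
    of the abstract, from a scale SD and a test-retest correlation."""
    return sd * np.sqrt(1 - reliability)
```

Comparing such an SEM against a Minimum Clinically Important Difference is exactly the follow-up the authors recommend.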
Nordtvedt, K L
1972-12-15
I have reviewed the historical and contemporary experiments that guide us in choosing a post-Newtonian, relativistic gravitational theory. The foundation experiments essentially constrain gravitation theory to be a metric theory in which matter couples solely to one gravitational field, the metric field, although other cosmological gravitational fields may exist. The metric field for any metric theory can be specified (for the solar system, for our present purposes) by a series of potential terms with several parameters. A variety of experiments specify (or put limits on) the numerical values of the seven parameters in the post-Newtonian metric field, and other such experiments have been planned. The empirical results, to date, yield values of the parameters that are consistent with the predictions of Einstein's general relativity.
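In the parametrized post-Newtonian framework, the metric is written as an expansion in gravitational potentials whose coefficients are the parameters these experiments constrain. Schematically, for the two best-known parameters:

```latex
g_{00} = -1 + 2U - 2\beta U^{2} + \cdots, \qquad
g_{ij} = \left(1 + 2\gamma U\right)\delta_{ij} + \cdots ,
```

where $U$ is the Newtonian potential; general relativity predicts $\beta = \gamma = 1$, consistent with the empirical results quoted above. The full post-Newtonian metric of the review contains further potential terms and parameters beyond this sketch.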
Energy Technology Data Exchange (ETDEWEB)
Ebert, Joerg
2007-08-31
In this work the short-term and long-term stability of nanoscale metallic multilayers at elevated temperatures is studied. Reasons and mechanisms for breakdown of the GMR effect have been analyzed by different physical methods. The multilayered samples investigated in this work exhibit a GMR effect of GMR(alloy) = 20.7%, which is significantly smaller than the effect of the standard system with pure Cu interlayers (GMR(Cu) = 25.2%). For protection against oxidation during use, a passivation coating consisting of SiO2 and Si3N4 has been deposited by means of plasma CVD. Typical parameters for this process are times of t(short-term) = 1 h in the temperature range of 200 C
Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan
2016-01-01
The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
Dmitrieva, Olga; Michalakidis, Georgios; Mason, Aaron; Jones, Simon; Chan, Tom; de Lusignan, Simon
2012-01-01
A new distributed model of health care management is being introduced in England. Family practitioners have new responsibilities for the management of health care budgets and the commissioning of services. There are national datasets available about health care providers and the geographical areas they serve. These data could be better used to assist family practitioners turned health service commissioners. Unfortunately, these data are not in a form that is readily usable by the fledgling family commissioning groups. We therefore web-enabled all the national hospital dermatology treatment data in England, combining it with locality data to provide a smart commissioning tool for local communities. We used open-source software, including the Ruby on Rails web framework and MySQL. The system has a web front-end, which uses hypertext markup language, cascading style sheets (HTML/CSS) and JavaScript to deliver and present data provided by the database. A combination of advanced caching and schema structures allows for faster data retrieval on every execution. The system provides an intuitive environment for data analysis and processing across a large health-system dataset. Web-enablement has allowed data about inpatients, day cases and outpatients to be readily grouped, viewed, and linked to other data. The combination of web-enablement, consistent data collection from all providers, readily available locality data, and a registration-based primary care system enables the creation of data that can be used to commission dermatology services in small areas. Standardized datasets collected across large health enterprises, when web-enabled, can readily benchmark local services and inform commissioning decisions.
DEFF Research Database (Denmark)
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...
Geometrically Consistent Mesh Modification
Bonito, A.
2010-01-01
A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.
Li, Shujuan; Wang, Xiaoyu; Wang, Yingying; Zhao, Qianqian; Zhang, Lina; Yang, Xinggang; Liu, Dandan; Pan, Weisan
2015-01-01
In this study, a novel controlled-release osmotic pump capsule, consisting of a pH-modulated solid dispersion of the poorly soluble drug flurbiprofen (FP), was developed to improve the solubility and oral bioavailability of FP and to minimize fluctuations in plasma concentration. The pH-modulated solid dispersion containing FP, Kollidon® 12 PF and Na2CO3 at a weight ratio of 1/4.5/0.02 was prepared using the solvent evaporation method. The osmotic pump capsule was assembled from a semi-permeable capsule shell of cellulose acetate (CA) prepared by the perfusion method; the solid dispersion, penetration enhancer, and suspending agents were then tableted and filled into the capsule. Central composite design-response surface methodology was used to evaluate the influence of formulation factors on the responses. A second-order polynomial model and a multiple linear model were fitted to the correlation coefficient of the drug release profile and the ultimate cumulative release in 12 h, respectively. The actual response values were in good accordance with the predicted ones. The optimized formulation showed complete drug delivery and a zero-order release rate. Beagle dogs were used in the pharmacokinetic study. The in vivo study indicated that the relative bioavailability of the novel osmotic pump system was 133.99% compared with the commercial preparation. The novel controlled delivery system combining a pH-modulated solid dispersion with an osmotic pump is not only a promising strategy to improve the solubility and oral bioavailability of poorly soluble ionizable drugs but also an effective way to reduce dosing frequency and minimize plasma fluctuation.
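The relative bioavailability figure (133.99%) is conventionally obtained from plasma concentration-time curves as a ratio of areas under the curve (AUC). A minimal sketch, assuming equal doses and using the linear trapezoidal rule (this is the standard definition, not the authors' pharmacokinetic software):

```python
def auc_trapezoid(t, c):
    """Area under a plasma concentration-time curve, linear trapezoidal rule.

    t : sampling times, c : measured concentrations at those times.
    """
    auc = 0.0
    for i in range(1, len(t)):
        auc += 0.5 * (c[i] + c[i - 1]) * (t[i] - t[i - 1])
    return auc

def relative_bioavailability(t, c_test, c_ref):
    """F_rel (%) = 100 * AUC_test / AUC_ref, assuming equal doses."""
    return 100.0 * auc_trapezoid(t, c_test) / auc_trapezoid(t, c_ref)
```

An F_rel above 100%, as reported here, indicates greater total drug exposure from the test formulation than from the reference.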
Directory of Open Access Journals (Sweden)
A. Breinholt
2013-10-01
Monitoring of flows in sewer systems is increasingly applied to calibrate urban drainage models used for long-term simulation. However, models are most often calibrated without considering the uncertainties. The generalized likelihood uncertainty estimation (GLUE) methodology is here applied to assess parameter and flow-simulation uncertainty using a simplified lumped sewer model that accounts for three separate flow contributions: wastewater, fast runoff from paved areas, and slow infiltrating water from permeable areas. Recently, the GLUE methodology has been criticised for generating prediction limits without statistical coherence and consistency and for the subjectivity in the choice of a threshold value to distinguish "behavioural" from "non-behavioural" parameter sets. In this paper we examine how well the GLUE methodology performs when the behavioural parameter sets deduced from a calibration period are applied to generate prediction bounds in validation periods. By retaining an increasing number of parameter sets we aim at obtaining consistency between the GLUE-generated 90% prediction limits and the actual containment ratio (CR) in calibration. Due to the large uncertainties related to spatio-temporal rain variability during heavy convective rain events, flow measurement errors, possible model deficiencies as well as epistemic uncertainties, it was not possible to obtain an overall CR of more than 80%. However, the GLUE-generated prediction limits still proved rather consistent, since the overall CRs obtained in calibration corresponded well with the overall CRs obtained in validation periods for all proportions of retained parameter sets evaluated. When focusing on wet- and dry-weather periods separately, some inconsistencies were however found between calibration and validation, and we address here some of the reasons why we should not expect the coverage of the prediction limits to be identical in calibration and validation periods in real
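The two quantities at the heart of the comparison, likelihood-weighted prediction limits over the behavioural simulations and the containment ratio of the observations, can be sketched as follows (a generic GLUE-style implementation under simplifying assumptions, not the study's code):

```python
import numpy as np

def glue_limits(simulations, likelihoods, q=0.90):
    """Likelihood-weighted (1-q)/2 and (1+q)/2 quantiles of the behavioural
    simulations at each time step (simulations: n_sets x n_steps)."""
    sims = np.asarray(simulations, dtype=float)
    w = np.asarray(likelihoods, dtype=float)
    w = w / w.sum()                       # normalise likelihood weights
    lo, hi = (1 - q) / 2, (1 + q) / 2
    lower, upper = [], []
    for col in sims.T:                    # one weighted quantile per time step
        order = np.argsort(col)
        cum = np.cumsum(w[order])
        lower.append(col[order][np.searchsorted(cum, lo)])
        upper.append(col[order][np.searchsorted(cum, hi)])
    return np.array(lower), np.array(upper)

def containment_ratio(obs, lower, upper):
    """Fraction of observations falling inside the prediction band."""
    obs, lower, upper = map(np.asarray, (obs, lower, upper))
    return float(((obs >= lower) & (obs <= upper)).mean())
```

Nominal 90% limits are consistent when the containment ratio computed this way is close to 0.90 in both calibration and validation, which is the check the study performs.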
International Nuclear Information System (INIS)
Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias
2002-01-01
We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman-alpha forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best-fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest-neutrino mass range of 0.04-4.2 eV and the sharpest constraints to date on gravity waves, which (together with a preference for a slight red tilt) favor 'small-field' inflation models.
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account Written by one of the main figures in the field Paperback edition of successful work on philosophy of quantum mechanics
Monari, Antonio; Rivail, Jean-Louis; Assfeld, Xavier
2013-02-19
Molecular mechanics methods can efficiently compute the macroscopic properties of a large molecular system but cannot represent the electronic changes that occur during a chemical reaction or an electronic transition. Quantum mechanical methods can accurately simulate these processes, but they require considerably greater computational resources. Because electronic changes typically occur in a limited part of the system, such as the solute in a molecular solution or the substrate within the active site of enzymatic reactions, researchers can limit the quantum computation to this part of the system. Researchers take into account the influence of the surroundings by embedding this quantum computation into a calculation of the whole system described at the molecular mechanical level, a strategy known as the mixed quantum mechanics/molecular mechanics (QM/MM) approach. The accuracy of this embedding varies according to the types of interactions included, whether they are purely mechanical or classically electrostatic. This embedding can also introduce the induced polarization of the surroundings. The difficulty in QM/MM calculations comes from the splitting of the system into two parts, which requires severing the chemical bonds that link the quantum mechanical subsystem to the classical subsystem. Typically, researchers replace the quantoclassical atoms, those at the boundary between the subsystems, with a monovalent link atom. For example, researchers might add a hydrogen atom when a C-C bond is cut. This Account describes another approach, the Local Self Consistent Field (LSCF), which was developed in our laboratory. LSCF links the quantum mechanical portion of the molecule to the classical portion using a strictly localized bond orbital extracted from a small model molecule for each bond. In this scenario, the quantoclassical atom has an apparent nuclear charge of +1. To achieve correct bond lengths and force constants, we must take into account the inner shell of
Consistency Anchor Formalization and Correctness Proofs
Miguel, Correia; Bessani, Alysson
2014-01-01
This report contains the formal proofs for the techniques for increasing the consistency of cloud storage presented in "Bessani et al., SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference, June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...
Lagrangian multiforms and multidimensional consistency
Energy Technology Data Exchange (ETDEWEB)
Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)
2009-10-30
We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.
National Research Council Canada - National Science Library
2006-01-01
DOD, joint, and military service weapon system acquisition policies inconsistently address and do not establish a clear process for considering and testing system chemical and biological survivability...
Possolo, Antonio; Schlamminger, Stephan; Stoudt, Sara; Pratt, Jon R.; Williams, Carl J.
2018-02-01
The Consultative Committee for Mass and Related Quantities (CCM) of the International Committee for Weights and Measures (CIPM) has recently declared the readiness of the community to support the redefinition of the International System of Units (SI) at the next meeting of the General Conference on Weights and Measures (CGPM), scheduled for November 2018. This redefinition will replace the International Prototype of the Kilogram (IPK), as the definition and sole primary realization of the unit of mass, with a definition involving the Planck constant, h. Redefinition in terms of a fundamental constant of nature will enable widespread primary realizations not only of the kilogram but also of its multiples and sub-multiples, best suited to address the full range of practical needs in the measurement of mass. We review and discuss the statistical models and statistical data reductions, uncertainty evaluations, and substantive arguments that support the verification of several technical preconditions for the redefinition that the CCM has established, and whose verification the CCM has affirmed. These conditions relate to the accuracy and mutual consistency of qualifying measurement results. We also review an issue that has surfaced only recently, concerning the convergence toward a stable value of the historical values that the Task Group on Fundamental Constants of the Committee on Data for Science and Technology (CODATA-TGFC) has recommended for h over the years, even though the CCM has not deemed this issue to be relevant. We conclude that no statistically significant trend can be substantiated for these recommended values, but note that cumulative consensus values that may be derived from the historical measurement results for h seem to have converged while continuing to exhibit fluctuations that are typical of a process in statistical control. Finally, we argue that the most recent consensus value derived from the best measurements available for h, obtained using
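The trend question addressed above can be illustrated with a simple weighted least-squares slope test: given a sequence of recommended values with standard uncertainties, fit a line against time and check whether the slope differs significantly from zero. The values below are made-up offsets for illustration, not actual CODATA adjustments:

```python
import numpy as np

# Hypothetical recommended values, expressed as offsets from a reference
# (e.g. in parts in 10^8), with standard uncertainties. Illustrative only.
year = np.array([1998.0, 2002.0, 2006.0, 2010.0, 2014.0, 2017.0])
value = np.array([5.2, 4.8, 5.1, 4.9, 5.0, 5.0])
u = np.full(6, 0.5)

# Weighted least-squares slope and its standard error.
w = 1.0 / u**2
xbar = np.average(year, weights=w)
ybar = np.average(value, weights=w)
sxx = np.sum(w * (year - xbar) ** 2)
slope = np.sum(w * (year - xbar) * (value - ybar)) / sxx
se = np.sqrt(1.0 / sxx)
z = slope / se
print(f"slope {slope:.4f} per year, z = {z:.2f}")
```

A |z| well below 2 supports the "no statistically significant trend" conclusion; fluctuations within the stated uncertainties are what one expects from a process in statistical control.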
Smith, J. A.; Froyd, K. D.; Toon, O. B.
2012-12-01
We construct tables of reaction enthalpies and entropies for the association reactions involving sulfuric acid vapor, water vapor, and the bisulfate ion. These tables are created from experimental measurements and quantum chemical calculations for molecular clusters and a classical thermodynamic model for larger clusters. These initial tables are not thermodynamically consistent. For example, the Gibbs free energy of associating a cluster consisting of one acid molecule and two water molecules depends on the order in which the cluster was assembled: add two waters and then the acid or add an acid and a water and then the second water. We adjust the values within the tables using the method of Lagrange multipliers to minimize the adjustments and produce self-consistent Gibbs free energy surfaces for the neutral clusters and the charged clusters. With the self-consistent Gibbs free energy surfaces, we calculate size distributions of neutral and charged clusters for a variety of atmospheric conditions. Depending on the conditions, nucleation can be dominated by growth along the neutral channel or growth along the ion channel followed by ion-ion recombination.
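The path-independence requirement described above can be shown on a minimal cycle: if the stepwise free energies of "add water, then acid" and "add acid, then water" must sum to the same total, the smallest adjustments satisfying that linear constraint follow from a standard Lagrange-multiplier solution. The numbers and the single constraint are illustrative, not the paper's tables:

```python
import numpy as np

# Illustrative stepwise Gibbs free energies (kcal/mol) for two assembly
# orders of the same acid-water-water cluster A·W·W:
#   path 1: A + W -> AW (g[0]);  AW + W -> AWW (g[1])
#   path 2: W + W -> WW (g[2]);  WW + A -> AWW (g[3])
g = np.array([-3.1, -2.4, -1.0, -4.7])   # inconsistent: -5.5 vs -5.7

# Consistency constraint: both paths give the same total, i.e. A @ x = 0.
A = np.array([[1.0, 1.0, -1.0, -1.0]])

# Minimize ||x - g||^2 subject to A x = 0. The Lagrange-multiplier
# solution is x = g - A^T (A A^T)^{-1} (A g).
lam = np.linalg.solve(A @ A.T, A @ g)
x = g - A.T @ lam

print(x, x[0] + x[1], x[2] + x[3])  # both path totals are now equal
```

With many clusters the same construction applies, with one constraint row per independent cycle in the assembly graph, which is how a whole table can be made self-consistent while minimizing the total adjustment.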
N. Pecchiari; G. Pogliani
2006-01-01
This study analyses the consistency between overall and account-level materiality measures. The study starts by emphasizing the need for further research on planning materiality, considering that prior studies have shown large differences in materiality methods. A review of the literature on materiality [Messier et al. (2005)] suggests continuing inter-industry investigations on planning materiality [Wheeler and Pany (1989)]. We have also noticed the absence of research in the area of connection bet...
Directory of Open Access Journals (Sweden)
Marko Popović
2010-01-01
Most people would face a problem if they needed to calculate the mole fraction of a substance A in a gaseous solution (a thermodynamic system containing two or more ideal gases) knowing its molarity at a given temperature and pressure. For most it would take a lot of time and calculation to find the answer, especially because the quantities of the other substances in the system aren't given. An even greater problem arises when we try to understand how special relativity affects gaseous systems, especially solutions and systems in equilibrium. In this paper formulas are suggested that greatly shorten the conversion from molarity to mole fraction and give us a better insight into the relativistic effects on a gaseous system.
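For an ideal-gas mixture the shortcut is direct: the total molar concentration is fixed by the state alone, c_tot = P/(RT), so the mole fraction of A is x_A = c_A·RT/P regardless of what the other components are. A quick numerical check (the input values are illustrative):

```python
R = 8.314462618  # molar gas constant, J/(mol·K)

def mole_fraction(molarity_mol_per_l, t_kelvin, p_pascal):
    """x_A = c_A * R * T / P for an ideal-gas mixture."""
    c = molarity_mol_per_l * 1000.0   # mol/L -> mol/m^3
    return c * R * t_kelvin / p_pascal

# 0.01 mol/L of substance A at 300 K and 1 atm:
x = mole_fraction(0.01, 300.0, 101325.0)
print(round(x, 4))
```

The same answer follows the long way: c_tot = 101325/(8.3145 × 300) ≈ 40.6 mol/m³ ≈ 0.0406 mol/L, and 0.01/0.0406 gives the same mole fraction.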
Satellite Demonstration: The Videodisc Technology.
Propp, George; And Others
1979-01-01
Originally part of a symposium on educational media for the deaf, the paper describes a satellite demonstration of video disc materials. It is explained that a panel of deaf individuals in Washington, D.C. and another in Nebraska came into direct two-way communication for the first time, and video disc materials were broadcast via the satellite.…
International Nuclear Information System (INIS)
Kucheryavy, V.I.
1997-01-01
Using the self-consistent renormalization we calculate five types of quantities (having the mass anisotropy in general) associated with the canonical Ward identities and reduction identities for two-point chronological fermion current correlators which describe the most general polarization properties of the fermionic sector for all n-dimensional quantum field theories incorporating fermions with both degenerate and nondegenerate fermion mass spectrum. The analysis of the vector and axial-vector Ward identities and the reduction ones for regular values of these quantities is carried out. Effective formulae for nontrivial quantum corrections (NQC) to the canonical Ward identities are obtained for any space-time dimension. The properties of the NQC are investigated in detail. Emphasis has been placed on the space-time dimension and signature dependence. Particular properties of the two-dimensional worlds are pointed out
Directory of Open Access Journals (Sweden)
Trevisani Virgínia FM
2008-05-01
Background: Music is ever present in our daily lives, establishing a link between humans and the arts through the senses and pleasure. Sound technicians are the link between musicians and audiences or consumers. Recently, general concern has arisen regarding occurrences of hearing loss induced by noise from excessively amplified sound-producing activities within leisure and professional environments. Sound technicians' activities expose them to the risk of hearing loss, and consequently put at risk their quality of life, the quality of the musical product, and consumers' hearing. The aim of this study was to measure the prevalence of high-frequency hearing loss consistent with noise exposure among sound technicians in Brazil and compare this with a control group without occupational noise exposure. Methods: This was a cross-sectional study comparing 177 participants in two groups: 82 sound technicians and 95 controls (non-sound technicians). A questionnaire on music listening habits and associated complaints was applied, and data were gathered regarding the professionals' numbers of working hours per day and both groups' hearing complaints and presence of tinnitus. The participants' ear canals were visually inspected using an otoscope. Hearing assessments were performed (tonal and speech audiometry) using a portable digital AD 229 E audiometer funded by FAPESP. Results: There was no statistically significant difference between the sound technicians and controls regarding age and gender; thus, the study sample was homogeneous and unlikely to bias the results. A statistically significant difference in hearing loss was observed between the groups: 50% among the sound technicians and 10.5% among the controls. The difference could be attributed to high sound levels. Conclusion: The sound technicians presented a higher prevalence of high-frequency hearing loss consistent with noise exposure than did the general population, although
International Nuclear Information System (INIS)
Hattori, Kazumasa
2010-01-01
We investigate a two-orbital Anderson lattice model with Ising orbital intersite exchange interactions on the basis of a dynamical mean field theory combined with the static mean field approximation of intersite orbital interactions. Focusing on Ce-based heavy-fermion compounds, we examine the orbital crossover between two orbital states when the total f-electron number per site, n_f, is ∼1. We show that a 'meta-orbital' transition, at which the occupancy of the two orbitals changes steeply, occurs when the hybridization between the ground-state f-electron orbital and conduction electrons is smaller than that between the excited f-electron orbital and conduction electrons at low pressures. Near the meta-orbital critical end point, orbital fluctuations are enhanced and couple with charge fluctuations. A critical theory of meta-orbital fluctuations is also developed by applying the self-consistent renormalization theory of itinerant electron magnetism to orbital fluctuations. The critical end point, first-order transition, and crossover are described within Gaussian approximations of orbital fluctuations. We discuss the relevance of our results to CeAl2, CeCu2Si2, CeCu2Ge2, and related compounds, which all have low-lying crystalline-electric-field excited states. (author)
Delandmeter, Philippe; Lambrechts, Jonathan; Legat, Vincent; Vallaeys, Valentin; Naithani, Jaya; Thiery, Wim; Remacle, Jean-François; Deleersnijder, Eric
2018-03-01
The discontinuous Galerkin (DG) finite element method is well suited for the modelling, with a relatively small number of elements, of three-dimensional flows exhibiting strong velocity or density gradients. Its performance can be highly enhanced by having recourse to r-adaptivity. Here, a vertical adaptive mesh method is developed for DG finite elements. This method, originally designed for finite difference schemes, is based on the vertical diffusion of the mesh nodes, with the diffusivity controlled by the density jumps at the mesh element interfaces. The mesh vertical movement is determined by means of a conservative arbitrary Lagrangian-Eulerian (ALE) formulation. Though conservativity is naturally achieved, tracer consistency is obtained by a suitable construction of the mesh vertical velocity field, which is defined in such a way that it is fully compatible with the tracer and continuity equations at a discrete level. The vertically adaptive mesh approach is implemented in the three-dimensional version of the geophysical and environmental flow Second-generation Louvain-la-Neuve Ice-ocean Model (SLIM 3D; www.climate.be/slim). Idealised benchmarks, aimed at simulating the oscillations of a sharp thermocline, are dealt with. Then, the relevance of the vertical adaptivity technique is assessed by simulating thermocline oscillations of Lake Tanganyika. The results are compared to measured vertical profiles of temperature, showing similar stratification and outcropping events.
Miroslaw Dyczkowski
2010-01-01
Economic effectiveness has become a decisive factor in feasibility studies for IT projects due to the deteriorating economic situation. This paper characterises such an assessment using the example of an automated settlement system for a supply chain. The described project was carried out by a financial centre, located in Poland, of an international company operating in the automotive industry. It aimed at applying EDI technologies to automate logistics. The project...
Energy Technology Data Exchange (ETDEWEB)
Warga, Johann; Pauer, Thomas; Boecking, Friedrich; Gerhardt, Juergen; Leonhard, Rolf [Robert Bosch GmbH, Stuttgart-Feuerbach (Germany). Diesel Systems
2011-07-01
Since the introduction of common rail technology in modern diesel engines for passenger cars there have been many changes and technological revolutions. Only the continuous increase of the maximum injection pressure has remained a constant, as a guarantee of further engine performance improvement. Whether for downsizing, for simply increasing engine power, for reducing CO2, or for improving emissions: in all these respects the injection pressure offers possible degrees of freedom. In parallel with this continuous increase of injection pressure, the requirements concerning other injection system features have also developed further. This paper focuses on the achievability of EU6 applications, among others, with the new Bosch 2000 bar solenoid valve injector, innovative nozzle technologies such as improved spray hole geometry, and the modular common rail pump CP4. Current engine tests with pressures up to 2500 bar clearly prove the further advantages of pressure increase in diesel engines for passenger cars. In addition to the hydraulic components, system approaches combining electronic control, sensors, and innovative control algorithms are increasingly in focus, aiming to improve system accuracy and robustness. (orig.)
Farsiani, Hadi; Mosavat, Arman; Soleimanpour, Saman; Sadeghian, Hamid; Akbari Eydgahi, Mohammad Reza; Ghazvini, Kiarash; Sankian, Mojtaba; Aryan, Ehsan; Jamehdar, Saeid Amel; Rezaee, Seyed Abdolrahim
2016-06-21
Tuberculosis (TB) remains a major global health threat despite chemotherapy and Bacillus Calmette-Guérin (BCG) vaccination. Therefore, a safer and more effective vaccine against TB is urgently needed. This study evaluated the immunogenicity of a recombinant fusion protein consisting of early secreted antigenic target protein 6 kDa (ESAT-6), culture filtrate protein 10 kDa (CFP-10) and the Fc-domain of mouse IgG2a as a novel subunit vaccine. The recombinant expression vectors (pPICZαA-ESAT-6:CFP-10:Fcγ2a and pPICZαA-ESAT-6:CFP-10:His) were transferred into Pichia pastoris. After SDS-PAGE and immunoblotting, the immunogenicity of the recombinant proteins was evaluated in mice. When both recombinant proteins (ESAT-6:CFP-10:Fcγ2a and ESAT-6:CFP-10:His) were used for vaccination, Th1-type cellular responses were induced, producing high levels of IFN-γ and IL-12. However, the Fc-tagged recombinant protein induced more effective Th1-type cellular responses, with a small increase in IL-4, as compared to the BCG and ESAT-6:CFP-10:His groups. Moreover, mice primed with BCG and then supplemented with ESAT-6:CFP-10:Fcγ2a produced the highest levels of IFN-γ and IL-12 among the immunized groups. The findings indicate that when Fcγ2a is fused to the ESAT-6:CFP-10 complex as a delivery vehicle, the immunogenicity of this type of subunit vaccine can be increased. Therefore, additional investigations are necessary for the development of appropriate Fc-based tuberculosis vaccines.
Energy Technology Data Exchange (ETDEWEB)
Gomez, Guillermo; Martinez, Graciano; Maellas, Jesus; Cuevas, Raquel; Orihuela, Pilar [INTA, Torrejon de Ardoz (Spain). Renewable Energy Dept.; Bueno, Emilio; Gila, Raul [Alcala de Henares Univ. - Polytechnical School, Barcelona (Spain). Electronic Dept.
2010-07-01
Driven by current concern about climate change and the search for a secure energy source, several forums have been created to study the best way to introduce hydrogen into the market. Fuel cells can be a good technology for this purpose, but it is necessary to find projects where reliability matters more than price. There are many projects where the advantages of fuel cells (quiet operation, high energy density, simplicity, portability, ...) can outweigh the economic aspects, so that fuel cells can find a place in the market. An example of such projects is a power system to feed remote telecom applications. The main aim of this paper is to present the results of the TRNSYS simulation of HIDROSOLARH2, to determine the most efficient configuration from the energy point of view and to calculate the balance of plant (BOP). (orig.)
Yuliusman; Ramadhan, I. T.; Huda, M.
2018-03-01
Catalysts are often used in the petroleum refining industry, especially cobalt-based catalysts such as CoMoX. Every year, Indonesia's oil industry produces around 1350 tons of spent hydrodesulphurization catalyst, of which cobalt makes up about 7 wt%. Cobalt is a non-renewable and highly valuable resource. For these reasons, this research was undertaken to recover cobalt from spent hydrodesulphurization catalyst so that it can be reused by industries needing it. The method used to recover cobalt from the waste catalyst leach solution is liquid-liquid extraction using a synergistic system of Versatic™ 10 and Cyanex® 272. Based on the experiments, the optimum conditions for the extraction process were: a Versatic™ 10 concentration of 0.35 M, a Cyanex® 272 concentration of 0.25 M, a temperature of 23-25°C (room temperature), and pH 6, with an extraction percentage of 98.80% and co-extraction of Ni at 93.51%.
International Nuclear Information System (INIS)
Seino, Hiroshi; Nagashima, Ken; Tanaka, Yoshichika; Nakauchi, Masahiko
2010-01-01
The Railway Technical Research Institute conducted a study to develop a superconducting magnetic bearing applicable to flywheel energy-storage systems for railways. In the first step of the study, a thrust rolling bearing configuration was selected, adopting liquid-nitrogen-cooled HTS bulk as the rotor and a superconducting coil as the stator of the superconducting magnetic bearing. The load capacity of the superconducting magnetic bearing was verified up to 10 kN in a static load test. After that, a rotation test with approximately 5 kN of thrust load applied was performed at a maximum rotation speed of 3000 rpm. The rotation test confirmed that the levitation position was maintained stably during rotation. Heat transfer by radiation in vacuum and by conduction through tenuous gas was studied experimentally to confirm the rotor cooling method. The experimental results demonstrate that an optimal gas pressure can be obtained without generating windage drag. In the second stage of the development, the thrust load capacity of the bearing will be improved, aiming at an energy capacity of practical scale. In a static load test of the new superconducting magnetic bearing, a stable 20 kN levitation force was obtained.
A new approach to hull consistency
Directory of Open Access Journals (Sweden)
Kolev Lubomir
2016-06-01
Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested, which consists in treating simultaneously several equations with respect to the same number of variables.
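A scalar hull-consistency step, as described above, solves one equation for one variable over intervals and intersects the result with the current box, contracting it without losing any solution. A minimal sketch for the constraint x + y = 5 with hand-rolled interval arithmetic on pairs (illustrative only, not the poster's simultaneous multi-equation method):

```python
def sub(a, b):
    """Interval subtraction a - b for intervals represented as (lo, hi)."""
    return (a[0] - b[1], a[1] - b[0])

def intersect(a, b):
    """Intersection of two intervals; empty means the box has no solution."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: no solution in the box")
    return (lo, hi)

# Constraint x + y = 5 on the box x in [0, 10], y in [3, 4].
x, y = (0.0, 10.0), (3.0, 4.0)
c = (5.0, 5.0)

# Scalar hull-consistency pass: solve for x as x = 5 - y and contract,
# then solve for y as y = 5 - x and contract.
x = intersect(x, sub(c, y))
y = intersect(y, sub(c, x))
print(x, y)
```

One pass contracts x from [0, 10] to [1, 2] here; in general the passes are iterated over all equations and variables until the box stops shrinking, which is the scalar scheme the poster generalizes.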
Lin, Congcong; Chen, Fen; Ye, Tiantian; Zhang, Lina; Zhang, Wenji; Liu, Dandan; Xiong, Wei; Yang, Xinggang; Pan, Weisan
2014-04-25
The purpose of this study was to develop a new delivery system based on drug-cyclodextrin (CD) complexation and loading into nanostructured lipid carriers (NLC) to improve the oral bioavailability of vinpocetine (VP). Three different CDs and three different methods of obtaining solid vinpocetine-cyclodextrin-tartaric acid complexes (VP-CD-TA) were compared. The co-evaporated vinpocetine-β-cyclodextrin-tartaric acid loaded NLC (VP-β-CD-TA COE-loaded NLC) was obtained by an emulsification ultrasonic dispersion method. VP-β-CD-TA COE-loaded NLC was characterized for particle size, polydispersity index, zeta potential, entrapment efficiency and morphology. The crystallinity of the drug in VP-CD-TA and NLC was investigated by differential scanning calorimetry (DSC). The in vitro release study was carried out in pH 1.2, pH 6.8 and pH 7.4 media. New Zealand rabbits were used to investigate the pharmacokinetic behavior in vivo. The VP-β-CD-TA COE-loaded NLC presented superior physicochemical properties and was selected for further study. In the in vitro release study, VP-β-CD-TA COE-loaded NLC exhibited a higher dissolution rate in the pH 6.8 and pH 7.4 media than VP suspension and VP-NLC. The relative bioavailability of VP-β-CD-TA COE-loaded NLC was 592% compared with VP suspension and 92% higher than that of VP-NLC. In conclusion, the new formulation significantly improved the oral bioavailability of VP, demonstrating a promising approach for oral delivery of poorly water-soluble drugs.
Directory of Open Access Journals (Sweden)
Melanie J Hopkins
The early Cambrian Guanshan biota of eastern Yunnan, China, contains exceptionally preserved animals and algae. Most diverse and abundant are the arthropods, of which there are at least 11 species of trilobites represented by numerous specimens. Many trilobite specimens show soft-body preservation via iron oxide pseudomorphs of pyrite replacement. Here we describe digestive structures from two species of trilobite, Palaeolenus lantenoisi and Redlichia mansuyi. Multiple specimens of both species contain the preserved remains of an expanded stomach region (a "crop") under the glabella, a structure which has not been observed in trilobites this old, despite numerous examples of trilobite gut traces from other Cambrian Lagerstätten. In addition, at least one specimen of Palaeolenus lantenoisi shows the preservation of an unusual combination of digestive structures: a crop and paired digestive glands along the alimentary tract. This combination of digestive structures has also never been observed in trilobites this old, and is rare in general, with prior evidence of it from one juvenile trilobite specimen from the late Cambrian Orsten fauna of Sweden and possibly one adult trilobite specimen from the Early Ordovician Fezouata Lagerstätte. The variation in the fidelity of preservation of digestive structures within and across different Lagerstätten may be due to variation in the type, quality, and point of digestion of food among specimens in addition to differences in mode of preservation. The presence and combination of these digestive features in the Guanshan trilobites contradicts current models of how the trilobite digestive system was structured and evolved over time. Most notably, the crop is not a derived structure as previously proposed, although it is possible that the relative size of the crop increased over the evolutionary history of the clade.
Hopkins, Melanie J; Chen, Feiyang; Hu, Shixue; Zhang, Zhifei
2017-01-01
Shiino, Kenji; Yamada, Akira; Ischenko, Matthew; Khandheria, Bijoy K; Hudaverdi, Mahala; Speranza, Vicki; Harten, Mary; Benjamin, Anthony; Hamilton-Craig, Christian R; Platts, David G; Burstow, Darryl J; Scalia, Gregory M; Chan, Jonathan
2017-06-01
We aimed to assess intervendor agreement of global longitudinal strain (GLS) and regional longitudinal strain by vendor-specific software after the EACVI/ASE Industry Task Force Standardization Initiatives for Deformation Imaging. Fifty-five patients underwent prospective dataset acquisitions on the same day by the same operator using two commercially available cardiac ultrasound systems (GE Vivid E9 and Philips iE33). GLS and regional peak longitudinal strain were analyzed offline using the corresponding vendor-specific software (EchoPAC BT13 and QLAB version 10.3). Absolute mean GLS measurements were similar between the two vendors (GE -17.5 ± 5.2% vs. Philips -18.9 ± 5.1%, P = 0.15). There was excellent intervendor correlation of GLS by the same observer (r = 0.94; limits of agreement (LOA) -4.8 to 2.2%). Intervendor comparisons for regional longitudinal strain by coronary artery territory were: LAD: r = 0.85, P < 0.0001; bias 0.5%, LOA -5.3 to 6.4%; RCA: r = 0.88, P < 0.0001; bias -2.4%, LOA -8.6 to 3.7%; LCX: r = 0.76, P < 0.0001; bias -5.3%, LOA -10.6 to 2.0%. Intervendor comparisons for regional longitudinal strain by LV level were: basal: r = 0.86, P < 0.0001; bias -3.6%, LOA -9.9 to 2.0%; mid: r = 0.90, P < 0.0001; bias -2.6%, LOA -7.8 to 2.6%; apical: r = 0.74, P < 0.0001; bias -1.3%, LOA -9.4 to 6.8%. Intervendor agreement in GLS and regional strain measurements has significantly improved after the EACVI/ASE Task Force Strain Standardization Initiatives. However, significantly wide LOA still exist, especially for regional strain measurements, which remains relevant when considering vendor-specific software for serial measurements.
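The bias and limits of agreement quoted in the abstract are standard Bland-Altman quantities: for paired measurements from two devices, bias is the mean of the differences and the 95% LOA are bias ± 1.96 × SD of the differences. A minimal computation on made-up GLS readings (the six values are illustrative, not study data):

```python
import numpy as np

# Paired GLS measurements (%) on the same patients from two vendors
# (illustrative values).
vendor_a = np.array([-18.0, -17.2, -16.5, -19.1, -17.8, -15.9])
vendor_b = np.array([-19.0, -18.1, -17.9, -20.2, -18.5, -17.0])

diff = vendor_a - vendor_b
bias = diff.mean()                       # systematic offset between vendors
sd = diff.std(ddof=1)                    # sample SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)
print(f"bias {bias:.2f}%, LOA {loa[0]:.2f}% to {loa[1]:.2f}%")
```

Wide LOA matter clinically because serial measurements taken on different vendors' systems can differ by up to that span even when the mean bias is small.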
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
Consistent Price Systems in Multiasset Markets
Directory of Open Access Journals (Sweden)
Florian Maris
2012-01-01
markets with proportional transaction costs as discussed in the recent paper by Guasoni et al. (2008), where the CFS property is introduced and shown sufficient for CPSs for processes with a certain state space. The current paper extends the results of Guasoni et al. (2008) to processes with a more general state space.
Consistency of students' arguments about fluids
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistent argumentation can improve students' thinking skills and is important in science. The study aims to assess the consistency of students' argumentation about fluids. The population of this study consists of college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a multiple-choice test and reasoned interviews. The fluid problems were modified from [9] and [1]. The results give average argumentation consistency for correct consistency, wrong consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. The data point to a lack of understanding of the fluid material, which under full argumentation consistency would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies to obtain a positive change in the consistency of argumentation.
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses
Choice, internal consistency, and rationality
Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu
2010-01-01
The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...
International Nuclear Information System (INIS)
Rafelski, J.
1979-01-01
After an introductory overview of the bag model the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the virial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
The Principle of Energetic Consistency
Cohn, Stephen E.
2009-01-01
A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
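The bookkeeping behind the principle stated in this abstract (the energy of the conditional mean plus the total variance is what the conserving dynamics preserve) can be illustrated numerically. For a quadratic total energy E(x) = ½xᵀx, the expected ensemble energy decomposes exactly into the energy of the mean plus half the trace of the covariance. A small sketch, not the paper's code:

```python
import random

random.seed(0)

# Draw an ensemble of 2-D "state" vectors
ensemble = [[random.gauss(1.0, 0.5), random.gauss(-2.0, 0.3)] for _ in range(5000)]
n, dim = len(ensemble), 2

mean = [sum(x[i] for x in ensemble) / n for i in range(dim)]

# Mean total energy E[(1/2) x.x] over the ensemble
mean_energy = sum(0.5 * sum(xi * xi for xi in x) for x in ensemble) / n

# Energy of the mean plus half the total variance (trace of covariance)
trace_cov = sum(
    sum((x[i] - mean[i]) ** 2 for x in ensemble) / n for i in range(dim)
)
decomposed = 0.5 * sum(m * m for m in mean) + 0.5 * trace_cov

# The two agree to rounding: E[E(x)] = E(mean) + tr(Cov)/2
assert abs(mean_energy - decomposed) < 1e-9
```

A spurious loss of trace_cov (e.g. from numerical dissipation) would break this identity, which is the diagnostic the abstract proposes for ensemble filters.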
Consistent guiding center drift theories
International Nuclear Information System (INIS)
Wimmel, H.K.
1982-04-01
Various guiding-center drift theories are presented that are optimized with respect to consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of the associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)
Weak consistency and strong paraconsistency
Directory of Open Access Journals (Sweden)
Gemma Robles
2009-11-01
In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ (“E contradictione quodlibet”) rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.
Consistent force fields for saccharides
DEFF Research Database (Denmark)
Rasmussen, Kjeld
1999-01-01
Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x-ray ... anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field...
Glass consistency and glass performance
International Nuclear Information System (INIS)
Plodinec, M.J.; Ramsey, W.G.
1994-01-01
Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has on long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not glass durability
Time-consistent actuarial valuations
Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.
2016-01-01
Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an
Dynamically consistent oil import tariffs
International Nuclear Information System (INIS)
Karp, L.; Newbery, D.M.
1992-01-01
The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs
Consistently violating the non-Gaussian consistency relation
International Nuclear Information System (INIS)
Mooij, Sander; Palma, Gonzalo A.
2015-01-01
Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations
Consistence of Network Filtering Rules
Institute of Scientific and Technical Information of China (English)
SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian
2004-01-01
The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices, and E-government, the number of firewalls/VPNs will increase rapidly, and rule tables, whether stand-alone or networked, will grow in geometric series accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed and a rule-validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
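A rule-consistency check of the kind the paper formalizes can be sketched as overlap detection between rule match spaces: two rules conflict when their match spaces intersect but their actions differ. The one-field rule representation and names below are hypothetical simplifications, not the paper's formalism:

```python
# Each rule matches a destination-port interval and carries an action.
# Two rules are inconsistent if their match spaces overlap but their
# actions differ (a shadowing/conflict anomaly).

def overlaps(r1, r2):
    (lo1, hi1), (lo2, hi2) = r1["ports"], r2["ports"]
    return lo1 <= hi2 and lo2 <= hi1

def conflicts(rules):
    """Return index pairs of rules whose match spaces overlap
    while prescribing different actions."""
    found = []
    for i in range(len(rules)):
        for j in range(i + 1, len(rules)):
            if overlaps(rules[i], rules[j]) and rules[i]["action"] != rules[j]["action"]:
                found.append((i, j))
    return found

rules = [
    {"ports": (80, 80), "action": "allow"},
    {"ports": (1, 1023), "action": "deny"},   # overlaps rule 0, differs
    {"ports": (2000, 3000), "action": "allow"},
]
assert conflicts(rules) == [(0, 1)]
```

Real firewall rules match on several fields at once (addresses, protocols, ports), so the set-theoretic check generalizes this interval intersection to a product of intervals.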
International Nuclear Information System (INIS)
Hazeltine, R.D.
1988-12-01
The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig
Consistency and Communication in Committees
Inga Deimen; Felix Ketelaar; Mark T. Le Quement
2013-01-01
This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...
Deep Feature Consistent Variational Autoencoder
Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping
2016-01-01
We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
International Nuclear Information System (INIS)
Baumann, K; Weber, U; Simeonov, Y; Zink, K
2015-01-01
Purpose: Aim of this study was to optimize the magnetic field strengths of two quadrupole magnets in a particle therapy facility in order to obtain a beam quality suitable for spot beam scanning. Methods: The particle transport through an ion-optic system of a particle therapy facility consisting of the beam tube, two quadrupole magnets and a beam monitor system was calculated with the help of Matlab by using matrices that solve the equation of motion of a charged particle in a magnetic field and field-free region, respectively. The magnetic field strengths were optimized in order to obtain a circular and thin beam spot at the iso-center of the therapy facility. These optimized field strengths were subsequently transferred to the Monte-Carlo code FLUKA and the transport of 80 MeV/u C12-ions through this ion-optic system was calculated by using a user-routine to implement magnetic fields. The fluence along the beam-axis and at the iso-center was evaluated. Results: The magnetic field strengths could be optimized by using Matlab and transferred to the Monte-Carlo code FLUKA. The implementation via a user-routine was successful. Analyzing the fluence-pattern along the beam-axis the characteristic focusing and de-focusing effects of the quadrupole magnets could be reproduced. Furthermore the beam spot at the iso-center was circular and significantly thinner compared to an unfocused beam. Conclusion: In this study a Matlab tool was developed to optimize magnetic field strengths for an ion-optic system consisting of two quadrupole magnets as part of a particle therapy facility. These magnetic field strengths could subsequently be transferred to and implemented in the Monte-Carlo code FLUKA to simulate the particle transport through this optimized ion-optic system
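The matrix method described above (solving the equation of motion through drifts and quadrupoles with transfer matrices) can be sketched in a few lines. This uses the thin-lens 2x2 model in one transverse plane with an illustrative focal length, not the study's actual ion-optic parameters:

```python
def drift(L):
    # Field-free drift of length L acting on (x, x')
    return [[1.0, L], [0.0, 1.0]]

def thin_quad(f):
    # Thin-lens quadrupole of focal length f (focusing in this plane for f > 0)
    return [[1.0, 0.0], [-1.0 / f, 1.0]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def apply(M, v):
    return [M[0][0] * v[0] + M[0][1] * v[1],
            M[1][0] * v[0] + M[1][1] * v[1]]

# Drift f, thin lens f, drift f: a parallel ray (x0, 0) is brought
# to the axis (x = 0) at the end, i.e. the beam is focused to a spot.
f = 2.0
M = matmul(drift(f), matmul(thin_quad(f), drift(f)))  # rightmost acts first
x, xp = apply(M, [0.01, 0.0])
assert abs(x) < 1e-12
```

A pair of such quadrupoles with opposite-sign focal lengths (a doublet), as in the abstract, gives net focusing in both transverse planes; the matrices multiply in the same way.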
On Modal Refinement and Consistency
DEFF Research Database (Denmark)
Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej
2007-01-01
Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement, complete with respect to the standard...
Towards thermodynamical consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest
2003-01-01
The purpose of the present article is to call attention to some realistic quasiparticle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics [ru
Toward thermodynamic consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Toneev, V.D.; Shanenko, A.A.
2003-01-01
The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
Consistency relations in effective field theory
Energy Technology Data Exchange (ETDEWEB)
Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)
2017-06-01
The consistency relations in large scale structure relate the lower-order correlation functions with their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression of EFT relative to SPT results, which scales as the square of the wave number k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.
Self-consistency in Capital Markets
Benbrahim, Hamid
2013-03-01
Capital markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to a three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.
Self-consistent velocity dependent effective interactions
International Nuclear Information System (INIS)
Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.
1993-09-01
The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of 148,154Sm, of the first excited 2+ states of Sn isotopes and of the first excited 3- states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)
Charlton, Bruce G
2009-08-01
The main predictors of examination results and educational achievement in modern societies are intelligence (IQ - or general factor 'g' intelligence) and the personality trait termed 'Conscientiousness' (C). I have previously argued that increased use of continuous assessment (e.g. course work rather than timed and supervised examinations) and increased duration of the educational process implies that modern educational systems have become increasingly selective for the personality trait of Conscientiousness and consequently less selective for IQ. I have tested this prediction (in a preliminary fashion) by looking at the sex ratios in the most selective elite US universities. My two main assumptions are: (1) that a greater proportion of individuals with very high intelligence are men than women, and (2) that women are more conscientious than men. To estimate the proportion of men and women expected at highly-selective schools, I performed demonstration calculations based on three plausible estimates of male and female IQ averages and standard deviations. The expected percentage of men at elite undergraduate colleges (selecting students with IQ above 130 - i.e. in the top 2% of the population) were 66%, 61% and 74%. When these estimates were compared with the sex ratios at 33 elite colleges and universities, only two technical institutes had more than 60% men. Elite US colleges and universities therefore seem to be selecting primarily on the basis of something other than IQ - probably conscientiousness. There is a 'missing population' of very high IQ men who are not being admitted to the most selective and prestigious undergraduate schools, probably because their high school educational qualifications and evaluations are too low. This analysis is therefore consistent with the hypothesis that modern educational systems tend to select more strongly for Conscientiousness than for IQ. The implication is that modern undergraduates at the most-selective US schools are not
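The demonstration calculations mentioned above follow from normal tail areas: given assumed male and female IQ distributions, the expected male share above a cutoff is the ratio of the two tails. A sketch with illustrative parameters (equal means, larger male SD), not the paper's exact estimates:

```python
from math import erfc, sqrt

def tail_above(threshold, mean, sd):
    """P(X > threshold) for a normal(mean, sd) variable."""
    z = (threshold - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

# Illustrative assumption: equal means, slightly larger male SD
p_male = tail_above(130, 100, 15)
p_female = tail_above(130, 100, 14)

# Expected percentage of men among those above IQ 130,
# assuming equal-sized male and female populations
pct_men = 100 * p_male / (p_male + p_female)
```

Even a modest SD difference pushes the expected male share above the cutoff well past 50%, which is the comparison the paper makes against observed sex ratios at selective schools.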
Energy Technology Data Exchange (ETDEWEB)
Neuhoff, K.; Boyd, R.; Grau, T. [Climate Policy Initiative, German Institute for Economic Research (DIW Berlin), Berlin (Germany); Hobbs, B.; Newbery, D. [Electricity Policy Research Group, University of Cambridge, Cambridge (United Kingdom); Borggrefe, F. [University of Cologne, Cologne (Germany); Barquin, J.; Echavarren, F. [Universidad Pontificia Comillas, Madrid (Spain); Bialek, J.; Dent, C. [Durham University, Durham (United Kingdom); Von Hirschhausen, C. [Technical University of Berlin, Berlin (Germany); Kunz, F.; Weigt, H. [Technical University of Dresden, Dresden (Germany); Nabe, C.; Papaefthymiou, G. [Ecofys Germany, Berlin (Germany); Weber, C. [Duisburg-Essen University, Duisburg-Essen (Germany)]
2011-10-15
The core objective of the RE-Shaping project is to assist Member State governments in preparing for the implementation of Directive 2009/28/EC (on the promotion of the use of energy from renewable sources) and to guide a European policy for RES (renewable energy sources) in the mid- to long term. The past and present success of policies for renewable energies will be evaluated and recommendations derived to improve future RES support schemes. The core content of this collaborative research activity comprises: Developing a comprehensive policy background for RES support instruments; Providing the European Commission and Member States with scientifically based and statistically robust indicators to measure the success of currently implemented RES policies; Proposing innovative financing schemes for lower costs and better capital availability in RES financing; Initiation of National Policy Processes which attempt to stimulate debate and offer key stakeholders a meeting place to set and implement RES targets as well as options to improve the national policies fostering RES market penetration; Assessing options to coordinate or even gradually harmonize national RES policy approaches. In the EU, at least 200 gigawatts (GWs) of new and additional renewable electricity sources may be needed by 2020. The aim of this report is to analyse whether the current electricity market and system design is consistent with such an ambitious target. Using an international comparison, we identify opportunities to improve the power market design currently in place across EU countries so as to support the large scale integration of renewable energy sources.
Energy Technology Data Exchange (ETDEWEB)
Niemzig, O.C.
2005-07-18
In order to meet the cost targets of PEM fuel cells for commercialization, significant cost reductions of cell stack components like membrane/electrode assemblies and bipolar plates have become key aspects of research and development. Central topics of this work are the bipolar plates and humidification for portable applications. In an extensive conductivity screening of a variety of carbon polymer compounds with polypropylene as the matrix, the best results were achieved with the carbon black/graphite/polypropylene base system. This material was successfully tested in a fuel cell stack, and its suitability in terms of material and manufacturing costs was demonstrated. Depending on the application, a decrease of material costs to 2 Euro/kg, corresponding to 1.8 Euro/kW, seems possible. Finally, bipolar plates consisting of a selected carbon polymer compound were successfully integrated and tested in a 20-cell stack, which was implemented in a portable PEFC demonstrator unit with a power output between 50 and 150 W. (orig.)
Wang, Meng; Han, Qiutong; Li, Liang; Tang, Lanqin; Li, Haijin; Zhou, Yong; Zou, Zhigang
2017-07-01
An all-solid-state Bi2WO6/Au/CdS Z-scheme system was constructed for the photocatalytic reduction of CO2 into methane in the presence of water vapor. This Z-scheme consists of ultrathin Bi2WO6 nanoplates and CdS nanoparticles as photocatalysts, and a Au nanoparticle as a solid electron mediator offering a high speed charge transfer channel and leading to more efficient spatial separation of electron-hole pairs. The photo-generated electrons from the conduction band (CB) of Bi2WO6 transfer to the Au, and then release to the valence band (VB) of CdS to recombine with the holes of CdS. It allows the electrons remaining in the CB of CdS and holes in the VB of Bi2WO6 to possess strong reduction and oxidation powers, respectively, leading the Bi2WO6/Au/CdS to exhibit high photocatalytic reduction of CO2, relative to bare Bi2WO6, Bi2WO6/Au, and Bi2WO6/CdS. The depressed hole density on CdS also enhances the stability of the CdS against photocorrosion.
Thermodynamically consistent model calibration in chemical kinetics
Directory of Open Access Journals (Sweden)
Goutsias John
2011-05-01
Background: The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results: We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions: TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function as well as to estimate thermodynamically feasible values for the parameters of new
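The thermodynamic constraints on rate constants referred to in this abstract include Wegscheider-type cycle conditions: around any closed reaction loop, detailed balance requires the product of forward rate constants to equal the product of reverse ones. A minimal feasibility check (rate values illustrative, not from the EGF/ERK model):

```python
import math

def wegscheider_consistent(cycle, tol=1e-9):
    """Check the Wegscheider condition for a reaction cycle.

    cycle: list of (k_forward, k_reverse) pairs for the reactions
    forming a closed loop. Detailed balance requires
    prod(k_f) == prod(k_r) around the loop, i.e. the sum of
    log equilibrium constants vanishes.
    """
    log_ratio = sum(math.log(kf) - math.log(kr) for kf, kr in cycle)
    return abs(log_ratio) < tol

# A 3-reaction cycle A<->B<->C<->A with equilibrium constants 2, 3, 1/6
consistent = [(2.0, 1.0), (3.0, 1.0), (0.5, 3.0)]
assert wegscheider_consistent(consistent)

# Perturbing one constant breaks detailed balance around the loop
inconsistent = [(2.0, 1.0), (3.0, 1.0), (1.0, 3.0)]
assert not wegscheider_consistent(inconsistent)
```

A calibration method like TCMC would treat such cycle equalities as constraints on the optimization over kinetic parameters rather than checking them after the fact.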
Gentzen's centenary the quest for consistency
Rathjen, Michael
2015-01-01
Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.
Generalized contexts and consistent histories in quantum mechanics
International Nuclear Information System (INIS)
Losada, Marcelo; Laura, Roberto
2014-01-01
We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times
A consistent thermodynamic database for cement minerals
International Nuclear Information System (INIS)
Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.
2010-01-01
work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the Log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolites, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field toward the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, notably with respect to a hydrogarnet/grossular solid solution. With respect to other databases, this work was carried out in consistency with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context
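The Log K(T) calculation described above follows a standard thermodynamic relation. As a hedged illustration (not the paper's actual selection or integration procedure, and assuming a temperature-independent reaction heat capacity), log10 K can be computed from the reaction enthalpy and entropy at a reference temperature:

```python
import math

R = 8.314462618  # gas constant, J/(mol K)

def log_k(T, dH298, dS298, dCp=0.0, Tref=298.15):
    """log10 K(T) from the reaction enthalpy dH298 (J/mol) and entropy
    dS298 (J/(mol K)) at Tref, with a constant reaction heat capacity
    dCp (J/(mol K)): dG(T) = dH(T) - T*dS(T), log K = -dG/(ln10 * R * T)."""
    dH = dH298 + dCp * (T - Tref)
    dS = dS298 + dCp * math.log(T / Tref)
    dG = dH - T * dS
    return -dG / (math.log(10) * R * T)
```

Databases such as Thermochimie store the equivalent information as fitted log K(T) polynomials; the constant-dCp form here is the simplest consistent approximation.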
Johnson, P. D.; Ferrini, V. L.; Jerram, K.
2016-12-01
In 2015 the National Science Foundation funded the University of New Hampshire's Center for Coastal and Ocean Mapping and Lamont-Doherty Earth Observatory, for the second time, to coordinate the effort of standardizing the quality of multibeam echosounder (MBES) data across the U.S. academic fleet. This effort supports 9 different ship operating institutions who manage a total of 12 multibeam-equipped ships carrying 6 different MBES systems, manufactured by two different companies. These MBES are designed to operate over a very wide range of depths and operational modes. The complexity of this endeavor led to the creation of the Multibeam Advisory Committee (MAC), a team of academic and industry experts whose mission is to support the needs of the U.S. academic fleet's multibeam echosounders through all of the phases of the "life" of a MBES system and its data, from initial acceptance of the system, to recommendations on at-sea acquisition of data, to validation of already installed systems, and finally to the post-survey data evaluation. The main activities of the MAC include 1.) standardizing both the Shipboard Acceptance Testing of all new systems and the Quality Assurance Testing of already installed systems, 2.) working with both the ship operators/technicians and the manufacturers of the multibeam systems to guarantee that each MBES is working at its peak performance level, 3.) developing tools that aid in the collection of data, assessment of the MBES hardware, and evaluation of the quality of the MBES data, 4.) creating "best practices" documentation concerning data acquisition and workflow, and 5.) providing a website, http://mac.unols.org, to host technical information, tools, reports, and a "help desk" for operators of the systems to ask questions concerning issues that they see with their systems.
A Consistent Phylogenetic Backbone for the Fungi
Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt
2012-01-01
The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
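The consistency criterion described above — trusting only branches (splits) recovered with independent data sets and different reconstruction methods — reduces, at its core, to a set intersection over the splits each analysis supports. The sketch below is an illustrative assumption, not the paper's pipeline; it represents each split as a frozenset of the taxa on one side of a branch:

```python
def consistent_splits(split_sets):
    """Return the splits (bipartitions of taxa, each given as a frozenset
    of the taxa on one side) recovered by *every* analysis. Splits found
    by only some methods/data sets are discarded as unsupported."""
    common = set(split_sets[0])
    for splits in split_sets[1:]:
        common &= set(splits)
    return common

# Two analyses agree only on the {A, B} grouping.
shared = consistent_splits([
    [frozenset({"A", "B"}), frozenset({"C", "D"})],
    [frozenset({"A", "B"}), frozenset({"B", "C"})],
])
assert shared == {frozenset({"A", "B"})}
```

Real phylogenomic pipelines additionally weight splits by support values and must reconcile differing taxon samples, which this sketch ignores.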
A Preliminary Study toward Consistent Soil Moisture from AMSR2
Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.
2015-01-01
A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global
Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.
Edwards, H. P.; And Others
1982-01-01
Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)
Replica consistency in a Data Grid
International Nuclear Information System (INIS)
Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt
2004-01-01
A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
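The replica-consistency problem sketched in the abstract above can be illustrated minimally: a catalogue tracking replicas of a data set must detect copies that have diverged from the authoritative version. The sketch below is a toy checksum comparison, not the European Data Grid's Replica Management Service interface; site names and the data layout are illustrative assumptions.

```python
import hashlib

def find_stale_replicas(master: bytes, replicas: dict) -> list:
    """Return the names of replica sites whose copy's checksum does not
    match the master copy - a minimal catalogue-consistency check.
    `replicas` maps site name -> that site's copy of the data set."""
    master_sum = hashlib.sha256(master).hexdigest()
    return [site for site, data in replicas.items()
            if hashlib.sha256(data).hexdigest() != master_sum]

stale = find_stale_replicas(b"dataset-v2",
                            {"site-a": b"dataset-v2", "site-b": b"dataset-v1"})
assert stale == ["site-b"]
```

A production service must additionally handle concurrent writers, propagation ordering, and replicated catalogues themselves, which the paper identifies as the stricter metadata-consistency problem.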
Student Effort, Consistency, and Online Performance
Patron, Hilde; Lopez, Salvador
2011-01-01
This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
Translationally invariant self-consistent field theories
International Nuclear Information System (INIS)
Shakin, C.M.; Weiss, M.S.
1977-01-01
We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables
Consistent-handed individuals are more authoritarian.
Lyle, Keith B; Grillo, Michael C
2014-01-01
Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.
Testing the visual consistency of web sites
van der Geest, Thea; Loorbach, N.R.
2005-01-01
Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to
Consistent spectroscopy for an extended gauge model
International Nuclear Information System (INIS)
Oliveira Neto, G. de.
1990-11-01
A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is understood the determination of the quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)
Modeling and Testing Legacy Data Consistency Requirements
DEFF Research Database (Denmark)
Nytun, J. P.; Jensen, Christian Søndergaard
2003-01-01
An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers…
View from Europe: stability, consistency or pragmatism
International Nuclear Information System (INIS)
Dunster, H.J.
1988-01-01
The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion
Speed Consistency in the Smart Tachograph.
Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco
2018-05-16
In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging possible misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which can be potentially manipulated to avoid the monitoring of the driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles and considering light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, latencies between GNSS and odometry data, and simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way. This makes frauds more difficult to implement than in the current version of the DT.
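The speed-consistency check evaluated above can be sketched as a simple paired comparison of GNSS and odometry samples. This is an illustrative simplification, not the test procedure prescribed by the ST regulation (whose metrics, windows, and thresholds differ); the tolerance, the minimum-pair count, and the use of a median are assumptions. It does, however, show why data gaps (e.g. tunnels) can be tolerated and why uniform scaling of one sensor's data is detected:

```python
def speeds_consistent(gnss, odo, tol_kmh=10.0, min_pairs=5):
    """Compare paired GNSS and odometry speed samples (km/h). Samples
    missing from either source (None, e.g. GNSS outages in tunnels) are
    skipped. Returns True/False on the consistency of the valid pairs
    (median absolute difference vs. tol_kmh), or None when too few
    overlapping samples remain to decide."""
    diffs = sorted(abs(g - o) for g, o in zip(gnss, odo)
                   if g is not None and o is not None)
    if len(diffs) < min_pairs:
        return None  # not enough overlapping data to decide
    return diffs[len(diffs) // 2] <= tol_kmh
```

Because the median difference is used, a few spurious samples or short gaps do not flip the verdict, but scaling the odometry stream shifts every difference and triggers a motion conflict.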
Energy Technology Data Exchange (ETDEWEB)
Uslar, Mathias; Beenken, Petra; Beer, Sebastian [OFFIS, Oldenburg (Germany)
2009-07-01
The ongoing integration of distributed energy resources into the existing power grid has led to both increased communication costs and an increased need for interoperability between the involved actors. In this context, standardized and ontology-based data models help to reduce integration costs in heterogeneous system landscapes. Using ontology-based security profiles, such models can be extended with meta-data containing information about security measures for energy-related data in need of protection. By this approach, we achieve both a unified data model and a unified security level. (orig.)
Consistency in the World Wide Web
DEFF Research Database (Denmark)
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how … the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Consistent histories and operational quantum theory
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail
Self-consistent areas law in QCD
International Nuclear Information System (INIS)
Makeenko, Yu.M.; Migdal, A.A.
1980-01-01
The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new, manifestly gauge-invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution
Consistency of the MLE under mixture models
Chen, Jiahua
2016-01-01
The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...
Self-consistent asset pricing models
Malevergne, Y.; Sornette, D.
2007-08-01
We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
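One of the self-consistency conditions discussed above has a simple observable form: if the proxy portfolio is built from the very assets being regressed on it, the weighted sum of the assets' betas must equal exactly 1 (the proxy's beta on itself). The sketch below is an illustrative diagnostic under that assumption, using plain OLS with synthetic return series, not the paper's S&P500 implementation:

```python
def proxy_betas(returns, weights):
    """OLS betas of each asset against the proxy portfolio constructed
    from the same assets with the given weights. Self-consistency
    ('normality') implies sum_i w_i * beta_i == 1, since covariance is
    linear and cov(proxy, proxy) / var(proxy) = 1."""
    T = len(returns[0])
    proxy = [sum(w * r[t] for w, r in zip(weights, returns))
             for t in range(T)]
    pm = sum(proxy) / T
    var = sum((p - pm) ** 2 for p in proxy)
    betas = []
    for r in returns:
        rm = sum(r) / T
        cov = sum((r[t] - rm) * (proxy[t] - pm) for t in range(T))
        betas.append(cov / var)
    return betas
```

With a true market proxy the identity holds to rounding error; systematic departures of the weighted beta sum from 1, as in the paper's diagnostics, signal that the chosen proxy differs from the portfolio the model assumes.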
International Nuclear Information System (INIS)
Ingremeau, J.-J.X.
2011-01-01
In the study of any new nuclear reactor, the design of the core is an important step. However designing and optimising a reactor core is quite complex as it involves neutronics, thermal-hydraulics and fuel thermomechanics and usually design of such a system is achieved through an iterative process, involving several different disciplines. In order to solve quickly such a multi-disciplinary system, while observing the appropriate constraints, a new approach has been developed to optimise both the core performance (in-cycle Pu inventory, fuel burn-up, etc...) and the core safety characteristics (safety estimators) of a Fast Neutron Reactor. This new approach, called FARM (Fast Reactor Methodology) uses analytical models and interpolations (Meta-models) from CEA reference codes for neutronics, thermal-hydraulics and fuel behaviour, which are coupled to automatically design a core based on several optimization variables. This global core model is then linked to a genetic algorithm and used to explore and optimise new core designs with improved performance. Consideration has also been given to which parameters can be best used to define the core performance and how safety can be taken into account.This new approach has been used to optimize the design of three concepts of Gas cooled Fast Reactor (GFR). For the first one, using a SiC/SiCf-cladded carbide-fuelled helium-bonded pin, the results demonstrate that the CEA reference core obtained with the traditional iterative method was an optimal core, but among many other possibilities (that is to say on the Pareto front). The optimization also found several other cores which exhibit some improved features at the expense of other safety or performance estimators. An evolution of this concept using a 'buffer', a new technology being developed at CEA, has hence been introduced in FARM. The FARM optimisation produced several core designs using this technology, and estimated their performance. The results obtained show that
Consistency and Reconciliation Model In Regional Development Planning
Directory of Open Access Journals (Sweden)
Dina Suryawati
2016-10-01
Full Text Available The aim of this study was to identify the problems and determine the conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that is well-integrated with central planning and inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves both a technocratic (top-down) system and a participatory (bottom-up) system. The two must be balanced, neither overlapping nor dominating the other. Keywords: regional, development, planning, consistency, reconciliation
Dong, S.
2018-05-01
We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.
Putting humans in ecology: consistency in science and management.
Hobbs, Larry; Fowler, Charles W
2008-03-01
Normal and abnormal levels of human participation in ecosystems can be revealed through the use of macro-ecological patterns. Such patterns also provide consistent and objective guidance that will lead to achieving and maintaining ecosystem health and sustainability. This paper focuses on the consistency of this type of guidance and management. Such management, in sharp contrast to current management practices, ensures that our actions as individuals, institutions, political groups, societies, and as a species are applied consistently across all temporal, spatial, and organizational scales. This approach supplants management of today, where inconsistency results from debate, politics, and legal and religious polarity. Consistency is achieved when human endeavors are guided by natural patterns. Pattern-based management meets long-standing demands for enlightened management that requires humans to participate in complex systems in consistent and sustainable ways.
International Nuclear Information System (INIS)
Shepard, J.R.
1991-01-01
The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data
Personalized recommendation based on unbiased consistence
Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao
2015-08-01
Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
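The unidirectional mass diffusion that the letter above builds on (often called ProbS) can be sketched directly: resource placed on a target user's collected items spreads equally to the users of those items, then back equally over each user's items; the resource landing on uncollected items is the recommendation score. This is a minimal sketch of the classical baseline the authors improve upon, not their bidirectional consistence-based variant:

```python
def mass_diffusion_scores(user_items, target):
    """One round of mass diffusion (ProbS) on a user-item bipartite graph.
    `user_items` maps each user to the set of items they collected."""
    # Invert the graph: item -> users who collected it.
    item_users = {}
    for u, items in user_items.items():
        for it in items:
            item_users.setdefault(it, set()).add(u)
    # Step 1: unit resource on each of the target's items -> users.
    user_res = {u: 0.0 for u in user_items}
    for it in user_items[target]:
        share = 1.0 / len(item_users[it])
        for u in item_users[it]:
            user_res[u] += share
    # Step 2: users -> items, split equally over each user's items.
    scores = {}
    for u, res in user_res.items():
        if res == 0.0 or not user_items[u]:
            continue
        share = res / len(user_items[u])
        for it in user_items[u]:
            scores[it] = scores.get(it, 0.0) + share
    # Rank only items the target has not yet collected.
    return {it: s for it, s in scores.items() if it not in user_items[target]}
```

The bidirectional variant proposed in the letter additionally diffuses mass from candidate items back toward the collected set and combines both directions; that step is omitted here.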
Financial model calibration using consistency hints.
Abu-Mostafa, Y S
2001-01-01
We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
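The hint mechanism described above augments an ordinary curve-fitting objective with penalty terms measured by the Kullback-Leibler distance. As a hedged sketch of that structure only (the paper's actual hints concern the multifactor Vasicek model; the Bernoulli parameterization and the `weight` balance here are illustrative assumptions):

```python
import math

def kl_bernoulli(p, q):
    """Kullback-Leibler distance between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def augmented_error(fit_error, p_model, p_hint, weight=1.0):
    """Curve-fitting error augmented with a consistency-hint term: the
    hint's violation is penalised through a KL distance and balanced
    against the fit error by `weight` (cf. canonical errors)."""
    return fit_error + weight * kl_bernoulli(p_model, p_hint)
```

When the model already satisfies the hint, the KL term vanishes and the objective reduces to plain curve fitting; otherwise the optimizer trades fit quality against hint consistency.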
Parquet equations for numerical self-consistent-field theory
International Nuclear Information System (INIS)
Bickers, N.E.
1991-01-01
In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs
Proteolysis and consistency of Meshanger cheese
Jong, de L.
1978-01-01
Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of α_s1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of
Developing consistent pronunciation models for phonemic variants
CSIR Research Space (South Africa)
Davel, M
2006-09-01
Full Text Available Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...
Image recognition and consistency of response
Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.
2012-02-01
Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked of each image whether it had been included in the first set. For this study, we evaluated only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of the image and consistency in response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction, with radiologists being slightly more likely to give a consistent response with respect to images they did not recognize than with respect to those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
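The recognition-versus-consistency comparison reduces to a 2×2 contingency test; a minimal sketch with SciPy follows, where the counts are made-up placeholders, not the study's data.

```python
from scipy.stats import fisher_exact

# Hypothetical counts (NOT the study's data):
# rows: image recognized yes/no; cols: response consistent yes/no
table = [[30, 14],
         [35, 9]]

odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```

An odds ratio below 1 with these placeholder counts would mirror the reported trend: recognized images were slightly *less* likely to receive a consistent response.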
Consistent Valuation across Curves Using Pricing Kernels
Directory of Open Access Journals (Sweden)
Andrea Macrina
2018-03-01
Full Text Available The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets. As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.
Guided color consistency optimization for image mosaicking
Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li
2018-01-01
This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among them under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color difference between images is large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose a histogram extreme point matching algorithm that is robust to image geometrical misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by selecting an image subset as the reference, whose color characteristics are transferred to the others via the paths of graph analysis. Thus, the final results after global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of experiments on both synthetic and challenging real datasets demonstrate that the proposed approach achieves results as good as or better than the state-of-the-art approaches.
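A classical histogram-specification mapping is a simpler, less misalignment-robust relative of the histogram extreme point matching the paper proposes; the sketch below transfers the intensity distribution of a "reference" overlap to a "source" overlap, with all arrays invented for illustration.

```python
import numpy as np

def match_histogram(src, ref):
    """Monotone intensity mapping that reshapes src's histogram to
    match ref's (classic histogram specification)."""
    s_vals, s_counts = np.unique(src.ravel(), return_counts=True)
    r_vals, r_counts = np.unique(ref.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / src.size
    r_cdf = np.cumsum(r_counts) / ref.size
    # map each source value to the reference value at the same CDF level
    mapped = np.interp(s_cdf, r_cdf, r_vals)
    return np.interp(src.ravel(), s_vals, mapped).reshape(src.shape)

rng = np.random.default_rng(0)
dark = rng.integers(0, 100, size=(64, 64)).astype(float)   # darker overlap
bright = dark + 50.0                                       # brighter overlap
corrected = match_histogram(dark, bright)
```

Because the two toy overlaps differ by an exact monotone shift, the mapping recovers it; with real geometrical misalignment the cumulative histograms no longer align cleanly, which is the failure mode the paper's extreme-point variant targets.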
Consistent application of codes and standards
International Nuclear Information System (INIS)
Scott, M.A.
1989-01-01
The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines
Consistency in multi-viewpoint architectural design
Dijkman, R.M.; Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Consistent Visual Analyses of Intrasubject Data
Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli
2010-01-01
Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…
Consistent Stochastic Modelling of Meteocean Design Parameters
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Sterndorff, M. J.
2000-01-01
Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...
Dynamic phonon exchange requires consistent dressing
International Nuclear Information System (INIS)
Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.
1976-01-01
It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner
Consistent feeding positions of great tit parents
Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.
2006-01-01
When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is
Consistency of the postulates of special relativity
International Nuclear Information System (INIS)
Gron, O.; Nicola, M.
1976-01-01
In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
Consistency analysis of network traffic repositories
Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for
Matrix approach to the Shapley value and dual similar associated consistency
Xu, G.; Driessen, Theo
Replacing associated consistency in Hamiache's axiom system by dual similar associated consistency, we axiomatize the Shapley value as the unique value verifying the inessential game property, continuity and dual similar associated consistency. Continuing the matrix analysis for Hamiache's
Consistency among integral measurements of aggregate decay heat power
Energy Technology Data Exchange (ETDEWEB)
Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)]
1998-03-01
Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of {sup 232}Th, {sup 233}U, {sup 235}U, {sup 238}U and {sup 239}Pu at YAYOI. The consistency among the measured values is found to be satisfied for the {beta} component and fairly well for the {gamma} component, except for cooling times longer than 4000 s. (author)
A consistent interpretation of quantum mechanics
International Nuclear Information System (INIS)
Omnes, Roland
1990-01-01
Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths consistent history. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)
Student Effort, Consistency and Online Performance
Directory of Open Access Journals (Sweden)
Hilde Patron
2011-07-01
Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
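The regression described (grades on motivation, effort, consistency and marginal learning) can be sketched on synthetic data; the sample size matches the study, but every coefficient and distribution below is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 212                                    # same sample size as the study
gpa = rng.uniform(2.0, 4.0, n)             # motivation proxy
effort = rng.uniform(200.0, 2000.0, n)     # total minutes online
time_sd = rng.uniform(5.0, 60.0, n)        # variation in study times
marginal = rng.normal(10.0, 5.0, n)        # post-test minus pre-test

# invented data-generating process: grades depend on GPA, consistency
# (negatively on time variation) and marginal learning, but not effort
grade = (40.0 + 12.0 * gpa - 0.15 * time_sd + 0.5 * marginal
         + rng.normal(0.0, 5.0, n))

X = np.column_stack([np.ones(n), gpa, effort, time_sd, marginal])
beta, *_ = np.linalg.lstsq(X, grade, rcond=None)
```

In a fit like this the effort coefficient stays near zero while the time-variation coefficient is negative, mirroring the paper's finding that consistency, not total minutes online, predicts grades.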
Consistency relation for cosmic magnetic fields
DEFF Research Database (Denmark)
Jain, R. K.; Sloth, M. S.
2012-01-01
If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistent Estimation of Partition Markov Models
Directory of Open Access Journals (Sweden)
Jesús E. García
2017-04-01
Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated. In order to answer these questions, we build a consistent strategy for model selection which consists of: given a size-n realization of the process, finding a model within the Partition Markov class with a minimal number of parts to represent the process law. From the strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian, then, eventually, when n goes to infinity, L will be retrieved. We show an application to model internet navigation patterns.
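The grouping of states with a shared transition law can be sketched as follows; the total-variation threshold used here is a crude stand-in for the paper's penalized model-selection criterion, and the three-state chain is invented for illustration.

```python
import numpy as np
from collections import defaultdict

def partition_states(seq, alphabet, tol=0.1):
    """Group states whose empirical transition distributions differ by
    less than `tol` in total variation distance -- a crude stand-in for
    the paper's consistent model-selection strategy."""
    idx = {a: i for i, a in enumerate(alphabet)}
    counts = defaultdict(lambda: np.zeros(len(alphabet)))
    for a, b in zip(seq, seq[1:]):
        counts[a][idx[b]] += 1.0
    rows = {a: c / c.sum() for a, c in counts.items()}
    parts = []
    for a in sorted(rows):
        for part in parts:
            if 0.5 * np.abs(rows[a] - rows[part[0]]).sum() < tol:
                part.append(a)
                break
        else:
            parts.append([a])
    return parts

# simulate a chain where 'a' and 'b' share one transition law and 'c'
# has another, so the true partition is {{'a','b'}, {'c'}}
rng = np.random.default_rng(1)
P = {'a': [0.5, 0.25, 0.25], 'b': [0.5, 0.25, 0.25], 'c': [0.1, 0.1, 0.8]}
state, seq = 'a', []
for _ in range(30000):
    seq.append(state)
    state = rng.choice(['a', 'b', 'c'], p=P[state])
parts = partition_states(seq, ['a', 'b', 'c'])
```

With a long enough realization the empirical rows of 'a' and 'b' converge to the same law, so the recovered partition matches the true one, which is the consistency property the paper proves for its estimator.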
Internal Branding and Employee Brand Consistent Behaviours
DEFF Research Database (Denmark)
Mazzei, Alessandra; Ravazzani, Silvia
2017-01-01
Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a nonnormative view of internal branding grounded in constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such communication creates the organizational conditions adequate to sustain brand consistent behaviours.
FTA Transit Intelligent Transportation System Architecture Consistency Review - 2010 Update
2011-07-01
This report provides an assessment on the level of compliance among the FTA grantees with the National ITS Architecture Policy, specifically examining three items: 1. The use and maintenance of Regional ITS Architectures by transit agencies to plan, ...
Consistent Prediction of Properties of Systems with Lipids
DEFF Research Database (Denmark)
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Equilibria between vapour, liquid and/or solid phases, pure component properties and also the mixture-phase properties are necessary for synthesis, design and analysis of different unit operations found in the production of edible oils, fats and biodiesel, and a systematic numerical analysis of these properties is needed. Lipids are found in almost all mixtures involving edible oils, fats and biodiesel; they are also being extracted for use in the pharma industry. A database of pure components (lipids) present in these processes and of mixture properties has been developed and made available for different applications (model development, property verification, property prediction, etc.). The database has verified data for fatty acids, acylglycerols, fatty esters, fatty alcohols, vegetable oils, biodiesel and minor compounds such as phospholipids, tocopherols, sterols, carotene and squalene, together with a user-friendly
Evaluating Temporal Consistency in Marine Biodiversity Hotspots
Piacenza, Susan E.; Thurman, Lindsey L.; Barner, Allison K.; Benkwitt, Cassandra E.; Boersma, Kate S.; Cerny-Chipman, Elizabeth B.; Ingeman, Kurt E.; Kindinger, Tye L.; Lindsley, Amy J.; Nelson, Jake; Reimer, Jessica N.; Rowe, Jennifer C.; Shen, Chenchen; Thompson, Kevin A.; Heppell, Selina S.
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monito...
Cloud Standardization: Consistent Business Processes and Information
Directory of Open Access Journals (Sweden)
Razvan Daniel ZOTA
2013-01-01
Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Consistency Analysis of Nearest Subspace Classifier
Wang, Yi
2015-01-01
The Nearest subspace classifier (NSS) finds an estimation of the underlying subspace within each class and assigns data points to the class that corresponds to its nearest subspace. This paper mainly studies how well NSS can be generalized to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
Consistent probabilities in loop quantum cosmology
International Nuclear Information System (INIS)
Craig, David A; Singh, Parampreet
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)
Orthology and paralogy constraints: satisfiability and consistency.
Lafond, Manuel; El-Mabrouk, Nadia
2014-01-01
A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided in two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm in the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
Autonomous Navigation with Constrained Consistency for C-Ranger
Directory of Open Access Journals (Sweden)
Shujing Zhang
2014-06-01
Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. The ability to carry out localization autonomously while concurrently building an environmental map, in other words simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to achieve truly autonomous navigation. However, the consistency problem of the SLAM system has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
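The EKF measurement update at the heart of such a design can be sketched as follows. The one-dimensional robot/landmark state and noise values are invented for illustration; the LC idea enters only through *where* the Jacobian H is evaluated (a landmark estimate frozen at the start of the local period rather than the latest one).

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """Standard EKF measurement update. In an LC-EKF, the Jacobian H is
    evaluated at landmark estimates frozen at the start of a local time
    period instead of at the latest estimates."""
    y = z - h(x)                               # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# 1-D toy: state = [robot position, landmark position],
# measurement = landmark position relative to the robot
x = np.array([0.0, 5.0])
P = np.eye(2)
H = np.array([[-1.0, 1.0]])                    # Jacobian of h (linear here)
R = np.array([[0.01]])
z = np.array([4.8])                            # observed relative position
x_post, P_post = ekf_update(x, P, z, lambda s: np.array([s[1] - s[0]]), H, R)
```

In this linear toy the Jacobian does not depend on the state, so frozen and latest linearization points coincide; the consistency benefit of the LC choice appears only for nonlinear (e.g. range-bearing) measurement models.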
Consistency of color representation in smart phones.
Dain, Stephen J; Kwan, Benjamin; Wong, Leslie
2016-03-01
One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated, and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers, especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone 5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white-LED-backlit LCDs and the Samsungs are OLEDs. The color gamut varies between models, and comparison with sRGB space shows 61%, 85%, 117%, and 110% coverage, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4s and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone 4s and ±0.002 for the others, although the spread of white points between models was u'v' ±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirrors the variation in the primaries. The variation in
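The gamut comparison described can be sketched in CIE 1976 u'v' coordinates: convert primaries to u'v', then compare triangle areas. The sRGB primaries below follow from their standard xy chromaticities, while the phone primaries are invented placeholders, not the paper's measurements.

```python
import numpy as np

def uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity from tristimulus values XYZ."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d

def triangle_area(pts):
    """Area of the triangle spanned by three (u', v') primaries."""
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1)) / 2.0

# sRGB primaries in u'v' (derived from their standard xy chromaticities)
srgb = [(0.4507, 0.5229), (0.1250, 0.5625), (0.1754, 0.1579)]
# hypothetical measured primaries of one phone screen (placeholders)
phone = [(0.46, 0.52), (0.12, 0.56), (0.17, 0.16)]

coverage = triangle_area(phone) / triangle_area(srgb)
```

Ratios like the paper's 61% to 117% figures come from exactly this kind of area comparison against the fixed sRGB triangle.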
On the Consistent Migration of Unsplittable Flows
DEFF Research Database (Denmark)
Förster, Klaus-Tycho
2017-01-01
in an inherently asynchronous system, the switches distributed over the network. To this end, a multitude of scheduling systems have been proposed since the initial papers of Reitblatt et al. (Abstractions for Network Update, SIGCOMM ’12) and Hong et al. (SWAN, SIGCOMM ’13). While the complexity
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
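The spatial-frequency hotspot designation can be sketched as below; the grid size, richness values and the 90th-percentile threshold are invented stand-ins for the survey data.

```python
import numpy as np

rng = np.random.default_rng(2)
years, cells = 8, 100                        # 8-year survey, 100 grid cells

# synthetic species richness per year and cell (placeholder for trawl data)
richness = rng.poisson(20, size=(years, cells)).astype(float)

# per-year threshold: a cell is a hotspot if its richness reaches the
# top 10% of that year's values
thresh = np.quantile(richness, 0.9, axis=1, keepdims=True)
hotspot = richness >= thresh

# temporal consistency: fraction of years each cell is flagged
consistency = hotspot.mean(axis=0)
persistent = int((consistency > 0.5).sum())  # cells hot in >50% of years
```

With independent year-to-year noise like this, essentially no cell stays a hotspot in most years, which is the same qualitative pattern of low temporal consistency the study reports for the real benthic fish data.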
Self-consistent gravitational self-force
International Nuclear Information System (INIS)
Pound, Adam
2010-01-01
I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.
Consistency Checking of Web Service Contracts
DEFF Research Database (Denmark)
Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter
2008-01-01
Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts...... are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Work Bench. The proposed techniques are illustrated with a case study that include otherwise difficult to analyze fault...
A method for consistent precision radiation therapy
International Nuclear Information System (INIS)
Leong, J.
1985-01-01
Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, applies only to the specific anatomical points selected for this positioning procedure and not to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)
Two consistent calculations of the Weinberg angle
International Nuclear Information System (INIS)
Fairlie, D.B.
1979-01-01
The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea, borrowed from monopole theory, that the electromagnetic field is in the direction of the Higgs field. (Author)
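For reference, the charge-one condition quoted above pins the angle exactly:

```latex
\theta_W = 30^\circ \quad\Longrightarrow\quad \sin^2\theta_W = \tfrac{1}{4},
\qquad \sin^2\theta_W \equiv \frac{g'^2}{g^2 + g'^2}
```

This is close to, though not equal to, the experimentally measured sin²θ_W ≈ 0.23 (a present-day value, not part of the 1979 paper).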
Consistent resolution of some relativistic quantum paradoxes
International Nuclear Information System (INIS)
Griffiths, Robert B.
2002-01-01
A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike-separated regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise from modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics.
Self-consistent model of confinement
International Nuclear Information System (INIS)
Swift, A.R.
1988-01-01
A model of the large-spatial-distance, zero three-momentum limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency
Subgame consistent cooperation a comprehensive treatise
Yeung, David W K
2016-01-01
Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustained if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of the cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until their end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this “classic” problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...
Sludge characterization: the role of physical consistency
Energy Technology Data Exchange (ETDEWEB)
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
Physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated for fulfilling regulatory requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity measurements. (author)
Consistent mutational paths predict eukaryotic thermostability
Directory of Open Access Journals (Sweden)
van Noort Vera
2013-01-01
Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability, and we validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
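The kind of alignment-based substitution census behind the lysine-to-arginine finding can be illustrated with a toy sketch; the sequences and function below are invented for illustration, while the paper's actual analysis is phylogenetics-guided and genome-wide:

```python
def substitution_counts(mesophile_seq, thermophile_seq):
    """Count residue substitutions between two pre-aligned protein
    sequences (gaps written as '-'); returns {(from, to): count}."""
    counts = {}
    for a, b in zip(mesophile_seq, thermophile_seq):
        if a != b and a != '-' and b != '-':
            counts[(a, b)] = counts.get((a, b), 0) + 1
    return counts

# A lysine-to-arginine bias like the one reported would show up as a
# high count for the ('K', 'R') key relative to its reverse ('R', 'K').
meso  = "MKKLAEKV-K"   # hypothetical mesophile fragment
therm = "MRKLAERVAR"   # hypothetical thermophile fragment
subs = substitution_counts(meso, therm)
```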
Consistency of extreme flood estimation approaches
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodical procedures to estimate low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, since the "true value" of an extreme flood is not observable. Nevertheless, a detailed comparison performed on a given case study yields useful information about the statistical and hydrological processes involved in the different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested for two different Swiss catchments. The results and some intermediate variables are used for assessing potential strengths and weaknesses of each method, as well as for evaluating the consistency of these methods.
Consistent biokinetic models for the actinide elements
International Nuclear Information System (INIS)
Leggett, R.W.
2001-01-01
The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)
Consistency of canonical formulation of Horava gravity
International Nuclear Information System (INIS)
Soo, Chopin
2011-01-01
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Consistency of canonical formulation of Horava gravity
Energy Technology Data Exchange (ETDEWEB)
Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)
2011-09-22
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Evaluating the hydrological consistency of evaporation products
Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew
2017-01-01
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
Self-consistent modelling of ICRH
International Nuclear Information System (INIS)
Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.
2001-01-01
The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)
Non linear self consistency of microtearing modes
International Nuclear Information System (INIS)
Garbet, X.; Mourgues, F.; Samain, A.
1987-01-01
The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via the Ampere law results from the combined action of the radial electric field in the frame where the island chains are static and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible
Consistent evolution in a pedestrian flow
Guan, Junbiao; Wang, Kaihua
2016-03-01
In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be owed to a self-organization effect. Moreover, an appropriate proportion of initial defectors exhibiting herding behavior, coupled with an appropriate proportion of initial defectors exhibiting rationally independent thinking, are two necessary factors for a short evacuation time.
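A minimal sketch of a grid-based snowdrift dynamic of the sort such a CA model builds on; the payoff values, neighbourhood, and imitate-the-best update rule are illustrative assumptions, not the paper's exact rules:

```python
import random

# Snowdrift payoffs with benefit B and cost C (B > C > 0); values are
# illustrative, not taken from the paper.
B, C = 1.0, 0.6
PAYOFF = {('C', 'C'): B - C / 2, ('C', 'D'): B - C,
          ('D', 'C'): B,         ('D', 'D'): 0.0}

def neighbours(i, j, n):
    """Von Neumann neighbourhood with periodic boundaries."""
    return [((i - 1) % n, j), ((i + 1) % n, j),
            (i, (j - 1) % n), (i, (j + 1) % n)]

def step(grid):
    """One synchronous round: score every cell against its neighbours,
    then let each cell imitate its best-scoring neighbour (or keep its
    own strategy if it scores best)."""
    n = len(grid)
    score = [[sum(PAYOFF[(grid[i][j], grid[x][y])]
                  for x, y in neighbours(i, j, n))
              for j in range(n)] for i in range(n)]
    new = [row[:] for row in grid]
    for i in range(n):
        for j in range(n):
            best = max(neighbours(i, j, n) + [(i, j)],
                       key=lambda p: score[p[0]][p[1]])
            new[i][j] = grid[best[0]][best[1]]
    return new

random.seed(0)
n = 20
grid = [[random.choice('CD') for _ in range(n)] for _ in range(n)]
for _ in range(30):
    grid = step(grid)
cooperator_ratio = sum(row.count('C') for row in grid) / n ** 2
```

Running this from several random initial grids and watching `cooperator_ratio` settle is the simplest analogue of the consistent-state observation in the abstract.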
Evaluating the hydrological consistency of evaporation products
Lopez Valencia, Oliver Miguel
2017-01-18
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
High-performance speech recognition using consistency modeling
Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth
1994-12-01
The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems where there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.
An energetically consistent vertical mixing parameterization in CCSM4
DEFF Research Database (Denmark)
Nielsen, Søren Borg; Jochum, Markus; Eden, Carsten
2018-01-01
An energetically consistent stratification-dependent vertical mixing parameterization is implemented in the Community Climate System Model 4 and forced with energy conversion from the barotropic tides to internal waves. The structures of the resulting dissipation and diffusivity fields are compared......, however, depends greatly on the details of the vertical mixing parameterizations, where the new energetically consistent parameterization results in low thermocline diffusivities and a sharper and shallower thermocline. It is also investigated if the ocean state is more sensitive to a change in forcing...
2010-07-01
... waste in the State may be deemed inconsistent. (c) If the State manifest system does not meet the requirements of this part, the State program shall be deemed inconsistent. [48 FR 14248, Apr. 1, 1983; 48 FR... facilities authorized to operate under the Federal or an approved State program shall be deemed inconsistent...
Consistency and refinement for Interval Markov Chains
DEFF Research Database (Denmark)
Delahaye, Benoit; Larsen, Kim Guldstrand; Legay, Axel
2012-01-01
Interval Markov Chains (IMC), or Markov Chains with probability intervals in the transition matrix, are the base of a classic specification theory for probabilistic systems [18]. The standard semantics of IMCs assigns to a specification the set of all Markov Chains that satisfy its interval...
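The local feasibility condition underlying IMC consistency (each row of intervals must admit at least one probability distribution) can be sketched as follows; this is only the per-row check, while full consistency algorithms also prune states iteratively:

```python
def row_consistent(intervals, tol=1e-12):
    """A row of intervals [(l_i, u_i)] admits a distribution p with
    l_i <= p_i <= u_i and sum(p) = 1 iff every interval is valid and
    sum(l_i) <= 1 <= sum(u_i)."""
    lo = sum(l for l, _ in intervals)
    hi = sum(u for _, u in intervals)
    return (all(0 <= l <= u <= 1 for l, u in intervals)
            and lo <= 1 + tol and hi >= 1 - tol)

def locally_consistent(imc):
    """Check the row condition for every state's outgoing intervals.
    (A simplified, pruning-free check; names are illustrative.)"""
    return all(row_consistent(row) for row in imc.values())

imc = {'s0': [(0.2, 0.6), (0.3, 0.9)],   # feasible: 0.5 <= 1 <= 1.5
       's1': [(0.0, 0.4), (0.1, 0.5)]}   # infeasible: max mass 0.9 < 1
```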
Exploring the Consistent behavior of Information Services
Directory of Open Access Journals (Sweden)
Kapidakis Sarantos
2016-01-01
Full Text Available Computer services are normally assumed to work well all the time. This usually holds for crucial services like bank electronic services, but not necessarily for others for which there is no commercial interest in their operation. In this work we examined the operation and the errors of information services and tried to find clues that will help predict the consistency of their behavior and the quality of the harvesting, which is made harder by transient conditions, the number of services and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant portion of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also always, or sometimes, fail, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
[Consistent Declarative Memory with Depressive Symptomatology].
Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez
2012-12-01
Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The objective was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students were evaluated, male and female, between 18 and 40 years old, distributed into two groups: with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) and a cut-off point of 20. There were no meaningful differences in free and voluntary recall between participants with and without depressive symptomatology, even though both groups had assigned a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the hypothesis of emotional consistency was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
Self consistent field theory of virus assembly
Li, Siyu; Orland, Henri; Zandi, Roya
2018-04-01
The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions under which GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work: first, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is essentially localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present results for the genome persistence length obtained through the tangent-tangent correlation length, and show that it is zero in the case of GSDA but equal to the inverse of the energy gap when using SCFT.
Consistency based correlations for tailings consolidation
Energy Technology Data Exchange (ETDEWEB)
Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering
2010-07-01
The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. The management of these containment facilities for several decades beyond mine closure poses unique challenges, resulting from the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to cause significant contaminant releases, raising public concern over the conventional practice of tailings disposal. Therefore, to minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency, which captures physicochemical interactions. The paper discussed the large-strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large-strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility, whereas the liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.
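The consistency descriptors named in the conclusion reduce to simple index formulas; the liquidity index is standard soil mechanics, while the normalization shown for void ratio is one plausible choice and not necessarily the paper's exact definition:

```python
def liquidity_index(w, pl, ll):
    """LI = (w - PL) / (LL - PL), with w = water content and PL, LL =
    plastic and liquid limits (all in %). LI > 1 indicates liquid
    behaviour, 0..1 plastic/paste-like, LI < 0 solid-like consistency."""
    return (w - pl) / (ll - pl)

def normalized_void_ratio(e, e_ref):
    """Hypothetical normalization of void ratio e by a reference value
    e_ref (e.g. the void ratio at the liquid limit), one simple way to
    collapse compressibility curves onto a common trend."""
    return e / e_ref
```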
Consistency between GRUAN sondes, LBLRTM and IASI
Directory of Open Access Journals (Sweden)
X. Calbet
2017-06-01
Full Text Available Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Instrument (IASI) measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in each of their fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Toward a consistent model for glass dissolution
International Nuclear Information System (INIS)
Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.
1994-01-01
Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least five countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs
Consistent data-driven computational mechanics
González, D.; Chinesta, F.; Cueto, E.
2018-05-01
We present a novel method, within the realm of data-driven computational mechanics, to obtain reliable and thermodynamically sound simulations from experimental data. We thus avoid the need to fit any phenomenological model in the construction of the simulation model. This kind of technique opens unprecedented possibilities in the framework of data-driven application systems and, particularly, in the paradigm of Industry 4.0.
Merging By Decentralized Eventual Consistency Algorithms
Directory of Open Access Journals (Sweden)
Ahmed-Nacer Mehdi
2015-12-01
Full Text Available Merging is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: to study the users' behavior during collaboration, to evaluate the quality of textual merge results produced by specific algorithms, and to propose a solution that improves the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identified when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.
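The line-based three-way merge at the heart of such version-control tooling can be sketched as follows. This is a minimal illustration under the simplifying assumption of equal-length, line-aligned versions; it is not one of the algorithms evaluated in the paper:

```python
def three_way_merge(base, ours, theirs):
    """Minimal line-based three-way merge: keep a change made on either
    side; lines changed concurrently on both sides become a conflict."""
    merged = []
    for b, o, t in zip(base, ours, theirs):
        if o == t:                # both sides agree (or both unchanged)
            merged.append(o)
        elif o == b:              # only 'theirs' changed this line
            merged.append(t)
        elif t == b:              # only 'ours' changed this line
            merged.append(o)
        else:                     # concurrent edits: conflict marker
            merged.append(f"<<< {o} ||| {t} >>>")
    return merged

# each side changed a different line, so the merge is conflict-free
result = three_way_merge(["a", "b", "c"], ["a", "B", "c"], ["a", "b", "C"])
```

Real merge tools additionally align the three versions with a diff algorithm before this element-wise comparison, which is where the quality differences studied in the paper arise.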
Application of consistent fluid added mass matrix to core seismic
International Nuclear Information System (INIS)
Koo, K. H.; Lee, J. H.
2003-01-01
In this paper, an algorithm for applying a consistent fluid added mass matrix, including the coupling terms, to core seismic analysis is developed and implemented in the SAC-CORE3.0 code. As an example, we assumed a 7-hexagon system of the LMR core and carried out the vibration modal analysis and the nonlinear time-history seismic response analysis using SAC-CORE3.0. The consistent fluid added mass matrix used is obtained with the finite element program FAMD (Fluid Added Mass and Damping). The results of the vibration modal analysis show that the core duct assemblies reveal strongly coupled vibration modes, which differ markedly from the in-air case. From the results of the time-history seismic analysis, it was verified that the effects of the coupling terms of the consistent fluid added mass matrix are significant in the impact responses and the dynamic responses
The Consistency Between Clinical and Electrophysiological Diagnoses
Directory of Open Access Journals (Sweden)
Esra E. Okuyucu
2009-09-01
Full Text Available OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurologists, 122 (12.8%) by orthopedists, 140 (14.6%) by neurosurgeons, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients' diagnoses were related to their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. There was no statistically significant difference in the relation between the referral diagnosis and the electrophysiological diagnosis according to the clinics where the requests were made (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG
Self-consistent meson mass spectrum
International Nuclear Information System (INIS)
Balazs, L.A.P.
1982-01-01
A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula α(t) = max(S_1+S_2, S_3+S_4) − 1/2 + 2α̂′[s_a + (1/2)(t − Σ_i m_i²)] for any given hadronic process 1+2→3+4, where S_i and m_i are the spins and masses of i = 1, 2, 3, 4, and √s_a is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector qq̄ states with q = u, d, s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u, d, c and q = u, d, b. Our only arbitrary parameters are m_ρ, m_K*, m_ψ, and m_Υ, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small m_π²/m_ρ² ratio arises quite naturally in the present scheme
Diagnosing a Strong-Fault Model by Conflict and Consistency
Directory of Open Access Journals (Sweden)
Wenfeng Zhang
2018-03-01
Full Text Available The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably-normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches efficiently offer the best candidates based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods are significantly better than best-first and conflict-directed A* search methods.
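The idea of strong-fault, consistency-based diagnosis can be illustrated on a toy example. The circuit, mode names and brute-force search below are hypothetical stand-ins; the paper's LTMS and CNF machinery is not reproduced:

```python
# Toy two-inverter chain: out = not(not(inp)). Each component is either
# 'ok' (inverts its input) or in the fault mode 'stuck1' (output always 1).
# Because the fault mode has its own defined behavior, this is a
# strong-fault model: a mode assignment is a diagnosis only if its
# predicted output is consistent with the observation.
def behavior(mode, x):
    return (1 - x) if mode == "ok" else 1

def predict(modes, inp):
    mid = behavior(modes[0], inp)      # output of the first inverter
    return behavior(modes[1], mid)     # output of the second inverter

def diagnoses(inp, observed):
    """All mode assignments consistent with the observed output."""
    return [(m1, m2)
            for m1 in ("ok", "stuck1") for m2 in ("ok", "stuck1")
            if predict((m1, m2), inp) == observed]

# inp=0: the all-ok chain predicts 0, but we observe 1 -> a conflict,
# and only assignments with the second inverter stuck remain consistent.
print(diagnoses(0, 1))
```

The paper's contribution is doing this kind of search efficiently on large CNF-encoded models via minimal conflicts and maximal consistencies, instead of the exhaustive enumeration shown here.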
Diagnosing a Strong-Fault Model by Conflict and Consistency.
Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan
2018-03-29
The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Currently, diagnosis methods usually employ conflicts to isolate possible faults, and the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably-normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. At the beginning, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). Then the proposed LTMS is employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches efficiently offer the best candidates based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain-the heat control unit of a spacecraft-where the proposed methods are significantly better than best-first and conflict-directed A* search methods.
REPFLO model evaluation, physical and numerical consistency
International Nuclear Information System (INIS)
Wilson, R.N.; Holland, D.H.
1978-11-01
This report contains a description of some suggested changes and an evaluation of the REPFLO computer code, which models ground-water flow and nuclear-waste migration in and about a nuclear-waste repository. The discussion contained in the main body of the report is supplemented by a flow chart, presented in the Appendix of this report. The suggested changes are of four kinds: (1) technical changes to make the code compatible with a wider variety of digital computer systems; (2) changes to fill gaps in the computer code, due to missing proprietary subroutines; (3) changes to (a) correct programming errors, (b) correct logical flaws, and (c) remove unnecessary complexity; and (4) changes in the computer code logical structure to make REPFLO a more viable model from the physical point of view
Thermodynamically consistent data-driven computational mechanics
González, David; Chinesta, Francisco; Cueto, Elías
2018-05-01
In the paradigm of data-intensive science, automated, unsupervised discovery of governing equations for a given physical phenomenon has attracted a lot of attention in several branches of applied sciences. In this work, we propose a method able to avoid the identification of the constitutive equations of complex systems and instead work in a purely numerical manner by employing experimental data. In sharp contrast to most existing techniques, this method does not rely on the assumption of any particular form for the model (other than some fundamental restrictions placed by classical physics, such as the second law of thermodynamics), nor does it force the algorithm to choose, among a predefined set of operators, those whose predictions best fit the available data. Instead, the method is able to identify both the Hamiltonian (conservative) and dissipative parts of the dynamics while satisfying fundamental laws such as energy conservation or positive production of entropy. The proposed method is tested against some examples of discrete as well as continuum mechanics, whose accurate results demonstrate the validity of the proposed approach.
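The conservative/dissipative split can be sketched for a linear system. This is a minimal illustration under my own assumptions (a damped oscillator, a plain least-squares fit, and a skew/symmetric decomposition of the fitted operator), not the authors' structure-preserving algorithm:

```python
import numpy as np

# Generate trajectory snapshots of a damped harmonic oscillator
# dz/dt = A_true z, then recover A from the data alone and split it into
# a conservative (skew-symmetric) and a dissipative (symmetric) part.
A_true = np.array([[0.0, 1.0], [-1.0, -0.2]])  # rotation + weak damping
dt, n = 0.01, 2000
Z = np.zeros((2, n))
Z[:, 0] = [1.0, 0.0]
for k in range(n - 1):                          # explicit Euler trajectory
    Z[:, k + 1] = Z[:, k] + dt * A_true @ Z[:, k]

dZ = (Z[:, 1:] - Z[:, :-1]) / dt                # finite-difference velocities
A_fit = dZ @ np.linalg.pinv(Z[:, :-1])          # least-squares fit of dz/dt = A z

L = 0.5 * (A_fit - A_fit.T)  # skew part: energy-preserving (Hamiltonian-like)
M = 0.5 * (A_fit + A_fit.T)  # symmetric part: carries all the dissipation
```

For this noiseless linear data the fit recovers A_true essentially exactly, and the decomposition isolates the damping in M; the paper generalizes this idea to nonlinear systems with thermodynamic constraints enforced during the identification.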
Process variables consistency at Atucha I NPP
International Nuclear Information System (INIS)
Arostegui, E.; Aparicio, M.; Herzovich, P.; Wenzel, J.; Urrutia, G.
1996-01-01
A method to evaluate the performance of the different systems has been developed and is still under assessment. To perform this job, a process computer upgraded in 1992 was used. Taking into account that the resolution and stability of the instrumentation are higher than its accuracy, process data were corrected by software. In this way, much time spent on recalibration, as well as human errors, was avoided. Besides, this method allowed a better record of instrumentation performance and an early detection of instrument failures. On the other hand, process modelization, mainly heat and material balances, has also been used to check that sensors, transducers, analog-to-digital converters and computer software are working properly. Some of these process equations have been introduced into the computer codes, so in some cases it is possible to have an ''on line'' analysis of process variables and process instrumentation behaviour. Examples of process analysis are: heat exchangers, i.e. the power calculated using shell-side temperatures is compared with the tube-side values; turbine performance is compared with condenser water temperature; power measured on the secondary side (one-minute average measurements, optimized to eliminate process noise) is compared with power obtained from primary-side data; the calibration of temperatures has been made by direct measurement of redundant sensors, which has shown to be the best method; pressure and differential pressure transducers are cross-checked in service when possible. In the present paper, details of the examples mentioned above and of other ones are given and discussed. (author). 2 refs, 1 fig., 1 tab
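The heat-exchanger cross-check described above can be sketched numerically. All flow rates and temperatures below are hypothetical placeholders, not plant data:

```python
# Cross-check a heat exchanger by computing the duty Q = m * cp * dT
# independently on the shell side and on the tube side; a mismatch beyond
# the instrumentation accuracy flags a sensor or transducer problem.
def duty_kw(mass_flow_kg_s, cp_kj_kg_k, t_in_c, t_out_c):
    return mass_flow_kg_s * cp_kj_kg_k * abs(t_out_c - t_in_c)

shell_q = duty_kw(120.0, 4.18, 60.0, 45.0)   # cooling-water (shell) side
tube_q = duty_kw(90.0, 4.18, 30.0, 50.0)     # process (tube) side
imbalance = abs(shell_q - tube_q) / shell_q  # relative discrepancy
print(f"shell {shell_q:.0f} kW, tube {tube_q:.0f} kW, imbalance {imbalance:.1%}")
```

With these numbers both sides give the same duty, so the balance closes; in practice a tolerance derived from the instrument accuracies would decide whether the imbalance is acceptable.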
Process variables consistency at Atucha I NPP
Energy Technology Data Exchange (ETDEWEB)
Arostegui, E; Aparicio, M; Herzovich, P; Wenzel, J [Central Nuclear Atucha I, Nucleoelectrica S.A., Lima, Buenos Aires (Argentina); Urrutia, G [Comision Nacional de Energia Atomica, Buenos Aires (Argentina)
1997-12-31
A method to evaluate the performance of the different systems has been developed and is still under assessment. To perform this job, a process computer upgraded in 1992 was used. Taking into account that the resolution and stability of the instrumentation are higher than its accuracy, process data were corrected by software. In this way, much time spent on recalibration, as well as human errors, was avoided. Besides, this method allowed a better record of instrumentation performance and an early detection of instrument failures. On the other hand, process modelization, mainly heat and material balances, has also been used to check that sensors, transducers, analog-to-digital converters and computer software are working properly. Some of these process equations have been introduced into the computer codes, so in some cases it is possible to have an ``on line`` analysis of process variables and process instrumentation behaviour. Examples of process analysis are: heat exchangers, i.e. the power calculated using shell-side temperatures is compared with the tube-side values; turbine performance is compared with condenser water temperature; power measured on the secondary side (one-minute average measurements, optimized to eliminate process noise) is compared with power obtained from primary-side data; the calibration of temperatures has been made by direct measurement of redundant sensors, which has shown to be the best method; pressure and differential pressure transducers are cross-checked in service when possible. In the present paper, details of the examples mentioned above and of other ones are given and discussed. (author). 2 refs, 1 fig., 1 tab.
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Assessing atmospheric bias correction for dynamical consistency using potential vorticity
International Nuclear Information System (INIS)
Rocheta, Eytan; Sharma, Ashish; Evans, Jason P
2014-01-01
Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases, as dynamical consistency between the 'corrected' fields is not maintained. Use of these bias-corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency, the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques: an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are reduced when using the proposed alternative. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications. (letter)
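The per-variable correction that the letter critiques is typically a univariate quantile mapping. A minimal sketch of that step is below (synthetic Gaussian "model" and "observed" samples, my own toy numbers); by construction it adjusts each variable's distribution in isolation and therefore cannot preserve intervariable dynamics such as PV:

```python
import numpy as np

def quantile_map(model, obs, values):
    """Map 'values' from the model's distribution onto the observed one
    via the empirical CDF (univariate quantile mapping)."""
    q = np.searchsorted(np.sort(model), values) / len(model)
    return np.quantile(obs, np.clip(q, 0.0, 1.0))

rng = np.random.default_rng(0)
obs = rng.normal(0.0, 1.0, 10000)      # "observed" reference sample
model = rng.normal(0.5, 1.5, 10000)    # model sample with mean and spread bias
corrected = quantile_map(model, obs, model)
# the corrected sample now matches the reference mean and spread
```

Applying such a mapping independently to temperature, humidity and winds is exactly what breaks the dynamical coherence measured through PV in the letter.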
Criteria for the generation of spectra consistent time histories
International Nuclear Information System (INIS)
Lin, C.-W.
1977-01-01
Several methods are available for conducting seismic analysis of nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs which involve structural or material nonlinearities, such as frequency-dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: 1) soil-structure interaction studies and 2) a coupled reactor coolant system and building analysis, used either to generate the floor response spectra or to compute the nonlinear system time history response. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported on the effect of the dynamic characteristics of the time history input upon the system response. In fact, the only available discussion in this respect concerns the statistical independence of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories which are conservative and, more importantly, realistic. (Auth.)
Surfactant modified clays’ consistency limits and contact angles
Directory of Open Access Journals (Sweden)
S Akbulut
2012-07-01
Full Text Available This study was aimed at preparing a surfactant modified clay (SMC) and researching the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants to modify its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethylammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when the zwitterionic, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from CH (high plasticity clay) to MH (high plasticity silt) class soils, according to the Unified Soil Classification System (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora. PMID:28270790
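The headline comparison against chance can be reproduced arithmetically. This is a back-of-the-envelope one-sample proportion z-test on the reported figures, not the paper's logistic-regression analysis:

```python
from math import sqrt

# Is the reported 41% overall left-cheek bias among 200 participants
# greater than the 1/3 expected if the three orientations (left, midline,
# right) were equally likely? Normal-approximation z-test.
n, p0, p_hat = 200, 1 / 3, 0.41
z = (p_hat - p0) / sqrt(p0 * (1 - p0) / n)
print(f"z = {z:.2f}")   # z above 1.96 exceeds chance at the 5% level
```

The resulting z of about 2.3 is consistent with the paper's claim that the left-cheek preponderance exceeds chance.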
Feeling Expression Using Avatars and Its Consistency for Subjective Annotation
Ito, Fuyuko; Sasaki, Yasunari; Hiroyasu, Tomoyuki; Miki, Mitsunori
Consumer Generated Media (CGM) is growing rapidly and the amount of content is increasing. However, it is often difficult for users to extract important contents, and the existence of contents recording their experiences can easily be forgotten. As there are no methods or systems to indicate the subjective value of the contents or ways to reuse them, subjective annotation appending subjectivity, such as feelings and intentions, to contents is needed. Representation of subjectivity depends not only on verbal expression, but also on nonverbal expression. Linguistically expressed annotation, typified by collaborative tagging in social bookmarking systems, has come into widespread use, but there is no system of nonverbally expressed annotation on the web. We propose the utilization of controllable avatars as a means of nonverbal expression of subjectivity, and confirmed the consistency of feelings elicited by avatars over time for an individual and in a group. In addition, we compared the expressiveness and ease of subjective annotation between collaborative tagging and controllable avatars. The results indicate that the feelings evoked by avatars are consistent in both cases, and that using controllable avatars is easier than collaborative tagging for representing feelings elicited by contents that do not express meaning, such as photos.
Self-consistent potential variations in magnetic wells
International Nuclear Information System (INIS)
Kesner, J.; Knorr, G.; Nicholson, D.R.
1981-01-01
Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)
Consistent treatment of one-body dynamics and collective fluctuations
International Nuclear Information System (INIS)
Pfitzner, A.
1986-09-01
We show how the residual coupling δV between collective and intrinsic motion induces correlations, which lead to fluctuations of the collective variables and to a redistribution of the single-particle occupation numbers ρ_α. The evolution of ρ_α and of the collective fluctuations is consistently described by a coupled system of equations, which accounts for the dependence of the transport coefficients on ρ_α, and for the dependence of the transition rates in the master equation on the collective variances. (author)
Sensor and control for consistent seed drill coulter depth
DEFF Research Database (Denmark)
Kirkegaard Nielsen, Søren; Nørremark, Michael; Green, Ole
2016-01-01
The consistent depth placement of seeds is vital for achieving the optimum yield of agricultural crops. In state-of-the-art seeding machines, the depth of drill coulters will vary with changes in soil resistance. This paper presents the retrofitting of an angle sensor to the pivoting point...... by a sub-millimetre accurate positioning system (iGPS, Nikon Metrology NV, Belgium) mounted on the drill coulter. At a drill coulter depth of 55 mm and controlled by an ordinary fixed spring loaded down force only, the change in soil resistance decreased the mean depth by 23 mm. By dynamically controlling...
Consistency checks in beam emission modeling for neutral beam injectors
International Nuclear Information System (INIS)
Punyapu, Bharathi; Vattipalle, Prahlad; Sharma, Sanjeev Kumar; Baruah, Ujjwal Kumar; Crowley, Brendan
2015-01-01
In positive neutral beam systems, the beam parameters, such as ion species fractions, power fractions and beam divergence, are routinely measured using the Doppler-shifted beam emission spectrum. The accuracy with which these parameters are estimated depends on the accuracy of the atomic modeling involved in the estimations. In this work, an effective procedure to check the consistency of the beam emission modeling in neutral beam injectors is proposed. As a first consistency check, at constant beam voltage and current, the intensity of the beam emission spectrum is measured while varying the pressure in the neutralizer, and the scaling with pressure of the measured un-shifted (target) and Doppler-shifted (projectile) intensities of the beam emission spectrum is studied. If the un-shifted component scales with pressure, then the intensity of this component is used as a second consistency check on the beam emission modeling. As a further check, the modeled beam fractions and the emission cross sections of projectile and target are used to predict the intensity of the un-shifted component, which is then compared with the measured target intensity. The agreement between the predicted and measured target intensities provides the degree of discrepancy in the beam emission modeling. In order to test this methodology, a systematic analysis of Doppler shift spectroscopy data obtained on the JET neutral beam test stand was carried out
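The first consistency check, linear scaling of emission intensity with neutralizer pressure, can be sketched as a simple regression. The pressure and intensity values below are hypothetical placeholders, not JET data:

```python
import numpy as np

# At fixed beam voltage and current, check whether the measured line
# intensity scales linearly with neutralizer pressure by fitting a line
# and inspecting the residuals against a crude tolerance.
pressure = np.array([0.5, 1.0, 1.5, 2.0, 2.5])       # arbitrary units
intensity = np.array([0.98, 2.05, 3.01, 3.95, 5.1])  # measured intensity (a.u.)
slope, offset = np.polyfit(pressure, intensity, 1)   # least-squares line
residual = intensity - (slope * pressure + offset)
linear = bool(np.max(np.abs(residual)) < 0.1)        # linearity criterion
```

In the actual procedure, passing this check licenses the use of the un-shifted component for the subsequent model comparisons; the tolerance would be set by the measurement uncertainties rather than the fixed 0.1 used here.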
Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control
Directory of Open Access Journals (Sweden)
Y.A. Ahmed
2015-09-01
Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution output. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After achieving acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge the network's real-time response for the Esso Osaka 3-m model ship. The network's behaviour during such experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.
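The two-network controller structure can be sketched as follows. The layer sizes, state vector and weights here are arbitrary placeholders (randomly initialized, untrained), shown only to illustrate the architecture of two separate feed-forward networks producing rudder and propeller commands:

```python
import numpy as np

def mlp(weights, biases, x):
    """Forward pass of a small feed-forward network with tanh activations;
    the bounded output maps naturally to a normalized command."""
    for w, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(w @ x + b)                        # hidden layers
    return np.tanh(weights[-1] @ x + biases[-1])      # bounded output

rng = np.random.default_rng(0)
state = np.array([0.5, -0.1, 0.02, 1.0])   # e.g. position x, y, heading, speed

def random_net(sizes):
    ws = [rng.standard_normal((m, n)) for n, m in zip(sizes, sizes[1:])]
    bs = [rng.standard_normal(m) for m in sizes[1:]]
    return ws, bs

rudder_net = random_net([4, 8, 1])         # state -> rudder command
prop_net = random_net([4, 8, 1])           # state -> propeller command
rudder_cmd = mlp(*rudder_net, state)       # in (-1, 1), scaled to degrees
prop_cmd = mlp(*prop_net, state)           # in (-1, 1), scaled to rpm
```

Training both networks on the same consistently generated manoeuvre data is the paper's key point; the forward pass above only fixes the input/output shape of that scheme.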
CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2013-04-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits the expressive power of ERGMs. These results are actually special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
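The projectivity failure can be seen numerically on the smallest non-trivial case. The sketch below (my own toy parameters, not from the paper) takes an edge-plus-triangle ERGM on 3 nodes and shows that the marginal edge probability in a 2-node sub-network disagrees with the 2-node ERGM sharing the same edge parameter:

```python
from itertools import combinations
from math import exp

# Edge-plus-triangle ERGM on 3 nodes: P(G) proportional to
# exp(theta_edge * #edges + theta_tri * #triangles).
theta_edge, theta_tri = 0.3, 0.5
pairs = list(combinations(range(3), 2))      # the 3 possible edges

def weight(edges):
    tri = 1 if len(edges) == 3 else 0        # only the full triangle closes
    return exp(theta_edge * len(edges) + theta_tri * tri)

graphs = [frozenset(s) for r in range(4) for s in combinations(pairs, r)]
Z = sum(weight(g) for g in graphs)           # normalizing constant

# marginal probability that edge {0,1} is present in the 3-node model
p_marginal = sum(weight(g) for g in graphs if (0, 1) in g) / Z
# the 2-node ERGM with the same edge parameter is a Bernoulli edge
p_two_node = exp(theta_edge) / (1 + exp(theta_edge))
print(p_marginal, p_two_node)   # the two probabilities differ
```

Because the triangle term couples edges across the removed node, marginalization does not recover the smaller model with the same parameters, which is exactly the non-projectivity the paper formalizes (a pure edge model, with theta_tri = 0, would agree exactly).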
G-Consistent Subsets and Reduced Dynamical Quantum Maps
Ceballos, Russell R.
A quantum system which evolves in time while interacting with an external environment is said to be an open quantum system (OQS), and the influence of the environment on the unperturbed unitary evolution of the system generally leads to non-unitary dynamics. This kind of open-system dynamical evolution has typically been modeled by a Standard Prescription (SP) which assumes that the state of the OQS is initially uncorrelated with the environment state. It is here shown that when a minimal set of physically motivated assumptions is adopted, not only do there exist constraints on the reduced dynamics of an OQS such that this SP does not always accurately describe the possible initial correlations existing between the OQS and environment, but such initial correlations, and even entanglement, can be witnessed when a particular class of reduced state transformations, termed purity extractions, is observed. Furthermore, as part of a more fundamental investigation to better understand the minimal set of assumptions required to formulate well-defined reduced dynamical quantum maps, it is demonstrated that there exists a one-to-one correspondence between the set of initial reduced states and the set of admissible initial system-environment composite states when G-consistency is enforced. Given the discussions surrounding the requirement of complete positivity and the reliance on the SP, the results presented here may well be found valuable for determining the basic properties of reduced dynamical maps, and when restrictions on the OQS dynamics naturally emerge.
Self-consistent chaos in the beam-plasma instability
International Nuclear Information System (INIS)
Tennyson, J.L.; Meiss, J.D.
1993-01-01
The effect of self-consistency on Hamiltonian systems with a large number of degrees of freedom is investigated for the beam-plasma instability using the single-wave model of O'Neil, Winfrey, and Malmberg. The single-wave model is reviewed and then rederived within the Hamiltonian context, which leads naturally to canonical action-angle variables. Simulations are performed with a large (10^4) number of beam particles interacting with the single wave. It is observed that the system relaxes into a time-asymptotic periodic state where only a few collective degrees are active; namely, a clump of trapped particles oscillating in a modulated wave, within a uniform chaotic sea with oscillating phase-space boundaries. Thus self-consistency is seen to effectively reduce the number of degrees of freedom. A simple low degree-of-freedom model is derived that treats the clump as a single macroparticle interacting with the wave and chaotic sea. The uniform chaotic sea is modeled by a fluid waterbag, where the waterbag boundaries correspond approximately to invariant tori. This low degree-of-freedom model is seen to compare well with the simulation
48 CFR 52.230-3 - Disclosure and Consistency of Cost Accounting Practices.
2010-10-01
... Text of Provisions and Clauses 52.230-3 Disclosure and Consistency of Cost Accounting Practices. As prescribed in 30.201-4(b)(1), insert the following clause: Disclosure and Consistency of Cost Accounting... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Disclosure and Consistency...
49 CFR 385.311 - What will the safety audit consist of?
2010-10-01
... SAFETY FITNESS PROCEDURES New Entrant Safety Assurance Program § 385.311 What will the safety audit consist of? The safety audit will consist of a review of the new entrant's safety management systems and a... 49 Transportation 5 2010-10-01 2010-10-01 false What will the safety audit consist of? 385.311...
Bootstrap embedding: An internally consistent fragment-based method
Energy Technology Data Exchange (ETDEWEB)
Welborn, Matthew; Tsuchimochi, Takashi; Van Voorhis, Troy [Department of Chemistry, Massachusetts Institute of Technology, 77 Massachusetts Avenue, Cambridge, Massachusetts 02139 (United States)
2016-08-21
Strong correlation poses a difficult problem for electronic structure theory, with computational cost scaling quickly with system size. Fragment embedding is an attractive approach to this problem. By dividing a large complicated system into smaller manageable fragments “embedded” in an approximate description of the rest of the system, we can hope to ameliorate the steep cost of correlated calculations. While appealing, these methods often converge slowly with fragment size because of small errors at the boundary between fragment and bath. We describe a new electronic embedding method, dubbed “Bootstrap Embedding,” a self-consistent wavefunction-in-wavefunction embedding theory that uses overlapping fragments to improve the description of fragment edges. We apply this method to the one dimensional Hubbard model and a translationally asymmetric variant, and find that it performs very well for energies and populations. We find Bootstrap Embedding converges rapidly with embedded fragment size, overcoming the surface-area-to-volume-ratio error typical of many embedding methods. We anticipate that this method may lead to a low-scaling, high accuracy treatment of electron correlation in large molecular systems.
Efficient self-consistency for magnetic tight binding
Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.
2011-06-01
Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO.
Program summary
Program title: PLATO
Catalogue identifier: AEFC_v2_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
No. of lines in distributed program, including test data, etc.: 228 747
No. of bytes in distributed program, including test data, etc.: 1 880 369
Distribution format: tar.gz
Programming language: C and PERL
Computer: Apple Macintosh, PC, Unix machines
Operating system: Unix, Linux, Mac OS X, Windows XP
Has the code been vectorised or parallelised?: Yes. Up to 256 processors tested
RAM: Up to 2 Gbytes per processor
Classification: 7.3
External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW
Catalogue identifier of previous version: AEFC_v1_0
Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616
Does the new version supersede the previous version?: Yes
Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very
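The gain from treating self-consistency as a stationary-point search solved by Newton-Raphson can be illustrated on a scalar toy problem; the map `F` below is purely illustrative, not the Harris-Foulkes functional of the paper.

```python
import math

def solve_fixed_point(F, dF, q0, tol=1e-10, max_iter=100):
    """Newton-Raphson for the self-consistency condition F(q) = q,
    i.e. a root of g(q) = F(q) - q, using g'(q) = F'(q) - 1."""
    q = q0
    for n in range(max_iter):
        g = F(q) - q
        if abs(g) < tol:
            return q, n
        q -= g / (dF(q) - 1.0)   # Newton step on g
    raise RuntimeError("did not converge")

# Toy "charge self-consistency" map (illustrative only):
F = lambda q: math.tanh(2.0 * (1.0 - q))
dF = lambda q: -2.0 / math.cosh(2.0 * (1.0 - q)) ** 2

q_star, iters = solve_fixed_point(F, dF, q0=0.5)
```

Plain linear mixing on the same map oscillates and converges slowly, while the Newton iteration reaches the fixed point in a handful of steps, which is the qualitative behaviour the abstract reports.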
Self-consistent approach to the eletronic problem in disordered solids
International Nuclear Information System (INIS)
Taguena-Martinez, J.; Barrio, R.A.; Martinez, E.; Yndurain, F.
1984-01-01
A simple formalism is developed which allows us to perform a self-consistent, non-parametrized calculation in a non-periodic system by finding the thermodynamically averaged Green's function of a cluster Bethe-lattice system. (Author) [pt
Consistent realization of Celestial and Terrestrial Reference Frames
Kwak, Younghee; Bloßfeld, Mathis; Schmid, Ralf; Angermann, Detlef; Gerstl, Michael; Seitz, Manuela
2018-03-01
The Celestial Reference System (CRS) is currently realized only by Very Long Baseline Interferometry (VLBI) because it is the space geodetic technique that enables observations in that frame. In contrast, the Terrestrial Reference System (TRS) is realized by means of the combination of four space geodetic techniques: Global Navigation Satellite System (GNSS), VLBI, Satellite Laser Ranging (SLR), and Doppler Orbitography and Radiopositioning Integrated by Satellite. The Earth orientation parameters (EOP) are the link between the two types of systems, CRS and TRS. The EOP series of the International Earth Rotation and Reference Systems Service were combined from specifically selected series of various analysis centers. Other EOP series were generated by a simultaneous estimation together with the TRF while the CRF was fixed. These computation approaches entail inherent inconsistencies between TRF, EOP, and CRF, also because the input data sets differ. A combined normal equation (NEQ) system, which comprises all the parameters, i.e., TRF, EOP, and CRF, would overcome such an inconsistency. In this paper, we simultaneously estimate TRF, EOP, and CRF from an inter-technique combined NEQ using the latest GNSS, VLBI, and SLR data (2005-2015). The results show that the selection of local ties is most critical to the TRF. The combination of pole coordinates is beneficial for the CRF, whereas the combination of ΔUT1 results in clear rotations of the estimated CRF. However, the standard deviations of the EOP and the CRF improve through the inter-technique combination, which indicates the benefits of a common estimation of all parameters. It became evident that the common determination of TRF, EOP, and CRF systematically influences future ICRF computations at the level of several μas. Moreover, the CRF is influenced by up to 50 μas if the station coordinates and EOP are dominated by the satellite techniques.
Consistency in performance evaluation reports and medical records.
Lu, Mingshan; Ma, Ching-to Albert
2002-12-01
In the health care market, managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information. So clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitation for information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts, taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence for inconsistencies. We find equally strong evidence for consistency in reports of admission and discharge employment status, and mixed evidence on report consistency on termination status. Patterns of
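One of the agreement statistics named in this abstract, Cohen's kappa, is easy to sketch; the paired ratings below are invented for illustration and are not the MATS figures.

```python
def cohens_kappa(a, b):
    """Cohen's kappa for two paired categorical rating lists:
    observed agreement corrected for chance agreement."""
    assert len(a) == len(b) and a
    n = len(a)
    cats = set(a) | set(b)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    p_exp = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical binary reports (1 = employed) from two data sources:
admin  = [1, 1, 0, 1, 0, 1, 1, 0, 0, 1]   # administrative data set
record = [1, 1, 0, 1, 0, 1, 0, 0, 1, 1]   # medical-record abstract
kappa = cohens_kappa(admin, record)        # between -1 and 1; 1 = perfect
```

A kappa near 1 would support report consistency for a category; values near 0 indicate agreement no better than chance, the kind of pattern the study reads as an inconsistency.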
Self-consistent modeling of electron cyclotron resonance ion sources
International Nuclear Information System (INIS)
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.
2004-01-01
In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to model accurately the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at the extraction influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally
Self-consistent modeling of electron cyclotron resonance ion sources
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lécot, C.
2004-05-01
In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to model accurately the different parts of these sources: (i) magnetic configuration; (ii) plasma characteristics; (iii) extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three dimensions (to take into account the shape of the plasma at the extraction influenced by the hexapole). However, the characteristics of the plasma are not always mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether a biased probe is installed or not. These input parameters are used to feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
Self-consistent electron transport in collisional plasmas
International Nuclear Information System (INIS)
Mason, R.J.
1982-01-01
A self-consistent scheme has been developed to model electron transport in evolving plasmas of arbitrary classical collisionality. The electrons and ions are treated as either multiple donor-cell fluids, or collisional particles-in-cell. Particle suprathermal electrons scatter off ions, and drag against fluid background thermal electrons. The background electrons undergo ion friction, thermal coupling, and bremsstrahlung. The components move in self-consistent advanced E-fields, obtained by the Implicit Moment Method, which permits Δt ≫ ω_p^-1 and Δx ≫ λ_D, offering a 10^2-10^3-fold speed-up over older explicit techniques. The fluid description for the background plasma components permits the modeling of transport in systems spanning more than a 10^7-fold change in density, and encompassing contiguous collisional and collisionless regions. Results are presented from application of the scheme to the modeling of CO2 laser-generated suprathermal electron transport in expanding thin foils, and in multi-foil target configurations
Multiplicative Consistency for Interval Valued Reciprocal Preference Relations
Wu, Jian; Chiclana, Francisco
2014-01-01
The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...
ER=EPR, GHZ, and the consistency of quantum measurements
International Nuclear Information System (INIS)
Susskind, Leonard
2016-01-01
This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern. The last part of the paper is about entanglement as a resource for quantum communication. ER=EPR provides a way to visualize protocols like quantum teleportation. In some sense teleportation takes place through the wormhole, but as usual, classical communication is necessary to complete the protocol. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Self-consistent expansion for the molecular beam epitaxy equation.
Katzav, Eytan
2002-03-01
Motivated by a controversy over the correct results derived from the dynamic renormalization group (DRG) analysis of the nonlinear molecular beam epitaxy (MBE) equation, a self-consistent expansion for the nonlinear MBE theory is considered. The scaling exponents are obtained for spatially correlated noise of the general form D(r-r', t-t') = 2D_0 |r-r'|^(2ρ-d) δ(t-t'). I find a lower critical dimension d_c(ρ) = 4 + 2ρ, above which the linear MBE solution appears. Below the lower critical dimension a ρ-dependent strong-coupling solution is found. These results help to resolve the controversy over the correct exponents that describe nonlinear MBE, using a reliable method that proved itself in the past by giving reasonable results for the strong-coupling regime of the Kardar-Parisi-Zhang system (for d>1), where DRG failed to do so.
Back to the Future: Consistency-Based Trajectory Tracking
Kurien, James; Nayak, P. Pandurang; Norvig, Peter (Technical Monitor)
2000-01-01
Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
Consistency of the tachyon warm inflationary universe models
International Nuclear Information System (INIS)
Zhang, Xiao-Min; Zhu, Jian-Yang
2014-01-01
This study concerns the consistency of the tachyon warm inflationary models. A linear stability analysis is performed to find the slow-roll conditions, characterized by the potential slow-roll (PSR) parameters, for the existence of a tachyon warm inflationary attractor in the system. The PSR parameters in the tachyon warm inflationary models are redefined. Two cases, an exponential potential and an inverse power-law potential, are studied, when the dissipative coefficient Γ = Γ_0 and Γ = Γ(φ), respectively. A crucial condition is obtained for a tachyon warm inflationary model characterized by the Hubble slow-roll (HSR) parameter ε_H, and the condition is extendable to some other inflationary models as well. A proper number of e-folds is obtained in both cases of the tachyon warm inflation, in contrast to existing works. It is also found that a constant dissipative coefficient (Γ = Γ_0) is usually not a suitable assumption for a warm inflationary model
Self-consistent simulation of the CSR effect
International Nuclear Information System (INIS)
Li, R.; Bohn, C.L.; Bisognano, J.J.
1998-01-01
When a microbunch with high charge traverses a curved trajectory, the curvature-induced bunch self-interaction, by way of coherent synchrotron radiation (CSR) and space-charge forces, may cause serious emittance degradation. In this paper, the authors present a self-consistent simulation for the study of the impact of CSR on beam optics. The dynamics of the bunch under the influence of the CSR forces is simulated using macroparticles, where the CSR force in turn depends on the history of bunch dynamics in accordance with causality. The simulation is benchmarked with analytical results obtained for a rigid-line bunch. Here they present the algorithm used in the simulation, along with the simulation results obtained for bending systems in the Jefferson Lab (JLab) free-electron-laser (FEL) lattice
ER=EPR, GHZ, and the consistency of quantum measurements
Energy Technology Data Exchange (ETDEWEB)
Susskind, Leonard [Stanford Institute for Theoretical Physics and Department of Physics, Stanford University, Stanford, CA (United States)
2016-01-15
This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern. The last part of the paper is about entanglement as a resource for quantum communication. ER=EPR provides a way to visualize protocols like quantum teleportation. In some sense teleportation takes place through the wormhole, but as usual, classical communication is necessary to complete the protocol. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Quench simulation of SMES consisting of some superconducting coils
International Nuclear Information System (INIS)
Noguchi, S.; Oga, Y.; Igarashi, H.
2011-01-01
A chain of quenches may be caused by a quench of one element coil when an SMES consists of many element coils. To avoid the chain of quenches, the energy stored in the element coil has to be quickly discharged. The cause of the chain of quenches is the short time constant of the decreasing current of the quenched coil. In recent years, many HTS superconducting magnetic energy storage (HTS-SMES) systems have been investigated and designed. They usually consist of several superconducting element coils because of the excessively high stored energy. If one of them quenches, the energy stored in the quenched superconducting element coil has to be immediately dispersed to protect the HTS-SMES system. As a result, the current of the other element coils, which have not quenched, increases, since the magnetic coupling between the quenched element coil and the others is excessively strong. The increase of the current may cause the quench of the other element coils. If the energy dispersion of the quenched element coil fails, the other superconducting element coils may quench in series. Therefore, it is necessary to investigate the behavior of the HTS-SMES after one or more element coils quench. To protect against a chain of quenches, it is also important to investigate the time constant of the coils. We have developed a simulation code to investigate the behavior of the HTS-SMES. The quench simulation indicates that a chain of quenches is caused by a quench of one element coil.
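The current rise in the still-superconducting coils follows from flux conservation: with zero resistance, L2·di2 + M·di1 = 0, so the coupled coil's current grows by (M/L2)·Δi1 as the quenched coil's current decays. A minimal two-coil analytic sketch (parameter values illustrative, not from any actual SMES design):

```python
import math

def coupled_dump(L1, L2, M, R_dump, i1_0, i2_0, t):
    """Analytic currents for:
        L1*di1/dt + M*di2/dt = -R_dump*i1   (quenched coil dumped on R_dump)
        L2*di2/dt + M*di1/dt = 0            (coupled coil, zero resistance)
    Eliminating di2/dt gives an exponential decay of i1 with the
    effective inductance L1 - M^2/L2; i2 follows from flux conservation."""
    L_eff = L1 - M * M / L2
    i1 = i1_0 * math.exp(-R_dump * t / L_eff)
    i2 = i2_0 + (M / L2) * (i1_0 - i1)   # flux in coil 2 stays constant
    return i1, i2
```

The faster the dump (larger R_dump, shorter time constant), the sooner the full current increment (M/L2)·i1_0 appears in the coupled coil, which is why strong magnetic coupling can push an otherwise healthy coil toward its own quench.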
Energy Technology Data Exchange (ETDEWEB)
Kojima, T; Tange, A; Matsuda, K [NHK Spring Co. Ltd., Yokohama (Japan)
1997-10-01
For the purpose of continuous running without any maintenance, new DPF (diesel particulate filter) systems, laminated with both metal-wire mesh and alumina-fiber mesh alternately, are under development. The perfect combustion of trapped diesel particulate can be achieved by a couple of resistance heating devices inserted into the filter. 5 refs., 7 figs., 3 tabs.
Are consistent equal-weight particle filters possible?
van Leeuwen, P. J.
2017-12-01
Particle filters are fully nonlinear data-assimilation methods that could potentially change the way we do data assimilation in highly nonlinear, high-dimensional geophysical systems. However, the standard particle filter, in which the observations come in by changing the relative weights of the particles, is degenerate. This means that one particle obtains weight one and all other particles obtain a very small weight, effectively meaning that the ensemble of particles reduces to that one particle. For over 10 years now, scientists have searched for solutions to this problem. One obvious solution seems to be localisation, in which each part of the state only sees a limited number of observations. However, for a realistic localisation radius based on physical arguments, the number of observations is typically too large, and the filter is still degenerate. Another route taken is trying to find proposal densities that lead to more similar particle weights. There is a simple proof, however, that shows that there is an optimum, the so-called optimal proposal density, and that optimum will lead to a degenerate filter. On the other hand, it is easy to come up with a counterexample of a particle filter that is not degenerate in high-dimensional systems. Furthermore, several particle filters have been developed recently that claim to have equal or equivalent weights. In this presentation I will show how to construct a particle filter that is never degenerate in high-dimensional systems, and how that is still consistent with the proof that one cannot do better than the optimal proposal density. Furthermore, it will be shown how equal- and equivalent-weights particle filters fit within this framework. This insight will then lead to new ways to generate particle filters that are non-degenerate, opening up the field of nonlinear filtering in high-dimensional systems.
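The degeneracy described above can be reproduced in a few lines: sampling particles from the prior and weighting them with an independent Gaussian likelihood, the largest normalized weight grows toward one as the state dimension increases. This is a toy sketch of the problem, not of the equal-weights schemes the presentation proposes.

```python
import math
import random

def max_normalized_weight(dim, n_particles=100, seed=0):
    """Largest normalized importance weight for prior-sampled particles
    under a dim-dimensional independent Gaussian likelihood."""
    rng = random.Random(seed)
    obs = [0.0] * dim                      # fixed observation at the origin
    log_w = []
    for _ in range(n_particles):
        x = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        log_w.append(-0.5 * sum((xi - yi) ** 2 for xi, yi in zip(x, obs)))
    m = max(log_w)                         # subtract max for numerical safety
    w = [math.exp(lw - m) for lw in log_w]
    return max(w) / sum(w)

print(max_normalized_weight(1))    # weights are spread across particles
print(max_normalized_weight(500))  # one particle dominates: degeneracy
```

Even though each particle is a perfectly plausible prior draw, in high dimension the log-weights spread over many units, so after exponentiation a single particle carries essentially all the weight.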
Consistency check of photon beam physical data after recommissioning process
International Nuclear Information System (INIS)
Kadman, B; Chawapun, N; Ua-apisitwong, S; Asakit, T; Chumpu, N; Rueansri, J
2016-01-01
In radiotherapy, the medical linear accelerator (linac) is the key system used for radiation treatment delivery. Although recommissioning is recommended after major modification of the machine by AAPM TG53, it might not be practical in radiotherapy centers with heavy workloads. The main purpose of this study was to compare photon beam physical data between the initial commissioning and the recommissioning of a 6 MV Elekta Precise linac. The parameters compared were the percentage depth dose (PDD) and beam profiles. The clinical commissioning test cases, following IAEA-TECDOC-1583, were planned on a REF 91230 IMRT Dose Verification Phantom with Philips' Pinnacle treatment planning system. The Delta 4PT was used for dose distribution verification with a 90% passing criterion for the gamma index (3%/3mm). Our results revealed that the PDDs and beam profiles agreed within the tolerance limit recommended by TRS430. Most of the point doses and dose distribution verifications passed the acceptance criteria. This study showed the consistency of photon beam physical data after the recommissioning process. There was good agreement between initial commissioning and recommissioning within the tolerance limit, demonstrating that the full recommissioning process might not be required. However, in complex treatment planning geometries, the initial data should be applied with great caution. (paper)
A new mixed self-consistent field procedure
Alvarez-Ibarra, A.; Köster, A. M.
2015-10-01
A new approach for the calculation of three-centre electronic repulsion integrals (ERIs) is developed, implemented and benchmarked in the framework of auxiliary density functional theory (ADFT). The so-called mixed self-consistent field (mixed SCF) divides the computationally costly ERIs into two sets: far-field and near-field. Far-field ERIs are calculated using the newly developed double asymptotic expansion, as in the direct SCF scheme. Near-field ERIs are calculated only once, prior to the SCF procedure, and stored in memory, as in the conventional SCF scheme. Hence the name, mixed SCF. The implementation is particularly powerful when used on parallel architectures, since all available RAM is used for near-field ERI storage. In addition, the efficient distribution algorithm performs minimal intercommunication operations between processors, avoiding a potential bottleneck. One-, two- and three-dimensional systems are used for benchmarking, showing substantial time reduction in the ERI calculation for all of them. A Born-Oppenheimer molecular dynamics calculation for the Na_55^+ cluster is also shown in order to demonstrate the speed-up achievable for small systems with the mixed SCF. Dedicated to Sourav Pal on the occasion of his 60th birthday.
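The near-field/far-field split can be sketched generically: pair terms inside a cutoff are computed once and cached before the iterative procedure, while far-field terms are evaluated on the fly in each iteration. The names and the 1/r kernel below are illustrative stand-ins, not the ADFT working equations.

```python
def build_near_field(points, cutoff):
    """Precompute and cache exact near-field pair kernels once,
    as in the conventional-SCF half of the mixed scheme."""
    near = {}
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            r = abs(points[i] - points[j])
            if r < cutoff:
                near[(i, j)] = 1.0 / r       # stored in memory once
    return near

def total_interaction(points, charges, near):
    """Per-iteration energy: cached near-field terms plus
    far-field terms recomputed on the fly (direct-SCF style)."""
    e = 0.0
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            if (i, j) in near:
                e += charges[i] * charges[j] * near[(i, j)]
            else:
                r = abs(points[i] - points[j])
                e += charges[i] * charges[j] / r   # far-field evaluation
    return e
```

In the real method the far-field branch uses a cheap double asymptotic expansion rather than the exact kernel; here both branches coincide so that the split can be checked against a direct pairwise sum.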
Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy
Konrad, Kai A.
1999-01-01
Incomplete information is a commitment device for time consistency problems. In the context of time-consistent labour income taxation, privacy reduces welfare losses and increases the effectiveness of public education as a second-best policy.
Personality and Situation Predictors of Consistent Eating Patterns
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.
2015-01-01
Introduction: A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied...
Two Impossibility Results on the Converse Consistency Principle in Bargaining
Youngsub Chun
1999-01-01
We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.
Personality consistency analysis in cloned quarantine dog candidates
Directory of Open Access Journals (Sweden)
Jin Choi
2017-01-01
In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups using the modified Puppy Aptitude Test, which consists of ten subtests to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7-10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned ones. This study implies that personality consistency could be one of the ways to analyse traits of puppies.
Checking Consistency of Pedigree Information is NP-complete
DEFF Research Database (Denmark)
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
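At a single autosomal locus, the local constraint being checked is simple: each of a child's two alleles must be inheritable from a different parent. The NP-completeness arises from propagating such constraints through a whole pedigree with untyped individuals; a single fully-typed trio, by contrast, can be checked directly. A hypothetical single-trio check:

```python
def trio_consistent(child, mother, father):
    """Mendelian consistency of an unordered child genotype (allele pair)
    given both parental genotypes: one allele must be obtainable from
    the mother and the other from the father (either assignment)."""
    a, b = child
    return ((a in mother and b in father) or
            (b in mother and a in father))

# ABO-style illustration: AB child from an AA mother and a BO father is fine,
assert trio_consistent(("A", "B"), ("A", "A"), ("B", "O"))
# but an OO child cannot come from an AA mother.
assert not trio_consistent(("O", "O"), ("A", "A"), ("B", "O"))
```

When some genotypes are unknown, each untyped individual becomes a variable ranging over possible genotypes, and the trio constraints must hold simultaneously across the pedigree, which is where the combinatorial hardness enters.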
26 CFR 1.338-8 - Asset and stock consistency.
2010-04-01
... that are controlled foreign corporations. (6) Stock consistency. This section limits the application of... 26 Internal Revenue 4 2010-04-01 2010-04-01 false Asset and stock consistency. 1.338-8 Section 1... (CONTINUED) INCOME TAXES Effects on Corporation § 1.338-8 Asset and stock consistency. (a) Introduction—(1...
Modeling self-consistent multi-class dynamic traffic flow
Cho, Hsun-Jung; Lo, Shih-Ching
2002-09-01
In this study, we present a systematic self-consistent multiclass multilane traffic model derived from the vehicular Boltzmann equation and the traffic dispersion model. The multilane domain is considered as a two-dimensional space and the interaction among vehicles in the domain is described by a dispersion model. The reason we consider a multilane domain as a two-dimensional space is that the driving behavior of road users may not be restricted by lanes, especially motorcyclists. The dispersion model, which is a nonlinear Poisson equation, is derived from car-following theory and the equilibrium assumption. Under the assumption that all kinds of users share the finite road section, the density is distributed over the road by the dispersion model. In addition, the dynamic evolution of the traffic flow is determined by the systematic gas-kinetic model derived from the Boltzmann equation. Multiplying the Boltzmann equation by the zeroth-, first- and second-order moment functions, integrating both sides of the equation and using the chain rule, we can derive the continuity, motion and variance equations, respectively. However, the second-order moment function employed in previous research, the square of the individual velocity, does not have a physical meaning in traffic flow. Although the second-order expansion results in the velocity variance equation, additional terms may be generated. The velocity variance equation we propose is derived by multiplying the Boltzmann equation by the individual velocity variance. It modifies the previous model and presents a new gas-kinetic traffic flow model. By coupling the gas-kinetic model and the dispersion model, a self-consistent system is presented.
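Schematically, the moment hierarchy described in the abstract takes the following form (notation assumed here, not taken from the paper: rho is density, V mean speed, Theta velocity variance; R_1 and R_2 abbreviate model-specific interaction and relaxation terms):

```latex
\begin{align}
  \partial_t \rho + \partial_x(\rho V) &= 0
     && \text{(zeroth moment: continuity)} \\
  \partial_t(\rho V) + \partial_x\!\left(\rho V^2 + \rho\Theta\right) &= R_1
     && \text{(first moment: motion)} \\
  \partial_t(\rho\Theta) + \partial_x(\rho V\Theta)
     + 2\,\rho\Theta\,\partial_x V &= R_2
     && \text{(second moment: velocity variance)}
\end{align}
```

The paper's contribution is to form the last equation by weighting with the individual velocity variance rather than the bare squared velocity, which removes the unphysical terms mentioned above.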
Kalra, Sukirti; Paul, Manash K; Balaram, Hemalatha; Mukhopadhyay, Anup Kumar
2007-05-01
The thiopurine antimetabolite 6-mercaptopurine (6MP) is an important chemotherapeutic drug in the conventional treatment of childhood acute lymphoblastic leukemia (ALL). 6MP is mainly catabolized by both hypoxanthine-guanine phosphoribosyltransferase (HGPRT) and xanthine oxidase (XOD) to form thioinosinic monophosphate (TIMP), the therapeutically active metabolite, and 6-thiouric acid (6TUA), the inactive metabolite, respectively. The activity of both enzymes varies among ALL patients, governing the active and inactive metabolite profile within the immature lymphocytes. Therefore, an attempt was made to study the kinetic nature of the branched bi-enzyme system acting on 6MP and to quantitate the TIMP and 6TUA formed when the two enzymes are present in equal and variable ratios. Quantification of the branched kinetics by a spectrophotometric method presents problems because of the closely apposed lambda(max) values of the substrates and products. Hence, an HPLC method was employed to quantify the products over time. The limit of quantification (LOQ) was found to be 10 nM for the substrate and 50 nM for the products. The limit of detection (LOD) was found to be 1 nM for both the substrate and the products. The method exhibited linearity in the range of 0.01-100 microM for 6MP and 0.05-100 microM for both 6TUA and TIMP. The amount of TIMP formed was higher than that of 6TUA in the bi-enzyme system when both enzymes were present in an equivalent enzymatic ratio. It was further found that enzymatic ratios play an important role in determining the amounts of TIMP and 6TUA. The method was further validated using an actively growing T-ALL cell line (Jurkat) to study the branched kinetics, wherein it was observed that treatment with 50 microM 6MP led to the generation of 12 microM TIMP and 0.8 microM 6TUA in 6 h at 37 degrees C.
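The competition the abstract describes can be sketched as two Michaelis-Menten branches draining a common substrate pool; with equal Km values the product split simply tracks the Vmax ratio, mirroring the finding that the enzymatic ratio determines the TIMP/6TUA amounts. All parameter values below are illustrative, not fitted constants from the study:

```python
def simulate_branch(s0, vmax1, km1, vmax2, km2, t_end, dt=0.01):
    """Euler integration of substrate S consumed by two competing
    Michaelis-Menten branches (branch 1: HGPRT -> TIMP,
    branch 2: XOD -> 6TUA).  Returns (S, TIMP, 6TUA) at t_end."""
    s, p1, p2 = s0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        v1 = vmax1 * s / (km1 + s)   # flux into TIMP
        v2 = vmax2 * s / (km2 + s)   # flux into 6TUA
        s = max(s - (v1 + v2) * dt, 0.0)
        p1 += v1 * dt
        p2 += v2 * dt
        t += dt
    return s, p1, p2
```

With equal Km values the instantaneous flux ratio v1/v2 equals vmax1/vmax2 at every substrate level, so the accumulated product ratio equals the enzyme-activity ratio throughout the run.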
Interactive Videodisc Design and Production, Workshop Guide. Volume 2
1983-12-01
is the three-dimensional rotating cube, using two channels. One face of the revolving cube can have any of the above effects in it while a second face ...simultaneously rotates into view. The third face coming into view has the original image while the fourth face has the second image. Other special... utterance might be, "four hundred men." The number of syllables is a factor in the length of the utterance and in its recognition accuracy. If all
Is Active Tectonics on Madagascar Consistent with Somalian Plate Kinematics?
Stamps, D. S.; Kreemer, C.; Rajaonarison, T. A.
2017-12-01
The East African Rift System (EARS) actively breaks apart the Nubian and Somalian tectonic plates. Madagascar finds itself at the easternmost boundary of the EARS, between the Rovuma block, Lwandle plate, and the Somalian plate. Earthquake focal mechanisms and N-S oriented fault structures on the continental island suggest that Madagascar is experiencing east-west oriented extension. However, some previous plate kinematic studies indicate minor compressional strains across Madagascar. This inconsistency may be due to uncertainties in Somalian plate rotation. Past estimates of the rotation of the Somalian plate suffered from a poor coverage of GPS stations, but some important new stations are now available for a re-evaluation. In this work, we revise the kinematics of the Somalian plate. We first calculate a new GPS velocity solution and perform block kinematic modeling to evaluate the Somalian plate rotation. We then estimate new Somalia-Rovuma and Somalia-Lwandle relative motions across Madagascar and evaluate whether they are consistent with GPS measurements made on the island itself, as well as with other kinematic indicators.
Consistent microscopic and phenomenological analysis of composite particle optical potential
International Nuclear Information System (INIS)
Mukhopadhyay, Sheela; Srivastava, D.K.; Ganguly, N.K.
1976-01-01
A microscopic calculation of the composite particle optical potential has been done using a realistic nucleon-helion interaction and folding it with the density distribution of the targets. The second-order effects were simulated by introducing a scaling factor which was adjusted to reproduce the experimental scattering results. The composite particle optical potential was also derived from the nucleon-nucleus optical potential. The second-order term was explicitly treated as a parameter. Elastic scattering of 20 MeV 3H on targets ranging from 40Ca to 208Pb has also been analysed using a phenomenological optical model. Agreement of these results with the above calculations verified the consistency of the microscopic theory. However, the equivalent sharp radius calculated with the n-helion interaction was observed to be smaller than the phenomenological value. This was attributed to the absence of saturation effects in the density-independent interaction used. Saturation has been introduced by a density-dependent term of the form (1 - c zeta^(2/3)), where zeta is the compound density of the target-helion system. (author)
Decentralized Consistent Network Updates in SDN with ez-Segway
Nguyen, Thanh Dang
2017-03-06
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
48 CFR 52.230-4 - Disclosure and Consistency of Cost Accounting Practices-Foreign Concerns.
2010-10-01
... CONTRACT CLAUSES Text of Provisions and Clauses 52.230-4 Disclosure and Consistency of Cost Accounting... Disclosure Statement, disclose in writing its cost accounting practices as required by 48 CFR 9903.202-1... 48 Federal Acquisition Regulations System 2 2010-10-01 2010-10-01 false Disclosure and Consistency...
Consistent Regulation of Infrastructure Businesses: Some Economic Issues
Flavio M. Menezes
2008-01-01
This paper examines some important economic aspects associated with the notion that consistency in the regulation of infrastructure businesses is a desirable feature. It makes two important points. First, it is not easy to measure consistency. In particular, one cannot simply point to different regulatory parameters as evidence of inconsistent regulatory policy. Second, even if one does observe consistency emerging from decisions made by different regulators, it does not necessarily mean that...
Antibiotics for respiratory, ear and urinary tract disorders and consistency among GPs.
Ong, D.S.Y.; Kuyvenhoven, M.M.; Dijk, L. van; Verheij, T.J.M.
2008-01-01
Objectives: To describe specific diagnoses for which systemic antibiotics are prescribed, to assess adherence of antibiotic choice to national guidelines and to assess consistency among general practitioners (GPs) in prescribed volumes of antibiotics for respiratory, ear and urinary tract disorders.
Large scale Bayesian nuclear data evaluation with consistent model defects
International Nuclear Information System (INIS)
Schnabel, G
2015-01-01
The aim of nuclear data evaluation is the reliable determination of cross sections and related quantities of the atomic nuclei. To this end, evaluation methods are applied which combine the information of experiments with the results of model calculations. The evaluated observables with their associated uncertainties and correlations are assembled into data sets, which are required for the development of novel nuclear facilities, such as fusion reactors for energy supply and accelerator-driven systems for nuclear waste incineration. The efficiency and safety of such future facilities depend on the quality of these data sets and thus also on the reliability of the applied evaluation methods. This work investigated the performance of the majority of available evaluation methods in two scenarios. The study indicated the importance of an essential component in these methods: the frequently ignored deficiency of nuclear models. Usually, nuclear models are based on approximations and thus their predictions may deviate from reliable experimental data. As demonstrated in this thesis, neglecting this possibility in evaluation methods can lead to estimates of observables which are inconsistent with experimental data. Because of this finding, an extension of Bayesian evaluation methods is proposed to take into account the deficiency of the nuclear models. The deficiency is modeled as a random function in terms of a Gaussian process and combined with the model prediction. This novel formulation conserves sum rules and allows the magnitude of the model deficiency to be estimated explicitly. Both features have so far been missing from available evaluation methods. Furthermore, two improvements of existing methods have been developed in the course of this thesis. The first improvement concerns methods relying on Monte Carlo sampling. A Metropolis-Hastings scheme with a specific proposal distribution is suggested, which proved to be more efficient in the studied scenarios than the
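The idea of a Gaussian-process model defect can be illustrated with a small numpy sketch: fit a GP to the model-minus-data residuals and add its posterior mean back to the deficient model prediction. This is a generic illustration with assumed kernel and noise settings, not the thesis's actual Bayesian formulation:

```python
import numpy as np

def rbf(xa, xb, length=1.0, amp=1.0):
    """Squared-exponential (RBF) kernel matrix between two point sets."""
    d = xa[:, None] - xb[None, :]
    return amp**2 * np.exp(-0.5 * (d / length)**2)

def gp_defect_mean(x_obs, resid, x_new, noise=0.1, length=1.0, amp=1.0):
    """Posterior mean of a GP fitted to data-minus-model residuals."""
    K = rbf(x_obs, x_obs, length, amp) + noise**2 * np.eye(len(x_obs))
    Ks = rbf(x_new, x_obs, length, amp)
    return Ks @ np.linalg.solve(K, resid)

# Toy setup: the "model" misses a smooth trend present in the "data".
x = np.linspace(0.0, 5.0, 30)
model = np.ones_like(x)              # deficient model: constant prediction
data = 1.0 + 0.3 * np.sin(x)         # synthetic "experiment"
defect = gp_defect_mean(x, data - model, x)
corrected = model + defect           # model prediction plus estimated defect
```

Because the defect is an explicit random function, its fitted magnitude directly quantifies how far the model falls short of the data, which is the feature the thesis argues is missing from standard evaluation methods.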
Evaluating the hydrological consistency of satellite based water cycle components
Lopez Valencia, Oliver Miguel
2016-06-15
Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, it is generally not commensurate with the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrologically complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget imposes on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
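In the arid-basin setting described, hydrological consistency can be screened with the water-budget closure P - E - Q - dS, which should be near zero when the satellite products are mutually consistent. A minimal sketch (units and values are illustrative assumptions, not actual satellite retrievals; runoff Q is often negligible in closed arid basins):

```python
def budget_residual(p, e, ds, q=0.0):
    """Water-budget residual P - E - Q - dS, all terms in the same
    units (e.g. mm/month).  Values near zero indicate that the
    satellite rainfall (P), evaporation (E) and GRACE storage-change
    (dS) products are hydrologically consistent."""
    return p - e - q - ds

# Illustrative monthly values (mm): residual of 3 mm suggests a small
# inconsistency among the products for this month.
residual = budget_residual(p=45.0, e=38.0, ds=4.0)
```

Applied per month across a basin, the time series of residuals exposes which evaporation product best closes the budget against a common P and dS pair.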
Consistent Code Qualification Process and Application to WWER-1000 NPP
International Nuclear Information System (INIS)
Berthon, A.; Petruzzi, A.; Giannotti, W.; D'Auria, F.; Reventos, F.
2006-01-01
Calculation analyses by application of system codes are performed to evaluate the NPP or facility behavior during a postulated transient or to evaluate the code capability. A calculation analysis constitutes a process that involves the code itself, the data of the reference plant, the data about the transient, the nodalization, and the user. All these elements affect one another and affect the results. A major issue in the use of a mathematical model is the model's capability to reproduce the plant or facility behavior under steady-state and transient conditions. These aspects constitute two main checks that must be satisfied during the qualification process. The first of them is related to the realization of a scheme of the reference plant; the second is related to the capability to reproduce the transient behavior. The aim of this paper is to describe the UMAE (Uncertainty Method based on Accuracy Extrapolation) methodology developed at the University of Pisa for qualifying a nodalization and analysing the calculated results, and to perform the uncertainty evaluation of the system code by the CIAU code (Code with the capability of Internal Assessment of Uncertainty). The activity consists of the re-analysis of Experiment BL-44 (SBLOCA) performed in the LOBI facility and the analysis of a Kv-scaling calculation of the WWER-1000 NPP nodalization taking the test BL-44 as reference. Relap5/Mod3.3 has been used as the thermal-hydraulic system code and the standard procedure adopted at the University of Pisa has been applied to show the capability of the code to predict the significant aspects of the transient and to obtain a qualified nodalization of the WWER-1000 through a systematic qualitative and quantitative accuracy evaluation. The qualitative accuracy evaluation is based on the selection of Relevant Thermal-hydraulic Aspects (RTAs) and is a prerequisite to the application of the Fast Fourier Transform Based Method (FFTBM) which quantifies
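The FFTBM mentioned at the end quantifies code accuracy in the frequency domain. Its central figure of merit, the average amplitude AA, is the total spectral magnitude of the calculation error normalised by that of the experimental signal; smaller is better, and the acceptability thresholds used in practice are method-specific and not reproduced here. A minimal sketch:

```python
import numpy as np

def fftbm_average_amplitude(exp, calc):
    """FFTBM average amplitude:
    AA = sum|FFT(calc - exp)| / sum|FFT(exp)|
    computed over the sampled transient; AA = 0 means the calculation
    reproduces the experimental signal exactly."""
    exp = np.asarray(exp, dtype=float)
    calc = np.asarray(calc, dtype=float)
    err = np.abs(np.fft.rfft(calc - exp))
    ref = np.abs(np.fft.rfft(exp))
    return err.sum() / ref.sum()
```

Because the FFT is linear, a calculation that uniformly overshoots the experiment by 10% yields AA = 0.1 regardless of the signal shape.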
Personality consistency in dogs: a meta-analysis.
Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
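The pooled estimate reported above (r = 0.43) is the kind of quantity obtained by averaging study correlations on Fisher's z scale. A minimal fixed-effect sketch with n - 3 weights (illustrative; the meta-analysis itself uses a more elaborate model with moderators):

```python
import math

def pooled_r(rs, ns):
    """Fixed-effect pooling of correlation coefficients: transform each
    r to Fisher's z, take the inverse-variance weighted mean (weight
    n - 3 per study), then transform back to the r scale."""
    zs = [math.atanh(r) for r in rs]
    ws = [n - 3 for n in ns]
    z = sum(w * zi for w, zi in zip(ws, zs)) / sum(ws)
    return math.tanh(z)
```

The z transform keeps the sampling distribution approximately normal, so the pooled value always falls between the smallest and largest study correlations, pulled toward the larger studies.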
Personality consistency in dogs: a meta-analysis.
Directory of Open Access Journals (Sweden)
Jamie L Fratkin
Full Text Available Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
Personality Consistency in Dogs: A Meta-Analysis
Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787
26 CFR 301.6224(c)-3 - Consistent settlements.
2010-04-01
... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Consistent settlements. 301.6224(c)-3 Section... settlements. (a) In general. If the Internal Revenue Service enters into a settlement agreement with any..., settlement terms consistent with those contained in the settlement agreement entered into. (b) Requirements...
Self-consistent calculation of atomic structure for mixture
International Nuclear Information System (INIS)
Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping
2000-01-01
Based on the relativistic Hartree-Fock-Slater self-consistent average atom model, the atomic structure of a mixture is studied by summing up the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed
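The "summing component volumes" construction has a simple fixed-point structure: partition the total volume among components so that the volumes add up while every component sees a common boundary pressure. The sketch below uses a hypothetical one-parameter equation of state p_i = a_i / v_i^gamma purely to illustrate that iteration; it is not the paper's relativistic Hartree-Fock-Slater scheme:

```python
def partition_volume(total_v, coeffs, gamma=5.0 / 3.0, iters=50):
    """Toy self-consistent volume partition for a mixture: find component
    volumes v_i summing to total_v such that the hypothetical pressures
    p_i = a_i / v_i**gamma are equal.  Illustrates the fixed-point loop
    only; real average-atom models solve coupled atomic-structure
    equations at each step."""
    n = len(coeffs)
    vs = [total_v / n] * n
    for _ in range(iters):
        # current mean pressure over the components
        p = sum(a / v**gamma for a, v in zip(coeffs, vs)) / n
        # equal pressure p implies v_i = (a_i / p)**(1/gamma)
        vs = [(a / p) ** (1.0 / gamma) for a in coeffs]
        # rescale to restore volume additivity (the mixture constraint)
        scale = total_v / sum(vs)
        vs = [v * scale for v in vs]
    return vs
```

At convergence the two constraints hold together: the volumes sum to the mixture volume and all component pressures agree.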
Consistency and Inconsistency in PhD Thesis Examination
Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy
2008-01-01
This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…
Delimiting Coefficient α from Internal Consistency and Unidimensionality
Sijtsma, Klaas
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability, represented by coefficient α, to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…
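For reference, coefficient α is computed from the item variances and the total-score variance; the commentary's point is that this quantity bounds reliability from below rather than measuring internal consistency or unidimensionality as such. A plain-Python sketch:

```python
def cronbach_alpha(items):
    """Cronbach's coefficient alpha for item-score columns (each column
    holds one item's scores across the same respondents):
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores)).
    Alpha is a lower bound to reliability, per the argument above."""
    k = len(items)
    n = len(items[0])

    def var(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```

Perfectly parallel items give α = 1, while weakly correlated items pull α down; the bound becomes tight only under restrictive measurement assumptions.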
Risk aversion vs. the Omega ratio : Consistency results
Balder, Sven; Schweizer, Nikolaus
This paper clarifies when the Omega ratio and related performance measures are consistent with second order stochastic dominance and when they are not. To avoid consistency problems, the threshold parameter in the ratio should be chosen as the expected return of some benchmark – as is commonly done
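The Omega ratio at a threshold θ is the expected gain above θ divided by the expected shortfall below θ; the paper's consistency prescription concerns how θ is chosen, namely as the expected return of a benchmark. A minimal sketch over a sample of returns:

```python
def omega_ratio(returns, threshold):
    """Empirical Omega ratio: mean excess gain over the threshold
    divided by mean shortfall below it.  Per the paper, choosing the
    threshold as a benchmark's expected return keeps the measure
    consistent with second-order stochastic dominance."""
    n = len(returns)
    gains = sum(max(r - threshold, 0.0) for r in returns) / n
    losses = sum(max(threshold - r, 0.0) for r in returns) / n
    return gains / losses
```

A symmetric return distribution evaluated at its own mean gives Omega = 1; values above 1 indicate that gains beyond the threshold outweigh shortfalls.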
Policy consistency and the achievement of Nigeria's foreign policy ...
African Journals Online (AJOL)
This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency, and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, Andreas
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which
Quasi-Particle Self-Consistent GW for Molecules.
Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J
2016-06-14
We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.
Consistency of hand preference: predictions to intelligence and school achievement.
Kee, D W; Gottfried, A; Bathurst, K
1991-05-01
Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Personality and Situation Predictors of Consistent Eating Patterns.
Directory of Open Access Journals (Sweden)
Uku Vainik
Full Text Available A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Accuracy and Consistency of Respiratory Gating in Abdominal Cancer Patients
International Nuclear Information System (INIS)
Ge, Jiajia; Santanam, Lakshmi; Yang, Deshan; Parikh, Parag J.
2013-01-01
Purpose: To evaluate respiratory gating accuracy and intrafractional consistency for abdominal cancer patients treated with respiratory gated treatment on a regular linear accelerator system. Methods and Materials: Twelve abdominal patients implanted with fiducials were treated with amplitude-based respiratory-gated radiation therapy. On the basis of daily orthogonal fluoroscopy, the operator readjusted the couch position and gating window such that the fiducial was within a setup margin (fiducial-planning target volume [f-PTV]) when RPM indicated “beam-ON.” Fifty-five pre- and post-treatment fluoroscopic movie pairs with synchronized respiratory gating signal were recorded. Fiducial motion traces were extracted from the fluoroscopic movies using a template matching algorithm and correlated with f-PTV by registering the digitally reconstructed radiographs with the fluoroscopic movies. Treatment was determined to be “accurate” if 50% of the fiducial area stayed within f-PTV while beam-ON. For movie pairs that lost gating accuracy, a MATLAB program was used to assess whether the gating window was optimized, the external-internal correlation (EIC) changed, or the patient moved between movies. A series of safety margins from 0.5 mm to 3 mm was added to f-PTV for reassessing gating accuracy. Results: A decrease in gating accuracy was observed in 44% of movie pairs from daily fluoroscopic movies of 12 abdominal patients. Three main causes for inaccurate gating were identified as change of global EIC over time (∼43%), suboptimal gating setup (∼37%), and imperfect EIC within movie (∼13%). Conclusions: Inconsistent respiratory gating accuracy may occur within 1 treatment session even with a daily adjusted gating window. To improve or maintain gating accuracy during treatment, we suggest using at least a 2.5-mm safety margin to account for gating and setup uncertainties
Facial Mimicry and Emotion Consistency: Influences of Memory and Context.
Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P
2015-01-01
This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.
Quasiparticle self-consistent GW method: a short summary
International Nuclear Information System (INIS)
Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios
2007-01-01
We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation (GWA). The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G_0, which is the starting point for the GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by the GWA. After self-consistency is attained, we have G_0, W (the screened Coulomb interaction) and G self-consistently. This G_0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We summarize some theoretical discussions to justify QSGW. We then survey results obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic and can be explained by the neglect of excitonic effects
Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.
Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H
2016-01-01
To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included external supports of parental closeness and relationship with a caring nonparental adult and internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.
The Consistent Preferences Approach to Deductive Reasoning in Games
Asheim, Geir B
2006-01-01
"The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif
On the consistent histories approach to quantum mechanics
International Nuclear Information System (INIS)
Dowker, F.; Kent, A.
1996-01-01
We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnes, Gell-Mann, and Hartle, and we describe the classifications of consistent sets. We illustrate some general features of consistent sets by a few lemmas and examples. We also consider various interpretations of the formalism, and we examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnes' characterization of true statements---statements that can be deduced unconditionally in his interpretation---is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, and in particular their discussions of communication, prediction, and retrodiction, and we conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as-yet-unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics by some selection principle in order to produce a fundamental theory capable of unconditional predictions
Consistency of Trend Break Point Estimator with Underspecified Break Number
Directory of Open Access Journals (Sweden)
Jingjing Yang
2017-01-01
Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
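The inconsistency described above is easy to reproduce numerically: fit a single-break trend model to data generated with two breaks of similar, same-signed magnitude. The sketch below is illustrative only; the slopes, break locations, and noise level are invented, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_one_break(y):
    """Grid-search a single trend-break location by least squares.

    Fits y_t = a + b*t + c*max(t - k, 0) for every candidate break k
    and returns the k minimizing the residual sum of squares.
    """
    n = len(y)
    t = np.arange(n)
    best_k, best_rss = None, np.inf
    for k in range(2, n - 2):
        X = np.column_stack([np.ones(n), t, np.maximum(t - k, 0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = np.sum((y - X @ beta) ** 2)
        if rss < best_rss:
            best_k, best_rss = k, rss
    return best_k

# Two positive trend breaks of similar magnitude at t=40 and t=80,
# but the fitted model allows only one break (underspecified).
n = 120
t = np.arange(n)
y = 0.05 * t + 0.10 * np.maximum(t - 40, 0) + 0.10 * np.maximum(t - 80, 0)
y = y + rng.normal(0, 0.2, n)

print(fit_one_break(y))  # tends to land between the true breaks, near neither
```

The estimated break does not settle on either true break point, which is the inconsistency the paper analyzes for this configuration.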
Liking for Evaluators: Consistency and Self-Esteem Theories
Regan, Judith Weiner
1976-01-01
Consistency and self-esteem theories make contrasting predictions about the relationship between a person's self-evaluation and his liking for an evaluator. Laboratory experiments confirmed predictions about these theories. (Editor/RK)
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera; Kruger, Jens; Moller, Torsten; Hadwiger, Markus
2014-01-01
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined
Structures, profile consistency, and transport scaling in electrostatic convection
DEFF Research Database (Denmark)
Bian, N.H.; Garcia, O.E.
2005-01-01
Two mechanisms at the origin of profile consistency in models of electrostatic turbulence in magnetized plasmas are considered. One involves turbulent diffusion in collisionless plasmas and the subsequent turbulent equipartition of Lagrangian invariants. By the very nature of its definition...
15 CFR 930.36 - Consistency determinations for proposed activities.
2010-01-01
... necessity of issuing separate consistency determinations for each incremental action controlled by the major... plans), and that affect any coastal use or resource of more than one State. Many States share common...
Decentralized Consistent Network Updates in SDN with ez-Segway
Nguyen, Thanh Dang; Chiesa, Marco; Canini, Marco
2017-01-01
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes
The utility of theory of planned behavior in predicting consistent ...
African Journals Online (AJOL)
admin
disease. Objective: To examine the utility of theory of planned behavior in predicting consistent condom use intention of HIV .... (24-25), making subjective norms as better predictors of intention ..... Organizational Behavior and Human Decision.
A methodology for the data energy regional consumption consistency analysis
International Nuclear Information System (INIS)
Canavarros, Otacilio Borges; Silva, Ennio Peres da
1999-01-01
The article introduces a methodology for consistency analysis of regional energy consumption data. The work is based on recent studies by the cited authors and addresses Brazilian energy matrices and Brazilian regional energy balances. The results are compared and analyzed
Island of Stability for Consistent Deformations of Einstein's Gravity
DEFF Research Database (Denmark)
Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan
2012-01-01
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...
Self-consistent normal ordering of gauge field theories
International Nuclear Information System (INIS)
Ruehl, W.
1987-01-01
Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs
Consistency of the least weighted squares under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 2011, č. 47 (2011), s. 179-206 ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf
Cosmological consistency tests of gravity theory and cosmic acceleration
Ishak-Boushaki, Mustapha B.
2017-01-01
Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.
Self-consistency corrections in effective-interaction calculations
International Nuclear Information System (INIS)
Starkand, Y.; Kirson, M.W.
1975-01-01
Large-matrix extended-shell-model calculations are used to compute self-consistency corrections to the effective interaction and to the linked-cluster effective interaction. The corrections are found to be numerically significant and to affect the rate of convergence of the corresponding perturbation series. The influence of various partial corrections is tested. It is concluded that self-consistency is an important effect in determining the effective interaction and improving the rate of convergence. (author)
Consistent Estimation of Pricing Kernels from Noisy Price Data
Vladislav Kargin
2003-01-01
If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.
Measuring consistency of autobiographical memory recall in depression.
LENUS (Irish Health Repository)
Semkovska, Maria
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
Are prescription drug insurance choices consistent with expected utility theory?
Bundorf, M Kate; Mata, Rui; Schoenbaum, Michael; Bhattacharya, Jay
2013-09-01
To determine the extent to which people make choices inconsistent with expected utility theory when choosing among prescription drug insurance plans and whether tabular or graphical presentation format influences the consistency of their choices. Members of an Internet-enabled panel chose between two Medicare prescription drug plans. The "low variance" plan required higher out-of-pocket payments for the drugs respondents usually took but lower out-of-pocket payments for the drugs they might need if they developed a new health condition than the "high variance" plan. The probability of a change in health varied within subjects and the presentation format (text vs. graphical) and the affective salience of the clinical condition (abstract vs. risk related to specific clinical condition) varied between subjects. Respondents were classified based on whether they consistently chose either the low or high variance plan. Logistic regression models were estimated to examine the relationship between decision outcomes and task characteristics. The majority of respondents consistently chose either the low or high variance plan, consistent with expected utility theory. Half of respondents consistently chose the low variance plan. Respondents were less likely to make discrepant choices when information was presented in graphical format. Many people, although not all, make choices consistent with expected utility theory when they have information on differences among plans in the variance of out-of-pocket spending. Medicare beneficiaries would benefit from information on the extent to which prescription drug plans provide risk protection. PsycINFO Database Record (c) 2013 APA, all rights reserved.
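The plan comparison described above can be sketched as a small expected-utility computation. All dollar amounts, the CRRA utility form, and the probabilities below are invented for illustration; the point is only that the preferred plan can flip as the probability of a new health condition rises.

```python
def utility(wealth, gamma=2.0):
    """CRRA utility over remaining wealth (illustrative functional form)."""
    return wealth ** (1 - gamma) / (1 - gamma)

def expected_utility(plan, p_new_condition, wealth=10_000.0):
    """plan = (oop_usual, oop_if_new_condition): out-of-pocket drug costs."""
    oop_usual, oop_new = plan
    return ((1 - p_new_condition) * utility(wealth - oop_usual)
            + p_new_condition * utility(wealth - oop_new))

low_variance = (1200.0, 1500.0)    # higher usual cost, capped downside
high_variance = (800.0, 4000.0)    # lower usual cost, large downside

for p in (0.05, 0.20):
    better = ("low-variance" if expected_utility(low_variance, p)
              > expected_utility(high_variance, p) else "high-variance")
    print(f"p={p:.2f}: {better} plan preferred")
```

With these illustrative numbers, the high-variance plan is preferred when a change in health is unlikely, and the low-variance (risk-protective) plan when it is likely, which is the expected-utility pattern the study tests for.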
Martial arts striking hand peak acceleration, accuracy and consistency.
Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A
2013-01-01
The goal of this paper was to investigate the possible trade-off between peak hand acceleration and the accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum-effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated with hand peak acceleration prior to impact (r^2 = 0.456, p = 0.032) and accuracy (r^2 = 0.621, p = 0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated with consistency (r^2 = 0.085, p = 0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
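The two measures defined in the abstract (accuracy as the radial distance of the strike centroid from the target, consistency as the RMS distance of strikes from their centroid) can be computed directly. The strike coordinates below are invented for illustration:

```python
import numpy as np

def accuracy_and_consistency(strikes, target):
    """Accuracy: radial distance from the strikes' centroid to the target.
    Consistency: root mean squared distance of strikes from their centroid.
    Smaller values mean a grouping closer to the target / tighter grouping."""
    strikes = np.asarray(strikes, float)
    centroid = strikes.mean(axis=0)
    accuracy = np.linalg.norm(centroid - np.asarray(target, float))
    consistency = np.sqrt(np.mean(np.sum((strikes - centroid) ** 2, axis=1)))
    return accuracy, consistency

# 12 illustrative strike impact points (x, y), target at the origin
strikes = [(3, 1), (2, -1), (4, 0), (1, 2), (2, 1), (3, -2),
           (2, 0), (1, -1), (3, 2), (2, 2), (4, -1), (3, 0)]
acc, con = accuracy_and_consistency(strikes, target=(0, 0))
print(f"accuracy = {acc:.2f}, consistency = {con:.2f}")
```

A subject can thus be accurate but inconsistent (centroid on target, wide scatter) or the reverse, which is why the paper correlates the two measures with experience separately.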
Self-consistent electrodynamic scattering in the symmetric Bragg case
International Nuclear Information System (INIS)
Campos, H.S.
1988-01-01
We have analyzed the symmetric Bragg case, introducing a model of self-consistent scattering for two elliptically polarized beams. The crystal is taken as a set of mathematical planes, each of them defined by a surface density of dipoles. We have treated the mesofield and the epifield differently from Ewald's theory, and we assumed a plane of dipoles and the associated fields as a self-consistent scattering unit. The exact analytical treatment, when applied to any two neighbouring planes, results in a general and self-consistent Bragg equation in terms of the amplitude and phase variations. The generalized solution for the set of N planes was obtained after introducing an absorption factor in the incident radiation, in two ways: (i) analytically, through a rule of field similarity, which states that incidence occurs on both faces of all crystal planes, and also through a matricial development with the Chebyshev polynomials; (ii) numerically, by calculating iteratively the reflectivity, the reflection phase, the transmissivity, the transmission phase and the energy. The results are shown through reflection and transmission curves, which are characteristic of both the kinematical and the dynamical theories. Conservation of energy, which follows from Ewald's self-consistency principle, is used. In the absorption case, the results show that absorption is not the only cause of the asymmetric form of the reflection curves. The model contains the basic elements for a unified, microscopic, self-consistent, vectorial and exact formulation for interpreting X-ray diffraction in perfect crystals. (author)
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
A consistent response spectrum analysis including the resonance range
International Nuclear Information System (INIS)
Schmitz, D.; Simmchen, A.
1983-01-01
The report provides a complete consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA) which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because today this method is preferentially applied as an important tool in the earthquake proofing of components in nuclear power plants. (orig./HP)
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
International Nuclear Information System (INIS)
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.
2013-01-01
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
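The core idea above, checking that halo positions and velocities are physically consistent between snapshots, can be illustrated with a toy check that flags descendant links deviating from ballistic extrapolation. This is only a sketch: the actual consistent-trees criteria are more elaborate, and the coordinates, velocities, and tolerance below are invented.

```python
import numpy as np

def flag_inconsistent_links(pos0, vel0, pos1, dt, tol):
    """Flag halo descendant links whose snapshot-1 positions are
    inconsistent with ballistic extrapolation pos0 + vel0*dt from
    snapshot 0 (a toy stand-in for the paper's consistency checks)."""
    predicted = pos0 + vel0 * dt
    err = np.linalg.norm(predicted - pos1, axis=1)
    return err > tol

# Three halos tracked between two snapshots; the third jumps unphysically,
# suggesting a spurious object or a mislinked progenitor.
pos0 = np.array([[0.0, 0.0], [10.0, 0.0], [20.0, 5.0]])
vel0 = np.array([[1.0, 0.0], [0.0, 1.0], [0.0, 0.0]])
pos1 = np.array([[1.1, 0.0], [10.0, 0.9], [35.0, 5.0]])
print(flag_inconsistent_links(pos0, vel0, pos1, dt=1.0, tol=2.0))
```

Links flagged this way are the candidates for repair (inserting a missing halo) or removal (deleting a spurious one) in the paper's algorithm.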
Managing Consistency Anomalies in Distributed Integrated Databases with Relaxed ACID Properties
DEFF Research Database (Denmark)
Frank, Lars; Ulslev Pedersen, Rasmus
2014-01-01
In central databases the consistency of data is normally implemented by using the ACID (Atomicity, Consistency, Isolation and Durability) properties of a DBMS (Data Base Management System). This is not possible if distributed and/or mobile databases are involved and the availability of data also has to be optimized. Therefore, we will in this paper use so-called relaxed ACID properties across different locations. The objective of designing relaxed ACID properties across different database locations is that the users can trust the data they use even if the distributed database temporarily is inconsistent. It is also important that disconnected locations can operate in a meaningful way in so-called disconnected mode. A database is DBMS consistent if its data complies with the consistency rules of the DBMS's metadata. If the database is DBMS consistent both when a transaction starts and when it has
Self-consistent hybrid functionals for solids: a fully-automated implementation
Erba, A.
2017-08-01
A fully automated algorithm for the determination of the system-specific optimal fraction of exact exchange in self-consistent hybrid functionals of density functional theory is illustrated, as implemented in the public Crystal program. The exchange fraction of this new class of functionals is self-consistently updated in proportion to the inverse of the dielectric response of the system within an iterative procedure (Skone et al 2014 Phys. Rev. B 89 195112). Each iteration of the present scheme, in turn, implies convergence of a self-consistent-field (SCF) and a coupled-perturbed-Hartree-Fock/Kohn-Sham (CPHF/KS) procedure. The present implementation, besides improving the user-friendliness of self-consistent hybrids, exploits the unperturbed and electric-field-perturbed density matrices from previous iterations as guesses for subsequent SCF and CPHF/KS iterations, which is documented to reduce the overall computational cost of the whole process by a factor of 2.
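The self-consistency loop described above (exchange fraction updated to the inverse dielectric constant until convergence) can be sketched as a fixed-point iteration. The model for how the dielectric constant depends on the exchange fraction below is invented purely for illustration; in the real scheme it comes from a CPHF/KS calculation at each step.

```python
def dielectric_constant(alpha):
    """Toy model: the dielectric constant decreases as the exact-exchange
    fraction alpha opens the gap. Purely illustrative; a real calculation
    obtains this from a CPHF/KS dielectric-response computation."""
    return 8.0 - 4.0 * alpha

def self_consistent_alpha(alpha0=0.25, tol=1e-8, max_iter=100):
    """Iterate alpha <- 1/epsilon(alpha) to self-consistency."""
    alpha = alpha0
    for _ in range(max_iter):
        new_alpha = 1.0 / dielectric_constant(alpha)
        if abs(new_alpha - alpha) < tol:
            return new_alpha
        alpha = new_alpha
    raise RuntimeError("exchange fraction did not converge")

alpha = self_consistent_alpha()
print(f"converged exchange fraction: {alpha:.4f}")
```

Reusing the previous iteration's result as the starting guess, as the implementation does for the density matrices, is what makes each outer cycle cheap once the loop is near its fixed point.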
Consistent Partial Least Squares Path Modeling via Regularization.
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based approach to structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it estimates path coefficients from consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
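Regularized PLSc itself is more involved, but the ridge idea it borrows is easy to demonstrate: adding a penalty term to the cross-product matrix stabilizes coefficient estimates when predictors are nearly collinear. A minimal sketch with synthetic data (not the PLSc algorithm itself):

```python
import numpy as np

rng = np.random.default_rng(1)

def ridge(X, y, lam):
    """Closed-form ridge solution: (X'X + lam*I)^(-1) X'y.
    lam = 0 reduces to ordinary least squares."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# Two nearly collinear predictors, as when latent-variable correlations
# approach 1; the true model is y = x1 + noise.
n = 200
x1 = rng.normal(size=n)
x2 = x1 + rng.normal(scale=0.01, size=n)   # correlation close to 1
X = np.column_stack([x1, x2])
y = x1 + rng.normal(scale=0.1, size=n)

b_ols = ridge(X, y, lam=0.0)   # unstable: large offsetting coefficients
b_reg = ridge(X, y, lam=1.0)   # stabilized: both coefficients near 0.5
print(b_ols, b_reg)
```

The unregularized fit can only pin down the sum of the two coefficients, so the individual estimates wander; the penalty shrinks the ill-determined difference toward zero.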
Context-dependent individual behavioral consistency in Daphnia
DEFF Research Database (Denmark)
Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe
2017-01-01
The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming ... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots; instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...
Consistent forcing scheme in the cascaded lattice Boltzmann method
Fei, Linlin; Luo, Kai Hong
2017-11-01
In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme are demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
Self-consistent approximations beyond the CPA: Part II
International Nuclear Information System (INIS)
Kaplan, T.; Gray, L.J.
1982-01-01
This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented space formalism for a binary alloy is sketched, and the notation to be used is derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function, and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short-range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described
Linear augmented plane wave method for self-consistent calculations
International Nuclear Information System (INIS)
Takeda, T.; Kuebler, J.
1979-01-01
O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)
Self-consistency and coherent effects in nonlinear resonances
International Nuclear Information System (INIS)
Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.
2003-01-01
The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping
A consistent time frame for Chaucer's Canterbury Pilgrimage
Kummerer, K. R.
2001-08-01
A consistent time frame for the pilgrimage that Geoffrey Chaucer describes in The Canterbury Tales can be established if the seven celestial assertions related to the journey mentioned in the text can be reconciled with each other and the date of April 18 that is also mentioned. Past attempts to establish such a consistency for all seven celestial assertions have not been successful. The analysis herein, however, indicates that in The Canterbury Tales Chaucer accurately describes the celestial conditions he observed in the April sky above the London/Canterbury region of England in the latter half of the fourteenth century. All seven celestial assertions are in agreement with each other and consistent with the April 18 date. The actual words of Chaucer indicate that the Canterbury journey began during the 'seson' he defines in the General Prologue and ends under the light of the full Moon on the night of April 18, 1391.
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known...... to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct...... from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed....
Self-consistent modelling of resonant tunnelling structures
DEFF Research Database (Denmark)
Fiig, T.; Jauho, A.P.
1992-01-01
We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges......, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes....
Examination of the consistency and accuracy of computerized brachytherapy dose predictions
International Nuclear Information System (INIS)
Tolbert, D.D.; Reed, S.A.
1981-01-01
Four brachytherapy test cases were sent to representatives of commercial and non-commercial, computerized radiation oncology treatment planning systems. Four commercial systems are represented herein. The non-commercial, state-of-the-art systems represented are (in alphabetical order) BRACHY, ISODOS and RADCOMP. Mutual comparisons were made to examine consistency and a comparison with experimental measurements around a single source was made to examine accuracy. The systems represented are most consistent within 5 cm from the center of a single source, and within rays from the center making angles of greater than or equal to 20° relative to the source axis. Taking into account tissue absorption and scatter, the spatial uncertainty in the location of a particular isodose rate value is less than or equal to 0.7 mm for commercial systems and less than or equal to 0.5 mm for non-commercial systems
An Explicit Consistent Geometric Stiffness Matrix for the DKT Element
Directory of Open Access Journals (Sweden)
Eliseu Lucena Neto
A large number of references dealing with the geometric stiffness matrix of the DKT finite element exist in the literature, where nearly all of them adopt an inconsistent form. While such a matrix may be part of the element to treat nonlinear problems in general, it is of crucial importance for linearized buckling analysis. The present work seems to be the first to obtain an explicit expression for this matrix in a consistent way. Numerical results on linear buckling of plates assess the element performance either with the proposed explicit consistent matrix, or with the most commonly used inconsistent matrix.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference.
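The resampling scheme the abstract describes can be sketched in a few lines: draw whole clusters with replacement, never individual observations, so within-cluster dependence is preserved in every bootstrap replicate. The data layout and function names below are hypothetical.

```python
# Illustrative sketch of the cluster bootstrap: resample whole clusters
# (with replacement) rather than individual observations, preserving
# within-cluster dependence. Data and names are hypothetical.
import random

def cluster_bootstrap(clusters, stat, n_boot=1000, seed=0):
    """clusters: list of lists of observations; stat: statistic on a flat sample."""
    rng = random.Random(seed)
    reps = []
    for _ in range(n_boot):
        resampled = [rng.choice(clusters) for _ in clusters]  # draw clusters
        flat = [x for c in resampled for x in c]              # clusters stay intact
        reps.append(stat(flat))
    return reps

data = [[1.0, 1.2, 0.9], [2.1, 2.0], [0.5, 0.7, 0.6], [1.5, 1.4]]
mean = lambda xs: sum(xs) / len(xs)
reps = cluster_bootstrap(data, mean, n_boot=2000)
se = (sum((r - mean(reps)) ** 2 for r in reps) / len(reps)) ** 0.5
print(round(se, 3))  # bootstrap standard error of the mean
```

Resampling observations individually here would understate the standard error, because observations within a cluster are treated as independent when they are not; resampling clusters avoids that.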
Consistency in the description of diffusion in compacted bentonite
International Nuclear Information System (INIS)
Lehikoinen, J.; Muurinen, A.
2009-01-01
A macro-level diffusion model, which aims to provide a unifying framework for explaining the experimentally observed co-ion exclusion and greatly controversial counter-ion surface diffusion in a consistent fashion, is presented. It is explained in detail why a term accounting for the non-zero mobility of the counter-ion surface excess is required in the mathematical form of the macroscopic diffusion flux. The prerequisites for the consistency of the model and the problems associated with the interpretation of diffusion in such complex pore geometries as in compacted smectite clays are discussed. (author)
Standard Model Vacuum Stability and Weyl Consistency Conditions
DEFF Research Database (Denmark)
Antipin, Oleg; Gillioz, Marc; Krog, Jens
2013-01-01
At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....
STP: A mathematically and physically consistent library of steam properties
International Nuclear Information System (INIS)
Aguilar, F.; Hutter, A.C.; Tuttle, P.G.
1982-01-01
A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package
Weyl consistency conditions in non-relativistic quantum field theory
Energy Technology Data Exchange (ETDEWEB)
Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)
2016-12-05
Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z is discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.
A Van Atta reflector consisting of half-wave dipoles
DEFF Research Database (Denmark)
Appel-Hansen, Jørgen
1966-01-01
The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...
A self-consistent theory of the magnetic polaron
International Nuclear Information System (INIS)
Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.
1984-10-01
A finite-temperature self-consistent theory of the magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on the novel approach of the thermodynamic two-time Green function methods. This approach consists in the introduction of the "irreducible" Green functions (IGF) and derivation of the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for a calculation of the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)
Evidence for Consistency of the Glycation Gap in Diabetes
Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.
2011-01-01
OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
Diagnostic language consistency among multicultural English-speaking nurses.
Wieck, K L
1996-01-01
Cultural differences among nurses may influence the choice of terminology applicable to use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English language setting. Two diagnoses, pain and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor, supple skin.
The least weighted squares II. Consistency and asymptotic normality
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2002-01-01
Roč. 9, č. 16 (2002), s. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others:GA UK(CR) 255/2000/A EK /FSV Institutional research plan: CEZ:AV0Z1075907 Keywords : robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics
Consistency relation for the Lorentz invariant single-field inflation
International Nuclear Information System (INIS)
Huang, Qing-Guo
2010-01-01
In this paper we compute the sizes of equilateral and orthogonal shape bispectrum for the general Lorentz invariant single-field inflation. The stability of field theory implies a non-negative square of sound speed which leads to a consistency relation between the sizes of orthogonal and equilateral shape bispectrum, namely f_NL^orth. ≤ −0.054 f_NL^equil.. In particular, for the single-field Dirac-Born-Infeld (DBI) inflation, the consistency relation becomes f_NL^orth. = 0.070 f_NL^equil. ≤ 0. These consistency relations are also valid in the mixed scenario where the quantum fluctuations of some other light scalar fields contribute to a part of total curvature perturbation on the super-horizon scale and may generate a local form bispectrum. A distinguishing prediction of the mixed scenario is τ_NL^loc. > ((6/5) f_NL^loc.)². Comparing these consistency relations to WMAP 7yr data, there is still a big room for the Lorentz invariant inflation, but DBI inflation has been disfavored at more than 68% CL
Short-Cut Estimators of Criterion-Referenced Test Consistency.
Brown, James Dean
1990-01-01
Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…
SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.
MORSE, STANLEY J.; GERGEN, KENNETH J.
To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) self-consistency scale and half the Coopersmith Self-Esteem Inventory, were administered to 78 undergraduate men who had answered an ad for work…
Consistency of the Takens estimator for the correlation dimension
Borovkova, S.; Burton, Robert; Dehling, H.
Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
An algebraic method for constructing stable and consistent autoregressive filters
International Nuclear Information System (INIS)
Harlim, John; Hong, Hoon; Robbins, Jacob L.
2015-01-01
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR-models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern
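The paper's stability requirement is the classical condition that the characteristic roots of the AR model lie inside the unit circle. A minimal sketch of that check for an AR(2) model is below; the Adams-Bashforth order-two consistency constraints the paper also imposes are not reproduced here, and the function name and test values are illustrative only.

```python
# Hedged sketch: the classical stability check for an AR(2) model
# x_{n+1} = a1*x_n + a2*x_{n-1} + noise. "Stable" means both roots of
# the characteristic polynomial lie strictly inside the unit circle.
import cmath

def ar2_stable(a1, a2):
    """Roots of z^2 - a1*z - a2 = 0 must satisfy |z| < 1."""
    disc = cmath.sqrt(a1 * a1 + 4 * a2)
    roots = [(a1 + disc) / 2, (a1 - disc) / 2]
    return all(abs(z) < 1 for z in roots)

print(ar2_stable(0.5, 0.3))   # True: a stationary AR(2) model
print(ar2_stable(1.2, 0.3))   # False: one root lies outside the unit circle
```

The algebraic method in the paper goes further: it chooses the discretization time step so that a model satisfying both this stability condition and the consistency constraints is guaranteed to exist.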
Delimiting coefficient alpha from internal consistency and unidimensionality
Sijtsma, K.
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and
Challenges of Predictability and Consistency in the First ...
African Journals Online (AJOL)
This article aims to investigate some features of Endemann's (1911) Wörterbuch der Sotho-Sprache (Dictionary of the Sotho language) with the focus on challenges of predictability and consistency in the lemmatization approach, the access alphabet, cross references and article treatments. The dictionary has hitherto ...
The Impact of Orthographic Consistency on German Spoken Word Identification
Beyermann, Sandra; Penke, Martina
2014-01-01
An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…
Final Report Fermionic Symmetries and Self consistent Shell Model
International Nuclear Information System (INIS)
Zamick, Larry
2008-01-01
In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with 'anomalous' magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to a large extent to explain them. The importance of a self-consistent shell model was emphasized.
Using the Perceptron Algorithm to Find Consistent Hypotheses
Anthony, M.; Shawe-Taylor, J.
1993-01-01
The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
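The algorithm the abstract refers to is the classic mistake-driven perceptron update; run to convergence on a linearly separable sample, it returns a hypothesis consistent with that sample. A pure-Python sketch with illustrative data (Boolean OR):

```python
# Minimal perceptron: repeat mistake-driven updates until the weight
# vector classifies every sample correctly, i.e. is a consistent hypothesis.

def perceptron(samples, epochs=100):
    """samples: list of (x_vector, label) with label in {-1, +1}.
    Returns weights w (last component is the bias)."""
    n = len(samples[0][0])
    w = [0.0] * (n + 1)                       # weights + bias
    for _ in range(epochs):
        errors = 0
        for x, y in samples:
            xa = list(x) + [1.0]              # augment input with bias term
            if y * sum(wi * xi for wi, xi in zip(w, xa)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, xa)]  # update on a mistake
                errors += 1
        if errors == 0:
            return w                          # consistent with all samples
    return w

# Boolean OR is linearly separable.
data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = perceptron(data)
preds = [1 if sum(wi * xi for wi, xi in zip(w, list(x) + [1.0])) > 0 else -1
         for x, _ in data]
print(preds)  # matches the labels: [-1, 1, 1, 1]
```

The inefficiency result in the abstract is about worst-case behavior: on adversarial samples the number of updates can grow far faster than any polynomial in the sample description, even though the algorithm is simple and always terminates on separable data.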
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
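The nearest-neighbor half of the abstract's argument can be sketched very simply: the vote fraction among the k nearest neighbors is itself the probability estimate. The toy one-dimensional data below are hypothetical; the authors' actual implementations are the R packages mentioned in the abstract.

```python
# Sketch of a "probability machine" in the abstract's sense: a k-nearest
# neighbor rule whose vote fraction estimates P(Y=1 | x). Illustrative only.

def knn_probability(train, x, k=5):
    """train: list of (feature, label) with label in {0, 1}.
    Returns the fraction of 1-labels among the k nearest neighbors of x,
    a consistent estimate of P(Y=1|x) as k grows suitably with n."""
    nearest = sorted(train, key=lambda t: abs(t[0] - x))[:k]
    return sum(y for _, y in nearest) / k

train = [(0.1, 0), (0.2, 0), (0.3, 0), (0.7, 1), (0.8, 1), (0.9, 1), (0.5, 1)]
print(knn_probability(train, 0.15, k=3))  # mostly 0-labels nearby
print(knn_probability(train, 0.85, k=3))  # mostly 1-labels nearby
```

Random forests play the same role in the paper with the vote fraction across trees; the point of the paper is that both estimators are provably consistent for the regression function P(Y=1|x), not just for the classification.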
Consistent seasonal snow cover depth and duration variability over ...
Indian Academy of Sciences (India)
Decline in consistent seasonal snow cover depth, duration and changing snow cover build-up pattern over the WH in recent decades indicate that WH has undergone considerable climate change and winter weather patterns are changing in the WH. 1. Introduction. Mountainous regions around the globe are storehouses.
Is There a Future for Education Consistent with Agenda 21?
Smyth, John
1999-01-01
Discusses recent experiences in developing and implementing strategies for education consistent with the concept of sustainable development at two different levels: (1) the international level characterized by Agenda 21 along with the efforts of the United Nations Commission on Sustainable Development to foster its progress; and (2) the national…
Consistent dynamical and statistical description of fission and comparison
Energy Technology Data Exchange (ETDEWEB)
Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
The research survey of consistent dynamical and statistical description of fission is briefly introduced. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).
Brief Report: Consistency of Search Engine Rankings for Autism Websites
Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.
2012-01-01
The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
Consistency of the Self-Schema in Depression.
Ross, Michael J.; Mueller, John H.
Depressed individuals may filter or distort environmental information in direct relationship to their self perceptions. To investigate the degree of uncertainty about oneself and others, as measured by consistent/inconsistent responses, 72 college students (32 depressed and 40 nondepressed) rated selected adjectives from the Derry and Kuiper…
Composition consisting of a dendrimer and an active substance
1995-01-01
The invention relates to a composition consisting of a dendrimer provided with blocking agents and an active substance occluded in the dendrimer. According to the invention a blocking agent is a compound which is sterically of sufficient size, which readily enters into a chemical bond with the
Analytical relativistic self-consistent-field calculations for atoms
International Nuclear Information System (INIS)
Barthelat, J.C.; Pelissier, M.; Durand, P.
1980-01-01
A new second-order representation of the Dirac equation is presented. This representation which is exact for a hydrogen atom is applied to approximate analytical self-consistent-field calculations for atoms. Results are given for the rare-gas atoms from helium to radon and for lead. The results compare favorably with numerical Dirac-Hartree-Fock solutions
A consistent analysis for the quark condensate in QCD
International Nuclear Information System (INIS)
Huang Zheng; Huang Tao
1988-08-01
The dynamical symmetry breaking in QCD is analysed based on the vacuum condensates. A self-consistent equation for the quark condensate ⟨φ̄φ⟩ is derived. A nontrivial solution for ⟨φ̄φ⟩ ≠ 0 is given in terms of the QCD scale parameter Λ
The consistency assessment of topological relations in cartographic generalization
Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu
2006-10-01
The field of research in the generalization assessment has been less studied than the generalization process itself, and it is very important to keep topological relation consistency for meeting generalization quality. This paper proposes a methodology to assess the quality of generalized map from topological relations consistency. Taking roads (including railway) and residential areas for examples, from the viewpoint of the spatial cognition, some issues about topological consistency in different map scales are analyzed. The statistic information about the inconsistent topological relations can be obtained by comparing the two matrices: one is the matrix for the topological relations in the generalized map; the other is the theoretical matrix for the topological relations that should be maintained after generalization. Based on the fuzzy set theory and the classification of map object types, the consistency evaluation model of topological relations is established. The paper proves the feasibility of the method through the example about how to evaluate the local topological relations between simple roads and residential area finally.
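The matrix comparison at the heart of the method can be sketched directly: count the positions where the topological relation observed in the generalized map disagrees with the relation that should have been maintained. The relation labels and example matrices below are hypothetical.

```python
# Illustrative sketch of the paper's matrix comparison: the observed
# relation matrix of the generalized map versus the theoretical matrix
# of relations that should survive generalization.

def inconsistency_count(observed, expected):
    """Both arguments: n x n matrices of relation labels between map objects."""
    mismatches = [(i, j)
                  for i, row in enumerate(observed)
                  for j, rel in enumerate(row)
                  if rel != expected[i][j]]
    return len(mismatches), mismatches

expected = [["disjoint", "touches"],
            ["touches",  "disjoint"]]
observed = [["disjoint", "overlaps"],   # generalization moved a road onto an area
            ["touches",  "disjoint"]]
n, where = inconsistency_count(observed, expected)
print(n, where)  # 1 inconsistent pair, at position (0, 1)
```

The paper's evaluation model then weights these mismatches by object type and fuzzy-set membership rather than counting them uniformly, but the raw statistic comes from exactly this kind of element-wise comparison.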
Numerical consistency check between two approaches to radiative ...
Indian Academy of Sciences (India)
approaches for a consistency check on numerical accuracy, and find out the stabil- ... ln(M_R/1 GeV) to top-quark mass scale t_0 (= ln(m_t/1 GeV)) where t_0 ≤ t ≤ t_R, we ..... It is in general to tone down the solar mixing angle through further fine.
Consistency or Discrepancy? Rethinking Schools from Organizational Hypocrisy to Integrity
Kiliçoglu, Gökhan
2017-01-01
Consistency in statements, decisions and practices is highly important for both organization members and the image of an organization. It is expected from organizations, especially from their administrators, to "walk the talk"--in other words, to try to practise what they preach. However, in the process of gaining legitimacy and adapting…
Consistency Check for the Bin Packing Constraint Revisited
Dupuis, Julien; Schaus, Pierre; Deville, Yves
The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
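The objective the abstract states can be bracketed cheaply: ceil(total size / C) is a lower bound on the number of bins, and the first-fit decreasing heuristic gives an upper bound. The item sizes below are illustrative; the paper itself is about a constraint-propagation check, not this heuristic.

```python
# Bracketing the bin packing optimum: a volume lower bound versus the
# first-fit decreasing upper bound. All bins share the common capacity C.
import math

def first_fit_decreasing(items, capacity):
    """Pack item sizes into bins, largest first, each into the first bin it fits."""
    bins = []                                 # remaining free space per bin
    for size in sorted(items, reverse=True):
        for i, free in enumerate(bins):
            if size <= free:                  # fits in an existing bin
                bins[i] -= size
                break
        else:
            bins.append(capacity - size)      # open a new bin
    return len(bins)

items = [4, 8, 1, 4, 2, 1]
C = 10
lower = math.ceil(sum(items) / C)             # at least ceil(20/10) = 2 bins
upper = first_fit_decreasing(items, C)
print(lower, upper)
```

When the two bounds meet, as they do here, the heuristic packing is provably optimal; a bin packing constraint propagator uses tighter reasoning of this kind to prune infeasible bin counts.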
Matrix analysis for associated consistency in cooperative game theory
Xu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache's recent axiomatization of the well-known Shapley value for TU games states that the Shapley value is the unique solution verifying the following three axioms: the inessential game property, continuity and associated consistency. Driessen extended Hamiache's axiomatization to the enlarged
Matrix analysis for associated consistency in cooperative game theory
Xu Genjiu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache axiomatized the Shapley value as the unique solution verifying the inessential game property, continuity and associated consistency. Driessen extended Hamiache’s axiomatization to the enlarged class of efficient, symmetric, and linear values. In this paper, we introduce the notion of row