NONE
1995-08-01
An aerial radiological survey was conducted over the Project Gasbuggy site, 55 miles (89 kilometers) east of Farmington, New Mexico, on October 27, 1994. Parallel lines were flown at intervals of 300 feet (91 meters) over a 16-square-mile (41-square-kilometer) area at a 150-foot (46-meter) altitude centered on the Gasbuggy site. The gamma energy spectra obtained were reduced to an exposure rate contour map overlaid on a high altitude aerial photograph of the area. The terrestrial exposure rate varied from 14 to 20 µR/h at 1 meter above ground level. No anomalous or man-made isotopes were found.
Site Characterization Work Plan for Gasbuggy, New Mexico
DOE/NV
2000-12-14
Project Gasbuggy was the first of three joint government-industry experiments conducted to test the effectiveness of nuclear explosives to fracture deeply buried, low-permeability natural gas reservoirs to stimulate production. The scope of this work plan is to document the environmental objectives and the proposed technical site investigation strategies that will be utilized for the site characterization of the Project Gasbuggy Site. Its goal is the collection of data in sufficient quantity and quality to determine current site conditions, support a risk assessment for the site surfaces, and evaluate whether further remedial action is required to achieve permanent closure of the site that is protective of both human health and the environment. The Gasbuggy Site is located approximately 55 air miles east of Farmington, New Mexico, in Rio Arriba County within the Carson National Forest in the northeast portion of the San Juan Basin. Historically, Project Gasbuggy consisted of the joint government-industry detonation of a nuclear device on December 10, 1967, followed by reentry drilling and gas production testing and project evaluation activities in post-detonation operations from 1967 to 1976. Based on historical documentation, no chemical release sites other than the mud pits were identified; additionally, there was no material buried at the Gasbuggy Site other than drilling fluids and construction debris. Although previous characterization and restoration activities, including sensitive species surveys, cultural resources surveys, surface geophysical surveys, and limited soil sampling and analysis, were performed in 1978 and again in 2000, no formal closure of the site was achieved. Also, these efforts did not adequately address the site's potential for chemical contamination at the surface/shallow subsurface ground levels or the subsurface hazards for potential migration outside of the current site subsurface intrusion restrictions. Additional investigation
Site Characterization Work Plan for Gasbuggy, New Mexico (Rev.1, Jan. 2002)
U.S. Department of Energy, National Nuclear Security Administration Nevada Operations Office (NNSA/NV)
2002-01-25
Project Gasbuggy was the first of three joint government-industry experiments conducted to test the effectiveness of nuclear explosives to fracture deeply buried, low-permeability natural gas reservoirs to stimulate production. The scope of this work plan is to document the environmental objectives and the proposed technical site investigation strategies that will be utilized for the site characterization of the Project Gasbuggy Site. Its goal is the collection of data in sufficient quantity and quality to determine current site conditions, support a risk assessment for the site surfaces, and evaluate whether further remedial action is required to achieve permanent closure of the site that is protective of both human health and the environment. The Gasbuggy Site is located approximately 55 air miles east of Farmington, New Mexico, in Rio Arriba County within the Carson National Forest in the northeast portion of the San Juan Basin. Historically, Project Gasbuggy consisted of the joint government-industry detonation of a nuclear device on December 10, 1967, followed by reentry drilling and gas production testing and project evaluation activities in post-detonation operations from 1967 to 1976. Based on historical documentation, no chemical release sites other than the mud pits were identified; additionally, there was no material buried at the Gasbuggy Site other than drilling fluids and construction debris. Although previous characterization and restoration activities, including sensitive species surveys, cultural resources surveys, surface geophysical surveys, and limited soil sampling and analysis, were performed in 1978 and again in 2000, no formal closure of the site was achieved. Also, these efforts did not adequately address the site's potential for chemical contamination at the surface/shallow subsurface ground levels or the subsurface hazards for potential migration outside of the current site subsurface intrusion restrictions. Additional investigation
Gasbuggy, New Mexico Long-Term Hydrologic Monitoring Program Evaluation Report
None
2009-06-01
This report summarizes an evaluation of the Long-Term Hydrologic Monitoring Program (LTHMP) that has been conducted since 1972 at the Gasbuggy, New Mexico, underground nuclear detonation site. The nuclear testing was conducted by the U.S. Atomic Energy Commission under the Plowshare program, which is discussed in greater detail in Appendix A. The detonation at Gasbuggy took place in 1967, 4,240 feet below ground surface, and was designed to fracture the host rock of a low-permeability natural gas-bearing formation in an effort to improve gas production. The site has historically been managed under the Nevada Offsites Project. These underground nuclear detonation sites are within the United States but outside of the Nevada Test Site, where most of the experimental nuclear detonations conducted by the U.S. Government took place. Gasbuggy is managed by the U.S. Department of Energy (DOE) Office of Legacy Management (LM).
Surface radioactivity at the plowshare gas-stimulation test sites: Gasbuggy, Rulison, Rio Blanco
Faller, S.H.
1995-01-01
A surface soil characterization was conducted at three former underground nuclear test sites: Gasbuggy, New Mexico; Rulison, Colorado; and Rio Blanco, Colorado. The abundances of man-made and naturally occurring radionuclides were determined, along with their contributions to total exposure rate. Cs-137 was the only man-made radionuclide detected in the study and was highest at undisturbed locations with little forest litter cover. The amounts observed are consistent with radiocesium fallout concentrations observed in other parts of the United States.
Gasbuggy, New Mexico, Hydrologic and Natural Gas Sampling and Analysis Results for 2009
None
2009-11-01
The U.S. Department of Energy (DOE) Office of Legacy Management conducted hydrologic and natural gas sampling for the Gasbuggy, New Mexico, site on June 16 and 17, 2009. Hydrologic sampling consists of collecting water samples from water wells and surface water locations. Natural gas sampling consists of collecting both gas samples and samples of produced water from gas production wells. The water well samples were analyzed for gamma-emitting radionuclides and tritium. Surface water samples were analyzed for tritium. Water samples from gas production wells were analyzed for gamma-emitting radionuclides, gross alpha, gross beta, and tritium. Natural gas samples were analyzed for tritium and carbon-14. Water samples were analyzed by ALS Laboratory Group in Fort Collins, Colorado, and natural gas samples were analyzed by Isotech Laboratories in Champaign, Illinois. Concentrations of tritium and gamma-emitting radionuclides in water samples collected in the vicinity of the Gasbuggy site continue to demonstrate that the sample locations have not been impacted by detonation-related contaminants. Results from the sampling of natural gas from producing wells demonstrate that the gas wells nearest the Gasbuggy site are not currently impacted by detonation-related contaminants. Annual sampling of the gas production wells nearest the Gasbuggy site for gas and produced water will continue for the foreseeable future. The sampling frequency of water wells and surface water sources in the surrounding area will be reduced to once every 5 years. The next hydrologic sampling event at water wells, springs, and ponds will be in 2014.
Gasbuggy Site Assessment and Risk Evaluation
None
2011-03-01
This report describes the geologic and hydrologic conditions and evaluates potential health risks to workers in the natural gas industry in the vicinity of the Gasbuggy, New Mexico, site, where the U.S. Atomic Energy Commission detonated an underground nuclear device in 1967. The 29-kiloton detonation took place 4,240 feet below ground surface and was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation in the San Juan Basin, Rio Arriba County, New Mexico, on land administered by Carson National Forest. A site-specific conceptual model was developed based on current understanding of the hydrologic and geologic environment. This conceptual model was used for establishing plausible contaminant exposure scenarios, which were then evaluated for human health risk potential. The most mobile and, therefore, the most probable contaminant that could result in human exposure is tritium. Natural gas production wells were identified as having the greatest potential for bringing detonation-derived contaminants (tritium) to the ground surface in the form of tritiated produced water. Three exposure scenarios addressing potential contamination from gas wells were considered in the risk evaluation: a gas well worker during gas-well-drilling operations, a gas well worker performing routine maintenance, and a residential exposure. The residential exposure scenario was evaluated only for comparison; permanent residences on national forest lands at the Gasbuggy site are prohibited.
Gasbuggy Site Assessment and Risk Evaluation
None
2011-03-01
The Gasbuggy site is in northern New Mexico in the San Juan Basin, Rio Arriba County (Figure 1-1). The Gasbuggy experiment was designed to evaluate the use of a nuclear detonation to enhance natural gas production from the Pictured Cliffs Formation, a tight, gas-bearing sandstone formation. The 29-kiloton-yield nuclear device was placed in a 17.5-inch wellbore at 4,240 feet (ft) below ground surface (bgs), approximately 40 ft below the Pictured Cliffs/Lewis shale contact, in an attempt to force the cavity/chimney formed by the detonation up into the Pictured Cliffs Sandstone. The test was conducted below the southwest quarter of Section 36, Township 29 North, Range 4 West, New Mexico Principal Meridian. The device was detonated on December 10, 1967, creating a 335-ft-high chimney above the detonation point and a cavity 160 ft in diameter. The gas produced from GB-ER (the emplacement and reentry well) during the post-detonation production tests was radioactive and diluted, primarily by carbon dioxide. After 2 years, the energy content of the gas had recovered to 80 percent of the value of gas in conventionally developed wells in the area. There is currently no technology capable of remediating deep underground nuclear detonation cavities and chimneys. Consequently, the U.S. Department of Energy (DOE) must continue to manage the Gasbuggy site to ensure that no inadvertent intrusion into the residual contamination occurs. DOE has complete control over the 1/4 section (160 acres) containing the shot cavity, and no drilling is permitted on that property. However, oil and gas leases are on the surrounding land. Therefore, the most likely route of intrusion and potential exposure would be through contaminated natural gas or contaminated water migrating into a producing natural gas well outside the immediate vicinity of ground zero. The purpose of this report is to describe the current site conditions and evaluate the potential health risks posed by the most plausible
Gasbuggy, New Mexico, Natural Gas and Produced Water Sampling and Analysis Results for 2011
None
2011-09-01
The U.S. Department of Energy (DOE) Office of Legacy Management conducted natural gas sampling for the Gasbuggy, New Mexico, site on June 7 and 8, 2011. Natural gas sampling consists of collecting both gas samples and samples of produced water from gas production wells. Water samples from gas production wells were analyzed for gamma-emitting radionuclides, gross alpha, gross beta, and tritium. Natural gas samples were analyzed for tritium and carbon-14. ALS Laboratory Group in Fort Collins, Colorado, analyzed water samples. Isotech Laboratories in Champaign, Illinois, analyzed natural gas samples.
Quadratically consistent projection from particles to mesh
Duque, Daniel
2016-01-01
The advantage of particle Lagrangian methods in computational fluid dynamics is that advection is accurately modeled. However, this complicates the calculation of space derivatives. If a mesh is employed, it must be updated at each time step. On the other hand, fixed mesh, Eulerian, formulations benefit from the mesh being defined at the beginning of the simulation, but feature non-linear advection terms. It therefore seems natural to combine the two approaches, using a fixed mesh to perform calculations related to space derivatives, and using the particles to advect the information with time. The idea of combining Lagrangian particles and a fixed mesh goes back to Particle-in-Cell methods, and is here considered within the context of the finite element method (FEM) for the fixed mesh, and the particle FEM (pFEM) for the particles. Our results, in agreement with recent works, show that interpolation ("projection") errors, especially from particles to mesh, are the culprits of slow convergence of the method if...
July 2010 Natural Gas and Produced Water Sampling at the Gasbuggy, New Mexico, Site
Plessinger, Mark [S.M. Stoller Corporation, Broomfield, CO (United States)]
2011-01-01
Annual natural gas and produced water monitoring was conducted for gas wells adjacent to Section 36, where the Gasbuggy test was conducted, in accordance with the draft Long-Term Surveillance and Maintenance Plan for the Gasbuggy Site, Rio Arriba County, New Mexico. Sampling and analysis were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PLN/S04351, continually updated). Natural gas samples were collected for tritium and carbon-14 analyses. Produced water samples were collected and analyzed for tritium, gamma-emitting radionuclides (by high-resolution gamma spectrometry), gross alpha, and gross beta. An additional water sample was collected from well 29-6 Water Hole for analysis of tritium and gamma-emitting radionuclides. A duplicate produced water sample was collected from well 30-039-21743.
June 2011 Natural Gas and Produced Water Sampling at the Gasbuggy, New Mexico, Site
None
2011-10-01
Annual natural gas and produced water monitoring was conducted for gas wells adjacent to Section 36, where the Gasbuggy test was conducted, in accordance with the draft Long-Term Surveillance and Maintenance Plan for the Gasbuggy Site, Rio Arriba County, New Mexico. Sampling and analysis were conducted as specified in the Sampling and Analysis Plan for U.S. Department of Energy Office of Legacy Management Sites (LMS/PLN/S04351, continually updated). Natural gas samples were collected for tritium and carbon-14 analyses. Produced water samples were collected and analyzed for tritium, gamma-emitting radionuclides (by high-resolution gamma spectrometry), gross alpha, and gross beta. A duplicate produced water sample was collected from well 30-039-21743. Produced water samples were not collected at locations 30-039-30161 and 30-039-21744 because of the lack of water. Samples were not collected from location 30-039-29988 because the well was shut-in.
Gasbuggy, New Mexico, Natural Gas and Produced Water Sampling Results for 2012
None
2012-12-01
The U.S. Department of Energy (DOE) Office of Legacy Management conducted annual natural gas sampling for the Gasbuggy, New Mexico, Site on June 20 and 21, 2012. This long-term monitoring of natural gas includes samples of produced water from gas production wells that are located near the site. Water samples from gas production wells were analyzed for gamma-emitting radionuclides, gross alpha, gross beta, and tritium. Natural gas samples were analyzed for tritium and carbon-14. ALS Laboratory Group in Fort Collins, Colorado, analyzed water samples. Isotech Laboratories in Champaign, Illinois, analyzed natural gas samples.
Full data consistency conditions for cone-beam projections with sources on a plane.
Clackdoyle, Rolf; Desbat, Laurent
2013-12-07
Cone-beam consistency conditions (also known as range conditions) are mathematical relationships between different cone-beam projections, and they therefore describe the redundancy or overlap of information between projections. These redundancies have often been exploited for applications in image reconstruction. In this work we describe new consistency conditions for cone-beam projections whose source positions lie on a plane. A further restriction is that the target object must not intersect this plane. The conditions require that moments of the cone-beam projections be polynomial functions of the source positions, with some additional constraints on the coefficients of the polynomials. A precise description of the consistency conditions is that the four parameters of the cone-beam projections (two for the detector, two for the source position) can be expressed with just three variables, using a certain formulation involving homogeneous polynomials. The main contribution of this work is our demonstration that these conditions are not only necessary, but also sufficient. Thus the consistency conditions completely characterize all redundancies, so no other independent conditions are possible and in this sense the conditions are full. The idea of the proof is to use the known consistency conditions for 3D parallel projections, and to then apply a 1996 theorem of Edholm and Danielsson that links parallel to cone-beam projections. The consistency conditions are illustrated with a simulation example.
Michaels, Patrick J; Christy, John R; Herman, Chad S; Liljegren, Lucia M; Annan, James D
2013-01-01
Assessing the consistency between short-term global temperature trends in observations and climate model projections is a challenging problem. While climate models capture many processes governing short-term climate fluctuations, they are not expected to simulate the specific timing of these somewhat random phenomena - the occurrence of which may impact the realized trend. Therefore, to assess model performance, we develop distributions of projected temperature trends from a collection of climate models running the IPCC A1B emissions scenario. We evaluate where observed trends of length 5 to 15 years fall within the distribution of model trends of the same length. We find that current trends lie near the lower limits of the model distributions, with cumulative probability-of-occurrence values typically between 5 percent and 20 percent, and probabilities below 5 percent not uncommon. Our results indicate cause for concern regarding the consistency between climate model projections and observed climate behavior...
Relativistic Consistent Angular-Momentum Projected Shell Model: Relativistic Mean Field
LI Yan-Song; LONG Gui-Lu
2004-01-01
We develop a relativistic nuclear structure model, the relativistic consistent angular-momentum projected shell model (RECAPS), which combines relativistic mean-field theory with the angular-momentum projection method. In this new model, nuclear ground-state properties are first calculated consistently using relativistic mean-field (RMF) theory. Then the angular-momentum projection method is used to project out states with good angular momentum from a few important configurations. By diagonalizing the Hamiltonian, the energy levels and wave functions are obtained. This model is a new attempt at understanding the nuclear structure of normal nuclei and at predicting the nuclear properties of nuclei far from stability. In this paper, we describe the treatment of the relativistic mean field. A computer code, RECAPS-RMF, is developed. It solves the relativistic mean field with axially symmetric deformation in the spherical harmonic oscillator basis. Comparisons between our calculations and existing relativistic mean-field calculations are made to test the model. These include the ground-state properties of the spherical nuclei 16O and 208Pb, and the deformed nucleus 20Ne. Good agreement is obtained.
Pérez-Pérez, Martín; Glez-Peña, Daniel; Fdez-Riverola, Florentino; Lourenço, Anália
2015-02-01
Document annotation is a key task in the development of Text Mining methods and applications. High quality annotated corpora are invaluable, but their preparation requires a considerable amount of resources and time. Although the existing annotation tools offer good user interaction interfaces to domain experts, project management and quality control abilities are still limited. Therefore, the current work introduces Marky, a new Web-based document annotation tool equipped to manage multi-user and iterative projects, and to evaluate annotation quality throughout the project life cycle. At the core, Marky is a Web application based on the open source CakePHP framework. User interface relies on HTML5 and CSS3 technologies. Rangy library assists in browser-independent implementation of common DOM range and selection tasks, and Ajax and JQuery technologies are used to enhance user-system interaction. Marky grants solid management of inter- and intra-annotator work. Most notably, its annotation tracking system supports systematic and on-demand agreement analysis and annotation amendment. Each annotator may work over documents as usual, but all the annotations made are saved by the tracking system and may be further compared. So, the project administrator is able to evaluate annotation consistency among annotators and across rounds of annotation, while annotators are able to reject or amend subsets of annotations made in previous rounds. As a side effect, the tracking system minimises resource and time consumption. Marky is a novel environment for managing multi-user and iterative document annotation projects. Compared to other tools, Marky offers a similar visually intuitive annotation experience while providing unique means to minimise annotation effort and enforce annotation quality, and therefore corpus consistency. Marky is freely available for non-commercial use at http://sing.ei.uvigo.es/marky.
Large rainfall changes consistently projected over substantial areas of tropical land
Chadwick, Robin; Good, Peter; Martin, Gill; Rowell, David P.
2016-02-01
Many tropical countries are exceptionally vulnerable to changes in rainfall patterns, with floods or droughts often severely affecting human life and health, food and water supplies, ecosystems and infrastructure. There is widespread disagreement among climate model projections of how and where rainfall will change over tropical land at the regional scales relevant to impacts, with different models predicting the position of current tropical wet and dry regions to shift in different ways. Here we show that despite uncertainty in the location of future rainfall shifts, climate models consistently project that large rainfall changes will occur for a considerable proportion of tropical land over the twenty-first century. The area of semi-arid land affected by large changes under a higher emissions scenario is likely to be greater than during even the most extreme regional wet or dry periods of the twentieth century, such as the Sahel drought of the late 1960s to 1990s. Substantial changes are projected to occur by mid-century--earlier than previously expected--and to intensify in line with global temperature rise. Therefore, current climate projections contain quantitative, decision-relevant information on future regional rainfall changes, particularly with regard to climate change mitigation policy.
A Consistent Fuzzy Preference Relations Based ANP Model for R&D Project Selection
Chia-Hua Cheng
2017-08-01
In today’s rapidly changing economy, technology companies have to make decisions on research and development (R&D) project investment on a routine basis, with such decisions having a direct impact on that company’s profitability, sustainability and future growth. Companies seeking profitable opportunities for investment and project selection must consider many factors such as resource limitations and differences in assessment, with consideration of both qualitative and quantitative criteria. Often, differences in perception by the various stakeholders hinder the attainment of a consensus of opinion and coordination efforts. Thus, in this study, a hybrid model is developed for the consideration of the complex criteria taking into account the different opinions of the various stakeholders who often come from different departments within the company and have different opinions about which direction to take. The decision-making trial and evaluation laboratory (DEMATEL) approach is used to convert the cause and effect relations representing the criteria into a visual network structure. A consistent fuzzy preference relations based analytic network process (CFPR-ANP) method is developed to calculate the preference-weights of the criteria based on the derived network structure. The CFPR-ANP is an improvement over the original analytic network process (ANP) method in that it reduces the problem of inconsistency as well as the number of pairwise comparisons. The combined complex proportional assessment (COPRAS-G) method is applied with fuzzy grey relations to resolve conflicts arising from differences in information and opinions provided by the different stakeholders about the selection of the most suitable R&D projects. This novel combination approach is then used to assist an international brand-name company to prioritize projects and make project decisions that will maximize returns and ensure sustainability for the company.
Ni, Ming-Jiu
2009-10-01
Two consistent projection methods of second-order temporal and spatial accuracy have been developed on a rectangular collocated mesh for variable density Navier-Stokes equations with a continuous surface force. Instead of the original projection methods (denoted as algorithms I and II in this paper), in which the updated cell center velocity from the intermediate velocity and the pressure gradient is not guaranteed solenoidal, the consistent projection methods (denoted as algorithms III and IV) obtain the cell center velocity based on an interpolation from conservative fluxes with velocity units on surrounding cell faces. Depending on the treatment of the continuous surface force, the pressure gradient in algorithm III or the sum of the pressure gradient and the surface force in algorithm IV at a cell center is then computed from the difference between the updated velocity and the intermediate velocity in a consistent projection method. A non-viscous 3D static drop with a series of density ratios is numerically simulated. Using the consistent projection methods, the spurious currents can be greatly reduced and the pressure jump across the interface can be accurately captured without oscillations. The developed consistent projection methods are also applied to simulate the interface evolution of an initial ellipse driven by surface tension and of an initially spherical bubble driven by buoyancy, with good accuracy and good resolution.
The relativistic consistent angular-momentum projected shell model study of the N=Z nucleus 52Fe
LI YanSong; LONG GuiLu
2009-01-01
The relativistic consistent angular-momentum projected shell model (RECAPS) is used in the study of the structure and electromagnetic transitions of the low-lying states in the N=Z nucleus 52Fe. The model calculations show a reasonably good agreement with the data. The backbending at 12+ is reproduced, and the energy level structure suggests that neutron-proton interactions play important roles.
Olsen, Are; Key, Robert M.; van Heuven, Steven; Lauvset, Siv K.; Velo, Anton; Lin, Xiaohua; Schirnick, Carsten; Kozyr, Alex; Tanhua, Toste; Hoppema, Mario; Jutterström, Sara; Steinfeldt, Reiner; Jeansson, Emil; Ishii, Masao; Pérez, Fiz F.; Suzuki, Toru
2016-08-01
Version 2 of the Global Ocean Data Analysis Project (GLODAPv2) data product is composed of data from 724 scientific cruises covering the global ocean. It includes data assembled during the previous efforts GLODAPv1.1 (Global Ocean Data Analysis Project version 1.1) in 2004, CARINA (CARbon IN the Atlantic) in 2009/2010, and PACIFICA (PACIFic ocean Interior CArbon) in 2013, as well as data from an additional 168 cruises. Data for 12 core variables (salinity, oxygen, nitrate, silicate, phosphate, dissolved inorganic carbon, total alkalinity, pH, CFC-11, CFC-12, CFC-113, and CCl4) have been subjected to extensive quality control, including systematic evaluation of bias. The data are available in two formats: (i) as submitted but updated to WOCE exchange format and (ii) as a merged and internally consistent data product. In the latter, adjustments have been applied to remove significant biases, respecting occurrences of any known or likely time trends or variations. Adjustments applied by previous efforts were re-evaluated. Hence, GLODAPv2 is not a simple merging of previous products with some new data added but a unique, internally consistent data product. This compiled and adjusted data product is believed to be consistent to better than 0.005 in salinity, 1 % in oxygen, 2 % in nitrate, 2 % in silicate, 2 % in phosphate, 4 µmol kg-1 in dissolved inorganic carbon, 6 µmol kg-1 in total alkalinity, 0.005 in pH, and 5 % for the halogenated transient tracers.The original data and their documentation and doi codes are available at the Carbon Dioxide Information Analysis Center (http://cdiac.ornl.gov/oceans/GLODAPv2/). This site also provides access to the calibrated data product, which is provided as a single global file or four regional ones - the Arctic, Atlantic, Indian, and Pacific oceans - under the doi:10.3334/CDIAC/OTG.NDP093_GLODAPv2. The product files also include significant ancillary and approximated data. These were obtained by interpolation of, or calculation
Sahoo, A. K.; Pan, M.; Gao, H.; Wood, E. F.; Houser, P. R.; Lettenmaier, D. P.; Pinker, R.; Kummerow, C. D.
2008-12-01
We aim to develop consistent, long-term Earth System Data Records (ESDRs) for the major components (storages and fluxes) of the terrestrial water cycle at a spatial resolution of 0.5 degrees (latitude-longitude) and for the period 1950 to near-present. The resulting ESDRs are intended to provide a consistent basis for estimating the mean state and variability of the land surface water cycle at the spatial scale of the major global river basins. The ESDRs to be produced include a) surface meteorology (precipitation, air temperature, humidity and wind), b) surface downward radiation (solar and longwave), and c) derived and/or assimilated fluxes and storages such as surface soil moisture storage, total basin water storage, snow water equivalent, storage in large lakes, reservoirs, and wetlands, evapotranspiration, and surface runoff. We construct data records for all variables back to 1950, recognizing that the post-satellite data will be of higher quality than pre-satellite data (a reasonable compromise given the need for long-term records to define interannual and interdecadal variability of key water cycle variables). A distinguishing feature will be inclusion of two variables that reflect the massive effects of anthropogenic manipulation of the terrestrial water cycle, specifically reservoir storage and irrigation water use. The overall goal of the project is to develop long-term, consistent ESDRs for terrestrial water cycle states and variables by updating and extending previously funded Pathfinder data set activities of the investigators, and by making the data set available to the scientific community and data users via a state-of-the-art internet web portal. The ESDRs will utilize algorithms and methods that are well documented in the peer-reviewed literature. The ESDRs will merge satellite-derived products with predictions of the same variables by LSMs driven by merged satellite and in situ forcing data sets (most notably precipitation), with the constraint that the
Daniel T. L. Shek
2012-01-01
Subjective outcome evaluation findings based on the perspective of the participants of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in nine datasets collected from 2005 to 2009 (n = 206,313 program participants) were examined in this paper. Based on the consolidated data with schools as units, results showed that the participants generally had positive perceptions of the program, implementers, and benefits of the program. More than four-fifths of the participants regarded the program as beneficial to their holistic development. Multiple regression analysis revealed that the perceived qualities of the program and the program implementers predicted perceived effectiveness of the program. Based on the subjective outcome evaluation findings, the present study provides support for the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. in Hong Kong.
Shek, Daniel T L; Sun, Rachel C F
2012-01-01
Subjective outcome evaluation findings based on the perspective of the participants of the Project P.A.T.H.S. (Positive Adolescent Training through Holistic Social Programmes) in nine datasets collected from 2005 to 2009 (n = 206,313 program participants) were examined in this paper. Based on the consolidated data with schools as units, results showed that the participants generally had positive perceptions of the program, implementers, and benefits of the program. More than four-fifths of the participants regarded the program as beneficial to their holistic development. Multiple regression analysis revealed that the perceived qualities of the program and the program implementers predicted perceived effectiveness of the program. Based on the subjective outcome evaluation findings, the present study provides support for the effectiveness of the Tier 1 Program of the Project P.A.T.H.S. in Hong Kong.
Schnell, D J; Galavotti, C; Fishbein, M; Chan, D K
1996-01-01
The stages of behavior change model has been used to understand a variety of health behaviors. Since consistent condom use has been promoted as a risk-reduction behavior for prevention of human immunodeficiency virus (HIV) infection, an algorithm for staging the adoption of consistent condom use during vaginal sex was empirically developed using three considerations: HIV prevention efficacy, analogy with work on staging other health-related behaviors, and condom use data from groups at high risk for HIV infection. This algorithm suggests that the adoption of consistent condom use among persons at high risk can be meaningfully measured with the model. However, variations in the algorithm details affect both the interpretation of stages and apportionment of persons across stages.
Hannam, Mark; Baker, John G; Boyle, Michael; Bruegmann, Bernd; Chu, Tony; Dorband, Nils; Herrmann, Frank; Hinder, Ian; Kelly, Bernard J; Kidder, Lawrence E; Laguna, Pablo; Matthews, Keith D; van Meter, James R; Pfeiffer, Harald P; Pollney, Denis; Reisswig, Christian; Scheel, Mark A; Shoemaker, Deirdre
2009-01-01
We quantify the consistency of numerical-relativity black-hole-binary waveforms for use in gravitational-wave (GW) searches with current and planned ground-based detectors. We compare previously published results for the $(\ell=2, |m|=2)$ mode of the gravitational waves from an equal-mass nonspinning binary, calculated by five numerical codes. We focus on the 1000M (about six orbits, or 12 GW cycles) before the peak of the GW amplitude and the subsequent ringdown. We find that the phase and amplitude agree within each code's uncertainty estimates. The mismatch between the $(\ell=2, |m|=2)$ modes is better than $10^{-3}$ for binary masses above $60 M_{\odot}$ with respect to the Enhanced LIGO detector noise curve, and for masses above $180 M_{\odot}$ with respect to Advanced LIGO, Virgo and Advanced Virgo. Between the waveforms with the best agreement, the mismatch is below $2 \times 10^{-4}$. We find that the waveforms would be indistinguishable in all ground-based detectors (and for the masses we consider...
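The mismatch figure used in this abstract is one minus the noise-weighted overlap between two waveforms, maximized over relative time and phase shifts. A minimal sketch under simplifying assumptions (white noise instead of a detector noise curve, toy windowed sinusoids instead of numerical-relativity waveforms; all names are hypothetical):

```python
import numpy as np

def mismatch(h1, h2):
    """1 - normalized overlap, maximized over time shift via FFT cross-correlation.
    Assumes white noise (flat PSD); real searches weight by the detector noise curve."""
    H1, H2 = np.fft.fft(h1), np.fft.fft(h2)
    corr = np.fft.ifft(H1 * np.conj(H2))   # overlap at every relative (circular) time shift
    overlap = np.max(np.abs(corr))         # maximize over time shift and overall sign/phase
    norm = np.sqrt(np.sum(np.abs(h1)**2) * np.sum(np.abs(h2)**2))
    return 1.0 - overlap / norm

t = np.linspace(0, 1, 4096, endpoint=False)
h_a = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) / 0.2)**2)
h_b = np.roll(h_a, 17)                     # same waveform, shifted in time
print(mismatch(h_a, h_b))                  # ~0: identical up to a time shift
```

The key design point is that maximizing the cross-correlation over all lags is what makes the comparison insensitive to the arbitrary time alignment of waveforms produced by different codes.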
Borrajo, M.; Egido, J.L. [Universidad Autonoma de Madrid, Departamento de Fisica Teorica, Madrid (Spain)
2016-09-15
We present an approach for the calculation of odd nuclei with exact self-consistent blocking and particle number and angular-momentum projection with the finite-range density-dependent Gogny force. As an application we calculate the nucleus ³¹Mg at the border of the N = 20 inversion island. We evaluate the ground-state properties, the excited states and the transition probabilities. In general we obtain a good description of the measured observables.
Stapelberg, Stefan; Finkensieper, Stephan; Stengel, Martin; Schlundt, Cornelia; Sus, Oliver; Hollmann, Rainer; Poulsen, Caroline; ESA Cloud cci Team
2016-04-01
In 2010 the ESA Climate Change Initiative (CCI) Cloud project was started along with 12 other CCI projects covering atmospheric, oceanic and terrestrial "essential climate variables (ECV)". The main goal is the generation of satellite-based climate data records that meet the challenging requirements of the Global Climate Observing System. The objective within the ESA Cloud_cci project is the generation of long-term coherent cloud property datasets covering 33 years that also provide mathematically consistent uncertainty information following optimal estimation (OE) retrieval theory. The cloud properties considered are cloud mask, cloud top level estimates, cloud thermodynamic phase, cloud optical thickness, cloud effective radius and post-processed parameters such as cloud liquid and ice water path. In this presentation we will discuss the benefit of using an optimal estimation retrieval framework, which provides consistency among the retrieved cloud variables and pixel-based uncertainty estimates, based on different passive instruments such as AVHRR, MODIS and AATSR. We will summarize the results of the project so far along with ongoing further developments that are currently taking place. Our results will be compared with other well-established satellite data records, surface observations and cloud climatologies (e.g., PATMOS-X, ISCCP, CLARA-A2, MODIS collection 6, SYNOP). These inter-comparison results will indicate the strengths and weaknesses of the Cloud_cci datasets. Finally, we will present long-term time series of the retrieved cloud variables for AVHRR (1982-2014) that enable global, multi-decadal analyses of clouds.
Staunstrup, Jørgen
1998-01-01
This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
Bordin, Lorenzo; Creminelli, Paolo; Mirbabayi, Mehrdad; Noreña, Jorge
2017-03-01
We argue that isotropic scalar fluctuations in solid inflation are adiabatic in the super-horizon limit. During the solid phase this adiabatic mode has peculiar features: constant energy-density slices and comoving slices do not coincide, and their curvatures, parameterized respectively by ζ and ℛ, both evolve in time. The existence of this adiabatic mode implies that Maldacena's squeezed-limit consistency relation holds after angular average over the long mode. The correlation functions of a long-wavelength spherical scalar mode with several short scalar or tensor modes are fixed by the scaling behavior of the correlators of short modes, independently of the solid inflation action or dynamics of reheating.
Kent, A
1996-01-01
In the consistent histories formulation of quantum theory, the probabilistic predictions and retrodictions made from observed data depend on the choice of a consistent set. We show that this freedom allows the formalism to retrodict several contradictory propositions which correspond to orthogonal commuting projections and which all have probability one. We also show that the formalism makes contradictory probability one predictions when applied to generalised time-symmetric quantum mechanics.
De Groot, Mark C.H.; Schlienger, Raymond; Reynolds, Robert; Gardarsdottir, Helga; Juhaeri, Juhaeri; Hesse, Ulrik; Gasse, Christiane; Rottenkolber, Marietta; Schuerch, Markus; Kurz, Xavier; Klungel, Olaf H.
2013-01-01
Background: Pharmacoepidemiological (PE) research should provide consistent, reliable and reproducible results to contribute to the benefit-risk assessment of medicines. IMI-PROTECT aims to identify sources of methodological variations in PE studies using a common protocol and analysis plan across d
Chip Multithreaded Consistency Model
Zu-Song Li; Dan-Dan Huan; Wei-Wu Hu; Zhi-Min Tang
2008-01-01
Multithreading is the developing trend of high-performance processors. The memory consistency model is essential to the correctness, performance and complexity of a multithreaded processor. A chip multithreaded consistency model adapted to multithreaded processors is proposed in this paper. The restriction imposed on memory event ordering by chip multithreaded consistency is presented and formalized. Using the critical-cycle concept introduced by Wei-Wu Hu, we prove that the proposed chip multithreaded consistency model satisfies the criterion of correct execution of the sequential consistency model. The chip multithreaded consistency model provides a way of achieving high performance compared with the sequential consistency model and ensures software compatibility: the execution result on a multithreaded processor is the same as the execution result on a uniprocessor. The implementation strategy of the chip multithreaded consistency model in the Godson-2 SMT processor is also proposed. The Godson-2 SMT processor supports the chip multithreaded consistency model correctly through an exception scheme based on the sequential memory access queue of each thread.
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of MDA is to produce software systems from abstract models with human interaction restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. Verifying consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
No consistent bimetric gravity?
Deser, S; Waldron, A
2013-01-01
We discuss the prospects for a consistent, nonlinear, partially massless (PM), gauge symmetry of bimetric gravity (BMG). Just as for single metric massive gravity, ultimate consistency of both BMG and the putative PM BMG theory relies crucially on this gauge symmetry. We argue, however, that it does not exist.
Hiscock, S.
1986-07-01
The importance of consistency in coal quality has become of increasing significance recently, with the current trend towards using coal from a range of sources. A significant development has been the swing in responsibilities for coal quality. The increasing demand for consistency in quality has led to a re-examination of where in the trade and transport chain the quality should be assessed and where further upgrading of inspection and preparation facilities are required. Changes are in progress throughout the whole coal transport chain which will improve consistency of delivered coal quality. These include installation of beneficiation plant at coal mines, export terminals, and on the premises of end users. It is suggested that one of the keys to success for the coal industry will be the ability to provide coal of a consistent quality.
Self-consistent triaxial models
Sanders, Jason L
2015-01-01
We present self-consistent triaxial stellar systems that have analytic distribution functions (DFs) expressed in terms of the actions. These provide triaxial density profiles with cores or cusps at the centre. They are the first self-consistent triaxial models with analytic DFs suitable for modelling giant ellipticals and dark haloes. Specifically, we study triaxial models that reproduce the Hernquist profile from Williams & Evans (2015), as well as flattened isochrones of the form proposed by Binney (2014). We explore the kinematics and orbital structure of these models in some detail. The models typically become more radially anisotropic on moving outwards, and have velocity ellipsoids aligned in Cartesian coordinates in the centre and aligned in spherical polar coordinates in the outer parts. In projection, the ellipticity of the isophotes and the position angle of the major axis of our models generally change with radius. So, a natural application is to elliptical galaxies that exhibit isophote twisting....
Network Consistent Data Association.
Chakraborty, Anirban; Das, Abir; Roy-Chowdhury, Amit K
2016-09-01
Existing data association techniques mostly focus on matching pairs of data-point sets and then repeating this process along space-time to achieve long-term correspondences. However, in many problems such as person re-identification, a set of data-points may be observed at multiple spatio-temporal locations and/or by multiple agents in a network, and simply combining the local pairwise association results between sets of data-points often leads to inconsistencies over the global space-time horizons. In this paper, we propose a novel Network Consistent Data Association (NCDA) framework, formulated as an optimization problem, that not only maintains consistency in association results across the network, but also improves the pairwise data association accuracies. The proposed NCDA can be solved as a binary integer program leading to a globally optimal solution and is capable of handling the challenging data-association scenario where the number of data-points varies across different sets of instances in the network. We also present an online implementation of the NCDA method that can dynamically associate new observations with already observed data-points in an iterative fashion, while maintaining network consistency. We have tested both the batch and the online NCDA in two application areas, person re-identification and spatio-temporal cell tracking, and observed consistent and highly accurate data association results in all cases.
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, the chapter presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders...
A Magnetic Consistency Relation
Jain, Rajeev Kumar
2012-01-01
If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the Cosmic Microwave Background anisotropies and Large Scale Structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistency in Distributed Systems
Kemme, Bettina; Ramalingam, Ganesan; Schiper, André; Shapiro, Marc; Vaswani, Kapil
2013-01-01
In distributed systems, there exists a fundamental trade-off between data consistency, availability, and the ability to tolerate failures. This trade-off has significant implications on the design of the entire distributed computing infrastructure such as storage systems, compilers and runtimes, application development frameworks and programming languages. Unfortunately, it also has significant, and poorly understood, implications for the designers and developers of en...
Geometrically Consistent Mesh Modification
Bonito, A.
2010-01-01
A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.
Consistent wind Facilitates Vection
Masaki Ogawa
2011-10-01
We examined whether a consistent haptic cue suggesting forward self-motion facilitated vection. We used a fan with no blades (Dyson AM01) providing a wind of constant strength and direction (wind speed 6.37 m/s) to the subjects' faces, with the visual stimuli visible through the fan. We used an optic flow of expansion or contraction created by positioning 16,000 dots at random inside a simulated cube (length 20 m), and moving the observer's viewpoint to simulate forward or backward self-motion of 16 m/s. We tested three conditions for fan operation: normal operation, normal operation with the fan reversed (i.e., no wind), and no operation (no wind and no sound). Vection was facilitated by the wind (shorter latency, longer duration and larger magnitude values) with the expansion stimuli. The fan noise did not facilitate vection. The wind neither facilitated nor inhibited vection with the contraction stimuli, perhaps because a headwind is not consistent with backward self-motion. We speculate that consistency between modalities is a key factor in facilitating vection.
Infanticide and moral consistency.
McMahan, Jeff
2013-05-01
The aim of this essay is to show that there are no easy options for those who are disturbed by the suggestion that infanticide may on occasion be morally permissible. The belief that infanticide is always wrong is doubtfully compatible with a range of widely shared moral beliefs that underlie various commonly accepted practices. Any set of beliefs about the morality of abortion, infanticide and the killing of animals that is internally consistent and even minimally credible will therefore unavoidably contain some beliefs that are counterintuitive.
Serfon, Cedric; The ATLAS collaboration
2016-01-01
One of the biggest challenges in large-scale data management systems is to ensure consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka Dark Data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain the internals and give some results.
When is holography consistent?
McInnes, Brett, E-mail: matmcinn@nus.edu.sg [National University of Singapore (Singapore); Ong, Yen Chin, E-mail: yenchin.ong@nordita.org [Nordita, KTH Royal Institute of Technology and Stockholm University, Roslagstullsbacken 23, SE-106 91 Stockholm (Sweden)
2015-09-15
Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognized; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, is satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold and, second, in the presence of angular momentum. Focusing on the application of holography to the quark–gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur. This suggests that the consistency condition is a “law of physics” expressing a particular aspect of holography.
Consistent quantum measurements
Griffiths, Robert B.
2015-11-01
In response to recent criticisms by Okon and Sudarsky, various aspects of the consistent histories (CH) resolution of the quantum measurement problem(s) are discussed using a simple Stern-Gerlach device, and compared with the alternative approaches to the measurement problem provided by spontaneous localization (GRW), Bohmian mechanics, many worlds, and standard (textbook) quantum mechanics. Among these CH is unique in solving the second measurement problem: inferring from the measurement outcome a property of the measured system at a time before the measurement took place, as is done routinely by experimental physicists. The main respect in which CH differs from other quantum interpretations is in allowing multiple stochastic descriptions of a given measurement situation, from which one (or more) can be selected on the basis of its utility. This requires abandoning a principle (termed unicity), central to classical physics, that at any instant of time there is only a single correct description of the world.
When Is Holography Consistent?
McInnes, Brett
2015-01-01
Holographic duality relates two radically different kinds of theory: one with gravity, one without. The very existence of such an equivalence imposes strong consistency conditions which are, in the nature of the case, hard to satisfy. Recently a particularly deep condition of this kind, relating the minimum of a probe brane action to a gravitational bulk action (in a Euclidean formulation), has been recognised; and the question arises as to the circumstances under which it, and its Lorentzian counterpart, are satisfied. We discuss the fact that there are physically interesting situations in which one or both versions might, in principle, not be satisfied. These arise in two distinct circumstances: first, when the bulk is not an Einstein manifold, and, second, in the presence of angular momentum. Focusing on the application of holography to the quark-gluon plasma (of the various forms arising in the early Universe and in heavy-ion collisions), we find that these potential violations never actually occur...
Developing consistent time series landsat data products
The Landsat series satellite has provided earth observation data record continuously since early 1970s. There are increasing demands on having a consistent time series of Landsat data products. In this presentation, I will summarize the work supported by the USGS Landsat Science Team project from 20...
Consistency of canonical formulation of Horava gravity
Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)
2011-09-22
Both the non-projectable and projectable version of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints; and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Consistency of trace norm minimization
Bach, Francis
2007-01-01
Regularization by the sum of singular values, also referred to as the trace norm, is a popular technique for estimating low-rank rectangular matrices. In this paper, we extend some of the consistency results of the Lasso to provide necessary and sufficient conditions for rank consistency of trace norm minimization with the square loss. We also provide an adaptive version that is rank consistent even when the necessary condition for the non-adaptive version is not fulfilled.
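The trace-norm penalty discussed in this abstract is typically handled with proximal algorithms whose core step soft-thresholds the singular values. A minimal sketch of that proximal operator (NumPy; the matrices and the threshold value are illustrative assumptions):

```python
import numpy as np

def prox_trace_norm(M, tau):
    """Proximal operator of tau * ||.||_* : shrink each singular value by tau."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)        # soft-threshold the spectrum
    return U @ np.diag(s_shrunk) @ Vt

rng = np.random.default_rng(0)
low_rank = rng.standard_normal((8, 3)) @ rng.standard_normal((3, 8))  # rank 3
noisy = low_rank + 0.01 * rng.standard_normal((8, 8))
denoised = prox_trace_norm(noisy, tau=0.5)
print(np.linalg.matrix_rank(denoised))         # small: noise-level singular values are zeroed
```

Shrinking the whole spectrum uniformly is exactly what produces low-rank solutions, analogous to how the Lasso's soft-thresholding of coefficients produces sparse ones.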
High SNR Consistent Compressive Sensing
Kallummil, Sreejith; Kalyani, Sheetal
2017-01-01
High signal to noise ratio (SNR) consistency of model selection criteria in linear regression models has attracted a lot of attention recently. However, most of the existing literature on high SNR consistency deals with model order selection. Further, the limited literature available on the high SNR consistency of subset selection procedures (SSPs) is applicable to linear regression with full rank measurement matrices only. Hence, the performance of SSPs used in underdetermined linear models ...
Consistency argued students of fluid
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistency arguments can improve students' thinking skills, which is important in science. This study aims to assess the consistency of students' argumentation on fluids. The population comprises college students at PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University; cluster random sampling yielded 145 students. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results show average argumentation consistency of 4.85% for correct consistency, 29.93% for wrong consistency, and 65.23% for inconsistency. These findings reflect a lack of understanding of the fluid material, which ideally, with full consistency of argumentation, would support an expanded understanding of the concept. The results serve as a reference for improvements in future studies aimed at achieving a positive change in the consistency of argumentation.
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency-more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analys
Consistency of Random Survival Forests.
Ishwaran, Hemant; Kogalur, Udaya B
2010-07-01
We prove uniform consistency of Random Survival Forests (RSF), a newly introduced forest ensemble learner for analysis of right-censored survival data. Consistency is proven under general splitting rules, bootstrapping, and random selection of variables, that is, under true implementation of the methodology. Under this setting we show that the forest ensemble survival function converges uniformly to the true population survival function. To prove this result we make one key assumption regarding the feature space: we assume that all variables are factors. Doing so ensures that the feature space has finite cardinality and enables us to exploit counting process theory and the uniform consistency of the Kaplan-Meier survival function.
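The Kaplan-Meier survival function invoked in this proof can be sketched in a few lines. The product-limit estimator below ignores tied event times for simplicity, and the data are hypothetical:

```python
def kaplan_meier(times, events):
    """Product-limit estimate of S(t). events[i] is 1 for a death, 0 for censoring."""
    order = sorted(range(len(times)), key=lambda i: times[i])
    n_at_risk = len(times)
    surv, curve = 1.0, []
    for i in order:
        if events[i]:                      # a death multiplies in the factor (n-1)/n
            surv *= (n_at_risk - 1) / n_at_risk
            curve.append((times[i], surv))
        n_at_risk -= 1                     # censored subjects just leave the risk set
    return curve

# 5 subjects: deaths at t=2 and t=5; censoring at t=3, 4, 6
print(kaplan_meier([2, 3, 4, 5, 6], [1, 0, 0, 1, 0]))  # → [(2, 0.8), (5, 0.4)]
```

Note how censored observations still contribute to the risk set before their censoring time; this is the mechanism that lets the estimator handle right-censored data without discarding it.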
Probability-consistent spectrum and code spectrum
沈建文; 石树中
2004-01-01
In the seismic safety evaluation (SSE) for key projects, the probability-consistent spectrum (PCS), usually obtained from probabilistic seismic hazard analysis (PSHA), is not consistent with the design response spectrum given by Code for Seismic Design of Buildings (GB50011-2001). Sometimes, there may be a remarkable difference between them. If the PCS is lower than the corresponding code design response spectrum (CDS), the seismic fortification criterion for the key projects would be lower than that for the general industry and civil buildings. In the paper, the relation between PCS and CDS is discussed by using the ideal simple potential seismic source. The results show that in the most areas influenced mainly by the potential sources of the epicentral earthquakes and the regional earthquakes, PCS is generally lower than CDS in the long periods. We point out that the long-period response spectra of the code should be further studied and combined with the probability method of seismic zoning as much as possible. Because of the uncertainties in SSE, it should be prudent to use the long-period response spectra given by SSE for key projects when they are lower than CDS.
Process Fairness and Dynamic Consistency
S.T. Trautmann (Stefan); P.P. Wakker (Peter)
2010-01-01
When process fairness deviates from outcome fairness, dynamic inconsistencies can arise as in nonexpected utility. Resolute choice (Machina) can restore dynamic consistency under nonexpected utility without using Strotz's precommitment. It can similarly justify dynamically
Gravitation, Causality, and Quantum Consistency
Hertzberg, Mark P
2016-01-01
We examine the role of consistency with causality and quantum mechanics in determining the properties of gravitation. We begin by constructing two different classes of interacting theories of massless spin 2 particles -- gravitons. One involves coupling the graviton with the lowest number of derivatives to matter, the other involves coupling the graviton with higher derivatives to matter, making use of the linearized Riemann tensor. The first class requires an infinite tower of terms for consistency, which is known to lead uniquely to general relativity. The second class only requires a finite number of terms for consistency, which appears as a new class of theories of massless spin 2. We recap the causal consistency of general relativity and show how this fails in the second class for the special case of coupling to photons, exploiting related calculations in the literature. In an upcoming publication [1] this result is generalized to a much broader set of theories. Then, as a causal modification of general ...
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from mathemati
Entropy-based consistent model driven architecture
Niepostyn, Stanisław Jerzy
2016-09-01
A description of software architecture is a plan of the IT system construction; therefore any gaps in the architecture affect the overall success of the entire project. Definitions mostly describe software architecture as a set of views which are mutually unrelated, hence potentially inconsistent. Software architecture completeness is also often described in an ambiguous way. As a result, most methods of building IT systems contain many gaps and ambiguities, which are obstacles to automating software construction. In this article the consistency and completeness of software architecture are mathematically defined based on a calculation of the entropy of the architecture description. Following this approach, we also propose a method for automatic verification of the consistency and completeness of the software architecture development method presented in our previous article as Consistent Model Driven Architecture (CMDA). The proposed FBS (Functionality-Behaviour-Structure) entropy-based metric applied in our CMDA approach enables IT architects to decide whether the modelling process is complete and consistent. With this metric, software architects can assess the readiness of the ongoing modelling work for the start of IT system construction; it even allows them to assess objectively whether the designed software architecture of the IT system could be implemented at all. The overall benefit of such an approach is that it facilitates the preparation of complete and consistent software architecture more effectively, and it enables assessing and monitoring the status of the ongoing modelling work. We demonstrate this with a few industry examples of IT system designs.
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
Consistent Histories in Quantum Cosmology
Craig, David A; 10.1007/s10701-010-9422-6
2010-01-01
We illustrate the crucial role played by decoherence (consistency of quantum histories) in extracting consistent quantum probabilities for alternative histories in quantum cosmology. Specifically, within a Wheeler-DeWitt quantization of a flat Friedmann-Robertson-Walker cosmological model sourced with a free massless scalar field, we calculate the probability that the universe is singular in the sense that it assumes zero volume. Classical solutions of this model are a disjoint set of expanding and contracting singular branches. A naive assessment of the behavior of quantum states which are superpositions of expanding and contracting universes may suggest that a "quantum bounce" is possible, i.e., that the wave function of the universe may remain peaked on a non-singular classical solution throughout its history. However, a more careful consistent histories analysis shows that for arbitrary states in the physical Hilbert space the probability of this Wheeler-DeWitt quantum universe encountering the big bang/crun...
The Importance of being consistent
Wasserman, Adam; Jiang, Kaili; Kim, Min-Cheol; Sim, Eunji; Burke, Kieron
2016-01-01
We review the role of self-consistency in density functional theory. We apply a recent analysis to both Kohn-Sham and orbital-free DFT, as well as to Partition-DFT, which generalizes all aspects of standard DFT. In each case, the analysis distinguishes between errors in approximate functionals versus errors in the self-consistent density. This yields insights into the origins of many errors in DFT calculations, especially those often attributed to self-interaction or delocalization error. In many classes of problems, errors can be substantially reduced by using `better' densities. We review the history of these approaches, many of their applications, and give simple pedagogical examples.
Improving analytical tomographic reconstructions through consistency conditions
Arcadu, Filippo; Stampanoni, Marco; Marone, Federica
2016-01-01
This work introduces and characterizes a fast parameterless filter based on the Helgason-Ludwig consistency conditions, used to improve the accuracy of analytical reconstructions of tomographic undersampled datasets. The filter, acting in the Radon domain, extrapolates intermediate projections between those existing. The resulting sinogram, doubled in views, is then reconstructed by a standard analytical method. Experiments with simulated data prove that the peak-signal-to-noise ratio of the results computed by filtered backprojection is improved up to 5-6 dB, if the filter is used prior to reconstruction.
Consistent supersymmetric decoupling in cosmology
Sousa Sánchez, Kepa
2012-01-01
The present work discusses several problems related to the stability of ground states with broken supersymmetry in supergravity, and to the existence and stability of cosmic strings in various supersymmetric models. In particular we study the necessary conditions to truncate consistently a sector o
Energy-Consistent Multiscale Algorithms for Granular Flows
2014-08-07
Final report (30-07-2014) for the AFOSR Young Investigator Program (YIP) project "Energy-Consistent Multiscale Algorithms for Granular Flows," covering the period 01-MAY-2011 to 30-APR-2014. The report documents the achievements made as a result of this YIP project: the development of multiscale energy-consistent algorithms to simulate and capture flow phenomena in granular flows.
Consistence of Network Filtering Rules
SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian
2004-01-01
Inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the growth of multinational companies, SOHO offices, and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables on stand-alone hosts or across the network will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed, and a rule validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
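The set-theoretic view of rule validation described above can be sketched in a few lines. The rule format, the `match_set` helper, and the conflict criterion below are illustrative assumptions of this sketch, not the paper's formalization: a later rule is flagged when earlier rules fully cover its match set while at least one of them disagrees on the action.

```python
from itertools import product

def match_set(rule):
    """All (src, dst) pairs the rule matches; small sets stand in for address ranges."""
    return set(product(rule["src"], rule["dst"]))

def find_conflicts(rules):
    """Indices of rules that can never fire because earlier rules already cover
    their whole match set, with at least one differing action -- the kind of
    semantic inconsistency a formal checker should flag."""
    conflicts = []
    for i, later in enumerate(rules):
        uncovered = match_set(later)
        disagreement = False
        for earlier in rules[:i]:
            overlap = uncovered & match_set(earlier)
            if overlap and earlier["action"] != later["action"]:
                disagreement = True
            uncovered -= overlap
        if not uncovered and disagreement:
            conflicts.append(i)
    return conflicts

rules = [
    {"src": {"lan"}, "dst": {"db", "web"}, "action": "allow"},
    {"src": {"lan"}, "dst": {"db"}, "action": "deny"},  # fully shadowed by rule 0
]
```

On this toy table, `find_conflicts(rules)` flags rule 1, which can never take effect because rule 0 already allows everything it would deny.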
On Modal Refinement and Consistency
Nyman, Ulrik; Larsen, Kim Guldstrand; Wasowski, Andrzej
2007-01-01
Almost 20 years after the original conception, we revisit several fundamental questions about modal transition systems. First, we demonstrate the incompleteness of the standard modal refinement using a counterexample due to Hüttel. Deciding any refinement complete with respect to the standard notions of implementation is shown to be computationally hard (co-NP hard). Second, we consider four forms of consistency (existence of implementations) for modal specifications. We characterize each operationally, giving algorithms for deciding consistency and for synthesizing implementations...
Tri-Sasakian consistent reduction
Cassani, Davide
2011-01-01
We establish a universal consistent Kaluza-Klein truncation of M-theory based on seven-dimensional tri-Sasakian structure. The four-dimensional truncated theory is an N=4 gauged supergravity with three vector multiplets and a non-abelian gauge group, containing the compact factor SO(3). Consistency follows from the fact that our truncation takes exactly the same form as a left-invariant reduction on a specific coset manifold, and we show that the same holds for the various universal consistent truncations recently put forward in the literature. We describe how the global symmetry group SL(2,R) x SO(6,3) is embedded in the symmetry group E7(7) of maximally supersymmetric reductions, and make the connection with the approach of Exceptional Generalized Geometry. Vacuum AdS4 solutions spontaneously break the amount of supersymmetry from N=4 to N=3,1 or 0, and the spectrum contains massive modes. We find a subtruncation to minimal N=3 gauged supergravity as well as an N=1 subtruncation to the SO(3)-invariant secto...
Consistent quadrupole-octupole collective model
Dobrowolski, A.; Mazurek, K.; Góźdź, A.
2016-11-01
Within this work we present a consistent approach to quadrupole-octupole collective vibrations coupled with the rotational motion. A realistic collective Hamiltonian with variable mass-parameter tensor and potential obtained through the macroscopic-microscopic Strutinsky-like method with particle-number-projected BCS (Bardeen-Cooper-Schrieffer) approach in full vibrational and rotational, nine-dimensional collective space is diagonalized in the basis of projected harmonic oscillator eigensolutions. This orthogonal basis of zero-, one-, two-, and three-phonon oscillator-like functions in vibrational part, coupled with the corresponding Wigner function is, in addition, symmetrized with respect to the so-called symmetrization group, appropriate to the collective space of the model. In the present model it is D4 group acting in the body-fixed frame. This symmetrization procedure is applied in order to provide the uniqueness of the Hamiltonian eigensolutions with respect to the laboratory coordinate system. The symmetrization is obtained using the projection onto the irreducible representation technique. The model generates the quadrupole ground-state spectrum as well as the lowest negative-parity spectrum in 156Gd nucleus. The interband and intraband B (E 1 ) and B (E 2 ) reduced transition probabilities are also calculated within those bands and compared with the recent experimental results for this nucleus. Such a collective approach is helpful in searching for the fingerprints of the possible high-rank symmetries (e.g., octahedral and tetrahedral) in nuclear collective bands.
Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L
2013-01-01
The consistency of Moving Particle Semi-implicit (MPS) method in reproducing the gradient, divergence and Laplacian differential operators is discussed in the present paper. Its relation to the Smoothed Particle Hydrodynamics (SPH) method is rigorously established. The application of the MPS method to solve the Navier-Stokes equations using a fractional step approach is treated, unveiling inconsistency problems when solving the Poisson equation for the pressure. A new corrected MPS method incorporating boundary terms is proposed. Applications to one dimensional boundary value Dirichlet and mixed Neumann-Dirichlet problems and to two-dimensional free-surface flows are presented.
Measuring process and knowledge consistency
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
with a 5-point Likert scale and a corresponding scoring system. Process consistency is measured by using a first-person drawing tool with the respondent in the centre: respondents sketch the sequence of steps and the people they contact when configuring a product. The methodology is tested in one company... for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may reflect only one part of the practice, ignoring a majority of the employees. To avoid this situation...
Maintaining consistency in distributed systems
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
The Consistent Vehicle Routing Problem
Groer, Christopher S. [ORNL]; Golden, Bruce [University of Maryland]; Wasil, Edward [American University]
2009-01-01
In the small package shipping industry (as in other industries), companies try to differentiate themselves by providing high levels of customer service. This can be accomplished in several ways, including online tracking of packages, ensuring on-time delivery, and offering residential pickups. Some companies want their drivers to develop relationships with customers on a route and have the same drivers visit the same customers at roughly the same time on each day that the customers need service. These service requirements, together with traditional constraints on vehicle capacity and route length, define a variant of the classical capacitated vehicle routing problem, which we call the consistent VRP (ConVRP). In this paper, we formulate the problem as a mixed-integer program and develop an algorithm to solve the ConVRP that is based on the record-to-record travel algorithm. We compare the performance of our algorithm to the optimal mixed-integer program solutions for a set of small problems and then apply our algorithm to five simulated data sets with 1,000 customers and a real-world data set with more than 3,700 customers. We provide a technique for generating ConVRP benchmark problems from vehicle routing problem instances given in the literature and provide our solutions to these instances. The solutions produced by our algorithm on all problems do a very good job of meeting customer service objectives with routes that have a low total travel time.
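The service-consistency side constraints that distinguish the ConVRP from the classical VRP can be illustrated with a small checker. The schedule data model and the time tolerance below are assumptions of this sketch, not the implementation from the paper:

```python
# Toy ConVRP-style consistency check: every customer served on multiple days
# should see the same driver, at roughly the same time of day.

def check_consistency(schedule, time_tol=60):
    """schedule: {day: {customer: (driver, minutes_after_midnight)}}.
    Returns a list of (customer, kind) violations, kind in {"driver", "time"}."""
    drivers, times = {}, {}
    violations = []
    for day, visits in schedule.items():
        for cust, (drv, t) in visits.items():
            if cust in drivers and drivers[cust] != drv:
                violations.append((cust, "driver"))
            if cust in times and abs(times[cust] - t) > time_tol:
                violations.append((cust, "time"))
            drivers.setdefault(cust, drv)   # remember first-seen driver/time
            times.setdefault(cust, t)
    return violations

ok = {"mon": {"c1": ("d1", 540)}, "wed": {"c1": ("d1", 560)}}   # consistent
bad = {"mon": {"c1": ("d1", 540)}, "wed": {"c1": ("d2", 700)}}  # both violated
```

A heuristic such as the record-to-record travel algorithm would then search for routes for which a check like this returns no violations while minimizing total travel time.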
Soler Vich, Josep Francesc
2009-01-01
This project carries out a part of a new location application. The service consists of a location-based application that connects to a location server; the server provides the location information needed to run the service. An interface between the location-based application and the location server is therefore needed. That interface could be a Location Application Programming Interface (LAPI), and the development of one such LAPI is the goal of this project.
National Aeronautics and Space Administration — For the first two years of the project (FY12-13), the RadWorks project has consisted of two top-level elements. The first element involved the prototype and...
2001-01-01
As business environments become increasingly competitive, companies seek more comprehensive solutions to the delivery of their projects. "Project Delivery System: Fourth Edition" describes the process-driven project delivery system which incorporates best practices from Total Quality, is aligned with the Project Management Institute and ISO quality standards, and is the means by which projects are consistently and efficiently planned, executed, and completed to the satisfaction of clients and customers.
Evaluating the hydrological consistency of evaporation products
López, Oliver
2017-01-18
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months.
Consistent Sets and Contrary Inferences Reply to Griffiths and Hartle
Kent, A
1998-01-01
It was pointed out recently [A. Kent, Phys. Rev. Lett. 78 (1997) 2874] that the consistent histories approach allows contrary inferences to be made from the same data, corresponding to commuting orthogonal projections in different consistent sets. To many, this seems undesirable in a theory of physical inferences. It also raises a specific problem for the consistent histories formalism, since that formalism is set up so as to eliminate contradictory inferences, yet there seems to be no sensible physical distinction between contradictory and contrary inferences. It seems particularly hard to defend this asymmetry, since (i) there is a well-defined quantum histories formalism which admits both contradictory and contrary inferences, and (ii) there is also a well-defined formalism, based on ordered consistent sets of histories, which excludes both. In a recent comment, Griffiths and Hartle, while accepting the validity of the examples given in the above paper, restate their own preference for the consistent hist...
Project 2010 Project Management
Happy, Robert
2010-01-01
The ideal on-the-job reference guide for project managers who use Microsoft Project 2010. This must-have guide to using Microsoft Project 2010 is written from a real project manager's perspective and is packed with information you can use on the job. The book explores using Project 2010 during the phases of project management, reveals best practices, and walks you through project flow from planning through tracking to closure. This valuable book follows the processes defined in the PMBOK Guide, Fourth Edition, and also provides exam prep for Microsoft's MCTS: Project 2010 certification.
Consistent Design of Dependable Control Systems
Blanke, M.
1996-01-01
Design of fault handling in control systems is discussed, and a method for consistent design is presented.
The CHPRC Columbia River Protection Project Quality Assurance Project Plan
Fix, N. J.
2008-11-30
Pacific Northwest National Laboratory researchers are working on the CHPRC Columbia River Protection Project (hereafter referred to as the Columbia River Project). This is a follow-on project, funded by CH2M Hill Plateau Remediation Company, LLC (CHPRC), to the Fluor Hanford, Inc. Columbia River Protection Project. The work scope consists of a number of CHPRC funded, related projects that are managed under a master project (project number 55109). All contract releases associated with the Fluor Hanford Columbia River Project (Fluor Hanford, Inc. Contract 27647) and the CHPRC Columbia River Project (Contract 36402) will be collected under this master project. Each project within the master project is authorized by a CHPRC contract release that contains the project-specific statement of work. This Quality Assurance Project Plan provides the quality assurance requirements and processes that will be followed by the Columbia River Project staff.
MDCC: Multi-Data Center Consistency
Kraska, Tim; Franklin, Michael J; Madden, Samuel
2012-01-01
Replicating data across multiple data centers not only allows moving the data closer to the user, thus reducing latency for applications, but also increases availability in the event of a data center failure. It is therefore not surprising that companies like Google, Yahoo, and Netflix already replicate user data across geographically different regions. However, replication across data centers is expensive: inter-data-center network delays are in the hundreds of milliseconds and vary significantly. Synchronous wide-area replication is therefore considered unfeasible with strong consistency, and current solutions either settle for asynchronous replication, which implies the risk of losing data in the event of failures, restrict consistency to small partitions, or give up consistency entirely. With MDCC (Multi-Data Center Consistency), we describe the first optimistic commit protocol that does not require a master or partitioning and is strongly consistent at a cost similar to eventually consiste...
A dual-consistency cache coherence protocol
Ros, Alberto; Jimborean, Alexandra
2015-01-01
Weak memory consistency models can maximize system performance by enabling hardware and compiler optimizations, but increase programming complexity since they do not match programmers’ intuition. The design of an efficient system with an intuitive memory model is an open challenge. This paper proposes SPEL, a dual-consistency cache coherence protocol which simultaneously guarantees the strongest memory consistency model provided by the hardware and yields improvements in both performance and ...
A new approach to hull consistency
Kolev Lubomir
2016-06-01
Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new, more general approach to implementing hull consistency is suggested which consists in treating simultaneously several equations with respect to the same number of variables.
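The scalar hull-consistency step that the poster generalizes can be sketched briefly. The interval representation and the example constraint x + y = c are illustrative choices of this sketch, not the poster's algorithm: each variable's interval is intersected with the interval obtained by solving the equation for that variable.

```python
def intersect(a, b):
    """Intersection of two closed intervals given as (lo, hi) tuples."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: no solution in the box")
    return (lo, hi)

def narrow_sum(x, y, c):
    """Contract the intervals x and y under the constraint x + y == c."""
    x = intersect(x, (c - y[1], c - y[0]))  # solve for x: x = c - y
    y = intersect(y, (c - x[1], c - x[0]))  # solve for y: y = c - x
    return x, y

x, y = narrow_sum((0.0, 10.0), (2.0, 3.0), 5.0)
# x narrows from (0, 10) to (2.0, 3.0); y stays (2.0, 3.0)
```

An iterative interval solver repeats such contractions over all equations until no interval shrinks further; an empty intersection proves the current box contains no solution.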
Consistent estimators in random censorship semiparametric models
Wang Qihua (王启华)
1996-01-01
For the fixed design regression model, when the responses Y are randomly censored on the right, estimators of the unknown parameter and of the regression function g from the censored observations are defined in two cases, in which the censoring distribution is known and unknown, respectively. Moreover, sufficient conditions are established under which these estimators are strongly consistent and pth (p>2) mean consistent.
Student Effort, Consistency, and Online Performance
Patron, Hilde; Lopez, Salvador
2011-01-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
Consistent truncations with massive modes and holography
Cassani, Davide; Faedo, Anton F
2011-01-01
We review the basic features of some recently found consistent Kaluza-Klein truncations including massive modes. We emphasize the general ideas underlying the reduction procedure, then we focus on type IIB supergravity on 5-dimensional manifolds admitting a Sasaki-Einstein structure, which leads to half-maximal gauged supergravity in five dimensions. Finally, we comment on the holographic picture of consistency.
CONSISTENT AGGREGATION IN FOOD DEMAND SYSTEMS
Levedahl, J. William; Reed, Albert J.; Clark, J. Stephen
2002-01-01
Two aggregation schemes for food demand systems are tested for consistency with the Generalized Composite Commodity Theorem (GCCT). One scheme is based on the standard CES classification of food expenditures. The second scheme is based on the Food Guide Pyramid. Evidence is found that both schemes are consistent with the GCCT.
A Framework of Memory Consistency Models
Hu Weiwu (胡伟武); Shi Weisong (施巍松); et al.
1998-01-01
Previous descriptions of memory consistency models in shared-memory multiprocessor systems are mainly expressed as constraints on the ordering of memory access events and hence are hardware-centric. This paper presents a framework of memory consistency models which describes a memory consistency model at the behavior level. Based on the understanding that the behavior of an execution is determined by the execution order of conflicting accesses, a memory consistency model is defined as an interprocessor synchronization mechanism which orders the execution of operations from different processors. The synchronization order of an execution under a given consistency model is also defined; the synchronization order, together with the program order, determines the behavior of an execution. This paper also presents criteria for correct programs and for correct implementations of consistency models. Regarding an implementation of a consistency model as a set of memory event ordering constraints, this paper provides a method to prove the correctness of consistency model implementations, and the correctness of the lock-based cache coherence protocol is proved with this method.
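The idea that a consistency model is a constraint on execution orders can be made concrete with a toy sequential-consistency checker. The litmus test and encoding below are standard textbook material, not the paper's framework: enumerating every interleaving that respects each processor's program order shows that the outcome r0 = r1 = 0 is impossible under sequential consistency (though weaker models permit it).

```python
from itertools import permutations

# Two processors: P0 stores x then loads y; P1 stores y then loads x.
P0 = [("st", "x", 1), ("ld", "y", "r0")]
P1 = [("st", "y", 1), ("ld", "x", "r1")]

def sc_outcomes(progs):
    """All (r0, r1) outcomes reachable under sequential consistency."""
    ops = [(p, i) for p, prog in enumerate(progs) for i in range(len(prog))]
    outcomes = set()
    for order in permutations(ops):
        # keep only interleavings that preserve per-processor program order
        if any(order.index((p, i)) > order.index((p, i + 1))
               for p, prog in enumerate(progs) for i in range(len(prog) - 1)):
            continue
        mem, regs = {"x": 0, "y": 0}, {}
        for p, i in order:
            kind, var, arg = progs[p][i]
            if kind == "st":
                mem[var] = arg
            else:
                regs[arg] = mem[var]
        outcomes.add((regs["r0"], regs["r1"]))
    return outcomes

outcomes = sc_outcomes([P0, P1])
```

Here `outcomes` contains (0, 1), (1, 0), and (1, 1) but not (0, 0): under any total order consistent with program order, at least one load follows the other processor's store.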
Sticky continuous processes have consistent price systems
Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan
Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under arb...
Testing the visual consistency of web sites
Geest, van der Thea; Loorbach, Nicole
2005-01-01
Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to agree...
Putting Consistent Theories Together in Institutions
Ying Mingsheng (应明生)
1995-01-01
The problem of putting consistent theories together in institutions is discussed. A general necessary condition for the consistency of the resulting theory is worked out, and some sufficient conditions are given for diagrams of theories whose shapes are tree bundles or directed graphs. Moreover, some transformations from complicated cases to simple ones are established.
Thermal Data Exchange Using International Standards Project
National Aeronautics and Space Administration — Spacecraft projects today consist of many different cooperating companies and institutions. The project members typically use different thermal design analysis...
Modeling and Testing Legacy Data Consistency Requirements
Nytun, J. P.; Jensen, Christian Søndergaard
2003-01-01
An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data but are based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult. This paper addresses the need for new techniques that enable modeling and consistency checking of legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources; the vehicle is UML and its accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...
On the Initial State and Consistency Relations
Berezhiani, Lasha
2014-01-01
We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q->0 analyticity properties of the vertex functional and result in violations of the consistency relations.
On the initial state and consistency relations
Berezhiani, Lasha; Khoury, Justin, E-mail: lashaber@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, 209 South 33rd Street, Philadelphia, PA 19104 (United States)
2014-09-01
We study the effect of the initial state on the consistency conditions for adiabatic perturbations. In order to be consistent with the constraints of General Relativity, the initial state must be diffeomorphism invariant. As a result, we show that the initial wavefunctional/density matrix has to satisfy a Slavnov-Taylor identity similar to that of the action. We then investigate the precise ways in which modified initial states can lead to violations of the consistency relations. We find two independent sources of violations: i) the state can include initial non-Gaussianities; ii) even if the initial state is Gaussian, such as a Bogoliubov state, the modified 2-point function can modify the q → 0 analyticity properties of the vertex functional and result in violations of the consistency relations.
Self-Consistent Asset Pricing Models
Malevergne, Y
2006-01-01
We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alpha's and beta's of the factor model are unobservable. Self-consistency leads to renormalized beta's with zero effective alpha's, which are observable with standard OLS regressions. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value $\alpha_i$ at the origin between an asset $i$'s return and the proxy's return. Self-consistency also introduces "orthogonality" and "normality" conditions linking the beta's, alpha's (as well as the residuals) and the weights of the proxy por...
Quasiparticle self-consistent GW theory.
van Schilfgaarde, M; Kotani, Takao; Faleev, S
2006-06-09
In past decades the scientific community has been looking for a reliable first-principles method to predict the electronic structure of solids with high accuracy. Here we present an approach which we call the quasiparticle self-consistent approximation. It is based on a kind of self-consistent perturbation theory, where the self-consistency is constructed to minimize the perturbation. We apply it to selections from different classes of materials, including alkali metals, semiconductors, wide band gap insulators, transition metals, transition metal oxides, magnetic insulators, and rare earth compounds. Apart from some mild exceptions, the properties are very well described, particularly in weakly correlated cases. Self-consistency dramatically improves agreement with experiment, and is sometimes essential. Discrepancies with experiment are systematic, and can be explained in terms of approximations made.
Consistency in the World Wide Web
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how ... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Consistency Relations for Large Field Inflation
Chiba, Takeshi
2014-01-01
Consistency relations for chaotic inflation with a monomial potential, natural inflation, and hilltop inflation are given which involve the scalar spectral index $n_s$, the tensor-to-scalar ratio $r$ and the running of the spectral index $\alpha$. A measurement of $\alpha$ at the $O(10^{-3})$ level, together with an improved measurement of $n_s$, could discriminate the monomial model from natural/hilltop inflation models. A consistency region for general large field models is also presented.
Consistency and Derangements in Brane Tilings
Hanany, Amihay; Ramgoolam, Sanjaye; Seong, Rak-Kyeong
2015-01-01
Brane tilings describe Lagrangians (vector multiplets, chiral multiplets, and the superpotential) of four dimensional $\mathcal{N}=1$ supersymmetric gauge theories. These theories, written in terms of a bipartite graph on a torus, correspond to worldvolume theories on $N$ D$3$-branes probing a toric Calabi-Yau threefold singularity. A pair of permutations compactly encapsulates the data necessary to specify a brane tiling. We show that geometric consistency for brane tilings, which ensures that the corresponding quantum field theories are well behaved, imposes constraints on the pair of permutations, restricting certain products constructed from the pair to have no one-cycles. Permutations without one-cycles are known as derangements. We illustrate this formulation of consistency with known brane tilings. Counting formulas for consistent brane tilings with an arbitrary number of chiral bifundamental fields are written down in terms of delta functions over symmetric groups.
Quantifying the consistency of scientific databases
Šubelj, Lovro; Boshkoska, Biljana Mileva; Kastrin, Andrej; Levnajić, Zoran
2015-01-01
Science is a social process with far-reaching impact on our modern society. In recent years, for the first time, we are able to scientifically study science itself. This is enabled by the massive amounts of data on scientific publications that are increasingly becoming available. The data are contained in several databases such as Web of Science or PubMed, maintained by various public and private entities. Unfortunately, these databases are not always consistent, which considerably hinders this study. Relying on the powerful framework of complex networks, we conduct a systematic analysis of the consistency among six major scientific databases. We find that identifying a single "best" database is far from easy. Nevertheless, our results indicate appreciable differences in the mutual consistency of different databases, which we interpret as recipes for future bibliometric studies.
Self-consistent Green's function approaches
Barbieri, Carlo
2016-01-01
We present the fundamental techniques and working equations of many-body Green's function theory for calculating ground state properties and the spectral strength. Green's function methods closely relate to other polynomial scaling approaches discussed in chapters 8 and 10. However, here we aim directly at a global view of the many-fermion structure. We derive the working equations for calculating many-body propagators, using both the Algebraic Diagrammatic Construction technique and the self-consistent formalism at finite temperature. Their implementation is discussed, as well as the inclusion of three-nucleon interactions. The self-consistency feature is essential to guarantee thermodynamic consistency. The pairing and neutron matter models introduced in previous chapters are solved and compared with the other methods in this book.
Personalized recommendation based on unbiased consistence
Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao
2015-08-01
Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite networks have provided an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms focus only on unidirectional mass diffusion from objects that have been collected to those that should be recommended, resulting in a biased causal similarity estimation and mediocre performance. In this letter, we argue that in many cases a user's interests are stable, and thus the bidirectional mass diffusion abilities, whether originating from objects already collected or from those to be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
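The unidirectional mass diffusion that this record takes as its baseline can be sketched in a few lines. Below is a minimal NumPy sketch of the classic one-round scheme (resources start on the user's collected items, spread to users, then back to items); the function name and toy data are illustrative, and the bidirectional variant proposed in the letter would additionally diffuse from the candidate items:

```python
import numpy as np

def mass_diffusion_scores(A, user):
    """One round of unidirectional mass diffusion on a user-item bipartite
    adjacency matrix A (rows = users, columns = items), scoring items for
    `user`. Uncollected items are ranked by the resource they receive."""
    k_item = A.sum(axis=0)                      # item degrees
    k_user = A.sum(axis=1)                      # user degrees
    f0 = A[user].astype(float)                  # initial resource on collected items
    # items -> users: each item splits its resource among its collectors
    on_users = A @ (f0 / np.maximum(k_item, 1))
    # users -> items: each user splits the received resource among its items
    f1 = A.T @ (on_users / np.maximum(k_user, 1))
    f1[A[user] == 1] = -np.inf                  # never re-recommend collected items
    return f1

# Toy example: 3 users x 3 items; score items for user 0
A = np.array([[1, 1, 0],
              [1, 0, 1],
              [0, 1, 1]])
scores = mass_diffusion_scores(A, 0)
```

Here the only uncollected item for user 0 (item 2) receives a positive score through the users who share items with user 0.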
A Revisit to Probability - Possibility Consistency Principles
Mamoni Dhar
2013-03-01
In this article, our main intention is to highlight the fact that the probable links between probability and possibility, which were established by different authors at different points in time on the basis of some well-known consistency principles, cannot provide the desired result. The paper therefore discusses some prominent works on transformations between probability and possibility and finally aims to suggest a new principle, because none of the existing principles yields a unique transformation. The new consistency principle suggested here would in turn replace all others that exist in the literature by providing a reliable estimate of consistency between the two. Furthermore, some properties of the entropy of fuzzy numbers are also presented in this article.
Consistent matter couplings for Plebanski gravity
Tennie, Felix
2010-01-01
We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein-Cartan gravity. As a byproduct we also show the consistency of some previous suggestions for matter actions.
Consistent matter couplings for Plebanski gravity
Tennie, Felix; Wohlfarth, Mattias N. R.
2010-11-01
We develop a scheme for the minimal coupling of all standard types of tensor and spinor field matter to Plebanski gravity. This theory is a geometric reformulation of vacuum general relativity in terms of two-form frames and connection one-forms, and provides a covariant basis for various quantization approaches. Using the spinor formalism we prove the consistency of the newly proposed matter coupling by demonstrating the full equivalence of Plebanski gravity plus matter to Einstein-Cartan gravity. As a by-product we also show the consistency of some previous suggestions for matter actions.
Sparse motion segmentation using multiple six-point consistencies
Zografos, Vasileios; Ellis, Liam
2010-01-01
We present a method for segmenting an arbitrary number of moving objects in image sequences using the geometry of 6 points in 2D to infer motion consistency. The method has been evaluated on the Hopkins 155 database and surpasses current state-of-the-art methods such as SSC, both in terms of overall performance on two and three motions and in terms of maximum errors. The method works by finding initial clusters in the spatial domain, and then classifying each remaining point as belonging to the cluster that minimizes a motion consistency score. In contrast to most other motion segmentation methods, which are based on an affine camera model, the proposed method is fully projective.
Consistency in multi-viewpoint architectural design
Dijkman, R.M.; Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Consistency and stability of recombinant fermentations.
Wiebe, M E; Builder, S E
1994-01-01
Production of proteins of consistent quality in heterologous, genetically-engineered expression systems is dependent upon identifying the manufacturing process parameters which have an impact on product structure, function, or purity, validating acceptable ranges for these variables, and performing the manufacturing process as specified. One of the factors which may affect product consistency is genetic instability of the primary product sequence, as well as instability of genes which code for proteins responsible for post-translational modification of the product. Approaches have been developed for mammalian expression systems to assure that product quality is not changing through mechanisms of genetic instability. Sensitive protein analytical methods, particularly peptide mapping, are used to evaluate product structure directly, and are more sensitive in detecting genetic instability than is direct genetic analysis by nucleotide sequencing of the recombinant gene or mRNA. These methods are being employed to demonstrate that the manufacturing process consistently yields a product of defined structure from cells cultured through the range of cell ages used in the manufacturing process and well beyond the maximum cell age defined for the process. The combination of well designed validation studies which demonstrate consistent product quality as a function of cell age, and rigorous quality control of every product lot by sensitive protein analytical methods provide the necessary assurance that product structure is not being altered through mechanisms of mutation and selection.
Developing consistent pronunciation models for phonemic variants
Davel, M
2006-09-01
... from a lexicon containing variants. In this paper we address both these issues by creating ‘pseudo-phonemes’ associated with sets of ‘generation restriction rules’ to model those pronunciations that are consistently realised as two or more...
On Consistency Maintenance In Service Discovery
Sundramoorthy, V.; Hartel, Pieter H.; Scholten, Johan
2005-01-01
Communication and node failures degrade the ability of a service discovery protocol to ensure that users receive the correct service information when the service changes. We propose that service discovery protocols employ a set of recovery techniques to recover from failures and regain consistency.
Consistent feeding positions of great tit parents
Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, P.
2006-01-01
When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is
Computing Rooted and Unrooted Maximum Consistent Supertrees
van Iersel, Leo
2009-01-01
A chief problem in phylogenetics and database theory is the computation of a maximum consistent tree from a set of rooted or unrooted trees. Standard inputs are triplets (rooted binary trees on three leaves) or quartets (unrooted binary trees on four leaves). We give exact algorithms constructing rooted and unrooted maximum consistent supertrees in time O(2^n n^5 m^2 log(m)) for a set of m triplets (quartets), each one distinctly leaf-labeled by some subset of n labels. The algorithms extend to weighted triplets (quartets). We further present fast exact algorithms for constructing rooted and unrooted maximum consistent trees in polynomial space. Finally, for a set T of m rooted or unrooted trees with maximum degree D and distinctly leaf-labeled by some subset of a set L of n labels, we compute, in O(2^{mD} n^m m^5 n^6 log(m)) time, a tree distinctly leaf-labeled by a maximum-size subset X of L such that all trees in T, when restricted to X, are consistent with it.
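For intuition, the elementary subproblem behind this record (deciding whether a single rooted triplet ab|c is displayed by a given rooted tree) reduces to comparing lowest common ancestors. A small sketch, assuming the tree is given as a child-to-parent map; function names are illustrative:

```python
def lca(parent, u, v):
    """Lowest common ancestor in a rooted tree given as a child -> parent map."""
    ancestors = set()
    while u is not None:          # walk u up to the root, recording its path
        ancestors.add(u)
        u = parent.get(u)
    while v not in ancestors:     # walk v up until it meets u's path
        v = parent[v]
    return v

def depth(parent, u):
    """Number of edges from u up to the root."""
    d = 0
    while u in parent:
        u = parent[u]
        d += 1
    return d

def displays_triplet(parent, a, b, c):
    """True iff the tree displays the rooted triplet ab|c, i.e. the LCA of
    a and b lies strictly below the LCA of the whole triple."""
    return depth(parent, lca(parent, a, b)) > depth(parent, lca(parent, a, c))

# Toy tree ((a,b),c) with internal node x and root r
parent = {'a': 'x', 'b': 'x', 'x': 'r', 'c': 'r'}
```

This tree displays ab|c but neither ac|b nor bc|a, since only a and b share an ancestor below the root.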
Addendum to "On the consistency of MPS"
Souto-Iglesias, Antonio; González, Leo M; Cercos-Pita, Jose L
2013-01-01
The analogies between the Moving Particle Semi-implicit method (MPS) and Incompressible Smoothed Particle Hydrodynamics method (ISPH) are established in this note, as an extension of the MPS consistency analysis conducted in "Souto-Iglesias et al., Computer Physics Communications, 184(3), 2013."
Proteolysis and consistency of Meshanger cheese
Jong, de L.
1978-01-01
Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of α_{s1}-casein was proportional to the rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to the breakdown of...
On the existence of consistent price systems
Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan
2014-01-01
We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
A self-consistent Maltsev pulse model
Buneman, O.
1985-04-01
A self-consistent model for an electron pulse propagating through a plasma is presented. In this model, the charge imbalance between plasma ions, plasma electrons and pulse electrons creates the travelling potential well in which the pulse electrons are trapped.
Consistent implementation of decisions in the brain.
James A R Marshall
Despite the complexity and variability of decision processes, motor responses are generally stereotypical and independent of decision difficulty. How is this consistency achieved? Through an engineering analogy we consider how and why a system should be designed to realise not only flexible decision-making, but also consistent decision implementation. We specifically consider neurobiologically-plausible accumulator models of decision-making, in which decisions are made when a decision threshold is reached. To trade-off between the speed and accuracy of the decision in these models, one can either adjust the thresholds themselves or, equivalently, fix the thresholds and adjust baseline activation. Here we review how this equivalence can be implemented in such models. We then argue that manipulating baseline activation is preferable as it realises consistent decision implementation by ensuring consistency of motor inputs, summarise empirical evidence in support of this hypothesis, and suggest that it could be a general principle of decision making and implementation. Our goal is therefore to review how neurobiologically-plausible models of decision-making can manipulate speed-accuracy trade-offs using different mechanisms, to consider which of these mechanisms has more desirable decision-implementation properties, and then review the relevant neuroscientific data on which mechanism brains actually use.
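The equivalence this record leans on (raising baseline activation versus lowering the decision threshold) is easy to see in a one-accumulator simulation: with identical noise, only the gap between baseline and threshold determines the crossing time. A toy sketch, with all parameter values illustrative:

```python
import random

def decision_time(drift, noise, baseline, threshold, dt=0.001, seed=1):
    """Simulate one evidence accumulator starting at `baseline` and return
    the time at which it first crosses `threshold`. With a fixed random
    seed, trajectories sharing the same (threshold - baseline) gap are
    step-for-step identical relative to their starting point."""
    rng = random.Random(seed)
    x, t = baseline, 0.0
    while x < threshold:
        # Euler step of a drift-diffusion process
        x += drift * dt + noise * rng.gauss(0.0, 1.0) * dt ** 0.5
        t += dt
    return t

# Raising the baseline by 0.2 or lowering the threshold by 0.2
# yields (essentially) the same decision time:
t_baseline = decision_time(1.0, 0.1, baseline=0.2, threshold=1.0, seed=7)
t_threshold = decision_time(1.0, 0.1, baseline=0.0, threshold=0.8, seed=7)
```

The review's point is that although the two manipulations are computationally equivalent, the baseline route additionally keeps the motor-input level at threshold constant.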
Consistency in multi-viewpoint architectural design
Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder's design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Properties and Update Semantics of Consistent Views
1985-09-01
G. Gottlob, Institute for Applied Mathematics, C.N.R., Genova, Italy. Gottlob G., Paolini P., Zicari R., "Proving Properties of Programs on Database Views", Dipartimento di Elettronica, Politecnico di Milano (in
Consistency Analysis of Network Traffic Repositories
Lastdrager, Elmer; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for var
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Pras, A.
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given to the consistency of these repositories. Still, for vario
Consistency and variability in functional localisers
Duncan, Keith J.; Pattamadilok, Chotiga; Knierim, Iris; Devlin, Joseph T.
2009-01-01
A critical assumption underlying the use of functional localiser scans is that the voxels identified as the functional region-of-interest (fROI) are essentially the same as those activated by the main experimental manipulation. Intra-subject variability in the location of the fROI violates this assumption, reducing the sensitivity of the analysis and biasing the results. Here we investigated consistency and variability in fROIs in a set of 45 volunteers. They performed two functional localiser scans to identify word- and object-sensitive regions of ventral and lateral occipito-temporal cortex, respectively. In the main analyses, fROIs were defined as the category-selective voxels in each region and consistency was measured as the spatial overlap between scans. Consistency was greatest when minimally selective thresholds were used to define “active” voxels (p < 0.05 uncorrected), revealing that approximately 65% of the voxels were commonly activated by both scans. In contrast, highly selective thresholds (p < 10^-4 to 10^-6) yielded the lowest consistency values with less than 25% overlap of the voxels active in both scans. In other words, intra-subject variability was surprisingly high, with between one third and three quarters of the voxels in a given fROI not corresponding to those activated in the main task. This level of variability stands in striking contrast to the consistency seen in retinotopically-defined areas and has important implications for designing robust but efficient functional localiser scans. PMID:19289173
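The overlap analysis this record describes can be sketched in a few lines. The abstract does not give the exact formula, so this sketch assumes a union-based overlap (intersection over union) between the sets of voxels exceeding a statistical threshold in each scan; names and toy data are illustrative:

```python
import numpy as np

def froi_overlap(z1, z2, thresh):
    """Spatial overlap between two localiser maps: the fraction of voxels
    'active' (statistic > thresh) in either scan that are active in both."""
    a, b = z1 > thresh, z2 > thresh
    union = np.logical_or(a, b).sum()
    if union == 0:
        return 0.0
    return np.logical_and(a, b).sum() / union

# Toy 4-voxel maps: one voxel active in both scans, two in only one scan
z1 = np.array([1.0, 2.0, 3.0, 0.0])
z2 = np.array([2.0, 0.0, 3.0, 0.0])
overlap = froi_overlap(z1, z2, thresh=1.5)   # 1 shared of 3 active
```

Lowering `thresh` admits more voxels into both maps, which is consistent with the paper's observation that lenient thresholds produce higher overlap.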
School Science Review, 1978
1978-01-01
Presents sixteen project notes developed by pupils of Chipping Norton School and Bristol Grammar School in the United Kingdom. These projects include eight biology A-level projects and eight chemistry A-level projects. (HM)
[Training of sensorial panels consisting of children].
Wittig de Penna, E; Bunger Timermann, A; Serrano Valdés, L
2000-03-01
In the development of food products for children, it is advisable to establish the characteristics of the product with groups of children that represent the target population. To ensure the success of the products, quality and hedonic satisfaction expectations must be considered. To accomplish these premises, a group of children under the Program of Complementary Feeding of the Health Ministry was selected and trained. The project was developed with a group of 33 children, ages 9 to 12 years, from the Republica of Colombia School of Santiago, whose parents agreed to and supported the participation of their children in this project. The first step was teaching the techniques and methodology of sensory evaluation and increasing the children's sensitivity. After the 8 programmed sessions, those children who met the minimal requirements for a training group were chosen. The second step was performed during 12 sessions, working with 14 children. The training was aimed at the development of a vocabulary to describe quality and defects, ranking tests, discriminative tests and the use of different scales. Tests to verify the reliability, veracity and reproducibility of judgements (p < 0.05) were carried out. The trained group was able to assess different meals of the Program. The obtained results allowed the proposal of improvements to some quality criteria of the Program meals.
Bowman, Kaye; McKenna, Suzy
2016-01-01
This occasional paper provides an overview of the development of Australia's national training system and is a key knowledge document of a wider research project "Consistency with flexibility in the Australian national training system." This research project investigates the various approaches undertaken by each of the jurisdictions to…
Self-consistency in Capital Markets
Benbrahim, Hamid
2013-03-01
Capital markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence the prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result, market behaviors emerge from a number of mechanisms ranging from self-consistency due to the wisdom of crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three-body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.
Student Effort, Consistency and Online Performance
Hilde Patron
2011-07-01
This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online, for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
Consistence beats causality in recommender systems
Zhu, Xuzhen; Hu, Zheng; Zhang, Ping; Zhou, Tao
2015-01-01
The explosive growth of information challenges people's capability to find items that fit their own interests. Recommender systems provide an efficient solution by automatically pushing possibly relevant items to users according to their past preferences. Recommendation algorithms usually embody the causality from what has been collected to what should be recommended. In this article, we argue that in many cases a user's interests are stable, and thus the previous and future preferences are highly consistent. The temporal order of collections then does not necessarily imply a causality relationship. We further propose a consistence-based algorithm that outperforms state-of-the-art recommendation algorithms on disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
A supersymmetric consistent truncation for conifold solutions
Cassani, Davide
2010-01-01
We establish a supersymmetric consistent truncation of type IIB supergravity on the T^{1,1} coset space, based on extending the Papadopoulos-Tseytlin ansatz to the full set of SU(2)xSU(2) invariant Kaluza-Klein modes. The five-dimensional model is a gauged N=4 supergravity with three vector multiplets, which incorporates various conifold solutions and is suitable for the study of their dynamics. By analysing the scalar potential we find a family of new non-supersymmetric AdS_5 extrema interpolating between a solution obtained long ago by Romans and a solution employing an Einstein metric on T^{1,1} different from the standard one. Finally, we discuss some simple consistent subtruncations preserving N=2 supersymmetry. One of them is compatible with the inclusion of smeared D7-branes.
Temporally consistent segmentation of point clouds
Owens, Jason L.; Osteen, Philip R.; Daniilidis, Kostas
2014-06-01
We consider the problem of generating temporally consistent point cloud segmentations from streaming RGB-D data, where every incoming frame extends existing labels to new points or contributes new labels while maintaining the labels for pre-existing segments. Our approach generates an over-segmentation based on voxel cloud connectivity, where a modified k-means algorithm selects supervoxel seeds and associates similar neighboring voxels to form segments. Given the data stream from a potentially mobile sensor, we solve for the camera transformation between consecutive frames using a joint optimization over point correspondences and image appearance. The aligned point cloud may then be integrated into a consistent model coordinate frame. Previously labeled points are used to mask incoming points from the new frame, while new and previous boundary points extend the existing segmentation. We evaluate the algorithm on newly-generated RGB-D datasets.
Foundations of consistent couple stress theory
Hadjesfandiari, Ali R
2015-01-01
In this paper, we examine the recently developed skew-symmetric couple stress theory and demonstrate its inner consistency, natural simplicity and fundamental connection to classical mechanics. This hopefully will help the scientific community to overcome any ambiguity and skepticism about this theory, especially the validity of the skew-symmetric character of the couple-stress tensor. We demonstrate that in a consistent continuum mechanics, the response of infinitesimal elements of matter at each point decomposes naturally into a rigid body portion, plus the relative translation and rotation of these elements at adjacent points of the continuum. This relative translation and rotation captures the deformation in terms of stretches and curvatures, respectively. As a result, the continuous displacement field and its corresponding rotation field are the primary variables, which remarkably is in complete alignment with rigid body mechanics, thus providing a unifying basis. For further clarification, we also exami...
Consistent Linearized Gravity in Brane Backgrounds
Aref'eva, I Ya; Mück, W; Viswanathan, K S; Volovich, I V
2000-01-01
A globally consistent treatment of linearized gravity in the Randall-Sundrum background with matter on the brane is formulated. Using a novel gauge, in which the transverse components of the metric are non-vanishing, the brane is kept straight. We analyze the gauge symmetries and identify the physical degrees of freedom of gravity. Our results underline the necessity for non-gravitational confinement of matter to the brane.
Self-consistent model of fermions
Yershov, V N
2002-01-01
We discuss a composite model of fermions based on three-flavoured preons. We show that the opposite character of the Coulomb and strong interactions between these preons leads to the formation of complex structures reproducing three generations of quarks and leptons with all their quantum numbers and masses. The model is self-consistent (it uses no input parameters). Nevertheless, the masses of the generated structures match the experimental values.
Consistent formulation of the spacelike axial gauge
Burnel, A.; Van der Rest-Jaspers, M.
1983-12-15
The usual formulation of the spacelike axial gauge is afflicted with the difficulty that the metric is indefinite while no ghost is involved. We solve this difficulty by introducing a ghost whose elimination is such that the metric becomes positive for physical states. The technique consists in the replacement of the gauge condition n·A = 0 by the weaker one ∂_0(n·A) ≈ 0.
Security Policy: Consistency, Adjustments and Restraining Factors
Yang, Jiemian
2004-01-01
In the 2004 U.S. presidential election, despite sharply divided domestic opinion and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. Based on the author's analysis, it is clear that security agendas such as counter-terrorism and the Iraq issue contributed greatly to Mr. Bush's reelection. This also indicates that the security policy of Bush's second term will remain basically consistent.
Self-consistent structure of metallic hydrogen
Straus, D. M.; Ashcroft, N. W.
1977-01-01
A calculation is presented of the total energy of metallic hydrogen for a family of face-centered tetragonal lattices carried out within the self-consistent phonon approximation. The energy of proton motion is large and proper inclusion of proton dynamics alters the structural dependence of the total energy, causing isotropic lattices to become favored. For the dynamic lattice the structural dependence of terms of third and higher order in the electron-proton interaction is greatly reduced from static lattice equivalents.
Radiometric consistency assessment of hyperspectral infrared sounders
Wang, L.; Han, Y.; Jin, X.; Chen, Y.; Tremblay, D. A.
2015-01-01
The radiometric and spectral consistency among the Atmospheric Infrared Sounder (AIRS), the Infrared Atmospheric Sounding Interferometer (IASI), and the Cross-track Infrared Sounder (CrIS) is fundamental for the creation of long-term infrared (IR) hyperspectral radiance benchmark datasets for both inter-calibration and climate-related studies. In this study, the CrIS radiance measurements on Suomi National Polar-orbiting Partnership (SNPP) satellite are directly com...
The internal consistency of perfect competition
Jakob Kapeller; Stephan Pühringer
2010-01-01
This article surveys some arguments brought forward in defense of the theory of perfect competition. While some critics propose that the theory of perfect competition, and thus also the theory of the firm, are logically flawed, (mainstream) economists defend their most popular textbook model by a series of apparently different arguments. Here it is examined whether these arguments are comparable, consistent and convincing from the point of view of philosophy of science.
Cloud Standardization: Consistent Business Processes and Information
Razvan Daniel ZOTA
2013-01-01
Cloud computing represents one of the latest emerging trends in distributed computing, one that enables hardware infrastructure and software applications to exist as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption of cloud technologies. Moreover, this study shows how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Dynamic consistency for Stochastic Optimal Control problems
Carpentier, Pierre; Cohen, Guy; De Lara, Michel; Girardeau, Pierre
2010-01-01
For a sequence of dynamic optimization problems, we aim at discussing a notion of consistency over time. This notion can be informally introduced as follows. At the very first time step $t_0$, the decision maker formulates an optimization problem that yields optimal decision rules for all the forthcoming time steps $t_0, t_1, ..., T$; at the next time step $t_1$, he is able to formulate a new optimization problem starting at time $t_1$ that yields a new sequence of optimal decision rules. This process can be continued until the final time $T$ is reached. A family of optimization problems formulated in this way is said to be time consistent if the optimal strategies obtained when solving the original problem remain optimal for all subsequent problems. The notion of time consistency, well-known in the field of Economics, has been recently introduced in the context of risk measures, notably by Artzner et al. (2007) and studied in the Stochastic Programming framework by Shapiro (2009) and for Markov Decision Processes...
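The time-consistency property described in this abstract can be illustrated on a toy deterministic problem (a hypothetical three-step example with additive stage costs, not the paper's setting): the plan computed at $t_0$ remains optimal when the problem is re-solved from $t_1$, as Bellman's principle guarantees.

```python
import itertools

# Toy illustration of time consistency: states 0/1, actions 0/1, horizon T=3,
# with arbitrary (hypothetical) additive stage costs and deterministic dynamics.
T = 3
cost = lambda t, s, a: (s + a + t) % 3   # stage cost at time t, state s, action a
step = lambda s, a: (s + a) % 2          # deterministic state transition

def solve(t0, s0):
    """Exhaustively find an optimal action sequence starting at (t0, s0)."""
    best = None
    for seq in itertools.product([0, 1], repeat=T - t0):
        s, total = s0, 0
        for t, a in enumerate(seq, start=t0):
            total += cost(t, s, a)
            s = step(s, a)
        if best is None or total < best[0]:
            best = (total, seq)
    return best[1]

plan0 = solve(0, 0)        # plan formulated at time t0
s1 = step(0, plan0[0])     # state actually reached at time t1
plan1 = solve(1, s1)       # re-optimized plan formulated at time t1
print(plan0[1:] == plan1)  # True: the tail of the original plan stays optimal
```

With a standard additive criterion the family of problems is time consistent; the interest of the paper lies in settings (e.g. risk measures) where this property can fail.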
CMB lens sample covariance and consistency relations
Motloch, Pavel; Hu, Wayne; Benoit-Lévy, Aurélien
2017-02-01
Gravitational lensing information from the two and higher point statistics of the cosmic microwave background (CMB) temperature and polarization fields are intrinsically correlated because they are lensed by the same realization of structure between last scattering and observation. Using an analytic model for lens sample covariance, we show that there is one mode, separately measurable in the lensed CMB power spectra and lensing reconstruction, that carries most of this correlation. Once these measurements become lens sample variance dominated, this mode should provide a useful consistency check between the observables that is largely free of sampling and cosmological parameter errors. Violations of consistency could indicate systematic errors in the data and lens reconstruction or new physics at last scattering, any of which could bias cosmological inferences and delensing for gravitational waves. A second mode provides a weaker consistency check for a spatially flat universe. Our analysis isolates the additional information supplied by lensing in a model-independent manner but is also useful for understanding and forecasting CMB cosmological parameter errors in the extended Λ cold dark matter parameter space of dark energy, curvature, and massive neutrinos. We introduce and test a simple but accurate forecasting technique for this purpose that neither double counts lensing information nor neglects lensing in the observables.
Sparse Multi-View Consistency for Object Segmentation.
Djelouah, Abdelaziz; Franco, Jean-Sébastien; Boyer, Edmond; Le Clerc, François; Pérez, Patrick
2015-09-01
Multiple view segmentation consists in segmenting objects simultaneously in several views. A key issue in that respect, compared to monocular settings, is to ensure the propagation of segmentation information between views while minimizing complexity and computational cost. In this work, we first investigate the idea that examining measurements at the projections of a sparse set of 3D points is sufficient to achieve this goal. The proposed algorithm softly assigns each of these 3D samples to the scene background if it projects on the background region in at least one view, or to the foreground if it projects on a foreground region in all views. Second, we show how other modalities such as depth may be seamlessly integrated in the model and benefit the segmentation. The paper exposes a detailed set of experiments used to validate the algorithm, showing results comparable with the state of the art, with reduced computational complexity. We also discuss the use of different modalities for specific situations, such as dealing with a low number of viewpoints or a scene with color ambiguities between foreground and background.
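The hard version of the assignment rule stated in this abstract (background if any view sees background, foreground only if all views see foreground) can be sketched as follows; this is a minimal illustration, not the authors' implementation, and the per-view labels are hypothetical inputs:

```python
# A 3D sample is assigned to the background if it projects onto the
# background region in at least one view, and to the foreground only
# if it projects onto a foreground region in all views.

def classify_sample(projections):
    """projections: per-view labels for one 3D sample, each 'fg' or 'bg'."""
    if any(p == "bg" for p in projections):
        return "background"
    return "foreground"

# Hypothetical sample projected into three views:
print(classify_sample(["fg", "fg", "fg"]))  # foreground
print(classify_sample(["fg", "bg", "fg"]))  # background
```

The paper's actual algorithm replaces these hard labels with soft per-view appearance probabilities, but the asymmetry of the rule is the same.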
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
Consistency Relations for the Conformal Mechanism
Creminelli, Paolo; Khoury, Justin; Simonović, Marko
2012-01-01
We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB mu-distortion.
Consistency relations for the conformal mechanism
Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, 34151, Trieste (Italy); Joyce, Austin; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: joyceau@sas.upenn.edu, E-mail: jkhoury@sas.upenn.edu, E-mail: marko.simonovic@sissa.it [SISSA, via Bonomea 265, 34136, Trieste (Italy)
2013-04-01
We systematically derive the consistency relations associated to the non-linearly realized symmetries of theories with spontaneously broken conformal symmetry but with a linearly-realized de Sitter subalgebra. These identities relate (N+1)-point correlation functions with a soft external Goldstone to N-point functions. These relations have direct implications for the recently proposed conformal mechanism for generating density perturbations in the early universe. We study the observational consequences, in particular a novel one-loop contribution to the four-point function, relevant for the stochastic scale-dependent bias and CMB μ-distortion.
Consistency of non-minimal renormalisation schemes
Jack, I
2016-01-01
Non-minimal renormalisation schemes such as the momentum subtraction scheme (MOM) have frequently been used for physical computations. The consistency of such a scheme relies on the existence of a coupling redefinition linking it to MSbar. We discuss the implementation of this procedure in detail for a general theory and show how to construct the relevant redefinition up to three-loop order, for the case of a general theory of fermions and scalars in four dimensions and a general scalar theory in six dimensions.
Gentzen's centenary the quest for consistency
Rathjen, Michael
2015-01-01
Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.
Consistent Predictions of Future Forest Mortality
McDowell, N. G.
2014-12-01
We examined empirical and model-based estimates of current and future forest mortality of conifers in the northern hemisphere. Consistent water potential thresholds were found that resulted in mortality of our case study species, piñon pine and one-seed juniper. Extending these results with IPCC climate scenarios suggests that most existing trees in this region (SW USA) will be dead by 2050. Further, independent estimates of future mortality for the entire coniferous biome suggest widespread mortality by 2100. The validity, assumptions, and implications of these results are discussed.
Surface consistent finite frequency phase corrections
Kimman, W. P.
2016-07-01
Static time-delay corrections are frequency independent and ignore velocity variations away from the assumed vertical ray path through the subsurface. There is therefore a clear potential for improvement if the finite frequency nature of wave propagation can be properly accounted for. Such a method is presented here based on the Born approximation, the assumption of surface consistency and the misfit of instantaneous phase. The concept of instantaneous phase lends itself very well for sweep-like signals, hence these are the focus of this study. Analytical sensitivity kernels are derived that accurately predict frequency-dependent phase shifts due to P-wave anomalies in the near surface. They are quick to compute and robust near the source and receivers. An additional correction is presented that re-introduces the nonlinear relation between model perturbation and phase delay, which becomes relevant for stronger velocity anomalies. The phase shift as function of frequency is a slowly varying signal, its computation therefore does not require fine sampling even for broad-band sweeps. The kernels reveal interesting features of the sensitivity of seismic arrivals to the near surface: small anomalies can have a relative large impact resulting from the medium field term that is dominant near the source and receivers. Furthermore, even simple velocity anomalies can produce a distinct frequency-dependent phase behaviour. Unlike statics, the predicted phase corrections are smooth in space. Verification with spectral element simulations shows an excellent match for the predicted phase shifts over the entire seismic frequency band. Applying the phase shift to the reference sweep corrects for wavelet distortion, making the technique akin to surface consistent deconvolution, even though no division in the spectral domain is involved. As long as multiple scattering is mild, surface consistent finite frequency phase corrections outperform traditional statics for moderately large
Are there consistent models giving observable NSI ?
Martinez, Enrique Fernandez
2013-01-01
While the existing direct bounds on neutrino NSI are rather weak, of order 10⁻¹ for propagation and 10⁻² for production and detection, the close connection, through gauge invariance, between these interactions and new interactions affecting the better-constrained charged lepton sector makes these bounds hard to saturate in realistic models. Indeed, Standard Model extensions leading to neutrino NSI typically imply constraints at the 10⁻³ level. The question of whether consistent models leading to observable neutrino NSI exist naturally arises, and was discussed in a dedicated session at NUFACT 11. Here we summarize that discussion.
Consistent thermodynamic properties of lipids systems
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works[1-3] have indicated a lack of experimental data for pure components and also for their mixtures...... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for Wilson, NRTL, UNIQUAC and original UNIFAC models. The relevance of enlarging experimental databank of lipids systems data in order to improve...
Consistency Checking of Web Service Contracts
Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter
2008-01-01
Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts...... are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Work Bench. The proposed techniques are illustrated with a case study that include otherwise difficult to analyze fault...
Consistent Data Assimilation of Isotopes: 242Pu and 105Pd
G. Palmiotti; H. Hiruta; M. Salvatores
2012-09-01
In this annual report we illustrate the methodology of consistent data assimilation, which allows one to use the information coming from integral experiments to improve the basic nuclear parameters used in cross-section evaluation. A series of integral experiments is analyzed using the EMPIRE evaluated files for 242Pu and 105Pd. In particular, irradiation experiments (PROFIL-1 and -2, TRAPU-1, -2 and -3) provide information about capture cross sections, and a critical configuration, COSMO, where fission spectral indexes were measured, provides information about the fission cross section. The observed discrepancies between calculated and experimental results are used in conjunction with the computed sensitivity coefficients and the covariance matrix for nuclear parameters in a consistent data assimilation. The results obtained indicate that fairly modest modifications of some key identified nuclear parameters yield reasonable C/E values. However, for some parameters such variations are outside the range of 1 σ of their initial standard deviation. This can indicate a possible conflict between the differential measurements (used to calculate the initial standard deviations) and the integral measurements used in the statistical data adjustment. Moreover, an inconsistency between the C/E values of two sets of irradiation experiments (PROFIL and TRAPU) is observed for 242Pu. This is the end of this project, funded by the Nuclear Physics Program of the DOE Office of Science. A proof of principle has been demonstrated for a few isotopes with this innovative methodology. However, we are still far from having explored all the possibilities and from being able to consider the methodology proven and robust. In particular, many issues are worth further investigation: • Non-linear effects • Flexibility of nuclear parameters in describing cross sections • Multi-isotope consistent assimilation • Consistency between differential and integral
Sludge characterization: the role of physical consistency
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
Physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to physical consistency as a characteristic to be evaluated in order to fulfil regulatory requirements. Further, many analytical methods for sludge specify different procedures depending on whether a sample is liquid or not, solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed in sludges, so the development of analytical procedures to define the boundary between liquid and paste-like behaviour (flowability) and between solid and paste-like behaviour (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)
Consistent mutational paths predict eukaryotic thermostability
van Noort Vera
2013-01-01
Background: Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results: Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions: The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
Viewpoint Consistency: An Eye Movement Study
Filipe Cristino
2012-05-01
Eye movements have been widely studied, using images and videos in laboratories or portable eye trackers in the real world. Although a good understanding of the saccadic system and extensive models of gaze have been developed over the years, only a few studies have focused on the consistency of eye movements across viewpoints. We have developed a new technique to compute and map the depth of collected eye movements on stimuli rendered from 3D mesh objects using a traditional corneal reflection eye tracker (SR EyeLink 1000). Having eye movements mapped into 3D space (and not onto an image space) allowed us to compare fixations across viewpoints. Fixation sequences (scanpaths) were also studied across viewpoints using the ScanMatch method (Cristino et al 2010, Behavior Research Methods 42, 692–700), extended to work with 3D eye movements. In a set of experiments where participants were asked to perform a recognition task on either a set of objects or faces, we recorded their gaze while they performed the task. Participants either viewed the stimuli in 2D or using anaglyph glasses. The stimuli were shown from different viewpoints during the learning and testing phases. A high degree of gaze consistency was found across the different viewpoints, particularly between learning and testing phases. Scanpaths were also similar across viewpoints, suggesting not only that the gazed spatial locations are alike, but also that their temporal order is.
Subgame consistent cooperation a comprehensive treatise
Yeung, David W K
2016-01-01
Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustained if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of the cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until the end, or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation, covering the up-to-date state of the art analyses in this important topic. It sets out to provide the theory, solution tec...
Merging By Decentralized Eventual Consistency Algorithms
Ahmed-Nacer Mehdi
2015-12-01
Merging is an essential operation for version control systems. When each member of a collaborative development works on an individual copy of the project, software merging allows modifications made concurrently to be reconciled, as well as managing software change through branching. The collaborative system is in charge of proposing a merge result that includes the users' modifications. The users then have to check and adapt this result. The adaptation should be as effortless as possible; otherwise, the users may get frustrated and quit the collaboration. This paper aims to reduce conflicts during collaboration and improve productivity. It has three objectives: study the users' behavior during collaboration, evaluate the quality of textual merge results produced by specific algorithms, and propose a solution to improve the result quality produced by the default merge tool of distributed version control systems. Through a study of eight open-source repositories totaling more than 3 million lines of code, we observe the behavior of concurrent modifications during the merge procedure. We identified when the existing merge techniques under-perform, and we propose solutions to improve the quality of the merge. We finally compare with the traditional merge tool through a large corpus of collaborative editing.
Interdisciplinary research has consistently lower funding success.
Bromham, Lindell; Dinnage, Russell; Hua, Xia
2016-06-30
Interdisciplinary research is widely considered a hothouse for innovation, and the only plausible approach to complex problems such as climate change. One barrier to interdisciplinary research is the widespread perception that interdisciplinary projects are less likely to be funded than those with a narrower focus. However, this commonly held belief has been difficult to evaluate objectively, partly for lack of a comparable, quantitative measure of the degree of interdisciplinarity that can be applied to funding application data. Here we compare the degree to which research proposals span disparate fields by using a biodiversity metric that captures the relative representation of different fields (balance) and their degree of difference (disparity). The Australian Research Council's Discovery Programme provides an ideal test case, because a single annual nationwide competitive grants scheme covers fundamental research in all disciplines, including arts, humanities and sciences. Using data on all 18,476 proposals submitted to the scheme over 5 consecutive years, including successful and unsuccessful applications, we show that the greater the degree of interdisciplinarity, the lower the probability of being funded. The negative impact of interdisciplinarity is significant even when the number of collaborators, primary research field and type of institution are taken into account. This is the first broad-scale quantitative assessment of success rates of interdisciplinary research proposals. The interdisciplinary distance metric allows efficient evaluation of trends in research funding, and could be used to identify proposals that require assessment strategies appropriate to interdisciplinary research.
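A metric combining balance and disparity, as described in this abstract, can be sketched as a Rao-style quadratic entropy; this is an illustration under that assumption, not necessarily the authors' exact formula, and the field proportions and disparity values below are hypothetical:

```python
def interdisciplinarity(proportions, disparity):
    """Rao-style quadratic entropy: sum_ij p_i * p_j * d_ij.
    proportions: share of each research field in a proposal (sums to 1).
    disparity: symmetric matrix of pairwise field differences in [0, 1]."""
    n = len(proportions)
    return sum(proportions[i] * proportions[j] * disparity[i][j]
               for i in range(n) for j in range(n))

# Hypothetical proposal spanning two fields with disparity 0.8:
d = [[0.0, 0.8], [0.8, 0.0]]
print(interdisciplinarity([0.5, 0.5], d))  # balanced, distant fields score high
print(interdisciplinarity([0.9, 0.1], d))  # an unbalanced mix scores lower
```

The metric is zero for a single-field proposal and grows both with the evenness of the field mix and with how different the fields are, which is exactly the balance/disparity behaviour the abstract describes.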
Consistent evolution in a pedestrian flow
Guan, Junbiao; Wang, Kaihua
2016-03-01
In this paper, pedestrian evacuation considering different human behaviors is studied using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. A large number of numerical simulations show that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may be largely due to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think independently and rationally, are two necessary factors for a short evacuation time.
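For context, the snowdrift game underlying such models is defined by a standard payoff structure; the sketch below uses the textbook form with hypothetical parameter values (b, c), not those of the paper:

```python
# Standard snowdrift payoffs with benefit b > cost c > 0:
# mutual cooperation shares the cost, a lone cooperator bears it fully,
# a defector free-rides on a cooperator, and mutual defection gains nothing.
b, c = 1.0, 0.6  # hypothetical values

def payoff(me, other):
    if me == "C" and other == "C":
        return b - c / 2
    if me == "C" and other == "D":
        return b - c
    if me == "D" and other == "C":
        return b
    return 0.0

# A defector meeting a cooperator earns more than the cooperator does,
# which is what sustains a mixed cooperator/defector population.
print(payoff("D", "C") > payoff("C", "D"))  # True
```

In the CA model, each evacuee plays this game with its neighbours and updates its strategy accordingly; the abstract's "consistent states" refer to the stable cooperator/defector ratios that this dynamics self-organizes into.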
Consistency of warm k-inflation
Peng, Zhi-Peng; Zhang, Xiao-Min; Zhu, Jian-Yang
2016-01-01
We extend k-inflation, a type of kinetically driven inflationary model under the standard inflationary scenario, to a possible warm inflationary scenario. The dynamical equations of this warm k-inflation model are obtained. We rewrite the slow-roll parameters, which differ from those of the usual potential-driven inflationary models, and perform a linear stability analysis to give the proper slow-roll conditions in warm k-inflation. Two cases, a power-law kinetic function and an exponential kinetic function, are studied, with the dissipative coefficient $\Gamma=\Gamma_0$ and $\Gamma=\Gamma(\phi)$, respectively. A proper number of e-folds is obtained in both concrete cases of warm k-inflation. We find that a constant dissipative coefficient ($\Gamma=\Gamma_0$) is not a workable choice for these two cases, while the two cases with $\Gamma=\Gamma(\phi)$ are self-consistent warm inflationary models.
Compact difference approximation with consistent boundary condition
FU Dexun; MA Yanwen; LI Xinliang; LIU Mingyu
2003-01-01
For simulating multi-scale complex flow fields, it should be noted that all the physical quantities of interest must be simulated well. Given the limitations of computer resources, high-order accurate difference schemes are preferred. Because of their high accuracy and small grid-point stencils, compact schemes have recently received increasing attention from computational fluid dynamics (CFD) researchers. For simulating complex flow fields, the treatment of boundary conditions at and near the far-field boundary points is very important. Based on the authors' experience and published results, some aspects of boundary condition treatment for the far-field boundary are presented, with emphasis on the treatment of boundary conditions for upwind compact schemes. The consistent treatment of boundary conditions at the near-boundary points is also discussed. Some numerical examples are given at the end of the paper. The results computed with the presented method are satisfactory.
Reliability and Consistency of Surface Contamination Measurements
Rouppert, F.; Rivoallan, A.; Largeron, C.
2002-02-26
Surface contamination evaluation is a difficult problem, since it is hard to isolate the radiation emitted by the surface, especially in a highly irradiating atmosphere. In that case the only possibility is to evaluate smearable (removable) contamination, since ex-situ counting is possible. Unfortunately, according to our experience at CEA, these values are not consistent and thus not relevant. In this study we show, using in-situ Fourier transform infrared spectrometry on contaminated metal samples, that fixed contamination appears to be chemisorbed and removable contamination appears to be physisorbed. The distribution between fixed and removable contamination appears to be variable. Chemical equilibria and reversible ion-exchange mechanisms are involved, and they are closely linked to environmental conditions such as humidity and temperature. Measurements of smearable contamination only give an indication of the state of these equilibria between fixed and removable contamination at the time, and under the environmental conditions, in which the measurements were made.
Trisomy 21 consistently activates the interferon response.
Sullivan, Kelly D; Lewis, Hannah C; Hill, Amanda A; Pandey, Ahwan; Jackson, Leisa P; Cabral, Joseph M; Smith, Keith P; Liggett, L Alexander; Gomez, Eliana B; Galbraith, Matthew D; DeGregori, James; Espinosa, Joaquín M
2016-07-29
Although it is clear that trisomy 21 causes Down syndrome, the molecular events acting downstream of the trisomy remain ill defined. Using complementary genomics analyses, we identified the interferon pathway as the major signaling cascade consistently activated by trisomy 21 in human cells. Transcriptome analysis revealed that trisomy 21 activates the interferon transcriptional response in fibroblast and lymphoblastoid cell lines, as well as circulating monocytes and T cells. Trisomy 21 cells show increased induction of interferon-stimulated genes and decreased expression of ribosomal proteins and translation factors. An shRNA screen determined that the interferon-activated kinases JAK1 and TYK2 suppress proliferation of trisomy 21 fibroblasts, and this defect is rescued by pharmacological JAK inhibition. Therefore, we propose that interferon activation, likely via increased gene dosage of the four interferon receptors encoded on chromosome 21, contributes to many of the clinical impacts of trisomy 21, and that interferon antagonists could have therapeutic benefits.
On the consistent use of Constructed Observables
Trott, Michael
2015-01-01
We define "constructed observables" as relating experimental measurements to terms in a Lagrangian while simultaneously making assumptions about possible deviations from the Standard Model (SM) in other Lagrangian terms. Ensuring that the SM effective field theory (EFT) is constrained correctly when using constructed observables requires that their defining conditions are imposed on the EFT in a manner that is consistent with the equations of motion. Failing to do so can result in a "functionally redundant" operator basis and the wrong expectation as to how experimental quantities are related in the EFT. We illustrate the issues involved by considering the $\\rm S$ parameter and the off-shell triple gauge coupling (TGC) vertices. We show that the relationships between $h \\rightarrow V \\bar{f} \\, f$ decay and the off-shell TGC vertices are subject to these subtleties, and how the connections between these observables vanish in the limit of strong bounds due to LEP. The challenge of using constructed observables...
Consistently weighted measures for complex network topologies
Heitzig, Jobst; Zou, Yong; Marwan, Norbert; Kurths, Jürgen
2011-01-01
When network and graph theory are used in the study of complex systems, a typically finite set of nodes of the network under consideration is frequently either explicitly or implicitly considered representative of a much larger finite or infinite set of objects of interest. The selection procedure, e.g., formation of a subset or some kind of discretization or aggregation, typically results in individual nodes of the studied network representing quite differently sized parts of the domain of interest. This heterogeneity may induce substantial bias and artifacts in derived network statistics. To avoid this bias, we propose an axiomatic scheme based on the idea of {\\em node splitting invariance} to derive consistently weighted variants of various commonly used statistical network measures. The practical relevance and applicability of our approach is demonstrated for a number of example networks from different fields of research, and is shown to be of fundamental importance in particular in the study of climate n...
Consistent 4-form fluxes for maximal supergravity
Godazgar, Hadi; Krueger, Olaf; Nicolai, Hermann
2015-01-01
We derive new ansaetze for the 4-form field strength of D=11 supergravity corresponding to uplifts of four-dimensional maximal gauged supergravity. In particular, the ansaetze directly yield the components of the 4-form field strength in terms of the scalars and vectors of the four-dimensional maximal gauged supergravity---in this way they provide an explicit uplift of all four-dimensional consistent truncations of D=11 supergravity. The new ansaetze provide a substantially simpler method for uplifting d=4 flows compared to the previously available method using the 3-form and 6-form potential ansaetze. The ansatz for the Freund-Rubin term allows us to conjecture a `master formula' for the latter in terms of the scalar potential of d=4 gauged supergravity and its first derivative. We also resolve a long-standing puzzle concerning the antisymmetry of the flux obtained from uplift ansaetze.
Quantum cosmological consistency condition for inflation
Calcagni, Gianluca [Instituto de Estructura de la Materia, CSIC, calle Serrano 121, 28006 Madrid (Spain); Kiefer, Claus [Institut für Theoretische Physik, Universität zu Köln, Zülpicher Strasse 77, 50937 Köln (Germany); Steinwachs, Christian F., E-mail: calcagni@iem.cfmac.csic.es, E-mail: kiefer@thp.uni-koeln.de, E-mail: christian.steinwachs@physik.uni-freiburg.de [Physikalisches Institut, Albert-Ludwigs-Universität Freiburg, Hermann-Herder-Str. 3, 79104 Freiburg (Germany)]
2014-10-01
We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.
Consistent Stochastic Modelling of Meteocean Design Parameters
Sørensen, John Dalsgaard; Sterndorff, M. J.
2000-01-01
Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current velocity, and water level is presented. The stochastic model includes statistical uncertainty and dependency between the four stochastic variables. Further, a new stochastic model for annual maximum directional significant wave heights is presented. The model includes dependency between the maximum wave height from neighboring directional sectors. Numerical examples are presented where the models are calibrated using the Maximum Likelihood method to data from the central part of the North Sea. The calibration of the directional distributions is made such that the stochastic model for the omnidirectional...
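The maximum-likelihood calibration of an annual-maximum distribution can be sketched with synthetic data (not the North Sea measurements; the Gumbel family is a common choice for annual maxima, though the paper's exact distribution model may differ, and the parameter values below are arbitrary):

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)
# Synthetic "annual maximum significant wave heights" (metres), 200 years.
hs_max = gumbel_r.rvs(loc=7.0, scale=1.2, size=200, random_state=rng)

# Maximum-likelihood estimates of the location and scale parameters.
loc_hat, scale_hat = gumbel_r.fit(hs_max)

# A design value: the 100-year return level (annual exceedance prob. 1/100).
h100 = gumbel_r.ppf(1.0 - 1.0 / 100.0, loc=loc_hat, scale=scale_hat)
```

A directional or multivariate model of the kind the paper describes would replace the single fitted marginal with jointly calibrated, dependent distributions per sector.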
Internal Branding and Employee Brand Consistent Behaviours
Mazzei, Alessandra; Ravazzani, Silvia
2017-01-01
Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non-normative and constitutive approach to internal branding by proposing an enablement-oriented communication approach. The conceptual background presents a holistic model of the inside-out process of brand building. This model adopts a theoretical approach to internal branding as a non-normative practice that facilitates constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the non-normative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers...
Quantum cosmological consistency condition for inflation
Calcagni, Gianluca; Steinwachs, Christian F
2014-01-01
We investigate the quantum cosmological tunneling scenario for inflationary models. Within a path-integral approach, we derive the corresponding tunneling probability distribution. A sharp peak in this distribution can be interpreted as the initial condition for inflation and therefore as a quantum cosmological prediction for its energy scale. This energy scale is also a genuine prediction of any inflationary model by itself, as the primordial gravitons generated during inflation leave their imprint in the B-polarization of the cosmic microwave background. In this way, one can derive a consistency condition for inflationary models that guarantees compatibility with a tunneling origin and can lead to a testable quantum cosmological prediction. The general method is demonstrated explicitly for the model of natural inflation.
Thermodynamically consistent model calibration in chemical kinetics
Goutsias John
2011-05-01
Background: The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results: We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions: TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function, as well as to estimate thermodynamically feasible values for the parameters of new
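The flavour of such thermodynamic constraints can be shown with a toy example (an illustration, not the TCMC algorithm itself): around any closed reaction cycle, detailed balance requires the product of forward/reverse rate-constant ratios to equal one, i.e. the sum of their logarithms to vanish (the Wegscheider condition). A minimal least-squares repair projects fitted log-ratios onto that constraint:

```python
import numpy as np

def enforce_cycle_constraint(log_ratios):
    """Minimally adjust log(kf/kr) values around one reaction cycle
    so that they sum to zero (Wegscheider condition).

    The orthogonal projection of x onto {y : sum(y) = 0} is x - mean(x),
    so this is the smallest L2 change restoring thermodynamic feasibility.
    """
    x = np.asarray(log_ratios, dtype=float)
    return x - x.mean()

# Fitted log(kf/kr) for a 3-reaction cycle; they sum to 0.3, so the
# fitted model is thermodynamically infeasible.
fitted = np.array([1.0, -0.5, -0.2])
feasible = enforce_cycle_constraint(fitted)
```

TCMC as described in the abstract solves a full constrained optimization over all parameters and cycles; this projection shows only the core idea on a single cycle.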
A PROJECT WITHIN MICROSOFT PROJECT 2007
Emil COSMA
2009-10-01
The main purpose of this article is to highlight the many advantages that the Microsoft Project 2007 environment offers a project manager. Project management is a function recognized in most domains. A project is defined as "a temporary endeavour undertaken to create a unique product or service". A project-administration programme within an information system (such as Microsoft Project or Primavera Planner) represents a "time-phased database". It should help carry out the required operations and, at the same time, look and behave like other frequently used productivity programmes. It keeps track of all information regarding work requirements, duration and the project's needed resources; visualizes the project plan in standard, well-defined formats; organizes activities and resources consistently and efficiently; shares project information with all persons involved over an intranet or the Internet; and communicates efficiently with the resources and other persons involved, while leaving the final control and decisions to the project manager as his or her responsibility.
Consistent lattice Boltzmann equations for phase transitions.
Siebert, D N; Philippi, P C; Mattila, K K
2014-11-01
Unlike conventional computational fluid dynamics methods, the lattice Boltzmann method (LBM) describes the dynamic behavior of fluids in a mesoscopic scale based on discrete forms of kinetic equations. In this scale, complex macroscopic phenomena like the formation and collapse of interfaces can be naturally described as related to source terms incorporated into the kinetic equations. In this context, a novel athermal lattice Boltzmann scheme for the simulation of phase transition is proposed. The continuous kinetic model obtained from the Liouville equation using the mean-field interaction force approach is shown to be consistent with diffuse interface model using the Helmholtz free energy. Density profiles, interface thickness, and surface tension are analytically derived for a plane liquid-vapor interface. A discrete form of the kinetic equation is then obtained by applying the quadrature method based on prescribed abscissas together with a third-order scheme for the discretization of the streaming or advection term in the Boltzmann equation. Spatial derivatives in the source terms are approximated with high-order schemes. The numerical validation of the method is performed by measuring the speed of sound as well as by retrieving the coexistence curve and the interface density profiles. The appearance of spurious currents near the interface is investigated. The simulations are performed with the equations of state of Van der Waals, Redlich-Kwong, Redlich-Kwong-Soave, Peng-Robinson, and Carnahan-Starling.
Exploring the Consistent behavior of Information Services
Kapidakis Sarantos
2016-01-01
Computer services are normally assumed to work well all the time. This is usually true for crucial services such as bank electronic services, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that would help predict the consistency of their behaviour and the quality of the harvesting, which is harder because of transient conditions, the many services involved, and the huge amount of harvested information. We found many unexpected situations. Services that always successfully satisfy a request may in fact return only part of it. A significant portion of the OAI services have ceased working, while many other servers occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others also fail always or sometimes, but not in the same way, and we hope that their behaviour is affected by temporary factors that may improve later on. We categorized the services into classes in order to study their behaviour in more detail.
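The classification described above can be sketched as follows (the class labels and probe outcomes are hypothetical illustrations, not the paper's actual taxonomy):

```python
def classify_service(probe_results):
    """Classify a harvested service from a list of probe outcomes.

    Each outcome is a string: 'ok', or an error label such as 'timeout'
    or 'http-500'. Labels here are illustrative only.
    """
    errors = [r for r in probe_results if r != "ok"]
    if not errors:
        return "healthy"
    if len(errors) == len(probe_results):
        # Always fails; if it always fails the same way, call it dead,
        # since no temporary factor seems to explain the behaviour.
        return "dead" if len(set(errors)) == 1 else "unstable"
    return "transient"  # sometimes works: possibly temporary factors

assert classify_service(["ok", "ok"]) == "healthy"
assert classify_service(["http-500", "http-500"]) == "dead"
assert classify_service(["timeout", "http-500"]) == "unstable"
assert classify_service(["ok", "timeout"]) == "transient"
```

A real harvester would additionally track partial responses (the "successful but incomplete" case the abstract notes), which this sketch omits.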
A Consistent Phylogenetic Backbone for the Fungi
Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt
2012-01-01
The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
Volume Haptics with Topology-Consistent Isosurfaces.
Corenthy, Loïc; Otaduy, Miguel A.; Pastor, Luis; Garcia, Marcos
2015-01-01
Haptic interfaces offer an intuitive way to interact with and manipulate 3D datasets, and may simplify the interpretation of visual information. This work proposes an algorithm to provide haptic feedback directly from volumetric datasets, as an aid to regular visualization. The haptic rendering algorithm lets users perceive isosurfaces in volumetric datasets, and it relies on several design features that ensure a robust and efficient rendering. A marching tetrahedra approach enables the dynamic extraction of a piecewise linear continuous isosurface. Robustness is achieved using a continuous collision detection step coupled with state-of-the-art proxy-based rendering methods over the extracted isosurface. The introduced marching tetrahedra approach guarantees that the extracted isosurface will match the topology of an equivalent isosurface computed using trilinear interpolation. The proposed haptic rendering algorithm improves the consistency between haptic and visual cues computing a second proxy on the isosurface displayed on screen. Our experiments demonstrate the improvements on the isosurface extraction stage as well as the robustness and the efficiency of the complete algorithm.
Consistency between GRUAN sondes, LBLRTM and IASI
X. Calbet
2017-06-01
Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with radiances measured by the Infrared Atmospheric Sounding Interferometer (IASI) via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in their respective fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Retrocausation, Consistency, and the Bilking Paradox
Dobyns, York H.
2011-11-01
Retrocausation seems to admit of time paradoxes in which events prevent themselves from occurring and thereby create a physical instance of the liar's paradox, an event which occurs iff it does not occur. The specific version in which a retrocausal event is used to trigger an intervention which prevents its own future cause is called the bilking paradox (the event is bilked of its cause). The analysis of Echeverria, Klinkhammer, and Thorne (EKT) suggests time paradoxes cannot arise even in the presence of retrocausation. Any self-contradictory event sequence will be replaced in reality by a closely related but noncontradictory sequence. The EKT analysis implies that attempts to create bilking must instead produce logically consistent sequences wherein the bilked event arises from alternative causes. Bilking a retrocausal information channel of limited reliability usually results only in failures of signaling. An exception applies when the bilking is conducted in response only to some of the signal values that can be carried on the channel. Theoretical analysis based on EKT predicts that, since some of the channel outcomes are not bilked, the channel is capable of transmitting data with its normal reliability, and the paradox-avoidance effects will instead suppress the outcomes that would lead to forbidden (bilked) transmissions. A recent parapsychological experiment by Bem displays a retrocausal information channel of sufficient reliability to test this theoretical model of physical reality's response to retrocausal effects. A modified version with partial bilking would provide a direct test of the generality of the EKT mechanism.
Ciliate communities consistently associated with coral diseases
Sweet, M. J.; Séré, M. G.
2016-07-01
Incidences of coral disease are increasing. Most studies which focus on diseases in these organisms routinely assess variations in bacterial associates. However, other microorganism groups such as viruses, fungi and protozoa are only recently starting to receive attention. This study aimed at assessing the diversity of ciliates associated with coral diseases over a wide geographical range. Here we show that a wide variety of ciliates are associated with all nine coral diseases assessed. Many of these ciliates such as Trochilia petrani and Glauconema trihymene feed on the bacteria which are likely colonizing the bare skeleton exposed by the advancing disease lesion or the necrotic tissue itself. Others such as Pseudokeronopsis and Licnophora macfarlandi are common predators of other protozoans and will be attracted by the increase in other ciliate species to the lesion interface. However, a few ciliate species (namely Varistrombidium kielum, Philaster lucinda, Philaster guamense, a Euplotes sp., a Trachelotractus sp. and a Condylostoma sp.) appear to harbor symbiotic algae, potentially from the coral themselves, a result which may indicate that they play some role in the disease pathology at the very least. Although, from this study alone we are not able to discern what roles any of these ciliates play in disease causation, the consistent presence of such communities with disease lesion interfaces warrants further investigation.
2011-01-01
Project name: 90,000 t/a BR device and auxiliary projects. Construction unit: Sinopec Beijing Yanshan Petrochemical Company. Total investment: 2.257 billion yuan. Project description: It will cover an area of 14.1 ha.
Improving electrofishing catch consistency by standardizing power
Burkhardt, Randy W.; Gutreuter, Steve
1995-01-01
The electrical output of electrofishing equipment is commonly standardized by using either constant voltage or constant amperage. However, simplified circuit and wave theories of electricity suggest that standardization of power (wattage) available for transfer from water to fish may be critical for effective standardization of electrofishing. Electrofishing with standardized power ensures that constant power is transferable to fish regardless of water conditions. The in situ performance of standardized power output is poorly known. We used data collected by the interagency Long Term Resource Monitoring Program (LTRMP) in the upper Mississippi River system to assess the effectiveness of standardizing power output. The data consisted of 278 electrofishing collections, comprising 9,282 fishes in eight species groups, obtained during 1990 from main channel border, backwater, and tailwater aquatic areas in four reaches of the upper Mississippi River and one reach of the Illinois River. Variation in power output explained an average of 14.9% of catch variance for night electrofishing and 12.1% for day electrofishing. Three patterns in catch per unit effort were observed for different species: increasing catch with increasing power, decreasing catch with increasing power, and no power-related pattern. Therefore, in addition to reducing catch variation, controlling power output may provide some capability to select particular species. The LTRMP adopted standardized power output beginning in 1991; standardized power output is adjusted for variation in water conductivity and water temperature by reference to a simple chart. Our data suggest that by standardizing electrofishing power output, the LTRMP has eliminated substantial amounts of catch variation at virtually no additional cost.
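The conductivity adjustment can be sketched with a simplified power-transfer argument (a common textbook treatment of the idea; the effective fish conductivity below is an assumed illustrative value, and the LTRMP chart may embody different constants):

```python
def applied_power_goal(goal_watts, water_cond_uS, fish_cond_uS=115.0):
    """Power (W) to apply so that a constant power is transferable to fish.

    Simplified power-transfer reasoning: the fraction of applied power
    transferable to fish peaks when water conductivity Cw equals the
    effective fish conductivity Cf, with efficiency
    4*Cf*Cw / (Cf + Cw)**2. The default Cf = 115 uS/cm is an assumption
    for illustration, not a value taken from the paper.
    """
    cw, cf = water_cond_uS, fish_cond_uS
    efficiency = 4.0 * cf * cw / (cf + cw) ** 2
    return goal_watts / efficiency
```

At matched conductivities the applied power equals the goal; in high- or low-conductivity water the generator output must be raised to keep the power transferable to fish constant, which is exactly the standardization the abstract describes.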
Are paleoclimate model ensembles consistent with the MARGO data synthesis?
J. C. Hargreaves
2011-03-01
We investigate the consistency of various ensembles of model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day; however, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.
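A basic reliability test for an ensemble can be sketched with a rank histogram (synthetic data; the paper's actual statistics and its treatment of observational error are more elaborate):

```python
import numpy as np

def rank_histogram(observations, ensemble, obs_sigma=0.0, rng=None):
    """Rank of each observation within the corresponding ensemble members.

    For a reliable ensemble the observation is statistically
    indistinguishable from the members, so ranks are uniform on
    0..n_members. Observational uncertainty can be folded in by
    perturbing the members with the observational error (obs_sigma).
    """
    rng = rng or np.random.default_rng(0)
    ens = ensemble + rng.normal(0.0, obs_sigma, ensemble.shape)
    return (ens < observations[:, None]).sum(axis=1)

rng = np.random.default_rng(1)
n_obs, n_mem = 2000, 9
truth = rng.normal(0.0, 1.0, n_obs)
members = rng.normal(0.0, 1.0, (n_obs, n_mem))  # drawn from the same law
ranks = rank_histogram(truth, members)
counts = np.bincount(ranks, minlength=n_mem + 1)
# Reliable case: the 10 rank bins each hold roughly 200 observations.
```

A U-shaped histogram would indicate an under-dispersive ensemble and a dome-shaped one over-dispersion; neglecting obs_sigma biases the test toward diagnosing under-dispersion, which is why the abstract stresses accounting for observational uncertainty.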
Are paleoclimate model ensembles consistent with the MARGO data synthesis?
J. C. Hargreaves
2011-08-01
We investigate the consistency of various ensembles of climate model simulations with the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) sea surface temperature data synthesis. We discover that while two multi-model ensembles, created through the Paleoclimate Model Intercomparison Projects (PMIP and PMIP2), pass our simple tests of reliability, an ensemble based on parameter variation in a single model does not perform so well. We show that accounting for observational uncertainty in the MARGO database is of prime importance for correctly evaluating the ensembles. Perhaps surprisingly, the inclusion of a coupled dynamical ocean (compared to the use of a slab ocean) does not appear to cause a wider spread in the sea surface temperature anomalies, but rather causes systematic changes with more heat transported north in the Atlantic. There is weak evidence that the sea surface temperature data may be more consistent with meridional overturning in the North Atlantic being similar for the LGM and the present day. However, the small size of the PMIP2 ensemble prevents any statistically significant results from being obtained.
Project Management Theory Meets Practice contains the proceedings from the 1st Danish Project Management Research Conference (DAPMARC 2015), held in Copenhagen, Denmark, on May 21st, 2015.
Munk-Madsen, Andreas
2005-01-01
"Project" is a key concept in IS management. The word is frequently used in textbooks and standards. Yet we seldom find a precise definition of the concept. This paper discusses how to define the concept of a project. The proposed definition covers both heavily formalized projects and informally organized, agile projects. Based on the proposed definition, popular existing definitions are discussed.
Structural Consistency: Enabling XML Keyword Search to Eliminate Spurious Results Consistently
Lee, Ki-Hoon; Han, Wook-Shin; Kim, Min-Soo
2009-01-01
XML keyword search is a user-friendly way to query XML data using only keywords. In XML keyword search, to achieve high precision without sacrificing recall, it is important to remove spurious results not intended by the user. Efforts to eliminate spurious results have enjoyed some success by using the concepts of LCA or its variants, SLCA and MLCA. However, existing methods still could find many spurious results. The fundamental cause for the occurrence of spurious results is that the existing methods try to eliminate spurious results locally without global examination of all the query results and, accordingly, some spurious results are not consistently eliminated. In this paper, we propose a novel keyword search method that removes spurious results consistently by exploiting the new concept of structural consistency.
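The LCA idea underlying such methods can be sketched with Dewey-style node labels, where each XML node is identified by its path of child positions and the LCA of two nodes is their longest common prefix (a toy illustration of the SLCA concept, not the paper's structural-consistency algorithm):

```python
def lca(a, b):
    """Lowest common ancestor of two nodes given as Dewey label tuples."""
    out = []
    for x, y in zip(a, b):
        if x != y:
            break
        out.append(x)
    return tuple(out)

def slca(nodes1, nodes2):
    """Smallest LCAs: LCAs of keyword-node pairs such that no other LCA
    is a proper descendant (brute-force O(n*m) toy version)."""
    all_lcas = {lca(a, b) for a in nodes1 for b in nodes2}
    return {v for v in all_lcas
            if not any(w != v and w[:len(v)] == v for w in all_lcas)}

# Keyword 1 occurs at /0/0/0 and /0/1/0; keyword 2 occurs at /0/0/1.
# The SLCA is the node /0/0, since the shallower LCA /0 has /0/0 as a
# descendant and is therefore likely a spurious result.
result = slca([(0, 0, 0), (0, 1, 0)], [(0, 0, 1)])
```

The paper's point is that locally computed (S)LCAs can still admit spurious results; its structural-consistency criterion filters them by a global examination that a per-pair computation like this one cannot perform.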
Pilkington, Alan; Chai, Kah-Hin; Le, Yang
2015-01-01
This paper identifies the true coverage of PM theory through a bibliometric analysis of the International Journal of Project Management from 1996-2012. We identify six persistent research themes: project time management, project risk management, programme management, large-scale project management, project success/failure and practitioner development. These differ from those presented in review and editorial articles in the literature. In addition, topics missing from the PM BOK: knowledge management, project-based organization and project portfolio management have become more popular topics...
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from
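Time consistency in this setting means evaluating a multi-period payoff by composing one-period evaluations backward through time. A toy sketch on a recombining binomial tree, using a mean-plus-standard-deviation one-period operator (an illustrative actuarial premium principle with arbitrary parameters, not the operators studied in the paper):

```python
import math

def one_period_value(up, down, p=0.5, lam=0.25):
    """Illustrative one-period evaluation: mean plus a std-dev loading."""
    mean = p * up + (1 - p) * down
    var = p * (up - mean) ** 2 + (1 - p) * (down - mean) ** 2
    return mean + lam * math.sqrt(var)

def time_consistent_value(payoffs):
    """Backward recursion over a recombining binomial tree.

    payoffs: terminal payoffs listed from the top path down
             (length n+1 for an n-period tree).
    Applying the same one-period operator at every node makes the
    resulting multi-period evaluation time-consistent by construction.
    """
    level = list(payoffs)
    while len(level) > 1:
        level = [one_period_value(level[i], level[i + 1])
                 for i in range(len(level) - 1)]
    return level[0]

# Two-period example: terminal payoffs 4, 2, 0.
v = time_consistent_value([4.0, 2.0, 0.0])
```

Here the recursive value (2.5) exceeds the plain expected payoff (2.0) because each one-period step adds a risk loading; a market-consistent evaluation would instead calibrate the one-period operator so that traded payoffs reproduce their market prices.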
Prometheus Project final report
Taylor, Randall
2005-01-01
This Final Report serves as an executive summary of the Prometheus Project's activities and deliverables from November 2002 through September 2005. It focuses on the challenges from a technical and management perspective, what was different and innovative about this project, and identifies the major options, decisions, and accomplishments of the Project team as a whole. However, the details of the activities performed by DOE NR and its contractors will be documented separately in accordance with closeout requirements of the DOE NR and consistent with agreements between NASA and NR.
H. Boado Magan
2011-01-01
CAREM is a CNEA (Comisión Nacional de Energía Atómica) project consisting of the development, design, and construction of a small nuclear power plant. First, a prototype with an electrical output of about 27 MW, CAREM 25, will be built in order to validate the innovations of the CAREM concept, which will then be developed into a commercial version. After several years of development, the CAREM project reached such a level of maturity that the Argentine government decided on the construction of the CAREM prototype. Several activities are ongoing with the purpose of obtaining the construction permit for the CAREM prototype.
Santer, Richard P.; Fell, Frank
2003-05-01
), combining satellite data, evaluation algorithms and value-adding ancillary digital information. This spares the end user from investing funds in expensive equipment or hiring specialized personnel. The data processor shall be a generic tool, which may be applied to a large variety of operationally gathered satellite data. In the frame of SISCAL, the processor shall be applied to remotely sensed data of selected coastal areas and lakes in Central Europe and the Eastern Mediterranean, according to the needs of the end users within the SISCAL consortium. A number of measures are required to achieve the objective of the proposed project: (1) Identification and specification of the SISCAL end user needs for NRT water related data products accessible to EO techniques. (2) Selection of the most appropriate instruments, evaluation algorithms and ancillary data bases required to provide the identified data products. (3) Development of the actual Near-Real-Time data processor for the specified EO data products. (4) Development of the GIS processor adding ancillary digital information to the satellite images and providing the required geographical projections. (5) Development of a product retrieval and management system to handle ordering and distribution of data products between the SISCAL server and the end users, including payment and invoicing. (6) Evaluation of the derived data products in terms of accuracy and usefulness by comparison with available in-situ measurements and by making use of the local expertise of the end users. (7) Establishing an Internet server dedicated to internal communication between the consortium members as well as presenting the SISCAL project to a larger public. (8) Marketing activities, presentation of the data processor to potential external customers, identification of their exact needs. The innovative aspect of the SISCAL project consists in the generation of NRT data products on water quality parameters from EO data.
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals’ selfie corpora.
Hughitt, Brian; Generazio, Edward (Principal Investigator); Nichols, Charles; Myers, Mika (Principal Investigator); Spencer, Floyd (Principal Investigator); Waller, Jess (Principal Investigator); Wladyka, Jordan (Principal Investigator); Aldrin, John; Burke, Eric; Cerecerez, Laura; Corbett, Judy; George, Jill; Hodges, Kenneth; Jones, Justin; Parker, Bradford; Petry, Jennifer
2016-01-01
NASA-STD-5009 requires that successful flaw detection by NDE methods be statistically qualified for use on fracture critical metallic components, but does not standardize practices. This task works towards standardizing calculations and record retention with a web-based tool, the NNWG POD Standards Library or NPSL. Test methods will also be standardized with an appropriately flexible appendix to -5009 identifying best practices. Additionally, this appendix will describe how specimens used to qualify NDE systems will be cataloged, stored and protected from corrosion, damage, or loss.
report. Details on the background of the project, the fieldwork, the work completed, the projected end ... Studies, University of Ghana, with a starting date of 01/01/12 and an expected completion of ..... de Bondoukou. Paris: Editions Ernest Leroux.
Tryggestad, Kjell; Justesen, Lise; Mouritsen, Jan
2013-01-01
Purpose – The purpose of this paper is to explore how animals can become stakeholders in interaction with project management technologies and what happens with project temporalities when new and surprising stakeholders become part of a project and a recognized matter of concern to be taken into account. Design/methodology/approach – The paper is based on a qualitative case study of a project in the building industry. The authors use actor-network theory (ANT) to analyze the emergence of animal stakeholders, stakes and temporalities. Findings – The study shows how project temporalities can... This may require investments in new project management technologies. Originality/value – This paper adds to the literatures on project temporalities and stakeholder theory by connecting them to the question of non-human stakeholders and to project management technologies.
Jonasson, Haukur Ingi
2013-01-01
How relevant is ethics to project management? The book - which aims to demystify the field of ethics for project managers and managers in general - takes both a critical and a practical look at project management in terms of success criteria, and ethical opportunities and risks. The goal is to help the reader to use ethical theory to further identify opportunities and risks within their projects and thereby to advance more directly along the path of mature and sustainable managerial practice.
Markić, Lucija; Mandušić, Dubravka; Grbavac, Vitomir
2005-01-01
Microsoft Project is a tool whose advantages in everyday work are irreplaceable. Microsoft Project enables resource management, the creation of project reports over time, and the analysis of different scenarios. It comes in three versions: Microsoft Project Professional, Microsoft Project Server and Microsoft Project Server Client Access Licenses. The current trend is for modern business people to entrust their tasks to Microsoft Project because it considerably increases work productivity. These advantag...
Svejvig, Per; Commisso, Trine Hald
2012-01-01
Virtual projects are common with global competition, market development, and not least the financial crisis forcing organizations to reduce their costs drastically. Organizations therefore have to place high importance on ways to carry out virtual projects and consider appropriate practices for performing these projects. The study finds that the best practice knowledge has not permeated sufficiently to the practice. Furthermore, the appropriate application of information and communication technology (ICT) remains a big challenge, and finally project managers are not sufficiently trained in organizing and conducting virtual projects.
Ghaderpour, Ebrahim
2014-01-01
In this paper, we introduce some known map projections from a model of the Earth to a flat sheet of paper or map and derive the plotting equations for these projections. The first fundamental form and the Gaussian fundamental quantities are defined and applied to obtain the plotting equations and distortions in length, shape and size for some of these map projections.
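To make the idea of plotting equations concrete, here is a minimal sketch (not taken from the paper) of the spherical Mercator projection, whose plotting equations are x = Rλ, y = R ln tan(π/4 + φ/2); the sec φ length distortion along parallels follows from the first fundamental form of the sphere (E = R², G = R² cos²φ). The Earth radius and test latitudes are illustrative choices.

```python
import math

R = 6371.0  # mean Earth radius in km (spherical model, an assumption)

def mercator(lat_deg, lon_deg):
    """Plotting equations of the spherical Mercator projection:
    x = R * lambda, y = R * ln(tan(pi/4 + phi/2))."""
    lam = math.radians(lon_deg)
    phi = math.radians(lat_deg)
    return R * lam, R * math.log(math.tan(math.pi / 4 + phi / 2))

# Length distortion along a parallel is sec(phi): compare a small step in
# longitude on the map with the corresponding ground distance at 60 N.
dlam = math.radians(1.0)
map_dx = mercator(60.0, 1.0)[0] - mercator(60.0, 0.0)[0]
ground = R * math.cos(math.radians(60.0)) * dlam
print(map_dx / ground)  # ~2.0, i.e. sec(60 deg)
```

The same two-step pattern (convert to radians, apply the plotting equations, then differentiate to get the distortion factors) carries over to the other projections treated in the paper.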
2002-01-01
Description of co-operation projects implemented with the help of Sweden is presented. Information on performance of Phare and IAEA Regional and National Technical Cooperation projects is provided. Phare project 'Creation of Radiation Protection Infrastructure and Development of Supporting Services' was started in 2002
Smith, Bert Kruger
Project STAY (Scholarships to Able Youth), located in the barrio of San Antonio, Texas, helps young people stay in school beyond the secondary grades. The project provides outreach services to meet the needs of the students. Its primary service is to act as an advocate for these young people. The project recruits all types of youth from families…
Efficient and Effective Project Management
Dusan Pene
2014-03-01
Full Text Available The purpose of the article is to investigate the different authorities and responsibilities of a project manager and of a project leader. Considering that project management is nowadays becoming an important factor in performing and leading investments, and is shaped by modern leadership theories, the key element is the sovereign leadership of the manager and the project leader. Today's multi-project environments and modern project management techniques require an interdisciplinary leadership approach; at the same time, they enable the strengthening of a company's competitive features while consistently satisfying the high expectations of the project investor or client.
Increasing Robotic Science Applications Project
National Aeronautics and Space Administration — The project consists of the design, fabrication and testing of two science platforms. The first is a portable Robonaut-compatible science “toolbox”...
Desktop Computing Integration Project
Tureman, Robert L., Jr.
1992-01-01
The Desktop Computing Integration Project for the Human Resources Management Division (HRMD) of LaRC was designed to help division personnel use personal computing resources to perform job tasks. The three goals of the project were to involve HRMD personnel in desktop computing, link mainframe data to desktop capabilities, and to estimate training needs for the division. The project resulted in increased usage of personal computers by Awards specialists, an increased awareness of LaRC resources to help perform tasks, and personal computer output that was used in presentation of information to center personnel. In addition, the necessary skills for HRMD personal computer users were identified. The Awards Office was chosen for the project because of the consistency of their data requests and the desire of employees in that area to use the personal computer.
Linear algebra and projective geometry
Baer, Reinhold
2005-01-01
Geared toward upper-level undergraduates and graduate students, this text establishes that projective geometry and linear algebra are essentially identical. The supporting evidence consists of theorems offering an algebraic demonstration of certain geometric concepts. These focus on the representation of projective geometries by linear manifolds, of projectivities by semilinear transformations, of collineations by linear transformations, and of dualities by semilinear forms. These theorems lead to a reconstruction, within algebra, of the geometry that constituted the discussion's starting point.
Spatially explicit global population scenarios consistent with the Shared Socioeconomic Pathways
Jones, B.; O'Neill, B. C.
2016-08-01
The projected size and spatial distribution of the future population are important drivers of global change and key determinants of exposure and vulnerability to hazards. Spatial demographic projections are widely used as inputs to spatial projections of land use, energy use, and emissions, as well as to assessments of the impacts of extreme events, sea level rise, and other climate-related outcomes. To date, however, there are very few global-scale, spatially explicit population projections, and those that do exist are often based on simple scaling or trend extrapolation. Here we present a new set of global, spatially explicit population scenarios that are consistent with the new Shared Socioeconomic Pathways (SSPs) developed to facilitate global change research. We use a parameterized gravity-based downscaling model to produce projections of spatial population change that are quantitatively consistent with national population and urbanization projections for the SSPs and qualitatively consistent with assumptions in the SSP narratives regarding spatial development patterns. We show that the five SSPs lead to substantially different spatial population outcomes at the continental, national, and sub-national scale. In general, grid cell-level outcomes are most influenced by national-level population change, second by urbanization rate, and third by assumptions about the spatial style of development. However, the relative importance of these factors is a function of the magnitude of the projected change in total population and urbanization for each country and across SSPs. We also demonstrate variation in outcomes considering the example of population existing in a low-elevation coastal zone under alternative scenarios.
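The core mechanics of downscaling a national projection to grid cells can be sketched as follows. This is a deliberately simplified proportional allocation with a gravity-style weight exponent; the function name, the exponent, and the toy numbers are illustrative stand-ins, not the authors' calibrated parameterization.

```python
import numpy as np

def downscale(base_pop, national_total, alpha=1.0):
    """Distribute a projected national total over grid cells in proportion
    to base_pop**alpha (a crude gravity-style attractiveness weight)."""
    w = base_pop.astype(float) ** alpha
    return national_total * w / w.sum()

base = np.array([10.0, 40.0, 50.0, 0.0])   # current cell populations
future = downscale(base, 150.0)            # projected national total: 150
print(future)        # cell-level projections
print(future.sum())  # sums exactly to the national projection
```

By construction the gridded result is quantitatively consistent with the national total, which is the consistency property the abstract emphasizes; the paper's model additionally conditions the weights on urbanization and spatial-development assumptions.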
The Impact of Project Management Maturity upon IT/IS Project Management Outcomes
Carcillo, Anthony Joseph, Jr.
2013-01-01
Although it is assumed that increasing the institutionalization (or maturity) of project management in an organization leads to greater project success, the literature has diverse views. The purpose of this mixed methods study was to examine the correlation between project management maturity and IT/IS project outcomes. The sample consisted of two…
Rigual Martínez, Jaume
2014-01-01
The purpose of my project is to draw up a Business Plan to set up an audiovisual production company in partnership with my University Tecnocampus Mataró-Maresme. A production company which is intended for the formation of new professionals as well as the continued development of quality audiovisual projects. I want to make a feasibility project to show that this production company can be created and be a useful element for my University, particularly for students.
Product consistency testing of West Valley Compositional Variation Glasses
Olson, K.M.; Marschman, S.C.; Piepel, G.F.; Whiting, G.K.
1994-11-01
Nuclear waste glass produced by the West Valley Demonstration Project (WVDP) must meet the requirements of the Waste Acceptance Preliminary Specification (WAPS) as developed by the US Department of Energy (DOE). To assist WVDP in complying with WAPS, the Materials Characterization Center (MCC) at Pacific Northwest Laboratory (PNL) used the Product Consistency Test (PCT) to evaluate 44 West Valley glasses that had previously been tested in FY 1987 and FY 1988. This report summarizes the results of the PCTs. The glasses tested, which were fabricated as sets of Compositional Variation Glasses for studies performed by the West Valley Support Task (WVST) at PNL during FY 1987 and FY 1988, were doped with Th and U and were variations of West Valley reference glasses. In addition, Approved Reference Material-1 (ARM-1) was used as a test standard (ARM-1 is supplied by the MCC). The PCT was originated at Westinghouse Savannah River Company (WSRC) by C. M. Jantzen and N. R. Bibler (Jantzen and Bibler 1989). The test is a seven-day modified MCC-3 test that uses crushed glass in the size range -100 +200 mesh with deionized water in a Teflon container. There is no agitation during the PCT, and no attempt to exclude CO₂ from the test environment. Based on B and Li release, the glasses performed about the same as in previous modified MCC-3 testing performed in FY 1987 and FY 1988 (Reimus et al. 1988). The modified MCC-3 tests performed by Reimus et al. were similar to the PCT, except for the containers and the exclusion of CO₂ from the tests.
Kim, Changhwan; Park, Miran; Lee, Hoyeon; Cho, Seungryong
2016-03-01
Our earlier work demonstrated that the data consistency condition can be used as a criterion for scatter kernel optimization in deconvolution methods in a full-fan-mode cone-beam CT [1]. However, this scheme cannot be directly applied to a CBCT system with an offset detector (half-fan mode) because of transverse data truncation in the projections. In this study, we propose a modified scheme of the scatter kernel optimization method that can be used in a half-fan-mode cone-beam CT and demonstrate its feasibility. Using the first-reconstructed volume image from the half-fan projection data, we acquired full-fan projection data by forward projection synthesis. The synthesized full-fan projections were partly used to fill the truncated regions in the half-fan data. By doing so, we were able to utilize the existing data consistency-driven scatter kernel optimization method. The proposed method was validated by a simulation study using the XCAT numerical phantom and by an experimental study using the ACS head phantom.
Damkilde, Lars; Larsen, Torben J.; Walbjørn, Jacob
This document is aimed at helping all parties involved in the LEX project to get a common understanding of words, process, levels and the overall concept.
Elaine Cristina Batista de Oliveira
2016-03-01
Full Text Available Abstract This study presents an integrated model to support the process of classifying projects and selecting project managers for these projects in accordance with their characteristics and skills, using a multiple criteria decision aid (MCDA) approach. Such criteria are often conflicting. The model also supports the process of allocating project managers to projects by evaluating the characteristics/types of projects. The framework consists of a set of structured techniques and methods that are deemed very appropriate within the context of project management. A practical application of the proposed model was performed in a Brazilian electric energy company, which has a portfolio of projects that are specifically related to the company's defined strategic plan. As a result, it was possible to classify the projects and project managers into definable categories, thus enabling more effective management, as different projects require different levels of skills and abilities.
Smith, Rhett [Schweitzer Engineering Laboratories, Inc., Pullman, WA (United States); Campbell, Jack [CenterPoint Energy Houston Electric, TX (United States); Hadley, Mark [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2016-12-30
The Watchdog Project completed 100% of its Statement of Project Objectives (SOPO). The Watchdog Project was a very aggressive effort to commercialize technology that had never been commercialized before; as a result, it took six years to complete rather than the three originally planned. No additional federal funds were requested beyond the original proposal, and SEL contributed the additional cost share required to complete the project. The result of the Watchdog Project is the world's first industrial-rated, commercially available Software Defined Network (SDN) switch. This technology achieved the SOPO and DOE Roadmap goals of strong network access control, improved reliability and network performance, and the ability for the asset owner to minimize the attack surface before and during an attack. The Watchdog Project is an alliance between CenterPoint Energy Houston Electric, Pacific Northwest National Laboratory (PNNL), and Schweitzer Engineering Laboratories, Inc. (SEL). SEL is the world's leader in microprocessor-based electronic equipment for protecting electric power systems. PNNL performs basic and applied research to deliver energy, environmental, and national security for our nation. CenterPoint Energy is the third largest publicly traded natural gas delivery company in the U.S. and the third largest combined electricity and natural gas delivery company. The Watchdog Project efforts were combined with the SDN Project efforts to produce the entire SDN system solution for the critical infrastructure. The Watchdog Project addresses Topic Area of Interest 5, Secure Communications, for DE-FOA-0000359 by protecting the control system local area network itself and the communications coming from and going to the electronic devices on the local network. Local area networks usually are not routed and have little or no filtering capabilities. Combine this with the fact that control system protocols are designed with inherent trust the control
Project-Based Teaching: Helping Students Make Project Connections
Johnson, Heather Jo Pusich
variable. As such, teachers adopting project-based curriculum materials need more support - through educative curriculum materials, coaching, or ongoing professional development - to help them support project connections consistently and explicitly in their teaching practice.
Characterization of consistent triggers of migraine with aura
Hauge, Anne Werner; Kirchmann, Malene; Olesen, Jes
2011-01-01
The aim of the present study was to characterize perceived consistent triggers of migraine with aura (MA).
The utility of theory of planned behavior in predicting consistent ...
admin
outcomes of the behavior and the evaluations of these outcomes (behavioral beliefs) ... belief towards consistent condom use and motivation for compliance with .... consistency of the items used before constructing a scale. Results. All of the ...
Generalized contexts and consistent histories in quantum mechanics
Losada, Marcelo; Laura, Roberto
2014-05-01
We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized context are equally useful representing expressions which involve properties at different times.
Study on consistent query answering in inconsistent databases
XIE Dong; YANG Luming
2007-01-01
Consistent query answering is an approach to retrieving consistent answers over databases that might be inconsistent with respect to some given integrity constraints. The approach is based on a concept of repair. This paper surveys several recent studies on obtaining consistent information from inconsistent databases, covering the underlying semantic model, a number of approaches to computing consistent query answers, and the computational complexity of this problem. Furthermore, the work outlines potential research directions in this area.
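The repair semantics can be illustrated with a toy sketch (not from the paper): a consistent answer is one that holds in every maximal consistent subset (repair) of the database. The relation, the key constraint, and the brute-force enumeration below are assumptions chosen for brevity; real systems use query rewriting rather than enumerating repairs.

```python
from itertools import chain, combinations

# Toy relation emp(name, salary) violating the key dependency name -> salary:
# "ann" appears with two different salaries.
emp = {("ann", 50), ("ann", 60), ("bob", 40)}

def satisfies_fd(inst):
    names = [name for name, _ in inst]
    return len(names) == len(set(names))  # each name occurs at most once

def repairs(inst):
    """Maximal subsets satisfying the constraint (brute force)."""
    inst = list(inst)
    subsets = chain.from_iterable(
        combinations(inst, r) for r in range(len(inst), -1, -1))
    maximal = []
    for s in map(set, subsets):
        # enumerate largest first, so any strict superset is already recorded
        if satisfies_fd(s) and not any(s < m for m in maximal):
            maximal.append(s)
    return maximal

def consistent_answer(query):
    """Answers true in every repair (certain answers under repair semantics)."""
    return set.intersection(*(query(r) for r in repairs(emp)))

print(consistent_answer(lambda r: {n for n, _ in r}))  # both names survive
print(consistent_answer(lambda r: {s for n, s in r if n == "ann"}))  # empty
```

Here the two repairs keep "bob" and exactly one of the conflicting "ann" tuples, so "which names exist?" has a consistent answer while "what is ann's salary?" does not.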
Alejandra Suarez
2014-02-01
Full Text Available Freedom Project trains prisoners in nonviolent communication and meditation. Two complementary studies of its effects are reported in this article. The first study is correlational; we found decreased recidivism rates among prisoners trained by Freedom Project compared with recidivism rates in Washington state. The second study compared trained prisoners with a matched-pair control group and found improvement in self-reported anger, self-compassion, and certain forms of mindfulness in the trained group. Ratings of role-plays simulating difficult interactions showed greater social skills in the group trained by Freedom Project than in the matched controls.
Malecki, P.
2013-10-01
The aim of the Test Infrastructure and Accelerator Research Area (TIARA) project [1] is to consolidate and support the European R&D program in the field of the physics and techniques of particle accelerators. This project, partially funded by the European Commission, groups 11 participants from 8 European countries, including Poland. Its present three-year (2011-2013) preparatory phase (PP) is briefly described in this paper. The project is divided into 9 work packages (WP). We concentrate on four of them, dedicated to governance, R&D infrastructures, joint R&D programming, and education and training, in which Polish participants are actively involved.
Chitambar, Eric; Gour, Gilad
2016-07-01
Considerable work has recently been directed toward developing resource theories of quantum coherence. In this Letter, we establish a criterion of physical consistency for any resource theory. This criterion requires that all free operations in a given resource theory be implementable by a unitary evolution and projective measurement that are both free operations in an extended resource theory. We show that all currently proposed basis-dependent theories of coherence fail to satisfy this criterion. We further characterize the physically consistent resource theory of coherence and find its operational power to be quite limited. After relaxing the condition of physical consistency, we introduce the class of dephasing-covariant incoherent operations as a natural generalization of the physically consistent operations. Necessary and sufficient conditions are derived for the convertibility of qubit states using dephasing-covariant operations, and we show that these conditions also hold for other well-known classes of incoherent operations.
Vempala, Santosh S
2005-01-01
Random projection is a simple geometric technique for reducing the dimensionality of a set of points in Euclidean space while preserving pairwise distances approximately. The technique plays a key role in several breakthrough developments in the field of algorithms; in other cases, it provides elegant alternative proofs. The book begins with an elementary description of the technique and its basic properties. It then develops the method in the context of applications, which are divided into three groups. The first group consists of combinatorial optimization problems such as maxcut, graph coloring, minimum multicut, graph bandwidth and VLSI layout. Presented in this context is the theory of Euclidean embeddings of graphs. The next group is machine learning problems, specifically, learning intersections of halfspaces and learning large margin hypotheses. The projection method is further refined for the latter application. The last set consists of problems inspired by information retrieval, namely, nearest neighbor...
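The distance-preservation property is easy to check numerically. The sketch below (dimensions, sample size, and scaling chosen by me for illustration) projects with a Gaussian matrix scaled by 1/sqrt(k), in the Johnson-Lindenstrauss style, and measures how much each pairwise distance changes.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_projection(points, k):
    """Map d-dimensional rows of `points` to k dimensions with a Gaussian
    matrix scaled by 1/sqrt(k), approximately preserving pairwise distances."""
    d = points.shape[1]
    return points @ (rng.normal(size=(d, k)) / np.sqrt(k))

def pairwise_distances(a):
    diff = a[:, None, :] - a[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

X = rng.normal(size=(50, 1000))    # 50 points in 1000 dimensions
Y = random_projection(X, 200)      # reduced to 200 dimensions

orig = pairwise_distances(X)
proj = pairwise_distances(Y)
mask = orig > 0
ratios = proj[mask] / orig[mask]
print(ratios.min(), ratios.max())  # every distance survives within a modest factor
```

With k = 200 the projected-to-original distance ratios cluster tightly around 1, even though the ambient dimension dropped by a factor of five; shrinking k trades accuracy for dimension in the way the book quantifies.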
CONSISTENCY OF LS ESTIMATOR IN SIMPLE LINEAR EV REGRESSION MODELS
Liu Jixue; Chen Xiru
2005-01-01
The consistency of the LS estimator in the simple linear errors-in-variables (EV) model is studied. It is shown that, under some common assumptions on the model, weak and strong consistency of the estimator are equivalent, but this is not so for quadratic-mean consistency.
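A standard illustration of why consistency is delicate in EV models, not this paper's specific result: when the regressor is observed with error, the LS slope converges to an attenuated value rather than the true coefficient. A simulation sketch:

```python
import numpy as np

rng = np.random.default_rng(1)

n = 200_000
beta = 2.0
x = rng.normal(0, 1, n)              # true regressor, variance 1
u = rng.normal(0, 1, n)              # measurement error, variance 1
y = beta * x + rng.normal(0, 0.5, n)
x_obs = x + u                        # we only observe x with error

# LS slope computed on the observed (error-contaminated) regressor
b = np.cov(x_obs, y)[0, 1] / np.var(x_obs)

# Attenuation: plim b = beta * var(x) / (var(x) + var(u)) = 1.0 here
print(b)   # close to 1.0, not the true beta = 2.0
```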
Checking Consistency of Pedigree Information is NP-complete
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
Non-Numeric Intrajudge Consistency Feedback in an Angoff Procedure
Harrison, George M.
2015-01-01
The credibility of standard-setting cut scores depends in part on two sources of consistency evidence: intrajudge and interjudge consistency. Although intrajudge consistency feedback has often been provided to Angoff judges in practice, more evidence is needed to determine whether it achieves its intended effect. In this randomized experiment with…
2012-01-01
Project name: Jintan tire production base project (the first-phase project). Construction site: Jintan Economic Development Zone, Jiangsu Province. Construction unit: Zhongce Rubber (Jintan) Co., Ltd. Total investment: RMB 2.42 billion yuan. Project description: It is planned to cover an area of 3,000 mu. In the first phase, it will cover an area of 520.43 mu with a designed staff of 4,500 people. It will mix 150,000 tons of rubber and produce 10 million units of high-performance semi-steel-wire saloon car and light truck radial tires, 500,000 units of OTR tires and 100,000 tons of carbon black per year.
On consistency of the weighted arithmetical mean complex judgement matrix
Anonymous
2007-01-01
The weighted arithmetical mean complex judgement matrix (WAMCJM) is the most common method for aggregating group opinions, but it has a shortcoming: the WAMCJM of perfectly consistent judgement matrices given by experts cannot guarantee its own perfect consistency. An upper bound on the WAMCJM's consistency is presented. Simultaneously, a compatibility index for judging the aggregating extent of group opinions is also introduced. The WAMCJM is proved to be of acceptable consistency provided the compatibilities of all judgement matrices given by experts are smaller than the threshold value of acceptable consistency. These conclusions are important for group decision making.
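The phenomenon this abstract describes, aggregation destroying perfect consistency, can be illustrated with classical multiplicative (Saaty-type) judgement matrices; this is a simplified stand-in for the paper's complex judgement matrices, chosen here for concreteness:

```python
import numpy as np

def consistent_matrix(w):
    """Perfectly consistent pairwise comparison matrix: a_ij = w_i / w_j."""
    w = np.asarray(w, float)
    return w[:, None] / w[None, :]

def saaty_ci(A):
    """Saaty's consistency index (lambda_max - n) / (n - 1); 0 iff consistent."""
    n = A.shape[0]
    lam = np.max(np.linalg.eigvals(A).real)
    return (lam - n) / (n - 1)

A = consistent_matrix([1, 2, 4])   # expert 1: perfectly consistent
B = consistent_matrix([1, 3, 9])   # expert 2: perfectly consistent
M = 0.5 * A + 0.5 * B              # weighted arithmetic mean of the two

print(saaty_ci(A), saaty_ci(B))    # both ~0: each expert is perfectly consistent
print(saaty_ci(M))                 # > 0: the averaged matrix is no longer consistent
```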
On multidimensional consistent systems of asymmetric quad-equations
Boll, Raphael
2012-01-01
Multidimensional consistency is becoming more and more important in the theory of discrete integrable systems. Recently, we gave a classification of all 3D consistent 6-tuples of equations with the tetrahedron property, in which several novel asymmetric systems were found. In the present paper we discuss higher-dimensional consistency for the 3D consistent systems arising from this classification. In addition, we give a classification of certain 4D consistent systems of quad-equations. The results of this paper allow for a proof of Bianchi permutability, among other applications.
Kampf, Constance
2009-01-01
In this video, Associate Professor Constance Kampf talks about the importance of project management: not only as a tool in implementation, but also as a way of thinking, and as something that needs to be considered from idea conception.
Project as a System and its Management
Jiří Skalický
2017-06-01
Full Text Available The contribution aims to describe a project as a system and to define the project control goal and strategy, the control variables, and their relationships. Three common control variables, represented by the project triangle, are extended by two other important variables: project risk and quality. The control system consists of two components: a social one – project manager and project team – and a technical one – a dynamic simulation model of the project as decision-making support for the project manager at project milestones. In the project planning phase, the project baseline with planned controlled variables is created. At milestones after project launch, the actual values of these variables are measured. If the actual values deviate from the planned ones, corrective actions are proposed and a new baseline for the following control interval is created. The project plan takes into account the actual project progress, and optimum corrective actions are determined by simulation, respecting the control strategy and the availability of resources. The contribution presents a list of references to articles dealing with the project as a system and its simulation. In most cases, they refer to project control using the Earned Value Management method and its derivatives. Using a dynamic simulation model for project monitoring and control, as suggested in this contribution, presents a novel approach. The proposed model can serve as a departure point for the authors' future research and for development of an appropriate and applicable tool.
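Since the abstract ties milestone-based project control to the Earned Value Management method, a minimal sketch of the standard EVM indicators computed at a milestone may help; the figures are illustrative:

```python
# Standard Earned Value Management indicators evaluated at a project milestone.
def evm_indicators(pv, ev, ac):
    """pv: planned value, ev: earned value, ac: actual cost (same currency)."""
    return {
        "SV": ev - pv,    # schedule variance (negative: behind plan)
        "CV": ev - ac,    # cost variance (negative: over budget)
        "SPI": ev / pv,   # schedule performance index (<1: behind plan)
        "CPI": ev / ac,   # cost performance index (<1: over budget)
    }

# Illustrative milestone figures
ind = evm_indicators(pv=100.0, ev=90.0, ac=120.0)
print(ind)  # SPI = 0.9 (behind schedule), CPI = 0.75 (over budget)
```

Corrective action would then be chosen to bring SPI and CPI back toward 1 over the next control interval.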
Incompatible multiple consistent sets of histories and measures of quantumness
Halliwell, J. J.
2017-07-01
In the consistent histories approach to quantum theory probabilities are assigned to histories subject to a consistency condition of negligible interference. The approach has the feature that a given physical situation admits multiple sets of consistent histories that cannot in general be united into a single consistent set, leading to a number of counterintuitive or contrary properties if propositions from different consistent sets are combined indiscriminately. An alternative viewpoint is proposed in which multiple consistent sets are classified according to whether or not there exists any unifying probability for combinations of incompatible sets which replicates the consistent histories result when restricted to a single consistent set. A number of examples are exhibited in which this classification can be made, in some cases with the assistance of the Bell, Clauser-Horne-Shimony-Holt, or Leggett-Garg inequalities together with Fine's theorem. When a unifying probability exists logical deductions in different consistent sets can in fact be combined, an extension of the "single framework rule." It is argued that this classification coincides with intuitive notions of the boundary between classical and quantum regimes and in particular, the absence of a unifying probability for certain combinations of consistent sets is regarded as a measure of the "quantumness" of the system. The proposed approach and results are closely related to recent work on the classification of quasiprobabilities and this connection is discussed.
Alagba, Tonye J.
Oil and gas drilling projects are the primary means by which oil companies recover large volumes of commercially available hydrocarbons from deep reservoirs. These types of projects are complex in nature, involving management of multiple stakeholder interfaces, multidisciplinary personnel, complex contractor relationships, and turbulent environmental and market conditions, necessitating the application of proven project management best practices and critical success factors (CSFs) to achieve success. Although there is some practitioner-oriented literature on project management CSFs for drilling projects, none of it is based on empirical evidence from research. In addition, the literature has reported alarming rates of oil and gas drilling project failure, attributable not to technical factors but to failure of project management. The aim of this quantitative correlational study, therefore, was to discover an empirically verified list of project management CSFs whose consistent application leads to successful implementation of oil and gas drilling projects. The study collected survey data online from a random sample of 127 oil and gas drilling personnel who were members of LinkedIn's online community "Drilling Supervisors, Managers, and Engineers". The results of the study indicated that 10 project management factors are individually related to project success of oil and gas drilling projects. These 10 CSFs are: project mission, top management support, project schedule/plan, client consultation, personnel, technical tasks, client acceptance, monitoring and feedback, communication, and troubleshooting. In addition, the study found that the relationships between the 10 CSFs and drilling project success are unaffected by participant and project demographics (role of project personnel, and project location). The significance of these findings is both practical and theoretical. Practically, application of an empirically verified CSFs list to oil
Integrable Heisenberg Ferromagnet Equations with self-consistent potentials
Zhunussova, Zh Kh; Tungushbaeva, D I; Mamyrbekova, G K; Nugmanova, G N; Myrzakulov, R
2013-01-01
In this paper, we consider some integrable Heisenberg Ferromagnet Equations with self-consistent potentials. We study their Lax representations. In particular, we give their equivalent counterparts, which are nonlinear Schrödinger-type equations. We present the integrable reductions of the Heisenberg Ferromagnet Equations with self-consistent potentials. These integrable Heisenberg Ferromagnet Equations with self-consistent potentials describe nonlinear waves in ferromagnets with magnetic fields.
Behavioural consistency and life history of Rana dalmatina tadpoles
Urszán, Tamás Janós; Török, János; Hettyey, Attila; Garamszegi, László Z; Herczeg, Gábor
2015-01-01
The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, w...
Students’ conceptual understanding consistency of heat and temperature
Slamet Budiarti, Indah; Suparmi; Sarwanto; Harjana
2017-01-01
The aims of the research were to explore and describe the consistency of students' understanding of the concepts of heat and temperature. The sample, taken using a purposive random sampling technique, consisted of 99 high school students from 3 senior high schools in Jayapura city. The descriptive qualitative method was employed in this study. The data were collected using tests and interviews regarding the subject matter of heat and temperature. Based on the results of data analysis, it was concluded that 3.03% of the students were consistent with correct answers, 79.80% were consistent but with wrong answers, and 17.17% were inconsistent.
Bootstrap-Based Inference for Cube Root Consistent Estimators
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known to be inconsistent.
Forestry and biomass energy projects
Swisher, J.N.
1994-01-01
This paper presents a comprehensive and consistent methodology to account for the costs and net carbon flows of different categories of forestry and biomass energy projects and describes the application of the methodology to several sets of projects in Latin America. The results suggest that both biomass energy development and forestry measures, including reforestation and forest protection, can contribute significantly to the reduction of global CO2 emissions, and that local land-use capacity must determine the type of project that is appropriate in specific cases. No single approach alone is sufficient as either a national or global strategy for sustainable land use or carbon emission reduction. The methodology allows consistent comparisons of the costs and quantities of carbon stored in different types of projects and/or national programs, facilitating the inclusion of forestry and biomass energy projects.
Chiu, George L.; Yang, Kei H.
1998-08-01
Projection display in today's market is dominated by cathode ray tubes (CRTs). Further progress in this mature CRT projector technology will be slow and evolutionary. Liquid crystal based projection displays have gained rapid acceptance in the business market. New technologies are being developed on several fronts: (1) active matrix built from polysilicon or single crystal silicon; (2) electro-optic materials using ferroelectric liquid crystal, polymer dispersed liquid crystals or other liquid crystal modes; (3) micromechanical-based transducers such as digital micromirror devices and grating light valves; (4) high resolution displays to SXGA and beyond; and (5) high brightness. This article reviews projection displays from a transducer technology perspective along with a discussion of markets and trends.
Personality Consistency in Dogs: A Meta-Analysis
Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787
Gfader, Verina; Carson, Rebecca; Kraus, Chris
Echo project (ed. by Verina Gfader and Ruth Höflich) is an online publication and community board that developed from a visit to the Los Angeles Art Book fair in January 2014. It was on the occasion of a prior book project, titled Prospectus, that the editorial team had been invited by the LAABF...... Intellect and Financialization sets a conceptual ground for rethinking subjective freedom; an encounter with Another LA opens out a multitude of cartographies - revealing more discreet and politically dynamic movements in the urban grid; there are glimpses of Machine Project’s events, a visual story around...
Arnal, E. M.; Abraham, Z.; Giménez de Castro, G.; de Gouveia dal Pino, E. M.; Larrarte, J. J.; Lepine, J.; Morras, R.; Viramonte, J.
2014-10-01
The project LLAMA, acronym of Long Latin American Millimetre Array, is very briefly described in this paper. This project is a joint scientific and technological undertaking of Argentina and Brazil on the basis of an equal investment share, whose main goal is both to install and to operate an observing facility capable of exploring the Universe at millimetre and sub-millimetre wavelengths. This facility will be erected in the Argentine province of Salta, at a site located 4830 m above sea level.
Guidelines for CSM project development.
1983-01-01
This document summarizes guidelines for contraceptive social marketing project development prepared by the International Contraceptive Social Marketing Project (ICSMP) as an aid to consultants and technical assistance contractors. The ICSMP has developed a checklist to guide planning in 4 major areas: 1) project organization and management structure, 2) target market, 3) product line, and 4) pricing strategy and project costs. A clear statement of project objectives is essential, and these objectives must be internally consistent so that strategies to accomplish them can be unified. The position of each governmental entity and sponsoring agency involved in the social marketing project must be clearly understood. Projects receiving US government funds must have a mechanism for financial and programmatic reporting and accountability. Thorough knowledge of commercial rules and regulations in a country is necessary for planning. To ascertain whether the necessary resources are available, it is necessary to examine the existing marketing infrastructure in terms of distribution, advertising, market research, and packaging capabilities. The target market should be specified in quantifiable terms; in addition, a consumer profile that defines the overall demographics of the country, the family planning environment, and potential social marketing consumers should be developed. The couple-years-of-protection projection can be translated into the percentage of the target market that the project expects to capture. It is necessary to price products early in project development in order to assess program costs. Revenue projections should be based on previous calculations of couple-years-of-protection goals, product line, product price, and price structure. Each element of the advertising budget should be justifiable in terms of project objectives. Finally, positions and anticipated salaries for staff should be specified through the first 3 years of project implementation.
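The couple-years-of-protection (CYP) and revenue projections mentioned above reduce to simple arithmetic. A sketch in which the conversion factors and prices are illustrative assumptions, not figures from the guidelines:

```python
# Couple-years of protection (CYP) and revenue projection sketch.
# Conversion factors and prices below are illustrative assumptions only.
CYP_FACTOR = {"condom": 120, "pill_cycle": 15}   # units needed per CYP (assumed)
PRICE = {"condom": 0.05, "pill_cycle": 0.40}     # retail price per unit (assumed)

def projection(sales):
    """Return (total CYP delivered, projected revenue) for a sales forecast."""
    cyp = sum(qty / CYP_FACTOR[p] for p, qty in sales.items())
    revenue = sum(qty * PRICE[p] for p, qty in sales.items())
    return cyp, revenue

cyp, revenue = projection({"condom": 600_000, "pill_cycle": 90_000})
print(cyp, revenue)   # about 11,000 CYP and 66,000 in revenue
```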
Affective-cognitive consistency and thought-induced attitude polarization.
Chaiken, S; Yates, S
1985-12-01
Subjects whose preexperimental attitudes toward either capital punishment or censorship were high or low in affective-cognitive consistency were identified. These four groups thought about their attitudes by writing two essays, one on the topic for which consistency had been assessed (relevant essay) and one on the unassessed topic (distractor essay). In accord with the hypothesis that thought-induced attitude polarization requires the presence of a well-developed knowledge structure, high-consistency subjects evidenced greater polarization than low-consistency subjects only on the relevant topic after writing the relevant essay. Content analyses of subjects' relevant essays yielded additional data confirming Tesser's ideas regarding mediation: high- (vs. low-) consistency subjects expressed a greater proportion of cognitions that were evaluatively consistent with their prior affect toward the attitude object and a smaller proportion of evaluatively inconsistent and neutral cognitions. Moreover, although high- and low-consistency subjects did not differ in the amount of attitudinally relevant information they possessed or their awareness of inconsistent cognitions, their methods of dealing with discrepant information diverged: high-consistency subjects evidenced a greater tendency to assimilate discrepant information by generating refutational thoughts that discredited or minimized the importance of inconsistent information.
Philosophical and Methodological Problem of Consistency of Mathematical Theories
Michailova N. V.
2013-01-01
Full Text Available Increased abstraction of modern mathematical theories has revived interest in the traditional philosophical and methodological problem of an internally consistent system of axioms, one from which mutually contradictory statements cannot be deduced. If we are talking about axioms describing a well-known area of mathematical objects, then from the standpoint of local consistency this problem does not appear to be as relevant. These problems are, however, associated with the various attempts of formalists to explain mathematical existence through consistency. For example, the problem of establishing the consistency of mathematical analysis, whose solution would clarify the fate of Hilbert's proof theory, has not yet been solved, nor has the problem of the consistency of axiomatic set theory. It can therefore be assumed that the criterion of consistency, despite its essential role in axiomatic systems of both formal and substantive nature, is the same auxiliary logical criterion as mathematical provability. An adequate solution of the problem of the consistency of mathematics can be achieved through methodological and substantive arguments revealing the mechanism by which contradictions appear in a mathematical theory. The paper shows that, from a systemic point of view, in the context of a philosophical and methodological synthesis of various directions of justification of modern mathematics, one cannot insist on consistency alone as the justification of mathematical theories.
MANUFACTURE OF THE FERMENTED SAUSAGES WITH THE SMEARED CONSISTENCE
Nesterenko A. A.
2014-10-01
Full Text Available In foreign practice there is great demand for smoked sausage products with a smeared consistence. The article presents the basic aspects of manufacturing smoked sausages with a smeared consistence: the choice of spices, starter cultures, and the method of preparing the forcemeat.
Uniform Consistency for Nonparametric Estimators in Null Recurrent Time Series
Gao, Jiti; Kanaya, Shin; Li, Degui
2015-01-01
This paper establishes uniform consistency results for nonparametric kernel density and regression estimators when the time series regressors concerned are nonstationary null recurrent Markov chains. Under suitable regularity conditions, we derive uniform convergence rates of the estimators. Our results can be viewed as a nonstationary extension of some well-known uniform consistency results for stationary time series.
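The kind of uniform (sup-norm) consistency at issue can be visualized in the simplest i.i.d. Gaussian case; this is only a stationary illustration, not the null recurrent setting of the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
grid = np.linspace(-3, 3, 121)
true_pdf = np.exp(-grid**2 / 2) / np.sqrt(2 * np.pi)

def kde(x, h):
    """Gaussian kernel density estimate, evaluated on the grid."""
    z = (grid[:, None] - x[None, :]) / h
    return np.exp(-z**2 / 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

errs = {}
for n in (100, 10_000):
    x = rng.standard_normal(n)
    h = n ** (-1 / 5)                      # standard bandwidth rate h ~ n^(-1/5)
    errs[n] = np.abs(kde(x, h) - true_pdf).max()   # sup-norm error over the grid

print(errs)   # the sup-norm error shrinks as the sample grows
```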
Delimiting Coefficient α from Internal Consistency and Unidimensionality
Sijtsma, Klaas
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, and Love (2015), in which they relate reliability, represented by coefficient α, to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…
Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.
Edwards, H. P.; And Others
1982-01-01
Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, Andreas
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which
The Self-Consistency Model of Subjective Confidence
Koriat, Asher
2012-01-01
How do people monitor the correctness of their answers? A self-consistency model is proposed for the process underlying confidence judgments and their accuracy. In answering a 2-alternative question, participants are assumed to retrieve a sample of representations of the question and base their confidence on the consistency with which the chosen…
Dynamic Consistency between Value and Coordination Models - Research Issues.
Bodenstaff, L.; Wombacher, Andreas; Reichert, M.U.; Meersman, R.; Tari, Z.; Herrero, P.
Inter-organizational business cooperations can be described from different viewpoints each fulfilling a specific purpose. Since all viewpoints describe the same system they must not contradict each other, thus, must be consistent. Consistency can be checked based on common semantic concepts of the
Logical consistency and sum-constrained linear models
van Perlo -ten Kleij, Frederieke; Steerneman, A.G.M.; Koning, Ruud H.
2006-01-01
A topic that has received quite some attention in the seventies and eighties is logical consistency of sum-constrained linear models. Loosely defined, a sum-constrained model is logically consistent if the restrictions on the parameters and explanatory variables are such that the sum constraint is a
Screening of resonant magnetic perturbations taking into account a self-consistent electric field
Kaveeva, E.; Rozhansky, V.
2012-05-01
Steady-state screening of resonant magnetic perturbations (RMPs) in a tokamak is analysed taking into account a self-consistent electric field. On the one hand, the self-consistent radial electric field is determined by the balance of the electron radial conductivity in a stochastic magnetic field screened by the plasma and by the neoclassical ion conductivity. On the other hand, the parallel current of electrons, the radial projection of which is balanced by the ion current, determines the screening of RMPs. In this work, the self-consistent electric field and RMP screening are calculated. Two different regimes of screening are found: the ‘ion’ branch which corresponds to the negative radial electric field and the ‘electron’ branch for which the electric field is positive. Predictions of the model are compared with the experimental data and results of the simulation with various codes. The corresponding toroidal rotation and pump-out effect are discussed.
Basalt Waste Isolation Project Reclamation Support Project:
Brandt, C.A.; Rickard, W.H. Jr.; Cadoret, N.A.
1992-06-01
The Basalt Waste Isolation Project (BWIP) Reclamation Support Project began in the spring of 1988 by categorizing sites distributed during operations of the BWIP into those requiring revegetation and those to be abandoned or transferred to other programs. The Pacific Northwest Laboratory's role in this project was to develop plans for reestablishing native vegetation on the first category of sites, to monitor the implementation of these plans, to evaluate the effectiveness of these efforts, and to identify remediation methods where necessary. The Reclamation Support Project focused on three major areas: geologic and hydrologic boreholes, the Exploratory Shaft Facility (ESF), and the Near-Surface Test Facility (NSTF). A number of BWIP reclamation sites seeded between 1989 and 1990 were found to be far below reclamation objectives. These sites were remediated in 1991 using various seedbed treatments designed to rectify problems with water-holding capacity, herbicide activity, surficial crust formation, and nutrient imbalances. Remediation was conducted during November and early December 1991. Sites were examined on a monthly basis thereafter to evaluate plant growth responses to these treatments. At all remediation sites, early plant growth far exceeded any previously obtained using other methods and seedbed treatments. Seeded plants did best where amendments consisted of soil-plus-compost or fertilizer-only. Vegetation growth on Gable Mountain was less than that found on other areas nearby, but this difference is attributed primarily to the site's altitude and north-facing orientation.
The construction and combined operation for fuzzy consistent matrixes
YAO Min; SHEN Bin; LUO Jian-hua
2005-01-01
Fuzziness is one of the general characteristics of human thinking and objective things. Introducing fuzzy techniques into decision-making yields very good results. The fuzzy consistent matrix has many excellent characteristics, especially center-division transitivity conforming to the reality of the human thinking process in decision-making. This paper presents a new approach for creating a fuzzy consistent matrix from a mutual supplementary matrix in fuzzy decision-making. At the same time, based on the distance between an individual fuzzy consistent matrix and the average fuzzy consistent matrix, a combined operation for several fuzzy consistent matrixes is presented which reflects the majority opinion of the experienced experts. Finally, a practical example further demonstrates its flexibility and practicability.
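The abstract does not spell out its construction, but the general idea of deriving a fuzzy consistent matrix from a mutual supplementary (fuzzy complementary) matrix can be sketched with a commonly cited row-sum transformation; the formula and the example matrix below are illustrative assumptions, not necessarily the paper's method:

```python
import numpy as np

def to_consistent(F):
    """Transform a fuzzy complementary matrix F (f_ij + f_ji = 1) into a
    fuzzy consistent matrix R via row sums (a commonly cited construction):
        r_i  = sum_k f_ik
        R_ij = (r_i - r_j) / (2n) + 0.5
    """
    F = np.asarray(F, dtype=float)
    n = F.shape[0]
    r = F.sum(axis=1)
    return (r[:, None] - r[None, :]) / (2 * n) + 0.5

# A hypothetical 3x3 fuzzy complementary judgement matrix (pairwise preferences).
F = np.array([[0.5, 0.7, 0.8],
              [0.3, 0.5, 0.6],
              [0.2, 0.4, 0.5]])
R = to_consistent(F)

# Verify the additive-consistency property R_ij = R_ik - R_jk + 0.5 for all i, j, k.
ok = all(abs(R[i, j] - (R[i, k] - R[j, k] + 0.5)) < 1e-12
         for i in range(3) for j in range(3) for k in range(3))
print(ok)  # True
```

The same row-sum device also yields a ranking of alternatives directly from the row sums `r`.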
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Journal of College Science Teaching, 1972
1972-01-01
The Environmental Protection Agency has started a project to actually picture the environmental movement in the United States. This is an attempt to make the public aware of the air pollution in their area or state and to acquaint them with the effects of air cleaning efforts. (PS)
Michael E. Goerndt; W. Keith Moser; Patrick D. Miles; Dave Wear; Ryan D. DeSantis; Robert J. Huggett; Stephen R. Shifley; Francisco X. Aguilar; Kenneth E. Skog
2016-01-01
One purpose of the Northern Forest Futures Project is to predict change in future forest attributes across the 20 States in the U.S. North for the period that extends from 2010 to 2060. The forest attributes of primary interest are the 54 indicators of forest sustainability identified in the Montreal Process Criteria and Indicators (Montreal Process Working Group, n.d...
Juhlin, Jonas Alastair
'Project Avatar' is based on the intelligence discipline known as Open Source Intelligence, which comprises all the information freely available in open sources. With the spread of social media, entirely new types of information sources are opening up. The question is: how useful is...
P.M. Latyshev
2008-09-01
Full Text Available "Urals Industrial - Urals Polar" is a unique project, and it will provide economic security not only for the local territory but for Russia as a whole in several respects. This article is devoted to the main directions of the project and their influence on the economy of the country.
2007-01-01
The new architecture journal "Project Baltia" covers the architecture, urban construction, and design of the Baltic states, Finland, and the St. Petersburg region. It is published four times a year in English and Russian. Publisher: the Balticum publishing house in St. Petersburg, in cooperation with the A-Fond publishers of Amsterdam and Moscow. Editor-in-chief Vladimir Frolov.
Dehlholm, Christian; Brockhoff, Per B.; Bredie, Wender Laurentius Petrus
2012-01-01
Projective Mapping (Risvik et al., 1994) and its Napping (Pagès, 2003) variations have become increasingly popular in the sensory field for rapid collection of spontaneous product perceptions. It has been applied in variations which sometimes are caused by the purpose of the analysis and sometime...
Kazanjian, Wendy C.
1982-01-01
Describes Project COLD (Climate, Ocean, Land, Discovery), a scientific study of the Polar Regions: a collection of 35 modules used within the framework of existing subjects: oceanography, biology, geology, meteorology, geography, social science. Includes a partial list of topics and one activity (geodesic dome) from a module. (Author/SK)
Full Text Available ... minutes) "Momentum in Science, Part 2" (70 minutes) Be a part of something big. HBO's "THE ALZHEIMER'S PROJECT" will expose the Alzheimer's crisis facing our nation and drive concerned citizens to take action. Here are three ways you ...
Buforn, E.; Davila, J. Martin; Bock, G.; Pazos, A.; Udias, A.; Hanka, W.
The TEDESE (Terremotos y Deformacion Cortical en el Sur de España) project is a joint project of the Universidad Complutense de Madrid (UCM) and the Real Instituto y Observatorio de la Armada de San Fernando, Cadiz (ROA), supported by the Spanish Ministerio de Ciencia y Tecnologia with the participation of the GeoForschungsZentrum, Potsdam (GFZ). The aim is to study the characteristics of the occurrence and mechanism of earthquakes, together with measurements of crustal structure and deformation, in order to obtain an integrated evaluation of seismic risk in southern Spain. As part of this project, a temporary network of 10 broad-band seismological stations, which will complement those already existing in the zone, has been installed in southern Spain and northern Africa for one year beginning in October 2001. The objectives of the project are the detailed study of the focal mechanisms of earthquakes in this area, of the structure of the crust and upper mantle, and of seismic anisotropy in the crust and mantle as an indicator of tectonic deformation processes, as well as measurements of crustal deformation using permanent GPS and SLR stations and temporary GPS surveys. From these studies, seismotectonic models and maps will be elaborated and the seismic risk in the zone will be evaluated.
Driscoll, Mary C. [St. Bonaventure University, St Bonaventure, NY(United States)
2012-07-12
The Project Narrative describes how the funds from the DOE grant were used to purchase equipment for the biology, chemistry, physics and mathematics departments. The Narrative also describes how the equipment is being used. There is also a list of the positive outcomes as a result of having the equipment that was purchased with the DOE grant.
Smith, Rhett [Schweitzer Engineering Laboratories Inc, Pullman, WA (United States)
2016-12-23
The SDN Project completed on time and on budget and successfully accomplished 100% of the scope of work outlined in the original Statement of Project Objective (SOPO). The SDN Project formed an alliance between Ameren Corporation, University of Illinois Urbana-Champaign (UIUC), Pacific Northwest National Laboratories (PNNL), and Schweitzer Engineering Laboratories, Inc. (SEL). The objective of the SDN Project was to address Topic Area of Interest 2: Sustain critical energy delivery functions while responding to a cyber-intrusion, under Funding Opportunity Announcement DE-FOA-0000797. The goal of the project was to design and commercially release technology that provides a method to sustain critical energy delivery functions during a cyber intrusion; to do this, control system operators need the ability to quickly identify and isolate the affected network areas and re-route critical information and control flows around them. The objective of the SDN Project was to develop a Flow Controller that monitors, configures, and maintains the safe, reliable network traffic flows of all the local area networks (LANs) on a control system in the Energy sector. The SDN team identified the core attributes of a control system and produced an SDN flow controller with the same core attributes, enabling networks to be designed, configured, and deployed as whitelisted, deny-by-default, purpose-built networks. This project researched, developed, and commercially released technology that: enables all field networks to be configured and monitored as if they are a single asset to be protected; enables greatly improved and even pre-calculated response actions to reliability and cyber events; supports pre-configured localized response actions tailored to provide resilience against failures and centralized response to cyber-attacks that improve network reliability and availability; and architecturally enables the right subject matter experts, who are usually the information
Consistent assignment of nurse aides: association with turnover and absenteeism.
Castle, Nicholas G
2013-01-01
Consistent assignment refers to the same caregivers consistently caring for the same residents almost every time the caregivers are on duty. This article examines the association of consistent assignment of nurse aides with turnover and absenteeism. Data came from a survey of nursing home administrators, the Online Survey Certification and Reporting data, and the Area Resource File. The measures were from 2007 and came from 3,941 nursing homes. Multivariate logistic regression models were used to examine turnover and absenteeism. An average of 68% of nursing homes reported using consistent assignment, with 28% of nursing homes using consistent assignment of nurse aides at the often recommended level of 85% (or more). Nursing homes using recommended levels of consistent assignment had significantly lower rates of turnover and of absenteeism. In the multivariate analyses, consistent assignment was significantly associated with both lower turnover and lower absenteeism. Consistent assignment is a practice recommended by many policy makers, government agencies, and industry advocates. The findings presented here provide some evidence that the use of this staffing practice can be beneficial.
Projective geometry and projective metrics
Busemann, Herbert
2005-01-01
The basic results and methods of projective and non-Euclidean geometry are indispensable for the geometer, and this book--different in content, methods, and point of view from traditional texts--attempts to emphasize that fact. Results of special theorems are discussed in detail only when they are needed to develop a feeling for the subject or when they illustrate a general method. On the other hand, an unusual amount of space is devoted to the discussion of the fundamental concepts of distance, motion, area, and perpendicularity. Topics include the projective plane, polarities and conic sections.
THE CONSISTENCY OF STATISTICAL ESTIMATES OF THURSTONE-MOSTELLER
Y. V. Bugaev
2015-01-01
Full Text Available The traditional analysis of collective choice procedures involves three different approaches: investigation of the voting operator against characteristic conditions, investigation of the properties of the choice function, and analysis of the possibility of manipulation (verification of the stability of the voting process under negative influence from voters or the organizer). The research team of the ITMU department of VSUET proposed and implemented a fourth approach, which is to study the probabilistic characteristics of the results of the procedures (the bias of the estimate of the usefulness of a specific alternative from its true value, the standard deviation of the estimate of the usefulness of an alternative from its true value, the probability of correct ranking of alternatives at the output of the choice procedure, etc.). This article is dedicated to the analysis of the consistency of the usefulness estimates used to compare alternatives, obtained at the output of the traditional Thurstone-Mosteller procedure and its generalizations created by the authors. In general, consistency of a statistical estimator means that the estimation error tends to zero as the sample size increases. However, depending on the interpretation of "estimation error", the following main types of consistency are distinguished: weak consistency, based on convergence in probability of a random quantity; strong consistency, based on convergence with probability one; and consistency in the mean square, for which the variance of the estimate tends to zero. This article provides a proof of a theorem according to which, under rather general assumptions, the usefulness estimates of the ranked alternatives obtained using the Thurstone-Mosteller procedure satisfy consistency in the mean square. In this
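Mean-square consistency — the expected squared error of the estimator vanishing as the sample grows — can be illustrated with a toy Monte Carlo. The example below uses the sample mean of a normal distribution purely for illustration; the paper's actual objects are Thurstone-Mosteller utility estimates:

```python
import numpy as np

rng = np.random.default_rng(0)
true_value = 1.0

def mse_of_sample_mean(n, trials=2000):
    """Monte Carlo estimate of E[(estimate - true)^2] for the sample mean of n draws."""
    samples = rng.normal(loc=true_value, scale=1.0, size=(trials, n))
    estimates = samples.mean(axis=1)
    return float(np.mean((estimates - true_value) ** 2))

# Mean-square consistency: the MSE shrinks (here roughly like 1/n) as n grows.
mses = [mse_of_sample_mean(n) for n in (10, 100, 1000)]
print(mses)
```

Weak consistency follows from mean-square consistency via Chebyshev's inequality, which is one reason the mean-square notion is the strongest of the three listed above in this setting.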
The Consistent Preferences Approach to Deductive Reasoning in Games
Asheim, Geir B
2006-01-01
"The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif
On the Consistency of ZFn in ZFn+3
李旭华
1993-01-01
By restricting the common replacement axiom schema of ZF to Σn-formulae, Professor Zhang Jinwen constructed a series of subsystems of Zermelo-Fraenkel set theory ZF, which he called ZFn. Zhao Xishun showed that the consistency of ZFn can be deduced in ZF. Professor Zhang Jinwen raised the question whether the consistency of ZFn can be deduced in ZFn+m(n) for some m(n) ≥ 1. In this paper, we give a positive solution to Professor Zhang's problem. Moreover, we show that the consistency of ZFn can be deduced in ZFn+3.
Model Checking Data Consistency for Cache Coherence Protocols
Hong Pan; Hui-Min Lin; Yi Lv
2006-01-01
A method for automatic verification of cache coherence protocols is presented, in which cache coherence protocols are modeled as concurrent value-passing processes, and control and data consistency requirements are described as formulas in first-order μ-calculus. A model checker is employed to check if the protocol under investigation satisfies the required properties. Using this method a data consistency error has been revealed in a well-known cache coherence protocol. The error has been corrected, and the revised protocol has been shown free from data consistency errors for any data domain size, by appealing to the data independence technique.
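The kind of property being checked can be illustrated with a much simpler tool than first-order μ-calculus: an explicit-state reachability search over a toy two-cache MSI-style protocol. The transition rules below are hypothetical simplifications for illustration, not the protocol analysed in the paper:

```python
from itertools import product
from collections import deque

# States per cache: I(nvalid), S(hared), M(odified).
def step(state, i, op):
    """Toy MSI-style transition: a read miss downgrades a remote M to S;
    a write invalidates all other copies. Hypothetical, simplified rules."""
    s = list(state)
    if op == "read":
        if s[i] == "I":
            s = ["S" if c == "M" else c for c in s]
            s[i] = "S"
    elif op == "write":
        s = ["I"] * len(s)
        s[i] = "M"
    return tuple(s)

def check_invariant(n_caches=2):
    """Explicit-state reachability: no reachable state may hold two Modified copies."""
    init = tuple("I" for _ in range(n_caches))
    seen, frontier = {init}, deque([init])
    while frontier:
        st = frontier.popleft()
        if sum(c == "M" for c in st) > 1:
            return False, st  # data-consistency violation found
        for i, op in product(range(n_caches), ("read", "write")):
            nxt = step(st, i, op)
            if nxt not in seen:
                seen.add(nxt)
                frontier.append(nxt)
    return True, None

ok, bad = check_invariant()
print(ok)  # True: the toy protocol never reaches two Modified copies
```

A real model checker of the kind the abstract describes additionally handles value-passing data and liveness-style μ-calculus formulas; the sketch only covers a safety invariant over a finite state space.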
Consistency of assertive, aggressive, and submissive behavior for children.
Deluty, R H
1985-10-01
The interpersonal behavior of 50 third- through fifth-grade children was assessed over an 8-month period in a wide variety of naturally occurring school activities. The consistency of the children's behavior was found to vary as a function of the child's sex, the class of behavior examined, and the similarity/dissimilarity of the contexts in which the behaviors occurred. Boys demonstrated remarkable consistency in their aggressive expression; 46 of 105 intercorrelations for the aggressiveness dimensions were statistically significant. In general, the consistency of assertive behavior for both boys and girls was unexpectedly high.
Consistent increase in Indian monsoon rainfall and its variability across CMIP-5 models
A. Menon
2013-01-01
Full Text Available The possibility of an impact of global warming on the Indian monsoon is of critical importance for the large population of this region. Future projections within the Coupled Model Intercomparison Project Phase 3 (CMIP-3 showed a wide range of trends with varying magnitude and sign across models. Here the Indian summer monsoon rainfall is evaluated in 20 CMIP-5 models for the period 1850 to 2100. In the new generation of climate models a consistent increase in seasonal mean rainfall during the summer monsoon periods arises. All models simulate stronger seasonal mean rainfall in the future compared to the historic period under the strongest warming scenario RCP-8.5. Increase in seasonal mean rainfall is the largest for the RCP-8.5 scenario compared to other RCPs. The interannual variability of the Indian monsoon rainfall also shows a consistent positive trend under unabated global warming. Since both the long-term increase in monsoon rainfall as well as the increase in interannual variability in the future is robust across a wide range of models, some confidence can be attributed to these projected trends.
Consonance in Information System Projects: A Relationship Marketing Perspective
Lin, Pei-Ying
2010-01-01
Different stakeholders in the information system project usually have different perceptions and expectations of the projects. There is seldom consistency in the stakeholders' evaluations of the project outcome. Thus the outcomes of information system projects are usually disappointing to one or more stakeholders. Consonance is a process that can…
On exact triangles consisting of stable vector bundles on tori
Kobayashi, Kazushi
2016-01-01
In this paper, we consider the exact triangles consisting of stable holomorphic vector bundles on one-dimensional complex tori, and discuss their relations with the corresponding Fukaya category via the homological mirror symmetry.
A new insight into the consistency of smoothed particle hydrodynamics
Sigalotti, Leonardo Di G; Klapp, Jaime; Vargas, Carlos A; Campos, Kilver
2016-01-01
In this paper the problem of consistency of smoothed particle hydrodynamics (SPH) is solved. A novel error analysis is developed in $n$-dimensional space using the Poisson summation formula, which enables the treatment of the kernel and particle approximation errors in combined fashion. New consistency integral relations are derived for the particle approximation which correspond to the cosine Fourier transform of the classically known consistency conditions for the kernel approximation. The functional dependence of the error bounds on the SPH interpolation parameters, namely the smoothing length $h$ and the number of particles within the kernel support ${\\cal{N}}$ is demonstrated explicitly from which consistency conditions are seen to follow naturally. As ${\\cal{N}}\\to\\infty$, the particle approximation converges to the kernel approximation independently of $h$ provided that the particle mass scales with $h$ as $m\\propto h^{\\beta}$, with $\\beta >n$. This implies that as $h\\to 0$, the joint limit $m\\to 0$, $...
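The zeroth-order consistency condition — that the kernel integrates to one, and that its discrete particle counterpart Σⱼ W(x−xⱼ, h) Δx approaches one as the particle number within the support grows — can be checked numerically. The sketch below uses the standard 1D cubic-spline kernel as an illustrative choice; it is not the paper's error analysis:

```python
import numpy as np

def cubic_spline_W(x, h):
    """Standard 1D cubic-spline SPH kernel with normalization 2/(3h)."""
    q = np.abs(x) / h
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2.0, 0.25 * (2.0 - q)**3, 0.0))
    return sigma * w

h = 0.1
# Particle approximation of the zeroth-order consistency condition:
# sum_j W(x - x_j, h) * dx  ->  integral W dx = 1 as the spacing shrinks.
for N in (20, 200, 2000):
    xj, dx = np.linspace(-0.5, 0.5, N, retstep=True)
    s = float(np.sum(cubic_spline_W(0.0 - xj, h) * dx))
    print(N, s)
```

This mirrors the abstract's point that the particle approximation converges to the kernel approximation as the number of particles within the kernel support increases.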
Island of Stability for Consistent Deformations of Einstein's Gravity
Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan;
2012-01-01
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...
Consistency of Trend Break Point Estimator with Underspecified Break Number
Jingjing Yang
2017-01-01
Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
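The underspecification problem can be illustrated with a least-squares single-break search applied to data that actually contain two trend breaks. The simulation design below (sample size, break dates, slopes, noise) is hypothetical and is not the paper's setup:

```python
import numpy as np

def single_break_estimator(y):
    """Grid search for ONE trend-break date in y_t = a + b*t + c*(t - Tb)_+ + e_t,
    minimizing the sum of squared residuals."""
    T = len(y)
    t = np.arange(T, dtype=float)
    best_sse, best_tb = np.inf, None
    for tb in range(2, T - 2):
        X = np.column_stack([np.ones(T), t, np.maximum(t - tb, 0.0)])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        sse = float(resid @ resid)
        if sse < best_sse:
            best_sse, best_tb = sse, tb
    return best_tb

rng = np.random.default_rng(1)
T = 200
t = np.arange(T, dtype=float)
# Two positive trend breaks of similar magnitude at t = 60 and t = 140 -- the
# configuration the abstract flags as worst for an underspecified break number.
y = 0.05 * t + 0.1 * np.maximum(t - 60, 0) + 0.1 * np.maximum(t - 140, 0)
y = y + rng.normal(scale=1.0, size=T)
tb_hat = single_break_estimator(y)
print(tb_hat)  # a single estimated break date, typically between the two true dates
```

With two similar, same-signed trend breaks the single-break fit tends to settle between the true dates rather than on either one, which is the inconsistency the paper formalizes.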
Consistency in experiments on multistable driven delay systems
Oliver, Neus; Larger, Laurent; Fischer, Ingo
2016-10-01
We investigate the consistency properties in the responses of a nonlinear delay optoelectronic intensity oscillator subject to different drives, in particular, harmonic and self-generated waveforms. This system, an implementation of the Ikeda oscillator, is operating in a closed-loop configuration, exhibiting its autonomous dynamics while the drive signals are additionally introduced. Applying the same drive multiple times, we compare the dynamical responses of the optoelectronic oscillator and quantify the degree of consistency among them via their correlation. Our results show that consistency is not restricted to conditions close to the first Hopf bifurcation but can be found in a broad range of dynamical regimes, even in the presence of multistability. Finally, we discuss the dependence of consistency on the nature of the drive signal.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Hong Li; Haifei Zhuang; Weihao Geng
2012-01-01
The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with the research on the flow inside and shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the sh...
Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images
Li, Gang; Nie, Jingxin; Shen, Dinggang
2011-01-01
Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying subtle morphological changes of the cerebral cortex. This paper presents a new deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstructed ...
Consistent Reconstruction of Cortical Surfaces from Longitudinal Brain MR Images
Li, Gang; Nie, Jingxin; Wu, Guorong; Wang, Yaping; Shen, Dinggang
2011-01-01
Accurate and consistent reconstruction of cortical surfaces from longitudinal human brain MR images is of great importance in studying longitudinal subtle change of the cerebral cortex. This paper presents a novel deformable surface method for consistent and accurate reconstruction of inner, central and outer cortical surfaces from longitudinal brain MR images. Specifically, the cortical surfaces of the group-mean image of all aligned longitudinal images of the same subject are first reconstr...
On the consistency of coset space dimensional reduction
Chatzistavrakidis, A. [Institute of Nuclear Physics, NCSR DEMOKRITOS, GR-15310 Athens (Greece); Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: cthan@mail.ntua.gr; Manousselis, P. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece); Department of Engineering Sciences, University of Patras, GR-26110 Patras (Greece)], E-mail: pman@central.ntua.gr; Prezas, N. [CERN PH-TH, 1211 Geneva (Switzerland)], E-mail: nikolaos.prezas@cern.ch; Zoupanos, G. [Physics Department, National Technical University of Athens, GR-15780 Zografou Campus, Athens (Greece)], E-mail: george.zoupanos@cern.ch
2007-11-15
In this Letter we consider higher-dimensional Yang-Mills theories and examine their consistent coset space dimensional reduction. Utilizing a suitable ansatz and imposing a simple set of constraints we determine the four-dimensional gauge theory obtained from the reduction of both the higher-dimensional Lagrangian and the corresponding equations of motion. The two reductions yield equivalent results and hence they constitute an example of a consistent truncation.
Truncations driven by constraints: consistency and conditions for correct upliftings
Pons, J M; Pons, Josep M.; Talavera, Pere
2004-01-01
We discuss the mechanism of truncations driven by the imposition of constraints. We show how the consistency of such truncations is controlled, and give general theorems that establish conditions for the correct uplifting of solutions. We show in some particular examples how one can get correct upliftings from 7d supergravities to 10d type IIB supergravity, even in cases when the truncation is not initially consistent on its own.
S Matrix Proof of Consistency Condition Derived from Mixed Anomaly
Bhansali, Vineer
For a confining quantum field theory with conserved current J and stress tensor T, the ⟨JJJ⟩ anomaly and the mixed anomaly computed in terms of elementary quanta must be precisely equal to the same anomalies computed in terms of the exact physical spectrum if the conservation law corresponding to J is unbroken. These conditions strongly constrain the allowed representations of the low-energy spectrum. We present a proof of the latter consistency condition based on the proof by Coleman and Grossman of the former consistency condition.
Consistent histories, quantum truth functionals, and hidden variables
Griffiths, Robert B.
2000-01-01
A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of `truth functionals' defined on a Boolean algebra of classical or quantum properties.
Consistent histories, quantum truth functionals, and hidden variables
Griffiths, R B
1999-01-01
A central principle of consistent histories quantum theory, the requirement that quantum descriptions be based upon a single framework (or family), is employed to show that there is no conflict between consistent histories and a no-hidden-variables theorem of Bell, and Kochen and Specker, contrary to a recent claim by Bassi and Ghirardi. The argument makes use of ``truth functionals'' defined on a Boolean algebra of classical or quantum properties.
Behavioural consistency and life history of Rana dalmatina tadpoles.
Urszán, Tamás János; Török, János; Hettyey, Attila; Garamszegi, László Zsolt; Herczeg, Gábor
2015-05-01
The focus of evolutionary behavioural ecologists has recently turned towards understanding the causes and consequences of behavioural consistency, manifesting either as animal personality (consistency in a single behaviour) or behavioural syndrome (consistency across more behaviours). Behavioural type (mean individual behaviour) has been linked to life-history strategies, leading to the emergence of the integrated pace-of-life syndrome (POLS) theory. Using Rana dalmatina tadpoles as models, we tested if behavioural consistency and POLS could be detected during the early ontogenesis of this amphibian. We targeted two ontogenetic stages and measured activity, exploration and risk-taking in a common garden experiment, assessing both individual behavioural type and intra-individual behavioural variation. We observed that activity was consistent in all tadpoles, exploration only became consistent with advancing age and risk-taking only became consistent in tadpoles that had been tested, and thus disturbed, earlier. Only previously tested tadpoles showed trends indicative of behavioural syndromes. We found an activity-age at metamorphosis POLS in the previously untested tadpoles irrespective of age. Relative growth rate correlated positively with the intra-individual variation of activity of the previously untested older tadpoles. In previously tested older tadpoles, intra-individual variation of exploration correlated negatively and intra-individual variation of risk-taking correlated positively with relative growth rate. We provide evidence for behavioural consistency and POLS in predator- and conspecific-naive tadpoles. Intra-individual behavioural variation was also correlated to life history, suggesting its relevance for the POLS theory. The strong effect of moderate disturbance related to standard behavioural testing on later behaviour draws attention to the pitfalls embedded in repeated testing.
TWO APPROACHES TO IMPROVING THE CONSISTENCY OF COMPLEMENTARY JUDGEMENT MATRIX
Xu Zeshui
2002-01-01
Using the transformation relations between complementary judgement matrices and reciprocal judgement matrices, this paper proposes two methods for improving the consistency of a complementary judgement matrix and gives two simple, practical iterative algorithms. The two algorithms are easy to implement on a computer, and the modified complementary judgement matrices retain most of the information contained in the original matrix. The methods thus supplement and develop the theory and methodology for improving the consistency of complementary judgement matrices.
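A minimal sketch of the transformation underlying this approach, assuming the common mapping a_ij = b_ij / b_ji between a complementary judgement matrix (b_ij + b_ji = 1) and a reciprocal one (a_ij * a_ji = 1); the paper's two iterative improvement algorithms themselves are not reproduced here.

```python
import numpy as np

def complementary_to_reciprocal(B):
    """Map a complementary judgement matrix to a reciprocal one
    via a_ij = b_ij / b_ji (assumed transformation form)."""
    B = np.asarray(B, float)
    return B / B.T

# Example complementary judgement matrix: entries satisfy b_ij + b_ji = 1.
B = np.array([[0.5, 0.7, 0.8],
              [0.3, 0.5, 0.6],
              [0.2, 0.4, 0.5]])
A = complementary_to_reciprocal(B)
```

The reciprocal property a_ij * a_ji = 1 holds by construction, so consistency-improvement steps developed for reciprocal matrices can be applied and mapped back.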
Autonomous Navigation with Constrained Consistency for C-Ranger
Shujing Zhang
2014-06-01
Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to have truly autonomous navigation. However, the consistency problem of the SLAM system has been largely ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which is developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
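The local-consistency idea can be sketched in isolation (this is not the C-Ranger code; the window length and class name are illustrative assumptions): within each local time window, the landmark estimate used as the linearization point is frozen at the window's first step instead of being re-evaluated at every update.

```python
import numpy as np

class LocalLinearizationPoint:
    """Freeze the landmark linearization point at the start of each
    local time window, per the LC-EKF idea (sketch, names assumed)."""

    def __init__(self, window_len):
        self.window_len = window_len
        self.frozen = None

    def point(self, step, current_landmark_estimate):
        # Refresh only on the first step of each window.
        if self.frozen is None or step % self.window_len == 0:
            self.frozen = np.array(current_landmark_estimate, float)
        return self.frozen
```

Jacobians evaluated at this frozen point stay mutually consistent within a window, which is the property the paper exploits to limit inconsistency growth in EKF-SLAM.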
Measuring consistency of autobiographical memory recall in depression.
Semkovska, Maria
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
Consistency analysis of accelerated degradation mechanism based on gray theory
Yunxia Chen; Hongxia Chen; Zhou Yang; Rui Kang; Yi Yang
2014-01-01
A fundamental premise of accelerated testing is that the failure mechanism under elevated and normal stress levels should remain the same. Thus, verification of the consistency of failure mechanisms is essential during an accelerated testing. A new consistency analysis method based on the gray theory is proposed for complex products. First of all, existing consistency analysis methods are reviewed with a focus on the comparison of the differences among them. Then, the proposed consistency analysis method is introduced. Two effective gray prediction models, the gray dynamic model and the new information and equal dimensional (NIED) model, are adapted in the proposed method. The process to determine the dimension of the NIED model is also discussed, and a decision rule is expanded. Based on that, the procedure for applying the new consistency analysis method is developed. Finally, a case study of the consistency analysis of a reliability enhancement testing is conducted to demonstrate and validate the proposed method.
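A sketch of the textbook GM(1,1) gray prediction model, the basic building block behind gray dynamic models such as those named above (this is the standard formulation, not necessarily the authors' exact one): the accumulated series is fitted to dx1/dt + a*x1 = b by least squares on the mean-value background sequence, then differenced back to forecast the original series.

```python
import numpy as np

def gm11_forecast(series, steps_ahead=1):
    """Standard GM(1,1) gray model forecast (sketch)."""
    x = np.asarray(series, float)
    x1 = np.cumsum(x)                      # accumulated generating series
    z = 0.5 * (x1[1:] + x1[:-1])           # background (mean-value) sequence
    B = np.column_stack([-z, np.ones_like(z)])
    a, b = np.linalg.lstsq(B, x[1:], rcond=None)[0]
    # Time-response function of the whitened equation; x1_hat(0) = x[0].
    x1_hat = lambda k: (x[0] - b / a) * np.exp(-a * k) + b / a
    n = len(x)
    # Inverse accumulation: difference consecutive x1_hat values.
    return [float(x1_hat(k) - x1_hat(k - 1)) for k in range(n, n + steps_ahead)]
```

On near-exponential degradation data the model tracks the trend closely, which is why gray models suit the short, smooth series typical of accelerated-degradation tests.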
Shimizu, Y.; Kobayashi, T.; Kubo, T.; Chiga, N.; Isobe, T.; Kawabata, T.; Kondo, Y.; Kusaka, K.; Matsuda, Y.; Motobayashi, T.; Murakami, T.; Nakamura, T.; Ohnishi, J.; Ohnishi, T.; Okuno, H.; Otsu, H.; Sakurai, H.; Sato, H.; Satou, Y.; Sekiguchi, K.; Togano, Y.; Yoneda, K.
2011-09-01
The SAMURAI project aims to open a new research field in nuclear physics through the use of a large-acceptance spectrometer for kinematically complete measurements of multiple particles emitted in RI-beam-induced reactions. The SAMURAI spectrometer consists of a large-gap superconducting dipole magnet, heavy-ion detectors, neutron detectors, and proton detectors. What is special about the SAMURAI system is that projectile-rapidity protons or neutrons are detected with large angular and momentum acceptance in coincidence with heavy projectile fragments. With an effective combination of this equipment, the SAMURAI system allows us to perform various experiments: electromagnetic dissociation, various direct reactions, polarized-deuteron-induced reactions, and EOS studies. The SAMURAI project is currently underway at RIBF. The construction of the superconducting dipole magnet will start in autumn 2010 and finish in spring 2011. The detectors are also being constructed in parallel. The first commissioning run will be performed in early 2012.
Pascual Aventí, Guillem
2004-01-01
The framework of this project is the technical support service of one of the distributors of a well-known office automation and consumer electronics brand. One of its business processes consists of the repair of this type of equipment.
Johnson, Steve
2003-01-01
Project Prometheus will enable a new paradigm in the scientific exploration of the Solar System. The proposed JIMO mission will start a new generation of missions characterized by more maneuverability, flexibility, power and lifetime. The Project Prometheus organization is established at NASA Headquarters: 1. Organization established to carry out development of JIMO, nuclear power (radioisotope), and nuclear propulsion research. 2. Completed broad technology and national capacity assessments to inform decision making on planning and technology development. 3. Awarded five NRAs for nuclear propulsion research. 4. Radioisotope power systems in development, and Plutonium-238 being purchased from Russia. 5. Formulated science-driven near-term and long-term plans for the safe utilization of nuclear-propulsion-based missions. 6. Completed preliminary studies (Pre-Phase A) of JIMO and other missions. 7. Initiated JIMO Phase A studies by contractors and NASA.
Ballantine, A; Dixon-Altaber, H; Dosanjh, M; Kuchina, L
2011-01-01
Hadrontherapy uses particle beams to treat tumours located near critical organs and tumours that respond poorly to conventional radiation therapy. It has become evident that there is an emerging need for reinforcing research in hadrontherapy, and it is essential to train professionals in this rapidly developing field. PARTNER is a 4-year Marie Curie Training project funded by the European Commission with 5.6 million Euros, aimed at the creation of the next generation of experts. Ten academic institutes and research centres and two leading companies are participating in PARTNER, which is coordinated by CERN, forming a unique multidisciplinary and multinational European network. The project offers research and training opportunities to 25 young biologists, engineers, physicians and physicists, allowing them to actively develop modern techniques for treating cancer in close collaboration with leading European institutions. For this purpose PARTNER relies on cutting-edge research and technology development, ef...
None
2015-04-02
The Water Power Program helps industry harness this renewable, emissions-free resource to generate environmentally sustainable and cost-effective electricity. Through support for public, private, and nonprofit efforts, the Water Power Program promotes the development, demonstration, and deployment of advanced hydropower devices and pumped storage hydropower applications. These technologies help capture energy stored by diversionary structures, increase the efficiency of hydroelectric generation, and use excess grid energy to replenish storage reserves for use during periods of peak electricity demand. In addition, the Water Power Program works to assess the potential extractable energy from domestic water resources to assist industry and government in planning for our nation’s energy future. From FY 2008 to FY 2014, DOE’s Water Power Program announced awards totaling approximately $62.5 million to 33 projects focused on hydropower. Table 1 provides a brief description of these projects.
Zhang, Rui; Noels, Kimberly A.; Lalonde, Richard N.; Salas, S. J.
2017-01-01
Prior research differentiates dialectical (e.g., East Asian) from non-dialectical cultures (e.g., North American and Latino) and attributes cultural differences in self-concept consistency to naïve dialecticism. In this research, we explored the effects of managing two cultural identities on consistency within the bicultural self-concept via the role of dialectical beliefs. Because the challenge of integrating more than one culture within the self is common to biculturals of various heritage backgrounds, the effects of bicultural identity integration should not depend on whether the heritage culture is dialectical or not. In four studies across diverse groups of bicultural Canadians, we showed that having an integrated bicultural identity was associated with being more consistent across roles (Studies 1–3) and making less ambiguous self-evaluations (Study 4). Furthermore, dialectical self-beliefs mediated the effect of bicultural identity integration on self-consistency (Studies 2–4). Finally, Latino biculturals reported being more consistent across roles than did East Asian biculturals (Study 2), revealing the ethnic heritage difference between the two groups. We conclude that both the content of heritage culture and the process of integrating cultural identities influence the extent of self-consistency among biculturals. Thus, consistency within the bicultural self-concept can be understood, in part, to be a unique psychological product of bicultural experience. PMID:28326052
Wagner, Falko Jens; Poulsen, Mikael Zebbelin
1999-01-01
When trying to solve a DAE problem of high index with more traditional methods, instability often arises in some of the variables, finally leading to breakdown of convergence and integration of the solution. This is nicely shown in [ESF98, p. 152 ff.]. This chapter will introduce projection methods as a way of handling these special problems. It is assumed that we have methods for solving normal ODE systems and index-1 systems.
Iordache, Octavian
2013-01-01
How do you know what works and what doesn't? This book contains case studies highlighting the power of polytope projects for complex problem solving. Any sort of combinational problem characterized by a large variety of possibly complex constructions and deconstructions based on simple building blocks can be studied in a similar way. Although the majority of case studies are related to chemistry, the method is general and equally applicable to other fields for engineering or science.
Peters, Baron; Bolhuis, Peter G; Mullen, Ryan G; Shea, Joan-Emma
2013-02-01
We propose a method for identifying accurate reaction coordinates among a set of trial coordinates. The method applies to special cases where motion along the reaction coordinate follows a one-dimensional Smoluchowski equation. In these cases the reaction coordinate can predict its own short-time dynamical evolution, i.e., the dynamics projected from multiple dimensions onto the reaction coordinate depend only on the reaction coordinate itself. To test whether this property holds, we project an ensemble of short trajectory swarms onto trial coordinates and compare projections of individual swarms to projections of the ensemble of swarms. The comparison, quantified by the Kullback-Leibler divergence, is numerically performed for each isosurface of each trial coordinate. The ensemble of short dynamical trajectories is generated only once by sampling along an initial order parameter. The initial order parameter should separate the reactants and products with a free energy barrier, and distributions on isosurfaces of the initial parameter should be unimodal. The method is illustrated for three model free energy landscapes with anisotropic diffusion. Where exact coordinates can be obtained from Kramers-Langer-Berezhkovskii-Szabo theory, results from the new method agree with the exact results. We also examine characteristics of systems where the proposed method fails. We show how dynamical self-consistency is related (through the Chapman-Kolmogorov equation) to the earlier isocommittor criterion, which is based on longer paths.
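The swarm-versus-ensemble comparison above can be sketched with a histogram-based Kullback-Leibler divergence (a minimal illustration, not the authors' code; the bin count and pseudocount smoothing are assumed choices): a swarm whose projected displacements match the pooled ensemble yields a small divergence, while a systematically shifted swarm yields a large one.

```python
import numpy as np

def kl_divergence(p_samples, q_samples, bins=30, eps=1e-9):
    """Histogram estimate of D_KL(P || Q) over a shared binning (sketch)."""
    lo = min(p_samples.min(), q_samples.min())
    hi = max(p_samples.max(), q_samples.max())
    p, _ = np.histogram(p_samples, bins=bins, range=(lo, hi))
    q, _ = np.histogram(q_samples, bins=bins, range=(lo, hi))
    p = (p + eps) / (p + eps).sum()   # smoothed, normalised distributions
    q = (q + eps) / (q + eps).sum()
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
ensemble = rng.normal(0.0, 1.0, 5000)    # pooled projections of all swarms
good_swarm = rng.normal(0.0, 1.0, 500)   # consistent with the ensemble
bad_swarm = rng.normal(2.0, 1.0, 500)    # systematically shifted swarm
```

In the paper this comparison is made per isosurface of each trial coordinate; a good reaction coordinate keeps the divergence small everywhere along it.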
Masellis, A; Atiyeh, B
2009-12-31
The BurNet project, a pilot project of the Eumedis initiative, has become a reality. The Eumedis (EUro MEDiterranean Information Society) initiative is part of the MEDA programme of the EU to develop the Information Society in the Mediterranean area. In the health care sector, the objective of Eumedis is the deployment of network-based solutions to interconnect, using user-friendly and affordable solutions, the actors at all levels of the health care system of the Euro-Mediterranean region. The BurNet project interconnects 17 Burn Centres (BC) in the Mediterranean area through an information network, both to standardize courses of action in the field of prevention, treatment, and functional and psychological rehabilitation of burn patients, and to coordinate interactions between BCs and emergency rooms in peripheral hospitals, using training/information activities and telemedicine to optimize first aid provided to burn patients before referral to a BC. Shared procedure protocols for prevention and the care and rehabilitation of patients, both at individual and mass level, will help to create an international specialized database and a Web-based teleconsultation system.
Silvia Ferrari
2010-04-01
The project is focused on a detailed study of some chemical, physical and toxicological parameters, and on health, epidemiological and environmental assessment by interpretative models, in the atmosphere of Emilia-Romagna (Italy). The project arises from the necessity to improve knowledge about the environmental and health aspects of fine and ultrafine particles, in their primary and secondary components, in the atmosphere. The project, structured in 7 workpackages, is organized in two measurement programmes: a routine one with mainly daily time resolution, and an intensive one with high time resolution and higher chemical speciation than the routine one. The sampling sites are five: three in urban areas (Bologna, Parma and Rimini), one in a rural area (San Pietro Capofiume) and one in a remote area (Monte Cimone). Parallel to the outdoor studies, a workpackage is planned for indoor studies and chemical composition analysis, with the outdoor/indoor ratio used to characterize indoor human exposure to outdoor pollution.
Does object view influence the scene consistency effect?
Sastyin, Gergo; Niimi, Ryosuke; Yokosawa, Kazuhiko
2015-04-01
Traditional research on the scene consistency effect only used clearly recognizable object stimuli to show mutually interactive context effects for both the object and background components on scene perception (Davenport & Potter in Psychological Science, 15, 559-564, 2004). However, in real environments, objects are viewed from multiple viewpoints, including an accidental, hard-to-recognize one. When the observers named target objects in scenes (Experiments 1a and 1b, object recognition task), we replicated the scene consistency effect (i.e., there was higher accuracy for the objects with consistent backgrounds). However, there was a significant interaction effect between consistency and object viewpoint, which indicated that the scene consistency effect was more important for identifying objects in the accidental view condition than in the canonical view condition. Therefore, the object recognition system may rely more on the scene context when the object is difficult to recognize. In Experiment 2, the observers identified the background (background recognition task) while the scene consistency and object views were manipulated. The results showed that object viewpoint had no effect, while the scene consistency effect was observed. More specifically, the canonical and accidental views both equally provided contextual information for scene perception. These findings suggested that the mechanism for conscious recognition of objects could be dissociated from the mechanism for visual analysis of object images that were part of a scene. The "context" that the object images provided may have been derived from its view-invariant, relatively low-level visual features (e.g., color), rather than its semantic information.
Inter-laboratory consistency of gait analysis measurements.
Benedetti, M G; Merlo, A; Leardini, A
2013-09-01
The dissemination of gait analysis as a clinical assessment tool requires the results to be consistent, irrespective of the laboratory. In this work a baseline assessment of between-site consistency for one healthy subject examined at 7 different laboratories is presented. Anthropometric and spatio-temporal parameters, pelvis and lower limb joint rotations, joint sagittal moments and powers, and ground reaction forces were compared. The consistency between laboratories was assessed by the median absolute deviation and maximum difference for single parameters, and by linear regression for curves. Twenty-one lab-to-lab comparisons were performed and averaged. Large differences were found between the characteristics of the laboratories (i.e. motion capture systems and protocols). Different values for the anthropometric parameters were found, with the largest variability for a pelvis measurement. The spatio-temporal parameters were in general consistent. Segment and joint kinematics consistency was in general high (R2>0.90), except for hip and knee joint rotations. The main difference among curves was a vertical shift associated with the corresponding value in the static position. The consistency between joint sagittal moments ranged from R2=0.90 at the ankle to R2=0.66 at the hip; the latter increased when laboratories using the same protocol were compared separately. Pattern similarity was good for ankle power but not satisfactory for knee and hip power. The ground reaction force was found to be the most consistent, as expected. The differences found were in general lower than the established minimum detectable changes in gait kinematics and kinetics for healthy adults.
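The two consistency measures named above can be sketched as follows (assumed implementations, not the study's code): median absolute deviation for scalar parameters, and the R² of a linear regression between two laboratories' curves. Note that a pure vertical offset between curves leaves R² unchanged, consistent with the vertical shift being reported as the main difference between sites.

```python
import numpy as np

def median_absolute_deviation(values):
    """Robust spread of a set of single-parameter measurements."""
    v = np.asarray(values, float)
    return float(np.median(np.abs(v - np.median(v))))

def curve_r2(curve_a, curve_b):
    """R^2 of a linear regression of one lab's curve onto another's."""
    a = np.asarray(curve_a, float)
    b = np.asarray(curve_b, float)
    slope, intercept = np.polyfit(a, b, 1)
    residuals = b - (slope * a + intercept)
    ss_res = np.sum(residuals ** 2)
    ss_tot = np.sum((b - b.mean()) ** 2)
    return float(1.0 - ss_res / ss_tot)
```
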
Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa;
2001-01-01
Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered and the linear response function was shown...
Martial arts striking hand peak acceleration, accuracy and consistency.
Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A
2013-01-01
The goal of this paper was to investigate the possible trade-off between peak hand acceleration and the accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum-effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes' mean squared distance from their centroid. We found that training experience was significantly correlated with hand peak acceleration prior to impact (r(2)=0.456, p=0.032) and accuracy (r(2)=0.621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated with consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
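The two outcome measures defined above translate directly into code (a sketch with assumed names): accuracy as the radial distance from the centroid of a subject's strikes to the target, and consistency as the root mean squared distance of the strikes from their own centroid.

```python
import numpy as np

def strike_metrics(points, target):
    """Accuracy: |centroid - target|. Consistency: RMS distance of the
    strike landing points from their own centroid (sketch)."""
    pts = np.asarray(points, float)
    centroid = pts.mean(axis=0)
    accuracy = float(np.linalg.norm(centroid - np.asarray(target, float)))
    consistency = float(np.sqrt(np.mean(np.sum((pts - centroid) ** 2, axis=1))))
    return accuracy, consistency
```

Note the two measures are independent: a tight cluster far from the target is consistent but inaccurate, and vice versa, which is why the paper reports separate correlations with experience.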
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
Analysis of Consistency of Printing Blankets using Correlation Technique
Lalitha Jayaraman
2010-01-01
This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are essentially viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims at objectively determining the consistency of printing blankets at three specific torque levels of tension under two distinct stages: 1. under normal printing conditions, and 2. on recovery after smash. The experiment devised exhibits variation in the tone reproduction properties of each blanket, signifying the levels of inconsistency also in the thickness direction. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. However, on smash, the recovery of a blanket and its consistency were a function of manufacturing and torque levels. This study attempts to provide a new metric for failure analysis of offset printing blankets. It also underscores the need for optimizing the torque for blankets from different manufacturers.
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi [Kavli Institute for Particle Astrophysics and Cosmology, Department of Physics, Stanford University, Stanford, CA 94305 (United States); Busha, Michael T. [Institute for Theoretical Physics, University of Zurich, CH-8006 Zurich (Switzerland); Klypin, Anatoly A. [Astronomy Department, New Mexico State University, Las Cruces, NM 88003 (United States); Primack, Joel R., E-mail: behroozi@stanford.edu, E-mail: rwechsler@stanford.edu [Department of Physics, University of California at Santa Cruz, Santa Cruz, CA 95064 (United States)
2013-01-20
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
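The core gravitational-consistency check can be illustrated in miniature (the actual code is at the URL above; this sketch, its tolerance, and its units are assumptions): a halo's position at the next snapshot should be predictable from its previous position and velocity, and links that violate this beyond a tolerance are candidates for flagging as spurious.

```python
import numpy as np

def link_is_consistent(x_prev, v_prev, x_next, dt, tol):
    """Flag a halo link across snapshots as consistent if the next
    position is within `tol` of the ballistic prediction (sketch)."""
    predicted = np.asarray(x_prev, float) + np.asarray(v_prev, float) * dt
    return bool(np.linalg.norm(np.asarray(x_next, float) - predicted) <= tol)
```

The published algorithm does considerably more (inserting missing halos, removing spurious ones, and calibrating per-halo-finder uncertainties), but this prediction-versus-observation comparison is the basic consistency test it builds on.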
Self-consistent generalized Langevin equation for colloidal mixtures.
Chávez-Rojo, Marco Antonio; Medina-Noyola, Magdaleno
2005-09-01
A self-consistent theory of collective and tracer diffusion in colloidal mixtures is presented. This theory is based on exact results for the partial intermediate scattering functions derived within the framework of the generalized Langevin equation formalism, plus a number of conceptually simple and sensible approximations. The first of these consists of a Vineyard-like approximation between collective and tracer diffusion, which writes the collective dynamics in terms of the memory function related to tracer diffusion. The second consists of interpolating this only unknown memory function between its two exact limits at small and large wave vectors; for this, a phenomenologically determined, but not arbitrary, interpolating function is introduced: a Lorentzian with its inflection point located at the first minimum of the partial static structure factor. The small wave-vector exact limit involves a time-dependent friction function, for which we take a general approximate result, previously derived within the generalized Langevin equation formalism. This general result expresses the time-dependent friction function in terms of the partial intermediate scattering functions, thus closing the system of equations into a fully self-consistent scheme. This extends to mixtures a recently proposed self-consistent theory developed for monodisperse suspensions [Yeomans-Reyna and Medina-Noyola, Phys. Rev. E 64, 066114 (2001)]. As an illustration of its quantitative accuracy, its application to a simple model of a binary dispersion in the absence of hydrodynamic interactions is reported.
Pulsed laser photoacoustic monitoring of paper pulp consistency
Zhao, Zuomin; Törmänen, Matti; Myllylä, Risto
2008-06-01
This study involves measurements of pulp consistency in a cuvette and with an online apparatus, using an innovative scattering photoacoustic (SPA) method. The theoretical aspects are described first. Then, several kinds of wood fiber suspensions with consistencies from 0.5% to 5% were studied in a cuvette. After that, a pilot online apparatus was built to measure suspensions with fiber consistency lower than 1% and filler content up to 3%. The results showed that although there were many fiber flocs in the cuvette, which strongly affected the accuracy of the sample consistency measurements, the apparatus can distinguish fiber types with different optical and acoustic properties. The measurement accuracy can be greatly improved in the online apparatus by pumping suspension fluids through a circulating system to improve suspension homogeneity. The results demonstrated that wood fibers cause larger attenuation of acoustic waves but fillers do not. On the other hand, fillers cause stronger scattering of incident light. Therefore, our SPA apparatus has the potential to simultaneously determine fiber and filler fractions in pulp suspensions with consistency up to 5%.
Consistency of Scalar Potentials from Quantum de Sitter Space
Espinosa, José R; Trépanier, Maxime
2015-01-01
We derive constraints on the scalar potential of a quantum field theory in de Sitter space. The constraints, which we argue should be understood as consistency conditions for quantum field theories in dS space, originate from a consistent interpretation of quantum de Sitter space through its Coleman-De Luccia tunneling rate. Indeed, consistency of de Sitter space as a quantum theory of gravity with a finite number of degrees of freedom suggests the tunneling rates to vacua with negative cosmological constants be interpreted as Poincaré recurrences. Demanding the tunneling rate to be a Poincaré recurrence imposes two constraints, or consistency conditions, on the scalar potential. Although the exact consistency conditions depend on the shape of the scalar potential, generically they correspond to: the distance in field space between the de Sitter vacuum and any other vacuum with negative cosmological constant must be of the order of the reduced Planck mass or larger; and the fourth root of the vacuum energ...
Gravitationally Consistent Halo Catalogs and Merger Trees for Precision Cosmology
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.
2013-01-01
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
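The core of such cross-snapshot checks can be illustrated with a small sketch (this is not the published consistent-trees code; the function name, the drift-only extrapolation and the 2.0 threshold are illustrative assumptions): a descendant halo is flagged as physically inconsistent when its position is far from where its matched progenitor's position and velocity would place it.

```python
import numpy as np

def progenitor_position_error(x_prog, v_prog, x_desc, dt, box_size):
    """Distance between each descendant halo's actual position and the
    position extrapolated from its matched progenitor's position and
    velocity, with periodic (minimum-image) wrapping."""
    d = x_desc - (x_prog + v_prog * dt)            # drift-only extrapolation
    d -= box_size * np.round(d / box_size)         # minimum-image convention
    return np.linalg.norm(d, axis=1)

# Toy catalog: halo 0 moves as its velocity predicts, halo 1 does not
x0 = np.array([[10.0, 10.0, 10.0], [50.0, 50.0, 50.0]])
v0 = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
x1 = np.array([[11.0, 10.0, 10.0], [50.0, 58.0, 50.0]])

err = progenitor_position_error(x0, v0, x1, dt=1.0, box_size=100.0)
flagged = err > 2.0        # tolerance would be tuned per halo finder
```

In the real algorithm the flagged objects are candidates for repair (inserting a missing halo) or removal (a spurious detection); the sketch only shows the consistency metric itself.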
Analysis of Consistency of Printing Blankets using Correlation Technique
Balaraman Kumar
2010-06-01
This paper presents the application of an analytical tool to quantify the material consistency of offset printing blankets. Printing blankets are viscoelastic rubber composites of several laminas. High levels of material consistency are expected from rubber blankets for quality print and for quick recovery from smash encountered during the printing process. The present study aims to determine objectively the consistency of printing blankets at three specific torque levels of tension under two distinct stages: (1) under normal printing conditions, and (2) on recovery after smash. The experiment devised exhibits a variation in the tone reproduction properties of each blanket, signifying levels of inconsistency in the thickness direction as well. A correlation technique was employed on ink density variations obtained from the blanket on paper. Both blankets exhibited good consistency over the three torque levels under normal printing conditions. On smash, however, the recovery of a blanket and its consistency were a function of manufacturing and torque level. This study provides a new metric for failure analysis of offset printing blankets and underscores the need for optimising the torque for blankets from different manufacturers.
Mainstreaming life cycle thinking through a consistent approach to footprints
Ridoutt, Brad; Pfister, Stephan; Manzardo, Alessandro
2016-01-01
...the auspices of the UNEP/SETAC Life Cycle Initiative project on environmental Life Cycle Impact Assessment has been working to develop generic guidance for developers of footprint metrics. The initial work involved forming a consensual position on the difference between footprints and existing LCA impact ... -alone and not part of a framework intended for comprehensive environmental performance assessment. Accordingly, footprints are universally defined as metrics used to report life cycle assessment results addressing an Area of Concern.
Soubielle, Marie-Laure
2015-04-01
2015 has been declared the International Year of Light. Sunlight plays a major role in the world: from the sunbeams that heat our planet and feed our plants, to the optical analysis of the sun and the modern use of solar energy in technology, sunlight is everywhere and it is vital. This project aims to better understand the light of the Sun across a variety of fields. The experiments are carried out by students aged 15 to 20, who will share their discoveries with Italian students from primary and secondary schools. The experiments will also be presented to a group of Danish students visiting our school in January. All experiments are carried out in English and involve teams of teachers. The project is threefold: Part 1, a biology project: what are the mechanisms of photosynthesis? Part 2, an optics project: what are the components of sunlight and how can they be used? Part 3, a technology project: how can the energy of sunlight power modern devices? Photosynthesis project (Biology and English). Context: photosynthesis is a process used by plants and other organisms to convert light energy, normally from the Sun, into chemical energy that can later fuel the organisms' activities. This chemical energy is stored in molecules which are synthesized from carbon dioxide and water; in most cases, oxygen is released as a waste product. Most plants perform photosynthesis, which maintains atmospheric oxygen levels and supplies all of the organic compounds and most of the energy necessary for life on Earth. Outcome: our project consists in understanding the various steps of photosynthesis. Students will shoot a DVD of the experiments, presenting the equipment required, the steps of the experiments, and the results obtained, for a better understanding of photosynthesis. Digital pen project (Electricity, Optics and English). Context: sunlight is a complex source of light based on white light that can be decomposed to explain light radiations or colours. This light is a precious source to create
A Dynamical Mechanism for Large Volumes with Consistent Couplings
Abel, Steven
2016-01-01
A mechanism for addressing the 'decompactification problem' is proposed, which consists of balancing the vacuum energy in Scherk-Schwarz theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft-terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because th...
Consistent group selection in high-dimensional linear regression
Wei, Fengrong; 10.3150/10-BEJ252
2010-01-01
In regression problems where covariates can be naturally grouped, the group Lasso is an attractive method for variable selection since it respects the grouping structure in the data. We study the selection and estimation properties of the group Lasso in high-dimensional settings when the number of groups exceeds the sample size. We provide sufficient conditions under which the group Lasso selects a model whose dimension is comparable with the underlying model with high probability and is estimation consistent. However, the group Lasso is, in general, not selection consistent and also tends to select groups that are not important in the model. To improve the selection results, we propose an adaptive group Lasso method which is a generalization of the adaptive Lasso and requires an initial estimator. We show that the adaptive group Lasso is consistent in group selection under certain conditions if the group Lasso is used as the initial estimator.
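The two-step procedure described above can be sketched concretely. The following proximal-gradient group-Lasso solver, the toy data and the regularization values are illustrative assumptions of mine, not the paper's setup; the adaptive step simply reweights each group by the inverse norm of an initial group-Lasso fit, so groups whose initial estimate is (near) zero receive huge penalties and are removed.

```python
import numpy as np

def group_lasso(X, y, groups, lam, weights=None, n_iter=500):
    """Proximal-gradient solver for
    (1/2n)||y - Xb||^2 + lam * sum_g w_g ||b_g||_2."""
    n, p = X.shape
    b = np.zeros(p)
    w = np.ones(len(groups)) if weights is None else np.asarray(weights)
    lr = n / np.linalg.norm(X, 2) ** 2            # 1 / Lipschitz constant of the gradient
    for _ in range(n_iter):
        b = b - lr * (X.T @ (X @ b - y) / n)      # gradient step on the squared loss
        for g, wg in zip(groups, w):              # block soft-thresholding prox
            norm = np.linalg.norm(b[g])
            b[g] = 0.0 if norm == 0 else max(0.0, 1 - lr * lam * wg / norm) * b[g]
    return b

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 6))
beta_true = np.array([2.0, -1.5, 0.0, 0.0, 0.0, 0.0])  # only group 0 is active
y = X @ beta_true + 0.1 * rng.standard_normal(200)
groups = [np.array([0, 1]), np.array([2, 3]), np.array([4, 5])]

b_init = group_lasso(X, y, groups, lam=0.1)            # step 1: plain group Lasso
w = 1.0 / np.maximum([np.linalg.norm(b_init[g]) for g in groups], 1e-8)
b_ada = group_lasso(X, y, groups, lam=0.1, weights=w)  # step 2: adaptive reweighting
```

With this toy data the adaptive fit keeps the active group and zeroes out the two inactive groups, mirroring the selection-consistency claim.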
Lightness constancy through transparency: internal consistency in layered surface representations.
Singh, Manish
2004-01-01
Asymmetric lightness matching was employed to measure how the visual system assigns lightness to surface patches seen through partially-transmissive surfaces. Observers adjusted the luminance of a comparison patch seen through transparency, in order to match the lightness of a standard patch seen in plain view. Plots of matched-to-standard luminance were linear, and their slopes were consistent with Metelli's alpha. A control experiment confirmed that these matches were indeed transparency based. Consistent with recent results, however, when observers directly matched the transmittance of transparent surfaces, their matches deviated strongly and systematically from Metelli's alpha. Although the two sets of results appear to be contradictory, formal analysis reveals a deeper mutual consistency in the representation of the two layers. A ratio-of-contrasts model is shown to explain both the success of Metelli's model in predicting lightness through transparency, and its failure to predict perceived transmittance--and hence is seen to play the primary role in perceptual transparency.
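The two quantities the abstract contrasts can be illustrated numerically. In Metelli's episcotister model a filter with transmittance alpha and a filter-related term t maps a background luminance a to p = alpha*a + (1-alpha)*t; the sketch below (toy luminance values of my own choosing, not the study's stimuli) recovers alpha from luminance differences and separately computes the ratio of Michelson contrasts.

```python
def metelli_alpha(a, b, p, q):
    """Metelli's alpha recovered from two background luminances (a, b) seen
    in plain view and the same regions seen through the filter (p, q)."""
    return (p - q) / (a - b)

def michelson_contrast(x, y):
    return abs(x - y) / (x + y)

# Episcotister model: p = alpha * a + (1 - alpha) * t
alpha, t = 0.5, 20.0                     # assumed transmittance and filter term
a, b = 80.0, 40.0                        # background luminances in plain view
p = alpha * a + (1 - alpha) * t          # luminance of region a through the filter
q = alpha * b + (1 - alpha) * t          # luminance of region b through the filter

alpha_hat = metelli_alpha(a, b, p, q)    # recovers alpha = 0.5 exactly
# Ratio-of-contrasts: contrast seen through the filter relative to plain view
ratio = michelson_contrast(p, q) / michelson_contrast(a, b)
```

Note that alpha_hat and ratio generally differ, which is the formal point of the abstract: luminance-difference matching tracks Metelli's alpha while transmittance judgments track the contrast ratio.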
A Consistent Semantics of Self-Adjusting Computation
Acar, Umut A; Donham, Jacob
2011-01-01
This paper presents a semantics of self-adjusting computation and proves that the semantics are correct and consistent. The semantics integrate change propagation with the classic idea of memoization to enable reuse of computations under mutation to memory. During evaluation, reuse of a computation via memoization triggers a change propagation that adjusts the reused computation to reflect the mutated memory. Since the semantics integrate memoization and change propagation, they involve both non-determinism (due to memoization) and mutation (due to change propagation). Our consistency theorem states that the non-determinism is not harmful: any two evaluations of the same program starting at the same state yield the same result. Our correctness theorem states that mutation is not harmful: self-adjusting programs are consistent with purely functional programming. We formalize the semantics and their meta-theory in the LF logical framework and machine check our proofs using Twelf.
Consistency and Reconciliation Model In Regional Development Planning
Dina Suryawati
2016-10-01
The aim of this study was to identify the problems in and determine a conceptual model of regional development planning. Regional development planning is a systemic, complex and unstructured process; therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and inter-regional planning documents; integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves a technocratic system with both top-down and bottom-up participation; the two must be balanced and must not overlap or dominate each other. Keywords: regional, development, planning, consistency, reconciliation
The consistency approach for the quality control of vaccines.
Hendriksen, Coenraad; Arciniega, Juan L; Bruckner, Lukas; Chevalier, Michel; Coppens, Emmanuelle; Descamps, Johan; Duchêne, Michel; Dusek, David Michael; Halder, Marlies; Kreeftenberg, Hans; Maes, Alexandrine; Redhead, Keith; Ravetkar, Satish D; Spieser, Jean-Marc; Swam, Hanny
2008-01-01
Current lot release testing of conventional vaccines emphasizes quality control of the final product and is characterized by its extensive use of laboratory animals. This report, which is based on the outcome of an ECVAM (European Centre for Validation of Alternative Methods, Institute for Health and Consumer Protection, European Commission Joint Research Centre, Ispra, Italy) workshop, discusses the concept of consistency testing as an alternative approach for lot release testing. The consistency approach for the routine release of vaccines is based upon the principle that the quality of vaccines is a consequence of a quality system and of consistent production of lots with similar characteristics to those lots that have been shown to be safe and effective in humans or the target species. The report indicates why and under which circumstances this approach can be applied, the role of the different stakeholders, and the need for international harmonization. It also gives recommendations for its implementation.
Self-consistent modelling of resonant tunnelling structures
Fiig, T.; Jauho, A.P.
1992-01-01
We report a comprehensive study of the effects of self-consistency on the I-V characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated ... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation-layer charge and the quantum-well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation-layer charges ...
Model-Consistent Sparse Estimation through the Bootstrap
Bach, Francis
2009-01-01
We consider the least-square linear regression problem with regularization by the $\\ell^1$-norm, a problem usually referred to as the Lasso. In this paper, we first present a detailed asymptotic analysis of model consistency of the Lasso in low-dimensional settings. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection. For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection procedure, referred to as the Bolasso, is extended to high-dimensional settings by a provably consistent two-step procedure.
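The Bolasso procedure described above (run the Lasso on bootstrapped replications of the sample and intersect the supports) is straightforward to sketch. The tiny coordinate-descent Lasso, the data and the regularization value below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=100):
    """Tiny coordinate-descent Lasso for (1/2n)||y - Xb||^2 + alpha*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    r = y.copy()                                   # residual y - Xb, b starts at 0
    for _ in range(n_iter):
        for j in range(p):
            rho = X[:, j] @ r + col_sq[j] * b[j]   # correlation with partial residual
            bj = np.sign(rho) * max(abs(rho) / n - alpha, 0.0) * n / col_sq[j]
            r += X[:, j] * (b[j] - bj)             # update residual in place
            b[j] = bj
    return b

def bolasso_support(X, y, alpha, n_boot=32, seed=0):
    """Bolasso: intersect Lasso supports across bootstrap resamples."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    keep = np.ones(X.shape[1], dtype=bool)
    for _ in range(n_boot):
        idx = rng.integers(0, n, size=n)           # one bootstrap replication
        keep &= np.abs(lasso_cd(X[idx], y[idx], alpha)) > 1e-10
    return keep

rng = np.random.default_rng(1)
X = rng.standard_normal((300, 8))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * rng.standard_normal(300)
support = bolasso_support(X, y, alpha=0.05)
# relevant variables 0 and 1 survive every intersection; irrelevant ones,
# selected only with some probability per replicate, are quickly eliminated
```

This illustrates the paper's key observation: each individual Lasso fit may include spurious variables with positive probability, but intersecting supports across replications removes them.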
Consistency of the group Lasso and multiple kernel learning
Bach, Francis
2007-01-01
We consider the least-square regression problem with regularization by a block 1-norm, i.e., a sum of Euclidean norms over spaces of dimensions larger than one. This problem, referred to as the group Lasso, extends the usual regularization by the 1-norm where all spaces have dimension one, where it is commonly referred to as the Lasso. In this paper, we study the asymptotic model consistency of the group Lasso. We derive necessary and sufficient conditions for the consistency of group Lasso under practical assumptions, such as model misspecification. When the linear predictors and Euclidean norms are replaced by functions and reproducing kernel Hilbert norms, the problem is usually referred to as multiple kernel learning and is commonly used for learning from heterogeneous data sources and for non linear variable selection. Using tools from functional analysis, and in particular covariance operators, we extend the consistency results to this infinite dimensional case and also propose an adaptive scheme to obt...
The consistent histories approach to loop quantum cosmology
Craig, David A
2016-01-01
We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on "measurements" to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories -- as determined by the system's decoherence functional -- that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with ...
One-particle-irreducible consistency relations for cosmological perturbations
Goldberger, Walter D; Nicolis, Alberto
2013-01-01
We derive consistency relations for correlators of scalar cosmological perturbations which hold in the "squeezed limit" in which one or more of the external momenta become soft. Our results are formulated as relations between suitably defined one-particle irreducible N-point and (N-1)-point functions that follow from residual spatial conformal diffeomorphisms of the unitary gauge Lagrangian. As such, some of these relations are exact to all orders in perturbation theory, and do not rely on approximate deSitter invariance or other dynamical assumptions (e.g., properties of the operator product expansion or the behavior of modes at horizon crossing). The consistency relations apply model-independently to cosmological scenarios where the time evolution is driven by a single scalar field. Besides reproducing the known results for single-field inflation in the slow roll limit, we verify that our consistency relations hold more generally, for instance in ghost condensate models in flat space. We comment on possible...
Multiscale Parameter Regionalization for consistent global water resources modelling
Wanders, Niko; Wood, Eric; Pan, Ming; Samaniego, Luis; Thober, Stephan; Kumar, Rohini; Sutanudjaja, Edwin; van Beek, Rens; Bierkens, Marc F. P.
2017-04-01
Due to an increasing demand for high- and hyper-resolution water resources information, it has become increasingly important to ensure consistency in model simulations across scales. This consistency can be ensured by scale independent parameterization of the land surface processes, even after calibration of the water resource model. Here, we use the Multiscale Parameter Regionalization technique (MPR, Samaniego et al. 2010, WRR) to allow for a novel, spatially consistent, scale independent parameterization of the global water resource model PCR-GLOBWB. The implementation of MPR in PCR-GLOBWB allows for calibration at coarse resolutions and subsequent parameter transfer to the hyper-resolution. In this study, the model was calibrated at 50 km resolution over Europe and validation carried out at resolutions of 50 km, 10 km and 1 km. MPR allows for a direct transfer of the calibrated transfer function parameters across scales and we find that we can maintain consistent land-atmosphere fluxes across scales. Here we focus on the 2003 European drought and show that the new parameterization allows for high-resolution calibrated simulations of water resources during the drought. For example, we find a reduction from 29% to 9.4% in the percentile difference in the annual evaporative flux across scales when compared against default simulations. Soil moisture errors are reduced from 25% to 6.9%, clearly indicating the benefits of the MPR implementation. This new parameterization allows us to show more spatial detail in water resources simulations that are consistent across scales and also allow validation of discharge for smaller catchments, even with calibrations at a coarse 50 km resolution. The implementation of MPR allows for novel high-resolution calibrated simulations of a global water resources model, providing calibrated high-resolution model simulations with transferred parameter sets from coarse resolutions. The applied methodology can be transferred to other
Neighborhood consistency in mental arithmetic: Behavioral and ERP evidence
Verguts Tom
2007-12-01
Background: Recent cognitive and computational models (e.g. the Interacting Neighbors Model) state that in simple multiplication the decade and unit digits of the candidate answers (including the correct result) are represented separately. These models thus challenge holistic views of number representation as well as traditional accounts of the classical problem-size effect in simple arithmetic (i.e. the finding that large problems are answered more slowly and less accurately than small problems). Empirical data supporting this view are still scarce. Methods: We report data from 24 participants who performed a multiplication verification task with Arabic digits (e.g. 8 × 4 = 36: true or false?). Behavioral (RT and errors) and EEG (ERP) measures were recorded in parallel. Results: We provide evidence for neighborhood-consistency effects in the verification of simple multiplication problems (e.g. 8 × 4). Behaviorally, we find that decade-consistent lures, which share their decade digit with the correct result (e.g. 36), are harder to reject than matched inconsistent lures, which differ in both digits from the correct result (e.g. 28). This neighborhood-consistency effect in product verification is similar to recent observations in the production of multiplication results. With respect to event-related potentials, we find significant differences for consistent compared to inconsistent lures in the N400 (increased negativity) and the Late Positive Component (reduced positivity). In this respect, consistency effects in our paradigm resemble lexico-semantic effects found earlier in simple arithmetic and in orthographic input processing. Conclusion: Our data suggest that neighborhood-consistency effects in simple multiplication stem at least partly from central ('lexico-semantic') stages of processing. These results are compatible with current models of the representation of simple multiplication facts, in particular the Interacting Neighbors Model.
Agent-Based Context Consistency Management in Smart Space Environments
Jih, Wan-Rong; Hsu, Jane Yung-Jen; Chang, Han-Wen
Context-aware systems in smart space environments must be aware of the context of their surroundings and adapt to changes in highly dynamic environments. Data management of contextual information is different from traditional approaches because the contextual information is dynamic, transient, and fallible in nature. Consequently, the capability to detect context inconsistency and maintain consistent contextual information are two key issues for context management. We propose an ontology-based model for representing, deducing, and managing consistent contextual information. In addition, we use ontology reasoning to detect and resolve context inconsistency problems, which will be described in a Smart Alarm Clock scenario.
Consistency among integral measurements of aggregate decay heat power
Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. (Nagoya Univ., Japan)
1998-03-01
Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. This method is then applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component, and fairly well for the γ component, except for cooling times longer than 4000 s. (author)
The consistency service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2011-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools and DQ2 site services, or by site administrators who report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
The Consistency Service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2010-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools and DQ2 site services, or by site administrators who report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
Towards consistent nuclear models and comprehensive nuclear data evaluations
Bouland, O.; Hale, G. M.; Lynn, J. E.; Talou, P. (Los Alamos National Laboratory); Bernard, D.; Litaize, O.; Noguere, G.; De Saint Jean, C.; Serot, O. (France)
2010-01-01
The essence of this paper is to highlight the consistency achieved nowadays in nuclear data and uncertainty assessments in terms of compound nucleus reaction theory, from the neutron separation energy to the continuum. By making the theories used in the resolved resonance (R-matrix theory), unresolved resonance (average R-matrix theory) and continuum (optical model) ranges continuous through a generalization of the so-called SPRT method, consistent average parameters are extracted from observed measurements and the associated covariances are calculated over the whole energy range. This paper recalls, in particular, recent advances in fission cross section calculations and suggests some hints for future developments.
Standard Model Vacuum Stability and Weyl Consistency Conditions
Antipin, Oleg; Gillioz, Marc; Krog, Jens
2013-01-01
At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model, known as the Weyl consistency conditions. We show that it is possible to satisfy them order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions.
Remark on the Consistent Gauge Anomaly in Supersymmetric Theories
Ohshima, Y; Suzuki, H; Yasuta, H; Ohshima, Yoshihisa; Okuyama, Kiyoshi; Suzuki, Hiroshi; Yasuta, Hirofumi
1999-01-01
We present a direct field-theoretical calculation of the consistent gauge anomaly in the superfield formalism, on the basis of a definition of the effective action through the covariant gauge current. The scheme is conceptually and technically simple, and the gauge covariance in intermediate steps reduces calculational labor considerably. The resultant superfield anomaly, being proportional to the anomaly coefficient d^{abc} = tr T^a{T^b, T^c}, is minimal even without supplementing any counterterms. Our anomaly coincides with the anomaly obtained by Marinković as the solution of the Wess-Zumino consistency condition.
A Van Atta reflector consisting of half-wave dipoles
Appel-Hansen, Jørgen
1966-01-01
The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern is first deduced by qualitative physical considerations. Various types of array elements are considered and several geometrical configurations of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined, and the dependence ...
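The retrodirective behaviour at the heart of a Van Atta array can be sketched numerically. The model below is an idealized isotropic-element array factor (it ignores the dipole element pattern, scattering and coupling effects that the paper analyses): element n retransmits the signal received by its mirror partner, so the phase profile is conjugated about the array centre and the main lobe returns toward the source.

```python
import numpy as np

def van_atta_af(theta, theta_inc, spacing_wl=0.5, n_elem=4):
    """Array factor of a linear Van Atta array of isotropic elements.

    Element n retransmits the signal received by its mirror partner, so the
    transmit phase of element n is the receive phase of element n_elem-1-n.
    Angles in radians; element spacing in wavelengths.
    """
    k = 2 * np.pi                               # wavenumber, lengths in wavelengths
    x = (np.arange(n_elem) - (n_elem - 1) / 2) * spacing_wl
    xp = x[::-1]                                # mirror-paired partner positions
    phase = k * (xp * np.sin(theta_inc) + np.outer(np.sin(theta), x))
    return np.abs(np.exp(1j * phase).sum(axis=1))

theta = np.linspace(-np.pi / 2, np.pi / 2, 721)         # 0.25 degree grid
af = van_atta_af(theta, theta_inc=np.deg2rad(30.0))
theta_peak = np.rad2deg(theta[np.argmax(af)])           # main lobe back toward the source
```

Because the pair positions satisfy x + xp = 0 about the centre, the phases cancel exactly when the observation angle equals the incidence angle, giving the retrodirective peak.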
HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES
Chunhui Zhang; Menghua Qin
2004-01-01
The mechanical and chemical effects on the pulping properties of old newsprint were studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and flotation conditions such as air flow velocity, air pressure and flotation time were also examined with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide and deinking agent are the key factors in the chemical effect, and that pulping consistency is more important than pulping time and rotation speed in the mechanical effect during high-consistency pulping of ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.
Island of Stability for Consistent Deformations of Einstein's Gravity
Berkhahn, Felix; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin
2011-01-01
We construct explicitly deformations of Einstein's theory of gravity that are consistent and phenomenologically viable since they respect, in particular, cosmological backgrounds. We show that these deformations have unique symmetries in accordance with unitarity requirements, and give rise to a curvature induced self-stabilizing mechanism. As a consequence, any nonlinear completed deformation must incorporate self-stabilization on generic spacetimes already at lowest order in perturbation theory. Furthermore, our findings include the possibility of consistent and phenomenologically viable deformations of general relativity that are solely operative on curved spacetime geometries, reducing to Einstein's theory on the Minkowski background.
Quantum monadology: a consistent world model for consciousness and physics.
Nakagomi, Teruaki
2003-04-01
The NL world model presented in the previous paper is embodied by use of relativistic quantum mechanics, which reveals the significance of the reduction of quantum states and the relativity principle, and locates consciousness and the concept of flowing time consistently in physics. This model provides a consistent framework to solve apparent incompatibilities between consciousness (as our interior experience) and matter (as described by quantum mechanics and relativity theory). Does matter have an inside? What is the flowing time now? Does physics allow the indeterminism by volition? The problem of quantum measurement is also resolved in this model.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification for using the cluster bootstrap for inferences from generalized estimating equations (GEE) for clustered/longitudinal data. Under general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference.
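The resampling scheme itself is simple to sketch. The toy clustered data and the grand-mean estimator below are illustrative choices of mine, not the paper's GEE setting: whole clusters are resampled with replacement and the statistic is refit on each replicate, so within-cluster dependence is preserved in every resample.

```python
import numpy as np

def cluster_bootstrap_se(clusters, estimator, n_boot=500, seed=0):
    """Standard error of `estimator` under the cluster bootstrap: whole
    clusters are resampled with replacement and the estimator is refit
    on each replicate."""
    rng = np.random.default_rng(seed)
    m = len(clusters)
    stats = [estimator([clusters[i] for i in rng.integers(0, m, size=m)])
             for _ in range(n_boot)]
    return np.std(stats, ddof=1)

# Toy clustered data: 40 subjects, 5 strongly correlated observations each
rng = np.random.default_rng(42)
clusters = [1.0 + rng.normal() + 0.3 * rng.standard_normal(5) for _ in range(40)]

estimator = lambda cs: float(np.mean(np.concatenate(cs)))  # grand mean
se_cluster = cluster_bootstrap_se(clusters, estimator)

all_obs = np.concatenate(clusters)
se_naive = all_obs.std(ddof=1) / np.sqrt(len(all_obs))     # ignores clustering
# se_cluster exceeds se_naive: treating correlated observations as
# independent understates the uncertainty of the estimate
```

The comparison with the naive i.i.d. standard error shows why the method matters: ignoring the within-cluster dependence gives confidence sets that are too small.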
Consistent Deformed Bosonic Algebra in Noncommutative Quantum Mechanics
Zhang, Jian-Zu
2009-01-01
In two-dimensional noncommutative space for the case of both position-position and momentum-momentum noncommuting, the consistent deformed bosonic algebra at the non-perturbation level described by the deformed annihilation and creation operators is investigated. A general relation between noncommutative parameters is fixed from the consistency of the deformed Heisenberg-Weyl algebra with the deformed bosonic algebra. A Fock space is found, in which all calculations can be similarly developed as if in commutative space and all effects of spatial noncommutativity are simply represented by parameters.
HIGH CONSISTENCY PULPING OF OLD NEWSPRINT AND ITS FLOTATION PROPERTIES
Chunhui Zhang; Menghua Qin
2004-01-01
The mechanical and chemical effects on the pulping properties of old newsprint (ONP) were studied using a FORMAX Micro-Maelstrom Laboratory Pulper, and flotation conditions such as air flow velocity, air pressure, and flotation time were also examined with a FORMAX Deink Cell. The results show that sodium hydroxide, sodium silicate, hydrogen peroxide, and deinking agent are the key factors in the chemical effect, and that pulping consistency is more important than pulping time and rotation speed in the mechanical effect during high consistency pulping of ONP. In general, the chemical effect has a greater influence on the deinked pulp properties than the mechanical effect.
Dynamically Consistent Nonlinear Evaluations with Their Generating Functions in Lp
Feng HU
2013-01-01
In this paper, we study dynamically consistent nonlinear evaluations in Lp (1 < p < 2). One of our aims is to obtain the following result: under a domination condition, an Ft-consistent evaluation is an εg-evaluation in Lp. Furthermore, without the assumption that the generating function g(t,ω,y,z) is continuous with respect to t, we provide some useful characterizations of an εg-evaluation by g and give some applications. These results include and extend some existing results.
Consistency of Social Sensing Signatures Across Major US Cities
Soliman, Aiman; Padmanabhan, Anand; Wang, Shaowen
2016-01-01
Previous studies have shown that Twitter users are biased to tweet from certain locations (locational bias) and during certain hours (temporal bias). We used three years of geolocated Twitter data to quantify these biases and to test our central hypothesis that Twitter users' biases are consistent across US cities. Our results suggest that the temporal and locational biases of Twitter users are inconsistent across three US metropolitan cities. We derive conclusions about the role of the complexity of the underlying data-producing process on its consistency and argue for a potential research avenue for Geospatial Data Science to test and quantify these inconsistencies in the class of organically evolved Big Data.
Harvey-Collard, Patrick [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-10-27
From January 2015 to July 2015, I was doing research at Sandia National Laboratories in Albuquerque, United States. My work there consisted of performing experimental measurements using Sandia’s unique silicon quantum computing platform. The project is about coupling donor spin quantum bits, or qubits, to quantum dots in a silicon nanostructure based on conventional microchip technology. During the project, I devised a new quantum state readout mechanism that allows better, longer-lived measurement signals. The measurement (or readout) mechanism is key to any qubit architecture. Next, I was able to demonstrate a quantum manipulation of the two-electron spin states of the coupled donor and quantum dot system. This constitutes a breakthrough for donor spin qubits in silicon because it could enable larger systems consisting of many qubits. This project will lead to publications in scientific journals and presentations at international conferences, and it generates exciting new opportunities for manipulating nature at the nanoscale.
The consistent Riccati expansion and new interaction solution for a Boussinesq-type coupled system
Ruan, Shao-Qing; Yu, Wei-Feng; Yu, Jun; Yu, Guo-Xiang
2015-06-01
Starting from the Davey-Stewartson equation, a Boussinesq-type coupled equation system is obtained by using a variable separation approach. For the Boussinesq-type coupled equation system, its consistent Riccati expansion (CRE) solvability is studied with the help of a Riccati equation. It is significant that the soliton-cnoidal wave interaction solution, expressed explicitly by Jacobi elliptic functions and the third type of incomplete elliptic integral, of the system is also given. Project supported by the National Natural Science Foundation of China (Grant No. 11275129).
1990-01-01
Project Exodus is an in-depth study to identify and address the basic problems of a manned mission to Mars. The most important problems concern propulsion, life support, structure, trajectory, and finance. Exodus will employ a passenger ship, cargo ship, and landing craft for the journey to Mars. These three major components of the mission design are discussed separately. Within each component the design characteristics of structures, trajectory, and propulsion are addressed. The design characteristics of life support are mentioned only in those sections requiring it.
Sørensen, Eigil V.; Aarup, Bendt
The objective of the FLOAT project is to study the reliability of high-performance fibre-reinforced concrete, also known as Compact Reinforced Composite (CRC), for the floats of wave energy converters. In order to reach a commercial breakthrough, wave energy converters need to achieve a lower price of energy produced, comparable to prices currently obtained from offshore wind power, and this can be done by the use of more suitable materials. The flotation device is a key part of converters, as it accounts for a considerable share of initial investment, up to 27% depending on the converter. CRC floats...
Pagosa Springs geothermal project. Final technical report
1984-10-19
This booklet discusses some ideas and methods for using Colorado geothermal energy. A project installed in Pagosa Springs, which consists of a pipeline laid down 8th street with service to residences retrofitted to geothermal space heating, is described. (ACR)
Eberenz, J.; Herold, M.; Verbesselt, J.; Wijaya, A.; Lindquist, E.; Defourny, P.; Gibbs, H.K.; Arino, O.; Achard, F.
2015-01-01
This study predicts global forest cover change for the 1980s and 1990s from AVHRR time series metrics in order to show how the series of consistent land cover maps for climate modeling produced by the ESA climate change initiative land cover project can be extended back in time. A Random Forest model
Koenig, Bruce E; Lacey, Douglas S
2014-07-01
In this research project, nine small digital audio recorders were tested using five sets of 30-min recordings at all available recording modes, with consistent audio material, identical source and microphone locations, and identical acoustic environments. The averaged direct current (DC) offset values and standard deviations were measured for 30-sec and 1-, 2-, 3-, 6-, 10-, 15-, and 30-min segments. The research found an inverse association between segment lengths and the standard deviation values and that lengths beyond 30 min may not meaningfully reduce the standard deviation values. This research supports previous studies indicating that measured averaged DC offsets should only be used for exclusionary purposes in authenticity analyses and exhibit consistent values when the general acoustic environment and microphone/recorder configurations were held constant. Measured average DC offset values from exemplar recorders may not be directly comparable to those of submitted digital audio recordings without exactly duplicating the acoustic environment and microphone/recorder configurations.
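The segment-averaging procedure described above can be sketched as follows (synthetic signal and assumed parameter values; the study measured real recorder output at segment lengths from 30 sec to 30 min):

```python
import numpy as np

def segment_dc_offsets(signal, sample_rate, segment_seconds):
    """Mean value (DC offset) of consecutive fixed-length segments of an
    audio signal. Illustrative helper, not the study's forensic protocol."""
    seg_len = int(sample_rate * segment_seconds)
    n_segs = len(signal) // seg_len
    segs = signal[: n_segs * seg_len].reshape(n_segs, seg_len)
    return segs.mean(axis=1)

rng = np.random.default_rng(1)
# Synthetic "recording": zero-mean noise plus a constant 0.002 DC offset
# (both values are made up for the example).
sig = 0.002 + rng.normal(0, 0.1, size=48_000 * 60)   # 1 min at 48 kHz
offsets = segment_dc_offsets(sig, 48_000, 10)        # 10-second segments
print(offsets.mean(), offsets.std())
```

Repeating this at increasing segment lengths shows the effect the study reports: the standard deviation of the per-segment offsets shrinks as segments grow, with diminishing returns at longer lengths.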
Brief Report: Consistency of Search Engine Rankings for Autism Websites
Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.
2012-01-01
The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
Consistency Within Diversity: Guidelines for Programs to Honor Exemplary Teaching.
Svinicki, Marilla D.; Menges, Robert J.
1996-01-01
Good programs for recognizing exemplary college teaching are consistent with institutional mission and values, are grounded in research-based competencies and practices, recognize all significant facets of instruction, reward both collaborative and individual achievements, neither preclude nor replace the institutional reward system, call on those…
Weakly time consistent concave valuations and their dual representations
Roorda, Berend; Schumacher, Johannes M.
2016-01-01
We derive dual characterizations of two notions of weak time consistency for concave valuations, which are convex risk measures under a positive sign convention. Combined with a suitable risk aversion property, these notions are shown to amount to three simple rules for not necessarily minimal representations.
A Consistent Procedure for Pseudo-Component Delumping
Leibovici, Claude; Stenby, Erling Halfdan; Knudsen, Kim
1996-01-01
Thereby infinite dilution K-values can be obtained exactly without any further computation. Based on these results a consistent procedure for the estimation of equilibrium constants in the more classical cases of finite dilution has been developed. It can be used when moderate binary interaction parameters...
Hippocampography Guides Consistent Mesial Resections in Neocortical Temporal Lobe Epilepsy
Marcus C. Ng
2016-01-01
Background. The optimal surgery in lesional neocortical temporal lobe epilepsy is unknown. Hippocampal electrocorticography maximizes seizure freedom by identifying normal-appearing epileptogenic tissue for resection and minimizes neuropsychological deficit by limiting resection to demonstrably epileptogenic tissue. We examined whether standardized hippocampal electrocorticography (hippocampography) guides resection for more consistent hippocampectomy than unguided resection in conventional electrocorticography focused on the lesion. Methods. Retrospective chart review of any kind of electrocorticography (including hippocampography) as part of combined lesionectomy, anterolateral temporal lobectomy, and hippocampectomy over 8 years. Patients were divided into mesial (i.e., hippocampography) and lateral electrocorticography groups. Primary outcome was deviation from mean hippocampectomy length. Results. Of 26 patients, fourteen underwent hippocampography-guided mesial temporal resection. Hippocampography was associated with 2.6 times more consistent resection. The range of hippocampal resection was 0.7 cm in the mesial group and 1.8 cm in the lateral group (p=0.01). 86% of mesial group versus 42% of lateral group patients achieved seizure freedom (p=0.02). Conclusions. By rationally tailoring excision to demonstrably epileptogenic tissue, hippocampography significantly reduces resection variability for more consistent hippocampectomy than unguided resection in conventional electrocorticography. More consistent hippocampal resection may avoid overresection, which poses greater neuropsychological risk, and underresection, which jeopardizes postoperative seizure freedom.
Discrete anomalies in supergravity and consistency of string backgrounds
Minasian, Ruben; Sasmal, Soumya; Savelli, Raffaele
2017-02-01
We examine SL(2, ℤ) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important rôle in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.
Robust Visual Tracking Via Consistent Low-Rank Sparse Learning
Zhang, Tianzhu
2014-06-19
Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since the temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed-form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
Checking Consistency of Pedigree Information is NP-complete
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
arose originally from the geneticists' need to filter their input data from erroneous information, and is well motivated from both a biological and a sociological viewpoint. This paper shows that consistency checking is NP-complete, even in the presence of three alleles. Several other results...
An algebraic method for constructing stable and consistent autoregressive filters
Harlim, John, E-mail: jharlim@psu.edu [Department of Mathematics, the Pennsylvania State University, University Park, PA 16802 (United States); Department of Meteorology, the Pennsylvania State University, University Park, PA 16802 (United States); Hong, Hoon, E-mail: hong@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States); Robbins, Jacob L., E-mail: jlrobbi3@ncsu.edu [Department of Mathematics, North Carolina State University, Raleigh, NC 27695 (United States)
2015-02-15
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order-two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set as opposed to many standard, regression-based parameterization methods. It takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
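The classical AR stability condition referred to in this abstract can be checked numerically; the helper below is an illustrative sketch (the paper itself derives model parameters algebraically from long-time average statistics rather than testing candidate coefficients):

```python
import numpy as np

def ar_is_stable(phi):
    """Classical stability check for an AR(p) model
    y_t = phi[0]*y_{t-1} + ... + phi[p-1]*y_{t-p} + noise:
    all roots of z^p - phi[0]*z^{p-1} - ... - phi[p-1] must lie
    strictly inside the unit circle. The roots are computed as the
    eigenvalues of the companion matrix of the AR polynomial."""
    p = len(phi)
    companion = np.zeros((p, p))
    companion[0, :] = phi          # first row holds the AR coefficients
    companion[1:, :-1] = np.eye(p - 1)  # shift structure on the subdiagonal
    return bool(np.all(np.abs(np.linalg.eigvals(companion)) < 1.0))

print(ar_is_stable([0.5, 0.3]))   # a stationary AR(2)
print(ar_is_stable([1.2, 0.1]))   # an explosive AR(2)
```

The paper's contribution is, in effect, to choose the discretization time step so that coefficients satisfying both this stability condition and the order-two consistency constraints are guaranteed to exist.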
Consistency of the Takens estimator for the correlation dimension
Borovkova, S; Burton, R; Dehling, H
1999-01-01
Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, thereby extending earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
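For intuition, the correlation-dimension estimation that motivates the paper can be sketched with a Grassberger–Procaccia-style correlation sum on toy one-dimensional data (an illustration only; the paper's results concern U-statistics with unbounded kernels for dependent sequences):

```python
import numpy as np

def correlation_sum(x, r):
    """Fraction of distinct point pairs within distance r: the
    correlation sum C(r), itself a U-statistic of the sample."""
    n = len(x)
    d = np.abs(x[:, None] - x[None, :])
    pairs = d[np.triu_indices(n, k=1)]
    return np.mean(pairs < r)

rng = np.random.default_rng(2)
x = rng.uniform(0, 1, 500)   # uniform points on a line: dimension 1
r1, r2 = 0.05, 0.1
# The slope of log C(r) versus log r estimates the correlation dimension.
dim = (np.log(correlation_sum(x, r2)) - np.log(correlation_sum(x, r1))) / np.log(r2 / r1)
print(f"estimated correlation dimension: {dim:.2f}")
```

For this uniform sample the estimate comes out close to 1, the true dimension; the Takens estimator studied in the paper refines this slope-based approach.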
Usability problem reports for comparative studies: consistency and inspectability
Vermeeren, A.P.O.S.; Attema, J.; Akar, E.; De Ridder, H.; Van Doorn, A.J.; Erburg, Ç.; Berkman, A.E.; Maguire, M.
2008-01-01
This study explores issues of consistency and inspectability in usability test data analysis processes and reports. Problem reports resulting from usability tests performed by three professional usability labs in three different countries are compared. Each of the labs conducted a usability test on
Body saccades of Drosophila consist of stereotyped banked turns
Muijres, F.T.; Elzinga, M.J.; Iwasaki, N.A.; Dickinson, M.H.
2015-01-01
The flight pattern of many fly species consists of straight flight segments interspersed with rapid turns called body saccades, a strategy that is thought to minimize motion blur. We analyzed the body saccades of fruit flies (Drosophila hydei), using high-speed 3D videography to track body and wing
Assessing atmospheric bias correction for dynamical consistency using potential vorticity
Rocheta, Eytan; Evans, Jason P.; Sharma, Ashish
2014-12-01
Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency—the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques—an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exists between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications.
Discrete anomalies in supergravity and consistency of string backgrounds
Minasian, Ruben; Savelli, Raffaele
2016-01-01
We examine SL(2, Z) anomalies in ten and eight-dimensional supergravities, the induced local counterterms and their realization in string theory. Composite connections play an important role in the cancellation mechanism. At the same time their global properties lead to novel non-trivial consistency constraints on compactifications.
New sequential quadratic programming algorithm with consistent subproblems
贺国平; 高自友; 赖炎连
1997-01-01
One of the most interesting topics related to sequential quadratic programming algorithms is how to guarantee the consistency of all quadratic programming subproblems. In this decade, much work trying to change the form of constraints to obtain the consistency of the subproblems has been done. The method proposed by De O. Pantoja J F A and coworkers solves the consistency problem of the SQP method, and is the best to the authors' knowledge. However, the scale and complexity of the subproblems in De O. Pantoja's work increase greatly since all equality constraints have to be changed into absolute form. A new sequential quadratic programming type algorithm is presented by means of a special ε-active set scheme and a special penalty function. Subproblems of the new algorithm are all consistent, and the form of the constraints of the subproblems is as simple as that of the general SQP type algorithms. It can be proved that the new method keeps global convergence and local superlinear convergence.
Personalities in great tits, Parus major : stability and consistency
Carere, C; Drent, Piet J.; Privitera, Lucia; Koolhaas, Jaap M.; Groothuis, TGG
2005-01-01
We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations
Self-Consistence of Semi-Classical Gravity
Suen, W M
1992-01-01
Simon argued that the semi-classical theory of gravity, unless some of its solutions are excluded, is unacceptable for reasons of both self-consistency and experiment, and that it has to be replaced by a constrained semi-classical theory. We examined whether the evidence is conclusive.
SOCIAL COMPARISON, SELF-CONSISTENCY AND THE PRESENTATION OF SELF.
MORSE, STANLEY J.; GERGEN, KENNETH J.
To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) Self-Consistency Scale and half the Coopersmith Self-Esteem Inventory, were administered to 78 undergraduate men who had answered an ad for work…
Plant functional traits have globally consistent effects on competition
Kunstler, Georges; Falster, Daniel; Coomes, David A.; Poorter, Lourens
2016-01-01
Phenotypic traits and their associated trade-offs have been shown to have globally consistent effects on individual plant physiological functions, but how these effects scale up to influence competition, a key driver of community assembly in terrestrial vegetation, has remained unclear. Here we
Fully self-consistent GW calculations for molecules
Rostgaard, Carsten; Jacobsen, Karsten Wedel; Thygesen, Kristian Sommer
2010-01-01
We calculate single-particle excitation energies for a series of 34 molecules using fully self-consistent GW, one-shot G0W0, Hartree-Fock (HF), and hybrid density-functional theory (DFT). All calculations are performed within the projector-augmented wave method using a basis set of Wannier...
On ZRP wind input term consistency in Hasselmann equation
Zakharov, Vladimir; Pushkarev, Andrei
2016-01-01
The new ZRP wind input source term (Zakharov et al. 2012) is checked for consistency via numerical simulation of the Hasselmann equation. The results are compared to field experimental data, collected at different sites around the world, and to theoretical predictions of self-similarity analysis. Good agreement is obtained for limited fetch and time domain statements.
Consistency in behavior of the CEO regarding corporate social responsibility
Elving, W.J.L.; Kartal, D.
2012-01-01
Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was
An Intuitionistic Epistemic Logic for Sequential Consistency on Shared Memory
Hirai, Yoichi
In the celebrated Gödel Prize-winning papers, Herlihy, Shavit, Saks and Zaharoglou gave a topological characterization of waitfree computation. In this paper, we characterize waitfree communication logically. First, we give an intuitionistic epistemic logic k∨ for asynchronous communication. The semantics for the logic k∨ is an abstraction of Herlihy and Shavit's topological model. In the same way that a Kripke model for intuitionistic logic informally describes an agent increasing its knowledge over time, the semantics of k∨ describes multiple agents passing proofs around and developing their knowledge together. On top of the logic k∨, we give an axiom type that characterizes sequential consistency on shared memory. The advantage of intuitionistic logic over classical logic then becomes apparent, as the axioms for sequential consistency are meaningless for classical logic because they are classical tautologies. The axioms are similar to the axiom type for prelinearity (ϕ ⊃ ψ) ∨ (ψ ⊃ ϕ). This similarity reflects the analogy between sequential consistency for shared memory scheduling and linearity for Kripke frames: both require total order on schedules or models. Finally, under sequential consistency, we give soundness and completeness between a set of logical formulas called waitfree assertions and a set of models called waitfree schedule models.
Noncommuting Electric Fields and Algebraic Consistency in Noncommutative Gauge theories
Banerjee, R
2003-01-01
We show that noncommuting electric fields occur naturally in noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a hamiltonian generalisation of the Seiberg-Witten map, the algebraic consistency in the lagrangian and hamiltonian formulations of these theories is established. The stability of the Poisson algebra under this generalised map is studied.
Efficient self-consistent quantum transport simulator for quantum devices
Gao, X., E-mail: xngao@sandia.gov; Mamaluy, D.; Nielsen, E.; Young, R. W.; Lilly, M. P.; Bishop, N. C.; Carroll, M. S.; Muller, R. P. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); Shirkhorshidian, A. [Sandia National Laboratories, 1515 Eubank SE, Albuquerque, New Mexico 87123 (United States); University of New Mexico, Albuquerque, New Mexico 87131 (United States)
2014-04-07
We present a self-consistent one-dimensional (1D) quantum transport simulator based on the Contact Block Reduction (CBR) method, aiming for very fast and robust transport simulation of 1D quantum devices. Applying the general CBR approach to 1D open systems results in a set of very simple equations that are derived and given in detail for the first time. The charge self-consistency of the coupled CBR-Poisson equations is achieved by using the predictor-corrector iteration scheme with the optional Anderson acceleration. In addition, we introduce a new way to convert an equilibrium electrostatic barrier potential calculated from an external simulator to an effective doping profile, which is then used by the CBR-Poisson code for transport simulation of the barrier under non-zero biases. The code has been applied to simulate the quantum transport in a double barrier structure and across a tunnel barrier in a silicon double quantum dot. Extremely fast self-consistent 1D simulations of the differential conductance across a tunnel barrier in the quantum dot show better qualitative agreement with experiment than non-self-consistent simulations.
Context-dependent individual behavioral consistency in Daphnia
Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe
2017-01-01
…, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...
FINITE DEFORMATION ELASTO-PLASTIC THEORY AND CONSISTENT ALGORITHM
Liu Xuejun; Li Mingrui; Huang Wenbin
2001-01-01
By using the logarithmic strain, the finite deformation plastic theory, corresponding to the infinitesimal plastic theory, is established successively. The plastic consistent algorithm with first order accuracy for the finite element method (FEM) is developed. Numerical examples are presented to illustrate the validity of the theory and effectiveness of the algorithm.
Consistent measurements comparing the drift features of noble gas mixtures
Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y
1999-01-01
We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.
A Nonparametric Approach to Estimate Classification Accuracy and Consistency
Lathrop, Quinn N.; Cheng, Ying
2014-01-01
When cut scores for classifications occur on the total score scale, popular methods for estimating classification accuracy (CA) and classification consistency (CC) require assumptions about a parametric form of the test scores or about a parametric response model, such as item response theory (IRT). This article develops an approach to estimate CA…
Evaluating Reflective Writing for Appropriateness, Fairness, and Consistency.
Kennison, Monica Metrick; Misselwitz, Shirley
2002-01-01
Samples from 17 reflective journals of nursing students were evaluated by 6 faculty. Results indicate a lack of consistency in grading reflective writing, lack of consensus regarding evaluation, and differences among faculty regarding their view of such exercises. (Contains 26 references.) (JOW)
Improving consistency in student evaluation at affiliated family practice centers.
Rabinowitz, H K
1986-01-01
The Department of Family Medicine at Jefferson Medical College has since 1974 been successful in administering a required third-year family medicine clerkship, providing students with a structured, didactic, and experiential curriculum in six affiliated family practice centers. Prior analysis (1976-1981) had indicated, however, that variation existed in evaluating similar students, depending on the clerkship training site, i.e., three sites graded students in a significantly different fashion than the three other sites. Utilizing these data to focus on the evaluation process, a comprehensive and specific six-point plan was developed to improve consistency in evaluations at the different training sites. This plan consisted of a yearly meeting of affiliate faculty, assigning predoctoral training administrative responsibility to one faculty member at each training site, increased telephone communication, affiliate-faculty attendance at the university site evaluation session, faculty rotation to spend time at other training sites, and financial reimbursement to the affiliate training sites. After intervention, analysis (1981-1983) indicated that five of the six clerkship sites now grade students in a consistent fashion, with only one affiliate using different grading standards. The intervention was therefore judged to be successful for five of the six training sites, allowing for better communication and more critical and consistent evaluation of medical students.
[Consistent presentation of medical images based on CPI integration profile].
Jiang, Tao; An, Ji-ye; Chen, Zhong-yong; Lu, Xu-dong; Duan, Hui-long
2007-11-01
Because of different display parameters and other factors, digital medical images present different display states in different departments of a hospital. Based on the CPI integration profile of IHE, this paper implements consistent presentation of medical images, helping doctors carry out team-based medical treatment.

Measures of Consistency for Holland-Type Codes.
Strahan, Robert F.
1987-01-01
Describes two new measures of consistency which refer to the extent to which more closely related scale types are found together in Holland's Self-Directed Search sort. One measure is based on the hexagonal model for use with three-point codes. The other is based on conditional probabilities for use with two-point codes. (Author/ABL)
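As a hedged illustration of the hexagonal model mentioned above (not the article's own measures), the conventional consistency level of a two-point Holland code can be computed from the distance between its two letters on the RIASEC hexagon:

```python
# Conventional hexagonal consistency for two-point Holland codes
# (RIASEC): adjacent types are most consistent, opposite types least.
# This is a generic sketch, not the article's proposed measures.
HEXAGON = "RIASEC"

def hexagonal_consistency(code):
    """Return 3 (adjacent), 2 (alternate), or 1 (opposite)
    for a two-letter code such as 'RI', 'RA', or 'RS'."""
    a, b = HEXAGON.index(code[0]), HEXAGON.index(code[1])
    d = abs(a - b)
    d = min(d, 6 - d)            # wrap around the hexagon
    return {1: 3, 2: 2, 3: 1}[d]

print(hexagonal_consistency("RI"))  # adjacent types -> 3
print(hexagonal_consistency("RA"))  # alternate types -> 2
print(hexagonal_consistency("RS"))  # opposite types -> 1
```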
PROJECT TEAM MOTIVATION IN PROJECT REALISATION
Perica Janković
2014-01-01
Managing a project team is an everyday activity in managing a project realization. For work on a project to be efficient, all participants must be motivated, interested, and focused on accomplishing the project. To provide greater motivation of the project team for the realization of the project, the project manager should be well acquainted with the needs and motives of the people he/she is managing and should find the ...
Loyal, Rebecca E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]
2015-07-14
The objective of the Portunus Project is to create large, automated offshore ports that will increase the pace and scale of international trade. Additionally, these ports would increase the number of U.S. domestic trade vessels needed, as the imported goods would need to be transported from these offshore platforms to land-based ports such as Boston, Los Angeles, and Newark. Currently, domestic trade in the United States can only be conducted by vessels that abide by the Merchant Marine Act of 1920, also referred to as the Jones Act. The Jones Act stipulates that vessels involved in domestic trade must be U.S. owned, U.S. built, and manned by a crew made up of U.S. citizens. The Portunus Project would increase the number of Jones Act vessels needed, which raises an interesting economic concern. Are Jones Act ships more expensive to operate than foreign vessels? Would it be more economically efficient to modify the Jones Act and allow vessels manned by foreign crews to engage in U.S. domestic trade? While opposition to altering the Jones Act is strong, it is important to consider the possibility that ship-owners who employ foreign crews will lobby for the chance to enter a growing domestic trade market. Their success would mean potential job loss for thousands of Americans currently employed in maritime trade.
Zulauf, W.E. [Sao Paulo Environmental Secretariat, Sao Paulo (Brazil)]; Goelho, A.S.R. [Riocell, S.A. (Brazil)]; Saber, A. [IEA-Instituto de Estudos Avancados (Brazil)]; and others
1995-12-31
The project FLORAM was formulated at the 'Institute for Advanced Studies' of the University of Sao Paulo. It aims at decreasing the level of carbon dioxide in the atmosphere, and thus curbing the greenhouse effect, by way of a huge effort of forestation and reforestation. The resulting forests, when the trees mature, will be responsible for the absorption of about 6 billion tons of excess carbon. This represents 5% of the total amount of CO2 in excess in the Earth's atmosphere, and likewise 5% of the available continental surfaces that can be forested. Therefore, if similar projects were implemented throughout the world, in theory all the excess CO2 responsible for the greenhouse effect (27%, or 115 billion tons of carbon) would be absorbed, corresponding to a 400-million-hectare increase in growing forests. FLORAM in Brazil aims to plant 20,000,000 ha in 2 years at a cost of 20 billion dollars. If it reaches its goals, Brazil will have reforested an area almost half as big as France. (author)
Dannenberg, K. K.; Henderson, A.; Lee, J.; Smith, G.; Stluka, E.
1984-01-01
PROJECT EXPLORER is a program that will fly student-developed experiments onboard the Space Shuttle in NASA's Get-Away Special (GAS) containers. The program is co-sponsored by the Alabama Space and Rocket Center, the Alabama-Mississippi Section of the American Institute of Aeronautics and Astronautics, and Alabama A&M University, and requires extensive support by the University of Alabama in Huntsville. A unique feature of this project is the demonstration of transmissions to ground stations on amateur radio frequencies in the English language. Experiments Nos. 1, 2, and 3 use the microgravity of space flight to study the solidification of lead-antimony and aluminum-copper alloys, the growth of potassium-tetracyanoplatinate hydrate crystals in an aqueous solution, and the germination of radish seeds. Flight results will be compared with Earth-based data. Experiment No. 4 features radio transmission and will also provide timing for the start of all other experiments. A microprocessor will obtain real-time data from all experiments as well as temperature and pressure measurements taken inside the canister. These data will be transmitted on previously announced amateur radio frequencies after they have been converted into the English language by a digitalker for general reception.
Performance and consistency of indicator groups in two biodiversity hotspots.
Joaquim Trindade-Filho
BACKGROUND: In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. METHODOLOGY/PRINCIPAL FINDINGS: We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent those indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. CONCLUSIONS/SIGNIFICANCE: We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
The internal consistency of the North Sea carbonate system
Salt, Lesley A.; Thomas, Helmuth; Bozec, Yann; Borges, Alberto V.; de Baar, Hein J. W.
2016-05-01
In 2002 (February) and 2005 (August), the full suite of carbonate system parameters (total alkalinity (AT), dissolved inorganic carbon (DIC), pH, and partial pressure of CO2 (pCO2)) were measured on two re-occupations of the entire North Sea basin, with three parameters (AT, DIC, pCO2) measured on four additional re-occupations, covering all four seasons, allowing an assessment of the internal consistency of the carbonate system. For most of the year, there is a similar level of internal consistency, with AT being calculated to within ± 6 μmol kg⁻¹ using DIC and pH, DIC to ± 6 μmol kg⁻¹ using AT and pH, pH to ± 0.008 using AT and pCO2, and pCO2 to ± 8 μatm using DIC and pH, with the dissociation constants of Millero et al. (2006). In spring, however, we observe a significant decline in the ability to accurately calculate the carbonate system. Lower consistency is observed with an increasing fraction of Baltic Sea water, caused by the high contribution of organic alkalinity in this water mass, not accounted for in the carbonate system calculations. Attempts to improve the internal consistency by accounting for the unconventional salinity-borate relationships in freshwater and the Baltic Sea, and through application of the new North Atlantic salinity-boron relationship (Lee et al., 2010), resulted in no significant difference in the internal consistency.
Consistency of accuracy assessment indices for soft classification: Simulation analysis
Chen, Jin; Zhu, Xiaolin; Imura, Hidefumi; Chen, Xuehong
Accuracy assessment plays a crucial role in the implementation of soft classification. Even though many indices of accuracy assessment for soft classification have been proposed, the consistencies among these indices are not clear, and the impact of sample size on these consistencies has not been investigated. This paper examines two kinds of indices: map-level indices, including root mean square error (rmse), kappa, and overall accuracy (oa) from the sub-pixel confusion matrix (SCM); and category-level indices, including crmse, user accuracy (ua), and producer accuracy (pa). A careful simulation was conducted to investigate the consistency of these indices and the effect of sample size. The major findings were as follows: (1) The map-level indices are highly consistent with each other, whereas the category-level indices are not. (2) The consistency among map-level and category-level indices becomes weaker when the sample size decreases. (3) The rmse is more affected by error distribution among classes than are kappa and oa. Based on these results, we recommend that rmse can be used for map-level accuracy due to its simplicity, although kappa and oa may be better alternatives when the sample size is limited because the two indices are affected less by the error distribution among classes. We also suggest that crmse should be provided when map users are not concerned about the error source, whereas ua and pa are more useful when the complete information about different errors is required. The results of this study will be of benefit to the development and application of soft classifiers.
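A minimal sketch of the map-level and category-level indices discussed above, computed from an invented confusion matrix and invented class-fraction maps (the paper's simulation design is far more elaborate):

```python
import numpy as np

# Invented confusion matrix: rows = map (assessed) classes,
# columns = reference classes.
scm = np.array([[30.0, 5.0, 2.0],
                [4.0, 40.0, 3.0],
                [1.0, 2.0, 13.0]])

total = scm.sum()
oa = np.trace(scm) / total                              # overall accuracy
pe = (scm.sum(axis=0) * scm.sum(axis=1)).sum() / total**2
kappa = (oa - pe) / (1.0 - pe)                          # chance-corrected

# Category-level indices per class.
ua = np.diag(scm) / scm.sum(axis=1)                     # user accuracy
pa = np.diag(scm) / scm.sum(axis=0)                     # producer accuracy

# rmse between invented reference and estimated fraction maps
# (two pixels, three classes) for the soft-classification case.
ref = np.array([[0.6, 0.3, 0.1], [0.2, 0.7, 0.1]])
est = np.array([[0.5, 0.4, 0.1], [0.3, 0.6, 0.1]])
rmse = np.sqrt(np.mean((ref - est) ** 2))
print(oa, kappa, rmse)
```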
Consistent SPH Simulations of Protostellar Collapse and Fragmentation
Gabbasov, Ruslan; Sigalotti, Leonardo Di G.; Cruz, Fidel; Klapp, Jaime; Ramírez-Velasquez, José M.
2017-02-01
We study the consistency and convergence of smoothed particle hydrodynamics (SPH) as a function of the interpolation parameters, namely the number of particles N, the number of neighbors n, and the smoothing length h, using simulations of the collapse and fragmentation of protostellar rotating cores. The calculations are made using a modified version of the GADGET-2 code that employs an improved scheme for the artificial viscosity and power-law dependences of n and h on N, as was recently proposed by Zhu et al., which comply with the combined limit N → ∞, h → 0, and n → ∞ with n/N → 0 for full SPH consistency as the domain resolution is increased. We apply this realization to the "standard isothermal test case" in the variant calculated by Burkert & Bodenheimer and the Gaussian cloud model of Boss to investigate the response of the method to adaptive smoothing lengths in the presence of large density and pressure gradients. The degree of consistency is measured by tracking how well the estimates of the consistency integral relations reproduce their continuous counterparts. In particular, C0 and C1 particle consistency is demonstrated, meaning that the calculations are close to second-order accuracy. As long as n is increased with N, mass resolution also improves as the minimum resolvable mass M_min ∼ n⁻¹. This aspect allows proper calculation of small-scale structures in the flow associated with the formation and instability of protostellar disks around the growing fragments, which are seen to develop a spiral structure and fragment into close binary/multiple systems as supported by recent observations.
1946-12-01
From these photographs the schematic drawing, fig. (10), was prepared. This shows that the burner consists of three concentric shells assembled... alloy steels which are capable of maintaining high tensile strength at elevated temperatures, it is hoped that the fuel/structure weight ratio may be... possible by the use of an alloy steel, which exhibits good tensile strength at high temperatures. A study is being conducted, leading to a preliminary
Materials Processes (MP) Engineering Internship Projects
Tomsik, Elizabeth
2017-01-01
This poster illustrates my major and minor projects worked on during my entire time interning at KSC in the Materials Science Branch. My major projects consist of three Failure Analyses, a research project on Magnesium Alloys, and the manufacturing and mechanical testing of the Advanced Plant Habitat. My three Failure Analyses are Umbilical Testing Ground Plates, Lithium Ion Battery Locking Spring Blade, and a Liquid Oxygen Poppet.
Project Success Report, 1974-1975.
Arko, Dorothy N.
Project Success, a program of modified interdisciplinary instruction coordinated with special services, is considered to be consistent with the theory of individual differences and individualized instruction which is inherent in the philosophy of Bloomington Public School's educational goals and objectives. The project, which began as an…
44 CFR 80.17 - Project implementation.
2010-10-01
... RELOCATION FOR OPEN SPACE, Post-Award Requirements, § 80.17 Project implementation. (a) Hazardous materials... described in this part, consistent with FEMA model deed restriction language....
Penin, Lucie; Vidal-Dupiol, Jeremie; Adjeroud, Mehdi
2013-06-01
Mass bleaching events resulting in coral mortality are among the greatest threats to coral reefs, and are projected to increase in frequency and intensity with global warming. Achieving a better understanding of the consistency of the response of coral assemblages to thermal stress, both spatially and temporally, is essential to determine which reefs are more able to tolerate climate change. We compared variations in spatial and taxonomic patterns between two bleaching events at the scale of an island (Moorea Island, French Polynesia). Despite similar thermal stress and light conditions, bleaching intensity was significantly lower in 2007 (approximately 37 % of colonies showed signs of bleaching) than in 2002, when 55 % of the colonies bleached. Variations in the spatial patterns of bleaching intensity were consistent between the two events. Among nine sampling stations at three locations and three depths, the stations at which the bleaching response was lowest in 2002 were those that showed the lowest levels of bleaching in 2007. The taxonomic patterns of susceptibility to bleaching were also consistent between the two events. These findings have important implications for conservation because they indicate that corals are capable of acclimatization and/or adaptation and that, even at small spatial scales, some areas are consistently more susceptible to bleaching than others.
Giorgi, F.; Coppola, E.; Raffaele, F.
2014-10-01
We analyze trends of six daily precipitation-based and physically interconnected hydroclimatic indices in an ensemble of historical and 21st century climate projections under forcing from increasing greenhouse gas (GHG) concentrations (Representative Concentration Pathway (RCP) 8.5), along with gridded (land only) observations for the late decades of the twentieth century. The indices include metrics of intensity (SDII) and extremes (R95) of precipitation, dry spell length (DSL) and wet spell length, the hydroclimatic intensity index (HY-INT), and a newly introduced index of precipitation area (PA). All the indices in both the 21st century and historical simulations provide a consistent picture of a predominant shift toward a hydroclimatic regime of more intense, shorter, less frequent, and less widespread precipitation events in response to GHG-induced global warming. The trends are larger and more spatially consistent over tropical than extratropical regions, pointing to the importance of tropical convection in regulating this response, and show substantial regional spatial variability. Observed trends in the indices analyzed are qualitatively and consistently in line with the simulated ones, at least at the global and full tropical scale, further supporting the robustness of the identified prevailing hydroclimatic responses. The HY-INT, PA, and R95 indices show the most consistent response to global warming, and thus offer the most promising tools for formal hydroclimatic model validation and detection/attribution studies. The physical mechanism underlying this response and some of the applications of our results are also discussed.
Stable functional networks exhibit consistent timing in the human brain.
Chapeton, Julio I; Inati, Sara K; Zaghloul, Kareem A
2017-03-01
Despite many advances in the study of large-scale human functional networks, the question of timing, stability, and direction of communication between cortical regions has not been fully addressed. At the cellular level, neuronal communication occurs through axons and dendrites, and the time required for such communication is well defined and preserved. At larger spatial scales, however, the relationship between timing, direction, and communication between brain regions is less clear. Here, we use a measure of effective connectivity to identify connections between brain regions that exhibit communication with consistent timing. We hypothesized that if two brain regions are communicating, then knowledge of the activity in one region should allow an external observer to better predict activity in the other region, and that such communication involves a consistent time delay. We examine this question using intracranial electroencephalography captured from nine human participants with medically refractory epilepsy. We use a coupling measure based on time-lagged mutual information to identify effective connections between brain regions that exhibit a statistically significant increase in average mutual information at a consistent time delay. These identified connections result in sparse, directed functional networks that are stable over minutes, hours, and days. Notably, the time delays associated with these connections are also highly preserved over multiple time scales. We characterize the anatomic locations of these connections, and find that the propagation of activity exhibits a preferred posterior to anterior temporal lobe direction, consistent across participants. Moreover, networks constructed from connections that reliably exhibit consistent timing between anatomic regions demonstrate features of a small-world architecture, with many reliable connections between anatomically neighbouring regions and few long range connections. Together, our results demonstrate
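The coupling measure described above (time-lagged mutual information with a consistent delay) can be sketched with a simple histogram-based estimator. The signals and the 8-step delay below are synthetic, and real analyses require more careful estimators plus significance testing against surrogate data:

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram estimate of the mutual information between x and y (nats)."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0  # avoid log(0) on empty cells
    return float((pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz])).sum())

def lagged_mi(x, y, max_lag):
    """MI between x(t) and y(t + lag) for lag = 0 .. max_lag."""
    n = len(x)
    return [mutual_information(x[:n - lag] if lag else x, y[lag:])
            for lag in range(max_lag + 1)]

# Synthetic pair: y repeats x eight samples later, plus noise.
rng = np.random.default_rng(0)
x = rng.standard_normal(5000)
y = np.roll(x, 8) + 0.1 * rng.standard_normal(5000)

mi = lagged_mi(x, y, 20)
print(int(np.argmax(mi)))  # recovers the 8-step delay
```

A connection would be kept only if the MI peak at its preferred delay is statistically significant and that delay stays consistent across recording sessions.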
Alberto Cargnelutti Filho
2011-09-01
The objective of this work was to evaluate the consistency of the clustering pattern obtained from the combination of two dissimilarity measures and four clustering methods, in scenarios formed by combinations of number of cultivars and number of variables, with real data from corn cultivars (Zea mays L.) and with simulated data. Real data from five variables measured in 69 corn cultivar competition trials were used, in which the number of cultivars evaluated ranged from 9 to 40. In order to investigate the results with larger numbers of cultivars and variables, 1,000 experiments were simulated under the standard normal distribution for each of the 54 scenarios formed by combining the number of cultivars (20, 30, 40, 50, 60, 70, 80, 90, and 100) and the number of variables (5, 6, 7, 8, 9, and 10). Correlation analyses, multicollinearity diagnostics, and cluster analyses were performed. The consistency of the clustering pattern was evaluated by means of the cophenetic correlation coefficient. Clustering-pattern consistency decreases as the number of cultivars and variables increases. The Euclidean distance provides greater consistency in the clustering pattern than the Manhattan distance. The consistency of the clustering pattern among the methods increases in the following order: Ward, complete linkage, single linkage, and average linkage between groups.
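The cophenetic correlation coefficient used above to judge clustering-pattern consistency can be computed with standard tools; the random matrix below merely stands in for a cultivar-by-variable data set:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, cophenet
from scipy.spatial.distance import pdist

# Stand-in data: 40 "cultivars" measured on 5 "variables".
rng = np.random.default_rng(1)
data = rng.standard_normal((40, 5))

dist = pdist(data, metric="euclidean")  # pairwise dissimilarity

# Cophenetic correlation for the four linkage methods compared above.
coefs = {}
for method in ("ward", "complete", "single", "average"):
    coef, _ = cophenet(linkage(dist, method=method), dist)
    coefs[method] = float(coef)
    print(f"{method}: {coefs[method]:.3f}")
```

The closer the coefficient is to 1, the more faithfully the dendrogram preserves the original pairwise distances.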
A step toward regionalized scale-consistent agricultural life cycle assessment inventories.
Morais, Tiago G; Teixeira, Ricardo Fm; Domingos, Tiago
2017-09-01
Life cycle inventory (LCI) regionalization (i.e., the determination of input and output flows from production processes at a subcountry scale) is a priority in life cycle assessment (LCA) studies, particularly in the agri-food sector. Many regionalized LCAs fail to ensure that microlevel inventories are consistent with country-level aggregated data-or "scale consistent." They also fail to construct LCIs using international reference guidelines and trustworthy standardized data sources. This failure generates inaccuracies and biases in inventories and can compromise comparability among international LCA studies. Our study introduces scale consistency as a principle for regionalized agri-food LCIs. We present a generic procedure that defines how scale-dependent LCI flows should be regionalized, depending on data availability. We then present a list of inventory flows that require regionalization and their suggested calculation procedures (methods and models) from 2 methodological guides developed by projects Agribalyse and World Food LCA Database. As proof of concept, we apply the procedure to Portugal and assess whether the methods and models proposed for each type of inventory flow in both guides can potentially be applied consistently with the data available. We then calculated scale-consistent values for 17 inventory flows in Portuguese agriculture, covering 260 products that can be used in future LCA studies. Comparing results with international databases, we show that this procedure can improve country-level estimates significantly. Our study is the first step in introducing scale consistency as a guiding principle for regionalized LCIs for agri-food LCA studies. Integr Environ Assess Manag 2017;13:939-951. © 2017 SETAC.
Continental-scale hydrological consistency of evapotranspiration products using GRACE
Lopez, O.; McCabe, M. F.
2014-12-01
Multiple remote sensing products based on satellite observations are available at regional and global scales, allowing estimation of the individual components of the hydrological cycle. However, using these products to accurately close the water budget at the basin scale remains a challenge. In this work, 12 large continental-scale basins covering a range of climate types were chosen as regions of interest. Terrestrial water storage changes from GRACE, streamflow data from the Global Runoff Database, and precipitation from the Tropical Rainfall Measuring Mission (TRMM) Multi Satellite Precipitation Analysis (TMPA) and Global Precipitation Climatology Project (GPCP) were used as a surrogate evaluation of observed spatio-temporal patterns of multi-model evapotranspiration estimates, derived from a long-term flux product as part of the LandFLUX project. The 10 year period of analysis also allows for the estimation of temporal trends in water storage changes and provides an opportunity to examine the capacity for water budget closure.
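The water-budget logic described above, in which evapotranspiration is evaluated against the basin-scale residual ET = P - Q - dS/dt, can be sketched with invented monthly values (mm/month); the numbers below are purely illustrative:

```python
import numpy as np

# Invented monthly basin-average fluxes (mm/month).
P = np.array([80.0, 95.0, 60.0, 40.0])    # precipitation (e.g., TMPA/GPCP)
Q = np.array([25.0, 30.0, 22.0, 15.0])    # runoff (e.g., gauged streamflow)
dS = np.array([10.0, 12.0, -8.0, -20.0])  # storage change (e.g., GRACE)

# Budget-residual evapotranspiration: what ET must be for the
# water budget P = ET + Q + dS to close exactly.
ET_budget = P - Q - dS
print(ET_budget)
```

Comparing such a residual against independent ET products (e.g., the LandFLUX estimates above) quantifies how far the combined products are from closing the budget.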
Ballantine, A; Dixon-Altaber, H; Dosanjh, M; Kuchina, L
2011-01-01
Hadrontherapy is a highly advanced technique of cancer radiotherapy that uses beams of charged particles (ions) to destroy tumour cells. While conventional X-rays traverse the human body depositing radiation as they pass through, ions deliver most of their energy at one point. Hadrontherapy is most advantageous once the position of the tumour is accurately known, so that healthy tissues can be protected. Accurate positioning is a crucial challenge for targeting moving organs, as in lung cancer, and for adapting the irradiation as the tumour shrinks with treatment. Therefore, quality assurance becomes one of the most relevant issues for an effective outcome of the cancer treatment. In order to improve the quality assurance tools for hadrontherapy, the European Commission is funding ENVISION, a 4-year project that aims at developing solutions for: • real-time non-invasive monitoring • quantitative imaging • precise determination of delivered dose • fast feedback for optimal treatment planning • real-t...
Violation of consistency relations and the protoinflationary transition
Giovannini, Massimo
2014-01-01
If we posit the validity of the consistency relations, the tensor spectral index and the relative amplitude of the scalar and tensor power spectra are both fixed by a single slow roll parameter. The physics of the protoinflationary transition can break explicitly the consistency relations causing a reduction of the inflationary curvature scale in comparison with the conventional lore. After a critical scrutiny, we argue that the inflationary curvature scale, the total number of inflationary efolds and, ultimately, the excursion of the inflaton across its Planckian boundary are all characterized by a computable theoretical error. While these considerations ease some of the tensions between the Bicep2 data and the other satellite observations, they also demand an improved understanding of the protoinflationary transition whose physical features may be assessed, in the future, through a complete analysis of the spectral properties of the B mode autocorrelations.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Hong Li
2012-01-01
The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with the research on the flow inside and shearing characteristics of the MC pulp, a simple mathematical model at the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator which has a similar shape to the turbulence generator, and CFD simulation is applied to study the different flow fields with different blade laying angles. The recommended blade laying angle of the turbulence generator is then found to be between 60° and 75°.
Non-trivial checks of novel consistency relations
Berezhiani, Lasha; Khoury, Justin [Center for Particle Cosmology, Department of Physics and Astronomy, University of Pennsylvania, Philadelphia, PA 19104 (United States); Wang, Junpu, E-mail: lashaber@gmail.com, E-mail: jkhoury@sas.upenn.edu, E-mail: jwang217@jhu.edu [Department of Physics and Astronomy, Johns Hopkins University, Baltimore, MD 21218 (United States)
2014-06-01
Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.
Non-Trivial Checks of Novel Consistency Relations
Berezhiani, Lasha; Wang, Junpu
2014-01-01
Single-field perturbations satisfy an infinite number of consistency relations constraining the squeezed limit of correlation functions at each order in the soft momentum. These can be understood as Ward identities for an infinite set of residual global symmetries, or equivalently as Slavnov-Taylor identities for spatial diffeomorphisms. In this paper, we perform a number of novel, non-trivial checks of the identities in the context of slow-roll single field inflationary models with arbitrary sound speed. We focus for concreteness on identities involving 3-point functions with a soft external mode, and consider all possible scalar and tensor combinations for the hard-momentum modes. In all these cases, we check the consistency relations up to and including cubic order in the soft momentum. For this purpose, we compute for the first time the 3-point functions involving 2 scalars and 1 tensor, as well as 2 tensors and 1 scalar, for arbitrary sound speed.
Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation
Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans
1995-01-01
We present multiconfigurational self-consistent reaction field theory and implementation for solvent effects on a solute molecular system that is not in equilibrium with the outer solvent. The approach incorporates two different polarization vectors for studying the influence of the solvent ... states influenced by the two types of polarization vectors. The general treatment of the correlation problem through the use of complete and restricted active space methodologies makes the present multiconfigurational self-consistent reaction field approach general in that it can handle any type of state ..., open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent induced affinity
Branch dependence in the "consistent histories" approach to quantum mechanics
Müller, T
2005-01-01
In the consistent histories formalism one specifies a family of histories as an exhaustive set of pairwise exclusive descriptions of the dynamics of a quantum system. We define branching families of histories, which strike a middle ground between the two available mathematically precise definitions of families of histories, viz., product families and Isham's history projector operator formalism. The former are too narrow for applications, and the latter's generality comes at a certain cost, barring an intuitive reading of the ``histories''. Branching families retain the intuitiveness of product families, they allow for the interpretation of a history's weight as a probability, and they allow one to distinguish two kinds of coarse-graining. It is shown that for branching families, the ``consistency condition'' is not a precondition for assigning probabilities, but for a specific kind of coarse-graining.
Structures, profile consistency, and transport scaling in electrostatic convection
Bian, N.H.; Garcia, O.E.
2005-01-01
For interchange modes, profile consistency is in fact due to mixing by persistent large-scale convective cells. This mechanism is not a turbulent diffusion, cannot occur in collisionless systems, and is the analog of the well-known laminar "magnetic flux expulsion" in magnetohydrodynamics. This expulsion process involves a "pinch" across closed streamlines and further results in the formation of pressure fingers along the separatrix of the convective cells. By nature, these coherent structures are dissipative, because the mixing process that leads to their formation relies on a finite amount of collisional diffusion. Numerical simulations of two-dimensional interchange modes confirm the role of laminar expulsion by convective cells for profile consistency and structure formation. They also show that the finger-like pressure structures ultimately control the rate of heat transport across the plasma layer ...
Turbulent MHD transport coefficients - An attempt at self-consistency
Chen, H.; Montgomery, D.
1987-01-01
In this paper, some multiple scale perturbation calculations of turbulent MHD transport coefficients begun in earlier papers are first completed. These generalize 'alpha effect' calculations by treating the velocity field and magnetic field on the same footing. Then the problem of rendering such calculations self-consistent is addressed, generalizing an eddy-viscosity hypothesis similar to that of Heisenberg for the Navier-Stokes case. The method also borrows from Kraichnan's direct interaction approximation. The output is a set of integral equations relating the spectra and the turbulent transport coefficients. Previous 'alpha effect' and 'beta effect' coefficients emerge as limiting cases. A treatment of the inertial range can also be given, consistent with a -5/3 energy spectrum power law. In the Navier-Stokes limit, a value of 1.72 is extracted for the Kolmogorov constant. Further applications to MHD are possible.
Consistent return mapping algorithm for plane stress elastoplasticity
Simo, J.C.; Taylor, R.L.
1985-05-01
An unconditionally stable algorithm for plane stress elastoplasticity is developed, based upon the notion of an elastic predictor followed by a return mapping (plastic corrector). Enforcement of the consistency condition is shown to reduce to the solution of a simple nonlinear equation. Consistent elastoplastic tangent moduli are obtained by exact linearization of the algorithm. Use of these moduli is essential in order to preserve the asymptotic rate of quadratic convergence of Newton methods. An exact solution for constant strain rate over the typical time step is derived. On the basis of this solution the accuracy of the algorithm is assessed by means of iso-error maps. The excellent performance of the algorithm for large time steps is illustrated in numerical experiments.
Strong Consistency of the Empirical Martingale Simulation Option Price Estimator
Zhu-shun Yuan; Ge-mai Chen
2009-01-01
A simulation technique known as empirical martingale simulation (EMS) was proposed to improve simulation accuracy. By an adjustment to the standard Monte Carlo simulation, EMS ensures that the simulated price satisfies the rational option pricing bounds and that the estimated derivative contract price is strongly consistent for payoffs that satisfy a Lipschitz condition. However, for some currently used contracts such as self-quanto options and asymmetric or symmetric power options, it has remained open whether the above asymptotic result holds. In this paper, we prove that the strong consistency of the EMS option price estimator holds for a wider class of univariate payoffs than those restricted by the Lipschitz condition. Numerical experiments demonstrate that EMS can also substantially increase simulation accuracy in the extended setting.
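The EMS adjustment is simple to sketch. The following minimal illustration (our own, not the authors' code; the function name and parameters are assumptions) reprices a European call under geometric Brownian motion, rescaling the simulated terminal prices so the empirical measure satisfies the martingale property exactly:

```python
import numpy as np

def ems_call_price(S0, K, r, sigma, T, n_paths=100_000, seed=0):
    """Monte Carlo call price with the empirical martingale simulation (EMS) adjustment."""
    rng = np.random.default_rng(seed)
    Z = rng.standard_normal(n_paths)
    # Plain risk-neutral GBM terminal prices.
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)
    disc = np.exp(-r * T)
    # EMS: rescale so the discounted *sample* mean equals S0 exactly,
    # i.e. the simulated prices form an empirical martingale.
    ST_ems = S0 * ST / (disc * ST.mean())
    return disc * np.maximum(ST_ems - K, 0.0).mean()
```

By construction the discounted sample mean of the adjusted prices equals S0, so put-call parity holds exactly in the simulated sample, which is the source of the variance reduction.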
A correlation consistency based multivariate alarm thresholds optimization approach.
Gao, Huihui; Liu, Feifei; Zhu, Qunxiong
2016-11-01
Different alarm thresholds can generate different alarm data, resulting in different correlations. A new multivariate alarm-thresholds optimization methodology based on the correlation consistency between process data and alarm data is proposed in this paper. Interpretative structural modeling is adopted to select the key variables. For the key variables, the correlation coefficients of the process data are calculated by Pearson correlation analysis, while the correlation coefficients of the alarm data are calculated by kernel density estimation. To ensure correlation consistency, the objective function is established as the sum of the absolute differences between these two types of correlations. The optimal thresholds are obtained using a particle swarm optimization algorithm. A case study of the Tennessee Eastman process is given to demonstrate the effectiveness of the proposed method.
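The objective described above can be sketched as follows. This is a simplified illustration, not the paper's implementation: here Pearson correlation is applied to both the process data and the binarized alarm data, whereas the paper estimates alarm-data correlations via kernel density estimation; all names are ours.

```python
import numpy as np

def correlation_consistency_objective(process, thresholds):
    """Sum of absolute differences between process-data and alarm-data correlations.

    process: (n_samples, n_vars) array; thresholds: length-n_vars high-alarm limits.
    """
    alarms = (process > np.asarray(thresholds)).astype(float)  # binary alarm series
    rho_process = np.corrcoef(process, rowvar=False)
    rho_alarm = np.corrcoef(alarms, rowvar=False)
    upper = np.triu_indices_from(rho_process, k=1)             # each variable pair once
    return float(np.abs(rho_process[upper] - rho_alarm[upper]).sum())
```

A particle swarm (or any black-box) optimizer would then search for the threshold vector minimizing this objective.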
Current Status Of Velocity Field Surveys: A Consistency Check
Sarkar, D; Watkins, R; Sarkar, Devdeep; Feldman, Hume A.
2006-01-01
We present a statistical analysis comparing the bulk-flow measurements for six recent peculiar velocity surveys, namely, ENEAR, SFI, RFGC, SBF and the Mark III singles and group catalogs. We study whether the bulk-flow estimates are consistent with each other and construct the full three-dimensional bulk-flow vectors. The method we discuss could be used to test the consistency of all velocity field surveys. We show that although these surveys differ in their geometry and measurement errors, their bulk-flow vectors are expected to be highly correlated and in fact show impressive agreement in all cases. Our results suggest that even though the surveys we study target galaxies of different morphology and use different distance measures, they all reliably reflect the same underlying large-scale flow.
Stochastic multi-configurational self-consistent field theory
Thomas, Robert E; Alavi, Ali; Booth, George H
2015-01-01
The multi-configurational self-consistent field theory is considered the standard starting point for almost all multireference approaches required for strongly-correlated molecular problems. The limitation of the approach is generally given by the number of strongly-correlated orbitals in the molecule, as its cost will grow exponentially with this number. We present a new multi-configurational self-consistent field approach, wherein linear determinant coefficients of a multi-configurational wavefunction are optimized via the stochastic full configuration interaction quantum Monte Carlo technique at greatly reduced computational cost, with non-linear orbital rotation parameters updated variationally based on this sampled wavefunction. This extends the approach to strongly-correlated systems with far larger active spaces than it is possible to treat by conventional means. By comparison with the traditional approach, we demonstrate that the introduction of stochastic noise in both the determinant amplitudes and ...
Consistency Across Standards or Standards in a New Business Model
Russo, Dane M.
2010-01-01
Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, the danger of over-prescriptive standards, the balance needed between prescriptive and general standards, enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include: NASA Procedural Requirements 8705.2B, which identifies human-rating standards and requirements; draft health and medical standards for human rating; what has been done; government oversight models; examples of consistency from anthropometry; examples of inconsistency from air quality; and appendices of governmental and non-governmental human factors standards.
Exceptional generalised geometry for massive IIA and consistent reductions
Cassani, Davide; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel
2016-01-01
We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p,7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d=4,3,2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.
Exceptional generalised geometry for massive IIA and consistent reductions
Cassani, Davide; de Felice, Oscar; Petrini, Michela; Strickland-Constable, Charles; Waldram, Daniel
2016-08-01
We develop an exceptional generalised geometry formalism for massive type IIA supergravity. In particular, we construct a deformation of the generalised Lie derivative, which generates the type IIA gauge transformations as modified by the Romans mass. We apply this new framework to consistent Kaluza-Klein reductions preserving maximal supersymmetry. We find a generalised parallelisation of the exceptional tangent bundle on S^6, and from this reproduce the consistent truncation ansatz and embedding tensor leading to dyonically gauged ISO(7) supergravity in four dimensions. We also discuss closely related hyperboloid reductions, yielding a dyonic ISO(p, 7-p) gauging. Finally, while for vanishing Romans mass we find a generalised parallelisation on S^d, d = 4, 3, 2, leading to a maximally supersymmetric reduction with gauge group SO(d+1) (or larger), we provide evidence that an analogous reduction does not exist in the massive theory.
Bolasso: model consistent Lasso estimation through the bootstrap
Bach, Francis
2008-01-01
We consider the least-squares linear regression problem with regularization by the l1-norm, a problem usually referred to as the Lasso. In this paper, we present a detailed asymptotic analysis of model consistency of the Lasso. For various decays of the regularization parameter, we compute asymptotic equivalents of the probability of correct model selection (i.e., variable selection). For a specific rate decay, we show that the Lasso selects all the variables that should enter the model with probability tending to one exponentially fast, while it selects all other variables with strictly positive probability. We show that this property implies that if we run the Lasso for several bootstrapped replications of a given sample, then intersecting the supports of the Lasso bootstrap estimates leads to consistent model selection. This novel variable selection algorithm, referred to as the Bolasso, is compared favorably to other linear regression methods on synthetic data and datasets from the UCI machine learning repository.
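The bootstrap-and-intersect idea is easy to sketch. Below is a minimal illustration (our own, not the paper's code) built on a small coordinate-descent Lasso; the helper names, the regularization level, and the synthetic-data choices are all assumptions:

```python
import numpy as np

def lasso_cd(X, y, alpha, n_iter=200):
    """Coordinate-descent Lasso for (1/2n)||y - Xw||^2 + alpha*||w||_1."""
    n, p = X.shape
    w = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0)
    for _ in range(n_iter):
        for j in range(p):
            r = y - X @ w + X[:, j] * w[j]        # residual excluding feature j
            rho = X[:, j] @ r
            w[j] = np.sign(rho) * max(abs(rho) - alpha * n, 0.0) / col_sq[j]
    return w

def bolasso_support(X, y, alpha=0.1, n_boot=32, tol=1e-8, seed=0):
    """Intersect the Lasso supports over bootstrap replications (Bolasso idea)."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    support = np.ones(X.shape[1], dtype=bool)
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)               # bootstrap sample with replacement
        w = lasso_cd(X[idx], y[idx], alpha)
        support &= np.abs(w) > tol                # keep only consistently selected vars
    return np.flatnonzero(support)
```

On synthetic data where only a few features truly enter the model, each bootstrap Lasso run tends to pick the relevant features plus occasional spurious ones; intersecting the supports removes the spurious selections.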
Consistency Relations for an Implicit n-dimensional Regularization Scheme
Scarpelli, A P B; Nemes, M C
2001-01-01
We extend an implicit regularization scheme to be applicable in $n$-dimensional space-time. Within this scheme, divergences involving parity-violating objects can be consistently treated without recourse to dimensional continuation. Special attention is paid to differences between integrals of the same degree of divergence, typical of one-loop calculations, which are in principle undetermined. We show how to use symmetries in order to fix these quantities consistently. We illustrate with examples in which regularization plays a delicate role, in order both to corroborate and to elucidate the results in the literature for the cases of CPT violation in extended $QED_4$, topological mass generation in 3-dimensional gauge theories, and the Schwinger model and its chiral version.
Collisional decoherence of tunneling molecules: a consistent histories treatment
Coles, Patrick J; Griffiths, Robert B
2012-01-01
The decoherence of a two-state tunneling molecule, such as a chiral molecule or ammonia, due to collisions with a buffer gas is analyzed in terms of a succession of quantum states of the molecule satisfying the conditions for a consistent family of histories. With $\hbar \omega$ the separation in energy of the levels in the isolated molecule and $\gamma$ a decoherence rate proportional to the rate of collisions, we find for $\gamma \gg \omega$ (strong decoherence) a consistent family in which the molecule flips randomly back and forth between the left- and right-handed chiral states in a stationary Markov process; consistent families are also identified for $\gamma \sim \omega$ and for $\gamma < \omega$. In addition we relate the speed with which chiral information is transferred to the environment to the rate of decrease of complementary types of information (e.g., parity information) remaining in the molecule itself.
A New Heuristic for Feature Selection by Consistent Biclustering
Mucherino, Antonio
2010-01-01
Given a set of data, biclustering aims at finding simultaneous partitions in biclusters of its samples and of the features which are used for representing the samples. Consistent biclusterings allow one to obtain correct classifications of the samples from the known classification of the features, and vice versa, and they are very useful for performing supervised classifications. The problem of finding consistent biclusterings can be seen as a feature selection problem, where the features that are not relevant for classification purposes are removed from the set of data, while the total number of features is maximized in order to preserve information. This feature selection problem can be formulated as a linear fractional 0-1 optimization problem. We propose a reformulation of this problem as a bilevel optimization problem, and we present a heuristic algorithm for an efficient solution of the reformulated problem. Computational experiments show that the presented algorithm is able to find better solutions with respect to ...
Viscoelastic models with consistent hypoelasticity for fluids undergoing finite deformations
Altmeyer, Guillaume; Rouhaud, Emmanuelle; Panicaud, Benoit; Roos, Arjen; Kerner, Richard; Wang, Mingchuan
2015-08-01
Constitutive models of viscoelastic fluids are written with rate-form equations when considering finite deformations. Trying to extend the approach used to model these effects from an infinitesimal deformation to a finite transformation framework, one has to ensure that the tensors and their rates are indifferent with respect to the change of observer and to the superposition with rigid body motions. Frame-indifference problems can be solved with the use of an objective stress transport, but the choice of such an operator is not obvious and the use of certain transports usually leads to physically inconsistent formulation of hypoelasticity. The aim of this paper is to present a consistent formulation of hypoelasticity and to combine it with a viscosity model to construct a consistent viscoelastic model. In particular, the hypoelastic model is reversible.
Consistency and axiomatization of a natural extensional combinatory logic
蒋颖
1996-01-01
In the light of a question of J. L. Krivine about the consistency of an extensional λ-theory, an extensional combinatory logic ECL+U(G)+RU_∞+ is established, its consistency is proved model-theoretically, and it is shown that it is not equivalent to any system of universal axioms. The theory expresses in first-order logic that, for every given group G of order n, there simultaneously exist infinitely many universal retractions and a surjective n-tuple notion, such that each element of G acts as a permutation of the components of the n-tuple and as an Ap-automorphism of the model; further, each of the universal retractions is invariant under the action of the Ap-automorphisms induced by G. The difference between this theory and that of Krivine is that the G need not be a symmetric group.
A minimal model of self-consistent partial synchrony
Clusella, Pau; Politi, Antonio; Rosenblum, Michael
2016-09-01
We show that self-consistent partial synchrony in globally coupled oscillatory ensembles is a general phenomenon. We analyze in detail appearance and stability properties of this state in possibly the simplest setup of a biharmonic Kuramoto-Daido phase model as well as demonstrate the effect in limit-cycle relaxational Rayleigh oscillators. Such a regime extends the notion of splay state from a uniform distribution of phases to an oscillating one. Suitable collective observables such as the Kuramoto order parameter allow detecting the presence of an inhomogeneous distribution. The characteristic and most peculiar property of self-consistent partial synchrony is the difference between the frequency of single units and that of the macroscopic field.
Planck 2013 results. XXXI. Consistency of the Planck data
Ade, P. A. R.; Arnaud, M.; Ashdown, M.
2014-01-01
Agreement (measured by deviation of the ratio from unity) between 70 and 100 GHz power spectra averaged over 70 ≤ ℓ ≤ 390 is found at the 0.8% level, and agreement between 143 and 100 GHz power spectra at 0.4% over the same ℓ range. These values are within and consistent with the overall uncertainties in calibration given in the Planck 2013 ... foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse ... /100 ratio. Correcting for this, the 70, 100, and 143 GHz power spectra agree to 0.4% over the first two acoustic peaks. The likelihood analysis that produced the 2013 cosmological parameters incorporated uncertainties larger than this. We show explicitly that correction of the missing near-sidelobe power ...
Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)
Pelsser, A.; Stadje, M.A.
2012-01-01
We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from ...
Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)
Pelsser, A.; Stadje, M.A.
2011-01-01
We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from mathematical ...
Non-autonomous discrete Boussinesq equation: Solutions and consistency
Nong, Li-Juan; Zhang, Da-Juan
2014-07-01
A non-autonomous 3-component discrete Boussinesq equation is discussed. Its spacing parameters pn and qm are related to the independent variables n and m, respectively. We derive its bilinear form and solutions in Casoratian form. The plane wave factor is defined through the cubic roots of unity. The plane wave factor also leads to an extended non-autonomous discrete Boussinesq equation which contains a parameter δ. Three-dimensional consistency and the Lax pair of the obtained equation are discussed.
An Algebraic Characterization of Inductive Soundness in Proof by Consistency
邵志清; 宋国新
1995-01-01
Kapur and Musser studied the theoretical basis for proof by consistency and obtained an inductive completeness result: p = q if and only if p = q is true in every inductive model. However, there is a loophole in their proof of the soundness part: p = q implies p = q is true in every inductive model. The aim of this paper is to give a correct characterization of inductive soundness from an algebraic view by introducing strong inductive models.
Consistency analysis of a nonbirefringent Lorentz-violating planar model
Casana, Rodolfo; Moreira, Roemir P M
2011-01-01
In this work we analyze the physical consistency of a nonbirefringent Lorentz-violating planar model via the pole structure of its Feynman propagators. The nonbirefringent planar model, obtained from the dimensional reduction of the CPT-even gauge sector of the standard model extension, is composed of a gauge field and a scalar field, being affected by Lorentz-violating (LIV) coefficients encoded in the symmetric tensor $\kappa_{\mu\nu}$ ...
Incomplete Lineage Sorting: Consistent Phylogeny Estimation From Multiple Loci
Mossel, Elchanan
2008-01-01
We introduce a simple algorithm for reconstructing phylogenies from multiple gene trees in the presence of incomplete lineage sorting, that is, when the topology of the gene trees may differ from that of the species tree. We show that our technique is statistically consistent under standard stochastic assumptions, that is, it returns the correct tree given sufficiently many unlinked loci. We also show that it can tolerate moderate estimation errors.
Consistent 4D Brain Extraction of Serial Brain MR Images
Wang, Yaping; Li, Gang; Nie, Jingxin; Yap, Pew-Thian; Guo, Lei; Shen, Dinggang
2013-01-01
Accurate and consistent skull stripping of serial brain MR images is of great importance in longitudinal studies that aim to detect subtle brain morphological changes. To avoid inconsistency and the potential bias introduced by independently performing skull-stripping for each time-point image, we propose an effective method that is capable of skull-stripping serial brain MR images simultaneously. Specifically, all serial images of the same subject are first affine aligned in a groupwise manner ...
Consistency argument and classification problem in λ-calculus
王驹; 赵希顺; 黄且圆; 蒋颖
1999-01-01
Enlightened by Mal'cev's theorem in universal algebra, a new criterion for consistency arguments in λ-calculus is introduced. It is equivalent to that of Jacopini and Baeten-Boerboom, but more convenient to use. Based on the new criterion, an enhanced technique is used to show a few results which provide a deeper insight into the classification problem of λ-terms with no normal forms.
Security Policy:Consistency,Adjustments and Restraining Factors
Yang Jiemian
2004-01-01
In the 2004 U.S. presidential election, despite well-divided domestic opinions and Kerry's appealing slogan of "Reversing the Trend," a slight majority still voted for George W. Bush in the end. It is obvious, based on the author's analysis, that security agenda items such as counter-terrorism and the Iraq issue contributed greatly to the reelection of Mr. Bush. This also indicates that the security policy of Bush's second term will basically be consistent.
Measuring Consistent Poverty in Ireland with EU SILC Data
Whelan, Christopher T.; Nolan, Brian; Maitre, Bertrand
2006-01-01
In this paper we seek to make use of the newly available Irish component of the European Union Statistics on Income and Living Conditions (EU-SILC) in order to develop a measure of consistent poverty that overcomes some of the difficulties associated with the original indicators employed as targets in the Irish National Anti-Poverty Strategy. Our analysis leads us to propose a set of economic strain indicators that cover a broader range than the original basic deprivation set. The accumulated...
Personalities in great tits, Parus major: stability and consistency
Carere, C; Drent, PJ; Privitera, L; Koolhaas, JM; Groothuis, TGG; Drent, Piet J; Koolhaas, Jaap M.; Groothuis, Ton G.G.
2005-01-01
We carried out a longitudinal study on great tits from two lines bidirectionally selected for fast or slow exploratory performance during the juvenile phase, a trait thought to reflect different personalities. We analysed temporal stability and consistency of responses within and between situations involving exploratory and sociosexual behaviour. Exploratory behaviour was assessed both in the juvenile phase and in adulthood (2-3-year interval) by means of a novel object test and an open field...
Noncommuting electric fields and algebraic consistency in noncommutative gauge theories
Banerjee, Rabin
2003-05-01
We show that noncommuting electric fields occur naturally in θ-expanded noncommutative gauge theories. Using this noncommutativity, which is field dependent, and a Hamiltonian generalization of the Seiberg-Witten map, the algebraic consistency in the Lagrangian and Hamiltonian formulations of these theories is established. A comparison of results in different descriptions shows that this generalized map acts as a canonical transformation in the physical subspace only. Finally, we apply the Hamiltonian formulation to derive the gauge symmetries of the action.
Identification of consistency in rating curve data: Bidirectional Reach (BReach)
Van Eerdenbrugh, Katrien; Van Hoey, Stijn; Verhoest, Niko E. C.
2016-04-01
Before calculating rating curve discharges, it is crucial to identify possible interruptions in data consistency. In this research, a methodology to perform this preliminary analysis is developed and validated. This methodology, called Bidirectional Reach (BReach), evaluates in each data point the results of a rating curve model with randomly sampled parameter sets. The combination of a parameter set and a data point is classified as non-acceptable if the deviation between the accompanying model result and the measurement exceeds the observational uncertainty. Moreover, a tolerance degree that defines satisfactory behavior of a sequence of model results is chosen. This tolerance degree equals the percentage of observations that are allowed to have non-acceptable model results. Subsequently, the results of the classification are used to assess the maximum left and right reach for each data point of a chronologically sorted time series. The maximum left and right reach in a gauging point represent the data points, in the direction of the previous and the following observations respectively, beyond which none of the sampled parameter sets is both satisfactory and results in an acceptable deviation. This analysis is repeated for a variety of tolerance degrees. Plotting the results of this analysis for all data points and all tolerance degrees in a combined BReach plot enables the detection of changes in data consistency. Moreover, if consistent periods are detected, the limits of these periods can be derived. The methodology is validated with various synthetic stage-discharge data sets and proves to be a robust technique for investigating the temporal consistency of rating curve data. It provides satisfactory results despite low data availability, large errors in the estimated observational uncertainty, and a rating curve model that is known to cover only a limited part of the observations.
Consistent Algorithm for Multi-value Constraint with Continuous Variables
(no author listed)
1999-01-01
Mature algorithms for the Constraint Satisfaction Problem (CSP) with binary constraints over discrete variables are already available for applications. For multi-value constraints with continuous variables, the approach must be quite different and the difficulty increases considerably. This paper presents an algorithm for realizing global consistency over continuous variables, and the algorithm can be applied to multi-value constraints.
Influence of Sensor Ingestion Timing on Consistency of Temperature Measures
2009-01-01
Influence of Sensor Ingestion Timing on Consistency of Temperature Measures. Med. Sci. Sports Exerc., Vol. 41, No. 3, pp. 597–602, 2009. Purpose: The validity and the reliability of using intestinal temperature (Tint) via ingestible temperature sensors (ITS) to measure core body temperature have been demonstrated. However ...
Spectrally Consistent Satellite Image Fusion with Improved Image Priors
Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.;
2006-01-01
Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.
GW method with the self-consistent Sternheimer equation
2010-01-01
We propose a novel approach to quasiparticle GW calculations which does not require the computation of unoccupied electronic states. In our approach the screened Coulomb interaction is evaluated by solving self-consistent linear-response Sternheimer equations, and the noninteracting Green's function is evaluated by solving inhomogeneous linear systems. The frequency dependence of the screened Coulomb interaction is explicitly taken into account. In order to avoid the singularities of the screened Coulomb interaction ...
Internal consistency reliability is a poor predictor of responsiveness
Heels-Ansdell Diane
2005-05-01
Background: Whether responsiveness represents a measurement property of health-related quality of life (HRQL) instruments that is distinct from reliability and validity is an issue of debate. We addressed the claims of a recent study, which suggested that investigators could rely on internal consistency to reflect instrument responsiveness. Methods: 516 patients with chronic obstructive pulmonary disease or knee injury participating in four longitudinal studies completed generic and disease-specific HRQL questionnaires before and after an intervention that impacted on HRQL. We used Pearson correlation coefficients and linear regression to assess the relationship between internal consistency reliability (expressed as Cronbach's alpha), instrument type (generic and disease-specific) and responsiveness (expressed as the standardised response mean, SRM). Results: Mean Cronbach's alpha was 0.83 (SD 0.08) and mean SRM was 0.59 (SD 0.33). The correlation between Cronbach's alpha and SRMs was 0.10 (95% CI -0.12 to 0.32) across all studies. Cronbach's alpha alone did not explain variability in SRMs (p = 0.59, r2 = 0.01) whereas the type of instrument was a strong predictor of the SRM (p = 0.012, r2 = 0.37). In multivariable models applied to individual studies, Cronbach's alpha consistently failed to predict SRMs (regression coefficients between -0.45 and 1.58, p-values between 0.15 and 0.98) whereas the type of instrument did predict SRMs (regression coefficients between -0.25 and -0.59, p-values between ...). Conclusion: Investigators must look to data other than internal consistency reliability to select a responsive instrument for use as an outcome in clinical trials.
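The two quantities the study relates can be computed directly from their standard definitions; a brief sketch (our own translation of the formulas into code, not the study's analysis scripts):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, k_items) score matrix."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()      # sum of per-item variances
    total_var = items.sum(axis=1).var(ddof=1)       # variance of the total score
    return k / (k - 1) * (1.0 - item_var / total_var)

def standardised_response_mean(before, after):
    """SRM: mean change divided by the standard deviation of change."""
    change = np.asarray(after, dtype=float) - np.asarray(before, dtype=float)
    return change.mean() / change.std(ddof=1)
```

As a sanity check, k perfectly correlated items yield alpha = 1, while alpha falls as the items become less inter-correlated; the SRM is computed from the pre/post change scores alone, which is why the two need not move together.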
Consistent deniable lying : privacy in mobile social networks
Belle, Sebastian Kay; Waldvogel, Marcel
2008-01-01
Social networking is moving to mobile phones. This not only means continuous access, but also allows to link virtual and physical neighbourhood in novel ways. To make such systems useful, personal data such as lists of friends and interests need to be shared with more and frequently unknown people, posing a risk to your privacy. In this paper, we present our approach to social networking, Consistent Deniable Lying (CDL). Using easy-to-understand mechanisms and tuned to this environment, i...
On the scalar consistency relation away from slow roll
Sreenath, V; Sriramkumar, L
2014-01-01
As is well known, the non-Gaussianity parameter $f_{_{\\rm NL}}$, which is often used to characterize the amplitude of the scalar bi-spectrum, can be expressed completely in terms of the scalar spectral index $n_{\\rm s}$ in the squeezed limit, a relation that is referred to as the consistency condition. This relation, while it is largely discussed in the context of slow roll inflation, is actually expected to hold in any single field model of inflation, irrespective of the dynamics of the underlying model. In this work, we explicitly examine the validity of the consistency relation, analytically as well as numerically, away from slow roll. Analytically, we first arrive at the relation in the simple case of power law inflation. We also consider the non-trivial example of the Starobinsky model involving a linear potential with a sudden change in its slope (which leads to a brief period of fast roll), and establish the condition completely analytically. We then numerically examine the validity of the consistency ...
Wide baseline stereo matching based on double topological relationship consistency
Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang
2009-07-01
Stereo matching is one of the most important branches of computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching, based on a novel scheme called double topological relationship consistency (DCTR). The combined double topological configuration comprises the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only provides a more advanced matching model but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods that depend on powerful invariance to changes in scale, rotation, or illumination across large view changes and even occlusions. Experimental examples are shown in which the two cameras are located in very different orientations. The epipolar geometry can also be recovered using RANSAC, by far the most widely adopted method. With this method, correspondences with high precision can be obtained in wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on image pairs.
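The mismatch-discarding step the abstract mentions rests on the same robust-estimation idea as RANSAC. As a generic illustration only (simple 2D line fitting rather than epipolar-geometry estimation, and not the paper's DCTR scheme; all names and parameters here are assumptions), the hypothesize-and-verify loop looks like this:

```python
import random

def ransac_line(points, n_iter=200, tol=0.1, seed=0):
    """Minimal RANSAC: fit y = m*x + c to 2D points while discarding outliers.
    Repeatedly samples a minimal set (2 points), fits a candidate model, and
    keeps the model supported by the largest inlier set."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = rng.sample(points, 2)  # minimal sample
        if x1 == x2:
            continue  # vertical candidate: skip for this simple model
        m = (y2 - y1) / (x2 - x1)
        c = y1 - m * x1
        inliers = [(x, y) for x, y in points if abs(y - (m * x + c)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (m, c), inliers
    return best_model, best_inliers
```

For fundamental-matrix estimation the minimal sample is a set of point correspondences rather than two points, but the consensus logic is identical.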
Consistency Property of Finite FC-Normal Logic Programs
Yi-Song Wang; Ming-Yi Zhang; Yu-Ping Shen
2007-01-01
Marek's forward-chaining construction is one of the important techniques for investigating non-monotonic reasoning. By introducing a consistency property over a logic program, a class of logic programs was proposed, the FC-normal programs, each of which has at least one stable model. However, it is not clear how to choose an appropriate consistency property for deciding whether or not a logic program is FC-normal. In this paper, we first show that, for any finite logic program Π, there exists a least consistency property LCon(Π) over Π, which depends only on Π itself, such that Π is FC-normal if and only if Π is FC-normal with respect to (w.r.t.) LCon(Π). In fact, to determine the FC-normality of a logic program, it suffices to check the monotonic closed sets in LCon(Π) for all non-monotonic rules, that is, LFC(Π). Second, we present an algorithm for computing LFC(Π). Finally, we show that the brave and cautious reasoning tasks for FC-normal logic programs are of the same difficulty as those for normal logic programs.
Context-specific metabolic networks are consistent with experiments.
Scott A Becker
2008-05-01
Reconstructions of cellular metabolism are publicly available for a variety of microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where expression profiling data are available.
On the kernel and particle consistency in smoothed particle hydrodynamics
Sigalotti, Leonardo Di G; Rendón, Otto; Vargas, Carlos A; Peña-Polo, Franklin
2016-01-01
The problem of consistency of smoothed particle hydrodynamics (SPH) has received considerable attention in the past few years due to the ever-increasing number of applications of the method in many areas of science and engineering. A loss of consistency leads to an inevitable loss of approximation accuracy. In this paper, we revisit the issue of SPH kernel and particle consistency and demonstrate that SPH has a limiting second-order convergence rate. Numerical experiments with suitably chosen test functions validate this conclusion. In particular, we find that when using the root mean square error as a model evaluation statistic, well-known corrective SPH schemes, which were thought to converge to second or even higher order, are actually first-order accurate, or at best close to second order. We also find that observing the joint limit when $N\to\infty$, $h\to 0$, and $n\to\infty$, as was recently proposed by Zhu et al., where $N$ is the total number of particles, $h$ is the smoothing length, and $n$ is th...
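The kernel/particle consistency distinction can be made concrete with the zeroth-order condition: the continuum kernel integrates to one, but the discrete particle sum only approximates unity, and the residual is one measure of particle inconsistency. A minimal 1D sketch, assuming a standard cubic-spline kernel and uniform particle spacing (illustrative only, not the paper's test setup):

```python
def cubic_spline_1d(r, h):
    """Standard 1D cubic-spline (M4) kernel, normalised so its integral is 1."""
    q = abs(r) / h
    sigma = 2.0 / (3.0 * h)  # 1D normalisation constant
    if q < 1.0:
        return sigma * (1.0 - 1.5 * q**2 + 0.75 * q**3)
    if q < 2.0:
        return sigma * 0.25 * (2.0 - q)**3
    return 0.0

def partition_of_unity(x_i, positions, masses, densities, h):
    """Zeroth-order particle consistency: sum_j (m_j/rho_j) W(x_i - x_j, h).
    This equals 1 only in the continuum limit; the discrete residual
    quantifies the particle inconsistency at point x_i."""
    return sum(m / rho * cubic_spline_1d(x_i - x_j, h)
               for x_j, m, rho in zip(positions, masses, densities))
```

On a uniform lattice with h = 1.2 dx the sum is close to, but not exactly, one; refining dx alone does not remove the residual unless the number of neighbours also grows, which is the joint limit discussed in the abstract.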
Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control
Y.A. Ahmed
2015-09-01
In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. The consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for the command rudder and propeller revolution outputs. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable success rates, the trained networks are implemented in the free-running experiment system to judge the networks’ real-time response for the Esso Osaka 3-m model ship. The networks’ behaviour during these experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study on automatic tug assistance is also discussed for the final alignment of the ship with the actual pier.
A dynamical mechanism for large volumes with consistent couplings
Abel, Steven
2016-11-01
A mechanism for addressing the "decompactification problem" is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N = 2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks and allow one to follow soft terms, playing a role similar to that of R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.
The consistent histories approach to loop quantum cosmology
Craig, David A.
2016-06-01
We review the application of the consistent (or decoherent) histories formulation of quantum theory to canonical loop quantum cosmology. Conventional quantum theory relies crucially on “measurements” to convert unrealized quantum potentialities into physical outcomes that can be assigned probabilities. In the early universe and other physical contexts in which there are no observers or measuring apparatus (or indeed, in any closed quantum system), what criteria determine which alternative outcomes may be realized and what their probabilities are? In the consistent histories formulation it is the vanishing of interference between the branch wave functions describing alternative histories — as determined by the system’s decoherence functional — that determines which alternatives may be assigned probabilities. We describe the consistent histories formulation and how it may be applied to canonical loop quantum cosmology, describing in detail the application to homogeneous and isotropic cosmological models with scalar matter. We show how the theory may be used to make definite physical predictions in the absence of “observers”. As an application, we demonstrate how the theory predicts that loop quantum models “bounce” from large volume to large volume, while conventional “Wheeler-DeWitt”-quantized universes are invariably singular. We also briefly indicate the relation to other work.
Consistent Adjoint Driven Importance Sampling using Space, Energy and Angle
Peplow, Douglas E. [ORNL]; Mosher, Scott W. [ORNL]; Evans, Thomas M. [ORNL]
2012-08-01
For challenging radiation transport problems, hybrid methods combine the accuracy of Monte Carlo methods with the global information present in deterministic methods. One of the most successful hybrid methods is CADIS (Consistent Adjoint Driven Importance Sampling). This method uses a deterministic adjoint solution to construct a biased source distribution and consistent weight windows to optimize a specific tally in a Monte Carlo calculation. The method has been implemented into transport codes using just the spatial and energy information from the deterministic adjoint and has been used in many applications to compute tallies with much higher figures-of-merit than analog calculations. CADIS also outperforms user-supplied importance values, which usually take long periods of user time to develop. This work extends CADIS to develop weight windows that are a function of the position, energy, and direction of the Monte Carlo particle. Two types of consistent source biasing are presented: one method that biases the source in space and energy while preserving the original directional distribution, and one method that biases the source in space, energy, and direction. Seven simple example problems are presented that compare the use of the standard space/energy CADIS with the new space/energy/angle treatments.
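The "consistent" coupling in CADIS is that the biased source and the weight-window targets are derived from the same adjoint solution, so source particles are born exactly at their window center. A minimal cell-wise sketch (illustrative only, not the ORNL implementation; the source is assumed to be a normalized distribution over cells):

```python
def cadis_parameters(source, adjoint):
    """Cell-wise CADIS sketch.
    source:  normalized forward source probabilities q_i per cell.
    adjoint: deterministic adjoint fluxes phi†_i per cell.
    Returns the biased source pdf q̂_i = q_i * phi†_i / R and the consistent
    weight-window target weights w_i = R / phi†_i, where R = sum_i q_i * phi†_i
    is the deterministic estimate of the tally response."""
    R = sum(q * a for q, a in zip(source, adjoint))
    biased = [q * a / R for q, a in zip(source, adjoint)]
    weights = [R / a for a in adjoint]
    return biased, weights
```

Consistency check: a particle born in cell i carries starting weight q_i / q̂_i = R / phi†_i, which is exactly w_i, so no source particle is split or rouletted at birth.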