Image recognition and consistency of response
Haygood, Tamara M.; Ryan, John; Liu, Qing Mary A.; Bassett, Roland; Brennan, Patrick C.
2012-02-01
Purpose: To investigate the connection between conscious recognition of an image previously encountered in an experimental setting and consistency of response to the experimental question. Materials and Methods: Twenty-four radiologists viewed 40 frontal chest radiographs and gave their opinion as to the position of a central venous catheter. One to three days later they again viewed 40 frontal chest radiographs and again gave their opinion as to the position of the central venous catheter. Half of the radiographs in the second set were repeated images from the first set and half were new. The radiologists were asked, for each image, whether it had been included in the first set. For this study, we evaluated only the 20 repeated images. We used the Kruskal-Wallis test and Fisher's exact test to determine the relationship between conscious recognition of a previously interpreted image and consistency in interpretation of the image. Results: There was no significant correlation between recognition of an image and consistency of response regarding the position of the central venous catheter. In fact, there was a trend in the opposite direction: radiologists were slightly more likely to give a consistent response for images they did not recognize than for those they did recognize. Conclusion: Radiologists' recognition of previously encountered images in an observer-performance study does not noticeably color their interpretation on the second encounter.
Guided color consistency optimization for image mosaicking
Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li
2018-01-01
This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms adjust all images to minimize color differences among images under a unified energy framework; however, the results tend toward a consistent but unnatural appearance when the color differences between images are large and diverse. Our approach addresses this problem by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First, to obtain reliable intensity correspondences in the overlapping regions between image pairs, we propose a histogram extreme point matching algorithm that is robust to image geometric misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the other images along paths found by graph analysis. The final results of the global adjustment thus take on a consistent color similar to the appearance of the reference image subset. Several groups of experiments on both synthetic datasets and challenging real ones demonstrate that the proposed approach achieves results as good as, or better than, state-of-the-art approaches.
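The histogram extreme point matching idea can be illustrated with a minimal sketch, which is not the authors' algorithm: detect local maxima ("extreme points") of the intensity histograms of the two overlapping regions, pair them in order, and fit a gain/offset intensity mapping. The smoothing kernel, bin count, and the linear gain/offset model are all illustrative assumptions.

```python
import numpy as np

def histogram_peaks(values, bins=64, lo=0.0, hi=1.0):
    """Bin-center locations of local maxima of a lightly smoothed histogram."""
    hist, edges = np.histogram(values, bins=bins, range=(lo, hi))
    kernel = np.array([1.0, 4.0, 6.0, 4.0, 1.0]) / 16.0   # mild smoothing
    smooth = np.convolve(hist, kernel, mode="same")
    centers = 0.5 * (edges[:-1] + edges[1:])
    is_peak = (smooth[1:-1] > smooth[:-2]) & (smooth[1:-1] >= smooth[2:])
    return centers[1:-1][is_peak]

def fit_gain_offset(src_peaks, ref_peaks):
    """Pair extreme points in order and fit ref ≈ a * src + b."""
    n = min(len(src_peaks), len(ref_peaks))
    a, b = np.polyfit(src_peaks[:n], ref_peaks[:n], 1)
    return a, b
```

Because peaks of the two histograms correspond to the same scene content in the overlap, the fitted mapping survives moderate geometric misalignment, which per-pixel correspondences would not.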
Spectrally Consistent Satellite Image Fusion with Improved Image Priors
DEFF Research Database (Denmark)
Nielsen, Allan Aasbjerg; Aanæs, Henrik; Jensen, Thomas B.S.
2006-01-01
Here an improvement to our previous framework for satellite image fusion is presented: a framework purely based on the sensor physics and on prior assumptions on the fused image. The contributions of this paper are twofold. Firstly, a method for ensuring 100% spectral consistency is proposed, even when more sophisticated image priors are applied. Secondly, a better image prior is introduced, via data-dependent image smoothing.
Smoothing of Fused Spectral Consistent Satellite Images
DEFF Research Database (Denmark)
Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli
2006-01-01
... on satellite data. Additionally, most conventional methods are loosely connected to the image forming physics of the satellite image, giving these methods an ad hoc feel. Vesteinsson et al. (2005) proposed a method of fusion of satellite images that is based on the properties of imaging physics ...
Algebra 1 groups, rings, fields and arithmetic
Lal, Ramji
2017-01-01
This is the first in a series of three volumes dealing with important topics in algebra. It offers an introduction to the foundations of mathematics together with the fundamental algebraic structures, namely groups, rings, fields, and arithmetic. Intended as a text for undergraduate and graduate students of mathematics, it discusses all major topics in algebra with numerous motivating illustrations and exercises to enable readers to acquire a good understanding of the basic algebraic structures, which they can then use to find the exact or the most realistic solutions to their problems.
Example-Based Image Colorization Using Locality Consistent Sparse Representation.
Li, Bo; Zhao, Fuchen; Su, Zhuo; Liang, Xiangguo; Lai, Yu-Kun; Rosin, Paul L.
2017-11-01
Image colorization aims to produce a natural looking color image from a given gray-scale image, which remains a challenging problem. In this paper, we propose a novel example-based image colorization method exploiting a new locality consistent sparse representation. Given a single reference color image, our method automatically colorizes the target gray-scale image by sparse pursuit. For efficiency and robustness, our method operates at the superpixel level. We extract low-level intensity features, mid-level texture features, and high-level semantic features for each superpixel, which are then concatenated to form its descriptor. The collection of feature vectors for all the superpixels from the reference image composes the dictionary. We formulate colorization of target superpixels as a dictionary-based sparse reconstruction problem. Inspired by the observation that superpixels with similar spatial location and/or feature representation are likely to match spatially close regions from the reference image, we further introduce a locality promoting regularization term into the energy formulation, which substantially improves the matching consistency and subsequent colorization results. Target superpixels are colorized based on the chrominance information from the dominant reference superpixels. Finally, to further improve coherence while preserving sharpness, we develop a new edge-preserving filter for chrominance channels with the guidance from the target gray-scale image. To the best of our knowledge, this is the first work on sparse pursuit image colorization from single reference images. Experimental results demonstrate that our colorization method outperforms the state-of-the-art methods, both visually and quantitatively using a user study.
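The locality-promoting idea above can be sketched in a simplified form. The paper uses sparse pursuit; the toy version below instead solves a locality-weighted ridge coding in closed form, so the weighting effect is easy to see. Dictionary columns are reference-superpixel descriptors, `centers` their spatial positions, and `lam` is an assumed regularization weight.

```python
import numpy as np

def locality_weighted_code(x, D, centers, pos, lam=0.1):
    """
    Encode target feature x over dictionary D (columns are reference
    superpixel descriptors). Atoms whose spatial centers lie far from the
    target position 'pos' receive a larger penalty, so coefficients
    concentrate on nearby, similar atoms. Closed-form solution of
        min_a ||D a - x||^2 + lam * sum_i w_i * a_i^2 .
    """
    dist = np.linalg.norm(centers - pos, axis=1)
    w = dist / (dist.max() + 1e-12)          # normalized locality penalty
    A = D.T @ D + lam * np.diag(w)
    return np.linalg.solve(A, D.T @ x)
```

Chrominance would then be transferred from the dominant (largest-coefficient) reference superpixels, as the abstract describes.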
Tunneling in a self-consistent dynamic image potential
International Nuclear Information System (INIS)
Rudberg, B.G.R.; Jonson, M.
1991-01-01
We have calculated the self-consistent effective potential for an electron tunneling through a square barrier while interacting with surface plasmons. This potential reduces to the classical image potential in the static limit. In the opposite limit, when the "velocity" of the tunneling electron is large, it reduces to the unperturbed square-barrier potential. For a wide variety of parameters the dynamic effects on the transmission coefficient T = |t|² can, for instance, be related to the Büttiker-Landauer traversal time for tunneling, given by τ_BL = ℏ|d ln t/dV|.
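For reference, the unperturbed baseline that the dynamic calculation reduces to in the fast-electron limit is the textbook square barrier. A minimal sketch (natural units m = ℏ = 1, E < V₀; not the paper's self-consistent calculation):

```python
import numpy as np

def square_barrier_T(E, V0, a, m=1.0, hbar=1.0):
    """Transmission coefficient T = |t|^2 through a static square barrier
    of height V0 and width a, for sub-barrier energy E < V0."""
    kappa = np.sqrt(2.0 * m * (V0 - E)) / hbar       # decay constant in barrier
    s = np.sinh(kappa * a)
    return 1.0 / (1.0 + (V0 ** 2 * s ** 2) / (4.0 * E * (V0 - E)))
```

Dynamic image-potential effects would modify t, and hence both T and the traversal time τ_BL = ℏ|d ln t/dV|, relative to this static result.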
Fueling moving ring field-reversed mirror reactor plasmas
International Nuclear Information System (INIS)
Felber, F.S.
1980-01-01
The concept of small fusion reactors is being studied jointly by Lawrence Livermore Laboratory, General Atomic Company, and Pacific Gas and Electric Company. The objective is to investigate alternatives and then to develop a conceptual design for a small reactor that could produce useful, though not necessarily economical, energy by the late 1980s. Three methods of fueling a small moving-ring field-reversed mirror are considered: injection of fuel pellets accelerated by laser ablation, injection of fuel pellets accelerated by deflagration-gun ablation, and direct injection of plasma by a deflagration gun. 13 refs
Moving-ring field-reversed mirror reactor
International Nuclear Information System (INIS)
Smith, A.C. Jr.; Ashworth, C.P.; Abreu, K.E.
1981-01-01
We describe a first prototype fusion reactor design of the Moving-Ring Field-Reversed Mirror Reactor. The fusion fuel is confined in current-carrying rings of magnetically field-reversed plasma. The plasma rings, formed by a coaxial plasma gun, are magnetically compressed to ignition temperature while being injected into the reactor's burner section. DT ice pellets refuel the rings during the burn at a rate which maintains constant fusion power. A steady train of plasma rings moves at constant speed through the reactor under the influence of a slightly diverging magnetic field. The aluminum first wall and breeding-zone structure minimize induced radioactivity; hands-on maintenance is possible on reactor components outside the breeding blanket. Helium removes the heat from the Li₂O tritium breeding blanket and is used to generate steam. The reactor produces a constant net power of 376 MW.
Moving ring field-reversed mirror blanket design considerations
International Nuclear Information System (INIS)
Wong, C.P.C.; Cheng, E.T.; Creedon, L.; Kessel, C.; Norman, J.; Schultz, K.R.
1981-01-01
A blanket design for the Moving Ring Field-Reversed Mirror Reactor (MRFRM) is presented in this paper. The design emphasis is placed on minimizing the induced radioactivity in the first wall, blanket and shield. To this end, an aluminum alloy was selected as the reference structural material, giving dose rates two weeks after shutdown that are 3 to 4 orders of magnitude lower than those of comparable steel structures. The aluminum first wall is water-cooled and thermally insulated from the high-temperature SiC-clad Li₂O tritium breeding zone. A local tritium breeding ratio of 1.05 was obtained for the design. The tritium is extracted from the Li₂O by a small dry helium purge stream through the SiC tubes. About 1 ppm hydrogen is added to the helium purge stream to enhance the tritium recovery rate. Helium at 28 atmospheres pressure is circulated through the blanket and shield, with an outlet temperature of 850 °C, and is coupled to an existing small closed-cycle gas turbine (CCGT) power conversion system. The spatial and temporal variations of the first-wall temperature caused by the translational movement of the plasma rings along the axis of the cylindrical reactor were evaluated. The after-heat cooling problems of the first wall were also considered.
Smoothing of Fused Spectral Consistent Satellite Images with TV-based Edge Detection
DEFF Research Database (Denmark)
Sveinsson, Johannes; Aanæs, Henrik; Benediktsson, Jon Atli
2007-01-01
Several widely used methods have been proposed for fusing high resolution panchromatic data and lower resolution multi-channel data. However, many of these methods fail to maintain the spectral consistency of the fused high resolution image, which is of high importance to many of the applications based on satellite data. Additionally, most conventional methods are loosely connected to the image forming physics of the satellite image, giving these methods an ad hoc feel. Vesteinsson et al. [1] proposed a method of fusion of satellite images that is based on the properties of imaging physics in a statistically meaningful way and was called spectral consistent pansharpening (SCP). In this paper we improve this framework for satellite image fusion by introducing a better image prior, via data-dependent image smoothing. The dependency is obtained via a total variation edge detection method.
Method used to test the imaging consistency of binocular camera's left-right optical system
Liu, Meiying; Wang, Hu; Liu, Jie; Xue, Yaoke; Yang, Shaodong; Zhao, Hui
2016-09-01
For a binocular camera, the consistency of the optical parameters of the left and right optical systems is an important factor influencing overall imaging consistency. Conventional optical-system testing procedures lack specifications suitable for evaluating imaging consistency. In this paper, considering the special requirements of a binocular optical imaging system, a method for measuring the imaging consistency of a binocular camera is presented. Based on this method, a measurement system composed of an integrating sphere, a rotary table and a CMOS camera was established. First, the left and right optical systems capture images with normal exposure time under identical conditions. Second, a contour image is obtained from a multiple-threshold segmentation result, and the boundary is determined using the slope of contour lines near the pseudo-contour line. Third, a gray-level constraint based on corresponding coordinates in the left and right images is established, and imaging consistency is evaluated through the standard deviation σ of the imaging grayscale difference D(x, y) between the left and right optical systems. The experiments demonstrate that the method is suitable for imaging-consistency testing of binocular cameras. When the 3σ distribution of the imaging gray difference D(x, y) between the left and right optical systems does not exceed 5%, the design requirements are considered to be met. This method is effective and paves the way for imaging-consistency testing of binocular cameras.
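The final consistency criterion is simple to state in code. A minimal sketch, assuming the left/right images are already registered pixel-to-pixel and that D(x, y) is the grayscale difference normalized by full scale:

```python
import numpy as np

def imaging_consistency(left, right, full_scale=255.0):
    """
    Relative grayscale difference D(x, y) between registered left/right
    images, its standard deviation sigma, and the pass/fail verdict of
    the 3-sigma <= 5% criterion described in the abstract.
    """
    D = (left.astype(float) - right.astype(float)) / full_scale
    sigma = float(D.std())
    return D, sigma, 3.0 * sigma <= 0.05
```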
Heo, Yong Seok; Lee, Kyoung Mu; Lee, Sang Uk
2013-05-01
In this paper, we propose a method that infers both accurate depth maps and color-consistent stereo images for radiometrically varying stereo images. In general, stereo matching and enforcing color consistency between stereo images form a chicken-and-egg problem, since it is not trivial to achieve both goals simultaneously. Hence, we have developed an iterative framework in which the two processes boost each other. First, we transform the input color images to log-chromaticity color space, in which a linear relationship can be established while constructing a joint pdf of the transformed left and right color images. From this joint pdf, we estimate a linear function that relates the corresponding pixels in the stereo images. Based on this linear property, we present a new stereo matching cost combining Mutual Information (MI), the SIFT descriptor, and segment-based plane fitting to robustly find correspondences for stereo image pairs that undergo radiometric variations. Meanwhile, we devise a Stereo Color Histogram Equalization (SCHE) method to produce color-consistent stereo image pairs, which in turn boost the disparity map estimation. Experimental results show that our method produces both accurate depth maps and color-consistent stereo images, even for stereo images with severe radiometric differences.
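The log-chromaticity transform is what makes per-channel gain differences between cameras linear. A minimal sketch (one common form of the transform: log of each channel over the geometric mean of the three; the paper's exact normalization may differ):

```python
import numpy as np

def log_chromaticity(rgb):
    """
    Log-chromaticity transform of an H x W x 3 image: log of each channel
    divided by the geometric mean of the three channels. A per-channel
    multiplicative gain becomes an additive per-channel constant, which is
    why the joint pdf of transformed left/right images is linear.
    """
    rgb = rgb.astype(float) + 1e-6                       # avoid log(0)
    gmean = rgb.prod(axis=-1, keepdims=True) ** (1.0 / 3.0)
    return np.log(rgb / gmean)
```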
Object-Oriented Hierarchy Radiation Consistency for Different Temporal and Different Sensor Images
Directory of Open Access Journals (Sweden)
Nan Su
2018-02-01
In this paper, we propose a novel object-oriented hierarchy radiation consistency method for dense matching of different-temporal and different-sensor data in 3D reconstruction. For different-temporal images, our illumination consistency method solves both illumination uniformity for a single image and relative illumination normalization for image pairs. In the relative illumination normalization step in particular, singular value equalization and a linear relationship of the invariant pixels are used in combination, for the initial global illumination normalization and for the object-oriented refined illumination normalization, respectively. For different-sensor images, we propose the union group sparse method, which improves on the original group sparse model. The different-sensor images are set to a similar smoothness level by the same threshold of singular value from the union group matrix. Our method comprehensively considers the factors that influence dense matching of different-temporal and different-sensor stereoscopic image pairs, improving illumination consistency and smoothness consistency simultaneously. The radiation consistency experiments verify the effectiveness and superiority of the proposed method by comparison with two other methods. Moreover, in the dense matching experiment on mixed stereoscopic image pairs, our method shows more advantages for objects in urban areas.
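One simple reading of "singular value equalization" for global radiometric normalization is to rebuild the target image with the reference image's singular values. This sketch is an assumption about that step, not the paper's full pipeline:

```python
import numpy as np

def singular_value_equalization(target, reference):
    """
    Decompose the target image matrix by SVD and reassemble it using the
    reference image's singular values: a rough global radiometric
    normalization that keeps the target's spatial structure (U, V) while
    adopting the reference's energy distribution (singular values).
    """
    Ut, st, Vt = np.linalg.svd(target, full_matrices=False)
    sr = np.linalg.svd(reference, compute_uv=False)
    n = min(st.size, sr.size)
    return (Ut[:, :n] * sr[:n]) @ Vt[:n, :]
```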
Inverse consistent non-rigid image registration based on robust point set matching
2014-01-01
Background: Robust point matching (RPM) has been extensively used in non-rigid registration of images to robustly register two sets of image points. However, except at the control points, RPM cannot estimate a consistent correspondence between two images, because RPM is a unidirectional matching approach. It is therefore important to improve image registration based on RPM. Methods: In this work, a consistent image registration approach based on point set matching is proposed to incorporate the property of inverse consistency and improve registration accuracy. Instead of estimating only the forward transformation between the source and target point sets, as in state-of-the-art RPM algorithms, the forward and backward transformations between the two point sets are estimated concurrently in our algorithm. Inverse consistency constraints are introduced into the cost function of RPM, and the fuzzy correspondences between the two point sets are estimated based on both the forward and backward transformations simultaneously. A modified consistent landmark thin-plate spline registration is discussed in detail to find the forward and backward transformations during the optimization of RPM. The similarity of image content is also incorporated into the point matching to improve image matching. Results: Synthetic data sets and medical images are employed to demonstrate and validate the performance of our approach. The inverse consistency errors of our algorithm are smaller than those of RPM. In particular, the topology of the transformations is well preserved by our algorithm for large deformations between point sets. Moreover, the distance errors of our algorithm are similar to those of RPM and maintain a downward trend as a whole, which demonstrates the convergence of our algorithm. The registration errors for image registration are also evaluated; again, our algorithm achieves lower registration errors in the same number of iterations.
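The inverse consistency error that this approach minimizes has a compact definition: composing the forward map with the backward map should return every point to itself. A minimal 1-D sketch of the metric (the paper works with 2-D/3-D thin-plate spline transforms):

```python
import numpy as np

def inverse_consistency_error(forward, backward, xs):
    """
    Mean inverse consistency error of a forward map f and backward map b
    over sample points xs: average of |b(f(x)) - x|. It is zero exactly
    when b inverts f on the samples, the property enforced above.
    """
    return float(np.mean(np.abs(backward(forward(xs)) - xs)))
```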
Group sparse multiview patch alignment framework with view consistency for image classification.
Gui, Jie; Tao, Dacheng; Sun, Zhenan; Luo, Yong; You, Xinge; Tang, Yuan Yan
2014-07-01
No single feature can satisfactorily characterize the semantic concepts of an image. Multiview learning aims to unify different kinds of features to produce a consensual and efficient representation. This paper redefines part optimization in the patch alignment framework (PAF) and develops a group sparse multiview patch alignment framework (GSM-PAF). The new part optimization considers not only the complementary properties of different views, but also view consistency. In particular, view consistency models the correlations between all possible combinations of any two kinds of view. In contrast to conventional dimensionality reduction algorithms that perform feature extraction and feature selection independently, GSM-PAF enjoys joint feature extraction and feature selection by exploiting l(2,1)-norm on the projection matrix to achieve row sparsity, which leads to the simultaneous selection of relevant features and learning transformation, and thus makes the algorithm more discriminative. Experiments on two real-world image data sets demonstrate the effectiveness of GSM-PAF for image classification.
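The l(2,1)-norm that GSM-PAF places on the projection matrix is just the sum of the Euclidean norms of the rows; minimizing it drives whole rows to zero, which is what selects features jointly across views. A small illustrative sketch (the norm itself, not the GSM-PAF optimization):

```python
import numpy as np

def l21_norm(W):
    """l_{2,1} norm: sum over rows of the row's Euclidean (l2) norm."""
    return float(np.linalg.norm(W, axis=1).sum())
```

Among matrices of equal Frobenius energy, the row-sparse one has the smaller l(2,1) norm, so the penalty prefers zeroing entire rows (features) rather than spreading small values everywhere.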
Multi-instance learning based on instance consistency for image retrieval
Zhang, Miao; Wu, Zhize; Wan, Shouhong; Yue, Lihua; Yin, Bangjie
2017-07-01
Multiple-instance learning (MIL) has been successfully utilized in image retrieval. Existing approaches cannot correctly select positive instances from positive bags, which may result in low accuracy. In this paper, we propose a new image retrieval approach called multiple-instance learning based on instance consistency (MILIC) to mitigate this issue. First, we effectively select potential positive instances in each positive bag by ranking the instance-consistency (IC) values of the instances. Then, we design a feature representation scheme based on the potential positive instances, which can represent the relationship among bags and instances, to convert a bag into a single instance. Finally, we can use a standard single-instance learning strategy, such as the support vector machine, for object-based image retrieval. Experimental results on two challenging data sets show the effectiveness of our proposal in terms of accuracy and run time.
Consistent reconstruction of 4D fetal heart ultrasound images to cope with fetal motion.
Tanner, Christine; Flach, Barbara; Eggenberger, Céline; Mattausch, Oliver; Bajka, Michael; Goksel, Orcun
2017-08-01
4D ultrasound imaging of the fetal heart relies on reconstructions from B-mode images. In the presence of fetal motion, current approaches suffer from artifacts, which are unrecoverable for single sweeps. We propose to use many sweeps and exploit the resulting redundancy to automatically recover from motion by reconstructing a 4D image which is consistent in phase, space, and time. An interactive visualization framework to view animated ultrasound slices from 4D reconstructions on arbitrary planes was developed using a magnetically tracked mock probe. We first quantified the performance of 10 4D reconstruction formulations on simulated data. Reconstructions of 14 in vivo sequences by a baseline, the current state-of-the-art, and the proposed approach were then visually ranked with respect to temporal quality on orthogonal views. Rankings from 5 observers showed that the proposed 4D reconstruction approach significantly improves temporal image quality in comparison with the baseline. The 4D reconstructions of the baseline and the proposed methods were then inspected interactively for accessibility to clinically important views and rated for their clinical usefulness by an ultrasound specialist in obstetrics and gynecology. The reconstructions by the proposed method were rated as 'very useful' in 71% and were statistically significantly more useful than the baseline reconstructions. Multi-sweep fetal heart ultrasound acquisitions in combination with consistent 4D image reconstruction improves quality as well as clinical usefulness of the resulting 4D images in the presence of fetal motion.
International Nuclear Information System (INIS)
Yang Deshan; Li Hua; Low, Daniel A; Deasy, Joseph O; Naqa, Issam El
2008-01-01
Deformable image registration is widely used in radiation therapy applications, including daily treatment planning adaptation to map planned tissue or dose to changing anatomy. In this work, a simple and efficient inverse-consistent deformable registration method is proposed, aiming at higher registration accuracy and faster convergence. Instead of registering image I to a second image J, the two images are symmetrically deformed toward one another in multiple passes, until both deformed images match and correct registration is thereby achieved. In each pass, a delta motion field is computed by minimizing a symmetric optical flow system cost function using modified optical flow algorithms. The images are then further deformed with the delta motion field in the positive and negative directions, respectively, and used for the next pass. The magnitude of the delta motion field is forced to be less than 0.4 voxel in every pass in order to guarantee smoothness and invertibility of the two overall motion fields that accumulate the delta motion fields in the positive and negative directions, respectively. The final motion fields to register the original images I and J, in either direction, are calculated by inverting one overall motion field and combining the result with the other overall motion field. The final motion fields are inversely consistent, which is ensured by the symmetric way the registration is carried out. The proposed method is demonstrated with phantom images, artificially deformed patient images and 4D-CT images. Our results suggest that the proposed method is able to improve overall accuracy (reducing registration error by 30% or more, compared with the original, inversely inconsistent optical flow algorithms), reduce the inverse consistency error (by 95% or more) and increase the convergence rate (by 100% or more). The overall computation speed may slightly decrease, or increase in most cases
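The symmetric accumulation with the 0.4-voxel cap can be illustrated with a toy 1-D model. This sketch replaces the optical flow solver with a known residual shift and shows only the capping and the symmetric forward/backward bookkeeping, under the assumption that each pass recovers half of the remaining misalignment:

```python
CAP = 0.4  # maximum |delta| per pass, as in the method above

def symmetric_passes(total_shift, max_passes=100):
    """
    Toy 1-D version of the symmetric scheme: each pass contributes a
    delta of at most CAP voxels, applied in the positive direction to one
    accumulated field and the negative direction to the other, until the
    two deformed 'images' meet in the middle.
    """
    fwd = bwd = 0.0
    for _ in range(max_passes):
        remaining = total_shift - (fwd - bwd)     # residual misalignment
        if abs(remaining) < 1e-9:
            break
        delta = max(-CAP, min(CAP, remaining / 2.0))
        fwd += delta                              # image I moves forward
        bwd -= delta                              # image J moves backward
    return fwd, bwd
```

Capping each delta keeps every incremental field trivially invertible; the large total deformation is reached only through the composition of many small steps.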
Image velocimetry for clouds with relaxation labeling based on deformation consistency
International Nuclear Information System (INIS)
Horinouchi, Takeshi; Murakami, Shin-ya; Yamazaki, Atsushi; Kouyama, Toru; Ogohara, Kazunori; Yamada, Manabu; Watanabe, Shigeto
2017-01-01
Correlation-based cloud tracking has been extensively used to measure atmospheric winds, but still difficulty remains. In this study, aiming at developing a cloud tracking system for Akatsuki, an artificial satellite orbiting Venus, a formulation is developed for improving the relaxation labeling technique to select appropriate peaks of cross-correlation surfaces which tend to have multiple peaks. The formulation makes an explicit use of consistency inherent in the type of cross-correlation method where template sub-images are slid without deformation; if the resultant motion vectors indicate a too-large deformation, it is contradictory to the assumption of the method. The deformation consistency is exploited further to develop two post processes; one clusters the motion vectors into groups within each of which the consistency is perfect, and the other extends the groups using the original candidate lists. These processes are useful to eliminate erroneous vectors, distinguish motion vectors at different altitudes, and detect phase velocities of waves in fluids such as atmospheric gravity waves. As a basis of the relaxation labeling and the post processes as well as uncertainty estimation, the necessity to find isolated (well-separated) peaks of cross-correlation surfaces is argued, and an algorithm to realize it is presented. All the methods are implemented, and their effectiveness is demonstrated with initial images obtained by the ultraviolet imager onboard Akatsuki. Since the deformation consistency regards the logical consistency inherent in template matching methods, it should have broad application beyond cloud tracking.
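The notion of isolated (well-separated) peaks can be made concrete with a small sketch. This is an illustrative detector, not the paper's algorithm: keep a local maximum of the correlation surface only if it is above a fraction of the global maximum and no stronger value lies within a separation radius. The thresholds are assumed values.

```python
import numpy as np

def isolated_peaks(surface, min_sep=3, rel_thresh=0.5):
    """
    Candidate peaks of a cross-correlation surface: local maxima above
    rel_thresh * global max that are also the unique maximum within a
    (2*min_sep+1)-pixel window, i.e. 'well-separated' peaks. Returns
    (row, col, value) tuples sorted by descending correlation value.
    """
    peaks = []
    h, w = surface.shape
    cut = rel_thresh * surface.max()
    for i in range(h):
        for j in range(w):
            v = surface[i, j]
            if v < cut:
                continue
            i0, i1 = max(0, i - min_sep), min(h, i + min_sep + 1)
            j0, j1 = max(0, j - min_sep), min(w, j + min_sep + 1)
            patch = surface[i0:i1, j0:j1]
            if v >= patch.max() and (patch == v).sum() == 1:
                peaks.append((i, j, float(v)))
    return sorted(peaks, key=lambda p: -p[2])
```

Surfaces with several such peaks are exactly the ambiguous cases the relaxation labeling then resolves using deformation consistency with neighboring vectors.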
Energy Technology Data Exchange (ETDEWEB)
Lee, Danny [Radiation Physics Laboratory, Sydney Medical School, The University of Sydney, Sydney, NSW (Australia); Greer, Peter B. [School of Mathematical and Physical Sciences, The University of Newcastle, Newcastle, NSW (Australia); Department of Radiation Oncology, Calvary Mater Newcastle, Newcastle, NSW (Australia); Ludbrook, Joanna; Arm, Jameen; Hunter, Perry [Department of Radiation Oncology, Calvary Mater Newcastle, Newcastle, NSW (Australia); Pollock, Sean; Makhija, Kuldeep; O'Brien, Ricky T. [Radiation Physics Laboratory, Sydney Medical School, The University of Sydney, Sydney, NSW (Australia); Kim, Taeho [Radiation Physics Laboratory, Sydney Medical School, The University of Sydney, Sydney, NSW (Australia); Department of Radiation Oncology, Virginia Commonwealth University, Richmond, Virginia (United States); Keall, Paul, E-mail: paul.keall@sydney.edu.au [Radiation Physics Laboratory, Sydney Medical School, The University of Sydney, Sydney, NSW (Australia)
2016-03-01
Purpose: To assess the impact of an audiovisual (AV) biofeedback on intra- and interfraction tumor motion for lung cancer patients. Methods and Materials: Lung tumor motion was investigated in 9 lung cancer patients who underwent a breathing training session with AV biofeedback before two 3T magnetic resonance imaging (MRI) sessions. The breathing training session was performed to allow patients to become familiar with AV biofeedback, which uses a guiding wave customized for each patient according to a reference breathing pattern. In the first MRI session (pretreatment), 2-dimensional cine-MR images with (1) free breathing (FB) and (2) AV biofeedback were obtained, and the second MRI session was repeated within 3-6 weeks (mid-treatment). Lung tumors were directly measured from cine-MR images using an auto-segmentation technique; the centroid and outlier motions of the lung tumors were measured from the segmented tumors. Free breathing and AV biofeedback were compared using several metrics: intra- and interfraction tumor motion consistency in displacement and period, and the outlier motion ratio. Results: Compared with FB, AV biofeedback improved intrafraction tumor motion consistency by 34% in displacement (P=.019) and by 73% in period (P<.001). Compared with FB, AV biofeedback improved interfraction tumor motion consistency by 42% in displacement (P<.046) and by 74% in period (P=.005). Compared with FB, AV biofeedback reduced the outlier motion ratio by 21% (P<.001). Conclusions: These results demonstrated that AV biofeedback significantly improved intra- and interfraction lung tumor motion consistency for lung cancer patients. These results demonstrate that AV biofeedback can facilitate consistent tumor motion, which is advantageous toward achieving more accurate medical imaging and radiation therapy procedures.
Self-consistent density functional calculation of the image potential at a metal surface
International Nuclear Information System (INIS)
Jung, J; Alvarellos, J E; Chacon, E; GarcIa-Gonzalez, P
2007-01-01
It is well known that the exchange-correlation (XC) potential at a metal surface has an image-like asymptotic behaviour given by -1/[4(z-z_0)], where z is the coordinate perpendicular to the surface. Using a suitable fully non-local functional prescription, we evaluate self-consistently the XC potential with the correct image behaviour for simple jellium surfaces in the range of metallic densities. This allows a proper comparison between the corresponding image-plane position, z_0, and other related quantities such as the centroid of the charge induced by an external perturbation. As a by-product, we assess the routinely used local density approximation when evaluating electron density profiles, work functions, and surface energies by focusing on the XC effects included in the fully non-local description.
Christensen, Gary E; Song, Joo Hyun; Lu, Wei; El Naqa, Issam; Low, Daniel A
2007-06-01
Breathing motion is one of the major limiting factors for reducing dose and irradiation of normal tissue for conventional conformal radiotherapy. This paper describes a relationship between tracking lung motion using spirometry data and image registration of consecutive CT image volumes collected from a multislice CT scanner over multiple breathing periods. Temporal CT sequences from 5 individuals were analyzed in this study. The couch was moved from 11 to 14 different positions to image the entire lung. At each couch position, 15 image volumes were collected over approximately 3 breathing periods. It is assumed that the expansion and contraction of lung tissue can be modeled as an elastic material. Furthermore, it is assumed that the deformation of the lung is small over one-fifth of a breathing period and therefore the motion of the lung can be adequately modeled using a small deformation linear elastic model. The small deformation inverse consistent linear elastic image registration algorithm is therefore well suited for this problem and was used to register consecutive image scans. The pointwise expansion and compression of lung tissue was measured by computing the Jacobian of the transformations used to register the images. The logarithm of the Jacobian was computed so that expansion and compression of the lung were scaled equally. The log-Jacobian was computed at each voxel in the volume to produce a map of the local expansion and compression of the lung during the breathing period. These log-Jacobian images demonstrate that the lung does not expand uniformly during the breathing period, but rather expands and contracts locally at different rates during inhalation and exhalation. The log-Jacobian numbers were averaged over a cross section of the lung to produce an estimate of the average expansion or compression from one time point to the next and compared to the air flow rate measured by spirometry. In four out of five individuals, the average log
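The log-Jacobian computation described above is compact enough to sketch. For a transform x → x + u(x) given by a displacement field u on a voxel grid, the Jacobian determinant is det(I + ∇u), and taking its logarithm weights expansion and compression equally. A minimal 2-D illustration with toy data (the study itself uses 3-D CT volumes):

```python
import numpy as np

def log_jacobian(disp):
    """log|J| of the transform x -> x + u(x) for a 2-D displacement
    field disp of shape (2, H, W), given in voxel units."""
    du0_d0, du0_d1 = np.gradient(disp[0])  # derivatives of u_0 along axes 0, 1
    du1_d0, du1_d1 = np.gradient(disp[1])  # derivatives of u_1 along axes 0, 1
    det = (1 + du0_d0) * (1 + du1_d1) - du0_d1 * du1_d0
    return np.log(det)

# uniform 10% expansion: u(x) = 0.1 * x, so J = 1.1**2 everywhere
y, x = np.mgrid[0:8, 0:8].astype(float)
lj = log_jacobian(np.stack([0.1 * y, 0.1 * x]))
print(np.allclose(lj, 2 * np.log(1.1)))
```

Averaging such a map over a lung cross section yields the scalar expansion estimate that the study compares with spirometry-measured air flow.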
Derivation of the scan time requirement for maintaining a consistent PET image quality
International Nuclear Information System (INIS)
Kim, Jin Su; Lee, Jae Sung; Kim, Seok-Ki
2015-01-01
Objectives: The image quality of PET for larger patients is relatively poor, even though the injection dose is optimized considering the NECR characteristics of the PET scanner. This poor image quality is due to the lower maximum NECR that can be achieved in these large patients. The aim of this study was to optimize the PET scan time to obtain a consistent PET image quality regardless of body size, based on the relationship between the patient-specific NECR (pNECR) and body weight. Methods: Eighty patients (M/F=53/27, body weight: 059 ± 1 kg) underwent whole-body FDG PET scans using a Philips GEMINI GS PET/CT scanner after an injection of 0.14 mCi/kg FDG. The relationship between the scatter fraction (SF) and body weight was determined by repeated Monte Carlo simulations using a NEMA scatter phantom, the size of which varied according to the relationship between abdominal circumference and body weight. Using this information, the pNECR was calculated from the prompt and delayed PET sinograms to obtain a prediction equation of NECR vs. body weight. The time scaling factor (F_TS) for the scan duration was finally derived to produce PET images with equivalent SNR levels. Results: The SF and NECR had the following nonlinear relationships with body weight: SF = 0.15·(body weight)^0.3 and NECR = 421.36·(body weight)^−0.84. The equation derived for F_TS was 0.01·(body weight) + 0.2, which means, for example, that a 120-kg person should be scanned 1.8 times longer than a 70-kg person, or that the scan time for a 40-kg person can be reduced by 30%. Conclusion: The relative scan-time equation derived in this study will be useful for maintaining consistent PET image quality in clinics.
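The fitted relations reported in this abstract translate directly into code. The function names below are illustrative; the constants are the ones quoted above:

```python
def scatter_fraction(weight_kg):
    """SF = 0.15 * W**0.3 (fitted relation from the abstract)."""
    return 0.15 * weight_kg ** 0.3

def necr(weight_kg):
    """NECR = 421.36 * W**-0.84 (fitted relation from the abstract)."""
    return 421.36 * weight_kg ** -0.84

def time_scaling(weight_kg):
    """F_TS = 0.01 * W + 0.2, the relative scan-time factor."""
    return 0.01 * weight_kg + 0.2

# relative scan-time factors for 120-kg, 70-kg and 40-kg patients
print(round(time_scaling(120.0), 2),
      round(time_scaling(70.0), 2),
      round(time_scaling(40.0), 2))
```

As expected from the fitted relations, NECR falls and SF rises with body weight, so heavier patients need proportionally longer scans to hold the SNR constant.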
101 labeled brain images and a consistent human cortical labeling protocol
Directory of Open Access Journals (Sweden)
Arno Klein
2012-12-01
We introduce the Mindboggle-101 dataset, the largest and most complete set of free, publicly accessible, manually labeled human brain images. To manually label the macroscopic anatomy in magnetic resonance images of 101 healthy participants, we created a new cortical labeling protocol that relies on robust anatomical landmarks and minimal manual edits after initialization with automated labels. The Desikan-Killiany-Tourville (DKT) protocol is intended to improve the ease, consistency, and accuracy of labeling human cortical areas. Given how difficult it is to label brains, the Mindboggle-101 dataset is intended to serve as brain atlases for use in labeling other brains, as a normative dataset to establish morphometric variation in a healthy population for comparison against clinical populations, and to contribute to the development, training, testing, and evaluation of automated registration and labeling algorithms. To this end, we also introduce benchmarks for the evaluation of such algorithms by comparing our manual labels with labels automatically generated by probabilistic and multi-atlas registration-based approaches. All data, related software, and updated information are available on the http://www.mindboggle.info/data/ website.
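Benchmarks of the kind mentioned above typically reduce to a per-label overlap score between manual and automatically generated labelings, such as the Dice index. A toy sketch (the tiny arrays stand in for labeled volumes; a real comparison runs over every label in the protocol):

```python
import numpy as np

def dice(a, b, label):
    """Dice overlap of one label between two labeled volumes."""
    A, B = (a == label), (b == label)
    denom = A.sum() + B.sum()
    return 2.0 * np.logical_and(A, B).sum() / denom if denom else float("nan")

# hypothetical 'manual' vs 'automated' labelings of a tiny slice
manual = np.zeros((4, 4), int); manual[:2, :] = 7
auto   = np.zeros((4, 4), int); auto[:3, :]   = 7
print(round(float(dice(manual, auto, 7)), 3))
```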
International Nuclear Information System (INIS)
Luo Min; Peng Chenglin; Wang Xiaolin; Luo Song; Lei Wenyong; Wang Kang; Wang Xuejian; Wen Hongyu; Wu Hongxing
2003-01-01
Objective: To achieve consistency of patient information among the picture archiving and communication system (PACS), radiology information system (RIS), and imaging equipment through the modality worklist. Methods: Many digital modalities, including a GE Signa 1.5 T MR system, digital mammography, Agfa digital radiography, computed radiography, and GE CT, were installed in our hospital. Because the GE PACS was an English-language version, images were saved and information was managed in PACS without a true worklist. Two months later, while still using the previous PACS, patient-information consistency was achieved through the modality worklist by adopting the Digital Imaging and Communications in Medicine (DICOM) worklist service. Chinese-language information in the RIS was converted to English and saved to the worklist. Results: After implementation and integration of PACS and RIS in our hospital, consistency of patient information between the RIS and the imaging equipment was guaranteed through the modality worklist, with good results. Patient information could be checked, edited, and created using the Chinese-language RIS on all diagnostic workstations. Conclusion: Consistency of patient information through the modality worklist was realized among PACS, RIS, and imaging equipment; nevertheless, how to guarantee this consistency must still be considered when building a PACS.
Design and construction of an Offner spectrometer based on geometrical analysis of ring fields.
Kim, Seo Hyun; Kong, Hong Jin; Lee, Jong Ung; Lee, Jun Ho; Lee, Jai Hoon
2014-08-01
A method to obtain an aberration-corrected Offner spectrometer without ray obstruction is proposed. A new, more efficient spectrometer optics design is suggested in order to increase its spectral resolution. The derivation of a new ring equation to eliminate ray obstruction is based on geometrical analysis of the ring fields for various numerical apertures. The analytical design applying this equation was demonstrated using the optical design software Code V in order to manufacture a spectrometer working in wavelengths of 900-1700 nm. The simulation results show that the new concept offers an analytical initial design requiring the least calculation time. The simulated spectrometer exhibited a modulation transfer function over 80% at the Nyquist frequency, root-mean-square spot diameters under 8.6 μm, and a spectral resolution of 3.2 nm. The final design and realization of a high-resolution Offner spectrometer were demonstrated based on the simulation results. The equation and analytical design procedure shown here can be applied to most Offner systems regardless of the wavelength range.
Yu, Shaode; Dai, Guangzhe; Wang, Zhaoyang; Li, Leida; Wei, Xinhua; Xie, Yaoqin
2018-05-16
Quality assessment of medical images is closely related to quality assurance, image interpretation and decision making. For magnetic resonance (MR) images, the signal-to-noise ratio (SNR) is routinely used as a quality indicator, yet little is known about its consistency across different observers. In total, 192, 88, 76 and 55 brain images were acquired using T2*, T1, T2 and contrast-enhanced T1 (T1C) weighted MR imaging sequences, respectively. For each imaging protocol, the consistency of SNR measurement was verified between and within two observers, with white matter (WM) and cerebral spinal fluid (CSF) alternately used as the tissue region of interest (TOI) for SNR measurement. The procedure was repeated on another day within 30 days. First, overlapping voxels in TOIs were quantified with the Dice index. Then, test-retest reliability was assessed in terms of the intra-class correlation coefficient (ICC). After that, four models (BIQI, BLIINDS-II, BRISQUE and NIQE), primarily used for the quality assessment of natural images, were borrowed to predict the quality of MR images. Finally, the correlation between SNR values and predicted results was analyzed. For the same TOI in each MR imaging sequence, less than 6% of voxels overlapped between manual delineations. In the quality estimation of MR images, statistical analysis indicated no significant difference between observers (Wilcoxon rank sum test, p_w ≥ 0.11; paired-sample t test, p_p ≥ 0.26), and good to very good intra- and inter-observer reliability was found (ICC, p_icc ≥ 0.74). Furthermore, the Pearson correlation coefficient (r_p) suggests that SNR_wm correlates strongly with BIQI, BLIINDS-II and BRISQUE in T2* (r_p ≥ 0.78), BRISQUE and NIQE in T1 (r_p ≥ 0.77), BLIINDS-II in T2 (r_p ≥ 0.68), and BRISQUE and NIQE in T1C (r_p ≥ 0.62) weighted MR images, while SNR_csf correlates strongly with BLIINDS-II in T2* (r_p ≥ 0.63) and in T
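As a sketch of the SNR indicator discussed above, one common convention divides the mean signal in the tissue of interest by the standard deviation in a reference region. The exact definition used in the paper may differ; the arrays here are toy data:

```python
import numpy as np

def snr(image, toi_mask, noise_mask):
    """SNR = mean signal in the tissue region of interest (TOI)
    divided by the SD in a reference/noise region (one common
    convention; the paper's exact definition may differ)."""
    return image[toi_mask].mean() / image[noise_mask].std()

# toy 2x4 'image': uniform signal on the left, noise on the right
img = np.array([[10.0, 10.0, 1.0, 3.0],
                [10.0, 10.0, 3.0, 1.0]])
toi = np.zeros_like(img, bool);   toi[:, :2] = True
noise = np.zeros_like(img, bool); noise[:, 2:] = True
print(snr(img, toi, noise))
```

The inter-observer question the paper studies then amounts to how much this number moves when two observers draw `toi` differently.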
Consistency and standardization of color in medical imaging: a consensus report.
Badano, Aldo; Revie, Craig; Casertano, Andrew; Cheng, Wei-Chung; Green, Phil; Kimpe, Tom; Krupinski, Elizabeth; Sisson, Christye; Skrøvseth, Stein; Treanor, Darren; Boynton, Paul; Clunie, David; Flynn, Michael J; Heki, Tatsuo; Hewitt, Stephen; Homma, Hiroyuki; Masia, Andy; Matsui, Takashi; Nagy, Balázs; Nishibori, Masahiro; Penczek, John; Schopf, Thomas; Yagi, Yukako; Yokoi, Hideto
2015-02-01
This article summarizes the consensus reached at the Summit on Color in Medical Imaging held at the Food and Drug Administration (FDA) on May 8-9, 2013, co-sponsored by the FDA and ICC (International Color Consortium). The purpose of the meeting was to gather information on how color is currently handled by medical imaging systems to identify areas where there is a need for improvement, to define objective requirements, and to facilitate consensus development of best practices. Participants were asked to identify areas of concern and unmet needs. This summary documents the topics that were discussed at the meeting and recommendations that were made by the participants. Key areas identified where improvements in color would provide immediate tangible benefits were those of digital microscopy, telemedicine, medical photography (particularly ophthalmic and dental photography), and display calibration. Work in these and other related areas has been started within several professional groups, including the creation of the ICC Medical Imaging Working Group.
Li, Yusheng; Defrise, Michel; Metzler, Scott D.; Matej, Samuel
2015-08-01
In positron emission tomography (PET) imaging, attenuation correction with accurate attenuation estimation is crucial for quantitative patient studies. Recent research showed that the attenuation sinogram can be determined up to a scaling constant utilizing the time-of-flight information. The TOF-PET data can be naturally and efficiently stored in a histo-image without information loss, and the radioactive tracer distribution can be efficiently reconstructed using the DIRECT approaches. In this paper, we explore transmission-less attenuation estimation from TOF-PET histo-images. We first present the TOF-PET histo-image formation and the consistency equations in the histo-image parameterization, then we derive a least-squares solution for estimating the directional derivatives of the attenuation factors from the measured emission histo-images. Finally, we present a fast solver to estimate the attenuation factors from their directional derivatives using the discrete sine transform and fast Fourier transform while considering the boundary conditions. We find that the attenuation histo-images can be uniquely determined from the TOF-PET histo-images by considering boundary conditions. Since the estimate of the attenuation directional derivatives can be inaccurate for LORs tangent to the patient boundary, external sources, e.g. a ring or annulus source, might be needed to give an accurate estimate of the attenuation gradient for such LORs. The attenuation estimation from TOF-PET emission histo-images is demonstrated using simulated 2D TOF-PET data.
Assessing the consistency of UAV-derived point clouds and images acquired at different altitudes
Ozcan, O.
2016-12-01
Unmanned Aerial Vehicles (UAVs) offer several advantages in terms of cost and image resolution compared with terrestrial photogrammetry and satellite remote sensing systems. UAVs, which bridge the gap between satellite-scale and field-scale applications, are now used in many areas to acquire hyperspatial, high-temporal-resolution imagery, since they can cover a site in far less time than conventional photogrammetry methods. UAVs have been used in various fields, such as the creation of 3-D earth models, production of high-resolution orthophotos, network planning, and monitoring of fields and agricultural land. Geometric accuracy of orthophotos and volumetric accuracy of point clouds are therefore of capital importance for land surveying applications. Correspondingly, Structure from Motion (SfM) photogrammetry, which is frequently used in conjunction with UAVs, has recently appeared in the environmental sciences as an impressive tool allowing the creation of 3-D models from unstructured imagery. In this study, we aimed to assess the spatial accuracy of the images acquired from the integrated digital camera and the volumetric accuracy of Digital Surface Models (DSMs) derived from UAV flight plans at different altitudes using the SfM methodology. Low-altitude multispectral overlapping aerial photography was collected at altitudes of 30 to 100 meters and georeferenced with RTK-GPS ground control points. These altitudes allow hyperspatial imagery with resolutions of 1-5 cm, depending upon the sensor being used. Preliminary results revealed that the vertical comparison of UAV-derived point clouds with respect to GPS measurements showed average distances at the cm level. Larger values are found in areas with instantaneous changes in the surface.
Self-consistent depth profiling and imaging of GaN-based transistors using ion microbeams
Energy Technology Data Exchange (ETDEWEB)
Redondo-Cubero, A., E-mail: andres.redondo@uam.es [IPFN, Instituto Superior Técnico, Campus Tecnológico e Nuclear, Universidade de Lisboa, 2686-953 Bobadela (Portugal); Departamento de Física Aplicada y Centro de Micro-Análisis de Materiales, Universidad Autónoma de Madrid, 28049 Madrid (Spain); Corregidor, V. [IPFN, Instituto Superior Técnico, Campus Tecnológico e Nuclear, Universidade de Lisboa, 2686-953 Bobadela (Portugal); Vázquez, L. [Instituto de Ciencia de Materiales de Madrid, Consejo Superior de Investigaciones Científicas, 28049 Madrid (Spain); Alves, L.C. [C2TN, Instituto Superior Técnico, Campus Tecnológico e Nuclear, Universidade de Lisboa, 2686-953 Bobadela (Portugal)
2015-04-01
Using an ion microprobe, a comprehensive lateral and in-depth characterization of a single GaN-based high-electron-mobility transistor was carried out by means of Rutherford backscattering spectrometry (RBS) in combination with particle-induced X-ray emission (PIXE). Elemental distributions were obtained for every individual section of the device (wafer, gate, and source contact), identifying the basic constituents of the transistor (including detection of the passivant layer) and checking its homogeneity. A self-consistent analysis of each individual region of the transistor was carried out with a simultaneous fit of the RBS and PIXE spectra under two different beam conditions. Following this approach, quantification of the atomic content and the layer thicknesses was successfully achieved, overcoming the mass-depth ambiguity of certain elements.
Contextual consistency facilitates long-term memory of perceptual detail in barely seen images.
Gronau, Nurit; Shachar, Meytal
2015-08-01
It is long known that contextual information affects memory for an object's identity (e.g., its basic level category), yet it is unclear whether schematic knowledge additionally enhances memory for the precise visual appearance of an item. Here we investigated memory for visual detail of merely glimpsed objects. Participants viewed pairs of contextually related and unrelated stimuli, presented for an extremely brief duration (24 ms, masked). They then performed a forced-choice memory-recognition test for the precise perceptual appearance of 1 of 2 objects within each pair (i.e., the "memory-target" item). In 3 experiments, we show that memory-target stimuli originally appearing within contextually related pairs are remembered better than targets appearing within unrelated pairs. These effects are obtained whether the target is presented at test with its counterpart pair object (i.e., when reiterating the original context at encoding) or whether the target is presented alone, implying that the contextual consistency effects are mediated predominantly by processes occurring during stimulus encoding, rather than during stimulus retrieval. Furthermore, visual detail encoding is improved whether object relations involve implied action or not, suggesting that, contrary to some prior suggestions, action is not a necessary component for object-to-object associative "grouping" processes. Our findings suggest that during a brief glimpse, but not under long viewing conditions, contextual associations may play a critical role in reducing stimulus competition for attention selection and in facilitating rapid encoding of sensory details. Theoretical implications with respect to classic frame theories are discussed. (PsycINFO Database Record (c) 2015 APA, all rights reserved).
International Nuclear Information System (INIS)
Ceylan, C; Heide, U A van der; Bol, G H; Lagendijk, J J W; Kotte, A N T J
2005-01-01
Registration of different imaging modalities such as CT, MRI, functional MRI (fMRI), positron (PET) and single photon (SPECT) emission tomography is used in many clinical applications. Determining the quality of any automatic registration procedure has been challenging because no gold standard is available to evaluate the registration. In this note we present a method, called the 'multiple sub-volume registration' (MSR) method, for assessing the consistency of a rigid registration. This is done by registering sub-images of one data set on the other data set, performing a crude non-rigid registration. By analysing the deviations (local deformations) of the sub-volume registrations from the full registration we obtain a measure of the consistency of the rigid registration. Registration of 15 data sets, including CT, MR and PET images of the brain, head and neck, cervix, prostate and lung, was performed utilizing a rigid-body registration with normalized mutual information as the similarity measure. The resulting registrations were classified as good or bad by visual inspection, and were also classified using our MSR method. The results of our MSR method agree with the classification obtained from visual inspection for all cases (p < 0.02 based on ANOVA of the good and bad groups). The proposed method is independent of the registration algorithm and similarity measure. It can be used for multi-modality image data sets and different anatomic sites of the patient.
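The consistency measure at the heart of MSR can be sketched as the distance between where the full-volume rigid transform and a sub-volume rigid transform map the sub-volume's center. A toy 2-D version with homogeneous matrices (the registration step itself, and the similarity measure, are omitted):

```python
import numpy as np

def deviation(T_global, T_sub, center):
    """Distance between where the full-volume transform and a
    sub-volume transform map the sub-volume's center point.
    Transforms are 3x3 homogeneous matrices (2-D for brevity)."""
    c = np.append(center, 1.0)
    return np.linalg.norm((T_global @ c)[:2] - (T_sub @ c)[:2])

T_full = np.eye(3)                            # identity full registration
T_sub = np.eye(3); T_sub[:2, 2] = [3.0, 4.0]  # sub-volume shifted by (3, 4)
print(deviation(T_full, T_sub, np.array([10.0, 10.0])))
```

Large deviations across many sub-volumes flag an inconsistent (likely bad) rigid registration, which is the classification the note compares with visual inspection.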
Safari, Mohammad Reza; Rowe, Philip; McFadyen, Angus; Buis, Arjan
2013-01-01
Residual limb shape capturing (casting) consistency has a great influence on the quality of socket fit. Magnetic resonance imaging was used to establish a reliable reference grid for intercast and intracast shape and volume consistency of two common casting methods, Hands-off and Hands-on. Residual limbs were cast for twelve people with a unilateral below-knee amputation and scanned twice for each casting concept. Subsequently, all four volume images of each amputee were semiautomatically segmented and registered to a common coordinate system using the tibia, and then the shape and volume differences were calculated. The results show that both casting methods have intracast volume consistency and there is no significant volume difference between the two methods. Inter- and intracast mean volume differences were not clinically significant based on the volume-of-one-sock criterion. Neither the Hands-off nor the Hands-on method resulted in a consistent residual limb shape, as the coefficient of variation of shape differences was high. The resultant shape of the residual limb in Hands-off casting was variable, but the differences were not clinically significant. For Hands-on casting, shape differences were equal to the maximum acceptable limit for a poor socket fit.
Mahajan, Dhruv; Ramamoorthi, Ravi; Curless, Brian
2008-02-01
This paper develops a theory of frequency domain invariants in computer vision. We derive novel identities using spherical harmonics, which are the angular frequency domain analog to common spatial domain invariants such as reflectance ratios. These invariants are derived from the spherical harmonic convolution framework for reflection from a curved surface. Our identities apply in a number of canonical cases, including single and multiple images of objects under the same and different lighting conditions. One important case we consider is two different glossy objects in two different lighting environments. For this case, we derive a novel identity, independent of the specific lighting configurations or BRDFs, that allows us to directly estimate the fourth image if the other three are available. The identity can also be used as an invariant to detect tampering in the images. While this paper is primarily theoretical, it has the potential to lay the mathematical foundations for two important practical applications. First, we can develop more general algorithms for inverse rendering problems, which can directly relight and change material properties by transferring the BRDF or lighting from another object or illumination. Second, we can check the consistency of an image, to detect tampering or image splicing.
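In the spherical-harmonic convolution model, the two-objects/two-lightings identity can be sketched as follows, assuming each image's coefficients factor into a radially symmetric BRDF filter and lighting coefficients (the notation here is illustrative; the paper's derivation is more general):

```latex
% Image of object $i$ under lighting $j$, in spherical-harmonic coefficients:
B^{ij}_{lm} = \rho^{i}_{l}\, L^{j}_{lm}
% hence the identity, independent of the specific BRDFs and lightings,
B^{11}_{lm}\, B^{22}_{lm} = B^{12}_{lm}\, B^{21}_{lm},
% which lets the fourth image be estimated from the other three:
B^{22}_{lm} = \frac{B^{12}_{lm}\, B^{21}_{lm}}{B^{11}_{lm}}.
```

A measured fourth image whose coefficients violate this product relation is evidence of tampering or splicing, which is how the identity doubles as a consistency check.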
Energy Technology Data Exchange (ETDEWEB)
Christianson, O; Winslow, J; Samei, E [Duke University Medical Center, Durham, NC (United States)
2014-06-15
Purpose: One of the principal challenges of clinical imaging is to achieve an ideal balance between image quality and radiation dose across multiple CT models. The number of scanners and protocols at large medical centers necessitates an automated quality assurance program to facilitate this objective. Therefore, the goal of this work was to implement an automated CT image quality and radiation dose monitoring program based on actual patient data and to use this program to assess consistency of protocols across CT scanner models. Methods: Patient CT scans are routed to a HIPAA compliant quality assurance server. CTDI, extracted using optical character recognition, and patient size, measured from the localizers, are used to calculate SSDE. A previously validated noise measurement algorithm determines the noise in uniform areas of the image across the scanned anatomy to generate a global noise level (GNL). Using this program, 2358 abdominopelvic scans acquired on three commercial CT scanners were analyzed. Median SSDE and GNL were compared across scanner models, and trends in SSDE and GNL with patient size were used to determine the impact of differing automatic exposure control (AEC) algorithms. Results: There was a significant difference in both SSDE and GNL across scanner models (9–33% and 15–35% for SSDE and GNL, respectively). Adjusting all protocols to achieve the same image noise would reduce patient dose by 27–45% depending on scanner model. Additionally, differences in AEC methodologies across vendors resulted in disparate relationships of SSDE and GNL with patient size. Conclusion: The difference in noise across scanner models indicates that protocols are not optimally matched to achieve consistent image quality. Our results indicate substantial potential for dose reduction while achieving more consistent image appearance. Finally, the difference in AEC methodologies suggests the need for size-specific CT protocols to minimize variability in image
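A simplified stand-in for the noise measurement described above: compute a local standard-deviation map over the image and summarize it by its histogram mode to obtain a single global noise level. The window size, bin count, and synthetic test image below are illustrative assumptions, not the validated algorithm's parameters:

```python
import numpy as np
from scipy.ndimage import uniform_filter

def global_noise_level(image, size=7, bins=100):
    """Mode of the local-SD map as a global noise level (GNL) proxy.
    Local SD comes from the identity Var = E[x^2] - E[x]^2 over a
    sliding window; the mode favors the uniform regions of anatomy."""
    img = image.astype(float)
    m = uniform_filter(img, size)
    m2 = uniform_filter(img ** 2, size)
    local_sd = np.sqrt(np.clip(m2 - m * m, 0.0, None))
    hist, edges = np.histogram(local_sd.ravel(), bins=bins)
    i = hist.argmax()
    return 0.5 * (edges[i] + edges[i + 1])

# synthetic uniform phantom with Gaussian noise of SD 5
rng = np.random.default_rng(0)
img = 100 + 5 * rng.standard_normal((128, 128))
gnl = global_noise_level(img)
print(abs(gnl - 5.0) < 1.5)
```

On real scans the mode, rather than the mean, keeps edges and textured anatomy from inflating the estimate, which is the property that makes a GNL comparable across protocols.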
International Nuclear Information System (INIS)
Christianson, O; Winslow, J; Samei, E
2014-01-01
Iwai, Daiki; Suganami, Haruka; Hosoba, Minoru; Ohno, Kazuko; Emoto, Yutaka; Tabata, Yoshito; Matsui, Norihisa
2013-03-01
Color image consistency has not yet been achieved, apart from the Digital Imaging and Communications in Medicine (DICOM) Supplement 100, which implements a color reproduction pipeline and device-independent color spaces. Thus, most healthcare enterprises cannot check monitor degradation routinely. To ensure color consistency in medical color imaging, monitor color calibration should be introduced. Using a simple color calibration device, the chromaticity of typical colors (Red, Green, Blue and White) is measured as device-independent profile connection space values, called u'v', before and after calibration. In addition, clinical color images are displayed and visual differences are observed. In color calibration, the monitor brightness level has to be set to the rather low value of 80 cd/m² according to the sRGB standard. As most color monitors currently available for medical use have a maximum brightness much higher than 80 cd/m², it does not seem appropriate to use the 80 cd/m² level for calibration. Therefore, we propose that a new brightness standard be introduced while maintaining the color representation in clinical use. To evaluate the effect of brightness on chromaticity experimentally, the brightness level of two monitors was varied from 80 to 270 cd/m² and the chromaticity values were compared at each brightness level. As a result, there were no significant differences in the chromaticity diagram when the brightness levels were changed. In conclusion, chromaticity is close to the theoretical value after color calibration, and chromaticity does not shift when brightness is changed. These results indicate that the optimized reference brightness level for clinical use could be set at a high brightness on current monitors.
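The u'v' values mentioned above are the CIE 1976 uniform chromaticity coordinates, derived from the XYZ tristimulus values a calibration probe reports. A short sketch (the D65 white point below is an illustrative input, not data from this study):

```python
def uv_prime(X: float, Y: float, Z: float) -> tuple:
    """CIE 1976 u'v' chromaticity coordinates from tristimulus XYZ,
    the device-independent values a colorimeter probe reports."""
    denom = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / denom, 9.0 * Y / denom

# D65 white point tristimulus values (illustrative input):
u, v = uv_prime(95.047, 100.0, 108.883)  # roughly (0.198, 0.468)
```

Because u'v' is a ratio of tristimulus values, a uniform change in luminance cancels out, which is consistent with the abstract's finding that chromaticity does not shift with brightness.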
Energy Technology Data Exchange (ETDEWEB)
Kang, H; Malin, M; Chmura, S; Hasan, Y; Al-Hallaq, H [The Department of Radiation and Cellular Oncology, The University of Chicago Medicine, Chicago, IL (United States)
2016-06-15
Purpose: For African-American patients receiving breast radiotherapy with a bolus, skin darkening can affect surface visualization when using optical imaging for daily positioning and gating at deep-inspiration breath holds (DIBH). Our goal is to identify a region of interest (ROI) that is robust against deteriorating surface image quality due to skin darkening. Methods: We study four patients whose post-mastectomy surfaces are imaged daily with AlignRT (VisionRT, UK) for DIBH radiotherapy and whose surface image quality is degraded toward the end of treatment. To simulate the effects of skin darkening, surfaces from the first ten fractions of each patient are systematically degraded by 25–35%, 40–50% and 65–75% of the total area of the clinically used ROI-ipsilateral-chestwall. The degraded surfaces are registered to the reference surface in six degrees of freedom. To identify a robust ROI, three additional reference ROIs (ROI-chest+abdomen, ROI-bilateral-chest and ROI-extended-ipsilateral-chestwall) are created and registered to the degraded surfaces. Differences in registration using these ROIs are compared to that using ROI-ipsilateral-chestwall. Results: For three patients, the deviations in the registrations to ROI-ipsilateral-chestwall are > 2.0, 3.1 and 7.9 mm on average for 25–35%, 40–50% and 65–75% degraded surfaces, respectively. Rotational deviations reach 11.1° in pitch. For the last patient, registration is consistent to within 2.6 mm even on the 65–75% degraded surfaces, possibly because the surface topography has more distinct features. For ROI-bilateral-chest and ROI-extended-ipsilateral-chestwall, registrations deviate in a similar pattern. However, registration on ROI-chest+abdomen is robust to deteriorating image quality to within 4.2 mm for all four patients. Conclusion: Registration deviations using ROI-ipsilateral-chestwall can reach 9.8 mm on the 40–50% degraded surfaces. Caution is required when using AlignRT for patients
WE-FG-207B-10: Dual-Energy CT Monochromatic Image Consistency Across Vendors and Platforms
Energy Technology Data Exchange (ETDEWEB)
Jacobsen, M; Wood, C; Cody, D [UT MD Anderson Cancer Center, Houston, TX (United States)
2016-06-15
Purpose: Although dual-energy CT provides improved sensitivity of HU for certain tissue types at lower simulated energy levels, if these values vary by scanner type they may impact clinical patient management decisions. Each manufacturer has selected a specific dual-energy CT approach (or in one case, three different approaches); understanding HU variability among low monochromatic images may be required when more than one dual-energy CT scanner type is available for use. Methods: A large elliptical dual-energy quality control phantom (Gammex Inc.; Middleton, WI) containing several standard tissue type materials was scanned at least three times on each of the following systems: GE HD750, prototype GE Revolution CT with GSI, Siemens Flash, Siemens Edge, Siemens AS 128, and Philips IQon. Images were generated at 50, 70, and 140 keV. Soft tissue and iodine HU were measured on a single central 5 mm-thick image; NIST constants were used to calculate the ideal HU for each material. Scan acquisitions were approximately dose-matched (∼25 mGy CTDIvol) and image parameters were held as consistent as possible (thickness, kernel, no noise reduction). Results: Measured soft tissue (29 HU at 120 kVp) varied from 28 HU to 44 HU at 50 keV (excluding one outlier), from 21 HU to 31 HU at 70 keV, and from 19 HU to 32 HU at 140 keV. Measured iodine (5 mg/ml, 106 HU at 120 kVp) varied from 246 HU to 280 HU at 50 keV, from 123 HU to 129 HU at 70 keV, and from 22 HU to 32 HU at 140 keV. Conclusion: Measured HU in standard rods across 3 dual-energy CT manufacturers and 6 scanner models varied directly with monochromatic level, with the most variability observed at 50 keV and the least at 70 keV. Future work will include additional scanner platforms and an assessment of how measurement variability impacts radiologists. This research has been supported by funds from Dr. William Murphy, Jr., the John S. Dunn, Sr. Distinguished Chair in Diagnostic Imaging at MD Anderson Cancer Center.
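The "ideal HU" values computed from NIST constants follow the standard Hounsfield definition relative to water at the monochromatic energy of interest. A minimal sketch (the example coefficients are hypothetical placeholders, not NIST data):

```python
def hounsfield(mu_material: float, mu_water: float) -> float:
    """Ideal HU for a material given linear attenuation coefficients
    (e.g. from NIST tables) evaluated at the same monochromatic energy:
    HU = 1000 * (mu - mu_water) / mu_water."""
    return 1000.0 * (mu_material - mu_water) / mu_water

# Hypothetical coefficients in cm^-1 at some keV; water maps to 0 HU by definition,
# and a material attenuating 10% more than water maps to +100 HU.
assert hounsfield(0.2, 0.2) == 0.0
```

Because mu for iodine rises steeply toward its K-edge, small differences in how vendors synthesize low-keV monochromatic images translate into the large 50 keV HU spread reported above.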
International Nuclear Information System (INIS)
Patton, T; Du, K; Bayouth, J; Christensen, G; Reinhardt, J
2014-01-01
Purpose: Four-dimensional computed tomography (4DCT) can be used to evaluate longitudinal changes in pulmonary function. The sensitivity of such measurements to identify function change may be improved with reproducible breathing patterns. The purpose of this study was to determine if inhale was more consistent than exhale, i.e., lung expansion during inhalation compared to lung contraction during exhalation. Methods: Repeat 4DCT image data were acquired within a short time interval from 8 patients. Using a tissue-volume-preserving deformable image registration algorithm, Jacobian ventilation maps for the two scanning sessions were computed and compared in the same coordinate system for reproducibility analysis. Equivalent lung volumes (ELV) were used for 5 subjects and equivalent tidal volumes (ETV) for the 3 subjects who experienced a baseline shift between scans. In addition, the gamma pass rate was calculated from a modified gamma index evaluation between the two ventilation maps, using acceptance criteria of 2 mm distance-to-agreement and 5% ventilation difference. The gamma pass rates were then compared using a paired t-test to determine if there was a significant difference. Results: Inhalation was more reproducible than exhalation. In the 5 ELV subjects, 78.5% of the lung voxels met the gamma criteria for expansion during inhalation when comparing the two scans, while significantly fewer (70.9% of the lung voxels) met the gamma criteria for contraction during exhalation (p = .027). In the 8 total subjects analyzed, the average gamma pass rate was 75.2% for expansion during inhalation and 70.3% for contraction during exhalation, a difference that trended towards significance (p = .064). Conclusion: This work implies inhalation is more reproducible than exhalation when equivalent respiratory volumes are considered. The reason for this difference is unknown. Longitudinal investigation of pulmonary function change based on inhalation images appears appropriate for Jacobian-based measure of
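The modified gamma evaluation (2 mm distance-to-agreement, 5% ventilation difference) can be sketched in simplified 1-D form. This is an illustration of the gamma concept, not the authors' implementation:

```python
import numpy as np

def gamma_pass_rate(ref, test, spacing_mm=1.0, dta_mm=2.0, diff=0.05):
    """Simplified 1-D gamma evaluation between two ventilation maps.
    For each reference point, gamma is the minimum over test points of
    sqrt((dr/DTA)^2 + (dv/diff)^2); a point passes if gamma <= 1."""
    ref, test = np.asarray(ref, float), np.asarray(test, float)
    n = len(ref)
    positions = np.arange(n) * spacing_mm
    passed = 0
    for i in range(n):
        dr = (positions - positions[i]) / dta_mm        # spatial term
        dv = (test - ref[i]) / diff                     # ventilation term
        if np.sqrt(dr**2 + dv**2).min() <= 1.0:
            passed += 1
    return passed / n

rate = gamma_pass_rate([1.0, 1.1, 1.2], [1.0, 1.1, 1.2])  # identical maps pass fully
```

The pass rate is the fraction of voxels with gamma at or below 1, matching how the 78.5% and 70.9% figures above are reported.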
International Nuclear Information System (INIS)
Sun Hongliang; Xu Yanyan; Hu Yingying; Tian Yuanjiang; Wang Wu
2014-01-01
Objective: To determine the consistency between quantitative CT perfusion measurements of colorectal cancer obtained from a single section with maximal tumor dimension and from the average of the whole tumor, and to compare intra- and inter-observer consistency of the two analysis methods. Methods: Twenty-two patients with histologically proven colorectal cancer were examined prospectively with 256-slice CT and whole-tumor perfusion images were obtained. Perfusion parameters were obtained from a region of interest (ROI) inserted in the single section showing maximal tumor dimension, then from ROIs inserted in all tumor-containing sections, by two radiologists. Consistency between values of blood flow (BF), blood volume (BV) and time to peak (TTP) calculated by the two methods was assessed. Intra-observer consistency was evaluated by comparing repeated measurements done by the same radiologist using both methods after 3 months. Perfusion measurements were done by another radiologist independently to assess inter-observer consistency of both methods. The results from the different methods were compared using the paired t test and Bland-Altman plots. Results: Twenty-two patients were examined successfully. The perfusion parameters BF, BV and TTP obtained by whole-tumor perfusion and single-section analysis were (35.59 ± 14.59) ml·min⁻¹·100 g⁻¹, (17.55 ± 4.21) ml·100 g⁻¹, (21.30 ± 7.57) s and (34.64 ± 13.29) ml·min⁻¹·100 g⁻¹, (17.61 ± 6.39) ml·100 g⁻¹, (19.82 ± 9.01) s, respectively. No significant differences were observed between the means of the perfusion parameters (BF, BV, TTP) calculated by the two methods (t = 0.218, -0.033, -0.668, P > 0.05, respectively). The intra-observer 95% limits of consistency of the perfusion parameters were BF -5.3% to 10.0%, BV -13.8% to 10.8%, TTP -15.0% to 12.6% with whole-tumor analysis, and BF -14.3% to 16.5%, BV -24.2% to 22.2%, TTP -19.0% to 16.1% with single-section analysis. The inter-observer 95% limits of
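The 95% limits of consistency reported above are Bland-Altman limits of agreement: mean difference plus or minus 1.96 standard deviations of the differences. A minimal sketch (for brevity this uses raw rather than percentage differences, and the input numbers are illustrative):

```python
import numpy as np

def bland_altman_limits(a, b):
    """95% limits of agreement between two measurement methods
    (Bland & Altman): bias +/- 1.96 * SD of the paired differences."""
    d = np.asarray(a, float) - np.asarray(b, float)
    bias = d.mean()
    half_width = 1.96 * d.std(ddof=1)
    return bias - half_width, bias + half_width

# Illustrative paired BF-style measurements from two methods:
lo, hi = bland_altman_limits([35.6, 17.6, 21.3, 30.0], [34.6, 17.6, 19.8, 31.0])
```

Narrower limits, as seen here for the whole-tumor analysis, indicate better measurement consistency.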
Tsai, Shang-Yueh; Hsu, Yi-Cheng; Chu, Ying-Hua; Kuo, Wen-Jui; Lin, Fa-Hsuan
2015-12-01
One major challenge of MRSI is the poor signal-to-noise ratio (SNR), which can be improved by using a surface coil array. Here we propose to exploit the spatial sensitivity of different channels of a coil array to enforce the k-space data consistency (DC) in order to suppress noise and consequently to improve MRSI SNR. MRSI data were collected using a proton echo planar spectroscopic imaging (PEPSI) sequence at 3 T using a 32-channel coil array and were averaged with one, two and eight measurements (avg-1, avg-2 and avg-8). The DC constraint was applied using a regularization parameter λ of 1, 2, 3, 5 or 10. Metabolite concentrations were quantified using LCModel. Our results show that the suppression of noise by applying the DC constraint to PEPSI reconstruction yields up to 32% and 27% SNR gain for avg-1 and avg-2 data with λ = 5, respectively. According to the reported Cramer-Rao lower bounds, the improvement in metabolic fitting was significant (p < 0.01) when the DC constraint was applied with λ ≥ 2. Using the DC constraint with λ = 3 or 5 can minimize both root-mean-square errors and spatial variation for all subjects using the avg-8 data set as reference values. Our results suggest that MRSI reconstructed with a DC constraint can save around 70% of scanning time to obtain images and spectra with similar SNRs using λ = 5. Copyright © 2015 John Wiley & Sons, Ltd.
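The data-consistency (DC) constraint with regularization weight λ can be pictured as a Tikhonov-regularized least-squares solve. A toy sketch under that assumption (the actual PEPSI reconstruction operates on multi-channel k-space with coil sensitivities, which this omits):

```python
import numpy as np

def dc_regularized(A, y, lam):
    """Tikhonov-style sketch of a regularized data-consistency solve:
    x = argmin ||A x - y||^2 + lam * ||x||^2,
    whose normal equations are (A^T A + lam I) x = A^T y."""
    AtA = A.T @ A
    n = AtA.shape[0]
    return np.linalg.solve(AtA + lam * np.eye(n), A.T @ y)

# With A = I and lam = 1 the solution is simply y / 2, showing how the
# weight lam trades fidelity to the measured data against noise suppression.
x = dc_regularized(np.eye(2), np.array([2.0, 4.0]), 1.0)
```

Increasing λ suppresses more noise at the cost of pulling the solution away from the raw measurements, which matches the abstract's search over λ ∈ {1, 2, 3, 5, 10}.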
Onuki, Yoshinori; Funatani, Chiaki; Yamamoto, Yoshihisa; Fukami, Toshiro; Koide, Tatsuo; Hayashi, Yoshihiro; Takayama, Kozo
2017-01-01
A moisturizing cream mixed with a steroid ointment is frequently prescribed to patients suffering from atopic dermatitis. However, there is a concern that the mixing operation causes destabilization. The present study was performed to investigate the stability of such preparations closely using magnetic resonance imaging (MRI). As sample preparations, five commercial moisturizing creams that are popular in Japan were mixed with an ointment base, a white petrolatum, at a volume ratio of 1 : 1. The mixed preparations were stored at 60°C to accelerate the destabilization processes. Subsequently, the phase separations induced by the storage test were monitored using MRI. Using advanced MR technologies, including spin-spin relaxation time (T2) mapping and MR spectroscopy, we successfully characterized the phase-separation behavior of the test samples. For most samples, phase separations developed by the bleeding of liquid oil components. For a sample consisting of an oil-in-water-type cream, Urepearl Cream 10%, a distinct phase-separation mode was observed, which was initiated by the aqueous component separating from the bottom part of the sample. The resultant phase separation was the most distinct among the test samples. To investigate the phase separation quantitatively and objectively, we conducted a histogram analysis on the acquired T2 maps. The water-in-oil type creams were found to be much more stable after mixing with the ointment base than the oil-in-water type creams. This finding strongly supported the validity of the mixing operation traditionally conducted in pharmacies.
Adegoke, Alfred A.
Noting the lack of self-concept research in nonwestern cultures and the need to determine if a western measure of self-image is embedded in the same network of constructs in another culture, this study examined the appropriateness of using the Self Image Questionnaire for Young Adolescents (SIQYA) with Nigerian students. Participating in the study…
Vessel, Edward A; Biederman, Irving; Subramaniam, Suresh; Greene, Michelle R
2016-07-01
An L-vertex, the point at which two contours coterminate, provides highly reliable evidence that a surface terminates at that vertex, thus providing the strongest constraint on the extraction of shape from images (Guzman, 1968). Such vertices are pervasive in our visual world but the importance of a statistical regularity about them has been underappreciated: The contours defining the vertex are (almost) always of the same direction of contrast with respect to the background (i.e., both darker or both lighter). Here we show that when the two contours are of different directions of contrast, the capacity of the L-vertex to signal the termination of a surface, as reflected in object recognition, is markedly reduced. Although image statistics have been implicated in determining the connectivity in the earliest cortical visual stage (V1) and in grouping during visual search, this finding provides evidence that such statistics are involved in later stages where object representations are derived from two-dimensional images.
Thompson, William; Stern, Lewis; Ferranti, Dave; Huynh, Chuong; Scipioni, Larry; Notte, John; Sanford, Colin
2010-06-01
Recent helium ion microscope (HIM) imaging studies have shown the strong sensitivity of HIM-induced secondary electron (SE) yields [1] to the sample's physical and chemical properties and to its surface topography. This SE yield sensitivity is due to the low recoil energy of the HIM-initiated electrons and their resulting short mean free path. Additionally, a material's SE escape probability is modulated by changes in the material's work function and surface potential. Due to the escape electrons' roughly 2 eV mean energy and their nanometer-range mean free path, HIM SE mode image contrast has significant material and surface sensitivity. The latest generation of HIM has a 0.35 nanometer resolution specification and is equipped with a plasma cleaning process to mitigate the effects of hydrocarbon contamination. However, for surfaces that may have native oxide chemistries influencing the secondary electron yield, a new process of low-energy, shallow-angle argon sputtering was evaluated. The intent of this work was to study the effect of removing pre-existing native oxides and any in-situ deposited surface contaminants. We will introduce the sputter yield predictions of two established computer models and the sputter yield and sample modification forecasts of the molecular dynamics program Kalypso. We will review the experimental technique applied to copper samples and show the copper grain contrast improvement that resulted when argon-cleaned samples were imaged in HIM SE mode.
International Nuclear Information System (INIS)
Wang, G.; Gao, J.; Zhao, S.; Sun, X.; Chen, X.; Cui, X.
2014-01-01
Aim: To develop a quantitative body mass index (BMI)-dependent tube voltage and tube current selection method for obtaining consistent image quality and overall dose reduction in computed tomography coronary angiography (CTCA). Methods and materials: The images of 190 consecutive patients (group A) who underwent CTCA with fixed protocols (100 kV/193 mAs for 100 patients with a BMI of <27 and 120 kV/175 mAs for 90 patients with a BMI of >27) were retrospectively analysed and reconstructed with an adaptive statistical iterative reconstruction (ASIR) algorithm at 50% blending. Image noise was measured and the relationship to BMI was studied to establish BMI-dependent tube current for obtaining CTCA images with user-specified image noise. One hundred additional cardiac patients (group B) were examined using prospective triggering with the BMI-dependent tube voltage/current. CTCA image-quality score, image noise, and effective dose from groups B and C (subgroup of A of 100 patients examined with prospective triggering only) were obtained and compared. Results: There was a linear relationship between image noise and BMI in group A. Using a BMI-dependent tube current in group B, an average CTCA image noise of 27.7 HU (target 28 HU) and 31.7 HU (target 33 HU) was obtained for the subgroups of patients with BMIs of >27 and of <27, respectively, and was independent of patient BMI. There was no difference between image-quality scores between groups B and C (4.52 versus 4.60, p > 0.05). The average effective dose for group B (2.56 mSv) was 42% lower than group C (4.38 mSv; p < 0.01). Conclusion: BMI-dependent tube voltage/current selection in CTCA provides an individualized protocol that generates consistent image quality and helps to reduce overall patient radiation dose.
Highlights:
• BMI-dependent kVp and mA selection method may be established in CCTA.
• BMI-dependent kVp and mA enables consistent CCTA image quality.
• Overall dose reduction of 40% can
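The BMI-dependent current selection relies on quantum noise scaling roughly as 1/√mAs, so the mAs needed to reach a user-specified noise target scales quadratically with the measured-to-target noise ratio. A first-order sketch (illustrative numbers; the study's actual method fit noise against BMI):

```python
def mas_for_target_noise(mas_current: float, noise_current: float,
                         noise_target: float) -> float:
    """Quantum noise scales roughly as 1/sqrt(mAs), so reaching a target
    noise level requires a quadratic adjustment of tube current-time
    product. A first-order sketch; AEC systems add further logic."""
    return mas_current * (noise_current / noise_target) ** 2

# A protocol measuring 24 HU noise at 175 mAs, relaxed to a 28 HU target,
# needs only about 128.6 mAs:
new_mas = mas_for_target_noise(175.0, 24.0, 28.0)
```

Accepting slightly higher but still diagnostic noise is exactly the mechanism behind the dose reductions quantified in this abstract.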
DEFF Research Database (Denmark)
Staunstrup, Jørgen
1998-01-01
This paper proposes that Interface Consistency is an important issue for the development of modular designs. By providing a precise specification of component interfaces, it becomes possible to check that separately developed components use a common interface in a coherent manner, thus avoiding a very significant source of design errors. A wide range of interface specifications is possible; the simplest form is a syntactical check of parameter types. However, today it is possible to do more sophisticated forms involving semantic checks.
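The simplest form of check described above, a syntactical check of parameter types, can be sketched with Python's structural typing. This is only an illustration of the idea, not the paper's formalism:

```python
from typing import Protocol, runtime_checkable

@runtime_checkable
class Channel(Protocol):
    """The precise interface specification both components agree on."""
    def send(self, data: bytes) -> None: ...

class SerialPort:
    """A separately developed component that should conform to Channel."""
    def send(self, data: bytes) -> None:
        pass  # stand-in body; a real driver would transmit here

# Syntactic interface-consistency check: does the component provide
# the methods the shared interface declares?
assert isinstance(SerialPort(), Channel)
```

Note that this runtime check only verifies method presence, not argument types or behavior; the "more sophisticated forms involving semantic checks" the paper mentions would go beyond it.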
International Nuclear Information System (INIS)
Jean-Paul Negre
2011-07-01
This manuscript presents an original concept of a passive, lightweight, and compact dosimeter based on a stack of BaFBr:Eu²⁺ photostimulable phosphor plates (Image Plates, IP) alternating with high-Z metal screens. It describes the manufacture of the dosimeter and the method to calibrate it. This method consists of using a ⁶⁰Co standard source and Monte Carlo N-Particle codes (MCNPX/MCNP5) to extend the calibration to a wide range of radiation energies, such as quasi-monoenergetic radiation (gamma-ray decay of radioactive isotopes, X-ray fluorescence) or continuous radiation spectra (Bremsstrahlung radiation, synchrotron light sources). The measurement recurrence in the stack of 'metallic foil / IP' pairs ensures consistency of the measurements, determines the threshold depth of electronic equilibrium (depending on the radiation energy) and allows us to infer the absolute dose in air (kerma). The depth-dose curve in the stack and transmission measurements provide an estimate of the effective energy of incident radiations, report the presence of parasitic scattered radiation and allow us to discriminate the nature of the ionizing particles. The 2D features of the device are used to characterize the ballistic fate of charged particles in material thickness, which is of great interest with narrow particle beams. This dosimeter has remarkable advantages over other passive dosimeters, including a dynamic range larger than 10⁷ of linear photon dose detection, from less than 0.5 μGy up to several Gy, for radiation energies between a few tens of keV and more than 10 MeV (20 MeV with Bremsstrahlung X-ray spectra). The originality of this concept lies in obtaining the measurement results almost immediately after an exposure, with a single-pass reading of the dosimeter. It can respond positively to most of the usual needs in radiation metrology: personal or environmental dosimetry (radiation protection); controls / characterization / mapping around materials and emitting ionizing radiation devices
Directory of Open Access Journals (Sweden)
Elizângela Moreira Careta Galindo
2007-02-01
This study aimed to translate, adapt and validate the Eating Behaviours and Body Image Test for use with children in a city in upstate São Paulo, Brazil. Study subjects were 261 female students aged 9 to 12 years. The internal consistency of the instrument was evaluated by means of factorial analysis with varimax rotation. This analysis, carried out with the Statistical Package for Social Sciences, version 10.0, revealed two factors. The internal consistency was adequate for the total instrument (Cronbach's alpha = 0.89), and the values were also considered satisfactory for the two factors (alpha = 0.90 and alpha = 0.80, respectively), which demonstrates that the Eating Behaviours and Body Image Test is useful for an early evaluation, tracing attitudes that indicate possible eating behavior disorders. The psychometric characteristics of the original instrument were maintained.
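The Cronbach's alpha values reported above are the standard internal-consistency statistic. A small sketch of the formula (rows are respondents, columns are items; the data below are illustrative only):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)),
    where k is the number of items."""
    scores = np.asarray(scores, float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Perfectly consistent (identical) items yield alpha = 1:
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```

Values above roughly 0.8, like the 0.89, 0.90 and 0.80 reported here, are conventionally read as satisfactory internal consistency.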
Structural Consistency, Consistency, and Sequential Rationality.
Kreps, David M; Ramey, Garey
1987-01-01
Sequential equilibria comprise consistent beliefs and a sequentially rational strategy profile. Consistent beliefs are limits of Bayes rational beliefs for sequences of strategies that approach the equilibrium strategy. Beliefs are structurally consistent if they are rationalized by some single conjecture concerning opponents' strategies. Consistent beliefs are not necessarily structurally consistent, notwithstanding a claim by Kreps and Robert Wilson (1982). Moreover, the spirit of stru...
Graham, Daniel J; Stockinger, Simone; Leder, Helmut
2013-01-01
Alzheimer's disease (AD) causes severe impairments in cognitive function but there is evidence that aspects of esthetic perception are somewhat spared, at least in early stages of the disease. People with early Alzheimer's-related dementia have been found to show similar degrees of stability over time in esthetic judgment of paintings compared to controls, despite poor explicit memory for the images. Here we expand on this line of inquiry to investigate the types of perceptual judgments involved, and to test whether people in later stages of the disease also show evidence of preserved esthetic judgment. Our results confirm that, compared to healthy controls, there is similar esthetic stability in early stage AD in the absence of explicit memory, and we report here that people with later stages of the disease also show similar stability compared to controls. However, while we find that stability for portrait paintings, landscape paintings, and landscape photographs is not different compared to control group performance, stability for face photographs - which were matched for identity with the portrait paintings - was significantly impaired in the AD group. We suggest that partially spared face-processing systems interfere with esthetic processing of natural faces in ways that are not found for artistic images and landscape photographs. Thus, our work provides a novel form of evidence regarding face-processing in healthy and diseased aging. Our work also gives insights into general theories of esthetics, since people with AD are not encumbered by many of the semantic and emotional factors that otherwise color esthetic judgment. We conclude that, for people with AD, basic esthetic judgment of artistic images represents an "island of stability" in a condition that in most other respects causes profound cognitive disruption. As such, esthetic response could be a promising route to future therapies.
Directory of Open Access Journals (Sweden)
Mattia Cattaneo
2016-12-01
Here we provide the correlation among different carotid ultrasound (US) variables to assess echogenicity on standard carotid US and to assess intraplaque neovascularization (IPNV) on contrast-enhanced US (CEUS). We recruited 45 consecutive subjects with an asymptomatic ≥50% carotid artery stenosis. Carotid plaque echogenicity at standard US was visually graded according to the Gray–Weale classification (GW) and measured by the greyscale median (GSM), a semi-automated computerized measurement performed with Adobe Photoshop®. On CEUS imaging, IPNV was graded according to the visual appearance of contrast within the plaque using three different methods: CEUS_A (1 = absent; 2 = present); CEUS_B, a three-point scale (increasing IPNV from 1 to 3); and CEUS_C, a four-point scale (increasing IPNV from 0 to 3). We have also implemented a new simple quantification method derived from the region-of-interest (ROI) signal intensity ratio as assessed by QLAB software. Further information is available in “Contrast-enhanced ultrasound imaging of intraplaque neovascularization and its correlation to plaque echogenicity in human carotid arteries atherosclerosis” (M. Cattaneo, D. Staub, A.P. Porretta, J.M. Gallino, P. Santini, C. Limoni et al., 2016) [1].
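The GSM measurement described above reduces to taking the median grey value inside a plaque region of interest. A minimal sketch (the study used a semi-automated Photoshop workflow; the boolean-mask ROI and 0-255 grey levels here are illustrative assumptions):

```python
# Hypothetical GSM sketch: median grey value of the pixels an ROI mask selects.
# The real measurement pipeline (image normalization, ROI drawing) is not shown.
from statistics import median

def greyscale_median(image, mask):
    """Median grey value (0-255) of the pixels selected by `mask`."""
    pixels = [p for img_row, mask_row in zip(image, mask)
                for p, keep in zip(img_row, mask_row) if keep]
    if not pixels:
        raise ValueError("empty ROI")
    return median(pixels)

# Toy 3x3 image with a 4-pixel plaque ROI
image = [[10, 200, 30],
         [40,  50, 60],
         [70,  80, 90]]
mask  = [[False, True, False],
         [True,  True, False],
         [False, True, False]]
print(greyscale_median(image, mask))  # → 65.0 (median of [200, 40, 50, 80])
```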
Dudek, Kathleen Burke
The noxious weed leafy spurge (Euphorbia esula L.) has spread throughout the northern Great Plains of North America since it was introduced in the early 1800s, and it is currently a significant management concern. Accurate, rapid location and repeatable measurements are critical for successful temporal monitoring of infestations. Imaging spectroscopy is well suited for identification of spurge; however, the development and dissemination of standardized hyperspectral mapping procedures that produce consistent multi-temporal maps have been absent. Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) data, collected in 1999 and 2001 over Theodore Roosevelt National Park, North Dakota, were used to locate leafy spurge. Published image-processing methods were tested to determine the most successful for consistent maps. Best results were obtained using: (1) NDVI masking; (2) cross-track illumination correction; (3) image-derived spectral libraries; and (4) the mixture-tuned matched filtering algorithm. Application of the algorithm was modified to standardize processing and eliminate threshold decisions; the image-derived library was refined to eliminate additional variability. Primary (spurge dominant), secondary (spurge non-dominant), abundance, and area-wide vegetation maps were produced. Map accuracies were analyzed with point, polygon, and grid reference sets, using confusion matrices and regression between field-measured and image-derived abundances. Accuracies were recalculated after applying a majority filter, and buffers ranging from 1-5 pixels wide around classified pixels, to accommodate poor reference-image alignment. Overall accuracy varied from 39% to 82%; however, regression analyses yielded r² = 0.725, indicating a strong relationship between field-measured and image-derived densities. Accuracy was sensitive to: (1) registration offsets between field and image locations; (2) modification of analytical methods; and (3) reference data quality. Sensor viewing angle…
Consistent model driven architecture
Niepostyn, Stanisław J.
2015-09-01
The goal of the MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in a natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g. a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.
Bitcoin Meets Strong Consistency
Decker, Christian; Seidel, Jochen; Wattenhofer, Roger
2014-01-01
The Bitcoin system only provides eventual consistency. For everyday life, the time to confirm a Bitcoin transaction is prohibitively slow. In this paper we propose a new system, built on the Bitcoin blockchain, which enables strong consistency. Our system, PeerCensus, acts as a certification authority, manages peer identities in a peer-to-peer network, and ultimately enhances Bitcoin and similar systems with strong consistency. Our extensive analysis shows that PeerCensus is in a secure state...
Consistent classical supergravity theories
International Nuclear Information System (INIS)
Muller, M.
1989-01-01
This book offers a presentation of both conformal and Poincare supergravity. The consistent four-dimensional supergravity theories are classified. The formulae needed for further modelling are included
Consistency of orthodox gravity
Energy Technology Data Exchange (ETDEWEB)
Bellucci, S. [INFN, Frascati (Italy). Laboratori Nazionali di Frascati; Shiekh, A. [International Centre for Theoretical Physics, Trieste (Italy)
1997-01-01
A recent proposal for quantizing gravity is investigated for self consistency. The existence of a fixed-point all-order solution is found, corresponding to a consistent quantum gravity. A criterion to unify couplings is suggested, by invoking an application of their argument to more complex systems.
Quasiparticles and thermodynamical consistency
International Nuclear Information System (INIS)
Shanenko, A.A.; Biro, T.S.; Toneev, V.D.
2003-01-01
A brief and simple introduction to the problem of thermodynamical consistency is given. The thermodynamical consistency relations, which should be taken into account when constructing a quasiparticle model, are derived in a general manner from the finite-temperature extension of the Hellmann-Feynman theorem. Restrictions following from these relations are illustrated by simple physical examples. (author)
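For reference, the finite-temperature extension of the Hellmann-Feynman theorem invoked above is commonly written as follows (a standard textbook form in our own notation, not quoted from the paper):

```latex
% For a Hamiltonian H(\lambda) and free energy
% F(\lambda) = -T \ln \operatorname{Tr}\, e^{-H(\lambda)/T},
\frac{\partial F}{\partial \lambda}
  = \Bigl\langle \frac{\partial H(\lambda)}{\partial \lambda} \Bigr\rangle_T ,
\qquad
\langle A \rangle_T
  = \frac{\operatorname{Tr}\, A\, e^{-H(\lambda)/T}}{\operatorname{Tr}\, e^{-H(\lambda)/T}} .
```

In quasiparticle models, requiring this relation to hold for the medium-dependent parameters (e.g. an effective mass) is what generates the consistency constraints the abstract refers to.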
Bergantiños, Gustavo; Valencia-Toledo, Alfredo; Vidal-Puga, Juan
2016-01-01
The program evaluation review technique (PERT) is a tool used to schedule and coordinate activities in a complex project. In assigning the cost of a potential delay, we characterize the Shapley rule as the only rule that satisfies consistency and other desirable properties.
DEFF Research Database (Denmark)
Thomsen, Christa; Nielsen, Anne Ellerup
2006-01-01
This chapter first outlines theory and literature on CSR and stakeholder relations, focusing on the different perspectives and the contextual and dynamic character of the CSR concept. CSR reporting challenges are discussed and a model of analysis is proposed. Next, our paper presents the results of a case study showing that companies use different and not necessarily consistent strategies for reporting on CSR. Finally, the implications for managerial practice are discussed. The chapter concludes by highlighting the value and awareness of the discourse and the discourse types adopted in the reporting material. By implementing consistent discourse strategies that interact according to a well-defined pattern or order, it is possible to communicate a strong social commitment on the one hand, and to take into consideration the expectations of the shareholders and the other stakeholders…
Geometrically Consistent Mesh Modification
Bonito, A.
2010-01-01
A new paradigm of adaptivity is to execute refinement, coarsening, and smoothing of meshes on manifolds with incomplete information about their geometry and yet preserve position and curvature accuracy. We refer to this collectively as geometrically consistent (GC) mesh modification. We discuss the concept of discrete GC, show the failure of naive approaches, and propose and analyze a simple algorithm that is GC and accuracy preserving. © 2010 Society for Industrial and Applied Mathematics.
Serfon, Cedric; The ATLAS collaboration
2016-01-01
One of the biggest challenges in large-scale data management systems is ensuring consistency between the global file catalog and what is physically on all storage elements. To tackle this issue, the Rucio software, which is used by the ATLAS Distributed Data Management system, has been extended to automatically handle lost or unregistered files (aka dark data). This system automatically detects these inconsistencies and takes actions such as recovery or deletion of unneeded files in a central manner. In this talk, we will present this system, explain its internals and give some results.
International Nuclear Information System (INIS)
Wang Xiaomin; Tegmark, Max; Zaldarriaga, Matias
2002-01-01
We perform a detailed analysis of the latest cosmic microwave background (CMB) measurements (including BOOMERaNG, DASI, Maxima and CBI), both alone and jointly with other cosmological data sets involving, e.g., galaxy clustering and the Lyman Alpha Forest. We first address the question of whether the CMB data are internally consistent once calibration and beam uncertainties are taken into account, performing a series of statistical tests. With a few minor caveats, our answer is yes, and we compress all data into a single set of 24 bandpowers with associated covariance matrix and window functions. We then compute joint constraints on the 11 parameters of the 'standard' adiabatic inflationary cosmological model. Our best fit model passes a series of physical consistency checks and agrees with essentially all currently available cosmological data. In addition to sharp constraints on the cosmic matter budget in good agreement with those of the BOOMERaNG, DASI and Maxima teams, we obtain a heaviest neutrino mass range 0.04-4.2 eV and the sharpest constraints to date on gravity waves which (together with preference for a slight red-tilt) favor 'small-field' inflation models
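As a toy version of the internal-consistency testing described above: a minimal check compares two independent estimates of the same quantity under Gaussian errors (far simpler than the paper's joint analysis with calibration and beam uncertainties, which this sketch deliberately ignores):

```python
# Minimal two-estimate consistency check: how many sigmas apart are two
# independent Gaussian measurements of the same quantity?
def consistency_sigma(x1, s1, x2, s2):
    """Discrepancy between estimates x1 +/- s1 and x2 +/- s2, in sigmas."""
    return abs(x1 - x2) / (s1**2 + s2**2) ** 0.5

# Two hypothetical bandpower estimates that agree at about the 1-sigma level
print(round(consistency_sigma(1.0, 0.1, 1.15, 0.1), 2))  # → 1.06
```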
Griffiths, Robert B.
2001-11-01
Quantum mechanics is one of the most fundamental yet difficult subjects in physics. Nonrelativistic quantum theory is presented here in a clear and systematic fashion, integrating Born's probabilistic interpretation with Schrödinger dynamics. Basic quantum principles are illustrated with simple examples requiring no mathematics beyond linear algebra and elementary probability theory. The quantum measurement process is consistently analyzed using fundamental quantum principles without referring to measurement. These same principles are used to resolve several of the paradoxes that have long perplexed physicists, including the double slit and Schrödinger's cat. The consistent histories formalism used here was first introduced by the author, and extended by M. Gell-Mann, J. Hartle and R. Omnès. Essential for researchers yet accessible to advanced undergraduate students in physics, chemistry, mathematics, and computer science, this book is supplementary to standard textbooks. It will also be of interest to physicists and philosophers working on the foundations of quantum mechanics. Comprehensive account; written by one of the main figures in the field; paperback edition of a successful work on the philosophy of quantum mechanics.
Measuring process and knowledge consistency
DEFF Research Database (Denmark)
Edwards, Kasper; Jensen, Klaes Ladeby; Haug, Anders
2007-01-01
When implementing configuration systems, knowledge about products and processes is documented and replicated in the configuration system. This practice assumes that products are specified consistently, i.e. on the same rule base, and likewise for processes. However, consistency cannot be taken for granted; rather the contrary, and attempting to implement a configuration system may easily ignite a political battle. This is because stakes are high in the sense that the rules and processes chosen may only reflect one part of the practice, ignoring a majority of the employees. To avoid this situation, this paper presents a methodology for measuring product and process consistency prior to implementing a configuration system. The methodology consists of two parts: 1) measuring knowledge consistency and 2) measuring process consistency. Knowledge consistency is measured by developing a questionnaire…
Consistency argued students of fluid
Viyanti; Cari; Suparmi; Winarti; Slamet Budiarti, Indah; Handika, Jeffry; Widyastuti, Fatma
2017-01-01
Problem solving for physics concepts through consistency arguments can improve students' thinking skills and is an important thing in science. The study aims to assess the consistency of students' argumentation about fluid material. The population of this study consists of college students of PGRI Madiun, UIN Sunan Kalijaga Yogyakarta and Lampung University. Using cluster random sampling, a sample of 145 students was obtained. The study used a descriptive survey method. Data were obtained through a reasoned multiple-choice test and interviews. The fluid problems were modified from [9] and [1]. The results of the study gave an average argumentation consistency for correct consistency, wrong consistency, and inconsistency of 4.85%, 29.93%, and 65.23%, respectively. These findings point to a lack of understanding of the fluid material, which, under full consistency of argumentation, would ideally support an expanded understanding of the concept. The results of the study serve as a reference for making improvements in future studies so as to obtain a positive change in the consistency of argumentation.
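The three-way tally reported above (correct consistency, wrong consistency, inconsistency) can be illustrated with a small script. The pairing of an answer with its stated reason is an assumption here; the paper's exact scoring rubric is not given in the abstract:

```python
# Illustrative tally of argumentation-consistency categories: an answer and
# its stated reason are "consistent" when both are correct or both are wrong.
from collections import Counter

def classify(answer_correct: bool, reason_correct: bool) -> str:
    if answer_correct and reason_correct:
        return "consistent-correct"
    if not answer_correct and not reason_correct:
        return "consistent-wrong"
    return "inconsistent"

def tally(pairs):
    """Percentage of responses in each category."""
    counts = Counter(classify(a, r) for a, r in pairs)
    n = len(pairs)
    return {k: round(100 * v / n, 2) for k, v in counts.items()}

sample = [(True, True), (False, False), (True, False), (False, True)]
print(tally(sample))
# → {'consistent-correct': 25.0, 'consistent-wrong': 25.0, 'inconsistent': 50.0}
```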
Coordinating user interfaces for consistency
Nielsen, Jakob
2001-01-01
In the years since Jakob Nielsen's classic collection on interface consistency first appeared, much has changed, and much has stayed the same. On the one hand, there's been exponential growth in the opportunities for following or disregarding the principles of interface consistency: more computers, more applications, more users, and of course the vast expanse of the Web. On the other, there are the principles themselves, as persistent and as valuable as ever. In these contributed chapters, you'll find details on many methods for seeking and enforcing consistency, along with bottom-line analyses…
Choice, internal consistency, and rationality
Aditi Bhattacharyya; Prasanta K. Pattanaik; Yongsheng Xu
2010-01-01
The classical theory of rational choice is built on several important internal consistency conditions. In recent years, the reasonableness of those internal consistency conditions has been questioned and criticized, and several responses to accommodate such criticisms have been proposed in the literature. This paper develops a general framework to accommodate the issues raised by the criticisms of classical rational choice theory, and examines the broad impact of these criticisms from both no...
International Nuclear Information System (INIS)
Rafelski, J.
1979-01-01
After an introductory overview of the bag model the author uses the self-consistent solution of the coupled Dirac-meson fields to represent a bound state of strongly interacting fermions. In this framework he discusses the virial approach to classical field equations. After a short description of the numerical methods used, the properties of bound states of scalar self-consistent fields and the solutions of a self-coupled Dirac field are considered. (HSI) [de]
Time-consistent and market-consistent evaluations
Pelsser, A.; Stadje, M.A.
2014-01-01
We consider evaluation methods for payoffs with an inherent financial risk, as encountered for instance in portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent with market prices typically involves combining actuarial techniques with methods from…
Market-consistent actuarial valuation
Wüthrich, Mario V
2016-01-01
This is the third edition of this well-received textbook, presenting powerful methods for measuring insurance liabilities and assets in a consistent way, with detailed mathematical frameworks that lead to market-consistent values for liabilities. Topics covered are stochastic discounting with deflators, valuation portfolio in life and non-life insurance, probability distortions, asset and liability management, financial risks, insurance technical risks, and solvency. Including updates on recent developments and regulatory changes under Solvency II, this new edition of Market-Consistent Actuarial Valuation also elaborates on different risk measures, providing a revised definition of solvency based on industry practice, and presents an adapted valuation framework which takes a dynamic view of non-life insurance reserving risk.
The Principle of Energetic Consistency
Cohn, Stephen E.
2009-01-01
A basic result in estimation theory is that the minimum variance estimate of the dynamical state, given the observations, is the conditional mean estimate. This result holds independently of the specifics of any dynamical or observation nonlinearity or stochasticity, requiring only that the probability density function of the state, conditioned on the observations, has two moments. For nonlinear dynamics that conserve a total energy, this general result implies the principle of energetic consistency: if the dynamical variables are taken to be the natural energy variables, then the sum of the total energy of the conditional mean and the trace of the conditional covariance matrix (the total variance) is constant between observations. Ensemble Kalman filtering methods are designed to approximate the evolution of the conditional mean and covariance matrix. For them the principle of energetic consistency holds independently of ensemble size, even with covariance localization. However, full Kalman filter experiments with advection dynamics have shown that a small amount of numerical dissipation can cause a large, state-dependent loss of total variance, to the detriment of filter performance. The principle of energetic consistency offers a simple way to test whether this spurious loss of variance limits ensemble filter performance in full-blown applications. The classical second-moment closure (third-moment discard) equations also satisfy the principle of energetic consistency, independently of the rank of the conditional covariance matrix. Low-rank approximation of these equations offers an energetically consistent, computationally viable alternative to ensemble filtering. Current formulations of long-window, weak-constraint, four-dimensional variational methods are designed to approximate the conditional mode rather than the conditional mean. Thus they neglect the nonlinear bias term in the second-moment closure equation for the conditional mean. The principle of
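The energetic-consistency statement above follows from an elementary variance decomposition; a minimal sketch in our own notation, assuming a quadratic total energy in the natural energy variables:

```latex
% Let x be the state in natural energy variables, with conditional mean
% \bar{x} = E[x \mid y] and conditional covariance P. For the total energy
% H(x) = \tfrac12\, x^{\mathsf T} x,
E\!\left[\, H(x) \mid y \,\right]
  = \tfrac12\,\bar{x}^{\mathsf T}\bar{x} + \tfrac12\operatorname{tr} P
  = H(\bar{x}) + \tfrac12\operatorname{tr} P .
% If the dynamics conserve H(x) along every realization, the left-hand side
% is constant between observations, hence so is
% H(\bar{x}) + \tfrac12 \operatorname{tr} P :
% total energy of the conditional mean plus (half) the total variance.
```

A spurious numerical loss of $\operatorname{tr} P$, as in the advection experiments mentioned above, therefore shows up directly as a violation of this conserved sum.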
Consistent guiding center drift theories
International Nuclear Information System (INIS)
Wimmel, H.K.
1982-04-01
Various guiding-center drift theories are presented that are optimized in respect of consistency. They satisfy exact energy conservation theorems (in time-independent fields), Liouville's theorems, and appropriate power balance equations. A theoretical framework is given that allows direct and exact derivation of associated drift-kinetic equations from the respective guiding-center drift-orbit theories. These drift-kinetic equations are listed. Northrop's non-optimized theory is discussed for reference, and internal consistency relations of G.C. drift theories are presented. (orig.)
Weak consistency and strong paraconsistency
Directory of Open Access Journals (Sweden)
Gemma Robles
2009-11-01
In a standard sense, consistency and paraconsistency are understood as, respectively, the absence of any contradiction and the absence of the ECQ (“E contradictione quodlibet”) rule that allows us to conclude any well-formed formula from any contradiction. The aim of this paper is to explain concepts of weak consistency alternative to the standard one, the concepts of paraconsistency related to them, and the concept of strong paraconsistency, all of which have been defined by the author together with José M. Méndez.
Consistent force fields for saccharides
DEFF Research Database (Denmark)
Rasmussen, Kjeld
1999-01-01
Consistent force fields for carbohydrates were hitherto developed by extensive optimization of potential energy function parameters on experimental data and on ab initio results. A wide range of experimental data is used: internal structures obtained from gas phase electron diffraction and from x… …-anomeric effects are accounted for without addition of specific terms. The work is done in the framework of the Consistent Force Field, which originated in Israel and was further developed in Denmark. The actual methods and strategies employed have been described previously. Extensive testing of the force field…
Glass consistency and glass performance
International Nuclear Information System (INIS)
Plodinec, M.J.; Ramsey, W.G.
1994-01-01
Glass produced by the Defense Waste Processing Facility (DWPF) will have to consistently be more durable than a benchmark glass (evaluated using a short-term leach test), with high confidence. The DWPF has developed a Glass Product Control Program to comply with this specification. However, it is not clear what relevance product consistency has on long-term glass performance. In this report, the authors show that DWPF glass, produced in compliance with this specification, can be expected to effectively limit the release of soluble radionuclides to natural environments. However, the release of insoluble radionuclides to the environment will be limited by their solubility, and not glass durability
Time-consistent actuarial valuations
Pelsser, A.A.J.; Salahnejhad Ghalehjooghi, A.
2016-01-01
Time-consistent valuations (i.e. pricing operators) can be created by backward iteration of one-period valuations. In this paper we investigate the continuous-time limits of well-known actuarial premium principles when such backward iteration procedures are applied. This method is applied to an
Dynamically consistent oil import tariffs
International Nuclear Information System (INIS)
Karp, L.; Newbery, D.M.
1992-01-01
The standard theory of optimal tariffs considers tariffs on perishable goods produced abroad under static conditions, in which tariffs affect prices only in that period. Oil and other exhaustible resources do not fit this model, for current tariffs affect the amount of oil imported, which will affect the remaining stock and hence its future price. The problem of choosing a dynamically consistent oil import tariff when suppliers are competitive but importers have market power is considered. The open-loop Nash tariff is solved for the standard competitive case in which the oil price is arbitraged, and it was found that the resulting tariff rises at the rate of interest. This tariff was found to have an equilibrium that in general is dynamically inconsistent. Nevertheless, it is shown that necessary and sufficient conditions exist under which the tariff satisfies the weaker condition of time consistency. A dynamically consistent tariff is obtained by assuming that all agents condition their current decisions on the remaining stock of the resource, in contrast to open-loop strategies. For the natural case in which all agents choose their actions simultaneously in each period, the dynamically consistent tariff was characterized, and found to differ markedly from the time-inconsistent open-loop tariff. It was shown that if importers do not have overwhelming market power, then the time path of the world price is insensitive to the ability to commit, as is the level of wealth achieved by the importer. 26 refs., 4 figs
Consistently violating the non-Gaussian consistency relation
International Nuclear Information System (INIS)
Mooij, Sander; Palma, Gonzalo A.
2015-01-01
Non-attractor models of inflation are characterized by the super-horizon evolution of curvature perturbations, introducing a violation of the non-Gaussian consistency relation between the bispectrum's squeezed limit and the power spectrum's spectral index. In this work we show that the bispectrum's squeezed limit of non-attractor models continues to respect a relation dictated by the evolution of the background. We show how to derive this relation using only symmetry arguments, without ever needing to solve the equations of motion for the perturbations
Consistence of Network Filtering Rules
Institute of Scientific and Technical Information of China (English)
SHE Kun; WU Yuancheng; HUANG Juncai; ZHOU Mingtian
2004-01-01
The inconsistency of firewall/VPN (Virtual Private Network) rules incurs a huge maintenance cost. With the development of multinational companies, SOHO offices and e-government, the number of firewalls/VPNs will increase rapidly, and rule tables, stand-alone or networked, will grow in geometric progression accordingly. Checking the consistency of rule tables manually is inadequate. A formal approach can define semantic consistency and lay a theoretical foundation for the intelligent management of rule tables. In this paper, a formalization of host rules and network rules for automatic rule validation based on set theory is proposed and a rule-validation scheme is defined. The analysis results show the superior performance of the methods and demonstrate their potential for intelligent management based on rule tables.
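The flavor of a set-theoretic rule-consistency check can be sketched in a few lines. The paper's actual formalism is not reproduced here; modeling each rule's match condition as a plain Python set of (src, dst) pairs is an illustrative assumption:

```python
# Toy set-theoretic rule-consistency check: a rule is "shadowed" when rules
# earlier in the table, with a conflicting action, fully cover its match set,
# so it can never take effect; such rules signal an inconsistent table.

def shadowed(rules):
    """Return indices of rules fully covered by earlier conflicting rules.

    `rules` is an ordered list of (match_set, action) pairs.
    """
    conflicts = []
    for i, (match_i, action_i) in enumerate(rules):
        covered = set()
        for match_j, action_j in rules[:i]:
            if action_j != action_i:
                covered |= (match_i & match_j)
        if match_i and covered >= match_i:
            conflicts.append(i)
    return conflicts

r0 = ({("10.0.0.1", "any")}, "deny")
r1 = ({("10.0.0.1", "any")}, "allow")   # can never fire: r0 always wins
print(shadowed([r0, r1]))  # → [1]
```

Real rule tables match on address and port ranges rather than enumerable sets, so practical checkers compare intervals or predicates instead, but the subset test is the same idea.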
International Nuclear Information System (INIS)
Hazeltine, R.D.
1988-12-01
The boundary layer arising in the radial vicinity of a tokamak limiter is examined, with special reference to the TEXT tokamak. It is shown that sheath structure depends upon the self-consistent effects of ion guiding-center orbit modification, as well as the radial variation of E × B-induced toroidal rotation. Reasonable agreement with experiment is obtained from an idealized model which, however simplified, preserves such self-consistent effects. It is argued that the radial sheath, which occurs whenever confining magnetic field-lines lie in the plasma boundary surface, is an object of some intrinsic interest. It differs from the more familiar axial sheath because magnetized charges respond very differently to parallel and perpendicular electric fields. 11 refs., 1 fig
Lagrangian multiforms and multidimensional consistency
Energy Technology Data Exchange (ETDEWEB)
Lobb, Sarah; Nijhoff, Frank [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)
2009-10-30
We show that well-chosen Lagrangians for a class of two-dimensional integrable lattice equations obey a closure relation when embedded in a higher dimensional lattice. On the basis of this property we formulate a Lagrangian description for such systems in terms of Lagrangian multiforms. We discuss the connection of this formalism with the notion of multidimensional consistency, and the role of the lattice from the point of view of the relevant variational principle.
Consistency and Communication in Committees
Inga Deimen; Felix Ketelaar; Mark T. Le Quement
2013-01-01
This paper analyzes truthtelling incentives in pre-vote communication in heterogeneous committees. We generalize the classical Condorcet jury model by introducing a new informational structure that captures consistency of information. In contrast to the impossibility result shown by Coughlan (2000) for the classical model, full pooling of information followed by sincere voting is an equilibrium outcome of our model for a large set of parameter values implying the possibility of ex post confli...
Deep Feature Consistent Variational Autoencoder
Hou, Xianxu; Shen, Linlin; Sun, Ke; Qiu, Guoping
2016-01-01
We present a novel method for constructing Variational Autoencoder (VAE). Instead of using pixel-by-pixel loss, we enforce deep feature consistency between the input and the output of a VAE, which ensures the VAE's output to preserve the spatial correlation characteristics of the input, thus leading the output to have a more natural visual appearance and better perceptual quality. Based on recent deep learning works such as style transfer, we employ a pre-trained deep convolutional neural net...
International Nuclear Information System (INIS)
Kellum, C.D.; Fisher, L.M.; Tegtmeyer, C.J.
1987-01-01
This paper examines the advantages of the use of excretory urography for diagnosis. According to the authors, excretory urography remains the basic radiologic examination of the urinary tract and is the foundation for the evaluation of suspected urologic disease. Despite the development of newer diagnostic modalities such as isotope scanning, ultrasonography, CT, and magnetic resonance imaging (MRI), excretory urography has maintained a prominent role in uroradiology. Some indications have been altered and will continue to change with the newer imaging modalities, but the initial evaluation of suspected urinary tract structural abnormalities, hematuria, pyuria, and calculus disease is best performed with excretory urography. The examination is relatively inexpensive and simple to perform, with few contraindications. Excretory urography, when properly performed, can provide valuable information about the renal parenchyma, pelvicalyceal system, ureters, and urinary bladder.
Gwinnutt, James M; Sharp, Charlotte A; Symmons, Deborah P M; Lunt, Mark; Verstappen, Suzanne M M
2018-03-15
To assess baseline predictors of long-term functional disability in patients with inflammatory arthritis (IA), we conducted a systematic review of the literature from 1990 to 2017 using MEDLINE and EMBASE. Studies were included if (i) they were prospective observational studies, (ii) all patients had IA with symptom duration ≤2 years at baseline, (iii) follow-up was at least 5 years, and (iv) baseline predictors of HAQ score at long-term follow-up (i.e., ≥5 years following baseline) were assessed. Information on the included studies and estimates of the association between baseline variables and long-term HAQ scores were extracted from the full manuscripts. Of 1037 abstracts identified by the search strategy, 37 met the inclusion/exclusion criteria and were included in the review. Older age at baseline and female gender were reported to be associated with higher long-term HAQ scores in the majority of studies assessing these relationships, as were higher baseline HAQ and greater pain scores (total patients included in analyses reporting significant associations/total number of patients analysed: age 9.8k/10.7k (91.6%); gender 9.9k/11.3k (87.4%); HAQ 4.0k/4.0k (99.0%); pain 2.8k/2.9k (93.6%)). Tender joint count, erythrocyte sedimentation rate (ESR) and DAS28 were also reported to predict long-term HAQ score; other disease activity measures were less consistent (tender joints 2.1k/2.5k (84.5%); ESR 1.6k/2.2k (72.3%); DAS28 888/1.1k (79.2%); swollen joints 684/2.6k (26.6%); C-reactive protein 279/510 (54.7%)). Rheumatoid factor (RF) and erosions were not useful predictors (RF 546/4.6k (11.9%); erosions 191/2.7k (7.0%)), whereas the results for anti-citrullinated protein antibody (ACPA) positivity were equivocal (ACPA 2.0k/3.8k (52.9%)). Baseline age, gender, HAQ and pain scores are associated with long-term disability, and knowledge of these may aid the assessment of prognosis. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.
Maintaining consistency in distributed systems
Birman, Kenneth P.
1991-01-01
In systems designed as assemblies of independently developed components, concurrent access to data or data structures normally arises within individual programs, and is controlled using mutual exclusion constructs, such as semaphores and monitors. Where data is persistent and/or sets of operation are related to one another, transactions or linearizability may be more appropriate. Systems that incorporate cooperative styles of distributed execution often replicate or distribute data within groups of components. In these cases, group oriented consistency properties must be maintained, and tools based on the virtual synchrony execution model greatly simplify the task confronting an application developer. All three styles of distributed computing are likely to be seen in future systems - often, within the same application. This leads us to propose an integrated approach that permits applications that use virtual synchrony with concurrent objects that respect a linearizability constraint, and vice versa. Transactional subsystems are treated as a special case of linearizability.
Decentralized Consistent Updates in SDN
Nguyen, Thanh Dang
2017-04-10
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and blackholes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes information needed by the switches during the update execution. This information is distributed to the switches, which use partial knowledge and direct message passing to efficiently realize the update. This separation of concerns has the key benefit of improving update performance as the communication and computation bottlenecks at the controller are removed. Our evaluations via network emulations and large-scale simulations demonstrate the efficiency of ez-Segway, which compared to a centralized approach, improves network update times by up to 45% and 57% at the median and the 99th percentile, respectively. A deployment of a system prototype in a real OpenFlow switch and an implementation in P4 demonstrate the feasibility and low overhead of implementing simple network update functionality within switches.
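The anomalies ez-Segway guards against can be made concrete with a toy check. The following sketch is not part of ez-Segway; the function name and the next-hop representation are illustrative assumptions. It walks each switch's forwarding path toward a destination and flags paths that revisit a switch (loop) or hit a switch with no installed rule (blackhole):

```python
def forwarding_anomalies(next_hop, dest):
    """Detect loops and blackholes in a next-hop forwarding map.

    next_hop: dict mapping switch -> successor switch (None = no rule).
    Returns (loops, blackholes): sets of switches whose path toward
    `dest` cycles or dead-ends, respectively.
    """
    loops, blackholes = set(), set()
    for start in next_hop:
        seen, node = set(), start
        while node != dest:
            if node in seen:          # revisited a switch: forwarding loop
                loops.add(start)
                break
            seen.add(node)
            nxt = next_hop.get(node)
            if nxt is None:           # no rule installed: traffic is dropped
                blackholes.add(start)
                break
            node = nxt
    return loops, blackholes
```

A consistent update scheme must ensure that no intermediate rule placement ever makes such a check fail, which is exactly the invariant the switches in ez-Segway coordinate to preserve.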
Consistency or Discrepancy? Rethinking Schools from Organizational Hypocrisy to Integrity
Kiliçoglu, Gökhan
2017-01-01
Consistency in statements, decisions and practices is highly important for both organization members and the image of an organization. It is expected from organizations, especially from their administrators, to "walk the talk"--in other words, to try to practise what they preach. However, in the process of gaining legitimacy and adapting…
Consistency Anchor Formalization and Correctness Proofs
Miguel, Correia; Bessani, Alysson
2014-01-01
This report contains the formal proofs for the techniques for increasing the consistency of cloud storage as presented in "Bessani et al. SCFS: A Cloud-backed File System. Proc. of the 2014 USENIX Annual Technical Conference. June 2014." The consistency anchor technique allows one to increase the consistency provided by eventually consistent cloud storage services like Amazon S3. This technique has been used in the SCFS (Shared Cloud File System) cloud-backed file system for solving rea...
A new approach to hull consistency
Directory of Open Access Journals (Sweden)
Kolev Lubomir
2016-06-01
Hull consistency is a known technique to improve the efficiency of iterative interval methods for solving nonlinear systems describing steady-states in various circuits. Presently, hull consistency is checked in a scalar manner, i.e. successively for each equation of the nonlinear system with respect to a single variable. In the present poster, a new more general approach to implementing hull consistency is suggested, which consists in treating several equations simultaneously with respect to the same number of variables.
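The scalar hull-consistency step, projecting a single equation onto a single variable using interval arithmetic, can be sketched minimally. The function names and the example equation x + y = c are illustrative assumptions, not taken from the poster:

```python
def intersect(a, b):
    """Intersection of two closed intervals (lo, hi), or None if empty."""
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    return (lo, hi) if lo <= hi else None

def contract_sum(x, y, c):
    """Hull-consistency contraction for the single equation x + y = c.

    Narrows x via the projection x = c - y, then y via y = c - x,
    each evaluated in interval arithmetic (the 'scalar' style the
    abstract describes; the poster's contribution is doing several
    equations and variables at once).
    """
    x = intersect(x, (c - y[1], c - y[0]))   # x must lie in c - y
    if x is None:
        return None, None                    # equation has no solution here
    y = intersect(y, (c - x[1], c - x[0]))   # y must lie in c - x
    return x, y
```

For example, with x = [0, 10], y = [2, 4] and c = 5, the x-interval contracts to [1, 3] while y is already as tight as this equation can make it.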
Replica consistency in a Data Grid
International Nuclear Information System (INIS)
Domenici, Andrea; Donno, Flavia; Pucciani, Gianni; Stockinger, Heinz; Stockinger, Kurt
2004-01-01
A Data Grid is a wide area computing infrastructure that employs Grid technologies to provide storage capacity and processing power to applications that handle very large quantities of data. Data Grids rely on data replication to achieve better performance and reliability by storing copies of data sets on different Grid nodes. When a data set can be modified by applications, the problem of maintaining consistency among existing copies arises. The consistency problem also concerns metadata, i.e., additional information about application data sets such as indices, directories, or catalogues. This kind of metadata is used both by the applications and by the Grid middleware to manage the data. For instance, the Replica Management Service (the Grid middleware component that controls data replication) uses catalogues to find the replicas of each data set. Such catalogues can also be replicated and their consistency is crucial to the correct operation of the Grid. Therefore, metadata consistency generally poses stricter requirements than data consistency. In this paper we report on the development of a Replica Consistency Service based on the middleware mainly developed by the European Data Grid Project. The paper summarises the main issues in the replica consistency problem, and lays out a high-level architectural design for a Replica Consistency Service. Finally, results from simulations of different consistency models are presented
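One of the simplest consistency models a replica catalogue service might simulate is last-writer-wins on version numbers. The sketch below is a toy illustration of that idea only; the function name and data layout are assumptions, and the Replica Consistency Service described in the paper is far more elaborate:

```python
def reconcile(replicas):
    """Bring every replica of a catalogue up to the newest version seen.

    replicas: dict node -> (version, payload). Applies last-writer-wins:
    the entry with the highest version number is copied to all nodes.
    """
    latest = max(replicas.values(), key=lambda vp: vp[0])
    return {node: latest for node in replicas}
```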
Student Effort, Consistency, and Online Performance
Patron, Hilde; Lopez, Salvador
2011-01-01
This paper examines how student effort, consistency, motivation, and marginal learning, influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas…
Translationally invariant self-consistent field theories
International Nuclear Information System (INIS)
Shakin, C.M.; Weiss, M.S.
1977-01-01
We present a self-consistent field theory which is translationally invariant. The equations obtained go over to the usual Hartree-Fock equations in the limit of large particle number. In addition to deriving the dynamic equations for the self-consistent amplitudes we discuss the calculation of form factors and various other observables
Sticky continuous processes have consistent price systems
DEFF Research Database (Denmark)
Bender, Christian; Pakkanen, Mikko; Sayit, Hasanjan
Under proportional transaction costs, a price process is said to have a consistent price system, if there is a semimartingale with an equivalent martingale measure that evolves within the bid-ask spread. We show that a continuous, multi-asset price process has a consistent price system, under...
Consistent-handed individuals are more authoritarian.
Lyle, Keith B; Grillo, Michael C
2014-01-01
Individuals differ in the consistency with which they use one hand over the other to perform everyday activities. Some individuals are very consistent, habitually using a single hand to perform most tasks. Others are relatively inconsistent, and hence make greater use of both hands. More- versus less-consistent individuals have been shown to differ in numerous aspects of personality and cognition. In several respects consistent-handed individuals resemble authoritarian individuals. For example, both consistent-handedness and authoritarianism have been linked to cognitive inflexibility. Therefore we hypothesised that consistent-handedness is an external marker for authoritarianism. Confirming our hypothesis, we found that consistent-handers scored higher than inconsistent-handers on a measure of submission to authority, were more likely to identify with a conservative political party (Republican), and expressed less-positive attitudes towards out-groups. We propose that authoritarianism may be influenced by the degree of interaction between the left and right brain hemispheres, which has been found to differ between consistent- and inconsistent-handed individuals.
Testing the visual consistency of web sites
van der Geest, Thea; Loorbach, N.R.
2005-01-01
Consistency in the visual appearance of Web pages is often checked by experts, such as designers or reviewers. This article reports a card sort study conducted to determine whether users rather than experts could distinguish visual (in-)consistency in Web elements and pages. The users proved to
Consistent spectroscopy for an extended gauge model
International Nuclear Information System (INIS)
Oliveira Neto, G. de.
1990-11-01
A consistent spectroscopy was obtained with a Lagrangian constructed from vector fields with an extended U(1) group symmetry. By consistent spectroscopy is meant the determination of the quantum physical properties described by the model in a manner independent of the possible parametrizations adopted in their description. (L.C.J.A.)
Modeling and Testing Legacy Data Consistency Requirements
DEFF Research Database (Denmark)
Nytun, J. P.; Jensen, Christian Søndergaard
2003-01-01
An increasing number of data sources are available on the Internet, many of which offer semantically overlapping data, but based on different schemas, or models. While it is often of interest to integrate such data sources, the lack of consistency among them makes this integration difficult... This paper addresses the need for new techniques that enable the modeling and consistency checking for legacy data sources. Specifically, the paper contributes to the development of a framework that enables consistency testing of data coming from different types of data sources. The vehicle is UML and its... accompanying XMI. The paper presents techniques for modeling consistency requirements using OCL and other UML modeling elements: it studies how models that describe the required consistencies among instances of legacy models can be designed in standard UML tools that support XMI. The paper also considers...
Robust Visual Tracking Via Consistent Low-Rank Sparse Learning
Zhang, Tianzhu
2014-06-19
Object tracking is the process of determining the states of a target in consecutive video frames based on properties of motion and appearance consistency. In this paper, we propose a consistent low-rank sparse tracker (CLRST) that builds upon the particle filter framework for tracking. By exploiting temporal consistency, the proposed CLRST algorithm adaptively prunes and selects candidate particles. By using linear sparse combinations of dictionary templates, the proposed method learns the sparse representations of image regions corresponding to candidate particles jointly by exploiting the underlying low-rank constraints. In addition, the proposed CLRST algorithm is computationally attractive since temporal consistency property helps prune particles and the low-rank minimization problem for learning joint sparse representations can be efficiently solved by a sequence of closed form update operations. We evaluate the proposed CLRST algorithm against 14 state-of-the-art tracking methods on a set of 25 challenging image sequences. Experimental results show that the CLRST algorithm performs favorably against state-of-the-art tracking methods in terms of accuracy and execution time.
Consistency in the World Wide Web
DEFF Research Database (Denmark)
Thomsen, Jakob Grauenkjær
Tim Berners-Lee envisioned that computers will behave as agents of humans on the World Wide Web, where they will retrieve, extract, and interact with information from the World Wide Web. A step towards this vision is to make computers capable of extracting this information in a reliable... and consistent way. In this dissertation we study steps towards this vision by showing techniques for the specification, the verification and the evaluation of the consistency of information in the World Wide Web. We show how to detect certain classes of errors in a specification of information, and we show how... the World Wide Web, in order to help perform consistent evaluations of web extraction techniques. These contributions are steps towards having computers reliably and consistently extract information from the World Wide Web, which in turn are steps towards achieving Tim Berners-Lee's vision.
Consistent histories and operational quantum theory
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
In this work a generalization of the consistent histories approach to quantum mechanics is presented. We first critically review the consistent histories approach to nonrelativistic quantum mechanics in a mathematically rigorous way and give some general comments about it. We investigate to what extent the consistent histories scheme is compatible with the results of the operational formulation of quantum mechanics. According to the operational approach, nonrelativistic quantum mechanics is most generally formulated in terms of effects, states, and operations. We formulate a generalized consistent histories theory using the concepts and the terminology which have proven useful in the operational formulation of quantum mechanics. The logical rule of the logical interpretation of quantum mechanics is generalized to the present context. The algebraic structure of the generalized theory is studied in detail
Self-consistent areas law in QCD
International Nuclear Information System (INIS)
Makeenko, Yu.M.; Migdal, A.A.
1980-01-01
The problem of obtaining the self-consistent areas law in quantum chromodynamics (QCD) is considered from the point of view of quark confinement. The exact equation for the loop average in multicolor QCD is reduced to a bootstrap form. Its iterations yield a new manifestly gauge invariant perturbation theory in the loop space, reproducing asymptotic freedom. For large loops, the areas law appears to be a self-consistent solution
Consistency of the MLE under mixture models
Chen, Jiahua
2016-01-01
The large-sample properties of likelihood-based statistical inference under mixture models have received much attention from statisticians. Although the consistency of the nonparametric MLE is regarded as a standard conclusion, many researchers ignore the precise conditions required on the mixture model. An incorrect claim of consistency can lead to false conclusions even if the mixture model under investigation seems well behaved. Under a finite normal mixture model, for instance, the consis...
Self-consistent asset pricing models
Malevergne, Y.; Sornette, D.
2007-08-01
We discuss the foundations of factor or regression models in the light of the self-consistency condition that the market portfolio (and more generally the risk factors) is (are) constituted of the assets whose returns it is (they are) supposed to explain. As already reported in several articles, self-consistency implies correlations between the return disturbances. As a consequence, the alphas and betas of the factor model are unobservable. Self-consistency leads to renormalized betas with zero effective alphas, which are observable with standard OLS regressions. When the conditions derived from internal consistency are not met, the model is necessarily incomplete, which means that some sources of risk cannot be replicated (or hedged) by a portfolio of stocks traded on the market, even for infinite economies. Analytical derivations and numerical simulations show that, for arbitrary choices of the proxy which are different from the true market portfolio, a modified linear regression holds with a non-zero value αi at the origin between an asset i's return and the proxy's return. Self-consistency also introduces “orthogonality” and “normality” conditions linking the betas, alphas (as well as the residuals) and the weights of the proxy portfolio. Two diagnostics based on these orthogonality and normality conditions are implemented on a basket of 323 assets which have been components of the S&P500 in the period from January 1990 to February 2005. These two diagnostics show interesting departures from dynamical self-consistency starting about 2 years before the end of the Internet bubble. Assuming that the CAPM holds with the self-consistency condition, the OLS method automatically obeys the resulting orthogonality and normality conditions and therefore provides a simple way to self-consistently assess the parameters of the model by using proxy portfolios made only of the assets which are used in the CAPM regressions. Finally, the factor decomposition with the
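Two of the self-consistency constraints the abstract mentions are easy to verify numerically: when the factor is exactly the portfolio of the regressed assets, the weight-averaged OLS alphas vanish and the weight-averaged betas sum to one. The sketch below is a minimal illustration under that assumption (all names and the return series are invented, and a real test would use the diagnostics on proxy portfolios described in the paper):

```python
def ols(x, y):
    """One-factor OLS of y on x: returns (alpha, beta)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    beta = (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))
    return my - beta * mx, beta

# Two assets and the index they themselves constitute (weights 0.5/0.5).
r1 = [0.01, -0.02, 0.03, 0.00, 0.02]
r2 = [0.02, 0.01, -0.01, 0.03, -0.02]
market = [0.5 * a + 0.5 * b for a, b in zip(r1, r2)]

(a1, b1), (a2, b2) = ols(market, r1), ols(market, r2)
# Self-consistency: weighted alphas cancel, weighted betas average to one.
```

These identities hold exactly (up to rounding) for any return data, which is the "orthogonality/normality" flavour of constraint the paper exploits; departures appear only when the regressions use a proxy different from the true portfolio.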
Howell, Robert T.
2004-01-01
With all the talk today about accountability, budget cuts, and the closing of programs in public education, teachers cannot overlook the importance of image in the field of industrial technology. It is very easy for administrators to cut ITE (industrial technology education) programs to save school money--money they might shift to teaching the…
Towards thermodynamical consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Shanenko, A.A.; Toneev, V.D.; Research Inst. for Particle and Nuclear Physics, Hungarian Academy of Sciences, Budapest
2003-01-01
The purpose of the present article is to call attention to some realistic quasi-particle-based description of the quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamical consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamical consistency. A particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential, which can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
Toward thermodynamic consistency of quasiparticle picture
International Nuclear Information System (INIS)
Biro, T.S.; Toneev, V.D.; Shanenko, A.A.
2003-01-01
The purpose of the present article is to call attention to some realistic quasiparticle-based description of quark/gluon matter and its consistent implementation in thermodynamics. A simple and transparent representation of the thermodynamic consistency conditions is given. This representation allows one to review critically and systemize available phenomenological approaches to the deconfinement problem with respect to their thermodynamic consistency. Particular attention is paid to the development of a method for treating the string screening in the dense matter of unbound color charges. The proposed method yields an integrable effective pair potential that can be incorporated into the mean-field picture. The results of its application are in reasonable agreement with lattice data on the QCD thermodynamics
International Nuclear Information System (INIS)
Shepard, J.R.
1991-01-01
The authors examine the RPA based on a relativistic Hartree approximation description for nuclear ground states. This model includes contributions from the negative energy sea at the 1-loop level. They emphasize consistency between the treatment of the ground state and the RPA. This consistency is important in the description of low-lying collective levels but less important for the longitudinal (e, e') quasi-elastic response. They also study the effect of imposing a 3-momentum cutoff on negative energy sea contributions. A cutoff of twice the nucleon mass improves agreement with observed spin orbit splittings in nuclei compared to the standard infinite cutoff results, an effect traceable to the fact that imposing the cutoff reduces m*/m. The cutoff is much less important than consistency in the description of low-lying collective levels. The cutoff model provides excellent agreement with quasi-elastic (e, e') data
Personalized recommendation based on unbiased consistence
Zhu, Xuzhen; Tian, Hui; Zhang, Ping; Hu, Zheng; Zhou, Tao
2015-08-01
Recently, in physical dynamics, mass-diffusion-based recommendation algorithms on bipartite network provide an efficient solution by automatically pushing possible relevant items to users according to their past preferences. However, traditional mass-diffusion-based algorithms just focus on unidirectional mass diffusion from objects having been collected to those which should be recommended, resulting in a biased causal similarity estimation and not-so-good performance. In this letter, we argue that in many cases, a user's interests are stable, and thus bidirectional mass diffusion abilities, no matter originated from objects having been collected or from those which should be recommended, should be consistently powerful, showing unbiased consistence. We further propose a consistence-based mass diffusion algorithm via bidirectional diffusion against biased causality, outperforming the state-of-the-art recommendation algorithms in disparate real data sets, including Netflix, MovieLens, Amazon and Rate Your Music.
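The unidirectional mass-diffusion baseline the letter argues against can be sketched in a few lines: a unit of resource on each collected item diffuses to the users who collected it, then back to items, degree-normalised at each hop. This is the standard ProbS-style scheme, shown here for orientation; the function name and the tiny rating dictionary are illustrative assumptions, and the letter's contribution is the bidirectional, unbiased variant:

```python
def mass_diffusion_scores(ratings, user):
    """One round of item -> user -> item mass diffusion on a bipartite
    user–item network, scoring uncollected items for `user`.

    ratings: dict user -> set of collected items.
    """
    k_item = {}                               # item degrees
    for items in ratings.values():
        for it in items:
            k_item[it] = k_item.get(it, 0) + 1

    # Step 1: each collected item spreads its unit resource to its users.
    resource_u = {}
    for it in ratings[user]:
        for u, items in ratings.items():
            if it in items:
                resource_u[u] = resource_u.get(u, 0.0) + 1.0 / k_item[it]

    # Step 2: each user redistributes its resource over its items.
    score = {}
    for u, r in resource_u.items():
        for it in ratings[u]:
            score[it] = score.get(it, 0.0) + r / len(ratings[u])

    # Recommend only items the target user has not collected yet.
    return {it: s for it, s in score.items() if it not in ratings[user]}
```

The "biased causality" critique is that diffusion runs only from collected items toward candidates; the proposed algorithm also runs the reverse diffusion and requires the two to agree.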
International Nuclear Information System (INIS)
Chen Shengzu
2003-01-01
The technique of High Energy Positron Imaging (HEPI) is a new development and extension of Positron Emission Tomography (PET). It consists of High Energy Collimation Imaging (HECI), Dual Head Coincidence Detection Imaging (DHCDI) and Positron Emission Tomography (PET). We describe the history of the development and the basic principles of the imaging methods of HEPI in detail in this paper. Finally, the new technique of image fusion, which combines the anatomical image and the functional image, is also introduced briefly
Financial model calibration using consistency hints.
Abu-Mostafa, Y S
2001-01-01
We introduce a technique for forcing the calibration of a financial model to produce valid parameters. The technique is based on learning from hints. It converts simple curve fitting into genuine calibration, where broad conclusions can be inferred from parameter values. The technique augments the error function of curve fitting with consistency hint error functions based on the Kullback-Leibler distance. We introduce an efficient EM-type optimization algorithm tailored to this technique. We also introduce other consistency hints, and balance their weights using canonical errors. We calibrate the correlated multifactor Vasicek model of interest rates, and apply it successfully to Japanese Yen swaps market and US dollar yield market.
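The core idea of augmenting a curve-fitting objective with a Kullback-Leibler consistency term can be sketched very compactly. This is only a schematic of the idea, not the paper's EM-type algorithm or its canonical-error weighting; the function names and the uniform example distributions are assumptions:

```python
import math

def kl(p, q):
    """Kullback-Leibler distance between two discrete distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def hinted_loss(fit_error, model_dist, target_dist, weight=1.0):
    """Curve-fitting error augmented with a consistency-hint term.

    The hint penalises parameter values whose implied distribution
    (model_dist) drifts from a distribution the calibrated model is
    known to have to match (target_dist).
    """
    return fit_error + weight * kl(target_dist, model_dist)
```

Minimising the augmented loss instead of the raw fit error is what turns "simple curve fitting into genuine calibration": parameters that fit the curve but violate the hint are no longer optimal.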
ACHIEVING CONSISTENT DOPPLER MEASUREMENTS FROM SDO /HMI VECTOR FIELD INVERSIONS
International Nuclear Information System (INIS)
Schuck, Peter W.; Antiochos, S. K.; Leka, K. D.; Barnes, Graham
2016-01-01
NASA’s Solar Dynamics Observatory is delivering vector magnetic field observations of the full solar disk with unprecedented temporal and spatial resolution; however, the satellite is in a highly inclined geosynchronous orbit. The relative spacecraft–Sun velocity varies by ±3 km s⁻¹ over a day, which introduces major orbital artifacts in the Helioseismic Magnetic Imager (HMI) data. We demonstrate that the orbital artifacts contaminate all spatial and temporal scales in the data. We describe a newly developed three-stage procedure for mitigating these artifacts in the Doppler data obtained from the Milne–Eddington inversions in the HMI pipeline. The procedure ultimately uses 32 velocity-dependent coefficients to adjust 10 million pixels—a remarkably sparse correction model given the complexity of the orbital artifacts. This procedure was applied to full-disk images of AR 11084 to produce consistent Dopplergrams. The data adjustments reduce the power in the orbital artifacts by 31 dB. Furthermore, we analyze in detail the corrected images and show that our procedure greatly improves the temporal and spectral properties of the data without adding any new artifacts. We conclude that this new procedure makes a dramatic improvement in the consistency of the HMI data and in its usefulness for precision scientific studies.
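The essence of a velocity-dependent correction is to model the measured Doppler signal as a function of the known orbital velocity and subtract the fit. The HMI pipeline uses 32 coefficients; the sketch below is a deliberately crude one-coefficient linear version, with invented names, meant only to show the shape of the idea:

```python
def remove_orbital_artifact(v_orbit, doppler):
    """Fit and subtract a linear dependence of the measured Doppler
    signal on the spacecraft's orbital velocity (one velocity-dependent
    coefficient; a toy stand-in for the pipeline's 32-coefficient model).
    Returns the residual signal after the fit is removed.
    """
    n = len(v_orbit)
    mv, md = sum(v_orbit) / n, sum(doppler) / n
    slope = (sum((v - mv) * (d - md) for v, d in zip(v_orbit, doppler))
             / sum((v - mv) ** 2 for v in v_orbit))
    return [d - (md + slope * (v - mv)) for v, d in zip(v_orbit, doppler)]
```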
Proteolysis and consistency of Meshanger cheese
Jong, de L.
1978-01-01
Proteolysis in Meshanger cheese, estimated by quantitative polyacrylamide gel electrophoresis, is discussed. The conversion of αs1-casein was proportional to rennet concentration in the cheese. Changes in consistency, after a maximum, were correlated to breakdown of
Developing consistent pronunciation models for phonemic variants
CSIR Research Space (South Africa)
Davel, M
2006-09-01
Pronunciation lexicons often contain pronunciation variants. This can create two problems: It can be difficult to define these variants in an internally consistent way and it can also be difficult to extract generalised grapheme-to-phoneme rule sets...
Consistent Valuation across Curves Using Pricing Kernels
Directory of Open Access Journals (Sweden)
Andrea Macrina
2018-03-01
The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.
Consistent application of codes and standards
International Nuclear Information System (INIS)
Scott, M.A.
1989-01-01
The guidelines presented in the US Department of Energy, General Design Criteria (DOE 6430.1A), and the Design and Evaluation Guidelines for Department of Energy Facilities Subject to Natural Phenomena Hazards (UCRL-15910) provide a consistent and well defined approach to determine the natural phenomena hazards loads for US Department of Energy site facilities. The guidelines for the application of loads combinations and allowables criteria are not as well defined and are more flexible in interpretation. This flexibility in the interpretation of load combinations can lead to conflict between the designer and overseer. The establishment of an efficient set of acceptable design criteria, based on US Department of Energy guidelines, provides a consistent baseline for analysis, design, and review. Additionally, the proposed method should not limit the design and analytical innovation necessary to analyze or qualify the unique structure. This paper investigates the consistent application of load combinations, analytical methods, and load allowables and suggests a reference path consistent with the US Department of Energy guidelines
Consistency in multi-viewpoint architectural design
Dijkman, R.M.; Dijkman, Remco Matthijs
2006-01-01
This thesis presents a framework that aids in preserving consistency in multi-viewpoint designs. In a multi-viewpoint design each stakeholder constructs his own design part. We call each stakeholder’s design part the view of that stakeholder. To construct his view, a stakeholder has a viewpoint.
Consistent Visual Analyses of Intrasubject Data
Kahng, SungWoo; Chung, Kyong-Mee; Gutshall, Katharine; Pitts, Steven C.; Kao, Joyce; Girolami, Kelli
2010-01-01
Visual inspection of single-case data is the primary method of interpretation of the effects of an independent variable on a dependent variable in applied behavior analysis. The purpose of the current study was to replicate and extend the results of DeProspero and Cohen (1979) by reexamining the consistency of visual analysis across raters. We…
Consistent Stochastic Modelling of Meteocean Design Parameters
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Sterndorff, M. J.
2000-01-01
Consistent stochastic models of metocean design parameters and their directional dependencies are essential for reliability assessment of offshore structures. In this paper a stochastic model for the annual maximum values of the significant wave height, and the associated wind velocity, current...
On the existence of consistent price systems
DEFF Research Database (Denmark)
Bayraktar, Erhan; Pakkanen, Mikko S.; Sayit, Hasanjan
2014-01-01
We formulate a sufficient condition for the existence of a consistent price system (CPS), which is weaker than the conditional full support condition (CFS). We use the new condition to show the existence of CPSs for certain processes that fail to have the CFS property. In particular this condition...
Dynamic phonon exchange requires consistent dressing
International Nuclear Information System (INIS)
Hahne, F.J.W.; Engelbrecht, C.A.; Heiss, W.D.
1976-01-01
It is shown that states with undesirable properties (such as ghosts, states with complex eigenenergies and states with unrestricted normalization) emerge from two-body calculations using dynamic effective interactions if one is not careful in introducing single-particle self-energy insertions in a consistent manner
Consistent feeding positions of great tit parents
Lessells, C.M.; Poelman, E.H.; Mateman, A.C.; Cassey, Ph.
2006-01-01
When parent birds arrive at the nest to provision their young, their position on the nest rim may influence which chick or chicks are fed. As a result, the consistency of feeding positions of the individual parents, and the difference in position between the parents, may affect how equitably food is
Consistency of the postulates of special relativity
International Nuclear Information System (INIS)
Gron, O.; Nicola, M.
1976-01-01
In a recent article in this journal, Kingsley has tried to show that the postulates of special relativity contradict each other. It is shown that the arguments of Kingsley are invalid because of an erroneous appeal to symmetry in a nonsymmetric situation. The consistency of the postulates of special relativity and the relativistic kinematics deduced from them is restated
Consistency of Network Traffic Repositories: An Overview
Lastdrager, E.; Lastdrager, E.E.H.; Pras, Aiko
2009-01-01
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given regarding the consistency of these repositories. Still, for
Consistency analysis of network traffic repositories
Lastdrager, Elmer; Lastdrager, E.E.H.; Pras, Aiko
Traffic repositories with TCP/IP header information are very important for network analysis. Researchers often assume that such repositories reliably represent all traffic that has been flowing over the network; little thought is given regarding the consistency of these repositories. Still, for
A consistent interpretation of quantum mechanics
International Nuclear Information System (INIS)
Omnes, Roland
1990-01-01
Some mostly recent theoretical and mathematical advances can be linked together to yield a new consistent interpretation of quantum mechanics. It relies upon a unique and universal interpretative rule of a logical character which is based upon Griffiths' consistent histories. Some new results in semi-classical physics allow classical physics to be derived from this rule, including its logical aspects, and to prove accordingly the existence of determinism within the quantum framework. Together with decoherence, this can be used to retrieve the existence of facts, despite the probabilistic character of the theory. Measurement theory can then be made entirely deductive. It is accordingly found that wave packet reduction is a logical property, whereas one can always choose to avoid using it. The practical consequences of this interpretation are most often in agreement with the Copenhagen formulation but they can be proved never to give rise to any logical inconsistency or paradox. (author)
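For orientation, the consistency condition at the heart of the histories formalism is usually written via the decoherence functional; the standard form is supplied here as background, not quoted from the abstract:

```latex
% For histories \alpha built from time-ordered projector chains
% C_\alpha = P^{(n)}_{\alpha_n}(t_n) \cdots P^{(1)}_{\alpha_1}(t_1),
% the decoherence functional on the state \rho is
D(\alpha, \beta) = \operatorname{Tr}\!\left[ C_\alpha \, \rho \, C_\beta^{\dagger} \right].
% A family of histories is consistent when interference terms vanish,
\operatorname{Re} D(\alpha, \beta) = 0 \qquad (\alpha \neq \beta),
% in which case p(\alpha) = D(\alpha, \alpha) defines additive probabilities.
```

The generalization described in the abstract replaces the projector chains with the effects and operations of the operational formulation while retaining a condition of this shape.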
Self-consistency in Capital Markets
Benbrahim, Hamid
2013-03-01
Capital Markets are considered, at least in theory, information engines whereby traders contribute to price formation with their diverse perspectives. Regardless of whether one believes in efficient market theory or not, actions by individual traders influence prices of securities, which in turn influence actions by other traders. This influence is exerted through a number of mechanisms including portfolio balancing, margin maintenance, trend following, and sentiment. As a result market behaviors emerge from a number of mechanisms ranging from self-consistency due to wisdom of the crowds and self-fulfilling prophecies, to more chaotic behavior resulting from dynamics similar to the three body system, namely the interplay between equities, options, and futures. This talk will address questions and findings regarding the search for self-consistency in capital markets.
Student Effort, Consistency and Online Performance
Directory of Open Access Journals (Sweden)
Hilde Patron
2011-07-01
Full Text Available This paper examines how student effort, consistency, motivation, and marginal learning influence student grades in an online course. We use data from eleven Microeconomics courses taught online for a total of 212 students. Our findings show that consistency, or less time variation, is a statistically significant explanatory variable, whereas effort, or total minutes spent online, is not. Other independent variables include GPA and the difference between a pre-test and a post-test. The GPA is used as a measure of motivation, and the difference between a post-test and pre-test as marginal learning. As expected, the level of motivation is found statistically significant at a 99% confidence level, and marginal learning is also significant at a 95% level.
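A hedged sketch of the kind of grade regression the abstract describes; all variable names, coefficient values and simulated data below are invented for illustration and are not taken from the study:

```python
import numpy as np

def fit_ols(X, y):
    """Ordinary least squares; returns [intercept, slopes...]."""
    A = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(A, y, rcond=None)
    return beta

# Hypothetical predictors: total minutes online (effort), std. dev. of
# session times (inconsistency), GPA (motivation), post-minus-pre test gap.
rng = np.random.default_rng(0)
n = 212
effort = rng.uniform(100, 2000, n)
inconsistency = rng.uniform(0, 50, n)
gpa = rng.uniform(1.5, 4.0, n)
learning = rng.normal(5, 2, n)
# Simulated grades: consistency and motivation matter, raw effort does not.
grade = 60 - 0.3 * inconsistency + 8 * gpa + 1.0 * learning + rng.normal(0, 3, n)
beta = fit_ols(np.column_stack([effort, inconsistency, gpa, learning]), grade)
```

In this simulated setup, the fit recovers a near-zero coefficient on raw effort but sizeable coefficients on consistency and motivation, mirroring the qualitative pattern the abstract reports.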
Consistent thermodynamic properties of lipids systems
DEFF Research Database (Denmark)
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Physical and thermodynamic properties of pure components and their mixtures are the basic requirement for process design, simulation, and optimization. In the case of lipids, our previous works [1-3] have indicated a lack of experimental data for pure components and also for their mixtures ... different pressures, with azeotrope behavior observed. Available thermodynamic consistency tests for TPx data were applied before performing parameter regressions for the Wilson, NRTL, UNIQUAC and original UNIFAC models. For solid-liquid equilibrium (SLE) data, new consistency tests have been developed [2]. Some of the developed tests were based on the quality tests proposed for VLE data ... The relevance of enlarging the experimental databank of lipids systems data in order to improve the performance of predictive thermodynamic models was confirmed in this work by analyzing the calculated values of the original UNIFAC model.
Consistency relation for cosmic magnetic fields
DEFF Research Database (Denmark)
Jain, R. K.; Sloth, M. S.
2012-01-01
If cosmic magnetic fields are indeed produced during inflation, they are likely to be correlated with the scalar metric perturbations that are responsible for the cosmic microwave background anisotropies and large scale structure. Within an archetypical model of inflationary magnetogenesis, we show that there exists a new simple consistency relation for the non-Gaussian cross correlation function of the scalar metric perturbation with two powers of the magnetic field in the squeezed limit where the momentum of the metric perturbation vanishes. We emphasize that such a consistency relation turns out to be extremely useful to test some recent calculations in the literature. Apart from primordial non-Gaussianity induced by the curvature perturbations, such a cross correlation might provide a new observational probe of inflation and can in principle reveal the primordial nature of cosmic magnetic fields.
Consistent Estimation of Partition Markov Models
Directory of Open Access Journals (Sweden)
Jesús E. García
2017-04-01
Full Text Available The Partition Markov Model characterizes the process by a partition L of the state space, where the elements in each part of L share the same transition probability to an arbitrary element in the alphabet. This model aims to answer the following questions: what is the minimal number of parameters needed to specify a Markov chain, and how can these parameters be estimated? In order to answer these questions, we build a consistent strategy for model selection which consists of the following: given a size-n realization of the process, find a model within the Partition Markov class with a minimal number of parts to represent the process law. From this strategy, we derive a measure that establishes a metric in the state space. In addition, we show that if the law of the process is Markovian then, eventually, as n goes to infinity, L will be retrieved. We show an application to modeling internet navigation patterns.
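The partitioning idea above can be illustrated with a naive sketch: estimate empirical transition rows from a realization and merge states whose rows are close. The total-variation threshold and greedy grouping below are simplifying assumptions for illustration, not the authors' consistent estimator:

```python
import numpy as np

def transition_counts(seq, alphabet):
    """Count observed transitions s -> t in a realization of the chain."""
    idx = {s: i for i, s in enumerate(alphabet)}
    C = np.zeros((len(alphabet), len(alphabet)))
    for a, b in zip(seq, seq[1:]):
        C[idx[a], idx[b]] += 1
    return C

def partition_states(seq, alphabet, tol=0.1):
    """Greedily group states whose empirical transition rows are within
    `tol` in total-variation distance -- a stand-in for finding the
    partition L with a minimal number of parts."""
    C = transition_counts(seq, alphabet)
    P = C / np.maximum(C.sum(axis=1, keepdims=True), 1)
    parts = []
    for i, s in enumerate(alphabet):
        for part in parts:
            j = alphabet.index(part[0])
            if 0.5 * np.abs(P[i] - P[j]).sum() < tol:
                part.append(s)
                break
        else:
            parts.append([s])
    return parts
```

On a chain where two states share the same transition law, the sketch recovers the two-part partition once the realization is long enough.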
Internal Branding and Employee Brand Consistent Behaviours
DEFF Research Database (Denmark)
Mazzei, Alessandra; Ravazzani, Silvia
2017-01-01
Employee behaviours conveying brand values, named brand consistent behaviours, affect the overall brand evaluation. Internal branding literature highlights a knowledge gap in terms of communication practices intended to sustain such behaviours. This study contributes to the development of a non ... constitutive processes. In particular, the paper places emphasis on the role and kinds of communication practices as a central part of the nonnormative and constitutive internal branding process. The paper also discusses an empirical study based on interviews with 32 Italian and American communication managers and 2 focus groups with Italian communication managers. Findings show that, in order to enhance employee brand consistent behaviours, the most effective communication practices are those characterised as enablement-oriented. Such a communication creates the organizational conditions adequate to sustain ...
Self-consistent velocity dependent effective interactions
International Nuclear Information System (INIS)
Kubo, Takayuki; Sakamoto, Hideo; Kammuri, Tetsuo; Kishimoto, Teruo.
1993-09-01
The field coupling method is extended to a system with a velocity dependent mean potential. By means of this method, we can derive the effective interactions which are consistent with the mean potential. The self-consistent velocity dependent effective interactions are applied to the microscopic analysis of the structures of giant dipole resonances (GDR) of ¹⁴⁸,¹⁵⁴Sm, of the first excited 2⁺ states of Sn isotopes, and of the first excited 3⁻ states of Mo isotopes. It is clarified that the interactions play crucial roles in describing the splitting of the resonant structure of GDR peaks, in restoring the energy weighted sum rule values, and in reducing B(Eλ) values. (author)
Cloud Standardization: Consistent Business Processes and Information
Directory of Open Access Journals (Sweden)
Razvan Daniel ZOTA
2013-01-01
Full Text Available Cloud computing represents one of the latest emerging trends in distributed computing that enables the existence of hardware infrastructure and software applications as services. The present paper offers a general approach to cloud computing standardization as a means of improving the speed of adoption for cloud technologies. Moreover, this study tries to show how organizations may achieve more consistent business processes while operating with cloud computing technologies.
Consistency Analysis of Nearest Subspace Classifier
Wang, Yi
2015-01-01
The nearest subspace classifier (NSS) estimates the underlying subspace within each class and assigns each data point to the class corresponding to its nearest subspace. This paper mainly studies how well NSS generalizes to new samples. It is proved that NSS is strongly consistent under certain assumptions. For completeness, NSS is evaluated through experiments on various simulated and real data sets, in comparison with some other linear model based classifiers. It is also ...
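A minimal sketch of the NSS idea described above, assuming linear subspaces through the origin fitted per class by SVD (the paper's exact formulation and consistency conditions may differ):

```python
import numpy as np

class NearestSubspace:
    """Nearest subspace classifier: fit a d-dimensional linear subspace
    per class, then assign points to the class whose subspace gives the
    smallest projection residual."""

    def __init__(self, dim=1):
        self.dim = dim
        self.bases = {}

    def fit(self, X, y):
        for c in np.unique(y):
            # Orthonormal basis from the top `dim` right singular vectors.
            _, _, Vt = np.linalg.svd(X[y == c], full_matrices=False)
            self.bases[c] = Vt[: self.dim].T  # shape (n_features, dim)
        return self

    def predict(self, X):
        labels = list(self.bases)
        # Residual norm of projecting each point onto each class subspace.
        res = np.stack([
            np.linalg.norm(X - X @ B @ B.T, axis=1)
            for B in (self.bases[c] for c in labels)
        ])
        return np.array(labels)[res.argmin(axis=0)]
```

On two classes concentrated near orthogonal lines, the classifier separates them almost perfectly, which is the regime where strong consistency is intuitively plausible.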
Consistency relations in effective field theory
Energy Technology Data Exchange (ETDEWEB)
Munshi, Dipak; Regan, Donough, E-mail: D.Munshi@sussex.ac.uk, E-mail: D.Regan@sussex.ac.uk [Astronomy Centre, School of Mathematical and Physical Sciences, University of Sussex, Brighton BN1 9QH (United Kingdom)
2017-06-01
The consistency relations in large scale structure relate lower-order correlation functions to their higher-order counterparts. They are a direct outcome of the underlying symmetries of a dynamical system and can be tested using data from future surveys such as Euclid. Using techniques from standard perturbation theory (SPT), previous studies of consistency relations have concentrated on the continuity-momentum (Euler)-Poisson system of an ideal fluid. We investigate the consistency relations in effective field theory (EFT), which adjusts the SPT predictions to account for the departure from the ideal fluid description on small scales. We provide detailed results for the 3D density contrast δ as well as the scaled divergence of velocity θ̄. Assuming a ΛCDM background cosmology, we find that the correction to SPT results becomes important at k ≳ 0.05 h/Mpc and that the suppression from EFT to SPT results, which scales as the square of the wavenumber k, can reach 40% of the total at k ≈ 0.25 h/Mpc at z = 0. We have also investigated whether effective field theory corrections to models of primordial non-Gaussianity can alter the squeezed limit behaviour, finding the results to be rather insensitive to these counterterms. In addition, we present the EFT corrections to the squeezed limit of the bispectrum in redshift space, which may be of interest for tests of theories of modified gravity.
Consistent probabilities in loop quantum cosmology
International Nuclear Information System (INIS)
Craig, David A; Singh, Parampreet
2013-01-01
A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)
Orthology and paralogy constraints: satisfiability and consistency.
Lafond, Manuel; El-Mabrouk, Nadia
2014-01-01
A variety of methods based on sequence similarity, reconciliation, synteny or functional characteristics can be used to infer orthology and paralogy relations between genes of a given gene family G. But is a given set C of orthology/paralogy constraints possible, i.e., can they simultaneously co-exist in an evolutionary history for G? While previous studies have focused on full sets of constraints, here we consider the general case where C does not necessarily involve a constraint for each pair of genes. The problem is subdivided into two parts: (1) Is C satisfiable, i.e. can we find an event-labeled gene tree G inducing C? (2) Is there such a G which is consistent, i.e., such that all displayed triplet phylogenies are included in a species tree? Previous results on the Graph sandwich problem can be used to answer (1), and we provide polynomial-time algorithms for satisfiability and consistency with a given species tree. We also describe a new polynomial-time algorithm for the case of consistency with an unknown species tree and full knowledge of pairwise orthology/paralogy relationships, as well as a branch-and-bound algorithm for the case when unknown relations are present. We show that our algorithms can be used in combination with ProteinOrtho, a sequence similarity-based orthology detection tool, to extract a set of robust orthology/paralogy relationships.
Consistency of color representation in smart phones.
Dain, Stephen J; Kwan, Benjamin; Wong, Leslie
2016-03-01
One of the barriers to the construction of consistent computer-based color vision tests has been the variety of monitors and computers. Consistency of color on a variety of screens has necessitated calibration of each setup individually. Color vision examination with a carefully controlled display has, as a consequence, been a laboratory rather than a clinical activity. Inevitably, smart phones have become a vehicle for color vision tests. They have the advantage that the processor and screen are associated and there are fewer models of smart phones than permutations of computers and monitors. Colorimetric consistency of display within a model may be a given. It may extend across models from the same manufacturer but is unlikely to extend between manufacturers especially where technologies vary. In this study, we measured the same set of colors in a JPEG file displayed on 11 samples of each of four models of smart phone (iPhone 4s, iPhone5, Samsung Galaxy S3, and Samsung Galaxy S4) using a Photo Research PR-730. The iPhones are white LED backlit LCD and the Samsung are OLEDs. The color gamut varies between models and comparison with sRGB space shows 61%, 85%, 117%, and 110%, respectively. The iPhones differ markedly from the Samsungs and from one another. This indicates that model-specific color lookup tables will be needed. Within each model, the primaries were quite consistent (despite the age of phone varying within each sample). The worst case in each model was the blue primary; the 95th percentile limits in the v' coordinate were ±0.008 for the iPhone 4 and ±0.004 for the other three models. The u'v' variation in white points was ±0.004 for the iPhone4 and ±0.002 for the others, although the spread of white points between models was u'v'±0.007. The differences are essentially the same for primaries at low luminance. The variation of colors intermediate between the primaries (e.g., red-purple, orange) mirror the variation in the primaries. The variation in
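The u'v' coordinates quoted above are the CIE 1976 uniform chromaticity coordinates, computed from tristimulus values XYZ as u' = 4X/(X + 15Y + 3Z) and v' = 9Y/(X + 15Y + 3Z). A small sketch of the quantities used to report the display spread:

```python
def xyz_to_uv_prime(X, Y, Z):
    """CIE 1976 u'v' chromaticity coordinates from tristimulus XYZ."""
    d = X + 15.0 * Y + 3.0 * Z
    return 4.0 * X / d, 9.0 * Y / d

def uv_distance(uv1, uv2):
    """Euclidean distance in the u'v' plane (Delta u'v'), the metric in
    which the abstract reports white-point and primary variation."""
    return ((uv1[0] - uv2[0]) ** 2 + (uv1[1] - uv2[1]) ** 2) ** 0.5
```

For the D65 white point (X, Y, Z ≈ 0.9505, 1.0, 1.089) this yields u' ≈ 0.198, v' ≈ 0.468, so spreads of ±0.002 to ±0.007 as reported above are small but non-negligible fractions of typical color-difference tolerances.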
Crespin, Daniel J; Christianson, Jon B; McCullough, Jeffrey S; Finch, Michael D
This study addresses whether health systems have consistent diabetes care performance across their ambulatory clinics and whether increasing consistency is associated with improvements in clinic performance. Study data included 2007 to 2013 diabetes care intermediate outcome measures for 661 ambulatory clinics in Minnesota and bordering states. Health systems provided more consistent performance, as measured by the standard deviation of performance for clinics in a system, relative to propensity score-matched proxy systems created for comparison purposes. No evidence was found that improvements in consistency were associated with higher clinic performance. The combination of high performance and consistent care is likely to enhance a health system's brand reputation, allowing it to better mitigate the financial risks of consumers seeking care outside the organization. These results suggest that larger health systems are most likely to deliver the combination of consistent and high-performance care. Future research should explore the mechanisms that drive consistent care within health systems.
Structural covariance networks across healthy young adults and their consistency.
Guo, Xiaojuan; Wang, Yan; Guo, Taomei; Chen, Kewei; Zhang, Jiacai; Li, Ke; Jin, Zhen; Yao, Li
2015-08-01
To investigate structural covariance networks (SCNs) as measured by regional gray matter volumes with structural magnetic resonance imaging (MRI) from healthy young adults, and to examine their consistency and stability. Two independent cohorts were included in this study: Group 1 (82 healthy subjects aged 18-28 years) and Group 2 (109 healthy subjects aged 20-28 years). Structural MRI data were acquired at 3.0T and 1.5T using a magnetization prepared rapid-acquisition gradient echo sequence for these two groups, respectively. We applied independent component analysis (ICA) to construct SCNs and further applied the spatial overlap ratio and correlation coefficient to evaluate the spatial consistency of the SCNs between these two datasets. Seven and six independent components were identified for Group 1 and Group 2, respectively. Moreover, six SCNs including the posterior default mode network, the visual and auditory networks consistently existed across the two datasets. The overlap ratios and correlation coefficients of the visual network reached the maximums of 72% and 0.71. This study demonstrates the existence of consistent SCNs corresponding to general functional networks. These structural covariance findings may provide insight into the underlying organizational principles of brain anatomy. © 2014 Wiley Periodicals, Inc.
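The spatial overlap ratio used above is not fully specified in the abstract; a common Dice-style variant on binarized component maps can be sketched as follows (the binarization and the exact ratio definition are assumptions for illustration):

```python
import numpy as np

def overlap_ratio(map1, map2):
    """Dice-style spatial overlap between two binarized component maps:
    2|A ∩ B| / (|A| + |B|), ranging from 0 (disjoint) to 1 (identical)."""
    a = np.asarray(map1, bool)
    b = np.asarray(map2, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 0.0
```

Applied voxel-wise to thresholded ICA component maps from the two cohorts, values near the reported 72% would indicate substantial spatial agreement.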
Evaluating Temporal Consistency in Marine Biodiversity Hotspots.
Piacenza, Susan E; Thurman, Lindsey L; Barner, Allison K; Benkwitt, Cassandra E; Boersma, Kate S; Cerny-Chipman, Elizabeth B; Ingeman, Kurt E; Kindinger, Tye L; Lindsley, Amy J; Nelson, Jake; Reimer, Jessica N; Rowe, Jennifer C; Shen, Chenchen; Thompson, Kevin A; Heppell, Selina S
2015-01-01
With the ongoing crisis of biodiversity loss and limited resources for conservation, the concept of biodiversity hotspots has been useful in determining conservation priority areas. However, there has been limited research into how temporal variability in biodiversity may influence conservation area prioritization. To address this information gap, we present an approach to evaluate the temporal consistency of biodiversity hotspots in large marine ecosystems. Using a large scale, public monitoring dataset collected over an eight year period off the US Pacific Coast, we developed a methodological approach for avoiding biases associated with hotspot delineation. We aggregated benthic fish species data from research trawls and calculated mean hotspot thresholds for fish species richness and Shannon's diversity indices over the eight year dataset. We used a spatial frequency distribution method to assign hotspot designations to the grid cells annually. We found no areas containing consistently high biodiversity through the entire study period based on the mean thresholds, and no grid cell was designated as a hotspot for greater than 50% of the time-series. To test if our approach was sensitive to sampling effort and the geographic extent of the survey, we followed a similar routine for the northern region of the survey area. Our finding of low consistency in benthic fish biodiversity hotspots over time was upheld, regardless of biodiversity metric used, whether thresholds were calculated per year or across all years, or the spatial extent for which we calculated thresholds and identified hotspots. Our results suggest that static measures of benthic fish biodiversity off the US West Coast are insufficient for identification of hotspots and that long-term data are required to appropriately identify patterns of high temporal variability in biodiversity for these highly mobile taxa. Given that ecological communities are responding to a changing climate and other
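The diversity metric and hotspot thresholding described above can be sketched as follows; the 90th-percentile cutoff is an illustrative assumption, not the study's exact mean-threshold procedure:

```python
import numpy as np

def shannon_diversity(counts):
    """Shannon's H' = -sum p_i ln p_i over species counts in one grid cell."""
    c = np.asarray(counts, dtype=float)
    p = c[c > 0] / c.sum()
    return float(-(p * np.log(p)).sum())

def hotspot_cells(values, upper_quantile=0.9):
    """Flag cells whose diversity falls in the upper tail of the spatial
    frequency distribution -- one common way to delineate hotspots."""
    v = np.asarray(values, dtype=float)
    return v >= np.quantile(v, upper_quantile)
```

Repeating the flagging per survey year and counting how often each cell is flagged gives the temporal-consistency measure the abstract evaluates (no cell exceeding 50% of the time series in their data).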
Self-consistent gravitational self-force
International Nuclear Information System (INIS)
Pound, Adam
2010-01-01
I review the problem of motion for small bodies in general relativity, with an emphasis on developing a self-consistent treatment of the gravitational self-force. An analysis of the various derivations extant in the literature leads me to formulate an asymptotic expansion in which the metric is expanded while a representative worldline is held fixed. I discuss the utility of this expansion for both exact point particles and asymptotically small bodies, contrasting it with a regular expansion in which both the metric and the worldline are expanded. Based on these preliminary analyses, I present a general method of deriving self-consistent equations of motion for arbitrarily structured (sufficiently compact) small bodies. My method utilizes two expansions: an inner expansion that keeps the size of the body fixed, and an outer expansion that lets the body shrink while holding its worldline fixed. By imposing the Lorenz gauge, I express the global solution to the Einstein equation in the outer expansion in terms of an integral over a worldtube of small radius surrounding the body. Appropriate boundary data on the tube are determined from a local-in-space expansion in a buffer region where both the inner and outer expansions are valid. This buffer-region expansion also results in an expression for the self-force in terms of irreducible pieces of the metric perturbation on the worldline. Based on the global solution, these pieces of the perturbation can be written in terms of a tail integral over the body's past history. This approach can be applied at any order to obtain a self-consistent approximation that is valid on long time scales, both near and far from the small body. I conclude by discussing possible extensions of my method and comparing it to alternative approaches.
Consistency Checking of Web Service Contracts
DEFF Research Database (Denmark)
Cambronero, M. Emilia; Okika, Joseph C.; Ravn, Anders Peter
2008-01-01
Behavioural properties are analyzed for web service contracts formulated in Business Process Execution Language (BPEL) and Choreography Description Language (CDL). The key result reported is an automated technique to check consistency between protocol aspects of the contracts. The contracts are abstracted to (timed) automata and from there a simulation is set up, which is checked using automated tools for analyzing networks of finite state processes. Here we use the Concurrency Workbench. The proposed techniques are illustrated with a case study that includes otherwise difficult to analyze fault ...
A method for consistent precision radiation therapy
International Nuclear Information System (INIS)
Leong, J.
1985-01-01
Using a meticulous setup procedure in which repeated portal films were taken before each treatment until satisfactory portal verifications were obtained, a high degree of precision in patient positioning was achieved. A fluctuation from treatment to treatment, over 11 treatments, of less than ±0.10 cm (S.D.) for anatomical points inside the treatment field was obtained. This, however, only applies to specific anatomical points selected for this positioning procedure and does not apply to all points within the portal. We have generalized this procedure and have suggested a means by which any target volume can be consistently positioned, which may approach this degree of precision. (orig.)
Gentzen's centenary the quest for consistency
Rathjen, Michael
2015-01-01
Gerhard Gentzen has been described as logic’s lost genius, whom Gödel called a better logician than himself. This work comprises articles by leading proof theorists, attesting to Gentzen’s enduring legacy to mathematical logic and beyond. The contributions range from philosophical reflections and re-evaluations of Gentzen’s original consistency proofs to the most recent developments in proof theory. Gentzen founded modern proof theory. His sequent calculus and natural deduction system beautifully explain the deep symmetries of logic. They underlie modern developments in computer science such as automated theorem proving and type theory.
Two consistent calculations of the Weinberg angle
International Nuclear Information System (INIS)
Fairlie, D.B.
1979-01-01
The Weinberg-Salam theory is reformulated as a pure Yang-Mills theory in a six-dimensional space, the Higgs field being interpreted as gauge potentials in the additional dimensions. Viewed in this way, the condition that the Higgs field transforms as a U(1) representation of charge one is equivalent to requiring a value of 30° for the Weinberg angle. A second consistent determination comes from the idea, borrowed from monopole theory, that the electromagnetic field is in the direction of the Higgs field. (Author)
Consistent resolution of some relativistic quantum paradoxes
International Nuclear Information System (INIS)
Griffiths, Robert B.
2002-01-01
A relativistic version of the (consistent or decoherent) histories approach to quantum theory is developed on the basis of earlier work by Hartle, and used to discuss relativistic forms of the paradoxes of spherical wave packet collapse, Bohm's formulation of the Einstein-Podolsky-Rosen paradox, and Hardy's paradox. It is argued that wave function collapse is not needed for introducing probabilities into relativistic quantum mechanics, and in any case should never be thought of as a physical process. Alternative approaches to stochastic time dependence can be used to construct a physical picture of the measurement process that is less misleading than collapse models. In particular, one can employ a coarse-grained but fully quantum-mechanical description in which particles move along trajectories, with behavior under Lorentz transformations the same as in classical relativistic physics, and detectors are triggered by particles reaching them along such trajectories. States entangled between spacelike separate regions are also legitimate quantum descriptions, and can be consistently handled by the formalism presented here. The paradoxes in question arise because of using modes of reasoning which, while correct for classical physics, are inconsistent with the mathematical structure of quantum theory, and are resolved (or tamed) by using a proper quantum analysis. In particular, there is no need to invoke, nor any evidence for, mysterious long-range superluminal influences, and thus no incompatibility, at least from this source, between relativity theory and quantum mechanics
Self-consistent model of confinement
International Nuclear Information System (INIS)
Swift, A.R.
1988-01-01
A model of the large-spatial-distance, zero three-momentum limit of QCD is developed from the hypothesis that there is an infrared singularity. Single quarks and gluons do not propagate because they have infinite energy after renormalization. The Hamiltonian formulation of the path integral is used to quantize QCD with physical, nonpropagating fields. Perturbation theory in the infrared limit is simplified by the absence of self-energy insertions and by the suppression of large classes of diagrams due to vanishing propagators. Remaining terms in the perturbation series are resummed to produce a set of nonlinear, renormalizable integral equations which fix both the confining interaction and the physical propagators. Solutions demonstrate the self-consistency of the concepts of an infrared singularity and nonpropagating fields. The Wilson loop is calculated to provide a general proof of confinement. Bethe-Salpeter equations for quark-antiquark pairs and for two gluons have finite-energy solutions in the color-singlet channel. The choice of gauge is addressed in detail. Large classes of corrections to the model are discussed and shown to support self-consistency.
Subgame consistent cooperation a comprehensive treatise
Yeung, David W K
2016-01-01
Strategic behavior in the human and social world has been increasingly recognized in theory and practice. It is well known that non-cooperative behavior can lead to suboptimal or even highly undesirable outcomes. Cooperation suggests the possibility of obtaining socially optimal solutions, and calls for cooperation are prevalent in real-life problems. Dynamic cooperation cannot be sustainable if there is no guarantee that the optimality principle agreed upon at the beginning is maintained throughout the duration of cooperation. It is due to the lack of such guarantees that cooperative schemes fail to last until their end or even fail to get started. The property of subgame consistency in cooperative dynamic games and the corresponding solution mechanism resolve this "classic" problem in game theory. This book is a comprehensive treatise on subgame consistent dynamic cooperation, covering up-to-date, state-of-the-art analyses of this important topic. It sets out to provide the theory, solution tec...
Sludge characterization: the role of physical consistency
Energy Technology Data Exchange (ETDEWEB)
Spinosa, Ludovico; Wichmann, Knut
2003-07-01
The physical consistency is an important parameter in sewage sludge characterization, as it strongly affects almost all treatment, utilization and disposal operations. In addition, many European Directives refer to the physical consistency as a characteristic to be evaluated for fulfilling the regulations' requirements. Further, many analytical methods for sludge indicate different procedures depending on whether a sample is liquid or not, or solid or not. Three physical behaviours (liquid, paste-like and solid) can be observed with sludges, so the development of analytical procedures to define the boundary limit between liquid and paste-like behaviours (flowability) and that between solid and paste-like ones (solidity) is of growing interest. Several devices can be used for evaluating the flowability and solidity properties, but they are often costly and difficult to operate in the field. Tests have been carried out to evaluate the possibility of adopting a simple extrusion procedure for flowability measurements, and a Vicat needle for solidity ones. (author)
Consistent mutational paths predict eukaryotic thermostability
Directory of Open Access Journals (Sweden)
van Noort Vera
2013-01-01
Full Text Available Abstract Background Proteomes of thermophilic prokaryotes have been instrumental in structural biology and successfully exploited in biotechnology; however, many proteins required for eukaryotic cell function are absent from bacteria or archaea. With Chaetomium thermophilum, Thielavia terrestris and Thielavia heterothallica, three genome sequences of thermophilic eukaryotes have been published. Results Studying the genomes and proteomes of these thermophilic fungi, we found common strategies of thermal adaptation across the different kingdoms of Life, including amino acid biases and a reduced genome size. A phylogenetics-guided comparison of thermophilic proteomes with those of other, mesophilic Sordariomycetes revealed consistent amino acid substitutions associated with thermophily that were also present in an independent lineage of thermophilic fungi. The most consistent pattern is the substitution of lysine by arginine, which we could find in almost all lineages but which has not been extensively used in protein stability engineering. By exploiting mutational paths towards the thermophiles, we could predict particular amino acid residues in individual proteins that contribute to thermostability, and validated some of them experimentally. By determining the three-dimensional structure of an exemplar protein from C. thermophilum (Arx1), we could also characterise the molecular consequences of some of these mutations. Conclusions The comparative analysis of these three genomes not only enhances our understanding of the evolution of thermophily, but also provides new ways to engineer protein stability.
Consistency of extreme flood estimation approaches
Felder, Guido; Paquet, Emmanuel; Penot, David; Zischg, Andreas; Weingartner, Rolf
2017-04-01
Estimations of low-probability flood events are frequently used for the planning of infrastructure as well as for determining the dimensions of flood protection measures. There are several well-established methodological procedures for estimating low-probability floods. However, a global assessment of the consistency of these methods is difficult to achieve, the "true value" of an extreme flood not being observable. Nevertheless, a detailed comparison performed on a given case study brings useful information about the statistical and hydrological processes involved in different methods. In this study, the following three approaches for estimating low-probability floods are compared: a purely statistical approach (ordinary extreme value statistics), a statistical approach based on stochastic rainfall-runoff simulation (SCHADEX method), and a deterministic approach (physically based PMF estimation). These methods are tested on two different Swiss catchments. The results and some intermediate variables are used to assess potential strengths and weaknesses of each method, as well as to evaluate the consistency of these methods.
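The "ordinary extreme value statistics" approach can be illustrated with a minimal method-of-moments Gumbel fit to an annual-maxima series; the synthetic discharge record and the 100-year return period below are illustrative assumptions, not data from the study.

```python
import numpy as np

def gumbel_quantile(annual_maxima, return_period):
    """Method-of-moments Gumbel fit; returns the T-year flood estimate."""
    x = np.asarray(annual_maxima, dtype=float)
    beta = x.std(ddof=1) * np.sqrt(6.0) / np.pi   # scale parameter
    mu = x.mean() - 0.5772 * beta                  # location (Euler-Mascheroni constant)
    p = 1.0 - 1.0 / return_period                  # non-exceedance probability
    return mu - beta * np.log(-np.log(p))

rng = np.random.default_rng(0)
maxima = 300 + 80 * rng.gumbel(size=50)   # synthetic 50-year record (m^3/s)
q100 = gumbel_quantile(maxima, 100.0)     # 100-year flood estimate
```

A purely statistical estimate like this extrapolates far beyond the observed record, which is exactly why the study cross-checks it against simulation-based and deterministic methods.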
Consistent biokinetic models for the actinide elements
International Nuclear Information System (INIS)
Leggett, R.W.
2001-01-01
The biokinetic models for Th, Np, Pu, Am and Cm currently recommended by the International Commission on Radiological Protection (ICRP) were developed within a generic framework that depicts gradual burial of skeletal activity in bone volume, depicts recycling of activity released to blood and links excretion to retention and translocation of activity. For other actinide elements such as Ac, Pa, Bk, Cf and Es, the ICRP still uses simplistic retention models that assign all skeletal activity to bone surface and depicts one-directional flow of activity from blood to long-term depositories to excreta. This mixture of updated and older models in ICRP documents has led to inconsistencies in dose estimates and interpretation of bioassay for radionuclides with reasonably similar biokinetics. This paper proposes new biokinetic models for Ac, Pa, Bk, Cf and Es that are consistent with the updated models for Th, Np, Pu, Am and Cm. The proposed models are developed within the ICRP's generic model framework for bone-surface-seeking radionuclides, and an effort has been made to develop parameter values that are consistent with results of comparative biokinetic data on the different actinide elements. (author)
Multi-view 3D echocardiography compounding based on feature consistency
Yao, Cheng; Simpson, John M.; Schaeffter, Tobias; Penney, Graeme P.
2011-09-01
Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.
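The coherence-based weighting described above can be sketched in a few lines of numpy; the inverse-deviation weight below is an illustrative stand-in for the authors' actual local feature-consistency measure, and the images are synthetic.

```python
import numpy as np

def coherence_compound(views, eps=1e-6):
    """Compound co-registered views, down-weighting pixels that disagree.

    views: array of shape (n_views, H, W). Each pixel's weight falls off
    with its absolute deviation from the median across views, so artefacts
    present in only a few views are suppressed.
    """
    v = np.asarray(views, dtype=float)
    med = np.median(v, axis=0)                 # robust per-pixel consensus
    w = 1.0 / (eps + np.abs(v - med))          # agreement -> large weight
    return (w * v).sum(axis=0) / w.sum(axis=0)

# Three noisy copies of a gradient image; one view carries a bright artefact.
rng = np.random.default_rng(1)
base = np.tile(np.linspace(0.0, 1.0, 8), (8, 1))
views = base + 0.01 * rng.standard_normal((3, 8, 8))
views[0, 2:4, 2:4] += 5.0                      # shadowing-like artefact in view 0
fused = coherence_compound(views)
```

Unlike a plain mean (which smears the artefact into the result) or a maximum (which keeps it entirely), the consistency weighting suppresses intensity values that the other views do not corroborate.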
Multi-view 3D echocardiography compounding based on feature consistency
International Nuclear Information System (INIS)
Yao Cheng; Schaeffter, Tobias; Penney, Graeme P; Simpson, John M
2011-01-01
Echocardiography (echo) is a widely available method to obtain images of the heart; however, echo can suffer due to the presence of artefacts, high noise and a restricted field of view. One method to overcome these limitations is to use multiple images, using the 'best' parts from each image to produce a higher quality 'compounded' image. This paper describes our compounding algorithm which specifically aims to reduce the effect of echo artefacts as well as improving the signal-to-noise ratio, contrast and extending the field of view. Our method weights image information based on a local feature coherence/consistency between all the overlapping images. Validation has been carried out using phantom, volunteer and patient datasets consisting of up to ten multi-view 3D images. Multiple sets of phantom images were acquired, some directly from the phantom surface, and others by imaging through hard and soft tissue mimicking material to degrade the image quality. Our compounding method is compared to the original, uncompounded echocardiography images, and to two basic statistical compounding methods (mean and maximum). Results show that our method is able to take a set of ten images, degraded by soft and hard tissue artefacts, and produce a compounded image of equivalent quality to images acquired directly from the phantom. Our method on phantom, volunteer and patient data achieves almost the same signal-to-noise improvement as the mean method, while simultaneously almost achieving the same contrast improvement as the maximum method. We show a statistically significant improvement in image quality by using an increased number of images (ten compared to five), and visual inspection studies by three clinicians showed very strong preference for our compounded volumes in terms of overall high image quality, large field of view, high endocardial border definition and low cavity noise.
Consistency of canonical formulation of Horava gravity
International Nuclear Information System (INIS)
Soo, Chopin
2011-01-01
Both the non-projectable and projectable versions of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints, and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
Consistency of canonical formulation of Horava gravity
Energy Technology Data Exchange (ETDEWEB)
Soo, Chopin, E-mail: cpsoo@mail.ncku.edu.tw [Department of Physics, National Cheng Kung University, Tainan, Taiwan (China)
2011-09-22
Both the non-projectable and projectable versions of Horava gravity face serious challenges. In the non-projectable version, the constraint algebra is seemingly inconsistent. The projectable version lacks a local Hamiltonian constraint, thus allowing for an extra graviton mode which can be problematic. A new formulation (based on arXiv:1007.1563) of Horava gravity which is naturally realized as a representation of the master constraint algebra (instead of the Dirac algebra) studied by loop quantum gravity researchers is presented. This formulation yields a consistent canonical theory with first class constraints, and captures the essence of Horava gravity in retaining only spatial diffeomorphisms as the physically relevant non-trivial gauge symmetry. At the same time the local Hamiltonian constraint is equivalently enforced by the master constraint.
A consistent thermodynamic database for cement minerals
International Nuclear Information System (INIS)
Blanc, P.; Claret, F.; Burnol, A.; Marty, N.; Gaboreau, S.; Tournassat, C.; Gaucher, E.C.; Giffault, E.; Bourbon, X.
2010-01-01
work - the formation enthalpy and the Cp(T) function are taken from the literature or estimated - finally, the log K(T) function is calculated, based on the selected dataset, and compared to experimental data gathered at different temperatures. Each experimental point is extracted from solution compositions by using PHREEQC with a selection of aqueous complexes consistent with the Thermochimie database. The selection was tested notably by drawing activity diagrams, allowing phase relations to be assessed. An example of such a diagram, drawn in the CaO-Al2O3-SiO2-H2O system, is displayed. It can be seen that low-pH concrete alteration proceeds essentially by decreasing the C/S ratio in C-S-H phases to the point where C-S-H are no longer stable and are replaced by zeolite, then clay minerals. This evolution corresponds to a decrease in silica activity, which is consistent with the pH decrease, as silica concentration depends essentially on pH. Rather consistent phase relations have been obtained for the SO3-Al2O3-CaO-CO2-H2O system. Addition of iron(III) enlarges the AFm-SO4 stability field towards the low-temperature domain, whereas it decreases the pH domain where ettringite is stable. On the other hand, the stability field of katoite remains largely ambiguous, notably with respect to a hydrogarnet/grossular solid solution. With respect to other databases, this work was carried out in consistency with a larger mineral selection, so that it can be used for modelling work in the cement-clay interaction context.
Evaluating the hydrological consistency of evaporation products
Lopez Valencia, Oliver Miguel; Houborg, Rasmus; McCabe, Matthew
2017-01-01
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
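The degree-correlation step can be sketched as follows, assuming the fields have already been expanded in spherical harmonics; the random coefficients below merely stand in for the actual evaporation and precipitation harmonics.

```python
import numpy as np

def degree_correlation(c1, c2):
    """Per-degree correlation of two sets of spherical-harmonic coefficients.

    c1, c2: dicts mapping degree l -> array of coefficients over orders m.
    Returns {l: r_l}, where r_l = sum_m(a_lm * b_lm) /
    sqrt(sum_m(a_lm^2) * sum_m(b_lm^2)).
    """
    out = {}
    for l in c1:
        a, b = np.asarray(c1[l], float), np.asarray(c2[l], float)
        out[l] = float(a @ b / np.sqrt((a @ a) * (b @ b)))
    return out

rng = np.random.default_rng(2)
# Hypothetical coefficients for degrees 1..4 (2l+1 orders per degree).
evap = {l: rng.standard_normal(2 * l + 1) for l in range(1, 5)}
self_corr = degree_correlation(evap, evap)   # identical fields -> r_l = 1
```

Comparing fields degree by degree, rather than pixel by pixel, is what lets the study separate agreement at broad spatial scales from disagreement at finer ones.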
Self-consistent modelling of ICRH
International Nuclear Information System (INIS)
Hellsten, T.; Hedin, J.; Johnson, T.; Laxaaback, M.; Tennfors, E.
2001-01-01
The performance of ICRH is often sensitive to the shape of the high energy part of the distribution functions of the resonating species. This requires self-consistent calculations of the distribution functions and the wave-field. In addition to the wave-particle interactions and Coulomb collisions the effects of the finite orbit width and the RF-induced spatial transport are found to be important. The inward drift dominates in general even for a symmetric toroidal wave spectrum in the centre of the plasma. An inward drift does not necessarily produce a more peaked heating profile. On the contrary, for low concentrations of hydrogen minority in deuterium plasmas it can even give rise to broader profiles. (author)
Non linear self consistency of microtearing modes
International Nuclear Information System (INIS)
Garbet, X.; Mourgues, F.; Samain, A.
1987-01-01
The self-consistency of a microtearing turbulence is studied in non-linear regimes where the ergodicity of the flux lines determines the electron response. The current which sustains the magnetic perturbation via Ampere's law results from the combined action of the radial electric field, in the frame where the island chains are static, and of the thermal electron diamagnetism. Numerical calculations show that at usual values of β_pol in tokamaks the turbulence can create a diffusion coefficient of order ν_th ρ_i², where ρ_i is the ion Larmor radius and ν_th the electron-ion collision frequency. On the other hand, collisionless regimes involving special profiles of each mode near the resonant surface seem possible.
Consistent evolution in a pedestrian flow
Guan, Junbiao; Wang, Kaihua
2016-03-01
In this paper, pedestrian evacuation considering different human behaviors is studied by using a cellular automaton (CA) model combined with snowdrift game theory. The evacuees are divided into two types, i.e. cooperators and defectors, and two different human behaviors, herding behavior and independent behavior, are investigated. It is found from a large number of numerical simulations that the ratios of the corresponding evacuee clusters evolve to consistent states despite 11 typically different initial conditions, which may largely be due to a self-organization effect. Moreover, an appropriate proportion of initial defectors who exhibit herding behavior, coupled with an appropriate proportion of initial defectors who think independently and rationally, are two necessary factors for a short evacuation time.
Evaluating the hydrological consistency of evaporation products
Lopez Valencia, Oliver Miguel
2017-01-18
Advances in space-based observations have provided the capacity to develop regional- to global-scale estimates of evaporation, offering insights into this key component of the hydrological cycle. However, the evaluation of large-scale evaporation retrievals is not a straightforward task. While a number of studies have intercompared a range of these evaporation products by examining the variance amongst them, or by comparison of pixel-scale retrievals against ground-based observations, there is a need to explore more appropriate techniques to comprehensively evaluate remote-sensing-based estimates. One possible approach is to establish the level of product agreement between related hydrological components: for instance, how well do evaporation patterns and response match with precipitation or water storage changes? To assess the suitability of this "consistency"-based approach for evaluating evaporation products, we focused our investigation on four globally distributed basins in arid and semi-arid environments, comprising the Colorado River basin, Niger River basin, Aral Sea basin, and Lake Eyre basin. In an effort to assess retrieval quality, three satellite-based global evaporation products based on different methodologies and input data, including CSIRO-PML, the MODIS Global Evapotranspiration product (MOD16), and Global Land Evaporation: the Amsterdam Methodology (GLEAM), were evaluated against rainfall data from the Global Precipitation Climatology Project (GPCP) along with Gravity Recovery and Climate Experiment (GRACE) water storage anomalies. To ensure a fair comparison, we evaluated consistency using a degree correlation approach after transforming both evaporation and precipitation data into spherical harmonics. Overall we found no persistent hydrological consistency in these dryland environments. Indeed, the degree correlation showed oscillating values between periods of low and high water storage changes, with a phase difference of about 2–3 months
Thermodynamically consistent model calibration in chemical kinetics
Directory of Open Access Journals (Sweden)
Goutsias John
2011-05-01
Full Text Available Abstract Background The dynamics of biochemical reaction systems are constrained by the fundamental laws of thermodynamics, which impose well-defined relationships among the reaction rate constants characterizing these systems. Constructing biochemical reaction systems from experimental observations often leads to parameter values that do not satisfy the necessary thermodynamic constraints. This can result in models that are not physically realizable and may lead to inaccurate, or even erroneous, descriptions of cellular function. Results We introduce a thermodynamically consistent model calibration (TCMC) method that can be effectively used to provide thermodynamically feasible values for the parameters of an open biochemical reaction system. The proposed method formulates the model calibration problem as a constrained optimization problem that takes thermodynamic constraints (and, if desired, additional non-thermodynamic constraints) into account. By calculating thermodynamically feasible values for the kinetic parameters of a well-known model of the EGF/ERK signaling cascade, we demonstrate the qualitative and quantitative significance of imposing thermodynamic constraints on these parameters and the effectiveness of our method for accomplishing this important task. MATLAB software, using the Systems Biology Toolbox 2.1, can be accessed from http://www.cis.jhu.edu/~goutsias/CSS lab/software.html. An SBML file containing the thermodynamically feasible EGF/ERK signaling cascade model can be found in the BioModels database. Conclusions TCMC is a simple and flexible method for obtaining physically plausible values for the kinetic parameters of open biochemical reaction systems. It can be effectively used to recalculate a thermodynamically consistent set of parameter values for existing thermodynamically infeasible biochemical reaction models of cellular function, as well as to estimate thermodynamically feasible values for the parameters of new models.
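The kind of thermodynamic constraint involved can be illustrated on a closed three-reaction cycle, where Wegscheider's condition (the product of forward rate constants equals the product of reverse ones) lets one rate constant be fixed by the others; the rate values are invented, and this sketch is not the TCMC algorithm itself.

```python
import numpy as np

def close_cycle(kf, kr_free):
    """Return reverse rates satisfying Wegscheider's cycle condition.

    For a closed 3-reaction cycle, detailed balance requires
    prod(kf) == prod(kr). Treating the first two reverse rates as free
    parameters and solving for the third makes any calibrated parameter
    set thermodynamically feasible by construction.
    """
    kf = np.asarray(kf, dtype=float)
    kr = np.empty(3)
    kr[:2] = kr_free
    kr[2] = np.prod(kf) / np.prod(kr[:2])
    return kr

kf = np.array([2.0, 0.5, 4.0])       # hypothetical forward rate constants
kr = close_cycle(kf, [1.0, 2.0])     # third reverse rate is implied
```

Reparameterizing so that the constraint holds identically, rather than penalizing violations, is one common way to keep a constrained calibration inside the feasible set.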
Dictionary-based fiber orientation estimation with improved spatial consistency.
Ye, Chuyang; Prince, Jerry L
2018-02-01
Diffusion magnetic resonance imaging (dMRI) has enabled in vivo investigation of white matter tracts. Fiber orientation (FO) estimation is a key step in tract reconstruction and has been a popular research topic in dMRI analysis. In particular, the sparsity assumption has been used in conjunction with a dictionary-based framework to achieve reliable FO estimation with a reduced number of gradient directions. Because image noise can have a deleterious effect on the accuracy of FO estimation, previous works have incorporated spatial consistency of FOs in the dictionary-based framework to improve the estimation. However, because FOs are only indirectly determined from the mixture fractions of dictionary atoms and not modeled as variables in the objective function, these methods do not incorporate FO smoothness directly, and their ability to produce smooth FOs could be limited. In this work, we propose an improvement to Fiber Orientation Reconstruction using Neighborhood Information (FORNI), which we call FORNI+; this method estimates FOs in a dictionary-based framework where FO smoothness is better enforced than in FORNI alone. We describe an objective function that explicitly models the actual FOs and the mixture fractions of dictionary atoms. Specifically, it consists of data fidelity between the observed signals and the signals represented by the dictionary, pairwise FO dissimilarity that encourages FO smoothness, and weighted ℓ1-norm terms that ensure the consistency between the actual FOs and the FO configuration suggested by the dictionary representation. The FOs and mixture fractions are then jointly estimated by minimizing the objective function using an iterative alternating optimization strategy. FORNI+ was evaluated on a simulation phantom, a physical phantom, and real brain dMRI data. In particular, in the real brain dMRI experiment, we have qualitatively and quantitatively evaluated the reproducibility of the proposed method. Results demonstrate that
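The per-voxel sparse-coding step of such dictionary-based frameworks can be sketched with a nonnegative ISTA solver; the dictionary and signal below are synthetic, and the FO-smoothness and dissimilarity terms of FORNI+ are deliberately omitted.

```python
import numpy as np

def sparse_fractions(D, s, lam=0.05, step=None, iters=500):
    """Nonnegative ISTA: min_f (1/2)||D f - s||^2 + lam * ||f||_1, f >= 0.

    D: dictionary of atom signals (n_signals x n_atoms); s: observed signal.
    Only the per-voxel sparse-coding step is shown; the neighborhood terms
    of FORNI+ are omitted.
    """
    D, s = np.asarray(D, float), np.asarray(s, float)
    if step is None:
        step = 1.0 / np.linalg.norm(D.T @ D, 2)   # 1 / Lipschitz constant
    f = np.zeros(D.shape[1])
    for _ in range(iters):
        g = D.T @ (D @ f - s)                      # gradient of data term
        f = np.maximum(f - step * (g + lam), 0.0)  # soft-threshold + clip
    return f

rng = np.random.default_rng(3)
D = rng.standard_normal((30, 10))                  # toy dictionary
truth = np.zeros(10)
truth[[2, 7]] = [0.7, 0.3]                         # two active atoms
f_hat = sparse_fractions(D, D @ truth)             # recovers a sparse mixture
```

The ℓ1 penalty is what keeps the recovered mixture fractions sparse, mirroring the sparsity assumption the abstract describes.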
Image compression of bone images
International Nuclear Information System (INIS)
Hayrapetian, A.; Kangarloo, H.; Chan, K.K.; Ho, B.; Huang, H.K.
1989-01-01
This paper reports a receiver operating characteristic (ROC) experiment conducted to compare the diagnostic performance of a compressed bone image with the original. The compression was done on custom hardware that implements an algorithm based on full-frame cosine transform. The compression ratio in this study is approximately 10:1, which was decided after a pilot experiment. The image set consisted of 45 hand images, including normal images and images containing osteomalacia and osteitis fibrosa. Each image was digitized with a laser film scanner to 2,048 x 2,048 x 8 bits. Six observers, all board-certified radiologists, participated in the experiment. For each ROC session, an independent ROC curve was constructed and the area under that curve calculated. The image set was randomized for each session, as was the order for viewing the original and reconstructed images. Analysis of variance was used to analyze the data and derive statistically significant results. The preliminary results indicate that the diagnostic quality of the reconstructed image is comparable to that of the original image
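The area under an ROC curve used above can be computed nonparametrically from observer confidence ratings (it equals the normalized Mann-Whitney U statistic); the ratings below are invented for illustration.

```python
import numpy as np

def auc(ratings_negative, ratings_positive):
    """Nonparametric ROC area: P(positive rated higher) + 0.5 * P(tie).

    Equivalent to the Mann-Whitney U statistic divided by n_neg * n_pos.
    """
    neg = np.asarray(ratings_negative, float)
    pos = np.asarray(ratings_positive, float)
    diff = pos[:, None] - neg[None, :]             # all case pairs
    return ((diff > 0).sum() + 0.5 * (diff == 0).sum()) / diff.size

# Hypothetical observer ratings (1 = surely normal ... 5 = surely abnormal)
normal_cases = [1, 2, 2, 3, 1, 2]
abnormal_cases = [3, 4, 5, 4, 2, 5]
area = auc(normal_cases, abnormal_cases)           # ~0.92 for these ratings
```

Comparing such areas between the original and compressed reading sessions, with an analysis of variance across observers, is the standard way to test whether compression degrades diagnostic performance.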
Wide baseline stereo matching based on double topological relationship consistency
Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang
2009-07-01
Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme is presented called double topological relationship consistency (DCTR). The double topological configuration combines the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods thanks to its strong invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, possibly the most widely adopted method. By this method, we can obtain correspondences with high precision on wide-baseline matching problems. Finally, the effectiveness and reliability of this method are demonstrated in wide-baseline experiments on the image pairs.
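The RANSAC step mentioned at the end can be sketched for a deliberately simple motion model (a 2D translation between putative matches rather than full epipolar geometry); all data below are synthetic.

```python
import numpy as np

def ransac_translation(src, dst, iters=200, tol=2.0, rng=None):
    """Estimate a 2D translation from noisy matches, rejecting outliers.

    src, dst: (N, 2) arrays of putative correspondences. A wide-baseline
    pipeline would estimate the fundamental matrix instead; a translation
    keeps the sketch short (its minimal sample is a single match).
    """
    rng = rng or np.random.default_rng(0)
    best_t, best_inliers = None, np.zeros(len(src), bool)
    for _ in range(iters):
        i = rng.integers(len(src))                 # minimal sample: 1 match
        t = dst[i] - src[i]
        inliers = np.linalg.norm(dst - (src + t), axis=1) < tol
        if inliers.sum() > best_inliers.sum():
            best_t, best_inliers = t, inliers
    # Refit on the consensus set for a least-squares estimate.
    best_t = (dst[best_inliers] - src[best_inliers]).mean(axis=0)
    return best_t, best_inliers

rng = np.random.default_rng(4)
src = rng.uniform(0, 100, (40, 2))
dst = src + np.array([5.0, -3.0]) + 0.3 * rng.standard_normal((40, 2))
dst[:8] = rng.uniform(0, 100, (8, 2))              # 8 gross mismatches
t_hat, inliers = ransac_translation(src, dst)
```

The consensus-set idea is the same whatever the model: hypothesize from a minimal sample, count agreeing matches, and keep the hypothesis the data support best.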
Exploring the Consistent behavior of Information Services
Directory of Open Access Journals (Sweden)
Kapidakis Sarantos
2016-01-01
Full Text Available Computer services are normally assumed to work well all the time. This usually holds for crucial services like electronic banking, but not necessarily for others in whose operation there is no commercial interest. In this work we examined the operation and the errors of information services and tried to find clues that help predict the consistency of their behavior and the quality of the harvesting, which is harder because of the transient conditions, the many services and the huge amount of harvested information. We found many unexpected situations. Services that always nominally satisfy a request may in fact return only part of it. A significant part of the OAI services have ceased working, while many other services occasionally fail to respond. Some services fail in the same way each time, and we pronounce them dead, as we do not see a way to overcome that. Others fail always or only sometimes, but not in the same way, and we hope that their behavior is affected by temporary factors that may improve later on. We categorized the services into classes to study their behavior in more detail.
A Consistent Phylogenetic Backbone for the Fungi
Ebersberger, Ingo; de Matos Simoes, Ricardo; Kupczok, Anne; Gube, Matthias; Kothe, Erika; Voigt, Kerstin; von Haeseler, Arndt
2012-01-01
The kingdom of fungi provides model organisms for biotechnology, cell biology, genetics, and life sciences in general. Only when their phylogenetic relationships are stably resolved, can individual results from fungal research be integrated into a holistic picture of biology. However, and despite recent progress, many deep relationships within the fungi remain unclear. Here, we present the first phylogenomic study of an entire eukaryotic kingdom that uses a consistency criterion to strengthen phylogenetic conclusions. We reason that branches (splits) recovered with independent data and different tree reconstruction methods are likely to reflect true evolutionary relationships. Two complementary phylogenomic data sets based on 99 fungal genomes and 109 fungal expressed sequence tag (EST) sets analyzed with four different tree reconstruction methods shed light from different angles on the fungal tree of life. Eleven additional data sets address specifically the phylogenetic position of Blastocladiomycota, Ustilaginomycotina, and Dothideomycetes, respectively. The combined evidence from the resulting trees supports the deep-level stability of the fungal groups toward a comprehensive natural system of the fungi. In addition, our analysis reveals methodologically interesting aspects. Enrichment for EST encoded data—a common practice in phylogenomic analyses—introduces a strong bias toward slowly evolving and functionally correlated genes. Consequently, the generalization of phylogenomic data sets as collections of randomly selected genes cannot be taken for granted. A thorough characterization of the data to assess possible influences on the tree reconstruction should therefore become a standard in phylogenomic analyses. PMID:22114356
[Consistent Declarative Memory with Depressive Symptomatology].
Botelho de Oliveira, Silvia; Flórez, Ruth Natalia Suárez; Caballero, Diego Andrés Vásquez
2012-12-01
Some studies have suggested that potentiated remembrance of negative events in people with depressive disorders is an important factor in the etiology, course and maintenance of depression. The aim was to evaluate emotional memory in people with and without depressive symptomatology by means of an audio-visual test. 73 university students, male and female, between 18 and 40 years old, were evaluated and distributed in two groups, with depressive symptomatology (32) and without depressive symptomatology (40), using the Center for Epidemiologic Studies Depression Scale (CES-D) with a cutoff score of 20. There were no significant differences in free or voluntary recall between participants with and without depressive symptomatology, even though both groups had assigned a higher emotional value to the audio-visual test and had associated it with sadness. People with depressive symptomatology did not exhibit the mnemonic potentiation effect generally associated with the content of the emotional version of the test; therefore, the emotional consistency hypothesis was not validated. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
Self consistent field theory of virus assembly
Li, Siyu; Orland, Henri; Zandi, Roya
2018-04-01
The ground state dominance approximation (GSDA) has been extensively used to study the assembly of viral shells. In this work we employ the self-consistent field theory (SCFT) to investigate the adsorption of RNA onto positively charged spherical viral shells and examine the conditions when GSDA does not apply and SCFT has to be used to obtain a reliable solution. We find that there are two regimes in which GSDA does work. First, when the genomic RNA length is long enough compared to the capsid radius, and second, when the interaction between the genome and capsid is so strong that the genome is basically localized next to the wall. We find that for the case in which RNA is more or less distributed uniformly in the shell, regardless of the length of RNA, GSDA is not a good approximation. We observe that as the polymer-shell interaction becomes stronger, the energy gap between the ground state and first excited state increases and thus GSDA becomes a better approximation. We also present our results corresponding to the genome persistence length obtained through the tangent-tangent correlation length and show that it is zero in case of GSDA but is equal to the inverse of the energy gap when using SCFT.
Consistency based correlations for tailings consolidation
Energy Technology Data Exchange (ETDEWEB)
Azam, S.; Paul, A.C. [Regina Univ., Regina, SK (Canada). Environmental Systems Engineering
2010-07-01
The extraction of oil, uranium, metals and mineral resources from the earth generates significant amounts of tailings slurry. The tailings are contained in a disposal area with perimeter dykes constructed from the coarser fraction of the slurry. There are many unique challenges pertaining to the management of the containment facilities for several decades beyond mine closure that are a result of the slow settling rates of the fines and the high standing toxic waters. Many tailings dam failures in different parts of the world have been reported to result in significant contaminant releases causing public concern over the conventional practice of tailings disposal. Therefore, in order to reduce and minimize the environmental footprint, the fluid tailings need to undergo efficient consolidation. This paper presented an investigation into the consolidation behaviour of tailings in conjunction with soil consistency that captured physicochemical interactions. The paper discussed the large strain consolidation behaviour (volume compressibility and hydraulic conductivity) of six fine-grained soil slurries based on published data. The paper provided background information on the study and presented the research methodology. The geotechnical index properties of the selected materials were also presented. The large strain consolidation, volume compressibility correlations, and hydraulic conductivity correlations were provided. It was concluded that the normalized void ratio best described volume compressibility whereas liquidity index best explained the hydraulic conductivity. 17 refs., 3 tabs., 4 figs.
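The consistency descriptor singled out for hydraulic conductivity, the liquidity index, is a simple function of the water content and the Atterberg limits; the index values below are invented for illustration.

```python
def liquidity_index(w, pl, ll):
    """Liquidity index LI = (w - PL) / (LL - PL).

    w: water content (%), pl: plastic limit (%), ll: liquid limit (%).
    LI > 1 indicates a slurry-like (liquid) consistency; 0 < LI < 1 is
    plastic (paste-like); LI < 0 is semi-solid to solid.
    """
    return (w - pl) / (ll - pl)

# Hypothetical fine tailings: w = 120 %, PL = 25 %, LL = 60 %
li = liquidity_index(120.0, 25.0, 60.0)   # well above 1 -> fluid tailings
```

Because LI normalizes the water content by the soil's own plasticity range, slurries of very different mineralogy can be compared on one consistency scale, which is what makes it a useful correlating variable for hydraulic conductivity.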
Consistency between GRUAN sondes, LBLRTM and IASI
Directory of Open Access Journals (Sweden)
X. Calbet
2017-06-01
Radiosonde soundings from the GCOS Reference Upper-Air Network (GRUAN) data record are shown to be consistent with Infrared Atmospheric Sounding Interferometer (IASI)-measured radiances via LBLRTM (Line-By-Line Radiative Transfer Model) in the part of the spectrum that is mostly affected by water vapour absorption in the upper troposphere (from 700 hPa up). This result is key for climate data records, since GRUAN, IASI and LBLRTM constitute reference measurements or a reference radiative transfer model in their respective fields. This is especially the case for night-time radiosonde measurements. Although the sample size is small (16 cases), daytime GRUAN radiosonde measurements seem to have a small dry bias of 2.5 % in absolute terms of relative humidity, located mainly in the upper troposphere, with respect to LBLRTM and IASI. Full metrological closure is not yet possible and will not be until collocation uncertainties are better characterized and a full uncertainty covariance matrix is clarified for GRUAN.
Self-consistent nuclear energy systems
International Nuclear Information System (INIS)
Shimizu, A.; Fujiie, Y.
1995-01-01
A concept of self-consistent nuclear energy systems (SCNES) has been proposed as an ultimate goal of the nuclear energy system in the coming centuries. SCNES should realize a stable and unlimited energy supply without endangering the human race and the global environment. It is defined as a system that realizes at least the following four objectives simultaneously: (a) energy generation - attain high efficiency in the utilization of fission energy; (b) fuel production - secure an inexhaustible energy source: breeding of fissile material with a breeding ratio greater than one and complete burning of transuranium through recycling; (c) burning of radionuclides - zero release of radionuclides from the system: complete burning of transuranium and elimination of radioactive fission products by neutron capture reactions through recycling; (d) system safety - achieve system safety both for the public and experts: eliminate criticality-related safety issues by using natural laws and simple logic. This paper describes the concept of SCNES and discusses the feasibility of the system. Both the ''neutron balance'' and the ''energy balance'' of the system are introduced as necessary conditions to be satisfied at least by SCNES. Evaluations made so far indicate that both the neutron balance and the energy balance can be realized by fast reactors but not by thermal reactors. Concerning system safety, two safety concepts, ''self-controllability'' and ''self-terminability'', are introduced to eliminate the criticality-related safety issues in fast reactors. (author)
Toward a consistent model for glass dissolution
International Nuclear Information System (INIS)
Strachan, D.M.; McGrail, B.P.; Bourcier, W.L.
1994-01-01
Understanding the process of glass dissolution in aqueous media has advanced significantly over the last 10 years through the efforts of many scientists around the world. Mathematical models describing the glass dissolution process have also advanced from simple empirical functions to structured models based on fundamental principles of physics, chemistry, and thermodynamics. Although borosilicate glass has been selected as the waste form for disposal of high-level wastes in at least 5 countries, there is no international consensus on the fundamental methodology for modeling glass dissolution that could be used in assessing the long-term performance of waste glasses in a geologic repository setting. Each repository program is developing its own model and supporting experimental data. In this paper, we critically evaluate a selected set of these structured models and show that a consistent methodology for modeling glass dissolution processes is available. We also propose a strategy for a future coordinated effort to obtain the model input parameters that are needed for long-term performance assessments of glass in a geologic repository. (author) 4 figs., tabs., 75 refs.
View from Europe: stability, consistency or pragmatism
International Nuclear Information System (INIS)
Dunster, H.J.
1988-01-01
The last few years of this decade look like a period of reappraisal of radiation protection standards. The revised risk estimates from Japan will be available, and the United Nations Scientific Committee on the Effects of Atomic Radiation will be publishing new reports on biological topics. The International Commission on Radiological Protection (ICRP) has started a review of its basic recommendations, and the new specification for dose equivalent in radiation fields of the International Commission on Radiation Units and Measurements (ICRU) will be coming into use. All this is occurring at a time when some countries are still trying to catch up with committed dose equivalent and the recently recommended change in the value of the quality factor for neutrons. In Europe, the problems of adapting to new ICRP recommendations are considerable. The European Community, including 12 states and nine languages, takes ICRP recommendations as a basis and develops council directives that are binding on member states, which have then to arrange for their own regulatory changes. Any substantial adjustments could take 5 y or more to work through the system. Clearly, the regulatory preference is for stability. Equally clearly, trade unions and public interest groups favor a rapid response to scientific developments (provided that the change is downward). Organizations such as the ICRP have to balance their desire for internal consistency and intellectual purity against the practical problems of their clients in adjusting to change. This paper indicates some of the changes that might be necessary over the next few years and how, given a pragmatic approach, they might be accommodated in Europe without too much regulatory confusion
The Consistency Between Clinical and Electrophysiological Diagnoses
Directory of Open Access Journals (Sweden)
Esra E. Okuyucu
2009-09-01
OBJECTIVE: The aim of this study was to provide information concerning the impact of electrophysiological tests on the clinical management and diagnosis of patients, and to evaluate the consistency between referring clinical diagnoses and electrophysiological diagnoses. METHODS: The study included 957 patients referred to the electroneuromyography (ENMG) laboratory from different clinics with different clinical diagnoses in 2008. Demographic data, referring clinical diagnoses, the clinics where the requests were made, and diagnoses after ENMG testing were recorded and statistically evaluated. RESULTS: In all, 957 patients [644 (67.3%) female and 313 (32.7%) male] were included in the study. Mean age of the patients was 45.40 ± 14.54 years. ENMG requests were made by different specialists: 578 (60.4%) patients were referred by neurology, 122 (12.8%) by orthopedics, 140 (14.6%) by neurosurgery, and 117 (12.2%) by physical treatment and rehabilitation departments. According to the results of ENMG testing, 513 (53.6%) patients' results supported their referral diagnosis, whereas 397 (41.5%) patients had normal ENMG test results, and 47 (4.9%) patients had a diagnosis that differed from the referring diagnosis. The relation between referral diagnosis and electrophysiological diagnosis did not differ significantly across the referring clinics (p = 0.794), but there were statistically significant differences in the support of different clinical diagnoses, such as carpal tunnel syndrome, polyneuropathy, radiculopathy-plexopathy, entrapment neuropathy, and myopathy, based on ENMG test results (p < 0.001). CONCLUSION: ENMG is a frequently used neurological examination. As such, referrals for ENMG can be made either to support the referring diagnosis or to exclude other diagnoses. This may explain the inconsistency between clinical referring diagnoses and diagnoses following ENMG.
Self-consistent meson mass spectrum
International Nuclear Information System (INIS)
Balazs, L.A.P.
1982-01-01
A dual-topological-unitarization (or dual-fragmentation) approach to the calculation of hadron masses is presented, in which the effect of planar ''sea''-quark loops is taken into account from the beginning. Using techniques based on analyticity and generalized ladder-graph dynamics, we first derive the approximate ''generic'' Regge-trajectory formula $\alpha(t) = \max(S_1+S_2,\,S_3+S_4) - \tfrac{1}{2} + 2\hat{\alpha}'\left[s_a + \tfrac{1}{2}\left(t - \sum_i m_i^2\right)\right]$ for any given hadronic process 1+2→3+4, where $S_i$ and $m_i$ are the spins and masses of i = 1,2,3,4, and $\sqrt{s_a}$ is the effective mass of the lowest nonvanishing contribution (a) exchanged in the crossed channel. By requiring a minimization of secondary (background, etc.) contributions to a, and demanding simultaneous consistency for entire sets of such processes, we are then able to calculate the masses of all the lowest pseudoscalar and vector $q\bar{q}$ states with q = u,d,s and the Regge trajectories on which they lie. By making certain additional assumptions we are also able to do this with q = u,d,c and q = u,d,b. Our only arbitrary parameters are $m_\rho$, $m_{K^*}$, $m_\psi$, and $m_\Upsilon$, one of which merely serves to fix the energy scale. In contrast to many other approaches, a small $m_\pi^2/m_\rho^2$ ratio arises quite naturally in the present scheme.
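Under one reading of the ''generic'' Regge-trajectory formula above, it can be evaluated directly; the function and all numerical inputs below are illustrative assumptions, not fitted values from the paper:

```python
def regge_alpha(t, spins, masses, s_a, alpha_prime):
    """Assumed reading of the generic trajectory formula:
    alpha(t) = max(S1+S2, S3+S4) - 1/2
               + 2*alpha_prime*(s_a + 0.5*(t - sum of m_i^2)).
    spins = (S1, S2, S3, S4), masses = (m1, m2, m3, m4); t, s_a in GeV^2,
    masses in GeV. All values used below are illustrative only."""
    S1, S2, S3, S4 = spins
    return (max(S1 + S2, S3 + S4) - 0.5
            + 2.0 * alpha_prime * (s_a + 0.5 * (t - sum(m**2 for m in masses))))

# Illustrative evaluation (not fitted parameters):
val = regge_alpha(t=0.0, spins=(0, 0, 1, 1), masses=(0.14, 0.14, 0.77, 0.77),
                  s_a=0.3, alpha_prime=0.9)
print(val)
```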
Speed Consistency in the Smart Tachograph.
Borio, Daniele; Cano, Eduardo; Baldini, Gianmarco
2018-05-16
In the transportation sector, safety risks can be significantly reduced by monitoring the behaviour of drivers and by discouraging misconduct that entails fatigue and can increase the possibility of accidents. The Smart Tachograph (ST), the new revision of the Digital Tachograph (DT), has been designed with this purpose: to verify that speed limits and compulsory rest periods are respected by drivers. In order to operate properly, the ST periodically checks the consistency of data from different sensors, which could otherwise be manipulated to evade monitoring of driver behaviour. In this respect, the ST regulation specifies a test procedure to detect motion conflicts originating from inconsistencies between Global Navigation Satellite System (GNSS) and odometry data. This paper provides an experimental evaluation of the speed verification procedure specified by the ST regulation. Several hours of data were collected using three vehicles in light urban and highway environments. The vehicles were equipped with an On-Board Diagnostics (OBD) data reader and a GPS/Galileo receiver. The tests prescribed by the regulation were implemented with specific focus on synchronization aspects. The experimental analysis also considered aspects such as the impact of tunnels and the presence of data gaps. The analysis shows that the metrics selected for the tests are resilient to data gaps, latencies between GNSS and odometry data, and simplistic manipulations such as data scaling. The new ST forces an attacker to falsify data from both sensors at the same time and in a coherent way, which makes frauds more difficult to implement than in the current version of the DT.
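The motion-conflict idea described above, cross-checking GNSS speed against odometry speed, can be sketched as a toy test; the thresholds and data below are assumptions for illustration, not the values prescribed by the ST regulation:

```python
def speed_conflict(gnss_kmh, odo_kmh, tolerance_kmh=10.0, min_fraction=0.5):
    """Toy consistency check between time-aligned GNSS and odometry speed
    samples (km/h). Flags a motion conflict when more than `min_fraction`
    of the samples disagree by over `tolerance_kmh`. Both thresholds are
    illustrative assumptions."""
    disagreements = sum(1 for g, o in zip(gnss_kmh, odo_kmh)
                        if abs(g - o) > tolerance_kmh)
    return disagreements / len(gnss_kmh) > min_fraction

# A scaling fraud (odometer reporting half the true speed) is flagged:
gnss = [80.0, 82.0, 85.0, 83.0]
odo_scaled = [s / 2 for s in gnss]
print(speed_conflict(gnss, odo_scaled))  # → True
```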
International Nuclear Information System (INIS)
Hardcastle, Nicholas; Bender, Edward T.; Tomé, Wolfgang A.
2014-01-01
It has previously been shown that deformable image registrations (DIRs) often result in deformation maps that are neither inverse-consistent nor transitive, and that dose accumulation based on these deformation maps can be inconsistent if different image pathways are used for dose accumulation. A previously presented method for reducing inverse consistency and transitivity errors has been shown to result in more consistent dose accumulation, regardless of the image pathway selected. The present study investigates the effect on dose accumulation accuracy of deformation maps processed to reduce inverse consistency and transitivity errors. A set of four lung 4DCT phase images was analysed, on which a dose grid was created. Dose to 75 corresponding anatomical locations was manually tracked. Dose accumulation was performed between all image sets with Demons-derived deformation maps as well as with deformation maps processed to reduce inverse consistency and transitivity errors. The ground-truth accumulated dose was then compared with the accumulated dose derived from DIR. Two dose accumulation image pathways were considered. The post-processing method to reduce inverse consistency and transitivity errors had minimal effect on dose accumulation accuracy: there was a statistically significant improvement for one pathway, but no statistically significant difference for the other. A post-processing technique to reduce inverse consistency and transitivity errors thus has a positive, yet minimal, effect: it improves the consistency of dose accumulation with minimal effect on dose accumulation accuracy.
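The inverse consistency error discussed above measures how far the composition of a forward deformation with its estimated inverse departs from the identity. A deliberately simplified 1D sketch (the deformations are illustrative, not from the study):

```python
import numpy as np

def inverse_consistency_error(fwd, inv, points):
    """Toy inverse-consistency check for 1D deformations. `fwd` and `inv`
    are callables mapping coordinates; composing them should return the
    input. Returns the mean |inv(fwd(x)) - x| over the sample points."""
    return float(np.mean(np.abs(inv(fwd(points)) - points)))

x = np.linspace(0.0, 10.0, 101)
fwd = lambda p: p + 0.5          # a simple translation
inv_exact = lambda p: p - 0.5    # its exact inverse
inv_biased = lambda p: p - 0.4   # an inconsistent approximate inverse

print(round(inverse_consistency_error(fwd, inv_exact, x), 6))   # → 0.0
print(round(inverse_consistency_error(fwd, inv_biased, x), 6))  # → 0.1
```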
Analytic Intermodel Consistent Modeling of Volumetric Human Lung Dynamics.
Ilegbusi, Olusegun; Seyfi, Behnaz; Neylon, John; Santhanam, Anand P
2015-10-01
The human lung undergoes breathing-induced deformation in the form of inhalation and exhalation. Modeling the dynamics is numerically complicated by the lack of information on lung elastic behavior and on fluid-structure interactions between air and the tissue. A mathematical method is developed to integrate deformation results from a deformable image registration (DIR) and physics-based modeling approaches in order to represent consistent volumetric lung dynamics. The computational fluid dynamics (CFD) simulation assumes the lung is a poro-elastic medium with spatially distributed elastic properties. Simulation is performed on a 3D lung geometry reconstructed from a four-dimensional computed tomography (4DCT) dataset of a human subject. The heterogeneous Young's modulus (YM) is estimated from a linear elastic deformation model with the same lung geometry and 4D lung DIR. The deformation obtained from the CFD is then coupled with the displacement obtained from the 4D lung DIR by means of the Tikhonov regularization (TR) algorithm. The numerical results include 4DCT registration, CFD, and optimal displacement data, which collectively provide a consistent estimate of the volumetric lung dynamics. The fusion method is validated by comparing the optimal displacement with the results obtained from the 4DCT registration.
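The Tikhonov-regularized coupling of the DIR and CFD displacement fields can be illustrated, in a deliberately simplified pointwise form, as a weighted blend; the actual method operates on full 3D displacement fields with a more general regularization term, so everything below is an assumed stand-in:

```python
import numpy as np

def tikhonov_blend(d_dir, d_cfd, lam):
    """Simplified pointwise stand-in for Tikhonov-regularized coupling:
    minimize ||x - d_dir||^2 + lam*||x - d_cfd||^2, whose closed-form
    solution is the weighted average below. `lam` controls the trade-off
    between trusting the registration and trusting the physics model."""
    return (d_dir + lam * d_cfd) / (1.0 + lam)

d_dir = np.array([1.0, 2.0, 3.0])   # displacement from image registration (mm)
d_cfd = np.array([0.8, 2.2, 2.6])   # displacement from the CFD model (mm)
print(tikhonov_blend(d_dir, d_cfd, lam=1.0))  # → [0.9 2.1 2.8]
```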
Temporal consistent depth map upscaling for 3DTV
Schwarz, Sebastian; Sjöström, Mårten; Olsson, Roger
2014-03-01
The ongoing success of three-dimensional (3D) cinema fuels increasing efforts to spread the commercial success of 3D to new markets. The possibility of a convincing 3D experience at home, such as three-dimensional television (3DTV), has generated a great deal of interest within the research and standardization community. A central issue for 3DTV is the creation and representation of 3D content. Acquiring scene depth information is a fundamental task in computer vision, yet complex and error-prone. Dedicated range sensors, such as the Time-of-Flight (ToF) camera, can simplify the scene depth capture process and overcome shortcomings of traditional solutions, such as active or passive stereo analysis. Admittedly, currently available ToF sensors deliver only a limited spatial resolution. However, sophisticated depth upscaling approaches use texture information to match depth and video resolution. At Electronic Imaging 2012 we proposed an upscaling routine based on error energy minimization, weighted with edge information from an accompanying video source. In this article we develop our algorithm further. By adding temporal consistency constraints to the upscaling process, we reduce disturbing depth jumps and flickering artifacts in the final 3DTV content. Temporal consistency in depth maps enhances the 3D experience, leading to a wider acceptance of 3D media content. More content in better quality can boost the commercial success of 3DTV.
Time-Consistent and Market-Consistent Evaluations (Revised version of 2012-086)
Stadje, M.A.; Pelsser, A.
2014-01-01
Abstract: We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Development of a Consistent and Reproducible Porcine Scald Burn Model
Kempf, Margit; Kimble, Roy; Cuttle, Leila
2016-01-01
There are very few porcine burn models that replicate scald injuries similar to those encountered by children. We have developed a robust porcine burn model capable of creating reproducible scald burns for a wide range of burn conditions. The study was conducted with juvenile Large White pigs, creating replicates of burn combinations: 50°C for 1, 2, 5 and 10 minutes, and 60°C, 70°C, 80°C and 90°C for 5 seconds. Visual wound examination, biopsies and Laser Doppler Imaging were performed at 1 and 24 hours and at 3 and 7 days post-burn. A consistent water temperature was maintained within the scald device for long durations (49.8 ± 0.1°C when set at 50°C). The macroscopic and histologic appearance was consistent between replicates of burn conditions. For 50°C water, 10-minute duration burns showed significantly deeper tissue injury than all shorter durations at 24 hours post-burn (p ≤ 0.0001), with damage seen to increase until day 3 post-burn. For 5-second duration burns, by day 7 post-burn the 80°C and 90°C scalds had damage detected significantly deeper in the tissue than the 70°C scalds (p ≤ 0.001). A reliable and safe model of porcine scald burn injury has been successfully developed. The novel apparatus with continually refreshed water improves the consistency of scald creation for long exposure times. This model allows the pathophysiology of scald burn wound creation and progression to be examined. PMID:27612153
SU-E-J-29: Audiovisual Biofeedback Improves Tumor Motion Consistency for Lung Cancer Patients
International Nuclear Information System (INIS)
Lee, D; Pollock, S; Makhija, K; Keall, P; Greer, P; Arm, J; Hunter, P; Kim, T
2014-01-01
Purpose: To investigate whether the breathing-guidance system of audiovisual (AV) biofeedback improves tumor motion consistency for lung cancer patients. This would minimize respiratory-induced tumor motion variations across cancer imaging and radiotherapy procedures. This is the first study to investigate the impact of respiratory guidance on tumor motion. Methods: Tumor motion consistency was investigated in five lung cancer patients (age: 55 to 64), who underwent a training session to become familiarized with AV biofeedback, followed by two MRI sessions on different dates (pre and mid treatment). During the training session in a CT room, two patient-specific breathing patterns were obtained before (Breathing-Pattern-1) and after (Breathing-Pattern-2) training with AV biofeedback. In each MRI session, four MRI scans were performed to obtain 2D coronal and sagittal image datasets in free breathing (FB) and with AV biofeedback utilizing Breathing-Pattern-2. Tumor motion was extracted from image pixel values after normalizing the 2D images per dataset and applying a Gaussian filter to each image. Tumor motion consistency in the superior-inferior (SI) direction was evaluated in terms of average tumor motion range and period. Results: Audiovisual biofeedback improved tumor motion consistency by 60% (p value = 0.019), from 1.0±0.6 mm (FB) to 0.4±0.4 mm (AV) in SI motion range, and by 86% (p value < 0.001), from 0.7±0.6 s (FB) to 0.1±0.2 s (AV) in period. Conclusion: This study demonstrated that audiovisual biofeedback improves both breathing pattern and tumor motion consistency for lung cancer patients. These results suggest that AV biofeedback has the potential to facilitate reproducible tumor motion towards achieving more accurate medical imaging and radiation therapy procedures.
Consistently Showing Your Best Side? Intra-individual Consistency in #Selfie Pose Orientation
Lindell, Annukka K.
2017-01-01
Painted and photographic portraits of others show an asymmetric bias: people favor their left cheek. Both experimental and database studies confirm that the left cheek bias extends to selfies. To date all such selfie studies have been cross-sectional; whether individual selfie-takers tend to consistently favor the same pose orientation, or switch between multiple poses, remains to be determined. The present study thus examined intra-individual consistency in selfie pose orientations. Two hundred selfie-taking participants (100 male and 100 female) were identified by searching #selfie on Instagram. The most recent 10 single-subject selfies for each of the participants were selected and coded for type of selfie (normal; mirror) and pose orientation (left, midline, right), resulting in a sample of 2000 selfies. Results indicated that selfie-takers do tend to consistently adopt a preferred pose orientation (α = 0.72), with more participants showing an overall left cheek bias (41%) than would be expected by chance (overall right cheek bias = 31.5%; overall midline bias = 19.5%; no overall bias = 8%). Logistic regression modelling, controlling for the repeated measure of participant identity, indicated that sex did not affect pose orientation. However, selfie type proved a significant predictor when comparing left and right cheek poses, with a stronger left cheek bias for mirror than normal selfies. Overall, these novel findings indicate that selfie-takers show intra-individual consistency in pose orientation, and in addition, replicate the previously reported left cheek bias for selfies and other types of portrait, confirming that the left cheek bias also presents within individuals' selfie corpora. PMID:28270790
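The α = 0.72 consistency figure above is plausibly a Cronbach's alpha over each participant's repeated pose codings. The standard formula can be sketched as follows; the score matrix is synthetic, not the study's selfie codings:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of totals)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Synthetic example: 4 "selfie-takers" x 5 repeated pose codings
# (1 = left cheek, 0 = otherwise); consistent rows give high alpha.
scores = [[1, 1, 1, 1, 0],
          [0, 0, 1, 0, 0],
          [1, 1, 1, 1, 1],
          [0, 0, 0, 0, 0]]
print(round(cronbach_alpha(scores), 3))  # → 0.919
```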
Consistency of parametric registration in serial MRI studies of brain tumor progression
International Nuclear Information System (INIS)
Mang, Andreas; Buzug, Thorsten M.; Schnabel, Julia A.; Crum, William R.; Modat, Marc; Ourselin, Sebastien; Hawkes, David J.; Camara-Rey, Oscar; Palm, Christoph; Caseiras, Gisele Brasil; Jaeger, H.R.
2008-01-01
The consistency of parametric registration in multi-temporal magnetic resonance (MR) imaging studies was evaluated. Serial MRI scans of adult patients with a brain tumor (glioma) were aligned by parametric registration. The performance of low-order spatial alignment (6/9/12 degrees of freedom) of different 3D serial MR-weighted images is evaluated. A registration protocol for the alignment of all images to one reference coordinate system at baseline is presented. Registration results were evaluated for both multimodal intra-timepoint and mono-modal multi-temporal registration. The latter case might present a challenge to automatic intensity-based registration algorithms due to ill-defined correspondences. The performance of our algorithm was assessed by testing the inverse registration consistency. Four different similarity measures were evaluated to assess consistency. Careful visual inspection suggests that images are well aligned, but their consistency may be imperfect. Sub-voxel inconsistency within the brain was found for all similarity measures used for parametric multi-temporal registration. T1-weighted images were most reliable for establishing spatial correspondence between different timepoints. The parametric registration algorithm is feasible for use in this application. The sub-voxel resolution mean displacement error of registration transformations demonstrates that the algorithm converges to an almost identical solution for forward and reverse registration. (orig.)
International Nuclear Information System (INIS)
Wells, P.N.T.
1983-01-01
Ultrasound is a form of energy consisting of mechanical vibrations at frequencies above the range of human hearing. The lower frequency limit of the ultrasonic spectrum may generally be taken to be about 20 kHz. Most biomedical applications of ultrasound employ frequencies in the range 1-15 MHz. At these frequencies, the wavelength is in the range 1.5-0.1 mm in soft tissues, and narrow beams of ultrasound can be generated which propagate through such tissues without excessive attenuation. This chapter begins with brief reviews of the physics of diagnostic ultrasound, pulse-echo imaging methods and Doppler imaging methods. The remainder of the chapter is a résumé of the applications of ultrasonic imaging to physiological measurement.
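The quoted wavelength range follows from λ = c/f, assuming the commonly used average soft-tissue sound speed of about 1540 m/s:

```python
def wavelength_mm(freq_hz, c_m_per_s=1540.0):
    """Acoustic wavelength in soft tissue, assuming an average sound
    speed of 1540 m/s (a standard textbook value)."""
    return c_m_per_s / freq_hz * 1000.0  # metres -> millimetres

# The 1-15 MHz range quoted in the text:
print(round(wavelength_mm(1e6), 2))   # → 1.54
print(round(wavelength_mm(15e6), 2))  # → 0.1
```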
Infrared upconversion hyperspectral imaging
DEFF Research Database (Denmark)
Kehlet, Louis Martinus; Tidemand-Lichtenberg, Peter; Dam, Jeppe Seidelin
2015-01-01
In this Letter, hyperspectral imaging in the mid-IR spectral region is demonstrated based on nonlinear frequency upconversion and subsequent imaging using a standard Si-based CCD camera. A series of upconverted images are acquired with different phase-match conditions for the nonlinear frequency conversion process. From this, a sequence of monochromatic images in the 3.2-3.4 μm range is generated. The imaged object consists of a standard United States Air Force resolution target combined with a polystyrene film, resulting in the presence of both spatial and spectral information in the infrared image. (C) 2015 Optical Society of America
Multiplicative Consistency for Interval Valued Reciprocal Preference Relations
Wu, Jian; Chiclana, Francisco
2014-01-01
The multiplicative consistency (MC) property of interval additive reciprocal preference relations (IARPRs) is explored, and then the consistency index is quantified by the multiplicative consistency estimated IARPR. The MC property is used to measure the level of consistency of the information provided by the experts and also to propose the consistency index induced ordered weighted averaging (CI-IOWA) operator. The novelty of this operator is that it aggregates individual IARPRs in such ...
Directory of Open Access Journals (Sweden)
Luke Mander
2014-08-01
Premise of the study: Humans frequently identify pollen grains at a taxonomic rank above species. Grass pollen is a classic case of this situation, which has led to the development of computational methods for identifying grass pollen species. This paper aims to provide context for these computational methods by quantifying the accuracy and consistency of human identification. Methods: We measured the ability of nine human analysts to identify 12 species of grass pollen using scanning electron microscopy images. These are the same images that were used in the computational identifications. We measured the coverage, accuracy, and consistency of each analyst, and investigated their ability to recognize duplicate images. Results: Coverage ranged from 87.5% to 100%. Mean identification accuracy ranged from 46.67% to 87.5%. The identification consistency of each analyst ranged from 32.5% to 87.5%, and each of the nine analysts produced a considerably different identification scheme. The proportion of duplicate image pairs that were missed ranged from 6.25% to 58.33%. Discussion: The identification errors made by each analyst, which result in a decline in accuracy and consistency, are likely related to psychological factors such as the limited capacity of human memory, fatigue and boredom, recency effects, and positivity bias.
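One plausible reading of the consistency measure above, the fraction of repeated images given the same label on both encounters, can be sketched as follows; the labels and data are invented, and the paper's exact definition may differ:

```python
def identification_consistency(first_pass, second_pass):
    """Fraction of duplicate images given the same species label on both
    encounters. Inputs are parallel lists of labels for the same images."""
    same = sum(1 for a, b in zip(first_pass, second_pass) if a == b)
    return same / len(first_pass)

# Synthetic labels for 8 duplicate grass-pollen images:
run1 = ["poa", "poa", "zea", "avena", "zea", "poa", "avena", "zea"]
run2 = ["poa", "zea", "zea", "avena", "zea", "poa", "poa", "zea"]
print(identification_consistency(run1, run2))  # → 0.75
```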
Directory of Open Access Journals (Sweden)
Danny Lee, PhD
2017-07-01
Conclusions: This study demonstrated that audiovisual biofeedback can be used to improve the reproducibility and consistency of breath-hold lung tumor position and volume, respectively. These results may provide a pathway to achieve more accurate lung cancer radiation treatment in addition to improving various medical imaging and treatments by using breath-hold procedures.
Privacy, Time Consistent Optimal Labour Income Taxation and Education Policy
Konrad, Kai A.
1999-01-01
Incomplete information is a commitment device for time consistency problems. In the context of time consistent labour income taxation privacy reduces welfare losses and increases the effectiveness of public education as a second best policy.
Generalized contexts and consistent histories in quantum mechanics
International Nuclear Information System (INIS)
Losada, Marcelo; Laura, Roberto
2014-01-01
We analyze a restriction of the theory of consistent histories by imposing that a valid description of a physical system must include quantum histories which satisfy the consistency conditions for all states. We prove that these conditions are equivalent to imposing the compatibility conditions of our formalism of generalized contexts. Moreover, we show that the theory of consistent histories with the consistency conditions for all states and the formalism of generalized contexts are equally useful for representing expressions which involve properties at different times.
Personality and Situation Predictors of Consistent Eating Patterns
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K.
2015-01-01
Introduction A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studi...
Two Impossibility Results on the Converse Consistency Principle in Bargaining
Youngsub Chun
1999-01-01
We present two impossibility results on the converse consistency principle in the context of bargaining. First, we show that there is no solution satisfying Pareto optimality, contraction independence, and converse consistency. Next, we show that there is no solution satisfying Pareto optimality, strong individual rationality, individual monotonicity, and converse consistency.
Personality consistency analysis in cloned quarantine dog candidates
Directory of Open Access Journals (Sweden)
Jin Choi
2017-01-01
In recent research, personality consistency has become an important characteristic. Diverse traits and human-animal interactions, in particular, are studied in the field of personality consistency in dogs. Here, we investigated the consistency of dominant behaviours in cloned and control groups, as assessed by the modified Puppy Aptitude Test, which consists of ten subtests, to ascertain the influence of genetic identity. In this test, puppies are exposed to a stranger, restraint, a prey-like object, noise, a startling object, etc. Six cloned and four control puppies participated, and the consistency of responses at ages 7–10 and 16 weeks in the two groups was compared. The two groups showed different consistencies in the subtests. While the average scores of the cloned group were consistent (P = 0.7991), those of the control group were not (P = 0.0089). Scores of Pack Drive and Fight or Flight Drive were consistent in the cloned group; however, those of the control group were not. Scores of Prey Drive were not consistent in either the cloned or the control group. Therefore, it is suggested that consistency of dominant behaviour is affected by genetic identity and that some behaviours can be influenced more than others. Our results suggest that cloned dogs could show more consistent traits than non-cloned dogs. This study implies that personality consistency could be one of the ways to analyse traits of puppies.
Checking Consistency of Pedigree Information is NP-complete
DEFF Research Database (Denmark)
Aceto, Luca; Hansen, Jens A.; Ingolfsdottir, Anna
Consistency checking is a fundamental computational problem in genetics. Given a pedigree and information on the genotypes of some of the individuals in it, the aim of consistency checking is to determine whether these data are consistent with the classic Mendelian laws of inheritance. This probl...
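While the pedigree-wide problem shown NP-complete above involves whole family trees with partially known genotypes, the local building block it rests on is simple: at a single autosomal locus, a child is Mendel-consistent with its parents if one allele can come from each parent. A minimal sketch of that trio check (function name and genotype encoding are illustrative, not from the paper):

```python
def trio_consistent(child, mother, father):
    """Check single-locus Mendelian consistency for one parents-child trio.

    Genotypes are unordered pairs of alleles, e.g. ("A", "a").
    The trio is consistent if one of the child's alleles can be
    inherited from the mother and the other from the father.
    """
    a, b = child
    return (a in mother and b in father) or (b in mother and a in father)
```

For example, a heterozygous ("A", "a") child is consistent with an ("A", "A") mother and an ("a", "a") father, while an ("a", "a") child is not consistent with an ("A", "A") mother. The hardness of the full problem comes from combining many such constraints over large, loopy pedigrees with missing data, not from the per-trio test itself.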
26 CFR 1.338-8 - Asset and stock consistency.
2010-04-01
Excerpt from 26 Internal Revenue, Part 4 (2010-04-01): § 1.338-8 Asset and stock consistency. (a) Introduction — (1) ... (6) Stock consistency: this section limits the application of ... that are controlled foreign corporations.
European Space Imaging & Skybox Imaging
International Nuclear Information System (INIS)
Clark, J.; Schichor, P.
2015-01-01
Skybox and European Space Imaging have partnered to bring timely, Very High-Resolution imagery to customers in Europe and North Africa. Leveraging Silicon Valley ingenuity and world-class aerospace expertise, Skybox designs, builds, and operates a fleet of imaging satellites. With two satellites currently on orbit, Skybox is quickly advancing towards a planned constellation of 24+ satellites with the potential for daily or sub-daily imaging at 70-90 cm resolution. With consistent, high-resolution imagery and video, European customers can monitor the dynamic units of human activity - cars, trucks, shipping containers, ships, aircraft, etc. - and derive valuable insights about the global economy. With multiple imaging opportunities per day, the Skybox constellation provides unprecedented access to imagery and information about critical targets that require rapid analysis. Skybox's unique capability to deliver high-definition video from space enables European customers to monitor a network of globally distributed assets with full-motion snapshots, without the need to deploy an aircraft or field team. The movement captured in these 30-90 second video windows yields unique insights that improve operational decisions. Skybox and EUSI are excited to offer a unique data source that can drive a better understanding of our world through supply chain monitoring, natural resource management, infrastructure monitoring, and crisis response. (author)
International Nuclear Information System (INIS)
Haddad, Maurice C.
2008-01-01
This book provides an overview of the imaging findings of parasitic diseases using modern imaging equipment. The chapters consist of short descriptions of causative pathogens, epidemiology, modes of transmission, pathology, clinical manifestations, laboratory tests, and imaging findings, with illustrative examples of parasitic diseases that can affect various systems of the human body. Tables summarizing key diagnostic features and clinical data pertinent to diagnosis are also included. This book is intended for radiologists worldwide. (orig.)
Energy Technology Data Exchange (ETDEWEB)
Haddad, Maurice C. [American Univ. of Beirut Medical Center (Lebanon). Dept. of Diagnostic Radiology; Abd El Bagi, Mohamed E. [Riyadh Military Hospital (Saudi Arabia). Radiology and Imaging Dept. 920W; Tamraz, Jean C. (eds.) [CHU Hotel-Dieu de France, Beirut (Lebanon)
2008-07-01
This book provides an overview of the imaging findings of parasitic diseases using modern imaging equipment. The chapters consist of short descriptions of causative pathogens, epidemiology, modes of transmission, pathology, clinical manifestations, laboratory tests, and imaging findings, with illustrative examples of parasitic diseases that can affect various systems of the human body. Tables summarizing key diagnostic features and clinical data pertinent to diagnosis are also included. This book is intended for radiologists worldwide. (orig.)
Consistent Regulation of Infrastructure Businesses: Some Economic Issues
Flavio M. Menezes
2008-01-01
This paper examines some important economic aspects associated with the notion that consistency in the regulation of infrastructure businesses is a desirable feature. It makes two important points. First, it is not easy to measure consistency. In particular, one cannot simply point to different regulatory parameters as evidence of inconsistent regulatory policy. Second, even if one does observe consistency emerging from decisions made by different regulators, it does not necessarily mean that...
Multimodality imaging techniques.
Martí-Bonmatí, Luis; Sopena, Ramón; Bartumeus, Paula; Sopena, Pablo
2010-01-01
In multimodality imaging, the need to combine morphofunctional information can be approached either by acquiring images at different times (asynchronous) and fusing them through digital image manipulation techniques, or by simultaneously acquiring images (synchronous) and merging them automatically. The asynchronous post-processing solution presents various constraints, mainly conditioned by the different positioning of the patient in the two scans acquired at different times in separate machines. The best solution to achieve consistency in time and space is obtained by synchronous image acquisition. There are many multimodal technologies in molecular imaging. In this review we will focus on those multimodality imaging techniques more commonly used in the field of diagnostic imaging (SPECT-CT, PET-CT) and new developments (such as PET-MR). Technological innovations and the development of new tracers and smart probes are the key factors that will shape the future of multimodality imaging and of diagnostic imaging professionals. Although SPECT-CT and PET-CT are standard in most clinical scenarios, MR imaging has some advantages, providing excellent soft-tissue contrast and multidimensional functional, structural and morphological information. The next frontier is to develop efficient detectors and electronics systems capable of detecting two modality signals at the same time. Not only PET-MR but also MR-US or optic-PET will be introduced in clinical scenarios. Moreover, MR diffusion-weighted, pharmacokinetic imaging, spectroscopy or functional BOLD imaging will merge with PET tracers to further establish molecular imaging as a relevant medical discipline. Multimodality imaging techniques will play a leading role in relevant clinical applications. The development of new diagnostic imaging research areas, mainly in the field of oncology, cardiology and neuropsychiatry, will impact the way medicine is performed today. Both clinical and experimental multimodality studies, in
Fourier rebinning and consistency equations for time-of-flight PET planograms.
Li, Yusheng; Defrise, Michel; Matej, Samuel; Metzler, Scott D
2016-01-01
Due to their unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications, since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. The time-of-flight (TOF) information can be a promising solution to reduce these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. TOF planograms are five-dimensional while the objects are three-dimensional, so there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of the fully 3D TOF planograms. We first derive two consistency equations and John's equation for 3D TOF planograms. By taking the Fourier transforms, we obtain two Fourier consistency equations and the Fourier-John equation, which are the duals of the consistency equations and John's equation, respectively. We then solve the Fourier consistency equations and Fourier-John equation using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus we obtain a complete scheme to convert among different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF planograms. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. As a byproduct, we show the two consistency equations are necessary and sufficient for 3D TOF planograms.
Personality consistency in dogs: a meta-analysis.
Fratkin, Jamie L; Sinn, David L; Patall, Erika A; Gosling, Samuel D
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
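A pooled estimate like the r = 0.43 reported above is conventionally obtained by converting each study's correlation to Fisher's z, averaging with inverse-variance weights, and back-transforming. The sketch below shows that standard fixed-effect aggregation as an illustration; the function name is hypothetical and the paper's actual meta-analytic model (e.g. random-effects, moderator terms) may differ:

```python
import math

def pooled_correlation(rs, ns):
    """Fixed-effect pooled correlation via Fisher's r-to-z transform.

    rs: per-study test-retest correlations; ns: per-study sample sizes.
    Each z-score is weighted by n - 3, the inverse variance of Fisher's z.
    """
    zs = [math.atanh(r) for r in rs]          # r -> z
    weights = [n - 3 for n in ns]
    z_bar = sum(w * z for w, z in zip(weights, zs)) / sum(weights)
    return math.tanh(z_bar)                   # z -> r
```

Because the weights grow with sample size, larger studies pull the pooled correlation toward their own estimates, which is why moderator splits (age, test-retest interval, identical instruments) can shift the aggregate noticeably.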
Personality consistency in dogs: a meta-analysis.
Directory of Open Access Journals (Sweden)
Jamie L Fratkin
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that 'puppy tests' measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., 'puppy tests') versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed.
Personality Consistency in Dogs: A Meta-Analysis
Fratkin, Jamie L.; Sinn, David L.; Patall, Erika A.; Gosling, Samuel D.
2013-01-01
Personality, or consistent individual differences in behavior, is well established in studies of dogs. Such consistency implies predictability of behavior, but some recent research suggests that predictability cannot be assumed. In addition, anecdotally, many dog experts believe that ‘puppy tests’ measuring behavior during the first year of a dog's life are not accurate indicators of subsequent adult behavior. Personality consistency in dogs is an important aspect of human-dog relationships (e.g., when selecting dogs suitable for substance-detection work or placement in a family). Here we perform the first comprehensive meta-analysis of studies reporting estimates of temporal consistency of dog personality. A thorough literature search identified 31 studies suitable for inclusion in our meta-analysis. Overall, we found evidence to suggest substantial consistency (r = 0.43). Furthermore, personality consistency was higher in older dogs, when behavioral assessment intervals were shorter, and when the measurement tool was exactly the same in both assessments. In puppies, aggression and submissiveness were the most consistent dimensions, while responsiveness to training, fearfulness, and sociability were the least consistent dimensions. In adult dogs, there were no dimension-based differences in consistency. There was no difference in personality consistency in dogs tested first as puppies and later as adults (e.g., ‘puppy tests’) versus dogs tested first as puppies and later again as puppies. Finally, there were no differences in consistency between working versus non-working dogs, between behavioral codings versus behavioral ratings, and between aggregate versus single measures. Implications for theory, practice, and future research are discussed. PMID:23372787
Predictive value of PWI for blood supply and T1-spin echo MRI for consistency of pituitary adenoma
Energy Technology Data Exchange (ETDEWEB)
Ma, Zengyi; He, Wenqiang; Zhao, Yao; Zhang, Qilin; Li, Shiqi; Wang, Yongfei [Fudan University, Department of Neurosurgery, Huashan Hospital, Shanghai Medical College, Shanghai (China); Shanghai Pituitary Tumor Center, Shanghai (China); Yuan, Jie; Wu, Yue; Yao, Zhenwei [Fudan University, Department of Radiology, Huashan Hospital, Shanghai Medical College, Shanghai (China); Chen, Hong [Fudan University, Department of Neuropathology, Huashan Hospital, Shanghai Medical College, Shanghai (China)
2016-01-15
It is a common view that the consistency and blood supply of a pituitary adenoma (PA) can influence the surgical outcome. The aim of this study was to determine whether MRI signal intensity (SI) was correlated with the consistency or blood supply of pituitary macroadenoma. Forty-eight pituitary macroadenoma patients underwent preoperative MRI, including precontrast and contrast-enhanced (CE) T1-spin echo (T1-SE) imaging, CE sampling perfection with application-optimized contrasts using different flip angle evolutions (SPACE) imaging, and perfusion-weighted imaging (PWI). The tumor consistency and blood supply were determined by neurosurgeons. The expression of collagen IV and MIB-1 was detected with immunohistology. The correlation of the relative SI (rSI) values (tumor SI relative to normal frontal white matter SI) and PWI data with the tumor consistency, blood supply, and expression levels of collagen IV and MIB-1 was statistically studied by the Kruskal-Wallis rank test (K-W test). A significant correlation was observed between tumor consistency and the rSI on precontrast T1-SE imaging (P = 0.004) but not on CE T1-SE and CE SPACE imaging. The expression of collagen IV was also significantly associated with rSI on T1-SE imaging (P = 0.010). The blood supply was correlated with the relative CBV (rCBV) (P = 0.030). In addition, the expression of MIB-1 was correlated with the rSI of CE T1-SE imaging (P = 0.007). Our results suggest that T1-SE imaging may be a simple and useful method for predicting the consistency of PA. The CBV value can provide helpful information for assessing the blood supply of pituitary macroadenoma. (orig.)
Student Consistency and Implications for Feedback in Online Assessment Systems
Madhyastha, Tara M.; Tanimoto, Steven
2009-01-01
Most of the emphasis on mining online assessment logs has been to identify content-specific errors. However, the pattern of general "consistency" is domain independent, strongly related to performance, and can itself be a target of educational data mining. We demonstrate that simple consistency indicators are related to student outcomes,…
26 CFR 301.6224(c)-3 - Consistent settlements.
2010-04-01
Excerpt from 26 Internal Revenue, Part 18 (2010-04-01): § 301.6224(c)-3 Consistent settlements. (a) In general. If the Internal Revenue Service enters into a settlement agreement with any ..., settlement terms consistent with those contained in the settlement agreement entered into. (b) Requirements ...
Self-consistent calculation of atomic structure for mixture
International Nuclear Information System (INIS)
Meng Xujun; Bai Yun; Sun Yongsheng; Zhang Jinglin; Zong Xiaoping
2000-01-01
Based on a relativistic Hartree-Fock-Slater self-consistent average atomic model, the atomic structure of a mixture is studied by summing up the component volumes in the mixture. The algorithmic procedure for solving both the group of Thomas-Fermi equations and the self-consistent atomic structure is presented in detail, and some numerical results are discussed.
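Self-consistent schemes like the one above are, at their core, fixed-point iterations: the current solution generates a new potential, the potential generates a new solution, and the loop repeats until nothing changes. A toy scalar version of that loop (purely illustrative; not the relativistic Hartree-Fock-Slater solver, whose update step couples coupled Thomas-Fermi equations):

```python
import math

def self_consistent(update, x0, tol=1e-10, max_iter=1000):
    """Generic self-consistency loop: iterate x -> update(x) to a fixed point.

    Returns x such that x is (numerically) unchanged by a further update.
    """
    x = x0
    for _ in range(max_iter):
        x_new = update(x)
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    raise RuntimeError("self-consistency loop did not converge")

# Classic demonstration: the fixed point of cos(x).
root = self_consistent(math.cos, 1.0)
```

In real average-atom codes the scalar `x` is replaced by the electron density or level occupations, and convergence is usually accelerated with mixing schemes, but the termination logic is the same.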
A Preliminary Study toward Consistent Soil Moisture from AMSR2
Parinussa, R.M.; Holmes, T.R.H.; Wanders, N.; Dorigo, W.A.; de Jeu, R.A.M.
2015-01-01
A preliminary study toward consistent soil moisture products from the Advanced Microwave Scanning Radiometer 2 (AMSR2) is presented. Its predecessor, the Advanced Microwave Scanning Radiometer for Earth Observing System (AMSR-E), has provided Earth scientists with a consistent and continuous global
Consistency and Inconsistency in PhD Thesis Examination
Holbrook, Allyson; Bourke, Sid; Lovat, Terry; Fairbairn, Hedy
2008-01-01
This is a mixed methods investigation of consistency in PhD examination. At its core is the quantification of the content and conceptual analysis of examiner reports for 804 Australian theses. First, the level of consistency between what examiners say in their reports and the recommendation they provide for a thesis is explored, followed by an…
Delimiting Coefficient α from Internal Consistency and Unidimensionality
Sijtsma, Klaas
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and that concepts of internal consistency and…
Risk aversion vs. the Omega ratio : Consistency results
Balder, Sven; Schweizer, Nikolaus
This paper clarifies when the Omega ratio and related performance measures are consistent with second order stochastic dominance and when they are not. To avoid consistency problems, the threshold parameter in the ratio should be chosen as the expected return of some benchmark – as is commonly done
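The Omega ratio discussed above has a simple sample form: the average gain above a threshold divided by the average shortfall below it. A minimal sketch of that estimator (equally-weighted discrete returns assumed; the paper's contribution concerns how the threshold should be chosen, not this formula):

```python
def omega_ratio(returns, threshold):
    """Sample Omega ratio: expected gain above the threshold divided by
    expected shortfall below it, Omega = E[(R - t)+] / E[(t - R)+].
    """
    gains = sum(max(r - threshold, 0.0) for r in returns)
    losses = sum(max(threshold - r, 0.0) for r in returns)
    return gains / losses  # the 1/n factors cancel
```

With the threshold set at the strategy's own mean return the ratio equals 1 by construction, which is one way to see why the choice of threshold (e.g. a benchmark's expected return, as the paper recommends) drives whether Omega rankings respect stochastic dominance.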
Carl Rogers during Initial Interviews: A Moderate and Consistent Therapist.
Edwards, H. P.; And Others
1982-01-01
Analyzed two initial interviews by Carl Rogers in their entirety using the Carkhuff scales, Hill's category system, and a brief grammatical analysis to establish the level and consistency with which Rogers provides facilitative conditions. Results indicated his behavior as counselor was stable and consistent within and across interviews. (Author)
Policy consistency and the achievement of Nigeria's foreign policy ...
African Journals Online (AJOL)
This study is an attempt to investigate the policy consistency of Nigeria's foreign policy and to understand the basis for this consistency; and also to see whether peacekeeping/peace-enforcement is a key instrument in the achievement of Nigeria's foreign policy goals. The objective of the study was to examine whether the ...
Decentralized Consistency Checking in Cross-organizational Workflows
Wombacher, Andreas
Service Oriented Architectures facilitate loosely coupled composed services, which are established in a decentralized way. One challenge for such composed services is to guarantee consistency, i.e., deadlock-freeness. This paper presents a decentralized approach to consistency checking, which
Consistency of a system of equations: What does that mean?
Still, Georg J.; Kern, Walter; Koelewijn, Jaap; Bomhoff, M.J.
2010-01-01
The concept of (structural) consistency, also called structural solvability, is an important basic tool for analyzing the structure of systems of equations. Our aim is to provide a sound and practically relevant meaning to this concept. The implications of consistency are expressed in terms of
Quasi-Particle Self-Consistent GW for Molecules.
Kaplan, F; Harding, M E; Seiler, C; Weigend, F; Evers, F; van Setten, M J
2016-06-14
We present the formalism and implementation of quasi-particle self-consistent GW (qsGW) and eigenvalue only quasi-particle self-consistent GW (evGW) adapted to standard quantum chemistry packages. Our implementation is benchmarked against high-level quantum chemistry computations (coupled-cluster theory) and experimental results using a representative set of molecules. Furthermore, we compare the qsGW approach for five molecules relevant for organic photovoltaics to self-consistent GW results (scGW) and analyze the effects of the self-consistency on the ground state density by comparing calculated dipole moments to their experimental values. We show that qsGW makes a significant improvement over conventional G0W0 and that partially self-consistent flavors (in particular evGW) can be excellent alternatives.
Consistency of hand preference: predictions to intelligence and school achievement.
Kee, D W; Gottfried, A; Bathurst, K
1991-05-01
Gottfried and Bathurst (1983) reported that hand preference consistency measured over time during infancy and early childhood predicts intellectual precocity for females, but not for males. In the present study longitudinal assessments of children previously classified by Gottfried and Bathurst as consistent or nonconsistent in cross-time hand preference were conducted during middle childhood (ages 5 to 9). Findings show that (a) early measurement of hand preference consistency for females predicts school-age intellectual precocity, (b) the locus of the difference between consistent vs. nonconsistent females is in verbal intelligence, and (c) the precocity of the consistent females was also revealed on tests of school achievement, particularly tests of reading and mathematics.
Putting humans in ecology: consistency in science and management.
Hobbs, Larry; Fowler, Charles W
2008-03-01
Normal and abnormal levels of human participation in ecosystems can be revealed through the use of macro-ecological patterns. Such patterns also provide consistent and objective guidance that will lead to achieving and maintaining ecosystem health and sustainability. This paper focuses on the consistency of this type of guidance and management. Such management, in sharp contrast to current management practices, ensures that our actions as individuals, institutions, political groups, societies, and as a species are applied consistently across all temporal, spatial, and organizational scales. This approach supplants management of today, where inconsistency results from debate, politics, and legal and religious polarity. Consistency is achieved when human endeavors are guided by natural patterns. Pattern-based management meets long-standing demands for enlightened management that requires humans to participate in complex systems in consistent and sustainable ways.
Personality and Situation Predictors of Consistent Eating Patterns.
Vainik, Uku; Dubé, Laurette; Lu, Ji; Fellows, Lesley K
2015-01-01
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Personality and Situation Predictors of Consistent Eating Patterns.
Directory of Open Access Journals (Sweden)
Uku Vainik
A consistent eating style might be beneficial to avoid overeating in a food-rich environment. Eating consistency entails maintaining a similar dietary pattern across different eating situations. This construct is relatively under-studied, but the available evidence suggests that eating consistency supports successful weight maintenance and decreases risk for metabolic syndrome and cardiovascular disease. Yet, personality and situation predictors of consistency have not been studied. A community-based sample of 164 women completed various personality tests, and 139 of them also reported their eating behaviour 6 times/day over 10 observational days. We focused on observations with meals (breakfast, lunch, or dinner). The participants indicated if their momentary eating patterns were consistent with their own baseline eating patterns in terms of healthiness or size of the meal. Further, participants described various characteristics of each eating situation. Eating consistency was positively predicted by trait self-control. Eating consistency was undermined by eating in the evening, eating with others, eating away from home, having consumed alcohol and having undertaken physical exercise. Interactions emerged between personality traits and situations, including punishment sensitivity, restraint, physical activity and alcohol consumption. Trait self-control and several eating situation variables were related to eating consistency. These findings provide a starting point for targeting interventions to improve consistency, suggesting that a focus on self-control skills, together with addressing contextual factors such as social situations and time of day, may be most promising. This work is a first step to provide people with the tools they need to maintain a consistently healthy lifestyle in a food-rich environment.
Radiological Image Compression
Lo, Shih-Chung Benedict
The movement toward digital images in radiology presents the problem of how to conveniently and economically store, retrieve, and transmit the volume of digital images. Basic research into image data compression is necessary in order to move from a film-based department to an efficient digital-based department. Digital data compression technology consists of two types of compression technique: error-free and irreversible. Error-free image compression is desired; however, present techniques can only achieve compression ratios of from 1.5:1 to 3:1, depending upon the image characteristics. Irreversible image compression can achieve a much higher compression ratio; however, the image reconstructed from the compressed data shows some difference from the original image. This dissertation studies both error-free and irreversible image compression techniques. In particular, some modified error-free techniques have been tested, and the recommended strategies for various radiological images are discussed. A full-frame bit-allocation irreversible compression technique has been derived. A total of 76 images, including CT head and body images and radiographs digitized to 2048 x 2048, 1024 x 1024, and 512 x 512, have been used to test this algorithm. The normalized mean-square-error (NMSE) on the difference image, defined as the difference between the original image and the image reconstructed at a given compression ratio, is used as a global measurement of the quality of the reconstructed image. The NMSEs of a total of 380 reconstructed and 380 difference images are measured and the results tabulated. Three complex compression methods are also suggested to compress images with special characteristics. Finally, various parameters which would affect the quality of the reconstructed images are discussed. A proposed hardware compression module is given in the last chapter.
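The NMSE used above as a global quality measure has a direct pixelwise form. One common convention normalizes the squared reconstruction error by the energy of the original image, sketched below with images flattened to sequences of pixel values (the dissertation's exact normalization may differ):

```python
def nmse(original, reconstructed):
    """Normalized mean-square error of a reconstructed image.

    Sum of squared pixel differences, normalized by the energy
    (sum of squared pixel values) of the original image.
    """
    num = sum((o - r) ** 2 for o, r in zip(original, reconstructed))
    den = sum(o ** 2 for o in original)
    return num / den
```

An NMSE of 0 means a perfect reconstruction, and the value grows as the difference image carries more energy relative to the original, which is why it works as a single scalar summary across compression ratios.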
Fourier rebinning and consistency equations for time-of-flight PET planograms
International Nuclear Information System (INIS)
Li, Yusheng; Matej, Samuel; Metzler, Scott D; Defrise, Michel
2016-01-01
Due to their unique geometry, dual-panel PET scanners have many advantages in dedicated breast imaging and on-board imaging applications, since the compact scanners can be combined with other imaging and treatment modalities. The major challenges of dual-panel PET imaging are the limited-angle problem and data truncation, which can cause artifacts due to incomplete data sampling. The time-of-flight (TOF) information can be a promising solution to reduce these artifacts. The TOF planogram is the native data format for dual-panel TOF PET scanners, and the non-TOF planogram is the 3D extension of the linogram. The TOF planogram is five-dimensional while the object is three-dimensional, so there are two degrees of redundancy. In this paper, we derive consistency equations and Fourier-based rebinning algorithms to provide a complete understanding of the rich structure of the fully 3D TOF planograms. We first derive two consistency equations and John’s equation for 3D TOF planograms. By taking the Fourier transforms, we obtain two Fourier consistency equations (FCEs) and the Fourier–John equation (FJE), which are the duals of the consistency equations and John’s equation, respectively. We then solve the FCEs and FJE using the method of characteristics. The two degrees of entangled redundancy of the 3D TOF data can be explicitly elicited and exploited by the solutions along the characteristic curves. As special cases of the general solutions, we obtain Fourier rebinning and consistency equations (FORCEs), and thus a complete scheme to convert among the different types of PET planograms: 3D TOF, 3D non-TOF, 2D TOF and 2D non-TOF planograms. The FORCEs can be used as Fourier-based rebinning algorithms for TOF-PET data reduction, inverse rebinnings for designing fast projectors, or consistency conditions for estimating missing data. As a byproduct, we show the two consistency equations are necessary and sufficient for 3D TOF planograms. Finally, we give
ImageSURF: An ImageJ Plugin for Batch Pixel-Based Image Segmentation Using Random Forests
Directory of Open Access Journals (Sweden)
Aidan O'Mara
2017-11-01
Full Text Available Image segmentation is a necessary step in automated quantitative imaging. ImageSURF is a macro-compatible ImageJ2/FIJI plugin for pixel-based image segmentation that considers a range of image derivatives to train pixel classifiers which are then applied to image sets of any size to produce segmentations without bias in a consistent, transparent and reproducible manner. The plugin is available from ImageJ update site http://sites.imagej.net/ImageSURF/ and source code from https://github.com/omaraa/ImageSURF. Funding statement: This research was supported by an Australian Government Research Training Program Scholarship.
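The ImageSURF pipeline (per-pixel derivative features, a random forest classifier, then whole-image prediction) can be sketched in a few lines. This is not the plugin's Java code; it is a minimal stand-in using scikit-learn, with hand-rolled features and a toy image invented for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def pixel_features(img):
    # Per-pixel feature stack: raw intensity, a 3x3 box blur, and a
    # finite-difference gradient magnitude -- simple stand-ins for the
    # richer set of image derivatives ImageSURF actually uses.
    h, w = img.shape
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[i:i + h, j:j + w] for i in range(3) for j in range(3)) / 9.0
    gy, gx = np.gradient(img)
    grad = np.hypot(gx, gy)
    return np.stack([img, blur, grad], axis=-1).reshape(-1, 3)

# Toy image: a bright square on a darker noisy background, with training
# labels taken from a simple intensity threshold (purely for demonstration).
rng = np.random.default_rng(0)
img = rng.normal(0.1, 0.02, (32, 32))
img[8:24, 8:24] += 0.8
labels = (img > 0.5).astype(int).ravel()

X = pixel_features(img)
clf = RandomForestClassifier(n_estimators=25, random_state=0).fit(X, labels)
seg = clf.predict(X).reshape(img.shape)
```

Once trained, the same classifier is applied unchanged to every image in a batch, which is what yields the consistent, reproducible segmentations described above.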
Facial Mimicry and Emotion Consistency: Influences of Memory and Context.
Kirkham, Alexander J; Hayes, Amy E; Pawling, Ralph; Tipper, Steven P
2015-01-01
This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.
Facial Mimicry and Emotion Consistency: Influences of Memory and Context.
Directory of Open Access Journals (Sweden)
Alexander J Kirkham
Full Text Available This study investigates whether mimicry of facial emotions is a stable response or can instead be modulated and influenced by memory of the context in which the emotion was initially observed, and therefore the meaning of the expression. The study manipulated emotion consistency implicitly, where a face expressing smiles or frowns was irrelevant and to be ignored while participants categorised target scenes. Some face identities always expressed emotions consistent with the scene (e.g., smiling with a positive scene), whilst others were always inconsistent (e.g., frowning with a positive scene). During this implicit learning of face identity and emotion consistency there was evidence for encoding of face-scene emotion consistency, with slower RTs, a reduction in trust, and inhibited facial EMG for faces expressing incompatible emotions. However, in a later task where the faces were subsequently viewed expressing emotions with no additional context, there was no evidence for retrieval of prior emotion consistency, as mimicry of emotion was similar for consistent and inconsistent individuals. We conclude that facial mimicry can be influenced by current emotion context, but there is little evidence of learning, as subsequent mimicry of emotionally consistent and inconsistent faces is similar.
Quasiparticle self-consistent GW method: a short summary
International Nuclear Information System (INIS)
Kotani, Takao; Schilfgaarde, Mark van; Faleev, Sergey V; Chantis, Athanasios
2007-01-01
We have developed a quasiparticle self-consistent GW method (QSGW), which is a new self-consistent method to calculate the electronic structure within the GW approximation. The method is formulated based on the idea of a self-consistent perturbation; the non-interacting Green function G0, which is the starting point for the GWA to obtain G, is determined self-consistently so as to minimize the perturbative correction generated by GWA. After self-consistency is attained, we have G0, W (the screened Coulomb interaction) and G self-consistently. This G0 can be interpreted as the optimum non-interacting propagator for the quasiparticles. We will summarize some theoretical discussions to justify QSGW. Then we will survey results which have been obtained up to now: e.g., band gaps for normal semiconductors are predicted to a precision of 0.1-0.3 eV; the self-consistency including the off-diagonal part is required for NiO and MnO; and so on. There are still some remaining disagreements with experiments; however, they are very systematic, and can be explained from the neglect of excitonic effects.
Protective Factors, Risk Indicators, and Contraceptive Consistency Among College Women.
Morrison, Leslie F; Sieving, Renee E; Pettingell, Sandra L; Hellerstedt, Wendy L; McMorris, Barbara J; Bearinger, Linda H
2016-01-01
To explore risk and protective factors associated with consistent contraceptive use among emerging adult female college students and whether effects of risk indicators were moderated by protective factors. Secondary analysis of National Longitudinal Study of Adolescent to Adult Health Wave III data. Data collected through in-home interviews in 2001 and 2002. National sample of 18- to 25-year-old women (N = 842) attending 4-year colleges. We examined relationships between protective factors, risk indicators, and consistent contraceptive use. Consistent contraceptive use was defined as use all of the time during intercourse in the past 12 months. Protective factors included the external supports of parental closeness and a relationship with a caring nonparental adult, and the internal assets of self-esteem, confidence, independence, and life satisfaction. Risk indicators included heavy episodic drinking, marijuana use, and depression symptoms. Multivariable logistic regression models were used to evaluate relationships between protective factors and consistent contraceptive use and between risk indicators and contraceptive use. Self-esteem, confidence, independence, and life satisfaction were significantly associated with more consistent contraceptive use. In a final model including all internal assets, life satisfaction was significantly related to consistent contraceptive use. Marijuana use and depression symptoms were significantly associated with less consistent use. With one exception, protective factors did not moderate relationships between risk indicators and consistent use. Based on our findings, we suggest that risk and protective factors may have largely independent influences on consistent contraceptive use among college women. A focus on risk and protective factors may improve contraceptive use rates and thereby reduce unintended pregnancy among college students. Copyright © 2016 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.
The Consistent Preferences Approach to Deductive Reasoning in Games
Asheim, Geir B
2006-01-01
"The Consistent Preferences Approach to Deductive Reasoning in Games" presents, applies, and synthesizes what my co-authors and I have called the 'consistent preferences' approach to deductive reasoning in games. Briefly described, this means that the object of the analysis is the ranking by each player of his own strategies, rather than his choice. The ranking can be required to be consistent (in different senses) with his beliefs about the opponent's ranking of her strategies. This can be contrasted to the usual 'rational choice' approach where a player's strategy choice is (in dif
Dong, S.
2018-05-01
We present a reduction-consistent and thermodynamically consistent formulation and an associated numerical algorithm for simulating the dynamics of an isothermal mixture consisting of N (N ⩾ 2) immiscible incompressible fluids with different physical properties (densities, viscosities, and pair-wise surface tensions). By reduction consistency we refer to the property that if only a set of M (1 ⩽ M ⩽ N - 1) fluids are present in the system then the N-phase governing equations and boundary conditions will exactly reduce to those for the corresponding M-phase system. By thermodynamic consistency we refer to the property that the formulation honors the thermodynamic principles. Our N-phase formulation is developed based on a more general method that allows for the systematic construction of reduction-consistent formulations, and the method suggests the existence of many possible forms of reduction-consistent and thermodynamically consistent N-phase formulations. Extensive numerical experiments have been presented for flow problems involving multiple fluid components and large density ratios and large viscosity ratios, and the simulation results are compared with the physical theories or the available physical solutions. The comparisons demonstrate that our method produces physically accurate results for this class of problems.
On the consistent histories approach to quantum mechanics
International Nuclear Information System (INIS)
Dowker, F.; Kent, A.
1996-01-01
We review the consistent histories formulations of quantum mechanics developed by Griffiths, Omnes, Gell-Mann, and Hartle, and we describe the classifications of consistent sets. We illustrate some general features of consistent sets by a few lemmas and examples. We also consider various interpretations of the formalism, and we examine the new problems which arise in reconstructing the past and predicting the future. It is shown that Omnes' characterization of true statements (statements that can be deduced unconditionally in his interpretation) is incorrect. We examine critically Gell-Mann and Hartle's interpretation of the formalism, in particular their discussions of communication, prediction, and retrodiction, and we conclude that their explanation of the apparent persistence of quasiclassicality relies on assumptions about an as-yet-unknown theory of experience. Our overall conclusion is that the consistent histories approach illustrates the need to supplement quantum mechanics by some selection principle in order to produce a fundamental theory capable of unconditional predictions.
Consistency of Trend Break Point Estimator with Underspecified Break Number
Directory of Open Access Journals (Sweden)
Jingjing Yang
2017-01-01
Full Text Available This paper discusses the consistency of trend break point estimators when the number of breaks is underspecified. The consistency of break point estimators in a simple location model with level shifts has been well documented by researchers under various settings, including extensions such as allowing a time trend in the model. Despite the consistency of break point estimators of level shifts, there are few papers on the consistency of trend shift break point estimators in the presence of an underspecified break number. The simulation study and asymptotic analysis in this paper show that the trend shift break point estimator does not converge to the true break points when the break number is underspecified. In the case of two trend shifts, the inconsistency problem worsens if the magnitudes of the breaks are similar and the breaks are either both positive or both negative. The limiting distribution for the trend break point estimator is developed and closely approximates the finite sample performance.
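The inconsistency described above (two same-sign trend breaks of similar magnitude, fitted with a one-break model) is easy to reproduce in a small deterministic simulation. The break dates, slopes, and grid below are illustrative choices, not the paper's design:

```python
import numpy as np

# Deterministic trend with TWO positive slope breaks of equal magnitude,
# at t = 100 and t = 200 (illustrative values).
n = 300
t = np.arange(n)
y = 0.1 * t + 0.5 * np.maximum(t - 100, 0) + 0.5 * np.maximum(t - 200, 0)

def ssr_one_break(k):
    # Sum of squared residuals from fitting an UNDERSPECIFIED single-break
    # trend model with candidate break date k.
    X = np.column_stack([np.ones(n), t, np.maximum(t - k, 0)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ beta) ** 2)

# Grid search for the least-squares break date.
k_hat = min(range(10, n - 10), key=ssr_one_break)
```

On this noise-free series the single-break fit compromises between the two true break dates rather than recovering either one, and the fit is imperfect (positive SSR), illustrating the inconsistency the paper analyzes.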
Liking for Evaluators: Consistency and Self-Esteem Theories
Regan, Judith Weiner
1976-01-01
Consistency and self-esteem theories make contrasting predictions about the relationship between a person's self-evaluation and his liking for an evaluator. Laboratory experiments confirmed predictions about these theories. (Editor/RK)
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera; Kruger, Jens; Moller, Torsten; Hadwiger, Markus
2014-01-01
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined
Structures, profile consistency, and transport scaling in electrostatic convection
DEFF Research Database (Denmark)
Bian, N.H.; Garcia, O.E.
2005-01-01
Two mechanisms at the origin of profile consistency in models of electrostatic turbulence in magnetized plasmas are considered. One involves turbulent diffusion in collisionless plasmas and the subsequent turbulent equipartition of Lagrangian invariants. By the very nature of its definition...
15 CFR 930.36 - Consistency determinations for proposed activities.
2010-01-01
... necessity of issuing separate consistency determinations for each incremental action controlled by the major... plans), and that affect any coastal use or resource of more than one State. Many States share common...
Decentralized Consistent Network Updates in SDN with ez-Segway
Nguyen, Thanh Dang; Chiesa, Marco; Canini, Marco
2017-01-01
We present ez-Segway, a decentralized mechanism to consistently and quickly update the network state while preventing forwarding anomalies (loops and black-holes) and avoiding link congestion. In our design, the centralized SDN controller only pre-computes
The utility of theory of planned behavior in predicting consistent ...
African Journals Online (AJOL)
disease. Objective: To examine the utility of the theory of planned behavior in predicting consistent condom use intention of HIV ... (24-25), making subjective norms better predictors of intention ... Organizational Behavior and Human Decision.
A methodology for the data energy regional consumption consistency analysis
International Nuclear Information System (INIS)
Canavarros, Otacilio Borges; Silva, Ennio Peres da
1999-01-01
The article introduces a methodology for consistency analysis of regional energy consumption data. The work is based on recent studies by the cited authors and addresses the Brazilian energy matrices and regional energy balances. The results are compared and analyzed.
Island of Stability for Consistent Deformations of Einstein's Gravity
DEFF Research Database (Denmark)
Dietrich, Dennis D.; Berkhahn, Felix; Hofmann, Stefan
2012-01-01
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incor...
Self-consistent normal ordering of gauge field theories
International Nuclear Information System (INIS)
Ruehl, W.
1987-01-01
Mean-field theories with a real action of unconstrained fields can be self-consistently normal ordered. This leads to a considerable improvement over standard mean-field theory. This concept is applied to lattice gauge theories. First an appropriate real action mean-field theory is constructed. The equations determining the Gaussian kernel necessary for self-consistent normal ordering of this mean-field theory are derived. (author). 4 refs
Consistency of the least weighted squares under heteroscedasticity
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2011-01-01
Roč. 2011, č. 47 (2011), s. 179-206 ISSN 0023-5954 Grant - others:GA UK(CZ) GA402/09/055 Institutional research plan: CEZ:AV0Z10750506 Keywords : Regression * Consistency * The least weighted squares * Heteroscedasticity Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/SI/visek-consistency of the least weighted squares under heteroscedasticity.pdf
Cosmological consistency tests of gravity theory and cosmic acceleration
Ishak-Boushaki, Mustapha B.
2017-01-01
Testing general relativity at cosmological scales and probing the cause of cosmic acceleration are among the important objectives targeted by incoming and future astronomical surveys and experiments. I present our recent results on consistency tests that can provide insights about the underlying gravity theory and cosmic acceleration using cosmological data sets. We use statistical measures, the rate of cosmic expansion, the growth rate of large scale structure, and the physical consistency of these probes with one another.
Self-consistency corrections in effective-interaction calculations
International Nuclear Information System (INIS)
Starkand, Y.; Kirson, M.W.
1975-01-01
Large-matrix extended-shell-model calculations are used to compute self-consistency corrections to the effective interaction and to the linked-cluster effective interaction. The corrections are found to be numerically significant and to affect the rate of convergence of the corresponding perturbation series. The influence of various partial corrections is tested. It is concluded that self-consistency is an important effect in determining the effective interaction and improving the rate of convergence. (author)
Parquet equations for numerical self-consistent-field theory
International Nuclear Information System (INIS)
Bickers, N.E.
1991-01-01
In recent years increases in computational power have provided new motivation for the study of self-consistent-field theories for interacting electrons. In this set of notes, the so-called parquet equations for electron systems are derived pedagogically. The principal advantages of the parquet approach are outlined, and its relationship to simpler self-consistent-field methods, including the Baym-Kadanoff technique, is discussed in detail. (author). 14 refs, 9 figs
Consistent Estimation of Pricing Kernels from Noisy Price Data
Vladislav Kargin
2003-01-01
If pricing kernels are assumed non-negative then the inverse problem of finding the pricing kernel is well-posed. The constrained least squares method provides a consistent estimate of the pricing kernel. When the data are limited, a new method is suggested: relaxed maximization of the relative entropy. This estimator is also consistent. Keywords: $\\epsilon$-entropy, non-parametric estimation, pricing kernel, inverse problems.
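The constrained least squares estimate mentioned above can be illustrated with SciPy's non-negative least squares solver on a toy market. The payoff matrix and "true" kernel below are invented purely for illustration:

```python
import numpy as np
from scipy.optimize import nnls

# Hypothetical market with 4 states and 3 assets: each row of A is an
# asset's payoff across states, and observed prices satisfy p = A @ m
# for some non-negative pricing kernel m.
A = np.array([[1.0, 1.0, 1.0, 1.0],    # riskless bond
              [2.0, 1.5, 0.8, 0.4],    # stock-like payoff
              [1.0, 0.5, 0.0, 0.0]])   # call-like payoff
m_true = np.array([0.3, 0.25, 0.2, 0.15])
p = A @ m_true

# Constrained least squares: minimize ||A m - p|| subject to m >= 0.
m_hat, resid = nnls(A, p)
```

With fewer assets than states the kernel is not unique: NNLS returns one non-negative kernel that prices the assets exactly, which is precisely the limited-data situation motivating the paper's entropy-based alternative.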
Full data consistency conditions for cone-beam projections with sources on a plane
International Nuclear Information System (INIS)
Clackdoyle, Rolf; Desbat, Laurent
2013-01-01
Cone-beam consistency conditions (also known as range conditions) are mathematical relationships between different cone-beam projections, and they therefore describe the redundancy or overlap of information between projections. These redundancies have often been exploited for applications in image reconstruction. In this work we describe new consistency conditions for cone-beam projections whose source positions lie on a plane. A further restriction is that the target object must not intersect this plane. The conditions require that moments of the cone-beam projections be polynomial functions of the source positions, with some additional constraints on the coefficients of the polynomials. A precise description of the consistency conditions is that the four parameters of the cone-beam projections (two for the detector, two for the source position) can be expressed with just three variables, using a certain formulation involving homogeneous polynomials. The main contribution of this work is our demonstration that these conditions are not only necessary, but also sufficient. Thus the consistency conditions completely characterize all redundancies, so no other independent conditions are possible and in this sense the conditions are full. The idea of the proof is to use the known consistency conditions for 3D parallel projections, and to then apply a 1996 theorem of Edholm and Danielsson that links parallel to cone-beam projections. The consistency conditions are illustrated with a simulation example. (paper)
Measuring consistency of autobiographical memory recall in depression.
LENUS (Irish Health Repository)
Semkovska, Maria
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms.
Measuring consistency of autobiographical memory recall in depression.
Semkovska, Maria; Noone, Martha; Carton, Mary; McLoughlin, Declan M
2012-05-15
Autobiographical amnesia assessments in depression need to account for normal changes in consistency over time, contribution of mood and type of memories measured. We report herein validation studies of the Columbia Autobiographical Memory Interview - Short Form (CAMI-SF), exclusively used in depressed patients receiving electroconvulsive therapy (ECT) but without previous published report of normative data. The CAMI-SF was administered twice with a 6-month interval to 44 healthy volunteers to obtain normative data for retrieval consistency of its Semantic, Episodic-Extended and Episodic-Specific components and assess their reliability and validity. Healthy volunteers showed significant large decreases in retrieval consistency on all components. The Semantic and Episodic-Specific components demonstrated substantial construct validity. We then assessed CAMI-SF retrieval consistencies over a 2-month interval in 30 severely depressed patients never treated with ECT compared with healthy controls (n=19). On initial assessment, depressed patients produced less episodic-specific memories than controls. Both groups showed equivalent amounts of consistency loss over a 2-month interval on all components. At reassessment, only patients with persisting depressive symptoms were distinguishable from controls on episodic-specific memories retrieved. Research quantifying retrograde amnesia following ECT for depression needs to control for normal loss in consistency over time and contribution of persisting depressive symptoms. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
Autonomous Navigation with Constrained Consistency for C-Ranger
Directory of Open Access Journals (Sweden)
Shujing Zhang
2014-06-01
Full Text Available Autonomous underwater vehicles (AUVs) have become the most widely used tools for undertaking complex exploration tasks in marine environments. Their ability to carry out localization autonomously and build an environmental map concurrently, in other words, simultaneous localization and mapping (SLAM), is considered a pivotal requirement for AUVs to achieve truly autonomous navigation. However, the consistency problem of the SLAM system has been greatly ignored during the past decades. In this paper, a consistency-constrained extended Kalman filter (EKF) SLAM algorithm, applying the idea of local consistency, is proposed and applied to the autonomous navigation of the C-Ranger AUV, which was developed as our experimental platform. The concept of local consistency (LC) is introduced after an explicit theoretical derivation of the EKF-SLAM system. Then, we present a locally consistency-constrained EKF-SLAM design, LC-EKF, in which the landmark estimates used for linearization are fixed at the beginning of each local time period, rather than evaluated at the latest landmark estimates. Finally, our proposed LC-EKF algorithm is experimentally verified, both in simulations and sea trials. The experimental results show that the LC-EKF performs well with regard to consistency, accuracy and computational efficiency.
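The core LC-EKF idea (evaluating the measurement Jacobian at estimates frozen at the start of a local period, instead of at the latest estimate) can be sketched in a toy range-only example. Everything below, from the state layout to the noise values, is a hypothetical illustration rather than the authors' implementation:

```python
import numpy as np

# State: [x_r, y_r, x_l, y_l] (robot position and one landmark),
# with a range-only measurement h(s) = ||landmark - robot||.
def h(s):
    return np.hypot(s[2] - s[0], s[3] - s[1])

def H_jac(s_lin):
    # Measurement Jacobian evaluated at a chosen linearization state s_lin.
    dx, dy = s_lin[2] - s_lin[0], s_lin[3] - s_lin[1]
    r = np.hypot(dx, dy)
    return np.array([[-dx / r, -dy / r, dx / r, dy / r]])

def ekf_update(s, P, z, R, s_lin):
    # Standard EKF measurement update, except the Jacobian is evaluated at
    # s_lin; the LC-EKF idea is to keep s_lin frozen at the estimate from
    # the start of the current local period.
    Hm = H_jac(s_lin)
    S = Hm @ P @ Hm.T + R              # innovation covariance (1x1)
    K = P @ Hm.T / S                   # Kalman gain (4x1)
    s = s + (K * (z - h(s))).ravel()
    P = (np.eye(4) - K @ Hm) @ P
    return s, P

# Toy run: static robot at the origin, true landmark at (2.5, 1.2),
# noise-free range measurement, Jacobian frozen at the initial estimate.
s = np.array([0.0, 0.0, 2.0, 1.0])
s_lin = s.copy()                       # period-start linearization point
P = np.eye(4) * 0.1
R = np.array([[1e-4]])
z = np.hypot(2.5, 1.2)

r0 = abs(z - h(s))
for _ in range(5):
    s, P = ekf_update(s, P, z, R, s_lin)
```

Because the linearization point is fixed per period, repeated updates do not accumulate the linearization-point drift that drives EKF-SLAM inconsistency; here the measurement residual shrinks while the covariance stays well behaved.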
High-performance speech recognition using consistency modeling
Digalakis, Vassilios; Murveit, Hy; Monaco, Peter; Neumeyer, Leo; Sankar, Ananth
1994-12-01
The goal of SRI's consistency modeling project is to improve the raw acoustic modeling component of SRI's DECIPHER speech recognition system and develop consistency modeling technology. Consistency modeling aims to reduce the number of improper independence assumptions used in traditional speech recognition algorithms so that the resulting speech recognition hypotheses are more self-consistent and, therefore, more accurate. At the initial stages of this effort, SRI focused on developing the appropriate base technologies for consistency modeling. We first developed the Progressive Search technology that allowed us to perform large-vocabulary continuous speech recognition (LVCSR) experiments. Since its conception and development at SRI, this technique has been adopted by most laboratories, including other ARPA contracting sites, doing research on LVCSR. Another goal of the consistency modeling project is to attack difficult modeling problems, when there is a mismatch between the training and testing phases. Such mismatches may include outlier speakers, different microphones and additive noise. We were able to either develop new, or transfer and evaluate existing, technologies that adapted our baseline genonic HMM recognizer to such difficult conditions.
Are prescription drug insurance choices consistent with expected utility theory?
Bundorf, M Kate; Mata, Rui; Schoenbaum, Michael; Bhattacharya, Jay
2013-09-01
To determine the extent to which people make choices inconsistent with expected utility theory when choosing among prescription drug insurance plans and whether tabular or graphical presentation format influences the consistency of their choices. Members of an Internet-enabled panel chose between two Medicare prescription drug plans. The "low variance" plan required higher out-of-pocket payments for the drugs respondents usually took but lower out-of-pocket payments for the drugs they might need if they developed a new health condition than the "high variance" plan. The probability of a change in health varied within subjects and the presentation format (text vs. graphical) and the affective salience of the clinical condition (abstract vs. risk related to specific clinical condition) varied between subjects. Respondents were classified based on whether they consistently chose either the low or high variance plan. Logistic regression models were estimated to examine the relationship between decision outcomes and task characteristics. The majority of respondents consistently chose either the low or high variance plan, consistent with expected utility theory. Half of respondents consistently chose the low variance plan. Respondents were less likely to make discrepant choices when information was presented in graphical format. Many people, although not all, make choices consistent with expected utility theory when they have information on differences among plans in the variance of out-of-pocket spending. Medicare beneficiaries would benefit from information on the extent to which prescription drug plans provide risk protection. PsycINFO Database Record (c) 2013 APA, all rights reserved.
The search for consistency in the manufacture of PET radiopharmaceuticals
International Nuclear Information System (INIS)
Finn, R.D.
1999-01-01
Nuclear Medicine is the specialty of medical imaging, which utilizes a variety of radionuclides incorporated into specific compounds for diagnostic imaging and therapeutic applications. During recent years, research efforts in this discipline have concentrated on the decay characteristics of particular radionuclides and the design of unique radiolabeled tracers necessary to achieve time-dependent molecular images. Various oncology applications have utilized specific PET and SPECT radiopharmaceuticals, which have allowed an extension from functional process imaging in tissue to pathologic processes and nuclide directed treatments. One of the most widely recognized advantages of positron emission tomography (PET) is its use of the attractive, positron-emitting biologic radiotracers that mimic natural substrates. However, a major disadvantage is that these substances are relatively short-lived and unable to be transported great distances. At this time, economic considerations and regulatory guidelines associated with the creation of a PET facility, as well as the operational costs of maintaining both the facility and the necessary procedural documentation, continue to create interesting strategic dilemmas. This commentary will focus on the current approach and anticipated impact of pending regulations, which relate to the manufacture and formulation of a variety of PET radiopharmaceuticals used in clinical research and patient management at Memorial Hospital. (author)
Mirror Neurons in Humans: Consisting or Confounding Evidence?
Turella, Luca; Pierno, Andrea C.; Tubaldi, Federico; Castiello, Umberto
2009-01-01
The widely known discovery of mirror neurons in macaques shows that premotor and parietal cortical areas are not only involved in executing one's own movement, but are also active when observing the action of others. The goal of this essay is to critically evaluate the substance of functional magnetic resonance imaging (fMRI) and positron emission…
Generalized internal multiple imaging
Zuberi, M. A. H.
2014-08-05
Internal multiples deteriorate the image when the imaging procedure assumes only single scattering, especially if the velocity model does not have sharp contrasts to reproduce such scattering in the Green’s function through forward modeling. If properly imaged, internal multiples (internally scattered energy) can enhance the seismic image. Conventionally, to image internal multiples, accurate, sharp contrasts in the velocity model are required to construct a Green’s function with all the scattered energy. As an alternative, we have developed a generalized internal multiple imaging procedure that images any order internal scattering using the background Green’s function (from the surface to each image point), constructed from a smooth velocity model, usually used for conventional imaging. For the first-order internal multiples, the approach consisted of three steps, in which we first back propagated the recorded surface seismic data using the background Green’s function, then crosscorrelated the back-propagated data with the recorded data, and finally crosscorrelated the result with the original background Green’s function. This procedure images the contribution of the recorded first-order internal multiples, and it is almost free of the single-scattering recorded energy. The cost includes one additional crosscorrelation over the conventional single-scattering imaging application. We generalized this method to image internal multiples of any order separately. The resulting images can be added to the conventional single-scattering image, obtained, e.g., from Kirchhoff or reverse-time migration, to enhance the image. Application to synthetic data with reflectors illuminated by multiple scattering (double scattering) demonstrated the effectiveness of the approach.
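The three-step procedure for first-order internal multiples can be sketched schematically in one dimension using plain crosscorrelations. This is a toy illustration of the order of operations only, under assumed 1-D traces; it is not the paper's wavefield implementation.

```python
import numpy as np

def xcorr(a, b):
    """Crosscorrelation of two equal-length traces (zero-lag centered)."""
    return np.correlate(a, b, mode="same")

nt = 256
t = np.arange(nt)
# Toy background Green's function and recorded surface trace (illustrative).
green = np.exp(-0.5 * ((t - 60) / 4.0) ** 2)   # smooth background response
data = np.exp(-0.5 * ((t - 120) / 4.0) ** 2)   # recorded surface seismic data

# Step 1: back-propagate the recorded data with the background Green's function.
back_prop = xcorr(data, green)
# Step 2: crosscorrelate the back-propagated data with the recorded data.
step2 = xcorr(back_prop, data)
# Step 3: crosscorrelate the result with the original background Green's
# function, imaging the contribution of first-order internal multiples.
image = xcorr(step2, green)

print(image.shape)
```

The cost structure is visible in the sketch: relative to single-scattering imaging, the extra work is essentially one additional crosscorrelation per image point.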
Martial arts striking hand peak acceleration, accuracy and consistency.
Neto, Osmar Pinto; Marzullo, Ana Carolina De Miranda; Bolander, Richard P; Bir, Cynthia A
2013-01-01
The goal of this paper was to investigate the possible trade-off between peak hand acceleration and accuracy and consistency of hand strikes performed by martial artists of different training experiences. Ten male martial artists with training experience ranging from one to nine years volunteered to participate in the experiment. Each participant performed 12 maximum effort goal-directed strikes. Hand acceleration during the strikes was obtained using a tri-axial accelerometer block. A pressure sensor matrix was used to determine the accuracy and consistency of the strikes. Accuracy was estimated by the radial distance between the centroid of each subject's 12 strikes and the target, whereas consistency was estimated by the square root of the 12 strikes mean squared distance from their centroid. We found that training experience was significantly correlated to hand peak acceleration prior to impact (r(2)=0.456, p =0.032) and accuracy (r(2)=0. 621, p=0.012). These correlations suggest that more experienced participants exhibited higher hand peak accelerations and at the same time were more accurate. Training experience, however, was not correlated to consistency (r(2)=0.085, p=0.413). Overall, our results suggest that martial arts training may lead practitioners to achieve higher striking hand accelerations with better accuracy and no change in striking consistency.
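The accuracy and consistency measures defined in the abstract are straightforward to compute. A minimal sketch, with hypothetical impact coordinates standing in for the pressure-sensor data:

```python
import numpy as np

# Hypothetical (x, y) impact points of 12 goal-directed strikes on the
# pressure sensor matrix, in cm; the target is at the origin.
rng = np.random.default_rng(1)
strikes = rng.normal(loc=[1.0, -0.5], scale=0.8, size=(12, 2))
target = np.array([0.0, 0.0])

centroid = strikes.mean(axis=0)

# Accuracy: radial distance between the centroid of the strikes and the target.
accuracy = float(np.linalg.norm(centroid - target))

# Consistency: square root of the mean squared distance of the strikes from
# their own centroid (a radial standard deviation of the strike cloud).
consistency = float(np.sqrt(((strikes - centroid) ** 2).sum(axis=1).mean()))

print(round(accuracy, 2), round(consistency, 2))
```

Note that the two measures are independent by construction: a strike cloud can be tightly clustered (consistent) yet centered far from the target (inaccurate), which is why the study could find experience correlated with one but not the other.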
Self-consistent electrodynamic scattering in the symmetric Bragg case
International Nuclear Information System (INIS)
Campos, H.S.
1988-01-01
We have analyzed the symmetric Bragg case, introducing a model of self-consistent scattering for two elliptically polarized beams. The crystal is taken as a set of mathematical planes, each of them defined by a surface density of dipoles. We have treated the mesofield and the epifield differently from Ewald's theory, and we assumed a plane of dipoles and the associated fields as a self-consistent scattering unit. The exact analytical treatment, when applied to any two neighbouring planes, results in a general and self-consistent Bragg equation in terms of the amplitude and phase variations. The generalized solution for the set of N planes was obtained after introducing an absorption factor in the incident radiation, in two ways: (i) analytically, through a rule of field similarity, which states that incidence occurs on both faces of all the crystal planes, and also through a matricial development with the Chebyshev polynomials; (ii) numerically, by calculating iteratively the reflectivity, the reflection phase, the transmissivity, the transmission phase and the energy. The results are shown through reflection and transmission curves, which exhibit characteristics of both the kinematical and dynamical theories. The conservation of energy resulting from Ewald's self-consistency principle is used. In the absorption case, the results show that absorption is not the only cause of the asymmetric form of the reflection curves. The model contains the basic elements for a unified, microscopic, self-consistent, vectorial and exact formulation for interpreting X-ray diffraction in perfect crystals. (author)
Cognitive consistency and math-gender stereotypes in Singaporean children.
Cvencek, Dario; Meltzoff, Andrew N; Kapur, Manu
2014-01-01
In social psychology, cognitive consistency is a powerful principle for organizing psychological concepts. There have been few tests of cognitive consistency in children and no research about cognitive consistency in children from Asian cultures, who pose an interesting developmental case. A sample of 172 Singaporean elementary school children completed implicit and explicit measures of math-gender stereotype (male=math), gender identity (me=male), and math self-concept (me=math). Results showed strong evidence for cognitive consistency; the strength of children's math-gender stereotypes, together with their gender identity, significantly predicted their math self-concepts. Cognitive consistency may be culturally universal and a key mechanism for developmental change in social cognition. We also discovered that Singaporean children's math-gender stereotypes increased as a function of age and that boys identified with math more strongly than did girls despite Singaporean girls' excelling in math. The results reveal both cultural universals and cultural variation in developing social cognition. Copyright © 2013 Elsevier Inc. All rights reserved.
A consistent response spectrum analysis including the resonance range
International Nuclear Information System (INIS)
Schmitz, D.; Simmchen, A.
1983-01-01
The report provides a complete, consistent Response Spectrum Analysis for any component. The effect of supports with different excitation is taken into consideration, as is the description of the resonance ranges. It includes information explaining how the contributions of the eigenforms with higher eigenfrequencies are to be considered. Stocking of floor response spectra is also possible using the method described here. However, modified floor response spectra must now be calculated for each building mode. Once these have been prepared, the calculation of the dynamic component values is practically no more complicated than with the conventional, non-consistent methods. The consistent Response Spectrum Analysis can supply both smaller and larger values than the conventional theory, a fact which can be demonstrated using simple examples. The report contains a consistent Response Spectrum Analysis (RSA), which, as far as we know, has been formulated in this way for the first time. A consistent RSA is important because today this method is preferentially applied as a tool for the earthquake proof of components in nuclear power plants. (orig./HP)
GRAVITATIONALLY CONSISTENT HALO CATALOGS AND MERGER TREES FOR PRECISION COSMOLOGY
International Nuclear Information System (INIS)
Behroozi, Peter S.; Wechsler, Risa H.; Wu, Hao-Yi; Busha, Michael T.; Klypin, Anatoly A.; Primack, Joel R.
2013-01-01
We present a new algorithm for generating merger trees and halo catalogs which explicitly ensures consistency of halo properties (mass, position, and velocity) across time steps. Our algorithm has demonstrated the ability to improve both the completeness (through detecting and inserting otherwise missing halos) and purity (through detecting and removing spurious objects) of both merger trees and halo catalogs. In addition, our method is able to robustly measure the self-consistency of halo finders; it is the first to directly measure the uncertainties in halo positions, halo velocities, and the halo mass function for a given halo finder based on consistency between snapshots in cosmological simulations. We use this algorithm to generate merger trees for two large simulations (Bolshoi and Consuelo) and evaluate two halo finders (ROCKSTAR and BDM). We find that both the ROCKSTAR and BDM halo finders track halos extremely well; in both, the number of halos which do not have physically consistent progenitors is at the 1%-2% level across all halo masses. Our code is publicly available at http://code.google.com/p/consistent-trees. Our trees and catalogs are publicly available at http://hipacc.ucsc.edu/Bolshoi/.
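The core consistency idea (a halo's position and velocity in one snapshot should predict its position in the next) can be sketched as a toy check. The box size, tolerance, and error model below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def progenitor_consistent(pos0, vel0, pos1, dt, tol):
    """Flag halos whose next-snapshot position agrees with a ballistic
    prediction from the previous snapshot's position and velocity."""
    predicted = pos0 + vel0 * dt
    err = np.linalg.norm(pos1 - predicted, axis=1)
    return err < tol

rng = np.random.default_rng(2)
n = 1000
pos0 = rng.uniform(0, 250, size=(n, 3))   # positions, illustrative box units
vel0 = rng.normal(0, 1.0, size=(n, 3))    # displacement per snapshot interval
pos1 = pos0 + vel0 * 1.0 + rng.normal(0, 0.05, size=(n, 3))

# Corrupt 1% of the links to mimic halo-finder errors (spurious objects).
bad = rng.choice(n, size=10, replace=False)
pos1[bad] += 50.0

ok = progenitor_consistent(pos0, vel0, pos1, dt=1.0, tol=1.0)
print(f"{100 * (1 - ok.mean()):.1f}% inconsistent links")
```

In this toy setup the flagged fraction recovers the injected 1% error rate, mirroring the 1%-2% level of physically inconsistent progenitors the authors report for ROCKSTAR and BDM.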
Consistent Partial Least Squares Path Modeling via Regularization.
Jung, Sunho; Park, JaeHong
2018-01-01
Partial least squares (PLS) path modeling is a component-based structural equation modeling that has been adopted in social and psychological research due to its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity in part because it takes a strategy of estimating path coefficients based on consistent correlations among independent latent variables. PLSc has yet no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that our regularized PLSc is recommended for use when serious multicollinearity is present.
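The ridge step at the heart of the proposal can be sketched on the normal equations for path coefficients. This is a minimal illustration of why regularization helps under multicollinearity, with assumed correlation values; it is not the full regularized-PLSc estimator.

```python
import numpy as np

def ridge_path_coefficients(R_xx, r_xy, lam):
    """Ridge-type estimate of path coefficients: solve
    (R_xx + lam * I) beta = r_xy instead of R_xx beta = r_xy,
    stabilizing the solution when R_xx is near-singular."""
    p = R_xx.shape[0]
    return np.linalg.solve(R_xx + lam * np.eye(p), r_xy)

# Nearly collinear latent predictors: correlation 0.99 between them.
R_xx = np.array([[1.0, 0.99],
                 [0.99, 1.0]])
r_xy = np.array([0.5, 0.45])

beta_ols = np.linalg.solve(R_xx, r_xy)                    # unregularized: inflated
beta_ridge = ridge_path_coefficients(R_xx, r_xy, lam=0.1)

print(beta_ols.round(2), beta_ridge.round(2))
```

With correlation 0.99 the unregularized solution explodes to coefficients of opposite sign near ±2.7, while a small ridge penalty pulls the estimates back to a plausible magnitude, which is exactly the accuracy gain the simulation study evaluates.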
Context-dependent individual behavioral consistency in Daphnia
DEFF Research Database (Denmark)
Heuschele, Jan; Ekvall, Mikael T.; Bianco, Giuseppe
2017-01-01
The understanding of consistent individual differences in behavior, often termed "personality," for adapting and coping with threats and novel environmental conditions has advanced considerably during the last decade. However, advancements are almost exclusively associated with higher-order animals......, whereas studies focusing on smaller aquatic organisms are still rare. Here, we show individual differences in the swimming behavior of Daphnia magna, a clonal freshwater invertebrate, before, during, and after being exposed to a lethal threat, ultraviolet radiation (UVR). We show consistency in swimming...... that of adults. Overall, we show that aquatic invertebrates are far from being identical robots, but instead they show considerable individual differences in behavior that can be attributed to both ontogenetic development and individual consistency. Our study also demonstrates, for the first time...
Consistent forcing scheme in the cascaded lattice Boltzmann method
Fei, Linlin; Luo, Kai Hong
2017-11-01
In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
Application of consistent fluid added mass matrix to core seismic
International Nuclear Information System (INIS)
Koo, K. H.; Lee, J. H.
2003-01-01
In this paper, an algorithm for applying a consistent fluid added mass matrix, including the coupling terms, to core seismic analysis is developed and installed in the SAC-CORE3.0 code. As an example, we assumed a 7-hexagon system of the LMR core and carried out the vibration modal analysis and the nonlinear time-history seismic response analysis using SAC-CORE3.0. The consistent fluid added mass matrix used is obtained with the finite element program FAMD (Fluid Added Mass and Damping). The results of the vibration modal analysis show that the core duct assemblies exhibit strongly coupled vibration modes, quite different from the in-air condition. The results of the time-history seismic analysis verify that the effects of the coupling terms of the consistent fluid added mass matrix are significant in the impact responses and the dynamic responses.
Self-consistent approximations beyond the CPA: Part II
International Nuclear Information System (INIS)
Kaplan, T.; Gray, L.J.
1982-01-01
This paper concentrates on a self-consistent approximation for random alloys developed by Kaplan, Leath, Gray, and Diehl. The construction of the augmented-space formalism for a binary alloy is sketched, and the notation to be used is derived. Using the operator methods of the augmented space, the self-consistent approximation is derived for the average Green's function and for evaluating the self-energy, taking into account the scattering by clusters of excitations. The particular cluster approximation desired is derived by treating the scattering by the excitations with S_T exactly. Fourier transforms on the disorder-space cluster-site labels solve the self-consistent set of equations. Expansion to short-range order in the alloy is also discussed. A method to reduce the problem to a computationally tractable form is described
Linear augmented plane wave method for self-consistent calculations
International Nuclear Information System (INIS)
Takeda, T.; Kuebler, J.
1979-01-01
O.K. Andersen has recently introduced a linear augmented plane wave method (LAPW) for the calculation of electronic structure that was shown to be computationally fast. A more general formulation of an LAPW method is presented here. It makes use of a freely disposable number of eigenfunctions of the radial Schroedinger equation. These eigenfunctions can be selected in a self-consistent way. The present formulation also results in a computationally fast method. It is shown that Andersen's LAPW is obtained in a special limit from the present formulation. Self-consistent test calculations for copper show the present method to be remarkably accurate. As an application, scalar-relativistic self-consistent calculations are presented for the band structure of FCC lanthanum. (author)
Self-consistency and coherent effects in nonlinear resonances
International Nuclear Information System (INIS)
Hofmann, I.; Franchetti, G.; Qiang, J.; Ryne, R. D.
2003-01-01
The influence of space charge on emittance growth is studied in simulations of a coasting beam exposed to a strong octupolar perturbation in an otherwise linear lattice, and under stationary parameters. We explore the importance of self-consistency by comparing results with a non-self-consistent model, where the space charge electric field is kept 'frozen-in' to its initial values. For Gaussian distribution functions we find that the 'frozen-in' model results in a good approximation of the self-consistent model, hence coherent response is practically absent and the emittance growth is self-limiting due to space charge de-tuning. For KV or waterbag distributions, instead, strong coherent response is found, which we explain in terms of absence of Landau damping
A consistent time frame for Chaucer's Canterbury Pilgrimage
Kummerer, K. R.
2001-08-01
A consistent time frame for the pilgrimage that Geoffrey Chaucer describes in The Canterbury Tales can be established if the seven celestial assertions related to the journey mentioned in the text can be reconciled with each other and with the date of April 18 that is also mentioned. Past attempts to establish such a consistency for all seven celestial assertions have not been successful. The analysis herein, however, indicates that in The Canterbury Tales Chaucer accurately describes the celestial conditions he observed in the April sky above the London/Canterbury region of England in the latter half of the fourteenth century. All seven celestial assertions are in agreement with each other and consistent with the April 18 date. The actual words of Chaucer indicate that the Canterbury journey began during the 'seson' he defines in the General Prologue and ends under the light of the full Moon on the night of April 18, 1391.
An approach to a self-consistent nuclear energy system
International Nuclear Information System (INIS)
Fujii-e, Yoichi; Arie, Kazuo; Endo, Hiroshi
1992-01-01
A nuclear energy system should provide a stable supply of energy without endangering the environment or humans. If there is fear about exhausting world energy resources, accumulating radionuclides, and nuclear reactor safety, tension is created in human society. Nuclear energy systems of the future should be able to eliminate fear from people's minds. In other words, the whole system, including the nuclear fuel cycle, should be self-consistent. This is the ultimate goal of nuclear energy. If it can be realized, public acceptance of nuclear energy will increase significantly. In a self-consistent nuclear energy system, misunderstandings between experts on nuclear energy and the public should be minimized. The way to achieve this goal is to explain using simple logic. This paper proposes specific targets for self-consistent nuclear energy systems and shows that the fast breeder reactor (FBR) lies on the route to attaining the final goal
Consistent forcing scheme in the cascaded lattice Boltzmann method.
Fei, Linlin; Luo, Kai Hong
2017-11-01
In this paper, we give an alternative derivation for the cascaded lattice Boltzmann method (CLBM) within a general multiple-relaxation-time (MRT) framework by introducing a shift matrix. When the shift matrix is a unit matrix, the CLBM degrades into an MRT LBM. Based on this, a consistent forcing scheme is developed for the CLBM. The consistency of the nonslip rule, the second-order convergence rate in space, and the property of isotropy for the consistent forcing scheme is demonstrated through numerical simulations of several canonical problems. Several existing forcing schemes previously used in the CLBM are also examined. The study clarifies the relation between MRT LBM and CLBM under a general framework.
Consistency and Reconciliation Model In Regional Development Planning
Directory of Open Access Journals (Sweden)
Dina Suryawati
2016-10-01
Full Text Available The aim of this study was to identify the problems in, and determine a conceptual model of, regional development planning. Regional development planning is a systemic, complex and unstructured process. Therefore, this study used soft systems methodology to outline unstructured issues with a structured approach. The conceptual models successfully constructed in this study are a model of consistency and a model of reconciliation. Regional development planning is a process that must be well integrated with central planning and with inter-regional planning documents. Integration and consistency of regional planning documents are very important in order to achieve the development goals that have been set. On the other hand, the process of development planning in the region involves both a technocratic (top-down) system and a participatory (bottom-up) system. The two must be balanced and must not overlap or dominate each other. Keywords: regional, development, planning, consistency, reconciliation
Bootstrap-Based Inference for Cube Root Consistent Estimators
DEFF Research Database (Denmark)
Cattaneo, Matias D.; Jansson, Michael; Nagasawa, Kenichi
This note proposes a consistent bootstrap-based distributional approximation for cube root consistent estimators such as the maximum score estimator of Manski (1975) and the isotonic density estimator of Grenander (1956). In both cases, the standard nonparametric bootstrap is known...... to be inconsistent. Our method restores consistency of the nonparametric bootstrap by altering the shape of the criterion function defining the estimator whose distribution we seek to approximate. This modification leads to a generic and easy-to-implement resampling method for inference that is conceptually distinct...... from other available distributional approximations based on some form of modified bootstrap. We offer simulation evidence showcasing the performance of our inference method in finite samples. An extension of our methodology to general M-estimation problems is also discussed....
Self-consistent modelling of resonant tunnelling structures
DEFF Research Database (Denmark)
Fiig, T.; Jauho, A.P.
1992-01-01
We report a comprehensive study of the effects of self-consistency on the I-V-characteristics of resonant tunnelling structures. The calculational method is based on a simultaneous solution of the effective-mass Schrödinger equation and the Poisson equation, and the current is evaluated...... applied voltages and carrier densities at the emitter-barrier interface. We include the two-dimensional accumulation layer charge and the quantum well charge in our self-consistent scheme. We discuss the evaluation of the current contribution originating from the two-dimensional accumulation layer charges......, and our qualitative estimates seem consistent with recent experimental studies. The intrinsic bistability of resonant tunnelling diodes is analyzed within several different approximation schemes....
An Explicit Consistent Geometric Stiffness Matrix for the DKT Element
Directory of Open Access Journals (Sweden)
Eliseu Lucena Neto
Full Text Available Abstract A large number of references dealing with the geometric stiffness matrix of the DKT finite element exist in the literature, where nearly all of them adopt an inconsistent form. While such a matrix may be part of the element to treat nonlinear problems in general, it is of crucial importance for linearized buckling analysis. The present work seems to be the first to obtain an explicit expression for this matrix in a consistent way. Numerical results on linear buckling of plates assess the element performance either with the proposed explicit consistent matrix, or with the most commonly used inconsistent matrix.
The cluster bootstrap consistency in generalized estimating equations
Cheng, Guang
2013-03-01
The cluster bootstrap resamples clusters or subjects instead of individual observations in order to preserve the dependence within each cluster or subject. In this paper, we provide a theoretical justification of using the cluster bootstrap for the inferences of the generalized estimating equations (GEE) for clustered/longitudinal data. Under the general exchangeable bootstrap weights, we show that the cluster bootstrap yields a consistent approximation of the distribution of the regression estimate, and a consistent approximation of the confidence sets. We also show that a computationally more efficient one-step version of the cluster bootstrap provides asymptotically equivalent inference. © 2012.
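The resampling scheme described above (draw whole clusters with replacement, not individual observations) can be sketched for a simple statistic. The subject-effect data below are synthetic; a real GEE application would refit the estimating equations on each resample rather than take a mean.

```python
import numpy as np

def cluster_bootstrap_means(y, cluster_ids, n_boot=2000, seed=0):
    """Bootstrap the overall mean by resampling whole clusters with
    replacement, preserving the dependence within each cluster."""
    rng = np.random.default_rng(seed)
    clusters = np.unique(cluster_ids)
    groups = [y[cluster_ids == c] for c in clusters]
    stats = np.empty(n_boot)
    for b in range(n_boot):
        pick = rng.integers(0, len(groups), size=len(groups))
        stats[b] = np.concatenate([groups[i] for i in pick]).mean()
    return stats

# Toy longitudinal data: 30 subjects, 5 correlated observations each
# (a shared subject effect induces within-cluster dependence).
rng = np.random.default_rng(3)
subj = np.repeat(np.arange(30), 5)
y = rng.normal(0, 1, 30)[subj] + rng.normal(0, 0.5, 150)

boot = cluster_bootstrap_means(y, subj)
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"95% CI: [{lo:.2f}, {hi:.2f}]")
```

Resampling individual observations instead would treat the 150 correlated measurements as independent and produce intervals that are too narrow; resampling the 30 subjects keeps each cluster's dependence structure intact.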
Consistency in the description of diffusion in compacted bentonite
International Nuclear Information System (INIS)
Lehikoinen, J.; Muurinen, A.
2009-01-01
A macro-level diffusion model, which aims to provide a unifying framework for explaining the experimentally observed co-ion exclusion and the highly controversial counter-ion surface diffusion in a consistent fashion, is presented. It is explained in detail why a term accounting for the non-zero mobility of the counter-ion surface excess is required in the mathematical form of the macroscopic diffusion flux. The prerequisites for the consistency of the model and the problems associated with the interpretation of diffusion in such complex pore geometries as in compacted smectite clays are discussed. (author)
An energetically consistent vertical mixing parameterization in CCSM4
DEFF Research Database (Denmark)
Nielsen, Søren Borg; Jochum, Markus; Eden, Carsten
2018-01-01
An energetically consistent stratification-dependent vertical mixing parameterization is implemented in the Community Climate System Model 4 and forced with energy conversion from the barotropic tides to internal waves. The structures of the resulting dissipation and diffusivity fields are compared......, however, depends greatly on the details of the vertical mixing parameterizations, where the new energetically consistent parameterization results in low thermocline diffusivities and a sharper and shallower thermocline. It is also investigated if the ocean state is more sensitive to a change in forcing...
The consistency service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2011-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
The Consistency Service of the ATLAS Distributed Data Management system
Serfon, C; The ATLAS collaboration
2010-01-01
With the continuously increasing volume of data produced by ATLAS and stored on the WLCG sites, the probability of data corruption or data loss due to software and hardware failures is increasing. In order to ensure the consistency of all data produced by ATLAS, a Consistency Service has been developed as part of the DQ2 Distributed Data Management system. This service is fed by the different ATLAS tools, i.e. the analysis tools, production tools, DQ2 site services, or by site administrators that report corrupted or lost files. It automatically corrects the errors reported and informs the users in case of irrecoverable file loss.
Consistency among integral measurements of aggregate decay heat power
Energy Technology Data Exchange (ETDEWEB)
Takeuchi, H.; Sagisaka, M.; Oyamatsu, K.; Kukita, Y. [Nagoya Univ. (Japan)
1998-03-01
Persisting discrepancies between summation calculations and integral measurements force us to assume large uncertainties in the recommended decay heat power. In this paper, we develop a hybrid method to calculate the decay heat power of a fissioning system from those of different fissioning systems. Then, this method is applied to examine consistency among measured decay heat powers of ²³²Th, ²³³U, ²³⁵U, ²³⁸U and ²³⁹Pu at YAYOI. The consistency among the measured values is found to be satisfied for the β component, and fairly well for the γ component, except for cooling times longer than 4000 s. (author)
Standard Model Vacuum Stability and Weyl Consistency Conditions
DEFF Research Database (Denmark)
Antipin, Oleg; Gillioz, Marc; Krog, Jens
2013-01-01
At high energy the standard model possesses conformal symmetry at the classical level. This is reflected at the quantum level by relations between the different beta functions of the model. These relations are known as the Weyl consistency conditions. We show that it is possible to satisfy them...... order by order in perturbation theory, provided that a suitable coupling constant counting scheme is used. As a direct phenomenological application, we study the stability of the standard model vacuum at high energies and compare with previous computations violating the Weyl consistency conditions....
STP: A mathematically and physically consistent library of steam properties
International Nuclear Information System (INIS)
Aguilar, F.; Hutter, A.C.; Tuttle, P.G.
1982-01-01
A new FORTRAN library of subroutines has been developed from the fundamental equation of Keenan et al. to evaluate a large set of water properties including derivatives such as sound speed and isothermal compressibility. The STP library uses the true saturation envelope of the Keenan et al. fundamental equation. The evaluation of the true envelope by a continuation method is explained. This envelope, along with other design features, imparts an exceptionally high degree of thermodynamic and mathematical consistency to the STP library, even at the critical point. Accuracy and smoothness, library self-consistency, and designed user convenience make the STP library a reliable and versatile water property package
Weyl consistency conditions in non-relativistic quantum field theory
Energy Technology Data Exchange (ETDEWEB)
Pal, Sridip; Grinstein, Benjamín [Department of Physics, University of California,San Diego, 9500 Gilman Drive, La Jolla, CA 92093 (United States)
2016-12-05
Weyl consistency conditions have been used in unitary relativistic quantum field theory to impose constraints on the renormalization group flow of certain quantities. We classify the Weyl anomalies and their renormalization scheme ambiguities for generic non-relativistic theories in 2+1 dimensions with anisotropic scaling exponent z=2; the extension to other values of z are discussed as well. We give the consistency conditions among these anomalies. As an application we find several candidates for a C-theorem. We comment on possible candidates for a C-theorem in higher dimensions.
A Van Atta reflector consisting of half-wave dipoles
DEFF Research Database (Denmark)
Appel-Hansen, Jørgen
1966-01-01
The reradiation pattern of a passive Van Atta reflector consisting of half-wave dipoles is investigated. The character of the reradiation pattern first is deduced by qualitative and physical considerations. Various types of array elements are considered and several geometrical configurations...... of these elements are outlined. Following this, an analysis is made of the reradiation pattern of a linear Van Atta array consisting of four equispaced half-wave dipoles. The general form of the reradiation pattern is studied analytically. The influence of scattering and coupling is determined and the dependence...
A self-consistent theory of the magnetic polaron
International Nuclear Information System (INIS)
Marvakov, D.I.; Kuzemsky, A.L.; Vlahov, J.P.
1984-10-01
A finite-temperature self-consistent theory of the magnetic polaron in the s-f model of ferromagnetic semiconductors is developed. The calculations are based on a novel approach to the thermodynamic two-time Green function methods. This approach consists in the introduction of the "irreducible" Green functions (IGF) and the derivation of the exact Dyson equation and exact self-energy operator. It is shown that the IGF method gives a unified and natural approach for calculating the magnetic polaron states by taking explicitly into account the damping effects and finite lifetime. (author)
Evidence for Consistency of the Glycation Gap in Diabetes
Nayak, Ananth U.; Holland, Martin R.; Macdonald, David R.; Nevill, Alan; Singh, Baldev M.
2011-01-01
OBJECTIVE Discordance between HbA1c and fructosamine estimations in the assessment of glycemia is often encountered. A number of mechanisms might explain such discordance, but whether it is consistent is uncertain. This study aims to coanalyze paired glycosylated hemoglobin (HbA1c)-fructosamine estimations by using fructosamine to determine a predicted HbA1c, to calculate a glycation gap (G-gap) and to determine whether the G-gap is consistent over time. RESEARCH DESIGN AND METHODS We include...
Diagnostic language consistency among multicultural English-speaking nurses.
Wieck, K L
1996-01-01
Cultural differences among nurses may influence the choice of terminology applicable to use of a nursing diagnostic statement. This study explored whether defining characteristics are consistently applied by culturally varied nurses in an English-language setting. Two diagnoses, pain and high risk for altered skin integrity, were studied within six cultures: African, Asian, Filipino, East Indian, African-American, and Anglo-American nurses. Overall, there was consistency between the cultural groups. Analysis of variance for the pain scale demonstrated differences among cultures on two characteristics of pain, restlessness and grimace. The only difference on the high risk for altered skin integrity scale was found on the descriptor, supple skin.
Liu, Yiqiao; Zhou, Bo; Qutaish, Mohammed; Wilson, David L.
2016-01-01
We created a metastasis imaging, analysis platform consisting of software and multi-spectral cryo-imaging system suitable for evaluating emerging imaging agents targeting micro-metastatic tumor. We analyzed CREKA-Gd in MRI, followed by cryo-imaging which repeatedly sectioned and tiled microscope images of the tissue block face, providing anatomical bright field and molecular fluorescence, enabling 3D microscopic imaging of the entire mouse with single metastatic cell sensitivity. To register ...
Achieving Consistent Doppler Measurements from SDO/HMI Vector Field Inversions
Schuck, Peter W.; Antiochos, S. K.; Leka, K. D.; Barnes, Graham
2016-01-01
NASA's Solar Dynamics Observatory is delivering vector magnetic field observations of the full solar disk with unprecedented temporal and spatial resolution; however, the satellite is in a highly inclined geosynchronous orbit. The relative spacecraft-Sun velocity varies by ±3 km/s over a day, which introduces major orbital artifacts in the Helioseismic and Magnetic Imager (HMI) data. We demonstrate that the orbital artifacts contaminate all spatial and temporal scales in the data. We describe a newly developed three-stage procedure for mitigating these artifacts in the Doppler data obtained from the Milne-Eddington inversions in the HMI pipeline. The procedure ultimately uses 32 velocity-dependent coefficients to adjust 10 million pixels, a remarkably sparse correction model given the complexity of the orbital artifacts. This procedure was applied to full-disk images of AR 11084 to produce consistent Dopplergrams. The data adjustments reduce the power in the orbital artifacts by 31 dB. Furthermore, we analyze the corrected images in detail and show that our procedure greatly improves the temporal and spectral properties of the data without adding any new artifacts. We conclude that this new procedure dramatically improves the consistency of the HMI data and its usefulness for precision scientific studies.
A cosmic microwave background feature consistent with a cosmic texture.
Cruz, M; Turok, N; Vielva, P; Martínez-González, E; Hobson, M
2007-12-07
The Cosmic Microwave Background provides our most ancient image of the universe and our best tool for studying its early evolution. Theories of high-energy physics predict the formation of various types of topological defects in the very early universe, including cosmic texture, which would generate hot and cold spots in the Cosmic Microwave Background. We show through a Bayesian statistical analysis that the most prominent 5°-radius cold spot observed in all-sky images, which is otherwise hard to explain, is compatible with having been caused by a texture. From this model, we constrain the fundamental symmetry-breaking energy scale to be φ0 ≈ 8.7 × 10^15 gigaelectron volts. If confirmed, this detection of a cosmic defect will probe physics at energies exceeding any conceivable terrestrial experiment.
The least weighted squares II. Consistency and asymptotic normality
Czech Academy of Sciences Publication Activity Database
Víšek, Jan Ámos
2002-01-01
Roč. 9, č. 16 (2002), s. 1-28 ISSN 1212-074X R&D Projects: GA AV ČR KSK1019101 Grant - others:GA UK(CR) 255/2000/A EK /FSV Institutional research plan: CEZ:AV0Z1075907 Keywords : robust regression * consistency * asymptotic normality Subject RIV: BA - General Mathematics
Consistency relation for the Lorentz invariant single-field inflation
International Nuclear Information System (INIS)
Huang, Qing-Guo
2010-01-01
In this paper we compute the sizes of the equilateral and orthogonal shape bispectra for general Lorentz invariant single-field inflation. The stability of the field theory implies a non-negative square of the sound speed, which leads to a consistency relation between the sizes of the orthogonal and equilateral shape bispectra, namely f_NL^orth ≤ -0.054 f_NL^equil. In particular, for single-field Dirac-Born-Infeld (DBI) inflation, the consistency relation becomes f_NL^orth = 0.070 f_NL^equil ≤ 0. These consistency relations are also valid in the mixed scenario where the quantum fluctuations of some other light scalar fields contribute a part of the total curvature perturbation on super-horizon scales and may generate a local-form bispectrum. A distinguishing prediction of the mixed scenario is τ_NL^loc > ((6/5) f_NL^loc)^2. Comparing these consistency relations to WMAP 7yr data, there is still substantial room for Lorentz invariant inflation, but DBI inflation has been disfavored at more than 68% CL.
Short-Cut Estimators of Criterion-Referenced Test Consistency.
Brown, James Dean
1990-01-01
Presents simplified methods for deriving estimates of the consistency of criterion-referenced, English-as-a-Second-Language tests, including (1) the threshold loss agreement approach using agreement or kappa coefficients, (2) the squared-error loss agreement approach using the phi(lambda) dependability approach, and (3) the domain score…
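The threshold loss agreement approach named above can be illustrated with a short sketch: classify each examinee as master or non-master on two administrations of the test, then compute the raw agreement coefficient and Cohen's kappa from the resulting 2x2 table. This is a generic illustration, not Brown's code; the function name, cut score, and data are hypothetical.

```python
def agreement_and_kappa(scores1, scores2, cut):
    """Threshold-loss agreement for two test administrations."""
    n = len(scores1)
    # 2x2 contingency table: master (>= cut) vs. non-master on each form
    a = sum(1 for x, y in zip(scores1, scores2) if x >= cut and y >= cut)
    b = sum(1 for x, y in zip(scores1, scores2) if x >= cut and y < cut)
    c = sum(1 for x, y in zip(scores1, scores2) if x < cut and y >= cut)
    d = n - a - b - c
    p0 = (a + d) / n                       # observed (raw) agreement
    p1, p2 = (a + b) / n, (a + c) / n      # marginal master proportions
    pc = p1 * p2 + (1 - p1) * (1 - p2)     # agreement expected by chance
    kappa = (p0 - pc) / (1 - pc)           # assumes pc < 1
    return p0, kappa
```

With identical mastery decisions on both forms, p0 is 1 and kappa is 1; kappa shrinks toward 0 as agreement approaches the chance level.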
Social Comparison, Self-Consistency and the Presentation of Self.
Morse, Stanley J.; Gergen, Kenneth J.
To discover how a person's (P) self-concept is affected by the characteristics of another (O) who suddenly appears in the same social environment, several questionnaires, including the Gergen-Morse (1967) self-consistency scale and half the Coopersmith self-esteem inventory, were administered to 78 undergraduate men who had answered an ad for work…
Consistency of the Takens estimator for the correlation dimension
Borovkova, S.; Burton, Robert; Dehling, H.
Motivated by the problem of estimating the fractal dimension of a strange attractor, we prove weak consistency of U-statistics for stationary ergodic and mixing sequences when the kernel function is unbounded, extending by this earlier results of Aaronson, Burton, Dehling, Gilat, Hill and Weiss. We
Assessing atmospheric bias correction for dynamical consistency using potential vorticity
International Nuclear Information System (INIS)
Rocheta, Eytan; Sharma, Ashish; Evans, Jason P
2014-01-01
Correcting biases in atmospheric variables prior to impact studies or dynamical downscaling can lead to new biases, as dynamical consistency between the ‘corrected’ fields is not maintained. Use of these bias-corrected fields for subsequent impact studies and dynamical downscaling provides input conditions that do not appropriately represent intervariable relationships in atmospheric fields. Here we investigate the consequences of the lack of dynamical consistency in bias correction using a measure of model consistency, the potential vorticity (PV). This paper presents an assessment of the biases present in PV using two alternative correction techniques: an approach where bias correction is performed individually on each atmospheric variable, thereby ignoring the physical relationships that exist between the multiple variables that are corrected, and a second approach where bias correction is performed directly on the PV field, thereby keeping the system dynamically coherent throughout the correction process. In this paper we show that bias correcting variables independently results in increased errors above the tropopause in the mean and standard deviation of the PV field, which are improved when using the alternative proposed. Furthermore, patterns of spatial variability are improved over nearly all vertical levels when applying the alternative approach. Results point to a need for a dynamically consistent atmospheric bias correction technique which results in fields that can be used as dynamically consistent lateral boundaries in follow-up downscaling applications. (letter)
An algebraic method for constructing stable and consistent autoregressive filters
International Nuclear Information System (INIS)
Harlim, John; Hong, Hoon; Robbins, Jacob L.
2015-01-01
In this paper, we introduce an algebraic method to construct stable and consistent univariate autoregressive (AR) models of low order for filtering and predicting nonlinear turbulent signals with memory depth. By stable, we refer to the classical stability condition for the AR model. By consistent, we refer to the classical consistency constraints of Adams–Bashforth methods of order two. One attractive feature of this algebraic method is that the model parameters can be obtained without directly knowing any training data set, as opposed to many standard, regression-based parameterization methods; it takes only long-time average statistics as inputs. The proposed method provides a discretization time step interval which guarantees the existence of a stable and consistent AR model and simultaneously produces the parameters for the AR models. In our numerical examples with two chaotic time series with different characteristics of decaying time scales, we find that the proposed AR models produce significantly more accurate short-term predictive skill and comparable filtering skill relative to the linear regression-based AR models. These encouraging results are robust across wide ranges of discretization times, observation times, and observation noise variances. Finally, we also find that the proposed model produces an improved short-time prediction relative to the linear regression-based AR models in forecasting a data set that characterizes the variability of the Madden–Julian Oscillation, a dominant tropical atmospheric wave pattern.
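As a minimal illustration of the classical stability condition mentioned in the abstract, the sketch below checks whether an AR(2) model x_t = a1*x_{t-1} + a2*x_{t-2} + noise is stationary via the standard triangle conditions on its coefficients. This is a generic textbook check, not the authors' algebraic construction; coefficient values are hypothetical.

```python
def ar2_is_stable(a1, a2):
    """Classical stability (stationarity) check for an AR(2) model.

    Equivalent to requiring that the roots of the characteristic
    polynomial 1 - a1*z - a2*z**2 lie outside the unit circle.
    """
    return abs(a2) < 1 and a1 + a2 < 1 and a2 - a1 < 1
```

For example, (a1, a2) = (0.5, 0.3) is stable, while a unit-root model with a1 + a2 = 1 is not.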
Delimiting coefficient alpha from internal consistency and unidimensionality
Sijtsma, K.
2015-01-01
I discuss the contribution by Davenport, Davison, Liou, & Love (2015) in which they relate reliability represented by coefficient α to formal definitions of internal consistency and unidimensionality, both proposed by Cronbach (1951). I argue that coefficient α is a lower bound to reliability and
Challenges of Predictability and Consistency in the First ...
African Journals Online (AJOL)
This article aims to investigate some features of Endemann's (1911) Wörterbuch der Sotho-Sprache (Dictionary of the Sotho language) with the focus on challenges of predictability and consistency in the lemmatization approach, the access alphabet, cross references and article treatments. The dictionary has hitherto ...
The Impact of Orthographic Consistency on German Spoken Word Identification
Beyermann, Sandra; Penke, Martina
2014-01-01
An auditory lexical decision experiment was conducted to find out whether sound-to-spelling consistency has an impact on German spoken word processing, and whether such an impact is different at different stages of reading development. Four groups of readers (school children in the second, third and fifth grades, and university students)…
Final Report Fermionic Symmetries and Self consistent Shell Model
International Nuclear Information System (INIS)
Zamick, Larry
2008-01-01
In this final report in the field of theoretical nuclear physics we note important accomplishments. We were confronted with 'anomalous' magnetic moments by the experimentalists and were able to explain them. We found unexpected partial dynamical symmetries, completely unknown before, and were able to explain them to a large extent. The importance of a self-consistent shell model was emphasized.
Using the Perceptron Algorithm to Find Consistent Hypotheses
Anthony, M.; Shawe-Taylor, J.
1993-01-01
The perceptron learning algorithm yields quite naturally an algorithm for finding a linearly separable boolean function consistent with a sample of such a function. Using the idea of a specifying sample, we give a simple proof that this algorithm is not efficient, in general.
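The perceptron algorithm referred to above can be sketched as follows: cycle through the sample, updating the weight vector on each misclassified point until every point is classified correctly, at which stage the weights encode a hypothesis consistent with the sample. The toy sample below encodes the boolean AND function (with a constant bias input); names and the epoch cap are illustrative.

```python
def perceptron(samples, epochs=100):
    """Find weights consistent with a linearly separable sample.

    samples: list of (x, label) pairs with label in {-1, +1};
    each x includes a leading 1 as a bias term.
    """
    w = [0.0] * len(samples[0][0])
    for _ in range(epochs):
        updated = False
        for x, y in samples:
            # Misclassified (or on the boundary): nudge w toward y * x
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                w = [wi + y * xi for wi, xi in zip(w, x)]
                updated = True
        if not updated:      # a full pass with no mistakes: consistent
            return w
    return w

# Boolean AND, encoded with bias input 1 and labels in {-1, +1}
and_sample = [([1, 0, 0], -1), ([1, 0, 1], -1),
              ([1, 1, 0], -1), ([1, 1, 1], +1)]
```

Since AND is linearly separable, the loop terminates with a weight vector that classifies all four points correctly.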
Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines
Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.
2011-01-01
Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
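A minimal sketch of a nonparametric "probability machine" in the sense described above is the k-nearest-neighbour estimate of P(Y = 1 | x): average the binary responses of the k training points closest to the query. This is a toy illustration with synthetic data, not the authors' R implementation.

```python
def knn_probability(train_x, train_y, query, k=3):
    """Estimate P(Y = 1 | query) as the positive fraction among the
    k nearest training points (squared Euclidean distance)."""
    order = sorted(
        range(len(train_x)),
        key=lambda i: sum((a - b) ** 2 for a, b in zip(train_x[i], query)),
    )
    nearest = order[:k]
    return sum(train_y[i] for i in nearest) / k
```

A query deep inside the negative cluster yields an estimated probability near 0, and one inside the positive cluster near 1; consistency of such estimators requires k to grow suitably with the sample size.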
Consistent seasonal snow cover depth and duration variability over ...
Indian Academy of Sciences (India)
Decline in consistent seasonal snow cover depth and duration and a changing snow cover build-up pattern over the WH in recent decades indicate that the WH has undergone considerable climate change and that winter weather patterns are changing in the WH.
Is There a Future for Education Consistent with Agenda 21?
Smyth, John
1999-01-01
Discusses recent experiences in developing and implementing strategies for education consistent with the concept of sustainable development at two different levels: (1) the international level characterized by Agenda 21 along with the efforts of the United Nations Commission on Sustainable Development to foster its progress; and (2) the national…
Diagnosing a Strong-Fault Model by Conflict and Consistency
Directory of Open Access Journals (Sweden)
Wenfeng Zhang
2018-03-01
The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Current diagnosis methods usually employ conflicts to isolate possible faults; the process can be expedited when some observed output is consistent with the model’s prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. First, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). The proposed LTMS is then employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidates efficiently based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods perform significantly better than best-first and conflict-directed A* search methods.
Consistent dynamical and statistical description of fission and comparison
Energy Technology Data Exchange (ETDEWEB)
Shunuan, Wang [Chinese Nuclear Data Center, Beijing, BJ (China)
1996-06-01
A survey of research on the consistent dynamical and statistical description of fission is briefly presented. The channel theory of fission with diffusive dynamics, based on the Bohr channel theory of fission and the Fokker-Planck equation, and the Kramers-modified Bohr-Wheeler expression according to the Strutinsky method given by P. Frobrich et al. are compared and analyzed. (2 figs.).
Brief Report: Consistency of Search Engine Rankings for Autism Websites
Reichow, Brian; Naples, Adam; Steinhoff, Timothy; Halpern, Jason; Volkmar, Fred R.
2012-01-01
The World Wide Web is one of the most common methods used by parents to find information on autism spectrum disorders and most consumers find information through search engines such as Google or Bing. However, little is known about how the search engines operate or the consistency of the results that are returned over time. This study presents the…
Consistency of the Self-Schema in Depression.
Ross, Michael J.; Mueller, John H.
Depressed individuals may filter or distort environmental information in direct relationship to their self perceptions. To investigate the degree of uncertainty about oneself and others, as measured by consistent/inconsistent responses, 72 college students (32 depressed and 40 nondepressed) rated selected adjectives from the Derry and Kuiper…
Composition consisting of a dendrimer and an active substance
1995-01-01
The invention relates to a composition consisting of a dendrimer provided with blocking agents and an active substance occluded in the dendrimer. According to the invention a blocking agent is a compound which is sterically of sufficient size, which readily enters into a chemical bond with the
Analytical relativistic self-consistent-field calculations for atoms
International Nuclear Information System (INIS)
Barthelat, J.C.; Pelissier, M.; Durand, P.
1980-01-01
A new second-order representation of the Dirac equation is presented. This representation which is exact for a hydrogen atom is applied to approximate analytical self-consistent-field calculations for atoms. Results are given for the rare-gas atoms from helium to radon and for lead. The results compare favorably with numerical Dirac-Hartree-Fock solutions
A consistent analysis for the quark condensate in QCD
International Nuclear Information System (INIS)
Huang Zheng; Huang Tao
1988-08-01
The dynamical symmetry breaking in QCD is analysed on the basis of the vacuum condensates. A self-consistent equation for the quark condensate ⟨φ̄φ⟩ is derived. A nontrivial solution with ⟨φ̄φ⟩ ≠ 0 is given in terms of the QCD scale parameter Λ.
The consistency assessment of topological relations in cartographic generalization
Zheng, Chunyan; Guo, Qingsheng; Du, Xiaochu
2006-10-01
Research on generalization assessment has received less attention than the generalization process itself, yet maintaining topological consistency is essential for generalization quality. This paper proposes a methodology to assess the quality of a generalized map through the consistency of its topological relations. Taking roads (including railways) and residential areas as examples, some issues of topological consistency at different map scales are analyzed from the viewpoint of spatial cognition. Statistical information about the inconsistent topological relations can be obtained by comparing two matrices: one for the topological relations in the generalized map, the other a theoretical matrix of the topological relations that should be maintained after generalization. Based on fuzzy set theory and the classification of map object types, a consistency evaluation model for topological relations is established. Finally, the paper demonstrates the feasibility of the method with an example evaluating the local topological relations between simple roads and a residential area.
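The matrix comparison step described above can be sketched as a simple cell-by-cell count of topological relations that changed where they should have been preserved. This is an illustrative reading of the method; the relation labels and function name are hypothetical.

```python
def count_inconsistencies(generalized, theoretical):
    """Count cells where the topological relation in the generalized map
    differs from the theoretical relation that should be maintained."""
    return sum(
        1
        for row_g, row_t in zip(generalized, theoretical)
        for g, t in zip(row_g, row_t)
        if g != t
    )
```

The resulting count (or a per-class breakdown of it) would then feed the fuzzy consistency evaluation model.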
Numerical consistency check between two approaches to radiative ...
Indian Academy of Sciences (India)
approaches for a consistency check on numerical accuracy, and find out the stability… from ln(M_R/1 GeV) to the top-quark mass scale t_0 (= ln(m_t/1 GeV)), where t_0 ≤ t ≤ t_R, we… It is in general to tone down the solar mixing angle through further fine…
Diagnosing a Strong-Fault Model by Conflict and Consistency.
Zhang, Wenfeng; Zhao, Qi; Zhao, Hongbo; Zhou, Gan; Feng, Wenquan
2018-03-29
The diagnosis method for a weak-fault model, with only normal behaviors of each component, has evolved over decades. However, many systems now demand strong-fault models, whose fault modes have specific behaviors as well. It is difficult to diagnose a strong-fault model due to its non-monotonicity. Current diagnosis methods usually employ conflicts to isolate possible faults; the process can be expedited when some observed output is consistent with the model's prediction, where the consistency indicates probably normal components. This paper solves the problem of efficiently diagnosing a strong-fault model by proposing a novel Logic-based Truth Maintenance System (LTMS) with two search approaches based on conflict and consistency. First, the original strong-fault model is encoded with Boolean variables and converted into Conjunctive Normal Form (CNF). The proposed LTMS is then employed to reason over the CNF and find multiple minimal conflicts and maximal consistencies when a fault exists. The search approaches offer the best candidates efficiently based on the reasoning result until the diagnosis results are obtained. The completeness, coverage, correctness and complexity of the proposals are analyzed theoretically to show their strengths and weaknesses. Finally, the proposed approaches are demonstrated by applying them to a real-world domain, the heat control unit of a spacecraft, where the proposed methods perform significantly better than best-first and conflict-directed A* search methods.
Consistency Check for the Bin Packing Constraint Revisited
Dupuis, Julien; Schaus, Pierre; Deville, Yves
The bin packing problem (BP) consists in finding the minimum number of bins necessary to pack a set of items so that the total size of the items in each bin does not exceed the bin capacity C. The bin capacity is common for all the bins.
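As a brief illustration of the reasoning a consistency check for this constraint builds on, the optimal bin count is bracketed below by the load-based lower bound ceil(sum(sizes)/C) and a first-fit-decreasing upper bound. This is a standard heuristic sketch, not the propagation algorithm of the paper; names and data are illustrative.

```python
import math

def ffd_bin_count(sizes, capacity):
    """First-fit decreasing: an upper bound on the optimal bin count."""
    bins = []                      # current load of each open bin
    for s in sorted(sizes, reverse=True):
        for b in range(len(bins)):
            if bins[b] + s <= capacity:
                bins[b] += s       # place item in first bin with room
                break
        else:
            bins.append(s)         # no bin fits: open a new one
    return len(bins)

def load_lower_bound(sizes, capacity):
    """Total load divided by capacity, rounded up."""
    return math.ceil(sum(sizes) / capacity)
```

A candidate bin count m is immediately inconsistent if m < load_lower_bound(sizes, C), and trivially feasible if m >= ffd_bin_count(sizes, C).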
Matrix analysis for associated consistency in cooperative game theory
Xu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache's recent axiomatization of the well-known Shapley value for TU games states that the Shapley value is the unique solution verifying the following three axioms: the inessential game property, continuity and associated consistency. Driessen extended Hamiache's axiomatization to the enlarged
Matrix analysis for associated consistency in cooperative game theory
Xu Genjiu, G.; Driessen, Theo; Sun, H.; Sun, H.
Hamiache axiomatized the Shapley value as the unique solution verifying the inessential game property, continuity and associated consistency. Driessen extended Hamiache’s axiomatization to the enlarged class of efficient, symmetric, and linear values. In this paper, we introduce the notion of row
Consistent measurements comparing the drift features of noble gas mixtures
Becker, U; Fortunato, E M; Kirchner, J; Rosera, K; Uchida, Y
1999-01-01
We present a consistent set of measurements of electron drift velocities and Lorentz deflection angles for all noble gases with methane and ethane as quenchers in magnetic fields up to 0.8 T. Empirical descriptions are also presented. Details on the World Wide Web allow for guided design and optimization of future detectors.
Consistency in behavior of the CEO regarding corporate social responsibility
Elving, W.J.L.; Kartal, D.
2012-01-01
Purpose - When corporations adopt a corporate social responsibility (CSR) program and use and name it in their external communications, their members should act in line with CSR. The purpose of this paper is to present an experiment in which the consistent or inconsistent behavior of a CEO was
Self-consistent description of the isospin mixing
International Nuclear Information System (INIS)
Gabrakov, S.I.; Pyatov, N.I.; Baznat, M.I.; Salamov, D.I.
1978-03-01
The properties of collective 0⁺ states built of unlike particle-hole excitations in spherical nuclei have been investigated in a self-consistent microscopic approach. These states arise when the broken isospin symmetry of the nuclear shell model Hamiltonian is restored. The numerical calculations were performed with Woods-Saxon wave functions.
Potential application of the consistency approach for vaccine potency testing.
Arciniega, J; Sirota, L A
2012-01-01
The Consistency Approach offers the possibility of reducing the number of animals used for a potency test. However, it is critical to assess the effect that such reduction may have on assay performance. Consistency of production, sometimes referred to as consistency of manufacture or manufacturing, is an old concept implicit in regulation, which aims to ensure the uninterrupted release of safe and effective products. Consistency of manufacture can be described in terms of process capability, or the ability of a process to produce output within specification limits. For example, the standard method for potency testing of inactivated rabies vaccines is a multiple-dilution vaccination challenge test in mice that gives a quantitative, although highly variable estimate. On the other hand, a single-dilution test that does not give a quantitative estimate, but rather shows if the vaccine meets the specification has been proposed. This simplified test can lead to a considerable reduction in the number of animals used. However, traditional indices of process capability assume that the output population (potency values) is normally distributed, which clearly is not the case for the simplified approach. Appropriate computation of capability indices for the latter case will require special statistical considerations.
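The process-capability idea mentioned above can be sketched numerically: for a normally distributed output, the index Cpk compares the distance from the process mean to the nearer specification limit against three standard deviations. The potency values and specification limits below are hypothetical.

```python
def cpk(values, lsl, usl):
    """Process capability index Cpk for a normally distributed output.

    lsl/usl: lower and upper specification limits.
    """
    n = len(values)
    mean = sum(values) / n
    # Sample standard deviation (n - 1 denominator)
    sd = (sum((v - mean) ** 2 for v in values) / (n - 1)) ** 0.5
    return min((usl - mean) / (3 * sd), (mean - lsl) / (3 * sd))
```

A Cpk well above 1 indicates the process output fits comfortably within the specification limits; as the abstract notes, such normal-theory indices are not appropriate for pass/fail outputs like the simplified single-dilution test.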
Consistent and robust determination of border ownership based on asymmetric surrounding contrast.
Sakai, Ko; Nishimura, Haruka; Shimizu, Ryohei; Kondo, Keiichi
2012-09-01
Determination of the figure region in an image is a fundamental step toward surface construction, shape coding, and object representation. Localized, asymmetric surround modulation, reported neurophysiologically in early-to-intermediate-level visual areas, has been proposed as a mechanism for figure-ground segregation. We investigated, computationally, whether such surround modulation is capable of yielding consistent and robust determination of figure side for various stimuli. Our surround modulation model showed a surprisingly high consistency among pseudorandom block stimuli, with greater consistency for stimuli that yielded higher accuracy of, and shorter reaction times in, human perception. Our analyses revealed that the localized, asymmetric organization of surrounds is crucial in the detection of the contrast imbalance that leads to the determination of the direction of figure with respect to the border. The model also exhibited robustness for gray-scaled natural images, with a mean correct rate of 67%, which was similar to that of figure-side determination in human perception through a small window and of machine-vision algorithms based on local processing. These results suggest a crucial role of surround modulation in the local processing of figure-ground segregation. Copyright © 2012 Elsevier Ltd. All rights reserved.
International Nuclear Information System (INIS)
Borges, Alexandra; Casselman, Jan
2010-01-01
Of all cranial nerves, the trigeminal nerve is the largest and the most widely distributed in the supra-hyoid neck. It provides sensory input from the face and motor innervation to the muscles of mastication. In order to adequately image the full course of the trigeminal nerve and its main branches, a detailed knowledge of neuroanatomy and imaging technique is required. Although the main trunk of the trigeminal nerve is consistently seen on conventional brain studies, high-resolution tailored imaging is mandatory to depict smaller nerve branches and subtle pathologic processes. Ongoing developments in imaging technique have made possible isotropic sub-millimetric images and curved reconstructions of cranial nerves and their branches, and have led to an increasing recognition of symptomatic trigeminal neuropathies. Whereas MRI has a higher diagnostic yield in patients with trigeminal neuropathy, CT is still required to demonstrate the bony anatomy of the skull base and is the modality of choice in the context of traumatic injury to the nerve. Imaging of the trigeminal nerve is particularly cumbersome, as its long course from the brainstem nuclei to the peripheral branches and its rich anastomotic network impede, in most cases, a topographic approach. Therefore, except in cases of classic trigeminal neuralgia, in which imaging studies can be tailored to the root entry zone, the full course of the trigeminal nerve has to be imaged. This article provides an update on the most recent advances in MR imaging technique and a segmental imaging approach to the most common pathologic processes affecting the trigeminal nerve.
Energy Technology Data Exchange (ETDEWEB)
Borges, Alexandra [Radiology Department, Instituto Portugues de Oncologia Francisco Gentil, Centro de Lisboa, Rua Prof. Lima Basto, 1093, Lisboa (Portugal)], E-mail: borgalexandra@gmail.com; Casselman, Jan [Department of Radiology, A. Z. St Jan Brugge and A. Z. St Augustinus Antwerpen Hospitals (Belgium)
2010-05-15
Performance and consistency of indicator groups in two biodiversity hotspots.
Directory of Open Access Journals (Sweden)
Joaquim Trindade-Filho
In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
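The "best sets of sites able to maximize the representation of each indicator group" step is, in essence, a complementarity-based site-selection problem. A minimal greedy sketch of that idea follows; the site data and species labels are hypothetical, and this is an illustration, not the study's actual optimization procedure:

```python
# Greedy complementarity-based site selection: keep picking the site that
# adds the most not-yet-represented species (illustrative sketch only).

def greedy_representation(sites, target_species):
    """sites: dict site_id -> set of species present at that site.
    Returns an ordered list of chosen sites plus the species covered."""
    coverable = target_species & set().union(*sites.values())
    chosen, covered = [], set()
    while covered != coverable:
        # pick the site contributing the most uncovered species
        best = max(sites, key=lambda s: len(sites[s] & (coverable - covered)))
        gained = sites[best] & (coverable - covered)
        if not gained:
            break
        chosen.append(best)
        covered |= gained
    return chosen, covered

# toy example with hypothetical sites and species
sites = {
    "A": {"sp1", "sp2", "sp3"},
    "B": {"sp3", "sp4"},
    "C": {"sp5"},
    "D": {"sp1", "sp4", "sp5"},
}
chosen, covered = greedy_representation(sites, {"sp1", "sp2", "sp3", "sp4", "sp5"})
print(chosen, covered)
```

Greedy selection is a common shortcut in systematic conservation planning because exact set-cover optimization is NP-hard; it typically selects only a small fraction of the candidate sites, consistent with the "less than 2% of the BH area" figure above.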
Performance and consistency of indicator groups in two biodiversity hotspots.
Trindade-Filho, Joaquim; Loyola, Rafael Dias
2011-01-01
In a world limited by data availability and limited funds for conservation, scientists and practitioners must use indicator groups to define spatial conservation priorities. Several studies have evaluated the effectiveness of indicator groups, but still little is known about the consistency in performance of these groups in different regions, which would allow their a priori selection. We systematically examined the effectiveness and the consistency of nine indicator groups in representing mammal species in two top-ranked Biodiversity Hotspots (BH): the Brazilian Cerrado and the Atlantic Forest. To test for group effectiveness we first found the best sets of sites able to maximize the representation of each indicator group in the BH and then calculated the average representation of different target species by the indicator groups in the BH. We considered consistent indicator groups whose representation of target species was not statistically different between BH. We called effective those groups that outperformed the target-species representation achieved by random sets of species. Effective indicator groups required the selection of less than 2% of the BH area for representing target species. Restricted-range species were the most effective indicators for the representation of all mammal diversity as well as target species. It was also the only group with high consistency. We show that several indicator groups could be applied as shortcuts for representing mammal species in the Cerrado and the Atlantic Forest to develop conservation plans; however, only restricted-range species consistently held as the most effective indicator group for such a task. This group is of particular importance in conservation planning as it captures high diversity of endemic and endangered species.
Conformal consistency relations for single-field inflation
International Nuclear Information System (INIS)
Creminelli, Paolo; Noreña, Jorge; Simonović, Marko
2012-01-01
We generalize the single-field consistency relations to capture not only the leading term in the squeezed limit — going as 1/q^3, where q is the small wavevector — but also the subleading one, going as 1/q^2. This term, for an (n+1)-point function, is fixed in terms of the variation of the n-point function under a special conformal transformation; this parallels the fact that the 1/q^3 term is related to the scale dependence of the n-point function. For the squeezed limit of the 3-point function, this conformal consistency relation implies that there are no terms going as 1/q^2. We verify that the squeezed limit of the 4-point function is related to the conformal variation of the 3-point function both in the case of canonical slow-roll inflation and in models with reduced speed of sound. In the second case the conformal consistency conditions capture, at the level of observables, the relation among operators induced by the non-linear realization of Lorentz invariance in the Lagrangian. These results mean that, in any single-field model, primordial correlation functions of ζ are endowed with an SO(4,1) symmetry, with dilations and special conformal transformations non-linearly realized by ζ. We also verify the conformal consistency relations for any n-point function in models with a modulation of the inflaton potential, where the scale dependence is not negligible. Finally, we generalize (some of) the consistency relations involving tensors and soft internal momenta.
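The leading 1/q^3 term referred to above is the standard dilation consistency relation, which can be written schematically as follows (sign and normalization conventions vary between papers; this is a sketch, not the paper's exact expression):

```latex
% Leading squeezed-limit relation: the soft mode zeta_q acts as a rescaling
% of coordinates on the remaining n-point function. Since P_zeta(q) ~ 1/q^3,
% this is the 1/q^3 term; primes denote stripped momentum-conserving deltas.
\lim_{\vec q \to 0}
  \langle \zeta_{\vec q}\, \zeta_{\vec k_1} \cdots \zeta_{\vec k_n} \rangle'
  = - P_\zeta(q) \left[ 3(n-1)
      + \sum_{a=1}^{n} \vec k_a \cdot \frac{\partial}{\partial \vec k_a} \right]
    \langle \zeta_{\vec k_1} \cdots \zeta_{\vec k_n} \rangle'
  + \mathcal{O}(q/k)\,.
```

The paper's new result concerns the O(q/k) correction (the 1/q^2 piece), which is fixed by the special conformal variation of the n-point function rather than by its scale dependence.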
Dark matter dynamics in Abell 3827: new data consistent with standard cold dark matter
Massey, Richard; Harvey, David; Liesenborgs, Jori; Richard, Johan; Stach, Stuart; Swinbank, Mark; Taylor, Peter; Williams, Liliya; Clowe, Douglas; Courbin, Frédéric; Edge, Alastair; Israel, Holger; Jauzac, Mathilde; Joseph, Rémy; Jullo, Eric; Kitching, Thomas D.; Leonard, Adrienne; Merten, Julian; Nagai, Daisuke; Nightingale, James; Robertson, Andrew; Romualdez, Luis Javier; Saha, Prasenjit; Smit, Renske; Tam, Sut-Ieng; Tittley, Eric
2018-06-01
We present integral field spectroscopy of galaxy cluster Abell 3827, using Atacama Large Millimetre Array (ALMA) and Very Large Telescope/Multi-Unit Spectroscopic Explorer. It reveals an unusual configuration of strong gravitational lensing in the cluster core, with at least seven lensed images of a single background spiral galaxy. Lens modelling based on Hubble Space Telescope imaging had suggested that the dark matter associated with one of the cluster's central galaxies may be offset. The new spectroscopic data enable better subtraction of foreground light, and better identification of multiple background images. The inferred distribution of dark matter is consistent with being centred on the galaxies, as expected by Λ cold dark matter. Each galaxy's dark matter also appears to be symmetric. Whilst we do not find an offset between mass and light (suggestive of self-interacting dark matter) as previously reported, the numerical simulations that have been performed to calibrate Abell 3827 indicate that offsets and asymmetry are still worth looking for in collisions with particular geometries. Meanwhile, ALMA proves exceptionally useful for strong lens image identifications.
Raster images vectorization system
Genytė, Jurgita
2006-01-01
The problem of raster images vectorization was analyzed and researched in this work. Existing vectorization systems are quite expensive, the results are inaccurate, and the manual vectorization of a large number of drafts is impossible. That's why our goal was to design and develop a new raster images vectorization system using our suggested automatic vectorization algorithm and the way to record results in a new universal vectorial file format. The work consists of these main parts: analysis...
Acoustical holographic Siamese image technique for imaging radial cracks in reactor piping
International Nuclear Information System (INIS)
Collins, H.D.; Gribble, R.P.
1985-04-01
This paper describes a unique technique (i.e., "Siamese imaging") for imaging quasi-vertical defects in reactor pipe weldments. The Siamese image is a bi-symmetrical view of the inner surface defect. Image construction geometry consists of two probes (i.e., source/receiver) operating either from opposite sides or the same side of the defect to be imaged. As the probes are scanned across a lower surface connected defect, they encounter two images: first the normal upright image and then the inverted image. The final integrated image consists of two images connected along their baselines, thus we call it a "Siamese image." The experimental imaging results on simulated and natural cracks in reactor piping weldments graphically illustrate this unique technique. Excellent images of mechanical fatigue and thermal cracks were obtained on ferritic and austenitic piping.
Dispersion Differences and Consistency of Artificial Periodic Structures.
Cheng, Zhi-Bao; Lin, Wen-Kai; Shi, Zhi-Fei
2017-10-01
Dispersion differences and consistency of artificial periodic structures, including phononic crystals, elastic metamaterials, and periodic structures composed of phononic crystals and elastic metamaterials, are investigated in this paper. By developing a K(ω) method, complex dispersion relations and group/phase velocity curves of both the single-mechanism periodic structures and the mixing-mechanism periodic structures are first calculated, from which dispersion differences of artificial periodic structures are discussed. Then, based on a unified formulation, dispersion consistency of artificial periodic structures is investigated. Through a comprehensive comparison study, the correctness of the unified formulation is verified. Mathematical derivations of the unified formulation for different artificial periodic structures are presented. Furthermore, physical meanings of the unified formulation are discussed in the energy-state space.
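The kind of complex dispersion relation a K(ω) method produces can be illustrated on the simplest single-mechanism case: a 1D two-layer phononic crystal, for which the classical Rytov transfer-matrix relation gives cos(qa) in closed form. The material parameters below are hypothetical, and this is the textbook relation, not the paper's formulation:

```python
import numpy as np

# 1D bi-layer phononic crystal, longitudinal waves (classical Rytov relation).
# Where |cos(qa)| > 1 the Bloch wavenumber q becomes complex: a band gap.

rho1, c1, d1 = 2700.0, 6000.0, 0.01   # density, wave speed, thickness (layer 1)
rho2, c2, d2 = 1200.0, 2300.0, 0.01   # layer 2
Z1, Z2 = rho1 * c1, rho2 * c2         # acoustic impedances
a = d1 + d2                           # unit-cell length

def cos_qa(omega):
    """Right-hand side of cos(q*a) = ... (Rytov dispersion relation)."""
    k1, k2 = omega / c1, omega / c2
    return (np.cos(k1 * d1) * np.cos(k2 * d2)
            - 0.5 * (Z1 / Z2 + Z2 / Z1) * np.sin(k1 * d1) * np.sin(k2 * d2))

def bloch_q(omega):
    """Complex Bloch wavenumber q(omega): real in pass bands, complex in gaps."""
    return np.arccos(cos_qa(omega) + 0j) / a

omegas = np.linspace(1.0, 2.0e6, 2000)          # sampled frequencies, rad/s
in_gap = np.abs(cos_qa(omegas)) > 1.0           # |cos(qa)| > 1 -> band gap
print("fraction of sampled frequencies inside gaps:", in_gap.mean())
print("q at 6e5 rad/s:", bloch_q(6.0e5))        # complex -> evanescent Bloch wave
```

Computing q(ω) at fixed real ω, as here, is the defining feature of a K(ω)-type approach, in contrast to ω(k) band-structure solvers that miss the evanescent (complex-q) branches inside the gaps.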
Consistent Conformal Extensions of the Standard Model arXiv
Loebbert, Florian; Plefka, Jan
The question of whether classically conformal modifications of the standard model are consistent with experimental observations has recently been subject to renewed interest. The method of Gildener and Weinberg provides a natural framework for the study of the effective potential of the resulting multi-scalar standard model extensions. This approach relies on the assumption of the ordinary loop hierarchy $\lambda_\text{s} \sim g^2_\text{g}$ of scalar and gauge couplings. On the other hand, Andreassen, Frost and Schwartz recently argued that in the (single-scalar) standard model, gauge invariant results require the consistent scaling $\lambda_\text{s} \sim g^4_\text{g}$. In the present paper we contrast these two hierarchy assumptions and illustrate the differences in the phenomenological predictions of minimal conformal extensions of the standard model.
Surfactant modified clays’ consistency limits and contact angles
Directory of Open Access Journals (Sweden)
S Akbulut
2012-07-01
This study was aimed at preparing a surfactant modified clay (SMC) and researching the effect of surfactants on clays' contact angles and consistency limits; clay was thus modified by surfactants to modify its engineering properties. Seven surfactants (trimethylglycine, hydroxyethylcellulose, octyl phenol ethoxylate, linear alkylbenzene sulfonic acid, sodium lauryl ether sulfate, cetyl trimethylammonium chloride and quaternised ethoxylated fatty amine) were used in this study. The experimental results indicated that SMC consistency limits (liquid and plastic limits) changed significantly compared to those of natural clay. Plasticity index and liquid limit (PI-LL) values representing soil class approached the A-line when zwitterion, nonionic, and anionic surfactant percentage increased. However, cationic SMC was transformed from CH (high plasticity clay) to MH (high plasticity silt) class soils, according to the unified soil classification system (USCS). Clay modified with cationic and anionic surfactants gave higher and lower contact angles than natural clay, respectively.
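The CH-to-MH shift reported above is determined by the USCS A-line on the plasticity chart, PI = 0.73(LL − 20). A minimal sketch of that classification rule follows; the limit values are hypothetical and the borderline CL-ML zone is ignored:

```python
# Simplified USCS classification of fine-grained soil from Atterberg limits.
# A-line: PI = 0.73 * (LL - 20); above it -> clay (C), below -> silt (M).
# LL >= 50 -> high plasticity (H), else low (L). Borderline zones ignored.

def uscs_fine(liquid_limit, plastic_limit):
    pi = liquid_limit - plastic_limit          # plasticity index
    a_line = 0.73 * (liquid_limit - 20.0)
    plasticity = "H" if liquid_limit >= 50 else "L"
    soil = "C" if pi > a_line else "M"
    return soil + plasticity

# hypothetical numbers: a small drop in PI at the same LL crosses the A-line
print(uscs_fine(60, 30))   # PI = 30, A-line = 29.2 -> clay of high plasticity
print(uscs_fine(60, 35))   # PI = 25, below A-line -> silt of high plasticity
```

This shows how a modest change in the measured limits, such as that induced by a cationic surfactant, can move a sample across the A-line and change its USCS class from CH to MH.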
Rotating D0-branes and consistent truncations of supergravity
International Nuclear Information System (INIS)
Anabalón, Andrés; Ortiz, Thomas; Samtleben, Henning
2013-01-01
The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1)^4 truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S^8. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS_2×M_8 geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.
Substituting fields within the action: Consistency issues and some applications
International Nuclear Information System (INIS)
Pons, Josep M.
2010-01-01
In field theory, as well as in mechanics, the substitution of some fields in terms of other fields at the level of the action raises an issue of consistency with respect to the equations of motion. We discuss this issue and give an expression which neatly displays the difference between doing the substitution at the level of the Lagrangian or at the level of the equations of motion. Both operations do not commute in general. A very relevant exception is the case of auxiliary variables, which are discussed in detail together with some of their relevant applications. We discuss the conditions for the preservation of symmetries (Noether as well as non-Noether) under the reduction of degrees of freedom provided by the mechanism of substitution. We also examine how the gauge fixing procedures fit in our framework and give simple examples on the issue of consistency in this case.
Design of a Turbulence Generator of Medium Consistency Pulp Pumps
Directory of Open Access Journals (Sweden)
Hong Li
2012-01-01
The turbulence generator is a key component of medium consistency centrifugal pulp pumps, with functions to fluidize the medium consistency pulp and to separate gas from the liquid. Structure sizes of the generator affect the hydraulic performance. The radius and the blade laying angle are two important structural sizes of a turbulence generator. Starting with research on the flow inside and the shearing characteristics of the MC pulp, a simple mathematical model of the flow section of the shearing chamber is built, and the formula and procedure to calculate the radius of the turbulence generator are established. The blade laying angle is referenced from the turbine agitator, which has a similar shape to the turbulence generator, and CFD simulation is applied to study the flow fields with different blade laying angles. The recommended blade laying angle of the turbulence generator is found to be between 60° and 75°.
On the consistent effect histories approach to quantum mechanics
International Nuclear Information System (INIS)
Rudolph, O.
1996-01-01
A formulation of the consistent histories approach to quantum mechanics in terms of generalized observables (POV measures) and effect operators is provided. The usual notion of "history" is generalized to the notion of "effect history." The space of effect histories carries the structure of a D-poset. Recent results of J. D. Maitland Wright imply that every decoherence functional defined for ordinary histories can be uniquely extended to a bi-additive decoherence functional on the space of effect histories. Omnès' logical interpretation is generalized to the present context. The result of this work considerably generalizes and simplifies the earlier formulation of the consistent effect histories approach to quantum mechanics communicated in a previous work of this author. copyright 1996 American Institute of Physics
Consistency Across Standards or Standards in a New Business Model
Russo, Dane M.
2010-01-01
Presentation topics include: standards in a changing business model, the new National Space Policy is driving change, a new paradigm for human spaceflight, consistency across standards, the purpose of standards, danger of over-prescriptive standards, a balance is needed (between prescriptive and general standards), enabling versus inhibiting, characteristics of success-oriented standards, and conclusions. Additional slides include NASA Procedural Requirements 8705.2B identifies human rating standards and requirements, draft health and medical standards for human rating, what's been done, government oversight models, examples of consistency from anthropometry, examples of inconsistency from air quality, and appendices of government and non-governmental human factors standards.
Quantitative verification of ab initio self-consistent laser theory.
Ge, Li; Tandy, Robert J; Stone, A D; Türeci, Hakan E
2008-10-13
We generalize and test the recent "ab initio" self-consistent (AISC) time-independent semiclassical laser theory. This self-consistent formalism generates all the stationary lasing properties in the multimode regime (frequencies, thresholds, internal and external fields, output power and emission pattern) from simple inputs: the dielectric function of the passive cavity, the atomic transition frequency, and the transverse relaxation time of the lasing transition. We find that the theory gives excellent quantitative agreement with full time-dependent simulations of the Maxwell-Bloch equations after it has been generalized to drop the slowly-varying envelope approximation. The theory is infinite order in the non-linear hole-burning interaction; the widely used third order approximation is shown to fail badly.
Self-consistent studies of magnetic thin film Ni (001)
International Nuclear Information System (INIS)
Wang, C.S.; Freeman, A.J.
1979-01-01
Advances in experimental methods for studying surface phenomena have provided the stimulus to develop theoretical methods capable of interpreting this wealth of new information. Of particular interest have been the relative roles of bulk and surface contributions since in several important cases agreement between experiment and bulk self-consistent (SC) calculations within the local spin density functional formalism (LSDF) is lacking. We discuss our recent extension of the (LSDF) approach to the study of thin films (slabs) and the role of surface effects on magnetic properties. Results are described for Ni (001) films using our new SC numerical basis set LCAO method. Self-consistency within the superposition of overlapping spherical atomic charge density model is obtained iteratively with the atomic configuration as the adjustable parameter. Results are presented for the electronic charge densities and local density of states. The origin and role of (magnetic) surface states is discussed by comparison with results of earlier bulk calculations
Self-consistent equilibria in the pulsar magnetosphere
International Nuclear Information System (INIS)
Endean, V.G.
1976-01-01
For a 'collisionless' pulsar magnetosphere the self-consistent equilibrium particle distribution functions are functions of the constants of the motion only. Reasons are given for concluding that to a good approximation they will be functions of the rotating frame Hamiltonian only. This is shown to result in a rigid rotation of the plasma, which therefore becomes trapped inside the velocity of light cylinder. The self-consistent field equations are derived, and a method of solving them is illustrated. The axial component of the magnetic field decays to zero at the plasma boundary. In practice, some streaming of particles into the wind zone may occur as a second-order effect. Acceleration of such particles to very high energies is expected when they approach the velocity of light cylinder, but they cannot be accelerated to very high energies near the star. (author)
Consistent creep and rupture properties for creep-fatigue evaluation
International Nuclear Information System (INIS)
Schultz, C.C.
1978-01-01
The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide adequate predictions. The viability of using consistent properties (either actual or those of a minimum heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations
Lagrangian space consistency relation for large scale structure
International Nuclear Information System (INIS)
Horn, Bart; Hui, Lam; Xiao, Xiao
2015-01-01
Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
Rotating D0-branes and consistent truncations of supergravity
Energy Technology Data Exchange (ETDEWEB)
Anabalón, Andrés [Departamento de Ciencias, Facultad de Artes Liberales, Facultad de Ingeniería y Ciencias, Universidad Adolfo Ibáñez, Av. Padre Hurtado 750, Viña del Mar (Chile); Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France); Ortiz, Thomas; Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS École Normale Supérieure de Lyon 46, allée d' Italie, F-69364 Lyon cedex 07 (France)
2013-12-18
The fluctuations around the D0-brane near-horizon geometry are described by two-dimensional SO(9) gauged maximal supergravity. We work out the U(1)^4 truncation of this theory whose scalar sector consists of five dilaton and four axion fields. We construct the full non-linear Kaluza–Klein ansatz for the embedding of the dilaton sector into type IIA supergravity. This yields a consistent truncation around a geometry which is the warped product of a two-dimensional domain wall and the sphere S^8. As an application, we consider the solutions corresponding to rotating D0-branes which in the near-horizon limit approach AdS_2×M_8 geometries, and discuss their thermodynamical properties. More generally, we study the appearance of such solutions in the presence of non-vanishing axion fields.
Consistent creep and rupture properties for creep-fatigue evaluation
International Nuclear Information System (INIS)
Schultz, C.C.
1979-01-01
The currently accepted practice of using inconsistent representations of creep and rupture behaviors in the prediction of creep-fatigue life is shown to introduce a factor of safety beyond that specified in current ASME Code design rules for 304 stainless steel Class 1 nuclear components. Accurate predictions of creep-fatigue life for uniaxial tests on a given heat of material are obtained by using creep and rupture properties for that same heat of material. The use of a consistent representation of creep and rupture properties for a minimum strength heat is also shown to provide reasonable predictions. The viability of using consistent properties (either actual or those of a minimum strength heat) to predict creep-fatigue life thus identifies significant design uses for the results of characterization tests and improved creep and rupture correlations. 12 refs
Energy Technology Data Exchange (ETDEWEB)
Wendelberger, Laura Jean [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-08
In large datasets, it is time consuming or even impossible to pick out interesting images. Our proposed solution is to find statistics to quantify the information in each image and use those to identify and pick out images of interest.
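As a concrete instance of such a per-image statistic (the record above does not name one), the Shannon entropy of an image's intensity histogram is a common choice: flat or empty frames score near zero, while textured or detailed frames score high. A minimal sketch with synthetic images:

```python
import numpy as np

# Shannon entropy of the grey-level histogram as an "interestingness" score.
# Illustrative choice only; not necessarily the statistic used in the report.

def image_entropy(img, bins=256):
    """Shannon entropy (bits) of an 8-bit image's intensity histogram."""
    hist, _ = np.histogram(img, bins=bins, range=(0, 256))
    p = hist / hist.sum()
    p = p[p > 0]                       # drop empty bins: 0*log(0) := 0
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(0)
flat = np.full((64, 64), 128)                  # constant image -> entropy 0
noisy = rng.integers(0, 256, size=(64, 64))    # uniform noise -> near 8 bits
print(image_entropy(flat), image_entropy(noisy))
```

Ranking a large collection by such a score, then inspecting only the extremes, is one simple way to "pick out images of interest" without viewing every frame.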
Guidance that explains the process for getting images approved in One EPA Web microsites and resource directories. Includes an appendix that shows examples of what makes some images better than others and how some images convey meaning more than others.
International Nuclear Information System (INIS)
Pepy, G.
1999-01-01
After an introduction about data imaging in general, the principles of imaging data collected via neutron scattering experiments are presented. Some computer programs designed for data imaging purposes are reviewed. (K.A.)
Kutepov, A L
2015-08-12
Self-consistent solutions of Hedin's equations (HE) for the two-site Hubbard model (HM) have been studied. They have been found for three-point vertices of increasing complexity (Γ = 1 (GW approximation), Γ1 from first-order perturbation theory, and the exact vertex Γ(E)). Comparison is made between the cases when an additional quasiparticle (QP) approximation for Green's functions is applied during the self-consistent iterative solving of HE and when the QP approximation is not applied. The results obtained with the exact vertex are directly related to the present open question: which approximation is more advantageous for future implementations, GW + DMFT or QPGW + DMFT. It is shown that in a regime of strong correlations only the originally proposed GW + DMFT scheme is able to provide reliable results. Vertex corrections based on perturbation theory (PT) systematically improve the GW results when full self-consistency is applied. The application of QP self-consistency combined with PT vertex corrections shows problems similar to the case when the exact vertex is combined with QP self-consistency. An analysis of Ward identity violation is performed for all approximations studied in this work, and its relation to the general accuracy of the schemes used is provided.
Time-Consistent and Market-Consistent Evaluations (replaced by CentER DP 2012-086)
Pelsser, A.; Stadje, M.A.
2011-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Time-Consistent and Market-Consistent Evaluations (Revised version of CentER DP 2011-063)
Pelsser, A.; Stadje, M.A.
2012-01-01
We consider evaluation methods for payoffs with an inherent financial risk as encountered for instance for portfolios held by pension funds and insurance companies. Pricing such payoffs in a way consistent to market prices typically involves combining actuarial techniques with methods from
Consistency of eye movements in MOT using horizontally flipped trials
Czech Academy of Sciences Publication Activity Database
Děchtěrenko, F.; Lukavský, Jiří
2013-01-01
Roč. 42, Suppl (2013), s. 42-42 ISSN 0301-0066. [36th European Conference on Visual Perception. 25.08.2013.-29.08.2013, Brémy] R&D Projects: GA ČR GA13-28709S Institutional support: RVO:68081740 Keywords : eye movements * symmetry * consistency Subject RIV: AN - Psychology http://www.ecvp.uni-bremen.de/~ecvpprog/abstract164.html
Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency
Sammie Tarenskeen; Mirjam Broersma; Bart Geurts
2015-01-01
The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a bet...
Monetary Poverty, Material Deprivation and Consistent Poverty in Portugal
Carlos Farinha Rodrigues; Isabel Andrade
2012-01-01
In this paper we use the Portuguese component of the European Union Statistics on Income and Living Conditions (EU-SILC) to develop a measure of consistent poverty in Portugal. It is widely agreed that being poor does not simply mean not having enough monetary resources. It also reflects a lack of access to the resources required to enjoy a minimum standard of living and participation in the society one belongs to. The coexistence of material deprivation and monetary poverty leads ...
Consistency requirements on Δ contributions to the NN potential
International Nuclear Information System (INIS)
Rinat, A.S.
1982-04-01
We discuss theories leading to intermediate-state NΔ and ΔΔ contributions to V_NN. We focus on the customary addition of L'_ΔNπ to L'_πNN in a conventional field theory and argue that overcounting of contributions to t_πN and V_NN will be the rule. We then discuss the cloudy bag model, where a similar interaction naturally arises and which leads to a consistent theory. (author)
Quark mean field theory and consistency with nuclear matter
International Nuclear Information System (INIS)
Dey, J.; Tomio, L.; Dey, M.; Frederico, T.
1989-01-01
The 1/N_c expansion in QCD (with N_c the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, M_N, m_σ, m_ω, are found to scale with density. The equations are solved self-consistently. (author)
Self-consistent T-matrix theory of superconductivity
Czech Academy of Sciences Publication Activity Database
Šopík, B.; Lipavský, Pavel; Männel, M.; Morawetz, K.; Matlock, P.
2011-01-01
Roč. 84, č. 9 (2011), 094529/1-094529/13 ISSN 1098-0121 R&D Projects: GA ČR GAP204/10/0212; GA ČR(CZ) GAP204/11/0015 Institutional research plan: CEZ:AV0Z10100521 Keywords : superconductivity * T-matrix * superconducting gap * restricted self-consistency Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 3.691, year: 2011
Overspecification of color, pattern, and size: salience, absoluteness, and consistency
Tarenskeen, S.L.; Broersma, M.; Geurts, B.
2015-01-01
The rates of overspecification of color, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Color and pattern are absolute and salient attributes, whereas size is relative and less salient. Additionally, a tendency toward consistent responses is assessed. Using a within-participants design, we find similar rates of color and pattern overspecification, which are both higher than the rate of size overspecification. Usi...
The consistent histories interpretation of quantum fields in curved spacetime
International Nuclear Information System (INIS)
Blencowe, M.
1991-01-01
As an initial attempt to address some of the foundational problems of quantum mechanics, the author formulates the consistent histories interpretation of quantum field theory on a globally hyperbolic curved spacetime. He then constructs quasiclassical histories for a free, massive scalar field. In the final part, he points out the shortcomings of the theory and conjectures that one must take into account the fact that gravity is quantized in order to overcome them.
EVALUATION OF CONSISTENCY AND SETTING TIME OF IRANIAN DENTAL STONES
Directory of Open Access Journals (Sweden)
F GOL BIDI
2000-09-01
Full Text Available Introduction. Dental stones are widely used in dentistry, and the success or failure of many dental treatments depends on the accuracy of these gypsum products. The purpose of this study was the evaluation of Iranian dental stones and a comparison between Iranian and foreign ones. In this investigation, consistency and setting time were compared between Pars Dandan, Almas and Hinrizit stones. The latter is accepted by the ADA (American Dental Association). Consistency and setting time are 2 of the 5 properties required by both ADA specification No. 25 and Iranian Standard Organization specification No. 2569 for the evaluation of dental stones. Methods. In this study, the number and preparation of specimens and the test conditions followed ADA specification No. 25, and all measurements were made with a Vicat apparatus. Results. The results of this study showed that the standard consistency of Almas stone was obtained with 42 ml of water per 100 g of powder, and the setting time of this stone was 11±0.03 min, which was within the limits of the ADA specification (12±4 min). The standard consistency of Pars Dandan stone was obtained with 31 ml of water per 100 g of powder, but the setting time of this stone was 5±0.16 min, which was not within the limits of the ADA specification. Discussion. Comparison of the properties of the Iranian and Hinrizit stones suggested two probable problems with the Iranian stones: 1. Non-homogeneity of the Iranian stone powder, caused by uncontrolled temperature, pressure and humidity in the production process. 2. Impurities such as sodium chloride, which were responsible for the shortening of Pars Dandan's setting time.
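The pass/fail judgment reported above (a measured setting time versus the ADA tolerance of 12±4 min) amounts to a simple interval check. A minimal sketch, using the values reported in the abstract; the helper name is ours, not part of the specification:

```python
# Hedged sketch: checking measured setting times against the
# ADA specification No. 25 tolerance (12 +/- 4 min) cited above.
# Function name is illustrative; values are from the abstract.

def within_ada_limits(setting_time_min, nominal=12.0, tolerance=4.0):
    """Return True if a measured setting time falls inside nominal +/- tolerance."""
    return (nominal - tolerance) <= setting_time_min <= (nominal + tolerance)

print(within_ada_limits(11.0))  # Almas stone -> True
print(within_ada_limits(5.0))   # Pars Dandan stone -> False
```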
Modeling a Consistent Behavior of PLC-Sensors
Directory of Open Access Journals (Sweden)
E. V. Kuzmin
2014-01-01
Full Text Available The article extends a cycle of papers dedicated to the programming and verification of PLC programs by LTL specification. This approach enables correctness analysis of PLC programs by the model checking method. The model checking method requires the construction of a finite model of a PLC program. For successful verification of the required properties, it is important to take into consideration that not all combinations of input signals from the sensors can occur while the PLC works with a control object. This fact demands more attention to the construction of the PLC-program model. In this paper we propose to describe the consistent behavior of sensors by three groups of LTL formulas. They will affect the program model, approximating it to the actual behavior of the PLC program. The idea of the LTL requirements is shown by an example. A PLC program is a description of reactions to input signals from sensors, switches and buttons. In constructing a PLC-program model, this approach to modeling the consistent behavior of PLC sensors allows one to focus on modeling precisely these reactions, without extending the program model by additional structures that realize a realistic behavior of sensors. The consistent behavior of sensors is taken into account only at the stage of checking the conformity of the program model to the required properties, i.e. the proof that the constructed model satisfies a property proceeds under the condition that the model contains only those executions of the program that comply with the consistent behavior of the sensors.
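A loose illustration (not the paper's LTL formalism) of the underlying idea: some sensor-input combinations cannot occur on a real control object, so a consistency constraint can be checked over every state of a finite trace. The sensor names and the constraint below are hypothetical:

```python
# Illustrative sketch, not the paper's method: a consistency
# constraint on PLC sensor inputs checked over a finite trace.
# Hypothetical constraint: the "top" and "bottom" limit switches
# of one actuator are never both active at the same time.

def trace_respects_constraint(trace, constraint):
    """Check every state of a finite trace against a state predicate."""
    return all(constraint(state) for state in trace)

def limit_switches_exclusive(state):
    return not (state["top"] and state["bottom"])

trace = [
    {"top": True,  "bottom": False},
    {"top": False, "bottom": False},
    {"top": False, "bottom": True},
]
print(trace_respects_constraint(trace, limit_switches_exclusive))  # -> True
```

In the paper's setting the analogous constraints are expressed as LTL formulas and restrict the model's executions rather than being checked after the fact.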
Overspecification of colour, pattern, and size: Salience, absoluteness, and consistency
Directory of Open Access Journals (Sweden)
Sammie eTarenskeen
2015-11-01
Full Text Available The rates of overspecification of colour, pattern, and size are compared, to investigate how salience and absoluteness contribute to the production of overspecification. Colour and pattern are absolute attributes, whereas size is relative and less salient. Additionally, a tendency towards consistent responses is assessed. Using a within-participants design, we find similar rates of colour and pattern overspecification, which are both higher than the rate of size overspecification. Using a between-participants design, however, we find similar rates of pattern and size overspecification, which are both lower than the rate of colour overspecification. This indicates that although many speakers are more likely to include colour than pattern (probably because colour is more salient), they may also treat pattern like colour due to a tendency towards consistency. We find no increase in size overspecification when the salience of size is increased, suggesting that speakers are more likely to include absolute than relative attributes. However, we do find an increase in size overspecification when mentioning the attributes is triggered, which again shows that speakers tend to refer in a consistent manner, and that there are circumstances in which even size overspecification is frequently produced.
Consistent Partial Least Squares Path Modeling via Regularization
Directory of Open Access Journals (Sweden)
Sunho Jung
2018-02-01
Full Text Available Partial least squares (PLS) path modeling is a component-based structural equation modeling approach that has been adopted in social and psychological research for its data-analytic capability and flexibility. A recent methodological advance is consistent PLS (PLSc), designed to produce consistent estimates of path coefficients in structural models involving common factors. In practice, however, PLSc may frequently encounter multicollinearity, in part because it estimates path coefficients from consistent correlations among independent latent variables. PLSc as yet has no remedy for this multicollinearity problem, which can cause loss of statistical power and accuracy in parameter estimation. Thus, a ridge type of regularization is incorporated into PLSc, creating a new technique called regularized PLSc. A comprehensive simulation study is conducted to evaluate the performance of regularized PLSc as compared to its non-regularized counterpart in terms of power and accuracy. The results show that regularized PLSc is recommended for use when serious multicollinearity is present.
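The ridge idea that the abstract incorporates into PLSc can be illustrated in isolation. The following is plain ridge regression on nearly collinear predictors, a minimal sketch with synthetic data of our own, not the full PLSc algorithm:

```python
# Minimal sketch of the ridge idea behind regularized PLSc: adding a
# penalty term stabilizes coefficient estimates when predictors are
# nearly collinear. This is ordinary ridge regression, not PLSc itself.
import numpy as np

def ridge_coefficients(X, y, lam):
    """Solve (X'X + lam*I) b = X'y for the coefficient vector b."""
    p = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

rng = np.random.default_rng(0)
x1 = rng.normal(size=200)
x2 = x1 + 1e-3 * rng.normal(size=200)   # nearly collinear with x1
X = np.column_stack([x1, x2])
y = x1 + 0.1 * rng.normal(size=200)

b_ols = ridge_coefficients(X, y, 0.0)    # typically erratic here
b_ridge = ridge_coefficients(X, y, 1.0)  # shrunk toward stable values
print(np.round(b_ridge, 2))
```

With the penalty, the two correlated predictors share the weight of the common signal instead of receiving large offsetting coefficients.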
Temporal and contextual consistency of leadership in homing pigeon flocks.
Directory of Open Access Journals (Sweden)
Carlos D Santos
Full Text Available Organized flight of homing pigeons (Columba livia) was previously shown to rely on simple leadership rules between flock mates, yet the stability of this social structuring over time and across different contexts remains unclear. We quantified the repeatability of leadership-based flock structures within a flight and across multiple flights conducted with the same animals. We compared two contexts of flock composition: flocks of birds of the same age and flight experience, and flocks of birds of different ages and flight experience. All flocks displayed consistent leadership-based structures over time, showing that individuals have stable roles in the navigational decisions of the flock. However, flocks of balanced age and flight experience exhibited reduced leadership stability, indicating that these factors promote flock structuring. Our study empirically demonstrates that leadership and followership are consistent behaviours in homing pigeon flocks, but that such consistency is affected by the heterogeneity of individual flight experience and/or age. Similar evidence from other species suggests leadership is an important mechanism for coordinated motion in small groups of animals with strong social bonds.
Consistency checks in beam emission modeling for neutral beam injectors
International Nuclear Information System (INIS)
Punyapu, Bharathi; Vattipalle, Prahlad; Sharma, Sanjeev Kumar; Baruah, Ujjwal Kumar; Crowley, Brendan
2015-01-01
In positive-ion neutral beam systems, beam parameters such as ion species fractions, power fractions and beam divergence are routinely measured using the Doppler-shifted beam emission spectrum. The accuracy with which these parameters are estimated depends on the accuracy of the atomic modeling involved in these estimations. In this work, an effective procedure to check the consistency of the beam emission modeling in neutral beam injectors is proposed. As a first consistency check, at constant beam voltage and current, the intensity of the beam emission spectrum is measured while varying the pressure in the neutralizer. The scaling with pressure of the measured intensities of the un-shifted (target) and Doppler-shifted (projectile) components of the beam emission spectrum is then studied. If the un-shifted component scales with pressure, the intensity of this component is used as a second consistency check on the beam emission modeling. As a further check, the modeled beam fractions and the emission cross sections of projectile and target are used to predict the intensity of the un-shifted component, which is then compared with the measured target intensity. The agreement between the predicted and measured target intensities provides the degree of discrepancy in the beam emission modeling. In order to test this methodology, a systematic analysis of Doppler shift spectroscopy data obtained on the JET neutral beam test stand was carried out.
A dynamical mechanism for large volumes with consistent couplings
Energy Technology Data Exchange (ETDEWEB)
Abel, Steven [IPPP, Durham University,Durham, DH1 3LE (United Kingdom)
2016-11-14
A mechanism for addressing the “decompactification problem” is proposed, which consists of balancing the vacuum energy in Scherk-Schwarzed theories against contributions coming from non-perturbative physics. Universality of threshold corrections ensures that, in such situations, the stable minimum will have consistent gauge couplings for any gauge group that shares the same N=2 beta function for the bulk excitations as the gauge group that takes part in the minimisation. Scherk-Schwarz compactification from 6D to 4D in heterotic strings is discussed explicitly, together with two alternative possibilities for the non-perturbative physics, namely metastable SQCD vacua and a single gaugino condensate. In the former case, it is shown that modular symmetries give various consistency checks, and allow one to follow soft terms, playing a similar role to R-symmetry in global SQCD. The latter case is particularly attractive when there is net Bose-Fermi degeneracy in the massless sector. In such cases, because the original Casimir energy is generated entirely by excited and/or non-physical string modes, it is completely immune to the non-perturbative IR physics. Such a separation between UV and IR contributions to the potential greatly simplifies the analysis of stabilisation, and is a general possibility that has not been considered before.
Consistency of variables in PCS and JASTRO great area database
International Nuclear Information System (INIS)
Nishino, Tomohiro; Teshima, Teruki; Abe, Mitsuyuki
1998-01-01
To examine whether the Patterns of Care Study (PCS) reflects the data for the major areas of Japan, the consistency of variables in the PCS and in the major-area database of the Japanese Society for Therapeutic Radiology and Oncology (JASTRO) was examined. Patients with esophageal or uterine cervical cancer were sampled from the PCS and JASTRO databases. From the JASTRO database, 147 patients with esophageal cancer and 95 patients with uterine cervical cancer were selected according to the eligibility criteria for the PCS. From the PCS, 455 esophageal and 432 uterine cervical cancer patients were surveyed. Six items for esophageal cancer and five items for uterine cervical cancer were selected for a comparative analysis of the PCS and JASTRO databases. Esophageal cancer: age (p=.0777), combination of radiation and surgery (p=.2136), and energy of the external beam (p=.6400) were consistent between the PCS and JASTRO. However, the dose of the external beam for the non-surgery group showed inconsistency (p=.0467). Uterine cervical cancer: age (p=.6301) and clinical stage (p=.8555) were consistent between the two sets of data. However, the energy of the external beam (p<.0001), dose rate of brachytherapy (p<.0001), and brachytherapy utilization by clinical stage (p<.0001) showed inconsistencies. It appears possible that the JASTRO major-area database could not account for all patients' backgrounds and factors, and that both surveys might have an imbalance in the stratification of institutions, including differences in equipment and staffing patterns. (author)
Self-assessment: Strategy for higher standards, consistency, and performance
International Nuclear Information System (INIS)
Ide, W.E.
1996-01-01
In late 1994, Palo Verde operations underwent a transformation from a unitized structure to a single functional unit. It was necessary to build consistency in watchstanding practices and create a shared mission. Because there was a lack of focus on actual plant operations and because personnel were deeply involved with administrative tasks, command and control of evolutions were weak. Improvement was needed. Consistent performance standards have been set for all three operating units. These expectations focus on nuclear, radiological, and industrial safety. Straightforward descriptions of watchstanding and monitoring practices have been provided to all department personnel. The desired professional and leadership qualities for employee conduct have been defined and communicated thoroughly. A healthy and competitive atmosphere developed with the successful implementation of these standards. Overall performance improved. The auxiliary operators demonstrated increased pride and ownership in the performance of their work activities, and their morale improved. Crew teamwork improved, as did the quality of shift briefs. Noise levels and administrative activity in the control room decreased. The use of self-assessment helped to anchor and define higher and more consistent standards. The proof of Palo Verde's success was evident when an Institute of Nuclear Power Operations finding was turned into a strength within 1 yr.
Consistently Trained Artificial Neural Network for Automatic Ship Berthing Control
Directory of Open Access Journals (Sweden)
Y.A. Ahmed
2015-09-01
Full Text Available In this paper, a consistently trained Artificial Neural Network controller for automatic ship berthing is discussed. A minimum-time course-changing manoeuvre is utilised to ensure such consistency, and a new concept named ‘virtual window’ is introduced. Such consistent teaching data are then used to train two separate multi-layered feed-forward neural networks for command rudder and propeller revolution output. After proper training, several known and unknown conditions are tested to judge the effectiveness of the proposed controller using Monte Carlo simulations. After obtaining acceptable percentages of success, the trained networks are implemented in the free-running experiment system to judge the networks' real-time response for the Esso Osaka 3-m model ship. The networks' behaviour during these experiments is also investigated for possible effects of initial conditions as well as wind disturbances. Moreover, since the final goal point of the proposed controller is set at some distance from the actual pier to ensure safety, a study of automatic tug assistance is also presented for the final alignment of the ship with the actual pier.
Consistency in color parameters of a commonly used shade guide.
Tashkandi, Esam
2010-01-01
The use of shade guides to assess the color of natural teeth subjectively remains one of the most common means of dental shade assessment. Any variation in the color parameters of different shade guides may have significant clinical implications, particularly since communication between the clinic and the dental laboratory is based on the shade guide designation. The purpose of this study was to investigate the consistency of the L∗a∗b∗ color parameters of a sample of a commonly used shade guide. The color parameters of a total of 100 VITAPAN Classical Vacuum shade guides (VITA Zahnfabrik, Bad Säckingen, Germany) were measured using an X-Rite ColorEye 7000A spectrophotometer (Grand Rapids, Michigan, USA). Each shade guide consists of 16 tabs with different designations. Each shade tab was measured five times and the average values were calculated. The ΔE between the average L∗a∗b∗ value for each shade tab and the average of the 100 shade tabs of the same designation was calculated. Using Student's t-test, no significant differences were found among the measured sample. There is a high level of consistency in the color parameters of the VITAPAN Classical Vacuum shade guide sample tested.
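The ΔE computed between each tab's average L∗a∗b∗ value and the group mean is, in the common CIE76 convention, a Euclidean distance in L∗a∗b∗ space. A minimal sketch with made-up sample values (the study's own measurements are not reproduced here):

```python
# Sketch of the CIE76 colour-difference computation (Delta E) used to
# compare a shade tab against the mean of its group. The sample
# coordinates below are invented for illustration.
import math

def delta_e(lab1, lab2):
    """Euclidean distance between two L*a*b* colour coordinates (CIE76)."""
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(lab1, lab2)))

tab = (78.2, 1.4, 16.9)         # hypothetical shade-tab measurement
group_mean = (78.0, 1.5, 17.2)  # hypothetical mean of 100 tabs
print(round(delta_e(tab, group_mean), 3))  # -> 0.374
```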
Context-specific metabolic networks are consistent with experiments.
Directory of Open Access Journals (Sweden)
Scott A Becker
2008-05-01
Full Text Available Reconstructions of cellular metabolism are publicly available for a variety of different microorganisms and some mammalian genomes. To date, these reconstructions are "genome-scale" and strive to include all reactions implied by the genome annotation, as well as those with direct experimental evidence. Clearly, many of the reactions in a genome-scale reconstruction will not be active under particular conditions or in a particular cell type. Methods to tailor these comprehensive genome-scale reconstructions into context-specific networks will aid predictive in silico modeling for a particular situation. We present a method called Gene Inactivity Moderated by Metabolism and Expression (GIMME) to achieve this goal. The GIMME algorithm uses quantitative gene expression data and one or more presupposed metabolic objectives to produce the context-specific reconstruction that is most consistent with the available data. Furthermore, the algorithm provides a quantitative inconsistency score indicating how consistent a set of gene expression data is with a particular metabolic objective. We show that this algorithm produces results consistent with biological experiments and intuition for adaptive evolution of bacteria, rational design of metabolic engineering strains, and human skeletal muscle cells. This work represents progress towards producing constraint-based models of metabolism that are specific to the conditions where the expression profiling data are available.
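The inconsistency score can be pictured as penalizing flux carried by reactions whose expression falls below an activity threshold. The toy sketch below follows that idea in spirit only; the reaction names, numbers, and exact penalty form are illustrative, not the paper's objective function:

```python
# Toy sketch of a GIMME-style inconsistency score: reactions whose
# expression falls below a threshold are penalized in proportion to
# the flux they must carry to meet a metabolic objective.
# All names and numbers here are illustrative.

def inconsistency_score(expression, flux, threshold):
    """Sum of (threshold - expression) * |flux| over below-threshold reactions."""
    score = 0.0
    for rxn, expr in expression.items():
        if expr < threshold:
            score += (threshold - expr) * abs(flux.get(rxn, 0.0))
    return score

expression = {"rxnA": 12.0, "rxnB": 2.0, "rxnC": 4.0}
flux       = {"rxnA": 1.0,  "rxnB": 0.5, "rxnC": 0.0}
print(inconsistency_score(expression, flux, threshold=5.0))  # -> 1.5
```

A score of zero means the expression data are fully consistent with the flux state needed for the objective; larger scores mean more below-threshold reactions must carry flux.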
The study of consistent properties of gelatinous shampoo with minoxidil
Directory of Open Access Journals (Sweden)
I. V. Gnitko
2016-04-01
Full Text Available The aim of this work is the study of the consistency properties of a gelatinous shampoo with 1% minoxidil for the complex therapy and prevention of alopecia. This shampoo formulation was selected on the basis of complex physical-chemical, biopharmaceutical and microbiological investigations. Methods and results. It has been established that the consistency properties of the gelatinous 1% minoxidil shampoo and its "mechanical stability" (1.70) characterize the formulation as an exceptionally thixotropic composition capable of restoration after mechanical loads. This fact also allows one to predict the stability of the consistency properties during long storage. Conclusion. The dynamic flow factors of the foam detergent gel with minoxidil (Kd1=38.9%; Kd2=78.06%) quantitatively confirm a sufficient degree of distribution when the composition is spread on the skin of the scalp or during the technological operations of manufacturing. The insignificant difference in "mechanical stability" between the gelatinous 1% minoxidil shampoo and its base indicates the absence of interactions between the active substance and the base.
Consistent Kaluza-Klein truncations via exceptional field theory
Energy Technology Data Exchange (ETDEWEB)
Hohm, Olaf [Center for Theoretical Physics, Massachusetts Institute of Technology,Cambridge, MA 02139 (United States); Samtleben, Henning [Université de Lyon, Laboratoire de Physique, UMR 5672, CNRS,École Normale Supérieure de Lyon, 46, allée d’Italie, F-69364 Lyon cedex 07 (France)
2015-01-26
We present the generalized Scherk-Schwarz reduction ansatz for the full supersymmetric exceptional field theory in terms of group-valued twist matrices subject to consistency equations. With this ansatz the field equations precisely reduce to those of lower-dimensional gauged supergravity parametrized by an embedding tensor. We explicitly construct a family of twist matrices as solutions of the consistency equations. They induce gauged supergravities with gauge groups SO(p,q) and CSO(p,q,r). Geometrically, they describe compactifications on internal spaces given by spheres and (warped) hyperboloids H^{p,q}, thus extending the applicability of generalized Scherk-Schwarz reductions beyond homogeneous spaces. Together with the dictionary that relates exceptional field theory to D=11 and IIB supergravity, respectively, the construction defines an entire new family of consistent truncations of the original theories. These include not only compactifications on spheres of different dimensions (such as AdS_5×S^5), but also various hyperboloid compactifications giving rise to a higher-dimensional embedding of supergravities with non-compact and non-semisimple gauge groups.
Marginal Consistency: Upper-Bounding Partition Functions over Commutative Semirings.
Werner, Tomás
2015-07-01
Many inference tasks in pattern recognition and artificial intelligence lead to partition functions in which addition and multiplication are abstract binary operations forming a commutative semiring. By generalizing max-sum diffusion (one of the convergent message passing algorithms for approximate MAP inference in graphical models), we propose an iterative algorithm to upper bound such partition functions over commutative semirings. The iteration of the algorithm is remarkably simple: change any two factors of the partition function such that their product remains the same and their overlapping marginals become equal. In many commutative semirings, repeating this iteration for different pairs of factors converges to a fixed point when the overlapping marginals of every pair of factors coincide. We call this state marginal consistency. During this process, an upper bound on the partition function monotonically decreases. This abstract algorithm unifies several existing algorithms, including max-sum diffusion and basic constraint propagation (or local consistency) algorithms in constraint programming. We further construct a hierarchy of marginal consistencies of increasingly higher levels and show that any such level can be enforced by adding identity factors of higher arity (order). Finally, we discuss instances of the framework for several semirings, including the distributive lattice and the max-sum and sum-product semirings.
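The core iteration described above, updating two factors so that their semiring product is unchanged while their marginals over the shared variable become equal, can be sketched for the max-sum semiring, where the "product" is addition and the marginal is a maximum. The toy factor shapes are our own:

```python
# Sketch of one marginal-equalization step in the max-sum semiring:
# the sum f(x,y) + g(y,z) (the semiring "product") is preserved while
# the max-marginals of f and g over the shared variable y become equal.
import numpy as np

def equalize_marginals(f, g):
    """f has axes (x, y); g has axes (y, z); y is the shared variable."""
    mf = f.max(axis=0)                  # max-marginal of f over y
    mg = g.max(axis=1)                  # max-marginal of g over y
    delta = 0.5 * (mf - mg)             # shift moved between the factors
    return f - delta[None, :], g + delta[:, None]

rng = np.random.default_rng(1)
f = rng.normal(size=(3, 4))
g = rng.normal(size=(4, 2))
f2, g2 = equalize_marginals(f, g)

# f2(x,y) + g2(y,z) equals f(x,y) + g(y,z) for every (x,y,z),
# and the two max-marginals over y now coincide.
print(np.allclose(f2.max(axis=0), g2.max(axis=1)))  # -> True
```

Repeating such steps over different factor pairs is what drives the bound down toward marginal consistency.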
Consistency relation in power law G-inflation
International Nuclear Information System (INIS)
Unnikrishnan, Sanil; Shankaranarayanan, S.
2014-01-01
In the standard inflationary scenario based on a minimally coupled scalar field, canonical or non-canonical, the subluminal propagation speed of scalar perturbations ensures the following consistency relation: r ≤ −8n_T, where r is the tensor-to-scalar ratio and n_T is the spectral index for tensor perturbations. However, it has recently been demonstrated that this consistency relation can be violated in Galilean inflation models even in the absence of superluminal propagation of scalar perturbations. It is therefore interesting to investigate whether the subluminal propagation of scalar field perturbations imposes any bound on the ratio r/|n_T| in G-inflation models. In this paper, we derive the consistency relation for a class of G-inflation models that lead to power law inflation. Within this class of models, it turns out that one can have r > −8n_T or r ≤ −8n_T depending on the model parameters. However, the subluminal propagation speed of scalar field perturbations, as required by causality, restricts r ≤ −(32/3) n_T.
CONSISTENCY UNDER SAMPLING OF EXPONENTIAL RANDOM GRAPH MODELS.
Shalizi, Cosma Rohilla; Rinaldo, Alessandro
2013-04-01
The growing availability of network data and of scientific interest in distributed systems has led to the rapid development of statistical models of network structure. Typically, however, these are models for the entire network, while the data consist only of a sampled sub-network. Parameters for the whole network, which is what is of interest, are estimated by applying the model to the sub-network. This assumes that the model is consistent under sampling, or, in terms of the theory of stochastic processes, that it defines a projective family. Focusing on the popular class of exponential random graph models (ERGMs), we show that this apparently trivial condition is in fact violated by many popular and scientifically appealing models, and that satisfying it drastically limits ERGMs' expressive power. These results are in fact special cases of more general results about exponential families of dependent random variables, which we also prove. Using such results, we offer easily checked conditions for the consistency of maximum likelihood estimation in ERGMs, and discuss some possible constructive responses.
International Nuclear Information System (INIS)
Potsaid, M.S.
1978-01-01
The clinical use of [75Se]selenomethionine for visualising the pancreas is described. The physiological considerations, imaging procedure, image interpretation and reliability are considered. (C.F.)
ASTER 2002-2003 Kansas Satellite Image Database (KSID)
Kansas Data Access and Support Center — The Kansas Satellite Image Database (KSID):2002-2003 consists of image data gathered by three sensors. The first image data are terrain-corrected, precision...
MODIS 2002-2003 Kansas Satellite Image Database (KSID)
Kansas Data Access and Support Center — The Kansas Satellite Image Database (KSID):2002-2003 consists of image data gathered by three sensors. The first image data are terrain-corrected, precision...
Piranti Lunak Pengujian Struktur Matematika Grup, Ring, Field Berbasis Osp (Open Source Program
Directory of Open Access Journals (Sweden)
Ngarap Im Manik
2014-06-01
Full Text Available This computer software design is a development and continuation of the software produced in previous research (2009/2010). This further research, however, developed and expanded the scope of testing to cover the Cyclic Group, Isomorphism Group, Semigroup, Subgroup and Abelian Group, and the Factor Ring, Subring and Polynomial Ring, and was developed on an OSP (Open Source Program) basis. The software was developed using OSP-based programming languages such as Java, so it is open and free for its users. This research succeeded in developing an open-source Java program that can be used for testing specific mathematical groups, such as the Cyclic Group, Isomorphism Group, Semigroup, Subgroup and Abelian Group, and rings, including the Commutative Ring, Division Ring, Ideal Subring, Ring Homomorphism, Ring Epimorphism and Fields. The results show that the software produced the same results as manual testing.
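The kind of check such software performs can be sketched by brute-force verification of the group axioms, here for Z_n under addition modulo n. The function is our own illustration, not the Java program's API:

```python
# Hedged sketch: brute-force check of the group axioms (closure,
# associativity, identity, inverses) on a finite set with a binary
# operation. Function names are illustrative.
from itertools import product

def is_group(elements, op):
    """Return True if (elements, op) satisfies all four group axioms."""
    elements = list(elements)
    # closure
    if any(op(a, b) not in elements for a, b in product(elements, repeat=2)):
        return False
    # associativity
    if any(op(op(a, b), c) != op(a, op(b, c))
           for a, b, c in product(elements, repeat=3)):
        return False
    # two-sided identity
    identity = next((e for e in elements
                     if all(op(e, a) == a == op(a, e) for a in elements)), None)
    if identity is None:
        return False
    # inverses
    return all(any(op(a, b) == identity for b in elements) for a in elements)

z5 = range(5)
print(is_group(z5, lambda a, b: (a + b) % 5))  # -> True
```

Brute force is cubic in the set size because of the associativity check, which is fine for the small structures such teaching software targets.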
Fusion component design for the moving-ring field-reversed mirror reactor
International Nuclear Information System (INIS)
Carlson, G.A.
1981-01-01
This partial report on the reactor design contains sections on the following: (1) burner section magnet system design, (2) plasma ring energy recovery, (3) vacuum system, (4) cryogenic system, (5) tritium flows and inventories, and (6) reactor design and layout
Nonlinear and self-consistent treatment of ECRH
Energy Technology Data Exchange (ETDEWEB)
Tsironis, C.; Vlahos, L.
2005-07-01
A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for the wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, e.g. particle trapping in the wave field, are dominant in the particle phase space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged-particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)
Nonlinear and self-consistent treatment of ECRH
International Nuclear Information System (INIS)
Tsironis, C.; Vlahos, L.
2005-01-01
A self-consistent formulation for the nonlinear interaction of electromagnetic waves with relativistic magnetized electrons is applied to the description of ECRH. In general, electron-cyclotron absorption is the result of resonances between the cyclotron harmonics and the Doppler-shifted wave frequency. The resonant interaction results in an intense wave-particle energy exchange and electron acceleration, and for that reason it is widely applied in fusion experiments for plasma heating and current drive. The linear theory for the wave absorption, as well as the quasilinear theory for the electron distribution function, are the most frequently used tools for the study of wave-particle interactions. However, in many cases the validity of these theories is violated, namely in cases where nonlinear effects, e.g. particle trapping in the wave field, are dominant in the particle phase space. Our model consists of electrons streaming and gyrating in a tokamak plasma slab, which is finite in the directions perpendicular to the main magnetic field. The particles interact with an electromagnetic electron-cyclotron wave of the ordinary (O-) or the extraordinary (X-) mode. A set of nonlinear and relativistic equations is derived, which takes into account the effects of the charged-particle motions on the wave. These consist of the equations of motion for the plasma electrons in the slab, as well as the wave equation in terms of the vector potential. The effect of the electron motions on the temporal evolution of the wave is reflected in the current density source term. (Author)
Consistent individual differences in fathering in threespined stickleback Gasterosteus aculeatus
Directory of Open Access Journals (Sweden)
Laura R. STEIN, Alison M. BELL
2012-02-01
Full Text Available There is growing evidence that individual animals show consistent differences in behavior. For example, individual threespined stickleback fish differ in how they react to predators and how aggressive they are during social interactions with conspecifics. A relatively unexplored but potentially important axis of variation is parental behavior. In sticklebacks, fathers provide all of the parental care that is necessary for offspring survival; therefore paternal care is directly tied to fitness. In this study, we assessed whether individual male sticklebacks differ consistently from each other in parental behavior. We recorded visits to nest, total time fanning, and activity levels of 11 individual males every day throughout one clutch, and then allowed the males to breed again. Half of the males were exposed to predation risk while parenting during the first clutch, and the other half of the males experienced predation risk during the second clutch. We detected dramatic temporal changes in parental behaviors over the course of the clutch: for example, total time fanning increased six-fold prior to eggs hatching, then decreased to approximately zero. Despite these temporal changes, males retained their individually-distinctive parenting styles within a clutch that could not be explained by differences in body size or egg mass. Moreover, individual differences in parenting were maintained when males reproduced for a second time. Males that were exposed to simulated predation risk briefly decreased fanning and increased activity levels. Altogether, these results show that individual sticklebacks consistently differ from each other in how they behave as parents [Current Zoology 58 (1): 45–52, 2012].
Flood damage curves for consistent global risk assessments
de Moel, Hans; Huizinga, Jan; Szewczyk, Wojtek
2016-04-01
Assessing potential damage of flood events is an important component in flood risk management. Determining direct flood damage is commonly done using depth-damage curves, which denote the flood damage that would occur at specific water depths per asset or land-use class. Many countries around the world have developed flood damage models using such curves, based on analysis of past flood events and/or on expert judgement. However, such damage curves are not available for all regions, which hampers damage assessments in those regions. Moreover, due to the different methodologies employed for damage models in different countries, damage assessments cannot be directly compared with each other, also obstructing supra-national flood damage assessments. To address these problems, a globally consistent dataset of depth-damage curves has been developed. This dataset contains damage curves depicting percent of damage as a function of water depth, as well as maximum damage values for a variety of assets and land-use classes (i.e. residential, commercial, agriculture). Based on an extensive literature survey, concave damage curves have been developed for each continent, while differentiation in flood damage between countries is established by determining maximum damage values at the country scale. These maximum damage values are based on construction cost surveys from multinational construction companies, which provide a coherent set of detailed building cost data across dozens of countries. A consistent set of maximum flood damage values for all countries was computed using statistical regressions with socio-economic World Development Indicators from the World Bank. Further, based on insights from the literature survey, guidance is also given on how the damage curves and maximum damage values can be adjusted for specific local circumstances, such as urban vs. rural locations, use of specific building materials, etc. This dataset can be used for consistent supra-national flood damage assessments.
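The depth-damage computation described above can be sketched in a few lines. The curve points and the country-scale maximum damage value below are illustrative assumptions, not values from the dataset:

```python
import numpy as np

# Hypothetical concave depth-damage curve for one asset class:
# damage fraction (0..1) as a function of water depth (m).
curve_depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 6.0])       # m
curve_fractions = np.array([0.0, 0.25, 0.40, 0.60, 0.75, 1.0])

def flood_damage(depth_m, max_damage):
    """Direct damage = country-scale maximum damage x interpolated fraction."""
    frac = np.interp(depth_m, curve_depths, curve_fractions)
    return frac * max_damage

# Illustrative country-scale maximum damage value (EUR per m2 of floor area).
max_damage_eur_m2 = 600.0
print(flood_damage(1.5, max_damage_eur_m2))
```

Note that `np.interp` clamps outside the curve range, so depths above the last curve point simply return the maximum damage value.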
Coagulation of Agglomerates Consisting of Polydisperse Primary Particles.
Goudeli, E; Eggersdorfer, M L; Pratsinis, S E
2016-09-13
The ballistic agglomeration of polydisperse particles is investigated by an event-driven (ED) method and compared to the coagulation of spherical particles and agglomerates consisting of monodisperse primary particles (PPs). It is shown for the first time to our knowledge that increasing the width or polydispersity of the PP size distribution initially accelerates the coagulation rate of their agglomerates but delays the attainment of their asymptotic fractal-like structure and self-preserving size distribution (SPSD) without altering them, provided that sufficiently large numbers of PPs are employed. For example, the standard asymptotic mass fractal dimension, Df, of 1.91 is attained when clusters are formed containing, on average, about 15 monodisperse PPs, consistent with fractal theory and the literature. In contrast, when polydisperse PPs with a geometric standard deviation of 3 are employed, about 500 PPs are needed to attain that Df. Even though the same asymptotic Df and mass-mobility exponent, Dfm, are attained regardless of PP polydispersity, the asymptotic prefactors or lacunarities of Df and Dfm increase with PP polydispersity. For monodisperse PPs, the average agglomerate radius of gyration, rg, becomes larger than the mobility radius, rm, when agglomerates consist of more than 15 PPs. Increasing PP polydispersity increases that number of PPs similarly to the above for the attainment of the asymptotic Df or Dfm. The agglomeration kinetics are quantified by the overall collision frequency function. When the SPSD is attained, the collision frequency is independent of PP polydispersity. Accounting for the SPSD polydispersity in the overall agglomerate collision frequency is in good agreement with that frequency from detailed ED simulations once the SPSD is reached. Most importantly, the coagulation of agglomerates is described well by a monodisperse model for agglomerate and PP sizes, whereas the detailed agglomerate size distribution can be obtained by
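The asymptotic fractal-like structure discussed above is commonly summarized by the scaling law N = kf (rg/rpp)^Df. A minimal sketch of recovering Df by log-log regression on synthetic agglomerate data (all numbers illustrative, not from the ED simulations) is:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic agglomerate population obeying N = kf * (rg/rpp)**Df,
# with Df and kf chosen to match the asymptotic values quoted above.
Df_true, kf, rpp = 1.91, 1.3, 1.0   # fractal dimension, prefactor, PP radius
N = np.array([20, 50, 100, 200, 500, 1000], dtype=float)
rg = rpp * (N / kf) ** (1.0 / Df_true)
rg *= np.exp(rng.normal(0.0, 0.02, rg.size))   # small measurement scatter

# Df is the slope of log N versus log(rg/rpp).
slope, intercept = np.polyfit(np.log(rg / rpp), np.log(N), 1)
print(round(slope, 2))   # slope estimates Df
```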
DEFF Research Database (Denmark)
Norman, Patrick; Bishop, David M.; Jensen, Hans Jørgen Aa
2001-01-01
Computationally tractable expressions for the evaluation of the linear response function in the multiconfigurational self-consistent field approximation were derived and implemented. The finite lifetime of the electronically excited states was considered, and the linear response function was shown to be convergent in the whole frequency region. This was achieved through the incorporation of phenomenological damping factors that lead to complex response function values.
Consistent and efficient processing of ADCP streamflow measurements
Mueller, David S.; Constantinescu, George; Garcia, Marcelo H.; Hanes, Dan
2016-01-01
The use of Acoustic Doppler Current Profilers (ADCPs) from a moving boat is a commonly used method for measuring streamflow. Currently, the algorithms used to compute the average depth, compute edge discharge, identify invalid data, and estimate velocity and discharge for invalid data vary among manufacturers. These differences could result in different discharges being computed from identical data. A consistent computational algorithm, with automated filtering and quality assessment of ADCP streamflow measurements that are independent of the ADCP manufacturer, is being developed in a software program that can process ADCP moving-boat discharge measurements independent of the ADCP used to collect the data.
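As a rough illustration of why algorithmic choices matter, a simplified mid-section discharge computation is sketched below. It ignores edge estimation, invalid-data handling and boat motion, and all numbers are made up, so it is a conceptual stand-in for, not a reproduction of, the algorithms under discussion:

```python
import numpy as np

# Simplified mid-section discharge: Q = sum over verticals of v_i * d_i * w_i,
# where w_i is the channel width assigned to each vertical.
station = np.array([0.0, 2.0, 4.0, 6.0, 8.0, 10.0])  # distance across channel (m)
depth   = np.array([0.0, 1.2, 1.8, 1.9, 1.1, 0.0])   # water depth (m)
vel     = np.array([0.0, 0.4, 0.6, 0.7, 0.3, 0.0])   # mean velocity (m/s)

# Width assigned to each vertical: half the distance to each neighbour.
w = np.empty_like(station)
w[1:-1] = (station[2:] - station[:-2]) / 2.0
w[0] = (station[1] - station[0]) / 2.0
w[-1] = (station[-1] - station[-2]) / 2.0

Q = np.sum(vel * depth * w)   # discharge (m^3/s)
print(round(Q, 3))
```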
Consistency of FMEA used in the validation of analytical procedures
DEFF Research Database (Denmark)
Oldenhof, M.T.; van Leeuwen, J.F.; Nauta, Maarten
2011-01-01
In order to explore the consistency of the outcome of a Failure Mode and Effects Analysis (FMEA) in the validation of analytical procedures, an FMEA was carried out by two different teams. The two teams applied two separate FMEAs to a High Performance Liquid Chromatography-Diode Array Detection … is always carried out under the supervision of an experienced FMEA-facilitator and that the FMEA team has at least two members with competence in the analytical method to be validated. However, the FMEAs of both teams contained valuable information that was not identified by the other team, indicating …
Designing apps for success developing consistent app design practices
David, Matthew
2014-01-01
In 2007, Apple released the iPhone. With this release came tools as revolutionary as the internet was to businesses and individuals back in the mid- and late-nineties: apps. Much like websites drove (and still drive) business, so too do apps drive sales, efficiencies and communication between people. But also like web design and development in its early years and iterations, guidelines and best practices for apps are few and far between. Designing Apps for Success provides web/app designers and developers with consistent app design practices that result in timely, appropriate, and efficiently …
The numerical multiconfiguration self-consistent field approach for atoms
International Nuclear Information System (INIS)
Stiehler, Johannes
1995-12-01
The dissertation uses the multiconfiguration self-consistent field approach to specify the electronic wave function of N-electron atoms in a static electric field. It presents numerical approaches to describe the wave functions and introduces new methods to compute the numerical Fock equations. Based on results computed with the implemented computer program, the universal applicability, flexibility and high numerical precision of the presented approach are demonstrated. RHF results and, for the first time, MCSCF results for polarizabilities and hyperpolarizabilities of various states of the atoms He to Kr are discussed. In addition, an application to the interpretation of a plasma spectrum of gallium is presented. (orig.)
Self-consistent potential variations in magnetic wells
International Nuclear Information System (INIS)
Kesner, J.; Knorr, G.; Nicholson, D.R.
1981-01-01
Self-consistent electrostatic potential variations are considered in a spatial region of weak magnetic field, as in the proposed tandem mirror thermal barriers (with no trapped ions). For some conditions, equivalent to ion distributions with a sufficiently high net drift speed along the magnetic field, the desired potential depressions are found. When the net drift speed is not high enough, potential depressions are found only in combination with strong electric fields on the boundaries of the system. These potential depressions are not directly related to the magnetic field depression. (author)
Applicability of self-consistent mean-field theory
International Nuclear Information System (INIS)
Guo Lu; Sakata, Fumihiko; Zhao Enguang
2005-01-01
Within the constrained Hartree-Fock (CHF) theory, an analytic condition is derived to estimate whether a concept of the self-consistent mean field is realized in the level repulsive region. The derived condition states that an iterative calculation of the CHF equation does not converge when the quantum fluctuations coming from two-body residual interaction and quadrupole deformation become larger than a single-particle energy difference between two avoided crossing orbits. By means of numerical calculation, it is shown that the analytic condition works well for a realistic case
Island of stability for consistent deformations of Einstein's gravity.
Berkhahn, Felix; Dietrich, Dennis D; Hofmann, Stefan; Kühnel, Florian; Moyassari, Parvin
2012-03-30
We construct deformations of general relativity that are consistent and phenomenologically viable, since they respect, in particular, cosmological backgrounds. These deformations have unique symmetries in accordance with their Minkowski cousins (Fierz-Pauli theory for massive gravitons) and incorporate a background curvature induced self-stabilizing mechanism. Self-stabilization is essential in order to guarantee hyperbolic evolution in and unitarity of the covariantized theory, as well as the deformation's uniqueness. We show that the deformation's parameter space contains islands of absolute stability that are persistent through the entire cosmic evolution.
The self-consistent dynamic pole tide in global oceans
Dickman, S. R.
1985-01-01
The dynamic pole tide is characterized in a self-consistent manner by means of introducing a single nondifferential matrix equation compatible with the Liouville equation, modelling the ocean as global and of uniform depth. The deviations of the theory from the realistic ocean, associated with the nonglobality of the latter, are also given consideration, with an inference that in realistic oceans long-period modes of resonances would be increasingly likely to exist. The analysis of the nature of the pole tide and its effects on the Chandler wobble indicate that departures of the pole tide from the equilibrium may indeed be minimal.
Simplified models for dark matter face their consistent completions
Energy Technology Data Exchange (ETDEWEB)
Gonçalves, Dorival; Machado, Pedro A. N.; No, Jose Miguel
2017-03-01
Simplified dark matter models have been recently advocated as a powerful tool to exploit the complementarity between dark matter direct detection, indirect detection and LHC experimental probes. Focusing on pseudoscalar mediators between the dark and visible sectors, we show that the simplified dark matter model phenomenology departs significantly from that of consistent ${SU(2)_{\\mathrm{L}} \\times U(1)_{\\mathrm{Y}}}$ gauge invariant completions. We discuss the key physics simplified models fail to capture, and its impact on LHC searches. Notably, we show that resonant mono-Z searches provide competitive sensitivities to standard mono-jet analyses at $13$ TeV LHC.
Two-particle self-consistent approach to unconventional superconductivity
Energy Technology Data Exchange (ETDEWEB)
Otsuki, Junya [Department of Physics, Tohoku University, Sendai (Japan); Theoretische Physik III, Zentrum fuer Elektronische Korrelationen und Magnetismus, Universitaet Augsburg (Germany)
2013-07-01
A non-perturbative approach to unconventional superconductivity is developed based on the idea of the two-particle self-consistent (TPSC) theory. An exact sum-rule which the momentum-dependent pairing susceptibility satisfies is derived. Effective pairing interactions between quasiparticles are determined so that an approximate susceptibility fulfills this sum-rule, in which fluctuations belonging to different symmetries mix at finite momentum. The mixing leads to a suppression of the $d_{x^2-y^2}$ pairing close to half-filling, resulting in a maximum of $T_c$ away from half-filling.
Correlations and self-consistency in pion scattering. II
International Nuclear Information System (INIS)
Johnson, M.B.; Keister, B.D.
1978-01-01
In an attempt to overcome certain difficulties of summing higher order processes in pion multiple scattering theories, a new, systematic expansion for the interaction of a pion in nuclear matter is derived within the context of the Foldy-Walecka theory, incorporating nucleon-nucleon correlations and an idea of self-consistency. The first two orders in the expansion are evaluated as a function of the nonlocality range; the expansion appears to be rapidly converging, in contrast to expansion schemes previously examined. (Auth.)
Quark mean field theory and consistency with nuclear matter
International Nuclear Information System (INIS)
Dey, J.; Dey, M.; Frederico, T.; Tomio, L.
1990-09-01
The $1/N_c$ expansion in QCD (with $N_c$ the number of colours) suggests using a potential from the meson sector (e.g. Richardson) for baryons. For light quarks a σ field has to be introduced to ensure chiral symmetry breaking (χSB). It is found that nuclear matter properties can be used to pin down the χSB modelling. All masses, $M_N$, $m_\sigma$, $m_\omega$, are found to scale with density. The equations are solved self-consistently. (author). 29 refs, 2 tabs
A self-consistent model of an isothermal tokamak
McNamara, Steven; Lilley, Matthew
2014-10-01
Continued progress in liquid lithium coating technologies has made the development of a beam-driven tokamak with minimal edge recycling a feasible possibility. Such devices are characterised by improved confinement due to their inherent stability and the suppression of thermal conduction. Particle and energy confinement become intrinsically linked, and the plasma thermal energy content is governed by the injected beam. A self-consistent model of a purely beam-fuelled isothermal tokamak is presented, including calculations of the density profile, bulk species temperature ratios and the fusion output. Stability considerations constrain the operating parameters; regions of stable operation are identified and their suitability to potential reactor applications discussed.
Self-consistent calculation of 208Pb spectrum
International Nuclear Information System (INIS)
Pal'chik, V.V.; Pyatov, N.I.; Fayans, S.A.
1981-01-01
The self-consistent model with exact accounting for the one-particle continuum is applied to calculate all discrete particle-hole natural-parity states in the 208Pb nucleus (up to the neutron emission threshold, 7.4 MeV). Contributions to the energy-weighted sum rules S(EL) of the first collective levels and total contributions of all discrete levels are evaluated. The collectivization is manifested most strongly for octupole states. With growing multipolarity L, the contributions of discrete levels are sharply reduced. The results are compared with other models and with the experimental data obtained in (e, e'), (p, p') reactions and other data
Poisson solvers for self-consistent multi-particle simulations
International Nuclear Information System (INIS)
Qiang, J; Paret, S
2014-01-01
Self-consistent multi-particle simulation plays an important role in studying beam-beam effects and space charge effects in high-intensity beams. The Poisson equation has to be solved at each time-step based on the particle density distribution in the multi-particle simulation. In this paper, we review a number of numerical methods that can be used to solve the Poisson equation efficiently. The computational complexity of those numerical methods will be O(N log(N)) or O(N) instead of O(N^2), where N is the total number of grid points used to solve the Poisson equation.
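The O(N log(N)) class of solvers mentioned above includes spectral methods. A minimal FFT-based solve of the 1D periodic Poisson equation, checked against an analytic solution, is sketched here (this is an illustration of the complexity class, not one of the codes reviewed in the paper):

```python
import numpy as np

# Spectral (FFT) Poisson solver on a 1D periodic domain: -u'' = f,
# i.e. u_hat(k) = f_hat(k) / k^2 for k != 0, at O(N log N) per solve.
N = 256
L = 2 * np.pi
x = np.arange(N) * L / N
k = np.fft.fftfreq(N, d=L / N) * 2 * np.pi   # angular wavenumbers

f = np.sin(3 * x)            # source term; exact solution is u = sin(3x)/9
f_hat = np.fft.fft(f)
u_hat = np.zeros_like(f_hat)
nonzero = k != 0
u_hat[nonzero] = f_hat[nonzero] / k[nonzero] ** 2   # zero mode left at 0
u = np.real(np.fft.ifft(u_hat))

print(np.max(np.abs(u - np.sin(3 * x) / 9)))   # error at machine precision
```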
Consistency of differential and integral thermonuclear neutronics data
International Nuclear Information System (INIS)
Reupke, W.A.
1978-01-01
To increase the accuracy of the neutronics analysis of nuclear reactors, physicists and engineers have employed a variety of techniques, including the adjustment of multigroup differential data to improve consistency with integral data. Of the various adjustment strategies, a generalized least-squares procedure which adjusts the combined differential and integral data can significantly improve the accuracy of neutronics calculations compared to calculations employing only differential data. This investigation analyzes 14 MeV neutron-driven integral experiments, using a more extensively developed methodology and a newly developed computer code, to extend the domain of adjustment from the energy range of fission reactors to the energy range of fusion reactors
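A generalized least-squares adjustment of the kind described can be sketched with toy numbers: prior differential data p with covariance C are pulled toward an integral measurement y_m through a sensitivity matrix S. All values below are illustrative, not data from the 14 MeV experiments:

```python
import numpy as np

# Minimal GLS data-adjustment sketch: adjust multigroup differential data
# so the calculated integral response S @ p moves toward the measured value.
p = np.array([1.00, 0.80, 0.50])    # prior differential data
C = np.diag([0.01, 0.02, 0.01])     # prior covariance (uncorrelated here)
S = np.array([[0.5, 0.3, 0.2]])     # sensitivities of the integral response
V = np.array([[0.001]])             # integral-experiment variance
y_m = np.array([0.90])              # measured integral value

K = C @ S.T @ np.linalg.inv(S @ C @ S.T + V)   # GLS gain
p_adj = p + (K @ (y_m - S @ p))
print(p_adj, (S @ p_adj).item())
```

The adjusted data reproduce the integral measurement more closely than the prior data, with the size of the shift controlled by the relative covariances.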
Consistent treatment of one-body dynamics and collective fluctuations
International Nuclear Information System (INIS)
Pfitzner, A.
1986-09-01
We show how the residual coupling δV between collective and intrinsic motion induces correlations, which lead to fluctuations of the collective variables and to a redistribution of single-particle occupation numbers ρ_α. The evolution of ρ_α and of the collective fluctuations is consistently described by a coupled system of equations, which accounts for the dependence of the transport coefficients on ρ_α, and for the dependence of the transition rates in the master equation on the collective variances. (author)
Mean-field theory and self-consistent dynamo modeling
International Nuclear Information System (INIS)
Yoshizawa, Akira; Yokoi, Nobumitsu
2001-12-01
Mean-field theory of dynamo is discussed with emphasis on the statistical formulation of turbulence effects on the magnetohydrodynamic equations and the construction of a self-consistent dynamo model. The dynamo mechanism is sought in the combination of the turbulent residual-helicity and cross-helicity effects. On the basis of this mechanism, discussions are made on the generation of planetary magnetic fields such as geomagnetic field and sunspots and on the occurrence of flow by magnetic fields in planetary and fusion phenomena. (author)
The Consistency Of High Attorney Of Papua In Corruption Investigation
Directory of Open Access Journals (Sweden)
Samsul Tamher
2015-08-01
This study aimed to determine the consistency of the High Attorney of Papua in corruption investigations and in efforts to recover state financial losses. The study is normative-juridical and empirical-juridical in type. The results showed that the High Attorney of Papua's corruption investigations are not optimal, owing to political interference in cases involving local officials, so that the High Attorney's decisions are not in accordance with the rule of law. To recover state financial losses, the High Attorney of Papua works through the State Auction Body under civil and criminal law.
Wavelets in self-consistent electronic structure calculations
International Nuclear Information System (INIS)
Wei, S.; Chou, M.Y.
1996-01-01
We report the first implementation of orthonormal wavelet bases in self-consistent electronic structure calculations within the local-density approximation. These local bases of different scales efficiently describe localized orbitals of interest. As an example, we studied two molecules, H2 and O2, using pseudopotentials and supercells. Considerably fewer bases are needed compared with conventional plane-wave approaches, yet calculated binding properties are similar. Our implementation employs fast wavelet and Fourier transforms, avoiding evaluating any three-dimensional integral numerically. copyright 1996 The American Physical Society
Self-consistent electronic-structure calculations for interface geometries
International Nuclear Information System (INIS)
Sowa, E.C.; Gonis, A.; MacLaren, J.M.; Zhang, X.G.
1992-01-01
This paper describes a technique for computing self-consistent electronic structures and total energies of planar defects, such as interfaces, which are embedded in an otherwise perfect crystal. As in the Layer Korringa-Kohn-Rostoker approach, the solid is treated as a set of coupled layers of atoms, using Bloch's theorem to take advantage of the two-dimensional periodicity of the individual layers. The layers are coupled using the techniques of the Real-Space Multiple-Scattering Theory, avoiding artificial slab or supercell boundary conditions. A total-energy calculation on a Cu crystal, which has been split apart at a (111) plane, is used to illustrate the method
Sensor and control for consistent seed drill coulter depth
DEFF Research Database (Denmark)
Kirkegaard Nielsen, Søren; Nørremark, Michael; Green, Ole
2016-01-01
The consistent depth placement of seeds is vital for achieving the optimum yield of agricultural crops. In state-of-the-art seeding machines, the depth of drill coulters will vary with changes in soil resistance. This paper presents the retrofitting of an angle sensor to the pivoting point … by a sub-millimetre accurate positioning system (iGPS, Nikon Metrology NV, Belgium) mounted on the drill coulter. At a drill coulter depth of 55 mm, controlled by an ordinary fixed spring-loaded down force only, the change in soil resistance decreased the mean depth by 23 mm. By dynamically controlling …
SIMPLE AND STRONGLY CONSISTENT ESTIMATOR OF STABLE DISTRIBUTIONS
Directory of Open Access Journals (Sweden)
Cira E. Guevara Otiniano
2016-06-01
Stable distributions are extensively used to analyze the earnings of financial assets, such as exchange rates and stock prices. In this paper we propose a simple and strongly consistent estimator for the scale parameter of a symmetric stable Lévy distribution. The advantage of this estimator is that its computational cost is minimal, so it can be used to initialize computationally intensive procedures such as maximum likelihood. We tested the efficacy of the estimator on random samples of size n by the Monte Carlo method. We also include applications to three data sets.
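A quantile-based scale estimator in this spirit can be sketched for the special case alpha = 1 (the Cauchy distribution), where the quartiles sit at ±gamma. This is an illustrative stand-in with a Monte Carlo consistency check, not necessarily the estimator proposed in the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

def iqr_scale(x):
    """For a Cauchy (symmetric stable, alpha = 1) with scale gamma, the
    25%/75% quantiles are at -gamma/+gamma, so half the interquartile
    range is a simple, strongly consistent estimator of gamma."""
    q25, q75 = np.quantile(x, [0.25, 0.75])
    return (q75 - q25) / 2.0

gamma = 2.0
for n in (100, 10_000, 1_000_000):
    x = gamma * rng.standard_cauchy(n)   # symmetric stable sample, alpha = 1
    est = iqr_scale(x)
    print(n, est)                         # estimates approach gamma as n grows
```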
On the hydrodynamic limit of self-consistent field equations
International Nuclear Information System (INIS)
Pauli, H.C.
1980-01-01
As an approximation to the nuclear many-body problem, the hydrodynamical limit of self-consistent field equations is worked out and applied to the treatment of vibrational and rotational motion. Its validity is tied to the value of a smallness parameter, behaving as $20A^{-2/3}$ with the number of nucleons A. For finite nuclei, this number is not small enough compared to 1, and indeed one observes a discrepancy of roughly a factor of 5 between the hydrodynamic frequencies and the relevant experimental numbers. (orig.)
Multiconfigurational self-consistent reaction field theory for nonequilibrium solvation
DEFF Research Database (Denmark)
Mikkelsen, Kurt V.; Cesar, Amary; Ågren, Hans
1995-01-01
… electronic structure, whereas the inertial polarization vector is not necessarily in equilibrium with the actual electronic structure. The electronic structure of the compound is described by a correlated electronic wave function, a multiconfigurational self-consistent field (MCSCF) wave function. This wave … , open-shell, excited, and transition states. We demonstrate the theory by computing solvatochromatic shifts in optical/UV spectra of some small molecules and electron ionization and electron detachment energies of the benzene molecule. It is shown that the dependency of the solvent-induced affinity …
A Consistent Pricing Model for Index Options and Volatility Derivatives
DEFF Research Database (Denmark)
Cont, Rama; Kokholm, Thomas
2013-01-01
… to be priced consistently, while allowing for jumps in volatility and returns. An affine specification using Lévy processes as building blocks leads to analytically tractable pricing formulas for volatility derivatives, such as VIX options, as well as efficient numerical methods for pricing of European options on the underlying asset. The model has the convenient feature of decoupling the vanilla skews from spot/volatility correlations and allowing for different conditional correlations in large and small spot/volatility moves. We show that our model can simultaneously fit prices of European options on S&P 500 across …
Consistent vapour-liquid equilibrium data containing lipids
DEFF Research Database (Denmark)
Cunico, Larissa; Ceriani, Roberta; Sarup, Bent
Consistent physical and thermodynamic properties of pure components and their mixtures are important for process design, simulation, and optimization, as well as for the design of chemical-based products. In the case of lipids, a lack of experimental data was observed for pure compounds and also … for their mixtures in the open literature, which makes the development of reliable predictive models based on limited data necessary. To contribute to the missing data, measurements of isobaric vapour-liquid equilibrium (VLE) data of three binary mixtures at two different pressures were performed at State University …
Image quality (IQ) guided multispectral image compression
Zheng, Yufeng; Chen, Genshe; Wang, Zhonghai; Blasch, Erik
2016-05-01
Image compression is necessary for data transportation, as it saves both transfer time and storage space. In this paper, we focus our discussion on lossy compression. There are many standard image formats and corresponding compression algorithms, for example, JPEG (DCT -- discrete cosine transform), JPEG 2000 (DWT -- discrete wavelet transform), BPG (better portable graphics) and TIFF (LZW -- Lempel-Ziv-Welch). The image quality (IQ) of the decompressed image will be measured by numerical metrics such as root mean square error (RMSE), peak signal-to-noise ratio (PSNR), and Structural Similarity (SSIM) index. Given an image and a specified IQ, we investigate how to select a compression method and its parameters to achieve an expected compression. Our scenario consists of 3 steps. The first step is to compress a set of images of interest by varying parameters and to compute their IQs for each compression method. The second step is to create several regression models per compression method after analyzing the IQ measurement versus compression parameter from a number of compressed images. The third step is to compress the given image with the specified IQ using the selected compression method (JPEG, JPEG 2000, BPG, or TIFF) according to the regressed models. The IQ may be specified by a compression ratio (e.g., 100), in which case we select the compression method of the highest IQ (SSIM or PSNR). Or the IQ may be specified by an IQ metric (e.g., SSIM = 0.8, or PSNR = 50), in which case we select the compression method of the highest compression ratio. Our experiments, tested on thermal (long-wave infrared) images in gray scale, showed very promising results.
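The RMSE and PSNR metrics named above are straightforward to compute. A minimal sketch on a synthetic 8-bit image pair (illustrative data, not the thermal images used in the experiments) is:

```python
import numpy as np

def rmse(a, b):
    """Root mean square error between two images."""
    return float(np.sqrt(np.mean((a.astype(float) - b.astype(float)) ** 2)))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    e = rmse(a, b)
    return float("inf") if e == 0 else 20.0 * np.log10(peak / e)

# Synthetic "original" and a slightly degraded "decompressed" copy.
rng = np.random.default_rng(1)
orig = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
noise = rng.integers(-2, 3, size=orig.shape)
decomp = np.clip(orig.astype(int) + noise, 0, 255).astype(np.uint8)

print(rmse(orig, decomp), psnr(orig, decomp))
```

Given several candidate decompressed images, the selection rule in the abstract amounts to picking the method whose output maximizes the chosen metric at the specified constraint.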
Thermal Non-equilibrium Consistent with Widespread Cooling
Winebarger, A.; Lionello, R.; Mikic, Z.; Linker, J.; Mok, Y.
2014-01-01
Time correlation analysis has been used to show widespread cooling in the solar corona; this cooling has been interpreted as a result of impulsive (nanoflare) heating. In this work, we investigate widespread cooling using a 3D model of a solar active region which has been heated with highly stratified heating. This type of heating drives thermal non-equilibrium solutions, meaning that though the heating is effectively steady, the density and temperature in the solution are not. We simulate the expected observations in narrowband EUV images and apply the time correlation analysis. We find that the results of this analysis are qualitatively similar to the observed data. We discuss additional diagnostics that may be applied to differentiate between these two heating scenarios.
Self-consistent viscous heating of rapidly compressed turbulence
Campos, Alejandro; Morgan, Brandon
2017-11-01
Given turbulence subjected to infinitely rapid deformations, linear terms representing interactions between the mean flow and the turbulence dictate the evolution of the flow, whereas non-linear terms corresponding to turbulence-turbulence interactions are safely ignored. For rapidly deformed flows where the turbulence Reynolds number is not sufficiently large, viscous effects cannot be neglected and tend to play a prominent role, as shown in the study of Davidovits & Fisch (2016). For such a case, the rapid increase of viscosity in a plasma (as compared to the weaker scaling of viscosity in a fluid) leads to the sudden viscous dissipation of turbulent kinetic energy. As shown in Davidovits & Fisch, increases in temperature caused by the direct compression of the plasma drive sufficiently large values of viscosity. We report on numerical simulations of turbulence where the increase in temperature is the result of both the direct compression (an inviscid mechanism) and the self-consistent viscous transfer of energy from the turbulent scales towards the thermal energy. A comparison of implicit large-eddy simulations against well-resolved direct numerical simulations is included to assess the effect of the numerical and subgrid-scale dissipation on the self-consistent viscous heating. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.
Parton Distributions based on a Maximally Consistent Dataset
Rojo, Juan
2016-04-01
The choice of data that enters a global QCD analysis can have a substantial impact on the resulting parton distributions and their predictions for collider observables. One of the main reasons for this is the possible presence of inconsistencies, either internal within an experiment or external between different experiments. In order to assess the robustness of the global fit, different definitions of a conservative PDF set, that is, a PDF set based on a maximally consistent dataset, have been introduced. However, these approaches are typically affected by theory biases in the selection of the dataset. In this contribution, after a brief overview of recent NNPDF developments, we propose a new, fully objective, definition of a conservative PDF set, based on the Bayesian reweighting approach. Using the new NNPDF3.0 framework, we produce various conservative sets, which turn out to be mutually in agreement within the respective PDF uncertainties, as well as with the global fit. We explore some of their implications for LHC phenomenology, finding also good consistency with the global fit result. These results provide a non-trivial validation test of the new NNPDF3.0 fitting methodology, and indicate that possible inconsistencies in the fitted dataset do not substantially affect the global-fit PDFs.
Self-consistent modeling of electron cyclotron resonance ion sources
International Nuclear Information System (INIS)
Girard, A.; Hitz, D.; Melin, G.; Serebrennikov, K.; Lecot, C.
2004-01-01
In order to predict the performance of electron cyclotron resonance ion sources (ECRIS), it is necessary to model precisely the different parts of these sources: (i) the magnetic configuration; (ii) the plasma characteristics; (iii) the extraction system. The magnetic configuration is easily calculated via commercial codes; different codes also simulate the ion extraction, either in two dimensions or even in three (to take into account the shape of the plasma at extraction, which is influenced by the hexapole). However, the characteristics of the plasma are not always well mastered. This article describes the self-consistent modeling of ECRIS: we have developed a code which takes into account the most important construction parameters: the size of the plasma (length, diameter), the mirror ratio and axial magnetic profile, and whether or not a biased probe is installed. These input parameters feed a self-consistent code, which calculates the characteristics of the plasma: electron density and energy, charge state distribution, and plasma potential. The code is briefly described, and some of its most interesting results are presented. Comparisons are made between the calculations and the results obtained experimentally.
Consistent Quantum Histories: Towards a Universal Language of Physics
International Nuclear Information System (INIS)
Grygiel, W.P.
2007-01-01
The consistent histories interpretation of quantum mechanics is a reformulation of the standard Copenhagen interpretation that aims at incorporating quantum probabilities as part of the axiomatic foundations of the theory. It is not only supposed to equip quantum mechanics with clear criteria of its own experimental verification but, first and foremost, to alleviate one of the stumbling blocks of the theory: the measurement problem. Since the consistent histories interpretation operates with a series of quantum events integrated into one quantum history, the measurement problem is naturally absorbed as one of the events that build up a history. The interpretation rests upon the two following assumptions, proposed already by J. von Neumann: (1) both the microscopic and macroscopic regimes are subject to the same set of quantum laws, and (2) a projection operator assigned to each event within a history permits the history to be transcribed into a set of propositions that relate the entire course of quantum events. Based on this, a universal language of physics is expected to emerge that will bring the quantum apparatus back to common-sense propositional logic. The basic philosophical issue raised by this study is whether one should justify quantum mechanics by means of what emerges from it, that is, the properties of the macroscopic world, or use the axioms of quantum mechanics to demonstrate the mechanisms by which the macroscopic world comes about from the quantum regime. (author)
Feeling Expression Using Avatars and Its Consistency for Subjective Annotation
Ito, Fuyuko; Sasaki, Yasunari; Hiroyasu, Tomoyuki; Miki, Mitsunori
Consumer Generated Media (CGM) is growing rapidly and the amount of content is increasing. However, it is often difficult for users to extract important contents, and the existence of contents recording their experiences can easily be forgotten. As there are no methods or systems to indicate the subjective value of the contents or ways to reuse them, subjective annotation appending subjectivity, such as feelings and intentions, to contents is needed. Representation of subjectivity depends not only on verbal expression, but also on nonverbal expression. Linguistically expressed annotation, typified by collaborative tagging in social bookmarking systems, has come into widespread use, but there is no system of nonverbally expressed annotation on the web. We propose the use of controllable avatars as a means of nonverbal expression of subjectivity, and we confirmed the consistency of feelings elicited by avatars over time for an individual and in a group. In addition, we compared the expressiveness and ease of subjective annotation between collaborative tagging and controllable avatars. The results indicate that the feelings evoked by avatars are consistent in both cases, and that using controllable avatars is easier than collaborative tagging for representing feelings elicited by contents that do not express meaning, such as photos.
Detection and quantification of flow consistency in business process models.
Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel; Soffer, Pnina; Weber, Barbara
2018-01-01
Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics addressing these challenges, each following a different view of flow consistency. We then report the results of an empirical evaluation, which indicates which metric is more effective in predicting the human perception of this feature. Moreover, two other automatic evaluations describing the performance and the computational capabilities of our metrics are reported as well.
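A flow-direction consistency metric of the kind discussed above can be made concrete in a few lines. The sketch below is a deliberately simplified stand-in, not one of the paper's three metrics: it quantizes each edge of the model's layout to a compass direction and reports the fraction of edges following the dominant one.

```python
from collections import Counter

def flow_consistency(edges):
    """Illustrative flow-direction consistency: the fraction of edges whose
    direction (quantized to left/right/up/down) matches the dominant one.
    `edges` is a list of ((x1, y1), (x2, y2)) coordinate pairs."""
    def direction(e):
        (x1, y1), (x2, y2) = e
        dx, dy = x2 - x1, y2 - y1
        if abs(dx) >= abs(dy):
            return "right" if dx >= 0 else "left"
        return "down" if dy >= 0 else "up"
    dirs = Counter(direction(e) for e in edges)
    return dirs.most_common(1)[0][1] / len(edges)

# three edges flow right, one flows down → consistency 0.75
edges = [((0, 0), (2, 0)), ((2, 0), (4, 1)), ((4, 1), (6, 1)), ((6, 1), (6, 3))]
print(flow_consistency(edges))  # → 0.75
```

The challenges the paper raises (diagonal edges, bent connectors, weighting by edge length) are exactly what such a naive quantization glosses over.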
Effect of irradiation on Brazilian honeys' consistency and their acceptability
International Nuclear Information System (INIS)
Matsuda, A.H.; Sabato, S.F.
2004-01-01
Contamination of bee products may occur during packing or even during the process of collection. Gamma irradiation was found to decrease the number of bacteria and fungi. However, little information is available on the effects of gamma irradiation on viscosity, which is an important property of honey. In this work the viscosity of two varieties of Brazilian honey was measured after irradiation at 5 and 10 kGy. The viscosity was measured at four temperatures (25 deg. C, 30 deg. C, 35 deg. C and 40 deg. C) for both samples and compared with the control and between the doses. The sensory evaluation was carried out for the parameters color, odor, taste and consistency, using a 9-point hedonic scale. All the data were treated with a statistical tool (Statistica 5.1, StatSoft, 1998). The viscosity was not impaired significantly by gamma irradiation at doses of 5 and 10 kGy (p<0.05). The effect of gamma irradiation on sensorial characteristics (odor, color, taste and consistency) is presented. The taste for the Parana type indicated a significant difference among irradiation doses (p<0.05), but the higher value was for the 5 kGy dose, demonstrating acceptability in this case. For the Organic honey, the taste score at 10 kGy was significantly lower than the control mean, but it did not differ significantly from the 5 kGy value.
Neutronic data consistency analysis for lithium blanket and shield design
International Nuclear Information System (INIS)
Reupke, W.A.; Muir, D.W.
1976-01-01
Using a compact least-squares treatment we analyze the consistency of evaluated cross sections with calculated and measured tritium production in ⁿLi and ⁷Li detectors embedded in a 14-MeV neutron driven ⁿLiD sphere. The tritium production experimental error matrix is evaluated and an initial reduced χ² of 3.0 is calculated. A perturbation calculation of the tritium production cross section sensitivities is performed with secondary neutron energy and angular distributions held constant. The cross section error matrix is evaluated by the external consistency of available cross section measurements. A statistical adjustment of the combined data yields a reduced χ² of 2.3 and represents a tenfold improvement in statistical likelihood. The improvement is achieved by a decrease in the ⁷Li(n,xt) 14-MeV group cross section from 328 mb to 284 mb and an adjustment of the ⁿLi data closer to calculated values. The uncertainty in the tritium breeding ratio in pure ⁷LiD is reduced by one-fifth.
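The least-squares adjustment step can be sketched generically. The notation below (discrepancy vector, measurement covariance, sensitivity matrix, prior parameter spread) is standard generalized-least-squares bookkeeping and is not taken from the paper; the numbers in the example are invented.

```python
import numpy as np

def least_squares_adjust(y, V, S, sigma_p):
    """Generalized least-squares adjustment:
    y       - vector of (measured - calculated) discrepancies
    V       - covariance matrix of the measurements
    S       - sensitivity matrix dy/dp of responses to parameters
    sigma_p - prior standard deviations of the parameters
    Returns the parameter adjustments and the reduced chi-square
    before adjustment."""
    P = np.diag(sigma_p ** 2)       # prior parameter covariance
    C = S @ P @ S.T + V             # total response covariance
    Cinv = np.linalg.inv(C)
    dp = P @ S.T @ Cinv @ y         # GLS parameter update
    chi2_red = float(y @ Cinv @ y) / len(y)
    return dp, chi2_red

# one response, one parameter: a discrepancy of 1.0 with measurement
# variance 0.25 and unit prior is pulled 80% of the way toward the data
y = np.array([1.0]); V = np.array([[0.25]])
S = np.array([[1.0]]); sigma_p = np.array([1.0])
dp, chi2 = least_squares_adjust(y, V, S, sigma_p)
print(dp, chi2)  # dp ≈ [0.8], reduced chi-square ≈ 0.8
```

A drop in the reduced χ² after applying `dp`, as in the abstract's 3.0 to 2.3, is the signature that the adjusted cross sections and the integral measurements have been brought into better mutual consistency.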
Consistent three-equation model for thin films
Richard, Gael; Gisclon, Marguerite; Ruyer-Quil, Christian; Vila, Jean-Paul
2017-11-01
Numerical simulations of thin films of Newtonian fluids down an inclined plane use reduced models for computational cost reasons. These models are usually derived by averaging the physical equations of fluid mechanics over the fluid depth with an asymptotic method in the long-wave limit. Two-equation models are based on the mass conservation equation and either on the momentum balance equation or on the work-energy theorem. We show that there is no two-equation model that is both consistent and theoretically coherent and that a third variable and a three-equation model are required to resolve all theoretical contradictions. The linear and nonlinear properties of two- and three-equation models are tested on various practical problems. We present a new consistent three-equation model with a simple mathematical structure which allows an easy and reliable numerical resolution. The numerical calculations agree fairly well with experimental measurements or with direct numerical simulations for neutral stability curves, the speed of kinematic and solitary waves, and the depth profiles of wavy films. The model can also predict the flow reversal at the first capillary trough ahead of the main wave hump.
Criteria for the generation of spectra consistent time histories
International Nuclear Information System (INIS)
Lin, C.-W.
1977-01-01
Several methods are available for the seismic analysis of nuclear power plant systems and components. Among them, the response spectrum technique has been most widely adopted for linear modal analysis. However, for designs that involve structural or material nonlinearities, such as frequency-dependent soil properties, the existence of gaps, single tie rods, and friction between supports, where the response has to be computed as a function of time, the time history approach is the only viable method of analysis. Two examples of time history analysis are: (1) soil-structure interaction studies and (2) a coupled reactor coolant system and building analysis, used either to generate floor response spectra or to compute nonlinear system time history response. The generation of a suitable time history input for the analysis has been discussed in the literature. Some general guidelines are available to ensure that the time history input will be as conservative as the design response spectra. Very little has been reported on the effect of the dynamic characteristics of the time history input upon the system response; in fact, the only available discussion in this respect concerns the statistical independence of the time history components. In this paper, numerical results for cases using the time history approach are presented. Criteria are also established which may be advantageously used to arrive at spectra-consistent time histories which are conservative and, more importantly, realistic. (Auth.)
A model for cytoplasmic rheology consistent with magnetic twisting cytometry.
Butler, J P; Kelly, S M
1998-01-01
Magnetic twisting cytometry is gaining wide applicability as a tool for the investigation of the rheological properties of cells and the mechanical properties of receptor-cytoskeletal interactions. Current technology involves the application and release of magnetically induced torques on small magnetic particles bound to or inside cells, with measurements of the resulting angular rotation of the particles. The properties of purely elastic or purely viscous materials can be determined by the angular strain and strain rate, respectively. However, the cytoskeleton and its linkage to cell surface receptors display elastic, viscous, and even plastic deformation, and the simultaneous characterization of these properties using only elastic or viscous models is internally inconsistent. Data interpretation is complicated by the fact that in current technology, the applied torques are not constant in time, but decrease as the particles rotate. This paper describes an internally consistent model consisting of a parallel viscoelastic element in series with a parallel viscoelastic element, and one approach to quantitative parameter evaluation. The unified model reproduces all essential features seen in data obtained from a wide variety of cell populations, and contains the pure elastic, viscoelastic, and viscous cases as subsets.
Self-consistent chaos in the beam-plasma instability
International Nuclear Information System (INIS)
Tennyson, J.L.; Meiss, J.D.
1993-01-01
The effect of self-consistency on Hamiltonian systems with a large number of degrees of freedom is investigated for the beam-plasma instability using the single-wave model of O'Neil, Winfrey, and Malmberg. The single-wave model is reviewed and then rederived within the Hamiltonian context, which leads naturally to canonical action-angle variables. Simulations are performed with a large number (10⁴) of beam particles interacting with the single wave. It is observed that the system relaxes into a time-asymptotic periodic state where only a few collective degrees of freedom are active; namely, a clump of trapped particles oscillating in a modulated wave, within a uniform chaotic sea with oscillating phase-space boundaries. Thus self-consistency is seen to effectively reduce the number of degrees of freedom. A simple low-degree-of-freedom model is derived that treats the clump as a single macroparticle interacting with the wave and chaotic sea. The uniform chaotic sea is modeled by a fluid waterbag, where the waterbag boundaries correspond approximately to invariant tori. This low-degree-of-freedom model is seen to compare well with the simulation.
Planck 2013 results. XXXI. Consistency of the Planck data
Ade, P A R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A.J; Barreiro, R.B; Battaner, E; Benabed, K; Benoit-Levy, A; Bernard, J.P; Bersanelli, M; Bielewicz, P; Bond, J.R; Borrill, J; Bouchet, F.R; Burigana, C; Cardoso, J.F; Catalano, A; Challinor, A; Chamballu, A; Chiang, H.C; Christensen, P.R; Clements, D.L; Colombi, S; Colombo, L.P.L; Couchot, F; Coulais, A; Crill, B.P; Curto, A; Cuttaia, F; Danese, L; Davies, R.D; Davis, R.J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Desert, F.X; Dickinson, C; Diego, J.M; Dole, H; Donzelli, S; Dore, O; Douspis, M; Dupac, X; Ensslin, T.A; Eriksen, H.K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Giard, M; Gonzalez-Nuevo, J; Gorski, K.M.; Gratton, S.; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F.K; Hanson, D; Harrison, D; Henrot-Versille, S; Herranz, D; Hildebrandt, S.R; Hivon, E; Hobson, M; Holmes, W.A.; Hornstrup, A; Hovest, W.; Huffenberger, K.M; Jaffe, T.R; Jaffe, A.H; Jones, W.C; Keihanen, E; Keskitalo, R; Knoche, J; Kunz, M; Kurki-Suonio, H; Lagache, G; Lahteenmaki, A; Lamarre, J.M; Lasenby, A; Lawrence, C.R; Leonardi, R; Leon-Tavares, J; Lesgourgues, J; Liguori, M; Lilje, P.B; Linden-Vornle, M; Lopez-Caniego, M; Lubin, P.M; Macias-Perez, J.F; Maino, D; Mandolesi, N; Maris, M; Martin, P.G; Martinez-Gonzalez, E; Masi, S; Matarrese, S; Mazzotta, P; Meinhold, P.R; Melchiorri, A; Mendes, L; Mennella, A; Migliaccio, M; Mitra, S; Miville-Deschenes, M.A; Moneti, A; Montier, L; Morgante, G; Mortlock, D; Moss, A; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Norgaard-Nielsen, H.U; Noviello, F; Novikov, D; Novikov, I; Oxborrow, C.A; Pagano, L; Pajot, F; Paoletti, D; Partridge, B; Pasian, F; Patanchon, G; Pearson, D; Pearson, T.J; Perdereau, O; Perrotta, F; Piacentini, F; Piat, M; Pierpaoli, E; Pietrobon, D; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Popa, L; Pratt, G.W; Prunet, S; Puget, J.L; Rachen, J.P; Reinecke, M; Remazeilles, M; 
Renault, C; Ricciardi, S.; Ristorcelli, I; Rocha, G.; Roudier, G; Rubino-Martin, J.A; Rusholme, B; Sandri, M; Scott, D; Stolyarov, V; Sudiwala, R; Sutton, D; Suur-Uski, A.S; Sygnet, J.F; Tauber, J.A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L.A; Wandelt, B.D; Wehus, I K; White, S D M; Yvon, D; Zacchei, A; Zonca, A
2014-01-01
The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved foreground emission. In this paper, we analyse the level of consistency achieved in the 2013 Planck data. We concentrate on comparisons between the 70, 100, and 143 GHz channel maps and power spectra, particularly over the angular scales of the first and second acoustic peaks, on maps masked for diffuse Galactic emission and for strong unresolved sources. Difference maps covering angular scales from 8°...
Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering
Sicat, Ronell Barrera
2014-12-31
This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
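The core consistency argument, applying the transfer function to the pdf of a voxel neighborhood rather than to its averaged value, can be shown in miniature. This sketch uses a discrete two-bin pdf and a hypothetical step transfer function; it illustrates the principle only, not the paper's 4D Gaussian-mixture machinery.

```python
import numpy as np

def tf_on_pdf(pdf_values, pdf_weights, tf):
    """Apply a transfer function to a voxel-neighborhood pdf: the rendered
    value is the expectation E[tf(x)] under the pdf, rather than tf(E[x]).
    The latter is what naive down-sampling computes, and it is
    inconsistent for any nonlinear tf."""
    return np.sum(pdf_weights * tf(pdf_values))

# neighborhood holds half "air" (intensity 0.0) and half "bone" (1.0)
values = np.array([0.0, 1.0])
weights = np.array([0.5, 0.5])
tf = lambda x: (x > 0.5).astype(float)   # nonlinear step transfer function

print(tf_on_pdf(values, weights, tf))    # E[tf] = 0.5: half the mass is bone
print(tf(np.array([values @ weights])))  # tf(mean) = 0: the bone vanishes
```

The pdf-based result stays the same at every resolution level, whereas the mean-then-classify result changes as soon as averaging mixes the two materials, which is exactly the inconsistency the sparse pdf volume representation avoids.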
Self-consistent electron transport in collisional plasmas
International Nuclear Information System (INIS)
Mason, R.J.
1982-01-01
A self-consistent scheme has been developed to model electron transport in evolving plasmas of arbitrary classical collisionality. The electrons and ions are treated as either multiple donor-cell fluids, or collisional particles-in-cell. Particle suprathermal electrons scatter off ions, and drag against fluid background thermal electrons. The background electrons undergo ion friction, thermal coupling, and bremsstrahlung. The components move in self-consistent advanced E-fields, obtained by the Implicit Moment Method, which permits Δt ≫ ωₚ⁻¹ and Δx ≫ λ_D, offering a 10²-10³-fold speed-up over older explicit techniques. The fluid description for the background plasma components permits the modeling of transport in systems spanning more than a 10⁷-fold change in density, and encompassing contiguous collisional and collisionless regions. Results are presented from application of the scheme to the modeling of CO₂ laser-generated suprathermal electron transport in expanding thin foils, and in multi-foil target configurations.
Efficient self-consistency for magnetic tight binding
Soin, Preetma; Horsfield, A. P.; Nguyen-Manh, D.
2011-06-01
Tight binding can be extended to magnetic systems by including an exchange interaction on an atomic site that favours net spin polarisation. We have used a published model, extended to include long-ranged Coulomb interactions, to study defects in iron. We have found that achieving self-consistency using conventional techniques was either unstable or very slow. By formulating the problem of achieving charge and spin self-consistency as a search for stationary points of a Harris-Foulkes functional, extended to include spin, we have derived a much more efficient scheme based on a Newton-Raphson procedure. We demonstrate the capabilities of our method by looking at vacancies and self-interstitials in iron. Self-consistency can indeed be achieved in a more efficient and stable manner, but care needs to be taken to manage this. The algorithm is implemented in the code PLATO. Program summary: Program title: PLATO. Catalogue identifier: AEFC_v2_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEFC_v2_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 228 747. No. of bytes in distributed program, including test data, etc.: 1 880 369. Distribution format: tar.gz. Programming language: C and PERL. Computer: Apple Macintosh, PC, Unix machines. Operating system: Unix, Linux, Mac OS X, Windows XP. Has the code been vectorised or parallelised?: Yes, up to 256 processors tested. RAM: Up to 2 Gbytes per processor. Classification: 7.3. External routines: LAPACK, BLAS and optionally ScaLAPACK, BLACS, PBLAS, FFTW. Catalogue identifier of previous version: AEFC_v1_0. Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 2616. Does the new version supersede the previous version?: Yes. Nature of problem: Achieving charge and spin self-consistency in magnetic tight binding can be very
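A Newton-Raphson search for self-consistency, as opposed to plain mixing, can be sketched as root finding on the residual between input and output charges. The toy linear "output charge" map below stands in for one tight-binding evaluation; it is an illustrative assumption, not PLATO's Harris-Foulkes functional.

```python
import numpy as np

def newton_scf(F, q0, tol=1e-10, max_iter=50, h=1e-6):
    """Find self-consistent charges q* with q* = F(q*) by Newton-Raphson
    on the residual R(q) = F(q) - q, using a finite-difference Jacobian.
    F stands in for one output-charge evaluation of the model."""
    q = np.asarray(q0, dtype=float)
    for _ in range(max_iter):
        R = F(q) - q
        if np.linalg.norm(R) < tol:
            break
        n = len(q)
        J = np.empty((n, n))
        for j in range(n):                      # finite-difference dR/dq
            dq = np.zeros(n); dq[j] = h
            J[:, j] = (F(q + dq) - (q + dq) - R) / h
        q = q - np.linalg.solve(J, R)           # Newton-Raphson step
    return q

# toy output-charge map with a known fixed point at q = [0.5, -0.5]
F = lambda q: 0.5 * q + np.array([0.25, -0.25])
print(newton_scf(F, np.array([0.0, 0.0])))
```

For this linear map Newton converges in one step, where simple linear mixing would need many iterations; the stability benefit the abstract reports comes from the same quadratic convergence, at the cost of building (or approximating) the Jacobian.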
DEFF Research Database (Denmark)
2003-01-01
Image city exhibition explores a condition of mediation, through a focus on image and sound narratives with a point of departure on a number of Asian cities.
East India Company Logbooks - Images
National Oceanic and Atmospheric Administration, Department of Commerce — This collection consists of images of 1,235 ship logbooks created during British East India Company voyages. Period of record 1786-1834, peaking in 1804. The...
Hyperspectral band selection based on consistency-measure of neighborhood rough set theory
International Nuclear Information System (INIS)
Liu, Yao; Xie, Hong; Wang, Liguo; Tan, Kezhu; Chen, Yuehua; Xu, Zhen
2016-01-01
Band selection is a well-known approach for reducing dimensionality in hyperspectral imaging. In this paper, a band selection method based on consistency-measure of neighborhood rough set theory (CMNRS) was proposed to select informative bands from hyperspectral images. A decision-making information system was established by the reflection spectrum of soybeans’ hyperspectral data between 400 nm and 1000 nm wavelengths. The neighborhood consistency-measure, which reflects not only the size of the decision positive region, but also the sample distribution in the boundary region, was used as the evaluation function of band significance. The optimal band subset was selected by a forward greedy search algorithm. A post-pruning strategy was employed to overcome the over-fitting problem and find the minimum subset. To assess the effectiveness of the proposed band selection technique, two classification models (extreme learning machine (ELM) and random forests (RF)) were built. The experimental results showed that the proposed algorithm can effectively select key bands and obtain satisfactory classification accuracy. (paper)
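The forward greedy search used above is independent of the particular significance measure. The sketch below uses a simple correlation criterion as a stand-in for the neighborhood consistency-measure (any larger-is-better score plugs in the same way); the data and score are invented for illustration.

```python
import numpy as np

def greedy_band_selection(X, y, n_bands, score):
    """Forward greedy search: repeatedly add the band that most improves
    score(X_subset, y). `score` stands in for the consistency-measure
    used as the evaluation function of band significance."""
    selected, remaining = [], list(range(X.shape[1]))
    while len(selected) < n_bands:
        best_band, best_score = None, -np.inf
        for b in remaining:
            s = score(X[:, selected + [b]], y)
            if s > best_score:
                best_band, best_score = b, s
        selected.append(best_band)
        remaining.remove(best_band)
    return selected

# toy criterion: absolute correlation of the subset mean with the labels
def corr_score(Xs, y):
    return abs(np.corrcoef(Xs.mean(axis=1), y)[0, 1])

rng = np.random.default_rng(0)
y = np.tile([0.0, 1.0], 50)
X = rng.normal(size=(100, 5))
X[:, 3] += 3 * y                      # band 3 carries the class signal
print(greedy_band_selection(X, y, 1, corr_score))  # → [3]
```

The post-pruning step mentioned in the abstract would then remove bands from `selected` whose deletion does not reduce the score, yielding the minimum subset.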
Are exposure index values consistent in clinical practice? A multi-manufacturer investigation
International Nuclear Information System (INIS)
Butler, M. L.; Rainford, L.; Last, J.; Brennan, P. C.
2010-01-01
The advent of digital radiography poses the risk of unnoticed increases in patient dose. Manufacturers have responded to this by offering an exposure index (EI) value to the clinician. Whilst the EI value is a measure of the air kerma at the detector surface, it has been recommended by international agencies as a method of monitoring radiation dose to the patient. Recent studies by the group have shown that EI values are being used in clinical practice to monitor radiation dose and assess image quality. This study aims to compare the clinical consistency of the EI value in computed radiography (CR) and direct digital radiography (DR) systems. An anthropomorphic phantom was used to simulate four common radiographic examinations: skull, pelvis, chest and hand. These examinations were chosen as they provide contrasting exposure parameters, image detail and radiation dose measurements. Systems from four manufacturers were used for comparison: Agfa-Gevaert CR, Carestream CR, Philips Digital Diagnost DR and Siemens DR. For each examination, the phantom was placed in the optimal position and exposure parameters were chosen in accordance with European guidelines and clinical practice. Multiple exposures were taken and the EI recorded. All exposure parameters and clinical conditions remained constant throughout. For both DR systems, the EI values remained consistent throughout; no significant change was noted in any examination. In both CR systems, there were noteworthy fluctuations in the EI values for all examinations. The largest for the Agfa system was a variation of 1.88-2.21 for the skull examination, which represents to the clinician a doubling of detector dose despite all exposure parameters remaining constant. In the Kodak system, the largest fluctuation was seen for the chest examination, where the EI ranged from 2560 to 2660, representing an increase of approximately 30% in radiation dose despite consistent parameters. The fluctuations seen with the CR systems are most likely
Portable Imaging Polarimeter and Imaging Experiments; TOPICAL
International Nuclear Information System (INIS)
PHIPPS, GARY S.; KEMME, SHANALYN A.; SWEATT, WILLIAM C.; DESCOUR, M.R.; GARCIA, J.P.; DERENIAK, E.L.
1999-01-01
Polarimetry is the method of recording the state of polarization of light. Imaging polarimetry extends this method to recording the spatially resolved state of polarization within a scene. Imaging-polarimetry data have the potential to improve the detection of man-made objects in natural backgrounds. We have constructed a midwave infrared complete imaging polarimeter consisting of a fixed wire-grid polarizer and a rotating form-birefringent retarder. The retardance and the orientation angles of the retarder were optimized to minimize the sensitivity of the instrument to noise in the measurements. The optimal retardance was found to be 132° rather than the typical 90°. The complete imaging polarimeter utilized a liquid-nitrogen-cooled PtSi camera. The fixed wire-grid polarizer was located at the cold stop inside the camera dewar. The complete imaging polarimeter was operated in the 4.42-5 μm spectral range. A series of imaging experiments was performed using as targets a surface of water, an automobile, and an aircraft. Further analysis of the polarization measurements revealed that in all three cases the magnitude of circular polarization was comparable to the noise in the calculated Stokes-vector components.
Consistency in performance evaluation reports and medical records.
Lu, Mingshan; Ma, Ching-to Albert
2002-12-01
In the health care market managed care has become the latest innovation for the delivery of services. For efficient implementation, the managed care organization relies on accurate information, so clinicians are often asked to report on patients before referrals are approved, treatments authorized, or insurance claims processed. What are clinicians' responses to solicitation of information by managed care organizations? The existing health literature has already pointed out the importance of provider gaming, sincere reporting, nudging, and dodging the rules. We assess the consistency of clinicians' reports on clients across administrative data and clinical records. For about 1,000 alcohol abuse treatment episodes, we compare clinicians' reports across two data sets. The first one, the Maine Addiction Treatment System (MATS), was an administrative data set; the state government used it for program performance monitoring and evaluation. The second was a set of medical record abstracts taken directly from the clinical records of treatment episodes. A clinician's reporting practice exhibits an inconsistency if the information reported in MATS differs from the information reported in the medical record in a statistically significant way. We look for evidence of inconsistencies in five categories: admission alcohol use frequency, discharge alcohol use frequency, termination status, admission employment status, and discharge employment status. Chi-square tests, Kappa statistics, and sensitivity and specificity tests are used for hypothesis testing. Multiple imputation methods are employed to address the problem of missing values in the record abstract data set. For admission and discharge alcohol use frequency measures, we find, respectively, strong and supporting evidence for inconsistencies. We find equally strong evidence for consistency in reports of admission and discharge employment status, and mixed evidence on report consistency for termination status. Patterns of
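Kappa statistics of the kind used in such report-consistency studies compare observed agreement between two sources against the agreement expected by chance. The episode labels below are hypothetical, not the study's data.

```python
from collections import Counter

def cohens_kappa(a, b):
    """Cohen's kappa: agreement between two reports on the same episodes,
    corrected for chance agreement. Values near 1 indicate consistent
    reporting; values near 0 indicate chance-level consistency."""
    assert len(a) == len(b)
    n = len(a)
    p_obs = sum(x == y for x, y in zip(a, b)) / n
    ca, cb = Counter(a), Counter(b)
    p_exp = sum(ca[k] * cb[k] for k in set(a) | set(b)) / (n * n)
    return (p_obs - p_exp) / (1 - p_exp)

# hypothetical administrative vs. medical-record termination status
admin   = ["completed", "completed", "dropped", "completed",
           "dropped", "completed", "completed", "dropped"]
medical = ["completed", "completed", "dropped", "dropped",
           "dropped", "completed", "completed", "completed"]
print(round(cohens_kappa(admin, medical), 3))  # → 0.467
```

Here raw agreement is 75%, but because "completed" dominates both sources, the chance-corrected kappa of about 0.47 signals only moderate consistency, which is why such studies report kappa alongside simple agreement rates.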
Making the Sustainable Development Goals Consistent with Sustainability
Directory of Open Access Journals (Sweden)
Mathis Wackernagel
2017-07-01
Full Text Available The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG index to assess countries’ average performance on the SDGs. Ranking high on the SDG index strongly correlates with high per-person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor because the lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.
A Secure Localization Approach against Wormhole Attacks Using Distance Consistency
Directory of Open Access Journals (Sweden)
Lou Wei
2010-01-01
Full Text Available Wormhole attacks can negatively affect localization in wireless sensor networks. A typical wormhole attack is launched by two colluding attackers, one of which sniffs packets at one point in the network and tunnels them through a wired or wireless link to another point, where the other relays them within its vicinity. In this paper, we investigate the impact of the wormhole attack on localization and propose a novel distance-consistency-based secure localization scheme against wormhole attacks, which includes three phases: wormhole attack detection, valid locator identification, and self-localization. A theoretical model is further formulated to analyze the proposed secure localization scheme. The simulation results validate the theoretical results and also demonstrate the effectiveness of our proposed scheme.
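As a rough illustration of a distance-consistency check of this kind (a sketch under assumed conventions, not the authors' scheme), claimed locator distances can be cross-checked against the pairwise triangle inequality before localizing with the surviving locators:

```python
import numpy as np

def consistent_locators(locators, dists):
    """Flag locators whose claimed distances violate the triangle
    inequality against other locators. A wormhole-tunneled locator
    appears much closer than its position allows, so it accumulates
    violations; a locator is kept if it violates fewer than half
    of its pairings. (Illustrative rule, names hypothetical.)"""
    locators = np.asarray(locators, float)
    dists = np.asarray(dists, float)
    n = len(dists)
    violations = np.zeros(n, int)
    for i in range(n):
        for j in range(i + 1, n):
            sep = np.linalg.norm(locators[i] - locators[j])
            # true distances must satisfy |d_i - d_j| <= sep <= d_i + d_j
            if not (abs(dists[i] - dists[j]) <= sep <= dists[i] + dists[j]):
                violations[i] += 1
                violations[j] += 1
    return violations < (n - 1) / 2.0

def localize(locators, dists, iters=25):
    """Least-squares position estimate (Gauss-Newton) from valid locators."""
    L = np.asarray(locators, float)
    d = np.asarray(dists, float)
    pos = L.mean(axis=0)                  # initial guess: centroid
    for _ in range(iters):
        r = np.maximum(np.linalg.norm(pos - L, axis=1), 1e-12)
        J = (pos - L) / r[:, None]        # Jacobian of the range residuals
        step, *_ = np.linalg.lstsq(J, d - r, rcond=None)
        pos = pos + step
    return pos
```

In a small 2-D example with one wormhole-shortened distance claim, the inconsistent locator fails the pairwise test and the remaining locators recover the true position.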
Simulations of tokamak disruptions including self-consistent temperature evolution
International Nuclear Information System (INIS)
Bondeson, A.
1986-01-01
Three-dimensional simulations of tokamaks have been carried out, including self-consistent temperature evolution with a highly anisotropic thermal conductivity. The simulations extend over the transport time-scale and address the question of how disruptive current profiles arise at low-q or high-density operation. Sharply defined disruptive events are triggered by the m/n=2/1 resistive tearing mode, which is mainly affected by local current gradients near the q=2 surface. If the global current gradient between q=2 and q=1 is sufficiently steep, the m=2 mode starts a shock which accelerates towards the q=1 surface, leaving stochastic fields, a flattened temperature profile and turbulent plasma behind it. For slightly weaker global current gradients, a shock may form, but it will dissipate before reaching q=1 and may lead to repetitive minidisruptions which flatten the temperature profile in a region inside the q=2 surface. (author)
Thermodynamically consistent model of brittle oil shales under overpressure
Izvekov, Oleg
2016-04-01
The concept of dual porosity is a common way to simulate oil shale production. In the frame of this concept, the porous fractured medium is considered as a superposition of two permeable continua with mass exchange. As a rule, the concept does not take into account such well-known phenomena as slip along natural fractures, overpressure in the low-permeability matrix, and so on. Overpressure can lead to the development of secondary fractures in the low-permeability matrix during drilling and during the pressure reduction that accompanies production. In this work a new thermodynamically consistent model which generalizes the dual-porosity model is proposed. The particularities of the model are as follows. The set of natural fractures is considered as a permeable continuum. Damage mechanics is applied to simulate the development of secondary fractures in the low-permeability matrix. Slip along natural fractures is simulated within the framework of plasticity theory with the Drucker-Prager criterion.
A consistency analysis on the tokamak reactor plasmas
International Nuclear Information System (INIS)
Fukuyama, A.; Itoh, S.-I.; Itoh, K.
1990-12-01
The parameter regime which simultaneously fulfills the various physics constraints is sought for ITER-grade tokamaks, using a consistency analysis code. It is found that, if the energy confinement time reaches 1.6 times the prediction of the L-mode scaling law, a Q-value of about 4 is possible for full current drive operation at an input power P_in of 100 MW (Q is the ratio of fusion output to P_in). In the ignition mode, where half of the current is inductively sustained, Q approaches 15 for this circulating power. If only the L-mode is realized, Q is about 1.5 for P_in ≅ 100 MW. (author)
A self-consistent spin-diffusion model for micromagnetics
Abert, Claas; Ruggeri, Michele; Bruckner, Florian; Vogler, Christoph; Manchon, Aurelien; Praetorius, Dirk; Suess, Dieter
2016-01-01
We propose a three-dimensional micromagnetic model that dynamically solves the Landau-Lifshitz-Gilbert equation coupled to the full spin-diffusion equation. In contrast to previous methods, we solve for the magnetization dynamics and the electric potential in a self-consistent fashion. This treatment allows for an accurate description of magnetization-dependent resistance changes. Moreover, the presented algorithm describes both spin accumulation due to smooth magnetization transitions and due to material interfaces as in multilayer structures. The model and its finite-element implementation are validated by current-driven motion of a magnetic vortex structure. In a second experiment, the resistivity of a magnetic multilayer structure is investigated as a function of the tilting angle of the magnetization in the different layers. Both examples show good agreement with reference simulations and experiments, respectively.
ER=EPR, GHZ, and the consistency of quantum measurements
International Nuclear Information System (INIS)
Susskind, Leonard
2016-01-01
This paper illustrates various aspects of the ER=EPR conjecture. It begins with a brief heuristic argument, using the Ryu-Takayanagi correspondence, for why entanglement between black holes implies the existence of Einstein-Rosen bridges. The main part of the paper addresses a fundamental question: Is ER=EPR consistent with the standard postulates of quantum mechanics? Naively it seems to lead to an inconsistency between observations made on entangled systems by different observers. The resolution of the paradox lies in the properties of multiple black holes, entangled in the Greenberger-Horne-Zeilinger pattern. The last part of the paper is about entanglement as a resource for quantum communication. ER=EPR provides a way to visualize protocols like quantum teleportation. In some sense teleportation takes place through the wormhole, but as usual, classical communication is necessary to complete the protocol. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)
Kinematic Analysis of Continuum Robot Consisted of Driven Flexible Rods
Directory of Open Access Journals (Sweden)
Yingzhong Tian
2016-01-01
Full Text Available This paper presents the kinematic analysis of a continuum bionic robot with three flexible actuation rods. Since the motion of the end-effector is actuated by the deformation of the rods, the robot structure has high elasticity and good compliance, and its kinematic analysis requires special treatment. We propose a kinematic model based on the geometry of constant curvature. The analysis consists of two independent mappings: a general mapping for the kinematics of all robots and a specific mapping for this kind of robot. Both mappings are developed for a single section and for multiple sections. Through this paper, we aim to provide a guide for the kinematic analysis of similar manipulators.
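The constant-curvature mapping mentioned in this abstract has a standard closed form; the sketch below follows commonly used conventions (bend angle θ = κℓ, bending plane selected by an angle φ about the section's base z-axis) and is not the paper's own implementation:

```python
import numpy as np

def rot_z(a):
    """Homogeneous rotation about the z-axis."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0],
                     [0, 0, 1, 0], [0, 0, 0, 1]])

def arc_transform(kappa, phi, length):
    """Homogeneous transform of one constant-curvature section:
    an arc of curvature kappa and arc length `length`, bending in
    the plane selected by phi."""
    theta = kappa * length
    if abs(kappa) < 1e-12:               # straight-section limit
        T = np.eye(4)
        T[2, 3] = length
    else:
        c, s = np.cos(theta), np.sin(theta)
        # in-plane arc: translate to ((1-c)/kappa, 0, s/kappa), rotate about y
        T = np.array([[c, 0, s, (1 - c) / kappa],
                      [0, 1, 0, 0],
                      [-s, 0, c, s / kappa],
                      [0, 0, 0, 1]])
    return rot_z(phi) @ T @ rot_z(-phi)

def forward_kinematics(sections):
    """Chain per-section transforms for a multi-section continuum robot."""
    T = np.eye(4)
    for kappa, phi, length in sections:
        T = T @ arc_transform(kappa, phi, length)
    return T
```

This is the "general" (arc-parameters-to-pose) mapping; the paper's "specific" mapping, from the three rod lengths to (κ, φ, ℓ), depends on the rod geometry and is not reproduced here.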
Self-consistent determination of quasiparticle properties in nuclear matter
International Nuclear Information System (INIS)
Oset, E.; Palanques-Mestre, A.
1981-01-01
The self-energy of nuclear matter is calculated by focusing attention on the energy- and momentum-dependent pieces which determine the quasiparticle properties. A microscopic approach is followed which starts from the boson exchange picture for the NN interaction; the π- and ρ-mesons are then shown to play a major role in the nucleon renormalization. The calculation is done self-consistently, and the effective mass and pole strength are determined as functions of the nuclear density and momentum. Particular emphasis is put on the non-static character of the interaction and its consequences. Finally a comparison is made with other calculations and with experimental results. The consequences of the nucleon renormalization for pion condensation are also examined, with the result that the critical density is pushed up appreciably. (orig.)
Changes in forest productivity across Alaska consistent with biome shift.
Beck, Pieter S A; Juday, Glenn P; Alix, Claire; Barber, Valerie A; Winslow, Stephen E; Sousa, Emily E; Heiser, Patricia; Herriges, James D; Goetz, Scott J
2011-04-01
Global vegetation models predict that boreal forests are particularly sensitive to a biome shift during the 21st century. This shift would manifest itself first at the biome's margins, with evergreen forest expanding into current tundra while being replaced by grasslands or temperate forest at the biome's southern edge. We evaluated changes in forest productivity since 1982 across boreal Alaska by linking satellite estimates of primary productivity and a large tree-ring data set. Trends in both records show consistent growth increases at the boreal-tundra ecotones that contrast with drought-induced productivity declines throughout interior Alaska. These patterns support the hypothesized effects of an initiating biome shift. Ultimately, tree dispersal rates, habitat availability and the rate of future climate change, and how it changes disturbance regimes, are expected to determine where the boreal biome will undergo a gradual geographic range shift, and where a more rapid decline. © 2011 Blackwell Publishing Ltd/CNRS.
Self-Consistent Dynamical Model of the Broad Line Region
Energy Technology Data Exchange (ETDEWEB)
Czerny, Bozena [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Li, Yan-Rong [Key Laboratory for Particle Astrophysics, Institute of High Energy Physics, Chinese Academy of Sciences, Beijing (China); Sredzinska, Justyna; Hryniewicz, Krzysztof [Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Panda, Swayam [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Copernicus Astronomical Center, Polish Academy of Sciences, Warsaw (Poland); Wildy, Conor [Center for Theoretical Physics, Polish Academy of Sciences, Warsaw (Poland); Karas, Vladimir, E-mail: bcz@cft.edu.pl [Astronomical Institute, Czech Academy of Sciences, Prague (Czech Republic)
2017-06-22
We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of emission lines. The model predicts the inner and outer radius of the region, the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency) and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.
Optimization of nonthermal fusion power consistent with energy channeling
International Nuclear Information System (INIS)
Snyder, P.B.; Herrmann, M.C.; Fisch, N.J.
1995-02-01
If the energy of charged fusion products can be diverted directly to fuel ions, non-Maxwellian fuel ion distributions and temperature differences between species will result. To determine the importance of these nonthermal effects, the fusion power density is optimized at constant β for nonthermal distributions that are self-consistently maintained by channeling of energy from charged fusion products. For D-T and D-³He reactors, with 75% of charged fusion product power diverted to fuel ions, temperature differences between electrons and ions increase the reactivity by 40-70%, while non-Maxwellian fuel ion distributions and temperature differences between ionic species increase the reactivity by an additional 3-15%.
A photon position sensor consisting of single-electron circuits
International Nuclear Information System (INIS)
Kikombo, Andrew Kilinga; Amemiya, Yoshihito; Tabe, Michiharu
2009-01-01
This paper proposes a solid-state sensor that can detect the position of incident photons with high spatial resolution. The sensor consists of a two-dimensional array of single-electron oscillators, each coupled to its neighbors through coupling capacitors. An incident photon triggers an excitatory circular wave of electron tunneling in the oscillator array. The wave propagates in all directions to reach the periphery of the array. By measuring the arrival time of the wave at the periphery, we can determine the position of the incident photon. The tunneling wave's generation, propagation, and arrival at the array periphery, and the determination of incident photon positions, are demonstrated with the results of Monte Carlo-based computer simulations.
Self-consistent modeling of amorphous silicon devices
International Nuclear Information System (INIS)
Hack, M.
1987-01-01
The authors developed a computer model to describe the steady-state behaviour of a range of amorphous silicon devices. It is based on the complete set of transport equations and takes into account the important role played by the continuous distribution of localized states in the mobility gap of amorphous silicon. Using one set of parameters, they have been able to self-consistently simulate the current-voltage characteristics of p-i-n (or n-i-p) solar cells under illumination, the dark behaviour of field-effect transistors, p-i-n diodes and n-i-n diodes in both the ohmic and space-charge-limited regimes. This model also describes the steady-state photoconductivity of amorphous silicon, in particular its dependence on temperature, doping and illumination intensity.
Self-consistent expansion for the molecular beam epitaxy equation.
Katzav, Eytan
2002-03-01
Motivated by a controversy over the correct results derived from the dynamic renormalization group (DRG) analysis of the nonlinear molecular beam epitaxy (MBE) equation, a self-consistent expansion for the nonlinear MBE theory is considered. The scaling exponents are obtained for spatially correlated noise of the general form D(r − r′, t − t′) = 2D₀ |r − r′|^(2ρ−d) δ(t − t′). I find a lower critical dimension d_c(ρ) = 4 + 2ρ, above which the linear MBE solution appears. Below the lower critical dimension a ρ-dependent strong-coupling solution is found. These results help to resolve the controversy over the correct exponents that describe nonlinear MBE, using a reliable method that proved itself in the past by giving reasonable results for the strong-coupling regime of the Kardar-Parisi-Zhang system (for d > 1), where DRG failed to do so.
Planck 2013 results. XXXI. Consistency of the Planck data
DEFF Research Database (Denmark)
Ade, P. A. R.; Arnaud, M.; Ashdown, M.
2014-01-01
The Planck design and scanning strategy provide many levels of redundancy that can be exploited to provide tests of internal consistency. One of the most important is the comparison of the 70 GHz (amplifier) and 100 GHz (bolometer) channels. Based on different instrument technologies, with feeds located differently in the focal plane, analysed independently by different teams using different software, and near the minimum of diffuse foreground emission, these channels are in effect two different experiments. The 143 GHz channel has the lowest noise level on Planck, and is near the minimum of unresolved ... in the HFI channels would result in shifts in the posterior distributions of parameters of less than 0.3σ except for As, the amplitude of the primordial curvature perturbations at 0.05 Mpc⁻¹, which changes by about 1σ. We extend these comparisons to include the sky maps from the complete nine-year mission ...
Self-consistent Langmuir waves in resonantly driven thermal plasmas
Lindberg, R. R.; Charman, A. E.; Wurtele, J. S.
2007-12-01
The longitudinal dynamics of a resonantly driven Langmuir wave are analyzed in the limit that the growth of the electrostatic wave is slow compared to the bounce frequency. Using simple physical arguments, the nonlinear distribution function is shown to be nearly invariant in the canonical particle action, provided both a spatially uniform term and higher-order spatial harmonics are included along with the fundamental in the longitudinal electric field. Requirements of self-consistency with the electrostatic potential yield the basic properties of the nonlinear distribution function, including a frequency shift that agrees closely with driven, electrostatic particle simulations over a range of temperatures. This extends earlier work on nonlinear Langmuir waves by Morales and O'Neil [G. J. Morales and T. M. O'Neil, Phys. Rev. Lett. 28, 417 (1972)] and Dewar [R. L. Dewar, Phys. Fluids 15, 712 (1972)], and could form the basis of a reduced kinetic treatment of plasma dynamics for accelerator applications or Raman backscatter.
Self-consistent Langmuir waves in resonantly driven thermal plasmas
International Nuclear Information System (INIS)
Lindberg, R. R.; Charman, A. E.; Wurtele, J. S.
2007-01-01
The longitudinal dynamics of a resonantly driven Langmuir wave are analyzed in the limit that the growth of the electrostatic wave is slow compared to the bounce frequency. Using simple physical arguments, the nonlinear distribution function is shown to be nearly invariant in the canonical particle action, provided both a spatially uniform term and higher-order spatial harmonics are included along with the fundamental in the longitudinal electric field. Requirements of self-consistency with the electrostatic potential yield the basic properties of the nonlinear distribution function, including a frequency shift that agrees closely with driven, electrostatic particle simulations over a range of temperatures. This extends earlier work on nonlinear Langmuir waves by Morales and O'Neil [G. J. Morales and T. M. O'Neil, Phys. Rev. Lett. 28, 417 (1972)] and Dewar [R. L. Dewar, Phys. Fluids 15, 712 (1972)], and could form the basis of a reduced kinetic treatment of plasma dynamics for accelerator applications or Raman backscatter.
Self-consistent, relativistic, ferromagnetic band structure of gadolinium
International Nuclear Information System (INIS)
Harmon, B.N.; Schirber, J.; Koelling, D.D.
1977-01-01
An initial self-consistent calculation of the ground state magnetic band structure of gadolinium is described. A linearized APW method was used which included all single-particle relativistic effects except spin-orbit coupling. The spin-polarized potential was obtained in the muffin-tin form using the local spin density approximation for exchange and correlation. The most striking and unorthodox aspect of the results is the position of the 4f spin-down "bands", which are required to float just on top of the Fermi level in order to obtain convergence. If the 4f states (l = 3 resonance) are removed from the occupied region of the conduction bands, the magnetic moment is approximately 0.75 μ_B/atom; however, as the 4f spin-down states are allowed to find their own position, they hybridize with the conduction bands at the Fermi level and the moment becomes smaller. Means of improving the calculation are discussed.
Multirobot FastSLAM Algorithm Based on Landmark Consistency Correction
Directory of Open Access Journals (Sweden)
Shi-Ming Chen
2014-01-01
Full Text Available Considering the influence of uncertain map information on the multirobot SLAM problem, a multirobot FastSLAM algorithm based on landmark consistency correction is proposed. Firstly, an electromagnetism-like mechanism is introduced into the resampling procedure of single-robot FastSLAM: each sampling particle is regarded as a charged particle, and the attraction-repulsion mechanism of an electromagnetic field is used to simulate the interactive forces between particles and thereby improve the particle distribution. Secondly, when multiple robots observe the same landmarks, each robot is regarded as a node and a Kalman-Consensus Filter is used to update the landmark information, which further improves the accuracy of localization and mapping. Finally, simulation results show that the algorithm is suitable and effective.
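An electromagnetism-like adjustment of particles can be sketched as follows. The force law, the step size, and the use of importance weights as "charges" are assumptions made for illustration; they are not taken from the paper:

```python
import numpy as np

def em_like_move(particles, weights, step=0.1):
    """Sketch of an electromagnetism-like particle adjustment (assumed
    form): each particle carries a "charge" given by its importance
    weight; a particle is attracted toward higher-weight particles and
    repelled from lower-weight ones, with an interaction strength that
    decays with squared distance. The best particle is left in place."""
    X = np.asarray(particles, float)
    w = np.asarray(weights, float)
    n = len(w)
    F = np.zeros_like(X)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            d = X[j] - X[i]
            dist2 = d @ d + 1e-12
            q = w[i] * w[j] / dist2          # interaction strength
            F[i] += q * d if w[j] > w[i] else -q * d
    # move each particle a fixed step along its net force direction
    X = X + step * F / (np.linalg.norm(F, axis=1, keepdims=True) + 1e-12)
    best = np.argmax(w)
    X[best] = np.asarray(particles, float)[best]
    return X
```

The intended effect, as described in the abstract, is to spread low-weight particles toward high-weight regions before resampling, improving the particle distribution.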
Self-consistent mean-field models for nuclear structure
International Nuclear Information System (INIS)
Bender, Michael; Heenen, Paul-Henri; Reinhard, Paul-Gerhard
2003-01-01
The authors review the present status of self-consistent mean-field (SCMF) models for describing nuclear structure and low-energy dynamics. These models are presented as effective energy-density functionals. The three most widely used variants of SCMF, based on a Skyrme energy functional, a Gogny force, and a relativistic mean-field Lagrangian, are considered side by side. The crucial role of the treatment of pairing correlations is pointed out in each case. The authors discuss other related nuclear structure models and present several extensions beyond the mean-field model which are currently used. Phenomenological adjustment of the model parameters is discussed in detail. The performance quality of the SCMF model is demonstrated for a broad range of typical applications.
Making the Sustainable Development Goals Consistent with Sustainability
Energy Technology Data Exchange (ETDEWEB)
Wackernagel, Mathis, E-mail: mathis.wackernagel@footprintnetwork.org; Hanscom, Laurel; Lin, David [Global Footprint Network, Oakland, CA (United States)
2017-07-11
The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG index to assess countries’ average performance on the SDGs. Ranking high on the SDG index strongly correlates with high per-person demand on nature (or “Footprints”), and low ranking with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor because the lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.
Migraine patients consistently show abnormal vestibular bedside tests
Directory of Open Access Journals (Sweden)
Eliana Teixeira Maranhão
2015-01-01
Full Text Available Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. Objective: To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Method: Cross-sectional study including sixty individuals – thirty migraineurs, 25 women, 19-60 y-o; and 30 gender/age healthy paired controls. Results: Migraineurs showed a tendency to perform worse in almost all tests, albeit only the Romberg tandem test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Conclusion: Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.
Migraine patients consistently show abnormal vestibular bedside tests.
Maranhão, Eliana Teixeira; Maranhão-Filho, Péricles; Luiz, Ronir Raggio; Vincent, Maurice Borges
2016-01-01
Migraine and vertigo are common disorders, with lifetime prevalences of 16% and 7% respectively, and co-morbidity around 3.2%. Vestibular syndromes and dizziness occur more frequently in migraine patients. We investigated bedside clinical signs indicative of vestibular dysfunction in migraineurs. To test the hypothesis that vestibulo-ocular reflex, vestibulo-spinal reflex and fall risk (FR) responses as measured by 14 bedside tests are abnormal in migraineurs without vertigo, as compared with controls. Cross-sectional study including sixty individuals - thirty migraineurs, 25 women, 19-60 y-o; and 30 gender/age healthy paired controls. Migraineurs showed a tendency to perform worse in almost all tests, albeit only the Romberg tandem test was statistically different from controls. A combination of four abnormal tests better discriminated the two groups (93.3% specificity). Migraine patients consistently showed abnormal vestibular bedside tests when compared with controls.
Sustaining biological welfare for our future through consistent science
Directory of Open Access Journals (Sweden)
Shimomura Yoshihiro
2013-01-01
Full Text Available Physiological anthropology presently covers a very broad range of human knowledge and engineering technologies. This study reviews scientific inconsistencies within a variety of areas: sitting posture; negative air ions; oxygen inhalation; alpha brain waves induced by music and ultrasound; 1/f fluctuations; the evaluation of feelings using surface electroencephalography; Kansei; universal design; and anti-stress issues. We found that the inconsistencies within these areas indicate the importance of integrative thinking and the need to maintain a perspective on the biological benefit to humanity. Analytical science divides human physiological functions into discrete details, although individuals comprise a unified collection of whole-body functions. Such disparate considerations contribute to the misunderstanding of physiological functions and the misevaluation of positive and negative values for humankind. Research related to human health will, in future, depend on the concept of maintaining physiological functions based on consistent science and on sustaining human health to maintain biological welfare in future generations.
A New Heteroskedastic Consistent Covariance Matrix Estimator using Deviance Measure
Directory of Open Access Journals (Sweden)
Nuzhat Aftab
2016-06-01
Full Text Available In this article we propose a new heteroskedasticity-consistent covariance matrix estimator, HC6, based on a deviance measure. We study the finite-sample behavior of the new estimator and compare it with other estimators of this kind (HC1, HC3 and HC4m), which are used in the presence of leverage observations. A simulation study is conducted to examine the effect of various levels of heteroskedasticity on the size and power of the quasi-t test with HC estimators. Results show that the test statistic based on the newly suggested estimator has better asymptotic approximation and less size distortion than the other estimators for small sample sizes when a high level of heteroskedasticity is present in the data.
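For reference, the classical sandwich-form HC estimators that proposals like HC6 are benchmarked against follow a standard construction (HC0 and HC3 shown below; the paper's deviance-based HC6 itself is not reproduced here):

```python
import numpy as np

def hc_cov(X, resid, kind="HC3"):
    """Heteroskedasticity-consistent covariance of OLS coefficients,
    (X'X)^{-1} X' diag(omega) X (X'X)^{-1}. HC0 is White's estimator
    (omega_i = e_i^2); HC3 inflates each squared residual by
    (1 - h_i)^{-2}, where h_i is the leverage of observation i,
    to guard against influential (high-leverage) points."""
    X = np.asarray(X, float)
    resid = np.asarray(resid, float)
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)   # hat-matrix diagonal
    if kind == "HC0":
        omega = resid ** 2
    elif kind == "HC3":
        omega = resid ** 2 / (1 - h) ** 2
    else:
        raise ValueError(kind)
    meat = (X * omega[:, None]).T @ X
    return XtX_inv @ meat @ XtX_inv
```

Since (1 − h_i)² < 1, the HC3 variances are never smaller than HC0's, which is why leverage-adjusted variants are preferred in small samples with influential observations.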
Back to the Future: Consistency-Based Trajectory Tracking
Kurien, James; Nayak, P. Pandurand; Norvig, Peter (Technical Monitor)
2000-01-01
Given a model of a physical process and a sequence of commands and observations received over time, the task of an autonomous controller is to determine the likely states of the process and the actions required to move the process to a desired configuration. We introduce a representation and algorithms for incrementally generating approximate belief states for a restricted but relevant class of partially observable Markov decision processes with very large state spaces. The algorithm presented incrementally generates, rather than revises, an approximate belief state at any point by abstracting and summarizing segments of the likely trajectories of the process. This enables applications to efficiently maintain a partial belief state when it remains consistent with observations and revisit past assumptions about the process' evolution when the belief state is ruled out. The system presented has been implemented and results on examples from the domain of spacecraft control are presented.
Consistency of the tachyon warm inflationary universe models
International Nuclear Information System (INIS)
Zhang, Xiao-Min; Zhu, Jian-Yang
2014-01-01
This study concerns the consistency of the tachyon warm inflationary models. A linear stability analysis is performed to find the slow-roll conditions, characterized by the potential slow-roll (PSR) parameters, for the existence of a tachyon warm inflationary attractor in the system. The PSR parameters in the tachyon warm inflationary models are redefined. Two cases, an exponential potential and an inverse power-law potential, are studied, with the dissipative coefficient Γ = Γ₀ and Γ = Γ(φ), respectively. A crucial condition is obtained for a tachyon warm inflationary model, characterized by the Hubble slow-roll (HSR) parameter ε_H, and the condition is extendable to some other inflationary models as well. A proper number of e-folds is obtained in both cases of the tachyon warm inflation, in contrast to existing works. It is also found that a constant dissipative coefficient (Γ = Γ₀) is usually not a suitable assumption for a warm inflationary model.
Detection and quantification of flow consistency in business process models
DEFF Research Database (Denmark)
Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel
2017-01-01
Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics ...
Self-consistent simulation of the CSR effect
International Nuclear Information System (INIS)
Li, R.; Bohn, C.L.; Bisogano, J.J.
1998-01-01
When a microbunch with high charge traverses a curved trajectory, the curvature-induced bunch self-interaction, by way of coherent synchrotron radiation (CSR) and space-charge forces, may cause serious emittance degradation. In this paper, the authors present a self-consistent simulation for the study of the impact of CSR on beam optics. The dynamics of the bunch under the influence of the CSR forces is simulated using macroparticles, where the CSR force in turn depends on the history of bunch dynamics in accordance with causality. The simulation is benchmarked with analytical results obtained for a rigid-line bunch. Here they present the algorithm used in the simulation, along with the simulation results obtained for bending systems in the Jefferson Lab (JLab) free-electron-laser (FEL) lattice
A self-consistent spin-diffusion model for micromagnetics
Abert, Claas
2016-12-17
We propose a three-dimensional micromagnetic model that dynamically solves the Landau-Lifshitz-Gilbert equation coupled to the full spin-diffusion equation. In contrast to previous methods, we solve for the magnetization dynamics and the electric potential in a self-consistent fashion. This treatment allows for an accurate description of magnetization-dependent resistance changes. Moreover, the presented algorithm describes both spin accumulation due to smooth magnetization transitions and due to material interfaces as in multilayer structures. The model and its finite-element implementation are validated by current-driven motion of a magnetic vortex structure. In a second experiment, the resistivity of a magnetic multilayer structure is investigated as a function of the tilting angle of the magnetization in the different layers. Both examples show good agreement with reference simulations and experiments, respectively.
Self-Consistent Dynamical Model of the Broad Line Region
Directory of Open Access Journals (Sweden)
Bozena Czerny
2017-06-01
We develop a self-consistent description of the Broad Line Region based on the concept of a failed wind powered by radiation pressure acting on a dusty accretion-disk atmosphere in Keplerian motion. The material raised high above the disk is illuminated, dust evaporates, and the matter falls back toward the disk. This material is the source of the emission lines. The model predicts the inner and outer radius of the region and the cloud dynamics under the dust radiation pressure and, subsequently, the gravitational field of the central black hole, which results in an asymmetry between the rise and fall. Knowledge of the dynamics allows us to predict the shapes of the emission lines as functions of the basic parameters of an active nucleus: black hole mass, accretion rate, black hole spin (or accretion efficiency), and the viewing angle with respect to the symmetry axis. Here we show preliminary results based on analytical approximations to the cloud motion.
Business architecture management architecting the business for consistency and alignment
Simon, Daniel
2015-01-01
This book presents a comprehensive overview of enterprise architecture management with a specific focus on the business aspects. While recent approaches to enterprise architecture management have dealt mainly with aspects of information technology, this book covers all areas of business architecture, from business motivation and models to business execution. The book provides examples of how architectural thinking can be applied in these areas, thus combining different perspectives into a consistent whole. In-depth experiences from end-user organizations help readers to understand the abstract concepts of business architecture management and to form blueprints for their own professional approach. Business architecture professionals, researchers, and others working in the field of strategic business management will benefit from this comprehensive volume and its hands-on examples of successful business architecture management practices.
Making the Sustainable Development Goals Consistent with Sustainability
International Nuclear Information System (INIS)
Wackernagel, Mathis; Hanscom, Laurel; Lin, David
2017-01-01
The UN’s Sustainable Development Goals (SDGs) are the most significant global effort so far to advance global sustainable development. Bertelsmann Stiftung and the Sustainable Development Solutions Network released an SDG Index to assess countries’ average performance on the SDGs. Ranking high on the SDG Index correlates strongly with a high per-person demand on nature (or “Footprint”), and ranking low with low Footprints, making evident that the SDGs as expressed today vastly underperform on sustainability. Such underperformance is anti-poor, because the lowest-income people exposed to resource insecurity will lack the financial means to shield themselves from the consequences. Given the significance of the SDGs for guiding development, rigorous accounting is essential for making them consistent with the goals of sustainable development: thriving within the means of planet Earth.
Statistically Consistent k-mer Methods for Phylogenetic Tree Reconstruction.
Allman, Elizabeth S; Rhodes, John A; Sullivant, Seth
2017-02-01
Frequencies of k-mers in sequences are sometimes used as a basis for inferring phylogenetic trees without first obtaining a multiple sequence alignment. We show that a standard approach of using the squared Euclidean distance between k-mer vectors to approximate a tree metric can be statistically inconsistent. To remedy this, we derive model-based distance corrections for orthologous sequences without gaps, which lead to consistent tree inference. The identifiability of model parameters from k-mer frequencies is also studied. Finally, we report simulations showing that the corrected distance outperforms many other k-mer methods, even when sequences are generated with an insertion and deletion process. These results have implications for multiple sequence alignment as well since k-mer methods are usually the first step in constructing a guide tree for such algorithms.
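The abstract's starting point, the squared Euclidean distance between k-mer frequency vectors, can be sketched in a few lines. This is the plain, uncorrected distance the paper shows can be statistically inconsistent; the model-based corrections derived in the paper are not reproduced here, and the helper names and example sequences are illustrative only.

```python
from collections import Counter
from itertools import product

def kmer_vector(seq, k, alphabet="ACGT"):
    """Frequency vector of k-mers in seq, in a fixed order over the alphabet."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    total = max(len(seq) - k + 1, 1)
    return [counts["".join(p)] / total for p in product(alphabet, repeat=k)]

def squared_euclidean(u, v):
    """Squared Euclidean distance between two equal-length vectors."""
    return sum((a - b) ** 2 for a, b in zip(u, v))

# Two sequences differing by one base; k = 2 gives 16-dimensional vectors.
d = squared_euclidean(kmer_vector("ACGTACGT", 2), kmer_vector("ACGTAGGT", 2))
print(round(d, 4))  # -> 0.0816
```

Alignment-free methods such as this skip the multiple sequence alignment entirely, which is why they are popular for building guide trees; the paper's point is that the raw distance needs a model-based correction before it reliably approximates a tree metric.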
On the consistency of classical and quantum supergravity theories
Energy Technology Data Exchange (ETDEWEB)
Hack, Thomas-Paul [II. Institute for Theoretical Physics, University of Hamburg (Germany); Makedonski, Mathias [Department of Mathematical Sciences, University of Copenhagen (Denmark); Schenkel, Alexander [Department of Stochastics, University of Wuppertal (Germany)
2012-07-01
It is known that pure N=1 supergravity in d=4 spacetime dimensions is consistent at a classical and quantum level, i.e. that in a particular gauge the field equations assume a hyperbolic form - ensuring causal propagation of the degrees of freedom - and that the associated canonical quantum field theory satisfies unitarity. It seems, however, that it is yet unclear whether these properties persist if one considers the more general and realistic case of N=1, d=4 supergravity theories including arbitrary matter fields. We partially clarify the issue by introducing novel hyperbolic gauges for the gravitino field and proving that they commute with the resulting equations of motion. Moreover, we review recent partial results on the unitarity of these general supergravity theories and suggest first steps towards a comprehensive unitarity proof.
Multiple intelligences: a consistent feature of ethical leadership in financial institutions.
Directory of Open Access Journals (Sweden)
Diamela Nava
2015-03-01
This study aims to provide a theoretically grounded contrast analysis of multiple intelligences as a consistent feature of ethical leadership in financial institutions. The research was conducted under a qualitative, descriptive approach using document analysis. The findings suggest that multiple intelligences support the implementation of the capabilities needed to achieve organizational objectives and, from a rational point of view, offer a way to assess the cognitive abilities that integrate human talent in organizations. The role of the leader is therefore to guide and support the development of human potential within the group, as a community of interest, in order to achieve the aspirations of the organization, using intelligence as a strategic tool in different ways so as not to limit imagination, judgment, and cooperative action.
Consistent constraints on the Standard Model Effective Field Theory
International Nuclear Information System (INIS)
Berthier, Laure; Trott, Michael
2016-01-01
We develop the global constraint picture in the (linear) effective field theory generalisation of the Standard Model, incorporating data from detectors that operated at PEP, PETRA, TRISTAN, SpS, Tevatron, SLAC, LEP I and LEP II, as well as low-energy precision data. We fit one hundred and three observables. We develop a theory error metric for this effective field theory, which is required when constraints on parameters at leading order in the power counting are to be pushed to the percent level, or beyond, unless the cut-off scale is assumed to be large, Λ ≳ 3 TeV. We more consistently incorporate theoretical errors in this work, avoiding this assumption, and as a direct consequence bounds on some leading parameters are relaxed. We show how an S, T analysis is modified by the theory errors we include, as an illustrative example.
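The core statistical idea behind a theory error metric can be illustrated generically: a finite-order EFT prediction carries its own uncertainty, which is added in quadrature to the experimental error in the chi-square, so that very precise measurements do not over-constrain the fit. This is a minimal sketch of that quadrature treatment, not the paper's actual 103-observable fit; the function name and the numbers are invented for illustration.

```python
def chi_square(observed, predicted, sigma_exp, sigma_th):
    """Chi-square with a theory uncertainty added in quadrature to the
    experimental error for each observable."""
    return sum(
        (o - p) ** 2 / (se ** 2 + st ** 2)
        for o, p, se, st in zip(observed, predicted, sigma_exp, sigma_th)
    )

obs = [0.2315, 80.385]       # illustrative pseudo-observables
pred = [0.2312, 80.362]      # illustrative leading-order predictions
err_exp = [0.0004, 0.015]
err_th = [0.0002, 0.010]     # the theory error inflates each denominator
print(chi_square(obs, pred, err_exp, err_th))
```

Setting `err_th` to zero recovers the usual experimental-only chi-square and yields tighter (but, per the abstract, potentially misleading) bounds; including it is what relaxes the bounds on some leading parameters.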