WorldWideScience

Sample records for largely northwest-striking normal

  1. Normal zone soliton in large composite superconductors

    International Nuclear Information System (INIS)

    Kupferman, R.; Mints, R.G.; Ben-Jacob, E.

    1992-01-01

    The study of normal zones of finite size (normal domains) in superconductors has long been a subject of interest in applied superconductivity. It was shown that in homogeneous superconductors normal domains are always unstable: if a normal domain nucleates, it will either expand or shrink. While testing the stability of large cryostable composite superconductors, a new phenomenon was found: the existence of stable propagating normal solitons. The formation of these propagating domains was shown to result from the high Joule power generated in the superconductor during the relatively long process of current redistribution between the superconductor and the stabilizer. Theoretical studies were performed to investigate the propagation of normal domains in large composite superconductors in the cryostable regime. Huang and Eyssa performed numerical calculations simulating the diffusion of heat and current redistribution in the conductor, and showed the existence of stable propagating normal domains. They compared the velocity of normal domain propagation with the experimental data, obtaining reasonable agreement. Dresner presented an analytical method to solve this problem when the time dependence of the Joule power is given. He performed explicit calculations of the normal domain velocity assuming that the Joule power decays exponentially during the process of current redistribution. In this paper, the authors propose a system of two one-dimensional diffusion equations describing the dynamics of the temperature and current density distributions along the conductor. Numerical simulations of the equations reconfirm the existence of propagating domains in the cryostable regime, while an analytical investigation supplies an explicit formula for the velocity of the normal domain.
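
    A schematic form of such a coupled system (not the authors' exact equations, which the abstract does not reproduce) is a heat-diffusion equation driven by Joule heating, coupled to a diffusion equation for the current density as it redistributes into the stabilizer:

$$C(T)\,\frac{\partial T}{\partial t} = \frac{\partial}{\partial x}\!\left(\kappa(T)\,\frac{\partial T}{\partial x}\right) + \rho\,J^{2}, \qquad \frac{\partial J}{\partial t} = D_J\,\frac{\partial^{2} J}{\partial x^{2}},$$

    where C is the heat capacity, κ the thermal conductivity, ρ the effective resistivity (nonzero only in the normal zone), and D_J a diffusivity governing current transfer between superconductor and stabilizer.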

  2. On Normalized Compression Distance and Large Malware

    OpenAIRE

    Borbely, Rebecca Schuller

    2015-01-01

    Normalized Compression Distance (NCD) is a popular tool that uses compression algorithms to cluster and classify data in a wide range of applications. Existing discussions of NCD's theoretical merit rely on certain theoretical properties of compression algorithms. However, we demonstrate that many popular compression algorithms don't seem to satisfy these theoretical properties. We explore the relationship between some of these properties and file size, demonstrating that this theoretical pro...
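
    For reference, the NCD of two byte strings is computed directly from compressed lengths, NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)). A minimal sketch using Python's zlib standing in for the compressor C (a real compressor like this only approximates the idealized properties the paper examines):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized Compression Distance, with zlib as the compressor C."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

# Similar inputs compress well together, so their NCD is near 0;
# unrelated inputs yield values approaching (and sometimes exceeding) 1.
```

    Because zlib's 32 KB window violates the idealized compressor axioms for large inputs, values above 1 can occur, which is exactly the kind of behavior the paper investigates for large malware samples.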

  3. Large animal normal tissue tolerance with boron neutron capture.

    Science.gov (United States)

    Gavin, P R; Kraft, S L; DeHaan, C E; Swartz, C D; Griebenow, M L

    1994-03-30

    Normal tissue tolerance of boron neutron capture irradiation using borocaptate sodium (Na2B12H11SH) in an epithermal neutron beam was studied. Large retriever-type dogs were used and the irradiations were performed as a single dose through a 5 x 10 dorsal portal. Fourteen dogs were irradiated with the epithermal neutron beam alone and 35 dogs were irradiated following intravenous administration of borocaptate sodium. A total-body irradiation effect could be seen from the decreased leukocytes and platelets following irradiation. Most values returned to normal within 40 days postirradiation. Severe dermal necrosis occurred in animals given 15 Gy of epithermal neutrons alone and, following borocaptate sodium infusion, in animals irradiated to a total peak physical dose greater than 64 Gy. Lethal brain necrosis was seen in animals receiving between 27 and 39 Gy and occurred at 22-36 weeks postirradiation. A total peak physical dose of approximately 27 Gy at blood-boron concentrations of 25-50 ppm resulted in abnormal magnetic resonance imaging findings at the 6-month postirradiation examination. Seven of eight of these animals remained normal, and the lesions were not detected at the 12-month postirradiation examination. The bimodal therapy presents a complex challenge in attempting to achieve dose-response assays. The resultant total radiation dose is a composite of low- and high-LET components. The short track length of the boron fission fragments and the geometric effect of the vessels cause much of the intravascular dose to miss the presumed critical target of the endothelial cells. The results indicate a large dose-sparing effect from the boron capture reactions within the blood.

  4. Large animal normal tissue tolerance with boron neutron capture

    International Nuclear Information System (INIS)

    Gavin, P.R.; Swartz, C.D.; Kraft, S.L.; Griebenow, M.L.; DeHaan, C.E.

    1994-01-01

    Normal tissue tolerance of boron neutron capture irradiation using borocaptate sodium (Na2B12H11SH) in an epithermal neutron beam was studied. Large retriever-type dogs were used and the irradiations were performed as a single dose through a 5 x 10 dorsal portal. Fourteen dogs were irradiated with the epithermal neutron beam alone and 35 dogs were irradiated following intravenous administration of borocaptate sodium. A total-body irradiation effect could be seen from the decreased leukocytes and platelets following irradiation. Most values returned to normal within 40 days postirradiation. Severe dermal necrosis occurred in animals given 15 Gy of epithermal neutrons alone and, following borocaptate sodium infusion, in animals irradiated to a total peak physical dose greater than 64 Gy. Lethal brain necrosis was seen in animals receiving between 27 and 39 Gy and occurred at 22-36 weeks postirradiation. A total peak physical dose of approximately 27 Gy at blood-boron concentrations of 25-50 ppm resulted in abnormal magnetic resonance imaging findings at the 6-month postirradiation examination. Seven of eight of these animals remained normal, and the lesions were not detected at the 12-month postirradiation examination. The bimodal therapy presents a complex challenge in attempting to achieve dose-response assays. The resultant total radiation dose is a composite of low- and high-LET components. The short track length of the boron fission fragments and the geometric effect of the vessels cause much of the intravascular dose to miss the presumed critical target of the endothelial cells. The results indicate a large dose-sparing effect from the boron capture reactions within the blood. 23 refs., 6 figs., 2 tabs

  5. Optimal adaptive normalized matched filter for large antenna arrays

    KAUST Repository

    Kammoun, Abla

    2016-09-13

    This paper focuses on the problem of detecting a target in the presence of a compound Gaussian clutter with unknown statistics. To this end, we focus on the design of the adaptive normalized matched filter (ANMF) detector, which uses the regularized Tyler estimator (RTE) built from the N-dimensional observations x_1, ..., x_n in order to estimate the clutter covariance matrix. The choice of the RTE is motivated by two major attributes: first, its resilience to the presence of outliers, and second, its regularization parameter, which makes it more suitable for handling the scarcity of observations. In order to facilitate the design of the ANMF detector, we consider the regime in which n and N are both large. This allows us to derive closed-form expressions for the asymptotic false alarm and detection probabilities. Based on these expressions, we propose an asymptotically optimal setting for the regularization parameter of the RTE that maximizes the asymptotic detection probability while keeping the asymptotic false alarm probability below a certain threshold. Numerical results are provided in order to illustrate the gain of the proposed detector over a recently proposed setting of the regularization parameter.
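
    The two building blocks can be sketched in a few lines of NumPy. The fixed-point form of the RTE and the ANMF statistic below follow the standard definitions in this literature, with the normalization conventions and the convergence loop simplified relative to the paper:

```python
import numpy as np

def regularized_tyler(X, rho, n_iter=100):
    """Regularized Tyler estimator. X is N x n (n observations of dimension N),
    rho in (0, 1] is the regularization parameter. Plain fixed-point iteration."""
    N, n = X.shape
    S = np.eye(N)
    for _ in range(n_iter):
        Sinv = np.linalg.inv(S)
        # Quadratic forms x_i^H S^{-1} x_i for every observation i
        q = np.einsum('ij,ji->i', X.conj().T @ Sinv, X).real
        S = (1 - rho) * (N / n) * (X / q) @ X.conj().T + rho * np.eye(N)
    return S

def anmf_statistic(y, p, S):
    """ANMF detection statistic for observation y, steering vector p and
    estimated clutter covariance S; lies in [0, 1] by Cauchy-Schwarz."""
    Sinv = np.linalg.inv(S)
    num = np.abs(p.conj() @ Sinv @ y) ** 2
    den = (p.conj() @ Sinv @ p).real * (y.conj() @ Sinv @ y).real
    return num / den
```

    The target is declared present when the statistic exceeds a threshold chosen for the desired false alarm rate; the paper's contribution is the asymptotically optimal choice of rho in the large-n, large-N regime.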

  6. Optimal adaptive normalized matched filter for large antenna arrays

    KAUST Repository

    Kammoun, Abla; Couillet, Romain; Pascal, Frédéric; Alouini, Mohamed-Slim

    2016-01-01

    This paper focuses on the problem of detecting a target in the presence of a compound Gaussian clutter with unknown statistics. To this end, we focus on the design of the adaptive normalized matched filter (ANMF) detector, which uses the regularized Tyler estimator (RTE) built from the N-dimensional observations x_1, ..., x_n in order to estimate the clutter covariance matrix. The choice of the RTE is motivated by two major attributes: first, its resilience to the presence of outliers, and second, its regularization parameter, which makes it more suitable for handling the scarcity of observations. In order to facilitate the design of the ANMF detector, we consider the regime in which n and N are both large. This allows us to derive closed-form expressions for the asymptotic false alarm and detection probabilities. Based on these expressions, we propose an asymptotically optimal setting for the regularization parameter of the RTE that maximizes the asymptotic detection probability while keeping the asymptotic false alarm probability below a certain threshold. Numerical results are provided in order to illustrate the gain of the proposed detector over a recently proposed setting of the regularization parameter.

  7. A simple approximation to the bivariate normal distribution with large correlation coefficient

    NARCIS (Netherlands)

    Albers, Willem/Wim; Kallenberg, W.C.M.

    1994-01-01

    The bivariate normal distribution function is approximated with emphasis on situations where the correlation coefficient is large. The high accuracy of the approximation is illustrated by numerical examples. Moreover, exact upper and lower bounds are presented as well as asymptotic results on the

  8. Padé approximant for normal stress differences in large-amplitude oscillatory shear flow

    Science.gov (United States)

    Poungthong, P.; Saengow, C.; Giacomin, A. J.; Kolitawong, C.; Merger, D.; Wilhelm, M.

    2018-04-01

    Analytical solutions for the normal stress differences in large-amplitude oscillatory shear flow (LAOS), for continuum or molecular models, normally take the inexact form of the first few terms of a series expansion in the shear rate amplitude. Here, we improve the accuracy of these truncated expansions by replacing them with rational functions called Padé approximants. The recent advent of exact solutions in LAOS presents an opportunity to identify accurate and useful Padé approximants. For this identification, we replace the truncated expansion for the corotational Jeffreys fluid with its Padé approximants for the normal stress differences. We uncover the most accurate and useful approximant, the [3,4] approximant, and then test its accuracy against the exact solution [C. Saengow and A. J. Giacomin, "Normal stress differences from Oldroyd 8-constant framework: Exact analytical solution for large-amplitude oscillatory shear flow," Phys. Fluids 29, 121601 (2017)]. We use Ewoldt grids to show the stunning accuracy of our [3,4] approximant in LAOS. We quantify this accuracy with an objective function and then map it onto the Pipkin space. Our two applications illustrate how to use our new approximant reliably. For this, we use the Spriggs relations to generalize our best approximant to multimode, and then, we compare with measurements on molten high-density polyethylene and on dissolved polyisobutylene in isobutylene oligomer.
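
    A Padé approximant of this kind can be constructed from the truncated series coefficients by solving a small linear system for the denominator. The sketch below (illustrated on the exponential series, since the corotational Jeffreys coefficients are not reproduced in the abstract) builds a general [m/n] approximant; the paper's preferred case is m = 3, n = 4:

```python
import numpy as np

def pade_approx(c, m, n):
    """[m/n] Padé approximant from Taylor coefficients c[0..m+n].
    Returns ascending numerator coeffs a (len m+1) and denominator b (len n+1, b[0] = 1)."""
    c = np.asarray(c, dtype=float)
    # Denominator: sum_{j=0..n} b_j c_{m+k-j} = 0 for k = 1..n, with b_0 = 1
    A = np.array([[c[m + k - j] if m + k - j >= 0 else 0.0
                   for j in range(1, n + 1)] for k in range(1, n + 1)])
    b = np.concatenate(([1.0], np.linalg.solve(A, -c[m + 1:m + n + 1])))
    # Numerator: a_k = sum_{j=0..min(k,n)} b_j c_{k-j}
    a = np.array([sum(b[j] * c[k - j] for j in range(min(k, n) + 1))
                  for k in range(m + 1)])
    return a, b

def pade_eval(x, a, b):
    """Evaluate the rational approximant a(x)/b(x), coefficients ascending."""
    return np.polyval(a[::-1], x) / np.polyval(b[::-1], x)
```

    For the exponential series 1 + x + x²/2! + ..., the [3/4] approximant reproduces e^x near the origin to roughly eight digits, the kind of accuracy gain over the bare truncated series that the authors exploit for the normal stress differences.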

  9. Normal axonal ion channel function in large peripheral nerve fibers following chronic ciguatera sensitization.

    Science.gov (United States)

    Vucic, Steve; Kiernan, Matthew C

    2008-03-01

    Although the acute clinical effects of ciguatera poisoning, due to ingestion of ciguatoxin, are mediated by activation of transient Na+ channels, the mechanisms underlying ciguatera sensitization remain undefined. Axonal excitability studies were performed by stimulating the median motor and sensory nerves in two patients with ciguatera sensitization. Excitability parameters were all within normal limits, thereby arguing against dysfunction of axonal membrane ion channels in large-diameter fibers in ciguatera sensitization.

  10. RADIOMETRIC NORMALIZATION OF LARGE AIRBORNE IMAGE DATA SETS ACQUIRED BY DIFFERENT SENSOR TYPES

    Directory of Open Access Journals (Sweden)

    S. Gehrke

    2016-06-01

    Generating seamless mosaics of aerial images is a particularly challenging task when the mosaic comprises a large number of images, collected over longer periods of time and with different sensors under varying imaging conditions. Such large mosaics typically consist of very heterogeneous image data, both spatially (different terrain types and atmosphere) and temporally (unstable atmospheric properties and even changes in land coverage). We present a new radiometric normalization or, respectively, radiometric aerial triangulation approach that takes advantage of our knowledge about each sensor's properties. The current implementation supports medium and large format airborne imaging sensors of the Leica Geosystems family, namely the ADS line-scanner as well as DMC and RCD frame sensors. A hierarchical modelling, with parameters for the overall mosaic, the sensor type, different flight sessions, strips and individual images, allows for adaptation to each sensor's geometric and radiometric properties. Additional parameters at different hierarchy levels can compensate radiometric differences of various origins, making up for shortcomings of the preceding radiometric sensor calibration as well as BRDF and atmospheric corrections. The final, relative normalization is based on radiometric tie points in overlapping images, absolute radiometric control points and image statistics. It is computed in a global least squares adjustment for the entire mosaic by altering each image's histogram using a location-dependent mathematical model. This model involves contrast and brightness corrections at radiometric fix points, with bilinear interpolation for corrections in-between. The distribution of the radiometric fix points is adaptive to each image and generally increases with image size, hence enabling optimal local adaptation even for very long image strips as typically captured by a line-scanner sensor. The normalization approach is implemented in

  11. Normal limits of the electrocardiogram derived from a large database of Brazilian primary care patients.

    Science.gov (United States)

    Palhares, Daniel M F; Marcolino, Milena S; Santos, Thales M M; da Silva, José L P; Gomes, Paulo R; Ribeiro, Leonardo B; Macfarlane, Peter W; Ribeiro, Antonio L P

    2017-06-13

    Knowledge of the normal limits of the electrocardiogram (ECG) is mandatory for establishing which patients have abnormal ECGs. No studies have assessed the reference standards for a Latin American population. Our aim was to establish the normal ranges of the ECG for pediatric and adult Brazilian primary care patients. This retrospective observational study assessed all the consecutive 12-lead digital electrocardiograms of primary care patients at least 1 year old in Minas Gerais state, Brazil, recorded between 2010 and 2015. ECGs were excluded if there were technical problems, if selected abnormalities were present, or if the patient had selected self-declared comorbidities or was on drug therapy. Only the first ECG from patients with multiple ECGs was accepted. The University of Glasgow ECG analysis program was used to automatically interpret the ECGs. For each variable, the 1st, 2nd, 50th, 98th and 99th percentiles were determined and results were compared to selected studies. A total of 1,493,905 ECGs were recorded; 1,007,891 were excluded and 486,014 were analyzed. This large study provided normal values for heart rate; P, QRS and T frontal axes; P and QRS overall durations; PR and QT overall intervals; and QTc corrected by the Hodges, Bazett, Fridericia and Framingham formulae. Overall, the results were similar to those from other studies performed in different populations, but there were differences at extreme ages and in specific measurements. This study has provided reference values for Latinos of both sexes older than 1 year. Our results are comparable to studies performed in different populations.
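
    The percentile-based reference limits and the four QTc formulae mentioned above are straightforward to reproduce; a sketch with simulated values (the variable names and the illustrative data are ours, not the study's), with QT in ms, RR in s and heart rate in beats/min:

```python
import numpy as np

# Standard rate-correction formulae for the QT interval (QT in ms, RR in s, HR in bpm)
def qtc_bazett(qt, rr):      return qt / np.sqrt(rr)
def qtc_fridericia(qt, rr):  return qt / rr ** (1.0 / 3.0)
def qtc_framingham(qt, rr):  return qt + 154.0 * (1.0 - rr)
def qtc_hodges(qt, hr):      return qt + 1.75 * (hr - 60.0)

# Percentile-based normal limits, following the study's 1st/2nd/50th/98th/99th convention
rng = np.random.default_rng(42)
qt = rng.normal(400.0, 30.0, 100_000)   # simulated QT intervals, illustrative only
p1, p2, p50, p98, p99 = np.percentile(qt, [1, 2, 50, 98, 99])
```

    At a heart rate of 60 bpm (RR = 1 s) all four formulae leave the QT interval unchanged; they disagree increasingly as the rate departs from 60, which is why the study reports limits for each formula separately.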

  12. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique genes and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated in two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/). Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from

  13. Unusually large magnetic moments in the normal state and superconducting state of Sn nanoparticles

    Energy Technology Data Exchange (ETDEWEB)

    Hung, Chi-Hang; Lee, Chi-Hung; Hsu, Chien-Kang; Li, Chi-Yen; Karna, Sunil K.; Wang, Chin-Wei; Wu, Chun-Ming; Li, Wen-Hsien, E-mail: whli@phy.ncu.edu.tw [National Central University, Department of Physics and Center for Neutron Beam Applications (China)]

    2013-09-15

    We report on the observations of spontaneous magnetic moments in the normal as well as in the superconducting states of a 9 nm Sn nanoparticle assembly, through X-ray diffraction, magnetization, ac magnetic susceptibility, and neutron diffraction measurements. The saturation magnetization reaches an unexpectedly large value of 1.04 emu/g at 5 K, with a temperature profile that can be described by Bloch's law with an exponent of b = 1.8. A magnetic moment of ⟨μZ⟩ = 0.38 μB develops after cooling from 260 to 4 K. Superconductivity develops below TC = 3.98 K, which is 7% higher than the TC = 3.72 K of bulk Sn. Surprisingly, an additional magnetic moment of ⟨μZ⟩ = 0.05 μB develops upon entering the superconducting state.
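
    Bloch's law with a fitted exponent, as used here, takes the standard form

$$M(T) = M_0\left(1 - B\,T^{\,b}\right), \qquad b = 1.8,$$

    where M_0 is the zero-temperature saturation magnetization and B a fitting constant; the fitted b = 1.8 replaces the b = 3/2 of the ideal spin-wave case.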

  14. Large-scale grain growth in the solid-state process: From "Abnormal" to "Normal"

    Science.gov (United States)

    Jiang, Minhong; Han, Shengnan; Zhang, Jingwei; Song, Jiageng; Hao, Chongyan; Deng, Manjiao; Ge, Lingjing; Gu, Zhengfei; Liu, Xinyu

    2018-02-01

    Abnormal grain growth (AGG) has been a common phenomenon in ceramic and metallurgical processing since prehistoric times. However, it has usually been very difficult to grow large single crystals (over centimeter scale) using the AGG method, due to its so-called occasionality. Based on AGG, a solid-state crystal growth (SSCG) method was developed. The greatest advantages of the SSCG technology are the simplicity and cost-effectiveness of the technique. But the traditional SSCG technology is still uncontrollable. This article first summarizes the history and current status of AGG, then reports recent technical developments from AGG to SSCG, and further introduces a new seed-free, solid-state crystal growth (SFSSCG) technology. This SFSSCG method allows us to repeatedly and controllably fabricate large-scale single crystals of appreciably high quality and relatively stable chemical composition at a relatively low temperature, at least in (K0.5Na0.5)NbO3 (KNN) and Cu-Al-Mn systems. In this sense, the exaggerated grain growth is no longer 'Abnormal' but 'Normal', since it can now be artificially controlled and repeated. This article also provides a crystal growth model to qualitatively explain the mechanism of SFSSCG for the KNN system. Compared with the traditional melt and high-temperature solution growth methods, the SFSSCG method has the advantages of low energy consumption, low investment and a simple technique, and its compositional homogeneity overcomes the issues with incongruent melting and high volatility. This SFSSCG could be helpful for improving the mechanical and physical properties of single crystals, which should be promising for industrial applications.

  15. Tectonic geomorphology of large normal faults bounding the Cuzco rift basin within the southern Peruvian Andes

    Science.gov (United States)

    Byers, C.; Mann, P.

    2015-12-01

    The Cuzco basin forms an 80-km-wide, relatively flat valley within the High Andes of southern Peru. This larger basin includes the regional capital of Cuzco and the Urubamba Valley, or "Sacred Valley of the Incas", favored by the Incas for its mild climate and broader expanses of less rugged and arable land. The valley is bounded on its northern edge by a 100-km-long and 10-km-wide zone of down-to-the-south systems of normal faults that separate the lower area of the down-dropped plateau of central Peru from the more elevated area of the Eastern Cordillera foldbelt that overthrusts the Amazon lowlands to the east. Previous workers have shown that the normal faults are dip-slip with up to 600 m of measured displacement, reflect north-south extension, and have Holocene displacements, with some linked to destructive historical earthquakes. We have constructed topographic and structural cross sections across the entire area to demonstrate the effect of the normal fault zone on the plateau peneplain. The footwall of the Eastern Cordillera, capped by snowcapped peaks in excess of 6 km, tilts a peneplain surface northward, while the hanging wall of the Cuzco basin is radially arched. Erosion is accelerated along the trend of the normal fault zone. As the normal fault zone changes its strike from east-west to more northwest-southeast, normal displacement decreases and is replaced by a left-lateral strike-slip component.

  16. Normal zone detectors for a large number of inductively coupled coils

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this paper uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages is the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent.
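
    The essential cancellation step can be illustrated numerically: if the inductance matrix of the coupled coils is known, subtracting the predicted inductive voltages from the measured coil voltages leaves only the resistive normal-zone contribution. The matrix and signal values below are hypothetical, chosen only to show the arithmetic:

```python
import numpy as np

# Hypothetical inductance matrix (H) for three inductively coupled coils
M = np.array([[1.00, 0.20, 0.10],
              [0.20, 0.80, 0.15],
              [0.10, 0.15, 1.20]])
didt = np.array([50.0, -20.0, 10.0])        # measured dI/dt for each coil (A/s)
v_normal_true = np.array([0.0, 0.35, 0.0])  # a 0.35 V normal zone in coil 2

v_meas = M @ didt + v_normal_true           # what the coil terminals actually show
v_residual = v_meas - M @ didt              # bridge-style cancellation of inductive part
zone = int(np.argmax(np.abs(v_residual)))   # locate the coil carrying the normal zone
```

    In the paper's detector the cancellation is done by hardware bridges and the location and size follow from solving the combined equation sets; this sketch only shows why the small resistive voltage becomes observable once the much larger inductive terms are removed.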

  17. Normal zone detectors for a large number of inductively coupled coils. Revision 1

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this paper uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages is the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent. The effect on accuracy of changes in the system parameters is discussed.

  18. Normal zone detectors for a large number of inductively coupled coils

    International Nuclear Information System (INIS)

    Owen, E.W.; Shimer, D.W.

    1983-01-01

    In order to protect a set of inductively coupled superconducting magnets, it is necessary to locate and measure normal zone voltages that are small compared with the mutual and self-induced voltages. The method described in this report uses two sets of voltage measurements to locate and measure one or more normal zones in any number of coupled coils. One set of voltages is the outputs of bridges that balance out the self-induced voltages. The other set of voltages can be the voltages across the coils, although alternatives are possible. The two sets of equations form a single combined set of equations. Each normal zone location or combination of normal zones has a set of these combined equations associated with it. It is demonstrated that the normal zone can be located and the correct set chosen, allowing determination of the size of the normal zone. Only a few operations take place in a working detector: multiplication by a constant, addition, and simple decision-making. In many cases the detector for each coil, although weakly linked to the other detectors, can be considered to be independent. An example of the detector design is given for four coils with realistic parameters. The effect on accuracy of changes in the system parameters is discussed.

  19. An Investigation of the High Efficiency Estimation Approach of the Large-Scale Scattered Point Cloud Normal Vector

    Directory of Open Access Journals (Sweden)

    Xianglin Meng

    2018-03-01

    The normal vector estimation of the large-scale scattered point cloud (LSSPC) plays an important role in point-based shape editing. However, normal vector estimation for LSSPC cannot meet the great challenge posed by the sharp increase in point cloud size, mainly because of its low computational efficiency. In this paper, a novel, fast method based on bi-linear interpolation is reported for the normal vector estimation of LSSPC. We divide the point sets into many small cubes to speed up the local point search and construct interpolation nodes on the isosurface expressed by the point cloud. After calculating the normal vectors of these interpolation nodes, a normal vector bi-linear interpolation of the points in each cube is realized. The proposed approach has the merits of accuracy, simplicity, and high efficiency, because the algorithm only needs to search neighbors of, and calculate normal vectors for, the interpolation nodes, which are usually far fewer than the points in the cloud. The experimental results on several real and simulated point sets show that our method is over three times faster than the Elliptic Gabriel Graph-based method, and the average deviation is less than 0.01 mm.
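
    The core interpolation step can be sketched as follows: given unit normals computed at four interpolation nodes, the normal at an interior point is the bi-linearly weighted combination, renormalized to unit length (a simplified reading of the method; node construction and neighbor search are omitted):

```python
import numpy as np

def bilerp_normal(n00, n10, n01, n11, s, t):
    """Bi-linear interpolation of unit normals at four grid nodes,
    with parameters s, t in [0, 1]; the result is renormalized to unit length."""
    n = ((1 - s) * (1 - t) * n00 + s * (1 - t) * n10
         + (1 - s) * t * n01 + s * t * n11)
    return n / np.linalg.norm(n)
```

    Because only the sparse interpolation nodes require an expensive neighborhood-based normal estimate, every remaining point costs just the few multiplications above, which is where the reported speed-up comes from.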

  20. Teaching the Assessment of Normality Using Large Easily-Generated Real Data Sets

    Science.gov (United States)

    Kulp, Christopher W.; Sprechini, Gene D.

    2016-01-01

    A classroom activity is presented, which can be used in teaching students statistics with an easily generated, large, real world data set. The activity consists of analyzing a video recording of an object. The colour data of the recorded object can then be used as a data set to explore variation in the data using graphs including histograms,…
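
    A minimal version of the analysis step, with simulated per-frame colour means standing in for values extracted from a recorded video (the variable names and numbers are illustrative, not from the article):

```python
import numpy as np

rng = np.random.default_rng(7)
# Stand-in for the per-frame mean red-channel value of a recorded object (0-255)
reds = rng.normal(128.0, 5.0, 2000).clip(0, 255)

counts, edges = np.histogram(reds, bins=30)  # histogram for the class discussion
mean, std = reds.mean(), reds.std()          # summary statistics of the variation
```

    Students can then compare the histogram's shape against a normal curve with the same mean and standard deviation, which is the assessment-of-normality exercise the activity describes.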

  1. A large-scale study of the ultrawideband microwave dielectric properties of normal breast tissue obtained from reduction surgeries.

    Science.gov (United States)

    Lazebnik, Mariya; McCartney, Leah; Popovic, Dijana; Watkins, Cynthia B; Lindstrom, Mary J; Harter, Josephine; Sewall, Sarah; Magliocco, Anthony; Booske, John H; Okoniewski, Michal; Hagness, Susan C

    2007-05-21

    The efficacy of emerging microwave breast cancer detection and treatment techniques will depend, in part, on the dielectric properties of normal breast tissue. However, knowledge of these properties at microwave frequencies has been limited due to gaps and discrepancies in previously reported small-scale studies. To address these issues, we experimentally characterized the wideband microwave-frequency dielectric properties of a large number of normal breast tissue samples obtained from breast reduction surgeries at the University of Wisconsin and University of Calgary hospitals. The dielectric spectroscopy measurements were conducted from 0.5 to 20 GHz using a precision open-ended coaxial probe. The tissue composition within the probe's sensing region was quantified in terms of percentages of adipose, fibroconnective and glandular tissues. We fit a one-pole Cole-Cole model to the complex permittivity data set obtained for each sample and determined median Cole-Cole parameters for three groups of normal breast tissues, categorized by adipose tissue content (0-30%, 31-84% and 85-100%). Our analysis of the dielectric properties data for 354 tissue samples reveals that there is a large variation in the dielectric properties of normal breast tissue due to substantial tissue heterogeneity. We observed no statistically significant difference between the within-patient and between-patient variability in the dielectric properties.
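
    The one-pole Cole-Cole model fitted to each sample has the standard form ε*(ω) = ε∞ + Δε / (1 + (jωτ)^(1−α)) + σs/(jωε0); a sketch for evaluating it over the measured band (the parameter values below are illustrative, not the paper's median fits):

```python
import numpy as np

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def cole_cole(f_hz, eps_inf, delta_eps, tau, alpha, sigma_s):
    """Single-pole Cole-Cole complex relative permittivity at frequencies f_hz (Hz)."""
    w = 2.0 * np.pi * np.asarray(f_hz, dtype=float)
    return (eps_inf
            + delta_eps / (1.0 + (1j * w * tau) ** (1.0 - alpha))
            + sigma_s / (1j * w * EPS0))

# Evaluate across the study's 0.5-20 GHz band with illustrative parameters
freqs = np.linspace(0.5e9, 20e9, 40)
eps = cole_cole(freqs, eps_inf=7.0, delta_eps=40.0, tau=10e-12, alpha=0.1, sigma_s=0.7)
```

    The real part gives the dielectric constant and the (negative) imaginary part the loss; fitting these four-plus-one parameters per sample is how the study condenses each measured spectrum into a set of median Cole-Cole parameters per adipose-content group.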

  2. A large-scale study of the ultrawideband microwave dielectric properties of normal breast tissue obtained from reduction surgeries

    International Nuclear Information System (INIS)

    Lazebnik, Mariya; McCartney, Leah; Popovic, Dijana; Watkins, Cynthia B; Lindstrom, Mary J; Harter, Josephine; Sewall, Sarah; Magliocco, Anthony; Booske, John H; Okoniewski, Michal; Hagness, Susan C

    2007-01-01

    The efficacy of emerging microwave breast cancer detection and treatment techniques will depend, in part, on the dielectric properties of normal breast tissue. However, knowledge of these properties at microwave frequencies has been limited due to gaps and discrepancies in previously reported small-scale studies. To address these issues, we experimentally characterized the wideband microwave-frequency dielectric properties of a large number of normal breast tissue samples obtained from breast reduction surgeries at the University of Wisconsin and University of Calgary hospitals. The dielectric spectroscopy measurements were conducted from 0.5 to 20 GHz using a precision open-ended coaxial probe. The tissue composition within the probe's sensing region was quantified in terms of percentages of adipose, fibroconnective and glandular tissues. We fit a one-pole Cole-Cole model to the complex permittivity data set obtained for each sample and determined median Cole-Cole parameters for three groups of normal breast tissues, categorized by adipose tissue content (0-30%, 31-84% and 85-100%). Our analysis of the dielectric properties data for 354 tissue samples reveals that there is a large variation in the dielectric properties of normal breast tissue due to substantial tissue heterogeneity. We observed no statistically significant difference between the within-patient and between-patient variability in the dielectric properties.

  3. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies

    OpenAIRE

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M.; Rogers, Peter J.; Hardman, Charlotte A.

    2016-01-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study ...

  4. Hi-Corrector: a fast, scalable and memory-efficient package for normalizing large-scale Hi-C data.

    Science.gov (United States)

    Li, Wenyuan; Gong, Ke; Li, Qingjiao; Alber, Frank; Zhou, Xianghong Jasmine

    2015-03-15

    Genome-wide proximity ligation assays, e.g. Hi-C and its variant TCC, have recently become important tools to study spatial genome organization. Removing biases from chromatin contact matrices generated by such techniques is a critical preprocessing step of subsequent analyses. The continuing decline of sequencing costs has led to an ever-improving resolution of Hi-C data, resulting in very large matrices of chromatin contacts. Such large matrices, however, pose a great challenge to the memory usage and speed of normalization. Therefore, there is an urgent need for fast and memory-efficient methods for normalization of Hi-C data. We developed Hi-Corrector, an easy-to-use, open-source implementation of the Hi-C data normalization algorithm. Its salient features are (i) scalability: the software is capable of normalizing Hi-C data of any size in reasonable time; (ii) memory efficiency: the sequential version can run on any single computer with very limited memory, no matter how little; (iii) speed: the parallel version runs very fast on multiple computing nodes with limited local memory. The sequential version is implemented in ANSI C and can be easily compiled on any system; the parallel version is implemented in ANSI C with the MPI library (a standardized and portable parallel environment designed for solving large-scale scientific problems). The package is freely available at http://zhoulab.usc.edu/Hi-Corrector/. © The Author 2014. Published by Oxford University Press.
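    Hi-C normalization of this kind is commonly formulated as iterative matrix balancing: every bin's row and column are repeatedly rescaled by that bin's relative coverage until all rows sum to the same value. The following is a minimal dense-matrix sketch of that idea in plain Python on a toy symmetric contact matrix; the function name and structure are illustrative, not Hi-Corrector's ANSI C/MPI API:

```python
def iterative_correction(m, n_iter=50):
    """Balance a symmetric contact matrix so every row sums to the same value.

    m: dense symmetric matrix as a list of lists (toy stand-in for a
    genome-wide Hi-C contact matrix). Illustrative sketch only.
    Returns the balanced matrix and the accumulated per-bin biases.
    """
    n = len(m)
    m = [row[:] for row in m]            # work on a copy
    bias = [1.0] * n                     # multiplicative bias per bin
    for _ in range(n_iter):
        sums = [sum(row) for row in m]
        mean = sum(sums) / n
        s = [x / mean for x in sums]     # relative coverage of each bin
        # divide entry (i, j) by both bins' relative coverage
        for i in range(n):
            for j in range(n):
                m[i][j] /= s[i] * s[j]
        for i in range(n):
            bias[i] *= s[i]
    return m, bias
```

    After convergence the row sums are uniform, so any remaining structure in the matrix reflects contact preferences rather than per-bin coverage biases; the real implementation achieves the same effect out-of-core and in parallel.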

  5. Study of coherent structures of turbulence with large wall-normal gradients in thermophysical properties using direct numerical simulation

    International Nuclear Information System (INIS)

    Reinink, Shawn K.; Yaras, Metin I.

    2015-01-01

    Forced-convection heat transfer in a heated working fluid at a thermodynamic state near its pseudocritical point is poorly predicted by correlations calibrated with data at subcritical temperatures and pressures. This is suggested to be primarily due to the influence of large wall-normal thermophysical property gradients that develop in proximity of the pseudocritical point on the concentration of coherent turbulence structures near the wall. The physical mechanisms dominating this influence remain poorly understood. In the present study, direct numerical simulation is used to study the development of coherent vortical structures within a turbulent spot under the influence of large wall-normal property gradients. A turbulent spot rather than a fully turbulent boundary layer is used for the study, for the coherent structures of turbulence in a spot tend to be in a more organized state which may allow for more effective identification of cause-and-effect relationships. Large wall-normal gradients in thermophysical properties are created by heating the working fluid which is near the pseudocritical thermodynamic state. It is found that during improved heat transfer, wall-normal gradients in density accelerate the growth of the Kelvin-Helmholtz instability mechanism in the shear layer enveloping low-speed streaks, causing it to roll up into hairpin vortices at a faster rate. It is suggested that this occurs by the baroclinic vorticity generation mechanism which accelerates the streamwise grouping of vorticity during shear layer roll-up. The increased roll-up frequency leads to reduced streamwise spacing between hairpin vortices in wave packets. The density gradients also promote the sinuous instability mode in low-speed streaks. The resulting oscillations in the streaks in the streamwise-spanwise plane lead to locally reduced spanwise spacing between hairpin vortices forming over adjacent low-speed streaks. The reduction in streamwise and spanwise spacing between

  6. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed; Wickenhauser, Patrick; Rautek, Peter; Reina, Guido; Hadwiger, Markus

    2017-01-01

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.
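    The central data structure can be illustrated with a toy version: per pixel, a histogram over quantized normal directions is accumulated once, and the pixel can then be shaded under any light without touching the particles again. The sketch below (plain Python, upper-hemisphere angular binning, Lambertian shading) is an illustrative reading of the S-NDF concept, not the paper's GPU implementation; all names are assumptions:

```python
import math

def quantize_normal(n, bins_per_axis=4):
    """Map a unit normal to a coarse (theta, phi) bin index.
    Toy scheme: uniform bins over the z >= 0 hemisphere."""
    x, y, z = n
    theta = math.acos(max(-1.0, min(1.0, z)))       # polar angle
    phi = math.atan2(y, x) % (2 * math.pi)          # azimuth
    ti = min(int(theta / (math.pi / 2) * bins_per_axis), bins_per_axis - 1)
    pj = min(int(phi / (2 * math.pi) * bins_per_axis), bins_per_axis - 1)
    return ti * bins_per_axis + pj

def accumulate_sndf(pixel_normals, bins_per_axis=4):
    """Build the screen-space NDF for one pixel: a normalized histogram of
    the normals of all particle fragments that landed in that pixel."""
    hist = [0.0] * (bins_per_axis * bins_per_axis)
    for n in pixel_normals:
        hist[quantize_normal(n, bins_per_axis)] += 1.0
    total = sum(hist) or 1.0
    return [h / total for h in hist]

def shade_from_sndf(hist, light_dir, bins_per_axis=4):
    """Re-light a pixel from its cached S-NDF alone: sum the Lambert term of
    each bin's representative normal, weighted by that bin's share."""
    out = 0.0
    for idx, w in enumerate(hist):
        if w == 0.0:
            continue
        ti, pj = divmod(idx, bins_per_axis)
        theta = (ti + 0.5) * (math.pi / 2) / bins_per_axis
        phi = (pj + 0.5) * (2 * math.pi) / bins_per_axis
        n = (math.sin(theta) * math.cos(phi),
             math.sin(theta) * math.sin(phi),
             math.cos(theta))
        out += w * max(0.0, sum(a * b for a, b in zip(n, light_dir)))
    return out
```

    Because shading consumes only the cached histogram, changing the light direction costs a fixed amount of work per pixel regardless of how many million particles produced it, which is the property that makes interactive re-lighting feasible.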

  7. Screen-Space Normal Distribution Function Caching for Consistent Multi-Resolution Rendering of Large Particle Data

    KAUST Repository

    Ibrahim, Mohamed

    2017-08-28

    Molecular dynamics (MD) simulations are crucial to investigating important processes in physics and thermodynamics. The simulated atoms are usually visualized as hard spheres with Phong shading, where individual particles and their local density can be perceived well in close-up views. However, for large-scale simulations with 10 million particles or more, the visualization of large fields-of-view usually suffers from strong aliasing artifacts, because the mismatch between data size and output resolution leads to severe under-sampling of the geometry. Excessive super-sampling can alleviate this problem, but is prohibitively expensive. This paper presents a novel visualization method for large-scale particle data that addresses aliasing while enabling interactive high-quality rendering. We introduce the novel concept of screen-space normal distribution functions (S-NDFs) for particle data. S-NDFs represent the distribution of surface normals that map to a given pixel in screen space, which enables high-quality re-lighting without re-rendering particles. In order to facilitate interactive zooming, we cache S-NDFs in a screen-space mipmap (S-MIP). Together, these two concepts enable interactive, scale-consistent re-lighting and shading changes, as well as zooming, without having to re-sample the particle data. We show how our method facilitates the interactive exploration of real-world large-scale MD simulation data in different scenarios.

  8. Dynamics of large-scale brain activity in normal arousal states and epileptic seizures

    Science.gov (United States)

    Robinson, P. A.; Rennie, C. J.; Rowe, D. L.

    2002-04-01

    Links between electroencephalograms (EEGs) and underlying aspects of neurophysiology and anatomy are poorly understood. Here a nonlinear continuum model of large-scale brain electrical activity is used to analyze arousal states and their stability and nonlinear dynamics for physiologically realistic parameters. A simple ordered arousal sequence in a reduced parameter space is inferred and found to be consistent with experimentally determined parameters of waking states. Instabilities arise at spectral peaks of the major clinically observed EEG rhythms-mainly slow wave, delta, theta, alpha, and sleep spindle-with each instability zone lying near its most common experimental precursor arousal states in the reduced space. Theta, alpha, and spindle instabilities evolve toward low-dimensional nonlinear limit cycles that correspond closely to EEGs of petit mal seizures for theta instability, and grand mal seizures for the other types. Nonlinear stimulus-induced entrainment and seizures are also seen, EEG spectra and potentials evoked by stimuli are reproduced, and numerous other points of experimental agreement are found. Inverse modeling enables physiological parameters underlying observed EEGs to be determined by a new, noninvasive route. This model thus provides a single, powerful framework for quantitative understanding of a wide variety of brain phenomena.

  9. Visual exposure to large and small portion sizes and perceptions of portion size normality: Three experimental studies.

    Science.gov (United States)

    Robinson, Eric; Oldham, Melissa; Cuckson, Imogen; Brunstrom, Jeffrey M; Rogers, Peter J; Hardman, Charlotte A

    2016-03-01

    Portion sizes of many foods have increased in recent times. In three studies we examined the effect that repeated visual exposure to larger versus smaller food portion sizes has on perceptions of what constitutes a normal-sized food portion and measures of portion size selection. In studies 1 and 2 participants were visually exposed to images of large or small portions of spaghetti bolognese, before making evaluations about an image of an intermediate sized portion of the same food. In study 3 participants were exposed to images of large or small portions of a snack food before selecting a portion size of snack food to consume. Across the three studies, visual exposure to larger as opposed to smaller portion sizes resulted in participants considering a normal portion of food to be larger than a reference intermediate sized portion. In studies 1 and 2 visual exposure to larger portion sizes also increased the size of self-reported ideal meal size. In study 3 visual exposure to larger portion sizes of a snack food did not affect how much of that food participants subsequently served themselves and ate. Visual exposure to larger portion sizes may adjust visual perceptions of what constitutes a 'normal' sized portion. However, we did not find evidence that visual exposure to larger portions altered snack food intake. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  10. Large p_T π^0 production in high-energy αα collisions: strange or normal

    International Nuclear Information System (INIS)

    Wilk, G.

    1982-01-01

    An abnormally large yield of π^0 with large p_T was measured at 90° in the nucleon-nucleon CM frame in αα collisions at the CERN ISR. The maximal momenta of the beams corresponded to p = 31.5 GeV/c per charge. The ratio R of the corresponding inclusive cross sections for αα and pp collisions grows with p_T well above the expected value R = 16 (for helium nuclei), reaching R = 40-60 at p_T ≈ 7 GeV/c. In this talk the author tries to answer the question of whether this result is strange or normal. (M.F.W.)

  11. Dynamic evolution of temporal dissipative-soliton molecules in large normal path-averaged dispersion fiber lasers

    International Nuclear Information System (INIS)

    Liu Xueming

    2010-01-01

    Robust dissipative soliton molecules (DSMs) exhibiting a quasirectangular spectral profile are investigated numerically and observed experimentally in mode-locked fiber lasers with large normal path-averaged dispersion and large net cavity dispersion. These DSMs have an independently evolving phase, a pulse duration T_0 of about 20 ps, and a peak-to-peak separation of about 8T_0. Under laboratory conditions, the proposed laser delivers vibrating DSMs with an oscillation amplitude of less than a percent of the peak separation. Numerical simulations show that DSMs are characterized by a spectral modulation pattern with about a 3 dB modulation depth measured as an averaged value. The experimental observations are in excellent agreement with the numerical predictions.

  12. A triple network connectivity study of large-scale brain systems in cognitively normal APOE4 carriers

    Directory of Open Access Journals (Sweden)

    Xia Wu

    2016-09-01

    The triple network model, consisting of the central executive network, the salience network and the default mode network, has recently been employed to understand dysfunction in core networks across various disorders. Here we used the triple network model to investigate the large-scale brain networks in cognitively normal APOE4 carriers who are at risk of Alzheimer's disease (AD). To explore the functional connectivity within each of the three networks and the effective connectivity among them, we evaluated 17 cognitively normal individuals with a family history of AD and at least one copy of the apolipoprotein e4 (APOE4) allele, and compared the findings to those of 12 individuals who did not carry the APOE4 gene or have a family history of AD, using independent component analysis and a Bayesian network approach. Our findings indicated altered within-network connectivity, suggesting a risk of future cognitive decline, and preserved between-network connectivity, which may support the currently preserved cognition of the cognitively normal APOE4 allele carriers. The study provides novel insights into our understanding of the risk factors for AD and their influence on the triple network model of major psychopathology.

  13. New limits on the population of normal and millisecond pulsars in the Large and Small Magellanic Clouds

    Science.gov (United States)

    Ridley, J. P.; Lorimer, D. R.

    2010-07-01

    We model the potentially observable populations of normal and millisecond radio pulsars in the Large and Small Magellanic Clouds (LMC and SMC, respectively) where the known population currently stands at 19 normal radio pulsars. Taking into account the detection thresholds of previous surveys, and assuming optimal period and luminosity distributions based on studies of Galactic pulsars, we estimate that there are (1.79 ± 0.20) × 10^4 and (1.09 ± 0.16) × 10^4 normal pulsars in the LMC and SMC, respectively. When we attempt to correct for beaming effects, and the fraction of high-velocity pulsars which escape the clouds, we estimate birth rates in both the LMC and SMC to be comparable and in the range of 0.5-1 pulsars per century. Although higher than estimates for the rate of core-collapse supernovae in the clouds, these pulsar birth rates are consistent with historical supernova observations in the past 300 yr. A substantial population of active radio pulsars (of the order of a few hundred thousand) has escaped the LMC and SMC and populates the local intergalactic medium. For the millisecond pulsar (MSP) population, the lack of any detections from current surveys leads to respective upper limits (at the 95 per cent confidence level) of 15000 for the LMC and 23000 for the SMC. Several MSPs could be detected by a currently ongoing survey of the SMC with improved time and frequency resolution using the Parkes multibeam system. Giant-pulse emitting neutron stars could also be seen by this survey.

  14. Normal stress differences from Oldroyd 8-constant framework: Exact analytical solution for large-amplitude oscillatory shear flow

    Science.gov (United States)

    Saengow, C.; Giacomin, A. J.

    2017-12-01

    The Oldroyd 8-constant framework for continuum constitutive theory contains a rich diversity of popular special cases for polymeric liquids. In this paper, we use part of our exact solution for shear stress to arrive at unique exact analytical solutions for the normal stress difference responses to large-amplitude oscillatory shear (LAOS) flow. The nonlinearity of the polymeric liquids, triggered by LAOS, causes these responses at even multiples of the test frequency. We call responses at a frequency higher than twice the test frequency higher harmonics. We find the new exact analytical solutions to be compact and intrinsically beautiful. These solutions reduce to those of our previous work on the special case of the corotational Maxwell fluid. Our solutions also agree with our new truncated Goddard integral expansion for the special case of the corotational Jeffreys fluid. The limiting behaviors of these exact solutions also yield new explicit expressions. Finally, we use our exact solutions to see how η∞ affects the normal stress differences in LAOS.
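    The harmonic structure described above can be made explicit: under the imposed oscillatory strain, the normal stress differences respond at even multiples of the test frequency. Schematically, in a generic Fourier form consistent with the abstract (not the paper's closed-form solution):

```latex
\gamma(t) = \gamma_0 \sin \omega t, \qquad
N_1(t) \equiv \tau_{xx} - \tau_{yy}
       = N_1^{(0)}(\gamma_0, \omega)
       + \sum_{n = 2, 4, 6, \ldots}
         \left[ a_n(\gamma_0, \omega) \cos n\omega t
              + b_n(\gamma_0, \omega) \sin n\omega t \right]
```

    with an analogous expansion for N_2 = τ_yy − τ_zz; the terms with n ≥ 4 are the "higher harmonics" the abstract refers to, and the exact solutions give the coefficients a_n and b_n in closed form.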

  15. Multiscale virtual particle based elastic network model (MVP-ENM) for normal mode analysis of large-sized biomolecules.

    Science.gov (United States)

    Xia, Kelin

    2017-12-20

    In this paper, a multiscale virtual particle based elastic network model (MVP-ENM) is proposed for the normal mode analysis of large-sized biomolecules. The multiscale virtual particle (MVP) model is proposed for the discretization of biomolecular density data. With this model, large-sized biomolecular structures can be coarse-grained into virtual particles such that a balance between model accuracy and computational cost can be achieved. An elastic network is constructed by assuming "connections" between virtual particles. The connection is described by a special harmonic potential function, which considers the influence from both the mass distributions and distance relations of the virtual particles. Two independent models, i.e., the multiscale virtual particle based Gaussian network model (MVP-GNM) and the multiscale virtual particle based anisotropic network model (MVP-ANM), are proposed. It has been found that in the Debye-Waller factor (B-factor) prediction, the results from our MVP-GNM with a high resolution are as good as the ones from GNM. Even with low resolutions, our MVP-GNM can still capture the global behavior of the B-factor very well with mismatches predominantly from the regions with large B-factor values. Further, it has been demonstrated that the low-frequency eigenmodes from our MVP-ANM are highly consistent with the ones from ANM even with very low resolutions and a coarse grid. Finally, the great advantage of the MVP-ANM model for large-sized biomolecules has been demonstrated by using two poliovirus structures. The paper ends with a conclusion.
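    The coarse-graining step — merging atoms into virtual particles and then connecting nearby particles into an elastic network with mass-weighted springs — can be sketched in a few lines. Grid-based clustering below is an illustrative stand-in for the paper's density-based MVP construction, and all names, cutoffs and the stiffness rule are assumptions for illustration:

```python
from collections import defaultdict

def coarse_grain(atoms, grid=5.0):
    """Merge atoms into virtual particles: one particle per occupied grid
    cell, located at the centroid of the atoms in it, with the atom count
    as its mass."""
    cells = defaultdict(list)
    for x, y, z in atoms:
        cells[(int(x // grid), int(y // grid), int(z // grid))].append((x, y, z))
    particles = []
    for members in cells.values():
        n = len(members)
        centroid = tuple(sum(p[k] for p in members) / n for k in range(3))
        particles.append((centroid, n))      # (position, mass)
    return particles

def elastic_network(particles, cutoff=8.0):
    """Connect virtual particles closer than `cutoff`; spring stiffness is
    scaled by the product of the two masses, mimicking a mass-weighted
    harmonic potential."""
    springs = []
    for i in range(len(particles)):
        for j in range(i + 1, len(particles)):
            (pi, mi), (pj, mj) = particles[i], particles[j]
            d2 = sum((a - b) ** 2 for a, b in zip(pi, pj))
            if d2 <= cutoff ** 2:
                springs.append((i, j, mi * mj))
    return springs
```

    A coarser grid yields fewer virtual particles and a smaller network matrix to diagonalize, which is the accuracy/cost trade-off the abstract describes.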

  16. Evaluation of Normalization Methods to Pave the Way Towards Large-Scale LC-MS-Based Metabolomics Profiling Experiments

    Science.gov (United States)

    Valkenborg, Dirk; Baggerman, Geert; Vanaerschot, Manu; Witters, Erwin; Dujardin, Jean-Claude; Burzykowski, Tomasz; Berg, Maya

    2013-01-01

    Combining liquid chromatography-mass spectrometry (LC-MS)-based metabolomics experiments that were collected over a long period of time remains problematic due to systematic variability between LC-MS measurements. Until now, most normalization methods for LC-MS data have been model-driven, based on internal standards or intermediate quality control runs, where an external model is extrapolated to the dataset of interest. In the first part of this article, we evaluate several existing data-driven normalization approaches on LC-MS metabolomics experiments, which do not require the use of internal standards. According to variability measures, each normalization method performs relatively well, showing that the use of any normalization method will greatly improve data analysis originating from multiple experimental runs. In the second part, we apply cyclic-Loess normalization to a Leishmania sample. This normalization method allows the removal of systematic variability between two measurement blocks over time and maintains the differential metabolites. In conclusion, normalization allows for pooling datasets from different measurement blocks over time and increases the statistical power of the analysis, hence paving the way to increase the scale of LC-MS metabolomics experiments. From our investigation, we recommend data-driven normalization methods over model-driven normalization methods, if only a few internal standards were used. Moreover, data-driven normalization methods are the best option to normalize datasets from untargeted LC-MS experiments. PMID:23808607
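    The simplest member of the data-driven (standard-free) family evaluated here is median scaling, which rescales each LC-MS run so that all runs share a common median intensity. The sketch below is a toy stand-in for the fuller methods such as cyclic-Loess, not the authors' code:

```python
def median(values):
    """Median of a list of numbers (no external dependencies)."""
    s = sorted(values)
    mid = len(s) // 2
    return s[mid] if len(s) % 2 else 0.5 * (s[mid - 1] + s[mid])

def median_normalize(runs):
    """Scale every run (a list of feature intensities) so its median equals
    the grand median across runs, removing run-to-run global intensity
    drift without any internal standards."""
    medians = [median(r) for r in runs]
    target = median(medians)
    return [[x * target / m for x in r] for r, m in zip(runs, medians)]
```

    Cyclic-Loess goes further by fitting an intensity-dependent correction curve between pairs of runs, but the entry point is the same: the correction is estimated from the data themselves.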

  17. Evaluation of Bias-Variance Trade-Off for Commonly Used Post-Summarizing Normalization Procedures in Large-Scale Gene Expression Studies

    Science.gov (United States)

    Qiu, Xing; Hu, Rui; Wu, Zhixin

    2014-01-01

    Normalization procedures are widely used in high-throughput genomic data analyses to remove various technological noise and variations. They are known to have profound impact to the subsequent gene differential expression analysis. Although there has been some research in evaluating different normalization procedures, few attempts have been made to systematically evaluate the gene detection performances of normalization procedures from the bias-variance trade-off point of view, especially with strong gene differentiation effects and large sample size. In this paper, we conduct a thorough study to evaluate the effects of normalization procedures combined with several commonly used statistical tests and MTPs under different configurations of effect size and sample size. We conduct theoretical evaluation based on a random effect model, as well as simulation and biological data analyses to verify the results. Based on our findings, we provide some practical guidance for selecting a suitable normalization procedure under different scenarios. PMID:24941114
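    One widely used post-summarization procedure of the kind evaluated in such studies is quantile normalization, which forces all samples to share a common intensity distribution while preserving each gene's rank within its sample. A generic textbook sketch (not tied to the paper's specific procedures or tie-handling conventions):

```python
def quantile_normalize(samples):
    """Quantile-normalize equal-length expression vectors: replace the k-th
    smallest value in every sample with the mean of the k-th smallest
    values across samples, so all samples end up with an identical
    distribution (ties ignored for simplicity)."""
    n = len(samples[0])
    # per-sample gene indices ordered by expression value
    ranked = [sorted(range(n), key=s.__getitem__) for s in samples]
    # reference distribution: mean of the k-th order statistic across samples
    ref = [sum(s[idx[k]] for s, idx in zip(samples, ranked)) / len(samples)
           for k in range(n)]
    out = []
    for idx in ranked:
        norm = [0.0] * n
        for k, gene in enumerate(idx):
            norm[gene] = ref[k]
        out.append(norm)
    return out
```

    The bias-variance tension the abstract studies is visible even here: forcing identical distributions removes technical variation (lower variance) but can also flatten genuine global expression differences between conditions (higher bias).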

  18. A large-scale study of the ultrawideband microwave dielectric properties of normal, benign and malignant breast tissues obtained from cancer surgeries

    Energy Technology Data Exchange (ETDEWEB)

    Lazebnik, Mariya [Department of Electrical and Computer Engineering, University of Wisconsin, Madison, WI (United States); Popovic, Dijana [Department of Electrical and Computer Engineering, University of Calgary, Calgary, AB (Canada); McCartney, Leah [Department of Electrical and Computer Engineering, University of Calgary, Calgary, AB (Canada); Watkins, Cynthia B [Department of Electrical and Computer Engineering, University of Wisconsin, Madison, WI (United States); Lindstrom, Mary J [Department of Biostatistics and Medical Informatics, University of Wisconsin, Madison, WI (United States); Harter, Josephine [Department of Pathology, University of Wisconsin, Madison, WI (United States); Sewall, Sarah [Department of Pathology, University of Wisconsin, Madison, WI (United States); Ogilvie, Travis [Department of Pathology, University of Calgary, Calgary, AB (Canada); Magliocco, Anthony [Department of Pathology, University of Calgary, Calgary, AB (Canada); Breslin, Tara M [Department of Surgery, University of Wisconsin, Madison, WI (United States); Temple, Walley [Department of Surgery and Oncology, University of Calgary, Calgary, AB (Canada); Mew, Daphne [Department of Surgery and Oncology, University of Calgary, Calgary, AB (Canada); Booske, John H [Department of Electrical and Computer Engineering, University of Wisconsin, Madison, WI (United States); Okoniewski, Michal [Department of Electrical and Computer Engineering, University of Calgary, Calgary, AB (Canada); Hagness, Susan C [Department of Electrical and Computer Engineering, University of Wisconsin, Madison, WI (United States)

    2007-10-21

    The development of microwave breast cancer detection and treatment techniques has been driven by reports of substantial contrast in the dielectric properties of malignant and normal breast tissues. However, definitive knowledge of the dielectric properties of normal and diseased breast tissues at microwave frequencies has been limited by gaps and discrepancies across previously published studies. To address these issues, we conducted a large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe. Previously, we reported the dielectric properties of normal breast tissue samples obtained from reduction surgeries. Here, we report the dielectric properties of normal (adipose, glandular and fibroconnective), malignant (invasive and non-invasive ductal and lobular carcinomas) and benign (fibroadenomas and cysts) breast tissue samples obtained from cancer surgeries. We fit a one-pole Cole-Cole model to the complex permittivity data set of each characterized sample. Our analyses show that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1, while the contrast in the microwave-frequency dielectric properties between malignant and normal glandular/fibroconnective tissues in the breast is no more than about 10%.

  19. A large-scale study of the ultrawideband microwave dielectric properties of normal, benign and malignant breast tissues obtained from cancer surgeries

    Science.gov (United States)

    Lazebnik, Mariya; Popovic, Dijana; McCartney, Leah; Watkins, Cynthia B.; Lindstrom, Mary J.; Harter, Josephine; Sewall, Sarah; Ogilvie, Travis; Magliocco, Anthony; Breslin, Tara M.; Temple, Walley; Mew, Daphne; Booske, John H.; Okoniewski, Michal; Hagness, Susan C.

    2007-10-01

    The development of microwave breast cancer detection and treatment techniques has been driven by reports of substantial contrast in the dielectric properties of malignant and normal breast tissues. However, definitive knowledge of the dielectric properties of normal and diseased breast tissues at microwave frequencies has been limited by gaps and discrepancies across previously published studies. To address these issues, we conducted a large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe. Previously, we reported the dielectric properties of normal breast tissue samples obtained from reduction surgeries. Here, we report the dielectric properties of normal (adipose, glandular and fibroconnective), malignant (invasive and non-invasive ductal and lobular carcinomas) and benign (fibroadenomas and cysts) breast tissue samples obtained from cancer surgeries. We fit a one-pole Cole-Cole model to the complex permittivity data set of each characterized sample. Our analyses show that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1, while the contrast in the microwave-frequency dielectric properties between malignant and normal glandular/fibroconnective tissues in the breast is no more than about 10%.

  20. A large-scale study of the ultrawideband microwave dielectric properties of normal, benign and malignant breast tissues obtained from cancer surgeries

    International Nuclear Information System (INIS)

    Lazebnik, Mariya; Popovic, Dijana; McCartney, Leah; Watkins, Cynthia B; Lindstrom, Mary J; Harter, Josephine; Sewall, Sarah; Ogilvie, Travis; Magliocco, Anthony; Breslin, Tara M; Temple, Walley; Mew, Daphne; Booske, John H; Okoniewski, Michal; Hagness, Susan C

    2007-01-01

    The development of microwave breast cancer detection and treatment techniques has been driven by reports of substantial contrast in the dielectric properties of malignant and normal breast tissues. However, definitive knowledge of the dielectric properties of normal and diseased breast tissues at microwave frequencies has been limited by gaps and discrepancies across previously published studies. To address these issues, we conducted a large-scale study to experimentally determine the ultrawideband microwave dielectric properties of a variety of normal, malignant and benign breast tissues, measured from 0.5 to 20 GHz using a precision open-ended coaxial probe. Previously, we reported the dielectric properties of normal breast tissue samples obtained from reduction surgeries. Here, we report the dielectric properties of normal (adipose, glandular and fibroconnective), malignant (invasive and non-invasive ductal and lobular carcinomas) and benign (fibroadenomas and cysts) breast tissue samples obtained from cancer surgeries. We fit a one-pole Cole-Cole model to the complex permittivity data set of each characterized sample. Our analyses show that the contrast in the microwave-frequency dielectric properties between malignant and normal adipose-dominated tissues in the breast is considerable, as large as 10:1, while the contrast in the microwave-frequency dielectric properties between malignant and normal glandular/fibroconnective tissues in the breast is no more than about 10%

  1. Human papillomavirus in normal conjunctival tissue and in conjunctival papilloma: types and frequencies in a large series.

    Science.gov (United States)

    Sjö, Nicolai Christian; von Buchwald, Christian; Cassonnet, Patricia; Norrild, Bodil; Prause, Jan Ulrik; Vinding, Troels; Heegaard, Steffen

    2007-08-01

    To examine conjunctival papilloma and normal conjunctival tissue for the presence of human papillomavirus (HPV). Archival paraffin wax-embedded tissue from 165 conjunctival papillomas and from 20 histologically normal conjunctival biopsy specimens was analysed for the presence of HPV by PCR. Specimens considered HPV positive using consensus primers, but with a negative or uncertain PCR result using type-specific HPV probes, were analysed with DNA sequencing. HPV was present in 86 of 106 (81%) beta-globin-positive papillomas. HPV type 6 was found in 80 cases, HPV type 11 was identified in 5 cases and HPV type 45 was present in a single papilloma. All 20 normal conjunctival biopsy specimens were beta-globin positive and HPV negative. There is a strong association between HPV and conjunctival papilloma. The study presents the largest series of conjunctival papillomas investigated for HPV and the first investigation of HPV in normal conjunctival tissue. HPV types 6 and 11 are the most common HPV types in conjunctival papilloma. This is also the first report of HPV type 45 in conjunctival papilloma.

  2. Changes in cell proliferation and morphology in the large intestine of normal and DMH-treated rats following colostomy.

    Science.gov (United States)

    Barkla, D H; Tutton, P J

    1987-04-01

    Colostomies were formed in the midcolon of normal and DMH-treated rats. Changes in cell proliferation in the mucosa adjacent to the colostomy and in the defunctioned distal segment were measured at seven, 14, 30, and 72 days using a stathmokinetic technique. Animals were given intraperitoneal injections of vinblastine and sacrificed three hours later; counts of mitotic and nonmitotic cells were made in tissue sections, and three-hour accumulated mitotic indexes were estimated. The results show that, except at seven days in DMH-treated rats, cell proliferation was unchanged in the colon proximal to the colostomy. Morphologic evidence of hyperplasia was seen in some animals at seven and 14 days. The defunctioned segment showed rapid atrophy of both mucosa and muscularis and a gradual but progressive decrease in cell proliferation. The morphology of the mucosa adjacent to the suture line in both functioning and defunctioned segments in normal and DMH-treated rats was abnormal in many animals. Abnormalities that were seen included collections of dysplastic epithelial cells in the submucosa, focal adenomatous changes, and intramural carcinoma formation. Aggregates of lymphoid tissue often were associated with carcinomas.

  3. Differential effects of fresh frozen plasma and normal saline on secondary brain damage in a large animal model of polytrauma, hemorrhage and traumatic brain injury

    DEFF Research Database (Denmark)

    Hwabejire, John O; Imam, Ayesha M; Jin, Guang

    2013-01-01

    We have previously shown that the extent of traumatic brain injury (TBI) in large animal models can be reduced with early infusion of fresh frozen plasma (FFP), but the precise mechanisms remain unclear. In this study, we investigated whether resuscitation with FFP or normal saline differed in their effects on cerebral metabolism and excitotoxic secondary brain injury in a model of polytrauma, TBI, and hemorrhagic shock.

  4. Ordered one-component plasmas: Phase transitions, normal modes, large systems, and experiments in a storage ring

    International Nuclear Information System (INIS)

    Schiffer, J.P.

    1994-01-01

    The property of cold one-component plasmas, confined by external forces, to form an ordered array has been known for some time both from simulations and from experiment. The purpose of this talk is to summarize some recent work on simulations and some new experimental results. The author discusses some experimental work on real storage rings, magnetic storage devices in which particles circulate with large kinetic energies and for which laser cooling is used on partially ionized ions to attain temperatures ten or more orders of magnitude lower than their kinetic energies.

  5. gamma-Aminobutyric acid production in small and large intestine of normal and germ-free Wistar rats. Influence of food intake and intestinal flora.

    Science.gov (United States)

    van Berlo, C L; de Jonge, H R; van den Bogaard, A E; van Eijk, H M; Janssen, M A; Soeters, P B

    1987-09-01

    In recent hypotheses concerning the pathogenesis of hepatic encephalopathy, gamma-aminobutyric acid (GABA) is claimed to be produced by the colonic flora, although the enzymes necessary to generate GABA have been reported to be present in intestinal mucosa. In this study, using normal and germ-free Wistar rats, we determined GABA levels and aminograms of arterial blood and of venous effluent from the small and large bowel. The data indicate that large and small intestinal mucosa significantly contribute to GABA production. In the fasted state GABA concentrations are greater in the venous effluent of the small bowel than in that of the large bowel. Feeding increases the arterioportal differences, and uptake in the small bowel is still significantly higher than in the large bowel. This process is not, or only to a minor degree, bacterially mediated, because GABA production in the gut in both the fed and fasted states is of similar magnitude in germ-free and normal animals. GABA release correlates significantly with glutamine uptake in the small bowel of fasted rats. Only a small fraction of the glutamine taken up is needed to account for GABA release, so conclusions about which amino acids may serve as precursors of GABA cannot be drawn. Further studies are needed to delineate the metabolic pathways leading to GABA synthesis.

  6. Normal-Mode Analysis of Circular DNA at the Base-Pair Level. 2. Large-Scale Configurational Transformation of a Naturally Curved Molecule.

    Science.gov (United States)

    Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K

    2005-01-01

    Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the

  7. A Normalization-Free and Nonparametric Method Sharpens Large-Scale Transcriptome Analysis and Reveals Common Gene Alteration Patterns in Cancers.

    Science.gov (United States)

    Li, Qi-Gang; He, Yong-Han; Wu, Huan; Yang, Cui-Ping; Pu, Shao-Yan; Fan, Song-Qing; Jiang, Li-Ping; Shen, Qiu-Shuo; Wang, Xiao-Xiong; Chen, Xiao-Qiong; Yu, Qin; Li, Ying; Sun, Chang; Wang, Xiangting; Zhou, Jumin; Li, Hai-Peng; Chen, Yong-Bin; Kong, Qing-Peng

    2017-01-01

    Heterogeneity in transcriptional data hampers the identification of differentially expressed genes (DEGs) and understanding of cancer, essentially because current methods rely on cross-sample normalization and/or distribution assumption-both sensitive to heterogeneous values. Here, we developed a new method, Cross-Value Association Analysis (CVAA), which overcomes the limitation and is more robust to heterogeneous data than the other methods. Applying CVAA to a more complex pan-cancer dataset containing 5,540 transcriptomes discovered numerous new DEGs and many previously rarely explored pathways/processes; some of them were validated, both in vitro and in vivo, to be crucial in tumorigenesis, e.g., alcohol metabolism (ADH1B), chromosome remodeling (NCAPH), and complement system (Adipsin). Together, we present a sharper tool to navigate large-scale expression data and gain new mechanistic insights into tumorigenesis.
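
    The record above does not spell out CVAA's algorithm, so as a hedged illustration of the general idea it appeals to, a rank-based two-group comparison that needs no cross-sample normalization and assumes no distribution, here is a minimal Mann-Whitney-style test in pure Python. The gene counts below are invented for illustration; CVAA itself presumably differs in detail.

    ```python
    # Hedged sketch only: this is a generic nonparametric two-group test,
    # NOT the CVAA method described in the abstract.
    import math

    def rank_sum_p(group_a, group_b):
        """Two-sided Mann-Whitney U p-value via the normal approximation
        (average ranks for ties; no continuity or ties correction)."""
        na, nb = len(group_a), len(group_b)
        tagged = sorted([(v, "a") for v in group_a] + [(v, "b") for v in group_b])
        ranks = [0.0] * len(tagged)
        i = 0
        while i < len(tagged):                     # assign average ranks to ties
            j = i
            while j < len(tagged) and tagged[j][0] == tagged[i][0]:
                j += 1
            for k in range(i, j):
                ranks[k] = (i + 1 + j) / 2.0       # 1-based ranks i+1..j, averaged
            i = j
        r_a = sum(r for (v, g), r in zip(tagged, ranks) if g == "a")
        u_a = r_a - na * (na + 1) / 2.0            # U statistic for group a
        mu = na * nb / 2.0
        sigma = math.sqrt(na * nb * (na + nb + 1) / 12.0)
        z = (u_a - mu) / sigma
        return math.erfc(abs(z) / math.sqrt(2.0))  # two-sided tail probability

    # Invented raw counts for one "gene" across tumor vs. normal samples:
    p_shifted = rank_sum_p([100, 120, 130, 110, 140, 125, 135, 115],
                           [10, 20, 15, 12, 18, 25, 22, 14])   # clear shift
    p_flat = rank_sum_p([1, 3, 5, 7], [2, 4, 6, 8])            # interleaved, no shift
    ```

    Because only ranks enter the statistic, the test is invariant under any monotone transform of the pooled values, which is the sense in which such methods sidestep distributional assumptions.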

  8. A Simple Laparoscopic Procedure to Restore a Normal Vaginal Length After Colpohysterectomy With Large Upper Colpectomy for Cervical and/or Vaginal Neoplasia.

    Science.gov (United States)

    Leblanc, Eric; Bresson, Lucie; Merlot, Benjamin; Puga, Marco; Kridelka, Frederic; Tsunoda, Audrey; Narducci, Fabrice

    2016-01-01

    Colpohysterectomy is sometimes associated with a large upper colpectomy resulting in a shortened vagina, potentially impacting sexual function. We report on a preliminary experience of a laparoscopic colpoplasty to restore a normal vaginal length. Patients with shortened vaginas after a laparoscopic colpohysterectomy were considered for a laparoscopic modified Davydov's procedure to create a new vaginal vault using the peritoneum of the rectum and bladder. From 2010 to 2014, 8 patients were offered this procedure, after informed preoperative consent. Indications were 2 extensive recurrent grade 3 vaginal intraepithelial neoplasias and 6 radical hysterectomies for cervical cancer. Mean vaginal length before surgery was 3.8 cm (standard deviation, 1.6). Median operative time was 50 minutes (range, 45-90). Blood loss was minimal (50-100 mL). No perioperative complications occurred. Median vaginal length at discharge was 11.3 cm (range, 9-13). Sexual intercourse could be resumed around 10 weeks after surgery. At a median follow-up of 33.8 months (range, 2.4-51.3), 6 patients remained sexually active but 2 had stopped. Although this experience is small, this laparoscopic modified Davydov's procedure seems effective and adaptable to each patient's anatomy. If the initial postoperative regular self-dilatation is carefully observed, vaginal patency is durably restored and enables normal sexual function.

  9. Dosimetric Coverage of the Prostate, Normal Tissue Sparing, and Acute Toxicity with High-Dose-Rate Brachytherapy for Large Prostate Volumes

    Directory of Open Access Journals (Sweden)

    George Yang

    2015-06-01

    Full Text Available. Purpose: To evaluate dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with HDR brachytherapy for large prostate volumes. Materials and Methods: One hundred and two prostate cancer patients with prostate volumes >50 mL (range: 5-29 mL) were treated with high-dose-rate (HDR) brachytherapy ± intensity modulated radiation therapy (IMRT) to 4,500 cGy in 25 daily fractions between 2009 and 2013. HDR brachytherapy monotherapy doses consisted of two 1,350-1,400 cGy fractions separated by 2-3 weeks, and HDR brachytherapy boost doses consisted of two 950-1,150 cGy fractions separated by 4 weeks. Twelve of 32 (38%) unfavorable intermediate risk, high risk, and very high risk patients received androgen deprivation therapy. Acute toxicity was graded according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4. Results: Median follow-up was 14 months. Dosimetric goals were achieved in over 90% of cases. Three of 102 (3%) patients developed Grade 2 acute proctitis. No variables were significantly associated with Grade 2 acute proctitis. Seventeen of 102 (17%) patients developed Grade 2 acute urinary retention. American Urological Association (AUA) symptom score was the only variable significantly associated with Grade 2 acute urinary retention (p=0.04). There was no ≥ Grade 3 acute toxicity. Conclusions: Dosimetric coverage of the prostate and normal tissue sparing were adequate in patients with prostate volumes >50 mL. Higher pre-treatment AUA symptom scores increased the relative risk of Grade 2 acute urinary retention. However, the overall incidence of acute toxicity was acceptable in patients with large prostate volumes.

  10. Dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with high-dose-rate brachytherapy for large prostate volumes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, George; Strom, Tobin J.; Shrinath, Kushagra; Mellon, Eric A.; Fernandez, Daniel C.; Biagioli, Matthew C. [Department of Radiation Oncology, H. Lee Moffitt Cancer Center and Research Institute, Tampa, FL (United States); Wilder, Richard B., E-mail: mcbiagioli@yahoo.com [Cancer Treatment Centers of America, Newnan, GA (United States)

    2015-05-15

    Purpose: to evaluate dosimetric coverage of the prostate, normal tissue sparing, and acute toxicity with HDR brachytherapy for large prostate volumes. Materials and methods: one hundred and two prostate cancer patients with prostate volumes >50 mL (range: 5-29 mL) were treated with high-dose-rate (HDR) brachytherapy ± intensity modulated radiation therapy (IMRT) to 4,500 cGy in 25 daily fractions between 2009 and 2013. HDR brachytherapy monotherapy doses consisted of two 1,350-1,400 cGy fractions separated by 2-3 weeks, and HDR brachytherapy boost doses consisted of two 950-1,150 cGy fractions separated by 4 weeks. Twelve of 32 (38%) unfavorable intermediate risk, high risk, and very high risk patients received androgen deprivation therapy. Acute toxicity was graded according to the Common Terminology Criteria for Adverse Events (CTCAE) version 4. Results: median follow-up was 14 months. Dosimetric goals were achieved in over 90% of cases. Three of 102 (3%) patients developed Grade 2 acute proctitis. No variables were significantly associated with Grade 2 acute proctitis. Seventeen of 102 (17%) patients developed Grade 2 acute urinary retention. American Urological Association (AUA) symptom score was the only variable significantly associated with Grade 2 acute urinary retention (p=0.04). There was no ≥ Grade 3 acute toxicity. Conclusions: dosimetric coverage of the prostate and normal tissue sparing were adequate in patients with prostate volumes >50 mL. Higher pre-treatment AUA symptom scores increased the relative risk of Grade 2 acute urinary retention. However, the overall incidence of acute toxicity was acceptable in patients with large prostate volumes. (author)

  11. Neonatal L-glutamine modulates anxiety-like behavior, cortical spreading depression, and microglial immunoreactivity: analysis in developing rats suckled on normal size- and large size litters.

    Science.gov (United States)

    de Lima, Denise Sandrelly Cavalcanti; Francisco, Elian da Silva; Lima, Cássia Borges; Guedes, Rubem Carlos Araújo

    2017-02-01

    In mammals, L-glutamine (Gln) can alter the glutamate-Gln cycle and consequently brain excitability. Here, we investigated in developing rats the effect of treatment with different doses of Gln on anxiety-like behavior, cortical spreading depression (CSD), and microglial activation expressed as Iba1 immunoreactivity. Wistar rats were suckled in litters of 9 and 15 pups (groups L9 and L15; respectively, normal-size and large-size litters). From postnatal days (P) 7-27, the animals received Gln per gavage (250, 500, or 750 mg/kg/day), or vehicle (water), or no treatment (naive). At P28 and P30, we tested the animals, respectively, in the elevated plus maze and open field. At P30-35, we measured CSD parameters (velocity of propagation, amplitude, and duration). Fixative-perfused brains were processed for microglial immunolabeling with anti-Iba1 antibodies to analyze cortical microglia. Rats treated with Gln presented anxiolytic behavior and accelerated CSD propagation when compared to the water and naive control groups. Furthermore, CSD velocity was higher (p litter sizes, and for microglial activation in the L15 groups. Besides confirming previous electrophysiological findings (CSD acceleration after Gln), our data demonstrate for the first time a behavioral effect and microglial activation associated with early Gln treatment in developing animals, possibly operated via changes in brain excitability.

  12. The very large G-protein-coupled receptor VLGR1: a component of the ankle link complex required for the normal development of auditory hair bundles.

    Science.gov (United States)

    McGee, Joann; Goodyear, Richard J; McMillan, D Randy; Stauffer, Eric A; Holt, Jeffrey R; Locke, Kirsten G; Birch, David G; Legan, P Kevin; White, Perrin C; Walsh, Edward J; Richardson, Guy P

    2006-06-14

    Sensory hair bundles in the inner ear are composed of stereocilia that can be interconnected by a variety of different link types, including tip links, horizontal top connectors, shaft connectors, and ankle links. The ankle link antigen is an epitope specifically associated with ankle links and the calycal processes of photoreceptors in chicks. Mass spectrometry and immunoblotting were used to identify this antigen as the avian ortholog of the very large G-protein-coupled receptor VLGR1, the product of the Usher syndrome USH2C (Mass1) locus. Like ankle links, Vlgr1 is expressed transiently around the base of developing hair bundles in mice. Ankle links fail to form in the cochleae of mice carrying a targeted mutation in Vlgr1 (Vlgr1/del7TM), and the bundles become disorganized just after birth. FM1-43 [N-(3-(triethylammonium)propyl)-4-(4-(dibutylamino)styryl)pyridinium dibromide] dye loading and whole-cell recordings indicate mechanotransduction is impaired in cochlear, but not vestibular, hair cells of early postnatal Vlgr1/del7TM mutant mice. Auditory brainstem recordings and distortion product measurements indicate that these mice are severely deaf by the third week of life. Hair cells from the basal half of the cochlea are lost in 2-month-old Vlgr1/del7TM mice, and retinal function is mildly abnormal in aged mutants. Our results indicate that Vlgr1 is required for formation of the ankle link complex and the normal development of cochlear hair bundles.

  13. Normal accidents

    International Nuclear Information System (INIS)

    Perrow, C.

    1989-01-01

    The author has chosen numerous concrete examples to illustrate the hazardousness inherent in high-risk technologies. Starting with the TMI reactor accident in 1979, he shows that it is not only the nuclear energy sector that bears the risk of 'normal accidents', but also quite a number of other technologies and industrial sectors, or research fields. The author refers to the petrochemical industry, shipping, air traffic, large dams, mining activities, and genetic engineering, showing that due to the complexity of the systems and their manifold, rapidly interacting processes, accidents happen that cannot be thoroughly calculated, and hence are unavoidable.

  14. Heuristic Relative Entropy Principles with Complex Measures: Large-Degree Asymptotics of a Family of Multi-variate Normal Random Polynomials

    Science.gov (United States)

    Kiessling, Michael Karl-Heinz

    2017-10-01

    Let z\\in C, let σ ^2>0 be a variance, and for N\\in N define the integrals E_N^{}(z;σ ) := {1/σ } \\int _R\\ (x^2+z^2) e^{-{1/2σ^2 x^2}}{√{2π }}/dx \\quad if N=1, {1/σ } \\int _{R^N} \\prod \\prod \\limits _{1≤ k1. These are expected values of the polynomials P_N^{}(z)=\\prod _{1≤ n≤ N}(X_n^2+z^2) whose 2 N zeros ± i X_k^{}_{k=1,\\ldots ,N} are generated by N identically distributed multi-variate mean-zero normal random variables {X_k}N_{k=1} with co-variance {Cov}_N^{}(X_k,X_l)=(1+σ ^2-1/N)δ _{k,l}+σ ^2-1/N(1-δ _{k,l}). The E_N^{}(z;σ ) are polynomials in z^2, explicitly computable for arbitrary N, yet a list of the first three E_N^{}(z;σ ) shows that the expressions become unwieldy already for moderate N—unless σ = 1, in which case E_N^{}(z;1) = (1+z^2)^N for all z\\in C and N\\in N. (Incidentally, commonly available computer algebra evaluates the integrals E_N^{}(z;σ ) only for N up to a dozen, due to memory constraints). Asymptotic evaluations are needed for the large- N regime. For general complex z these have traditionally been limited to analytic expansion techniques; several rigorous results are proved for complex z near 0. Yet if z\\in R one can also compute this "infinite-degree" limit with the help of the familiar relative entropy principle for probability measures; a rigorous proof of this fact is supplied. Computer algebra-generated evidence is presented in support of a conjecture that a generalization of the relative entropy principle to signed or complex measures governs the N→ ∞ asymptotics of the regime iz\\in R. Potential generalizations, in particular to point vortex ensembles and the prescribed Gauss curvature problem, and to random matrix ensembles, are emphasized.

  15. Short-term arginine deprivation results in large-scale modulation of hepatic gene expression in both normal and tumor cells: microarray bioinformatic analysis

    Directory of Open Access Journals (Sweden)

    Sabo Edmond

    2006-09-01

    Full Text Available. Abstract. Background: We have reported arginine-sensitive regulation of the LAT1 amino acid transporter (SLC7A5) in normal rodent hepatic cells, with loss of arginine sensitivity and high-level constitutive expression in tumor cells. We hypothesized that liver cell gene expression is highly sensitive to alterations in the amino acid microenvironment and that tumor cells may differ substantially in gene sets sensitive to amino acid availability. To assess the potential number and classes of hepatic genes sensitive to arginine availability at the RNA level, and to compare these between normal and tumor cells, we used an Affymetrix microarray approach with a paired in vitro model of normal rat hepatic cells and a tumorigenic derivative, with triplicate independent replicates. Cells were exposed to arginine-deficient or control conditions for 18 hours in medium formulated to maintain differentiated function. Results: Initial two-way analysis with a p-value of 0.05 identified 1419 genes in normal cells versus 2175 in tumor cells whose expression was altered in arginine-deficient conditions relative to controls, representing 9-14% of the rat genome. More stringent bioinformatic analysis with 9-way comparisons and a minimum of 2-fold variation narrowed this set to 56 arginine-responsive genes in normal liver cells and 162 in tumor cells. Approximately half the arginine-responsive genes in normal cells overlap with those in tumor cells. Of these, the majority were increased in expression and included multiple growth, survival, and stress-related genes. GADD45, TA1/LAT1, and caspases 11 and 12 were among this group. Previously known amino acid-regulated genes were among the pool in both cell types. Available cDNA probes allowed independent validation of microarray data for multiple genes. Among genes downregulated under arginine-deficient conditions were multiple genes involved in cholesterol and fatty acid metabolism. Expression of low-density lipoprotein receptor was

  16. Blood pressure normalization in a large population of hypertensive patients treated with perindopril/indapamide combination: results of the OPTIMAX trial

    Directory of Open Access Journals (Sweden)

    Jean-Jacques Mourad

    2007-03-01

    Full Text Available. Jean-Jacques Mourad1, Viet Nguyen1, Marilucy Lopez-Sublet1, Bernard Waeber2. 1Dept Internal Medicine and Hypertension Unit, Avicenne hospital-APHP and Paris 13 University, Bobigny, France; 2Division de Physiopathologie Clinique, Lausanne, Switzerland. Objective: To determine if the fixed-dose perindopril/indapamide combination (Per/Ind) normalizes blood pressure (BP) in the same fraction of hypertensive patients when treated in everyday practice or in controlled trials. Methods: In this prospective trial, 17,938 hypertensive patients were treated with Per 2 mg/Ind 0.625 mg for 3-6 months. In Group 1, Per/Ind was initiated in newly diagnosed patients (n = 7032); in Group 2, Per/Ind replaced previous therapy in patients already treated but having either their BP still uncontrolled or experiencing side-effects (n = 7423); in Group 3, Per/Ind was added to previous treatment in patients with persistently high BP (n = 3483). BP was considered normalized when ≤ 140/90 mm Hg. A multivariate analysis for predictors of BP normalization was performed. Results: Subjects were on average 62 years old and had a baseline BP of 162.3/93.6 mm Hg. After treatment with Per/Ind, BP normalization was reached in 69.6% of patients in the Initiation group, 67.5% in the Replacement group, and 67.4% in the Add-on group (where patients were more frequently at risk, diabetic, or with target organ damage). Mean decreases of 22.8 mm Hg in systolic BP and 12.4 mm Hg in diastolic BP were recorded. Conclusions: This trial was established to reflect everyday clinical practice, and a treatment strategy based on the Per/Ind combination, administered as initial, replacement, or add-on therapy, led to normalization rates that were superior to those observed in Europe in routine practice. These results support recent hypertension guidelines which encourage the use of combination therapy in the management of arterial hypertension. Keywords: perindopril, indapamide, blood

  17. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  18. Low-loss, compact, and fabrication-tolerant Si-wire 90° waveguide bend using clothoid and normal curves for large scale photonic integrated circuits.

    Science.gov (United States)

    Fujisawa, Takeshi; Makino, Shuntaro; Sato, Takanori; Saitoh, Kunimasa

    2017-04-17

    An ultimately low-loss 90° waveguide bend composed of clothoid and normal curves is proposed for dense optical-interconnect photonic integrated circuits. By using clothoid curves at the input and output of the 90° bend, straight and bent waveguides are smoothly connected without increasing the footprint. We find that there is an optimum ratio of clothoid curves in the bend, and that the bending loss can be significantly reduced compared with a normal bend. A 90% reduction of the bending loss for a bending radius of 4 μm is experimentally demonstrated, with excellent agreement between theory and experiment. The performance is compared with a waveguide bend with offset, and the proposed bend is superior to the offset bend in terms of fabrication tolerance.
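
    The clothoid's defining property, curvature growing linearly with arc length, is what removes the curvature jump where a straight guide meets a circular arc. A small numerical sketch of such a transition (the radius R and transition length L below are illustrative values, not parameters from the paper):

    ```python
    # Euler-spiral (clothoid) transition: curvature ramps linearly from 0
    # at the straight section to 1/R where the circular arc begins.
    import math

    R, L = 4.0, 2.0  # illustrative: bend radius 4, clothoid arc length 2

    def curvature(s):
        """kappa(s) = s/(R*L): 0 at s=0, exactly 1/R at s=L."""
        return s / (R * L)

    def clothoid_point(s, steps=1000):
        """Midpoint-rule integration of x(s), y(s) along the clothoid;
        heading theta(s) is the integral of curvature, s^2/(2*R*L)."""
        ds = s / steps
        x = y = 0.0
        for i in range(steps):
            si = (i + 0.5) * ds
            theta = si * si / (2.0 * R * L)
            x += math.cos(theta) * ds
            y += math.sin(theta) * ds
        return x, y

    x_end, y_end = clothoid_point(L)  # endpoint where the circular arc takes over
    ```

    At s = L the heading has turned by L/(2R) = 0.25 rad and the curvature matches the circular bend exactly, so no mode mismatch arises from a curvature discontinuity.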

  19. PARK2, a Large Common Fragile Site Gene, is Part of a Stress Response Network in Normal Cells That is Disrupted During the Development of Ovarian Cancer

    National Research Council Canada - National Science Library

    Smith, David I; Zhu, Yu

    2007-01-01

    .... The central two questions that we want to address with this work are what role inactivation of Parkin and other large CFS genes plays in the development of ovarian cancer and whether these genes

  20. Large enhancement of thermoelectric effects in a tunneling-coupled parallel DQD-AB ring attached to one normal and one superconducting lead

    Science.gov (United States)

    Yao, Hui; Zhang, Chao; Li, Zhi-Jian; Nie, Yi-Hang; Niu, Peng-bin

    2018-05-01

    We theoretically investigate the thermoelectric properties of a tunneling-coupled parallel DQD-AB ring attached to one normal and one superconducting lead. The role of intrinsic and extrinsic parameters in improving thermoelectric properties is discussed. The peak value of the figure of merit near the gap edges increases with decreasing asymmetry parameter; in particular, when the asymmetry parameter is less than 0.5, the figure of merit near the gap edges rises rapidly. When the interdot coupling strength is less than the superconducting gap, the thermopower spectrum presents a single-platform structure, while when the interdot coupling strength is larger than the gap, a double-platform structure appears in the thermopower spectrum. Outside the gap the peak values of the figure of merit can reach the order of 10^2. On the basis of optimizing internal parameters, the thermoelectric conversion efficiency of the device can be further improved by appropriately matching the total magnetic flux and the flux difference between the two subrings.

  1. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  2. Fresh Frozen Plasma Resuscitation Provides Neuroprotection Compared to Normal Saline in a Large Animal Model of Traumatic Brain Injury and Polytrauma

    DEFF Research Database (Denmark)

    Imam, Ayesha; Jin, Guang; Sillesen, Martin

    2015-01-01

    Abstract We have previously shown that early treatment with fresh frozen plasma (FFP) is neuroprotective in a swine model of hemorrhagic shock (HS) and traumatic brain injury (TBI). However, it remains unknown whether this strategy would be beneficial in a more clinical polytrauma model. Yorkshire...... as well as cerebral perfusion pressures. Levels of cerebral eNOS were higher in the FFP-treated group (852.9 vs. 816.4 ng/mL; p=0.03), but no differences in brain levels of ET-1 were observed. Early administration of FFP is neuroprotective in a complex, large animal model of polytrauma, hemorrhage...

  3. Normal black holes in bulge-less galaxies: the largely quiescent, merger-free growth of black holes over cosmic time

    Science.gov (United States)

    Martin, G.; Kaviraj, S.; Volonteri, M.; Simmons, B. D.; Devriendt, J. E. G.; Lintott, C. J.; Smethurst, R. J.; Dubois, Y.; Pichon, C.

    2018-05-01

    Understanding the processes that drive the formation of black holes (BHs) is a key topic in observational cosmology. While the observed M_BH-M_Bulge correlation in bulge-dominated galaxies is thought to be produced by major mergers, the existence of an M_BH-M_⋆ relation across all galaxy morphological types suggests that BHs may be largely built by secular processes. Recent evidence that bulge-less galaxies, which are unlikely to have had significant mergers, are offset from the M_BH-M_Bulge relation but lie on the M_BH-M_⋆ relation has strengthened this hypothesis. Nevertheless, the small size and heterogeneity of current data sets, coupled with the difficulty in measuring precise BH masses, make it challenging to address this issue using empirical studies alone. Here, we use Horizon-AGN, a cosmological hydrodynamical simulation, to probe the role of mergers in BH growth over cosmic time. We show that (1) as suggested by observations, simulated bulge-less galaxies lie offset from the main M_BH-M_Bulge relation, but on the M_BH-M_⋆ relation, (2) the positions of galaxies on the M_BH-M_⋆ relation are not affected by their merger histories, and (3) only ~35 per cent of the BH mass in today's massive galaxies is directly attributable to merging; the majority (~65 per cent) of BH growth therefore takes place gradually, via secular processes, over cosmic time.

  4. Early Science with the Large Millimeter Telescope: Detection of Dust Emission in Multiple Images of a Normal Galaxy at z > 4 Lensed by a Frontier Fields Cluster

    Energy Technology Data Exchange (ETDEWEB)

    Pope, Alexandra; Battisti, Andrew; Wilson, Grant W.; Calzetti, Daniela; Cybulski, Ryan; Giavalisco, Mauro; Kirkpatrick, Allison [Department of Astronomy, University of Massachusetts, Amherst, MA 01003 (United States); Montaña, Alfredo; Aretxaga, Itziar; Hughes, David [Instituto Nacional de Astrofísica, Óptica y Electrónica (INAOE), Luis Enrique Erro 1, Sta. Ma. Tonantzintla, 72840 Puebla (Mexico); Limousin, Marceau [Aix Marseille Univ, CNRS, LAM, Laboratoire d' Astrophysique de Marseille, Marseille (France); Marchesini, Danilo; Kado-Fong, Erin [Department of Physics and Astronomy, Tufts University, Medford, MA 02155 (United States); Alberts, Stacey [Steward Observatory, University of Arizona, 933 North Cherry Avenue, Tucson, AZ 85721 (United States); Avila-Reese, Vladimir [Instituto de Astronomía, Universidad Nacional Autónoma de México, A.P. 70-264, 04510, CDMX (Mexico); Bermejo-Climent, José Ramón [Departamento de Astrofísica, Universidad de La Laguna. Vía Láctea s/n, La Laguna 38200, Tenerife (Spain); Brammer, Gabriel [Space Telescope Science Institute, 3700 San Martin Drive, Baltimore, MD 21218 (United States); Bravo-Alfaro, Hector [Departamento de Astronomia, Universidad de Guanajuato, Apdo. Postal 144, Guanajuato 36000 (Mexico); Chary, Ranga-Ram [Infrared Processing and Analysis Center, MS314-6, California Institute of Technology, Pasadena, CA 91125 (United States); Keller, Erica, E-mail: pope@astro.umass.edu [National Radio Astronomy Observatory, 520 Edgemont Road, Charlottesville, VA 22903 (United States); and others

    2017-04-01

    We directly detect dust emission in an optically detected, multiply imaged galaxy lensed by the Frontier Fields cluster MACSJ0717.5+3745. We detect two images of the same galaxy at 1.1 mm with the AzTEC camera on the Large Millimeter Telescope, leaving no ambiguity in the counterpart identification. This galaxy, MACS0717-Az9, is at z > 4 and the strong lensing model (μ = 7.5) allows us to calculate an intrinsic IR luminosity of 9.7 × 10^10 L_⊙ and an obscured star formation rate of 14.6 ± 4.5 M_⊙ yr^-1. The unobscured star formation rate from the UV is only 4.1 ± 0.3 M_⊙ yr^-1, which means the total star formation rate (18.7 ± 4.5 M_⊙ yr^-1) is dominated (75%-80%) by the obscured component. With an intrinsic stellar mass of only 6.9 × 10^9 M_⊙, MACS0717-Az9 is one of only a handful of z > 4 galaxies at these lower masses that is detected in dust emission. This galaxy lies close to the estimated star formation sequence at this epoch. However, it does not lie on the dust obscuration relation (IRX-β) for local starburst galaxies and is instead consistent with the Small Magellanic Cloud attenuation law. This remarkable lower-mass galaxy, showing signs of both low metallicity and high dust content, may challenge our picture of dust production in the early universe.

  5. Early Science with the Large Millimeter Telescope: Detection of Dust Emission in Multiple Images of a Normal Galaxy at z > 4 Lensed by a Frontier Fields Cluster

    International Nuclear Information System (INIS)

    Pope, Alexandra; Battisti, Andrew; Wilson, Grant W.; Calzetti, Daniela; Cybulski, Ryan; Giavalisco, Mauro; Kirkpatrick, Allison; Montaña, Alfredo; Aretxaga, Itziar; Hughes, David; Limousin, Marceau; Marchesini, Danilo; Kado-Fong, Erin; Alberts, Stacey; Avila-Reese, Vladimir; Bermejo-Climent, José Ramón; Brammer, Gabriel; Bravo-Alfaro, Hector; Chary, Ranga-Ram; Keller, Erica

    2017-01-01

    We directly detect dust emission in an optically detected, multiply imaged galaxy lensed by the Frontier Fields cluster MACSJ0717.5+3745. We detect two images of the same galaxy at 1.1 mm with the AzTEC camera on the Large Millimeter Telescope, leaving no ambiguity in the counterpart identification. This galaxy, MACS0717-Az9, is at z > 4, and the strong lensing model (μ = 7.5) allows us to calculate an intrinsic IR luminosity of 9.7 × 10¹⁰ L⊙ and an obscured star formation rate of 14.6 ± 4.5 M⊙ yr⁻¹. The unobscured star formation rate from the UV is only 4.1 ± 0.3 M⊙ yr⁻¹, which means the total star formation rate (18.7 ± 4.5 M⊙ yr⁻¹) is dominated (75%–80%) by the obscured component. With an intrinsic stellar mass of only 6.9 × 10⁹ M⊙, MACS0717-Az9 is one of only a handful of z > 4 galaxies at these lower masses detected in dust emission. This galaxy lies close to the estimated star formation sequence at this epoch. However, it does not lie on the dust obscuration relation (IRX-β) for local starburst galaxies and is instead consistent with the Small Magellanic Cloud attenuation law. This remarkable lower-mass galaxy, showing signs of both low metallicity and high dust content, may challenge our picture of dust production in the early universe.

  6. Molecular analysis of immunoglobulin variable genes supports a germinal center experienced normal counterpart in primary cutaneous diffuse large B-cell lymphoma, leg-type.

    Science.gov (United States)

    Pham-Ledard, Anne; Prochazkova-Carlotti, Martina; Deveza, Mélanie; Laforet, Marie-Pierre; Beylot-Barry, Marie; Vergier, Béatrice; Parrens, Marie; Feuillard, Jean; Merlio, Jean-Philippe; Gachard, Nathalie

    2017-11-01

    The immunophenotype of primary cutaneous diffuse large B-cell lymphoma, leg-type (PCLBCL-LT) suggests a germinal center-experienced B lymphocyte (BCL2+ MUM1+ BCL6+/-). As the maturation history of a B cell is "imprinted" on the immunoglobulin gene sequence during B-cell development, we studied the structure and sequence of the variable part of the genes (IGHV, IGLV, IGKV), immunoglobulin surface expression, and features of class switching in order to determine the PCLBCL-LT cell of origin. Clonality analysis with the BIOMED2 protocol and VH leader primers was done on DNA extracted from frozen skin biopsies of retrospective samples from 14 patients. The clonal DNA IGHV sequence of the tumor was aligned and compared with the closest germline sequence, and the homology percentage was calculated. Superantigen binding sites were studied. Features of selection pressure were evaluated with the multinomial Lossos model. A functional monoclonal sequence was observed in all 14 cases, as determined for IGHV (10), IGLV (2) or IGKV (3). IGV mutation rates were high (>5%) in all cases but one (median: 15.5%), with superantigen binding site conservation. Features of selection pressure were identified in 11/12 interpretable cases, more frequently negative (75%) than positive (25%). Intraclonal variation was detected in 3 of 8 tumor specimens, with a low rate of mutations. Surface immunoglobulin was an IgM in 12/12 cases. FISH analysis of the IGHM locus, which is deleted during class switching, showed heterozygous IGHM gene deletion in half of the cases. Genomic PCR analysis confirmed the deletions within the switch μ region. IGV sequences were highly mutated but functional, with negative features of selection pressure suggesting one or more germinal center passage(s) with somatic hypermutation, but with superantigen (SpA) binding site conservation. Genetic features of class switch were observed, but on the nonfunctional allele and co-existing with primary isotype IgM expression. These data suggest that the cell of origin is ...

  7. Relationship between body composition, body mass index and bone mineral density in a large population of normal, osteopenic and osteoporotic women.

    Science.gov (United States)

    Andreoli, A; Bazzocchi, A; Celi, M; Lauro, D; Sorge, R; Tarantino, U; Guglielmi, G

    2011-10-01

    The knowledge of factors modulating the behaviour of bone mass is crucial for preventing and treating osteoporotic disease; among these factors, body weight (BW) has been shown to be of primary importance in postmenopausal women. Nevertheless, the relative effects of body composition indices are still being debated. Our aim was to analyze the relationship between body mass index (BMI), fat and lean mass and bone mineral density (BMD) in a large population of women. Moreover, this study represents a first important report on reference standard values for body composition in Italian women. Between 2005 and 2008, the weight and height of 6,249 Italian women (aged 30-80 years) were measured and BMI was calculated; furthermore, BMD, bone mineral content, fat and lean mass were measured by dual-energy X-ray absorptiometry. Individuals were divided into five groups by decade (group 1, 30.0-39.9; group 2, 40.0-49.9; group 3, 50.0-59.9; group 4, 60.0-69.9; group 5, 70.0-79.9). Differences among decades for all variables were calculated using one-way analysis of variance (ANOVA) and the Bonferroni test in SPSS. Mean BW was 66.8±12.1 kg, mean height 159.1±6.3 cm and mean BMI 26.4±4.7 kg/m². According to BW and BMI, there was an increase of obesity with age, especially in women older than 50 years. The prevalence of osteopenia and osteoporosis in the examined population was 43.0% and 16.7%, respectively. Our data show that obesity significantly decreased the risk for osteoporosis but did not decrease the risk for osteopenia. It is strongly recommended that a robust policy regarding prevention of osteopenia and osteoporosis be commenced. An overall examination of our results suggests that both fat and lean body mass can influence bone mass and that their relative effect on bone could be modulated by their absolute amount and ratio to total BW.
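
The mean BMI reported above can be verified directly from the cohort's mean weight and height (a simple editorial check, not from the paper):

```python
# BMI = weight (kg) / height (m)^2, checked against the cohort means above.
mean_weight_kg = 66.8
mean_height_m = 1.591

bmi = mean_weight_kg / mean_height_m ** 2
print(f"mean BMI = {bmi:.1f} kg/m^2")  # 26.4, matching the reported mean
```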

  8. Malware Normalization

    OpenAIRE

    Christodorescu, Mihai; Kinder, Johannes; Jha, Somesh; Katzenbeisser, Stefan; Veith, Helmut

    2005-01-01

    Malware is code designed for a malicious purpose, such as obtaining root privilege on a host. A malware detector identifies malware and thus prevents it from adversely affecting a host. In order to evade detection by malware detectors, malware writers use various obfuscation techniques to transform their malware. There is strong evidence that commercial malware detectors are susceptible to these evasion tactics. In this paper, we describe the design and implementation of a malware normalizer ...

  9. Reconstructing Normality

    DEFF Research Database (Denmark)

    Gildberg, Frederik Alkier; Bradley, Stephen K.; Fristed, Peter Billeskov

    2012-01-01

    Forensic psychiatry is an area of priority for the Danish Government. As the field expands, this calls for increased knowledge about mental health nursing practice, as this is part of the forensic psychiatry treatment offered. However, only sparse research exists in this area. The aim of this study...... was to investigate the characteristics of forensic mental health nursing staff interaction with forensic mental health inpatients and to explore how staff give meaning to these interactions. The project included 32 forensic mental health staff members, with over 307 hours of participant observations, 48 informal....... The intention is to establish a trusting relationship to form behaviour and perceptual-corrective care, which is characterized by staff's endeavours to change, halt, or support the patient's behaviour or perception in relation to staff's perception of normality. The intention is to support and teach the patient...

  10. Pursuing Normality

    DEFF Research Database (Denmark)

    Madsen, Louise Sofia; Handberg, Charlotte

    2018-01-01

    implying an influence on whether to participate in cancer survivorship care programs. Because of "pursuing normality," 8 of 9 participants opted out of cancer survivorship care programming due to prospects of "being cured" and perceptions of cancer survivorship care as "a continuation of the disease......BACKGROUND: The present study explored the reflections on cancer survivorship care of lymphoma survivors in active treatment. Lymphoma survivors have survivorship care needs, yet their participation in cancer survivorship care programs is still reported as low. OBJECTIVE: The aim of this study...... was to understand the reflections on cancer survivorship care of lymphoma survivors to aid the future planning of cancer survivorship care and overcome barriers to participation. METHODS: Data were generated in a hematological ward during 4 months of ethnographic fieldwork, including participant observation and 46...

  11. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', the whole country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people was stressed by several components of the multi-pronged campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has been well received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations, journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  12. Evaluation of {sup 123}I-orthoiodohippurate single kidney clearance rate by renal sequential scintigraphy in a large cohort of likely normal subjects aged between 0 and 18 years

    Energy Technology Data Exchange (ETDEWEB)

    Imperiale, Alessio; Olianti, Catia; Comis, Giannetto; Cava, Giuseppe la [University of Florence, Nuclear Medicine Unit, Department of Clinical Pathophysiology, Florence (Italy)

    2006-12-15

    Age-related values of ¹²³I-orthoiodohippurate (OIH) single-kidney clearance rate (Cl) were estimated in a large cohort of likely normal children aged between 0 and 18 years. Among 4,111 children examined in the past 10 years, 917 were selected with the following inclusion criteria: (a) mild ultrasonographic hydronephrosis with right differential renal function (DRF) <53% and >47% (498 pts); (b) known or suspected urinary tract infection with normal ultrasound, serum creatinine and DMSA, and DRF <53% and >47% (419 pts). ¹²³I-OIH-Cl was assessed using a validated gamma camera method. Children were divided into 21 age classes: from 0 to 2 years, eight 3-month classes; from 2 to 14 years, twelve 1-year classes; from 14 to 18 years, one 4-year class. Cl, plotted against age, was fitted using an increasing function of the form y = a − b·e^(−cx). Mean ¹²³I-OIH-Cl of 1,834 kidneys was 306 ± 22 ml/min/1.73 m² BSA. Mean ¹²³I-OIH-Cl of the right and left kidneys was 307 ± 23 and 305 ± 22 ml/min/1.73 m² BSA, respectively (p<0.002). The best-fitting ¹²³I-OIH-Cl growth function was Cl = 311 − 230·e^(−0.69 × age), with age in months. ¹²³I-OIH-Cl improved progressively starting from birth, reaching 96% and 98% of the mature value at 1 and 1.5 years, respectively. ¹²³I-OIH-Cl at birth (age = 0) was 81 ml/min/1.73 m² BSA. After 18.6 days of life, the renal function had doubled its starting value, and it reached a plateau of 311 ml/min/1.73 m² BSA at 2 years. This work represents a systematic evaluation of ERPF by a gamma camera method in a large cohort of selected likely normal paediatric subjects. (orig.)
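
A sketch of how such a saturating-exponential model Cl = a − b·e^(−c·age) can be fitted; the "data" below are generated from the published parameters purely to illustrate the procedure, and this is not the authors' code:

```python
import numpy as np

# The published age-dependence model: Cl = a - b*exp(-c*age), age in months.
def clearance_model(age, a, b, c):
    return a - b * np.exp(-c * age)

# Synthetic "measurements" generated from the published parameters
# (a=311, b=230, c=0.69), purely to exercise the fitting procedure.
age = np.linspace(0, 60, 50)
cl = clearance_model(age, 311.0, 230.0, 0.69)

# For a fixed decay rate c the model is linear in (a, b), so we can profile
# over c, solving an ordinary least-squares problem at each candidate value.
best = None
for c in np.linspace(0.1, 2.0, 400):
    X = np.column_stack([np.ones_like(age), -np.exp(-c * age)])
    (a, b), *_ = np.linalg.lstsq(X, cl, rcond=None)
    sse = float(np.sum((clearance_model(age, a, b, c) - cl) ** 2))
    if best is None or sse < best[0]:
        best = (sse, a, b, c)

_, a, b, c = best
print(f"fitted: a = {a:.0f}, b = {b:.0f}, c = {c:.2f}")
```

Profiling over the single nonlinear parameter avoids the need for a general nonlinear solver here; on real clearance measurements one would fit the same model to the observed (age, Cl) pairs.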

  13. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...
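
The approximation the article introduces can be sketched in a few lines of standard-library Python (an editorial illustration, not the article's own example):

```python
import math

# Exact Binomial(n, p) CDF by direct summation (fine for moderate n).
def binom_cdf(k, n, p):
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

# Standard normal CDF via the error function.
def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

# DeMoivre-Laplace: Binomial(n, p) is close to N(np, np(1-p)) for large n.
n, p, k = 1000, 0.5, 520
mu, sigma = n * p, math.sqrt(n * p * (1 - p))

exact = binom_cdf(k, n, p)
approx = normal_cdf((k + 0.5 - mu) / sigma)  # +0.5 is the continuity correction
print(f"P(X <= {k}): exact = {exact:.4f}, normal approx = {approx:.4f}")
```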

  14. Idiopathic Normal Pressure Hydrocephalus

    Directory of Open Access Journals (Sweden)

    Basant R. Nassar BS

    2016-04-01

    Full Text Available Idiopathic normal pressure hydrocephalus (iNPH is a potentially reversible neurodegenerative disease commonly characterized by a triad of dementia, gait, and urinary disturbance. Advancements in diagnosis and treatment have aided in properly identifying and improving symptoms in patients. However, a large proportion of iNPH patients remain either undiagnosed or misdiagnosed. Using PubMed search engine of keywords “normal pressure hydrocephalus,” “diagnosis,” “shunt treatment,” “biomarkers,” “gait disturbances,” “cognitive function,” “neuropsychology,” “imaging,” and “pathogenesis,” articles were obtained for this review. The majority of the articles were retrieved from the past 10 years. The purpose of this review article is to aid general practitioners in further understanding current findings on the pathogenesis, diagnosis, and treatment of iNPH.

  15. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus is a brain disorder ... Normal pressure hydrocephalus occurs when excess cerebrospinal fluid ...

  16. Histone H1 interphase phosphorylation becomes largely established in G1 or early S phase and differs in G1 between T-lymphoblastoid cells and normal T cells

    Directory of Open Access Journals (Sweden)

    Gréen Anna

    2011-08-01

    Full Text Available Abstract Background Histone H1 is an important constituent of chromatin, and is involved in regulation of its structure. During the cell cycle, chromatin becomes locally decondensed in S phase, highly condensed during metaphase, and again decondensed before re-entry into G1. This has been connected to increasing phosphorylation of H1 histones through the cell cycle. However, many of these experiments have been performed using cell-synchronization techniques and cell cycle-arresting drugs. In this study, we investigated the H1 subtype composition and phosphorylation pattern in the cell cycle of normal human activated T cells and Jurkat T-lymphoblastoid cells by capillary electrophoresis after sorting of exponentially growing cells into G1, S and G2/M populations. Results We found that the relative amount of H1.5 protein increased significantly after T-cell activation. Serine phosphorylation of H1 subtypes occurred to a large extent in late G1 or early S phase in both activated T cells and Jurkat cells. Furthermore, our data confirm that the H1 molecules newly synthesized during S phase achieve a similar phosphorylation pattern to the previous ones. Jurkat cells had more extended H1.5 phosphorylation in G1 compared with T cells, a difference that can be explained by faster cell growth and/or the presence of enhanced H1 kinase activity in G1 in Jurkat cells. Conclusion Our data are consistent with a model in which a major part of interphase H1 phosphorylation takes place in G1 or early S phase. This implies that H1 serine phosphorylation may be coupled to changes in chromatin structure necessary for DNA replication. In addition, the increased H1 phosphorylation of malignant cells in G1 may be affecting the G1/S transition control and enabling facilitated S-phase entry as a result of relaxed chromatin condensation. Furthermore, increased H1.5 expression may be coupled to the proliferative capacity of growth-stimulated T cells.

  17. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    As we know, normalization is a pre-processing stage for any type of problem statement. Normalization plays an especially important role in fields such as soft computing and cloud computing, for manipulating data, i.e., scaling the range of data down or up before it is used in a further stage. There are many normalization techniques, namely min-max normalization, Z-score normalization and decimal scaling normalization. By referring to these normalization techniques we are ...
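
The three techniques the abstract names can be sketched in plain Python (an editorial illustration; the decimal-scaling convention here divides by the smallest power of ten that maps every value strictly inside (-1, 1)):

```python
import math

def min_max(xs, new_min=0.0, new_max=1.0):
    """Rescale values linearly into [new_min, new_max]."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) * (new_max - new_min) + new_min for x in xs]

def z_score(xs):
    """Center on the mean and scale by the (population) standard deviation."""
    mean = sum(xs) / len(xs)
    std = math.sqrt(sum((x - mean) ** 2 for x in xs) / len(xs))
    return [(x - mean) / std for x in xs]

def decimal_scaling(xs):
    # Divide by 10^j, the smallest power of ten making every |value| < 1.
    j = math.floor(math.log10(max(abs(x) for x in xs))) + 1
    return [x / 10 ** j for x in xs]

data = [200, 300, 400, 600, 1000]
print(min_max(data))          # [0.0, 0.125, 0.25, 0.5, 1.0]
print(decimal_scaling(data))  # [0.02, 0.03, 0.04, 0.06, 0.1]
```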

  18. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K − λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well known is the fact that those eigenvectors can be normalized so that their modal mass μ = ϕᵀMϕ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has in turn interesting theoretical implications.
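
A small numerical illustration of the conventional route the abstract contrasts itself with, namely explicit mass-normalization of the modes (an editorial sketch on hypothetical 2-DOF matrices, not the paper's residue-based method):

```python
import numpy as np

# Small SPD test matrices (hypothetical 2-DOF system).
K = np.array([[4.0, -1.0], [-1.0, 3.0]])  # stiffness
M = np.array([[2.0, 0.5], [0.5, 1.0]])    # mass

# Reduce (K - lambda*M) phi = 0 to a standard symmetric eigenproblem via the
# Cholesky factor of M: with M = L L^T, solve A y = lambda y where
# A = L^{-1} K L^{-T}, then back-transform phi = L^{-T} y.
L = np.linalg.cholesky(M)
A = np.linalg.solve(L, np.linalg.solve(L, K).T).T
eigvals, Y = np.linalg.eigh(A)   # Y has orthonormal columns
Phi = np.linalg.solve(L.T, Y)    # modes with unit modal mass: phi^T M phi = 1

modal_mass = Phi.T @ M @ Phi     # should be the identity matrix
print(np.round(modal_mass, 8))
```

Because Y is orthonormal, ΦᵀMΦ = YᵀL⁻¹(LLᵀ)L⁻ᵀY = YᵀY = I, i.e. the back-transformed modes come out mass-normalized without a separate scaling step.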

  19. Normal foot and ankle

    International Nuclear Information System (INIS)

    Weissman, S.D.

    1989-01-01

    The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptomatology to the patient. The author discusses a normal radiograph. The bones must have normal shape and normal alignment. The density of the soft tissues should be normal, and there should be no fractures, tumors, or foreign bodies.

  20. A large scale analysis of cDNA in Arabidopsis thaliana: generation of 12,028 non-redundant expressed sequence tags from normalized and size-selected cDNA libraries.

    Science.gov (United States)

    Asamizu, E; Nakamura, Y; Sato, S; Tabata, S

    2000-06-30

    For comprehensive analysis of genes expressed in the model dicotyledonous plant, Arabidopsis thaliana, expressed sequence tags (ESTs) were accumulated. Normalized and size-selected cDNA libraries were constructed from aboveground organs, flower buds, roots, green siliques and liquid-cultured seedlings, respectively, and a total of 14,026 5'-end ESTs and 39,207 3'-end ESTs were obtained. The 3'-end ESTs could be clustered into 12,028 non-redundant groups. Similarity search of the non-redundant ESTs against the public non-redundant protein database indicated that 4816 groups show similarity to genes of known function, 1864 to hypothetical genes, and the remaining 5348 are novel sequences. Gene coverage by the non-redundant ESTs was analyzed using the annotated genomic sequences of approximately 10 Mb on chromosomes 3 and 5. A total of 923 regions were hit by at least one EST, among which only 499 regions were hit by the ESTs deposited in the public database. The result indicates that the EST source generated in this project complements the EST data in the public database and facilitates new gene discovery.

  1. Baby Poop: What's Normal?

    Science.gov (United States)

    ... I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answers from Jay L. Hoecker, M.D. Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, ...

  2. Visual Memories Bypass Normalization.

    Science.gov (United States)

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores-neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.
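
Divisive normalization, the canonical computation the study tests, is conventionally sketched in Carandini-Heeger style: each unit's response is divided by a semi-saturation constant plus the pooled activity of its neighbors. The code below is a generic editorial sketch of that computation, not the authors' model:

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Each unit's driven input is raised to a power, then divided by a
    semi-saturation constant plus the summed (pooled) activity."""
    powered = np.asarray(drive, dtype=float) ** n
    return powered / (sigma ** n + powered.sum())

# Stronger pooled input suppresses every individual response.
alone = divisive_normalization([4.0, 0.0, 0.0])
crowded = divisive_normalization([4.0, 3.0, 3.0])
print(alone[0], crowded[0])  # the same unit responds less when its pool is active
```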

  3. Normal Pressure Hydrocephalus

    Science.gov (United States)

    ... improves the chance of a good recovery. Without treatment, symptoms may worsen and cause death. What research is being done? The NINDS conducts and supports research on neurological disorders, including normal pressure hydrocephalus. Research on disorders such ...

  4. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  5. Normal pressure hydrocephalus

    Science.gov (United States)

    Hydrocephalus - occult; Hydrocephalus - idiopathic; Hydrocephalus - adult; Hydrocephalus - communicating; Dementia - hydrocephalus; NPH

  6. Normal Functioning Family

    Science.gov (United States)

    Is there any way ...

  7. Normal growth and development

    Science.gov (United States)

    ... page: //medlineplus.gov/ency/article/002456.htm Normal growth and development. A child's growth and development can be divided into four periods: ...

  8. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
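
Ordinary quantile normalization, the method qsmooth generalizes, can be sketched as follows (an editorial illustration of the baseline technique; for the actual qsmooth method see the authors' package at the URL above):

```python
import numpy as np

def quantile_normalize(X):
    """Force every column (sample) of X (features x samples) to share the
    same empirical distribution: the mean of the sorted columns."""
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)  # per-column ranks
    reference = np.sort(X, axis=0).mean(axis=1)        # mean quantile profile
    return reference[ranks]

# Toy expression matrix: 4 features x 3 samples.
X = np.array([[5.0, 4.0, 3.0],
              [2.0, 1.0, 4.0],
              [3.0, 4.0, 6.0],
              [4.0, 2.0, 8.0]])
Xn = quantile_normalize(X)
print(Xn)
```

After normalization, every column contains exactly the reference quantiles in the original rank order, which is precisely the "same statistical distribution across samples" assumption that qsmooth relaxes to within-group equality.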

  9. Large-Scale Occupational Skills Normalization for Online Recruitment

    OpenAIRE

    Hoang, Phuong; Mahoney, Thomas; Javed, Faizan; McNair, Matt

    2018-01-01

    Job openings often go unfulfilled despite a surfeit of unemployed or underemployed workers. One of the main reasons for this is a mismatch between the skills required by employers and the skills that workers possess. This mismatch, also known as the skills gap, can pose socioeconomic challenges for an economy. A first step in alleviating the skills gap is to accurately detect skills in human capital data such as resumes and job ads. Comprehensive and accurate detection of skills facilitates a...

  10. The Importance of Normalization on Large and Heterogeneous Microarray Datasets

    Science.gov (United States)

    DNA microarray technology is a powerful functional genomics tool increasingly used for investigating global gene expression in environmental studies. Microarrays can also be used in identifying biological networks, as they give insight on the complex gene-to-gene interactions, ne...

  11. Monitoring the normal body

    DEFF Research Database (Denmark)

    Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte

    2015-01-01

    of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them as are health authorities' recommendations. Despite not belonging to an extreme BMI category...... provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods : The study is based on in-depth interviews combined with observations. 24 participants were...... recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results : Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety...

  12. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process......This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse...

  13. Normalization of satellite imagery

    Science.gov (United States)

    Kim, Hongsuk H.; Elman, Gregory C.

    1990-01-01

    Sets of Thematic Mapper (TM) imagery taken over the Washington, DC metropolitan area during the months of November, March and May were converted into a form of ground reflectance imagery. This conversion was accomplished by adjusting for the incident sunlight and view angles and by applying a pixel-by-pixel correction for atmospheric effects. Seasonal color changes of the area can be better observed when such normalization is applied to space imagery taken in time series. In normalized imagery, the grey scale depicts variations in surface reflectance, and the tonal signature of multi-band color imagery can be directly interpreted for quantitative information about the target.
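
One standard piece of the radiometric normalization described, converting at-sensor radiance to top-of-atmosphere reflectance with a sun-angle correction, looks like this. This is a generic textbook formula with illustrative made-up band values, not the paper's full pixel-by-pixel atmospheric correction:

```python
import math

def toa_reflectance(radiance, esun, sun_elevation_deg, earth_sun_dist_au=1.0):
    """Convert at-sensor spectral radiance (W/m^2/sr/um) to top-of-atmosphere
    reflectance, correcting for band solar irradiance (esun) and sun angle."""
    sun_zenith = math.radians(90.0 - sun_elevation_deg)
    return (math.pi * radiance * earth_sun_dist_au ** 2) / (esun * math.cos(sun_zenith))

# Hypothetical TM-band numbers, purely illustrative.
rho = toa_reflectance(radiance=80.0, esun=1533.0, sun_elevation_deg=45.0)
print(f"TOA reflectance = {rho:.3f}")
```

Normalizing a time series this way removes the seasonal variation in sun elevation, so that remaining grey-scale differences reflect the surface rather than the illumination.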

  14. The normal holonomy group

    International Nuclear Information System (INIS)

    Olmos, C.

    1990-05-01

    The restricted holonomy group of a Riemannian manifold is a compact Lie group and its representation on the tangent space is a product of irreducible representations and a trivial one. Each of the non-trivial factors is either an orthogonal representation of a connected compact Lie group which acts transitively on the unit sphere, or it is the isotropy representation of a single Riemannian symmetric space of rank ≥ 2. We prove that all these properties are also true for the representation on the normal space of the restricted normal holonomy group of any submanifold of a space of constant curvature. 4 refs

  15. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post-Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  16. Medically-enhanced normality

    DEFF Research Database (Denmark)

    Møldrup, Claus; Traulsen, Janine Morgall; Almarsdóttir, Anna Birna

    2003-01-01

    Objective: To consider public perspectives on the use of medicines for non-medical purposes, a usage called medically-enhanced normality (MEN). Method: Examples from the literature were combined with empirical data derived from two Danish research projects: a Delphi internet study and a Telebus...

  17. The Normal Fetal Pancreas.

    Science.gov (United States)

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744). Pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development. © 2017 by the American Institute of Ultrasound in Medicine.

  18. a Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  19. Large transverse momentum phenomena

    International Nuclear Information System (INIS)

    Brodsky, S.J.

    1977-09-01

    It is pointed out that it is particularly significant that the quantum numbers of the leading particles are strongly correlated with the quantum numbers of the incident hadrons indicating that the valence quarks themselves are transferred to large p/sub t/. The crucial question is how they get there. Various hadron reactions are discussed covering the structure of exclusive reactions, inclusive reactions, normalization of inclusive cross sections, charge correlations, and jet production at large transverse momentum. 46 references

  20. Normal Weight Dyslipidemia

    DEFF Research Database (Denmark)

    Ipsen, David Hojland; Tveden-Nyborg, Pernille; Lykkesfeldt, Jens

    2016-01-01

    Objective: The liver coordinates lipid metabolism and may play a vital role in the development of dyslipidemia, even in the absence of obesity. Normal weight dyslipidemia (NWD) and patients with nonalcoholic fatty liver disease (NAFLD) who do not have obesity constitute a unique subset of individuals characterized by dyslipidemia and metabolic deterioration. This review examined the available literature on the role of the liver in dyslipidemia and the metabolic characteristics of patients with NAFLD who do not have obesity. Methods: PubMed was searched using the following keywords: nonobese, dyslipidemia, NAFLD, NWD, liver, and metabolically obese/unhealthy normal weight. Additionally, article bibliographies were screened, and relevant citations were retrieved. Studies were excluded if they had not measured relevant biomarkers of dyslipidemia. Results: NWD and NAFLD without obesity share a similar...

  1. Ethics and "normal birth".

    Science.gov (United States)

    Lyerly, Anne Drapkin

    2012-12-01

    The concept of "normal birth" has been promoted as ideal by several international organizations, although debate about its meaning is ongoing. In this article, I examine the concept of normalcy to explore its ethical implications and raise a trio of concerns. First, in its emphasis on nonuse of technology as a goal, the concept of normalcy may marginalize women for whom medical intervention is necessary or beneficial. Second, in its emphasis on birth as a socially meaningful event, the mantra of normalcy may unintentionally avert attention from meaning in medically complicated births. Third, the emphasis on birth as a normal and healthy event may be a contributor to the long-standing tolerance for the dearth of evidence guiding the treatment of illness during pregnancy and the failure to responsibly and productively engage pregnant women in health research. Given these concerns, it is worth debating not just what "normal birth" means, but whether the term as an ideal earns its keep. © 2012, Copyright the Authors Journal compilation © 2012, Wiley Periodicals, Inc.

  2. Normal Female Reproductive Anatomy

    Science.gov (United States)

    Title: Reproductive System, Female, Anatomy. Description: Anatomy of the female reproductive system (drawing).

  3. Normal Pancreas Anatomy

    Science.gov (United States)

    Title: Pancreas Anatomy. Description: Anatomy of the pancreas (drawing).

  4. Quenches in large superconducting magnets

    International Nuclear Information System (INIS)

    Eberhard, P.H.; Alston-Garnjost, M.; Green, M.A.; Lecomte, P.; Smits, R.G.; Taylor, J.D.; Vuillemin, V.

    1977-08-01

    The development of large high current density superconducting magnets requires an understanding of the quench process by which the magnet goes normal. A theory which describes the quench process in large superconducting magnets is presented and compared with experimental measurements. The use of a quench theory to improve the design of large high current density superconducting magnets is discussed

  5. Linear regression and the normality assumption.

    Science.gov (United States)

    Schmidt, Amand F; Finan, Chris

    2017-12-16

    Researchers often perform arbitrary outcome transformations to fulfill the normality assumption of a linear regression model. This commentary explains and illustrates that in large data settings, such transformations are often unnecessary, and, worse, may bias model estimates. Linear regression assumptions are illustrated using simulated data and an empirical example on the relation between time since type 2 diabetes diagnosis and glycated hemoglobin levels. Simulation results were evaluated on coverage; i.e., the proportion of times the 95% confidence interval included the true slope coefficient. Although outcome transformations bias point estimates, violations of the normality assumption in linear regression analyses do not. The normality assumption is necessary to unbiasedly estimate standard errors, and hence confidence intervals and P-values. However, in large sample sizes (e.g., where the number of observations per variable is >10) violations of this normality assumption often do not noticeably impact results. Contrary to this, assumptions on the parametric model, absence of extreme observations, homoscedasticity, and independence of the errors remain influential even in large sample size settings. Given that modern healthcare research typically includes thousands of subjects, focusing on the normality assumption is often unnecessary, does not guarantee valid results, and, worse, may bias estimates due to the practice of outcome transformations. Copyright © 2017 Elsevier Inc. All rights reserved.
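
The coverage simulation summarized in this record is easy to reproduce in outline. The sketch below is an assumed setup (numpy only, not the authors' code): it fits ordinary least squares on data with heavily skewed errors and counts how often the normal-theory 95% confidence interval covers the true slope. With a few hundred observations, coverage stays close to the nominal level despite the normality violation.

```python
import numpy as np

def slope_ci_covers(true_slope=2.0, n=500, reps=200, seed=0):
    """Fraction of 95% CIs for the OLS slope that contain the true slope,
    with heavily skewed (exponential) errors, i.e. normality is violated."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = rng.uniform(0, 10, n)
        eps = rng.exponential(1.0, n) - 1.0          # skewed, mean-zero errors
        y = 1.0 + true_slope * x + eps
        xm, ym = x - x.mean(), y - y.mean()
        b1 = (xm @ ym) / (xm @ xm)                   # OLS slope estimate
        resid = ym - b1 * xm
        se = np.sqrt(resid @ resid / (n - 2) / (xm @ xm))
        if abs(b1 - true_slope) < 1.96 * se:         # normal-theory 95% CI
            hits += 1
    return hits / reps
```

Rerunning with a transformed outcome (e.g. log(y)) would change what "slope" means and bias the point estimate, which is the commentary's warning about reflexive transformations.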

  6. Theory of normal metals

    International Nuclear Information System (INIS)

    Mahan, G.D.

    1992-01-01

    The organizers requested that I give eight lectures on the theory of normal metals, ''with an eye on superconductivity.'' My job was to cover the general properties of metals. The topics were selected according to what the students would need to know for the following lectures on superconductivity. My role was to prepare the ground work for the later lectures. The problem is that there is not yet a widely accepted theory for the mechanism which pairs the electrons. Many mechanisms have been proposed, with those of phonons and spin fluctuations having the most followers. So I tried to discuss both topics. I also introduced the tight-binding model for metals, which forms the basis for most of the work on the cuprate superconductors

  7. Strength of Gamma Rhythm Depends on Normalization

    Science.gov (United States)

    Ray, Supratim; Ni, Amy M.; Maunsell, John H. R.

    2013-01-01

    Neuronal assemblies often exhibit stimulus-induced rhythmic activity in the gamma range (30–80 Hz), whose magnitude depends on the attentional load. This has led to the suggestion that gamma rhythms form dynamic communication channels across cortical areas processing the features of behaviorally relevant stimuli. Recently, attention has been linked to a normalization mechanism, in which the response of a neuron is suppressed (normalized) by the overall activity of a large pool of neighboring neurons. In this model, attention increases the excitatory drive received by the neuron, which in turn also increases the strength of normalization, thereby changing the balance of excitation and inhibition. Recent studies have shown that gamma power also depends on such excitatory–inhibitory interactions. Could modulation in gamma power during an attention task be a reflection of the changes in the underlying excitation–inhibition interactions? By manipulating the normalization strength independent of attentional load in macaque monkeys, we show that gamma power increases with increasing normalization, even when the attentional load is fixed. Further, manipulations of attention that increase normalization increase gamma power, even when they decrease the firing rate. Thus, gamma rhythms could be a reflection of changes in the relative strengths of excitation and normalization rather than playing a functional role in communication or control. PMID:23393427
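
The normalization mechanism this record refers to is usually written as a divisive operation. The toy function below is a generic textbook form of the normalization model, not the authors' fitted model: a neuron's driven response is divided by the pooled activity of neighboring neurons plus a semisaturation constant; the exponent n, the constant sigma, and all names are illustrative.

```python
import numpy as np

def divisive_normalization(drive, pool, sigma=1.0, n=2.0):
    """Normalization-model response: a neuron's excitatory drive divided by
    the summed activity of a pool of neighbors plus a semisaturation constant."""
    drive = np.asarray(drive, dtype=float)
    pool = np.asarray(pool, dtype=float)
    return drive**n / (sigma**n + np.sum(pool**n))
```

Increasing attention in this framework raises the excitatory drive, which also increases the pooled normalization signal; the study's point is that gamma power tracks this excitation-normalization balance rather than firing rate per se.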

  8. Large deviations

    CERN Document Server

    Varadhan, S R S

    2016-01-01

    The theory of large deviations deals with rates at which probabilities of certain events decay as a natural parameter in the problem varies. This book, which is based on a graduate course on large deviations at the Courant Institute, focuses on three concrete sets of examples: (i) diffusions with small noise and the exit problem, (ii) large time behavior of Markov processes and their connection to the Feynman-Kac formula and the related large deviation behavior of the number of distinct sites visited by a random walk, and (iii) interacting particle systems, their scaling limits, and large deviations from their expected limits. For the most part the examples are worked out in detail, and in the process the subject of large deviations is developed. The book will give the reader a flavor of how large deviation theory can help in problems that are not posed directly in terms of large deviations. The reader is assumed to have some familiarity with probability, Markov processes, and interacting particle systems.

  9. Data-driven intensity normalization of PET group comparison studies is superior to global mean normalization

    DEFF Research Database (Denmark)

    Borghammer, Per; Aanerud, Joel; Gjedde, Albert

    2009-01-01

    BACKGROUND: Global mean (GM) normalization is one of the most commonly used methods of normalization in PET and SPECT group comparison studies of neurodegenerative disorders. It requires that no between-group GM difference is present, which may be strongly violated in neurodegenerative disorders. Importantly, such GM differences often elude detection due to the large intrinsic variance in absolute values of cerebral blood flow or glucose consumption. Alternative methods of normalization are needed for this type of data. MATERIALS AND METHODS: Two types of simulation were performed using CBF images...
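
For context, global mean normalization itself is a one-line rescaling, which is exactly what makes it vulnerable to the problem described: if one group's true global mean is lower, rescaling artificially inflates all of that group's regional values. The sketch below uses a hypothetical function name and an arbitrary target value.

```python
import numpy as np

def global_mean_normalize(image, target=50.0):
    """Scale an image so its global mean equals a fixed target
    (e.g. a nominal CBF value). Valid only if the groups being
    compared have no true between-group global mean difference."""
    img = np.asarray(image, dtype=float)
    return img * (target / img.mean())
```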

  10. Short proofs of strong normalization

    OpenAIRE

    Wojdyga, Aleksander

    2008-01-01

    This paper presents simple, syntactic strong normalization proofs for the simply-typed lambda-calculus and the polymorphic lambda-calculus (system F) with the full set of logical connectives and all the permutative reductions. The normalization proofs use translations of terms and types into systems for which the strong normalization property is already known.

  11. Normal modes of Bardeen discs

    International Nuclear Information System (INIS)

    Verdaguer, E.

    1983-01-01

    The short wavelength normal modes of self-gravitating rotating polytropic discs in the Bardeen approximation are studied. The discs' oscillations can be seen in terms of two types of modes: the p-modes, whose driving forces are pressure forces, and the r-modes, driven by Coriolis forces. As a consequence of differential rotation, coupling between the two takes place and some mixed modes appear; their properties can be studied under the assumption of weak coupling, and it is seen that they avoid the crossing of the p- and r-modes. The short wavelength analysis provides a basis for the classification of the modes, which can be made by using the properties of their phase diagrams. The classification is applied to the large wavelength modes of differentially rotating discs with strong coupling and to a uniformly rotating sequence with no coupling, which have been calculated in previous papers. Many of the physical properties and qualitative features of these modes are revealed by the analysis. (author)

  12. Congenital anomalies and normal skeletal variants

    International Nuclear Information System (INIS)

    Guebert, G.M.; Yochum, T.R.; Rowe, L.J.

    1987-01-01

    Congenital anomalies and normal skeletal variants are a common occurrence in clinical practice. In this chapter a large number of skeletal anomalies of the spine and pelvis are reviewed. Some of the more common skeletal anomalies of the extremities are also presented. The second section of this chapter deals with normal skeletal variants. Some of these variants may simulate certain disease processes. In some instances there are no clear-cut distinctions between skeletal variants and anomalies; therefore, there may be some overlap of material. The congenital anomalies are presented initially with accompanying text, photos, and references, beginning with the skull and proceeding caudally through the spine to then include the pelvis and extremities. The normal skeletal variants section is presented in an anatomical atlas format without text or references

  13. X-ray emission from normal galaxies

    International Nuclear Information System (INIS)

    Speybroeck, L. van; Bechtold, J.

    1981-01-01

    A summary of results obtained with the Einstein Observatory is presented. There are two general categories of normal galaxy investigation being pursued - detailed studies of nearby galaxies where individual sources can be detected and possibly correlated with galactic morphology, and shorter observations of many more distant objects to determine the total luminosity distribution of normal galaxies. The principal examples of the first type are the CFA study of M31 and the Columbia study of the Large Magellanic Cloud. The Columbia normal galaxy survey is the principal example of the second type, although there also are smaller CFA programs concentrating on early galaxies and peculiar galaxies, and MIT has observed some members of the local group. (Auth.)

  14. Large deviations

    CERN Document Server

    Deuschel, Jean-Dominique; Deuschel, Jean-Dominique

    2001-01-01

    This is the second printing of the book first published in 1988. The first four chapters of the volume are based on lectures given by Stroock at MIT in 1987. They form an introduction to the basic ideas of the theory of large deviations and make a suitable package on which to base a semester-length course for advanced graduate students with a strong background in analysis and some probability theory. A large selection of exercises presents important material and many applications. The last two chapters present various non-uniform results (Chapter 5) and outline the analytic approach that allows...

  15. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only few cases of bicervical normal uterus with normal vagina exist in the literature; one of the cases had an anterior‑posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of mullerian anomalies and suggests that a complex interplay of events ...

  16. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
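
A loose sketch of the reference-probe idea follows. This is not the published GN algorithm; the similarity rule, the parameter k, and all names are simplified assumptions: for each probe, pick the k probes whose control responses are most similar, and normalize the probe's signal by the mean control signal of that reference set, so that local probe effects and global scale are corrected in one step.

```python
import numpy as np

def group_normalize(signal, control, k=5):
    """Normalize each probe's signal by the mean control signal of the
    k probes with the most similar control responses (reference set)."""
    signal = np.asarray(signal, dtype=float)
    control = np.asarray(control, dtype=float)
    out = np.empty_like(signal)
    for i in range(signal.size):
        # reference set: probes whose control response is closest to probe i's
        ref = np.argsort(np.abs(control - control[i]))[:k]
        out[i] = signal[i] / control[ref].mean()
    return out
```

Because the reference set is chosen per probe, the correction adapts to nonlinear and higher-order probe effects that a single global factor (or quantile mapping) cannot capture, which is the advantage the abstract claims.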

  18. Defining the "normal" postejaculate urinalysis.

    Science.gov (United States)

    Mehta, Akanksha; Jarow, Jonathan P; Maples, Pat; Sigman, Mark

    2012-01-01

    Although sperm have been shown to be present in the postejaculate urinalysis (PEU) of both fertile and infertile men, the number of sperm present in the PEU of the general population has never been well defined. The objective of this study was to describe the semen and PEU findings in both the general and infertile population, in order to develop a better appreciation for "normal." Infertile men (n = 77) and control subjects (n = 71) were prospectively recruited. Exclusion criteria included azoospermia and medications known to affect ejaculation. All men underwent a history, physical examination, semen analysis, and PEU. The urine was split into 2 containers: PEU1, the initial voided urine, and PEU2, the remaining voided urine. Parametric statistical methods were applied for data analysis to compare sperm concentrations in each sample of semen and urine between the 2 groups of men. Controls had a higher average semen volume (3.3 ± 1.6 vs 2.0 ± 1.4 mL) and higher sperm concentrations (112 million vs 56.2 million; P = .011) compared with infertile men. The presence of sperm in urine was common in both groups, but more prevalent among infertile men (98.7% vs 88.7%, P = .012), in whom it comprised a greater proportion of the total sperm count (46% vs 24%, P = .022). The majority of sperm present in PEU were seen in PEU1 of both controls (69%) and infertile men (88%). An association was noted between severe oligospermia and low sperm counts in the PEU. Although infertile men had more sperm in the urine compared with controls, there is a large degree of overlap between the 2 populations, making it difficult to identify a specific threshold to define a positive test. Interpretation of a PEU should be directed by whether the number of sperm in the urine could affect subsequent management.

  19. U.S. Climate Normals Product Suite (1981-2010)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Climate Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations...

  20. Comparison of spectrum normalization techniques for univariate ...

    Indian Academy of Sciences (India)

    Keywords: Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract: Analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, and normalization with background, along with their ...

  1. Normal matter storage of antiprotons

    International Nuclear Information System (INIS)

    Campbell, L.J.

    1987-01-01

    Various simple issues connected with the possible storage of anti p in relative proximity to normal matter are discussed. Although equilibrium storage looks to be impossible, condensed matter systems are sufficiently rich and controllable that nonequilibrium storage is well worth pursuing. Experiments to elucidate the anti p interactions with normal matter are suggested. 32 refs

  2. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    Faults in the earth crust occur within large range of scales from microscale over mesoscopic to large basin scale faults. Frequently deformation associated with faulting is not only limited to the fault plane alone, but rather forms a combination with continuous near field deformation in the wall rock, a phenomenon that is generally called fault drag. The correct interpretation and recognition of fault drag is fundamental for the reconstruction of the fault history and determination of fault kinematics, as well as prediction in areas of limited exposure or beyond comprehensive seismic resolution. Based on fault analyses derived from 3D visualization of natural examples of fault drag, the importance of fault geometry for the deformation of marker horizons around faults is investigated. The complex 3D structural models presented here are based on a combination of geophysical datasets and geological fieldwork. On an outcrop scale example of fault drag in the hanging wall of a normal fault, located at St. Margarethen, Burgenland, Austria, data from Ground Penetrating Radar (GPR) measurements, detailed mapping and terrestrial laser scanning were used to construct a high-resolution structural model of the fault plane, the deformed marker horizons and associated secondary faults. In order to obtain geometrical information about the largely unexposed master fault surface, a standard listric balancing dip domain technique was employed. The results indicate that for this normal fault a listric shape can be excluded, as the constructed fault has a geologically meaningless shape cutting upsection into the sedimentary strata. This kinematic modeling result is additionally supported by the observation of deformed horizons in the footwall of the structure. Alternatively, a planar fault model with reverse drag of markers in the hanging wall and footwall is proposed. A second part of this thesis investigates a large scale normal fault...

  3. Role of the normal gut microbiota.

    Science.gov (United States)

    Jandhyala, Sai Manasa; Talukdar, Rupjyoti; Subramanyam, Chivkula; Vuyyuru, Harish; Sasikala, Mitnala; Nageshwar Reddy, D

    2015-08-07

    The relation between the gut microbiota and human health is being increasingly recognised. It is now well established that a healthy gut flora is largely responsible for overall health of the host. The normal human gut microbiota comprises two major phyla, namely Bacteroidetes and Firmicutes. Though the gut microbiota in an infant appears haphazard, it starts resembling the adult flora by the age of 3 years. Nevertheless, there exist temporal and spatial variations in the microbial distribution from esophagus to the rectum all along the individual's life span. Developments in genome sequencing technologies and bioinformatics have now enabled scientists to study these microorganisms and their function and microbe-host interactions in an elaborate manner both in health and disease. The normal gut microbiota imparts specific function in host nutrient metabolism, xenobiotic and drug metabolism, maintenance of structural integrity of the gut mucosal barrier, immunomodulation, and protection against pathogens. Several factors play a role in shaping the normal gut microbiota. They include (1) the mode of delivery (vaginal or caesarean); (2) diet during infancy (breast milk or formula feeds) and adulthood (vegan based or meat based); and (3) use of antibiotics or antibiotic like molecules that are derived from the environment or the gut commensal community. A major concern of antibiotic use is the long-term alteration of the normal healthy gut microbiota and horizontal transfer of resistance genes that could result in reservoir of organisms with a multidrug resistant gene pool.

  4. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  6. Large ethics.

    Science.gov (United States)

    Chambers, David W

    2008-01-01

    This essay presents an alternative to the traditional view that ethics means judging individual behavior against standards of right and wrong. Instead, ethics is understood as creating ethical communities through the promises we make to each other. The "aim" of ethics is to demonstrate in our own behavior a credible willingness to work to create a mutually better world. The "game" of ethics then becomes searching for strategies that overlap with others' strategies so that we are all better for intending to act on a basis of reciprocal trust. This is a difficult process because we have partial, simultaneous, shifting, and inconsistent views of the world. But despite the reality that we each "frame" ethics in personal terms, it is still possible to create sufficient common understanding to prosper together. Large ethics does not make it a prerequisite for moral behavior that everyone adheres to a universally agreed set of ethical principles; all that is necessary is sufficient overlap in commitment to searching for better alternatives.

  7. Normal mode analysis and applications in biological physics.

    Science.gov (United States)

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

    Normal mode analysis has become a popular and often used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics including continuum elastic theory, the elastic network model, and a new all-atom method, recently developed, which is capable of computing a subset of the low frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
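
    As a toy illustration of the eigen-decomposition that underlies all of these methods (a minimal bead-spring sketch in the spirit of the elastic network model, not the all-atom method described in the review), the normal modes of a free one-dimensional chain can be obtained by diagonalizing its tridiagonal Hessian; the zero-frequency mode is rigid translation and the lowest nonzero modes are the long-wavelength collective motions:

```python
import numpy as np

def chain_normal_modes(n_beads=8, k=1.0):
    """Normal modes of a free 1D bead-spring chain (toy elastic network).

    The Hessian is tridiagonal; its eigenvectors are the normal modes and
    its eigenvalues the squared frequencies (unit masses assumed).
    """
    H = np.zeros((n_beads, n_beads))
    for i in range(n_beads - 1):
        H[i, i] += k
        H[i + 1, i + 1] += k
        H[i, i + 1] -= k
        H[i + 1, i] -= k
    freqs_sq, modes = np.linalg.eigh(H)  # eigenvalues in ascending order
    return freqs_sq, modes

w2, modes = chain_normal_modes()
# w2[0] is (numerically) zero: the rigid-translation mode; the next few
# modes are the low-frequency, large-amplitude collective motions
```

The same pattern (build a quadratic energy model, diagonalize) is what the continuum elastic and elastic network methods scale up to whole capsids.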

  8. Complete Normal Ordering 1: Foundations

    CERN Document Server

    Ellis, John; Skliros, Dimitri P.

    2016-01-01

    We introduce a new prescription for quantising scalar field theories perturbatively around a true minimum of the full quantum effective action, which is to `complete normal order' the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all `cephalopod' Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of `complete normal ordering' (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative i...

  9. Neutron scattering by normal liquids

    Energy Technology Data Exchange (ETDEWEB)

    Gennes, P.G. de [Commissariat a l' Energie Atomique, Saclay (France). Centre d' Etudes Nucleaires

    1961-07-01

    Neutron data on motions in normal liquids well below the critical point are reviewed and classified according to the order of magnitude of momentum transfers ħq and energy transfers ħω. For large momentum transfers a perfect gas model is valid. For smaller q and incoherent scattering, the major effects are related to the existence of two characteristic times: the period of oscillation of an atom in its cell, and the average lifetime of the atom in a definite cell. Various interpolation schemes covering both time scales are discussed. For coherent scattering and intermediate q, the energy spread is expected to show a minimum whenever q corresponds to a diffraction peak. For very small q the standard macroscopic description of density fluctuations is applicable. The limits of the various (q) and (ω) domains and the validity of various approximations are discussed by a method of moments. The possibility of observing discrete transitions due to internal degrees of freedom in polyatomic molecules, in spite of the 'Doppler width' caused by translational motions, is also examined. (author)

  10. Sphalerons, deformed sphalerons and normal modes

    International Nuclear Information System (INIS)

    Brihaye, Y.; Kunz, J.; Oldenburg Univ.

    1992-01-01

    Topological arguments suggest that the Weinberg-Salam model possesses unstable solutions, sphalerons, representing the top of energy barriers between inequivalent vacua of the gauge theory. In the limit of vanishing Weinberg angle, such unstable solutions are known: the sphaleron of Klinkhamer and Manton and, at large values of the Higgs mass, in addition the deformed sphalerons. Here a systematic study of the discrete normal modes about these sphalerons over the full range of the Higgs mass is presented. The emergence of deformed sphalerons at critical values of the Higgs mass is seen to be related to the crossing of zero of the eigenvalue of a particular normal mode about the sphaleron. 6 figs., 1 tab., 19 refs. (author)

  11. The normal and pathological language

    OpenAIRE

    Espejo, Luis D.

    2014-01-01

    The extraordinary development that normal and pathological psychology has achieved in recent decades, thanks to the dual method of objective observation and oral survey, has enabled the inquiring spirit of the neuro-psychiatrist to penetrate the intimate mechanism of the nervous system, whose supreme manifestation is thought. It is normal psychology that explains the complicated interplay of perceptions: their modes of transmission, their centers of projection, their transformations, and their synthesis to construct ...

  12. Is normal science good science?

    Directory of Open Access Journals (Sweden)

    Adrianna Kępińska

    2015-09-01

    “Normal science” is a concept introduced by Thomas Kuhn in The Structure of Scientific Revolutions (1962). In Kuhn’s view, normal science means “puzzle solving”: solving problems within the paradigm, the framework most successful in solving current major scientific problems, rather than producing major novelties. This paper examines Kuhnian and Popperian accounts of normal science and their criticisms to assess whether normal science is good. The advantage of normal science according to Kuhn was “psychological”: subjective satisfaction from successful “puzzle solving”. Popper argues for an “intellectual” science, one that consistently refutes conjectures (hypotheses) and offers new ideas rather than focusing on personal advantages. His account is criticized as too impersonal and idealistic. Feyerabend’s perspective seems more balanced; he argues for a community that would introduce new ideas, defend old ones, and enable scientists to develop in line with their subjective preferences. The paper concludes that normal science has no one clear-cut set of criteria encompassing its meaning and enabling clear assessment.

  13. nth roots of normal contractions

    International Nuclear Information System (INIS)

    Duggal, B.P.

    1992-07-01

    Given a complex separable Hilbert space H and a contraction A on H such that A^n, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 − A*A)^{1/2} is of the Hilbert-Schmidt class, then A is similar to a normal contraction, either A or A^2 is normal, and if A^2 is normal (but A is not) then there is a normal contraction N and a positive definite contraction P of trace class such that ‖A − N‖_1 = (1/2)‖P + P‖_1 (where ‖·‖_1 denotes the trace norm). If T is a compact contraction such that its characteristic function admits a scalar factor, if T = A^n for some integer n ≥ 2 and contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs

  14. Three-Dimensional Geologic Characterization of a Great Basin Geothermal System: Astor Pass, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    Mayhew, Brett; Siler, Drew L; Faulds, James E

    2013-09-30

    The Great Basin, western USA, exhibits anomalously high heat flow (~75 ± 5 mW/m²) and active faulting and extension, resulting in ~430 known geothermal systems. Recent studies have shown that steeply dipping normal faults in transtensional pull-aparts are a common structural control of these Great Basin geothermal systems. The Astor Pass blind (no surface expression) geothermal system, Nevada, lies along the boundary between the Basin and Range to the east and the Walker Lane to the west. Across this boundary, strain is transferred from dextral shear in the Walker Lane to west-northwest directed extension in the Basin and Range, resulting in a transtensional setting consisting of both northwest-striking, left-stepping dextral faults and northerly striking normal faults. Previous studies indicate that Astor Pass was controlled by the intersection of a northwest-striking dextral normal fault and a north-northwest-striking normal-dextral fault bounding the western side of the Terraced Hills. Drilling (to ~1200 m) has revealed fluid temperatures of ~94°C, confirming a blind geothermal system. Expanding upon previous work and employing interpretation of 2D seismic reflection data, additional detailed geologic mapping, and well cuttings analysis, a 3-dimensional geologic model of the Astor Pass geothermal system was constructed. The 3D model indicates a complex interaction/intersection area of three discrete fault zones: a northwest-striking dextral-normal fault, a north-northwest-striking normal-dextral fault, and a north-striking, west-dipping normal fault. These two discrete, critically-stressed intersection areas plunge moderately to steeply to the NW-NNW and probably act as conduits for upwelling geothermal fluids.

  15. Spatial normalization of array-CGH data

    Directory of Open Access Journals (Sweden)

    Brennetot Caroline

    2006-05-01

    Abstract Background Array-based comparative genomic hybridization (array-CGH) is a recently developed technique for analyzing changes in DNA copy number. As in all microarray analyses, normalization is required to correct for experimental artifacts while preserving the true biological signal. We investigated various sources of systematic variation in array-CGH data and identified two distinct types of spatial effect of no biological relevance as the predominant experimental artifacts: continuous spatial gradients and local spatial bias. Local spatial bias affects a large proportion of arrays, and has not previously been considered in array-CGH experiments. Results We show that existing normalization techniques do not correct these spatial effects properly. We therefore developed an automatic method for the spatial normalization of array-CGH data. This method makes it possible to delineate and to eliminate and/or correct areas affected by spatial bias. It is based on the combination of a spatial segmentation algorithm called NEM (Neighborhood Expectation Maximization) and spatial trend estimation. We defined quality criteria for array-CGH data, demonstrating significant improvements in data quality with our method for three data sets coming from two different platforms (198, 175 and 26 BAC-arrays). Conclusion We have designed an automatic algorithm for the spatial normalization of BAC CGH-array data, preventing the misinterpretation of experimental artifacts as biologically relevant outliers in the genomic profile. This algorithm is implemented in the R package MANOR (Micro-Array NORmalization), which is described at http://bioinfo.curie.fr/projects/manor and available from the Bioconductor site http://www.bioconductor.org. It can also be tested on the CAPweb bioinformatics platform at http://bioinfo.curie.fr/CAPweb.
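
    As a much simpler stand-in for the spatial trend estimation step (an illustrative sketch only; it is not the NEM segmentation or the MANOR package itself), a continuous spatial gradient across an array can be approximated by a least-squares plane fit over the spot coordinates and subtracted:

```python
import numpy as np

def remove_spatial_gradient(log_ratios):
    """Fit a plane a*row + b*col + c to the array and subtract it.

    A crude sketch of continuous spatial-gradient correction; the actual
    MANOR method combines NEM segmentation with spatial trend estimation.
    """
    rows, cols = np.indices(log_ratios.shape)
    A = np.column_stack([rows.ravel(), cols.ravel(),
                         np.ones(log_ratios.size)])
    coef, *_ = np.linalg.lstsq(A, log_ratios.ravel(), rcond=None)
    return log_ratios - (A @ coef).reshape(log_ratios.shape)

# synthetic 20x20 array: a north-south gradient plus measurement noise
rng = np.random.default_rng(0)
rows, cols = np.indices((20, 20))
raw = 0.05 * rows + rng.normal(0.0, 0.1, (20, 20))
corrected = remove_spatial_gradient(raw)
```

After correction the residual variation is dominated by noise rather than the spatial artifact, which is the point of this normalization step.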

  16. Glymphatic MRI in idiopathic normal pressure hydrocephalus.

    Science.gov (United States)

    Ringstad, Geir; Vatnehol, Svein Are Sirirud; Eide, Per Kristian

    2017-10-01

    The glymphatic system has in previous studies been shown as fundamental to clearance of waste metabolites from the brain interstitial space, and is proposed to be instrumental in normal ageing and brain pathology such as Alzheimer's disease and brain trauma. Assessment of glymphatic function using magnetic resonance imaging with intrathecal contrast agent as a cerebrospinal fluid tracer has so far been limited to rodents. We aimed to image cerebrospinal fluid flow characteristics and glymphatic function in humans, and applied the methodology in a prospective study of 15 idiopathic normal pressure hydrocephalus patients (mean age 71.3 ± 8.1 years, three female and 12 male) and eight reference subjects (mean age 41.1 ± 13.0 years, six female and two male) with suspected cerebrospinal fluid leakage (seven) and intracranial cyst (one). The imaging protocol included T1-weighted magnetic resonance imaging with equal sequence parameters before and at multiple time points through 24 h after intrathecal injection of the contrast agent gadobutrol at the lumbar level. All study subjects were kept in the supine position between examinations during the first day. Gadobutrol enhancement was measured at all imaging time points from regions of interest placed at predefined locations in brain parenchyma, the subarachnoid and intraventricular space, and inside the sagittal sinus. Parameters demonstrating gadobutrol enhancement and clearance in different locations were compared between idiopathic normal pressure hydrocephalus and reference subjects. A characteristic flow pattern in idiopathic normal pressure hydrocephalus was ventricular reflux of gadobutrol from the subarachnoid space followed by transependymal gadobutrol migration. At the brain surfaces, gadobutrol propagated antegradely along large leptomeningeal arteries in all study subjects, and preceded glymphatic enhancement in adjacent brain tissue, indicating a pivotal role of intracranial pulsations for glymphatic function.

  17. Quantum transport in graphene normal-metal superconductor- normal-metal structures

    Directory of Open Access Journals (Sweden)

    H. Mohammadpour

    2008-06-01

    We study the transport of electrons in a graphene NSN structure in which two normal regions are connected by a superconducting strip of thickness d. Within the Dirac-Bogoliubov-de Gennes equations we describe the transmission through the contact in terms of different scattering processes consisting of quasiparticle cotunneling and local and crossed Andreev reflections. Compared to a fully normal structure we show that the angular dependence of the transmission probability is significantly modified by the presence of superconducting correlations. This modification can be explained in terms of the interplay between Andreev reflection and Klein tunneling of chiral quasiparticles. We further analyze the energy dependence of the resulting differential conductance of the structure. The subgap differential conductance shows features of Andreev reflection and cotunneling processes, and tends to the values of an NS structure for large d. Above the gap, the differential conductance shows an oscillatory behavior with energy even at very large d.

  18. Precaval retropancreatic space: Normal anatomy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeon Hee; Kim, Ki Whang; Kim, Myung Jin; Yoo, Hyung Sik; Lee, Jong Tae [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    1992-07-15

    The authors defined the precaval retropancreatic space as the space between the pancreatic head with portal vein and the IVC, and analyzed the CT findings of this space to determine its normal structures and size. We retrospectively evaluated 100 normal abdominal CT scans to identify the normal anatomic structures of the precaval retropancreatic space. We also measured the distances between these structures and calculated the minimum, maximum and mean values. At the splenoportal confluence level, normal structures between the portal vein and IVC were vessels (21%), lymph nodes (19%), and the caudate lobe of the liver (2%), in order of frequency. The maximum AP diameter of the portocaval lymph node was 4 mm. The common bile duct (CBD) was seen in 44%, with a mean diameter of 3 mm and a maximum of 11 mm. The CBD was located extrapancreatic (75%) and lateral (60.6%) to the pancreatic head. At the IVC-left renal vein level, the maximum distance between CBD and IVC was 5 mm, and the only structure between the posterior pancreatic surface and IVC was fat tissue. Knowledge of these normal structures and measurements will be helpful in differentiating pancreatic masses from retropancreatic masses such as lymphadenopathy.

  19. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
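
    A pointwise version of such intervals is easy to sketch by Monte Carlo: simulate many standard-normal samples, sort each, and take quantiles of each order statistic. (The paper's intervals are further calibrated so that coverage is simultaneous across all n points; that calibration step is omitted in this sketch, so these bands are narrower than properly simultaneous ones.)

```python
import numpy as np

def order_stat_envelope(n=50, alpha=0.05, n_sim=5000, seed=1):
    """Pointwise (1 - alpha) envelopes for the order statistics of a
    standard-normal sample, for overlaying on a normal probability plot."""
    rng = np.random.default_rng(seed)
    # each row is a sorted standard-normal sample of size n
    sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)
    lo = np.quantile(sims, alpha / 2, axis=0)      # lower band, per point
    hi = np.quantile(sims, 1 - alpha / 2, axis=0)  # upper band, per point
    return lo, hi

lo, hi = order_stat_envelope()
# an observed sample (standardized and sorted) falling outside [lo, hi]
# at some point casts doubt on normality
```

Widening these pointwise bands until all n points are covered jointly with probability 1-α is exactly the calibration problem the paper addresses.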

  20. 3j Symbols: To Normalize or Not to Normalize?

    Science.gov (United States)

    van Veenendaal, Michel

    2011-01-01

    The systematic use of alternative normalization constants for 3j symbols can lead to a more natural expression of quantities, such as vector products and spherical tensor operators. The redefined coupling constants directly equate tensor products to the inner and outer products without any additional square roots. The approach is extended to…

  1. Normal myogenic cells from newborn mice restore normal histology to degenerating muscles of the mdx mouse

    International Nuclear Information System (INIS)

    Morgan, J.E.; Hoffman, E.P.; Partridge, T.A.

    1990-01-01

    Dystrophin deficiency in skeletal muscle of the x-linked dystrophic (mdx) mouse can be partially remedied by implantation of normal muscle precursor cells (mpc). However, it is difficult to determine whether this biochemical rescue results in any improvement in the structure or function of the treated muscle, because the vigorous regeneration of mdx muscle more than compensates for the degeneration. By using x-ray irradiation to prevent mpc proliferation, it is possible to study loss of mdx muscle fibers without the complicating effect of simultaneous fiber regeneration. Thus, improvements in fiber survival resulting from any potential therapy can be detected easily. Here, we have implanted normal mpc, obtained from newborn mice, into such preirradiated mdx muscles, finding that it is far more extensively permeated and replaced by implanted mpc than is nonirradiated mdx muscle; this is evident both from analysis of glucose-6-phosphate isomerase isoenzyme markers and from immunoblots and immunostaining of dystrophin in the treated muscles. Incorporation of normal mpc markedly reduces the loss of muscle fibers and the deterioration of muscle structure which otherwise occurs in irradiated mdx muscles. Surprisingly, the regenerated fibers are largely peripherally nucleated, whereas regenerated mouse skeletal muscle fibers are normally centrally nucleated. We attribute this regeneration of apparently normal muscle to the tendency of newborn mouse mpc to recapitulate their neonatal ontogeny, even when grafted into 3-wk-old degenerating muscle

  2. CT and MRI normal findings

    International Nuclear Information System (INIS)

    Moeller, T.B.; Reif, E.

    1998-01-01

    This book gives answers to questions frequently heard, especially from trainees and doctors not specialising in radiology: Is that a normal finding? How do I decide? What are the objective criteria? The information presented is three-fold. The normal findings of the usual CT and MRI examinations are shown in high-quality pictures serving as a reference, with important additional information inscribed on measures, angles and other criteria describing the normal conditions. These criteria are further explained and evaluated in accompanying texts, which also teach the systematic approach for individual picture analysis and include a check list of major aspects as a didactic guide for learning. The book is primarily intended for students, radiographers, radiology trainees and doctors from other medical fields, but radiology specialists will also find useful details of help in special cases. (orig./CB)

  3. Marrow transfusions into normal recipients

    International Nuclear Information System (INIS)

    Brecher, G.

    1983-01-01

    During the past several years we have explored the transfusion of bone marrow into normal, nonirradiated mice. While transfused marrow proliferates readily in irradiated animals, only minimal proliferation takes place in nonirradiated recipients. It has generally been assumed that this was due to the lack of available proliferative sites in recipients with normal marrow. Last year we were able to report that the transfusion of 200 million bone marrow cells (about 2/3 of the total complement of marrow cells of a normal mouse) resulted in 20% to 25% of the recipient's marrow being replaced by donor marrow. Thus we can now study the behavior of animals that carry both transfused (donor) and endogenous (recipient) marrow cells, although none of the tissues of either donor or recipient have been irradiated. With these animals we hope to investigate the nature of the peculiar phenomenon of serial exhaustion of marrow, also referred to as the limited self-replicability of stem cells.

  4. The construction of normal expectations

    DEFF Research Database (Denmark)

    Quitzau, Maj-Britt; Røpke, Inge

    2008-01-01

    The gradual upward changes of standards in normal everyday life have significant environmental implications, and it is therefore important to study how these changes come about. The intention of the article is to analyze the social construction of normal expectations through a case study. The case concerns the present boom in bathroom renovations in Denmark, which offers an excellent opportunity to study the interplay between a wide variety of consumption drivers and social changes pointing toward long-term changes of normal expectations regarding bathroom standards. The study is problem-oriented and transdisciplinary and draws on a wide range of sociological, anthropological, and economic theories. The empirical basis comprises a combination of statistics, a review of magazine and media coverage, visits to exhibitions, and qualitative interviews. A variety of consumption drivers are identified. Among...

  5. Normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  6. Random Generators and Normal Numbers

    OpenAIRE

    Bailey, David H.; Crandall, Richard E.

    2002-01-01

    Pursuant to the authors' previous chaotic-dynamical model for random digits of fundamental constants, we investigate a complementary, statistical picture in which pseudorandom number generators (PRNGs) are central. Some rigorous results are achieved: We establish b-normality for constants of the form $\\sum_i 1/(b^{m_i} c^{n_i})$ for certain sequences $(m_i), (n_i)$ of integers. This work unifies and extends previously known classes of explicit normals. We prove that for coprime $b,c>1$ the...
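
    For a concrete instance (chosen here for illustration: the Stoneham-type constant α = Σ_{n≥1} 1/(3ⁿ·2^(3ⁿ)), which has the form $\sum_i 1/(b^{m_i} c^{n_i})$ with b = 2, c = 3), exact rational arithmetic suffices to extract leading binary digits and observe the roughly balanced bit frequency that b-normality predicts in the limit:

```python
from fractions import Fraction

def stoneham_bits(n_bits=2000):
    """First n_bits binary digits of alpha = sum_{n>=1} 1/(3^n * 2^(3^n)).

    Terms with 3^n > n_bits lie far beyond the requested precision, so a
    short partial sum is exact for the leading n_bits digits.
    """
    n_terms = 1
    while 3 ** n_terms <= n_bits:
        n_terms += 1
    alpha = sum(Fraction(1, 3 ** n * 2 ** (3 ** n))
                for n in range(1, n_terms + 1))
    scaled = alpha * 2 ** n_bits
    # floor(alpha * 2^n_bits), zero-padded to recover leading zero bits
    return bin(scaled.numerator // scaled.denominator)[2:].zfill(n_bits)

bits = stoneham_bits()
freq = bits.count("1") / len(bits)  # hovers near 1/2 for a 2-normal number
```

This only illustrates the statistical flavor of normality; the paper's contribution is the rigorous proof, not digit counting.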

  7. Deconstructing Interocular Suppression: Attention and Divisive Normalization.

    Directory of Open Access Journals (Sweden)

    Hsin-Hung Li

    2015-10-01

    In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.

  8. Deconstructing Interocular Suppression: Attention and Divisive Normalization.

    Science.gov (United States)

    Li, Hsin-Hung; Carrasco, Marisa; Heeger, David J

    2015-10-01

    In interocular suppression, a suprathreshold monocular target can be rendered invisible by a salient competitor stimulus presented in the other eye. Despite decades of research on interocular suppression and related phenomena (e.g., binocular rivalry, flash suppression, continuous flash suppression), the neural processing underlying interocular suppression is still unknown. We developed and tested a computational model of interocular suppression. The model included two processes that contributed to the strength of interocular suppression: divisive normalization and attentional modulation. According to the model, the salient competitor induced a stimulus-driven attentional modulation selective for the location and orientation of the competitor, thereby increasing the gain of neural responses to the competitor and reducing the gain of neural responses to the target. Additional suppression was induced by divisive normalization in the model, similar to other forms of visual masking. To test the model, we conducted psychophysics experiments in which both the size and the eye-of-origin of the competitor were manipulated. For small and medium competitors, behavioral performance was consonant with a change in the response gain of neurons that responded to the target. But large competitors induced a contrast-gain change, even when the competitor was split between the two eyes. The model correctly predicted these results and outperformed an alternative model in which the attentional modulation was eye specific. We conclude that both stimulus-driven attention (selective for location and feature) and divisive normalization contribute to interocular suppression.
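
    The divisive normalization component of such a model can be sketched in a few lines (a generic Heeger-style normalization equation with hypothetical parameter values, not the fitted model of the paper): the competitor's contrast enters the normalization pool and divisively suppresses the target's response at every target contrast.

```python
import numpy as np

def target_response(c_target, c_competitor=0.0, w=1.0, n=2.0, sigma=0.1):
    """Contrast response of a target-tuned unit under divisive normalization.

    c_target, c_competitor: stimulus contrasts in [0, 1]
    w, n, sigma: pool weight, exponent, semi-saturation (all hypothetical)
    """
    drive = c_target ** n
    pool = c_target ** n + w * c_competitor ** n + sigma ** n
    return drive / pool

contrasts = np.linspace(0.0, 1.0, 11)
alone = target_response(contrasts)
suppressed = target_response(contrasts, c_competitor=0.5)
# the competitor divisively reduces the target response at every nonzero
# contrast; attentional modulation (the paper's second process) would
# additionally rescale drive and pool weights
```

Whether such suppression appears behaviorally as a response-gain or contrast-gain change is exactly what the paper's size and eye-of-origin manipulations probe.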

  9. Complete normal ordering 1: Foundations

    Directory of Open Access Journals (Sweden)

    John Ellis

    2016-08-01

    We introduce a new prescription for quantising scalar field theories (in generic spacetime dimension and background) perturbatively around a true minimum of the full quantum effective action, which is to ‘complete normal order’ the bare action of interest. When the true vacuum of the theory is located at zero field value, the key property of this prescription is the automatic cancellation, to any finite order in perturbation theory, of all tadpole and, more generally, all ‘cephalopod’ Feynman diagrams. The latter are connected diagrams that can be disconnected into two pieces by cutting one internal vertex, with either one or both pieces free from external lines. In addition, this procedure of ‘complete normal ordering’ (which is an extension of the standard field theory definition of normal ordering) reduces by a substantial factor the number of Feynman diagrams to be calculated at any given loop order. We illustrate explicitly the complete normal ordering procedure and the cancellation of cephalopod diagrams in scalar field theories with non-derivative interactions, and by using a point splitting ‘trick’ we extend this result to theories with derivative interactions, such as those appearing as non-linear σ-models in the world-sheet formulation of string theory. We focus here on theories with trivial vacua, generalising the discussion to non-trivial vacua in a follow-up paper.

  10. Normal forms in Poisson geometry

    NARCIS (Netherlands)

    Marcut, I.T.

    2013-01-01

    The structure of Poisson manifolds is highly nontrivial even locally. The first important result in this direction is Conn's linearization theorem around fixed points. One of the main results of this thesis (Theorem 2) is a normal form theorem in Poisson geometry, which is the Poisson-geometric

  11. Mixed normal inference on multicointegration

    NARCIS (Netherlands)

    Boswijk, H.P.

    2009-01-01

    Asymptotic likelihood analysis of cointegration in I(2) models, see Johansen (1997, 2006), Boswijk (2000) and Paruolo (2000), has shown that inference on most parameters is mixed normal, implying hypothesis test statistics with an asymptotic χ² null distribution. The asymptotic distribution of the

  12. Is My Child's Appetite Normal?

    Science.gov (United States)

    Is My Child’s Appetite Normal? Cayla, who is 4 years old, did not finish her lunch. But she is ready to play. Her ... snack for later. That is okay! Your child’s appetite changes. Children do not grow as fast in ...

  13. Transforming Normal Programs by Replacement

    NARCIS (Netherlands)

    Bossi, Annalisa; Pettorossi, A.; Cocco, Nicoletta; Etalle, Sandro

    1992-01-01

    The replacement transformation operation, already defined in [28], is studied wrt normal programs. We give applicability conditions able to ensure the correctness of the operation wrt Fitting's and Kunen's semantics. We show how replacement can mimic other transformation operations such as thinning,

  14. Semigroups of data normalization functions

    NARCIS (Netherlands)

    Warrens, Matthijs J.

    2016-01-01

    Variable centering and scaling are functions that are typically used in data normalization. Various properties of centering and scaling functions are presented. It is shown that if we use two centering functions (or scaling functions) successively, the result depends on the order in which the

  15. Normalizing Catastrophe: Sustainability and Scientism

    Science.gov (United States)

    Bonnett, Michael

    2013-01-01

    Making an adequate response to our deteriorating environmental situation is a matter of ever increasing urgency. It is argued that a central obstacle to achieving this is the way that scientism has become normalized in our thinking about environmental issues. This is taken to reflect on an underlying "metaphysics of mastery" that vitiates proper…

  16. Neutron RBE for normal tissues

    International Nuclear Information System (INIS)

    Field, S.B.; Hornsey, S.

    1979-01-01

    RBE for various normal tissues is considered as a function of neutron dose per fraction. Results from a variety of centres are reviewed. It is shown that RBE is dependent on neutron energy and is tissue dependent, but is not specially high for the more critical tissues or for damage occurring late after irradiation. (author)

  17. Normal and abnormal growth plate

    International Nuclear Information System (INIS)

    Kumar, R.; Madewell, J.E.; Swischuk, L.E.

    1987-01-01

    Skeletal growth is a dynamic process. A knowledge of the structure and function of the normal growth plate is essential in order to understand the pathophysiology of abnormal skeletal growth in various diseases. In this well-illustrated article, the authors provide a radiographic classification of abnormal growth plates and discuss mechanisms that lead to growth plate abnormalities

  18. Proteoglycans in Leiomyoma and Normal Myometrium

    Science.gov (United States)

    Barker, Nichole M.; Carrino, David A.; Caplan, Arnold I.; Hurd, William W.; Liu, James H.; Tan, Huiqing; Mesiano, Sam

    2015-01-01

    Uterine leiomyomas are common benign pelvic tumors composed of modified smooth muscle cells and a large amount of extracellular matrix (ECM). The proteoglycan composition of the leiomyoma ECM is thought to affect pathophysiology of the disease. To test this hypothesis, we examined the abundance (by immunoblotting) and expression (by quantitative real-time polymerase chain reaction) of the proteoglycans biglycan, decorin, and versican in leiomyoma and normal myometrium and determined whether expression is affected by steroid hormones and menstrual phase. Leiomyoma and normal myometrium were collected from women (n = 17) undergoing hysterectomy or myomectomy. In vitro studies were performed on immortalized leiomyoma (UtLM) and normal myometrial (hTERT-HM) cells with and without exposure to estradiol and progesterone. In leiomyoma tissue, abundance of decorin messenger RNA (mRNA) and protein were 2.6-fold and 1.4-fold lower, respectively, compared with normal myometrium. Abundance of versican mRNA was not different between matched samples, whereas versican protein was increased 1.8-fold in leiomyoma compared with myometrium. Decorin mRNA was 2.4-fold lower in secretory phase leiomyoma compared with proliferative phase tissue. In UtLM cells, progesterone decreased the abundance of decorin mRNA by 1.3-fold. Lower decorin expression in leiomyoma compared with myometrium may contribute to disease growth and progression. As decorin inhibits the activity of specific growth factors, its reduced level in the leiomyoma cell microenvironment may promote cell proliferation and ECM deposition. Our data suggest that decorin expression in leiomyoma is inhibited by progesterone, which may be a mechanism by which the ovarian steroids affect leiomyoma growth and disease progression. PMID:26423601

  19. WEBnm@: a web application for normal mode analyses of proteins

    Directory of Open Access Journals (Sweden)

    Reuter Nathalie

    2005-03-01

    Full Text Available Abstract Background Normal mode analysis (NMA) has become the method of choice to investigate the slowest motions in macromolecular systems. NMA is especially useful for large biomolecular assemblies, such as transmembrane channels or virus capsids. NMA relies on the hypothesis that the vibrational normal modes having the lowest frequencies (also named soft modes) describe the largest movements in a protein and are the ones that are functionally relevant. Results We developed a web-based server to perform normal modes calculations and different types of analyses. Starting from a structure file provided by the user in the PDB format, the server calculates the normal modes and subsequently offers the user a series of automated calculations: normalized squared atomic displacements, vector field representation and animation of the first six vibrational modes. Each analysis is performed independently from the others and results can be visualized using only a web browser. No additional plug-in or software is required. For users who would like to analyze the results with their favorite software, raw results can also be downloaded. The application is available on http://www.bioinfo.no/tools/normalmodes. We present here the underlying theory, the application architecture and an illustration of its features using a large transmembrane protein as an example. Conclusion We built an efficient and modular web application for normal mode analysis of proteins. Non-specialists can easily and rapidly evaluate the degree of flexibility of multi-domain protein assemblies and characterize the large amplitude movements of their domains.
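
The core computation behind such a server can be sketched: build (or load) a Hessian of the potential energy, diagonalize it, and keep the softest modes. A minimal illustration follows, with a tiny 1-D bead-spring chain standing in for a real elastic-network Hessian (the matrix and function names are ours):

```python
import numpy as np

def lowest_normal_modes(hessian, n_modes=6):
    """Diagonalize a symmetric Hessian and return the softest modes.

    Eigenvalues are proportional to squared vibrational frequencies.
    In a real 3-D NMA the six zero modes (rigid-body translations and
    rotations) are discarded; here we simply return the lowest n_modes.
    """
    evals, evecs = np.linalg.eigh(hessian)   # ascending eigenvalues
    return evals[:n_modes], evecs[:, :n_modes]

# Toy Hessian: a free chain of 3 beads joined by unit springs.
K = np.array([[ 1., -1.,  0.],
              [-1.,  2., -1.],
              [ 0., -1.,  1.]])
freqs_sq, modes = lowest_normal_modes(K, n_modes=3)
```

For this chain the exact eigenvalues are 0 (rigid translation), 1 and 3, so the zero mode appears first and the remaining soft modes describe the largest relative displacements.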

  20. Superconducting versus normal conducting cavities

    CERN Document Server

    Podlech, Holger

    2013-01-01

    One of the most important issues of high-power hadron linacs is the choice of technology with respect to superconducting or room-temperature operation. The preference for a specific technology depends on several parameters, such as beam energy, beam current, beam power and duty factor. This contribution gives an overview of the comparison between superconducting and normal conducting cavities. This includes basic radiofrequency (RF) parameters, design criteria, limitations, required RF and plug power as well as case studies.

  1. Normal Movement Selectivity in Autism

    OpenAIRE

    Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J.

    2010-01-01

    It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements, but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Moveme...

  2. Lithium control during normal operation

    International Nuclear Information System (INIS)

    Suryanarayan, S.; Jain, D.

    2010-01-01

    Periodic increases in lithium (Li) concentrations in the primary heat transport (PHT) system during normal operation are a generic problem at CANDU® stations. Lithiated mixed bed ion exchange resins are used at stations for pH control in the PHT system. Typically, tight chemistry controls, including Li concentrations, are maintained in the PHT water. The reason for the Li increases during normal operation at CANDU stations such as Pickering was not fully understood. In order to address this issue a two-pronged approach was employed. Firstly, PNGS-A data and information from other available sources were reviewed in an effort to identify possible factors that may contribute to the observed Li variations. Secondly, experimental studies were carried out to assess the importance of these factors in order to establish reasons for Li increases during normal operation. Based on the results of these studies, plausible mechanisms/reasons for Li increases have been identified and recommendations made for proactive control of Li concentrations in the PHT system. (author)

  3. Normalization of Gravitational Acceleration Models

    Science.gov (United States)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
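
For context, one widely used ("fully normalized", 4π) convention for normalizing the associated Legendre functions in geopotential work is shown below; the paper derives algorithm-specific normalization parameters, which need not coincide with this convention:

```latex
\bar{P}_{\ell m}(x) \;=\;
\sqrt{\,(2-\delta_{0m})\,(2\ell+1)\,\frac{(\ell-m)!}{(\ell+m)!}\,}\;
P_{\ell m}(x),
\qquad
\bar{C}_{\ell m} \;=\; \frac{C_{\ell m}}{N_{\ell m}},
\quad
N_{\ell m}=\sqrt{\,(2-\delta_{0m})\,(2\ell+1)\,\frac{(\ell-m)!}{(\ell+m)!}\,}.
```

Normalization of this type keeps the recursion terms numerically bounded at high degree and order, which is the practical motivation for normalizing singularity-free acceleration algorithms.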

  4. "Ser diferente é normal?"/"Being different: is it normal?"

    Directory of Open Access Journals (Sweden)

    Viviane Veras

    2007-01-01

    Full Text Available The question in the title of this paper refers to the slogan "Ser diferente é normal" ("Being different is normal"), which is part of a campaign created for a non-governmental organization that supports people with Down syndrome. The objective of the campaign is to promote the social inclusion of individuals with Down syndrome, and the first step was to propose the inclusion of a group of "differents" in the so-called normal group. The film launching the campaign shows the different, identified as normal, by means of examples: a black man with a black-power haircut, a skinhead, a tattooed body, an over-athletic female body, a hippie family and a girl with Down syndrome. The vision of the dancing teenager lessens, in a way, the imaginary effect that goes beyond the syndrome, since only her body and her little slanted eyes stand out, and no cognitive questions are raised. My proposal is to reflect on the paradoxical status of the example as it is worked in this film: if, by definition, an example in fact shows its membership in a class, one can conclude that it is precisely because it is exemplary that it stands outside that class, at the exact moment in which it exhibits and defines it.

  5. Magnetic resonance imaging of the normal and abnormal pulmonary hila

    International Nuclear Information System (INIS)

    Webb, W.R.; Gamsu, G.; Stark, D.D.; Moore, E.H.

    1984-01-01

    Magnetic resonance (MR) images of the hila were reviewed in 25 normal subjects and 12 patients with unilateral or bilateral hilar masses. On spin echo MR images in normal patients, collections of soft tissue large enough to be confused with an abnormally enlarged lymph node were seen in three locations. In patients with a hilar mass, the mass was differentiated from hilar vasculature more easily using MR than contrast-enhanced CT. However, because the spatial resolution of MR is inferior to that of CT, bronchi were difficult to evaluate using MR. Electrocardiographic-gated images showed better resolution of hilar structures but may not be necessary for large masses

  6. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
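
As a concrete starting point for the questions this abstract raises, a common first diagnostic is a normality test such as Shapiro-Wilk; the sketch below (our example, not the article's code) contrasts a normal and a skewed sample:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
normal_sample = rng.normal(loc=5.0, scale=2.0, size=500)
skewed_sample = rng.exponential(scale=2.0, size=500)

# Shapiro-Wilk: the null hypothesis is "the data are normal".
# A small p-value is evidence of non-normality.
p_normal = stats.shapiro(normal_sample).pvalue
p_skewed = stats.shapiro(skewed_sample).pvalue
```

A strongly skewed sample of this size yields a vanishingly small p-value, whereas the truly normal sample typically does not; non-normal data then call for transformation or non-parametric methods.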

  7. Quantiles for Finite Mixtures of Normal Distributions

    Science.gov (United States)

    Rahman, Mezbahur; Rahman, Rumanur; Pearson, Larry M.

    2006-01-01

    Quantiles for finite mixtures of normal distributions are computed. The difference between a linear combination of independent normal random variables and a linear combination of independent normal densities is emphasized. (Contains 3 tables and 1 figure.)
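
Because a finite mixture of normals has no closed-form quantile function in general, quantiles are obtained by numerically inverting the mixture CDF (the weighted sum of component CDFs). A sketch of that computation (our code, not the authors'):

```python
import numpy as np
from scipy import stats, optimize

def mixture_quantile(p, weights, means, sds):
    """Quantile of a finite mixture of normals by inverting its CDF.

    The mixture CDF is the weighted sum of component normal CDFs;
    its inverse is found by bracketed root-finding.
    """
    weights = np.asarray(weights, dtype=float)
    cdf = lambda x: np.sum(weights * stats.norm.cdf(x, loc=means, scale=sds))
    # Bracket generously around the extreme components.
    lo = min(m - 10 * s for m, s in zip(means, sds))
    hi = max(m + 10 * s for m, s in zip(means, sds))
    return optimize.brentq(lambda x: cdf(x) - p, lo, hi)

# Equal-weight mixture of N(-1, 1) and N(1, 1): the median is 0 by symmetry.
q50 = mixture_quantile(0.5, [0.5, 0.5], [-1.0, 1.0], [1.0, 1.0])
```

Note the distinction the abstract emphasizes: this inverts the CDF of a mixture of normal *densities*, which is not the distribution of a linear combination of normal random variables (the latter is itself normal).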

  8. Corticocortical feedback increases the spatial extent of normalization.

    Science.gov (United States)

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
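
The paper's central quantity, the ratio between a neuron's driving input and the spatially integrated activity of a normalization pool, can be sketched with one-dimensional Gaussian integrals. This is an illustrative toy (all parameter values and names are ours, not the authors' fitted model), showing that enlarging the stimulus grows the pool faster than the drive (surround suppression) and that widening the pool, as feedback is reported to do, deepens that suppression:

```python
import numpy as np

def response(stim_radius, pool_sigma, drive_sigma=1.0, c=0.5, s0=0.05):
    """Area-summation curve from divisive normalization (illustrative).

    The excitatory drive integrates a narrow Gaussian over the
    stimulus extent; the normalization pool integrates a wider
    Gaussian (pool_sigma) over the same extent.
    """
    r = np.linspace(0.0, stim_radius, 400)
    dr = r[1] - r[0]
    drive = np.sum(np.exp(-r**2 / (2 * drive_sigma**2))) * dr
    pool = np.sum(np.exp(-r**2 / (2 * pool_sigma**2))) * dr
    return c * drive / (s0 + c * pool)

small_stim = response(stim_radius=1.0, pool_sigma=3.0)
large_stim = response(stim_radius=6.0, pool_sigma=3.0)            # suppressed
large_stim_narrow_pool = response(stim_radius=6.0, pool_sigma=1.5)  # less suppressed
```

Shrinking `pool_sigma` mimics the reported effect of feedback inactivation: the same large stimulus is suppressed less when the normalization pool is spatially narrower.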

  10. HOW NORMAL IS VARIABLE, OR HOW VARIABLE IS NORMAL

    NARCIS (Netherlands)

    TOUWEN, BCL

    Variability is an important property of the central nervous system, and it shows characteristic changes during infancy and childhood. The large amount of variations in the performance of sensomotor functions in infancy is called indiscriminate or primary variability. During toddling age the child

  11. A locally adaptive normal distribution

    DEFF Research Database (Denmark)

    Arvanitidis, Georgios; Hansen, Lars Kai; Hauberg, Søren

    2016-01-01

    The multivariate normal density is a monotonic function of the distance to the mean, and its ellipsoidal shape is due to the underlying Euclidean metric. We suggest to replace this metric with a locally adaptive, smoothly changing (Riemannian) metric that favors regions of high local density ... entropy distribution under the given metric. The underlying metric is, however, non-parametric. We develop a maximum likelihood algorithm to infer the distribution parameters that relies on a combination of gradient descent and Monte Carlo integration. We further extend the LAND to mixture models ...

  12. Normal pediatric postmortem CT appearances

    Energy Technology Data Exchange (ETDEWEB)

    Klein, Willemijn M.; Bosboom, Dennis G.H.; Koopmanschap, Desiree H.J.L.M. [Radboud University Medical Center, Department of Radiology and Nuclear Medicine, Nijmegen (Netherlands); Nievelstein, Rutger A.J. [University Medical Center Utrecht, Department of Radiology, Utrecht (Netherlands); Nikkels, Peter G.J. [University Medical Center Utrecht, Department of Pathology, Utrecht (Netherlands); Rijn, Rick R. van [Academic Medical Center, Department of Radiology, Amsterdam (Netherlands)

    2015-04-01

    Postmortem radiology is a rapidly developing specialty that is increasingly used as an adjunct to or substitute for conventional autopsy. The goal is to find patterns of disease and possibly the cause of death. Postmortem CT images bring to light processes of decomposition most radiologists are unfamiliar with. These postmortem changes, such as the formation of gas and edema, should not be mistaken for pathological processes that occur in living persons. In this review we discuss the normal postmortem thoraco-abdominal changes and how these appear on CT images, as well as how to differentiate these findings from those of pathological processes. (orig.)

  13. Multispectral histogram normalization contrast enhancement

    Science.gov (United States)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
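
The enhancement described, equalizing principal-component variances and rotating back so that interband correlation is not reintroduced, can be sketched as follows (an illustrative decorrelation stretch; function and variable names are ours):

```python
import numpy as np

def decorrelation_stretch(bands):
    """Decorrelation enhancement sketch for multiband imagery.

    bands : (n_pixels, n_bands) array. Pixels are rotated into
    principal-component space, each component is scaled to unit
    variance (equalizing PC variances), then rotated back to the
    original band axes, so no interband correlation is reintroduced.
    """
    x = bands - bands.mean(axis=0)
    cov = np.cov(x, rowvar=False)
    evals, evecs = np.linalg.eigh(cov)
    transform = evecs @ np.diag(1.0 / np.sqrt(evals)) @ evecs.T
    return x @ transform

rng = np.random.default_rng(1)
# Strongly correlated synthetic "bands" sharing one underlying signal.
base = rng.normal(size=(1000, 1))
bands = np.hstack([base + 0.1 * rng.normal(size=(1000, 1)) for _ in range(3)])
stretched = decorrelation_stretch(bands)
cov_out = np.cov(stretched, rowvar=False)
```

After the transform the band covariance is the identity: variances are equalized and the interband correlation that made the original color composite look washed out is removed. A three-band version of this linear map can be implemented as a lookup table, as the abstract notes.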

  14. Normal modes and continuous spectra

    International Nuclear Information System (INIS)

    Balmforth, N.J.; Morrison, P.J.

    1994-12-01

    The authors consider stability problems arising in fluids, plasmas and stellar systems that contain singularities resulting from wave-mean flow or wave-particle resonances. Such resonances lead to singularities in the differential equations determining the normal modes at the so-called critical points or layers. The locations of the singularities are determined by the eigenvalue of the problem, and as a result, the spectrum of eigenvalues forms a continuum. They outline a method to construct the singular eigenfunctions comprising the continuum for a variety of problems

  15. Normal movement selectivity in autism.

    Science.gov (United States)

    Dinstein, Ilan; Thomas, Cibu; Humphreys, Kate; Minshew, Nancy; Behrmann, Marlene; Heeger, David J

    2010-05-13

    It has been proposed that individuals with autism have difficulties understanding the goals and intentions of others because of a fundamental dysfunction in the mirror neuron system. Here, however, we show that individuals with autism exhibited not only normal fMRI responses in mirror system areas during observation and execution of hand movements but also exhibited typical movement-selective adaptation (repetition suppression) when observing or executing the same movement repeatedly. Movement selectivity is a defining characteristic of neurons involved in movement perception, including mirror neurons, and, as such, these findings argue against a mirror system dysfunction in autism. Copyright 2010 Elsevier Inc. All rights reserved.

  16. Normal tissue complication probability for salivary glands

    International Nuclear Information System (INIS)

    Rana, B.S.

    2008-01-01

    The purpose of radiotherapy is to strike a profitable balance between morbidity (due to side effects of radiation) and cure of the malignancy. To achieve this, one needs to know the relation between NTCP (normal tissue complication probability) and the various treatment variables of a schedule, viz. daily dose, duration of treatment, total dose and fractionation, along with tissue conditions. Prospective studies require that a large number of patients be treated with varied schedule parameters and that a statistically acceptable number of patients develop complications, so that a true relation between NTCP and a particular variable is established. In this study, salivary gland complications have been considered. The cases treated on a ⁶⁰Co teletherapy machine during the period 1994 to 2002 were analyzed, and the clinicians' judgement in ascertaining the end points was the only means of observation. The end points were early and late xerostomia, which were considered for NTCP evaluation over a period of 5 years

  17. Ocular Blood Flow and Normal Tension Glaucoma

    Directory of Open Access Journals (Sweden)

    Ning Fan

    2015-01-01

    Full Text Available Normal tension glaucoma (NTG) is known as a multifactorial optic neuropathy characterized by progressive retinal ganglion cell death and glaucomatous visual field loss, even though the intraocular pressure (IOP) does not exceed the normal range. The pathophysiology of NTG remains largely undetermined. It is hypothesized that abnormal ocular blood flow is involved in the pathogenesis of this disease. Several lines of evidence suggest that vascular factors play a significant role in the development of NTG. In recent years, new imaging techniques, fluorescein angiography, color Doppler imaging (CDI), magnetic resonance imaging (MRI), and laser speckle flowgraphy (LSFG), have been used to evaluate ocular blood flow and blood vessels, and impaired vascular autoregulation has been found in patients with NTG. Previous studies showed that NTG is associated with a variety of systemic diseases, including migraine, Alzheimer’s disease, primary vascular dysregulation, and Flammer syndrome. Vascular factors are involved in these diseases. The mechanisms underlying the abnormal ocular blood flow in NTG are still not clear, but the risk factors for glaucomatous optic neuropathy likely include oxidative stress, vasospasm, and endothelial dysfunction.

  18. Large transverse momentum hadronic processes

    International Nuclear Information System (INIS)

    Darriulat, P.

    1977-01-01

    The possible relations between deep inelastic leptoproduction and large transverse momentum (p_t) processes in hadronic collisions are usually considered in the framework of the quark-parton picture. Experiments observing the structure of the final state in proton-proton collisions producing at least one large transverse momentum particle have led to the following conclusions: a large fraction of the produced particles are unaffected by the large p_t process. The other products are correlated with the large p_t particle. Depending upon the sign of the scalar product, they can be separated into two groups of "towards-movers" and "away-movers". The experimental evidence favouring such a picture is reviewed and the properties of each of the three groups (underlying normal event, towards-movers and away-movers) are discussed. Some phenomenological interpretations are presented. The exact nature of away- and towards-movers must be further investigated. Their apparent jet structure has to be confirmed. Angular correlations between leading away- and towards-movers are very informative. Quantum number flow, both within the set of away- and towards-movers, and between it and the underlying normal event, is predicted to behave very differently in different models

  19. Update on normal tension glaucoma

    Directory of Open Access Journals (Sweden)

    Jyotiranjan Mallick

    2016-01-01

    Full Text Available Normal tension glaucoma (NTG) is labelled when typical glaucomatous disc changes, visual field defects and open anterior chamber angles are associated with an intraocular pressure (IOP) constantly below 21 mmHg. Chronic low vascular perfusion, Raynaud's phenomenon, migraine, nocturnal systemic hypotension and over-treated systemic hypertension are the main causes of normal tension glaucoma. Goldmann applanation tonometry, gonioscopy, slit lamp biomicroscopy, optical coherence tomography and visual field analysis are the main tools of investigation for the diagnosis of NTG. Management follows the same principles of treatment for other chronic glaucomas: to reduce IOP by a substantial amount, sufficient to prevent disabling visual loss. Treatment is generally aimed to lower IOP by 30% from pre-existing levels to 12-14 mmHg. Betaxolol, brimonidine, prostaglandin analogues, trabeculectomy (in refractory cases), systemic calcium channel blockers (such as nifedipine) and 24-hour monitoring of blood pressure are considered in the management of NTG. The present review summarises risk factors, causes, pathogenesis, diagnosis and management of NTG.

  20. Normal variation of hepatic artery

    International Nuclear Information System (INIS)

    Kim, Inn; Nam, Myung Hyun; Rhim, Hyun Chul; Koh, Byung Hee; Seo, Heung Suk; Kim, Soon Yong

    1987-01-01

    This study was an analysis of the blood supply of the liver in 125 patients who underwent hepatic arteriography and abdominal aortography from Jan. 1984 to Dec. 1986 at the Department of Radiology of Hanyang University Hospital. A. Variations in extrahepatic arteries: 1. The normal extrahepatic artery pattern occurred in 106 of 125 cases (84.8%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the hepatic artery proper arising from the common hepatic artery. 2. The most common type of variation of the extrahepatic artery was a replaced right hepatic artery from the superior mesenteric artery: 6 of 125 cases (4.8%). B. Variations in intrahepatic arteries: 1. The normal intrahepatic artery pattern occurred in 83 of 125 cases (66.4%): right hepatic and left hepatic arteries arising from the hepatic artery proper, and the middle hepatic artery arising from the lower portion of the umbilical point of the left hepatic artery. 2. The most common variation of the intrahepatic arteries involved the middle hepatic artery. 3. Among the variations of the middle hepatic artery, right, middle and left hepatic arteries arising from the same location on the hepatic artery proper was the most common type: 17 of 125 cases (13.6%)

  1. Spatially tuned normalization explains attention modulation variance within neurons.

    Science.gov (United States)

    Ni, Amy M; Maunsell, John H R

    2017-09-01

    area can be largely explained by between-neuron differences in normalization strength. Here we demonstrate that attention modulation size varies within neurons as well and that this variance is largely explained by within-neuron differences in normalization strength. We provide a new spatially tuned normalization model that explains this broad range of observed normalization and attention effects. Copyright © 2017 the American Physiological Society.

  2. Is My Penis Normal? (For Teens)

    Science.gov (United States)

    Is My Penis Normal? ... any guy who's ever worried about whether his penis is a normal size. There's a fairly wide ...

  3. Normal vibrations in gallium arsenide

    International Nuclear Information System (INIS)

    Dolling, G.; Waugh, J.L.T.

    1964-01-01

    The triple axis crystal spectrometer at Chalk River has been used to observe coherent slow neutron scattering from a single crystal of pure gallium arsenide at 296°K. The frequencies of normal modes of vibration propagating in the [ζ00], [ζζζ] and [0ζζ] crystal directions have been determined with a precision of between 1 and 2·5 per cent. A limited number of normal modes have also been studied at 95 and 184°K. Considerable difficulty was experienced in obtaining well resolved neutron peaks corresponding to the two non-degenerate optic modes for very small wave-vector, particularly at 296°K. However, from a comparison of results obtained under various experimental conditions at several different points in reciprocal space, frequencies (units 10¹² c/s) for these modes (at 296°K) have been assigned: T 8·02±0·08 and L 8·55±0·2. Other specific normal modes, with their measured frequencies, are (a) (1,0,0): TO 7·56±0·08, TA 2·36±0·015, LO 7·22±0·15, LA 6·80±0·06; (b) (0·5, 0·5, 0·5): TO 7·84±0·12, TA 1·86±0·02, LO 7·15±0·07, LA 6·26±0·10; (c) (0, 0·65, 0·65): optic 8·08±0·13, 7·54±0·12 and 6·57±0·11, acoustic 5·58±0·08, 3·42±0·06 and 2·36±0·04. These results are generally slightly lower than the corresponding frequencies for germanium. An analysis in terms of various modifications of the dipole approximation model has been carried out. A feature of this analysis is that the charge on the gallium atom appears to be very small, about +0·04 e. The frequency distribution function has been derived from one of the force models. (author)

  4. Normal vibrations in gallium arsenide

    Energy Technology Data Exchange (ETDEWEB)

    Dolling, G; Waugh, J L T

    1964-07-01

    The triple axis crystal spectrometer at Chalk River has been used to observe coherent slow neutron scattering from a single crystal of pure gallium arsenide at 296°K. The frequencies of normal modes of vibration propagating in the [ζ00], [ζζζ], and [0ζζ] crystal directions have been determined with a precision of between 1 and 2·5 per cent. A limited number of normal modes have also been studied at 95 and 184°K. Considerable difficulty was experienced in obtaining well resolved neutron peaks corresponding to the two non-degenerate optic modes for very small wave-vector, particularly at 296°K. However, from a comparison of results obtained under various experimental conditions at several different points in reciprocal space, frequencies (units 10¹² c/s) for these modes (at 296°K) have been assigned: T 8·02 ± 0·08 and L 8·55 ± 0·2. Other specific normal modes, with their measured frequencies, are (a) (1,0,0): TO 7·56 ± 0·08, TA 2·36 ± 0·015, LO 7·22 ± 0·15, LA 6·80 ± 0·06; (b) (0·5, 0·5, 0·5): TO 7·84 ± 0·12, TA 1·86 ± 0·02, LO 7·15 ± 0·07, LA 6·26 ± 0·10; (c) (0, 0·65, 0·65): optic 8·08 ± 0·13, 7·54 ± 0·12 and 6·57 ± 0·11, acoustic 5·58 ± 0·08, 3·42 ± 0·06 and 2·36 ± 0·04. These results are generally slightly lower than the corresponding frequencies for germanium. An analysis in terms of various modifications of the dipole approximation model has been carried out. A feature of this analysis is that the charge on the gallium atom appears to be very small, about +0·04 e. The frequency distribution function has been derived from one of the force models.

  5. Loss of Brain Aerobic Glycolysis in Normal Human Aging.

    Science.gov (United States)

    Goyal, Manu S; Vlassenko, Andrei G; Blazey, Tyler M; Su, Yi; Couture, Lars E; Durbin, Tony J; Bateman, Randall J; Benzinger, Tammie L-S; Morris, John C; Raichle, Marcus E

    2017-08-01

    The normal aging human brain experiences global decreases in metabolism, but whether this affects the topography of brain metabolism is unknown. Here we describe PET-based measurements of brain glucose uptake, oxygen utilization, and blood flow in cognitively normal adults from 20 to 82 years of age. Age-related decreases in brain glucose uptake exceed those of oxygen use, resulting in loss of brain aerobic glycolysis (AG). Whereas the topographies of total brain glucose uptake, oxygen utilization, and blood flow remain largely stable with age, brain AG topography changes significantly. Brain regions with high AG in young adults show the greatest change, as do regions with prolonged developmental transcriptional features (i.e., neoteny). The normal aging human brain thus undergoes characteristic metabolic changes, largely driven by global loss and topographic changes in brain AG. Copyright © 2017 Elsevier Inc. All rights reserved.

  6. Size, shape, and appearance of the normal female pituitary gland

    International Nuclear Information System (INIS)

    Wolpert, S.M.; Molitch, M.E.; Goldman, J.A.; Wood, J.B.

    1984-01-01

    One hundred seven women aged 18-65 years, referred for suspected central nervous system disease not related to the pituitary gland or hypothalamus, were studied. High-resolution, direct, coronal, contrast-enhanced computed tomography (CT) was used to examine the size, shape, and density of the normal pituitary gland. There were three major conclusions: (1) the height of the normal gland can be as much as 9 mm; (2) the superior margin of the gland may bulge in normal patients; and (3) both large size and convex contour appear to be associated with younger age. It was also found that serum prolactin levels do not appear to correlate with the CT appearances. Noise artifacts inherent in high-detail, thin-section, soft-tissue scanning may be a limiting factor in defining reproducible patterns in different parts of the normal pituitary gland

  7. Striving for the unknown normal

    DEFF Research Database (Denmark)

    Nielsen, Mikka

    During the last decade, more and more people have received prescriptions for ADHD drug treatment, and simultaneously the legitimacy of the ADHD diagnosis has been heavily debated among both professionals and laymen. Based on anthropological fieldwork among adults with ADHD, I illustrate how ... the ADHD diagnosis both answers and produces existential questions on what counts as normal behaviour and emotions. The diagnosis helps the diagnosed to identify, accept and handle problems by offering concrete explanations and solutions to diffusely experienced problems. But the diagnostic process ... is not only a clarifying procedure with a straight plan for treatment and direct effects. It is also a messy affair. In a process of experimenting with drugs and attempting to determine how or whether the medication eliminates the correct symptoms, the diagnosed is put in an introspective, self...

  8. IIH with normal CSF pressures?

    Directory of Open Access Journals (Sweden)

    Soh Youn Suh

    2013-01-01

    Full Text Available Idiopathic intracranial hypertension (IIH) is a condition of raised intracranial pressure (ICP) in the absence of space-occupying lesions. ICP is usually measured by lumbar puncture, and a cerebrospinal fluid (CSF) pressure above 250 mm H₂O is one of the diagnostic criteria of IIH. Recently, we encountered two patients who complained of headaches and exhibited disc swelling without an increased ICP. We prescribed acetazolamide and followed both patients frequently because of the definite disc swelling with IIH-related symptoms. Symptoms and signs resolved in both patients after they started taking acetazolamide. It is generally known that an elevated ICP, as measured by lumbar puncture, is the most important diagnostic sign of IIH. However, these cases caution that, even when CSF pressure is within the normal range, suspicion should be raised when a patient has papilledema with related symptoms, since untreated papilledema may cause progressive and irreversible visual loss.

  9. Normalizing tweets with edit scripts and recurrent neural embeddings

    NARCIS (Netherlands)

    Chrupala, Grzegorz; Toutanova, Kristina; Wu, Hua

    2014-01-01

    Tweets often contain a large proportion of abbreviations, alternative spellings, novel words and other non-canonical language. These features are problematic for standard language analysis tools and it can be desirable to convert them to canonical form. We propose a novel text normalization model

  10. Evaluation of corneal higher order aberrations in normal topographic patterns

    Directory of Open Access Journals (Sweden)

    Ali Mirzajani

    2016-06-01

    Conclusions: Based on the results of this study, there was a good correlation between corneal topographic pattern and corneal HOAs in normal eyes. These results indicate that corneal HOA values are largely determined by the topographic patterns. A larger sample size would perhaps have been beneficial and yielded more accurate outcomes.

  11. Normalized difference vegetation index (NDVI) variation among cultivars and environments

    Science.gov (United States)

    Although Nitrogen (N) is an essential nutrient for crop production, large preplant applications of fertilizer N can result in off-field loss that causes environmental concerns. Canopy reflectance is being investigated for use in variable rate (VR) N management. Normalized difference vegetation index...
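The index itself is a simple band ratio of near-infrared and red reflectance; a minimal sketch (the reflectance values below are illustrative, not from the study):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - Red) / (NIR + Red)."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red)

# Dense green canopy reflects strongly in the near-infrared and absorbs red,
# so its NDVI is high; bare soil gives a value near zero.
print(ndvi(0.50, 0.08))  # dense canopy: high NDVI
print(ndvi(0.30, 0.25))  # bare soil: NDVI near zero
```

The same function broadcasts over whole reflectance arrays, so it applies directly to per-pixel canopy reflectance imagery.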

  12. Confidence bounds for normal and lognormal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill

    2003-01-01

    This paper compares the so-called exact approach for obtaining confidence intervals on normal distribution coefficients of variation with approximate methods. The approximate approaches were found to perform less well than the exact approach for large coefficients of variation and small sample sizes. Web-based computer programs are described for calculating confidence...
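The "exact" approach referred to here inverts the noncentral t distribution of √n·x̄/s in its noncentrality parameter; a sketch of that construction (the function name and root-bracketing choices are mine, not from the paper):

```python
import numpy as np
from scipy import stats
from scipy.optimize import brentq

def cv_confidence_interval(x, level=0.95):
    """Exact CI for a normal coefficient of variation via the noncentral t.

    sqrt(n)*xbar/s follows a noncentral t with n-1 degrees of freedom and
    noncentrality sqrt(n)/kappa, where kappa is the population CV. Solving
    for the noncentrality values that place the observed statistic at the
    alpha/2 and 1-alpha/2 quantiles gives the interval endpoints.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    t_obs = np.sqrt(n) * x.mean() / x.std(ddof=1)
    alpha = 1.0 - level

    def delta_for(p):
        # Noncentrality delta with P(T_{n-1, delta} <= t_obs) = p.
        return brentq(lambda d: stats.nct.cdf(t_obs, n - 1, d) - p,
                      1e-8, 10.0 * t_obs)

    d_hi = delta_for(alpha / 2.0)        # large delta -> small CV bound
    d_lo = delta_for(1.0 - alpha / 2.0)
    return np.sqrt(n) / d_hi, np.sqrt(n) / d_lo
```

For small samples with a large CV this interval is markedly asymmetric, which is where the paper finds the approximate methods to break down.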

  13. CASTOR: Normal-mode analysis of resistive MHD plasmas

    NARCIS (Netherlands)

    Kerner, W.; Goedbloed, J. P.; Huysmans, G. T. A.; Poedts, S.; Schwarz, E.

    1998-01-01

    The CASTOR (complex Alfven spectrum of toroidal plasmas) code computes the entire spectrum of normal-modes in resistive MHD for general tokamak configurations. The applied Galerkin method, in conjunction with a Fourier finite-element discretisation, leads to a large scale eigenvalue problem A (x)
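The Galerkin discretisation referred to here leads to a large generalised eigenvalue problem of the form A x = λ B x. A toy illustration of solving one (the 2×2 matrices are stand-ins, not a CASTOR discretisation):

```python
import numpy as np
from scipy.linalg import eig

# Generalised eigenvalue problem A x = lambda B x, the algebraic form a
# Galerkin / finite-element discretisation produces. For an upper-triangular
# A and B = I, the eigenvalues are simply A's diagonal entries.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])
B = np.eye(2)
eigvals, eigvecs = eig(A, B)
print(sorted(eigvals.real))  # eigenvalues 2 and 3
```

In the resistive MHD case the matrices are large, sparse, and complex-valued, so a targeted iterative solver replaces the dense `eig` call, but the problem shape is the same.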

  14. CT in normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Fujita, Katsuzo; Nogaki, Hidekazu; Noda, Masaya; Kusunoki, Tadaki; Tamaki, Norihiko

    1981-01-01

    CT scans were obtained on 33 patients (ages 31 to 73 years) with the diagnosis of normal pressure hydrocephalus. In each case, the diagnosis was made on the basis of the symptoms, CT and cisternographic findings. Underlying diseases of normal pressure hydrocephalus were ruptured aneurysms (21 cases), arteriovenous malformations (2 cases), head trauma (1 case), cerebrovascular accidents (1 case) and idiopathic (8 cases). Sixteen of 33 patients showed marked improvement; five, moderate or minimal improvement; and twelve, no change. The results were compared with CT findings and clinical response to shunting. CT findings were classified into five types, based on the degree of periventricular hypodensity (P.V.H.), the extent of brain damage by underlying diseases, and the degree of cortical atrophy. In 17 cases of type (I), CT shows the presence of P.V.H. with or without minimal frontal lobe damage and no cortical atrophy. Good surgical improvement was achieved in all cases of type (I) by shunting. In 4 cases of type (II), CT shows the presence of P.V.H. and severe brain damage without cortical atrophy. Fair clinical improvement was achieved in 2 cases (50%) by shunting. In one case of type (III), CT shows the absence of P.V.H. without brain damage or cortical atrophy. No clinical improvement was obtained by shunting in this type. In 9 cases of type (IV), with mild cortical atrophy, fair clinical improvement was achieved in two cases (22%) and no improvement in 7 cases. In 2 cases of type (V), with moderate or marked cortical atrophy, no clinical improvement was obtained by shunting. In conclusion, it appeared from the present study that there was a good correlation between the result of shunting and the type of CT, and clinical response to shunting might be predicted by classification of CT findings. (author)

  15. Normalization reduces intersubject variability in cervical vestibular evoked myogenic potentials.

    Science.gov (United States)

    van Tilburg, Mark J; Herrmann, Barbara S; Guinan, John J; Rauch, Steven D

    2014-09-01

    Cervical vestibular evoked myogenic potentials are used to assess saccular and inferior vestibular nerve function. Normalization of the VEMP waveform has been proposed to reduce the variability in vestibular evoked myogenic potentials by correcting for muscle activation. In this study, we test the hypothesis that normalization of the raw cervical VEMP waveform causes a significant decrease in the intersubject variability. Prospective cohort study. Large specialty hospital, department of otolaryngology. Twenty healthy subjects were used in this study. All subjects underwent cervical vestibular evoked myogenic potential testing using short tone bursts at 250, 500, 750, and 1,000 Hz. Both intersubject and intrasubject variability was assessed. Variability between raw and normalized peak-to-peak amplitudes was compared using the coefficient of variation. Intrasubject variability was assessed using the intraclass correlation coefficient and interaural asymmetry ratio. cVEMPs were present in most ears. Highest peak-to-peak amplitudes were recorded at 750 Hz. Normalization did not alter cVEMP tuning characteristics. Normalization of the cVEMP response caused a significant reduction in intersubject variability of the peak-to-peak amplitude. No significant change was seen in the intrasubject variability. Normalization significantly reduces cVEMP intersubject variability in healthy subjects without altering cVEMP characteristics. By reducing cVEMP amplitude variation due to nonsaccular, muscle-related factors, cVEMP normalization is expected to improve the ability to distinguish between healthy and pathologic responses in the clinical application of cVEMP testing.
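Normalization here means dividing the raw peak-to-peak amplitude by the background muscle (EMG) activation; a minimal sketch of that correction, together with the coefficient of variation used to compare intersubject variability (the exact clinical windowing is an assumption, not from the paper):

```python
import numpy as np

def normalized_cvemp_amplitude(waveform, background_emg):
    """Peak-to-peak cVEMP amplitude divided by mean rectified background EMG.

    Dividing by the rectified EMG level corrects for differences in muscle
    activation between subjects (the windowing here is illustrative).
    """
    waveform = np.asarray(waveform, dtype=float)
    p2p = waveform.max() - waveform.min()
    return p2p / np.abs(np.asarray(background_emg, dtype=float)).mean()

def coefficient_of_variation(amplitudes):
    """CV = SD / mean, the statistic used to compare intersubject variability."""
    amplitudes = np.asarray(amplitudes, dtype=float)
    return amplitudes.std(ddof=1) / amplitudes.mean()
```

Comparing `coefficient_of_variation` of raw versus normalized amplitudes across subjects is the comparison the study reports.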

  16. Transport through hybrid superconducting/normal nanostructures

    Energy Technology Data Exchange (ETDEWEB)

    Futterer, David

    2013-01-29

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  17. Transport through hybrid superconducting/normal nanostructures

    International Nuclear Information System (INIS)

    Futterer, David

    2013-01-01

    We mainly investigate transport through interacting quantum dots proximized by superconductors. For this purpose we extend an existing theory to describe transport through proximized quantum dots coupled to normal and superconducting leads. It allows us to study the influence of a strong Coulomb interaction on Andreev currents and Josephson currents. This is a particularly interesting topic because it combines two competing properties: in superconductors Cooper pairs are formed by two electrons which experience an attractive interaction while two electrons located on a quantum dot repel each other due to the Coulomb interaction. It seems at first glance that transport processes involving Cooper pairs should be suppressed because of the two competing interactions. However, it is possible to proximize the dot in nonequilibrium situations. At first, we study a setup composed of a quantum dot coupled to one normal, one ferromagnetic, and one superconducting lead in the limit of an infinitely-large superconducting gap. Within this limit the coupling between dot and superconductor is described exactly by the presented theory. It leads to the formation of Andreev-bound states (ABS) and an additional bias scheme opens in which a pure spin current, i.e. a spin current with a vanishing associated charge current, can be generated. In a second work, starting from the infinite-gap limit, we perform a systematic expansion of the superconducting gap around infinity and investigate Andreev currents and Josephson currents. This allows us to estimate the validity of infinite-gap calculations for real systems in which the superconducting gap is usually a rather small quantity. We find indications that a finite gap renormalizes the ABS and propose a resummation approach to explore the finite-gap ABS. Despite the renormalization effects the modifications of transport by finite gaps are rather small. This result lets us conclude that the infinite-gap calculation is a valuable tool to

  18. Increasingly normal, never the norm

    OpenAIRE

    Saskia Keuzenkamp et al.

    2010-01-01

    Original title: Steeds gewoner, nooit gewoon. Homosexuality is becoming more and more accepted in the Netherlands: in fact, compared with other Western countries, the Dutch public hold the most positive attitudes towards homosexuality. Despite this, there are still groups in the Netherlands who have difficulty with homosexuality and bisexuality. Moreover, people are not equally tolerant on all fronts. Based on information from large-scale population surveys and in-depth interviews with young ...

  19. Assembling large, complex environmental metagenomes

    Energy Technology Data Exchange (ETDEWEB)

    Howe, A. C. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Jansson, J. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Earth Sciences Division; Malfatti, S. A. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tringe, S. G. [USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Tiedje, J. M. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Plant Soil and Microbial Sciences; Brown, C. T. [Michigan State Univ., East Lansing, MI (United States). Microbiology and Molecular Genetics, Computer Science and Engineering

    2012-12-28

    The large volumes of sequencing data required to sample complex environments deeply pose new challenges to sequence analysis approaches. De novo metagenomic assembly effectively reduces the total amount of data to be analyzed but requires significant computational resources. We apply two pre-assembly filtering approaches, digital normalization and partitioning, to make large metagenome assemblies more computationally tractable. Using a human gut mock community dataset, we demonstrate that these methods result in assemblies nearly identical to assemblies from unprocessed data. We then assemble two large soil metagenomes from matched Iowa corn and native prairie soils. The predicted functional content and phylogenetic origin of the assembled contigs indicate significant taxonomic differences despite similar function. The assembly strategies presented are generic and can be extended to any metagenome; full source code is freely available under a BSD license.
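Digital normalization itself is a simple streaming rule: keep a read only while the median abundance of its k-mers, counted over reads kept so far, is below a coverage cutoff. A toy sketch of that rule (real implementations such as khmer use a probabilistic count table and larger k; the exact dictionary and tiny k here are for clarity):

```python
from collections import defaultdict

def digital_normalization(reads, k=4, cutoff=3):
    """Single-pass digital normalization sketch.

    A read is kept only if the median abundance of its k-mers, counted over
    the reads kept so far, is below the cutoff; otherwise it is discarded as
    redundant coverage. High-coverage regions are thinned while low-coverage
    regions are retained in full.
    """
    counts = defaultdict(int)
    kept = []
    for read in reads:
        kmers = [read[i:i + k] for i in range(len(read) - k + 1)]
        if not kmers:
            continue
        abundances = sorted(counts[km] for km in kmers)
        median = abundances[len(abundances) // 2]
        if median < cutoff:
            kept.append(read)
            for km in kmers:
                counts[km] += 1
    return kept

# Ten identical high-coverage reads plus one unique read: most of the
# redundant copies are discarded, the unique read survives.
reads = ["ACGTACGTACGT"] * 10 + ["TTTTGGGGCCCC"]
print(len(digital_normalization(reads)))
```

This is why the method "effectively reduces the total amount of data to be analyzed": the retained subset approximates uniform coverage, so downstream assembly sees far fewer reads.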

  20. Normalization of emotion control scale

    Directory of Open Access Journals (Sweden)

    Hojatoolah Tahmasebian

    2014-09-01

    Full Text Available Background: Emotion control skill teaches individuals how to identify their emotions and how to express and control them in various situations. The aim of this study was to normalize and measure the internal and external validity and reliability of the emotion control test. Methods: This standardization study was carried out on a statistical population including all pupils, students, teachers, nurses and university professors in Kermanshah in 2012, using Williams' emotion control scale. The subjects included 1,500 people (810 females and 690 males) who were selected by stratified random sampling. Williams' (1997) emotion control scale was used to collect the required data. The Emotional Control Scale is a tool for measuring the degree of control people have over their emotions. This scale has four subscales: anger, depressed mood, anxiety and positive affect. The collected data were analyzed by SPSS software using correlation and Cronbach's alpha tests. Results: The internal consistency of the questionnaire, reported by Cronbach's alpha, was acceptable for the emotional control scale, and the correlation between the subscales of the test and between the items of the questionnaire was significant at the 0.01 confidence level. Conclusion: The validity of the emotion control scale among pupils, students, teachers and nurses in Iran is in an acceptable range, and the test items were correlated with each other, thereby making them appropriate for measuring emotion control.
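The Cronbach's alpha statistic used here to report internal consistency is straightforward to compute from a subjects-by-items score matrix; a minimal sketch (the data layout is an assumption, not from the paper):

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix.

    alpha = k/(k-1) * (1 - sum of item variances / variance of total score),
    where k is the number of items.
    """
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)
```

Perfectly correlated items give alpha = 1; items measuring unrelated constructs drive alpha toward 0, which is why the statistic serves as an internal-consistency check for each subscale.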

  1. Digital Pupillometry in Normal Subjects

    Science.gov (United States)

    Rickmann, Annekatrin; Waizel, Maria; Kazerounian, Sara; Szurman, Peter; Wilhelm, Helmut; Boden, Karl T.

    2017-01-01

    The aim of this study was to evaluate the pupil size of normal subjects at different illumination levels with a novel pupillometer. The pupil size of healthy study participants was measured with an infrared-video PupilX pupillometer (MEye Tech GmbH, Alsdorf, Germany) at five different illumination levels (0, 0.5, 4, 32, and 250 lux). Measurements were performed by the same investigator. Ninety images were captured during a measurement period of 3 seconds. The absolute linear camera resolution was approximately 20 pixels per mm. This cross-sectional study analysed 490 eyes of 245 subjects (mean age: 51.9 ± 18.3 years, range: 6–87 years). On average, pupil diameter decreased with increasing light intensities for both eyes, with a mean pupil diameter of 5.39 ± 1.04 mm at 0 lux, 5.20 ± 1.00 mm at 0.5 lux, 4.70 ± 0.97 mm at 4 lux, 3.74 ± 0.78 mm at 32 lux, and 2.84 ± 0.50 mm at 250 lux illumination. Furthermore, it was found that anisocoria increased by 0.03 mm per life decade for all illumination levels (R² = 0.43). Anisocoria was higher under scotopic and mesopic conditions. This study adds to current knowledge concerning age- and light-related pupil size and anisocoria as a baseline for future patient studies. PMID:28228832
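The reported age trend (about 0.03 mm of anisocoria per life decade, R² = 0.43) is an ordinary least-squares fit; a sketch of how such numbers are obtained from paired measurements (the synthetic data below are illustrative, not the study's):

```python
import numpy as np

def anisocoria_trend(ages, anisocoria):
    """OLS slope of anisocoria vs. age (reported per decade), plus R^2."""
    ages = np.asarray(ages, dtype=float)
    anisocoria = np.asarray(anisocoria, dtype=float)
    slope, intercept = np.polyfit(ages, anisocoria, 1)
    predicted = slope * ages + intercept
    ss_res = ((anisocoria - predicted) ** 2).sum()
    ss_tot = ((anisocoria - anisocoria.mean()) ** 2).sum()
    return slope * 10.0, 1.0 - ss_res / ss_tot

# Perfectly linear synthetic data: 0.003 mm of anisocoria per year of age,
# i.e. 0.03 mm per decade with R^2 = 1.
ages = np.array([20.0, 40.0, 60.0, 80.0])
per_decade, r2 = anisocoria_trend(ages, 0.003 * ages)
print(per_decade, r2)
```

With real per-subject measurements the residual scatter pulls R² below 1, as in the study's reported 0.43.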

  2. Design study of a normal conducting helical snake for AGS

    CERN Document Server

    Takano, Junpei; Okamura, Masahiro; Roser, Thomas; MacKay, William W; Luccio, Alfredo U; Takano, Koji

    2004-01-01

    A new normal conducting snake magnet is being fabricated for the Alternating Gradient Synchrotron (AGS) at Brookhaven National Laboratory (BNL). In the Relativistic Heavy Ion Collider (RHIC) project, superconducting helical dipole magnets were developed and performed successfully in high-energy polarized proton acceleration. The new AGS helical snake has the same basic magnetic structure but is more complicated. To achieve no beam shift and no beam deflection in one magnetic device, helical pitches and rotation angles were carefully calculated. Compared to a superconducting magnet, a normal warm magnet must have a large cross-sectional area of conductors, which makes it difficult to design a magnet with a large helical pitch. We developed a modified window-frame structure to accommodate the large number of conductors. Its three-dimensional magnetic field was simulated using OPERA3D/TOSCA. 3 Refs.

  3. TRASYS form factor matrix normalization

    Science.gov (United States)

    Tsuyuki, Glenn T.

    1992-01-01

    A method has been developed for adjusting a TRASYS enclosure form factor matrix to unity. This approach is not limited to closed geometries and, in fact, is primarily intended for use with open geometries. The purpose of this approach is to prevent optimistic form factors to space. In this method, nodal form factor sums are calculated to within 0.05 of unity using TRASYS, although deviations as large as 0.10 may be acceptable, and then a process is employed to distribute the difference amongst the nodes. A specific example has been analyzed with this method, and a comparison was performed with a standard approach for calculating radiation conductors. In this comparison, hot and cold case temperatures were determined. Exterior nodes exhibited temperature differences as large as 7 C and 3 C for the hot and cold cases, respectively, when compared with the standard approach, while interior nodes demonstrated temperature differences from 0 C to 5 C. These results indicate that temperature predictions can be artificially biased if the form factor computation error is lumped into the individual form factors to space.
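A minimal sketch of the row-normalization idea (not the TRASYS code itself: the abstract does not specify how the difference is distributed, so the proportional scaling below is one plausible choice; the 0.05/0.10 tolerances come from the description above):

```python
import numpy as np

def normalize_form_factors(F, tol=0.10):
    """Scale each row of a form factor matrix so it sums to unity.

    Rows are first checked against a tolerance on their deviation from
    unity (sums are expected within 0.05 of 1, with up to 0.10 acceptable);
    the residual is then distributed across each row in proportion to the
    existing form factors, i.e. by uniform row scaling.
    """
    F = np.asarray(F, dtype=float)
    row_sums = F.sum(axis=1)
    bad = np.flatnonzero(np.abs(row_sums - 1.0) > tol)
    if bad.size:
        raise ValueError(f"rows {bad} deviate from unity by more than {tol}")
    return F / row_sums[:, None]

# Two-node open enclosure whose rows fall slightly below / above unity.
F = np.array([[0.30, 0.68],
              [0.50, 0.52]])
print(normalize_form_factors(F).sum(axis=1))  # each row now sums to 1
```

Forcing each row to unity keeps the leftover closure error from being absorbed into the form factor to space, which is the bias the memo describes.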

  4. Characterizing Normal Groundwater Chemistry in Hawaii

    Science.gov (United States)

    Tachera, D.; Lautze, N. C.; Thomas, D. M.; Whittier, R. B.; Frazer, L. N.

    2017-12-01

    Hawaii is dependent on groundwater resources, yet how water moves through the subsurface is not well understood in many locations across the state. As marine air moves across the islands, water evaporates from the ocean, along with trace amounts of sea-salt ions, and interacts with anthropogenic and volcanic aerosols (e.g. sulfuric acid, ammonium sulfate, HCl), creating slightly more acidic rain. When this rain falls, it has a chemical signature distinctive of past processes. As this precipitation infiltrates through soil it may pick up another distinctive chemical signature associated with land use and degree of soil development, and as it flows through the underlying geology, its chemistry is influenced by the host rock. We are currently conducting an investigation of groundwater chemistry in selected aquifer areas of Hawaii, having diverse land use, land cover, and soil development conditions, in an effort to investigate and document what may be considered "normal" water chemistry for an area. Through this effort, we believe we can better assess anomalies due to contamination events, hydrothermal alteration, and other processes, and we can use this information to better understand groundwater flow direction. The project has compiled a large amount of precipitation, soil, and groundwater chemistry data in three focus areas distributed across the State of Hawaii. Statistical analyses of these data sets will be performed in an effort to determine what is "normal" and what is anomalous chemistry for a given area. Where possible, results will be used to trace groundwater flow paths. Methods and preliminary results will be presented.

  5. Turbocharging Normalization in Highland Conditions

    Directory of Open Access Journals (Sweden)

    I. V. Filippov

    2017-01-01

    Full Text Available Compressors of various types, including turbochargers, are used to supply compressed air for many production processes. The actual performance values of turbochargers used in highlands differ significantly from the certified values, and the parameters of the compressed air do not always guarantee smooth and efficient operation for consumers. The paper presents research results for turbochargers of the 4CI 425MX4 type, "CENTAC" series, manufactured by the Ingersoll-Rand Company. The research was conducted under industrial highland conditions in a difficult climatic environment. Turbochargers running in highland conditions have been investigated hardly at all: the combination of low atmospheric pressure with high intake-air temperature causes abnormal operating conditions for a turbocharger. Only N. M. Barannikov has published theoretical studies of such operating conditions; as to practical research, there is no information at all. To normalize turbocharger operation, mechanical pressurization of the suction pipe was adopted. As a result of theoretical research, a TurboMAX blower MAX500 was chosen as the supercharger. The next stage of the theoretical research was to construct characteristics of the 4CI 425MX4 turbocharger with a mechanical supercharger in the suction pipe. The boost reduces to a minimum the time additional compressors are used when the parameters of the intake air change, and it ensures smooth and efficient operation for consumers. To verify the results of the theoretical studies, namely the technique for recalculating the turbocharger characteristics under the real conditions of suction, experimental studies were carried out. The average error between experimental and theoretical data is 2.9783%, which confirms the validity of the technique used for reducing the turbocharger characteristics to those under the real conditions of suction.

  6. Vaginal Discharge: What's Normal, What's Not

    Science.gov (United States)

    Vaginal Discharge: What's Normal, What's Not (KidsHealth / For Teens) ... What Is Vaginal Discharge? Vaginal discharge is fluid that comes from ...

  7. Should Japan Become a Normal Country

    National Research Council Canada - National Science Library

    Yildiz, Ahmet

    2005-01-01

    This thesis evaluates Japanese geopolitical change in the post-Cold War era. It does so by analyzing Japan's history, its foreign policy since 1945, its reasons for becoming a normal country, and the impact of its normalization...

  8. COBE DMR-normalized open inflation cold dark matter cosmogony

    Science.gov (United States)

    Gorski, Krzysztof M.; Ratra, Bharat; Sugiyama, Naoshi; Banday, Anthony J.

    1995-01-01

    A cut-sky orthogonal mode analysis of the 2 year COBE DMR 53 and 90 GHz sky maps (in Galactic coordinates) is used to determine the normalization of an open inflation model based on the cold dark matter (CDM) scenario. The normalized model is compared to measures of large-scale structure in the universe. Although the DMR data alone does not provide sufficient discriminative power to prefer a particular value of the mass density parameter, the open model appears to be reasonably consistent with observations when Omega(sub 0) is approximately 0.3-0.4 and merits further study.

  9. Normal Spin Asymmetries in Elastic Electron-Proton Scattering

    International Nuclear Information System (INIS)

    M. Gorchtein; P.A.M. Guichon; M. Vanderhaeghen

    2004-01-01

    We discuss the two-photon exchange contribution to observables which involve lepton helicity flip in elastic lepton-nucleon scattering. This contribution is accessed through the single spin asymmetry for a lepton beam polarized normal to the scattering plane. We estimate this beam normal spin asymmetry at large momentum transfer using a parton model and we express the corresponding amplitude in terms of generalized parton distributions. We further discuss this observable in the quasi-RCS kinematics which may be dominant at certain kinematical conditions and find it to be governed by the photon helicity-flip RCS amplitudes

  10. Normal Spin Asymmetries in Elastic Electron-Proton Scattering

    International Nuclear Information System (INIS)

    Gorchtein, M.; Guichon, P.A.M.; Vanderhaeghen, M.

    2005-01-01

    We discuss the two-photon exchange contribution to observables which involve lepton helicity flip in elastic lepton-nucleon scattering. This contribution is accessed through the single spin asymmetry for a lepton beam polarized normal to the scattering plane. We estimate this beam normal spin asymmetry at large momentum transfer using a parton model and we express the corresponding amplitude in terms of generalized parton distributions. We further discuss this observable in the quasi-RCS kinematics which may be dominant at certain kinematical conditions and find it to be governed by the photon helicity-flip RCS amplitudes

  11. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with a first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite-level volume-preserving parametric normal form of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite-level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first-level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first-level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  12. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularities. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with a first integral, which is why our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite-level volume-preserving parametric normal form of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals is derived. The symmetry group of the infinite-level normal forms is also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first-level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first-level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  13. A comparison of vowel normalization procedures for language variation research

    Science.gov (United States)

    Adank, Patti; Smits, Roel; van Hout, Roeland

    2004-11-01

    An evaluation of vowel normalization procedures for the purpose of studying language variation is presented. The procedures were compared on how effectively they (a) preserve phonemic information, (b) preserve information about the talker's regional background (or sociolinguistic information), and (c) minimize anatomical/physiological variation in acoustic representations of vowels. Recordings were made for 80 female talkers and 80 male talkers of Dutch. These talkers were stratified according to their gender and regional background. The normalization procedures were applied to measurements of the fundamental frequency and the first three formant frequencies for a large set of vowel tokens. The normalization procedures were evaluated through statistical pattern analysis. The results show that normalization procedures that use information across multiple vowels (``vowel-extrinsic'' information) to normalize a single vowel token performed better than those that include only information contained in the vowel token itself (``vowel-intrinsic'' information). Furthermore, the results show that normalization procedures that operate on individual formants performed better than those that use information across multiple formants (e.g., ``formant-extrinsic'' F2-F1).

  14. A note on totally normal spaces

    International Nuclear Information System (INIS)

    Zougdani, H.K.

    1990-10-01

    In this note we give the necessary and sufficient condition for a topological space X such that the product space X x Y is totally normal for any (non discrete) metric space Y, and we show that a totally normal p-space need not be perfectly normal in general, which makes Theorem 2 doubtful. (author). 6 refs

  15. Manufacturing technology for practical Josephson voltage normals

    International Nuclear Information System (INIS)

    Kohlmann, Johannes; Kieler, Oliver

    2016-01-01

    In this contribution we present the manufacturing technology for the fabrication of integrated superconducting Josephson serial circuits for voltage normals. First we summarize some foundations for Josephson voltage normals and sketch the concept and the setup of the circuits, before we describe the manufacturing technology for modern practical Josephson voltage normals.

  16. Normal distal pulmonary vein anatomy

    Directory of Open Access Journals (Sweden)

    Wiesława Klimek-Piotrowska

    2016-01-01

    Full Text Available Background. It is well known that the pulmonary veins (PVs), especially their myocardial sleeves, play a critical role in the initiation and maintenance of atrial fibrillation. Understanding the PV anatomy is crucial for the safety and efficacy of all procedures performed on PVs. The aim of this study was to present normal distal PV anatomy and to create a juxtaposition of all PV ostium variants. Methods. A total of 130 randomly selected autopsied adult human hearts (Caucasian) were examined. The number of PV ostia was evaluated and their diameter was measured. The ostium-to-last-tributary distance and macroscopic presence of myocardial sleeves were also evaluated. Results. Five hundred forty-one PV ostia were identified. Four classical PV ostia patterns (two left and two right PVs) were observed in 70.8% of all cases. The most common variant was the classical pattern with an additional middle right PV (19.2%), followed by a common ostium for the left superior and inferior PVs (4.44%). Mean diameters of PV ostia (for the classical pattern) were: left superior = 13.8 ± 2.9 mm; left inferior = 13.3 ± 3.4 mm; right superior = 14.3 ± 2.9 mm; right inferior = 13.7 ± 3.3 mm. When present, the additional middle right PV ostium had the smallest PV ostium diameter in the heart (8.2 ± 4.1 mm). The mean ostium-to-last-tributary (closest to the atrium) distances were: left superior = 15.1 ± 4.6 mm; left inferior = 13.5 ± 4.0 mm; right superior = 11.8 ± 4.0 mm; right inferior = 11.0 ± 3.7 mm. There were no statistically significant differences between sexes in ostia diameters or ostium-to-last-tributary distances. Conclusion. Only 71% of the cases had four standard pulmonary veins. The middle right pulmonary vein was present in almost 20% of patients. The presented data can provide useful information for clinicians during interventional procedures or radiologic examinations of the PVs.

  17. Externally studentized normal midrange distribution

    Directory of Open Access Journals (Sweden)

    Ben Dêivide de Oliveira Batista

    Full Text Available ABSTRACT The distribution of the externally studentized midrange was created based on the original studentization procedures of Student and was inspired by the distribution of the externally studentized range. The widespread use of the externally studentized range in multiple comparisons was a further motivation for developing this new distribution. This work aimed to derive analytic equations for the distribution of the externally studentized midrange, obtaining the cumulative distribution, probability density and quantile functions and generating random values. The authors could find no report of this distribution in the literature. A second objective was to build an R package for obtaining numerically the probability density, cumulative distribution and quantile functions and make it available to the scientific community. The algorithms were proposed and implemented using Gauss-Legendre quadrature and the Newton-Raphson method in the R software, resulting in the SMR package, available for download from the CRAN site. The implemented routines showed high accuracy, verified using Monte Carlo simulations and by comparing results obtained with different numbers of quadrature points. Regarding the precision of the quantiles in cases where the degrees of freedom are close to 1 and the percentiles are close to 100%, it is recommended to use more than 64 quadrature points.
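    The numerical recipe named in the abstract (a CDF evaluated by Gauss-Legendre quadrature, inverted by Newton-Raphson to get quantiles) can be sketched in Python. The standard normal density below is only a stand-in for the far more involved studentized-midrange density, so the function names and tolerances here are illustrative assumptions, not the SMR package's actual code.

```python
import numpy as np

def make_cdf(pdf, lower=-10.0, n_points=64):
    """Build a CDF for `pdf` via n-point Gauss-Legendre quadrature on [lower, x]."""
    nodes, weights = np.polynomial.legendre.leggauss(n_points)

    def cdf(x):
        # Map the quadrature nodes from [-1, 1] onto [lower, x].
        mid, half = 0.5 * (lower + x), 0.5 * (x - lower)
        return half * float(np.sum(weights * pdf(mid + half * nodes)))

    return cdf

def quantile(cdf, pdf, p, x0=0.0, tol=1e-10, max_iter=50):
    """Invert the CDF by Newton-Raphson: x <- x - (F(x) - p) / f(x)."""
    x = x0
    for _ in range(max_iter):
        step = (cdf(x) - p) / pdf(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Stand-in density: standard normal (smooth, so the quadrature converges fast).
pdf = lambda x: np.exp(-0.5 * x * x) / np.sqrt(2.0 * np.pi)
cdf = make_cdf(pdf)
```

    With 64 quadrature points this recovers normal quantiles to high precision; the abstract's recommendation to use more than 64 points applies near the distribution's difficult corners (degrees of freedom close to 1, percentiles close to 100%).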

  18. Photometric normalization of LROC WAC images

    Science.gov (United States)

    Sato, H.; Denevi, B.; Robinson, M. S.; Hapke, B. W.; McEwen, A. S.; LROC Science Team

    2010-12-01

    The Lunar Reconnaissance Orbiter Camera (LROC) Wide Angle Camera (WAC) acquires near-global coverage on a monthly basis. The WAC is a push-frame sensor with a 90° field of view (FOV) in BW mode and 60° FOV in 7-color mode (320 nm to 689 nm). WAC images are acquired during each orbit in 10° latitude segments with cross-track coverage of ~50 km. Before mosaicking, WAC images are radiometrically calibrated to remove instrumental artifacts and to convert at-sensor radiance to I/F. Images are also photometrically normalized to common viewing and illumination angles (30° phase), a challenge due to the wide-angle nature of the WAC, where large differences in phase angle are observed in a single image line (±30°). During a single month the equatorial incidence angle drifts about 28°, and over the course of ~1 year the lighting completes a 360° cycle. The light scattering properties of the lunar surface depend on incidence (i), emission (e), and phase (p) angles as well as soil properties such as single-scattering albedo and roughness that vary with terrain type and state of maturity [1]. We first tested a Lommel-Seeliger correction (LSC) [cos(i)/(cos(i) + cos(e))] [2] with a phase function defined by an exponential decay plus a 4th-order polynomial term [3], which did not provide an adequate solution. Next we employed an LSC with an exponential 2nd-order decay phase correction, which was an improvement but still exhibited unacceptable frame-to-frame residuals. In both cases we fitted the LSC I/F vs. phase angle to derive the phase corrections. To date, the best results are with a lunar-Lambert function [4] with an exponential 2nd-order decay phase correction (LLEXP2) [(A1 exp(B1 p) + A2 exp(B2 p) + A3) cos(i)/(cos(i) + cos(e)) + B3 cos(i)]. We derived the parameters for the LLEXP2 from repeat imaging of a small region and then corrected that region with excellent results. When this correction was applied to the whole Moon the results were less than optimal - no surprise given the
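    The two photometric models quoted in the abstract translate directly into code. A minimal sketch, with hypothetical LLEXP2 coefficients (the fitted A1..B3 values are not given in the abstract):

```python
import math

def lommel_seeliger(i_deg, e_deg):
    """Lommel-Seeliger limb-darkening factor cos(i) / (cos(i) + cos(e))."""
    ci = math.cos(math.radians(i_deg))
    ce = math.cos(math.radians(e_deg))
    return ci / (ci + ce)

def llexp2(i_deg, e_deg, p_deg, A1, B1, A2, B2, A3, B3):
    """Lunar-Lambert with 2nd-order exponential phase decay (LLEXP2):
    (A1 exp(B1 p) + A2 exp(B2 p) + A3) * cos(i)/(cos(i)+cos(e)) + B3 cos(i)."""
    ci = math.cos(math.radians(i_deg))
    ce = math.cos(math.radians(e_deg))
    phase = A1 * math.exp(B1 * p_deg) + A2 * math.exp(B2 * p_deg) + A3
    return phase * ci / (ci + ce) + B3 * ci

def normalize_iof(iof, i_deg, e_deg, p_deg, params):
    """Scale an observed I/F to the standard geometry i=30 deg, e=0, p=30 deg."""
    return iof * llexp2(30.0, 0.0, 30.0, *params) / llexp2(i_deg, e_deg, p_deg, *params)
```

    An observation already taken at the standard geometry comes back unchanged, which is a quick sanity check on any candidate parameter set.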

  19. MR guided spatial normalization of SPECT scans

    International Nuclear Information System (INIS)

    Crouch, B.; Barnden, L.R.; Kwiatek, R.

    2010-01-01

    Full text: In SPECT population studies where magnetic resonance (MR) scans are also available, the higher resolution of the MR scans allows for an improved spatial normalization of the SPECT scans. In this approach, the SPECT images are first coregistered to their corresponding MR images by a linear (affine) transformation which is calculated using SPM's mutual information maximization algorithm. Non-linear spatial normalization maps are then computed either directly from the MR scans using SPM's built-in spatial normalization algorithm, or from segmented T1 MR images using DARTEL, an advanced diffeomorphism-based spatial normalization algorithm. We compare these MR-based methods to standard SPECT-based spatial normalization for a population of 27 fibromyalgia patients and 25 healthy controls with spin-echo T1 scans. We identify significant perfusion deficits in prefrontal white matter in FM patients, with the DARTEL-based spatial normalization procedure yielding stronger statistics than the standard SPECT-based spatial normalization. (author)

  20. Anomalous normal mode oscillations in semiconductor microcavities

    Energy Technology Data Exchange (ETDEWEB)

    Wang, H. [Univ. of Oregon, Eugene, OR (United States). Dept. of Physics; Hou, H.Q.; Hammons, B.E. [Sandia National Labs., Albuquerque, NM (United States)

    1997-04-01

    Semiconductor microcavities as a composite exciton-cavity system can be characterized by two normal modes. Under an impulsive excitation by a short laser pulse, optical polarizations associated with the two normal modes have a π phase difference. The total induced optical polarization is then expected to exhibit a sin²(Ωt)-like oscillation, where 2Ω is the normal mode splitting, reflecting a coherent energy exchange between the exciton and cavity. In this paper the authors present experimental studies of normal mode oscillations using three-pulse transient four-wave mixing (FWM). The result reveals, surprisingly, that when the cavity is tuned far below the exciton resonance, the normal mode oscillation in the polarization is cos²(Ωt)-like, in contrast to what is expected from the simple normal mode model. This anomalous normal mode oscillation reflects the important role of virtual excitation of electronic states in semiconductor microcavities.

  1. DNA-repair, cell killing and normal tissue damage

    International Nuclear Information System (INIS)

    Dahm-Daphi, J.; Dikomey, E.; Brammer, I.

    1998-01-01

    Background: Side effects of radiotherapy in normal tissue are determined by a variety of factors, of which the cellular and genetic contributions are described here. Material and methods: Review. Results: Normal tissue damage after irradiation is largely due to loss of cellular proliferative capacity. This can be due to mitotic cell death, apoptosis, or terminal differentiation. Dead or differentiated cells release cytokines which additionally modulate the tissue response. DNA damage, in particular non-reparable or misrepaired double-strand breaks, is considered the basic lesion leading to G1-arrest and ultimately to cell inactivation. Conclusion: Evidence for a genetic basis of normal tissue response, cell killing and DNA-repair capacity is presented. However, a direct link between all 3 endpoints has not yet been established. (orig.) [de

  2. Absorption of orally administered 65Zn by normal human subjects

    International Nuclear Information System (INIS)

    Aamodt, R.L.; Rumble, W.F.; Johnston, G.S.; Markley, E.J.; Henkin, R.I.

    1981-01-01

    Despite studies by several investigators of human gastrointestinal 65Zn absorption, the implications of these data for evaluation of functional zinc status are unclear because limited numbers of normal subjects have been studied. To evaluate zinc absorption in normal humans, 75 subjects (31 women, 44 men, ages 18 to 84 yr) were given 10 µCi of carrier-free 65Zn orally after an overnight fast. Absorption calculated from total body retention measured 7, 14, and 21 days after administration of the tracer was 65 +/- 11% (mean +/- 1 SD), with a range from 40 to 86%. Comparison of these results with those for patients with a variety of diseases indicates that patients exhibit a wider range of absorption and, in four of six studies, decreased mean zinc absorption. These results of gastrointestinal zinc absorption in a large number of normal humans offer a basis for a clearer comparison with data from patients who exhibit abnormalities of zinc absorption.

  3. Log-Normal Turbulence Dissipation in Global Ocean Models

    Science.gov (United States)

    Pearson, Brodie; Fox-Kemper, Baylor

    2018-03-01

    Data from turbulent numerical simulations of the global ocean demonstrate that the dissipation of kinetic energy obeys a nearly log-normal distribution even at large horizontal scales O(10 km). As the horizontal scales of resolved turbulence are larger than the ocean is deep, the Kolmogorov-Yaglom theory for intermittency in 3D homogeneous, isotropic turbulence cannot apply; instead, the down-scale potential enstrophy cascade of quasigeostrophic turbulence should. Yet, energy dissipation obeys approximate log-normality, robustly across depths, seasons, regions, and subgrid schemes. The distribution parameters, skewness and kurtosis, show small systematic departures from log-normality with depth and subgrid friction schemes. Log-normality suggests that a few high-dissipation locations dominate the integrated energy and enstrophy budgets, which should be taken into account when making inferences from simplified models and inferring global energy budgets from sparse observations.
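    The closing claim, that under log-normality a few locations dominate the integrated budget, is easy to see numerically. A sketch with synthetic samples (the spread parameter 2.0 is an arbitrary illustrative choice, not a value from the paper):

```python
import random

random.seed(42)

# Synthetic "dissipation" field: log-normal with a broad spread, so the
# log of each sample is normally distributed with sigma = 2.
samples = [random.lognormvariate(0.0, 2.0) for _ in range(100_000)]

samples.sort(reverse=True)
total = sum(samples)
top1_share = sum(samples[:1000]) / total  # share carried by the top 1% of locations
```

    For these parameters the top 1% of samples carries on the order of a third of the total, which is why sparse observations that miss the hotspots can badly underestimate a global budget.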

  4. Normal mode-guided transition pathway generation in proteins.

    Directory of Open Access Journals (Sweden)

    Byung Ho Lee

    Full Text Available The biological function of proteins is closely related to their structural motion. For instance, structurally misfolded proteins do not function properly. Although we are able to experimentally obtain structural information on proteins, it is still challenging to capture their dynamics, such as transition processes. Therefore, we need a simulation method to predict the transition pathways of a protein in order to understand and study large functional deformations. Here, we present a new simulation method called normal mode-guided elastic network interpolation (NGENI), which performs normal mode analysis iteratively to predict transition pathways of proteins. To be more specific, NGENI obtains displacement vectors that determine intermediate structures by interpolating the distance between two end-point conformations, similar to a morphing method called elastic network interpolation. However, the displacement vector is regarded as a linear combination of the normal mode vectors of each intermediate structure, in order to enhance the physical sense of the proposed pathways. As a result, we can generate transition pathways that are more reasonable both geometrically and thermodynamically. Even when using only the lowest normal modes rather than the full set, NGENI still generates reasonable pathways for large deformations in proteins. This study shows that global protein transitions are dominated by collective motion, which means that a few of the lowest normal modes play an important role in this process. NGENI also has considerable merit in terms of computational cost because it can generate transition pathways using only a subset of the degrees of freedom, which conventional methods cannot do.
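    The interpolation idea can be reduced to a toy sketch. The code below performs only the plain linear (morphing) step between two end-point conformations; NGENI additionally re-expresses each displacement in the normal-mode basis of the current intermediate structure, a projection that is omitted here.

```python
def interpolate_conformations(start, end, n_steps):
    """Linearly interpolate flattened Cartesian coordinates between two
    end-point conformations, returning n_steps + 1 structures."""
    path = []
    for s in range(n_steps + 1):
        t = s / n_steps
        path.append([(1.0 - t) * a + t * b for a, b in zip(start, end)])
    return path
```

    A real implementation would replace each straight-line displacement with its expansion in the lowest normal modes of the intermediate, which is what gives the pathway its physical plausibility.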

  5. Degassing a large LHe cryopump

    International Nuclear Information System (INIS)

    Denhoy, B.S.; Batzer, T.H.; Call, W.R.

    1977-01-01

    A method has been developed and successfully tested to degas a large LHe cryopump. Use of this method inhibits the normally excessive pressure rise during the degassing cycle when the degassing rate exceeds the external pumping capabilities of the system. A small appendage pump, installed close to the main cryopump, absorbs all the gas as it is desorbed from the main cryopump, with no rise in the system pressure. The appendage pump can then be isolated from the main vacuum system and degassed at high pressure. We pumped 15 to 20 × 10³ Torr·L of H₂ on a 1.25 m² panel. During the degassing cycle the system pressure never rose above 1 × 10⁻⁴ Torr. In large vacuum systems for future fusion machines that contain cryopump panels as well as cryogenic magnets, this method is a unique and very useful tool. It will allow the degassing of cryopumps without affecting the temperature equilibrium of cryogenic magnets.

  6. Effects of normalization on quantitative traits in association test

    Science.gov (United States)

    2009-01-01

    Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest. PMID:20003414
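    The rank-based transformation that performed best in these simulations can be sketched as an inverse normal transform: replace each trait value by the standard normal quantile of its fractional rank. The Blom offsets (0.375, 0.25) below are one common convention, and tie handling is omitted for brevity.

```python
from statistics import NormalDist

def rank_inverse_normal(values):
    """Map each value to the standard normal quantile of its Blom
    fractional rank; ties are not handled in this sketch."""
    n = len(values)
    order = sorted(range(n), key=lambda k: values[k])
    ranks = [0] * n
    for r, k in enumerate(order, start=1):
        ranks[k] = r
    nd = NormalDist()
    return [nd.inv_cdf((r - 0.375) / (n + 0.25)) for r in ranks]

# A strongly skewed trait becomes a symmetric, normal-looking score:
trait = [0.1, 0.4, 0.2, 5.0, 1.2, 0.8, 2.5, 0.05, 9.0, 0.6]
scores = rank_inverse_normal(trait)
```

    Because only ranks enter the transform, the ordering of subjects is preserved exactly while the marginal distribution is forced toward normality, which is what the association tests assume.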

  7. Effects of normalization on quantitative traits in association test

    Directory of Open Access Journals (Sweden)

    Yap Von Bing

    2009-12-01

    Full Text Available Abstract Background Quantitative trait loci analysis assumes that the trait is normally distributed. In reality, this is often not observed and one strategy is to transform the trait. However, it is not clear how much normality is required and which transformation works best in association studies. Results We performed simulations on four types of common quantitative traits to evaluate the effects of normalization using the logarithm, Box-Cox, and rank-based transformations. The impact of sample size and genetic effects on normalization is also investigated. Our results show that rank-based transformation gives generally the best and consistent performance in identifying the causal polymorphism and ranking it highly in association tests, with a slight increase in false positive rate. Conclusion For small sample size or genetic effects, the improvement in sensitivity for rank transformation outweighs the slight increase in false positive rate. However, for large sample size and genetic effects, normalization may not be necessary since the increase in sensitivity is relatively modest.

  8. Normalization and experimental design for ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Alekseyenko Artyom A

    2007-06-01

    Full Text Available Abstract Background Chromatin immunoprecipitation on tiling arrays (ChIP-chip has been widely used to investigate the DNA binding sites for a variety of proteins on a genome-wide scale. However, several issues in the processing and analysis of ChIP-chip data have not been resolved fully, including the effect of background (mock control subtraction and normalization within and across arrays. Results The binding profiles of Drosophila male-specific lethal (MSL complex on a tiling array provide a unique opportunity for investigating these topics, as it is known to bind on the X chromosome but not on the autosomes. These large bound and control regions on the same array allow clear evaluation of analytical methods. We introduce a novel normalization scheme specifically designed for ChIP-chip data from dual-channel arrays and demonstrate that this step is critical for correcting systematic dye-bias that may exist in the data. Subtraction of the mock (non-specific antibody or no antibody control data is generally needed to eliminate the bias, but appropriate normalization obviates the need for mock experiments and increases the correlation among replicates. The idea underlying the normalization can be used subsequently to estimate the background noise level in each array for normalization across arrays. We demonstrate the effectiveness of the methods with the MSL complex binding data and other publicly available data. Conclusion Proper normalization is essential for ChIP-chip experiments. The proposed normalization technique can correct systematic errors and compensate for the lack of mock control data, thus reducing the experimental cost and producing more accurate results.
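    As a concrete, deliberately generic illustration of dual-channel normalization, the sketch below centers the log2 IP/control ratios so their median over the array is zero; the scheme actually proposed in the abstract is more elaborate and also estimates background noise for normalization across arrays.

```python
import math
from statistics import median

def center_logratios(ip, control):
    """Dye-bias correction sketch for one dual-channel array: shift all
    log2(IP/control) ratios so the array-wide median is zero."""
    logratios = [math.log2(a / b) for a, b in zip(ip, control)]
    shift = median(logratios)
    return [m - shift for m in logratios]
```

    With the MSL data, autosomal probes (unbound control regions) would anchor the median near zero, so X-chromosome binding stands out as positive normalized ratios.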

  9. Nonischemic changes in right ventricular function on exercise. Do normal volunteers differ from patients with normal coronary arteries

    International Nuclear Information System (INIS)

    Caplin, J.L.; Maltz, M.B.; Flatman, W.D.; Dymond, D.S.

    1988-01-01

    Factors other than ischemia may alter right ventricular function both at rest and on exercise. Normal volunteers differ from cardiac patients with normal coronary arteries with regard to their left ventricular response to exercise. This study examined changes in right ventricular function on exercise in 21 normal volunteers and 13 patients with normal coronary arteries, using first-pass radionuclide angiography. There were large ranges of right ventricular ejection fraction in the two groups, both at rest and on exercise. Resting right ventricular ejection fraction was 40.2 +/- 10.6% (mean +/- SD) in the volunteers and 38.6 +/- 9.7% in the patients, p = not significant, and on exercise rose significantly in both groups to 46.1 +/- 9.9% and 45.8 +/- 9.7%, respectively. The difference between the groups was not significant. In both groups some subjects with high resting values showed large decreases in ejection fraction on exercise, and there were significant negative correlations between resting ejection fraction and the change on exercise, r = -0.59 (p less than 0.01) in volunteers, and r = -0.66 (p less than 0.05) in patients. Older volunteers tended to have lower rest and exercise ejection fractions, but there was no difference between normotensive and hypertensive patients in their rest or exercise values. In conclusion, changes in right ventricular function on exercise are similar in normal volunteers and in patients with normal coronary arteries. Some subjects show decreases in right ventricular ejection fraction on exercise which do not appear to be related to ischemia

  10. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
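    A minimal version of such a sampler for two correlated variables, using the 2x2 Cholesky factor of the correlation matrix (the method in the abstract targets full covariance matrices of resonance parameters; this sketch shows only the core idea, with the correlation specified on the underlying normals):

```python
import math
import random

def correlated_pairs(n, rho, log_second=False):
    """Draw n pairs (X, Y) of standard normals with correlation rho via
    the Cholesky factor of [[1, rho], [rho, 1]]; optionally exponentiate
    Y to obtain a log-normal marginal."""
    out = []
    s = math.sqrt(1.0 - rho * rho)
    for _ in range(n):
        z1 = random.gauss(0.0, 1.0)
        z2 = random.gauss(0.0, 1.0)
        x = z1
        y = rho * z1 + s * z2
        out.append((x, math.exp(y) if log_second else y))
    return out
```

    Exponentiating a normal coordinate yields a log-normal one, so mixed normal/log-normal vectors with a requested correlation structure on the underlying Gaussians come out of the same draw.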

  11. Bifurcation of equilibria between with and without a large island in the large helical device

    Energy Technology Data Exchange (ETDEWEB)

    Ohyabu, N; Narushima, Y; Nagayama, Y; Narihara, K; Morisaki, T; Komori, A [National Institute for Fusion Science, Toki, Gifu, 509-5292 (Japan)

    2005-09-01

    A rapid bifurcation of the equilibria with and without a large island (n/m = 1/1) has been observed in medium- to high-beta large helical device discharges. A large island imposed by an external resonant field is suddenly suppressed almost completely by plasma effects when the beta at the ι/2π = 1 surface exceeds a critical value. The critical beta value is nearly proportional to the externally imposed resonant field normalized by the main field strength.

  12. Instantons and Large N

    Science.gov (United States)

    Mariño, Marcos

    2015-09-01

    Preface; Part I. Instantons: 1. Instantons in quantum mechanics; 2. Unstable vacua in quantum field theory; 3. Large order behavior and Borel summability; 4. Non-perturbative aspects of Yang-Mills theories; 5. Instantons and fermions; Part II. Large N: 6. Sigma models at large N; 7. The 1/N expansion in QCD; 8. Matrix models and matrix quantum mechanics at large N; 9. Large N QCD in two dimensions; 10. Instantons at large N; Appendix A. Harmonic analysis on S3; Appendix B. Heat kernel and zeta functions; Appendix C. Effective action for large N sigma models; References; Author index; Subject index.

  13. Normal form theory and spectral sequences

    OpenAIRE

    Sanders, Jan A.

    2003-01-01

    The concept of unique normal form is formulated in terms of a spectral sequence. As an illustration of this technique some results of Baider and Churchill concerning the normal form of the anharmonic oscillator are reproduced. The aim of this paper is to show that spectral sequences give us a natural framework in which to formulate normal form theory. © 2003 Elsevier Science (USA). All rights reserved.

  14. Denotational Aspects of Untyped Normalization by Evaluation

    DEFF Research Database (Denmark)

    Filinski, Andrzej; Rohde, Henning Korsholm

    2005-01-01

    of soundness (the output term, if any, is in normal form and β-equivalent to the input term); identification (β-equivalent terms are mapped to the same result); and completeness (the function is defined for all terms that do have normal forms). We also show how the semantic construction enables a simple yet formal correctness proof for the normalization algorithm, expressed as a functional program in an ML-like, call-by-value language. Finally, we generalize the construction to produce an infinitary variant of normal forms, namely Böhm trees. We show that the three-part characterization of correctness

  15. Ultrasonographic features of normal lower ureters

    International Nuclear Information System (INIS)

    Kim, Young Soon; Bae, M. Y.; Park, K. J.; Jeon, H. S.; Lee, J. H.

    1990-01-01

    Although ultrasonographic evaluation of the normal ureters is difficult due to bowel gas, the lower segment of the normal ureters can be visualized using the urinary bladder as an acoustic window. The authors prospectively performed ultrasonography with the standard suprapubic technique and analyzed the ultrasonographic features of the normal lower ureters in 79 cases (77%). The length of the visualized segment of the distal ureter ranged from 1.5 cm to 7.2 cm, and the visualized segment did not exceed 3.9 mm in maximum diameter. Knowledge of the sonographic features of the normal lower ureters can be helpful in the evaluation of pathologic or suspected pathologic conditions of the lower ureters.

  16. Heat transport and electron cooling in ballistic normal-metal/spin-filter/superconductor junctions

    International Nuclear Information System (INIS)

    Kawabata, Shiro; Vasenko, Andrey S.; Ozaeta, Asier; Bergeret, Sebastian F.; Hekking, Frank W.J.

    2015-01-01

    We investigate electron cooling based on a clean normal-metal/spin-filter/superconductor junction. Due to the suppression of Andreev reflection by the spin-filter effect, the cooling power of the system is found to be substantially higher than that of conventional normal-metal/nonmagnetic-insulator/superconductor coolers. Therefore we can extract a large amount of heat from normal metals. Our results strongly indicate the practical usefulness of the spin-filter effect for cooling detectors, sensors, and quantum bits.

  17. Heat transport and electron cooling in ballistic normal-metal/spin-filter/superconductor junctions

    Energy Technology Data Exchange (ETDEWEB)

    Kawabata, Shiro, E-mail: s-kawabata@aist.go.jp [Electronics and Photonics Research Institute (ESPRIT), National Institute of Advanced Industrial Science and Technology (AIST), Tsukuba, Ibaraki 305-8568 (Japan); Vasenko, Andrey S. [LPMMC, Université Joseph Fourier and CNRS, 25 Avenue des Martyrs, BP 166, 38042 Grenoble (France); Ozaeta, Asier [Centro de Física de Materiales (CFM-MPC), Centro Mixto CSIC-UPV/EHU, Manuel de Lardizabal 5, E-20018 San Sebastián (Spain); Bergeret, Sebastian F. [Centro de Física de Materiales (CFM-MPC), Centro Mixto CSIC-UPV/EHU, Manuel de Lardizabal 5, E-20018 San Sebastián (Spain); Donostia International Physics Center (DIPC), Manuel de Lardizabal 5, E-20018 San Sebastián (Spain); Hekking, Frank W.J. [LPMMC, Université Joseph Fourier and CNRS, 25 Avenue des Martyrs, BP 166, 38042 Grenoble (France)

    2015-06-01

    We investigate electron cooling based on a clean normal-metal/spin-filter/superconductor junction. Due to the suppression of Andreev reflection by the spin-filter effect, the cooling power of the system is found to be much higher than that of conventional normal-metal/nonmagnetic-insulator/superconductor coolers. Therefore we can extract a large amount of heat from normal metals. Our results strongly indicate the practical usefulness of the spin-filter effect for cooling detectors, sensors, and quantum bits.

  18. Flight style optimization in ski jumping on normal, large, and ski flying hills.

    Science.gov (United States)

    Jung, Alexander; Staat, Manfred; Müller, Wolfram

    2014-02-07

    In V-style ski jumping, aerodynamic forces are predominant performance factors and athletes have to solve difficult optimization problems in fractions of a second in order to obtain their maximum jump length and to keep the flight stable. Here, a comprehensive set of wind tunnel data was used for optimization studies based on Pontryagin's minimum principle, with both the angle of attack α and the body-ski angle β as controls. Various combinations of the constraints αmax and βmin(t) were analyzed in order to compare different optimization strategies. For the computer simulation studies, the Olympic hill profiles in Esto-Sadok, Russia (HS 106 m, HS 140 m), and in Harrachov, Czech Republic, host of the Ski Flying World Championships 2014 (HS 205 m), were used. It is of high importance for ski jumping practice that various aerodynamic strategies, i.e. combinations of α- and β-time courses, can lead to similar jump lengths, which enables athletes to win competitions using individual aerodynamic strategies. Optimization results also show that aerodynamic behavior has to differ at different hill sizes (HS). Optimized time courses of α and β using reduced drag and lift areas, in order to mimic recent equipment regulations, differed only in a negligible way. This indicates that the optimization results presented here are not very sensitive to minor changes of the aerodynamic equipment features when similar jump lengths are obtained by using adequately higher in-run velocities. However, wind tunnel measurements with athletes, including take-off and transition to stabilized flight, flight, and landing behavior, would enable a more detailed understanding of individual flight style optimization. © 2013 Published by Elsevier Ltd.

  19. Large Logarithms in the Beam Normal Spin Asymmetry of Elastic Electron--Proton Scattering

    Energy Technology Data Exchange (ETDEWEB)

    Andrei Afanasev; Mykola Merenkov

    2004-06-01

    We study a parity-conserving single-spin beam asymmetry of elastic electron-proton scattering induced by an absorptive part of the two-photon exchange amplitude. It is demonstrated that excitation of inelastic hadronic intermediate states by the consecutive exchange of two photons leads to logarithmic and double-logarithmic enhancement due to contributions of hard collinear quasi-real photons. The asymmetry at small electron scattering angles is expressed in terms of the total photoproduction cross section on the proton, and is predicted to reach the magnitude of 20-30 parts per million. At these conditions and fixed 4-momentum transfers, the asymmetry is rising logarithmically with increasing electron beam energy, following the high-energy diffractive behavior of total photoproduction cross section on the proton.

  20. Large ring polymers align FtsZ polymers for normal septum formation

    NARCIS (Netherlands)

    Guendogdu, Muhammet E.; Kawai, Yoshikazu; Pavlendova, Nada; Ogasawara, Naotake; Errington, Jeff; Scheffers, Dirk-Jan; Hamoen, Leendert W.; Gündoğdu, Muhammet E.

    2011-01-01

    Cytokinesis in bacteria is initiated by polymerization of the tubulin homologue FtsZ into a circular structure at midcell, the Z-ring. This structure functions as a scaffold for all other cell division proteins. Several proteins support assembly of the Z-ring, and one such protein, SepF, is required

  1. Platelet cyclooxygenase expression in normal dogs.

    Science.gov (United States)

    Thomason, J; Lunsford, K; Mullins, K; Stokes, J; Pinchuk, L; Wills, R; McLaughlin, R; Langston, C; Pruett, S; Mackin, A

    2011-01-01

    Human platelets express both cyclooxygenase-1 (COX-1) and cyclooxygenase-2 (COX-2). Variation in COX-2 expression could be a mechanism for variable response to aspirin. The hypotheses were that circulating canine platelets express COX-1 and COX-2, and that aspirin alters COX expression. The objective was to identify changes in platelet COX expression and in platelet function caused by aspirin administration to dogs. Eight female, intact hounds. A single-population, repeated-measures design was used to evaluate platelet COX-1 and COX-2 expression by flow cytometry before and after aspirin (10 mg/kg Q12h for 10 days). Platelet function was analyzed via PFA-100® (collagen/epinephrine), and urine 11-dehydro-thromboxane B2 (11-dTXB2) was measured and normalized to urinary creatinine. Differences in COX expression, PFA-100® closure times, and urine 11-dTXB2:creatinine ratio were analyzed before and after aspirin administration. Both COX-1 and COX-2 were expressed in canine platelets. COX-1 mean fluorescence intensity (MFI) increased in all dogs, by 250% (range 63-476%), while COX-2 expression did not change significantly (P = 0.124) after aspirin exposure, with large interindividual variation. PFA-100® closure times were prolonged and urine 11-dTXB2 concentration decreased in all dogs after aspirin administration. Canine platelets express both COX isoforms. After aspirin exposure, COX-1 expression increased despite impairment of platelet function, while COX-2 expression varied markedly among dogs. Variability in platelet COX-2 expression should be explored as a potential mechanism for, or marker of, variable aspirin responsiveness. Copyright © 2011 by the American College of Veterinary Internal Medicine.

  2. Spinal cord normalization in multiple sclerosis.

    Science.gov (United States)

    Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S

    2014-01-01

    Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV) normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included the expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC length, and intracranial volume (ICV). There were group differences in raw SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS subtypes). Clinical-radiological correlations were strongest after normalization by length (EDSS: r = -.43; MSFC: r = .33; strength: r = .38; vibration: r = -.40) and height (EDSS: r = -.26; MSFC: r = .28; strength: r = .22; vibration: r = -.29), but diminished with normalization by ICV (EDSS: r = -.23; MSFC: r = -.10; strength: r = .23; vibration: r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw SCV. SCV normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC atrophy in MS. SCV normalization by length may enhance the clinical utility of measures of SC atrophy. Copyright © 2014 by the American Society of Neuroimaging.
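
The normalization strategies compared in this abstract all amount to dividing the C3-C4 cord volume by a subject-level factor (height, cord length, or intracranial volume). A minimal sketch, assuming a conventional rescaling by the cohort-mean factor so the normalized value keeps familiar volume units; the function name and numbers are illustrative, not taken from the study:

```python
def normalize_scv(scv, factor, cohort_mean_factor):
    """Normalize a spinal cord volume by a subject-level factor.

    scv: raw cord volume (e.g. mm^3 between C3 and C4)
    factor: this subject's normalization factor (e.g. cord length, mm)
    cohort_mean_factor: cohort mean of that factor, so the result
        stays in volume units rather than becoming a ratio.
    """
    return scv * (cohort_mean_factor / factor)

# Illustrative values: a 700 mm^3 segment in a subject whose cord
# length (175 mm) is slightly above a cohort mean of 170 mm.
scv_norm = normalize_scv(700.0, 175.0, 170.0)
```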

  3. Comparison of SUVs normalized by lean body mass determined by CT with those normalized by lean body mass estimated by predictive equations in normal tissues

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Woo Hyoung; Kim, Chang Guhn; Kim, Dae Weung [Wonkwang Univ. School of Medicine, Iksan (Korea, Republic of)

    2012-09-15

    Standardized uptake values (SUVs) normalized by lean body mass (LBM) determined by CT were compared with those normalized by LBM estimated using predictive equations (PEs) in normal liver, spleen, and aorta using ¹⁸F-FDG PET/CT. Fluorine-18 fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) was conducted on 453 patients. LBM determined by CT was defined in 3 ways (LBM-CT1 to LBM-CT3). Five PEs were used for comparison (LBM-PE1 to LBM-PE5). Tissue SUV normalized by LBM (SUL) was calculated using LBM from each method (SUL-CT1 to SUL-CT3, SUL-PE1 to SUL-PE5). Agreement between methods was assessed by Bland-Altman analysis. Percentage difference and percentage error were also calculated. For all liver SUL-CTs vs. liver SUL-PEs except liver SUL-PE3, the ranges of biases, SDs of percentage difference, and percentage errors were -0.17-0.24 SUL, 6.15-10.17%, and 25.07-38.91%, respectively. For liver SUL-CTs vs. liver SUL-PE3, the corresponding figures were 0.47-0.69 SUL, 10.90-11.25%, and 50.85-51.55%, respectively, showing the largest percentage errors and positive biases. Irrespective of the magnitudes of the biases, large percentage errors of 25.07-51.55% were observed between liver SUL-CT1-3 and liver SUL-PE1-5. The results of the spleen and aorta SUL-CT vs. SUL-PE comparisons were almost identical to those for the liver. The present study demonstrated substantial errors in individual SUL-PEs compared with SUL-CTs as a reference value. Normalization of SUV by LBM determined by CT rather than by PEs may be a useful approach to reduce errors in individual SUL-PEs.
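
The SUL values compared in this record follow the standard definition SUL = tissue activity concentration × LBM / injected dose. The sketch below uses the James predictive equation as one plausible PE; the study's five PEs are not listed in the abstract, so this choice is an assumption for illustration only.

```python
def lbm_james(weight_kg, height_cm, female=True):
    """James predictive equation for lean body mass (kg).

    One commonly used PE; whether it is among the study's five PEs
    is an assumption made here for illustration.
    """
    if female:
        return 1.07 * weight_kg - 148.0 * (weight_kg / height_cm) ** 2
    return 1.10 * weight_kg - 128.0 * (weight_kg / height_cm) ** 2

def sul(activity_kbq_per_ml, injected_dose_mbq, lbm_kg):
    """SUV normalized by lean body mass (g/mL)."""
    # tissue concentration [kBq/mL] * LBM [g] / injected dose [kBq]
    return activity_kbq_per_ml * (lbm_kg * 1000.0) / (injected_dose_mbq * 1000.0)
```

Substituting a CT-derived LBM for the PE-derived one in `sul` is exactly the substitution whose per-patient error the study quantifies.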

  4. Comparison of SUVs normalized by lean body mass determined by CT with those normalized by lean body mass estimated by predictive equations in normal tissues

    International Nuclear Information System (INIS)

    Kim, Woo Hyoung; Kim, Chang Guhn; Kim, Dae Weung

    2012-01-01

    Standardized uptake values (SUVs) normalized by lean body mass (LBM) determined by CT were compared with those normalized by LBM estimated using predictive equations (PEs) in normal liver, spleen, and aorta using ¹⁸F-FDG PET/CT. Fluorine-18 fluorodeoxyglucose (FDG) positron emission tomography/computed tomography (PET/CT) was conducted on 453 patients. LBM determined by CT was defined in 3 ways (LBM-CT1 to LBM-CT3). Five PEs were used for comparison (LBM-PE1 to LBM-PE5). Tissue SUV normalized by LBM (SUL) was calculated using LBM from each method (SUL-CT1 to SUL-CT3, SUL-PE1 to SUL-PE5). Agreement between methods was assessed by Bland-Altman analysis. Percentage difference and percentage error were also calculated. For all liver SUL-CTs vs. liver SUL-PEs except liver SUL-PE3, the ranges of biases, SDs of percentage difference, and percentage errors were -0.17-0.24 SUL, 6.15-10.17%, and 25.07-38.91%, respectively. For liver SUL-CTs vs. liver SUL-PE3, the corresponding figures were 0.47-0.69 SUL, 10.90-11.25%, and 50.85-51.55%, respectively, showing the largest percentage errors and positive biases. Irrespective of the magnitudes of the biases, large percentage errors of 25.07-51.55% were observed between liver SUL-CT1-3 and liver SUL-PE1-5. The results of the spleen and aorta SUL-CT vs. SUL-PE comparisons were almost identical to those for the liver. The present study demonstrated substantial errors in individual SUL-PEs compared with SUL-CTs as a reference value. Normalization of SUV by LBM determined by CT rather than by PEs may be a useful approach to reduce errors in individual SUL-PEs.

  5. Large Neighborhood Search

    DEFF Research Database (Denmark)

    Pisinger, David; Røpke, Stefan

    2010-01-01

    Heuristics based on large neighborhood search have recently shown outstanding results in solving various transportation and scheduling problems. Large neighborhood search methods explore a complex neighborhood by use of heuristics. Using large neighborhoods makes it possible to find better candidate solutions in each iteration and hence traverse a more promising search path. Starting from the large neighborhood search method, we give an overview of very large scale neighborhood search methods and discuss recent variants and extensions like variable depth search and adaptive large neighborhood...

  6. Large Ventral Hernia

    Directory of Open Access Journals (Sweden)

    Meryl Abrams, MD

    2018-04-01

    History of present illness: A 46-year-old female presented to the emergency department (ED) with diffuse abdominal pain and three days of poor oral intake associated with non-bilious, non-bloody vomiting. Initial vital signs consisted of a mild resting tachycardia of 111 and a temperature of 38.0 degrees Celsius (°C). On examination, the patient had a large pannus extending to the knees, which contained a hernia; she was tender in this region. Laboratory values included normal serum chemistries and a mild leukocytosis of 12.2. The patient reported that her abdomen had been enlarging over the previous 8 years but had not been painful until 3 days prior to presentation. The patient had no associated fever, chills, diarrhea, constipation, chest pain, or shortness of breath. Significant findings: Computed tomography (CT) scan with intravenous (IV) contrast of the abdomen and pelvis demonstrated a large pannus containing a ventral hernia with abdominal contents extending below the knees (white circle), elongation of the mesenteric vessels to accommodate abdominal contents outside of the abdomen (white arrow), and air-fluid levels (white arrow) indicating a small bowel obstruction. Discussion: Hernias are a common chief complaint seen in the emergency department. The estimated lifetime risk of a spontaneous abdominal hernia is 5%.1 The most common type of hernia is inguinal, while the next most common is femoral; femoral hernias are more common in women.1 Ventral hernias can be epigastric, incisional, or primary abdominal. An asymptomatic, reducible hernia can be followed up as an outpatient with a general surgeon for elective repair.2 Hernias become problematic when they are either incarcerated or strangulated. A hernia is incarcerated when it is irreducible and strangulated when its blood supply is compromised. A complicated hernia, especially strangulated, can have a mortality of greater than 50%.1 It is key to perform a thorough history

  7. An atlas of normal skeletal scintigraphy

    International Nuclear Information System (INIS)

    Flanagan, J.J.; Maisey, M.N.

    1985-01-01

    This atlas was compiled to provide the neophyte as well as the experienced radiologist and the nuclear medicine physician with a reference on normal skeletal scintigraphy as an aid in distinguishing normal variations in skeletal uptake from abnormal findings. Each skeletal scintigraph is labeled, and utilizing an identical scale, a relevant skeletal photograph and radiograph are placed adjacent to the scintigraph

  8. On normal modes in classical Hamiltonian systems

    NARCIS (Netherlands)

    van Groesen, Embrecht W.C.

    1983-01-01

    Normal modes of Hamiltonian systems that are even and of classical type are characterized as the critical points of a normalized kinetic energy functional on level sets of the potential energy functional. With the aid of this constrained variational formulation the existence of at least one family

  9. Computerized three-dimensional normal atlas

    International Nuclear Information System (INIS)

    Mano, Isamu; Suto, Yasuzo; Suzuki, Masataka; Iio, Masahiro.

    1990-01-01

    This paper presents our ongoing project in which normal human anatomy and its quantitative data are systematically arranged in a computer. The final product, the Computerized Three-Dimensional Normal Atlas, will be able to supply tomographic images in any direction, 3-D images, and coded information on organs, e.g., anatomical names, CT numbers, and T1 and T2 values. (author)

  10. Pseudo--Normals for Signed Distance Computation

    DEFF Research Database (Denmark)

    Aanæs, Henrik; Bærentzen, Jakob Andreas

    2003-01-01

    the relation of a point to a mesh. At the vertices and edges of a triangle mesh, the surface is not C^1 continuous. Hence, the normal is undefined at these loci. Thürmer and Wüthrich proposed the angle weighted pseudo-normal as a way to deal with this problem. In this paper, we...

  11. The lambda sigma calculus and strong normalization

    DEFF Research Database (Denmark)

    Schack-Nielsen, Anders; Schürmann, Carsten

    Explicit substitution calculi can be classified into several distinct categories depending on whether they are confluent, meta-confluent, strong normalization preserving, strongly normalizing, simulating, fully compositional, and/or local. In this paper we present a variant of the λσ-calculus, ...

  12. Optimal consistency in microRNA expression analysis using reference-gene-based normalization.

    Science.gov (United States)

    Wang, Xi; Gardiner, Erin J; Cairns, Murray J

    2015-05-01

    Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
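
The reference-gene-based (RGB) normalization the authors recommend can be sketched simply: each sample's miRNA intensities are divided by the geometric mean of a small set of constitutively expressed reference genes measured in the same sample, so a global shift in miRNA expression is preserved rather than flattened. The gene names and values below are made up for illustration.

```python
import math

def rgb_normalize(sample, reference_genes):
    """Divide every expression value in `sample` (a gene -> intensity
    mapping) by the geometric mean of the chosen reference genes."""
    refs = [sample[g] for g in reference_genes]
    geo_mean = math.exp(sum(math.log(r) for r in refs) / len(refs))
    return {gene: value / geo_mean for gene, value in sample.items()}

# Hypothetical sample with two small-RNA references (RNU6B, RNU44):
sample = {"miR-1": 8.0, "miR-2": 2.0, "RNU6B": 4.0, "RNU44": 1.0}
normalized = rgb_normalize(sample, ["RNU6B", "RNU44"])
```

Unlike quantile or scaling methods, a condition-wide doubling of most miRNAs would survive this transformation as long as the reference genes stay constant.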

  13. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large scale hydrogen production plants will be needed. In this context, development of low cost large scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the electrolysis modules currently available was compiled. A review of the large scale electrolysis plants that have been installed in the world was also carried out, and the main projects related to large scale electrolysis were listed. The economics of large scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)

  14. Large Pelagics Intercept Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Intercept Survey (LPIS) is a dockside survey of private and charterboat captains who have just completed fishing trips directed at large pelagic...

  15. The COBE normalization for standard cold dark matter

    Science.gov (United States)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer (COBE) satellite detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Ω_B, and H₀. For sCDM we find the mean value of Q = 19.9 ± 1.5 μK, corresponding to σ₈ = 1.34 ± 0.10, with the normalization at large scales being B = (8.16 ± 1.04) × 10⁵ (Mpc/h)⁴, and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10° is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of σ(10°) is quite consistent with the best-fitting mean value of Q. The use of the mean value of Q should be preferred over σ(10°), when its value can be determined for a particular theory, since it makes full use of the data.

  16. Large electrostatic accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1984-01-01

    The paper is divided into four parts: a discussion of the motivation for the construction of large electrostatic accelerators, a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year

  17. MR imaging of the ankle: Normal variants

    International Nuclear Information System (INIS)

    Noto, A.M.; Cheung, Y.; Rosenberg, Z.S.; Norman, A.; Leeds, N.E.

    1987-01-01

    Thirty asymptomatic ankles were studied with high-resolution surface coil MR imaging. The thirty ankles were reviewed for identification of normal structures. The MR appearance of the deltoid and posterior talo-fibular ligaments, peroneus brevis and longus tendons, and posterior aspect of the tibial-talar joint demonstrated several normal variants not previously described. These should not be misinterpreted as pathologic processes. The specific findings included (1) cortical irregularity of the posterior tibial-talar joint in 27 of 30 cases, which should not be mistaken for osteonecrosis; (2) a normal posterior talo-fibular ligament with irregular and frayed inhomogeneity, representing a normal variant, in seven of ten cases; and (3) fluid in the shared peroneal tendon sheath, which may be confused with a longitudinal tendon tear, in three of 30 cases. Ankle imaging with MR is still a relatively new procedure. Further investigation is needed to better define normal anatomy as well as normal variants. The authors describe several structures that normally present with variable MR imaging appearances. This is clinically significant in order to maintain a high sensitivity and specificity in MR imaging interpretation

  18. Defecography: A study of normal volunteers

    International Nuclear Information System (INIS)

    Shorvon, P.; Stevenson, G.W.; McHugh, S.; Somers, P.

    1987-01-01

    This study of young volunteers was set up in an effort to establish true normal measurements for defecography with minimum selection bias. The results describe the mean (and the range) for the following: anorectal angle; anorectal junction position at rest; excursion on lift, strain, and evacuation; anal canal length and degree of closure; and the frequency and degree of features such as rectocele and intussusception which have previously been called abnormalities. The results indicate that there is a very wide range of normal appearances. Knowledge of these normal variations is important to avoid overreporting and unnecessary surgery

  19. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations, the kind of which are encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  20. Normal-dispersion microresonator Kerr frequency combs

    Directory of Open Access Journals (Sweden)

    Xue Xiaoxiao

    2016-06-01

    Optical microresonator-based Kerr frequency comb generation has developed into a hot research area in the past decade. Microresonator combs are promising for portable applications due to their potential for chip-level integration and low power consumption. According to the group velocity dispersion of the microresonator employed, research in this field may be classified into two categories: the anomalous dispersion regime and the normal dispersion regime. In this paper, we discuss the physics of Kerr comb generation in the normal dispersion regime and review recent experimental advances. The potential advantages and future directions of normal dispersion combs are also discussed.

  1. Normal radiographic findings. 4. act. ed.

    International Nuclear Information System (INIS)

    Moeller, T.B.

    2003-01-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast media (KM). Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and for what to look in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)

  2. Normal radiographic findings. 4. act. ed.; Roentgennormalbefunde

    Energy Technology Data Exchange (ETDEWEB)

    Moeller, T.B. [Gemeinschaftspraxis fuer Radiologie und Nuklearmedizin, Dillingen (Germany)

    2003-07-01

    This book can serve the reader in three ways: First, it presents normal findings for all radiographic techniques, including contrast media (KM). Important data which are criteria of normal findings are indicated directly in the pictures and are also explained in full text and in summary form. Secondly, it teaches the systematics of interpreting a picture - how to look at it, what structures to regard in what order, and for what to look in particular. Checklists are presented in each case. Thirdly, findings are formulated in accordance with the image analysis procedure. All criteria of normal findings are defined in these formulations, which make them an important didactic element. (orig.)

  3. Imaging the corpus callosum, septum pellucidum and fornix in children: normal anatomy and variations of normality

    International Nuclear Information System (INIS)

    Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.

    2009-01-01

    The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)

  4. The total plasmatic estriol on normal gestation

    International Nuclear Information System (INIS)

    Thiesen, A.L.

    1980-01-01

    The total plasmatic estriol in normal pregnant women was determined by a radioimmunological method using estriol labelled with ¹²⁵I. The results obtained were similar to those of methods using ¹⁹C and ³H. (author)

  5. Terre Haute and the Normal School Fire

    Science.gov (United States)

    Ferreira, Allen

    1974-01-01

    This paper examines the short history of the Terre Haute Normal School before its tragic burning on April 9, 1888 and relates that story to the course of events immediately following the fire. (Author)

  6. Looking at Your Newborn: What's Normal

    Science.gov (United States)

    ... features that may make a normal newborn look strange are temporary. After all, babies develop while immersed ... sleepy during the first day or two of life. Many new parents become concerned about their newborn's ...

  7. Mental Health: What's Normal, What's Not?

    Science.gov (United States)

    Healthy Lifestyle Adult health Understanding what's considered normal mental health can be tricky. See how feelings, thoughts and behaviors determine mental health and how to recognize if you or a ...

  8. Compressed normalized block difference for object tracking

    Science.gov (United States)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust and real-time tracking. Compressive sensing has provided technical support for real-time feature extraction. However, all existing compressive trackers have been based on compressed Haar-like features, and how to compress other, more powerful high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise effectively in the high-dimensional normalized pixel difference (NPD) feature, the normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature is then obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than the other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR, and precision.
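
The compression step described in this abstract follows the standard compressive tracking recipe: a high-dimensional feature vector is projected to a low dimension by a sparse random Gaussian measurement matrix. A minimal sketch, where the dimensions, the density, and the use of a generic feature vector standing in for normalized block differences are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_gaussian_matrix(m, n, density=0.1):
    """Sparse measurement matrix: roughly `density` of the entries are
    drawn from N(0, 1); the rest are zero."""
    mask = rng.random((m, n)) < density
    return mask * rng.standard_normal((m, n))

def compress(feature, matrix):
    """Low-dimensional compressed feature y = R x."""
    return matrix @ feature

# Project a hypothetical 10,000-dimensional block-difference feature
# down to 50 compressed measurements.
R = sparse_gaussian_matrix(50, 10_000)
x = rng.random(10_000)
y = compress(x, R)
```

In a tracker, `R` is drawn once and reused every frame, so the per-frame cost is a single sparse matrix-vector product.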

  9. Forced Normalization: Antagonism Between Epilepsy and Psychosis.

    Science.gov (United States)

    Kawakami, Yasuhiko; Itoh, Yasuhiko

    2017-05-01

    The antagonism between epilepsy and psychosis has been discussed for a long time. Landolt coined the term "forced normalization" in the 1950s to describe psychotic episodes associated with the remission of seizures and disappearance of epileptiform activity on electroencephalograms in individuals with epilepsy. Since then, neurologists and psychiatrists have been intrigued by this phenomenon. However, although collaborative clinical studies and basic experimental researches have been performed, the mechanism of forced normalization remains unknown. In this review article, we present a historical overview of the concept of forced normalization, and discuss potential pathogenic mechanisms and clinical diagnosis. We also discuss the role of dopamine, which appears to be a key factor in the mechanism of forced normalization. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Efficient CEPSTRAL Normalization for Robust Speech Recognition

    National Research Council Canada - National Science Library

    Liu, Fu-Hua; Stern, Richard M; Huang, Xuedong; Acero, Alejandro

    1993-01-01

    .... We compare the performance of these algorithms with the very simple RASTA and cepstral mean normalization procedures, describing the performance of these algorithms in the context of the 1992 DARPA...

  11. Large errors and severe conditions

    CERN Document Server

    Smith, D L; Van Wormer, L A

    2002-01-01

    Physical parameters that can assume real-number values over a continuous range are generally represented by inherently positive random variables. However, if the uncertainties in these parameters are significant (large errors), conventional means of representing and manipulating the associated variables can lead to erroneous results. Instead, all analyses involving them must be conducted in a probabilistic framework. Several issues must be considered: First, non-linear functional relations between primary and derived variables may lead to significant 'error amplification' (severe conditions). Second, the commonly used normal (Gaussian) probability distribution must be replaced by a more appropriate function that avoids the occurrence of negative sampling results. Third, both primary random variables and those derived through well-defined functions must be dealt with entirely in terms of their probability distributions. Parameter 'values' and 'errors' should be interpreted as specific moments of these probabil...
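One concrete way to honor the positivity constraint the authors describe is to replace the normal distribution with a lognormal one matched to the desired mean and relative error. The sketch below uses an invented 50% relative uncertainty for illustration; it is not data from the paper.

```python
import math
import random

# An inherently positive parameter with large (50%) relative uncertainty.
# A normal model would put appreciable probability on negative values;
# a lognormal with matched mean and variance keeps every sample positive.
mean, rel_err = 1.0, 0.5
sigma = math.sqrt(math.log(1.0 + rel_err ** 2))  # lognormal shape parameter
mu = math.log(mean) - 0.5 * sigma ** 2           # reproduces the desired mean

rnd = random.Random(42)
samples = [math.exp(rnd.gauss(mu, sigma)) for _ in range(20000)]
sample_mean = sum(samples) / len(samples)
```

The moment matching uses E[X] = exp(mu + sigma^2/2) for a lognormal variate, so the sample mean recovers the nominal parameter value while every draw remains strictly positive.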

  12. Optical Observations of X-ray Bright, Optically Normal Galaxies

    Science.gov (United States)

    Sadun, Alberto C.; Aryan, N. S.; Ghosh, K. K.

    2007-05-01

    X-ray bright, optically normal galaxies (XBONGs) are galaxies that seem to have normal spectra and morphology, but are relatively bright X-ray sources. The large ratio of X-ray to optical emission suggests that some activity, similar to that of active galactic nuclei (AGN), is occurring. Since these galaxies do not show any obvious sign of nuclear activity in their optical spectra, one possible explanation is that they do not have an optically thick accretion disk at small radii, as previously assumed. Previous data classified NGC 7626 as an XBONG, so we are studying the optical features of this galaxy in order to characterize it better. After confirming an X-ray jet, we are now comparing it to optical features that we have found, including warped dust lanes and a possible optical jet.

  13. Predicting consonant recognition and confusions in normal-hearing listeners

    DEFF Research Database (Denmark)

    Zaar, Johannes; Dau, Torsten

    2017-01-01

    , Kollmeier, and Kohlrausch [(1997). J. Acoust. Soc. Am. 102, 2892–2905]. The model was evaluated based on the extensive consonant perception data set provided by Zaar and Dau [(2015). J. Acoust. Soc. Am. 138, 1253–1267], which was obtained with normal-hearing listeners using 15 consonant-vowel combinations...... confusion groups. The large predictive power of the proposed model suggests that adaptive processes in the auditory preprocessing in combination with a cross-correlation based template-matching back end can account for some of the processes underlying consonant perception in normal-hearing listeners....... The proposed model may provide a valuable framework, e.g., for investigating the effects of hearing impairment and hearing-aid signal processing on phoneme recognition....

  14. The ambivert: A failed attempt at a normal personality.

    Science.gov (United States)

    Davidson, Ian J

    2017-09-01

    Recently, attention has been drawn toward an overlooked and nearly forgotten personality type: the ambivert. This paper presents a genealogy of the ambivert, locating the various contexts it traversed in order to highlight the ways in which these places and times have interacted and changed, ultimately elucidating our current situation. Proposed by Edmund S. Conklin in 1923, the ambivert was meant only for normal persons in between the introvert and extravert extremes. Although the ambivert could have been taken up by early personality psychologists who were transitioning from the study of the abnormal to the normal, it largely failed to gain traction. Whether among psychoanalysts, psychiatrists, or applied and personality psychologists, the ambivert was personality non grata. It was only within the context of Eysenck's integrative view of types and traits that the ambivert marginally persisted up to the present day; it is now the focus of sales management and popular psychology. © 2017 Wiley Periodicals, Inc.

  15. Normal and Abnormal Behavior in Early Childhood

    OpenAIRE

    Spinner, Miriam R.

    1981-01-01

    Evaluation of normal and abnormal behavior in the period to three years of age involves many variables. Parental attitudes, determined by many factors such as previous childrearing experience, the bonding process, parental psychological status and parental temperament, often influence the labeling of behavior as normal or abnormal. This article describes the forms of crying, sleep and wakefulness, and affective responses from infancy to three years of age.

  16. Right thoracic curvature in the normal spine

    Directory of Open Access Journals (Sweden)

    Masuda Keigo

    2011-01-01

    Full Text Available Abstract Background Trunk asymmetry and vertebral rotation, at times observed in the normal spine, resemble the characteristics of adolescent idiopathic scoliosis (AIS). Right thoracic curvature has also been reported in the normal spine. If it is determined that the features of right thoracic curvature in the normal spine are the same as those observed in AIS, these findings might provide a basis for elucidating the etiology of this condition. For this reason, we investigated right thoracic curvature in the normal spine. Methods For normal spinal measurements, 1,200 patients who underwent posteroanterior chest radiographs were evaluated. These consisted of 400 children (ages 4-9), 400 adolescents (ages 10-19) and 400 adults (ages 20-29), with each group comprised of both genders. The exclusion criteria were obvious chest and spinal diseases. As side curvature is minimal in normal spines and the range at which curvature is measured is difficult to ascertain, the typical curvature range in scoliosis patients was first determined, and the Cobb angle in normal spines was then measured over the same range as the scoliosis curve, from T5 to T12. Right thoracic curvature was given a positive value. The curve pattern in each group was classified into three categories: neutral (from -1 degree to 1 degree), right (> +1 degree), and left (< -1 degree). Results In the child group, 120 spines curved left, 125 were neutral and 155 curved right. In the adolescent group, 70 curved left, 114 were neutral and 216 curved right. In the adult group, 46 curved left, 102 were neutral and 252 curved right. The curvature pattern shifted to the right in the adolescent group (p ... Conclusions Based on standing chest radiographic measurements, a right thoracic curvature was observed in normal spines after adolescence.

  17. Advancing Normal Birth: Organizations, Goals, and Research

    OpenAIRE

    Hotelling, Barbara A.; Humenick, Sharron S.

    2005-01-01

    In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...

  18. Distinguishing hyperhidrosis and normal physiological sweat production

    DEFF Research Database (Denmark)

    Thorlacius, Linnea; Gyldenløve, Mette; Zachariae, Claus

    2015-01-01

    of this study was to establish reference intervals for normal physiological axillary and palmar sweat production. METHODS: Gravimetric testing was performed in 75 healthy control subjects. Subsequently, these results were compared with findings in a cohort of patients with hyperhidrosis and with the results...... 100 mg/5 min. CONCLUSIONS: A sweat production rate of 100 mg/5 min as measured by gravimetric testing may be a reasonable cut-off value for distinguishing axillary and palmar hyperhidrosis from normal physiological sweat production....

  19. Normalization based K means Clustering Algorithm

    OpenAIRE

    Virmani, Deepali; Taneja, Shweta; Malhotra, Geetika

    2015-01-01

    K-means is an effective clustering technique used to separate similar data into groups based on initial centroids of clusters. In this paper, a normalization-based K-means clustering algorithm (N-K means) is proposed. The proposed N-K means algorithm applies normalization to the available data prior to clustering, and calculates initial centroids based on weights. Experimental results prove the betterment of the proposed N-K means clustering algorithm over existing...
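The idea of normalizing before clustering can be sketched as follows. This is a generic min-max normalization plus plain k-means with a simple deterministic initialization, standing in for the paper's weight-based centroid selection, which the abstract does not specify.

```python
def min_max_normalize(data):
    # Scale each feature to [0, 1] so no single feature dominates
    # the Euclidean distances used by k-means.
    cols = list(zip(*data))
    lo, hi = [min(c) for c in cols], [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)] for row in data]

def kmeans(data, k, iters=20):
    # Deterministic initialization with evenly spaced points (an
    # assumption, in place of the paper's weight-based centroids).
    centroids = [data[i * len(data) // k] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in data:
            nearest = min(range(k), key=lambda j: sum(
                (a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[nearest].append(p)
        centroids = [[sum(c) / len(c) for c in zip(*cl)] if cl else centroids[i]
                     for i, cl in enumerate(clusters)]
    return centroids, clusters

points = [[0.0, 0.0], [0.1, 0.2], [0.2, 0.1],
          [5.0, 5.0], [5.1, 5.2], [4.9, 5.0]]
centroids, clusters = kmeans(min_max_normalize(points), k=2)
```

On this toy data the two well-separated groups of three points each end up in separate clusters after normalization.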

  20. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
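The acceptance-rejection idea the abstract describes can be illustrated with the classic exponential-envelope method for the standard normal. This is a sketch in Python rather than the paper's FORTRAN routines for the CYBER 175/CDC 6600, and it omits the subregion decomposition the authors use.

```python
import math
import random

def normal_via_rejection(rnd):
    # Draw a candidate from an Exp(1) envelope and accept it with
    # probability exp(-(x - 1)^2 / 2); accepted values follow a
    # half-normal distribution, and a random sign makes the result
    # standard normal.
    while True:
        x = -math.log(1.0 - rnd.random())  # Exp(1) sample, argument > 0
        if rnd.random() <= math.exp(-(x - 1.0) ** 2 / 2.0):
            return x if rnd.random() < 0.5 else -x

rnd = random.Random(7)
samples = [normal_via_rejection(rnd) for _ in range(20000)]
mean = sum(samples) / len(samples)
second_moment = sum(s * s for s in samples) / len(samples)
```

The acceptance probability here is about 76%, which is why production generators (like the subregion methods in the paper) refine the envelope to waste fewer candidates.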

  1. Normalizing biomedical terms by minimizing ambiguity and variability

    Directory of Open Access Journals (Sweden)

    McNaught John

    2008-04-01

    Full Text Available Abstract Background One of the difficulties in mapping biomedical named entities, e.g. genes, proteins, chemicals and diseases, to their concept identifiers stems from the potential variability of the terms. Soft string matching is a possible solution to the problem, but its inherent heavy computational cost discourages its use when the dictionaries are large or when real time processing is required. A less computationally demanding approach is to normalize the terms by using heuristic rules, which enables us to look up a dictionary in a constant time regardless of its size. The development of good heuristic rules, however, requires extensive knowledge of the terminology in question and thus is the bottleneck of the normalization approach. Results We present a novel framework for discovering a list of normalization rules from a dictionary in a fully automated manner. The rules are discovered in such a way that they minimize the ambiguity and variability of the terms in the dictionary. We evaluated our algorithm using two large dictionaries: a human gene/protein name dictionary built from BioThesaurus and a disease name dictionary built from UMLS. Conclusions The experimental results showed that automatically discovered rules can perform comparably to carefully crafted heuristic rules in term mapping tasks, and the computational overhead of rule application is small enough that a very fast implementation is possible. This work will help improve the performance of term-concept mapping tasks in biomedical information extraction especially when good normalization heuristics for the target terminology are not fully known.
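The rule-based constant-time lookup the authors describe can be sketched like this. The specific rules below (case folding, separator unification, whitespace collapsing) are illustrative stand-ins, not the rules their algorithm discovers, and the concept identifier is made up.

```python
import re

RULES = [
    lambda t: t.lower(),                        # case folding
    lambda t: re.sub(r"[-_/]", " ", t),         # unify separators
    lambda t: re.sub(r"\s+", " ", t).strip(),   # collapse whitespace
]

def normalize(term):
    # Apply each normalization rule in order.
    for rule in RULES:
        term = rule(term)
    return term

# Normalizing both the dictionary keys and the query turns lookup into
# a constant-time hash access, regardless of dictionary size.
dictionary = {normalize(t): "EX:0001"           # "EX:0001" is hypothetical
              for t in ["NF-kappaB", "NF kappaB"]}
concept = dictionary.get(normalize("nf-KappaB"))
```

Because every surface variant maps to the same canonical string, no soft string matching is needed at query time, which is the computational advantage the abstract highlights.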

  2. Large mass storage facility

    International Nuclear Information System (INIS)

    Peskin, A.M.

    1978-01-01

    The report of a committee to study the questions surrounding possible acquisition of a large mass-storage device is presented. The current computing environment at BNL and justification for an online large mass storage device are briefly discussed. Possible devices to meet the requirements of large mass storage are surveyed, including future devices. The future computing needs of BNL are prognosticated. 2 figures, 4 tables

  3. Computerized tomography and head growth curve infantile macrocephaly with normal psychomotor development

    International Nuclear Information System (INIS)

    Eda, Isematsu; Kitahara, Tadashi; Takashima, Sachio; Takeshita, Kenzo

    1982-01-01

    Macrocephaly was defined as a head measuring larger than the 98th percentile. We evaluated CT findings and head growth curves in 25 infants with large heads. Ten (40%) of the 25 infants with large heads were normal developmentally and neurologically. Five (20%) were mentally retarded. The other 10 infants (40%) included hydrocephalus (4 cases), malformation syndrome (3 cases), brain tumor (1 case), metabolic disorder (1 case) and degenerative disorder (1 case). Their head growth curves were classified as types (I), (II) and (III): type (I), excessive head growth relative to 2 SDs above normal; type (II), head growth gradually approaching 2 SDs above normal; type (III), head growth parallel to 2 SDs above normal. The 10 infants with macrocephaly and normal psychomotor development, all male, were studied clinically and radiologically in detail. Their CT pictures showed normal or variously abnormal findings: ventricular dilatation, wide frontal and temporal subdural spaces, wide interhemispheric fissures, wide cerebral sulci, and large sylvian fissures. The CT findings in 2 of these infants, which became normal after repeated CT examinations, resembled benign subdural collection; in one they indicated external hydrocephalus. Head growth curves were obtained for 8 of these infants: six revealed type (II) and two type (III); the remaining 2 cases could not be followed up. We conclude that CT findings in infants with macrocephaly and normal psychomotor development are normal or variously abnormal (ventricular dilatation, benign subdural collection, external hydrocephalus), and that their head growth curves are at least not excessive. Infants with mental retardation showed similar CT findings and head growth curves to those with normal psychomotor development; it was difficult to distinguish normal from mentally retarded infants by either CT findings or head growth curves. (author)

  4. Cortical Thinning in Network-Associated Regions in Cognitively Normal and Below-Normal Range Schizophrenia

    Directory of Open Access Journals (Sweden)

    R. Walter Heinrichs

    2017-01-01

    Full Text Available This study assessed whether cortical thickness across the brain, and regionally in terms of the default mode, salience, and central executive networks, differentiates schizophrenia patients and healthy controls with normal-range or below-normal-range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T = 50 ± 10), and structural magnetic resonance imaging was used to generate cortical thickness data. Whole-brain analysis revealed that cognitively normal-range controls (n = 39) had greater cortical thickness than both cognitively normal-range (n = 17) and below-normal-range (n = 49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience networks, but not the central executive network. No differences on any thickness measure were found between cognitively normal-range and below-normal-range controls (n = 24), or between cognitively normal-range and below-normal-range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain, and regionally in relation to the default and salience networks, may index shared aspects of the psychotic psychopathology that defines schizophrenia, with no relation to cognitive impairment.

  5. A compendium of canine normal tissue gene expression.

    Directory of Open Access Journals (Sweden)

    Joseph Briggs

    Full Text Available BACKGROUND: Our understanding of disease is increasingly informed by changes in gene expression between normal and abnormal tissues. The release of the canine genome sequence in 2005 provided an opportunity to better understand human health and disease using the dog as a clinically relevant model. Accordingly, we now present the first genome-wide, canine normal tissue gene expression compendium with corresponding human cross-species analysis. METHODOLOGY/PRINCIPAL FINDINGS: The Affymetrix platform was utilized to catalogue gene expression signatures of 10 normal canine tissues including: liver, kidney, heart, lung, cerebrum, lymph node, spleen, jejunum, pancreas and skeletal muscle. The quality of the database was assessed in several ways. Organ-defining gene sets were identified for each tissue, and functional enrichment analysis revealed themes consistent with known physio-anatomic functions for each organ. In addition, a comparison of orthologous gene expression between matched canine and human normal tissues uncovered remarkable similarity. To demonstrate the utility of this dataset, novel canine gene annotations were established based on comparative analysis of dog and human tissue-selective gene expression and manual curation of canine probeset mapping. Public access, using infrastructure identical to that currently in use for human normal tissues, has been established and allows for additional comparisons across species. CONCLUSIONS/SIGNIFICANCE: These data advance our understanding of the canine genome through a comprehensive analysis of gene expression in a diverse set of tissues, contributing to improved functional annotation that has been lacking. Importantly, it will be used to inform future studies of disease in the dog as a model for human translational research and provides a novel resource to the community at large.

  6. Parser Adaptation for Social Media by Integrating Normalization

    NARCIS (Netherlands)

    van der Goot, Rob; van Noord, Gerardus

    This work explores normalization for parser adaptation. Traditionally, normalization is used as separate pre-processing step. We show that integrating the normalization model into the parsing algorithm is beneficial. This way, multiple normalization candidates can be leveraged, which improves

  7. Large N Scalars

    DEFF Research Database (Denmark)

    Sannino, Francesco

    2016-01-01

    We construct effective Lagrangians, and corresponding counting schemes, valid to describe the dynamics of the lowest lying large N stable massive composite state emerging in strongly coupled theories. The large N counting rules can now be employed when computing quantum corrections via an effective...

  8. Large bowel resection

    Science.gov (United States)

    ... blockage in the intestine due to scar tissue Colon cancer Diverticular disease (disease of the large bowel) Other reasons for bowel resection are: Familial polyposis (polyps are growths on the lining of the colon or rectum) Injuries that damage the large bowel ...

  9. Option Pricing with Asymmetric Heteroskedastic Normal Mixture Models

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars

    This paper uses asymmetric heteroskedastic normal mixture models to fit return data and to price options. The models can be estimated straightforwardly by maximum likelihood, have high statistical fit when used on S&P 500 index return data, and allow for substantial negative skewness and time...... varying higher order moments of the risk neutral distribution. When forecasting out-of-sample a large set of index options between 1996 and 2009, substantial improvements are found compared to several benchmark models in terms of dollar losses and the ability to explain the smirk in implied volatilities...

  10. The uranium market: on the road to normality?

    International Nuclear Information System (INIS)

    Max, A.; Keese, H.

    1988-01-01

    In the international market for natural uranium, the basic forces of supply and demand were hampered for many years by (1) political interventions, in particular the specific import and export policies of the main uranium-producing countries; and (2) inflexible uranium enrichment contract policies. These obstacles to a normal balance of uranium supply and demand have largely been overcome in recent years. However, current legislative and judicial moves in the USA could restrict the usage of non-US-origin uranium. Alternatively, the USA/Canada Free Trade Agreement, if ratified by the US Congress and the Canadian Parliament, could pave the way for a smooth development of the future uranium market. 1 fig

  11. Fetal cerebral biometry: normal parenchymal findings and ventricular size

    International Nuclear Information System (INIS)

    Garel, C.

    2005-01-01

    Assessing fetal cerebral biometry is one means of ascertaining that the development of the fetal central nervous system is normal. Norms have been established on large cohorts of fetuses by sonographic and neurofetopathological studies. Biometric standards have been established in MR in much smaller cohorts. The purpose of this paper is to analyse methods of measuring a few parameters in MR [biparietal diameter (BPD), fronto-occipital diameter (FOD), length of the corpus callosum (LCC), atrial diameter, transverse cerebellar diameter, height, anteroposterior diameter and surface of the vermis] and to compare US and MR in the assessment of fetal cerebral biometry. (orig.)

  12. Adaptive Large Neighbourhood Search

    DEFF Research Database (Denmark)

    Røpke, Stefan

    Large neighborhood search is a metaheuristic that has gained popularity in recent years. The heuristic repeatedly moves from solution to solution by first partially destroying the solution and then repairing it. The best solution observed during this search is presented as the final solution....... This tutorial introduces the large neighborhood search metaheuristic and the variant adaptive large neighborhood search that dynamically tunes parameters of the heuristic while it is running. Both heuristics belong to a broader class of heuristics that search a solution space using very large...... neighborhoods. The tutorial also presents applications of the adaptive large neighborhood search, mostly related to vehicle routing problems for which the heuristic has been extremely successful. We discuss how the heuristic can be parallelized and thereby take advantage of modern desktop computers...

  13. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that this distribution gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines rigorously the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.
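Written out, the prior density described in the abstract is, for hyperparameters θ̄ ∈ Θ and γ > 0:

```latex
p(\theta \mid \bar{\theta}, \gamma) \;\propto\;
\exp\!\left( -\,\frac{d^{2}(\theta, \bar{\theta})}{2\gamma^{2}} \right),
\qquad \theta \in \Theta,
```

where d(θ, θ̄) is Rao's Riemannian distance on the univariate normal model, so the prior plays the role of a Gaussian centered at θ̄ with spread γ, but measured in the model's intrinsic geometry rather than in parameter coordinates.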

  14. Quantitative thallium-201 myocardial exercise scintigraphy in normal subjects and patients with normal coronary arteries

    International Nuclear Information System (INIS)

    Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.

    1990-01-01

    Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards for cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and on electrocardiography at rest and during exercise. Group II comprised 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with the normal limits developed by Maddahi et al. Furthermore, low-likelihood and angiographically normal patients may differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with the normal limits of uptake and washout of thallium-201 derived from the less heterogeneous group of low-likelihood subjects, which should be used when selecting a normal population to define normality. (author). 37 refs.; 3 figs; 1 tab

  15. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    Science.gov (United States)

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study of the en route mid-air collision between a commercial carrier and an executive jet, in the clear afternoon Amazon sky in which 154 people lost their lives, that illustrates one response to this challenge. Our focus was on how and why the several safety barriers of a well structured air traffic system melted down enabling the occurrence of this tragedy, without any catastrophic component failure, and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of system day-to-day functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approach and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  16. Computed tomography of the normal sternum

    International Nuclear Information System (INIS)

    Goodman, L.R.; Teplick, S.K.; Kay, H.

    1983-01-01

    The normal CT anatomy of the sternum was studied in 35 patients. In addition to the normal appearance of the sternum, normal variants that may mimic disease were often noted. In the manubrium, part of the posterior cortical margin was unsharp and irregular in 34 of 35 patients. Part of the anterior cortical margin was indistinct in 20 of the 35 patients. Angulation of the CT gantry to a position more nearly perpendicular to the manubrium improved the definition of the cortical margins. The body of the sternum was ovoid to rectangular and usually had sharp cortical margins. Sections through the manubriosternal joint and xiphoid often demonstrated irregular mottled calcifications and indistinct margins, again simulating bony lesions. The rib insertions, sternoclavicular joints, and adjacent soft-tissue appearance were also evaluated

  17. Masturbation, sexuality, and adaptation: normalization in adolescence.

    Science.gov (United States)

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  18. Asymptotic normalization coefficients and astrophysical factors

    International Nuclear Information System (INIS)

    Mukhamedzhanov, A.M.; Azhari, A.; Clark, H.L.; Gagliardi, C.A.; Lui, Y.-W.; Sattarov, A.; Trache, L.; Tribble, R.E.; Burjan, V.; Kroha, V.; Carstoiu, F.

    2000-01-01

    The S factor for the direct capture reaction 7Be(p,γ)8B can be found at astrophysical energies from the asymptotic normalization coefficients (ANC's) which provide the normalization of the tails of the overlap functions for 8B → 7Be + p. Peripheral transfer reactions offer a technique to determine these ANC's. Using this technique, the 10B(7Be,8B)9Be and 14N(7Be,8B)13C reactions have been used to measure the asymptotic normalization coefficient for 7Be(p,γ)8B. These results provide an indirect determination of S17(0). Analysis of the existing 9Be(p,γ)10B experimental data within the framework of the R-matrix method demonstrates that experimentally measured ANC's can provide a reasonable determination of direct radiative capture rates. (author)
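The relation the method relies on can be stated compactly in standard ANC notation (a textbook summary, not a formula quoted from this paper): the ANC C is defined by the asymptotic tail of the overlap function, and for a peripheral capture the zero-energy S factor scales with its square.

```latex
% Tail of the <8B | 7Be + p> overlap function defines the ANC C:
I_{\ell}(r) \;\xrightarrow{\; r \to \infty \;}\;
  C \, \frac{W_{-\eta,\,\ell+1/2}(2\kappa r)}{r},
% and for a peripheral direct capture the astrophysical factor obeys
S_{17}(0) \;\propto\; C^{2}.
```

Here W is the Whittaker function, η the Coulomb parameter, and κ the bound-state wave number of the 8B → 7Be + p system.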

  19. Quantum arrival times and operator normalization

    International Nuclear Information System (INIS)

    Hegerfeldt, Gerhard C.; Seidel, Dirk; Gonzalo Muga, J.

    2003-01-01

    A recent approach to arrival times used the fluorescence of an atom entering a laser-illuminated region. The resulting arrival-time distribution was close to the axiomatic distribution of Kijowski, but not exactly equal to it, neither in limiting cases nor after compensation of reflection losses by normalization at the level of expectation values. In this paper we employ a normalization at the level of operators, recently proposed in a slightly different context. We show that in this case the axiomatic arrival-time distribution of Kijowski is recovered as a limiting case. In addition, it is shown that Allcock's complex potential model is also a limit of the physically motivated fluorescence approach and is connected to Kijowski's distribution through operator normalization

  20. Helicon normal modes in Proto-MPEX

    Science.gov (United States)

    Piotrowicz, P. A.; Caneses, J. F.; Green, D. L.; Goulding, R. H.; Lau, C.; Caughman, J. B. O.; Rapp, J.; Ruzic, D. N.

    2018-05-01

    The Proto-MPEX helicon source has been operating in a high electron density ‘helicon-mode’. Establishing plasma densities and magnetic field strengths under the antenna that allow the formation of normal modes of the fast wave is believed to be responsible for the ‘helicon-mode’. A 2D finite-element full-wave model of the helicon antenna on Proto-MPEX is used to identify the fast-wave normal modes responsible for the steady-state electron density profile produced by the source. We also show through the simulation that, in the regions of operation in which core power deposition is maximum, the slow wave does not deposit significant power except directly under the antenna. In the case of a simulation where a normal mode is not excited, significant edge power is deposited in the mirror region.

  1. Normalization as a canonical neural computation

    Science.gov (United States)

    Carandini, Matteo; Heeger, David J.

    2012-01-01

    There is increasing evidence that the brain relies on a set of canonical neural computations, repeating them across brain regions and modalities to apply similar operations to different problems. A promising candidate for such a computation is normalization, in which the responses of neurons are divided by a common factor that typically includes the summed activity of a pool of neurons. Normalization was developed to explain responses in the primary visual cortex and is now thought to operate throughout the visual system, and in many other sensory modalities and brain regions. Normalization may underlie operations such as the representation of odours, the modulatory effects of visual attention, the encoding of value and the integration of multisensory information. Its presence in such a diversity of neural systems in multiple species, from invertebrates to mammals, suggests that it serves as a canonical neural computation. PMID:22108672
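The divisive computation described above is simple to state in code. The sketch below is a generic illustration of divisive normalization (the parameter names `gamma`, `sigma`, and `n` are conventional in the modeling literature, not taken from this paper): each unit's driving input is divided by a factor combining a semi-saturation constant and the summed activity of the pool.

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0, gamma=1.0):
    """Canonical divisive normalization (illustrative parameters).

    Each unit's response is its own driving input raised to a power n,
    divided by the semi-saturation constant sigma**n plus the summed
    n-th-power activity of the whole pool.
    """
    drive = np.asarray(drive, dtype=float)
    numerator = gamma * drive ** n
    denominator = sigma ** n + np.sum(drive ** n)
    return numerator / denominator

# Example: three units driven with increasing strength; normalization
# preserves rank order while compressing the overall response.
responses = divisive_normalization([1.0, 2.0, 3.0])
```

One design consequence worth noting: because the denominator contains the pooled activity, scaling all inputs up together changes each response much less than scaling a single input, which is the contrast-invariance property the abstract attributes to this computation.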

  2. Normal stress Sestamibi study: why re-inject?

    International Nuclear Information System (INIS)

    Unger, S.A.; Hughes, T.

    2000-01-01

    Full text: Myocardial perfusion imaging (MPI) is widely used for risk stratification of patients with known or suspected coronary artery disease. A normal MPI study predicts a low annual cardiac event rate, prompting interest in stress-only imaging with 99Tcm-Sestamibi (MIBI), omitting the rest study when the post-stress study is interpreted as normal. The safety of this approach has not been validated, all published reports utilising both rest and stress images to interpret a study as 'normal'. Between 1/1/98 and 30/8/98, 489 patients were referred to our department for stress MPI. Of these, 237 were interpreted as normal on the basis of their post-stress study and did not undergo a rest study. Twelve-month clinical follow-up was available in 184 (78%) of these patients, representing the study group (82 males, 102 females; mean age 61±12 years). Of these, 156 were referred for assessment of chest pain, three for dyspnoea, six for abnormal ECGs, and 19 for pre-operative evaluation. At one year of follow-up, there were no myocardial infarcts, no admissions for unstable angina, and no cardiac deaths. Three patients died of non-cardiac causes. Seven patients underwent coronary angiography: five were normal, one had a single 50% stenosis, and one had an 80% vein graft stenosis which was subsequently angioplastied. In conclusion, a normal stress MIBI image predicts an excellent prognosis and negates the need for a rest reinjection study, thus reducing patient camera time and radiation exposure, improving departmental throughput, and minimising public health expenditure. Copyright (2000) The Australian and New Zealand Society of Nuclear Medicine Inc

  3. Normal modes of weak colloidal gels

    Science.gov (United States)

    Varga, Zsigmond; Swan, James W.

    2018-01-01

    The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate for each normal mode, λ, that scales as l^-2, where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l^-γ, where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer
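The link between mode wavelength and relaxation rate can be illustrated with a minimal normal-mode calculation. The sketch below is a toy stand-in for the harmonic networks in the abstract (a free-ended 1D bead-spring chain in the freely draining limit, with unit spring constants and drag; all parameters are illustrative, not from the paper): relaxation rates are the eigenvalues of the connectivity Laplacian, and the long-wavelength modes relax slowest.

```python
import numpy as np

# Free-ended 1D harmonic bead-spring chain: the overdamped relaxation
# rates are eigenvalues of the graph Laplacian of the spring network.
N = 50
lap = 2 * np.eye(N) - np.eye(N, k=1) - np.eye(N, k=-1)
lap[0, 0] = lap[-1, -1] = 1.0   # free boundary conditions

rates = np.sort(np.linalg.eigvalsh(lap))
# Analytic spectrum: rate_k = 4 sin^2(k*pi/(2N)), k = 0..N-1.
# Small k means long wavelength, so rate ~ (k*pi/N)^2: the quadratic
# small-rate scaling analogous to the l^-2 behavior discussed above.
```

The zero eigenvalue corresponds to uniform translation of the whole chain; the slowest internal mode scales quadratically with inverse wavelength, which is the analogue of the l^-2 scaling for the hydrodynamically coupled models.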

  4. Metabolomics data normalization with EigenMS.

    Directory of Open Access Journals (Sweden)

    Yuliya V Karpievitch

    Full Text Available Liquid chromatography mass spectrometry has become one of the analytical platforms of choice for metabolomics studies. However, LC-MS metabolomics data can suffer from the effects of various systematic biases. These include batch effects, day-to-day variations in instrument performance, signal intensity loss due to time-dependent effects of the LC column performance, accumulation of contaminants in the MS ion source and MS sensitivity among others. In this study we aimed to test a singular value decomposition-based method, called EigenMS, for normalization of metabolomics data. We analyzed a clinical human dataset where LC-MS serum metabolomics data and physiological measurements were collected from thirty-nine healthy subjects and forty with type 2 diabetes and applied EigenMS to detect and correct for any systematic bias. EigenMS works in several stages. First, EigenMS preserves the treatment group differences in the metabolomics data by estimating treatment effects with an ANOVA model (multiple fixed effects can be estimated). Singular value decomposition of the residuals matrix is then used to determine bias trends in the data. The number of bias trends is then estimated via a permutation test and the effects of the bias trends are eliminated. EigenMS removed bias of unknown complexity from the LC-MS metabolomics data, allowing for increased sensitivity in differential analysis. Moreover, normalized samples better correlated with both other normalized samples and corresponding physiological data, such as blood glucose level, glycated haemoglobin, exercise central augmentation pressure normalized to heart rate of 75, and total cholesterol. We were able to report 2578 discriminatory metabolite peaks in the normalized data (p<0.05 as compared to only 1840 metabolite signals in the raw data. Our results support the use of singular value decomposition-based normalization for metabolomics data.
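The staged procedure described above can be sketched in a few lines. This is a toy illustration of the idea on synthetic data, not the published EigenMS implementation: group means are removed first so treatment differences are preserved, the residual matrix is decomposed by SVD to expose a systematic bias trend, and that trend is subtracted from the data.

```python
import numpy as np

rng = np.random.default_rng(0)
groups = np.array([0] * 5 + [1] * 5)          # two treatment groups

# Synthetic 20-feature x 10-sample data: per-feature baselines, a unit
# treatment effect in group 1, one smooth bias trend across samples.
signal = np.outer(rng.normal(size=20), np.ones(10))
signal[:, groups == 1] += 1.0
bias = np.outer(rng.normal(size=20), np.linspace(-1, 1, 10))
data = signal + bias + 0.01 * rng.normal(size=(20, 10))

# 1) remove group means so treatment differences are preserved
resid = data.copy()
for g in (0, 1):
    resid[:, groups == g] -= resid[:, groups == g].mean(axis=1, keepdims=True)

# 2) SVD of the residual matrix; the leading component captures the
#    dominant systematic bias trend across samples
U, s, Vt = np.linalg.svd(resid, full_matrices=False)

# 3) subtract the estimated rank-1 bias trend from the data
normalized = data - s[0] * np.outer(U[:, 0], Vt[0])
```

In the real method the number of trends to remove is chosen by a permutation test rather than fixed at one, but the preserve-effects / decompose-residuals / subtract-trends structure is the same.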

  5. ERP inside Large Organizations

    Directory of Open Access Journals (Sweden)

    Constantin Daniel AVRAM

    2010-01-01

    Full Text Available Many large companies in Romania still function without an ERP system. Instead they use traditional application systems built around the strong boundaries of specific functions: finance, selling, HR, production. An ERP offers many advantages, among them the integration of functionalities and support for top management decisions. Although the total cost of ownership is not small and there are risks when implementing an ERP inside large and very large organizations, having such a system is mandatory. Choosing the right product and vendor and using a correct risk management strategy will ensure a successful implementation.

  6. Computing Instantaneous Frequency by normalizing Hilbert Transform

    Science.gov (United States)

    Huang, Norden E.

    2005-05-31

    This invention presents the Normalized Amplitude Hilbert Transform (NAHT) and Normalized Hilbert Transform (NHT), both of which are new methods for computing Instantaneous Frequency. The method is designed specifically to circumvent the limitations set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert Transform do not agree. The motivation for this method is that straightforward application of the Hilbert Transform, followed by taking the derivative of the phase angle as the Instantaneous Frequency (IF), leads to a common mistake that persists to this date. For the Hilbert Transform method to work, the data must obey certain restrictions.
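As a rough illustration of the idea, and not the patented NAHT/NHT algorithm itself (which normalizes the envelope via spline fits to the extrema), one can divide out the analytic-signal envelope before extracting the phase derivative. Function and parameter names here are illustrative.

```python
import numpy as np
from scipy.signal import hilbert

def instantaneous_frequency(x, fs):
    """Estimate instantaneous frequency (Hz) from a real signal.

    Sketch of amplitude normalization: divide the signal by its
    analytic-signal envelope so it is closer to the unit-amplitude
    form assumed by the Bedrosian theorem, then take the phase
    derivative of the analytic signal of the normalized data.
    """
    envelope = np.abs(hilbert(x))
    normalized = x / envelope                 # crude amplitude normalization
    phase = np.unwrap(np.angle(hilbert(normalized)))
    return np.diff(phase) / (2 * np.pi) * fs  # length len(x) - 1

# Usage: a pure 50 Hz tone should give IF ~ 50 Hz away from the edges.
fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = np.sin(2 * np.pi * 50 * t)
f_inst = instantaneous_frequency(x, fs)
```

Edge samples of `f_inst` are unreliable because the discrete Hilbert transform has end effects; the interior values are the meaningful ones.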

  7. Normal anatomy of lung perfusion SPECT scintigraphy

    International Nuclear Information System (INIS)

    Moskowitz, G.W.; Levy, L.M.

    1987-01-01

    Ten patients studied for possible pulmonary embolic disease had normal lung perfusion planar and SPECT scintigraphy. A computer program was developed to superimpose the CT scans on corresponding SPECT images. Superimposition of CT scans on corresponding SPECT transaxial cross-sectional images, when available, provides the needed definition and relationships of adjacent organs. SPECT transaxial sections provide clear anatomic definition of perfusion defects without foreground and background lung tissue superimposed. The location, shape, and size of the perfusion defects can be readily assessed by SPECT. An algorithm was developed for the differentiation of abnormal pulmonary perfusion patterns from normal structures and normal variation

  8. Anatomy, normal variants, and basic biomechanics

    International Nuclear Information System (INIS)

    Berquist, T.H.; Johnson, K.A.

    1989-01-01

    This paper reports on the anatomy and basic functions of the foot and ankle important to physicians involved in imaging procedures, clinical medicine, and surgery. New radiographic techniques, especially magnetic resonance imaging, provide more diagnostic information owing to improved tissue contrast and the ability to obtain multiple image planes (axial, sagittal, coronal, oblique). Therefore, a thorough knowledge of skeletal and soft tissue anatomy is even more essential. Normal variants must also be understood in order to distinguish normal from pathologic changes in the foot and ankle. A basic understanding of biomechanics is also essential for selecting the proper diagnostic techniques

  9. Dlk1 in normal and abnormal hematopoiesis

    DEFF Research Database (Denmark)

    Sakajiri, S; O'kelly, J; Yin, D

    2005-01-01

    normals. Also, Dlk1 mRNA was elevated in mononuclear, low density bone marrow cells from 11/38 MDS patients, 5/11 AML M6 and 2/4 AML M7 samples. Furthermore, 5/6 erythroleukemia and 2/2 megakaryocytic leukemia cell lines highly expressed Dlk1 mRNA. Levels of Dlk1 mRNA markedly increased during...... (particularly M6, M7), and it appears to be associated with normal development of megakaryocytes and B cells....

  10. Statistical Theory of Normal Grain Growth Revisited

    International Nuclear Information System (INIS)

    Gadomski, A.; Luczka, J.

    2002-01-01

    In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: Infinite vs finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to superplasticity of materials; approach to log-normality, an ubiquitous natural phenomenon, frequently reported in literature. It turns out that all three important points mentioned are possible to be included in a Mulheran-Harding type behavior of evolving grains-containing systems that we have studied previously. (author)

  11. Radiogenomics: predicting clinical normal tissue radiosensitivity

    DEFF Research Database (Denmark)

    Alsner, Jan

    2006-01-01

    Studies on the genetic basis of normal tissue radiosensitivity, or 'radiogenomics', aim at predicting clinical radiosensitivity and optimizing treatment from individual genetic profiles. Several studies have now reported links between variations in certain genes related to the biological response...... to radiation injury and risk of normal tissue morbidity in cancer patients treated with radiotherapy. However, after these initial association studies including few genes, we are still far from being able to predict clinical radiosensitivity on an individual level. Recent data from our own studies on risk...

  12. Foreshock occurrence before large earthquakes

    Science.gov (United States)

    Reasenberg, P.A.

    1999-01-01

    Rates of foreshock occurrence involving shallow M ≥ 6 and M ≥ 7 mainshocks and M ≥ 5 foreshocks were measured in two worldwide catalogs over ~20-year intervals. The overall rates observed are similar to ones measured in previous worldwide and regional studies when they are normalized for the ranges of magnitude difference they each span. The observed worldwide rates were compared to a generic model of earthquake clustering based on patterns of small and moderate aftershocks in California. The aftershock model was extended to the case of moderate foreshocks preceding large mainshocks. Overall, the observed worldwide foreshock rates exceed the extended California generic model by a factor of ~2. Significant differences in foreshock rate were found among subsets of earthquakes defined by their focal mechanism and tectonic region, with the rate before thrust events higher and the rate before strike-slip events lower than the worldwide average. Among the thrust events, a large majority, composed of events located in shallow subduction zones, had a high foreshock rate, while a minority, located in continental thrust belts, had a low rate. These differences may explain why previous surveys have found low foreshock rates among thrust events in California (especially southern California), while the worldwide observations suggest the opposite: California, lacking an active subduction zone in most of its territory, and including a region of mountain-building thrusts in the south, reflects the low rate apparently typical for continental thrusts, while the worldwide observations, dominated by shallow subduction zone events, are foreshock-rich. If this is so, then the California generic model may significantly underestimate the conditional probability for a very large (M ≥ 8) earthquake following a potential (M ≥ 7) foreshock in Cascadia.
    The magnitude differences among the identified foreshock-mainshock pairs in the Harvard catalog are consistent with a uniform

  13. IRAS far-infrared colours of normal stars

    Science.gov (United States)

    Waters, L. B. F. M.; Cote, J.; Aumann, H. H.

    1987-01-01

    The analysis of IRAS observations at 12, 25, 60 and 100 microns of bright stars of spectral type O to M is presented. The objective is to identify the 'normal' stellar population and to characterize it in terms of the relationships between (B-V) and (V-[12]), between (R-I) and (V-[12]), and as a function of spectral type and luminosity class. A well-defined relation is found between the colors of normal stars in the visual, (B-V) and (R-I), and in the IR, which does not depend on luminosity class. Using the (B-V), (V-[12]) relation for normal stars, it is found that B and M type stars show a large fraction of deviating stars, mostly with IR excess that is probably caused by circumstellar material. A comparison of IRAS colors with the Johnson colors as a function of spectral type shows good agreement except for the K0 to M5 type stars. The results will be useful in identifying the deviating stars detected with IRAS.

  14. Fungal invasion of normally non-phagocytic host cells.

    Directory of Open Access Journals (Sweden)

    Scott G Filler

    2006-12-01

    Full Text Available Many fungi that cause invasive disease invade host epithelial cells during mucosal and respiratory infection, and subsequently invade endothelial cells during hematogenous infection. Most fungi invade these normally non-phagocytic host cells by inducing their own uptake. Candida albicans hyphae interact with endothelial cells in vitro by binding to N-cadherin on the endothelial cell surface. This binding induces rearrangement of endothelial cell microfilaments, which results in the endocytosis of the organism. The capsule of Cryptococcus neoformans is composed of glucuronoxylomannan, which binds specifically to brain endothelial cells, and appears to mediate both adherence and induction of endocytosis. The mechanisms by which other fungal pathogens induce their own uptake are largely unknown. Some angioinvasive fungi, such as Aspergillus species and the Zygomycetes, invade endothelial cells from the abluminal surface during the initiation of invasive disease, and subsequently invade the luminal surface of endothelial cells during hematogenous dissemination. Invasion of normally non-phagocytic host cells has different consequences, depending on the type of invading fungus. Aspergillus fumigatus blocks apoptosis of pulmonary epithelial cells, whereas Paracoccidioides brasiliensis induces apoptosis of epithelial cells. This review summarizes the mechanisms by which diverse fungal pathogens invade normally non-phagocytic host cells and discusses gaps in our knowledge that provide opportunities for future research.

  15. The large cylindrical drift chamber of TASSO

    International Nuclear Information System (INIS)

    Boerner, H.; Fischer, H.M.; Hartmann, H.; Loehr, B.; Wollstadt, M.; Fohrmann, R.; Schmueser, P.; Cassel, D.G.; Koetz, U.; Kowalski, H.

    1980-03-01

    We have built and operated a large cylindrical drift chamber for the TASSO experiment at the DESY storage ring, PETRA. The chamber has a length of 3.5 m, a diameter of 2.5 m, and a total of 2340 drift cells. The cells are arranged in 15 concentric layers such that tracks can be reconstructed in three dimensions. A spatial resolution of 220 μm has been achieved for tracks at normal incidence on the drift cells. (orig.)

  16. How Normal is Our Solar System?

    Science.gov (United States)

    Kohler, Susanna

    2015-10-01

    To date, we've discovered nearly 2000 confirmed exoplanets, as well as thousands of additional candidates. Amidst this vast sea of solar systems, how special is our own? A new study explores the answer to this question. Analyzing distributions: knowing whether our solar system is unique among exoplanetary systems can help us to better understand future observations of exoplanets. Furthermore, if our solar system is typical, this allows us to be optimistic about the possibility of life existing elsewhere in the universe. In a recent study, Rebecca Martin (University of Nevada, Las Vegas) and Mario Livio (Space Telescope Science Institute) examine how normal our solar system is by comparing the properties of our planets to the averages obtained from known exoplanets. Comparing properties: so how do we measure up? [Figure: densities of planets as a function of their mass. Exoplanets (N=287) are shown in blue, planets in our solar system are shown in red. Martin & Livio 2015] Planet masses and densities: those of the gas giants in our solar system are pretty typical. The terrestrial planets are on the low side for mass, but that's probably a selection effect: it's very difficult to detect low-mass planets. Age of the solar system: roughly half the stars in the disk of our galaxy are younger than the Sun, and half are older. We're definitely not special in age. Orbital locations of the planets: this is actually a little strange; our solar system is lacking close-in planets. All of our planets, in fact, orbit at a distance that is larger than the mean distance observed in exoplanetary systems. Again, however, this might be a selection effect at work: it's easier to detect large planets orbiting very close to their stars. Eccentricities of the planets' orbits: our planets are on very circular orbits, and that actually makes us somewhat special too, compared to typical exoplanet systems. There is a possible explanation though: eccentricity of orbits tends to decrease with more planets in the system.
Because

  17. Large Pelagics Telephone Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Telephone Survey (LPTS) collects fishing effort information directly from captains holding Highly Migratory Species (HMS) permits (required by...

  18. Large Customers (DR Sellers)

    Energy Technology Data Exchange (ETDEWEB)

    Kiliccot, Sila [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2011-10-25

    State of the large customers for demand response; integration of solar and wind into electric grid; openADR; CAISO; DR as a pseudo generation; commercial and industrial DR strategies; California regulations

  19. Large Pelagics Biological Survey

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Large Pelagics Biological Survey (LPBS) collects additional length and weight information and body parts such as otoliths, caudal vertebrae, dorsal spines, and...

  20. Large Rotor Test Apparatus

    Data.gov (United States)

    Federal Laboratory Consortium — This test apparatus, when combined with the National Full-Scale Aerodynamics Complex, produces a thorough, full-scale test capability. The Large Rotor Test Apparatus...

  1. INTESTINAL VIROME AND NORMAL MICROFLORA OF HUMAN: FEATURES OF INTERACTION

    Directory of Open Access Journals (Sweden)

    Bobyr V.V.

    2015-05-01

    Full Text Available Summary: Intestinal bacteria defend the host organism and limit colonization by pathogenic bacteria. However, the effect of the microbiome on enteric viruses, and the role of the microbiota in the pathogenesis of viral infections in general, remain largely unexplored. This review focuses on precisely these issues. Keywords: microbiome, virome, normal microflora, enteric viruses, contagiousness. In this review article, facts about viral persistence in the human gut are summarized, and the role of viral populations in health and disease is described. Analysis of the literature leads to the conclusion that the gastrointestinal tract is home to one of the most complex microbial ecosystems, which requires deeper study of its composition, its role in physiological processes, and the dynamics of its changes under the influence of the environment. The normal microflora performs various important functions that maintain the physiological homeostasis of the human body, including, in particular, an important role in human metabolic processes, support of homeostasis, and limitation of colonization by infectious bacteria. The multifactorial significance of the normal gastrointestinal microflora can be divided into immunological, structural and metabolic functions. At the same time, the interaction between the intestinal microflora and enteric viruses has been studied very little. In recent years, much attention has been paid to the study of virus-bacteria associations, and the results obtained may well change our understanding of the microbiota's role in the pathogenesis of diseases of viral etiology. In contrast to the well-known benefits of the normal microflora to the host, viruses can use the intestinal microflora as a trigger for replication at the optimal site. Recent studies give reason to assume that depletion of the normal microflora with antibiotics can have an antiviral effect. Thus, the role of commensal bacteria in viral

  2. Large Retailers’ Financial Services

    OpenAIRE

    Risso, Mario

    2010-01-01

    Over the last few years, large retailers offering financial services have grown considerably in the financial services sector. Retailers are increasing the breadth and complexity of their offer of financial services. Large retail companies provide financial services to their customers following different strategies. The provision of financial services in the retailers' offer is implemented in several different ways related to the strategies, the structures and the degree of financial know...

  3. Large momentum transfer phenomena

    International Nuclear Information System (INIS)

    Imachi, Masahiro; Otsuki, Shoichiro; Matsuoka, Takeo; Sawada, Shoji.

    1978-01-01

    Large momentum transfer phenomena in hadron reactions differ drastically from small momentum transfer phenomena, and are described in this paper. A brief review of the features of large transverse momentum transfer reactions is given in relation to two-body reactions, single-particle production, particle ratios, two-jet structure, two-particle correlations, jet production cross sections, and the components of momentum perpendicular to the plane defined by the incident protons and the triggered pions and transverse to the jet axis. In the case of two-body processes, the exponent N of the power law of the differential cross section lies between 10 and 11.5 in the large momentum transfer region. Breaks from exponential to power-law behavior are observed in the large momentum transfer region; the break enables an estimate of the order of a critical length. Large momentum transfer phenomena strongly suggest an important role for the constituents of hadrons in the hard region. Hard rearrangement of constituents from different initial hadrons induces large momentum transfer reactions. Several rules for counting constituents in the hard region have been proposed to explain the power behavior. Scale-invariant quark interactions and hard reactions are explained, and a summary of the possible types of hard subprocess is presented. (Kato, T.)
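The power-law behavior and constituent-counting rules mentioned here are conventionally summarized as follows (a standard result of the hard-scattering literature, not a formula quoted from this paper): for an exclusive two-body process at fixed center-of-mass angle,

```latex
% Constituent (dimensional) counting rule for a + b -> c + d
% at fixed center-of-mass scattering angle:
\frac{d\sigma}{dt}\,(ab \to cd) \;\sim\; s^{\,2-n}\, f(\theta_{\mathrm{cm}}),
% where n is the total number of elementary constituents in
% a, b, c and d; e.g. pp -> pp has n = 12, so d\sigma/dt ~ s^{-10}.
```

The pp → pp prediction of s^-10 is consistent with the exponent N of 10 to 11.5 quoted in the abstract.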

  4. Automatic Radiometric Normalization of Multitemporal Satellite Imagery

    DEFF Research Database (Denmark)

    Canty, Morton J.; Nielsen, Allan Aasbjerg; Schmidt, Michael

    2004-01-01

    with normalization using orthogonal regression. The procedure is applied to Landsat TM images over Nevada, Landsat ETM+ images over Morocco, and SPOT HRV images over Kenya. Results from this new automatic, combined MAD/orthogonal regression method, based on statistical analysis of test pixels not used in the actual...
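The orthogonal-regression normalization step mentioned in this record can be illustrated with a toy total-least-squares fit on synthetic radiances (the MAD-based selection of invariant test pixels is omitted; all numbers here are illustrative):

```python
import numpy as np

# Synthetic "invariant pixel" radiances: the target image differs from
# the reference by a gain of 0.8, an offset of 5, and a little noise.
rng = np.random.default_rng(1)
ref = rng.uniform(10.0, 200.0, 500)               # reference-image radiances
target = 0.8 * ref + 5.0 + rng.normal(0.0, 1.0, 500)

# Orthogonal (total least squares) regression: the principal axis of
# the centered (target, ref) point cloud gives the normalizing slope.
x = target - target.mean()
y = ref - ref.mean()
cov = np.cov(x, y)
eigvals, eigvecs = np.linalg.eigh(cov)
v = eigvecs[:, np.argmax(eigvals)]                # principal-axis direction
slope = v[1] / v[0]
intercept = ref.mean() - slope * target.mean()

# Map the target image onto the reference radiometric scale.
normalized = slope * target + intercept
```

Unlike ordinary least squares, the orthogonal fit treats both images as noisy, which is why it is preferred for relative radiometric normalization.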

  5. Post-Normal science in practice

    NARCIS (Netherlands)

    Dankel, Dorothy J.; Vaage, Nora S.; van der Sluijs, Jeroen P.

    This special issue contains a selection of papers presented during the 2014 Bergen meeting, complemented with short perspectives by young PNS-inspired scholars, presented at a mini-symposium "Post-normal times? New thinking about science and policy advice" held on 21 October 2016 in celebration of

  6. Physical Development: What's Normal? What's Not?

    Science.gov (United States)

    ... Physical Development: What's Normal? What's Not? ... growth. The timing and speed of a child's physical development can vary a lot, because it is ...

  7. Visual attention and flexible normalization pools

    Science.gov (United States)

    Schwartz, Odelia; Coen-Cagli, Ruben

    2013-01-01

    Attention to a spatial location or feature in a visual scene can modulate the responses of cortical neurons and affect perceptual biases in illusions. We add attention to a cortical model of spatial context based on a well-founded account of natural scene statistics. The cortical model amounts to a generalized form of divisive normalization, in which the surround is in the normalization pool of the center target only if they are considered statistically dependent. Here we propose that attention influences this computation by accentuating the neural unit activations at the attended location, and that the amount of attentional influence of the surround on the center thus depends on whether center and surround are deemed in the same normalization pool. The resulting form of model extends a recent divisive normalization model of attention (Reynolds & Heeger, 2009). We simulate cortical surround orientation experiments with attention and show that the flexible model is suitable for capturing additional data and makes nontrivial testable predictions. PMID:23345413

  8. Achondroplasia in sibs of normal parents.

    Science.gov (United States)

    Philip, N; Auger, M; Mattei, J F; Giraud, F

    1988-01-01

    A new case of recurrent achondroplasia in sibs of normal parents is reported. Two sisters and a half sister were affected. Various mechanisms can be postulated to account for unexpected recurrence of achondroplasia in the same sibship. Germinal mosaicism and unstable premutation are discussed here. PMID:3236371

  9. Achondroplasia in sibs of normal parents.

    OpenAIRE

    Philip, N; Auger, M; Mattei, J F; Giraud, F

    1988-01-01

    A new case of recurrent achondroplasia in sibs of normal parents is reported. Two sisters and a half sister were affected. Various mechanisms can be postulated to account for unexpected recurrence of achondroplasia in the same sibship. Germinal mosaicism and unstable premutation are discussed here.

  10. Endoscopic third ventriculostomy in idiopathic normal pressure ...

    African Journals Online (AJOL)

    Mohammed Ahmed Eshra

    2013-12-22

    Dec 22, 2013 ... system of the brain causing ventricular enlargement. This is followed by gradual .... sion, not to decrease the pressure (which is already normal).8–15 ... So ETV must be performed in patients with clinical evolution of not more.

  11. Principal normal indicatrices of closed space curves

    DEFF Research Database (Denmark)

    Røgen, Peter

    1999-01-01

    A theorem due to J. Weiner, which is also proven by B. Solomon, implies that a principal normal indicatrix of a closed space curve with nonvanishing curvature has integrated geodesic curvature zero and contains no subarc with integrated geodesic curvature pi. We prove that the inverse problem alw...

  12. Normal tension glaucoma and Alzheimer disease

    DEFF Research Database (Denmark)

    Bach-Holm, Daniella; Kessing, Svend Vedel; Mogensen, Ulla Brasch

    2012-01-01

    Purpose: To investigate whether normal tension glaucoma (NTG) is associated with increased risk of developing dementia/Alzheimer disease (AD). Methods: A total of 69 patients with NTG were identified in the case note files in the Glaucoma Clinic, University Hospital of Copenhagen (Rigshospitalet...

  13. Normal stresses in semiflexible polymer hydrogels

    Science.gov (United States)

    Vahabi, M.; Vos, Bart E.; de Cagny, Henri C. G.; Bonn, Daniel; Koenderink, Gijsje H.; MacKintosh, F. C.

    2018-03-01

    Biopolymer gels such as fibrin and collagen networks are known to develop tensile axial stress when subject to torsion. This negative normal stress is opposite to the classical Poynting effect observed for most elastic solids including synthetic polymer gels, where torsion provokes a positive normal stress. As shown recently, this anomalous behavior in fibrin gels depends on the open, porous network structure of biopolymer gels, which facilitates interstitial fluid flow during shear and can be described by a phenomenological two-fluid model with viscous coupling between network and solvent. Here we extend this model and develop a microscopic model for the individual diagonal components of the stress tensor that determine the axial response of semiflexible polymer hydrogels. This microscopic model predicts that the magnitude of these stress components depends inversely on the characteristic strain for the onset of nonlinear shear stress, which we confirm experimentally by shear rheometry on fibrin gels. Moreover, our model predicts a transient behavior of the normal stress, which is in excellent agreement with the full time-dependent normal stress we measure.

  14. Comparative ultrasound measurement of normal thyroid gland ...

    African Journals Online (AJOL)

    2011-08-31

    Aug 31, 2011 ... the normal thyroid gland has a homogenous increased medium level echo texture. The childhood thyroid gland dimension correlates linearly with age and body surface unlike adults. [14] Iodothyronine (T3) and thyroxine (T4) are thyroid hormones which function to control the basal metabolic rate (BMR).

  15. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

    AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  16. Normal sleep and its neurophysiological regulation

    NARCIS (Netherlands)

    Hofman, W.F.; Talamini, L.M.; Watson, R.R.

    2015-01-01

    Normal sleep consists of two states: NREM (light and deep sleep) and REM, alternating in a cyclical pattern. The sleep/wake rhythm is regulated by two processes: the sleep propensity, building up during wake, and the circadian rhythm, imposed by the suprachiasmatic nucleus. The arousal pathways in

  17. Named entity normalization in user generated content

    NARCIS (Netherlands)

    Jijkoun, V.; Khalid, M.A.; Marx, M.; de Rijke, M.

    2008-01-01

    Named entity recognition is important for semantically oriented retrieval tasks, such as question answering, entity retrieval, biomedical retrieval, trend detection, and event and entity tracking. In many of these tasks it is important to be able to accurately normalize the recognized entities,

  18. Morphological evaluation of normal human corneal epithelium

    DEFF Research Database (Denmark)

    Ehlers, Niels; Heegaard, Steffen; Hjortdal, Jesper

    2010-01-01

    of corneas from 100 consecutively selected paraffin-embedded eyes were stained with hematoxylin-eosin and Periodic Acid-Schiff (PAS). All specimens were evaluated by light microscopy. The eyes were enucleated from patients with choroidal melanoma. Corneas were considered to be normal. RESULTS: Ninety of 100...

  19. Dissociative Functions in the Normal Mourning Process.

    Science.gov (United States)

    Kauffman, Jeffrey

    1994-01-01

    Sees dissociative functions in mourning process as occurring in conjunction with integrative trends. Considers initial shock reaction in mourning as model of normal dissociation in mourning process. Dissociation is understood to be related to traumatic significance of death in human consciousness. Discerns four psychological categories of…

  20. Hemoglobin levels in normal Filipino pregnant women.

    Science.gov (United States)

    Kuizon, M D; Natera, M G; Ancheta, L P; Platon, T P; Reyes, G D; Macapinlac, M P

    1981-09-01

    The hemoglobin concentrations during pregnancy in Filipinos belonging to the upper income group, who were prescribed 105 mg elemental iron daily, and who had acceptable levels of transferrin saturation, were examined in an attempt to define normal levels. The hemoglobin concentrations for each trimester followed a Gaussian distribution. The hemoglobin values equal to the mean minus one standard deviation were 11.4 gm/dl for the first trimester and 10.4 gm/dl for the second and third trimesters. Using these values as the lower limits of normal, in one group of pregnant women the prevalence of anemia during the last two trimesters was found lower than that obtained when WHO levels for normal were used. Groups of women with hemoglobin of 10.4 to 10.9 gm/dl (classified anemic by WHO criteria but normal in the present study) and those with 11.0 gm/dl and above could not be distinguished on the basis of their serum ferritin levels nor on the degree of decrease in their hemoglobin concentration during pregnancy. Many subjects in both groups, however, had serum ferritin levels less than 12 ng/ml which indicate poor iron stores. It might be desirable in future studies to determine the hemoglobin cut-off point that will delineate subjects who are both non-anemic and adequate in iron stores using serum ferritin levels as criterion for the latter.
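
The cutoff rule used in the study (lower limit of normal = mean minus one standard deviation, assuming a Gaussian distribution) is simple to express; the hemoglobin values below are hypothetical, not the study's data:

```python
import numpy as np

def lower_limit_of_normal(values):
    """Lower limit of normal as mean minus one standard deviation,
    assuming the values follow a Gaussian distribution (the rule used
    in the study; the sample values here are made up)."""
    x = np.asarray(values, float)
    return x.mean() - x.std(ddof=1)

# hypothetical hemoglobin values in gm/dl
print(lower_limit_of_normal([11.0, 12.0, 13.0]))  # 11.0
```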

  1. Normalized compression distance of multisets with applications

    NARCIS (Netherlands)

    Cohen, A.R.; Vitányi, P.M.B.

    Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free, similarity metric based on compression. We propose an NCD of multisets that is also metric. Previously, attempts to obtain such an NCD failed. For classification purposes it is superior to the pairwise
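
For readers unfamiliar with NCD, the standard pairwise form can be sketched with an off-the-shelf compressor (zlib here; any real compressor only approximates the ideal complexity-based distance, and the multiset extension proposed in the paper is more involved than this):

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Pairwise normalized compression distance,
    NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y)),
    with zlib standing in for the compressor C."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

a = b"the quick brown fox jumps over the lazy dog" * 20
b_ = b"the quick brown fox jumps over the lazy dog" * 20
c = bytes(range(256)) * 5
print(ncd(a, b_))  # small: identical strings compress well together
print(ncd(a, c))   # larger: unrelated data share little structure
```

Real compressors are imperfect, so the value for identical inputs is small but not exactly zero, and values can slightly exceed 1.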

  2. Limiting Normal Operator in Quasiconvex Analysis

    Czech Academy of Sciences Publication Activity Database

    Aussel, D.; Pištěk, Miroslav

    2015-01-01

    Vol. 23, No. 4 (2015), pp. 669-685. ISSN 1877-0533. R&D Projects: GA ČR GA15-00735S. Institutional support: RVO:67985556. Keywords: Quasiconvex function * Sublevel set * Normal operator. Subject RIV: BA - General Mathematics. Impact factor: 0.973, year: 2015. http://library.utia.cas.cz/separaty/2015/MTR/pistek-0453552.pdf

  3. The normal bacterial flora prevents GI disease

    Indian Academy of Sciences (India)

    First page Back Continue Last page Overview Graphics. The normal bacterial flora prevents GI disease. Inhibits pathogenic enteric bacteria. Decrease luminal pH; Secrete bacteriocidal proteins; Colonization resistance; Block epithelial binding – induce MUC2. Improves epithelial and mucosal barrier integrity. Produce ...

  4. Perturbations of normally solvable nonlinear operators, I

    Directory of Open Access Journals (Sweden)

    William O. Ray

    1985-01-01

    Full Text Available Let X and Y be Banach spaces and let ℱ and be Gateaux differentiable mappings from X to Y. In this note we study when the operator ℱ+ is surjective for sufficiently small perturbations of a surjective operator ℱ. The methods extend previous results in the area of normal solvability for nonlinear operators.

  5. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts. Copyright © 2015 Elsevier B.V. All rights reserved.
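
As a minimal illustration of one common strategy in the family the review surveys, total-signal ("sum") normalization scales each sample by its total measured intensity to remove differences in total sample amount; the data below are made up:

```python
import numpy as np

def normalize_to_total(intensities):
    """Total-signal ('sum') normalization: scale each sample (row) so its
    metabolite intensities sum to 1, removing differences in total sample
    amount. A generic example, not a specific method from the review."""
    x = np.asarray(intensities, float)
    return x / x.sum(axis=1, keepdims=True)

samples = np.array([[100., 300., 600.],
                    [ 50., 150., 300.]])  # same profile, half the total amount
norm = normalize_to_total(samples)
print(np.allclose(norm[0], norm[1]))  # True: the amount difference is removed
```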

  6. Refixation saccades with normal gain values

    DEFF Research Database (Denmark)

    Korsager, Leise Elisabeth Hviid; Faber, Christian Emil; Schmidt, Jesper Hvass

    2017-01-01

    -ocular reflex. However, this partial deficit is in conflict with the current way of interpreting vHIT results in which the vestibular function is classified as either normal or pathological based only on the gain value. Refixation saccades, which are evident signs of vestibulopathy, are not considered...

  7. Hypervascular liver lesions in radiologically normal liver

    Energy Technology Data Exchange (ETDEWEB)

    Amico, Enio Campos; Alves, Jose Roberto; Souza, Dyego Leandro Bezerra de; Salviano, Fellipe Alexandre Macena; Joao, Samir Assi; Liguori, Adriano de Araujo Lima, E-mail: ecamic@uol.com.br [Hospital Universitario Onofre Lopes (HUOL/UFRN), Natal, RN (Brazil). Clinica Gastrocentro e Ambulatorios de Cirurgia do Aparelho Digestivo e de Cirurgia Hepatobiliopancreatica

    2017-09-01

    Background: The hypervascular liver lesions represent a diagnostic challenge. Aim: To identify risk factors for cancer in patients with non-hemangiomatous hypervascular hepatic lesions in radiologically normal liver. Method: This prospective study included patients with hypervascular liver lesions in radiologically normal liver. The diagnosis was made by biopsy or was presumed on the basis of radiologic stability over a follow-up period of one year. Patients with cirrhosis or with typical imaging characteristics of haemangioma were excluded. Results: Eighty-eight patients were included. The average age was 42.4 years. The lesions were single and between 2-5 cm in size in most cases. Liver biopsy was performed in approximately 1/3 of cases. The lesions were benign or most likely benign in 81.8%, while cancer was diagnosed in 12.5% of cases. Univariate analysis showed that age >45 years (p<0.001), personal history of cancer (p=0.020), presence of >3 nodules (p=0.003) and elevated alkaline phosphatase (p=0.013) were significant risk factors for cancer. Conclusion: It is safe to observe hypervascular liver lesions in normal liver in patients up to 45 years of age with normal alanine aminotransferase, up to three nodules and no personal history of cancer. Lesion biopsies are safe in patients with atypical lesions and define the treatment to be established for most of these patients. (author)

  8. KERNEL MAD ALGORITHM FOR RELATIVE RADIOMETRIC NORMALIZATION

    Directory of Open Access Journals (Sweden)

    Y. Bai

    2016-06-01

    Full Text Available The multivariate alteration detection (MAD) algorithm is commonly used in relative radiometric normalization. This algorithm is based on linear canonical correlation analysis (CCA), which can analyze only linear relationships among bands. Therefore, we first introduce a new version of MAD in this study based on the established method known as kernel canonical correlation analysis (KCCA). The proposed method effectively extracts the non-linear and complex relationships among variables. We then conduct relative radiometric normalization experiments with both the linear CCA and KCCA versions of the MAD algorithm, using Landsat-8 data of Beijing, China, and Gaofen-1 (GF-1) data derived from South China. Finally, we analyze the difference between the two methods. Results show that the KCCA-based MAD can be satisfactorily applied to relative radiometric normalization; the algorithm describes the nonlinear relationship between multi-temporal images well. This work is the first attempt to apply a KCCA-based MAD algorithm to relative radiometric normalization.

  9. Robust glint detection through homography normalization

    DEFF Research Database (Denmark)

    Hansen, Dan Witzner; Roholm, Lars; García Ferreiros, Iván

    2014-01-01

    A novel normalization principle for robust glint detection is presented. The method is based on geometric properties of corneal reflections and allows for simple and effective detection of glints even in the presence of several spurious and identically appearing reflections. The method is tested...

  10. Power curve report - with turbulence intensity normalization

    DEFF Research Database (Denmark)

    Gómez Arranz, Paula; Wagner, Rozenn; Vesth, Allan

    , additional shear and turbulence intensitity filters are applied on the measured data. Secondly, the method for normalization to a given reference turbulence intensity level (as described in Annex M of the draft of IEC 61400-12-1 Ed.2 [3]) is applied. The measurements have been performed using DTU...

  11. Accounting for the Benefits of Database Normalization

    Science.gov (United States)

    Wang, Ting J.; Du, Hui; Lehmann, Constance M.

    2010-01-01

    This paper proposes a teaching approach to reinforce accounting students' understanding of the concept of database normalization. Unlike a conceptual approach shown in most of the AIS textbooks, this approach involves with calculations and reconciliations with which accounting students are familiar because the methods are frequently used in…

  12. Selective attention in normal and impaired hearing.

    Science.gov (United States)

    Shinn-Cunningham, Barbara G; Best, Virginia

    2008-12-01

    A common complaint among listeners with hearing loss (HL) is that they have difficulty communicating in common social settings. This article reviews how normal-hearing listeners cope in such settings, especially how they focus attention on a source of interest. Results of experiments with normal-hearing listeners suggest that the ability to selectively attend depends on the ability to analyze the acoustic scene and to form perceptual auditory objects properly. Unfortunately, sound features important for auditory object formation may not be robustly encoded in the auditory periphery of HL listeners. In turn, impaired auditory object formation may interfere with the ability to filter out competing sound sources. Peripheral degradations are also likely to reduce the salience of higher-order auditory cues such as location, pitch, and timbre, which enable normal-hearing listeners to select a desired sound source out of a sound mixture. Degraded peripheral processing is also likely to increase the time required to form auditory objects and focus selective attention so that listeners with HL lose the ability to switch attention rapidly (a skill that is particularly important when trying to participate in a lively conversation). Finally, peripheral deficits may interfere with strategies that normal-hearing listeners employ in complex acoustic settings, including the use of memory to fill in bits of the conversation that are missed. Thus, peripheral hearing deficits are likely to cause a number of interrelated problems that challenge the ability of HL listeners to communicate in social settings requiring selective attention.

  13. Superconvergent sum rules for the normal reflectivity

    International Nuclear Information System (INIS)

    Furuya, K.; Zimerman, A.H.; Villani, A.

    1976-05-01

    Families of superconvergent relations for the normal reflectivity function are written. Sum rules connecting the difference of phases of the reflectivities of two materials are also considered. Finally superconvergence relations and sum rules for magneto-reflectivity in the Faraday and Voigt regimes are also studied

  14. Effects of pions on normal tissues

    International Nuclear Information System (INIS)

    Tokita, N.

    1981-01-01

    Verification of the uniform biological effectiveness of pion beams of various dimensions produced at LAMPF has been made using cultured mammalian cells and mouse jejunum. Normal tissue radiobiology studies at LAMPF are reviewed with regard to biological beam characterization for the therapy program and the current status of acute and late effect studies on rodents

  15. Normal equivariant forms of vector fields

    International Nuclear Information System (INIS)

    Sanchez Bringas, F.

    1992-07-01

    We prove a linearization theorem of Siegel type and a normal form theorem of Poincaré-Dulac type for germs of holomorphic vector fields at the origin of C², equivariant under a finite subgroup Γ of GL(2,C). (author). 5 refs

  16. Challenging the Ideology of Normal in Schools

    Science.gov (United States)

    Annamma, Subini A.; Boelé, Amy L.; Moore, Brooke A.; Klingner, Janette

    2013-01-01

    In this article, we build on Brantlinger's work to critique the binary of normal and abnormal applied in US schools that create inequities in education. Operating from a critical perspective, we draw from Critical Race Theory, Disability Studies in Education, and Cultural/Historical Activity Theory to build a conceptual framework for…

  17. Prominent central spinal canal on MRI - normal variant or pathology

    International Nuclear Information System (INIS)

    Dugal, T.P.; Brazier, D.; Roche, J.

    2002-01-01

    Full text: The sensitivity of MRI can make differentiation of normal from abnormal challenging. This study investigates whether a visible central spinal canal is pathological or a normal variant. We review eight MRI cases (mostly imaged on a 1.5 Tesla unit) in which there is a visible central cavity in keeping with a central canal, and review the literature. The central canal is a space in the medial part of the grey-matter commissure between the anterior and posterior horns. Histopathological studies show that the canal is present at birth, with the majority showing subsequent involution, but it is uncommonly imaged on MRI. The main differential diagnosis is syringomyelia, which usually presents with deficits in pain and sensation corresponding to the appropriate level, often with a demonstrable aetiology. Two-thirds of our patients were female, with an average age of thirty-six years (range 26-45). The patients were largely asymptomatic or their symptoms appeared unrelated to the imaging findings. Three patients had minor previous trauma and two others had non-bacterial meningitis up to twenty years earlier. No patient had known spinal surgery or trauma. The cavity corresponded topographically to the expected site of the central canal, in the thoracic region. The canal diameter ranged from one to five millimetres and its length varied from half a vertebral body height to the entire thoracic region. Its configuration was either filiform or fusiform, with smooth contours. No predisposing features to suggest syringomyelia or other structural abnormalities were noted. Where gadolinium was given, no abnormal enhancement was observed. These cases add to the literature and suggest that these prominent canals are largely asymptomatic and should be viewed as normal variants. Copyright (2002) Blackwell Science Pty Ltd

  18. Normalization and microbial differential abundance strategies depend upon data characteristics.

    Science.gov (United States)

    Weiss, Sophie; Xu, Zhenjiang Zech; Peddada, Shyamal; Amir, Amnon; Bittinger, Kyle; Gonzalez, Antonio; Lozupone, Catherine; Zaneveld, Jesse R; Vázquez-Baeza, Yoshiki; Birmingham, Amanda; Hyde, Embriette R; Knight, Rob

    2017-03-03

    Data from 16S ribosomal RNA (rRNA) amplicon sequencing present challenges to ecological and statistical interpretation. In particular, library sizes often vary over several ranges of magnitude, and the data contains many zeros. Although we are typically interested in comparing relative abundance of taxa in the ecosystem of two or more groups, we can only measure the taxon relative abundance in specimens obtained from the ecosystems. Because the comparison of taxon relative abundance in the specimen is not equivalent to the comparison of taxon relative abundance in the ecosystems, this presents a special challenge. Second, because the relative abundance of taxa in the specimen (as well as in the ecosystem) sum to 1, these are compositional data. Because the compositional data are constrained by the simplex (sum to 1) and are not unconstrained in the Euclidean space, many standard methods of analysis are not applicable. Here, we evaluate how these challenges impact the performance of existing normalization methods and differential abundance analyses. Effects on normalization: Most normalization methods enable successful clustering of samples according to biological origin when the groups differ substantially in their overall microbial composition. Rarefying more clearly clusters samples according to biological origin than other normalization techniques do for ordination metrics based on presence or absence. Alternate normalization measures are potentially vulnerable to artifacts due to library size. Effects on differential abundance testing: We build on a previous work to evaluate seven proposed statistical methods using rarefied as well as raw data. Our simulation studies suggest that the false discovery rates of many differential abundance-testing methods are not increased by rarefying itself, although of course rarefying results in a loss of sensitivity due to elimination of a portion of available data. For groups with large (~10×) differences in the average
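
Rarefying, the normalization method evaluated above, can be sketched as subsampling each library without replacement down to a common depth; the counts and depth below are arbitrary:

```python
import numpy as np

def rarefy(counts, depth, rng=None):
    """Rarefy a vector of taxon counts: subsample reads without replacement
    to a fixed library size. Counts and depth here are illustrative."""
    rng = np.random.default_rng(rng)
    counts = np.asarray(counts)
    if counts.sum() < depth:
        raise ValueError("library smaller than rarefaction depth")
    # expand counts to individual reads, subsample, then re-tabulate
    reads = np.repeat(np.arange(counts.size), counts)
    keep = rng.choice(reads, size=depth, replace=False)
    return np.bincount(keep, minlength=counts.size)

sample = [500, 300, 150, 50, 0]      # one library of five taxa
r = rarefy(sample, 200, rng=0)
print(r.sum())  # 200
```

This makes the trade-off in the abstract concrete: every library ends up at the same depth, but reads above that depth are discarded, which is the loss of sensitivity the authors note.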

  19. Potential clinical impact of normal-tissue intrinsic radiosensitivity testing

    International Nuclear Information System (INIS)

    Bentzen, Soeren M.

    1997-01-01

    A critical appraisal is given of the possible benefit from a reliable pre-treatment knowledge of individual normal-tissue sensitivity to radiotherapy. The considerations are in part, but not exclusively, based on the recent experience with in vitro colony-forming assays of the surviving fraction at 2 Gy, the SF2. Three strategies are reviewed: (1) to screen for rare cases with extreme radiosensitivity, so-called over-reactors, and treat these with reduced total dose; (2) to identify the sensitive tail of the distribution of 'normal' radiosensitivities, refer these patients to other treatment, and escalate the dose to the remaining patients; or (3) to individualize dose prescriptions based on individual radiosensitivity, i.e. treating to isoeffect rather than to a specific dose-fractionation schedule. It is shown that these strategies will have a small, if any, impact on routine radiotherapy. Screening for over-reactors is hampered by the low prevalence of these among otherwise un-selected patients, which leads to a low positive predictive value of in vitro radiosensitivity assays. It is argued that this problem may persist even if the noise on current assays could be reduced to (the unrealistic value of) zero, simply because of the large biological variation in SF2. Removing the sensitive tail of the patient population will only have a minor effect on the dose that could be delivered to the remaining patients, because of the sigmoid shape of empirical dose-response relationships. Finally, individualizing dose prescriptions based exclusively on information from a normal-tissue radiosensitivity assay leads to a nearly symmetrical distribution of dose-changes that would produce a very small gain, or even a loss, of tumor control probability if implemented in the clinic. From a theoretical point of view, other strategies could be devised and some of these are considered in this review. Right now the most promising clinical use of in vitro radiosensitivity

  20. Diffusely increased uptake in the skull in normal bone scans

    International Nuclear Information System (INIS)

    Suematsu, Toru; Yoshida, Shoji; Motohara, Tomofumi; Fujiwara, Hirofumi; Nishii, Hironori; Komiyama, Toyozo; Yanase, Masakazu; Mizutani, Masahiro

    1992-01-01

    Diffusely increased skull uptake (a hot skull) is often seen in patients with bone metastases and metabolic disease. This finding is also, however, noticed in normal bone scans of aged women. To determine whether the hot skull could be considered a normal variant in elderly women and is associated with menopause, we studied 282 normal bone scans (166 women and 116 men without metabolic or hormonal disease; age range 11 to 84 yr). We divided the patients into eight age groups: 10-19, 20-29, 30-39, 40-49, 50-59, 60-69, 70-79, and 80-89 yr. Measurements of skull uptake were obtained from anterior total body views using the contrast-to-noise ratio (CNR). CNR for the skull was calculated using an equation. The sex-dependent difference in skull uptake began to develop in the age group 30-39 yr (p<0.05). The skull showed greater activity in women than in men for the age groups from 30-39 to 80-89 yr. In the age groups 50-59 and 60-69, the difference was particularly large (p<0.001). For women, the 50-59 yr age group had a significantly higher CNR than the 40-49 yr (p<0.01), 30-39 yr (p<0.05), and 20-29 yr age groups (p<0.05). On the other hand, there was no significant difference among the 20-29 yr, 30-39 yr and 40-49 yr age groups. For men, skull uptake was virtually unchanged with age. Our data strongly suggest that the hot skull in normal bone scans is related to menopausal estrogen deficiency. A hot skull in elderly women should not necessarily be regarded as abnormal. (J.P.N.)
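
The abstract states only that CNR was "calculated using an equation" without reproducing it. A common definition (ROI/background mean difference over background noise) might look like this sketch; the equation choice and the count values are assumptions, not the paper's:

```python
import numpy as np

def cnr(roi, background):
    """Contrast-to-noise ratio: difference of ROI and background mean counts,
    divided by the background standard deviation. A common definition used
    here for illustration; the paper's exact equation is not given in the
    abstract."""
    roi = np.asarray(roi, float)
    bg = np.asarray(background, float)
    return (roi.mean() - bg.mean()) / bg.std(ddof=1)

# hypothetical skull-ROI vs. soft-tissue background counts
print(cnr([10, 12, 11], [5, 6, 5, 6]))
```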

  1. Vocal fold contact patterns based on normal modes of vibration.

    Science.gov (United States)

    Smith, Simeon L; Titze, Ingo R

    2018-05-17

    The fluid-structure interaction and energy transfer from respiratory airflow to self-sustained vocal fold oscillation continues to be a topic of interest in vocal fold research. Vocal fold vibration is driven by pressures on the vocal fold surface, which are determined by the shape of the glottis and the contact between vocal folds. Characterization of three-dimensional glottal shapes and contact patterns can lead to increased understanding of normal and abnormal physiology of the voice, as well as to development of improved vocal fold models, but a large inventory of shapes has not been directly studied previously. This study aimed to take an initial step toward characterizing vocal fold contact patterns systematically. Vocal fold motion and contact was modeled based on normal mode vibration, as it has been shown that vocal fold vibration can be almost entirely described by only the few lowest order vibrational modes. Symmetric and asymmetric combinations of the four lowest normal modes of vibration were superimposed on left and right vocal fold medial surfaces, for each of three prephonatory glottal configurations, according to a surface wave approach. Contact patterns were generated from the interaction of modal shapes at 16 normalized phases during the vibratory cycle. Eight major contact patterns were identified and characterized by the shape of the flow channel, with the following descriptors assigned: convergent, divergent, convergent-divergent, uniform, split, merged, island, and multichannel. Each of the contact patterns and its variation are described, and future work and applications are discussed. Copyright © 2018 Elsevier Ltd. All rights reserved.
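
The superposition-of-modes approach can be illustrated with a one-dimensional toy version: sinusoidal mode shapes along the fold length, with contact wherever the modal displacement closes a prephonatory gap. The mode shapes, amplitudes, phases, and gap value below are illustrative, not the paper's actual basis or parameters:

```python
import numpy as np

def glottal_gap(x, t, gap0, amps, phases):
    """Glottal half-width along normalized fold length x at normalized phase t,
    as a superposition of the lowest normal modes. Sinusoidal mode shapes are
    assumed for illustration only."""
    modes = np.array([np.sin((k + 1) * np.pi * x) for k in range(len(amps))])
    disp = sum(a * np.cos(2 * np.pi * t + p) * m
               for a, p, m in zip(amps, phases, modes))
    return gap0 + disp

x = np.linspace(0.0, 1.0, 200)
gap = glottal_gap(x, 0.5, 0.02, [0.05, 0.02], [0.0, np.pi / 2])
contact = gap <= 0  # boolean contact pattern along the fold length
print(contact.any(), contact.all())
```

Sampling such patterns at several phases of the cycle is the kind of procedure that yields the contact-pattern inventory the study describes.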

  2. Normalization for triple-target microarray experiments

    Directory of Open Access Journals (Sweden)

    Magniette Frederic

    2008-04-01

    Full Text Available Abstract Background Most microarray studies are made using labelling with one or two dyes, which allows the hybridization of one or two samples on the same slide. In such experiments, the most frequently used dyes are Cy3 and Cy5. Recent improvements in the technology (dye-labelling, scanners and image analysis) allow hybridization of up to four samples simultaneously. The two additional dyes are Alexa488 and Alexa494. The triple-target or four-target technology is very promising, since it allows more flexibility in the design of experiments, an increase in the statistical power when comparing gene expression induced by different conditions, and a scaled-down number of slides. However, few methods have been proposed for the statistical analysis of such data. Moreover, the lowess correction of the global dye effect is available only for two-color experiments, and even if its application can be derived, it does not allow simultaneous correction of the raw data. Results We propose a two-step normalization procedure for triple-target experiments. First the dye bleeding is evaluated and corrected if necessary. Then the signal in each channel is normalized using a generalized lowess procedure to correct a global dye bias. The normalization procedure is validated using triple-self experiments and by comparing the results of triple-target and two-color experiments. Although the focus is on triple-target microarrays, the proposed method can be used to normalize p differently labelled targets co-hybridized on the same array, for any value of p greater than 2. Conclusion The proposed normalization procedure is effective: the technical biases are reduced, the number of false positives is under control in the analysis of differentially expressed genes, and the triple-target experiments are more powerful than the corresponding two-color experiments. There is room for improving microarray experiments by simultaneously hybridizing more than two samples.
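
The two-color dye-bias correction that the paper generalizes operates on MA-transformed intensities. In this sketch a running median stands in for the lowess fit (a real implementation would use a lowess routine from a statistics library), and the simulated constant bias is artificial:

```python
import numpy as np

def ma_normalize(red, green, window=101):
    """Two-color MA normalization sketch: M = log-ratio, A = mean log-intensity.
    A running median of M ordered by A stands in for the lowess trend fit
    described in the paper."""
    r = np.log2(red)
    g = np.log2(green)
    M, A = r - g, 0.5 * (r + g)
    order = np.argsort(A)           # process spots in order of intensity A
    Msorted = M[order]
    trend = np.empty_like(M)
    half = window // 2
    for i in range(len(M)):
        lo, hi = max(0, i - half), min(len(M), i + half + 1)
        trend[order[i]] = np.median(Msorted[lo:hi])
    return M - trend                # dye-bias-corrected log-ratios

rng = np.random.default_rng(1)
green = rng.lognormal(8, 1, 500)
red = green * 2 ** 0.6              # constant dye bias of 0.6 on the log2 scale
corrected = ma_normalize(red, green)
print(np.allclose(corrected, 0, atol=1e-6))  # True: constant bias removed
```

The paper's contribution is extending this per-pair correction to a simultaneous correction across three or more channels.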

  3. Fission cross-section normalization problems

    International Nuclear Information System (INIS)

    Wagemans, C.; Ghent Rijksuniversiteit; Deruytter, A.J.

    1983-01-01

    The present measurements yield σ_f data in the neutron energy region from 20 meV to 30 keV, directly normalized in the thermal region. In the keV region these data are consistent with the absolute σ_f measurements of Szabo and Marquette. For the secondary normalization integral I₂, values have been obtained in agreement with those of Gwin et al. and Czirr et al., which were also directly normalized in the thermal region. For the I₁ integral, however, puzzlingly low values have been obtained. This was also the case for the average σ_f in neutron energy intervals containing strong resonances. Three additional measurements are planned to investigate these observations further: (i) maintaining the actual ≈2π geometry but using a ¹⁰B foil for the neutron flux detection, and (ii) using a low-geometry detection arrangement with a ¹⁰B as well as a ⁶Li flux monitor. Only after these measurements can definite conclusions on the I₁ and I₂ integrals be formulated and final average σ_f values be released. The present study also gives some evidence for a correlation between the integral I₂ and the neutron flux monitor used. The influence of a normalization via I₁ or I₂ on the final cross-section has been shown, and the magnitude of possible normalization errors is illustrated. Finally, since ²³⁵U is expected to be an ''easy'' nucleus (low α-activity, high σ_f values), there are some indications that the important discrepancies still present in ²³⁵U(n,f) cross-section measurements might partially be due to errors in the neutron flux determination

  4. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study therefore fuses the images produced by RX applied to normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operating Characteristic (ROC) curves quantitatively assessed the target detection performance. The performance is highly variable, depending on the relative numbers of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street-line masks. The joint Boolean OR and AND operations also give variable performance depending on the scene, whereas the joint SUM operation provides a reasonable compromise between OR and AND with good target detection performance. In addition, transforms based on normalized correlation coefficients and least squares lead to methods related to canonical correlation analysis (CCA) and normalized image regression (NIR), which performed better than the standard approaches. In change detection, only RX applied to the unnormalized difference imagery provides adequate performance.
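The probability-level fusion rules discussed above (Boolean OR, AND, and the SUM compromise) can be sketched as follows; the conversion of RX scores to probabilities via the chi-square mixture is assumed to have happened upstream:

```python
import numpy as np

def fuse(p1, p2, rule="sum"):
    """Fuse two per-pixel target probabilities (e.g. RX applied to
    normalized and to unnormalized imagery), both in [0, 1]."""
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    if rule == "or":          # fires if either detector fires
        return 1.0 - (1.0 - p1) * (1.0 - p2)
    if rule == "and":         # requires both detectors to agree
        return p1 * p2
    if rule == "sum":         # compromise between OR and AND
        return 0.5 * (p1 + p2)
    raise ValueError(f"unknown rule: {rule}")
```

For any pair of probabilities the AND result is the most conservative and the OR result the most permissive, with SUM always in between, which mirrors the compromise behaviour reported in the abstract.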

  5. On the transition to the normal phase for superconductors surrounded by normal conductors

    DEFF Research Database (Denmark)

    Fournais, Søren; Kachmar, Ayman

    2009-01-01

    For a cylindrical superconductor surrounded by a normal material, we discuss transition to the normal phase of stable, locally stable and critical configurations. Associated with those phase transitions, we define critical magnetic fields and we provide a sufficient condition for which those...

  6. Inheritance of Properties of Normal and Non-Normal Distributions after Transformation of Scores to Ranks

    Science.gov (United States)

    Zimmerman, Donald W.

    2011-01-01

    This study investigated how population parameters representing heterogeneity of variance, skewness, kurtosis, bimodality, and outlier-proneness, drawn from normal and eleven non-normal distributions, also characterized the ranks corresponding to independent samples of scores. When the parameters of population distributions from which samples were…
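The effect described, that properties such as heterogeneity of variance survive the transformation of scores to pooled ranks, can be illustrated with a small sketch (the mid-rank helper mimics `scipy.stats.rankdata`; the sample values are invented):

```python
import numpy as np

def ranks(values):
    """Mid-ranks with ties averaged (like scipy.stats.rankdata)."""
    values = np.asarray(values, float)
    order = np.argsort(values, kind="mergesort")
    r = np.empty(len(values))
    i = 0
    while i < len(values):
        j = i
        while j + 1 < len(values) and values[order[j + 1]] == values[order[i]]:
            j += 1
        r[order[i:j + 1]] = 0.5 * (i + j) + 1.0  # mean of ranks i+1 .. j+1
        i = j + 1
    return r

# Two samples with equal means but unequal variances: after pooled
# ranking, the high-variance sample still shows the larger rank variance.
low_var = [10, 11, 12, 13, 14]
high_var = [0, 6, 12, 18, 24]
r = ranks(np.concatenate([low_var, high_var]))
```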

  7. Large electrostatic accelerators

    Energy Technology Data Exchange (ETDEWEB)

    Jones, C.M.

    1984-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of large electrostatic accelerators designed to operate at terminal potentials of 20 MV or above. In this paper, the author briefly discusses the status of these new accelerators and also discusses several recent technological advances which may be expected to further improve their performance. The paper is divided into four parts: (1) a discussion of the motivation for the construction of large electrostatic accelerators, (2) a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, (3) a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and (4) a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year. Due to time and space constraints, discussion is restricted to consideration of only tandem accelerators.

  8. Large electrostatic accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1984-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of large electrostatic accelerators designed to operate at terminal potentials of 20 MV or above. In this paper, the author briefly discusses the status of these new accelerators and also discusses several recent technological advances which may be expected to further improve their performance. The paper is divided into four parts: (1) a discussion of the motivation for the construction of large electrostatic accelerators, (2) a description and discussion of several large electrostatic accelerators which have been recently completed or are under construction, (3) a description of several recent innovations which may be expected to improve the performance of large electrostatic accelerators in the future, and (4) a description of an innovative new large electrostatic accelerator whose construction is scheduled to begin next year. Due to time and space constraints, discussion is restricted to consideration of only tandem accelerators

  9. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring the 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard to which gravimetric measurements of the Earth's gravitational field are referred is a normal gravity field, represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from the Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field, which is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are
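For reference, the Newtonian normal gravity on the level ellipsoid, which the paper's post-Newtonian treatment generalizes, is given in classical geodesy by Somigliana's closed formula (not stated in the abstract; quoted here from standard geodesy):

```latex
\gamma(\varphi) \;=\; \gamma_e\,\frac{1 + k\,\sin^2\varphi}{\sqrt{1 - e^2 \sin^2\varphi}},
\qquad
k \;=\; \frac{b\,\gamma_p - a\,\gamma_e}{a\,\gamma_e},
```

where a and b are the semi-major and semi-minor axes of the level ellipsoid, γ_e and γ_p the equatorial and polar normal gravity, e the first eccentricity, and φ the geodetic latitude.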

  10. Deformation associated with continental normal faults

    Science.gov (United States)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5), Greece, image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data, and illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ~20 km from the fault surface trace, while the folds in the western Grand Canyon extend only 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates the advantages of mechanical models in exploring normal faulting processes, including incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  11. Comparison of CSF Distribution between Idiopathic Normal Pressure Hydrocephalus and Alzheimer Disease.

    Science.gov (United States)

    Yamada, S; Ishikawa, M; Yamamoto, K

    2016-07-01

    CSF volumes in the basal cistern and Sylvian fissure are increased in both idiopathic normal pressure hydrocephalus and Alzheimer disease, though the differences in these volumes in idiopathic normal pressure hydrocephalus and Alzheimer disease have not been well-described. Using CSF segmentation and volume quantification, we compared the distribution of CSF in idiopathic normal pressure hydrocephalus and Alzheimer disease. CSF volumes were extracted from T2-weighted 3D spin-echo sequences on 3T MR imaging and quantified semi-automatically. We compared the volumes and ratios of the ventricles and subarachnoid spaces after classification in 30 patients diagnosed with idiopathic normal pressure hydrocephalus, 10 with concurrent idiopathic normal pressure hydrocephalus and Alzheimer disease, 18 with Alzheimer disease, and 26 control subjects 60 years of age or older. Brain to ventricle ratios at the anterior and posterior commissure levels and 3D volumetric convexity cistern to ventricle ratios were useful indices for the differential diagnosis of idiopathic normal pressure hydrocephalus or idiopathic normal pressure hydrocephalus with Alzheimer disease from Alzheimer disease, similar to the z-Evans index and callosal angle. The most distinctive characteristics of the CSF distribution in idiopathic normal pressure hydrocephalus were small convexity subarachnoid spaces and the large volume of the basal cistern and Sylvian fissure. The distribution of the subarachnoid spaces in the idiopathic normal pressure hydrocephalus with Alzheimer disease group was the most deformed among these 3 groups, though the mean ventricular volume of the idiopathic normal pressure hydrocephalus with Alzheimer disease group was intermediate between that of the idiopathic normal pressure hydrocephalus and Alzheimer disease groups. The z-axial expansion of the lateral ventricle and compression of the brain just above the ventricle were the common findings in the parameters for differentiating

  12. A systematic assessment of normalization approaches for the Infinium 450K methylation platform.

    Science.gov (United States)

    Wu, Michael C; Joubert, Bonnie R; Kuan, Pei-fen; Håberg, Siri E; Nystad, Wenche; Peddada, Shyamal D; London, Stephanie J

    2014-02-01

    The Illumina Infinium HumanMethylation450 BeadChip has emerged as one of the most popular platforms for genome-wide profiling of DNA methylation. While the technology is widespread, systematic technical biases are believed to be present in the data. For example, this array incorporates two different chemical assays, i.e., Type I and Type II probes, which exhibit different technical characteristics and potentially complicate the computational and statistical analysis. Several normalization methods have been introduced recently to adjust for possible biases. However, there is considerable debate within the field on which normalization procedure should be used, and indeed whether normalization is even necessary. Despite the importance of the question, there has been little comprehensive comparison of normalization methods. We sought to systematically compare several popular normalization approaches using the Norwegian Mother and Child Cohort Study (MoBa) methylation data set and the technical replicates analyzed with it as a case study. We assessed both the reproducibility between technical replicates following normalization and the effect of normalization on association analysis. Results indicate that the raw data are already highly reproducible; some normalization approaches can slightly improve reproducibility, but others may introduce more variability into the data. Results also suggest that differences in association analysis after applying different normalizations are not large when the signal is strong, but when the signal is more modest, different normalizations can yield very different numbers of findings that meet a weaker statistical significance threshold. Overall, our work provides a useful, objective assessment of the effectiveness of key normalization methods.
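As a concrete example of one family of approaches compared in such studies, quantile normalization forces every sample (column) to share the same empirical distribution; this is a generic sketch, not the MoBa pipeline, and ties are broken arbitrarily:

```python
import numpy as np

def quantile_normalize(x):
    """Map each column of a (probes x samples) matrix onto the mean
    of the per-column sorted values, so all samples share one
    empirical distribution."""
    x = np.asarray(x, float)
    order = np.argsort(x, axis=0)          # per-column sort order
    ranks = np.argsort(order, axis=0)      # per-column ranks
    mean_sorted = np.sort(x, axis=0).mean(axis=1)  # reference distribution
    return mean_sorted[ranks]
```

After normalization every column contains exactly the same set of values, only permuted according to each sample's original ordering.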

  13. Large field radiotherapy

    International Nuclear Information System (INIS)

    Vanasek, J.; Chvojka, Z.; Zouhar, M.

    1984-01-01

    Calculations may prove that the irradiation procedures commonly used in radiotherapy, represented by large-field irradiation techniques, do not exceed certain limits of integral dose within which the radiobiological action on the organism remains favourable. On the other hand, the integral doses in supralethal whole-body irradiation, used in the therapy of acute leukemia, represent radiobiological values which, without extreme and exceptional further interventions and teamwork, are not compatible with life, and the radiotherapist cannot use such high doses without the backing of a large team. (author)

  14. Developing Large Web Applications

    CERN Document Server

    Loudon, Kyle

    2010-01-01

    How do you create a mission-critical site that provides exceptional performance while remaining flexible, adaptable, and reliable 24/7? Written by the manager of a UI group at Yahoo!, Developing Large Web Applications offers practical steps for building rock-solid applications that remain effective even as you add features, functions, and users. You'll learn how to develop large web applications with the extreme precision required for other types of software. Avoid common coding and maintenance headaches as small websites add more pages, more code, and more programmers. Get comprehensive soluti

  15. Choice of large projects

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R

    1978-08-01

    Conventional cost/benefit or project analysis has generally not taken into account circumstances in which the project under consideration is large enough that its introduction to the economy would have significant general equilibrium effects. In this paper, rules are examined that would indicate whether such large projects should be accepted or rejected. The rules utilize information yielded by before-project and after-project equilibrium prices and production data. Rules are developed for the undistorted ''first-best'' case, the case in which the fixed costs of the project are covered by distortionary taxation, and for the case of projects producing public goods. 34 references.

  16. Lifetime analysis of the ITER first wall under steady-state and off-normal loads

    International Nuclear Information System (INIS)

    Mitteau, R; Sugihara, M; Raffray, R; Carpentier-Chouchana, S; Merola, M; Pitts, R A; Labidi, H; Stangeby, P

    2011-01-01

    The lifetime of the beryllium armor of the ITER first wall is evaluated for normal and off-normal operation. For the individual events considered, the lifetime spans between 930 and 35×10⁶ discharges. The discrepancy between low and high estimates is caused by uncertainties about the behavior of the melt layer during off-normal events, variable plasma operation parameters and variability of the sputtering yields. These large uncertainties in beryllium armor loss estimates are a good example of the experimental nature of the ITER project and will not be truly resolved until ITER begins burning plasma operation.

  17. SwarmDock and the Use of Normal Modes in Protein-Protein Docking

    Directory of Open Access Journals (Sweden)

    Paul A. Bates

    2010-09-01

    Full Text Available Here is presented an investigation of the use of normal modes in protein-protein docking, both in theory and in practice. Upper limits of the ability of normal modes to capture the unbound to bound conformational change are calculated on a large test set, with particular focus on the binding interface, the subset of residues from which the binding energy is calculated. Further, the SwarmDock algorithm is presented, to demonstrate that the modelling of conformational change as a linear combination of normal modes is an effective method of modelling flexibility in protein-protein docking.
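The two ideas in the abstract, deforming a structure along a linear combination of modes and measuring the upper limit of how much of the unbound-to-bound change a mode set can capture, can be sketched as follows (toy coordinates; not the SwarmDock implementation):

```python
import numpy as np

def deform(coords, modes, amplitudes):
    """Move an N x 3 conformation along a linear combination of
    normal-mode eigenvectors (each mode is an N x 3 array)."""
    out = np.asarray(coords, float).copy()
    for m, a in zip(modes, amplitudes):
        out += a * np.asarray(m, float)
    return out

def capture_fraction(displacement, modes):
    """Least-squares amplitudes and the fraction of a target
    displacement (e.g. unbound -> bound) that the chosen mode set
    can reproduce: the 'upper limit' idea from the abstract."""
    d = np.asarray(displacement, float).ravel()
    B = np.stack([np.asarray(m, float).ravel() for m in modes], axis=1)
    amp, *_ = np.linalg.lstsq(B, d, rcond=None)
    resid = d - B @ amp
    return amp, 1.0 - resid @ resid / (d @ d)
```

A displacement that lies entirely within the span of the modes is captured with fraction 1; real unbound-to-bound changes are only partially captured, which is what the test-set calculations quantify.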

  18. The variability problem of normal human walking

    DEFF Research Database (Denmark)

    Simonsen, Erik B; Alkjær, Tine

    2012-01-01

    Previous investigations have suggested considerable inter-individual variability in the time course pattern of net joint moments during normal human walking, although the limited sample sizes precluded statistical analyses. The purpose of the present study was to obtain joint moment patterns from a group of normal subjects and to test whether or not the expected differences would prove to be statistically significant. Fifteen healthy male subjects were recorded on video while they walked across two force platforms. Ten kinematic and kinetic parameters were selected and input to a statistical cluster analysis to determine whether or not the 15 subjects could be divided into different 'families' (clusters) of walking strategy. The net joint moments showed a variability corroborating earlier reports. The cluster analysis showed that the 15 subjects could be grouped into two clusters of 5 and 10...

  19. Autobiographical Memory in Normal Ageing and Dementia

    Directory of Open Access Journals (Sweden)

    Harvey J. Sagar

    1991-01-01

    Full Text Available Autobiographical memories in young and elderly normal subjects are drawn mostly from the recent past, but elderly subjects relate a second peak of memories from early adulthood. Memory for remote past public events is relatively preserved in dementia, possibly reflecting integrity of semantic relative to episodic memory. We examined recall of specific, consistent autobiographical episodes in Alzheimer's disease (AD) in response to cue words. Patients and control subjects drew most memories from the recent 20 years: episode age related to anterograde memory function but not to subject age or dementia. Subjects also related a secondary peak of memories from early adulthood; episode age related to subject age and severity of dementia. The results suggest that preferential recall of memories from early adulthood is based on the salience of retrieval cues, altered by age and dementia, superimposed on a temporal gradient of semantic memory. Further, AD shows behavioural similarity to normal ageing.

  20. A Proposed Arabic Handwritten Text Normalization Method

    Directory of Open Access Journals (Sweden)

    Tarik Abu-Ain

    2014-11-01

    Full Text Available Text normalization is an important technique in document image analysis and recognition. It consists of many preprocessing stages, which include slope correction, text padding, skew correction, and straightening of the writing line. As such, text normalization plays an important role in many procedures such as text segmentation, feature extraction and character recognition. In the present article, a new method for text baseline detection, straightening, and slant correction for Arabic handwritten texts is proposed. The method comprises a set of sequential steps: first, component segmentation is done, followed by thinning of the component text; then, the direction features of the skeletons are extracted and the candidate baseline regions are determined. After that, the correct baseline region is selected, and finally, the baselines of all components are aligned with the writing line. The experiments are conducted on the IFN/ENIT benchmark Arabic dataset. The results show that the proposed method has a promising and encouraging performance.
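A minimal stand-in for the baseline steps described: the paper extracts direction features from skeletons, while this sketch uses a simpler horizontal projection profile, and `np.roll` wraps rather than pads, which is acceptable only for illustration:

```python
import numpy as np

def baseline_row(binary_img):
    """Row with the maximum ink density: a common rough baseline
    estimate for a single line of handwriting."""
    profile = np.asarray(binary_img).sum(axis=1)  # horizontal projection
    return int(np.argmax(profile))

def align_component(comp, page_baseline):
    """Shift a component image vertically so its own baseline
    lands on the page's writing line."""
    shift = page_baseline - baseline_row(comp)
    return np.roll(comp, shift, axis=0)
```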

  1. Zero cosmological constant from normalized general relativity

    International Nuclear Information System (INIS)

    Davidson, Aharon; Rubin, Shimon

    2009-01-01

    Normalizing the Einstein-Hilbert action by the volume functional makes the theory invariant under constant shifts in the Lagrangian. The associated field equations then resemble unimodular gravity whose otherwise arbitrary cosmological constant is now determined as a Machian universal average. We prove that an empty space-time is necessarily Ricci tensor flat, and demonstrate the vanishing of the cosmological constant within the scalar field paradigm. The cosmological analysis, carried out at the mini-superspace level, reveals a vanishing cosmological constant for a universe which cannot be closed as long as gravity is attractive. Finally, we give an example of a normalized theory of gravity which does give rise to a non-zero cosmological constant.

  2. Suitable Image Intensity Normalization for Arterial Visualization

    Directory of Open Access Journals (Sweden)

    Yara Omran

    2012-12-01

    Full Text Available Ultrasonic imaging is a widely used non-invasive medical imaging procedure, since it is economical, comparatively safe, portable and adaptable. However, one of its main weaknesses is the poor quality of images, which makes the enhancement of image quality an important issue in order to reach a more accurate diagnosis of the disease, for the transmission of the image through a telemedicine channel, and in many other image processing tasks [1]. The purpose of this paper is to automatically enhance the image quality after the automatic detection of the artery wall. This step is essential before subsequent measurements of arterial parameters [9]. This was performed automatically by applying linear normalization, and results showed that normalization of ultrasound images is an important step in enhancing the image quality for later processing. In comparison with other methods, our method is automatic. The evaluation of image quality was done mathematically by comparing pixel intensities of images before and after enhancement, in addition to a visual evaluation.
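Linear normalization of the kind applied here is typically a min-max intensity stretch; a generic sketch (the target range and image values are illustrative, not taken from the paper):

```python
import numpy as np

def linear_normalize(img, lo=0.0, hi=255.0):
    """Stretch pixel intensities linearly onto [lo, hi]."""
    img = np.asarray(img, float)
    mn, mx = img.min(), img.max()
    if mx == mn:                       # flat image: nothing to stretch
        return np.full_like(img, lo)
    return lo + (img - mn) * (hi - lo) / (mx - mn)
```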

  3. Normal anatomical measurements in cervical computerized tomography

    International Nuclear Information System (INIS)

    Zaunbauer, W.; Daepp, S.; Haertel, M.

    1985-01-01

    Radiodiagnostically relevant normal values and variations for measurements of the cervical region, the arithmetical average and the standard deviation were determined from adequate computer tomograms on 60 healthy women and men, aged 20 to 83 years. The sagittal diameter of the prevertebral soft tissue and the lumina of the upper respiratory tract were evaluated at exactly defined levels between the hyoid bone and the incisura jugularis sterni. - The thickness of the aryepiglottic folds, the maximal sagittal and transverse diameters of the thyroid gland and the calibre of the great cervical vessels were defined. - To assess information about laryngeal function in computerized tomography, measurements of distances between the cervical spine and anatomical fixed points of the larynx and hypopharynx were made as well as of the degree of vocal cord movement during normal respiration and phonation. (orig.) [de

  4. Broad Ligament Haematoma Following Normal Vaginal Delivery.

    Science.gov (United States)

    Ibrar, Faiza; Awan, Azra Saeed; Fatima, Touseef; Tabassum, Hina

    2017-01-01

    A 37-year-old, patient presented in emergency with history of normal vaginal delivery followed by development of abdominal distention, vomiting, constipation for last 3 days. She was para 4 and had normal vaginal delivery by traditional birth attendant at peripheral hospital 3 days back. Imaging study revealed a heterogeneous complex mass, ascites, pleural effusion, air fluid levels with dilatation gut loops. Based upon pelvic examination by senior gynaecologist in combination with ultrasound; a clinical diagnosis of broad ligament haematoma was made. However, vomiting and abdominal distention raised suspicion of intestinal obstruction. Due to worsening abdominal distention exploratory laparotomy was carried out. It was pseudo colonic obstruction and caecostomy was done. Timely intervention by multidisciplinary approach saved patient life with minimal morbidity.

  5. Normalizing the causality between time series

    Science.gov (United States)

    Liang, X. San

    2015-08-01

    Recently, a rigorous yet concise formula was derived to evaluate information flow, and hence the causality in a quantitative sense, between time series. To assess the importance of a resulting causality, it needs to be normalized. The normalization is achieved through distinguishing a Lyapunov exponent-like, one-dimensional phase-space stretching rate and a noise-to-signal ratio from the rate of information flow in the balance of the marginal entropy evolution of the flow recipient. It is verified with autoregressive models and applied to a real financial analysis problem. An unusually strong one-way causality is identified from IBM (International Business Machines Corporation) to GE (General Electric Company) in their early era, revealing to us an old story, which has almost faded into oblivion, about "Seven Dwarfs" competing with a giant for the mainframe computer market.
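A sketch of the underlying (unnormalized) information-flow estimate for linear systems, following the covariance-based formula of Liang's earlier work; the normalization step that is this paper's subject is not reproduced, and the formula below is an assumption about the estimator's standard linear form:

```python
import numpy as np

def liang_flow(x1, x2, dt=1.0):
    """Estimate the rate of information flow T(2 -> 1) between two
    time series using the covariance formula of Liang's linear
    estimator (assumed form; see lead-in)."""
    x1, x2 = np.asarray(x1, float), np.asarray(x2, float)
    d1 = (x1[1:] - x1[:-1]) / dt       # Euler-forward derivative of x1
    x1, x2 = x1[:-1], x2[:-1]
    C = np.cov(np.stack([x1, x2]))     # 2 x 2 covariance of the series
    c1d1 = np.cov(x1, d1)[0, 1]        # cov(x1, dx1/dt)
    c2d1 = np.cov(x2, d1)[0, 1]        # cov(x2, dx1/dt)
    num = C[0, 0] * C[0, 1] * c2d1 - C[0, 1] ** 2 * c1d1
    den = C[0, 0] ** 2 * C[1, 1] - C[0, 0] * C[0, 1] ** 2
    return num / den
```

On a pair of autoregressive series with one-way coupling from x2 into x1, the estimated flow from x2 to x1 should dominate the reverse direction, mirroring the IBM-to-GE asymmetry reported in the abstract.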

  6. Normal modes of vibration in nickel

    Energy Technology Data Exchange (ETDEWEB)

    Birgeneau, R J [Yale Univ., New Haven, Connecticut (United States); Cordes, J [Cambridge Univ., Cambridge (United Kingdom); Dolling, G; Woods, A D B

    1964-07-01

    The frequency-wave-vector dispersion relation, ν(q), for the normal vibrations of a nickel single crystal at 296 K has been measured for the [ζ00], [ζζ0], [ζζζ], and [0ζ1] symmetry directions using inelastic neutron scattering. The results can be described in terms of the Born-von Karman theory of lattice dynamics with interactions out to fourth-nearest neighbors. The shapes of the dispersion curves are very similar to those of copper, the normal mode frequencies in nickel being about 1.24 times the corresponding frequencies in copper. The fourth-neighbor model was used to calculate the frequency distribution function g(ν) and related thermodynamic properties. (author)
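The Born-von Karman picture with interactions out to n-th neighbors can be illustrated in one dimension, where the squared frequency is a cosine series in the wave vector (a 1D analogue only; the nickel fit is fully three-dimensional):

```python
import numpy as np

def dispersion(q, force_constants, mass):
    """Angular frequency of a 1D Born-von Karman chain at reduced
    wave vector q (in units of 2*pi/a), with springs k_n coupling
    each atom to its n-th neighbours."""
    omega_sq = sum(
        (2.0 * k_n / mass) * (1.0 - np.cos(2.0 * np.pi * n * q))
        for n, k_n in enumerate(force_constants, start=1)
    )
    return np.sqrt(omega_sq)
```

With nearest neighbours only this reduces to the textbook result ω(q) = sqrt(4k/m) |sin(πq)|, and adding further-neighbour constants reshapes the curve exactly as the fourth-neighbour fit does for the measured branches.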

  7. The self-normalized Donsker theorem revisited

    OpenAIRE

    Parczewski, Peter

    2016-01-01

    We extend the Poincaré-Borel lemma to a weak approximation of a Brownian motion via simple functionals of uniform distributions on n-spheres in the Skorokhod space D([0,1]). This approach is used to simplify the proof of the self-normalized Donsker theorem in Csörgő et al. (2003). Some notes on spheres with respect to ℓ_p-norms are given.

  8. Normal mode analysis for linear resistive magnetohydrodynamics

    International Nuclear Information System (INIS)

    Kerner, W.; Lerbinger, K.; Gruber, R.; Tsunematsu, T.

    1984-10-01

    The compressible, resistive MHD equations are linearized around an equilibrium with cylindrical symmetry and solved numerically as a complex eigenvalue problem. This normal mode code allows one to solve for very small resistivity, η ∝ 10⁻¹⁰. The scaling of growth rates and layer width agrees very well with analytical theory. In particular, the influence of both current and pressure on the instabilities is studied in detail, and the effect of resistivity on the ideally unstable internal kink is analyzed. (orig.)
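The essence of a normal-mode code, finding the eigenvalue whose real part gives the fastest growth rate of the linearized operator, can be sketched generically (a toy matrix stands in for the discretized resistive MHD operator):

```python
import numpy as np

def growth_rate(A):
    """Largest real part among the eigenvalues of a linearized
    operator A: the growth rate of the fastest normal mode.
    Positive means instability; zero or negative means stability."""
    return float(np.linalg.eigvals(np.asarray(A, float)).real.max())
```

Real normal-mode codes such as the one described solve a much larger generalized eigenproblem arising from the discretized MHD equations; the principle of reading stability off the spectrum is the same.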

  9. The J/ψ normal nuclear absorption

    CERN Document Server

    Alessandro, B; Arnaldi, R; Atayan, M; Beolè, S; Boldea, V; Bordalo, P; Borges, G; Castanier, C; Castor, J; Chaurand, B; Cheynis, B; Chiavassa, E; Cicalò, C; Comets, M P; Constantinescu, S; Cortese, P; De Falco, A; De Marco, N; Dellacasa, G; Devaux, A; Dita, S; Fargeix, J; Force, P; Gallio, M; Gerschel, C; Giubellino, P; Golubeva, M B; Grigorian, A A; Grigorian, S; Guber, F F; Guichard, A; kanyan, H; ldzik, M; Jouan, D; Karavicheva, T L; Kluberg, L; Kurepin, A B; Le Bornec, Y; Lourenço, C; Cormick, M M; Marzari-Chiesa, A; Masera, M; Masoni, A; Monteno, M; Musso, A; Petiau, P; Piccotti, A; Pizzi, J R; Prino, F; Puddu, G; Quintans, C; Ramello, L; Ramos, S; Riccati, L; Santos, H; Saturnini, P; Scomparin, E; Serci, S; Shahoyan, R; Sigaudo, M F; Sitta, M; Sonderegger, P; Tarrago, X; Topilskaya, N S; Usai, G L; Vercellin, E; Villatte, L; Willis, N; Wu T

    2005-01-01

    We present a new determination of the ratio of cross-sections J/ψ over Drell-Yan (DY) as expected for nucleus-nucleus reactions if the J/ψ were only normally absorbed by nuclear matter. This anticipated behaviour is based exclusively on proton-nucleus data and is compared, as a function of centrality, with updated S-U results from experiment NA38 and with the most recent Pb-Pb results from experiment NA50.

  10. Research on Normal Human Plantar Pressure Test

    Directory of Open Access Journals (Sweden)

    Liu Xi Yang

    2016-01-01

    An FSR400 pressure sensor, an nRF905 wireless transceiver and an MSP40 single-chip microcomputer are used to build an insole pressure-collection system, and LabVIEW is used to build the HMI for data acquisition. A quantity of normal human foot-pressure data was collected, and the pressure-distribution relations across the five stages of the swing phase during walking were statistically analyzed. The grid closeness degree is used for plantar pressure-distribution pattern recognition, and the algorithm simulation and experimental results demonstrate that the method is feasible.
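    The "grid closeness degree" matcher is not specified in detail in the abstract; the sketch below is only an illustration of one common closeness measure (1 minus a normalized L1 distance) applied to gridded pressure frames. All names and the exact formula are assumptions, not taken from the paper.

```python
# Hypothetical sketch of a "grid closeness degree" classifier: a pressure
# frame binned onto a grid is compared against stored reference patterns.
# The closeness formula (1 - normalized L1 distance) is an assumption.

def closeness_degree(sample, reference):
    """Fuzzy closeness in [0, 1]; 1.0 means the grids are identical."""
    num = sum(abs(s - r) for s, r in zip(sample, reference))
    den = sum(abs(s) + abs(r) for s, r in zip(sample, reference)) or 1.0
    return 1.0 - num / den

def classify(sample, references):
    """Pick the reference label whose pattern is closest to the sample."""
    return max(references, key=lambda label: closeness_degree(sample, references[label]))
```

    Classification simply takes the stored reference pattern with the highest closeness degree to the observed frame.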

  11. The classification of normal screening mammograms

    Science.gov (United States)

    Ang, Zoey Z. Y.; Rawashdeh, Mohammad A.; Heard, Robert; Brennan, Patrick C.; Lee, Warwick; Lewis, Sarah J.

    2016-03-01

    Rationale and objectives: To understand how breast screen readers classify the difficulty of normal screening mammograms using a common lexicon describing normal appearances. Cases were also assessed for their suitability for a single-reader strategy. Materials and methods: 15 breast readers were asked to interpret a test set of 29 normal screening mammogram cases and classify them by rating the difficulty of each case on a five-point Likert scale, identifying the salient features and assessing their suitability for single reading. Using the false-positive fractions from a previous study, the 29 cases were classified into 10 'low', 10 'medium' and nine 'high' difficulty cases. Data were analyzed with descriptive statistics. Spearman's correlation was used to test the strength of association between the difficulty of the cases and the readers' recommendation for a single-reading strategy. Results: The ratings from readers in this study corresponded to the known difficulty level of the 'low' and 'high' difficulty cases. Uniform ductal pattern and density, symmetrical mammographic features and the absence of micro-calcifications were the main reasons associated with 'low' difficulty cases. The 'high' difficulty cases were described as having 'dense breasts'. There was a statistically significant negative correlation between the difficulty of the cases and readers' recommendation for single reading (r = -0.475, P = 0.009). Conclusion: The findings demonstrate potential relationships between certain mammographic features and the difficulty readers have in classifying mammograms as 'normal'. The standard Australian practice of double reading was deemed more suitable for most cases. There was a moderate inverse association between the difficulty of the cases and the recommendations for single reading.

  12. Geomagnetic reversal in the Brunhes normal polarity epoch.

    Science.gov (United States)

    Smith, J D; Foster, J H

    1969-02-07

    The magnetic stratigraphy of seven cores of deep-sea sediment established the existence of a short interval of reversed polarity in the upper part of the Brunhes epoch of normal polarity. The reversed zone in the cores correlates well with paleontological boundaries and is named the Blake event. Its boundaries are estimated at 108,000 and 114,000 years ago ± 10 percent.

  13. Ureterocolonic anastomosis in clinically normal dogs

    International Nuclear Information System (INIS)

    Stone, E.A.; Walter, M.C.; Goldschmidt, M.H.; Biery, D.N.; Bovee, K.C.

    1988-01-01

    Ureterocolonic anastomosis was evaluated in 13 clinically normal dogs. Urinary continence was maintained after surgery, and the procedure was completed without technique errors in all but 2 dogs. Three dogs died within 5 weeks (2 of undetermined causes and 1 of aspiration pneumonia and neurologic disease), and 1 dog was euthanatized 4 months after surgery because of neurologic signs. Two healthy dogs were euthanatized 3 months after surgery for light microscopic evaluation of their kidneys. Five dogs were euthanatized 6 months after surgery for light microscopic evaluation of their kidneys. Gastrointestinal and neurologic disturbances developed in 4 dogs at various postoperative intervals. Plasma ammonia concentration measured in 2 dogs with neurologic signs was increased. Plasma ammonia concentration measured in 5 dogs without neurologic signs was within normal limits. All 5 dogs in which metabolic acidosis was diagnosed had high-normal or above-normal serum chloride concentrations. Serum urea nitrogen values were increased after surgery because of colonic absorption of urea. Serum creatinine concentration was increased in 1 dog 6 months after surgery. Individual kidney glomerular filtration rate was reduced in 38% (3/8) of the kidneys from 4 other dogs at 6 months after surgery. Of 5 dogs euthanatized at 3 to 4 months after surgery, 4 had bilateral pyelitis, and 1 had unilateral pyelonephritis. Six months after surgery, pyelonephritis was diagnosed in 40% (4/10) of the kidneys from 5 dogs. The ureterocolonic anastomosis procedure is a salvage procedure that should allow complete cystectomy. However, variable degrees of metabolic acidosis, hyperammonemia, and neurologic disease may result

  14. The thoracic paraspinal shadow: normal appearances.

    Science.gov (United States)

    Lien, H H; Kolbenstvedt, A

    1982-01-01

    The widths of the right and left thoracic paraspinal shadows were measured at all levels in 200 presumably normal individuals. The paraspinal shadow could be identified in nearly all cases on the left side and in approximately one-third on the right. The range of variation was greater on the left side than on the right. The left paraspinal shadow was wider at the upper levels and in individuals above 40 years of age.

  15. Online Normalization Algorithm for Engine Turbofan Monitoring

    Science.gov (United States)

    2014-10-02

    Online Normalization Algorithm for Engine Turbofan Monitoring. Jérôme Lacaille (1), Anastasios Bellas (2). (1) Snecma, 77550 Moissy-Cramayel, France. ... To understand the behavior of a turbofan engine, one first needs to deal with the variety of data-acquisition contexts. Each time a set of measurements is ... it auto-adapts itself with piecewise linear models. 1. INTRODUCTION: Turbofan engine abnormality diagnosis uses three steps: reduction of ...

  16. Comet Giacobini-Zinner - a normal comet?

    International Nuclear Information System (INIS)

    Cochran, A.L.; Barker, E.S.

    1987-01-01

    Observations of Comet Giacobini-Zinner were obtained during its 1985 apparition using an IDS spectrograph at McDonald Observatory. Column densities and production rates were computed, and the production rates were compared with those of other normal comets. Giacobini-Zinner is shown to be depleted in C2 and C3 relative to CN; these production rates are down by a factor of 5. 12 references

  17. Manual on environmental monitoring in normal operation

    International Nuclear Information System (INIS)

    1966-01-01

    Many establishments handling radioactive materials produce, and to some extent also discharge, radioactive waste as part of their normal operation. The radiation doses to which members of the public may be exposed during such operation must remain below the stipulated level. The purpose of this manual is to provide technical guidance for setting up programmes of routine environmental monitoring in the vicinity of nuclear establishment. The annex gives five examples of routine environmental monitoring programmes currently in use: these have been indexed separately.

  18. Normal Conducting RF Cavity for MICE

    International Nuclear Information System (INIS)

    Li, D.; DeMello, A.; Virostek, S.; Zisman, M.; Summers, D.

    2010-01-01

    Normal conducting RF cavities must be used for the cooling section of the international Muon Ionization Cooling Experiment (MICE), currently under construction at Rutherford Appleton Laboratory (RAL) in the UK. Eight 201-MHz cavities are needed for the MICE cooling section; fabrication of the first five cavities is complete. We report the cavity fabrication status including cavity design, fabrication techniques and preliminary low power RF measurements.

  19. Basic characterization of normal multifocal electroretinogram

    International Nuclear Information System (INIS)

    Fernandez Cherkasova, Lilia; Rojas Rondon, Irene; Castro Perez, Pedro Daniel; Lopez Felipe, Daniel; Santiesteban Freixas, Rosaralis; Mendoza Santiesteban, Carlos E

    2008-01-01

    A scientific literature review was made of the novel multifocal electroretinogram technique, the cell mechanisms involved, some of the factors modifying its results, and its forms of presentation. The basic characteristics of this electrophysiological record, obtained from several regions of the retina of normal subjects, are important in order to create a small-scale comparative database for evaluating pathological eye tracings. All this will greatly help in the early, less invasive electrodiagnosis of localized retinal lesions. (Author)

  20. Estimating the Heading Direction Using Normal Flow

    Science.gov (United States)

    1994-01-01

    ... understood (Faugeras and Maybank 1990) ... 3. Kinetic Stabilization ... under the assumption that optic flow or correspondence is known with some uncertainty ... accelerometers can achieve very high accuracy; the same is not true for inexpensive ... It can easily be shown (Koenderink and van Doorn 1975; Maybank 1985) ... Maybank, "Motion from point matches: multiplicity of solutions", Int'l J. Computer Vision 4 ... just don't compute normal flow there (see Section 6).

  1. Normalization of oxygen and hydrogen isotope data

    Science.gov (United States)

    Coplen, T.B.

    1988-01-01

    To resolve confusion due to expression of isotopic data from different laboratories on non-corresponding scales, oxygen isotope analyses of all substances can be expressed relative to VSMOW or VPDB (Vienna Peedee belemnite) on scales normalized such that the δ18O of SLAP is -55.5‰ relative to VSMOW. The H3+ contribution in hydrogen isotope-ratio analysis can be easily determined using two gaseous reference samples that differ greatly in deuterium content. © 1988.
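    The normalization convention described here amounts to a linear stretch of the measured delta scale. The sketch below (function and variable names are ours, not from the paper) anchors a laboratory's measured VSMOW value to 0‰ and its measured SLAP value to the defined -55.5‰:

```python
# Two-point VSMOW/SLAP normalization for delta-18O, as a sketch of the
# convention in the abstract. On the normalized scale VSMOW reads 0 and
# SLAP reads exactly -55.5 per mil; names are our own.

SLAP_D18O = -55.5  # per mil, by definition on the normalized scale

def normalize_d18o(delta_measured, vsmow_measured, slap_measured):
    """Linearly rescale a measured delta so VSMOW -> 0 and SLAP -> -55.5."""
    span = slap_measured - vsmow_measured
    return (delta_measured - vsmow_measured) * SLAP_D18O / span
```

    A laboratory that measures SLAP at, say, -55.1‰ would stretch a sample reading of -10.0‰ to about -10.07‰ on the normalized scale.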

  2. Parietal podocytes in normal human glomeruli.

    Science.gov (United States)

    Bariety, Jean; Mandet, Chantal; Hill, Gary S; Bruneval, Patrick

    2006-10-01

    Although parietal podocytes along the Bowman's capsule have been described by electron microscopy in the normal human kidney, their molecular composition remains unknown. Ten human normal kidneys that were removed for cancer were assessed for the presence and the extent of parietal podocytes along the Bowman's capsule. The expression of podocyte-specific proteins (podocalyxin, glomerular epithelial protein-1, podocin, nephrin, synaptopodin, and alpha-actinin-4), podocyte synthesized proteins (vascular endothelial growth factor and novH), transcription factors (WT1 and PAX2), cyclin-dependent kinase inhibitor p57, and intermediate filaments (cytokeratins and vimentin) was tested. In addition, six normal fetal kidneys were studied to track the ontogeny of parietal podocytes. The podocyte protein labeling detected parietal podocytes in all of the kidneys, was found in 76.6% on average of Bowman's capsule sections, and was prominent at the vascular pole. WT1 and p57 were expressed in some parietal cells, whereas PAX2 was present in all or most of them, so some parietal cells coexpressed WT1 and PAX2. Furthermore, parietal podocytes coexpressed WT1 and podocyte proteins. Cytokeratin-positive cells covered a variable part of the capsule and did not express podocyte proteins. Tuft-capsular podocyte bridges were present in 15.5 +/- 3.7% of the glomerular sections. Parietal podocytes often covered the juxtaglomerular arterioles and were present within the extraglomerular mesangium. Parietal podocytes were present in fetal kidneys. Parietal podocytes that express the same epitopes as visceral podocytes do exist along Bowman's capsule in the normal adult kidney. They are a constitutive cell type of the Bowman's capsule. Therefore, their role in physiology and pathology should be investigated.

  3. Crate counter for normal operating loss

    International Nuclear Information System (INIS)

    Harlan, R.A.

    A lithium-loaded zinc sulfide scintillation counter was built to closely assay plutonium in waste packaged in 1.3 by 1.3 by 2.13 m crates. In addition to assays for normal operating-loss accounting, the counter will allow safeguards verification immediately before shipment of the crates for burial. The counter should detect approximately 10 g of plutonium in 1000 kg of waste

  4. ''Identical'' bands in normally-deformed nuclei

    International Nuclear Information System (INIS)

    Garrett, J.D.; Baktash, C.; Yu, C.H.

    1990-01-01

    Gamma-ray transition energies in neighboring odd- and even-mass nuclei for normally-deformed nuclear configurations are analyzed in a manner similar to recent analyses for superdeformed states. The moment of inertia is shown to depend on pair correlations and the aligned angular momentum of the odd nucleon. The implications of this analysis for ''identical'' superdeformed bands are discussed. 26 refs., 9 figs

  5. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    ... of selected existing buildings in and around Copenhagen covered with mosaic tiles, unglazed or glazed clay tiles. It is buildings which have qualities that I would like applied, perhaps transformed or, most preferably, interpreted anew, for the large glazed concrete panels I am developing. Keywords: color, light...

  6. Large hydropower generating units

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-07-01

    This document presents the Brazilian experience with the design, fabrication, construction, commissioning and operation of large-scale, high-capacity generating units. The experience was acquired with the implementation of the Itumbiara, Paulo Afonso IV, Tucurui, Itaipu and Xingo power plants, whose units are among the largest in the world.

  7. Large Data Set Mining

    NARCIS (Netherlands)

    Leemans, I.B.; Broomhall, Susan

    2017-01-01

    Digital emotion research has yet to make history. Until now large data set mining has not been a very active field of research in early modern emotion studies. This is indeed surprising since first, the early modern field has such rich, copyright-free, digitized data sets and second, emotion studies

  8. Representing Large Virtual Worlds

    NARCIS (Netherlands)

    Kol, T.R.

    2018-01-01

    The ubiquity of large virtual worlds and their growing complexity in computer graphics require efficient representations. This means that we need smart solutions for the underlying storage of these complex environments, but also for their visualization. How the virtual world is best stored and how

  9. The large hadron computer

    CERN Multimedia

    Hirstius, Andreas

    2008-01-01

    Plans for dealing with the torrent of data from the Large Hadron Collider's detectors have made the CERN particle-physics lab, yet again, a pioneer in computing as well as physics. The author describes the challenges of processing and storing data in the age of petabyte science. (4 pages)

  10. LARGE BUILDING HVAC SIMULATION

    Science.gov (United States)

    The report discusses the monitoring and collection of data relating to indoor pressures and radon concentrations under several test conditions in a large school building in Bartow, Florida. The Florida Solar Energy Center (FSEC) used an integrated computational software, FSEC 3.0...

  11. Large Hadron Collider

    CERN Multimedia

    2007-01-01

    "In the spring 2008, the Large Hadron Collider (LHC) machine at CERN (the European Particle Physics laboratory) will be switched on for the first time. The huge machine is housed in a circular tunnel, 27 km long, excavated deep under the French-Swiss border near Geneva." (1,5 page)

  12. Hilar height ratio in normal Korea

    International Nuclear Information System (INIS)

    Yoo, Kyung Ho; Lee, Nam Joon; Seol, Hae Young; Chung, Kyoo Byung

    1979-01-01

    Hilar displacement is a significant sign of pulmonary volume change. The hilar height ratio (HHR) expresses the normal position of the hilum in its hemithorax; it is calculated by dividing the distance from the hilum to the lung apex by the distance from the hilum to the diaphragm. Displacement of one hilum is usually easy to detect, but recognition is more difficult when both hila are displaced in the same direction. Knowledge of the normal HHR allows evaluation of hilar positional change even when the relative hilar positions are not altered. Normal chest PA views of 275 cases taken at Korea University Hospital during the period April 1978 to June 1979 were analyzed. The right hilum is positioned in the lower half of the right hemithorax, while the left hilum is situated in the upper half of the left hemithorax. The difference in hilar ratio among age groups is slight, but there is a significant difference between the right and left HHR: the right HHR is 1.28 ± 0.14, and the left HHR is 0.88 ± 0.09.
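    The HHR defined in this abstract is a direct quotient of two radiographic distances. A minimal sketch using the reported reference values (the function names and the 2-standard-deviation screen are our assumptions, not part of the paper):

```python
# Hilar height ratio (HHR): distance from hilum to lung apex divided by
# distance from hilum to diaphragm, per the definition in the abstract.

def hilar_height_ratio(hilum_to_apex_mm, hilum_to_diaphragm_mm):
    return hilum_to_apex_mm / hilum_to_diaphragm_mm

# Reported normal values: right 1.28 +/- 0.14, left 0.88 +/- 0.09.
def within_normal(hhr, side):
    mean, sd = (1.28, 0.14) if side == "right" else (0.88, 0.09)
    return abs(hhr - mean) <= 2 * sd  # simple 2-SD screen (our assumption)
```

    A ratio above 1 means the hilum sits closer to the diaphragm than to the apex, consistent with the right hilum lying in the lower half of its hemithorax.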

  13. Normal CT anatomy of the spine

    International Nuclear Information System (INIS)

    Quiroga, O.; Matozzi, F.; Beranger, M.; Nazarian, S.; Salamon, G.; Gambarelli, J.

    1982-01-01

    To analyse the anatomo-radiological correlation of the spine and spinal cord, 22 formalin-fixed, frozen anatomical specimens corresponding to different regions of the spinal column (8 cervical, 5 dorsal, and 9 lumbar) were studied by CT scans in axial, sagittal and coronal planes and by contact radiography after they were cut into anatomical slices, in order to clarify the normal CT anatomy of the spinal column. The results obtained from patient CT scans, performed exclusively in the axial plane, were compared with those obtained from the anatomical specimens (both CT and contact radiography). High-resolution CT programs were used, enabling better individualization of the normal structures contained in the spinal column. Direct sagittal and coronal sections were performed on the specimens to obtain further anatomo-radiological information. Enhanced CT studies of the specimens were also available because of the air already present in the subarachnoid spaces. Excellent visualization was obtained of bone structures, soft tissue and the spinal cord. High-resolution CT of the spine appears to be an excellent neuroradiological procedure for studying the spine and spinal cord. A metrizamide CT scan is, however, necessary when a normal unenhanced CT scan is insufficient for diagnosis and when the spinal cord is not clearly visible, as often happens at the cervical level. Clinical findings are certainly very useful to ascertain the exact CT level and to limit radiation exposure. (orig.)

  14. Aerosol lung inhalation scintigraphy in normal subjects

    Energy Technology Data Exchange (ETDEWEB)

    Sui, Osamu; Shimazu, Hideki

    1985-03-01

    We previously reported basic and clinical evaluation of aerosol lung inhalation scintigraphy with 99mTc-millimicrosphere albumin (milli MISA) and concluded that aerosol inhalation scintigraphy with 99mTc-milli MISA was useful for routine examination. However, central airway deposition of aerosol particles was found not only in patients with chronic obstructive pulmonary disease (COPD) but also in normal subjects. We therefore performed aerosol inhalation scintigraphy in normal subjects and evaluated their scintigrams. The subjects had normal values of FEV1.0% (more than 70%) in lung function tests, no abnormal findings on chest X-ray films, and no symptoms or signs. The findings of the aerosol inhalation scintigrams were classified into 3 patterns; type I: homogeneous distribution without central airway deposit; type II: homogeneous distribution with central airway deposit; type III: inhomogeneous distribution. These patterns were compared with lung function tests. There was no significant difference between type I and type II in lung function tests. Type III differed from types I and II in its inhomogeneous distribution. This finding showed no correlation with %VC, FEV1.0%, MMF, V̇50 or V̇50/V̇25, but good correlation with V̇25 on the maximum forced expiratory flow-volume curve. The flow-volume curve is one of the sensitive methods for early detection of COPD, so the inhomogeneous distribution of type III is considered to be due to small airway dysfunction.

  15. Mast cell distribution in normal adult skin.

    Science.gov (United States)

    Janssens, A S; Heide, R; den Hollander, J C; Mulder, P G M; Tank, B; Oranje, A P

    2005-03-01

    To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults. There was an uneven distribution of MCs in different body sites using the anti-tryptase monoclonal antibody technique. Numbers of MCs on the trunk, upper arm, and upper leg were similar, but were significantly different from those found on the lower leg and forearm. Two distinct groups were formed--proximal and distal. There were 77.0 MCs/mm2 at proximal body sites and 108.2 MCs/mm2 at distal sites. Adjusted for the adjacent diagnosis and age, this difference was consistent. The numbers of MCs in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders were not different from those in the control group. Differences in the numbers of MCs between the distal and the proximal body sites must be considered when MCs are counted for a reliable diagnosis of mastocytosis. A pilot study in patients with mastocytosis underlined the variation in the numbers of MCs in mastocytosis and normal skin, but showed a considerable overlap. The observed numbers of MCs in adults cannot be extrapolated to children. MC numbers varied significantly between proximal and distal body sites and these differences must be considered when MCs are counted for a reliable diagnosis of mastocytosis. There was a considerable overlap between the numbers of MCs in mastocytosis and normal skin.

  16. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of ''stochastic'' behavior

  17. [Quantification of acetabular coverage in normal adult].

    Science.gov (United States)

    Lin, R M; Yang, C Y; Yu, C Y; Yang, C R; Chang, G L; Chou, Y L

    1991-03-01

    Quantification of acetabular coverage is important and can be expressed by superimposition of cartilage tracings on the maximum cross-sectional area of the femoral head. We developed a practical AutoLISP program on PC AutoCAD to quantify acetabular coverage through numerical expression of computed tomography images. Thirty adults (60 hips) with normal center-edge angle and acetabular index on plain X-ray were randomly selected for serial CT scans. The slices were prepared with a fixed coordinate system in continuous sections 5 mm thick. The contours of the cartilage of each section were digitized into a PC and processed by AutoCAD programs to quantify and characterize the acetabular coverage of normal and dysplastic adult hips. We found that a total coverage ratio greater than 80%, an anterior coverage ratio greater than 75% and a posterior coverage ratio greater than 80% can be categorized as normal. Polar edge distance is a good indicator for evaluating preoperative and postoperative coverage. For standardization and evaluation of acetabular coverage, the most suitable parameters are the total coverage ratio, anterior coverage ratio, posterior coverage ratio and polar edge distance. However, the medial and lateral coverage ratios are indispensable in cases of dysplastic hip because variations between them are so great that acetabuloplasty may be impossible. This program can also be used to classify precisely the type of dysplastic hip.
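    The normal-group thresholds reported above reduce to a simple conjunction. A sketch with hypothetical names (the rule itself is from the abstract; the function and its interface are ours):

```python
# Screening rule from the abstract: a hip falls in the normal group when
# total coverage > 80%, anterior coverage > 75% and posterior coverage > 80%.

def is_normal_coverage(total_pct, anterior_pct, posterior_pct):
    return total_pct > 80 and anterior_pct > 75 and posterior_pct > 80
```

    Any single ratio at or below its threshold fails the screen, which is why the paper also tracks the medial and lateral ratios separately for dysplastic hips.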

  18. Normal and abnormal aging in bilinguals

    Directory of Open Access Journals (Sweden)

    Alfredo Ardila

    Bilinguals use two different language systems to mediate not only social communication but also cognitive processes. Potential differences between bilinguals and monolinguals in task-solving strategies and in patterns of cognitive decline during normal and abnormal aging have been suggested. Main contribution: A review of the area suggests that normal aging is associated with increased interference between the two languages and a tendency to retreat to a single language. General cognitive functioning has been found to be higher in demented bilingual patients if communication is carried out in L1 rather than in L2. Recent research has reported that bilingualism can have a protective effect during aging, attenuating the normal cognitive decline associated with aging and delaying the onset of dementia. Conclusions: Regardless of the significant heterogeneity of bilingualism and the diversity of patterns of language use over the life-span, current research suggests that bilingualism is associated with preserved cognitive test performance during aging and can potentially have some protective effect in dementia.

  19. Cerebral perfusion inhomogeneity in normal volunteers

    International Nuclear Information System (INIS)

    Gruenwald, S.M.; Larcos, G.

    1998-01-01

    In the interpretation of cerebral perfusion scans, it is important to know the normal variation in perfusion that may occur between the cerebral hemispheres. For this reason, 24 normal volunteers with no neurological or psychiatric history, and who were on no medications, underwent 99mTc-HMPAO brain SPECT studies using a single-headed gamma camera/computer system. Oblique, coronal and sagittal images were reviewed separately by two experienced observers, and any differences were resolved by consensus. Semi-quantitation was performed by summing two adjacent oblique slices and drawing right and left mirror-image ROIs corresponding to the mid-section level of the anterior and posterior frontal lobes, anterior and posterior parietal lobes, temporal lobes and cerebellum. From the mean counts per pixel, right:left ROI ratios and ROI:cerebellar ratios were calculated. On qualitative review, 6/24 subjects had mild asymmetry in tracer distribution between the right and left cerebral lobes. Semi-quantitation revealed a 5-10% difference in counts between right and left ROIs in 12/24 subjects, and an additional three subjects had a 10-20% difference in counts between the right and left temporal lobes. This study demonstrates the presence of mild asymmetry of cerebral perfusion in a significant minority of normal subjects
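    The semi-quantitation step lends itself to a short sketch: mean counts per pixel in mirror-image ROIs give a right:left percent difference, which can then be bucketed the way the report summarizes its findings (5-10% and 10-20% bands). The function names and the exact percent-difference formula are our assumptions, not taken from the study.

```python
# Percent asymmetry between mirror-image ROIs, relative to the larger side.

def percent_asymmetry(right_counts, left_counts):
    return abs(right_counts - left_counts) / max(right_counts, left_counts) * 100

def asymmetry_grade(pct):
    """Bucket the asymmetry into the bands used in the report's summary."""
    if pct < 5:
        return "none"
    if pct <= 10:
        return "5-10%"
    if pct <= 20:
        return "10-20%"
    return ">20%"
```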

  20. Magnetic resonance imaging of normal pituitary gland

    International Nuclear Information System (INIS)

    Yamanaka, Masami; Uozumi, Tohru; Sakoda, Katsuaki; Ohta, Masahiro; Kagawa, Yoshihiro; Kajima, Toshio.

    1986-01-01

    Magnetic resonance imaging (MRI) is a suitable procedure for diagnosing midline lesions such as pituitary adenomas. To differentiate normal glands from microadenomas, fifty-seven cases (9-74 years old, 29 men and 28 women), including 50 patients without any sellar or parasellar disease and seven normal volunteers, were studied to clarify the MR findings on the shape, height, and signal intensity of the normal pituitary gland, especially in the median sagittal section. The height of the normal pituitary gland varied from 2 to 9 mm (mean: 5.7 mm); the upper surface of the gland was convex in 19.3%, flat in 49.1%, and concave in 31.6%. The mean height of the gland in women in their twenties was 7.5 mm, and the upper convex shape appeared exclusively in women of the second to fourth decades. Nine intrasellar pituitary adenomas (PRL-secreting: 4, GH-secreting: 4, ACTH-secreting: 1), all verified by surgery, were diagnosed using a resistive MR system. The heights of the gland in these cases were from 7 to 15 mm (mean: 11.3 mm); the upper surface was convex in 7 cases. A localized bulging of the upper surface of the gland and a localized depression of the sellar floor were depicted on the coronal and sagittal sections in most cases. Although the GH- and ACTH-secreting adenoma cases showed homogeneous intrasellar contents, in all the PRL-secreting adenoma cases a low-signal-intensity area was detected in the IR images. The mean T1 values of the intrasellar contents of the normal volunteers and the PRL-, GH-, and ACTH-secreting adenoma cases were 367, 416, 355, and 411 ms, respectively. However, in the PRL-secreting adenoma cases, the mean T1 value of the areas showing low signal intensity on IR images was 455 ms, a significant prolongation compared with that of the normal pituitary gland. (J.P.N.)

  1. Shutdown problems in large tokamaks

    International Nuclear Information System (INIS)

    Weldon, D.M.

    1978-01-01

    Some of the problems connected with a normal shutdown at the end of the burn phase (soft shutdown) and with a shutdown caused by disruptive instability (hard shutdown) have been considered. For a soft shutdown a cursory literature search was undertaken and methods for controlling the thermal wall loading were listed. Because shutdown computer codes are not widespread, some of the differences between start-up codes and shutdown codes were discussed along with program changes needed to change a start-up code to a shutdown code. For a hard shutdown, the major problems are large induced voltages in the ohmic-heating and equilibrium-field coils and high first wall erosion. A literature search of plasma-wall interactions was carried out. Phenomena that occur at the plasma-wall interface can be quite complicated. For example, material evaporated from the wall can form a virtual limiter or shield protecting the wall from major damage. Thermal gradients that occur during the interaction can produce currents whose associated magnetic field also helps shield the wall

  2. HYPERVASCULAR LIVER LESIONS IN RADIOLOGICALLY NORMAL LIVER.

    Science.gov (United States)

    Amico, Enio Campos; Alves, José Roberto; Souza, Dyego Leandro Bezerra de; Salviano, Fellipe Alexandre Macena; João, Samir Assi; Liguori, Adriano de Araújo Lima

    2017-01-01

    Hypervascular liver lesions represent a diagnostic challenge. The aim was to identify risk factors for cancer in patients with non-hemangiomatous hypervascular hepatic lesions in radiologically normal liver. This prospective study included patients with hypervascular liver lesions in radiologically normal liver. The diagnosis was made by biopsy or was presumed on the basis of radiologic stability over a follow-up period of one year. Patients with cirrhosis or with typical imaging characteristics of haemangioma were excluded. Eighty-eight patients were included. The woman/man ratio was 5.3/1. The average age was 42.4 years. The lesions were single and between 2-5 cm in size in most cases. Liver biopsy was performed in approximately 1/3 of cases. The lesions were benign or most likely benign in 81.8%, while cancer was diagnosed in 12.5% of cases. Univariate analysis showed that age >45 years, >3 nodules (p=0.003) and elevated alkaline phosphatase (p=0.013) were significant risk factors for cancer. It is safe to observe hypervascular liver lesions in normal liver in patients up to 45 years of age with normal alanine aminotransferase, up to three nodules and no personal history of cancer. Lesion biopsies are safe in patients with atypical lesions and define the treatment to be established for most of these patients.

  3. Large reservoirs: Chapter 17

    Science.gov (United States)

    Miranda, Leandro E.; Bettoli, Phillip William

    2010-01-01

    Large impoundments, defined as those with surface area of 200 ha or greater, are relatively new aquatic ecosystems in the global landscape. They represent important economic and environmental resources that provide benefits such as flood control, hydropower generation, navigation, water supply, commercial and recreational fisheries, and various other recreational and esthetic values. Construction of large impoundments was initially driven by economic needs, and ecological consequences received little consideration. However, in recent decades environmental issues have come to the forefront. In the closing decades of the 20th century societal values began to shift, especially in the developed world. Society is no longer willing to accept environmental damage as an inevitable consequence of human development, and it is now recognized that continued environmental degradation is unsustainable. Consequently, construction of large reservoirs has virtually stopped in North America. Nevertheless, in other parts of the world construction of large reservoirs continues. The emergence of systematic reservoir management in the early 20th century was guided by concepts developed for natural lakes (Miranda 1996). However, we now recognize that reservoirs are different and that reservoirs are not independent aquatic systems inasmuch as they are connected to upstream rivers and streams, the downstream river, other reservoirs in the basin, and the watershed. Reservoir systems exhibit longitudinal patterns both within and among reservoirs. Reservoirs are typically arranged sequentially as elements of an interacting network, filter water collected throughout their watersheds, and form a mosaic of predictable patterns. Traditional approaches to fisheries management such as stocking, regulating harvest, and in-lake habitat management do not always produce desired effects in reservoirs. As a result, managers may expend resources with little benefit to either fish or fishing. 

  4. In vitro fertilisation when normal sperm morphology is less than ...

    African Journals Online (AJOL)

    1990-08-18

    Aug 18, 1990 ... couples where the husband's normal sperm morphology was less than 15% ... gonadotrophin (HCG) 5000 IU was given when the average size of three ... have a normal sperm count and motility but have lower than normal ...

  5. Non-Normality and Testing that a Correlation Equals Zero

    Science.gov (United States)

    Levy, Kenneth J.

    1977-01-01

    The importance of the assumption of normality for testing that a bivariate normal correlation equals zero is examined. Both empirical and theoretical evidence suggest that such tests are robust with respect to violation of the normality assumption. (Author/JKS)
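
    The test discussed above can be sketched with the standard t transform of a sample correlation (a minimal illustration with our own function name; the critical value cited in the comment is the usual two-sided 5% point):

```python
import math

def corr_t_stat(r, n):
    """t statistic for testing H0: rho = 0 given a sample
    correlation r computed from n bivariate pairs."""
    # Under H0 (and bivariate normality), t = r*sqrt(n-2)/sqrt(1-r^2)
    # follows a t distribution with n-2 degrees of freedom.
    return r * math.sqrt(n - 2) / math.sqrt(1 - r * r)

t = corr_t_stat(0.5, 27)   # r = 0.5 observed in n = 27 pairs -> t ≈ 2.887
# compare against the two-sided 5% critical value t(0.975, df=25) ≈ 2.060:
# here t exceeds it, so H0 would be rejected at the 5% level
```

    The robustness result of the abstract says this rejection rule remains approximately valid even when the bivariate normality assumption is violated.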

  6. Normal families and isolated singularities of meromorphic functions

    International Nuclear Information System (INIS)

    Chee, P.S.; Subramaniam, A.

    1985-06-01

    Based on the criterion of Zalcman for normal families, a generalization of a well-known result relating normal families and isolated essential singularities of meromorphic functions is proved, using a theorem of Lehto and Virtanen on normal functions. (author)

  7. Large Hadron Collider manual

    CERN Document Server

    Lavender, Gemma

    2018-01-01

    What is the universe made of? How did it start? This Manual tells the story of how physicists are seeking answers to these questions using the world’s largest particle smasher – the Large Hadron Collider – at the CERN laboratory on the Franco-Swiss border. Beginning with the first tentative steps taken to build the machine, the digestible text, supported by color photographs of the hardware involved, along with annotated schematic diagrams of the physics experiments, covers the particle accelerator’s greatest discoveries – from both the perspective of the writer and the scientists who work there. The Large Hadron Collider Manual is a full, comprehensive guide to the most famous, record-breaking physics experiment in the world, which continues to capture the public imagination as it provides new insight into the fundamental laws of nature.

  8. [Large benign prostatic hyperplasia].

    Science.gov (United States)

    Soria-Fernández, Guillermo René; Jungfermann-Guzman, José René; Lomelín-Ramos, José Pedro; Jaspersen-Gastelum, Jorge; Rosas-Nava, Jesús Emmanuel

    2012-01-01

    The term prostatic hyperplasia is most frequently used to describe benign prostatic growth, a widely prevalent age-associated disorder that affects most men as they age. The association between prostate growth and urinary obstruction in older adults is well documented. Large benign prostatic hyperplasia is rare, few cases have been published, and it should be taken into account during the study of tumors of the pelvic cavity. We report the case of an 81-year-old man who had significant storage and bladder-emptying symptoms, with no significant elevation of prostate-specific antigen. This is a rare condition, but it is still important to diagnose and treat, as it may be related to severe obstructive uropathy and chronic renal failure. In our institution, cases of large prostatic hyperplasia that are solved by suprapubic adenomectomy account for less than 3%.

  9. [Large vessel vasculitides].

    Science.gov (United States)

    Morović-Vergles, Jadranka; Puksić, Silva; Gracanin, Ana Gudelj

    2013-01-01

    Large vessel vasculitis includes giant cell arteritis and Takayasu arteritis. Giant cell arteritis is the most common form of vasculitis, affecting patients aged 50 years or over. The diagnosis should be considered in older patients who present with new onset of headache, visual disturbance, polymyalgia rheumatica and/or fever of unknown cause. Glucocorticoids remain the cornerstone of therapy. Takayasu arteritis is a chronic panarteritis of the aorta and its major branches, presenting commonly at young ages. Although all large arteries can be affected, the aorta and the subclavian and carotid arteries are most commonly involved. The most common symptoms include upper extremity claudication, hypertension, pain over the carotid arteries (carotidynia), dizziness and visual disturbances. Early diagnosis and treatment have improved the outcome in patients with TA.

  10. Large tandem accelerators

    International Nuclear Information System (INIS)

    Jones, C.M.

    1976-01-01

    The increasing importance of energetic heavy ion beams in the study of atomic physics, nuclear physics, and materials science has partially or wholly motivated the construction of a new generation of tandem accelerators designed to operate at maximum terminal potentials in the range 14 to 30 MV. In addition, a number of older tandem accelerators are now being significantly upgraded to improve their heavy ion performance. Both of these developments have reemphasized the importance of negative heavy ion sources. The new large tandem accelerators are described, and the requirements placed on negative heavy ion source technology by these and other tandem accelerators used for the acceleration of heavy ions are discussed. First, a brief description is given of the large tandem accelerators which have been completed recently, are under construction, or are funded for construction; second, the motivation for construction of these accelerators is discussed; and last, criteria for negative ion sources for use with these accelerators are presented

  11. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  12. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  13. Metabolomic analysis of urine samples by UHPLC-QTOF-MS: Impact of normalization strategies.

    Science.gov (United States)

    Gagnebin, Yoric; Tonoli, David; Lescuyer, Pierre; Ponte, Belen; de Seigneux, Sophie; Martin, Pierre-Yves; Schappler, Julie; Boccard, Julien; Rudaz, Serge

    2017-02-22

    Among the various biological matrices used in metabolomics, urine is a biofluid of major interest because of its non-invasive collection and its availability in large quantities. However, significant sources of variability in urine metabolomics based on UHPLC-MS are related to the analytical drift and variation of the sample concentration, thus requiring normalization. A sequential normalization strategy was developed to remove these detrimental effects, including: (i) pre-acquisition sample normalization by individual dilution factors to narrow the concentration range and to standardize the analytical conditions, (ii) post-acquisition data normalization by quality control-based robust LOESS signal correction (QC-RLSC) to correct for potential analytical drift, and (iii) post-acquisition data normalization by MS total useful signal (MSTUS) or probabilistic quotient normalization (PQN) to prevent the impact of concentration variability. This generic strategy was performed with urine samples from healthy individuals and was further implemented in the context of a clinical study to detect alterations in urine metabolomic profiles due to kidney failure. In the case of kidney failure, the relation between creatinine/osmolality and the sample concentration is modified, and relying only on these measurements for normalization could be highly detrimental. The sequential normalization strategy was demonstrated to significantly improve patient stratification by decreasing the unwanted variability and thus enhancing data quality. Copyright © 2016 Elsevier B.V. All rights reserved.
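
    Step (iii) of the strategy, probabilistic quotient normalization (PQN), can be sketched as follows (a minimal implementation of PQN as commonly described, not the authors' code; the toy matrix is illustrative):

```python
import numpy as np

def pqn_normalize(X, reference=None):
    """Probabilistic quotient normalization of a feature matrix X
    (rows = samples, columns = features): each sample is divided by
    the median of its feature-wise quotients against a reference."""
    X = np.asarray(X, dtype=float)
    if reference is None:
        reference = np.median(X, axis=0)      # median spectrum as reference
    quotients = X / reference                 # feature-wise quotients
    dilution = np.median(quotients, axis=1)   # most probable dilution factor
    return X / dilution[:, None]              # rescale each sample

# a sample diluted 2x relative to another is brought onto the same level
X = np.array([[1.0, 2.0, 4.0],
              [2.0, 4.0, 8.0]])
Xn = pqn_normalize(X)
```

    The per-sample median quotient estimates the dilution factor robustly, so a few strongly altered metabolites (as in kidney failure) do not distort the normalization the way creatinine- or osmolality-based scaling can.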

  14. Large mass storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Peskin, Arnold M.

    1978-08-01

    This is the final report of a study group organized to investigate questions surrounding the acquisition of a large mass storage facility. The programmatic justification for such a system at Brookhaven is reviewed. Several candidate commercial products are identified and discussed. A draft of a procurement specification is developed. Some thoughts on possible new directions for computing at Brookhaven are also offered, although this topic was addressed outside of the context of the group's deliberations. 2 figures, 3 tables.

  15. The Large Hadron Collider

    CERN Document Server

    Juettner Fernandes, Bonnie

    2014-01-01

    What really happened during the Big Bang? Why did matter form? Why do particles have mass? To answer these questions, scientists and engineers have worked together to build the largest and most powerful particle accelerator in the world: the Large Hadron Collider. Includes glossary, websites, and bibliography for further reading. Perfect for STEM connections. Aligns to the Common Core State Standards for Language Arts. Teachers' Notes available online.

  16. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    Science.gov (United States)

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Because the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. (©)RSNA, 2015.

  17. Correspondence normalized ghost imaging on compressive sensing

    International Nuclear Information System (INIS)

    Zhao Sheng-Mei; Zhuang Peng

    2014-01-01

    Ghost imaging (GI) offers great potential with respect to conventional imaging techniques. An open problem in GI systems is that a long acquisition time is required to reconstruct images with good visibility and signal-to-noise ratios (SNRs). In this paper, we propose a new scheme to achieve good performance with a shorter reconstruction time, which we call correspondence normalized ghost imaging based on compressive sensing (CCNGI). In the scheme, we enhance the signal-to-noise performance by normalizing the reference beam intensity to eliminate the noise caused by laser power fluctuations, and we reduce the reconstruction time by using both compressive sensing (CS) and time-correspondence imaging (CI) techniques. It is shown that the quality of the images is improved and the reconstruction time is reduced using the CCNGI scheme. For the two-grayscale "double-slit" image, the mean square errors (MSEs) of the GI and normalized GI (NGI) schemes with 5000 measurements are 0.237 and 0.164, respectively, while the CCNGI scheme reaches 0.021 with 2500 measurements. For the eight-grayscale "lena" object, the peak signal-to-noise ratios (PSNRs) are 10.506 and 13.098 using the GI and NGI schemes, respectively, while the value reaches 16.198 using the CCNGI scheme. The results also show that a high-fidelity GI reconstruction has been achieved using only 44% of the number of measurements corresponding to the Nyquist limit for the two-grayscale "double-slit" object. The quality of the reconstructed images using CCNGI is almost the same as that from GI via sparsity constraints (GISC), with a shorter reconstruction time.
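
    The reference-beam normalization step can be sketched on a toy 1-D "double-slit" object (our own simulation, with the compressive-sensing stage omitted; all sizes and names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

# toy 1-D "double-slit" transmission object
x = np.arange(64)
obj = (((x > 15) & (x < 22)) | ((x > 42) & (x < 49))).astype(float)

M = 5000                                    # number of speckle patterns
power = 1 + 0.1 * rng.standard_normal(M)    # laser power fluctuations
patterns = rng.random((M, obj.size)) * power[:, None]

bucket = patterns @ obj                     # single-pixel (bucket) signal
reference = patterns.sum(axis=1)            # reference-beam total intensity
b = bucket / reference                      # normalization cancels power noise

# correlate the normalized bucket signal with the recorded patterns
recon = ((b - b.mean())[:, None] * (patterns - patterns.mean(axis=0))).mean(axis=0)
recon = (recon - recon.min()) / (recon.max() - recon.min())
```

    Dividing the bucket signal by the reference-beam intensity removes the common power fluctuation before the correlation, which is the normalization idea the abstract describes.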

  18. Normal range of gastric emptying in children

    International Nuclear Information System (INIS)

    Thomas, P.; Collins, C.; Francis, L.; Henry, R.; O'Loughlin, E.; John Hunter Children's Hospital, Newcastle, NSW

    1999-01-01

    Full text: As part of a larger study looking at gastric emptying times in cystic fibrosis, we assessed the normal range of gastric emptying in a control group of children. Thirteen children (8 girls, 5 boys) aged 4-15 years (mean 10) were studied. Excluded were children with a history of relevant gastrointestinal medical or surgical disease, egg allergy or medication affecting gastric emptying. Imaging was performed at 08.00 h after an overnight fast. The test meal was consumed in under 15 min and comprised one 50 g egg, 80 g commercial pancake mix, 10 ml of polyunsaturated oil, 40 ml of water and 30 g of jam. The meal was labelled with 99mTc-macroaggregates of albumin. Water (150 ml) was also consumed with the test meal. One-minute 128 x 128 images were acquired in the anterior and posterior projections every 5 min for 30 min, then every 15 min until 90 min, with a final image at 120 min. Subjects remained supine for the first 60 min, after which they were allowed to walk around. A time-activity curve was generated using the geometric mean of anterior and posterior activity. The half emptying time ranged from 55 to 107 min (mean 79; mean ± 2 standard deviations: 43-115). Lag time (time for 5% to leave the stomach) ranged from 2 to 26 min (mean 10). The percent emptied at 60 min ranged from 47 to 73% (mean 63%). There was no correlation of half emptying time with age. The normal reference range for a test meal of pancakes has been established for 13 normal children
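
    The geometric-mean time-activity analysis can be sketched as follows (our own function names and toy count data; the 50% crossing is found by linear interpolation between frames):

```python
import numpy as np

def half_emptying_time(t_min, counts_ant, counts_post):
    """Half-emptying time from anterior/posterior gamma-camera counts.
    The geometric mean of the two projections corrects for depth
    attenuation; retention is expressed as percent of the first frame."""
    gm = np.sqrt(np.asarray(counts_ant, float) * np.asarray(counts_post, float))
    retention = gm / gm[0] * 100.0              # percent of meal remaining
    i = int(np.argmax(retention <= 50.0))       # first frame at/below 50%
    t0, t1 = t_min[i - 1], t_min[i]
    r0, r1 = retention[i - 1], retention[i]
    # linear interpolation of the 50% crossing between frames i-1 and i
    return t0 + (r0 - 50.0) * (t1 - t0) / (r0 - r1)

t_half = half_emptying_time([0, 30, 60, 90, 120],
                            [100, 80, 55, 35, 20],
                            [100, 80, 55, 35, 20])   # -> 67.5 min
```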

  19. Exercises in anatomy: the normal heart.

    Science.gov (United States)

    Anderson, Robert H; Sarwark, Anne; Spicer, Diane E; Backer, Carl L

    2014-01-01

    In the first of our exercises in anatomy, created for the Multimedia Manual of the European Association of Cardiothoracic Surgery, we emphasized that thorough knowledge of intracardiac anatomy was an essential part of the training for all budding cardiac surgeons, explaining how we had used the archive of congenitally malformed hearts maintained at Lurie Children's Hospital in Chicago to prepare a series of videoclips, demonstrating the salient features of tetralogy of Fallot. In this series of videoclips, we extend our analysis of the normal heart, since for our initial exercise we had concentrated exclusively on the structure of the right ventricular outflow tract. We begin our overview of normal anatomy by emphasizing the need, in the current era, to describe the heart in attitudinally appropriate fashion. Increasingly, clinicians are demonstrating the features of the heart as it is located within the body. It is no longer satisfactory, therefore, to describe these components in a 'Valentine' fashion, as continues to be the case in most textbooks of normal or cardiac anatomy. We then emphasize the importance of the so-called morphological method, which states that structures within the heart should be defined on the basis of their own intrinsic morphology, and not according to other parts, which are themselves variable. We continue by using this concept to show how it is the appendages that serve to distinguish between the atrial chambers, while the apical trabecular components provide the features to distinguish the ventricles. We then return to the cardiac chambers, emphasizing features of surgical significance, in particular the locations of the cardiac conduction tissues. We proceed by examining the cardiac valves, and conclude by providing a detailed analysis of the septal structures. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  20. Normal variants of skin in neonates

    Directory of Open Access Journals (Sweden)

    Kulkarni M

    1996-01-01

    Full Text Available 2221 consecutive live births taking place between March 1994 and February 1995 were evaluated for a minimum period of 5 days to note the occurrence of various normal anatomical variants, especially those of skin. Birth weight, gestational age, maternal age, socio-economic status and consanguinity were carefully recorded in all the cases. Mongolian spots (72%), Epstein pearls (43.8%), milia (26.2%) and erythema toxicum (25.2%) were the common dermatological variants noted. Maturity of the babies and possibly genetic factors (consanguinity) are important factors in their causation, as observed in our study.

  1. Technical normalization in the geoinformatics branch

    Directory of Open Access Journals (Sweden)

    Bronislava Horáková

    2006-09-01

    Full Text Available A basic principle of technical normalisation is to support market development by providing unified technical rules for all concerned subjects. The information and communication technology (ICT) industry is characterised by certain specific features that distinguish it from traditional industry. These features bring new demands to the normalisation domain, mainly the flexibility to reflect the rapidly developing ICT market in an elastic way. The goal of the paper is to provide a comprehensive overview of the current process of technical normalization in the geoinformatics branch

  2. Renal malignancies with normal excretory urograms

    International Nuclear Information System (INIS)

    Kass, D.A.; Hricak, H.; Davidson, A.J.

    1983-01-01

    Four patients with malignant renal masses showed no abnormality of excretory urograms with tomography. Of the four lesions, two were primary renal cell carcinomas, one was a metastatic focus from a contralateral renal cell carcinoma, and one was a metastatic lesion from rectal adenocarcinoma. A normal excretory urogram should not be considered sufficient to exclude a clinically suspected malignant renal mass. In such an instance, diagnostic evaluation should be pursued using a method capable of topographic anatomic display, such as computed tomography or sonography

  3. Local normality properties of some infrared representations

    International Nuclear Information System (INIS)

    Doplicher, S.; Spera, M.

    1983-01-01

    We consider the positive energy representations of the algebra of quasilocal observables for the free massless Majorana field described in preceding papers. We show that by an appropriate choice of the (partially) occupied one-particle modes we can find irreducible, type II∞ or IIIλ representations in this class which are unitarily equivalent to the vacuum representation when restricted to any forward light cone and disjoint from it when restricted to any backward light cone, or conversely. We give an elementary explicit proof of local normality of each representation in the above class. (orig.)

  4. Normal blood supply of the canine patella

    International Nuclear Information System (INIS)

    Howard, P.E.; Wilson, J.W.; Robbins, T.A.; Ribble, G.A.

    1986-01-01

    The normal blood supply of the canine patella was evaluated, using microangiography and correlated histology. Arterioles entered the cortex of the patella at multiple sites along the medial, lateral, and dorsal aspects. The body of the patella was vascularized uniformly, with many arterioles that branched and anastomosed extensively throughout the patella. The patella was not dependent on a single nutrient artery for its afferent supply, but had an extensive interior vascular network. These factors should ensure rapid revascularization and healing of patellar fractures, provided appropriate fracture fixation is achieved

  5. Deficiency of normal galaxies among Markaryan galaxies

    International Nuclear Information System (INIS)

    Iyeveer, M.M.

    1986-01-01

    Comparison of the morphological types of Markaryan galaxies and other galaxies in the Uppsala catalog indicates a strong deficiency of normal ellipticals among the Markaryan galaxies, for which the fraction of type E galaxies is ≤ 1% against 10% among the remaining galaxies. Among the Markaryan galaxies, an excess of barred galaxies is observed - among the Markaryan galaxies with types Sa-Scd, approximately half or more have bars, whereas among the remaining galaxies of the same types bars are found in about 1/3

  6. Exchange rate arrangements: From extreme to "normal"

    Directory of Open Access Journals (Sweden)

    Beker Emilija

    2006-01-01

    Full Text Available The paper studies the theoretical and empirical location dispersion of exchange rate arrangements - rigid, intermediate and flexible regimes - in the context of the extreme arrangements of a currency board, dollarization and monetary union; the moderate characteristics of intermediate arrangements (adjustable pegs, crawling pegs and target zones); and the imperative process of "normalization" in the form of a managed or clean floating system. It is established that de iure and de facto classifications generate "fear of floating" and "fear of pegging". The "impossible trinity" under the conditions of capital liberalization and globalization creates a bipolar view or hypothesis of vanishing intermediate exchange rate regimes.

  7. Ultrasonographic ejection fraction of normal gallbladder

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jin Hun; Kim, Seung Yup; Park, Yaung Hee; Kang, Ik Won; Yoon, Jong Sup [Hangang Sacred Heart Hospital, Halym College, Chuncheon (Korea, Republic of)

    1984-06-15

    Real-time ultrasonography is a simple, accurate, noninvasive and potentially valuable means of studying gallbladder size and emptying. The authors calculated ultrasonographically the ejection fraction of 80 cases of normally functioning gallbladder on oral cholecystography, from June 1983 to April 1984, at the department of radiology, Hangang Sacred Heart Hospital. The results were obtained as follows: 1. The ultrasonographic ejection fraction at 30 minutes after the fatty meal was 73.1 ± 16.85%. 2. There was no statistically significant difference by age or sex.
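
    The ejection-fraction calculation can be sketched as follows (our own helper names; the ellipsoid volume approximation is a common sonographic convention and is an assumption here, not stated in the abstract):

```python
import math

def ellipsoid_volume(length, width, height):
    """Approximate gallbladder volume (cm^3) by the ellipsoid
    formula pi/6 * L * W * H from three sonographic diameters."""
    return math.pi / 6.0 * length * width * height

def ejection_fraction(vol_fasting, vol_postmeal):
    """Gallbladder ejection fraction (%) from the fasting volume and
    the volume measured after the fatty meal."""
    return 100.0 * (vol_fasting - vol_postmeal) / vol_fasting

# e.g. a gallbladder emptying from 20 mL to 5.4 mL gives EF = 73.0%,
# close to the mean value reported above
ef = ejection_fraction(20.0, 5.4)
```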

  8. MR of the normal and ischemic hip

    International Nuclear Information System (INIS)

    Mitchell, D.G.

    1988-01-01

    Magnetic resonance imaging (MRI) appears to be more sensitive than traditional radiographic and radionuclide methods for detecting early avascular necrosis (AVN) of the femoral head. The authors have found that in addition to its proven value for early detection, MRI can help us characterize individual lesions and understand the pathophysiology of AVN. This chapter reviews the clinical and pathological features of AVN of the femoral head, and describes recent contributions of MRI toward understanding the normal and ischemic hip. This review summarizes the 5-year experience of the MR group at the Hospital of the University of Pennsylvania

  9. The consequences of non-normality

    International Nuclear Information System (INIS)

    Hip, I.; Lippert, Th.; Neff, H.; Schilling, K.; Schroers, W.

    2002-01-01

    The non-normality of Wilson-type lattice Dirac operators has important consequences - the application of the usual concepts from textbook (hermitian) quantum mechanics should be reconsidered. This includes an appropriate definition of observables and the refinement of computational tools. We show that the truncated singular value expansion is the optimal approximation to the inverse operator D^-1, and we prove that due to the γ5-hermiticity it is equivalent to γ5 times the truncated eigenmode expansion of the hermitian Wilson-Dirac operator
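
    The truncated singular value expansion of an inverse can be illustrated on a small dense matrix (our own sketch, not the paper's implementation; keeping the k smallest singular values of D retains the dominant terms of D^-1):

```python
import numpy as np

def truncated_inverse(D, k):
    """Rank-k truncated singular value expansion of D^-1: keep the
    k smallest singular triplets of D, whose reciprocal singular
    values dominate the inverse."""
    U, s, Vh = np.linalg.svd(D)                  # s sorted descending
    Uk, sk, Vhk = U[:, -k:], s[-k:], Vh[-k:, :]  # k smallest triplets
    return Vhk.conj().T @ np.diag(1.0 / sk) @ Uk.conj().T

rng = np.random.default_rng(1)
D = rng.standard_normal((8, 8))
# keeping all triplets (k = dimension) reproduces the exact inverse
full = truncated_inverse(D, 8)
```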

  10. Torsion of the normal fallopian tube.

    Science.gov (United States)

    Provost, M W

    1972-01-01

    From 1961 to 1970 a number of cases of torsion of the Fallopian tube were seen at the Kaiser Foundation Hospital in San Francisco of which 3 cases are reported. Of the many theories of causation, pelvic congestion seemed the most likely. The only universal symptom is pain, located in the quadrant of the affected tube and sometimes radiating to the thigh or flank. Nausea and vomiting are frequent; temperature and white cell count are only slightly elevated or normal. A mass is often felt, depending on the amount of hemorrhage. Correct diagnosis is almost never made preoperatively. The only treatment is laparotomy and surgical correction.

  11. Normal CT anatomy of the calcaneus

    International Nuclear Information System (INIS)

    Lee, Mun Gyu; Kang, Heung Sik

    1986-01-01

    Normal sectional anatomy of the calcaneus on multiplanar CT examination was studied in 5 volunteers as the background for interpretation of various abnormalities. Three major sectional planes (plantar, coronal and sagittal) and an additional tuberosity plane are described. With CT examination of the calcaneus: 1. More detailed anatomy of the 3 facets of the subtalar joint (anterior, middle, and posterior facets) can be well visualized. 2. Its clinical applications in tarsal trauma, tarsal coalition, subtalar infection, degenerative arthritis, club foot, pes planus and tarsal tumor could provide much more information than is obtained by conventional radiographic studies.

  12. The evaluation of computed tomography of the normal adrenal glands

    Energy Technology Data Exchange (ETDEWEB)

    Baek, Seung Yon; Kook, Shin Ho; Lee, Cho Hye; Choi, Kyung Hee; Rhee, Chung Sik [Ewha Womens University College of Medicine, Seoul (Korea, Republic of)

    1986-08-15

    Radiology plays an important role in evaluating patients with suspected adrenal gland pathology. Morphologic delineation of the adrenal gland is especially valuable in patients with clinical and/or biochemical evidence of a disturbance in adrenal function. Many diagnostic radiologic methods are available for demonstrating adrenal lesions. Computed tomography overcomes many of the disadvantages of these other radiologic techniques. The high degree of spatial and density resolution allows precise demonstration of the normal adrenal glands as well as detection of both small and large tumors in almost all patients, so CT of the adrenal gland is an excellent noninvasive screening method and definitive imaging technique. The authors have investigated the capability of CT to image the normal size, location and shape of both glands. Knowledge of the normal range is useful for optimal interpretation of CT scans in patients with suspected adrenal pathology. We reviewed CT scans of 150 cases without evidence of adrenal disease. The following results were obtained: 1. There were 90 male and 60 female patients. 2. Their ages ranged from 20 to 60 years. 3. On CT, both glands were shown in 135 (90.0%), the right in 143 (95.3%), the left in 142 (94.6%). 4. In shape, most right adrenal glands were linear or comet shaped, 68 (47.6%); most left adrenal glands were inverted-Y shaped, 103 (72.6%). 5. In length, the right was 2.5±0.77 cm, the left 2.9±0.75 cm. 6. In width, the right was 3.2±0.74 cm, the left 2.7±0.57 cm. 7. In thickness, the right was 0.5±0.14 cm, the left 0.6±0.16 cm.

  13. Bayesian Option Pricing using Mixed Normal Heteroskedasticity Models

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars

    2014-01-01

    Option pricing using mixed normal heteroscedasticity models is considered. It is explained how to perform inference and price options in a Bayesian framework. The approach makes it easy to compute risk neutral predictive price densities which take into account parameter uncertainty....... In an application to the S&P 500 index, classical and Bayesian inference is performed on the mixture model using the available return data. Comparing the ML estimates and posterior moments, small differences are found. When pricing a rich sample of options on the index, both methods yield similar pricing errors...... measured in dollar and implied standard deviation losses, and it turns out that the impact of parameter uncertainty is minor. Therefore, when it comes to option pricing where large amounts of data are available, the choice of the inference method is unimportant. The results are robust to different...

  14. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Directory of Open Access Journals (Sweden)

    Yilun Shang

    Full Text Available Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.
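For a static snapshot graph, the Laplacian Estrada index is the sum of e^μ over the Laplacian eigenvalues μ, i.e. the trace of the matrix exponential of the Laplacian. A minimal sketch, not taken from the paper: the tiny path graph and the Taylor-series evaluation below are illustrative choices, chosen so the exact answer (eigenvalues 0, 1 and 3) is known.

```python
import math

def laplacian(n_vertices, edges):
    """Build the graph Laplacian L = D - A as a list-of-lists matrix."""
    L = [[0.0] * n_vertices for _ in range(n_vertices)]
    for u, v in edges:
        L[u][u] += 1.0
        L[v][v] += 1.0
        L[u][v] -= 1.0
        L[v][u] -= 1.0
    return L

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def laplacian_estrada_index(L, terms=60):
    """LEE(G) = sum_i exp(mu_i) = trace(exp(L)), via a truncated Taylor series."""
    n = len(L)
    term = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # L^0/0!
    total = [row[:] for row in term]
    for k in range(1, terms):
        term = matmul(term, L)
        term = [[x / k for x in row] for row in term]  # now holds L^k / k!
        total = [[total[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return sum(total[i][i] for i in range(n))

# Path graph P3 has Laplacian eigenvalues 0, 1, 3, so LEE = 1 + e + e^3.
L = laplacian(3, [(0, 1), (1, 2)])
lee = laplacian_estrada_index(L)
print(round(lee, 6))  # → 23.803819
```

The dynamic indices studied in the paper track this quantity across the snapshots of an evolving graph, which is what makes bounds that avoid full eigendecompositions attractive.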

  15. Automatic identification and normalization of dosage forms in drug monographs

    Science.gov (United States)

    2012-01-01

    Background Each day, millions of health consumers seek drug-related information on the Web. Despite some efforts in linking related resources, drug information is largely scattered across a wide variety of websites of varying quality and credibility. Methods As a step toward providing users with integrated access to multiple trustworthy drug resources, we aim to develop a method capable of identifying a drug's dosage form in addition to recognizing the drug name. We developed rules and patterns for identifying dosage forms in different sections of full-text drug monographs, and subsequently normalized them to standardized RxNorm dosage forms. Results Our method represents a significant improvement over a baseline lookup approach, achieving an overall macro-averaged Precision of 80%, Recall of 98%, and F-Measure of 85%. Conclusions We successfully developed an automatic approach for drug dosage form identification, which is critical for building links between different drug-related resources. PMID:22336431
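A minimal sketch of the rule-and-pattern idea, under stated assumptions: the regular expressions and the mapping to RxNorm-style dosage-form names below are illustrative, not the paper's actual rule set, and the real RxNorm vocabulary is far larger.

```python
import re

# Illustrative patterns and target names only.
DOSAGE_FORM_PATTERNS = [
    (re.compile(r"\b(?:tablets?|tabs?)\b", re.I), "Oral Tablet"),
    (re.compile(r"\b(?:capsules?|caps?)\b", re.I), "Oral Capsule"),
    (re.compile(r"\binjectable\b|\binjections?\b", re.I), "Injectable Solution"),
    (re.compile(r"\bcreams?\b", re.I), "Topical Cream"),
]

def normalize_dosage_forms(text):
    """Return the set of normalized dosage forms mentioned in a monograph section."""
    return {normalized for pattern, normalized in DOSAGE_FORM_PATTERNS
            if pattern.search(text)}

print(sorted(normalize_dosage_forms("Supplied as 250 mg tabs and an injectable form.")))
# → ['Injectable Solution', 'Oral Tablet']
```

Abbreviated mentions ("tabs") normalizing to the same canonical form as the full word is exactly the property that lets heterogeneous monograph sections be linked.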

  16. A normal metal tunnel-junction heat diode

    Energy Technology Data Exchange (ETDEWEB)

    Fornieri, Antonio, E-mail: antonio.fornieri@sns.it; Martínez-Pérez, María José; Giazotto, Francesco, E-mail: giazotto@sns.it [NEST, Istituto Nanoscienze-CNR and Scuola Normale Superiore, I-56127 Pisa (Italy)

    2014-05-05

    We propose a low-temperature thermal rectifier consisting of a chain of three tunnel-coupled normal metal electrodes. We show that a large heat rectification is achievable if the thermal symmetry of the structure is broken and the central island can release energy to the phonon bath. The performance of the device is theoretically analyzed and, under the appropriate conditions, temperature differences up to ∼200 mK between the forward and reverse thermal bias configurations are obtained below 1 K, corresponding to a rectification ratio R∼2000. The simplicity intrinsic to its design joined with the insensitivity to magnetic fields make our device potentially attractive as a fundamental building block in solid-state thermal nanocircuits and in general-purpose cryogenic electronic applications requiring energy management.

  17. Analysis of KNU1 loss of normal feedwater

    International Nuclear Information System (INIS)

    Kim, Hho-Jung; Chung, Bub-Dong; Lee, Young-Jin; Kim, Jin-Soo

    1986-01-01

    Simulation of the system thermal-hydraulic parameters was carried out following the KNU1 (Korea Nuclear Unit-1) loss of normal feedwater transient sequence that occurred on November 14, 1984. Results were compared with the plant transient data, and good agreement was obtained. Some deviations were found in parameters such as the steam flowrate and the RCS (Reactor Coolant System) average temperature around the time of reactor trip. This is to be expected, since the thermal-hydraulic parameters undergo rapid transitions due to the large reduction of the reactor thermal power in a short period of time and, thereby, the plant data involve transient uncertainties. The analysis was performed using RELAP5/MOD1/NSC, developed through modifications of the interphase drag and wall heat transfer modeling routines of RELAP5/MOD1/CY018. (author)

  18. Laplacian Estrada and normalized Laplacian Estrada indices of evolving graphs.

    Science.gov (United States)

    Shang, Yilun

    2015-01-01

    Large-scale time-evolving networks have been generated by many natural and technological applications, posing challenges for computation and modeling. Thus, it is of theoretical and practical significance to probe mathematical tools tailored for evolving networks. In this paper, on top of the dynamic Estrada index, we study the dynamic Laplacian Estrada index and the dynamic normalized Laplacian Estrada index of evolving graphs. Using linear algebra techniques, we established general upper and lower bounds for these graph-spectrum-based invariants through a couple of intuitive graph-theoretic measures, including the number of vertices or edges. Synthetic random evolving small-world networks are employed to show the relevance of the proposed dynamic Estrada indices. It is found that neither the static snapshot graphs nor the aggregated graph can approximate the evolving graph itself, indicating the fundamental difference between the static and dynamic Estrada indices.

  19. Normal Conducting Separation Dipoles for the LHC Beam Cleaning Insertions

    CERN Document Server

    Bidon, S; Hans, O; Kalbreier, Willi; Kiselev, O; Petrov, V; Protopopov, I V; Pupkov, Yu A; Ramberger, S; de Rijk, G; Ruvinsky, E; Sukhanov, A

    2004-01-01

    In the Large Hadron Collider (LHC), two straight sections, IR3 and IR7, will be dedicated to beam cleaning. These cleaning insertions will be equipped with normal conducting magnets. MBW magnets are dipole magnets used to increase the separation of the two beams. They have a core length of 3.4 m and a gap height of 52 mm and will operate at a magnetic field ranging from 0.09 T to 1.53 T. Limitations on the dimensions and total weight of the magnet resulted in a special design with a common yoke for the two beams. The orbits of the two beams will be separated horizontally by a distance between 194 mm and 224 mm in the gap of the magnet. The magnet was designed in collaboration between CERN and BINP. The report presents the main design issues and results of the pre-series acceptance tests including mechanical, electrical and magnetic field measurements.

  20. Normal Conducting Deflecting Cavity Development at the Cockcroft Institute

    CERN Document Server

    Burt, G; Dexter, A C; Woolley, B; Jones, R M; Grudiev, A; Dolgashev, V; Wheelhouse, A; Mackenzie, J; McIntosh, P A; Hill, C; Goudket, P; Buckley, S; Lingwood, C

    2013-01-01

    Two normal conducting deflecting structures are currently being developed at the Cockcroft Institute: one as a crab cavity for the CERN linear collider CLIC, and one for bunch slice diagnostics on low energy electron beams for the Electron Beam Test Facility (EBTF) at Daresbury. Each has its own challenges that need to be overcome. For CLIC the phase and amplitude tolerances are very stringent, and hence beamloading effects and wakefields must be minimised. Significant work has been undertaken to understand the effect of the couplers on beamloading and on the wakefields. For EBTF the difficulty is avoiding the large beam offset caused by the cavity's internal deflecting voltage at the low beam energy. Prototypes for both cavities have been manufactured and results will be presented.

  1. Large eddy simulation of breaking waves

    DEFF Research Database (Denmark)

    Christensen, Erik Damgaard; Deigaard, Rolf

    2001-01-01

    A numerical model is used to simulate wave breaking, the large scale water motions and turbulence induced by the breaking process. The model consists of a free surface model using the surface markers method combined with a three-dimensional model that solves the flow equations. The turbulence....... The incoming waves are specified by a flux boundary condition. The waves are approaching in the shore-normal direction and are breaking on a plane, constant slope beach. The first few wave periods are simulated by a two-dimensional model in the vertical plane normal to the beach line. The model describes...... the steepening and the overturning of the wave. At a given instant, the model domain is extended to three dimensions, and the two-dimensional flow field develops spontaneously three-dimensional flow features with turbulent eddies. After a few wave periods, stationary (periodic) conditions are achieved...

  2. Fast Eigensolver for Computing 3D Earth's Normal Modes

    Science.gov (United States)

    Shi, J.; De Hoop, M. V.; Li, R.; Xi, Y.; Saad, Y.

    2017-12-01

    We present a novel parallel computational approach to compute Earth's normal modes. We discretize Earth via an unstructured tetrahedral mesh and apply the continuous Galerkin finite element method to the elasto-gravitational system. To resolve the eigenvalue pollution issue, following the analysis separating the seismic point spectrum, we explicitly utilize a representation of the displacement for describing the oscillations of the non-seismic modes in the fluid outer core. Effectively, we separate out the essential spectrum, which is naturally related to the Brunt-Väisälä frequency. We introduce two Lanczos approaches, with polynomial and rational filtering, for solving this generalized eigenvalue problem in prescribed intervals. The polynomial filtering technique accesses the matrix pair only through matrix-vector products and is an ideal candidate for solving three-dimensional large-scale eigenvalue problems. The matrix-free scheme allows us to deal with fluid separation and self-gravitation in an efficient way, while the standard shift-and-invert method typically needs an explicit shifted matrix and its factorization. The rational filtering method converges much faster than the standard shift-and-invert procedure when computing all the eigenvalues inside an interval. Both Lanczos approaches solve for the internal eigenvalues extremely accurately compared with the standard eigensolver. In our computational experiments, we compare our results with the radial earth model benchmark, and visualize the normal modes using vector plots to illustrate the properties of the displacements in different modes.
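For contrast with the filtered, matrix-free approaches described in the abstract, the standard shift-and-invert idea can be reduced to a few lines: each iteration solves a linear system with the shifted matrix, which is exactly why an explicit factorization (or repeated solves) is needed. This sketch uses a toy 3×3 symmetric matrix (a path-graph Laplacian with eigenvalues 0, 1 and 3), not an elasto-gravitational operator.

```python
def solve(A, b):
    """Dense Gaussian elimination with partial pivoting (illustration only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x

def shift_invert_eigenvalue(A, sigma, iters=50):
    """Eigenvalue of symmetric A nearest the shift sigma.

    Each step solves (A - sigma*I) w = v; needing to solve with the shifted
    matrix is the cost that matrix-free filtered methods avoid.
    """
    n = len(A)
    shifted = [[A[i][j] - (sigma if i == j else 0.0) for j in range(n)]
               for i in range(n)]
    v = [1.0] + [0.0] * (n - 1)
    for _ in range(iters):
        w = solve(shifted, v)
        norm = sum(x * x for x in w) ** 0.5
        v = [x / norm for x in w]
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(Av[i] * v[i] for i in range(n))  # Rayleigh quotient

# Toy symmetric matrix (3-vertex path-graph Laplacian, eigenvalues 0, 1, 3).
A = [[1.0, -1.0, 0.0], [-1.0, 2.0, -1.0], [0.0, -1.0, 1.0]]
print(round(shift_invert_eigenvalue(A, sigma=1.2), 6))  # → 1.0
```

Moving the shift moves which eigenvalue the iteration converges to, which is why sweeping an interval this way requires many factorizations.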

  3. Transliteration normalization for Information Extraction and Machine Translation

    Directory of Open Access Journals (Sweden)

    Yuval Marton

    2014-12-01

    Full Text Available Foreign name transliterations typically include multiple spelling variants. These variants cause data sparseness and inconsistency problems, increase the Out-of-Vocabulary (OOV rate, and present challenges for Machine Translation, Information Extraction and other natural language processing (NLP tasks. This work aims to identify and cluster name spelling variants using a Statistical Machine Translation method: word alignment. The variants are identified by being aligned to the same “pivot” name in another language (the source-language in Machine Translation settings. Based on word-to-word translation and transliteration probabilities, as well as the string edit distance metric, names with similar spellings in the target language are clustered and then normalized to a canonical form. With this approach, tens of thousands of high-precision name transliteration spelling variants are extracted from sentence-aligned bilingual corpora in Arabic and English (in both languages. When these normalized name spelling variants are applied to Information Extraction tasks, improvements over strong baseline systems are observed. When applied to Machine Translation tasks, a large improvement potential is shown.
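The alignment step needs bilingual data, but the edit-distance clustering component can be sketched on its own. This is a simplified stand-in, assuming a greedy single-link scheme, a hypothetical distance threshold, and made-up example names; it is not the paper's exact procedure.

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def cluster_variants(names, max_dist=2):
    """Greedy single-link clustering of spelling variants by edit distance."""
    clusters = []
    for name in names:
        for cluster in clusters:
            if any(edit_distance(name, member) <= max_dist for member in cluster):
                cluster.append(name)
                break
        else:
            clusters.append([name])
    return clusters

variants = ["Qaddafi", "Gaddafi", "Kadafi", "Mubarak", "Moubarak"]
print(cluster_variants(variants))
# → [['Qaddafi', 'Gaddafi', 'Kadafi'], ['Mubarak', 'Moubarak']]
```

In the paper, membership in a cluster is additionally constrained by alignment to the same pivot name, which keeps coincidentally similar spellings of different names apart.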

  4. Functional neuroimaging of normal aging: Declining brain, adapting brain.

    Science.gov (United States)

    Sugiura, Motoaki

    2016-09-01

    Early functional neuroimaging research on the normally aging brain was dominated by interest in cognitive decline. In this framework the age-related compensatory recruitment of prefrontal cortex, in terms of the executive system or reduced lateralization, has been established. Further details on these compensatory mechanisms, and the findings reflecting cognitive decline, remain matters of intensive investigation. Studies in another framework, where age-related neural alteration is considered adaptation to environmental change, are recently burgeoning and appear largely categorized into three domains. The age-related increase in activation of the sensorimotor network may reflect the alteration of the peripheral sensorimotor systems. The increased susceptibility of the network for mental-state inference to socioemotional significance may be explained by an age-related motivational shift due to altered social perception. The age-related change in activation of the self-referential network may be relevant to the focused positive self-concept of the elderly, driven by a similar motivational shift. Across the domains, the concepts of the self and the internal model may provide the theoretical bases of this adaptation framework. These two frameworks complement each other to provide a comprehensive view of the normal aging brain. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Normal Conducting Separation Dipoles For The Lhc Beam Cleaning Insertions

    CERN Document Server

    Petrov, V; de Rijk, G; Gerard, D; Hans, O; Kalbreier, Willi; Kiselev, O; Protopopov, I V; Pupkov, Yu; Ramberger, S; Ruvinsky, E; Sukhanov, A

    2004-01-01

    In the Large Hadron Collider (LHC), two straight sections, IR3 and IR7, will be dedicated to beam cleaning [1]. These cleaning insertions will be equipped with normal conducting magnets. MBW magnets are dipole magnets used to increase the separation of the two beams. They have a core length of 3.4 m and a gap height of 52 mm and will operate at a magnetic field ranging from 0.09 T to 1.53 T. Limitations on the dimensions and total weight of the magnet resulted in a special design with a common yoke for the two beams. The orbits of the two beams will be separated horizontally by a distance between 194 mm and 224 mm in the gap of the magnet. The magnet was designed in collaboration between CERN and BINP. The report presents the main design issues and results of the pre-series acceptance tests including mechanical, electrical and magnetic field measurements. Index terms - LHC, normal conducting magnet, twin aperture design, separation dipole

  6. Sex differences in normal age trajectories of functional brain networks.

    Science.gov (United States)

    Scheinost, Dustin; Finn, Emily S; Tokoglu, Fuyuze; Shen, Xilin; Papademetris, Xenophon; Hampson, Michelle; Constable, R Todd

    2015-04-01

    Resting-state functional magnetic resonance image (rs-fMRI) is increasingly used to study functional brain networks. Nevertheless, variability in these networks due to factors such as sex and aging is not fully understood. This study explored sex differences in normal age trajectories of resting-state networks (RSNs) using a novel voxel-wise measure of functional connectivity, the intrinsic connectivity distribution (ICD). Males and females showed differential patterns of changing connectivity in large-scale RSNs during normal aging from early adulthood to late middle-age. In some networks, such as the default-mode network, males and females both showed decreases in connectivity with age, albeit at different rates. In other networks, such as the fronto-parietal network, males and females showed divergent connectivity trajectories with age. Main effects of sex and age were found in many of the same regions showing sex-related differences in aging. Finally, these sex differences in aging trajectories were robust to choice of preprocessing strategy, such as global signal regression. Our findings resolve some discrepancies in the literature, especially with respect to the trajectory of connectivity in the default mode, which can be explained by our observed interactions between sex and aging. Overall, results indicate that RSNs show different aging trajectories for males and females. Characterizing effects of sex and age on RSNs are critical first steps in understanding the functional organization of the human brain. © 2014 Wiley Periodicals, Inc.

  7. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    Science.gov (United States)

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of normalization method can have on the results of a ChIP-seq data analysis, its assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification.
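The naive linear normalization the abstract refers to can be written down directly. The bin counts below are made up for illustration; as the abstract notes, methods such as NCIS instead estimate r from background (low-count) bins rather than from the totals.

```python
from math import log2

def naive_scale_factor(chip_counts, input_counts):
    """Naive normalization constant: ratio of total sequencing depths."""
    return sum(chip_counts) / sum(input_counts)

def normalized_log_ratios(chip_counts, input_counts, r):
    """Per-bin log2 enrichment of ChIP over r-scaled Input (pseudocount 1)."""
    return [log2((c + 1) / (r * i + 1)) for c, i in zip(chip_counts, input_counts)]

# Made-up bin counts: the last two bins mimic enriched "peak" regions.
chip = [10, 12, 9, 11, 80, 95]
inpt = [5, 6, 5, 6, 6, 5]
r = naive_scale_factor(chip, inpt)
ratios = normalized_log_ratios(chip, inpt, r)
print(round(r, 3))  # → 6.576
```

Because the ChIP totals include reads from enriched regions, the naive r overestimates the background scaling, deflating the apparent enrichment of true peaks; this is the bias the diagnostic plot is designed to expose.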

  8. Large Right Pleural Effusion

    Directory of Open Access Journals (Sweden)

    Robert Rowe

    2016-09-01

    Full Text Available History of present illness: An 83-year-old male with a distant history of tuberculosis, status post treatment and resection approximately fifty years prior, presented with two days of worsening shortness of breath. He denied any chest pain, and reported his shortness of breath was worse with exertion and lying flat. Significant findings: Chest x-ray and bedside ultrasound revealed a large right pleural effusion, estimated to be greater than two and a half liters in size. Discussion: The incidence of pleural effusion is estimated to be at least 1.5 million cases annually in the United States.1 Erect posteroanterior and lateral chest radiography remains the mainstay for diagnosis of a pleural effusion; on upright chest radiography small effusions (>400cc) will blunt the costophrenic angles, and as the size of an effusion grows it will begin to obscure the hemidiaphragm.1 Large effusions will cause mediastinal shift away from the affected side (seen in effusions >1000cc).1 Lateral decubitus chest radiography can detect effusions greater than 50cc.1 Ultrasonography can help differentiate large pulmonary masses from effusions and can be instrumental in guiding thoracentesis.1 The patient above was comfortable at rest and was admitted for a non-emergent thoracentesis. The pulmonology team removed 2500cc of fluid, and unfortunately the patient subsequently developed re-expansion pulmonary edema and pneumothorax ex-vacuo. It is generally recommended that no more than 1500cc be removed to minimize the risk of re-expansion pulmonary edema.2

  9. The normal range of condylar movement

    International Nuclear Information System (INIS)

    Choe, Han Up; Park, Tae Won

    1978-01-01

    The purpose of this study was to investigate the normal range of condylar movement in normal adults. The author has observed roentgenographic images of four serial positions of the condylar head taken by modified transcranial lateral oblique projection. The serial positions are centric occlusion, rest position, 1 inch open position and maximal open position. The following results were obtained: 1. Inter-incisal distance was 46.85 mm in maximal open position. 2. The length between the deepest point of the glenoid fossa and the summit of the condylar head in rest position was greater than that in centric occlusion by 0.8 mm. 3. In 1 inch open position, the condylar head moved forward from the standard line by 12.64 mm in the horizontal direction and downwards from the standard line by 1.84 mm in the vertical direction. 4. In maximal open position, the condylar head moved forward from the standard line by 19.06 mm in the horizontal direction and downwards from the standard line by 0.4 mm in the vertical direction. 5. In centric occlusion, the width between the glenoid fossa and the margin of the condylar head was greater in the posterior portion than in the anterior portion by 0.4 mm. 6. Except for the figures of the 1 inch open position, all of the estimated figures were greater in males than in females.

  10. Extravascular transport in normal and tumor tissues.

    Science.gov (United States)

    Jain, R K; Gerlowski, L E

    1986-01-01

    The transport characteristics of the normal and tumor tissue extravascular space provide the basis for determining the optimal dosage and schedule regimens of various pharmacological agents in the detection and treatment of cancer. In order for a drug to reach the cellular space where most therapeutic action takes place, several transport steps must first occur: (1) tissue perfusion; (2) permeation across the capillary wall; (3) transport through interstitial space; and (4) transport across the cell membrane. Any of these steps, including intracellular events such as metabolism, can be the rate-limiting step in uptake of the drug, and these rate-limiting steps may be different in normal and tumor tissues. This review examines these transport limitations, first from an experimental point of view and then from a modeling point of view. Various types of experimental tumor models which have been used in animals to represent human tumors are discussed. Then, mathematical models of extravascular transport are discussed from the perspective of two approaches: compartmental and distributed. Compartmental models lump one or more sections of a tissue or body into a "compartment" to describe the time course of disposition of a substance. These models contain "effective" parameters which represent the entire compartment. Distributed models consider the structural and morphological aspects of the tissue to determine the transport properties of that tissue. These distributed models describe both the temporal and spatial distribution of a substance in tissues. Each of these modeling techniques is described in detail with applications for cancer detection and treatment in mind.
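As a concrete illustration of the compartmental approach, here is a minimal sketch of a generic two-compartment model (plasma exchanging with tissue, with first-order elimination), integrated by forward Euler. All rate constants and the dose are hypothetical, not values from the review.

```python
def two_compartment(dose, k12, k21, k10, t_end, dt=0.001):
    """Forward-Euler simulation of a generic two-compartment model.

    Compartment 1 (plasma) exchanges with compartment 2 (tissue) at first-order
    rates k12/k21 and is eliminated from plasma at rate k10.
    """
    c1, c2 = dose, 0.0
    for _ in range(int(t_end / dt)):
        dc1 = (-k12 * c1 + k21 * c2 - k10 * c1) * dt
        dc2 = (k12 * c1 - k21 * c2) * dt
        c1, c2 = c1 + dc1, c2 + dc2
    return c1, c2

plasma, tissue = two_compartment(dose=100.0, k12=0.5, k21=0.3, k10=0.1, t_end=5.0)
print(round(plasma, 2), round(tissue, 2), round(100.0 - plasma - tissue, 2))
```

The rate constants here play the role of the "effective" lumped parameters the review describes; a distributed model would replace them with spatially resolved transport properties.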

  11. Normal CT in infants and children

    Energy Technology Data Exchange (ETDEWEB)

    Takao, T; Okuno, T; Ito, M; Konishi, Y; Yoshioka, M [Kyoto Univ. (Japan). Faculty of Medicine

    1980-10-01

    There have been several reports on normal CT in children. However, they included children with convulsions as normal subjects. In our experience, children with convulsions have an enlargement of the subdural space in the frontal region. Therefore, we studied CT in children without convulsions. Of the 10,000 patients examined with EMI 1000 or EMI 1010 at Kyoto Univ. Hospital from 1976 to 1979, 110 children could be classified into the following types according to their symptoms: 1) Type-1 head injury, without abnormalities in CT resulting from this injury, 2) non-migraining headaches, and 3) others without CT abnormalities who were routinely examined. Previous studies have shown that the enlargement of the subdural space in the frontal region was not abnormal under one year of age. However, the present study has shown that it was not dilated in children without convulsions. We stress the usefulness of our newly calculated basal cistern index, because its SD was small and it could be readily identified (this index was under 0.29 in most cases; the SDs were 0.04 in those under one year and 0.02 over one year). The other data were not so different from those of previous studies.

  12. Oxygen delivery in irradiated normal tissue

    Energy Technology Data Exchange (ETDEWEB)

    Kiani, M.F.; Ansari, R. [Univ. of Tennessee Health Science Center, Memphis, TN (United States). School of Biomedical Engineering; Gaber, M.W. [St. Jude Children' s Research Hospital, Memphis, TN (United States)

    2003-03-01

    Ionizing radiation exposure significantly alters the structure and function of microvascular networks, which regulate delivery of oxygen to tissue. In this study we use a hamster cremaster muscle model to study changes in microvascular network parameters and use a mathematical model to study the effects of these observed structural and microhemodynamic changes in microvascular networks on oxygen delivery to the tissue. Our experimental observations indicate that in microvascular networks while some parameters are significantly affected by irradiation (e.g. red blood cell (RBC) transit time), others remain at the control level (e.g. RBC path length) up to 180 days post-irradiation. The results from our mathematical model indicate that tissue oxygenation patterns are significantly different in irradiated normal tissue as compared to age-matched controls and the differences are apparent as early as 3 days post irradiation. However, oxygen delivery to irradiated tissue was not found to be significantly different from age matched controls at any time between 7 days to 6 months post-irradiation. These findings indicate that microvascular late effects in irradiated normal tissue may be due to factors other than compromised tissue oxygenation. (author)

  13. Does partial occlusion promote normal binocular function?

    Science.gov (United States)

    Li, Jingrong; Thompson, Benjamin; Ding, Zhaofeng; Chan, Lily Y L; Chen, Xiang; Yu, Minbin; Deng, Daming; Hess, Robert F

    2012-10-03

    There is growing evidence that abnormal binocular interactions play a key role in the amblyopia syndrome and represent a viable target for treatment interventions. In this context the use of partial occlusion with optical devices such as Bangerter filters, as an alternative to complete occlusion, is of particular interest. The aims of this study were to understand why Bangerter filters do not result in improved binocular outcomes compared to complete occlusion, and to compare the effects of Bangerter filters, optical blur and neutral density (ND) filters on normal binocular function. The effects of four strengths of Bangerter filters (0.8, 0.6, 0.4, 0.2) on letter and vernier acuity, contrast sensitivity, stereoacuity, and interocular suppression were measured in 21 observers with normal vision. In a subset of 14 observers, the partial occlusion effects of Bangerter filters, ND filters and plus lenses on stereopsis and interocular suppression were compared. Bangerter filters did not have a graded effect on vision and induced significant disruption of binocular function. This disruption was greater than that of monocular defocus but weaker than that of ND filters. The effect of the Bangerter filters on stereopsis was more pronounced than their effect on monocular acuity, and the induced monocular acuity deficits did not predict the induced deficits in stereopsis. Bangerter filters appear to be particularly disruptive to binocular function. Other interventions, such as optical defocus and those employing computer-generated dichoptic stimulus presentation, may be more appropriate than partial occlusion for targeting binocular function during amblyopia treatment.

  14. Normal CT in infants and children

    International Nuclear Information System (INIS)

    Takao, Tatsuo; Okuno, Takehiko; Ito, Masatoshi; Konishi, Yukuo; Yoshioka, Mieko

    1980-01-01

    There have been several reports on normal CT in children. However, they included children with convulsions as normal subjects. In our experience, children with convulsions have an enlargement of the subdural space in the frontal region. Therefore, we studied CT in children without convulsions. Of the 10,000 patients examined with EMI 1000 or EMI 1010 at Kyoto Univ. Hospital from 1976 to 1979, 110 children could be classified into the following types according to their symptoms: 1) Type-1 head injury, without abnormalities in CT resulting from this injury, 2) non-migraining headaches, and 3) others without CT abnormalities who were routinely examined. Previous studies have shown that the enlargement of the subdural space in the frontal region was not abnormal under one year of age. However, the present study has shown that it was not dilated in children without convulsions. We stress the usefulness of our newly calculated basal cistern index, because its SD was small and it could be readily identified (this index was under 0.29 in most cases; the SDs were 0.04 in those under one year and 0.02 over one year). The other data were not so different from those of previous studies. (J.P.N.)

  15. Not Normal: the uncertainties of scientific measurements

    Science.gov (United States)

    Bailey, David C.

    2017-01-01

    Judging the significance and reproducibility of quantitative research requires a good understanding of relevant uncertainties, but it is often unclear how well these have been evaluated and what they imply. Reported scientific uncertainties were studied by analysing 41 000 measurements of 3200 quantities from medicine, nuclear and particle physics, and interlaboratory comparisons ranging from chemistry to toxicology. Outliers are common, with 5σ disagreements up to five orders of magnitude more frequent than naively expected. Uncertainty-normalized differences between multiple measurements of the same quantity are consistent with heavy-tailed Student's t-distributions that are often almost Cauchy, far from a Gaussian Normal bell curve. Medical research uncertainties are generally as well evaluated as those in physics, but physics uncertainty improves more rapidly, making feasible simple significance criteria such as the 5σ discovery convention in particle physics. Contributions to measurement uncertainty from mistakes and unknown problems are not completely unpredictable. Such errors appear to have power-law distributions consistent with how designed complex systems fail, and how unknown systematic errors are constrained by researchers. This better understanding may help improve analysis and meta-analysis of data, and help scientists and the public have more realistic expectations of what scientific results imply.
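The scale of the reported tail excess can be made concrete with a short, hedged calculation (not taken from the paper's data): comparing the two-sided probability of a 5σ deviation under a Gaussian with that under a Student's t with 2 degrees of freedom, close to the almost-Cauchy fits described above. The t₂ tail has a simple closed form, so only the standard library is needed.

```python
import math

def normal_two_sided_tail(z):
    """P(|Z| >= z) for a standard normal, via the complementary error function."""
    return math.erfc(z / math.sqrt(2.0))

def t2_two_sided_tail(z):
    """P(|T| >= z) for Student's t with 2 degrees of freedom (closed form)."""
    return 1.0 - z / math.sqrt(2.0 + z * z)

p_gauss = normal_two_sided_tail(5.0)  # ~5.7e-7
p_t2 = t2_two_sided_tail(5.0)         # ~3.8e-2
print(p_t2 / p_gauss)                 # ~6.6e4: nearly five orders of magnitude
```

This mirrors the abstract's observation that 5σ disagreements can be orders of magnitude more frequent than a Gaussian model naively predicts.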

  16. Is Middle-Upper Arm Circumference "normally" distributed? Secondary data analysis of 852 nutrition surveys.

    Science.gov (United States)

    Frison, Severine; Checchi, Francesco; Kerac, Marko; Nicholas, Jennifer

    2016-01-01

    Wasting is a major public health issue throughout the developing world. Out of the 6.9 million estimated deaths among children under five annually, over 800,000 deaths (11.6 %) are attributed to wasting. Wasting is quantified as low Weight-For-Height (WFH) and/or low Mid-Upper Arm Circumference (MUAC) (since 2005). Many statistical procedures are based on the assumption that the data used are normally distributed. Analyses have been conducted on the distribution of WFH but there are no equivalent studies on the distribution of MUAC. This secondary data analysis assesses the normality of the MUAC distributions of 852 nutrition cross-sectional survey datasets of children from 6 to 59 months old and examines different approaches to normalise "non-normal" distributions. The distribution of MUAC showed no departure from a normal distribution in 319 (37.7 %) distributions using the Shapiro-Wilk test. Out of the 533 surveys showing departure from a normal distribution, 183 (34.3 %) were skewed (D'Agostino test) and 196 (36.8 %) had a kurtosis different from that observed in the normal distribution (Anscombe-Glynn test). Testing for normality can be sensitive to data quality, design effect and sample size. Out of the 533 surveys showing departure from a normal distribution, 294 (55.2 %) showed high digit preference, 164 (30.8 %) had a large design effect, and 204 (38.3 %) a large sample size. Spline and LOESS smoothing techniques were explored and both techniques worked well. After Spline smoothing, 56.7 % of the MUAC distributions showing departure from normality were "normalised" and 59.7 % after LOESS. Box-Cox power transformation had similar results on distributions showing departure from normality with 57 % of distributions approximating "normal" after transformation. Applying Box-Cox transformation after Spline or LOESS smoothing techniques increased that proportion to 82.4 and 82.7 % respectively. This suggests that statistical approaches relying on the
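As a sketch of the normalisation step described above — using synthetic data, not the survey datasets — the following computes moment-based skewness and applies a Box-Cox transform (λ = 0, i.e. a log transform) to a deterministic right-skewed "MUAC-like" sample:

```python
import math
import statistics

def skewness(xs):
    """Moment-based sample skewness (the quantity the D'Agostino test probes)."""
    n = len(xs)
    mean = sum(xs) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
    return sum(((x - mean) / sd) ** 3 for x in xs) / n

def box_cox(xs, lam):
    """Box-Cox power transform; lambda = 0 reduces to the log transform."""
    if abs(lam) < 1e-12:
        return [math.log(x) for x in xs]
    return [(x ** lam - 1.0) / lam for x in xs]

# Deterministic lognormal quantile sample (median ~148 mm, illustrative only).
nd = statistics.NormalDist(mu=5.0, sigma=0.08)
sample = [math.exp(nd.inv_cdf((i + 0.5) / 1000)) for i in range(1000)]

print(skewness(sample))              # positive: the raw sample is right-skewed
print(skewness(box_cox(sample, 0)))  # ~0: the transform restores symmetry
```

The sample here is lognormal by construction, so λ = 0 normalises it exactly; on real survey data the best λ is usually estimated by maximum likelihood.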

  17. Large litter sizes

    DEFF Research Database (Denmark)

    Sandøe, Peter; Rutherford, K.M.D.; Berg, Peer

    2012-01-01

    This paper presents some key results and conclusions from a review (Rutherford et al. 2011) undertaken regarding the ethical and welfare implications of breeding for large litter size in the domestic pig and about different ways of dealing with these implications. Focus is primarily on the direct...... possible to achieve a drop in relative piglet mortality and the related welfare problems. However, there will be a growing problem with the need to use foster or nurse sows which may have negative effects on both sows and piglets. This gives rise to new challenges for management....

  18. Large lithium loop experience

    International Nuclear Information System (INIS)

    Kolowith, R.; Owen, T.J.; Berg, J.D.; Atwood, J.M.

    1981-10-01

    The engineering design and operating experience of a large, isothermal, lithium-coolant test loop are presented. This liquid metal coolant loop is called the Experimental Lithium System (ELS) and has operated safely and reliably for over 6500 hours through September 1981. The loop is used for full-scale testing of components for the Fusion Materials Irradiation Test (FMIT) Facility. Main system parameters include coolant temperatures to 430 °C and flow to 0.038 m³/s (600 gal/min). Performance of the main pump, vacuum system, and control system is discussed. Unique test capabilities of the ELS are also discussed
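As a quick unit-conversion check on the quoted flow rate (simple arithmetic, not a figure from the report):

```python
# 0.038 m^3/s expressed in US gallons per minute.
M3_PER_US_GALLON = 0.003785411784  # exact definition of the US gallon
flow_gpm = 0.038 / M3_PER_US_GALLON * 60.0
print(round(flow_gpm))  # 602, consistent with the quoted ~600 gal/min
```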

  19. Large coil test facility

    International Nuclear Information System (INIS)

    Nelms, L.W.; Thompson, P.B.

    1980-01-01

    Final design of the facility is nearing completion, and 20% of the construction has been accomplished. A large vacuum chamber houses the test assembly, which is coupled to appropriate cryogenic, electrical, instrumentation, and diagnostic systems. Adequate assembly/disassembly areas, shop space, test control center, offices, and test support laboratories are located in the same building. Assembly and installation operations are accomplished with an overhead crane. The major subsystems are the vacuum system, the test stand assembly, the cryogenic system, the experimental electric power system, the instrumentation and control system, and the data acquisition system

  20. "Differently normal" and "normally different": negotiations of female embodiment in women's accounts of 'atypical' sex development.

    Science.gov (United States)

    Guntram, Lisa

    2013-12-01

    During recent decades numerous feminist scholars have scrutinized the two-sex model and questioned its status in Western societies and medicine. Along the same line, increased attention has been paid to individuals' experiences of atypical sex development, also known as intersex or 'disorders of sex development' (DSD). Yet research on individuals' experiences of finding out about their atypical sex development in adolescence has been scarce. Against this backdrop, the present article analyses 23 in-depth interviews with women who in their teens found out about their atypical sex development. The interviews were conducted during 2009-2012 and the interviewees were all Swedish. Drawing on feminist research on female embodiment and social scientific studies on diagnosis, I examine how the women make sense of their bodies and situations. First, I aim to explore how the women construe normality as they negotiate female embodiment. Second, I aim to investigate how the divergent manners in which these negotiations are expressed can be further understood via the women's different access to a diagnosis. Through a thematic and interpretative analysis, I outline two negotiation strategies: the "differently normal" and the "normally different" strategy. In the former, the women present themselves as just slightly different from 'normal' women. In the latter, they stress that everyone is different in some manner and thereby claim normalcy. The analysis shows that access to diagnosis corresponds to the ways in which the women present themselves as "differently normal" and "normally different", thus shedding light on the complex role of diagnosis in their negotiations of female embodiment. It also reveals that the women make use of what they do have and how alignments with and work on norms interplay as normality is construed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Sentiment Classification of Documents in Serbian: The Effects of Morphological Normalization and Word Embeddings

    Directory of Open Access Journals (Sweden)

    V. Batanović

    2017-11-01

    Full Text Available An open issue in the sentiment classification of texts written in Serbian is the effect of different forms of morphological normalization and the usefulness of leveraging large amounts of unlabeled texts. In this paper, we assess the impact of lemmatizers and stemmers for Serbian on classifiers trained and evaluated on the Serbian Movie Review Dataset. We also consider the effectiveness of using word embeddings, generated from a large unlabeled corpus, as classification features.

  2. Normalization of the Psychometric Hepatic Encephalopathy score ...

    African Journals Online (AJOL)

    2016-05-09

    May 9, 2016 ... influenced by age, education levels, and gender.[5] Till date, the PHES ... and death. MHE also increases the risk of development ... large circles beginning from each row on the left and working to the right. The test score is the ...

  3. An investigation on normal school students’ learning burnout – A case study of English normal students

    Directory of Open Access Journals (Sweden)

    Linjing Xu

    2017-11-01

    Full Text Available Learning burnout is a phenomenon in which students hold a negative attitude toward curriculum learning, which manifests in aspects of physiology, psychology, behavior and interpersonal communication. China attaches great importance to higher education, and colleges and universities shoulder the important task of training personnel for national modernization. The problem of university students’ learning burnout has become a social phenomenon that cannot be ignored. Normal university students are one of the important groups of college students, and this phenomenon of learning burnout may also occur among them. English majors are the backbone of English teachers in primary and secondary schools in the future. The learning status of these groups affects the overall quality of teaching in normal colleges and universities and, more importantly, the quality of teachers in primary and secondary schools in the future. This paper first reviews the definition of learning burnout and the research methods used to measure it. Subsequently, it investigates the learning burnout of English normal students by taking the first-year English majors of Jiangxi Normal University as an example. In this way, this research is hoped to promote the study of learning burnout not only among English normal students but also among other normal students.

  4. 20 CFR 336.2 - Duration of normal unemployment benefits.

    Science.gov (United States)

    2010-04-01

    ... 20 Employees' Benefits 1 2010-04-01 2010-04-01 false Duration of normal unemployment benefits. 336... UNEMPLOYMENT INSURANCE ACT DURATION OF NORMAL AND EXTENDED BENEFITS Normal Benefits § 336.2 Duration of normal unemployment benefits. (a) 130 compensable day limitation. A qualified employee who has satisfied the waiting...

  5. Frictional response of simulated faults to normal stresses perturbations probed with ultrasonic waves

    Science.gov (United States)

    Shreedharan, S.; Riviere, J.; Marone, C.

    2017-12-01

    We report on a suite of laboratory friction experiments conducted on saw-cut Westerly Granite surfaces to probe frictional response to step changes in normal stress and loading rate. The experiments are conducted to illuminate the fundamental processes that yield friction rate and state dependence. We quantify the microphysical frictional response of the simulated fault surfaces to normal stress steps, in the range of 1%-600% step increases and decreases from a nominal baseline normal stress. We measure directly the fault slip rate and account for changes in slip rate with changes in normal stress, and complement mechanical data acquisition by continuously probing the faults with ultrasonic pulses. We conduct the experiments at room temperature and humidity conditions in a servo-controlled biaxial testing apparatus in the double direct shear configuration. The samples are sheared over a range of velocities, from 0.02-100 μm/s. We report observations of a transient shear stress and friction evolution with step increases and decreases in normal stress. Specifically, we show that, at low shear velocities and small increases in normal stress (≤5% increases), the shear stress evolves immediately with normal stress. We show that the excursions in slip rate resulting from the changes in normal stress must be accounted for in order to predict fault strength evolution. Ultrasonic wave amplitudes first increase immediately in response to normal stress steps, then decrease approximately linearly to a new steady-state value, in part due to changes in fault slip rate. Previous descriptions of frictional state evolution during normal stress perturbations have not adequately accounted for the effect of large slip velocity excursions. Here, we attempt to do so by using the measured ultrasonic amplitudes as a proxy for frictional state during transient shear stress evolution. Our work aims to improve understanding of induced and triggered seismicity with focus on
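The transient described above can be sketched with a standard rate-and-state formulation — the Linker-Dieterich state jump followed by aging-law healing — rather than the study's own analysis; all parameter values below are illustrative assumptions.

```python
import math

# Illustrative sketch, not the study's model: friction transient after a
# step increase in normal stress at constant slip rate, combining the
# Linker-Dieterich state jump with Dieterich aging-law healing.
b, alpha = 0.012, 0.3            # state and normal-stress coupling coefficients
mu0, v, Dc = 0.6, 1e-6, 1e-5     # base friction, slip rate (m/s), slip scale (m)
sigma0, sigma1 = 5.0e6, 5.25e6   # a 5% step up in normal stress (Pa)

theta_ss = Dc / v                                     # steady state at rate v
theta1 = theta_ss * (sigma0 / sigma1) ** (alpha / b)  # Linker-Dieterich jump

def mu(theta):
    # At constant slip rate the a*ln(v/v0) direct-effect term vanishes.
    return mu0 + b * math.log(v * theta / Dc)

mu_after_step = mu(theta1)  # instantaneous drop of alpha * ln(sigma1/sigma0)
for t in (0.0, 5.0, 20.0, 100.0):  # seconds; state heals over slip ~Dc
    theta = theta_ss + (theta1 - theta_ss) * math.exp(-v * t / Dc)
    print(t, mu(theta))            # friction recovers toward mu0
```

The closed-form θ(t) is the aging-law solution at fixed slip rate; in the experiments the slip-rate excursions make the real evolution richer, which is exactly the point the abstract makes.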

  6. Cell survival of human tumor cells compared with normal fibroblasts following 60Co gamma irradiation

    International Nuclear Information System (INIS)

    Lloyd, E.L.; Henning, C.B.; Reynolds, S.D.; Holmblad, G.L.; Trier, J.E.

    1982-01-01

    Three tumor cell lines, two of which were shown to be HeLa cells, were irradiated with 60Co gamma radiation, together with two cell cultures of normal human diploid fibroblasts. Cell survival was studied in three different experiments over a dose range of 2 to 14 gray. All the tumor cell lines showed a very wide shoulder in the dose-response curves, in contrast to the extremely narrow shoulder of the normal fibroblasts. In addition, the D₀ values for the tumor cell lines were somewhat greater. These two characteristics of the dose-response curves resulted in up to 2 orders of magnitude less sensitivity for cell inactivation of HeLa cells when compared with normal cells at high doses (10 gray). Because of these large differences, the extrapolation of results from the irradiation of HeLa cells concerning the mechanisms of normal cell killing should be interpreted with great caution
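The shoulder effect described above can be sketched with the classic multi-target survival model (parameters invented for illustration, not fitted to the reported curves): S(D) = 1 - (1 - e^(-D/D₀))^n, where n sets the shoulder width and D₀ the final slope.

```python
import math

def surviving_fraction(dose_gy, d0, n):
    """Multi-target model: wide shoulder for large n, final slope set by d0."""
    return 1.0 - (1.0 - math.exp(-dose_gy / d0)) ** n

s_fibroblast = surviving_fraction(10.0, 1.2, 1)   # narrow shoulder, smaller D0
s_tumor = surviving_fraction(10.0, 2.0, 20)       # wide shoulder, larger D0
print(s_tumor / s_fibroblast)  # ~5e2: orders of magnitude more survival at 10 Gy
```

With these assumed parameters the wide-shoulder curve survives two to three orders of magnitude better at 10 Gy, qualitatively reproducing the HeLa-versus-fibroblast gap the abstract reports.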

  7. Mural Thrombus in the Normal-Appearing Descending Thoracic Aorta of a Chronic Smoker

    Science.gov (United States)

    Habib, Habib; Hsu, Judy; Winchell, Patricia Jo; Daoko, Joseph

    2013-01-01

    Thrombus formation in an atherosclerotic or aneurysmal descending thoracic aorta is a well-described, frequently encountered vascular condition. In comparison, thrombus formation in a normal-appearing descending thoracic aorta is reported far less often. We describe the case of a 46-year-old woman who had splenic and renal infarctions secondary to embolic showers from a large, mobile thrombus in a morphologically normal proximal descending thoracic aorta. After the patient underwent anticoagulation, stent-grafting, and surgical bypass to correct an arterial blockage caused by the stent-graft, she resumed a relatively normal life. In contrast with other cases of a thrombotic but normal-appearing descending thoracic aorta, this patient had no known malignancy or systemic coagulative disorders; her sole risk factor was chronic smoking. We discuss our patient's case and review the relevant medical literature, focusing on the effect of smoking on coagulation physiology. PMID:24391341

  8. Propagation of normal zones in composite superconductors

    International Nuclear Information System (INIS)

    Dresner, L.

    1976-08-01

    This paper describes calculations of propagation velocities of normal zones in composite superconductors. Full accounting is made for (1) current sharing, (2) the variation with temperature of the thermal conductivity of the copper matrix, and the specific heats of the matrix and the superconductor, and (3) the variation with temperature of the steady-state heat transfer at a copper-helium interface in the nucleate-boiling, transition, and film-boiling ranges. The theory, which contains no adjustable parameters, is compared with experiments on bare (uninsulated) conductors. Agreement is not good. It is concluded that the effects of transient heat transfer may need to be included in the theory to improve agreement with experiment

  9. Uterus MRI. Normal and pathological aspects

    International Nuclear Information System (INIS)

    Moulin, G.; Bartoli, J.M.; Gaubert, J.Y.; Bayle, O.; Distefano-Louineau, D.; Kasbarian, M.

    1991-01-01

    Magnetic Resonance Imaging (MRI), a non-invasive procedure, is assuming a place of growing importance as a means of radiological exploration. Its use in uterine pathology has developed considerably, which requires an excellent knowledge of the normal and pathological appearances of the uterus. The uterus in fact has a zonal anatomy that varies according to hormonal impregnation, and this is very well seen by MRI. MRI gives excellent results in the diagnosis and study of different uterine pathologies. The radiological appearance of leiomyomas differs depending on the presence or absence of degenerative changes within them. Uterine adenomyosis is also well studied by MRI. Lastly, different studies in the literature have shown MRI to be a reliable method of exploration, with a high degree of reliability, specificity and sensitivity in assessing the local spread of malignant uterine disease. The authors report their experience together with that in the literature concerning the study of the uterus by MRI [fr

  10. Microscopic theory of normal liquid 3He

    International Nuclear Information System (INIS)

    Nafari, N.; Doroudi, A.

    1994-03-01

    We have used the self-consistent scheme proposed by Singwi, Tosi, Land and Sjoelander (STLS) to study the properties of normal liquid ³He. By employing the Aziz potential (HFD-B) and some other realistic pairwise interactions, we have calculated the static structure factor, the pair-correlation function, the zero-sound frequencies as a function of wave-vector, and the Landau parameter F₀ˢ for different densities. Our results show considerable improvement over Ng-Singwi's model potential of a hard core plus an attractive tail. Agreement between our results and the experimental data for the static structure factor and the zero-sound frequencies is fairly good. (author). 30 refs, 6 figs, 2 tabs

  11. Ibsen and Peking Women's High Normal University

    Directory of Open Access Journals (Sweden)

    Sun Jian

    2015-02-01

    Full Text Available This article aims at exploring the great influence of Ibsen, and especially his play A Doll House, on the young Chinese girls studying at Peking Women’s High Normal University, established at the beginning of the 20th century as the first institution of its kind in China for the education of girls. In its short history, the girls at the university were exposed widely to progressive ideas and literature from the West. Ibsen, the most popular writer at that time, inspired the girls tremendously, and their performance of A Doll House aroused a heated debate among well-known scholars on such important issues as women’s rights, women’s liberation, new culture, art and literature. Consequently, the first group of modern Chinese women writers appeared at the university; they picked up their pens and wrote about themselves and about women in China, describing themselves as “Chinese Noras”.

  12. Cerebral blood flow in normal pressure hydrocephalus

    International Nuclear Information System (INIS)

    Mamo, H.L.; Meric, P.C.; Ponsin, J.C.; Rey, A.C.; Luft, A.G.; Seylaz, J.A.

    1987-01-01

    A xenon-133 method was used to measure cerebral blood flow (CBF) before and after cerebrospinal fluid (CSF) removal in patients with normal pressure hydrocephalus (NPH). Preliminary results suggested that shunting should be performed on patients whose CBF increased after CSF removal. There was a significant increase in CBF in patients with NPH, which was confirmed by the favorable outcome of 88% of patients shunted. The majority of patients with senile and presenile dementia showed a decrease or no change in CBF after CSF removal. It is suggested that although changes in CBF and clinical symptoms of NPH may have the same cause, i.e., changes in the cerebral intraparenchymal pressure, there is no simple direct relation between these two events. The mechanism underlying the loss of autoregulation observed in NPH is also discussed

  13. 'Normal' markets, market imperfections and energy efficiency

    International Nuclear Information System (INIS)

    Sanstad, A.H.; Howarth, R.B.

    1994-01-01

    The conventional distinction between 'economic' and 'engineering' approaches to energy analysis obscures key methodological issues concerning the measurement of the costs and benefits of policies to promote the adoption of energy-efficient technologies. The engineering approach is in fact based upon firm economic foundations: the principle of lifecycle cost minimization that arises directly from the theory of rational investment. Thus, evidence that so-called 'market barriers' impede the adoption of cost-effective energy-efficient technologies implies the existence of market failures as defined in the context of microeconomic theory. The widely held view that the engineering approach lacks economic justification is based on the fallacy that markets are 'normally' efficient. (author)

  14. About the principles of radiation level normalization

    International Nuclear Information System (INIS)

    Nosovskij, A.V.

    2000-01-01

    The paper highlights the impact of radiation level normalization principles on social and economic indicators. The newly introduced Radiation Safety Standards-97 are taken as an example. It is emphasized that a sound approach is necessary when defining radiation protection standards, taking into consideration the economic and social factors existing in Ukraine at the moment. Based on the concept of the natural radiation background and the available results of epidemiological surveys, dose limits are proposed for the radiation protection standards. The paper gives a description of the dose limitation system recommended by the International Commission on Radiological Protection. It also highlights the negative impact of the linear no-threshold concept, and of the lack of specialist knowledge in the medical service and the mass media, on decisions made to protect people who suffered from the Chernobyl accident

  15. How Long Is a Normal Labor?

    DEFF Research Database (Denmark)

    Hildingsson, Ingegerd; Blix, Ellen; Hegaard, Hanne

    2015-01-01

    OBJECTIVE: Normal progress of labor is a subject for discussion among professionals. The aim of this study was to assess the duration of labor in women with a planned home birth and spontaneous onset who gave birth at home or in hospital after transfer. METHODS: This is a population-based study...... of home births in four Nordic countries (Denmark, Iceland, Norway, and Sweden). All midwives assisting at a home birth from 2008 to 2013 were asked to provide information about home births using a questionnaire. RESULTS: Birth data from 1,612 women, from Denmark (n = 1,170), Norway (n = 263), Sweden (n = 138), and Iceland (n = 41) were included. The total median duration from onset of labor until the birth of the baby was approximately 14 hours for primiparas and 7.25 hours for multiparas. The duration of the different phases varied between countries. Blood loss more than 1,000 mL and perineal...

  16. Normal dispersion femtosecond fiber optical parametric oscillator.

    Science.gov (United States)

    Nguyen, T N; Kieu, K; Maslov, A V; Miyawaki, M; Peyghambarian, N

    2013-09-15

    We propose and demonstrate a synchronously pumped fiber optical parametric oscillator (FOPO) operating in the normal dispersion regime. The FOPO generates chirped pulses at the output, allowing significant pulse energy scaling potential without pulse breaking. The output average power of the FOPO at 1600 nm was ∼60 mW (corresponding to 1.45 nJ pulse energy and ∼55% slope power conversion efficiency). The output pulses directly from the FOPO were highly chirped (∼3 ps duration), and they could be compressed outside of the cavity to 180 fs by using a standard optical fiber compressor. Detailed numerical simulation was also performed to understand the pulse evolution dynamics around the laser cavity. We believe that the proposed design concept is useful for scaling up the pulse energy in the FOPO using different pumping wavelengths.

  17. The Ethos of Post-Normal Science

    DEFF Research Database (Denmark)

    Kønig, Nicolas; Børsen, Tom; Emmeche, Claus

    2017-01-01

    The norms and values of Post-Normal Science (PNS) are instrumental in guiding science advice practices. In this article, we report work in progress to systematically investigate the norms and values of PNS through a structured review. An archive of 397 documents was collected, including documents...... that contribute to the endeavour of ameliorating science advice practices from a PNS perspective. Action- and structure-oriented viewpoints are used as complementary perspectives in the analysis of the ethos of PNS. From the action perspective we study how prototypes of norms and values are reflected upon...... in negotiations of normative issues relating to science advice. From the structural perspective we study how interrelated prototypes of norms and values are presupposed in prescriptions, proscriptions, and goals for science advice practices. Through this analysis we identify a plurality of interrelated prototypes...

  18. Behavioral finance: Finance with normal people

    Directory of Open Access Journals (Sweden)

    Meir Statman

    2014-06-01

    Behavioral finance substitutes normal people for the rational people in standard finance. It substitutes behavioral portfolio theory for mean-variance portfolio theory, and behavioral asset pricing model for the CAPM and other models where expected returns are determined only by risk. Behavioral finance also distinguishes rational markets from hard-to-beat markets in the discussion of efficient markets, a distinction that is often blurred in standard finance, and it examines why so many investors believe that it is easy to beat the market. Moreover, behavioral finance expands the domain of finance beyond portfolios, asset pricing, and market efficiency and is set to continue that expansion while adhering to the scientific rigor introduced by standard finance.

  19. Visualization of normal pleural sinuses with AMBER

    International Nuclear Information System (INIS)

    Aarts, N.J.; Kool, L.J.S.; Oestmann, J.W.

    1991-01-01

    This paper reports that ventral and dorsal pleural sinuses are frequently better appreciated with advanced modulated beam equalization radiography (AMBER) than with standard chest radiography. The visualization of the sinuses with both techniques was compared and their typical configuration studied. Four hundred patients without known chest disease were evaluated. Two groups of 200 patients were studied with either AMBER or standard chest radiography. Visualization was evaluated by three radiologists using a four-point scale. The shape of the sinus was traced if sufficiently visible. A significantly larger segment of the respective sinuses was seen with the AMBER technique. The dorsal sinus was significantly easier to trace than the ventral. Various sinus configurations were noted. AMBER improves the visibility of the pleural sinuses. Knowledge of their normal configuration is the precondition for correctly diagnosing lesions hitherto frequently overlooked

  20. Normal Incidence for Graded Index Surfaces

    Science.gov (United States)

    Khankhoje, Uday K.; Van Zyl, Jakob

    2011-01-01

    A plane wave is incident normally from vacuum (η₀ = 1) onto a smooth surface. The substrate has three layers; the topmost layer has thickness d₁ and permittivity ε₁. The corresponding numbers for the next layer are d₂, ε₂, while the third layer, which is semi-infinite, has index η₃. The Hallikainen model [1] is used to relate volumetric soil moisture to the permittivity. Here, we consider the relation for the real part of the permittivity for a typical loam soil: ε′(m_v) = 2.8571 + 3.9678 m_v + 118.85 m_v².
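Reading the quoted coefficients as ε′(m_v) = 2.8571 + 3.9678·m_v + 118.85·m_v² (interpreting the garbled "118:85" as 118.85), a minimal sketch evaluates the fit at a few moisture levels:

```python
# Real part of the relative permittivity of a typical loam soil as a
# function of volumetric moisture m_v (Hallikainen-type fit quoted above).
def loam_permittivity_real(mv):
    return 2.8571 + 3.9678 * mv + 118.85 * mv ** 2

for mv in (0.05, 0.15, 0.30):
    print(mv, round(loam_permittivity_real(mv), 2))
```

The quadratic term dominates at high moisture, which is why even modest soil-moisture changes shift the normal-incidence reflection noticeably.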