WorldWideScience

Sample records for providing normal application

  1. WEBnm@: a web application for normal mode analyses of proteins

    Directory of Open Access Journals (Sweden)

    Reuter Nathalie

    2005-03-01

    Abstract Background Normal mode analysis (NMA) has become the method of choice to investigate the slowest motions in macromolecular systems. NMA is especially useful for large biomolecular assemblies, such as transmembrane channels or virus capsids. NMA relies on the hypothesis that the vibrational normal modes having the lowest frequencies (also called soft modes) describe the largest movements in a protein and are the ones that are functionally relevant. Results We developed a web-based server to perform normal mode calculations and different types of analyses. Starting from a structure file provided by the user in PDB format, the server calculates the normal modes and subsequently offers the user a series of automated analyses: normalized squared atomic displacements, vector field representation, and animation of the first six vibrational modes. Each analysis is performed independently of the others, and results can be visualized using only a web browser; no additional plug-in or software is required. For users who would like to analyze the results with their favorite software, raw results can also be downloaded. The application is available at http://www.bioinfo.no/tools/normalmodes. We present here the underlying theory, the application architecture and an illustration of its features using a large transmembrane protein as an example. Conclusion We built an efficient and modular web application for normal mode analysis of proteins. Non-specialists can easily and rapidly evaluate the degree of flexibility of multi-domain protein assemblies and characterize the large-amplitude movements of their domains.
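
    At its core such a server performs an eigendecomposition of a mass-weighted Hessian; the lowest-frequency non-rigid eigenvectors are the soft modes described above. A minimal sketch of that step (not WEBnm@'s actual implementation):

```python
import numpy as np

def soft_modes(hessian, masses, n_modes=6):
    """Diagonalize a mass-weighted Hessian and return the lowest-frequency
    internal (soft) modes, skipping the six rigid-body zero modes.

    hessian : (3N, 3N) second derivatives of the potential energy
    masses  : (N,) atomic masses
    """
    m = np.repeat(masses, 3)                        # one entry per coordinate
    inv_sqrt_m = 1.0 / np.sqrt(m)
    F = hessian * np.outer(inv_sqrt_m, inv_sqrt_m)  # F = M^-1/2 H M^-1/2
    evals, evecs = np.linalg.eigh(F)                # ascending eigenvalues
    freqs = np.sqrt(np.clip(evals, 0.0, None))      # omega_k = sqrt(lambda_k)
    return freqs[6:6 + n_modes], evecs[:, 6:6 + n_modes]
```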

  2. Normal mode analysis and applications in biological physics.

    Science.gov (United States)

    Dykeman, Eric C; Sankey, Otto F

    2010-10-27

Normal mode analysis has become a popular and often-used theoretical tool in the study of functional motions in enzymes, viruses, and large protein assemblies. The use of normal modes in the study of these motions is often extremely fruitful, since many of the functional motions of large proteins can be described using just a few normal modes which are intimately related to the overall structure of the protein. In this review, we present a broad overview of several popular methods used in the study of normal modes in biological physics, including continuum elastic theory, the elastic network model, and a recently developed all-atom method capable of computing a subset of the low-frequency vibrational modes exactly. After a review of the various methods, we present several examples of applications of normal modes in the study of functional motions, with an emphasis on viral capsids.
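
    Of the methods surveyed, the elastic network model is the simplest to write down: identical springs connect every residue pair within a cutoff distance, and the resulting Hessian is diagonalized exactly as in an all-atom calculation. A sketch with illustrative parameter values:

```python
import numpy as np

def anm_hessian(coords, cutoff=15.0, gamma=1.0):
    """Hessian of an anisotropic network model (ANM) from C-alpha coordinates.

    coords : (N, 3) array of atom positions
    cutoff : interaction distance (illustrative default, in angstroms)
    gamma  : uniform spring constant
    """
    n = len(coords)
    H = np.zeros((3 * n, 3 * n))
    for i in range(n):
        for j in range(i + 1, n):
            d = coords[j] - coords[i]
            r2 = d @ d
            if r2 > cutoff ** 2:
                continue
            # Super-element -gamma * outer(d, d) / |d|^2 couples atoms i and j
            block = -gamma * np.outer(d, d) / r2
            H[3*i:3*i+3, 3*j:3*j+3] = block
            H[3*j:3*j+3, 3*i:3*i+3] = block
            H[3*i:3*i+3, 3*i:3*i+3] -= block
            H[3*j:3*j+3, 3*j:3*j+3] -= block
    return H

H = anm_hessian(np.random.default_rng(0).random((30, 3)) * 30.0)
```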

  3. Fetal magnetic resonance: technique applications and normal fetal anatomy

    International Nuclear Information System (INIS)

    Martin, C.; Darnell, A.; Duran, C.; Mellado, F.; Corona, M

    2003-01-01

Ultrasonography is the preferred diagnostic imaging technique for intrauterine fetal examination. Nevertheless, circumstances sometimes dictate the use of other techniques in order to analyze fetal structures. The advent of ultra-rapid magnetic resonance (MR) sequences has made MR fetal studies possible, since images are obtained in an extraordinarily short time and are not affected by either maternal or fetal movements. MR does not employ ionizing radiation, provides high-contrast images, and can obtain such images in any plane of space without being influenced by either the mother's physical characteristics or the fetal position. MR provides good-quality images of most fetal organs. It is extremely useful in analyzing distinct structures, permitting evaluation of cervical structures, lungs, diaphragm, intra-abdominal and retroperitoneal structures, and fetal extremities. It can also provide useful information regarding the placenta, umbilical cord, amniotic fluid and uterus. The objective of this work is to describe the MR technique as applied to intrauterine fetal examination, and to illustrate normal fetal anatomy as depicted by MR and its applications. (Author) 42 refs

  4. Normal and Student's t distributions and their applications

    CERN Document Server

    Ahsanullah, Mohammad; Shakil, Mohammad

    2014-01-01

The most important properties of the normal and Student's t-distributions are presented. A number of applications of these properties are demonstrated. New related results dealing with the distributions of the sum, product and ratio of independent normal and Student's t random variables are presented. The material will be useful to advanced undergraduate and graduate students and to practitioners in the various fields of science and engineering.
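
    One member of this family of results is easy to check numerically: the ratio of two independent standard normal variables follows the Cauchy distribution, which is precisely the Student's t distribution with one degree of freedom. A quick Monte Carlo verification:

```python
import numpy as np
from scipy import stats

# Ratio of two independent standard normals follows a Cauchy distribution,
# i.e. a Student's t with one degree of freedom.
rng = np.random.default_rng(0)
z1, z2 = rng.standard_normal(100_000), rng.standard_normal(100_000)
ratio = z1 / z2

# Compare sample quantiles against the t(1) distribution.
qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(ratio, qs))      # empirical quantiles
print(stats.t(df=1).ppf(qs))       # theoretical t(1) quantiles
```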

  5. Localized Energy-Based Normalization of Medical Images: Application to Chest Radiography.

    Science.gov (United States)

    Philipsen, R H H M; Maduskar, P; Hogeweg, L; Melendez, J; Sánchez, C I; van Ginneken, B

    2015-09-01

Automated quantitative analysis systems for medical images often lack the capability to successfully process images from multiple sources. Normalization of such images prior to further analysis is a possible solution to this limitation. This work presents a general method to normalize medical images and thoroughly investigates its effectiveness for chest radiography (CXR). The method starts with an energy decomposition of the image into different bands. Next, each band's localized energy is scaled to a reference value and the image is reconstructed. We investigate iterative and local application of this technique. The normalization is applied iteratively to the lung fields on six datasets from different sources, each comprising 50 normal and 50 abnormal CXRs. The method is evaluated in three supervised computer-aided detection tasks related to CXR analysis and compared to two reference normalization methods. In the first task, automatic lung segmentation, the average Jaccard overlap significantly increased from 0.72±0.30 and 0.87±0.11 for the two reference methods to with normalization. The second experiment was aimed at segmentation of the clavicles. The reference methods had an average Jaccard index of 0.57±0.26 and 0.53±0.26; with normalization this significantly increased to . The third experiment was detection of tuberculosis-related abnormalities in the lung fields. The average area under the receiver operating characteristic curve increased significantly from 0.72±0.14 and 0.79±0.06 using the reference methods to with normalization. We conclude that the normalization can be successfully applied in chest radiography and makes supervised systems more generally applicable to data from different sources.
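
    The decompose-rescale-reconstruct pipeline can be sketched in a simplified global form (the paper applies the energy scaling locally and iteratively, which is omitted here; band scales and reference energies are illustrative):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def normalize_energy_bands(image, sigmas=(1, 2, 4, 8), ref_energies=None):
    """Sketch of energy-band normalization: split an image into band-pass
    components with a Gaussian pyramid, rescale each band's energy to a
    reference value, and reconstruct."""
    bands, residual = [], image.astype(float)
    for s in sigmas:
        low = gaussian_filter(residual, s)
        bands.append(residual - low)   # band-pass component
        residual = low                 # remaining low-pass content
    if ref_energies is None:
        ref_energies = [1.0] * len(bands)
    out = residual
    for band, ref in zip(bands, ref_energies):
        energy = np.sqrt(np.mean(band ** 2)) + 1e-12  # RMS energy of the band
        out = out + band * (ref / energy)             # scale to reference
    return out

print(normalize_energy_bands(np.random.rand(64, 64)).shape)
```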

  6. Normalization in Lie algebras via mould calculus and applications

    Science.gov (United States)

    Paul, Thierry; Sauzin, David

    2017-11-01

    We establish Écalle's mould calculus in an abstract Lie-theoretic setting and use it to solve a normalization problem, which covers several formal normal form problems in the theory of dynamical systems. The mould formalism allows us to reduce the Lie-theoretic problem to a mould equation, the solutions of which are remarkably explicit and can be fully described by means of a gauge transformation group. The dynamical applications include the construction of Poincaré-Dulac formal normal forms for a vector field around an equilibrium point, a formal infinite-order multiphase averaging procedure for vector fields with fast angular variables (Hamiltonian or not), or the construction of Birkhoff normal forms both in classical and quantum situations. As a by-product we obtain, in the case of harmonic oscillators, the convergence of the quantum Birkhoff form to the classical one, without any Diophantine hypothesis on the frequencies of the unperturbed Hamiltonians.

  7. Normalization of Complete Genome Characteristics: Application to Evolution from Primitive Organisms to Homo sapiens.

    Science.gov (United States)

    Sorimachi, Kenji; Okayasu, Teiji; Ohhira, Shuji

    2015-04-01

    Normalized nucleotide and amino acid contents of complete genome sequences can be visualized as radar charts. The shapes of these charts depict the characteristics of an organism's genome. The normalized values calculated from the genome sequence theoretically exclude experimental errors. Further, because normalization is independent of both target size and kind, this procedure is applicable not only to single genes but also to whole genomes, which consist of a huge number of different genes. In this review, we discuss the applications of the normalization of the nucleotide and predicted amino acid contents of complete genomes to the investigation of genome structure and to evolutionary research from primitive organisms to Homo sapiens. Some of the results could never have been obtained from the analysis of individual nucleotide or amino acid sequences but were revealed only after the normalization of nucleotide and amino acid contents was applied to genome research. The discovery that genome structure was homogeneous was obtained only after normalization methods were applied to the nucleotide or predicted amino acid contents of genome sequences. Normalization procedures are also applicable to evolutionary research. Thus, normalization of the contents of whole genomes is a useful procedure that can help to characterize organisms.
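
    The normalization itself is elementary: nucleotide (or predicted amino acid) counts are expressed as fractions of the genome total, which removes dependence on genome size and makes organisms directly comparable, e.g. on radar charts. A minimal sketch:

```python
from collections import Counter

def normalized_nucleotide_content(genome_seq):
    """Fraction of A, C, G, T in a genome sequence (normalized to sum to 1).

    Because the result is a composition rather than a count, genomes of
    very different sizes can be compared directly, e.g. on a radar chart.
    """
    counts = Counter(genome_seq.upper())
    total = sum(counts[b] for b in "ACGT")
    return {b: counts[b] / total for b in "ACGT"}

print(normalized_nucleotide_content("ATGCATGCAATT"))
# {'A': 0.333..., 'C': 0.166..., 'G': 0.166..., 'T': 0.333...}
```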

  8. 40 CFR 406.50 - Applicability; description of the normal rice milling subcategory.

    Science.gov (United States)

    2010-07-01

Title 40, Protection of Environment, Vol. 28 (revised as of 2010-07-01). § 406.50: Applicability; description of the normal rice milling subcategory. Environmental Protection Agency (continued), Effluent Guidelines and Standards, Grain Mills Point Source Category.

  9. Normally-off GaN Transistors for Power Applications

    International Nuclear Information System (INIS)

    Hilt, O; Bahat-Treidel, E; Brunner, F; Knauer, A; Zhytnytska, R; Kotara, P; Wuerfl, J

    2014-01-01

Normally-off high-voltage GaN HFETs for switching applications are presented. Normally-off operation with threshold voltages of 1 V and more and with 5 V gate swing has been obtained by using p-type GaN as the gate. Different GaN-based buffer types, using doping and backside potential barriers, have been employed to obtain blocking strengths up to 1000 V. The increase of the dynamic on-state resistance is analyzed for the different buffer types. The best trade-off between low dispersion and high blocking strength was obtained for a modified carbon-doped GaN buffer, which showed a 2.6× increase of the dynamic on-state resistance for 500 V switching as compared to switching from 20 V off-state drain bias. Device operation up to 200 °C ambient temperature without any threshold voltage shift is demonstrated.

  10. 40 CFR 406.30 - Applicability; description of the normal wheat flour milling subcategory.

    Science.gov (United States)

    2010-07-01

Title 40, Protection of Environment, Vol. 28 (revised as of 2010-07-01). § 406.30: Applicability; description of the normal wheat flour milling subcategory. Environmental Protection Agency (continued), Effluent Guidelines and Standards, Grain Mills Point Source Category.

  11. A Classification Method of Normal and Overweight Females Based on Facial Features for Automated Medical Applications

    Directory of Open Access Journals (Sweden)

    Bum Ju Lee

    2012-01-01

    Obesity and overweight have become serious public health problems worldwide. Obesity and abdominal obesity are associated with type 2 diabetes, cardiovascular diseases, and metabolic syndrome. In this paper, we first suggest a method of predicting normal and overweight status in females according to body mass index (BMI) based on facial features. A total of 688 subjects participated in this study. We obtained an area under the ROC curve (AUC) value of 0.861 and a kappa value of 0.521 in the Female 21-40 group (females aged 21-40 years), and an AUC value of 0.76 and a kappa value of 0.401 in the Female 41-60 group (females aged 41-60 years). In both groups, we found many features showing statistical differences between normal and overweight subjects by using an independent two-sample t-test. We demonstrated that it is possible to predict BMI status using facial characteristics. Our results provide useful information for studies of obesity and facial characteristics, and may provide useful clues in the development of applications for alternative diagnosis of obesity in remote healthcare.
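
    The two analysis steps named above, per-feature two-sample t-tests followed by a classifier scored with AUC, can be sketched on stand-in data (everything below is hypothetical except the subject count; the real features were facial measurements):

```python
import numpy as np
from scipy import stats
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Hypothetical stand-in data: facial features (columns) per subject, and a
# binary label (1 = overweight by BMI). Only the subject count matches the study.
rng = np.random.default_rng(1)
X = rng.normal(size=(688, 10))
y = ((X[:, 0] + 0.5 * X[:, 1] + rng.normal(size=688)) > 0).astype(int)

# Screening step: independent two-sample t-test per feature.
pvals = [stats.ttest_ind(X[y == 1, j], X[y == 0, j]).pvalue
         for j in range(X.shape[1])]
keep = [j for j, p in enumerate(pvals) if p < 0.05]

# Classification and AUC, the paper's headline metric.
Xtr, Xte, ytr, yte = train_test_split(X[:, keep], y, random_state=0)
clf = LogisticRegression().fit(Xtr, ytr)
print("AUC:", roc_auc_score(yte, clf.predict_proba(Xte)[:, 1]))
```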

  12. ContextProvider: Context awareness for medical monitoring applications.

    Science.gov (United States)

    Mitchell, Michael; Meyers, Christopher; Wang, An-I Andy; Tyson, Gary

    2011-01-01

    Smartphones are sensor-rich and Internet-enabled. With their on-board sensors, web services, social media, and external biosensors, smartphones can provide contextual information about the device, user, and environment, thereby enabling the creation of rich, biologically driven applications. We introduce ContextProvider, a framework that offers a unified, query-able interface to contextual data on the device. Unlike other context-based frameworks, ContextProvider offers interactive user feedback, self-adaptive sensor polling, and minimal reliance on third-party infrastructure. ContextProvider also allows for rapid development of new context and bio-aware applications. Evaluation of ContextProvider shows the incorporation of an additional monitoring sensor into the framework with fewer than 100 lines of Java code. With adaptive sensor monitoring, power consumption per sensor can be reduced down to 1% overhead. Finally, through the use of context, accuracy of data interpretation can be improved by up to 80%.

  13. Correlated random sampling for multivariate normal and log-normal distributions

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Kodeli, Ivan A.

    2012-01-01

    A method for correlated random sampling is presented. Representative samples for multivariate normal or log-normal distribution can be produced. Furthermore, any combination of normally and log-normally distributed correlated variables may be sampled to any requested accuracy. Possible applications of the method include sampling of resonance parameters which are used for reactor calculations.
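
    A standard way to realize such sampling, not necessarily the authors' exact algorithm, is to factor the covariance matrix by Cholesky decomposition, correlate independent standard normal draws, and exponentiate for the log-normal case:

```python
import numpy as np

def correlated_lognormal_samples(mu, cov, n, rng=None):
    """Draw correlated log-normal samples by exponentiating a correlated
    multivariate normal vector (Cholesky factorization of the covariance)."""
    rng = np.random.default_rng(rng)
    L = np.linalg.cholesky(cov)            # cov = L @ L.T
    z = rng.standard_normal((n, len(mu)))  # independent standard normals
    y = mu + z @ L.T                       # correlated normal samples
    return np.exp(y)                       # log-normal: X = exp(Y)

mu = np.array([0.0, 1.0])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])
x = correlated_lognormal_samples(mu, cov, 100_000, rng=0)
print(np.corrcoef(np.log(x).T))  # recovers the underlying normal correlation
```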

  14. OMARC: An online multimedia application for training health care providers in the assessment of respiratory conditions.

    Science.gov (United States)

    Meruvia-Pastor, Oscar; Patra, Pranjal; Andres, Karen; Twomey, Creina; Peña-Castillo, Lourdes

    2016-05-01

OMARC, a multimedia application designed to support the training of health care providers in the identification of common lung sounds heard in a patient's thorax as part of a health assessment, is described and its positive contribution to user learning is assessed. The main goal of OMARC is to effectively help health-care students become familiar with lung sounds as part of the assessment of respiratory conditions. In addition, the application must be easy to use and accessible to students and practitioners over the internet. OMARC was developed using an online platform to facilitate access for users in remote locations. OMARC's unique contribution as an educational software tool is that it presents a narrative about normal and abnormal lung sounds using interactive multimedia and sample case studies designed by professional health-care providers and educators. Its interface consists of two distinct components: a sounds glossary and a rich multimedia interface which presents clinical case studies and provides access to lung sounds placed on a model of a human torso. OMARC's contents can be extended through the addition of sounds and case studies designed by health-care educators and professionals. To validate OMARC, determine its efficacy in improving learning, and capture user perceptions about it, we performed a pilot study with ten nursing students. Participants' performance was measured through an evaluation of their ability to identify several normal and adventitious/abnormal sounds prior to and after exposure to OMARC. Results indicate that participants are able to better identify different lung sounds, going from an average of 63% (S.D. 18.3%) in the pre-test evaluation to an average of 90% (S.D. 11.5%) after practising with OMARC. Furthermore, participants indicated in a user satisfaction questionnaire that they found the application helpful and easy to use, and that they would recommend it to others in their field. OMARC is an online multimedia

  15. Normal Forms for Retarded Functional Differential Equations and Applications to Bogdanov-Takens Singularity

    Science.gov (United States)

    Faria, T.; Magalhaes, L. T.

The paper addresses, for retarded functional differential equations (FDEs), the computation of normal forms associated with the flow on a finite-dimensional invariant manifold tangent to invariant spaces for the infinitesimal generator of the linearized equation at a singularity. A phase space appropriate to the computation of these normal forms is introduced, and adequate nonresonance conditions for the computation of the normal forms are derived. As an application, the general situation of Bogdanov-Takens singularity and its versal unfolding for scalar retarded FDEs with nondegeneracy at second order is considered, both in the general case and in the case of differential-delay equations of the form ẋ(t) = f(x(t), x(t-1)).

  16. Does Normal Processing Provide Evidence of Specialised Semantic Subsystems?

    Science.gov (United States)

    Shapiro, Laura R.; Olson, Andrew C.

    2005-01-01

    Category-specific disorders are frequently explained by suggesting that living and non-living things are processed in separate subsystems (e.g. Caramazza & Shelton, 1998). If subsystems exist, there should be benefits for normal processing, beyond the influence of structural similarity. However, no previous study has separated the relative…

  17. Application of a truncated normal failure distribution in reliability testing

    Science.gov (United States)

    Groves, C., Jr.

    1968-01-01

The statistical truncated normal distribution function is applied as a time-to-failure distribution in equipment reliability estimation. The age-dependent characteristics of the truncated function provide a basis for formulating a system of high-reliability testing that effectively merges statistical, engineering, and cost considerations.
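
    A sketch of how a zero-truncated normal time-to-failure model yields age-dependent reliability quantities, with purely illustrative parameter values:

```python
import numpy as np
from scipy.stats import truncnorm

# Truncated normal as a time-to-failure model: lifetimes are normal with
# mean `mu` and s.d. `sigma`, truncated at zero (no negative lifetimes).
mu, sigma = 1000.0, 300.0            # illustrative values, in hours
a = (0.0 - mu) / sigma               # lower truncation bound in standard units
dist = truncnorm(a, np.inf, loc=mu, scale=sigma)

t = 800.0
reliability = dist.sf(t)             # P(T > t), survival function
hazard = dist.pdf(t) / dist.sf(t)    # age-dependent failure rate
print(f"R({t:.0f} h) = {reliability:.3f}, hazard = {hazard:.2e} per hour")
```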

  18. ArrayMining: a modular web-application for microarray analysis combining ensemble and consensus methods with cross-study normalization

    Directory of Open Access Journals (Sweden)

    Krasnogor Natalio

    2009-10-01

    Abstract Background Statistical analysis of DNA microarray data provides a valuable diagnostic tool for the investigation of genetic components of diseases. To take advantage of the multitude of available data sets and analysis methods, it is desirable to combine both different algorithms and data from different studies. Applying ensemble learning, consensus clustering and cross-study normalization methods for this purpose in an almost fully automated process, and linking different analysis modules together under a single interface, would simplify many microarray analysis tasks. Results We present ArrayMining.net, a web-application for microarray analysis that provides easy access to a wide choice of feature selection, clustering, prediction, gene set analysis and cross-study normalization methods. In contrast to other microarray-related web-tools, multiple algorithms and data sets for an analysis task can be combined using ensemble feature selection, ensemble prediction, consensus clustering and cross-platform data integration. By interlinking different analysis tools in a modular fashion, new exploratory routes become available, e.g. ensemble sample classification using features obtained from a gene set analysis and data from multiple studies. The analysis is further simplified by automatic parameter selection mechanisms and linkage to web tools and databases for functional annotation and literature mining. Conclusion ArrayMining.net is a free web-application for microarray analysis combining a broad choice of algorithms based on ensemble and consensus methods, using automatic parameter selection and integration with annotation databases.

  19. Normalized Excited Squeezed Vacuum State and Its Applications

    International Nuclear Information System (INIS)

    Meng Xiangguo; Wang Jisuo; Liang Baolong

    2007-01-01

By using the intermediate coordinate-momentum representation in quantum optics and a generating function for the normalization of the excited squeezed vacuum state (ESVS), the normalized ESVS is obtained. We find that the normalization constants obtained via two new methods coincide and take a new form, different from the result obtained by Zhang and Fan [Phys. Lett. A 165 (1992) 14]. By virtue of the normalization constant of the ESVS and the intermediate coordinate-momentum representation, the tomogram of the normalized ESVS and some useful formulae are derived.

  20. Speech Perception in Noise in Normally Hearing Children: Does Binaural Frequency Modulated Fitting Provide More Benefit than Monaural Frequency Modulated Fitting?

    Science.gov (United States)

    Mukari, Siti Zamratol-Mai Sarah; Umat, Cila; Razak, Ummu Athiyah Abdul

    2011-07-01

The aim of the present study was to compare the benefit of monaural versus binaural ear-level frequency modulated (FM) fitting on speech perception in noise in children with normal hearing. Reception threshold for sentences (RTS) was measured in no-FM, monaural FM, and binaural FM conditions in 22 normally developing children with bilateral normal hearing, aged 8 to 9 years. Data were gathered using the Pediatric Malay Hearing in Noise Test (P-MyHINT) with speech presented from the front and multi-talker babble presented from 90°, 180°, and 270° azimuths in a sound-treated booth. The results revealed that the use of either monaural or binaural ear-level FM receivers provided significantly better mean RTSs than the no-FM condition; however, binaural FM did not produce a significantly greater benefit in mean RTS than monaural fitting. The benefit of binaural over monaural FM varies across individuals; while binaural fitting provided better RTSs in about 50% of study subjects, there were those in whom binaural fitting resulted in either deterioration or no additional improvement compared to monaural FM fitting. The present study suggests that the use of monaural ear-level FM receivers in children with normal hearing might provide similar benefit to binaural use. Individual variation in binaural FM benefit over monaural FM suggests that the decision to employ monaural or binaural fitting should be individualized. It should be noted, however, that the current study recruited typically developing children with normal hearing. Future studies involving children with normal hearing who are at high risk of difficulty listening in noise are indicated, to see whether similar findings are obtained.

  1. Cointegration as a data normalization tool for structural health monitoring applications

    Science.gov (United States)

    Harvey, Dustin Y.; Todd, Michael D.

    2012-04-01

The structural health monitoring literature has shown an abundance of features sensitive to various types of damage in laboratory tests. However, robust feature extraction in the presence of varying operational and environmental conditions has proven to be one of the largest obstacles in the development of practical structural health monitoring systems. Cointegration, a technique adapted from the field of econometrics, has recently been introduced to the SHM field as one solution to the data normalization problem. Response measurements and feature histories often show long-run nonstationarity due to fluctuating temperature, load conditions, or other factors, which leads to false positives. Cointegration theory allows nonstationary trends common to two or more time series to be modeled and subsequently removed. Thus, the residual retains sensitivity to damage while its dependence on operational and environmental variability is removed. This study further explores the use of cointegration as a data normalization tool for structural health monitoring applications.
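
    In its simplest form the idea follows the Engle-Granger procedure: test two measurement histories for a shared stochastic trend, regress one on the other, and monitor the stationary residual, which stays sensitive to damage after the common trend is removed. A sketch on synthetic data:

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.tsa.stattools import coint

# Two hypothetical feature histories sharing a common (e.g. temperature-driven)
# nonstationary trend.
rng = np.random.default_rng(2)
trend = np.cumsum(rng.normal(size=2000))          # shared stochastic trend
x = 1.0 * trend + rng.normal(size=2000)
y = 0.5 * trend + rng.normal(size=2000)

t_stat, p_value, _ = coint(y, x)                  # Engle-Granger cointegration test
print(f"cointegration p-value: {p_value:.3f}")

# Cointegrating regression: the residual is stationary and retains
# sensitivity to damage while the common trend is removed.
fit = sm.OLS(y, sm.add_constant(x)).fit()
residual = y - fit.predict(sm.add_constant(x))
```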

  2. Noise study of all-normal dispersion supercontinuum sources for potential application in optical coherence tomography

    DEFF Research Database (Denmark)

    Bravo Gonzalo, Ivan; Engelsholm, Rasmus Dybbro; Bang, Ole

    2017-01-01

Despite their broad bandwidths, such sources are characterized by large intensity fluctuations, limiting their performance for applications in imaging such as optical coherence tomography (OCT). An approach to eliminate the influence of noise-sensitive effects is to use a so-called all-normal dispersion (ANDi) fiber, in which the dispersion is normal for all wavelengths of interest. Pumping these types of fibers with short enough femtosecond pulses allows suppression of stimulated Raman scattering (SRS), which is known to be as noisy a process as modulation instability (MI), and coherent SC is generated through self-phase modulation (SPM) and optical wave breaking (OWB). In this study, we show the importance of the pump laser and fiber parameters in the design of low-noise ANDi-based SC sources for application in OCT. We numerically investigate the pulse-to-pulse fluctuations of the SC, calculating the relative intensity noise

  3. The clinical application and nursing experience of adjustable shunt valve in treatment for patients with normal pressure hydrocephalus

    Directory of Open Access Journals (Sweden)

    YANG Li-rong

    2012-02-01

    Objective To introduce the application of the adjustable shunt valve in the treatment of patients with normal pressure hydrocephalus. Methods Twenty-four patients with normal pressure hydrocephalus underwent ventriculo-peritoneal shunt surgery with implantation of an adjustable shunt valve, together with nursing care. Results After operation, cerebrospinal fluid pressure was adjusted 0-6 (1.88 ± 1.52) times. Clinical symptoms were improved, especially gait disturbance. Conclusion Treatment of normal pressure hydrocephalus with an adjustable shunt valve can alleviate the symptoms of hydrocephalus. It is especially suitable for patients with a short disease course and for patients with secondary normal pressure hydrocephalus.

  4. Some Normal Intuitionistic Fuzzy Heronian Mean Operators Using Hamacher Operation and Their Application

    Directory of Open Access Journals (Sweden)

    Guofang Zhang

    2018-06-01

    Hamacher operation is a generalization of the algebraic and Einstein operations and expresses a family of binary operations in the unit interval [0,1]. The Heronian mean can deal with correlations of different criteria or input arguments and does not bring about repeated calculation. Normal intuitionistic fuzzy numbers (NIFNs) can depict normally distributed information in practical decision making. A decision-making problem was researched under the NIFN environment in this study, and a new multi-criteria group decision-making (MCGDM) approach is herein introduced on the basis of Hamacher operation. Firstly, according to Hamacher operation, some operational laws of NIFNs are presented. Secondly, it is noted that the Heronian mean not only takes into account the mutuality between the attribute values once, but also considers the correlation between an input argument and itself. Therefore, in order to aggregate NIFN information, we developed some operators and studied their properties. These operators include the Hamacher Heronian mean (NIFHHM), Hamacher weighted Heronian mean (NIFHWHM), Hamacher geometric Heronian mean (NIFHGHM), and Hamacher weighted geometric Heronian mean (NIFHWGHM). Furthermore, we applied the proposed operators to the MCGDM problem and developed a new MCGDM approach. The characteristics of this new approach are that: (1) it is suitable for making decisions under the NIFN environment and is more reasonable for aggregating normally distributed data; (2) it utilizes Hamacher operation to provide an effective and powerful MCGDM algorithm and to make more reliable and more flexible decisions under the NIFN circumstance; (3) it uses the Heronian mean operator to deal with interrelations between the attributes or input arguments, and it does not bring about repeated calculation. Therefore, the proposed method can describe the interaction of the different criteria or input arguments and offer some reasonable and reliable MCGDM aggregation operators.
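
    All four operators build on the classical Heronian mean, whose square-root cross terms are what capture pairwise interrelationships; a plain-number sketch (the fuzzy versions replace the products and roots with Hamacher operational laws):

```python
import math

def heronian_mean(values):
    """Classical Heronian mean: HM(a_1..a_n) = 2/(n(n+1)) * sum_{i<=j} sqrt(a_i a_j).

    The sqrt(a_i * a_j) cross terms are what let the operator capture pairwise
    interrelationships between aggregated arguments.
    """
    n = len(values)
    total = sum(math.sqrt(values[i] * values[j])
                for i in range(n) for j in range(i, n))
    return 2.0 * total / (n * (n + 1))

print(heronian_mean([0.2, 0.5, 0.8]))  # aggregate of three criterion values
```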

  5. Corticocortical feedback increases the spatial extent of normalization.

    Science.gov (United States)

    Nassi, Jonathan J; Gómez-Laberge, Camille; Kreiman, Gabriel; Born, Richard T

    2014-01-01

    Normalization has been proposed as a canonical computation operating across different brain regions, sensory modalities, and species. It provides a good phenomenological description of non-linear response properties in primary visual cortex (V1), including the contrast response function and surround suppression. Despite its widespread application throughout the visual system, the underlying neural mechanisms remain largely unknown. We recently observed that corticocortical feedback contributes to surround suppression in V1, raising the possibility that feedback acts through normalization. To test this idea, we characterized area summation and contrast response properties in V1 with and without feedback from V2 and V3 in alert macaques and applied a standard normalization model to the data. Area summation properties were well explained by a form of divisive normalization, which computes the ratio between a neuron's driving input and the spatially integrated activity of a "normalization pool." Feedback inactivation reduced surround suppression by shrinking the spatial extent of the normalization pool. This effect was independent of the gain modulation thought to mediate the influence of contrast on area summation, which remained intact during feedback inactivation. Contrast sensitivity within the receptive field center was also unaffected by feedback inactivation, providing further evidence that feedback participates in normalization independent of the circuit mechanisms involved in modulating contrast gain and saturation. These results suggest that corticocortical feedback contributes to surround suppression by increasing the visuotopic extent of normalization and, via this mechanism, feedback can play a critical role in contextual information processing.
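
    The model referred to here divides each neuron's driving input by the pooled activity of its normalization pool; the reported feedback effect corresponds to shrinking the spatial extent (row width) of the pool weights. A minimal sketch with illustrative weights:

```python
import numpy as np

def divisive_normalization(drive, pool_weights, sigma=1.0):
    """R_i = d_i / (sigma + sum_j w_ij d_j): each response is divided by the
    weighted activity of its normalization pool. Wider rows of w correspond
    to a spatially larger pool (stronger surround suppression)."""
    return drive / (sigma + pool_weights @ drive)

drive = np.array([1.0, 2.0, 4.0])       # feedforward drive to three neurons
W = np.full((3, 3), 0.2)                # illustrative pool weights
print(divisive_normalization(drive, W))
```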

  6. The Application of Normal Stress Reduction Function in Tilt Tests for Different Block Shapes

    Science.gov (United States)

    Kim, Dong Hyun; Gratchev, Ivan; Hein, Maw; Balasubramaniam, Arumugam

    2016-08-01

This paper focuses on the influence of the shapes of rock cores, which control the sliding or toppling behaviour in tilt tests for the estimation of rock joint roughness coefficients (JRC). When JRC values are estimated by performing tilt tests, the values are directly proportional to the basic friction of the rock material and the applied normal stress on the sliding planes. The normal stress obviously varies with the shape of the sliding block, and the basic friction angle is also affected by the sample shape in tilt tests. In this study, the shapes of core blocks are classified into three representative shapes, and these are created using plaster. Using the various shaped artificial cores, a set of tilt tests is carried out to identify the shape influences on the normal stress and the basic friction angle in tilt tests. From the test results, a normal stress reduction function is proposed to estimate the normal stress for tilt tests according to the sample shape, based on Barton's empirical equation. The proposed normal stress reduction functions are verified by tilt tests using artificial plaster joints and real rock joint sets. The plaster joint sets are well matched and cast in detailed printed moulds using a 3D printing technique. With the application of the functions, the JRC values obtained from the tilt tests using the plaster samples and the natural rock samples are distributed within a reasonable JRC range when compared with the measured values.
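
    Barton's empirical tilt-test relation back-calculates JRC from the tilt angle at which sliding starts; the normal stress entering it is what the proposed shape-dependent reduction function adjusts. A sketch with illustrative numbers:

```python
import math

def jrc_from_tilt_test(alpha_deg, phi_b_deg, jcs, sigma_n0):
    """Barton's back-calculation of joint roughness from a tilt test:

        JRC = (alpha - phi_b) / log10(JCS / sigma_n0)

    alpha_deg : tilt angle at which sliding starts (degrees)
    phi_b_deg : basic friction angle of the rock material (degrees)
    jcs       : joint wall compressive strength (same units as sigma_n0)
    sigma_n0  : normal stress acting on the joint when sliding starts
    """
    return (alpha_deg - phi_b_deg) / math.log10(jcs / sigma_n0)

def normal_stress_on_plane(weight_n, area_m2, alpha_deg):
    """Normal stress from block weight on the tilted sliding plane (Pa);
    this is the quantity the shape-dependent reduction function adjusts."""
    return weight_n * math.cos(math.radians(alpha_deg)) / area_m2

print(jrc_from_tilt_test(62.0, 30.0, jcs=50e6, sigma_n0=2.0e3))  # ~7.3
```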

  7. Application of specific gravity method for normalization of urinary excretion rates of radionuclides

    International Nuclear Information System (INIS)

    Thakur, Smita S.; Yadav, J.R.; Rao, D.D.

    2015-01-01

In vitro bioassay monitoring is based on the determination of activity concentration in biological samples excreted from the body and is most suitable for alpha and beta emitters. For occupational workers handling actinides in reprocessing facilities, the possibility of internal exposure exists, and urine assay is the preferred method for monitoring such exposure. A urine sample collected over 24 h is the true representative bioassay sample; hence, when the collection time is insufficient, specific-gravity-based normalization of the urine sample is used. The present study reports specific gravity data generated for a control group of the Indian population using a densitometer, and its application in normalizing urinary sample activity. The average specific gravity value obtained for the control group was 1.008±0.005 g/ml. (author)
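
    A common form of the specific-gravity correction (assumed here to be the usual excess-density ratio; the paper's exact variant may differ) scales the measured activity using the population mean specific gravity as the reference:

```python
def sg_normalized_excretion(measured_activity, sample_sg, reference_sg=1.008):
    """Scale a urine activity measurement to a reference specific gravity.

    Assumed correction form (the standard excess-density ratio):
        A_norm = A_measured * (SG_ref - 1) / (SG_sample - 1)
    """
    return measured_activity * (reference_sg - 1.0) / (sample_sg - 1.0)

# Dilute sample (SG 1.004): the correction scales the result up.
print(sg_normalized_excretion(0.5, sample_sg=1.004))  # 1.0, in the same units
```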

  8. Computing Instantaneous Frequency by normalizing Hilbert Transform

    Science.gov (United States)

    Huang, Norden E.

    2005-05-31

This invention presents the Normalized Amplitude Hilbert Transform (NAHT) and the Normalized Hilbert Transform (NHT), both of which are new methods for computing instantaneous frequency. The method is designed specifically to circumvent the limitations set by the Bedrosian and Nuttall theorems, and to provide a sharp local measure of error when the quadrature and the Hilbert transform do not agree. The motivation for this method is that straightforward application of the Hilbert transform, followed by taking the derivative of the phase angle as the instantaneous frequency (IF), leads to a common mistake made up to this date. In order to make the Hilbert transform method work, the data have to obey certain restrictions.
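
    A simplified sketch of the idea: flatten the amplitude envelope before extracting the phase so that the Bedrosian conditions are better satisfied. Huang's actual normalization uses a spline envelope through the signal maxima; the version below substitutes the analytic-signal envelope for brevity:

```python
import numpy as np
from scipy.signal import hilbert

def normalized_instantaneous_frequency(x, fs, n_iter=3):
    """Normalize the amplitude envelope toward unity, then take the phase
    derivative of the analytic signal as the instantaneous frequency (Hz)."""
    y = x.astype(float)
    for _ in range(n_iter):                 # iterative amplitude normalization
        env = np.abs(hilbert(y))
        y = y / np.maximum(env, 1e-12)
    phase = np.unwrap(np.angle(hilbert(y)))
    return np.gradient(phase) * fs / (2 * np.pi)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
# Amplitude-modulated chirp: true IF ramps from 50 Hz to 90 Hz.
x = (1 + 0.3 * np.sin(2 * np.pi * 2 * t)) * np.cos(2 * np.pi * (50 * t + 20 * t**2))
print(normalized_instantaneous_frequency(x, fs)[100:105])  # ~54 Hz near t = 0.1 s
```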

  9. Improving Mobile Phone Speech Recognition by Personalized Amplification: Application in People with Normal Hearing and Mild-to-Moderate Hearing Loss.

    Science.gov (United States)

    Kam, Anna Chi Shan; Sung, John Ka Keung; Lee, Tan; Wong, Terence Ka Cheong; van Hasselt, Andrew

    In this study, the authors evaluated the effect of personalized amplification on mobile phone speech recognition in people with and without hearing loss. This prospective study used double-blind, within-subjects, repeated measures, controlled trials to evaluate the effectiveness of applying personalized amplification based on the hearing level captured on the mobile device. The personalized amplification settings were created using modified one-third gain targets. The participants in this study included 100 adults of age between 20 and 78 years (60 with age-adjusted normal hearing and 40 with hearing loss). The performance of the participants with personalized amplification and standard settings was compared using both subjective and speech-perception measures. Speech recognition was measured in quiet and in noise using Cantonese disyllabic words. Subjective ratings on the quality, clarity, and comfortableness of the mobile signals were measured with an 11-point visual analog scale. Subjective preferences of the settings were also obtained by a paired-comparison procedure. The personalized amplification application provided better speech recognition via the mobile phone both in quiet and in noise for people with hearing impairment (improved 8 to 10%) and people with normal hearing (improved 1 to 4%). The improvement in speech recognition was significantly better for people with hearing impairment. When the average device output level was matched, more participants preferred to have the individualized gain than not to have it. The personalized amplification application has the potential to improve speech recognition for people with mild-to-moderate hearing loss, as well as people with normal hearing, in particular when listening in noisy environments.

  10. DOSEFU: Computer application for dose calculation and effluent management in normal operation

    International Nuclear Information System (INIS)

    Martin Garcia, J. E.; Gonzalvo Manovel, A.; Revuelta Garcia, L.

    2002-01-01

DOSEFU is a Windows computer application that implements the methodology of nuclear power plant Exterior Dose Calculation Manuals (Manuales de Cálculo de Dosis al Exterior, MACADE) for calculating doses in normal operation caused by radioactive liquid and gaseous effluents, for the purpose of enforcing the new Spanish Regulation on Health Protection against Ionizing Radiations, Royal Decree 783/2001, which results from transposition of Directive 96/29/Euratom, whereby the basic rules regarding health protection of workers and the population against risks resulting from ionizing radiations are established. In addition to making dose calculations, DOSEFU generates, on a magnetic support, the information regarding radioactive liquid and gaseous effluents that plants must periodically send to the CSN (ELGA format). The computer application has been developed for the specific case of Jose Cabrera NPP, where it is called DOEZOR; it can be easily implemented in any other nuclear or radioactive facility. The application is user-friendly: the end user inputs data and executes the different modules through keys and dialogue boxes that are enabled by clicking the mouse (see figures 2, 3, 4 and 5). The application runs under Windows 95. Digital Visual Fortran was used as the development tool; since it does not require additional libraries (DLLs), the application can be installed on any computer without affecting other programs already installed. (Author)

  11. Application of normal form methods to the analysis of resonances in particle accelerators

    International Nuclear Information System (INIS)

    Davies, W.G.

    1992-01-01

    The transformation to normal form in a Lie-algebraic framework provides a very powerful method for identifying and analysing non-linear behaviour and resonances in particle accelerators. The basic ideas are presented and illustrated. (author). 4 refs

  12. System and Method for Providing a Climate Data Analytic Services Application Programming Interface Distribution Package

    Science.gov (United States)

    Schnase, John L. (Inventor); Duffy, Daniel Q. (Inventor); Tamkin, Glenn S. (Inventor)

    2016-01-01

    A system, method and computer-readable storage devices for providing a climate data analytic services application programming interface distribution package. The example system can provide various components. The system provides a climate data analytic services application programming interface library that enables software applications running on a client device to invoke the capabilities of a climate data analytic service. The system provides a command-line interface that provides a means of interacting with a climate data analytic service by issuing commands directly to the system's server interface. The system provides sample programs that call on the capabilities of the application programming interface library and can be used as templates for the construction of new client applications. The system can also provide test utilities, build utilities, service integration utilities, and documentation.

  13. High power coupler issues in normal conducting and superconducting accelerator applications

    Energy Technology Data Exchange (ETDEWEB)

Matsumoto, H. [High Energy Accelerator Research Organization, Tsukuba, Ibaraki (Japan)]

    2001-02-01

The ceramic material (Al₂O₃) commonly used for the klystron output coupler in normal conducting applications, and for the input coupler to superconducting cavities, is one of the most troublesome parts in accelerator applications. Its performance can be improved considerably by starting with high-purity (>99.9%) alumina powder of controlled grain size (0.1-0.5 µm) and reducing the magnesium (Mg) sintering binder, lowering the dielectric loss to the order of 10⁻⁴ at S-band frequencies. It has been confirmed that the new ceramic can withstand a peak S-band rf power of up to 300 MW at 2.5 µs pulse width. (author)

  14. Utilization of Smartphone Applications by Anesthesia Providers

    Directory of Open Access Journals (Sweden)

    Michael S. Green

    2018-01-01

    Health care-related apps provide valuable facts and have added a new dimension to knowledge sharing. The purpose of this study is to understand the pattern of utilization of mobile apps specifically created for anesthesia providers. Smartphone app stores were searched, and a survey was sent to 416 anesthesia providers at 136 anesthesiology residency programs querying specific facets of application use. Among respondents, 11.4% never used, 12.4% used less than once per month, 6.0% used once per month, 12.1% used 2-3 times per month, 13.6% used once per week, 21% used 2-3 times per week, and 23.5% used daily. Dosage/pharmaceutical apps were rated the highest as most useful. 24.6% of the participants would pay less than $2.00, 25.1% would pay $5.00, 30.3% would pay $5–$10.00, 9.6% would pay $10–$25.00, 5.1% would pay $25–$50.00, and 5.1% would pay more than $50.00 if an app saves 5–10 minutes per day or 30 minutes/week. The use of mobile phone apps is not limited to reiterating information from textbooks but provides opportunities to further the ever-changing field of anesthesiology. Our survey illustrates the convenience of apps for health care professionals. Providers must exercise caution when selecting apps to ensure best evidence-based medicine.

  15. Topological resilience in non-normal networked systems

    Science.gov (United States)

    Asllani, Malbor; Carletti, Timoteo

    2018-04-01

    The network of interactions in complex systems strongly influences their resilience and the system capability to resist external perturbations or structural damages and to promptly recover thereafter. The phenomenon manifests itself in different domains, e.g., parasitic species invasion in ecosystems or cascade failures in human-made networks. Understanding the topological features of the networks that affect the resilience phenomenon remains a challenging goal for the design of robust complex systems. We hereby introduce the concept of non-normal networks, namely networks whose adjacency matrices are non-normal, propose a generating model, and show that such a feature can drastically change the global dynamics through an amplification of the system response to exogenous disturbances and eventually impact the system resilience. This early stage transient period can induce the formation of inhomogeneous patterns, even in systems involving a single diffusing agent, providing thus a new kind of dynamical instability complementary to the Turing one. We provide, first, an illustrative application of this result to ecology by proposing a mechanism to mute the Allee effect and, second, we propose a model of virus spreading in a population of commuters moving using a non-normal transport network, the London Tube.
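
    Non-normality here means the adjacency matrix does not commute with its transpose. A minimal numerical illustration of how this allows transient amplification even with a perfectly stable spectrum:

```python
import numpy as np

# A directed chain is a minimal non-normal network: node i feeds node i+1
# with no reciprocal link, so A does not commute with its transpose.
n = 10
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = 1.0

print("normal?", np.allclose(A @ A.T - A.T @ A, 0))   # False -> non-normal
# All eigenvalues are zero (spectral abscissa 0), yet the numerical abscissa,
# the largest eigenvalue of (A + A^T)/2, is positive: the signature of the
# transient amplification of perturbations discussed above.
print(np.linalg.eigvals(A).real.max())                # 0.0
print(np.linalg.eigvalsh((A + A.T) / 2).max())        # ~0.96 > 0
```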

  16. Syntactic error modeling and scoring normalization in speech recognition: Error modeling and scoring normalization in the speech recognition task for adult literacy training

    Science.gov (United States)

    Olorenshaw, Lex; Trawick, David

    1991-01-01

The purpose was to develop a speech recognition system able to detect speech that is pronounced incorrectly, given that the text of the spoken speech is known to the recognizer. Better mechanisms are provided for using speech recognition in a literacy tutor application. Using a combination of scoring normalization techniques and cheater-mode decoding, a reasonable acceptance/rejection threshold was provided. In continuous speech, the system was able to provide above 80% correct acceptance of words, while correctly rejecting over 80% of incorrectly pronounced words.

  17. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    International Nuclear Information System (INIS)

    Žerovnik, Gašper; Trkov, Andrej; Smith, Donald L.; Capote, Roberto

    2013-01-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue
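
    For a bivariate lognormal pair the transformation has a closed form, and it makes the forbidden anti-correlation concrete:

```python
import numpy as np

def lognormal_correlation(rho_y, s1, s2):
    """Correlation of X1 = exp(Y1), X2 = exp(Y2) for (Y1, Y2) bivariate normal
    with correlation rho_y and standard deviations s1, s2:

        rho_x = (exp(rho_y*s1*s2) - 1) / sqrt((exp(s1^2)-1)*(exp(s2^2)-1))
    """
    return np.expm1(rho_y * s1 * s2) / np.sqrt(np.expm1(s1**2) * np.expm1(s2**2))

# With large relative uncertainties, rho_x = -1 is unreachable: the most
# negative achievable lognormal correlation is bounded away from -1.
s = 1.0   # large relative uncertainty for the lognormal variable
print(lognormal_correlation(-1.0, s, s))   # ~ -0.368, not -1
print(lognormal_correlation(+1.0, s, s))   # +1 is attainable
```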

  18. Transformation of correlation coefficients between normal and lognormal distribution and implications for nuclear applications

    Energy Technology Data Exchange (ETDEWEB)

    Žerovnik, Gašper, E-mail: gasper.zerovnik@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Trkov, Andrej, E-mail: andrej.trkov@ijs.si [Jožef Stefan Institute, Jamova cesta 39, SI-1000 Ljubljana (Slovenia); Smith, Donald L., E-mail: donald.l.smith@anl.gov [Argonne National Laboratory, 1710 Avenida del Mundo, Coronado, CA 92118-3073 (United States); Capote, Roberto, E-mail: roberto.capotenoy@iaea.org [NAPC–Nuclear Data Section, International Atomic Energy Agency, PO Box 100, Vienna-A-1400 (Austria)

    2013-11-01

    Inherently positive parameters with large relative uncertainties (typically ≳30%) are often considered to be governed by the lognormal distribution. This assumption has the practical benefit of avoiding the possibility of sampling negative values in stochastic applications. Furthermore, it is typically assumed that the correlation coefficients for comparable multivariate normal and lognormal distributions are equivalent. However, this ideal situation is approached only in the linear approximation which happens to be applicable just for small uncertainties. This paper derives and discusses the proper transformation of correlation coefficients between both distributions for the most general case which is applicable for arbitrary uncertainties. It is seen that for lognormal distributions with large relative uncertainties strong anti-correlations (negative correlations) are mathematically forbidden. This is due to the asymmetry that is an inherent feature of these distributions. Some implications of these results for practical nuclear applications are discussed and they are illustrated with examples in this paper. Finally, modifications to the ENDF-6 format used for representing uncertainties in evaluated nuclear data libraries are suggested, as needed to deal with this issue.

  19. Technical Note: The normal quantile transformation and its application in a flood forecasting system

    Directory of Open Access Journals (Sweden)

    K. Bogner

    2012-04-01

    The Normal Quantile Transform (NQT) has been used in many hydrological and meteorological applications in order to make the Cumulative Distribution Function (CDF) of observed, simulated and forecast river discharge, water level or precipitation data Gaussian. It is also the heart of the meta-Gaussian model for assessing the total predictive uncertainty of the Hydrological Uncertainty Processor (HUP) developed by Krzysztofowicz. In the field of geostatistics this transformation is better known as the Normal-Score Transform. In this paper some possible problems caused by small sample sizes when applying the NQT in flood forecasting systems are discussed, and a novel way to solve the problem is outlined, combining extreme value analysis and non-parametric regression methods. The method is illustrated by examples of hydrological stream-flow forecasts.
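
    In its empirical form the NQT is simply: rank the data, convert ranks to plotting positions, and apply the inverse standard normal CDF; the small-sample problems addressed in the paper arise in the extreme tails of this mapping. A minimal sketch:

```python
import numpy as np
from scipy.stats import norm, rankdata

def normal_quantile_transform(x):
    """Map a sample to standard-normal scores via its empirical CDF
    (Weibull plotting positions r/(n+1) avoid infinite quantiles)."""
    ranks = rankdata(x)                 # 1..n, ties averaged
    p = ranks / (len(x) + 1.0)          # empirical non-exceedance probabilities
    return norm.ppf(p)                  # inverse standard-normal CDF

q = np.random.default_rng(3).gamma(2.0, 500.0, size=1000)  # skewed discharges
z = normal_quantile_transform(q)
print(z.mean(), z.std())   # ~0 and ~1: the transformed sample is Gaussian
```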

  1. Normal CT anatomy of the calcaneus

    International Nuclear Information System (INIS)

    Lee, Mun Gyu; Kang, Heung Sik

    1986-01-01

Normal sectional anatomy of the calcaneus on multiplanar CT examination was studied in 5 volunteers as the background for interpretation of various abnormalities. Three major sectional planes (plantar, coronal and sagittal) and an additional tuberosity plane are described. With CT examination of the calcaneus: 1. the detailed anatomy of the 3 facets of the subtalar joint (anterior, middle and posterior facets) can be well visualized; 2. its clinical applications in tarsal trauma, tarsal coalition, subtalar infection, degenerative arthritis, club foot, pes planus and tarsal tumors could provide much more information than that obtained by conventional radiographic studies.

  2. Intuitionistic Fuzzy Normalized Weighted Bonferroni Mean and Its Application in Multicriteria Decision Making

    Directory of Open Access Journals (Sweden)

    Wei Zhou

    2012-01-01

    The Bonferroni mean (BM) was introduced by Bonferroni six decades ago but has become a hot research topic recently because of its usefulness in aggregation techniques. The desirable characteristic of the BM is its capability to capture the interrelationship between input arguments. However, the classical BM and GBM ignore the weight vector of the aggregated arguments, the general weighted BM (WBM) lacks reducibility, and the revised generalized weighted BM (GWBM) cannot reflect the interrelationship between an individual criterion and the other criteria. To deal with these issues, in this paper we propose the normalized weighted Bonferroni mean (NWBM) and the generalized normalized weighted Bonferroni mean (GNWBM) and study their desirable properties, such as reducibility, idempotency, monotonicity, and boundedness. Furthermore, we investigate the NWBM and GNWBM operators under the intuitionistic fuzzy environment, which is a more common phenomenon in modern life, and develop two new intuitionistic fuzzy aggregation operators based on the NWBM and GNWBM, that is, the intuitionistic fuzzy normalized weighted Bonferroni mean (IFNWBM) and the generalized intuitionistic fuzzy normalized weighted Bonferroni mean (GIFNWBM). Finally, based on the GIFNWBM, we propose an approach to multicriteria decision making under the intuitionistic fuzzy environment, and a practical example is provided to illustrate our results.
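
    All the variants discussed build on the classical Bonferroni mean; a plain-number sketch (the intuitionistic fuzzy versions replace the products and powers with fuzzy operational laws):

```python
def bonferroni_mean(values, p=1.0, q=1.0):
    """Classical Bonferroni mean:

        BM^{p,q}(a_1..a_n) = ( 1/(n(n-1)) * sum_{i != j} a_i^p a_j^q )^(1/(p+q))

    The a_i^p * a_j^q cross terms (i != j) are what capture interrelationships
    between distinct input arguments.
    """
    n = len(values)
    s = sum(values[i] ** p * values[j] ** q
            for i in range(n) for j in range(n) if i != j)
    return (s / (n * (n - 1))) ** (1.0 / (p + q))

print(bonferroni_mean([0.3, 0.6, 0.9]))  # lies between min and max (boundedness)
```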

  3. Estimation of value at risk and conditional value at risk using normal mixture distributions model

    Science.gov (United States)

    Kamaruzzaman, Zetty Ain; Isa, Zaidi

    2013-04-01

The normal mixture distributions model has been successfully applied in financial time series analysis. In this paper, we estimate the return distribution, value at risk (VaR) and conditional value at risk (CVaR) for monthly and weekly rates of return of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (FBMKLCI) from July 1990 until July 2010, using the two-component univariate normal mixture distributions model. First, we present the application of the normal mixture distributions model in empirical finance, where we fit our real data. Second, we present the application of the normal mixture distributions model in risk analysis, where we apply it to evaluate the value at risk (VaR) and conditional value at risk (CVaR), with model validation for both risk measures. The empirical results provide evidence that the two-component normal mixture distributions model fits the data well and performs better in estimating VaR and CVaR, capturing the stylized facts of non-normality and leptokurtosis in the returns distribution.
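
    A sketch of the workflow with synthetic two-regime returns standing in for the index data; VaR and CVaR are read off the fitted mixture by simulating from it:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical returns; in the paper this would be FBMKLCI weekly/monthly data.
rng = np.random.default_rng(4)
returns = np.concatenate([rng.normal(0.002, 0.01, 800),     # calm regime
                          rng.normal(-0.004, 0.04, 200)])   # turbulent regime

gm = GaussianMixture(n_components=2, random_state=0).fit(returns.reshape(-1, 1))

# VaR/CVaR at the 95% level from the fitted mixture, via simulation.
sims, _ = gm.sample(200_000)
sims = sims.ravel()
var95 = -np.quantile(sims, 0.05)           # value at risk (a loss, so negated)
cvar95 = -sims[sims <= -var95].mean()      # expected loss beyond the VaR
print(f"VaR(95%) = {var95:.4f}, CVaR(95%) = {cvar95:.4f}")
```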

  4. Normal tissue dose-effect models in biological dose optimisation

    International Nuclear Information System (INIS)

    Alber, M.

    2008-01-01

Sophisticated radiotherapy techniques like intensity-modulated radiotherapy with photons and protons rely on numerical dose optimisation. The evaluation of normal tissue dose distributions that deviate significantly from common clinical routine, and the mathematical expression of desirable properties of a dose distribution, are difficult. In essence, a dose evaluation model for normal tissues has to express the tissue-specific volume effect. A formalism of local dose-effect measures is presented, which can be applied to serially and parallel responding tissues as well as target volumes and physical dose penalties. These models allow a transparent description of the volume effect and efficient control over the optimum dose distribution. They can be linked to normal tissue complication probability models and the equivalent uniform dose concept. In clinical applications, they provide a means to standardize normal tissue doses in the face of inevitable anatomical differences between patients, and a vastly increased freedom to shape the dose, without being overly limiting like sets of dose-volume constraints. (orig.)
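
    The equivalent uniform dose concept mentioned here has a standard closed form (Niemierko's generalized EUD), in which a single parameter encodes the tissue-specific volume effect:

```python
import numpy as np

def generalized_eud(doses, volumes, a):
    """Generalized equivalent uniform dose (gEUD):

        gEUD = ( sum_i v_i * D_i^a )^(1/a)

    doses   : dose levels of a differential DVH (Gy)
    volumes : fractional volumes receiving each dose (normalized to sum to 1)
    a       : volume-effect parameter (a >> 1 serial tissue, a ~ 1 parallel)
    """
    v = np.asarray(volumes, dtype=float) / np.sum(volumes)
    return float(np.sum(v * np.asarray(doses, dtype=float) ** a) ** (1.0 / a))

dvh_dose = [10, 30, 50, 60]          # Gy
dvh_vol = [0.4, 0.3, 0.2, 0.1]
print(generalized_eud(dvh_dose, dvh_vol, a=8))   # serial-like: near the max dose
print(generalized_eud(dvh_dose, dvh_vol, a=1))   # parallel-like: the mean dose
```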

  5. Novel Electrolyzer Applications: Providing More Than Just Hydrogen

    Energy Technology Data Exchange (ETDEWEB)

    Eichman, J.; Harrison, K.; Peters, M.

    2014-09-01

    Hydrogen can be used for many different applications and can be integrated into many different system architectures. One method for producing the hydrogen is to use an electrolyzer. This work explores the flexibility of electrolyzers to behave as responsive loads. Experimental tests were performed on a proton exchange membrane (PEM) electrolyzer and an alkaline electrolyzer, and the results are compared to the operational requirements for participating in end-user facility energy management, transmission and distribution system support, and wholesale electricity market services. Electrolyzers begin changing their electricity demand within milliseconds of a set-point change, and the settling time after a set-point change is on the order of seconds. It took 6.5 minutes for the PEM unit to execute a cold start and 1 minute to turn off. In addition, a frequency disturbance correction test was performed, in which electrolyzers were able to accelerate the restoration of grid frequency. Electrolyzers acting as demand response devices can respond sufficiently fast, and for a long enough duration, to participate in all of the applications explored. Furthermore, electrolyzers can be operated to support a variety of applications while also providing hydrogen for industrial processes, transportation fuel, or heating fuel. Favorable operating properties and a variety of potential system architectures further showcase the flexibility of electrolyzer systems.

  6. An Evaluation of Mobile Applications for Reproductive Endocrinology and Infertility Providers.

    Science.gov (United States)

    Shaia, Kathryn L; Farag, Sara; Chyjek, Kathy; Knopman, Jaime; Chen, Katherine T

    2017-03-01

    To identify and rate reproductive endocrinology and infertility (REI) mobile applications (apps) targeted toward REI providers. A list of REI apps was found in both the Apple iTunes and Google Play stores using the following seven MeSH terms: reproductive endocrinology, REI, infertility, fertility, In Vitro Fertilization, IVF, and embryology. Patient-centered apps were excluded. The remaining apps were then evaluated for accuracy using reliable references. Setting: mobile technology. Interventions: none. Accurate apps were evaluated for comprehensiveness (the extent of their ability to aid in clinical decision-making) and rated with objective and subjective components using the APPLICATIONS scoring system. Using the seven REI-related MeSH terms, 985 apps and 1,194 apps were identified in the Apple iTunes and Google Play stores, respectively. Of these unique apps, only 20 remained after excluding patient-centered apps. Upon further review for applicability to REI specifically and for content accuracy, only seven apps remained; these seven apps were then rated using the APPLICATIONS scoring system. Only 0.32% of the 2,179 apps reviewed for this study were useful to REI providers. There is potential for further mobile resource development in the area of REI, given the limited number and varying comprehensiveness and quality of available apps.

  7. A Mathematical Framework for Critical Transitions: Normal Forms, Variance and Applications

    Science.gov (United States)

    Kuehn, Christian

    2013-06-01

    Critical transitions occur in a wide variety of applications including mathematical biology, climate change, human physiology and economics. It is therefore highly desirable to find early-warning signs. We show that it is possible to classify critical transitions by using bifurcation theory and normal forms in the singular limit. Based on this elementary classification, we analyze stochastic fluctuations and calculate scaling laws of the variance of stochastic sample paths near critical transitions for fast-subsystem bifurcations up to codimension two. The theory is applied to several models: the Stommel-Cessi box model for the thermohaline circulation from geoscience, an epidemic-spreading model on an adaptive network, an activator-inhibitor switch from systems biology, a predator-prey system from ecology and the Euler buckling problem from classical mechanics. For the Stommel-Cessi model we compare different detrending techniques to calculate early-warning signs. In the epidemics model we show that link densities could be better variables for prediction than population densities. The activator-inhibitor switch demonstrates effects in three-time-scale systems and points out that excitable cells and molecular units carry information for subthreshold prediction. In the predator-prey model, explosive population growth near a codimension-two bifurcation is investigated, and we show that early-warning signs derived from normal forms can be misleading in this context. In the biomechanical model we demonstrate that early-warning signs for buckling depend crucially on the control strategy near the instability, which illustrates the effect of multiplicative noise.
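
    A minimal numerical illustration of the variance scaling described here, using a generic fold (saddle-node) normal form rather than any of the paper's models:

        # Sketch: variance-based early-warning sign for a fold normal form,
        #   dx = (p(t) - x^2) dt + sigma dW,  p(t) drifting slowly toward p = 0.
        # Linearizing about the stable branch x* = sqrt(p) gives a stationary
        # variance sigma^2 / (4 sqrt(p)), i.e. the variance grows like p^(-1/2).
        import numpy as np

        rng = np.random.default_rng(1)
        dt, sigma = 1e-3, 0.02
        t = np.arange(0.0, 40.0, dt)
        p = 1.0 - 0.02 * t                 # slow drift toward the fold at p = 0
        x = np.empty_like(t)
        x[0] = 1.0                         # start on the stable branch
        for k in range(len(t) - 1):
            x[k + 1] = (x[k] + (p[k] - x[k]**2) * dt
                        + sigma * np.sqrt(dt) * rng.standard_normal())

        win = int(2.0 / dt)                # sliding-window variance as indicator
        var = np.array([x[k - win:k].var() for k in range(win, len(t), win)])
        print(np.round(var, 6))            # variance rises as p(t) shrinks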

  8. 29 CFR 37.38 - What information must grant applicants and recipients provide to CRC?

    Science.gov (United States)

    2010-07-01

    In addition to the information which must be collected, maintained, and, upon request, submitted to CRC under § 37.37: (a) Each grant applicant and...

  9. Providing Assistive Technology Applications as a Service Through Cloud Computing.

    Science.gov (United States)

    Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio

    2015-01-01

    Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, or in an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.

  10. Measurement of normal auditory ossicles by high-resolution CT with application of normal criteria to disease cases

    International Nuclear Information System (INIS)

    Hara, Jyoko

    1988-01-01

    The purposes of this study were to define criteria for the normal position of the ossicles and to apply them in patients with rhinolaryngologically or pathologically confirmed diseases. Ossicles were measured on high-resolution CT images of 300 middle ears, including 241 normal ears and 59 diseased ears, in a total of 203 subjects. Angles A, B, and C relative to the baseline between the most lateral margins of the bilateral internal auditory canals, and the distance ratio b/a, were defined as measurement items. Normal angles A, B, and C and distance ratio b/a ranged from 19 deg to 59 deg, 101 deg to 145 deg, 51 deg to 89 deg, and 0.49 to 0.51, respectively. Based on these criteria, all items were within the normal range in 30/34 (88.2%) ears with otitis media and mastoiditis. One or more items deviated by more than 3 standard deviations in 5/7 (71.4%) ears with cholesteatoma and 4/4 (100%) ears with external ear anomaly. These normal measurements may aid in evaluating the position of the auditory ossicles, especially in cases of cholesteatoma and auditory ossicle abnormality. (Namekawa, K.)

  11. Measurement of normal auditory ossicles by high-resolution CT with application of normal criteria to disease cases

    Energy Technology Data Exchange (ETDEWEB)

    Hara, Jyoko

    1988-09-01

    The purposes of this study were to define criteria for the normal position of the ossicles and to apply them in patients with rhinolaryngologically or pathologically confirmed diseases. Ossicles were measured on high-resolution CT images of 300 middle ears, including 241 normal ears and 59 diseased ears, in a total of 203 subjects. Angles A, B, and C relative to the baseline between the most lateral margins of the bilateral internal auditory canals, and the distance ratio b/a, were defined as measurement items. Normal angles A, B, and C and distance ratio b/a ranged from 19 deg to 59 deg, 101 deg to 145 deg, 51 deg to 89 deg, and 0.49 to 0.51, respectively. Based on these criteria, all items were within the normal range in 30/34 (88.2%) ears with otitis media and mastoiditis. One or more items deviated by more than 3 standard deviations in 5/7 (71.4%) ears with cholesteatoma and 4/4 (100%) ears with external ear anomaly. These normal measurements may aid in evaluating the position of the auditory ossicles, especially in cases of cholesteatoma and auditory ossicle abnormality. (Namekawa, K.).

  12. Normalized compression distance of multisets with applications

    NARCIS (Netherlands)

    Cohen, A.R.; Vitányi, P.M.B.

    Pairwise normalized compression distance (NCD) is a parameter-free, feature-free, alignment-free similarity metric based on compression. We propose an NCD of multisets that is also a metric. Previously, attempts to obtain such an NCD failed. For classification purposes it is superior to the pairwise NCD.
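
    For orientation, the standard pairwise NCD that the multiset version generalizes can be sketched in a few lines with an off-the-shelf compressor:

        # Pairwise NCD with zlib as the compressor C(.):
        #   NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))
        import zlib

        def c(b: bytes) -> int:
            return len(zlib.compress(b, 9))

        def ncd(x: bytes, y: bytes) -> float:
            cx, cy, cxy = c(x), c(y), c(x + y)
            return (cxy - min(cx, cy)) / max(cx, cy)

        print(ncd(b"abcabcabc" * 20, b"abcabcabc" * 20))  # small: similar inputs
        print(ncd(b"abcabcabc" * 20, bytes(range(180))))  # larger: dissimilar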

  13. Center manifolds, normal forms and bifurcations of vector fields with application to coupling between periodic and steady motions

    Science.gov (United States)

    Holmes, Philip J.

    1981-06-01

    We study the instabilities known to aeronautical engineers as flutter and divergence. Mathematically, these states correspond to bifurcations to limit cycles and multiple equilibrium points in a differential equation. Making use of the center manifold and normal form theorems, we concentrate on the situation in which flutter and divergence become coupled, and show that there are essentially two ways in which this is likely to occur. In the first case the system can be reduced to an essential model which takes the form of a single degree of freedom nonlinear oscillator. This system, which may be analyzed by conventional phase-plane techniques, captures all the qualitative features of the full system. We discuss the reduction and show how the nonlinear terms may be simplified and put into normal form. Invariant manifold theory and the normal form theorem play a major role in this work and this paper serves as an introduction to their application in mechanics. Repeating the approach in the second case, we show that the essential model is now three dimensional and that far more complex behavior is possible, including nonperiodic and ‘chaotic’ motions. Throughout, we take a two degree of freedom system as an example, but the general methods are applicable to multi- and even infinite degree of freedom problems.

  14. A normal colposcopy examination fails to provide psychological reassurance for women who have had low-grade abnormal cervical cytology.

    Science.gov (United States)

    Cotton, S C; Sharp, L; Little, J; Gray, N M; Walker, L G; Whynes, D K; Cruickshank, M E

    2015-06-01

    Worldwide, each year, large numbers of women are referred for colposcopy following low-grade abnormal cervical cytology. Many have no visible abnormality on examination. The risk of cervical intra-epithelial neoplasia grade 2/3 (CIN2/3) in these women is low. It is unknown whether, for women, a normal colposcopy resolves the anxiety which often follows the receipt of an abnormal cytology result. We investigated the prevalence of adverse psychological outcomes over 30 months following a normal colposcopy. This cohort study was nested within the UK TOMBOLA randomized controlled trial. Women aged 20-59 years, with recent low-grade cytology, who had a satisfactory colposcopy examination and normal transformation zone, completed the Hospital Anxiety and Depression Scale (HADS) and Process Outcome Specific Measure (POSM) at recruitment and during follow-up (12, 18, 24 and 30 months post-recruitment). Outcomes included percentages reporting significant anxiety (HADS anxiety subscale score ≥11), significant depression (HADS depression subscale score ≥8) or worries about the result of the next cytology test, cervical cancer, having sex, future fertility and general health at each time point (point prevalence) and during follow-up (cumulative prevalence). The study included 727 women. All psychological measures (except depression) had high prevalence at recruitment, falling substantially by 12 months. During follow-up, the cumulative prevalence of significant anxiety was 27% and significant depression was 21%. The most frequently reported worry was that the next cytology test would be abnormal (cumulative prevalence of 71%; point prevalence of ≥50% at 12 and 18 months). The cumulative prevalence values of worries about cervical cancer, having sex and future fertility were 33%, 20% and 16%, respectively. For some women who have low-grade cytology, a normal colposcopy does not appear to provide psychological reassurance.

  15. Generation of hiPSTZ16 (ISMMSi003-A) cell line from normal human foreskin fibroblasts

    Directory of Open Access Journals (Sweden)

    Marion Dejosez

    2018-01-01

    Full Text Available Human foreskin fibroblasts from a commercial source were reprogrammed into induced pluripotent stem cells to establish a clonal stem cell line, hiPSTZ16 (ISMMSi003-A). These cells show a normal karyotype and full differentiation potential in teratoma assays. The described cells provide a useful resource, in combination with other iPS cell lines generated from normal human foreskin fibroblasts, to study source- and reprogramming-method-independent effects in downstream applications.

  16. Non-linear learning in online tutorial to enhance students’ knowledge on normal distribution application topic

    Science.gov (United States)

    Kartono; Suryadi, D.; Herman, T.

    2018-01-01

    This study aimed to analyze the enhancement from non-linear learning (NLL) in online tutorial (OT) content on students' knowledge of normal distribution application (KONDA). KONDA is a competence expected to be achieved after students have studied the topic of normal distribution application in the course named Education Statistics. The analysis was performed with a quasi-experimental design. The subjects were divided into an experimental class, which was given OT content in the NLL model, and a control class, which was given OT content in the conventional learning (CL) model. Data used in this study were the results of online objective tests measuring students' statistical prior knowledge (SPK) and students' pre- and post-test KONDA. Statistical analysis of the KONDA gain scores showed that, for students with low and moderate SPK scores, students who learned OT content with the NLL model outperformed students who learned OT content with the CL model. Meanwhile, for students with high SPK scores, the gain scores of the two groups were similar. Based on these findings, it could be concluded that the NLL model applied to OT content can enhance the KONDA of students at low and moderate SPK levels. Extra and more challenging didactical situations are needed for students at a high SPK level to achieve a significant gain score.

  17. Factors influencing flap and INTACS decentration after femtosecond laser application in normal and keratoconic eyes.

    Science.gov (United States)

    Ertan, Aylin; Karacal, Humeyra

    2008-10-01

    To compare the accuracy of LASIK flap and INTACS centration following femtosecond laser application in normal and keratoconic eyes. This is a retrospective case series comprising 133 eyes of 128 patients referred for refractive surgery. All eyes were divided into two groups according to preoperative diagnosis: group 1 (LASIK group) comprised 74 normal eyes of 72 patients undergoing LASIK with a femtosecond laser (IntraLase), and group 2 (INTACS group) consisted of 59 eyes of 39 patients with keratoconus for whom INTACS were implanted using a femtosecond laser (IntraLase). Decentration of the LASIK flap and INTACS was analyzed using Pentacam. Temporal decentration was 612.56 +/- 384.24 microm (range: 30 to 2120 microm) in the LASIK group and 788.33 +/- 500.34 microm (range: 30 to 2450 microm) in the INTACS group; a statistically significant difference was noted between the groups in terms of decentration (P < .05). Decentration of the LASIK flap and INTACS correlated with the central corneal thickness in the LASIK group and with the preoperative sphere and cylinder in the INTACS group, respectively. Decentration with the IntraLase occurred in most cases, especially in keratoconic eyes. The applanation performed for centralization during IntraLase application may flatten and shift the pupil center, and thus cause decentration of the LASIK flap and INTACS. Central corneal thickness in the LASIK group and preoperative sphere and cylinder in the INTACS group proved to be statistically significant parameters associated with decentration.

  18. On the transition to the normal phase for superconductors surrounded by normal conductors

    DEFF Research Database (Denmark)

    Fournais, Søren; Kachmar, Ayman

    2009-01-01

    For a cylindrical superconductor surrounded by a normal material, we discuss transition to the normal phase of stable, locally stable and critical configurations. Associated with those phase transitions, we define critical magnetic fields and we provide a sufficient condition for which those...

  19. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
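
    One way such simultaneous intervals can be constructed is by Monte Carlo calibration of pointwise order-statistic intervals; the sketch below illustrates the idea and is not the paper's specific construction:

        # Calibrate pointwise Beta/order-statistic intervals so that all n
        # points of a standard normal sample fall inside simultaneously with
        # probability 1 - alpha.
        import numpy as np
        from scipy.stats import beta, norm

        def simultaneous_bands(n, alpha=0.05, n_sim=20000, seed=0):
            rng = np.random.default_rng(seed)
            i = np.arange(1, n + 1)
            sims = np.sort(rng.standard_normal((n_sim, n)), axis=1)

            def coverage(a_pt):
                lo = norm.ppf(beta.ppf(a_pt / 2, i, n - i + 1))
                hi = norm.ppf(beta.ppf(1 - a_pt / 2, i, n - i + 1))
                inside = np.all((sims >= lo) & (sims <= hi), axis=1)
                return inside.mean(), lo, hi

            a_low, a_high = 1e-6, alpha    # bisect on the pointwise level
            for _ in range(40):
                a_mid = 0.5 * (a_low + a_high)
                cov, lo, hi = coverage(a_mid)
                if cov < 1 - alpha:
                    a_high = a_mid         # intervals too narrow: lower the level
                else:
                    a_low = a_mid
            return lo, hi

        lo, hi = simultaneous_bands(n=30)
        # A sorted sample is consistent with normality iff all points lie in [lo, hi].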

  20. Normality in Analytical Psychology

    Science.gov (United States)

    Myers, Steve

    2013-01-01

    Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity. PMID:25379262

  1. Normality in Analytical Psychology

    Directory of Open Access Journals (Sweden)

    Steve Myers

    2013-11-01

    Full Text Available Although C.G. Jung’s interest in normality wavered throughout his career, it was one of the areas he identified in later life as worthy of further research. He began his career using a definition of normality which would have been the target of Foucault’s criticism, had Foucault chosen to review Jung’s work. However, Jung then evolved his thinking to a standpoint that was more aligned to Foucault’s own. Thereafter, the post Jungian concept of normality has remained relatively undeveloped by comparison with psychoanalysis and mainstream psychology. Jung’s disjecta membra on the subject suggest that, in contemporary analytical psychology, too much focus is placed on the process of individuation to the neglect of applications that consider collective processes. Also, there is potential for useful research and development into the nature of conflict between individuals and societies, and how normal people typically develop in relation to the spectrum between individuation and collectivity.

  2. Simplified Deployment of Health Informatics Applications by Providing Docker Images.

    Science.gov (United States)

    Löbe, Matthias; Ganslandt, Thomas; Lotzmann, Lydia; Mate, Sebastian; Christoph, Jan; Baum, Benjamin; Sariyar, Murat; Wu, Jie; Stäubert, Sebastian

    2016-01-01

    Due to the specific needs of biomedical researchers, in-house development of software is widespread. A common problem is maintaining and enhancing software after the funded project has ended. Even if many tools are made open source, only a few projects manage to attract a user base large enough to ensure sustainability. Reasons for this include the complex installation and configuration of biomedical software, as well as ambiguous terminology for the features provided, all of which make the evaluation of software laborious. Docker is a container-based virtualization technology built on Linux containers that eases the deployment of applications and facilitates evaluation. We investigated a suite of software developments funded by a large umbrella organization for networked medical research within the last 10 years and created Docker containers for a number of applications to support utilization and dissemination.
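
    As a brief illustration of why this lowers the evaluation barrier: once an image is published, starting a tool reduces to a single call. The sketch uses the Docker SDK for Python; the image name and port are hypothetical.

        # Run a (hypothetical) containerized research tool via the Docker SDK.
        import docker

        client = docker.from_env()
        container = client.containers.run(
            "example-org/medical-informatics-tool:latest",  # hypothetical image
            detach=True,
            ports={"8080/tcp": 8080},  # expose the tool's web UI on localhost:8080
        )
        print(container.status)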

  3. Providing QoS through machine-learning-driven adaptive multimedia applications.

    Science.gov (United States)

    Ruiz, Pedro M; Botía, Juan A; Gómez-Skarmeta, Antonio

    2004-06-01

    We investigate the optimization of the quality of service (QoS) offered by real-time multimedia adaptive applications through machine learning algorithms. These applications are able to adapt their internal settings (e.g., video sizes, audio and video codecs, among others) in real time to the unpredictably changing capacity of the network. Traditional adaptive applications just select a set of settings that consumes less than the available bandwidth. We propose a novel approach in which the selected set of settings is the one which offers the best user-perceived QoS among all those combinations which satisfy the bandwidth restrictions. We use a genetic algorithm to decide when to trigger the adaptation process depending on the network conditions (e.g., loss rate, jitter, etc.). Additionally, the selection of the new set of settings is done according to a set of rules which model the user-perceived QoS. These rules are learned using the SLIPPER rule induction algorithm over a set of examples extracted from scores provided by real users. We demonstrate that the proposed approach guarantees good user-perceived QoS even when the network conditions are constantly changing.

  4. Nonlinear dynamics exploration through normal forms

    CERN Document Server

    Kahn, Peter B

    2014-01-01

    Geared toward advanced undergraduates and graduate students, this exposition covers the method of normal forms and its application to ordinary differential equations through perturbation analysis. In addition to its emphasis on the freedom inherent in the normal form expansion, the text features numerous examples of equations, the kind of which are encountered in many areas of science and engineering. The treatment begins with an introduction to the basic concepts underlying the normal forms. Coverage then shifts to an investigation of systems with one degree of freedom that model oscillations

  5. Noise study of all-normal dispersion supercontinuum sources for potential application in optical coherence tomography

    Science.gov (United States)

    Gonzalo, I. B.; Engelsholm, R. D.; Bang, O.

    2018-03-01

    Commercially available silica-fiber-based ultra-broadband supercontinuum (SC) sources are typically generated by pumping close to the zero-dispersion wavelength (ZDW) of a photonic crystal fiber (PCF) using high-power picosecond or nanosecond laser pulses. Despite the extremely broad bandwidths, such sources are characterized by large intensity fluctuations, limiting their performance in imaging applications such as optical coherence tomography (OCT). An approach to eliminating the influence of noise-sensitive effects is to use a so-called all-normal dispersion (ANDi) fiber, in which the dispersion is normal for all the wavelengths of interest. Pumping these types of fibers with short enough femtosecond pulses suppresses stimulated Raman scattering (SRS), which is known to be as noisy a process as modulation instability (MI), and a coherent SC is generated through self-phase modulation (SPM) and optical wave breaking (OWB). In this study, we show the importance of the pump laser and fiber parameters in the design of low-noise ANDi-based SC sources for application in OCT. We numerically investigate the pulse-to-pulse fluctuations of the SC, calculating the relative intensity noise (RIN) as a function of the pump pulse duration and fiber length. Furthermore, we experimentally demonstrate the role of the fiber length on the RIN of the ANDi SC, validating the numerical results. Finally, we compare the RIN of a commercial MI-based SC source and the ANDi SC source developed here, which shows better noise performance when carefully designed.

  6. Percutaneous renal angioplasty and stenting: application of embolic protection device in patients with normal renal function

    International Nuclear Information System (INIS)

    Tong Xiaoqiang; Yang Ming; Wang Jian; Song Li; Wang Chao; Lv Yongxing; Sun Hongliang; Zou Yinghua; Yin Ming

    2007-01-01

    Objective: To investigate the value of an embolic protection device (EPD) in renal artery stenting (RAS) for patients with normal renal function. Methods: In total, 24 patients (26 renal arteries) suffering from renal artery stenosis with normal serum creatinine were divided into two groups: an EPD group (n = 12) and a non-EPD group (n = 12). Serum creatinine was measured before stenting and 1 month and 6 months after stenting, and analyzed statistically within each group and between the two groups. Results: Serum creatinine in the EPD and non-EPD groups before, 1 month after, and 6 months after stenting was (99.18 ± 18.26) μmol/L, (101.73 ± 12.65) μmol/L, (96.82 ± 15.81) μmol/L and (100.18 ± 19.81) μmol/L, (107.36 ± 29.49) μmol/L, (127.64 ± 88.05) μmol/L, respectively, showing no significant difference within each group (P>0.05) and no statistically significant difference between the two groups (P>0.05). Conclusion: For patients suffering from renal artery stenosis with normal serum creatinine, application of an EPD may have no impact on renal function. Further evaluation is needed. (authors)

  7. Application of the Speed-Duration Relationship to Normalize the Intensity of High-Intensity Interval Training

    Science.gov (United States)

    Ferguson, Carrie; Wilson, John; Birch, Karen M.; Kemi, Ole J.

    2013-01-01

    The tolerable duration of continuous high-intensity exercise is determined by the hyperbolic speed-tolerable duration (S-tLIM) relationship. However, application of the S-tLIM relationship to normalize the intensity of High-Intensity Interval Training (HIIT) has yet to be considered, and this was the aim of the present study. Subjects completed a ramp-incremental test and a series of 4 constant-speed tests to determine the S-tLIM relationship. A sub-group of subjects (n = 8) then repeated 4 min bouts of exercise at the speeds predicted to induce intolerance at 4 min (WR4), 6 min (WR6) and 8 min (WR8), interspersed with bouts of 4 min recovery, to the point of exercise intolerance (fixed WR HIIT) on different days, with the aim of establishing the work rate that could be sustained for 960 s (i.e. 4×4 min). A sub-group of subjects (n = 6) also completed 4 bouts of exercise interspersed with 4 min recovery, with each bout continued to the point of exercise intolerance (maximal HIIT), to determine the appropriate protocol for maximizing the amount of high-intensity work that can be completed during 4×4 min HIIT. For fixed WR HIIT, the tLIM of HIIT sessions was 399±81 s for WR4, 892±181 s for WR6 and 1517±346 s for WR8, with total exercise durations all significantly different from each other (P<0.050). For maximal HIIT, there was no difference in the tLIM of each of the 4 bouts (Bout 1: 229±27 s; Bout 2: 262±37 s; Bout 3: 235±49 s; Bout 4: 235±53 s; P>0.050). However, there was significantly less high-intensity work completed during bouts 2 (153.5±40.9 m), 3 (136.9±38.9 m), and 4 (136.7±39.3 m), compared with bout 1 (264.9±58.7 m; P<0.050). These data establish that WR6 provides the appropriate work rate to normalize the intensity of HIIT between subjects. Maximal HIIT provides a protocol which allows the relative contribution of the work rate profile to physiological adaptations to be considered during alternative intensity-matched HIIT protocols. PMID:24244266
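
    The hyperbolic S-tLIM model is commonly parameterized as S = CS + D'/tLIM, with critical speed CS and curvature constant D'. Given fitted values, the speeds predicted to produce intolerance at the target durations follow directly; the parameter values below are illustrative only.

        # Predicted speeds for target tolerable durations from S = CS + D'/tLIM.
        CS = 3.9         # critical speed, m/s (illustrative)
        D_prime = 180.0  # curvature constant D', m (illustrative)

        for label, t_lim in [("WR4", 240.0), ("WR6", 360.0), ("WR8", 480.0)]:
            print(f"{label}: {CS + D_prime / t_lim:.2f} m/s")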

  8. Normal gravity field in relativistic geodesy

    Science.gov (United States)

    Kopeikin, Sergei; Vlasov, Igor; Han, Wen-Biao

    2018-02-01

    Modern geodesy is subject to a dramatic change from the Newtonian paradigm to Einstein's theory of general relativity. This is motivated by the ongoing advance in the development of quantum sensors for applications in geodesy, including quantum gravimeters and gradiometers, atomic clocks and fiber optics for making ultra-precise measurements of the geoid and the multipolar structure of the Earth's gravitational field. At the same time, very long baseline interferometry, satellite laser ranging, and global navigation satellite systems have achieved an unprecedented level of accuracy in measuring the 3-d coordinates of the reference points of the International Terrestrial Reference Frame and the world height system. The main geodetic reference standard, to which gravimetric measurements of the Earth's gravitational field are referred, is a normal gravity field represented in Newtonian gravity by the field of a uniformly rotating, homogeneous Maclaurin ellipsoid whose mass and quadrupole moment are equal to the total mass and (tide-free) quadrupole moment of the Earth's gravitational field. The present paper extends the concept of the normal gravity field from Newtonian theory to the realm of general relativity. We focus our attention on the calculation of the post-Newtonian approximation of the normal field, which is sufficient for current and near-future practical applications. We show that in general relativity the level surface of a homogeneous and uniformly rotating fluid is no longer described by the Maclaurin ellipsoid in the most general case but represents an axisymmetric spheroid of the fourth order with respect to the geodetic Cartesian coordinates. At the same time, admitting a post-Newtonian inhomogeneity of the mass density in the form of concentric elliptical shells allows one to preserve the level surface of the fluid as an exact ellipsoid of rotation. We parametrize the mass density distribution and the level surface with two parameters which are

  9. Principal Component Analysis for Normal-Distribution-Valued Symbolic Data.

    Science.gov (United States)

    Wang, Huiwen; Chen, Meiling; Shi, Xiaojun; Li, Nan

    2016-02-01

    This paper puts forward a new approach to principal component analysis (PCA) for normal-distribution-valued symbolic data, which has vast potential for applications in economics and management. We derive a full set of numerical characteristics and the variance-covariance structure for such data, which forms the foundation for our analytical PCA approach. Our approach is able to use all of the variance information in the original data, unlike the prevailing representative-type approaches in the literature, which only use centers, vertices, etc. The paper also provides an accurate approach to constructing the observations in a PC space based on the linear additivity property of the normal distribution. The effectiveness of the proposed method is illustrated by simulated numerical experiments. Finally, our method is applied to explain the puzzle of the risk-return tradeoff in China's stock market.

  10. Application of activity-based costing (ABC) for a Peruvian NGO healthcare provider.

    Science.gov (United States)

    Waters, H; Abdallah, H; Santillán, D

    2001-01-01

    This article describes the application of activity-based costing (ABC) to calculate the unit costs of services for a health care provider in Peru. While traditional costing allocates overhead and indirect costs in proportion to production volume or to direct costs, ABC assigns costs through activities within an organization. ABC uses personnel interviews to determine the principal activities and the distribution of individuals' time among them. Indirect costs are linked to services through time allocation and other tracing methods, and the result is a more accurate estimate of unit costs. The study concludes that applying ABC in a developing country setting is feasible, yielding results that are directly applicable to pricing and management. ABC determines costs for individual clinics, departments and services according to the activities that originate these costs, showing where an organization spends its money. With this information, it is possible to identify services that are generating extra revenue and those operating at a loss, and to calculate cross-subsidies across services. ABC also highlights areas in the health care process where efficiency improvements are possible. Conclusions about the ultimate impact of the methodology are not drawn here, since the study was not repeated, and changes in utilization patterns and the addition of new clinics affected the applicability of the results. A potential constraint to implementing ABC is the availability and organization of cost information. Applying ABC efficiently requires information to be readily available, by cost category and department, since the greatest benefits of ABC come from frequent, systematic application of the methodology in order to monitor efficiency and provide feedback for management. The article concludes with a discussion of the potential applications of ABC in the health sector in developing countries.

  11. Normal-dispersion microresonator Kerr frequency combs

    Directory of Open Access Journals (Sweden)

    Xue Xiaoxiao

    2016-06-01

    Full Text Available Optical microresonator-based Kerr frequency comb generation has developed into a hot research area in the past decade. Microresonator combs are promising for portable applications due to their potential for chip-level integration and low power consumption. According to the group velocity dispersion of the microresonator employed, research in this field may be classified into two categories: the anomalous dispersion regime and the normal dispersion regime. In this paper, we discuss the physics of Kerr comb generation in the normal dispersion regime and review recent experimental advances. The potential advantages and future directions of normal dispersion combs are also discussed.

  12. Chaos emerging in soil failure patterns observed during tillage: Normalized deterministic nonlinear prediction (NDNP) and its application.

    Science.gov (United States)

    Sakai, Kenshi; Upadhyaya, Shrinivasa K; Andrade-Sanchez, Pedro; Sviridova, Nina V

    2017-03-01

    Real-world processes are often combinations of deterministic and stochastic processes. Soil failure observed during farm tillage is one example of this phenomenon. In this paper, we investigated the nonlinear features of soil failure patterns in a farm tillage process. We demonstrate determinism emerging in soil failure patterns from stochastic processes under specific soil conditions. We normalized the deterministic nonlinear prediction with respect to autocorrelation and propose this as a robust way of extracting a nonlinear dynamical system from noise-contaminated motion. Soil is a typical granular material, so the results obtained here are expected to be applicable to granular materials in general. From the global scale to the nano scale, granular materials feature in seismology, geotechnology, soil mechanics, and particle technology, and the results and discussion presented here are applicable across these research areas. The proposed method and our findings are useful for the application of nonlinear dynamics to the investigation of complex motions generated by granular materials.
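
    The deterministic nonlinear prediction that is normalized here is typically of the Sugihara-May type: predict ahead in a delay embedding from nearest neighbours and score the predictions against the observations. The sketch below shows that bare prediction step only; the autocorrelation-based normalization of the paper's NDNP is omitted.

        # Nearest-neighbour prediction skill in a delay embedding.
        import numpy as np

        def dnp_skill(x, dim=3, horizon=1, k=4):
            n = len(x) - (dim - 1) - horizon
            emb = np.column_stack([x[i:i + n] for i in range(dim)])
            tgt = x[(dim - 1) + horizon:(dim - 1) + horizon + n]
            pred = np.empty(n)
            for j in range(n):
                d = np.linalg.norm(emb - emb[j], axis=1)
                d[j] = np.inf                    # exclude the self-match
                pred[j] = tgt[np.argsort(d)[:k]].mean()
            return np.corrcoef(pred, tgt)[0, 1]  # prediction skill (correlation)

        # Deterministic chaos (logistic map) scores high; white noise near zero.
        rng = np.random.default_rng(0)
        x = np.empty(600); x[0] = 0.3
        for i in range(599):
            x[i + 1] = 3.9 * x[i] * (1 - x[i])
        print(dnp_skill(x), dnp_skill(rng.standard_normal(600)))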

  13. Variation-preserving normalization unveils blind spots in gene expression profiling

    Science.gov (United States)

    Roca, Carlos P.; Gomes, Susana I. L.; Amorim, Mónica J. B.; Scott-Fordsmand, Janeck J.

    2017-01-01

    RNA-Seq and gene expression microarrays provide comprehensive profiles of gene activity, but lack of reproducibility has hindered their application. A key challenge in the data analysis is the normalization of gene expression levels, which is currently performed following the implicit assumption that most genes are not differentially expressed. Here, we present a mathematical approach to normalization that makes no assumption of this sort. We have found that variation in gene expression is much larger than currently believed, and that it can be measured with available assays. Our results also explain, at least partially, the reproducibility problems encountered in transcriptomics studies. We expect that this improvement in detection will help efforts to realize the full potential of gene expression profiling, especially in analyses of cellular processes involving complex modulations of gene expression. PMID:28276435

  14. Comparison of SSS and SRS calculated from normal databases provided by QPS and 4D-MSPECT manufacturers and from identical institutional normals

    International Nuclear Information System (INIS)

    Knollmann, Daniela; Knebel, Ingrid; Gebhard, Michael; Krohn, Thomas; Buell, Ulrich; Schaefer, Wolfgang M.; Koch, Karl-Christian

    2008-01-01

    There is proven evidence for the importance of myocardial perfusion single-photon emission computed tomography (SPECT) with computerised determination of summed stress and rest scores (SSS/SRS) for the diagnosis of coronary artery disease (CAD). SSS and SRS can thereby be calculated semi-quantitatively using a 20-segment model by comparing tracer uptake with values from normal databases (NDB). Four severity degrees for SSS and SRS are normally used: <4, 4-8, 9-13, and ≥14. Manufacturers' NDBs (M-NDBs) often do not fit the institutional (I) settings. Therefore, this study compared SSS and SRS obtained with the algorithms Quantitative Perfusion SPECT (QPS) and 4D-MSPECT using M-NDBs and I-NDBs. I-NDBs were obtained using QPS and 4D-MSPECT from exercise stress data (450 MBq 99mTc-tetrofosmin, triple-head camera, 30 s/view, 20 views/head) from 36 men with a low post-stress test CAD probability and visually normal SPECT findings. The patient group was 60 men showing the entire CAD spectrum referred for routine perfusion SPECT. Stress/rest results of the automatic quantification of the 60 patients were compared to the M-NDB and I-NDB. After reclassifying SSS/SRS into the four severity degrees, kappa (κ) values were calculated to objectify agreement. Mean values (vs M-NDB) were 9.4 ± 10.3 (SSS) and 5.8 ± 9.7 (SRS) for QPS and 8.2 ± 8.7 (SSS) and 6.2 ± 7.8 (SRS) for 4D-MSPECT. Thirty-seven of sixty SSS classifications (κ = 0.462) and 40/60 SRS classifications (κ = 0.457) agreed. Compared to the I-NDB, mean values were 10.2 ± 11.6 (SSS) and 6.5 ± 10.4 (SRS) for QPS and 9.2 ± 9.3 (SSS) and 7.2 ± 8.6 (SRS) for 4D-MSPECT. Forty-four of sixty patients agreed in SSS and SRS (κ = 0.621 and 0.58, respectively). Considerable differences between SSS/SRS obtained with QPS and 4D-MSPECT were found when using the M-NDB. Even using identical patients and an identical I-NDB, the algorithms still gave substantially different results. (orig.)

  15. Visual Memories Bypass Normalization.

    Science.gov (United States)

    Bloem, Ilona M; Watanabe, Yurika L; Kibbe, Melissa M; Ling, Sam

    2018-05-01

    How distinct are visual memory representations from visual perception? Although evidence suggests that briefly remembered stimuli are represented within early visual cortices, the degree to which these memory traces resemble true visual representations remains something of a mystery. Here, we tested whether both visual memory and perception succumb to a seemingly ubiquitous neural computation: normalization. Observers were asked to remember the contrast of visual stimuli, which were pitted against each other to promote normalization either in perception or in visual memory. Our results revealed robust normalization between visual representations in perception, yet no signature of normalization occurring between working memory stores: neither between representations in memory nor between memory representations and visual inputs. These results provide unique insight into the nature of visual memory representations, illustrating that visual memory representations follow a different set of computational rules, bypassing normalization, a canonical visual computation.

  16. Claims Procedure for Plans Providing Disability Benefits; 90-Day Delay of Applicability Date. Final rule; delay of applicability

    Science.gov (United States)

    2017-11-29

    This document delays for ninety (90) days--through April 1, 2018--the applicability of a final rule amending the claims procedure requirements applicable to ERISA-covered employee benefit plans that provide disability benefits (Final Rule). The Final Rule was published in the Federal Register on December 19, 2016, became effective on January 18, 2017, and was scheduled to become applicable on January 1, 2018. The delay announced in this document is necessary to enable the Department of Labor to carefully consider comments and data as part of its effort, pursuant to Executive Order 13777, to examine regulatory alternatives that meet its objectives of ensuring the full and fair review of disability benefit claims while not imposing unnecessary costs and adverse consequences.

  17. Transforming Normal Programs by Replacement

    NARCIS (Netherlands)

    Bossi, Annalisa; Pettorossi, A.; Cocco, Nicoletta; Etalle, Sandro

    1992-01-01

    The replacement transformation operation, already defined in [28], is studied wrt normal programs. We give applicability conditions able to ensure the correctness of the operation wrt Fitting's and Kunen's semantics. We show how replacement can mimic other transformation operations such as thinning,

  18. A Recursive Approach to Compute Normal Forms

    Science.gov (United States)

    HSU, L.; MIN, L. J.; FAVRETTO, L.

    2001-06-01

    Normal forms are instrumental in the analysis of dynamical systems described by ordinary differential equations, particularly when singularities close to a bifurcation are to be characterized. However, the computation of a normal form up to an arbitrary order is numerically hard. This paper focuses on the computer programming of some recursive formulas developed earlier to compute higher order normal forms. A computer program to reduce the system to its normal form on a center manifold is developed using the Maple symbolic language. However, it should be stressed that the program relies essentially on recursive numerical computations, while symbolic calculations are used only for minor tasks. Some strategies are proposed to save computation time. Examples are presented to illustrate the application of the program to obtain high order normalization or to handle systems with large dimension.

  19. Comparison of SSS and SRS calculated from normal databases provided by QPS and 4D-MSPECT manufacturers and from identical institutional normals.

    Science.gov (United States)

    Knollmann, Daniela; Knebel, Ingrid; Koch, Karl-Christian; Gebhard, Michael; Krohn, Thomas; Buell, Ulrich; Schaefer, Wolfgang M

    2008-02-01

    There is proven evidence for the importance of myocardial perfusion single-photon emission computed tomography (SPECT) with computerised determination of summed stress and rest scores (SSS/SRS) for the diagnosis of coronary artery disease (CAD). SSS and SRS can thereby be calculated semi-quantitatively using a 20-segment model by comparing tracer uptake with values from normal databases (NDB). Four severity degrees for SSS and SRS are normally used: <4, 4-8, 9-13, and ≥14. Manufacturers' NDBs (M-NDBs) often do not fit the institutional (I) settings. Therefore, this study compared SSS and SRS obtained with the algorithms Quantitative Perfusion SPECT (QPS) and 4D-MSPECT using M-NDBs and I-NDBs. I-NDBs were obtained using QPS and 4D-MSPECT from exercise stress data (450 MBq (99m)Tc-tetrofosmin, triple-head camera, 30 s/view, 20 views/head) from 36 men with a low post-stress test CAD probability and visually normal SPECT findings. The patient group was 60 men showing the entire CAD spectrum referred for routine perfusion SPECT. Stress/rest results of the automatic quantification of the 60 patients were compared to the M-NDB and I-NDB. After reclassifying SSS/SRS into the four severity degrees, kappa values were calculated to objectify agreement. Mean values (vs M-NDB) were 9.4 +/- 10.3 (SSS) and 5.8 +/- 9.7 (SRS) for QPS and 8.2 +/- 8.7 (SSS) and 6.2 +/- 7.8 (SRS) for 4D-MSPECT. Thirty-seven of sixty SSS classifications (kappa = 0.462) and 40/60 SRS classifications (kappa = 0.457) agreed. Compared to the I-NDB, mean values were 10.2 +/- 11.6 (SSS) and 6.5 +/- 10.4 (SRS) for QPS and 9.2 +/- 9.3 (SSS) and 7.2 +/- 8.6 (SRS) for 4D-MSPECT. Forty-four of sixty patients agreed in SSS and SRS (kappa = 0.621 and 0.58, respectively). Considerable differences between SSS/SRS obtained with QPS and 4D-MSPECT were found when using the M-NDB. Even using identical patients and an identical I-NDB, the algorithms still gave substantially different results.

  20. The application of the piecewise linear approximation to the spectral neighborhood of soil line for the analysis of the quality of normalization of remote sensing materials

    Science.gov (United States)

    Kulyanitsa, A. L.; Rukhovich, A. D.; Rukhovich, D. D.; Koroleva, P. V.; Rukhovich, D. I.; Simakova, M. S.

    2017-04-01

    The concept of the soil line can be used to describe the temporal distribution of spectral characteristics of the bare soil surface. In this case, the soil line can be referred to as the multi-temporal soil line, or simply the temporal soil line (TSL). In order to create the TSL for 8000 regular lattice points across the territory of three regions of Tula oblast, we used 34 Landsat images obtained in the period from 1985 to 2014, after a certain transformation. As Landsat images are matrices of spectral brightness values, this transformation is a normalization of the matrices. There are several methods of normalization, which move, rotate, and scale the spectral plane. In our study, we applied the method of piecewise linear approximation to the spectral neighborhood of the soil line in order to assess the quality of normalization mathematically. This approach allowed us to rank the normalization methods according to their quality as follows: classic normalization > successive application of the turn and shift > successive application of the atmospheric correction and shift > atmospheric correction > shift > turn > raw data. The normalized data allowed us to create maps of the distribution of the a and b coefficients of the TSL. The map of the b coefficient is characterized by a high correlation with the ground-truth data obtained from 1899 soil pits described during the soil surveys performed by the local institute for land management (GIPROZEM).

  1. Integrated NDVI images for Niger 1986-1987. [Normalized Difference Vegetation Index]

    Science.gov (United States)

    Harrington, John A., Jr.; Wylie, Bruce K.; Tucker, Compton J.

    1988-01-01

    Two NOAA AVHRR images are presented which provide a comparison of the geographic distribution of an integrated normalized difference vegetation index (NDVI) for the Sahel zone in Niger for the growing seasons of 1986 and 1987. The production of the images and their application to resource management are discussed. Daily large-area-coverage data with a spatial resolution of 1.1 km at nadir were transformed to the NDVI and geographically registered to produce the images.
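
    The index itself is a simple band ratio, NDVI = (NIR - Red)/(NIR + Red); a minimal sketch of the per-pixel computation and a growing-season integration, with synthetic stand-ins for the AVHRR band data:

        import numpy as np

        def ndvi(red, nir):
            return (nir - red) / (nir + red)

        # daily_red/daily_nir: (days, rows, cols) reflectance stacks (synthetic here)
        rng = np.random.default_rng(0)
        daily_red = rng.uniform(0.05, 0.15, (120, 4, 4))
        daily_nir = rng.uniform(0.20, 0.50, (120, 4, 4))

        integrated = ndvi(daily_red, daily_nir).sum(axis=0)  # seasonal iNDVI per pixel
        print(integrated.shape)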

  2. Brightness-normalized Partial Least Squares Regression for hyperspectral data

    International Nuclear Information System (INIS)

    Feilhauer, Hannes; Asner, Gregory P.; Martin, Roberta E.; Schmidtlein, Sebastian

    2010-01-01

    Developed in the field of chemometrics, Partial Least Squares Regression (PLSR) has become an established technique in vegetation remote sensing. PLSR was primarily designed for laboratory analysis of prepared material samples. Under field conditions in vegetation remote sensing, the performance of the technique may be negatively affected by differences in brightness due to the amount and orientation of plant tissues in canopies or to the observing conditions. To minimize these effects, we introduced brightness normalization to the PLSR approach and tested whether this modification improves performance under changing canopy and observing conditions. This test was carried out using high-fidelity spectral data (400-2510 nm) to model observed leaf chemistry. The spectral data were combined with a canopy radiative transfer model to simulate the effects of varying canopy structure and viewing geometry. Brightness normalization enhanced the performance of PLSR by dampening the effects of canopy shade, thus providing a significant improvement in predictions of leaf chemistry (up to 3.6% additional explained variance in validation) compared to conventional PLSR. Little improvement was made on effects due to variable leaf area index, while minor improvement (mostly not significant) was observed for effects of variable viewing geometry. In general, brightness normalization increased the stability of model fits and regression coefficients for all canopy scenarios. Brightness-normalized PLSR is thus a promising approach for application to airborne and space-based imaging spectrometer data.
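
    A minimal sketch of the preprocessing step, assuming brightness normalization here means scaling each spectrum to unit length so that the regression sees spectral shape rather than overall brightness; the data are synthetic stand-ins for canopy spectra and leaf chemistry:

        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        X = rng.uniform(0.05, 0.6, (60, 211))   # 60 spectra, 211 bands
        X *= rng.uniform(0.5, 1.5, (60, 1))     # multiplicative brightness variation
        y = X[:, 50] / X.sum(axis=1) + 0.01 * rng.standard_normal(60)  # toy chemistry

        # Brightness normalization: each spectrum scaled to unit vector length.
        X_bn = X / np.linalg.norm(X, axis=1, keepdims=True)

        pls = PLSRegression(n_components=5).fit(X_bn, y)
        print(round(pls.score(X_bn, y), 3))     # R^2 of the fitted model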

  3. Predicting the magnetorheological effect of magnetorheological elastomers under normal pressure

    International Nuclear Information System (INIS)

    Dong, X; Qi, M; Ma, N; Ou, J

    2013-01-01

    Magnetorheological elastomers (MREs) present a reversible change in shear modulus in an applied magnetic field. For applications and tests of MREs, a normal pressure must be applied to the materials. However, little research has paid attention to the effect of normal pressure on the properties of MREs. In this study, a theoretical model is established based on the effective permeability rule and consideration of the normal pressure. The results indicate that the normal pressure has a great influence on the magnetic-field-induced shear modulus. The shear modulus of MREs increases with increasing normal pressure, and this dependence is more significant at high magnetic field levels.

  4. Asymptotic normalization coefficients and astrophysical factors

    International Nuclear Information System (INIS)

    Mukhamedzhanov, A.M.; Azhari, A.; Clark, H.L.; Gagliardi, C.A.; Lui, Y.-W.; Sattarov, A.; Trache, L.; Tribble, R.E.; Burjan, V.; Kroha, V.; Carstoiu, F.

    2000-01-01

    The S factor for the direct capture reaction 7Be(p,γ)8B can be found at astrophysical energies from the asymptotic normalization coefficients (ANC's) which provide the normalization of the tails of the overlap functions for 8B → 7Be + p. Peripheral transfer reactions offer a technique to determine these ANC's. Using this technique, the 10B(7Be,8B)9Be and 14N(7Be,8B)13C reactions have been used to measure the asymptotic normalization coefficient for 7Be(p,γ)8B. These results provide an indirect determination of S17(0). Analysis of the existing 9Be(p,γ)10B experimental data within the framework of the R-matrix method demonstrates that experimentally measured ANC's can provide a reasonable determination of direct radiative capture rates. (author)

  5. Non normal and non quadratic anisotropic plasticity coupled with ductile damage in sheet metal forming: Application to the hydro bulging test

    International Nuclear Information System (INIS)

    Badreddine, Houssem; Saanouni, Khemaies; Dogui, Abdelwaheb

    2007-01-01

    In this work an improved material model is proposed that shows good agreement with experimental data for both hardening curves and plastic strain ratios in uniaxial and equibiaxial proportional loading paths for sheet steel up to final fracture. The model is based on a non-associative, non-normal flow rule using two different orthotropic equivalent stresses in the yield criterion and the plastic potential functions. For the plastic potential, the classical Hill 1948 quadratic equivalent stress is considered, while for the yield criterion the Karafillis and Boyce 1993 non-quadratic equivalent stress is used, taking into account nonlinear mixed (kinematic and isotropic) hardening. Applications are made to hydro-bulging tests using both circular and elliptical dies. The results obtained with different particular cases of the model, such as the normal quadratic and the non-normal non-quadratic cases, are compared and discussed with respect to the experimental results.
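
    For reference, the Hill 1948 quadratic equivalent stress used here for the plastic potential has the standard form

        \bar{\sigma}^{2} = F(\sigma_{yy}-\sigma_{zz})^{2} + G(\sigma_{zz}-\sigma_{xx})^{2} + H(\sigma_{xx}-\sigma_{yy})^{2} + 2L\sigma_{yz}^{2} + 2M\sigma_{zx}^{2} + 2N\sigma_{xy}^{2}

    with F, G, H, L, M, N the anisotropy coefficients, identified from directional yield stresses or plastic strain ratios.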

  6. Sample normalization methods in quantitative metabolomics.

    Science.gov (United States)

    Wu, Yiman; Li, Liang

    2016-01-22

    To reveal metabolomic changes caused by a biological event in quantitative metabolomics, it is critical to use an analytical tool that can perform accurate and precise quantification to examine the true concentration differences of individual metabolites found in different samples. A number of steps are involved in metabolomic analysis including pre-analytical work (e.g., sample collection and storage), analytical work (e.g., sample analysis) and data analysis (e.g., feature extraction and quantification). Each one of them can influence the quantitative results significantly and thus should be performed with great care. Among them, the total sample amount or concentration of metabolites can be significantly different from one sample to another. Thus, it is critical to reduce or eliminate the effect of total sample amount variation on quantification of individual metabolites. In this review, we describe the importance of sample normalization in the analytical workflow with a focus on mass spectrometry (MS)-based platforms, discuss a number of methods recently reported in the literature and comment on their applicability in real world metabolomics applications. Sample normalization has been sometimes ignored in metabolomics, partially due to the lack of a convenient means of performing sample normalization. We show that several methods are now available and sample normalization should be performed in quantitative metabolomics where the analyzed samples have significant variations in total sample amounts.
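
    As a concrete illustration of one method class covered by such reviews, the sketch below implements probabilistic quotient normalization (PQN), which estimates a per-sample dilution factor as the median ratio of each sample's features to a reference profile; the data are synthetic:

        import numpy as np

        def pqn(intensities):
            # intensities: (samples, metabolites) matrix of raw signal intensities
            ref = np.median(intensities, axis=0)       # reference profile
            quotients = intensities / ref              # per-feature ratios
            dilution = np.median(quotients, axis=1)    # one factor per sample
            return intensities / dilution[:, None]

        rng = np.random.default_rng(0)
        base = rng.lognormal(mean=2.0, sigma=0.5, size=(8, 200))
        diluted = base * rng.uniform(0.5, 2.0, (8, 1)) # simulate dilution differences
        # After PQN the per-sample scale factors are approximately equalized:
        print(np.round(np.median(pqn(diluted) / base, axis=1), 2))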

  7. The provider perspective: investigating the effect of the Electronic Patient-Reported Outcome (ePRO) mobile application and portal on primary care provider workflow.

    Science.gov (United States)

    Hans, Parminder K; Gray, Carolyn Steele; Gill, Ashlinder; Tiessen, James

    2018-03-01

    Aim: This qualitative study investigates how the Electronic Patient-Reported Outcome (ePRO) mobile application and portal system, designed to capture patient-reported measures to support self-management, affected primary care provider workflows. The Canadian health system is facing an ageing population that is living with chronic disease. Disruptive innovations like mobile health technologies can help to support the health system transformation needed to better meet the multifaceted needs of complex care patients. However, there are challenges with implementing these technologies in primary care settings, in particular their effect on primary care provider workflows. Over a six-week period, interdisciplinary primary care providers (n=6) and their complex care patients (n=12) used the ePRO mobile application and portal to collaboratively goal-set, manage care plans, and support self-management using patient-reported measures. Secondary thematic analysis of focus groups, training sessions, and issue tracker reports captured user experiences at a Toronto area Family Health Team from October 2014 to January 2015. Findings: Key issues raised by providers included liability concerns associated with remote monitoring, increased documentation activities due to a lack of interoperability between the app and the electronic patient record, increased provider anxiety with regard to the potential for the app to disrupt and infringe upon appointment time, and increased demands for patient engagement. Primary care providers reported the app helped to focus care plans and to begin a collaborative conversation on goal-setting. However, throughout our investigation we found a high level of provider resistance, evidenced by consistent attempts to shift the app towards fitting with existing workflows rather than adapting much of their behaviour. As health systems seek innovative and disruptive models to better serve this complex patient population, provider change resistance will need to be addressed.

  8. Hormonal enzymatic systems in normal and cancerous human breast: control, prognostic factors, and clinical applications.

    Science.gov (United States)

    Pasqualini, Jorge R; Chetrite, Gérard S

    2012-04-01

    The bioformation and transformation of estrogens and other hormones in the breast tissue, as a result of the activity of the various enzymes involved, attract particular attention for the role they play in the development and pathogenesis of hormone-dependent breast cancer. The enzymatic process concerns the aromatase, which transforms androgens into estrogens; the sulfatase, which hydrolyzes the biologically inactive sulfates to the active hormone; the 17β-hydroxysteroid dehydrogenases, which are involved in the interconversion estradiol/estrone or testosterone/androstenedione; hydroxylases, which transform estrogens into mitotic and antimitotic derivatives; and sulfotransferases and glucuronidases, which convert estrogens into the biologically inactive sulfates and glucuronides, respectively. These enzymatic activities are more intense in the carcinoma than in the normal tissue. Concerning aromatase, the application of antiaromatase agents has been largely developed in the treatment of breast cancer patients, with very positive results. Various studies have shown that the activity levels of these enzymes and their mRNA can serve as interesting prognostic factors for breast cancer. In conclusion, the application of new antienzymatic molecules can open attractive perspectives in the treatment of hormone-dependent breast cancer.

  9. Characteristic analysis on UAV-MIMO channel based on normalized correlation matrix.

    Science.gov (United States)

    Gao, Xi jun; Chen, Zi li; Hu, Yong Jiang

    2014-01-01

    Based on the three-dimensional GBSBCM (geometrically based double bounce cylinder model) channel model of MIMO for unmanned aerial vehicle (UAV), the simple form of UAV space-time-frequency channel correlation function which includes the LOS, SPE, and DIF components is presented. By the methods of channel matrix decomposition and coefficient normalization, the analytic formula of UAV-MIMO normalized correlation matrix is deduced. This formula can be used directly to analyze the condition number of UAV-MIMO channel matrix, the channel capacity, and other characteristic parameters. The simulation results show that this channel correlation matrix can be applied to describe the changes of UAV-MIMO channel characteristics under different parameter settings comprehensively. This analysis method provides a theoretical basis for improving the transmission performance of UAV-MIMO channel. The development of MIMO technology shows practical application value in the field of UAV communication.
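
    The abstract notes that the normalized correlation matrix can be used to analyze the condition number and capacity of the channel; below is a minimal sketch of those two quantities for a generic channel matrix H (not the paper's GBSBCM-specific formulas).

```python
import numpy as np

def channel_metrics(H, snr=10.0):
    """Condition number and equal-power-allocation capacity (bit/s/Hz)
    of an nr x nt MIMO channel matrix H at a given linear SNR."""
    nr, nt = H.shape
    cond = np.linalg.cond(H)
    capacity = np.log2(np.linalg.det(
        np.eye(nr) + (snr / nt) * H @ H.conj().T).real)
    return cond, capacity
```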

  10. Indentation stiffness does not discriminate between normal and degraded articular cartilage.

    Science.gov (United States)

    Brown, Cameron P; Crawford, Ross W; Oloyede, Adekunle

    2007-08-01

    Relative indentation characteristics are commonly used for distinguishing between normal healthy and degraded cartilage. The application of this parameter in surgical decision making and an appreciation of articular cartilage biomechanics prompted us to hypothesise that it is difficult to define a reference stiffness that characterises normal articular cartilage. This hypothesis was tested by carrying out biomechanical indentation of articular cartilage samples characterised as visually normal or degraded with respect to proteoglycan depletion and collagen disruption. Compressive loading was applied at known strain rates to visually normal, artificially degraded and naturally osteoarthritic articular cartilage, and the trends of their stress-strain and stiffness characteristics were observed. While our results demonstrated a 25% decrease in the stiffness of individual samples after proteoglycan depletion, they also showed that, when compared to the stiffness of normal samples, only 17% lie outside the range of the stress-strain behaviour of normal samples. We conclude that the extent of the variability in the properties of normal samples, and the degree of overlap (81%) of the biomechanical properties of normal and degraded matrices, demonstrate that indentation data cannot form an accurate basis for distinguishing normal from abnormal articular cartilage samples, with consequences for the application of this mechanical process in the clinical environment.

  11. 78 FR 72089 - Medicare, Medicaid, and Children's Health Insurance Programs; Provider Enrollment Application Fee...

    Science.gov (United States)

    2013-12-02

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare & Medicaid Services [CMS-6051-N] Medicare, Medicaid, and Children's Health Insurance Programs; Provider Enrollment Application Fee Amount... period entitled ``Medicare, Medicaid, and Children's Health Insurance Programs; Additional Screening...

  12. Normalization of NDVI from Different Sensor System using MODIS Products as Reference

    International Nuclear Information System (INIS)

    Wenxia, Gan; Liangpei, Zhang; Wei, Gong; Huanfeng, Shen

    2014-01-01

    Medium-resolution NDVI (Normalized Difference Vegetation Index) from different sensor systems such as the Landsat, SPOT, ASTER, CBERS and HJ-1A/1B satellites provides detailed spatial information for studies of ecosystems, vegetation biophysics, and land cover. Limitations of sensor designs, cloud contamination, and sensor failure highlight the need to normalize and integrate NDVI from multiple sensor systems in order to create a consistent, long-term NDVI data set. In this paper, we used a reference-based method for NDVI normalization, and present an application of this approach which converts Landsat ETM+ NDVI calculated from digital numbers (NDVI_DN) to NDVI calculated from surface reflectance (NDVI_SR) using MODIS products as reference, with each cluster treated differently. Results show that this approach can produce NDVI in close agreement with NDVI calculated from surface reflectance by physical approaches based on 6S (Second Simulation of the Satellite Signal in the Solar Spectrum). Although some variability exists, the cluster-specific reference-based approach shows considerable potential for NDVI normalization. Therefore, NDVI products in the MODIS era from different sources can be combined for time-series analysis, biophysical parameter retrievals, and other downstream analysis.
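
    A minimal sketch of the reference-based idea, assuming a per-cluster linear fit of DN-based NDVI to the MODIS reference (the paper's exact procedure may differ):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index from two band values."""
    return (nir - red) / (nir + red)

def normalize_to_reference(ndvi_dn, ndvi_modis, clusters):
    """Fit a linear mapping from DN-based NDVI to the MODIS reference
    separately for each cluster, then apply it cluster by cluster."""
    normalized = np.empty_like(ndvi_dn)
    for c in np.unique(clusters):
        mask = clusters == c
        slope, intercept = np.polyfit(ndvi_dn[mask], ndvi_modis[mask], 1)
        normalized[mask] = slope * ndvi_dn[mask] + intercept
    return normalized
```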

  13. Degree of corneal anaesthesia after topical application of 0.4% oxybuprocaine hydrochloride and 0.5% proparacaine hydrochloride ophthalmic solution in clinically normal cattle.

    Science.gov (United States)

    Little, W B; Jean, G St; Sithole, F; Little, E; Jean, K Yvorchuk-St

    2016-06-01

    The use of corneal anaesthesia is necessary for a range of clinical purposes. Therefore, we assessed and compared the efficacy of corneal anaesthesia after application of 0.4% oxybuprocaine hydrochloride and 0.5% proparacaine hydrochloride ophthalmic solution in clinically normal cattle. The 24 clinically normal cows were allocated into two groups. Cows in group 1 (n = 12) received 0.2 mL of 0.4% oxybuprocaine hydrochloride with fluorescein ophthalmic solution in one eye and 0.2 mL of sterile saline (0.9% NaCl) with fluorescein in the contralateral eye (control). Group 2 (n = 12) received 0.2 mL of 0.4% oxybuprocaine hydrochloride with fluorescein ophthalmic solution in one eye and 0.2 mL of 0.5% proparacaine hydrochloride with fluorescein in the contralateral eye (control). In each group, corneal touch threshold was determined by Cochet-Bonnet aesthesiometer for both eyes immediately prior to topical administration of solutions, at 1 min and 5 min after administration of topical solutions and every 5 min thereafter for a total of 75 min. Significant corneal anaesthesia was noted immediately following topical application of both oxybuprocaine and proparacaine as compared with controls, with maximal corneal anaesthesia noted 1 min after administration. Both oxybuprocaine and proparacaine produced significant corneal anaesthesia for the duration of the 75-min study. Neither oxybuprocaine hydrochloride nor proparacaine hydrochloride treatment resulted in visible adverse effects. There are limited data available demonstrating the efficacy and duration of corneal anaesthetic agents in cattle. Both oxybuprocaine hydrochloride and proparacaine hydrochloride should be considered practical options for providing corneal anaesthesia in cattle in a clinical setting. © 2016 Australian Veterinary Association.

  14. Normal modified stable processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler; Shephard, N.

    2002-01-01

    This paper discusses two classes of distributions, and stochastic processes derived from them: modified stable (MS) laws and normal modified stable (NMS) laws. This extends corresponding results for the generalised inverse Gaussian (GIG) and generalised hyperbolic (GH) or normal generalised inverse Gaussian (NGIG) laws. The wider framework thus established provides, in particular, for added flexibility in the modelling of the dynamics of financial time series, of importance especially as regards OU based stochastic volatility models for equities. In the special case of the tempered stable OU process...

  15. 30 CFR 778.11 - Providing applicant and operator information.

    Science.gov (United States)

    2010-07-01

    ... operator, if different from the applicant. (4) Each business entity in the applicant's and operator's organizational structure, up to and including the ultimate parent entity of the applicant and operator; for every... are corporations, partnerships, associations, sole proprietorships, or other business entities; (2...

  16. Type II iodothyronine deiodinase provides intracellular 3,5,3′-triiodothyronine to normal and regenerating mouse skeletal muscle

    Science.gov (United States)

    Marsili, Alessandro; Tang, Dan; Harney, John W.; Singh, Prabhat; Zavacki, Ann Marie; Dentice, Monica; Salvatore, Domenico

    2011-01-01

    The FoxO3-dependent increase in type II deiodinase (D2), which converts the prohormone thyroxine (T4) to 3,5,3′-triiodothyronine (T3), is required for normal mouse skeletal muscle differentiation and regeneration. This implies a requirement for an increase in D2-generated intracellular T3 under these conditions, which has not been directly demonstrated despite the presence of D2 activity in skeletal muscle. We directly show that D2-mediated T4-to-T3 conversion increases during differentiation in C2C12 myoblast and primary cultures of mouse neonatal skeletal muscle precursor cells, and that blockade of D2 eliminates this. In adult mice given 125I-T4 and 131I-T3, the intracellular 125I-T3/131I-T3 ratio is significantly higher than in serum in both the D2-expressing cerebral cortex and the skeletal muscle of wild-type, but not D2KO, mice. In D1-expressing liver and kidney, the 125I-T3/131I-T3 ratio does not differ from that in serum. Hypothyroidism increases D2 activity, and in agreement with this, the difference in 125I-T3/131I-T3 ratio is increased further in hypothyroid wild-type mice but not altered in the D2KO. Notably, in wild-type but not in D2KO mice, the muscle production of 125I-T3 is doubled after skeletal muscle injury. Thus, D2-mediated T4-to-T3 conversion generates significant intracellular T3 in normal mouse skeletal muscle, with the increased T3 required for muscle regeneration being provided by increased D2 synthesis, not by T3 from the circulation. PMID:21771965

  17. Skin perfusion measurement: the normal range, the effects of ambient temperature and its clinical application

    International Nuclear Information System (INIS)

    Henry, R.E.; Malone, J.M.; Daly, M.J.; Hughes, J.H.; Moore, W.S.

    1982-01-01

    Quantitation of skin perfusion provides objective criteria to determine the optimal amputation level in ischemic limb disease, to assess the maturation of pedicle flaps in reconstructive surgery, and to select appropriate treatment for chronic skin ulcers. A technique for measurement of skin perfusion using intradermal (ID) Xe-133 and a gamma camera/minicomputer system was previously reported. We now report an update of this procedure, the normal range for the lower extremity in men, observations on the effects of ambient temperature, and experience using the procedure to determine amputation level

  18. Commutative $C^*$-algebras and $\sigma$-normal morphisms

    OpenAIRE

    de Jeu, Marcel

    2003-01-01

    We prove in an elementary fashion that the image of a commutative monotone $\sigma$-complete $C^*$-algebra under a $\sigma$-normal morphism is again monotone $\sigma$-complete and give an application of this result in spectral theory.

  19. Spin-polarized transport in a normal/ferromagnetic/normal zigzag graphene nanoribbon junction

    International Nuclear Information System (INIS)

    Tian Hong-Yu; Wang Jun

    2012-01-01

    We investigate the spin-dependent electron transport in single and double normal/ferromagnetic/normal zigzag graphene nanoribbon (NG/FG/NG) junctions. The ferromagnetism in the FG region originates from the spontaneous magnetization of the zigzag graphene nanoribbon. It is shown that when the zigzag-chain number of the ribbon is even and only a single transverse mode is activated, the single NG/FG/NG junction can act as a spin polarizer and/or a spin analyzer because of the valley selection rule and the spin-exchange field in the FG, while the double NG/FG/NG/FG/NG junction exhibits a quantum switching effect, in which the on and off states switch rapidly as the cross angle between the two FG magnetizations is varied. Our findings may shed light on the application of magnetized graphene nanoribbons to spintronics devices. (condensed matter: electronic structure, electrical, magnetic, and optical properties)

  20. Application Service Providers (ASP) Adoption in Core and Non-Core Functions

    Directory of Open Access Journals (Sweden)

    Aman Y.M. Chan

    2009-10-01

    With further improvements in internet bandwidth, connection stability and data transmission security, a new wave of Application Service Providers (ASP) is on its way. The recent boom in models such as Software as a Service (SaaS) and On-Demand in 2008 has led to the emergence of the ASP model in core business functions. Traditional IS outsourcing covers the non-core business functions that are not critical to business performance and competitive advantage. Compared with traditional IS outsourcing, ASP is a new phenomenon that can be considered an emerging innovation, as it covers both core and non-core business functions. Most executives do not comprehend the differences and similarities between traditional IS outsourcing and the ASP model. Hence, we propose to conduct research to identify the determinants (cost benefit, gap in IS capability complementing the company's strategic goal, and trust in the ASP's service and security level) and moderating factors (management's attitude to ownership & control, and company aggressiveness) of the ASP adoption decision in both core and non-core business functions.

  1. Normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1997-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to moderate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library.

  2. A normal form approach to the theory of nonlinear betatronic motion

    International Nuclear Information System (INIS)

    Bazzani, A.; Todesco, E.; Turchetti, G.; Servizi, G.

    1994-01-01

    The betatronic motion of a particle in a circular accelerator is analysed using the transfer map description of the magnetic lattice. In the linear case the transfer matrix approach is shown to be equivalent to the Courant-Snyder theory: In the normal coordinates' representation the transfer matrix is a pure rotation. When the nonlinear effects due to the multipolar components of the magnetic field are taken into account, a similar procedure is used: a nonlinear change of coordinates provides a normal form representation of the map, which exhibits explicit symmetry properties depending on the absence or presence of resonance relations among the linear tunes. The use of normal forms is illustrated in the simplest but significant model of a cell with a sextupolar nonlinearity which is described by the quadratic Henon map. After recalling the basic theoretical results in Hamiltonian dynamics, we show how the normal forms describe the different topological structures of phase space such as KAM tori, chains of islands and chaotic regions; a critical comparison with the usual perturbation theory for Hamilton equations is given. The normal form theory is applied to compute the tune shift and deformation of the orbits for the lattices of the SPS and LHC accelerators, and scaling laws are obtained. Finally, the correction procedure of the multipolar errors of the LHC, based on the analytic minimization of the tune shift computed via the normal forms, is described and the results for a model of the LHC are presented. This application, relevant for the lattice design, focuses on the advantages of normal forms with respect to tracking when parametric dependences have to be explored. (orig.)
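
    The quadratic Henon map mentioned above (a sextupolar kick composed with a linear rotation) is simple to iterate numerically; a minimal sketch, with the linear tune as the only parameter:

```python
import numpy as np

def henon_map(x, p, tune):
    """One turn of the quadratic Henon map: a sextupolar kick
    p -> p + x**2 followed by a rotation through 2*pi*tune."""
    theta = 2.0 * np.pi * tune
    c, s = np.cos(theta), np.sin(theta)
    p = p + x**2
    return c * x + s * p, -s * x + c * p

# Track one particle to expose tori, island chains or chaotic regions.
x, p = 0.1, 0.0
orbit = [(x, p)]
for _ in range(1000):
    x, p = henon_map(x, p, tune=0.2054)
    orbit.append((x, p))
```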

  3. Not normally manned compression platforms for the North Sea

    International Nuclear Information System (INIS)

    Kumaran, K.S.

    1994-01-01

    Gas turbine driven gas compressors have been widely used on manned offshore facilities. Similarly, unmanned gas turbine driven compressor stations have been in operation onshore with major gas transmission companies in Europe, North America and elsewhere. This paper summarizes a recent joint industry study to investigate the technical and economic feasibility of Not Normally Manned (NNM) Offshore Compression Facilities in terms of reliability, availability and maintainability. Classification of not normally manned (or unmanned) offshore facilities in the UK North Sea is in accordance with HSE Operations Notice 8. ON8 specifies criteria for offshore visits, visit hours and number of personnel on board for the operation of NNM platforms. This paper describes a typical Southern North Sea gas platform being considered for NNM compressor application. The conclusion from the study was that NNM compression is technically feasible, with the facilities being able to provide an availability in excess of 98%. Life cycle costs were of the order of 70% of those for manned facilities, thus significantly improving field development economics

  4. Telehealth Applications to Enhance CKD Knowledge and Awareness Among Patients and Providers.

    Science.gov (United States)

    Tuot, Delphine S; Boulware, L Ebony

    2017-01-01

    CKD affects 13% of the US adult population, causes excess mortality, and is associated with significant sociodemographic disparities. Optimal CKD management slows progression of disease and reduces cardiovascular-related outcomes. Resources for patients and primary care providers, major stakeholders in preventive CKD care, are critically needed to enhance understanding of the disease and to optimize CKD health, particularly because of the asymptomatic nature of kidney disease. Telehealth is defined as the use of electronic communication and telecommunications technology to support long-distance clinical health care, patient and professional health-related education, and public health and health administration. It provides new opportunities to enhance awareness and understanding among these important stakeholders. This review will examine the role of telehealth within existing educational theories, identify telehealth applications that can enhance CKD knowledge and behavior change among patients and primary care providers, and examine the advantages and disadvantages of telehealth vs usual modalities for education. Copyright © 2016 National Kidney Foundation, Inc. Published by Elsevier Inc. All rights reserved.

  5. Bifactor model of WISC-IV: Applicability and measurement invariance in low and normal IQ groups.

    Science.gov (United States)

    Gomez, Rapson; Vance, Alasdair; Watson, Shaun

    2017-07-01

    This study examined the applicability and measurement invariance of the bifactor model of the 10 Wechsler Intelligence Scale for Children-Fourth Edition (WISC-IV) core subtests in groups of children and adolescents (age range from 6 to 16 years) with low (IQ ≤79; N = 229; % male = 75.9) and normal (IQ ≥80; N = 816; % male = 75.0) IQ scores. Results supported this model in both groups, and there was good support for measurement invariance for this model across these groups. For all participants together, the omega hierarchical and explained common variance (ECV) values were high for the general factor and low to negligible for the specific factors. Together, the findings favor the use of the Full Scale IQ (FSIQ) scores of the WISC-IV, but not the subscale index scores. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
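
    For reference, the explained common variance (ECV) statistic reported above can be computed from bifactor loadings as follows (a generic sketch, not the authors' code):

```python
import numpy as np

def explained_common_variance(general, specifics):
    """ECV of the general factor: its squared loadings divided by the
    squared loadings of all factors (general plus each specific)."""
    g = np.sum(np.square(general))
    s = sum(np.sum(np.square(f)) for f in specifics)
    return g / (g + s)
```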

  6. Application of 'Process management' methodology in providing financial services of PE 'Post Serbia'

    Directory of Open Access Journals (Sweden)

    Kujačić Momčilo D.

    2014-01-01

    The paper describes the application of the 'Process management' methodology to the provision of financial services in the post office counter hall. An overview of the methodology is given, as one of the most commonly used qualitative methodologies, and Process management techniques are described that can better meet user needs and market demands, as well as more effective ways to resist current competition in the postal service market. One of the main problems pointed out is the long waiting time in the counter hall during the provision of financial services, which leads to the formation of queues and thus to customer dissatisfaction. Accordingly, the paper outlines the steps that should be taken when providing financial services in a postal network unit, optimizing the time users spend waiting in line and increasing the satisfaction of all participants in the process.
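
    The waiting-time problem described above can be made concrete with a standard M/M/c queueing calculation; this is a hypothetical illustration of counter-hall queueing, not the methodology applied in the paper.

```python
from math import factorial

def mmc_mean_wait(arrival_rate, service_rate, servers):
    """Mean time spent queueing in an M/M/c system (Erlang C formula)."""
    a = arrival_rate / service_rate          # offered load in Erlangs
    rho = a / servers                        # server utilization
    if rho >= 1.0:
        raise ValueError("unstable queue: utilization must be below 1")
    tail = a**servers / (factorial(servers) * (1.0 - rho))
    p_wait = tail / (sum(a**k / factorial(k) for k in range(servers)) + tail)
    return p_wait / (servers * service_rate - arrival_rate)

# Example: 90 customers/hour, 35 served/hour per counter, 3 counters open.
print(mmc_mean_wait(90, 35, 3))              # mean wait in hours
```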

  7. Normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2015-01-01

    The Lie algebra generated by Hopf-zero classical normal forms is decomposed into two versal Lie subalgebras. Some dynamical properties for each subalgebra are described; one is the set of all volume-preserving conservative systems while the other is the maximal Lie algebra of nonconservative systems. This introduces a unique conservative–nonconservative decomposition for the normal form systems. There exists a Lie-subalgebra that is Lie-isomorphic to a large family of vector fields with Bogdanov–Takens singularity. This gives rise to a conclusion that the local dynamics of formal Hopf-zero singularities is well-understood by the study of Bogdanov–Takens singularities. Despite this, the normal form computations of Bogdanov–Takens and Hopf-zero singularities are independent. Thus, by assuming a quadratic nonzero condition, complete results on the simplest Hopf-zero normal forms are obtained in terms of the conservative–nonconservative decomposition. Some practical formulas are derived and the results implemented using Maple. The method has been applied on the Rössler and Kuramoto–Sivashinsky equations to demonstrate the applicability of our results. (paper)

  9. Great Basin land managers provide detailed feedback about usefulness of two climate information web applications

    Directory of Open Access Journals (Sweden)

    Chad Zanocco

    Land managers in the Great Basin are working to maintain or restore sagebrush ecosystems as climate change exacerbates existing threats. Web applications delivering climate change and climate impacts information have the potential to assist their efforts. Although many web applications containing climate information currently exist, few have been co-produced with land managers or have incorporated information specifically focused on land managers' needs. Through surveys and interviews, we gathered detailed feedback from federal, state, and tribal sagebrush land managers in the Great Basin on climate information web applications targeting land management. We found that (a) managers are searching for weather and climate information they can incorporate into their current management strategies and plans; (b) they are willing to be educated on how to find and understand climate related web applications; (c) both field and administrative-type managers want data for timescales ranging from seasonal to decadal; (d) managers want multiple levels of climate information, from simple summaries to detailed descriptions accessible through the application; and (e) managers are interested in applications that evaluate uncertainty and provide projected climate impacts. Keywords: Great Basin, Sagebrush, Land management, Climate change, Web application, Co-production

  10. Monitoring the normal body

    DEFF Research Database (Denmark)

    Nissen, Nina Konstantin; Holm, Lotte; Baarts, Charlotte

    2015-01-01

    ... provides us with knowledge about how to prevent future overweight or obesity. This paper investigates body size ideals and monitoring practices among normal-weight and moderately overweight people. Methods: The study is based on in-depth interviews combined with observations. 24 participants were recruited by strategic sampling based on self-reported BMI 18.5-29.9 kg/m2 and socio-demographic factors. Inductive analysis was conducted. Results: Normal-weight and moderately overweight people have clear ideals for their body size. Despite being normal weight or close to this, they construct a variety of practices for monitoring their bodies based on different kinds of calculations of weight and body size, observations of body shape, and measurements of bodily firmness. Biometric measurements are familiar to them, as are health authorities' recommendations. Despite not belonging to an extreme BMI category...

  11. Software reliability growth models with normal failure time distributions

    International Nuclear Information System (INIS)

    Okamura, Hiroyuki; Dohi, Tadashi; Osaki, Shunji

    2013-01-01

    This paper proposes software reliability growth models (SRGMs) in which the software failure time follows a normal distribution. The proposed model is mathematically tractable and fits software failure data well. In particular, we consider the parameter estimation algorithm for the SRGM with normal distribution. The developed algorithm is based on an EM (expectation-maximization) algorithm and is quite simple to implement as a software application. A numerical experiment investigates the fitting ability of the SRGMs with normal distribution through 16 types of failure time data collected in real software projects
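
    In the usual NHPP formulation such a model has mean value function m(t) = omega * Phi((t - mu) / sigma); a minimal sketch (omega, the expected total number of faults, is part of the standard NHPP formulation assumed here, not a detail taken from the paper):

```python
from math import erf, sqrt

def mean_failures(t, omega, mu, sigma):
    """Expected cumulative number of failures by time t for an NHPP-type
    SRGM with normally distributed failure time:
    m(t) = omega * Phi((t - mu) / sigma)."""
    phi = 0.5 * (1.0 + erf((t - mu) / (sigma * sqrt(2.0))))
    return omega * phi
```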

  12. Prácticas para estimular el parto normal [Practices to stimulate normal childbirth]

    Directory of Open Access Journals (Sweden)

    Flora Maria Barbosa da Silva

    2011-09-01

    This article leads to a reflection about the practices of encouraging normal childbirth, with the theoretical foundation for each one of them. The practices included in this study were fasting, enema, shower and immersion baths, walking, pelvic movements and massage. In a context of revaluation of normal birth, providing evidence-based comfort options for women during childbirth can be a way to preserve the physiological course of labour.

  13. Thermal Cameras and Applications

    DEFF Research Database (Denmark)

    Gade, Rikke; Moeslund, Thomas B.

    2014-01-01

    Thermal cameras are passive sensors that capture the infrared radiation emitted by all objects with a temperature above absolute zero. This type of camera was originally developed as a surveillance and night vision tool for the military, but recently the price has dropped significantly, opening up a broader field of applications. Deploying this type of sensor in vision systems eliminates the illumination problems of normal greyscale and RGB cameras. This survey provides an overview of the current applications of thermal cameras. Applications include animals, agriculture, buildings, gas detection, industrial, and military applications, as well as detection, tracking, and recognition of humans. Moreover, this survey describes the nature of thermal radiation and the technology of thermal cameras.

  14. An atlas of normal skeletal scintigraphy

    International Nuclear Information System (INIS)

    Flanagan, J.J.; Maisey, M.N.

    1985-01-01

    This atlas was compiled to provide the neophyte as well as the experienced radiologist and the nuclear medicine physician with a reference on normal skeletal scintigraphy as an aid in distinguishing normal variations in skeletal uptake from abnormal findings. Each skeletal scintigraph is labeled, and utilizing an identical scale, a relevant skeletal photograph and radiograph are placed adjacent to the scintigraph

  15. Application of microgrids in providing ancillary services to the utility grid

    International Nuclear Information System (INIS)

    Majzoobi, Alireza; Khodaei, Amin

    2017-01-01

    A microgrid optimal scheduling model is developed in this paper to demonstrate microgrid's capability in offering ancillary services to the utility grid. The application of localized ancillary services is of significant importance to grid operators as the growing proliferation of distributed renewable energy resources, mainly solar generation, is causing major technical challenges in supply-load balance. The proposed microgrid optimal scheduling model coordinates the microgrid net load with the aggregated consumers/prosumers net load in its connected distribution feeder to capture both inter-hour and intra-hour net load variations. In particular, net load variations for three various time resolutions are considered, including hourly ramping, 10-min based load following, and 1-min based frequency regulation. Numerical simulations on a test distribution feeder with one microgrid and several consumers/prosumers indicate the effectiveness of the proposed model and the viability of the microgrid application in supporting grid operation. - Highlights: • Microgrid optimal scheduling for providing ancillary services to the utility grid. • Local management and mitigation of distribution net load variations. • Offering various support services: ramping, load following, frequency regulation. • Proven effectiveness and accuracy in capturing net load variations.

  16. Ultrasonic off-normal imaging techniques for under sodium viewing

    International Nuclear Information System (INIS)

    Michaels, T.E.; Horn, J.E.

    1979-01-01

    Advanced imaging methods have been evaluated for the purpose of constructing images of objects from ultrasonic data. Feasibility of imaging surfaces which are off-normal to the sound beam has been established. Laboratory results are presented which show a complete image of a typical core component. Using the previous system developed for under sodium viewing (USV), only normal surfaces of this object could be imaged. Using advanced methods, surfaces up to 60 degrees off-normal have been imaged. Details of equipment and procedures used for this image construction are described. Additional work on high temperature transducers, electronics, and signal analysis is required in order to adapt the off-normal viewing process described here to an eventual USV application

  17. Sampling from the normal and exponential distributions

    International Nuclear Information System (INIS)

    Chaplin, K.R.; Wills, C.A.

    1982-01-01

    Methods for generating random numbers from the normal and exponential distributions are described. These involve dividing each function into subregions, and for each of these developing a method of sampling usually based on an acceptance rejection technique. When sampling from the normal or exponential distribution, each subregion provides the required random value with probability equal to the ratio of its area to the total area. Procedures written in FORTRAN for the CYBER 175/CDC 6600 system are provided to implement the two algorithms
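
    As an illustration of the acceptance-rejection idea (a generic textbook variant, not the CYBER/CDC FORTRAN procedures themselves), a standard normal can be sampled by rejection from an exponential envelope:

```python
import math
import random

def sample_exponential(rate=1.0):
    """Inverse-CDF sampling: X = -ln(1 - U) / rate, U ~ Uniform(0, 1)."""
    return -math.log(1.0 - random.random()) / rate

def sample_normal():
    """Draw |X| from an Exp(1) envelope, accept with probability
    exp(-(x - 1)^2 / 2), then attach a random sign: X ~ N(0, 1)."""
    while True:
        x = sample_exponential()
        if random.random() <= math.exp(-0.5 * (x - 1.0) ** 2):
            return x if random.random() < 0.5 else -x
```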

  18. Use of operator-provided, installed C/S equipment in IAEA safeguards

    International Nuclear Information System (INIS)

    Shea, T.; Rundquist, D.; Gaertner, K.; Yellin, E.

    1987-01-01

    Developing solutions for complex safeguards problems in close cooperation with Operators is becoming more common, especially as the IAEA continues to operate under zero-growth limitations. In practice this has taken various forms, from the extreme case of very specific equipment developed and constructed by the State/Operator for use in only one facility, to the more normal case where only the development is carried out by the State/Operator. This practice has advantages and disadvantages. For example, to ensure that Agency inspections will be carried out in a predictable manner, it will be in the Operator's interest to ensure that any equipment he provides is of the highest quality, meets all national safety requirements, and is installed and maintained in such a manner that it will provide years of service. Agency equipment performs its intended function in a reliable manner; but since Operator-provided equipment is designed with very specific, limited applications in mind, improvements in reliability over that obtained with normal Agency equipment are to be expected. Also, the authors' experience is that reaching acceptable arrangements for the use of State- or Operator-supplied equipment is often far more straightforward than arranging to apply Agency equipment

  19. Drug Use Normalization: A Systematic and Critical Mixed-Methods Review.

    Science.gov (United States)

    Sznitman, Sharon R; Taubman, Danielle S

    2016-09-01

    Drug use normalization, which is a process whereby drug use becomes less stigmatized and more accepted as normative behavior, provides a conceptual framework for understanding contemporary drug issues and changes in drug use trends. Through a mixed-methods systematic review of the normalization literature, this article seeks to (a) critically examine how the normalization framework has been applied in empirical research and (b) make recommendations for future research in this area. Twenty quantitative, 26 qualitative, and 4 mixed-methods studies were identified through five electronic databases and reference lists of published studies. Studies were assessed for relevance, study characteristics, quality, and aspects of normalization examined. None of the studies applied the most rigorous research design (experiments) or examined all of the originally proposed normalization dimensions. The most commonly assessed dimension of drug use normalization was "experimentation." In addition to the original dimensions, the review identified the following new normalization dimensions in the literature: (a) breakdown of demographic boundaries and other risk factors in relation to drug use; (b) de-normalization; (c) drug use as a means to achieve normal goals; and (d) two broad forms of micro-politics associated with managing the stigma of illicit drug use: assimilative and transformational normalization. Further development in normalization theory and methodology promises to provide researchers with a novel framework for improving our understanding of drug use in contemporary society. Specifically, quasi-experimental designs that are currently being made feasible by swift changes in cannabis policy provide researchers with new and improved opportunities to examine normalization processes.

  20. An inexact log-normal distribution-based stochastic chance-constrained model for agricultural water quality management

    Science.gov (United States)

    Wang, Yu; Fan, Jie; Xu, Ye; Sun, Wei; Chen, Dong

    2018-05-01

    In this study, an inexact log-normal-based stochastic chance-constrained programming model was developed for solving the non-point source pollution issues caused by agricultural activities. Compared to the general stochastic chance-constrained programming model, the main advantage of the proposed model is that it allows random variables to be expressed as a log-normal distribution, rather than a general normal distribution. Possible deviations in solutions caused by irrational parameter assumptions were avoided. The agricultural system management in the Erhai Lake watershed was used as a case study, where critical system factors, including rainfall and runoff amounts, show characteristics of a log-normal distribution. Several interval solutions were obtained under different constraint-satisfaction levels, which were useful in evaluating the trade-off between system economy and reliability. The applied results show that the proposed model could help decision makers to design optimal production patterns under complex uncertainties. The successful application of this model is expected to provide a good example for agricultural management in many other watersheds.
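
    To see how a log-normal random parameter enters such a model, consider a chance constraint Pr(a*x <= B) >= alpha with B log-normal: it reduces to a deterministic bound at the (1 - alpha) quantile of B. A minimal sketch of that reduction (illustrative; the paper's full interval model contains much more):

```python
import math
from scipy.stats import norm

def lognormal_quantile(mu, sigma, q):
    """q-quantile of B = exp(N(mu, sigma**2))."""
    return math.exp(mu + sigma * norm.ppf(q))

def chance_constraint_rhs(mu, sigma, alpha):
    """Pr(a*x <= B) >= alpha  holds  iff  a*x <= F_B^{-1}(1 - alpha),
    so the stochastic constraint becomes a deterministic inequality."""
    return lognormal_quantile(mu, sigma, 1.0 - alpha)
```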

  1. EMG normalization method based on grade 3 of manual muscle testing: Within- and between-day reliability of normalization tasks and application to gait analysis.

    Science.gov (United States)

    Tabard-Fougère, Anne; Rose-Dulcina, Kevin; Pittet, Vincent; Dayer, Romain; Vuillerme, Nicolas; Armand, Stéphane

    2018-02-01

    Electromyography (EMG) is an important parameter in Clinical Gait Analysis (CGA), and is generally interpreted with timing of activation. EMG amplitude comparisons between individuals, muscles or days need normalization. There is no consensus on existing methods. The gold standard, maximum voluntary isometric contraction (MVIC), is not adapted to pathological populations because patients are often unable to perform an MVIC. The normalization method inspired by the isometric grade 3 of manual muscle testing (isoMMT3), which is the ability of a muscle to maintain a position against gravity, could be an interesting alternative. The aim of this study was to evaluate the within- and between-day reliability of the isoMMT3 EMG normalizing method during gait compared with the conventional MVIC method. Lower limb muscles EMG (gluteus medius, rectus femoris, tibialis anterior, semitendinosus) were recorded bilaterally in nine healthy participants (five males, aged 29.7±6.2 years, BMI 22.7±3.3 kg/m2) giving a total of 18 independent legs. Three repeated measurements of the isoMMT3 and MVIC exercises were performed with an EMG recording. EMG amplitude of the muscles during gait was normalized by these two methods. This protocol was repeated one week later. Within- and between-day reliability of normalization tasks were similar for isoMMT3 and MVIC methods. Within- and between-day reliability of gait EMG normalized by isoMMT3 was higher than with MVIC normalization. These results indicate that EMG normalization using isoMMT3 is a reliable method with no special equipment needed and will support CGA interpretation. The next step will be to evaluate this method in pathological populations. Copyright © 2017 Elsevier B.V. All rights reserved.
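
    In practice, amplitude normalization of this kind expresses gait EMG as a percentage of the reference-task amplitude; a minimal sketch (the study's exact processing pipeline may differ):

```python
import numpy as np

def rms(signal):
    """Root-mean-square amplitude of an EMG segment."""
    return np.sqrt(np.mean(np.square(signal)))

def normalize_gait_emg(gait_segment, reference_trials):
    """Express gait EMG amplitude as a percentage of the mean RMS
    obtained during repeated normalization trials (isoMMT3 or MVIC)."""
    reference = np.mean([rms(trial) for trial in reference_trials])
    return 100.0 * rms(gait_segment) / reference
```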

  2. On applicability of PCA, voxel-wise variance normalization and dimensionality assumptions for sliding temporal window sICA in resting-state fMRI.

    Science.gov (United States)

    Remes, Jukka J; Abou Elseoud, Ahmed; Ollila, Esa; Haapea, Marianne; Starck, Tuomo; Nikkinen, Juha; Tervonen, Osmo; Silven, Olli

    2013-10-01

    Subject-level resting-state fMRI (RS-fMRI) spatial independent component analysis (sICA) may provide new ways to analyze the data when performed in the sliding time window. However, whether principal component analysis (PCA) and voxel-wise variance normalization (VN) are applicable pre-processing procedures in the sliding-window context, as they are for regular sICA, has not been addressed so far. Also model order selection requires further studies concerning sliding-window sICA. In this paper we have addressed these concerns. First, we compared PCA-retained subspaces concerning overlapping parts of consecutive temporal windows to answer whether in-window PCA and VN can confound comparisons between sICA analyses in consecutive windows. Second, we compared the PCA subspaces between windowed and full data to assess expected comparability between windowed and full-data sICA results. Third, temporal evolution of dimensionality estimates in RS-fMRI data sets was monitored to identify potential challenges in model order selection in a sliding-window sICA context. Our results illustrate that in-window VN can be safely used, in-window PCA is applicable with most window widths and that comparisons between windowed and full data should not be performed from a subspace similarity point of view. In addition, our studies on dimensionality estimates demonstrated that there are sustained, periodic and very case-specific changes in signal-to-noise ratio within RS-fMRI data sets. Consequently, dimensionality estimation is needed for well-founded model order determination in the sliding-window case. The observed periodic changes correspond to a frequency band of ≤0.1 Hz, which is commonly associated with brain activity in RS-fMRI and become on average most pronounced at window widths of 80 and 60 time points (144 and 108 s, respectively). Wider windows provided only slightly better comparability between consecutive windows, and 60 time point or shorter windows also provided the
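
    For concreteness, voxel-wise variance normalization within a temporal window is a per-voxel standardization of the windowed time series; a minimal sketch:

```python
import numpy as np

def variance_normalize(window):
    """Voxel-wise variance normalization of windowed fMRI data.
    window: array of shape (n_timepoints, n_voxels); each voxel's
    time series is centered and scaled to unit variance."""
    mean = window.mean(axis=0)
    std = window.std(axis=0, ddof=1)
    std[std == 0] = 1.0              # leave constant voxels unscaled
    return (window - mean) / std
```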

  3. Method for construction of normalized cDNA libraries

    Science.gov (United States)

    Soares, Marcelo B.; Efstratiadis, Argiris

    1998-01-01

    This invention provides a method to normalize a directional cDNA library constructed in a vector that allows propagation in single-stranded circle form comprising: (a) propagating the directional cDNA library in single-stranded circles; (b) generating fragments complementary to the 3' noncoding sequence of the single-stranded circles in the library to produce partial duplexes; (c) purifying the partial duplexes; (d) melting and reassociating the purified partial duplexes to appropriate Cot; and (e) purifying the unassociated single-stranded circles, thereby generating a normalized cDNA library. This invention also provides normalized cDNA libraries generated by the above-described method and uses of the generated libraries.

  4. Statistical Theory of Normal Grain Growth Revisited

    International Nuclear Information System (INIS)

    Gadomski, A.; Luczka, J.

    2002-01-01

    In this paper, we discuss three physically relevant problems concerning the normal grain growth process. These are: infinite vs finite size of the system under study (a step towards more realistic modeling); conditions of fine-grained structure formation, with possible applications to thin films and biomembranes, and interesting relations to superplasticity of materials; and the approach to log-normality, a ubiquitous natural phenomenon frequently reported in the literature. It turns out that all three points mentioned can be accommodated within the Mulheran-Harding type behavior of evolving grain-containing systems that we have studied previously. (author)

  5. Normal-incidence spectroscopic ellipsometry for critical dimension monitoring

    International Nuclear Information System (INIS)

    Huang, Hsu-Ting; Kong, Wei; Terry, Fred Lewis

    2001-01-01

    In this letter, we show that normal-incidence spectroscopic ellipsometry can be used for high-accuracy topography measurements on surface relief gratings. We present both experimental and theoretical results which show that spectroscopic ellipsometry or reflectance-difference spectroscopy at near-normal incidence coupled with vector diffraction theory for data analysis is capable of high-accuracy critical dimension (CD), feature height, and sidewall angle measurements in the extreme submicron regime. Quantitative comparisons of optical and cross-sectional scanning electron microscopy (SEM) topography measurements from a number of 350 nm line/space reactive-ion-etched Si gratings demonstrate the strong potential for in situ etching monitoring. This technique can be used for both ex situ and in situ applications and has the potential to replace the use of CD-SEM measurements in some applications. © 2001 American Institute of Physics

  6. [Barriers to the normalization of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts].

    Science.gov (United States)

    Roig, Francesc; Saigí, Francesc

    2011-01-01

    Despite the clear political will to promote telemedicine and the large number of initiatives, the incorporation of this modality in clinical practice remains limited. The objective of this study was to identify the barriers perceived by key professionals who actively participate in the design and implementation of telemedicine in a healthcare system model based on purchasing of healthcare services using providers' contracts. We performed a qualitative study based on data from semi-structured interviews with 17 key informants belonging to distinct Catalan health organizations. The barriers identified were grouped in four areas: technological, organizational, human and economic. The main barriers identified were changes in the healthcare model caused by telemedicine, problems with strategic alignment, resistance to change in the (re)definition of roles, responsibilities and new skills, and lack of a business model that incorporates telemedicine in the services portfolio to ensure its sustainability. In addition to suitable management of change and of the necessary strategic alignment, the definitive normalization of telemedicine in a mixed healthcare model based on purchasing of healthcare services using providers' contracts requires a clear and stable business model that incorporates this modality in the services portfolio and allows healthcare organizations to obtain reimbursement from the payer. 2010 SESPAS. Published by Elsevier Espana. All rights reserved.

  7. A method for named entity normalization in biomedical articles: application to diseases and plants.

    Science.gov (United States)

    Cho, Hyejin; Choi, Wonjun; Lee, Hyunju

    2017-10-13

    In biomedical articles, a named entity recognition (NER) technique that identifies entity names from texts is an important element for extracting biological knowledge from articles. After NER is applied to articles, the next step is to normalize the identified names into standard concepts (i.e., disease names are mapped to the National Library of Medicine's Medical Subject Headings disease terms). In biomedical articles, many entity normalization methods rely on domain-specific dictionaries for resolving synonyms and abbreviations. However, the dictionaries are not comprehensive except for some entities such as genes. In recent years, biomedical articles have accumulated rapidly, and neural network-based algorithms that incorporate a large amount of unlabeled data have shown considerable success in several natural language processing problems. In this study, we propose an approach for normalizing biological entities, such as disease names and plant names, by using word embeddings to represent semantic spaces. For diseases, training data from the National Center for Biotechnology Information (NCBI) disease corpus and unlabeled data from PubMed abstracts were used to construct word representations. For plants, a training corpus that we manually constructed and unlabeled PubMed abstracts were used to represent word vectors. We showed that the proposed approach performed better than the use of only the training corpus or only the unlabeled data and showed that the normalization accuracy was improved by using our model even when the dictionaries were not comprehensive. We obtained F-scores of 0.808 and 0.690 for normalizing the NCBI disease corpus and manually constructed plant corpus, respectively. We further evaluated our approach using a data set in the disease normalization task of the BioCreative V challenge. When only the disease corpus was used as a dictionary, our approach significantly outperformed the best system of the task. The proposed approach shows robust
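
    A minimal sketch of the core step, mapping a recognized mention to its nearest dictionary concept in the embedding space by cosine similarity (the full system also builds the embeddings from labeled and unlabeled corpora):

```python
import numpy as np

def nearest_concept(mention_vec, concept_matrix, concept_ids):
    """Return the concept whose embedding is most cosine-similar to the
    mention embedding; concept_matrix has one row per concept."""
    sims = concept_matrix @ mention_vec / (
        np.linalg.norm(concept_matrix, axis=1) * np.linalg.norm(mention_vec))
    return concept_ids[int(np.argmax(sims))]
```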

  8. Space-based observatories providing key data for climate change applications

    Science.gov (United States)

    Lecomte, J.; Juillet, J. J.

    2016-12-01

    The Sentinel-1 and Sentinel-3 missions are part of the Copernicus program, previously known as GMES (Global Monitoring for Environment and Security), whose overall objective is to support Europe's goals regarding sustainable development and global governance of the environment by providing timely and quality data, information, services and knowledge. This European Earth Observation program is led by the European Commission, and the space infrastructure is developed under the leadership of the European Space Agency. Many services will be developed through the Copernicus program across different thematic areas. Climate change is one of these thematic areas, and the Sentinel-1 and Sentinel-3 satellites will provide key space-based observations in this area. The Sentinel-1 mission is based on a constellation of two identical satellites, each carrying a C-SAR instrument, and provides the capability for continuous radar mapping of the Earth with enhanced revisit frequency, coverage, timeliness and reliability for operational services and applications requiring long time series. In particular, Sentinel-1 provides all-weather, day-and-night estimates of soil moisture, wind speed and direction, sea ice, continental ice sheets and glaciers. The Sentinel-3 mission will mainly be devoted to the provision of ocean observation data in a routine, long-term (20 years of operations) and continuous fashion with a consistent quality and a very high level of availability. Among these data, very accurate surface temperature and topography measurements will be provided and will constitute key indicators, once ingested into climate change models, for identifying climate drivers and expected climate impacts. The paper will briefly recall the satellite architectures, their main characteristics and performance. The in-flight performance and key features of the images and data of the three satellites, namely Sentinel-1A, 1B and 3A, will be reviewed to demonstrate the quality and high scientific potential of the data as well as their

  9. Statistical validation of normal tissue complication probability models

    NARCIS (Netherlands)

    Xu, Cheng-Jian; van der Schaaf, Arjen; van t Veld, Aart; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-01-01

    PURPOSE: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. METHODS AND MATERIALS: A penalized regression method, LASSO (least absolute shrinkage

  10. Provider-Independent Use of the Cloud

    Science.gov (United States)

    Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron

    Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on-demand; when a user requires computing resources a provider will provision a resource for them and charge them only for their period of use of that resource. There has been a significant growth in the number of cloud computing resource providers and each has a different resource usage model, application process and application programming interface (API)-developing generic multi-resource provider applications is thus difficult and time consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model and API for compute providers that enables cloud-provider neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers and provide examples of developing provider neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
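
    The abstraction layer described can be pictured as a single interface that each provider adapter implements; the sketch below is hypothetical (the names and methods are illustrative, not the authors' actual API):

```python
from abc import ABC, abstractmethod

class ComputeProvider(ABC):
    """Hypothetical provider-neutral interface: one resource-usage model
    that concrete adapters map onto each cloud provider's own API."""

    @abstractmethod
    def provision(self, image: str, size: str) -> str:
        """Start a compute resource and return its identifier."""

    @abstractmethod
    def release(self, resource_id: str) -> None:
        """Stop the resource, ending the period of charged use."""
```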

  11. Masturbation, sexuality, and adaptation: normalization in adolescence.

    Science.gov (United States)

    Shapiro, Theodore

    2008-03-01

    During adolescence the central masturbation fantasy that is formulated during childhood takes its final form and paradoxically must now be directed outward for appropriate object finding and pair matching in the service of procreative aims. This is a step in adaptation that requires a further developmental landmark that I have called normalization. The path toward airing these private fantasies is facilitated by chumship relationships as a step toward further exposure to the social surround. Hartmann's structuring application of adaptation within psychoanalysis is used as a framework for understanding the process that simultaneously serves intrapsychic and social demands and permits goals that follow evolutionary principles. Variations in the normalization process from masturbatory isolation to a variety of forms of sexual socialization are examined in sociological data concerning current adolescent sexual behavior and in case examples that indicate some routes to normalized experience and practice.

  12. Facial-Attractiveness Choices Are Predicted by Divisive Normalization.

    Science.gov (United States)

    Furl, Nicholas

    2016-10-01

    Do people appear more attractive or less attractive depending on the company they keep? A divisive-normalization account, in which the representation of stimulus intensity is normalized (divided) by concurrent stimulus intensities, predicts that choice preferences among options increase with the range of option values. In the first experiment reported here, I manipulated the range of attractiveness of the faces presented on each trial by varying the attractiveness of an undesirable distractor face that was presented simultaneously with two attractive targets, and participants were asked to choose the most attractive face. I used normalization models to predict the context dependence of preferences regarding facial attractiveness. The more unattractive the distractor, the more one of the targets was preferred over the other target, which suggests that divisive normalization (a potential canonical computation in the brain) influences social evaluations. I obtained the same result when I manipulated faces' averageness and participants chose the most average face. This finding suggests that divisive normalization is not restricted to value-based decisions (e.g., attractiveness). This new application of normalization, a classic theory, to social evaluation opens possibilities for predicting social decisions in naturalistic contexts such as advertising or dating.
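
    The account is easy to state numerically: each option's intensity is divided by (a constant plus) the summed intensity of all options on the trial. A minimal sketch with made-up attractiveness values, showing how an unattractive distractor widens the normalized gap between two targets:

        import numpy as np

        def divisive_normalization(values, sigma=1.0):
            """Each option's intensity divided by the summed intensity of all
            concurrent options (plus a semi-saturation constant sigma)."""
            v = np.asarray(values, dtype=float)
            return v / (sigma + v.sum())

        # Two attractive targets plus a distractor; lowering the distractor's
        # value increases the normalized gap between the targets.
        print(divisive_normalization([8.0, 7.0, 6.0]))   # mild distractor
        print(divisive_normalization([8.0, 7.0, 1.0]))   # unattractive distractor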

  13. Using partially labeled data for normal mixture identification with application to class definition

    Science.gov (United States)

    Shahshahani, Behzad M.; Landgrebe, David A.

    1992-01-01

    The problem of estimating the parameters of a normal mixture density when, in addition to the unlabeled samples, sets of partially labeled samples are available is addressed. The density of the multidimensional feature space is modeled with a normal mixture. It is assumed that the set of components of the mixture can be partitioned into several classes and that training samples are available from each class. Since for any training sample the class of origin is known but the exact component of origin within the corresponding class is unknown, the training samples are considered to be partially labeled. The EM iterative equations are derived for estimating the parameters of the normal mixture in the presence of partially labeled samples. These equations can be used to combine the supervised and nonsupervised learning processes.
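
    A one-dimensional sketch of the idea (an illustration only, not the paper's multidimensional derivation): the E-step is the usual responsibility computation, except that a labeled sample's responsibilities are confined to the components belonging to its known class.

        import numpy as np
        from scipy.stats import norm

        def em_partial_labels(x, labels, class_of, n_comp, n_iter=100):
            """EM for a 1-D normal mixture with partially labeled samples.
            x        : (n,) data; labels: (n,) class per sample, -1 if unlabeled
            class_of : (n_comp,) class index of each mixture component
            """
            x, labels, class_of = map(np.asarray, (x, labels, class_of))
            mu = np.linspace(x.min(), x.max(), n_comp)
            sd = np.full(n_comp, x.std())
            w = np.full(n_comp, 1.0 / n_comp)
            for _ in range(n_iter):
                r = w * norm.pdf(x[:, None], mu, sd)      # E-step responsibilities
                # a labeled sample may only be explained by components of its class
                ok = (labels[:, None] == -1) | (class_of[None, :] == labels[:, None])
                r = np.where(ok, r, 0.0)
                r /= r.sum(axis=1, keepdims=True)
                nk = r.sum(axis=0)                        # M-step weighted updates
                w, mu = nk / len(x), (r * x[:, None]).sum(0) / nk
                sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(0) / nk)
            return w, mu, sd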

  14. powerbox: Arbitrarily structured, arbitrary-dimension boxes and log-normal mocks

    Science.gov (United States)

    Murray, Steven G.

    2018-05-01

    powerbox creates density grids (or boxes) with an arbitrary two-point distribution (i.e. power spectrum). The software works in any number of dimensions, creates Gaussian or Log-Normal fields, and measures power spectra of output fields to ensure consistency. The primary motivation for creating the code was the simple creation of log-normal mock galaxy distributions, but the methodology can be used for other applications.
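
    The methodology is compact enough to sketch directly in numpy; the following is a hedged illustration of the general log-normal mock recipe (a Gaussian random field with a chosen power spectrum, exponentiated and mean-shifted), not powerbox's actual interface:

        import numpy as np

        def lognormal_field(n=256, boxlength=1.0, pk=lambda k: 0.1 * k**-2.0, seed=0):
            """2-D Gaussian random field with power spectrum pk, exponentiated
            into a log-normal overdensity field with mean ~0."""
            rng = np.random.default_rng(seed)
            k = 2 * np.pi * np.fft.fftfreq(n, d=boxlength / n)
            kx, ky = np.meshgrid(k, k, indexing="ij")
            kmag = np.hypot(kx, ky)
            amp = np.zeros_like(kmag)
            amp[kmag > 0] = np.sqrt(pk(kmag[kmag > 0]))
            # random complex noise shaped by the target spectrum
            noise = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
            gauss = np.fft.ifft2(amp * noise).real
            gauss /= gauss.std()                          # unit-variance Gaussian field
            return np.exp(gauss - gauss.var() / 2) - 1.0  # log-normal overdensity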

  15. CT of Normal Developmental and Variant Anatomy of the Pediatric Skull: Distinguishing Trauma from Normality.

    Science.gov (United States)

    Idriz, Sanjin; Patel, Jaymin H; Ameli Renani, Seyed; Allan, Rosemary; Vlahos, Ioannis

    2015-01-01

    The use of computed tomography (CT) in clinical practice has been increasing rapidly, with the number of CT examinations performed in adults and children rising by 10% per year in England. Because the radiology community strives to reduce the radiation dose associated with pediatric examinations, external factors, including guidelines for pediatric head injury, are raising expectations for use of cranial CT in the pediatric population. Thus, radiologists are increasingly likely to encounter pediatric head CT examinations in daily practice. The variable appearance of cranial sutures at different ages can be confusing for inexperienced readers of radiologic images. The evolution of multidetector CT with thin-section acquisition increases the clarity of some of these sutures, which may be misinterpreted as fractures. Familiarity with the normal anatomy of the pediatric skull, how it changes with age, and normal variants can assist in translating the increased resolution of multidetector CT into more accurate detection of fractures and confident determination of normality, thereby reducing prolonged hospitalization of children with normal developmental structures that have been misinterpreted as fractures. More important, the potential morbidity and mortality related to false-negative interpretation of fractures as normal sutures may be avoided. The authors describe the normal anatomy of all standard pediatric sutures, common variants, and sutural mimics, thereby providing an accurate and safe framework for CT evaluation of skull trauma in pediatric patients. (©)RSNA, 2015.

  16. Insurer Provide Rating Software and Competitive Advantage: An Evaluation Based on BOP Insurance Applications at Wisconsin Agencies

    OpenAIRE

    Thomas A. Aiuppa; William E. Wehrs

    1993-01-01

    This research examines whether a competitive advantage exists for insurers which provide independent agencies with rating software to generate price quotes. Data relative to the Business Owners line of insurance were collected from several Wisconsin agencies which represent several BOP insurers, some of which provide software for rating BOP applications. The results indicate that providing rating software did not increase insurers’ business volumes, but may decrease their underwriting costs. ...

  17. Normal anatomy of lung perfusion SPECT scintigraphy

    International Nuclear Information System (INIS)

    Moskowitz, G.W.; Levy, L.M.

    1987-01-01

    Ten patients studied for possible pulmonary embolic disease had normal lung perfusion planar and SPECT scintigraphy. A computer program was developed to superimpose the CT scans on corresponding SPECT images. Superimposition of CT scans on corresponding SPECT transaxial cross-sectional images, when available, provides the needed definition and relationships of adjacent organs. SPECT transaxial sections provide clear anatomic definition of perfusion defects without foreground and background lung tissue superimposed. The location, shape, and size of the perfusion defects can be readily assessed by SPECT. An algorithm was developed for the differentiation of abnormal pulmonary perfusion patterns from normal structures on variation

  18. Providing Geospatial Education and Real World Applications of Data across the Climate Initiative Themes

    Science.gov (United States)

    Weigel, A. M.; Griffin, R.; Bugbee, K.

    2015-12-01

    Various organizations such as the Group on Earth Observations (GEO) have developed a structure for general thematic areas in Earth science research, however the Climate Data Initiative (CDI) is addressing the challenging goal of organizing such datasets around core themes specifically related to climate change impacts. These thematic areas, which currently include coastal flooding, food resilience, ecosystem vulnerability, water, transportation, energy infrastructure, and human health, form the core of a new college course at the University of Alabama in Huntsville developed around real-world applications in the Earth sciences. The goal of this course is to educate students on the data available and scope of GIS applications in Earth science across the CDI climate themes. Real world applications and datasets serve as a pedagogical tool that provide a useful medium for instruction in scientific geospatial analysis and GIS software. With a wide range of potential research areas that fall under the rubric of "Earth science", thematic foci can help to structure a student's understanding of the potential uses of GIS across sub-disciplines, while communicating core data processing concepts. The learning modules and use-case scenarios for this course demonstrate the potential applications of CDI data to undergraduate and graduate Earth science students.

  19. Users' and providers' perspectives on technological procedures for 'normal' childbirth in a public maternity hospital in Salvador, Brazil

    Directory of Open Access Journals (Sweden)

    Cecilia McCallum

    2008-02-01

    OBJECTIVE: To reveal the effect of cultural practices on the way in which normal birth is conducted in a public hospital in Brazil. MATERIAL AND METHODS: This article about a public maternity hospital in Salvador, Brazil, compares the points of view of providers and users on four technological normal childbirth procedures: trichotomy, episiotomy, oxytocin infusion, and epidural analgesia. Fieldwork carried out from 2002 to 2003 combined qualitative and quantitative methods. RESULTS: Institutional practices make childbirth unnecessarily difficult for women. Nonetheless, most women accept the conditions because the medical procedures make sense according to their cultural understandings. Service providers support the use of such procedures, although doctors are aware that they contradict recommendations found in the scientific medical literature. This article argues that from the perspective of both providers and users, the technological procedures are infused with a culturally specific set of meanings and values. CONCLUSIONS: Policymakers must address the cultural understandings of both users and health care professionals in order to improve maternal healthcare in public hospitals in Brazil.

  20. Designing area optimized application-specific network-on-chip architectures while providing hard QoS guarantees.

    Directory of Open Access Journals (Sweden)

    Sajid Gul Khawaja

    With the increase in transistor density, the popularity of System on Chip (SoC) has increased exponentially. As a communication module for SoC, the Network on Chip (NoC) framework has been adopted as its backbone. In this paper, we propose a methodology for designing area-optimized application-specific NoCs while providing hard Quality of Service (QoS) guarantees for real-time flows. The novelty of the proposed system lies in the derivation of a Mixed Integer Linear Programming model which is then used to generate a resource-optimal Network on Chip (NoC) topology and architecture while considering traffic and QoS requirements. We also present the micro-architectural design features used for enabling traffic and latency guarantees and discuss how the solution adapts to dynamic variations in the application traffic. The paper highlights the effectiveness of the proposed method by generating resource-efficient NoC solutions for both industrial and benchmark applications. The area-optimized results are generated in a few seconds by the proposed technique, without resorting to heuristics, even for an application with 48 traffic flows.
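
    To give a flavor of such a formulation (a toy three-node model with made-up demands and capacities, not the paper's actual MILP), one can minimize the number of instantiated links subject to hard per-link bandwidth guarantees, with a binary routing choice per flow:

        # Toy MILP in the spirit of the paper's formulation (pip install pulp).
        from itertools import combinations
        from pulp import LpBinary, LpMinimize, LpProblem, LpVariable, lpSum, value

        nodes = ["A", "B", "C"]
        flows = {("A", "B"): 200, ("A", "C"): 150, ("B", "C"): 100}  # demands (made up)
        cap = 300                                                    # per-link capacity

        links = list(combinations(nodes, 2))
        prob = LpProblem("noc_topology", LpMinimize)
        use = {l: LpVariable(f"use_{l[0]}{l[1]}", cat=LpBinary) for l in links}
        direct = {f: LpVariable(f"direct_{f[0]}{f[1]}", cat=LpBinary) for f in flows}

        prob += lpSum(use.values())        # minimize instantiated links (crude area proxy)

        load = {l: [] for l in links}
        for (s, d), bw in flows.items():
            mid = next(n for n in nodes if n not in (s, d))
            load[tuple(sorted((s, d)))].append(bw * direct[(s, d)])          # direct route
            load[tuple(sorted((s, mid)))].append(bw * (1 - direct[(s, d)]))  # or two hops
            load[tuple(sorted((mid, d)))].append(bw * (1 - direct[(s, d)]))
        for l in links:
            prob += lpSum(load[l]) <= cap * use[l]   # hard bandwidth guarantee per link

        prob.solve()
        print([l for l in links if value(use[l]) > 0.5])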

  1. An Algorithm for Higher Order Hopf Normal Forms

    Directory of Open Access Journals (Sweden)

    A.Y.T. Leung

    1995-01-01

    Normal form theory is important for studying the qualitative behavior of nonlinear oscillators. In some cases, higher order normal forms are required to understand the dynamic behavior near an equilibrium or a periodic orbit. However, the computation of high-order normal forms is usually quite complicated. This article provides an explicit formula for the normalization of nonlinear differential equations. The higher order normal form is given explicitly. Illustrative examples include a cubic system, a quadratic system and a Duffing–Van der Pol system. We use exact arithmetic and find that the undamped Duffing equation can be represented by an exact polynomial differential amplitude equation in a finite number of terms.

  2. NOMAD-Ref: visualization, deformation and refinement of macromolecular structures based on all-atom normal mode analysis.

    Science.gov (United States)

    Lindahl, Erik; Azuara, Cyril; Koehl, Patrice; Delarue, Marc

    2006-07-01

    Normal mode analysis (NMA) is an efficient way to study collective motions in biomolecules that bypasses the computational costs and many limitations associated with full dynamics simulations. The NOMAD-Ref web server presented here provides tools for online calculation of the normal modes of large molecules (up to 100,000 atoms) maintaining a full all-atom representation of their structures, as well as access to a number of programs that utilize these collective motions for deformation and refinement of biomolecular structures. Applications include the generation of sets of decoys with correct stereochemistry but arbitrary large amplitude movements, the quantification of the overlap between alternative conformations of a molecule, refinement of structures against experimental data, such as X-ray diffraction structure factors or Cryo-EM maps and optimization of docked complexes by modeling receptor/ligand flexibility through normal mode motions. The server can be accessed at the URL http://lorentz.immstr.pasteur.fr/nomad-ref.php.

  3. Multispectral histogram normalization contrast enhancement

    Science.gov (United States)

    Soha, J. M.; Schwartz, A. A.

    1979-01-01

    A multispectral histogram normalization or decorrelation enhancement which achieves effective color composites by removing interband correlation is described. The enhancement procedure employs either linear or nonlinear transformations to equalize principal component variances. An additional rotation to any set of orthogonal coordinates is thus possible, while full histogram utilization is maintained by avoiding the reintroduction of correlation. For the three-dimensional case, the enhancement procedure may be implemented with a lookup table. An application of the enhancement to Landsat multispectral scanning imagery is presented.
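
    A minimal numpy sketch of the idea, whitening in band space (rotate to principal components, equalize their variances, rotate back), offered as an illustration of the procedure rather than the exact enhancement pipeline used for Landsat:

        import numpy as np

        def decorrelation_stretch(img):
            """img: (rows, cols, bands) array. Rotate to principal components,
            equalize their variances, rotate back - removing interband
            correlation while staying in the original band space."""
            pixels = img.reshape(-1, img.shape[-1]).astype(float)
            mean = pixels.mean(axis=0)
            eigval, eigvec = np.linalg.eigh(np.cov(pixels, rowvar=False))
            # whitening transform: V diag(1/sqrt(lambda)) V^T
            t = eigvec @ np.diag(1.0 / np.sqrt(eigval)) @ eigvec.T
            out = (pixels - mean) @ t
            return out.reshape(img.shape)   # rescale per band for display as needed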

  4. The normalized administration of hybrid operating room: its practical application in managing multiple injuries

    International Nuclear Information System (INIS)

    Li Xue; Zhang Weiguo; Zhang Lianyang; Chen Tingjing; Chen Jinhua

    2011-01-01

    Objective: To expand the use of the hybrid operating room to the management of multiple injuries by carrying out its normalized administration, so that operative management becomes standardized and programmed, the cooperation and efficiency of hybrid operations for multiple injuries are improved, and the safety of the surgeries is ensured. Methods: According to the characteristics of hybrid interventional operations for multiple injuries, the basic construction of the hybrid operating room was improved, a hybrid operation team was organized, and an administrative system as well as a working program were established. A green channel for rescuing patients with multiple injuries was set up, and the cooperative behavior during interventional treatment for multiple injuries was specified. Results: The coordination and working efficiency of physicians, nurses, technicians and anesthetists were well improved. The qualified rate of laminar flow administration reached 100%. The success rate of rescue of multiple injuries was increased. Conclusion: As one-stop complex interventional operation for multiple injuries is a new technique, there is no integrated administration system for it. Therefore, the establishment of standardized management of one-stop complex interventional operations is of great significance in guiding clinical practice. (authors)

  5. Maximization of energy recovery inside supersonic separator in the presence of condensation and normal shock wave

    International Nuclear Information System (INIS)

    Shooshtari, S.H. Rajaee; Shahsavand, A.

    2017-01-01

    Natural gas provides around a quarter of energy consumption around the globe. Supersonic separators (3S) play a multifaceted role in natural gas processing, especially for water and hydrocarbon dew point corrections. These state-of-the-art devices have minimal energy requirements and favorable process economics compared to conventional facilities. Their relatively large pressure drops may limit their application in some situations. To maximize the energy recovery of the dew point correction facility, the pressure loss across the 3S unit should be minimized. The optimal structure of the 3S unit (including shock wave location and diffuser angle) is selected using the simultaneous combination of normal shock occurrence and condensation in the presence of nucleation and growth processes. The condensate-free gas enters the non-isentropic normal shock wave. The simulation results indicate that the normal shock location, pressure recovery coefficient and onset position vary strongly up to a certain diffuser angle (β = 8°), with a maximum pressure recovery of 0.88, which leads to minimum potential energy loss. Computational fluid dynamics simulations show that separation of the boundary layer does not happen for the computed optimal value of β and that it is essentially constant when the inlet gas temperatures and pressures vary over a relatively broad range. - Highlights: • Supersonic separators have found numerous applications in oil and gas industries. • Maximum pressure recovery is crucial for such units to maximize energy efficiency. • Simultaneous condensation and shock wave occurrence are studied for the first time. • A diverging nozzle angle of 8° can provide a maximum pressure recovery of 0.88. • The optimal diffuser angle remains constant over a broad range of inlet conditions.

  6. Quaternion normalization in additive EKF for spacecraft attitude determination. [Extended Kalman Filters

    Science.gov (United States)

    Bar-Itzhack, I. Y.; Deutschmann, J.; Markley, F. L.

    1991-01-01

    This work introduces, examines and compares several quaternion normalization algorithms, which are shown to be an effective stage in the application of the additive extended Kalman filter to spacecraft attitude determination based on vector measurements. Three new normalization schemes are introduced. They are compared with one another and with the known brute force normalization scheme, and their efficiency is examined. Simulated satellite data are used to demonstrate the performance of all four schemes.
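
    The brute force scheme referenced above is simply division by the Euclidean norm, restoring the unit-norm constraint that an additive update violates; a minimal sketch with a made-up drifted quaternion:

        import numpy as np

        def normalize_quaternion(q):
            """Brute force normalization: divide by the Euclidean norm so the
            estimate again lies on the unit sphere after an additive update."""
            q = np.asarray(q, dtype=float)
            return q / np.linalg.norm(q)

        q_updated = np.array([0.714, 0.021, -0.013, 0.699])  # made-up post-update estimate
        q_hat = normalize_quaternion(q_updated)
        print(q_hat, np.linalg.norm(q_hat))                  # norm restored to 1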

  7. Crystallization features of normal alkanes in confined geometry.

    Science.gov (United States)

    Su, Yunlan; Liu, Guoming; Xie, Baoquan; Fu, Dongsheng; Wang, Dujin

    2014-01-21

    How polymers crystallize can greatly affect their thermal and mechanical properties, which influence the practical applications of these materials. Polymeric materials, such as block copolymers, graft polymers, and polymer blends, have complex molecular structures. Due to the multiple hierarchical structures and different size domains in polymer systems, confined hard environments for polymer crystallization exist widely in these materials. The confined geometry is closely related to both the phase metastability and lifetime of polymer. This affects the phase miscibility, microphase separation, and crystallization behaviors and determines both the performance of polymer materials and how easily these materials can be processed. Furthermore, the size effect of metastable states needs to be clarified in polymers. However, scientists find it difficult to propose a quantitative formula to describe the transition dynamics of metastable states in these complex systems. Normal alkanes [CnH2n+2, n-alkanes], especially linear saturated hydrocarbons, can provide a well-defined model system for studying the complex crystallization behaviors of polymer materials, surfactants, and lipids. Therefore, a deeper investigation of normal alkane phase behavior in confinement will help scientists to understand the crystalline phase transition and ultimate properties of many polymeric materials, especially polyolefins. In this Account, we provide an in-depth look at the research concerning the confined crystallization behavior of n-alkanes and binary mixtures in microcapsules by our laboratory and others. Since 2006, our group has developed a technique for synthesizing nearly monodispersed n-alkane containing microcapsules with controllable size and surface porous morphology. We applied an in situ polymerization method, using melamine-formaldehyde resin as shell material and nonionic surfactants as emulsifiers. The solid shell of microcapsules can provide a stable three-dimensional (3-D

  8. "I Treat Him as a Normal Patient": Unveiling the Normalization Coping Strategy Among Formal Caregivers of Persons With Dementia and Its Implications for Person-Centered Care.

    Science.gov (United States)

    Bentwich, Miriam Ethel; Dickman, Nomy; Oberman, Amitai; Bokek-Cohen, Ya'arit

    2017-11-01

    Currently, 47 million people have dementia worldwide, often requiring paid care by formal caregivers. Research regarding family caregivers suggests normalization as a model for coping with negative emotional outcomes in caring for a person with dementia (PWD). The study aims to explore whether a normalization coping mechanism exists among formal caregivers, to reveal differences in its application among caregivers from different cultures, and to examine how this coping mechanism may be related to implementing person-centered care for PWDs. Content analysis of interviews with 20 formal caregivers from three cultural groups (Jews born in Israel [JI], Arabs born in Israel [AI], Russian immigrants [RI]) attending to PWDs. We extracted five normalization modes, revealing that AI caregivers had substantially more utterances of normalization expressions than their colleagues. The normalization modes most commonly expressed by AI caregivers relate to the personhood of PWDs. These normalization modes may enhance formal caregivers' ability to employ person-centered care.

  9. Consumer Health Informatics: The Application of ICT in Improving Patient-Provider Partnership for a Better Health Care.

    Science.gov (United States)

    Abaidoo, Benjamin; Larweh, Benjamin Teye

    2014-01-01

    There is a growing interest concerning the potential of ICT solutions that are customized to consumers. This emerging discipline, referred to as consumer health informatics (CHI), plays a major role in providing information to patients and the public and facilitates the promotion of self-management. The concept of CHI has emerged out of the desire of most patients to shoulder responsibility regarding their health and a growing desire of health practitioners to fully appreciate the potential of the patient. To describe the role of ICT in improving the patient-provider partnership in consumer health informatics. Systematic review of the literature, identification of reference sources, formulation of search strategies, and a manual search regarding the significance of developed CHI applications in healthcare delivery. New consumer health IT applications have been developed to be used on a variety of different platforms, including the Web, messaging systems, PDAs, and cell phones. These applications assist patients with self-management through reminders and prompts, delivery of real-time data on a patient's health condition to patients and providers, web-based communication and personal electronic health information. New tools are being developed for the purpose of providing information to patients and the public, which has enhanced decision making in health matters and provided an avenue for clinicians and consumers to exchange health information for personal and public use. This calls for collaboration among healthcare organizations, governments and the ICT industry to develop new research and IT innovations which are tailored to the health needs of the consumer.

  10. Event-related brain potentials, bilateral electrodermal activity and Mangina-Test performance in learning disabled/ADHD pre-adolescents with severe behavioral disorders as compared to age-matched normal controls.

    Science.gov (United States)

    Mangina, C A; Beuzeron-Mangina, J H; Grizenko, N

    2000-07-01

    The most frequently encountered developmental problems of learning disabilities/ADHD often co-exist with severe behavioral disorders. As a direct consequence, this condition opens the way to delinquency, school drop-out, depression, suicide, substance abuse, work absenteeism, and other psycho-social complications. In this paper, we present a selective overview of our previous research and its clinical applications in this field as it relates to our present research data pertaining to the effects of our original Memory Workload Paradigm on event-related brain potentials in differentiating normal and pathological pre-adolescents (learning disabled/ADHD with concomitant severe behavioral disorders such as oppositional and conduct). In addition, it provides data on the bilateral electrodermal activity during cognitive workload and Mangina-Test performance of pathological and normal pre-adolescents conducted in separate sessions. The results of our present research indicate a significant memory load effect for the P450 latency (F(3,27) = 4.98). The Memory Workload Paradigm in pre-frontal and frontal regions clearly differentiated normal from pathological pre-adolescents (F(1,18) = 12.21). These research findings provide an original and valuable demonstration of an integrative and effective clinical psychophysiological application of central (ERPs), autonomic (bilateral electrodermal activity) and neuro-psychometric aspects (Mangina-Test) which characterize normal and pathological pre-adolescents and underpin the neurophysiological basis of learning disabled/ADHD with severe behavioral disorders as opposed to normal subjects.

  11. Normal estimation for pointcloud using GPU based sparse tensor voting

    OpenAIRE

    Liu , Ming; Pomerleau , François; Colas , Francis; Siegwart , Roland

    2012-01-01

    Normal estimation is the basis for most applications using pointclouds, such as segmentation. However, it is still a challenging problem regarding computational complexity and observation noise. In this paper, we propose a normal estimation method for pointclouds using results from tensor voting. Compared with other approaches, we show it has smaller estimation error. Moreover, by varying the voting kernel size, we find it is a flexible approach for structure extraction...

  12. Comparative Study of Various Normal Mode Analysis Techniques Based on Partial Hessians

    OpenAIRE

    GHYSELS, AN; VAN SPEYBROECK, VERONIQUE; PAUWELS, EWALD; CATAK, SARON; BROOKS, BERNARD R.; VAN NECK, DIMITRI; WAROQUIER, MICHEL

    2010-01-01

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and...
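
    The common core of these partial Hessian methods is easy to sketch: extract the Hessian block for the atoms of interest, mass-weight it, and diagonalize. The following is a schematic illustration of that shared idea, not a faithful rendering of any one reviewed method:

        import numpy as np

        def partial_hessian_modes(hessian, masses, active):
            """Diagonalize the mass-weighted Hessian restricted to the 'active'
            atoms (the environment is effectively frozen in this sketch).
            hessian: (3N, 3N); masses: (N,); active: list of atom indices."""
            idx = np.repeat(np.asarray(active) * 3, 3) + np.tile([0, 1, 2], len(active))
            h_sub = hessian[np.ix_(idx, idx)]
            m = np.repeat(np.asarray(masses)[np.asarray(active)], 3)
            mw = h_sub / np.sqrt(np.outer(m, m))   # mass weighting
            freq2, modes = np.linalg.eigh(mw)      # eigenvalues ~ squared frequencies
            return freq2, modes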

  13. The consequences of non-normality

    International Nuclear Information System (INIS)

    Hip, I.; Lippert, Th.; Neff, H.; Schilling, K.; Schroers, W.

    2002-01-01

    The non-normality of Wilson-type lattice Dirac operators has important consequences - the application of the usual concepts from textbook (hermitian) quantum mechanics should be reconsidered. This includes an appropriate definition of observables and the refinement of computational tools. We show that the truncated singular value expansion is the optimal approximation to the inverse operator D^-1 and we prove that, due to the γ5-hermiticity, it is equivalent to γ5 times the truncated eigenmode expansion of the hermitian Wilson-Dirac operator

  14. 26 CFR 1.613-7 - Application of percentage depletion rates provided in section 613(b) to certain taxable years...

    Science.gov (United States)

    2010-04-01

    26 Internal Revenue 7 (2010-04-01): Application of percentage depletion rates... TAXES (CONTINUED) Natural Resources § 1.613-7 Application of percentage depletion rates provided in... depletion rate specified in section 613 in respect of any mineral property (within the meaning of the 1939...

  15. Effect of normalization on the neutron spectrum adjustment procedure

    International Nuclear Information System (INIS)

    Zsolnay, E.M.; Zijp, W.L.; Nolthenius, H.J.

    1983-10-01

    Various computer programs currently applied for neutron spectrum adjustment based on multifoil activation data use different ways to determine the normalization factor to be applied to an unnormalized input spectrum. We show the influence of the various definitions of the normalization factor on the adjusted results for the ORR and YAYOI spectra considered in the international REAL-80 exercise. The actual expression used to define the normalization factor is more important than previously assumed. The theory of the generalized least-squares principle provides an optimal definition of the normalization factor
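
    As a simplified concrete example of a least-squares-flavored definition (an illustration only, not one of the specific program definitions compared in the paper): if measured foil activities are matched to activities computed from the unnormalized spectrum, the scale factor minimizing the weighted squared residuals has a closed form:

        import numpy as np

        def ls_normalization_factor(measured, computed, variances):
            """Weighted least-squares scale factor c minimizing
            sum_i (measured_i - c * computed_i)**2 / variances_i,
            i.e. c = sum(w*a*m) / sum(w*a*a) with w = 1/variances."""
            w = 1.0 / np.asarray(variances, dtype=float)
            a = np.asarray(computed, dtype=float)
            m = np.asarray(measured, dtype=float)
            return (w * a * m).sum() / (w * a * a).sum()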

  16. Multivariate stochastic simulation with subjective multivariate normal distributions

    Science.gov (United States)

    P. J. Ince; J. Buongiorno

    1991-01-01

    In many applications of Monte Carlo simulation in forestry or forest products, it may be known that some variables are correlated. However, for simplicity, in most simulations it has been assumed that random variables are independently distributed. This report describes an alternative Monte Carlo simulation technique for subjectively assessed multivariate normal...
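
    The standard device for such simulations is to draw independent standard normals and induce the assessed correlations through a Cholesky factor of the covariance matrix; a short sketch with made-up (subjectively assessed) parameters:

        import numpy as np

        rng = np.random.default_rng(42)
        mean = np.array([10.0, 5.0])              # subjectively assessed means
        cov = np.array([[4.0, 2.4],               # variances and covariance encoding
                        [2.4, 9.0]])              # an assessed correlation of 0.4

        # The Cholesky factor turns independent standard normals into correlated draws.
        L = np.linalg.cholesky(cov)
        z = rng.standard_normal((100_000, 2))
        samples = mean + z @ L.T

        print(np.corrcoef(samples, rowvar=False))  # ~0.4 off-diagonal, as assessed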

  17. Application of Normal Family to the Spread Inequality and the Paley ...

    African Journals Online (AJOL)

    In this paper we derive a Paley-type inequality for subharmonic functions of order λ, 0 < λ ≤ ½, and describe the asymptotic behaviour of the extremal functions near Pólya peaks. We also give an alternative proof of the spread inequality using a non-asymptotic method via a normal family of δ-subharmonic functions.

  18. Quaternion normalization in additive EKF for spacecraft attitude determination

    Science.gov (United States)

    Bar-Itzhack, I. Y.; Deutschmann, J.; Markley, F. L.

    1991-01-01

    This work introduces, examines, and compares several quaternion normalization algorithms, which are shown to be an effective stage in the application of the additive extended Kalman filter (EKF) to spacecraft attitude determination based on vector measurements. Two new normalization schemes are introduced. They are compared with one another and with the known brute force normalization scheme, and their efficiency is examined. Simulated satellite data are used to demonstrate the performance of all three schemes. A fourth scheme is suggested for future research. Although the schemes were tested for spacecraft attitude determination, the conclusions are general and hold for the attitude determination of any three-dimensional body when it is based on vector measurements and uses an additive EKF for estimation, with the quaternion specifying the attitude.

  19. Normalized modes at selected points without normalization

    Science.gov (United States)

    Kausel, Eduardo

    2018-04-01

    As every textbook on linear algebra demonstrates, the eigenvectors for the general eigenvalue problem |K - λM| = 0 involving two real, symmetric, positive definite matrices K, M satisfy some well-defined orthogonality conditions. Equally well-known is the fact that those eigenvectors can be normalized so that their modal mass μ = φ^T M φ is unity: it suffices to divide each unscaled mode by the square root of the modal mass. Thus, the normalization is the result of an explicit calculation applied to the modes after they were obtained by some means. However, we show herein that the normalized modes are not merely convenient forms of scaling, but that they are actually intrinsic properties of the pair of matrices K, M; that is, the matrices already "know" about normalization even before the modes have been obtained. This means that we can obtain individual components of the normalized modes directly from the eigenvalue problem, without needing to obtain either all of the modes or, for that matter, any one complete mode. These results are achieved by means of the residue theorem of operational calculus, a finding that is rather remarkable inasmuch as the residues themselves do not make use of any orthogonality conditions or normalization in the first place. It appears that this obscure property connecting the general eigenvalue problem of modal analysis with the residue theorem of operational calculus may have been overlooked up until now, but it has in turn interesting theoretical implications.
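
    For contrast, the conventional route that the paper's residue-based result bypasses takes one call to a generalized symmetric eigensolver, which returns the complete set of mass-normalized modes (toy two-degree-of-freedom numbers, for illustration only):

        import numpy as np
        from scipy.linalg import eigh

        K = np.array([[400.0, -200.0],
                      [-200.0,  200.0]])   # stiffness of a toy 2-DOF chain (made up)
        M = np.diag([2.0, 1.0])            # masses (made up)

        lam, phi = eigh(K, M)              # generalized problem (K - lambda*M) phi = 0
        print(np.round(phi.T @ M @ phi, 12))   # identity: modal masses already unity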

  20. Application of a Brittle Damage Model to Normal Plate-on-Plate Impact

    National Research Council Canada - National Science Library

    Raftenberg, Martin N

    2005-01-01

    A brittle damage model presented by Grinfeld and Wright of the U.S. Army Research Laboratory was implemented in the LS-DYNA finite element code and applied to the simulation of normal plate-on-plate impact...

  1. Providing end-to-end QoS for multimedia applications in 3G wireless networks

    Science.gov (United States)

    Guo, Katherine; Rangarajan, Samapth; Siddiqui, M. A.; Paul, Sanjoy

    2003-11-01

    As the usage of wireless packet data services increases, wireless carriers today are faced with the challenge of offering multimedia applications with QoS requirements within current 3G data networks. End-to-end QoS requires support at the application, network, link and medium access control (MAC) layers. We discuss existing CDMA2000 network architecture and show its shortcomings that prevent supporting multiple classes of traffic at the Radio Access Network (RAN). We then propose changes in RAN within the standards framework that enable support for multiple traffic classes. In addition, we discuss how Session Initiation Protocol (SIP) can be augmented with QoS signaling for supporting end-to-end QoS. We also review state of the art scheduling algorithms at the base station and provide possible extensions to these algorithms to support different classes of traffic as well as different classes of users.

  2. Generalized Polar Decompositions for Closed Operators in Hilbert Spaces and Some Applications

    OpenAIRE

    Gesztesy, Fritz; Malamud, Mark; Mitrea, Marius; Naboko, Serguei

    2008-01-01

    We study generalized polar decompositions of densely defined, closed linear operators in Hilbert spaces and provide some applications to relatively (form) bounded and relatively (form) compact perturbations of self-adjoint, normal, and m-sectorial operators.

  3. Spin-transfer torque magnetoresistive random-access memory technologies for normally off computing (invited)

    International Nuclear Information System (INIS)

    Ando, K.; Yuasa, S.; Fujita, S.; Ito, J.; Yoda, H.; Suzuki, Y.; Nakatani, Y.; Miyazaki, T.

    2014-01-01

    Most parts of present computer systems are made of volatile devices, and the power to supply them to avoid information loss causes huge energy losses. We can eliminate this meaningless energy loss by utilizing the non-volatile function of advanced spin-transfer torque magnetoresistive random-access memory (STT-MRAM) technology and create a new type of computer, i.e., normally off computers. Critical tasks to achieve normally off computers are implementations of STT-MRAM technologies in the main memory and low-level cache memories. STT-MRAM technology for applications to the main memory has been successfully developed by using perpendicular STT-MRAMs, and faster STT-MRAM technologies for applications to the cache memory are now being developed. The present status of STT-MRAMs and challenges that remain for normally off computers are discussed

  4. Clarifying Normalization

    Science.gov (United States)

    Carpenter, Donald A.

    2008-01-01

    Confusion exists among database textbooks as to the goal of normalization as well as to which normal form a designer should aspire. This article discusses such discrepancies with the intention of simplifying normalization for both teacher and student. This author's industry and classroom experiences indicate such simplification yields quicker…

  5. Normalization of energy-dependent gamma survey data.

    Science.gov (United States)

    Whicker, Randy; Chambers, Douglas

    2015-05-01

    Instruments and methods for normalization of energy-dependent gamma radiation survey data to a less energy-dependent basis of measurement are evaluated based on relevant field data collected at 15 different sites across the western United States along with a site in Mongolia. Normalization performance is assessed relative to measurements with a high-pressure ionization chamber (HPIC) due to its "flat" energy response and accurate measurement of the true exposure rate from both cosmic and terrestrial radiation. While the HPIC is analytically ideal for normalization applications, its cost and practicality disadvantages have increased demand for alternatives. Regression analysis on paired measurements between energy-dependent sodium iodide (NaI) scintillation detectors (5-cm by 5-cm crystal dimensions) and the HPIC revealed highly consistent relationships among sites not previously impacted by radiological contamination (natural sites). A resulting generalized data normalization factor based on the average sensitivity of NaI detectors to naturally occurring terrestrial radiation (0.56 nGy h⁻¹ HPIC per nGy h⁻¹ NaI), combined with the calculated site-specific estimate of cosmic radiation, produced reasonably accurate predictions of HPIC readings at natural sites. Normalization against two potential alternative instruments (a tissue-equivalent plastic scintillator and an energy-compensated NaI detector) did not perform better than the sensitivity adjustment approach at natural sites. Each approach produced unreliable estimates of HPIC readings at radiologically impacted sites, though normalization against the plastic scintillator or energy-compensated NaI detector can address incompatibilities between different energy-dependent instruments with respect to estimation of soil radionuclide levels. The appropriate data normalization method depends on the nature of the site, expected duration of the project, survey objectives, and considerations of cost and practicality.
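
    Numerically, the suggested normalization amounts to scaling the NaI reading by the average terrestrial sensitivity factor and adding the site-specific cosmic estimate. A minimal sketch with made-up inputs, assuming the NaI reading reflects terrestrial radiation only:

        def hpic_estimate(nai_terrestrial_ngy_per_h, cosmic_ngy_per_h, factor=0.56):
            """Estimate an HPIC-equivalent exposure rate from an energy-dependent
            NaI reading: scale the terrestrial component by the generalized
            factor and add the site-specific cosmic contribution."""
            return factor * nai_terrestrial_ngy_per_h + cosmic_ngy_per_h

        # made-up readings: 120 nGy/h terrestrial (NaI basis), 35 nGy/h cosmic
        print(hpic_estimate(120.0, 35.0))   # ~102 nGy/h on the HPIC basis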

  6. Increased Classroom Consumption of Home-Provided Fruits and Vegetables for Normal and Overweight Children: Results of the Food Dudes Program in Italy.

    Science.gov (United States)

    Presti, Giovambattista; Cau, Silvia; Oppo, Annalisa; Moderato, Paolo

    2015-01-01

    To increase classroom consumption of home-provided fruits (F) and vegetables (V) in obese, overweight, and normal weight children. Consumption evaluated within and across the baseline phase and the end of the intervention and maintenance phases. Three Italian primary schools. The study involved 672 children (321 male and 329 female) aged 5-11 years. Body mass index measures were available for 461 children. Intervention schools received the Food Dudes (FD) program: 16 days of repeated taste exposure (40 g of F and 40 g of V), video modeling, and rewards-based techniques. The comparison school was only repeatedly exposed to FV. Grams of FV brought from home and eaten. Chi-square, independent t test, repeated-measures ANOVA, and generalized estimating equation model. Intervention schools show a significant increase in home-provided F (P < .001) and V (P < .001) consumption both in overweight and non-overweight children. Approximately half of children in the intervention schools ate at least 1 portion of FV at the end of the intervention and maintenance phases. The increase in home-provided FV intake was similar in overweight and non-overweight children in the FD intervention schools compared with the comparison school. The effect of the FD program was higher at the end of the intervention phase than the end of the maintenance phase. Copyright © 2015 Society for Nutrition Education and Behavior. Published by Elsevier Inc. All rights reserved.

  7. Advancing Normal Birth: Organizations, Goals, and Research

    OpenAIRE

    Hotelling, Barbara A.; Humenick, Sharron S.

    2005-01-01

    In this column, the support for advancing normal birth is summarized, based on a comparison of the goals of Healthy People 2010, Lamaze International, the Coalition for Improving Maternity Services, and the midwifery model of care. Research abstracts are presented to provide evidence that the midwifery model of care safely and economically advances normal birth. Rates of intervention experienced, as reported in the Listening to Mothers survey, are compared to the forms of care recommended by ...

  8. Normal form analysis of linear beam dynamics in a coupled storage ring

    International Nuclear Information System (INIS)

    Wolski, Andrzej; Woodley, Mark D.

    2004-01-01

    The techniques of normal form analysis, well known in the literature, can be used to provide a straightforward characterization of linear betatron dynamics in a coupled lattice. Here, we consider both the beam distribution and the betatron oscillations in a storage ring. We find that the beta functions for uncoupled motion generalize in a simple way to the coupled case. Defined in the way that we propose, the beta functions remain well behaved (positive and finite) under all circumstances, and have essentially the same physical significance for the beam size and betatron oscillation amplitude as in the uncoupled case. Application of this analysis to the online modeling of the PEP-II rings is also discussed

  9. Extravascular transport in normal and tumor tissues.

    Science.gov (United States)

    Jain, R K; Gerlowski, L E

    1986-01-01

    The transport characteristics of the normal and tumor tissue extravascular space provide the basis for the determination of the optimal dosage and schedule regimes of various pharmacological agents in detection and treatment of cancer. In order for the drug to reach the cellular space where most therapeutic action takes place, several transport steps must first occur: (1) tissue perfusion; (2) permeation across the capillary wall; (3) transport through interstitial space; and (4) transport across the cell membrane. Any of these steps, including intracellular events such as metabolism, can be the rate-limiting step to uptake of the drug, and these rate-limiting steps may be different in normal and tumor tissues. This review examines these transport limitations, first from an experimental point of view and then from a modeling point of view. Various types of experimental tumor models which have been used in animals to represent human tumors are discussed. Then, mathematical models of extravascular transport are discussed from the perspective of two approaches: compartmental and distributed. Compartmental models lump one or more sections of a tissue or body into a "compartment" to describe the time course of disposition of a substance. These models contain "effective" parameters which represent the entire compartment. Distributed models consider the structural and morphological aspects of the tissue to determine the transport properties of that tissue. These distributed models describe both the temporal and spatial distribution of a substance in tissues. Each of these modeling techniques is described in detail with applications for cancer detection and treatment in mind.

  10. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klaauw, B.; Koning, R.H.

    2003-01-01

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  11. Testing the normality assumption in the sample selection model with an application to travel demand

    NARCIS (Netherlands)

    van der Klauw, B.; Koning, R.H.

    In this article we introduce a test for the normality assumption in the sample selection model. The test is based on a flexible parametric specification of the density function of the error terms in the model. This specification follows a Hermite series with bivariate normality as a special case.

  12. Gene expression signatures affected by ethanol and/or nicotine in normal human oral keratinocytes (NHOKs)

    Directory of Open Access Journals (Sweden)

    Jeffrey J. Kim

    2014-12-01

    It has been reported that nicotine/alcohol alters epigenetic control and leads to abrogated DNA methylation and histone modifications, which could subsequently perturb transcriptional regulation critically important in cellular transformation. The aim of this study is to determine the molecular mechanisms of nicotine/alcohol-induced epigenetic alterations and their mechanistic roles in transcriptional regulation in human adult stem cells. We hypothesized that nicotine/alcohol induces deregulation of the epigenetic machinery and leads to epigenetic alterations, which subsequently affect transcriptional regulation in oral epithelial stem cells. As an initiating step we have profiled transcriptomic alterations induced by the combinatory administration of EtOH and nicotine in primary normal human oral keratinocytes. Here we provide detailed experimental methods, analysis and information associated with our data deposited in Gene Expression Omnibus (GEO) under GSE57634. Our data provide a comprehensive transcriptomic map describing molecular changes induced by EtOH and nicotine in normal human oral keratinocytes.

  13. Anatomy, normal variants, and basic biomechanics

    International Nuclear Information System (INIS)

    Berquist, T.H.; Johnson, K.A.

    1989-01-01

    This paper reports on the anatomy and basic functions of the foot and ankle important to physicians involved in imaging procedures, clinical medicine, and surgery. New radiographic techniques especially magnetic resonance imaging, provide more diagnostic information owing to improved tissue contrast and the ability to obtain multiple image planes (axial, sagittal, coronal, oblique). Therefore, a thorough knowledge of skeletal and soft tissue anatomy is even more essential. Normal variants must also be understood in order to distinguish normal from pathologic changes in the foot and ankle. A basic understanding of biomechanics is also essential for selecting the proper diagnostic techniques

  14. Volume-preserving normal forms of Hopf-zero singularity

    International Nuclear Information System (INIS)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-01-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals are derived. The symmetry group of the infinite level normal forms are also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto–Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple. (paper)

  15. Volume-preserving normal forms of Hopf-zero singularity

    Science.gov (United States)

    Gazor, Majid; Mokhtari, Fahimeh

    2013-10-01

    A practical method is described for computing the unique generator of the algebra of first integrals associated with a large class of Hopf-zero singularity. The set of all volume-preserving classical normal forms of this singularity is introduced via a Lie algebra description. This is a maximal vector space of classical normal forms with first integral; this is whence our approach works. Systems with a nonzero condition on their quadratic parts are considered. The algebra of all first integrals for any such system has a unique (modulo scalar multiplication) generator. The infinite level volume-preserving parametric normal forms of any nondegenerate perturbation within the Lie algebra of any such system is computed, where it can have rich dynamics. The associated unique generator of the algebra of first integrals are derived. The symmetry group of the infinite level normal forms are also discussed. Some necessary formulas are derived and applied to appropriately modified Rössler and generalized Kuramoto-Sivashinsky equations to demonstrate the applicability of our theoretical results. An approach (introduced by Iooss and Lombardi) is applied to find an optimal truncation for the first level normal forms of these examples with exponentially small remainders. The numerically suggested radius of convergence (for the first integral) associated with a hypernormalization step is discussed for the truncated first level normal forms of the examples. This is achieved by an efficient implementation of the results using Maple.

  16. Bayesian non- and semi-parametric methods and applications

    CERN Document Server

    Rossi, Peter

    2014-01-01

    This book reviews and develops Bayesian non-parametric and semi-parametric methods for applications in microeconometrics and quantitative marketing. Most econometric models used in microeconomics and marketing applications involve arbitrary distributional assumptions. As more data becomes available, a natural desire to provide methods that relax these assumptions arises. Peter Rossi advocates a Bayesian approach in which specific distributional assumptions are replaced with more flexible distributions based on mixtures of normals. The Bayesian approach can use either a large but fixed number

  17. Stochastic Frontier Models with Dependent Errors based on Normal and Exponential Margins

    Directory of Open Access Journals (Sweden)

    Gómez-Déniz, Emilio

    2017-06-01

    Following the recent work of Gómez-Déniz and Pérez-Rodríguez (2014), this paper extends the results obtained there to the normal-exponential distribution with dependence. Accordingly, the main aim of the present paper is to enhance stochastic production frontier and stochastic cost frontier modelling by proposing a bivariate distribution for dependent errors which allows us to nest the classical models. Closed-form expressions for the error term and technical efficiency are provided. An illustration using real data from the econometric literature is provided to show the applicability of the proposed model.

  18. Normal and abnormal growth plate

    International Nuclear Information System (INIS)

    Kumar, R.; Madewell, J.E.; Swischuk, L.E.

    1987-01-01

    Skeletal growth is a dynamic process. A knowledge of the structure and function of the normal growth plate is essential in order to understand the pathophysiology of abnormal skeletal growth in various diseases. In this well-illustrated article, the authors provide a radiographic classification of abnormal growth plates and discuss mechanisms that lead to growth plate abnormalities

  19. On Normalized Compression Distance and Large Malware

    OpenAIRE

    Borbely, Rebecca Schuller

    2015-01-01

    Normalized Compression Distance (NCD) is a popular tool that uses compression algorithms to cluster and classify data in a wide range of applications. Existing discussions of NCD's theoretical merit rely on certain theoretical properties of compression algorithms. However, we demonstrate that many popular compression algorithms don't seem to satisfy these theoretical properties. We explore the relationship between some of these properties and file size, demonstrating that this theoretical pro...
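
    The definition is compact enough to state directly in code; a standard sketch using zlib, with made-up inputs:

        import os
        import zlib

        def ncd(x: bytes, y: bytes) -> float:
            """Normalized Compression Distance:
            NCD(x, y) = (C(xy) - min(C(x), C(y))) / max(C(x), C(y))."""
            cx, cy = len(zlib.compress(x)), len(zlib.compress(y))
            cxy = len(zlib.compress(x + y))
            return (cxy - min(cx, cy)) / max(cx, cy)

        a = b"push ebp; mov ebp, esp; sub esp, 0x40; " * 200  # made-up repetitive input
        b2 = a.replace(b"0x40", b"0x80")                      # a close variant
        r = os.urandom(len(a))                                # incompressible noise
        print(ncd(a, b2))   # small: shared structure compresses away
        print(ncd(a, r))    # near 1: nothing shared

    zlib's 32 KB window is one concrete instance of the problem raised above: once inputs exceed the window, C(xy) no longer reflects content shared across the concatenation boundary, so the theoretical properties degrade as file size grows.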

  20. Normalization of Gravitational Acceleration Models

    Science.gov (United States)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.

    2011-01-01

    Unlike the uniform density spherical shell approximations of Newton, the consequence of spaceflight in the real universe is that gravitational fields are sensitive to the nonsphericity of their generating central bodies. The gravitational potential of a nonspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities which must be removed in order to generalize the method and solve for any possible orbit, including polar orbits. Three unique algorithms have been developed to eliminate these singularities, by Samuel Pines [1], Bill Lear [2], and Robert Gottlieb [3]. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear [2] and Gottlieb [3] algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and associated Legendre functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
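
    As a point of reference for what normalization parameters look like in this setting, the conventional fully normalized factor used with associated Legendre functions in geopotential models is sketched below (a standard geodesy convention, not necessarily the exact parameters the paper defines):

        from math import factorial, sqrt

        def alf_normalization(n: int, m: int) -> float:
            """Fully normalized factor for associated Legendre functions:
            N_nm = sqrt((2 - d_m0) * (2n + 1) * (n - m)! / (n + m)!)."""
            delta_m0 = 1 if m == 0 else 0
            return sqrt((2 - delta_m0) * (2 * n + 1)
                        * factorial(n - m) / factorial(n + m))

        # The enormous dynamic range below is why unnormalized recursions
        # overflow at high degree and order, motivating normalized formulations.
        for n, m in [(2, 0), (2, 2), (30, 30)]:
            print(n, m, alf_normalization(n, m))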

  1. GMPR: A robust normalization method for zero-inflated count data with application to microbiome sequencing data.

    Science.gov (United States)

    Chen, Li; Reeve, James; Zhang, Lujun; Huang, Shengbing; Wang, Xuefeng; Chen, Jun

    2018-01-01

    Normalization is the first critical step in microbiome sequencing data analysis used to account for variable library sizes. Current RNA-Seq based normalization methods that have been adapted for microbiome data fail to consider the unique characteristics of microbiome data, which contain a vast number of zeros due to the physical absence or under-sampling of the microbes. Normalization methods that specifically address the zero-inflation remain largely undeveloped. Here we propose geometric mean of pairwise ratios-a simple but effective normalization method-for zero-inflated sequencing data such as microbiome data. Simulation studies and real datasets analyses demonstrate that the proposed method is more robust than competing methods, leading to more powerful detection of differentially abundant taxa and higher reproducibility of the relative abundances of taxa.
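    A compact numpy sketch of the pairwise-ratio idea as described in the abstract (illustrative only; the published GMPR implementation differs in details such as filtering thresholds):

        import numpy as np

        def gmpr_size_factors(counts: np.ndarray) -> np.ndarray:
            """counts: taxa-by-samples matrix of raw counts.
            Size factor of a sample = geometric mean, over all other samples,
            of the median count ratio computed on taxa nonzero in both."""
            n_samples = counts.shape[1]
            log_sf = np.zeros(n_samples)
            for i in range(n_samples):
                log_ratios = []
                for j in range(n_samples):
                    if i == j:
                        continue
                    shared = (counts[:, i] > 0) & (counts[:, j] > 0)
                    if shared.any():
                        log_ratios.append(np.log(np.median(counts[shared, i] / counts[shared, j])))
                log_sf[i] = np.mean(log_ratios)  # geometric mean taken in log space
            return np.exp(log_sf)

        counts = (np.random.poisson(5, (200, 8)) * (np.random.rand(200, 8) > 0.6)).astype(float)
        normalized = counts / gmpr_size_factors(counts)  # library-size-corrected abundances

    Because each pairwise ratio uses only taxa observed in both samples, the zeros that break standard RNA-Seq size-factor estimators never enter the median.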

  2. Normal modes of weak colloidal gels

    Science.gov (United States)

    Varga, Zsigmond; Swan, James W.

    2018-01-01

    The normal modes and relaxation rates of weak colloidal gels are investigated in calculations using different models of the hydrodynamic interactions between suspended particles. The relaxation spectrum is computed for freely draining, Rotne-Prager-Yamakawa, and accelerated Stokesian dynamics approximations of the hydrodynamic mobility in a normal mode analysis of a harmonic network representing several colloidal gels. We find that the density of states and spatial structure of the normal modes are fundamentally altered by long-ranged hydrodynamic coupling among the particles. Short-ranged coupling due to hydrodynamic lubrication affects only the relaxation rates of short-wavelength modes. Hydrodynamic models accounting for long-ranged coupling exhibit a microscopic relaxation rate λ for each normal mode that scales as l^-2, where l is the spatial correlation length of the normal mode. For the freely draining approximation, which neglects long-ranged coupling, the microscopic relaxation rate scales as l^-γ, where γ varies between three and two with increasing particle volume fraction. A simple phenomenological model of the internal elastic response to normal mode fluctuations is developed, which shows that long-ranged hydrodynamic interactions play a central role in the viscoelasticity of the gel network. Dynamic simulations of hard spheres that gel in response to short-ranged depletion attractions are used to test the applicability of the density of states predictions. For particle concentrations up to 30% by volume, the power law decay of the relaxation modulus in simulations accounting for long-ranged hydrodynamic interactions agrees with predictions generated by the density of states of the corresponding harmonic networks as well as experimental measurements. For higher volume fractions, excluded volume interactions dominate the stress response, and the prediction from the harmonic network density of states fails. Analogous to the Zimm model in polymer

  3. Normalizations of High Taylor Reynolds Number Power Spectra

    Science.gov (United States)

    Puga, Alejandro; Koster, Timothy; Larue, John C.

    2014-11-01

    The velocity power spectrum provides insight in how the turbulent kinetic energy is transferred from larger to smaller scales. Wind tunnel experiments are conducted where high intensity turbulence is generated by means of an active turbulence grid modeled after Makita's 1991 design (Makita, 1991) as implemented by Mydlarski and Warhaft (M&W, 1998). The goal of this study is to document the evolution of the scaling region and assess the relative collapse of several proposed normalizations over a range of Rλ from 185 to 997. As predicted by Kolmogorov (1963), an asymptotic approach of the slope (n) of the inertial subrange to -5/3 with increasing Rλ is observed. There are three velocity power spectrum normalizations, as presented by Kolmogorov (1963), Von Karman and Howarth (1938) and George (1992). Results show that the Von Karman and Howarth normalization does not collapse the velocity power spectrum as well as the Kolmogorov and George normalizations. The Kolmogorov normalization does a good job of collapsing the velocity power spectrum in the normalized high wavenumber range of 0.0002 … . Supported by the University of California, Irvine Research Fund.
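    For reference, the Kolmogorov normalization referred to above collapses the spectrum using the dissipation rate ε and kinematic viscosity ν (standard definitions, not results from this study):

        \eta = \left( \nu^{3} / \varepsilon \right)^{1/4}, \qquad
        \phi(k\eta) = \frac{E(k)}{(\varepsilon \, \nu^{5})^{1/4}}, \qquad
        E(k) \sim \varepsilon^{2/3} k^{-5/3} \ \text{in the inertial subrange}

    Plotting φ against kη should collapse spectra at different Rλ onto one curve wherever Kolmogorov's similarity hypotheses hold.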

  4. Normal form for mirror machine Hamiltonians

    International Nuclear Information System (INIS)

    Dragt, A.J.; Finn, J.M.

    1979-01-01

    A systematic algorithm is developed for performing canonical transformations on Hamiltonians which govern particle motion in magnetic mirror machines. These transformations are performed in such a way that the new Hamiltonian has a particularly simple normal form. From this form it is possible to compute analytic expressions for gyro and bounce frequencies. In addition, it is possible to obtain arbitrarily high order terms in the adiabatic magnetic moment expansion. The algorithm makes use of Lie series, is an extension of Birkhoff's normal form method, and has been explicitly implemented by a digital computer programmed to perform the required algebraic manipulations. Application is made to particle motion in a magnetic dipole field and to a simple mirror system. Bounce frequencies and locations of periodic orbits are obtained and compared with numerical computations. Both mirror systems are shown to be insoluble, i.e., trajectories are not confined to analytic hypersurfaces, there is no analytic third integral of motion, and the adiabatic magnetic moment expansion is divergent. It is expected also that the normal form procedure will prove useful in the study of island structure and separatrices associated with periodic orbits, and should facilitate studies of breakdown of adiabaticity and the onset of ''stochastic'' behavior

  5. Shear-coupled grain-boundary migration dependence on normal strain/stress

    Science.gov (United States)

    Combe, N.; Mompiou, F.; Legros, M.

    2017-08-01

    In specific conditions, grain-boundary (GB) migration occurs in polycrystalline materials as an alternative vector of plasticity compared to the usual dislocation activity. Shear-coupled GB migration, expected to be the most efficient GB-based mechanism, couples the GB motion to an applied shear stress. Stresses on GBs in polycrystalline materials, however, seldom have a pure shear component alone. This work investigates the influence of a normal strain on the shear-coupled migration of a Σ13(320)[001] GB in a copper bicrystal using atomistic simulations. We show that the yield shear stress inducing GB migration depends strongly on the applied normal stress. Beyond this, the application of a normal stress on this GB qualitatively modifies the GB migration: while the Σ13(320)[001] GB shear couples following the 〈110〉 migration mode without normal stress, we report the observation of the 〈010〉 mode under a sufficiently high tensile normal stress. Using the nudged elastic band method, we uncover the atomistic mechanism of this 〈010〉 migration mode and characterize it energetically.

  6. Compressed normalized block difference for object tracking

    Science.gov (United States)

    Gao, Yun; Zhang, Dengzhuo; Cai, Donglan; Zhou, Hao; Lan, Ge

    2018-04-01

    Feature extraction is very important for robust and real-time tracking. Compressive sensing has provided technical support for real-time feature extraction. However, all existing compressive trackers were based on the compressed Haar-like feature, and how to compress other excellent high-dimensional features is worth researching. In this paper, a novel compressed normalized block difference (CNBD) feature is proposed. To resist noise effectively in a high-dimensional normalized pixel difference (NPD) feature, a normalized block difference feature extends the two pixels in the original NPD formula to two blocks. A CNBD feature can then be obtained by compressing a normalized block difference feature based on compressive sensing theory, with a sparse random Gaussian matrix as the measurement matrix. Comparative experiments with 7 trackers on 20 challenging sequences showed that the tracker based on the CNBD feature performs better than other trackers, especially the FCT tracker based on the compressed Haar-like feature, in terms of AUC, SR and precision.
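    The compression step follows the usual compressive-sensing recipe of projecting a high-dimensional feature through a sparse random Gaussian measurement matrix; a generic sketch (not the authors' code, and the dimensions are made up):

        import numpy as np

        rng = np.random.default_rng(0)

        def sparse_gaussian_matrix(m: int, n: int, density: float = 0.1) -> np.ndarray:
            """Random Gaussian measurement matrix with most entries zeroed."""
            mask = rng.random((m, n)) < density
            return np.where(mask, rng.standard_normal((m, n)), 0.0)

        nbd_feature = rng.random(10_000)        # stand-in for a normalized block difference vector
        R = sparse_gaussian_matrix(50, 10_000)  # 50 compressed measurements
        cnbd_feature = R @ nbd_feature          # low-dimensional feature used by the tracker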

  7. Noble Metal Nanoparticles Applications in Cancer

    Directory of Open Access Journals (Sweden)

    João Conde

    2012-01-01

    Full Text Available Nanotechnology has prompted new and improved materials for biomedical applications with particular emphasis in therapy and diagnostics. Special interest has been directed at providing enhanced molecular therapeutics for cancer, where conventional approaches do not effectively differentiate between cancerous and normal cells; that is, they lack specificity. This normally causes systemic toxicity and severe and adverse side effects with concomitant loss of quality of life. Because of their small size, nanoparticles can readily interact with biomolecules both at surface and inside cells, yielding better signals and target specificity for diagnostics and therapeutics. This way, a variety of nanoparticles with the possibility of diversified modification with biomolecules have been investigated for biomedical applications including their use in highly sensitive imaging assays, thermal ablation, and radiotherapy enhancement as well as drug and gene delivery and silencing. Here, we review the available noble metal nanoparticles for cancer therapy, with particular focus on those already being translated into clinical settings.

  8. GMPR: A robust normalization method for zero-inflated count data with application to microbiome sequencing data

    Directory of Open Access Journals (Sweden)

    Li Chen

    2018-04-01

    Full Text Available Normalization is the first critical step in microbiome sequencing data analysis used to account for variable library sizes. Current RNA-Seq based normalization methods that have been adapted for microbiome data fail to consider the unique characteristics of microbiome data, which contain a vast number of zeros due to the physical absence or under-sampling of the microbes. Normalization methods that specifically address the zero-inflation remain largely undeveloped. Here we propose geometric mean of pairwise ratios—a simple but effective normalization method—for zero-inflated sequencing data such as microbiome data. Simulation studies and real datasets analyses demonstrate that the proposed method is more robust than competing methods, leading to more powerful detection of differentially abundant taxa and higher reproducibility of the relative abundances of taxa.

  9. A QSPR STUDY OF NORMAL BOILING POINT OF ORGANIC COMPOUNDS (ALIPHATIC ALKANES USING MOLECULAR DESCRIPTORS

    Directory of Open Access Journals (Sweden)

    B. Souyei

    2013-12-01

    Full Text Available A quantitative structure–property relationship (QSPR) study is carried out to develop correlations that relate the molecular structures of organic compounds (aliphatic alkanes) to their normal boiling point (NBP); two correlations were proposed, based on constitutional and connectivity-index models. The correlations are simple in application with good accuracy, and provide an easy, direct and relatively accurate way to calculate the NBP. The resulting models show remarkable correlations with the constitutional (CON) and connectivity-index (CI) descriptor blocks (R² = 0.950, δ = 0.766 and R² = 0.969, δ = 0.782, respectively).

  10. Normalized difference vegetation index (NDVI) variation among cultivars and environments

    Science.gov (United States)

    Although Nitrogen (N) is an essential nutrient for crop production, large preplant applications of fertilizer N can result in off-field loss that causes environmental concerns. Canopy reflectance is being investigated for use in variable rate (VR) N management. Normalized difference vegetation index...
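    For context, NDVI is conventionally computed from near-infrared and red reflectance (the standard definition, not a formula from this record):

        import numpy as np

        def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
            """Normalized difference vegetation index, in [-1, 1]."""
            return (nir - red) / (nir + red + 1e-12)  # epsilon guards against division by zero

    Dense, healthy canopies reflect strongly in the near-infrared and absorb red light, so higher NDVI values indicate more vigorous vegetation.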

  11. New method for computing ideal MHD normal modes in axisymmetric toroidal geometry

    International Nuclear Information System (INIS)

    Wysocki, F.; Grimm, R.C.

    1984-11-01

    Analytic elimination of the two magnetic surface components of the displacement vector permits the normal mode ideal MHD equations to be reduced to a scalar form. A Galerkin procedure, similar to that used in the PEST codes, is implemented to determine the normal modes computationally. The method retains the efficient stability capabilities of the PEST 2 energy principle code, while allowing computation of the normal mode frequencies and eigenfunctions, if desired. The procedure is illustrated by comparison with earlier versions of PEST and by application to tilting modes in spheromaks, and to stable discrete Alfven waves in tokamak geometry

  12. First-In-Class Small Molecule ONC201 Induces DR5 and Cell Death in Tumor but Not Normal Cells to Provide a Wide Therapeutic Index as an Anti-Cancer Agent.

    Science.gov (United States)

    Allen, Joshua E; Crowder, Roslyn N; El-Deiry, Wafik S

    2015-01-01

    We previously identified ONC201 (TIC10) as a first-in-class orally active small molecule with robust antitumor activity that is currently in clinical trials in advanced cancers. Here, we further investigate the safety characteristics of ONC201 in preclinical models that reveal an excellent safety profile at doses that exceed efficacious doses by 10-fold. In vitro studies indicated a strikingly different dose-response relationship when comparing tumor and normal cells where maximal effects are much stronger in tumor cells than in normal cells. In further support of a wide therapeutic index, investigation of tumor and normal cell responses under identical conditions demonstrated large apoptotic effects in tumor cells and modest anti-proliferative effects in normal cells that were non-apoptotic and reversible. Probing the underlying mechanism of apoptosis indicated that ONC201 does not induce DR5 in normal cells under conditions that induce DR5 in tumor cells; DR5 is a pro-apoptotic TRAIL receptor previously linked to the anti-tumor mechanism of ONC201. GLP toxicology studies in Sprague-Dawley rats and beagle dogs at therapeutic and exaggerated doses revealed no dose-limiting toxicities. Observations in both species at the highest doses were mild and reversible at doses above 10-fold the expected therapeutic dose. The no observed adverse event level (NOAEL) was ≥42 mg/kg in dogs and ≥125 mg/kg in rats, which both correspond to a human dose of approximately 1.25 g assuming standard allometric scaling. These results provided the rationale for the 125 mg starting dose in dose escalation clinical trials that began in 2015 in patients with advanced cancer.

  13. First-In-Class Small Molecule ONC201 Induces DR5 and Cell Death in Tumor but Not Normal Cells to Provide a Wide Therapeutic Index as an Anti-Cancer Agent.

    Directory of Open Access Journals (Sweden)

    Joshua E Allen

    Full Text Available We previously identified ONC201 (TIC10) as a first-in-class orally active small molecule with robust antitumor activity that is currently in clinical trials in advanced cancers. Here, we further investigate the safety characteristics of ONC201 in preclinical models that reveal an excellent safety profile at doses that exceed efficacious doses by 10-fold. In vitro studies indicated a strikingly different dose-response relationship when comparing tumor and normal cells where maximal effects are much stronger in tumor cells than in normal cells. In further support of a wide therapeutic index, investigation of tumor and normal cell responses under identical conditions demonstrated large apoptotic effects in tumor cells and modest anti-proliferative effects in normal cells that were non-apoptotic and reversible. Probing the underlying mechanism of apoptosis indicated that ONC201 does not induce DR5 in normal cells under conditions that induce DR5 in tumor cells; DR5 is a pro-apoptotic TRAIL receptor previously linked to the anti-tumor mechanism of ONC201. GLP toxicology studies in Sprague-Dawley rats and beagle dogs at therapeutic and exaggerated doses revealed no dose-limiting toxicities. Observations in both species at the highest doses were mild and reversible at doses above 10-fold the expected therapeutic dose. The no observed adverse event level (NOAEL) was ≥42 mg/kg in dogs and ≥125 mg/kg in rats, which both correspond to a human dose of approximately 1.25 g assuming standard allometric scaling. These results provided the rationale for the 125 mg starting dose in dose escalation clinical trials that began in 2015 in patients with advanced cancer.

  14. Normal form and synchronization of strict-feedback chaotic systems

    International Nuclear Information System (INIS)

    Wang, Feng; Chen, Shihua; Yu Minghai; Wang Changping

    2004-01-01

    This study concerns the normal form and synchronization of strict-feedback chaotic systems. We prove that any strict-feedback chaotic system can be rendered into a normal form with an invertible transform, and then a design procedure to synchronize the normal form of a non-autonomous strict-feedback chaotic system is presented. This approach needs only a scalar driving signal to realize synchronization, no matter how many dimensions the chaotic system contains. Furthermore, the Roessler chaotic system is taken as a concrete example to illustrate the design procedure without transforming a strict-feedback chaotic system into its normal form. Numerical simulations are also provided to show the effectiveness and feasibility of the developed methods

  15. Modal analysis of inter-area oscillations using the theory of normal modes

    Energy Technology Data Exchange (ETDEWEB)

    Betancourt, R.J. [School of Electromechanical Engineering, University of Colima, Manzanillo, Col. 28860 (Mexico); Barocio, E. [CUCEI, University of Guadalajara, Guadalajara, Jal. 44480 (Mexico); Messina, A.R. [Graduate Program in Electrical Engineering, Cinvestav, Guadalajara, Jal. 45015 (Mexico); Martinez, I. [State Autonomous University of Mexico, Toluca, Edo. Mex. 50110 (Mexico)

    2009-04-15

    Based on the notion of normal modes in mechanical systems, a method is proposed for the analysis and characterization of oscillatory processes in power systems. The method is based on the property of invariance of modal subspaces and can be used to represent complex power system modal behavior by a set of decoupled, two-degree-of-freedom nonlinear oscillator equations. Using techniques from nonlinear mechanics, a new approach is outlined for determining the normal modes (NMs) of motion of a general n-degree-of-freedom nonlinear system. Equations relating the normal modes and the physical velocities and displacements are developed from the linearized system model, and numerical issues associated with the application of the technique are discussed. In addition to qualitative insight, this method can be utilized in the study of nonlinear behavior and bifurcation analyses. The application of these procedures is illustrated on a planning model of the Mexican interconnected system using a quadratic nonlinear model. Specifically, the use of normal mode analysis as a basis for identifying modal parameters, including natural frequencies and damping ratios of general linear systems with n degrees of freedom, is discussed. Comparisons to conventional linear analysis techniques demonstrate the ability of the proposed technique to extract the different oscillation modes embedded in the oscillation. (author)

  16. Fusion and normalization to enhance anomaly detection

    Science.gov (United States)

    Mayer, R.; Atkinson, G.; Antoniades, J.; Baumback, M.; Chester, D.; Edwards, J.; Goldstein, A.; Haas, D.; Henderson, S.; Liu, L.

    2009-05-01

    This study examines normalizing the imagery and the optimization metrics to enhance anomaly and change detection, respectively. The RX algorithm, the standard anomaly detector for hyperspectral imagery, more successfully extracts bright rather than dark man-made objects when applied to visible hyperspectral imagery. Normalizing the imagery prior to applying the anomaly detector can help detect some of the problematic dark objects, but can also miss some bright objects. This study therefore jointly fuses the outputs of RX applied to normalized and unnormalized imagery into a single decision surface. The technique was tested using imagery of commercial vehicles in an urban environment, gathered by a hyperspectral visible/near-IR sensor mounted on an airborne platform. Combining detections first requires converting the detector output to a target probability. The observed anomaly detections were fitted with a linear combination of chi-square distributions, and these weights were used to help compute the target probability. Receiver Operator Characteristic (ROC) curves quantitatively assessed the target detection performance. The target detection performance is highly variable, depending on the relative numbers of candidate bright and dark targets and false alarms, and was controlled in this study by using vegetation and street line masks. The joint Boolean OR and AND operations also generate variable performance depending on the scene. The joint SUM operation provides a reasonable compromise between the OR and AND operations and has good target detection performance. In addition, transforms based on the normalized correlation coefficient and least squares yield methods related to canonical correlation analysis (CCA) and normalized image regression (NIR). Transforms based on CCA and NIR performed better than the standard approaches. Only RX detection on the unnormalized difference imagery provides adequate change detection performance.
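    The Boolean and SUM fusions of the per-pixel target probabilities mentioned above have simple generic forms (a sketch of the standard combination rules, not the study's exact implementation):

        def fuse_or(p1: float, p2: float) -> float:
            """Fires if either detector fires (assumes independent detectors)."""
            return 1.0 - (1.0 - p1) * (1.0 - p2)

        def fuse_and(p1: float, p2: float) -> float:
            """Fires only if both detectors fire."""
            return p1 * p2

        def fuse_sum(p1: float, p2: float) -> float:
            """Average of the two probabilities: a compromise between OR and AND."""
            return 0.5 * (p1 + p2)

    OR favors recall (more detections, more false alarms), AND favors precision, and SUM sits between the two, which matches the compromise behavior reported above.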

  17. Design of a normal incidence multilayer imaging X-ray microscope

    Science.gov (United States)

    Shealy, David L.; Gabardi, David R.; Hoover, Richard B.; Walker, Arthur B. C., Jr.; Lindblom, Joakim F.

    Normal incidence multilayer Cassegrain X-ray telescopes were flown on the Stanford/MSFC Rocket X-ray Spectroheliograph. These instruments produced high spatial resolution images of the sun and conclusively demonstrated that doubly reflecting multilayer X-ray optical systems are feasible. The images indicated that aplanatic imaging soft X-ray/EUV microscopes should be achievable using multilayer optics technology. A doubly reflecting normal incidence multilayer imaging X-ray microscope based on the Schwarzschild configuration has been designed. The design of the microscope and the results of the optical system ray trace analysis are discussed. High resolution aplanatic imaging X-ray microscopes using normal incidence multilayer X-ray mirrors should have many important applications in advanced X-ray astronomical instrumentation, X-ray lithography, biological, biomedical, metallurgical, and laser fusion research.

  18. The Effects of a Normal Rate versus a Slow Intervalled Rate of Oral Nutrient Intake and Intravenous Low Rate Macronutrient Application on Psychophysical Function – Two Pilot Studies

    Directory of Open Access Journals (Sweden)

    Melanie Y. Denzer-Lippmann

    2017-06-01

    Full Text Available Stomach distension and energy per time are factors influencing satiety. Moreover, different rates of nutrient intake induce different stomach distension. The goal of our studies was to elucidate the influence of different oral rates of nutrient intake (normal rate versus slow intervalled rate; study I) and intravenous low-rate macronutrient application (protein, carbohydrate, fat or placebo; study II) on psychophysical function. The pilot studies investigated the effects of (1) study I: a mixed nutrient solution (1/3 protein, 1/3 fat, 1/3 carbohydrates) and (2) study II: intravenous macronutrient infusions (protein, carbohydrate, fat or placebo) on psychophysical function (mood, hunger, food craving, alertness, smell intensity ratings and hedonic ratings) in human subjects. In study I, 10 male subjects (age range: 21–30 years) completed the study protocol, participating in both test conditions, and in study II, 20 male subjects (age range: 19–41 years) completed the study protocol, participating in all test conditions. Additionally, metabolic function was analyzed, and cognitive and olfactory tests were conducted twice, starting 100 min before the beginning of the intervention and 240 min after. Psychophysical (mood, hunger, fat-, protein-, carbohydrate-, sweets- and vegetable-craving, alertness) and metabolic function tests were performed seven times on each examination day. Greater effects on hunger and food cravings were observed for the normal rate of intake than for the slow intervalled rate of intake and intravenous low-rate macronutrient application. Our findings potentially confirm that the volume of food ingested and a higher rate of energy per time contribute to satiety during normal-rate food intake, while slow intervalled food intake and intravenous low-rate macronutrient application showed no effects on satiation. Our results motivate the view that a certain volume of ingested food and a certain energy per time ratio are necessary

  19. Software Application Profile: RVPedigree: a suite of family-based rare variant association tests for normally and non-normally distributed quantitative traits.

    Science.gov (United States)

    Oualkacha, Karim; Lakhal-Chaieb, Lajmi; Greenwood, Celia Mt

    2016-04-01

    RVPedigree (Rare Variant association tests in Pedigrees) implements a suite of programs facilitating genome-wide analysis of association between a quantitative trait and autosomal region-based genetic variation. The main features here are the ability to appropriately test for association of rare variants with non-normally distributed quantitative traits, and also to appropriately adjust for related individuals, either from families or from population structure and cryptic relatedness. RVPedigree is available as an R package. The package includes calculation of kinship matrices, various options for coping with non-normality, three different ways of estimating statistical significance incorporating triaging to enable efficient use of the most computationally-intensive calculations, and a parallelization option for genome-wide analysis. The software is available from the Comprehensive R Archive Network [CRAN.R-project.org] under the name 'RVPedigree' and at [https://github.com/GreenwoodLab]. It has been published under General Public License (GPL) version 3 or newer. © The Author 2016; all rights reserved. Published by Oxford University Press on behalf of the International Epidemiological Association.

  20. Color Doppler Echocardiographic Assessment of Valvular Regurgitation in Normal Infants

    Directory of Open Access Journals (Sweden)

    Shu-Ting Lee

    2010-01-01

    Conclusion: The prevalence of inaudible valvular regurgitation is high in infants with structurally normal hearts. Multiple-valve involvement with regurgitation is not uncommon. Mild severity and low velocity on color Doppler, and the structural information provided by 2D imaging strongly suggest that these regurgitant flows are physiologically normal in infancy.

  1. Normalization: A Preprocessing Stage

    OpenAIRE

    Patro, S. Gopal Krishna; Sahu, Kishore Kumar

    2015-01-01

    Normalization is a preprocessing stage for many types of problem. It plays an especially important role in fields such as soft computing and cloud computing, where the range of data is scaled down or up before being used in further stages. Among the many normalization techniques are Min-Max normalization, Z-score normalization and decimal scaling normalization. By referring to these normalization techniques we are ...
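    The three techniques named here have standard closed forms; a minimal numpy sketch:

        import numpy as np

        def min_max(x: np.ndarray, lo: float = 0.0, hi: float = 1.0) -> np.ndarray:
            """Min-Max normalization: linearly rescale values into [lo, hi]."""
            return lo + (x - x.min()) * (hi - lo) / (x.max() - x.min())

        def z_score(x: np.ndarray) -> np.ndarray:
            """Z-score normalization: zero mean, unit standard deviation."""
            return (x - x.mean()) / x.std()

        def decimal_scaling(x: np.ndarray) -> np.ndarray:
            """Decimal scaling: divide by the power of 10 that brings |x| below 1."""
            j = int(np.ceil(np.log10(np.abs(x).max())))
            return x / 10.0 ** j

        data = np.array([120.0, 45.0, 990.0, 3.0])
        print(min_max(data))
        print(z_score(data))
        print(decimal_scaling(data))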

  2. Oral Topical Doxepin Rinse: Anesthetic Effect in Normal Subjects

    Directory of Open Access Journals (Sweden)

    Joel B Epstein

    2003-01-01

    Full Text Available Oral doxepin rinse has been reported to provide pain relief in patients with oral mucosal lesions due to cancer or cancer therapy. The purpose of this study was to assess the anesthetic effect of doxepin oral rinse in normal subjects, to identify the duration of effect and to contrast the anesthetic effect with reported pain relief in patients with oral mucosal lesions. Normal volunteers were provided a solution of doxepin (5 mg/mL) for oral rinsing. Oral numbness and adverse effects were recorded for a period of 4 h after rinsing. Doxepin rinse resulted in mucosal anesthesia in all subjects. Sedation/fatigue was reported in four of seven subjects. There were no taste complaints and no nausea reported. The limited duration of numbness/anesthesia in normal subjects, compared with prior studies showing pain relief for more than 3 h in patients with mucosal lesions, suggests that the extended duration of pain relief in patients was due to analgesic effects rather than anesthetic effects. The majority of normal subjects reported sedation after use, but this was less common in patients with mucosal lesions.

  3. Mast cell distribution in normal adult skin

    NARCIS (Netherlands)

    A.S. Janssens (Artiena Soe); R. Heide (Rogier); J.C. den Hollander (Jan); P.G.M. Mulder (P. G M); B. Tank (Bhupendra); A.P. Oranje (Arnold)

    2005-01-01

    AIMS: To investigate mast cell distribution in normal adult skin to provide a reference range for comparison with mastocytosis. METHODS: Mast cells (MCs) were counted in uninvolved skin adjacent to basal cell carcinomas and other dermatological disorders in adults.

  4. Discrimination of bladder cancer cells from normal urothelial cells with high specificity and sensitivity: combined application of atomic force microscopy and modulated Raman spectroscopy.

    Science.gov (United States)

    Canetta, Elisabetta; Riches, Andrew; Borger, Eva; Herrington, Simon; Dholakia, Kishan; Adya, Ashok K

    2014-05-01

    Atomic force microscopy (AFM) and modulated Raman spectroscopy (MRS) were used to discriminate between living normal human urothelial cells (SV-HUC-1) and bladder tumour cells (MGH-U1) with high specificity and sensitivity. MGH-U1 cells were 1.5-fold smaller, 1.7-fold thicker and 1.4-fold rougher than normal SV-HUC-1 cells. The adhesion energy was 2.6-fold higher in the MGH-U1 cells compared to normal SV-HUC-1 cells, which possibly indicates that bladder tumour cells are more deformable than normal cells. The elastic modulus of MGH-U1 cells was 12-fold lower than SV-HUC-1 cells, suggesting a higher elasticity of the bladder cancer cell membranes. The biochemical fingerprints of cancer cells displayed a higher DNA and lipid content, probably due to an increase in the nuclear to cytoplasm ratio. Normal cells were characterized by higher protein contents. AFM studies revealed a decrease in the lateral dimensions and an increase in thickness of cancer cells compared to normal cells; these studies authenticate the observations from MRS. Nanostructural, nanomechanical and biochemical profiles of bladder cells provide qualitative and quantitative markers to differentiate between normal and cancerous cells at the single cellular level. AFM and MRS allow discrimination between adhesion energy, elasticity and Raman spectra of SV-HUC-1 and MGH-U1 cells with high specificity (83, 98 and 95%) and sensitivity (97, 93 and 98%). Such single-cell-level studies could have a pivotal impact on the development of AFM-Raman combined methodologies for cancer profiling and screening with translational significance. Copyright © 2014 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.

  5. Mild desalination demo pilot: New normalization approach to effectively evaluate electrodialysis reversal technology

    Directory of Open Access Journals (Sweden)

    Roel Bisselink

    2016-06-01

    Full Text Available Key performance indicators for characterizing nanofiltration performance are well developed; similar key performance indicators for electrodialysis reversal, however, are underdeveloped. Under the E4Water project, Dow Benelux BV and Evides Industriewater BV operate a pilot facility to compare both technologies for their application to mildly desalinate a variety of brackish water streams. Normalized pressure drop, normalized current efficiency and normalized membrane resistance proved to be useful tools to interpret process performance and to initiate a cleaning procedure if required. The availability of these normalized key performance indicators enables optimization, process monitoring and control of electrodialysis reversal independent of the continuously changing feed water conditions.

  6. Magnetic resonance imaging of the normal placenta

    International Nuclear Information System (INIS)

    Blaicher, Wibke; Brugger, Peter C.; Mittermayer, Christoph; Schwindt, Jens; Deutinger, Josef; Bernaschek, Gerhard; Prayer, Daniela

    2006-01-01

    The goal of this study was to provide a representative description of the normal placenta with contrast medium-free magnetic resonance imaging (MRI) in order to determine a standard of reference. One hundred consecutive singleton pregnancies were investigated by MRI without application of a contrast medium. The mean gestational age (GA) at the time of investigation was 29.5 weeks (range 19-40). Patients with suspected utero-placental insufficiency (UPI) or placental anomalies were excluded. Signal intensities were assessed and correlated with the respective GA. Antenatal MRI without contrast medium was able to depict placental status and morphological changes during gestation. A regular homogeneous structure was found in weeks 19-23. Subsequently, sporadic, slightly marked lobules appeared, which increased in number and markedness with ongoing gestation. Stratification of the lobules was observed after 36 weeks. The ratio of placental and amniotic fluid signal intensities decreased significantly with higher GA and with placental grading. MRI is well suited as an imaging method for the placenta. Our data may be used as a reference in the assessment of the placenta on MRI, and may have further clinical impact with respect to the determination of UPI

  7. Magnetic resonance imaging of the normal placenta

    Energy Technology Data Exchange (ETDEWEB)

    Blaicher, Wibke [Department of Gynecology and Obstetrics, University Hospital Vienna (Austria)]. E-mail: wibke.blaicher@meduniwien.ac.at; Brugger, Peter C. [Center of Anatomy and Cell Biology, University Hospital of Vienna (Austria); Mittermayer, Christoph [Department of Pediatrics, Division of Neonatology and Intensive Care, University Hospital of Vienna (Austria); Schwindt, Jens [Department of Pediatrics, Division of Neonatology and Intensive Care, University Hospital of Vienna (Austria); Deutinger, Josef [Department of Gynecology and Obstetrics, University Hospital Vienna (Austria); Bernaschek, Gerhard [Department of Gynecology and Obstetrics, University Hospital Vienna (Austria); Prayer, Daniela [Department of Radiology, Division of Neuroradiology, University Hospital of Vienna (Austria)

    2006-02-15

    The goal of this study was to provide a representative description of the normal placenta with contrast medium-free magnetic resonance imaging (MRI) in order to determine a standard of reference. One hundred consecutive singleton pregnancies were investigated by MRI without application of a contrast medium. The mean gestational age (GA) at the time of investigation was 29.5 weeks (range 19-40). Patients with suspected utero-placental insufficiency (UPI) or placental anomalies were excluded. Signal intensities were assessed and correlated with the respective GA. Antenatal MRI without contrast medium was able to depict placental status and morphological changes during gestation. A regular homogeneous structure was found in weeks 19-23. Subsequently, sporadic, slightly marked lobules appeared, which increased in number and markedness with ongoing gestation. Stratification of the lobules was observed after 36 weeks. The ratio of placental and amniotic fluid signal intensities decreased significantly with higher GA and with placental grading. MRI is well suited as an imaging method for the placenta. Our data may be used as a reference in the assessment of the placenta on MRI, and may have further clinical impact with respect to the determination of UPI.

  8. Diagnostic imaging features of normal anal sacs in dogs and cats.

    Science.gov (United States)

    Jung, Yechan; Jeong, Eunseok; Park, Sangjun; Jeong, Jimo; Choi, Ul Soo; Kim, Min-Su; Kim, Namsoo; Lee, Kichang

    2016-09-30

    This study was conducted to provide normal reference features for canine and feline anal sacs using ultrasound, low-field magnetic resonance imaging (MRI) and radiographic contrast studies as diagnostic imaging tools. A total of ten clinically normal beagle dogs and eight clinically normal cats were included. General radiography with contrast, ultrasonography and low-field MRI scans were performed. Visualization of the anal sacs, which are located at distinct sites in dogs and cats, is possible with a radiographic contrast study. Most surfaces of the anal sac tissue, occasionally appearing as a hyperechoic thin line, were surrounded by the hypoechoic external sphincter muscle on ultrasonography. The normal anal sac contents of dogs and cats had variable echogenicity. Signals of anal sac contents on low-field MRI varied in cats and dogs, and contrast medium on T1-weighted images enhanced the anal sac walls more obviously than ultrasonography did. In conclusion, this study provides the normal features of anal sacs from dogs and cats on diagnostic imaging. Further studies are expected to investigate anal sac features in disease conditions.

  9. Making nuclear 'normal'

    International Nuclear Information System (INIS)

    Haehlen, Peter; Elmiger, Bruno

    2000-01-01

    The mechanics of the Swiss NPPs' 'come and see' programme 1995-1999 were illustrated in our contributions to all PIME workshops since 1996. Now, after four annual 'waves', all the country has been covered by the NPPs' invitation to dialogue. This makes PIME 2000 the right time to shed some light on one particular objective of this initiative: making nuclear 'normal'. The principal aim of the 'come and see' programme, namely to give the Swiss NPPs 'a voice of their own' by the end of the nuclear moratorium 1990-2000, has clearly been attained and was commented on during earlier PIMEs. It is, however, equally important that Swiss nuclear energy not only made progress in terms of public 'presence', but also in terms of being perceived as a normal part of industry, as a normal branch of the economy. The message that Swiss nuclear energy is nothing but a normal business involving normal people, was stressed by several components of the multi-prong campaign: - The speakers in the TV ads were real - 'normal' - visitors' guides and not actors; - The testimonials in the print ads were all real NPP visitors - 'normal' people - and not models; - The mailings inviting a very large number of associations to 'come and see' activated a typical channel of 'normal' Swiss social life; - Spending money on ads (a new activity for Swiss NPPs) appears to have resulted in being perceived by the media as a normal branch of the economy. Today we feel that the 'normality' message has well been received by the media. In the controversy dealing with antinuclear arguments brought forward by environmental organisations journalists nowadays as a rule give nuclear energy a voice - a normal right to be heard. As in a 'normal' controversy, the media again actively ask themselves questions about specific antinuclear claims, much more than before 1990 when the moratorium started. The result is that in many cases such arguments are discarded by journalists, because they are, e.g., found to be

  10. CUILESS2016: a clinical corpus applying compositional normalization of text mentions.

    Science.gov (United States)

    Osborne, John D; Neu, Matthew B; Danila, Maria I; Solorio, Thamar; Bethard, Steven J

    2018-01-10

    Traditionally text mention normalization corpora have normalized concepts to single ontology identifiers ("pre-coordinated concepts"). Less frequently, normalization corpora have used concepts with multiple identifiers ("post-coordinated concepts") but the additional identifiers have been restricted to a defined set of relationships to the core concept. This approach limits the ability of the normalization process to express semantic meaning. We generated a freely available corpus using post-coordinated concepts without a defined set of relationships that we term "compositional concepts" to evaluate their use in clinical text. We annotated 5397 disorder mentions from the ShARe corpus to SNOMED CT that were previously normalized as "CUI-less" in the "SemEval-2015 Task 14" shared task because they lacked a pre-coordinated mapping. Unlike the previous normalization method, we do not restrict concept mappings to a particular set of the Unified Medical Language System (UMLS) semantic types and allow normalization to occur to multiple UMLS Concept Unique Identifiers (CUIs). We computed annotator agreement and assessed semantic coverage with this method. We generated the largest clinical text normalization corpus to date with mappings to multiple identifiers and made it freely available. All but 8 of the 5397 disorder mentions were normalized using this methodology. Annotator agreement ranged from 52.4% using the strictest metric (exact matching) to 78.2% using a hierarchical agreement that measures the overlap of shared ancestral nodes. Our results provide evidence that compositional concepts can increase semantic coverage in clinical text. To our knowledge we provide the first freely available corpus of compositional concept annotation in clinical text.

  11. Software architecture for time-constrained machine vision applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2013-01-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility, because they are normally oriented toward particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty for reuse, and inefficient execution on multicore processors. We present a novel software architecture for time-constrained machine vision applications that addresses these issues. The architecture is divided into three layers. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message-passing interface based on a dynamic publish/subscribe pattern. A topic-based filtering in which messages are published to topics is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for machine vision applications. These modules, which include acquisition, visualization, communication, user interface, and data processing, take advantage of the power of well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, the proposed architecture is applied to a real machine vision application: a jam detector for steel pickling lines.
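    The dynamic publish/subscribe pattern with topic-based filtering described above reduces, in miniature, to a bus that routes messages only to handlers registered for the matching topic. A toy Python sketch of the idea (illustrative only; the actual architecture targets native machine vision pipelines and adds threading, priorities and module lifecycles):

        from collections import defaultdict
        from typing import Any, Callable

        class TopicBus:
            """Minimal topic-based publish/subscribe message bus."""

            def __init__(self) -> None:
                self._handlers: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

            def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
                self._handlers[topic].append(handler)

            def publish(self, topic: str, message: Any) -> None:
                # Topic-based filtering: only subscribers of this topic see the message.
                for handler in self._handlers[topic]:
                    handler(message)

        bus = TopicBus()
        bus.subscribe("frames/acquired", lambda m: print("processing frame", m["frame_id"]))
        bus.publish("frames/acquired", {"frame_id": 42})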

  12. Attention and normalization circuits in macaque V1

    Science.gov (United States)

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-01-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate to what extent V1 neurons are affected by normalization, which was compared against effects of spatial top down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. PMID:25757941
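    Normalization models of the kind tested here generally take a divisive form, the open question being whether the attentional gain A enters the suppressive pool in the denominator (a schematic, generic form, not the fitted model from the paper):

        R_i = \frac{\gamma \, A_i E_i}{\sigma + \sum_j A_j E_j}

    Here E is the stimulus drive, σ the semisaturation constant, and γ an output gain; the models favored by the paper's model selection instead leave A out of the denominator sum, so attention scales the response without contributing to normalization.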

  13. Tumor vessel normalization after aerobic exercise enhances chemotherapeutic efficacy.

    Science.gov (United States)

    Schadler, Keri L; Thomas, Nicholas J; Galie, Peter A; Bhang, Dong Ha; Roby, Kerry C; Addai, Prince; Till, Jacob E; Sturgeon, Kathleen; Zaslavsky, Alexander; Chen, Christopher S; Ryeom, Sandra

    2016-10-04

    Targeted therapies aimed at tumor vasculature are utilized in combination with chemotherapy to improve drug delivery and efficacy after tumor vascular normalization. Tumor vessels are highly disorganized with disrupted blood flow impeding drug delivery to cancer cells. Although pharmacologic anti-angiogenic therapy can remodel and normalize tumor vessels, there is a limited window of efficacy and these drugs are associated with severe side effects necessitating alternatives for vascular normalization. Recently, moderate aerobic exercise has been shown to induce vascular normalization in mouse models. Here, we provide a mechanistic explanation for the tumor vascular normalization induced by exercise. Shear stress, the mechanical stimuli exerted on endothelial cells by blood flow, modulates vascular integrity. Increasing vascular shear stress through aerobic exercise can alter and remodel blood vessels in normal tissues. Our data in mouse models indicate that activation of calcineurin-NFAT-TSP1 signaling in endothelial cells plays a critical role in exercise-induced shear stress mediated tumor vessel remodeling. We show that moderate aerobic exercise with chemotherapy caused a significantly greater decrease in tumor growth than chemotherapy alone through improved chemotherapy delivery after tumor vascular normalization. Our work suggests that the vascular normalizing effects of aerobic exercise can be an effective chemotherapy adjuvant.

  14. Degree and duration of corneal anesthesia after topical application of 0.4% oxybuprocaine hydrochloride ophthalmic solution in ophthalmically normal dogs.

    Science.gov (United States)

    Douet, Jean-Yves; Michel, Julien; Regnier, Alain

    2013-10-01

    To assess the anesthetic efficacy and local tolerance of topically applied 0.4% oxybuprocaine ophthalmic solution in dogs and to compare its effects with those of 1% tetracaine solution. 34 ophthalmically normal Beagles. Dogs were assigned to 2 groups, and baseline corneal touch threshold (CTT) was measured bilaterally with a Cochet-Bonnet aesthesiometer. Dogs of group 1 (n = 22) received a single drop of 0.4% oxybuprocaine ophthalmic solution in one eye and saline (0.9% NaCl) solution (control treatment) in the contralateral eye. Dogs of group 2 (n = 12) received a single drop of 0.4% oxybuprocaine ophthalmic solution in one eye and 1% tetracaine ophthalmic solution in the contralateral eye. The CTT of each eye was measured 1 and 5 minutes after topical application and then at 5-minute intervals until 75 minutes after topical application. CTT changes over time differed significantly between oxybuprocaine-treated and control eyes. After instillation of oxybuprocaine, maximal corneal anesthesia (CTT = 0) was achieved within 1 minute, and CTT was significantly decreased from 1 to 45 minutes, compared with the baseline value. No significant difference in onset, depth, and duration of corneal anesthesia was found between oxybuprocaine-treated and tetracaine-treated eyes. Conjunctival hyperemia and chemosis were detected more frequently in tetracaine-treated eyes than in oxybuprocaine-treated eyes. Topical application of oxybuprocaine and tetracaine similarly reduced corneal sensitivity in dogs, but oxybuprocaine was less irritating to the conjunctiva than was tetracaine.

  15. Right thoracic curvature in the normal spine

    Directory of Open Access Journals (Sweden)

    Masuda Keigo

    2011-01-01

    Full Text Available Abstract Background Trunk asymmetry and vertebral rotation, at times observed in the normal spine, resemble the characteristics of adolescent idiopathic scoliosis (AIS). Right thoracic curvature has also been reported in the normal spine. If it is determined that the features of right thoracic curvature in the normal spine are the same as those observed in AIS, these findings might provide a basis for elucidating the etiology of this condition. For this reason, we investigated right thoracic curvature in the normal spine. Methods For normal spinal measurements, 1,200 patients who underwent posteroanterior chest radiographs were evaluated. These consisted of 400 children (ages 4-9), 400 adolescents (ages 10-19) and 400 adults (ages 20-29), with each group comprised of both genders. The exclusion criteria were obvious chest and spinal diseases. As side curvature is minimal in normal spines and the range at which curvature is measured is difficult to ascertain, first the typical curvature range in scoliosis patients was determined and then the Cobb angle in normal spines was measured using the same range as the scoliosis curve, from T5 to T12. Right thoracic curvature was given a positive value. The curve pattern in each group was classified as neutral (from -1 degree to 1 degree), right (> +1 degree) or left (< -1 degree). Results In the child group, 120 spines curved left, 125 were neutral and 155 curved right. In the adolescent group, 70 curved left, 114 were neutral and 216 curved right. In the adult group, 46 curved left, 102 were neutral and 252 curved right. The curvature pattern shifts to the right side in the adolescent group (p … Conclusions Based on standing chest radiographic measurements, a right thoracic curvature was observed in normal spines after adolescence.

  16. Normalization for triple-target microarray experiments

    Directory of Open Access Journals (Sweden)

    Magniette Frederic

    2008-04-01

    Full Text Available Abstract Background Most microarray studies are made using labelling with one or two dyes, which allows the hybridization of one or two samples on the same slide. In such experiments, the most frequently used dyes are Cy3 and Cy5. Recent improvements in the technology (dye-labelling, scanners and image analysis) allow hybridization of up to four samples simultaneously. The two additional dyes are Alexa488 and Alexa494. The triple-target or four-target technology is very promising, since it allows more flexibility in the design of experiments, an increase in the statistical power when comparing gene expression induced by different conditions, and a scaled-down number of slides. However, few methods have been proposed for the statistical analysis of such data. Moreover, the lowess correction of the global dye effect is available only for two-color experiments, and even if its application can be derived, it does not allow simultaneous correction of the raw data. Results We propose a two-step normalization procedure for triple-target experiments. First the dye bleeding is evaluated and corrected if necessary. Then the signal in each channel is normalized using a generalized lowess procedure to correct a global dye bias. The normalization procedure is validated using triple-self experiments and by comparing the results of triple-target and two-color experiments. Although the focus is on triple-target microarrays, the proposed method can be used to normalize p differently labelled targets co-hybridized on a same array, for any value of p greater than 2. Conclusion The proposed normalization procedure is effective: the technical biases are reduced, the number of false positives is under control in the analysis of differentially expressed genes, and the triple-target experiments are more powerful than the corresponding two-color experiments. There is room for improving the microarray experiments by simultaneously hybridizing more than two samples.
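    For the familiar two-color case, the global dye bias is classically removed by a lowess fit of the log-ratio M against the mean log-intensity A; a sketch using statsmodels (an assumed library choice; the paper's contribution is the generalization of this step to p > 2 dyes):

        import numpy as np
        from statsmodels.nonparametric.smoothers_lowess import lowess

        def lowess_normalize(red: np.ndarray, green: np.ndarray, frac: float = 0.3) -> np.ndarray:
            """Return dye-bias-corrected log-ratios for a two-color array."""
            m = np.log2(red / green)        # log-ratio per spot
            a = 0.5 * np.log2(red * green)  # mean log-intensity per spot
            trend = lowess(m, a, frac=frac, return_sorted=False)  # fitted dye bias at each A
            return m - trend

        red = np.random.lognormal(8, 1, 5000)
        green = np.random.lognormal(8, 1, 5000)
        m_corrected = lowess_normalize(red, green)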

  17. Physics of collisionless scrape-off-layer plasma during normal and off-normal Tokamak operating conditions

    International Nuclear Information System (INIS)

    Hassanein, A.; Konkashbaev, I.

    1999-01-01

    The structure of a collisionless scrape-off-layer (SOL) plasma in tokamak reactors is being studied to define the electron distribution function and the corresponding sheath potential between the divertor plate and the edge plasma. The collisionless model is shown to be valid during the thermal phase of a plasma disruption, as well as during the newly desired low-recycling normal phase of operation with low-density, high-temperature, edge plasma conditions. An analytical solution is developed by solving the Fokker-Planck equation for electron distribution and balance in the SOL. The solution is in good agreement with numerical studies using Monte-Carlo methods. The analytical solutions provide an insight to the role of different physical and geometrical processes in a collisionless SOL during disruptions and during the enhanced phase of normal operation over a wide range of parameters

  18. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    International Nuclear Information System (INIS)

    Tam, Allison; Barker, Jocelyn; Rubin, Daniel

    2016-01-01

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. Their method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as a computer aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline
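    A loose sketch of the two steps ICHE names, intensity centering followed by contrast-limited adaptive histogram equalization, using scikit-image (the authors' modified CLAHE and their centroid estimator are not reproduced here; the mean is used as a stand-in for the histogram centroid):

        import numpy as np
        from skimage import exposure

        def iche_like(image: np.ndarray, target_centroid: float = 0.5) -> np.ndarray:
            """image: float array scaled to [0, 1]. Center intensities, then apply CLAHE."""
            shift = target_centroid - image.mean()        # move the intensity centroid to a common point
            centered = np.clip(image + shift, 0.0, 1.0)
            return exposure.equalize_adapthist(centered)  # contrast-limited adaptive histogram equalization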

  19. A method for normalizing pathology images to improve feature extraction for quantitative pathology

    Energy Technology Data Exchange (ETDEWEB)

    Tam, Allison [Stanford Institutes of Medical Research Program, Stanford University School of Medicine, Stanford, California 94305 (United States); Barker, Jocelyn [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 (United States); Rubin, Daniel [Department of Radiology, Stanford University School of Medicine, Stanford, California 94305 and Department of Medicine (Biomedical Informatics Research), Stanford University School of Medicine, Stanford, California 94305 (United States)

    2016-01-15

    Purpose: With the advent of digital slide scanning technologies and the potential proliferation of large repositories of digital pathology images, many research studies can leverage these data for biomedical discovery and to develop clinical applications. However, quantitative analysis of digital pathology images is impeded by batch effects generated by varied staining protocols and staining conditions of pathological slides. Methods: To overcome this problem, this paper proposes a novel, fully automated stain normalization method to reduce batch effects and thus aid research in digital pathology applications. The method, intensity centering and histogram equalization (ICHE), normalizes a diverse set of pathology images by first scaling the centroids of the intensity histograms to a common point and then applying a modified version of contrast-limited adaptive histogram equalization. Normalization was performed on two datasets of digitized hematoxylin and eosin (H&E) slides of different tissue slices from the same lung tumor, and one immunohistochemistry dataset of digitized slides created by restaining one of the H&E datasets. Results: The ICHE method was evaluated based on image intensity values, quantitative features, and the effect on downstream applications, such as computer-aided diagnosis. For comparison, three methods from the literature were reimplemented and evaluated using the same criteria. The authors found that ICHE not only improved performance compared with un-normalized images, but in most cases showed improvement compared with previous methods for correcting batch effects in the literature. Conclusions: ICHE may be a useful preprocessing step in a digital pathology image processing pipeline.
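
    A rough sketch of the two ICHE stages is given below, assuming scikit-image for the CLAHE step; the authors' modified CLAHE variant and exact parameters are not reproduced, and the helper name is hypothetical.

```python
# Intensity centering followed by CLAHE (simplified, single-channel version).
import numpy as np
from skimage import exposure

def iche_like_normalize(image, target_center=0.5, clip_limit=0.01):
    img = image.astype(float)
    img = (img - img.min()) / (img.max() - img.min() + 1e-12)  # rescale to [0, 1]
    # Stage 1: shift the histogram centroid to a common target point.
    img = np.clip(img + (target_center - img.mean()), 0.0, 1.0)
    # Stage 2: contrast-limited adaptive histogram equalization.
    return exposure.equalize_adapthist(img, clip_limit=clip_limit)

rng = np.random.default_rng(1)
patch = rng.normal(0.3, 0.05, (128, 128))   # stand-in for a slide tile
print(round(float(iche_like_normalize(patch).mean()), 2))
```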

  20. Birkhoff normalization

    NARCIS (Netherlands)

    Broer, H.; Hoveijn, I.; Lunter, G.; Vegter, G.

    2003-01-01

    The Birkhoff normal form procedure is a widely used tool for approximating a Hamiltonian system by a simpler one. This chapter starts out with an introduction to Hamiltonian mechanics, followed by an explanation of the Birkhoff normal form procedure. Finally we discuss several algorithms for

  1. Normalization reduces intersubject variability in cervical vestibular evoked myogenic potentials.

    Science.gov (United States)

    van Tilburg, Mark J; Herrmann, Barbara S; Guinan, John J; Rauch, Steven D

    2014-09-01

    Cervical vestibular evoked myogenic potentials are used to assess saccular and inferior vestibular nerve function. Normalization of the VEMP waveform has been proposed to reduce the variability in vestibular evoked myogenic potentials by correcting for muscle activation. In this study, we test the hypothesis that normalization of the raw cervical VEMP waveform causes a significant decrease in intersubject variability. Prospective cohort study. Large specialty hospital, department of otolaryngology. Twenty healthy subjects participated in this study. All subjects underwent cervical vestibular evoked myogenic potential testing using short tone bursts at 250, 500, 750, and 1,000 Hz. Both intersubject and intrasubject variability were assessed. Variability between raw and normalized peak-to-peak amplitudes was compared using the coefficient of variation. Intrasubject variability was assessed using the intraclass correlation coefficient and the interaural asymmetry ratio. cVEMPs were present in most ears. The highest peak-to-peak amplitudes were recorded at 750 Hz. Normalization did not alter cVEMP tuning characteristics. Normalization of the cVEMP response caused a significant reduction in intersubject variability of the peak-to-peak amplitude. No significant change was seen in intrasubject variability. Normalization significantly reduces cVEMP intersubject variability in healthy subjects without altering cVEMP characteristics. By reducing cVEMP amplitude variation due to nonsaccular, muscle-related factors, cVEMP normalization is expected to improve the ability to distinguish between healthy and pathologic responses in the clinical application of cVEMP testing.
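
    The effect reported here is easy to reproduce in outline: dividing each raw peak-to-peak amplitude by the subject's background muscle activation removes a multiplicative nuisance factor, shrinking the between-subject coefficient of variation. The numbers in this toy simulation are invented for illustration.

```python
# Normalizing cVEMP amplitudes by background EMG (illustrative simulation).
import numpy as np

def coefficient_of_variation(x):
    return np.std(x, ddof=1) / np.mean(x)

rng = np.random.default_rng(2)
emg_level = rng.uniform(50.0, 200.0, 20)     # per-subject muscle activation
response_ratio = rng.normal(1.6, 0.2, 20)    # intrinsic saccular response
raw_p2p = emg_level * response_ratio         # raw amplitude scales with EMG
normalized_p2p = raw_p2p / emg_level         # muscle factor divided out

print("raw CV:       ", round(coefficient_of_variation(raw_p2p), 2))
print("normalized CV:", round(coefficient_of_variation(normalized_p2p), 2))
```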

  2. A CONTEXT AWARE BASED PRE-HANDOFF SUPPORT APPROACH TO PROVIDE OPTIMAL QOS FOR STREAMING APPLICATIONS OVER VEHICULAR AD HOC NETWORKS – HOSA

    Directory of Open Access Journals (Sweden)

    K. RAMESH BABU

    2015-06-01

    Full Text Available Large variations in network Quality of Service (QoS), such as bandwidth, latency, jitter, and reliability, may occur during media transfer over vehicular ad hoc networks (VANET). Mobile and wireless computing applications that use VANET experience “bursty” QoS behavior during execution over distributed network scenarios. Applications such as streaming media services need to adapt their functionality to any change in network status. Moreover, an enhanced software platform is necessary to provide adaptive network-management services to upper software components. HOSA, a handoff-service-broker-based architecture for QoS adaptation over VANET, provides this awareness. HOSA is structured as a middleware platform both to provide QoS awareness to streaming applications and to manage dynamic ad hoc network resources with support for handoff in an adaptive fashion. HOSA is analyzed over routing schemes such as TIBSCRPH, SIP and ABSRP, where its performance was measured using throughput, traffic intensity and end-to-end delay. HOSA has been implemented using the JXTA development toolkit over C++ classes to demonstrate its performance under varying node mobility using a vehicular-mobility-based conference application.

  3. Empirical evaluation of data normalization methods for molecular classification.

    Science.gov (United States)

    Huang, Huei-Chung; Qin, Li-Xuan

    2018-01-01

    Data artifacts due to variations in experimental handling are ubiquitous in microarray studies, and they can lead to biased and irreproducible findings. A popular approach to correct for such artifacts is through post hoc data adjustment such as data normalization. Statistical methods for data normalization have been developed and evaluated primarily for the discovery of individual molecular biomarkers. Their performance has rarely been studied for the development of multi-marker molecular classifiers-an increasingly important application of microarrays in the era of personalized medicine. In this study, we set out to evaluate the performance of three commonly used methods for data normalization in the context of molecular classification, using extensive simulations based on re-sampling from a unique pair of microRNA microarray datasets for the same set of samples. The data and code for our simulations are freely available as R packages at GitHub. In the presence of confounding handling effects, all three normalization methods tended to improve the accuracy of the classifier when evaluated in an independent test data. The level of improvement and the relative performance among the normalization methods depended on the relative level of molecular signal, the distributional pattern of handling effects (e.g., location shift vs scale change), and the statistical method used for building the classifier. In addition, cross-validation was associated with biased estimation of classification accuracy in the over-optimistic direction for all three normalization methods. Normalization may improve the accuracy of molecular classification for data with confounding handling effects; however, it cannot circumvent the over-optimistic findings associated with cross-validation for assessing classification accuracy.

  4. Updated US and Canadian normalization factors for TRACI 2.1

    DEFF Research Database (Denmark)

    Ryberg, Morten; Vieira, Marisa D. M.; Zgola, Melissa

    2014-01-01

    When LCA practitioners perform LCAs, the interpretation of the results can be difficult without a reference point to benchmark the results. Hence, normalization factors are important for relating results to a common reference. The main purpose of this paper was to update the normalization factors...... for the US and US-Canadian regions. The normalization factors were used for highlighting the most contributing substances, thereby enabling practitioners to put more focus on important substances, when compiling the inventory, as well as providing them with normalization factors reflecting the actual...... situation. Normalization factors were calculated using characterization factors from the TRACI 2.1 LCIA model. The inventory was based on US databases on emissions of substances. The Canadian inventory was based on a previous inventory with 2005 as reference, in this inventory the most significant...
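
    The arithmetic behind such normalization references is simple: a factor is the characterized sum of a region's annual emissions for an impact category, and a study's characterized result is divided by it. The sketch below uses invented numbers, not TRACI 2.1 values.

```python
# Building and applying an LCIA normalization factor (toy example).
emissions_kg_per_yr = {"CO2": 6.0e12, "CH4": 2.7e10}   # regional inventory
cf_gwp = {"CO2": 1.0, "CH4": 25.0}                     # kg CO2-eq per kg emitted

nf = sum(emissions_kg_per_yr[s] * cf_gwp[s] for s in emissions_kg_per_yr)

study_result_kg_co2eq = 3.2e4                          # characterized LCA result
print(f"reference: {nf:.3e} kg CO2-eq/yr, normalized: {study_result_kg_co2eq / nf:.3e}")
```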

  5. A compendium of canine normal tissue gene expression.

    Directory of Open Access Journals (Sweden)

    Joseph Briggs

    Full Text Available BACKGROUND: Our understanding of disease is increasingly informed by changes in gene expression between normal and abnormal tissues. The release of the canine genome sequence in 2005 provided an opportunity to better understand human health and disease using the dog as a clinically relevant model. Accordingly, we now present the first genome-wide, canine normal tissue gene expression compendium with corresponding human cross-species analysis. METHODOLOGY/PRINCIPAL FINDINGS: The Affymetrix platform was utilized to catalogue gene expression signatures of 10 normal canine tissues including: liver, kidney, heart, lung, cerebrum, lymph node, spleen, jejunum, pancreas and skeletal muscle. The quality of the database was assessed in several ways. Organ-defining gene sets were identified for each tissue, and functional enrichment analysis revealed themes consistent with known physio-anatomic functions for each organ. In addition, a comparison of orthologous gene expression between matched canine and human normal tissues uncovered remarkable similarity. To demonstrate the utility of this dataset, novel canine gene annotations were established based on comparative analysis of dog and human tissue-selective gene expression and manual curation of canine probeset mapping. Public access, using infrastructure identical to that currently in use for human normal tissues, has been established and allows for additional comparisons across species. CONCLUSIONS/SIGNIFICANCE: These data advance our understanding of the canine genome through a comprehensive analysis of gene expression in a diverse set of tissues, contributing to improved functional annotation that has been lacking. Importantly, it will be used to inform future studies of disease in the dog as a model for human translational research and provides a novel resource to the community at large.

  6. Single-Phase Full-Wave Rectifier as an Effective Example to Teach Normalization, Conduction Modes, and Circuit Analysis Methods

    Directory of Open Access Journals (Sweden)

    Predrag Pejovic

    2013-12-01

    Full Text Available Application of a single-phase rectifier as an example in teaching circuit modeling, normalization, operating modes of nonlinear circuits, and circuit analysis methods is proposed. The rectifier, supplied from a voltage source through an inductive impedance, is analyzed in the discontinuous as well as the continuous conduction mode. A completely analytical solution for the continuous conduction mode is derived. Appropriate numerical methods are proposed to obtain the circuit waveforms in both operating modes and to compute the performance parameters. Source code of the program that performs such computation is provided.
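
    The record mentions that source code accompanies the paper but does not reproduce it, so here is an independently written sketch of the kind of computation involved. It integrates the normalized state equation dx/dθ = |sin θ| − v_out with x = iωL/Vm, clamping the bridge current at zero; the normalization follows the usual textbook choice, not necessarily the authors' program.

```python
# Normalized full-wave rectifier current by forward-Euler integration:
# dx/dtheta = |sin(theta)| - v_out while conducting, x clamped at zero.
import numpy as np

def rectifier_current(v_out_norm=0.9, cycles=5, steps_per_cycle=20000):
    theta = np.linspace(0.0, cycles * np.pi, cycles * steps_per_cycle)
    dtheta = theta[1] - theta[0]
    x = np.zeros_like(theta)
    for k in range(1, len(theta)):
        dx = (abs(np.sin(theta[k - 1])) - v_out_norm) * dtheta
        x[k] = max(x[k - 1] + dx, 0.0)   # diodes block reverse current
    return theta, x

theta, x = rectifier_current()
mode = "discontinuous" if np.any(x[len(x) // 2:] == 0.0) else "continuous"
print(f"peak normalized current: {x.max():.3f}, mode: {mode}")
```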

  7. Normal Pressure Hydrocephalus (NPH)

    Science.gov (United States)

    Normal pressure hydrocephalus (NPH) is a brain disorder that occurs when excess cerebrospinal fluid ...

  8. Nursing application of Bobath principles in stroke care.

    Science.gov (United States)

    Passarella, P M; Lewis, N

    1987-04-01

    The nursing approach in the care of stroke patients has a direct impact on functional outcome. Nursing application of Bobath principles in stroke care offers a nursing focus on involvement of the affected side; facilitation of normal tone, posture, and movement; and development of more normal function. A research study evaluating the functional gains of stroke patients demonstrated a significant level of functional improvement in those treated with Bobath principles over stroke patients treated with the traditional nursing approach. Practical methods for applying Bobath principles in patient care activities are described. These therapeutic methods provide nurses with the means to maximize stroke patients' potential and further influence their functional recovery.

  9. A case study: application of statistical process control tool for determining process capability and sigma level.

    Science.gov (United States)

    Chopra, Vikram; Bairagi, Mukesh; Trivedi, P; Nagar, Mona

    2012-01-01

    Statistical process control is the application of statistical methods to the measurement and analysis of variation in a process. Various regulatory authorities, such as the Validation Guidance for Industry (2011), International Conference on Harmonisation ICH Q10 (2009), the Health Canada guidelines (2009), the Health Science Authority, Singapore: Guidance for Product Quality Review (2008), and International Organization for Standardization ISO-9000:2005, provide regulatory support for the application of statistical process control for better process control and understanding. In this study, risk assessment, normal probability distributions, control charts, and capability charts are employed for selection of critical quality attributes, determination of the normal probability distribution, statistical stability, and capability of production processes, respectively. The objective of this study is to determine tablet production process quality in the form of sigma process capability. By interpreting data and graph trends, forecasting of critical quality attributes, sigma process capability, and stability of the process were studied. The overall study contributes to an assessment of the process at the sigma level with respect to out-of-specification attributes produced. Finally, the study points to an area where the application of quality improvement and quality risk assessment principles for achievement of six-sigma-capable processes is possible. Statistical process control is the most advantageous tool for determination of the quality of any production process. This tool is new for the pharmaceutical tablet production process. In the case of pharmaceutical tablet production processes, the quality control parameters act as quality assessment parameters. Application of risk assessment provides selection of critical quality attributes among quality control parameters. Sequential application of normality distributions, control charts, and capability analyses provides a valid statistical
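
    The capability and sigma-level calculations mentioned above reduce to a few lines of arithmetic. The specification limits and sample data below are invented, and the conventional 1.5-sigma long-term shift is included as an explicit assumption.

```python
# Cp, Cpk and an approximate sigma level from sample data (toy example).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
weights = rng.normal(250.0, 2.0, 500)   # tablet weights (mg)
lsl, usl = 242.5, 257.5                 # specification limits

mu, sigma = weights.mean(), weights.std(ddof=1)
cp = (usl - lsl) / (6 * sigma)                    # potential capability
cpk = min(usl - mu, mu - lsl) / (3 * sigma)       # actual capability

p_oos = stats.norm.cdf(lsl, mu, sigma) + stats.norm.sf(usl, mu, sigma)
sigma_level = stats.norm.isf(p_oos) + 1.5         # with 1.5-sigma shift
print(f"Cp={cp:.2f}  Cpk={cpk:.2f}  sigma level ~ {sigma_level:.1f}")
```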

  10. On the Ergodic Capacity of Dual-Branch Correlated Log-Normal Fading Channels with Applications

    KAUST Repository

    Al-Quwaiee, Hessa

    2015-05-01

    Closed-form expressions for the ergodic capacity of independent or correlated diversity branches over Log-Normal fading channels are not available in the literature. Thus, it is of interest to investigate the behavior of this metric at high signal-to-noise ratio (SNR). In this work, we propose simple closed-form asymptotic expressions for the ergodic capacity of dual-branch correlated Log-Normal channels corresponding to selection combining and switch-and-stay combining. Furthermore, we capitalize on these new results to find the new asymptotic ergodic capacity of a correlated dual-branch free-space optical communication system under the impact of pointing error, with both heterodyne and intensity modulation/direct detection. © 2015 IEEE.
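
    The quantity being approximated can be checked numerically. The Monte Carlo sketch below estimates the ergodic capacity of dual-branch selection combining over correlated log-normal gains; the parameter values are arbitrary, and the paper's contribution is the closed-form asymptote that such a simulation converges to.

```python
# Monte Carlo ergodic capacity of dual-branch selection combining
# over correlated log-normal fading (bits/s/Hz).
import numpy as np

def sc_ergodic_capacity(snr_db=30.0, rho=0.5, sigma_db=4.0, n=200_000, seed=4):
    rng = np.random.default_rng(seed)
    snr = 10 ** (snr_db / 10)
    cov = (sigma_db ** 2) * np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)  # correlated Gaussians
    gains = 10 ** (z / 10)                                # log-normal power gains
    picked = gains.max(axis=1)                            # selection combining
    return float(np.mean(np.log2(1.0 + picked * snr)))

print(f"{sc_ergodic_capacity():.2f} bit/s/Hz")
```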

  11. U.S. Climate Normals Product Suite (1981-2010)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Climate Normals are a large suite of data products that provide users with many tools to understand typical climate conditions for thousands of locations...

  12. Waste gas could provide power for ships

    Energy Technology Data Exchange (ETDEWEB)

    1970-07-18

    Dual-fuel engines are not new, but a version has been produced which, when used on ships carrying liquefied natural gas (LNG), could operate almost completely on waste gas. In its gas-operating mode, an engine can use the waste gas boiled off an LNG cargo. This wastage, normally allowed to escape to the atmosphere, is about 0.25% of the cargo per day. Calculations have shown that this is enough to provide almost all the propulsion needs of a tanker under full cargo. This design is important in that it is suitable for the larger vessels now being required to carry LNG from N. Africa to North America, a journey where fuel costs are very considerable. Tests on the engine have indicated that power output is reduced to about 80% of the power under diesel fuel. However, additional advantages, such as cleaner engines with reduced maintenance costs, will help tip the economic balance even further in favor of the dual-purpose unit. This system is also applicable to stationary generating plant, again particularly on LNG tankage units where the same degree of gas boil-off applies.

  13. Costs of publicly provided maternity services in Rosario, Argentina

    Directory of Open Access Journals (Sweden)

    Borghi Josephine

    2003-01-01

    Full Text Available OBJECTIVE: This study estimates the costs of maternal health services in Rosario, Argentina. MATERIAL AND METHODS: The provider costs (1999 US$) of antenatal care, a normal vaginal delivery and a caesarean section were evaluated retrospectively in two municipal hospitals. The cost of an antenatal visit was evaluated in two health centres, and the patient costs associated with the visit were evaluated in a hospital and a health centre. RESULTS: The average cost per hospital day is $114.62. The average cost of a caesarean section ($525.57) is five times greater than that of a normal vaginal delivery ($105.61). A normal delivery costs less at the general hospital and a c-section less at the maternity hospital. The average cost of an antenatal visit is $31.10. The provider cost is lower at the health centre than at the hospital. Personnel accounted for 72-94% of the total cost and drugs and medical supplies for 4-26%. On average, an antenatal visit costs women $4.70. Direct costs are minimal compared to indirect costs of travel and waiting time. CONCLUSIONS: These results suggest the potential for increasing the efficiency of resource use by promoting antenatal care visits at the primary level. Women could also benefit from reduced travel and waiting time. Similar benefits could accrue to the provider by encouraging normal deliveries at general hospitals and complicated deliveries at specialised maternity hospitals.

  14. The quotient of normal random variables and application to asset price fat tails

    Science.gov (United States)

    Caginalp, Carey; Caginalp, Gunduz

    2018-06-01

    The quotient of random variables with normal distributions is examined and proven to have power-law decay, with density f(x) ≈ f₀x⁻², with the coefficient depending on the means and variances of the numerator and denominator and their correlation. We also obtain the conditional probability densities for each of the four quadrants given by the signs of the numerator and denominator for arbitrary correlation ρ ∈ [−1, 1). For ρ = −1 we obtain a particularly simple closed-form solution for all x ∈ ℝ. The results are applied to a basic issue in economics and finance, namely the density of relative price changes. Classical finance stipulates a normal distribution of relative price changes, though empirical studies suggest a power law at the tail end. By considering the supply and demand in a basic price-change model, we prove that the relative price change has a density that decays with an x⁻² power law. Various parameter limits are established.
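
    The tail behaviour is easy to verify by simulation: for a density decaying like x⁻², the product t·P(|X| > t) approaches a constant. The means, variances and correlation below are arbitrary choices.

```python
# Checking the x^-2 tail of a ratio of correlated normal variables.
import numpy as np

rng = np.random.default_rng(5)
n, rho = 2_000_000, -0.3
z1 = rng.normal(size=n)
z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.normal(size=n)
ratio = (1.0 + 0.5 * z1) / (0.8 + 1.2 * z2)   # quotient of normals

for t in (10.0, 20.0, 40.0):
    p = np.mean(np.abs(ratio) > t)
    print(f"t={t:>4.0f}   t * P(|X|>t) = {t * p:.3f}")   # roughly constant
```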

  15. Log-normality of indoor radon data in the Walloon region of Belgium

    International Nuclear Information System (INIS)

    Cinelli, Giorgia; Tondeur, François

    2015-01-01

    The deviations of the distribution of Belgian indoor radon data from the log-normal trend are examined. Simulated data are generated to provide a theoretical frame for understanding these deviations. It is shown that the 3-component structure of indoor radon (radon from subsoil, outdoor air and building materials) generates deviations in the low- and high-concentration tails, but this low-C trend can be almost completely compensated by the effect of measurement uncertainties and by possible small errors in background subtraction. The predicted low-C and high-C deviations are well observed in the Belgian data, when considering the global distribution of all data. The agreement with the log-normal model is improved when considering data organised in homogeneous geological groups. As the deviation from log-normality is often due to the low-C tail for which there is no interest, it is proposed to use the log-normal fit limited to the high-C half of the distribution. With this prescription, the vast majority of the geological groups of data are compatible with the log-normal model, the remaining deviations being mostly due to a few outliers, and rarely to a “fat tail”. With very few exceptions, the log-normal modelling of the high-concentration part of indoor radon data is expected to give reasonable results, provided that the data are organised in homogeneous geological groups. - Highlights: • Deviations of the distribution of Belgian indoor Rn data from the log-normal trend. • 3-component structure of indoor Rn: subsoil, outdoor air and building materials. • Simulated data generated to provide a theoretical frame for understanding deviations. • Data organised in homogeneous geological groups; better agreement with the log-normal
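
    The prescription of fitting only the high-concentration half of the distribution can be sketched as follows. For log-normal data, the median of the logged values estimates the log-mean, and the mean squared deviation of the upper half about the median consistently estimates the log-variance; the simulated data and the simple additive background term are assumptions for illustration, and the paper's actual fitting procedure may differ.

```python
# Log-normal fit restricted to the high-concentration half of the data.
import numpy as np

def upper_half_lognormal_fit(concentrations):
    logc = np.log(np.asarray(concentrations, dtype=float))
    med = np.median(logc)                 # robust to the distorted low-C tail
    upper = logc[logc >= med]
    sigma = np.sqrt(np.mean((upper - med) ** 2))  # E[(Z-mu)^2 | Z>=mu] = sigma^2
    return med, sigma

rng = np.random.default_rng(6)
# Simulated indoor radon: log-normal subsoil component plus a small background
data = rng.lognormal(4.0, 0.8, 5000) + rng.uniform(5.0, 15.0, 5000)
mu, sigma = upper_half_lognormal_fit(data)
print(f"GM = {np.exp(mu):.0f} Bq/m3, GSD = {np.exp(sigma):.2f}")
```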

  16. Automatic development of normal zone in composite MgB2/CuNi wires with different diameters

    Science.gov (United States)

    Jokinen, A.; Kajikawa, K.; Takahashi, M.; Okada, M.

    2010-06-01

    One promising application of superconducting technology for hydrogen utilization is a sensor with a magnesium-diboride (MgB2) superconductor to detect the position of the boundary between liquid hydrogen and the evaporated gas stored in a Dewar vessel. In our previous experiment on the level sensor, the normal zone developed automatically, and therefore no energy input from the heater was required for normal operation. Although the physical mechanism for this property of the MgB2 wire has not yet been clarified, its deliberate application might lead to the realization of a simpler superconducting level sensor without a heater system. In the present study, the automatic development of the normal zone with increasing transport current is evaluated for samples consisting of three kinds of MgB2 wires with CuNi sheaths and different diameters immersed in liquid helium. The influences of repeated current excitations and heat cycles on the normal-zone development are discussed experimentally. The aim of this paper is to confirm the suitability of MgB2 wire for a heater-free level sensor application. This could lead to an even more optimized design of the liquid-hydrogen level sensor and the removal of the extra heater input.

  17. Applications of Raman spectroscopy in life science

    Science.gov (United States)

    Martin, Airton A.; T. Soto, Cláudio A.; Ali, Syed M.; Neto, Lázaro P. M.; Canevari, Renata A.; Pereira, Liliane; Fávero, Priscila P.

    2015-06-01

    Raman spectroscopy has been applied to the analysis of biological samples for the last 12 years, providing detection of changes occurring at the molecular level during the pathological transformation of tissue. The potential use of this technology in cancer diagnosis has shown encouraging results for in vivo, real-time and minimally invasive diagnosis. Confocal Raman techniques have also been successfully applied in the analysis of the skin aging process, providing new insights in this field. This paper presents the latest biomedical applications of Raman spectroscopy in our laboratory. Raman spectroscopy (RS) has been used for biochemical and molecular characterization of thyroid tissue by micro-Raman spectroscopy and gene expression analysis. This study aimed to improve the discrimination between different thyroid pathologies by Raman analysis. A total of 35 thyroid tissue samples, including normal tissue (n=10), goiter (n=10), papillary (n=10) and follicular carcinomas (n=5), were analyzed. Confocal Raman spectroscopy allowed a maximum discrimination of 91.1% between normal and tumor tissues, 84.8% between benign and malignant pathologies and 84.6% among the carcinomas analyzed. We also report the application of in vivo confocal Raman spectroscopy as an important sensor for detecting advanced glycation end products (AGEs) on human skin.

  18. Predicting consonant recognition and confusions in normal-hearing listeners

    DEFF Research Database (Denmark)

    Zaar, Johannes; Dau, Torsten

    2017-01-01

    The proposed model builds on the auditory signal processing model of Dau, Kollmeier, and Kohlrausch [(1997). J. Acoust. Soc. Am. 102, 2892–2905]. The model was evaluated based on the extensive consonant perception data set provided by Zaar and Dau [(2015). J. Acoust. Soc. Am. 138, 1253–1267], which was obtained with normal-hearing listeners using 15 consonant-vowel combinations … confusion groups. The large predictive power of the proposed model suggests that adaptive processes in the auditory preprocessing in combination with a cross-correlation based template-matching back end can account for some of the processes underlying consonant perception in normal-hearing listeners. The proposed model may provide a valuable framework, e.g., for investigating the effects of hearing impairment and hearing-aid signal processing on phoneme recognition.

  19. U.S. Annual/Seasonal Climate Normals (1981-2010)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The U.S. Annual Climate Normals for 1981 to 2010 are 30-year averages of meteorological parameters that provide users with many tools to understand typical climate...

  20. Attention and normalization circuits in macaque V1.

    Science.gov (United States)

    Sanayei, M; Herrero, J L; Distler, C; Thiele, A

    2015-04-01

    Attention affects neuronal processing and improves behavioural performance. In extrastriate visual cortex these effects have been explained by normalization models, which assume that attention influences the circuit that mediates surround suppression. While normalization models have been able to explain attentional effects, their validity has rarely been tested against alternative models. Here we investigate how attention and surround/mask stimuli affect neuronal firing rates and orientation tuning in macaque V1. Surround/mask stimuli provide an estimate to what extent V1 neurons are affected by normalization, which was compared against effects of spatial top down attention. For some attention/surround effect comparisons, the strength of attentional modulation was correlated with the strength of surround modulation, suggesting that attention and surround/mask stimulation (i.e. normalization) might use a common mechanism. To explore this in detail, we fitted multiplicative and additive models of attention to our data. In one class of models, attention contributed to normalization mechanisms, whereas in a different class of models it did not. Model selection based on Akaike's and on Bayesian information criteria demonstrated that in most cells the effects of attention were best described by models where attention did not contribute to normalization mechanisms. This demonstrates that attentional influences on neuronal responses in primary visual cortex often bypass normalization mechanisms. © 2015 The Authors. European Journal of Neuroscience published by Federation of European Neuroscience Societies and John Wiley & Sons Ltd.

  1. Datum Feature Extraction and Deformation Analysis Method Based on Normal Vector of Point Cloud

    Science.gov (United States)

    Sun, W.; Wang, J.; Jin, F.; Liang, Z.; Yang, Y.

    2018-04-01

    In order to solve the problem of lacking an applicable analysis method when applying three-dimensional laser scanning technology to deformation monitoring, an efficient method for extracting datum features and analysing deformation based on the normal vectors of a point cloud is proposed. Firstly, a kd-tree is used to establish the topological relation. Datum points are detected by tracking the point-cloud normal vectors determined from the normal vectors of local plane fits. Then, cubic B-spline curve fitting is performed on the datum points. Finally, the datum elevation and the inclination angle of each radial point are calculated from the fitted curve, and the deformation information is analyzed. The proposed approach was verified on a real large-scale tank data set captured with a terrestrial laser scanner in a chemical plant. The results show that the method can obtain the entire information of the monitored object quickly and comprehensively, and accurately reflect the deformation of the datum features.
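
    The normal-vector step described above is commonly implemented as a local plane fit over k nearest neighbours. A hedged sketch, assuming SciPy's kd-tree and taking the smallest-eigenvalue direction of the neighbourhood covariance as the normal:

```python
# Point-cloud normal estimation via kd-tree neighbourhoods and PCA.
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points, k=12):
    """points: (n, 3) array -> (n, 3) unit normals (sign ambiguous)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    normals = np.empty_like(points)
    for i, nbrs in enumerate(idx):
        q = points[nbrs] - points[nbrs].mean(axis=0)
        _, vecs = np.linalg.eigh(q.T @ q)   # eigenvalues in ascending order
        normals[i] = vecs[:, 0]             # smallest -> local plane normal
    return normals

# Usage: points sampled near the plane z = 0 should give normals ~ (0, 0, 1)
rng = np.random.default_rng(7)
pts = np.column_stack([rng.uniform(0, 1, 500), rng.uniform(0, 1, 500),
                       rng.normal(0, 1e-3, 500)])
print(np.round(np.abs(estimate_normals(pts)).mean(axis=0), 2))  # ~ [0, 0, 1]
```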

  2. Procedure for normalization of cDNA libraries

    Science.gov (United States)

    Bonaldo, Maria DeFatima; Soares, Marcelo Bento

    1997-01-01

    This invention provides a method to normalize a cDNA library constructed in a vector capable of being converted to single-stranded circles and capable of producing complementary nucleic acid molecules to the single-stranded circles, comprising: (a) converting the cDNA library into single-stranded circles; (b) generating complementary nucleic acid molecules to the single-stranded circles; (c) hybridizing the single-stranded circles converted in step (a) with the complementary nucleic acid molecules of step (b) to produce partial duplexes to an appropriate Cot; (e) separating the unhybridized single-stranded circles from the hybridized single-stranded circles, thereby generating a normalized cDNA library.

  3. Optimal consistency in microRNA expression analysis using reference-gene-based normalization.

    Science.gov (United States)

    Wang, Xi; Gardiner, Erin J; Cairns, Murray J

    2015-05-01

    Normalization of high-throughput molecular expression profiles secures differential expression analysis between samples of different phenotypes or biological conditions, and facilitates comparison between experimental batches. While the same general principles apply to microRNA (miRNA) normalization, there is mounting evidence that global shifts in their expression patterns occur in specific circumstances, which pose a challenge for normalizing miRNA expression data. As an alternative to global normalization, which has the propensity to flatten large trends, normalization against constitutively expressed reference genes presents an advantage through their relative independence. Here we investigated the performance of reference-gene-based (RGB) normalization for differential miRNA expression analysis of microarray expression data, and compared the results with other normalization methods, including: quantile, variance stabilization, robust spline, simple scaling, rank invariant, and Loess regression. The comparative analyses were executed using miRNA expression in tissue samples derived from subjects with schizophrenia and non-psychiatric controls. We proposed a consistency criterion for evaluating methods by examining the overlapping of differentially expressed miRNAs detected using different partitions of the whole data. Based on this criterion, we found that RGB normalization generally outperformed global normalization methods. Thus we recommend the application of RGB normalization for miRNA expression data sets, and believe that this will yield a more consistent and useful readout of differentially expressed miRNAs, particularly in biological conditions characterized by large shifts in miRNA expression.
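
    In outline, reference-gene-based normalization rescales each sample so that a panel of stably expressed genes sits at a common level, leaving genuine global shifts in the remaining genes intact. The matrix layout and reference row indices below are assumptions for illustration.

```python
# Reference-gene-based (RGB) normalization of a log2 expression matrix.
import numpy as np

def rgb_normalize(log2_expr, reference_rows):
    """log2_expr: (genes x samples); reference_rows: indices of reference genes."""
    ref_level = log2_expr[reference_rows].mean(axis=0)   # one value per sample
    return log2_expr - ref_level + ref_level.mean()      # align reference panel

rng = np.random.default_rng(8)
expr = rng.normal(8.0, 1.0, (100, 6))
expr[:, 3:] += 0.7                          # handling shift in samples 4-6
norm = rgb_normalize(expr, reference_rows=[0, 1, 2])
print(np.round(norm[:3].mean(axis=0), 2))   # reference genes now level
```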

  4. A methodology for generating normal and pathological brain perfusion SPECT images for evaluation of MRI/SPECT fusion methods: application in epilepsy

    Energy Technology Data Exchange (ETDEWEB)

    Grova, C [Laboratoire IDM, Faculte de Medecine, Universite de Rennes 1, Rennes (France); Jannin, P [Laboratoire IDM, Faculte de Medecine, Universite de Rennes 1, Rennes (France); Biraben, A [Laboratoire IDM, Faculte de Medecine, Universite de Rennes 1, Rennes (France); Buvat, I [INSERM U494, CHU Pitie Salpetriere, Paris (France); Benali, H [INSERM U494, CHU Pitie Salpetriere, Paris (France); Bernard, A M [Service de Medecine Nucleaire, Centre Eugene Marquis, Rennes (France); Scarabin, J M [Laboratoire IDM, Faculte de Medecine, Universite de Rennes 1, Rennes (France); Gibaud, B [Laboratoire IDM, Faculte de Medecine, Universite de Rennes 1, Rennes (France)

    2003-12-21

    Quantitative evaluation of brain MRI/SPECT fusion methods for normal and in particular pathological datasets is difficult, due to the frequent lack of relevant ground truth. We propose a methodology to generate MRI and SPECT datasets dedicated to the evaluation of MRI/SPECT fusion methods and illustrate the method when dealing with ictal SPECT. The method consists in generating normal or pathological SPECT data perfectly aligned with a high-resolution 3D T1-weighted MRI using realistic Monte Carlo simulations that closely reproduce the response of a SPECT imaging system. Anatomical input data for the SPECT simulations are obtained from this 3D T1-weighted MRI, while functional input data result from an inter-individual analysis of anatomically standardized SPECT data. The method makes it possible to control the 'brain perfusion' function by proposing a theoretical model of brain perfusion from measurements performed on real SPECT images. Our method provides an absolute gold standard for assessing MRI/SPECT registration method accuracy since, by construction, the SPECT data are perfectly registered with the MRI data. The proposed methodology has been applied to create a theoretical model of normal brain perfusion and ictal brain perfusion characteristic of mesial temporal lobe epilepsy. To approach realistic and unbiased perfusion models, real SPECT data were corrected for uniform attenuation, scatter and partial volume effect. An anatomic standardization was used to account for anatomic variability between subjects. Realistic simulations of normal and ictal SPECT deduced from these perfusion models are presented. The comparison of real and simulated SPECT images showed relative differences in regional activity concentration of less than 20% in most anatomical structures, for both normal and ictal data, suggesting realistic models of perfusion distributions for evaluation purposes. Inter-hemispheric asymmetry coefficients measured on simulated data were

  5. A study of the normal interpedicular distance of the spine in Korean teenagers (Estimation of normal range by roentgenographic measurement)

    International Nuclear Information System (INIS)

    Lee, Myung Uk

    1979-01-01

    The radiological measurement of the interpedicular distance using a routine antero-posterior view of the spine gives important clinical criteria in the evaluation of intraspinal tumors and stenosis of the spinal canal, and aids in the diagnosis of these lesions. In 1934 Elsberg and Dyke reported values of the interpedicular distance as determined on roentgenograms of the spines of white adults, and in 1968 Song prepared normal values of the interpedicular distance for Korean adults. The present investigation was undertaken to provide normal interpedicular distances for Korean teenagers. The author observed the antero-posterior films of the spines of 200 normal teenagers, comprising 100 males and 100 females. The normal values of the interpedicular distance of Korean teenagers were obtained, as well as a 90% tolerance range for clinical use. In the statistical analysis, significant differences were noted between males and females, and between age groups. It was observed that average male measurements were consistently larger than female ones by about 1 mm, and that growth of the spinal canal appeared to be continuing.

  6. 77 FR 26507 - Application(s) for Duty-Free Entry of Scientific Instruments

    Science.gov (United States)

    2012-05-04

    ... States. Application accepted by Commissioner of Customs: March 29, 2012. Docket Number: 12-018. Applicant... general category manufactured in the United States. Application accepted by Commissioner of Customs: March...: The instrument will be used to investigate the genes and proteins that underlie normal and pathologic...

  7. Bernstein Algorithm for Vertical Normalization to 3NF Using Synthesis

    Directory of Open Access Journals (Sweden)

    Matija Varga

    2013-07-01

    Full Text Available This paper demonstrates the use of the Bernstein algorithm for vertical normalization to 3NF using synthesis. The aim of the paper is to provide an algorithm for database normalization, present a set of steps which minimize redundancy in order to increase database management efficiency, and specify tests and algorithms for testing and proving reversibility (i.e., proving that the normalization did not cause loss of information). Using the steps of the Bernstein algorithm, the paper gives examples of vertical normalization to 3NF through synthesis and proposes a test and an algorithm to demonstrate decomposition reversibility. This paper also explains that the reasons for generating normal forms are to facilitate data search and to eliminate data redundancy as well as delete, insert and update anomalies, and it explains how anomalies develop using examples.
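
    The synthesis step at the core of the algorithm can be illustrated in a few lines: after a minimal cover of the functional dependencies is computed, dependencies are grouped by their left-hand side and each group becomes one 3NF relation. The sketch below omits minimal-cover computation and the final key-relation check, and the schema is invented.

```python
# Grouping functional dependencies by determinant (Bernstein synthesis core).
def synthesize_3nf(fds):
    """fds: iterable of (lhs_tuple, rhs_attribute) pairs from a minimal cover."""
    groups = {}
    for lhs, rhs in fds:
        groups.setdefault(frozenset(lhs), set()).add(rhs)
    # Each relation = determinant attributes plus all their dependents.
    return [tuple(sorted(lhs | rhs)) for lhs, rhs in groups.items()]

# Example: SID -> Name, SID -> Dept, Dept -> Building
fds = [(("SID",), "Name"), (("SID",), "Dept"), (("Dept",), "Building")]
for relation in synthesize_3nf(fds):
    print(relation)   # ('Dept', 'Name', 'SID') and ('Building', 'Dept')
```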

  8. New Riemannian Priors on the Univariate Normal Model

    Directory of Open Access Journals (Sweden)

    Salem Said

    2014-07-01

    Full Text Available The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as “Riemannian priors”. Precisely, if {pθ ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d²(θ, θ̄) is the square of Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that it gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper defines the Gaussian distributions G(θ̄, γ) rigorously and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.

  9. Capacitor energy needed to induce transitions from the superconducting to the normal state

    International Nuclear Information System (INIS)

    Eberhard, P.H.; Ross, R.R.

    1985-08-01

    The purpose of this paper is to describe a technique for turning a long length of superconducting wire normal by dumping a charged capacitor into it, and to justify some formulae needed in the design. The physical phenomenon is described, and a formula for the energy to be stored in the capacitor is given. There are circumstances where the dc in an electrical circuit containing superconducting elements has to be turned off quickly, and where the most convenient way to switch the current off is to turn a large portion or all of the superconducting wire normal. Such was the case for the Time Projection Chamber (TPC) superconducting magnet as soon as a quench was detected. The technique used was the discharge of a capacitor into the coil center tap. It turned the magnet winding normal in ten milliseconds or so and provided adequate quench protection. The technique of discharging a capacitor into a superconducting wire should have many other applications whenever a substantial resistance in a superconducting circuit has to be generated on that kind of time scale. The process involves generating a pulse of large currents in part of the circuit and heating the wire by ac losses until the value of the wire's critical current is smaller than the dc current. Use of low-inductance connections to the circuit is necessary. The dc then gets turned off by the resistance of the wire, as in a magnet quench

  10. Normalized Minimum Error Entropy Algorithm with Recursive Power Estimation

    Directory of Open Access Journals (Sweden)

    Namyong Kim

    2016-06-01

    Full Text Available The minimum error entropy (MEE) algorithm is known to be superior in signal processing applications under impulsive noise. In this paper, based on an analysis of the behavior of the optimum weight and the properties of robustness against impulsive noise, a normalized version of the MEE algorithm is proposed. The step size of the MEE algorithm is normalized with the input power, which is estimated recursively to reduce its computational complexity. The proposed algorithm yields a lower minimum MSE (mean squared error) and faster convergence speed simultaneously than the original MEE algorithm does in the equalization simulation. On the condition of the same convergence speed, its performance enhancement in steady-state MSE is above 3 dB.
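
    The normalization idea, a step size divided by a recursively estimated input power, can be shown on a plain LMS-style update. The full MEE update replaces the error term with a kernel-based entropy gradient, which is omitted here, so this is a simplified sketch rather than the paper's algorithm.

```python
# Adaptive filter with a recursively power-normalized step size.
import numpy as np

def normalized_adaptive_filter(x, d, n_taps=8, mu=0.5, beta=0.99, eps=1e-6):
    w = np.zeros(n_taps)
    power = eps                                   # recursive power estimate
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]         # current input vector
        power = beta * power + (1.0 - beta) * (u @ u)
        e = d[n] - w @ u                          # instantaneous error
        w += (mu / (power + eps)) * e * u         # normalized update
    return w

# Identify a short FIR channel from noisy observations
rng = np.random.default_rng(9)
x = rng.normal(size=5000)
h = np.array([1.0, 0.5, -0.3, 0.1, 0.0, 0.0, 0.0, 0.0])   # unknown channel
d = np.convolve(x, h)[: len(x)] + 0.01 * rng.normal(size=len(x))
print(np.round(normalized_adaptive_filter(x, d), 2))       # ~ h
```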

  11. Improved Discovery of Molecular Interactions in Genome-Scale Data with Adaptive Model-Based Normalization

    Science.gov (United States)

    Brown, Patrick O.

    2013-01-01

    Background High throughput molecular-interaction studies using immunoprecipitations (IP) or affinity purifications are powerful and widely used in biology research. One of many important applications of this method is to identify the set of RNAs that interact with a particular RNA-binding protein (RBP). Here, the unique statistical challenge presented is to delineate a specific set of RNAs that are enriched in one sample relative to another, typically a specific IP compared to a non-specific control to model background. The choice of normalization procedure critically impacts the number of RNAs that will be identified as interacting with an RBP at a given significance threshold – yet existing normalization methods make assumptions that are often fundamentally inaccurate when applied to IP enrichment data. Methods In this paper, we present a new normalization methodology that is specifically designed for identifying enriched RNA or DNA sequences in an IP. The normalization (called adaptive or AD normalization) uses a basic model of the IP experiment and is not a variant of mean, quantile, or other methodology previously proposed. The approach is evaluated statistically and tested with simulated and empirical data. Results and Conclusions The adaptive (AD) normalization method results in a greatly increased range in the number of enriched RNAs identified, fewer false positives, and overall better concordance with independent biological evidence, for the RBPs we analyzed, compared to median normalization. The approach is also applicable to the study of pairwise RNA, DNA and protein interactions such as the analysis of transcription factors via chromatin immunoprecipitation (ChIP) or any other experiments where samples from two conditions, one of which contains an enriched subset of the other, are studied. PMID:23349766

  12. A Box-Cox normal model for response times.

    Science.gov (United States)

    Klein Entink, R H; van der Linden, W J; Fox, J-P

    2009-11-01

    The log-transform has been a convenient choice in response-time modelling on test items. However, motivated by a dataset from the Medical College Admission Test where the lognormal model violated the normality assumption, the possibilities of the broader class of Box-Cox transformations for response-time modelling are investigated. After an introduction and an outline of a broader framework for analysing responses and response times simultaneously, the performance of a Box-Cox normal model for describing response times is investigated using simulation studies and a real-data example. A transformation-invariant implementation of the deviance information criterion (DIC) is developed that allows for comparing model fit between models with different transformation parameters. Showing an enhanced description of the shape of the response-time distributions, its application in an educational measurement context is discussed at length.
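
    The core transformation step is available off the shelf in SciPy; the sketch below estimates the Box-Cox parameter by maximum likelihood for a skewed response-time sample. The paper embeds this in a joint Bayesian model of responses and response times, which is not reproduced here.

```python
# Box-Cox transformation of response times with MLE of lambda.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
rt = rng.lognormal(mean=0.5, sigma=0.4, size=2000)   # skewed response times

transformed, lam = stats.boxcox(rt)                  # lambda ~ 0 recovers the log-model
print(f"estimated lambda: {lam:.2f}")
print(f"skewness raw: {stats.skew(rt):.2f}, transformed: {stats.skew(transformed):.2f}")
```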

  13. Effect of normal stress under an excitation in poroelastic flat slabs

    African Journals Online (AJOL)

    user

    Biot's poroelastic theory is employed to investigate stresses under an excitation in poroelastic flat slabs … Keywords: flat slab, radial normal stress, pervious surface, impervious surface …

  14. Low-dose computed tomography image restoration using previous normal-dose scan

    International Nuclear Information System (INIS)

    Ma, Jianhua; Huang, Jing; Feng, Qianjin; Zhang, Hua; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2011-01-01

    Purpose: In current computed tomography (CT) examinations, the associated x-ray radiation dose is of significant concern to patients and operators. A simple and cost-effective means to perform the examinations is to lower the milliampere-seconds (mAs) or kVp parameter (delivering less x-ray energy to the body) as low as reasonably achievable in data acquisition. However, lowering the mAs parameter will unavoidably increase data noise, and the noise will propagate into the CT image if no adequate noise control is applied during image reconstruction. Since a previously scanned normal-dose diagnostic CT image may be available in some clinical applications, such as CT perfusion imaging and CT angiography (CTA), this paper presents an innovative way to utilize the normal-dose scan as a priori information to induce signal restoration of the current low-dose CT image series. Methods: Unlike conventional local operations on neighboring image voxels, the nonlocal means (NLM) algorithm utilizes the redundancy of information across the whole image. This paper adapts the NLM to utilize the redundancy of information in the previous normal-dose scan and further exploits ways to optimize the nonlocal weights for low-dose image restoration in the NLM framework. The resulting algorithm is called the previous normal-dose scan induced nonlocal means (ndiNLM). Because of the optimized nature of the nonlocal weights calculation, the ndiNLM algorithm does not depend heavily on image registration between the current low-dose and the previous normal-dose CT scans. Furthermore, the smoothing parameter involved in the ndiNLM algorithm can be adaptively estimated based on the image noise relationship between the current low-dose and the previous normal-dose scanning protocols. Results: Qualitative and quantitative evaluations were carried out on a physical phantom as well as clinical abdominal and brain perfusion CT scans in terms of accuracy and resolution properties. The gain by the use
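
    The guiding idea, computing nonlocal-means weights from patch distances in the previous normal-dose image and applying them to average the noisy low-dose data, can be sketched in one dimension. The signal, noise levels and parameters below are invented, and this toy version is far simpler than the published ndiNLM.

```python
# Prior-image-guided nonlocal means on a 1D signal (conceptual sketch).
import numpy as np

def prior_guided_nlm(noisy, prior, half_search=10, half_patch=3, h=0.05):
    n, pad = len(noisy), half_patch
    prior_p = np.pad(prior, pad, mode="reflect")
    out = np.empty(n)
    for i in range(n):
        lo, hi = max(0, i - half_search), min(n, i + half_search + 1)
        ref = prior_p[i:i + 2 * pad + 1]              # patch around pixel i
        w = np.empty(hi - lo)
        for t, j in enumerate(range(lo, hi)):
            d2 = np.mean((ref - prior_p[j:j + 2 * pad + 1]) ** 2)
            w[t] = np.exp(-d2 / h ** 2)               # weights from the prior scan
        out[i] = np.average(noisy[lo:hi], weights=w)  # applied to low-dose data
    return out

rng = np.random.default_rng(11)
clean = np.where(np.arange(200) < 100, 0.0, 1.0)      # step-edge "anatomy"
prior = clean + 0.01 * rng.normal(size=200)           # normal-dose scan
noisy = clean + 0.30 * rng.normal(size=200)           # low-dose scan
restored = prior_guided_nlm(noisy, prior)
print(f"RMSE before: {np.std(noisy - clean):.3f}, after: {np.std(restored - clean):.3f}")
```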

  15. Credential Service Provider (CSP)

    Data.gov (United States)

    Department of Veterans Affairs — Provides a VA operated Level 1 and Level 2 credential for individuals who require access to VA applications, yet cannot obtain a credential from another VA accepted...

  16. kCCA Transformation-Based Radiometric Normalization of Multi-Temporal Satellite Images

    Directory of Open Access Journals (Sweden)

    Yang Bai

    2018-03-01

    Full Text Available Radiometric normalization is an essential pre-processing step for generating high-quality satellite sequence images. However, most radiometric normalization methods are linear, and they cannot eliminate regular nonlinear spectral differences. Here we introduce the well-established kernel canonical correlation analysis (kCCA) into radiometric normalization for the first time to overcome this problem, which leads to a new kernel method. It can maximally reduce the image differences among multi-temporal images regardless of the imaging conditions and the reflectivity difference. It also perfectly eliminates the impact of nonlinear changes caused by seasonal variation of natural objects. Comparisons with the multivariate alteration detection (CCA-based) normalization and histogram matching, on Gaofen-1 (GF-1) data, indicate that the kCCA-based normalization can preserve more similarity and better correlation between an image pair and effectively avoid color-error propagation. The proposed method not only builds a common scale or reference that makes GF-1 image sequences radiometrically consistent, but also highlights interesting spectral changes while eliminating less interesting ones. Our method enables the application of GF-1 data for change detection, land-use and land-cover change detection, etc.

  17. Proportionate-type normalized last mean square algorithms

    CERN Document Server

    Wagner, Kevin

    2013-01-01

    The topic of this book is proportionate-type normalized least mean squares (PtNLMS) adaptive filtering algorithms, which attempt to estimate an unknown impulse response by adaptively giving gains proportionate to an estimate of the impulse response and the current measured error. These algorithms offer low computational complexity and fast convergence times for sparse impulse responses in network and acoustic echo cancellation applications. New PtNLMS algorithms are developed by choosing gains that optimize user-defined criteria, such as mean square error, at all times. PtNLMS algorithms ar

  18. Bicervical normal uterus with normal vagina | Okeke | Annals of ...

    African Journals Online (AJOL)

    To the best of our knowledge, only a few cases of a bicervical normal uterus with a normal vagina exist in the literature; one of the cases had an anterior-posterior disposition. This form of uterine abnormality is not explicable by the existing classical theory of müllerian anomalies and suggests that a complex interplay of events ...

  19. Prior publication productivity, grant percentile ranking, and topic-normalized citation impact of NHLBI cardiovascular R01 grants.

    Science.gov (United States)

    Kaltman, Jonathan R; Evans, Frank J; Danthi, Narasimhan S; Wu, Colin O; DiMichele, Donna M; Lauer, Michael S

    2014-09-12

    We previously demonstrated an absence of association between peer-review-derived percentile ranking and raw citation impact in a large cohort of National Heart, Lung, and Blood Institute cardiovascular R01 grants, but we did not consider pre-grant investigator publication productivity. We also did not normalize citation counts for scientific field, type of article, and year of publication. To determine whether measures of investigator prior productivity predict a grant's subsequent scientific impact as measured by normalized citation metrics, we identified 1492 investigator-initiated de novo National Heart, Lung, and Blood Institute R01 grant applications funded between 2001 and 2008 and linked the publications from these grants to their InCites (Thomson Reuters) citation record. InCites provides a normalized citation count for each publication, stratifying by year of publication, type of publication, and field of science. The co-primary end points for this analysis were the normalized citation impact per million dollars allocated and the number of publications per grant with a normalized citation rate in the top decile per million dollars allocated (top 10% articles). Prior productivity measures included the number of National Heart, Lung, and Blood Institute-supported publications each principal investigator published in the 5 years before grant review and the corresponding prior normalized citation impact score. After accounting for potential confounders, there was no association between peer-review percentile ranking and bibliometric end points (all adjusted P>0.5), whereas prior productivity was predictive. Using normalized citation counts, we thus confirmed a lack of association between peer-review grant percentile ranking and grant citation impact; however, prior investigator publication productivity was predictive of grant-specific citation impact. © 2014 American Heart Association, Inc.

  20. NormaCurve: a SuperCurve-based method that simultaneously quantifies and normalizes reverse phase protein array data.

    Directory of Open Access Journals (Sweden)

    Sylvie Troncale

    Full Text Available MOTIVATION: Reverse phase protein array (RPPA) is a powerful dot-blot technology that allows studying protein expression levels as well as post-translational modifications in a large number of samples simultaneously. Yet, correct interpretation of RPPA data has remained a major challenge for its broad-scale application and its translation into clinical research. Satisfying quantification tools are available to assess a relative protein expression level from a serial dilution curve. However, appropriate tools allowing the normalization of the data for external sources of variation are currently missing. RESULTS: Here we propose a new method, called NormaCurve, that allows simultaneous quantification and normalization of RPPA data. For this, we modified the quantification method SuperCurve in order to include normalization for (i) background fluorescence, (ii) variation in the total amount of spotted protein and (iii) spatial bias on the arrays. Using a spike-in design with a purified protein, we test the capacity of different models to properly estimate normalized relative expression levels. The best performing model, NormaCurve, takes into account a negative control array without primary antibody, an array stained with a total protein stain and spatial covariates. We show that this normalization is reproducible and we discuss the number of serial dilutions and the number of replicates that are required to obtain robust data. We thus provide a ready-to-use method for reliable and reproducible normalization of RPPA data, which should facilitate the interpretation and the development of this promising technology. AVAILABILITY: The raw data, the scripts and the normacurve package are available at the following web site: http://microarrays.curie.fr.

  1. Normal values for quantitative muscle ultrasonography in adults.

    NARCIS (Netherlands)

    Arts, I.M.P.; Pillen, S.; Schelhaas, H.J.; Overeem, S.; Zwarts, M.J.

    2010-01-01

    Ultrasonography can detect structural muscle changes caused by neuromuscular disease. Quantitative analysis is the preferred method to determine if ultrasound findings are within normal limits, but normative data are incomplete. The purpose of this study was to provide normative muscle

  2. Smooth quantile normalization.

    Science.gov (United States)

    Hicks, Stephanie C; Okrah, Kwame; Paulson, Joseph N; Quackenbush, John; Irizarry, Rafael A; Bravo, Héctor Corrada

    2018-04-01

    Between-sample normalization is a critical step in genomic data analysis to remove systematic bias and unwanted technical variation in high-throughput data. Global normalization methods are based on the assumption that observed variability in global properties is due to technical reasons and are unrelated to the biology of interest. For example, some methods correct for differences in sequencing read counts by scaling features to have similar median values across samples, but these fail to reduce other forms of unwanted technical variation. Methods such as quantile normalization transform the statistical distributions across samples to be the same and assume global differences in the distribution are induced by only technical variation. However, it remains unclear how to proceed with normalization if these assumptions are violated, for example, if there are global differences in the statistical distributions between biological conditions or groups, and external information, such as negative or control features, is not available. Here, we introduce a generalization of quantile normalization, referred to as smooth quantile normalization (qsmooth), which is based on the assumption that the statistical distribution of each sample should be the same (or have the same distributional shape) within biological groups or conditions, but allowing that they may differ between groups. We illustrate the advantages of our method on several high-throughput datasets with global differences in distributions corresponding to different biological conditions. We also perform a Monte Carlo simulation study to illustrate the bias-variance tradeoff and root mean squared error of qsmooth compared to other global normalization methods. A software implementation is available from https://github.com/stephaniehicks/qsmooth.
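
    For illustration, a minimal Python sketch of the core idea behind qsmooth: each sample's distribution is matched to the mean quantile distribution of its own biological group rather than to one global target. The published method additionally smooths between group-level and global quantiles, which is omitted here; the function name and toy data below are illustrative, not the package's API.

      import numpy as np

      def groupwise_quantile_normalize(X, groups):
          # X: (features x samples) matrix; groups: per-sample group labels.
          # Each column is forced onto the mean sorted profile of its group.
          Xn = np.empty_like(X, dtype=float)
          for g in np.unique(groups):
              cols = np.where(groups == g)[0]
              # group target: average of the sorted columns
              target = np.sort(X[:, cols], axis=0).mean(axis=1)
              for c in cols:
                  ranks = np.argsort(np.argsort(X[:, c]))  # rank of each feature
                  Xn[:, c] = target[ranks]
          return Xn

      # toy usage: two biological groups with different global distributions
      X = np.random.lognormal(size=(100, 6))
      groups = np.array(["A", "A", "A", "B", "B", "B"])
      Xn = groupwise_quantile_normalize(X, groups)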

  3. Staying connected: Service-specific orientation can be successfully achieved using a mobile application for onboarding care providers.

    Science.gov (United States)

    Chreiman, Kristen M; Prakash, Priya S; Martin, Niels D; Kim, Patrick K; Mehta, Samir; McGinnis, Kelly; Gallagher, John J; Reilly, Patrick M

    2017-01-01

    Communicating service-specific practice patterns, guidelines, and provider information to a new team of learners that rotate frequently can be challenging. Leveraging individual and healthcare electronic resources, a mobile device platform was implemented into a newly revised resident onboarding process. We hypothesized that offering an easy-to-use mobile application would improve communication across multiple disciplines as well as improve provider experiences when transitioning to a new rotation. A mobile platform was created and deployed to assist with enhancing communication within a trauma service and its resident onboarding process. The platform provided resource materials such as divisional policies, clinical management guidelines (CMGs), and onboarding manuals; it also allowed posting of divisional events and included a divisional directory linked to direct dialing, text, and email messaging, as well as on-call schedules. A mixed-methods study, including an anonymous survey, aimed at providing information on team members' impressions and usage of the mobile application was performed. Usage statistics over a 3-month period were analyzed for those providers who completed the survey. After rotating on the trauma service, trainees were asked to complete an anonymous online survey addressing both the experience with, and the utility of, the mobile app. Thirty of the 37 (81%) residents and medical students completed the survey. Twenty-five (83%) trainees stated that this was their first experience rotating on the trauma service and 6 (20%) were from outside of the health system. According to those surveyed, the most useful functions of the app were access to the directory (15, 50%), the divisional calendar (4, 13.3%), and the on-call schedules (3, 10%). Overall, the app was felt to be easy to use (27, 90%) and was accessed an average of 7 times per day (1-50, SD 9.67). Over half the survey respondents felt that the mobile app was helpful in completing their

  4. Fast Bitwise Implementation of the Algebraic Normal Form Transform

    OpenAIRE

    Bakoev, Valentin

    2017-01-01

    The representation of Boolean functions by their algebraic normal forms (ANFs) is very important for cryptography, coding theory and other scientific areas. The ANFs are used in computing the algebraic degree of S-boxes, some other cryptographic criteria and parameters of error-correcting codes. Their applications require these criteria and parameters to be computed by fast algorithms. Hence the corresponding ANFs should also be obtained by fast algorithms. Here we continue o...
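
    The transform itself is the classic XOR butterfly (binary Moebius transform); a short Python sketch is given below. The paper's contribution is a bitwise-packed version that processes whole machine words at once, which this illustrative list-based form does not attempt.

      def anf_transform(truth_table):
          # Truth table of a Boolean function on n variables, length 2**n.
          # Returns ANF coefficients: out[m] == 1 iff monomial m occurs.
          f = list(truth_table)
          step = 1
          while step < len(f):
              for block in range(0, len(f), 2 * step):
                  for j in range(block, block + step):
                      f[j + step] ^= f[j]  # fold lower half onto upper half
              step *= 2
          return f

      # f(x1, x2) = x1 AND x2 over inputs 00, 01, 10, 11
      print(anf_transform([0, 0, 0, 1]))  # [0, 0, 0, 1]: ANF = x1*x2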

  5. Wavefield extrapolation in caustic-free normal ray coordinates

    KAUST Repository

    Ma, Xuxin

    2012-11-04

    Normal ray coordinates are conventionally constructed from ray tracing, which inherently requires smooth velocity profiles. To use rays as coordinates, the velocities have to be smoothed further to avoid caustics, which is detrimental to the mapping process. Solving the eikonal equation numerically for a line source at the surface provides a platform to map normal rays in complex unsmoothed velocity models and avoid caustics. We implement reverse-time migration (RTM) and downward continuation in the new ray coordinate system, which allows us to obtain efficient images and avoid some of the dip limitations of downward continuation.

  6. Normal foot and ankle

    International Nuclear Information System (INIS)

    Weissman, S.D.

    1989-01-01

    The foot may be thought of as a bag of bones tied tightly together and functioning as a unit. The bones are expected to maintain their alignment without causing symptomatology to the patient. The author discusses a normal radiograph. The bones must have normal shape and normal alignment. The density of the soft tissues should be normal and there should be no fractures, tumors, or foreign bodies

  7. Cortical Thinning in Network-Associated Regions in Cognitively Normal and Below-Normal Range Schizophrenia

    Directory of Open Access Journals (Sweden)

    R. Walter Heinrichs

    2017-01-01

    Full Text Available This study assessed whether cortical thickness across the brain and regionally in terms of the default mode, salience, and central executive networks differentiates schizophrenia patients and healthy controls with normal range or below-normal range cognitive performance. Cognitive normality was defined using the MATRICS Consensus Cognitive Battery (MCCB) composite score (T=50 ± 10) and structural magnetic resonance imaging was used to generate cortical thickness data. Whole brain analysis revealed that cognitively normal range controls (n=39) had greater cortical thickness than both cognitively normal (n=17) and below-normal range (n=49) patients. Cognitively normal controls also demonstrated greater thickness than patients in regions associated with the default mode and salience, but not central executive networks. No differences on any thickness measure were found between cognitively normal range and below-normal range controls (n=24) or between cognitively normal and below-normal range patients. In addition, structural covariance between network regions was high and similar across subgroups. Positive and negative symptom severity did not correlate with thickness values. Cortical thinning across the brain and regionally in relation to the default and salience networks may index shared aspects of the psychotic psychopathology that defines schizophrenia with no relation to cognitive impairment.

  8. Information Engineering and Applications

    CERN Document Server

    Ma, Yan; International Conference on Information Engineering and Applications (IEA) 2011

    2012-01-01

    The International Conference on Information Engineering and Applications (IEA) 2011 will be held on October 21-24, 2011, in Chongqing, China. It is organized by Chongqing Normal University, Chongqing University, Shanghai Jiao Tong University, Nanyang Technological University, the University of Michigan, Chongqing University of Arts and Sciences, and sponsored by the National Natural Science Foundation of China. The objective of IEA 2011 is to facilitate an exchange of information on best practices for the latest research advances in the area of information engineering and intelligence applications, which mainly includes computer science and engineering, informatics, communications and control, electrical engineering, information computing, business intelligence and management. IEA 2011 will provide a forum for engineers and scientists in academia, industry, and government to address the most innovative research and development including technical challenges, social and economic issues, and to present and disc...

  9. Iso-effect tables and therapeutic ratios for epidermoid cancer and normal tissue stroma

    International Nuclear Information System (INIS)

    Cohen, L.; Creditor, M.

    1983-01-01

    Available literature on radiation injury to normal tissue stroma and ablation of epidermoid carcinoma was surveyed. Computer programs (RAD3 and RAD1) were then used to derive cell kinetic parameters and generate iso-effect tables for the relevant tissues. The two tables provide a set of limiting doses for tolerance of normal connective tissue (16% risk of injury) and for ablation of epidermoid cancer (16% risk of recurrence) covering a wide range of treatment schedules. Calculating the ratios of normal tissue tolerance to tumor control doses for each treatment scheme provides an array of therapeutic ratios, from which appropriate treatment schemes can be selected

  10. Normally-Closed Zero-Leak Valve with Magnetostrictive Actuator

    Science.gov (United States)

    Ramspacher, Daniel J. (Inventor); Richard, James A. (Inventor)

    2017-01-01

    A non-pyrotechnic, normally-closed, zero-leak valve is a replacement for the pyrovalve used for both in-space and launch vehicle applications. The valve utilizes a magnetostrictive alloy for actuation, rather than pyrotechnic charges. The alloy, such as Terfenol-D, experiences magnetostriction, i.e. a gross elongation, when exposed to a magnetic field. This elongation fractures a parent metal seal, allowing fluid flow through the valve. The required magnetic field is generated by redundant coils that are isolated from the working fluid.

  11. Dynamic analysis to establish normal shock and vibration of radioactive material shipping packages

    International Nuclear Information System (INIS)

    Fields, S.R.

    1980-01-01

    A computer model, CARDS (Cask-Railcar Dynamic Simulator) was developed to provide input data for a broad range of radioactive material package-tiedown structural assessments. CARDS simulates the dynamic behavior of shipping packages and their transporters during normal transport conditions. The model will be used to identify parameters which significantly affect the normal shock and vibration environments which, in turn, provide the basis for determining the forces transmitted to the packages

  12. An experimental study on the normal stress of magnetorheological fluids

    International Nuclear Information System (INIS)

    Jiang, Jile; Tian, Yu; Ren, Dongxue; Meng, Yonggang

    2011-01-01

    The dependence of the normal stress on the shear rate and magnetic field strength in the shear flow of magnetorheological (MR) fluids has been studied experimentally. An obvious normal stress could be observed when the applied magnetic field was higher than a critical value. The normal stress increases considerably with increase of the shear rate and magnetic field, and decreases suddenly and significantly upon the onset of shear thickening in MR fluids. The ratio of shear stress to normal stress, an analogue of the friction coefficient, increases with increase of the shear rate, but decreases with increase of the applied magnetic field. Along with the shear stress, the normal stress in MR fluids could provide a more comprehensive understanding of the MR effect, and the evolution of the particle structure in shear flow, and may have important implications for preparing high performance magnetostrictive elastomers with high force output along the magnetic field direction

  13. Normalizing Landsat and ASTER Data Using MODIS Data Products for Forest Change Detection

    Science.gov (United States)

    Gao, Feng; Masek, Jeffrey G.; Wolfe, Robert E.; Tan, Bin

    2010-01-01

    Monitoring forest cover and its changes is a major application for optical remote sensing. In this paper, we present an approach to integrate Landsat, ASTER and MODIS data for forest change detection. Moderate resolution (10-100 m) images (e.g., Landsat and ASTER) acquired from different seasons and times are normalized to one "standard" date using MODIS data products as the reference. The normalized data are then used to compute a forest disturbance index for forest change detection. Compared to the results from the original data, the forest disturbance index from the normalized images is more consistent spatially and temporally. This work demonstrates an effective approach for mapping forest change over a large area from multiple moderate resolution sensors on various acquisition dates.
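
    As a rough illustration of reference-based radiometric normalization (a simplification of the approach above; the published method also involves aggregation to MODIS scale and quality screening, and the function below is hypothetical), a per-band linear gain and offset can be fitted against the reference composite:

      import numpy as np

      def normalize_to_reference(band, reference, valid=None):
          # Fit y = gain*x + offset mapping the input band to the reference
          # image (e.g., a MODIS-derived composite for the "standard" date),
          # then apply the fitted line to the whole band.
          x = band[valid] if valid is not None else band.ravel()
          y = reference[valid] if valid is not None else reference.ravel()
          gain, offset = np.polyfit(x, y, 1)
          return gain * band + offset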

  14. On the Ergodic Capacity of Dual-Branch Correlated Log-Normal Fading Channels with Applications

    KAUST Repository

    Al-Quwaiee, Hessa; Alouini, Mohamed-Slim

    2015-01-01

    Closed-form expressions for the ergodic capacity of independent or correlated diversity branches over Log-Normal fading channels are not available in the literature. Thus, it is of interest to investigate the behavior of such a metric at high

  15. Crowdsourcing: an overview and applications to ophthalmology.

    Science.gov (United States)

    Wang, Xueyang; Mudie, Lucy; Brady, Christopher J

    2016-05-01

    Crowdsourcing involves the use of the collective intelligence of online communities to produce solutions and outcomes for defined objectives. The use of crowdsourcing is growing in many scientific areas. Crowdsourcing in ophthalmology has been used in basic science and clinical research; however, it also shows promise as a method with wide-ranging applications. This review presents current findings on the use of crowdsourcing in ophthalmology and potential applications in the future. Crowdsourcing has been used to distinguish normal retinal images from images with diabetic retinopathy; the collective intelligence of the crowd was able to correctly classify 81% of 230 images (19 unique) for US$1.10/eye in 20 min. Crowdsourcing has also been used to distinguish normal optic discs from abnormal ones with reasonable sensitivity (83-88%), but low specificity (35-43%). Another study used crowdsourcing for quick and reliable manual segmentation of optical coherence tomography images. Outside of ophthalmology, crowdsourcing has been used for text and image interpretation, language translation, and data analysis. Crowdsourcing has the potential for rapid and economical data processing. Among other applications, it could be used in research settings to provide the 'ground-truth' data, and in the clinical settings to relieve the burden of image processing on experts.

  16. The impact of signal normalization on seizure detection using line length features.

    Science.gov (United States)

    Logesparan, Lojini; Rodriguez-Villegas, Esther; Casson, Alexander J

    2015-10-01

    Accurate automated seizure detection remains a desirable but elusive target for many neural monitoring systems. While much attention has been given to the different feature extractions that can be used to highlight seizure activity in the EEG, very little formal attention has been given to the normalization that these features are routinely paired with. This normalization is essential in patient-independent algorithms to correct for broad-level differences in the EEG amplitude between people, and in patient-dependent algorithms to correct for amplitude variations over time. It is crucial, however, that the normalization used does not have a detrimental effect on the seizure detection process. This paper presents the first formal investigation into the impact of signal normalization techniques on seizure discrimination performance when using the line length feature to emphasize seizure activity. Comparing five normalization methods, based upon the mean, median, standard deviation, signal peak and signal range, we demonstrate differences in seizure detection accuracy (assessed as the area under a sensitivity-specificity ROC curve) of up to 52 %. This is despite the same analysis feature being used in all cases. Further, changes in performance of up to 22 % are present depending on whether the normalization is applied to the raw EEG itself or directly to the line length feature. Our results highlight the median decaying memory as the best current approach for providing normalization when using line length features, and they quantify the under-appreciated challenge of providing signal normalization that does not impair seizure detection algorithm performance.
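
    For concreteness, the line length feature itself is simply the summed absolute difference between consecutive samples in a window; a Python sketch follows, together with an assumed form of the median-with-decaying-memory normalization (the exact update rule in the cited study may differ).

      import numpy as np

      def line_length(x, window):
          # Sum of absolute sample-to-sample differences per sliding window;
          # large values flag high-amplitude, high-frequency (seizure-like)
          # activity in the EEG.
          return np.convolve(np.abs(np.diff(x)), np.ones(window), mode="valid")

      def normalize_decaying_median(feature, step=0.001):
          # Assumed sketch: track a slowly adapting running-median estimate
          # and divide the feature by it, so amplitude differences between
          # patients (or over time) are factored out.
          est = np.median(feature[:100])  # initialize from an early segment
          out = np.empty_like(feature, dtype=float)
          for i, v in enumerate(feature):
              est += step * np.sign(v - est)  # nudge toward the running median
              out[i] = v / est if est > 0 else v
          return out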

  17. 78 FR 20614 - Application(s) for Duty-Free Entry of Scientific Instruments

    Science.gov (United States)

    2013-04-05

    ... compositions of electronic materials, advanced ceramics for medical applications, advanced Ni-based Superalloys... will be used to help understand how the human body functions normally, such as in learning, memory or... normal functional changes in cells of living organisms such as nerve cells or neurons of the brain, as...

  18. The Distribution of the Product Explains Normal Theory Mediation Confidence Interval Estimation.

    Science.gov (United States)

    Kisbu-Sakarya, Yasemin; MacKinnon, David P; Miočević, Milica

    2014-05-01

    The distribution of the product has several useful applications. One of these applications is its use to form confidence intervals for the indirect effect as the product of 2 regression coefficients. The purpose of this article is to investigate how the moments of the distribution of the product explain normal theory mediation confidence interval coverage and imbalance. Values of the critical ratio for each random variable are used to demonstrate how the moments of the distribution of the product change across values of the critical ratio observed in research studies. Results of the simulation study showed that as skewness in absolute value increases, coverage decreases, and as skewness in absolute value and kurtosis increase, imbalance increases. The difference between testing the significance of the indirect effect using the normal theory versus the asymmetric distribution of the product is further illustrated with a real data example. This article is the first study to show the direct link between the distribution of the product and indirect effect confidence intervals and clarifies the results of previous simulation studies by showing why normal theory confidence intervals for indirect effects are often less accurate than those obtained from the asymmetric distribution of the product or from resampling methods.
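
    A brief numerical illustration of the point (all values invented): because the product of two approximately normal estimates is skewed, a Monte Carlo version of the distribution-of-the-product interval is asymmetric where the normal-theory interval is forced to be symmetric.

      import numpy as np

      rng = np.random.default_rng(0)
      a, se_a = 0.4, 0.15   # path X -> M: estimate and standard error
      b, se_b = 0.3, 0.10   # path M -> Y given X: estimate and standard error

      # Monte Carlo distribution-of-the-product interval (asymmetric)
      draws = rng.normal(a, se_a, 100_000) * rng.normal(b, se_b, 100_000)
      lo, hi = np.percentile(draws, [2.5, 97.5])

      # Normal-theory interval using the first-order (Sobel) standard error
      se_ab = np.sqrt(a**2 * se_b**2 + b**2 * se_a**2)
      sym = (a * b - 1.96 * se_ab, a * b + 1.96 * se_ab)
      print((lo, hi), sym)  # the Monte Carlo interval is visibly asymmetric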

  19. Optimization of superconductor--normal-metal--superconductor Josephson junctions for high critical-current density

    International Nuclear Information System (INIS)

    Golub, A.; Horovitz, B.

    1994-01-01

    The application of superconducting Bi2Sr2CaCu2O8 and YBa2Cu3O7 wires or tapes to electronic devices requires the optimization of the transport properties in Ohmic contacts between the superconductor and the normal metal in the circuit. This paper presents results of tunneling theory in superconductor--normal-metal--superconductor (SNS) junctions, in both pure and dirty limits. We derive expressions for the critical-current density as a function of the normal-metal resistivity in the dirty limit or of the ratio of Fermi velocities and effective masses in the clean limit. In the latter case the critical current increases when the ratio γ of the Fermi velocity in the superconductor to that of the weak link becomes much less than 1 and it also has a local maximum if γ is close to 1. This local maximum is more pronounced if the ratio of effective masses is large. For temperatures well below the critical temperature of the superconductors the model with abrupt pair potential on the SN interfaces is considered and its applicability near the critical temperature is examined

  20. Fast Edge Detection and Segmentation of Terrestrial Laser Scans Through Normal Variation Analysis

    Science.gov (United States)

    Che, E.; Olsen, M. J.

    2017-09-01

    Terrestrial Laser Scanning (TLS) utilizes light detection and ranging (lidar) to effectively and efficiently acquire point cloud data for a wide variety of applications. Segmentation is a common procedure of post-processing to group the point cloud into a number of clusters to simplify the data for the sequential modelling and analysis needed for most applications. This paper presents a novel method to rapidly segment TLS data based on edge detection and region growing. First, by computing the projected incidence angles and performing the normal variation analysis, the silhouette edges and intersection edges are separated from the smooth surfaces. Then a modified region growing algorithm groups the points lying on the same smooth surface. The proposed method efficiently exploits the gridded scan pattern utilized during acquisition of TLS data from most sensors and takes advantage of parallel programming to process approximately 1 million points per second. Moreover, the proposed segmentation does not require estimation of the normal at each point, which limits the errors in normal estimation propagating to segmentation. Both an indoor and outdoor scene are used for an experiment to demonstrate and discuss the effectiveness and robustness of the proposed segmentation method.

  1. ProNormz--an integrated approach for human proteins and protein kinases normalization.

    Science.gov (United States)

    Subramani, Suresh; Raja, Kalpana; Natarajan, Jeyakumar

    2014-02-01

    The task of recognizing and normalizing protein name mentions in biomedical literature is challenging and important for text mining applications such as protein-protein interactions, pathway reconstruction and many more. In this paper, we present ProNormz, an integrated approach for human proteins (HPs) tagging and normalization. In Homo sapiens, a greater number of biological processes are regulated by a large human gene family called protein kinases by post-translational phosphorylation. Recognition and normalization of human protein kinases (HPKs) is considered to be important for the extraction of the underlying information on their regulatory mechanism from biomedical literature. ProNormz distinguishes HPKs from other HPs besides tagging and normalization. To our knowledge, ProNormz is the first normalization system available to distinguish HPKs from other HPs in addition to the gene normalization task. ProNormz incorporates a specialized synonyms dictionary for human proteins and protein kinases, a set of 15 string matching rules and a disambiguation module to achieve the normalization. Experimental results on benchmark BioCreative II training and test datasets show that our integrated approach achieves a fairly good performance and outperforms more sophisticated semantic similarity and disambiguation systems presented in the BioCreative II GN task. As a freely available web tool, ProNormz is useful to developers as an extensible gene normalization implementation, to researchers as a standard for comparing their innovative techniques, and to biologists for normalization and categorization of HPs and HPKs mentions in biomedical literature. URL: http://www.biominingbu.org/pronormz. Copyright © 2013 Elsevier Inc. All rights reserved.

  2. [Ultrasound and color Doppler applications in nephrology. The normal kidney: anatomy, vessels and congenital anomalies].

    Science.gov (United States)

    Meola, Mario; Petrucci, Ilaria; Giovannini, Lisa; Samoni, Sara; Dellafiore, Carolina

    2012-01-01

    Gray-scale ultrasound is the diagnostic technique of choice in patients with suspected or known renal disease. Knowledge of the normal and abnormal sonographic morphology of the kidney and urinary tract is essential for a successful diagnosis. Conventional sonography must always be complemented by Doppler sampling of the principal arterial and venous vessels. B-mode scanning is performed with the patient in supine, prone or side position. The kidney can be imaged by the anterior, lateral or posterior approach using coronal, transverse and oblique scanning planes. Morphological parameters that must be evaluated are the coronal diameter, the parenchymal thickness and echogenicity, the structure and state of the urinary tract, and the presence of congenital anomalies that may mimic a pseudomass. The main renal artery and the hilar-intraparenchymal branches of the arterial and venous vessels should be accurately evaluated using color Doppler. Measurement of intraparenchymal resistance indices (IP, IR) provides an indirect and quantitative parameter of the stiffness and eutrophic or dystrophic remodeling of the intrarenal microvasculature. These parameters differ depending on age, diabetic and hypertensive disease, chronic renal glomerular disease, and interstitial, vascular and obstructive nephropathy.

  3. Normalized inverse characterization of sound absorbing rigid porous media.

    Science.gov (United States)

    Zieliński, Tomasz G

    2015-06-01

    This paper presents a methodology for the inverse characterization of sound absorbing rigid porous media, based on standard measurements of the surface acoustic impedance of a porous sample. The model parameters need to be normalized to have a robust identification procedure which fits the model-predicted impedance curves with the measured ones. Such a normalization provides a substitute set of dimensionless (normalized) parameters unambiguously related to the original model parameters. Moreover, two scaling frequencies are introduced, however, they are not additional parameters and for different, yet reasonable, assumptions of their values, the identification procedure should eventually lead to the same solution. The proposed identification technique uses measured and computed impedance curves for a porous sample not only in the standard configuration, that is, set to the rigid termination piston in an impedance tube, but also with air gaps of known thicknesses between the sample and the piston. Therefore, all necessary analytical formulas for sound propagation in double-layered media are provided. The methodology is illustrated by one numerical test and by two examples based on the experimental measurements of the acoustic impedance and absorption of porous ceramic samples of different thicknesses and a sample of polyurethane foam.

  4. Neutron dosimetry and spectrometry with Bonner spheres. Working out a log-normal reference matrix

    International Nuclear Information System (INIS)

    Zaborowski, Henrick.

    1981-11-01

    From the experimental and theoretical studies made upon the Bonner sphere system with a 6LiI(Eu) crystal and with a miniaturized 3He counter we get the normalized energy response functions R*sub(i)(E). This normalization is obtained by the mathematization of the resolution function R*(i,E) under the log-normal distribution hypothesis for monoenergetic neutrons, presented in April 1976 at the International Symposium on Californium-252. The fit of the log-normal hypothesis to the experimental and theoretical data is very satisfactory. The parameters' tabulated values allow a precise interpolation, at all energies between 0.4 eV and 15 MeV and for all sphere diameters between 2 and 12 inches, of the discretized R*sub(ij) reference matrix for applications to neutron dosimetry and spectrometry [fr

  5. hemaClass.org: Online One-By-One Microarray Normalization and Classification of Hematological Cancers for Precision Medicine.

    Science.gov (United States)

    Falgreen, Steffen; Ellern Bilgrau, Anders; Brøndum, Rasmus Froberg; Hjort Jakobsen, Lasse; Have, Jonas; Lindblad Nielsen, Kasper; El-Galaly, Tarec Christoffer; Bødker, Julie Støve; Schmitz, Alexander; H Young, Ken; Johnsen, Hans Erik; Dybkær, Karen; Bøgsted, Martin

    2016-01-01

    Dozens of omics based cancer classification systems have been introduced with prognostic, diagnostic, and predictive capabilities. However, they often employ complex algorithms and are only applicable to whole cohorts of patients, making them difficult to apply in a personalized clinical setting. This prompted us to create hemaClass.org, an online web application providing an easy interface to one-by-one RMA normalization of microarrays and subsequent risk classification of diffuse large B-cell lymphoma (DLBCL) into cell-of-origin and chemotherapeutic sensitivity classes. Classification results for one-by-one array pre-processing with and without a laboratory specific RMA reference dataset were compared to cohort based classifiers in 4 publicly available datasets. Classifications showed high agreement between one-by-one and whole cohort pre-processed data when a laboratory specific reference set was supplied. The website is essentially the R-package hemaClass accompanied by a Shiny web application. The well-documented package can be used to run the website locally or to use the developed methods programmatically. The website and R-package are relevant for biological and clinical lymphoma researchers using Affymetrix U-133 Plus 2 arrays, as they provide reliable and swift methods for calculation of disease subclasses. The proposed one-by-one pre-processing method is relevant for all researchers using microarrays.
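
    The one-by-one idea can be sketched in a few lines (hemaClass itself is an R package performing full single-array RMA; the Python below, with hypothetical names, only illustrates the frozen-reference step): a new array is mapped onto quantiles stored from a reference set, so no cohort is needed at classification time.

      import numpy as np

      def one_by_one_normalize(sample, reference_quantiles):
          # Replace each value in a single new array with the stored
          # reference quantile of the same rank, so the array can be
          # normalized without re-processing a whole cohort.
          ranks = np.argsort(np.argsort(sample))
          probs = (ranks + 0.5) / len(sample)
          ref_probs = (np.arange(len(reference_quantiles)) + 0.5) / len(reference_quantiles)
          return np.interp(probs, ref_probs, reference_quantiles)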

  6. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations.

  7. A log-sinh transformation for data normalization and variance stabilization

    Science.gov (United States)

    Wang, Q. J.; Shrestha, D. L.; Robertson, D. E.; Pokhrel, P.

    2012-05-01

    When quantifying model prediction uncertainty, it is statistically convenient to represent model errors that are normally distributed with a constant variance. The Box-Cox transformation is the most widely used technique to normalize data and stabilize variance, but it is not without limitations. In this paper, a log-sinh transformation is derived based on a pattern of errors commonly seen in hydrological model predictions. It is suited to applications where prediction variables are positively skewed and the spread of errors is seen to first increase rapidly, then slowly, and eventually approach a constant as the prediction variable becomes greater. The log-sinh transformation is applied in two case studies, and the results are compared with one- and two-parameter Box-Cox transformations.
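
    The transformation is commonly written as z = (1/b) ln(sinh(a + b y)); a small Python sketch shows the forward and inverse forms and why it has the stated behavior (parameterization as commonly reported, stated here as an assumption).

      import numpy as np

      def log_sinh(y, a, b):
          # z = (1/b) * ln(sinh(a + b*y)). For small (a + b*y), sinh is
          # nearly linear so z acts like a log transform (strong variance
          # stabilization); for large values sinh grows like exp/2, so z
          # approaches a linear function of y and the correction fades.
          return np.log(np.sinh(a + b * np.asarray(y))) / b

      def log_sinh_inverse(z, a, b):
          # y = (arcsinh(exp(b*z)) - a) / b
          return (np.arcsinh(np.exp(b * np.asarray(z))) - a) / b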

  8. Normal Brain-Skull Development with Hybrid Deformable VR Models Simulation.

    Science.gov (United States)

    Jin, Jing; De Ribaupierre, Sandrine; Eagleson, Roy

    2016-01-01

    This paper describes a simulation framework for a clinical application involving skull-brain co-development in infants, leading to a platform for craniosynostosis modeling. Craniosynostosis occurs when one or more sutures are fused early in life, resulting in an abnormal skull shape. Surgery is required to reopen the suture and reduce intracranial pressure, but is difficult without any predictive model to assist surgical planning. We aim to study normal brain-skull growth by computer simulation, which requires a head model and appropriate mathematical methods for brain and skull growth, respectively. On the basis of our previous model, we further specified the suture model into fibrous and cartilaginous sutures and developed an algorithm for skull extension. We evaluate the resulting simulation by comparison with datasets of cases and normal growth.

  9. Robust Control of Aeronautical Electrical Generators for Energy Management Applications

    OpenAIRE

    Giacomo Canciello; Alberto Cavallo; Beniamino Guida

    2017-01-01

    A new strategy for the control of aeronautical electrical generators via sliding manifold selection is proposed, with an associated innovative intelligent energy management strategy used for efficient power transfer between two sources providing energy to aeronautical loads, having different functionalities and priorities. Electric generators used for aeronautical application involve several machines, including a main generator and an exciter. Standard regulators (PI or PID-like) are normally...

  10. Multiple spacecraft observations of interplanetary shocks Four spacecraft determination of shock normals

    Science.gov (United States)

    Russell, C. T.; Mellott, M. M.; Smith, E. J.; King, J. H.

    1983-01-01

    ISEE 1, 2, 3, IMP 8, and Prognoz 7 observations of interplanetary shocks in 1978 and 1979 provide five instances where a single shock is observed by four spacecraft. These observations are used to determine best-fit normals for these five shocks. In addition to providing well-documented shocks for future investigations, these data allow the evaluation of the accuracy of several shock normal determination techniques. When the angle between the upstream and downstream magnetic field is greater than 20 deg, magnetic coplanarity can be an accurate single spacecraft method. However, no technique based solely on the magnetic measurements at one or multiple sites was universally accurate. Thus, the use of overdetermined shock normal solutions, utilizing plasma measurements, separation vectors, and time delays together with magnetic constraints, is recommended whenever possible.
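
    The magnetic coplanarity normal mentioned above has a standard closed form, sketched here in Python: the normal is parallel to (B_up x B_down) x (B_up - B_down). The construction degenerates when the field rotation across the shock is small, consistent with the 20 deg caveat in the abstract.

      import numpy as np

      def coplanarity_normal(b_up, b_down):
          # Single-spacecraft magnetic coplanarity shock normal from the
          # mean upstream and downstream magnetic field vectors.
          b_up, b_down = np.asarray(b_up, float), np.asarray(b_down, float)
          n = np.cross(np.cross(b_up, b_down), b_up - b_down)
          return n / np.linalg.norm(n)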

  11. Normalizing biomedical terms by minimizing ambiguity and variability

    Directory of Open Access Journals (Sweden)

    McNaught John

    2008-04-01

    Full Text Available Abstract Background One of the difficulties in mapping biomedical named entities, e.g. genes, proteins, chemicals and diseases, to their concept identifiers stems from the potential variability of the terms. Soft string matching is a possible solution to the problem, but its inherent heavy computational cost discourages its use when the dictionaries are large or when real time processing is required. A less computationally demanding approach is to normalize the terms by using heuristic rules, which enables us to look up a dictionary in a constant time regardless of its size. The development of good heuristic rules, however, requires extensive knowledge of the terminology in question and thus is the bottleneck of the normalization approach. Results We present a novel framework for discovering a list of normalization rules from a dictionary in a fully automated manner. The rules are discovered in such a way that they minimize the ambiguity and variability of the terms in the dictionary. We evaluated our algorithm using two large dictionaries: a human gene/protein name dictionary built from BioThesaurus and a disease name dictionary built from UMLS. Conclusions The experimental results showed that automatically discovered rules can perform comparably to carefully crafted heuristic rules in term mapping tasks, and the computational overhead of rule application is small enough that a very fast implementation is possible. This work will help improve the performance of term-concept mapping tasks in biomedical information extraction especially when good normalization heuristics for the target terminology are not fully known.

  12. Provider Monitoring and Pay-for-Performance When Multiple Providers Affect Outcomes: An Application to Renal Dialysis

    Science.gov (United States)

    Hirth, Richard A; Turenne, Marc N; Wheeler, John RC; Pan, Qing; Ma, Yu; Messana, Joseph M

    2009-01-01

    Objective To characterize the influence of dialysis facilities and nephrologists on resource use and patient outcomes in the dialysis population and to illustrate how such information can be used to inform payment system design. Data Sources Medicare claims for all hemodialysis patients for whom Medicare was the primary payer in 2004, combined with the Medicare Enrollment Database and the CMS Medical Evidence Form (CMS Form 2728), which is completed at onset of renal replacement therapy. Study Design Resource use (mainly drugs and laboratory tests) per dialysis session and two clinical outcomes (achieving targets for anemia management and dose of dialysis) were modeled at the patient level with random effects for nephrologist and dialysis facility, controlling for patient characteristics. Results For each measure, both the physician and the facility had significant effects. However, facilities were more influential than physicians, as measured by the standard deviation of the random effects. Conclusions The success of tools such as P4P and provider profiling relies upon the identification of providers most able to enhance efficiency and quality. This paper demonstrates a method for determining the extent to which variation in health care costs and quality of care can be attributed to physicians and institutional providers. Because variation in quality and cost attributable to facilities is consistently larger than that attributable to physicians, if provider profiling or financial incentives are targeted to only one type of provider, the facility appears to be the appropriate locus. PMID:19555398

  13. Spatially tuned normalization explains attention modulation variance within neurons.

    Science.gov (United States)

    Ni, Amy M; Maunsell, John H R

    2017-09-01

    area can be largely explained by between-neuron differences in normalization strength. Here we demonstrate that attention modulation size varies within neurons as well and that this variance is largely explained by within-neuron differences in normalization strength. We provide a new spatially tuned normalization model that explains this broad range of observed normalization and attention effects. Copyright © 2017 the American Physiological Society.
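
    For orientation, models of this family build on the canonical divisive normalization equation, written here in generic notation (not the authors' exact spatially tuned variant):

      R_i = \frac{\gamma \, E_i}{\sigma + \sum_j w_{ij} E_j}

    where E_i is the excitatory stimulus drive to neuron i, the denominator pools a suppressive drive over neighboring units with weights w_{ij} (spatially tuned in the proposed model), and the constant sigma sets how strongly a unit is normalized.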

  14. Limited value of interlaced ECG-gated radiography in the presence of a normal chest radiograph

    International Nuclear Information System (INIS)

    Chen, J.T.T.; Ravin, C.E.; Handel, D.

    1984-01-01

    Twenty-seven patients with normal posteroanterior and lateral chest radiographs, who were undergoing cardiac catheterization because of symptoms strongly suggesting coronary artery disease, also had posteroanterior and lateral interlaced electrocardiogram-gated radiographs made. In 14 patients, the interlaced radiography system underestimated (suggested hypokinesia) the wall motion, which was normal on cardiac catheterization. In two cases the system overestimated the wall motion, in two others it both under- and overestimated the motion, and in only nine cases was the correlation correct. These data suggest that the technique is of limited application, particularly in cases in which the routine chest radiographs are normal

  15. Exercises in anatomy: the normal heart.

    Science.gov (United States)

    Anderson, Robert H; Sarwark, Anne; Spicer, Diane E; Backer, Carl L

    2014-01-01

    In the first of our exercises in anatomy, created for the Multimedia Manual of the European Association of Cardiothoracic Surgery, we emphasized that thorough knowledge of intracardiac anatomy was an essential part of the training for all budding cardiac surgeons, explaining how we had used the archive of congenitally malformed hearts maintained at Lurie Children's Hospital in Chicago to prepare a series of videoclips, demonstrating the salient features of tetralogy of Fallot. In this series of videoclips, we extend our analysis of the normal heart, since for our initial exercise we had concentrated exclusively on the structure of the right ventricular outflow tract. We begin our overview of normal anatomy by emphasizing the need, in the current era, to describe the heart in attitudinally appropriate fashion. Increasingly, clinicians are demonstrating the features of the heart as it is located within the body. It is no longer satisfactory, therefore, to describe these components in a 'Valentine' fashion, as continues to be the case in most textbooks of normal or cardiac anatomy. We then emphasize the importance of the so-called morphological method, which states that structures within the heart should be defined on the basis of their own intrinsic morphology, and not according to other parts, which are themselves variable. We continue by using this concept to show how it is the appendages that serve to distinguish between the atrial chambers, while the apical trabecular components provide the features to distinguish the ventricles. We then return to the cardiac chambers, emphasizing features of surgical significance, in particular the locations of the cardiac conduction tissues. We proceed by examining the cardiac valves, and conclude by providing a detailed analysis of the septal structures. © The Author 2014. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  16. Quantitative computed tomography determined regional lung mechanics in normal nonsmokers, normal smokers and metastatic sarcoma subjects.

    Directory of Open Access Journals (Sweden)

    Jiwoong Choi

    Full Text Available Extra-thoracic tumors send out pilot cells that attach to the pulmonary endothelium. We hypothesized that this could alter regional lung mechanics (tissue stiffening or accumulation of fluid and inflammatory cells) through interactions with host cells. We explored this with serial inspiratory computed tomography (CT) and image matching to assess regional changes in lung expansion. We retrospectively assessed 44 pairs of two serial CT scans on 21 sarcoma patients: 12 without lung metastases and 9 with lung metastases. For each subject, two or more serial inspiratory clinically-derived CT scans were retrospectively collected. Two research-derived control groups were included: 7 normal nonsmokers and 12 asymptomatic smokers with two inspiratory scans taken the same day or one year apart, respectively. We performed image registration for local-to-local matching of scans to baseline, and derived local expansion and density changes at an acinar scale. The Welch two sample t test was used for comparison between groups. Statistical significance was determined with a p value < 0.05. Lung regions of metastatic sarcoma patients (but not the normal control group) demonstrated an increased proportion of normalized lung expansion between the first and second CT. These hyper-expanded regions were associated with, but not limited to, visible metastatic lung lesions. Compared with the normal control group, the percent of increased normalized hyper-expanded lung in sarcoma subjects was significantly increased (p < 0.05). There was also evidence of increased lung "tissue" volume (non-air components) in the hyper-expanded regions of the cancer subjects relative to non-hyper-expanded regions. "Tissue" volume increase was present in the hyper-expanded regions of metastatic and non-metastatic sarcoma subjects. This putatively could represent regional inflammation related to the presence of tumor pilot cell-host related interactions. This new quantitative CT (QCT) method for linking

  17. Telomere length in normal and neoplastic canine tissues.

    Science.gov (United States)

    Cadile, Casey D; Kitchell, Barbara E; Newman, Rebecca G; Biller, Barbara J; Hetler, Elizabeth R

    2007-12-01

    To determine the mean telomere restriction fragment (TRF) length in normal and neoplastic canine tissues. 57 solid-tissue tumor specimens collected from client-owned dogs, 40 samples of normal tissue collected from 12 clinically normal dogs, and blood samples collected from 4 healthy blood donor dogs. Tumor specimens were collected from client-owned dogs during diagnostic or therapeutic procedures at the University of Illinois Veterinary Medical Teaching Hospital, whereas 40 normal tissue samples were collected from 12 control dogs. Telomere restriction fragment length was determined by use of an assay kit. A histologic diagnosis was provided for each tumor by personnel at the Veterinary Diagnostic Laboratory at the University of Illinois. Mean of the mean TRF length for 44 normal samples was 19.0 kilobases (kb; range, 15.4 to 21.4 kb), and the mean of the mean TRF length for 57 malignant tumors was 19.0 kb (range, 12.9 to 23.5 kb). Although the mean of the mean TRF length for tumors and normal tissues was identical, tumor samples had more variability in TRF length. Telomerase, which represents the main mechanism by which cancer cells achieve immortality, is an attractive therapeutic target. The ability to measure telomere length is crucial to monitoring the efficacy of telomerase inhibition. In contrast to many other mammalian species, the length of canine telomeres and the rate of telomeric DNA loss are similar to those reported in humans, making dogs a compelling choice for use in the study of human anti-telomerase strategies.

  18. Diffusion-weighted imaging in normal fetal brain maturation

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, J.F. [University Children's Hospital UKBB, Department of Pediatric Radiology, Basel (Switzerland); Confort-Gouny, S.; Le Fur, Y.; Viout, P.; Cozzone, P. [UMR-CNRS 6612, Faculte de Medecine, Universite de la Mediterranee, Centre de Resonance Magnetique Biologique et Medicale, Marseille (France); Bennathan, M.; Chapon, F.; Fogliarini, C.; Girard, N. [Universite de la Mediterranee, Department of Neuroradiology AP-HM Timone, Marseille (France)

    2007-09-15

    Diffusion-weighted imaging (DWI) provides information about tissue maturation not seen on conventional magnetic resonance imaging. The aim of this study is to analyze the evolution over time of the apparent diffusion coefficient (ADC) of the normal fetal brain in utero. DWI was performed on 78 fetuses, ranging from 23 to 37 gestational weeks (GW). All children showed at follow-up a normal neurological evaluation. ADC values were obtained in the deep white matter (DWM) of the centrum semiovale, the frontal, parietal, occipital and temporal lobes, in the cerebellar hemisphere, the brainstem, the basal ganglia (BG) and the thalamus. Mean ADC values in supratentorial DWM areas (1.68 ± 0.05 mm²/s) were higher compared with the cerebellar hemisphere (1.25 ± 0.06 mm²/s) and lowest in the pons (1.11 ± 0.05 mm²/s). Thalamus and BG showed intermediate values (1.25 ± 0.04 mm²/s). Brainstem, cerebellar hemisphere and thalamus showed a linear negative correlation with gestational age. Supratentorial areas revealed an increase in ADC values, followed by a decrease after the 30th GW. This study provides a normative data set that allows insights into normal fetal brain maturation in utero, which has not yet been observed in previous studies on premature babies. (orig.)

  19. Design of a lightweight, cost effective thimble-like sensor for haptic applications based on contact force sensors.

    Science.gov (United States)

    Ferre, Manuel; Galiana, Ignacio; Aracil, Rafael

    2011-01-01

    This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation.

  20. Design of a Lightweight, Cost Effective Thimble-Like Sensor for Haptic Applications Based on Contact Force Sensors

    Directory of Open Access Journals (Sweden)

    Ignacio Galiana

    2011-12-01

    Full Text Available This paper describes the design and calibration of a thimble that measures the forces applied by a user during manipulation of virtual and real objects. Haptic devices benefit from force measurement capabilities at their end-point. However, the heavy weight and cost of force sensors prevent their widespread incorporation in these applications. The design of a lightweight, user-adaptable, and cost-effective thimble with four contact force sensors is described in this paper. The sensors are calibrated before being placed in the thimble to provide normal and tangential forces. Normal forces are exerted directly by the fingertip and thus can be properly measured. Tangential forces are estimated by sensors strategically placed in the thimble sides. Two applications are provided in order to facilitate an evaluation of sensorized thimble performance. These applications focus on: (i) force signal edge detection, which determines task segmentation of virtual object manipulation, and (ii) the development of complex object manipulation models, wherein the mechanical features of a real object are obtained and these features are then reproduced for training by means of virtual object manipulation.

  1. Can simple mobile phone applications provide reliable counts of respiratory rates in sick infants and children? An initial evaluation of three new applications.

    Science.gov (United States)

    Black, James; Gerdtz, Marie; Nicholson, Pat; Crellin, Dianne; Browning, Laura; Simpson, Julie; Bell, Lauren; Santamaria, Nick

    2015-05-01

    applications found. This study provides evidence that applications running on simple phones can be used to count respiratory rates in children. The Once-per-Breath methods are the most reliable, outperforming the 60-second count. For children with raised respiratory rates the 20-breath version of the Once-per-Breath method is faster, so it is a more suitable option where health workers are under time pressure. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Comprehensive Experiment—Clinical Biochemistry: Determination of Blood Glucose and Triglycerides in Normal and Diabetic Rats

    Science.gov (United States)

    Jiao, Li; Xiujuan, Shi; Juan, Wang; Song, Jia; Lei, Xu; Guotong, Xu; Lixia, Lu

    2015-01-01

    For second year medical students, we redesigned an original laboratory experiment and developed a combined research-teaching clinical biochemistry experiment. Using an established diabetic rat model to detect blood glucose and triglycerides, the students participate in the entire experimental process, which is not normally experienced during a standard clinical biochemistry exercise. The students are not only exposed to techniques and equipment but are also inspired to think more about the biochemical mechanisms of diseases. When linked with lecture topics about the metabolism of carbohydrates and lipids, the students obtain a better understanding of the relevance of abnormal metabolism in relation to diseases. Such understanding provides a solid foundation for the medical students' future research and for other clinical applications. PMID:25521692

  3. Comprehensive experiment-clinical biochemistry: determination of blood glucose and triglycerides in normal and diabetic rats.

    Science.gov (United States)

    Jiao, Li; Xiujuan, Shi; Juan, Wang; Song, Jia; Lei, Xu; Guotong, Xu; Lixia, Lu

    2015-01-01

    For second year medical students, we redesigned an original laboratory experiment and developed a combined research-teaching clinical biochemistry experiment. Using an established diabetic rat model to detect blood glucose and triglycerides, the students participate in the entire experimental process, which is not normally experienced during a standard clinical biochemistry exercise. The students are not only exposed to techniques and equipment but are also inspired to think more about the biochemical mechanisms of diseases. When linked with lecture topics about the metabolism of carbohydrates and lipids, the students obtain a better understanding of the relevance of abnormal metabolism in relation to diseases. Such understanding provides a solid foundation for the medical students' future research and for other clinical applications. © 2014 Biochemistry and Molecular Biology Education.

  4. A recursive field-normalized bibliometric performance indicator: an application to the field of library and information science.

    Science.gov (United States)

    Waltman, Ludo; Yan, Erjia; van Eck, Nees Jan

    2011-10-01

    Two commonly used ideas in the development of citation-based research performance indicators are the idea of normalizing citation counts based on a field classification scheme and the idea of recursive citation weighing (like in PageRank-inspired indicators). We combine these two ideas in a single indicator, referred to as the recursive mean normalized citation score indicator, and we study the validity of this indicator. Our empirical analysis shows that the proposed indicator is highly sensitive to the field classification scheme that is used. The indicator also has a strong tendency to reinforce biases caused by the classification scheme. Based on these observations, we advise against the use of indicators in which the idea of normalization based on a field classification scheme and the idea of recursive citation weighing are combined.
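
    As a worked illustration of the non-recursive building block (the recursive variant additionally re-weights each citation by the citing source's own score and iterates to a fixed point, which is not shown here):

      import numpy as np

      def mncs(citations, expected):
          # Mean normalized citation score: each paper's citation count is
          # divided by the expected count for its field, publication year,
          # and document type, then averaged; 1.0 means world-average impact.
          return float(np.mean(np.asarray(citations) / np.asarray(expected)))

      # toy example: three papers with field/year/type baselines
      print(mncs([10, 2, 30], [8.0, 4.0, 15.0]))  # (1.25 + 0.5 + 2.0)/3 = 1.25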

  5. TU-AB-201-07: Image Guided Endorectal HDR Brachytherapy Using a Compliant Balloon Applicator

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, G; Goodman, K [Memorial Sloan Kettering Cancer Center, New York, NY (United States)

    2015-06-15

    Purpose: High dose rate endorectal brachytherapy is an option to deliver a focal, high-dose radiotherapy to rectal tumors for patients undergoing non-operative management. We investigate a new multichannel, MR compatible applicator with a novel balloon-based design to provide improved treatment geometry. We report on the initial clinical experience using this applicator. Methods: Patients were enrolled on an IRB-approved, dose-escalation protocol evaluating the use of the anorectal (AR-1) applicator (Ancer Medical, Hialeah, FL), a multichannel applicator with two concentric balloons. The inner balloon supports 8 source lumens; the compliant outer balloon expands to separate the normal rectal wall and the source lumens, yet deforms around a firm, exophytic rectal mass, leading to dose escalation to tumor while sparing normal rectum. Under general anesthesia, gold fiducial markers were inserted above and below the tumor, and the AR applicator was placed in the rectum. MRI-based treatment plans were prepared to deliver 15 Gy in 3 weekly fractions to the target volume while sparing healthy rectal tissue, bladder, bowel and anal muscles. Prior to each treatment, CBCT/Fluoroscopy were used to place the applicator in the treatment position and confirm the treatment geometry using rigid registration of the CBCT and planning MRI. After registration of the applicator images, positioning was evaluated based on the match of the gold markers. Results: Highly conformal treatment plans were achieved. MR compatibility of the applicator enabled good tumor visualization. In spite of the non-rigid nature of the applicators and the fact that a new applicator was used at each treatment session, treatment geometry was reproducible to within 2.5 mm. Conclusions: This is the first report on using the AR applicator in patients. Highly conformal plans, confidence in MRI target delineation, in combination with reproducible treatment geometry provide encouraging feedback for continuation with

  6. Creating a smart application system to provide a beneficial maintenance service for elderly drivers

    Directory of Open Access Journals (Sweden)

    Jung Sebin

    2017-01-01

    Full Text Available As the overall population ages, elderly drivers have become a larger percentage of the driving population. With this trend, many vehicle systems have been improved for elderly drivers' safety and convenience using advanced technologies. However, elderly drivers have often paid more than other drivers at car-repair shops because of their limited knowledge of modern vehicle systems. Given this fact, it is necessary to develop a tool that diminishes this disadvantage and helps elderly drivers maintain their cars with confidence and at minimal cost. Therefore, this research focuses on proposing a system concept for a user-interface application, connected to a smartphone or tablet, that provides beneficial services anywhere. Diverse research activities – surveys, interviews with small focus groups, observations of the focus groups, and discussions – have been conducted to understand elderly drivers' difficulties and behaviours regarding vehicle maintenance, to investigate what specific problems make them uncomfortable in repair shops, and to demonstrate how new system concepts could be developed for the elderly. We conclude that adequate system concepts would offer elderly drivers convenient vehicle repair and maintenance and a confident driving experience.

  7. Application of radioreceptor assay for chorionic gonadotropin in diagnosis of normal and disturbed pregnancy

    International Nuclear Information System (INIS)

    Koch, R.; Schmidt-Gollwitzer, M.; Nevinny-Stickel, J.

    1977-01-01

    For the diagnosis of normal and disturbed pregnancy, a radioreceptor assay (RRA) for the detection of chorionic gonadotropin (HCG) has been developed. The hormone was labelled with 125I. Compared with biological and immunological methods, the RRA has a higher sensitivity and a shorter evaluation time. (orig./VJ) [de

  8. Hand function with touch screen technology in children with normal hand formation, congenital differences, and neuromuscular disease.

    Science.gov (United States)

    Shin, David H; Bohn, Deborah K; Agel, Julie; Lindstrom, Katy A; Cronquist, Sara M; Van Heest, Ann E

    2015-05-01

    To measure and compare hand function for children with normal hand development, congenital hand differences (CHD), and neuromuscular disease (NMD) using a function test with touch screen technology designed as an iPhone application. We measured touch screen hand function in 201 children including 113 with normal hand formation, 43 with CHD, and 45 with NMD. The touch screen test was developed on the iOS platform using an Apple iPhone 4. We measured 4 tasks: touching dots on a 3 × 4 grid, dragging shapes, use of the touch screen camera, and typing a line of text. The test takes 60 to 120 seconds and includes a pretest to familiarize the subject with the format. Each task is timed independently and the overall time is recorded. Children with normal hand development took less time to complete all 4 subtests with increasing age. When comparing children with normal hand development with those with CHD or NMD, we saw minimal differences in children aged less than 5 years; those aged 5 to 6 years with CHD took significantly longer total time; those aged 7 to 8 years with NMD took significantly longer total time; those aged 9 to 11 years with CHD took significantly longer total time; and those aged 12 years and older with NMD took significantly longer total time. Touch screen technology has become increasingly relevant to hand function in modern society. This study provides standardized age norms and shows that our test discriminates between normal hand development and that of children with CHD or NMD. Diagnostic III. Copyright © 2015 American Society for Surgery of the Hand. Published by Elsevier Inc. All rights reserved.

  9. Deformation associated with continental normal faults

    Science.gov (United States)

    Resor, Phillip G.

    Deformation associated with normal fault earthquakes and geologic structures provides insights into the seismic cycle as it unfolds over time scales from seconds to millions of years. Improved understanding of normal faulting will lead to more accurate seismic hazard assessments and prediction of associated structures. High-precision aftershock locations for the 1995 Kozani-Grevena earthquake (Mw 6.5) in Greece image a segmented master fault and antithetic faults. This three-dimensional fault geometry is typical of normal fault systems mapped from outcrop or interpreted from reflection seismic data, and it illustrates the importance of incorporating three-dimensional fault geometry in mechanical models. Subsurface fault slip associated with the Kozani-Grevena and 1999 Hector Mine (Mw 7.1) earthquakes is modeled using a new method for slip inversion on three-dimensional fault surfaces. Incorporation of three-dimensional fault geometry improves the fit to the geodetic data while honoring aftershock distributions and surface ruptures. GPS surveying of deformed bedding surfaces associated with normal faulting in the western Grand Canyon reveals patterns of deformation similar to those observed by satellite radar interferometry (InSAR) for the Kozani-Grevena earthquake, with a prominent down-warp in the hanging wall and a lesser up-warp in the footwall. However, deformation associated with the Kozani-Grevena earthquake extends ˜20 km from the fault surface trace, while the folds in the western Grand Canyon extend only 500 m into the footwall and 1500 m into the hanging wall. A comparison of mechanical and kinematic models illustrates the advantages of mechanical models in exploring normal faulting processes, including the incorporation of both deformation and causative forces, and the opportunity to incorporate more complex fault geometry and constitutive properties. Elastic models with antithetic or synthetic faults or joints in association with a master

  10. Application of normalized spectra in resolving a challenging Orphenadrine and Paracetamol binary mixture

    Science.gov (United States)

    Yehia, Ali M.; Abd El-Rahman, Mohamed K.

    2015-03-01

    Normalized spectra have great power in resolving the spectral overlap of the challenging Orphenadrine (ORP) and Paracetamol (PAR) binary mixture. Four smart techniques utilizing normalized spectra were used in this work, namely amplitude modulation (AM), simultaneous area ratio subtraction (SARS), simultaneous derivative spectrophotometry (S1DD) and the ratio H-point standard addition method (RHPSAM). In AM, the peak amplitude at 221.6 nm of the division spectra was measured for the determination of both ORP and PAR. In SARS, the concentration of ORP was determined using the area under the curve from 215 nm to 222 nm of the regenerated ORP zero-order absorption spectra. In S1DD, the concentration of ORP was determined using the peak amplitude at 224 nm of the first-derivative ratio spectra. The PAR concentration was determined directly at 288 nm in the division spectra obtained during the manipulation steps of the previous three methods. The last method, RHPSAM, is a dual-wavelength method in which two calibration lines were plotted at 216 nm and 226 nm. The RH point is the intersection of the two calibration lines, and the ORP and PAR concentrations were directly determined from the coordinates of the RH point. The proposed methods were applied successfully to the determination of ORP and PAR in their dosage form.
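    The core manipulation shared by these methods, dividing the mixture spectrum by a (normalized) spectrum of one component and differentiating the ratio, can be sketched numerically. The Gaussian curves below are stand-ins for the real ORP and PAR spectra, so the wavelengths and amplitudes are purely illustrative; this shows the general ratio-derivative idea rather than the published procedure.

```python
import numpy as np

# Hypothetical absorption spectra on a common wavelength grid (a.u.).
wl = np.linspace(200, 320, 601)
orp_unit = np.exp(-((wl - 221.6) / 12.0) ** 2)   # stand-in for an ORP spectrum
par_unit = np.exp(-((wl - 245.0) / 20.0) ** 2)   # stand-in for a PAR spectrum
mixture = 0.7 * orp_unit + 1.3 * par_unit        # unknown binary mixture

# Division spectrum: mixture / normalized PAR spectrum.  PAR's contribution
# becomes a constant, so the first derivative of the ratio depends on ORP
# alone -- the idea behind first-derivative ratio spectrophotometry.
ratio = mixture / par_unit
d_ratio = np.gradient(ratio, wl)
i224 = np.argmin(np.abs(wl - 224.0))
print("first-derivative ratio amplitude near 224 nm:", d_ratio[i224])
```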

  11. Gasdynamics: theory and applications

    International Nuclear Information System (INIS)

    Emanuel, G.

    1986-01-01

    The fundamental principles and applications of gasdynamic theory are presented in an introductory textbook intended for senior and graduate engineering students. The emphasis is on supersonic inviscid adiabatic flows with negligible body forces, and the approach aims to bridge the gap between traditional gasdynamics and CFD. Topics examined include thermodynamics, one-dimensional conservation equations, steady streamtube flows, normal and oblique shock waves, nozzle and diffuser flows, exact solutions for the steady homentropic flow of a perfect gas, and waverider aerodynamics. A glossary of symbols, summaries of the equations for each aspect of the theory, and fully worked problems for each chapter are provided. 82 references

  12. Proton MR spectroscopic features of the human liver: in-vivo application to the normal condition

    International Nuclear Information System (INIS)

    Cho, Soon Gu; Kim, Mi Young; Kim, Young Soo; Choi, Won; Shin, Seok Hwan; Ok, Chul Soo; Suh, Chang Hae

    1999-01-01

    To determine the feasibility of MR spectroscopy in the living human liver, and to evaluate the corresponding proton MR spectroscopic features. In fifteen normal volunteers with neither previous nor present liver disease, the proton MR spectroscopic findings were reviewed. Twelve subjects were male and three were female; they were aged between 28 and 32 (mean, 30) years. MR spectroscopy involved the use of a 1.5T GE Signa Horizon system with body coil (GE Medical Systems, Milwaukee, U.S.A.). We used STEAM (Stimulated Echo-Acquisition Mode) with 3000/30 msec of TR/TE for signal acquisition, and the prone position without respiratory interruption. The means and standard deviations of the ratios of glutamate+glutamine/lipids, phosphomonoesters/lipids, and glycogen+glucose/lipids were calculated from the areas of their peaks. The proton MR spectroscopic findings of normal human livers showed four distinctive peaks, i.e. lipids, glutamate and glutamine complex, phosphomonoesters, and glycogen and glucose complex. The ratios of glutamate+glutamine/lipids, phosphomonoesters/lipids, and glycogen+glucose/lipids were 0.02±0.01, 0.01±0.01, and 0.04±0.03, respectively. MR spectroscopy can be successfully applied in the living normal human liver, and these findings can serve as a normal standard when examining livers with pathologic conditions.

  13. An individual urinary proteome analysis in normal human beings to define the minimal sample number to represent the normal urinary proteome

    Directory of Open Access Journals (Sweden)

    Liu Xuejiao

    2012-11-01

    Full Text Available Abstract Background The urinary proteome has been widely used for biomarker discovery. A urinary proteome database from normal humans can provide a background for discovery proteomics and candidate proteins/peptides for targeted proteomics. Therefore, it is necessary to define the minimum number of individuals required for sampling to represent the normal urinary proteome. Methods In this study, inter-individual and inter-gender variations of the urinary proteome were taken into consideration to achieve a representative database. An individual analysis was performed on overnight urine samples from 20 normal volunteers (10 males and 10 females) by 1DLC/MS/MS. To obtain a representative result for each sample, a replicate 1DLC/MS/MS analysis was performed. The minimal sample number was estimated by statistical analysis. Results For qualitative analysis, less than 5% of new proteins/peptides were identified in a male/female normal group by adding a new sample once the sample number exceeded nine. In addition, in a normal group, the percentage of newly identified proteins/peptides was less than 5% upon adding a new sample when the sample number reached 10. Furthermore, a statistical analysis indicated that urinary proteomes from normal males and females showed different patterns. For quantitative analysis, the variation of protein abundance was defined by spectral counting and western blotting methods, and the minimal sample number for quantitative proteomic analysis was then identified. Conclusions For qualitative analysis, when considering the inter-individual and inter-gender variations, the minimum sample number is 10, with a balanced number of males and females required to obtain a representative normal human urinary proteome. For quantitative analysis, the minimal sample number is much greater than that for qualitative analysis and depends on the experimental methods used for quantification.
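    The qualitative stopping rule described in the abstract (add samples until fewer than 5% of identifications are new) reduces to a simple cumulative-set computation. The sketch below assumes each sample is represented as a set of protein identifiers; the 1000-protein universe and random subsets are hypothetical stand-ins for real 1DLC/MS/MS identification lists.

```python
import random

def new_identification_fractions(sample_protein_sets):
    """For each added sample, the fraction of its proteins not seen in
    any previously accumulated sample (the <5% criterion)."""
    seen, fractions = set(), []
    for proteins in sample_protein_sets:
        new = proteins - seen
        fractions.append(len(new) / len(proteins) if proteins else 0.0)
        seen |= proteins
    return fractions

# Hypothetical cohort: 12 samples, each a random subset of a
# 1000-protein universe, mimicking individual urinary proteomes.
random.seed(0)
cohort = [set(random.sample(range(1000), 400)) for _ in range(12)]
for n, frac in enumerate(new_identification_fractions(cohort), start=1):
    print(f"sample {n:2d}: {frac:.1%} newly identified")
```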

  14. The applicability of constructivist user studies: how can constructivist inquiry inform service providers and systems designers?

    Directory of Open Access Journals (Sweden)

    Alison Pickard

    2004-01-01

    Full Text Available This paper has attempted to clarify the ways in which individual, holistic case studies, produced via the process of constructivist inquiry, can be tested for trustworthiness and applied to other, similar situations. Service providers and systems designers need contextual information concerning their users in order to design and provide systems and services that will function effectively and efficiently within those contexts. Abstract models can only provide abstract insight into human behaviour and this is rarely sufficient detail upon which to base the planning and delivery of a service. The methodological issues which surround the applicability of individual, holistic case studies are discussed, explaining the concept of 'contextual applicability.' The relevance and usefulness of in-depth case study research to systems designers and service providers is highlighted.

  15. Dichotic and dichoptic digit perception in normal adults.

    Science.gov (United States)

    Lawfield, Angela; McFarland, Dennis J; Cacace, Anthony T

    2011-06-01

    Verbally based dichotic-listening experiments and reproduction-mediated response-selection strategies have been used for over four decades to study perceptual/cognitive aspects of auditory information processing and make inferences about hemispheric asymmetries and language lateralization in the brain. Test procedures using dichotic digits have also been used to assess for disorders of auditory processing. However, with this application, limitations exist and paradigms need to be developed to improve specificity of the diagnosis. Use of matched tasks in multiple sensory modalities is a logical approach to address this issue. Herein, we use dichotic listening and dichoptic viewing of visually presented digits for making this comparison. To evaluate methodological issues involved in using matched tasks of dichotic listening and dichoptic viewing in normal adults. A multivariate assessment of the effects of modality (auditory vs. visual), digit-span length (1-3 pairs), response selection (recognition vs. reproduction), and ear/visual hemifield of presentation (left vs. right) on dichotic and dichoptic digit perception. Thirty adults (12 males, 18 females) ranging in age from 18 to 30 yr with normal hearing sensitivity and normal or corrected-to-normal visual acuity. A computerized, custom-designed program was used for all data collection and analysis. A four-way repeated measures analysis of variance (ANOVA) evaluated the effects of modality, digit-span length, response selection, and ear/visual field of presentation. The ANOVA revealed that performances on dichotic listening and dichoptic viewing tasks were dependent on complex interactions between modality, digit-span length, response selection, and ear/visual hemifield of presentation. Correlation analysis suggested a common effect on overall accuracy of performance but isolated only an auditory factor for a laterality index. The variables used in this experiment affected performances in the auditory modality to a

  16. Identification of the boundary between normal breast tissue and invasive ductal carcinoma during breast-conserving surgery using multiphoton microscopy

    Science.gov (United States)

    Deng, Tongxin; Nie, Yuting; Lian, Yuane; Wu, Yan; Fu, Fangmeng; Wang, Chuan; Zhuo, Shuangmu; Chen, Jianxin

    2014-11-01

    Breast-conserving surgery has become an important approach to the surgical treatment of breast cancer worldwide. Multiphoton microscopy (MPM) can noninvasively visualize tissue architectures at the cellular level using intrinsic fluorescent molecules in biological tissues, without the need for fluorescent dyes. In this study, MPM is used to image the microstructures of the terminal duct lobular unit (TDLU), invasive ductal carcinoma and the boundary region between normal and cancerous breast tissues. Our study demonstrates that MPM can not only reveal the morphological changes of the cuboidal epithelium, basement membrane and interlobular stroma but also identify the boundary between normal breast tissue and invasive ductal carcinoma, corresponding well to the Hematoxylin and Eosin (H and E) images. These results suggest that MPM could monitor surgical margins in real time and provide considerable accuracy for the intraoperative resection of cancerous breast tissues. With the development of miniature, real-time MPM imaging technology, MPM should have great application prospects in breast-conserving surgery.

  17. Comparative study of various normal mode analysis techniques based on partial Hessians.

    Science.gov (United States)

    Ghysels, An; Van Speybroeck, Veronique; Pauwels, Ewald; Catak, Saron; Brooks, Bernard R; Van Neck, Dimitri; Waroquier, Michel

    2010-04-15

    Standard normal mode analysis becomes problematic for complex molecular systems, as a result of both the high computational cost and the excessive amount of information when the full Hessian matrix is used. Several partial Hessian methods have been proposed in the literature, yielding approximate normal modes. These methods aim at reducing the computational load and/or calculating only the relevant normal modes of interest in a specific application. Each method has its own (dis)advantages and application field but guidelines for the most suitable choice are lacking. We have investigated several partial Hessian methods, including the Partial Hessian Vibrational Analysis (PHVA), the Mobile Block Hessian (MBH), and the Vibrational Subsystem Analysis (VSA). In this article, we focus on the benefits and drawbacks of these methods, in terms of the reproduction of localized modes, collective modes, and the performance in partially optimized structures. We find that the PHVA is suitable for describing localized modes, that the MBH not only reproduces localized and global modes but also serves as an analysis tool of the spectrum, and that the VSA is mostly useful for the reproduction of the low frequency spectrum. These guidelines are illustrated with the reproduction of the localized amine-stretch, the spectrum of quinine and a bis-cinchona derivative, and the low frequency modes of the LAO binding protein. 2009 Wiley Periodicals, Inc.
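    All of the partial-Hessian schemes compared here share the same final step as standard normal mode analysis: diagonalizing a mass-weighted Hessian. The sketch below shows that common step only, with a random symmetric matrix standing in for a real Hessian; PHVA, MBH and VSA differ in how the reduced Hessian block passed to such a routine is constructed, which is not reproduced here.

```python
import numpy as np

def normal_modes(hessian, masses):
    """Diagonalize the mass-weighted Hessian.  Eigenvalues are
    proportional to squared vibrational frequencies; eigenvectors are
    the (mass-weighted) normal modes."""
    m = np.repeat(masses, 3)                    # one mass per x, y, z coordinate
    mw = hessian / np.sqrt(np.outer(m, m))      # mass-weighting
    return np.linalg.eigh(mw)

# Hypothetical two-atom toy system (6 Cartesian coordinates).
rng = np.random.default_rng(1)
a = rng.normal(size=(6, 6))
hess = a + a.T                                  # any symmetric stand-in Hessian
freq2, modes = normal_modes(hess, masses=np.array([12.0, 16.0]))
print(freq2)
```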

  18. Normalization matters: tracking the best strategy for sperm miRNA quantification.

    Science.gov (United States)

    Corral-Vazquez, Celia; Blanco, Joan; Salas-Huetos, Albert; Vidal, Francesca; Anton, Ester

    2017-01-01

    biological processes. Hsa-miR-146b-5p and hsa-miR-92a-3p were more uniformly expressed than RNU6B, but their results still showed scant proximity to the reference method. The highest resemblance to MCR was achieved by hsa-miR-100-5p and hsa-miR-30a-5p. Normalization against the combination of both miRNAs reached the best proximity rank regarding the detected DE-miRNAs (Area Under the Curve = 0.8). This combination also exhibited the best performance in terms of the target genes predicted (72.3% of True Positives) and their corresponding enriched biological processes (70.4% of True Positives). Not applicable. This study is focused on sperm miRNA qRT-PCR analysis. The use of the selected normalizers in other cell types or tissues would still require confirmation. The search for new fertility biomarkers based on sperm miRNA expression using high-throughput assays is one of the upcoming challenges in the field of reproductive genetics. In this context, validation of the results using singleplex assays would be mandatory. The normalizer strategy suggested in this study would provide a universal option in this area, allowing for normalization of the validated data without causing meaningful variations of the results. Instead, qRT-PCR data normalization by RNU6B should be discarded in sperm-miRNA expression studies. This work was supported by the 2014/SGR00524 project (Agència de Gestió d'Ajuts Universitaris i de Recerca, Generalitat de Catalunya, Spain) and UAB CF-180034 grant (Universitat Autònoma de Barcelona). Celia Corral-Vazquez is a recipient of a Personal Investigador en Formació grant UAB/PIF2015 (Universitat Autònoma de Barcelona). The authors report no conflict of interest. © The Author 2016. Published by Oxford University Press on behalf of the European Society of Human Reproduction and Embryology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  19. Log-normal distribution from a process that is not multiplicative but is additive.

    Science.gov (United States)

    Mouri, Hideaki

    2013-10-01

    The central limit theorem ensures that a sum of random variables tends to a Gaussian distribution as their total number tends to infinity. However, for a class of positive random variables, we find that the sum tends faster to a log-normal distribution. Although the sum tends eventually to a Gaussian distribution, the distribution of the sum is always close to a log-normal distribution rather than to any Gaussian distribution if the summands are numerous enough. This is in contrast to the current consensus that any log-normal distribution is due to a product of random variables, i.e., a multiplicative process, or equivalently to nonlinearity of the system. In fact, the log-normal distribution is also observable for a sum, i.e., an additive process that is typical of linear systems. We show conditions for such a sum, an analytical example, and an application to random scalar fields such as those of turbulence.
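    The claim is easy to probe numerically: sum many positive, broadly distributed summands and compare normal and log-normal fits to the distribution of the sums. The summand distribution and parameters below are a hypothetical choice for illustration, and the Kolmogorov-Smirnov statistics with fitted parameters serve only as a rough goodness-of-fit comparison.

```python
import numpy as np
from scipy import stats

# Sums of n positive random variables (log-normal summands, a
# hypothetical choice; the paper states conditions on the summands).
rng = np.random.default_rng(0)
n, trials = 100, 20000
sums = rng.lognormal(mean=0.0, sigma=1.5, size=(trials, n)).sum(axis=1)

# Rough comparison: KS distance to a fitted normal vs a fitted log-normal.
loc, scale = stats.norm.fit(sums)
print("normal fit KS stat    :", stats.kstest(sums, "norm", args=(loc, scale)).statistic)
shape, loc0, scale0 = stats.lognorm.fit(sums, floc=0)
print("log-normal fit KS stat:", stats.kstest(sums, "lognorm", args=(shape, loc0, scale0)).statistic)
```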

  20. Group normalization for genomic data.

    Science.gov (United States)

    Ghandi, Mahmoud; Beer, Michael A

    2012-01-01

    Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.
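    A one-dimensional caricature of the group-normalization idea is sketched below: for each probe, a reference set of probes with the most similar control responses is selected, and the probe's signal is divided by the mean signal of that group. In the actual method, similarity is assessed over richer probe response profiles; the array sizes, probe-effect model and `k` below are hypothetical.

```python
import numpy as np

def group_normalize(signal, control, k=50):
    """For each probe, normalize its signal by the mean signal of the k
    probes whose control responses are most similar (its reference group)."""
    normalized = np.empty_like(signal, dtype=float)
    for i, c in enumerate(control):
        ref = np.argsort(np.abs(control - c))[:k]   # similarly responding probes
        normalized[i] = signal[i] / signal[ref].mean()
    return normalized

# Hypothetical probe-level data with a multiplicative probe effect.
rng = np.random.default_rng(2)
probe_effect = rng.lognormal(0.0, 0.5, size=1000)
control = probe_effect * rng.lognormal(0.0, 0.1, size=1000)
signal = probe_effect * rng.lognormal(0.2, 0.1, size=1000)  # uniform enrichment
print(group_normalize(signal, control).mean())               # probe effect removed
```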

  1. Application of digital compression techniques to optical surveillance systems

    International Nuclear Information System (INIS)

    Johnson, C.S.

    1991-01-01

    There are many benefits to handling video images electronically; however, the amount of digital data in a normal video image is a major obstacle. The solution is to remove the high-frequency and redundant information in a process referred to as compression. Compression allows the number of digital bits required for a given image to be reduced for more efficient storage or transmission of images. The next question is how much compression can be done without impairing the image quality beyond its usefulness for a given application. This paper discusses image compression that might be applied to provide useful images in unattended nuclear facility surveillance applications

  2. Radiation dosimetry in cases of normal and emergency situations

    International Nuclear Information System (INIS)

    Morsi, T.M.

    2010-01-01

    The use of radioactive materials in various fields of medicine, industry, agriculture and research has been increasing steadily during the last few decades. A lot of radiation sources, radiopharmaceuticals, labeled compounds and other radioactive materials are sold and used throughout the world each year. Historically, accidents have occurred during the production, transport and use of radioactive materials. If an accident does occur, it is necessary to cope with it as soon as possible in order to control radiological human exposures and contamination of the environment and to restore normal conditions. Individuals who deal with radioactive isotopes should be examined, both in nuclear medicine units and in other applications including radiotherapy units and gamma irradiation facilities. The present work assesses the feasibility and efficiency of the counting detectors used for internal and external radiation dosimetry, and preparedness in normal and emergency situations. Furthermore, this study also deals with the use of thermoluminescent dosimeters for radiation dose estimation in applications of gamma irradiation and a cobalt-60 treatment unit; hence, the operator dose can be estimated in case of malfunction or sticking of the radioactive source. Three methods were used to measure the radiation dose: (1) TL dosimeters with a Harshaw (TLD-4000) reader were used for measurement of external exposures; (2) FASTSCAN and (3) ACCUSCAN II whole body counters were used for measurement of internal exposures.

  3. Modeling and simulation of normal and hemiparetic gait

    Science.gov (United States)

    Luengas, Lely A.; Camargo, Esperanza; Sanchez, Giovanni

    2015-09-01

    Gait is the collective term for the two types of bipedal locomotion, walking and running. This paper is focused on walking. The analysis of human gait is of interest to many different disciplines, including biomechanics, human-movement science, rehabilitation and medicine in general. Here we present a new model that is capable of reproducing the properties of walking, both normal and pathological. The aim of this paper is to establish the biomechanical principles that underlie human walking by using the Lagrange method. Constraint forces from a Rayleigh dissipation function, through which the effect on the tissues during gait is considered, are included. Depending on the value of the factor in the Rayleigh dissipation function, both normal and pathological gait can be simulated. We first apply the model to normal gait and then to permanent hemiparetic gait. Anthropometric data of an adult person are used in the simulation; anthropometric data for children could also be used, but existing tables of anthropometric data must then be consulted. Validation of these models includes simulations of passive dynamic gait on level ground. The dynamic walking approach provides a new perspective on gait analysis, focusing on the kinematics and kinetics of gait. There have been studies and simulations of normal human gait, but few have focused on abnormal, especially hemiparetic, gait. Quantitative comparisons of the model predictions with gait measurements show that the model can reproduce the significant characteristics of normal gait.

  4. The COBE normalization for standard cold dark matter

    Science.gov (United States)

    Bunn, Emory F.; Scott, Douglas; White, Martin

    1995-01-01

    The Cosmic Background Explorer Satellite (COBE) detection of microwave anisotropies provides the best way of fixing the amplitude of cosmological fluctuations on the largest scales. This normalization is usually given for an n = 1 spectrum, including only the anisotropy caused by the Sachs-Wolfe effect. This is certainly not a good approximation for a model containing any reasonable amount of baryonic matter. In fact, even tilted Sachs-Wolfe spectra are not a good fit to models like cold dark matter (CDM). Here, we normalize standard CDM (sCDM) to the two-year COBE data and quote the best amplitude in terms of the conventionally used measures of power. We also give normalizations for some specific variants of this standard model, and we indicate how the normalization depends on the assumed values of n, Omega_B and H_0. For sCDM we find the mean value <Q> = 19.9 ± 1.5 μK, corresponding to sigma_8 = 1.34 ± 0.10, with the normalization at large scales being B = (8.16 ± 1.04) × 10^5 (Mpc/h)^4, and other numbers given in the table. The measured rms temperature fluctuation smoothed on 10° is a little low relative to this normalization. This is mainly due to the low quadrupole in the data: when the quadrupole is removed, the measured value of sigma(10°) is quite consistent with the best-fitting <Q>. The use of <Q> should be preferred over sigma(10°) when its value can be determined for a particular theory, since it makes full use of the data.

  5. Category Specificity in Normal Episodic Learning: Applications to Object Recognition and Category-Specific Agnosia

    Science.gov (United States)

    Bukach, Cindy M.; Bub, Daniel N.; Masson, Michael E. J.; Lindsay, D. Stephen

    2004-01-01

    Studies of patients with category-specific agnosia (CSA) have given rise to multiple theories of object recognition, most of which assume the existence of a stable, abstract semantic memory system. We applied an episodic view of memory to questions raised by CSA in a series of studies examining normal observers' recall of newly learned attributes…

  6. Veterinary software application for comparison of thermograms for pathology evaluation

    Science.gov (United States)

    Pant, Gita; Umbaugh, Scott E.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-09-01

    The bilateral symmetry property in mammals allows for the detection of pathology by comparison of opposing sides. For any pathological disorder, thermal patterns differ compared to the normal body part. A software application for veterinary clinics has been under development that takes as input two thermograms of body parts on both sides, one normal and the other unknown, compares them based on extracted features and appropriate similarity and difference measures, and outputs the likelihood of pathology. Here thermographic image data from 19 °C to 40 °C were linearly remapped to create images with 256 gray-level values. Features were extracted from these images, including histogram, texture and spectral features. The comparison metrics used are the vector inner product, Tanimoto, Euclidean, city block, Minkowski and maximum value metrics. Previous research with anterior cruciate ligament (ACL) pathology in dogs suggested that any thermogram variation below a Euclidean-distance threshold of 40% is normal and any variation above 40% is abnormal. Here the 40% threshold was applied to a new ACL image set and achieved a sensitivity of 75%, an improvement on the 55% sensitivity of the previous work. With the new data set it was determined that a threshold of 20% provided a much improved sensitivity of 92%; however, further research is required to determine the corresponding specificity. Additionally, it was found that the anterior view provided better results than the lateral view, and that better results were obtained with all three feature sets than with just the histogram and texture sets. Further experiments are ongoing with larger image datasets, new pathologies, new features and comparison-metric evaluation to determine more accurate threshold values for separating normal and abnormal images.

  7. COBE DMR-normalized open inflation cold dark matter cosmogony

    Science.gov (United States)

    Gorski, Krzysztof M.; Ratra, Bharat; Sugiyama, Naoshi; Banday, Anthony J.

    1995-01-01

    A cut-sky orthogonal mode analysis of the 2 year COBE DMR 53 and 90 GHz sky maps (in Galactic coordinates) is used to determine the normalization of an open inflation model based on the cold dark matter (CDM) scenario. The normalized model is compared to measures of large-scale structure in the universe. Although the DMR data alone do not provide sufficient discriminative power to prefer a particular value of the mass density parameter, the open model appears to be reasonably consistent with observations when Omega_0 is approximately 0.3-0.4 and merits further study.

  8. Application-Defined Decentralized Access Control

    Science.gov (United States)

    Xu, Yuanzhong; Dunn, Alan M.; Hofmann, Owen S.; Lee, Michael Z.; Mehdi, Syed Akbar; Witchel, Emmett

    2014-01-01

    DCAC is a practical OS-level access control system that supports application-defined principals. It allows normal users to perform administrative operations within their privilege, enabling isolation and privilege separation for applications. It does not require centralized policy specification or management, giving applications freedom to manage their principals while the policies are still enforced by the OS. DCAC uses hierarchically-named attributes as a generic framework for user-defined policies such as groups defined by normal users. For both local and networked file systems, its execution time overhead is between 0%–9% on file system microbenchmarks, and under 1% on applications. This paper shows the design and implementation of DCAC, as well as several real-world use cases, including sandboxing applications, enforcing server applications’ security policies, supporting NFS, and authenticating user-defined sub-principals in SSH, all with minimal code changes. PMID:25426493

  9. MISTRAL V1.1.1: assessing doses from atmospheric releases in normal and off-normal conditions

    International Nuclear Information System (INIS)

    David Kerouanton; Patrick Devin; Malvina Rennesson

    2006-01-01

    Protecting the environment and the public from radioactive and chemical hazards has always been a top priority for all companies operating in the nuclear domain. In this scope, SGN provides all the services the nuclear industry needs in environmental studies, especially in relation to impact assessment in normal operating conditions and risk assessment in off-normal conditions. In order to quantify the dose impact on members of the public due to atmospheric releases, COGEMA and SGN developed the MISTRAL V1.1.1 code. Dose impact depends strongly on the dispersion of radionuclides in the atmosphere. The main parameters involved in characterizing dispersion are wind velocity and direction, rain, diffusion conditions, coordinates of the point of observation and stack elevation. The MISTRAL code implements the Doury and Pasquill Gaussian plume models, which are widely used in the scientific community. These models, applicable for transfer distances ranging from 100 m up to 30 km, are used to calculate atmospheric concentration and deposit at different distances from the point of release. MISTRAL allows the use of different dose regulations or dose coefficient databases, such as: ICRP30 and ICRP71 for internal doses (inhalation, ingestion); the Despres/Kocher database or US-EPA Federal Guidance No. 12 (ICRP72 for noble gases) for external exposure (from plume or ground). The initial instant of the release can be considered as the origin of time, or a date format can be specified (useful in a crisis context). Once the context is specified, the user defines the meteorological conditions of the release. In normal operating mode (routine releases), the user gives the annual meteorological scheme; the data can be recorded in the MISTRAL meteorological database. In off-normal conditions mode, MISTRAL V1.1.1 allows the use of successive release stages for which the user gives the duration and the meteorological conditions, that is to say stability class, wind speed and direction, and rainfall
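    For orientation, the textbook Gaussian plume concentration formula that underlies Pasquill-type models is sketched below, with ground reflection included. The dispersion parameters sigma_y and sigma_z grow with downwind distance according to the stability scheme (Pasquill or Doury in MISTRAL), which is not reproduced here; the release rate, stack height and sigma values in the example are hypothetical.

```python
import numpy as np

def gaussian_plume(q, u, y, z, h, sigma_y, sigma_z):
    """Ground-reflected Gaussian plume concentration (e.g. Bq/m^3 for a
    release rate q in Bq/s), at crosswind offset y and height z, for an
    effective stack height h and wind speed u."""
    lateral = np.exp(-y**2 / (2 * sigma_y**2))
    vertical = (np.exp(-(z - h)**2 / (2 * sigma_z**2))
                + np.exp(-(z + h)**2 / (2 * sigma_z**2)))  # image-source term
    return q / (2 * np.pi * u * sigma_y * sigma_z) * lateral * vertical

# Hypothetical example: 1e9 Bq/s release from a 30 m stack in a 5 m/s wind,
# receptor on the plume axis at ground level, with sigma_y/sigma_z taken
# at roughly 1 km downwind for a neutral stability class.
print(gaussian_plume(q=1e9, u=5.0, y=0.0, z=0.0, h=30.0,
                     sigma_y=70.0, sigma_z=35.0))
```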

  10. Multi-source waveform inversion of marine streamer data using the normalized wavefield

    KAUST Repository

    Choi, Yun Seok

    2012-01-01

    Even though the encoded multi-source approach dramatically reduces the computational cost of waveform inversion, it is generally not applicable to marine streamer data. This is because the simultaneous-sources modeled data cannot be muted to comply with the configuration of the marine streamer data, which causes differences in the number of stacked traces, or energy levels, between the modeled and observed data. Since the conventional L2 norm does not account for the difference in energy levels, multi-source inversion based on the conventional L2 norm does not work for marine streamer data. In this study, we propose the L2, approximated L2, and L1 norms using normalized wavefields for the multi-source waveform inversion of marine streamer data. Since the normalized wavefields mitigate the different energy levels between the observed and modeled wavefields, multi-source waveform inversion using the normalized wavefields can be applied to marine streamer data. We obtain the gradients of the objective functions using the back-propagation algorithm. Notably, the gradient of the L2 norm using the normalized wavefields is exactly the same as that of the global correlation norm. In the numerical examples, the new objective functions using the normalized wavefields generate successful results, whereas the conventional L2 norm does not.
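    The equivalence noted above follows from expanding the L2 distance between unit-energy traces: ||u - v||^2 = 2 - 2 u·v, so minimizing the normalized L2 misfit is the same as maximizing the global correlation. The sketch below verifies this numerically on hypothetical single traces (actual use would sum over shots and receivers).

```python
import numpy as np

def normalized_l2_misfit(d_obs, d_mod):
    """0.5 * ||u - v||^2 for traces normalized to unit energy; equals
    1 - (global correlation) because ||u - v||^2 = 2 - 2 u.v."""
    u = d_obs / np.linalg.norm(d_obs)
    v = d_mod / np.linalg.norm(d_mod)
    return 0.5 * np.sum((u - v) ** 2)

def global_correlation(d_obs, d_mod):
    u = d_obs / np.linalg.norm(d_obs)
    v = d_mod / np.linalg.norm(d_mod)
    return float(np.dot(u, v))

# Hypothetical traces with very different energy levels.
rng = np.random.default_rng(3)
obs = rng.normal(size=500)
mod = 10.0 * obs + rng.normal(scale=0.5, size=500)   # scaled plus noise
print(normalized_l2_misfit(obs, mod), 1.0 - global_correlation(obs, mod))
```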

  11. ELV Recycling Service Provider Selection Using the Hybrid MCDM Method: A Case Application in China

    Directory of Open Access Journals (Sweden)

    Fuli Zhou

    2016-05-01

    Full Text Available With the rapid depletion of natural resources and undesired environmental changes globally, more interest has been shown in research on green supply chain practices, including end-of-life vehicle (ELV) recycling. ELV recycling is made mandatory for auto-manufacturers by legislation for the purpose of minimizing potential environmental damage. The purpose of the present research is to determine the best choice of ELV recycling service provider by employing an integrated hybrid multi-criteria decision making (MCDM) method. In this research, economic, environmental and social factors are taken into consideration. Linguistic variables and trapezoidal fuzzy numbers (TFNs) are applied in the evaluation to deal with vague and qualitative information. With the combined weights of the criteria calculated by fuzzy aggregation and Shannon entropy techniques, the fuzzy VIKOR (FVIKOR) compromise-ranking method is applied to find the best solution. An application was performed based on the proposed hybrid MCDM method, and sensitivity analysis was conducted on different decision-making scenarios. The present study provides a decision-making approach for ELV recycling business selection under sustainability and green philosophy, with high robustness and easy implementation.
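    One ingredient of the weighting step, Shannon-entropy criterion weights computed from a crisp (already defuzzified) decision matrix, can be sketched compactly. The provider scores and criteria below are hypothetical, and the fuzzy-aggregation and FVIKOR ranking stages of the full method are not reproduced.

```python
import numpy as np

def entropy_weights(decision_matrix):
    """Shannon-entropy weights for an (alternatives x criteria) matrix of
    positive crisp scores: low-entropy (more discriminating) criteria get
    higher weight."""
    x = np.asarray(decision_matrix, dtype=float)
    p = x / x.sum(axis=0)                            # column-wise proportions
    entropy = -(p * np.log(p)).sum(axis=0) / np.log(len(x))
    diversification = 1.0 - entropy
    return diversification / diversification.sum()

# Hypothetical 4 providers x 3 criteria (cost, emissions, service score).
scores = [[3.2, 1.1, 7.0],
          [2.8, 0.9, 6.5],
          [4.0, 1.6, 8.2],
          [3.5, 1.0, 7.8]]
print(entropy_weights(scores))
```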

  12. Group normalization for genomic data.

    Directory of Open Access Journals (Sweden)

    Mahmoud Ghandi

    Full Text Available Data normalization is a crucial preliminary step in analyzing genomic datasets. The goal of normalization is to remove global variation to make readings across different experiments comparable. In addition, most genomic loci have non-uniform sensitivity to any given assay because of variation in local sequence properties. In microarray experiments, this non-uniform sensitivity is due to different DNA hybridization and cross-hybridization efficiencies, known as the probe effect. In this paper we introduce a new scheme, called Group Normalization (GN), to remove both global and local biases in one integrated step, whereby we determine the normalized probe signal by finding a set of reference probes with similar responses. Compared to conventional normalization methods such as Quantile normalization and physically motivated probe effect models, our proposed method is general in the sense that it does not require the assumption that the underlying signal distribution be identical for the treatment and control, and is flexible enough to correct for nonlinear and higher order probe effects. The Group Normalization algorithm is computationally efficient and easy to implement. We also describe a variant of the Group Normalization algorithm, called Cross Normalization, which efficiently amplifies biologically relevant differences between any two genomic datasets.

  13. Best-Matched Internal Standard Normalization in Liquid Chromatography-Mass Spectrometry Metabolomics Applied to Environmental Samples.

    Science.gov (United States)

    Boysen, Angela K; Heal, Katherine R; Carlson, Laura T; Ingalls, Anitra E

    2018-01-16

    The goal of metabolomics is to measure the entire range of small organic molecules in biological samples. In liquid chromatography-mass spectrometry-based metabolomics, formidable analytical challenges remain in removing the nonbiological factors that affect chromatographic peak areas. These factors include sample matrix-induced ion suppression, chromatographic quality, and analytical drift. The combination of these factors is referred to as obscuring variation. Some metabolomics samples can exhibit intense obscuring variation due to matrix-induced ion suppression, rendering large amounts of data unreliable and difficult to interpret. Existing normalization techniques have limited applicability to these sample types. Here we present a data normalization method to minimize the effects of obscuring variation. We normalize peak areas using a batch-specific normalization process, which matches measured metabolites with isotope-labeled internal standards that behave similarly during the analysis. This method, called best-matched internal standard (B-MIS) normalization, can be applied to targeted or untargeted metabolomics data sets and yields relative concentrations. We evaluate and demonstrate the utility of B-MIS normalization using marine environmental samples and laboratory grown cultures of phytoplankton. In untargeted analyses, B-MIS normalization allowed for inclusion of mass features in downstream analyses that would have been considered unreliable without normalization due to obscuring variation. B-MIS normalization for targeted or untargeted metabolomics is freely available at https://github.com/IngallsLabUW/B-MIS-normalization .
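    One plausible reading of "best-matched" is sketched below: for a given metabolite measured across repeated injections of a pooled sample, choose the internal standard whose normalization yields the lowest relative standard deviation. The standard names and peak areas are hypothetical, and the released B-MIS code at the linked repository, not this sketch, defines the actual procedure.

```python
import numpy as np

def best_matched_standard(metabolite_areas, standard_areas):
    """Pick the internal standard that minimizes the relative standard
    deviation of the normalized peak areas across pooled injections."""
    best_name, best_rsd = None, np.inf
    for name, areas in standard_areas.items():
        normalized = metabolite_areas / areas
        rsd = normalized.std() / normalized.mean()
        if rsd < best_rsd:
            best_name, best_rsd = name, rsd
    return best_name, best_rsd

# Hypothetical peak areas over six pooled injections.
met = np.array([9.1e5, 7.4e5, 1.10e6, 8.8e5, 9.6e5, 7.9e5])
standards = {
    "d3-alanine":  np.array([4.5e5, 3.6e5, 5.6e5, 4.4e5, 4.9e5, 3.9e5]),
    "13C-glycine": np.array([5.1e5, 5.0e5, 5.2e5, 4.9e5, 5.0e5, 5.1e5]),
}
print(best_matched_standard(met, standards))
```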

  14. Is this the right normalization? A diagnostic tool for ChIP-seq normalization.

    Science.gov (United States)

    Angelini, Claudia; Heller, Ruth; Volkinshtein, Rita; Yekutieli, Daniel

    2015-05-09

    ChIP-seq experiments are becoming a standard approach for genome-wide profiling of protein-DNA interactions, such as detecting transcription factor binding sites, histone modification marks and RNA Polymerase II occupancy. However, when comparing a ChIP sample versus a control sample, such as Input DNA, normalization procedures have to be applied in order to remove experimental sources of bias. Despite the substantial impact that the choice of the normalization method can have on the results of a ChIP-seq data analysis, its assessment is not fully explored in the literature. In particular, there are no diagnostic tools that show whether the applied normalization is indeed appropriate for the data being analyzed. In this work we propose a novel diagnostic tool to examine the appropriateness of the estimated normalization procedure. By plotting the empirical densities of log relative risks in bins of equal read count, along with the estimated normalization constant after logarithmic transformation, the researcher is able to assess the appropriateness of the estimated normalization constant. We use the diagnostic plot to evaluate the appropriateness of the estimates obtained by CisGenome, NCIS and CCAT on several real data examples. Moreover, we show the impact that the choice of the normalization constant can have on standard tools for peak calling such as MACS or SICER. Finally, we propose a novel procedure for controlling the FDR using sample swapping. This procedure makes use of the estimated normalization constant in order to gain power over the naive choice of constant (used in MACS and SICER), which is the ratio of the total number of reads in the ChIP and Input samples. Linear normalization approaches aim to estimate a scale factor, r, to adjust for different sequencing depths when comparing ChIP versus Input samples. The estimated scaling factor can easily be incorporated in many peak caller algorithms to improve the accuracy of the peak identification. The
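    The two quantities the diagnostic revolves around, the naive depth-ratio scaling constant and the per-bin log relative risks whose binned densities are plotted, are simple to compute. The sketch below uses hypothetical Poisson bin counts and shows only these ingredients, not the full diagnostic plot or the estimators of CisGenome, NCIS or CCAT.

```python
import numpy as np

def naive_scaling_and_log_rr(chip_counts, input_counts):
    """Naive sequencing-depth ratio r (the MACS/SICER default mentioned
    above) and per-bin log relative risks log(chip_i / input_i)."""
    chip = np.asarray(chip_counts, dtype=float)
    inp = np.asarray(input_counts, dtype=float)
    r = chip.sum() / inp.sum()
    keep = (chip > 0) & (inp > 0)
    return r, np.log(chip[keep] / inp[keep])

# Hypothetical genome-wide bin counts.
rng = np.random.default_rng(4)
inp = rng.poisson(20, size=5000)
chip = rng.poisson(30, size=5000)
r, log_rr = naive_scaling_and_log_rr(chip, inp)
print("r =", r, " median log RR =", np.median(log_rr), " log r =", np.log(r))
```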

  15. Patient-provider connectivity and the role of e-health.

    Science.gov (United States)

    Holmes, Suzanne C; Kearns, Ellen Hope

    2003-01-01

    Patient-provider connectivity (PPC) offers innovative approaches to control costs, improve quality, and sustain a healthy workforce. The application of e-commerce to health care is one facet of PPC and provides solutions to educating, informing, and more efficiently using scarce resources to sustain the nation's health. Technology is available to provide real-time access to clinical results, medical records, health-care providers, and other time-sensitive patient information. This is the first article in a series on PPC that explores the application of e-commerce to the health-care industry from the consumers' and providers' points of view and examines and assesses trends and data from various interdisciplinary sources and studies. Two models exemplifying PPC are explored including the Science Business & Education, Inc., proof-of-concept patient demonstration project, and the emerging application of peer-to-peer (P2P) technology. PPC promises to improve efficiency, facilitate communication between physician and patient, monitor compliance with medical regimens, and positively affect the quality of health care provided and the overall health of the patient. Future articles will address the growth of telemedicine, issues of confidentiality and e-risk, and other PPC applications.

  16. Normalization and experimental design for ChIP-chip data

    Directory of Open Access Journals (Sweden)

    Alekseyenko Artyom A

    2007-06-01

    Full Text Available Abstract Background Chromatin immunoprecipitation on tiling arrays (ChIP-chip) has been widely used to investigate the DNA binding sites for a variety of proteins on a genome-wide scale. However, several issues in the processing and analysis of ChIP-chip data have not been resolved fully, including the effect of background (mock) control subtraction and normalization within and across arrays. Results The binding profiles of Drosophila male-specific lethal (MSL) complex on a tiling array provide a unique opportunity for investigating these topics, as it is known to bind on the X chromosome but not on the autosomes. These large bound and control regions on the same array allow clear evaluation of analytical methods. We introduce a novel normalization scheme specifically designed for ChIP-chip data from dual-channel arrays and demonstrate that this step is critical for correcting systematic dye-bias that may exist in the data. Subtraction of the mock (non-specific antibody or no antibody) control data is generally needed to eliminate the bias, but appropriate normalization obviates the need for mock experiments and increases the correlation among replicates. The idea underlying the normalization can be used subsequently to estimate the background noise level in each array for normalization across arrays. We demonstrate the effectiveness of the methods with the MSL complex binding data and other publicly available data. Conclusion Proper normalization is essential for ChIP-chip experiments. The proposed normalization technique can correct systematic errors and compensate for the lack of mock control data, thus reducing the experimental cost and producing more accurate results.

  17. On the Use of the Log-Normal Particle Size Distribution to Characterize Global Rain

    Science.gov (United States)

    Meneghini, Robert; Rincon, Rafael; Liao, Liang

    2003-01-01

    Although most parameterizations of the drop size distribution (DSD) use the gamma function, there are several advantages to the log-normal form, particularly if we want to characterize the large-scale space-time variability of the DSD and rain rate. The advantages of the distribution are twofold: the logarithm of any moment can be expressed as a linear combination of the individual parameters of the distribution; and the parameters of the distribution are approximately normally distributed. Since all radar and rainfall-related parameters can be written approximately as a moment of the DSD, the first property allows us to express the logarithm of any radar/rainfall variable as a linear combination of the individual DSD parameters. Another consequence is that any power law relationship between rain rate, reflectivity factor, specific attenuation or water content can be expressed in terms of the covariance matrix of the DSD parameters. The joint-normal property of the DSD parameters has applications to the description of the space-time variation of rainfall in the sense that any radar-rainfall quantity can be specified by the covariance matrix associated with the DSD parameters at two arbitrary space-time points. As such, the parameterization provides a means by which we can use the spaceborne radar-derived DSD parameters to specify in part the covariance matrices globally. However, since satellite observations have coarse temporal sampling, the specification of the temporal covariance must be derived from ancillary measurements and models. Work is presently underway to determine whether the use of instantaneous rain rate data from the TRMM Precipitation Radar can provide good estimates of the spatial correlation in rain rate from data collected in 5° × 5° × 1 month space-time boxes. To characterize the temporal characteristics of the DSD parameters, disdrometer data are being used from the Wallops Flight Facility site where as many as 4 disdrometers have been
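    The linearity property quoted above is a standard consequence of the log-normal form; as a brief check, with the DSD written in log-normal form, the logarithm of every moment is linear in the parameters:

```latex
% Log-normal drop size distribution with parameters (N_T, \mu, \sigma):
N(D) = \frac{N_T}{\sqrt{2\pi}\,\sigma D}
       \exp\!\left[-\frac{(\ln D-\mu)^2}{2\sigma^2}\right],
\qquad
M_n \equiv \int_0^\infty D^n\,N(D)\,\mathrm{d}D
    = N_T \exp\!\left(n\mu + \tfrac{1}{2}n^2\sigma^2\right),
% so the logarithm of any moment is linear in (\ln N_T, \mu, \sigma^2):
\ln M_n = \ln N_T + n\mu + \tfrac{1}{2}\,n^2\sigma^2 .
```

    Since rain rate, reflectivity and specific attenuation are all approximately moments M_n for some n, their logarithms inherit joint normality from the (approximately) jointly normal DSD parameters.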

  18. Sandstone-filled normal faults: A case study from central California

    Science.gov (United States)

    Palladino, Giuseppe; Alsop, G. Ian; Grippa, Antonio; Zvirtes, Gustavo; Phillip, Ruy Paulo; Hurst, Andrew

    2018-05-01

    Despite the potential of sandstone-filled normal faults to significantly influence fluid transmissivity within reservoirs and the shallow crust, they have to date been largely overlooked. Fluidized sand, forcefully intruded along normal fault zones, markedly enhances the transmissivity of faults and, in general, the connectivity between otherwise unconnected reservoirs. Here, we provide a detailed outcrop description and interpretation of sandstone-filled normal faults from different stratigraphic units in central California. Such faults commonly show limited fault throw, cm to dm wide apertures, poorly-developed fault zones and full or partial sand infill. Based on these features and inferences regarding their origin, we propose a general classification that defines two main types of sandstone-filled normal faults. Type 1 form as a consequence of the hydraulic failure of the host strata above a poorly-consolidated sandstone following a significant, rapid increase of pore fluid over-pressure. Type 2 sandstone-filled normal faults form as a result of regional tectonic deformation. These structures may play a significant role in the connectivity of siliciclastic reservoirs, and may therefore be crucial not just for investigation of basin evolution but also in hydrocarbon exploration.

  19. Peripartum haemodynamic status of bitches with normal birth or dystocia.

    Science.gov (United States)

    Lúcio, C F; Silva, L C G; Rodrigues, J A; Veiga, G A L; Vannucchi, C I

    2009-07-01

    There has been limited investigation of parturition in the bitch, and little information other than opinion and anecdote has been published on clinical and obstetrical examination. While there are substantial data on haemodynamic and vascular changes during normal parturition in humans, little is known about the physiological events in the dog. This study aimed to characterize the maternal haemodynamic changes occurring during normal parturition and to investigate how these were modified in bitches with dystocia (DYST) treated either medically or via assisted delivery and caesarean operation. Three groups of 10 bitches were investigated: those with normal parturition, those with DYST corrected by manipulative assistance or caesarean operation, and those with uterine inertia treated by oxytocin administration. Heart rate, systolic and diastolic blood pressure, electrocardiogram and blood glucose concentration were measured pre-partum, intra-partum, immediately after parturition and 1 h later. Heart rate was high at all times throughout the study, and the majority of bitches had normal sinus rhythm. Blood pressure was generally within the normal range, and although systolic and diastolic blood pressures were highest during the intra-partum period and sometimes during the immediate post-partum period, there were no significant differences between groups. All bitches had blood glucose concentrations within the normal range throughout the study, although pre-partum concentrations were statistically lower than those in many of the other time periods. The study provides useful physiological data that will facilitate monitoring and clinical management of bitches throughout normal parturition and DYST.

  20. Quantitative thallium-201 myocardial exercise scintigraphy in normal subjects and patients with normal coronary arteries

    International Nuclear Information System (INIS)

    Niemeyer, M.G.; St. Antonius Hospital Nieuwegein; Laarman, G.J.; Lelbach, S.; Cramer, M.J.; Ascoop, C.A.P.L.; Verzijlbergen, J.F.; Wall, E.E. van der; Zwinderman, A.H.; Pauwels, E.K.J.

    1990-01-01

    Quantitative thallium-201 myocardial exercise scintigraphy was tested in two patient populations representing alternative standards of cardiac normality: group I comprised 18 male uncatheterized patients with a low likelihood of coronary artery disease (CAD); group II contained 41 patients with normal coronary arteriograms. Group I patients were younger and achieved a higher rate-pressure product than group II patients; all had normal findings on physical examination and on electrocardiography at rest and exercise. Group II comprised 21 females; 11 patients showed abnormal electrocardiography at rest, and five patients showed ischemic ST depression during exercise. Twelve patients had signs of minimal CAD. Twelve patients revealed abnormal visual and quantitative thallium findings; three of these patients had minimal CAD. Profiles of uptake and washout of thallium-201 were derived from both patient groups and compared with the normal limits developed by Maddahi et al. Furthermore, low-likelihood and angiographically normal patients may differ substantially, and both sets of normal patients should be considered when establishing criteria of abnormality in exercise thallium imaging. When commercial software containing normal limits for quantitative analysis of exercise thallium-201 imaging is used in clinical practice, it is mandatory to compare these with the normal limits of uptake and washout of thallium-201 derived from the less heterogeneous group of low-likelihood subjects, which should be used when selecting a normal population to define normality. (author). 37 refs.; 3 figs; 1 tab

  1. Performance characterization of gallium nitride HEMT cascode switch for power conditioning applications

    International Nuclear Information System (INIS)

    Chou, Po-Chien; Cheng, Stone

    2015-01-01

    Highlights: • We develop TO-257 cascoded GaN switch configuration in power conversion applications. • The normally-off cascode circuit provides 14.6 A/600 V characteristics. • Analysis of resistive and inductive switching performances shown in loaded circuits. • A 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. - Abstract: A hybrid cascoded GaN switch configuration is demonstrated in power conversion applications. A novel metal package is proposed for the packaging of a D-mode GaN MIS-HEMT cascoded with an integrated power MOSFET and a SBD. The normally-off cascode circuit provides a maximum drain current of 14.6 A and a blocking capability of 600 V. Analysis of 200 V/1 A power conversion characteristics are discussed and show the excellent switching performance in load circuits. Switching characteristics of the integral SiC SBD are also demonstrated. Finally, a 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. These results show that high-voltage GaN-HEMTs can be switching devices for an ultralow-loss converter circuit

  2. Performance characterization of gallium nitride HEMT cascode switch for power conditioning applications

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Po-Chien; Cheng, Stone, E-mail: stonecheng@mail.nctu.edu.tw

    2015-08-15

    Highlights: • We develop a TO-257 cascoded GaN switch configuration for power conversion applications. • The normally-off cascode circuit provides 14.6 A/600 V characteristics. • Analysis of resistive and inductive switching performances is shown in loaded circuits. • A 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. - Abstract: A hybrid cascoded GaN switch configuration is demonstrated in power conversion applications. A novel metal package is proposed for the packaging of a D-mode GaN MIS-HEMT cascoded with an integrated power MOSFET and an SBD. The normally-off cascode circuit provides a maximum drain current of 14.6 A and a blocking capability of 600 V. The 200 V/1 A power conversion characteristics are analyzed and show excellent switching performance in loaded circuits. Switching characteristics of the integral SiC SBD are also demonstrated. Finally, a 48-to-96 V boost converter is used to evaluate the benefit of GaN cascode switches. These results show that high-voltage GaN HEMTs can serve as switching devices for an ultralow-loss converter circuit.
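
    Both records above use a 48-to-96 V boost converter as the evaluation vehicle. For orientation only, here is a minimal sketch of the ideal continuous-conduction-mode boost relation (a textbook idealization, not the papers' measured data), which gives the operating duty cycle:

    ```python
    def boost_duty_cycle(v_in: float, v_out: float) -> float:
        """Ideal CCM boost converter: V_out = V_in / (1 - D), so D = 1 - V_in / V_out."""
        return 1.0 - v_in / v_out

    # For the 48-to-96 V converter used to evaluate the GaN cascode switch:
    print(boost_duty_cycle(48.0, 96.0))  # 0.5
    ```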

  3. Normal co-ordinate analysis of 1, 8-dibromooctane

    Science.gov (United States)

    Singh, Devinder; Jaggi, Neena; Singh, Nafa

    2010-02-01

    The organic compound 1,8-dibromooctane (1,8-DBO) exists in the liquid phase at ambient temperatures and has versatile synthetic applications. In its liquid phase 1,8-DBO is expected to exist in four most probable conformations, with all its carbon atoms in the same plane, having symmetries C2h, Ci, C2 and C1. In the present study a detailed vibrational analysis, in terms of assignment of the Fourier transform infrared (FT-IR) and Raman bands of this molecule using normal co-ordinate calculations, has been carried out. A systematic set of symmetry co-ordinates has been constructed for this molecule and normal co-ordinate analysis is carried out using the computer program MOLVIB. The force field transferred from previously studied shorter-chain bromoalkanes is refined so as to fit the observed infrared and Raman frequencies to the calculated ones. The potential energy distribution (PED) has also been calculated for each mode of vibration of the molecule for the assumed conformations.

  4. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    Energy Technology Data Exchange (ETDEWEB)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo [University of Ljubljana, Faculty of Electrical Engineering, Trzaska 25, SI-1000 Ljubljana (Slovenia)

    2008-04-07

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA distinguish scoliotic from normal spinal curvature.

  5. Quantitative analysis of spinal curvature in 3D: application to CT images of normal spine

    International Nuclear Information System (INIS)

    Vrtovec, Tomaz; Likar, Bostjan; Pernus, Franjo

    2008-01-01

    The purpose of this study is to present a framework for quantitative analysis of spinal curvature in 3D. In order to study the properties of such complex 3D structures, we propose two descriptors that capture the characteristics of spinal curvature in 3D. The descriptors are the geometric curvature (GC) and curvature angle (CA), which are independent of the orientation and size of spine anatomy. We demonstrate the two descriptors that characterize the spinal curvature in 3D on 30 computed tomography (CT) images of normal spine and on a scoliotic spine. The descriptors are determined from 3D vertebral body lines, which are obtained by two different methods. The first method is based on the least-squares technique that approximates the manually identified vertebra centroids, while the second method searches for vertebra centroids in an automated optimization scheme, based on computer-assisted image analysis. Polynomial functions of the fourth and fifth degree were used for the description of normal and scoliotic spinal curvature in 3D, respectively. The mean distance to vertebra centroids was 1.1 mm (±0.6 mm) for the first and 2.1 mm (±1.4 mm) for the second method. The distributions of GC and CA values were obtained along the 30 images of normal spine at each vertebral level and show that maximal thoracic kyphosis (TK), thoracolumbar junction (TJ) and maximal lumbar lordosis (LL) on average occur at T3/T4, T12/L1 and L4/L5, respectively. The main advantage of GC and CA is that the measurements are independent of the orientation and size of the spine, thus allowing objective intra- and inter-subject comparisons. The positions of maximal TK, TJ and maximal LL can be easily identified by observing the GC and CA distributions at different vertebral levels. The obtained courses of the GC and CA for the scoliotic spine were compared to the distributions of GC and CA for the normal spines. The significant difference in values indicates that the descriptors of GC and CA distinguish scoliotic from normal spinal curvature.
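
    The geometric curvature (GC) descriptor in the two records above is the classical curvature of the fitted 3D vertebral body line. A minimal sketch, assuming per-coordinate polynomial fits of identified centroids (the function name and parameter choices are illustrative, not the authors' code):

    ```python
    import numpy as np

    def geometric_curvature(t, centroids, deg=4):
        """Curvature kappa = |r' x r''| / |r'|^3 of a 3D curve r(t), fitted per
        coordinate with degree-`deg` polynomials (degree 4 for normal and
        5 for scoliotic spines in the study)."""
        coeffs = [np.polyfit(t, c, deg) for c in centroids.T]
        d1 = np.stack([np.polyval(np.polyder(cf, 1), t) for cf in coeffs], axis=1)
        d2 = np.stack([np.polyval(np.polyder(cf, 2), t) for cf in coeffs], axis=1)
        cross = np.cross(d1, d2)
        return np.linalg.norm(cross, axis=1) / np.linalg.norm(d1, axis=1) ** 3
    ```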

  6. Development of a GUI-based RETRAN running environment and its application

    International Nuclear Information System (INIS)

    Kim, K.D.; Jeong, J.J.; Mo, S.Y.; Lee, Y.G.; Lee, C.B.

    2001-01-01

    In order to assist RETRAN users in their input preparation, code execution, and output interpretation, a visual interactive RETRAN running environment (ViRRE) has been developed. ViRRE provides dialog boxes and graphical modules for base input data generation and transient initiation on a user-friendly basis, and special graphical displays to provide an in-depth understanding of the major thermal-hydraulic phenomena during normal and accident conditions for nuclear power plants. This paper presents the main features of ViRRE and an example of its application. (authors)

  7. Providing prenatal care to pregnant women with overweight or obesity: Differences in provider communication and ratings of the patient-provider relationship by patient body weight.

    Science.gov (United States)

    Washington Cole, Katie O; Gudzune, Kimberly A; Bleich, Sara N; Cheskin, Lawrence J; Bennett, Wendy L; Cooper, Lisa A; Roter, Debra L

    2017-06-01

    To examine the association of women's body weight with provider communication during prenatal care. We coded audio recordings of prenatal visits between 22 providers and 117 of their patients using the Roter Interaction Analysis System. Multivariate, multilevel Poisson models were used to examine the relationship between patient pre-pregnancy body mass index and provider communication. Compared to women with normal weight, providers asked fewer lifestyle questions (IRR 0.66, 95% CI 0.44-0.99, p=0.04) and gave less lifestyle information (IRR 0.51, 95% CI 0.32-0.82, p=0.01) to women with overweight and obesity, respectively. Providers used fewer approval (IRR 0.68, 95% CI 0.51-0.91, p=0.01) and concern statements (IRR 0.68, 95% CI 0.53-0.86, p=0.002) when caring for women with overweight, and fewer self-disclosure statements when caring for women with obesity (IRR 0.40, 95% CI 0.19-0.84, p=0.02). Less lifestyle and rapport-building communication for women with obesity may weaken the patient-provider relationship during routine prenatal care. Interventions to increase use of patient-centered communication - especially for women with overweight and obesity - may improve prenatal care quality. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  8. Radionuclide blood levels during cisternography of patients with normal-pressure hydrocephalus or Alzheimer's disease

    International Nuclear Information System (INIS)

    Mahaley, M.S. Jr.; Wilkinson, R.H. Jr.; Sivalingham, S.; Friedman, H.; Tyson, W.; Goodrich, J.K.

    1974-01-01

    Various diagnostic procedures were compared during investigations of 37 dementia patients undergoing differential study for normal-pressure hydrocephalus or Alzheimer's disease. A diminished radionuclide level in the blood, with abnormal cisternography and pneumoencephalography, provided the most valuable diagnostic evidence of normal-pressure hydrocephalus. (U.S.)

  9. Normalization and Implementation of Three Gravitational Acceleration Models

    Science.gov (United States)

    Eckman, Randy A.; Brown, Aaron J.; Adamo, Daniel R.; Gottlieb, Robert G.

    2016-01-01

    Unlike the uniform-density spherical shell approximations of Newton, gravitational fields encountered in real spaceflight are sensitive to the asphericity of their generating central bodies. The gravitational potential of an aspherical central body is typically resolved using spherical harmonic approximations. However, attempting to directly calculate the spherical harmonic approximations results in at least two singularities that must be removed to generalize the method and solve for any possible orbit, including polar orbits. Samuel Pines, Bill Lear, and Robert Gottlieb developed three unique algorithms to eliminate these singularities. This paper documents the methodical normalization of two of the three known formulations for singularity-free gravitational acceleration (namely, the Lear and Gottlieb algorithms) and formulates a general method for defining normalization parameters used to generate normalized Legendre polynomials and Associated Legendre Functions (ALFs) for any algorithm. A treatment of the conventional formulation of the gravitational potential and acceleration is also provided, in addition to a brief overview of the philosophical differences between the three known singularity-free algorithms.
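
    For readers who want to experiment with ALF normalization, here is a minimal sketch using one common convention (the geodesy-style 4-pi factor; the paper's exact parameter definitions may differ):

    ```python
    import math
    from scipy.special import lpmv  # unnormalized associated Legendre P_l^m

    def norm_factor(l: int, m: int) -> float:
        """Geodesy-style (4-pi) normalization factor for P_l^m."""
        k = 1.0 if m == 0 else 2.0
        return math.sqrt(k * (2 * l + 1) * math.factorial(l - m) / math.factorial(l + m))

    def normalized_alf(l: int, m: int, x: float) -> float:
        # scipy's lpmv includes the Condon-Shortley phase (-1)^m, which geodesy
        # conventions typically omit; the leading factor cancels it.
        return (-1.0) ** m * norm_factor(l, m) * lpmv(m, l, x)
    ```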

  10. Cucker-Smale model with normalized communication weights and time delay

    KAUST Repository

    Choi, Young-Pil; Haskovec, Jan

    2017-01-01

    We study a Cucker-Smale-type system with time delay in which agents interact with each other through normalized communication weights. We construct a Lyapunov functional for the system and provide sufficient conditions for asymptotic flocking.
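
    To make the normalized-weight structure concrete, here is a minimal delay-free sketch (the time-delay analysis is the paper's contribution; the communication kernel and parameters below are illustrative):

    ```python
    import numpy as np

    def cucker_smale_step(x, v, dt=0.01, beta=0.35):
        """One Euler step of a Cucker-Smale-type system with normalized
        communication weights (no time delay in this sketch)."""
        diff = x[None, :, :] - x[:, None, :]          # x_j - x_i
        psi = (1.0 + np.sum(diff ** 2, axis=-1)) ** (-beta)
        np.fill_diagonal(psi, 0.0)                    # no self-interaction
        w = psi / psi.sum(axis=1, keepdims=True)      # weights sum to 1 per agent
        dv = np.einsum('ij,ijk->ik', w, v[None, :, :] - v[:, None, :])
        return x + dt * v, v + dt * dv

    # 50 agents in 2D; the velocity spread should shrink over time (flocking).
    rng = np.random.default_rng(0)
    x, v = rng.normal(size=(50, 2)), rng.normal(size=(50, 2))
    for _ in range(2000):
        x, v = cucker_smale_step(x, v)
    ```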

  11. The claudin gene family: expression in normal and neoplastic tissues

    International Nuclear Information System (INIS)

    Hewitt, Kyle J; Agarwal, Rachana; Morin, Patrice J

    2006-01-01

    The claudin (CLDN) genes encode a family of proteins important in tight junction formation and function. Recently, it has become apparent that CLDN gene expression is frequently altered in several human cancers. However, the exact patterns of CLDN expression in various cancers are unknown, as only a limited number of CLDN genes have been investigated in a few tumors. We identified all the human CLDN genes from Genbank and we used the large public SAGE database to ascertain the gene expression of all 21 CLDN genes in 266 normal and neoplastic tissues. Using real-time RT-PCR, we also surveyed a subset of 13 CLDN genes in 24 normal and 24 neoplastic tissues. We show that claudins represent a family of highly related proteins, with claudin-16 and -23 being the most different from the others. From in silico analysis and RT-PCR data, we find that most claudin genes appear decreased in cancer, while CLDN3, CLDN4, and CLDN7 are elevated in several malignancies such as those originating from the pancreas, bladder, thyroid, fallopian tubes, ovary, stomach, colon, breast, uterus, and the prostate. Interestingly, CLDN5 is highly expressed in vascular endothelial cells, providing a possible target for antiangiogenic therapy. CLDN18 might represent a biomarker for gastric cancer. Our study confirms previously known CLDN gene expression patterns and identifies new ones, which may have applications in the detection, prognosis and therapy of several human cancers. In particular we identify several malignancies that express CLDN3 and CLDN4. These cancers may represent ideal candidates for a novel therapy being developed based on CPE, a toxin that specifically binds claudin-3 and claudin-4.

  12. Technical normalization in the geoinformatics branch

    Directory of Open Access Journals (Sweden)

    Bronislava Horáková

    2006-09-01

    Full Text Available A basic principle of technical normalisation is to support market development by providing unified technical rules for all concerned parties. The information and communication technology (ICT) industry is characterised by specific features that distinguish it from traditional industry. These features bring new demands to the normalisation domain, chiefly the flexibility needed to reflect the rapidly developing ICT market in an elastic way. The goal of the paper is to provide a comprehensive overview of the current process of technical normalisation in the geoinformatics branch.

  13. "Ser diferente é normal?"/"Being different: is it normal?"

    Directory of Open Access Journals (Sweden)

    Viviane Veras

    2007-01-01

    Full Text Available The question in the title of this paper refers to the slogan "Ser diferente é normal" ("Being different is normal"), part of a campaign created for a non-governmental organization that supports people with Down syndrome. The objective of the campaign is to promote the social inclusion of people with disabilities, and the first step was to propose the inclusion of a group of "differents" in the so-called normal group. The film launching the campaign shows the different, identified as normal, by means of examples: a black man with a black-power haircut, a skinhead, a tattooed body, an over-athletic female body, a hippie family and a girl with Down syndrome. The vision of the dancing teenager lessens, to some extent, the imaginary effect that goes beyond the syndrome, since only her body and her almond-shaped eyes stand out, and no cognitive issues are raised. My proposal is to reflect on the paradoxical status of the example as it operates in this video: if, by definition, an example in fact shows its belonging to a class, one can conclude that it is precisely because it is exemplary that it stands outside that class, at the exact moment in which it exhibits and defines it.

  14. Analysis of Within-Test Variability of Non-Destructive Test Methods to Evaluate Compressive Strength of Normal Vibrated and Self-Compacting Concretes

    Science.gov (United States)

    Nepomuceno, Miguel C. S.; Lopes, Sérgio M. R.

    2017-10-01

    Non-destructive tests (NDT) have been used in recent decades for the assessment of the in-situ quality and integrity of concrete elements. An important step in the application of NDT methods concerns the interpretation and validation of the test results. In general, interpretation of NDT results should involve three distinct phases leading to the development of conclusions: processing of collected data, analysis of within-test variability and quantitative evaluation of the property under investigation. The analysis of within-test variability can provide valuable information, since it can be compared with the within-test variability associated with the NDT method in use, either to provide a measure of quality control or to detect the presence of abnormal circumstances during the in-situ application. This paper reports the analysis of the experimental results of within-test variability of NDT obtained for normal vibrated concrete and self-compacting concrete. The NDT reported include the surface hardness test, ultrasonic pulse velocity test, penetration resistance test, pull-off test, pull-out test and maturity test. The obtained results are discussed and conclusions are presented.

  15. Normalizing difference: Emotional intelligence and diversity management competence in healthcare managers

    Directory of Open Access Journals (Sweden)

    Adebukola E. Oyewunmi

    2018-05-01

    Full Text Available Purpose: Diversity is synonymous with difference. The diverse workforce presents an array of complexities which necessitates the deployment of specific managerial competencies. Empirical evidence has indicated the role of emotional intelligence in the enhancement of abilities. Thus, this study investigated the relationship between emotional intelligence and diversity management competency amongst healthcare managers in Southwest Nigeria.  Design: The descriptive survey method was adopted for the study. A total of 360 respondents completed the structured questionnaire titled Emotional Intelligence and Diversity Management Competency Questionnaire (EIDMCQ). Data were analyzed using descriptive and inferential statistics, such as multiple regression analysis and Pearson product-moment correlation.  Findings: A positive correlation was found between emotional intelligence and diversity management competency. Gender, ethnicity, and age did not moderate the relationship between emotional intelligence and diversity management competency.  Practical Implications: As difference is the reality of modern organizations, it is important to conceptualize it as normal and positive. Emotional intelligence is recommended as a critical tool to normalize individual perceptions of difference. The re-assessment of the functions of managers must be followed by total commitment to capacity building in emotional intelligence, as well as the re-engineering of organizational and national cultures to promote equal opportunities, inclusion and diversity leveraging.  Originality/value: This study pioneers research on emotional intelligence and diversity management competency in Nigeria's public healthcare sector. It conceptualizes diversity management on an individual-managerial level. Practical interventions are provided to enhance the application of specific competencies to optimize a diverse workplace.

  16. Condition monitoring with wind turbine SCADA data using Neuro-Fuzzy normal behavior models

    DEFF Research Database (Denmark)

    Schlechtingen, Meik; Santos, Ilmar

    2012-01-01

    This paper presents the latest research results of a project that focuses on normal behavior models for condition monitoring of wind turbines and their components, via ordinary Supervisory Control And Data Acquisition (SCADA) data. In this machine learning approach Adaptive Neuro-Fuzzy Inference System (ANFIS) models are employed to learn the normal behavior in a training phase, where the component condition can be considered healthy. In the application phase the trained models are applied to predict the target signals, e.g. temperatures, pressures, currents, power output, etc. The behavior of the prediction error is used as an indicator for normal and abnormal behavior, with respect to the learned behavior. The advantage of this approach is that the prediction error is widely decoupled from the typical fluctuations of the SCADA data caused by the different turbine operational modes. To classify...
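
    A minimal sketch of the residual-based monitoring idea, with a generic regressor standing in for the ANFIS models (synthetic data; the signal names and 3-sigma threshold are illustrative assumptions, not the authors' settings):

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor  # stand-in for ANFIS

    rng = np.random.default_rng(0)

    # Synthetic "healthy" SCADA data: predict a component temperature from
    # scaled power output and ambient temperature.
    X_healthy = rng.uniform(size=(5000, 2))
    y_healthy = 40 * X_healthy[:, 0] + 10 * X_healthy[:, 1] + rng.normal(0, 1, 5000)

    model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
    model.fit(X_healthy, y_healthy)

    # Application phase: the prediction error is the condition indicator.
    threshold = 3 * np.std(y_healthy - model.predict(X_healthy))  # 3-sigma rule
    X_new = rng.uniform(size=(100, 2))
    y_new = 40 * X_new[:, 0] + 10 * X_new[:, 1] + 8.0             # overheating fault
    alarms = np.abs(y_new - model.predict(X_new)) > threshold
    ```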

  17. Response of cultured normal human mammary epithelial cells to X rays

    International Nuclear Information System (INIS)

    Yang, T.C.; Stampfer, M.R.; Smith, H.S.

    1983-01-01

    The effect of X rays on the reproductive death of cultured normal human mammary epithelial cells was examined. Techniques were developed for isolating and culturing normal human mammary epithelial cells which provide sufficient cells at second passage for radiation studies, and an efficient clonogenic assay suitable for measuring radiation survival curves. It was found that the survival curves for epithelial cells from normal breast tissue were exponential and had D0 values of about 109-148 rad for 225 kVp X rays. No consistent change in cell radiosensitivity with the age of donor was observed, and no sublethal damage repair in these cells could be detected with the split-dose technique.
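
    Since the abstract reports exponential survival curves, the surviving fraction follows the single-parameter form S(D) = exp(-D/D0); a one-line check using a D0 within the reported 109-148 rad range:

    ```python
    import math

    def surviving_fraction(dose_rad: float, d0_rad: float) -> float:
        """Exponential survival curve: S(D) = exp(-D / D0)."""
        return math.exp(-dose_rad / d0_rad)

    print(surviving_fraction(200.0, 120.0))  # ~0.19 with D0 = 120 rad
    ```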

  18. 78 FR 40092 - Inviting Rural Business Enterprise Grant Program Applications for Grants To Provide Technical...

    Science.gov (United States)

    2013-07-03

    ... corporate felony convictions and corporate Federal tax delinquencies, applicants that are not delinquent on... result in a zero-point score for that criterion and will impact the overall evaluation of the application... Felony Convictions and Corporate Felony Tax Delinquencies Applications from corporate applicants...

  19. LSM Proteins Provide Accurate Splicing and Decay of Selected Transcripts to Ensure Normal Arabidopsis Development

    Science.gov (United States)

    Perea-Resa, Carlos; Hernández-Verdeja, Tamara; López-Cobollo, Rosa; Castellano, María del Mar; Salinas, Julio

    2012-01-01

    In yeast and animals, SM-like (LSM) proteins typically exist as heptameric complexes and are involved in different aspects of RNA metabolism. Eight LSM proteins, LSM1 to 8, are highly conserved and form two distinct heteroheptameric complexes, LSM1-7 and LSM2-8, which function in mRNA decay and splicing, respectively. A search of the Arabidopsis thaliana genome identifies 11 genes encoding proteins related to the eight conserved LSMs, the genes encoding the putative LSM1, LSM3, and LSM6 proteins being duplicated. Here, we report the molecular and functional characterization of the Arabidopsis LSM gene family. Our results show that the 11 LSM genes are active and encode proteins that are also organized in two different heptameric complexes. The LSM1-7 complex is cytoplasmic and is involved in P-body formation and mRNA decay by promoting decapping. The LSM2-8 complex is nuclear and is required for precursor mRNA splicing through U6 small nuclear RNA stabilization. More importantly, our results also reveal that these complexes are essential for the correct turnover and splicing of selected development-related mRNAs and for the normal development of Arabidopsis. We propose that LSMs play a critical role in Arabidopsis development by ensuring the appropriate development-related gene expression through the regulation of mRNA splicing and decay. PMID:23221597

  20. System and Method for Providing Web-Based Remote Application Service

    OpenAIRE

    Shuen-Tai Wang; Yu-Ching Lin; Hsi-Ya Chang

    2017-01-01

    With the development of virtualization technologies, a new type of service, named cloud computing, has emerged. Cloud users usually encounter the problem of how to use a virtualized platform easily over the web without requiring plug-ins or the installation of special software. The object of this paper is to develop a system and a method enabling process interfacing within an automation scenario for accessing remote applications through the web browser. To meet this challenge, we have ...

  1. On the efficient simulation of the left-tail of the sum of correlated log-normal variates

    KAUST Repository

    Alouini, Mohamed-Slim; Rached, Nadhir B.; Kammoun, Abla; Tempone, Raul

    2018-01-01

    The sum of log-normal variates is encountered in many challenging applications such as performance analysis of wireless communication systems and financial engineering. Several approximation methods have been reported in the literature. However
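
    As context for why naive simulation struggles in the left tail, here is a baseline crude Monte Carlo estimator (not the paper's variance-reduction method; parameters are illustrative):

    ```python
    import numpy as np

    def left_tail_mc(mu, cov, threshold, n=10**6, seed=0):
        """Crude Monte Carlo estimate of P(sum of correlated log-normals < threshold).
        For rare left-tail events the relative error blows up, which motivates
        importance-sampling approaches."""
        rng = np.random.default_rng(seed)
        z = rng.multivariate_normal(mu, cov, size=n)
        return (np.exp(z).sum(axis=1) < threshold).mean()

    print(left_tail_mc([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]], threshold=0.3))
    ```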

  2. The Use of a Pressure-Indicating Sensor Film to Provide Feedback upon Hydrogel-Forming Microneedle Array Self-Application In Vivo.

    Science.gov (United States)

    Vicente-Pérez, Eva M; Quinn, Helen L; McAlister, Emma; O'Neill, Shannon; Hanna, Lezley-Anne; Barry, Johanne G; Donnelly, Ryan F

    2016-12-01

    To evaluate the combination of a pressure-indicating sensor film with hydrogel-forming microneedle arrays, as a method of feedback to confirm MN insertion in vivo. Pilot in vitro insertion studies were conducted using a Texture Analyser to insert MN arrays, coupled with a pressure-indicating sensor film, at varying forces into excised neonatal porcine skin. In vivo studies involved twenty human volunteers, who self-applied two hydrogel-forming MN arrays, one with a pressure-indicating sensor film incorporated and one without. Optical coherence tomography was employed to measure the resulting penetration depth and colorimetric analysis to investigate the associated colour change of the pressure-indicating sensor film. Microneedle insertion was achieved in vitro at three different forces, demonstrating the colour change of the pressure-indicating sensor film upon application of increasing pressure. When self-applied in vivo, there was no significant difference in the microneedle penetration depth resulting from each type of array, with a mean depth of 237 μm recorded. When the pressure-indicating sensor film was present, a colour change occurred upon each application, providing evidence of insertion. For the first time, this study shows how the incorporation of a simple, low-cost pressure-indicating sensor film can indicate microneedle insertion in vitro and in vivo, providing visual feedback to assure the user of correct application. Such a strategy may enhance usability of a microneedle device and, hence, assist in the future translation of the technology to widespread clinical use.

  3. Normalization method for metabolomics data using optimal selection of multiple internal standards

    Directory of Open Access Journals (Sweden)

    Yetukuri Laxman

    2007-03-01

    Full Text Available Abstract Background Success of metabolomics as a phenotyping platform largely depends on its ability to detect various sources of biological variability. Removal of platform-specific sources of variability such as systematic error is therefore one of the foremost priorities in data preprocessing. However, the chemical diversity of molecular species included in typical metabolic profiling experiments leads to different responses to variations in experimental conditions, making normalization a very demanding task. Results With the aim of removing unwanted systematic variation, we present an approach that utilizes variability information from multiple internal standard compounds to find the optimal normalization factor for each individual molecular species detected by the metabolomics approach (NOMIS). We demonstrate the method on mouse liver lipidomic profiles using Ultra Performance Liquid Chromatography coupled to high resolution mass spectrometry, and compare its performance to two commonly utilized normalization methods: normalization by l2 norm and by retention time region specific standard compound profiles. The NOMIS method proved superior in its ability to reduce the effect of systematic error across the full spectrum of metabolite peaks. We also demonstrate that the method can be used to select the best combinations of standard compounds for normalization. Conclusion Depending on experiment design and biological matrix, the NOMIS method is applicable either as a one-step normalization method or as a two-step method where the normalization parameters, influenced by variabilities of internal standard compounds and their correlation to metabolites, are first calculated from a study conducted in repeatability conditions. The method can also be used in analytical development of metabolomics methods by helping to select the best combinations of standard compounds for a particular biological matrix and analytical platform.
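
    A deliberately simplified sketch of normalization against multiple internal standards (this shows the general idea only, not NOMIS's optimal per-metabolite weighting; the column indices are assumptions):

    ```python
    import numpy as np

    def normalize_by_internal_standards(X, is_cols):
        """Scale each sample (row of intensity matrix X) by the mean log-deviation
        of its internal-standard columns from their across-sample medians."""
        logX = np.log2(X)
        dev = logX[:, is_cols] - np.median(logX[:, is_cols], axis=0)
        factor = dev.mean(axis=1, keepdims=True)   # per-sample scaling factor
        return 2.0 ** (logX - factor)
    ```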

  4. The value of MR perfusion weighted imaging in normal and abnormal kidneys

    International Nuclear Information System (INIS)

    Shi Hao; Ding Hongyu; Duan Ruiping; Sun Yongping; Xing Yiyong

    2008-01-01

    Objective: To explore the characteristics and the clinical application of MR perfusion weighted imaging (PWI) in normal kidneys and in renal diseases. Methods: Thirty-one subjects, including 9 cases without urinary diseases, 14 cases with renal carcinoma, 6 cases with renal cyst and 2 cases with renal tuberculosis, who had been examined with T1WI, T2WI and PWI, were analyzed retrospectively. All the data were processed by a workstation to obtain time-signal intensity curves, color perfusion maps and relative perfusion values. The relative renal blood volume (RBV), relative renal blood flow (RBF), mean transit time (MTT) and time to peak (TTP) in the normal renal cortex and medulla and in the renal lesions were calculated. Comparisons between the right and the left normal kidneys, and between the cortex and the medulla of the normal kidneys, were performed using the t test, and comparisons between the normal and the abnormal kidneys were performed using the q test. Results: Relative RBV and relative RBF of the cortex were 1.33±0.08 and 1.44±0.09 respectively, and for the medulla were 0.58±0.05 and 0.78±0.13 respectively (t=9.2241 and 5.0336, P<0.01); the difference between the right and the left normal kidneys was not significant (P>0.05). The values of relative RBF of the renal carcinoma (1.35±0.34) were significantly higher than that of the normal tissues (1.02±0.06) (q=3.0882, P<0.01). Conclusion: PWI is able to demonstrate the hemodynamic changes of normal renal tissues and renal lesions, and it may be an ideal method for showing the functional changes of the kidney and for differentiating renal diseases. (authors)

  5. Film analysis systems and applications

    Energy Technology Data Exchange (ETDEWEB)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

    The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low-cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool. (ERB)

  6. Film analysis systems and applications

    International Nuclear Information System (INIS)

    Yonekura, Y.; Brill, A.B.

    1981-01-01

    The different components that can be used in modern film analysis systems are reviewed. TV camera and charge-coupled device sensors coupled to computers provide low-cost systems for applications such as those described. The autoradiography (ARG) method provides an important tool for medical research and is especially useful for the development of new radiopharmaceutical compounds. Biodistribution information is needed for estimation of radiation dose, and for interpretation of the significance of observed patterns. The need for such precise information is heightened when one seeks to elucidate physiological principles/factors in normal and experimental models of disease. The poor spatial resolution achieved with current PET-imaging systems limits the information on radioreceptor mapping, neurotransmitter, and neuroleptic drug distribution that can be achieved from patient studies. The artful use of ARG in carefully-controlled animal studies will be required to provide the additional information needed to fully understand results obtained with this new important research tool.

  7. Target normal sheath acceleration analytical modeling, comparative study and developments

    International Nuclear Information System (INIS)

    Perego, C.; Batani, D.; Zani, A.; Passoni, M.

    2012-01-01

    Ultra-intense laser interaction with solid targets appears to be an extremely promising technique to accelerate ions up to several MeV, producing beams that exhibit interesting properties for many foreseen applications. Nowadays, most of the published experimental results can be theoretically explained in the framework of the target normal sheath acceleration (TNSA) mechanism proposed by Wilks et al. [Phys. Plasmas 8(2), 542 (2001)]. In recent years, various analytical or semi-analytical TNSA models have been published as an alternative to numerical simulation, each of them trying to provide predictions for some of the ion beam features, given the initial laser and target parameters. However, the problem of developing a reliable model for the TNSA process is still open, which is why the purpose of this work is to clarify the present state of TNSA modeling and experimental results, by means of a quantitative comparison between measurements and theoretical predictions of the maximum ion energy. Moreover, in the light of such an analysis, some indications for the future development of the model proposed by Passoni and Lontano [Phys. Plasmas 13(4), 042102 (2006)] are then presented.

  8. Large-scale event extraction from literature with multi-level gene normalization.

    Directory of Open Access Journals (Sweden)

    Sofie Van Landeghem

    Full Text Available Text mining for the life sciences aims to aid database curation, knowledge summarization and information retrieval through the automated processing of biomedical texts. To provide comprehensive coverage and enable full integration with existing biomolecular database records, it is crucial that text mining tools scale up to millions of articles and that their analyses can be unambiguously linked to information recorded in resources such as UniProt, KEGG, BioGRID and NCBI databases. In this study, we investigate how fully automated text mining of complex biomolecular events can be augmented with a normalization strategy that identifies biological concepts in text, mapping them to identifiers at varying levels of granularity, ranging from canonicalized symbols to unique gene and proteins and broad gene families. To this end, we have combined two state-of-the-art text mining components, previously evaluated on two community-wide challenges, and have extended and improved upon these methods by exploiting their complementary nature. Using these systems, we perform normalization and event extraction to create a large-scale resource that is publicly available, unique in semantic scope, and covers all 21.9 million PubMed abstracts and 460 thousand PubMed Central open access full-text articles. This dataset contains 40 million biomolecular events involving 76 million gene/protein mentions, linked to 122 thousand distinct genes from 5032 species across the full taxonomic tree. Detailed evaluations and analyses reveal promising results for application of this data in database and pathway curation efforts. The main software components used in this study are released under an open-source license. Further, the resulting dataset is freely accessible through a novel API, providing programmatic and customized access (http://www.evexdb.org/api/v001/. Finally, to allow for large-scale bioinformatic analyses, the entire resource is available for bulk download from

  9. nth roots of normal contractions

    International Nuclear Information System (INIS)

    Duggal, B.P.

    1992-07-01

    Given a complex separable Hilbert space H and a contraction A on H such that A^n, for some integer n ≥ 2, is normal, it is shown that if the defect operator D_A = (1 - A*A)^(1/2) is of the Hilbert-Schmidt class, then A is similar to a normal contraction, either A or A^2 is normal, and if A^2 is normal (but A is not) then there is a normal contraction N and a positive definite contraction P of trace class such that ||A - N||_1 = (1/2)||P + P||_1 (where ||·||_1 denotes the trace norm). If T is a compact contraction whose characteristic function admits a scalar factor, if T = A^n for some integer n ≥ 2 and contraction A with simple eigenvalues, and if both T and A satisfy a ''reductive property'', then A is a compact normal contraction. (author). 16 refs.

  10. Normal zone propagation characteristics of coated conductor according to insulation materials

    International Nuclear Information System (INIS)

    Yang, S.E.; Ahn, M.C.; Park, D.K.; Chang, K.S.; Bae, D.K.; Ko, T.K.

    2007-01-01

    Development of coated conductors (CC), usually called second-generation (2G) HTS, is actively in progress. Because of their higher critical current density as well as higher n-value, 2G HTS conductors are feasible for applications such as superconducting fault current limiters and superconducting cables. To operate HTS equipment stably, the characteristics of normal zone propagation caused by quench need to be investigated. Investigations of these fundamental characteristics are an indispensable foundation for the research and development of power equipment. In this paper, normal zone propagation (NZP) characteristics for various insulation materials are investigated. Heating with a NiCr heater and insulating with epoxy, we applied operating currents at set fractions of the critical current for calculation of the minimum quench energy (MQE) and measurement of NZP.

  11. Confidence bounds and hypothesis tests for normal distribution coefficients of variation

    Science.gov (United States)

    Steve P. Verrill; Richard A. Johnson

    2007-01-01

    For normally distributed populations, we obtain confidence bounds on a ratio of two coefficients of variation, provide a test for the equality of k coefficients of variation, and provide confidence bounds on a coefficient of variation shared by k populations. To develop these confidence bounds and test, we first establish that estimators based on Newton steps from n-...
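
    For orientation, here is a rough large-sample interval for a single normal coefficient of variation, using one standard asymptotic approximation (variance cv^2(0.5 + cv^2)/n); the paper derives sharper bounds and the k-population tests:

    ```python
    import math

    def cv_confint(xs, z=1.96):
        """Point estimate and rough large-sample CI for the coefficient of
        variation of a normal sample (assumes the mean is well away from zero)."""
        n = len(xs)
        mean = sum(xs) / n
        sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / (n - 1))
        cv = sd / mean
        se = math.sqrt(cv ** 2 * (0.5 + cv ** 2) / n)
        return cv, (cv - z * se, cv + z * se)
    ```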

  12. Orbital Normalization of MESSENGER Gamma-Ray Spectrometer Data

    Science.gov (United States)

    Rhodes, E. A.; Peplowski, P. N.; Evans, L. G.; Hamara, D. K.; Boynton, W. V.; Solomon, S. C.

    2011-12-01

    The MESSENGER Gamma-Ray Spectrometer (GRS) measures energy spectra of gamma rays emanating from the surface of Mercury. Analysis of these spectra provides elemental abundances of surface material. The MESSENGER mission necessarily presents some data normalization challenges for GRS analysis. So as to keep the spacecraft cool while orbiting the dayside of the planet, the orbits are highly eccentric, with altitudes varying from 200-500 km to ~ 15,000 km. A small fraction of time is spent at the low altitudes where gamma-ray signals are largest, requiring a large number of orbits to yield sufficient counting statistics for elemental analysis. Also, the sunshade must always shield the spacecraft from the Sun, which causes the orientation of the GRS often to be far from nadir-pointing, so the detector efficiency and attenuation of gamma rays from the planet must be known for a wide range of off-nadir orientations. An efficiency/attenuation map for the expected ranges of orientations and energies was constructed in a ground calibration experiment for a limited range of orientations using a nuclear reactor and radioisotope sources, and those results were extended to other orientations by radiation transport computations using as input a computer-aided design model of the spacecraft and its composition. This normalization has allowed abundance determinations of elements K, Th, and U from radioisotopes of these elements in the Mercury regolith during the first quarter of the year-long mission. These results provide constraints on models of Mercury's chemical and thermal evolution. The normalization of gamma-ray spectra for surface elements not having radioisotopes is considerably more complex; these gamma rays come from neutron inelastic-scatter and capture reactions in the regolith, where the neutrons are generated by cosmic ray impact onto the planet. A radiation transport computation was performed to generate the expected count rates in the neutron-generated gamma

  13. An analysis of longitudinal data with nonignorable dropout using the truncated multivariate normal distribution

    NARCIS (Netherlands)

    Jolani, Shahab

    2014-01-01

    For a multivariate normal vector in which some elements, but not necessarily all, are truncated, we derive the moment generating function and obtain expressions for the first two moments involving the multivariate hazard gradient. To show one of many applications of these moments, we then extend the
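
    The univariate special case is easy to check numerically (a sanity-check sketch only; the paper's contribution is the multivariate hazard-gradient expressions):

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    # Moments of N(mu, sigma^2) truncated below at a (upper tail kept).
    mu, sigma, a = 0.0, 1.0, 1.5
    alpha = (a - mu) / sigma                    # standardized truncation point
    dist = truncnorm(alpha, np.inf, loc=mu, scale=sigma)
    m1 = dist.mean()                            # first moment
    m2 = dist.var() + m1 ** 2                   # second (raw) moment
    ```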

  14. Baby Poop: What's Normal?

    Science.gov (United States)

    ... I'm breast-feeding my newborn and her bowel movements are yellow and mushy. Is this normal for baby poop? Answers from Jay L. Hoecker, M.D. Yellow, mushy bowel movements are perfectly normal for breast-fed babies. Still, ...

  15. Normal people working in normal organizations with normal equipment: system safety and cognition in a mid-air collision.

    Science.gov (United States)

    de Carvalho, Paulo Victor Rodrigues; Gomes, José Orlando; Huber, Gilbert Jacob; Vidal, Mario Cesar

    2009-05-01

    A fundamental challenge in improving the safety of complex systems is to understand how accidents emerge in normal working situations, with equipment functioning normally in normally structured organizations. We present a field study that illustrates one response to this challenge: the en route mid-air collision between a commercial carrier and an executive jet in the clear afternoon Amazon sky, in which 154 people lost their lives. Our focus was on how and why the several safety barriers of a well-structured air traffic system melted down, enabling the occurrence of this tragedy without any catastrophic component failure and in a situation where everything was functioning normally. We identify strong consistencies and feedbacks regarding factors of day-to-day system functioning that made monitoring and awareness difficult, and the cognitive strategies that operators have developed to deal with overall system behavior. These findings emphasize the active problem-solving behavior needed in air traffic control work, and highlight how the day-to-day functioning of the system can jeopardize such behavior. An immediate consequence is that safety managers and engineers should review their traditional safety approaches and accident models based on equipment failure probability, linear combinations of failures, rules and procedures, and human errors, in order to deal with complex patterns of coincidence possibilities, unexpected links, resonance among system functions and activities, and system cognition.

  16. A study of several normal values of Korean healthy adults on chest roentgenograms

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Byung Chull [Choong Nam University College of Medicine, Taejeon (Korea, Republic of)

    1975-06-15

    Determination of several normal values was carried out in 1805 healthy Korean adults (1436 males and 369 females) by drawing and calculation on chest roentgenograms. In many instances, changes in these normal values provide important clinical information and are often decisive in evaluating the diagnosis, treatment and prognosis of pulmonary, cardiac and mediastinal disease.

  17. Weight, iodine content and iodine uptake of the thyroid gland of normal Japanese

    International Nuclear Information System (INIS)

    Yoshizawa, Yasuo; Kusama, Tomoko

    1976-01-01

    Various questions arise in the application of ICRP ''Standard Man'' values to Japanese. One of the questions is that the ''Standard Man'' values for the thyroid differ from normal Japanese values. A systematic survey of past reports was carried out with a view to establishing normal Japanese values for the thyroid. The subjects of the search were the weight, iodine content and iodine uptake rate (f_w) of the thyroid. These are important factors in the estimation of the radiation dose to the thyroid caused by internal contamination with radioiodine, and are foreseen to differ between Japanese and ''Standard Man''. The results of the study suggested that the weight of the thyroid of normal Japanese is about 19 g for adult males and about 17 g for adult females, that the iodine content is 12-22 mg, and that the iodine uptake rate (f_w) is about 0.2. (auth.)

  18. Short proofs of strong normalization

    OpenAIRE

    Wojdyga, Aleksander

    2008-01-01

    This paper presents simple, syntactic strong normalization proofs for the simply-typed lambda-calculus and the polymorphic lambda-calculus (system F) with the full set of logical connectives, and all the permutative reductions. The normalization proofs use translations of terms and types to systems, for which strong normalization property is known.

  19. A structure-preserving approach to normal form analysis of power systems; Una propuesta de preservacion de estructura al analisis de su forma normal en sistemas de potencia

    Energy Technology Data Exchange (ETDEWEB)

    Martinez Carrillo, Irma

    2008-01-15

    Power system dynamic behavior is inherently nonlinear and is driven by different processes at different time scales. The size and complexity of these mechanisms have stimulated the search for methods that reduce the original dimension but retain a certain degree of accuracy. In this dissertation, a novel nonlinear dynamical analysis method for the analysis of large amplitude oscillations that embraces ideas from normal form theory and singular perturbation techniques is proposed. This approach allows the full potential of the normal form method to be reached, and is suitably general for application to a wide variety of nonlinear systems. Drawing on the formal theory of dynamical systems, a structure-preserving model of the system is developed that preserves network and load characteristics. By exploiting the separation of fast and slow time scales of the model, an efficient approach based on singular perturbation techniques is then derived for constructing a nonlinear power system representation that accurately preserves network structure. The method requires no reduction of the constraint equations and therefore gives information about the effect of network and load characteristics on system behavior. Analytical expressions are then developed that provide approximate solutions to system performance near a singularity, and techniques for interpreting these solutions in terms of modal functions are given. New insights into the nature of nonlinear oscillations are also offered, and criteria for characterizing network effects on nonlinear system behavior are proposed. Theoretical insight into the behavior of dynamic coupling of differential-algebraic equations and the origin of nonlinearity is given, and implications for the design and placement of power system controllers in complex nonlinear systems are discussed. The extent of applicability of the proposed procedure is demonstrated by analyzing nonlinear behavior in two realistic test power systems.

  20. Multiple spacecraft observations of interplanetary shocks: four spacecraft determination of shock normals

    International Nuclear Information System (INIS)

    Russell, C.T.; Mellott, M.M.; Smith, E.J.; King, J.H.

    1983-01-01

    ISEE 1, 2, 3, IMP 8, and Prognoz 7 observations of interplanetary shocks in 1978 and 1979 provide five instances in which a single shock was observed by four spacecraft. These observations are used to determine best-fit normals for these five shocks; they also provide well-documented shocks for testing future techniques. When the angle between the upstream and downstream magnetic field is greater than 20°, magnetic coplanarity can be an accurate single-spacecraft method. However, no technique based solely on the magnetic measurements at one or multiple sites was universally accurate. Thus, we recommend using overdetermined shock normal solutions whenever possible, utilizing plasma measurements, separation vectors, and time delays together with magnetic constraints.
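
    The magnetic coplanarity normal mentioned above can be computed from single-spacecraft upstream/downstream field averages; a minimal sketch of the standard construction:

    ```python
    import numpy as np

    def coplanarity_normal(b_up, b_down):
        """Magnetic coplanarity shock normal: n is parallel to
        (B_down x B_up) x (B_down - B_up), normalized to unit length.
        (The sign/orientation is ambiguous, as with all coplanarity normals.)"""
        b_up, b_down = np.asarray(b_up), np.asarray(b_down)
        n = np.cross(np.cross(b_down, b_up), b_down - b_up)
        return n / np.linalg.norm(n)
    ```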

  1. Empirical Radiometric Normalization of Road Points from Terrestrial Mobile Lidar System

    Directory of Open Access Journals (Sweden)

    Tee-Ann Teo

    2015-05-01

    Full Text Available Lidar data provide both geometric and radiometric information. Radiometric information is influenced by sensor and target factors and should be calibrated to obtain consistent energy responses. The radiometric correction of an airborne lidar system (ALS) converts the amplitude into a backscatter cross-section with physical meaning by applying a model-driven approach. The radiometric correction of a terrestrial mobile lidar system (MLS) is a challenging task because the response does not completely follow the inverse-square range function at near range. This study proposes a radiometric normalization workflow for MLS using a data-driven approach. The scope of this study is to normalize the amplitude of road points for road surface classification, assuming that road points from different scanners or strips should have similar responses in overlapping areas. The normalization parameters for the range effect were obtained from crossroads. The experiment showed that the amplitude difference between scanners and strips decreased after radiometric normalization, which improved the accuracy of road surface classification.
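
    A compact sketch of the data-driven range correction idea (illustrative only; the paper's workflow estimates its parameters from crossroad/overlap points):

    ```python
    import numpy as np

    def fit_range_normalization(amplitude, r, deg=3):
        """Fit an empirical amplitude-vs-range trend with a polynomial, then
        flatten each return's amplitude to a common reference range."""
        coeff = np.polyfit(r, amplitude, deg)
        trend = np.polyval(coeff, r)
        ref = np.polyval(coeff, np.median(r))   # response at the reference range
        return amplitude * ref / trend
    ```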

  2. Standard heart and vessel size on plain films of normal children

    International Nuclear Information System (INIS)

    Stoever, B.

    1986-01-01

    Standards of heart size, i.e. heart diameters and heart volume, of normal children aged 4-15 years were obtained. In all cases requiring exact heart size determination, heart volume calculation is mandatory in children as well as in adults. Statistical work to date has provided precise calculation of heart volume from plain films in the upright position. Additional plain films in the prone position are unnecessary because no evident orthostatic influence on heart volume in children can be found. Percentiles of normal heart volume related to body weight, representing the best correlation to the individual data, are given, as well as percentiles related to age. Furthermore, ratios of normal vessel size to the height of the 8th thoracic vertebral body, measured on the same plain film, are given. In addition, the ratio of upper to lower lung vessel size is calculated. These ratios are useful criteria in estimating normal vessel size and also in cases with increased pulmonary venous pressure. (orig.) [de]

  3. A Secure Web Application Providing Public Access to High-Performance Data Intensive Scientific Resources - ScalaBLAST Web Application

    International Nuclear Information System (INIS)

    Curtis, Darren S.; Peterson, Elena S.; Oehmen, Chris S.

    2008-01-01

    This work presents the ScalaBLAST Web Application (SWA), a web-based application implemented using the PHP scripting language, MySQL DBMS, and Apache web server under a GNU/Linux platform. SWA is an application built as part of the Data Intensive Computer for Complex Biological Systems (DICCBS) project at the Pacific Northwest National Laboratory (PNNL). SWA delivers accelerated throughput of bioinformatics analysis via high-performance computing through a convenient, easy-to-use web interface. This approach greatly enhances emerging fields of study in biology such as ontology-based homology and multiple whole genome comparisons which, in the absence of a tool like SWA, require a heroic effort to overcome the computational bottleneck associated with genome analysis. The current version of SWA includes a user account management system, a web-based user interface, and a backend process that generates the files necessary for the Internet scientific community to submit a ScalaBLAST parallel processing job on a dedicated cluster.

  4. Addressing the use of cloud computing for web hosting providers

    OpenAIRE

    Fitó, Josep Oriol; Guitart Fernández, Jordi

    2009-01-01

    Nobody doubts that cloud computing is and will be a sea change for Information Technology. Specifically, we address an application of this emerging paradigm to web hosting providers. We create the Cloud Hosting Provider (CHP): a web hosting provider that uses the outsourcing technique to take advantage of cloud computing infrastructures (i.e., cloud-based outsourcing) for providing scalability and availability capabilities to the web applications deployed. Hence, the...

  5. Should we ignore western blots when selecting antibodies for other applications?

    DEFF Research Database (Denmark)

    Uhlén, Mathias

    2017-01-01

    In the Human Protein Atlas (HPA) program, we have validated more than 24,000 in-house-generated antibodies directed to 17,000 human target proteins. Although there is often a correlation between performance in different applications, we have observed many examples of antibodies that show strong support ... applications and that this influences the epitopes exposed on the target protein, which might have profound consequences for the ability of a given antibody to bind specifically to its target. As an example, proteins that are analyzed by immunohistochemistry (IHC) are normally first cross-linked with formalin ... In conclusion, western blot and protein array analyses can indeed be useful tools when selecting specific antibodies for other applications. The use of these methods is encouraged both for antibody providers and users, and antibodies with signs of cross-reactivity in these applications should be treated ...

  6. A normal mode treatment of semi-diurnal body tides on an aspherical, rotating and anelastic Earth

    Science.gov (United States)

    Lau, Harriet C. P.; Yang, Hsin-Ying; Tromp, Jeroen; Mitrovica, Jerry X.; Latychev, Konstantin; Al-Attar, David

    2015-08-01

    Normal mode treatments of the Earth's body tide response were developed in the 1980s to account for the effects of Earth rotation, ellipticity, anelasticity and resonant excitation within the diurnal band. Recent space-geodetic measurements of the Earth's crustal displacement in response to luni-solar tidal forcings have revealed geographical variations that are indicative of aspherical deep mantle structure, thus providing a novel data set for constraining deep mantle elastic and density structure. In light of this, we make use of advances in seismic free oscillation literature to develop a new, generalized normal mode theory for the tidal response within the semi-diurnal and long-period tidal band. Our theory involves a perturbation method that permits an efficient calculation of the impact of aspherical structure on the tidal response. In addition, we introduce a normal mode treatment of anelasticity that is distinct from both earlier work in body tides and the approach adopted in free oscillation seismology. We present several simple numerical applications of the new theory. First, we compute the tidal response of a spherically symmetric, non-rotating, elastic and isotropic Earth model and demonstrate that our predictions match those based on standard Love number theory. Second, we compute perturbations to this response associated with mantle anelasticity and demonstrate that the usual set of seismic modes adopted for this purpose must be augmented by a family of relaxation modes to accurately capture the full effect of anelasticity on the body tide response. Finally, we explore aspherical effects including rotation and we benchmark results from several illustrative case studies of aspherical Earth structure against independent finite-volume numerical calculations of the semi-diurnal body tide response. These tests confirm the accuracy of the normal mode methodology to at least the level of numerical error in the finite-volume predictions. They also demonstrate

  7. Parametric investigations of target normal sheath acceleration experiments

    International Nuclear Information System (INIS)

    Zani, Alessandro; Sgattoni, Andrea; Passoni, Matteo

    2011-01-01

    One of the most important challenges in laser-driven ion acceleration research is to actively control key ion beam features. This is a particularly relevant topic in light of possible future technological applications. In the present work we use a theoretical model of target normal sheath acceleration to reproduce recent experimental parametric studies of how the maximum ion energy depends on laser parameters. The key role played by pulse energy and intensity is highlighted. Finally, the effective dependence of maximum ion energy on intensity is evaluated using a combined theoretical approach, obtained by means of an analytical and a particle-in-cell numerical investigation.

  8. Parametric investigations of target normal sheath acceleration experiments

    Science.gov (United States)

    Zani, Alessandro; Sgattoni, Andrea; Passoni, Matteo

    2011-10-01

    One of the most important challenges in laser-driven ion acceleration research is to actively control key ion beam features. This is a particularly relevant topic in light of possible future technological applications. In the present work we use a theoretical model of target normal sheath acceleration to reproduce recent experimental parametric studies of how the maximum ion energy depends on laser parameters. The key role played by pulse energy and intensity is highlighted. Finally, the effective dependence of maximum ion energy on intensity is evaluated using a combined theoretical approach, obtained by means of an analytical and a particle-in-cell numerical investigation.

  9. Is My Penis Normal? (For Teens)

    Science.gov (United States)

    ... any guy who's ever worried about whether his penis is a normal size. There's a fairly wide ...

  10. Is normal science good science?

    Directory of Open Access Journals (Sweden)

    Adrianna Kępińska

    2015-09-01

    “Normal science” is a concept introduced by Thomas Kuhn in The Structure of Scientific Revolutions (1962). In Kuhn’s view, normal science means “puzzle solving”, solving problems within the paradigm—the framework most successful in solving current major scientific problems—rather than producing major novelties. This paper examines Kuhnian and Popperian accounts of normal science and their criticisms, to assess whether normal science is good science. The advantage of normal science according to Kuhn was “psychological”: subjective satisfaction from successful “puzzle solving”. Popper argues for an “intellectual” science, one that consistently refutes conjectures (hypotheses) and offers new ideas rather than focusing on personal advantages. His account is criticized as too impersonal and idealistic. Feyerabend’s perspective seems more balanced; he argues for a community that would introduce new ideas, defend old ones, and enable scientists to develop in line with their subjective preferences. The paper concludes that normal science has no one clear-cut set of criteria encompassing its meaning and enabling clear assessment.

  11. The application of XML in the effluents data modeling of nuclear facilities

    International Nuclear Information System (INIS)

    Yue Feng; Lin Quanyi; Yue Huiguo; Zhang Yan; Zhang Peng; Cao Jun; Chen Bo

    2013-01-01

    The radioactive effluent data, which can provide information to distinguish whether facilities, waste disposal, and control systems run normally, is an important basis of safety regulation and emergency management. It can also provide the information needed to trigger an emergency alarm system as soon as possible. XML technology is an effective tool for realizing a standard for effluent data exchange, supporting data collection, statistics and analysis, and strengthening the effectiveness of effluent regulation. This paper first introduces the concept of XML and the choice of an effluent data modeling method, then describes the effluent modeling process, and finally presents the model and its application. While the application of XML to the effluent data modeling of nuclear facilities still has deficiencies, it is a beneficial attempt at the informatization management of effluents. (authors)
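
    As a rough illustration of the kind of data model the record describes, the sketch below encodes a single effluent report in XML and reads it back with Python's standard library. All element and attribute names (effluentReport, release, nuclide) are hypothetical, not the schema developed in the paper.

```python
import xml.etree.ElementTree as ET

# Hypothetical effluent record: element and attribute names are illustrative,
# not the schema developed in the paper.
record = """<effluentReport facility="NPP-1" period="2013-Q1">
  <release pathway="gaseous">
    <nuclide name="H-3" activity="1.2e9" unit="Bq"/>
    <nuclide name="Kr-85" activity="3.4e8" unit="Bq"/>
  </release>
</effluentReport>"""

root = ET.fromstring(record)
for nuclide in root.iter("nuclide"):  # structured exchange makes parsing trivial
    print(nuclide.get("name"), nuclide.get("activity"), nuclide.get("unit"))
```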

  12. SYNTHESIS METHODS OF ALGEBRAIC NORMAL FORM OF MANY-VALUED LOGIC FUNCTIONS

    Directory of Open Access Journals (Sweden)

    A. V. Sokolov

    2016-01-01

    The rapid development of methods of error-correcting coding, cryptography, and signal synthesis theory based on the principles of many-valued logic determines the need for a more detailed study of the forms of representation of functions of many-valued logic. In particular, the algebraic normal form of Boolean functions, also known as the Zhegalkin polynomial, which describes many of the cryptographic properties of Boolean functions well, is widely used. In this article, we formalize the notion of the algebraic normal form for many-valued logic functions. We develop a fast method for synthesizing the algebraic normal form of 3-functions and 5-functions that works similarly to the Reed-Muller transform for Boolean functions, on the basis of recurrently synthesized transform matrices. We propose a hypothesis that determines the rules for synthesizing these matrices for the transformation from the truth table to the coefficients of the algebraic normal form, and the inverse transform, for any given number of variables of 3-functions or 5-functions. The article also introduces the definition of the algebraic degree of nonlinearity of many-valued logic functions and of S-boxes based on the principles of many-valued logic. The methods for synthesizing the algebraic normal form of 3-functions were then applied to the known construction of recurrent synthesis of S-boxes of length N = 3^k, whereby their algebraic degrees of nonlinearity were computed. The results could be the basis for further theoretical research and practical applications such as the development of new cryptographic primitives, error-correcting codes, data compression algorithms, signal structures, and block and stream encryption algorithms, all based on the promising principles of many-valued logic. In addition, the fast method for synthesizing the algebraic normal form of many-valued logic functions is the basis for their software and hardware implementation.
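
    For orientation, the sketch below implements the Boolean (2-valued) special case that the paper generalizes: the fast Möbius (binary Reed-Muller) transform, which converts a truth table into the coefficients of its Zhegalkin polynomial. The 3- and 5-valued transforms developed in the paper are not reproduced here.

```python
def anf_coefficients(truth_table):
    """Fast Moebius (binary Reed-Muller) transform: maps the truth table of a
    Boolean function to the coefficients of its Zhegalkin polynomial."""
    c = list(truth_table)
    n = len(c)  # must be a power of two
    step = 1
    while step < n:
        for i in range(0, n, 2 * step):
            for j in range(i, i + step):
                c[j + step] ^= c[j]  # XOR butterfly, in place
        step *= 2
    return c

# f(x1, x2) = x1 XOR x2, truth table ordered f(00), f(01), f(10), f(11):
print(anf_coefficients([0, 1, 1, 0]))  # -> [0, 1, 1, 0], i.e. f = x1 + x2 (mod 2)
```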

  13. Time-invariant component-based normalization for a simultaneous PET-MR scanner.

    Science.gov (United States)

    Belzunce, M A; Reader, A J

    2016-05-07

    Component-based normalization is a method used to compensate for the sensitivity of each of the lines of response acquired in positron emission tomography. This method consists of modelling the sensitivity of each line of response as a product of multiple factors, which can be classified as time-invariant, time-variant and acquisition-dependent components. Typical time-variant factors are the intrinsic crystal efficiencies, which need to be updated by a regular normalization scan. Failure to do so would in principle generate artifacts in the reconstructed images due to the use of out-of-date time-variant factors. For this reason, an assessment of the variability and the impact of the crystal efficiencies in the reconstructed images is important to determine the frequency needed for the normalization scans, as well as to estimate the error obtained when an inappropriate normalization is used. Furthermore, if the fluctuations of these components are low enough, they could be neglected and nearly artifact-free reconstructions become achievable without performing a regular normalization scan. In this work, we analyse the impact of the time-variant factors in the component-based normalization used in the Biograph mMR scanner, but the work is applicable to other PET scanners. These factors are the intrinsic crystal efficiencies and the axial factors. For the latter, we propose a new method to obtain fixed axial factors that was validated with simulated data. Regarding the crystal efficiencies, we assessed their fluctuations during a period of 230 d and we found that they had good stability and low dispersion. We studied the impact of not including the intrinsic crystal efficiencies in the normalization when reconstructing simulated and real data. Based on this assessment and using the fixed axial factors, we propose the use of a time-invariant normalization that is able to achieve comparable results to the standard, daily updated, normalization factors used in this...
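
    A toy sketch of the component-based model itself, with made-up factor values: each line of response gets a sensitivity equal to the product of its component factors, and the normalization factor is the reciprocal applied to the measured counts. This illustrates only the general idea, not the Biograph mMR implementation.

```python
import numpy as np

# Toy illustration of component-based normalization (not the scanner's code):
# each line of response (LOR) has a sensitivity equal to the product of its
# component factors; the normalization factor is the reciprocal.
rng = np.random.default_rng(0)
n_lors = 1000
crystal_eff = rng.normal(1.0, 0.03, n_lors)  # time-variant intrinsic efficiencies
axial = rng.normal(1.0, 0.01, n_lors)        # axial factors (fixed, per the paper)
geometric = np.ones(n_lors)                  # time-invariant geometric factors

sensitivity = crystal_eff * axial * geometric
norm_factors = 1.0 / sensitivity

measured = rng.poisson(100 * sensitivity)    # simulated LOR counts
corrected = measured * norm_factors          # counts after normalization
print(round(float(corrected.mean()), 1))     # ~100 across all LORs
```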

  14. Parallel factor ChIP provides essential internal control for quantitative differential ChIP-seq.

    Science.gov (United States)

    Guertin, Michael J; Cullen, Amy E; Markowetz, Florian; Holding, Andrew N

    2018-04-17

    A key challenge in quantitative ChIP combined with high-throughput sequencing (ChIP-seq) is the normalization of data in the presence of genome-wide changes in occupancy. Analysis-based normalization methods were developed for transcriptomic data and these are dependent on the underlying assumption that total transcription does not change between conditions. For genome-wide changes in transcription factor (TF) binding, these assumptions do not hold true. The challenges in normalization are confounded by experimental variability during sample preparation, processing and recovery. We present a novel normalization strategy utilizing an internal standard of unchanged peaks for reference. Our method can be readily applied to monitor genome-wide changes by ChIP-seq that are otherwise lost or misrepresented through analytical normalization. We compare our approach to normalization by total read depth and two alternative methods that utilize external experimental controls to study TF binding. We successfully resolve the key challenges in quantitative ChIP-seq analysis and demonstrate its application by monitoring the loss of Estrogen Receptor-alpha (ER) binding upon fulvestrant treatment, ER binding in response to estradiol, ER-mediated change in H4K12 acetylation and profiling ER binding in patient-derived xenografts. This is supported by an adaptable pipeline to normalize and quantify differential TF binding genome-wide and generate metrics for differential binding at individual sites.
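
    A minimal sketch of the internal-standard idea on simulated counts: a scaling factor is derived from peaks assumed to be unchanged between conditions and applied genome-wide, so a genuine global loss of binding survives normalization. The numbers are synthetic and the code is not the authors' pipeline.

```python
import numpy as np

# Sketch of the internal-standard strategy on simulated counts (illustration
# only). A genome-wide 60% loss of binding would be hidden by total-read-depth
# scaling, but survives normalization against unchanged reference peaks.
rng = np.random.default_rng(1)
control = rng.poisson(200, 5000).astype(float)        # per-peak counts, condition A
treated = 0.4 * control * rng.normal(1, 0.05, 5000)   # global loss of binding
unchanged = np.arange(100)                            # indices of reference peaks
treated[unchanged] = control[unchanged] * rng.normal(1, 0.05, 100)

scale = control[unchanged].mean() / treated[unchanged].mean()
treated_norm = treated * scale                        # normalized counts
print(round(float(treated_norm[100:].mean() / control[100:].mean()), 2))  # ~0.4
```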

  15. Diagnosing dementia and normal aging: clinical relevance of brain ratios and cognitive performance in a Brazilian sample

    Directory of Open Access Journals (Sweden)

    Chaves M.L.F.

    1999-01-01

    The main objective of the present study was to evaluate the diagnostic value (clinical application) of brain measures and cognitive function. Alzheimer and multi-infarct patients (N = 30) and normal subjects over the age of 50 (N = 40) were submitted to a medical, neurological and cognitive investigation. The cognitive tests applied were Mini-Mental, word span, digit span, logical memory, spatial recognition span, Boston naming test, praxis, and calculation tests. The brain ratios calculated were the ventricle-brain, bifrontal, bicaudate, third ventricle, and suprasellar cistern measures. These data were obtained from brain computed tomography scans, and the cutoff values from receiver operating characteristic curves. We analyzed the diagnostic parameters provided by these ratios and compared them to those obtained by cognitive evaluation. The sensitivity and specificity of the cognitive tests were higher than those of the brain measures, although dementia patients presented higher ratios and poorer cognitive performance than normal individuals. Normal controls over the age of 70 presented higher measures than younger groups, but similar cognitive performance. We found diffuse losses of tissue from the central nervous system related to the distribution of cerebrospinal fluid in dementia patients. The likelihood of case identification by functional impairment was higher than when changes in the structure of the central nervous system were used. Cognitive evaluation still seems to be the best method to screen individuals from the community, especially in developing countries, where the cost of brain imaging precludes its use for screening and initial assessment of dementia.

  16. A flexible software architecture for scalable real-time image and video processing applications

    Science.gov (United States)

    Usamentiaga, Rubén; Molleda, Julio; García, Daniel F.; Bulnes, Francisco G.

    2012-06-01

    Real-time image and video processing applications require skilled architects, and recent trends in the hardware platform make the design and implementation of these applications increasingly complex. Many frameworks and libraries have been proposed or commercialized to simplify the design and tuning of real-time image processing applications. However, they tend to lack flexibility because they are normally oriented towards particular types of applications, or they impose specific data processing models such as the pipeline. Other issues include large memory footprints, difficulty of reuse and inefficient execution on multicore processors. This paper presents a novel software architecture for real-time image and video processing applications which addresses these issues. The architecture is divided into three layers: the platform abstraction layer, the messaging layer, and the application layer. The platform abstraction layer provides a high-level application programming interface for the rest of the architecture. The messaging layer provides a message passing interface based on a dynamic publish/subscribe pattern. Topic-based filtering, in which messages are published to topics, is used to route the messages from the publishers to the subscribers interested in a particular type of message. The application layer provides a repository for reusable application modules designed for real-time image and video processing applications. These modules, which include acquisition, visualization, communication, user interface and data processing modules, take advantage of the power of other well-known libraries such as OpenCV, Intel IPP, or CUDA. Finally, we present different prototypes and applications to show the possibilities of the proposed architecture.
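
    A minimal sketch of the topic-based publish/subscribe routing described for the messaging layer, assuming nothing about the actual API: messages published to a topic are delivered only to callbacks subscribed to that topic. The real architecture adds threading, queues and the platform abstraction layer.

```python
from collections import defaultdict

# Minimal sketch of topic-based publish/subscribe routing: a message published
# to a topic reaches only the callbacks subscribed to that topic.
class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        for callback in self._subscribers[topic]:
            callback(message)

bus = MessageBus()
bus.subscribe("frames/raw", lambda m: print("processing frame", m["frame_id"]))
bus.publish("frames/raw", {"frame_id": 1})    # delivered
bus.publish("frames/stats", {"mean": 0.5})    # no subscriber, silently dropped
```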

  17. Color normalization of histology slides using graph regularized sparse NMF

    Science.gov (United States)

    Sha, Lingdao; Schonfeld, Dan; Sethi, Amit

    2017-03-01

    Computer-based automatic medical image processing and quantification are becoming popular in digital pathology. However, preparation of histology slides can vary widely due to differences in staining equipment, procedures and reagents, which can reduce the accuracy of algorithms that analyze their color and texture information. To reduce the unwanted color variations, various supervised and unsupervised color normalization methods have been proposed. Compared with supervised color normalization methods, unsupervised color normalization methods have the advantages of time and cost efficiency and universal applicability. Most of the unsupervised color normalization methods for histology are based on stain separation. Based on the fact that stain concentration cannot be negative and different parts of the tissue absorb different stains, nonnegative matrix factorization (NMF), and in particular its sparse version (SNMF), are good candidates for stain separation. However, most of the existing unsupervised color normalization methods like PCA, ICA, NMF and SNMF fail to consider important information about the sparse manifolds that their pixels occupy, which could potentially result in loss of texture information during color normalization. Manifold learning methods like the Graph Laplacian have proven to be very effective in interpreting high-dimensional data. In this paper, we propose a novel unsupervised stain separation method called graph regularized sparse nonnegative matrix factorization (GSNMF). By considering the sparse prior of stain concentration together with manifold information from high-dimensional image data, our method shows better performance in stain color deconvolution than existing unsupervised color deconvolution methods, especially in keeping connected texture information. To utilize the texture information, we construct a nearest neighbor graph between pixels within a spatial area of an image based on their distances using a heat kernel in lαβ space. The...
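
    For context, the sketch below shows the plain (unregularized) NMF stain-separation baseline that GSNMF extends, using scikit-learn on synthetic pixel data: RGB values are converted to optical densities, which are factorized into non-negative stain colors and per-pixel concentrations. The graph regularization and sparsity terms of the paper are omitted.

```python
import numpy as np
from sklearn.decomposition import NMF

# Baseline NMF stain separation (the idea GSNMF builds on, not the paper's
# method): optical density = concentrations x stain colors, all non-negative.
rng = np.random.default_rng(2)
rgb = rng.integers(50, 255, size=(64, 64, 3)).astype(float)  # stand-in slide tile

od = -np.log((rgb.reshape(-1, 3) + 1) / 256.0)   # Beer-Lambert optical density
model = NMF(n_components=2, init="nndsvda", max_iter=500, random_state=0)
concentrations = model.fit_transform(od)          # per-pixel stain concentrations
stain_matrix = model.components_                  # rows ~ hematoxylin / eosin colors
print(stain_matrix.round(2))
```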

  18. Wormhole potentials and throats from quasi-normal modes

    Science.gov (United States)

    Völkel, Sebastian H.; Kokkotas, Kostas D.

    2018-05-01

    Exotic compact objects refer to a wide class of black hole alternatives or effective models that phenomenologically describe quantum gravitational effects on the horizon scale. In this work we show how knowledge of the quasi-normal mode spectrum of non-rotating wormhole models can be used to reconstruct the effective potential that appears in the perturbation equations. From this it is further possible to obtain the parameters that characterize the specific wormhole model, which in this paper was chosen to be the one by Damour and Solodukhin. We also address the question of whether one can distinguish this type of wormhole from ultra compact stars if only the quasi-normal mode spectrum is known. We have proven that this is not possible using the trapped modes only; it requires additional information. The inverse method presented here is an extension of work that has previously been developed and applied to the oscillation spectra of ultra compact stars and gravastars. However, it is not limited to the study of exotic compact objects, but is applicable to symmetric double-barrier potentials that appear in one-dimensional wave equations. Therefore we think it can be of interest for other fields too.

  19. A normal metal tunnel-junction heat diode

    Energy Technology Data Exchange (ETDEWEB)

    Fornieri, Antonio, E-mail: antonio.fornieri@sns.it; Martínez-Pérez, María José; Giazotto, Francesco, E-mail: giazotto@sns.it [NEST, Istituto Nanoscienze-CNR and Scuola Normale Superiore, I-56127 Pisa (Italy)

    2014-05-05

    We propose a low-temperature thermal rectifier consisting of a chain of three tunnel-coupled normal metal electrodes. We show that a large heat rectification is achievable if the thermal symmetry of the structure is broken and the central island can release energy to the phonon bath. The performance of the device is theoretically analyzed and, under the appropriate conditions, temperature differences up to ∼200 mK between the forward and reverse thermal bias configurations are obtained below 1 K, corresponding to a rectification ratio R∼2000. The simplicity intrinsic to its design, combined with its insensitivity to magnetic fields, makes our device potentially attractive as a fundamental building block in solid-state thermal nanocircuits and in general-purpose cryogenic electronic applications requiring energy management.

  20. Evaluation of a smartphone nutrition and physical activity application to provide lifestyle advice to pregnant women: The SNAPP randomised trial.

    Science.gov (United States)

    Dodd, Jodie M; Louise, Jennie; Cramp, Courtney; Grivell, Rosalie M; Moran, Lisa J; Deussen, Andrea R

    2018-01-01

    Our objective was to evaluate the impact of a smartphone application as an adjunct to face-to-face consultations in facilitating dietary and physical activity change among pregnant women. This multicentre, nested randomised trial involved pregnant women with a body mass index ≥18.5 kg/m², with a singleton pregnancy between 10 and 20 weeks' gestation, and participating in 2 pregnancy nutrition-based randomised trials across metropolitan Adelaide, South Australia. All women participating in the SNAPP trial received a comprehensive dietary, physical activity, and behavioural intervention, as part of the GRoW or OPTIMISE randomised trials. Women were subsequently randomised to either the "Lifestyle Advice Only Group," where women received the above intervention, or the "Lifestyle Advice plus Smartphone Application Group," where women were additionally provided access to the smartphone application. The primary outcome was healthy eating index (HEI) assessed by maternal food frequency questionnaire completed at trial entry, and 28 and 36 weeks' gestation. Analyses were performed using intention-to-treat principles, with statistical significance at p = .05. One hundred sixty-two women participated: 77 allocated to the Lifestyle Advice plus Smartphone Application Group and 85 to the Lifestyle Advice Only Group. Mean difference in HEI score at 28 weeks of pregnancy was 0.01 (CI [-2.29, 2.62]) and at 36 weeks of pregnancy -1.16 (CI [-4.60, 2.28]). There was no significant additional benefit from the provision of the smartphone application in improving HEI score (p = .452). Although all women improved dietary quality across pregnancy, use of the smartphone application was poor. Our findings do not support addition of the smartphone application. © 2017 John Wiley & Sons Ltd.

  1. EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA); Scientific Opinion on the substantiation of a health claim related to glucosamine and maintenance of normal joint cartilage pursuant to Article 13(5) of Regulation (EC) No 1924/2006

    DEFF Research Database (Denmark)

    Tetens, Inge

    Following an application from Merck Consumer Healthcare, submitted pursuant to Article 13(5) of Regulation (EC) No 1924/2006 via the Competent Authority of Belgium, the Panel on Dietetic Products, Nutrition and Allergies was asked to deliver an opinion on the scientific substantiation of a health...... claim related to glucosamine, formulated as glucosamine sulphate or hydrochloride, and maintenance of normal joint cartilage. Glucosamine is sufficiently characterised. The claimed effect is “contributes to the maintenance of normal joint cartilage”. The target population as proposed by the applicant...... to studies in patients with osteoarthritis, in healthy subjects, in animals and in vitro as being pertinent to the health claim. In weighing the evidence, the Panel took into account that no human studies were provided from which conclusions could be drawn on the effect of dietary glucosamine...

  2. Speech Respiratory Measures in Spastic Cerebral Palsied and Normal Children

    Directory of Open Access Journals (Sweden)

    Hashem Shemshadi

    2007-10-01

    Objective: This research was designed to determine speech respiratory measures in spastic cerebral palsied children versus normal ones, to be used as an applicable tool in speech therapy plans. Materials & Methods: Via a comparative cross-sectional (case-control) study, using directive goal-oriented sampling for cases and a convenience approach for controls, twenty spastic cerebral palsied children and twenty controls, matched for age (5-12 years old) and sex (F=20, M=20), were identified. All possible inclusion and exclusion criteria were considered through thorough reviews of past medical, clinical and paraclinical records, such as chest X-rays and complete blood counts, to rule out any possible pulmonary and/or systemic disorders. Their speech respiratory indices were determined by a respirometer (ST 1-dysphonia), made and normalized by Glasgow University. The obtained data were analyzed by the independent t-test. Results: There were significant differences between the case and control groups for mean tidal volume, phonatory volume and vital capacity at α=0.05, and these values in patients were less (34%) than in normal children (P<0.001). Conclusion: The measures obtained are highly crucial for speech therapists in any primary rehabilitative speech therapy plans for spastic cerebral palsied children.

  3. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software...... with process. Information gained from the review of literature on GSD tools and processes is used to extract functional requirements for the middleware platform for provisioning of software development applications and tools as services. Findings from the review of literature on architecture solutions for cloud......Cloud computing has become an established paradigm for enabling organizations to build scalable software systems and to meet the challenges of rapid demand for computing and storage resources. There has been significant success in building cloud-enabled applications for many disciplines, ranging from...

  4. Elemental composition of 'normal' and Alzheimer brain tissue by INA and PIXE analyses

    International Nuclear Information System (INIS)

    Stedman, J.D.; Spyrou, N.M.

    1997-01-01

    Instrumental methods based on the nuclear and atomic properties of the elements have been used for many years to determine elemental concentrations in a variety of materials for biomedical, industrial and environmental applications. These methods offer high sensitivity for accurate trace element measurements, suffer few interfering or competing effects, present no blank problems and are convenient for both research and routine analyses. The present article describes the use of two trace element techniques. The first is the activation of stable nuclei irradiated by neutrons in the core of a low-power research reactor as a means of detecting elements through the resulting emitted gamma-rays. The second is the observation of the interactions of energetic ion beams with the material in order to identify elemental species. Over recent years there has been some interest in determining the elemental composition of 'normal' and Alzheimer-affected brain tissue; however, literature findings are inconsistent. Possible reasons for discrepancies need to be identified for further progress to be made. Here, post-mortem tissue samples, provided by the Alzheimer's Disease Brain Bank, Institute of Psychiatry, London, were taken from the frontal, occipital, parietal and temporal lobes of both hemispheres of brains from 13 'normal' and 19 Alzheimer subjects. The elemental composition of the samples was determined using the analytical techniques of INAA (instrumental neutron activation analysis), RBS (Rutherford back-scattering) and PIXE (particle induced x-ray emission). The principal findings are summarised here. (author)

  5. Standard-Chinese Lexical Neighborhood Test in normal-hearing young children.

    Science.gov (United States)

    Liu, Chang; Liu, Sha; Zhang, Ning; Yang, Yilin; Kong, Ying; Zhang, Luo

    2011-06-01

    The purposes of the present study were to establish the Standard-Chinese version of Lexical Neighborhood Test (LNT) and to examine the lexical and age effects on spoken-word recognition in normal-hearing children. Six lists of monosyllabic and six lists of disyllabic words (20 words/list) were selected from the database of daily speech materials for normal-hearing (NH) children of ages 3-5 years. The lists were further divided into "easy" and "hard" halves according to the word frequency and neighborhood density in the database based on the theory of Neighborhood Activation Model (NAM). Ninety-six NH children (age ranged between 4.0 and 7.0 years) were divided into three different age groups of 1-year intervals. Speech-perception tests were conducted using the Standard-Chinese monosyllabic and disyllabic LNT. The inter-list performance was found to be equivalent and inter-rater reliability was high with 92.5-95% consistency. Results of word-recognition scores showed that the lexical effects were all significant. Children scored higher with disyllabic words than with monosyllabic words. "Easy" words scored higher than "hard" words. The word-recognition performance also increased with age in each lexical category. A multiple linear regression analysis showed that neighborhood density, age, and word frequency appeared to have increasingly more contributions to Chinese word recognition. The results of the present study indicated that performances of Chinese word recognition were influenced by word frequency, age, and neighborhood density, with word frequency playing a major role. These results were consistent with those in other languages, supporting the application of NAM in the Chinese language. The development of Standard-Chinese version of LNT and the establishment of a database of children of 4-6 years old can provide a reliable means for spoken-word recognition test in children with hearing impairment. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  6. Identification of genes for normalization of real-time RT-PCR data in breast carcinomas

    DEFF Research Database (Denmark)

    Lyng, Maria B; Laenkholm, Anne-Vibeke; Pallisgaard, Niels

    2008-01-01

    BACKGROUND: Quantitative real-time RT-PCR (RT-qPCR) has become a valuable molecular technique in basic and translational biomedical research, and is emerging as an equally valuable clinical tool. Correlation of inter-sample values requires data normalization, which can be accomplished by various...... means, the most common of which is normalization to internal, stably expressed, reference genes. Recently, such traditionally utilized reference genes as GAPDH and B2M have been found to be regulated in various circumstances in different tissues, emphasizing the need to identify genes independent...... of factors influencing the tissue, and that are stably expressed within the experimental milieu. In this study, we identified genes for normalization of RT-qPCR data for invasive breast cancer (IBC), with special emphasis on estrogen receptor positive (ER+) IBC, but also examined their applicability to ER...
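
    Once stable reference genes are identified, they are typically used in delta-Ct normalization; a generic sketch with made-up Ct values is shown below. This is the standard technique such reference genes feed into, not the authors' code.

```python
import numpy as np

# Generic delta-Ct normalization with made-up Ct values: expression of the
# target gene is reported relative to a stably expressed reference gene.
ct_target = np.array([24.1, 23.8, 26.5])      # Ct of gene of interest, 3 samples
ct_reference = np.array([18.0, 17.9, 18.1])   # Ct of stable reference gene

delta_ct = ct_target - ct_reference
relative_expression = 2.0 ** (-delta_ct)      # 2^-dCt, assumes 100% PCR efficiency
print(relative_expression.round(4))
```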

  7. A new selective developmental deficit: Impaired object recognition with normal face recognition.

    Science.gov (United States)

    Germine, Laura; Cashdollar, Nathan; Düzel, Emrah; Duchaine, Bradley

    2011-05-01

    Studies of developmental deficits in face recognition, or developmental prosopagnosia, have shown that individuals who have not suffered brain damage can show face recognition impairments coupled with normal object recognition (Duchaine and Nakayama, 2005; Duchaine et al., 2006; Nunn et al., 2001). However, no developmental cases with the opposite dissociation - normal face recognition with impaired object recognition - have been reported. The existence of a case of non-face developmental visual agnosia would indicate that the development of normal face recognition mechanisms does not rely on the development of normal object recognition mechanisms. To see whether a developmental variant of non-face visual object agnosia exists, we conducted a series of web-based object and face recognition tests to screen for individuals showing object recognition memory impairments but not face recognition impairments. Through this screening process, we identified AW, an otherwise normal 19-year-old female, who was then tested in the lab on face and object recognition tests. AW's performance was impaired in within-class visual recognition memory across six different visual categories (guns, horses, scenes, tools, doors, and cars). In contrast, she scored normally on seven tests of face recognition, tests of memory for two other object categories (houses and glasses), and tests of recall memory for visual shapes. Testing confirmed that her impairment was not related to a general deficit in lower-level perception, object perception, basic-level recognition, or memory. AW's results provide the first neuropsychological evidence that recognition memory for non-face visual object categories can be selectively impaired in individuals without brain damage or other memory impairment. These results indicate that the development of recognition memory for faces does not depend on intact object recognition memory and provide further evidence for category-specific dissociations in visual

  8. Co-Opetition Provides the Halifax with Tailor-Made Training

    Science.gov (United States)

    Education & Training, 2002

    2002-01-01

    Describes the co-operation between Cranfield University, Trans4mation management consultancy and ProActive outdoor activities provider, in a new leadership programme for UK bank Halifax plc. Shows that the three organizations, which might normally have been competing against each other, had to devise ways of tearing down barriers, communicating…

  9. A systematic assessment of normalization approaches for the Infinium 450K methylation platform.

    Science.gov (United States)

    Wu, Michael C; Joubert, Bonnie R; Kuan, Pei-fen; Håberg, Siri E; Nystad, Wenche; Peddada, Shyamal D; London, Stephanie J

    2014-02-01

    The Illumina Infinium HumanMethylation450 BeadChip has emerged as one of the most popular platforms for genome-wide profiling of DNA methylation. While the technology is widespread, systematic technical biases are believed to be present in the data. For example, this array incorporates two different chemical assays, i.e., Type I and Type II probes, which exhibit different technical characteristics and potentially complicate the computational and statistical analysis. Several normalization methods have been introduced recently to adjust for possible biases. However, there is considerable debate within the field on which normalization procedure should be used and indeed whether normalization is even necessary. Yet despite the importance of the question, there has been little comprehensive comparison of normalization methods. We sought to systematically compare several popular normalization approaches using the Norwegian Mother and Child Cohort Study (MoBa) methylation data set and the technical replicates analyzed with it as a case study. We assessed both the reproducibility between technical replicates following normalization and the effect of normalization on association analysis. Results indicate that the raw data are already highly reproducible, some normalization approaches can slightly improve reproducibility, but other normalization approaches may introduce more variability into the data. Results also suggest that differences in association analysis after applying different normalizations are not large when the signal is strong, but when the signal is more modest, different normalizations can yield very different numbers of findings that meet a weaker statistical significance threshold. Overall, our work provides useful, objective assessment of the effectiveness of key normalization methods.

  10. Normalized glandular dose (DgN) coefficients for flat-panel CT breast imaging

    International Nuclear Information System (INIS)

    Thacker, Samta C; Glick, Stephen J

    2004-01-01

    The development of new digital mammography techniques such as dual-energy imaging, tomosynthesis and CT breast imaging will require investigation of optimal camera design parameters and optimal imaging acquisition parameters. In optimizing these acquisition protocols and imaging systems it is important to have knowledge of the radiation dose to the breast. This study presents a methodology for estimating the normalized glandular dose to the uncompressed breast using the geometry proposed for flat-panel CT breast imaging. The simulation uses the GEANT 3 Monte Carlo code to model x-ray transport and absorption within the breast phantom. The Monte Carlo software was validated for breast dosimetry by comparing results for the normalized glandular dose (DgN) values of the compressed breast to those reported in the literature. The normalized glandular dose was then estimated for a range of breast diameters from 10 cm to 18 cm using an uncompressed breast model with a homogeneous composition of adipose and glandular tissue, and for monoenergetic x-rays from 10 keV to 120 keV. These data were fit, providing expressions for the normalized glandular dose. Using these expressions for the DgN coefficients and input variables such as the diameter, height and composition of the breast phantom, the mean glandular dose for any spectrum can be estimated. A computer program providing normalized glandular dose values has been made available online. In addition, figures displaying energy deposition maps are presented to better understand the spatial distribution of dose in CT breast imaging.
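
    A sketch of the final fitting step under stated assumptions: made-up stand-in values for Monte Carlo DgN results are fit with a cubic polynomial in photon energy, so that DgN can then be evaluated for arbitrary monoenergetic points of a spectrum. The placeholder numbers are not the published coefficients.

```python
import numpy as np

# Illustrative only: fit a smooth expression to simulated DgN values as a
# function of photon energy. The values below are invented placeholders,
# not the paper's Monte Carlo results.
energy_keV = np.array([10, 20, 30, 40, 60, 80, 100, 120], dtype=float)
dgn_sim = np.array([0.05, 0.35, 0.60, 0.75, 0.88, 0.93, 0.96, 0.98])  # fake data

coeffs = np.polyfit(energy_keV, dgn_sim, deg=3)   # cubic fit of DgN(E)
dgn_fit = np.poly1d(coeffs)
print(round(float(dgn_fit(50.0)), 3))             # interpolated DgN at 50 keV
```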

  11. Off-normal performance of EBR-II [Experimental Breeder Reactor] driver fuel

    International Nuclear Information System (INIS)

    Seidel, B.R.; Batte, G.L.; Lahm, C.E.; Fryer, R.M.; Koenig, J.F.; Hofman, G.L.

    1986-09-01

    The off-normal performance of EBR-II Mark-II driver fuel has been more than satisfactory as demonstrated by robust reliability under repeated transient overpower and undercooled loss-of-flow tests, by benign run-beyond-cladding-breach behavior, and by forgiving response to fabrication defects including lack of bond. Test results have verified that the metallic driver fuel is very tolerant of off-normal events. This behavior has allowed EBR-II to operate in a combined steady-state and transient mode to provide test capability without limitation from the metallic driver fuel

  12. MATHEMATICAL ANALYSIS OF DENTAL ARCH OF CHILDREN IN NORMAL OCCLUSION: A LITERATURE REVIEW

    Directory of Open Access Journals (Sweden)

    M. Abu-Hussein DDS, MScD, MSc, DPD

    2012-03-01

    AIM. This paper is an attempt to compare and analyze the various mathematical models for defining the dental arch curvature of children in normal occlusion, based upon a review of the available literature. Background. While various studies have touched upon ways to cure or prevent dental diseases and upon surgical approaches to tooth reconstitution for correcting dental anomalies during childhood, a substantial literature also exists attempting to mathematically define the dental arch of children in normal occlusion. This paper reviews these dental studies and compares them analytically. Method. The paper compares the different mathematical approaches, highlights the basic assumptions behind each model, underscores the relevancy and applicability of each, and also lists applicable mathematical formulae. Results. Each model has been found applicable to specific research conditions, as a universal mathematical model for describing the human dental arch still eludes satisfactory definition. The models necessarily need to include the features of the dental arch, such as shape, spacing between teeth, and symmetry or asymmetry, but they also need substantial improvement. Conclusions. While the paper shows that the existing models are inadequate for properly defining the human dental arch, it also acknowledges that future research based on modern imaging techniques and computer-aided simulation could well succeed in deriving an all-inclusive definition for the human dental curve that has thus far eluded experts.

  13. Manual on environmental monitoring in normal operation

    International Nuclear Information System (INIS)

    1966-01-01

    Many establishments handling radioactive materials produce, and to some extent also discharge, radioactive waste as part of their normal operation. The radiation doses to which members of the public may be exposed during such operation must remain below the stipulated level. The purpose of this manual is to provide technical guidance for setting up programmes of routine environmental monitoring in the vicinity of nuclear establishments. The annex gives five examples of routine environmental monitoring programmes currently in use; these have been indexed separately.

  14. Relative Radiometric Normalization and Atmospheric Correction of a SPOT 5 Time Series

    Directory of Open Access Journals (Sweden)

    Matthieu Rumeau

    2008-04-01

    Multi-temporal images acquired at high spatial and temporal resolution are an important tool for detecting change and analyzing trends, especially in agricultural applications. However, to ensure a reliable use of this kind of data, a rigorous radiometric normalization step is required. Normalization can be addressed by performing an atmospheric correction of each image in the time series. The main problem is the difficulty of obtaining an atmospheric characterization at a given acquisition date. In this paper, we investigate whether relative radiometric normalization can substitute for atmospheric correction. We develop an automatic method for relative radiometric normalization based on calculating linear regressions between unnormalized and reference images. Regressions are obtained using the reflectances of automatically selected invariant targets. We compare this method with an atmospheric correction method that uses the 6S model. The performances of both methods are compared using 18 images from a SPOT 5 time series acquired over Reunion Island. Results obtained for a set of manually selected invariant targets show excellent agreement between the two methods in all spectral bands: values of the coefficient of determination (r²) exceed 0.960, and bias magnitude values are less than 2.65. There is also a strong correlation between normalized NDVI values of sugarcane fields (r² = 0.959). Despite a relative error of 12.66% between values, very comparable NDVI patterns are observed.
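
    A minimal sketch of the regression step on synthetic data: for one spectral band, a gain and offset are fit between the unnormalized and reference images over invariant-target pixels and then applied to the whole band. Target selection is hard-coded here, whereas the paper selects invariant targets automatically.

```python
import numpy as np

# Relative radiometric normalization of one band via linear regression over
# invariant-target pixels (synthetic data; target selection hard-coded).
rng = np.random.default_rng(3)
reference = rng.uniform(0.05, 0.5, (100, 100))            # reference-date band
unnormalized = 1.15 * reference + 0.02 + rng.normal(0, 0.002, (100, 100))

targets = (slice(0, 10), slice(0, 10))                    # invariant-target pixels
gain, offset = np.polyfit(unnormalized[targets].ravel(),
                          reference[targets].ravel(), deg=1)
normalized = gain * unnormalized + offset                 # apply to whole band
print(round(float(np.abs(normalized - reference).mean()), 4))  # small residual
```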

  15. Improvements to Lunar BRDF-Corrected Nighttime Satellite Imagery: Uses and Applications

    Science.gov (United States)

    Cole, Tony A.; Molthan, Andrew L.; Schultz, Lori A.; Roman, Miguel O.; Wanik, David W.

    2016-01-01

    Observations made by the VIIRS day/night band (DNB) provide daily, nighttime measurements to monitor Earth surface processes.However, these observations are impacted by variations in reflected solar radiation on the moon's surface. As the moon transitions from new to full phase, increasing radiance is reflected to the Earth's surface and contributes additional reflected moonlight from clouds and land surface, in addition to emissions from other light sources observed by the DNB. The introduction of a bi-directional reflectance distribution function (BRDF) algorithm serves to remove these lunar variations and normalize observed radiances. Provided by the Terrestrial Information Systems Laboratory at Goddard Space Flight Center, a 1 km gridded lunar BRDF-corrected DNB product and VIIRS cloud mask can be used for a multitude of nighttime applications without influence from the moon. Such applications include the detection of power outages following severe weather events using pre-and post-event DNB imagery, as well as the identification of boat features to curtail illegal fishing practices. This presentation will provide context on the importance of the lunar BRDF correction algorithm and explore the aforementioned uses of this improved DNB product for applied science applications.

  16. Improvements to Lunar BRDF-Corrected Nighttime Satellite Imagery: Uses and Applications

    Science.gov (United States)

    Cole, T.; Molthan, A.; Schultz, L. A.; Roman, M. O.; Wanik, D. W.

    2016-12-01

    Observations made by the VIIRS day/night band (DNB) provide daily, nighttime measurements to monitor Earth surface processes. However, these observations are impacted by variations in reflected solar radiation on the moon's surface. As the moon transitions from new to full phase, increasing radiance is reflected to the Earth's surface and contributes additional reflected moonlight from clouds and land surface, in addition to emissions from other light sources observed by the DNB. The introduction of a bi-directional reflectance distribution function (BRDF) algorithm serves to remove these lunar variations and normalize observed radiances. Provided by the Terrestrial Information Systems Laboratory at Goddard Space Flight Center, a 1 km gridded lunar BRDF-corrected DNB product and VIIRS cloud mask can be used for a multitude of nighttime applications without influence from the moon. Such applications include the detection of power outages following severe weather events using pre- and post-event DNB imagery, as well as the identification of boat features to curtail illegal fishing practices. This presentation will provide context on the importance of the lunar BRDF correction algorithm and explore the aforementioned uses of this improved DNB product for applied science applications.

  17. MRI of normal and pathological fetal lung development

    International Nuclear Information System (INIS)

    Kasprian, Gregor; Balassy, Csilla; Brugger, Peter C.; Prayer, Daniela

    2006-01-01

    Normal fetal lung development is a complex process influenced by mechanical and many biochemical factors. In addition to ultrasound, fetal magnetic resonance imaging (MRI) constitutes a new method to investigate this process in vivo during the second and third trimester. The techniques of MRI volumetry, assessment of signal intensities, and MRI spectroscopy of the fetal lung have been used to analyze this process and have already been applied clinically to identify abnormal fetal lung growth. Particularly in conditions such as oligohydramnios and congenital diaphragmatic hernia (CDH), pulmonary hypoplasia may be the cause of neonatal death. A precise diagnosis and quantification of compromised fetal lung development may improve post- and perinatal management. The main events in fetal lung development are reviewed and MR volumetric data from 106 normal fetuses, as well as different examples of pathological lung growth, are provided

  18. MRI of normal and pathological fetal lung development

    Energy Technology Data Exchange (ETDEWEB)

    Kasprian, Gregor [University Clinic of Radiodiagnostics, Medical University of Vienna (Austria)]. E-mail: gregor.kasprian@meduniwien.ac.at; Balassy, Csilla [University Clinic of Radiodiagnostics, Medical University of Vienna (Austria); Brugger, Peter C. [Center of Anatomy and Cell Biology, Medical University of Vienna (Austria); Prayer, Daniela [University Clinic of Radiodiagnostics, Medical University of Vienna (Austria)

    2006-02-15

    Normal fetal lung development is a complex process influenced by mechanical and many biochemical factors. In addition to ultrasound, fetal magnetic resonance imaging (MRI) constitutes a new method to investigate this process in vivo during the second and third trimester. The techniques of MRI volumetry, assessment of signal intensities, and MRI spectroscopy of the fetal lung have been used to analyze this process and have already been applied clinically to identify abnormal fetal lung growth. Particularly in conditions such as oligohydramnios and congenital diaphragmatic hernia (CDH), pulmonary hypoplasia may be the cause of neonatal death. A precise diagnosis and quantification of compromised fetal lung development may improve post- and perinatal management. The main events in fetal lung development are reviewed and MR volumetric data from 106 normal fetuses, as well as different examples of pathological lung growth, are provided.

  19. Delineating the structure of normal and abnormal personality: an integrative hierarchical approach.

    Science.gov (United States)

    Markon, Kristian E; Krueger, Robert F; Watson, David

    2005-01-01

    Increasing evidence indicates that normal and abnormal personality can be treated within a single structural framework. However, identification of a single integrated structure of normal and abnormal personality has remained elusive. Here, a constructive replication approach was used to delineate an integrative hierarchical account of the structure of normal and abnormal personality. This hierarchical structure, which integrates many Big Trait models proposed in the literature, replicated across a meta-analysis as well as an empirical study, and across samples of participants as well as measures. The proposed structure resembles previously suggested accounts of personality hierarchy and provides insight into the nature of personality hierarchy more generally. Potential directions for future research on personality and psychopathology are discussed.

  20. Delineating the Structure of Normal and Abnormal Personality: An Integrative Hierarchical Approach

    Science.gov (United States)

    Markon, Kristian E.; Krueger, Robert F.; Watson, David

    2008-01-01

    Increasing evidence indicates that normal and abnormal personality can be treated within a single structural framework. However, identification of a single integrated structure of normal and abnormal personality has remained elusive. Here, a constructive replication approach was used to delineate an integrative hierarchical account of the structure of normal and abnormal personality. This hierarchical structure, which integrates many Big Trait models proposed in the literature, replicated across a meta-analysis as well as an empirical study, and across samples of participants as well as measures. The proposed structure resembles previously suggested accounts of personality hierarchy and provides insight into the nature of personality hierarchy more generally. Potential directions for future research on personality and psychopathology are discussed. PMID:15631580

  1. Normalized STEAM-based diffusion tensor imaging provides a robust assessment of muscle tears in football players: preliminary results of a new approach to evaluate muscle injuries.

    Science.gov (United States)

    Giraudo, Chiara; Motyka, Stanislav; Weber, Michael; Karner, Manuela; Resinger, Christoph; Feiweier, Thorsten; Trattnig, Siegfried; Bogner, Wolfgang

    2018-02-08

    To assess acute muscle tears in professional football players by diffusion tensor imaging (DTI) and to evaluate the impact of normalization of the data. Eight football players with acute lower limb muscle tears were examined. DTI metrics of the injured muscle and the corresponding healthy contralateral muscle, and of ROIs drawn in the muscle tear (ROI_tear), in the corresponding healthy contralateral muscle (ROI_hc_t), in a healthy area ipsilateral to the injury (ROI_hi) and in a corresponding contralateral area (ROI_hc_i), were compared. The same comparison was performed for ratios of the injured (ROI_tear/ROI_hi) and contralateral sides (ROI_hc_t/ROI_hc_i). ANOVA, Bonferroni-corrected post-hoc and Student's t-tests were used. Analyses of the entire muscle did not show any differences (p>0.05 each) except for axial diffusivity (AD; p=0.048). ROI_tear showed higher mean diffusivity (MD) and AD than ROI_hc_t (p < ...); ... lower in ROI_tear than in ROI_hi and ROI_hc_t (p < ...) and ... in ROI_tear than in any other ROI (p < ...). ... muscle tears in athletes, especially after normalization to healthy muscle tissue. • STEAM-based DTI allows the investigation of muscle tears affecting professional football players. • Fractional anisotropy and mean diffusivity differ between injured and healthy muscle areas. • Only normalized data show differences in fibre-tracking metrics in muscle tears. • The normalization of DTI metrics enables a more robust characterization of muscle tears.

  2. Influence of vascular normalization on interstitial flow and delivery of liposomes in tumors

    International Nuclear Information System (INIS)

    Ozturk, Deniz; Yonucu, Sirin; Yilmaz, Defne; Unlu, Mehmet Burcin

    2015-01-01

    Elevated interstitial fluid pressure is one of the barriers of drug delivery in solid tumors. Recent studies have shown that normalization of tumor vasculature by anti-angiogenic factors may improve the delivery of conventional cytotoxic drugs, possibly by increasing blood flow, decreasing interstitial fluid pressure, and enhancing the convective transvascular transport of drug molecules. Delivery of large therapeutic agents such as nanoparticles and liposomes might also benefit from normalization therapy since their transport depends primarily on convection. In this study, a mathematical model is presented to provide supporting evidence that normalization therapy may improve the delivery of 100 nm liposomes into solid tumors, by both increasing the total drug extravasation and providing a more homogeneous drug distribution within the tumor. However these beneficial effects largely depend on tumor size and are stronger for tumors within a certain size range. It is shown that this size effect may persist under different microenvironmental conditions and for tumors with irregular margins or heterogeneous blood supply. (paper)

  3. The Normal Distribution

    Indian Academy of Sciences (India)

    An optimal way of choosing sample size in an opinion poll is indicated using the normal distribution. Introduction. In this article, the ubiquitous normal distribution is introduced as a convenient approximation for computing binomial probabilities for large values of n. Stirling's formula and the DeMoivre-Laplace theorem ...
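
    The approximation this record refers to is the DeMoivre-Laplace (local limit) theorem; a standard statement, in generic notation rather than the article's:

```latex
% For X ~ Binomial(n, p), with q = 1 - p and k near np:
\Pr(X = k) = \binom{n}{k} p^{k} q^{\,n-k}
  \;\approx\; \frac{1}{\sqrt{2\pi n p q}}
  \exp\!\left( -\frac{(k - np)^{2}}{2 n p q} \right),
  \qquad n \to \infty .
```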

  4. Comparison of spectrum normalization techniques for univariate ...

    Indian Academy of Sciences (India)

    Laser-induced breakdown spectroscopy; univariate study; normalization models; stainless steel; standard error of prediction. Abstract. Analytical performance of six different spectrum normalization techniques, namely internal normalization, normalization with total light, normalization with background, along with their ...
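
    As a simple illustration of the first technique listed, internal normalization divides each analyte line intensity by the intensity of an internal-standard line from the matrix measured in the same spectrum; the sketch below uses invented intensities and element lines.

```python
# Internal normalization with invented numbers: each analyte line intensity is
# divided by the intensity of an internal-standard (matrix) line from the same
# spectrum, compensating for shot-to-shot fluctuations.
analyte_line = [1520.0, 1710.0, 1985.0]       # e.g. Cr I line counts per shot
reference_line = [50200.0, 55100.0, 61000.0]  # e.g. Fe I matrix line counts

normalized = [a / r for a, r in zip(analyte_line, reference_line)]
print([round(x, 5) for x in normalized])
```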

  5. Application of in situ current normalized PIGE method for determination of total boron and its isotopic composition

    International Nuclear Information System (INIS)

    Chhillar, Sumit; Acharya, R.; Sodaye, S.; Pujari, P.K.

    2014-01-01

    A particle induced gamma-ray emission (PIGE) method using a proton beam has been standardized for the determination of the isotopic composition of natural boron and enriched boron samples. Target pellets of boron standard and samples were prepared in a cellulose matrix. The prompt gamma rays of 429 keV, 718 keV and 2125 keV were measured from the 10B(p,αγ)7Be, 10B(p,p'γ)10B and 11B(p,p'γ)11B nuclear reactions, respectively. For normalizing beam current variations, the in situ current normalization method was used. Validation of the method was carried out using synthetic samples of boron carbide, borax, borazine and lithium metaborate in a cellulose matrix. (author)

  6. Dynamical systems with applications using MATLAB

    CERN Document Server

    Lynch, Stephen

    2014-01-01

    This textbook, now in its second edition, provides a broad introduction to both continuous and discrete dynamical systems, the theory of which is motivated by examples from a wide range of disciplines. It emphasizes applications and simulation utilizing MATLAB®, Simulink®, the Image Processing Toolbox™, and the Symbolic Math Toolbox™, including MuPAD. Features new to the second edition include, sections on series solutions of ordinary differential equations, perturbation methods, normal forms, Gröbner bases, and chaos synchronization; chapters on image processing and binary oscillator computing; hundreds of new illustrations, examples, and exercises with solutions; and over eighty up-to-date MATLAB® program files and Simulink model files available online. These files were voted MATLAB® Central Pick of the Week in July 2013.  The hands-on approach of Dynamical Systems with Applications using MATLAB®, Second Edition, has minimal prerequisites, only requiring familiarity with ordinary differential equ...

  7. Method of using of the Box-Cox transformation at the application of the xbar and s chart

    Directory of Open Access Journals (Sweden)

    Eftimie Nicolae

    2017-01-01

    The application of most statistical process control techniques is based on the assumption that the distribution of the measurements is normal. However, there are many situations in practice in which the process data distribution is not normal. In certain cases, the Box-Cox transformation can be used to convert the process data distribution into a normal distribution. Considering these aspects, the paper presents a method of applying the xbar and s chart that can be used when the measurements distribution is not normal. The proposed method consists of the following stages: testing the normality of the process data, applying the Box-Cox transformation, and testing the normality of the transformed data. When the distribution of the transformed data is normal, the transformed data are used in the application of the xbar and s control chart.
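
    A sketch of the proposed workflow on synthetic skewed data, assuming subgroup size 5 (A3 = 1.427 is the standard control chart constant for n = 5): test normality, apply the Box-Cox transformation, re-test, then build xbar chart limits on the transformed data. The s chart limits are omitted for brevity.

```python
import numpy as np
from scipy import stats

# Workflow sketch: normality test, Box-Cox transform, re-test, then xbar chart
# limits on the transformed data. Subgroup size 5; A3 = 1.427 for n = 5.
rng = np.random.default_rng(4)
data = rng.lognormal(mean=0.0, sigma=0.5, size=100)   # skewed process data

print("raw normality p-value:", round(stats.shapiro(data).pvalue, 4))
transformed, lam = stats.boxcox(data)                 # lambda estimated by MLE
print("transformed p-value:", round(stats.shapiro(transformed).pvalue, 4))

subgroups = transformed.reshape(20, 5)                # 20 subgroups of size 5
xbar = subgroups.mean(axis=1)
sbar = subgroups.std(axis=1, ddof=1).mean()
center, A3 = xbar.mean(), 1.427
print("xbar UCL/LCL:", center + A3 * sbar, center - A3 * sbar)
```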

  8. New application of computerized procedures to normal operating procedures

    Energy Technology Data Exchange (ETDEWEB)

    Medrano Carbajo, J.; Mendez Salguero, J.

    2012-07-01

    Technologies currently installed in operating power plants suffer increasingly from obsolescence and equipment supply problems. For this reason, projects migrating systems from analogue to digital technology are spreading to a greater or lesser extent. These systems, in line with the current situation of the industry, provide solutions to such problems as well as other advantages.

  9. Vocal fold contact patterns based on normal modes of vibration.

    Science.gov (United States)

    Smith, Simeon L; Titze, Ingo R

    2018-05-17

    The fluid-structure interaction and energy transfer from respiratory airflow to self-sustained vocal fold oscillation continues to be a topic of interest in vocal fold research. Vocal fold vibration is driven by pressures on the vocal fold surface, which are determined by the shape of the glottis and the contact between vocal folds. Characterization of three-dimensional glottal shapes and contact patterns can lead to increased understanding of normal and abnormal physiology of the voice, as well as to development of improved vocal fold models, but a large inventory of shapes has not been directly studied previously. This study aimed to take an initial step toward characterizing vocal fold contact patterns systematically. Vocal fold motion and contact were modeled based on normal mode vibration, as it has been shown that vocal fold vibration can be almost entirely described by only the few lowest order vibrational modes. Symmetric and asymmetric combinations of the four lowest normal modes of vibration were superimposed on left and right vocal fold medial surfaces, for each of three prephonatory glottal configurations, according to a surface wave approach. Contact patterns were generated from the interaction of modal shapes at 16 normalized phases during the vibratory cycle. Eight major contact patterns were identified and characterized by the shape of the flow channel, with the following descriptors assigned: convergent, divergent, convergent-divergent, uniform, split, merged, island, and multichannel. Each of the contact patterns and its variation are described, and future work and applications are discussed.
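
    The superposition-and-contact procedure is easy to sketch in code. The toy below is one-dimensional, and every mode shape, amplitude, phase and half-gap value is invented for illustration; the study itself superimposes four modes on full 3-D medial surfaces.

```python
import numpy as np

z = np.linspace(0.0, 1.0, 100)   # normalized position along the medial surface
half_gap = 0.05                  # hypothetical prephonatory half-width of the glottis

def medial_displacement(z, t, amps, phases):
    """Superpose sinusoidal normal modes, each oscillating in time."""
    return sum(a * np.sin(np.pi * (k + 1) * z) * np.cos(2 * np.pi * t + ph)
               for k, (a, ph) in enumerate(zip(amps, phases)))

for t in np.linspace(0, 1, 16, endpoint=False):      # 16 normalized phases
    left = -half_gap + medial_displacement(z, t, amps=[0.06, 0.02], phases=[0.0, 1.2])
    right = half_gap - medial_displacement(z, t, amps=[0.06, 0.02], phases=[0.4, 1.0])
    contact = left >= right                           # folds touch where the gap closes
    print(f"phase {t:.3f}: contact fraction = {contact.mean():.2f}")
```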

  10. Abnormal computerized dynamic posturography findings in dizzy patients with normal ENG results.

    Science.gov (United States)

    Sataloff, Robert T; Hawkshaw, Mary J; Mandel, Heidi; Zwislewski, Amy B; Armour, Jonathan; Mandel, Steven

    2005-04-01

    The complexities of the balance system create difficulties for professionals interested in testing equilibrium function objectively. Traditionally, electronystagmography (ENG) has been used for this purpose, but it provides information on only a limited portion of the equilibrium system. Computerized dynamic posturography (CDP) is less specific than ENG, but it provides more global insight into a patient's ability to maintain equilibrium under more challenging environmental circumstances. CDP also appears to be valuable in obtaining objective confirmation of an abnormality in some dizzy patients whose ENG findings are normal. Our review of 33 patients with normal ENG results and abnormal CDP findings suggests that posturography is useful for confirming or quantifying a balance abnormality in some patients whose complaints cannot be confirmed by other tests frequently used by otologists.

  11. Ex vivo characterization of normal and adenocarcinoma colon samples by Mueller matrix polarimetry.

    Science.gov (United States)

    Ahmad, Iftikhar; Ahmad, Manzoor; Khan, Karim; Ashraf, Sumara; Ahmad, Shakil; Ikram, Masroor

    2015-05-01

    Mueller matrix polarimetry along with polar decomposition algorithm was employed for the characterization of ex vivo normal and adenocarcinoma human colon tissues by polarized light in the visible spectral range (425-725 nm). Six derived polarization metrics [total diattenuation (DT), retardance (RT), depolarization (ΔT), linear diattenuation (DL), retardance (δ), and depolarization (ΔL)] were compared for normal and adenocarcinoma colon tissue samples. The results show that all six polarimetric properties for adenocarcinoma samples were significantly higher as compared to the normal samples for all wavelengths. The Wilcoxon rank sum test illustrated that total retardance is a good candidate for the discrimination of normal and adenocarcinoma colon samples. Support vector machine classification for normal and adenocarcinoma based on the four polarization properties spectra (ΔT, ΔL, RT, and δ) yielded 100% accuracy, sensitivity, and specificity, while both DT and DL showed 66.6%, 33.3%, and 83.3% accuracy, sensitivity, and specificity, respectively. The combination of polarization analysis and given classification methods provides a framework to distinguish the normal and cancerous tissues.

  12. Spinal cord normalization in multiple sclerosis.

    Science.gov (United States)

    Oh, Jiwon; Seigo, Michaela; Saidha, Shiv; Sotirchos, Elias; Zackowski, Kathy; Chen, Min; Prince, Jerry; Diener-West, Marie; Calabresi, Peter A; Reich, Daniel S

    2014-01-01

    Spinal cord (SC) pathology is common in multiple sclerosis (MS), and measures of SC-atrophy are increasingly utilized. Normalization reduces biological variation of structural measurements unrelated to disease, but optimal parameters for SC volume (SCV)-normalization remain unclear. Using a variety of normalization factors and clinical measures, we assessed the effect of SCV normalization on detecting group differences and clarifying clinical-radiological correlations in MS. 3T cervical SC-MRI was performed in 133 MS cases and 11 healthy controls (HC). Clinical assessment included expanded disability status scale (EDSS), MS functional composite (MSFC), quantitative hip-flexion strength ("strength"), and vibration sensation threshold ("vibration"). SCV between C3 and C4 was measured and normalized individually by subject height, SC-length, and intracranial volume (ICV). There were group differences in raw-SCV and after normalization by height and length (MS vs. HC; progressive vs. relapsing MS-subtypes). Clinical-radiological correlations were strongest with normalization by length (EDSS: r = -.43; MSFC: r = .33; strength: r = .38; vibration: r = -.40) and height (EDSS: r = -.26; MSFC: r = .28; strength: r = .22; vibration: r = -.29), but diminished with normalization by ICV (EDSS: r = -.23; MSFC: r = -.10; strength: r = .23; vibration: r = -.35). In relapsing MS, normalization by length allowed statistical detection of correlations that were not apparent with raw-SCV. SCV-normalization by length improves the ability to detect group differences, strengthens clinical-radiological correlations, and is particularly relevant in settings of subtle disease-related SC-atrophy in MS. SCV-normalization by length may enhance the clinical utility of measures of SC-atrophy.

  13. The triangular density to approximate the normal density: decision rules-of-thumb

    International Nuclear Information System (INIS)

    Scherer, William T.; Pomroy, Thomas A.; Fuller, Douglas N.

    2003-01-01

    In this paper we explore the approximation of the normal density function with the triangular density function, a density function that has extensive use in risk analysis. Such an approximation generates a simple piecewise-linear density function and a piecewise-quadratic distribution function that can be easily manipulated mathematically and that produces surprisingly accurate performance in many instances. This mathematical tractability proves useful when it enables closed-form solutions not otherwise possible, as with problems involving the embedded use of the normal density. For benchmarking purposes we compare the basic triangular approximation with two flared triangular distributions and with two simple uniform approximations; however, throughout the paper our focus is on using the triangular density to approximate the normal for reasons of parsimony. We also investigate the logical extensions of using a non-symmetric triangular density to approximate a lognormal density. Several issues associated with using a triangular density as a substitute for the normal and lognormal densities are discussed, and we explore the resulting numerical approximation errors for the normal case. Finally, we present several examples that highlight simple decision rules-of-thumb that the use of the approximation generates. Such rules-of-thumb, which are useful in risk and reliability analysis and general business analysis, can be difficult or impossible to extract without the use of approximations. These examples include uses of the approximation in generating random deviates, uses in mixture models for risk analysis, and an illustrative decision analysis problem. It is our belief that this exploratory look at the triangular approximation to the normal will provoke other practitioners to explore its possible use in various domains and applications.
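
    As a quick illustration of the idea, the sketch below matches a symmetric triangular density to a standard normal by moments. The moment-matching rule (half-width h = σ√6, so that the symmetric triangle's variance h²/6 equals σ²) is a standard choice assumed here, not necessarily the paper's construction.

```python
import numpy as np
from scipy import stats

mu, sigma = 0.0, 1.0
h = sigma * np.sqrt(6.0)                             # symmetric triangle: variance = h^2 / 6
tri = stats.triang(c=0.5, loc=mu - h, scale=2 * h)   # mode at the centre
norm = stats.norm(mu, sigma)

# Compare the two densities at a few points.
for x in np.linspace(-3, 3, 7):
    print(f"x={x:+.1f}  normal pdf={norm.pdf(x):.4f}  triangular pdf={tri.pdf(x):.4f}")

# Worst-case CDF disagreement -- a quick check of the approximation quality.
grid = np.linspace(mu - 4 * sigma, mu + 4 * sigma, 2001)
print("max |CDF difference|:", np.max(np.abs(norm.cdf(grid) - tri.cdf(grid))))
```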

  14. Signal transduction by normal isoforms and W mutant variants of the Kit receptor tyrosine kinase.

    OpenAIRE

    Reith, A D; Ellis, C; Lyman, S D; Anderson, D M; Williams, D E; Bernstein, A; Pawson, T

    1991-01-01

    Germline mutations at the Dominant White Spotting (W) and Steel (Sl) loci have provided conclusive genetic evidence that c-kit mediated signal transduction pathways are essential for normal mouse development. We have analysed the interactions of normal and mutant W/c-kit gene products with cytoplasmic signalling proteins, using transient c-kit expression assays in COS cells. In addition to the previously identified c-kit gene product (Kit+), a second normal Kit isoform (KitA+) containing an i...

  15. Approach of the value of an annuity when non-central moments of the capitalization factor are known: an R application with interest rates following normal and beta distributions

    Directory of Open Access Journals (Sweden)

    Salvador Cruz Rambaud

    2015-07-01

    Full Text Available This paper proposes an expression of the value of an annuity with payments of 1 unit each when the interest rate is random. In order to attain this objective, we proceed on the assumption that the non-central moments of the capitalization factor are known. Specifically, to calculate the value of these annuities, we propose two different expressions. First, we suppose that the random interest rate is normally distributed; then, we assume that it follows the beta distribution. A practical application of these two methodologies is also implemented using the R statistical software.
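
    A Monte Carlo version of the computation makes a useful cross-check of moment-based formulas. The sketch below estimates the expected value of an n-payment unit annuity for normally and beta-distributed rates; all parameter values are invented, and the estimator averages the present value over simulated rates rather than using the paper's closed-form moment expressions.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10  # number of unit payments

def expected_annuity_value(i_draws: np.ndarray, n: int) -> float:
    """Estimate E[sum_{k=1..n} (1+i)^(-k)] by averaging over simulated rates."""
    k = np.arange(1, n + 1)
    pv = ((1.0 + i_draws[:, None]) ** (-k)).sum(axis=1)
    return pv.mean()

i_normal = rng.normal(loc=0.05, scale=0.01, size=100_000)    # i ~ N(5%, 1%)
i_beta = 0.02 + 0.06 * rng.beta(2.0, 2.0, size=100_000)      # beta rescaled to [2%, 8%]

print("E[a_n], normal rates:", expected_annuity_value(i_normal, n))
print("E[a_n], beta rates:  ", expected_annuity_value(i_beta, n))
```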

  16. Deformation around basin scale normal faults

    International Nuclear Information System (INIS)

    Spahic, D.

    2010-01-01

    in the central Vienna Basin from commercial 3D seismic data. In addition to detailed conventional fault analysis (displacement and fault shape), syn- and anticlinal structures of sedimentary horizons occurring both in hanging wall and footwall are assessed. Reverse drag geometries of variable magnitudes are found to correlate with local displacement maxima along the fault. In contrast, normal drag is observed along segment boundaries and relay zones. Thus, the detailed documentation of the distribution, type and magnitude of fault drag provides additional information on the fault evolution, as initial fault segments as well as linkage or relay zones can be identified. (author) [de

  17. Radiosensitive Down syndrome lymphoblastoid lines have normal ionizing-radiation-induced inhibition of DNA synthesis

    International Nuclear Information System (INIS)

    Ganges, M.B.; Robbins, J.H.; Jiang, H.; Hauser, C.; Tarone, R.E.

    1988-01-01

    The extent of X-ray-induced inhibition of DNA synthesis was determined in radiosensitive lymphoblastoid lines from 3 patients with Down syndrome and 3 patients with ataxia telangiectasia (AT). Compared to 6 normal control lines, the 3 AT lines were abnormally resistant to X-ray-induced inhibition of DNA synthesis, while the 3 Down syndrome lines had normal inhibition. These results demonstrate that radiosensitive human cells can have normal X-ray-induced inhibition of DNA synthesis and provide new evidence for the dissociation of radioresistant DNA synthesis. (author). 27 refs.; 1 fig.; 1 tab

  18. Stem Cell Therapy to Reduce Radiation-Induced Normal Tissue Damage

    NARCIS (Netherlands)

    Coppes, Rob P.; van der Goot, Annemieke; Lombaert, Isabelle M. A.

    Normal tissue damage after radiotherapy is still a major problem in cancer treatment. Stem cell therapy may provide a means to reduce radiation-induced side effects and improve the quality of life of patients. This review discusses the current status in stem cell research with respect to their

  19. Portable electrocardiograph through android application.

    Science.gov (United States)

    De Oliveira, Igor H; Cene, V H; Balbinot, A

    2015-01-01

    An electrocardiograph was designed and implemented, capable of obtaining electrical signals from the heart and sending this data via Bluetooth to a tablet, on which the signals are graphically displayed. The user interface is developed as an Android application. Because of technological progress and the increasing use of fully portable systems, such as tablets and cell phones, it is important to understand the functioning and development of an application, which provides a basis for conducting studies using this technology as an interface. The project development combines concepts of electronics and their application to achieve a portable and functional final design, and uses the ADS1294, a programmable integrated circuit specific to electrocardiogram, electroencephalogram and electromyogram acquisition. Using a simulator of cardiac signals, 36 different waveforms were recorded, including normal sinus rhythm, arrhythmias and artifacts. Simulations include variations of heart rate from 30 to 190 beats per minute (BPM), with variations in peak amplitude of 1 mV to 2 mV. Tests were performed with a subject at rest and in motion, observing the signals obtained and the degradation of their interpretation caused by the introduction of muscle movement artifacts in motion situations.

  20. A New MRI-Based Model of Heart Function with Coupled Hemodynamics and Application to Normal and Diseased Canine Left Ventricles

    Science.gov (United States)

    Choi, Young Joon; Constantino, Jason; Vedula, Vijay; Trayanova, Natalia; Mittal, Rajat

    2015-01-01

    A methodology for the simulation of heart function that combines an MRI-based model of cardiac electromechanics (CE) with a Navier–Stokes-based hemodynamics model is presented. The CE model consists of two coupled components that simulate the electrical and the mechanical functions of the heart. Accurate representations of ventricular geometry and fiber orientations are constructed from the structural magnetic resonance and the diffusion tensor MR images, respectively. The deformation of the ventricle obtained from the electromechanical model serves as input to the hemodynamics model in this one-way coupled approach via imposed kinematic wall velocity boundary conditions and at the same time, governs the blood flow into and out of the ventricular volume. The time-dependent endocardial surfaces are registered using a diffeomorphic mapping algorithm, while the intraventricular blood flow patterns are simulated using a sharp-interface immersed boundary method-based flow solver. The utility of the combined heart-function model is demonstrated by comparing the hemodynamic characteristics of a normal canine heart beating in sinus rhythm against that of the dyssynchronously beating failing heart. We also discuss the potential of coupled CE and hemodynamics models for various clinical applications. PMID:26442254

  1. Single cell analysis of normal and leukemic hematopoiesis.

    Science.gov (United States)

    Povinelli, Benjamin J; Rodriguez-Meira, Alba; Mead, Adam J

    2018-02-01

    The hematopoietic system is well established as a paradigm for the study of cellular hierarchies, their disruption in disease and therapeutic use in regenerative medicine. Traditional approaches to study hematopoiesis involve purification of cell populations based on a small number of surface markers. However, such population-based analysis obscures underlying heterogeneity contained within any phenotypically defined cell population. This heterogeneity can only be resolved through single cell analysis. Recent advances in single cell techniques allow analysis of the genome, transcriptome, epigenome and proteome in single cells at an unprecedented scale. The application of these new single cell methods to investigate the hematopoietic system has led to paradigm shifts in our understanding of cellular heterogeneity in hematopoiesis and how this is disrupted in disease. In this review, we summarize how single cell techniques have been applied to the analysis of hematopoietic stem/progenitor cells in normal and malignant hematopoiesis, with a particular focus on recent advances in single-cell genomics, including how these might be utilized for clinical application.

  2. Using color histogram normalization for recovering chromatic illumination-changed images.

    Science.gov (United States)

    Pei, S C; Tseng, C L; Wu, C C

    2001-11-01

    We propose a novel image-recovery method using the covariance matrix of the red-green-blue (R-G-B) color histogram and tensor theories. The image-recovery method is called the color histogram normalization algorithm. It is known that the color histograms of an image taken under varied illuminations are related by a general affine transformation of the R-G-B coordinates when the illumination is changed. We propose a simplified affine model for application with illumination variation. This simplified affine model considers the effects of only three basic forms of distortion: translation, scaling, and rotation. According to this principle, we can estimate the affine transformation matrix necessary to recover images whose color distributions are varied as a result of illumination changes. We compare the normalized color histogram of the standard image with that of the tested image. By performing some operations of simple linear algebra, we can estimate the matrix of the affine transformation between two images under different illuminations. To demonstrate the performance of the proposed algorithm, we divide the experiments into two parts: computer-simulated images and real images corresponding to illumination changes. Simulation results show that the proposed algorithm is effective for both types of images. We also explain the noise-sensitive skew-rotation estimation that exists in the general affine model and demonstrate that the proposed simplified affine model without the use of skew rotation is better than the general affine model for such applications.
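
    The simplified model can be exercised with a few lines of linear algebra. The sketch below uses a standard mean/covariance moment match via symmetric matrix square roots to estimate an affine map between two color distributions; it illustrates the idea of the abstract rather than reproducing the paper's exact estimator, and the toy data are invented.

```python
import numpy as np
from scipy.linalg import sqrtm

def estimate_affine(ref_pixels: np.ndarray, test_pixels: np.ndarray):
    """Return (A, b) so that A @ x + b maps test colors toward the reference
    distribution, matching mean and covariance. Inputs are (N, 3) float arrays."""
    m_ref, m_test = ref_pixels.mean(axis=0), test_pixels.mean(axis=0)
    S_ref = np.cov(ref_pixels, rowvar=False)
    S_test = np.cov(test_pixels, rowvar=False)
    A = np.real(sqrtm(S_ref) @ np.linalg.inv(sqrtm(S_test)))
    b = m_ref - A @ m_test
    return A, b

# Toy check: a known affine illumination change is recovered up to noise.
rng = np.random.default_rng(1)
ref = rng.normal([120, 100, 90], [25, 20, 15], size=(5000, 3))
test = ref @ np.diag([0.8, 1.1, 0.9]) + np.array([10.0, -5.0, 8.0])  # "new illumination"
A, b = estimate_affine(ref, test)
recovered = test @ A.T + b
print("channel-mean error after recovery:", np.abs(recovered.mean(0) - ref.mean(0)))
```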

  3. Normal uniform mixture differential gene expression detection for cDNA microarrays

    Directory of Open Access Journals (Sweden)

    Raftery Adrian E

    2005-07-01

    Full Text Available Abstract Background One of the primary tasks in analysing gene expression data is finding genes that are differentially expressed in different samples. Multiple testing issues due to the thousands of tests run make some of the more popular methods for doing this problematic. Results We propose a simple method, Normal Uniform Differential Gene Expression (NUDGE) detection, for finding differentially expressed genes in cDNA microarrays. The method uses a simple univariate normal-uniform mixture model, in combination with new normalization methods for spread as well as mean that extend the lowess normalization of Dudoit, Yang, Callow and Speed (2002). It takes account of multiple testing, and gives probabilities of differential expression as part of its output. It can be applied to either single-slide or replicated experiments, and it is very fast. Three datasets are analyzed using NUDGE, and the results are compared to those given by other popular methods: unadjusted and Bonferroni-adjusted t tests, Significance Analysis of Microarrays (SAM), and Empirical Bayes for microarrays (EBarrays) with both Gamma-Gamma and Lognormal-Normal models. Conclusion The method gives a high probability of differential expression to genes known/suspected a priori to be differentially expressed and a low probability to the others. In terms of known false positives and false negatives, the method outperforms all multiple-replicate methods except for the Gamma-Gamma EBarrays method, to which it offers comparable results with the added advantages of greater simplicity, speed, fewer assumptions and applicability to the single replicate case. An R package called nudge to implement the methods in this paper will be made available soon at http://www.bioconductor.org.
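
    A generic normal-uniform mixture can be fit with a few lines of EM, which is the model family NUDGE uses (a normal component for non-differential genes, a uniform component for differential ones). The sketch below is an illustration of that model on synthetic data, not the nudge package's implementation.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic normalized log-ratios: 95% null near 0, 5% "differentially expressed".
z = np.concatenate([rng.normal(0, 0.3, 950), rng.uniform(-3, 3, 50)])

u = 1.0 / (z.max() - z.min())   # fixed uniform density over the data range
p, mu, sd = 0.05, 0.0, 1.0      # initial mixing weight and normal parameters

for _ in range(200):
    # E-step: posterior probability that each gene is differential (uniform component).
    num = p * u
    w = num / (num + (1 - p) * stats.norm.pdf(z, mu, sd))
    # M-step: update the mixing weight and the normal component's parameters.
    p = w.mean()
    mu = np.sum((1 - w) * z) / np.sum(1 - w)
    sd = np.sqrt(np.sum((1 - w) * (z - mu) ** 2) / np.sum(1 - w))

print(f"estimated fraction differential: {p:.3f}, null sd: {sd:.3f}")
```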

  4. Design of Provider-Provisioned Website Protection Scheme against Malware Distribution

    Science.gov (United States)

    Yagi, Takeshi; Tanimoto, Naoto; Hariu, Takeo; Itoh, Mitsutaka

    Vulnerabilities in web applications expose computer networks to security threats, and many websites are used by attackers as hopping sites to attack other websites and user terminals. These incidents prevent service providers from constructing secure networking environments. To protect websites from attacks exploiting vulnerabilities in web applications, service providers use web application firewalls (WAFs). WAFs filter accesses from attackers by using signatures, which are generated based on the exploit codes of previous attacks. However, WAFs cannot filter unknown attacks because the signatures cannot reflect new types of attacks. In service provider environments, the number of exploit codes has recently increased rapidly because of the spread of vulnerable web applications that have been developed through cloud computing. Thus, generating signatures for all exploit codes is difficult. To solve these problems, our proposed scheme detects and filters malware downloads that are sent from websites which have already received exploit codes. In addition, to collect information for detecting malware downloads, web honeypots, which automatically extract the communication records of exploit codes, are used. According to the results of experiments using a prototype, our scheme can filter attacks automatically so that service providers can provide secure and cost-effective network environments.

  5. Imaging the corpus callosum, septum pellucidum and fornix in children: normal anatomy and variations of normality

    International Nuclear Information System (INIS)

    Griffiths, Paul D.; Batty, Ruth; Connolly, Dan J.A.; Reeves, Michael J.

    2009-01-01

    The midline structures of the supra-tentorial brain are important landmarks for judging if the brain has formed correctly. In this article, we consider the normal appearances of the corpus callosum, septum pellucidum and fornix as shown on MR imaging in normal and near-normal states. (orig.)

  6. EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA); Scientific Opinion on the substantiation of a health claim related to OptiEFAX™ and maintenance of normal blood HDL-cholesterol concentrations pursuant to Article 13(5) of Regulation (EC) No 1924/2006

    DEFF Research Database (Denmark)

    Tetens, Inge

    on the scientific substantiation of a health claim related to OptiEFAX™ and maintenance of normal blood HDL-cholesterol concentrations. The food that is the subject of the health claim, OptiEFAX™, which is standardised pure krill oil, is sufficiently characterised in relation to the claimed effect. The claimed...... effect, maintenance of normal blood HDL-cholesterol concentrations, is a beneficial physiological effect. The target population proposed by the applicant is the general population. No human studies have been provided from which conclusions could be drawn for the scientific substantiation of the claim....... A cause and effect relationship has not been established between the consumption of OptiEFAX™ and maintenance of normal blood HDL-cholesterol concentrations....

  7. EFSA Panel on Dietetic Products, Nutrition and Allergies (NDA); Scientific Opinion on the substantiation of a health claim related to OptiEFAX™ and maintenance of normal blood LDL-cholesterol concentrations pursuant to Article 13(5) of Regulation (EC) No 1924/2006

    DEFF Research Database (Denmark)

    Tetens, Inge

    substantiation of a health claim related to OptiEFAX™ and maintenance of normal blood LDL-cholesterol concentrations. The food that is the subject of the health claim, OptiEFAX™, which is standardised pure krill oil, is sufficiently characterised in relation to the claimed effect. The claimed effect, maintenance...... of normal blood LDL-cholesterol concentrations, is a beneficial physiological effect. The target population proposed by the applicant is the general population. No human studies have been provided from which conclusions could be drawn for the scientific substantiation of the claim. A cause and effect...... relationship has not been established between the consumption of OptiEFAX™ and maintenance of normal blood LDL-cholesterol concentrations....

  8. P Status In Andisol And P Content In Arabica Coffee Seedling Leaves Due To The Application Of Phosphate Providing Microorganisms And Organic Matters In Bener Meriah District

    Directory of Open Access Journals (Sweden)

    Hifnalisa

    2017-09-01

    Full Text Available Bener Meriah district is one of the arabica coffee producing regions in Indonesia. Most of the arabica coffee in Bener Meriah district is grown on Andisols, in which the availability of P is generally very low. Phosphate-providing microorganisms and organic matter can be used to increase Andisol P availability. The objective of this study was to examine the effect of applying phosphate-providing microorganisms and organic matter on P status in Andisols and P content in arabica coffee seedling leaves in Bener Meriah district. The experiment used a randomized block design with two factors. Factor I was the application of phosphate-providing microorganisms: without microorganisms, Glomus sp., Kurthia sp., Corynebacterium sp. and Listeria sp. Factor II was the application of organic matter: T. diversifolia and coffee bean skins. The results showed that Glomus sp., Kurthia sp., Corynebacterium sp. and Listeria sp. decreased soil P-retention by 2.38, 5.12, 7.48 and 9.17, respectively; increased soil P-available by 24.85, 36.03, 52.79 and 77.33, respectively; and increased P content in the arabica coffee seedling leaves by 22.22, 33.33, 37.03 and 72.27, respectively, compared to no application of microorganisms. The application of coffee bean skins resulted in lower soil P-retention and higher soil P-available and leaf P content than T. diversifolia. The combination of Listeria sp. and coffee bean skins resulted in the lowest soil P-retention and the highest soil P-available and P content in arabica coffee seedling leaves.

  9. Iterative closest normal point for 3D face recognition.

    Science.gov (United States)

    Mohammadzade, Hoda; Hatzinakos, Dimitrios

    2013-02-01

    The common approach for 3D face recognition is to register a probe face to each of the gallery faces and then calculate the sum of the distances between their points. This approach is computationally expensive and sensitive to facial expression variation. In this paper, we introduce the iterative closest normal point method for finding the corresponding points between a generic reference face and every input face. The proposed correspondence finding method samples a set of points for each face, denoted as the closest normal points. These points are effectively aligned across all faces, enabling effective application of discriminant analysis methods for 3D face recognition. As a result, the expression variation problem is addressed by minimizing the within-class variability of the face samples while maximizing the between-class variability. As an important conclusion, we show that the surface normal vectors of the face at the sampled points contain more discriminatory information than the coordinates of the points. We have performed comprehensive experiments on the Face Recognition Grand Challenge database, which is presently the largest available 3D face database. We have achieved verification rates of 99.6 and 99.2 percent at a false acceptance rate of 0.1 percent for the all versus all and ROC III experiments, respectively; to the best of our knowledge, these error rates are seven and four times lower, respectively, than those of the best existing methods on this database.

  10. The measurement of intrinsic cellular radiosensitivity in human tumours and normal tissues

    International Nuclear Information System (INIS)

    Lawton, P.A.

    1995-01-01

    Human tumour and normal cell radiosensitivity are thought to be important factors determining the response of tumour and normal tissues to radiotherapy, respectively. Clonogenic assays are the standard method for measuring radiosensitivity but they are of limited applicability for clinical use with fresh human tumours. The main aim of this work was to evaluate the Adhesive Tumour Cell Culture System (ATCCS), as a method for measuring the radiosensitivity of human tumours. A soft agar clonogenic assay, the modified Courtenay-Mills assay, was used as a standard to compare with the ATCCS. The demonstration that fibroblast contamination could occur with both assay methods led to the investigation of a new technique for removing unwanted fibroblasts from tumour cell suspensions and to the use of a multiwell assay for measuring fibroblast radiosensitivity. (author)

  11. The measurement of intrinsic cellular radiosensitivity in human tumours and normal tissues

    Energy Technology Data Exchange (ETDEWEB)

    Lawton, P.A.

    1995-12-31

    Human tumour and normal cell radiosensitivity are thought to be important factors determining the response of tumour and normal tissues to radiotherapy, respectively. Clonogenic assays are the standard method for measuring radiosensitivity but they are of limited applicability for clinical use with fresh human tumours. The main aim of this work was to evaluate the Adhesive Tumour Cell Culture System (ATCCS), as a method for measuring the radiosensitivity of human tumours. A soft agar clonogenic assay, the modified Courtenay-Mills assay, was used as a standard to compare with the ATCCS. The demonstration that fibroblast contamination could occur with both assay methods led to the investigation of a new technique for removing unwanted fibroblasts from tumour cell suspensions and to the use of a multiwell assay for measuring fibroblast radiosensitivity. (author).

  12. An alternative reference space for H&E color normalization.

    Directory of Open Access Journals (Sweden)

    Mark D Zarella

    Full Text Available Digital imaging of H&E stained slides has enabled the application of image processing to support pathology workflows. Potential applications include computer-aided diagnostics, advanced quantification tools, and innovative visualization platforms. However, the intrinsic variability of biological tissue and the vast differences in tissue preparation protocols often lead to significant image variability that can hamper the effectiveness of these computational tools. We developed an alternative representation for H&E images that operates within a space that is more amenable to many of these image processing tools. The algorithm to derive this representation operates by exploiting the correlation between color and the spatial properties of the biological structures present in most H&E images. In this way, images are transformed into a structure-centric space in which images are segregated into tissue structure channels. We demonstrate that this framework can be extended to achieve color normalization, effectively reducing inter-slide variability.

  13. Operational behaviour of a reactor normal operation and disturbances

    International Nuclear Information System (INIS)

    Geyer, K.H.

    1982-01-01

    During normal operation, the following topics are dealt with: primary and secondary coolant circuits - full load operation - start-up and shutdown - steady state part load diagram. During disturbances and incidents, the following procedures are discussed: identification and detection of the events - automatic actions - manual actions of the operator - provided indications - explanation of actuated systems - basic information of the reactor protection system. (RW)

  14. PROCESS CAPABILITY ESTIMATION FOR NON-NORMALLY DISTRIBUTED DATA USING ROBUST METHODS - A COMPARATIVE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2016-06-01

    Full Text Available Process capability indices are very important process quality assessment tools in the automotive industry. The common process capability indices (PCIs) Cp, Cpk and Cpm are widely used in practice. The use of these PCIs is based on the assumption that the process is in control and its output is normally distributed. In practice, normality is not always fulfilled. Indices developed under the normality assumption are very sensitive to non-normal processes. When the distribution of a product quality characteristic is non-normal, Cp and Cpk indices calculated using conventional methods often lead to erroneous interpretation of process capability. In the literature, various methods have been proposed for surrogate process capability indices under non-normality, but few literature sources offer a comprehensive evaluation and comparison of their ability to capture true capability in non-normal situations. In this paper, five methods are reviewed and a capability evaluation is carried out for data pertaining to the resistivity of silicon wafers. The final results revealed that the Burr-based percentile method is better than the Clements method. Modelling of non-normal data and the Box-Cox transformation method using statistical software (Minitab 14) provide reasonably good results, as they are very promising methods for non-normal and moderately skewed data (skewness <= 1.5).
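
    The percentile idea at the heart of the Clements/Burr-style surrogates is compact enough to sketch: the 6σ span of the conventional formula is replaced by the 0.135th-99.865th percentile span of the observed distribution. Everything below (spec limits, the skewed stand-in data) is invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)
data = rng.lognormal(mean=1.0, sigma=0.25, size=2000)   # skewed "resistivity" stand-in
LSL, USL = 1.2, 5.0                                     # hypothetical specification limits

# Conventional, normality-based indices -- misleading for skewed data.
mean, sd = data.mean(), data.std(ddof=1)
cp_normal = (USL - LSL) / (6 * sd)
cpk_normal = min((USL - mean) / (3 * sd), (mean - LSL) / (3 * sd))

# Percentile-based surrogates: replace 6*sigma with the natural 99.73% span.
p_lo, med, p_hi = np.percentile(data, [0.135, 50.0, 99.865])
cp_pct = (USL - LSL) / (p_hi - p_lo)
cpk_pct = min((USL - med) / (p_hi - med), (med - LSL) / (med - p_lo))

print(f"normal-based  Cp={cp_normal:.2f}  Cpk={cpk_normal:.2f}")
print(f"percentile    Cp={cp_pct:.2f}  Cpk={cpk_pct:.2f}")
```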

  15. Fungal invasion of normally non-phagocytic host cells.

    Directory of Open Access Journals (Sweden)

    Scott G Filler

    2006-12-01

    Full Text Available Many fungi that cause invasive disease invade host epithelial cells during mucosal and respiratory infection, and subsequently invade endothelial cells during hematogenous infection. Most fungi invade these normally non-phagocytic host cells by inducing their own uptake. Candida albicans hyphae interact with endothelial cells in vitro by binding to N-cadherin on the endothelial cell surface. This binding induces rearrangement of endothelial cell microfilaments, which results in the endocytosis of the organism. The capsule of Cryptococcus neoformans is composed of glucuronoxylomannan, which binds specifically to brain endothelial cells, and appears to mediate both adherence and induction of endocytosis. The mechanisms by which other fungal pathogens induce their own uptake are largely unknown. Some angioinvasive fungi, such as Aspergillus species and the Zygomycetes, invade endothelial cells from the abluminal surface during the initiation of invasive disease, and subsequently invade the luminal surface of endothelial cells during hematogenous dissemination. Invasion of normally non-phagocytic host cells has different consequences, depending on the type of invading fungus. Aspergillus fumigatus blocks apoptosis of pulmonary epithelial cells, whereas Paracoccidioides brasiliensis induces apoptosis of epithelial cells. This review summarizes the mechanisms by which diverse fungal pathogens invade normally non-phagocytic host cells and discusses gaps in our knowledge that provide opportunities for future research.

  16. Echocardiographic reference ranges for normal left atrial function parameters: results from the EACVI NORRE study.

    Science.gov (United States)

    Sugimoto, Tadafumi; Robinet, Sébastien; Dulgheru, Raluca; Bernard, Anne; Ilardi, Federica; Contu, Laura; Addetia, Karima; Caballero, Luis; Kacharava, George; Athanassopoulos, George D; Barone, Daniele; Baroni, Monica; Cardim, Nuno; Hagendorff, Andreas; Hristova, Krasimira; Lopez, Teresa; de la Morena, Gonzalo; Popescu, Bogdan A; Penicka, Martin; Ozyigit, Tolga; Rodrigo Carbonero, Jose David; van de Veire, Nico; Von Bardeleben, Ralph Stephan; Vinereanu, Dragos; Zamorano, Jose Luis; Go, Yun Yun; Marchetta, Stella; Nchimi, Alain; Rosca, Monica; Calin, Andreea; Moonen, Marie; Cimino, Sara; Magne, Julien; Cosyns, Bernard; Galli, Elena; Donal, Erwan; Habib, Gilbert; Esposito, Roberta; Galderisi, Maurizio; Badano, Luigi P; Lang, Roberto M; Lancellotti, Patrizio

    2018-02-23

    To obtain the normal ranges for echocardiographic measurements of left atrial (LA) function from a large group of healthy volunteers accounting for age and gender. A total of 371 (median age 45 years) healthy subjects were enrolled at 22 institutions collaborating in the Normal Reference Ranges for Echocardiography (NORRE) study of the European Association of Cardiovascular Imaging (EACVI). Left atrial data sets were analysed with a vendor-independent software (VIS) package allowing homogeneous measurements irrespective of the echocardiographic equipment used to acquire data sets. The lowest expected values of LA function were 26.1%, 48.7%, and 41.4% for left atrial strain (LAS), 2D left atrial emptying fraction (LAEF), and 3D LAEF (reservoir function); 7.7%, 24.2%, and -0.53/s for LAS-active, LAEF-active, and LA strain rate during LA contraction (SRa) (pump function); and 12.0% and 21.6% for LAS-passive and LAEF-passive (conduit function). Left atrial reservoir and conduit function decreased with age, while pump function increased. All indices of reservoir function and all LA strains showed no differences by gender or vendor. However, inter-vendor differences were observed in LA SRa despite the use of VIS. The NORRE study provides contemporary, applicable echocardiographic reference ranges for LA function. Our data highlight the importance of age-specific reference values for LA functions.

  17. Acid-base changes in canine neonates following normal birth or dystocia.

    Science.gov (United States)

    Lúcio, C F; Silva, L C G; Rodrigues, J A; Veiga, G A L; Vannucchi, C I

    2009-07-01

    There are limited data concerning blood gas parameters in neonatal dogs. Knowledge of the normal physiology may facilitate effective therapeutic intervention and potentially reduce neonatal mortality. This study examined acid-base parameters in pups born at normal parturition (n = 27) compared with those born after obstetrical assistance or caesarean operation (n = 13) and those born following oxytocin (OXY) administration for treatment of uterine inertia (n = 11). Pups were subjected to an objective scoring method of neonatal health adapted from use in humans (the Apgar score) at birth and again at 5 and 60 min after birth. Venous blood samples were collected at 5 and 60 min after birth for evaluation of blood gas parameters. At birth, all pups had low Apgar scores and a mixed acidosis. The base excess was lowest for pups delivered after OXY administration. The Apgar score improved for all pups after 5 min of birth and there was an improvement in carbon dioxide tension, base excess and venous blood pH at 1 h, although in all pups a metabolic acidosis persisted. These data provide an important insight into neonatal physiology and the variability of blood gas parameters in pups born at normal and abnormal parturition and provide the basis for clinical decision making following dystocia.

  18. Discriminant analysis of normal and malignant breast tissue based upon INAA investigation of elemental concentration

    International Nuclear Information System (INIS)

    Kwanhoong Ng; Senghuat Ong; Bradley, D.A.; Laimeng Looi

    1997-01-01

    Discriminant analysis of six trace element concentrations measured by instrumental neutron activation analysis (INAA) in 26 paired-samples of malignant and histologically normal human breast tissues shows the technique to be a potentially valuable clinical tool for making malignant-normal classification. Nonparametric discriminant analysis is performed for the data obtained. Linear and quadratic discriminant analyses are also carried out for comparison. For this data set a formal analysis shows that the elements which may be useful in distinguishing between malignant and normal tissues are Ca, Rb and Br, providing correct classification for 24 out of 26 normal samples and 22 out of 26 malignant samples. (Author)

  19. Understanding a Normal Distribution of Data.

    Science.gov (United States)

    Maltenfort, Mitchell G

    2015-12-01

    Assuming data follow a normal distribution is essential for many common statistical tests. However, what are normal data and when can we assume that a data set follows this distribution? What can be done to analyze non-normal data?
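
    For readers who want a one-line version of the question the editorial poses, common practice is to combine a formal test with skewness and kurtosis summaries; the sketch below uses D'Agostino's K² test on one normal and one skewed synthetic sample. The choice of test and data is an illustration, not the editorial's.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
samples = {"normal sample": rng.normal(size=300),
           "skewed sample": rng.exponential(size=300)}

for name, d in samples.items():
    k2, p = stats.normaltest(d)   # D'Agostino-Pearson K^2 test
    print(f"{name}: K2={k2:.2f}  p={p:.4f}  "
          f"skew={stats.skew(d):+.2f}  excess kurtosis={stats.kurtosis(d):+.2f}")
```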

  20. Experimental Method for Characterizing Electrical Steel Sheets in the Normal Direction

    Directory of Open Access Journals (Sweden)

    Thierry Belgrand

    2010-10-01

    Full Text Available This paper proposes an experimental method to characterise magnetic laminations in the direction normal to the sheet plane. The principle, which is based on a static excitation to avoid planar eddy currents, is explained and specific test benches are proposed. Measurements of the flux density are made with a sensor moving in and out of an air-gap. A simple analytical model is derived in order to determine the permeability in the normal direction. The experimental results for grain oriented steel sheets are presented and a comparison is provided with values obtained from literature.

  1. High-gradient normal-conducting RF structures for muon cooling channels

    International Nuclear Information System (INIS)

    Corlett, J.N.; Green, M.A.; Hartman, N.; Ladran, A.; Li, D.; MacGill, R.; Rimmer, R.; Moretti, A.; Jurgens, T.; Holtkamp, N.; Black, E.; Summers, D.; Booke, M.

    2001-01-01

    We present a status report on the research and development of high-gradient normal-conducting RF structures for the ionization cooling of muons in a neutrino factory or muon collider. High-gradient RF structures are required in regions enclosed in strong focusing solenoidal magnets, precluding the application of superconducting RF technology [1]. We propose using linear accelerating structures, with individual cells electromagnetically isolated, to achieve the required gradients of over 15 MV/m at 201 MHz and 30 MV/m at 805 MHz. Each cell will be powered independently, and cell length and drive phase adjusted to optimize shunt impedance of the assembled structure. This efficient design allows for relatively small field enhancement on the structure walls, and an accelerating field approximately 1.7 times greater than the peak surface field. The electromagnetic boundary of each cell may be provided by a thin Be sheet, or an assembly of thin-walled metal tubes. Use of thin, low-Z materials will allow passage of the muon beams without significant deterioration in beam quality due to scattering. R and D in design and analysis of robust structures that will operate under large electric and magnetic fields and RF current heating are discussed, including the experimental program based in a high-power test laboratory developed for this purpose

  2. Radiographic normal range of condylar movement of mandible

    International Nuclear Information System (INIS)

    Choi, Byung Ihn; Lee, Jae Mun; Kim, Myung Jin

    1981-01-01

    It is the purpose of this article to determine various normal anatomic measurements of temporomandibular joint and normal range of condylar movement using relatively simple X-ray equipment and radiographic technique in consideration of popular clinical application. Author's cases consisted of 100 clinically normal adult males and temporomandibular joint radiographs of 3 serial positions of condylar head were taken by transcranial oblique lateral projection in each case. The serial positions are centric occlusion, 1 inch opening and maximal opening position. The results were as follows; 1. In centric occlusion, the length between the condylar head and glenoid fossa was 2.23 ± 0.58 mm in anterior part, 3.55 ± 0.80 mm in upper part and 2.76 ± 0.72 mm in posterior part. 2. In centric occlusion, the angle (α) between the horizontal standard line (AB) and anterior slope (BC) was 37.22 ± 3.87°. 3. In 1 inch opening position, the distance between the summit of condylar head from the standard point of articular eminence (B) was -0.64 ± 3.53 mm in horizontal direction and -1.07 ± 1.00 mm in vertical direction. 4. In maximal opening position, the distance between the summit of condylar head from the standard point of articular eminence (B) was 5.83 ± 3.05 mm in horizontal direction and +0.29 ± 1.58 mm in vertical direction. 5. In positional relationship between the condylar head and the standard point of articular eminence (B), the condyles were found to be at the eminences or anterior to them in 51% with 1 inch opening and 95% with maximal opening

  3. Occult microscopic endometriosis: undetectable by laparoscopy in normal peritoneum.

    Science.gov (United States)

    Khan, Khaleque Newaz; Fujishita, Akira; Kitajima, Michio; Hiraki, Koichi; Nakashima, Masahiro; Masuzaki, Hideaki

    2014-03-01

    the pelvis. We re-confirmed a decade-old concept of invisible (occult) endometriosis in visually normal peritoneum of women with visible endometriosis. The existence of a variable amount of tissue activity in these occult lesions may contribute to the recurrence/occurrence of endometriosis or the persistence/recurrence of pain in women even after successful ablation or excision of visible lesions by laparoscopy. This work was supported in part by Grants-in-aid for Scientific Research from the Japan Society for the Promotion of Science. There is no conflict of interest related to this study. Not applicable.

  4. Cool-and Unusual-CAD Applications

    Science.gov (United States)

    Calhoun, Ken

    2004-01-01

    This article describes several very useful applications of AutoCAD that may lie outside the normal scope of application. AutoCAD commands used in this article are based on AutoCAD 2000I. The author and his students used a Hewlett Packard 750C DesignJet plotter for plotting. (Contains 5 figures and 5 photos.)

  5. On a computer implementation of the block Gauss–Seidel method for normal systems of equations

    Directory of Open Access Journals (Sweden)

    Alexander I. Zhdanov

    2016-12-01

    Full Text Available This article focuses on a modification of the block variant of the Gauss–Seidel method for normal systems of equations, which is a sufficiently effective method for solving generally overdetermined systems of linear algebraic equations of high dimensionality. The main disadvantage of methods based on normal systems of equations is the fact that the condition number of the normal system is equal to the square of the condition number of the original problem. This fact has a negative impact on the rate of convergence of iterative methods based on normal systems of equations. To increase the speed of convergence of such iterative methods when solving ill-conditioned problems, different preconditioner options are currently used to reduce the condition number of the original system of equations. However, no universal preconditioner exists for all applications. One effective approach that improves the speed of convergence of the iterative Gauss–Seidel method for normal systems of equations is to use its block version. The disadvantage of the block Gauss–Seidel method for normal systems is that it requires the calculation of a pseudoinverse matrix at each iteration, and finding the pseudoinverse is a computationally difficult procedure. In this paper, we propose a procedure that replaces the pseudoinverse computation with the solution of normal systems of equations by the Cholesky method. The normal systems arising at each iteration of the Gauss–Seidel method have a relatively low dimension compared to the original system. The results of numerical experiments demonstrating the effectiveness of the proposed approach are given.
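
    A minimal sketch of the idea, with an invented test problem: the columns of A are split into blocks, each block's small normal matrix is factored once by Cholesky, and every sweep updates one block at a time against the current residual. This illustrates the Cholesky-instead-of-pseudoinverse device the article describes, not the article's exact code.

```python
import numpy as np
from scipy.linalg import cho_factor, cho_solve

def block_gauss_seidel(A, b, block_size=10, sweeps=50):
    m, n = A.shape
    blocks = [np.arange(s, min(s + block_size, n)) for s in range(0, n, block_size)]
    # Factor each block's low-dimensional normal matrix A_j^T A_j once (Cholesky).
    factors = [cho_factor(A[:, j].T @ A[:, j]) for j in blocks]
    x = np.zeros(n)
    for _ in range(sweeps):
        for j, fac in zip(blocks, factors):
            r = b - A @ x + A[:, j] @ x[j]          # residual with block j removed
            x[j] = cho_solve(fac, A[:, j].T @ r)    # solve the small normal system
    return x

rng = np.random.default_rng(11)
A = rng.normal(size=(500, 60))                      # overdetermined test problem
b = rng.normal(size=500)
x_bgs = block_gauss_seidel(A, b)
x_ls, *_ = np.linalg.lstsq(A, b, rcond=None)
print("difference from direct least squares:", np.linalg.norm(x_bgs - x_ls))
```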

  6. Normal venous anatomy and physiology of the lower extremity.

    Science.gov (United States)

    Notowitz, L B

    1993-06-01

    Venous disease of the lower extremities is common but often misunderstood. It seems that the focus is on the exciting world of arterial anatomy and pathology, while the topic of venous anatomy and pathology comes in second place. However, venous diseases such as chronic venous insufficiency, leg ulcers, and varicose veins affect much of the population and may lead to disability and death. Nurses are often required to answer complex questions from patients and their families about a patient's disease. Patients depend on nurses to provide accurate information in terms they can understand. It is therefore important to have an understanding of the normal venous system of the legs before one can understand the complexities of venous diseases and treatments. This article presents an overview of normal venous anatomy and physiology.

  7. Multiple imputation in the presence of non-normal data.

    Science.gov (United States)

    Lee, Katherine J; Carlin, John B

    2017-02-20

    Multiple imputation (MI) is becoming increasingly popular for handling missing data. Standard approaches for MI assume normality for continuous variables (conditionally on the other variables in the imputation model). However, it is unclear how to impute non-normally distributed continuous variables. Using simulation and a case study, we compared various transformations applied prior to imputation, including a novel non-parametric transformation, to imputation on the raw scale and using predictive mean matching (PMM) when imputing non-normal data. We generated data from a range of non-normal distributions, and set 50% to missing completely at random or missing at random. We then imputed missing values on the raw scale, following a zero-skewness log, Box-Cox or non-parametric transformation and using PMM with both type 1 and 2 matching. We compared inferences regarding the marginal mean of the incomplete variable and the association with a fully observed outcome. We also compared results from these approaches in the analysis of depression and anxiety symptoms in parents of very preterm compared with term-born infants. The results provide novel empirical evidence that the decision regarding how to impute a non-normal variable should be based on the nature of the relationship between the variables of interest. If the relationship is linear in the untransformed scale, transformation can introduce bias irrespective of the transformation used. However, if the relationship is non-linear, it may be important to transform the variable to accurately capture this relationship. A useful alternative is to impute the variable using PMM with type 1 matching.
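
    For concreteness, here is a single PMM imputation step of the "type 1 matching" flavour the paper evaluates, on synthetic skewed data: each missing value receives the observed value of a donor whose regression prediction is among the k closest. A full MI procedure would add parameter draws and repeat over several imputed datasets; k, the data, and the regression model are all invented here.

```python
import numpy as np

rng = np.random.default_rng(21)
n = 400
x = rng.normal(size=n)
y = np.exp(0.5 * x + rng.normal(scale=0.4, size=n))   # skewed incomplete variable
miss = rng.random(n) < 0.5                            # 50% missing completely at random
y_obs, x_obs, x_mis = y[~miss], x[~miss], x[miss]

# Linear regression of y on x using the observed cases only.
X_obs = np.column_stack([np.ones(x_obs.size), x_obs])
beta, *_ = np.linalg.lstsq(X_obs, y_obs, rcond=None)
pred_obs = X_obs @ beta
pred_mis = np.column_stack([np.ones(x_mis.size), x_mis]) @ beta

# For each missing case, draw one of the k observed donors with the closest
# predicted value and impute that donor's *observed* y.
k = 5
imputed = np.array([y_obs[rng.choice(np.argsort(np.abs(pred_obs - pm))[:k])]
                    for pm in pred_mis])

print(f"observed mean: {y_obs.mean():.3f}   imputed mean: {imputed.mean():.3f}")
```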

  8. Normal blood supply of the canine patella

    International Nuclear Information System (INIS)

    Howard, P.E.; Wilson, J.W.; Robbins, T.A.; Ribble, G.A.

    1986-01-01

    The normal blood supply of the canine patella was evaluated, using microangiography and correlated histology. Arterioles entered the cortex of the patella at multiple sites along the medial, lateral, and dorsal aspects. The body of the patella was vascularized uniformly, with many arterioles that branched and anastomosed extensively throughout the patella. The patella was not dependent on a single nutrient artery for its afferent supply, but had an extensive interior vascular network. These factors should ensure rapid revascularization and healing of patellar fractures, provided appropriate fracture fixation is achieved

  9. The Normal Fetal Pancreas.

    Science.gov (United States)

    Kivilevitch, Zvi; Achiron, Reuven; Perlman, Sharon; Gilboa, Yinon

    2017-10-01

    The aim of the study was to assess the sonographic feasibility of measuring the fetal pancreas and its normal development throughout pregnancy. We conducted a cross-sectional prospective study between 19 and 36 weeks' gestation. The study included singleton pregnancies with normal pregnancy follow-up. The pancreas circumference was measured. The first 90 cases were tested to assess feasibility. Two hundred ninety-seven fetuses of nondiabetic mothers were recruited during a 3-year period. The overall satisfactory visualization rate was 61.6%. The intraobserver and interobserver variability had high intraclass correlation coefficients of 0.964 and 0.967, respectively. A cubic polynomial regression best described the correlation of pancreas circumference with gestational age (r = 0.744). Pancreas circumference percentiles for each week of gestation were calculated. During the study period, we detected 2 cases with overgrowth syndrome and 1 case with an annular pancreas. In this study, we assessed the feasibility of sonography for measuring the fetal pancreas and established a normal reference range for the fetal pancreas circumference throughout pregnancy. This database can be helpful when investigating fetomaternal disorders that can involve its normal development.

  10. An ultrasonographic study on measurement of normal hip joint in Korean

    International Nuclear Information System (INIS)

    Lim, Hyo Keun; Choo, In Wook; Park, Soo Sung; Han, Man Chung

    1989-01-01

    Ultrasonography is very useful in the evaluation of small amounts of effusion in the hip joint and has several advantages such as noninvasiveness, ease of use, accuracy and absence of radiation hazard. Data on the normal hip joint space and capsule are very important in the ultrasonographic evaluation of inflammatory hip joint disease. However, normal ultrasonographic data of the hip joint have not been reported except in the pediatric age group. The purpose of this study was to evaluate and measure the normal hip joint space and capsule and to provide basic data for clinical application. Seventy healthy males and 70 healthy females who had no past history or present clinical symptoms of hip joint disease were examined with a real-time sector scanner (5 MHz transducer). Widths of the hip joint space and thicknesses of the joint capsule were obtained and analysed statistically. The results were as follows: 1. The average widths of the hip joint space were 2.6±0.5 mm (right) and 2.5±0.5 mm (left) in males and 2.4±0.5 mm (right) and 2.5±0.6 mm (left) in females. There was no significant difference by sex. 2. The widths of the hip joint space increased with aging and decreased after the 6th decade (males) and 5th decade (females). 3. The maximal difference between both hip joint spaces was 1.2 mm, and there was no significant difference between sides by sex and age. 4. The average thicknesses of the hip joint capsule were 1.9±0.3 mm (right) and 1.8±0.2 mm (left) in males and 1.7±0.3 mm (right) and 1.7±0.2 mm (left) in females. There was no significant difference by sex. 5. The thickness of the hip joint capsule increased with aging and plateaued after the 5th decade (males and females). 6. The maximal difference between both hip joint capsules was 0.9 mm, and there was no significant difference between sides by sex and age. It is therefore considered that ultrasonography could be a very useful modality in the diagnosis of hip joint diseases in which the hip joint space and capsule are changed by various etiologies.

  11. [Present Status of Displaying Pharmaceutical Products for Sale on Flea Market Applications for Smartphones and the Responses to Illicit Selling by Service Providers].

    Science.gov (United States)

    Kishimoto, Keiko; Takeuchi, Tomoe; Fukushima, Noriko

    2017-12-01

     In Japan, a pharmacy or drug store license is required for selling pharmaceutical products. However, individuals without a pharmacy or drug store license are displaying pharmaceutical products for sale on flea market applications, which is illegal dealing. This study discusses modalities for implementing countermeasures against the illicit selling of pharmaceutical products. We extracted pharmaceutical products displayed for sale on three flea market applications (Mercari, Rakuma, Fril) on one day. One hundred and eighty-one pharmaceutical products were displayed (49 on Mercari, 86 on Rakuma, and 46 on Fril). There were 6.1% (11/181) domestically prescribed drugs, 69.1% (125/181) domestic OTC drugs, 23.8% (43/181) foreign-made prescribed drugs, and 1.1% (2/181) foreign-made OTC drugs. A seller could display a product for sale without confirming whether it is prohibited. We alerted the service providers to this illicit selling on three different occasions. The pharmaceutical product displays were deleted by the service providers at a rate of 55.1% (27/49) for Mercari and 51.2% (44/86) for Rakuma. The average number of drugs displayed for sale by each seller was 1.4, and the average number of total products displayed for sale by each seller was 100. Sellers could have unintentionally displayed the pharmaceutical products for sale, without the knowledge that doing so is illegal. The service providers of flea market applications should create mechanisms to alert sellers that displaying pharmaceutical products for sale is an illicit act and should regulate these violations.

  12. MRI characterization of brown adipose tissue in obese and normal-weight children

    Energy Technology Data Exchange (ETDEWEB)

    Deng, Jie; Rigsby, Cynthia K.; Shore, Richard M. [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Medical Imaging, 225 E. Chicago Ave., Box 9, Chicago, IL (United States); Northwestern University, Department of Radiology, Feinberg School of Medicine, Chicago, IL (United States); Schoeneman, Samantha E. [Ann and Robert H. Lurie Children' s Hospital of Chicago, Department of Medical Imaging, 225 E. Chicago Ave., Box 9, Chicago, IL (United States); Zhang, Huiyuan [John H. Stroger, Jr. Hospital of Cook County, Collaborative Research Unit, Chicago, IL (United States); Kwon, Soyang [Ann and Robert H. Lurie Children' s Hospital of Chicago, Stanley Manne Children' s Research Institute, Chicago, IL (United States); Northwestern University, Department of Pediatrics, Feinberg School of Medicine, Chicago, IL (United States); Josefson, Jami L. [Ann and Robert H. Lurie Children' s Hospital of Chicago, Division of Endocrinology, Chicago, IL (United States); Northwestern University, Department of Pediatrics, Feinberg School of Medicine, Chicago, IL (United States)

    2015-10-15

    Brown adipose tissue (BAT) is identified in mammals as an adaptive thermogenic organ for modulation of energy expenditure and heat generation. Human BAT may be primarily composed of brown-in-white (BRITE) adipocytes and stimulation of BRITE may serve as a potential target for obesity interventions. Current imaging studies of BAT detection and characterization have been mainly limited to PET/CT. MRI is an emerging application for BAT characterization in healthy children. To exploit Dixon and diffusion-weighted MRI methods to characterize cervical-supraclavicular BAT/BRITE properties in normal-weight and obese children while accounting for pubertal status. Twenty-eight healthy children (9-15 years old) with a normal or obese body mass index participated. MRI exams were performed to characterize supraclavicular adipose tissues by measuring tissue fat percentage, T2*, tissue water mobility, and microvasculature properties. We used multivariate linear regression models to compare tissue properties between normal-weight and obese groups while accounting for pubertal status. MRI measurements of BAT/BRITE tissues in obese children showed higher fat percentage (P < 0.0001), higher T2* (P < 0.0001), and lower diffusion coefficient (P = 0.015) compared with normal-weight children. Pubertal status was a significant covariate for the T2* measurement, with higher T2* (P = 0.0087) in pubertal children compared to prepubertal children. Perfusion measurements varied by pubertal status. Compared to normal-weight children, obese prepubertal children had lower perfusion fraction (P = 0.003) and pseudo-perfusion coefficient (P = 0.048); however, obese pubertal children had higher perfusion fraction (P = 0.02) and pseudo-perfusion coefficient (P = 0.028). This study utilized chemical-shift Dixon MRI and diffusion-weighted MRI methods to characterize supraclavicular BAT/BRITE tissue properties. The multi-parametric evaluation revealed evidence of morphological differences in brown

  13. MRI characterization of brown adipose tissue in obese and normal-weight children

    International Nuclear Information System (INIS)

    Deng, Jie; Rigsby, Cynthia K.; Shore, Richard M.; Schoeneman, Samantha E.; Zhang, Huiyuan; Kwon, Soyang; Josefson, Jami L.

    2015-01-01

    Brown adipose tissue (BAT) is identified in mammals as an adaptive thermogenic organ for modulation of energy expenditure and heat generation. Human BAT may be primarily composed of brown-in-white (BRITE) adipocytes and stimulation of BRITE may serve as a potential target for obesity interventions. Current imaging studies of BAT detection and characterization have been mainly limited to PET/CT. MRI is an emerging application for BAT characterization in healthy children. To exploit Dixon and diffusion-weighted MRI methods to characterize cervical-supraclavicular BAT/BRITE properties in normal-weight and obese children while accounting for pubertal status. Twenty-eight healthy children (9-15 years old) with a normal or obese body mass index participated. MRI exams were performed to characterize supraclavicular adipose tissues by measuring tissue fat percentage, T2*, tissue water mobility, and microvasculature properties. We used multivariate linear regression models to compare tissue properties between normal-weight and obese groups while accounting for pubertal status. MRI measurements of BAT/BRITE tissues in obese children showed higher fat percentage (P < 0.0001), higher T2* (P < 0.0001), and lower diffusion coefficient (P = 0.015) compared with normal-weight children. Pubertal status was a significant covariate for the T2* measurement, with higher T2* (P = 0.0087) in pubertal children compared to prepubertal children. Perfusion measurements varied by pubertal status. Compared to normal-weight children, obese prepubertal children had lower perfusion fraction (P = 0.003) and pseudo-perfusion coefficient (P = 0.048); however, obese pubertal children had higher perfusion fraction (P = 0.02) and pseudo-perfusion coefficient (P = 0.028). This study utilized chemical-shift Dixon MRI and diffusion-weighted MRI methods to characterize supraclavicular BAT/BRITE tissue properties. The multi-parametric evaluation revealed evidence of morphological differences in brown
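
    The two records above describe fitting multivariate linear regression models that compare an MRI tissue property between weight groups while accounting for pubertal status. A minimal sketch of that analysis pattern follows; the data and variable names are assumptions for illustration, not the study's data.

```python
# Sketch: linear regression of a simulated MRI property (T2*) on weight group,
# adjusting for pubertal status as a covariate. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 28  # same cohort size as the study; values simulated
df = pd.DataFrame({
    "obese": rng.integers(0, 2, n),     # 1 = obese, 0 = normal-weight
    "pubertal": rng.integers(0, 2, n),  # 1 = pubertal, 0 = prepubertal
})
# Assume T2* is higher in obese and pubertal children, plus measurement noise
df["t2star"] = 20 + 5 * df["obese"] + 3 * df["pubertal"] + rng.normal(0, 2, n)

fit = smf.ols("t2star ~ obese + pubertal", data=df).fit()
print(fit.summary().tables[1])  # group effect with pubertal status as covariate
```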

  14. Effects of Different LiDAR Intensity Normalization Methods on Scotch Pine Forest Leaf Area Index Estimation

    Directory of Open Access Journals (Sweden)

    YOU Haotian

    2018-02-01

    Full Text Available The intensity data of airborne light detection and ranging (LiDAR) are affected by many factors during the acquisition process, so effective quantification and normalization of each factor's effect is of great significance for the normalization and application of LiDAR intensity data. In this paper, the LiDAR intensity data were normalized for range, for angle of incidence, and for both range and angle of incidence based on the radar equation. Two metrics, canopy intensity sum and intensity ratio, were then extracted and used to estimate forest leaf area index (LAI), with the aim of quantifying the effects of intensity normalization on forest LAI estimation. Range normalization was found to improve the accuracy of forest LAI estimation, whereas incidence-angle normalization did not improve the accuracy and in fact worsened the results. Although intensity data normalized for both range and incidence angle improved the accuracy, the improvement was smaller than that achieved by range normalization alone. Meanwhile, the differences between LAI estimates from raw and normalized intensity data were relatively large for the canopy intensity sum metric but relatively small for the intensity ratio metric. The results demonstrate that the effects of intensity normalization on forest LAI estimation depend on the choice of affecting factor, and that their magnitude is closely related to the characteristics of the metrics used. The appropriate intensity normalization method should therefore be chosen according to the characteristics of the metrics used in future research, to avoid wasted cost and reduced estimation accuracy caused by introducing inappropriate affecting factors into the normalization.
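
    A minimal sketch of the radar-equation-based corrections described above: range normalization rescales intensity by the squared ratio of range to a reference range, and incidence-angle normalization divides by the cosine of the incidence angle. The reference range and the exact cosine dependence are standard assumptions; the paper's precise formulation may differ.

```python
# Sketch of the three intensity normalizations compared in the record.
import numpy as np

def normalize_intensity(i_raw, range_m, inc_rad, ref_range=1000.0):
    """Return range-, angle-, and range+angle-normalized intensities."""
    i_range = i_raw * (range_m / ref_range) ** 2   # range correction
    i_angle = i_raw / np.cos(inc_rad)              # incidence-angle correction
    i_both = i_raw * (range_m / ref_range) ** 2 / np.cos(inc_rad)
    return i_range, i_angle, i_both

i_raw = np.array([120.0, 95.0, 140.0])       # raw return intensities
range_m = np.array([980.0, 1050.0, 1010.0])  # sensor-to-target range (m)
inc_rad = np.deg2rad([5.0, 12.0, 20.0])      # incidence angles
print(normalize_intensity(i_raw, range_m, inc_rad))
```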

  15. Why Do English Universities "Really" Franchise Degrees to Overseas Providers?

    Science.gov (United States)

    Healey, Nigel

    2013-01-01

    Franchising degrees to overseas providers, normally for-profit private companies, has become big business for English universities. The latest data from the Higher Education Statistics Agency reveal that there are now more international students registered for the awards of English higher education institutions who are studying wholly offshore…

  16. Prodrugs designed to discriminate pathological (tumour) and physiological (normal tissue) hypoxia

    International Nuclear Information System (INIS)

    Wilson, W.R.; Patterson, A.V.

    2003-01-01

    There is now abundant evidence that hypoxia contributes to treatment failure in radiation therapy. As a target for therapeutic intervention, hypoxia is especially attractive because it is a common feature of most human tumours and therefore a potential 'pan target' across many tumour types. However, attempts to exploit hypoxia face the problem that oxygen concentrations in some normal tissues are also heterogeneous and that O2 distributions in tumours and normal tissues overlap. Simply adjusting the K value (the O2 concentration for 50% inhibition of activation) does not provide a satisfactory solution. Bioreductive drugs like tirapazamine with high K values are activated significantly in several normal tissues, while nitro compounds and quinones with low K values spare the hypoxic tumour cells at 'intermediate' O2 tensions (1-10 μM O2), which are considered to be major contributors to tumour radioresistance. A potential strategy for overcoming this dilemma is to design prodrugs that are activated only at very low K values but give relatively stable cytotoxic metabolites capable of diffusing to cells at higher O2 concentrations. This approach redefines the therapeutic target as cells adjacent to zones of pathological hypoxia, providing discrimination from physiological hypoxia in normal tissues. Detecting bioreductive prodrugs capable of providing bystander killing of this kind is not straightforward. We have adapted a multicellular layer (MCL) co-culture model for quantifying bystander effects in GDEPT (Wilson et al., Cancer Res., 62: 1425-1432, 2002), and have used this to measure bystander effects of hypoxia-activated prodrugs. This model uses differences in metabolic activation of bioreductive drugs between A549 cell lines with low and high cytochrome P450 reductase activity, rather than O2 gradients, to effect localised prodrug activation. It shows that TPZ and the nitroimidazole RSU-1069 have little or no bystander effect, but that dinitrobenzamide

  17. Analysis of a Dynamic Viscoelastic Contact Problem with Normal Compliance, Normal Damped Response, and Nonmonotone Slip Rate Dependent Friction

    Directory of Open Access Journals (Sweden)

    Mikaël Barboteu

    2016-01-01

    Full Text Available We consider a mathematical model which describes the dynamic evolution of a viscoelastic body in frictional contact with an obstacle. The contact is modelled with a combination of a normal compliance and a normal damped response law, associated with a slip-rate-dependent version of Coulomb's law of dry friction. We derive a variational formulation, and an existence and uniqueness result for the weak solution of the problem is presented. Next, we introduce a fully discrete approximation of the variational problem based on a finite element method and an implicit time integration scheme. We study this fully discrete approximation scheme and bound the errors of the approximate solutions. Under regularity assumptions imposed on the exact solution, optimal-order error estimates are derived for the fully discrete solution. Finally, after recalling the solution of the frictional contact problem, some numerical simulations are provided in order to illustrate both the behavior of the solution related to the frictional contact conditions and the theoretical error estimate result.
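
    The record does not spell out the contact laws, so as a point of reference the block below gives standard textbook forms of the conditions it names: normal compliance combined with a normal damped response, and a slip-rate-dependent Coulomb friction law. The constitutive functions p, q and the friction coefficient μ are placeholders, not the paper's exact choices.

```latex
% Generic forms of the contact conditions named in the abstract (a sketch;
% the paper's exact constitutive functions are not given in the record).
\begin{align*}
  -\sigma_\nu &= p_\nu(u_\nu) + q_\nu(\dot{u}_\nu)
      && \text{normal compliance + normal damped response} \\
  \|\sigma_\tau\| &\le \mu(\|\dot{u}_\tau\|)\,|\sigma_\nu|, \qquad
  \sigma_\tau = -\mu(\|\dot{u}_\tau\|)\,|\sigma_\nu|\,
      \frac{\dot{u}_\tau}{\|\dot{u}_\tau\|}
      \ \text{if } \dot{u}_\tau \neq 0
      && \text{slip-rate-dependent Coulomb friction}
\end{align*}
```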

  18. Normalization and microbial differential abundance strategies depend upon data characteristics.

    Science.gov (United States)

    Weiss, Sophie; Xu, Zhenjiang Zech; Peddada, Shyamal; Amir, Amnon; Bittinger, Kyle; Gonzalez, Antonio; Lozupone, Catherine; Zaneveld, Jesse R; Vázquez-Baeza, Yoshiki; Birmingham, Amanda; Hyde, Embriette R; Knight, Rob

    2017-03-03

    Data from 16S ribosomal RNA (rRNA) amplicon sequencing present challenges to ecological and statistical interpretation. In particular, library sizes often vary over several orders of magnitude, and the data contain many zeros. Although we are typically interested in comparing the relative abundance of taxa in the ecosystems of two or more groups, we can only measure taxon relative abundance in specimens obtained from those ecosystems. Because a comparison of taxon relative abundance in the specimens is not equivalent to a comparison of taxon relative abundance in the ecosystems, this presents a special challenge. Second, because the relative abundances of taxa in a specimen (as well as in the ecosystem) sum to 1, these are compositional data. Because compositional data are constrained to the simplex (summing to 1) rather than lying unconstrained in Euclidean space, many standard methods of analysis are not applicable. Here, we evaluate how these challenges impact the performance of existing normalization methods and differential abundance analyses. Effects on normalization: most normalization methods enable successful clustering of samples according to biological origin when the groups differ substantially in their overall microbial composition. Rarefying clusters samples according to biological origin more clearly than other normalization techniques do for ordination metrics based on presence or absence. Alternative normalization measures are potentially vulnerable to artifacts due to library size. Effects on differential abundance testing: we build on previous work to evaluate seven proposed statistical methods using rarefied as well as raw data. Our simulation studies suggest that the false discovery rates of many differential abundance-testing methods are not increased by rarefying itself, although rarefying of course results in a loss of sensitivity due to elimination of a portion of the available data. For groups with large (~10×) differences in the average
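
    Rarefying, the normalization strategy singled out above, subsamples each library without replacement down to a common read depth. A minimal sketch, assuming a simple taxon-count vector per sample:

```python
# Sketch of rarefying: subsample a taxon-count vector to a fixed library size
# without replacement, so all samples share the same depth.
import numpy as np

def rarefy(counts, depth, rng):
    """Subsample integer taxon counts down to `depth` reads without replacement."""
    reads = np.repeat(np.arange(counts.size), counts)  # one entry per sequenced read
    keep = rng.choice(reads, size=depth, replace=False)
    return np.bincount(keep, minlength=counts.size)

rng = np.random.default_rng(42)
sample = np.array([500, 120, 40, 0, 3])    # raw counts, library size 663
print(rarefy(sample, depth=200, rng=rng))  # same sample rescaled to 200 reads
```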

  19. Application of cine cardiac MR imaging in normal subjects and patients with valvular, coronary artery, and aortic disease

    International Nuclear Information System (INIS)

    Maddahi, J.; Ostrzega, E.; Crues, J.; Honma, H.; Siegel, R.; Charuzi, Y.; Berman, D.

    1987-01-01

    Cine MR imaging was performed on 15 normal subjects and 27 patients with cardiac disease. In normal subjects, the high signal intensity of flowing blood contrasted with that of the myocardium. In 16 patients with valvular regurgitation, a signal-void jet due to turbulence was visualized across the diseased valves. In three IHSS patients, thickened LV myocardium, mitral regurgitant jets, and systolic LV outflow jets were noted. Five patients with myocardial infarction (MI) showed thinning and/or hypokinesis of the MI regions. In three patients with Marfan syndrome, aortic dilatation, insufficiency, and a flap (one patient) were identified. Cine MR imaging is potentially useful for the evaluation of a variety of cardiac diseases.

  20. 76 FR 60100 - The Singapore Fund, Inc.; Notice of Application

    Science.gov (United States)

    2011-09-28

    ... Singapore Fund, Inc.; Notice of Application September 22, 2011. AGENCY: Securities and Exchange Commission (``Commission''). Applicant: The Singapore Fund, Inc. (the ``Fund''). ACTION: Notice of application for an order... through investment primarily in Singapore equity securities. Applicant states that under normal...

  1. Use of alternative and complementary therapies in labor and delivery care: a cross-sectional study of midwives' training in Catalan hospitals accredited as centers for normal birth.

    Science.gov (United States)

    Muñoz-Sellés, Ester; Vallès-Segalés, Antoni; Goberna-Tricas, Josefina

    2013-11-15

    The use of complementary and alternative medicine (CAM) and complementary and alternative therapies (CAT) during pregnancy is increasing. Scientific evidence for CAM and CAT in the field of obstetrics mainly covers pain relief in labor. Midwives are responsible for labor and delivery care; hence, their knowledge of CAM and CAT is important. The aims of this study are to describe the professional profile of midwives who provide care for natural childbirth in Catalan hospitals accredited as centers for normal birth, to assess midwives' level of training in CAT and their use of these therapies, and to identify specific resources for CAT in labor wards. A descriptive, cross-sectional, quantitative method was used to assess the level of training in and use of CAT by midwives working at 28 hospitals in Catalonia, Spain, accredited as public normal birth centers. Just under a third of the midwives (30.4%) had trained in CAT after completing their basic training. They trained in an average of 5.97 therapies (SD 3.56). The number of CAT in which the midwives were trained correlated negatively with age (r = -0.284). Midwives trained in CAT considered the following therapies useful or very useful for pain relief during labor and delivery: relaxation techniques (64.3%), hydrotherapy (84.8%) and the application of compresses to the perineum (75.9%). The availability of resources for providing CAT during normal birth care varied widely from center to center. Age may influence attitudes towards training. It is important to increase the number of midwives trained in CAM for pain relief during childbirth, in order to promote the use of CAT and ensure efficiency and safety. CAT resources at accredited hospitals providing normal childbirth care should also be standardized.

  2. Molecular signatures in childhood acute leukemia and their correlations to expression patterns in normal hematopoietic subpopulations.

    Science.gov (United States)

    Andersson, Anna; Olofsson, Tor; Lindgren, David; Nilsson, Björn; Ritz, Cecilia; Edén, Patrik; Lassen, Carin; Råde, Johan; Fontes, Magnus; Mörse, Helena; Heldrup, Jesper; Behrendtz, Mikael; Mitelman, Felix; Höglund, Mattias; Johansson, Bertil; Fioretos, Thoas

    2005-12-27

    Global expression profiles of a consecutive series of 121 childhood acute leukemias (87 B-lineage acute lymphoblastic leukemias, 11 T cell acute lymphoblastic leukemias, and 23 acute myeloid leukemias), six normal bone marrows, and 10 normal hematopoietic subpopulations of different lineages and maturations were ascertained by using 27K cDNA microarrays. Unsupervised analyses revealed segregation according to lineages and primary genetic changes, i.e., TCF3(E2A)/PBX1, IGH@/MYC, ETV6(TEL)/RUNX1(AML1), 11q23/MLL, and hyperdiploidy (>50 chromosomes). Supervised discriminatory analyses were used to identify differentially expressed genes correlating with lineage and primary genetic change. The gene-expression profiles of normal hematopoietic cells were also studied. Principal component analysis (PCA) revealed a differentiation axis reflecting the lineages and maturation stages of normal hematopoietic cells. By applying the three principal components obtained from PCA of the normal cells to the leukemic samples, similarities between malignant and normal cell lineages and maturations were investigated. Apart from showing that leukemias segregate according to lineage and genetic subtype, we provide an extensive study of the genes correlating with primary genetic changes. We also investigated the expression pattern of these genes in normal hematopoietic cells of different lineages and maturations, identifying genes preferentially expressed by the leukemic cells. This suggests an ectopic activation of a large number of genes, likely to reflect regulatory networks of pathogenetic importance that may also provide attractive targets for future directed therapies.
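
    The projection strategy described above (principal components fitted on normal cells, then applied to leukemic samples) can be sketched with a standard PCA implementation. Matrix sizes mirror the study, but the data here are simulated.

```python
# Sketch: fit PCA on normal hematopoietic subpopulations, then project the
# leukemic samples onto the first three components. Data are simulated.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(7)
normal_cells = rng.normal(size=(10, 27000))  # 10 normal subpopulations x ~27K genes
leukemias = rng.normal(size=(121, 27000))    # 121 leukemia samples

pca = PCA(n_components=3).fit(normal_cells)  # differentiation axes from normal cells
coords = pca.transform(leukemias)            # leukemias in the normal-cell PC space
print(coords.shape)                          # (121, 3)
```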

  3. Application of 3.0T magnetic resonance spectroscopy imaging in the evaluation on the development of normal brain white matter in infants and young children

    Directory of Open Access Journals (Sweden)

    Wen-li XU

    2014-01-01

    Full Text Available Objective To calculate the ratios of peak areas of proton magnetic resonance spectroscopy metabolites in the brain white matter of normal infants and young children, to observe the features of the metabolite spectra, and to explore the relations between these ratios and age. Methods The peak areas of metabolites, including N-acetyl aspartate (NAA), choline (Cho) and creatine (Cr), and the ratios NAA/Cho, NAA/Cr and Cho/Cr in paraventricular white matter were evaluated by multi-voxel proton magnetic resonance spectroscopy in 180 normal infants and young children of different ages. Results In paraventricular white matter, the NAA spectrum increased and the Cho spectrum decreased gradually, both stabilizing at 2 years of age. Cr increased markedly within the first 3 months and stabilized after 4 months. Significant differences were found in the ratios of the different metabolites in paraventricular white matter across ages (P<0.05). The ratios of NAA/Cho and NAA/Cr in paraventricular white matter were positively correlated with age (r=0.741 and r=0.625, respectively), while that of Cho/Cr was negatively correlated with age (r=-0.552; P<0.05). Conclusion The ratios of the different metabolites in brain white matter differ among infants of different ages. Metabolite concentrations in brain white matter are correlated to some extent with age, which may provide a diagnostic criterion for the evaluation of normal brain development and abnormal brain metabolism. DOI: 10.11855/j.issn.0577-7402.2013.12.05
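
    As a sketch of the correlation analysis behind these results, the snippet below computes the Pearson correlation between a simulated NAA/Cho ratio and age; the trend direction mirrors the reported finding, but all values are illustrative.

```python
# Sketch: Pearson correlation between a metabolite ratio and age (simulated).
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
age_months = rng.uniform(1, 36, size=180)  # 180 children, ages in months
naa_cho = 0.8 + 0.02 * age_months + rng.normal(0, 0.1, size=180)

r, p = stats.pearsonr(age_months, naa_cho)
print(f"r = {r:.3f}, P = {p:.2e}")  # positive r, as reported for NAA/Cho vs age
```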

  4. Photodynamic therapy: a review of applications in neurooncology and neuropathology

    Science.gov (United States)

    Uzdensky, Anatoly B.; Berezhnaya, Elena; Kovaleva, Vera; Neginskaya, Marya; Rudkovskii, Mikhail; Sharifulina, Svetlana

    2015-06-01

    Photodynamic therapy (PDT) is a promising adjuvant modality for the diagnosis and treatment of brain cancer. Importantly, the bright fluorescence of most photosensitizers provides visualization of brain tumors. This is successfully used for fluorescence-guided tumor resection according to the principle "to see and to treat." A non-oncologic application of the PDT effect is the induction of photothrombotic infarcts of brain tissue, a well-controlled and reproducible stroke model in which a local brain lesion is produced in a predetermined brain area. Since normal neurons and glial cells may also be damaged by PDT, which can lead to unwanted neurological consequences, PDT effects on normal neurons and glial cells should be comprehensively studied. We review the current literature on the PDT effect on a range of signaling and epigenetic proteins that control various cell functions, survival, necrosis, and apoptosis. We hypothesize that using cell-specific inhibitors or activators of some signaling proteins, one could selectively protect normal neurons and glia while simultaneously exacerbating photodynamic damage to malignant gliomas.

  5. Fuel damage during off-normal transients in metal-fueled fast reactors

    International Nuclear Information System (INIS)

    Kramer, J.M.; Bauer, T.H.

    1990-01-01

    Fuel damage during off-normal transients is a key issue in the safety of fast reactors because the fuel pin cladding provides the primary barrier to the release of radioactive materials. Part of the Safety Task of the Integral Fast Reactor Program is to provide assessments of the damage and margins to failure for metallic fuels over the wide range of transients that must be considered in safety analyses. This paper reviews the current status of the analytical and experimental programs that are providing the bases for these assessments. 13 refs., 2 figs

  6. On the extreme value statistics of normal random matrices and 2D Coulomb gases: Universality and finite N corrections

    Science.gov (United States)

    Ebrahimi, R.; Zohren, S.

    2018-03-01

    In this paper we extend the orthogonal polynomials approach for extreme value calculations of Hermitian random matrices, developed by Nadal and Majumdar (J. Stat. Mech. P04001 arXiv:1102.0738), to normal random matrices and 2D Coulomb gases in general. Firstly, we show that this approach provides an alternative derivation of results in the literature. More precisely, we show convergence of the rescaled eigenvalue with largest modulus of a normal Gaussian ensemble to a Gumbel distribution, as well as universality for an arbitrary radially symmetric potential. Secondly, it is shown that this approach can be generalised to obtain convergence of the eigenvalue with smallest modulus, and its universality, for ring distributions. Most interestingly, the techniques presented here are used to compute all slowly varying finite-N corrections of the above distributions, which are important for practical applications given the slow convergence. Another interesting aspect of this work is that we can use standard techniques from Hermitian random matrices to obtain the extreme value statistics of non-Hermitian random matrices, resembling the large-N expansion used in the context of the double scaling limit of Hermitian matrix models in string theory.
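
    The Gumbel convergence stated above can be checked empirically: sample normalized complex Ginibre matrices, record the largest eigenvalue modulus, and compare the standardized sample with a Gumbel law. The sketch below standardizes empirically instead of using the exact slowly varying constants the paper computes.

```python
# Empirical sketch: max eigenvalue modulus of a normalized complex Ginibre
# ensemble, standardized and compared with a Gumbel law via its skewness.
import numpy as np

rng = np.random.default_rng(0)
n, trials = 100, 400
max_mod = np.empty(trials)
for t in range(trials):
    a = (rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))) / np.sqrt(2 * n)
    max_mod[t] = np.abs(np.linalg.eigvals(a)).max()  # spectral radius, near 1

z = (max_mod - max_mod.mean()) / max_mod.std()  # empirical standardization
skew = np.mean(z ** 3)                          # standardized Gumbel skewness ~ 1.14
print(f"sample skewness = {skew:.2f} (Gumbel reference: 1.14)")
```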

  7. Biomechanical Analysis of Normal Brain Development during the First Year of Life Using Finite Strain Theory

    OpenAIRE

    Kim, Jeong Chul; Wang, Li; Shen, Dinggang; Lin, Weili

    2016-01-01

    The first year of life is the most critical time period for structural and functional development of the human brain. Combining longitudinal MR imaging and finite strain theory, this study aimed to provide new insights into normal brain development through a biomechanical framework. Thirty-three normal infants were longitudinally imaged using MRI from 2 weeks to 1 year of age. Voxel-wise Jacobian determinant was estimated to elucidate volumetric changes while Lagrange strains (both normal and...
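
    The voxel-wise Jacobian determinant mentioned above quantifies local volume change: with displacement field u, the deformation gradient is F = I + grad(u) and J = det(F), with J > 1 indicating expansion and J < 1 contraction. A minimal sketch on a simulated field:

```python
# Sketch: voxel-wise Jacobian determinant from a 3-D displacement field,
# J = det(I + grad(u)). The displacement field here is simulated.
import numpy as np

rng = np.random.default_rng(5)
shape = (32, 32, 32)
u = rng.normal(scale=0.05, size=(3, *shape))  # small displacement field (voxel units)

# grad[i][j] = d u_i / d x_j at every voxel
grad = np.array([np.gradient(u[i]) for i in range(3)])  # (3, 3, 32, 32, 32)
f = np.moveaxis(grad, (0, 1), (-2, -1)) + np.eye(3)     # deformation gradient F
jac = np.linalg.det(f)                                  # (32, 32, 32) volume-change map
print(jac.mean(), jac.min(), jac.max())                 # values near 1 for small strains
```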

  8. Smoothing of the bivariate LOD score for non-normal quantitative traits.

    Science.gov (United States)

    Buil, Alfonso; Dyer, Thomas D; Almasy, Laura; Blangero, John

    2005-12-30

    Variance component analysis provides an efficient method for performing linkage analysis for quantitative traits. However, the type I error of variance-components-based likelihood ratio testing may be affected when phenotypic data are non-normally distributed (especially with high values of kurtosis), resulting in inflated LOD scores when the normality assumption does not hold. Even though different solutions have been proposed to deal with this problem for univariate phenotypes, little work has been done in the multivariate case. We present an empirical approach to adjust the inflated LOD scores obtained from a bivariate phenotype that violates the assumption of normality. Using the Collaborative Study on the Genetics of Alcoholism data available for the Genetic Analysis Workshop 14, we show how bivariate linkage analysis with leptokurtic traits gives an inflated type I error. We perform a novel correction that achieves acceptable levels of type I error.

  9. Correlation- and covariance-supported normalization method for estimating orthodontic trainer treatment for clenching activity.

    Science.gov (United States)

    Akdenur, B; Okkesum, S; Kara, S; Günes, S

    2009-11-01

    In this study, electromyography signals sampled from children undergoing orthodontic treatment were used to estimate the effect of an orthodontic trainer on the anterior temporal muscle. A novel data normalization method, called the correlation- and covariance-supported normalization method (CCSNM), based on the correlation and covariance between features in a data set, is proposed to provide predictive guidance for the orthodontic technique. The method was tested in two stages: first, data normalization using the CCSNM; second, prediction of the normalized values of the anterior temporal muscles using an artificial neural network (ANN) with a Levenberg-Marquardt learning algorithm. The data set consists of electromyography signals from right anterior temporal muscles, recorded from 20 children aged 8-13 years with class II malocclusion. The signals were recorded at the start and end of a 6-month treatment. In order to train and test the ANN, two-fold cross-validation was used. The CCSNM was compared with four normalization methods: minimum-maximum normalization, z-score, decimal scaling, and line-base normalization. In order to demonstrate the performance of the proposed method, prevalent performance measures were examined: the mean square error and mean absolute error as mathematical measures, and the statistical relation factor R2 and the average deviation as statistical measures. The results show that the CCSNM was the best of the normalization methods for estimating the effect of the trainer.
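
    The CCSNM formula itself is not given in the record, so the sketch below instead illustrates the four baseline normalizations it was compared against; the reading of line-base normalization as subtraction of a baseline recording is an assumption.

```python
# Sketch of the four comparison normalizations named in the record.
import numpy as np

def min_max(x):          # rescale each feature to [0, 1]
    return (x - x.min(0)) / (x.max(0) - x.min(0))

def z_score(x):          # zero mean, unit variance per feature
    return (x - x.mean(0)) / x.std(0)

def decimal_scaling(x):  # divide by the smallest power of 10 bounding |x|
    return x / 10 ** np.ceil(np.log10(np.abs(x).max(0)))

def line_base(x, base=0):  # subtract a baseline row (one plausible reading)
    return x - x[base]

x = np.random.default_rng(9).normal(50.0, 12.0, size=(20, 4))  # 20 EMG feature vectors
print(z_score(x).mean(0).round(6))  # ~0 per feature, confirming the centering
```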

  10. The Self-management of Chronic Pain Through the Use of a Smartphone Application: An Interpretative Phenomenological Approach

    Directory of Open Access Journals (Sweden)

    Carolynn Greene

    2015-10-01

    Overall, the qualitative accounts of experience revealed that the smartphone application allowed individuals to reflect upon their chronic pain experience in a way that was different from their normal everyday lives, and it built up a more ecological and meaningful picture of the pain experience for the participants. The findings also raised some interesting issues surrounding the use of pain scales, the individuality of experience, and how smartphone applications are interpreted and integrated into everyday life. The findings build upon previous research in this area by providing deeper accounts of insider experience, enhancing our understanding of smartphone application usage among those living with chronic pain.

  11. In vivo 1H MR spectroscopy of human brain in six normal volunteers

    International Nuclear Information System (INIS)

    Choe, Bo Young; Suh, Tae Suk; Bahk, Yong Whee; Shinn, Kyung Sub

    1993-01-01

    In vivo 1H MR spectroscopic studies were performed on the human brain in six normal volunteers. Several distinct proton metabolites, such as N-acetylaspartate (NAA), creatine/phosphocreatine (Cr), choline/phosphocholine (Cho), myo-inositol (Ins) and lipid (fat), were clearly identified in normal brain tissue, with the NAA resonance showing the strongest signal intensity. Standard metabolite ratios from normal brain tissue in specific regions were obtained as references for further in vivo 1H MR spectroscopic studies. Our initial results suggest that in vivo 1H MR spectroscopy may provide more precise diagnosis on the basis of the metabolic information on brain tissues. The unique ability of in vivo 1H MR spectroscopy to offer noninvasive information about tissue biochemistry in patients will strengthen its impact on clinical research and disease diagnosis.

  12. Effect of Soret diffusion on lean hydrogen/air flames at normal and elevated pressure and temperature

    KAUST Repository

    Zhou, Zhen; Hernandez Perez, Francisco; Shoshin, Yuriy; van Oijen, Jeroen A.; de Goey, Laurentius P.H.

    2017-01-01

    The influence of Soret diffusion on lean premixed flames propagating in hydrogen/air mixtures is numerically investigated with detailed chemical and transport models at normal and elevated pressure and temperature. The Soret diffusion influence on the one-dimensional (1D) flame mass burning rate and two-dimensional (2D) flame propagating characteristics is analysed, revealing a strong dependency on flame stretch rate, pressure and temperature. For 1D flames, at normal pressure and temperature, with an increase of Karlovitz number from 0 to 0.4, the mass burning rate is first reduced and then enhanced by Soret diffusion of H2, while it is reduced by Soret diffusion of H. The influence of Soret diffusion of H2 is enhanced by pressure and reduced by temperature. On the contrary, the influence of Soret diffusion of H is reduced by pressure and enhanced by temperature. For 2D flames, at normal pressure and temperature, during the early phase of flame evolution, flames with Soret diffusion display more curved flame cells. Pressure enhances this effect, while temperature reduces it. The influence of Soret diffusion of H2 on the global consumption speed is enhanced at elevated pressure. The influence of Soret diffusion of H on the global consumption speed is enhanced at elevated temperature. The flame evolution is more affected by Soret diffusion in the early phase of propagation than in the long run due to the local enrichment of H2 caused by flame curvature effects. The present study provides new insights into the Soret diffusion effect on the characteristics of lean hydrogen/air flames at conditions that are relevant to practical applications, e.g. gas engines and turbines.

  13. Effect of Soret diffusion on lean hydrogen/air flames at normal and elevated pressure and temperature

    KAUST Repository

    Zhou, Zhen

    2017-04-12

    The influence of Soret diffusion on lean premixed flames propagating in hydrogen/air mixtures is numerically investigated with detailed chemical and transport models at normal and elevated pressure and temperature. The Soret diffusion influence on the one-dimensional (1D) flame mass burning rate and two-dimensional (2D) flame propagating characteristics is analysed, revealing a strong dependency on flame stretch rate, pressure and temperature. For 1D flames, at normal pressure and temperature, with an increase of Karlovitz number from 0 to 0.4, the mass burning rate is first reduced and then enhanced by Soret diffusion of H2, while it is reduced by Soret diffusion of H. The influence of Soret diffusion of H2 is enhanced by pressure and reduced by temperature. On the contrary, the influence of Soret diffusion of H is reduced by pressure and enhanced by temperature. For 2D flames, at normal pressure and temperature, during the early phase of flame evolution, flames with Soret diffusion display more curved flame cells. Pressure enhances this effect, while temperature reduces it. The influence of Soret diffusion of H2 on the global consumption speed is enhanced at elevated pressure. The influence of Soret diffusion of H on the global consumption speed is enhanced at elevated temperature. The flame evolution is more affected by Soret diffusion in the early phase of propagation than in the long run due to the local enrichment of H2 caused by flame curvature effects. The present study provides new insights into the Soret diffusion effect on the characteristics of lean hydrogen/air flames at conditions that are relevant to practical applications, e.g. gas engines and turbines.

  14. LET effects on normal and radiosensitive cell lines

    International Nuclear Information System (INIS)

    Geard, C.R.; Travisano, M.

    1986-01-01

    Charged particles in the track-segment mode were produced by the RARAF Van de Graaff accelerator and used to irradiate two CHO cell lines: a radiosensitive, hypermutable line, EM9, and its normal parent, AA8. Asynchronous cells attached to 6-micrometer-thick Mylar were irradiated with protons, deuterons and helium-3 particles at LETs ranging from 10 to 150 keV per micrometer. A 50 kVp x-ray tube integrated into the track-segment facility provided a low-LET comparison. Following irradiation, cells were monitored for clonogenicity and, in a separate series of experiments, for frequencies of sister chromatid exchanges. Up to 9 experiments were carried out at each LET, with a total of 8 radiation qualities of different LETs being compared. The optimally effective LET for cell survival was between 80 and 120 keV per micrometer, with the 150 keV per micrometer particles indicating energy wastage. The differential between the normal and radiosensitive cell lines was maintained at all LETs.

  15. Precaval retropancreatic space: Normal anatomy

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yeon Hee; Kim, Ki Whang; Kim, Myung Jin; Yoo, Hyung Sik; Lee, Jong Tae [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    1992-07-15

    The authors defined the precaval retropancreatic space as the space between the pancreatic head with portal vein and the IVC, and analyzed the CT findings of this space to establish the normal structures and dimensions within it. We retrospectively evaluated 100 normal abdominal CT scans to identify the normal anatomic structures of the precaval retropancreatic space. We also measured the distances between these structures and calculated the minimum, maximum and mean values. At the splenoportal confluence level, the normal structures between the portal vein and IVC were vessels (21%), lymph nodes (19%), and the caudate lobe of the liver (2%), in order of frequency. The maximum AP diameter of the portocaval lymph nodes was 4 mm. The common bile duct (CBD) was seen in 44%, with a mean diameter of 3 mm and a maximum of 11 mm. The CBD was extrapancreatic in 75% and lateral to the pancreatic head in 60.6%. At the IVC-left renal vein level, the maximum distance between the CBD and IVC was 5 mm, and the only structure between the posterior pancreatic surface and the IVC was fat tissue. Knowledge of these normal structures and measurements will be helpful in differentiating pancreatic masses from retropancreatic masses such as lymphadenopathy.

  16. Application of OCT angiography in ophthalmology

    Directory of Open Access Journals (Sweden)

    Ai-Ping Yang

    2017-11-01

    Full Text Available Optical coherence tomography angiography (OCTA) is an angiography technology developed in recent years. In addition to the advantages of traditional OCT, it can visualize blood flow in different retinal and choroidal segmentation slabs. By using pseudo-color rendering, abnormal vascular structures can be distinguished from the normal vascular structure of the retina. Unlike fundus fluorescein angiography (FFA) and indocyanine green angiography (ICGA), OCTA requires no dye injection, and it provides more abundant and more accurate blood flow information. However, like other biometric technologies, OCTA has its limitations and shortcomings. This review analyzes and summarizes the operating principle of OCTA, its applications in ophthalmology, and its advantages and limitations.

  17. BioTapestry now provides a web application and improved drawing and layout tools.

    Science.gov (United States)

    Paquette, Suzanne M; Leinonen, Kalle; Longabaugh, William J R

    2016-01-01

    Gene regulatory networks (GRNs) control embryonic development, and to understand this process in depth, researchers need to have a detailed understanding of both the network architecture and its dynamic evolution over time and space. Interactive visualization tools better enable researchers to conceptualize, understand, and share GRN models. BioTapestry is an established application designed to fill this role, and recent enhancements released in Versions 6 and 7 have targeted two major facets of the program. First, we introduced significant improvements for network drawing and automatic layout that have now made it much easier for the user to create larger, more organized network drawings. Second, we revised the program architecture so it could continue to support the current Java desktop Editor program, while introducing a new BioTapestry GRN Viewer that runs as a JavaScript web application in a browser. We have deployed a number of GRN models using this new web application. These improvements will ensure that BioTapestry remains viable as a research tool in the face of the continuing evolution of web technologies, and as our understanding of GRN models grows.

  18. IBUPROFEN AS A MEDICATION FOR A CORRECTION OF SYMPTOMS OF NORMAL VACCINAL PROCESS IN CHILDREN

    Directory of Open Access Journals (Sweden)

    T.A. Chebotareva

    2008-01-01

    Full Text Available The pathogenetic approach to the treatment of symptoms of the normal vaccinal process in children after standard vaccination, based on the results of the application of the anti-inflammatory medications ibuprofen (Nurofen for children) and paracetamol, is presented in this article. The clinical activity of ibuprofen was established on the basis of clinical catamnestic observation of 856 vaccinated children aged from 3 months to 3 years. Recommendations for the application of these medications for the correction of vaccinal reactions are given. Key words: children, ibuprofen, paracetamol, vaccination.

  19. Extreme-value limit of the convolution of exponential and multivariate normal distributions: Link to the Hüsler–Reiß distribution

    KAUST Repository

    Krupskii, Pavel

    2017-11-02

    The multivariate Hüsler–Reiß copula is obtained as a direct extreme-value limit from the convolution of a multivariate normal random vector and an exponential random variable multiplied by a vector of constants. It is shown how the set of Hüsler–Reiß parameters can be mapped to the parameters of this convolution model. Assuming there are no singular components in the Hüsler–Reiß copula, the convolution model leads to exact and approximate simulation methods. An application of simulation is to check if the Hüsler–Reiß copula with different parsimonious dependence structures provides adequate fit to some data consisting of multivariate extremes.

  20. Extreme-value limit of the convolution of exponential and multivariate normal distributions: Link to the Hüsler–Reiß distribution

    KAUST Repository

    Krupskii, Pavel; Joe, Harry; Lee, David; Genton, Marc G.

    2017-01-01

    The multivariate Hüsler–Reiß copula is obtained as a direct extreme-value limit from the convolution of a multivariate normal random vector and an exponential random variable multiplied by a vector of constants. It is shown how the set of Hüsler–Reiß parameters can be mapped to the parameters of this convolution model. Assuming there are no singular components in the Hüsler–Reiß copula, the convolution model leads to exact and approximate simulation methods. An application of simulation is to check if the Hüsler–Reiß copula with different parsimonious dependence structures provides adequate fit to some data consisting of multivariate extremes.
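
    The convolution model described in the two records above is straightforward to simulate: draw a multivariate normal vector Z, an independent exponential variable E, and form X = Z + cE for a vector of constants c; componentwise block maxima of such samples approach the Hüsler-Reiß dependence structure. The parameter values below are illustrative assumptions, not fitted quantities.

```python
# Sketch: simulate the convolution model X = Z + c*E (Z multivariate normal,
# E exponential) and take componentwise block maxima of the samples.
import numpy as np

rng = np.random.default_rng(11)
d, n = 3, 100_000
cov = np.array([[1.0, 0.5, 0.3],
                [0.5, 1.0, 0.4],
                [0.3, 0.4, 1.0]])
c = np.array([1.0, 0.8, 1.2])            # constants multiplying the exponential

z = rng.multivariate_normal(np.zeros(d), cov, size=n)
e = rng.exponential(size=(n, 1))         # one exponential variable per vector
x = z + c * e                            # sample from the convolution model

block_max = x.reshape(100, 1000, d).max(axis=1)  # componentwise block maxima
print(block_max.shape)                           # (100, 3) approximate HR extremes
```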