WorldWideScience

Sample records for level analysis based

  1. INNOVATION ANALYSIS BASED ON SCORES AT THE FIRM LEVEL

    Directory of Open Access Journals (Sweden)

    Cătălin George ALEXE

    2014-04-01

    Full Text Available Innovation analysis based on scores (Innovation Scorecard) is a simple way to obtain a quick diagnosis of a firm's innovation potential as it works toward building innovation capability. It aims to identify and remedy deficient aspects of innovation management, serving as a measuring tool for innovation initiatives over time within the innovation audit. The paper presents the advantages and disadvantages of using the method, along with the three approaches developed over time. To that end, the model proposed by the consulting firm Arthur D. Little in collaboration with the European Business School, Eckelmann's model, and AGGB's local model are summarized and compared. At the end of the paper, several possible solutions are proposed to improve score-based analysis.

  2. Analysis of ¹⁴C level around Qinshan NPP base

    International Nuclear Information System (INIS)

    Huang Renjie; Liang Haiyan; Chen Qianyuan; Ni Shiying; He Jun; Zeng Guangjian; Ma Yongfu

    2012-01-01

    By using the method of alkaline solution absorption, the activity concentrations of Carbon-14, as well as its variation tendency, in air and biological samples were analyzed. The air samples and biological samples were collected around the Qinshan nuclear power plant base (Qinshan NPP Base) from 2002 to 2009 and from 2007 to 2009, respectively. The results showed that, since 2002, the annual average activity concentrations of Carbon-14 in air samples were in the range of 38.3 mBq/m³ to 55.4 mBq/m³. Although the monitoring results of Xiajiawan village and Yangliucun village were comparatively higher than those of the reference site in Hangzhou City, the results were still at the same level. Meanwhile, the monitoring results of Xiajiawan village and Yangliucun village in the summers of 2004 and 2005 were relatively high, with the peak value of 55.4 mBq/m³ appearing in Xiajiawan village during the summer of 2005. Correspondingly, the annual airborne Carbon-14 discharged from the Qinshan NPP 3rd Phase in 2004 and 2005 was higher than normal as well; it can therefore be concluded that the activity concentration of Carbon-14 around the Qinshan NPP Base is related to the discharged source term. The activity concentrations of Carbon-14 in rice and leaf vegetable samples from Xiajiawan village and Yangliucun village were slightly higher than those of Hangzhou, but within the same level. The activity concentration of Carbon-14 in the mullet samples collected from the sea area around the Qinshan NPP Base is approximately the same as that in the sea area of Zhoushan. (authors)

  3. Base-By-Base: single nucleotide-level analysis of whole viral genome alignments.

    Science.gov (United States)

    Brodie, Ryan; Smith, Alex J; Roper, Rachel L; Tcherepanov, Vasily; Upton, Chris

    2004-07-14

    With ever increasing numbers of closely related virus genomes being sequenced, it has become desirable to be able to compare two genomes at a level more detailed than gene content because two strains of an organism may share the same set of predicted genes but still differ in their pathogenicity profiles. For example, detailed comparison of multiple isolates of the smallpox virus genome (each approximately 200 kb, with 200 genes) is not feasible without new bioinformatics tools. A software package, Base-By-Base, has been developed that provides visualization tools to enable researchers to 1) rapidly identify and correct alignment errors in large, multiple genome alignments; and 2) generate tabular and graphical output of differences between the genomes at the nucleotide level. Base-By-Base uses detailed annotation information about the aligned genomes and can list each predicted gene with nucleotide differences, display whether variations occur within promoter regions or coding regions and whether these changes result in amino acid substitutions. Base-By-Base can connect to our mySQL database (Virus Orthologous Clusters; VOCs) to retrieve detailed annotation information about the aligned genomes or use information from text files. Base-By-Base enables users to quickly and easily compare large viral genomes; it highlights small differences that may be responsible for important phenotypic differences such as virulence. It is available via the Internet using Java Web Start and runs on Macintosh, PC and Linux operating systems with the Java 1.4 virtual machine.
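
    As a rough illustration of the kind of nucleotide-level comparison Base-By-Base performs (a sketch of the idea, not its actual implementation), the following Python snippet walks two aligned sequences, reports each difference, and looks up which annotated gene it falls in; the sequences and gene coordinates are invented.

```python
# Sketch of nucleotide-level comparison of two aligned genomes, in the
# spirit of Base-By-Base (not its actual implementation). The sequences
# and gene coordinates below are invented.

def diff_alignment(seq_a, seq_b, genes):
    """List positions where two aligned sequences differ and the gene hit.

    seq_a, seq_b -- aligned sequences of equal length ('-' marks gaps)
    genes        -- list of (name, start, end) in 0-based alignment coords
    """
    assert len(seq_a) == len(seq_b), "sequences must be aligned"
    diffs = []
    for pos, (a, b) in enumerate(zip(seq_a.upper(), seq_b.upper())):
        if a != b:
            gene = next((g for g, s, e in genes if s <= pos < e), "intergenic")
            diffs.append((pos, a, b, gene))
    return diffs

genes = [("geneA", 0, 9), ("geneB", 12, 21)]   # hypothetical annotation
for pos, a, b, gene in diff_alignment("ATGCCCGAT--AATGGCATAA",
                                      "ATGCCTGATCGAATGGCTTAA", genes):
    print(f"pos {pos}: {a} -> {b} in {gene}")
```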

  4. Knowledge-based low-level image analysis for computer vision systems

    Science.gov (United States)

    Dhawan, Atam P.; Baxi, Himanshu; Ranganath, M. V.

    1988-01-01

    Two algorithms for entry-level image analysis and preliminary segmentation are proposed which are flexible enough to incorporate local properties of the image. The first algorithm involves pyramid-based multiresolution processing and a strategy to define and use interlevel and intralevel link strengths. The second algorithm, which is designed for selected window processing, extracts regions adaptively using local histograms. The preliminary segmentation and a set of features are employed as the input to an efficient rule-based low-level analysis system, resulting in suboptimal meaningful segmentation.
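
    The second algorithm is described only at a high level, so the following Python sketch shows one plausible reading of window-based region extraction from local histograms: each window is thresholded using its own histogram, here via scikit-image's Otsu threshold (an assumption for illustration, not the authors' method).

```python
# Sketch of window-wise adaptive segmentation from local histograms,
# loosely following the idea in the abstract (not the authors'
# algorithm). Assumes numpy and scikit-image are available.
import numpy as np
from skimage.filters import threshold_otsu

def segment_by_local_histograms(image, win=64):
    """Threshold each win x win window using its own histogram (Otsu)."""
    labels = np.zeros_like(image, dtype=bool)
    for r in range(0, image.shape[0], win):
        for c in range(0, image.shape[1], win):
            block = image[r:r + win, c:c + win]
            if block.min() == block.max():       # flat window: single region
                continue
            labels[r:r + win, c:c + win] = block > threshold_otsu(block)
    return labels

rng = np.random.default_rng(0)
img = rng.normal(size=(256, 256)) + np.linspace(0, 4, 256)  # shaded toy scene
mask = segment_by_local_histograms(img, win=64)
print(mask.mean())   # fraction of pixels assigned to foreground regions
```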

  5. Base-By-Base: Single nucleotide-level analysis of whole viral genome alignments

    Directory of Open Access Journals (Sweden)

    Tcherepanov Vasily

    2004-07-01

    Full Text Available Abstract Background With ever increasing numbers of closely related virus genomes being sequenced, it has become desirable to be able to compare two genomes at a level more detailed than gene content because two strains of an organism may share the same set of predicted genes but still differ in their pathogenicity profiles. For example, detailed comparison of multiple isolates of the smallpox virus genome (each approximately 200 kb, with 200 genes) is not feasible without new bioinformatics tools. Results A software package, Base-By-Base, has been developed that provides visualization tools to enable researchers to 1) rapidly identify and correct alignment errors in large, multiple genome alignments; and 2) generate tabular and graphical output of differences between the genomes at the nucleotide level. Base-By-Base uses detailed annotation information about the aligned genomes and can list each predicted gene with nucleotide differences, display whether variations occur within promoter regions or coding regions and whether these changes result in amino acid substitutions. Base-By-Base can connect to our mySQL database (Virus Orthologous Clusters; VOCs) to retrieve detailed annotation information about the aligned genomes or use information from text files. Conclusion Base-By-Base enables users to quickly and easily compare large viral genomes; it highlights small differences that may be responsible for important phenotypic differences such as virulence. It is available via the Internet using Java Web Start and runs on Macintosh, PC and Linux operating systems with the Java 1.4 virtual machine.

  6. System-Level Sensitivity Analysis of SiNW-bioFET-Based Biosensing Using Lockin Amplification

    DEFF Research Database (Denmark)

    Patou, François; Dimaki, Maria; Kjærgaard, Claus

    2017-01-01

    …carry out for the first time the system-level sensitivity analysis of a generic SiNW-bioFET model coupled to a custom-design instrument based on the lock-in amplifier. By investigating a large parametric space spanning both sensor and instrumentation specifications, we demonstrate that system-wide…

  7. Airborne electromagnetic data levelling using principal component analysis based on flight line difference

    Science.gov (United States)

    Zhang, Qiong; Peng, Cong; Lu, Yiming; Wang, Hao; Zhu, Kaiguang

    2018-04-01

    A novel technique is developed to level airborne geophysical data using principal component analysis based on flight line differences. In this paper, flight line differencing is introduced to enhance the features of the levelling error in airborne electromagnetic (AEM) data and to improve the correlation between pseudo tie lines. Levelling is therefore applied to the flight line difference data rather than directly to the original AEM data. Pseudo tie lines are selected so that they are distributed across the profile direction while avoiding anomalous regions. Since the levelling errors of the selected pseudo tie lines are highly correlated, principal component analysis is applied to extract the local levelling errors by reconstruction from the low-order principal components. The levelling errors of the original AEM data are then obtained through inverse differencing after spatial interpolation. This levelling method requires neither flying tie lines nor designing a levelling fitting function. Its effectiveness is demonstrated on survey data, with the results compared against tie-line levelling and flight-line correlation levelling.
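
    A minimal numerical sketch of the core idea follows, keeping only the difference/PCA/integrate steps (the paper's pseudo tie-line selection and spatial interpolation are omitted); it assumes one AEM channel has already been gridded into a lines-by-samples array.

```python
# Conceptual sketch: level flight-line differences by keeping only the
# low-order principal components, which capture the slowly varying
# levelling error. Assumes numpy and scikit-learn.
import numpy as np
from sklearn.decomposition import PCA

def level_by_difference_pca(data, n_components=2):
    """data: (n_lines, n_samples) array of one AEM channel per flight line."""
    diff = np.diff(data, axis=0)                 # flight-line differences
    pca = PCA(n_components=n_components)
    smooth = pca.inverse_transform(pca.fit_transform(diff))
    # integrate differences back to per-line errors (first line = reference)
    line_error = np.vstack([np.zeros(data.shape[1]),
                            np.cumsum(smooth, axis=0)])
    return data - line_error

rng = np.random.default_rng(1)
clean = np.tile(np.sin(np.linspace(0, 6, 200)), (40, 1))   # geology signal
drift = rng.normal(scale=0.5, size=(40, 1)) * np.ones((1, 200))
drift[0] = 0.0                                  # first line as reference
levelled = level_by_difference_pca(clean + drift)
print(np.abs(levelled - clean).mean())          # residual should be small
```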

  8. Fast noise level estimation algorithm based on principal component analysis transform and nonlinear rectification

    Science.gov (United States)

    Xu, Shaoping; Zeng, Xiaoxia; Jiang, Yinnan; Tang, Yiling

    2018-01-01

    We proposed a noniterative principal component analysis (PCA)-based noise level estimation (NLE) algorithm that addresses the problem of estimating the noise level with a two-step scheme. First, we randomly extracted a number of raw patches from a given noisy image and took the smallest eigenvalue of the covariance matrix of the raw patches as the preliminary estimate of the noise level. Next, the final estimate was obtained directly with a nonlinear mapping (rectification) function trained on representative noisy images corrupted with different known noise levels. Compared with state-of-the-art NLE algorithms, the experimental results show that the proposed algorithm can reliably infer the noise level and performs robustly over a wide range of image contents and noise levels, offering a good compromise between speed and accuracy.
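
    A minimal sketch of the first (PCA) step: the smallest eigenvalue of the covariance matrix of random patches approximates the noise variance. The trained nonlinear rectification of the second step is replaced here by a plain square root, so treat this as a rough estimator only.

```python
# PCA-based preliminary noise level estimate: the smallest eigenvalue of
# the patch covariance matrix approximates the noise variance. The
# paper's learned rectification step is not reproduced here.
import numpy as np

def estimate_noise_sigma(image, patch=7, n_patches=5000, seed=0):
    rng = np.random.default_rng(seed)
    h, w = image.shape
    rows = rng.integers(0, h - patch, n_patches)
    cols = rng.integers(0, w - patch, n_patches)
    patches = np.stack([image[r:r + patch, c:c + patch].ravel()
                        for r, c in zip(rows, cols)])
    cov = np.cov(patches, rowvar=False)
    smallest = np.linalg.eigvalsh(cov)[0]   # eigvalsh sorts ascending
    return np.sqrt(max(smallest, 0.0))

rng = np.random.default_rng(1)
clean = np.outer(np.linspace(0, 255, 512), np.ones(512))   # smooth image
noisy = clean + rng.normal(scale=10.0, size=clean.shape)
print(estimate_noise_sigma(noisy))   # should be close to 10
```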

  9. Identifying novel glioma associated pathways based on systems biology level meta-analysis.

    Science.gov (United States)

    Hu, Yangfan; Li, Jinquan; Yan, Wenying; Chen, Jiajia; Li, Yin; Hu, Guang; Shen, Bairong

    2013-01-01

    With recent advances in high-throughput technologies, including genomics, proteomics, and metabolomics, integrating these "-omics" data to analyze complex diseases has become a great challenge. Glioma is an extremely aggressive and lethal form of brain tumor, so the study of the molecular mechanisms underlying glioma remains very important. To date, most studies focus on detecting differentially expressed genes in glioma. However, meta-analysis for pathway analysis based on multiple microarray datasets has not been systematically pursued. In this study, we therefore developed a systems biology based approach that integrates three types of omics data to identify common pathways in glioma. First, a meta-analysis was performed to study the overlap of signatures at different levels based on microarray gene expression data of glioma. Among these gene expression datasets, 12 pathways in the GeneGO database were found to be shared by four stages. Then, microRNA expression profiles and ChIP-seq data were integrated for further pathway enrichment analysis. As a result, we suggest that 5 of these pathways could serve as putative pathways in glioma. Among them, the pathway of TGF-beta-dependent induction of EMT via SMAD is of particular importance. Our results demonstrate that meta-analysis at the systems biology level provides a more useful approach to studying the molecular mechanisms of complex disease. The integration of different types of omics data, including gene expression microarrays, microRNA and ChIP-seq data, suggests some common pathways correlated with glioma. These findings offer useful potential candidates for targeted therapeutic intervention in glioma.
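
    The pathway enrichment step can be illustrated with a generic hypergeometric over-representation test; this is a sketch of the standard technique, not the GeneGO-based pipeline used in the study, and the gene sets below are hypothetical.

```python
# Generic over-representation test of the kind used in pathway
# enrichment analysis (a sketch, not the study's pipeline).
from scipy.stats import hypergeom

def enrichment_p(genes_of_interest, pathway_genes, n_background):
    """P-value that the overlap is at least as large as observed."""
    overlap = len(set(genes_of_interest) & set(pathway_genes))
    # survival function at overlap-1 gives P(X >= overlap)
    return hypergeom.sf(overlap - 1, n_background,
                        len(pathway_genes), len(genes_of_interest))

degs = {"TGFB1", "SMAD2", "SMAD4", "SNAI1", "CDH1"}            # hypothetical DEGs
pathway = {"TGFB1", "SMAD2", "SMAD3", "SMAD4", "SNAI1", "ZEB1"}  # hypothetical set
print(enrichment_p(degs, pathway, n_background=20000))
```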

  10. A Novel Acoustic Liquid Level Determination Method for Coal Seam Gas Wells Based on Autocorrelation Analysis

    Directory of Open Access Journals (Sweden)

    Ximing Zhang

    2017-11-01

    Full Text Available In coal seam gas (CSG) wells, water is periodically removed from the wellbore in order to keep the bottom-hole flowing pressure at low levels, facilitating the desorption of methane gas from the coal bed. In order to calculate the gas flow rate and further optimize well performance, it is necessary to accurately monitor the liquid level in real time. This paper presents a novel method based on autocorrelation function (ACF) analysis for determining the liquid level in CSG wells under intense noise conditions. The method involves calculating the acoustic travel time in the annulus and processing the autocorrelation signal in order to extract the weak echo under high background noise. In contrast to previous works, the non-linear dependence of the acoustic velocity on temperature and pressure is taken into account. To locate the liquid level of a coal seam gas well, the travel time is computed iteratively with the non-linear velocity model. Afterwards, the proposed method is validated with laboratory experiments developed for liquid level detection under two scenarios, representing the combination of low pressure, weak signal, and intense noise generated by gas flow and leakage. Using an evaluation indicator called the crest factor, the results show the superiority of the ACF-based method over Fourier filtering (FFT). In the two scenarios, the maximal measurement error of the proposed method was 0.34% and 0.50%, respectively. The latent periodic characteristic of the reflected signal can be extracted by the ACF-based method even when the noise exceeds 1.42 Pa, which is impossible for FFT-based de-noising. A case study of a specific CSG well illustrates the feasibility of the proposed approach and demonstrates that signal processing with autocorrelation analysis can improve the sensitivity of the detection system.
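
    A minimal sketch of the echo-detection idea: the lag of the dominant off-zero autocorrelation peak estimates the round-trip travel time, converted to a depth with an assumed constant sound speed (the paper instead iterates its non-linear temperature/pressure velocity model, which is deliberately omitted here).

```python
# Echo detection by autocorrelation: find the dominant off-zero ACF peak
# and convert its lag to a depth with an assumed constant sound speed.
import numpy as np

def echo_delay(signal, fs, min_lag_s=0.05):
    """Round-trip delay (s) from the largest autocorrelation peak."""
    x = signal - signal.mean()
    acf = np.correlate(x, x, mode="full")[x.size - 1:]   # non-negative lags
    min_lag = int(min_lag_s * fs)                        # skip zero-lag peak
    return (min_lag + np.argmax(acf[min_lag:])) / fs

fs = 2_000                                    # sample rate (Hz), assumed
t = np.arange(0, 2.0, 1 / fs)
pulse = np.exp(-((t - 0.1) / 0.005) ** 2)     # outgoing pulse at t = 0.1 s
echo = 0.3 * np.exp(-((t - 0.9) / 0.005) ** 2)   # weak echo 0.8 s later
rng = np.random.default_rng(2)
noisy = pulse + echo + rng.normal(scale=0.1, size=t.size)

delay = echo_delay(noisy, fs)                 # expect about 0.8 s
c = 340.0                                     # assumed sound speed (m/s)
print(f"delay {delay:.3f} s -> liquid level at {c * delay / 2:.0f} m")
```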

  11. Area-level poverty and preterm birth risk: A population-based multilevel analysis

    Science.gov (United States)

    DeFranco, Emily A; Lian, Min; Muglia, Louis A; Schootman, Mario

    2008-01-01

    Background Preterm birth is a complex disease with etiologic influences from a variety of social, environmental, hormonal, genetic, and other factors. The purpose of this study was to utilize a large population-based birth registry to estimate the independent effect of county-level poverty on preterm birth risk. To accomplish this, we used a multilevel logistic regression approach to account for multiple co-existent individual-level variables and county-level poverty rate. Methods Population-based study utilizing Missouri's birth certificate database (1989–1997). We conducted a multilevel logistic regression analysis to estimate the effect of county-level poverty on PTB risk. Of 634,994 births nested within 115 counties in Missouri, two levels were considered. Individual-level variables included demographic factors, prenatal care, health-related behavioral risk factors, and medical risk factors. The area-level variable was the percentage of the population within each county living below the poverty line (US census data, 1990). Counties were divided into quartiles of poverty; the first quartile (lowest rate of poverty) was the reference group. Results The PTB rate was lowest in the first quartile of poverty and increased through the 4th quartile (4.9%). County-level poverty was significantly associated with PTB risk: adjusted odds ratio (adjOR) 1.18 (95% CI 1.03, 1.35), with a similar effect at earlier gestational ages (adjOR 1.27, 95% CI 1.06, 1.52). Conclusion Women residing in socioeconomically deprived areas are at increased risk of preterm birth, above other underlying risk factors. Although the risk increase is modest, it affects a large number of pregnancies. PMID:18793437

  12. Area-level poverty and preterm birth risk: A population-based multilevel analysis

    Directory of Open Access Journals (Sweden)

    Muglia Louis A

    2008-09-01

    Full Text Available Abstract Background Preterm birth is a complex disease with etiologic influences from a variety of social, environmental, hormonal, genetic, and other factors. The purpose of this study was to utilize a large population-based birth registry to estimate the independent effect of county-level poverty on preterm birth risk. To accomplish this, we used a multilevel logistic regression approach to account for multiple co-existent individual-level variables and county-level poverty rate. Methods Population-based study utilizing Missouri's birth certificate database (1989–1997). We conducted a multilevel logistic regression analysis to estimate the effect of county-level poverty on PTB risk. Of 634,994 births nested within 115 counties in Missouri, two levels were considered. Individual-level variables included demographic factors, prenatal care, health-related behavioral risk factors, and medical risk factors. The area-level variable was the percentage of the population within each county living below the poverty line (US census data, 1990). Counties were divided into quartiles of poverty; the first quartile (lowest rate of poverty) was the reference group. Results The PTB rate was lowest in the first quartile of poverty and increased through the 4th quartile (4.9%). County-level poverty was significantly associated with PTB risk: adjOR 1.18 (95% CI 1.03, 1.35), with a similar effect at earlier gestational ages (adjOR 1.27, 95% CI 1.06, 1.52). Conclusion Women residing in socioeconomically deprived areas are at increased risk of preterm birth, above other underlying risk factors. Although the risk increase is modest, it affects a large number of pregnancies.
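
    For illustration, the fixed-effects core of such an analysis can be sketched with a single-level logistic regression on synthetic data (the actual study fits a true multilevel model with county random effects, which this simplification omits); all numbers below are simulated, not the study's data.

```python
# Single-level approximation of the quartile analysis: logistic
# regression of preterm birth on county poverty quartile, with the
# 1st (lowest-poverty) quartile as reference. Data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 50_000
county_poverty = rng.uniform(5, 35, size=n)          # % below poverty line
quartile = pd.qcut(county_poverty, 4, labels=[1, 2, 3, 4])
logit = -3.1 + 0.04 * (county_poverty - county_poverty.mean())
ptb = rng.random(n) < 1 / (1 + np.exp(-logit))       # synthetic PTB outcome

df = pd.DataFrame({"ptb": ptb.astype(int), "quartile": quartile})
fit = smf.logit("ptb ~ C(quartile)", data=df).fit(disp=False)
print(np.exp(fit.params))    # odds ratios vs the 1st (lowest) quartile
```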

  13. Analysis of theoretical security level of PDF Encryption mechanism based on X.509 certificates

    Directory of Open Access Journals (Sweden)

    Joanna Dmitruk

    2017-12-01

    Full Text Available PDF Encryption is a content security mechanism developed and used by Adobe in its products. In this paper, we have checked the theoretical security level of a variant that uses public key infrastructure and X.509 certificates. We have described the basis of this mechanism and performed a simple security analysis. Then, we have shown possible tweaks and security improvements. At the end, we give some recommendations that can improve the security of content secured with PDF Encryption based on X.509 certificates. Keywords: DRM, cryptography, security level, PDF Encryption, Adobe, X.509

  14. Low-level dosimetry based on activation analysis of badge film, 1

    International Nuclear Information System (INIS)

    Morikawa, Kaoru; Yamashita, Kazuya; Inamoto, Kazuo; Maeda, Masayuki; Sato, Takashi; Ono, Koichi.

    1988-01-01

    Underexposed badge film contains a minor quantity of silver corresponding to a low radiation dose even after completion of photographic densitometry; however, it cannot be detected with a photographic densitometer. We set out to determine this minor silver content by activation analysis with thermal neutrons at KUR (Kyoto University Research Reactor). Natural silver consists of two stable nuclides, ¹⁰⁷Ag and ¹⁰⁹Ag. Irradiation with thermal neutrons activates these to two radionuclides, ¹⁰⁸Ag and ¹¹⁰Ag. In this paper, through the activation analysis of an underexposed badge film, methods of measurement are shown for both ¹⁰⁸Ag produced by the ¹⁰⁷Ag(n, γ) reaction and ¹¹⁰Ag produced by the ¹⁰⁹Ag(n, γ) reaction. After underexposed badge films were irradiated with thermal neutrons, gamma-rays emitted from the radionuclides in the activated films were measured with a high-purity Ge detector or a NaI(Tl) scintillation detector. The following results were obtained: (1) several elements such as silver, iodine, gold, antimony, manganese and copper were detected by activation analyses of films exposed to low-level ⁶⁰Co gamma-rays; (2) the exposure vs ¹⁰⁸Ag or ¹¹⁰Ag activity curve was linear in the lower dose range of ⁶⁰Co gamma-rays. These data indicate that low-level radiation doses, which are indeterminable by ordinary photographic densitometry, can be estimated by activation analysis of silver atoms in badge films. (author)

  15. Sensitivity Analysis of features in tolerancing based on constraint function level sets

    International Nuclear Information System (INIS)

    Ziegler, Philipp; Wartzack, Sandro

    2015-01-01

    Usually, the geometry of the manufactured product inherently varies from the nominal geometry. This may negatively affect product functions and properties (such as quality and reliability), as well as the assemblability of the individual components. In order to avoid this, the geometric variation of these component surfaces and associated geometry elements (such as hole axes) is restricted by tolerances. Since tighter tolerances lead to significantly higher manufacturing costs, tolerances should be specified carefully. Therefore, the impact of deviating component surfaces on the functions, properties and assemblability of the product has to be analyzed. As physical experiments are expensive, statistical tolerance analysis tools are widely used in engineering design. Current tolerance simulation tools, however, lack an appropriate indicator for the impact of deviating component surfaces. In adopting Sensitivity Analysis methods, several challenges arise from the specific framework of tolerancing. This paper presents an approach for applying Sensitivity Analysis methods to current tolerance simulations via an interface module based on level sets of constraint functions for the parameters of the simulation model. The paper is an extension and generalization of Ziegler and Wartzack [1]. Mathematical properties of the constraint functions (convexity, homogeneity), which are important for the computational cost of the Sensitivity Analysis, are shown. The practical use of the method is illustrated in a case study of a plain bearing. - Highlights: • Alternative definition of Deviation Domains. • Proof of mathematical properties of the Deviation Domains. • Definition of the interface between Deviation Domains and Sensitivity Analysis. • Sensitivity analysis of a gearbox to show the method's practical use.

  16. PAW [Physics Analysis Workstation] at Fermilab: CORE based graphics implementation of HIGZ [High Level Interface to Graphics and Zebra]

    International Nuclear Information System (INIS)

    Johnstad, H.

    1989-06-01

    The Physics Analysis Workstation system (PAW) is primarily intended to be the last link in the analysis chain of experimental data. The graphical part of PAW is based on HIGZ (High Level Interface to Graphics and Zebra), which in turn is based on the ISO and ANSI standard Graphics Kernel System (GKS). HIGZ is written in the context of PAW. At Fermilab, the CORE based graphics system DI-3000 by Precision Visuals Inc. is widely used in the analysis of experimental data. The graphical part of the PAW routines has been totally rewritten and implemented in the Fermilab environment. 3 refs

  17. Analysis of factors affecting satisfaction level on problem based learning approach using structural equation modeling

    Science.gov (United States)

    Hussain, Nur Farahin Mee; Zahid, Zalina

    2014-12-01

    In today's job market, graduates are expected not only to perform well academically but also to excel in soft skills. Problem-Based Learning (PBL) has a number of distinct advantages as a learning method, as it can deliver graduates who will be highly prized by industry. This study attempts to determine the satisfaction level of engineering students with the PBL approach and to evaluate its determinant factors. Structural Equation Modeling (SEM) was used to investigate how the factors Good Teaching Scale, Clear Goals, Student Assessment and Levels of Workload affected student satisfaction with the PBL approach.

  18. Embodied energy analysis of photovoltaic (PV) system based on macro- and micro-level

    International Nuclear Information System (INIS)

    Nawaz, I.; Tiwari, G.N.

    2006-01-01

    In this paper, the energy payback time and CO₂ emissions of a photovoltaic (PV) system have been analyzed. The embodied energy for production of a PV module based on single-crystal silicon, as well as for the manufacturing of the other system components, has been computed at macro- and micro-level, assuming irradiation of 800–1200 W/m² on an inclined surface in different climatic zones in India. The energy payback time with and without balance-of-system has been evaluated for open-field and rooftop installations. It is found that the embodied energy at micro-level is significantly higher than the embodied energy at macro-level. The effects of insolation, overall efficiency and PV system lifetime on energy payback time and CO₂ emissions have been studied with and without balance of system. A 1.2 kWp SIEMENS PV system for a mudhouse at IIT Delhi has been evaluated at macro- and micro-level. The CO₂ mitigation potential and the importance and role of PV systems for sustainable development are also highlighted.
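
    A worked example of the underlying arithmetic, EPBT = embodied energy / annual energy output, with illustrative figures that are assumptions, not values from the paper:

```python
# Worked example of the energy payback time (EPBT) formula.
# All figures are illustrative assumptions, not values from the paper.
system_kwp = 1.2                      # system size, as in the paper's case study
embodied_kwh = 3000 * system_kwp      # assumed 3000 kWh embodied energy per kWp
annual_yield_kwh = 1600 * system_kwp  # assumed 1600 kWh/kWp/yr for Indian sites
lifetime_years = 30                   # assumed module lifetime
co2_per_kwh = 0.9                     # assumed kg CO2 per kWh of displaced grid power

epbt = embodied_kwh / annual_yield_kwh
net_kwh = annual_yield_kwh * lifetime_years - embodied_kwh
print(f"EPBT ~ {epbt:.1f} years; "
      f"lifetime CO2 mitigation ~ {net_kwh * co2_per_kwh / 1000:.1f} t")
```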

  19. A hardware acceleration based on high-level synthesis approach for glucose-insulin analysis

    Science.gov (United States)

    Daud, Nur Atikah Mohd; Mahmud, Farhanahani; Jabbar, Muhamad Hairol

    2017-01-01

    In this paper, the research focuses on Type 1 Diabetes Mellitus (T1DM). Since this disease requires close attention to the blood glucose concentration, managed with insulin injections, it is important to have a tool able to predict that level when a certain amount of carbohydrate is consumed at meal time. To make this realizable, the Hovorka model, which targets T1DM, is chosen in this research. A high-level language, C++, is used to construct the mathematical description of the Hovorka model. This code is then converted into an intellectual property (IP) core, also known as a hardware accelerator, using a high-level synthesis (HLS) approach, which improves the design and performance of the glucose-insulin analysis tool, as explained further in this paper. This is the first step in this research before implementing the design on a system-on-chip (SoC) to achieve a high-performance system for the glucose-insulin analysis tool.
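
    To illustrate the kind of computation such an accelerator implements, here is a deliberately simplified glucose-insulin simulation using a Bergman-style minimal model rather than the full Hovorka model; all parameter values are illustrative only.

```python
# Simplified glucose-insulin dynamics (Bergman-style minimal model, NOT
# the full Hovorka model used in the paper). Parameters are invented.
import numpy as np
from scipy.integrate import solve_ivp

def minimal_model(t, y, p1=0.03, p2=0.02, p3=1e-5, gb=4.5, ib=15.0):
    g, x = y                          # glucose (mmol/L), remote insulin action
    i_t = ib + 40.0 * np.exp(-t / 30.0)      # decaying insulin bolus (mU/L)
    dg = -(p1 + x) * g + p1 * gb             # glucose kinetics
    dx = -p2 * x + p3 * (i_t - ib)           # insulin action kinetics
    return [dg, dx]

# simulate 4 hours starting from elevated post-meal glucose
sol = solve_ivp(minimal_model, (0, 240), [9.0, 0.0], max_step=1.0)
print(f"glucose after 4 h: {sol.y[0, -1]:.2f} mmol/L")
```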

  20. Water security at local government level in South Africa: a qualitative interview-based analysis

    Directory of Open Access Journals (Sweden)

    Richard Meissner, DPhil

    2018-05-01

    Full Text Available Background: As one of the 40 driest countries in the world, with an annual average rainfall of 497 mm, South Africa is a water-scarce country. Additionally, South Africa's rate of economic development is closely linked to its water security: increasing water stress, supply variability, flooding, and water pollution levels, together with inadequate access to safe drinking water and sanitation, are slowing economic growth. Despite the high premium placed on South Africa's water resources, no commonly shared understanding of water security exists. The aim of this study was to research, using qualitative social scientific methods, how people in two South African localities understand water security. Methods: We used interviews and qualitative analyses to establish and compare how people from different lifestyles perceive water security in the Greater Sekhukhune District and eThekwini Metropolitan Municipalities of South Africa. The inland Sekhukhune has a drier climate and a more rural socioeconomic profile than the coastal, urbanised eThekwini, with its complex economy and diverse socioeconomic structure. We conducted face-to-face structured interviews with a diverse stakeholder group consisting of community members, traditional leaders, municipal officials, researchers, business people, and farmers in each municipality, and focus groups in two communities of each municipality: Leeuwfontein and Motetema (Sekhukhune) and Inanda and Ntshongweni (eThekwini). Each interview lasted 40–60 min, and focus group discussions lasted 90–120 min. We asked the respondents about their understanding of the concept of water security and whether they believe that, at the local and national level, the authorities had achieved water security for all. Findings: Following a qualitative analysis, we found that water security is a state of mind based on context-specific (i.e., localised and individualised) perceptions held by an individual of water-related threats and how they influence…

  1. Pixel-level multisensor image fusion based on matrix completion and robust principal component analysis

    Science.gov (United States)

    Wang, Zhuozheng; Deller, J. R.; Fleet, Blair D.

    2016-01-01

    Acquired digital images are often corrupted by a lack of camera focus, faulty illumination, or missing data. An algorithm is presented for fusion of multiple corrupted images of a scene using the lifting wavelet transform. The method employs adaptive fusion arithmetic based on matrix completion and self-adaptive regional variance estimation. Characteristics of the wavelet coefficients are used to adaptively select fusion rules. Robust principal component analysis is applied to low-frequency image components, and regional variance estimation is applied to high-frequency components. Experiments reveal that the method is effective for multifocus, visible-light, and infrared image fusion. Compared with traditional algorithms, the new algorithm not only increases the amount of preserved information and clarity but also improves robustness.
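
    A minimal transform-fuse-invert skeleton of wavelet-domain fusion follows (a standard DWT stands in for the paper's lifting wavelet transform, and the matrix completion, robust PCA and regional variance steps are omitted); it assumes PyWavelets is available.

```python
# Wavelet-domain image fusion skeleton: average the approximation band,
# keep the larger-magnitude detail coefficients. A sketch only; the
# paper's RPCA and regional-variance rules are not reproduced.
import numpy as np
import pywt

def fuse(img_a, img_b, wavelet="haar", level=2):
    ca = pywt.wavedec2(img_a, wavelet, level=level)
    cb = pywt.wavedec2(img_b, wavelet, level=level)
    fused = [(ca[0] + cb[0]) / 2.0]              # average approximation band
    for da, db in zip(ca[1:], cb[1:]):           # per-level detail triples
        fused.append(tuple(np.where(np.abs(a) >= np.abs(b), a, b)
                           for a, b in zip(da, db)))
    return pywt.waverec2(fused, wavelet)

rng = np.random.default_rng(4)
left = rng.random((128, 128)); left[:, 64:] = 0.5    # toy "in focus" halves
right = rng.random((128, 128)); right[:, :64] = 0.5
print(fuse(left, right).shape)
```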

  2. A systematic fault tree analysis based on multi-level flow modeling

    International Nuclear Information System (INIS)

    Gofuku, Akio; Ohara, Ai

    2010-01-01

    Fault tree analysis (FTA) is widely applied for the safety evaluation of large-scale, mission-critical systems. Because the power of FTA strongly depends on the skill of the analyst, however, problems are pointed out in (1) education and training, (2) unreliable quality, (3) the necessity of expert knowledge, and (4) updating FTA results after the reconstruction of a target system. To address these problems, many techniques that systematize FTA activities by applying computer technologies have been proposed. However, these techniques only use structural information about a target system and do not use functional information, which is one of the important properties of an artifact. The principle of FTA is to trace cause-effect relations comprehensively from a top undesirable effect down to anomaly causes. This tracing is similar to the causality estimation technique that the authors previously proposed to find plausible counter-actions to prevent or mitigate undesirable plant behavior, based on models built with the functional modeling technique Multilevel Flow Modeling (MFM). The authors have extended this systematic technique to construct fault trees (FTs). This paper presents an algorithm for the systematic construction of FTs based on MFM models and demonstrates the applicability of the extended technique by constructing the FT of a nitric acid cooling plant. (author)
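
    The FTA principle of comprehensively tracing cause-effect relations from a top event can be illustrated with a toy causal model (a plain dictionary here, standing in for an MFM model; the event names are invented):

```python
# Toy illustration of the FTA principle: from a top undesirable event,
# recursively trace cause-effect relations back to basic causes.
causes = {                      # effect -> direct causes (hypothetical plant)
    "loss_of_cooling": ["low_coolant_flow", "high_coolant_temperature"],
    "low_coolant_flow": ["pump_failure", "valve_stuck_closed"],
    "high_coolant_temperature": ["heat_exchanger_fouling"],
}

def print_fault_tree(event, depth=0):
    """Print an indented fault tree rooted at the given top event."""
    print("  " * depth + event)
    for cause in causes.get(event, []):   # events without causes are basic
        print_fault_tree(cause, depth + 1)

print_fault_tree("loss_of_cooling")
```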

  3. Designing Dietary Recommendations Using System Level Interactomics Analysis and Network-Based Inference

    Directory of Open Access Journals (Sweden)

    Tingting Zheng

    2017-09-01

    Full Text Available Background: A range of computational methods that rely on the analysis of genome-wide expression datasets have been developed and successfully used for drug repositioning. The success of these methods is based on the hypothesis that introducing a factor (in this case, a drug molecule) that could reverse the disease gene expression signature will lead to a therapeutic effect. However, it has also been shown that globally reversing the disease expression signature is not a prerequisite for drug activity. On the other hand, the basic idea of significant anti-correlation in expression profiles could have great value for establishing diet-disease associations and could provide new insights into the role of dietary interventions in disease. Methods: We performed an integrated analysis of publicly available gene expression profiles for foods, diseases and drugs, by calculating pairwise similarity scores for diet and disease gene expression signatures and characterizing their topological features in protein-protein interaction networks. Results: We identified 485 diet-disease pairs where diet could positively influence disease development and 472 pairs where specific diets should be avoided in a disease state. Multiple lines of evidence suggest that orange, whey and coconut fat could be beneficial for psoriasis, lung adenocarcinoma and macular degeneration, respectively. On the other hand, a fructose-rich diet should be restricted in patients with chronic intermittent hypoxia and ovarian cancer. Since humans normally do not consume foods in isolation, we also applied different algorithms to predict synergism; as a result, 58 food pairs were predicted. Interestingly, the diets identified as anti-correlated with diseases showed a topological proximity to the disease proteins similar to that of the corresponding drugs. Conclusions: In conclusion, we provide a computational framework for establishing diet-disease associations and additional information on the role of…
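
    The core signature-matching idea can be sketched by scoring food-disease pairs by the (anti-)correlation of their expression signatures; the data below are random stand-ins for the curated public profiles used in the study, and the network proximity step is omitted.

```python
# Score food-disease pairs by (anti-)correlation of expression
# signatures. Signatures here are random stand-ins; the threshold
# of -0.3 is an arbitrary illustrative choice.
import numpy as np

rng = np.random.default_rng(5)
genes = 500
disease_sig = rng.normal(size=genes)                       # logFC per gene
food_sigs = {"orange": -0.6 * disease_sig + rng.normal(scale=0.5, size=genes),
             "whey": rng.normal(size=genes)}               # hypothetical foods

for food, sig in food_sigs.items():
    r = np.corrcoef(disease_sig, sig)[0, 1]
    verdict = "candidate (anti-correlated)" if r < -0.3 else "no signal"
    print(f"{food}: r = {r:+.2f} -> {verdict}")
```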

  4. Opinion mining feature-level using Naive Bayes and feature extraction based analysis dependencies

    Science.gov (United States)

    Sanda, Regi; Baizal, Z. K. Abdurahman; Nhita, Fhira

    2015-12-01

    The development of the internet and related technology has had a major impact, giving rise to a new kind of business called e-commerce. Many e-commerce sites provide convenient transactions, and consumers can also post reviews or opinions on the products they purchase. These opinions can be used by both consumers and producers: consumers learn the advantages and disadvantages of particular product features, while producers can analyse their own strengths and weaknesses as well as those of competitors' products. With so many opinions, a method is needed that lets the reader grasp the gist of them as a whole. The idea emerged from review summarization, which summarizes the overall opinion based on the sentiment and features the reviews contain. In this study, the main focus is the digital camera domain. The research consisted of four steps: (1) giving the system the knowledge to recognize the semantic orientation of an opinion; (2) identifying product features; (3) identifying whether an opinion is positive or negative; (4) summarizing the results. The methods discussed include Naïve Bayes for sentiment classification; a feature extraction algorithm based on dependency analysis, one of the tools of Natural Language Processing (NLP); and a knowledge-based dictionary, which is useful for handling implicit features. The end result is a summary of consumer reviews organized by feature and sentiment. With the proposed method, sentiment classification accuracy reached 81.2% on positive test data and 80.2% on negative test data, and feature extraction accuracy reached 90.3%.
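
    A minimal sketch of the sentiment-classification step using scikit-learn's multinomial Naive Bayes; the dependency-based feature extraction and the knowledge-based dictionary for implicit features are not reproduced, and the toy reviews are invented.

```python
# Bag-of-words Naive Bayes sentiment classifier for product reviews
# (toy data; only the classification step of the paper's pipeline).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

reviews = ["the lens is sharp and the battery lasts long",
           "battery died quickly, very disappointing",
           "excellent zoom, great picture quality",
           "blurry pictures and poor build quality"]
labels = ["pos", "neg", "pos", "neg"]

clf = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
clf.fit(reviews, labels)
print(clf.predict(["the battery is great but pictures are blurry"]))
```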

  5. Global cost analysis on adaptation to sea level rise based on RCP/SSP scenarios

    Science.gov (United States)

    Kumano, N.; Tamura, M.; Yotsukuri, M.; Kuwahara, Y.; Yokoki, H.

    2017-12-01

    Low-lying areas are the most vulnerable to sea level rise (SLR) due to climate change. In order to adapt to SLR, it is necessary to decide whether to retreat from vulnerable areas or to install dykes to protect them from inundation. Cost analysis of adaptation using coastal dykes is therefore one of the most essential issues in the context of climate change and its countermeasures. However, few studies have globally evaluated the future costs of adaptation in coastal areas. This study attempts such a global analysis. First, the global distribution of projected inundation impacts induced by SLR, including astronomical high tide, was assessed. Economic damage was estimated on the basis of the econometric relationship between past hydrological disasters, affected population, and per capita GDP, using CRED's EM-DAT database. Second, the cost of adaptation was determined using a cost database and future scenarios. The authors built a database of the costs of coastal dykes installed worldwide and applied it to estimating the future cost of adaptation. The unit cost of dyke construction is assumed to increase with socio-economic development (e.g., per capita GDP) under the Shared Socioeconomic Pathway (SSP) scenarios. The length of vulnerable coastline is calculated by identifying inundation areas using ETOPO1. Future cost was obtained by multiplying the length of vulnerable coastline by the unit cost of dyke construction. Third, the effectiveness of dyke construction was estimated by comparing cases with and without adaptation. As a result, it was found that the incremental adaptation cost is lower than the economic damage in the SSP1 and SSP3 cases under the RCP scenarios, while the cost of adaptation depends on the durability of the coastal dykes.
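
    A worked example of the cost-benefit comparison, adaptation cost = vulnerable coastline length x unit dyke cost, set against avoided damage; all figures are invented for illustration, whereas the study derives them from ETOPO1, EM-DAT and SSP data.

```python
# Adaptation cost vs. avoided damage, with invented illustrative figures.
vulnerable_km = 120.0          # assumed length of low-lying coastline
unit_cost_musd_per_km = 5.0    # assumed dyke construction cost (million USD/km)
damage_musd = 1500.0           # assumed inundation damage without adaptation

adaptation_cost = vulnerable_km * unit_cost_musd_per_km
print(f"adaptation cost {adaptation_cost:.0f} MUSD vs damage {damage_musd:.0f} MUSD")
print("protect with dykes" if adaptation_cost < damage_musd else "retreat")
```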

  6. Meta-analysis review of fish trophic level at marine protected areas based on stable isotopes data

    Directory of Open Access Journals (Sweden)

    J. J. de LOPE ARIAS

    2016-04-01

    Full Text Available Stable isotopes (δ¹⁵N) are used to determine trophic level in marine food webs. We assessed whether Marine Protected Areas (MPAs) affect the trophic level of fishes, based on stable isotope data from the Western Mediterranean. A total of 22 studies including 600 observations were found, and the final dataset consisted of 11 fish species and 146 observations comparing trophic level inside and outside MPAs. The database was analysed by meta-analysis, with the level of protection (inside vs. outside MPAs) as covariate. The results indicate a significant difference between trophic levels inside and outside MPAs. However, the results differ from expectations, since the trophic level inside MPAs was lower than outside. Three habitats were analysed (coastal lagoons, demersal and littoral) and significant differences were found among them: trophic level was higher in demersal habitats than in coastal lagoons and littoral areas. No significant differences were found when species were classified by trophic functional group. We consider several hypotheses explaining these results, linked to the protection level of the MPAs, time since protection and MPA size, and we discuss the suitability of using the stable isotope (δ¹⁵N) as a direct indicator of trophic level when evaluating MPA effects on food webs.

  7. QAPgrid: a two level QAP-based approach for large-scale data analysis and visualization.

    Directory of Open Access Journals (Sweden)

    Mario Inostroza-Ponta

    Full Text Available BACKGROUND: The visualization of large volumes of data is a computationally challenging task that often promises rewarding new insights. There is great potential in the application of new algorithms and models from combinatorial optimisation. Datasets often contain "hidden regularities", and a combined identification and visualization method should reveal these structures and present them in a way that helps analysis. While several methodologies exist, including those that use non-linear optimization algorithms, severe limitations exist even when working with only a few hundred objects. METHODOLOGY/PRINCIPAL FINDINGS: We present a new data visualization approach (QAPgrid) that reveals patterns of similarities and differences in large datasets of objects for which a similarity measure can be computed. Objects are assigned to positions on an underlying square grid in a two-dimensional space. We use the Quadratic Assignment Problem (QAP) as a mathematical model to provide an objective function for the assignment of objects to positions on the grid. We employ a Memetic Algorithm (a powerful metaheuristic) to tackle large instances of this NP-hard combinatorial optimization problem, and we show its performance on the visualization of real data sets. CONCLUSIONS/SIGNIFICANCE: Overall, the results show that the QAPgrid algorithm is able to produce a layout that represents the relationships between objects in the data set. Furthermore, it also represents the relationships between clusters that are fed into the algorithm. We apply QAPgrid to the 84 Indo-European languages instance, producing a near-optimal layout. Next, we produce a layout of 470 world universities with a high observed degree of correlation with the score used by the Shanghai Jiao Tong University Academic Ranking of World Universities, without the need for an ad hoc weighting of attributes. Finally, our Gene Ontology-based study on…

  8. Toward Estimating Wetland Water Level Changes Based on Hydrological Sensitivity Analysis of PALSAR Backscattering Coefficients over Different Vegetation Fields

    Directory of Open Access Journals (Sweden)

    Ting Yuan

    2015-03-01

    Full Text Available Synthetic Aperture Radar (SAR) has been successfully used to map wetland inundation extents and types of vegetation, based on the fact that the SAR backscatter signal from a wetland is mainly controlled by the wetland vegetation type and water level changes. This study describes the relation between the L-band PALSAR backscattering coefficient (σ0) and seasonal water level changes obtained from Envisat altimetry over the island of Île Mbamou in the Congo Basin, where two distinctly different vegetation types are found. We found positive correlations between σ0 and water level changes over the forested southern Île Mbamou, whereas both positive and negative correlations were observed over the non-forested northern Île Mbamou depending on the amount of water level increase. Based on the sensitivity analysis, we found that a denser vegetation canopy leads to less sensitive σ0 variation with respect to the water level changes, regardless of forested or non-forested canopy. Furthermore, we attempted to estimate water level changes, which were then compared with the Envisat altimetry and InSAR results. Our results demonstrate a potential to generate two-dimensional maps of water level changes over wetlands, and thus may have substantial synergy with the planned Surface Water and Ocean Topography (SWOT) mission.

  9. Development of an alarm analysis system based on multi-level flow models for nuclear power plant

    International Nuclear Information System (INIS)

    Zhang Jiande; Yang Ming; Zhang Zhijian

    2008-01-01

    An alarm analysis system based on Multilevel Flow Models (MFM) was developed for a PWR NPP. By automatically identifying the primary root causes in complex fault situations, the workload of the operators can be reduced. In addition, because MFM provides a set of graphical symbols that imply causalities, operators can confirm diagnosis results by semiotic analysis; hence the understandability of the alarm analysis process, as well as the reliability of maintenance tasks, can be increased. 19 cases of simulation data from the RELAP5/MOD2 code were used to evaluate the performance of the proposed system. The simulation results show that the proposed alarm analysis system can detect and diagnose accidents early, before reactor trip. (authors)

  10. Phosphotyrosine-based-phosphoproteomics scaled-down to biopsy level for analysis of individual tumor biology and treatment selection.

    Science.gov (United States)

    Labots, Mariette; van der Mijn, Johannes C; Beekhof, Robin; Piersma, Sander R; de Goeij-de Haas, Richard R; Pham, Thang V; Knol, Jaco C; Dekker, Henk; van Grieken, Nicole C T; Verheul, Henk M W; Jiménez, Connie R

    2017-06-06

    Mass spectrometry-based phosphoproteomics of cancer cell and tissue lysates provides insight into aberrantly activated signaling pathways and potential drug targets. For improved understanding of an individual patient's tumor biology and to allow selection of tyrosine kinase inhibitors in individual patients, phosphoproteomics of small clinical samples should be feasible and reproducible. We aimed to scale down a pTyr-phosphopeptide enrichment protocol to biopsy-level protein input and assess reproducibility and applicability to tumor needle biopsies. To this end, phosphopeptide immunoprecipitation using anti-phosphotyrosine beads was performed using 10, 5 and 1 mg protein input from lysates of the colorectal cancer (CRC) cell line HCT116. Multiple needle biopsies from 7 human CRC resection specimens were analyzed at the 1 mg level. The total number of phosphopeptides captured and detected by LC-MS/MS ranged from 681 at 10 mg input to 471 at 1 mg HCT116 protein. ID reproducibility ranged from 60.5% at 10 mg to 43.9% at 1 mg. Per 1 mg-level biopsy sample, >200 phosphopeptides were identified, with 57% ID reproducibility between paired tumor biopsies. Unsupervised analysis clustered biopsies from individual patients together and revealed known and potential therapeutic targets. This study demonstrates the feasibility of label-free pTyr-phosphoproteomics at the tumor biopsy level based on reproducible analyses using 1 mg of protein input. The considerable number of identified phosphopeptides at this level is attributed to an effective down-scaled immuno-affinity protocol as well as to the application of ID propagation in the data processing and analysis steps. Unsupervised cluster analysis reveals patient-specific profiles. Together, these findings pave the way for clinical trials in which pTyr-phosphoproteomics will be performed on pre- and on-treatment biopsies. Such studies will improve our understanding of individual tumor biology and may enable future pTyr-phosphoproteomics-based…

  11. Quantitative analysis of the level of readability of online emergency radiology-based patient education resources.

    Science.gov (United States)

    Hansberry, David R; D'Angelo, Michael; White, Michael D; Prabhu, Arpan V; Cox, Mougnyan; Agarwal, Nitin; Deshmukh, Sandeep

    2018-04-01

    The vast amount of information found on the internet, combined with its accessibility, makes it a widely utilized resource for Americans seeking medical information. The field of radiology is no exception. In this paper, we assess the readability level of websites pertaining specifically to emergency radiology. Using Google, 23 terms were searched and the top 10 results recorded. Each link was evaluated for its readability level using a set of ten reputable readability scales. The search terms included the following: abdominal ultrasound, abdominal aortic aneurysm, aortic dissection, appendicitis, cord compression, CT abdomen, cholecystitis, CT chest, diverticulitis, ectopic pregnancy, epidural hematoma, dural venous thrombosis, head CT, MRI brain, MR angiography, MRI spine, ovarian torsion, pancreatitis, pelvic ultrasound, pneumoperitoneum, pulmonary embolism, subarachnoid hemorrhage, and subdural hematoma. Any content that was not written for patients was excluded. The 230 articles assessed were written, on average, at a 12.1 grade level. Only 2 of the 230 articles (1%) were written at the third- to seventh-grade reading level recommended by the National Institutes of Health (NIH) and American Medical Association (AMA). Fifty-two percent of the 230 articles were written so as to require a minimum of a high school education (at least a 12th grade level). Additionally, 17 of the 230 articles (7.3%) were written at a level that exceeded an undergraduate education (at least a 16th grade level). The majority of websites with emergency radiology-related patient education materials do not adhere to the NIH and AMA's recommended reading levels, and it is likely that the average reader does not benefit fully from these information outlets. Given the link between health literacy and poor health outcomes, it is important to address online content in this area of radiology, allowing patients to benefit more fully.
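
    One of the scales such studies rely on, the Flesch-Kincaid grade level, can be computed with the textstat Python package (the paper used a panel of ten scales; the sample sentence below is invented):

```python
# Flesch-Kincaid grade level of a sample patient-education sentence,
# using the textstat package. The sample text is invented.
import textstat

text = ("Computed tomography of the abdomen uses ionizing radiation to "
        "produce cross-sectional images that help detect appendicitis.")
grade = textstat.flesch_kincaid_grade(text)
print(f"Flesch-Kincaid grade: {grade:.1f}")
print("meets NIH/AMA target" if grade <= 7 else "above recommended level")
```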

  12. Analysis of community-level mesocosm data based on ecologically meaningful dissimilarity measures and data transformation

    NARCIS (Netherlands)

    Tebby, Cleo; Joachim, Sandrine; Brink, Van den Paul J.; Porcher, Jean Marc; Beaudouin, Rémy

    2017-01-01

    The principal response curve (PRC) method is a constrained ordination method developed specifically for the analysis of community data collected in mesocosm experiments, which provides easily understood summaries and graphical representations of community response to stress. It is a redundancy analysis…

  13. An enhanced unified uncertainty analysis approach based on first order reliability method with single-level optimization

    International Nuclear Information System (INIS)

    Yao, Wen; Chen, Xiaoqian; Huang, Yiyong; Tooren, Michel van

    2013-01-01

    In engineering, there exist both aleatory uncertainties, due to the inherent variation of the physical system and its operational environment, and epistemic uncertainties, due to lack of knowledge, which can be reduced by collecting more data. To analyze the uncertain distribution of system performance under both aleatory and epistemic uncertainties, combined probability and evidence theory can be employed to quantify the compound effects of the mixed uncertainties. The existing First Order Reliability Method (FORM) based Unified Uncertainty Analysis (UUA) approach nests the optimization-based interval analysis inside the improved Hasofer–Lind–Rackwitz–Fiessler (iHLRF) algorithm based Most Probable Point (MPP) search, which is computationally prohibitive for complex systems and may encounter convergence problems as well. Therefore, this paper proposes to use general optimization solvers to search for the MPP in the outer loop and then reformulate the double-loop optimization problem into an equivalent single-level optimization (SLO) problem, so as to simplify the uncertainty analysis process, improve the robustness of the algorithm, and alleviate the computational complexity. The effectiveness and efficiency of the proposed method are demonstrated with two numerical examples and one practical satellite conceptual design problem. -- Highlights: ► Uncertainty analysis under mixed aleatory and epistemic uncertainties is studied. ► A unified uncertainty analysis method is proposed with combined probability and evidence theory. ► The traditional nested analysis method is converted to single-level optimization for efficiency. ► The effectiveness and efficiency of the proposed method are demonstrated with three examples.
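
    For context, the classical FORM MPP search that the paper's approach replaces can be sketched with the basic HLRF fixed-point iteration on a toy limit state in standard normal space (the single-level reformulation itself is not reproduced here, and the limit state below is invented):

```python
# Toy FORM Most Probable Point search with the basic HLRF iteration in
# standard normal space; the limit state g is an invented example.
import numpy as np

def g(u):                        # failure when g(u) < 0
    return 3.0 - u[0] - u[1] + 0.1 * u[0] * u[1]

def grad_g(u):
    return np.array([-1.0 + 0.1 * u[1], -1.0 + 0.1 * u[0]])

u = np.zeros(2)
for _ in range(100):             # HLRF fixed-point update toward the MPP
    grad = grad_g(u)
    u_next = grad * (grad @ u - g(u)) / (grad @ grad)
    if np.linalg.norm(u_next - u) < 1e-10:
        u = u_next
        break
    u = u_next

print(f"MPP = {u}, reliability index beta = {np.linalg.norm(u):.3f}")
```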

  14. Analysis of Students' Online Learning Readiness Based on Their Emotional Intelligence Level

    Science.gov (United States)

    Engin, Melih

    2017-01-01

    The objective of the present study is to determine whether there is a significant relationship between students' readiness for online learning and their emotional intelligence levels. A correlational research method was used. The Online Learning Readiness Scale developed by Hung et al. (2010) was used, along with the Trait Emotional…

  15. Wavenumber selection based analysis in Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels (Conference Presentation)

    Science.gov (United States)

    Zhao, Jianhua; Zeng, Haishan; Kalia, Sunil; Lui, Harvey

    2017-02-01

    Background: Raman spectroscopy is a non-invasive optical technique that can measure molecular vibrational modes within tissue. A large-scale clinical study (n = 518) demonstrated that real-time Raman spectroscopy could distinguish malignant from benign skin lesions with good diagnostic accuracy; this was validated by a follow-up independent study (n = 127). Objective: Most previous diagnostic algorithms have typically been based on analyzing the full band of the Raman spectra, in either the fingerprint or high-wavenumber regions. Our objective in this presentation is to explore wavenumber-selection-based analysis of Raman spectroscopy for skin cancer diagnosis. Methods: A wavenumber selection algorithm was implemented using variably sized wavenumber windows, determined by the correlation coefficient between wavenumbers. Wavenumber windows were chosen based on accumulated frequency from leave-one-out cross-validated stepwise regression or the least absolute shrinkage and selection operator (LASSO). The diagnostic algorithms were then generated from the selected wavenumber windows using multivariate statistical analyses, including principal component and general discriminant analysis (PC-GDA) and partial least squares (PLS). A total cohort of 645 confirmed lesions from 573 patients, encompassing skin cancers, precancers and benign skin lesions, was included. Lesion measurements were divided into a training cohort (n = 518) and a testing cohort (n = 127) according to measurement time. Results: The area under the receiver operating characteristic (ROC) curve improved from 0.861–0.891 to 0.891–0.911, and the diagnostic specificity for sensitivity levels of 0.99–0.90 increased respectively from 0.17–0.65 to 0.20–0.75 by selecting specific wavenumber windows for analysis. Conclusion: Wavenumber-selection-based analysis of Raman spectroscopy improves skin cancer diagnostic specificity at high sensitivity levels.

  16. Space elevator systems level analysis

    Energy Technology Data Exchange (ETDEWEB)

    Laubscher, B. E. (Bryan E.)

    2004-01-01

    The Space Elevator (SE) represents a major paradigm shift in space access. It involves new, untried technologies in most of its subsystems. Thus the successful construction of the SE requires a significant amount of development. This in turn implies a high level of risk for the SE. This paper will present a systems level analysis of the SE by subdividing its components into their subsystems to determine their level of technological maturity. A rational way to manage such a high-risk endeavor is to follow a disciplined approach to the challenges. A systems level analysis informs this process and is the guide to where resources should be applied in the development processes. It is an efficient path that, if followed, minimizes the overall risk of the system's development. One key aspect of a systems level analysis is that the overall system is divided naturally into its subsystems, and those subsystems are further subdivided as appropriate for the analysis. By dealing with the complex system in layers, the parameter space of decisions is kept manageable. Moreover, resources are not expended capriciously; rather, resources are put toward the biggest challenges and most promising solutions. This overall graded approach is a proven road to success. The analysis includes topics such as nanotube technology, deployment scenario, power beaming technology, ground-based hardware and operations, ribbon maintenance and repair, and climber technology.

  17. An Analysis of Organizational Performance Based on Hospital Specialization Level and Strategy Type.

    Science.gov (United States)

    Kim, Han-Sung; Kim, Young-Hoon; Woo, Jung-Sik; Hyun, Sook-Jung

    2015-01-01

    Hospitals are studying the focused factory concept and attempting to increase their power in a competitive industry by becoming more specialized. This study uses the information theory index (ITI) and the Herfindahl-Hirschman index (HHI) to analyze the extent of specialization by Korean hospitals that receive national health insurance reimbursements. Hierarchical regression analysis is used to assess the impact of hospital specialization on the following four aspects of operational performance: productivity, profitability, efficiency and quality of care. The results show that a focused strategy (high HHI) improves the income and adjusted number of patients per specialist through the efficient utilization of human resources. However, a diversified strategy (high ITI) improves the hospital utilization ratio, income per bed and adjusted number of patients per bed (controlling for material resources such as beds). In addition, as the concentration index increases, case-mix mortality rates and referral rates decrease, indicating that specialization has a positive relationship with quality of care.

  18. An Analysis of Organizational Performance Based on Hospital Specialization Level and Strategy Type.

    Directory of Open Access Journals (Sweden)

    Han-Sung Kim

    Full Text Available Hospitals are studying the focused factory concept and attempting to increase their power in a competitive industry by becoming more specialized. This study uses the information theory index (ITI) and the Herfindahl-Hirschman index (HHI) to analyze the extent of specialization by Korean hospitals that receive national health insurance reimbursements. Hierarchical regression analysis is used to assess the impact of hospital specialization on the following four aspects of operational performance: productivity, profitability, efficiency and quality of care. The results show that a focused strategy (high HHI) improves the income and adjusted number of patients per specialist through the efficient utilization of human resources. However, a diversified strategy (high ITI) improves the hospital utilization ratio, income per bed and adjusted number of patients per bed (controlling for material resources such as beds). In addition, as the concentration index increases, case-mix mortality rates and referral rates decrease, indicating that specialization has a positive relationship with quality of care.
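
    A worked example of the two concentration measures used in this pair of records, computed from invented service-line case counts: the HHI is the sum of squared shares, and the ITI is the Shannon entropy of the case mix.

```python
# HHI and ITI of a hospital's case mix. The case counts are invented.
import math

cases = {"cardiology": 420, "oncology": 310, "orthopedics": 180, "obstetrics": 90}
total = sum(cases.values())
shares = [n / total for n in cases.values()]

hhi = sum(s ** 2 for s in shares)                 # higher = more focused
iti = -sum(s * math.log(s) for s in shares)       # higher = more diversified
print(f"HHI = {hhi:.3f}, ITI = {iti:.3f}")
```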

  19. Analysis of Student and School Level Variables Related to Mathematics Self-Efficacy Level Based on PISA 2012 Results for China-Shanghai, Turkey, and Greece

    Science.gov (United States)

    Usta, H. Gonca

    2016-01-01

    This study aims to analyze the student and school level variables that affect students' self-efficacy levels in mathematics in China-Shanghai, Turkey, and Greece based on PISA 2012 results. In line with this purpose, the hierarchical linear regression model (HLM) was employed. The interschool variability is estimated at approximately 17% in…

  20. Analysis of students’ creative thinking level in problem solving based on national council of teachers of mathematics

    Science.gov (United States)

    Hobri; Suharto; Rifqi Naja, Ahmad

    2018-04-01

    This research aims to determine students' creative thinking level in problem solving based on NCTM in the topic of functions. The research type is descriptive with a qualitative approach. Data were collected through tests and interviews. The creative thinking level in problem solving based on NCTM indicators consists of: (1) make a mathematical model from a contextual problem and solve the problem, (2) solve the problem using various possible alternatives, (3) find new alternative(s) to solve the problem, (4) determine the most efficient and effective alternative for that problem, (5) review and correct mistake(s) in the process of problem solving. The results showed that 10 students were categorized at the very satisfying level, 23 students at the satisfying level and 1 student at the less satisfying level. Students at the very satisfying level met all indicators, students at the satisfying level met the first, second, fourth, and fifth indicators, while students at the less satisfying level met only the first and fifth indicators.

  1. MOVES regional level sensitivity analysis

    Science.gov (United States)

    2012-01-01

    The MOVES Regional Level Sensitivity Analysis was conducted to increase understanding of the operations of the MOVES Model in regional emissions analysis and to highlight the following: : the relative sensitivity of selected MOVES Model input paramet...

  2. Different Somatic Hypermutation Levels among Antibody Subclasses Disclosed by a New Next-Generation Sequencing-Based Antibody Repertoire Analysis

    Directory of Open Access Journals (Sweden)

    Kazutaka Kitaura

    2017-05-01

    Full Text Available A diverse antibody repertoire is primarily generated by the rearrangement of V, D, and J genes and subsequent somatic hypermutation (SHM). Class-switch recombination (CSR) produces various isotypes and subclasses with different functional properties. Although antibody isotypes and subclasses are considered to be produced by both direct and sequential CSR, it is still not fully understood how SHMs accumulate during the process in which antibody subclasses are generated. Here, we developed a new next-generation sequencing (NGS)-based antibody repertoire analysis capable of identifying all antibody isotype and subclass genes and used it to examine the peripheral blood mononuclear cells of 12 healthy individuals. Using a total of 5,480,040 sequences, we compared percentage frequency of variable (V) and junctional (J) sequences and combinations of V and J, diversity, length, and amino acid compositions of CDR3, SHM, and shared clones in the IgM, IgD, IgG3, IgG1, IgG2, IgG4, IgA1, IgE, and IgA2 genes. The usage and diversity were similar among the immunoglobulin (Ig) subclasses. Clonally related sequences sharing identical V, D, J, and CDR3 amino acid sequences were frequently found within multiple Ig subclasses, especially between IgG1 and IgG2 or IgA1 and IgA2. SHM occurred most frequently in IgG4, while IgG3 genes were the least mutated among all IgG subclasses. The shared clones had almost the same SHM levels among Ig subclasses, while subclass-specific clones had different levels of SHM dependent on the genomic location. Given the sequential CSR, these results suggest that CSR occurs sequentially over multiple subclasses in the order corresponding to the genomic location of IGHCs, but CSR is likely to occur more quickly than SHMs accumulate within Ig genes under physiological conditions. NGS-based antibody repertoire analysis should provide critical information on how various antibodies are generated in the immune system.

  3. Organizational Analysis and Career Projections Based on a Level-of-Responsibility/Equitable Payment Model. Technical Report.

    Science.gov (United States)

    Laner, Stephen; And Others

    Following an explanation of the Level of Responsibility/Equitable Pay Function, its applicability to the analysis and to the design and redesign of organizational hierarchies is demonstrated. It is shown how certain common dysfunctional anomalies can be avoided by structuring an organization along the principles outlined. A technique is then…

  4. Pre and Post Test Evaluations of Students in the Needs-Analysis Based EAP Course at Undergraduate Level

    Science.gov (United States)

    Nafissi, Zohreh; Rezaeipanah, Fariba; Monsefi, Roya

    2017-01-01

    Iran's education system is exam-based and to gain admission to universities at undergraduate, graduate, and postgraduate levels, candidates have to sit a competitive examination. For this reason, developing an EAP course which prepares the candidates for these examinations is of crucial importance. The present study attempted to develop an EAP…

  5. Analysis on the International Trends in Safe Management of Very Low Level Waste Based upon Graded Approach and Their Implications

    International Nuclear Information System (INIS)

    Cheong, Jae Hak

    2011-01-01

    Recently, the International Atomic Energy Agency and major leading countries in radioactive waste management have tended to subdivide the categories of radioactive waste based upon a risk-graded approach. In this context, the category of very low level waste has been newly introduced, or optimized management options for this kind of waste have been pursued, in many countries. The application of engineered surface landfill type facilities dedicated to the disposal of very low level waste has gradually expanded, and their design concept of isolation is considerably more advanced than that of the old-fashioned surface trench-type disposal facilities for low and intermediate level waste, which were usually constructed in the 1960s. In addition, the management options for very low level waste in major leading countries vary depending upon, and interface with, factors such as: the national framework for clearance, the legal and practical availability of a low and intermediate level waste repository and/or a non-nuclear waste landfill, public acceptance of alternative waste management options, and so forth. In this regard, it was concluded that optimized long-term management options for very low level waste in Korea should also be established in a timely manner through comprehensive review and discussion, in preparation for the decommissioning of large nuclear facilities in the future, and be implemented in a systematic manner under the framework of the national policy and management plan for radioactive waste management

  6. Logical Entity Level Sentiment Analysis

    DEFF Research Database (Denmark)

    Petersen, Niklas Christoffer; Villadsen, Jørgen

    2017-01-01

    We present a formal logical approach using a combinatory categorial grammar for entity level sentiment analysis that utilizes machine learning techniques for efficient syntactical tagging and performs a deep structural analysis of the syntactical properties of texts in order to yield precise resu...

  7. Multi-level barriers analysis to promote guideline based nursing care: a leadership strategy from home health care.

    Science.gov (United States)

    Gifford, Wendy A; Graham, Ian D; Davies, Barbara L

    2013-07-01

    Understanding the types of barriers that exist when implementing change can assist healthcare managers to tailor implementation strategies for optimal patient outcomes. The aim of this paper is to present an organising framework, the Barriers Assessment Taxonomy, for understanding barriers to nurses' use of clinical practice guideline recommendations. Barriers to recommendations are illustrated using the Barriers Assessment Taxonomy and insights discussed. As part of a pilot implementation study, semi-structured interviews (n = 26) were conducted to understand barriers to nurses' use of nine guideline recommendations for diabetic foot ulcers. Content analysis of verbatim transcripts included thematic coding and categorising barriers using the Barriers Assessment Taxonomy. Nineteen barriers were associated with nine recommendations, crossing five levels of the health care delivery system. The Barriers Assessment Taxonomy revealed that all recommendations had individual and organisational level barriers, with one recommendation having barriers at all levels. Individual level barriers were most frequent and lack of knowledge and skills was the only barrier that crossed all recommendations. The Barriers Assessment Taxonomy provides a framework for nursing managers to understand the complexity of barriers that exist, and can assist in choosing intervention strategies to support improved quality care and patient outcomes. © 2013 John Wiley & Sons Ltd.

  8. Medical expert system for assessment of coronary heart disease destabilization based on the analysis of the level of soluble vascular adhesion molecules

    Science.gov (United States)

    Serkova, Valentina K.; Pavlov, Sergey V.; Romanava, Valentina A.; Monastyrskiy, Yuriy I.; Ziepko, Sergey M.; Kuzminova, Nanaliya V.; Wójcik, Waldemar; DzierŻak, RóŻa; Kalizhanova, Aliya; Kashaganova, Gulzhan

    2017-08-01

    A theoretical and practical substantiation of the possibility of using the level of soluble vascular adhesion molecules (sVCAM) is presented. An expert system for the assessment of coronary heart disease (CHD) destabilization, based on the analysis of the soluble vascular adhesion molecule level, is developed. A correlation between increased VCAM levels and C-reactive protein (CRP) in patients with different variants of CHD progression is established. An association between the activation of chronic nonspecific vascular inflammation and CHD destabilization is shown. The expedience of determining sVCAM and CRP levels in parallel for the diagnosis of CHD destabilization and for prognosis is noted.

  9. Coastal Flooding in Florida's Big Bend Region with Application to Sea Level Rise Based on Synthetic Storms Analysis

    Directory of Open Access Journals (Sweden)

    Scott C. Hagen; Peter Bacopoulos

    2012-01-01

    Full Text Available Flooding is examined by comparing maximum envelopes of water against the 0.2% (= 1-in-500-year) return-period flooding surface generated as part of revising the Federal Emergency Management Agency's flood insurance rate maps for Franklin, Wakulla, and Jefferson counties in Florida's Big Bend Region. The analysis condenses the number of storms to a small fraction of the original 159 used in production. The analysis is performed by assessing which synthetic storms contributed to inundation extent (the extent of inundation into the floodplain), coverage (the overall surface area of the inundated floodplain) and the spatially variable 0.2% flooding surface. The results are interpreted in terms of storm attributes (pressure deficit, radius to maximum winds, translation speed, storm heading, and landfall location) and the physical processes occurring within the natural system (storm surge and waves); both are contextualized against existing and new hurricane scales. The approach identifies what types of storms and storm attributes lead to what types of inundation, as measured in terms of extent and coverage, in Florida's Big Bend Region and provides a basis for the identification of a select subset of synthetic storms for studying the impact of sea level rise. The sea level rise application provides a clear contrast between a dynamic approach and a static approach.

  10. Data analysis at Level-1 Trigger level

    CERN Document Server

    Wittmann, Johannes; Aradi, Gregor; Bergauer, Herbert; Jeitler, Manfred; Wulz, Claudia; Apanasevich, Leonard; Winer, Brian; Puigh, Darren Michael

    2017-01-01

    With ever increasing luminosity at the LHC, optimum online data selection is getting more and more important. While in the case of some experiments (LHCb and ALICE) this task is being completely transferred to computer farms, the others - ATLAS and CMS - will not be able to do this in the medium-term future for technological, detector-related reasons. Therefore, these experiments pursue the complementary approach of migrating more and more of the offline and High-Level Trigger intelligence into the trigger electronics. This paper illustrates how the Level-1 Trigger of the CMS experiment and in particular its concluding stage, the Global Trigger, take up this challenge.

  11. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    Science.gov (United States)

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group‐level studies or in meta‐analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log‐odds and arcsine transformations of the estimated probability p^, both for single‐group studies and in combining results from several groups or studies in meta‐analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta‐analysis and result in abysmal coverage of the combined effect for large K. We also propose bias‐correction for the arcsine transformation. Our simulations demonstrate that this bias‐correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta‐analyses of prevalence. PMID:27192062

  12. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    Science.gov (United States)

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. © 2016 The Authors. Biometrical Journal Published by Wiley-VCH Verlag GmbH & Co. KGaA.
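
    The bias mechanism described in these two records is easy to reproduce in simulation. The sketch below, a rough illustration and not the authors' exact design, draws overdispersed binomial counts from a beta-binomial model with intracluster correlation rho, applies the arcsine (angular) transformation to the estimated probability, and reports the bias of the back-transformed mean; all parameter values are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)
        p, rho, n, m = 0.2, 0.1, 50, 20000  # true probability, ICC, cluster size, replicates

        # Beta-binomial draws: overdispersed binomial with intracluster correlation rho.
        a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
        x = rng.binomial(n, rng.beta(a, b, size=m))
        p_hat = x / n

        # Arcsine transformation of the estimated probability, then back-transform the mean.
        t = np.arcsin(np.sqrt(p_hat))
        print("bias of back-transformed mean:", np.sin(t.mean()) ** 2 - p)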

  13. Analysis of Unmanned Aerial System-Based CIR Images in Forestry—A New Perspective to Monitor Pest Infestation Levels

    Directory of Open Access Journals (Sweden)

    Jan Rudolf Karl Lehmann

    2015-03-01

    Full Text Available The detection of pest infestation is an important aspect of forest management. In the case of oak splendour beetle (Agrilus biguttatus) infestation, the affected oaks (Quercus sp.) show high levels of defoliation and an altered canopy reflection signature. These critical features can be identified in high-resolution colour infrared (CIR) images at the tree crown and branch level captured by Unmanned Aerial Systems (UAS). In this study, we used a small UAS equipped with a compact digital camera which had been calibrated and modified to record not only the visual but also the near infrared reflection (NIR) of possibly infested oaks. The flight campaigns were realized in August 2013, covering two study sites which are located in a rural area in western Germany. Both locations represent small-scale, privately managed commercial forests in which oaks are economically valuable species. Our workflow includes the CIR/NIR image acquisition, mosaicking, georeferencing and pixel-based image enhancement, followed by object-based image classification techniques. A modified Normalized Difference Vegetation Index (NDVImod)-derived classification was used to distinguish between five vegetation health classes, i.e., infested, healthy or dead branches, other vegetation and canopy gaps. We achieved an overall Kappa Index of Agreement (KIA) of 0.81 and 0.77 for each study site, respectively. This approach offers a low-cost alternative to private forest owners who pursue a sustainable management strategy.
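
    As a rough sketch of the index-based classification step, the code below computes a plain NDVI from NIR and red bands and bins it into discrete classes; the band values, thresholds and class labels are hypothetical, and the study's own modified index (NDVImod) and object-based rules are not reproduced.

        import numpy as np

        def ndvi(nir, red):
            """Per-pixel normalized difference vegetation index."""
            nir, red = nir.astype(float), red.astype(float)
            return (nir - red) / (nir + red + 1e-9)

        # Hypothetical 3x3 patches of the NIR and red bands of a CIR image.
        nir = np.array([[200, 180, 60], [150, 90, 40], [30, 220, 70]])
        red = np.array([[ 60,  70, 50], [ 60, 80, 45], [25,  40, 65]])

        v = ndvi(nir, red)
        classes = np.digitize(v, bins=[0.0, 0.2, 0.5])  # illustrative class boundaries
        print(v.round(2))
        print(classes)  # 0..3, e.g. gap/dead, infested, intermediate, healthy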

  14. Sample Entropy Analysis of EEG Signals via Artificial Neural Networks to Model Patients’ Consciousness Level Based on Anesthesiologists Experience

    Directory of Open Access Journals (Sweden)

    George J. A. Jiang

    2015-01-01

    Full Text Available Electroencephalogram (EEG) signals, which express the human brain's activity and reflect awareness, have been widely used in research and in medical equipment to build a noninvasive monitoring index of the depth of anesthesia (DOA). The Bispectral (BIS) index monitor is one of the best-known and most important indicators primarily using EEG signals that anesthesiologists consult when assessing the DOA. In this study, an attempt is made to build a new indicator using EEG signals to provide a more valuable reference for the DOA for clinical researchers. The EEG signals were collected from patients under anesthetic surgery, filtered using the multivariate empirical mode decomposition (MEMD) method and analyzed using sample entropy (SampEn) analysis. The signals calculated by SampEn were utilized to train an artificial neural network (ANN) model, using the expert assessment of consciousness level (EACL), assessed by experienced anesthesiologists, as the target to train, validate, and test the ANN. The results achieved by the proposed system were compared to the BIS index. They show that the proposed system not only has characteristics similar to the BIS index but is also closer to the judgment of experienced anesthesiologists, illustrating the consciousness level and reflecting the DOA successfully.
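
    Sample entropy itself is compact to implement. The sketch below is a naive O(n^2) version for a single 1-D signal with the common defaults m = 2 and r = 0.2 times the signal's standard deviation; the MEMD filtering and ANN stages of the study are not reproduced.

        import numpy as np

        def sample_entropy(x, m=2, r=None):
            """SampEn(m, r) of a 1-D signal; naive pairwise-distance implementation."""
            x = np.asarray(x, dtype=float)
            if r is None:
                r = 0.2 * x.std()
            N = len(x)
            def matches(mm):
                # Same template count for lengths m and m+1, Chebyshev distance.
                emb = np.array([x[i:i + mm] for i in range(N - m)])
                d = np.max(np.abs(emb[:, None] - emb[None, :]), axis=2)
                return (d <= r).sum() - (N - m)  # exclude self-matches
            return -np.log(matches(m + 1) / matches(m))

        rng = np.random.default_rng(1)
        print(sample_entropy(rng.standard_normal(300)))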

  15. REDOX state analysis of platinoid elements in simulated high-level radioactive waste glass by synchrotron radiation based EXAFS

    Energy Technology Data Exchange (ETDEWEB)

    Okamoto, Yoshihiro, E-mail: okamoto.yoshihiro@jaea.go.jp [Condensed Matter Chemistry Group, Quantum Beam Science Center, Japan Atomic Energy Agency, Shirakata 2-4, Tokai-mura, Ibaraki 319-1195 (Japan); Shiwaku, Hideaki [Quantum Beam Science Center, Japan Atomic Energy Agency, Kouto 1-1-1, Sayo-cho, Hyogo 679-5143 (Japan); Nakada, Masami [Nuclear Engineering Science Center, Japan Atomic Energy Agency, Shirakata 2-4, Tokai-mura, Ibaraki 319-1195 (Japan); Komamine, Satoshi; Ochi, Eiji [Japan Nuclear Fuel Limited, 4-108 Aza Okitsuke, Oaza Obuchi, Rokkasho-mura, Aomori 030-3212 (Japan); Akabori, Mitsuo [Nuclear Engineering Science Center, Japan Atomic Energy Agency, Shirakata 2-4, Tokai-mura, Ibaraki 319-1195 (Japan)

    2016-04-01

    Extended X-ray Absorption Fine Structure (EXAFS) analyses were performed to evaluate the REDOX (REDuction and OXidation) state of platinoid elements in simulated high-level nuclear waste glass samples prepared under different conditions of temperature and atmosphere. First, the EXAFS functions were compared with those of standard materials such as RuO{sub 2}. Then structural parameters were obtained from a curve fitting analysis. In addition, a fitting analysis using a linear combination of the two standard EXAFS functions of a given element's metal and oxide forms was applied to determine the metal/oxide ratio in the simulated glass. The redox state of Ru was successfully evaluated from the linear combination fitting of the EXAFS functions. The metal fraction increased under more reducing atmospheres and at higher temperatures. The chemical form of rhodium oxide in the simulated glass samples was RhO{sub 2}, unlike the expected Rh{sub 2}O{sub 3}. It can be inferred that rhodium behaves in accordance with ruthenium when the chemical form is oxide.
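
    The linear-combination step has a simple least-squares form. In the sketch below the two reference EXAFS functions and the sample curve are synthetic stand-ins; in practice they would be measured chi(k) spectra of, e.g., Ru metal and RuO2 standards.

        import numpy as np

        k = np.linspace(2, 12, 200)
        chi_metal = np.sin(5.0 * k) * np.exp(-0.02 * k ** 2)        # stand-in metal reference
        chi_oxide = np.sin(3.5 * k + 0.4) * np.exp(-0.03 * k ** 2)  # stand-in oxide reference
        noise = 0.01 * np.random.default_rng(0).standard_normal(k.size)
        chi_sample = 0.7 * chi_metal + 0.3 * chi_oxide + noise      # "measured" sample

        # Least-squares fit of the sample as a linear combination of the references.
        A = np.column_stack([chi_metal, chi_oxide])
        coef, *_ = np.linalg.lstsq(A, chi_sample, rcond=None)
        frac_metal = coef[0] / coef.sum()
        print(f"estimated metal fraction: {frac_metal:.2f}")  # ~0.70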

  16. Gay-Straight Alliances are Associated with Lower Levels of School-Based Victimization of LGBTQ+ Youth: A Systematic Review and Meta-analysis.

    Science.gov (United States)

    Marx, Robert A; Kettrey, Heather Hensman

    2016-07-01

    Gay-straight alliances (GSAs) are school-based organizations for lesbian, gay, bisexual, transgender, and queer (LGBTQ+) youth and their allies that often attempt to improve school climate for sexual and gender minority youth. This meta-analysis evaluates the association between school GSA presence and youth's self-reports of school-based victimization by quantitatively synthesizing 15 primary studies with 62,923 participants. Findings indicate GSA presence is associated with significantly lower levels of youth's self-reports of homophobic victimization, fear for safety, and hearing homophobic remarks, and these results are robust, controlling for a variety of study-level factors. The findings of this meta-analysis provide evidence to support GSAs as a means of protecting LGBTQ+ youth from school-based victimization.
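
    The quantitative synthesis step of such a meta-analysis is typically an inverse-variance pooling of per-study effects. The sketch below applies the common DerSimonian-Laird random-effects estimator to hypothetical per-study effects and variances; these numbers are illustrative, not the 15 studies actually synthesized.

        import numpy as np

        y = np.array([-0.42, -0.30, -0.55, -0.18, -0.37])  # hypothetical log odds ratios
        v = np.array([0.04, 0.06, 0.05, 0.08, 0.03])       # hypothetical within-study variances

        # DerSimonian-Laird between-study variance estimate, then random-effects pooling.
        w = 1 / v
        y_fixed = (w * y).sum() / w.sum()
        Q = (w * (y - y_fixed) ** 2).sum()
        tau2 = max(0.0, (Q - (len(y) - 1)) / (w.sum() - (w ** 2).sum() / w.sum()))
        w_re = 1 / (v + tau2)
        y_re = (w_re * y).sum() / w_re.sum()
        se = np.sqrt(1 / w_re.sum())
        print(f"pooled effect = {y_re:.3f} (95% CI +/- {1.96 * se:.3f})")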

  17. Functional gene array-based analysis of microbial community structure in groundwaters with a gradient of contaminant levels

    Energy Technology Data Exchange (ETDEWEB)

    Waldron, P.J.; Wu, L.; Van Nostrand, J.D.; Schadt, C.W.; Watson, D.B.; Jardine, P.M.; Palumbo, A.V.; Hazen, T.C.; Zhou, J.

    2009-06-15

    To understand how contaminants affect microbial community diversity, heterogeneity, and functional structure, six groundwater monitoring wells from the Field Research Center of the U.S. Department of Energy Environmental Remediation Science Program (ERSP; Oak Ridge, TN), with a wide range of pH, nitrate, and heavy metal contamination were investigated. DNA from the groundwater community was analyzed with a functional gene array containing 2006 probes to detect genes involved in metal resistance, sulfate reduction, organic contaminant degradation, and carbon and nitrogen cycling. Microbial diversity decreased in relation to the contamination levels of the wells. Highly contaminated wells had lower gene diversity but greater signal intensity than the pristine well. The microbial composition was heterogeneous, with 17-70% overlap between different wells. Metal-resistant and metal-reducing microorganisms were detected in both contaminated and pristine wells, suggesting the potential for successful bioremediation of metal-contaminated groundwaters. In addition, results of Mantel tests and canonical correspondence analysis indicate that nitrate, sulfate, pH, uranium, and technetium have a significant (p < 0.05) effect on microbial community structure. This study provides an overall picture of microbial community structure in contaminated environments with functional gene arrays by showing that diversity and heterogeneity can vary greatly in relation to contamination.
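
    The record does not state which diversity metric was used, but gene diversity from array data is commonly summarized with Shannon or inverse Simpson indices over normalized probe signal intensities; the sketch below illustrates that calculation with hypothetical intensities.

        import numpy as np

        # Hypothetical background-corrected signal intensities for detected probes in one well.
        signals = np.array([120.0, 80.0, 45.0, 30.0, 15.0, 5.0])
        p = signals / signals.sum()

        shannon = -(p * np.log(p)).sum()    # Shannon diversity (higher = more diverse)
        inv_simpson = 1.0 / (p ** 2).sum()  # inverse Simpson index
        print(f"Shannon H' = {shannon:.2f}, inverse Simpson = {inv_simpson:.2f}")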

  18. Characterization of hydraulic connections between mine shaft and caprock based on time series analysis of water level changes for the flooded Asse I salt mine in northern Germany

    International Nuclear Information System (INIS)

    Brauchler, Ralf; Mettier, Ralph; Schulte, Peter; Fuehrboeter, Jens Fred

    2015-01-01

    In the context of safe enclosure of nuclear waste in salt formations, one of the main challenges is potential water inflow into the excavations. In this context, the hydraulic relationship between the abandoned Asse I salt mine and the salt dissolution network at the base of the caprock of the Asse salt structure in northern Germany is characterized by utilizing time series analysis of water level changes. The database comprises a time series of water level measurements over eight years with a temporal resolution of 15 minutes (in general) and up to 2 minutes for specific intervals. The water level measurements were collected in the shaft of the flooded mine, which is filled with ground rock salt to a depth of 140 m, and a deep well, which is screened at a depth of 240 m in the salt dissolution zone at the base of the caprock. The distance between the well and the shaft is several hundred meters. Since the beginning of the continuous observations in the 1970s, the shaft has periodically shown abrupt declines of the water level of several meters, occurring at intervals of approx. 8 to 10 years. The time series analysis consists of trend, Fourier, autocorrelation and cross-correlation analysis. The analysis showed that during times with small water level changes the measured water levels in the well and the shaft are positively correlated, whereas during the abrupt water level drops in the shaft, the measured water levels between the shaft and the well are negatively correlated. A potential explanation for this behavior is that during times with small changes, the measured water levels in the well and in the shaft are influenced by the same external events with similar response times. In contrast, during the abrupt water level decline events in the shaft, a negatively correlated pressure signal is induced in the well, which supports the assumption of a direct hydraulic connection between the shaft and the well via flooded excavations and the salt dissolution network.

  19. Characterization of hydraulic connections between mine shaft and caprock based on time series analysis of water level changes for the flooded Asse I salt mine in northern Germany

    Energy Technology Data Exchange (ETDEWEB)

    Brauchler, Ralf; Mettier, Ralph; Schulte, Peter [AF-Consult Switzerland AG, Baden (Switzerland); Fuehrboeter, Jens Fred [Bundesamt fuer Strahlenschutz, Salzgitter (Germany)

    2015-07-01

    In the context of safe enclosure of nuclear waste in salt formations, one of the main challenges is potential water inflow into the excavations. In this context, the hydraulic relationship between the abandoned Asse I salt mine and the salt dissolution network at the base of the caprock of the Asse salt structure in northern Germany is characterized by utilizing time series analysis of water level changes. The database comprises a time series of water level measurements over eight years with a temporal resolution of 15 minutes (in general) and up to 2 minutes for specific intervals. The water level measurements were collected in the shaft of the flooded mine, which is filled with ground rock salt to a depth of 140 m, and a deep well, which is screened at a depth of 240 m in the salt dissolution zone at the base of the caprock. The distance between the well and the shaft is several hundred meters. Since the beginning of the continuous observations in the 1970s, the shaft has periodically shown abrupt declines of the water level of several meters, occurring at intervals of approx. 8 to 10 years. The time series analysis consists of trend, Fourier, autocorrelation and cross-correlation analysis. The analysis showed that during times with small water level changes the measured water levels in the well and the shaft are positively correlated, whereas during the abrupt water level drops in the shaft, the measured water levels between the shaft and the well are negatively correlated. A potential explanation for this behavior is that during times with small changes, the measured water levels in the well and in the shaft are influenced by the same external events with similar response times. In contrast, during the abrupt water level decline events in the shaft, a negatively correlated pressure signal is induced in the well, which supports the assumption of a direct hydraulic connection between the shaft and the well via flooded excavations and the salt dissolution network.
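
    The cross-correlation part of the time series toolbox named in these records can be illustrated compactly. The sketch below cross-correlates two synthetic 15-minute water level series with a known offset; all values are simulated stand-ins, not the Asse measurements.

        import numpy as np

        rng = np.random.default_rng(7)
        t = np.arange(2000)
        common = np.sin(2 * np.pi * t / 96)                            # shared daily signal
        shaft = common + 0.2 * rng.standard_normal(t.size)
        well = np.roll(common, 4) + 0.2 * rng.standard_normal(t.size)  # well shifted by 4 steps

        def xcorr(a, b):
            """Normalized full cross-correlation and the corresponding lags."""
            a = (a - a.mean()) / (a.std() * len(a))
            b = (b - b.mean()) / b.std()
            c = np.correlate(a, b, mode="full")
            lags = np.arange(-len(a) + 1, len(a))
            return lags, c

        lags, c = xcorr(shaft, well)
        # The imposed 4-step offset reappears as the location of the correlation peak.
        print("peak correlation at lag:", lags[np.argmax(np.abs(c))])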

  20. MIPAS level 2 operational analysis

    Directory of Open Access Journals (Sweden)

    P. Raspollini

    2006-01-01

    Full Text Available The MIPAS (Michelson Interferometer for Passive Atmospheric Sounding) instrument has been operating on-board the ENVISAT satellite since March 2002. In the first two years, it acquired in a nearly continuous manner high resolution (0.025 cm−1) unapodized emission spectra of the Earth's atmosphere at limb in the middle infrared region. This paper describes the level 2 near real-time (NRT) and off-line (OL) ESA processors that have been used to derive level 2 geophysical products from the calibrated and geolocated level 1b spectra. The design of the code and the analysis methodology have been driven by the requirements for NRT processing. This paper reviews the performance of the optimized retrieval strategy that has been implemented to achieve these requirements and provides estimated error budgets for the target products: pressure, temperature, O3, H2O, CH4, HNO3, N2O and NO2, in the altitude measurement range from 6 to 68 km. From application to real MIPAS data, it was found that no change was needed in the developed code although an external algorithm was introduced to identify clouds with high opacity and to exclude affected spectra from the analysis. In addition, a number of updates were made to the set-up parameters and to auxiliary data. In particular, a new version of the MIPAS dedicated spectroscopic database was used and, in the OL analysis, the retrieval range was extended to reduce errors due to uncertainties in extrapolation of the profile outside the retrieval range and more stringent convergence criteria were implemented. A statistical analysis on the χ2 values obtained in one year of measurements shows good agreement with the a priori estimate of the forward model errors. On the basis of the first two years of MIPAS measurements the estimates of the forward model and instrument errors are in general found to be conservative with excellent performance demonstrated for frequency calibration. It is noted that the total retrieval

  1. Measurement and Analysis of Radio-frequency Radiation Exposure Level from Different Mobile Base Transceiver Stations in Ajaokuta and Environs, Nigeria

    OpenAIRE

    Ushie, P. O.; Nwankwo, Victor U. J.; Bolaji, Ayinmode; Osahun, O. D.

    2013-01-01

    We present the result of a preliminary assessment of radio-frequency radiation exposure from selected mobile base stations in Ajaokuta and environs. The power density of RF radiation within a radial distance of 125 m was measured. Although values fluctuated due to the influence of other factors, including wave interference from other electromagnetic sources around reference base stations, we show from analysis that the radiation exposure level is below the standard limit (4.5W/sqm for 900MHz and 9W/sq...

  2. A Comparative Land Use-Based Analysis of Noise Pollution Levels in Selected Urban Centers of Nigeria.

    Science.gov (United States)

    Baloye, David O; Palamuleni, Lobina G

    2015-09-29

    Growth in the commercialization, mobility and urbanization of human settlements across the globe has greatly exposed the world's urban population to potentially harmful noise levels. The situation is more disturbing in developing countries like Nigeria, where there are no sacrosanct noise laws and regulations. This study characterized noise pollution levels in Ibadan and Ile-Ife, two urban areas of Southwestern Nigeria that have experienced significant increases in population and land use activities. Eight hundred noise measurements, taken at 20 different positions in the morning, afternoon, and evening of carefully selected weekdays in each urban area, were used for this study. Findings put the average noise levels in the urban centers at between 53 dB(A) and 89 dB(A), a far cry from the World Health Organization (WHO) permissible limits in all the land use types, with the highest noise pollution levels recorded for the transportation, commercial, residential and educational land use types. The result of the one-way ANOVA test, carried out with noise as the dependent variable and land use type as the fixed factor, reveals statistically significant differences in mean noise levels across the study area (F(3,34) = 15.13, p = 0.000). The study underscores noise pollution monitoring and the urgent need to control urban noise pollution with appropriate and effective policies.
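
    The reported test is a standard one-way ANOVA of noise level on land use type. A minimal sketch with hypothetical dB(A) readings (not the study's data) follows; scipy returns the F statistic and p-value directly.

        from scipy import stats

        # Hypothetical dB(A) readings grouped by land use type.
        transportation = [82, 85, 88, 79, 90]
        commercial = [75, 78, 80, 76, 81]
        residential = [65, 68, 70, 66, 72]
        educational = [60, 66, 69, 63, 67]

        f, p = stats.f_oneway(transportation, commercial, residential, educational)
        print(f"F = {f:.2f}, p = {p:.4f}")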

  3. Assessing the Primary Schools--A Multi-Dimensional Approach: A School Level Analysis Based on Indian Data

    Science.gov (United States)

    Sengupta, Atanu; Pal, Naibedya Prasun

    2012-01-01

    Primary education is essential for the economic development in any country. Most studies give more emphasis to the final output (such as literacy, enrolment etc.) rather than the delivery of the entire primary education system. In this paper, we study the school level data from an Indian district, collected under the official DISE statistics. We…

  4. Comparative Analysis of the Level of Knowledge-Based Part of Economies in European Union Countries with Kam Methodology

    OpenAIRE

    Strożek Piotr

    2013-01-01

    The purpose of this article is to identify disparities in the use of knowledge in socio-economic life in the EU countries. This research was conducted with the use of cluster analysis (a tool belonging to multidimensional comparative analysis). The article presents a territorial classification of European Union countries according to the development of their knowledge economies, which in today's world is treated as a determinant of international competitiveness. This differentiation was constructed on...

  5. Geostatistical interpolation model selection based on ArcGIS and spatio-temporal variability analysis of groundwater level in piedmont plains, northwest China.

    Science.gov (United States)

    Xiao, Yong; Gu, Xiaomin; Yin, Shiyang; Shao, Jingli; Cui, Yali; Zhang, Qiulan; Niu, Yong

    2016-01-01

    Based on geostatistical theory and the ArcGIS geostatistical module, data from 30 groundwater level observation wells were used to estimate the decline of the groundwater level in the Beijing piedmont. Seven different interpolation methods (inverse distance weighted interpolation, global polynomial interpolation, local polynomial interpolation, tension spline interpolation, ordinary Kriging interpolation, simple Kriging interpolation and universal Kriging interpolation) were used for interpolating the groundwater level between 2001 and 2013. Cross-validation, absolute error and the coefficient of determination (R(2)) were applied to evaluate the accuracy of the different methods. The results show that the simple Kriging method gave the best fit. The analysis of spatial and temporal variability suggests that the nugget effects from 2001 to 2013 were increasing, which means the spatial correlation weakened gradually under the influence of human activities. The spatial variability in the middle areas of the alluvial-proluvial fan is relatively higher than in the top and bottom areas. Owing to changes in land use, the groundwater level also shows temporal variation; the average decline rate of the groundwater level between 2007 and 2013 increased compared with 2001-2006. Urban development and population growth cause over-exploitation in residential and industrial areas. The decline rate of the groundwater level in residential, industrial and river areas is relatively high, while the decrease in farmland area and the development of water-saving irrigation have reduced the quantity of water used by agriculture, so the decline rate of the groundwater level in agricultural areas is not significant.
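
    The cross-validation logic used to rank the interpolators is method-agnostic. The sketch below applies leave-one-out cross-validation to a simple inverse distance weighted interpolator on synthetic well data; Kriging requires variogram fitting and is omitted, so this illustrates the evaluation procedure rather than the study's ArcGIS workflow.

        import numpy as np

        rng = np.random.default_rng(2)
        xy = rng.uniform(0, 10, size=(30, 2))  # hypothetical well locations
        z = 50 - 0.8 * xy[:, 0] + 0.5 * xy[:, 1] + rng.normal(0, 0.3, 30)  # water levels

        def idw(xy_known, z_known, pt, power=2):
            """Inverse distance weighted estimate at point pt."""
            d = np.linalg.norm(xy_known - pt, axis=1)
            w = 1.0 / np.maximum(d, 1e-12) ** power
            return (w * z_known).sum() / w.sum()

        # Leave-one-out cross-validation: predict each well from the remaining 29.
        errs = [z[i] - idw(np.delete(xy, i, 0), np.delete(z, i), xy[i])
                for i in range(len(z))]
        print("IDW leave-one-out RMSE: %.3f" % np.sqrt(np.mean(np.square(errs))))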

  6. A situation analysis of the competitive schools based cricket coaching programmes at u/19 level in the Gauteng province

    OpenAIRE

    2014-01-01

    M.Phil. (Sport Management) The purpose of this study was to investigate the management of cricket coaching programmes at u/19 level in the Gauteng Province. Specifically, this study attempted to determine the current situation regarding management of coaching programmes and the delivery of in-school driven programmes in the province. Data was collected from schools offering cricket as a sport from both the Gauteng Lions and Northerns Cricket Union franchises. There were 10 schools in the N...

  7. Transcriptome-Based Analysis in Lactobacillus plantarum WCFS1 Reveals New Insights into Resveratrol Effects at System Level.

    Science.gov (United States)

    Reverón, Inés; Plaza-Vinuesa, Laura; Franch, Mónica; de Las Rivas, Blanca; Muñoz, Rosario; López de Felipe, Félix

    2018-05-01

    This study was undertaken to expand our insights into the mechanisms involved in the tolerance to resveratrol (RSV) that operate at system-level in gut microorganisms and advance knowledge on new RSV-responsive gene circuits. Whole genome transcriptional profiling was used to characterize the molecular response of Lactobacillus plantarum WCFS1 to RSV. DNA repair mechanisms were induced by RSV and responses were triggered to decrease the load of copper, a metal required for RSV-mediated DNA cleavage, and H 2 S, a genotoxic gas. To counter the effects of RSV, L. plantarum strongly up- or downregulated efflux systems and ABC transporters pointing to transport control of RSV across the membrane as a key mechanism for RSV tolerance. L. plantarum also downregulated tRNAs, induced chaperones, and reprogrammed its transcriptome to tightly control ammonia levels. RSV induced a probiotic effector gene and a likely deoxycholate transporter, two functions that improve the host health status. Our data identify novel protective mechanisms involved in RSV tolerance operating at system level in a gut microbe. These insights could influence the way RSV is used for a better management of gut microbial ecosystems to obtain associated health benefits. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Efficiency and Productivity of County-level Public Hospitals Based on the Data Envelopment Analysis Model and Malmquist Index in Anhui, China.

    Science.gov (United States)

    Li, Nian-Nian; Wang, Cun-Hui; Ni, Hong; Wang, Heng

    2017-12-05

    China began to implement national medical and health system and public hospital reforms in 2009 and 2012, respectively. Anhui Province is one of the four pilot provinces, and its medical reform measures received wide attention nationwide. The effectiveness of these reforms deserves attention. This study aimed to assess the efficiency and productivity of county-level public hospitals based on the data envelopment analysis (DEA) model and the Malmquist index in Anhui, China, and then provide improvement measures for future hospital development. We chose 12 county-level hospitals based on geographical distribution and the economic development level in Anhui Province. Relevant data that were collected in the field and then sorted were provided by the administrative departments of the hospitals. DEA models were used to calculate the dynamic efficiency and Malmquist index factors for the 12 institutions. During 2010-2015, the overall average relative service efficiency of the 12 county-level public hospitals was 0.926, and the number of hospitals that achieved DEA effectiveness in each year from 2010 to 2015 was 4, 6, 7, 7, 6, and 8, respectively. During this same period, the average overall production efficiency was 0.983, and total factor productivity had declined. The overall production efficiency of five hospitals was >1, while the productivity of the rest had not been effectively improved. County-level public hospitals need to examine their own circumstances to identify their deficiencies.
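
    For reference, the efficiency score in the basic input-oriented CCR DEA model is the optimum of a small linear program per hospital (decision-making unit). The sketch below solves it with scipy for four hypothetical hospitals with two inputs and one output; the study's actual input/output variables and the Malmquist decomposition are not reproduced.

        import numpy as np
        from scipy.optimize import linprog

        X = np.array([[5.0, 300], [8.0, 450], [6.0, 380], [9.0, 500]])  # inputs, e.g. staff, beds
        Y = np.array([[900.0], [1100.0], [1000.0], [1150.0]])           # output, e.g. patients

        def ccr_efficiency(k):
            """min theta s.t. X.T @ lam <= theta * X[k], Y.T @ lam >= Y[k], lam >= 0."""
            n, m, s = X.shape[0], X.shape[1], Y.shape[1]
            c = np.r_[1.0, np.zeros(n)]  # decision variables: theta, lam_1..lam_n
            A_ub = np.block([[-X[k][:, None], X.T],
                             [np.zeros((s, 1)), -Y.T]])
            b_ub = np.r_[np.zeros(m), -Y[k]]
            res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                          bounds=[(None, None)] + [(0, None)] * n)
            return res.fun

        for k in range(len(X)):
            print(f"hospital {k}: efficiency = {ccr_efficiency(k):.3f}")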

  9. Risk-based systems analysis of emerging high-level waste tank remediation technologies. Volume 2: Final report

    International Nuclear Information System (INIS)

    Peters, B.B.; Cameron, R.J.; McCormack, W.D.

    1994-08-01

    The objective of DOE's Radioactive Waste Tank Remediation Technology Focus Area is to identify and develop new technologies that will reduce the risk and/or cost of remediating DOE underground waste storage tanks and tank contents. There are, however, many more technology investment opportunities than the current budget can support. Current technology development selection methods evaluate new technologies in isolation from other components of an overall tank waste remediation system. This report describes a System Analysis Model developed under the US Department of Energy (DOE) Office of Technology Development (OTD) Underground Storage Tank-Integrated Demonstration (UST-ID) program. The report identifies the project objectives and provides a description of the model. Development of the first "demonstration" version of this model and a trial application have been completed and the results are presented. This model will continue to evolve as it undergoes additional user review and testing.

  10. Risk-based systems analysis of emerging high-level waste tank remediation technologies. Volume 2: Final report

    Energy Technology Data Exchange (ETDEWEB)

    Peters, B.B.; Cameron, R.J.; McCormack, W.D. [Enserch Environmental Corp., Richland, WA (United States)

    1994-08-01

    The objective of DOE's Radioactive Waste Tank Remediation Technology Focus Area is to identify and develop new technologies that will reduce the risk and/or cost of remediating DOE underground waste storage tanks and tank contents. There are, however, many more technology investment opportunities than the current budget can support. Current technology development selection methods evaluate new technologies in isolation from other components of an overall tank waste remediation system. This report describes a System Analysis Model developed under the US Department of Energy (DOE) Office of Technology Development (OTD) Underground Storage Tank-Integrated Demonstration (UST-ID) program. The report identifies the project objectives and provides a description of the model. Development of the first "demonstration" version of this model and a trial application have been completed and the results are presented. This model will continue to evolve as it undergoes additional user review and testing.

  11. Analysis of critical thinking ability of VII grade students based on the mathematical anxiety level through learning cycle 7E model

    Science.gov (United States)

    Widyaningsih, E.; Waluya, S. B.; Kurniasih, A. W.

    2018-03-01

    This study aims to determine students' mastery of critical thinking ability with learning cycle 7E, to determine whether the critical thinking ability of students taught with learning cycle 7E is better than that of students taught with the expository model, and to describe students' critical thinking phases based on their mathematical anxiety level. The method is mixed method with a concurrent embedded design. The population is VII grade students of SMP Negeri 3 Kebumen in the academic year 2016/2017. Subjects were determined by purposive sampling, selecting two students from each level of mathematical anxiety. Data collection techniques include test, questionnaire, interview, and documentation. Quantitative data analysis techniques include the mean test, proportion test, difference test of two means, and difference test of two proportions; for qualitative data, the Miles and Huberman model was used. The results show that: (1) students' critical thinking ability with learning cycle 7E achieves mastery learning; (2) students' critical thinking ability with learning cycle 7E is better than students' critical thinking ability with the expository model; (3) the description of students' critical thinking phases based on mathematical anxiety level shows that the lower the mathematical anxiety level, the more fully subjects fulfil the indicators of the clarification, assessment, inference, and strategies phases.

  12. Violence Among Men and Women in Substance Use Disorder Treatment: A Multi-level Event-based Analysis

    Science.gov (United States)

    Chermack, Stephen T.; Grogan-Kaylor, Andy; Perron, Brian E.; Murray, Regan L.; De Chavez, Peter; Walton, Maureen A.

    2010-01-01

    Background This study examined associations between acute alcohol and drug use and violence towards others in conflict incidents (overall, partner, and non-partner conflict incidents) by men and women recruited from substance use disorder (SUD) treatment. Methods Semi-structured interviews were used to obtain details about interpersonal conflict incidents (substance use, whether specific conflicts were with intimate partners or non-partners) in the 180 days pre-treatment. Participants for this study were selected for screening positive for past-year violence (N = 160; 77% men, 23% women). Results Multilevel multinomial regression models showed that after adjusting for clustering within individual participants, the most consistent predictors of violence across models were acute cocaine use (significant for overall, intimate partner and non-partner models), acute heavy alcohol use (significant for overall and non-partner models), and male gender (significant in all models). Conclusions This study was the first to explicitly examine the role of acute alcohol and drug use across overall, partner and non-partner conflict incidents. Consistent with prior studies using a variety of methodologies, alcohol, cocaine use and male gender were most consistently and positively related to violence severity (e.g., resulting in injury). The results provide important and novel event-level information regarding the relationship between acute alcohol and specific drug use and the severity of violence in interpersonal conflict incidents. PMID:20667666
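
    The event-level outcome model described above is a multinomial regression with incidents nested within participants. The sketch below fits a plain (single-level) multinomial logit to simulated incident data with statsmodels; the clustering adjustment used in the study requires a true multilevel model and is not reproduced, and all variable names and coefficients are hypothetical.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 400
        # Hypothetical incident-level binary predictors: heavy alcohol, cocaine use, male gender.
        X = rng.integers(0, 2, size=(n, 3)).astype(float)
        lin = 0.8 * X[:, 0] + 1.2 * X[:, 1] + 0.6 * X[:, 2]

        # Simulated 3-level severity outcome: 0 = verbal only, 1 = violence, 2 = injury.
        u = rng.random(n)
        p1 = 1 / (1 + np.exp(-(lin - 1.0)))
        p2 = 1 / (1 + np.exp(-(lin - 2.0)))
        y = (u < p1).astype(int) + (u < p2).astype(int)

        fit = sm.MNLogit(y, sm.add_constant(X)).fit(disp=False)
        print(fit.summary())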

  13. Spatial distribution level of land erosion disposition based on the analysis of slope on Central Lematang sub basin

    Science.gov (United States)

    Putranto, Dinar Dwi Anugerah; Sarino, Yuono, Agus Lestari

    2017-11-01

    Soil erosion is a natural process that is influenced by the magnitude of rainfall intensity, land cover, slope, soil type and soil processing system. However, it is often accelerated by human activities, such as improper cultivation of agricultural land, clearing of forest land for mining activities, and changes in the topography due to use for other purposes such as pile materials, mined pits and so on. The Central Lematang sub-basin is part of the Lematang sub-basin, in the Musi River Region Unit, South Sumatra Province, Indonesia, which has a topography with varying types of slope and altitude. The condition of the Central Lematang sub-basin has become critical at an alarming rate, as more than 47.5% of topographic and land use changes are dominated by coal mining activities and forest encroachment by communities. The method used in predicting erosion is USPED (Unit Stream Power Erosion and Deposition), because the USPED [1] method can predict not only sediment transport but also the value of detachment and sediment deposition. From the slope analysis results, it is found that the highest erosion potential occurs on slopes of 8-15% and the sediment is carried onto steep slopes of 15-25%. Meanwhile, high sediment deposition of 5.226 tons/ha/year is found in the waters, and of 2.12 tons/ha/year in the steeper areas.
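
    As a very rough illustration of the USPED idea, the sketch below computes a transport-capacity proxy from slope on a synthetic elevation grid and takes its divergence as net erosion/deposition; a real application derives the upslope contributing area from flow accumulation and includes the RUSLE-style rainfall, soil and cover factors, none of which are modeled here.

        import numpy as np

        rng = np.random.default_rng(4)
        dem = np.add.outer(np.linspace(40, 0, 50), np.linspace(0, 10, 50))
        dem += rng.normal(0, 0.1, dem.shape)  # synthetic elevation grid (m)
        cell = 30.0                           # cell size (m)

        gy, gx = np.gradient(dem, cell)
        slope = np.hypot(gx, gy)

        # Transport capacity proxy T ~ A^m * slope^n, with contributing area A held constant.
        m_exp, n_exp, A = 1.0, 1.3, 1.0
        T = A ** m_exp * slope ** n_exp

        # Net erosion/deposition = divergence of T along the downslope direction.
        s = np.maximum(slope, 1e-9)
        tx, ty = -T * gx / s, -T * gy / s
        ed = np.gradient(tx, cell, axis=1) + np.gradient(ty, cell, axis=0)
        print("mean |erosion-deposition| proxy:", round(float(np.abs(ed).mean()), 4))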

  14. Risk-based systems analysis of emerging high-level waste tank remediation technologies. Volume 1: Executive summary

    International Nuclear Information System (INIS)

    Peters, B.B.; Cameron, R.J.; McCormack, W.D.

    1994-08-01

    This report describes a System Analysis Model developed under the US Department of Energy (DOE) Office of Technology Development (OTD) Underground Storage Tank-Integrated Demonstration (UST-ID) program to aid technology development funding decisions for radioactive tank waste remediation. Current technology development selection methods evaluate new technologies in isolation from other components of an overall tank waste remediation system. These methods do not show the relative effect of new technologies on tank remediation systems as a whole. Consequently, DOE may spend its resources on technologies that promise to improve a single function but have a small, or possibly negative, impact on the overall system, or DOE may overlook a technology that does not address a high priority problem in the system but that does, if implemented, offer sufficient overall improvements. Systems engineering and detailed analyses often conducted under the National Environmental Policy Act (NEPA 1969) use a "whole system" approach but are costly, too time-consuming, and often not sufficiently focused to support the needs of the technology program decision-makers. An alternative approach is required that evaluates these system impacts while still meeting the budget and schedule needs of the technology program.

  15. Exposure Based Health Issues Project Report: Phase I of High Level Tank Operations, Retrieval, Pretreatment, and Vitrification Exposure Based Health Issues Analysis

    International Nuclear Information System (INIS)

    Stenner, Robert D.; Bowers, Harold N.; Kenoyer, Judson L.; Strenge, Dennis L.; Brady, William H.; Ladue, Buffi; Samuels, Joseph K.

    2001-01-01

    The Department of Energy (DOE) has the responsibility to understand the "big picture" of worker health and safety, which includes fully recognizing the vulnerabilities and associated programs necessary to protect workers at the various DOE sites across the complex. Exposure analysis and medical surveillance are key aspects of understanding this big picture, as is understanding current health and safety practices and how they may need to change to relate to future health and safety management needs. The exposure-based health issues project was initiated to assemble the components necessary to understand potential exposure situations and their medical surveillance and clinical aspects. Phase I focused only on current Hanford tank farm operations and serves as a starting point for the overall project. It is also anticipated that once the pilot is fully developed for Hanford HLW (i.e., current operations, retrieval, pretreatment, vitrification, and disposal), the process and analysis methods developed will be available and applicable for other DOE operations and sites. The purpose of this Phase I project report is to present the health impact information collected regarding ongoing tank waste maintenance operations, show the various aspects of health and safety involved in protecting workers, and introduce the reader to the kinds of information that will need to be analyzed in order to effectively manage worker safety.

  16. Analysis of Sea Level Rise in Action

    Science.gov (United States)

    Gill, K. M.; Huang, T.; Quach, N. T.; Boening, C.

    2016-12-01

    NASA's Sea Level Change Portal provides scientists and the general public with a "one-stop" source for current sea level change information and data. Sea level rise research is multidisciplinary, and in order to understand its causes, scientists must be able to access different measurements and compare them. The portal includes an interactive tool, called the Data Analysis Tool (DAT), for accessing, visualizing, and analyzing observations and models relevant to the study of sea level rise. Using NEXUS, an open source, big data analytic technology developed at the Jet Propulsion Laboratory, the DAT is able to provide users with on-the-fly analysis of all relevant parameters. DAT is composed of three major components: a dedicated instance of OnEarth (a WMTS service), the NEXUS deep data analytic platform, and the JPL Common Mapping Client (CMC) for the web browser-based user interface (UI). Utilizing the global imagery, a user is capable of browsing the data in a visual manner and isolating areas of interest for further study. The interface's "Analysis" tool provides tools for area or point selection, single and/or comparative dataset selection, and a range of options, algorithms, and plotting. This analysis component utilizes the NEXUS cloud computing platform to provide on-demand processing of the data within the user-selected parameters and immediate display of the results. A RESTful web API is exposed for users comfortable with other interfaces who may want to take advantage of the cloud computing capabilities. This talk discusses how the DAT enables on-the-fly sea level research. The talk will introduce the DAT with an end-to-end tour of the tool, with exploration and animation of available imagery, a demonstration of comparative analysis and plotting, and how to share and export data along with images for use in publications/presentations. The session will cover what kind of data is available, what kind of analysis is possible, and what the outputs are.

  17. Pulmonary Nodule Volumetry at Different Low Computed Tomography Radiation Dose Levels With Hybrid and Model-Based Iterative Reconstruction: A Within Patient Analysis.

    Science.gov (United States)

    den Harder, Annemarie M; Willemink, Martin J; van Hamersvelt, Robbert W; Vonken, Evertjan P A; Schilham, Arnold M R; Lammers, Jan-Willem J; Luijk, Bart; Budde, Ricardo P J; Leiner, Tim; de Jong, Pim A

    2016-01-01

    The aim of the study was to determine the effects of dose reduction and iterative reconstruction (IR) on pulmonary nodule volumetry. In this prospective study, 25 patients scheduled for follow-up of pulmonary nodules were included. Computed tomography acquisitions were obtained at 4 dose levels with a median of 2.1, 1.2, 0.8, and 0.6 mSv. Data were reconstructed with filtered back projection (FBP), hybrid IR, and model-based IR. Volumetry was performed using semiautomatic software. At the highest dose level, more than 91% (34/37) of the nodules could be segmented, and at the lowest dose level, this was more than 83%. Thirty-three nodules were included for further analysis. Filtered back projection and hybrid IR did not lead to significant differences, whereas model-based IR resulted in lower volume measurements with a maximum difference of -11% compared with FBP at routine dose. Pulmonary nodule volumetry can be accurately performed at a submillisievert dose with both FBP and hybrid IR.

  18. DTI analysis methods : Voxel-based analysis

    NARCIS (Netherlands)

    Van Hecke, Wim; Leemans, Alexander; Emsell, Louise

    2016-01-01

    Voxel-based analysis (VBA) of diffusion tensor imaging (DTI) data permits the investigation of voxel-wise differences or changes in DTI metrics in every voxel of a brain dataset. It is applied primarily in the exploratory analysis of hypothesized group-level alterations in DTI parameters, as it does

  19. Portfolio Selection Using Level Crossing Analysis

    Science.gov (United States)

    Bolgorian, Meysam; Shirazi, A. H.; Jafari, G. R.

    Asset allocation is one of the most important and also challenging issues in finance. In this paper, using level crossing analysis, we introduce a new approach for portfolio selection. We introduce a portfolio index that is obtained by minimizing the waiting time to receive known return and risk values, where the waiting time is the average time until a specified level is observed. The advantage of this approach is that investors are able to set their goals based on the expected return while knowing the average waiting time and risk value at the same time. As an example, we use our model to form a portfolio of stocks on the Tehran Stock Exchange (TSE).
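
    A minimal sketch of the waiting-time idea follows, on simulated daily returns rather than actual TSE data: for a given return level, it measures the average gap between successive upward crossings of that level.

        import numpy as np

        rng = np.random.default_rng(5)
        returns = rng.normal(0.0005, 0.02, 2500)  # hypothetical daily returns of one stock

        def mean_waiting_time(x, level):
            """Average spacing (in steps) between upward crossings of `level`."""
            above = x >= level
            ups = np.flatnonzero(~above[:-1] & above[1:]) + 1
            return np.diff(ups).mean() if len(ups) > 1 else np.inf

        for q in (0.01, 0.02, 0.03):
            print(f"level {q:+.3f}: mean waiting time = "
                  f"{mean_waiting_time(returns, q):.1f} steps")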

  20. 4D-CT Lung registration using anatomy-based multi-level multi-resolution optical flow analysis and thin-plate splines.

    Science.gov (United States)

    Min, Yugang; Neylon, John; Shah, Amish; Meeks, Sanford; Lee, Percy; Kupelian, Patrick; Santhanam, Anand P

    2014-09-01

    The accuracy of 4D-CT registration is limited by inconsistent Hounsfield unit (HU) values in the 4D-CT data from one respiratory phase to another and lower image contrast for lung substructures. This paper presents an optical flow and thin-plate spline (TPS)-based 4D-CT registration method to account for these limitations. The use of unified HU values on multiple anatomy levels (e.g., the lung contour, blood vessels, and parenchyma) accounts for registration errors caused by inconsistent landmark HU values. While 3D multi-resolution optical flow analysis registers each anatomical level, TPS is employed for propagating the results from one anatomical level to another, ultimately leading to the 4D-CT registration. 4D-CT registration was validated using target registration error (TRE), inverse consistency error (ICE) metrics, and a statistical image comparison using Gamma criteria of 1% intensity difference in a 2 mm(3) window range. Validation results showed that the proposed method was able to register CT lung datasets with TRE and ICE values <3 mm. In addition, the average number of voxels that failed the Gamma criteria was <3%, which supports the clinical applicability of the proposed registration mechanism. The proposed 4D-CT registration computes the volumetric lung deformations within clinically viable accuracy.
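
    For a flavor of the optical flow stage only, the sketch below runs OpenCV's dense Farneback flow (itself multi-resolution via an image pyramid) on a pair of synthetic 2-D slices with a known shift; the paper's 3-D anatomy-level flow and the thin-plate spline propagation between levels are not reproduced.

        import numpy as np
        import cv2

        rng = np.random.default_rng(6)
        fixed = cv2.GaussianBlur((rng.random((128, 128)) * 255).astype(np.uint8), (15, 15), 0)
        moving = np.roll(fixed, 3, axis=0)  # simulate a 3-pixel respiratory shift

        flow = cv2.calcOpticalFlowFarneback(fixed, moving, None,
                                            pyr_scale=0.5, levels=3, winsize=15,
                                            iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
        print("median row displacement:", round(float(np.median(flow[..., 1])), 2))  # ~3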

  1. 14 CFR 91.861 - Base level.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Base level. 91.861 Section 91.861... level. (a) U.S. Operators. The base level of a U.S. operator is equal to the number of owned or leased... to paragraphs (a) (1) and (2). (1) The base level of a U.S. operator shall be increased by a number...

  2. Variability in copepod trophic levels and feeding selectivity based on stable isotope analysis in Gwangyang Bay of the southern coast of the Korean Peninsula

    Science.gov (United States)

    Chen, Mianrun; Kim, Dongyoung; Liu, Hongbin; Kang, Chang-Keun

    2018-04-01

    Trophic preference (i.e., food resources and trophic levels) of different copepod groups was assessed along a salinity gradient in the temperate estuarine Gwangyang Bay of Korea, based on seasonal taxonomic investigations in 2015 and stable isotope analysis incorporating multiple linear regression models. The δ13C and δ15N values of copepods in the bay displayed significant spatial heterogeneity as well as seasonal variations, indicated by their significant relationships with salinity and temperature, respectively. Both spatial and temporal variations reflected those in the isotopic values of food sources. The major calanoid groups (marine calanoids and brackish-water calanoids) had a mean trophic level of 2.2 relative to nanoplankton as the basal food source, similar to the bulk copepod assemblage; however, they had dissimilar food sources based on their different δ13C values. Calanoid isotopic values indicated a mixture of different genera, including species with high δ15N values (e.g., Labidocera, Sinocalanus, and Tortanus), moderate values (Calanus sinicus, Centropages, Paracalanus, and Acartia), and relatively low δ15N values (Eurytemora pacifica and Pseudodiaptomus). Feeding preferences of different copepods probably explain these seasonal and spatial patterns of the community trophic niche. Bayesian mixing model calculations based on source materials of two size fractions of particulate organic matter (nanoplankton and microplankton) suggested a simple energy flow in the planktonic food web of Gwangyang Bay: from primary producers (nanoplankton) and a mixture of primary producers and herbivores (microplankton), through omnivores (Acartia, Calanus, Centropages, and Paracalanus) and detritivores (Pseudodiaptomus, Eurytemora, and harpacticoids), to carnivores (Corycaeus, Tortanus, Labidocera, and Sinocalanus).

  3. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group-level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject-specific heteroscedastic spatial noise modeling. For task-based and resting-state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components, and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex…

  4. Analysis of Students' Online Information Searching Strategies, Exposure to Internet Information Pollution and Cognitive Absorption Levels Based on Various Variables

    Science.gov (United States)

    Kurt, Adile Askim; Emiroglu, Bülent Gürsel

    2018-01-01

    The objective of the present study was to examine students' online information searching strategies, their cognitive absorption levels and the information pollution levels on the Internet based on different variables and to determine the correlation between these variables. The study was designed with the survey model, the study group included 198…

  5. Trace-level mercury ion (Hg2+) analysis in aqueous sample based on solid-phase extraction followed by microfluidic immunoassay.

    Science.gov (United States)

    Date, Yasumoto; Aota, Arata; Terakado, Shingo; Sasaki, Kazuhiro; Matsumoto, Norio; Watanabe, Yoshitomo; Matsue, Tomokazu; Ohmura, Naoya

    2013-01-02

    Mercury is considered the most important heavy-metal pollutant because of the likelihood of bioaccumulation and toxicity. Monitoring widespread ionic mercury (Hg(2+)) contamination requires high-throughput and cost-effective methods to screen large numbers of environmental samples. In this study, we developed a simple and sensitive analysis for Hg(2+) in environmental aqueous samples by combining a microfluidic immunoassay with solid-phase extraction (SPE). Using a microfluidic platform, an ultrasensitive Hg(2+) immunoassay was developed that yields results within only 10 min and has a lower detection limit (LOD) of 0.13 μg/L. To allow application of the developed immunoassay to actual environmental aqueous samples, we developed an ion-exchange resin (IER)-based SPE for selective Hg(2+) extraction from an ion mixture. Using optimized SPE conditions followed by the microfluidic immunoassay, the LOD of the assay was 0.83 μg/L, which satisfies the guideline values for drinking water suggested by the United States Environmental Protection Agency (USEPA) (2 μg/L; total mercury) and the World Health Organisation (WHO) (6 μg/L; inorganic mercury). Actual water samples, including tap water, mineral water, and river water spiked with trace levels of Hg(2+), were successfully analyzed by SPE followed by the microfluidic Hg(2+) immunoassay, and the results agreed with those obtained by reduction vaporization-atomic absorption spectroscopy.

  6. Rolling circle amplification-based analysis of Sri Lankan cassava mosaic virus isolates from Tamil Nadu, India, suggests a low level of genetic variability.

    Science.gov (United States)

    Kushawaha, Akhilesh Kumar; Rabindran, Ramalingam; Dasgupta, Indranil

    2018-03-01

    Cassava mosaic disease (CMD) is a widespread disease of cassava in south Asia and the African continent. In India, CMD is known to be caused by two single-stranded DNA viruses (geminiviruses): Indian cassava mosaic virus (ICMV) and Sri Lankan cassava mosaic virus (SLCMV). Previously, the diversity of ICMV and SLCMV in India has been studied using PCR, a sequence-dependent method. For a more in-depth study of the variability of the above viruses and to detect any novel geminiviruses associated with CMD, sequence-independent amplification using rolling circle amplification (RCA)-based methods was used. CMD-affected cassava plants were sampled across eighty locations in nine districts of the southern Indian state of Tamil Nadu. Twelve complete coat protein gene sequences of the resident geminiviruses, each comprising 256 amino acid residues, were generated from the above samples, and these indicated changes at only six positions. RCA followed by RFLP of the 80 samples indicated that most samples (47) contained only SLCMV, followed by 8 that were infected jointly with ICMV and SLCMV. In 11 samples, the pattern did not match the expected patterns of either of the two viruses; these were therefore variants. Sequence analysis of an average of 700 nucleotides from 31 RCA-generated fragments of the variants indicated identities of 97-99% with the sequence of a previously reported infectious clone of SLCMV. The evidence suggests low levels of genetic variability in the begomoviruses infecting cassava, mainly in the form of scattered single nucleotide changes.

  7. Comparative Analysis of Students’ Media Competences Levels

    Directory of Open Access Journals (Sweden)

    Alexander Fedorov

    2015-08-01

    This article analyzes the results of a survey of university students' media literacy competence (on the basis of a classification of indicators of the audience's media literacy competence) as an effective tool for comparative analysis of the levels of media competence development in control and experimental groups: the level of media competence of students who completed a one-year media literacy education course was four times higher than that of the control group. Analysis of the survey results also confirmed the general trend in the student audience's media contacts: its orientation toward entertainment genres of audiovisual media and toward visually appealing, positive, active, unmarried, childless, educated, highly qualified characters (primarily male characters aged 19 to 35 years). These heroes are characterized by optimism, independence, intelligence, and emotion. They are in full command of their life situation and have a positive influence on the development of the media text's plot.

  8. Levels of narrative analysis in health psychology.

    Science.gov (United States)

    Murray, M

    2000-05-01

    The past 10-15 years have seen a rapid increase in the study of narrative across all the social sciences. It is sometimes assumed that narrative has the same meaning irrespective of the context in which it is expressed. This article considers different levels of narrative analysis within health psychology. Specifically, it considers the character of health and illness narratives as a function of the personal, interpersonal, positional and societal levels of analysis. At the personal level, narratives are portrayed as expressions of the lived experience of the narrator. At the interpersonal level, the narrative is one that is co-created in dialogue. At the positional level, the analysis considers the differences in social position between the narrator and the listener. The societal level of analysis is concerned with the socially shared stories that are characteristic of certain communities or societies. The challenge is to articulate the connections between these different levels of narrative analysis and to develop strategies to promote emancipatory narratives.

  9. Serum Alanine Aminotransferase Levels within Normal Range Have Different Associations with Augmentation Index and Other Cardiometabolic Risk Factors in Nondrinkers and Drinkers: A Chinese Community-Based Analysis

    Directory of Open Access Journals (Sweden)

    Shihui Fu

    2017-01-01

    Background. To investigate whether serum alanine aminotransferase (ALT) levels within the normal range were associated with augmentation index (AIx) and cardiometabolic risk factors in nondrinkers and drinkers in a Chinese community-dwelling population. Methods. There were 4165 participants with serum ALT levels within the normal range. Results. Alcohol drinking was reported by 1173 participants (28.2%). In multivariate analysis, serum ALT levels of nondrinkers were independently associated with age, sex, body mass index (BMI), hypertension, diabetes mellitus, diastolic blood pressure, triglyceride, low-density lipoprotein cholesterol (LDL-c), and AIx, while serum ALT levels of drinkers were independently associated with age, sex, BMI, triglyceride, and LDL-c (p < 0.05 for all). Conclusions. Associations of serum ALT levels within the normal range with age, sex, body height and weight, and blood lipids were present in participants both with and without alcohol drinking, while associations with AIx, blood pressure, and glucose were seen in nondrinkers rather than in drinkers. These findings not only provide evidence that serum ALT levels, even within the normal range, have different associations with arteriosclerosis and cardiometabolic risk factors in nondrinkers and drinkers, but are also helpful in understanding the underlying pathophysiologic mechanisms linking hepatic function to arteriosclerosis and cardiometabolic risk factors.

  10. Predator-based psychosocial stress animal model of PTSD: Preclinical assessment of traumatic stress at cognitive, hormonal, pharmacological, cardiovascular and epigenetic levels of analysis.

    Science.gov (United States)

    Zoladz, Phillip R; Diamond, David M

    2016-10-01

    Research on post-traumatic stress disorder (PTSD) is faced with the challenge of understanding how a traumatic experience produces long-lasting detrimental effects on behavior and brain functioning, and more globally, how stress exacerbates somatic disorders, including cardiovascular disease. Moreover, the design of translational research needs to link animal models of PTSD to clinically relevant risk factors which address why only a subset of traumatized individuals develop persistent psychopathology. In this review, we have summarized our psychosocial stress rodent model of PTSD which is based on well-described PTSD-inducing risk factors, including a life-threatening experience, a sense of horror and uncontrollability, and insufficient social support. Specifically, our animal model of PTSD integrates acute episodes of inescapable exposure of immobilized rats to a predator with chronic daily social instability. This stress regimen produces PTSD-like effects in rats at behavioral, cognitive, physiological, pharmacological and epigenetic levels of analysis. We have discussed a recent extension of our animal model of PTSD in which stress exacerbated coronary pathology following an ischemic event, assessed in vitro. In addition, we have reviewed our research investigating pharmacological and non-pharmacological therapeutic strategies which may have value in clinical approaches toward the treatment of traumatized people. Overall, our translational approach bridges the gap between human and animal PTSD research to create a framework with which to enhance our understanding of the biological basis of trauma-induced pathology and to assess therapeutic approaches in the treatment of psychopathology. Copyright © 2016 Elsevier Inc. All rights reserved.

  11. Single-Level and Multilevel Mediation Analysis

    Science.gov (United States)

    Tofighi, Davood; Thoemmes, Felix

    2014-01-01

    Mediation analysis is a statistical approach used to examine how the effect of an independent variable on an outcome is transmitted through an intervening variable (mediator). In this article, we provide a gentle introduction to single-level and multilevel mediation analyses. Using single-level data, we demonstrate an application of structural…
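
    Since the record is truncated, a minimal sketch of the single-level case it introduces may help: the indirect effect is the product a*b of two regression coefficients. All variable names and the simulated data below are illustrative assumptions, not material from the article.

      import numpy as np

      def indirect_effect(x, m, y):
          """Single-level mediation: indirect effect a*b from two OLS fits
          (path a: X -> M; path b: M -> Y controlling for X)."""
          a = np.linalg.lstsq(np.column_stack([np.ones_like(x), x]), m, rcond=None)[0][1]
          b = np.linalg.lstsq(np.column_stack([np.ones_like(x), x, m]), y, rcond=None)[0][2]
          return a * b

      rng = np.random.default_rng(2)
      x = rng.normal(size=500)
      m = 0.5 * x + rng.normal(size=500)               # mediator depends on X
      y = 0.4 * m + 0.1 * x + rng.normal(size=500)     # outcome depends on M and X
      print(indirect_effect(x, m, y))                  # close to 0.5 * 0.4 = 0.2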

  12. Utilizing Integrated Prediction Error Filter Analysis (INPEFA) to divide base-level cycle of fan-deltas: A case study of the Triassic Baikouquan Formation in Mabei Slope Area, Mahu Depression, Junggar Basin, China

    Science.gov (United States)

    Yuan, Rui; Zhu, Rui; Qu, Jianhua; Wu, Jun; You, Xincai; Sun, Yuqiu; Zhou, Yuanquan (Nancy)

    2018-05-01

    The Mahu Depression is an important hydrocarbon-bearing foreland sag located at the northwestern margin of the Junggar Basin, China. On the northern slope of the depression, large coarse-grained proximal fan-delta depositional systems developed in the Lower Triassic Baikouquan Formation (T1b). Lithologic hydrocarbon reservoirs have been found in the conglomerates of the formation in recent years. However, the rapid vertical and horizontal lithology variations make it difficult to divide the base-level cycles of the formation using conventional methods. Spectral analysis technologies, such as Integrated Prediction Error Filter Analysis (INPEFA), provide an effective way to overcome this difficulty. In this paper, conventional resistivity logs processed by INPEFA are utilized to study the base-level cycles of the fan-delta depositional systems. A negative trend of the INPEFA curve indicates base-level fall semi-cycles; conversely, a positive trend suggests rise semi-cycles. Base-level cycles of the Baikouquan Formation are identified in single wells and correlated between wells. One long-term base-level rise semi-cycle, including three medium-term base-level cycles, is identified over the whole Baikouquan Formation. The medium-term base-level cycles are characterized as rise semi-cycles mainly in the fan-delta plain, symmetric cycles in the fan-delta front, and fall semi-cycles mainly in the pro-fan-delta. The short-term base-level rise semi-cycles are best developed in the braided channels, sub-aqueous distributary channels and sheet sands, while the interdistributary bays and pro-fan-delta mud indicate short-term base-level fall semi-cycles. Finally, based on the INPEFA method, a sequence filling model of the Baikouquan Formation is established.
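
    INPEFA is, in essence, an integrated one-step prediction error of a log curve. The hedged numpy sketch below implements that general recipe (fit an autoregressive predictor, integrate its errors) on a synthetic resistivity log; it illustrates the principle only and is not the proprietary INPEFA algorithm.

      import numpy as np

      def integrated_prediction_error(log_curve, order=10):
          """Fit an AR(order) one-step predictor to a standardized log by
          least squares and return the running sum of its prediction
          errors; rising/falling trends of this curve are read as
          base-level rise/fall semi-cycles."""
          x = np.asarray(log_curve, dtype=float)
          x = (x - x.mean()) / x.std()
          A = np.array([x[i - order:i] for i in range(order, len(x))])
          target = x[order:]
          coef = np.linalg.lstsq(A, target, rcond=None)[0]
          return np.cumsum(target - A @ coef)

      depth = np.linspace(0, 100, 2000)                # synthetic resistivity log
      log = np.sin(depth / 8.0) + 0.05 * depth \
            + np.random.default_rng(3).normal(0, 0.3, depth.size)
      curve = integrated_prediction_error(log)
      print(curve[:3], curve[-3:])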

  13. Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE; Version 1.0): web-based tools to assess the impact of sea level rise in south Florida

    Science.gov (United States)

    Hearn, Paul; Strong, David; Swain, Eric; Decker, Jeremy

    2013-01-01

    South Florida's Greater Everglades area is particularly vulnerable to sea level rise, due to its rich endowment of animal and plant species and its heavily populated urban areas along the coast. Rising sea levels are expected to have substantial impacts on inland flooding, the depth and extent of surge from coastal storms, the degradation of water supplies by saltwater intrusion, and the integrity of plant and animal habitats. Planners and managers responsible for mitigating these impacts require advanced tools to help them more effectively identify areas at risk. The U.S. Geological Survey's (USGS) Internet-based Modeling, Mapping, and Analysis for the Greater Everglades (IMMAGE) Web site has been developed to address these needs by providing more convenient access to projections from models that forecast the effects of sea level rise on surface water and groundwater, the extent of surge and resulting economic losses from coastal storms, and the distribution of habitats. IMMAGE not only provides an advanced geographic information system (GIS) interface to support decision making, but also includes topic-based modules that explain and illustrate key concepts for nontechnical users. The purpose of this report is to familiarize both technical and nontechnical users with the IMMAGE Web site and its various applications.

  14. Stoichiometric Representation of Gene–Protein–Reaction Associations Leverages Constraint-Based Analysis from Reaction to Gene-Level Phenotype Prediction

    DEFF Research Database (Denmark)

    Machado, Daniel; Herrgard, Markus; Rocha, Isabel

    2016-01-01

    …only describe the metabolic phenotype at the reaction level; understanding the mechanistic link between genotype and phenotype is still hampered by the complexity of gene-protein-reaction associations. We implement a model transformation that enables constraint-based methods to be applied at the gene level. … design methods are not actually feasible, and we show how our approach allows using the same methods to obtain feasible gene-based designs. We also show, by extensive comparison with experimental 13C-flux data, how simple reformulations of different simulation methods with gene-wise objective functions…

  15. Students’ thinking level based on intrapersonal intelligence

    Science.gov (United States)

    Sholikhati, Rahadian; Mardiyana; Retno Sari Saputro, Dewi

    2017-12-01

    This research aims to determine students' thinking levels based on Bloom's taxonomy, reviewed with respect to students' intrapersonal intelligence. Bloom's taxonomy classifies students' thinking into six levels: remembering, understanding, applying, analyzing, evaluating, and creating. Intrapersonal intelligence is the intelligence associated with awareness and knowledge of oneself. This is a descriptive study with a qualitative approach. One student was selected as a subject in each intrapersonal intelligence category (high, moderate, and low); each was given a problem-solving test, and the results were triangulated by interview. The research found that the student with high intrapersonal intelligence reached the analyzing level of thinking, the subject with moderate intrapersonal intelligence reached the applying level, and the subject with low intrapersonal intelligence reached the understanding level.

  16. Analysis of acid-base misconceptions using modified certainty of response index (CRI and diagnostic interview for different student levels cognitive

    Directory of Open Access Journals (Sweden)

    Satya Sadhu

    2017-08-01

    The authors draw attention to the importance of an instrument that can analyze students' misconceptions. This study describes the kinds of misconception in acid-base theory and the percentage of students' misconceptions occurring in every subconcept of acid-base theory. The study follows a descriptive design and involved 148 11th-grade science students from a senior high school, divided into two classes: high cognitive level and low cognitive level. A Modified Certainty of Response Index (CRI) was used as a diagnostic instrument to explore misconceptions; the test evaluated content knowledge while considering the reason behind the students' choice of response and their certainty of response for every question. The analysis showed that misconceptions occurred in 43.86% of the high cognitive class and 24.63% of the low cognitive class. The diagnostic interviews showed that misconceptions arose because students did not understand the concepts well and related one concept to others with only partial understanding, leading them to draw failed conclusions. The type of misconception that occurred is conceptual misunderstanding. The data analysis showed that the Modified CRI is effective for analyzing students' misconceptions and that the diagnostic interview is effective for uncovering the reasons behind them.

  17. Allocating organisational level funding on the basis of Research Performance Based assessments, a comparative analysis of the EU Member States in international perspective

    Energy Technology Data Exchange (ETDEWEB)

    Jonkers, K.; Zacharewicz, T.; Lepori, B.; Reale, E.

    2016-07-01

    The paper analyses the extent to which RPBF allocation mechanisms are being implemented in Europe. To do so, this study builds on a novel set of data on project and organisational level funding developed for the European Commission, which identifies funding allocation mechanisms in each of the EU-28 Member States. This approach allows a comparison of the scope of RPBF systems across European countries. Further, the paper builds on an in-depth analysis of RPBF implementation in 28 European countries, which leads to a classification of different types of RPBF implementation based on (a) the way research performance is measured and (b) the type of link between performance assessment and the allocation of resources. The analysis furthermore identifies a number of good practices, while highlighting the potential for adverse effects of RPBF systems in research systems at different stages of development. (Author)

  18. Foreign Policy: Approaches, Levels Of Analysis, Dimensions

    OpenAIRE

    Nina Šoljan

    2012-01-01

    This paper provides an overview of key issues related to foreign policy and foreign policy theories in the wider context of political science. Discussing the origins and development of foreign policy analysis (FPA), as well as scholarly work produced over time, it argues that today FPA encompasses a variety of theoretical approaches, models and tools. These share the understanding that foreign policy outputs cannot be fully explained if analysis is confined to the systemic level. Furthermore,...

  19. Python based high-level synthesis compiler

    Science.gov (United States)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the created Python-based compiler.
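
    As a flavor of what such a compiler does at its front end, here is a deliberately tiny sketch (not the authors' tool) that uses Python's ast module to map a restricted function — integer add/subtract expressions only — onto a VHDL-like entity; a real HLS compiler adds scheduling, binding and control-flow synthesis on top.

      import ast

      OPS = {ast.Add: "+", ast.Sub: "-"}

      def emit(node):
          # Recursively translate a Python expression tree into VHDL text.
          if isinstance(node, ast.BinOp):
              return f"({emit(node.left)} {OPS[type(node.op)]} {emit(node.right)})"
          if isinstance(node, ast.Name):
              return node.id
          raise NotImplementedError(ast.dump(node))

      def to_vhdl(src):
          fn = ast.parse(src).body[0]
          args = [a.arg for a in fn.args.args]
          expr = emit(fn.body[0].value)        # assumes a single return statement
          ports = ";\n    ".join(f"{a} : in  signed(15 downto 0)" for a in args)
          return (f"entity {fn.name} is\n"
                  f"  port (\n    {ports};\n"
                  f"    result : out signed(15 downto 0));\n"
                  f"end {fn.name};\n\n"
                  f"architecture rtl of {fn.name} is\nbegin\n"
                  f"  result <= {expr};\nend rtl;")

      print(to_vhdl("def mac(a, b, c):\n    return a + b - c"))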

  20. Cork-based activated carbons as supported adsorbent materials for trace level analysis of ibuprofen and clofibric acid in environmental and biological matrices.

    Science.gov (United States)

    Neng, N R; Mestre, A S; Carvalho, A P; Nogueira, J M F

    2011-09-16

    In this contribution, powdered activated carbons (ACs) from cork waste were supported for bar adsorptive micro-extraction (BAμE) as novel adsorbent phases for the analysis of polar compounds. By combining this approach with liquid desorption followed by high performance liquid chromatography with diode array detection (BAμE(AC)-LD/HPLC-DAD), good analytical performance was achieved using the model compounds clofibric acid (CLOF) and ibuprofen (IBU) in environmental and biological matrices. Assays performed on 30 mL water samples spiked at the 25.0 μg L(-1) level yielded recoveries around 80% for CLOF and 95% for IBU under optimized experimental conditions. The ACs' textural and surface chemistry properties were correlated with the results obtained. The analytical performance showed good precision and linearity (0.9922) from 1.0 to 600.0 μg L(-1). By using the standard addition methodology, the application of the present approach to environmental water and urine matrices achieved remarkable performance at the trace level. The proposed methodology proved to be a viable alternative for the analysis of acidic pharmaceuticals, being easy to implement, reliable, sensitive, and requiring a low sample volume to monitor these priority compounds in environmental and biological matrices. Copyright © 2011 Elsevier B.V. All rights reserved.

  1. DNA methylation levels analysis in four tissues of sea cucumber Apostichopus japonicus based on fluorescence-labeled methylation-sensitive amplified polymorphism (F-MSAP) during aestivation.

    Science.gov (United States)

    Zhao, Ye; Chen, Muyan; Storey, Kenneth B; Sun, Lina; Yang, Hongsheng

    2015-03-01

    DNA methylation plays an important role in regulating transcriptional change in response to environmental stimuli. In the present study, DNA methylation levels of tissues of the sea cucumber Apostichopus japonicus were analyzed by the fluorescence-labeled methylation-sensitive amplified polymorphism (F-MSAP) technique over three stages of the aestivation cycle. Overall, a total of 26,963 fragments were amplified including 9112 methylated fragments among four sea cucumber tissues using 18 pairs of selective primers. Results indicated an average DNA methylation level of 33.79% for A. japonicus. The incidence of DNA methylation was different across tissue types in the non-aestivation stage: intestine (30.16%), respiratory tree (27.61%), muscle (27.94%) and body wall (56.25%). Our results show that hypermethylation accompanied deep-aestivation in A. japonicus, which suggests that DNA methylation may have an important role in regulating global transcriptional suppression during aestivation. Further analysis indicated that the main DNA modification sites were focused on intestine and respiratory tree tissues and that full-methylation but not hemi-methylation levels exhibited significant increases in the deep-aestivation stage. Copyright © 2014 Elsevier Inc. All rights reserved.

  2. Level crossing analysis of growing surfaces

    International Nuclear Information System (INIS)

    Shahbazi, F; Sobhanian, S; Tabar, M Reza Rahimi; Khorram, S; Frootan, G R; Zahed, H

    2003-01-01

    We investigate the average frequency ν_α^+ of positive-slope crossings of the level α = h − h̄ in surface growth processes. An exact level-crossing analysis of the random deposition model and of the Kardar-Parisi-Zhang equation in the strong-coupling limit, before the creation of singularities, is given.

  3. Trajectory Based Traffic Analysis

    DEFF Research Database (Denmark)

    Krogh, Benjamin Bjerre; Andersen, Ove; Lewis-Kelham, Edwin

    2013-01-01

    We present the INTRA system for interactive path-based traffic analysis. The analyses are developed in collaboration with traffic researchers and provide novel insights into conditions such as congestion, travel-time, choice of route, and traffic-flow. INTRA supports interactive point-and-click analysis, due to a novel and efficient indexing structure. With the web-site daisy.aau.dk/its/spqdemo/ we will demonstrate several analyses, using a very large real-world data set consisting of 1.9 billion GPS records (1.5 million trajectories) recorded from more than 13000 vehicles, and touching most…

  4. Developing Mathematics Problems Based on Pisa Level

    Directory of Open Access Journals (Sweden)

    Shahibul Ahyan

    2014-01-01

    This research aims to produce valid and practical PISA-level mathematics problems on the content of change and relationships that have a potential effect for junior high school students. A development research method developed by Akker, Gravemeijer, McKenney and Nieveen is used in this research. In the first stage, the researcher analyzed the students, the algebra material in the school-based curriculum (KTSP), and the change-and-relationships problems of PISA 2003. In the second stage, the researcher designed 13 problems with change-and-relationships content. Finally, the researcher used the formative evaluation design developed by Tessmer, which includes self-evaluation, one-to-one evaluation, expert review, a small group, and a field test. The data were collected by walkthrough, interview, and questionnaire. The results indicated that the 12 PISA-level mathematics problems on change-and-relationships content that were developed are valid and practical and have potential effects for junior high school students.

  5. THE ANALYSIS OF THE GLOBAL LEVEL OF THE ACTIVITY OF A COMPANY ON THE BASE OF THE RESULT INDICATORS WHICH EXPRESS PROFITABILITY

    Directory of Open Access Journals (Sweden)

    CĂRUNTU CONSTANTIN

    2014-02-01

    The analysis of result indicators that express the profitability of a company plays an important role in establishing the strategy to be followed, especially during an economic and financial crisis. To highlight the performance of a business and determine the efficiency of the use of inputs, it is necessary to quantify and measure the results, and to measure the results we use a range of performance indicators. These are part of a philosophy of continuous improvement, such as the one introduced by Deming (the "Deming Way"), which is based on the repetitive application of the principles "planning - implementation - control - action". The indicators measure the degree to which the results obtained correspond to the efforts made and the planned objectives. The purpose of this article is to analyze a company's performance starting from profit-type outcome indicators; to reach this aim we pursue the following objectives: the dynamic analysis of revenues, expenses and profit, and the factorial diagnosis analysis of total profit and financial profit at S.C. OMV PETROM S.A.

  6. Radio frequency based water level monitor

    African Journals Online (AJOL)

    eobe

    This paper elucidates a radio frequency (RF) based transmission and reception system used to remotely monitor and … range the wireless can cover, but in this prototype it … power supply to the system, the sensed water level is…

  7. Late Pleistocene Vegetation, Climate and Relative Sea Level Changes in the Southeastern Brazilian Coast, Based on C and N Isotopes and Bio Indicator Analysis of Mangrove Organic Matter

    Energy Technology Data Exchange (ETDEWEB)

    Pessenda, L. C.R.; Vidotto, E.; Buso, Jr., A. A.; Assarini, Jr., J. P.; Bendassollia, J. A. [Center for Nuclear Energy in Agriculture (CENA), University of Sao Paulo (USP), Piracicaba, Sao Paulo (Brazil); De Oliveira, P. E. [University of Guarulhos (UNG), Guarulhos, Sao Paulo (Brazil); Macias, F. [' ' Luiz de Queiroz' ' College of Agriculture/USP, Piracicaba, Sao Paulo (Brazil); Ricardi-Branco, F. [University of Campinas - Geosciences Institute, Campinas, Sao Paulo (Brazil)

    2013-07-15

    On the southeastern Brazilian coast, mangrove organic matter records have been studied by C and N isotopes, pollen and diatom analysis to reconstruct 40 ka of vegetation and climatic history. The δ13C and δ15N presented more depleted values from 40 to 19 ka BP. The high C/N ratios and depleted isotopic values indicate the predominance of C3 land plants in the location presently occupied by the mangrove vegetation, and a lower sea level than today. The presence of pollen of Ilex, Weinmannia, Symplocos, Drimys and Podocarpus suggests a colder and more humid climate than at present. From 19 to 2.2 ka BP a sedimentary hiatus is associated with a sea level rise. The presence of mangrove in its present position since at least 2200 a BP and the return of the marine coastline are associated with the lowest C/N ratios, more enriched δ13C and δ15N values and the presence of marine diatoms. (author)

  8. Prognostic value and clinicopathological significance of serum- and tissue-based cytokeratin 18 express level in breast cancer: a meta-analysis.

    Science.gov (United States)

    Yang, Jiangling; Gao, Sicheng; Xu, Jian; Zhu, Junfeng

    2018-04-27

    Cytokeratin 18 (CK18), a type I cytokeratin of the intermediate filament family, has been associated with the prognosis of cancer patients for decades. However, its exact role in predicting the clinical outcome of breast cancer remains controversial. To comprehensively investigate the prognostic value of CK18 in breast cancer, a systematic meta-analysis was conducted to explore the association between CK18 expression and overall survival. Literature was collected by comprehensively searching the electronic databases PubMed, Cochrane Library, Web of Science, EMBASE, and OVID (up to January 1, 2017). Nine relevant studies with 4857 cases assessing the relationship between high CK18 expression and the outcome of breast cancer patients were enrolled in our analysis. The results indicated that a high level of CK18 expression was significantly associated with the overall survival of breast cancer patients in a specimen-dependent manner. Reports that used serum to detect the expression of CK18 predicted a poor outcome of breast cancer (HR = 1.24, 95% CI: 1.11-1.38). The present study demonstrated that CK18 might serve as a novel biomarker to predict clinicopathological features and the outcome of breast cancer. © 2018 The Author(s).
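
    For readers unfamiliar with how a pooled hazard ratio such as HR = 1.24 (95% CI 1.11-1.38) is obtained, the following sketch shows standard fixed-effect (inverse-variance) pooling of log hazard ratios; the per-study numbers are made up for illustration and are not the nine studies analyzed here.

      import math

      # Per-study (HR, 95% CI low, 95% CI high); values are invented.
      studies = [(1.30, 1.05, 1.61), (1.18, 0.98, 1.42), (1.27, 1.07, 1.51)]

      w_sum = wlog_sum = 0.0
      for hr, lo, hi in studies:
          se = (math.log(hi) - math.log(lo)) / (2 * 1.96)   # SE from the CI width
          w = 1.0 / se ** 2                                 # inverse-variance weight
          w_sum += w
          wlog_sum += w * math.log(hr)

      pooled_log = wlog_sum / w_sum
      se_pooled = math.sqrt(1.0 / w_sum)
      print("pooled HR:", math.exp(pooled_log),
            "95% CI:", (math.exp(pooled_log - 1.96 * se_pooled),
                        math.exp(pooled_log + 1.96 * se_pooled)))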

  9. A large-scale measurement, analysis and modelling of electromagnetic radiation levels in the vicinity of GSM/UMTS base stations in an urban area.

    Science.gov (United States)

    Karadağ, Teoman; Yüceer, Mehmet; Abbasov, Teymuraz

    2016-01-01

    The present study analyses the electric field radiating from the GSM/UMTS base stations located in central Malatya, a densely populated urban area in Turkey. The authors have conducted both instant and continuous measurements of high-frequency electromagnetic fields throughout their research by using non-ionising radiation-monitoring networks. Over 15,000 instant and 13,000,000 continuous measurements were taken throughout the process. The authors have found that the normal electric field radiation can increase ∼25% during daytime, depending on mobile communication traffic. The authors' research work has also demonstrated the fact that the electric field intensity values can be modelled for each hour, day or week with the results obtained from continuous measurements. The authors have developed an estimation model based on these values, including mobile communication traffic (Erlang) values obtained from mobile phone base stations and the temperature and humidity values in the environment. The authors believe that their proposed artificial neural network model and multivariable least-squares regression analysis will help predict the electric field intensity in an environment in advance. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  10. A large-scale measurement, analysis and modelling of electromagnetic radiation levels in the vicinity of GSM/UMTS base stations in an urban area

    International Nuclear Information System (INIS)

    Karadag, Teoman; Abbasov, Teymuraz; Yuceer, Mehmet

    2016-01-01

    The present study analyses the electric field radiating from the GSM/UMTS base stations located in central Malatya, a densely populated urban area in Turkey. The authors have conducted both instant and continuous measurements of high-frequency electromagnetic fields throughout their research by using non-ionising radiation-monitoring networks. Over 15 000 instant and 13 000 000 continuous measurements were taken throughout the process. The authors have found that the normal electric field radiation can increase ∼25 % during daytime, depending on mobile communication traffic. The authors' research work has also demonstrated the fact that the electric field intensity values can be modelled for each hour, day or week with the results obtained from continuous measurements. The authors have developed an estimation model based on these values, including mobile communication traffic (Erlang) values obtained from mobile phone base stations and the temperature and humidity values in the environment. The authors believe that their proposed artificial neural network model and multivariable least-squares regression analysis will help predict the electric field intensity in an environment in advance. (authors)
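
    Both versions of this record describe an estimation model driven by mobile communication traffic (Erlang), temperature and humidity; the hedged numpy sketch below fits the multivariable least-squares part of such a model on synthetic data (the coefficients, units and data are assumptions, not the authors' fitted values).

      import numpy as np

      # Synthetic data: field strength as a linear function of traffic,
      # temperature and humidity (all coefficients are invented).
      rng = np.random.default_rng(4)
      n = 500
      erlang = rng.uniform(10, 200, n)
      temp = rng.uniform(-5, 35, n)
      humidity = rng.uniform(20, 90, n)
      e_field = (1.0 + 0.004 * erlang + 0.01 * temp - 0.002 * humidity
                 + rng.normal(0, 0.05, n))               # V/m, synthetic

      X = np.column_stack([np.ones(n), erlang, temp, humidity])
      coef, *_ = np.linalg.lstsq(X, e_field, rcond=None)
      print("intercept, b_erlang, b_temp, b_humidity:", coef)
      print("predicted E for a busy warm hour:", coef @ [1.0, 180.0, 30.0, 30.0])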

  11. Aspect level sentiment analysis using machine learning

    Science.gov (United States)

    Shubham, D.; Mithil, P.; Shobharani, Meesala; Sumathy, S.

    2017-11-01

    In the modern world, the development of the web and smartphones has increased the usage of online shopping. Overall feedback about a product is generated with the help of sentiment analysis using text processing. Opinion mining, or sentiment analysis, is used to collect and categorize the reviews of a product. The proposed system uses aspect-level detection, in which features are extracted from the datasets. The system performs pre-processing operations such as tokenization, part-of-speech tagging and lemmatization on the data to find meaningful information, which is used to detect the polarity level and assign a rating to the product. The proposed model focuses on aspects to produce accurate results by avoiding spam reviews.
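
    A toy sketch of the aspect-level idea named above (tokenization, aspect detection, window-based polarity scoring); the aspect list and sentiment lexicon are placeholder assumptions, not the authors' resources.

      import re

      ASPECTS = {"battery", "screen", "camera", "price"}
      LEXICON = {"great": 1, "good": 1, "excellent": 2,
                 "poor": -1, "bad": -1, "terrible": -2}

      def aspect_polarity(review):
          tokens = re.findall(r"[a-z']+", review.lower())   # tokenization
          scores = {}
          for i, tok in enumerate(tokens):
              if tok in ASPECTS:
                  # score sentiment words in a +/-3 token window around the aspect
                  window = tokens[max(0, i - 3): i + 4]
                  scores[tok] = sum(LEXICON.get(w, 0) for w in window)
          return scores

      print(aspect_polarity("The battery life is excellent but the screen is bad."))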

  12. Cloud-based CT dose monitoring using the DICOM-structured report. Fully automated analysis in regard to national diagnostic reference levels

    International Nuclear Information System (INIS)

    Boos, J.; Rubbert, C.; Heusch, P.; Lanzman, R.S.; Aissa, J.; Antoch, G.; Kroepil, P.

    2016-01-01

    To implement automated CT dose data monitoring using the DICOM Structured Report (DICOM-SR) in order to monitor dose-related CT data with regard to national diagnostic reference levels (DRLs). Materials and Methods: We used a novel in-house co-developed software tool based on the DICOM-SR to automatically monitor dose-related data from CT examinations. The DICOM-SR for each CT examination performed between 09/2011 and 03/2015 was automatically anonymized and sent from the CT scanners to a cloud server. Data were automatically analyzed in accordance with body region, patient age and the corresponding DRLs for the volumetric computed tomography dose index (CTDIvol) and dose length product (DLP). Results: Data from 36 523 examinations (131 527 scan series) performed on three different CT scanners and one PET/CT were analyzed. The overall mean CTDIvol and DLP were 51.3% and 52.8% of the national DRLs, respectively. CTDIvol and DLP reached 43.8% and 43.1% of the compared national DRLs for abdominal CT (n = 10 590), 66.6% and 69.6% for cranial CT (n = 16 098), and 37.8% and 44.0% for chest CT (n = 10 387), respectively. Overall, the CTDIvol exceeded national DRLs in 1.9% of the examinations, while the DLP exceeded national DRLs in 2.9% of the examinations. Between different CT protocols for the same body region, radiation exposure varied by up to 50% of the DRLs. Conclusion: The implemented cloud-based CT dose monitoring based on the DICOM-SR enables automated benchmarking with regard to national DRLs. Overall, the local dose exposure from CT reached approximately 50% of these DRLs, indicating that DRL updates as well as protocol-specific DRLs are desirable. The cloud-based approach enables multi-center dose monitoring and offers great potential to further optimize radiation exposure in radiological departments.
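
    The benchmarking step can be pictured with a small sketch: per-exam CTDIvol and DLP values extracted from the DICOM-SR are expressed as percentages of the applicable DRL. The DRL numbers and record structure below are placeholders, not the official national values or the authors' software.

      # body region -> (CTDIvol DRL in mGy, DLP DRL in mGy*cm); placeholder values
      DRLS = {
          "head":    (60.0, 950.0),
          "chest":   (10.0, 350.0),
          "abdomen": (20.0, 700.0),
      }

      def benchmark(exams):
          """exams: iterable of dicts with keys region, ctdivol, dlp."""
          for exam in exams:
              drl_ctdi, drl_dlp = DRLS[exam["region"]]
              yield {
                  "region": exam["region"],
                  "ctdivol_pct_of_drl": 100.0 * exam["ctdivol"] / drl_ctdi,
                  "dlp_pct_of_drl": 100.0 * exam["dlp"] / drl_dlp,
                  "exceeds_drl": exam["ctdivol"] > drl_ctdi or exam["dlp"] > drl_dlp,
              }

      exams = [{"region": "chest", "ctdivol": 3.9, "dlp": 154.0},
               {"region": "head", "ctdivol": 66.0, "dlp": 980.0}]
      for row in benchmark(exams):
          print(row)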

  13. Final base case community analysis: Indian Springs, Nevada for the Clark County socioeconomic impact assessment of the proposed high- level nuclear waste repository at Yucca Mountain, Nevada

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1992-06-18

    This document provides a base case description of the rural Clark County community of Indian Springs in anticipation of change associated with the proposed high-level nuclear waste repository at Yucca Mountain. As the community closest to the proposed site, Indian Springs may be seen by site characterization workers, as well as workers associated with later repository phases, as a logical place to live. This report develops and updates information relating to a broad spectrum of socioeconomic variables, thereby providing a 'snapshot' or 'base case' look at Indian Springs in early 1992. With this as a background, future repository-related developments may be analytically separated from changes brought about by other factors, thus allowing for the assessment of the magnitude of local changes associated with the proposed repository. Given the size of the community, changes that may be considered small in an absolute sense may have relatively large impacts at the local level. Indian Springs is, in many respects, a unique community and a community of contrasts. An unincorporated town, it is a small yet important enclave of workers on large federal projects and home to employees of small-scale businesses and services. It is a rural community, but it is also close to the urbanized Las Vegas Valley. It is a desert community, but has good water resources. It is on flat terrain, but it is located within 20 miles of the tallest mountains in Nevada. It is a town in which various interest groups diverge on issues of local importance, but a sense of community remains an important feature of life. Finally, it has a sociodemographic history of both surface transience and underlying stability. If local land becomes available, Indian Springs has some room for growth but must first consider the historical effects of growth on the town and its desired direction for the future.

  14. Final base case community analysis: Indian Springs, Nevada for the Clark County socioeconomic impact assessment of the proposed high- level nuclear waste repository at Yucca Mountain, Nevada

    International Nuclear Information System (INIS)

    1992-01-01

    This document provides a base case description of the rural Clark County community of Indian Springs in anticipation of change associated with the proposed high-level nuclear waste repository at Yucca Mountain. As the community closest to the proposed site, Indian Springs may be seen by site characterization workers, as well as workers associated with later repository phases, as a logical place to live. This report develops and updates information relating to a broad spectrum of socioeconomic variables, thereby providing a 'snapshot' or 'base case' look at Indian Springs in early 1992. With this as a background, future repository-related developments may be analytically separated from changes brought about by other factors, thus allowing for the assessment of the magnitude of local changes associated with the proposed repository. Given the size of the community, changes that may be considered small in an absolute sense may have relatively large impacts at the local level. Indian Springs is, in many respects, a unique community and a community of contrasts. An unincorporated town, it is a small yet important enclave of workers on large federal projects and home to employees of small-scale businesses and services. It is a rural community, but it is also close to the urbanized Las Vegas Valley. It is a desert community, but has good water resources. It is on flat terrain, but it is located within 20 miles of the tallest mountains in Nevada. It is a town in which various interest groups diverge on issues of local importance, but a sense of community remains an important feature of life. Finally, it has a sociodemographic history of both surface transience and underlying stability. If local land becomes available, Indian Springs has some room for growth but must first consider the historical effects of growth on the town and its desired direction for the future.

  15. Gender Inequity Associated with Increased Child Physical Abuse and Neglect: a Cross-Country Analysis of Population-Based Surveys and Country-Level Statistics.

    Science.gov (United States)

    Klevens, Joanne; Ports, Katie A

    2017-11-01

    Gender inequity is proposed as a societal-level risk factor for child maltreatment. However, most cross-national research examining this association is limited to developing countries and has used limited measures of gender inequity and child homicides as a proxy for child maltreatment. To examine the relationship between gender inequity and child maltreatment, we used caregivers' reported use of severe physical punishment (proxy for physical abuse) and children under 5 left alone or under the care of another child younger than 10 years of age (supervisory neglect) and three indices of gender inequity (the Social and Institutional Gender Index, the Gender Inequality Index, and the Gender Gap Index) from 57 countries, over half of which were developing countries. We found all three gender inequity indices to be significantly associated with physical abuse and two of the three to be significantly associated with neglect, after controlling for country-level development. Based on these findings, efforts to prevent child abuse and neglect might benefit from reducing gender inequity.

  16. Associations of Serum Manganese Levels with Prediabetes and Diabetes among ≥60-Year-Old Chinese Adults: A Population-Based Cross-Sectional Analysis.

    Science.gov (United States)

    Wang, Xuan; Zhang, Mingyue; Lui, Guang; Chang, Hong; Zhang, Meilin; Liu, Wei; Li, Ziwei; Liu, Yixin; Huang, Guowei

    2016-08-13

    Older adults can experience glucose metabolism dysfunction, and although manganese may help regulate glucose metabolism, there is little information regarding this association among older people. This cross-sectional study included 2402 Chinese adults who were ≥60 years old in 2013 (Tianjin, China) and evaluated the associations of serum manganese with prediabetes and diabetes. Serum manganese levels were measured using inductively coupled plasma mass spectrometry. Multivariable logistic regression models were used to evaluate the sex-specific associations of manganese levels with diabetes and prediabetes after adjusting for confounding factors (age, sex, lifestyle factors, and health status). Based on the WHO criteria, prediabetes was observed in 15.1% of men and 13.4% of women, while diabetes was observed in 30.0% of men and 34.4% of women. In the final model, the odds ratios (95% confidence intervals) for prediabetes according to manganese quartile were 1.000, 0.463 (0.269-0.798), 0.639 (0.383-1.065), and 0.614 (0.365-1.031) among men and 1.000, 0.773 (0.498-1.200), 0.602 (0.382-0.947), and 0.603 (0.381-0.953) among women (p for trend = 0.134 and 0.015, respectively). The lowest prevalence of diabetes among men occurred at a moderate range of serum manganese. These findings suggest that serum manganese levels are associated with both prediabetes and diabetes in this population.

  17. Structure and performance of a real-time algorithm to detect tsunami or tsunami-like alert conditions based on sea-level records analysis

    Directory of Open Access Journals (Sweden)

    L. Bressan

    2011-05-01

    The goal of this paper is to present an original real-time algorithm devised for the detection of tsunami or tsunami-like waves, which we call TEDA (Tsunami Early Detection Algorithm), and to introduce a methodology to evaluate its performance. TEDA works on the sea level records of a single station and implements two distinct modules running concurrently: one to assess the presence of tsunami waves ("tsunami detection") and the other to identify high-amplitude long waves ("secure detection"). Both detection methods are based on continuously updated time functions depending on a number of parameters that can be varied according to the application. In order to select the most adequate parameter setting for a given station, a methodology to evaluate TEDA performance has been devised that is based on a number of indicators and is simple to use. In this paper an example of TEDA application is given using data from a tide gauge located at Adak Island in Alaska, USA, which proved quite suitable since it recorded several tsunamis in recent years at a sampling rate of 1 min.
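
    TEDA's exact time functions are defined in the paper; as a hedged stand-in, the sketch below shows the general single-station idea — compare the latest de-tided samples against the background variability of a trailing window. All window lengths, thresholds and the synthetic record are assumptions for illustration.

      import numpy as np

      def detect_tsunami_like(sea_level, win=120, k=5.0):
          """Toy single-station detector: flag minute samples whose
          de-tided amplitude exceeds k times the background variability
          of a trailing window (1-min sampling assumed)."""
          x = np.asarray(sea_level, dtype=float)
          tt = np.arange(win)
          alerts = []
          for t in range(win, len(x)):
              background = x[t - win:t]
              # a linear fit of the trailing window approximates the tide
              slope, intercept = np.polyfit(tt, background, 1)
              sigma = (background - (slope * tt + intercept)).std() + 1e-9
              tide_now = slope * win + intercept     # tide extrapolated to t
              if abs(x[t] - tide_now) > k * sigma:
                  alerts.append(t)
          return alerts

      rng = np.random.default_rng(5)
      t = np.arange(1440)                            # one day, 1-min sampling
      record = 50 * np.sin(2 * np.pi * t / 745) + rng.normal(0, 1, t.size)
      record[900:940] += 30 * np.exp(-((np.arange(40) - 10) ** 2) / 50.0)
      print(detect_tsunami_like(record)[:5])         # first alert times (min)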

  18. Time series analysis based on two-part models for excessive zero count data to detect farm-level outbreaks of swine echinococcosis during meat inspections.

    Science.gov (United States)

    Adachi, Yasumoto; Makita, Kohei

    2017-12-01

    Echinococcus multilocularis is a parasite that causes highly pathogenic zoonoses and is maintained in foxes and rodents on Hokkaido Island, Japan. Detection of E. multilocularis infections in swine is epidemiologically important. In Hokkaido, administrative information is provided to swine producers based on the results of meat inspections. However, as the current criteria for providing administrative information often result in delays in informing producers, novel criteria are needed. Time series models were developed to monitor autocorrelations between data and lags using data collected from 84 producers at the Higashi-Mokoto Meat Inspection Center between April 2003 and November 2015. The two criteria were quantitatively compared using the sign test for the ability to rapidly detect farm-level outbreaks. Overall, the time series models, based on an autoexponentially regressed zero-inflated negative binomial distribution with the 60th percentile of the model's cumulative distribution function as the threshold, detected outbreaks earlier more frequently than the current criteria (90.5%, 276/305). The results indicate that a two-part model with autoexponential regression can adequately deal with data involving an excessive number of zeros and that the novel criteria overcome the disadvantages of the current criteria, providing an earlier indication of increases in the rate of echinococcosis. Copyright © 2017 Elsevier B.V. All rights reserved.
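
    The count-data core of such a two-part model can be sketched with statsmodels' zero-inflated negative binomial (the paper's autoexponential time-series component is omitted here); the simulated counts and covariate below are illustrative assumptions, not the inspection records.

      import numpy as np
      from statsmodels.discrete.count_model import ZeroInflatedNegativeBinomialP

      # Simulated stand-in for per-farm monthly detection counts with
      # many structural zeros (60% of observations contribute no cases).
      rng = np.random.default_rng(6)
      n = 600
      covariate = rng.uniform(0, 1, n)                 # hypothetical farm-level covariate
      counts = rng.poisson(np.exp(0.2 + 1.0 * covariate))
      counts[rng.uniform(size=n) < 0.6] = 0            # inject structural zeros

      exog = np.column_stack([np.ones(n), covariate])  # count part: intercept + covariate
      model = ZeroInflatedNegativeBinomialP(counts, exog,
                                            exog_infl=np.ones((n, 1)))  # intercept-only zero part
      result = model.fit(method="bfgs", maxiter=500, disp=False)
      print(result.params)                             # inflation and count coefficients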

  19. GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra

    International Nuclear Information System (INIS)

    Winn, W.G.

    1999-01-01

    The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC

  20. GRABGAM Analysis of Ultra-Low-Level HPGe Gamma Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Winn, W.G.

    1999-07-28

    The GRABGAM code has been used successfully for ultra-low level HPGe gamma spectrometry analysis since its development in 1985 at Savannah River Technology Center (SRTC). Although numerous gamma analysis codes existed at that time, reviews of institutional and commercial codes indicated that none addressed all features that were desired by SRTC. Furthermore, it was recognized that development of an in-house code would better facilitate future evolution of the code to address SRTC needs based on experience with low-level spectra. GRABGAM derives its name from Gamma Ray Analysis BASIC Generated At MCA/PC.

  1. Roadmap-Based Level Clearing of Buildings

    KAUST Repository

    Rodriguez, Samuel

    2011-01-01

    In this paper we describe a roadmap-based approach for a multi-agent search strategy to clear a building or multi-story environment. This approach utilizes an encoding of the environment in the form of a graph (roadmap) that is used to encode feasible paths through the environment. The roadmap is partitioned into regions, e.g., one per level, and we design region-based search strategies to cover and clear the environment. We can provide certain guarantees within this roadmap-based framework on coverage and the number of agents needed. Our approach can handle complex and realistic environments where many approaches are restricted to simple 2D environments. © 2011 Springer-Verlag.
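
    A minimal sketch of the roadmap idea using networkx: partition the roadmap into per-level regions and sweep each region along a depth-first visit order. The toy two-story graph is an assumption for illustration, not the authors' benchmark environment, and a depth-first sweep is only one of the region-based strategies the paper considers.

      import networkx as nx

      # Toy two-story "building": two corridors joined by a stairwell.
      G = nx.Graph()
      G.add_edges_from([("a1", "a2"), ("a2", "a3"),    # level A corridor
                        ("b1", "b2"), ("b2", "b3"),    # level B corridor
                        ("a3", "b1")])                 # stairwell
      regions = {"A": ["a1", "a2", "a3"], "B": ["b1", "b2", "b3"]}

      def sweep(region_nodes, start):
          """Depth-first sweep of one region; the visit order is the
          agent's clearing path and covers every roadmap node."""
          return list(nx.dfs_preorder_nodes(G.subgraph(region_nodes), source=start))

      for name, nodes in regions.items():
          print("region", name, "cleared via", sweep(nodes, nodes[0]))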

  2. HIV-1 transmission patterns in antiretroviral therapy-naive, HIV-infected North Americans based on phylogenetic analysis by population level and ultra-deep DNA sequencing.

    Directory of Open Access Journals (Sweden)

    Lisa L Ross

    Factors that contribute to the transmission of human immunodeficiency virus type 1 (HIV-1), especially drug-resistant HIV-1 variants, remain a significant public health concern. In-depth phylogenetic analyses of viral sequences, obtained in the screening phase from antiretroviral-naïve HIV-infected patients seeking enrollment in EPZ108859, a large open-label study in the USA, Canada and Puerto Rico (ClinicalTrials.gov NCT00440947), were examined for insights into the roles of drug resistance and epidemiological factors that could impact disease dissemination. Viral transmission clusters (VTCs) were initially predicted from a phylogenetic analysis of population-level HIV-1 pol sequences obtained from 690 antiretroviral-naïve subjects in 2007. Subsequently, the predicted VTCs were tested for robustness by ultra-deep sequencing (UDS) using pyrosequencing technology and further phylogenetic analyses. The demographic characteristics of clustered and non-clustered subjects were then compared. Of the 690 subjects, 69 were assigned to 1 of 30 VTCs, each containing 2 to 5 subjects. Clustered subjects were significantly more likely to be white (72% vs. 60%; p = 0.04). VTCs had fewer reverse transcriptase and major PI resistance mutations (9% vs. 24%; p = 0.002) than non-clustered sequences. Both men who have sex with men (MSM) (68% vs. 48%; p = 0.001) and Canadians (29% vs. 14%; p = 0.03) were significantly more frequent in VTCs than among non-clustered sequences. Of the 515 subjects who initiated antiretroviral therapy, 33 experienced confirmed virologic failure through 144 weeks, but only 3 of these 33 were from VTCs. Fewer VTC subjects (as compared to those with non-clustering virus) had HIV-1 with resistance-associated mutations or experienced virologic failure during the course of the study. Our analysis shows specific geographical and drug resistance trends that correlate well with transmission clusters defined by HIV sequence similarity.

  3. Analysis of Fundus Fluorescein Angiogram Based on the Hessian Matrix of Directional Curvelet Sub-bands and Distance Regularized Level Set Evolution.

    Science.gov (United States)

    Soltanipour, Asieh; Sadri, Saeed; Rabbani, Hossein; Akhlaghi, Mohammad Reza

    2015-01-01

    This paper presents a new procedure for automatic extraction of the blood vessels and optic disk (OD) in fundus fluorescein angiograms (FFA). In order to extract blood vessel centerlines, the vessel extraction algorithm starts with the analysis of directional images resulting from sub-bands of the fast discrete curvelet transform (FDCT) in similar directions and different scales. For this purpose, each directional image is processed by using information from the first-order derivative and the eigenvalues obtained from the Hessian matrix. The final vessel segmentation is obtained iteratively using a simple region growing algorithm, which merges centerline images with the contents of images resulting from a modified top-hat transform followed by bit plane slicing. After extracting blood vessels from the FFA image, candidate regions for the OD are enhanced by removing blood vessels from the FFA image, using multi-structure-element morphology, and modification of FDCT coefficients. Then, a Canny edge detector and Hough transform are applied to the reconstructed image to extract the boundary of candidate regions. In the next step, the information of the main arc of the retinal vessels surrounding the OD region is used to extract the actual location of the OD. Finally, the OD boundary is detected by applying distance regularized level set evolution. The proposed method was tested on FFA images from the angiography unit of Isfahan Feiz Hospital, comprising 70 FFA images from different diabetic retinopathy stages. The experimental results show an accuracy of more than 93% for vessel segmentation and more than 87% for OD boundary extraction.
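
    The Hessian-eigenvalue step described in this abstract is closely related to the well-known Frangi vesselness filter. The sketch below illustrates that idea at a single scale with NumPy/SciPy, estimating the Hessian directly on the image rather than on curvelet sub-bands as the paper does; the parameter values are illustrative assumptions.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        def vesselness_2d(image, sigma=2.0, beta=0.5, c=15.0):
            """Frangi-style vesselness from Hessian eigenvalues at one scale.

            image: 2D float array with bright vessels on a dark background.
            """
            # Second-order Gaussian derivatives approximate the Hessian entries.
            Hxx = gaussian_filter(image, sigma, order=(0, 2))
            Hyy = gaussian_filter(image, sigma, order=(2, 0))
            Hxy = gaussian_filter(image, sigma, order=(1, 1))
            # Eigenvalues of the 2x2 symmetric Hessian, ordered |l1| <= |l2|.
            tmp = np.sqrt((Hxx - Hyy) ** 2 + 4 * Hxy ** 2)
            l1 = 0.5 * (Hxx + Hyy - tmp)
            l2 = 0.5 * (Hxx + Hyy + tmp)
            swap = np.abs(l1) > np.abs(l2)
            l1[swap], l2[swap] = l2[swap], l1[swap]
            Rb = np.abs(l1) / (np.abs(l2) + 1e-10)   # blob-vs-line measure
            S = np.sqrt(l1 ** 2 + l2 ** 2)           # second-order structure
            v = np.exp(-Rb ** 2 / (2 * beta ** 2)) * (1 - np.exp(-S ** 2 / (2 * c ** 2)))
            v[l2 > 0] = 0.0  # bright ridges have a negative large eigenvalue
            return v

        img = np.zeros((64, 64)); img[30:33, :] = 1.0  # a bright horizontal "vessel"
        resp = vesselness_2d(img)
        print(resp.argmax() // 64)                     # peaks on rows ~30-32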

  4. Analysis on the Change in Shallow Groundwater Level based on Monitoring Electric Energy Consumption - A Case Study in the North China Plain

    Science.gov (United States)

    Wang, L.; Wolfgang, K.; Steiner, J. F.

    2016-12-01

    Groundwater has been over-pumped for irrigation in the North China Plain in the past decades, causing a drastic decrease in the groundwater level. Shallow groundwater can be recharged by rainfall, and the aquifer could be rehabilitated for sustainable use. However, understanding and maintaining the balance of the aquifer, including climatic as well as anthropogenic influences, is fundamental to enabling such sustainable groundwater management. This is still severely obstructed by a lack of measurements of recharge and exploitation. A project to measure groundwater pumping rates at a distributed scale based on monitoring electric energy consumption is under way in Guantao County (456 km2), located in the southern part of the North China Plain. Considerably less costly than direct measurements of the pumping rate, this approach enables us to (a) cover a larger area and (b) use historic electricity data to reconstruct water use in the past. Pumping tests have been carried out to establish a relation between energy consumption and groundwater exploitation. Based on the results of the pumping tests, the time series of the pumping rate can be estimated from the historical energy consumption and serves as the input for a box model to reconstruct the water balance of the shallow aquifer for recent years. This helps us to determine the relative contribution of recharge due to rainfall as well as drawdown due to groundwater pumping for irrigation. Additionally, 100 electric meters have been installed at the electric transformers supplying power for irrigation. With insights gained from the pumping tests, real-time monitoring of groundwater exploitation is achieved by converting the measured energy consumption to water use, and pumping control can also be achieved by limiting the energy use. A monitoring and controlling system can then be set up to implement a strategy of sustainable groundwater use.
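
    The pumping-test calibration rests on the basic physics linking electrical energy to lifted water volume, E * eta = rho * g * H * V. A hedged sketch of that conversion, with illustrative (not project-calibrated) efficiency and head values:

        # Back-of-the-envelope conversion from electricity use to pumped
        # groundwater volume, the relation the pumping tests calibrate.
        # All numbers here are illustrative assumptions, not project values.
        RHO_G = 1000 * 9.81          # water density [kg/m3] times gravity [m/s2]

        def pumped_volume_m3(energy_kwh, lift_m, efficiency=0.45):
            """Volume pumped [m3] from energy [kWh], total dynamic head [m],
            and overall pump/motor efficiency (assumed, typically 0.3-0.6)."""
            energy_j = energy_kwh * 3.6e6            # kWh -> J
            return efficiency * energy_j / (RHO_G * lift_m)

        # Example: 500 kWh on a well lifting water 30 m.
        print(round(pumped_volume_m3(500, 30.0)))    # ~2752 m3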

  5. Price and expenditure elasticities of residential energy demand during urbanization: An empirical analysis based on the household-level survey data in China

    International Nuclear Information System (INIS)

    Sun, Chuanwang; Ouyang, Xiaoling

    2016-01-01

    Urbanization, one of the most obvious characteristics of economic growth in China, has an apparent “lock-in effect” on residential energy consumption patterns. It is expected that the residential sector will become a major force driving China's energy consumption after the urbanization process. We estimate price and expenditure elasticities of residential energy demand using data from China's Residential Energy Consumption Survey (CRECS), which covers households at different income levels and from different regional and social groups. Empirical results from the Almost Ideal Demand System model are in accordance with the basic expectations: the demands for electricity, natural gas and transport fuels are inelastic in the residential sector due to the unreasonable pricing mechanism. We further investigate the sensitivities of different income groups to prices of the three types of energy. Policy simulations indicate that rationalizing the energy pricing mechanism is an important guarantee for sustainable energy development during urbanization. Finally, we put forward suggestions on energy pricing reform in the residential sector based on the characteristics of China's ongoing urbanization process and the current energy consumption situation.
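
    For readers unfamiliar with the Almost Ideal Demand System, its elasticities follow from simple formulas in the estimated coefficients. The sketch below applies the common linear-approximate AIDS elasticity formulas to made-up placeholder coefficients, not to estimates from the CRECS data:

        import numpy as np

        def aids_elasticities(w, beta, gamma):
            """w: budget shares (n,), beta: expenditure coefs (n,),
            gamma: price coefs (n, n). Returns (expenditure, Marshallian price)
            elasticities under the usual LA-AIDS approximations."""
            w, beta, gamma = map(np.asarray, (w, beta, gamma))
            expenditure = 1.0 + beta / w
            # e_ij = -delta_ij + gamma_ij / w_i - beta_i * w_j / w_i
            price = -np.eye(len(w)) + gamma / w[:, None] - np.outer(beta, w) / w[:, None]
            return expenditure, price

        w = [0.5, 0.3, 0.2]                  # electricity, gas, transport fuels
        beta = [0.02, -0.01, -0.01]          # placeholder coefficients
        gamma = [[0.03, -0.01, -0.02],
                 [-0.01, 0.02, -0.01],
                 [-0.02, -0.01, 0.03]]
        e_x, e_p = aids_elasticities(w, beta, gamma)
        print(e_x)            # expenditure elasticities
        print(np.diag(e_p))   # own-price elasticities; values between 0 and -1
                              # indicate inelastic demand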

  6. Semantic risk estimation of suspected minefields based on spatial relationships analysis of minefield indicators from multi-level remote sensing imagery

    Science.gov (United States)

    Chan, Jonathan Cheung-Wai; Sahli, Hichem; Wang, Yuhang

    2005-06-01

    This paper presents semantic risk estimation of suspected minefields using spatial relationships of minefield indicators extracted from multi-level remote sensing. Both satellite imagery and pyramidal airborne acquisitions from 900 m to 30 m flying heights, with resolutions from 1 m to 2 cm, are used for identification of minefield indicators. The R-Histogram [1] is a quantitative representation of the spatial relationship between two objects in an image. Eight spatial relationships can be generated: 1) LEFT OF, 2) RIGHT OF, 3) ABOVE, 4) BELOW, 5) NEAR, 6) FAR, 7) INSIDE, 8) OUTSIDE. R-Histogram semantics are first generated from selected indicators, and metrics such as topological proximity and directional relationships are trained for soft classification of a risk index (normalized to 0-1). We present a framework for how semantic metadata generated from remote sensing images are used in risk estimation. The resultant risk index placed seven out of twelve mine accidents in high-risk regions. More importantly, comparison with ground truth obtained after mine clearance shows that three out of the four identified pattern minefields fall into the area estimated at very high risk. A parcel-based per-field risk estimation can also be easily generated, demonstrating the usefulness of the risk index.

  7. Is ozonation environmentally benign for reverse osmosis concentrate treatment? Four-level analysis on toxicity reduction based on organic matter fractionation.

    Science.gov (United States)

    Weng, Jingxia; Jia, Huichao; Wu, Bing; Pan, Bingcai

    2018-01-01

    Ozonation is a promising option to treat reverse osmosis concentrate (ROC). However, a systematic understanding and assessment of ozonation's effect on toxicity reduction is still insufficient. In this study, ROC sampled from a typical industrial park wastewater treatment plant in China was fractionated into hydrophobic acid (HOA), hydrophobic base (HOB), hydrophobic neutral (HON), and hydrophilic fraction (HI). Systematic bioassays covering bacteria, algae, fish, and human cell lines were conducted to reveal the role of ozonation in the toxicity variation of the four ROC fractions. HOA in the raw ROC exhibited the highest toxicity, followed by HON and HI. Ozonation significantly reduced total organic carbon (TOC) and UV254 values in HOA, HON, and HI, and their toxicity, except in HOB. Correlation analysis indicated that chemical data (TOC and UV254) of HOA and HON correlated well with their toxicities; however, poor correlations were observed for HOB and HI, suggesting that a battery of toxicity assays is necessary. This study indicates that TOC reduction during ozonation could not fully reflect the toxicity issue, and toxicity assessment is required in conjunction with the chemical data to evaluate the effectiveness of ozonation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. System Level Analysis of LTE-Advanced

    DEFF Research Database (Denmark)

    Wang, Yuanye

    This PhD thesis focuses on system level analysis of Multi-Component Carrier (CC) management for Long Term Evolution (LTE)-Advanced. Cases where multiple CCs are aggregated to form a larger bandwidth are studied. The analysis is performed for both local area and wide area networks. In local area...... reduction. Compared to the case of reuse-1, they achieve a gain of 50∼500% in cell edge user throughput, with small or no loss in average cell throughput. For the wide area network, effort is devoted to the downlink of LTE-Advanced. Such a system is assumed to be backwards compatible to LTE release 8, i...... scheme is recommended. It reduces the CQI by 94% at low load, and 79∼93% at medium to high load, with reasonable loss in downlink performance. To reduce the ACK/NACK feedback, multiple ACK/NACKs can be bundled, with slightly degraded downlink throughput....

  9. Analysis of carotid artery plaque and wall boundaries on CT images by using a semi-automatic method based on level set model

    International Nuclear Information System (INIS)

    Saba, Luca; Sannia, Stefano; Ledda, Giuseppe; Gao, Hao; Acharya, U.R.; Suri, Jasjit S.

    2012-01-01

    The purpose of this study was to evaluate the potentialities of a semi-automated technique in the detection and measurement of the carotid artery plaque. Twenty-two consecutive patients (18 males, 4 females; mean age 62 years) examined with MDCTA from January 2011 to March 2011 were included in this retrospective study. Carotid arteries were examined with a 16-multi-detector-row CT system, and for each patient the most diseased carotid was selected. In the first phase, the carotid plaque was identified and one experienced radiologist manually traced the inner and outer boundaries by using the polyline and radial distance methods (PDM and RDM, respectively). In the second phase, the carotid inner and outer boundaries were traced with an automated algorithm: the level set method (LSM). Data were compared by using Pearson rho correlation, Bland-Altman, and regression. A total of 715 slices were analyzed. The mean thickness of the plaque using the reference PDM was 1.86 mm whereas using the LSM-PDM it was 1.96 mm; using the reference RDM it was 2.06 mm whereas using the LSM-RDM it was 2.03 mm. The correlation values between the references, the LSM, the PDM and the RDM were 0.8428, 0.9921, 0.745 and 0.6425. Bland-Altman demonstrated a very good agreement, in particular with the RDM method. Results of our study indicate that the LSM can automatically measure the thickness of the plaque and that the best results are obtained with the RDM. Our results suggest that advanced computer-based algorithms can identify and trace the plaque boundaries like an experienced human reader. (orig.)

  10. Analysis of carotid artery plaque and wall boundaries on CT images by using a semi-automatic method based on level set model

    Energy Technology Data Exchange (ETDEWEB)

    Saba, Luca; Sannia, Stefano; Ledda, Giuseppe [University of Cagliari - Azienda Ospedaliero Universitaria di Cagliari, Department of Radiology, Monserrato, Cagliari (Italy); Gao, Hao [University of Strathclyde, Signal Processing Centre for Excellence in Signal and Image Processing, Department of Electronic and Electrical Engineering, Glasgow (United Kingdom); Acharya, U.R. [Ngee Ann Polytechnic University, Department of Electronics and Computer Engineering, Clementi (Singapore); Suri, Jasjit S. [Biomedical Technologies Inc., Denver, CO (United States); Idaho State University (Aff.), Pocatello, ID (United States)

    2012-11-15

    The purpose of this study was to evaluate the potentialities of a semi-automated technique in the detection and measurement of the carotid artery plaque. Twenty-two consecutive patients (18 males, 4 females; mean age 62 years) examined with MDCTA from January 2011 to March 2011 were included in this retrospective study. Carotid arteries were examined with a 16-multi-detector-row CT system, and for each patient the most diseased carotid was selected. In the first phase, the carotid plaque was identified and one experienced radiologist manually traced the inner and outer boundaries by using the polyline and radial distance methods (PDM and RDM, respectively). In the second phase, the carotid inner and outer boundaries were traced with an automated algorithm: the level set method (LSM). Data were compared by using Pearson rho correlation, Bland-Altman, and regression. A total of 715 slices were analyzed. The mean thickness of the plaque using the reference PDM was 1.86 mm whereas using the LSM-PDM it was 1.96 mm; using the reference RDM it was 2.06 mm whereas using the LSM-RDM it was 2.03 mm. The correlation values between the references, the LSM, the PDM and the RDM were 0.8428, 0.9921, 0.745 and 0.6425. Bland-Altman demonstrated a very good agreement, in particular with the RDM method. Results of our study indicate that the LSM can automatically measure the thickness of the plaque and that the best results are obtained with the RDM. Our results suggest that advanced computer-based algorithms can identify and trace the plaque boundaries like an experienced human reader. (orig.)

  11. DISCRIMINANT ANALYSIS OF BANK PROFITABILITY LEVELS

    Directory of Open Access Journals (Sweden)

    Ante Rozga

    2013-02-01

    Full Text Available Discriminant analysis has been employed in this paper in order to identify and explain key features of bank profitability levels. Bank profitability is set up in the form of two categorical variables: profit or loss recorded, and above- or below-average return on equity. Predictor variables are selected from various groups of financial indicators usually included in empirical work on the microeconomic determinants of bank profitability. The data from the Croatian banking sector are analyzed using the Enter method. General recommendations for more profitable banking found in the bank management literature and the existing empirical framework, such as rationalization of overhead costs, asset growth, and increase of non-interest income by expanding the scale and scope of financial products, proved to be important for classifying banks into different profitability levels. A higher market share may bring additional advantages. Classification results, canonical correlation and Wilks' Lambda test confirm the statistical significance of the research results. Altogether, discriminant analysis turns out to be a suitable statistical method for solving the presented research problem, moving beyond the bankruptcy, credit rating or default issues common in finance.

  12. Calculation of critical level value for radioactivity detection in gamma spectrometric analysis on the base of semiconductor detectors under the Chernobyl' conditions in 1986-1987

    International Nuclear Information System (INIS)

    Glazunov, V.O.; Rusyaev, R.V.

    1989-01-01

    The problem of determining the critical level of radioactivity in a sample by means of a gamma spectrometer with a semiconductor detector is studied theoretically. A formula for the critical level is derived, which shows that the background pulse counting rate must be known in order to determine the minimum detectable gamma photon pulse counting rate. Calculations of the critical level for the Chernobyl' conditions in the period from October 1986 to July 1987 are presented. 8 refs.; 7 figs.; 17 tabs
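
    The flavor of such a threshold can be conveyed with the widely used Currie-style decision level for a net peak over background; this is a generic textbook form assuming Poisson counting statistics and paired background subtraction, not necessarily the exact expression derived in the report:

        from math import sqrt

        # Sketch of a Currie-style decision threshold ("critical level") for a
        # gamma line over a spectral background; formula-level illustration only.
        def critical_level(background_counts, k=1.645):
            """Critical level L_C in counts for a net peak over a background of
            B counts (k = 1.645 for a 5% false-positive risk). Assumes paired
            background subtraction, hence the factor sqrt(2B)."""
            return k * sqrt(2.0 * background_counts)

        B = 400.0                      # background counts under the peak region
        print(critical_level(B))       # ~46.5 counts: smaller net signals are
                                       # indistinguishable from background noise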

  13. Bayesian Analysis of Individual Level Personality Dynamics

    Directory of Open Access Journals (Sweden)

    Edward Cripps

    2016-07-01

    Full Text Available A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability, which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analysis and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiralling. While Bayesian techniques have many potential advantages for the analysis of within-person processes at the individual level, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques.

  14. Higher blood 25(OH)D level may reduce the breast cancer risk: evidence from a Chinese population based case-control study and meta-analysis of the observational studies.

    Directory of Open Access Journals (Sweden)

    Peizhan Chen

    Full Text Available Experimental data suggest a protective effect of vitamin D on breast cancer; however, epidemiologic results remain inconclusive. With a Chinese population-based case-control study and a meta-analysis of the observational studies, we systematically evaluated the association of blood 25(OH)D level and breast cancer risk. With 593 breast cancer cases and 580 cancer-free controls from Shanghai, China, we found that 80% of the normal women had severe vitamin D deficiency (less than 20 ng/mL), 15.2% had mild deficiency (20 to 30 ng/mL), and only 4.8% had a sufficient vitamin D level (>30 ng/mL), while the proportions were 96.1%, 3.2% and 0.7%, respectively, for the breast cancer patients. Compared to those in the lowest quartile of plasma 25(OH)D level, women in the highest quartile showed a significantly decreased breast cancer risk (Q4 vs. Q1: OR = 0.10, 95% CI = 0.06-0.15), and every 1 ng/mL increment of plasma 25(OH)D level led to a 16% lower odds of breast cancer (OR = 0.84, 95% CI = 0.81-0.87; P<0.001). From the meta-analysis of the observational studies, we found that women with the highest quantile of blood 25(OH)D level had a significantly reduced breast cancer risk compared to those with the lowest quantile, both for the 11 nested case-control and retrospective studies (pooled OR = 0.86, 95% CI = 0.75-1.00) and for the 10 case-control studies (7 population based, OR = 0.35, 95% CI = 0.24-0.52; 3 hospital based, OR = 0.08, 95% CI = 0.02-0.33). These results suggest that vitamin D may have a chemo-preventive effect against breast cancer.

  15. GNSS-Reflectometry based water level monitoring

    Science.gov (United States)

    Beckheinrich, Jamila; Schön, Steffen; Beyerle, Georg; Apel, Heiko; Semmling, Maximilian; Wickert, Jens

    2013-04-01

    Due to changing climate conditions, severe changes in the Mekong delta in Vietnam have been recorded in recent years. The goal of the German-Vietnamese WISDOM (Water-related Information system for the Sustainable Development Of the Mekong Delta) project is to build an information system to support and assist the decision makers, planners and authorities in optimized water and land management. One of WISDOM's tasks is the flood monitoring of the Mekong delta. Earth-reflected L-band signals from the Global Navigation Satellite System show a high reflectivity on water and ice surfaces or on wet soil, so GNSS-Reflectometry (GNSS-R) could contribute to monitoring the water level in the main streams of the Mekong delta, complementary to already existing monitoring networks. In principle, two different GNSS-R methods exist: the code-based and the phase-based one. As the latter is more accurate, a new generation of GORS (GNSS Occultation, Reflectometry and Scatterometry) JAVAD DELTA GNSS receiver has been developed with the aim of extracting precise phase observations. In a two-week measurement campaign, the receiver was tested and several reflection events at the 150-200 m wide Can Tho river in Vietnam were recorded. To analyze the geometrical impact on the quantity and quality of the reflection traces, two different antenna heights were tested. To track the direct and the reflected signal separately, two antennas were used. To derive an average height of the water level for a 15 min observation interval, a phase model has been developed. Combined with the coherent observations, the minimum slope has been calculated based on the Least-Squares method. As cycle slips and outliers would impair the results, a preprocessing of the data has been performed. A cycle slip detection strategy that allows for automatic detection, identification and correction is proposed. To identify outliers, the data snooping method developed by Baarda (1968) is used. In this

  16. Efficiency and Productivity of County-level Public Hospitals Based on the Data Envelopment Analysis Model and Malmquist Index in Anhui, China

    Directory of Open Access Journals (Sweden)

    Nian-Nian Li

    2017-01-01

    Conclusions: In 2010–2015, the relative service efficiency of 12 county-level public hospitals in Anhui Province showed a decreasing trend, and the service efficiency of each hospital fluctuated. Over the past 6 years, although some hospitals operated efficiently, the efficiency of county-level public hospitals in Anhui Province as a whole did not improve significantly, and total factor productivity was not effectively improved. County-level public hospitals need to assess their own circumstances to identify and remedy their deficiencies.

  17. Moderation analysis using a two-level regression model.

    Science.gov (United States)

    Yuan, Ke-Hai; Cheng, Ying; Maxwell, Scott

    2014-10-01

    Moderation analysis is widely used in social and behavioral research. The most commonly used model for moderation analysis is moderated multiple regression (MMR) in which the explanatory variables of the regression model include product terms, and the model is typically estimated by least squares (LS). This paper argues for a two-level regression model in which the regression coefficients of a criterion variable on predictors are further regressed on moderator variables. An algorithm for estimating the parameters of the two-level model by normal-distribution-based maximum likelihood (NML) is developed. Formulas for the standard errors (SEs) of the parameter estimates are provided and studied. Results indicate that, when heteroscedasticity exists, NML with the two-level model gives more efficient and more accurate parameter estimates than the LS analysis of the MMR model. When error variances are homoscedastic, NML with the two-level model leads to essentially the same results as LS with the MMR model. Most importantly, the two-level regression model permits estimating the percentage of variance of each regression coefficient that is due to moderator variables. When applied to data from General Social Surveys 1991, NML with the two-level model identified a significant moderation effect of race on the regression of job prestige on years of education while LS with the MMR model did not. An R package is also developed and documented to facilitate the application of the two-level model.
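
    The product-term construction of MMR, and its two-level reading, can be illustrated in a few lines. The simulation below fits MMR by ordinary least squares on synthetic data; the paper's normal-distribution-based maximum likelihood estimator for the two-level model is not reproduced here:

        import numpy as np

        # Minimal MMR illustration: the coefficient of the x*z product term
        # captures moderation. Simulated data, illustrative coefficients.
        rng = np.random.default_rng(0)
        n = 500
        x = rng.normal(size=n)                     # predictor
        z = rng.normal(size=n)                     # moderator
        y = 1.0 + 0.5 * x + 0.3 * z + 0.4 * x * z + rng.normal(size=n)

        X = np.column_stack([np.ones(n), x, z, x * z])
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        print(dict(zip(["intercept", "x", "z", "x*z"], coef.round(2))))
        # A non-zero x*z coefficient means the slope of y on x depends on z,
        # i.e., the two-level view: slope_i = b0 + b1 * z_i.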

  18. Wear-out Failure Analysis of an Impedance-Source PV Microinverter Based on System-Level Electro-Thermal Modeling

    DEFF Research Database (Denmark)

    Shen, Yanfeng; Chub, Andrii; Wang, Huai

    2018-01-01

    and system-level finite element method (FEM) simulations, the electro-thermal models are built for the most reliability-critical components, i.e., power semiconductor devices and capacitors. The dependence of the power loss on the junction/hotspot temperature is considered, the enclosure temperature

  19. Are Ethnic Disparities in HbA1c Levels Explained by Mental Wellbeing? Analysis of Population-Based Data from the Health Survey for England.

    Science.gov (United States)

    Umeh, Kanayo

    2018-02-01

    It is unclear how ethnic differences in HbA1c levels are affected by individual variations in mental wellbeing. Thus, the aim of this study was to assess the extent to which HbA1c disparities between Caucasian and South Asian adults are mediated by various aspects of positive psychological functioning. Data from the 2014 Health Survey for England were analysed using bootstrapping methods. A total of 3894 UK residents with HbA1c data were eligible to participate. Mental wellbeing was assessed using the Warwick-Edinburgh Mental Well-being Scale. To reduce bias, BMI, blood pressure, diabetes status, and other factors were treated as covariates. Ethnicity directly predicted blood sugar control (unadjusted coefficient -2.15; 95% CI -3.64, -0.67), with Caucasians generating lower average HbA1c levels (37.68 mmol/mol (5.6%)) compared to South Asians (39.87 mmol/mol (5.8%)). This association was mediated by positive mental wellbeing, specifically concerning perceived vigour (unadjusted effect 0.30; 95% CI 0.13, 0.58): South Asians felt more energetic than Caucasians (unadjusted coefficient -0.32; 95% CI -0.49, -0.16), and greater perceived energy predicted lower HbA1c levels (unadjusted coefficient -0.92; 95% CI -1.29, -0.55). This mediator effect accounted for just over 14% of the HbA1c variance and was negated after adjusting for BMI. Caucasians experience better HbA1c levels compared with their South Asian counterparts. However, this association is partly confounded by individual differences in perceived energy levels, which are implicated in better glycaemic control and appear to serve a protective function in South Asians.
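
    The bootstrapped mediation logic used in such studies has a simple generic skeleton: estimate the exposure-to-mediator path a and the mediator-to-outcome path b (adjusting for exposure), then bootstrap the indirect effect a*b. The sketch below runs on simulated stand-in variables, not the Health Survey for England data:

        import numpy as np

        # Percentile-bootstrap sketch of an indirect (mediation) effect a*b.
        # Variable names mirror the study, but all values are simulated.
        rng = np.random.default_rng(1)
        n = 1000
        ethnicity = rng.integers(0, 2, size=n).astype(float)   # 0/1 exposure
        vigour = 0.3 * ethnicity + rng.normal(size=n)          # mediator
        hba1c = 38.0 - 0.9 * vigour - 1.0 * ethnicity + rng.normal(size=n)

        def indirect(idx):
            e, m, y = ethnicity[idx], vigour[idx], hba1c[idx]
            a = np.polyfit(e, m, 1)[0]             # exposure -> mediator slope
            # mediator -> outcome slope, adjusting for exposure:
            X = np.column_stack([np.ones_like(e), m, e])
            b = np.linalg.lstsq(X, y, rcond=None)[0][1]
            return a * b

        boots = [indirect(rng.integers(0, n, size=n)) for _ in range(2000)]
        lo, hi = np.percentile(boots, [2.5, 97.5])
        print(f"indirect effect 95% CI: [{lo:.3f}, {hi:.3f}]")  # excludes 0 =>
                                                                # mediation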

  20. Concurrent chemoradiotherapy with S-1 in patients with stage III-IV oral squamous cell carcinoma: A retrospective analysis of nodal classification based on the neck node level.

    Science.gov (United States)

    Murakami, Ryuji; Semba, Akiko; Kawahara, Kenta; Matsuyama, Keiya; Hiraki, Akimitsu; Nagata, Masashi; Toya, Ryo; Yamashita, Yasuyuki; Oya, Natsuo; Nakayama, Hideki

    2017-07-01

    The aim of the present study was to retrospectively evaluate the treatment outcomes of concurrent chemoradiotherapy (CCRT) with S-1, an oral fluoropyrimidine anticancer agent, for advanced oral squamous cell carcinoma (SCC). The study population consisted of 47 patients with clinical stage III or IV oral SCC, who underwent CCRT with S-1. Pretreatment variables, including patient age, clinical stage, T classification, midline involvement of the primary tumor and nodal status, were analyzed as predictors of survival. In addition to the N classification (node-positive, multiple and contralateral), the prognostic impact of the level of nodal involvement was assessed. Nodal involvement was mainly observed at levels Ib and II; involvement at levels Ia and III-V was considered to be anterior and inferior extension, respectively, and was recorded as extensive nodal involvement (ENI). The 3-year overall survival (OS) and progression-free survival (PFS) rates were 37 and 27%, respectively. A finding of ENI was a significant factor for OS [hazard ratio (HR)=2.16; 95% confidence interval (CI): 1.03-4.55; P=0.038] and PFS (HR=2.65; 95% CI: 1.32-5.33; P=0.005); the 3-year OS and PFS rates in patients with vs. those without ENI were 23 vs. 50% and 9 vs. 43%, respectively. The other variables were not significant. Therefore, CCRT with S-1 may be an alternative treatment for advanced oral SCC; favorable outcomes are expected in patients without ENI.

  1. Does atlas-based autosegmentation of neck levels require subsequent manual contour editing to avoid risk of severe target underdosage? A dosimetric analysis

    International Nuclear Information System (INIS)

    Voet, Peter W.J.; Dirkx, Maarten L.P.; Teguh, David N.; Hoogeman, Mischa S.; Levendag, Peter C.; Heijmen, Ben J.M.

    2011-01-01

    Background and purpose: To investigate the dosimetric impact of not editing auto-contours of the elective neck and organs at risk (OAR), generated with atlas-based autosegmentation (ABAS) (Elekta software), for head and neck cancer patients. Materials and methods: For nine patients, ABAS auto-contours and auto-contours edited by two observers were available. Based on the non-edited auto-contours, clinically acceptable IMRT plans were constructed (designated 'ABAS plans'). These plans were then evaluated for the two edited structure sets by quantifying the percentage of the neck PTV receiving more than 95% of the prescribed dose (V95) and the near-minimum dose (D99) in the neck PTV. Dice coefficients and mean contour distances were calculated to quantify the similarity of ABAS auto-contours with the structure sets edited by observer 1 and observer 2. To study the dosimetric importance of editing OAR auto-contours, a new IMRT plan was generated for each patient-observer combination, based on the observer's edited CTV and the non-edited salivary gland auto-contours. For each plan, mean doses for the non-edited glands were compared with doses for the same glands edited by the observer. Results: For both observers, edited neck CTVs were larger than ABAS auto-contours (p ≤ 0.04), by a mean of 8.7%. When evaluating ABAS plans on the PTVs of the edited structure sets, V95 was reduced by 7.2% ± 5.4% (1 SD), and the mean reduction in D99 was 14.2 Gy (range 1-54 Gy). Even for high Dice coefficients (>0.8) and small mean contour distances, reductions in D99 of up to 11 Gy were observed. For treatment plans based on observer PTVs and non-edited auto-contoured salivary glands, the mean doses in the edited glands differed by only -0.6 Gy ± 1.0 Gy (p = 0.06). Conclusions: Editing of auto-contoured neck CTVs generated by ABAS is required to avoid large underdosages in target volumes. Often-used similarity measures for the evaluation of auto-contouring algorithms, such as Dice coefficients, do not predict well for expected PTV underdosage

  2. Analysis of baseline gene expression levels from ...

    Science.gov (United States)

    The use of gene expression profiling to predict chemical mode of action would be enhanced by better characterization of variance due to individual, environmental, and technical factors. Meta-analysis of microarray data from untreated or vehicle-treated animals within the control arm of toxicogenomics studies has yielded useful information on baseline fluctuations in gene expression. A dataset of control animal microarray expression data was assembled by a working group of the Health and Environmental Sciences Institute's Technical Committee on the Application of Genomics in Mechanism Based Risk Assessment in order to provide a public resource for assessments of variability in baseline gene expression. Data from over 500 Affymetrix microarrays from control rat liver and kidney were collected from 16 different institutions. Thirty-five biological and technical factors were obtained for each animal, describing a wide range of study characteristics, and a subset were evaluated in detail for their contribution to total variability using multivariate statistical and graphical techniques. The study factors that emerged as key sources of variability included gender, organ section, strain, and fasting state. These and other study factors were identified as key descriptors that should be included in the minimal information about a toxicogenomics study needed for interpretation of results by an independent source. Genes that are the most and least variable, gender-selective

  3. Normal-Mode Analysis of Circular DNA at the Base-Pair Level. 2. Large-Scale Configurational Transformation of a Naturally Curved Molecule.

    Science.gov (United States)

    Matsumoto, Atsushi; Tobias, Irwin; Olson, Wilma K

    2005-01-01

    Fine structural and energetic details embedded in the DNA base sequence, such as intrinsic curvature, are important to the packaging and processing of the genetic material. Here we investigate the internal dynamics of a 200 bp closed circular molecule with natural curvature using a newly developed normal-mode treatment of DNA in terms of neighboring base-pair "step" parameters. The intrinsic curvature of the DNA is described by a 10 bp repeating pattern of bending distortions at successive base-pair steps. We vary the degree of intrinsic curvature and the superhelical stress on the molecule and consider the normal-mode fluctuations of both the circle and the stable figure-8 configuration under conditions where the energies of the two states are similar. To extract the properties due solely to curvature, we ignore other important features of the double helix, such as the extensibility of the chain, the anisotropy of local bending, and the coupling of step parameters. We compare the computed normal modes of the curved DNA model with the corresponding dynamical features of a covalently closed duplex of the same chain length constructed from naturally straight DNA and with the theoretically predicted dynamical properties of a naturally circular, inextensible elastic rod, i.e., an O-ring. The cyclic molecules with intrinsic curvature are found to be more deformable under superhelical stress than rings formed from naturally straight DNA. As superhelical stress is accumulated in the DNA, the frequency, i.e., energy, of the dominant bending mode decreases in value, and if the imposed stress is sufficiently large, a global configurational rearrangement of the circle to the figure-8 form takes place. We combine energy minimization with normal-mode calculations of the two states to decipher the configurational pathway between the two states. We also describe and make use of a general analytical treatment of the thermal fluctuations of an elastic rod to characterize the

  4. 24 CFR 990.180 - Utilities expense level: Computation of the rolling base consumption level.

    Science.gov (United States)

    2010-04-01

    ...: Computation of the rolling base consumption level. 990.180 Section 990.180 Housing and Urban Development... Calculating Formula Expenses § 990.180 Utilities expense level: Computation of the rolling base consumption level. (a) General. (1) The rolling base consumption level (RBCL) shall be equal to the average of...
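
    Taking the quoted definition at face value, the RBCL computation reduces to a three-year average. The sketch below assumes that reading and ignores the regulation's adjustment provisions:

        # Sketch of the rolling base consumption level as a three-year average
        # of yearly utility consumption, per the definition quoted above (the
        # CFR text also contains adjustment provisions not modeled here).
        def rolling_base_consumption_level(yearly_consumption):
            """yearly_consumption: consumption for the three rolling base
            years, e.g., kWh per year. RBCL is their simple average."""
            assert len(yearly_consumption) == 3
            return sum(yearly_consumption) / 3.0

        print(rolling_base_consumption_level([1_050_000, 980_000, 1_010_000]))  # kWh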

  5. Teaching Badminton Based on Student Skill Levels

    Science.gov (United States)

    Wang, Jianyu; Moffit, Jeff

    2009-01-01

    Badminton has been identified as a lifelong activity. It is an inexpensive sport and everyone--children, seniors, and individuals with disabilities--can reach a level of enjoyment in the game after mastering basic skills and tactics. In teaching badminton, teachers need to understand how students develop game play ability from a low level to an…

  6. Comparative Analysis of RNAi-Based Methods to Down-Regulate Expression of Two Genes Expressed at Different Levels in Myzus persicae

    Directory of Open Access Journals (Sweden)

    Michaël Mulot

    2016-11-01

    Full Text Available With the increasing availability of aphid genomic data, it is necessary to develop robust functional validation methods to evaluate the role of specific aphid genes. This work represents the first study in which five different techniques, all based on RNA interference and on oral acquisition of double-stranded RNA (dsRNA), were developed to silence two genes, ALY and Eph, potentially involved in polerovirus transmission by aphids. Efficient silencing of only Eph transcripts, which are less abundant than those of ALY, could be achieved by feeding aphids on transgenic Arabidopsis thaliana expressing an RNA hairpin targeting Eph, on Nicotiana benthamiana infected with a Tobacco rattle virus (TRV)-Eph recombinant virus, or on in vitro-synthesized Eph-targeting dsRNA. These experiments showed that the silencing efficiency may differ greatly between genes and that aphid gut cells seem to be preferentially affected by the silencing mechanism after oral acquisition of dsRNA. In addition, the use of plants infected with recombinant TRV proved to be a promising technique to silence aphid genes as it does not require plant transformation. This work highlights the need to pursue development of innovative strategies to reproducibly achieve reduction of expression of aphid genes.

  7. Assessment of the population-level effectiveness of the Avahan HIV-prevention programme in South India: a preplanned, causal-pathway-based modelling analysis.

    Science.gov (United States)

    Pickles, Michael; Boily, Marie-Claude; Vickerman, Peter; Lowndes, Catherine M; Moses, Stephen; Blanchard, James F; Deering, Kathleen N; Bradley, Janet; Ramesh, Banadakoppa M; Washington, Reynold; Adhikary, Rajatashuvra; Mainkar, Mandar; Paranjape, Ramesh S; Alary, Michel

    2013-11-01

    Avahan, the India AIDS initiative of the Bill & Melinda Gates Foundation, was a large-scale, targeted HIV prevention intervention. We aimed to assess its overall effectiveness by estimating the number and proportion of HIV infections averted across Avahan districts, following the causal pathway of the intervention. We created a mathematical model of HIV transmission in high-risk groups and the general population using data from serial cross-sectional surveys (integrated behavioural and biological assessments, IBBAs) within a Bayesian framework, which we used to reproduce HIV prevalence trends in female sex workers and their clients, men who have sex with men, and the general population in 24 South Indian districts over the first 4 years (2004-07 or 2005-08, depending on the district) and the full 10 years (2004-13) of the Avahan programme. We tested whether these prevalence trends were more consistent with self-reported increases in consistent condom use after the implementation of Avahan or with a counterfactual (assuming consistent condom use increased at slower, pre-Avahan rates) using a Bayes factor, which gave a measure of the strength of evidence for the effectiveness estimates. Using regression analysis, we extrapolated the prevention effect in the districts covered by IBBAs to all 69 Avahan districts. In 13 of 24 IBBA districts, modelling suggested medium to strong evidence for the large self-reported increase in consistent condom use since Avahan implementation. In the remaining 11 IBBA districts, the evidence was weaker, with consistent condom use generally already high before Avahan began. Roughly 32700 HIV infections (95% credibility interval 17900-61600) were averted over the first 4 years of the programme in the IBBA districts with moderate to strong evidence. Addition of the districts with weaker evidence increased this total to 62800 (32000-118000) averted infections, and extrapolation suggested that 202000 (98300-407000) infections were averted

  8. Secondary Low-Level Waste Treatment Strategy Analysis

    International Nuclear Information System (INIS)

    D.M. LaRue

    1999-01-01

    The objective of this analysis is to identify and review potential options for processing and disposing of the secondary low-level waste (LLW) that will be generated through operation of the Monitored Geologic Repository (MGR). An estimate of annual secondary LLW is generated utilizing the mechanism established in "Secondary Waste Treatment Analysis" (Reference 8.1) and "Secondary Low-Level Waste Generation Rate Analysis" (Reference 8.5). The secondary LLW quantities are based on the spent fuel and high-level waste (HLW) arrival schedule as defined in the "Controlled Design Assumptions Document" (CDA) (Reference 8.6). This analysis presents estimates of the quantities of LLW in its various forms. Applicable laws, codes, and standards are reviewed, and a synopsis of them and their impacts on potential processing and disposal options is presented. The analysis identifies viable processing/disposal options in light of the existing laws, codes, and standards, and then evaluates these options in regard to: (1) process and equipment requirements; (2) LLW disposal volumes; and (3) facility requirements.

  9. A Character Level Based and Word Level Based Approach for Chinese-Vietnamese Machine Translation

    Directory of Open Access Journals (Sweden)

    Phuoc Tran

    2016-01-01

    Full Text Available Chinese and Vietnamese are both isolating languages; that is, the words are not delimited by spaces. In machine translation, word segmentation is often done first when translating from Chinese or Vietnamese into different languages (typically English) and vice versa. However, it is a matter for consideration whether words should be segmented when translating between two languages in which spaces are not used between words, such as Chinese and Vietnamese. Since Chinese-Vietnamese is a low-resource language pair, the sparse data problem is evident in the translation system of this language pair. Therefore, while translating, whether the text should be segmented or not becomes all the more important. In this paper, we propose a new method for translating Chinese to Vietnamese based on a combination of the advantages of character-level and word-level translation. In addition, a hybrid approach that combines statistics and rules is used to translate at the word level, while at the character level a statistical translation is used. The experimental results showed that our method improved the performance of machine translation over that of character-level or word-level translation alone.

  10. A Character Level Based and Word Level Based Approach for Chinese-Vietnamese Machine Translation.

    Science.gov (United States)

    Tran, Phuoc; Dinh, Dien; Nguyen, Hien T

    2016-01-01

    Chinese and Vietnamese are both isolating languages; that is, the words are not delimited by spaces. In machine translation, word segmentation is often done first when translating from Chinese or Vietnamese into different languages (typically English) and vice versa. However, it is a matter for consideration whether words should be segmented when translating between two languages in which spaces are not used between words, such as Chinese and Vietnamese. Since Chinese-Vietnamese is a low-resource language pair, the sparse data problem is evident in the translation system of this language pair. Therefore, while translating, whether the text should be segmented or not becomes all the more important. In this paper, we propose a new method for translating Chinese to Vietnamese based on a combination of the advantages of character-level and word-level translation. In addition, a hybrid approach that combines statistics and rules is used to translate at the word level, while at the character level a statistical translation is used. The experimental results showed that our method improved the performance of machine translation over that of character-level or word-level translation alone.

  11. MOVES2010a regional level sensitivity analysis

    Science.gov (United States)

    2012-12-10

    This document discusses the sensitivity of various input parameter effects on emission rates using the US Environmental Protection Agency's (EPA's) MOVES2010a model at the regional level. Pollutants included in the study are carbon monoxide (CO),...

  12. Cellular-based sea level gauge

    Digital Repository Service at National Institute of Oceanography (India)

    Desai, R.G.P.; Joseph, A.

    treaties with greater transparency. Among the various communication technologies used for real-time transmission of sea-level data are the wired telephone connection, VHF/UHF transceivers, satellite transmit terminals and cellular connectivity. Wired telephone connections are severely susceptible to loss of connectivity during natural disasters such as storm surges, primarily because of telephone line breakage. Communication via VHF/UHF transceivers is limited by line-of-sight distance between...

  13. SpirPro: A Spirulina proteome database and web-based tools for the analysis of protein-protein interactions at the metabolic level in Spirulina (Arthrospira) platensis C1.

    Science.gov (United States)

    Senachak, Jittisak; Cheevadhanarak, Supapon; Hongsthong, Apiradee

    2015-07-29

    Spirulina (Arthrospira) platensis is the only cyanobacterium that, in addition to being studied at the molecular level and subjected to gene manipulation, can also be mass cultivated in outdoor ponds for commercial use as a food supplement. Thus, encountering environmental changes, including temperature stresses, is common during the mass production of Spirulina. The use of cyanobacteria as an experimental platform, especially for photosynthetic gene manipulation in plants and bacteria, is becoming increasingly important. Understanding the mechanisms and protein-protein interaction networks that underlie low- and high-temperature responses is relevant to Spirulina mass production. To accomplish this goal, high-throughput techniques such as OMICs analyses are used. Thus, large datasets must be collected, managed and subjected to information extraction. Therefore, databases including (i) proteomic analysis and protein-protein interaction (PPI) data and (ii) domain/motif visualization tools are required for potential use in temperature response models for plant chloroplasts and photosynthetic bacteria. A web-based repository was developed, including an embedded database, SpirPro, and tools for network visualization. Proteome data were analyzed and integrated with protein-protein interactions and/or metabolic pathways from KEGG. The repository provides various information, ranging from raw data (2D-gel images) to associated results, such as data from interaction and/or pathway analyses. This integration allows in silico analyses of protein-protein interactions affected at the metabolic level and, particularly, analyses of interactions between and within the affected metabolic pathways under temperature stresses for comparative proteomic analysis. The developed tool, which is coded in HTML with CSS/JavaScript and depicted in Scalable Vector Graphics (SVG), is designed for interactive analysis and exploration of the constructed network. SpirPro is publicly available on the web

  14. Roadmap-Based Level Clearing of Buildings

    KAUST Repository

    Rodriguez, Samuel; Amato, Nancy M.

    2011-01-01

    In this paper we describe a roadmap-based approach for a multi-agent search strategy to clear a building or multi-story environment. This approach utilizes an encoding of the environment in the form of a graph (roadmap) that is used to encode

  15. Analysis of Low Level DNA Mixtures

    Czech Academy of Sciences Publication Activity Database

    Slovák, Dalibor; Zvárová, Jana

    2013-01-01

    Vol. 1, No. 1 (2013), p. 63. ISSN 1805-8698. [EFMI 2013 Special Topic Conference. 17.04.2013-19.04.2013, Prague] Institutional support: RVO:67985807 Keywords: forensic DNA interpretation * low level samples * allele peak heights * dropout probability Subject RIV: IN - Informatics, Computer Science

  16. Microprocessor-based accelerating power level detector

    Energy Technology Data Exchange (ETDEWEB)

    Nagpal, M.; Zarecki, W.; Albrecht, J.C.

    1994-01-01

    An accelerating power level detector was built using state-of-the-art microprocessor technology at Powertech Labs Inc. The detector will monitor the real power flowing in two 300 kV transmission lines out of Kemano Hydroelectric Generating Station and will detect any sudden loss of load due to a fault on either line under certain pre-selected power flow conditions. This paper discusses the criteria of operation for the detector and its implementation details, including digital processing, hardware, and software.

  17. Music Video: An Analysis at Three Levels.

    Science.gov (United States)

    Burns, Gary

    This paper is an analysis of the different aspects of the music video. Music video is defined as having three meanings: an individual clip, a format, or the "aesthetic" that describes what the clips and format look like. The paper examines interruptions, the dialectical tension and the organization of the work of art, shot-scene…

  18. Area-Level and Individual-Level Factors for Teenage Motherhood: A Multilevel Analysis in Japan.

    Science.gov (United States)

    Baba, Sachiko; Iso, Hiroyasu; Fujiwara, Takeo

    2016-01-01

    Teenage motherhood is strongly associated with a range of disadvantages for both the mother and the child. No epidemiological studies have examined related factors for teenage motherhood at both the area and individual levels among Japanese women. Therefore, we performed a multilevel analysis of nationwide data in Japan to explore the association of area- and individual-level factors with teenage motherhood. The study population comprised 21,177 mothers living in 47 prefectures who had their first, singleton baby between 10 and 17 January or between 10 and 17 July, 2001. Information on the prefecture in which the mothers resided was linked to prefecture-level variables. Area-level characteristics (single-mother households, three-generation households, college enrollment, abortions, juvenile crime, and per capita income) and individual-level characteristics were examined, divided into tertiles or quintiles based on their distributions. Multilevel logistic regression analysis was then performed. There were 440 teenage mothers (2.1%) in this study. In addition to individual low level of education [adjusted odds ratio (OR), 7.40; 95% confidence interval (CI), 5.59-9.78], low income [4.23 (2.95-6.08)], and smoking [1.65 (1.31-2.07)], high proportions of single-mother households [1.72 (1.05-2.80)] and three-generation households [1.81 (1.17-2.78)] and high per capita income [2.19 (1.06-3.81)] at the area level were positively associated with teenage motherhood, while high college enrollment [0.46 (0.25-0.83)] and lower crime rates [0.62 (0.40-0.98)] at the area level were inversely associated with it, compared with women living in prefectures with the lowest levels of these variables. Our findings suggest that encouraging the completion of higher education and reducing the number of single-mother households at the area level may be important public health strategies to reduce teenage motherhood.

  19. Period analysis at high noise level

    International Nuclear Information System (INIS)

    Kovacs, G.

    1980-01-01

    Analytical expressions are derived for the variances of some types of the periodograms due to normal-distributed noise present in the data. The equivalence of the Jurkevich and the Warner and Robinson methods is proved. The optimum phase cell number of the Warner and Robinson method is given; this number depends on the data length, signal form and noise level. The results are illustrated by numerical examples. (orig.)
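
    The phase-binning statistic behind methods of the Jurkevich and Warner-Robinson type is compact enough to sketch: fold the data on a trial period, split it into phase cells, and sum the within-cell scatter. This is an illustrative implementation on synthetic data, not the paper's analytical variance derivation:

        import numpy as np

        # Jurkevich-style statistic: a deep minimum over trial periods marks
        # the true period, even at high noise level.
        def phase_bin_statistic(t, y, period, n_cells=10):
            phases = (t / period) % 1.0
            cells = np.floor(phases * n_cells).astype(int)
            s = 0.0
            for c in range(n_cells):
                yc = y[cells == c]
                if yc.size > 1:
                    s += np.sum((yc - yc.mean()) ** 2)
            return s

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 100, 300))
        y = np.sin(2 * np.pi * t / 7.3) + rng.normal(scale=0.5, size=300)
        trials = np.linspace(5, 10, 501)
        best = trials[np.argmin([phase_bin_statistic(t, y, p) for p in trials])]
        print(best)   # close to the true period of 7.3 despite the noise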

  20. Where is the difference between an epidemic and a high endemic level with respect to nosocomial infection control measures? An analysis based on the example of vancomycin-resistant Enterococcus faecium in hematology and oncology departments

    Directory of Open Access Journals (Sweden)

    Ulrich, Nikos

    2017-08-01

    Full Text Available Some infection control recommendations distinguish between epidemic and endemic levels for infection control. However, it is often difficult to separate long-lasting outbreaks from high endemic levels, and it remains open whether this distinction is really useful. Aim: To compare infection control measures in endemic and epidemic situations. Methods: The example of vancomycin-resistant Enterococcus faecium outbreaks in haematology or oncology departments was used to analyse differences in infection control measures between outbreaks and high endemic levels. The outbreak database and PubMed, including long-lasting outbreaks, were used for this analysis. Two time limits were used for separation: 6 and 12 months. In addition, monoclonal and polyclonal outbreaks were distinguished. Findings: A total of 36 outbreaks were included. 13 outbreaks lasted 6 months or less, 9 outbreaks more than 6 months but at maximum 12 months, and 9 more than 12 months. For the remaining outbreaks, no information about their duration was available. Altogether, 11 outbreaks were monoclonal and 20 polyclonal. Regarding infection control measures, there were almost no differences between the different groups compared. Patient screening was given up in 37.5% of long-lasting outbreaks (>12 months), and hand hygiene was not reported in the majority of polyclonal outbreaks (77.8%). Conclusion: Although many institutions try to add further infection control measures in case of an outbreak, evidence-based infection control measures should be implemented in endemic and epidemic situations alike. The crucial aspect is probably the degree of implementation and its control in both situations.

  1. FPGA based compute nodes for high level triggering in PANDA

    International Nuclear Information System (INIS)

    Kuehn, W; Gilardi, C; Kirschner, D; Lang, J; Lange, S; Liu, M; Perez, T; Yang, S; Schmitt, L; Jin, D; Li, L; Liu, Z; Lu, Y; Wang, Q; Wei, S; Xu, H; Zhao, D; Korcyl, K; Otwinowski, J T; Salabura, P

    2008-01-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10⁷/s and data rates of several hundred Gb/s. FPGA-based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high level C-like hardware description languages will be considered to implement the firmware

  2. Lake-level frequency analysis for Devils Lake, North Dakota

    Science.gov (United States)

    Wiche, Gregg J.; Vecchia, Aldo V.

    1996-01-01

    Two approaches were used to estimate future lake-level probabilities for Devils Lake. The first approach is based on an annual lake-volume model, and the second approach is based on a statistical water mass-balance model that generates seasonal lake volumes on the basis of seasonal precipitation, evaporation, and inflow. Autoregressive moving average models were used to model the annual mean lake volume and the difference between the annual maximum lake volume and the annual mean lake volume. Residuals from both models were determined to be uncorrelated with zero mean and constant variance. However, a nonlinear relation between the residuals of the two models was included in the final annual lake-volume model. Because of high autocorrelation in the annual lake levels of Devils Lake, the annual lake-volume model was verified using annual lake-level changes. The annual lake-volume model closely reproduced the statistics of the recorded lake-level changes for 1901-93 except for the skewness coefficient. However, the model output is less skewed than the data indicate because of some unrealistically large lake-level declines. The statistical water mass-balance model requires as inputs seasonal precipitation, evaporation, and inflow data for Devils Lake. Analysis of annual precipitation, evaporation, and inflow data for 1950-93 revealed no significant trends or long-range dependence, so the input time series were assumed to be stationary and short-range dependent. Normality transformations were used to approximately maintain the marginal probability distributions, and a multivariate, periodic autoregressive model was used to reproduce the correlation structure. Each of the coefficients in the model is significantly different from zero at the 5-percent significance level. Coefficients relating spring inflow from one year to spring and fall inflows from the previous year had the largest effect on the lake-level frequency analysis. Inclusion of parameter uncertainty in the model
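
    The modeling step described here, fitting a low-order ARMA model to annual volumes and simulating future traces for frequency analysis, can be sketched with statsmodels; synthetic AR(1) data stand in for the Devils Lake record, and the model order is an assumption:

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA

        # Fit a low-order ARMA model to a persistent annual series and simulate
        # future traces. Synthetic data; not the actual Devils Lake volumes.
        rng = np.random.default_rng(3)
        n = 93                                  # ~1901-1993 annual values
        vol = np.empty(n)
        vol[0] = 0.0
        for i in range(1, n):                   # autocorrelated series
            vol[i] = 0.8 * vol[i - 1] + rng.normal(scale=0.3)

        fit = ARIMA(vol, order=(1, 0, 1)).fit()        # ARMA(1,1) on the levels
        sims = fit.simulate(nsimulations=50, repetitions=1000, anchor="end")
        print(fit.params)                              # AR, MA, variance estimates
        print(np.percentile(sims[-1], [5, 50, 95]))    # 50-yr-ahead quantiles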

  3. National high-level waste systems analysis

    International Nuclear Information System (INIS)

    Kristofferson, K.; O'Holleran, T.P.

    1996-01-01

    Previously, no mechanism existed that provided a systematic, interrelated view or national perspective of all the high-level waste (HLW) treatment and storage systems that the US Department of Energy (DOE) manages. The impacts of budgetary constraints and repository availability on storage and treatment must be assessed against existing and pending negotiated milestones for their impact on the overall HLW system. This assessment can give DOE a complex-wide view of the availability of waste treatment and help project the time required to prepare HLW for disposal. Facilities, throughputs, schedules, and milestones were modeled to ascertain the resource requirements of the treatment and storage systems at the Hanford Site, Savannah River Site, Idaho National Engineering Laboratory, and West Valley Demonstration Project. The impacts of various treatment system availabilities on schedule and throughput were compared to repository readiness to determine the prudent application of resources. To assess the various impacts, the model was exercised against a number of plausible scenarios, as discussed in this paper.

  4. Base compaction specification feasibility analysis.

    Science.gov (United States)

    2012-12-01

    The objective of this research is to establish the technical engineering and cost analysis concepts that will enable WisDOT management to objectively evaluate the feasibility of switching construction specification philosophies for aggregate base...

  5. Severe accident analysis for level 2 PSA of SMART reactor

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jin Yong; Lee, Jeong Hun; Kim, Jong Uk; Yoo, Tae Geun; Chung, Soon Il; Kim, Min Gi [FNC Technology Co., Seoul (Korea, Republic of)

    2010-12-15

    The objectives of this study are to produce data for the level 2 PSA and severe accident evaluation results by analyzing the severe accident sequences of transient events, producing fault trees of the containment systems, and evaluating direct containment heating of the SMART. In this project, severe accident analysis results were produced for general transient, loss of feedwater, station blackout, and steam line break events, and based on the results, the design safety of SMART was verified. Also, the direct containment heating phenomenon of the SMART was evaluated using the TCE methodology. For the level 2 PSA, fault trees of the containment isolation system, reactor cavity flooding system, plant chilled water system, and reactor containment building HVAC system were produced and analyzed.

  6. Trace and ultratrace level elemental and speciation analysis

    International Nuclear Information System (INIS)

    Arunachalam, J.

    2012-01-01

    Accurate determination of elements present at parts-per-million and parts-per-billion levels in various matrices is a growing requirement in different fields. In environmental sciences, various trace elements need to be analyzed so as to establish the dispersal models of pollutants or the adequacy of effluent treatment prior to discharge into water bodies. The issues of bioaccumulation and magnification are important in aquatic systems. In nutrition and biochemistry, one has to establish the bio-availability of essential and toxic elemental species, as toxic elements prevent the assimilation of essential elements. Fission and fusion technologies use a variety of structural materials requiring many trace elements to be present at levels strictly below specified limits. Ultra-pure bulk semiconductor materials are required for device fabrication. In metallurgy and materials sciences too, various trace elements are known to influence the properties. In emerging fields like nanotechnology, it is necessary to understand the passage and accumulation of nano-particles inside cells through trace analysis. Many analytical techniques exist which can provide concentration information in bulk materials with good accuracy. They include ICP-AES, FAAS, and ICP-MS, which are solution-based techniques. Direct solid-state analytical techniques are Glow Discharge Mass Spectrometry (GDMS) and XRF. Accelerator-based ion-beam analysis techniques can provide information on concentrations and depth profiles of different elements in layered structures. Hyphenated techniques such as HPLC/IC-ICPMS are helpful in identifying the various chemical oxidation states in which a given element might be present in a matrix, which is termed speciation analysis. This presentation will cover the existing analytical competencies and the laboratory requirements for trace and ultra-trace elemental and speciation analyses and their applications. (author)

  7. Multi-level approach for parametric roll analysis

    Science.gov (United States)

    Kim, Taeyoung; Kim, Yonghwan

    2011-03-01

    The present study considers a multi-level approach for the analysis of parametric roll phenomena. Three computation methods, GM variation, impulse response function (IRF), and the Rankine panel method, are applied in the multi-level approach. The IRF and Rankine panel methods are based on the weakly nonlinear formulation which includes nonlinear Froude-Krylov and restoring forces. In the computed results of the parametric roll occurrence test in regular waves, the IRF and Rankine panel methods show a similar tendency. Although the GM variation approach predicts the occurrence of parametric roll at twice the roll natural frequency, its frequency criterion shows a little difference. Nonlinear roll motion in bichromatic waves is also considered in this study. To examine the unstable roll motion in bichromatic waves, theoretical and numerical approaches are applied. The occurrence of parametric roll is theoretically examined by introducing the quasi-periodic Mathieu equation. Instability criteria are well predicted from the stability analysis in the theoretical approach. From the Fourier analysis, it has been verified that difference-frequency effects create the unstable roll motion. The occurrence of unstable roll motion in bichromatic waves is also observed in the experiment.
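
    For reference, the classical damped Mathieu equation underlying this stability analysis, and its quasi-periodic extension for bichromatic waves, take the forms below (a sketch of the standard equations; the coefficients used in the cited study are not reproduced here):

        \ddot{\phi} + 2\mu\dot{\phi} + \omega_{\phi}^{2}\left[1 + \varepsilon\cos(\omega_{e}t)\right]\phi = 0
        \ddot{\phi} + 2\mu\dot{\phi} + \omega_{\phi}^{2}\left[1 + \varepsilon_{1}\cos(\omega_{1}t) + \varepsilon_{2}\cos(\omega_{2}t)\right]\phi = 0

    Here φ is the roll angle, μ a linear damping coefficient, ω_φ the roll natural frequency, and ω_e (or ω_1, ω_2) the wave encounter frequencies; parametric resonance occurs near ω_e ≈ 2ω_φ, consistent with the twice-natural-frequency criterion mentioned above.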

  8. High-Level Overview of Data Needs for RE Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Anthony

    2016-12-22

    This presentation provides a high-level overview of analysis topics and associated data needs. Types of renewable energy analysis are grouped into two buckets: first, analysis of renewable energy potential, and second, analysis for other goals. Data requirements are similar, and they build upon one another.

  9. Adjacent level effects of bi level disc replacement, bi level fusion and disc replacement plus fusion in cervical spine--a finite element based study.

    Science.gov (United States)

    Faizan, Ahmad; Goel, Vijay K; Biyani, Ashok; Garfin, Steven R; Bono, Christopher M

    2012-03-01

    Studies delineating the adjacent-level effects of single-level disc replacement systems have been reported in the literature. The aim of this study was to compare the adjacent-level biomechanics of bi-level disc replacement, bi-level fusion and a construct having adjoining-level disc replacement and fusion systems. In total, the biomechanics of four models (intact, bi-level disc replacement, bi-level fusion, and fusion plus disc replacement at adjoining levels) was studied to gain insight into the effects of the various instrumentation systems on the cranial and caudal adjacent levels using finite element analysis (73.6 N + varying moment). The bi-level fusion models are more than twice as stiff as the intact model during flexion-extension, lateral bending and axial rotation. The bi-level disc replacement model required moments lower than the intact model (1.5 N·m). The fusion plus disc replacement model required moments 10-25% greater than the intact model, except in extension. Adjacent-level motions, facet loads and endplate stresses increased substantially in the bi-level fusion model. On the other hand, adjacent-level motions, facet loads and endplate stresses were similar to intact for the bi-level disc replacement model. For the fusion plus disc replacement model, adjacent-level motions, facet loads and endplate stresses were closer to the intact model than to the bi-level fusion model, except in extension. Based on our finite element analysis, the fusion plus disc replacement procedure has less severe biomechanical effects on adjacent levels than the bi-level fusion procedure. The bi-level disc replacement procedure did not have any adverse mechanical effects on adjacent levels. Copyright © 2011 Elsevier Ltd. All rights reserved.

  10. Analysis of conservativity for clearance levels. Final report

    International Nuclear Information System (INIS)

    Deckert, A.; Thierfeldt, S.

    1997-07-01

    When deriving clearance levels for material from nuclear installations, it is necessary to proceed with a certain degree of conservativity. This can, however, differ between various sets of clearance levels, leading to inconsistencies between clearance pathways. The aim of the work therefore is to compare the levels of conservativity of the following two sets of clearance levels: clearance levels for disposal as conventional waste, and clearance levels for metallic materials for recycling/reuse. A method was developed to quantify the degree of conservativity and make it comparable. The actual and future situation for the disposal of (conventional) wastes in Germany was analysed. In addition, the masses, nuclide vectors, geographical distribution, etc. of slightly radioactive material being cleared for conventional disposal were analysed and modelled, and the resulting dose distributions were calculated. The values for the clearance levels were taken from the 1995 recommendation by the German Commission on Radiation Protection (SSK). By using realistic scenarios, the exposure was calculated for the personnel on the landfills and for persons of the general public exposed via groundwater pathways. It could be shown that the trivial dose range will not be exceeded even if the masses of cleared material per landfill site exceed 100 Mg/a. Because of the types and distribution of nuclear installations in Germany, and because of the nuclide vectors, it is therefore not necessary to limit the masses per landfill site. Clearance levels that are determined by the exposure pathways of external exposure to, and inhalation of, dust by the landfill personnel show a similar level of conservativity to those for metal scrap. This means that the clearance levels for gamma-emitting nuclides are not overly restrictive. Although radiologically justified, raising the clearance levels would not lead to an increase of the material quantities, because other nuclides of the respective nuclide vectors are limiting. In

  11. A multivariate analysis of serum nutrient levels and lung function

    Directory of Open Access Journals (Sweden)

    Smit Henriette A

    2008-09-01

    Full Text Available Abstract Background There is mounting evidence that estimates of intakes of a range of dietary nutrients are related to both lung function level and rate of decline, but far less evidence on the relation between lung function and objective measures of serum levels of individual nutrients. The aim of this study was to conduct a comprehensive examination of the independent associations of a wide range of serum markers of nutritional status with lung function, measured as the one-second forced expiratory volume (FEV1). Methods Using data from the Third National Health and Nutrition Examination Survey, a US population-based cross-sectional study, we investigated the relation between 21 serum markers of potentially relevant nutrients and FEV1, with adjustment for potential confounding factors. Systematic approaches were used to guide the analysis. Results In a mutually adjusted model, higher serum levels of antioxidant vitamins (vitamin A, beta-cryptoxanthin, vitamin C, vitamin E), selenium, normalized calcium, chloride, and iron were independently associated with higher levels of FEV1. Higher concentrations of potassium and sodium were associated with lower FEV1. Conclusion Maintaining higher serum concentrations of dietary antioxidant vitamins and selenium is potentially beneficial to lung health. In addition, other novel associations found in this study merit further investigation.

  12. Interaction between High-Level and Low-Level Image Analysis for Semantic Video Object Extraction

    Directory of Open Access Journals (Sweden)

    Andrea Cavallaro

    2004-06-01

    Full Text Available The task of extracting a semantic video object is split into two subproblems, namely, object segmentation and region segmentation. Object segmentation relies on a priori assumptions, whereas region segmentation is data-driven and can be solved in an automatic manner. These two subproblems are not mutually independent, and they can benefit from interactions with each other. In this paper, a framework for such interaction is formulated. This representation scheme based on region segmentation and semantic segmentation is compatible with the view that image analysis and scene understanding problems can be decomposed into low-level and high-level tasks. Low-level tasks pertain to region-oriented processing, whereas the high-level tasks are closely related to object-level processing. This approach emulates the human visual system: what one “sees” in a scene depends on the scene itself (region segmentation) as well as on the cognitive task (semantic segmentation) at hand. The higher-level segmentation results in a partition corresponding to semantic video objects. Semantic video objects do not usually have invariant physical properties, and their definition depends on the application. Hence, the definition incorporates complex domain-specific knowledge and is not easy to generalize. For the specific implementation used in this paper, motion is used as a clue to semantic information. In this framework, an automatic algorithm is presented for computing the semantic partition based on color change detection. The change detection strategy is designed to be immune to sensor noise and local illumination variations. The lower-level segmentation identifies the partition corresponding to perceptually uniform regions. These regions are derived by clustering in an N-dimensional feature space, composed of static as well as dynamic image attributes. We propose an interaction mechanism between the semantic and the region partitions which allows to

  13. Elementary study on γ analysis software for low level measurement

    International Nuclear Information System (INIS)

    Ruan Guanglin; Huang Xianguo; Xing Shixiong

    2001-01-01

    The difficulty of using popular γ analysis software for low-level measurement is discussed. The ROI report file of the ORTEC operating system has been chosen as the interface file for writing γ analysis software for low-level measurement. The authors give a software flowchart and an application example, and discuss the existing problems.

  14. Base level Investigation in various buildings and corresponding effective factors

    Directory of Open Access Journals (Sweden)

    Mohsen Tehranizadeh

    2017-07-01

    Full Text Available The base level is one of the important parameters in determining the seismic force and the preliminary design of structural sections. According to the 2800 seismic regulations, in cases where the basement perimeter is executed with reinforced concrete walls integrated with the structure and surrounded by dense soil, the base level is set at the top of the basement walls. The critical issue involved in determining the base level is the horizontal motion of the ground. Usually the horizontal movement of the earth is transferred by shear and friction between the edges of the basement walls and the foundation, and this process is completed by soil friction between the underside of the slabs and the shallow foundation. Different conditions, such as foundations at unequal elevations, the soil type around the building, soil-structure interaction and the type of foundation, influence the location of the base level. Other factors, including retaining wall openings in the basement, the number of basement floors and the characteristics of the soil around the base of the structure, also affect the base level coordination. Although base level is given a clear definition in different regulations around the world, engineers sometimes cannot comprehend the main purpose correctly, or the concepts are occasionally interpreted inaccurately. When structural conditions differ a little from the norm, for example for buildings on slopes or structures on deep foundations such as piles, experts are often conflicted in locating the base level. In this paper, investigations of base level from past years are presented and studied, and the important issues around them are discussed.

  15. Level set method for image segmentation based on moment competition

    Science.gov (United States)

    Min, Hai; Wang, Xiao-Feng; Huang, De-Shuang; Jin, Jing; Wang, Hong-Zhi; Li, Hai

    2015-05-01

    We propose a level set method for image segmentation which introduces the moment competition and weakly supervised information into the energy functional construction. Different from the region-based level set methods which use force competition, the moment competition is adopted to drive the contour evolution. Here, a so-called three-point labeling scheme is proposed to manually label three independent points (weakly supervised information) on the image. Then the intensity differences between the three points and the unlabeled pixels are used to construct the force arms for each image pixel. The corresponding force is generated from the global statistical information of a region-based method and weighted by the force arm. As a result, the moment can be constructed and incorporated into the energy functional to drive the evolving contour to approach the object boundary. In our method, the force arm can take full advantage of the three-point labeling scheme to constrain the moment competition. Additionally, the global statistical information and weakly supervised information are successfully integrated, which makes the proposed method more robust than traditional methods for initial contour placement and parameter setting. Experimental results with performance analysis also show the superiority of the proposed method on segmenting different types of complicated images, such as noisy images, three-phase images, images with intensity inhomogeneity, and texture images.

  16. 12 CFR 652.70 - Risk-based capital level.

    Science.gov (United States)

    2010-01-01

    ... risk-based capital level is the sum of the following amounts: (a) Credit and interest rate risk. The... (Title 12, Banks and Banking; Section 652.70; Farm Credit Administration, Farm Credit System, Federal Agricultural Mortgage...)

  17. A Container-based Trusted Multi-level Security Mechanism

    Directory of Open Access Journals (Sweden)

    Li Xiao-Yong

    2017-01-01

    Full Text Available Multi-level security mechanisms have been widely applied in the military, government, defense and other domains in which information is required to be divided by security level. Through this type of security mechanism, users at different security levels are provided with information at the corresponding security levels. Traditional multi-level security mechanisms, which depend on the safety of the operating system, ultimately proved impractical. We propose a container-based trusted multi-level security mechanism in this paper to improve the applicability of the multi-level mechanism. It guarantees multi-level security of the system through a set of multi-level security policy rules and trusted techniques. The technical feasibility and application scenarios are also discussed. The ease of realization, strong practical significance and low cost of our method will largely expand the application of multi-level security mechanisms in real life.

  18. Hand-Based Biometric Analysis

    Science.gov (United States)

    Bebis, George (Inventor); Amayeh, Gholamreza (Inventor)

    2015-01-01

    Hand-based biometric analysis systems and techniques are described which provide robust hand-based identification and verification. An image of a hand is obtained, which is then segmented into a palm region and separate finger regions. Acquisition of the image is performed without requiring particular orientation or placement restrictions. Segmentation is performed without the use of reference points on the images. Each segment is analyzed by calculating a set of Zernike moment descriptors for the segment. The feature parameters thus obtained are then fused and compared to stored sets of descriptors in enrollment templates to arrive at an identity decision. By using Zernike moments, and through additional manipulation, the biometric analysis is invariant to rotation, scale, or translation of an input image. Additionally, the analysis utilizes re-use of commonly-seen terms in Zernike calculations to achieve additional efficiencies over traditional Zernike moment calculation.

  19. Risk Based Milk Pricing Model at Dairy Farmers Level

    Directory of Open Access Journals (Sweden)

    W. Septiani

    2017-12-01

    Full Text Available The milk price from a cooperative institution to farmers does not fully cover the production cost, even though dairy farmers encounter various risks and uncertainties in conducting their business. The highest risk in the milk supply lies in the activities at the farm. This study was designed to formulate a model for calculating the milk price at the farmer's level based on risk. Risks that occur on farms include the risks of cow breeding, sanitation, health care, cattle feed management, milking and milk sales. This research used farm locations in the West Java region. There were five main stages in the preparation of this model: (1) identification and analysis of influential factors, (2) development of a conceptual model, (3) structural analysis and the amount of production costs, (4) model calculation of production cost with risk factors, and (5) the risk-based milk pricing model. This research built a relationship between risks on smallholder dairy farms and the production costs to be incurred by the farmers. It also obtained a formulation for calculating the risk adjustment factor for the variable costs of production on dairy cattle farms. The difference between production costs with risk and the total production cost without risk was about 8% to 10%. It could be concluded that the basic milk price proposed based on this research is around IDR 4,250-IDR 4,350/L for ownership of 3 to 4 cows. Increased farmer income is expected to be obtained by entering the value of this risk in the calculation of production costs.
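
    To make the risk-adjustment idea concrete, a minimal Python sketch is given below; the cost categories and risk factors are hypothetical placeholders, since the abstract reports only the 8-10% aggregate difference, not the underlying figures:

        # Sketch: risk-adjusted variable production cost per litre of milk.
        # All category costs (IDR/L) and risk factors below are illustrative
        # assumptions, not values estimated in the study.
        variable_costs = {"feed": 2400.0, "health_care": 300.0, "milking_and_sales": 500.0}
        risk_factors = {"feed": 0.10, "health_care": 0.12, "milking_and_sales": 0.05}

        base_cost = sum(variable_costs.values())
        risk_cost = sum(c * (1.0 + risk_factors[k]) for k, c in variable_costs.items())

        print(f"cost without risk: {base_cost:.0f} IDR/L")
        print(f"cost with risk:    {risk_cost:.0f} IDR/L")
        print(f"difference:        {100 * (risk_cost / base_cost - 1):.1f}%")  # ~9% here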

  20. Safety analysis of urban arterials at the meso level.

    Science.gov (United States)

    Li, Jia; Wang, Xuesong

    2017-11-01

    Urban arterials form the main structure of street networks. They typically have multiple lanes, high traffic volume, and high crash frequency. Classical crash prediction models investigate the relationship between arterial characteristics and traffic safety by treating road segments and intersections as isolated units. This micro-level analysis does not work when examining urban arterial crashes because signal spacing is typically short for urban arterials, and there are interactions between intersections and road segments that classical models do not accommodate. Signal spacing also has safety effects on both intersections and road segments that classical models cannot fully account for because they allocate crashes separately to intersections and road segments. In addition, classical models do not consider the impact on arterial safety of the immediately surrounding street network pattern. This study proposes a new modeling methodology that will offer an integrated treatment of intersections and road segments by combining signalized intersections and their adjacent road segments into a single unit based on road geometric design characteristics and operational conditions. These are called meso-level units because they offer an analytical approach between micro and macro. The safety effects of signal spacing and street network pattern were estimated for this study based on 118 meso-level units obtained from 21 urban arterials in Shanghai, and were examined using CAR (conditional auto regressive) models that corrected for spatial correlation among the units within individual arterials. Results showed shorter arterial signal spacing was associated with higher total and PDO (property damage only) crashes, while arterials with a greater number of parallel roads were associated with lower total, PDO, and injury crashes. The findings from this study can be used in the traffic safety planning, design, and management of urban arterials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Sea level rise and the geoid: factor analysis approach

    Directory of Open Access Journals (Sweden)

    Alexey Sadovski

    2013-08-01

    Full Text Available Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence. This sea level rise is considered more as relative sea level rise than global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical approach using factor analysis of regional sea level rates of change. Unlike physical models and semi-empirical models that attempt to estimate how much and how fast sea levels are changing, this methodology allows for a discussion of the factor(s) that statistically affect sea level rates of change, and seeks patterns to explain spatial correlations.

  2. Levels and Patterns in the Analysis of the Organizational Culture

    OpenAIRE

    Mariana Aida Cimpeanu

    2011-01-01

    Knowledge and analysis of the component elements of the organizational culture help us greatly understand the respective culture, establish the main guidelines of the company values and understand the behaviours and attitudes of the employees. M. Thevenet identifies two levels at which the culture manifests itself: the external level, the outside culture (which refers to local, regional or national culture), and the inner level, the internal culture (including organizational culture, profe...

  3. Microcontroller based multi-channel ultrasonic level monitoring system

    International Nuclear Information System (INIS)

    Ambastha, K.P.; Chaudhari, Y.V.; Singh, Inder Jeet; Chadda, V.K.

    2004-01-01

    The microcontroller-based multi-channel ultrasonic level monitoring system developed by the Computer Division is based on echo-ranging techniques to monitor level. The transmitter directs an ultrasonic burst towards the liquid, which is reflected from the top of the liquid surface. The time taken for the ultrasound to travel from the transmitter to the top of the liquid surface is measured and used to calculate the liquid level. The system provides temperature compensation for accurate measurement, as the ultrasound velocity depends on the ambient temperature. It can measure liquid levels up to 5 meters. A single monitor can be used to measure the level in 6 tanks. PC connectivity has been provided via RS 232 and RS 485 for remote operation and data logging of the level. A GUI program developed using the LABVIEW package displays the level on the PC monitor. The program provides pictorial as well as numerical displays of level and temperature in the front panel on the PC monitor. A user can monitor the level of any or all tanks from the PC. One unit is installed at CIRUS for measuring the level in acid/alkali tanks, and one is installed at APSARA for measuring the water level in the reactor pool. (author)
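
    The echo-ranging arithmetic the system relies on is simple; a minimal sketch, assuming a downward-facing sensor mounted at a known height and a linear temperature model for the speed of sound in air (both assumptions for illustration):

        # Sketch: temperature-compensated ultrasonic echo-ranging level calculation.
        def speed_of_sound(temp_c: float) -> float:
            """Approximate speed of sound in air (m/s) at temp_c degrees Celsius."""
            return 331.3 + 0.606 * temp_c

        def liquid_level(round_trip_s: float, temp_c: float, sensor_height_m: float) -> float:
            """Liquid level above the tank bottom from the echo round-trip time."""
            distance_to_surface = speed_of_sound(temp_c) * round_trip_s / 2.0
            return sensor_height_m - distance_to_surface

        # Example: a 17.5 ms round trip at 25 deg C, sensor 5 m above the tank bottom.
        print(f"{liquid_level(0.0175, 25.0, 5.0):.2f} m")  # ~1.97 m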

  4. Radar Based Flow and Water Level Forecasting in Sewer Systems

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Rasmussen, Michael R.; Grum, M.

    2009-01-01

    This paper describes the first radar based forecast of flow and/or water level in sewer systems in Denmark. The rainfall is successfully forecasted with a lead time of 1-2 hours, and flow/levels are forecasted an additional ½-1½ hours using models describing the behaviour of the sewer system. Bot...

  5. Exposure level from selected base station tower around Kuala Nerus

    African Journals Online (AJOL)

    Health risks due to RF radiation exposure from base station towers (BST) have been debated for years, leading to public concern. Thus, this preliminary study aims to measure, evaluate and analyze the exposure levels at three selected BSTs around Kuala Nerus. The measurement of exposure level in terms of voltage ...

  6. Perceived need to increase physical activity levels among adults at high risk of type 2 diabetes. A cross-sectional analysis within a community-based diabetes prevention project FIN-D2D

    Directory of Open Access Journals (Sweden)

    Vähäsarja Kati

    2012-07-01

    Full Text Available Abstract Background Increased physical activity is a cornerstone of type 2 diabetes prevention. The perception of a need to change is considered essential in behaviour change processes. However, the existing literature on individuals’ perceived need to change health behaviour is limited. In order to improve understanding of diabetes prevention through increased physical activity levels (PAL), we assessed factors associated with perceiving a need to increase PAL among adults at high risk of diabetes. Methods Opportunistic screening was used within a primary-care based lifestyle intervention covering 10 149 men and women at high risk of type 2 diabetes. Data were obtained at baseline visits. The explored determinants were demographic, anthropometric/clinical, behavioural and psychosocial characteristics, along with four categories of PAL awareness. Logistic regression was used in the analysis. Results 74% of men (n = 2 577) and 76% of women (n = 4 551) perceived a need to increase their PAL. The participants most likely to perceive this need were inactive, had a larger waist circumference, rated their PAL as insufficient, and were at the contemplation stage of change. Smoking, elevated blood pressure, dyslipidaemia, and a family history of diabetes were not associated with this perception. The likelihood was also greater among women with less perceived fitness and less education. Demographic factors other than education did not determine participants’ perceived need to increase PAL. PAL overestimators were less likely to perceive the need to increase their PAL than realistic inactive participants. Conclusions Subjective rather than objective health factors appear to determine the perception of a need to increase PAL among adults at high risk of diabetes. Client perceptions need to be evaluated in health counselling in order to facilitate a change in PAL. Practical descriptions of the associations between metabolic risk factors, PAL, and

  7. A novel bi-level meta-analysis approach: applied to biological pathway analysis.

    Science.gov (United States)

    Nguyen, Tin; Tagett, Rebecca; Donato, Michele; Mitrea, Cristina; Draghici, Sorin

    2016-02-01

    The accumulation of high-throughput data in public repositories creates a pressing need for integrative analysis of multiple datasets from independent experiments. However, study heterogeneity, study bias, outliers and the lack of power of available methods present a real challenge in integrating genomic data. One practical drawback of many P-value-based meta-analysis methods, including Fisher's, Stouffer's, minP and maxP, is that they are sensitive to outliers. Another drawback is that, because they perform just one statistical test for each individual experiment, they may not fully exploit the potentially large number of samples within each study. We propose a novel bi-level meta-analysis approach that employs the additive method and the Central Limit Theorem within each individual experiment and also across multiple experiments. We prove that the bi-level framework is robust against bias, less sensitive to outliers than other methods, and more sensitive to small changes in signal. For comparative analysis, we demonstrate that the intra-experiment analysis has more power than the equivalent statistical test performed on a single large experiment. For pathway analysis, we compare the proposed framework versus classical meta-analysis approaches (Fisher's, Stouffer's and the additive method) as well as against a dedicated pathway meta-analysis package (MetaPath), using 1252 samples from 21 datasets related to three human diseases: acute myeloid leukemia (9 datasets), type II diabetes (5 datasets) and Alzheimer's disease (7 datasets). Our framework outperforms its competitors in correctly identifying pathways relevant to the phenotypes. The framework is sufficiently general to be applied to any type of statistical meta-analysis. The R scripts are available on demand from the authors. sorin@wayne.edu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e
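
    As a generic illustration of the additive method plus Central Limit Theorem that the framework builds on (not the authors' full bi-level implementation): under the null hypothesis, independent p-values are Uniform(0,1), so their sum is approximately Normal(n/2, n/12), which yields a combined p-value:

        # Sketch: additive (Edgington-style) combination of p-values via the CLT.
        # Small sums of p-values (strong, consistent signals) give small combined p.
        import math

        def additive_combined_p(p_values):
            n = len(p_values)
            s = sum(p_values)                        # sum of Uniform(0,1) under H0
            z = (s - n / 2.0) / math.sqrt(n / 12.0)  # CLT normal approximation
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # left-tail normal CDF

        print(additive_combined_p([0.01, 0.04, 0.03, 0.20]))  # ~0.0015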

  8. Evaluating public ambulance service levels by applying a GIS based ...

    African Journals Online (AJOL)

    Hunadi Mokgalaka

    based analysis of ambulance response time was undertaken. The purpose was to .... He tested the travel time of the primary response vehicles and ..... Real CORP Proceedings / Tagungsband, ISBN: 978-3-9502139-7-3, pp. 22-25, viewed 18.

  9. A brain-based account of "basic-level" concepts.

    Science.gov (United States)

    Bauer, Andrew James; Just, Marcel Adam

    2017-11-01

    This study provides a brain-based account of how object concepts at an intermediate (basic) level of specificity are represented, offering an enriched view of what it means for a concept to be a basic-level concept, a research topic pioneered by Rosch and others (Rosch et al., 1976). Applying machine learning techniques to fMRI data, it was possible to determine the semantic content encoded in the neural representations of object concepts at basic and subordinate levels of abstraction. The representation of basic-level concepts (e.g. bird) was spatially broad, encompassing sensorimotor brain areas that encode concrete object properties, and also language and heteromodal integrative areas that encode abstract semantic content. The representation of subordinate-level concepts (robin) was less widely distributed, concentrated in perceptual areas that underlie concrete content. Furthermore, basic-level concepts were representative of their subordinates in that they were neurally similar to their typical but not atypical subordinates (bird was neurally similar to robin but not woodpecker). The findings provide a brain-based account of the advantages that basic-level concepts enjoy in everyday life over subordinate-level concepts: the basic level is a broad topographical representation that encompasses both concrete and abstract semantic content, reflecting the multifaceted yet intuitive meaning of basic-level concepts. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Economic evaluation comparing intraoperative cone beam CT-based navigation and conventional fluoroscopy for the placement of spinal pedicle screws: a patient-level data cost-effectiveness analysis.

    Science.gov (United States)

    Dea, Nicolas; Fisher, Charles G; Batke, Juliet; Strelzow, Jason; Mendelsohn, Daniel; Paquette, Scott J; Kwon, Brian K; Boyd, Michael D; Dvorak, Marcel F S; Street, John T

    2016-01-01

    Pedicle screws are routinely used in contemporary spinal surgery. Screw misplacement may be asymptomatic but is also correlated with potential adverse events. Computer-assisted surgery (CAS) has been associated with improved screw placement accuracy rates. However, this technology has substantial acquisition and maintenance costs. Despite its increasing usage, no rigorous full economic evaluation comparing this technology to current standard of care has been reported. Medical costs are exploding in an unsustainable way. Health economic theory requires that medical equipment costs be compared with expected benefits. To answer this question for computer-assisted spinal surgery, we present an economic evaluation looking specifically at symptomatic misplaced screws leading to reoperation secondary to neurologic deficits or biomechanical concerns. The study design was an observational case-control study from prospectively collected data of consecutive patients treated with the aid of CAS (treatment group) compared with a matched historical cohort of patients treated with conventional fluoroscopy (control group). The patient sample consisted of consecutive patients treated surgically at a quaternary academic center. The primary effectiveness measure studied was the number of reoperations for misplaced screws within 1 year of the index surgery. Secondary outcome measures included were total adverse event rate and postoperative computed tomography usage for pedicle screw examination. A patient-level data cost-effectiveness analysis from the hospital perspective was conducted to determine the value of a navigation system coupled with intraoperative 3-D imaging (O-arm Imaging and the StealthStation S7 Navigation Systems, Medtronic, Louisville, CO, USA) in adult spinal surgery. The capital costs for both alternatives were reported as equivalent annual costs based on the annuitization of capital expenditures method using a 3% discount rate and a 7-year amortization period
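
    The equivalent-annual-cost conversion mentioned at the end of the abstract follows the standard annuitization formula EAC = K·r / (1 − (1 + r)^−n); a minimal sketch (the capital cost is a placeholder, while the 3% rate and 7-year period are those stated above):

        # Sketch: equivalent annual cost (EAC) of a capital outlay by annuitization.
        def equivalent_annual_cost(capital: float, rate: float, years: int) -> float:
            return capital * rate / (1.0 - (1.0 + rate) ** -years)

        # Hypothetical capital cost; 3% discount rate and 7-year amortization as stated.
        print(f"{equivalent_annual_cost(1_000_000.0, 0.03, 7):,.0f} per year")  # ~160,506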

  11. Level Sets and Voronoi based Feature Extraction from any Imagery

    DEFF Research Database (Denmark)

    Sharma, O.; Anton, François; Mioc, Darka

    2012-01-01

    Polygon features are of interest in many GEOProcessing applications like shoreline mapping, boundary delineation, change detection, etc. This paper presents a unique new GPU-based methodology to automate feature extraction combining level sets, or mean shift based segmentation together with Voron...

  12. Level of Radiofrequency (RF) Radiations from GSM Base Stations ...

    African Journals Online (AJOL)

    Levels of radiofrequency radiation around two global system for mobile communication (GSM) base stations located in the vicinity of a residential quarter and a workplace complex were measured. The effects of the radiofrequency radiation on albino mice placed in exposure cages and located around the base stations ...

  13. A bird strike handbook for base-level managers

    Science.gov (United States)

    Payson, R. P.; Vance, J. D.

    1984-09-01

    To help develop more awareness about bird strikes and bird strike reduction techniques, this thesis compiled all relevant information through an extensive literature search, a review of base-level documents, and personal interviews. The final product, A Bird Strike Handbook for Base-Level Managers, provides information on bird strike statistics, methods to reduce strike hazards, and means to obtain additional assistance. The handbook is organized for use by six major base agencies: Maintenance, Civil Engineering, Operations, Air Field Management, Safety, and Air Traffic Control. An appendix follows at the end.

  14. Pathway analysis for alternate low-level waste disposal methods

    International Nuclear Information System (INIS)

    Rao, R.R.; Kozak, M.W.; McCord, J.T.; Olague, N.E.

    1992-01-01

    The purpose of this paper is to evaluate a complete set of environmental pathways for disposal options and conditions that the Nuclear Regulatory Commission (NRC) may analyze for a low-level radioactive waste (LLW) license application. In the past, shallow-land burial has been used for the disposal of low-level radioactive waste. However, with the advent of the State Compact system of LLW disposal, many alternative technologies may be used. The alternative LLW disposal facilities include below-ground vault, tumulus, above-ground vault, shaft, and mine disposal. This paper will form the foundation of an update of the previously developed Sandia National Laboratories (SNL)/NRC LLW performance assessment methodology. Based on the pathway assessment for alternative disposal methods, a determination will be made about whether the current methodology can satisfactorily analyze the pathways and phenomena likely to be important for the full range of potential disposal options. We have attempted to be conservative in keeping pathways in the lists that may usually be of marginal importance. In this way we can build confidence that we have spanned the range of cases likely to be encountered at a real site. Results of the pathway assessment indicate that disposal methods can be categorized in groupings based on their depth of disposal. For the deep disposal options of shaft and mine disposal, the key pathways are identical. The shallow disposal options, such as tumulus, shallow-land, and below-ground vault disposal, also may be grouped together from a pathway analysis perspective. Above-ground vault disposal cannot be grouped with any of the other disposal options. The pathway analysis shows a definite trend concerning depth of disposal. The above-ground option has the largest number of significant pathways. As the waste becomes more isolated, the number of significant pathways is reduced. Similar to shallow-land burial, it was found that for all

  15. Cognitive Task Analysis of the Battalion Level Visualization Process

    National Research Council Canada - National Science Library

    Leedom, Dennis K; McElroy, William; Shadrick, Scott B; Lickteig, Carl; Pokorny, Robet A; Haynes, Jacqueline A; Bell, James

    2007-01-01

    ... position or as a battalion Operations Officer or Executive Officer. Based on findings from the cognitive task analysis, 11 skill areas were identified as potential focal points for future training development...

  16. Characteristics Data Base: Programmer's guide to the High-Level Waste Data Base

    International Nuclear Information System (INIS)

    Jones, K.E.; Salmon, R.

    1990-08-01

    The High-Level Waste Data Base is a menu-driven PC data base developed as part of OCRWM's technical data base on the characteristics of potential repository wastes, which also includes spent fuel and other materials. This programmer's guide completes the documentation for the High-Level Waste Data Base, the user's guide having been published previously. 3 figs

  17. Spectral analysis of highly aliased sea-level signals

    Science.gov (United States)

    Ray, Richard D.

    1998-10-01

    Observing high-wavenumber ocean phenomena with a satellite altimeter generally calls for "along-track" analyses of the data: measurements along a repeating satellite ground track are analyzed in a point-by-point fashion, as opposed to spatially averaging data over multiple tracks. The sea-level aliasing problems encountered in such analyses can be especially challenging. For TOPEX/POSEIDON, all signals with frequency greater than 18 cycles per year (cpy), including both tidal and subdiurnal signals, are folded into the 0-18 cpy band. Because the tidal bands are wider than 18 cpy, residual tidal cusp energy, plus any subdiurnal energy, is capable of corrupting any low-frequency signal of interest. The practical consequences of this are explored here by using real sea-level measurements from conventional tide gauges, for which the true oceanographic spectrum is known and to which a simulated "satellite-measured" spectrum, based on coarsely subsampled data, may be compared. At many locations the spectrum is sufficiently red that interannual frequencies remain unaffected. Intra-annual frequencies, however, must be interpreted with greater caution, and even interannual frequencies can be corrupted if the spectrum is flat. The results also suggest that whenever tides must be estimated directly from the altimetry, response methods of analysis are preferable to harmonic methods, even in nonlinear regimes; this will remain so for the foreseeable future. We concentrate on three example tide gauges: two coastal stations on the Malay Peninsula where the closely aliased K1 and Ssa tides are strong and at Canton Island where trapped equatorial waves are aliased.
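
    The aliasing arithmetic behind this analysis can be sketched as follows: each signal frequency is folded by the sampling rate implied by the satellite repeat period into the resolvable band. The repeat period (~9.9156 days for TOPEX/POSEIDON) and the M2 tidal period are standard values used here for illustration:

        # Sketch: alias period of a rapidly varying signal sampled at a repeat period.
        def alias_period_days(signal_period_hours: float, repeat_days: float) -> float:
            f_signal = 24.0 / signal_period_hours  # cycles per day
            f_sample = 1.0 / repeat_days           # cycles per day
            f_folded = f_signal % f_sample         # fold by the sampling rate
            f_alias = min(f_folded, f_sample - f_folded)  # reflect into [0, f_sample/2]
            return 1.0 / f_alias

        # M2 tide (12.4206 h) under the ~9.9156-day TOPEX/POSEIDON repeat:
        print(f"{alias_period_days(12.4206, 9.9156):.1f} days")  # ~62 days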

  18. RISK LEVEL ANALYSIS ON THE PREVENTIVE EROSION CAPACITY OF BRIDGES

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Deficiency of the Preventive Erosion Capacity (PEC) of a bridge pier is the main factor leading to bridge failures. In this paper, the PEC of bridge piers was analyzed using the stochastic analysis method. The definitions of the reliability and risk level of a bridge pier subjected to water erosion were proposed, and a computational model for erosion depth and risk level was suggested.

  19. Circulating interleukin-10 levels and human papilloma virus and Epstein-Barr virus-associated cancers: evidence from a Mendelian randomization meta-analysis based on 11,170 subjects.

    Science.gov (United States)

    Qu, Kai; Pang, Qing; Lin, Ting; Zhang, Li; Gu, Mingliang; Niu, Wenquan; Liu, Chang; Zhang, Ming

    2016-01-01

    Recent studies have shown that interleukin 10 (IL-10) is a critical cytokine that determines the antiviral immune response and is related to virus-associated cancers. However, whether genetically elevated circulating IL-10 levels are associated with the risk of human papilloma virus and Epstein-Barr virus-associated cancers (HEACs) is still unclear. The Mendelian randomization method was implemented to meta-analyze available observational studies by employing three IL-10 variants (-592C>A, -819C>T, and -1082A>G) as instruments. A total of 24 articles encompassing 11,170 subjects were ultimately eligible for the meta-analysis. Overall, there was a significant association between the IL-10 promoter variant -1082A>G and HEACs under allelic and dominant models (both P<0.05), and -1082A>G was significant for nasopharyngeal cancer under allelic, homozygous genotypic and dominant models (all P<0.001). Moreover, by ethnicity, carriers of the -1082G allele had a 74% increased risk for nasopharyngeal cancer in Asians under the dominant model (odds ratio [OR] =1.737; 95% confidence interval [CI]: 1.280-2.358; P<0.001). In further Mendelian randomization analysis, the predicted OR for a 10 pg/mL increment in IL-10 levels was 1.14 (95% CI: 1.01-16.99) in HEACs. Our findings provided strong evidence for a critical role of genetically elevated circulating IL-10 levels in the development of HEACs, especially in the Asian population and for nasopharyngeal cancer.

  20. Watershed-based Morphometric Analysis: A Review

    Science.gov (United States)

    Sukristiyanti, S.; Maria, R.; Lestiana, H.

    2018-02-01

    Drainage basin/watershed analysis based on morphometric parameters is very important for watershed planning. Morphometric analysis of a watershed is the best method to identify the relationships between various aspects of the area. Although many technical papers have dealt with this area of study, there is no particular standard classification and implication for each parameter. It is very confusing to evaluate the value of every morphometric parameter. This paper deals with the meaning of the values of the various morphometric parameters, with adequate contextual information. A critical review is presented of each classification, the range of values, and their implications. Besides classification and its impact, the authors are also concerned about the quality of the input data, either in data preparation or in the scale/detail level of mapping. This review hopefully can give a comprehensive explanation to assist upcoming research dealing with morphometric analysis.

  1. DPASV analytical technique for ppb level uranium analysis

    Science.gov (United States)

    Pal, Sangita; Singha, Mousumi; Meena, Sher Singh

    2018-04-01

    Determining uranium at the ppb level is considered most crucial for the reuse of water originating in the nuclear industry, at the time of decontamination of plant effluents generated during uranium (fuel) production, fuel rod fabrication and application in nuclear reactors, as well as the comparatively small amounts of effluents generated during laboratory research and development work. Uranium at higher (percentage) levels can be analyzed through gravimetry, titration, etc., whereas inductively coupled plasma atomic emission spectroscopy (ICP-AES) and fluorimetry are well suited for the ppm level. For the ppb level of uranium, inductively coupled plasma mass spectrometry (ICP-MS) or differential pulse anodic stripping voltammetry (DPASV) serve the purpose. High precision, accuracy and sensitivity are crucial for uranium analysis at the trace (ppb) level, and these are satisfied by ICP-MS and the stripping voltammeter. The voltammeter has been found to be less expensive, requires low maintenance and is convenient for measuring uranium in the presence of a large number of other ions in the waste effluent. In this paper, the necessity of quantifying the uranium concentration for recovery as well as safe disposal of plant effluent, the working mechanism of the voltammeter with respect to uranium analysis at the ppb level with its standard deviation, and a comparison of the data with ICP-MS are presented.

  2. Power Grid Construction Project Portfolio Optimization Based on Bi-level programming model

    Science.gov (United States)

    Zhao, Erdong; Li, Shangqi

    2017-08-01

    As the main body of power grid operation, county-level power supply enterprises undertake an important mission to guarantee the security of power grid operation and safeguard the social order of power use. The optimization of grid construction projects has been a key issue for the power supply capacity and service level of grid enterprises. According to the actual situation of power grid construction project optimization in county-level power enterprises, and on the basis of a qualitative analysis of the projects, this paper builds a bi-level programming model based on quantitative analysis. The upper layer of the model is the target restriction of the optimal portfolio; the lower layer of the model is the enterprise's financial restrictions on the size of the project portfolio. Finally, a real example is used to illustrate the operation and the optimization result of the model. Through qualitative analysis and quantitative analysis, the bi-level programming model improves the accuracy and standardization of power grid enterprises' project decisions.
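
    A generic bi-level program of the kind described has the structure sketched below (in LaTeX notation); the concrete portfolio objective F, financial objective f, and budget constraints g of the paper's model are not reproduced here:

        \max_{x \in \{0,1\}^{n}} \; F(x, y^{*})
        \quad \text{s.t.} \quad
        y^{*} \in \arg\min_{y} \left\{ f(x, y) : g(x, y) \le b \right\}

    Here x selects projects at the upper (portfolio) level, while the lower level models the enterprise's financial restrictions on the size of the selected portfolio.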

  3. DeviceNet-based device-level control in SSRF

    CERN Document Server

    Leng Yong Bin; Lu Cheng Meng; Miao Hai Feng; Liu Song Qiang; Shen Guo Bao

    2002-01-01

    The control system of the Shanghai Synchrotron Radiation Facility is an EPICS-based distributed system. One of the key techniques in constructing the system is device-level control. The authors describe the design and implementation of the DeviceNet-based device controller. A prototype of the device controller was tested in experiments on a magnet power supply, and the result showed a precision of 3 x 10^-5.

  4. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-26

    Mass spectrometry (MS)-based proteomics is a widely used and powerful tool for profiling systems-wide protein expression changes. It can be applied for various purposes, e.g. biomarker discovery in diseases and study of drug responses. Although RNA-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, the evidences they provide are indirect. Furthermore, RNA and corresponding protein levels have been known to have poor correlation. On the other hand, MS-based proteomics tend to have consistency issues (poor reproducibility and inter-sample agreement) and coverage issues (inability to detect the entire proteome) that need to be urgently addressed. In this talk, I will discuss how these issues can be addressed by proteomic profile analysis techniques that use biological networks (especially protein complexes) as the biological context. In particular, I will describe several techniques that we have been developing for network-based analysis of proteomics profile. And I will present evidence that these techniques are useful in identifying proteomics-profile analysis results that are more consistent, more reproducible, and more biologically coherent, and that these techniques allow expansion of the detected proteome to uncover and/or discover novel proteins.

  5. Recurrence Quantifcation Analysis of Sentence-Level Speech Kinematics

    Science.gov (United States)

    Jackson, Eric S.; Tiede, Mark; Riley, Michael A.; Whalen, D. H.

    2016-01-01

    Purpose: Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach, recurrence quantification analysis (RQA), via a procedural example…
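
    A minimal sketch of the core RQA computation on a 1-D kinematic trajectory is given below; the embedding dimension, delay, and radius are illustrative choices, not the settings used in the study:

        # Sketch: recurrence matrix and recurrence rate for a 1-D trajectory.
        import numpy as np

        def recurrence_rate(x, dim=3, delay=5, radius=0.1):
            n = len(x) - (dim - 1) * delay
            # Delay-embed the trajectory into dim-dimensional state vectors.
            states = np.column_stack([x[i * delay : i * delay + n] for i in range(dim)])
            # Two states "recur" if their distance falls within the radius.
            dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=2)
            return (dists < radius).mean()  # fraction of recurrent point pairs

        t = np.linspace(0.0, 10.0 * np.pi, 500)
        print(recurrence_rate(np.sin(t)))  # a periodic signal recurs often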

  6. Waste analysis plan for the low-level burial grounds

    International Nuclear Information System (INIS)

    Barnes, B.M.

    1996-01-01

    This waste analysis plan (WAP) has been prepared for the Low-Level Burial Grounds that are located in the 200 East and 200 West Areas of the Hanford Facility, Richland, Washington. This WAP documents the methods used to characterize and obtain and analyze representative samples of waste managed at this unit

  7. Waste analysis plan for the low-level burial grounds

    Energy Technology Data Exchange (ETDEWEB)

    Haas, C.R.

    1996-09-19

    This waste analysis plan (WAP) has been prepared for the Low-Level Burial Grounds (LLBG) which are located in the 200 East and West Areas of the Hanford Facility, Richland, Washington. This WAP documents the methods used to characterize, and obtain and analyze representative samples of waste managed at this unit.

  8. Detecting bots using multi-level traffic analysis

    DEFF Research Database (Denmark)

    Stevanovic, Matija; Pedersen, Jens Myrup

    2016-01-01

    introduces a novel multi-level botnet detection approach that performs network traffic analysis of three protocols widely considered as the main carriers of botnet Command and Control (C&C) and attack traffic, i.e. TCP, UDP and DNS. The proposed method relies on supervised machine learning for identifying...

  9. A Minimum Cost Flow model for Level of Repair Analysis

    NARCIS (Netherlands)

    Basten, Robertus Johannes Ida; Schutten, Johannes M.J.; van der Heijden, Matthijs C.

    2008-01-01

    Given a product design and a repair network for capital goods, a level of repair analysis determines for each component in the product (1) whether it should be discarded or repaired upon failure and (2) at which location in the repair network to do this. In this paper, we show how the problem can be

  10. A minimum cost flow model for level of repair analysis

    NARCIS (Netherlands)

    Basten, Robertus Johannes Ida; van der Heijden, Matthijs C.; Schutten, Johannes M.J.

    2011-01-01

    Given a product design and a repair network for capital goods, a level of repair analysis determines for each component in the product (1) whether it should be discarded or repaired upon failure and (2) at which location in the repair network to do this. In this paper, we show how the problem can be

  11. Rule-based emergency action level monitor prototype

    International Nuclear Information System (INIS)

    Touchton, R.A.; Gunter, A.D.; Cain, D.

    1985-01-01

    In late 1983, the Electric Power Research Institute (EPRI) began a program to encourage and stimulate the development of artificial intelligence (AI) applications for the nuclear industry. Development of a rule-based emergency action level classification system prototype is discussed. The paper describes both the full prototype currently under development and the completed, simplified prototype

  12. Importance-performance analysis based SWOT analysis

    OpenAIRE

    Phadermrod, Boonyarat; Crowder, Richard M.; Wills, Gary B.

    2016-01-01

    SWOT analysis, a commonly used tool for strategic planning, is traditionally a form of brainstorming. Hence, it has been criticised that it is likely to reflect the subjective views of the individuals who participate in a brainstorming session, and that SWOT factors are not prioritized by their significance, so it may result in improper strategic action. While most studies of SWOT analysis have only focused on solving these shortcomings separately, this study offers an approach to diminish both s...

  13. Adaptable Value-Set Analysis for Low-Level Code

    OpenAIRE

    Brauer, Jörg; Hansen, René Rydhof; Kowalewski, Stefan; Larsen, Kim G.; Olesen, Mads Chr.

    2012-01-01

    This paper presents a framework for binary code analysis that uses only SAT-based algorithms. Within the framework, incremental SAT solving is used to perform a form of weakly relational value-set analysis in a novel way, connecting the expressiveness of the value sets to computational complexity. Another key feature of our framework is that it translates the semantics of binary code into an intermediate representation. This allows for a straightforward translation of the program semantics in...

  14. Mathematics creative thinking levels based on interpersonal intelligence

    Science.gov (United States)

    Kuncorowati, R. H.; Mardiyana; Saputro, D. R. S.

    2017-12-01

    Creative thinking ability is a student's capacity to devise various alternative solutions to a mathematics problem. One indicator related to creative thinking ability is interpersonal intelligence: a student's interpersonal intelligence influences the student's creativity. This research aimed to analyze the creative thinking ability levels of junior high school students in Karanganyar using a descriptive method. Data were collected by test, questionnaire, interview, and documentation. The results showed that students with high interpersonal intelligence achieved the third and fourth levels of creative thinking ability, students with moderate interpersonal intelligence achieved the second level, and students with low interpersonal intelligence achieved the first and zeroth levels. Hence, students with high, moderate, and low interpersonal intelligence solve mathematics problems according to their mathematics creative thinking ability.

  15. NHS-based Tandem Mass Tagging of Proteins at the Level of Whole Cells: A Critical Evaluation in Comparison to Conventional TMT-Labeling Approaches for Quantitative Proteome Analysis.

    Science.gov (United States)

    Megger, Dominik A; Pott, Leona L; Rosowski, Kristin; Zülch, Birgit; Tautges, Stephanie; Bracht, Thilo; Sitek, Barbara

    2017-01-01

    Tandem mass tags (TMT) are usually introduced at the levels of isolated proteins or peptides. Here, for the first time, we report the labeling of whole cells and a critical evaluation of its performance in comparison to conventional labeling approaches. The obtained results indicated that TMT protein labeling using intact cells is generally possible, if it is coupled to a subsequent enrichment using anti-TMT antibody. The quantitative results were similar to those obtained after labeling of isolated proteins and both were found to be slightly complementary to peptide labeling. Furthermore, when using NHS-based TMT, no specificity towards cell surface proteins was observed in the case of cell labeling. In summary, the conducted study revealed first evidence for the general possibility of TMT cell labeling and highlighted limitations of NHS-based labeling reagents. Future studies should therefore focus on the synthesis and investigation of membrane impermeable TMTs to increase specificity towards cell surface proteins.

  16. Numerical analysis for prediction of fatigue crack opening level

    International Nuclear Information System (INIS)

    Choi, Hyeon Chang

    2004-01-01

    Finite element analysis (FEA) is the most popular numerical method for simulating plasticity-induced fatigue crack closure and can predict fatigue crack closure behavior. Finite element analysis under the plane stress state using 4-node isoparametric elements is performed to investigate the detailed closure behavior of fatigue cracks, and the numerical results are compared with experimental results. A mesh of constant-size elements on the crack surface cannot correctly predict the opening level of a fatigue crack, as shown in previous works. The crack opening behavior for a mesh with linearly varying element size shows an almost flat stress level after the crack tip has passed the monotonic plastic zone. The predicted crack opening level agrees well with published experimental data regardless of stress ratio when the elements are sized in proportion to the reversed plastic zone determined from the opening stress intensity factors. Numerical interpolation of the finite element results can precisely predict the crack opening level. This method shows good agreement with the experimental data regardless of stress ratio and material.

  17. Information Flow Analysis of Level 4 Payload Processing Operations

    Science.gov (United States)

    Danz, Mary E.

    1991-01-01

    The Level 4 Mission Sequence Test (MST) was studied to develop strategies and recommendations to facilitate information flow. Recommendations developed as a result of this study include a revised format for the Test and Assembly Procedure (TAP) document and a conceptual software-based system to assist in the management of information flow during the MST.

  18. Levels and Patterns in the Analysis of the Organizational Culture

    Directory of Open Access Journals (Sweden)

    Mariana Aida Cimpeanu

    2011-05-01

    Full Text Available Knowledge and analysis of the component elements of organizational culture help us greatly understand the respective culture, establish the main guidelines of company values and understand the behaviours and attitudes of employees. M. Thevenet identifies two levels at which culture manifests itself: the external level – the outside culture (local, regional or national culture) – and the inner level – the internal culture (organizational culture, professional culture, the culture of a group). Starting from this assumption, one can identify the main components of organizational culture: founders, the organization’s history, values, beliefs and symbols, the way of thinking, standards of behaviour, etc. Some of these are visible, forming a surface cultural foundation, while others create a less visible foundation of culture – the hidden level. Kotter and Heskett agree that these two levels of analysis are closely connected and influence each other. Considering their importance, other authors identify three, four or more levels of culture (Denison, Hofstede, Schein), bringing forth first the values, then the rituals, heroes and symbols. Different models of culture analysis help us explain the elements of culture and understand their importance by providing researchers with a starting point in explaining specific aspects of organizational culture and organizational behaviour. By understanding the organizational culture, the members of an organization are able to shape their behaviour, recognize their rights and obligations inside the company and the style of internal communication. They can determine the style of clothing and the dominant attitude inside the company, the way in which the management defines and implements its decisions, and the staff policy.

  19. Aluminum Level in Infants’ Powdered Milk Based Formulae

    Directory of Open Access Journals (Sweden)

    Ahmed Abdel-Hameid Ahmed

    2016-10-01

    Full Text Available The aluminum (Al) level in infant formula was determined to assess its public health significance and to suggest recommendations for avoiding such contamination. Hence, fifty random samples of infant powdered-milk-based formulae were collected from different markets and pharmacies in Assiut Governorate, Egypt. These samples were digested, and the Al level was determined using a high-resolution continuum-source atomic absorption spectrophotometer (HR-CS AAS) and compared with the maximum permissible limit (MPL). About 90% of the examined infant formula samples contained Al, with an average value of 0.145 mg/L, and 8% of the samples were above the MPL.

  20. Understanding extreme sea levels for coastal impact and adaptation analysis

    Science.gov (United States)

    Wahl, T.; Haigh, I. D.; Nicholls, R. J.; Arns, A.; Hinkel, J.; Dangendorf, S.; Slangen, A.

    2016-12-01

    Coastal impact and adaptation assessments require detailed knowledge of extreme sea levels, because increasing damage due to extreme events, such as storm surges and tropical cyclones, is one of the major consequences of sea level rise and climate change. In fact, the IPCC has highlighted in its AR4 report that "societal impacts of sea level change primarily occur via the extreme levels rather than as a direct consequence of mean sea level changes". Over the last few decades, substantial research efforts have been directed towards improved understanding of past and future mean sea level; different scenarios were developed with process-based or semi-empirical models and used for coastal impact assessments at various spatial scales to guide coastal management and adaptation efforts. The uncertainties in future sea level rise are typically accounted for by analyzing the impacts associated with a range of scenarios leading to a vertical displacement of the distribution of extreme sea levels. Indeed, most regional and global studies find little or no evidence for changes in storminess with climate change, although there is still low confidence in the results. However, and much more importantly, there is still a limited understanding of present-day extreme sea levels, which is largely ignored in most impact and adaptation analyses. The two key uncertainties stem from: (1) the numerical models that are used to generate long time series of extreme sea levels; the bias of these models varies spatially and can reach values much larger than the expected sea level rise, but it can be accounted for in most regions by making use of in-situ measurements; (2) the statistical models used for determining present-day extreme sea level exceedance probabilities; there is no universally accepted approach to obtaining such values for flood risk assessments, and while substantial research has explored inter-model uncertainties for mean sea level, we explore here, for the first time, inter

  1. Statistical analysis of global surface air temperature and sea level using cointegration methods

    DEFF Research Database (Denmark)

    Schmith, Torben; Johansen, Søren; Thejll, Peter

    Global sea levels are rising, which is widely understood as a consequence of thermal expansion and the melting of glaciers and land-based ice caps. Because physically-based models have been unable to simulate observed sea level trends, semi-empirical models have been applied as an alternative for projecting...... of future sea levels. There are in this, however, potential pitfalls due to the trending nature of the time series. We apply a statistical method called cointegration analysis, which is capable of handling such peculiarities, to observed global sea level and surface air temperature. We find a relationship between sea...... level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean. We further find that the warming episode in the 1940s is exceptional in the sense that sea level and warming deviate from the expected...

  2. Level crossing analysis of Burgers equation in 1 + 1 dimensions

    International Nuclear Information System (INIS)

    Movahed, M Sadegh; Bahraminasab, A; Rezazadeh, H; Masoudi, A A

    2006-01-01

    We investigate the average frequency ν_α^+ of positive-slope crossings of the level u(x) − ū = α by the velocity field in the Burgers equation. The level crossing analysis in the inviscid limit and the total number of positive crossings of the velocity field before the creation of singularities are given. The main goal of this paper is to show that this quantity, ν_α^+, is a good measure of the fluctuations of velocity fields in Burgers turbulence
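
    For orientation, the up-crossing frequency of a level is classically given by Rice's formula; a hedged statement from general knowledge (not quoted from this abstract), assuming a statistically homogeneous field with joint probability density p(u, u') of the velocity and its spatial derivative u' = ∂u/∂x, is

        \nu_\alpha^{+} \;=\; \int_{0}^{\infty} u'\, p\!\left(u=\alpha,\; u'\right)\, \mathrm{d}u' ,

    i.e. the mean number of up-crossings per unit length counts the flux of positive slopes through the level u = α.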

  3. Electrically pumped graphene-based Landau-level laser

    Science.gov (United States)

    Brem, Samuel; Wendler, Florian; Winnerl, Stephan; Malic, Ermin

    2018-03-01

    Graphene exhibits a nonequidistant Landau quantization with tunable Landau-level (LL) transitions in the technologically desired terahertz spectral range. Here, we present a strategy for an electrically driven terahertz laser based on Landau-quantized graphene as the gain medium. Performing microscopic modeling of the coupled electron, phonon, and photon dynamics in such a laser, we reveal that an inter-LL population inversion can be achieved resulting in the emission of coherent terahertz radiation. The presented paper provides a concrete recipe for the experimental realization of tunable graphene-based terahertz laser systems.

  4. Skull defect reconstruction based on a new hybrid level set.

    Science.gov (United States)

    Zhang, Ziqun; Zhang, Ran; Song, Zhijian

    2014-01-01

    Skull defect reconstruction is an important aspect of surgical repair. Historically, a skull defect prosthesis was created by the mirroring technique, surface fitting, or formed templates. These methods are not based on the anatomy of the individual patient's skull, and therefore, the prosthesis cannot precisely correct the defect. This study presented a new hybrid level set model, taking into account both the global optimization region information and the local accuracy edge information, while avoiding re-initialization during the evolution of the level set function. Based on the new method, a skull defect was reconstructed, and the skull prosthesis was produced by rapid prototyping technology. This resulted in a skull defect prosthesis that well matched the skull defect with excellent individual adaptation.

  5. Hydrocoin level 3 - Testing methods for sensitivity/uncertainty analysis

    International Nuclear Information System (INIS)

    Grundfelt, B.; Lindbom, B.; Larsson, A.; Andersson, K.

    1991-01-01

    The HYDROCOIN study is an international cooperative project for testing groundwater hydrology modelling strategies for performance assessment of nuclear waste disposal. The study was initiated in 1984 by the Swedish Nuclear Power Inspectorate and the technical work was finalised in 1987. The participating organisations are regulatory authorities as well as implementing organisations in 10 countries. The study has been performed at three levels aimed at studying computer code verification, model validation and sensitivity/uncertainty analysis respectively. The results from the first two levels, code verification and model validation, have been published in reports in 1988 and 1990 respectively. This paper focuses on some aspects of the results from Level 3, sensitivity/uncertainty analysis, for which a final report is planned to be published during 1990. For Level 3, seven test cases were defined. Some of these aimed at exploring the uncertainty associated with the modelling results by simply varying parameter values and conceptual assumptions. In other test cases statistical sampling methods were applied. One of the test cases dealt with particle tracking and the uncertainty introduced by this type of post processing. The amount of results available is substantial although unevenly spread over the test cases. It has not been possible to cover all aspects of the results in this paper. Instead, the different methods applied will be illustrated by some typical analyses. 4 figs., 9 refs

  6. Base-stock level that minimizes cost in the single-order problem

    Directory of Open Access Journals (Sweden)

    Tanti Octavia

    2011-01-01

    Full Text Available The purpose of this research is to analyze the demand characteristics and inventory costs that potentially influence the determination of the base-stock level. An appropriate base-stock level maximizes profit and also reduces the stockout cost for fashion products, which face high demand uncertainty and a single-order problem. A case study is applied to a garment company with eight fashion product types that can be classified into short and long lifetimes. The effects of the coefficient of variation, salvage price, and stockout cost on profit and base-stock level are analyzed. It can be concluded that demand variability does not affect the base-stock level, while stockout cost and salvage price do.
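
    Since the single-order setting described here is the classical newsvendor problem, a minimal sketch of computing a base-stock level from the critical fractile is given below (a textbook construction with invented prices and demand parameters, not the paper's own model):

        import numpy as np
        from scipy.stats import norm

        # Hypothetical single-order (newsvendor) parameters
        price, cost, salvage, stockout_penalty = 25.0, 10.0, 4.0, 6.0
        mu, sigma = 500.0, 150.0   # assumed normal demand

        # Underage cost: lost margin plus stockout penalty; overage cost: loss on salvage
        cu = price - cost + stockout_penalty
        co = cost - salvage
        critical_fractile = cu / (cu + co)

        # Base-stock level that maximizes expected profit / minimizes expected cost
        base_stock = norm.ppf(critical_fractile, loc=mu, scale=sigma)
        print(f"critical fractile = {critical_fractile:.3f}, base-stock = {base_stock:.0f}")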

  7. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on results obtained through a critical review of the existing methodologies for the analysis of uncertainties employed in the Level 2 PSA, especially the Accident Progression Event Tree (APET). Uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., to the analyst's subjective probabilities of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 PSA and Level 3 PSA uncertainties. The uncertainty analysis methodologies and implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, a methodology for the quantification of APET uncertainty inputs, and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for the quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes

  8. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others

    2003-07-01

    This report provides guidance on conducting a Level 1 PSA for internal events in NPPs, based on the method and procedure used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level 1 PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been directly used for assessing and modifying system safety and reliability as a key, basic part of PSA. Level 1 PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level 1 PSA has therefore been used as the essential technical basis for risk-informed applications in NPPs. The report comprises six major procedural steps for a Level 1 PSA: plant familiarization, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level 1 PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. The report would also be useful for managers and regulatory personnel involved in risk-informed regulation, as well as for conducting PSAs in other industries.
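
    As a toy illustration of the quantification steps listed above (invented gate structure and basic-event probabilities, not taken from the report):

        # Minimal fault-tree quantification with independent basic events (hypothetical numbers)
        def p_and(*ps):   # AND gate: all inputs fail
            out = 1.0
            for p in ps:
                out *= p
            return out

        def p_or(*ps):    # OR gate: at least one input fails (via complements)
            out = 1.0
            for p in ps:
                out *= (1.0 - p)
            return 1.0 - out

        pump_a, pump_b, valve, power = 1e-3, 1e-3, 5e-4, 2e-5
        # Top event: loss of injection = (both pumps fail) OR valve fails OR power fails
        top = p_or(p_and(pump_a, pump_b), valve, power)
        print(f"P(top) = {top:.2e}")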

  9. Shape-based grey-level image interpolation

    International Nuclear Information System (INIS)

    Keh-Shih Chuang; Chun-Yuan Chen; Ching-Kai Yeh

    1999-01-01

    The three-dimensional (3D) object data obtained from a CT scanner usually have unequal sampling frequencies in the x-, y- and z-directions. Generally, the 3D data are first interpolated between slices to obtain isotropic resolution, reconstructed, and then operated on using object extraction and display algorithms. Traditional grey-level interpolation introduces a layer of intermediate substance and is not suitable for objects that differ markedly from the surrounding background. The shape-based interpolation method maps a pixel location to a parameter related to the object shape, and the interpolation is performed on that parameter. This process achieves a better interpolation, but its application is limited to binary images. In this paper, we present an improved shape-based interpolation method for grey-level images. The new method uses a polygon to approximate the object shape and performs the interpolation using the polygon vertices as references. Binary images representing the shape of the object are first generated via image segmentation of the source images. The target object binary image is then created using regular shape-based interpolation. The polygon enclosing the object for each slice can be generated from the shape of that slice. We determine the relative location in the source slices of each pixel inside the target polygon using the polygon vertices as references. The target slice grey level is interpolated from the corresponding source image pixels. The image quality of this interpolation method is better, and the mean squared difference is smaller, than with traditional grey-level interpolation. (author)

  10. Shape-based interpolation of multidimensional grey-level images

    International Nuclear Information System (INIS)

    Grevera, G.J.; Udupa, J.K.

    1996-01-01

    Shape-based interpolation as applied to binary images causes the interpolation process to be influenced by the shape of the object. It accomplishes this by first applying a distance transform to the data. This results in the creation of a grey-level data set in which the value at each point represents the minimum distance from that point to the surface of the object. (By convention, points inside the object are assigned positive values; points outside are assigned negative values.) This distance transformed data set is then interpolated using linear or higher-order interpolation and is then thresholded at a distance value of zero to produce the interpolated binary data set. In this paper, the authors describe a new method that extends shape-based interpolation to grey-level input data sets. This generalization consists of first lifting the n-dimensional (n-D) image data to represent it as a surface, or equivalently as a binary image, in an (n + 1)-dimensional [(n + 1)-D] space. The binary shape-based method is then applied to this image to create an (n + 1)-D binary interpolated image. Finally, this image is collapsed (inverse of lifting) to create the n-D interpolated grey-level data set. The authors have conducted several evaluation studies involving patient computed tomography (CT) and magnetic resonance (MR) data as well as mathematical phantoms. They all indicate that the new method produces more accurate results than commonly used grey-level linear interpolation methods, although at the cost of increased computation
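
    A minimal sketch of the binary shape-based interpolation step described above (an illustrative SciPy reimplementation, not the authors' code):

        import numpy as np
        from scipy.ndimage import distance_transform_edt

        def signed_distance(mask):
            """Positive inside the object, negative outside (the convention above)."""
            mask = mask.astype(bool)
            return distance_transform_edt(mask) - distance_transform_edt(~mask)

        def shape_based_slice(mask_a, mask_b, t=0.5):
            """Interpolate an intermediate binary slice between two slices at fraction t."""
            d = (1.0 - t) * signed_distance(mask_a) + t * signed_distance(mask_b)
            return d >= 0.0   # threshold the interpolated distance field at zero

    The grey-level generalization then lifts the n-D grey image to a binary set in (n + 1)-D, applies the same construction, and collapses the result back to n-D.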

  11. System based practice: a concept analysis

    Directory of Open Access Journals (Sweden)

    SHAHRAM YAZDANI

    2016-04-01

    Full Text Available Introduction: Systems-Based Practice (SBP) is one of the six competencies introduced by the ACGME for physicians to provide high-quality care, and also the most challenging of them to perform, train, and evaluate in medical students. This concept analysis clarifies the concept of SBP by identifying its components, making it possible to differentiate it from similar concepts. For proper training of SBP and to ensure this competency in physicians, an operational definition is necessary, and SBP's components must be precisely defined in order to provide valid and reliable assessment tools. Methods: Walker & Avant's approach to concept analysis was performed in eight stages: choosing a concept, determining the purpose of analysis, identifying all uses of the concept, defining attributes, identifying a model case, identifying borderline, related, and contrary cases, identifying antecedents and consequences, and defining empirical referents. Results: Based on the analysis undertaken, the attributes of SBP include knowledge of the system, balanced decisions between patients' needs and system goals, effective role playing in the interprofessional health care team, system-level health advocacy, and acting for system improvement. System thinking and a functional system are antecedents, and system goals are consequences. A model case, as well as borderline and contrary cases of SBP, has been introduced. Conclusion: The identification of SBP attributes in this study contributes to the body of knowledge on SBP and reduces the ambiguity of this concept, making it possible to apply it in the training of different medical specialties. It would also be possible to develop and use more precise tools to evaluate SBP competency by using the empirical referents of the analysis.

  12. Statistical analysis of global surface temperature and sea level using cointegration methods

    DEFF Research Database (Denmark)

    Schmidt, Torben; Johansen, Søren; Thejll, Peter

    2012-01-01

    Global sea levels are rising, which is widely understood as a consequence of thermal expansion and the melting of glaciers and land-based ice caps. Due to the lack of representation of ice-sheet dynamics, present-day physically-based climate models are unable to simulate observed sea level trends......, and semi-empirical models have been applied as an alternative for projecting future sea levels. There are in this, however, potential pitfalls due to the trending nature of the time series. We apply a statistical method called cointegration analysis, which is capable of handling such peculiarities, to observed global sea level and land-ocean surface air...... temperature. We find a relationship between sea level and temperature and find that temperature causally depends on the sea level, which can be understood as a consequence of the large heat capacity of the ocean. We further find that the warming episode in the 1940s......
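
    As an illustration of the statistical machinery, a generic Engle–Granger cointegration test on synthetic data might look like this (a statsmodels sketch, not the authors' analysis):

        import numpy as np
        from statsmodels.tsa.stattools import coint

        rng = np.random.default_rng(0)
        # Synthetic example: two trending series sharing a common stochastic trend
        trend = np.cumsum(rng.normal(size=500))          # random-walk "temperature"
        sea_level = 0.8 * trend + rng.normal(size=500)   # cointegrated with the trend

        t_stat, p_value, crit = coint(sea_level, trend)  # Engle-Granger two-step test
        print(f"t = {t_stat:.2f}, p = {p_value:.3f}")    # small p suggests cointegration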

  13. Pilot task-based assessment of noise levels among firefighters.

    Science.gov (United States)

    Neitzel, Rl; Hong, O; Quinlan, P; Hulea, R

    2013-11-01

    Over one million American firefighters are routinely exposed to various occupational hazards. While efforts have been made to identify and reduce some causes of injuries and illnesses among firefighters, relatively little has been done to evaluate and understand occupational noise exposures in this group. The purpose of this pilot study was to apply a task-based noise exposure assessment methodology to firefighting operations to evaluate potential noise exposure sources, and to use the collected task-based noise levels to create noise exposure estimates for evaluating the risk of noise-induced hearing loss by comparison with the 8-hr and 24-hr recommended exposure limits (RELs) for noise of 85 and 80.3 dBA, respectively. Task-based noise exposures (n=100 measurements) were measured in three different fire departments (a rural department in Southeast Michigan and suburban and urban departments in Northern California). These levels were then combined with time-at-task information collected from firefighters to estimate 8-hr noise exposures for the rural and suburban fire departments (n=6 estimates for each department). Data from 24-hr dosimetry measurements and crude self-reported activity categories from the urban fire department (n=4 measurements) were used to create 24-hr exposure estimates to evaluate the bias associated with the task-based estimates. Task-based noise levels were found to range from 82-109 dBA, with the highest levels resulting from the use of saws and pneumatic chisels. Some short (e.g., 30 min) sequences of common tasks were found to result in nearly an entire allowable daily exposure. The majority of estimated 8-hr and 24-hr exposures exceeded the relevant recommended exposure limit. Predicted 24-hr exposures showed substantial imprecision in some cases, suggesting the need for increased task specificity. The results indicate potential for overexposure to noise from a variety of firefighting tasks and equipment, and suggest a need for further
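
    The 8-hr estimates described here combine task levels and durations on an energy basis; a minimal sketch (3-dB exchange rate as in the NIOSH REL scheme; the task levels and durations are invented examples):

        import numpy as np

        def eight_hour_leq(levels_dba, durations_hr):
            """Equivalent continuous level over an 8-h shift (3-dB exchange rate)."""
            levels = np.asarray(levels_dba, dtype=float)
            hours = np.asarray(durations_hr, dtype=float)
            energy = np.sum(hours * 10.0 ** (levels / 10.0))   # acoustic-energy dose
            return 10.0 * np.log10(energy / 8.0)               # normalize to 8 h

        # Hypothetical task profile: saw use, pneumatic chisel, station duties
        print(f"LAeq,8h = {eight_hour_leq([105, 95, 70], [0.5, 1.0, 6.5]):.1f} dBA")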

  14. Patterns of debate in tertiary level asynchronous text-based conferencing

    OpenAIRE

    Coffin, Caroline; Painter, Clare; Hewings, Ann

    2005-01-01

    Argumentation can be defined at different levels and serve different purposes, but its role in knowledge understanding and construction has given it a central place in education, particularly at tertiary level. The advent of computer-supported text-based conferences has created new sites where such educational dialogues can take place, but the quality of the interaction and whether it is serving its educational purpose is still uncertain. This paper reports on a framework of analysis that has...

  15. Meso-level analysis, the missing link in energy strategies

    International Nuclear Information System (INIS)

    Schenk, Niels J.; Moll, Henri C.; Schoot Uiterkamp, Anton J.M.

    2007-01-01

    Energy is essential for human societies. Energy systems, though, are also associated with several adverse environmental effects. So far societies have been unable to successfully change their energy systems in a way that addresses environmental and health concerns. Lack of policy consensus often resulted in so-called 'stop-go' policies, which were identified as some of the most important barriers regarding successful energy transitions. The lack of policy consensus and coherent long-term strategies may result from a lack of knowledge of energy systems' meso-level dynamics. The meso-level involves the dynamic behaviour of the individual system elements and the coupling of individual technologies, resulting in interdependencies and regimes. Energy systems are at the meso-level characterised by two typical aspects, i.e. dynamics driven by interactions between actors, and heterogeneous characteristics of actors. These aspects give rise to the ineffectiveness of traditional energy policies, which is illustrated with examples from the transport sector and household electricity consumption. We found that analysis of energy systems at the meso-level helps to better understand energy systems. To resolve persistent policy issues, the traditional 'one size fits all' energy policies are not sufficient. In order to tackle the difficult issues, 'redesign of system organisation', 'target group approach', or 'target group induced system re-orientation' are needed

  16. Adjusting game difficulty level through Formal Concept Analysis

    Science.gov (United States)

    Gómez-Martín, Marco A.; Gómez-Martín, Pedro P.; Gonzâlez-Calero, Pedro A.; Díaz-Agudo, Belén

    In order to reach as many players as possible, videogames usually allow the user to choose the difficulty level. To do so, game designers have to decide the values that certain game parameters will take depending on that choice. In simple videogames this is almost trivial: minesweeper is harder with larger board sizes and more mines. In more complex games, game designers may take advantage of data mining to establish which of all the possible parameters will positively affect the player experience. This paper describes the use of Formal Concept Analysis to help balance the game using the logs obtained in tests made prior to the release of the game.

  17. Special I/O routine based on the BSAM level

    International Nuclear Information System (INIS)

    Okamoto, Masao; Takizuka, Tomonori; Wada, Yoshiyuki; Okada, Takamitsu.

    1977-10-01

    A special I/O routine, ''FORTXDAM'', has been developed based on the BSAM level. FORTXDAM is useful for input/output of large quantities of data, random (direct) access, use of double (or multiple) buffering, and asynchronous input/output. Written in the FASP language available for the FACOM 230-75 computer system of JAERI, it consists of six basic sub-programs which can be called from FORTRAN. FORTXDAM is especially useful in large-scale computer simulations in thermonuclear fusion and plasma physics research. (auth.)

  18. Fitness levels with tail bounds for the analysis of randomized search heuristics

    DEFF Research Database (Denmark)

    Witt, Carsten

    2014-01-01

    The fitness-level method, also called the method of f-based partitions, is an intuitive and widely used technique for the running time analysis of randomized search heuristics. It was originally defined to prove upper and lower bounds on the expected running time. Recently, upper tail bounds were...
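
    For context, the classical fitness-level upper bound can be stated as follows (a standard result quoted from general knowledge, not from this abstract): if the search space is partitioned into sets A_1, ..., A_m strictly ordered by fitness, and s_i is a lower bound on the probability that the heuristic leaves level A_i towards a higher level in one iteration, then the expected running time T until the top level A_m is first reached satisfies

        \mathbb{E}[T] \;\le\; \sum_{i=1}^{m-1} \frac{1}{s_i} .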

  19. A state-level analysis of the economic impacts of medical tourism in Malaysia

    OpenAIRE

    Klijs, J.; Ormond, M.E.; Mainil, T.; Peerlings, J.H.M.; Heijman, W.J.M.

    2016-01-01

    In Malaysia, a country that ranks among the world's most recognised medical tourism destinations, medical tourism is identified as a potential economic growth engine for both medical and non-medical sectors. A state-level analysis of economic impacts is important, given differences between states in economic profiles and numbers, origins, and expenditure of medical tourists. We applied input–output (I–O) analysis, based on state-specific I–O data and disaggregated foreign patient data. The an...
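
    The core of an I–O impact calculation is the Leontief inverse; a minimal generic sketch (with an invented two-sector technology matrix, not the Malaysian data) is:

        import numpy as np

        # Hypothetical 2-sector technical-coefficients matrix A (inputs per unit of output)
        A = np.array([[0.20, 0.10],
                      [0.05, 0.30]])
        # Additional final demand from medical tourists, by sector
        delta_f = np.array([10.0, 2.0])

        # Leontief inverse: total output required per unit of final demand
        L = np.linalg.inv(np.eye(2) - A)
        delta_x = L @ delta_f        # total (direct + indirect) output impact
        print(delta_x)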

  20. Wavelet Packet Transform Based Driver Distraction Level Classification Using EEG

    Directory of Open Access Journals (Sweden)

    Mousa Kadhim Wali

    2013-01-01

    Full Text Available We classify the driver distraction level (neutral, low, medium, and high) based on different wavelets and classifiers using wireless electroencephalogram (EEG) signals. 50 subjects were used for data collection with 14 electrodes. We considered four distraction stimuli for this research: Global Positioning System (GPS), music player, short message service (SMS), and mental tasks. The amplitude spectra of three frequency bands of the EEG signals (theta, alpha, and beta) were derived based on a fusion of the discrete wavelet packet transform (DWPT) and the FFT. The results of three different classifiers (subtractive fuzzy clustering, probabilistic neural network, and k-nearest neighbor) were compared based on spectral centroid and power spectral features extracted with different wavelets (db4, db8, sym8, and coif5). The results of this study indicate that the best average accuracy, 79.21%, was achieved by the subtractive fuzzy inference system classifier with the power spectral density feature extracted by the sym8 wavelet, which gave good class discrimination under an ANOVA test.
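
    A minimal sketch of such a wavelet-packet band-power feature extractor (an illustrative reimplementation with PyWavelets and invented parameters, not the authors' code; channel handling, theta/alpha/beta band selection, and the classifiers are omitted):

        import numpy as np
        import pywt

        def band_power_features(eeg, wavelet="sym8", level=5):
            """Rough DWPT+FFT band-power features for one EEG channel (illustrative)."""
            wp = pywt.WaveletPacket(data=eeg, wavelet=wavelet,
                                    mode="symmetric", maxlevel=level)
            feats = []
            for node in wp.get_level(level, order="freq"):   # leaves sorted by frequency
                spectrum = np.abs(np.fft.rfft(node.data)) ** 2
                feats.append(spectrum.mean())                # mean power per subband
            return np.array(feats)

        # Hypothetical usage on a synthetic 10 s signal sampled at 128 Hz
        x = np.random.randn(1280)
        print(band_power_features(x).shape)   # one feature per terminal subband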

  1. EMPLOYMENT LEVEL ANALYSIS FROM THE DETERMINANT FACTORS PERSPECTIVE

    Directory of Open Access Journals (Sweden)

    Elena Diana ŞERB

    2016-02-01

    Full Text Available Neglecting the human factor in the labor market causes losses for society, since any activity initiated within it has human intervention as both its starting point and its finishing point. The starting point of the article is represented by the projections made by the European Commission in the 2015 Ageing Report (underlying assumptions and projections), and also by the projections of the 2015 United Nations report; among the many resulting conclusions is that, for the first time, the average ageing in Romania in 2015 exceeds the values measured in the EU to date, and this is reflected in the employment level (the active ageing population). The hypothesis behind the article is that the evolution of the population and of migrants has repercussions on employment. Structured in three parts (the state of knowledge, the analysis of employment indicators, and information about the intensity and direction of the link between a number of factors and the employment level), this article aims to establish the determinant factors of employment through research focused on the analysis of secondary sources, also using the regression model. The most important lesson learned as a result of this research is that the labor market works with a variety of factors with higher or lower influence, and in turn the labor market influences other factors.
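
    The regression step can be illustrated generically (a statsmodels sketch on synthetic data with invented determinants, not the article's dataset):

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(42)
        # Invented determinants: e.g. GDP growth, migration rate, average age
        X = rng.normal(size=(40, 3))
        employment = 60 + X @ np.array([1.5, -0.8, -0.4]) + rng.normal(scale=1.0, size=40)

        model = sm.OLS(employment, sm.add_constant(X)).fit()
        print(model.params)   # intercept and the three estimated coefficients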

  2. Data Extraction Based on Page Structure Analysis

    Directory of Open Access Journals (Sweden)

    Ren Yichao

    2017-01-01

    Full Text Available The information we need suffers from problems such as dispersion and differing organizational structures. In addition, because of the existence of unstructured data such as natural language and images, extracting local page content is extremely difficult. In light of the problems above, this article applies a method combining a page structure analysis algorithm with a page data extraction algorithm to accomplish the gathering of network data. In this way, the problem that traditional complex extraction models perform poorly when dealing with large-scale data is solved, and page data extraction efficiency is boosted to a new level. In the meantime, the article also compares pages and content of different types between methods based on the DOM structure of the page and on the regularities of HTML distribution.
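
    A minimal illustration of DOM-structure-based extraction (a generic BeautifulSoup sketch; the URL and tag choices are hypothetical, and this is not the authors' algorithm):

        from urllib.request import urlopen
        from bs4 import BeautifulSoup

        html = urlopen("https://example.com/article").read()   # hypothetical page
        soup = BeautifulSoup(html, "html.parser")

        # Walk the DOM and keep the text of content-bearing nodes only
        records = []
        for node in soup.find_all(["h1", "h2", "p"]):
            text = node.get_text(strip=True)
            if text:                         # skip empty structural nodes
                records.append((node.name, text))
        print(records[:5])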

  3. Arctic sea-level reconstruction analysis using recent satellite altimetry

    DEFF Research Database (Denmark)

    Svendsen, Peter Limkilde; Andersen, Ole Baltazar; Nielsen, Allan Aasbjerg

    2014-01-01

    We present a sea-level reconstruction for the Arctic Ocean using recent satellite altimetry data. The model, forced by historical tide gauge data, is based on empirical orthogonal functions (EOFs) from a calibration period; for this purpose, newly retracked satellite altimetry from ERS-1 and -2...... and Envisat has been used. Despite the limited coverage of these datasets, we have made a reconstruction up to 82 degrees north for the period 1950–2010. We place particular emphasis on determining appropriate preprocessing for the tide gauge data, and on validation of the model, including the ability...

  4. Size stratification in a Gilbert delta due to a varying base level: flume experiments.

    Science.gov (United States)

    Chavarrias, Victor; Orru, Clara; Viparelli, Enrica; Vide, Juan Pedro Martin; Blom, Astrid

    2014-05-01

    A foreset-dominated Gilbert delta is a delta that is dominated by sediment avalanches (i.e., discontinuous grain flows) over its front. It forms when a river flows into a basin or sea characterized by a flow depth that is much larger than the one in the fluvial reach, and the conditions are such that the transported sediment passing the brinkpoint forms a wedge at the topmost part of the foreset, which results in avalanches down the foreset and a fining upward pattern within the foreset deposit. A Gilbert delta is typically described in terms of a low-slope topset (resulting from deposition over the fluvial reach), a steep-slope foreset (resulting from sediment avalanches over the lee face), and a bottomset (resulting from deposition of fine sediment passing the brinkpoint as suspended load). The objective of the present study is to gain insight into the mechanisms taking part in Gilbert delta formation and progradation under variable base level conditions. In order to do so, three flume experiments were conducted in which the water discharge and sediment feed rate were maintained constant but the base level varied between the experiments: (I) constant base level, (II) a gradually rising base level, and (III) a slowly varying base level. The stratigraphy within the delta deposit was measured using image analysis combined with particle coloring. A steady base level resulted in aggradation over the fluvial reach in order to maintain a slope required to transport the supplied sediment downstream. Sea level rise enhanced the amount of aggradation over the fluvial reach due to the presence of an M1 backwater curve. The aggrading flux to the substrate was slightly coarser than the fed sediment. The sediment at the base of the foreset deposit appeared to become coarser in streamwise direction. Eventually, a fall of the base level induced an M2 backwater curve over the fluvial reach that caused degradation of the fluvial reach. Base level fall first induced erosion of the

  5. Complex Wavelet Based Modulation Analysis

    DEFF Research Database (Denmark)

    Luneau, Jean-Marc; Lebrun, Jérôme; Jensen, Søren Holdt

    2008-01-01

    Low-frequency modulation of sound carries important information for speech and music. The modulation spectrum is commonly obtained by spectral analysis of the sole temporal envelopes of the sub-bands out of a time-frequency analysis. Processing in this domain usually creates undesirable distortions...... polynomial trends. Moreover, an analytic Hilbert-like transform is possible with complex wavelets implemented as an orthogonal filter bank. By working in an alternative transform domain coined as “Modulation Subbands”, this transform shows very promising denoising capabilities and suggests new approaches for joint...

  6. Recharge signal identification based on groundwater level observations.

    Science.gov (United States)

    Yu, Hwa-Lung; Chu, Hone-Jay

    2012-10-01

    This study applied a rotated empirical orthogonal function (REOF) method to directly decompose the space-time groundwater level variations and determine the potential recharge zones by investigating the correlation between the identified groundwater signals and the observed local rainfall records. The approach is used to analyze the spatiotemporal process of piezometric heads estimated by the Bayesian maximum entropy method from monthly observations of 45 wells in 1999-2007 located in the Pingtung Plain of Taiwan. From the results, the primary potential recharge area is located at the proximal fan areas, where the recharge process accounts for 88% of the spatiotemporal variation of piezometric heads in the study area. The decomposition of groundwater levels associated with rainfall can provide information on the recharge process, since rainfall is an important contributor to groundwater recharge in semi-arid regions. Correlation analysis shows that the identified recharge closely associates with the temporal variation of the local precipitation, with a delay of 1-2 months in the study area.
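
    The EOF machinery underlying such a decomposition can be sketched with a plain SVD (synthetic data sized like the study's 108 monthly observations of 45 wells; the rotation step and the Bayesian maximum entropy estimation are omitted):

        import numpy as np

        # Synthetic space-time field: rows = months, columns = wells
        rng = np.random.default_rng(1)
        heads = rng.normal(size=(108, 45))          # 9 years x 12 months, 45 wells
        anomalies = heads - heads.mean(axis=0)      # remove the temporal mean per well

        # EOFs are the right singular vectors; PCs are the temporal coefficients
        U, S, Vt = np.linalg.svd(anomalies, full_matrices=False)
        pcs, eofs = U * S, Vt
        explained = S**2 / np.sum(S**2)
        print(f"leading mode explains {explained[0]:.1%} of the variance")

        # A recharge signal would then be screened by lag-correlating pcs[:, 0]
        # against the local rainfall series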

  7. CONTEXT BASED FOOD IMAGE ANALYSIS

    OpenAIRE

    He, Ye; Xu, Chang; Khanna, Nitin; Boushey, Carol J.; Delp, Edward J.

    2013-01-01

    We are developing a dietary assessment system that records daily food intake through the use of food images. Recognizing food in an image is difficult due to large visual variance with respect to eating or preparation conditions. This task becomes even more challenging when different foods have similar visual appearance. In this paper we propose to incorporate two types of contextual dietary information, food co-occurrence patterns and personalized learning models, in food image analysis to r...

  8. Resource variation in colorectal surgery: a national centre level analysis.

    Science.gov (United States)

    Drake, T M; Lee, M J; Senapati, A; Brown, S R

    2017-07-01

    Delivery of quality colorectal surgery requires adequate resources. We set out to assess the relationship between resources and outcomes in English colorectal units. Data were extracted from the Association of Coloproctology of Great Britain and Ireland resource questionnaire to profile resources. This was correlated with Hospital Episode Statistics outcome data, including 90-day mortality and readmissions. Patient satisfaction measures were extracted from the Cancer Patient Experience Survey and compared at unit level. Centres were divided by workload into low, middle and top tertiles. Completed questionnaires were received from 75 centres in England. Service resources were similar between the low and top tertiles in access to a Confidential Enquiry into Patient Outcome and Death (CEPOD) theatre, level two or three beds per 250 000 population, and the likelihood of having a dedicated colorectal ward. There was no difference in staffing levels per 250 000 unit of population. Each 10% increase in the proportion of cases attempted laparoscopically was associated with reduced 90-day unplanned readmission (relative risk 0.94, 95% CI 0.91-0.97). The presence of a dedicated colorectal ward (relative risk 0.85, 95% CI 0.73-0.99, P = 0.040) was also associated with a significant reduction in unplanned readmissions. There was no association between staffing or service factors and patient satisfaction. Resource levels do not vary based on unit of population. There is benefit associated with increased use of laparoscopy and a dedicated surgical ward. Alternative measures to assess the relationship between resources and outcome, such as failure to rescue, should be explored in UK practice. Colorectal Disease © 2017 The Association of Coloproctology of Great Britain and Ireland.

  9. A high-level language for rule-based modelling.

    Science.gov (United States)

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages.

  10. Low-level processing for real-time image analysis

    Science.gov (United States)

    Eskenazi, R.; Wilf, J. M.

    1979-01-01

    A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.
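
    As a minimal software analogue of the edge-map stage (a generic SciPy sketch, not the hardware pipeline described here):

        import numpy as np
        from scipy.ndimage import sobel

        def edge_map(image, threshold=0.2):
            """Gradient-magnitude edge map, normalized to [0, 1] (illustrative only)."""
            gx, gy = sobel(image, axis=0), sobel(image, axis=1)
            mag = np.hypot(gx, gy)
            mag /= mag.max() + 1e-12
            return mag > threshold    # binary edge map for later chain coding

        frame = np.random.rand(240, 320)   # stand-in for a TV frame
        print(edge_map(frame).sum(), "edge pixels")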

  11. Remote ignitability analysis of high-level radioactive waste

    International Nuclear Information System (INIS)

    Lundholm, C.W.; Morgan, J.M.; Shurtliff, R.M.; Trejo, L.E.

    1992-09-01

    The Idaho Chemical Processing Plant (ICPP) was used to reprocess nuclear fuel from government-owned reactors to recover the unused uranium-235. These processes generated highly radioactive liquid wastes, which are stored in large underground tanks prior to being calcined into a granular solid. The Resource Conservation and Recovery Act (RCRA) and state/federal clean air statutes require waste characterization of these high-level radioactive wastes for regulatory permitting and waste treatment purposes. The determination of the characteristic of ignitability is part of the required analyses prior to calcination and waste treatment. To perform this analysis in a radiologically safe manner, a remotely operated instrument was needed. The remote ignitability method and instrument will meet the 60 deg. C requirement prescribed for ignitability in Method 1020 of SW-846. The method for remote use will be equivalent to Method 1020 of SW-846

  12. Accelerating next generation sequencing data analysis with system level optimizations.

    Science.gov (United States)

    Kathiresan, Nagarajan; Temanni, Ramzi; Almabrazi, Hakeem; Syed, Najeeb; Jithesh, Puthen V; Al-Ali, Rashid

    2017-08-22

    Next generation sequencing (NGS) data analysis is highly compute intensive. In-memory computing, vectorization, bulk data transfer, and CPU frequency scaling are some of the hardware features in modern computing architectures. To get the best execution time and utilize these hardware features, it is necessary to tune the system-level parameters before running the application. We studied GATK HaplotypeCaller, which is part of common NGS workflows and consumes more than 43% of the total execution time. Multiple GATK 3.x versions were benchmarked, and the execution time of HaplotypeCaller was optimized via various system-level parameters, which included: (i) tuning the parallel garbage collection and kernel shared memory to simulate in-memory computing, (ii) architecture-specific tuning in the PairHMM library for vectorization, (iii) including Java 1.8 features through GATK source code compilation and building a runtime environment for parallel sorting and bulk data transfer, and (iv) switching the default 'on-demand' CPU frequency governor to 'performance' mode to accelerate the Java multi-threads. As a result, the HaplotypeCaller execution time was reduced by 82.66% in GATK 3.3 and 42.61% in GATK 3.7. Overall, the execution time of the NGS pipeline was reduced to 70.60% and 34.14% for GATK 3.3 and GATK 3.7, respectively.

  13. Recurrence Quantification Analysis of Sentence-Level Speech Kinematics.

    Science.gov (United States)

    Jackson, Eric S; Tiede, Mark; Riley, Michael A; Whalen, D H

    2016-12-01

    Current approaches to assessing sentence-level speech variability rely on measures that quantify variability across utterances and use normalization procedures that alter raw trajectory data. The current work tests the feasibility of a less restrictive nonlinear approach-recurrence quantification analysis (RQA)-via a procedural example and subsequent analysis of kinematic data. To test the feasibility of RQA, lip aperture (i.e., the Euclidean distance between lip-tracking sensors) was recorded for 21 typically developing adult speakers during production of a simple utterance. The utterance was produced in isolation and in carrier structures differing just in length or in length and complexity. Four RQA indices were calculated: percent recurrence (%REC), percent determinism (%DET), stability (MAXLINE), and stationarity (TREND). Percent determinism (%DET) decreased only for the most linguistically complex sentence; MAXLINE decreased as a function of linguistic complexity but increased for the longer-only sentence; TREND decreased as a function of both length and linguistic complexity. This research note demonstrates the feasibility of using RQA as a tool to compare speech variability across speakers and groups. RQA offers promise as a technique to assess effects of potential stressors (e.g., linguistic or cognitive factors) on the speech production system.
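
    A compact sketch of the two simplest RQA indices (an illustrative implementation with invented embedding parameters, not the study's code):

        import numpy as np

        def rqa_measures(x, dim=3, delay=1, radius=0.1, lmin=2):
            """%REC and %DET from a thresholded recurrence plot (illustrative)."""
            # Time-delay embedding of the scalar series
            n = len(x) - (dim - 1) * delay
            emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
            # Recurrence matrix: embedded points closer than the radius (O(n^2) memory)
            dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
            rec = dists < radius
            np.fill_diagonal(rec, False)          # exclude the line of identity
            rec_rate = rec.mean()                 # %REC
            # %DET: share of recurrent points on diagonal lines of length >= lmin
            det_points = 0
            for k in range(-(n - lmin), n - lmin + 1):
                run = 0
                for v in np.append(np.diagonal(rec, offset=k), False):
                    if v:
                        run += 1
                    else:
                        if run >= lmin:
                            det_points += run
                        run = 0
            det = det_points / max(rec.sum(), 1)  # %DET
            return rec_rate, det

        # Hypothetical usage on a synthetic "lip aperture" trajectory
        t = np.linspace(0, 4 * np.pi, 400)
        lip_aperture = np.sin(t) + 0.05 * np.random.randn(400)
        print(rqa_measures(lip_aperture))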

  14. Analysis of the personal doses lower than the reporting level

    International Nuclear Information System (INIS)

    Askounis, P.; Papadomarkaki, E.; Kirgiakou, H.; Dimitropoulou, F.; Carinou, E.; Maltezos, A.; Kamenopoulou, V.

    2008-01-01

    The personnel dosimetry department of the Greek Atomic Energy Commission (GAEC) is the only laboratory in the country that assures the individual monitoring of more than 10,500 workers with whole-body dosemeters and 100 workers with extremity dosemeters at 1300 establishments. Every year a statistical analysis of the results is performed, which provides data for epidemiological studies and assists the evaluation of the radiation protection system of the country. The aim of this study is to perform an analysis of the doses that are lower than the reporting level. The vast majority (92%) of the evaluated Hp(10) doses have been reported and recorded as zero. The mean Hp(10) (of the non-reported doses) per dosemeter and per monitoring period has been calculated for every geographical department of Greece. The results were compared with those produced by the external gamma dose-rate measurements performed by GAEC's telemetric network. The conclusion that can be drawn is that part of the TL signal of the personal dosemeters is due to natural background radiation, and this can affect the evaluation of the low doses

  15. High-level PC-based laser system modeling

    Science.gov (United States)

    Taylor, Michael S.

    1991-05-01

    Since the inception of the Strategic Defense Initiative (SDI), there have been a multitude of comparison studies attempting to evaluate the effectiveness and relative sizes of complementary, and sometimes competitive, laser weapon systems. It became more and more apparent that what the systems analyst needed was not only a fast but also a cost-effective way to perform high-level trade studies. In the present investigation, a general procedure is presented for the development of PC-based algorithmic system models for laser systems. This procedure points out all of the major issues that should be addressed in the design and development of such a model. Issues addressed include defining the problem to be modeled, defining a strategy for development, and, finally, effective use of the model once developed. Being a general procedure, it allows a systems analyst to develop a model to meet specific needs. To illustrate this method of model development, a description of the Strategic Defense Simulation - Design To (SDS-DT) model developed and used by Science Applications International Corporation (SAIC) is presented. SDS-DT is a menu-driven, fast-executing, PC-based program that can be used either to calculate performance, weight, volume, and cost values for a particular design or, alternatively, to run parametrics on particular system parameters to optimize a design.

  16. The value of blood oxygenation level-dependent (BOLD) MR imaging in differentiation of renal solid masses and grading of renal cell carcinoma (RCC): analysis based on the largest cross-sectional area versus the whole tumour.

    Directory of Open Access Journals (Sweden)

    Guang-Yu Wu

    Full Text Available To study the value of assessing renal masses using different parameter approaches and to determine whether BOLD MRI is helpful in differentiating RCC from benign renal masses, in differentiating clear-cell RCC from renal masses other than clear-cell RCC, and in determining the tumour grade. Ninety-five patients with 139 renal masses (93 malignant and 46 benign) who underwent abdominal BOLD MRI were enrolled. R2* values were derived from the largest cross-section (R2*largest) and from the whole tumour (R2*whole). Intra-observer and inter-observer agreements were analysed based on two measurements by the same observer and on the first measurement from each observer, respectively, and these agreements are reported as intra-class correlation coefficients with 95% confidence intervals. The diagnostic value of the R2* value was assessed with receiver-operating characteristic analysis. The intra-observer agreement was very good for R2*largest and R2*whole (all > 0.8). The inter-observer agreement for R2*whole (0.75, 95% confidence interval: 0.69~0.79) was good and was significantly improved compared with that for R2*largest (0.61, 95% confidence interval: 0.52~0.68), as there was no overlap in the 95% confidence intervals of the intra-class correlation coefficients. The diagnostic value in differentiating renal cell carcinoma from benign lesions with R2*whole (AUC=0.79/0.78 [observer1/observer2]) and R2*largest (AUC=0.75 [observer1]) was good (p=0.01 for R2*largest[observer2] vs R2*whole[observer2]); the AUCs were all > 0.7 and were not significantly different (p=0.89/0.93 for R2*largest vs R2*whole [observer1/observer2], 0.96 for R2*whole[observer1] vs R2*largest[observer2], and 0.96 for R2*whole[observer2] vs R2*largest[observer1]). BOLD MRI could provide a feasible parameter for differentiating renal cell carcinoma from benign renal masses and for predicting clear-cell renal cell carcinoma grading. Compared with the largest cross

  17. [Concept analysis "Competency-based education"].

    Science.gov (United States)

    Loosli, Clarence

    2016-03-01

    Competency-based education (CBE) stands out at the global level as the best educational practice. Indeed, CBE is supposed to improve the quality of care provided by newly graduated nurses. Yet, there is a dearth of knowledge in the nursing literature regarding the definition of the CBE concept, and CBE is implemented differently in each institution, even inside the same discipline in a single country. What accounts for CBE in nursing education? The aim was to clarify the meaning of the CBE concept according to a literature review in order to propose a definition. Wilson's concept analysis method framed our literature review through two databases: CINAHL and ERIC. Following Wilson's 11-step analysis technique, we identified CBE as a multidimensional concept clustering three dimensions: learning, teaching and assessment. Nurse educators are accountable for providing society with well-performing newly graduated professionals. Schools should strive for visibility and transparency in the means they use to accomplish their educational activities. This first attempt to understand the CBE concept opens a matter of debate concerning further development and clarification of the concept. This first description of the CBE concept is a step toward its identification and assessment.

  18. A Symmetric Chaos-Based Image Cipher with an Improved Bit-Level Permutation Strategy

    Directory of Open Access Journals (Sweden)

    Chong Fu

    2014-02-01

    Full Text Available Very recently, several chaos-based image ciphers using a bit-level permutation have been suggested and have shown promising results. Due to the diffusion effect introduced in the permutation stage, the workload of the time-consuming diffusion stage is reduced, and hence the performance of the cryptosystem is improved. In this paper, a symmetric chaos-based image cipher with a 3D cat map-based spatial bit-level permutation strategy is proposed. Compared with recently proposed bit-level permutation methods, the diffusion effect of the new method is superior, as the bits are shuffled among different bit-planes rather than within the same bit-plane. Moreover, the diffusion key stream extracted from a hyperchaotic system is related to both the secret key and the plain image, which enhances the security against known/chosen-plaintext attacks. Extensive security analysis has been performed on the proposed scheme, including the most important tests such as key space analysis, key sensitivity analysis, plaintext sensitivity analysis and various statistical analyses, which has demonstrated the satisfactory security of the proposed scheme.
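
    The core idea can be illustrated with a short sketch. The paper's permutation uses a 3D cat map; here a chaos-derived index permutation (logistic map keystream) stands in for it, to show why shuffling bits across bit-planes, rather than within one plane, also diffuses pixel values. All key values are illustrative.

    import numpy as np

    def logistic_keystream(x0, mu, n):
        """Iterate the logistic map x -> mu*x*(1-x) to get n chaotic values."""
        xs, x = np.empty(n), x0
        for i in range(n):
            x = mu * x * (1.0 - x)
            xs[i] = x
        return xs

    img = np.arange(16, dtype=np.uint8).reshape(4, 4) * 16   # toy 4x4 8-bit image
    bits = np.unpackbits(img.flatten())                      # pixel bits across all bit-planes
    perm = np.argsort(logistic_keystream(0.3141, 3.99, bits.size))  # key -> permutation
    cipher = np.packbits(bits[perm]).reshape(img.shape)      # bits migrate between planes

    inv = np.argsort(perm)                                   # decryption inverts the permutation
    assert np.array_equal(np.packbits(bits[perm][inv]).reshape(img.shape), img)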

  19. Analysis of Cyberbullying Sensitivity Levels of High School Students and Their Perceived Social Support Levels

    Science.gov (United States)

    Akturk, Ahmet Oguz

    2015-01-01

    Purpose: The purpose of this paper is to determine the cyberbullying sensitivity levels of high school students and their perceived social support levels, and to analyze the variables that predict cyberbullying sensitivity. In addition, whether cyberbullying sensitivity levels and social support levels differed according to gender was also…

  20. Neutronographic Texture Analysis of Zirconium Based Alloys

    International Nuclear Information System (INIS)

    Kruz'elová, M; Vratislav, S; Kalvoda, L; Dlouhá, M

    2012-01-01

    Neutron diffraction is a very powerful tool in the texture analysis of zirconium-based alloys used in nuclear technology. The textures of five samples (two rolled sheets and three tubes) were investigated using basal pole figures, inverse pole figures, and the ODF distribution function. The texture measurements were performed at the KSN2 diffractometer in the Laboratory of Neutron Diffraction, Department of Solid State Engineering, Faculty of Nuclear Sciences and Physical Engineering, CTU in Prague. Procedures for studying textures with thermal neutrons and for obtaining texture parameters (direct and inverse pole figures, the three-dimensional orientation distribution function) are also described. The observed data were processed with the software packages HEXAL and GSAS. Our results can be summarized as follows: i) All samples of zirconium alloys show a splitting of the middle area into two maxima in the basal pole figures, caused by the alloying elements. A characteristic split of the basal pole maxima, tilted from the normal direction toward the transverse direction, can be observed for all samples. ii) The sheet samples prefer an orientation of the (100) and (110) planes perpendicular to the rolling direction and of the (002) planes perpendicular to the normal direction. iii) The basal planes of the tubes are oriented parallel to the tube axis, while the (100) planes are oriented perpendicular to the tube axis. The level of the resulting texture and the positions of the maxima differ between tubes and sheets. The obtained results are characteristic of zirconium-based alloys.

  1. ANALYSIS OF THE PHTHALATE CONTENT LEVELS IN WINE PRODUCTS

    Directory of Open Access Journals (Sweden)

    Duca Gheorghe

    2011-12-01

    Full Text Available In studies conducted in the laboratory of the National Center for Quality Testing of Alcoholic Beverages (Republic of Moldova), more than 1300 samples of bottled wine and base wine were tested for the presence of the most widespread and toxic phthalate, dibutyl phthalate (DBP), using a modern method of analysis, GC-MS. The results show the presence of DBP in 85% of the wine samples studied, i.e., a DBP content above the LOQ (0.01 mg/dm3). It was determined that the phthalate contamination has a technogenic character and is the result of contact with polymeric materials. Optimum conditions for the extraction of DBP from liquid samples were also obtained.

  2. Low-level radioactive waste data base management

    International Nuclear Information System (INIS)

    Roles, G.W.

    1987-01-01

    This paper outlines the uses of information obtained from a national system for managing low-level waste shipment manifest information, from the perspective of the NRC Division of Waste Management (DWM). A background section first briefly reviews some of the basic attributes of a workable system, as well as the existing data management systems established by the disposal facility operators. This background leads into a more detailed discussion of the major uses to which a regulatory agency would put the manifest information, including technical studies and analyses of a broad nature as well as day-to-day compliance with regulations and disposal site license conditions. The next two sections respectively summarize NRC's current data base capabilities and the limitations of these capabilities. The final section addresses the question: where do we go from here? One option under consideration is a rulemaking which would: (1) set forth the minimum information to be included in shipment manifests in greater detail than currently specified in 10 CFR 20.311, and (2) require that operators of all low-level waste disposal facilities reduce the information on incoming shipment manifests to an electronic data format which would be periodically forwarded to a centralized location. However, this option would conflict with other NRC priorities and would probably require considerable time to implement. Much of the groundwork for a national system has already been prepared, and NRC's preferred approach is to work with States, Compacts, disposal site operators, and DOE to upgrade these existing capabilities. 8 references, 1 figure, 2 tables

  3. A hazard and probabilistic safety analysis of a high-level waste transfer process

    International Nuclear Information System (INIS)

    Bott, T.F.; Sasser, M.K.

    1996-01-01

    This paper describes a safety analysis of a transfer process for high-level radioactive and toxic waste. The analysis began with a hazard assessment that used elements of What-If, Checklist, Failure Modes and Effects Analysis, and Hazard and Operability Study (HAZOP) techniques to identify and rough in accident sequences. Based on this preliminary analysis, the most significant accident sequences were developed further using event trees. Quantitative frequency estimates for the accident sequences were based on operational data taken from the historical record of the site where the process is performed. Several modeling challenges were encountered in the course of the study, including linked initiating and accident progression events, fire propagation modeling, accounting for administrative control violations, and handling mission-phase effects.
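
    A minimal sketch (with assumed, not the study's, values) of how an event-tree sequence is quantified: the initiating-event frequency is multiplied by the conditional probabilities of the branches along the sequence.

    # hypothetical transfer-line breach frequency and branch probabilities
    init_freq = 1.2e-2                     # initiating events per year
    branches = {
        "operator fails to isolate": 3e-2,
        "fire suppression fails": 5e-2,
    }
    seq_freq = init_freq
    for event, p in branches.items():
        seq_freq *= p
    print(f"sequence frequency = {seq_freq:.1e} per year")   # 1.8e-05 per year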

  4. An analysis of the factors affecting the hydraulic conductivity and swelling pressure of Kyungju Ca-bentonite for use as a clay-based sealing material for a high-level waste repository

    International Nuclear Information System (INIS)

    Cho, Won Jin; Lee, Jae Owen; Kwon, Sang Ki

    2012-01-01

    The buffer and backfill are important components of the engineered barrier system in a high-level waste repository, which should be constructed in a hard rock formation at a depth of several hundred meters below the ground surface. The primary function of the buffer and backfill is to seal the underground excavation as a preferred flow path for radionuclide migration from the deposited high-level waste. This study investigates the hydraulic conductivity and swelling pressure of Kyungju Ca-bentonite, which is the candidate material for the buffer and backfill in the Korean reference high-level waste disposal system. The factors that influence the hydraulic conductivity and swelling pressure of the buffer and backfill are analyzed. The factors considered are the dry density, the temperature, the sand content, the salinity and the organic carbon content. The possibility of deterioration in the sealing performance of the buffer and backfill is also assessed.

  5. Evidence based practice readiness: A concept analysis.

    Science.gov (United States)

    Schaefer, Jessica D; Welton, John M

    2018-01-15

    To analyse and define the concept of "evidence based practice readiness" in nurses. Evidence based practice readiness is a term commonly used in the health literature, but without a clear understanding of what readiness means; concept analysis is needed to define its meaning. A concept analysis was conducted using Walker and Avant's method to clarify the defining attributes of evidence based practice readiness as well as its antecedents and consequences. A Boolean search of PubMed and the Cumulative Index for Nursing and Allied Health Literature was conducted and limited to articles published after the year 2000. Eleven articles met the inclusion criteria for this analysis. Evidence based practice readiness incorporates personal and organisational readiness. Antecedents include the ability to recognize the need for evidence based practice, the ability to access and interpret evidence based practice, and a supportive environment. The concept analysis demonstrates the complexity of the concept and its implications for nursing practice. The four pillars of evidence based practice readiness (nursing, training, equipping and leadership support) are necessary to achieve evidence based practice readiness. Nurse managers are in the position to address all elements of evidence based practice readiness. Creating an environment that fosters evidence based practice can improve patient outcomes, decrease health care costs, increase nurses' job satisfaction and decrease nursing turnover. © 2018 John Wiley & Sons Ltd.

  6. Network-based analysis of proteomic profiles

    KAUST Repository

    Wong, Limsoon

    2016-01-01

    …-based high-throughput methods have been useful in providing glimpses into the underlying molecular processes, but the evidence they provide is indirect. Furthermore, RNA and corresponding protein levels have been known to correlate poorly. On the other…

  7. The main requirements for research work against the challenges of "Industry - 4.0" and the analysis of ongoing work in improving the state of the material and technical base and improving the level of training of young engineers

    Directory of Open Access Journals (Sweden)

    Р. С. Турманидзе

    2017-12-01

    Full Text Available The work analyzes the theoretical and practical results obtained after each of the first three world scientific-technical revolutions and appreciates their role in the development of humanity and in improving the level of industry in different sectors of the economy. The situation on the threshold of the fourth scientific-technical revolution, "Industry 4.0", is discussed in more detail. The crucial role of the level of training of young engineering staff in accelerating these processes is demonstrated, and some ideas and solutions to these problems are suggested using the example of the Georgian Technical University.

  8. Uncertainty and sensitivity analysis in a Probabilistic Safety Analysis level-1

    International Nuclear Information System (INIS)

    Nunez Mc Leod, Jorge E.; Rivera, Selva S.

    1996-01-01

    A methodology for sensitivity and uncertainty analysis, applicable to a Level 1 Probabilistic Safety Assessment, is presented. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample sizes, and the study of the relationships among system variables and system response. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)
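
    A hedged sketch of the workflow described: sample the parameters from their assigned distributions, propagate them through a (here, toy) system model, and rank input importance by rank correlation with the response. The distributions and the model are illustrative assumptions.

    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(1)
    n = 1000
    lam_pump = rng.lognormal(np.log(1e-3), 0.7, n)    # hypothetical failure rates (/h)
    lam_valve = rng.lognormal(np.log(5e-4), 0.5, n)
    p_human = rng.beta(2, 50, n)                      # hypothetical human error probability

    top_event = 24 * lam_pump + 24 * lam_valve + p_human   # toy cut-set sum as response
    for name, x in [("pump", lam_pump), ("valve", lam_valve), ("human", p_human)]:
        rho, _ = spearmanr(x, top_event)
        print(f"{name:6s} rank correlation with top event: {rho:+.2f}")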

  9. Basal cortisol levels and metabolic syndrome: A systematic review and meta-analysis of observational studies.

    Science.gov (United States)

    Garcez, Anderson; Leite, Heloísa Marquardt; Weiderpass, Elisabete; Paniz, Vera Maria Vieira; Watte, Guilherme; Canuto, Raquel; Olinto, Maria Teresa Anselmo

    2018-05-17

    To perform a qualitative synthesis (systematic review) and quantitative analysis (meta-analysis) to summarize the evidence regarding the relationship between basal cortisol levels and metabolic syndrome (MetS) in adults. A systematic search was performed in the PubMed, Embase, and PsycINFO databases for observational studies on the association between basal cortisol levels and MetS. The quality of individual studies was assessed by the Newcastle-Ottawa score. A random effects model was used to report pooled quantitative results and the I2 statistic was used to assess heterogeneity. Egger's and Begg's tests were used to evaluate publication bias. Twenty-six studies (19 cross-sectional and seven case-control) met the inclusion criteria for the systematic review. The majority were classified as having a low risk of bias and used established criteria for the diagnosis of MetS. Twenty-one studies provided data on basal cortisol levels as continuous values and were included in the meta-analysis; they comprised 35 analyses and 11,808 subjects. Pooled results showed no significant difference in basal cortisol levels between subjects with and without MetS (standardized mean difference [SMD] = 0.02, 95% confidence interval [CI] = -0.11 to 0.14). There was high heterogeneity between the studies when all comparisons were considered (I2 = 83.1%). A subgroup meta-analysis of studies evaluating saliva samples showed non-significantly lower basal cortisol levels among subjects with MetS (SMD = -0.18, 95% CI = -0.37 to 0.01), whereas the studies that evaluated serum samples (SMD = 0.11, 95% CI = -0.02 to 0.24) and urine samples (SMD = 0.73, 95% CI = -0.40 to 1.86) showed non-significantly higher basal cortisol levels among subjects with MetS. In the subgroup and meta-regression analyses, a significant difference in basal cortisol levels was observed according to study design, population base, age, gender, cortisol level assessment method, and study quality. This systematic review…
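
    A sketch of DerSimonian-Laird random-effects pooling of standardized mean differences, the model class the review reports; the SMDs and standard errors below are made-up placeholders, not the 35 real analyses.

    import numpy as np

    smd = np.array([0.10, -0.20, 0.05, 0.30, -0.12])   # hypothetical study SMDs
    se = np.array([0.08, 0.10, 0.07, 0.12, 0.09])

    w = 1.0 / se**2                                    # fixed-effect weights
    q = np.sum(w * (smd - np.sum(w * smd) / np.sum(w))**2)
    df = len(smd) - 1
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (q - df) / c)                      # between-study variance
    w_star = 1.0 / (se**2 + tau2)                      # random-effects weights
    pooled = np.sum(w_star * smd) / np.sum(w_star)
    se_pooled = np.sqrt(1.0 / np.sum(w_star))
    i2 = 100 * max(0.0, (q - df) / q)                  # heterogeneity statistic
    print(f"SMD = {pooled:.2f} "
          f"(95% CI {pooled - 1.96*se_pooled:.2f} to {pooled + 1.96*se_pooled:.2f}), "
          f"I2 = {i2:.0f}%")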

  10. Level-1 pixel based tracking trigger algorithm for LHC upgrade

    CERN Document Server

    Moon, Chang-Seong

    2015-01-01

    The Pixel Detector is the innermost detector of the tracking system of the Compact Muon Solenoid (CMS) experiment at the CERN Large Hadron Collider (LHC). It precisely determines the interaction point (primary vertex) of the events and the possible secondary vertices due to heavy flavours ($b$ and $c$ quarks); it is part of the overall tracking system that allows reconstructing the tracks of the charged particles in the events and, combined with the magnetic field, measuring their momentum. The pixel detector allows measuring the tracks in the region closest to the interaction point. The Level-1 (real-time) pixel based tracking trigger is a novel trigger system that is currently being studied for the LHC upgrade. An important goal is developing real-time track reconstruction algorithms able to cope with very high rates and a high flux of data in a very harsh environment. The pixel detector has an especially crucial role in precisely identifying the primary vertex of the rare physics events from the large pile-up…

  11. Adaptive Rate Sampling and Filtering Based on Level Crossing Sampling

    Directory of Open Access Journals (Sweden)

    Saeed Mian Qaisar

    2009-01-01

    Full Text Available The recent sophistication in the areas of mobile systems and sensor networks demands more and more processing resources. In order to maintain system autonomy, energy saving is becoming one of the most difficult industrial challenges in mobile computing. Most efforts to achieve this goal are focused on improving embedded systems design and battery technology, but very few studies target exploiting the time-varying nature of the input signal. This paper aims to achieve power efficiency by intelligently adapting the processing activity to the local characteristics of the input signal. It is done by completely rethinking the processing chain, adopting a non-conventional sampling scheme and adaptive-rate filtering. The proposed approach, based on the LCSS (Level Crossing Sampling Scheme), presents two filtering techniques able to adapt their sampling rate and filter order by analyzing the input signal variations online. Indeed, the principle is to intelligently exploit the signal's local characteristics (which are usually never considered) to filter only the relevant signal parts, employing filters of the relevant order. This idea leads towards a drastic gain in computational efficiency, and hence in processing power, when compared to classical techniques.
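
    The sampling idea the paper builds on can be shown in a few lines: a sample is kept only when the signal moves by at least one level spacing, so quiescent segments generate almost no samples (and hence little downstream processing). This is a minimal illustration, not the authors' LCSS implementation.

    import numpy as np

    def level_crossing_sample(t, x, delta):
        """Keep (time, value) pairs whenever x moves by at least delta."""
        times, values, last = [t[0]], [x[0]], x[0]
        for ti, xi in zip(t[1:], x[1:]):
            if abs(xi - last) >= delta:      # crossed at least one level
                times.append(ti)
                values.append(xi)
                last = xi
        return np.array(times), np.array(values)

    t = np.linspace(0, 1, 2000)
    x = np.where(t < 0.5, 0.05 * np.sin(2*np.pi*2*t), np.sin(2*np.pi*20*t))
    ts, xs = level_crossing_sample(t, x, delta=0.1)
    print(f"{ts.size} samples kept out of {t.size}")   # the active half dominates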

  12. EBprot: Statistical analysis of labeling-based quantitative proteomics data.

    Science.gov (United States)

    Koh, Hiromi W L; Swa, Hannah L F; Fermin, Damian; Ler, Siok Ghee; Gunaratne, Jayantha; Choi, Hyungwon

    2015-08-01

    Labeling-based proteomics is a powerful method for the detection of differentially expressed proteins (DEPs). The current data analysis platforms typically rely on protein-level ratios, which are obtained by summarizing peptide-level ratios for each protein. In shotgun proteomics, however, some proteins are quantified with more peptides than others, and this reproducibility information is not incorporated into the differential expression (DE) analysis. Here, we propose a novel probabilistic framework, EBprot, that directly models the peptide-protein hierarchy and rewards proteins with reproducible evidence of DE over multiple peptides. To evaluate its performance with known DE states, we conducted a simulation study showing that the peptide-level analysis of EBprot provides better receiver-operating characteristics and more accurate estimation of the false discovery rates than methods based on protein-level ratios. We also demonstrate superior classification performance of peptide-level EBprot analysis in a spike-in dataset. To illustrate the wide applicability of EBprot in different experimental designs, we applied EBprot to a dataset for lung cancer subtype analysis with biological replicates and another dataset for time-course phosphoproteome analysis of EGF-stimulated HeLa cells with multiplexed labeling. Through these examples, we show that the peptide-level analysis of EBprot is a robust alternative to the existing statistical methods for the DE analysis of labeling-based quantitative datasets. The software suite is freely available on the Sourceforge website http://ebprot.sourceforge.net/. All MS data have been deposited in the ProteomeXchange with identifier PXD001426 (http://proteomecentral.proteomexchange.org/dataset/PXD001426/). © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. ANALYSIS OF CRISIS LEVEL IN REGIONS OF UKRAINE

    Directory of Open Access Journals (Sweden)

    Irina Abramova

    2017-12-01

    …corresponds to the limits of the indicators from 0.75 to 0.5. Untimely implementation of liquidation measures to neutralize the effects of existing crises and to prevent new ones will lead to the transition of the region into a zone of deep crisis. The zone of deep crisis is characterized by a partial destruction of the socio-economic system of the region. Exiting such a situation requires systemic measures of anti-crisis state and regional management, with the assistance of foreign aid. The quantitative threshold of this zone is a deviation of up to 75 percent from the threshold level of the non-crisis zone, which corresponds to indicator limits from 0.5 to 0.25. The zone of bankruptcy involves the complete destruction of the region as a social and economic system. The reasons for such a situation are force majeure circumstances (wars, natural disasters, man-made disasters, etc.). Such a state of the region is characterized by the cessation of the work of enterprises and organizations, the economic and social devastation of the region, and the intensification of migration processes. The solution to such a situation is targeted state crisis management. The quantitative threshold of this zone is a deviation of more than 75 percent from the threshold level of the non-crisis zone, which corresponds to indicator limits from 0.25 to 0.0. Results of the survey showed that, as of 2015, there was a moderate level of crisis according to economic parameters, with a high risk of transition into a deep crisis, in 14 of 27 regions. Practical implications. The conducted analysis of the crisis of socio-economic development of Ukraine's regions made it possible to detect the level of its depth according to social and economic parameters and to determine the weakest areas that need the most support and attention in anti-crisis regional management. Value…
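
    The zoning rule stated in the abstract reduces to a simple threshold mapping of the integral crisis indicator; a sketch with the thresholds as given above:

    def crisis_zone(indicator: float) -> str:
        """Map an integral crisis indicator in [0, 1] to the zones described."""
        if indicator > 0.75:
            return "non-crisis zone"
        if indicator > 0.5:
            return "moderate crisis zone"    # up to 50 % deviation from non-crisis
        if indicator > 0.25:
            return "deep crisis zone"        # 50-75 % deviation
        return "bankruptcy zone"             # more than 75 % deviation

    print(crisis_zone(0.62))                 # moderate crisis zone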

  14. Enhancement of the FDOT's project level and network level bridge management analysis tools

    Science.gov (United States)

    2011-02-01

    Over several years, the Florida Department of Transportation (FDOT) has been implementing the AASHTO Pontis Bridge Management System to support network-level and project-level decision making in the headquarters and district offices. Pontis is an int...

  15. Radiation level survey of a mobile phone base station

    International Nuclear Information System (INIS)

    Campos, M.C.; Schaffer, S.R.

    2006-01-01

    Electromagnetic field (EMF) evaluations were carried out in the surroundings of a roof-top mobile-phone radio-base station (RBS). Four of its sector-panel antennas are installed on two parallel vertical masts, each supporting two panels in a vertical collinear array. The geometry is such that the vertical plane containing both masts is about 10 meters distant from, and parallel to, the back side of an educational institution. This proximity provoked great anxiety among local community members regarding potential health hazards. 1. Introduction: To keep up with the expansion of mobile-phone services, the number of radio-base station (RBS) installations is increasing tremendously in Brazil. Efficient control and radiation monitoring to assess RBS compliance with existing regulations are still lacking, and, particularly in big cities, clearly non-compliant RBS can be seen, which represent potentially hazardous EMF sources for the nearby population. This first survey of an irregular RBS revealed significant E-field strengths outside, as well as inside, a classroom of an educational building where prolonged stays are usual. These results confirm that this problem deserves further attention, all the more if one considers that the public and occupational exposure limits set by ICNIRP (also adopted in Brazil) are based exclusively on the immediate thermal effects of acute exposure, disregarding any potential health risk from prolonged exposure to lower-level radiation. Research activities focusing on quantitative aspects of electromagnetic radiation from RBS, as well as on biological and adverse health effects, are still at a very incipient level, urging immediate actions to improve this scenario in our country. 2. Material, methods and results: Measurements were carried out with a broadband field-strength monitor, EMR-300 (W&G), coupled to an isotropic E-field probe (100 kHz to 3 GHz). Preliminary measurements helped locating…

  16. Decreased serum pyridoxal levels in schizophrenia: meta-analysis and Mendelian randomization analysis

    Science.gov (United States)

    Tomioka, Yukiko; Kinoshita, Makoto; Umehara, Hidehiro; Watanabe, Shin-ya; Nakataki, Masahito; Iwayama, Yoshimi; Toyota, Tomoko; Ikeda, Masashi; Yamamori, Hidenaga; Shimodera, Shinji; Tajima, Atsushi; Hashimoto, Ryota; Iwata, Nakao; Yoshikawa, Takeo; Ohmori, Tetsuro

    2018-01-01

    Background Alterations in one-carbon metabolism have been associated with schizophrenia, and vitamin B6 is one of the key components in this pathway. Methods We first conducted a case–control study of serum pyridoxal levels and schizophrenia in a large Japanese cohort (n = 1276). Subsequently, we conducted a meta-analysis of association studies (n = 2125). Second, we investigated whether rs4654748, which was identified in a genome-wide association study as a vitamin B6-related single nucleotide polymorphism, was genetically implicated in patients with schizophrenia in the Japanese population (n = 10 689). Finally, we assessed the effect of serum pyridoxal levels on schizophrenia risk using a Mendelian randomization (MR) approach. Results Serum pyridoxal levels were significantly lower in patients with schizophrenia than in controls, not only in our cohort, but also in the pooled data set of the meta-analysis of association studies (standardized mean difference −0.48, 95% confidence interval [CI] −0.57 to −0.39, p = 9.8 × 10^−24). We failed to find a significant association between rs4654748 and schizophrenia. Furthermore, an MR analysis failed to find a causal relationship between pyridoxal levels and schizophrenia risk (odds ratio 0.99, 95% CI 0.65–1.51, p = 0.96). Limitations Food consumption and medications may have affected serum pyridoxal levels in our cross-sectional study. Sample size, number of instrumental variables and substantial heterogeneity among patients with schizophrenia are limitations of an MR analysis. Conclusion We found decreased serum pyridoxal levels in patients with schizophrenia in this observational study. However, we failed to obtain data supporting a causal relationship between pyridoxal levels and schizophrenia risk using the MR approach. PMID:29688875
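
    A hedged sketch of the single-instrument logic behind such an MR estimate: the Wald ratio divides the SNP-outcome association by the SNP-exposure association, and exponentiating gives an odds ratio per SD of exposure. The numbers are illustrative placeholders, not the study's estimates for rs4654748.

    import numpy as np

    beta_snp_pyridoxal = -0.12   # hypothetical SNP effect on pyridoxal (SD units)
    beta_snp_scz = 0.001         # hypothetical SNP log-odds effect on schizophrenia
    se_outcome = 0.015           # hypothetical SE of the outcome association

    wald = beta_snp_scz / beta_snp_pyridoxal
    se_wald = abs(se_outcome / beta_snp_pyridoxal)   # first-order (delta) approximation
    or_, lo, hi = np.exp([wald, wald - 1.96 * se_wald, wald + 1.96 * se_wald])
    print(f"MR odds ratio per SD pyridoxal: {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")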

  17. Identification and simulation for steam generator water level based on Kalman Filter

    International Nuclear Information System (INIS)

    Deng Chen; Zhang Qinshun

    2008-01-01

    In order to effectively control the water level of the steam generator (SG), this paper starts from state-observer theory in modern control and puts forward a method to detect the 'false water level' based on the Kalman Filter. The Kalman Filter is an efficient tool for estimating state variables from measurements contaminated by noise. Given the heavy measurement noise of the steam flow signal, a 'false water level' observer constructed with a Kalman Filter can effectively recover the 'false water level' state variable. Simulations of the dynamic characteristics of the nuclear SG water level process under several typical operating power levels were carried out with the simulation model. The results show that the model accurately identifies the 'false water level' produced by the reverse thermal-dynamic effects of the nuclear SG water level process, and that it enables a precise analysis of the dynamic characteristics of the process. It can provide new ideas for detecting the 'false water level' of the SG. (authors)
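
    An illustrative scalar Kalman filter (not the authors' plant model) shows the mechanism: the filter trades off process and measurement noise to recover a level-like state from a heavily noisy sensor, which is what separates the 'false water level' component from steam-flow measurement noise.

    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    true_level = np.cumsum(rng.normal(0, 0.02, n))   # slowly drifting level
    meas = true_level + rng.normal(0, 0.5, n)        # heavily noisy measurement

    q, r = 0.02**2, 0.5**2                           # process / measurement variances
    x, p = 0.0, 1.0                                  # state estimate and covariance
    est = np.empty(n)
    for k in range(n):
        p = p + q                                    # predict
        gain = p / (p + r)                           # Kalman gain
        x = x + gain * (meas[k] - x)                 # update with the innovation
        p = (1 - gain) * p
        est[k] = x
    rms = lambda e: np.sqrt(np.mean(e**2))
    print(f"RMS error raw: {rms(meas - true_level):.3f}, filtered: {rms(est - true_level):.3f}")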

  18. JAVA based LCD Reconstruction and Analysis Tools

    International Nuclear Information System (INIS)

    Bower, G.

    2004-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  19. Java based LCD reconstruction and analysis tools

    International Nuclear Information System (INIS)

    Bower, Gary; Cassell, Ron; Graf, Norman; Johnson, Tony; Ronan, Mike

    2001-01-01

    We summarize the current status and future developments of the North American Group's Java-based system for studying physics and detector design issues at a linear collider. The system is built around Java Analysis Studio (JAS), an experiment-independent Java-based utility for data analysis. Although the system is an integrated package running in JAS, many parts of it are also standalone Java utilities

  20. Structured Performance Analysis for Component Based Systems

    OpenAIRE

    Salmi , N.; Moreaux , Patrice; Ioualalen , M.

    2012-01-01

    International audience; The Component Based System (CBS) paradigm is now largely used to design software systems. In addition, performance and behavioural analysis remains a required step for the design and construction of efficient systems. This is especially the case for CBS, which involve interconnected components running concurrent processes. This paper proposes a compositional method for modeling and structured performance analysis of CBS. Modeling is based on Stochastic Well-formed...

  1. Vandenberg Air Force Base Upper Level Wind Launch Weather Constraints

    Science.gov (United States)

    Shafer, Jaclyn A.; Wheeler, Mark M.

    2012-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of its responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF tasked the Applied Meteorology Unit (AMU) to analyze VAFB sounding data with the goal of determining the probability of violating (PoV) their upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a tool that will calculate the PoV of each constraint on the day of launch. In order to calculate the probability of exceeding each constraint, the AMU collected and analyzed historical data from VAFB. The historical sounding data were retrieved from the National Oceanic and Atmospheric Administration Earth System Research Laboratory archive for the years 1994-2011 and then stratified into four sub-seasons: January-March, April-June, July-September, and October-December. The maximum wind speed and 1000-ft shear values for each sounding in each sub-season were determined. To accurately calculate the PoV, the AMU determined the theoretical distributions that best fit the maximum wind speed and maximum shear datasets. Ultimately it was discovered that the maximum wind speeds follow a Gaussian distribution while the maximum shear values follow a lognormal distribution. These results were applied when calculating the averages and standard deviations needed for the historical and real-time PoV calculations. In addition to the requirements outlined in the original task plan, the AMU also included forecast sounding data from the Rapid Refresh model. This information provides further insight for the launch weather officers (LWOs) when determining whether a wind constraint violation will occur over the next few hours on the day of launch. The interactive graphical user interface (GUI) for this project was developed in…
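
    A sketch of the PoV calculation described: fit a Gaussian to the maximum wind speeds and a lognormal to the maximum shears, then evaluate the probability of exceeding the launch-commit thresholds. The data and thresholds below are placeholders, not the VAFB climatology.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    max_speed = rng.normal(45, 12, 500)          # hypothetical sounding maxima (kt)
    max_shear = rng.lognormal(2.0, 0.4, 500)     # hypothetical 1000-ft shears

    mu, sigma = stats.norm.fit(max_speed)
    shape, loc, scale = stats.lognorm.fit(max_shear, floc=0)

    pov_speed = stats.norm.sf(60, mu, sigma)             # P(max speed > threshold)
    pov_shear = stats.lognorm.sf(15, shape, loc, scale)  # P(max shear > threshold)
    print(f"PoV(speed) = {pov_speed:.1%}, PoV(shear) = {pov_shear:.1%}")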

  2. Sea level rise and the geoid: factor analysis approach

    OpenAIRE

    Song, Hongzhi; Sadovski, Alexey; Jeffress, Gary

    2013-01-01

    Sea levels are rising around the world, and this is a particular concern along most of the coasts of the United States. A 1989 EPA report shows that sea levels rose 5-6 inches more than the global average along the Mid-Atlantic and Gulf Coasts in the last century. The main reason for this is coastal land subsidence, so this sea level rise is better regarded as relative sea level rise than as global sea level rise. Thus, instead of studying sea level rise globally, this paper describes a statistical...

  3. Reliability analysis of software based safety functions

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1993-05-01

    The methods applicable to the reliability analysis of software-based safety functions are described in the report. Although the safety functions also include other components, the main emphasis in the report is on the reliability analysis of software. Checklist-type qualitative reliability analysis methods, such as failure mode and effects analysis (FMEA), are described, as well as software fault tree analysis. Safety analysis based on Petri nets is discussed. The most essential concepts and models of quantitative software reliability analysis are described. The most common software metrics and their combined use with software reliability models are discussed. The application of software reliability models in PSA is evaluated; it is observed that recent software reliability models do not directly produce the estimates needed in PSA. Some recommendations and conclusions are drawn from the study, among them: the need for formal methods in the analysis and development of software-based systems, the applicability of qualitative reliability engineering methods in connection with PSA, and the need to make the requirements for software-based systems and their analyses in the regulatory guides more precise. (orig.). (46 refs., 13 figs., 1 tab.)

  4. Preference-Based Recommendations for OLAP Analysis

    Science.gov (United States)

    Jerbi, Houssem; Ravat, Franck; Teste, Olivier; Zurfluh, Gilles

    This paper presents a framework for integrating OLAP and recommendations. We focus on the anticipatory recommendation process that assists the user during an OLAP analysis by proposing the forthcoming analysis step. We present a context-aware preference model that matches decision-makers' intuition, and we discuss a preference-based approach for generating personalized recommendations.

  5. Analysis of co-authorship patterns at the individual level

    Directory of Open Access Journals (Sweden)

    Wolfgang Glänzel

    Full Text Available Publication activity, citation impact and communication patterns, in general, change in the course of a scientist's career. Mobility and radical changes in a scientist's research environment or profile are among the most spectacular factors that have an effect on individual collaboration patterns. Although bibliometrics at this level should be applied with the utmost care, characteristic patterns of an individual scientist's research collaboration, and changes in these in the course of a career, can be well depicted using bibliometric methods. A wide variety of indicators and network tools are chosen to follow the evolution and to visualise and quantify the collaboration and performance profiles of individual researchers. These methods are, however, designed to supplement expert-opinion-based assessment and other qualitative assessments, and should not be used as stand-alone evaluation tools. This study presents part of the results published in an earlier study by Zhang and Glänzel (2012), as well as new applications of these methods.

  6. An analysis of tolerance levels in IMRT quality assurance procedures

    International Nuclear Information System (INIS)

    Basran, Parminder S.; Woo, Milton K.

    2008-01-01

    Increased use of intensity modulated radiation therapy (IMRT) has resulted in increased efforts in patient quality assurance (QA). Software and detector systems intended to streamline the IMRT quality assurance process often report metrics, such as percent discrepancies between measured and computed doses, which can be compared to benchmark or threshold values. The purpose of this work is to examine the relationships between two different types of IMRT QA processes in order to define, or refine, appropriate tolerance values. For 115 IMRT plans delivered in a 3 month period, we examine the discrepancies between (a) the treatment planning system (TPS) and results from a commercial independent monitor unit (MU) calculation program; (b) the TPS and results from a commercial diode-array measurement system; and (c) the independent MU calculation and the diode-array measurements. Statistical tests were performed to assess the significance of the IMRT QA results for different disease sites and machine models. First, there is no evidence that the average total dose discrepancy in the monitor unit calculation depends on the disease site. Second, the discrepancies in the two IMRT QA methods are independent: there is no evidence that a better (or worse) monitor unit validation result is related to a better (or worse) diode-array measurement result. Third, there is marginal benefit in repeating the independent MU calculation with a more suitable dose point if the initial IMRT QA failed a certain tolerance. Based on these findings, the authors arrive at acceptable tolerances based on disease site and IMRT QA method. Specifically, monitor unit validations are expected to have a total dose discrepancy of 3% overall, and 5% per beam, independent of disease site. Diode-array measurements are expected to have a total absolute dose discrepancy of 3% overall, and 3% per beam, independent of disease site. The percent of pixels exceeding a 3% and 3 mm threshold in a gamma analysis should be…
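
    The 3%/3 mm criterion named above is the gamma index; a minimal one-dimensional sketch (toy profiles, global dose normalization) of how the pass rate is computed:

    import numpy as np

    def gamma_1d(x, d_meas, d_calc, dose_tol=0.03, dta_mm=3.0):
        """Gamma index per measured point; gamma <= 1 counts as a pass."""
        g = np.empty_like(d_meas)
        for i, (xi, dm) in enumerate(zip(x, d_meas)):
            dist = (x - xi) / dta_mm                           # distance-to-agreement term
            ddiff = (d_calc - dm) / (dose_tol * d_meas.max())  # dose-difference term
            g[i] = np.sqrt(dist**2 + ddiff**2).min()
        return g

    x = np.arange(0, 100, 1.0)                    # positions in mm
    d_calc = np.exp(-((x - 50) / 20) ** 2)        # toy computed profile
    d_meas = 1.02 * d_calc + 0.005                # slightly offset "measurement"
    g = gamma_1d(x, d_meas, d_calc)
    print(f"gamma pass rate: {(g <= 1).mean():.1%}")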

  7. Levels of lead in solvent and water-based paints manufactured in Pakistan

    International Nuclear Information System (INIS)

    Ikram, M.; Rauf, M.A.; Chotona, G.A.; Bukhari, N.

    2000-01-01

    The levels of lead in eight popular brands of solvent- and water-based paint manufactured locally in Pakistan are reported. The analysis was done using the flame atomic absorption spectrophotometric method. The lead concentration was found to vary from 3.3 mg/kg to 13179 mg/kg in different solvent-based brands, whereas in water-based paints the concentration of the metal ranged from less than 0.5 mg/kg to 1768 mg/kg. The lead concentrations were especially high in oil-based green paints (maximum value of 13170 mg/kg) and yellow paints (maximum value of 84940 mg/kg). The corresponding highest concentrations in the water-based category were observed for emerald (maximum value of 1768 mg/kg) and gray (maximum value of 542 mg/kg) paints. (author)

  8. Team-Based Care: A Concept Analysis.

    Science.gov (United States)

    Baik, Dawon

    2017-10-01

    The purpose of this concept analysis is to clarify and analyze the concept of team-based care in clinical practice. Team-based care has garnered attention as a way to enhance healthcare delivery and patient care related to quality and safety. However, there is no consensus on the concept of team-based care; as a result, the lack of a common definition impedes further studies on team-based care. This analysis was conducted using Walker and Avant's strategy. Literature searches were conducted using PubMed, the Cumulative Index to Nursing and Allied Health Literature (CINAHL), and PsycINFO, with a timeline from January 1985 to December 2015. The analysis demonstrates that the concept of team-based care has three core attributes: (a) interprofessional collaboration, (b) a patient-centered approach, and (c) an integrated care process. These are accomplished through understanding other team members' roles and responsibilities, a climate of mutual respect, and organizational support. Consequences of team-based care are identified in three respects: (a) the patient, (b) the healthcare professional, and (c) the healthcare organization. This concept analysis helps to better understand the characteristics of team-based care in clinical practice and promotes the development of a theoretical definition of team-based care. © 2016 Wiley Periodicals, Inc.

  9. An Abstraction Hierarchy based mobile PC display design in NPP maintenance considering the level of expertise

    International Nuclear Information System (INIS)

    Yim, Ho Bin; Kim, In; Seong, Poong Hyun

    2011-01-01

    Research highlights: → Six levels of Abstraction Hierarchy based information for maintenance were proposed. → Errors and workload with AH based information display were reduced for LL subjects. → Design concerns discovered can be applied to practical use of mobile maintenance aids. - Abstract: Recently, the importance of effective maintenance in nuclear power plants (NPPs) has been emphasized, and research into effective maintenance through the adoption of mobile maintenance aids (MMAs) has been attempted. For improved and effective use of MMAs, a display design method based on the Abstraction Hierarchy is proposed and its design considerations are discussed in this study. Six levels of hierarchy are proposed in this paper to classify the maintenance information. By classifying and organizing maintenance information using the hierarchy, the information can be used effectively by users with either high or low levels of expertise. Once the information classification is finished, the information for the MMA design is selected and designed. With consideration of the MMA design analysis and guidelines, a hierarchy-based MMA is designed for the maintenance tasks. An experiment was conducted using the hierarchy-based MMA in order to estimate the effectiveness of the proposed method for maintenance tasks and to identify design considerations to enhance the proposed MMAs. The results indicated that a hierarchy-based manual was more effective than a conventional manual in terms of task completion time and number of errors. The workload with the hierarchy-based manual was estimated to be lower than with the conventional manual for subjects with a low level of expertise. As the level of expertise increased, subjects tended to follow more abstract information while the number of navigations decreased. It is believed that when mobile devices become pervasive in NPP maintenance fields, MMAs based on the hierarchy model can be used as an effective maintenance support tool.

  10. Automatic Online Lecture Highlighting Based on Multimedia Analysis

    Science.gov (United States)

    Che, Xiaoyin; Yang, Haojin; Meinel, Christoph

    2018-01-01

    Textbook highlighting is widely considered to be beneficial for students. In this paper, we propose a comprehensive solution to highlight online lecture videos at both the sentence and segment level, just as is done with paper books. The solution is based on automatic analysis of multimedia lecture materials, such as speeches, transcripts, and…

  11. An Analysis of Losses to the Southern Commercial Timberland Base

    Science.gov (United States)

    Ian A. Munn; David Cleaves

    1998-01-01

    Demographic and physical factors influencing the conversion of commercial timberland in the South to non-forestry uses between the last two Forest Inventory Analysis (FIA) surveys were investigated. GIS techniques linked Census data and FIA plot-level data. Multinomial logit regression identified factors associated with losses to the timberland base. Conversion to...

  12. Dielectrophoresis-based discrimination of bacteria at the strain level based on their surface properties.

    Directory of Open Access Journals (Sweden)

    William A Braff

    Full Text Available Insulator-based dielectrophoresis can be used to manipulate biological particles, but has thus far found limited practical application due to low sensitivity. We present linear-sweep three-dimensional insulator-based dielectrophoresis as a considerably more sensitive approach for strain-level discrimination of bacteria. In this work, linear-sweep three-dimensional insulator-based dielectrophoresis was performed on Pseudomonas aeruginosa PA14 along with six isogenic mutants, as well as Streptococcus mitis SF100 and PS344. Strain-level discrimination was achieved between these clinically important pathogens with applied electric fields below 10 V/mm. This low-voltage, high-sensitivity technique has potential applications in clinical diagnostics as well as microbial physiology research.

  13. An Analysis of Some Commercial Management Aspects at the Level of a Travel Agency

    Directory of Open Access Journals (Sweden)

    Solomia Andres

    2017-06-01

    Full Text Available This paper presents some theoretical and practical contributions to the application of a managerial method based on the analysis of commercial management aspects, as practiced in a travel agency operating profitably in Caras-Severin County. The purpose of this analysis was to underline the functional state of this agency and the performance level of its commercial management, and to issue some relevant proposals for the domain analysed, which is considered to have high potential and special importance for this county.

  14. Denaturing high-performance liquid chromatography mutation analysis in patients with reduced Protein S levels

    DEFF Research Database (Denmark)

    Bathum, Lise; Münster, Anna-Marie; Nybo, Mads

    2008-01-01

    BACKGROUND: Patients with congenital Protein S deficiency have an increased risk of venous thromboembolism. However, Protein S levels show large intra-individual variation, and the biochemical assays have low accuracy and high interlaboratory variability. Genetic analysis might aid in a more precise diagnosis and risk estimation. The aim was to design a high-throughput genetic analysis based on denaturing high-performance liquid chromatography to identify sequence variations in the gene coding for Protein S. PATIENTS: In total, 55 patients referred to the Section of Thrombosis and Haemostasis, Odense… …giving a precise diagnosis and subsequently a better risk estimation.

  15. Risk-based Optimization and Reliability Levels of Coastal Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    Identification of optimum reliability levels for coastal structures is considered. A class of breakwaters is considered where no human injuries can be expected in case of failure. The optimum reliability level is identified by minimizing the total costs over the service life of the structure. The influence on the minimum-cost reliability levels is investigated for different values of the real rate of interest, the service lifetime, the downtime costs due to malfunction and the decommission costs.

  16. Risk-based Optimization and Reliability Levels of Coastal Structures

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.

    2005-01-01

     Identification of optimum reliability levels for coastal structures is considered. A class of breakwaters is considered where no human injuries can be expected in cases of failure. The optimum reliability level is identified by minimizing the total costs over the service life of the structure, i...... on the minimumcost reliability levels is investigated for different values of the real rate of interest, the service lifetime, the downtime costs due to malfunction and the decommission costs....... Identification of optimum reliability levels for coastal structures is considered. A class of breakwaters is considered where no human injuries can be expected in cases of failure. The optimum reliability level is identified by minimizing the total costs over the service life of the structure...

  17. Multi-Level Wavelet Shannon Entropy-Based Method for Single-Sensor Fault Location

    Directory of Open Access Journals (Sweden)

    Qiaoning Yang

    2015-10-01

    Full Text Available In actual applications, sensors are prone to failure because of harsh environments, battery drain, and sensor aging. Sensor fault location is an important step for follow-up sensor fault detection. In this paper, two new multi-level wavelet Shannon entropies (the multi-level wavelet time Shannon entropy and the multi-level wavelet time-energy Shannon entropy) are defined. They take full advantage of the sensor fault frequency distribution and energy distribution across multiple subbands in the wavelet domain. Based on the multi-level wavelet Shannon entropy, a method is proposed for single-sensor fault location. The method first uses a criterion of maximum energy-to-Shannon-entropy ratio to select the appropriate wavelet base for signal analysis. The multi-level wavelet time Shannon entropy and multi-level wavelet time-energy Shannon entropy are then used to locate the fault. The method is validated using practical chemical gas concentration data from a gas sensor array. Compared with the wavelet time Shannon entropy and the wavelet energy Shannon entropy, the experimental results demonstrate that the proposed method can achieve accurate location of a single sensor fault and has good anti-noise ability. The proposed method is feasible and effective for single-sensor fault location.
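
    A hedged sketch of a multi-level wavelet energy Shannon entropy, the family of features the paper defines (its exact time and time-energy variants differ in detail): decompose the signal with a discrete wavelet transform and take the Shannon entropy of the relative subband energies.

    import numpy as np
    import pywt

    def wavelet_energy_entropy(signal, wavelet="db4", level=4):
        coeffs = pywt.wavedec(signal, wavelet, level=level)
        energies = np.array([np.sum(c**2) for c in coeffs])
        p = energies / energies.sum()            # relative subband energies
        p = p[p > 0]
        return -np.sum(p * np.log2(p))           # Shannon entropy in bits

    rng = np.random.default_rng(4)
    clean = np.sin(np.linspace(0, 20 * np.pi, 1024)) + 0.05 * rng.normal(size=1024)
    faulty = clean + rng.normal(0, 0.8, 1024)    # hypothetical noisy faulty channel
    print(wavelet_energy_entropy(clean), wavelet_energy_entropy(faulty))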

  18. Face-iris multimodal biometric scheme based on feature level fusion

    Science.gov (United States)

    Huo, Guang; Liu, Yuanning; Zhu, Xiaodong; Dong, Hongxing; He, Fei

    2015-11-01

    Unlike score-level fusion, feature-level fusion demands that all the features extracted from unimodal traits have high distinguishability, as well as homogeneity and compatibility, which is difficult to achieve. Therefore, most multimodal biometric research focuses on score-level fusion, whereas few investigate feature-level fusion. We propose a face-iris recognition method based on feature-level fusion. We build a special two-dimensional Gabor filter bank to extract local texture features from face and iris images, and then transform them by histogram statistics into an energy-orientation variance histogram feature with lower dimensionality and higher distinguishability. Finally, through a fusion-recognition strategy based on principal components analysis and support vector machines (FRSPS), feature-level fusion and one-to-n identification are accomplished. The experimental results demonstrate that this method can not only effectively extract face and iris features but also provide higher recognition accuracy. Compared with some state-of-the-art fusion methods, the proposed method has a significant performance advantage.
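
    A sketch of the fusion-recognition strategy described (PCA followed by an SVM on concatenated face and iris features); the random features below merely stand in for the Gabor energy-orientation histograms.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(5)
    n_subjects, n_samples = 10, 8
    face = rng.normal(size=(n_subjects * n_samples, 64))   # placeholder face features
    iris = rng.normal(size=(n_subjects * n_samples, 64))   # placeholder iris features
    y = np.repeat(np.arange(n_subjects), n_samples)

    fused = np.hstack([face, iris])                        # feature-level fusion
    clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
    clf.fit(fused, y)                                      # one-to-n identification
    print("training accuracy:", clf.score(fused, y))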

  19. Creating a three level building classification using topographic and address-based data for Manchester

    Science.gov (United States)

    Hussain, M.; Chen, D.

    2014-11-01

    Buildings, the basic unit of an urban landscape, host most of its socio-economic activities and play an important role in the creation of urban land-use patterns. The spatial arrangement of different building types creates varied urban land-use clusters, which can provide an insight into the relationships between social, economic, and living spaces. The classification of such urban clusters can help in policy-making and resource management. In many countries, including the UK, no national-level cadastral database containing information on individual building types exists in the public domain. In this paper, we present a framework for inferring the functional types of buildings based on the analysis of their form (e.g. geometrical properties, such as area and perimeter, and layout) and spatial relationships from a large topographic and address-based GIS database. Machine learning algorithms along with exploratory spatial analysis techniques are used to create the classification rules. The classification is extended to two further levels based on the functions (use) of buildings derived from address-based data. The developed methodology was applied to the Manchester metropolitan area using the Ordnance Survey's MasterMap®, a large-scale topographic and address-based dataset available for the UK.

  20. Free trade area of the Americas: a three level analysis

    OpenAIRE

    Williams, Clay G.

    2006-01-01

    The Free Trade Area of the Americas is a proposed treaty that would encompass the Western Hemisphere: 800 million people and a 13-trillion-dollar economy. It is a regional agreement that cannot be understood without the interrelated issues at both the international and domestic levels. The single most important issue residing at the nexus of all three of these levels is domestic subsidies on agriculture. The FTAA cannot move forward at the regional level without reduction in the U.S. domest...

  1. Region based route planning - Multi-abstraction route planning based on intermediate level vision processing

    Science.gov (United States)

    Doshi, Rajkumar S.; Lam, Raymond; White, James E.

    1989-01-01

    Intermediate and high level processing operations are performed on vision data for the organization of images into more meaningful, higher-level topological representations by means of a region-based route planner (RBRP). The RBRP operates in terrain scenarios where some or most of the terrain is occluded, proceeding without a priori maps on the basis of two-dimensional representations and gradient-and-roughness information. Route planning is accomplished by three successive abstractions and yields a detailed point-by-point path by searching only within the boundaries of relatively small regions.

  2. Establishment of Infrastructure for Domestic-Specific Level 3 PSA based on MACCS2

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seung-Cheol; Han, Seok-Jung; Choi, Sun-Yeong; Lee, Seung-Jun [KAERI, Daejeon (Korea, Republic of); Kim, Wan-Seob [Korea Reliability Technology and System, Daejeon (Korea, Republic of)

    2015-05-15

    Research activities related to Level 3 PSA have naturally waned since the adoption of risk surrogates. Recently, Level 3 PSA has only been performed to the extent required for the operating license of plants under construction. Since the Fukushima accident, interest in a comprehensive site-specific Level 3 PSA has been raised for some compelling reasons, especially the evaluation of the domestic multi-unit site risk effect including other on-site radiological sources (e.g., spent fuel pools, multiple units). Unfortunately, there are no domestic-specific consequence analysis codes or input databases required to perform a site-specific Level 3 PSA. This paper focuses on the development of an input data management system for domestic-specific Level 3 PSA based on MACCS2 (MELCOR Accident Consequence Code System). The authors call it KOSCA-MACCS2 (Korea Off-Site Consequence Analysis based on MACCS2). It serves as an integrated platform for a domestic-specific Level 3 PSA. It also provides pre-processing modules to automatically generate MACCS2 input from diverse types of domestic-specific data, including numerical map data, e.g., meteorological data, numerical population maps, digital land-use maps, economic statistics and so on. Note that some functions still need to be developed and added, e.g., a post-processing module to convert MACCS2 outputs to graphic report forms. Henceforth, it is necessary to develop a Korean-specific Level 3 PSA code as a substitute for the foreign software, MACCS2.

  3. ANALYSIS OF LEVEL OF BOTH SHOULDERS IN PHYSICAL THERAPY STUDENTS

    Directory of Open Access Journals (Sweden)

    Ghazala Noor Nizami

    2017-09-01

    Full Text Available Background: During lectures, students usually sit in an awkward position for prolonged periods of time, and that may cause postural instability. For a good posture, bilateral landmarks should be on the same level when viewed from the front or behind; therefore, both shoulders should also be on the same level. Any alteration in the level of the shoulders in a healthy individual may lead to deformity in the spine or extremities. The objective of this study was to analyze the level of both shoulders in physical therapy students and to find its correlation with the students' perception of their shoulder balance. Methods: An observational (cross-sectional) study was conducted on students of Doctor in Physical Therapy (DPT) from colleges of Physical Therapy, Karachi. 100 students were selected by a simple random sampling technique. Data were collected by administering a questionnaire consisting of closed-ended questions. Afterwards, the level of both shoulders of the students was assessed using a scoliosis meter. Results: Responses showed that 79% of students assumed that both their shoulders were on the same level. When shoulder level was assessed with the scoliosis meter, 37% of students had absolutely level shoulders. Spearman's correlation coefficient (r = 0.046, p = 0.65) showed a weak, positive correlation between the students' perception of shoulder level and the assessed shoulder tilt. Conclusion: This showed that the students' perception of the level of both shoulders was not correlated with the actual levels of the shoulders. Hence, as they did not perceive any unevenness, they may not pay attention to keeping themselves straight.
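
    The statistic reported above is easy to reproduce in form. The snippet below computes a Spearman rank correlation on fabricated placeholder arrays (not the study's data), where 1 codes "level" and 0 codes "uneven".

```python
# Spearman rank correlation between perceived and measured shoulder levelness.
from scipy.stats import spearmanr

perceived = [1, 1, 0, 1, 1, 0, 1, 0, 1, 1]   # 1 = "my shoulders are level"
measured  = [0, 1, 0, 0, 1, 1, 0, 0, 1, 0]   # 1 = level per scoliosis meter

rho, p_value = spearmanr(perceived, measured)
print(f"rho = {rho:.3f}, p = {p_value:.3f}")  # near zero => no association
```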

  4. NOAA Next Generation Radar (NEXRAD) Level 2 Base Data

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — This dataset consists of Level II weather radar data collected from Next-Generation Radar (NEXRAD) stations located in the contiguous United States, Alaska, Hawaii,...

  5. Lot Sizing Based on Stochastic Demand and Service Level Constraint

    Directory of Open Access Journals (Sweden)

    hajar shirneshan

    2012-06-01

    Full Text Available Considering its applications, stochastic lot sizing is a significant subject in production planning. The concept of a service level is also more applicable than a shortage cost from the managers' viewpoint. In this paper, the stochastic multi-period multi-item capacitated lot sizing problem has been investigated under a service level constraint. First, the single-item model was developed with a service level constraint and no capacity constraint, solved using a dynamic programming algorithm, and the optimal solution derived. The model was then generalized to the multi-item problem with a capacity constraint. The stochastic multi-period multi-item capacitated lot sizing problem is NP-hard, hence the model could not be solved by exact optimization approaches; therefore, a simulated annealing method was applied. Finally, in order to evaluate the efficiency of the model, a low level criterion was used.
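
    The simulated annealing step can be sketched compactly. The toy below anneals a production plan for a single-item capacitated instance, with unmet demand penalized as a stand-in for the service-level constraint; all demands, capacities, and costs are invented.

```python
# Minimal simulated-annealing sketch for capacitated lot sizing.
import math, random

random.seed(1)
T_periods = 6
demand   = [40, 55, 30, 70, 45, 60]
capacity = 80
setup, hold, penalty = 100.0, 1.0, 50.0

def cost(plan):
    total, stock = 0.0, 0
    for t in range(T_periods):
        if plan[t] > 0:
            total += setup
        stock += plan[t] - demand[t]
        if stock < 0:                      # unmet demand -> service penalty
            total += penalty * (-stock)
            stock = 0
        total += hold * stock
    return total

def neighbor(plan):
    q = plan[:]
    t = random.randrange(T_periods)
    q[t] = max(0, min(capacity, q[t] + random.choice([-20, -10, 10, 20])))
    return q

plan = [capacity] * T_periods              # feasible starting point
best, best_cost, temp = plan, cost(plan), 100.0
while temp > 0.1:
    cand = neighbor(plan)
    delta = cost(cand) - cost(plan)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        plan = cand
        if cost(plan) < best_cost:
            best, best_cost = plan, cost(plan)
    temp *= 0.995                          # geometric cooling schedule
print("best plan:", best, "cost:", best_cost)
```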

  6. Breath analysis based on micropreconcentrator for early cancer diagnosis

    Science.gov (United States)

    Lee, Sang-Seok

    2018-02-01

    We are developing micropreconcentrators based on micro/nanotechnology to detect trace levels of volatile organic compound (VOC) gases contained in human and canine exhaled breath. The possibility of using exhaled VOC gases as biomarkers for various cancer diagnoses has been previously discussed. For early cancer diagnosis, detection of trace levels of VOC gas is indispensable, and micropreconcentrators based on MEMS technology or nanotechnology are very promising for this purpose. A micropreconcentrator-based breath analysis technique also has advantages in terms of cost and availability for the diagnosis of various cancers. In this paper, we introduce the design, fabrication and evaluation results of our MEMS- and nanotechnology-based micropreconcentrators. In the MEMS-based device, we propose a flower-leaf-type Si microstructure whose shape and configuration are optimized quantitatively by finite element method simulation. The nanotechnology-based micropreconcentrator consists of carbon nanotube (CNT) structures. As a result, we achieve ppb-level VOC gas detection with our micropreconcentrators and a conventional gas chromatography system that by itself detects VOCs on the order of ppm in gas samples. In the performance evaluation, we also confirm that the CNT-based micropreconcentrator shows a 115 times better concentration ratio than the Si-based micropreconcentrator. Moreover, we discuss a commercialization idea for new cancer diagnosis using breath analysis. Future work and preliminary clinical testing in dogs are also discussed.

  7. Robust Mediation Analysis Based on Median Regression

    Science.gov (United States)

    Yuan, Ying; MacKinnon, David P.

    2014-01-01

    Mediation analysis has many applications in psychology and the social sciences. The most prevalent methods typically assume that the error distribution is normal and homoscedastic. However, this assumption may rarely be met in practice, which can affect the validity of the mediation analysis. To address this problem, we propose robust mediation analysis based on median regression. Our approach is robust to various departures from the assumption of homoscedasticity and normality, including heavy-tailed, skewed, contaminated, and heteroscedastic distributions. Simulation studies show that under these circumstances, the proposed method is more efficient and powerful than standard mediation analysis. We further extend the proposed robust method to multilevel mediation analysis, and demonstrate through simulation studies that the new approach outperforms the standard multilevel mediation analysis. We illustrate the proposed method using data from a program designed to increase reemployment and enhance mental health of job seekers. PMID:24079925
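
    A bare-bones version of the median-regression idea can be written with statsmodels: fit both mediation paths at the 0.5 quantile and take the product of coefficients as the indirect effect. The simulated heavy-tailed data and coefficient values below are placeholders, not the authors' implementation.

```python
# Median-regression mediation sketch: indirect effect = a * b.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 300
x = rng.normal(size=n)
m = 0.5 * x + rng.standard_t(df=3, size=n)          # heavy-tailed errors
y = 0.4 * m + 0.2 * x + rng.standard_t(df=3, size=n)

# Path a: M ~ X, fit at the median (q = 0.5) instead of by OLS.
res_a = sm.QuantReg(m, sm.add_constant(x)).fit(q=0.5)
# Path b: Y ~ X + M, also at the median.
X_design = sm.add_constant(np.column_stack([x, m]))
res_b = sm.QuantReg(y, X_design).fit(q=0.5)

a = res_a.params[1]        # effect of X on M
b = res_b.params[2]        # effect of M on Y, controlling for X
print(f"a = {a:.3f}, b = {b:.3f}, indirect effect a*b = {a*b:.3f}")
```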

  8. Trace level and highly selective determination of urea in various real samples based upon voltammetric analysis of diacetylmonoxime-urea reaction product on the carbon nanotube/carbon paste electrode.

    Science.gov (United States)

    Alizadeh, Taher; Ganjali, Mohammad Reza; Rafiei, Faride

    2017-06-29

    In this study an innovative method was introduced for selective and precise determination of urea in various real samples including urine, blood serum, soil and water. The method was based on the square wave voltammetry determination of an electroactive product generated during the diacetylmonoxime reaction with urea. A carbon paste electrode modified with multi-walled carbon nanotubes (MWCNTs) was found to be an appropriate electrochemical transducer for recording the electrochemical signal. It was found that the chemical reaction conditions influenced the analytical signal directly. The calibration graph of the method was linear in the range of 1 × 10⁻⁷ to 1 × 10⁻² mol L⁻¹. The detection limit was calculated to be 52 nmol L⁻¹. The relative standard error of the method was calculated to be 3.9% (n = 3). The developed procedure was applied to urea determination in various real samples including soil, urine, plasma and water. Copyright © 2017 Elsevier B.V. All rights reserved.
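
    The detection-limit arithmetic behind figures like these is standard: fit the calibration line and take LOD = 3·sd(blank)/slope. The sketch below uses invented signal values, chosen only so the result lands near the reported 52 nmol L⁻¹.

```python
# Generic calibration-curve fit and detection-limit estimate.
import numpy as np

conc = np.array([1e-7, 1e-6, 1e-5, 1e-4, 1e-3, 1e-2])      # mol/L
signal = np.array([0.021, 0.20, 2.1, 19.8, 201.0, 1995.0])  # peak current (a.u.)

slope, intercept = np.polyfit(conc, signal, 1)   # linear calibration
sd_blank = 0.0035               # std. dev. of repeated blank signals (invented)
lod = 3 * sd_blank / slope      # LOD = 3 * sd(blank) / slope
print(f"slope = {slope:.3g}, LOD = {lod:.3g} mol/L")   # ~5e-8, i.e. ~50 nmol/L
```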

  9. Factors affecting public prejudice and social distance on mental illness: analysis of contextual effect by multi-level analysis.

    Science.gov (United States)

    Jang, Hyeongap; Lim, Jun-Tae; Oh, Juhwan; Lee, Seon-Young; Kim, Yong-Ik; Lee, Jin-Seok

    2012-03-01

    While there have been many quantitative studies on the public's attitudes towards mental illness, quantitative studies focusing on the contextual effect on those attitudes are hard to find. The purpose of this study was to identify factors that affect the public's beliefs and attitudes, including contextual effects. We analyzed a survey on the public's beliefs and attitudes towards mental illness in Korea with multi-level analysis, treating prejudice as an intermediate outcome and social distance as a final outcome. We then focused on the associations of individual and regional socio-economic factors, familiarity, and knowledge, based on a comparison of the intermediate and final outcomes. Prejudice was not explained by regional variables and was correlated only with individual factors: it increased with age and decreased with a high education level. However, social distance controlling for prejudice increased in females, in people with a high education level, and in regions with a high education level and a high proportion of the elderly. Accordingly, social distance without controlling for prejudice increased in females, in the elderly, in highly educated people, and in regions with a high education level and an aged community. The multi-level analysis of the regional variables suggests that social distance towards mental illness is not only determined by individual factors but also influenced by the surroundings, and could be tackled more effectively by considering the relevant regional context together with individual characteristics.
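
    The modelling pattern described here, individual predictors plus a regional context term, is what a mixed model with a regional random intercept captures. The sketch below fits one on simulated data; all variable names and effect sizes are invented placeholders for the survey's actual measures.

```python
# Multi-level (mixed) model sketch: individual fixed effects
# plus a regional random intercept for a social-distance score.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n, n_regions = 600, 20
region = rng.integers(0, n_regions, n)
region_effect = rng.normal(0, 0.5, n_regions)    # the contextual effect

df = pd.DataFrame({
    "age": rng.uniform(20, 70, n),
    "female": rng.integers(0, 2, n),
    "educ": rng.integers(0, 2, n),               # 1 = high education
    "region": region,
})
df["social_distance"] = (0.02 * df["age"] + 0.3 * df["female"]
                         + 0.2 * df["educ"] + region_effect[region]
                         + rng.normal(0, 1, n))

model = smf.mixedlm("social_distance ~ age + female + educ",
                    df, groups=df["region"]).fit()
print(model.summary())   # fixed effects plus the regional variance component
```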

  10. Trial-Based Functional Analysis Informs Treatment for Vocal Scripting.

    Science.gov (United States)

    Rispoli, Mandy; Brodhead, Matthew; Wolfe, Katie; Gregori, Emily

    2018-05-01

    Research on trial-based functional analysis has primarily focused on socially maintained challenging behaviors. However, procedural modifications may be necessary to clarify ambiguous assessment results. The purposes of this study were to evaluate the utility of iterative modifications to trial-based functional analysis on the identification of putative reinforcement and subsequent treatment for vocal scripting. For all participants, modifications to the trial-based functional analysis identified a primary function of automatic reinforcement. The structure of the trial-based format led to identification of social attention as an abolishing operation for vocal scripting. A noncontingent attention treatment was evaluated using withdrawal designs for each participant. This noncontingent attention treatment resulted in near zero levels of vocal scripting for all participants. Implications for research and practice are presented.

  11. Micromechanics Based Failure Analysis of Heterogeneous Materials

    Science.gov (United States)

    Sertse, Hamsasew M.

    In recent decades, heterogeneous materials have been extensively used in various industries such as aerospace, defense, and automotive due to their desirable specific properties and excellent capability of accumulating damage. Despite their wide use, there are numerous challenges associated with the application of these materials. One of the main challenges is the lack of accurate tools to predict the initiation, progression and final failure of these materials under various thermomechanical loading conditions. Although failure is usually treated at the macro- and meso-scale level, the initiation and growth of failure are complex phenomena across multiple scales. The objective of this work is to enable the mechanics of structure genome (MSG) and its companion code SwiftComp to analyze the initial failure (also called static failure), progressive failure, and fatigue failure of heterogeneous materials using a micromechanics approach. The initial failure is evaluated at each numerical integration point using pointwise and nonlocal approaches for each constituent of the heterogeneous material. The effects of imperfect interfaces among the constituents are also investigated using a linear traction-displacement model. Moreover, the progressive and fatigue damage analyses are conducted using a continuum damage mechanics (CDM) approach, with various failure criteria applied at a material point to analyze progressive damage in each constituent. The constitutive equation of a damaged material is formulated based on a consistent irreversible thermodynamics approach. The overall tangent modulus of uncoupled elastoplastic damage for negligible back-stress effect is derived. The initiation of plasticity and damage in each constituent is evaluated at each numerical integration point using a nonlocal approach. The accumulated plastic strain and anisotropic damage evolution variables are iteratively solved using an incremental algorithm. The damage analyses

  12. Accident sequence precursor analysis level 2/3 model development

    International Nuclear Information System (INIS)

    Lui, C.H.; Galyean, W.J.; Brownson, D.A.

    1997-01-01

    The US Nuclear Regulatory Commission's Accident Sequence Precursor (ASP) program currently uses simple Level 1 models to assess the conditional core damage probability for operational events occurring in commercial nuclear power plants (NPP). Since not all accident sequences leading to core damage will result in the same radiological consequences, it is necessary to develop simple Level 2/3 models that can be used to analyze the response of the NPP containment structure in the context of a core damage accident, estimate the magnitude of the resulting radioactive releases to the environment, and calculate the consequences associated with these releases. The simple Level 2/3 model development work was initiated in 1995, and several prototype models have been completed. Once developed, these simple Level 2/3 models are linked to the simple Level 1 models to provide risk perspectives for operational events. This paper describes the methods implemented for the development of these simple Level 2/3 ASP models, and the linkage process to the existing Level 1 models

  13. Multi-level human motion analysis for surveillance applications

    NARCIS (Netherlands)

    Lao, W.; Han, Jungong; With, de P.H.N.; Rabbani, M.; Stevenson, R.L.

    2009-01-01

    In this paper, we study a flexible framework for semantic analysis of human motion from a monocular surveillance video. Successful trajectory estimation and human-body modeling facilitate the semantic analysis of human activities in video sequences. As a first contribution, we propose a flexible

  14. Analysis of Sea Level Rise in Singapore Strait

    Science.gov (United States)

    Tkalich, Pavel; Luu, Quang-Hung

    2013-04-01

    Sea level in Singapore Strait is governed by phenomena at various scales, from global to local. Global signals are dominated by climate change and multi-decadal variability and the associated sea level rise; at the regional scale, seasonal sea level variability is caused by ENSO-modulated monsoons; locally, astronomic tides are the strongest force. Tide gauge records in Singapore Strait are analyzed to derive the local sea level trend, and attempts are made to attribute the observed sea level variability to phenomena at various scales. It is found that at the annual scale, sea level anomalies in Singapore Strait are quasi-periodic, of the order of ±15 cm, the highest during the northeast monsoon and the lowest during the southwest monsoon. Interannual regional sea level falls are associated with El Niño events, while rises are related to La Niña episodes; both variations are in the range of ±9 cm. At the multi-decadal scale, sea level in Singapore Strait has been rising at a rate of 1.2-1.9 mm/year for the period 1975-2009, 2.0±0.3 mm/year for 1984-2009, and 1.3-4.7 mm/year for 1993-2009. Compared with the respective global trends of 2.0±0.3, 2.4, and 2.8±0.8 mm/year, the Singapore Strait sea level rise was weaker in the earlier period and stronger in the recent decade.
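
    Rates like these come from fitting a straight line to the gauge record. A minimal version on synthetic data (not the Singapore Strait record):

```python
# Least-squares trend of annual-mean sea level, in mm/year.
import numpy as np

years = np.arange(1975, 2010)
rng = np.random.default_rng(3)
sea_level_mm = 1.5 * (years - 1975) + rng.normal(0, 40, years.size)  # trend + noise

slope, intercept = np.polyfit(years, sea_level_mm, 1)
print(f"trend {years[0]}-{years[-1]}: {slope:.2f} mm/year")
```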

  15. Quantitative scenario analysis of low and intermediate level radioactive repository

    International Nuclear Information System (INIS)

    Lee, Keon Jae; Lee, Sang Yoon; Park, Keon Baek; Song, Min Cheon; Lee, Ho Jin

    1998-03-01

    Scenarios for a hypothetical radioactive waste disposal facility are derived through sub-component characteristic analysis and conceptual modeling, and the constructed scenarios are quantitatively analyzed in terms of annual effective dose equivalent. The study follows the sequence of a performance assessment of a radioactive waste disposal facility: groundwater flow analysis, source term analysis, groundwater transport, surface water transport, and dose and pathways. The program modules VAM2D-PAGAN-GENII are used for the quantitative scenario analysis. Detailed data used in these modules come from experimental data for Korean territory and from default data provided with the modules. Where data are missing for code execution, values are estimated through reasonable engineering judgment.

  16. Higher-level fusion for military operations based on abductive inference: proof of principle

    Science.gov (United States)

    Pantaleev, Aleksandar V.; Josephson, John

    2006-04-01

    The ability of contemporary military commanders to estimate and understand complicated situations already suffers from information overload, and the situation can only grow worse. We describe a prototype application that uses abductive inferencing to fuse information from multiple sensors to evaluate the evidence for higher-level hypotheses that are close to the levels of abstraction needed for decision making (approximately JDL levels 2 and 3). Abductive inference (abduction, inference to the best explanation) is a pattern of reasoning that occurs naturally in diverse settings such as medical diagnosis, criminal investigations, scientific theory formation, and military intelligence analysis. Because abduction is part of common-sense reasoning, implementations of it can produce reasoning traces that are very human understandable. Automated abductive inferencing can be deployed to augment human reasoning, taking advantage of computation to process large amounts of information, and to bypass limits to human attention and short-term memory. We illustrate the workings of the prototype system by describing an example of its use for small-unit military operations in an urban setting. Knowledge was encoded as it might be captured prior to engagement from a standard military decision making process (MDMP) and analysis of commander's priority intelligence requirements (PIR). The system is able to reasonably estimate the evidence for higher-level hypotheses based on information from multiple sensors. Its inference processes can be examined closely to verify correctness. Decision makers can override conclusions at any level and changes will propagate appropriately.
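
    The flavor of the inference can be shown in a few lines: score each candidate hypothesis by how much of the observed evidence it explains, weighted by plausibility, and keep the best explanation. The hypotheses, evidence items, and priors below are invented for illustration; a real system would add consistency checks and hierarchical hypothesis composition.

```python
# Toy abductive-inference step: pick the hypothesis that best explains
# the evidence, trading off coverage against prior plausibility.
evidence = {"muzzle_flash", "radio_chatter", "vehicle_heat", "dust_plume"}

hypotheses = {
    "ambush_prepared":  {"explains": {"muzzle_flash", "radio_chatter"}, "prior": 0.2},
    "convoy_moving":    {"explains": {"vehicle_heat", "dust_plume"},    "prior": 0.5},
    "combined_assault": {"explains": set(evidence),                     "prior": 0.1},
}

def score(h):
    covered = len(h["explains"] & evidence) / len(evidence)
    return covered * h["prior"]      # crude "coverage x plausibility"

for name, h in sorted(hypotheses.items(), key=lambda kv: -score(kv[1])):
    print(f"{name}: score {score(h):.2f}")
best = max(hypotheses, key=lambda name: score(hypotheses[name]))
print("best explanation:", best)
```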

  17. Analysis of superconducting microstrip resonator at various microwave power levels

    International Nuclear Information System (INIS)

    Srivastava, G.P.; Jacob, M.V.; Jayakumar, M.; Bhatnagar, P.K.; Kataria, N.D.

    1997-01-01

    The real and imaginary parts of the surface impedance of YBCO superconductors have been studied at different microwave power levels. Using the relations for the critical current density and the grain boundary resistance, a relation for calculating the power dependence of the surface resistance has been obtained. Also, a relation to find the resonant frequency of a superconducting microstrip resonator at various input power levels has been derived. Measurements have been carried out on various microstrip resonators to study the variation of surface resistance and resonant frequency at different rf power levels. The experimental results are in good agreement with theoretical results. copyright 1997 American Institute of Physics

  18. National high-level waste systems analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy.

  19. National high-level waste systems analysis report

    International Nuclear Information System (INIS)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.

    1995-09-01

    This report documents the assessment of budgetary impacts, constraints, and repository availability on the storage and treatment of high-level waste and on both existing and pending negotiated milestones. The impacts of the availabilities of various treatment systems on schedule and throughput at four Department of Energy sites are compared to repository readiness in order to determine the prudent application of resources. The information modeled for each of these sites is integrated with a single national model. The report suggests a high-level-waste model that offers a national perspective on all high-level waste treatment and storage systems managed by the Department of Energy

  20. Analysis of impurities at trace levels in metallic niobium by instrumental neutron activation analysis

    International Nuclear Information System (INIS)

    Favaro, D.I.T.; Vasconcellos, M.B.A.; Santos, C.

    1989-10-01

    The interest in obtaining niobium of high purity has increased due to recent applications of this material in vacuum and high-temperature technologies and to its potential uses in the aeronautical and aerospace industries and in the nuclear energy field. In the present work, a procedure for the analysis of impurities at the parts-per-million level in electrolytic and non-electrolytic niobium samples has been established. The method of neutron activation analysis followed by high-resolution gamma-ray spectrometry has been used. The elements Al, Na, Mn, Cl and In, at the ppm level, and Y, at the percentage level, were determined after irradiation of 1 to 20 minutes under a thermal neutron flux of 10¹¹ n·cm⁻²·s⁻¹ at the IEA-R1 reactor of IPEN-CNEN/SP. The γ-rays from the radioactive products were measured with a Ge(Li) detector coupled to a 4096-channel analyzer. The elements Ta, Cr and W, at the parts-per-million level, were determined with an irradiation of 8 hours under a thermal neutron flux of 10¹² n·cm⁻²·s⁻¹. (author)

  1. ITIL Based Service Level Management if SLAs Cover Security

    Directory of Open Access Journals (Sweden)

    Tomas Feglar

    2005-08-01

    Full Text Available The current level of information technology creates new perspectives for a more IT-service-oriented market. The quality of these services requires a slightly different approach than was applied to products, including software. No IT services are delivered and supported in a risk-free environment. Risks should be considered consistently with IT service quality gaps from a Service Level Management (SLM) perspective. SLM is one of the ITIL modules that are widely used within the IT service industry. We identified some weaknesses in how SLM is developed in an ITIL environment when the service level agreement (SLA) covers security. We argue that in such cases an architecture modeling and risk assessment approach lets us effectively control the analytical effort related to risk identification and understanding. Risk-driven countermeasures designed in the next step (risk treatment) have a significant impact on SLM, especially from a responsibility perspective. To demonstrate the importance of SLM in practice, we analyze the SLA synthesis process in a CCI (Cyber Critical Infrastructure) environment.

  2. A Variational Level Set Approach Based on Local Entropy for Image Segmentation and Bias Field Correction.

    Science.gov (United States)

    Tang, Jian; Jiang, Xiaoliang

    2017-01-01

    Image segmentation has always been a considerable challenge in image analysis and understanding due to intensity inhomogeneity, commonly known as bias field. In this paper, we present a novel region-based approach based on local entropy for segmenting images and estimating the bias field simultaneously. First, a local Gaussian distribution fitting (LGDF) energy function is defined as a weighted energy integral, where the weight is the local entropy derived from the grey-level distribution of the local image. The means of this objective function carry a multiplicative factor that estimates the bias field in the transformed domain, and the bias field prior is fully exploited, so our model can estimate the bias field more accurately. Finally, by minimizing this energy function with a level set regularization term, image segmentation and bias field estimation are achieved simultaneously. Experiments on images of various modalities demonstrated the superior performance of the proposed method compared with other state-of-the-art approaches.

  3. Development of a computerized data base for low-level waste leaching data

    International Nuclear Information System (INIS)

    Dougherty, D.R.; Colombo, P.

    1987-01-01

    A computerized data base (db) of low-level waste (LLW) leaching data is being compiled by Brookhaven National Laboratory under contract to the DOE Low-Level Waste Management Program. Although this db is being compiled as part of an effort to develop accelerated leach test procedures for LLW forms, others involved in LLW management may find it useful. The db is implemented on an IBM PC XT and is self-contained in that its data manipulation and analysis programs are not proprietary (i.e., need not be purchased). The db includes data from the Accelerated Leach Test(s) Program plus selected literature data, which have been chosen based on criteria that include completeness of the experimental description and elucidation of leaching mechanisms. 6 references, 5 figures, 3 tables

  4. FPGA-based multimodal embedded sensor system integrating low- and mid-level vision.

    Science.gov (United States)

    Botella, Guillermo; Martín H, José Antonio; Santos, Matilde; Meyer-Baese, Uwe

    2011-01-01

    Motion estimation is a low-level vision task that is especially relevant due to its wide range of real-world applications. Many of the best motion estimation algorithms include features found in mammalian visual systems, which would demand huge computational resources and are therefore not usually available in real time. In this paper we present a novel bioinspired sensor based on the synergy between optical flow and orthogonal variant moments. The bioinspired sensor has been designed for Very Large Scale Integration (VLSI) using properties of the mammalian cortical motion pathway. This sensor combines low-level primitives (optical flow and image moments) to produce a mid-level vision abstraction layer. The results are described through experiments showing the validity of the proposed system, together with an analysis of the computational resources and performance of the applied algorithms.

  5. Probabilistic safety analysis second level of WWER-TOI

    International Nuclear Information System (INIS)

    Chekin, A.A.; Bajkova, E.V.; Levin, V.N.; Shishina, E.S.

    2015-01-01

    Level-1 and Level-2 probabilistic safety assessment (PSA) gives a comprehensive qualitative and quantitative evaluation of the safety of the project. Operation of the unit at rated power is considered. In the development of the Level-2 PSA, the nuclear fuel in the reactor core is considered as the source of radioactivity. The initiating events considered are internal events (including loss of power) that may arise from failures of NPP systems, equipment or components, or from erroneous actions of personnel. Overall, the assessment of the level of project safety shows that the WWER-TOI project complies with the requirements of the terms of reference, as well as with all requirements of modern Russian and foreign regulatory documents in the field of safety.

  6. Analysis of water-level fluctuations in Wisconsin wells

    Science.gov (United States)

    Patterson, G.L.; Zaporozec, A.

    1987-01-01

    More than 60 percent of the residents of Wisconsin use ground water as their primary water source. Water supplies presently are abundant, but ground-water levels continually fluctuate in response to natural factors and human-related stresses. A better understanding of the magnitude, duration, and frequency of past fluctuations, and the factors controlling these fluctuations may help anticipate future changes in ground-water levels.

  7. Structural level characterization of base oils using advanced analytical techniques

    KAUST Repository

    Hourani, Nadim

    2015-05-21

    Base oils, blended for finished lubricant formulations, are classified by the American Petroleum Institute into five groups, viz., groups I-V. Groups I-III consist of petroleum-based hydrocarbons whereas groups IV and V are made of synthetic polymers. In the present study, five base oil samples belonging to groups I and III were extensively characterized using high performance liquid chromatography (HPLC), comprehensive two-dimensional gas chromatography (GC×GC), and Fourier transform ion cyclotron resonance mass spectrometry (FT-ICR MS) equipped with atmospheric pressure chemical ionization (APCI) and atmospheric pressure photoionization (APPI) sources. First, the capabilities and limitations of each analytical technique were evaluated, and then the available information was combined to reveal compositional details of the base oil samples studied. HPLC showed the overwhelming presence of saturated over aromatic compounds in all five base oils. A similar trend was corroborated using GC×GC, which yielded semiquantitative information on the compound classes present in the samples and provided further details on the carbon number distributions within these classes. In addition to the chromatography methods, FT-ICR MS supplemented the compositional information by resolving the aromatic compounds into alkyl- and naphtheno-substituted families. APCI proved more effective than APPI for the ionization of the highly saturated base oil components. Furthermore, beyond detailed information on hydrocarbon molecules, FT-ICR MS revealed the presence of saturated and aromatic sulfur species in all base oil samples. The results presented herein offer a unique perspective into the detailed molecular structure of base oils typically used to formulate lubricants. © 2015 American Chemical Society.

  8. Safety analysis of the transportation of high-level radioactive waste

    International Nuclear Information System (INIS)

    Murphy, E.S.; Winegardner, W.K.

    1975-01-01

    An analysis of the risk from transportation of solidified high-level waste is being performed at Battelle-Northwest as part of a comprehensive study of the management of high-level waste. The risk analysis study makes use of fault trees to identify failure events and to specify combinations of events which could result in breach of containment and a release of radioactive material to the environment. Contributions to risk analysis methodology which have been made in connection with this study include procedures for identification of dominant failure sequences, methods for quantifying the effects of probabilistic failure events, and computer code development. Preliminary analysis based on evaluation of the rail transportation fault tree indicates that the dominant failure sequences for transportation of solidified high-level waste will be those related to railroad accidents. Detailed evaluation of rail accident failure sequences is proceeding and is making use of the limited frequency-severity data which is available in the literature. (U.S.)

  9. Radio Frequency Based Water Level Monitor and Controller for ...

    African Journals Online (AJOL)

    Similarly, the control unit of the prototype performs automatic on/off switching of a single-phase centrifugal water pump (220 V, 0.5 hp motor) via a motor driver circuit (relay). It also incorporates a buzzer that beeps briefly when the water level hits 100%, causing the pump to be switched off, but when water ...

  10. Special Analysis for Disposal of High-Concentration I-129 Waste in the Intermediate-Level Vaults at the E-Area Low-Level Waste Facility

    International Nuclear Information System (INIS)

    Collard, L.B.

    2000-01-01

    This revision was prepared to address comments from DOE-SR that arose following publication of revision 0. This Special Analysis (SA) addresses disposal of wastes with high concentrations of I-129 in the Intermediate-Level (IL) Vaults at the operating, low-level radioactive waste disposal facility (the E-Area Low-Level Waste Facility or LLWF) on the Savannah River Site (SRS). This SA provides limits for disposal in the IL Vaults of high-concentration I-129 wastes, including activated carbon beds from the Effluent Treatment Facility (ETF), based on their measured, waste-specific Kds

  11. Special Analysis for Disposal of High-Concentration I-129 Waste in the Intermediate-Level Vaults at the E-Area Low-Level Waste Facility

    Energy Technology Data Exchange (ETDEWEB)

    Collard, L.B.

    2000-09-26

    This revision was prepared to address comments from DOE-SR that arose following publication of revision 0. This Special Analysis (SA) addresses disposal of wastes with high concentrations of I-129 in the Intermediate-Level (IL) Vaults at the operating, low-level radioactive waste disposal facility (the E-Area Low-Level Waste Facility or LLWF) on the Savannah River Site (SRS). This SA provides limits for disposal in the IL Vaults of high-concentration I-129 wastes, including activated carbon beds from the Effluent Treatment Facility (ETF), based on their measured, waste-specific Kds.

  12. On the Relationship between Variational Level Set-Based and SOM-Based Active Contours

    Science.gov (United States)

    Abdelsamea, Mohammed M.; Gnecco, Giorgio; Gaber, Mohamed Medhat; Elyan, Eyad

    2015-01-01

    Most Active Contour Models (ACMs) treat image segmentation as a functional optimization problem, dividing an image into several regions by optimizing a suitable functional. Among ACMs, variational level set methods have been used to build active contours capable of modeling arbitrarily complex shapes; moreover, they can also handle topological changes of the contours. Self-Organizing Maps (SOMs) have attracted the attention of many computer vision scientists, particularly for modeling an active contour by utilizing the prototypes (weights) of a SOM to control the evolution of the contour. SOM-based models have been proposed with the aim of exploiting the specific ability of SOMs to learn the edge-map information via their topology-preservation property, and of overcoming some drawbacks of other ACMs, such as trapping into local minima of the image energy functional to be minimized. In this survey, we illustrate the main concepts of variational level set-based ACMs and SOM-based ACMs and their relationship, and we comprehensively review the development of their state-of-the-art models from a machine learning perspective, with a focus on their strengths and weaknesses. PMID:25960736

  13. Processor farming in two-level analysis of historical bridge

    Science.gov (United States)

    Krejčí, T.; Kruis, J.; Koudelka, T.; Šejnoha, M.

    2017-11-01

    This contribution presents a processor farming method in connection with a multi-scale analysis. In this method, each macroscopic integration point or each finite element is connected with a certain mesoscopic problem represented by an appropriate representative volume element (RVE). The solution of a meso-scale problem then provides the effective parameters needed on the macro-scale. Such an analysis is suitable for parallel computing because the meso-scale problems can be distributed among many processors. The application of the processor farming method to a real-world masonry structure is illustrated by an analysis of Charles Bridge in Prague. The three-dimensional numerical model simulates the coupled heat and moisture transfer of one half of arch No. 3, and it is part of a complex hygro-thermo-mechanical analysis developed to determine the influence of climatic loading on the current state of the bridge.

  14. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    International Nuclear Information System (INIS)

    K. Rehfeldt

    2004-01-01

    This report is an updated analysis of water-level data performed to provide the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain based on an alternative interpretation of perched water conditions. That revision developed computer files containing: Water-level data within the model area (DTN: GS010908312332.002); A table of known vertical head differences (DTN: GS010908312332.003); and A potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells. In addition to being utilized

  15. Water-Level Data Analysis for the Saturated Zone Site-Scale Flow and Transport Model

    Energy Technology Data Exchange (ETDEWEB)

    K. Rehfeldt

    2004-10-08

    This report is an updated analysis of water-level data performed to provide the "Saturated Zone Site-Scale Flow Model" (BSC 2004 [DIRS 170037]) (referred to as the saturated zone (SZ) site-scale flow model or site-scale SZ flow model in this report) with the configuration of the potentiometric surface, target water-level data, and hydraulic gradients for calibration of groundwater flow models. This report also contains an expanded discussion of uncertainty in the potentiometric-surface map. The analysis of the potentiometric data presented in Revision 00 of this report (USGS 2001 [DIRS 154625]) provides the configuration of the potentiometric surface, target heads, and hydraulic gradients for the calibration of the SZ site-scale flow model (BSC 2004 [DIRS 170037]). Revision 01 of this report (USGS 2004 [DIRS 168473]) used updated water-level data for selected wells through the year 2000 as the basis for estimating water-level altitudes and the potentiometric surface in the SZ site-scale flow and transport model domain based on an alternative interpretation of perched water conditions. That revision developed computer files containing: Water-level data within the model area (DTN: GS010908312332.002); A table of known vertical head differences (DTN: GS010908312332.003); and A potentiometric-surface map (DTN: GS010608312332.001) using an alternative concept from that presented by USGS (2001 [DIRS 154625]) for the area north of Yucca Mountain. The updated water-level data presented in USGS (2004 [DIRS 168473]) include data obtained from the Nye County Early Warning Drilling Program (EWDP) Phases I and II and data from Borehole USW WT-24. This document is based on Revision 01 (USGS 2004 [DIRS 168473]) and expands the discussion of uncertainty in the potentiometric-surface map. This uncertainty assessment includes an analysis of the impact of more recent water-level data and the impact of adding data from the EWDP Phases III and IV wells. In

  16. A systems biology approach for pathway level analysis

    OpenAIRE

    Draghici, Sorin; Khatri, Purvesh; Tarca, Adi Laurentiu; Amin, Kashyap; Done, Arina; Voichita, Calin; Georgescu, Constantin; Romero, Roberto

    2007-01-01

    A common challenge in the analysis of genomics data is trying to understand the underlying phenomenon in the context of all complex interactions taking place on various signaling pathways. A statistical approach using various models is universally used to identify the most relevant pathways in a given experiment. Here, we show that the existing pathway analysis methods fail to take into consideration important biological aspects and may provide incorrect results in certain situations. By usin...

  17. MAAP4.0.7 analysis and justification for PRA level 1 mission success criteria

    International Nuclear Information System (INIS)

    Butler, J.S.; Kapitz, D.; Martin, R.P.; Seifaee, F.; Sundaram, R.K.

    2008-01-01

    The U.S. EPR is a 4590 MWth evolutionary pressurized water reactor that incorporates proven technology with innovative system architecture to provide an unprecedented level of safety. One measure of safety is provided by Probabilistic Risk Assessment (PRA). PRA Level 1 concerns the evaluation of core damage frequency based on various initiating events and the success or failure of various plant event mitigation features. Determining this measure requires mission success criteria, which are used to build the logic that makes up the fault trees and event trees of the Level 1 PRA. Developing mission success criteria for the wide variety of accident sequences modeled in the PRA Level 1 model requires a large number of thermal-hydraulic calculations. The MAAP4 code, developed by Fauske and Associates, Inc. and distributed by EPRI, was chosen to perform these calculations because of its fast computation times relative to more sophisticated thermal-hydraulics codes. This is a unique application of MAAP4, which was developed specifically for severe accident and PRA Level 2 analysis. As such, a study was performed to assess MAAP4's thermal-hydraulic response capabilities against AREVA's S-RELAP5 best-estimate integral systems thermal-hydraulic analysis code. (authors)

  18. Integrating Micro-level Interactions with Social Network Analysis in Tie Strength Research

    DEFF Research Database (Denmark)

    Torre, Osku; Gupta, Jayesh Prakash; Kärkkäinen, Hannu

    2017-01-01

    A social tie is a target for ongoing, high-level scientific debate. Measuring tie strength in social networks has been an important topic for academic studies since Mark Granovetter's seminal papers in the 1970s. However, it is still a problematic issue, mainly for two reasons: 1) existing tie ... strengthening process in online social networks. Therefore, we suggest a new approach to tie strength research, which focuses on studying communication patterns (edges) rather than actors (nodes) in a social network. In this paper we build a social network analysis-based approach to enable the evaluation of tie strength based on reciprocal interaction from publicly available Facebook data, and suggest that this approach could work as a basis for further tie strength studies. Our approach makes use of weak tie theory, and enables researchers to study micro-level interactions (i.e. discussions, messages ...)
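
    A toy version of the reciprocal-interaction measure: count directed messages per pair and let the tie strength be the reciprocated minimum. The edge list is invented; real input would come from the collected Facebook data.

```python
# Reciprocity-based tie strength from a directed message log.
from collections import Counter

messages = [("ann", "bob"), ("bob", "ann"), ("ann", "bob"),
            ("ann", "cat"), ("cat", "dan"), ("dan", "cat")]

directed = Counter(messages)                    # counts per (sender, receiver)
pairs = {frozenset(edge) for edge in directed}  # undirected pairs

for pair in pairs:
    a, b = sorted(pair)
    forward, backward = directed[(a, b)], directed[(b, a)]
    strength = min(forward, backward)           # only reciprocated flow counts
    label = "strong" if strength > 0 else "weak (one-way)"
    print(f"{a} <-> {b}: strength {strength} ({label})")
```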

  19. Atmospheric forcing of decadal Baltic Sea level variability in the last 200 years. A statistical analysis

    Energy Technology Data Exchange (ETDEWEB)

    Huenicke, B. [GKSS-Forschungszentrum Geesthacht GmbH (Germany). Inst. fuer Kuestenforschung

    2008-11-06

    This study aims at estimating the impact of different atmospheric factors on past sea-level variations (up to 200 years) in the Baltic Sea by statistically analysing the relationship between Baltic Sea level records and observational and proxy-based reconstructed climatic data sets. The focus lies on the identification and, where possible, quantification of the contribution of sea-level pressure (wind), air temperature and precipitation to the low-frequency (decadal and multi-decadal) variability of Baltic Sea level. It is known that wind forcing is the main factor explaining average Baltic Sea level variability at inter-annual to decadal timescales, especially in wintertime. In this thesis it is statistically estimated to what extent other regional climate factors contribute to the spatially heterogeneous Baltic Sea level variations around the isostatic trend at multi-decadal timescales. Although the statistical analysis cannot be completely conclusive, as the potential climate drivers are all statistically interrelated to some degree, the results indicate that precipitation should be taken into account as an explanatory variable for sea-level variations. On the one hand, the amplitude of the annual cycle of Baltic Sea level has increased throughout the 20th century, and precipitation seems to be the only factor among those analysed (wind through the SLP field, the barometric effect, temperature and precipitation) that can account for this evolution. On the other hand, precipitation improves the ability to hindcast inter-annual variations of sea level in some regions and seasons, especially in the southern Baltic in summertime. The mechanism by which precipitation exerts its influence on Baltic Sea level is not ascertained in this statistical analysis due to the lack of long salinity time series. This result, however, represents a working hypothesis that can be confirmed or disproved by long simulations of the Baltic Sea system - ocean

  20. Analysis of Foreign Direct Investment Determinants at the Level of a County in Romania

    Directory of Open Access Journals (Sweden)

    Răzvan Cătălin DOBREA

    2012-06-01

    Full Text Available The attraction of foreign direct investment (FDI) represents a major challenge both at the macroeconomic level, from the perspective of the central public authorities, and at the microeconomic level, from the point of view of all entities involved in this process. While the FDI attraction problem is widely debated in the specialty literature at the macro level, at the micro level representative solutions and models are still being sought. The main objective of the research was the identification and analysis of FDI attractiveness variables, using the example of a county in Romania. Based on the results obtained, we proposed the development of a model useful both for potential investors and for other regions with lower performance that seek to improve their situation.

  1. In-situ nitrite analysis in high level waste tanks

    International Nuclear Information System (INIS)

    O'Rourke, P.E.; Prather, W.S.; Livingston, R.R.

    1992-01-01

    The Savannah River Site produces special nuclear materials used in the defense of the United States. Most of the processes at SRS are primarily chemical separations and purifications. In-situ chemical analyses help improve the safety, efficiency and quality of these operations. One area where in-situ fiberoptic spectroscopy can have a great impact is the management of high level radioactive waste. High level radioactive waste at SRS is stored in more than 50 large waste tanks. The waste exists as a slurry of nitrate salts and metal hydroxides at pH values higher than 10. Sodium nitrite is added to the tanks as a corrosion inhibitor. In-situ fiberoptic probes are being developed to measure the nitrate, nitrite and hydroxide concentrations in both the liquid and solid fractions. Nitrite levels can be measured between 0.01 M and 1 M in a 1 mm pathlength optical cell.

  2. Remote-Handled Low Level Waste Disposal Project Alternatives Analysis

    Energy Technology Data Exchange (ETDEWEB)

    David Duncan

    2010-10-01

    This report identifies, evaluates, and compares alternatives for meeting the U.S. Department of Energy’s mission need for management of remote-handled low-level waste generated by the Idaho National Laboratory and its tenants. Each alternative identified in the Mission Need Statement for the Remote-Handled Low-Level Waste Treatment Project is described and evaluated for capability to fulfill the mission need. Alternatives that could meet the mission need are further evaluated and compared using criteria of cost, risk, complexity, stakeholder values, and regulatory compliance. The alternative for disposal of remote-handled low-level waste that has the highest confidence of meeting the mission need and represents best value to the government is to build a new disposal facility at the Idaho National Laboratory Site.

  3. Feeding different levels of a barley based concentrate to Jersey ...

    African Journals Online (AJOL)

    maryna

    there was no further increase when feeding the high level of concentrate. Live weight ... 33º 58′ 38″ S, 22º 25′ 16 ″E and at an altitude of 210 m. The milk ... detergent fibre was determined by heating a 0.5 g sample to boiling point in 100 mL of neutral detergent plus. 50 µL of heat ... of bags in cold water for 10 minutes.

  4. Validation of cryoSat-2 based lake levels

    DEFF Research Database (Denmark)

    Nielsen, Karina; Stenseng, Lars; Andersen, Ole Baltazar

    In this study, which is part of the FP7 project Land and Ocean take up from Sentinel-3 (LOTUS), we demonstrate the potential of SAR altimetry. We consider lakes of various sizes and evaluate the CryoSat-2 derived lake levels in terms of along-track precision and agreement with in-situ data. To derive lake level time series we apply a state-space model with a robust handling of erroneous data. Instead of attempting to identify and remove the polluted observations, we use a mixture distribution to describe the observation noise, which prevents the polluted observations from biasing our final ...

  5. Bioprinting: an assessment based on manufacturing readiness levels.

    Science.gov (United States)

    Wu, Changsheng; Wang, Ben; Zhang, Chuck; Wysk, Richard A; Chen, Yi-Wen

    2017-05-01

    Over the last decade, bioprinting has emerged as a promising technology in the fields of tissue engineering and regenerative medicine. With recent advances in additive manufacturing, bioprinting is poised to provide patient-specific therapies and new approaches for tissue and organ studies, drug discoveries and even food manufacturing. Manufacturing Readiness Level (MRL) is a method that has been applied to assess manufacturing maturity and to identify risks and gaps in technology-manufacturing transitions. Technology Readiness Level (TRL) is used to evaluate the maturity of a technology. This paper reviews recent advances in bioprinting following the MRL scheme and addresses corresponding MRL levels of engineering challenges and gaps associated with the translation of bioprinting from lab-bench experiments to ultimate full-scale manufacturing of tissues and organs. According to our step-by-step TRL and MRL assessment, after years of rigorous investigation by the biotechnology community, bioprinting is on the cusp of entering the translational phase where laboratory research practices can be scaled up into manufacturing products specifically designed for individual patients.

  6. Evaluation of low-level waste analysis using the MADAM system

    Energy Technology Data Exchange (ETDEWEB)

    Foster, L.A.; Wachter, J.R.; Hagan, R.C. [Los Alamos National Lab., NM (United States). Nuclear Materials Measurement and Accountability]

    1994-08-01

    Previously, the important hardware features and capabilities for the Multiple Assay Dual Analysis Measurement (MADAM) system were reported. MADAM is a combined low-level and transuranic waste assay system. The system integrated commercially available Segmented Gamma Scanner (SGS) capability together with multienergy X-ray and gamma-ray analysis to measure these two waste forms. In addition, the system incorporated a small neutron slab detector to satisfy safeguards concerns and high resolution gamma-ray isotopics analysis proficiency. Since delivery of the system to this facility, an evaluation of its low-level waste measurement performance has been conducted using a set of specially constructed NIST-traceable standards. The evaluation studied existing analysis algorithms, matrix and attenuation effects, source position as a function of detector response, instrument stability, and sensitivity. Based on these studies, several modifications to the existing analysis algorithms have been performed, new correction factors for matrix attenuation have been devised, and measurement error estimates have been calculated and incorporated into the software. This report discusses the results of the evaluation program and the software modifications that have been developed.

  7. Evaluation of low-level waste analysis using the MADAM system

    International Nuclear Information System (INIS)

    Foster, L.A.; Wachter, J.R.; Hagan, R.C.

    1994-01-01

    Previously, the important hardware features and capabilities for the Multiple Assay Dual Analysis Measurement (MADAM) system were reported. MADAM is a combined low-level and transuranic waste assay system. The system integrated commercially available Segmented Gamma Scanner (SGS) capability together with multienergy X-ray and gamma-ray analysis to measure these two waste forms. In addition, the system incorporated a small neutron slab detector to satisfy safeguards concerns and high resolution gamma-ray isotopics analysis proficiency. Since delivery of the system to this facility, an evaluation of its low-level waste measurement performance has been conducted using a set of specially constructed NIST-traceable standards. The evaluation studied existing analysis algorithms, matrix and attenuation effects, source position as a function of detector response, instrument stability, and sensitivity. Based on these studies, several modifications to the existing analysis algorithms have been performed, new correction factors for matrix attenuation have been devised, and measurement error estimates have been calculated and incorporated into the software. This report discusses the results of the evaluation program and the software modifications that have been developed

  8. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Directory of Open Access Journals (Sweden)

    Claudia Villalonga

    2016-09-01

    Full Text Available Recent years have witnessed a huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions, unprecedented to date, to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users.

  9. Groundwater level prediction of landslide based on classification and regression tree

    Directory of Open Access Journals (Sweden)

    Yannan Zhao

    2016-09-01

    Full Text Available According to groundwater level monitoring data of the Shuping landslide in the Three Gorges Reservoir area, and based on the response relationship between influential factors such as rainfall and reservoir level and the change of groundwater level, the influential factors of groundwater level were selected. The classification and regression tree (CART) model was then constructed from this subset and used to predict the groundwater level. Through verification, the predictive results for the test sample were consistent with the actually measured values; the mean absolute error and relative error are 0.28 m and 1.15%, respectively. For comparison, for the support vector machine (SVM) model constructed using the same set of factors, the mean absolute error and relative error of the predicted results are 1.53 m and 6.11%, respectively. This indicates that the CART model not only has better fitting and generalization ability, but also strong advantages in the analysis of landslide groundwater dynamic characteristics and the screening of important variables. It is an effective method for prediction of groundwater level in landslides.
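
    For readers unfamiliar with CART regression, a minimal sketch of the approach in the record above is given below, using scikit-learn's DecisionTreeRegressor. The rainfall and reservoir-level features follow the abstract, but the synthetic data, tree depth, and error computation are illustrative assumptions, not the authors' configuration.

```python
# Sketch: CART regression for groundwater-level prediction (illustrative data).
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 300
rainfall = rng.gamma(2.0, 10.0, n)          # daily rainfall, mm (synthetic)
reservoir = 145 + 30 * rng.random(n)        # reservoir level, m (synthetic)
groundwater = 0.02 * rainfall + 0.6 * reservoir + rng.normal(0, 0.5, n)

X = np.column_stack([rainfall, reservoir])
X_train, X_test = X[:240], X[240:]
y_train, y_test = groundwater[:240], groundwater[240:]

cart = DecisionTreeRegressor(max_depth=5, min_samples_leaf=10).fit(X_train, y_train)
pred = cart.predict(X_test)
mae = mean_absolute_error(y_test, pred)
rel = np.mean(np.abs(pred - y_test) / y_test) * 100
print(f"MAE = {mae:.2f} m, mean relative error = {rel:.2f}%")
```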

  10. Ontology-Based High-Level Context Inference for Human Behavior Identification

    Science.gov (United States)

    Villalonga, Claudia; Razzaq, Muhammad Asif; Khan, Wajahat Ali; Pomares, Hector; Rojas, Ignacio; Lee, Sungyoung; Banos, Oresti

    2016-01-01

    Recent years have witnessed huge progress in the automatic identification of individual primitives of human behavior, such as activities or locations. However, the complex nature of human behavior demands more abstract contextual information for its analysis. This work presents an ontology-based method that combines low-level primitives of behavior, namely activity, locations and emotions (a combination unprecedented to date), to intelligently derive more meaningful high-level context information. The paper contributes with a new open ontology describing both low-level and high-level context information, as well as their relationships. Furthermore, a framework building on the developed ontology and reasoning models is presented and evaluated. The proposed method proves to be robust while identifying high-level contexts even in the event of erroneously-detected low-level contexts. Despite reasonable inference times being obtained for a relevant set of users and instances, additional work is required to scale to long-term scenarios with a large number of users. PMID:27690050

  11. Chip based electroanalytical systems for cell analysis

    DEFF Research Database (Denmark)

    Spegel, C.; Heiskanen, A.; Skjolding, L.H.D.

    2008-01-01

    measurements of processes related to living cells, i.e., systems without lysing the cells. The focus is on chip based amperometric and impedimetric cell analysis systems where measurements utilizing solely carbon fiber microelectrodes (CFME) and other nonchip electrode formats, such as CFME for exocytosis...

  12. The effect of project-based learning on students' statistical literacy levels for data representation

    Science.gov (United States)

    Koparan, Timur; Güven, Bülent

    2015-07-01

    The point of this study is to define the effect of a project-based learning approach on 8th Grade secondary-school students' statistical literacy levels for data representation. To achieve this goal, a test consisting of 12 open-ended questions was developed in accordance with the views of experts. Seventy 8th grade secondary-school students, 35 in the experimental group and 35 in the control group, took this test twice, once before the application and once after the application. All the raw scores were converted into linear measures using the Winsteps 3.72 modelling program, which performs the Rasch analysis, and t-tests and an ANCOVA analysis were carried out with the linear measures. Based on the findings, it was concluded that the project-based learning approach increases students' level of statistical literacy for data representation. Students' levels of statistical literacy before and after the application were shown through the obtained person-item maps.
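
    A hedged sketch of the pre-test/post-test ANCOVA described above, using statsmodels. The group sizes match the abstract (35 + 35), but the scores are simulated and the model specification is an assumption; the Winsteps/Rasch linearization step is not reproduced here.

```python
# Sketch: ANCOVA on post-test scores with pre-test as covariate (synthetic scores).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
pre = rng.normal(50, 10, 70)
group = np.repeat(["experimental", "control"], 35)
post = pre + np.where(group == "experimental", 8, 2) + rng.normal(0, 5, 70)
df = pd.DataFrame({"pre": pre, "post": post, "group": group})

model = smf.ols("post ~ pre + C(group)", data=df).fit()
print(model.summary().tables[1])   # group effect adjusted for the pre-test
```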

  13. Development of a computerized data base for low-level radioactive waste leaching data: Topical report

    International Nuclear Information System (INIS)

    Dougherty, D.R.; Colombo, P.

    1986-09-01

    This report documents the development of a computerized data base (db) of leaching data for solidified low-level radioactive waste (LLW) forms. Brookhaven National Lab performed this work under contract with the US Department of Energy's Low-Level Waste Management Program as part of an effort to develop an accelerated leach test(s) that can be used to predict leachabilities of LLW forms over long time periods, i.e., hundreds of years. The accelerated leach test(s) is (are) to be developed based on knowledge of leaching mechanisms and factors that affect leaching. Although developed specifically for the Accelerated Leach Test(s) Program, this db may be useful to others concerned with the management of low-level waste. The db is being developed to provide efficient data compilation and analysis capabilities. The data compiled in the db, which include data from the Accelerated Leach Test(s) Program and selected data from the literature, have been selected to elucidate leaching mechanisms and factors that affect leaching and are not meant to be a comprehensive compilation of leaching data. This report presents the data compilation aspect of the db. It does not present the programmatic results obtained from analysis of the data regarding leaching mechanisms and factors that affect leaching, which will be presented in reports from the Accelerated Leach Test(s) Program. 6 refs

  14. Research Update: Spatially resolved mapping of electronic structure on atomic level by multivariate statistical analysis

    International Nuclear Information System (INIS)

    Belianinov, Alex; Ganesh, Panchapakesan; Lin, Wenzhi; Jesse, Stephen; Pan, Minghu; Kalinin, Sergei V.; Sales, Brian C.; Sefat, Athena S.

    2014-01-01

    Atomic-level spatial variability of electronic structure in the Fe-based superconductor FeTe0.55Se0.45 (Tc = 15 K) is explored using current-imaging tunneling-spectroscopy. Multivariate statistical analysis of the data differentiates regions of dissimilar electronic behavior that can be identified with the segregation of chalcogen atoms, as well as boundaries between terminations and near-neighbor interactions. Subsequent clustering analysis allows identification of the spatial localization of these dissimilar regions. Similar statistical analysis of modeled calculated density of states of chemically inhomogeneous FeTe1−xSex structures further confirms that the two types of chalcogens, i.e., Te and Se, can be identified by their electronic signature and differentiated by their local chemical environment. This approach allows detailed chemical discrimination of the scanning tunneling microscopy data, including separation of atomic identities, proximity, and local configuration effects, and can be universally applicable to chemically and electronically inhomogeneous surfaces
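
    The multivariate workflow described above (dimensionality reduction followed by clustering of per-pixel tunneling spectra) can be sketched as follows. The synthetic two-species spectra and the PCA/k-means choices are illustrative assumptions, not the authors' exact algorithms.

```python
# Sketch: PCA + k-means clustering of a grid of tunneling spectra (synthetic).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
nx, ny, nE = 32, 32, 100                    # spatial grid, 100 bias points
energies = np.linspace(-0.1, 0.1, nE)
labels_true = rng.integers(0, 2, (nx, ny))  # two chalcogen "species" (invented)
centers = np.where(labels_true[..., None] == 0,
                   np.exp(-((energies - 0.02) / 0.03) ** 2),
                   np.exp(-((energies + 0.02) / 0.03) ** 2))
spectra = centers + 0.1 * rng.normal(size=(nx, ny, nE))

flat = spectra.reshape(-1, nE)
scores = PCA(n_components=5).fit_transform(flat)    # compact representation
clusters = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
cluster_map = clusters.reshape(nx, ny)              # spatial localization of behaviors
print(cluster_map[:4, :4])
```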

  15. Radioactive waste management complex low-level waste radiological composite analysis

    Energy Technology Data Exchange (ETDEWEB)

    McCarthy, J.M.; Becker, B.H.; Magnuson, S.O.; Keck, K.N.; Honeycutt, T.K.

    1998-05-01

    The composite analysis estimates the projected cumulative impacts to future members of the public from the disposal of low-level radioactive waste (LLW) at the Idaho National Engineering and Environmental Laboratory (INEEL) Radioactive Waste Management Complex (RWMC) and all other sources of radioactive contamination at the INEEL that could interact with the LLW disposal facility to affect the radiological dose. Based upon the composite analysis evaluation, waste buried in the Subsurface Disposal Area (SDA) at the RWMC is the only source at the INEEL that will significantly interact with the LLW facility. The source term used in the composite analysis consists of all historical SDA subsurface disposals of radionuclides as well as the authorized LLW subsurface disposal inventory and projected LLW subsurface disposal inventory. Exposure scenarios evaluated in the composite analysis include all the all-pathways and groundwater protection scenarios. The projected dose of 58 mrem/yr exceeds the composite analysis guidance dose constraint of 30 mrem/yr; therefore, an options analysis was conducted to determine the feasibility of reducing the projected annual dose. Three options for creating such a reduction were considered: (1) lowering infiltration of precipitation through the waste by providing a better cover, (2) maintaining control over the RWMC and portions of the INEEL indefinitely, and (3) extending the period of institutional control beyond the 100 years assumed in the composite analysis. Of the three options investigated, maintaining control over the RWMC and a small part of the present INEEL appears to be feasible and cost effective.

  16. Radioactive waste management complex low-level waste radiological composite analysis

    International Nuclear Information System (INIS)

    McCarthy, J.M.; Becker, B.H.; Magnuson, S.O.; Keck, K.N.; Honeycutt, T.K.

    1998-05-01

    The composite analysis estimates the projected cumulative impacts to future members of the public from the disposal of low-level radioactive waste (LLW) at the Idaho National Engineering and Environmental Laboratory (INEEL) Radioactive Waste Management Complex (RWMC) and all other sources of radioactive contamination at the INEEL that could interact with the LLW disposal facility to affect the radiological dose. Based upon the composite analysis evaluation, waste buried in the Subsurface Disposal Area (SDA) at the RWMC is the only source at the INEEL that will significantly interact with the LLW facility. The source term used in the composite analysis consists of all historical SDA subsurface disposals of radionuclides as well as the authorized LLW subsurface disposal inventory and projected LLW subsurface disposal inventory. Exposure scenarios evaluated in the composite analysis include all the all-pathways and groundwater protection scenarios. The projected dose of 58 mrem/yr exceeds the composite analysis guidance dose constraint of 30 mrem/yr; therefore, an options analysis was conducted to determine the feasibility of reducing the projected annual dose. Three options for creating such a reduction were considered: (1) lowering infiltration of precipitation through the waste by providing a better cover, (2) maintaining control over the RWMC and portions of the INEEL indefinitely, and (3) extending the period of institutional control beyond the 100 years assumed in the composite analysis. Of the three options investigated, maintaining control over the RWMC and a small part of the present INEEL appears to be feasible and cost effective

  17. CLUSTER ANALYSIS UKRAINIAN REGIONAL DISTRIBUTION BY LEVEL OF INNOVATION

    Directory of Open Access Journals (Sweden)

    Roman Shchur

    2016-07-01

    Full Text Available A SWOT analysis of the threats and benefits of the innovation development strategy of Ivano-Frankivsk region in the context of financial support was conducted. A methodical approach to determining the potential of public-private partnerships, a tool for financing innovative economic development, was identified. A cluster analysis of the possibilities of forming public-private partnerships in a particular region was carried out. An optimal set of problem areas that require urgent solutions and financial security is defined on the basis of the cluster approach; it will help to form practical recommendations for the formation of an effective financial mechanism in the regions of Ukraine. Key words: the mechanism of innovation development financial provision, innovation development, public-private partnerships, cluster analysis, innovative development strategy.

  18. Analysis of jitter due to call-level fluctuations

    NARCIS (Netherlands)

    M.R.H. Mandjes (Michel)

    2005-01-01

    textabstractIn communication networks used by constant bit rate applications, call-level dynamics (i.e., entering and leaving calls) lead to fluctuations in the load, and therefore also fluctuations in the delay (jitter). By intentionally delaying the packets at the destination, one can transform

  19. Power Analysis for Cross Level Mediation in CRTs

    Science.gov (United States)

    Kelcey, Ben

    2014-01-01

    A common design in education research for interventions operating at a group or cluster level is a cluster randomized trial (CRT) (Bloom, 2005). In CRTs, intact clusters (e.g., schools) are assigned to treatment conditions rather than individuals (e.g., students) and are frequently an effective way to study interventions because they permit…

  20. Landscape level analysis of disturbance regimes in protected areas ...

    Indian Academy of Sciences (India)

    G B Pant Institute of Himalayan Environment and Development, Almora 263 643, Uttarakhand, India. ... level assessment of fragmentation and disturbance index in protected areas of Rajasthan using remote ... anthropogenic/natural forces on the landscape was ... Environmental Research, Engineering and Management.

  1. Water-Level Analysis for Cumberland Sound, Georgia

    National Research Council Canada - National Science Library

    Kraus, Nicholas

    1997-01-01

    .... The channel through St Marys Entrance is maintained at a 50-ft depth through significant dredging that occurred from 1986-1988. Questions arose as to whether this dredging had raised the water level in Cumberland Sound. The U.S...

  2. Measuring Structural Gender Equality in Mexico: A State Level Analysis

    Science.gov (United States)

    Frias, Sonia M.

    2008-01-01

    The main goal of this article is to assess the level of gender equality across the 32 Mexican states. After reviewing conceptual and methodological issues related to previous measures of structural inequality I detail the logic and methodology involved in the construction of a composite and multidimensional measure of gender equality, at the…

  3. Low level liquid scintillation analysis for environmental and biomedical quantitation

    International Nuclear Information System (INIS)

    Kessler, M.J.

    1991-01-01

    Over the past five years low-level liquid scintillation counting has become increasingly popular because of the large number of applications which can be performed using this technique. These applications include environmental monitoring (³H, ⁹⁰Sr/⁹⁰Y, etc.), radiocarbon dating (for age determination to 50,000 years), food adulteration studies (alcohol and beverage industries), radon monitoring (air/water), nuclear power plant monitoring (low-level ³H) and metabolism studies (pharmaceutical research). These applications can be performed with either a dedicated low-level LSC or a standard liquid scintillation counter in conjunction with the new technique of time-resolved LSC (TR-LSC). This technique, when used on a standard LSC, reduces the instrument background without substantially affecting the counting efficiency, thus increasing the performance (E²/B) of the LSC. Data will be presented for each of the applications mentioned above, comparing the standard LSC and the new TR-LSC techniques. The optimization of the samples for each of these applications will be explored in detail with experimental results. In conclusion, by using the TR-LSC technique in conjunction with a standard LSC, the performance of the standard LSC can be increased substantially without dedicating the LSC to doing only low-level samples
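
    The performance metric E²/B mentioned above rewards lower background at comparable counting efficiency. A small worked example (with invented efficiency and background values) illustrates why reducing background without substantially affecting efficiency raises the figure of merit.

```python
# Sketch: figure of merit E^2/B for a liquid scintillation counter,
# before and after background reduction (illustrative numbers).
def figure_of_merit(efficiency_pct: float, background_cpm: float) -> float:
    """E^2/B, the usual LSC performance metric."""
    return efficiency_pct ** 2 / background_cpm

standard = figure_of_merit(efficiency_pct=60.0, background_cpm=20.0)
tr_lsc = figure_of_merit(efficiency_pct=58.0, background_cpm=5.0)
print(f"standard LSC: {standard:.0f}, TR-LSC: {tr_lsc:.0f}")
# Cutting background ~4x at nearly unchanged efficiency raises E^2/B ~3.7x.
```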

  4. Arabic Feature-Based Level Sentiment Analysis Using Lexicon ...

    African Journals Online (AJOL)

    2018-03-05

    ... structured reviews being prior knowledge for mining unstructured reviews. ... FDSO has been introduced, which defines a space of product features ... polarity of a review using feature ontology and sentiment lexicons.

  5. Model-Aided Altimeter-Based Water Level Forecasting System in Mekong River

    Science.gov (United States)

    Chang, C. H.; Lee, H.; Hossain, F.; Okeowo, M. A.; Basnayake, S. B.; Jayasinghe, S.; Saah, D. S.; Anderson, E.; Hwang, E.

    2017-12-01

    Mekong River, one of the largest river systems in the world, has a drainage area of about 795,000 km2 covering six countries. People living in its drainage area rely heavily on the river's resources in terms of agriculture, fishery, and hydropower. Monitoring and forecasting the water level in a timely manner is urgently needed over the Mekong River. Recently, using TOPEX/Poseidon (T/P) altimetry water level measurements in India, Biancamaria et al. [2011] demonstrated the capability of an altimeter-based flood forecasting system in Bangladesh, with RMSE from 0.6 - 0.8 m for lead times up to 5 days on a 10-day basis due to T/P's repeat period. Hossain et al. [2013] further established a daily water level forecasting system in Bangladesh using observations from Jason-2 in India and the HEC-RAS hydraulic model, with RMSE from 0.5 - 1.5 m and an underestimating mean bias of 0.25 - 1.25 m. However, such a daily forecasting system relies on a collection of Jason-2 virtual stations (VSs) to ensure frequent sampling and data availability. Since the Mekong River is a meridional river with a small number of VSs, the direct application of this system to the Mekong River becomes challenging. To address this problem, we propose a model-aided altimeter-based forecasting system. The discharge output by the Variable Infiltration Capacity hydrologic model is used to reconstruct a daily water level product at upstream Jason-2 VSs based on the discharge-to-level rating curve. The reconstructed daily water level is then used to perform regression analysis with downstream in-situ water level to build regression models, which are used to forecast a daily water level. In the middle reach of the Mekong River from Nakhon Phanom to Kratie, a 3-day lead time forecast can reach an RMSE of about 0.7 - 1.3 m with a correlation coefficient around 0.95. For the lower reach of the Mekong River, the water flow becomes more complicated due to the reversal flow between the Tonle Sap Lake and the Mekong River.
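
    A minimal sketch of the model-aided chain described above: modeled discharge is converted to an upstream water level through a rating curve, and a lagged regression links upstream to downstream level. The power-law rating coefficients and all data are invented stand-ins, not VIC output or Jason-2 measurements.

```python
# Sketch: rating-curve reconstruction + lagged regression forecast (synthetic).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(3)
days = 365
discharge = (5000 + 3000 * np.sin(2 * np.pi * np.arange(days) / 365)
             + rng.normal(0, 200, days))           # model-style discharge, m^3/s
# Power-law rating curve h = a * Q**b (coefficients assumed for illustration).
upstream_level = 0.05 * discharge ** 0.55

lead = 3                                           # 3-day lead time
X = upstream_level[:-lead].reshape(-1, 1)
y = upstream_level[lead:] * 0.8 + rng.normal(0, 0.15, days - lead)  # "downstream"
model = LinearRegression().fit(X, y)
rmse = np.sqrt(np.mean((model.predict(X) - y) ** 2))
print(f"3-day lead regression: R^2 = {model.score(X, y):.2f}, RMSE = {rmse:.2f} m")
```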

  6. Adaptive Game Level Creation through Rank-based Interactive Evolution

    DEFF Research Database (Denmark)

    Liapis, Antonios; Martínez, Héctor Pérez; Togelius, Julian

    2013-01-01

    as fitness functions for the optimization of the generated content. The preference models are built via ranking-based preference learning, while the content is generated via evolutionary search. The proposed method is evaluated on the creation of strategy game maps, and its performance is tested using...

  7. The marketing firm and consumer choice: implications of bilateral contingency for levels of analysis in organizational neuroscience

    Directory of Open Access Journals (Sweden)

    Gordon Robert Foxall

    2014-07-01

    Full Text Available The emergence of a conception of the marketing firm (Foxall, 1999a), conceived within behavioral psychology and based on a corresponding model of consumer choice (Foxall, 1990/2004), permits an assessment of the levels of behavioral and organizational analysis amenable to neuroscientific examination. This paper explores the ways in which the bilateral contingencies that link the marketing firm with its consumerate allow appropriate levels of organizational neuroscientific analysis to be specified. Having described the concept of the marketing firm and the model of consumer behavior on which it is based, the paper analyzes bilateral contingencies at the levels of (i) market exchange, (ii) emotional reward, and (iii) neuroeconomics. Market exchange emerges as a level of analysis that lends itself predominantly to the explanation of firm—consumerate interactions in terms of the super-personal level of reinforcing and punishing contingencies: the marketing firm can be treated as a contextual or operant system in its own right. However, the emotional reward and neuroeconomic levels of analysis should be confined to the personal level of analysis represented by individual managers on the one hand and individual consumers on the other. This also entails a level of abstraction but it is one that can be satisfactorily handled in terms of the concept of bilateral contingency.

  8. The marketing firm and consumer choice: implications of bilateral contingency for levels of analysis in organizational neuroscience.

    Science.gov (United States)

    Foxall, Gordon R

    2014-01-01

    The emergence of a conception of the marketing firm (Foxall, 1999a) conceived within behavioral psychology and based on a corresponding model of consumer choice, (Foxall, 1990/2004) permits an assessment of the levels of behavioral and organizational analysis amenable to neuroscientific examination. This paper explores the ways in which the bilateral contingencies that link the marketing firm with its consumerate allow appropriate levels of organizational neuroscientific analysis to be specified. Having described the concept of the marketing firm and the model of consumer behavior on which it is based, the paper analyzes bilateral contingencies at the levels of (i) market exchange, (ii) emotional reward, and (iii) neuroeconomics. Market exchange emerges as a level of analysis that lends itself predominantly to the explanation of firm-consumerate interactions in terms of the super-personal level of reinforcing and punishing contingencies: the marketing firm can be treated as a contextual or operant system in its own right. However, the emotional reward and neuroeconomic levels of analysis should be confined to the personal level of analysis represented by individual managers on the one hand and individual consumers on the other. This also entails a level of abstraction but it is one that can be satisfactorily handled in terms of the concept of bilateral contingency.

  9. The marketing firm and consumer choice: implications of bilateral contingency for levels of analysis in organizational neuroscience

    Science.gov (United States)

    Foxall, Gordon R.

    2014-01-01

    The emergence of a conception of the marketing firm (Foxall, 1999a) conceived within behavioral psychology and based on a corresponding model of consumer choice, (Foxall, 1990/2004) permits an assessment of the levels of behavioral and organizational analysis amenable to neuroscientific examination. This paper explores the ways in which the bilateral contingencies that link the marketing firm with its consumerate allow appropriate levels of organizational neuroscientific analysis to be specified. Having described the concept of the marketing firm and the model of consumer behavior on which it is based, the paper analyzes bilateral contingencies at the levels of (i) market exchange, (ii) emotional reward, and (iii) neuroeconomics. Market exchange emerges as a level of analysis that lends itself predominantly to the explanation of firm—consumerate interactions in terms of the super-personal level of reinforcing and punishing contingencies: the marketing firm can be treated as a contextual or operant system in its own right. However, the emotional reward and neuroeconomic levels of analysis should be confined to the personal level of analysis represented by individual managers on the one hand and individual consumers on the other. This also entails a level of abstraction but it is one that can be satisfactorily handled in terms of the concept of bilateral contingency. PMID:25071506

  10. Robust Methods for Moderation Analysis with a Two-Level Regression Model.

    Science.gov (United States)

    Yang, Miao; Yuan, Ke-Hai

    2016-01-01

    Moderation analysis has many applications in social sciences. Most widely used estimation methods for moderation analysis assume that errors are normally distributed and homoscedastic. When these assumptions are not met, the results from a classical moderation analysis can be misleading. For more reliable moderation analysis, this article proposes two robust methods with a two-level regression model when the predictors do not contain measurement error. One method is based on maximum likelihood with Student's t distribution and the other is based on M-estimators with Huber-type weights. An algorithm for obtaining the robust estimators is developed. Consistent estimates of standard errors of the robust estimators are provided. The robust approaches are compared against normal-distribution-based maximum likelihood (NML) with respect to power and accuracy of parameter estimates through a simulation study. Results show that the robust approaches outperform NML under various distributional conditions. Application of the robust methods is illustrated through a real data example. An R program is developed and documented to facilitate the application of the robust methods.
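
    A minimal sketch of the Huber-weighted approach described above, using statsmodels' RLM on a moderation (interaction) model with heavy-tailed errors. The simulated coefficients are illustrative, and the article's own two-level estimator and Student's t likelihood are not reproduced here.

```python
# Sketch: moderation analysis with a Huber-type M-estimator (statsmodels RLM).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 500
x = rng.normal(size=n)                 # predictor
m = rng.normal(size=n)                 # moderator
e = rng.standard_t(df=3, size=n)       # heavy-tailed errors violate normality
y = 0.5 * x + 0.3 * m + 0.4 * x * m + e

X = sm.add_constant(np.column_stack([x, m, x * m]))
robust = sm.RLM(y, X, M=sm.robust.norms.HuberT()).fit()
print(robust.params)                   # last coefficient: the interaction (moderation)
print(robust.bse)                      # standard errors of the robust estimates
```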

  11. Based on user interest level of modeling scenarios and browse content

    Science.gov (United States)

    Zhao, Yang

    2017-08-01

    User interest modeling is the core of personalized service; it must take into account the impact of situational information on user preferences as well as the user's day-to-day browsing behavior. This paper proposes a method of user interest modeling based on scenario information, in which the approximate scenario set for the user's current scene is obtained by calculating situational similarity, and the "user - interest items - scenarios" three-dimensional model is reduced in dimension using the situation pre-filtering method. From the content the user views, the topics of interest are identified; analysis of the page content yields keywords for each topic of interest, on which a vector space model of user interest is based. The experimental results show that the user interest model based on scenario information keeps the error of user interest prediction within 9%, which is effective.
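
    A hedged sketch of the scenario-similarity step described above, using cosine similarity between keyword-weight vectors. The vocabulary, weights, and scenario names are invented for illustration.

```python
# Sketch: cosine similarity between the current scenario and stored scenarios.
import numpy as np

def cosine(u: np.ndarray, v: np.ndarray) -> float:
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# TF-IDF-style weights over a shared keyword vocabulary (illustrative).
vocabulary = ["finance", "stocks", "sports", "weather", "travel"]
current_scenario = np.array([0.9, 0.7, 0.0, 0.2, 0.1])
stored_scenarios = {
    "workday-morning": np.array([0.8, 0.9, 0.1, 0.3, 0.0]),
    "weekend-evening": np.array([0.1, 0.0, 0.8, 0.2, 0.7]),
}
for name, vec in stored_scenarios.items():
    print(name, round(cosine(current_scenario, vec), 3))
# The most similar stored scenarios form the approximate scenario set
# used to pre-filter the "user - interest items - scenarios" model.
```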

  12. Procurement Fraud: A Knowledge-Level Analysis of Contracting Personnel

    Science.gov (United States)

    2014-12-01

    goods and services for cheaper or substandard merchandise that does not conform to contract specifications. This process is also known as product... Figure II shows a visual depiction of these results. [Figure II: Number of Participants by Warrant Status] C. ANALYSIS OF KNOWLEDGE QUESTIONS There

  13. PUBLIC DEBT ANALYSIS BASED ON SUSTAINABILITY INDICATORS

    Directory of Open Access Journals (Sweden)

    Elena DASCALU

    2016-09-01

    Full Text Available This article is an analysis of public debt, in terms of sustainability and vulnerability indicators, under a functioning market economy. The problems encountered regarding the high level of public debt and the potential risks of budgetary pressure converge on the idea that the sustainability of public finances should be a major challenge for public policy. Thus, a policy adequate to address public finance sustainability must take as its starting point the overall strategy of the European Union, as well as the economic development of Member States, focusing on the most important performance components, namely reducing public debt levels, increasing productivity and employment and, last but not least, reforming social security systems. In order to achieve sustainable levels of public debt, the European Union Member States are required to establish and accomplish medium-term strategic budgetary goals to ensure a downward trend in public debt.

  14. Confidence-Based Learning in Investment Analysis

    Science.gov (United States)

    Serradell-Lopez, Enric; Lara-Navarra, Pablo; Castillo-Merino, David; González-González, Inés

    The aim of this study is to determine the effectiveness of using multiple-choice tests in subjects related to administration and business management. To this end we used a multiple-choice test with specific questions to verify the extent of knowledge gained and the confidence and trust in the answers. The tests were administered to a group of 200 students in the bachelor's degree in Business Administration and Management. The analyses were implemented in one subject within the scope of investment analysis and measured the level of knowledge gained and the degree of trust and security in the responses at two different times of the course. The measurements took into account different levels of difficulty in the questions asked and the time spent by students to complete the test. The results confirm that students are generally able to gain more knowledge along the way and show increases in the degree of trust and confidence in their answers. It is confirmed that the difficulty levels of the questions, set a priori by the heads of the subjects, are related to the levels of security and confidence in the answers. It is estimated that the improvement in the skills learned is viewed favourably by businesses and is especially important for the job placement of students.

  15. Risk Evaluation of Railway Coal Transportation Network Based on Multi Level Grey Evaluation Model

    Science.gov (United States)

    Niu, Wei; Wang, Xifu

    2018-01-01

    The railway is currently the most important mode of coal transportation, and China's railway coal transportation network has become increasingly complete, but problems remain, including insufficient capacity and some lines operating close to saturation. In this paper, risk assessment theory, the analytic hierarchy process, and a multi-level grey evaluation model are applied to the risk evaluation of the coal railway transportation network in China, illustrated by an example analysis of the Shanxi railway coal transportation network, with the aim of improving the internal structure of the network and its competitiveness in the market.

  16. Extraction and Analysis of Autonomous System Level Internet Map of Turkey

    Directory of Open Access Journals (Sweden)

    Hakan Çetin

    2010-01-01

    Full Text Available At the highest level, the Internet is a mesh composed of thousands of autonomous systems (ASs) connected together. This mesh is represented as a graph where each autonomous system is considered a node and each connection with a Border Gateway Protocol-neighbored autonomous system is considered an edge. Analysis of this mesh and visual representation of the graph give us the AS-level topology of the Internet. In recent years there has been an increasing number of studies focused on the structure of the topology of the Internet. It is important to study the Internet infrastructure in Turkey and to provide a way to monitor the changes to it over time. In this study we present the AS-level Internet map of Turkey with an explanation of each step. In order to get the whole AS-level map, we first determined the ASs that geographically reside in Turkey and afterwards determined the interconnections among these ASs, along with international interconnections. Then we extracted the relations between connected ASs and analyzed the structural properties of the AS infrastructure. We explain the methods used in each step. Using the extracted data we analyzed the AS-level properties of Turkey and we provide the AS-level Internet map of Turkey along with web-based software that can monitor and provide information on ASs in Turkey.
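
    A minimal sketch of representing an AS-level topology as a directed graph, using networkx. The edge list is a toy example; the AS numbers are placeholders, not the measured Turkish topology.

```python
# Sketch: AS-level topology as a directed graph (toy edge list).
import networkx as nx

# Each node is an AS number; edges follow observed BGP neighbor relations.
edges = [(9121, 8517), (8517, 9121), (34984, 9121), (15924, 8517),
         (9121, 20978), (20978, 34984)]
g = nx.DiGraph(edges)

print("ASs:", g.number_of_nodes(), "links:", g.number_of_edges())
print("degree by AS:", dict(g.degree()))
print("strongly connected components:",
      [sorted(c) for c in nx.strongly_connected_components(g)])
```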

  17. Experimental vibration level analysis of a Francis turbine

    International Nuclear Information System (INIS)

    Bucur, D M; Dunca, G; Calinoiu, C

    2012-01-01

    In this study the vibration level of a Francis turbine is investigated by experimental work on site. Measurements are carried out for different power output values in order to highlight the influence of the operation regimes on the turbine behavior. The study focuses on the turbine shaft, to identify the mechanical vibration sources, and on the draft tube, to identify the hydraulic vibration sources. From analysis of the vibration results, recommendations can be made regarding operation of the turbine at partial load close to minimum values, in the middle of the operating domain, or close to maximum values of electric power, in order to keep vibration at relatively low levels. Finally, conclusions are drawn in order to present the real sources of the vibrations.

  18. Analysis of Plasma Homocysteine Levels in Patients with Unstable Angina

    Directory of Open Access Journals (Sweden)

    José Roberto Tavares

    2002-08-01

    Full Text Available OBJECTIVE - To determine the prevalence of hyperhomocystinemia in patients with acute ischemic syndrome of the unstable angina type. METHODS - We prospectively studied 46 patients (24 females) with unstable angina and 46 control patients (19 males), paired by sex and age, blinded to the laboratory data. Details of diets, smoking habits, medication used, body mass index, and the presence of hypertension and diabetes were recorded, as were plasma lipid and glucose levels, C-reactive protein, and lipoperoxidation in all participants. Patients with renal disease were excluded. Plasma homocysteine was estimated using high-pressure liquid chromatography. RESULTS - Plasma homocysteine levels were significantly higher in the group of patients with unstable angina (12.7±6.7 µmol/L) than in the control group (8.7±4.4 µmol/L) (p<0.05). Among males, homocystinemia was higher in the group with unstable angina than in the control group, but this difference was not statistically significant (14.1±5.9 µmol/L versus 11.9±4.2 µmol/L). Among females, however, a statistically significant difference was observed between the 2 groups: 11.0±7.4 µmol/L versus 6.4±2.9 µmol/L (p<0.05) in the unstable angina and control groups, respectively. Approximately 24% of the patients with unstable angina had homocysteine levels above 15 µmol/L. CONCLUSION - High homocysteine levels seem to be a relevant and prevalent factor in the population with unstable angina, particularly among females.

  19. Population Stabilization in India: A Sub-State level Analysis

    OpenAIRE

    Purohit C, Dr Brijesh

    2007-01-01

    The study aims at analyzing economic and policy factors impinging upon population stabilization measures at the district (sub-state level) in India. It reflects upon popularly debated notions, namely, that development is the best contraceptive or whether contraceptive is the best development. In order to reflect upon this notion, we hypothesize that the factors determining the success of population stabilization measures are likely to be different across rich and poor states. It is more likel...

  20. GOCI Level-2 Processing Improvements and Cloud Motion Analysis

    Science.gov (United States)

    Robinson, Wayne D.

    2015-01-01

    The Ocean Biology Processing Group has been working with the Korean Institute of Ocean Science and Technology (KIOST) to process geosynchronous ocean color data from GOCI (Geostationary Ocean Color Imager) aboard COMS (Communications, Ocean and Meteorological Satellite). The level-2 processing program, l2gen, has GOCI processing as an option. Improvements made to that processing are discussed here, along with an analysis of cloud motion effects.

  1. Analysis of radiation level on dinosaur fossil in Zigong

    International Nuclear Information System (INIS)

    Yang Changshu; Liang Shuzhi; Fan Zhengnian.

    1995-01-01

    A study of the radiation level of dinosaur fossils and the environment in the conservation zone in Zigong, Sichuan has been carried out. The results showed that the γ radiation dose and the radioactivity strengths of ²³²Th and ⁴⁰K in dinosaur fossil, soil and rock in the conservation zone were within the limits of the radioactive background value in Zigong. The radioactivity strengths of ²³⁸U and ²²⁶Ra in dinosaur fossil were 26.6 and 29.2 times higher than in the rock of the same layer, respectively

  2. An Analysis of Futsal Players' Self-Esteem Levels

    Science.gov (United States)

    Kocak, Mehmet

    2015-01-01

    The purpose of this study is to investigate the self-esteem levels of futsal players according to certain variables. The sample of the study comprised 119 females and 96 males, a total of 215 players with an average age of 21.57 ± 2.20 years. The research was carried out using the "Rosenberg Self-Esteem Scale" developed by…

  3. Individual relocation decisions after tornadoes: a multi-level analysis.

    Science.gov (United States)

    Cong, Zhen; Nejat, Ali; Liang, Daan; Pei, Yaolin; Javid, Roxana J

    2018-04-01

    This study examines how multi-level factors affected individuals' relocation decisions after EF4 and EF5 (Enhanced Fujita Tornado Intensity Scale) tornadoes struck the United States in 2013. A telephone survey was conducted with 536 respondents, including oversampled older adults, one year after these two disaster events. Respondents' addresses were used to associate individual information with block group-level variables recorded by the American Community Survey. Logistic regression revealed that residential damage and homeownership are important predictors of relocation. There was also significant interaction between these two variables, indicating less difference between homeowners and renters at higher damage levels. Homeownership diminished the likelihood of relocation among younger respondents. Random effects logistic regression found that the percentage of homeownership and of higher income households in the community buffered the effect of damage on relocation; the percentage of older adults reduced the likelihood of this group relocating. The findings are assessed from the standpoint of age difference, policy implications, and social capital and vulnerability. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.
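
    A hedged sketch of the logistic-regression-with-interaction analysis described above. The sample size matches the abstract (536), but the simulated damage, ownership, and relocation data and the coefficient values are illustrative assumptions, and the community-level random effects are not reproduced here.

```python
# Sketch: relocation ~ damage * homeownership via logistic regression (synthetic).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 536
damage = rng.integers(0, 4, n)                 # 0 = none ... 3 = destroyed
owner = rng.integers(0, 2, n)                  # 1 = homeowner, 0 = renter
logit_p = -2.0 + 1.0 * damage - 1.2 * owner + 0.4 * damage * owner
relocated = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))
df = pd.DataFrame({"relocated": relocated, "damage": damage, "owner": owner})

fit = smf.logit("relocated ~ damage * owner", data=df).fit(disp=False)
print(fit.summary().tables[1])   # positive interaction: owner/renter gap narrows
                                 # at higher damage levels, as in the study
```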

  4. Mean Streets: An analysis on street level pollution in NYC

    Science.gov (United States)

    Parker, G.

    2017-12-01

    The overarching objective of this study is to quantify the spatial and temporal variability in particulate matter concentration (PM 2.5) along crowded streets in New York City. Due to their fine size and low density, PM 2.5 particles stay longer in the atmosphere and can bypass the human nose and throat, penetrate deep into the lungs, and even enter the circulatory system. PM 2.5 is a by-product of automobile combustion and is a primary cause of respiratory malfunction in NYC. The study would monitor street-level concentration of PM 2.5 across three different routes that witness significant pedestrian traffic; observations will be conducted along these three routes at different time periods. The study will use the AirBeam community air quality monitor. The monitor tracks PM 2.5 concentration along with GPS, air temperature, and relative humidity. The surface-level concentration monitored by AirBeam will be compared with the atmospheric concentration of PM 2.5 monitored at the NOAA CREST facility on the CCNY campus. The lower atmospheric values will be correlated with street-level values to assess the validity of using lower atmospheric values to predict street-level concentrations. The street-level concentration will be compared to the air quality forecasted by the New York Department of Environmental Conservation to estimate its accuracy and applicability.

  5. Law enforcers recognition level emerging threats based on physical appearance and behavior signs the enemy

    Directory of Open Access Journals (Sweden)

    R.M. Radzievskiy

    2015-02-01

    Full Text Available Purpose: to examine the effectiveness of a training method based on a differentiated approach to the choice of means of influencing the actions of an opponent with different levels of aggressiveness. Material: the experiment involved 15 students of the Kyiv National Academy of Internal Affairs and 15 employees of the State Guard of Ukraine. Results: a curriculum for special physical and tactical training is presented. The program details the conceptual apparatus of threats and dangers as manifestations of different levels of an opponent's aggressiveness (case analysis of his motor behavior). The study participants underwent a 7-day course of focused training. The basis of the course is an advanced theoretical base, aimed at developing the knowledge and skills of employees in determining the level of danger, including threats, through testing and modeling episodes of extreme situations. Conclusions: in the simulated situations of collision with an aggressive opponent, the students significantly improved the adequacy of their response to the threat, its execution time, and its conformity with legal grounds. Recognition was determined by the level of aggressiveness manifested in the opponent's manner, emotions, motivation, motor behavior, and positional arrangement, within 2 - 3 seconds. The program contributed to the development of the qualities of attention, orientation, perception, and motor lead.

  6. Computer-based control of nuclear power information systems at international level

    International Nuclear Information System (INIS)

    Boniface, Ekechukwu; Okonkwo, Obi

    2011-01-01

    In most highly industrialized countries of the world, information plays a major role in anti-nuclear campaigns. Information and discussions on nuclear power need critical and objective analysis before structured information is presented to the public, to avoid biased anti-nuclear information on one side and neglect of the great risks in nuclear power on the other. This research develops a computer-based information system for the control of nuclear power information at the international level. The system is to provide easy and fast information highways for the following: (1) low regulatory dose and activity limits as levels of high danger for individuals and the public; (2) provision of relevant technical or scientific education among the information carriers in the nuclear power countries. The research is a fact-oriented investigation of radioactivity. It also deals with fact-oriented education about nuclear accidents and safety. A standard procedure for dissemination of the latest findings, using technical and scientific experts in nuclear technology, is developed. The information highway clearly analyzes the factual information about radiation risk and nuclear energy. Radiation cannot be removed from our environment. The necessity of utilizing radiation makes nuclear energy a two-edged sword. It is therefore possible to use a computer-based information system to project and disseminate expert knowledge about nuclear technology positively, and also to use it to direct the public on the safety and control of nuclear energy. The computer-based information highway for nuclear energy technology is to assist in scientific research and technological development at the international level. (author)

  7. [The effect of group-based psychodrama therapy on decreasing the level of aggression in adolescents].

    Science.gov (United States)

    Karataş, Zeynep; Gökçakan, Dan Zafer

    2009-01-01

    This study aimed to examine the effect of group-based psychodrama therapy on the level of aggression in adolescents. The study included 23 students from Nezihe Yalvac Anatolian Vocational High School of Hotel Management and Tourism that had high aggression scores. Eleven of the participants (6 female, 5 male) constituted the experimental group and 12 (6 male, 6 female) were in the control group. The 34-item Aggression Scale was used to measure the level of aggression. We utilized a mixed-pattern design including experimental and control groups, pre-test, post-test, and follow-up. The experimental group participated in group-based psychodrama therapy once a week for 90 minutes, for 14 weeks in total. The Aggression Scale was administered to the experimental and control groups before and after treatment; it was additionally administered to the experimental group 16 weeks after treatment. Data were analyzed using ANCOVA and dependent-samples t tests. Our analysis shows that group-based psychodrama had an effect on the experimental group in terms of total aggression, anger, hostility, and indirect aggression scores (F=65.109, F=20.175, F=18.593, F=40.987, respectively, P<0.05), indicating that group-based psychodrama therapy decreased the level of aggression in the experimental group. Current findings are discussed with reference to the literature. Recommendations for further research and for psychiatric counselors are provided.

  8. Application of Adjusted Canonical Correlation Analysis (ACCA) to study the association between mathematics in Level 1 and Level 2 and performance of engineering disciplines in Level 2

    Science.gov (United States)

    Peiris, T. S. G.; Nanayakkara, K. A. D. S. A.

    2017-09-01

    Mathematics plays a key role in the engineering sciences, as it helps to develop the intellectual maturity and analytical thinking of engineering students, and exploring student academic performance has received great attention recently. The lack of control over covariates motivates the need for their adjustment when measuring the degree of association between two sets of variables in Canonical Correlation Analysis (CCA). Thus, to examine the individual effects of mathematics in Level 1 and Level 2 on engineering performance in Level 2, two adjusted analyses in CCA, part CCA and partial CCA, were applied to the raw marks of engineering undergraduates for three different disciplines at the Faculty of Engineering, University of Moratuwa, Sri Lanka. The joint influence of mathematics in Level 1 and Level 2 on engineering performance in Level 2 is significant irrespective of the engineering discipline. The individual effect of mathematics in Level 2 is significantly higher than the individual effect of mathematics in Level 1 on engineering performance in Level 2. Furthermore, the individual effect of mathematics in Level 1 can be negligible, but there would be a notable indirect effect of mathematics in Level 1 on engineering performance in Level 2. It can be concluded that the joint effect of mathematics in both Level 1 and Level 2 is immensely beneficial for improving the overall academic performance of engineering students at the end of Level 2. Furthermore, it was found that the impact of mathematics varies among engineering disciplines. As part CCA and partial CCA are not widely explored in applied work, it is recommended to use these techniques for various applications.
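
    One common way to implement partial CCA is to residualize both variable sets on the covariates and then run ordinary CCA on the residuals (part CCA would residualize only one of the two sets). The sketch below follows that recipe with invented mark matrices; it is not the authors' exact procedure.

```python
# Sketch: partial CCA as ordinary CCA on covariate-adjusted residuals (synthetic).
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n = 200
Z = rng.normal(size=(n, 2))                    # covariates, e.g. Level 1 mathematics
X = Z @ rng.normal(size=(2, 3)) + rng.normal(size=(n, 3))   # Level 2 mathematics
Y = (Z @ rng.normal(size=(2, 4)) + X @ rng.normal(size=(3, 4))
     + rng.normal(size=(n, 4)))                # Level 2 engineering subjects

def residualize(A, Z):
    """Remove the linear effect of Z from each column of A."""
    return A - LinearRegression().fit(Z, A).predict(Z)

Xr, Yr = residualize(X, Z), residualize(Y, Z)
u, v = CCA(n_components=1).fit(Xr, Yr).transform(Xr, Yr)
print("first partial canonical correlation:", np.corrcoef(u[:, 0], v[:, 0])[0, 1])
```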

  9. Deformation analysis and prediction of bank protection structure with river level fluctuations

    Science.gov (United States)

    Hu, Rui; Xing, Yixuan

    2017-04-01

    Bank structures are an important barrier to maintain the safety of an embankment. The deformation of a bank protection structure is affected not only by soil pressure caused by the excavation of the riverway, but also by water pressure caused by river water level fluctuations. Thus, it is necessary to establish a coupled soil-water model to analyze the deformation of the bank structure. Based on the Drucker-Prager failure criterion and groundwater seepage theory, a numerical model of a bank protection structure is established with consideration of the pore water pressure of the soil mass. Using measured river level data with seasonal fluctuations, a numerical analysis of the deformation of the bank protection structure is implemented. The simulation results show that river water level fluctuation has a clear influence on the maximum lateral displacement of the pile. Meanwhile, the distribution of the plastic zone is related to the depth of the groundwater level. Finally, according to the river water level data of the recent ten years, we analyze the deformation of the bank structure under extreme river levels. The result shows that, compared with the scenario of extreme high river level, the horizontal displacement of the bank protection structure is larger (up to 65 mm) under extreme low river level, which is a potential risk to the embankment. References: Schweiger, H.F. On the use of Drucker-Prager failure criteria for earth pressure problems. Computers and Geotechnics, 1994, 16(3): 223-246. Ding Yong-chun, Cheng Ze-kun. Numerical study on performance of waterfront excavation. Chinese Journal of Geotechnical Engineering, 2013, 35(2): 515-521. Wu, L.M., Wang, Z.Q. Three Gorges Reservoir water level fluctuation influence on the stability of the slope. Advanced Materials Research, Trans Tech Publications, 2013, 739: 283-286.

  10. Methodology of safety assessment and sensitivity analysis for geologic disposal of high-level radioactive waste

    International Nuclear Information System (INIS)

    Kimura, Hideo; Takahashi, Tomoyuki; Shima, Shigeki; Matsuzuru, Hideo

    1995-01-01

    A deterministic safety assessment methodology has been developed to evaluate long-term radiological consequences associated with geologic disposal of high-level radioactive waste, and to demonstrate a generic feasibility of geologic disposal. An exposure scenario considered here is based on a normal evolution scenario which excludes events attributable to probabilistic alterations in the environment. A computer code system GSRW thus developed is based on a non site-specific model, and consists of a set of sub-modules for calculating the release of radionuclides from engineered barriers, the transport of radionuclides in and through the geosphere, the behavior of radionuclides in the biosphere, and radiation exposures of the public. In order to identify the important parameters of the assessment models, an automated procedure for sensitivity analysis based on the Differential Algebra method has been developed to apply to the GSRW. (author)

  11. Extraversion and psychopathology: A facet-level analysis.

    Science.gov (United States)

    Watson, David; Stasik, Sara M; Ellickson-Larew, Stephanie; Stanton, Kasey

    2015-05-01

    The goal of this study was to explicate how the lower order facets of extraversion are related to psychopathology. We used a "bottom-up" approach in which specific extraversion scales from 3 comprehensive personality inventories were used to model these facets as latent factors. We collected both self-report and interview measures of a broad range of psychopathology from a large community sample. Replicating previous findings using a similar approach (Naragon-Gainey & Watson, 2014; Naragon-Gainey, Watson, & Markon, 2009), structural analyses yielded four factors: Positive Emotionality, Sociability, Assertiveness, and Experience Seeking. Scores on these latent dimensions were related to psychopathology in correlational analyses and in two sets of regressions (the first series used the four facets as predictors; the second included composite scores on the other Big Five domains as additional predictors). These results revealed a striking level of specificity. As predicted, Positive Emotionality displayed especially strong negative links to depressive symptoms and diagnoses. Sociability also was negatively related to psychopathology, showing particularly strong associations with indicators of social dysfunction and the negative symptoms of schizotypy (i.e., social anxiety, social aloofness, and restricted affectivity). Assertiveness generally had weak associations at the bivariate level but was negatively related to social anxiety and was positively correlated with some forms of externalizing. Finally, Experience Seeking had substantial positive associations with a broad range of indicators related to externalizing and bipolar disorder; it also displayed negative links to agoraphobia. These differential correlates demonstrate the importance of examining personality-psychopathology relations at the specific facet level. (c) 2015 APA, all rights reserved.

  12. Macro-level safety analysis of pedestrian crashes in Shanghai, China.

    Science.gov (United States)

    Wang, Xuesong; Yang, Junguang; Lee, Chris; Ji, Zhuoran; You, Shikai

    2016-11-01

    Pedestrian safety has become one of the most important issues in the field of traffic safety. This study aims at investigating the association between pedestrian crash frequency and various predictor variables including roadway, socio-economic, and land-use features. The relationships were modeled using the data from 263 Traffic Analysis Zones (TAZs) within the urban area of Shanghai - the largest city in China. Since spatial correlation exists among the zonal-level data, Bayesian Conditional Autoregressive (CAR) models with seven different spatial weight features (i.e. (a) 0-1 first order, adjacency-based, (b) common boundary-length-based, (c) geometric centroid-distance-based, (d) crash-weighted centroid-distance-based, (e) land use type, adjacency-based, (f) land use intensity, adjacency-based, and (g) geometric centroid-distance-order) were developed to characterize the spatial correlations among TAZs. Model results indicated that the geometric centroid-distance-order spatial weight feature, which was introduced in macro-level safety analysis for the first time, outperformed all the other spatial weight features. Population was used as the surrogate for pedestrian exposure, and had a positive effect on pedestrian crashes. Other significant factors included length of major arterials, length of minor arterials, road density, average intersection spacing, percentage of 3-legged intersections, and area of TAZ. Pedestrian crashes were higher in TAZs with medium land use intensity than in TAZs with low and high land use intensity. Thus, higher priority should be given to TAZs with medium land use intensity to improve pedestrian safety. Overall, these findings can help transportation planners and managers understand the characteristics of pedestrian crashes and improve pedestrian safety. Copyright © 2016 Elsevier Ltd. All rights reserved.
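
    A minimal sketch of building a centroid-distance-order spatial weight matrix of the kind introduced above. The inverse-rank weighting and row standardization are assumptions for illustration, not necessarily the authors' exact specification.

```python
# Sketch: centroid-distance-order spatial weights for TAZs (synthetic centroids).
import numpy as np

rng = np.random.default_rng(7)
n = 20
centroids = rng.random((n, 2)) * 10          # TAZ geometric centroids, km
d = np.linalg.norm(centroids[:, None, :] - centroids[None, :, :], axis=-1)

order = d.argsort(axis=1).argsort(axis=1)    # rank of each neighbor by distance
W = np.zeros_like(d)
mask = order > 0                             # rank 0 is the zone itself
W[mask] = 1.0 / order[mask]                  # weight decays with distance order
W /= W.sum(axis=1, keepdims=True)            # row-standardize, as usual for CAR
print(W[0].round(3))
```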

  13. Alternatives Generation and Analysis for Phase 1 High-Level Waste Feed Tanks Selection

    International Nuclear Information System (INIS)

    CRAWFORD, T.W.

    1999-01-01

    A recent revision of the US Department of Energy privatization contract for the immobilization of high-level waste (HLW) at Hanford necessitates the investigation of alternative waste feed sources to meet contractual feed requirements. This analysis identifies wastes to be considered as HLW feeds and develops and conducts alternative analyses to comply with established criteria. A total of 12,426 cases involving 72 waste streams are evaluated and ranked in three cost-based alternative models. Additional programmatic criteria are assessed against leading alternative options to yield an optimum blended waste feed stream

  14. High-level waste canister envelope study: structural analysis

    International Nuclear Information System (INIS)

    1977-11-01

    The structural integrity of waste canisters, fabricated from standard weight Type 304L stainless steel pipe, was analyzed for sizes ranging from 8 to 24 in. diameter and 10 to 16 feet long under normal, abnormal, and improbable life cycle loading conditions. The canisters are assumed to be filled with vitrified high-level nuclear waste, stored temporarily at a fuel reprocessing plant, and then transported for storage in an underground salt bed or other geologic storage. In each of the three impact conditions studied, the resulting impact force is far greater than the elastic limit capacity of the material. Recommendations are made for further study

  15. Mean-field level analysis of epidemics in directed networks

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Jiazeng [School of Mathematical Sciences, Peking University, Beijing 100871 (China); Liu, Zengrong [Mathematics Department, Shanghai University, Shanghai 200444 (China)], E-mail: wangjiazen@yahoo.com.cn, E-mail: zrongliu@online.sh.cn

    2009-09-04

    The susceptible-infected-removed spreading model in a directed graph is studied. The mean-field level rate equations are built with the degree-degree connectivity correlation element and the (in, out)-degree distribution, and the outbreak threshold is obtained analytically; it is determined by the combination of the connectivity probability and the degree distribution. Furthermore, methods of calculating the degree-degree correlations in directed networks are presented. The numerical results of the discrete epidemic processes in networks verify our analyses.

  16. Mean-field level analysis of epidemics in directed networks

    International Nuclear Information System (INIS)

    Wang, Jiazeng; Liu, Zengrong

    2009-01-01

    The susceptible-infected-removed spreading model in a directed graph is studied. The mean-field level rate equations are built with the degree-degree connectivity correlation element and the (in, out)-degree distribution, and the outbreak threshold is obtained analytically; it is determined by the combination of the connectivity probability and the degree distribution. Furthermore, methods of calculating the degree-degree correlations in directed networks are presented. The numerical results of the discrete epidemic processes in networks verify our analyses.
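
    A hedged sketch of degree-based mean-field SIR rate equations integrated with SciPy. For brevity it assumes uncorrelated degrees and a single degree class per node, whereas the paper treats the directed, degree-degree-correlated case; the rate constants and degree distribution are invented.

```python
# Sketch: degree-based mean-field SIR rate equations (simplified, uncorrelated).
import numpy as np
from scipy.integrate import odeint

beta, mu = 0.6, 1.0                         # infection / recovery rates (assumed)
k = np.arange(1, 11)                        # degree classes 1..10
p_k = k ** -2.5 / np.sum(k ** -2.5)         # power-law-like degree distribution
mean_k = np.sum(k * p_k)

def rhs(y, t):
    s, i = y[:10], y[10:]                   # s_k, i_k per degree class
    theta = np.sum(k * p_k * i) / mean_k    # prob. a link points to an infective
    ds = -beta * k * s * theta
    di = beta * k * s * theta - mu * i
    return np.concatenate([ds, di])

y0 = np.concatenate([np.full(10, 0.99), np.full(10, 0.01)])
t = np.linspace(0, 30, 300)
sol = odeint(rhs, y0, t)
print("final epidemic size:", 1 - np.sum(p_k * sol[-1, :10]))
```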

  17. Estimating Driving Performance Based on EEG Spectrum Analysis

    Directory of Open Access Journals (Sweden)

    Jung Tzyy-Ping

    2005-01-01

    Full Text Available The growing number of traffic accidents in recent years has become a serious concern to society. Accidents caused by drivers' drowsiness behind the steering wheel have a high fatality rate because of the marked decline in perception, recognition, and vehicle-control abilities while sleepy. Preventing such accidents is highly desirable but requires techniques for continuously detecting, estimating, and predicting the level of alertness of drivers and delivering effective feedback to maintain their maximum performance. This paper proposes an EEG-based drowsiness estimation system that combines electroencephalogram (EEG) log subband power spectra, correlation analysis, principal component analysis, and linear regression models to indirectly estimate the driver's drowsiness level in a virtual-reality-based driving simulator. Our results demonstrate that it is feasible to accurately and quantitatively estimate driving performance, expressed as the deviation between the center of the vehicle and the center of the cruising lane, in a realistic driving simulator.
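
    A toy version of this estimation pipeline, with synthetic band-power features standing in for EEG data (PCA via SVD followed by least-squares regression; not the authors' code), might look like this:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for log subband power features (trials x bands) and a
# driving-error signal (lane deviation); not the study's data.
X = rng.normal(size=(200, 10))
y = X @ rng.normal(size=10) + 0.5 * rng.normal(size=200)

# Principal component analysis via SVD on the centered feature matrix.
Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                     # keep the three leading components

# Linear regression of driving error on the component scores.
A = np.column_stack([np.ones(len(y)), scores])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
estimate = A @ coef
print("correlation(estimate, actual):", np.corrcoef(estimate, y)[0, 1].round(3))
```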

  18. Texture-based analysis of COPD

    DEFF Research Database (Denmark)

    Sørensen, Lauge; Nielsen, Mads; Lo, Pechin Chien Pau

    2012-01-01

    This study presents a fully automatic, data-driven approach for texture-based quantitative analysis of chronic obstructive pulmonary disease (COPD) in pulmonary computed tomography (CT) images. The approach uses supervised learning where the class labels are, in contrast to previous work, based...... on measured lung function instead of on manually annotated regions of interest (ROIs). A quantitative measure of COPD is obtained by fusing COPD probabilities computed in ROIs within the lung fields where the individual ROI probabilities are computed using a k nearest neighbor (kNN) classifier. The distance...... and subsequently applied to classify 200 independent images from the same screening trial. The texture-based measure was significantly better at discriminating between subjects with and without COPD than were the two most common quantitative measures of COPD in the literature, which are based on density...
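
    A minimal sketch of the ROI-fusion idea, using synthetic texture features and a simple mean as the fusion rule (the study's actual features and fusion scheme may differ):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy texture features for labelled training ROIs (0 = healthy, 1 = COPD)
# and for ROIs sampled from one test lung image; synthetic stand-ins.
train_X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(1, 1, (100, 5))])
train_y = np.array([0] * 100 + [1] * 100)
test_rois = rng.normal(0.8, 1, (50, 5))

def knn_copd_probability(x, k=15):
    """Fraction of the k nearest training ROIs labelled COPD."""
    dist = np.linalg.norm(train_X - x, axis=1)
    return train_y[np.argsort(dist)[:k]].mean()

# Fuse per-ROI posteriors into one subject-level COPD measure
# (here a simple average; other fusion rules are possible).
roi_probs = np.array([knn_copd_probability(x) for x in test_rois])
print("subject-level COPD measure:", roi_probs.mean().round(3))
```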

  19. Analysis of sound pressure levels emitted by children's toys.

    Science.gov (United States)

    Sleifer, Pricila; Gonçalves, Maiara Santos; Tomasi, Marinês; Gomes, Erissandra

    2013-06-01

    To verify the levels of sound pressure emitted by non-certified children's toys. Cross-sectional study of sound toys available at popular retail stores of the so-called informal sector. Electronic, mechanical, and musical toys were analyzed. The measurement of each product was carried out by an acoustic engineer in an acoustically isolated booth, using a decibel meter. To obtain the sound parameters of intensity and frequency, the toys were set to produce sounds at distances of 10 and 50 cm from the researcher's ear. The intensity of sound pressure [dB(A)] and the frequency in hertz (Hz) were measured. In total, 48 toys were evaluated. The mean sound pressure at 10 cm from the ear was 102±10 dB(A), and at 50 cm, 94±8 dB(A), a statistically significant difference; the sound pressure of most toys was above 85 dB(A). The frequency ranged from 413 to 6,635 Hz, with 56.3% of toys emitting frequencies higher than 2,000 Hz. The majority of toys assessed in this research emitted a high level of sound pressure.

  20. Comparative Analysis of Nitrate Levels in Pensacola Area Rain Water

    Science.gov (United States)

    Jacobs, J.; Caffrey, J. M.; Maestre, A.; Landing, W. M.

    2017-12-01

    Nitrate is an important constituent of acid rain and is often correlated with atmospheric NOx levels. This link between air and water quality was tested over the course of summer 2017 and compared to data from 2005-2012. Rain water samples collected from late May through early July of 2017 were tested for pH and nitrate concentrations. These months were among the stormiest on record for the Northwest Florida region, with a total rainfall of 648 mm. The data from these rain events were compared to previous data to show the trends of nitrate and pH levels in the rainwater. Median pH for this study was 5.2, higher than the medians between 2005 and 2012, which ranged from 4.2 to 5.0, while the nitrate concentration for this study was 15.2 µM. This contrasts with a significant drop in nitrate concentrations from 41 µM in 2005 and 2006 to around 12 µM between 2007 and 2012. The drop between 2006 and 2007 was suspected to be a result of the implementation of NOx controls at the Plant Crist coal-fired power plant and other Clean Air Act requirements. These inputs of nitrate and H+ ions from rainwater can have a significant influence on water quality throughout the region.

  1. Security of electricity supply at the generation level: Problem analysis

    International Nuclear Information System (INIS)

    Rodilla, P.; Batlle, C.

    2012-01-01

    Since the very beginning of the restructuring process, back in 1982 in Chile, the ability of an electricity market to provide the system with the required level of security of supply has been put into question. Mistrust of the ability of the market, left to its own devices, to provide sufficient generation availability when needed is increasingly leading to the implementation of additional regulatory mechanisms. This matter is undoubtedly gaining importance and has taken a key role in energy regulators' agendas. In this paper, we revisit this discussion in the light of thirty years of electricity market experience. We analyze the different reasons why, although ideally the market is supposed to provide by itself an adequate level of security of supply at the generation level, this result is still far from being achieved in practice. - Highlights: ► Discussion on the need for capacity mechanisms is revisited. ► Reasons behind the adequacy problem are analyzed. ► The regulator's intervention to guarantee supply is most of the time justified.

  2. Systematics and Population Level Analysis of Anopheles darlingi

    Directory of Open Access Journals (Sweden)

    Conn JE

    1998-01-01

    Full Text Available A new phylogenetic analysis of the Nyssorhynchus subgenus (Danoff-Burg and Conn, unpub. data) using six data sets {morphological (all life stages); scanning electron micrographs of eggs; nuclear ITS2 sequences; mitochondrial COII, ND2 and ND6 sequences} revealed different topologies when each data set was analyzed separately, but no heterogeneity between the data sets using the arn test. Consequently, the most accurate estimate of the phylogeny was obtained when all the data were combined. This new phylogeny supports a monophyletic Nyssorhynchus subgenus, but both previously recognized sections in the subgenus (Albimanus and Argyritarsis) were demonstrated to be paraphyletic relative to each other, and four of the seven clades included species previously placed in both sections. One of these clades includes both Anopheles darlingi and An. albimanus, suggesting that the ability to vector malaria effectively may have originated once in this subgenus. Both a conserved (315 bp) and a variable (425 bp) region of the mitochondrial COI gene from 15 populations of An. darlingi from Belize, Bolivia, Brazil, French Guiana, Peru and Venezuela were used to examine the evolutionary history of this species and to test several analytical assumptions. Results demonstrated (1) that parsimony analysis is equally informative compared to distance analysis using NJ; (2) that clades or clusters are more strongly supported when these two regions are combined than when either region is used separately; (3) evidence (in the form of remnants of older haplotype lineages) for two colonization events; and (4) significant genetic divergence within the population from Peixoto de Azevedo (State of Mato Grosso, Brazil). The oldest lineage includes populations from Peixoto, Boa Vista (State of Roraima) and Dourado (State of São Paulo).

  3. Managing coopetition through horizontal supply chain relations : Linking dyadic and network levels of analysis

    NARCIS (Netherlands)

    Wilhelm, Miriam M.

    2011-01-01

    A growing research stream has expanded the level of analysis beyond single buyer-supplier relations to the network, including supplier-supplier relations. These supplier-supplier relations may constitute a missing link between the traditional analysis of the dyadic and the network level of analysis

  5. [Model-based biofuels system analysis: a review].

    Science.gov (United States)

    Chang, Shiyan; Zhang, Xiliang; Zhao, Lili; Ou, Xunmin

    2011-03-01

    Model-based system analysis is an important tool for evaluating the potential and impacts of biofuels, and for drafting biofuels technology roadmaps and targets. The broad reach of the biofuels supply chain requires that biofuels system analyses span a range of disciplines, including agriculture/forestry, energy, economics, and the environment. Here we reviewed various models developed for or applied to modeling biofuels, and presented a critical analysis of Agriculture/Forestry System Models, Energy System Models, Integrated Assessment Models, Micro-level Cost, Energy and Emission Calculation Models, and Specific Macro-level Biofuel Models. We focused on the models' strengths, weaknesses, and applicability, facilitating the selection of a suitable type of model for specific issues. Such an analysis is a prerequisite for future biofuels system modeling and represents a valuable resource for researchers and policy makers.

  6. Work in process level definition: a method based on computer simulation and electre tri

    Directory of Open Access Journals (Sweden)

    Isaac Pergher

    2014-09-01

    Full Text Available This paper proposes a method for defining the levels of work in progress (WIP) in productive environments managed by constant work in process (CONWIP) policies. The proposed method combines the approaches of Computer Simulation and Electre TRI to support estimation of an adequate WIP level, and is presented in eighteen steps. The paper also presents an application example, performed at a metalworking company. The research method is based on Computer Simulation, supported by quantitative data analysis. The main contribution of the paper is its provision of a structured way to define inventories according to demand. With this method, the authors hope to contribute to the establishment of better capacity plans in production environments.
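
    As a rough illustration of the simulation side of such a method (a toy serial line with made-up processing probabilities, not the paper's model), throughput can be estimated for candidate WIP levels before sorting the levels with an ELECTRE TRI-style procedure:

```python
import random

def conwip_throughput(wip, n_stations=4, horizon=10_000):
    """Toy serial line under a CONWIP cap: a new job enters only when one
    leaves, keeping total WIP constant; returns jobs finished per period."""
    random.seed(42)
    buffers = [wip] + [0] * (n_stations - 1)   # all cards start at station 1
    finished = 0
    for _ in range(horizon):
        # Process downstream stations first so a job moves at most one step.
        for s in reversed(range(n_stations)):
            # Each station completes its head-of-line job with prob. 0.7.
            if buffers[s] > 0 and random.random() < 0.7:
                if s == n_stations - 1:
                    finished += 1
                    buffers[0] += 1            # CONWIP: card returns to start
                else:
                    buffers[s + 1] += 1
                buffers[s] -= 1
    return finished / horizon

for wip in (2, 4, 8, 16):
    print(f"WIP = {wip:2d} -> throughput = {conwip_throughput(wip):.3f}")
```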

  7. Structural invariance of multiple intelligences, based on the level of execution.

    Science.gov (United States)

    Almeida, Leandro S; Prieto, María Dolores; Ferreira, Arístides; Ferrando, Mercedes; Ferrandiz, Carmen; Bermejo, Rosario; Hernández, Daniel

    2011-11-01

    The independence of the multiple intelligences (MI) in Gardner's theory has been debated since its conception. This article examines whether the one-factor structure of the MI theory tested in previous studies is invariant for low- and high-ability students. Two hundred ninety-four children (aged 5 to 7) participated in this study. A set of Gardner's Multiple Intelligences assessment tasks based on the Spectrum Project was used. To analyze the invariance of a general dimension of intelligence, the different models of behaviour were tested, using Multi-Group Confirmatory Factor Analysis (MGCFA), in samples of participants with different performance levels on the Spectrum Project tasks. Results suggest an absence of structural invariance in Gardner's tasks. Exploratory analyses suggest a three-factor structure for individuals with higher performance levels and a two-factor structure for individuals with lower performance levels.

  8. Multi-level Discourse Analysis in a Physics Teaching Methods Course from the Psychological Perspective of Activity Theory

    Science.gov (United States)

    Vieira, Rodrigo Drumond; Kelly, Gregory J.

    2014-11-01

    In this paper, we present and apply a multi-level method for discourse analysis in science classrooms. This method is based on the structure of human activity (activity, actions, and operations) and it was applied to study a pre-service physics teacher methods course. We argue that such an approach, based on a cultural psychological perspective, affords opportunities for analysts to perform a theoretically based detailed analysis of discourse events. Along with the presentation of analysis, we show and discuss how the articulation of different levels offers interpretative criteria for analyzing instructional conversations. We synthesize the results into a model for a teacher's practice and discuss the implications and possibilities of this approach for the field of discourse analysis in science classrooms. Finally, we reflect on how the development of teachers' understanding of their activity structures can contribute to forms of progressive discourse of science education.

  9. Bariatric surgery: an evidence-based analysis.

    Science.gov (United States)

    2005-01-01

    economic analysis. Bariatric surgery generally is effective for sustained weight loss of about 16% for people with BMIs of at least 40 kg/m² or at least 35 kg/m² with comorbid conditions (including diabetes, high lipid levels, and hypertension). It also is effective at resolving the associated comorbid conditions. This conclusion is largely based on level 3a evidence from the prospectively designed Swedish Obese Subjects study, which recently published 10-year outcomes for patients who had bariatric surgery compared with patients who received nonsurgical treatment. (1) Regarding specific procedures, there is evidence that malabsorptive techniques are better than other banding techniques for weight loss and resolution of comorbid illnesses. However, there are no published prospective, long-term, direct comparisons of these techniques available. Surgery for morbid obesity is considered an intervention of last resort for patients who have attempted first-line forms of medical management, such as diet, increased physical activity, behavioural modification, and drugs. In the absence of direct comparisons of active nonsurgical intervention via caloric restriction with bariatric techniques, the following observations are made: A recent systematic review examining the efficacy of major commercial and organized self-help weight loss programs in the United States concluded that the evidence to support the use of such programs was suboptimal, except for one trial on Weight Watchers. Furthermore, the programs were associated with high costs, attrition rates, and probability of regaining at least 50% of the lost weight in 1 to 2 years. (2) A recent randomized controlled trial reported 1-year outcomes comparing weight loss and metabolic changes in severely obese patients assigned to either a low-carbohydrate diet or a conventional weight loss diet. At 1 year, weight loss was similar for patients in each group (mean, 2-5 kg). There was a favourable effect on triglyceride levels and

  10. Tree-indexed processes: a high level crossing analysis

    Directory of Open Access Journals (Sweden)

    Mark Kelbert

    2003-01-01

    Full Text Available Consider a branching diffusion process on R1 starting at the origin. Take a high level u>0 and count the number R(u,n) of branches reaching u by generation n. Let Fk,n(u) be the probability P(R(u,n)
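
    A small simulation sketch of this setup, with a branching Gaussian random walk (assumed step size and branching probability) standing in for the branching diffusion, estimates how often no branch reaches the level u:

```python
import numpy as np

rng = np.random.default_rng(7)

def crossings(u=2.0, generations=10, branch_p=0.6, sigma=1.0):
    """Toy branching random walk on R started at the origin: each particle
    takes a Gaussian step, then splits in two with probability branch_p.
    Returns R(u, n): how many generation-n branches have reached level u
    at some point along their path."""
    positions = np.array([0.0])
    running_max = np.array([0.0])
    for _ in range(generations):
        positions = positions + rng.normal(0.0, sigma, positions.size)
        running_max = np.maximum(running_max, positions)
        splits = rng.random(positions.size) < branch_p
        positions = np.concatenate([positions, positions[splits]])
        running_max = np.concatenate([running_max, running_max[splits]])
    return int((running_max >= u).sum())

samples = [crossings() for _ in range(200)]
print("estimated P(R(u, n) = 0):", np.mean([r == 0 for r in samples]))
```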

  11. Analysis of a UAV that can Hover and Fly Level

    Directory of Open Access Journals (Sweden)

    Çakıcı Ferit

    2016-01-01

    Full Text Available In this study, an unmanned aerial vehicle (UAV) with level flight, vertical take-off and landing (VTOL) and mode-changing capability is analysed. The platform design combines both multirotor and fixed-wing (FW) conventional airplane structures and control surfaces, and is therefore named VTOL-FW. The aircraft is modelled using aerodynamic principles, and linear models are constructed utilizing small perturbation theory for trim conditions. The proposed method of control includes implementation of multirotor and airplane mode controllers and design of an algorithm to transition between modes, achieving smooth switching manoeuvres between VTOL and FW flight. Thus, the VTOL-FW UAV's flight characteristics are expected to be improved by enlarging the operational flight envelope through mode transitioning and agile manoeuvres, and by increasing survivability. Experiments conducted in simulation and real-world environments show that the VTOL-FW UAV has both multirotor and airplane characteristics, with extra benefits in an enlarged flight envelope.

  12. National high-level waste systems analysis plan

    International Nuclear Information System (INIS)

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.; Thiel, E.C.

    1995-05-01

    This document details the development of modeling capabilities that can provide a system-wide view of all US Department of Energy (DOE) high-level waste (HLW) treatment and storage systems. This model can assess the impact of budget constraints on storage and treatment system schedules and throughput. These impacts can then be assessed against existing and pending milestones to determine the impact to the overall HLW system. A nation-wide view of waste treatment availability will help project the time required to prepare HLW for disposal. The impacts of the availability of various treatment systems and their throughput can be compared to repository readiness to determine the prudent application of resources or the need to renegotiate milestones.

  13. Detecting sea-level hazards: Simple regression-based methods for calculating the acceleration of sea level

    Science.gov (United States)

    Doran, Kara S.; Howd, Peter A.; Sallenger, Asbury H.

    2016-01-04

    This report documents the development of statistical tools used to quantify the hazard presented by the response of sea-level elevation to natural or anthropogenic changes in climate and ocean circulation. A hazard is a physical process (or processes) that, when combined with vulnerability (or susceptibility to the hazard), results in risk. This study presents the development and comparison of new and existing sea-level analysis methods, exploration of the strengths and weaknesses of the methods using synthetic time series, and when appropriate, synthesis of the application of the method to observed sea-level time series. These reports are intended to enhance material presented in peer-reviewed journal articles where it is not always possible to provide the level of detail that might be necessary to fully support or recreate published results.
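
    One of the simplest regression-based estimators of sea-level acceleration fits a quadratic in time and reads the acceleration off the second-order coefficient. A sketch on a synthetic series (not one of the report's records):

```python
import numpy as np

# Synthetic monthly mean sea-level record (mm) with a small true
# acceleration; a stand-in for a tide-gauge time series.
rng = np.random.default_rng(3)
t = np.arange(0, 60, 1 / 12)                       # 60 years, in years
eta = 3.0 * t + 0.5 * 0.02 * t**2 + rng.normal(0, 25, t.size)

# Fit eta(t) = a + b*t + (c/2)*t^2; the acceleration c is twice the
# fitted quadratic coefficient, in mm/yr^2.
quad, rate, intercept = np.polyfit(t, eta, deg=2)
acceleration = 2.0 * quad
print(f"estimated rate:         {rate:.2f} mm/yr")
print(f"estimated acceleration: {acceleration:.4f} mm/yr^2")
```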

  14. Hadronic Triggers and trigger-object level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program, and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors to more deeply probing for new physics, such as storage and computing requirements f...

  15. Hadronic triggers and trigger object-level analysis at ATLAS

    CERN Document Server

    Zaripovas, Donatas Ramilas; The ATLAS collaboration

    2017-01-01

    Hadronic signatures are critical to the high energy physics analysis program at the Large Hadron Collider (LHC), and are broadly used for both Standard Model measurements and searches for new physics. These signatures include generic quark and gluon jets, as well as jets originating from b-quarks or the decay of massive particles (such as electroweak bosons or top quarks). Additionally missing transverse momentum from non-interacting particles provides an interesting probe in the search for new physics beyond the Standard Model. Developing trigger selections that target these events is a huge challenge at the LHC due to the enormous event rates associated with these signatures. This challenge is exacerbated by the amount of pile-up activity, which continues to grow. In order to address these challenges, several new techniques have been developed during the past year in order to significantly improve the potential of the 2017 dataset and overcome the limiting factors, such as storage and computing requirements...

  16. Scanning ion images; analysis of pharmaceutical drugs at organelle levels

    Science.gov (United States)

    Larras-Regard, E.; Mony, M.-C.

    1995-05-01

    With the ion analyser IMS 4F used in microprobe mode, it is possible to obtain images of fields of 10 × 10 µm², corresponding to an effective magnification of 7000 with lateral resolution of 250 nm, technical characteristics that are appropriate for the size of cell organelles. It is possible to characterize organelles by their relative CN-, P- and S- intensities when the tissues are prepared by freeze fixation and freeze substitution. The recognition of organelles enables correlation of the tissue distribution of ebselen, a pharmaceutical drug containing selenium. The various metabolites characterized in plasma, bile and urine during biotransformation of ebselen all contain selenium, so the presence of the drug and its metabolites can be followed by images of Se. We were also able to detect the endogenous content of Se in tissue, due to the increased sensitivity of ion analysis in microprobe mode. Our results show a natural occurrence of Se in the border corresponding to the basal lamina of cells of proximal but not distal tubules of the kidney. After treatment of rats with ebselen, an additional site of Se is found in the lysosomes. We suggest that in addition to direct elimination of ebselen and its metabolites by glomerular filtration and urinary elimination, a second process of elimination may occur: Se compounds reaching the epithelial cells via the basal lamina accumulate in lysosomes prior to excretion into the tubular fluid. The technical developments of using the IMS 4F instrument in the microprobe mode and the improvement in preparation of samples by freeze fixation and substitution further extend the limit of ion analysis in biology. Direct imaging of trace elements and molecules marked with a tracer make it possible to determine their targets by comparison with images of subcellular structures. This is a promising advance in the study of pathways of compounds within tissues, cells and the whole organism.

  17. XML-based analysis interface for particle physics data analysis

    International Nuclear Information System (INIS)

    Hu Jifeng; Lu Xiaorui; Zhang Yangheng

    2011-01-01

    The letter focuses on an XML-based interface and its framework for particle physics data analysis. The interface uses a concise XML syntax to describe the basic tasks of data analysis (event selection, kinematic fitting, particle identification, etc.) and a basic processing logic: the next step goes on if and only if this step succeeds. The framework can perform an analysis without compiling, by loading the XML interface file, setting parameters at run-time and running dynamically. An analysis coded in XML instead of C++ is easy to understand and use, effectively reduces the workload, and enables users to carry out their analyses quickly. The framework has been developed on the BESⅢ offline software system (BOSS) with object-oriented C++ programming. The functions required by the regular tasks and the basic processing logic are implemented either with standard modules or are inherited from modules in BOSS. The interface and its framework have been tested to perform physics analysis.
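
    A schematic imitation of such an interface, using an invented XML layout and dummy cut values (not the BOSS schema), shows the short-circuit processing logic, where each step runs only if the previous one succeeds:

```python
import xml.etree.ElementTree as ET

# A hypothetical analysis description: tasks run in order, and the chain
# stops as soon as one step fails.
XML = """
<analysis>
  <task name="event-selection" min_tracks="2"/>
  <task name="kinematic-fitting" max_chi2="10"/>
  <task name="particle-id" hypothesis="pion"/>
</analysis>
"""

def run_task(task, event):
    """Dispatch one task element against a toy event record."""
    if task.get("name") == "event-selection":
        return event["n_tracks"] >= int(task.get("min_tracks"))
    if task.get("name") == "kinematic-fitting":
        return event["chi2"] <= float(task.get("max_chi2"))
    if task.get("name") == "particle-id":
        return task.get("hypothesis") in event["pid"]
    return False

event = {"n_tracks": 3, "chi2": 4.2, "pid": {"pion", "kaon"}}
for task in ET.fromstring(XML):
    ok = run_task(task, event)
    print(task.get("name"), "->", "pass" if ok else "fail")
    if not ok:          # the next step goes on iff this step succeeds
        break
```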

  18. Chapter 11. Community analysis-based methods

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Y.; Wu, C.H.; Andersen, G.L.; Holden, P.A.

    2010-05-01

    Microbial communities are each a composite of populations whose presence and relative abundance in water or other environmental samples are a direct manifestation of environmental conditions, including the introduction of microbe-rich fecal material and factors promoting persistence of the microbes therein. As shown by culture-independent methods, different animal-host fecal microbial communities appear distinctive, suggesting that their community profiles can be used to differentiate fecal samples and to potentially reveal the presence of host fecal material in environmental waters. Cross-comparisons of microbial communities from different hosts also reveal relative abundances of genetic groups that can be used to distinguish sources. In increasing order of their information richness, several community analysis methods hold promise for MST applications: phospholipid fatty acid (PLFA) analysis, denaturing gradient gel electrophoresis (DGGE), terminal restriction fragment length polymorphism (TRFLP), cloning/sequencing, and PhyloChip. Specific case studies involving TRFLP and PhyloChip approaches demonstrate the ability of community-based analyses of contaminated waters to confirm a diagnosis of water quality based on host-specific marker(s). The success of community-based MST for comprehensively confirming fecal sources relies extensively upon using appropriate multivariate statistical approaches. While community-based MST is still under evaluation and development as a primary diagnostic tool, results presented herein demonstrate its promise. Coupled with its inherently comprehensive ability to capture an unprecedented amount of microbiological data that is relevant to water quality, the tools for microbial community analysis are increasingly accessible, and community-based approaches have unparalleled potential for translation into rapid, perhaps real-time, monitoring platforms.

  19. Cloud-Based Orchestration of a Model-Based Power and Data Analysis Toolchain

    Science.gov (United States)

    Post, Ethan; Cole, Bjorn; Dinkel, Kevin; Kim, Hongman; Lee, Erich; Nairouz, Bassem

    2016-01-01

    The proposed Europa Mission concept contains many engineering and scientific instruments that consume varying amounts of power and produce varying amounts of data throughout the mission. System-level power and data usage must be well understood and analyzed to verify design requirements. Numerous cross-disciplinary tools and analysis models are used to simulate the system-level spacecraft power and data behavior. This paper addresses the problem of orchestrating a consistent set of models, tools, and data in a unified analysis toolchain when ownership is distributed among numerous domain experts. An analysis and simulation environment was developed as a way to manage the complexity of the power and data analysis toolchain and to reduce the simulation turnaround time. A system model data repository is used as the trusted store of high-level inputs and results while other remote servers are used for archival of larger data sets and for analysis tool execution. Simulation data passes through numerous domain-specific analysis tools and end-to-end simulation execution is enabled through a web-based tool. The use of a cloud-based service facilitates coordination among distributed developers and enables scalable computation and storage needs, and ensures a consistent execution environment. Configuration management is emphasized to maintain traceability between current and historical simulation runs and their corresponding versions of models, tools and data.

  20. Comparison of nurse staffing based on changes in unit-level workload associated with patient churn.

    Science.gov (United States)

    Hughes, Ronda G; Bobay, Kathleen L; Jolly, Nicholas A; Suby, Chrysmarie

    2015-04-01

    This analysis compares the staffing implications of three measures of nurse staffing requirements: midnight census, turnover adjustment based on length of stay, and volume of admissions, discharges and transfers. Midnight census is commonly used to determine registered nurse staffing. Unit-level workload increases with patient churn, the movement of patients in and out of the nursing unit. Failure to account for patient churn in staffing allocation impacts nurse workload and may result in adverse patient outcomes. Secondary data analysis of unit-level data from 32 hospitals, where nursing units are grouped into three unit-type categories: intensive care, intermediate care, and medical surgical. Midnight census alone did not account adequately for registered nurse workload intensity associated with patient churn. On average, units were staffed with a mixture of registered nurses and other nursing staff, not always to budgeted levels. Adjusting for patient churn increases nurse staffing across all units and shifts. Use of the discharges and transfers adjustment to midnight census may be useful in adjusting RN staffing on a shift basis to account for patient churn. Nurse managers should understand the implications to nurse workload of various methods of calculating registered nurse staff requirements. © 2013 John Wiley & Sons Ltd.
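
    A back-of-the-envelope sketch of the contrast between census-only and churn-adjusted staffing; all numbers, and in particular the churn weighting factor, are illustrative assumptions rather than values from the study:

```python
# Toy comparison of two staffing measures for one shift.
midnight_census = 24
admissions, discharges, transfers = 6, 5, 3
nurse_to_patient_ratio = 1 / 4        # one RN per four patients (assumed)
churn_weight = 0.5                    # assumed workload of one ADT event
                                      # relative to one occupied bed

# Census-only RN requirement.
rn_census = midnight_census * nurse_to_patient_ratio

# Churn-adjusted requirement: add a workload credit for each admission,
# discharge, and transfer (ADT) handled during the shift.
adt_events = admissions + discharges + transfers
rn_adjusted = (midnight_census + churn_weight * adt_events) * nurse_to_patient_ratio

print(f"census-only RN requirement:    {rn_census:.1f}")
print(f"churn-adjusted RN requirement: {rn_adjusted:.1f}")
```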

  1. Codec-on-Demand Based on User-Level Virtualization

    Science.gov (United States)

    Zhang, Youhui; Zheng, Weimin

    At work, at home, and in some public places, a desktop PC is usually available nowadays. Therefore, it is important for users to be able to play various videos on different PCs smoothly, but the diversity of codec types complicates the situation. Although some mainstream media players can try to download the needed codec automatically, this may fail for average users because installing the codec usually requires administrator privileges, while the user may not be the owner of the PC. We believe an ideal solution should work without user intervention and require no special privileges. This paper proposes such a user-friendly, program-transparent solution for Windows-based media players. It runs the media player in a user-mode virtualization environment, and then downloads the needed codec on-the-fly. Because of API (Application Programming Interface) interception, some resource-accessing API calls from the player will be redirected to the downloaded codec resources. Then, from the viewpoint of the player, the necessary codec exists locally and it can handle the video smoothly, although neither the system registry nor system folders were modified during this process. Besides convenience, the principle of least privilege is maintained and the host system is left clean. This paper analyzes the technical issues in detail and presents such a prototype, which works with DirectShow-compatible players. Performance tests show that the overhead is negligible. Moreover, our solution conforms to the Software-As-A-Service (SaaS) mode, which is very promising in the Internet era.

  2. Systematic review and meta-analysis of circulating S100B blood levels in schizophrenia.

    Directory of Open Access Journals (Sweden)

    Katina Aleksovska

    Full Text Available S100B is a calcium-binding protein secreted in the central nervous system by astrocytes and other glial cells. High blood S100B levels have been linked to brain damage and psychiatric disorders. S100B levels have been reported to be higher in schizophrenics than in healthy controls. To quantify the relationship between S100B blood levels and schizophrenia, a systematic literature review of case-control studies published on this topic up to July 3rd, 2014 was carried out using three bibliographic databases: Medline, Scopus and Web of Science. Studies reporting the mean and standard deviation of S100B blood levels in both cases and controls were included in the meta-analysis. The meta-Mean Ratio (mMR) of S100B blood levels in cases compared to controls was used as a measure of effect, along with its 95% Confidence Interval (CI). Twenty studies were included, totaling 994 cases and 785 controls. Schizophrenia patients showed 76% higher S100B blood levels than controls, with mMR = 1.76 (95% CI: 1.44-2.15). No difference could be found between drug-free patients, with mMR = 1.84 (95% CI: 1.24-2.74), and patients on antipsychotic medication, with mMR = 1.75 (95% CI: 1.41-2.16). Similarly, ethnicity and stage of disease did not affect results. Although S100B could be regarded as a possible biomarker of schizophrenia, limitations should be accounted for when interpreting results, especially because of the high heterogeneity, which remained >70% even after carrying out subgroup analyses. These results point out that approaches based on traditional categorical diagnoses may be too restrictive and that new approaches based on the characterization of new complex phenotypes should be considered.
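
    A compact sketch of how such a meta-mean ratio can be pooled on the log scale under a DerSimonian-Laird random-effects model; the per-study summaries below are invented, not the review's data:

```python
import numpy as np

# Hypothetical per-study summaries: (mean_cases, sd, n, mean_controls, sd, n).
studies = [
    (0.110, 0.05, 40, 0.060, 0.03, 38),
    (0.095, 0.04, 55, 0.055, 0.02, 60),
    (0.130, 0.07, 30, 0.070, 0.04, 33),
]

# Log mean ratio per study, with its delta-method variance.
logmr = np.array([np.log(m1 / m0) for m1, _, _, m0, _, _ in studies])
var = np.array([s1**2 / (n1 * m1**2) + s0**2 / (n0 * m0**2)
                for m1, s1, n1, m0, s0, n0 in studies])

# DerSimonian-Laird between-study variance, then random-effects pooling.
w = 1 / var
fixed = (w * logmr).sum() / w.sum()
q = (w * (logmr - fixed)**2).sum()
tau2 = max(0.0, (q - (len(studies) - 1)) / (w.sum() - (w**2).sum() / w.sum()))
w_re = 1 / (var + tau2)
pooled = (w_re * logmr).sum() / w_re.sum()
se = (1 / w_re.sum())**0.5
lo, hi = pooled - 1.96 * se, pooled + 1.96 * se
print(f"mMR = {np.exp(pooled):.2f}, 95% CI: {np.exp(lo):.2f}-{np.exp(hi):.2f}")
```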

  3. Analysis of stationary availability factor of two-level backbone computer networks with arbitrary topology

    Science.gov (United States)

    Rahman, P. A.

    2018-05-01

    This scientific paper deals with two-level backbone computer networks with arbitrary topology. A specialized method, offered by the author, for calculating the stationary availability factor of two-level backbone computer networks is discussed; it is based on Markov reliability models for a set of independent repairable elements with given failure and repair rates, together with methods of discrete mathematics. A specialized algorithm, offered by the author, for analyzing network connectivity, taking into account different kinds of network equipment failures, is also described. Finally, the paper presents an example of calculation of the stationary availability factor for a backbone computer network with a given topology.
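
    A minimal sketch of the underlying reliability arithmetic: for a repairable element with constant failure rate lam and repair rate mu, the two-state Markov model gives stationary availability mu/(lam+mu), and element availabilities then combine over the topology. The toy topology and rates below are assumptions, not the paper's network:

```python
def availability(lam, mu):
    """Stationary availability of one repairable element (two-state
    Markov model with failure rate lam and repair rate mu)."""
    return mu / (lam + mu)

# Toy two-level backbone: two core switches in parallel (backbone level),
# in series with one access switch (second level); per-hour rates assumed.
a_core = availability(lam=1e-4, mu=1e-2)
a_access = availability(lam=5e-4, mu=1e-2)

parallel = 1 - (1 - a_core) ** 2    # path survives if either core is up
system = parallel * a_access        # both levels must be available
print(f"stationary availability factor: {system:.6f}")
```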

  4. Exploring changes in open defecation prevalence in sub-Saharan Africa based on national level indices.

    Science.gov (United States)

    Galan, Deise I; Kim, Seung-Sup; Graham, Jay P

    2013-05-30

    In sub-Saharan Africa, it is estimated that 215 million people continue to engage in open defecation. This practice facilitates the transmission of diarrheal diseases - one of the leading causes of mortality in children under 5 in sub-Saharan Africa. The main purposes of this study are to: estimate changes in open defecation prevalence between 2005 and 2010 across countries in sub-Saharan Africa; examine the association between national-level indices and changes in open defecation prevalence; and assess how many countries can achieve 'open defecation free status' by 2015. After applying selection criteria, this study analyzed country-level data for 34 sub-Saharan African countries. Seven country-level indices were collected: 1) presence of a national sanitation policy; 2) budget line for sanitation; 3) budget allocated to sanitation; 4) annual per capita GDP; 5) GDP growth; 6) implementation of total sanitation approaches; and 7) per capita aid disbursement for water supply and sanitation. The relationships between these country-level indices and the change in open defecation from 2005 to 2010 were investigated using the Wilcoxon signed-rank test and Spearman's rank correlation test. Only 3 countries (i.e. Ethiopia, Angola and Sao Tome and Principe) decreased open defecation by 10% or more between 2005 and 2010. No significant associations were observed between the change in open defecation prevalence and any of the national-level indices except per capita aid disbursement. Per capita aid disbursement for water and sanitation was positively associated with a reduction in open defecation (p-value = 0.02) for a subset of 29 low-income countries from 2005 to 2010. Only one country in our analysis, Angola, is on track to end open defecation by 2015 based on their performance between 2000 and 2010. Most of the national-level indices, including a country's economic status, were not associated with the change in open defecation prevalence. Based on current trends, the goal
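
    The two tests named in the abstract are standard; a sketch with invented country-level numbers shows how they would be applied:

```python
import numpy as np
from scipy import stats

# Hypothetical country-level data: open defecation prevalence (%) in 2005
# and 2010, and per capita aid disbursement for water and sanitation (US$).
od_2005 = np.array([45, 30, 22, 60, 18, 35, 50, 27])
od_2010 = np.array([40, 29, 20, 48, 17, 34, 46, 25])
aid_pc = np.array([2.1, 0.4, 1.8, 5.0, 0.2, 0.9, 3.5, 1.1])

# Paired change in prevalence across the two years.
w = stats.wilcoxon(od_2005, od_2010)
print(f"Wilcoxon signed-rank: statistic = {w.statistic}, p = {w.pvalue:.3f}")

# Association between the change and the aid indicator.
rho, p = stats.spearmanr(od_2005 - od_2010, aid_pc)
print(f"Spearman rho = {rho:.2f}, p = {p:.3f}")
```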

  5. Metabolic Profiling of Adiponectin Levels in Adults: Mendelian Randomization Analysis.

    Science.gov (United States)

    Borges, Maria Carolina; Barros, Aluísio J D; Ferreira, Diana L Santos; Casas, Juan Pablo; Horta, Bernardo Lessa; Kivimaki, Mika; Kumari, Meena; Menon, Usha; Gaunt, Tom R; Ben-Shlomo, Yoav; Freitas, Deise F; Oliveira, Isabel O; Gentry-Maharaj, Aleksandra; Fourkala, Evangelia; Lawlor, Debbie A; Hingorani, Aroon D

    2017-12-01

    Adiponectin, a circulating adipocyte-derived protein, has insulin-sensitizing, anti-inflammatory, antiatherogenic, and cardiomyocyte-protective properties in animal models. However, the systemic effects of adiponectin in humans are unknown. Our aims were to define the metabolic profile associated with higher blood adiponectin concentration and to investigate whether variation in adiponectin concentration affects the systemic metabolic profile. We applied multivariable regression in ≤5909 adults and Mendelian randomization (using cis-acting genetic variants in the vicinity of the adiponectin gene as instrumental variables) for analyzing the causal effect of adiponectin on the metabolic profile of ≤37,545 adults. Participants were largely European, from 6 longitudinal studies and 1 genome-wide association consortium. In the multivariable regression analyses, higher circulating adiponectin was associated with higher high-density lipoprotein lipids and lower very-low-density lipoprotein lipids, glucose levels, branched-chain amino acids, and inflammatory markers. However, these findings were not supported by Mendelian randomization analyses for most metabolites. Findings were consistent between sexes and after excluding high-risk groups (defined by age and occurrence of a previous cardiovascular event) and 1 study with an admixed population. Our findings indicate that blood adiponectin concentration is more likely to be an epiphenomenon in the context of metabolic disease than a key determinant.

  6. Statistical cluster analysis and diagnosis of nuclear system level performance

    International Nuclear Information System (INIS)

    Teichmann, T.; Levine, M.M.; Samanta, P.K.; Kato, W.Y.

    1985-01-01

    The complexity of individual nuclear power plants and the importance of maintaining reliable and safe operations make it desirable to complement the deterministic analyses of these plants with corresponding statistical surveys and diagnoses. Based on such investigations, one can then explore, statistically, the anticipation, prevention, and, when necessary, the control of such failures and malfunctions. This paper, and the accompanying one by Samanta et al., describe some of the initial steps in exploring the feasibility of setting up such a program on an integrated and global (industry-wide) basis. The conceptual statistical and data framework was originally outlined in BNL/NUREG-51609 and NUREG/CR-3026, and the present work aims to show how some important elements might be implemented in a practical way (albeit using hypothetical or simulated data).

  7. Serum adiponectin levels are inversely correlated with leukemia: A meta-analysis

    Directory of Open Access Journals (Sweden)

    Jun-Jie Ma

    2016-01-01

    Conclusion: Our meta-analysis suggested that serum ADPN levels may be inversely correlated with leukemia, and ADPN levels can be used as an effective biologic marker in early diagnosis and therapeutic monitoring of leukemia.

  8. Biases in Farm-Level Yield Risk Analysis due to Data Aggregation

    NARCIS (Netherlands)

    Finger, R.

    2012-01-01

    We investigate biases in farm-level yield risk analysis caused by data aggregation from the farm-level to regional and national levels using the example of Swiss wheat and barley yields. The estimated yield variability decreases significantly with increasing level of aggregation, with crop yield

  9. Distribution-level electricity reliability: Temporal trends using statistical analysis

    International Nuclear Information System (INIS)

    Eto, Joseph H.; LaCommare, Kristina H.; Larsen, Peter; Todd, Annika; Fisher, Emily

    2012-01-01

    This paper helps to address the lack of comprehensive, national-scale information on the reliability of the U.S. electric power system by assessing trends in U.S. electricity reliability based on the information reported by the electric utilities on power interruptions experienced by their customers. The research analyzes up to 10 years of electricity reliability information collected from 155 U.S. electric utilities, which together account for roughly 50% of total U.S. electricity sales. We find that reported annual average duration and annual average frequency of power interruptions have been increasing over time at a rate of approximately 2% annually. We find that, independent of this trend, installation or upgrade of an automated outage management system is correlated with an increase in the reported annual average duration of power interruptions. We also find that reliance on IEEE Standard 1366-2003 is correlated with higher reported reliability compared to reported reliability not using the IEEE standard. However, we caution that we cannot attribute reliance on the IEEE standard as having caused or led to higher reported reliability because we could not separate the effect of reliance on the IEEE standard from other utility-specific factors that may be correlated with reliance on the IEEE standard. - Highlights: ► We assess trends in electricity reliability based on the information reported by the electric utilities. ► We use rigorous statistical techniques to account for utility-specific differences. ► We find modest declines in reliability analyzing interruption duration and frequency experienced by utility customers. ► Installation or upgrade of an OMS is correlated with an increase in reported duration of power interruptions. ► We find reliance on IEEE Standard 1366 is correlated with higher reported reliability.

  10. DISPOSALSITE, Low-Level Radioactive Waste Storage Cost Analysis

    International Nuclear Information System (INIS)

    Smith, P.R.

    1990-01-01

    1 - Description of program or function: The Disposal Site Economic Model calculates the average generator price, or average price per cubic foot charged by a disposal facility to a waste generator, one measure of comparing the economic attractiveness of different waste disposal site and disposal technology combinations. The generator price is calculated to recover all costs necessary to develop, construct, operate, close, and care for a site through the end of the institutional care period and to provide the necessary financial returns to the site developer and lender (when used). Six alternative disposal technologies can be considered - shallow land disposal, intermediate depth disposal, above or below ground vaults, modular concrete canister disposal, and earth mounded concrete bunkers - each based on either private or public financing and development. 2 - Method of solution: The economic models incorporate default cost data from the Conceptual Design Report (DOE/LLW-60T, June 1987), a study by Rodgers Associated Engineering Corporation. Because all costs are in constant 1986 dollars, the figures must be modified to account for inflation. Interest during construction is either capitalized for the private developer or rolled into the loan for the public developer. All capital costs during construction are depreciated over the operating life of the site using straight-line depreciation for the private sector. 3 - Restrictions on the complexity of the problem - Maxima of: 100 years post-operating period, 30 years operating period, 15 years pre-operating period. The model should be used with caution outside the range of 1.8 to 10.5 million cubic feet of total volume. Depreciation is not recognized with public development.

  11. Separation of base flow from streamflow using groundwater levels - illustrated for the Pang catchment (UK)

    NARCIS (Netherlands)

    Peters, E.; Lanen, van H.A.J.

    2005-01-01

    A new filter to separate base flow from streamflow has been developed that uses observed groundwater levels. To relate the base flow to the observed groundwater levels, a non-linear relation was used. This relation is suitable for unconfined aquifers with deep groundwater levels that do not respond to
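
    A sketch of such a filter with an assumed power-law storage-discharge relation; the coefficients and data are invented for illustration and are not taken from the paper:

```python
import numpy as np

# Hypothetical daily groundwater heads h (m) and streamflow Q (m^3/s).
h = np.array([10.2, 10.4, 10.9, 11.3, 11.0, 10.7, 10.5, 10.3])
Q = np.array([1.1, 1.6, 3.2, 4.0, 2.6, 1.9, 1.5, 1.2])

# One plausible non-linear relation between groundwater level and base flow:
#   Qb = a * (h - h0)^b   for h > h0   (coefficients assumed here).
a, h0, b = 0.8, 9.5, 1.5
baseflow = np.where(h > h0, a * (h - h0) ** b, 0.0)
baseflow = np.minimum(baseflow, Q)   # base flow cannot exceed streamflow

quickflow = Q - baseflow
print("base flow index:", (baseflow.sum() / Q.sum()).round(3))
```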

  12. Web-based Analysis Services Report

    CERN Document Server

    AUTHOR|(CDS)2108758; Canali, Luca; Grancher, Eric; Lamanna, Massimo; McCance, Gavin; Mato Vila, Pere; Piparo, Danilo; Moscicki, Jakub; Pace, Alberto; Brito Da Rocha, Ricardo; Simko, Tibor; Smith, Tim; Tejedor Saavedra, Enric; CERN. Geneva. IT Department

    2017-01-01

    Web-based services (cloud services) are an important trend for innovating end-user services while optimising service operational costs. CERN users are constantly proposing new approaches (inspired by services existing on the web, by tools used in education or other sciences, or based on their experience in using existing computing services). In addition, industry and open source communities have recently made available a large number of powerful and attractive tools and platforms that enable large scale data processing. “Big Data” software stacks notably provide solutions for scalable storage, distributed compute and data analysis engines, data streaming, and web-based interfaces (notebooks). Some of those platforms and tools, typically available as open source products, are experiencing very fast adoption in industry and science, such that they are becoming “de facto” references in several areas of data engineering, data science and machine learning. In parallel to users' requests, WLCG is considering to c...

  13. Centering Pregnancy in Missouri: A System Level Analysis

    Directory of Open Access Journals (Sweden)

    Pamela K. Xaverius

    2014-01-01

    Full Text Available Background. Centering Pregnancy (CP) is an effective method of delivering prenatal care, yet providers have been slow to adopt the CP model. Our main hypothesis is that a site's adoption of CP is contingent upon knowledge of CP, characteristics of health care personnel, anticipated patient impact, and system readiness. Methods. Using a matched, pretest-posttest, observational design, 223 people completed pretest and posttest surveys. Our analysis included the effect of the seminar on the groups' knowledge of CP essential elements, barriers to prenatal care, and the perceived value of CP to the patients and to the system of care. Results. Before the CP Seminar only 34% of respondents were aware of the model, while knowledge increased significantly after the Seminar. The three greatest improvements were in understanding that the group is conducted in a circle, that the health assessment occurs in the group space, and that a facilitative leadership style is used. Child care, transportation, and language issues were the top three barriers. The greatest improvements reported for patients included timeliness, patient-centeredness and efficiency, although readiness for adoption was influenced by costs, resources, and expertise. Discussion. Readiness to adopt CP will require support for the start-up and sustainability of this model.

  14. The feeling of doing across levels of analysis: The effects of perceived control on learning

    Directory of Open Access Journals (Sweden)

    Ljubica Chatman

    2011-12-01

    Full Text Available A person's sense of control was initially conceptualized in psychology as either a trait (Rotter, 1966), an attribution style (Weiner, 1979), or a self-efficacy belief (Bandura, 1989a). More recent work in social cognition focuses on the process of inferring one's own causality and how the feeling of doing comes about. This investigation centers on a cue-based process as leading to the experience of agency. These cues include vision, proprioception, social cues, and action-relevant thought (Wegner & Sparrow, 2004). Since the advent of functional magnetic resonance imaging (fMRI), progress has been made in understanding the neural substrates implicated when one infers one's own causality (for a review see David, Newen, & Vogeley, 2008). An analysis of the different approaches to studying human agency reveals their contributions, with each level of analysis adding to and refining our understanding of perceived control and its effect on learning.

  15. Level of literacy and dementia: A secondary post-hoc analysis from North-West India

    Directory of Open Access Journals (Sweden)

    Sunil Kumar Raina

    2014-01-01

    Full Text Available Introduction: A relation between literacy and dementia has been studied in the past and an association has been documented. This is in spite of some studies pointing to the contrary. The current study was aimed at investigating the influence of the level of literacy on dementia in a sample stratified by geography (Migrant, Urban, Rural and Tribal areas) of the sub-Himalayan state of Himachal Pradesh, India. Materials and Methods: The study was based on a post-hoc analysis of data obtained from a study conducted on the elderly population (60 years and above) from selected geographical areas (Migrant, Urban, Rural and Tribal) of Himachal Pradesh state in North-west India. Results: Analysis of variance revealed an effect of education on cognitive scores [F = 2.823, P = 0.01]; however, the post-hoc Tukey's HSD test did not reveal any significant pairwise comparisons. Discussion: The possibility that education affects dementia needs further evaluation, more so in the Indian context.
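
    A sketch of the two analysis steps (one-way ANOVA followed by Tukey's HSD) on synthetic cognitive scores; the strata and numbers are invented, and tukey_hsd requires SciPy 1.8 or newer:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Synthetic cognitive scores for three literacy strata; all values invented.
illiterate = rng.normal(24, 4, 40)
primary = rng.normal(25, 4, 40)
secondary = rng.normal(26, 4, 40)

# Omnibus one-way ANOVA: do mean cognitive scores differ across strata?
f, p = stats.f_oneway(illiterate, primary, secondary)
print(f"ANOVA: F = {f:.3f}, p = {p:.3f}")

# Post-hoc pairwise comparisons with Tukey's HSD (SciPy >= 1.8).
print(stats.tukey_hsd(illiterate, primary, secondary))
```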

  16. Crest Level Optimization of the Multi Level Overtopping based Wave Energy Converter Seawave Slot-Cone Generator

    DEFF Research Database (Denmark)

    Kofoed, Jens Peter; Osaland, E.

    2005-01-01

    The paper describes the optimization of the crest levels and geometrical layout of the SSG structure, focusing on maximizing the obtained potential energy in the overtopping water. During wave tank testing at AAU, average overtopping rates into the individual reservoirs have been measured. The initial tests led to an expression describing the derivative of the overtopping rate with respect to the vertical distance. Based on this, numerical optimizations of the crest levels, for a number of combinations of wave conditions, have been performed. The hereby found optimal crest levels have been...

  17. Sentiment Analysis on Tweets about Diabetes: An Aspect-Level Approach

    KAUST Repository

    Salas-Zárate, María del Pilar

    2017-02-19

    In recent years, some methods of sentiment analysis have been developed for the health domain; however, the diabetes domain has not been explored yet. In addition, there is a lack of approaches that analyze the positive or negative orientation of each aspect contained in a document (a review, a piece of news, and a tweet, among others). Based on this understanding, we propose an aspect-level sentiment analysis method based on ontologies in the diabetes domain. The sentiment of the aspects is calculated by considering the words around the aspect which are obtained through N-gram methods (N-gram after, N-gram before, and N-gram around). To evaluate the effectiveness of our method, we obtained a corpus from Twitter, which has been manually labelled at aspect level as positive, negative, or neutral. The experimental results show that the best result was obtained through the N-gram around method with a precision of 81.93%, a recall of 81.13%, and an F-measure of 81.24%.
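
    A minimal sketch of the "N-gram around" scoring idea, with a made-up lexicon and tweet; the paper's ontology-based resources are not reproduced here:

```python
# Tiny polarity lexicon, invented for illustration.
LEXICON = {"great": 1, "stable": 1, "high": -1, "scary": -1, "low": -1}

def aspect_sentiment(tokens, aspect, n=3):
    """Score an aspect from the n words before and after each mention
    (the "N-gram around" window)."""
    scores = []
    for i, tok in enumerate(tokens):
        if tok == aspect:
            window = tokens[max(0, i - n):i] + tokens[i + 1:i + 1 + n]
            scores.extend(LEXICON.get(w, 0) for w in window)
    total = sum(scores)
    return "positive" if total > 0 else "negative" if total < 0 else "neutral"

tweet = "my glucose was scary high this morning but insulin kept it stable".split()
for aspect in ("glucose", "insulin"):
    print(aspect, "->", aspect_sentiment(tweet, aspect))
```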

  18. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    Science.gov (United States)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring 0-5 the Level of Suspicion (LOS) based on qualitative lexicon describing the ultrasound appearance of breast lesion. The purposes of the research are to asses and select one of the automated LOS scoring quantitative methods developed during preliminary studies in benign biopsies reduction. The study has used Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather that increasing the accuracy of diagnosis of cancers (will require biopsy anyway). On complex cysts and fibroadenoma cases experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis for automated LOS scoring quantification for breast masses with known biopsy results. It was found that First Order Ranking method yielded most the accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve the results.

  19. Sentiment Analysis on Tweets about Diabetes: An Aspect-Level Approach

    Directory of Open Access Journals (Sweden)

    María del Pilar Salas-Zárate

    2017-01-01

    Full Text Available In recent years, some methods of sentiment analysis have been developed for the health domain; however, the diabetes domain has not been explored yet. In addition, there is a lack of approaches that analyze the positive or negative orientation of each aspect contained in a document (a review, a piece of news, and a tweet, among others). Based on this understanding, we propose an aspect-level sentiment analysis method based on ontologies in the diabetes domain. The sentiment of the aspects is calculated by considering the words around the aspect which are obtained through N-gram methods (N-gram after, N-gram before, and N-gram around). To evaluate the effectiveness of our method, we obtained a corpus from Twitter, which has been manually labelled at aspect level as positive, negative, or neutral. The experimental results show that the best result was obtained through the N-gram around method with a precision of 81.93%, a recall of 81.13%, and an F-measure of 81.24%.

  20. Sentiment Analysis on Tweets about Diabetes: An Aspect-Level Approach

    KAUST Repository

    Salas-Zárate, María del Pilar; Medina-Moreira, José; Lagos-Ortiz, Katty; Luna-Aveiga, Harry; Rodriguez-Garcia, Miguel Angel; Valencia-García, Rafael

    2017-01-01

    In recent years, some methods of sentiment analysis have been developed for the health domain; however, the diabetes domain has not been explored yet. In addition, there is a lack of approaches that analyze the positive or negative orientation of each aspect contained in a document (a review, a piece of news, and a tweet, among others). Based on this understanding, we propose an aspect-level sentiment analysis method based on ontologies in the diabetes domain. The sentiment of the aspects is calculated by considering the words around the aspect which are obtained through N-gram methods (N-gram after, N-gram before, and N-gram around). To evaluate the effectiveness of our method, we obtained a corpus from Twitter, which has been manually labelled at aspect level as positive, negative, or neutral. The experimental results show that the best result was obtained through the N-gram around method with a precision of 81.93%, a recall of 81.13%, and an F-measure of 81.24%.

  1. Localization and force analysis at the single virus particle level using atomic force microscopy

    International Nuclear Information System (INIS)

    Liu, Chih-Hao; Horng, Jim-Tong; Chang, Jeng-Shian; Hsieh, Chung-Fan; Tseng, You-Chen; Lin, Shiming

    2012-01-01

    Highlights: ► Localization of single virus particle. ► Force measurements. ► Force mapping. -- Abstract: Atomic force microscopy (AFM) is a vital instrument in nanobiotechnology. In this study, we developed a method that enables AFM to simultaneously measure specific unbinding force and map the viral glycoprotein at the single virus particle level. The average diameter of virus particles from AFM images and the specificity between the viral surface antigen and the antibody probe were integrated to design a three-stage method that sets the measuring area to a single virus particle before obtaining the force measurements, where the influenza virus was used as the object of measurement. Based on the proposed method and the analysis performed, several findings can be derived from the results. The mean unbinding force of a single virus particle can be quantified, and no significant difference exists in this value among virus particles. Furthermore, the repeatability of the proposed method is demonstrated. The force mapping images reveal that the distributions of surface viral antigens recognized by the antibody probe were dispersed over the whole surface of individual virus particles under the proposed method and experimental criteria; meanwhile, the binding probabilities are similar among particles. This approach can be easily applied to most AFM systems without specific components or configurations. These results aid the understanding of force-based analysis at the single virus particle level and can therefore reinforce the capability of AFM to investigate a specific type of viral surface protein and its distribution.

  2. Comparative analysis of elements and models of implementation in local-level spatial plans in Serbia

    Directory of Open Access Journals (Sweden)

    Stefanović Nebojša

    2017-01-01

    Full Text Available Implementation of local-level spatial plans is of paramount importance to the development of the local community. This paper aims to demonstrate the importance of and offer further directions for research into the implementation of spatial plans by presenting the results of a study on models of implementation. The paper describes the basic theoretical postulates of a model for implementing spatial plans. A comparative analysis of the application of elements and models of implementation of plans in practice was conducted based on the spatial plans for the local municipalities of Arilje, Lazarevac and Sremska Mitrovica. The analysis includes four models of implementation: the strategy and policy of spatial development; spatial protection; the implementation of planning solutions of a technical nature; and the implementation of rules of use, arrangement and construction of spaces. The main results of the analysis are presented and used to give recommendations for improving the elements and models of implementation. Final deliberations show that models of implementation are generally used in practice and combined in spatial plans. Based on the analysis of how models of implementation are applied in practice, a general conclusion concerning the complex character of the local level of planning is presented and elaborated. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. TR 36035: Spatial, Environmental, Energy and Social Aspects of Developing Settlements and Climate Change - Mutual Impacts and Grant no. III 47014: The Role and Implementation of the National Spatial Plan and Regional Development Documents in Renewal of Strategic Research, Thinking and Governance in Serbia]

  3. Thermal analysis of a ventilated high-level waste repository

    International Nuclear Information System (INIS)

    1977-04-01

    The cooling response of a single ventilated storage room in an unventilated array of rooms is examined. Calculations show that ventilation provides a thermal sink in the heated system, inducing temperature gradients in the formation different from the unventilated case. An asymptotic cool-down limit exists for the storage room temperature; this minimum temperature depends on inlet air temperature, ventilation flow rate, and convective heat transfer coefficient. For inlet air at 75 °F and 50,000 cfm and a heat transfer coefficient of 0.8 Btu/h·°F·ft², the limit is about 100 °F. A storage room sealed for 5 years will reach temperatures of approximately 180 °F, and approximately 4 months would be required to cool the storage room floor to a temperature of 120 °F with a flow rate of 50,000 cfm at an inlet air temperature of 75 °F, assuming a convective heat transfer coefficient of 0.8 Btu/h·°F·ft². Two months would be needed to cool the exhaust air to 120 °F. For large air flow rates, the cooling time is independent of the flow rate. Increasing the storage room surface area by 25% over the baseline model depresses the cool-down temperatures by only 4 °F and decreases cooling times by 20%. Modifications in canister design or width have virtually no effect on the cooling, but placing the waste deeper beneath the storage rooms and/or using longer canisters can lower the operating temperatures and cooling times. Reducing the canisters from a 3.5 kW power density for 10-year-old waste (108.5 kW/acre) to 2.0 kW/canister (62 kW/acre) reduces cooling temperatures by more than 20 °F and reduces cooling times to a few weeks or less. The cooling times are nearly independent of the conductivity of the geologic formation. The temperature increase in the air brought from the surface down the supply shaft to the storage room level is about 5 to 7 °F per 1000 feet. Temperature increases in regions 30 or more feet away should not be serious
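
    The exhaust-air numbers above follow, to first order, from a steady-state energy balance, T_out = T_in + Q/(m_dot·c_p). A back-of-envelope sketch under an assumed heat load (the 200 kW figure is a placeholder, not taken from the report):

```python
# Back-of-envelope steady-state energy balance for ventilation cooling:
# T_out = T_in + Q / (m_dot * c_p). The heat load Q is an assumed value
# for illustration; the report's transient thermal model is more detailed.
def exhaust_temp_F(inlet_F, flow_cfm, heat_load_kW):
    rho, cp = 1.2, 1005.0              # air density [kg/m^3], specific heat [J/kg-K]
    m_dot = flow_cfm * 4.719e-4 * rho  # cfm -> m^3/s -> kg/s
    dT_K = heat_load_kW * 1e3 / (m_dot * cp)
    return inlet_F + dT_K * 9.0 / 5.0  # temperature rise converted to F

print(f"{exhaust_temp_F(75.0, 50000, 200):.0f} F")  # ~88 F for an assumed 200 kW load
```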

  4. Analysis of generic exemption levels for radioactive material

    International Nuclear Information System (INIS)

    Bossio, Maria C.; Muniz, Carolina C.

    2007-01-01

    In essence, exemption may be considered a generic authorization granted by the regulatory body which, once issued, releases the practice or source from the requirements that would otherwise apply, in particular the requirements relating to notification and authorization. The exemption figures included in the Basic Safety Standards BSS 115 were derived from the scenarios postulated in the document 'Radiation Protection 65' of the Commission of the European Communities, considering quantitative exemption criteria. After presenting the basic exemption criteria for limited quantities of radioactive materials, this paper describes and analyses the exposure scenarios utilized, namely: 1- normal use (workplace) scenario; 2- accidental (workplace) scenario; 3- disposal (public) scenario. Each one has different exposure pathways, summing up to a total of 24, covering external exposure, ingestion and inhalation. These scenarios were used to calculate both exempted activity concentrations (Bq/g) and total activities (Bq), though in the first case exemption applies to limited masses of low activity concentration materials. For each radionuclide, the generic exemption level was derived as the most restrictive value obtained from the scenarios, that is, the lowest ratio between the applicable individual dose and the dose per unit activity (Bq) or activity concentration (Bq/g). The individual dose per unit (Bq or Bq/g, as applicable) was calculated by a formula that was adjusted, for each scenario and pathway, through different parameters, such as exposure time, dosimetric factors and geometric factors. In general, the critical pathway for α emitters was inhalation of dust and aerosols in the workplace for the activity concentration scenario, and inhalation of dust and volatiles from an accidental fire in the workplace for the total activity scenario. For β emitters the critical pathway was ingestion of an object from a landfill site by a member of the public for activity concentration
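
    The "most restrictive value across scenarios" rule lends itself to a direct sketch: divide the dose criterion by each scenario's dose per unit activity and take the minimum. The per-scenario dose coefficients below are invented placeholders, not values from Radiation Protection 65; the 10 µSv per year criterion is of the order used for exemption.

```python
# Sketch of the exemption-level rule described above: for each scenario,
# exemption level = dose criterion / dose per unit activity, and the
# generic value is the most restrictive (lowest) across scenarios.
DOSE_CRITERION_SV = 10e-6  # 10 microsievert/year individual dose criterion

dose_per_Bq = {            # Sv per Bq, hypothetical per-scenario coefficients
    "normal_use_workplace": 2.0e-12,
    "accident_workplace": 8.0e-12,
    "disposal_public": 5.0e-13,
}

levels = {s: DOSE_CRITERION_SV / d for s, d in dose_per_Bq.items()}
scenario, exemption_Bq = min(levels.items(), key=lambda kv: kv[1])
print(f"generic exemption level: {exemption_Bq:.2e} Bq (limited by {scenario})")
```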

  5. A New N-Level Inverter Based on Z-NPC

    Directory of Open Access Journals (Sweden)

    E. Babaei

    2017-12-01

    Full Text Available First, the topology and operation of the three-phase three-level Z-source inverter based on the neutral-point-clamped structure (Z-NPC) are studied. Moreover, different combinations of permissible switching states and control signals are explained for this inverter. In this paper, the topology of the three-phase three-level Z-NPC inverter is extended to an n-level form, and a combination of allowed switching states with the relevant mathematical equations is presented for the proposed n-level Z-NPC inverter. In comparison with multilevel voltage-source inverters (which have only voltage-boost capability), the proposed multilevel Z-NPC inverter is a single-stage converter with voltage buck-boost capability. On the other hand, the control of two-stage converters can be more difficult than that of single-stage converters because they contain more active and passive components. In this paper, two new PWM control methods are also proposed for various multilevel Z-NPC inverters. One advantage of the proposed PWM control methods in comparison with conventional PWM control methods is that they maintain the charge balance of the dc-link capacitors at the neutral point. The correct performance of the proposed multilevel Z-NPC topology and PWM control methods is verified by the results of analysis and simulations performed in the PSCAD software.
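
    For context, a generic level-shifted carrier PWM scheme for an n-level leg is sketched below: the sinusoidal reference is compared against n-1 stacked triangular carriers and the output level is the number of carriers below the reference. This is the textbook baseline, not the paper's proposed charge-balancing PWM methods.

```python
import numpy as np

# Generic level-shifted carrier PWM for an n-level inverter leg. The
# reference is compared against (n-1) stacked triangular carriers and the
# output level is the count of carriers below it.
def level_shifted_pwm(n_levels=3, f_ref=50.0, f_carrier=2000.0, t_end=0.02, fs=1e6):
    t = np.arange(0, t_end, 1 / fs)
    ref = np.sin(2 * np.pi * f_ref * t)                  # normalized reference in [-1, 1]
    n_c = n_levels - 1
    # Triangular carrier in [0, 1], then shift/scale copies to tile [-1, 1].
    tri = 2 * np.abs(f_carrier * t - np.floor(f_carrier * t + 0.5))
    carriers = [-1 + (k + tri) * 2 / n_c for k in range(n_c)]
    level = sum(ref > c for c in carriers)               # integer output level 0..n_c
    return t, level

t, lvl = level_shifted_pwm(n_levels=5)
print(sorted(set(lvl.tolist())))  # -> [0, 1, 2, 3, 4]
```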

  6. Survey and analysis of the domestic technology level for the concept development of high level waste disposal

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Chang Sun; Kim, Byung Su; Song, Jae Hyok [Seoul National University, Seoul (Korea); Park, Kwang Hon; Hwang, Ju Ho; Park, Sung Hyun; Lee, Jae Min [Kyunghee University, Seoul (Korea); Han, Joung Sang; Kim, Ku Young [Yonsei University, Seoul (Korea); Lee, Jae Ki; Chang, Jae Kwon [Hangyang University, Seoul (Korea)

    1998-09-01

    The objectives of this study are to analyze the status of HLW disposal technology and to investigate the domestic technology level. The study took two years to complete, with the participation of forty-five researchers, and was mainly carried out by means of literature surveys, collection of related data, visits to research institutes, and meetings with experts in the specific fields. During the first year of this project, the International Symposium on the Concept Development of the High Level Waste Disposal System was held in Taejon, Korea, in October 1997. Eight distinguished foreign experts whose fields of expertise relate to high level waste disposal were invited to the symposium. This study is composed of four major areas: disposal system design/construction, engineered barrier characterization, geologic environment evaluation, and performance assessment and total safety. A technical tree scheme of HLW disposal has been illustrated based on the investigation and an analysis of each technical area. For each detailed technology, the research projects, performing organizations/methods and techniques to be secured, in order of priority, are proposed, but the suggestions remain at the level of preliminary ideas owing to the reduction of the budget in the second year. The detailed programs on HLW disposal are greatly affected by governmental HLW disposal policy; in this study, the primary decisions to be made at each level of the HLW disposal enterprise and a rough scheme are proposed. (author). 20 refs., 97 figs., 33 tabs.

  7. Two-level method for unsteady Navier-Stokes equations based on a new projection

    International Nuclear Information System (INIS)

    Hou Yanren; Li Kaitai

    2004-12-01

    A two-level algorithm for the two-dimensional unsteady Navier-Stokes equations based on a new projection is proposed and investigated. The approximate solution is solved as a sum of a large eddy component and a small eddy component, which are defined in the sense of the new projection constructed in this paper. These two terms advance in time explicitly. Actually, the new algorithm proposed here can be regarded as a sort of postprocessing algorithm for the standard Galerkin method (SGM). The large eddy part is solved by SGM in the usual L²-based large eddy subspace while the small eddy part (the correction part) is obtained in its complement subspace in the sense of the new projection. The stability analysis indicates an improvement in stability compared with SGM of the same scale, and the L²-error estimate shows that the scheme can improve the accuracy of the SGM approximation by half an order. We also propose a numerical implementation based on Lagrange multipliers for this two-level algorithm. (author)
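
    The large-eddy/small-eddy splitting can be illustrated with any projection onto a coarse subspace plus its complement. The sketch below uses a simple Fourier cutoff projection on a 1-D periodic field; the paper constructs its own, different projection, so this is only an analogy.

```python
import numpy as np

# Illustration of the large-eddy/small-eddy splitting idea: project a 1-D
# periodic field onto its low-wavenumber modes (the "large eddy" space) and
# keep the complement as the "small eddy" correction.
def split_scales(u, k_cut):
    u_hat = np.fft.fft(u)
    k = np.fft.fftfreq(u.size, d=1.0 / u.size)    # integer wavenumbers
    low = np.where(np.abs(k) <= k_cut, u_hat, 0)  # projection onto coarse space
    u_large = np.fft.ifft(low).real
    return u_large, u - u_large                   # complement = small eddies

x = np.linspace(0, 2 * np.pi, 256, endpoint=False)
u = np.sin(2 * x) + 0.2 * np.sin(40 * x)
u_L, u_S = split_scales(u, k_cut=8)
print(np.allclose(u, u_L + u_S))                  # True: u = large + small
```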

  8. SB certification handout material requirements, test methods, responsibilities, and minimum classification levels for mixture-based specification for flexible base.

    Science.gov (United States)

    2012-10-01

    A handout with tables representing the material requirements, test methods, responsibilities, and minimum classification levels for the mixture-based specification for flexible base and details on aggregate and test methods employed, along with agency and co...

  9. Particle Generation by Laser Ablation in Support of Chemical Analysis of High Level Mixed Waste from Plutonium Production Operations

    International Nuclear Information System (INIS)

    Dickinson, J. Thomas; Alexander, Michael L.

    2001-01-01

    The objective is to investigate particles produced by laser irradiation and their analysis by Laser Ablation Inductively Coupled Plasma Mass Spectroscopy (LA/ICP-MS), with a view towards optimizing particle production for the analysis of high level waste materials and waste glass. LA/ICP-MS has considerable potential to increase the safety and speed of the analysis required for the remediation of high level wastes from cold war plutonium production operations. In some sample types, notably the sodium nitrate-based wastes at Hanford and elsewhere, chemical analysis using typical laser conditions depends strongly on the details of sample history and composition in a complex fashion, rendering the results of analysis uncertain. Conversely, waste glass materials appear to be better behaved and require different strategies to optimize analysis

  10. 14 CFR 91.865 - Phased compliance for operators with base level.

    Science.gov (United States)

    2010-01-01

    ... 14 Aeronautics and Space 2 2010-01-01 2010-01-01 false Phased compliance for operators with base... Noise Limits § 91.865 Phased compliance for operators with base level. Except as provided in paragraph... maximum of: (1) After December 31, 1994, 75 percent of the base level held by the operator; (2) After...

  11. 7 CFR 1463.106 - Base quota levels for eligible tobacco producers.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 10 2010-01-01 2010-01-01 false Base quota levels for eligible tobacco producers... TOBACCO TRANSITION PROGRAM Tobacco Transition Payment Program § 1463.106 Base quota levels for eligible... farm's average yield for 2001, 2002, and 2003 to get the 2004 farm base pounds total. 2 Multiply any...

  12. Analysis of commercial bouillons for trace levels of mutagens.

    Science.gov (United States)

    Stavric, B; Matula, T I; Klassen, R; Downie, R H

    1993-12-01

    A new method, developed specifically for the extraction of heterocyclic aromatic amine (HAA) type mutagens from different food matrices, was applied to various forms of commercially available bouillons. This procedure is based on liquid-liquid extraction of the sample at different pH values. Recovery and reproducibility of the procedure were determined by processing spiked samples using a mutagenicity bioassay technique as an endpoint. The mutagenicity was tested in the Salmonella/microsome assay using strain TA98 with metabolic activation. Twenty-two bouillon samples in liquid, cube or powder form from seven manufacturers were extracted and tested for potential mutagenicity. The mutagenic activity of these samples varied and ranged from non-detectable to about 1200 induced revertants per gram of solid material, with a median value of approximately 250 revertants/g. The mutagenic response appeared to be dependent on the source rather than the type or form of the product tested. A negative response was obtained from only one chicken bouillon, and the highest positive response was obtained from a beef bouillon in cube form. It appears that the average beef sample, regardless of form, has a higher mutagenic potency than chicken or chicken and turkey samples. Overall, the intake of mutagens from commercial bouillons (obtained as cubes, concentrates or dry mixes) used to prepare one serving (as bouillon, soup, casseroles, etc.) is considerably less than that reported in the literature for one serving of fried beef or pork. The extractability and mutagenic characteristics of these samples indicate the presence of HAA-type mutagens. Work is in progress to identify the mutagenic factors in bouillons.

  13. Seismic design and analysis considerations for high level nuclear waste repositories

    International Nuclear Information System (INIS)

    Hossain, Q.A.

    1993-01-01

    A high level nuclear waste repository, like the one at Nevada's Yucca Mountain that is being investigated for site suitability, will have some unique seismic design and analysis considerations. These are discussed, and a design philosophy that can rationally account for the unique performance objectives of such facilities is presented. A case is made for the use of DOE's performance goal-based seismic design and evaluation methodology that is based on a hybrid "deterministic" and "probabilistic" concept. How and to what extent this methodology should be modified to adopt it for a potential site like Yucca Mountain is also outlined. Finally, the issue of designing for seismic fault rupture is discussed briefly, and the desirability of using the proposed seismic design philosophy in fault rupture evaluation is described

  14. Seismic design and analysis considerations for high level nuclear waste repositories

    International Nuclear Information System (INIS)

    Hossain, Q.A.

    1993-01-01

    A high level nuclear waste repository, like the one at Nevada's Yucca Mountain that is being investigated for site suitability, will have some unique seismic design and analysis considerations. These are discussed, and a design philosophy that can rationally account for the unique performance objectives of such facilities is presented. A case is made for the use of DOE's performance goal-based seismic design and evaluation methodology that is based on a hybrid "deterministic" and "probabilistic" concept. How and to what extent this methodology should be modified to adopt it for a potential site like Yucca Mountain is also outlined. Finally, the issue of designing for seismic fault rupture is discussed briefly, and the desirability of using the proposed seismic design philosophy in fault rupture evaluation is described

  15. A new framework for estimating return levels using regional frequency analysis

    Science.gov (United States)

    Winter, Hugo; Bernardara, Pietro; Clegg, Georgina

    2017-04-01

    We propose a new framework for incorporating more spatial and temporal information into the estimation of extreme return levels. Currently, most studies use extreme value models applied to data from a single site, an approach that is statistically inefficient and leads to return level estimates that are less physically realistic. We aim to highlight the benefits that could be obtained by using methodology based upon regional frequency analysis as opposed to classic single-site extreme value analysis. This motivates a shift in thinking, which permits the evaluation of local and regional effects and makes use of the wide variety of data that are now available on high temporal and spatial resolutions. The recent winter storms over the UK during the winters of 2013-14 and 2015-16, which have caused wide-ranging disruption and damaged important infrastructure, provide the main motivation for the current work. One of the most impactful natural hazards is flooding, which is often initiated by extreme precipitation. In this presentation, we focus on extreme rainfall, but shall discuss other meteorological variables alongside potentially damaging hazard combinations. To understand the risks posed by extreme precipitation, we need reliable statistical models which can be used to estimate quantities such as the T-year return level, i.e. the level which is expected to be exceeded once every T years. Extreme value theory provides the main collection of statistical models that can be used to estimate the risks posed by extreme precipitation events. Broadly, at a single site, a statistical model is fitted to exceedances of a high threshold and the model is used to extrapolate to levels beyond the range of the observed data. However, when we have data at many sites over a spatial domain, fitting a separate model for each site makes little sense and it would be better if we could incorporate all this information to improve the reliability of return level estimates. Here
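
    The single-site baseline the authors aim to improve on can be sketched with a peaks-over-threshold fit: model exceedances of a high threshold with a generalized Pareto distribution and invert it for the T-year return level, z_T = u + (σ/ξ)[(λT)^ξ − 1]. The synthetic data and threshold choice below are illustrative only.

```python
import numpy as np
from scipy.stats import genpareto

# Single-site peaks-over-threshold return level, the classic baseline that
# regional frequency analysis aims to improve. Synthetic daily rainfall
# stands in for real observations; the threshold choice is illustrative.
rng = np.random.default_rng(1)
rain = rng.gamma(shape=0.4, scale=8.0, size=365 * 50)   # ~50 years of daily data

u = np.quantile(rain, 0.95)                 # high threshold
exc = rain[rain > u] - u
xi, _, sigma = genpareto.fit(exc, floc=0)   # fit GPD to threshold exceedances
lam = exc.size / 50.0                       # exceedance rate per year

T = 100.0                                   # T-year return level
z_T = u + sigma / xi * ((lam * T) ** xi - 1)
print(f"{T:.0f}-year return level ~ {z_T:.1f} mm")
```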

  16. Live demonstration: Screen printed, microwave based level sensor for automated drug delivery

    KAUST Repository

    Karimi, Muhammad Akram; Arsalan, Muhammad; Shamim, Atif

    2018-01-01

    Level sensors find numerous applications in many industries to automate the processes involving chemicals. Recently, some commercial ultrasound based level sensors are also being used to automate the drug delivery process [1]. Some of the most

  17. A low cost, printed microwave based level sensor with integrated oscillator readout circuitry

    KAUST Repository

    Karimi, Muhammad Akram; Arsalan, Muhammad; Shamim, Atif

    2017-01-01

    This paper presents an extremely low cost, tube conformable, printed T-resonator based microwave level sensor, whose resonance frequency shifts by changing the level of fluids inside the tube. Printed T-resonator forms the frequency selective

  18. NASA Lunar Base Wireless System Propagation Analysis

    Science.gov (United States)

    Hwu, Shian U.; Upanavage, Matthew; Sham, Catherine C.

    2007-01-01

    There have been many radio wave propagation studies using both experimental and theoretical techniques over recent years. However, most of these studies have been in support of commercial cellular phone wireless applications. The signal frequencies are mostly at the commercial cellular and Personal Communications Service bands. The antenna configurations are mostly one on a high tower and one near the ground to simulate communications between a cellular base station and a mobile unit. There is great interest in wireless communication and sensor systems for NASA lunar missions because of the emerging importance of establishing permanent lunar human exploration bases. Because of the specific lunar terrain geometries and RF frequencies of interest to the NASA missions, much of the published literature for the commercial cellular and PCS bands of 900 and 1800 MHz may not be directly applicable to the lunar base wireless system and environment. There are various communication and sensor configurations required to support all elements of a lunar base, for example, communications between astronauts, between astronauts and lunar vehicles, and between lunar vehicles and satellites in lunar orbit. There are also various wireless sensor systems connecting scientific and experimental sensors with data collection ground stations. This presentation illustrates the propagation analysis of the lunar wireless communication and sensor systems taking into account the three-dimensional terrain multipath effects. It is observed that the propagation characteristics are significantly affected by the presence of the lunar terrain. The obtained results indicate that the lunar surface material, terrain geometry and antenna location are the important factors affecting the propagation characteristics of lunar wireless systems. The path loss can be much more severe than free space propagation and is greatly affected by the antenna height, surface material and operating frequency. The
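
    As a point of reference for the path-loss discussion, the free-space law FSPL(dB) = 20 log10(d_km) + 20 log10(f_MHz) + 32.45 gives the floor that terrain multipath effects worsen. A quick check at the bands mentioned (the 2 km distance is an arbitrary example):

```python
import math

# Free-space path loss, the baseline against which the abstract notes that
# lunar terrain losses can be far more severe.
def fspl_db(distance_km, freq_mhz):
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.45

for f in (900, 1800, 5800):   # MHz; 900/1800 are the cellular bands cited
    print(f"{f} MHz, 2 km: {fspl_db(2.0, f):.1f} dB")
```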

  19. Gait Correlation Analysis Based Human Identification

    Directory of Open Access Journals (Sweden)

    Jinyan Chen

    2014-01-01

    Full Text Available Human gait identification aims to identify people by a sequence of walking images. Compared with fingerprint or iris based identification, the most important advantage of gait identification is that it can be done at a distance. In this paper, a silhouette correlation analysis based human identification approach is proposed. Using a background subtraction algorithm, the moving silhouette figure can be extracted from the walking image sequence. Every pixel in the silhouette has three dimensions: horizontal axis (x), vertical axis (y), and temporal axis (t). By moving every pixel in the silhouette image along these three dimensions, we can get a new silhouette. The correlation result between the original silhouette and the new one can be used as the raw feature of human gait. The discrete Fourier transform is used to extract features from this correlation result. Then, these features are normalized to minimize the effect of noise. The principal component analysis method is used to reduce the features' dimensions. An experiment based on the CASIA database shows that this method has an encouraging recognition performance.
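
    A minimal sketch of the correlation feature described above: correlate the binary silhouette volume with copies of itself shifted along each of the three axes, then take DFT magnitudes as the raw gait feature (PCA would follow). Random data stands in for a real walking sequence, and the shift range is an arbitrary choice.

```python
import numpy as np

# Correlate a silhouette volume (y, x, t) with shifted copies of itself
# along each axis; the DFT magnitudes of the correlation curve form the
# raw gait feature, to be reduced by PCA afterwards.
def shift_correlation(vol, axis, max_shift=8):
    flat = vol.ravel().astype(float)
    flat -= flat.mean()
    out = []
    for s in range(1, max_shift + 1):
        shifted = np.roll(vol, s, axis=axis).ravel().astype(float)
        shifted -= shifted.mean()
        out.append(flat @ shifted / (np.linalg.norm(flat) * np.linalg.norm(shifted)))
    return np.array(out)

rng = np.random.default_rng(0)
silhouette = rng.random((64, 48, 30)) > 0.7             # synthetic (y, x, t) volume
corr = np.concatenate([shift_correlation(silhouette, a) for a in range(3)])
feature = np.abs(np.fft.rfft(corr))                     # DFT feature vector
print(feature.shape)
```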

  20. Optimal depth-based regional frequency analysis

    Directory of Open Access Journals (Sweden)

    H. Wazneh

    2013-06-01

    Full Text Available Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region, which can lead to a loss of some information, and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.

  1. Optimal depth-based regional frequency analysis

    Science.gov (United States)

    Wazneh, H.; Chebana, F.; Ouarda, T. B. M. J.

    2013-06-01

    Classical methods of regional frequency analysis (RFA) of hydrological variables face two drawbacks: (1) the restriction to a particular region which can lead to a loss of some information and (2) the definition of a region that generates a border effect. To reduce the impact of these drawbacks on regional modeling performance, an iterative method was proposed recently, based on the statistical notion of the depth function and a weight function φ. This depth-based RFA (DBRFA) approach was shown to be superior to traditional approaches in terms of flexibility, generality and performance. The main difficulty of the DBRFA approach is the optimal choice of the weight function φ (e.g., φ minimizing estimation errors). In order to avoid a subjective choice and naïve selection procedures of φ, the aim of the present paper is to propose an algorithm-based procedure to optimize the DBRFA and automate the choice of φ according to objective performance criteria. This procedure is applied to estimate flood quantiles in three different regions in North America. One of the findings from the application is that the optimal weight function depends on the considered region and can also quantify the region's homogeneity. By comparing the DBRFA to the canonical correlation analysis (CCA) method, results show that the DBRFA approach leads to better performances both in terms of relative bias and mean square error.
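
    For intuition about the depth function underlying DBRFA, the sketch below uses Mahalanobis depth, one standard depth function, together with a naive weight φ proportional to depth; the papers above instead optimize φ against objective performance criteria, so this is only a toy stand-in.

```python
import numpy as np

# Mahalanobis depth, D(x) = 1 / (1 + (x - mu)^T S^-1 (x - mu)), is largest
# at the "center" of the sites' attribute cloud; a weight function phi can
# then downweight shallow (peripheral) sites.
def mahalanobis_depth(X):
    mu = X.mean(axis=0)
    S_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", X - mu, S_inv, X - mu)
    return 1.0 / (1.0 + d2)

rng = np.random.default_rng(3)
sites = rng.normal(size=(40, 3))        # 40 sites, 3 hydrological attributes
depth = mahalanobis_depth(sites)
weights = depth / depth.sum()           # naive phi: weight proportional to depth
print(weights.max() / weights.min())    # central sites dominate peripheral ones
```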

  2. Dependent failure analysis of NPP data bases

    International Nuclear Information System (INIS)

    Cooper, S.E.; Lofgren, E.V.; Samanta, P.K.; Wong Seemeng

    1993-01-01

    A technical approach for analyzing plant-specific data bases for vulnerabilities to dependent failures has been developed and applied. Since the focus of this work is to aid in the formulation of defenses to dependent failures, rather than to quantify dependent failure probabilities, the approach of this analysis is critically different. For instance, the determination of component failure dependencies has been based upon identical failure mechanisms related to component piecepart failures, rather than failure modes. Also, component failures involving all types of component function loss (e.g., catastrophic, degraded, incipient) are equally important to the predictive purposes of dependent failure defense development. Consequently, dependent component failures are identified with a different dependent failure definition which uses a component failure mechanism categorization scheme in this study. In this context, clusters of component failures which satisfy the revised dependent failure definition are termed common failure mechanism (CFM) events. Motor-operated valves (MOVs) in two nuclear power plant data bases have been analyzed with this approach. The analysis results include seven different failure mechanism categories; identified potential CFM events; an assessment of the risk-significance of the potential CFM events using existing probabilistic risk assessments (PRAs); and postulated defenses to the identified potential CFM events. (orig.)

  3. Rational use of cognitive resources: levels of analysis between the computational and the algorithmic.

    Science.gov (United States)

    Griffiths, Thomas L; Lieder, Falk; Goodman, Noah D

    2015-04-01

    Marr's levels of analysis (computational, algorithmic, and implementation) have served cognitive science well over the last 30 years. But the recent increase in the popularity of the computational level raises a new challenge: How do we begin to relate models at different levels of analysis? We propose that it is possible to define levels of analysis that lie between the computational and the algorithmic, providing a way to build a bridge between computational- and algorithmic-level models. The key idea is to push the notion of rationality, often used in defining computational-level models, deeper toward the algorithmic level. We offer a simple recipe for reverse-engineering the mind's cognitive strategies by deriving optimal algorithms for a series of increasingly more realistic abstract computational architectures, which we call "resource-rational analysis." Copyright © 2015 Cognitive Science Society, Inc.

  4. The effect of olive oil-based ketogenic diet on serum lipid levels in epileptic children.

    Science.gov (United States)

    Güzel, Orkide; Yılmaz, Unsal; Uysal, Utku; Arslan, Nur

    2016-03-01

    Ketogenic diet (KD) is one of the most effective therapies for intractable epilepsy. Olive oil is rich in monounsaturated fatty acids and antioxidant molecules and has some beneficial effects on lipid profile, inflammation and oxidant status. The aim of this study was to evaluate the serum lipid levels of children who had been receiving an olive oil-based KD for intractable seizures for at least 1 year. In total, 121 patients (mean age 7.45 ± 4.21 years, 57 girls) were enrolled. At baseline and at 1, 3, 6, and 12 months of treatment, body mass index-SDS, total cholesterol, high-density lipoprotein (HDL) cholesterol, low-density lipoprotein (LDL) cholesterol and triglyceride levels were measured. Repeated measures ANOVA with post hoc Bonferroni correction was used for data analysis. The mean duration of KD was 15.4 ± 4.1 months. Mean total cholesterol, LDL-cholesterol and triglyceride levels were significantly higher at the 1st, 3rd, 6th and 12th months of KD treatment compared to pre-treatment levels (p = 0.001), but showed no difference among during-treatment measurements. Mean body mass index-SDS and HDL-cholesterol levels did not differ between baseline and follow-up time points (p = 0.113 and p = 0.067, respectively). No child in this study discontinued the KD because of dyslipidemia. Even if rich in olive oil, a high-fat KD causes a significant increase in LDL-cholesterol and triglyceride levels. More studies are needed to determine the effect of the KD on serum lipids in children using different fat sources in the diet.

  5. A LEVEL SET BASED SHAPE OPTIMIZATION METHOD FOR AN ELLIPTIC OBSTACLE PROBLEM

    KAUST Repository

    Burger, Martin; Matevosyan, Norayr; Wolfram, Marie-Therese

    2011-01-01

    analysis of the level set method in terms of viscosity solutions. To our knowledge this is the first complete analysis of a level set method for a nonlocal shape optimization problem. Finally, we discuss the implementation of the methods and illustrate its

  6. Investigation of Inquiry-based Science Pedagogy among Middle Level Science Teachers: A Qualitative Study

    Science.gov (United States)

    Weiland, Sunny Minelli

    This study implemented a qualitative approach to examine the phenomenon of "inquiry-based science pedagogy or inquiry instruction" as it has been experienced by individuals. Data were collected through online open-ended surveys, focus groups, and teacher reported self-reflections to answer the research questions: 1) How do middle level science teachers conceptualize "inquiry-based instruction?" 2) What are preferred instructional strategies for implementation in middle level science classrooms? And 3) How do middle level science teachers perceive the connection between science instruction and student learning? The participants within this research study represent 33 percent of teachers in grades 5 through 9 within six school districts in northeastern Pennsylvania. Of the 12 consent forms originally obtained, 10 teachers completed all three phases of the data collection, including the online survey, participation in focus groups, and teacher self-reflection. 60 percent of the participants taught only science, and 40 percent taught all content areas. Of the ten participants, 50 percent were certified teachers of science and 50 percent were certified as teachers of elementary education. 70 percent of the research participants reported having obtained a master's degree, with 60 percent of these degrees being received in areas of education, and 10 percent in the area of science. The research participants have a total of 85 collective years of experience as professional educators, with the average years of experience being 8.5 years. Analysis of data revealed three themes related to research question #1) How do middle-level science teachers conceptualize inquiry-based instruction? and sub-question #1) How do middle-level science teachers characterize effective instruction? The themes that capture the essence of teachers' formulation of inquiry-based instruction that emerged in this study were student centered, problem solving, and hands-on. Analysis of data revealed one theme

  7. Rweb:Web-based Statistical Analysis

    Directory of Open Access Journals (Sweden)

    Jeff Banfield

    1999-03-01

    Full Text Available Rweb is a freely accessible statistical analysis environment that is delivered through the World Wide Web (WWW). It is based on R, a well-known statistical analysis package. The only requirement to run the basic Rweb interface is a WWW browser that supports forms. If you want graphical output you must, of course, have a browser that supports graphics. The interface provides access to WWW-accessible data sets, so you may run Rweb on your own data. Rweb can provide a four-window statistical computing environment (code input, text output, graphical output, and error information) through browsers that support Javascript. There is also a set of point and click modules under development for use in introductory statistics courses.

  8. Qualitative safety analysis in accelerator based systems

    International Nuclear Information System (INIS)

    Sarkar, P.K.; Chowdhury, Lekha M.

    2006-01-01

    In recent developments connected to high energy and high current accelerators, accelerator driven systems (ADS) and Radioactive Ion Beam (RIB) facilities have come to the forefront of applications. For medical and industrial applications, high current accelerators often need to be located in populated areas. These facilities pose significant radiological hazards during operation and in accident situations. We have carried out a qualitative radiological safety analysis using probabilistic safety analysis (PSA) methods for accelerator-based systems. The major contribution to hazard comes from a target rupture scenario in both ADS and RIB facilities. Other significant contributors to hazard in the facilities are also discussed using fault tree and event tree methodologies. (author)

  9. Detection of Cracking Levels in Brittle Rocks by Parametric Analysis of the Acoustic Emission Signals

    Science.gov (United States)

    Moradian, Zabihallah; Einstein, Herbert H.; Ballivy, Gerard

    2016-03-01

    Determination of cracking levels during crack propagation is one of the key challenges in the field of fracture mechanics of rocks. Acoustic emission (AE) is a technique that has been used to detect cracks as they occur across the specimen. Parametric analysis of AE signals and correlation of these parameters (e.g., hits and energy) with stress-strain plots of rocks allows cracking levels to be detected properly. The number of AE hits is related to the number of cracks, and the AE energy is related to the magnitude of the cracking event. For a full understanding of the fracture process in brittle rocks, prismatic specimens of granite containing pre-existing flaws were tested in uniaxial compression tests, and their cracking process was monitored with both AE and high-speed video imaging. In this paper, the characteristics of the AE parameters and the evolution of cracking sequences are analyzed for every cracking level. Based on micro- and macro-crack damage, a classification of cracking levels is introduced. This classification contains eight stages: (1) crack closure, (2) linear elastic deformation, (3) micro-crack initiation (white patch initiation), (4) micro-crack growth (stable crack growth), (5) micro-crack coalescence (macro-crack initiation), (6) macro-crack growth (unstable crack growth), (7) macro-crack coalescence and (8) failure.

  10. Noninvasive analysis of skin iron and zinc levels in beta-thalassemia major and intermedia

    International Nuclear Information System (INIS)

    Gorodetsky, R.; Goldfarb, A.; Dagan, I.; Rachmilewitz, E.A.

    1985-01-01

    Diagnostic x-ray spectrometry (DXS), a method based on x-ray fluorescence analysis, was used for noninvasive determination of iron and zinc in two distinct skin areas, representing predominantly dermal and epidermal tissues, in 56 patients with beta-thalassemia major and intermedia. The mean iron levels in the skin of patients with beta-thalassemia major and intermedia were elevated by greater than 200% and greater than 50%, respectively, compared with control values. The zinc levels of both skin areas examined were within the normal range. The data indicate that the rate and number of blood transfusions, which correlated well with serum ferritin levels (r = 0.8), are not the only factors that determine the amount of iron deposition in the skin (r < 0.6). Other sources of iron intake contribute to the total iron load in the tissues, particularly in patients who are not given multiple transfusions. The noninvasive quantitation of skin iron levels may reflect the extent of iron deposition in major parenchymal organs. Repeated DXS examinations of the skin could monitor the clearance of iron from the tissues of patients with iron overload in the course of therapy with chelating agents.

  11. Thermal-Signature-Based Sleep Analysis Sensor

    Directory of Open Access Journals (Sweden)

    Ali Seba

    2017-10-01

    Full Text Available This paper addresses the development of a new technique in the sleep analysis domain. Sleep is defined as a periodic physiological state during which vigilance is suspended and reactivity to external stimulation is diminished. We sleep on average between six and nine hours per night and our sleep is composed of four to six cycles of about 90 min each. Each of these cycles is composed of a succession of several stages of sleep that vary in depth. Analysis of sleep is usually done via polysomnography. This examination consists of recording, among other things, electrical cerebral activity by electroencephalography (EEG), ocular movements by electrooculography (EOG), and chin muscle tone by electromyography (EMG). Recordings are made mostly in a hospital, more specifically in a service for monitoring the pathologies related to sleep. The readings are then interpreted manually by an expert to generate a hypnogram, a curve showing the succession of sleep stages during the night in 30-s epochs. The proposed method is based on the follow-up of the thermal signature, which makes it possible to classify the activity into three classes: "awakening," "calm sleep," and "restless sleep". The contribution of this non-invasive method is part of the screening of sleep disorders, to be validated by a more complete analysis of the sleep. The measure provided by this new system, based on temperature monitoring (patient and ambient), aims to be integrated into the tele-medicine platform developed within the framework of the Smart-EEG project by the SYEL–SYstèmes ELectroniques team. Analysis of the data collected during the first surveys carried out with this method showed a correlation between thermal signature and activity during sleep. The advantage of this method lies in its simplicity and the possibility of carrying out measurements of activity during sleep without direct contact with the patient, at home or in hospitals.

  12. Nuclear power company activity based costing management analysis

    International Nuclear Information System (INIS)

    Xu Dan

    2012-01-01

    With the development of the nuclear energy industry, nuclear power companies face continual internal management pressure to sustain market operation and development. In view of this, it is urgent that nuclear power companies improve their cost management level and build a low-cost competitive advantage grounded in nuclear safety. Activity based costing management (ABCM) transfers the cost management emphasis from the 'product' to the 'activity', using value chain analysis methods, cost driver analysis methods and so on. Through analysis of the detailed activities and value chains, unnecessary activities can be cancelled, the resource consumption of necessary activities can be lowered, and cost can be managed at its source, achieving the purpose of reducing cost, boosting efficiency and realizing management value. Conclusions are drawn from a detailed analysis of nuclear power company procedures and activities, together with a selective 'piece analysis' of important cost-related projects in the nuclear power company. The conclusion is that the activities of a nuclear power company are clearly identifiable, so the ABC management method can be applied, and management of procedures and activities helps to realize a low-cost competitive advantage grounded in nuclear safety. (author)

  13. Multi-phase flow monitoring with electrical impedance tomography using level set based method

    International Nuclear Information System (INIS)

    Liu, Dong; Khambampati, Anil Kumar; Kim, Sin; Kim, Kyung Youn

    2015-01-01

    Highlights: • LSM has been used for shape reconstruction to monitor multi-phase flow using EIT. • The multi-phase level set model for conductivity is represented by two level set functions. • LSM handles topological merging and breaking naturally during the evolution process. • To reduce the computational time, a narrowband technique was applied. • Use of the narrowband and optimization approach results in an efficient and fast method. - Abstract: In this paper, a level set-based reconstruction scheme is applied to multi-phase flow monitoring using electrical impedance tomography (EIT). The proposed scheme involves applying a narrowband level set method to solve the inverse problem of finding the interface between the regions having different conductivity values. The multi-phase level set model for the conductivity distribution inside the domain is represented by two level set functions. The key principle of the level set-based method is to implicitly represent the shape of the interface as the zero level set of a higher-dimensional function and then solve a set of partial differential equations. The level set-based scheme handles topological merging and breaking naturally during the evolution process. It also offers several advantages compared to the traditional pixel-based approach. The level set-based method for multi-phase flow is tested with numerical and experimental data. It is found that the level set-based method has better reconstruction performance than the pixel-based method.
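
    The implicit representation at the heart of the method can be shown in a few lines: an interface is the zero level set of a function φ defined on the whole grid, and taking the pointwise minimum of two signed distance functions merges two inclusions with no special handling, which is the topological flexibility the abstract highlights. A sketch, not the authors' reconstruction code:

```python
import numpy as np

# An interface represented implicitly as the zero level set of phi.
# The pointwise minimum of two signed distance functions is the union of
# the two interiors: overlapping inclusions merge with no special handling.
y, x = np.mgrid[-1:1:256j, -1:1:256j]
phi1 = np.hypot(x + 0.3, y) - 0.35        # signed distance to inclusion 1
phi2 = np.hypot(x - 0.3, y) - 0.35        # signed distance to inclusion 2
phi = np.minimum(phi1, phi2)              # union: the two bubbles merge

inside = phi < 0                          # e.g. the low-conductivity phase
print(f"inclusion area fraction: {inside.mean():.3f}")
```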

  14. Does regional disadvantage affect health-related sport and physical activity level? A multi-level analysis of individual behaviour.

    Science.gov (United States)

    Wicker, Pamela; Downward, Paul; Lera-López, Fernando

    2017-11-01

    This study examines the role of regional government quality in health-related participation in sport and physical activity among adults (18-64 years) in 28 European countries. The importance of the analysis rests in the relative autonomy that regional and local governments have over policy decisions connected with sport and physical activity. While existing studies have focussed on economic and infrastructural investment and expenditure, this research investigates the quality of regional governments across 208 regions within 28 European countries. The individual-level data stem from the 2013 Eurobarometer 80.2 (n = 18,675) and were combined with regional-level data from Eurostat. An individual's level of participation in sport and physical activity was measured by three variables reflecting whether an individual's activity level is below, meets, or exceeds the recommendations of the World Health Organization. The results of multi-level analyses reveal that regional government quality has a significant and positive association with individual participation in sport and physical activity at a level meeting or exceeding the guidelines. The impact is much larger than that of regional gross domestic product per capita, indicating that regional disadvantage in terms of political quality is more relevant than being disadvantaged in terms of economic wealth.

  15. Food Image Recognition via Superpixel Based Low-Level and Mid-Level Distance Coding for Smart Home Applications

    Directory of Open Access Journals (Sweden)

    Jiannan Zheng

    2017-05-01

    Full Text Available Food image recognition is a key enabler for many smart home applications such as the smart kitchen and the smart personal nutrition log. In order to improve living experience and life quality, smart home systems collect valuable insights of users' preferences, nutrition intake and health conditions via accurate and robust food image recognition. In addition, efficiency is also a major concern since many smart home applications are deployed on mobile devices where high-end GPUs are not available. In this paper, we investigate compact and efficient food image recognition methods, namely low-level and mid-level approaches. Considering the real application scenario where only limited and noisy data are available, we first propose a superpixel based Linear Distance Coding (LDC) framework where distinctive low-level food image features are extracted to improve performance. On a challenging small food image dataset where only 12 training images are available per category, our framework has shown superior performance in both accuracy and robustness. In addition, to better model deformable food part distributions, we extend LDC's feature-to-class distance idea and propose a mid-level superpixel food parts-to-class distance mining framework. The proposed framework shows superior performance on benchmark food image datasets compared to other low-level and mid-level approaches in the literature.

  16. Experimental data base for containment thermalhydraulic analysis

    International Nuclear Information System (INIS)

    Cheng, X.; Bazin, P.; Cornet, P.; Hittner, D.; Jackson, J.D.; Lopez Jimenez, J.; Naviglio, A.; Oriolo, F.; Petzold, H.

    2001-01-01

    This paper describes the joint research project DABASCO, which is supported by the European Community under a cost-shared contract and involves nine European institutions. The main objective of the project is to provide a generic experimental data base for the development of physical models and correlations for containment thermalhydraulic analysis. The project consists of seven separate-effects experimental programs which deal with new innovative conceptual features, e.g. passive decay heat removal and spray systems. The results of the various stages of the test programs will be assessed by industrial partners in relation to their applicability to reactor conditions.

  17. Constructing storyboards based on hierarchical clustering analysis

    Science.gov (United States)

    Hasebe, Satoshi; Sami, Mustafa M.; Muramatsu, Shogo; Kikuchi, Hisakazu

    2005-07-01

    There is a growing need for quick previews of video content, both to improve the accessibility of video archives and to reduce network traffic. In this paper, a storyboard that contains a user-specified number of keyframes is produced from a given video sequence. It is based on hierarchical cluster analysis of feature vectors that are derived from wavelet coefficients of video frames. Consistent use of the extracted feature vectors is the key to avoiding repeated, computationally intensive parsing of the same video sequence. Experimental results suggest that a significant reduction in computational time is gained by this strategy.
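
    A minimal sketch of the storyboard construction: hierarchically cluster per-frame feature vectors and keep the frame nearest each cluster centroid. Random vectors stand in for the wavelet-coefficient features used in the paper, and Ward linkage is an arbitrary choice.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hierarchically cluster per-frame feature vectors, then keep one
# representative frame (nearest the centroid) per cluster as a keyframe.
def storyboard(features, n_keyframes):
    Z = linkage(features, method="ward")
    labels = fcluster(Z, t=n_keyframes, criterion="maxclust")
    keyframes = []
    for c in range(1, n_keyframes + 1):
        idx = np.flatnonzero(labels == c)
        centroid = features[idx].mean(axis=0)
        keyframes.append(idx[np.argmin(np.linalg.norm(features[idx] - centroid, axis=1))])
    return sorted(keyframes)

rng = np.random.default_rng(7)
frames = rng.normal(size=(300, 32))      # 300 frames, 32-D stand-in features
print(storyboard(frames, n_keyframes=6)) # indices of the 6 keyframes
```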

  18. Robust Face Recognition Based on Texture Analysis

    Directory of Open Access Journals (Sweden)

    Sanun Srisuk

    2013-01-01

    Full Text Available In this paper, we present a new framework for face recognition under varying illumination based on DCT total variation minimization (DTV), a Gabor filter, sub-micro-pattern analysis (SMP) and the discriminated accumulative feature transform (DAFT). We first suppress the illumination effect by using the DCT with the help of TV as a tool for face normalization. The DTV image is then emphasized by the Gabor filter. The facial features are encoded by our proposed method, the SMP. The SMP image is then transformed to a 2D histogram using DAFT. Our system is verified with experiments on the AR and the Yale face database B.

  19. AR(p)-based detrended fluctuation analysis

    Science.gov (United States)

    Alvarez-Ramirez, J.; Rodriguez, E.

    2018-07-01

    Autoregressive models are commonly used for modeling time series from nature, economics and finance. This work explored simple autoregressive AR(p) models to remove long-term trends in detrended fluctuation analysis (DFA). Crude oil prices and the bitcoin exchange rate were considered, with the former corresponding to a mature market and the latter to an emergent market. Results showed that AR(p)-based DFA performs similarly to traditional DFA. However, the former provides information on the stability of long-term trends, which is valuable for understanding and quantifying the dynamics of complex time series from financial systems.
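
    Classic DFA with the detrending step isolated makes the idea concrete: the per-window polynomial fit is the piece an AR(p) model would replace. The sketch below shows the standard variant, for which uncorrelated noise yields a scaling exponent near 0.5; it is a baseline to modify, not the authors' exact algorithm.

```python
import numpy as np

# Classic DFA with a pluggable detrending step; the paper's variant would
# swap the per-window polynomial fit for an AR(p) model fit.
def poly_detrend(w):
    t = np.arange(w.size)
    return w - np.polyval(np.polyfit(t, w, 1), t)

def dfa(x, scales, detrend=poly_detrend):
    profile = np.cumsum(x - x.mean())                  # integrated series
    F = []
    for s in scales:
        segs = [profile[i:i + s] for i in range(0, profile.size - s + 1, s)]
        F.append(np.sqrt(np.mean([np.mean(detrend(w) ** 2) for w in segs])))
    return np.array(F)

rng = np.random.default_rng(0)
scales = np.array([16, 32, 64, 128, 256])
F = dfa(rng.normal(size=4096), scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]
print(f"alpha ~ {alpha:.2f}")   # ~0.5 for uncorrelated noise
```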

  20. Similarity-based pattern analysis and recognition

    CERN Document Server

    Pelillo, Marcello

    2013-01-01

    This accessible text/reference presents a coherent overview of the emerging field of non-Euclidean similarity learning. The book presents a broad range of perspectives on similarity-based pattern analysis and recognition methods, from purely theoretical challenges to practical, real-world applications. The coverage includes both supervised and unsupervised learning paradigms, as well as generative and discriminative models. Topics and features: explores the origination and causes of non-Euclidean (dis)similarity measures, and how they influence the performance of traditional classification alg

  1. Nondeducibility-Based Analysis of Cyber-Physical Systems

    Science.gov (United States)

    Gamage, Thoshitha; McMillin, Bruce

    Controlling information flow in a cyber-physical system (CPS) is challenging because cyber domain decisions and actions manifest themselves as visible changes in the physical domain. This paper presents a nondeducibility-based observability analysis for CPSs. In many CPSs, the capacity of a low-level (LL) observer to deduce high-level (HL) actions ranges from limited to none. However, a collaborative set of observers strategically located in a network may be able to deduce all the HL actions. This paper models a distributed power electronics control device network using a simple DC circuit in order to understand the effect of multiple observers in a CPS. The analysis reveals that the number of observers required to deduce all the HL actions in a system increases linearly with the number of configurable units. A simple definition of nondeducibility based on the uniqueness of low-level projections is also presented. This definition is used to show that a system with two security domain levels could be considered “nondeducibility secure” if no unique LL projections exist.

  2. Motion Analysis Based on Invertible Rapid Transform

    Directory of Open Access Journals (Sweden)

    J. Turan

    1999-06-01

    Full Text Available This paper presents the results of a study on the use of the invertible rapid transform (IRT) for motion estimation in a sequence of images. Motion estimation algorithms based on the analysis of the matrix of states (produced in the IRT calculation) are described. The new method was used experimentally to estimate crowd and traffic motion from image data sequences captured at railway stations and on highways in large cities. The motion vectors may be used to devise a polar plot (showing velocity magnitude and direction) for moving objects, where the dominant motion tendency can be seen. Experimental results comparing the new motion estimation methods with other well-known block matching methods (full search, 2D-log, and methods based on the conventional cross correlation (CC) function or the phase correlation (PC) function) for the application of crowd motion estimation are also presented.
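
    Of the baselines mentioned, phase correlation is easy to sketch: the normalized cross-power spectrum of two frames' FFTs peaks at their relative displacement. (The IRT-based estimator itself is specific to the paper and not reproduced here.)

```python
import numpy as np

# Phase correlation for translation between two frames: the inverse FFT of
# the normalized cross-power spectrum peaks at the displacement.
def phase_correlation(a, b):
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    R = Fa * np.conj(Fb)
    R /= np.abs(R) + 1e-12
    corr = np.fft.ifft2(R).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Map wrap-around indices to signed shifts.
    if dy > a.shape[0] // 2:
        dy -= a.shape[0]
    if dx > a.shape[1] // 2:
        dx -= a.shape[1]
    return dy, dx

rng = np.random.default_rng(5)
frame = rng.random((64, 64))
moved = np.roll(frame, shift=(3, -7), axis=(0, 1))
print(phase_correlation(moved, frame))   # -> (3, -7)
```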

  3. IMAGE ANALYSIS BASED ON EDGE DETECTION TECHNIQUES

    Institute of Scientific and Technical Information of China (English)

    纳瑟; 刘重庆

    2002-01-01

    A method that incorporates an edge detection technique, a Markov Random Field (MRF), watershed segmentation and merging techniques was presented for performing image segmentation and edge detection tasks. It first applies an edge detection technique to obtain a Difference In Strength (DIS) map. An initial segmentation result is obtained based on the K-means clustering technique and the minimum distance. Then the region process is modeled by the MRF to obtain an image that contains different intensity regions. The gradient values are calculated and the watershed technique is then used. The DIS calculation is performed for each pixel to define all the edges (weak or strong) in the image, and the DIS map is obtained. This serves as prior knowledge of the possible region segmentation for the next step (MRF), which gives an image that has all the edge and region information. In the MRF model, the gray level l at pixel location i in an image X depends on the gray levels of neighboring pixels. The segmentation results are improved by using the watershed algorithm. After all pixels of the segmented regions are processed, a map of primitive regions with edges is generated. The edge map is obtained using a merge process based on averaged intensity mean values. Common edge detectors that work on the (MRF) segmented image are used and the results are compared. The segmentation and edge detection result is one closed boundary per actual region in the image.

  4. Assessment of Radio-Frequency Radiation Exposure Level from Selected Mobile Base Stations (MBS) in Lokoja, Kogi State, Nigeria

    OpenAIRE

    Nwankwo, U. J. Victor; Jibiri, N. N.; Dada, S. S.; Onugba, A. A.; Ushie, P.

    2012-01-01

    The acquisition and use of mobile phones is increasing tremendously, especially in developing countries, but not without concern. The greater concern among the public is principally over the proximity of mobile base stations (MBS) to residential areas rather than the use of handsets. In this paper, we present an assessment of Radio-Frequency (RF) radiation exposure level measurements and analysis of radiation power density (in W/sq m) from mobile base stations relative to radial distance (in ...
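
    The far-field relation typically used for such assessments is S = P·G / (4πd²), where S is the power density, P the transmitter power, G the antenna gain, and d the radial distance. A worked sketch follows; the transmitter power and gain are assumed values, not measurements from the study.

      # Far-field power density around a base station: S = P*G / (4*pi*d^2).
      import math

      def power_density(p_watts, gain_linear, distance_m):
          return p_watts * gain_linear / (4 * math.pi * distance_m ** 2)

      for d in (50, 100, 200):  # radial distances in metres (illustrative)
          s = power_density(p_watts=20.0, gain_linear=40.0, distance_m=d)
          print(f"{d:>4} m : {s:.4f} W/sq m")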

  5. Staffing Levels and Inpatient Outcomes at Military Health Care Facilities: A Resource-Based View

    National Research Council Canada - National Science Library

    Yap, Glenn

    2004-01-01

    Using a Resource-Based Theory/View of the firm, this study examined if increased inpatient staffing levels at military hospitals can generate a competitive advantage based on better patient quality outcomes...

  6. System-level hazard analysis using the sequence-tree method

    International Nuclear Information System (INIS)

    Huang, H.-W.; Shih Chunkuan; Yih Swu; Chen, M.-H.

    2008-01-01

    A system-level PHA using the sequence-tree method is presented to perform safety-related digital I and C system SSA. The conventional PHA involves brainstorming among experts on various portions of the system to identify hazards through discussions. However, since the conventional PHA is not a systematic technique, the analysis results depend strongly on the experts' subjective opinions, and the quality of analysis cannot be appropriately controlled. Therefore, this study presents a system-level sequence-tree-based PHA, which can clarify the relationships among the major digital I and C systems. This sequence-tree-based technique has two major phases. The first phase adopts a table to analyze each event in SAR Chapter 15 for a specific safety-related I and C system, such as the RPS. The second phase adopts a sequence tree to recognize the I and C systems involved in the event, how the safety-related systems work, and how the backup systems can be activated to mitigate the consequences if the primary safety systems fail. The defense-in-depth echelons, namely the Control echelon, Reactor trip echelon, ESFAS echelon, and Monitoring and indicator echelon, are arranged to build the sequence-tree structure. All the related I and C systems, including the digital systems and the analog backup systems, are allocated to their specific echelons. This system-centric sequence-tree analysis systematically identifies not only preliminary hazards but also vulnerabilities in a nuclear power plant. Hence, an effective simplified D3 evaluation can also be conducted.
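
    A toy sketch of walking an event through the four defense-in-depth echelons; the echelon contents and the event are hypothetical placeholders, not systems from the paper.

      # Sketch: echelon-by-echelon walk of a sequence tree for one event.
      echelons = {
          "Control": ["digital control system"],
          "Reactor trip": ["RPS (digital)", "diverse analog trip"],
          "ESFAS": ["digital ESFAS", "manual actuation"],
          "Monitoring and indicator": ["post-accident monitoring"],
      }

      def trace_event(event, failed=frozenset()):
          """List, per echelon, which systems remain to mitigate `event`."""
          for echelon, systems in echelons.items():
              available = [s for s in systems if s not in failed]
              print(f"{event} | {echelon}: {available if available else 'NO BACKUP - hazard'}")

      trace_event("loss of feedwater", failed={"RPS (digital)"})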

  7. Recurrence quantity analysis based on matrix eigenvalues

    Science.gov (United States)

    Yang, Pengbo; Shang, Pengjian

    2018-06-01

    The recurrence plot is a powerful tool for the visualization and analysis of dynamical systems. Recurrence quantification analysis (RQA), based on point density and the diagonal and vertical line structures in recurrence plots, provides alternative measures to quantify the complexity of dynamical systems. In this paper, we present a new measure based on the recurrence matrix to quantify the dynamical properties of a given system. Matrix eigenvalues can reflect the basic characteristics of complex systems, so we characterize a system by exploring the eigenvalues of its recurrence matrix. Since Shannon entropy is an established complexity measure, we propose the entropy of matrix eigenvalues (EOME) as a new RQA measure. We confirm that EOME can be used as a metric to quantify behavioral changes of the system: as a given dynamical system moves from a non-chaotic to a chaotic regime, the EOME increases as well, and larger EOME values imply higher complexity and lower predictability. We also study the effect of several factors on EOME, including data length, recurrence threshold, embedding dimension, and additive noise. Finally, we demonstrate an application in physiology. The advantage of this measure lies in its high sensitivity and simple computation.
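
    A minimal sketch of the EOME idea: build a recurrence matrix, take the Shannon entropy of its normalized eigenvalue spectrum. The recurrence threshold here is an illustrative value, and no embedding is applied, unlike a full RQA pipeline.

      # Sketch: entropy of matrix eigenvalues (EOME) of a recurrence matrix.
      import numpy as np

      def eome(series, eps=0.2):
          x = np.asarray(series, dtype=float)
          # Recurrence matrix: R[i, j] = 1 when states i and j are within eps.
          r = (np.abs(x[:, None] - x[None, :]) <= eps).astype(float)
          eig = np.abs(np.linalg.eigvalsh(r))   # real spectrum (symmetric R)
          p = eig / eig.sum()                   # normalize to a distribution
          p = p[p > 0]
          return -(p * np.log(p)).sum()         # Shannon entropy

      rng = np.random.default_rng(0)
      print(eome(np.sin(np.linspace(0, 8 * np.pi, 200))))  # regular signal
      print(eome(rng.standard_normal(200)))                # noisy signal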

  8. Visibility Graph Based Time Series Analysis.

    Science.gov (United States)

    Stephen, Mutua; Gu, Changgui; Yang, Huijie

    2015-01-01

    Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.
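
    For reference, the natural visibility criterion that underlies such graphs: points (i, x_i) and (j, x_j) are linked when every intermediate point lies strictly below the straight line connecting them. A brute-force sketch (O(n^3), fine for short segments):

      # Sketch: natural visibility graph of a time series segment.
      import numpy as np

      def visibility_graph(series):
          x = np.asarray(series, dtype=float)
          n = len(x)
          edges = set()
          for i in range(n - 1):
              for j in range(i + 1, n):
                  # Every point between i and j must sit below the chord.
                  visible = all(
                      x[k] < x[i] + (x[j] - x[i]) * (k - i) / (j - i)
                      for k in range(i + 1, j)
                  )
                  if visible:
                      edges.add((i, j))
          return edges

      print(len(visibility_graph(np.sin(np.linspace(0, 4 * np.pi, 60)))))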

  9. Visibility Graph Based Time Series Analysis.

    Directory of Open Access Journals (Sweden)

    Mutua Stephen

    Full Text Available Network-based time series analysis has made considerable achievements in recent years. By mapping mono/multivariate time series into networks, one can investigate both their microscopic and macroscopic behaviors. However, most proposed approaches lead to the construction of static networks, consequently providing limited information on evolutionary behaviors. In the present paper we propose a method called visibility graph based time series analysis, in which series segments are mapped to visibility graphs as descriptions of the corresponding states and the successively occurring states are linked. This procedure converts a time series into a temporal network and, at the same time, a network of networks. Findings from empirical records for stock markets in the USA (S&P500 and Nasdaq) and artificial series generated by means of fractional Gaussian motions show that the method can provide rich information benefiting short-term and long-term predictions. Theoretically, we propose a method to investigate time series from the viewpoint of a network of networks.

  10. Erythropoietin levels in patients with sleep apnea: a meta-analysis.

    Science.gov (United States)

    Zhang, Xiao-Bin; Zeng, Yi-Ming; Zeng, Hui-Qing; Zhang, Hua-Ping; Wang, Hui-Ling

    2017-06-01

    Currently available data regarding the blood levels of erythropoietin (EPO) in sleep apnea (SA) patients are contradictory. The aim of the present meta-analysis was to evaluate the EPO levels in SA patients via quantitative analysis. A systematic search of Pubmed, Embase, and Web of Science was performed. EPO levels in the SA group and control group were extracted from each eligible study. The weighted mean difference (WMD) or standardized mean difference (SMD) with 95% confidence interval (CI) was calculated using fixed-effect or random-effects model analysis according to the degree of heterogeneity between studies. A total of 9 studies involving 407 participants were enrolled. The results indicated that EPO levels in the SA group were significantly higher than those in the control group (SMD 0.61, 95% CI 0.11-1.11, p = 0.016). Significantly higher EPO levels were also found in subgroup analysis by body mass index.
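
    The pooling step of such a meta-analysis reduces to an inverse-variance weighted average of the per-study effect sizes. A fixed-effect sketch follows; the study values are made up for illustration and are not the nine studies pooled above.

      # Sketch: fixed-effect inverse-variance pooling of SMDs.
      import math

      studies = [  # (smd, variance) per study -- illustrative numbers only
          (0.45, 0.04), (0.80, 0.09), (0.30, 0.02),
      ]

      weights = [1 / v for _, v in studies]
      pooled = sum(w * d for (d, _), w in zip(studies, weights)) / sum(weights)
      se = math.sqrt(1 / sum(weights))
      print(f"pooled SMD = {pooled:.2f}, 95% CI = "
            f"({pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f})")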

  11. DTI Analysis of Presbycusis Using Voxel-Based Analysis.

    Science.gov (United States)

    Ma, W; Li, M; Gao, F; Zhang, X; Shi, L; Yu, L; Zhao, B; Chen, W; Wang, G; Wang, X

    2016-07-14

    Presbycusis is the most common sensory deficit in the aging population. A recent study reported using a DTI-based tractography technique to identify a lack of integrity in a portion of the auditory pathway in patients with presbycusis. The aim of our study was to investigate the white matter pathology of patients with presbycusis by using a voxel-based analysis that is highly sensitive to local intensity changes in DTI data. Fifteen patients with presbycusis and 14 age- and sex-matched healthy controls were scanned on a 3T scanner. Fractional anisotropy, mean diffusivity, axial diffusivity, and radial diffusivity were obtained from the DTI data. Intergroup statistics were implemented on these measurements, which were transformed to Montreal Neurological Institute coordinates by using a nonrigid image registration method called large deformation diffeomorphic metric mapping. Increased axial diffusivity, radial diffusivity, and mean diffusivity and decreased fractional anisotropy were found near the right-side hearing-related areas in patients with presbycusis. Increased radial diffusivity and mean diffusivity were also found near a language-related area (Broca area) in patients with presbycusis. Our findings could be important for exploring reliable imaging evidence of presbycusis and could complement an ROI-based approach. © 2016 American Society of Neuroradiology.
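
    Conceptually, a voxel-based analysis of registered diffusion maps amounts to a statistical test at every voxel. A sketch under that assumption, on synthetic fractional anisotropy volumes (the group sizes mirror the study, everything else is illustrative):

      # Sketch: voxelwise two-sample t-test on registered FA maps.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      patients = rng.normal(0.40, 0.05, size=(15, 8, 8, 8))  # synthetic FA, group 1
      controls = rng.normal(0.45, 0.05, size=(14, 8, 8, 8))  # synthetic FA, group 2

      t, p = stats.ttest_ind(patients, controls, axis=0)     # test per voxel
      print("voxels with p < 0.001 (uncorrected):", int((p < 0.001).sum()))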

  12. DETERMINING OPTIMAL CUBE FOR 3D-DCT BASED VIDEO COMPRESSION FOR DIFFERENT MOTION LEVELS

    Directory of Open Access Journals (Sweden)

    J. Augustin Jacob

    2012-11-01

    Full Text Available This paper proposes a new three-dimensional discrete cosine transform (3D-DCT) based video compression algorithm that selects the optimal cube size based on the motion content of the video sequence. The size is determined by finding normalized pixel difference (NPD) values: by categorizing cubes as “low” or “high” motion, a suitable cube size of either [16×16×8] or [8×8×8] is chosen instead of a fixed cube size. To evaluate the performance of the proposed algorithm, test sequences with different motion levels are chosen. Rate-distortion analysis determines the level of compression that can be achieved and the quality of the reconstructed video sequence, which are compared against the fixed-cube-size algorithm. Peak signal-to-noise ratio (PSNR) is used to measure video quality. Experimental results show that varying the cube size according to the motion content of the video frames gives better performance in terms of compression ratio and video quality.
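
    The cube-selection decision can be sketched as a single threshold on the mean normalized frame-to-frame difference. The threshold below is an assumed value, not the one from the paper.

      # Sketch: motion-adaptive cube selection from an NPD-style statistic.
      import numpy as np

      def choose_cube(frames, threshold=0.1):
          """frames: (8, H, W) uint8 block of the sequence."""
          f = frames.astype(float) / 255.0
          npd = np.abs(np.diff(f, axis=0)).mean()   # mean normalized difference
          # High motion -> smaller [8x8x8] cube; low motion -> [16x16x8].
          return (8, 8, 8) if npd > threshold else (16, 16, 8)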

  13. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method.

    Science.gov (United States)

    Yang, Jun-He; Cheng, Ching-Hsue; Chan, Chia-Pan

    2017-01-01

    Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast a reservoir's water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated by date into an integrated research dataset. The proposed time-series forecasting model has three foci. First, the study applies five imputation methods to treat missing values rather than directly deleting them. Second, the key variables are identified via factor analysis, and unimportant variables are then deleted sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir's water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, the experiments show that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.
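
    A compact sketch of the three-step pipeline (impute, select variables, fit a Random Forest) on synthetic data; the column names, imputation strategy, and selection method are placeholders, not the paper's exact choices.

      # Sketch: imputation -> variable selection -> Random Forest forecast.
      import numpy as np
      import pandas as pd
      from sklearn.ensemble import RandomForestRegressor
      from sklearn.feature_selection import SelectKBest, f_regression
      from sklearn.impute import SimpleImputer

      rng = np.random.default_rng(7)
      df = pd.DataFrame(rng.normal(size=(300, 6)),
                        columns=["rain", "inflow", "temp", "wind", "press", "hum"])
      df.iloc[::17, 0] = np.nan                      # inject missing values
      y = df["inflow"] * 2 + rng.normal(size=300)    # stand-in for water level

      X = SimpleImputer(strategy="mean").fit_transform(df)    # step 1: impute
      X = SelectKBest(f_regression, k=3).fit_transform(X, y)  # step 2: select
      model = RandomForestRegressor(n_estimators=200).fit(X, y)  # step 3: fit
      print("in-sample R^2:", round(model.score(X, y), 3))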

  14. A Time-Series Water Level Forecasting Model Based on Imputation and Variable Selection Method

    Directory of Open Access Journals (Sweden)

    Jun-He Yang

    2017-01-01

    Full Text Available Reservoirs are important for households and impact the national economy. This paper proposes a time-series forecasting model based on estimating missing values followed by variable selection to forecast a reservoir’s water level. This study collected data from the Taiwan Shimen Reservoir as well as daily atmospheric data from 2008 to 2015. The two datasets were concatenated by date into an integrated research dataset. The proposed time-series forecasting model has three foci. First, the study applies five imputation methods to treat missing values rather than directly deleting them. Second, the key variables are identified via factor analysis, and unimportant variables are then deleted sequentially via the variable selection method. Finally, the proposed model uses a Random Forest to build the forecasting model of the reservoir’s water level, which is compared with the listed methods in terms of forecasting error. The experimental results indicate that the Random Forest forecasting model, when applied to variable selection with full variables, has better forecasting performance than the listed models. In addition, the experiments show that the proposed variable selection can help the five forecasting methods used here improve their forecasting capability.

  15. Intensity liquid level sensor based on multimode interference and fiber Bragg grating

    International Nuclear Information System (INIS)

    Oliveira, Ricardo; Aristilde, Stenio; Osório, Jonas H; Cordeiro, Cristiano M B; Franco, Marcos A R; Bilro, Lúcia; Nogueira, Rogério N

    2016-01-01

    In this paper an intensity liquid level sensor based on a single-mode—no-core—single-mode (SMS) fiber structure, together with a Bragg grating inscribed in the latter single-mode fiber, is proposed. Because the no-core fiber is sensitive to the external refractive index, the SMS spectral response shifts according to the length of no-core fiber immersed in the liquid. By positioning the FBG central wavelength in the spectral region of the SMS edge filter, it is possible to measure the liquid level from the reflected FBG peak power through an intensity-based approach. The sensor is also self-referenced using the peak power of another FBG placed before, and far from, the sensing part. A temperature error analysis revealed that the sensor can operate in environments where temperature changes are minimal. The possibility of using a second setup that makes the whole device temperature insensitive is also discussed.
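
    The self-referenced readout reduces to simple arithmetic: normalize the sensing-FBG peak power by the reference-FBG peak power, then map the ratio to level through a calibration curve. The linear calibration and its coefficients below are assumed for illustration only; the paper does not give this form.

      # Sketch: self-referenced intensity readout with an assumed linear
      # calibration from power ratio to liquid level.
      def liquid_level(p_sensing_mw, p_reference_mw, slope=-12.5, offset=25.0):
          """Level (cm) from the ratio of sensing to reference FBG peak power."""
          ratio = p_sensing_mw / p_reference_mw
          return slope * ratio + offset

      print(liquid_level(p_sensing_mw=1.2, p_reference_mw=1.0))  # -> 10.0 cm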

  16. Local and System Level Considerations for Plasma-Based Techniques in Hypersonic Flight

    Science.gov (United States)

    Suchomel, Charles; Gaitonde, Datta

    2007-01-01

    The harsh environment encountered in hypersonic flight, particularly when air-breathing propulsion devices are utilized, poses daunting challenges to the successful maturation of suitable technologies. This has spurred the quest for revolutionary solutions, particularly those exploiting the fact that air under these conditions can become electrically conducting, either naturally or through artificial enhancement. Optimized development of such concepts must emphasize not only the detailed physics by which the fluid interacts with the imposed electromagnetic fields, but must also identify the system-level integration issues and efficiencies that provide the greatest leverage. This paper presents some recent advances at both levels. At the system level, an analysis is summarized that incorporates the interdependencies among weight, power, and flow-field performance improvements. Cruise performance comparisons highlight how one drag-reduction device interacts with the vehicle to improve range. Quantified parameter interactions allow specification of system requirements and of the energy-consuming technologies that affect overall flight vehicle performance. Results based on the fundamental physics are presented by distilling numerous computational studies into a few guiding principles. These highlight the complex, non-intuitive relationships between the various fluid and electromagnetic fields, together with thermodynamic considerations. Generally, energy extraction is an efficient process, while the reverse is accompanied by significant dissipative heating and inefficiency. Velocity distortions can be detrimental to plasma operation, but can be exploited to tailor flows through innovative electromagnetic configurations.

  17. Discrimination of nitrogen fertilizer levels of tea plant (Camellia sinensis) based on hyperspectral imaging.

    Science.gov (United States)

    Wang, Yujie; Hu, Xin; Hou, Zhiwei; Ning, Jingming; Zhang, Zhengzhu

    2018-04-01

    Nitrogen (N) fertilizer plays an important role in tea plantation management, with significant impacts on the photosynthetic capacity, productivity and nutrition status of tea plants. The present study aimed to establish a method for the discrimination of N fertilizer levels using a hyperspectral imaging technique. Spectral data were extracted from the region of interest, followed by the first derivative to reduce background noise. Five optimal wavelengths were selected by principal component analysis. Texture features were extracted from the images at the optimal wavelengths by a gray-level gradient co-occurrence matrix. Support vector machine (SVM) and extreme learning machine were used to build classification models based on spectral data, optimal wavelengths, texture features and data fusion, respectively. The SVM model using fused data gave the best performance, with the highest correct classification rate of 100% for the prediction set. The overall results indicated that visible and near-infrared hyperspectral imaging combined with SVM was effective in discriminating N fertilizer levels of tea plants. © 2018 Society of Chemical Industry.
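
    The data-fusion classifier amounts to concatenating the spectral and texture feature vectors before training the SVM. A sketch on synthetic arrays; feature counts match the abstract (five wavelengths), but all values and the texture dimensionality are made up.

      # Sketch: SVM on fused spectral + texture features.
      import numpy as np
      from sklearn.model_selection import cross_val_score
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      spectral = rng.normal(size=(120, 5))    # five optimal wavelengths
      texture = rng.normal(size=(120, 4))     # GLGCM texture features (assumed dim)
      X = np.hstack([spectral, texture])      # data fusion: concatenate features
      y = rng.integers(0, 3, size=120)        # three N fertilizer levels

      clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
      print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean().round(3))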

  18. Greater-than-Class-C Low-Level Waste Data Base user's manual

    International Nuclear Information System (INIS)

    1992-07-01

    The Greater-than-Class-C Low-level Waste (GTCC LLW) Data Base characterizes GTCC LLW using low, base, and high cases for three different scenarios: unpackaged, packaged, and concentration averages. The GTCC LLW Data Base can be used to project future volumes and radionuclide activities. This manual provides instructions for users of the GTCC LLW Data Base.

  19. Local level epidemiological analysis of TB in people from a high incidence country of birth

    OpenAIRE

    Massey Peter D; Durrheim David N; Stephens Nicola; Christensen Amanda

    2013-01-01

    Background: The setting for this analysis is the low tuberculosis (TB) incidence state of New South Wales (NSW), Australia. Local-level analysis of TB epidemiology in people from high-incidence countries of birth (HIC) in a low-incidence setting has not been conducted in Australia and has not been widely reported. Local-level analysis could inform measures such as active case finding and targeted earlier diagnosis. The aim of this study was to use a novel approach to identify local ar...

  20. Work-engaged nurses for a better clinical learning environment: a ward-level analysis.

    Science.gov (United States)

    Tomietto, Marco; Comparcini, Dania; Simonetti, Valentina; Pelusi, Gilda; Troiani, Silvano; Saarikoski, Mikko; Cicolini, Giancarlo

    2016-05-01

    To correlate workgroup engagement in nursing teams with the clinical learning experience of nursing students. Work engagement plays a pivotal role in explaining motivational dynamics. Nursing education is workplace-based: through their clinical placements, nursing students develop both their clinical competences and their professional identity. However, there is currently a lack of evidence on the role of work engagement in students' learning experiences. A total of 519 nurses and 519 nursing students were enrolled in hospital settings. The Utrecht Work Engagement Scale (UWES) was used to assess work engagement, and the Clinical Learning Environment and Supervision plus nurse Teacher (CLES+T) scale was used to assess the students' learning experience. A multilevel linear regression analysis was performed. Group-level work engagement of nurses correlated with the students' clinical learning experience (β = 0.11, P learning (respectively, β = 0.37, P education. Nursing education institutions and health-care settings need to work conjointly to build effective organisational climates. The results highlight the importance of group-level analysis for understanding the most effective intervention strategies for both organisations and nursing education. © 2015 John Wiley & Sons Ltd.
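
    A multilevel design of this kind — students nested in wards, with the ward's mean nurse engagement as a group-level predictor — can be sketched as a mixed-effects model. The data, effect sizes, and variable names below are synthetic stand-ins for the UWES and CLES+T scores, not the study's data.

      # Sketch: mixed-effects (multilevel) regression of student learning
      # experience on ward-level nurse work engagement.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(5)
      n, wards = 519, 40
      ward_id = rng.integers(0, wards, size=n)
      ward_eng = rng.normal(4.0, 0.5, size=wards)          # ward-mean UWES
      df = pd.DataFrame({
          "ward": ward_id,
          "uwes_ward_mean": ward_eng[ward_id],
      })
      # Synthetic CLES+T score with a ward-level engagement effect.
      df["cles_t"] = 2.0 + 0.4 * df["uwes_ward_mean"] + rng.normal(0, 0.6, size=n)

      model = smf.mixedlm("cles_t ~ uwes_ward_mean", df, groups=df["ward"]).fit()
      print(model.params["uwes_ward_mean"])                # group-level slope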