Energy Technology Data Exchange (ETDEWEB)
Nygaard, E. T. [Babcock and Wilcox Technical Services Group, 800 Main Street, Lynchburg, VA 24504 (United States); Pain, C. C.; Eaton, M. D.; Gomes, J. L. M. A.; Goddard, A. J. H.; Gorman, G.; Tollit, B.; Buchan, A. G.; Cooling, C. M. [Applied Modelling and Computation Group, Dept. of Earth Science and Engineering, Imperial College London, SW7 2AZ (United Kingdom); Angelo, P. L. [Y-12 National Security Complex, Oak Ridge, TN 37831 (United States)
2012-07-01
Babcock and Wilcox Technical Services Group (B and W) has identified aqueous homogeneous reactors (AHRs) as a technology well suited to produce the medical isotope molybdenum-99 (Mo-99). AHRs have never been specifically designed or built for this specialized purpose. However, AHRs have a proven history of being safe research reactors. In fact, in 1958, AHRs had 'a longer history of operation than any other type of research reactor using enriched fuel' and had been 'experimentally demonstrated to be among the safest of all various type of research reactor now in use [1].' While AHRs have been modeled effectively using simplified 'Level 1' tools, the complex interactions between fluids, neutronics, and solid structures are important (but not necessarily safety significant). These interactions require a 'Level 2' modeling tool. Imperial College London (ICL) has developed such a tool: Finite Element Transient Criticality (FETCH). FETCH couples the radiation transport code EVENT with the computational fluid dynamics code Fluidity; the result is a code capable of modeling sub-critical, critical, and super-critical solutions in both two and three dimensions. Using FETCH, ICL researchers and B and W engineers have studied many fissioning solution systems, including the Tokaimura criticality accident, the Y-12 accident, SILENE, TRACY, and SUPO. These modeling efforts will ultimately be incorporated into FETCH's extensive automated verification and validation (V and V) test suite, expanding FETCH's area of applicability to include all relevant physics associated with AHRs. These efforts parallel B and W's engineering effort to design and optimize an AHR to produce Mo-99. (authors)
Energy Technology Data Exchange (ETDEWEB)
Rider, William, E-mail: wjrider@sandia.gov [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States); Witkowski, Walt [Sandia National Laboratories, Verification and Validation, Uncertainty Quantification, Credibility Processes Department, Engineering Sciences Center, Albuquerque, NM 87185 (United States); Kamm, James R. [Los Alamos National Laboratory, Methods and Algorithms Group, Computational Physics Division, Los Alamos, NM 87545 (United States); Wildey, Tim [Sandia National Laboratories, Center for Computing Research, Albuquerque, NM 87185 (United States)
2016-02-15
We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences incorporates expert judgment directly into the process via a flexible optimization framework and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis and, together with the robust statistics, guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. It is based on solving multiple constrained optimization problems for the verification model in a manner that varies the underlying assumptions of the analysis. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). The result is self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high quality results for well-behaved cases, relatively consistent with existing practice. It can also produce reliable results for ill-behaved circumstances, predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
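The median-over-triplets idea behind Robust Verification can be sketched in a few lines. This is an illustrative reconstruction, not the authors' actual formulation: the function names are ours, and the simple clipping bounds stand in for the paper's constrained-optimization treatment of expert judgment.

```python
import numpy as np

def observed_order(f_coarse, f_mid, f_fine, r):
    """Classical observed order of convergence from solutions on three
    grid levels with a constant refinement ratio r (coarse -> fine)."""
    return np.log(abs((f_mid - f_coarse) / (f_fine - f_mid))) / np.log(r)

def robust_order(solutions, r, p_min=0.5, p_max=4.0):
    """Median of per-triplet order estimates, clipped to expert bounds.

    The clipping bounds [p_min, p_max] stand in for expert-judgment
    constraints on the convergence rate; the median (rather than a
    mean-based fit) damps the influence of anomalous triplets."""
    f = [float(v) for v in solutions]  # coarse-to-fine sequence of outputs
    estimates = []
    for i in range(len(f) - 2):
        p = observed_order(f[i], f[i + 1], f[i + 2], r)
        estimates.append(min(max(p, p_min), p_max))
    return float(np.median(estimates))
```

For a well-behaved sequence such as f(h) = 1 + h^2 sampled at h = 1, 1/2, 1/4, 1/8, both estimators recover an order of 2; for noisy sequences the median keeps a single bad triplet from dominating.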
Verification and Performance Analysis for Embedded Systems
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand
2009-01-01
This talk provides a thorough tutorial of the UPPAAL tool suite for modeling, simulation, verification, optimal scheduling, synthesis, testing and performance analysis of embedded and real-time systems.
Homogeneity Analysis in Xlisp-Stat
Directory of Open Access Journals (Sweden)
Jason Bond
1996-08-01
In this paper a highly interactive, user-friendly Lisp program is introduced to perform homogeneity analysis. A brief introduction to the technique is presented, as well as its modification in the presence of missing data. The algorithm and its Lisp implementation are discussed, and an overview of the object-oriented code that produces the interactive dialogs and plots is provided. In order to demonstrate the main features of the program, a small and a large dataset are analyzed. Finally, some comparisons are made with other currently available programs.
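One common formulation of homogeneity analysis (multiple correspondence analysis in the Gifi system, which this Lisp program implements) derives object scores from the averaged per-variable category projectors. A minimal numpy sketch, assuming complete data with integer category codes; the function name and the sqrt(n) score scaling are our illustrative choices, not the program's:

```python
import numpy as np

def homogeneity_analysis(data, ndim=2):
    """Minimal homogeneity analysis sketch.
    data: (n_objects, n_vars) array-like of integer category codes."""
    data = np.asarray(data)
    n, m = data.shape
    # Average of the per-variable projectors G_j D_j^{-1} G_j'
    P = np.zeros((n, n))
    for j in range(m):
        G = (data[:, [j]] == np.unique(data[:, j])).astype(float)  # indicator matrix
        D_inv = np.diag(1.0 / G.sum(axis=0))                       # inverse category counts
        P += G @ D_inv @ G.T
    P /= m
    # Project out the trivial constant solution, then take leading eigenvectors
    J = np.eye(n) - np.ones((n, n)) / n
    evals, evecs = np.linalg.eigh(J @ P @ J)
    order = np.argsort(evals)[::-1]
    scores = evecs[:, order[:ndim]] * np.sqrt(n)  # object scores
    return scores, evals[order[:ndim]]
```

For perfectly associated variables the leading nontrivial eigenvalue is 1, the maximum possible; eigenvalues near 0 indicate no common structure.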
Ahmad, R.
2016-07-01
This article reports an unbiased analysis for water-based rod-shaped alumina nanoparticles, considering both the homogeneous and non-homogeneous nanofluid models over the coupled nanofluid-surface interface. The mechanics of the surface are found for both the homogeneous and non-homogeneous models, which were ignored in previous studies. The viscosity and thermal conductivity data are taken from the international nanofluid property benchmark exercise. All simulations are performed using experimentally verified results. By considering the homogeneous and non-homogeneous models, the precise movement of the alumina nanoparticles over the surface has been observed by solving the corresponding system of differential equations. For the non-homogeneous model, a uniform temperature and nanofluid volume fraction are assumed at the surface, and the flux of the alumina nanoparticles is taken as zero. The assumption of zero nanoparticle flux at the surface makes the non-homogeneous model physically more realistic. The differences between all profiles for the homogeneous and non-homogeneous models are insignificant, owing to small deviations in the values of the Brownian motion and thermophoresis parameters.
DEFF Research Database (Denmark)
Mikkelsen, Lars Pilgaard
2015-01-01
…strength and fatigue performance is essential. Nevertheless, testing composites involves some challenges regarding stiffness determination using conventional strain gauges and achieving correct material failure unaffected by the gripping region during fatigue testing. These challenges have, in the present study, been addressed using the finite element method. In doing so, a verification of experimental observations, a deeper understanding of the test coupon loading, and thereby improved test methods have been achieved.
Computing equations of water hammer in pseudo-homogeneous solid-liquid flow and their verification
Institute of Scientific and Technical Information of China (English)
韩文亮; 董曾南; 柴宏恩; 韩军
2000-01-01
In engineering practice, single-phase water hammer models are still employed to analyze water hammer in solid-liquid flow. According to the characteristics of solid-liquid flow, continuity and momentum equations for pseudo-homogeneous flows are deduced, and a pseudo-homogeneous water hammer model is thus built and verified against experimental results. The model accounts for the viscosity, resistance and wave velocity characteristics of solid-liquid flow, and therefore has higher precision than a single-phase model.
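A typical building block of such a pseudo-homogeneous model is the pressure wave celerity evaluated with mixture rather than single-phase properties. A minimal sketch, assuming a volume-weighted mixture density, a series (isostress) mixture bulk modulus, and a Korteweg-type correction for pipe-wall elasticity — all standard textbook forms, not necessarily the authors' exact closures:

```python
import math

def mixture_density(rho_liquid, rho_solid, c_v):
    """Pseudo-homogeneous mixture density for volumetric solids fraction c_v."""
    return c_v * rho_solid + (1.0 - c_v) * rho_liquid

def mixture_bulk_modulus(k_liquid, k_solid, c_v):
    """Series (isostress) estimate of the mixture bulk modulus."""
    return 1.0 / (c_v / k_solid + (1.0 - c_v) / k_liquid)

def wave_celerity(k_m, rho_m, e_pipe, diameter, wall_thickness):
    """Korteweg-type water hammer wave speed in an elastic pipe:
    a = sqrt(K_m / rho_m) / sqrt(1 + (K_m / E) * (D / e))."""
    return math.sqrt(k_m / rho_m) / math.sqrt(
        1.0 + (k_m / e_pipe) * (diameter / wall_thickness))
```

For clean water in a thin-walled steel pipe this gives a celerity around 1200 m/s; adding solids raises the mixture density and bulk modulus together, shifting both the wave speed and the resulting transient pressures relative to the single-phase model.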
Functional verification coverage measurement and analysis
Piziali, Andrew
2007-01-01
This book addresses a means of quantitatively assessing functional verification progress. Without this process, design and verification engineers, and management, are left guessing whether or not they have completed verifying the device they are designing.
Verification and validation plan for the SFR system analysis module
Energy Technology Data Exchange (ETDEWEB)
Hu, R. [Argonne National Lab. (ANL), Argonne, IL (United States)
2014-12-18
This report documents the Verification and Validation (V&V) Plan for software verification and validation of the SFR System Analysis Module (SAM), developed at Argonne National Laboratory for sodium fast reactor whole-plant transient analysis. SAM is developed under the DOE NEAMS program and is part of the Reactor Product Line toolkit. The SAM code, the phenomena and computational models of interest, the software quality assurance, and the verification and validation requirements and plans are discussed in this report.
Local real analysis in locally homogeneous spaces
Bramanti, Marco
2011-01-01
We introduce the concept of locally homogeneous space, and prove in this context L^p and Hölder estimates for singular and fractional integrals, as well as L^p estimates on the commutator of a singular or fractional integral with a BMO or VMO function. These results are motivated by local a priori estimates for subelliptic equations.
Homogenization analysis of complementary waveguide metamaterials
Landy, Nathan; Hunt, John; Smith, David R.
2013-11-01
We analyze the properties of complementary metamaterials as effective inclusions patterned into the conducting walls of metal waveguide structures. We show that guided wave metamaterials can be homogenized using the same retrieval techniques used for volumetric metamaterials, leading to a description in which a given complementary element is conceptually replaced by a block of material within the waveguide whose effective permittivity and permeability result in equivalent scattering characteristics. The use of effective constitutive parameters for waveguide materials provides an alternative point of view for the design of waveguide- and microstrip-based components, including planar lenses and filters, as well as devices whose behavior derives from a bulk material response. In addition to imparting effective constitutive properties to the waveguide, complementary metamaterials also couple energy from waveguide modes into radiation. Thus, complementary waveguide metamaterials can be used to modify and optimize a variety of antenna structures.
Trend and Homogeneity Analysis of Precipitation in Iran
Directory of Open Access Journals (Sweden)
Majid Javari
2016-09-01
The main objective of this study is to examine trend and homogeneity through the analysis of rainfall variability patterns in Iran. The study presents a review of the application of homogeneity and seasonal time series analysis methods for forecasting rainfall variations. Trend and homogeneity methods are applied in time series analysis, from collecting rainfall data to evaluating results, in climate studies. For the homogeneity analysis of monthly, seasonal and annual rainfall, homogeneity tests were applied at 140 stations over the 1975–2014 period. The homogeneity of the monthly and annual rainfall at each station was studied using the autocorrelation (ACF) and the von Neumann (VN) tests at a significance level of 0.05. In addition, the nature of the monthly and seasonal rainfall series in Iran was studied using the Kruskal-Wallis (KW) test, the Thumb test (TT), and the least squares regression (LSR) test at a significance level of 0.05. The results indicate that the seasonal patterns of rainfall exhibit considerable diversity across Iran. Rainfall seasonality is generally less spatially coherent than temporal patterns in Iran. The seasonal variations of rainfall decreased significantly throughout eastern and central Iran, but increased in the west and north of Iran during the studied interval. Comparisons between the variation patterns and the seasonal rainfall series reveal that rainfall variability can be predicted by the non-trended and trended patterns.
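The von Neumann test used above is simple enough to sketch directly. For an independent, homogeneous series the ratio of squared successive differences to squared deviations has expectation 2; values well below 2 suggest a break or trend. An illustrative implementation (the function name is ours; critical values at the 0.05 level would come from published tables):

```python
import numpy as np

def von_neumann_ratio(series):
    """Von Neumann ratio N = sum((x[i+1] - x[i])^2) / sum((x[i] - mean)^2).
    E[N] = 2 for a homogeneous (independent) series; N well below 2
    indicates a break or trend, N well above 2 indicates oscillation."""
    x = np.asarray(series, dtype=float)
    return float(np.sum(np.diff(x) ** 2) / np.sum((x - x.mean()) ** 2))
```

A step-change series (e.g. fifty zeros followed by fifty tens) gives a ratio near 0, while a strictly alternating series gives a ratio near 4.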
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
Badin, J.S.; DiPietro, J.P. [Energetics, Inc., Columbia, MD (United States)
1995-09-01
The DOE Hydrogen Program is supporting research, development, and demonstration activities to overcome the barriers to the integration of hydrogen into the Nation's energy infrastructure. Much work is required to gain acceptance of hydrogen energy system concepts and to develop them for implementation. A systems analysis database has been created that includes a formal documentation of technology characterization profiles and cost and performance information. Through a systematic and quantitative approach, system developers can understand and address important issues and thereby assure effective and timely commercial implementation. This project builds upon and expands the previously developed and tested pathway model and provides the basis for a consistent and objective analysis of all hydrogen energy concepts considered by the DOE Hydrogen Program Manager. This project can greatly accelerate the development of a system by minimizing the risk of costly design evolutions and by stimulating discussions, feedback, and coordination among key players, allowing them to assess the analysis, evaluate the trade-offs, and address any emerging problem areas. Specific analytical studies will result in the validation of the competitive feasibility of the proposed system and identify system development needs. Systems that are investigated include hydrogen bromine electrolysis, municipal solid waste gasification, electro-farming (biomass gasifier and PEM fuel cell), a wind/hydrogen hybrid system for remote sites, home electrolysis and alternate infrastructure options, renewable-based electrolysis to fuel a PEM fuel cell vehicle fleet, and geothermal energy used to produce hydrogen. These systems are compared to conventional and benchmark technologies. Interim results and findings are presented. Independent analyses emphasize quality, integrity, objectivity, a long-term perspective, corporate memory, and the merging of technical, economic, operational, and programmatic expertise.
Systems analysis - independent analysis and verification
Energy Technology Data Exchange (ETDEWEB)
DiPietro, J.P.; Skolnik, E.G.; Badin, J.S. [Energetics, Inc., Columbia, MD (United States)
1996-10-01
The Hydrogen Program of the U.S. Department of Energy (DOE) funds a portfolio of activities ranging from conceptual research to pilot plant testing. The long-term research projects support DOE's goal of a sustainable, domestically based energy system, and the development activities are focused on hydrogen-based energy systems that can be commercially viable in the near-term. Energetics develops analytic products that enable the Hydrogen Program Manager to assess the potential for near- and long-term R&D activities to satisfy DOE and energy market criteria. This work is based on a pathway analysis methodology. The authors consider an energy component (e.g., hydrogen production from biomass gasification, hybrid hydrogen internal combustion engine (ICE) vehicle) within a complete energy system. The work involves close interaction with the principal investigators to ensure accurate representation of the component technology. Comparisons are made with the current cost and performance of fossil-based and alternative renewable energy systems, and sensitivity analyses are conducted to determine the effect of changes in cost and performance parameters on the projects' viability.
Solar Array Verification Analysis Tool (SAVANT) Developed
Bailey, Sheila G.; Long, Kienwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert
1999-01-01
Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment WorkBench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.
Missing data analysis and homogeneity test for Turkish precipitation series
Indian Academy of Sciences (India)
Mahmut Firat; Fatih Dikbas; A Cem Koç; Mahmud Gungor
2010-12-01
In this study, missing value analysis and homogeneity tests were conducted for 267 precipitation stations throughout Turkey. For this purpose, the monthly and annual total precipitation records at stations operated by the Turkish State Meteorological Service (DMI) from 1968 to 1998 were considered. At these stations, the precipitation records for each month were investigated separately, and stations with missing values for too many years were eliminated. The missing values at the remaining stations were completed by the Expectation Maximization (EM) method using the precipitation records of the nearest gauging station. In this analysis, 38 stations were eliminated because they had missing values for more than 5 years, 161 stations had no missing values, and missing precipitation values were completed at the remaining 68 stations. From this analysis, annual total precipitation data were obtained using the monthly values. These data should be hydrologically and statistically reliable for later hydrological, meteorological and climate change modelling and forecasting studies. For this reason, the Standard Normal Homogeneity Test (SNHT), the (Swed–Eisenhart) Runs Test and the Pettitt homogeneity test were applied to the annual total precipitation data at 229 gauging stations from 1968 to 1998. The results of each testing method were evaluated separately at a significance level of 95% and the inhomogeneous years were determined. With the application of the aforementioned methods, inhomogeneity was detected at 50 stations, whose natural structure had deteriorated, while the other 179 stations were found to be homogeneous.
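Of the tests named above, the Pettitt change-point test is straightforward to sketch. An illustrative numpy version, using the widely quoted approximation p ≈ 2·exp(−6K²/(n³+n²)) for the two-sided significance probability; the function name is ours:

```python
import numpy as np

def pettitt_test(series):
    """Pettitt change-point test: returns the most probable break index
    and the approximate two-sided significance probability."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    # U_t = sum over i <= t, j > t of sign(x_j - x_i)
    sgn = np.sign(x[None, :] - x[:, None])          # sgn[i, j] = sign(x_j - x_i)
    u = np.array([sgn[: t + 1, t + 1 :].sum() for t in range(n - 1)])
    k = np.abs(u).max()                             # test statistic K
    t_break = int(np.abs(u).argmax())               # last index before the break
    p_value = 2.0 * np.exp(-6.0 * k**2 / (n**3 + n**2))
    return t_break, float(min(p_value, 1.0))
```

A series with a clear step change (e.g. twenty dry years followed by twenty wet ones) yields the break at the step and a p-value far below 0.05, which is how a station would be flagged inhomogeneous.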
Frenod, Emmanuel
2013-01-01
In this note, a classification of Homogenization-Based Numerical Methods and (in particular) of Numerical Methods that are based on the Two-Scale Convergence is done. In this classification stand: Direct Homogenization-Based Numerical Methods; H-Measure-Based Numerical Methods; Two-Scale Numerical Methods and TSAPS: Two-Scale Asymptotic Preserving Schemes.
ECCS flow verification to support transient analysis
Energy Technology Data Exchange (ETDEWEB)
Kovach, C.; Jacobs, R.H.; Ballard, J.E. [Commonwealth Edison Co., Chicago, IL (United States). Nuclear Fuel Services Dept.
1994-12-31
The RETRAN code has been used to develop a model of the Emergency Core Cooling System (ECCS). The model was developed in order to provide conservative injection flow data to be used in various LOCA and non-LOCA analyses and evaluations and to ensure that ECCS pump runout does not occur. The analyses were also needed in order to address a number of ECCS performance issues identified by Westinghouse. These issues include how previous analyses modeled miniflow, RCP seal injection, ECCS branch line resistance, pump suction boost during recirculation, injection line flow imbalances, and, of particular importance, ECCS flow measurement inaccuracies. In turn, these issues directly impact pump runout concerns, Technical Specification verification, and ECCS injection flow during transient conditions. The RETRAN ECCS model has proven to be quite versatile, easy to use, and requires only minimal information about the physical construction and performance of the ECCS system.
Analysis and Transformation Tools for Constrained Horn Clause Verification
DEFF Research Database (Denmark)
Kafle, Bishoksan; Gallagher, John Patrick
2014-01-01
The aim is to investigate the use of a combination of off-the-shelf techniques from the literature in the analysis and transformation of Constraint Logic Programs (CLPs) to solve challenging CHC verification problems. We find that many problems can be solved using a combination of tools based on well-known techniques from abstract interpretation, semantics-preserving transformations, program specialisation and query-answer transformations. This gives insights into the design of automatic, more general CHC verification tools based on a library of components.
Homogenization of soil properties map by Principal Component Analysis
Valverde Arias, Omar; Garrido, Alberto; Villeta, Maria; Tarquis, Ana Maria
2016-04-01
It is widely known that extreme climatic phenomena are occurring with greater intensity and frequency. This fact has put more pressure on farming, making it very important for governments and institutions to implement agricultural risk management policies. One of the main strategies is to transfer risk through agricultural insurance. Index-based agricultural insurance has gained importance in the last decade; it compares measured index values with a defined threshold that triggers damage losses. However, index-based insurance cannot rest on an isolated measurement. It must be integrated into a complete monitoring system that uses many sources of information and tools, for example index influence areas, crop production risk maps, crop yields, and claim statistics. To establish an index influence area, secondary information is needed that delineates homogeneous climatic and soil areas; within each homogeneous class, index measurements on the crops of interest will be similar, thereby reducing basis risk. An efficient method is needed to obtain homogeneous areas that does not depend solely on expert criteria and that could be widely used. For this reason, this study assesses two conventional agricultural and geographic methods (control and climatic maps) based on expert criteria, and one classical statistical method of multi-factorial analysis (factorial map), all of them used to homogenize soil and climatic characteristics. The resulting maps were validated by agricultural and spatial analysis, with very good results for the statistical method (factorial map), which proves to be an efficient and accurate method that could be used for similar purposes.
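The factorial-map idea — reducing many correlated soil and climate variables to a few principal components and then grouping sites into homogeneous classes — can be sketched as follows. This is an illustrative pipeline, not the study's actual procedure; the deterministic farthest-point initialization for the clustering step is our assumption:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Principal component scores of standardized variables (SVD-based).
    X: (n_sites, n_vars) array of soil/climate measurements."""
    X = np.asarray(X, dtype=float)
    Z = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each variable
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    return Z @ Vt[:n_components].T

def kmeans_zones(scores, k=2, iters=50):
    """Tiny k-means on PCA scores to delimit homogeneous zones,
    with a deterministic farthest-point initialization."""
    centers = [scores[0]]
    for _ in range(k - 1):
        d = np.min([((scores - c) ** 2).sum(-1) for c in centers], axis=0)
        centers.append(scores[int(np.argmax(d))])
    centers = np.array(centers)
    for _ in range(iters):
        labels = np.argmin(((scores[:, None, :] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = scores[labels == j].mean(axis=0)
    return labels
```

Sites falling in the same cluster form one homogeneous class, within which index measurements on a given crop are expected to behave similarly.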
Analysis of spectral methods for the homogeneous Boltzmann equation
Filbet, Francis
2011-04-01
The development of accurate and fast algorithms for the Boltzmann collision integral and their analysis represent a challenging problem in scientific computing and numerical analysis. Recently, several works were devoted to the derivation of spectrally accurate schemes for the Boltzmann equation, but very few of them were concerned with the stability analysis of the method. In particular there was no result of stability except when the method was modified in order to enforce the positivity preservation, which destroys the spectral accuracy. In this paper we propose a new method to study the stability of homogeneous Boltzmann equations perturbed by smoothed balanced operators which do not preserve positivity of the distribution. This method takes advantage of the "spreading" property of the collision, together with estimates on regularity and entropy production. As an application we prove stability and convergence of spectral methods for the Boltzmann equation, when the discretization parameter is large enough (with explicit bound). © 2010 American Mathematical Society.
Sensitivity analysis for reliable design verification of nuclear turbosets
Energy Technology Data Exchange (ETDEWEB)
Zentner, Irmela, E-mail: irmela.zentner@edf.f [Lamsid-Laboratory for Mechanics of Aging Industrial Structures, UMR CNRS/EDF, 1, avenue Du General de Gaulle, 92141 Clamart (France); EDF R and D-Structural Mechanics and Acoustics Department, 1, avenue Du General de Gaulle, 92141 Clamart (France); Tarantola, Stefano [Joint Research Centre of the European Commission-Institute for Protection and Security of the Citizen, T.P. 361, 21027 Ispra (Italy); Rocquigny, E. de [Ecole Centrale Paris-Applied Mathematics and Systems Department (MAS), Grande Voie des Vignes, 92 295 Chatenay-Malabry (France)
2011-03-15
In this paper, we present an application of sensitivity analysis for design verification of nuclear turbosets. Before the acquisition of a turbogenerator, energy power operators perform independent design assessment in order to assure safe operating conditions of the new machine in its environment. Variables of interest are related to the vibration behaviour of the machine: its eigenfrequencies and dynamic sensitivity to unbalance. In the framework of design verification, epistemic uncertainties are preponderant. This lack of knowledge is due to nonexistent or imprecise information about the design, as well as to the interaction of the rotating machinery with supporting structures and substructures. Sensitivity analysis enables the analyst to rank sources of uncertainty with respect to their importance and, possibly, to screen out insignificant sources of uncertainty. Further studies, if necessary, can then focus on predominant parameters. In particular, the constructor can be asked for detailed information only about the most significant parameters.
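A common way to rank uncertainty sources as described is variance-based (Sobol) sensitivity analysis. A minimal pick-freeze sketch — illustrative only, since the abstract does not state which estimator the authors used; the function names are ours:

```python
import numpy as np

def first_order_sobol(model, sampler, n=20000, seed=0):
    """Pick-freeze estimator of first-order Sobol indices.
    model: f(X) evaluated row-wise on an (n, k) input matrix;
    sampler(rng, n): draws n independent input vectors."""
    rng = np.random.default_rng(seed)
    A, B = sampler(rng, n), sampler(rng, n)
    fA = model(A)
    var = fA.var()
    indices = []
    for i in range(A.shape[1]):
        ABi = B.copy()
        ABi[:, i] = A[:, i]          # freeze input i at A's values
        # Cov(f(A), f(AB_i)) estimates the variance explained by input i
        indices.append(np.cov(fA, model(ABi))[0, 1] / var)
    return np.array(indices)
```

For a linear model f = 4·x0 + 2·x1 + x2 with independent uniform inputs, the indices converge to 16/21, 4/21 and 1/21, correctly ranking x0 as the dominant source; inputs with indices near zero are candidates for screening out.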
Analysis and Verification of Service Interaction Protocols - A Brief Survey
Salaün, Gwen
2010-01-01
Modeling and analysis of interactions among services is a crucial issue in Service-Oriented Computing. Composing Web services is a complicated task which requires techniques and tools to verify that the new system will behave correctly. In this paper, we first overview some formal models proposed in the literature to describe services. Second, we give a brief survey of verification techniques that can be used to analyse services and their interaction. Last, we focus on the realizability and conformance of choreographies.
Stochastic analysis of laminated composite plate considering stochastic homogenization problem
Institute of Scientific and Technical Information of China (English)
S. SAKATA; K. OKUDA; K. IKEDA
2015-01-01
This paper discusses a multiscale stochastic analysis of a laminated composite plate consisting of unidirectional fiber reinforced composite laminae. In particular, the influence of a microscopic random variation of the elastic properties of the component materials on the mechanical properties of the laminated plate is investigated. Laminated composites are widely used in civil engineering, and therefore multiscale stochastic analysis of laminated composites should be performed for reliability evaluation of a composite civil structure. This study deals with the stochastic response of a laminated composite plate to the microscopic random variation, in addition to a random variation of fiber orientation in each lamina, and the stochastic properties of the mechanical responses of the laminated plate are investigated. The Halpin-Tsai formula and homogenization theory-based finite element analysis are employed for the estimation of the effective elastic properties of a lamina, and the classical laminate theory is employed for the analysis of the laminated plate. Monte Carlo simulation and the first-order second moment method with sensitivity analysis are employed for the stochastic analysis. From the numerical results, the importance of multiscale stochastic analysis for reliability evaluation of a laminated composite structure and the applicability of the sensitivity-based approach are discussed.
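The first-order second moment (FOSM) step mentioned above propagates input means and standard deviations through a response function via its gradient: mean ≈ g(μ) and variance ≈ Σ (∂g/∂xᵢ)²·σᵢ² for independent inputs. A minimal sketch (the finite-difference step and function name are ours; in the paper's setting g would be, e.g., an effective laminate stiffness as a function of constituent properties):

```python
import numpy as np

def fosm(g, mu, sigma, h=1e-6):
    """First-order second-moment estimate of the mean and standard
    deviation of g(x) for independent inputs with means mu and
    standard deviations sigma. Gradients by central finite differences."""
    mu = np.asarray(mu, dtype=float)
    sigma = np.asarray(sigma, dtype=float)
    grad = np.zeros_like(mu)
    for i in range(len(mu)):
        step = np.zeros_like(mu)
        step[i] = h
        grad[i] = (g(mu + step) - g(mu - step)) / (2 * h)
    mean = float(g(mu))
    std = float(np.sqrt(np.sum((grad * sigma) ** 2)))
    return mean, std
```

Unlike Monte Carlo simulation, FOSM needs only a handful of model evaluations, which is why the two are often run side by side: Monte Carlo to validate, FOSM (with its sensitivity by-product) for cheap screening.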
Homogeneous protein analysis by magnetic core-shell nanorod probes
Schrittwieser, Stefan
2016-03-29
Studying protein interactions is of vital importance both to fundamental biology research and to medical applications. Here, we report on the experimental proof of a universally applicable label-free homogeneous platform for rapid protein analysis. It is based on optically detecting changes in the rotational dynamics of magnetically agitated core-shell nanorods upon their specific interaction with proteins. By adjusting the excitation frequency, we are able to optimize the measurement signal for each analyte protein size. In addition, due to the locking of the optical signal to the magnetic excitation frequency, background signals are suppressed, thus allowing exclusive studies of processes at the nanoprobe surface only. We study target proteins (soluble domain of the human epidermal growth factor receptor 2 - sHER2) specifically binding to antibodies (trastuzumab) immobilized on the surface of our nanoprobes and demonstrate direct deduction of their respective sizes. Additionally, we examine the dependence of our measurement signal on the concentration of the analyte protein, and deduce a minimally detectable sHER2 concentration of 440 pM. For our homogeneous measurement platform, good dispersion stability of the applied nanoprobes under physiological conditions is of vital importance. To that end, we support our measurement data by theoretical modeling of the total particle-particle interaction energies. The successful implementation of our platform offers scope for applications in biomarker-based diagnostics as well as for answering basic biology questions.
Time Optimal Reachability Analysis Using Swarm Verification
DEFF Research Database (Denmark)
Zhang, Zhengkui; Nielsen, Brian; Larsen, Kim Guldstrand
2016-01-01
Time optimal reachability analysis employs model-checking to compute goal states that can be reached from an initial state with a minimal accumulated time duration. The model-checker may produce a corresponding diagnostic trace which can be interpreted as a feasible schedule for many scheduling a... search strategies. We develop four swarm algorithms and evaluate them with four models in terms of scalability and time and memory consumption. Three of these cooperate by exchanging costs of intermediate solutions to prune the search using a branch-and-bound approach. Our results show that swarm algorithms work much faster than sequential algorithms, and especially two using combinations of random-depth-first and breadth-first show very promising performance.
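The cost-exchanging branch-and-bound idea described above can be illustrated on a toy timed graph, a stand-in for the timed-automata models the paper targets. The graph, costs, and two-worker "swarm" below are invented for illustration.

```python
from collections import deque

# Toy timed graph: edges carry durations; the goal is to reach 'G'
# with minimal accumulated time.
GRAPH = {
    'A': [('B', 2), ('C', 5)],
    'B': [('D', 2), ('G', 9)],
    'C': [('G', 3)],
    'D': [('G', 4)],
    'G': [],
}

def search(order, best=float('inf')):
    """Explore states, pruning any path whose cost already meets `best`
    (branch-and-bound). `order` selects DFS (pop right) or BFS (pop left)."""
    frontier = deque([('A', 0)])
    while frontier:
        state, cost = frontier.pop() if order == 'dfs' else frontier.popleft()
        if cost >= best:          # bound: prune dominated paths
            continue
        if state == 'G':
            best = cost           # new incumbent, shared with other workers
            continue
        for nxt, dt in GRAPH[state]:
            frontier.append((nxt, cost + dt))
    return best

# A "swarm" of two workers exchanging the incumbent cost between runs.
best = search('bfs')
best = search('dfs', best)
```

In a real swarm the workers run concurrently and broadcast improved incumbents as they are found; here the exchange is simulated by passing the bound from one run to the next.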
Homogenization in strength and durability analysis of reinforced tooth filling
Mikhailov, SE; Orlik, J
2002-01-01
An asymptotic homogenization procedure is employed to obtain the effective elastic properties of a composite tooth filling, the homogenized macro-stress field, and a first approximation to the micro-stress field from the properties of the components and the applied macro-loads. Using the approximate micro-stress field, non-local initial strength and fatigue durability macro-conditions for the composite filling material are expressed in terms of the homogenized macro-stresses. An illustrative example wi...
A homogeneous analysis of disks around brown dwarfs
Liu, Y; Bayo, A; Nielbock, M; Wang, H
2015-01-01
We re-analyzed the Herschel/PACS data of a sample of 55 brown dwarfs (BDs) and very low mass stars with spectral types ranging from M5.5 to L0. We investigated the dependence of disk structure on the mass of the central object in the substellar regime based on a homogeneous analysis of Herschel data from flux density measurements to spectral energy distribution (SED) modeling. A systematic comparison between the derived disk properties and those of sun-like stars shows that the disk flaring of BDs and very low mass stars is generally smaller than that of their higher mass counterparts, the disk mass is orders of magnitude lower than the typical value found in T Tauri stars, and the disk scale heights are comparable in both sun-like stars and BDs. We further divided our sample into an early-type brown dwarf (ETBD) group and a late-type brown dwarf (LTBD) group by using spectral type (=M8) as the border criterion. We systematically compared the modeling results from Bayesian analysis between these two groups, a...
Analysis of intra-genomic GC content homogeneity within prokaryotes
Directory of Open Access Journals (Sweden)
Bohlin Jon
2010-08-01
Background: Bacterial genomes possess varying GC content (total guanines (Gs) and cytosines (Cs) per total of the four bases within the genome), but within a given genome, GC content can vary locally along the chromosome, with some regions significantly more or less GC-rich than on average. We have examined how the GC content varies within microbial genomes to assess whether this property can be associated with certain biological functions related to the organism's environment and phylogeny. We utilize a new quantity, GCVAR, the intra-genomic GC content variability with respect to the average GC content of the total genome. A low GCVAR indicates intra-genomic GC homogeneity; a high GCVAR, heterogeneity. Results: The regression analyses indicated that GCVAR was significantly associated with domain (i.e. archaea or bacteria), phylum, and oxygen requirement. GCVAR was significantly higher among anaerobes than both aerobic and facultative microbes. Although an association has previously been found between mean genomic GC content and oxygen requirement, our analysis suggests that no such association exists when phylogenetic bias is accounted for. A significant association between GCVAR and mean GC content was also found, but it appears to be non-linear and varies greatly among phyla. Conclusions: Our findings show that GCVAR is linked with oxygen requirement, while mean genomic GC content is not. We therefore suggest that GCVAR should be used as a complement to mean GC content.
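A GCVAR-like statistic can be sketched as follows. The published definition may differ in detail; the window size and the use of mean absolute deviation here are assumptions for illustration.

```python
def gc(seq):
    """Fraction of G and C bases in a sequence."""
    return (seq.count('G') + seq.count('C')) / len(seq)

def gcvar(genome, window=4):
    """Illustrative GCVAR-like statistic: mean absolute deviation of
    local (windowed) GC content from the genome-wide mean GC.
    (The paper's exact GCVAR definition may differ; this is a sketch.)"""
    mean_gc = gc(genome)
    locals_ = [gc(genome[i:i + window])
               for i in range(0, len(genome) - window + 1, window)]
    return sum(abs(g - mean_gc) for g in locals_) / len(locals_)

homogeneous = "GCAT" * 25            # GC evenly spread: low GCVAR
heterogeneous = "G" * 50 + "A" * 50  # GC clustered: high GCVAR
low = gcvar(homogeneous)
high = gcvar(heterogeneous)
```

Both toy genomes have the same mean GC content (0.5), yet the statistic separates them, which is exactly the point of complementing mean GC with a variability measure.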
Liu, Hai-feng; Yao, Ming-fa; Jin, Chao; Zhang, Peng; Li, Zhe-ming; Zheng, Zun-qing
2010-10-01
To study the combustion reaction kinetics of homogeneous charge compression ignition (HCCI) under different port injection strategies and intake temperature conditions, tests were carried out on a modified single-cylinder optical engine using chemiluminescence spectroscopic analysis. The experimental conditions were: constant fuel mass; n-heptane fuel; engine speed controlled at 600 r x min(-1); inlet pressure of 0.1 MPa; and inlet temperatures of 95 degrees C and 125 degrees C, respectively. The chemiluminescence spectra show that the chemiluminescence is quite faint during low temperature heat release (LTHR), and these band spectra originate from formaldehyde (CH2O) chemiluminescence. During the later LTHR-negative temperature coefficient (NTC)-early high temperature heat release (HTHR) phase, these band spectra also originate from formaldehyde (CH2O) chemiluminescence. The CO--O* continuum is strong during HTHR, and radicals such as OH, HCO, CH and CH2O appear superimposed on this CO--O* continuum. After the HTHR, the chemiluminescence intensity is quite faint. In comparison to a start of injection (SOI) of -30 degrees ATDC, the chemiluminescence intensity is higher under the SOI = -300 degrees ATDC condition due to the more intense emission of the CO--O* continuum, and more HCO and OH radicals are formed, which also indicates a more intense combustion reaction. Similarly, a more intense CO--O* continuum and more HCO and OH radicals are emitted in the higher intake temperature case.
SiSn diodes: Theoretical analysis and experimental verification
Hussain, Aftab M.
2015-08-24
We report a theoretical analysis and experimental verification of change in band gap of silicon lattice due to the incorporation of tin (Sn). We formed SiSn ultra-thin film on the top surface of a 4 in. silicon wafer using thermal diffusion of Sn. We report a reduction of 0.1 V in the average built-in potential, and a reduction of 0.2 V in the average reverse bias breakdown voltage, as measured across the substrate. These reductions indicate that the band gap of the silicon lattice has been reduced due to the incorporation of Sn, as expected from the theoretical analysis. We report the experimentally calculated band gap of SiSn to be 1.11 ± 0.09 eV. This low-cost, CMOS compatible, and scalable process offers a unique opportunity to tune the band gap of silicon for specific applications.
Analysis of stability of a homogeneous state of anisotropic plasma
Energy Technology Data Exchange (ETDEWEB)
Zakharov, V. Yu., E-mail: vladiyuz@mail.ru; Chernova, T. G., E-mail: chernova-tg@yandex.ru; Stepanov, S. E., E-mail: stepanov@bmstu-kaluga.ru [Bauman Moscow State Technical University, Kaluga Branch (Russian Federation)
2015-04-15
Small-amplitude waves in collisionless magnetized plasma are considered in the framework of one-fluid anisotropic magnetohydrodynamics with allowance for the anisotropy of the pressure and thermal flux. Stability of a homogeneous plasma state is analyzed using an eighth-order dispersion relation. Restrictions on the parameters of the homogeneous state at which the dispersion relation has no complex roots at any value of the angle between the wave vector and the unperturbed magnetic field are obtained. The applied method also makes it possible to determine the types of unstable waves.
Directory of Open Access Journals (Sweden)
Vladimir Jurak
2006-12-01
A great deal of data determining the characteristics of engineering soils was collected in the course of geotechnical explorations conducted for the purpose of building two industrial facilities near Kutina (Artificial Fertilizer Factory II / Tvornica umjetnih gnojiva II and Soot Factory II / Čađara II). Wishing to establish the degree of homogeneity/heterogeneity beneath the facilities, we decided to test the hypothesis regarding the homogeneity of the a priori defined geotechnical mediums and their geotechnical similarity. The relationship of the two mediums under observation is superpositional; they are lithologically similar but genetically different. They are represented through an engineering geological model reaching a depth of forty meters. Following basic statistical analysis of the identification and geotechnical parameters and the use of several statistical tests, we were able to reach an engineering judgment on the basis of statistical conclusions. We found that, from a statistical point of view, both geotechnical mediums are mostly homogeneous or, from an engineering point of view, ''quasi-homogeneous''. The comparison of the two mediums showed no statistically significant difference in certain geotechnical parameters. It follows, therefore, that the unification of the superpositioned mediums into a physically united halfspace located under the facility is acceptable (the paper is published in Croatian).
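The kind of two-sample homogeneity test used to compare the two mediums can be sketched with Welch's t statistic. The specific tests and data of the paper are not reproduced here; the parameter values below are hypothetical.

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for comparing two samples of a geotechnical
    parameter measured in two superpositioned mediums."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical water-content values (%) from the two mediums.
medium_1 = [21.0, 22.5, 20.8, 21.9, 22.1, 21.4]
medium_2 = [21.3, 22.0, 21.1, 22.4, 21.7, 21.6]
t = welch_t(medium_1, medium_2)
# |t| well below ~2 suggests no significant difference at the usual
# 5% level: the two mediums look "quasi-homogeneous" for this parameter.
```

A full analysis would repeat such tests parameter by parameter (and check distributional assumptions) before pooling the two mediums into one halfspace.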
Mercer, C. N.; Roberge, J.; Todorov, T. I.; Hofstra, A. H.
2013-12-01
Melt inclusions hosted in quartz can provide the only direct information about the pressure, temperature, and melt composition of pre-eruptive rhyolitic magmas, many of which are the precursors to mineralizing aqueous fluids [1]. With ideal, rapidly-quenched pumice samples, analysis of glassy quartz-hosted melt inclusions is relatively straightforward. These data can be directly interpreted to represent snapshots of metal and volatile concentrations during magma crystallization and degassing. However, most ore deposit-related igneous samples are non-ideal; being older, potentially hydrothermally altered, and often crystallized due to slow cooling in subvolcanic regions (e.g., porphyry-type deposits). In this case, analysis of crystalline melt inclusions in quartz is not straightforward and resulting data must be meticulously examined before interpretation. Many melt inclusions may have experienced post-entrapment modifications [1] such as diffusion of elements (e.g., H, Li, Na, Ag, Cu) [2], which may lead to changes in measured oxygen fugacity. Slowly cooled inclusions may crystallize, producing a heterogeneous "micro-rock" that cannot be analyzed by spectroscopic methods or electron microprobe. While crystallized inclusions can be homogenized in a high-temperature furnace, many new problems may arise such as inclusion decrepitation [3], diffusion of elements [2], and incorporation of too little or too much Si from the inclusion rim or host crystal. However, if unwanted homogenization effects are minimized by choosing ideal experimental conditions, then these homogenized inclusions can be analyzed by traditional FTIR and electron microprobe methods. The electron microprobe data from homogenized inclusions can be used as accurate internal standards for laser ablation-ICP-MS data reduction. Alternatively, crystalline inclusions can be directly analyzed for major and trace elements by laser ablation-ICP-MS [4], which considerably reduces sample preparation time, but
Kinematic Analysis and Experimental Verification on the Locomotion of Gecko
Institute of Scientific and Technical Information of China (English)
Woochul Nam; TaeWon Seo; Byungwook Kim; Dongsu Jeon; Kyu-Jin Cho; Jongwon Kim
2009-01-01
This paper presents a kinematic analysis of the locomotion of a gecko and experimental verification of the kinematic model. Kinematic analysis is important for parameter design, dynamic analysis, and optimization in biomimetic robot research. The proposed kinematic analysis can simulate, without iteration, the locomotion of a gecko while satisfying the constraint conditions that maintain the positions of the contacting feet on the surface, so the method is well suited to analyzing the climbing motion of a quadruped mechanism in real-time applications. The kinematic model of the gecko consists of four legs based on 7-degree-of-freedom spherical-revolute-spherical joints and two revolute joints in the waist. The motion of the kinematic model is simulated based on measurement data for each joint and reproduces the investigated real gecko's motion. The analysis solves the forward kinematics by considering the model as a combination of closed and open serial mechanisms under the condition that the contact positions of the attached feet on the ground are maintained. The motions of each joint are validated by comparison with the experimental results. In addition to the measured gait, three other gaits are simulated based on the kinematic model, and the maximum strides of each gait are calculated by workspace analysis. The results can be used in biomimetic robot design and motion planning.
Infinite dimensional spherical analysis and harmonic analysis for groups acting on homogeneous trees
DEFF Research Database (Denmark)
Axelgaard, Emil
In this thesis, we study groups of automorphisms for homogeneous trees of countable degree by using an inductive limit approach. The main focus is the thorough discussion of two Olshanski spherical pairs consisting of automorphism groups for a homogeneous tree and a homogeneous rooted tree... of the groups, the so-called irreducible tame representations. We prove the existence of irreducible non-tame representations by constructing a compactification of the boundary of the tree - an object which until now has not played any role in the analysis of automorphism groups for trees which are not locally finite. Finally, we discuss conditionally positive definite functions on the groups and use the generalized Bochner-Godement theorem for Olshanski spherical pairs to prove Levy-Khinchine formulas for both of the considered pairs.
Triple Modular Redundancy verification via heuristic netlist analysis
Directory of Open Access Journals (Sweden)
Giovanni Beltrame
2015-08-01
Triple Modular Redundancy (TMR) is a common technique to protect memory elements in digital processing systems subject to radiation effects (such as in space, at high altitude, or near nuclear sources). This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification) using heuristic analysis. The purpose is to detect any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully revealing a number of unprotected memory elements, namely 351 flip-flops.
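The heuristic check can be sketched on a toy netlist representation. The cell types, names, and data structure below are invented for illustration; a production flow would parse a real netlist format and handle many more corner cases.

```python
# Toy netlist: cell name -> (cell type, input nets). A correctly protected
# register appears as three replicas feeding one majority voter.
NETLIST = {
    'ff_a_r0': ('DFF', ['d_a']), 'ff_a_r1': ('DFF', ['d_a']),
    'ff_a_r2': ('DFF', ['d_a']),
    'vote_a':  ('VOTER', ['ff_a_r0', 'ff_a_r1', 'ff_a_r2']),
    'ff_b_r0': ('DFF', ['d_b']),   # unprotected: replicas were optimized away
}

def unprotected_flipflops(netlist):
    """Heuristic: a flip-flop counts as protected iff it drives a VOTER
    together with exactly two other replicas; report all the rest."""
    voted = set()
    for cell, (kind, ins) in netlist.items():
        if kind == 'VOTER' and len(ins) == 3:
            voted.update(ins)
    return sorted(c for c, (kind, _) in netlist.items()
                  if kind == 'DFF' and c not in voted)
```

This mirrors the failure mode the paper targets: synthesis or optimization silently collapsing the three replicas back into one flip-flop, which the heuristic flags without any testbench.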
Identification of Homogeneous Hydrological Regions through Multivariate Analysis
Directory of Open Access Journals (Sweden)
Álvarez-Olguín G.
2011-07-01
Hydrological regionalization is used to transfer information from gauged catchments to ungauged river basins. However, to obtain reliable results, the basins involved must have similar hydrological behavior. The objective of this research was to identify hydrologically homogeneous regions in the Mixteca Oaxaqueña and surrounding areas. The area of study included 17 basins for which 20 climatic and physiographic variables potentially useful in the prediction of flow were quantified. The application of multivariate statistical techniques allowed us to identify three groups of hydrologically associated basins. A regional model was obtained to predict mean annual flow, which determined that the best predictive variables are the area and the average annual precipitation.
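A regional model of the form often used for such predictions, Q = a·A^b·P^c fitted by least squares in log space, can be sketched as follows. The functional form and the data are assumptions for illustration; the abstract does not specify the model equation.

```python
import math

def fit_power_law(areas, precips, flows):
    """Fit Q = a * A^b * P^c by least squares in log space
    (a common regional flow-model form; illustrative only)."""
    X = [[1.0, math.log(A), math.log(P)] for A, P in zip(areas, precips)]
    y = [math.log(Q) for Q in flows]
    n = 3
    # Normal equations (X^T X) beta = X^T y as an augmented matrix.
    M = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(n)]
         + [sum(X[k][i] * y[k] for k in range(len(X)))] for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for j in range(col, n + 1):
                M[r][j] -= f * M[col][j]
    beta = [0.0] * n
    for i in reversed(range(n)):
        beta[i] = (M[i][n] - sum(M[i][j] * beta[j]
                                 for j in range(i + 1, n))) / M[i][i]
    return math.exp(beta[0]), beta[1], beta[2]

areas = [10.0, 20.0, 50.0, 100.0, 200.0]        # km^2 (hypothetical)
precips = [800.0, 900.0, 1000.0, 700.0, 850.0]  # mm/yr (hypothetical)
flows = [0.01 * A ** 0.8 * P ** 0.5 for A, P in zip(areas, precips)]
a, b, c = fit_power_law(areas, precips, flows)
```

On the noiseless synthetic data the fit recovers the generating exponents exactly, which is a useful sanity check before applying such a model to real gauged basins.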
Verification and Validation of the General Mission Analysis Tool (GMAT)
Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.
2014-01-01
This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper; here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.
SAVANT: Solar Array Verification and Analysis Tool Demonstrated
Chock, Ricaurte
2000-01-01
The photovoltaics (PV) industry is now being held to strict specifications, such as end-of-life power requirements, that force them to overengineer their products to avoid contractual penalties. Such overengineering has been the only reliable way to meet such specifications. Unfortunately, it also results in a more costly process than is probably necessary. In our conversations with the PV industry, the issue of cost has been raised again and again. Consequently, the Photovoltaics and Space Environment Effects branch at the NASA Glenn Research Center at Lewis Field has been developing a software tool to address this problem. SAVANT, Glenn's tool for solar array verification and analysis, is in the technology demonstration phase. Ongoing work has proven that more efficient and less costly PV designs should be possible by using SAVANT to predict the on-orbit life-cycle performance. The ultimate goal of the SAVANT project is to provide a user-friendly computer tool to predict PV on-orbit life-cycle performance. This should greatly simplify the tasks of scaling and designing the PV power component of any given flight or mission. By being able to predict how a particular PV article will perform, designers will be able to balance mission power requirements (both beginning-of-life and end-of-life) with survivability concerns such as power degradation due to radiation and/or contamination. Recent comparisons with actual flight data from the Photovoltaic Array Space Power Plus Diagnostics (PASP Plus) mission validate this approach.
Design verification and performance analysis of Serial AXI Links in Broadcom System-on-Chip
Sarai, Simran Kaur
2014-01-01
Design verification is an essential step in the development of any product. Also referred to as qualification testing, design verification ensures that the product as designed is the same as the product as intended. In this project, design verification and performance analysis of Thin Advanced Extensible Interface Links (T-AXI) is conducted on a Broadcom SoC (System on Chip). T-AXI is a Broadcom proprietary bus that interfaces all the subsystems on the System-on-Chip (SoC) to the system me...
Modeling and Verification of Insider Threats Using Logical Analysis
DEFF Research Database (Denmark)
Kammuller, Florian; Probst, Christian W.
2017-01-01
and use a common trick from the formal verification of security protocols, showing that it is applicable to insider threats. We introduce briefly a three-step process of social explanation, illustrating that it can be applied fruitfully to the characterization of insider threats. We introduce the insider...
Homogeneity and heterogeneousness in European food cultures: An exploratory analysis
DEFF Research Database (Denmark)
Askegaard, Søren; Madsen, Tage Koed
One type of boundary rarely explored in international marketing, but of potentially vital importance to it, is the cultural boundaries dividing Europe into regions with individual cultural backgrounds and different consumption patterns. This paper explores such cultural patterns of food consumption based on information from an existing database originating from a 1989 pan-European life style survey questioning around 20,000 people in 16 European countries divided into 79 regions. A factor analysis reduced the number of variables from 138 to 41, discovering the latent factors structuring the Europeans' responses to questions about their food behaviour and preferences. On the basis of the factor variables, a cluster analysis was made in order to produce a picture of the groupings of the single regions, thus getting a picture of the pattern of European food...
The colour analysis method applied to homogeneous rocks
Directory of Open Access Journals (Sweden)
Halász Amadé
2015-12-01
Computer-aided colour analysis can facilitate cyclostratigraphic studies. Here we report on a case study involving the development of a digital colour analysis method for examination of the Boda Claystone Formation which is the most suitable in Hungary for the disposal of high-level radioactive waste. Rock type colours are reddish brown or brownish red, or any shade between brown and red. The method presented here could be used to differentiate similar colours and to identify gradual transitions between these; the latter are of great importance in a cyclostratigraphic analysis of the succession. Geophysical well-logging has demonstrated the existence of characteristic cyclic units, as detected by colour and natural gamma. Based on our research, colour, natural gamma and lithology correlate well. For core Ib-4, these features reveal the presence of orderly cycles with thicknesses of roughly 0.64 to 13 metres. Once the core has been scanned, this is a time- and cost-effective method.
Analysis of intra-genomic GC content homogeneity within prokaryotes
DEFF Research Database (Denmark)
Bohlin, J; Snipen, L; Hardy, S.P.
2010-01-01
Bacterial genomes possess varying GC content (total guanines (Gs) and cytosines (Cs) per total of the four bases within the genome) but within a given genome, GC content can vary locally along the chromosome, with some regions significantly more or less GC rich than on average. We have examined how the GC content varies within microbial genomes to assess whether this property can be associated with certain biological functions related to the organism's environment and phylogeny. We utilize a new quantity GCVAR, the intra-genomic GC content variability with respect to the average GC content... GCVAR was significantly higher among anaerobes than both aerobic and facultative microbes. Although an association has previously been found between mean genomic GC content and oxygen requirement, our analysis suggests that no such association exists when phylogenetic bias is accounted for. A significant association between GCVAR and mean GC content...
Pressure transient analysis for long homogeneous reservoirs using TDS technique
Energy Technology Data Exchange (ETDEWEB)
Escobar, Freddy Humberto [Universidad Surcolombiana, Av. Pastrana - Cra. 1, Neiva, Huila (Colombia); Hernandez, Yuly Andrea [Hocol S.A., Cra. 7 No 114-43, Floor 16, Bogota (Colombia); Hernandez, Claudia Marcela [Weatherford, Cra. 7 No 81-90, Neiva, Huila (Colombia)
2007-08-15
A significant number of well pressure tests are conducted in long, narrow reservoirs with closed and open extreme boundaries. It is desirable not only to appropriately identify these types of systems but also to develop an adequate and practical interpretation technique to determine their parameters and size, when possible. An accurate understanding of how the reservoir produces and the magnitude of producible reserves can lead to competent decisions and adequate reservoir management. So far, studies found for identification and determination of parameters for such systems are conducted by conventional techniques (semilog analysis) and semilog and log-log type-curve matching of pressure versus time. Type-curve matching is basically a trial-and-error procedure which may provide inaccurate results. Besides, a limitation in the number of type curves plays a negative role. In this paper, a detailed analysis of pressure derivative behavior for a vertical well in linear reservoirs with open and closed extreme boundaries is presented for the case of constant rate production. We studied each flow regime independently, especially the linear flow regime, since it is the most characteristic 'fingerprint' of these systems. We found that when the well is located at one of the extremes of the reservoir, a single linear flow regime develops once radial flow and/or wellbore storage effects have ended. When the well is located at a given distance from both extreme boundaries, the pressure derivative permits the identification of two linear flows toward the well, which has been called the 'dual-linear flow regime'. This is characterized by an increment of the intercept of the 1/2-slope line from π^0.5 to π with a consequent transition between these two straight lines. The identification of intersection points, lines, and characteristic slopes allows us to develop an interpretation technique without employing type-curve matching. This technique uses
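The 'fingerprint' idea of the TDS technique, reading flow regimes off the slope of the pressure-derivative curve on a log-log plot, can be sketched on synthetic data. The synthetic response and constants below are illustrative, not field data.

```python
import math

def log_log_slope(times, pressures):
    """Slope of the pressure derivative t*dP/dt on a log-log plot;
    a slope of 1/2 is the fingerprint of linear flow in the TDS method."""
    deriv = []
    for i in range(1, len(times) - 1):
        dp_dt = (pressures[i + 1] - pressures[i - 1]) / (times[i + 1] - times[i - 1])
        deriv.append((math.log(times[i]), math.log(times[i] * dp_dt)))
    (x0, y0), (x1, y1) = deriv[0], deriv[-1]
    return (y1 - y0) / (x1 - x0)

# Synthetic linear-flow response: the pressure drop grows with sqrt(t).
ts = [2.0 ** k for k in range(12)]
ps = [5.0 * math.sqrt(t) for t in ts]
slope = log_log_slope(ts, ps)
```

Recovering the 1/2 slope from the derivative curve is the first identification step; the intercepts of the 1/2-slope lines then feed the TDS parameter equations.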
Evaluation of in-cylinder mixture homogeneity in a diesel HCCI engine – A CFD analysis
Directory of Open Access Journals (Sweden)
N. Ramesh
2016-06-01
Performance and emission characteristics of HCCI engines depend on achieving a good in-cylinder homogeneous mixture. The formation of the in-cylinder mixture depends on many engine parameters, which need optimization. In addition, as of now, there is no direct way to clearly describe and estimate in-cylinder mixture homogeneity. In CFD analysis, it is evaluated indirectly using contour plots of equivalence ratio, the variation of in-cylinder pressure with crank angle, heat release curves, or a comparison of emissions. In this study, an attempt has been made to develop methods to evaluate in-cylinder mixture homogeneity by CFD analysis using AVL-FIRE. Here, global and local in-cylinder fuel distribution and an in-cylinder fuel distribution index are used to evaluate the mixture homogeneity. In order to evaluate these methods, mixture homogeneities in two cases of fuel injection, with 7- and 10-hole injectors, are compared. We found that the global fuel distribution (GFD) plot allows direct quantitative assessment of the mixture distribution in various equivalence-ratio ranges. However, the GFD method cannot explain the spatial variation of fuel distribution and does not provide mixture homogeneity on a simple scale. In the method of plotting the fuel distribution index, the overall homogeneity is evaluated on a scale of 0 to 1 in a simple way. In the method of plotting local fuel distribution (LFD), the spatial variation of mixture homogeneity is well defined in local zones in both the radial and axial directions. Further, these proposed methods help to reduce the computation time significantly.
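A fuel distribution index on a 0-to-1 scale can be sketched from a sampled equivalence-ratio field. The paper's exact index definition is not reproduced here; using the coefficient of variation is an assumption for illustration.

```python
import statistics

def homogeneity_index(phi_field):
    """Illustrative fuel-distribution index on a 0-to-1 scale:
    1 means a perfectly uniform equivalence ratio, 0 a highly
    stratified charge. (The paper's index may be defined differently.)"""
    mean_phi = statistics.mean(phi_field)
    cov = statistics.pstdev(phi_field) / mean_phi  # coefficient of variation
    return max(0.0, 1.0 - cov)

# Hypothetical equivalence-ratio samples from CFD cells.
uniform = [0.40, 0.41, 0.39, 0.40, 0.40]     # well-mixed charge
stratified = [0.05, 0.10, 0.90, 0.70, 0.25]  # poorly mixed charge
```

Both fields have the same mean equivalence ratio, so only a variability-based index, not the mean alone, distinguishes the well-mixed charge from the stratified one.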
Verification of ESPI Stress Analysis by Means Of FEM
Directory of Open Access Journals (Sweden)
Luboš PEČENKA
2013-06-01
The main goal of this paper is verification of the accuracy of the ESPI strain sensor Dantec Dynamics Q-100. ESPI is a digital holographic interferometry method for measuring small displacements and deformations. It is necessary to consider isotropic material properties for stress calculations using ESPI systems. All stress measurements were performed on simple-shape specimens subjected to tension and bending. The experimental results obtained on the Testometric tensile testing machine are compared with results from finite element calculation.
Verification of Evolving Software via Component Substitutability Analysis
2005-12-01
van der Burg, Eeke; de Leeuw, Jan; Verdegaal, R.
1986-01-01
Homogeneity analysis, or multiple correspondence analysis, is usually applied to k separate variables. In this paper, it is applied to sets of variables by using sums within sets. The resulting technique is referred to as OVERALS. It uses the notion of optimal scaling, with transformations that can
Homogenization of Monthly Temperature Data and Climate Trend Analysis in British Columbia, Canada.
Wang, Y.; Anslow, F. S.; Zwiers, F. W.; Atkinson, D. E.
2016-12-01
Non-climatic variations (such as changes of instrument, station relocation, changes in observing time and procedure, etc.) in climate data can lead to discontinuities, causing inaccurate analysis of the climatic characteristics of a given location. Thus, data quality control and homogenization are the crucial first step before properly analyzing climate trends and extremes. In Canada, the most recent Adjusted and Homogenized Canadian Climate Data (AHCCD) from Environment and Climate Change Canada have been produced for four climate variables at various temporal resolutions, such as adjusted surface air temperature for 338 locations (Vincent et al., 2012) and an adjusted precipitation dataset for over 450 locations (Mekis and Vincent, 2011). In British Columbia (B.C.), thousands of stations from non-ECCC networks are available for quality control and homogenization. In this project, homogenization of monthly temperature data for 79 stations (more stations will be included) from three networks (BC Hydro, the Ministry of Forests, Lands and Natural Resource Operations Wildfire Management Branch, and the Ministry of Transportation and Infrastructure) is based on a penalized maximal t-test with a Quantile-Matching (QM) algorithm to detect inhomogeneities and make adjustments to the data (Wang et al. 2007, Wang 2008a, Wang 2008b). The homogenized product from the project will be made available to climate researchers through the Pacific Climate Impacts Consortium's (PCIC) data portal. Climate trends in the studied region (northwest B.C. and Vancouver Island) will be presented from the homogenized dataset and will be compared to those calculated from datasets without homogenization and from the AHCCD data. After such evaluation, the results are expected to show an improvement in the ability to characterize climate change with the homogenized datasets.
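The core of a maximal-t changepoint scan, the basis of the penalized maximal t-test used above, can be sketched as follows. The operational test adds a penalty factor and a reference series, both omitted here, and the data are synthetic.

```python
import math
import statistics

def maximal_t(series):
    """Scan every candidate breakpoint and return the largest two-sample
    t statistic and its position. (The operational PMT test penalizes
    breakpoints near the series ends and uses a reference series.)"""
    best_t, best_k = 0.0, None
    for k in range(2, len(series) - 2):
        a, b = series[:k], series[k:]
        se = math.sqrt(statistics.variance(a) / len(a)
                       + statistics.variance(b) / len(b))
        t = abs(statistics.mean(a) - statistics.mean(b)) / se
        if t > best_t:
            best_t, best_k = t, k
    return best_t, best_k

# Synthetic monthly anomalies with a +1.0 step (e.g. a station relocation).
series = [0.1, -0.2, 0.0, 0.2, -0.1, 0.1, 1.1, 0.9, 1.2, 1.0, 0.8, 1.1]
t_stat, k = maximal_t(series)
```

Once a significant breakpoint is found, a quantile-matching step would adjust the segment before the break so that its distribution matches the most recent homogeneous segment.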
Analysis, Test and Verification in The Presence of Variability (Dagstuhl Seminar 13091)
DEFF Research Database (Denmark)
2014-01-01
This report documents the program and the outcomes of Dagstuhl Seminar 13091 “Analysis, Test and Verification in The Presence of Variability”. The seminar had the goal of consolidating and stimulating research on analysis of software models with variability, enabling the design of variability-awa...
Ramsey, V. Jean; Dodge, L. Delf
1983-01-01
The appropriateness of using academic departments as a level of analysis of organizational administration is examined. Factors analyzed include homogeneity of faculty responses to measures of organizational structure, environmental uncertainty, and task routineness. Results were mixed, demonstrating the importance of empirically testing rather…
The KeY platform for verification and analysis of Java programs
Ahrendt, W.; Beckert, B.; Bruns, D.; Bubel, R.; Gladisch, C.; Grebing, S.; Hähnle, R.; Hentschel, M.; Herda, M.; Klebanov, V.; Mostowski, Wojciech; Scheben, C.; Schmitt, P.H.; Ulbrich, M.; Giannakopoulou, D.; Kroening, D.
2014-01-01
The KeY system offers a platform of software analysis tools for sequential Java. Foremost, this includes full functional verification against contracts written in the Java Modeling Language. But the approach is general enough to provide a basis for other methods and purposes: (i) complementary valid
Verification of HYDRASTAR: Analysis of hydraulic conductivity fields and dispersion
Energy Technology Data Exchange (ETDEWEB)
Morris, S.T.; Cliffe, K.A. [AEA Technology, Harwell (United Kingdom)
1994-10-01
HYDRASTAR is a code for the stochastic simulation of groundwater flow. It can be used to simulate both time-dependent and steady-state groundwater flow at constant density. Realizations of the hydraulic conductivity field are generated using the Turning Bands algorithm. The realizations can be conditioned on measured values of the hydraulic conductivity using Kriging. This report describes a series of verification studies that have been carried out on the code. The first study concerns the accuracy of the implementation of the Turning Bands algorithm in HYDRASTAR. The implementation has been examined by evaluating the ensemble mean and covariance of the generated fields analytically and comparing them with their prescribed values. Three other studies were carried out in which HYDRASTAR was used to solve problems of uniform mean flow and to calculate the transport and dispersion of fluid particles. In all three cases the hydraulic conductivity fields were unconditioned. The first two were two-dimensional: one at small values of the variance of the logarithm of the hydraulic conductivity, for which analytical results exist for comparison, and one at moderate variance, where the results can only be compared with those obtained by another code. The third problem was three-dimensional with a small variance, and again analytical results are available for comparison. 14 refs, 24 figs.
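The logic of the first verification study (compare the ensemble mean and covariance of generated fields with their prescribed values) can be sketched. A Cholesky factorization stands in for the Turning Bands generator here, and the 1-D grid, exponential covariance model, and ensemble size are illustrative, not HYDRASTAR's.

```python
import numpy as np

# Prescribed exponential covariance on a small 1-D grid; generate many
# realizations and compare ensemble statistics against the targets.
n, n_real, corr_len = 30, 4000, 5.0
x = np.arange(n, dtype=float)
target_cov = np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)

rng = np.random.default_rng(1)
L = np.linalg.cholesky(target_cov + 1e-10 * np.eye(n))   # jitter for stability
fields = (L @ rng.standard_normal((n, n_real))).T         # (n_real, n)

mean_err = np.abs(fields.mean(axis=0)).max()              # target mean is 0
cov_err = np.abs(np.cov(fields.T) - target_cov).max()
print(round(mean_err, 3), round(cov_err, 3))
```

Both errors shrink roughly like 1/sqrt(n_real), which is the acceptance logic of the ensemble-statistics check.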
Energy Technology Data Exchange (ETDEWEB)
Ramirez Ros, J. C.; Jerez Sainz, M. I.; Lobato Munoz, M.; Jodar Lopez, C. A.; Ruiz Lopez, M. A.; Carrasco Rodriguez, J. L.; Pamos Urena, M.
2013-07-01
We evaluated the Monte Carlo algorithm of the Monaco planning system v2.0.3 for calculations in heterogeneous, low-density media (lung-equivalent), as a complement to the verification of its modeling in homogeneous media and prior to the introduction of the SBRT technique. We performed the same tests on Pinnacle v8.0m for the same purpose. We compare the results obtained with the Monte Carlo algorithm of Monaco and the Collapsed Cone algorithm of Pinnacle. (Author)
Nonlinear region of attraction analysis for hypersonic flight vehicles’ flight control verification
Directory of Open Access Journals (Sweden)
Jie Chen
2017-05-01
A stability analysis method based on the region of attraction is proposed for flight control verification of hypersonic flight vehicles. Current practice for such verification relies largely on linear theoretical analysis and nonlinear simulation; this can be improved by nonlinear stability analysis of the flight control system. First, the hypersonic flight vehicle's flight dynamics model is simplified and fitted by a polynomial equation. A region-of-attraction estimation method based on V-s iteration is then presented to complete the stability analysis. Finally, with the control law in place, the closed-loop system stability is analyzed to verify the effectiveness of the proposed method.
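The region-of-attraction idea can be illustrated on a scalar toy system. This is a simple level-set growth check, not the V-s iteration itself; the system xdot = -x + x^3 and the Lyapunov candidate V = x^2 are assumed for illustration (the true attraction boundary of the origin is |x| = 1).

```python
import numpy as np

def vdot(x):
    """d/dt V along trajectories of xdot = -x + x**3, with V = x**2."""
    return 2 * x * (-x + x**3)

# Grow the level set {V <= c} while Vdot stays negative on it (excluding the
# origin itself, where Vdot is exactly zero).
c_best = 0.0
for c in np.linspace(0.01, 2.0, 400):
    xs = np.linspace(-np.sqrt(c), np.sqrt(c), 2001)
    if np.all(vdot(xs[np.abs(xs) > 1e-9]) < 0):
        c_best = c
    else:
        break

print(round(float(np.sqrt(c_best)), 2))   # estimated ROA radius
```

The V-s iteration used in the paper performs the analogous level-set maximization with sum-of-squares certificates instead of gridded sampling.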
Verification of temporal-causal network models by mathematical analysis
Directory of Open Access Journals (Sweden)
Jan Treur
2016-04-01
Usually, dynamic properties of models are analysed by conducting simulation experiments. But sometimes, as a kind of prediction, properties can also be found by mathematical calculation, without performing simulations. Examples of properties that can be explored in this manner are: whether values for the variables exist at which no change occurs, and how such values may depend on the model parameters and/or initial values (stationary points or equilibria); whether certain variables converge to some limit value, and how this may depend on the parameters and/or initial values (equilibria); whether certain variables show monotonically increasing or decreasing values over time (monotonicity); how fast convergence to a limit value takes place (convergence speed); and whether situations occur in which no convergence takes place but a specific sequence of values is eventually repeated indefinitely (limit cycle). Such properties, found in an analytic mathematical manner, can be used for verification of the model by checking them against the values observed in simulation experiments: if one of these properties is not fulfilled, there is some error in the implementation of the model. In this paper some methods to analyse such properties of dynamical models are described and illustrated for the Hebbian learning model and for dynamic connection strengths in social networks. The properties analysed cover equilibria, increasing or decreasing trends, recurring patterns (limit cycles), and speed of convergence to equilibria.
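One such check can be sketched end to end. For an assumed Hebbian-style update dw/dt = eta * (a*(1 - w) - zeta*w) with constant activation a (illustrative; not necessarily the paper's exact model), the stationary point w* = a/(a + zeta) follows analytically, and a simulated trajectory can then be checked against it, including monotonicity.

```python
# Analytical prediction vs. simulation for dw/dt = eta * (a*(1 - w) - zeta*w).
eta, a, zeta, dt = 0.5, 0.8, 0.2, 0.01
w_star = a / (a + zeta)            # equilibrium from setting dw/dt = 0

w, traj = 0.0, []
for _ in range(5000):              # forward-Euler simulation
    w += dt * eta * (a * (1 - w) - zeta * w)
    traj.append(w)

# Verification: convergence to the analytic equilibrium, monotone from below
monotone = all(x2 >= x1 for x1, x2 in zip(traj, traj[1:]))
print(round(w_star, 3), round(traj[-1], 3), monotone)
```

A mismatch in either check would indicate an implementation error in the simulation, which is exactly the verification use of such analysis described above.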
Bayesian analysis of non-homogeneous Markov chains: application to mental health data.
Sung, Minje; Soyer, Refik; Nhan, Nguyen
2007-07-10
In this paper we present a formal treatment of non-homogeneous Markov chains by introducing a hierarchical Bayesian framework. Our work is motivated by the analysis of correlated categorical data which arise in the assessment of psychiatric treatment programs. In our development, we introduce a Markovian structure to describe the non-homogeneity of transition patterns. In doing so, we introduce a logistic regression set-up for Markov chains and incorporate covariates in our model. We present a Bayesian model using Markov chain Monte Carlo methods and develop inference procedures to address issues encountered in the analysis of data from psychiatric treatment programs. Our model and inference procedures are applied to real data from a psychiatric treatment study.
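The logistic regression set-up for transition probabilities can be sketched as follows. The coefficients, covariate, and two-state structure are purely illustrative; nothing here is fitted to the psychiatric data, and the Bayesian MCMC machinery is not reproduced.

```python
import math
import random

def p_leave(x, b0=-1.0, b1=0.8):
    """Transition probability out of state 0 as a logistic function of a
    covariate x (illustrative coefficients, not estimates)."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def simulate(xs, seed=3):
    """Simulate a two-state chain whose 0->1 rate varies with the covariate;
    the exit rate from state 1 is held fixed for simplicity."""
    random.seed(seed)
    state, path = 0, []
    for x in xs:
        path.append(state)
        p = p_leave(x) if state == 0 else 0.5
        if random.random() < p:
            state = 1 - state
    return path

xs = [t / 1000 * 4.0 for t in range(1000)]   # covariate drifting upward
path = simulate(xs)
early = sum(path[:500]) / 500                # fraction of time in state 1
late = sum(path[500:]) / 500
print(round(early, 2), round(late, 2))
```

The non-homogeneity is visible in the simulation: as the covariate grows, the chain leaves state 0 more readily, so the occupancy of state 1 drifts upward.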
Barker, S A; Hsieh, L C; Short, C R
1986-05-15
New methodology for the extraction and analysis of the anthelmintic fenbendazole and its metabolites from plasma, urine, liver homogenates, and feces from several animal species is presented. Quantitation of fenbendazole and its metabolites was conducted by high-pressure liquid chromatography using ultraviolet detection at 290 nm. The combined extraction and analysis procedures give excellent recoveries in all of the different biological matrices examined. High specificity, low limits of detection, and excellent linearity, accuracy, and inter- and intrasample variability were also obtained. The study of fenbendazole pharmacokinetics in vitro and in vivo should be greatly enhanced through the utilization of these methods.
Verification and uncertainty evaluation of CASMO-3/MASTER nuclear analysis system
Energy Technology Data Exchange (ETDEWEB)
Song, Jae Seung; Cho, Byung Oh; Joo, Han Kyu; Zee, Sung Quun; Lee, Chung Chan; Park, Sang Yoon
2000-06-01
MASTER is a nuclear design code developed by KAERI. It uses group constants generated by CASMO-3, developed by Studsvik. In this report, verification and uncertainty evaluation were performed for application of the code system to nuclear reactor core analysis and design. The verification is performed via various benchmark comparisons for static and transient core conditions, and core-follow calculations with startup physics test predictions for a total of 14 cycles of pressurized water reactors. Benchmark calculations include comparisons with reference solutions of IAEA and OECD/NEA problems and with critical experiment measurements. The uncertainty evaluation focuses on safety-related parameters such as power distribution, reactivity coefficients, control rod worth, and core reactivity. It is concluded that CASMO-3/MASTER can be applied to PWR core nuclear analysis and design without any bias factors. It is also verified that the system can be applied to the SMART core, via supplemental comparisons with reference calculations by MCNP, a probabilistic nuclear calculation code.
Staniszewska-Slezak, Emilia; Malek, Kamilla; Baranska, Malgorzata
2015-08-01
Raman spectroscopy with four excitation lines in the visible (Vis: 488, 532, 633 nm) and near infrared (NIR: 785 nm) was used for biochemical analysis of rat tissue homogenates, i.e. myocardium, brain, liver, lung, intestine, and kidney. The Vis Raman spectra are very similar for some organs (brain/intestine and kidney/liver) and are dominated by heme signals when lung and myocardium tissues are investigated (especially with 532 nm excitation). On the other hand, the NIR Raman spectra are specific for each tissue and more informative than the corresponding spectra collected with the Vis excitations. The spectra, analyzed without any special pre-processing, clearly illustrate the different chemical composition of each tissue and give information about main components, e.g. lipids or proteins, but also about the content of some specific compounds such as amino acid residues, nucleotides, and nucleobases. However, in order to obtain the whole spectral information about the tissues' complex composition, the spectra from Vis and NIR excitations should be collected and analyzed together. The good agreement between data gathered from Raman spectra of the homogenates and those obtained previously from Raman imaging of tissue cross-sections indicates that the approach presented here can be a method of choice for investigating biochemical variation in animal tissues. Moreover, the Raman spectral profile of tissue homogenates is specific enough to be used for investigating potential pathological changes the organism undergoes, in particular when supported by complementary FTIR spectroscopy.
Homogeneity study of a corn flour laboratory reference material candidate for inorganic analysis.
Dos Santos, Ana Maria Pinto; Dos Santos, Liz Oliveira; Brandao, Geovani Cardoso; Leao, Danilo Junqueira; Bernedo, Alfredo Victor Bellido; Lopes, Ricardo Tadeu; Lemos, Valfredo Azevedo
2015-07-01
In this work, a homogeneity study of a corn flour reference material candidate for inorganic analysis is presented. Seven kilograms of corn flour were used to prepare the material, which was distributed among 100 bottles. The elements Ca, K, Mg, P, Zn, Cu, Fe, Mn and Mo were quantified by inductively coupled plasma optical emission spectrometry (ICP OES) after an acid digestion procedure. The method accuracy was confirmed by analyzing the rice flour certified reference material NIST 1568a. All results were evaluated by analysis of variance (ANOVA) and principal component analysis (PCA). In the study, a sample mass of 400 mg was established as the minimum mass required for analysis, according to the PCA. The between-bottle test was performed by analyzing 9 bottles of the material. Subsamples of a single bottle were analyzed for the within-bottle test. No significant differences were observed in the results obtained through the application of either statistical method. This fact demonstrates that the material is homogeneous enough for use as a laboratory reference material. Copyright © 2015 Elsevier Ltd. All rights reserved.
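The between-bottle test can be sketched with a hand-rolled one-way ANOVA. The bottle count, subsample sizes, and concentration values below are invented for illustration; an F statistic near 1 suggests between-bottle variance is indistinguishable from within-bottle noise.

```python
import numpy as np

def anova_f(groups):
    """One-way ANOVA F statistic for a list of 1-D measurement arrays."""
    k = len(groups)
    n = sum(len(g) for g in groups)
    grand = np.concatenate(groups).mean()
    ss_between = sum(len(g) * (g.mean() - grand) ** 2 for g in groups)
    ss_within = sum(((g - g.mean()) ** 2).sum() for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# 9 bottles, 6 subsamples each, drawn from one distribution (homogeneous case)
rng = np.random.default_rng(7)
bottles = [rng.normal(100.0, 2.0, 6) for _ in range(9)]
print(round(float(anova_f(bottles)), 2))
```

Comparing the statistic against the F(k-1, n-k) critical value then gives the accept/reject decision of the homogeneity test.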
Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report
2013-01-01
This report documents the verification and validation of the Capability Portfolio Analysis Tool (CPAT), which uses a Multiple Objective Decision Analysis (MODA) approach for assessing the value of vehicle modernization in the HBCT and SBCT combat fleets. The MODA approach is used to measure the returns to scale for a given attribute and promotes buy-in from multiple stakeholders; the CPAT team held SME reviews as part of the effort.
Structural Dynamics Verification of Rotorcraft Comprehensive Analysis System (RCAS)
Energy Technology Data Exchange (ETDEWEB)
Bir, G. S.
2005-02-01
The Rotorcraft Comprehensive Analysis System (RCAS) was acquired and evaluated as part of an ongoing effort by the U.S. Department of Energy (DOE) and the National Renewable Energy Laboratory (NREL) to provide state-of-the-art wind turbine modeling and analysis technology for Government and industry. RCAS is an interdisciplinary tool offering aeroelastic modeling and analysis options not supported by current codes. RCAS was developed during a 4-year joint effort among the U.S. Army's Aeroflightdynamics Directorate, Advanced Rotorcraft Technology Inc., and the helicopter industry. The code draws heavily from its predecessor 2GCHAS (Second Generation Comprehensive Helicopter Analysis System), which required an additional 14 years to develop. Though developed for the rotorcraft industry, its general-purpose features allow it to model or analyze a general dynamic system. Its key feature is a specialized finite element that can model spinning flexible parts. The code therefore appears particularly suited for wind turbines, whose dynamics are dominated by massive flexible spinning rotors. In addition to the simulation capability of existing codes, RCAS [1-3] offers a range of unique capabilities, including aeroelastic stability analysis, trim, state-space modeling, operating modes, modal reduction, multi-blade coordinate transformation, periodic-system-specific analysis, choice of aerodynamic models, and a controls design/implementation graphical interface.
Plasmon analysis and homogenization in plane layered photonic crystals and hyperbolic metamaterials
Energy Technology Data Exchange (ETDEWEB)
Davidovich, M. V., E-mail: davidovichmv@info.sgu.ru [Saratov State University (Russian Federation)
2016-12-15
Dispersion equations are obtained and analysis and homogenization are carried out in periodic and quasiperiodic plane layered structures consisting of alternating dielectric layers, metal and dielectric layers, as well as graphene sheets and dielectric (SiO{sub 2}) layers. Situations are considered when these structures acquire the properties of hyperbolic metamaterials (HMMs), i.e., materials the real parts of whose effective permittivity tensor have opposite signs. It is shown that the application of solely dielectric layers is more promising in the context of reducing losses.
Davidovich, Mikhael V.
2016-04-01
Dispersion equations are derived, and analysis and homogenization are carried out, for periodic and quasiperiodic plane layered structures with alternating dielectric layers, metal and dielectric layers, as well as graphene sheets and SiO2 layers. Cases are considered in which these structures acquire the properties of hyperbolic metamaterials, i.e., the real parts of the components of the effective permittivity tensor have opposite signs. It is shown that using only dielectric layers is promising for reducing losses.
Energy Technology Data Exchange (ETDEWEB)
Jo, Young Beom; Kim, Eung Soo [Seoul National Univ., Seoul (Korea, Republic of)
2014-10-15
The behavior of a tsunami becomes more complicated when the shape and phase of the ground below the seawater are considered, so different approaches are required to analyze it precisely. This paper introduces on-going code-development activities at SNU based on an unconventional mesh-free fluid analysis method, Smoothed Particle Hydrodynamics (SPH), and its verification with practice simulations. The newly developed Lagrangian mesh-free SPH code covers the equations of motion and the heat conduction equation so far, and verification of each model is complete. In addition, parallel computation using GPUs is now possible, and a GUI is also provided; users can change the input geometry or input values to simulate various geometries and conditions. The SPH method has large advantages and potential for modeling free surfaces, highly deformable geometry, and multi-phase problems that traditional grid-based codes have difficulty analyzing. By incorporating more complex physical models such as turbulent flow, phase change, two-phase flow, and even solid mechanics, the application of the current SPH code is expected to be much extended, including to molten fuel behavior in severe accidents.
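A minimal ingredient of any SPH code can be sketched: particle density evaluated as a cubic-spline kernel sum over neighbors. This is a generic 1-D illustration, not the SNU code; the particle count, smoothing length, and unit total mass are assumptions. On a uniform layout the interior density should come out near 1.

```python
import numpy as np

def cubic_spline_w(r, h):
    """1-D cubic-spline SPH kernel, normalized so its integral over r is 1."""
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    w = np.where(q < 1, 1 - 1.5 * q**2 + 0.75 * q**3,
        np.where(q < 2, 0.25 * (2 - q)**3, 0.0))
    return sigma * w

n, h = 400, 0.02
mass = 1.0 / n                                   # unit total mass
x = np.linspace(0.0, 1.0, n)                     # uniform particle layout
# SPH density estimate: rho_i = sum_j m_j * W(x_i - x_j, h)
rho = np.array([mass * cubic_spline_w(x - xi, h).sum() for xi in x])
interior = rho[50:-50]                           # avoid boundary deficiency
print(round(float(interior.mean()), 3))
```

Near the domain ends the kernel support is truncated and the estimate drops, which is why real SPH codes add boundary particles or corrections.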
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
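The non-Bayesian core of the Procrustes averaging can be sketched: full ordinary Procrustes alignment (translation, scale, rotation via SVD) of each ensemble member onto a reference, followed by a pointwise mean. The shapes and ensemble below are synthetic, and the paper's Bayesian machinery and credible sets are not reproduced.

```python
import numpy as np

def align(X, Y):
    """Full Procrustes fit of Y onto X: center both, then apply the optimal
    rotation and scale (orthogonal Procrustes via SVD)."""
    Xc, Yc = X - X.mean(0), Y - Y.mean(0)
    U, s, Vt = np.linalg.svd(Yc.T @ Xc)
    Q = U @ Vt                                   # optimal rotation
    return (s.sum() / (Yc ** 2).sum()) * Yc @ Q  # optimal scale applied

# Synthetic ensemble: the same 8-point outline, randomly rotated/scaled/shifted
rng = np.random.default_rng(0)
base = rng.standard_normal((8, 2))
ensemble = []
for _ in range(5):
    th = rng.uniform(0, 2 * np.pi)
    R = np.array([[np.cos(th), -np.sin(th)], [np.sin(th), np.cos(th)]])
    ensemble.append(base @ R.T * rng.uniform(0.5, 2) + rng.uniform(-3, 3, 2))

mean_shape = np.mean([align(base, Y) for Y in ensemble], axis=0)
err = np.abs(mean_shape - (base - base.mean(0))).max()
print(err < 1e-6)
```

Because each member is an exact similarity transform of the base shape, the aligned mean recovers the centered base shape to machine precision, which is the morphology-preserving property motivating Procrustes ensemble means.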
Monte Carlo homogenized limit analysis model for randomly assembled blocks in-plane loaded
Milani, Gabriele; Lourenço, Paulo B.
2010-11-01
A simple rigid-plastic homogenization model for the limit analysis of masonry walls, in-plane loaded and constituted by the random assemblage of blocks with variable dimensions, is proposed. In the model, the blocks constituting a masonry wall are supposed infinitely resistant with a Gaussian distribution of height and length, whereas joints are reduced to interfaces with frictional behavior and limited tensile and compressive strength. Block by block, a representative element of volume (REV) is considered, constituted by a central block interconnected with its neighbors by means of rigid-plastic interfaces. The model is characterized by a few material parameters, is numerically inexpensive, and is very stable. A sub-class of elementary deformation modes is chosen a priori in the REV, mimicking typical failures due to joint cracking and crushing. Masonry strength domains are obtained by equating the power dissipated in the heterogeneous model with the power dissipated by a fictitious homogeneous macroscopic plate. Due to the inexpensiveness of the proposed approach, Monte Carlo simulations can be repeated on the REV in order to obtain a stochastic estimation of in-plane masonry strength at different orientations of the bed joints with respect to external loads, accounting for the geometrical statistical variability of block dimensions. Two cases are discussed, the former consisting of fully stochastic REV assemblages (obtained considering a random variability of both block height and length) and the latter assuming the presence of a horizontal alignment along bed joints, i.e. allowing block height variability only row by row. The case of deterministic block height (quasi-periodic texture) can be obtained as a subclass of this latter case. Masonry homogenized failure surfaces are finally implemented in an upper bound FE limit analysis code for the analysis at collapse of entire walls in-plane loaded. Two cases of engineering practice, consisting of the prediction of the failure
Directory of Open Access Journals (Sweden)
Hui-Jing Li
2014-03-01
Electrophilic aromatic bromination is the most common synthetic method used to prepare aryl bromides, which are very useful intermediates in organic synthesis. To understand the experimental results in electrophilic aromatic brominations, ab initio calculations are used here for a tentative analysis of the positional selectivity. The calculated results agree well with the corresponding experimental data, which verifies the reliability of the predicted positional selectivity.
Verification of BModes: Rotary Beam and Tower Modal Analysis Code; Preprint
Energy Technology Data Exchange (ETDEWEB)
Bir, G.
2010-04-01
This paper describes verification of BModes, a finite-element code developed to provide coupled modes for the blades and tower of a wind turbine. The blades, which may be rotating or non-rotating, and the towers, whether onshore or offshore, are modeled using specialized 15-dof beam finite elements. Both blade and tower models allow a tip attachment, which is assumed to be a rigid body with six moments of inertia and a mass centroid that may be offset from the blade or tower axis. Examples of tip attachments are aerodynamic brakes for blades and the nacelle-rotor subassembly for towers. BModes modeling allows for tower supports including tension wires, floating platforms, and monopiles on elastic foundations. Coupled modes (implying coupling of flap, lag, axial, and torsional motions) are required for modeling major flexible components in a modal-based aeroelastic code such as FAST. They are also required for validation of turbine models using experimental data, modal-based fatigue analysis, controls design, and understanding the aeroelastic-stability behavior of turbines. Verification studies began with uniform tower models, with and without tip inertia, and progressed to realistic towers. For the floating turbine, we accounted for the effects of hydrodynamic inertia, hydrostatic restoring, and mooring-line stiffness. For the monopile-supported tower, we accounted for distributed hydrodynamic mass on the submerged part of the tower and for distributed foundation stiffness. Finally, we verified a model of a blade carrying a tip mass and rotating at different speeds (verifications of other blade models, rotating or non-rotating, have been reported in another paper). Verifications were performed by comparing BModes-generated modes with analytical results, where available, or with MSC.ADAMS results. All results in general show excellent agreement.
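The verification pattern used above (numerical modes compared against analytical results) can be shown in miniature. This sketch uses a taut string rather than a beam or tower, and the grid size is illustrative: a finite-difference eigenproblem for -u'' = lambda*u with fixed ends, whose exact mode frequencies are n*pi.

```python
import numpy as np

n = 200
h = 1.0 / (n + 1)
# Tridiagonal finite-difference matrix for -u'' on (0,1), u(0) = u(1) = 0
A = (np.diag(np.full(n, 2.0))
     - np.diag(np.ones(n - 1), 1)
     - np.diag(np.ones(n - 1), -1)) / h**2

# Numerical mode frequencies (sqrt of eigenvalues) vs. the analytical n*pi
num = np.sqrt(np.sort(np.linalg.eigvalsh(A))[:3])
exact = np.pi * np.arange(1, 4)
print(np.round(num / exact, 4))
```

The ratios approach 1 as the grid is refined (second-order convergence), which is the agreement criterion such mode-verification studies rely on.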
Spectral signature verification using statistical analysis and text mining
DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.
2016-05-01
In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures; this has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to find local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
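The SAM ranking step can be sketched directly. The spectra, population, and thresholds below are synthetic stand-ins (SigDB itself is not used): a test spectrum is scored by its angle to the mean spectrum of an ideal population, and a smaller angle means a better match.

```python
import numpy as np

def spectral_angle(a, b):
    """Spectral angle mapper: angle (radians) between two spectra as vectors."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

rng = np.random.default_rng(5)
wavelengths = np.linspace(400, 2500, 200)
ideal = np.exp(-((wavelengths - 1500) / 400) ** 2)   # toy reference lineshape
population = np.array([ideal + rng.normal(0, 0.01, 200) for _ in range(30)])
mean_spectrum = population.mean(axis=0)

good = ideal + rng.normal(0, 0.01, 200)              # plausible test spectrum
bad = rng.uniform(0, 1, 200)                         # junk test spectrum
print(spectral_angle(good, mean_spectrum) < spectral_angle(bad, mean_spectrum))
```

Because SAM depends only on direction, it is insensitive to overall intensity scaling, which is one reason it is a common choice for this kind of quality ranking.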
Institute of Scientific and Technical Information of China (English)
Chen Changya; Pan Jin; Wang Deyu
2005-01-01
With the development of satellite structure technology, more and more design parameters affect structural performance. It is desirable to obtain an optimal structural design with minimum weight, including optimal configuration and sizes. The present paper describes an optimization analysis for a satellite structure, including topology optimization and size optimization. Based on the homogenization method, topology optimization is carried out for the main supporting frame of the service module under given constraints and load conditions; a sensitivity analysis is then performed on 15 structural size parameters of the whole satellite, and the optimal sizes are obtained. The numerical results show that the present optimization design method is very effective.
Visual verification and analysis of cluster detection for molecular dynamics.
Grottel, Sebastian; Reina, Guido; Vrabec, Jadran; Ertl, Thomas
2007-01-01
A current research topic in molecular thermodynamics is the condensation of vapor to liquid and the investigation of this process at the molecular level. Condensation is found in many physical phenomena, e.g. the formation of atmospheric clouds or the processes inside steam turbines, where a detailed knowledge of the dynamics of condensation processes will help to optimize energy efficiency and avoid problems with droplets of macroscopic size. The key properties of these processes are the nucleation rate and the critical cluster size. For the calculation of these properties it is essential to use a meaningful definition of molecular clusters, which is currently an issue that is not completely resolved. In this paper a framework capable of interactively visualizing molecular datasets of such nucleation simulations is presented, with an emphasis on the detected molecular clusters. To check the quality of the results of the cluster detection, our framework introduces the concept of flow groups to highlight potential cluster evolution over time that is not detected by the employed algorithm. To confirm the findings of the visual analysis, we coupled the rendering view with a schematic view of the clusters' evolution. This allows rapid assessment of the quality of the molecular cluster detection algorithm and identification of locations in the simulation data, in space as well as in time, where the cluster detection fails. Thus, thermodynamics researchers can eliminate weaknesses in their cluster detection algorithms. Several examples of the effective and efficient usage of our tool are presented.
A Novel Linear Switched Reluctance Machine: Analysis and Experimental Verification
Directory of Open Access Journals (Sweden)
N. C. Lenin
2010-01-01
The important problems to be solved in Linear Switched Reluctance Machines (LSRMs) are: (1) to design the shape and size of the poles in the stator and translator cores; and (2) to optimize their geometrical configuration. A novel stator geometry for LSRMs that improves the force profile is presented in this study. In the new geometry, pole shoes are affixed to the stator poles. Static and dynamic characteristics of the proposed structure are highlighted using two-dimensional (2-D) Finite Element Analysis (FEA). Motor performance under variable load conditions is discussed. The finite element analyses and the experimental results of this study prove that LSRMs are strong candidates for linear propulsion drives. Problem statement: To mitigate the force ripple without any loss in average force and force density. Approach: Design modifications in the magnetic structures. Results: 2-D finite element analysis was used to predict the performance of the studied structures. Conclusion/Recommendations: The proposed structure not only reduces the force ripple but also reduces the volume and mass. Future work will address vibration, thermal, and stress analyses.
Local SIFT analysis for hand vein pattern verification
Wang, Yunxin; Wang, Dayong; Liu, Tiegen; Li, Xiuyan
2009-11-01
The newly emerging hand vein recognition technology has attracted remarkable attention for its uniqueness, noninvasiveness, friendliness, and high reliability. Small location deviations of the human hand are unavoidable in practical application; however, existing recognition methods are sensitive to hand shift or rotation. In most studies, the test sample is matched against a series of registered images after an affine transformation covering the shift or rotation; this can remedy the location deviation to some extent, but the limited range of allowed shift and rotation inconveniences users, and the computational cost also increases greatly. Aiming at this issue, a hand vein recognition algorithm based on local SIFT (Scale Invariant Feature Transform) analysis is developed in this contribution, which has practical significance due to its translation and rotation invariance. First, the hand vein image is preprocessed to remove the background and reduce image noise, and then SIFT features are extracted to describe the gradient information of the hand vein. The common matching method for SIFT features produces many one-to-more matching pairs, so the matching rule is improved by appending a constraint that ensures one-to-one matching, achieved by selecting the feature point with the nearest distance as the optimal match. Finally, the match ratio of features between the registered and test images is calculated as the similarity measurement to verify the personal identification. The experimental results show that the FRR (False Rejection Rate) is only 0.93% when the FAR (False Acceptance Rate) is 0.002%, and the EER (Equal Error Rate) is as low as 0.12%, which demonstrates that the proposed approach is valid and effective for hand vein authentication.
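The one-to-one matching constraint can be sketched on raw descriptor arrays. SIFT extraction itself is omitted; the 8-dimensional random vectors below stand in for SIFT's 128-dimensional descriptors, and the greedy globally-nearest-first rule is one simple way to enforce that no registered feature is matched twice.

```python
import numpy as np

def one_to_one_matches(desc_a, desc_b):
    """Greedy one-to-one matching: repeatedly take the globally nearest
    remaining descriptor pair, so each feature is used at most once."""
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    matches, used_a, used_b = [], set(), set()
    candidates = sorted(((i, j) for i in range(d.shape[0])
                         for j in range(d.shape[1])), key=lambda p: d[p])
    for i, j in candidates:
        if i not in used_a and j not in used_b:
            matches.append((i, j, float(d[i, j])))
            used_a.add(i)
            used_b.add(j)
    return matches

# Same five "features" in both images, in reversed order plus small noise
rng = np.random.default_rng(2)
a = rng.standard_normal((5, 8))
b = a[::-1] + rng.normal(0, 0.01, (5, 8))
pairs = {(i, j) for i, j, _ in one_to_one_matches(a, b)}
print(sorted(pairs))
```

Because true corresponding descriptors are far closer than any mismatched pair, the greedy rule recovers the full reversed correspondence, eliminating the one-to-more ambiguity described above.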
Energy Technology Data Exchange (ETDEWEB)
Duran-Lobato, Matilde, E-mail: mduran@us.es [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain); Enguix-Gonzalez, Alicia [Universidad de Sevilla, Dpto. Estadistica e Investigacion Operativa, Facultad de Matematicas (Spain); Fernandez-Arevalo, Mercedes; Martin-Banderas, Lucia [Universidad de Sevilla, Dpto. Farmacia y Tecnologia Farmaceutica, Facultad de Farmacia (Spain)
2013-02-15
Lipid nanoparticles (LNPs) are promising carriers for all administration routes due to their safety, small size, and high loading of lipophilic compounds. Among LNP production techniques, the easy scale-up, lack of organic solvents, and short production times of high-pressure homogenization (HPH) make this method stand out. In this study, a statistical analysis was applied to the production of LNPs by HPH. Spherical LNPs with mean sizes ranging from 65 nm to 11.623 {mu}m, negative zeta potentials below -30 mV, and smooth surfaces were produced. Manageable equations based on parameters commonly used in the pharmaceutical field were obtained. The lipid-to-emulsifier ratio (R{sub L/S}) was shown to statistically explain the influence of oil phase and surfactant concentration on final nanoparticle size. In addition, the homogenization pressure was found to ultimately determine LNP size for a given R{sub L/S}, while the number of passes applied mainly determined polydispersity. {alpha}-Tocopherol was used as a model drug to illustrate the release properties of LNPs as a function of particle size, which was optimized via the regression models. This study is intended as a first step toward optimizing production conditions prior to LNP production at both laboratory and industrial scale from an eminently practical approach, based on parameters extensively used in formulation.
Identifying homogenous subgroups for individual patient meta-analysis based on Rough Set Theory.
Gil-Herrera, Eleazar; Tsalatsanis, Athanasios; Kumar, Ambuj; Mhaskar, Rahul; Miladinovic, Branko; Yalcin, Ali; Djulbegovic, Benjamin
2014-01-01
Failure to detect and manage heterogeneity between clinical trials included in a meta-analysis may lead to misinterpretation of summary effect estimates and may ultimately compromise the validity of the meta-analysis results. Typically, when heterogeneity between trials is detected, researchers use sensitivity or subgroup analysis to manage it; however, both methods fail to explain why the heterogeneity existed in the first place. Here we propose a novel methodology that relies on Rough Set Theory (RST) to detect, explain, and manage the sources of heterogeneity, applicable to meta-analyses performed on individual patient data (IPD). The method exploits the RST relations of discernibility and indiscernibility to create homogeneous groups of patients. We applied our methodology to a dataset of 1,111 patients enrolled in 9 randomized controlled trials studying the effect of two transplantation procedures in the management of hematologic malignancies. Our method was able to create three subgroups of patients with remarkably low statistical heterogeneity values (16.8%, 0% and 0%, respectively). The proposed methodology has the potential to automate and standardize the process of detecting and managing heterogeneity in IPD meta-analysis. Future work involves investigating applications of the proposed methodology to analyzing treatment effects in patients belonging to different risk groups, which will ultimately assist personalized healthcare decision making.
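The heterogeneity percentages quoted above are consistent with the standard I² statistic of meta-analysis. As a hedged illustration (the paper's RST machinery is not reproduced here), I² can be computed from per-trial effect estimates and their variances:

```python
import numpy as np

def i_squared(effects, variances):
    """Cochran's Q and the I^2 heterogeneity statistic for a
    fixed-effect meta-analysis of per-trial effect estimates."""
    effects = np.asarray(effects, dtype=float)
    w = 1.0 / np.asarray(variances, dtype=float)   # inverse-variance weights
    pooled = np.sum(w * effects) / np.sum(w)       # fixed-effect pooled estimate
    q = np.sum(w * (effects - pooled) ** 2)        # Cochran's Q
    k = len(effects)
    # share of total variation attributable to between-trial heterogeneity
    i2 = max(0.0, (q - (k - 1)) / q) * 100 if q > 0 else 0.0
    return q, i2
```

Identical trial effects give I² = 0%, matching the homogeneous subgroups reported in the abstract.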
Directory of Open Access Journals (Sweden)
Hassan Ijaz
2017-01-01
Full Text Available The purpose of this article is to present a simplified methodology for the analysis of sandwich structures using the homogenization method, based on the strain energy criterion. Sandwich structures are normally composed of a hexagonal core and face sheets, and a complete, complex hexagonal core is usually modeled for finite element (FE) structural analysis. In the present work, the hexagonal core is replaced by a simple equivalent volume for FE analysis. The properties of the equivalent volume were calculated by taking a single representative cell for the entire core structure, and an analysis was performed to determine the effective elastic orthotropic moduli of the equivalent volume. Since each elemental cell of the hexagonal core repeats itself in the in-plane direction, periodic boundary conditions were applied to the single cell to obtain more realistic values of the effective moduli. A sandwich beam was then modeled using the determined effective properties. 3D FE analyses of Three- and Four-Point Bend Tests (3PBT and 4PBT) were performed for sandwich structures having an equivalent polypropylene honeycomb core and Glass Fiber Reinforced Plastic (GFRP) composite face sheets. The authenticity of the proposed methodology has been verified by comparing the simulation results with experimental bend test results on hexagonal core sandwich beams.
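As a cross-check on this kind of core homogenization, the classical Gibson-Ashby closed-form estimate of the in-plane effective Young's modulus of a hexagonal honeycomb is often used. This is a textbook formula, not the paper's FE procedure, and the parameter names are illustrative:

```python
import math

def honeycomb_inplane_modulus(E_s, t, l, h=None, theta_deg=30.0):
    """Classical Gibson-Ashby estimate of the in-plane effective Young's
    modulus (loading direction 1) of a hexagonal honeycomb core made of a
    solid with modulus E_s, wall thickness t, inclined-wall length l,
    vertical-wall length h and wall angle theta."""
    if h is None:
        h = l  # regular hexagon
    th = math.radians(theta_deg)
    return E_s * (t / l) ** 3 * math.cos(th) / ((h / l + math.sin(th)) * math.sin(th) ** 2)
```

For a regular hexagon the formula reduces to roughly 2.3 E_s (t/l)³, which is why honeycomb cores are so compliant in-plane compared with the wall material.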
Design and Analysis of SD_DWCA - A Mobility based clustering of Homogeneous MANETs
Janakiraman, T N
2011-01-01
This paper deals with the design and analysis of SD_DWCA, a distributed weighted clustering algorithm proposed for homogeneous mobile ad hoc networks. It is a connectivity-, mobility- and energy-based clustering algorithm suitable for scalable ad hoc networks. The algorithm uses a new graph parameter called strong degree, defined based on the quality of the neighbours of a node. The parameters are chosen to ensure high connectivity, cluster stability and energy-efficient communication among highly dynamic nodes. The paper also includes experimental results of the algorithm implemented using the network simulator NS2. The experimental results show that the algorithm is suitable for high-speed networks and generates stable clusters with low maintenance overhead.
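Weighted clustering algorithms of this family elect clusterheads by combining per-node metrics into a single score. The sketch below is purely illustrative: the linear combination and the weights w1..w3 are hypothetical, not SD_DWCA's actual definition.

```python
def node_weight(strong_degree, mobility, residual_energy,
                w1=0.5, w2=0.3, w3=0.2):
    """Illustrative combined metric for clusterhead election: a high
    strong degree and residual energy raise a node's score, while high
    mobility lowers it. The weights w1..w3 are hypothetical."""
    return w1 * strong_degree - w2 * mobility + w3 * residual_energy

def elect_clusterhead(nodes):
    """nodes: dict id -> (strong_degree, mobility, residual_energy).
    Returns the id of the node with the highest combined weight."""
    return max(nodes, key=lambda n: node_weight(*nodes[n]))
```

With three candidate nodes, the well-connected, slow-moving node wins the election, which is the qualitative behaviour such algorithms aim for.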
Homogeneity Analysis of a MEMS-based PZT Thick Film Vibration Energy Harvester Manufacturing Process
DEFF Research Database (Denmark)
Lei, Anders; Xu, Ruichao; Borregaard, Louise M.
2012-01-01
This paper presents a homogeneity analysis of a high-yield wafer-scale fabrication of MEMS-based unimorph silicon/PZT thick-film vibration energy harvesters aimed at vibration sources with peak vibrations around 300 Hz. A wafer with a yield of 91% (41/45 devices) has been...... indicating that the main variation in open-circuit voltage performance is caused by a varying quality factor. The average resonant frequency was measured to be 333 Hz with a standard deviation of 9.8 Hz and a harvesting bandwidth of 5-10 Hz. A maximum power output of 39.3 μW was achieved at 1 g for the best performing
Design and Analysis of SD_DWCA - A Mobility Based Clustering of Homogeneous MANETs
Directory of Open Access Journals (Sweden)
T.N. Janakiraman
2011-05-01
Full Text Available This paper deals with the design and analysis of SD_DWCA, a distributed weighted clustering algorithm proposed for homogeneous mobile ad hoc networks. It is a connectivity-, mobility- and energy-based clustering algorithm suitable for scalable ad hoc networks. The algorithm uses a new graph parameter called strong degree, defined based on the quality of the neighbours of a node. The parameters are chosen to ensure high connectivity, cluster stability and energy-efficient communication among highly dynamic nodes. The paper also includes experimental results of the algorithm implemented using the network simulator NS2. The experimental results show that the algorithm is suitable for high-speed networks and generates stable clusters with low maintenance overhead.
Tsigginou, Alexandra; Vlachopoulos, Fotios; Arzimanoglou, Iordanis; Zagouri, Flora; Dimitrakakis, Constantine
2015-01-01
Screening for BRCA 1 and BRCA 2 mutations has long moved from the research lab to the clinic as routine clinical genetic testing. BRCA molecular alteration patterns vary among ethnic groups, which already makes it less straightforward to select the appropriate mutations for routine genetic testing on the basis of known clinical significance. The present report comprises an in-depth literature review of the BRCA 1 and BRCA 2 molecular alterations reported so far in Greek families. Our analysis of cumulative Greek BRCA 1 and BRCA 2 molecular data, produced by several independent groups, confirmed that six recurrent deleterious mutations account for almost 60% of all BRCA 1 and BRCA 2 mutations and 70% of BRCA 1 mutations, respectively. As a result, it makes more sense to perform BRCA mutation analysis in the clinic in two sequential steps: first, conventional analysis for the six most prevalent pathogenic mutations and, if none is identified, a second step of Next Generation Sequencing-based whole genome or whole exome sequencing would follow. Our suggested approach would enable more clinically meaningful, considerably easier and less expensive BRCA analysis in the Greek population, which is considered homogeneous.
Verification of combined thermal-hydraulic and heat conduction analysis code FLOWNET/TRUMP
Maruyama, Soh; Fujimoto, Nozomu; Kiso, Yoshihiro; Murakami, Tomoyuki; Sudo, Yukio
1988-09-01
This report presents the verification results of FLOWNET/TRUMP, a combined thermal-hydraulic and heat conduction analysis code. The code has been utilized in the core thermal-hydraulic design of the High Temperature Engineering Test Reactor (HTTR), in particular for analyzing the flow distribution among fuel block coolant channels, determining thermal boundary conditions for fuel block stress analysis, and estimating fuel temperatures in a fuel block coolant channel blockage accident. The Japan Atomic Energy Research Institute has been planning to construct the HTTR to establish basic technologies for future advanced very high temperature gas-cooled reactors and to serve as an irradiation test reactor for promoting innovative high-temperature frontier technologies. The code was verified by comparing analytical results with experimental results from the Helium Engineering Demonstration Loop Multi-channel Test Section (HENDEL T(sub 1-M)) with simulated fuel rods and fuel blocks.
Directory of Open Access Journals (Sweden)
Arun Kumar Taxak
2014-08-01
Full Text Available Gridded rainfall data of 0.5×0.5° resolution (CRU TS 3.21) were analysed to study long-term spatial and temporal trends on annual and seasonal scales in the Wainganga river basin, located in Central India, during 1901–2012. After testing for the presence of autocorrelation, the Mann–Kendall test (or Modified Mann–Kendall test) was applied to non-autocorrelated (or autocorrelated) series to detect trends in the rainfall data. The Theil–Sen slope estimator was used to find the magnitude of change over a time period, and the Pettitt–Mann–Whitney test was applied to detect the most probable change year. The rainfall series was then divided into two partial duration series to examine changes in trends before and after the change year, and ArcGIS was used to explore spatial patterns of the trends over the entire basin. Though most of the grid points show a decreasing trend in annual rainfall, only seven grids have a significant decreasing trend during 1901–2012. On the basis of seasonal trend analysis, a non-significant increasing trend is observed only in the post-monsoon season, while seven grid points show a significant decreasing trend in monsoon rainfall and non-significant trends in pre-monsoon and winter rainfall over the last 112 years. Over the study period, an overall 8.45% decrease in annual rainfall is estimated. The most probable year of change was found to be 1948 for annual and monsoonal rainfall: there is an increasing rainfall trend in the basin during 1901–1948, which reverses during 1949–2012, resulting in a decreasing rainfall trend in the basin. Homogeneous trends in annual and seasonal rainfall over the grid points are exhibited in the basin by the van Belle and Hughes' homogeneity trend test.
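The trend statistics named above are straightforward to compute. A minimal sketch of the Mann–Kendall S statistic and the Theil–Sen slope estimator, without the variance correction that the Modified Mann–Kendall test applies to autocorrelated series:

```python
import itertools
import numpy as np

def mann_kendall_s(x):
    """Mann-Kendall S statistic: the sum of signs of later-minus-earlier
    differences over all pairs; S > 0 suggests an increasing trend,
    S < 0 a decreasing one."""
    x = np.asarray(x, dtype=float)
    return int(sum(np.sign(x[j] - x[i])
                   for i, j in itertools.combinations(range(len(x)), 2)))

def sens_slope(x):
    """Theil-Sen slope: the median of all pairwise slopes, a robust
    estimate of the magnitude of change per time step."""
    x = np.asarray(x, dtype=float)
    slopes = [(x[j] - x[i]) / (j - i)
              for i, j in itertools.combinations(range(len(x)), 2)]
    return float(np.median(slopes))
```

On a strictly increasing series of length 5, S equals +10 (all C(5,2) pairs positive) and the Sen slope equals the common step size.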
Analysis of nuclear characteristics and fuel economics for PWR core with homogeneous thorium fuels
Energy Technology Data Exchange (ETDEWEB)
Joo, H. K.; Noh, J. M.; Yoo, J. W.; Song, J. S.; Kim, J. C.; Noh, T. W
2000-12-01
The nuclear core characteristics and economics of a once-through homogenized thorium cycle for a PWR were analyzed. The lattice code HELIOS was qualified against the BNL and B and W critical experiments and the IAEA numerical benchmark problem in advance of the core analysis. The infinite multiplication factor and the evolution of the main isotopes with fuel burnup were investigated to assess the depletion characteristics of thorium fuel. The reactivity of thorium fuel at the beginning of irradiation is smaller than that of uranium fuel having the same inventory of {sup 235}U, but it decreases with burnup more slowly than in UO{sub 2} fuel. The gadolinia worth in a thorium fuel assembly is also slightly smaller than in UO{sub 2} fuel. The inventory of {sup 233}U converted from {sup 232}Th is proportional to the initial mass of {sup 232}Th and amounts to about 13 kg per tonne of initial heavy metal. Compared with the UO{sub 2} cycle, the thorium fuel cycle shows a shorter cycle length, a more positive MTC at EOC, a more negative FTC, and similar boron and control rod worths. The fuel economics of the thorium cycle were analyzed by investigating the natural uranium requirements, the separative work requirements, and the cost of burnable poison rods. Even though fewer burnable poison rods are required in the thorium fuel cycle, the costs of the natural uranium and separative work requirements increase. Thus, within the scope of this study (a once-through cycle, a homogenized fuel concept, and the same fuel management scheme as the uranium cycle), the thorium fuel cycle for a PWR offers no economic incentive over uranium.
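The natural uranium and separative work requirements mentioned above follow from the standard enrichment cascade relations. A sketch using the usual separation potential; the feed and tails assays are illustrative defaults, not values from the paper:

```python
import math

def value(x):
    """Separation potential V(x) = (2x - 1) ln(x / (1 - x))."""
    return (2 * x - 1) * math.log(x / (1 - x))

def feed_and_swu(product_kg, xp, xf=0.00711, xt=0.0025):
    """Natural-uranium feed mass and separative work (SWU) needed to
    produce product_kg of uranium at enrichment xp from feed assay xf
    with tails assay xt (all mass fractions of 235U)."""
    F = product_kg * (xp - xt) / (xf - xt)       # feed from mass balance
    T = F - product_kg                           # tails mass
    swu = product_kg * value(xp) + T * value(xt) - F * value(xf)
    return F, swu
```

For 1 kg of 4.5%-enriched product at 0.25% tails, roughly 9.2 kg of natural uranium and about 6.9 SWU are required, which is why the uranium and separative-work cost terms dominate the cycle economics.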
Analysis of an indirect neutron signature for enhanced UF6 cylinder verification
Kulisek, J. A.; McDonald, B. S.; Smith, L. E.; Zalavadia, M. A.; Webster, J. B.
2017-02-01
The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.
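The full-volume advantage of the neutron signature rests on the neutrons' longer mean free paths in UF6. A toy Monte Carlo sketch (not the laboratory's model) showing how the probability of escaping the cylinder grows with mean free path; the point-source geometry is a deliberate simplification:

```python
import math
import random

def escape_probability(mean_free_path, distance, n=100_000, seed=1):
    """Toy Monte Carlo: particles born at a point travel a free path
    sampled from an exponential distribution with the given mean free
    path, and 'escape' if that path exceeds the distance to the
    boundary. Analytically the answer is exp(-distance / mean_free_path)."""
    rng = random.Random(seed)
    escaped = sum(1 for _ in range(n)
                  if -mean_free_path * math.log(1.0 - rng.random()) > distance)
    return escaped / n
```

Doubling the mean free path at fixed geometry raises the escape probability, which is the qualitative reason neutrons sample the full cylinder volume while 186-keV gamma rays do not.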
Analysis of an Indirect Neutron Signature for Enhanced UF6 Cylinder Verification
Energy Technology Data Exchange (ETDEWEB)
Kulisek, Jonathan A.; McDonald, Benjamin S.; Smith, Leon E.; Zalavadia, Mital A.; Webster, Jennifer B.
2017-02-21
The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.
Zhao, Huaqing
There are two major objectives of this thesis work. One is to study theoretically the fracture and fatigue behavior of both homogeneous and functionally graded materials, with or without crack bridging. The other is to further develop the singular integral equation approach for solving mixed boundary value problems. The newly developed functionally graded materials (FGMs) have attracted considerable research interest as candidate materials for structural applications ranging from aerospace to automobiles to manufacturing. From the mechanics viewpoint, the unique feature of FGMs is that their resistance to deformation, fracture and damage varies spatially. In order to guide microstructure selection and the design and performance assessment of components made of functionally graded materials, a series of theoretical studies has been carried out in this thesis on the mode I stress intensity factors and crack opening displacements for FGMs with different combinations of geometry and material under various loading conditions, including: (1) a functionally graded layer under uniform strain, far-field pure bending and far-field axial loading, (2) a functionally graded coating on an infinite substrate under uniform strain, and (3) a functionally graded coating on a finite substrate under uniform strain, far-field pure bending and far-field axial loading. For solving crack problems in homogeneous and non-homogeneous materials, a very powerful singular integral equation (SIE) method has been developed since the 1960s by Erdogan and associates for mixed boundary value problems. However, some of the kernel functions developed earlier are incomplete and possibly erroneous. In this thesis work, mode I fracture problems in a homogeneous strip are reformulated and accurate singular Cauchy-type kernels are derived. Very good convergence rates and consistency with standard data are achieved. Other kernel functions are subsequently developed for mode I fracture in
Williams, Rebecca S; Derrick, Jason; Liebman, Aliza Kate; LaFleur, Kevin; Ribisl, Kurt M
2017-05-08
Identify the population of internet e-cigarette vendors (IEVs) and conduct content analyses of their age verification, purchase and delivery methods in 2013 and 2014. We used multiple sources to identify IEV websites, primarily complex search algorithms scanning more than 180 million websites. In 2013, we manually screened 32 446 websites, identifying 980 IEVs, and selected the 281 most popular for content analysis. This methodology yielded 31 239 websites for screening in 2014, identifying 3096 IEVs, with 283 selected for content analysis. The proportion of vendors that sold online only, with no retail store, dropped significantly from 2013 (74.7%) to 2014 (64.3%) (p<0.01), with a corresponding significant decrease in US-based vendors (71.9% in 2013 and 65% in 2014). Most vendors did little to prevent youth access in either year, with 67.6% in 2013 and 63.2% in 2014 employing no age verification or relying exclusively on strategies that cannot effectively verify age. Effective age verification strategies such as online age verification services (7.1% in 2013 and 8.5% in 2014), driving licences (1.8% in 2013 and 7.4% in 2014, p<0.01) or age verification at delivery (6.4% in 2013 and 8.1% in 2014) were rarely advertised on IEV websites. Nearly all vendors advertised accepting credit cards, and about three-quarters advertised shipping via the United States Postal Service, similar to the internet cigarette industry prior to federal bans. The number of IEVs grew sharply from 2013 to 2014, with poor age verification practices. New and expanded regulations for online e-cigarette sales are needed, including strict age and identity verification requirements. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
Data and Knowledge Extraction Based on Structure Analysis of Homogeneous Websites
Directory of Open Access Journals (Sweden)
Mohammed Abdullah Hassan Al-Hagery
2014-11-01
Full Text Available The World Wide Web includes several types of website applications, mainly related to businesses, organizations, companies, and others. Raw data sets for studying the behavior of the internal structure of each type of website are lacking, although website structures contain a wealth of links and sub-links, in addition to embedded features associated with the internal structure of each website. The objective of this paper is to analyse a set of homogeneous websites in order to establish raw data sets. These data sets can be employed for several research purposes and can also be used to extract otherwise invisible aspects/features of the structure. Several steps are required to accomplish this objective: first, to propose an algorithm for structure analysis; second, to implement the proposed algorithm as a software tool for extracting and establishing raw (real) data sets; third, to extrapolate a set of rules or relations from these data sets. The data sets can be employed for research in the field of web structure mining, and for estimating important factors related to website development processes and website ranking. The results comprise the creation of Oriented Data Sets (ODS) for research purposes and the deduction of a set of features representing a type of newly discovered knowledge in the ODS.
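The first step of such a structure analysis, extracting and classifying the links on a page, can be sketched with the Python standard library. The internal/external split by URL scheme is a simplifying assumption, not the paper's algorithm:

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects the href targets of <a> tags, separating relative
    (internal) links from absolute http(s) (external) ones - the kind
    of raw structural data the analysis above builds its data sets from."""
    def __init__(self):
        super().__init__()
        self.internal, self.external = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        for name, value in attrs:
            if name == "href" and value:
                if value.startswith(("http://", "https://")):
                    self.external.append(value)
                else:
                    self.internal.append(value)

# tiny illustrative page
page = "<a href='/about'>About</a><a href='https://example.org'>Ext</a>"
c = LinkCollector()
c.feed(page)
```

Run over every page of a site, counters like these yield the link and sub-link inventories from which rules and ranking factors can be extrapolated.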
Directory of Open Access Journals (Sweden)
Araújo Manuel
2016-01-01
Full Text Available Multi-material domains are often found in industrial applications. Modelling them can be computationally very expensive due to meshing requirements, and the properties of finite elements comprising different materials are hardly accurate. In this work, a new homogenization method that simplifies the computation of the homogenized Young's modulus, Poisson ratio and thermal expansion coefficient is proposed and applied to a composite-like material on a printed circuit board. The results show good agreement of properties between the homogenized domain and the simulation of the real geometry.
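For context, the classical Voigt and Reuss mixture rules bracket any homogenized Young's modulus of the kind computed above. This sketch shows only those textbook bounds, not the paper's proposed method:

```python
def voigt_reuss_bounds(moduli, fractions):
    """Upper (Voigt, equal-strain) and lower (Reuss, equal-stress)
    bounds on the homogenized Young's modulus of a multi-material
    domain with the given phase volume fractions."""
    assert abs(sum(fractions) - 1.0) < 1e-9, "fractions must sum to 1"
    voigt = sum(e * f for e, f in zip(moduli, fractions))          # arithmetic mean
    reuss = 1.0 / sum(f / e for e, f in zip(moduli, fractions))    # harmonic mean
    return voigt, reuss
```

Any admissible homogenized modulus must fall between the two values, which makes the bounds a quick sanity check on a homogenization result.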
Distributed source term analysis, a new approach to nuclear material inventory verification
Beddingfield, D H
2002-01-01
The Distributed Source-Term Analysis (DSTA) technique is a new approach to measuring in-process material holdup that is a significant departure from traditional holdup measurement methodology. The DSTA method determines the mass of nuclear material within a large, diffuse volume using passive neutron counting, and is more efficient than traditional methods of holdup measurement and inventory verification. The time spent performing DSTA measurement and analysis is a fraction of that required by traditional techniques, and the error ascribed to a DSTA survey result is generally smaller. Also, the negative bias ascribed to gamma-ray methods is greatly diminished because the DSTA method uses neutrons, which are more penetrating than gamma-rays.
Variability of apparently homogeneous soilscapes in São Paulo state, Brazil: I. spatial analysis
Directory of Open Access Journals (Sweden)
M. van Den Berg
2000-06-01
Full Text Available The spatial variability of strongly weathered soils under sugarcane and soybean/wheat rotation was quantitatively assessed on 33 fields in two regions of São Paulo State, Brazil: Araras (15 fields with sugarcane) and Assis (11 fields with sugarcane and seven fields with soybean/wheat rotation). The statistical methods used were nested analysis of variance (for 11 fields), semivariance analysis, and analysis of variance within and between fields. Spatial scales from 50 m to several km were analyzed, and the results are discussed with reference to a previously published study carried out in the surroundings of Passo Fundo (RS). Similar variability patterns were found for clay content, organic C content and cation exchange capacity; the fields studied are quite homogeneous with respect to these relatively stable soil characteristics. The spatial variability of other characteristics (resin-extractable P, pH, base and Al saturation, and also soil colour) varies with region and/or land use management. Soil management for sugarcane seems to have induced modifications to greater depths than for soybean/wheat rotation. Surface layers of soils under soybean/wheat present relatively little variation, apparently as a result of very intensive soil management. The major part of within-field variation occurs at short distances (< 50 m) in all study areas. Hence, little extra information would be gained by increasing sampling density from, say, 1/km² to 1/50 m². For many purposes, the soils in the study regions can be mapped with the same observation density, but the residual variance will not be the same in all areas. Bulk sampling may help to reveal spatial patterns between 50 and 1,000 m.
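The semivariance analysis used above relies on the classical estimator γ(h) = (1/2N(h)) Σ (z_i − z_j)², taken over point pairs separated by approximately the lag h. A minimal sketch:

```python
import numpy as np

def empirical_semivariance(values, coords, lag, tol):
    """Classical semivariance estimator: the mean of 0.5*(z_i - z_j)^2
    over all point pairs whose separation distance falls within
    lag +/- tol. Rising semivariance with lag indicates spatial
    structure; a flat curve indicates homogeneity at that scale."""
    values = np.asarray(values, dtype=float)
    coords = np.asarray(coords, dtype=float)
    sq, n = 0.0, 0
    for i in range(len(values)):
        for j in range(i + 1, len(values)):
            h = np.linalg.norm(coords[i] - coords[j])
            if abs(h - lag) <= tol:
                sq += 0.5 * (values[i] - values[j]) ** 2
                n += 1
    return sq / n if n else float("nan")
```

Evaluating the estimator at a series of lags (e.g. 50 m, 100 m, ..., 1,000 m) yields the empirical semivariogram from which the within-field and between-field variance contributions are read off.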
Mukhacheva, Tatyana A; Salikhova, Irina I; Kovalev, Sergey Y
2015-04-01
Borrelia miyamotoi, a member of the relapsing fever group borreliae, was first isolated in Japan and subsequently found in Ixodes ticks in North America, Europe and Russia. Currently, there are three types of B. miyamotoi: Asian or Siberian (transmitted mainly by Ixodes persulcatus), European (Ixodes ricinus) and American (Ixodes scapularis and Ixodes pacificus). Despite the great genetic distances between B. miyamotoi types, isolates within a type are characterised by extremely low genetic variability. In particular, strains of B. miyamotoi of the Asian type, isolated in Russia from the Baltic Sea to the Far East, have been shown to be identical based on the analysis of several conventional genetic markers, such as the 16S rRNA, flagellin, outer membrane protein p66 and glpQ genes. Thus, protein- or rRNA-coding genes are not informative enough for studying the genetic diversity of B. miyamotoi within a type. In the present paper, we have attempted to design a new multilocus technique based on eight non-coding intergenic spacers (3686 bp in total) and have applied it to the analysis of intra-type genetic variability of B. miyamotoi detected in different regions of Russia and in two tick species, I. persulcatus and Ixodes pavlovskyi. However, even though potentially the most variable loci were selected, no genetic variability between the studied DNA samples was found, except for one nucleotide substitution in two of them. The sequences obtained were identical to those of the reference strain FR64b. Analysis of the data obtained together with GenBank sequences indicates a highly homogeneous genetic background of B. miyamotoi from the Baltic Sea to the Japanese Islands. In this paper, a hypothesis of clonal expansion of B. miyamotoi is discussed, as well as possible mechanisms for the rapid dissemination of one B. miyamotoi clone over large distances.
Energy Technology Data Exchange (ETDEWEB)
Yasuaki, Ichikawa; Somehai, Prayongphan [Nagoya Univ., Dpt. of Environmental Engineering and Architecture, Chikusa, Nagoya (Japan); Kazumi, Kitayama [NUMO, Minato, Tokyo (Japan); Katsuyuki, Kawamura [Tokyo Institute of Technology, Tokyo (Japan)
2005-07-01
The major scenario for transport of radioactive nuclides in most HLW projects is groundwater flow. The phenomena of water flow and diffusion of chemical species in a bentonite buffer and the surrounding rock mass have been treated mainly with classical porous media theories based on the Darcy and Fick laws. The classical theories involve the following difficulties: 1) the true velocity field is hard to identify, especially at the microscale, although it essentially affects the transport of chemical species; 2) the classical theories are not applicable to problems for which experimental data are not available, so very long-term behavior cannot be proven. It is commonly recognized that water flow in bentonite and mudstone is strongly retarded, and it is highly doubtful whether the classical theories apply to such very low-permeability materials. In this work we first show that the velocity and diffusion fields in pure smectite bentonite can be calculated by coupled molecular dynamics (MD) simulation and homogenization analysis (HA). The true velocity field can be calculated by applying HA to the Navier-Stokes equation, with the local viscosity distribution used in the HA obtained by MD. The diffusion field is calculated by the same MD/HA procedure, using the local diffusion equation with diffusivity calculated by MD. (authors)
An analysis of the trend in ground-level ozone using non-homogeneous poisson processes
Shively, Thomas S.
This paper provides a method for measuring the long-term trend in the frequency with which ground-level ozone in the ambient air exceeds the U.S. Environmental Protection Agency's National Ambient Air Quality Standard (NAAQS) for ozone. A major weakness of previous studies that estimate the long-term trend in the very high values of ozone, and therefore the long-term trend in the probability of satisfying the NAAQS for ozone, is their failure to account for the confounding effects of meteorological conditions on ozone levels. Meteorological variables such as temperature, wind speed, and frontal passage play an important role in the formation of ground-level ozone. A non-homogeneous Poisson process is used to account for the relationship between very high values of ozone and meteorological conditions. This model provides an estimate of the trend in the ozone values after allowing for the effects of meteorological conditions, and therefore a means to measure the effectiveness of pollution control programs after accounting for the effects of changing weather conditions. When our approach is applied to data collected at two sites in Houston, TX, we find evidence of a gradual long-term downward trend in the frequency of high ozone values. The empirical results indicate how misleading results can be obtained if the analysis does not account for changing weather conditions.
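In a non-homogeneous Poisson process, the expected number of exceedances over a period is the integral of the intensity function over time. A hedged sketch with a simple log-linear intensity; the paper's meteorological covariates (temperature, wind speed, frontal passage) are omitted here:

```python
import math

def expected_exceedances(beta0, beta1, t_end, n_steps=10_000):
    """Expected number of threshold exceedances over [0, t_end] for a
    non-homogeneous Poisson process with log-linear intensity
    lambda(t) = exp(beta0 + beta1 * t), integrated numerically with
    the trapezoid rule. A negative beta1 corresponds to a downward
    long-term trend in exceedance frequency."""
    dt = t_end / n_steps
    total = 0.0
    for k in range(n_steps):
        t0, t1 = k * dt, (k + 1) * dt
        total += 0.5 * dt * (math.exp(beta0 + beta1 * t0)
                             + math.exp(beta0 + beta1 * t1))
    return total
```

With beta1 = 0 the process is homogeneous and the expected count reduces to lambda * t_end; the numerical integral matches the closed form (exp(beta0 + beta1*t_end) - exp(beta0)) / beta1 when beta1 is nonzero.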
Krzysztof, Kecik; Borowiec, Marek; Rafał, Rusinek
2016-01-01
The aim of this work is to verify the correctness of the stability lobe diagrams of the milling process determined by the commercial software CutPro 9. The analysis is performed for the nickel superalloy Inconel 718, which is widely used in the aviation industry. A stability analysis methodology based on advanced nonlinear methods, namely recurrence plots, recurrence quantification analysis and composite multiscale entropy analysis, is applied to the experimental data. Additionally, a new criterion for determining the unstable areas is proposed.
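A basic building block of the recurrence-based methods named above is the binary recurrence matrix and its recurrence rate. A minimal sketch on a scalar signal (full recurrence quantification analysis works on time-delay embedded state vectors):

```python
import numpy as np

def recurrence_rate(signal, eps):
    """Builds the binary recurrence matrix R[i, j] = 1 if
    |x_i - x_j| <= eps and returns the recurrence rate, i.e. the share
    of recurrent pairs - one of the basic quantities of recurrence
    quantification analysis. Periodic (stable cutting) signals give a
    much higher rate than chatter-like irregular signals."""
    x = np.asarray(signal, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    r = (dist <= eps).astype(int)
    np.fill_diagonal(r, 0)            # exclude trivial self-recurrences
    n = len(x)
    return r.sum() / (n * (n - 1))
```

Tracking the recurrence rate (and related measures such as determinism) along the spindle-speed/depth-of-cut plane is one way such methods flag the unstable areas of a stability lobe diagram.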
Chadwick, John C; Freixa, Zoraida; van Leeuwen, Piet W N M
2011-01-01
This first book to illuminate this important aspect of chemical synthesis helps readers improve the lifetime of catalysts, thus reducing material use and saving energy, costs and waste. The international panel of expert authors describes the studies that have been conducted concerning the way homogeneous catalysts decompose, and the differences between homogeneous and heterogeneous catalysts. The result is a ready reference for organic, catalytic, polymer and complex chemists, as well as those working in industry and with/on organometallics.
Energy Technology Data Exchange (ETDEWEB)
Kalimullah, M. [Argonne National Lab. (ANL), Argonne, IL (United States); Olson, Arne P. [Argonne National Lab. (ANL), Argonne, IL (United States); Feldman, E. E. [Argonne National Lab. (ANL), Argonne, IL (United States); Hanan, N. [Argonne National Lab. (ANL), Argonne, IL (United States); Dionne, B. [Argonne National Lab. (ANL), Argonne, IL (United States)
2015-04-07
The document compiles in a single volume several verification and validation works done for the PLTEMP/ANL code over the years of its development and improvement. Some works that are available in the open literature are simply referenced at the outset and are not included in the document. PLTEMP has been used in conversion safety analysis reports of several US and foreign research reactors that have been licensed and converted; a list of such reactors is given. Each chapter of the document deals with the verification or validation of a specific model. Model verification is usually done by comparing the code with a hand calculation, a spreadsheet calculation, or a Mathematica calculation. Model validation is done by comparing the code with experimental data or with a more extensively validated code such as RELAP5.
Verification and large deformation analysis using the reproducing kernel particle method
Energy Technology Data Exchange (ETDEWEB)
Beckwith, Frank [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)
2015-09-01
The reproducing kernel particle method (RKPM) is a meshless method used to solve general boundary value problems using the principle of virtual work. RKPM corrects the kernel approximation by introducing reproducing conditions which force the method to be complete to arbitrary-order polynomials selected by the user. Effort in recent years has led to the implementation of RKPM within the Sierra/SM physics software framework. The purpose of this report is to investigate convergence of RKPM for verification and validation purposes as well as to demonstrate the large deformation capability of RKPM in problems where the finite element method is known to experience difficulty. Results from analyses using RKPM are compared against finite element analysis. A host of issues associated with RKPM are identified and a number of potential improvements are discussed for future work.
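The reproducing conditions mentioned above can be checked directly in 1D: with a linear basis, the corrected kernel functions must reproduce constants and linear fields exactly. A small self-contained sketch (the node layout, kernel, and support size are arbitrary choices for illustration, not Sierra/SM settings):

```python
import math

def cubic_bspline(z):
    """Cubic B-spline kernel with support |z| < 2 in normalized coordinates."""
    z = abs(z)
    if z < 1.0:
        return 2.0 / 3.0 - z * z + 0.5 * z ** 3
    if z < 2.0:
        return (2.0 - z) ** 3 / 6.0
    return 0.0

def rkpm_shape_functions(x, nodes, a):
    """1D RKPM shape functions with linear completeness.

    Psi_I(x) = [b0 + b1*(x - x_I)] * phi((x - x_I)/a), where
    [b0, b1] = M(x)^{-1} [1, 0] and M = sum_I H H^T phi is the
    2x2 moment matrix with H = [1, x - x_I].
    """
    m00 = m01 = m11 = 0.0
    kernels = []
    for xI in nodes:
        h = x - xI
        phi = cubic_bspline(h / a)
        m00 += phi
        m01 += phi * h
        m11 += phi * h * h
        kernels.append((phi, h))
    det = m00 * m11 - m01 * m01
    b0, b1 = m11 / det, -m01 / det          # M^{-1} applied to [1, 0]
    return [(b0 + b1 * h) * phi for phi, h in kernels]

nodes = [0.1 * i for i in range(11)]   # uniform nodes on [0, 1]
a = 0.15                               # kernel dilation parameter
x = 0.37                               # evaluation point (not on a node)
psi = rkpm_shape_functions(x, nodes, a)
partition_of_unity = sum(psi)                                    # should be 1
linear_reproduction = sum(p * xI for p, xI in zip(psi, nodes))   # should be x
print(partition_of_unity, linear_reproduction)
```

Both reproducing conditions hold to machine precision by construction, which is exactly the property verification studies of RKPM exercise before moving to convergence rates.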
Requirement Assurance: A Verification Process
Alexander, Michael G.
2011-01-01
Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
Analysis of zinc oxide nanoparticles binding proteins in rat blood and brain homogenate
Directory of Open Access Journals (Sweden)
Shim KH
2014-12-01
Full Text Available Kyu Hwan Shim,1 John Hulme,1 Eun Ho Maeng,2 Meyoung-Kon Kim,3 Seong Soo A An1 1Department of Bionano Technology, Gachon Medical Research Institute, Gachon University, Sungnam-si, Gyeonggi-do, South Korea; 2Department of Analysis, KTR, Kimpo, Gyeonggi-do, South Korea; 3Department of Biochemistry and Molecular Biology, Korea University Medical School and College, Seoul, South Korea Abstract: Nanoparticles (NPs) are currently used in chemical, cosmetic, pharmaceutical, and electronic products. Nevertheless, limited safety information is available for many NPs, especially in terms of their interactions with various binding proteins, leading to potential toxic effects. Zinc oxide (ZnO) NPs are included in the formulation of new products, such as adhesives, batteries, ceramics, cosmetics, cement, glass, ointments, paints, pigments, and supplementary foods, resulting in increased human exposure to ZnO. Hence, we investigated the potential ZnO nanotoxic pathways by analyzing the adsorbed proteins, called the protein corona, from blood and brain for four ZnO NPs, ZnOSM20(-), ZnOSM20(+), ZnOAE100(-), and ZnOAE100(+), in order to understand their potential mechanisms in vivo. Liquid chromatography-mass spectrometry/mass spectrometry technology was employed to identify all bound proteins. Totals of 52 and 58 plasma proteins were identified as being bound to ZnOSM20(-) and ZnOSM20(+), respectively. For ZnOAE100(-) and ZnOAE100(+), 58 and 44 proteins were bound, respectively. Similar numbers of proteins were adsorbed onto ZnO irrespective of the size or surface charge of the nanoparticles. These proteins were further analyzed with ClueGO, a Cytoscape plugin, which provided gene ontology and the biological interaction processes of the identified proteins. Interactions between diverse proteins and ZnO nanoparticles could result in an alteration of their functions, conformation, and clearance, eventually affecting many biological processes. Keywords: brain
Analysis of global climate variability from homogeneously reprocessed ground-based GNSS measurements
Ahmed, Furqan; Hunegnaw, Addisu; Teferle, Felix Norman; Bingley, Richard
2015-04-01
The tropospheric delay information obtained through long-term homogeneous reprocessing of Global Navigation Satellite System (GNSS) observations can be used for climate change and variability analysis on a global scale. A reprocessed global dataset of GNSS-derived zenith total delay (ZTD) and position estimates, based on the network double differencing (DD) strategy and covering 1994-2012, has recently been produced at the University of Luxembourg using the Bernese GNSS Software 5.2 (BSW5.2) and the reprocessed products from the Centre for Orbit Determination in Europe (CODE). The network of ground-based GNSS stations processed to obtain this dataset consists of over 400 globally distributed stations. The GNSS-derived ZTD has been validated by comparing it to that derived from reanalysis data from the European Centre for Medium-Range Weather Forecasts (ECMWF). After validation and quality control, the ZTD dataset obtained using the DD strategy has been used to investigate the inter-annual, seasonal and diurnal climate variability and trends in the tropospheric delay on various regional to global spatial scales. Precise point positioning (PPP) is a processing strategy for GNSS observations which is based on observations from a single station rather than a network of baselines and is therefore computationally more efficient than the DD strategy. However, the two processing strategies, i.e. DD and PPP, have their own strengths and weaknesses and could affect the solutions differently at different geographical locations. In order to explore the use of the PPP strategy for climate monitoring, another experimental dataset covering a shorter period has been produced using the PPP strategy and compared to the DD-based ZTD dataset.
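Extracting a climate signal from a ZTD series of this kind amounts to fitting a linear trend plus an annual harmonic by least squares. The sketch below uses synthetic monthly values; the magnitudes (mean, trend, seasonal amplitude, noise) are illustrative assumptions, not values from the Luxembourg dataset:

```python
import math
import random

def solve_linear_system(A, b):
    """Gaussian elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for c in range(n):
        piv = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[piv] = M[piv], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def fit_trend_seasonal(t, y):
    """Least squares for y = a + b*t + c*cos(2*pi*t) + d*sin(2*pi*t)."""
    rows = [[1.0, ti, math.cos(2 * math.pi * ti), math.sin(2 * math.pi * ti)]
            for ti in t]
    # normal equations X^T X beta = X^T y
    XtX = [[sum(r[i] * r[j] for r in rows) for j in range(4)] for i in range(4)]
    Xty = [sum(r[i] * yi for r, yi in zip(rows, y)) for i in range(4)]
    return solve_linear_system(XtX, Xty)

rng = random.Random(2)
t = [m / 12.0 for m in range(12 * 19)]   # 19 years of monthly epochs (years)
# synthetic ZTD in mm: mean 2400, +1.5 mm/yr trend, 50 mm annual cycle, noise
y = [2400.0 + 1.5 * ti + 50.0 * math.cos(2 * math.pi * ti) + rng.gauss(0, 10)
     for ti in t]

a, b, c, d = fit_trend_seasonal(t, y)
print("estimated trend (mm/yr):", b)
```

Fitting the seasonal terms jointly with the trend matters: omitting them aliases part of the annual cycle into the trend estimate, the ZTD analogue of the weather-confounding issue.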
Automation and uncertainty analysis of a method for in-vivo range verification in particle therapy
Frey, K.; Unholtz, D.; Bauer, J.; Debus, J.; Min, C. H.; Bortfeld, T.; Paganetti, H.; Parodi, K.
2014-10-01
We introduce the automation of the range difference calculation deduced from particle-irradiation induced β+-activity distributions with the so-called most-likely-shift approach, and evaluate its reliability via the monitoring of algorithm- and patient-specific uncertainty factors. The calculation of the range deviation is based on the minimization of the absolute profile differences in the distal part of two activity depth profiles shifted against each other. Depending on the workflow of positron emission tomography (PET)-based range verification, the two profiles under evaluation can correspond to measured and simulated distributions, or only measured data from different treatment sessions. In comparison to previous work, the proposed approach includes an automated identification of the distal region of interest for each pair of PET depth profiles, taking the planned dose distribution into consideration, resulting in the optimal shift distance. Moreover, it introduces an estimate of the uncertainty associated with the identified shift, which is then used as a weighting factor to 'red flag' problematic large range differences. Furthermore, additional patient-specific uncertainty factors are calculated using available computed tomography (CT) data to support the range analysis. The performance of the new method for in-vivo treatment verification in the clinical routine is investigated with in-room PET images for proton therapy as well as with offline PET images for proton and carbon ion therapy. The comparison between measured PET activity distributions and predictions obtained by Monte Carlo simulations or measurements from previous treatment fractions is performed. For this purpose, a total of 15 patient datasets were analyzed, which were acquired at Massachusetts General Hospital and the Heidelberg Ion-Beam Therapy Center with in-room PET and offline PET/CT scanners, respectively. Calculated range differences between the compared activity distributions are reported in
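The core of the most-likely-shift idea — slide one activity depth profile against the other and minimize the absolute difference over the distal region — can be sketched in a few lines. The profiles, bin shift, and fixed distal window below are synthetic; the clinical method additionally selects the region from the planned dose and weights the result by an uncertainty estimate:

```python
import math

def shift_cost(ref, meas, s, distal_start):
    """Mean absolute difference over the distal region when `meas`
    is shifted by s bins relative to `ref`."""
    n = len(ref)
    total = count = 0
    for i in range(distal_start, n):
        j = i + s
        if 0 <= j < n:
            total += abs(ref[i] - meas[j])
            count += 1
    return total / count

def most_likely_shift(ref, meas, max_shift, distal_start):
    """Shift (in bins) minimizing the distal absolute profile difference."""
    shifts = range(-max_shift, max_shift + 1)
    return min(shifts, key=lambda s: shift_cost(ref, meas, s, distal_start))

def profile(range_bin, n=120, width=4.0):
    """Synthetic activity depth profile: plateau with a sigmoid distal falloff."""
    return [1.0 / (1.0 + math.exp((k - range_bin) / width)) for k in range(n)]

ref = profile(60.0)      # e.g. Monte Carlo prediction
meas = profile(63.0)     # e.g. measured PET profile, falloff 3 bins deeper

shift = most_likely_shift(ref, meas, max_shift=10, distal_start=50)
print("detected range difference (bins):", shift)
```

With noise-free synthetic profiles the minimizer recovers the imposed 3-bin range difference exactly; on real data the cost landscape is noisy, which is why the paper attaches an uncertainty estimate to the identified shift.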
Comparative analysis on homogeneity of Pb and Cd in epoxy molding compounds
Institute of Scientific and Technical Information of China (English)
Kyun-Gmee LEE; Yong-Keun SON; Jin-Sook LEE; Tai-Min NOH; Hee-Soo LEE
2011-01-01
Reference materials for the quantitative determination of regulated heavy metals, such as Pb and Cd in electronic components, were designed and investigated in terms of stability and homogeneity. Reference materials with two concentration levels of heavy metals were prepared by spiking Pb and Cd compounds into epoxy molding compounds made by mixing silica powders and epoxy resin. No concentration changes of the reference materials were observed during the one-year stability test. In the homogeneity assessment, the as-prepared reference materials were studied using three different analytical tools: inductively coupled plasma atomic emission spectrometry (ICP-AES), X-ray fluorescence spectrometry (XRF) and laser ablation ICP mass spectrometry. The results show different homogeneities depending on the characteristics of the analytical tools and the materials.
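Homogeneity assessments of candidate reference materials are commonly summarized with a one-way ANOVA comparing between-unit and within-unit variation. This is a generic sketch of that idea with simulated measurements, not the statistical protocol actually used in the study:

```python
import random

def one_way_anova_F(groups):
    """F = MS_between / MS_within for a list of measurement groups
    (one group of replicate determinations per sampled unit)."""
    k = len(groups)
    N = sum(len(g) for g in groups)
    grand = sum(sum(g) for g in groups) / N
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (N - k))

rng = random.Random(3)
# ten sampled units, three replicate Pb determinations each (mg/kg, synthetic)
homogeneous = [[100.0 + rng.gauss(0, 1.0) for _ in range(3)] for _ in range(10)]
# the same, but with a real unit-to-unit effect added
inhomogeneous = [[100.0 + unit_off + rng.gauss(0, 1.0) for _ in range(3)]
                 for unit_off in (rng.gauss(0, 4.0) for _ in range(10))]

F_hom = one_way_anova_F(homogeneous)
F_inhom = one_way_anova_F(inhomogeneous)
print(F_hom, F_inhom)
```

When the material is homogeneous, F stays near 1; a genuine between-unit component inflates it, which is the signal a homogeneity assessment looks for regardless of the analytical tool used.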
Development of the Verification and Validation Matrix for Safety Analysis Code SPACE
Energy Technology Data Exchange (ETDEWEB)
Kim, Yo Han; Ha, Sang Jun; Yang, Chang Keun [Korea Electric Power Research Institute, Daejeon (Korea, Republic of)
2009-10-15
Korea Electric Power Research Institute (KEPRI) has developed the safety analysis code SPACE (Safety and Performance Analysis CodE for Nuclear Power Plant) for typical pressurized water reactors (PWRs). The safety analysis codes in current use were obtained from foreign vendors, such as Westinghouse Electric Corp., ABB Combustion Engineering Inc., Kraftwerk Union, etc. Given the conservatism and inflexibility of the foreign code systems, it is difficult to expand their application areas and analysis scopes. To overcome these problems, KEPRI launched a project to develop a native safety analysis code with Korea Power Engineering Co. (KOPEC), Korea Atomic Energy Research Institute (KAERI), Korea Nuclear Fuel (KNF), and Korea Hydro and Nuclear Power Co. (KHNP) under the funding of the Ministry of Knowledge Economy (MKE). As a result of the project, the demo version of SPACE was released in July 2009. In preparation for the next step, KEPRI and colleagues have developed the verification and validation (V and V) matrix for SPACE. To develop the matrix, the preceding studies and experiments were reviewed. After mature consideration, the V and V matrix was developed, and experiment plans were designed for the next step to compensate for the lack of data.
Phase resolved analysis of the homogeneity of a diffuse dielectric barrier discharge
Baldus, Sabrina; Kogelheide, Friederike; Bibinov, Nikita; Stapelmann, Katharina; Awakowicz, Peter
2015-09-01
Cold atmospheric pressure plasmas have already proven their ability to support the healing process of chronic wounds. Especially simple configurations such as the dielectric barrier discharge (DBD), comprising one driven electrode coated with a dielectric layer, are of interest, because they are cost-effective and easy to handle. The homogeneity of such plasmas during treatment is important, since the whole wound should be treated evenly. In this investigation, phase resolved optical emission spectroscopy is used to investigate the homogeneity of a DBD. Electron densities and reduced electric field distributions are determined with temporal and spatial resolution, and the differences between applied positive and negative voltage pulses are studied.
A four-scale homogenization analysis of creep of a nuclear containment structure
Energy Technology Data Exchange (ETDEWEB)
Tran, A.B. [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); EDF R and D – Département MMC Site des Renardières – Avenue des Renardières - Ecuelles, 77818 Moret sur Loing Cedex (France); Department of Applied Informatics in Construction, National University of Civil Engineering, 55 Giai Phong Road, Hai Ba Trung District, Hanoi (Viet Nam); Yvonnet, J., E-mail: julien.yvonnet@univ-paris-est.fr [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); He, Q.-C. [Université Paris-Est, Laboratoire Modélisation et Simulation Multi Échelle, MSME UMR 8208 CNRS, 5 bd Descartes, F-77454 Marne-la-Vallée (France); Toulemonde, C.; Sanahuja, J. [EDF R and D – Département MMC Site des Renardières – Avenue des Renardières - Ecuelles, 77818 Moret sur Loing Cedex (France)
2013-12-15
A four-scale approach is proposed to predict the creep behavior of a concrete structure. The behavior of concrete is modeled through a numerical multiscale methodology, by successively homogenizing the viscoelastic behavior at different scales, starting from the cement paste. The homogenization is carried out by numerically constructing an effective relaxation tensor at each scale. In this framework, the impact of modifying the microstructural parameters can be directly observed on the structure response, like the interaction of the creep of concrete with the prestressing tendons network, and the effects of an internal pressure which might occur during a nuclear accident.
Radiation of Air-Borne Noise in Non-Homogeneous Wind and Temperature Fields using FEM Analysis
DEFF Research Database (Denmark)
Kirkegaard, Poul Henning; Nielsen, Søren R. K.; Krenk, S.;
1999-01-01
The paper describes analysis in the time domain of noise propagating in non-homogeneous mean wind or temperature fields. The analysis is based on a field equation for the velocity potential, which contains strong convection terms. In order to circumvent the problem of numerical instability and lo...... source with a prescribed time variation. Stability and accuracy of the numerical scheme have been estimated for different values of the Mach number, the Courant number and the wave length to element length ratio....
Energy Technology Data Exchange (ETDEWEB)
Wilczek, M; Friedrich, R [Institute for Theoretical Physics, University of Muenster, Wilhelm-Klemm-Str. 9, 48149 Muenster (Germany); Kadoch, B [Aix-Marseille Universite and M2P2-CNRS Ecole Centrale de Marseille, 38 Rue Joliot-Curie, 13451 Marseille Cedex 20 (France); Schneider, K [M2P2-CNRS and CMI, Universite de Provence, 39 Rue Joliot-Curie, 13453 Marseille Cedex 13 (France); Farge, M, E-mail: mwilczek@uni-muenster.de [LMD-CNRS, Ecole Normale Superieure, 24 Rue Lhomond, 75231 Paris Cedex 5 (France)
2011-12-22
We study the conditional balance of vortex stretching and vorticity diffusion of fully developed three-dimensional homogeneous isotropic turbulence with respect to coherent and incoherent flow contributions. This decomposition is achieved by the Coherent Vorticity Extraction based on orthogonal wavelets applied to DNS data, which yields insights into the influence of the different contributions as well as their interaction.
Global bifurcation analysis and pattern formation in homogeneous diffusive predator-prey systems
Wang, Jinfeng; Wei, Junjie; Shi, Junping
2016-02-01
The dynamics of a general diffusive predator-prey system is considered. Existence and nonexistence of non-constant positive steady state solutions are shown to identify the ranges of parameters of spatial pattern formation. Bifurcations of spatially homogeneous and nonhomogeneous periodic solutions as well as non-constant steady state solutions are studied.
Ohno, Shuji; Ohshima, Hiroyuki; Tajima, Yuji; Ohki, Hiroshi
The thermodynamic consequences of a liquid sodium leak and fire accident are among the important issues to be evaluated when considering the safety of a fast reactor plant building. The authors are therefore initiating a systematic verification and validation (V&V) activity to assure and demonstrate the reliability of numerical simulation tools for sodium fire analysis. The V&V activity is in progress, with the main focus on the already developed sodium fire analysis codes SPHINCS and AQUA-SF. The events to be evaluated are hypothetical sodium spray, pool, or combined fire accidents, followed by the thermodynamic behaviors postulated in a plant building. The present paper describes how a 'Phenomena Identification and Ranking Table (PIRT)' is first developed to clarify the important validation points in the sodium fire analysis codes, and how an 'Assessment Matrix' is proposed which summarizes both separate effect tests and integral effect tests for validating the computational models or the whole code for important phenomena. Furthermore, the paper shows a practical validation with a separate effect test, in which the spray droplet combustion model of SPHINCS and AQUA-SF predicts the burned amount of a falling sodium droplet with an error mostly less than 30%.
Okada, Jun-Ichi; Hisada, Toshiaki
It is well known that the compressibility or incompressibility of biological tissue stems from its microscopic structure, which is generally composed of material with varied compressibility, including incompressibility. This paper proposes a framework for a homogenization method in which the compressibility/incompressibility of the macrostructure properly reflects that of the microstructure. The formulation is based on the mixed variational principle with a perturbed Lagrange-multiplier. It is shown that the rate of volumetric change of the macrostructure can be controlled through the homogenization procedure by introducing the constraint on the microstructure only. A couple of numerical examples are given to demonstrate the validity of the proposed method. By comparing the numerical results with theoretical solutions, the method is also confirmed to be free from locking.
Derivation of the state matrix for dynamic analysis of linear homogeneous media.
Parra Martinez, Juan Pablo; Dazel, Olivier; Göransson, Peter; Cuenca, Jacques
2016-08-01
A method to obtain the state matrix of an arbitrary linear homogeneous medium excited by a plane wave is proposed. The approach is based on projections on the eigenspace of the governing equations matrix. It is an alternative to manually obtaining a linearly independent set of equations by combining the governing equations. The resulting matrix has been validated against previously published derivations for an anisotropic poroelastic medium.
Improved method for quantitative analysis of the cyclotide kalata B1 in plasma and brain homogenate
Eriksson, Camilla; Jansson, Britt; Göransson, Ulf; Hammarlund‐Udenaes, Margareta
2016-01-01
Abstract This study provides a new method for quantifying the cyclotide kalata B1 in both plasma and brain homogenate. Cyclotides are ultra‐stable peptides with three disulfide bonds that are interesting from a drug development perspective as they can be used as scaffolds. In this study we describe a new validated LC‐MS/MS method with high sensitivity and specificity for kalata B1. The limit of quantification was 2 ng/mL in plasma and 5 ng/g in brain homogenate. The method was linear in the range 2–10,000 ng/mL for plasma and 5–2000 ng/g for brain. Liquid chromatographic separation was performed on a HyPurity C18 column, 50 × 4.6 mm, 3 µm particle size. The method had inter‐ and intra‐day precision and accuracy levels of <15% and 12%, respectively. Applying the method to in vivo plasma samples and brain homogenate samples from equilibrium dialysis yielded satisfying results and was able to describe the plasma pharmacokinetics and brain tissue binding of kalata B1. The described method is quick, reproducible and well suited to quantifying kalata B1 in biological matrices. PMID:27603276
Improved method for quantitative analysis of the cyclotide kalata B1 in plasma and brain homogenate.
Melander, Erik; Eriksson, Camilla; Jansson, Britt; Göransson, Ulf; Hammarlund-Udenaes, Margareta
2016-11-01
This study provides a new method for quantifying the cyclotide kalata B1 in both plasma and brain homogenate. Cyclotides are ultra-stable peptides with three disulfide bonds that are interesting from a drug development perspective as they can be used as scaffolds. In this study we describe a new validated LC-MS/MS method with high sensitivity and specificity for kalata B1. The limit of quantification was 2 ng/mL in plasma and 5 ng/g in brain homogenate. The method was linear in the range 2-10,000 ng/mL for plasma and 5-2000 ng/g for brain. Liquid chromatographic separation was performed on a HyPurity C18 column, 50 × 4.6 mm, 3 µm particle size. The method had inter- and intra-day precision and accuracy levels of <15% and 12%, respectively. Applying the method to in vivo plasma samples and brain homogenate samples from equilibrium dialysis yielded satisfying results and was able to describe the plasma pharmacokinetics and brain tissue binding of kalata B1. The described method is quick, reproducible and well suited to quantifying kalata B1 in biological matrices.
Sujatha Bhat; Ajeetkumar Patil; Lavanya Rai; Kartha, V. B.; Santhosh Chidangil
2012-01-01
A highly objective method, the High Performance Liquid Chromatography with Laser Induced Fluorescence (HPLC-LIF) technique, was used to study the protein profiles of normal and cervical cancer tissue homogenates. A total of 44 samples, including normal cervical biopsy samples from hysterectomy patients and samples from patients suffering from different stages of cervical cancer, were recorded by HPLC-LIF and analysed by Principal Component Analysis (PCA) to get statistical information on different t...
Energy Technology Data Exchange (ETDEWEB)
Sanchez, R.; Ragusa, J.; Santandrea, S. [Commissariat a l' Energie Atomique, Direction de l' Energie Nucleaire, Service d' Etudes de Reacteurs et de Modelisation Avancee, CEA de Saclay, DM2S/SERMA 91 191 Gif-sur-Yvette cedex (France)]. e-mail: richard.sanchez@cea.fr
2004-07-01
The problem of the determination of a homogeneous reflector that preserves a set of prescribed albedos is considered. Duality is used for a direct estimation of the derivatives needed in the iterative calculation of the optimal homogeneous cross sections. The calculation is based on the preservation of collapsed multigroup albedos obtained from detailed reference calculations and depends on the low-order operator used for core calculations. In this work we analyze diffusion and transport as low-order operators and argue that the P{sub 0} transfers are the best choice for the unknown cross sections to be adjusted. Numerical results illustrate the new approach for SP{sub N} core calculations. (Author)
Spiegel, van der M.; Fels-Klerx, van der H.J.; Sterrenburg, P.; Ruth, van S.M.; Scholtens-Toma, I.M.J.; Kok, E.J.
2012-01-01
The global halal market is increasing. Worldwide a large number of standardization and certification organizations has been established. This article discusses halal requirements, summarizes applied standards and certification, and explores current verification of halal certificates using audits and
Using the SAL technique for spatial verification of cloud processes: A sensitivity analysis
Weniger, Michael
2016-01-01
The feature based spatial verification method SAL is applied to cloud data, i.e. two-dimensional spatial fields of total cloud cover and spectral radiance. Model output is obtained from the COSMO-DE forward operator SynSat and compared to SEVIRI satellite data. The aim of this study is twofold. First, to assess the applicability of SAL to this kind of data, and second, to analyze the role of external object identification algorithms (OIA) and the effects of observational uncertainties on the resulting scores. As a feature based method, SAL requires external OIA. A comparison of three different algorithms shows that the threshold level, which is a fundamental part of all studied algorithms, induces high sensitivity and unstable behavior of object dependent SAL scores (i.e. even very small changes in parameter values can lead to large changes in the resulting scores). An in-depth statistical analysis reveals significant effects on distributional quantities commonly used in the interpretation of SAL, e.g. median...
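The threshold sensitivity found in the study is easy to reproduce with the simplest OIA, flood-fill connected components above a threshold, where a small threshold change can drop weak objects entirely. The toy field below is illustrative, not COSMO-DE or SEVIRI data:

```python
def count_objects(field, thr):
    """Number of 4-connected components with values >= thr (flood fill)."""
    n, m = len(field), len(field[0])
    seen = [[False] * m for _ in range(n)]
    count = 0
    for i in range(n):
        for j in range(m):
            if field[i][j] >= thr and not seen[i][j]:
                count += 1
                stack = [(i, j)]
                seen[i][j] = True
                while stack:
                    a, b = stack.pop()
                    for da, db in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        x, y = a + da, b + db
                        if 0 <= x < n and 0 <= y < m \
                                and not seen[x][y] and field[x][y] >= thr:
                            seen[x][y] = True
                            stack.append((x, y))
    return count

# a 10x10 "cloud" field with one strong and one weak feature
field = [[0.0] * 10 for _ in range(10)]
for i, j in ((2, 2), (2, 3), (3, 2), (3, 3)):
    field[i][j] = 2.0            # strong object
for i, j in ((7, 7), (7, 8)):
    field[i][j] = 1.0            # weak object

low, high = count_objects(field, 0.5), count_objects(field, 1.5)
print(low, high)   # → 2 1: the weak object disappears at the higher threshold
```

Object-dependent SAL scores inherit this discontinuity: a threshold nudge that removes or merges an object changes the object set, and hence the score, abruptly.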
Yang, Liangen; Wang, Xuanze; Lv, Wei
2011-05-01
A displacement sensor with controlled measuring force, together with its error analysis and precision verification, is discussed in this paper. The displacement sensor consists of a high-resolution electric induction transducer and a voice coil motor (VCM). The measuring principles, structure, method of enlarging the measuring range, and signal processing of the sensor are discussed. The main error sources, such as the parallelism error and incline of the framework caused by unequal lengths of the leaf springs, the rigidity of the measuring rods, the shape error of the stylus, friction between the iron core and other parts, damping of the leaf springs, variation of voltage, and the linearity, resolution and stability of the induction transducer, are analyzed. A measuring system for surface topography with a large measuring range is constructed based on the displacement sensor and a 2D moving platform. The measuring precision and stability of the measuring system are verified. The measuring force of the sensor during measurement of surface topography can be controlled at the μN level and hardly changes. It has been used in measurements of bearing balls, bullet marks, etc. It has a measuring range of up to 2 mm and precision at the nm level.
Modal analysis based equivalent circuit model and its verification for a single cMUT cell
Mao, S. P.; Rottenberg, X.; Rochus, V.; Czarnecki, P.; Helin, P.; Severi, S.; Nauwelaers, B.; Tilmans, H. A. C.
2017-03-01
This paper presents the lumped equivalent circuit model, and its verification, of both the transmission and reception properties of a single-cell capacitive micromachined ultrasonic transducer (cMUT) operating in the non-collapse small-signal region. The derivation of this equivalent circuit model is based on modal analysis techniques; harmonic modes are included by using the mode superposition method, and thus a wide frequency range response of the cMUT cell can be simulated by our equivalent circuit model. The importance of the cross modal coupling between different eigenmodes of a cMUT cell is discussed here for the first time. In this paper the development of this model is only illustrated for a single circular cMUT cell under a uniform excitation. Extension of this model and corresponding results under a more generalized excitation will be presented in our upcoming publication (Mao et al 2016 Proc. IEEE Int. Ultrasonics Symp.). This model is verified by both finite element method (FEM) simulation and experimental characterization. Results predicted by our model are in good agreement with the FEM simulation results for a single cMUT cell operated in either transmission or reception. Results obtained from the model also match the experimental results of the cMUT cell. This equivalent circuit model provides an easy and precise way to rapidly predict the behavior of cMUT cells.
Characteristics of a micro-fin evaporator: Theoretical analysis and experimental verification
Directory of Open Access Journals (Sweden)
Zheng Hui-Fan
2013-01-01
Full Text Available A theoretical analysis and experimental verification of the characteristics of a micro-fin evaporator using R290 and R717 as refrigerants were carried out. The heat capacity and heat transfer coefficient of the micro-fin evaporator were investigated under different water mass flow rates, different refrigerant mass flow rates, and different inner tube diameters of the micro-fin evaporator. The simulated heat transfer coefficients are in fairly good agreement with the experimental data. The results show that the heat capacity and the heat transfer coefficient of the micro-fin evaporator increase with increasing logarithmic mean temperature difference, water mass flow rate and refrigerant mass flow rate. The heat capacity of the micro-fin evaporator with a diameter of 9.52 mm is higher than that with a diameter of 7.00 mm when using R290 as the refrigerant. The heat capacity when using R717 as the refrigerant is higher than when using R290. The results of this study can provide useful guidelines for the optimal design and operation of micro-fin evaporators in present or future applications.
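The reported dependence of heat capacity on the logarithmic mean temperature difference follows from the basic rating equation Q = U·A·ΔT_lm. A minimal numeric sketch; U, A and the terminal temperature differences are made-up values, not the experimental conditions of the study:

```python
import math

def lmtd(dt1, dt2):
    """Logarithmic mean temperature difference for terminal
    temperature differences dt1 and dt2 (K)."""
    if abs(dt1 - dt2) < 1e-12:
        return dt1                      # limit dt1 -> dt2
    return (dt1 - dt2) / math.log(dt1 / dt2)

def heat_duty(U, A, dt1, dt2):
    """Q = U * A * LMTD (W): the basic heat exchanger rating equation."""
    return U * A * lmtd(dt1, dt2)

# assumed values: overall coefficient 1500 W/(m^2 K), area 0.5 m^2,
# terminal temperature differences 10 K and 5 K
q = heat_duty(1500.0, 0.5, 10.0, 5.0)
print(round(q, 1))
```

Since Q grows monotonically with ΔT_lm for fixed U·A, a larger LMTD directly raises the transferred duty, which matches the trend the abstract reports.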
Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor, Rev. 1.0
Energy Technology Data Exchange (ETDEWEB)
Lee, Kyung Hoon; Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun
2005-12-15
Nuclear design for the SMART reactor is performed using the transport lattice code HELIOS and the core analysis code MASTER. The HELIOS code, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior, and is used to generate few-group constants. The MASTER code is a nodal diffusion code developed by KAERI, and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system through comparison of calculation results with measured ones. Therefore, the uncertainties in the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation includes the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code has been verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified by Monte Carlo calculations for the SMART research reactor.
Verification of HELIOS/MASTER Nuclear Analysis System for SMART Research Reactor
Energy Technology Data Exchange (ETDEWEB)
Kim, Kang Seog; Cho, Jin Young; Lee, Chung Chan; Zee, Sung Quun
2005-07-15
Nuclear design for the SMART reactor is performed using the transport lattice code HELIOS and the core analysis code MASTER. HELIOS, developed by Studsvik Scandpower in Norway, is a transport lattice code for neutron and gamma behavior and is used to generate few-group constants. MASTER is a nodal diffusion code developed by KAERI and is used to analyze reactor physics. This nuclear design code package requires verification. Since the SMART reactor is unique, it is impossible to verify this code system by comparing calculated results with measured ones. Therefore, the uncertainties of the nuclear physics parameters calculated by HELIOS/MASTER have been evaluated indirectly. Since a Monte Carlo calculation involves the fewest approximations and assumptions in simulating neutron behavior, HELIOS/MASTER has been verified against it. The Monte Carlo code itself has been verified against the Kurchatov critical experiments, which are similar to the SMART reactor, and the HELIOS/MASTER code package has been verified against Monte Carlo calculations for the SMART research reactor.
Chacón-Cardona, C. A.; Casas-Miranda, R. A.
2012-12-01
We investigate from a fractal viewpoint the way in which dark matter is grouped at z = 0 in the Millennium dark matter cosmological simulation. Determination of the crossing point to homogeneity in the Millennium Simulation data is described with regard to the behaviour of the fractal dimension and lacunarity. We use the sliding-window technique to calculate the fractal mass-radius dimension, the pre-factor F and the lacunarity of this fractal relation. Additionally, we determine the multifractal dimension and the lacunarity spectrum, including their dependence on radial distance. The calculations show a radial distance dependence of all fractal quantities, with heterogeneous clustering of dark matter haloes up to depths of 100 Mpc h-1. Dark matter halo clustering in the Millennium Simulation shows a radial distance dependence, with two regions clearly defined. The lacunarity spectrum for values of the structure parameter q ≥ 1 shows regions with relative maxima, revealing the formation of clusters and voids in the dark matter halo distribution. With use of the multifractal dimension and the lacunarity spectrum, the transition to homogeneity at depths between 100 Mpc h-1 and 120 Mpc h-1 for Millennium Simulation dark matter haloes is detected.
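As an illustrative sketch (not code from the paper), the mass-radius dimension D and pre-factor F of the relation N(&lt;r) = F r^D can be estimated from a point set by a log-log fit; here a synthetic homogeneous cube of points stands in for the Millennium haloes, so D should come out close to 3:

```python
import numpy as np

rng = np.random.default_rng(0)
# Synthetic stand-in for halo positions: points filling a cube homogeneously,
# for which the mass-radius dimension should be close to 3.
pts = rng.uniform(-1.0, 1.0, size=(20000, 3))

def mass_radius_dimension(points, center, radii):
    """Fit N(<r) = F * r**D in log-log space; return (D, F)."""
    dist = np.linalg.norm(points - center, axis=1)
    counts = np.array([(dist < r).sum() for r in radii])
    D, logF = np.polyfit(np.log(radii), np.log(counts), 1)
    return D, np.exp(logF)

radii = np.linspace(0.2, 0.9, 15)
D, F = mass_radius_dimension(pts, np.zeros(3), radii)
```

A sliding-window version, as used in the paper, would repeat this fit for many window centers and report the radial dependence of D and F.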
STRAIGHT-LINE CONVENTIONAL TRANSIENT RATE ANALYSIS FOR LONG HOMOGENEOUS AND HETEROGENEOUS RESERVOIRS
Directory of Open Access Journals (Sweden)
FREDDY HUMBERTO ESCOBAR
2012-01-01
Linear flow is a very important flow regime that occurs in fractured wells, horizontal wells, and elongated reservoirs. Both pressure-transient and rate-transient analyses can be affected by the presence of linear flow. For the variable-rate production case, most of the analysis is performed by decline-curve matching, and rate-transient analysis has received little attention. This article presents the governing equations used for rate-transient analysis in elongated systems and provides examples using the conventional straight-line method. The methodology allows estimation of the permeability, the reservoir width, and the geometric skin factors. If the test is long enough, the reservoir drainage area and the position of the well within it can also be estimated. The methodology was verified satisfactorily by applying it to synthetic tests.
Integrated Java Bytecode Verification
DEFF Research Database (Denmark)
Gal, Andreas; Probst, Christian; Franz, Michael
2005-01-01
Existing Java verifiers perform an iterative data-flow analysis to discover the unambiguous type of values stored on the stack or in registers. Our novel verification algorithm uses abstract interpretation to obtain definition/use information for each register and stack location in the program, which in turn is used to transform the program into Static Single Assignment (SSA) form. In SSA, verification is reduced to simple type-compatibility checking between the definition type of each SSA variable and the type of each of its uses. Inter-adjacent transitions of a value through stack and registers are no longer verified explicitly. This integrated approach is more efficient than traditional bytecode verification but still as safe as strict verification, as overall program correctness can be induced once the data flow from each definition to all associated uses is known to be type-safe.
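The reduction of verification to per-variable type compatibility can be illustrated with a toy sketch (the mini type lattice and variable names here are hypothetical, not the JVM's actual type system):

```python
# Hypothetical subtype table: each SSA variable has exactly one definition
# type, and verification only checks every use site against it.
SUBTYPES = {
    ("Integer", "Object"): True,   # Integer <: Object (illustrative)
    ("String", "Object"): True,
}

def compatible(def_type, use_type):
    """A use is well-typed if its expected type equals or supertypes the def."""
    return def_type == use_type or SUBTYPES.get((def_type, use_type), False)

def verify_ssa(defs, uses):
    """defs: var -> definition type; uses: list of (var, expected type)."""
    return all(compatible(defs[var], expected) for var, expected in uses)

ok = verify_ssa({"v1": "Integer", "v2": "String"},
                [("v1", "Object"), ("v2", "String")])
bad = verify_ssa({"v1": "String"}, [("v1", "Integer")])
```

Because each SSA variable has a single definition, no iterative fixed-point computation over stack states is needed, which is the efficiency gain the abstract describes.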
Nasution, Muhammad Ridlo Erdata
2014-06-01
A new asymptotic expansion homogenization analysis is proposed to analyze 3-D composites in which thermomechanical and finite-thickness effects are considered. The finite-thickness effect is captured by relieving the periodic boundary condition at the top and bottom unit-cell surfaces. The mathematical treatment yields that only 2-D periodicity (i.e. in the in-plane directions) is taken into account. A unit cell representing the whole thickness of the 3-D composite is built to facilitate the present method. The equivalent in-plane thermomechanical properties of 3-D orthogonal interlock composites are calculated by the present method, and the results are compared with those obtained by the standard homogenization method (with 3-D periodicity). Young's modulus and Poisson's ratio obtained by the present method are also compared with experiments, whereby a good agreement is particularly found for the Young's modulus. Localization analysis is carried out to evaluate the stress responses within the unit cell of 3-D composites for two cases: thermal and biaxial tensile loading. Standard finite element (FE) analysis is also performed to validate the stress responses obtained by localization analysis. It is found that the present method's results are in good agreement with standard FE analysis. This fact emphasizes that relieving periodicity in the thickness direction is necessary to accurately simulate the real traction-free condition in 3-D composites. © 2014 Elsevier Ltd.
Nebbak, A; El Hamzaoui, B; Berenger, J-M; Bitam, I; Raoult, D; Almeras, L; Parola, P
2017-07-19
Ticks and fleas are vectors for numerous human and animal pathogens. Controlling them, which is important in combating such diseases, requires accurate identification, to distinguish between vector and non-vector species. Recently, matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) was applied to the rapid identification of arthropods. The growth of this promising tool, however, requires guidelines to be established. To this end, standardization protocols were applied to species of Rhipicephalus sanguineus (Ixodida: Ixodidae) Latreille and Ctenocephalides felis felis (Siphonaptera: Pulicidae) Bouché, including the automation of sample homogenization using two homogenizer devices, and varied sample preservation modes for a period of 1-6 months. The MS spectra were then compared with those obtained from manual pestle grinding, the standard homogenization method. Both automated methods generated intense, reproducible MS spectra from fresh specimens. Frozen storage methods appeared to represent the best preservation mode, for up to 6 months, while storage in ethanol is also possible, with some caveats for tick specimens. Carnoy's buffer, however, was shown to be less compatible with MS analysis for the purpose of identifying ticks or fleas. These standard protocols for MALDI-TOF MS arthropod identification should be complemented by additional MS spectrum quality controls, to generalize their use in monitoring arthropods of medical interest. © 2017 The Royal Entomological Society.
Institute of Scientific and Technical Information of China (English)
CHENG Zhanqi; ZHONG Zheng
2006-01-01
In this paper, the plane elasticity problem for a functionally graded interfacial zone containing a crack between two dissimilar homogeneous materials is considered. It is assumed that in the interfacial zone the reciprocal of the shear modulus is a linear function of the coordinate, while Poisson's ratio remains constant. By utilizing the Fourier transform technique and the transfer matrix method, the mixed boundary problem is reduced to a system of singular integral equations that are solved numerically. The influences of the geometric parameters and the graded parameter on the stress intensity factors are investigated. The numerical results show that the graded parameters, the thickness of the interfacial zone, and the crack size and location have significant effects on the stress intensity factors.
Chacón-Cardona, César A
2012-01-01
We investigate from a fractal viewpoint the way in which dark matter is grouped at z = 0 in the Millennium dark matter cosmological simulation. The determination of the crossing point to homogeneity in the Millennium Simulation data is described from the behaviour of the fractal dimension and the lacunarity. We use the sliding-window technique to calculate the fractal mass-radius dimension, the pre-factor F and the lacunarity of this fractal relation. In addition, we determine the multifractal dimension and the lacunarity spectrum, including their dependence on radial distance. These calculations show a radial distance dependence of all the fractal quantities, with heterogeneous clustering of dark matter haloes up to depths of 100 Mpc/h. Dark matter halo clustering in the Millennium Simulation shows a radial distance dependence, with two regions clearly defined. The lacunarity spectrum for values of the structure parameter q >= 1 shows regions with relative maxima, revealing the formation of clusters and v...
Pashkevich, Anatoly; Chablat, Damien
2007-01-01
The Orthoglide is a Delta-type PKM dedicated to 3-axis rapid machining applications that was originally developed at IRCCyN in 2000-2001 to combine the advantages of both serial 3-axis machines (regular workspace and homogeneous performance) and parallel kinematic architectures (good dynamic performance and stiffness). This machine has three fixed parallel linear joints that are mounted orthogonally. The geometric parameters of the Orthoglide were defined as a function of the size of a prescribed cubic Cartesian workspace that is free of singularities and internal collisions. The interesting features of the Orthoglide are a regular Cartesian workspace shape, uniform performance in all directions and good compactness. In this paper, a new method is proposed to analyze the stiffness of overconstrained Delta-type manipulators, such as the Orthoglide. The Orthoglide is then benchmarked according to geometric, kinematic and stiffness criteria: workspace-to-footprint ratio, velocity and force transmission factors, sen...
Analysis of Cliff-Ramp Structures in Homogeneous Scalar Turbulence by the Method of Line Segments
Gauding, Michael; Goebbert, Jens Henrik; Peters, Norbert; Hasse, Christian
2015-11-01
The local structure of a turbulent scalar field in homogeneous isotropic turbulence is analyzed by direct numerical simulations (DNS). A novel signal decomposition approach is introduced where the signal of the scalar along a straight line is partitioned into segments based on the local extremal points of the scalar field. These segments are then parameterized by the distance between adjacent extremal points and a segment-based gradient. Joint statistics of the length and the segment-based gradient provide novel understanding about the local structure of the turbulent field and particularly about cliff-ramp-like structures. Ramp-like structures are unveiled by the asymmetry of joint distribution functions. Cliff-like structures are further analyzed by conditional statistics and it is shown from DNS that the width of cliffs scales with the Kolmogorov length scale.
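A minimal sketch of the segment decomposition (a synthetic sine signal stands in for the DNS scalar field): the signal is cut at its local extremal points, and each segment is parameterized by its length and a segment-based gradient:

```python
import numpy as np

x = np.linspace(0.0, 4.0 * np.pi, 400)
phi = np.sin(x) + 0.3 * np.sin(3.0 * x)   # stand-in for a turbulent scalar signal

def line_segments(x, phi):
    """Split the signal at local extremal points; return per-segment
    length l = x_{i+1} - x_i and segment gradient (phi_{i+1} - phi_i) / l."""
    # Interior indices where the discrete slope changes sign (local extrema).
    interior = np.where((phi[1:-1] - phi[:-2]) * (phi[2:] - phi[1:-1]) < 0)[0] + 1
    idx = np.concatenate(([0], interior, [len(x) - 1]))
    lengths = np.diff(x[idx])
    grads = np.diff(phi[idx]) / lengths
    return lengths, grads

lengths, grads = line_segments(x, phi)
```

Joint histograms of `lengths` and `grads` over many lines through a 3-D field give the joint statistics the paper analyzes; asymmetry of that joint distribution is the cliff-ramp signature.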
Akbarzadeh, Pooria
2016-05-12
In this paper, magneto-hydrodynamic blood flows through porous arteries are numerically simulated using a locally modified homogeneous nanofluid model. Blood is treated as a third-grade non-Newtonian fluid containing nanoparticles. In the modified nanofluid model, the viscosity, density, and thermal conductivity of the solid-liquid mixture (nanofluid), which are commonly used as effective values, are locally combined with the prevalent single-phase model. The modified governing equations are solved numerically using Newton's method and a block tridiagonal matrix solver. The results are compared to the prevalent single-phase nanofluid model. In addition, the effects of important physical parameters such as the pressure gradient, Brownian motion parameter, thermophoresis parameter, magnetic-field parameter, and porosity parameter on the temperature, velocity and nanoparticle concentration profiles are examined.
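The block tridiagonal solver mentioned above is a block generalization of the scalar Thomas algorithm; a sketch of the scalar version (illustrative, not the paper's solver) applied to a small Poisson-like system with a known solution:

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system Ax = d, with a = sub-, b = main,
    c = super-diagonal (a[0] and c[-1] unused)."""
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):                      # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):             # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# 1-D Poisson-like test matrix with a manufactured right-hand side.
n = 50
a = np.full(n, -1.0); b = np.full(n, 2.0); c = np.full(n, -1.0)
x_true = np.sin(np.linspace(0.0, 1.0, n))
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
d = A @ x_true
x = thomas(a, b, c, d)
```

In the block variant used for the coupled velocity/temperature/concentration equations, the scalar divisions become small matrix inversions, but the forward-elimination/back-substitution structure is identical.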
Thermal-hydraulic analysis of the HL-2M divertor using a homogeneous equilibrium model
Lu, Yong; Cai, Lijun; Liu, Yuxiang; Liu, Jian; Yuan, Yinglong; Zheng, Guoyao; Liu, Dequan
2017-09-01
The heat flux on the HL-2M divertor would reach 10 MW m-2 or more locally when the device operates at high parameters. Subcooled boiling could occur at such high thermal loads; this was simulated based on the homogeneous equilibrium model. The results show that the current design of the HL-2M divertor could withstand a local heat flux of 10 MW m-2 at a plasma pulse duration of 5 s, an inlet coolant pressure of 1.5 MPa and a flow velocity of 4 m s-1. The pulse duration that the HL-2M divertor can withstand is closely related to the coolant velocity. In addition, 2 min after the plasma discharge, the flow velocity can be decreased from 4 m s-1 to 1 m s-1, and the divertor can still be cooled to its initial temperature before the next plasma discharge commences.
Assessment of Theories for Free Vibration Analysis of Homogeneous and Multilayered Plates
Directory of Open Access Journals (Sweden)
Erasmo Carrera
2004-01-01
This paper assesses classical and advanced theories for the free vibrational response of homogeneous and multilayered simply supported plates. Closed-form solutions are given for thick and thin geometries. Single-layer and multilayered plates made of metallic, composite and piezoelectric materials are considered. Classical theories based on Kirchhoff and Reissner-Mindlin assumptions are compared with refined theories obtained by enhancing the order of the expansion of the displacement fields in the thickness direction z. The effects of the zig-zag form of the displacement distribution in z, as well as of the interlaminar continuity of transverse shear and normal stresses at the layer interfaces, were evaluated. A number of conclusions have been drawn. These conclusions could serve as a guide for choosing the most suitable theories for a given problem.
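In the classical Kirchhoff limit, the free-vibration frequencies of a simply supported rectangular plate have the closed form omega_mn = pi^2 ((m/a)^2 + (n/b)^2) sqrt(D / (rho h)), with flexural rigidity D = E h^3 / (12 (1 - nu^2)). A sketch with assumed aluminium-like values (not one of the paper's cases):

```python
import math

# Illustrative aluminium-like properties and plate geometry (assumed values)
E, nu, rho = 70e9, 0.33, 2700.0      # Pa, -, kg/m^3
h, a, b = 0.002, 0.4, 0.3            # thickness and in-plane sides, m

D = E * h**3 / (12.0 * (1.0 - nu**2))   # flexural rigidity, N m

def omega(m, n):
    """Kirchhoff natural frequency (rad/s) of mode (m, n), simply supported."""
    return math.pi**2 * ((m / a)**2 + (n / b)**2) * math.sqrt(D / (rho * h))

f11 = omega(1, 1) / (2.0 * math.pi)   # fundamental frequency, Hz
```

Refined (Reissner-Mindlin and higher-order) theories lower these frequencies for thick plates, which is exactly the effect the assessment quantifies.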
Crosstalk analysis in homogeneous multi-core two-mode fiber under bent condition.
Chang, J H; Choi, H G; Bae, S H; Sim, D H; Kim, Hoon; Chung, Y C
2015-04-20
We analyze the inter-core crosstalk in homogeneous multi-core two-mode fibers (MC-TMFs) under bent condition by using the coupled-mode equations. In particular, we investigate the effects of the intra-core mode coupling on the inter-core crosstalk for two different types of MC-TMFs at various bending radii. The results show that the inter-core homo-mode crosstalk of LP(11) mode is dominant under the gentle fiber bending condition due to its large effective area. However, as the fiber bending becomes tight, the intra-core mode coupling is significantly enhanced and consequently makes all the inter-core crosstalk levels comparable to each other regardless of the mode. A similar tendency is observed at a reduced bending radius when the difference in the propagation constants between modes is large and core pitch is small.
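For the simplest phase-matched two-mode coupled-mode equations, dA1/dz = -i kappa A2 and dA2/dz = -i kappa A1, power oscillates completely and periodically between the coupled modes as sin^2(kappa z). A sketch with an assumed coupling coefficient (not a value from the paper, which also includes bending-induced phase mismatch):

```python
import numpy as np

kappa = 0.5                     # assumed mode-coupling coefficient, 1/m
z = np.linspace(0.0, 10.0, 2001)

# Analytic phase-matched solution: complete periodic power transfer.
P1 = np.cos(kappa * z) ** 2     # power remaining in the launched mode
P2 = np.sin(kappa * z) ** 2     # power coupled into the other mode
```

A nonzero propagation-constant difference (the bent-fiber case in the paper) caps the transferred power below 1 and shortens the beat length, which is why tight bending equalizes the crosstalk levels between modes.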
Energy Technology Data Exchange (ETDEWEB)
Carey, V.P. [Mechanical Engineering Department, University of California, Berkeley, CA (United States); Wemhoff, A.P. [New Technologies Engineering Division, Lawrence Livermore National Laboratory, Livermore, CA (United States)
2005-12-01
Rapid heating of a liquid by a thin film heater surface may initiate homogeneous nucleation of vapor in the liquid in contact with the surface. In such circumstances, nucleation is generally expected to be most likely in the hottest liquid closest to the surface. It is known, however, that in many cases the liquid molecules closest to the surface will experience long-range attractive forces to molecules in the solid, with the result that the equation of state for the liquid near the surface will differ from that for the bulk liquid. In the investigation summarized here, a statistical thermodynamics analysis was used to derive a modified version of the Redlich-Kwong fluid property model that accounts for attractive forces between the solid surface molecules and liquid molecules in the near-wall region. In this model, the wall-fluid attractive forces are quantified in terms of Hamaker constants. This feature makes it possible to assess the effect of wall-fluid force interactions on the spinodal conditions for a variety of fluid and surface material combinations. The variation of pressure near the wall predicted by this model agrees well with the predictions of a hydrostatic model and molecular dynamics simulations. The results of the thermodynamic model analysis indicate that force interactions are important for a wide variety of fluids only within a few nanometers of the solid surface. The model specifically predicts that these forces increase the spinodal temperature in the near-surface region. The implications of this increase for nucleation near a solid surface during rapid heating are explored for circumstances similar to those in inkjet printer heads. The results of the analysis imply that during rapid transient heating, wall effects can result in homogeneous nucleation occurring first at a location slightly away from the solid surface. The results further suggest that on rough surfaces, wall effects may play a role in making some cavities preferred sites for
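The baseline (unmodified) Redlich-Kwong equation of state that the analysis builds on can be sketched as follows; the constants come from the critical point, and the paper's near-wall attraction correction (expressed through Hamaker constants) is deliberately omitted from this bulk-fluid sketch:

```python
import math

R = 8.314                      # gas constant, J/(mol K)
Tc, Pc = 647.1, 22.064e6       # approximate critical point of water

# Standard Redlich-Kwong constants from the critical point.
a = 0.42748 * R**2 * Tc**2.5 / Pc
b = 0.08664 * R * Tc / Pc

def p_rk(T, Vm):
    """Redlich-Kwong pressure (Pa) at temperature T (K), molar volume Vm (m^3/mol)."""
    return R * T / (Vm - b) - a / (math.sqrt(T) * Vm * (Vm + b))

# Dilute vapor should be close to ideal-gas behavior.
P_dilute = p_rk(500.0, 0.1)
P_ideal = R * 500.0 / 0.1
```

The paper's modification adds a wall-distance-dependent attractive term to this pressure, which shifts the spinodal temperature upward within a few nanometers of the heater surface.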
Waits, C. M.; Tolmachoff, E. D.; Allmon, W. R.; Zecher-Freeman, N. E.
2016-11-01
An energy analysis is presented for n-dodecane/air combustion in a heat-recirculating Inconel microreactor under vacuum conditions. The microreactor channels are partially coated with platinum, enabling operation with coupled heterogeneous and homogeneous reactions. The radiant efficiency, important for thermophotovoltaic energy conversion, was found to decrease from 57% to 52% over 5 different runs covering 377 min of operation. A similar decrease in combustion efficiency was observed, with 6%-8% of the energy lost to incomplete combustion and 5%-6% lost through sensible heat in the exhaust. The remaining thermal loss is from unusable radiation and from conduction through the inlet and outlet tubing. Changes in the Inconel microreactor geometry and emissivity properties were observed.
ENSO Forecasts in the North American Multi-Model Ensemble: Composite Analysis and Verification
Chen, L. C.
2015-12-01
In this study, we examine precipitation and temperature forecasts during El Nino/Southern Oscillation (ENSO) events in six models in the North American Multi-Model Ensemble (NMME), including the CFSv2, CanCM3, CanCM4, FLOR, GEOS5, and CCSM4 models, by comparing the model-based ENSO composites to the observed ones. The composite analysis is conducted using the 1982-2010 hindcasts for each of the six models, with ENSO episodes selected based on the seasonal Ocean Nino Index (ONI) just prior to the date the forecasts were initiated. Two sets of composites are constructed over the North American continent: one based on precipitation and temperature anomalies, the other based on their probability of occurrence in a tercile-based system. The composites apply to monthly mean conditions in November, December, January, February, and March, respectively, as well as to the five-month aggregates representing winter conditions. For the anomaly composites, we use the anomaly correlation coefficient and the root-mean-square error against the observed composites for evaluation. For the probability composites, unlike conventional probabilistic forecast verification, which assumes binary outcomes for the observations, both model and observed composites are expressed in probability terms. Performance metrics for such validation are limited. Therefore, we develop a probability anomaly correlation measure and a probability score for assessment, so the results are comparable to the anomaly composite evaluation. We found that all NMME models predict ENSO precipitation patterns well during wintertime; however, some models show large discrepancies between the model temperature composites and the observed ones. The skill is higher for the multi-model ensemble, as well as for the five-month aggregates. Compared to the anomaly composites, the probability composites have superior skill in predicting ENSO temperature patterns and are less sensitive to the sample used to construct the composites, suggesting that
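The anomaly correlation coefficient used to score the anomaly composites can be sketched as a centered pattern correlation between two anomaly maps (synthetic fields here, not NMME data):

```python
import numpy as np

def anomaly_correlation(forecast, observed):
    """Centered anomaly correlation coefficient between two anomaly maps."""
    f = forecast - forecast.mean()
    o = observed - observed.mean()
    return float((f * o).sum() / np.sqrt((f**2).sum() * (o**2).sum()))

rng = np.random.default_rng(1)
obs = rng.standard_normal((20, 30))           # stand-in observed composite map
perfect = anomaly_correlation(obs, obs)       # identical maps -> ACC = 1
noisy = anomaly_correlation(obs + 0.5 * rng.standard_normal(obs.shape), obs)
```

The probability anomaly correlation the authors develop applies the same centered-correlation idea to maps of tercile probabilities rather than physical anomalies.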
HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments
Energy Technology Data Exchange (ETDEWEB)
McCann, R.A.; Lowery, P.S.
1987-10-01
HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.
Directory of Open Access Journals (Sweden)
Farhan A. Salem
2014-05-01
This paper proposes a new, simple and user-friendly MATLAB built-in function, together with mathematical and Simulink models, to be used to identify system-level problems early, to ensure that all design requirements are met, and, more generally, to simplify the mechatronics motion-control design process, including performance analysis and verification of a given electric DC machine and proper controller selection and verification for a desired output speed or angle.
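A rough Python analogue (not the paper's MATLAB function) of the standard permanent-magnet DC motor model, with assumed parameter values: the armature and mechanical equations are integrated with forward Euler, and the steady-state speed is checked against the linear-model value K V / (R b + K^2):

```python
# Illustrative PMDC motor parameters (assumed, not from the paper)
R, L = 1.0, 0.5          # armature resistance (ohm) and inductance (H)
K = 0.01                 # torque / back-EMF constant
J, b = 0.01, 0.1         # rotor inertia (kg m^2) and viscous friction
V = 12.0                 # step input voltage (V)

def simulate(t_end=5.0, dt=1e-4):
    """Forward-Euler integration of di/dt = (V - R i - K w)/L and
    dw/dt = (K i - b w)/J from rest; returns final shaft speed (rad/s)."""
    i = w = 0.0
    for _ in range(int(t_end / dt)):
        di = (V - R * i - K * w) / L
        dw = (K * i - b * w) / J
        i += dt * di
        w += dt * dw
    return w

w_final = simulate()
w_expected = K * V / (R * b + K**2)   # steady-state speed of the linear model
```

Controller selection then amounts to closing a loop around this plant, which is the step the paper automates with its Simulink models.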
Permeability analysis of fractured vuggy porous media based on homogenization theory
Institute of Scientific and Technical Information of China (English)
无
2010-01-01
Based on the characteristics of fractured vuggy porous media, a novel mathematical model was proposed to model fluid flow in such media on the fine scale, i.e., the discrete fracture-vug network model. The new model consists of three systems: the porous rock system, the fracture system, and the vug system. The fractures and vugs are embedded in the porous rock, and the isolated vugs can be connected via the discrete fracture network. The flow in the porous rock and fractures follows Darcy's law, and the vug system is a free-fluid region. Using a two-scale homogenization limit theory, we obtained a macroscopic Darcy's law governing the media on the coarse scale. The theoretical formula for the equivalent permeability of fractured vuggy porous media was derived. The model and method of this paper were verified by numerical examples. Finally, the permeability of some fractured vuggy porous media with typical fracture-vug structures was analyzed.
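A one-dimensional analogue of upscaling to an equivalent permeability: for a simple layered medium, homogenization reduces to the arithmetic mean of the layer permeabilities for flow along the layers and the harmonic mean for flow across them (illustrative values, not the paper's fracture-vug geometries):

```python
import numpy as np

k = np.array([1.0, 100.0, 5.0, 50.0])    # layer permeabilities (arbitrary units)
t = np.array([0.2, 0.3, 0.4, 0.1])       # layer thickness fractions (sum to 1)

k_parallel = float((t * k).sum())        # flow along the layers: arithmetic mean
k_series = float(1.0 / (t / k).sum())    # flow across the layers: harmonic mean
```

Any equivalent permeability of a heterogeneous medium lies between these two bounds; the paper's two-scale limit produces the analogous coarse-scale tensor for the full fracture-vug microstructure.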
Homogeneous abundance analysis of dwarf, subgiant and giant FGK stars with and without giant planets
da Silva, R; Rocha-Pinto, H J
2015-01-01
We have analyzed high-resolution and high signal-to-noise ratio optical spectra of nearby FGK stars with and without detected giant planets in order to homogeneously measure their photospheric parameters, mass, age, and the abundances of volatile (C, N, and O) and refractory (Na, Mg, Si, Ca, Ti, V, Mn, Fe, Ni, Cu, and Ba) elements. Our sample contains 309 stars from the solar neighborhood (up to the distance of 100 pc), out of which 140 are dwarfs, 29 are subgiants, and 140 are giants. The photospheric parameters are derived from the equivalent widths of Fe I and Fe II lines. Masses and ages come from the interpolation in evolutionary tracks and isochrones on the HR diagram. The abundance determination is based on the equivalent widths of selected atomic lines of the refractory elements and on the spectral synthesis of C_2, CN, C I, O I, and Na I features. We apply a set of statistical methods to analyze the abundances derived for the three subsamples. Our results show that: i) giant stars systematically exhi...
Verification of Large State/Event Systems using Compositionality and Dependency Analysis
DEFF Research Database (Denmark)
Lind-Nielsen, Jørn; Andersen, Henrik Reif; Hulgaard, Henrik;
2001-01-01
This compositional approach makes possible the automated verification of large industrial designs with the use of only modest resources (less than 5 minutes on a standard PC for a model with 1421 concurrent machines). The results of the paper are being implemented in the next version of the commercial tool visualSTATE™.
A Verification and Analysis of the USAF/DoD Fatigue Model and Fatigue Management Technology
2005-11-01
We Nap: Evolution, Chronobiology, and Functions of Polyphasic and Ultrashort Sleep. Stampi, C. (ed.), Birkhäuser, Boston. Defense Acquisition... a Windows® software application of the Sleep, Activity, Fatigue, and Task Effectiveness (SAFTE) applied model. The application, the Fatigue Avoidance Scheduling Tool (FAST™), was re-engineered as a clone from the SAFTE specification. The verification considered nine sleep/wake schedules that were
Introduction to the Special Issue on Specification Analysis and Verification of Reactive Systems
Delzanno, Giorgio; Etalle, Sandro; Gabbrielli, Maurizio
2006-01-01
This special issue is inspired by the homonymous ICLP workshops that took place during ICLP 2001 and ICLP 2002. Extending and shifting slightly from the scope of their predecessors (on verification and logic languages) held in the context of previous editions of ICLP, the aim of the SAVE workshops
Vacuum-assisted resin transfer molding (VARTM) model development, verification, and process analysis
Sayre, Jay Randall
2000-12-01
Vacuum-Assisted Resin Transfer Molding (VARTM) processes are becoming promising technologies in the manufacturing of primary composite structures in the aircraft industry as well as in infrastructure. A great deal of work still needs to be done to reduce the costly trial-and-error methods of VARTM processing that are currently in practice today. A computer simulation model of the VARTM process would provide a cost-effective tool in the manufacturing of composites utilizing this technique. Therefore, the objective of this research was to modify an existing three-dimensional Resin Film Infusion (RFI)/Resin Transfer Molding (RTM) model to include VARTM simulation capabilities and to verify this model with the fabrication of aircraft structural composites. An additional objective was to use the VARTM model as a process analysis tool that would enable the user to configure the best process for manufacturing quality composites. Experimental verification of the model was performed by processing several flat composite panels. The parameters verified included flow front patterns and infiltration times. The flow front patterns were determined to be qualitatively accurate, while the simulated infiltration times overpredicted experimental times by 8 to 10%. Capillary and gravitational forces were incorporated into the existing RFI/RTM model in order to simulate VARTM processing physics more accurately. The theoretical capillary pressure showed the capability to reduce the simulated infiltration times by as much as 6%. Gravity, on the other hand, was found to be negligible in all cases. Finally, the VARTM model was used as a process analysis tool. This enabled the user to determine such important process constraints as the location and type of injection ports and the permeability and location of the high-permeability media. A process for a three-stiffener composite panel was proposed. This configuration evolved from the variation of the process
Analysis of Homogeneous Water Oxidation Catalysis with Collector-Generator Cells.
Sherman, Benjamin D; Sheridan, Matthew V; Wee, Kyung-Ryang; Song, Na; Dares, Christopher J; Fang, Zhen; Tamaki, Yusuke; Nayak, Animesh; Meyer, Thomas J
2016-01-19
A collector-generator (C-G) technique has been applied to determine the Faradaic efficiencies for electrocatalytic O2 production by the homogeneous water oxidation catalysts Ru(bda)(isoq)2 (1; bda = 2,2'-bipyridine and isoq = isoquinoline) and [Ru(tpy)(bpz)(OH2)](2+) (2; tpy = 2,2':6',2″-terpyridine and bpz = 2,2'-bipyrazine). This technique uses a custom-fabricated cell consisting of two fluorine-doped tin oxide (FTO) working electrodes separated by 1 mm with the conductive sides facing each other. With a catalyst in solution, water oxidation occurs at one FTO electrode under a sufficient bias to drive O2 formation by the catalyst; the O2 formed then diffuses to the second FTO electrode poised at a potential sufficiently negative to drive O2 reduction. A comparison of the current versus time response at each electrode enables determination of the Faradaic efficiency for O2 production with high concentrations of supporting electrolyte important for avoiding capacitance effects between the electrodes. The C-G technique was applied to electrocatalytic water oxidation by 1 in the presence of the electron-transfer mediator Ru(bpy)3(2+) in both unbuffered aqueous solutions and with the added buffer bases HCO3(-), HPO4(2-), imidazole, 1-methylimidazole, and 4-methoxypyridine. HCO3(-) and HPO4(2-) facilitate water oxidation by atom-proton transfer (APT), which gave Faradaic yields of 100%. With imidazole as the buffer base, coordination to the catalyst inhibited water oxidation. 1-Methylimidazole and 4-methoxypyridine gave O2 yields of 55% and 76%, respectively, with the lower Faradaic efficiencies possibly due to competitive C-H oxidation of the bases. O2 evolution by catalyst 2 was evaluated at pH 12 with 0.1 M PO4(3-) and at pH 7 in a 0.1 M H2PO4(-)/HPO4(2-) buffer. At pH 12, at an applied potential of 0.8 V vs SCE, water oxidation by the Ru(IV)(O)(2+) form of the catalyst gave O2 in 73% yield. In a pH 7 solution, water oxidation at 1.4 V vs SCE, which is dominated
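The Faradaic efficiency extracted from a collector-generator experiment is essentially a ratio of integrated charges, corrected by the cell's calibrated collection efficiency; a sketch with synthetic current traces (all values assumed, not measured data from the paper):

```python
import numpy as np

t = np.linspace(0.0, 100.0, 1001)             # time, s
i_gen = 2.0e-4 * np.exp(-t / 80.0)            # generator (water oxidation) current, A
i_col = -0.9 * 2.0e-4 * np.exp(-t / 80.0)     # collector (O2 reduction) current, A

def charge(current, t):
    """Trapezoidal integration of a current trace to a charge (C)."""
    return float(np.sum(0.5 * (current[1:] + current[:-1]) * np.diff(t)))

collection_eff = 0.9   # assumed calibrated collection efficiency of the C-G cell
faradaic_eff = abs(charge(i_col, t)) / (collection_eff * charge(i_gen, t))
```

With the synthetic traces above, the collector recovers exactly the calibrated fraction of the generator charge, so the computed Faradaic efficiency is 100%; real data deviate from this when parasitic oxidations (e.g. C-H oxidation of the buffer base) consume generator current without producing O2.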
Pratt, J; Müller, W -C; Chapman, S C; Watkins, N W
2016-01-01
We investigate the utility of the convex hull for analyzing physical questions related to the dispersion of a group of many more than four Lagrangian tracer particles in a turbulent flow. Validation of standard dispersion behaviors is a necessary preliminary step for use of the convex hull to describe turbulent flows. In simulations of statistically homogeneous and stationary Navier-Stokes turbulence, neutral fluid Boussinesq convection, and MHD Boussinesq convection we show that the convex hull can be used to reasonably capture the dispersive behavior of a large group of tracer particles. We validate dispersion results produced with convex hull analysis against scalings for Lagrangian particle pair dispersion. In addition to this basic validation study, we show that convex hull analysis provides information that particle pair dispersion does not, in the form of extreme value statistics, surface area, and volume for a cluster of particles. We use the convex hull surface area and volume to examine the degree of...
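The cluster-geometry quantities used in this kind of analysis, the surface area and volume of the convex hull of a particle group, are straightforward to compute with standard tools. A minimal sketch with a hypothetical particle configuration (not the simulation data of the study):

```python
import numpy as np
from scipy.spatial import ConvexHull

def hull_metrics(positions):
    """Surface area and volume of the convex hull of a particle cluster.

    positions: (N, 3) array of Lagrangian tracer coordinates at one time.
    """
    hull = ConvexHull(positions)
    return hull.area, hull.volume

# Hypothetical cluster: the 8 corners of a unit cube
pts = np.array([[x, y, z] for x in (0, 1) for y in (0, 1) for z in (0, 1)],
               dtype=float)
area, vol = hull_metrics(pts)
print(area, vol)  # → 6.0 1.0
```

Tracking these two numbers over time for many clusters gives the dispersion curves that the abstract compares against particle-pair scalings.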
Directory of Open Access Journals (Sweden)
Xavier Franch Auladell
2013-01-01
Full Text Available This work constitutes a contribution to the analysis of long-term patterns of population concentration applied to the case of Spain. The proposed methodology is based on the homogenisation of both data and administrative units, which takes the municipal structure of the 2001 census as its base reference. This work seeks to show how applying spatial analysis techniques to this type of homogeneous data series allows us to make more detailed studies of population patterns within a given territory. The most important conclusion that we reached was that, in Spain, sustained population growth has followed a spatial pattern that has become increasingly consolidated over time. The tendencies observed have produced an uneven distribution of population within the national territory, marked by the existence of a series of well-defined, and often very localised, areas that spread beyond the limits of the official administrative boundaries.
Energy Technology Data Exchange (ETDEWEB)
H. B. HUNT; D. J. ROSENKRANTS; ET AL
2001-03-01
We identify several simple but powerful concepts, techniques, and results, and we use them to characterize the complexities of a number of basic problems Π that arise in the analysis and verification of the following models M of communicating automata and discrete dynamical systems: systems of communicating automata including both finite and infinite cellular automata, transition systems, discrete dynamical systems, and succinctly-specified finite automata. These concepts, techniques, and results are centered on the following: (i) reductions of STATE-REACHABILITY problems, especially for very simple systems of communicating copies of a single simple finite automaton, (ii) reductions of generalized CNF satisfiability problems [Sc78], especially to very simple communicating systems of copies of a few basic acyclic finite sequential machines, and (iii) reductions of the EMPTINESS and EMPTINESS-OF-INTERSECTION problems, for several kinds of regular set descriptors. For systems of communicating automata and transition systems, the problems studied include all equivalence relations and simulation preorders in the Linear-time/Branching-time hierarchies of equivalence relations and simulation preorders of [vG90, vG93], both without and with the hiding abstraction. For discrete dynamical systems, the problems studied include the INITIAL and BOUNDARY VALUE PROBLEMS (denoted IVPs and BVPs, respectively), for nonlinear difference equations over many different algebraic structures, e.g. all unitary rings, all finite unitary semirings, and all lattices. For succinctly-specified finite automata, the problems studied also include the several problems studied in [AY98], e.g. the EMPTINESS, EMPTINESS-OF-INTERSECTION, EQUIVALENCE and CONTAINMENT problems. The concepts, techniques, and results presented unify and significantly extend many of the known results in the literature, e.g. [Wo86, Gu89, BPT91, GM92, Ra92, HT94, SH+96, AY98, AKY99, RH93, SM73, Hu73, HRS76, HR78], for
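As a toy illustration of the kind of STATE-REACHABILITY question these reductions target (not one of the paper's constructions), reachability in the product state space of two communicating automata reduces to breadth-first search; the synchronisation rule below is invented for the example:

```python
from collections import deque

def reachable(initial, successors):
    """Set of states reachable from `initial`, where `successors` maps
    a state to an iterable of next states (breadth-first search)."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

# Toy product system: two communicating copies of a 3-state counter,
# where a copy may only advance if it is not already ahead of the other.
def succ(state):
    a, b = state
    out = []
    if a < 2 and a <= b:
        out.append((a + 1, b))
    if b < 2 and b <= a:
        out.append((a, b + 1))
    return out

print(sorted(reachable((0, 0), succ)))  # 7 reachable product states
```

The hardness results in the abstract concern exactly this search when the product state space is exponentially large in the description of the communicating copies.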
Tignanelli, H. L.; Vazquez, R. A.; Mostaccio, C.; Gordillo, S.; Plastino, A.
1990-11-01
RESUMEN: We present a methodology for homogeneity analysis based on Information Theory, applicable to samples of observational data. ABSTRACT: Standard concepts that underlie Information Theory are employed in order to design a methodology that enables one to analyze the homogeneity of a given data sample. Key words: DATA ANALYSIS
Directory of Open Access Journals (Sweden)
Nadeem AKHTAR
2014-12-01
Full Text Available This paper presents an approach based on the analysis, design, and formal verification of a multi-agent based university Information Management System (IMS). The university IMS accesses information, creates reports, and facilitates teachers as well as students. An orchestrator agent manages the coordination between all agents. It also manages the database connectivity for the whole system. The proposed IMS is based on the BDI agent architecture, which models the system in terms of beliefs, desires, and intentions. The correctness properties of safety and liveness are specified in first-order predicate logic.
A solenoid-based active hydraulic engine mount: modelling, analysis, and verification
Hosseini, Ali
2010-01-01
The focus of this thesis is on the design, modelling, identification, simulation, and experimental verification of a low-cost solenoid-based active hydraulic engine mount. To build an active engine mount, a commercial on-off solenoid is modified to be used as an actuator and is embedded inside a hydraulic engine mount. The hydraulic engine mount is modelled and tested, the solenoid actuator is modelled and identified, and finally the models are integrated to obtain the analytical model of the...
Exoplanet hosts reveal lithium depletion: Results from a homogeneous statistical analysis
Figueira, P; Delgado-Mena, E; Adibekyan, V Zh; Sousa, S G; Santos, N C; Israelian, G
2014-01-01
Aims. We study the impact of the presence of planets on the lithium abundance of host stars and evaluate the previous claim that planet hosts exhibit lithium depletion when compared to their non-host counterparts. Methods. Using previously published lithium abundances, we remove the confounding effect of the different fundamental stellar parameters by applying a multivariable regression on our dataset. In doing so, we explicitly make an assumption made implicitly by different authors: that lithium abundance depends linearly on the fundamental stellar parameters. Using a moderator variable to distinguish stars with planets from those without, we evaluate the existence of an offset in lithium abundances between the two groups. We perform this analysis first exclusively for stars with a clear lithium detection, and in a second analysis we also include upper-limit lithium measurements. Results. Our analysis shows that under the above-mentioned assumption of linearity, a statistically significant negative offset in lithium a...
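The moderator-variable regression described here can be sketched as an ordinary least-squares fit with a binary host indicator; the stellar-parameter ranges, coefficients, and the -0.1 dex offset below are invented for illustration, not taken from the paper:

```python
import numpy as np

def host_offset(teff, feh, logg, a_li, is_host):
    """Fit A(Li) = b0 + b1*Teff + b2*[Fe/H] + b3*logg + d*host by least
    squares; the moderator coefficient d is the host/non-host offset."""
    X = np.column_stack([np.ones_like(teff), teff, feh, logg, is_host])
    coef, *_ = np.linalg.lstsq(X, a_li, rcond=None)
    return coef[-1]

# Synthetic sample with a built-in -0.1 dex offset for planet hosts
rng = np.random.default_rng(0)
n = 500
teff = rng.uniform(5600, 6200, n)
feh = rng.uniform(-0.4, 0.4, n)
logg = rng.uniform(4.2, 4.6, n)
host = rng.integers(0, 2, n).astype(float)
a_li = 1.0 + 4e-4 * teff + 0.2 * feh - 0.3 * logg - 0.1 * host
print(round(host_offset(teff, feh, logg, a_li, host), 3))  # → -0.1
```

A statistically significant nonzero d, after controlling for the stellar parameters, is the kind of offset the study tests for.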
Pierzga, M. J.
1981-01-01
The experimental verification of an inviscid, incompressible through-flow analysis method is presented. The primary component of this method is an axisymmetric streamline curvature technique which is used to compute the hub-to-tip flow field of a given turbomachine. To analyze the flow field in the blade-to-blade plane of the machine, the potential flow solution of an infinite cascade of airfoils is also computed using a source model technique. To verify the accuracy of such an analysis method an extensive experimental verification investigation was conducted using an axial flow research fan. Detailed surveys of the blade-free regions of the machine along with intra-blade surveys using rotating pressure sensing probes and blade surface static pressure taps provide a one-to-one relationship between measured and predicted data. The results of this investigation indicate the ability of this inviscid analysis method to predict the design flow field of the axial flow fan test rotor to within a few percent of the measured values.
Analysis of homogeneity of 2D electron gas at decreasing of electron density
Sherstobitov, A. A.; Minkov, G. M.; Germanenko, A. V.; Rut, O. E.; Soldatov, I. V.; Zvonkov, B. N.
2010-01-01
We investigate the gate voltage dependence of the capacitance of a gate - 2D electron gas system (C-Vg). An abrupt drop of the capacitance with decreasing concentration was found. The possible reasons for this drop, namely the inhomogeneity of the electron density distribution and the series resistance of the 2D electron gas, are discussed. Simultaneous analysis of the gate voltage dependences of capacitance and resistance has shown that in heavily doped 2D systems the main role in the drop of capacitance at decreasing con...
Directory of Open Access Journals (Sweden)
Staňák P.
2013-06-01
Full Text Available The proposed paper presents an application of the patch test to the meshless analysis of a piezoelectric circular plate with functionally graded material properties. Functionally graded materials (FGMs) are a special class of composite materials with a continuous variation of the volume fraction of the constituents in a predominant direction. Patch test analysis is an important tool in numerical methods for addressing convergence. The meshless local Petrov-Galerkin (MLPG) method, together with the moving least-squares (MLS) approximation scheme, is applied in the analysis. No finite elements are required for the approximation or integration of unknown quantities. The circular plate is considered as a 3-D axisymmetric piezoelectric solid. Considering the axial symmetry, the problem is reduced to a 2-dimensional one. Displacement and electric potential fields are prescribed on the outer boundaries in order to reach a state of constant stress inside the considered plate, as required by the patch test and the governing equations. The values of the prescribed mechanical and electrical fields must be determined so as to comply with the applied FGM gradation rule. A convergence study is performed to assess the considered meshless approach, and several conclusions are finally presented.
Energy Technology Data Exchange (ETDEWEB)
Conan O' Rourke; Yutao Zhou
2007-12-31
for their photometric and electrical parameters only; in Cycle 2, 1000-hour Lumen Maintenance and the Rapid Cycle Stress Test were added, and an additional set of six samples of each model was tested for the Rapid Cycle Stress Test. Also, Cycle 2 data analysis included the testing and verification results against both the 'then existing' specification dated 2000 and the 'then new' specification dated 8/9/2001. In Cycle 3, Lumen Maintenance at 40% life was added and the number of samples for photometric and electrical testing was increased to five. In Cycle 6, the number of samples for photometric and electrical testing increased again to ten, so that five of them were tested in the base-up position and five in the base-down position. A total of 2375 CFL samples were tested in the PEARL program, out of the more than 3000 CFL samples that were purchased for the testing purposes of this program.
Robinson, G.; Ahmed, Ashraf A.; Hamill, G. A.
2016-07-01
This paper presents the applications of a novel methodology to quantify saltwater intrusion parameters in laboratory-scale experiments. The methodology uses an automated image analysis procedure, minimising manual inputs and the subsequent systematic errors that can be introduced. This allowed the quantification of the width of the mixing zone which is difficult to measure in experimental methods that are based on visual observations. Glass beads of different grain sizes were tested for both steady-state and transient conditions. The transient results showed good correlation between experimental and numerical intrusion rates. The experimental intrusion rates revealed that the saltwater wedge reached a steady state condition sooner while receding than advancing. The hydrodynamics of the experimental mixing zone exhibited similar traits; a greater increase in the width of the mixing zone was observed in the receding saltwater wedge, which indicates faster fluid velocities and higher dispersion. The angle of intrusion analysis revealed the formation of a volume of diluted saltwater at the toe position when the saltwater wedge is prompted to recede. In addition, results of different physical repeats of the experiment produced an average coefficient of variation less than 0.18 of the measured toe length and width of the mixing zone.
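The repeatability figure quoted above, a coefficient of variation across physical repeats, is simply the standard deviation of the repeated measurements normalised by their mean. A minimal sketch with hypothetical toe-length repeats (the values are invented, not the experiment's data):

```python
import statistics

def coefficient_of_variation(values):
    """Sample standard deviation divided by the mean."""
    return statistics.stdev(values) / statistics.mean(values)

# Hypothetical toe-length measurements (cm) from repeated runs
toe_lengths = [19.5, 20.4, 21.0, 20.1]
print(round(coefficient_of_variation(toe_lengths), 3))  # → 0.031
```

A value below 0.18, as reported, indicates that the repeats scatter by less than 18% of the mean measured toe length or mixing-zone width.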
Anoukou, K.; Pastor, F.; Dufrenoy, P.; Kondo, D.
2016-06-01
The present two-part study aims at investigating the specific effects of Mohr-Coulomb matrix on the strength of ductile porous materials by using a kinematic limit analysis approach. While in the Part II, static and kinematic bounds are numerically derived and used for validation purpose, the present Part I focuses on the theoretical formulation of a macroscopic strength criterion for porous Mohr-Coulomb materials. To this end, we consider a hollow sphere model with a rigid perfectly plastic Mohr-Coulomb matrix, subjected to axisymmetric uniform strain rate boundary conditions. Taking advantage of an appropriate family of three-parameter trial velocity fields accounting for the specific plastic deformation mechanisms of the Mohr-Coulomb matrix, we then provide a solution of the constrained minimization problem required for the determination of the macroscopic dissipation function. The macroscopic strength criterion is then obtained by means of the Lagrangian method combined with Karush-Kuhn-Tucker conditions. After a careful analysis and discussion of the plastic admissibility condition associated to the Mohr-Coulomb criterion, the above procedure leads to a parametric closed-form expression of the macroscopic strength criterion. The latter explicitly shows a dependence on the three stress invariants. In the special case of a friction angle equal to zero, the established criterion reduced to recently available results for porous Tresca materials. Finally, both effects of matrix friction angle and porosity are briefly illustrated and, for completeness, the macroscopic plastic flow rule and the voids evolution law are fully furnished.
Directory of Open Access Journals (Sweden)
J. S. Lai
2012-07-01
Full Text Available Spatial information technologies and data can be used effectively to investigate and monitor natural disasters continuously and to support policy- and decision-making for hazard prevention, mitigation and reconstruction. However, in addition to the vastly growing data volume, various spatial data usually come from different sources and with different formats and characteristics. Therefore, it is necessary to find useful and valuable information that may not be obvious in the original data sets from numerous collections. This paper presents the preliminary results of research into the validation and risk assessment of landslide events induced by heavy torrential rains in the Shimen reservoir watershed of Taiwan using spatial analysis and data mining algorithms. In this study, eleven factors were considered, including elevation (Digital Elevation Model, DEM), slope, aspect, curvature, NDVI (Normalized Difference Vegetation Index), fault, geology, soil, land use, river and road. The experimental results indicate that the overall accuracy and kappa coefficient in verification can reach 98.1% and 0.8829, respectively. However, the DT model after training is too over-fitted to be used for prediction. To address this issue, a mechanism was developed to filter uncertain data by the standard deviation of the data distribution. Experimental results demonstrated that after filtering the uncertain data, the kappa coefficient in prediction substantially increased by 29.5%. The results indicate that spatial analysis and data mining algorithms, combined with the mechanism developed in this study, can produce more reliable results for the verification and forecasting of landslides in the study site.
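The two verification metrics reported, overall accuracy and the kappa coefficient, follow directly from the confusion matrix of predicted versus observed classes. A sketch with invented landslide/no-landslide counts (not the study's figures):

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, float)
    total = cm.sum()
    po = np.trace(cm) / total                # observed agreement
    pe = (cm.sum(0) @ cm.sum(1)) / total**2  # chance agreement
    return po, (po - pe) / (1 - pe)

# Hypothetical landslide / no-landslide verification counts
cm = [[90, 10],
      [ 5, 95]]
acc, kappa = accuracy_and_kappa(cm)
print(round(acc, 3), round(kappa, 3))  # → 0.925 0.85
```

Kappa discounts the agreement expected by chance, which is why it is the preferred measure when one class (no landslide) dominates the map.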
Directory of Open Access Journals (Sweden)
Kaihang Han
2016-01-01
Full Text Available In order to better interpret the failure features of the soil in front of a tunnel face, a new three-dimensional failure mechanism is proposed to analyze the limit support pressure of the tunnel face in multilayered cohesive-frictional soils. The new failure mechanism is composed of two truncated cones that represent the shear failure band and a distributed force acting on the truncated cones that represents the pressure arch effect. By introducing the concept of Terzaghi earth pressure theory, an approximation of the limit support pressure is calculated using limit analysis methods. The limit support pressures obtained from the new failure mechanism and from existing approaches are then compared, which shows that the new mechanism in this paper provides relatively satisfactory results.
Esteban, Pere; Prohom, Marc; Aguilar, Enric; Mestre, Olivier
2010-05-01
The analysis of temperature and precipitation change and variability at high elevations is a difficult issue due to the lack of long-term climatic series in those environments. Nonetheless, it is important to evaluate to what extent high elevations follow the same climate evolution as low-lying sites. In this work, using daily data from three Andorran weather stations (maintained by the power company Forces Elèctriques d'Andorra, FEDA), climate trends of annual and seasonal temperature and precipitation were obtained for the period 1934-2008. The series are complete (99.9%) and are located in a mountainous area ranging from 1110 m to 1600 m asl. As a preliminary step to the analysis, data rescue, quality control and homogeneity tests were applied to the daily data. For quality control, several procedures were applied to identify and flag suspicious or erroneous data: duplicated days, outliers, excessive differences between consecutive days, flat-line checking, days with maximum temperature lower than minimum temperature, and rounding analysis. All the station sites were visited to gather the available metadata. Concerning homogeneity, a homogeneous climate time series is defined as one where variations are caused only by variations in climate and not by non-climatic factors (i.e., changes in site location, instruments, station environment…). As a result, the homogeneity of the series was inspected using several methodologies applied in a complementary and independent way in order to attain solid results: the C3-SNHT (with software developed under the Spanish Government Grant CGL2007-65546-C03-02) and Caussinus-Mestre (C-M) approaches. In both cases, the tests were applied to the mean annual temperature and precipitation series, using Catalan and French series as references (provided respectively by the Meteorological Service of Catalonia and Météo-France, in the framework of the Action COST-ES0601: Advances in homogenisation methods of climate series: an integrated
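The SNHT family of tests locates a breakpoint by maximising a two-segment statistic over the standardised series. A minimal sketch of the classical single-break form with synthetic data (a simplification for illustration, not the C3-SNHT software used in the study):

```python
import numpy as np

def snht(series):
    """Standard Normal Homogeneity Test statistic T(k) for each candidate
    breakpoint k; the break is placed at the argmax of T(k)."""
    x = np.asarray(series, float)
    z = (x - x.mean()) / x.std()
    n = len(z)
    t = np.array([k * z[:k].mean() ** 2 + (n - k) * z[k:].mean() ** 2
                  for k in range(1, n)])
    return int(t.argmax()) + 1, float(t.max())

# Synthetic annual-mean series with a 1-degree shift after year 50
series = [10.0] * 50 + [11.0] * 50
k, tmax = snht(series)
print(k, round(tmax, 1))  # → 50 100.0
```

In real applications the candidate series is first expressed relative to homogeneous reference series (here, Catalan and French stations), so that climate signal cancels and only station-specific breaks remain.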
Directory of Open Access Journals (Sweden)
L.V. Abramov
2016-06-01
The paper presents the results of verification of the CRISS 5.3 code, comparing analysis results obtained with the CRISS 5.3 system against analytical formulas and against the results of qualitative and quantitative analyses based on certified nuclear plant PSA software tools.
THE MOTOR LEARNING, MEMORY, KNOWLEDGE OF RESULTS: COMPARATIVE ANALYSIS OF HOMOGENEOUS GROUPS
Directory of Open Access Journals (Sweden)
Francesco Perrotta
2009-12-01
Full Text Available There is ample experimental evidence that the fixing of a motor task benefits both from a lower relative frequency (FR) with which knowledge of the result (CR) is provided to the learner, and from having the subject formulate a subjective estimate (SS) of the outcome of the response before receiving CR. It has, however, been suggested that these two variables interact, meaning that an individual who has to make a subjective estimate of the error would benefit from a greater FR. Taking up an interesting investigation reported in a physiotherapy dissertation (Giulia Days, 2000/01), credited with having carried out a rigorous study, the authors of this paper tested the same working hypothesis with 60 young, right-handed, healthy subjects but with a different device (Biodex System 4): a simple right-elbow flexion with isokinetic contraction at 30 degrees/second. The protocol was as follows: (1) subjects assessed whether or not a mistake had been made after the end of each trial; (2) CR was provided either after each trial (100% FR) or after one trial in every five (20% FR); (3) to distinguish the subjects who formulated SS from those who did not, the latter were asked to perform a simple calculation immediately after the conclusion of each trial. All subjects performed 20 sets of 10 repetitions each during a single practice session. The next day a retention test was administered (1 set of 15 repetitions without CR or SS). The comparison between the groups at the retention test was performed with analysis of variance, before and after adjustment for the initial conditions. The results showed that, after adjustment, the group of subjects who received CR with 100% FR and formulated SS during the practice period performed significantly better on the retention test.
Grimalt, Susana; Harbeck, Stefan; Shegunova, Penka; Seghers, John; Sejerøe-Olsen, Berit; Emteborg, Håkan; Dabrio, Marta
2015-04-01
The feasibility of the production of a reference material for pesticide residue analysis in a cucumber matrix was investigated. Cucumber was spiked at 0.075 mg/kg with each of the 15 selected pesticides (acetamiprid, azoxystrobin, carbendazim, chlorpyrifos, cypermethrin, diazinon, (α + β)-endosulfan, fenitrothion, imazalil, imidacloprid, iprodione, malathion, methomyl, tebuconazole and thiabendazole) respectively. Three different strategies were considered for processing the material, based on the physicochemical properties of the vegetable and the target pesticides. As a result, a frozen spiked slurry of fresh cucumber, a spiked freeze-dried cucumber powder and a freeze-dried cucumber powder spiked by spraying the powder were studied. The effects of processing and aspects related to the reconstitution of the material were evaluated by monitoring the pesticide levels in the three materials. Two separate analytical methods based on LC-MS/MS and GC-MS/MS were developed and validated in-house. The spiked freeze-dried cucumber powder was selected as the most feasible material and more exhaustive studies on homogeneity and stability of the pesticide residues in the matrix were carried out. The results suggested that the between-unit homogeneity was satisfactory with a sample intake of dried material as low as 0.1 g. A 9-week isochronous stability study was undertaken at -20 °C, 4 °C and 18 °C, with -70 °C designated as the reference temperature. The pesticides tested exhibited adequate stability at -20 °C during the 9-week period as well as at -70 °C for a period of 18 months. These results constitute a good basis for the development of a new candidate reference material for selected pesticides in a cucumber matrix.
Universum Inference and Corpus Homogeneity
Vogel, Carl; Lynch, Gerard; Janssen, Jerom
Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.
Arnold, Steven M.; Pindera, Marek-Jerzy; Aboudi, Jacob
2003-01-01
This report summarizes the results of a numerical investigation into the spallation mechanism in plasma-sprayed thermal barrier coatings observed under spatially-uniform cyclic thermal loading. The analysis focuses on the evolution of local stress and inelastic strain fields in the vicinity of the rough top/bond coat interface during thermal cycling, and how these fields are influenced by the presence of an oxide film and spatially uniform and graded distributions of alumina particles in the metallic bond coat aimed at reducing the top/bond coat thermal expansion mismatch. The impact of these factors on the potential growth of a local horizontal delamination at the rough interface's crest is included. The analysis is conducted using the Higher-Order Theory for Functionally Graded Materials with creep/relaxation constituent modeling capabilities. For two-phase bond coat microstructures, both the actual and homogenized properties are employed in the analysis. The results reveal the important contributions of both the normal and shear stress components to the delamination growth potential in the presence of an oxide film, and suggest mixed-mode crack propagation. The use of bond coats with uniform or graded microstructures is shown to increase the potential for delamination growth by increasing the magnitude of the crack-tip shear stress component.
Bhat, Sujatha; Patil, Ajeetkumar; Rai, Lavanya; Kartha, V B; Chidangil, Santhosh
2012-01-01
A highly objective method, the High Performance Liquid Chromatography with Laser Induced Fluorescence (HPLC-LIF) technique, was used to study the protein profiles of normal and cervical cancer tissue homogenates. A total of 44 samples, including normal cervical biopsy samples from hysterectomy patients and samples from patients suffering from different stages of cervical cancer, were recorded by HPLC-LIF and analysed by Principal Component Analysis (PCA) to get statistical information on different tissue components. Discrimination of the different stages of the samples was carried out by considering three parameters: factor scores, spectral residual, and Mahalanobis distance. The diagnostic accuracy of the method was evaluated using Receiver Operating Characteristic (ROC) analysis and Youden's index (J) plots. The PCA results showed high sensitivity and specificity (~100%) for cervical cancer diagnosis. ROC and Youden's index curves for both the normal and malignant standard sets show good diagnostic accuracy with high AUC values. The statistical analysis has shown that the differences in protein profiles can be used to diagnose biochemical changes in the tissue, and thus can be readily applied to the detection of cervical cancer, even in situations where a histopathology examination is not easy because of the nonavailability of experienced pathologists.
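The ROC/Youden evaluation used here reduces to two quantities computable from classifier scores: the area under the ROC curve and the maximum of J = sensitivity + specificity - 1. A sketch with hypothetical discrimination scores (not the study's data):

```python
import numpy as np

def roc_auc_and_youden(scores, labels):
    """ROC AUC (via the rank/Mann-Whitney identity) and Youden's index
    J = max(sensitivity + specificity - 1) over all thresholds."""
    scores = np.asarray(scores, float)
    labels = np.asarray(labels, int)
    pos, neg = scores[labels == 1], scores[labels == 0]
    # AUC: probability that a positive scores higher than a negative
    auc = (np.mean(pos[:, None] > neg[None, :])
           + 0.5 * np.mean(pos[:, None] == neg[None, :]))
    js = []
    for thr in np.unique(scores):
        tpr = np.mean(pos >= thr)   # sensitivity at this threshold
        fpr = np.mean(neg >= thr)   # 1 - specificity
        js.append(tpr - fpr)
    return float(auc), float(max(js))

# Hypothetical Mahalanobis-distance scores: malignant (1) vs normal (0)
scores = [0.2, 0.4, 0.5, 0.9, 1.4, 1.8, 2.1, 2.5]
labels = [0,   0,   0,   1,   0,   1,   1,   1]
auc, j = roc_auc_and_youden(scores, labels)
print(round(auc, 2), round(j, 2))  # → 0.94 0.75
```

The threshold at which J peaks is the operating point usually reported alongside the AUC in diagnostic studies of this kind.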
Directory of Open Access Journals (Sweden)
Catherine Brasseur
2014-04-01
Full Text Available Hindgut homogenates of the termite Reticulitermes santonensis were incubated with carboxymethyl cellulose (CMC), crystalline celluloses or xylan substrates. The hydrolysates were analyzed with matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF MS). The method was first set up using acid hydrolysis analysis to characterize non-enzymatic profiles. Commercial enzymes of Trichoderma reesei or T. longibrachiatum were also tested to validate the enzymatic hydrolysis analysis. For CMC hydrolysis, data processing and visual display were optimized to obtain comprehensive profiles and to allow rapid comparison and evaluation of enzymatic selectivity according to the number of substituents of each hydrolysis product. Oligosaccharides with degrees of polymerization (DPs) ranging from three to 12 were measured from CMC, and the enzymatic selectivity was demonstrated. Neutral and acidic xylo-oligosaccharides with DPs ranging from three to 11 were measured from the xylan substrate. These results are of interest for lignocellulose biomass valorization and demonstrate the potential of termites and their symbiotic microbiota as a source of interesting enzymes for oligosaccharide production.
Cheng, Long; Jia, Yun; Oueslati, Abdelbacet; de Saxcé, Géry; Kondo, Djimedo
2015-04-01
Following in Gurson's footsteps, various authors have proposed macroscopic plastic models for porous solids with a pressure-sensitive dilatant matrix obeying the normality law (associated materials). The main objective of the present paper is to extend this class of models to porous materials in the context of non-associated plasticity. This is the case for a Drucker-Prager matrix in which the dilatancy angle differs from the friction angle, so classical limit analysis theory cannot be applied. For such materials, the second-to-last author has proposed a relevant modeling approach based on the concept of the bipotential, a function of both dual variables: the plastic strain rate and stress tensors. On this ground, after recalling the basic elements of the Drucker-Prager model, we present the corresponding variational principles and the extended limit analysis theorems. Then, we formulate a new variational approach for the homogenization of porous materials with a non-associated matrix. This is implemented by considering the hollow sphere model with a non-associated Drucker-Prager matrix. The proposed procedure delivers a closed-form expression of the macroscopic bifunctional, from which the criterion and a non-associated flow rule are readily obtained for the porous material. It is shown that these general results recover several available models as particular cases. Finally, the established results are assessed and validated by comparing their predictions to those obtained from finite element computations carried out on a cell representing the considered class of materials.
Energy Technology Data Exchange (ETDEWEB)
Mylvaganam, K [Centre for Advanced Materials Technology, University of Sydney, NSW 2006 (Australia); Zhang, L C [School of Mechanical and Manufacturing Engineering, University of New South Wales, NSW 2052 (Australia); Eyben, P; Vandervorst, W [IMEC, Kapeldreef 75, B-3001 Leuven (Belgium); Mody, J, E-mail: k.mylvaganam@usyd.edu.a, E-mail: Liangchi.zhang@unsw.edu.a, E-mail: eyben@imec.b, E-mail: jamody@imec.b, E-mail: vdvorst@imec.b [KU Leuven, Electrical Engineering Department, INSYS, Kasteelpark Arenberg 10, B-3001 Leuven (Belgium)
2009-07-29
This paper explores the evolution mechanisms of metastable phases during the nanoindentation on monocrystalline silicon. Both the molecular dynamics (MD) and the in situ scanning spreading resistance microscopy (SSRM) analyses were carried out on Si(100) orientation, and for the first time, experimental verification was achieved quantitatively at the same nanoscopic scale. It was found that under equivalent indentation loads, the MD prediction agrees extremely well with the result experimentally measured using SSRM, in terms of the depth of the residual indentation marks and the onset, evolution and dimension variation of the metastable phases, such as {beta}-Sn. A new six-coordinated silicon phase, Si-XIII, transformed directly from Si-I was discovered. The investigation showed that there is a critical size of contact between the indenter and silicon, beyond which a crystal particle of distorted diamond structure will emerge in between the indenter and the amorphous phase upon unloading.
Initial verification and validation of RAZORBACK - A research reactor transient analysis code
Energy Technology Data Exchange (ETDEWEB)
Talley, Darren G. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)
2015-09-01
This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
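The point reactor kinetics equations that Razorback couples to the thermal-hydraulic equations can be illustrated with a minimal one-delayed-group integrator. The kinetics parameters below are illustrative placeholders, not ACRR values.

```python
# Sketch: one-delayed-group point reactor kinetics, advanced with RK4.
#   dn/dt = ((rho - beta)/Lambda) * n + lam * c
#   dc/dt = (beta/Lambda) * n - lam * c

def prke_step(n, c, rho, beta, Lambda, lam, dt):
    """One RK4 step of the one-group point kinetics equations."""
    def f(n, c):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        return dn, dc
    k1 = f(n, c)
    k2 = f(n + 0.5 * dt * k1[0], c + 0.5 * dt * k1[1])
    k3 = f(n + 0.5 * dt * k2[0], c + 0.5 * dt * k2[1])
    k4 = f(n + dt * k3[0], c + dt * k3[1])
    n += dt * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6
    c += dt * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6
    return n, c

beta, Lambda, lam = 0.0073, 1e-4, 0.08     # illustrative values only
n = 1.0
c = beta * n / (Lambda * lam)              # equilibrium precursors
for _ in range(1000):                      # 1 s at dt = 1 ms, rho = 0
    n, c = prke_step(n, c, 0.0, beta, Lambda, lam, 1e-3)
print(round(n, 6))  # critical and at equilibrium: n stays at 1.0
```

A full transient solver adds reactivity feedback from the fuel and coolant temperatures computed by the conservation equations, which is precisely the coupling being verified.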
Energy Technology Data Exchange (ETDEWEB)
Hoinard, G.; Estevez, R.; Franciosi, P. [Univ. Paris Nord, Villetaneuse (France)
1995-04-01
The hardening anisotropy, under quasi-static loading, of {gamma}/{gamma}{prime} superalloy single crystals is experimentally investigated and analyzed according to a description of crystal hardening at the scale of dislocation densities and interactions, regardless of the two-phase nature of the material. A matrix of hardening coefficients is estimated from monotonic and sequential loadings at 650 C on <001>, <011>, and <111> oriented samples of two different alloys with similar compositions and structures. This hardening analysis distinguishes three types of slip systems, all having the <110> type slip direction: octahedral systems with an easy or an uneasy dislocation motion direction, and cubic systems. The estimated interactions between these system pairs separate, within each of the three system groups, a coplanar coefficient (including self-interaction) from a non-coplanar one. As a whole, the superalloy single crystal hardening anisotropy at this medium temperature comes out comparable to that of the most anisotropic pure f.c.c. metals, mainly because of the hardening asymmetry on the octahedral systems. If the asymmetry is averaged out, the remaining anisotropy falls to the anisotropy level of high stacking fault energy f.c.c. metals. These quantitative estimates are limited by the questionable assumption of homogeneous behavior for such a two-phase crystal structure. Analyses accounting for both the crystalline structure and the two-phase nature of these superalloys will be the purpose of the forthcoming Part 2.
Directory of Open Access Journals (Sweden)
A.V. Antonov
2016-12-01
Operation of technical equipment involves three consecutive stages, each characterized by a particular trend in the behavior of the failure flow parameter (FFP). The FFP value is approximately constant during normal operation. In this case, the equipment operation process is assumed to be temporally homogeneous and the reliability indicators are calculated by conventional methods. At the burn-in stage the FFP decreases with time, while at the aging stage it increases. Therefore, the operating times between two successive failures at the burn-in and aging stages are not identically distributed random values, and the flow of events cannot be assumed to be recurrent. The reliability calculation must therefore take into account that the flow of failures is temporally inhomogeneous. The paper describes a method to estimate NPP equipment reliability indicators that allows the potential inhomogeneity of the failure flow to be taken into account. The specific nature of the incoming statistical data on failures is shown. The application of the normalizing flow function model to the calculation of the required reliability indicators is described. A practical example of an analysis of data on the Bilibino NPP CPS KNK-56 component failures is presented.
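As a minimal illustration of testing a failure flow for temporal inhomogeneity, the sketch below fits a power-law NHPP (Crow-AMSAA) intensity to failure times. This is a generic trend test, not the normalizing-flow-function model the paper applies, and the failure times are hypothetical.

```python
# Sketch: maximum-likelihood fit of a power-law NHPP to failure times.
# beta > 1 indicates an aging trend, beta < 1 burn-in, beta ~ 1 a
# homogeneous (recurrent) failure flow.

import math

def crow_amsaa_mle(times, t_end):
    """MLE of (beta, lam) for intensity u(t) = lam * beta * t**(beta - 1),
    given failure times observed on (0, t_end]."""
    n = len(times)
    beta = n / sum(math.log(t_end / t) for t in times)
    lam = n / t_end ** beta
    return beta, lam

# Hypothetical failure times (hours) clustered late in life.
times = [400, 620, 710, 820, 860, 910, 940, 970]
beta, lam = crow_amsaa_mle(times, 1000.0)
print(beta > 1.0)  # True: failure intensity increasing, i.e. aging
```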
Energy Technology Data Exchange (ETDEWEB)
Camacho, C.; Perez-Alija, J.; Pedro, A.
2013-07-01
During the administration of each field, this information is sampled and collected in files called Dynalog files. The objective of this work is the analysis of these files as a complement to the regular quality control of the EDW technique, as well as an independent verification system for the generation and control of dynamic wedge fields. (Author)
Methods of Software Verification
Directory of Open Access Journals (Sweden)
R. E. Gurin
2015-01-01
This article is devoted to the problem of software (SW) verification. Software verification methods are designed to check software for compliance with stated requirements such as correctness, system security, adaptability to small changes in the environment, portability, and compatibility. These methods differ both in how they operate and in how they achieve their results. The article describes the static and dynamic methods of software verification, with particular attention to the method of symbolic execution. In the review of static analysis, the deductive method and model-checking methods are discussed and described, and the pros and cons of each method are emphasized. The article considers a classification of testing techniques for each method. We present and analyze the characteristics and mechanisms of static dependency analysis, as well as its variants, which can reduce the number of false positives in situations where the current state of the program combines two or more states obtained either along different execution paths or when working with multiple object values. Dependencies connect various types of software objects: single variables, elements of composite variables (structure fields, array elements), the sizes of heap areas, the lengths of strings, and the numbers of initialized array elements in code verified using static methods. The article pays attention to the identification of dependencies within the framework of abstract interpretation, and gives an overview and analysis of inference tools. Dynamic analysis methods such as testing, monitoring, and profiling are presented and analyzed, together with some of the tools that can be applied to software when using them. Based on this work a conclusion is drawn, which describes the most relevant problems of these analysis techniques and methods for their solution.
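The symbolic-execution idea highlighted in the review can be caricatured in a few lines: each branch of the program under test contributes a path condition, and a solver searches for a concrete input satisfying every condition on the path. The toy program and brute-force "solver" below are illustrative; real tools (e.g. KLEE) use SMT solvers instead.

```python
# Sketch: toy symbolic execution as path-condition enumeration + solving.

def program_paths():
    """Path conditions for:  if x > 10: (if x % 2 == 0: BUG)."""
    return {
        "bug": [lambda x: x > 10, lambda x: x % 2 == 0],
        "odd_branch": [lambda x: x > 10, lambda x: x % 2 != 0],
        "else_branch": [lambda x: x <= 10],
    }

def solve(conds, domain=range(-100, 101)):
    """Naive constraint solving: return a witness input, or None."""
    for x in domain:
        if all(c(x) for c in conds):
            return x
    return None

# One concrete test input per feasible path = full path coverage.
witnesses = {name: solve(conds) for name, conds in program_paths().items()}
print(witnesses["bug"] is not None and witnesses["bug"] > 10)  # → True
```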
Indian Academy of Sciences (India)
Hossein Kheirfam; Mehdi Vafakhah
2015-08-01
Regional analysis of suspended sediment yield (SSY) is commonly used to estimate sediment at a particular site where little or no information on sediment yield is available. In this research, the efficiency of three input selection and homogenization methods was evaluated for the estimation of SSY. To this end, 42 sediment measurement stations and their upstream watersheds were selected, and a sediment rating curve was estimated for each station using regression models. Mean annual SSY was estimated using the sediment rating curve and daily discharge. In order to determine the independent variables governing sediment yield, 11 physiographical, one climatic, and two hydrologic variables of the study watersheds were selected. The most effective independent variables were then selected using principal component analysis (PCA), the Gamma test (GT), and stepwise regression (SR) techniques. After reducing the 14 input variables to five (using PCA and GT) and two (using SR), the watersheds were divided into homogeneous regions by Andrews curve (AC), cluster analysis (CA), and canonical discriminant function (CDFs) techniques. The watersheds were divided into two (using PCA-AC), three (using PCA-CA, PCA-CDFs and GT-CDFs), four (using GT-CA, GT-AC and SR-CA) and five (using SR-AC) homogeneous regions. Multiple regression models estimating mean annual SSY as a function of five (using PCA and GT) or two (using SR) watershed characteristics were built for each homogeneous region, and compared to the actual mean annual SSY at each station using relative error (RE), coefficient of efficiency (CE), and relative root mean square error (RRMSE). The results showed that preprocessing the input variables by means of the PCA and GT techniques improved both the determination of homogeneous stations and the developed models. According to the results, the best technique for determining homogeneous watersheds was the AC technique, with RE=49.24%, RRMSE=43.75% and CE=71.04%.
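The three goodness-of-fit measures used to compare the regional models can be computed as below. CE is taken here as the Nash-Sutcliffe coefficient of efficiency, a common reading of that abbreviation, and the observed/estimated yields are made-up values, not the study's data.

```python
# Sketch: RE, RRMSE, and CE on hypothetical observed/estimated SSY values.

import math

def re_pct(obs, est):
    """Mean absolute relative error, %."""
    return 100 * sum(abs(o - e) / o for o, e in zip(obs, est)) / len(obs)

def rrmse_pct(obs, est):
    """Root-mean-square error relative to the observed mean, %."""
    mse = sum((o - e) ** 2 for o, e in zip(obs, est)) / len(obs)
    return 100 * math.sqrt(mse) / (sum(obs) / len(obs))

def ce(obs, est):
    """Nash-Sutcliffe coefficient of efficiency (1 = perfect fit)."""
    mean_o = sum(obs) / len(obs)
    ss_res = sum((o - e) ** 2 for o, e in zip(obs, est))
    ss_tot = sum((o - mean_o) ** 2 for o in obs)
    return 1 - ss_res / ss_tot

obs = [120.0, 85.0, 260.0, 40.0]   # hypothetical yields, t km^-2 yr^-1
est = [100.0, 90.0, 240.0, 55.0]
print(round(re_pct(obs, est), 1), round(rrmse_pct(obs, est), 1),
      round(ce(obs, est), 3))
```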
Homogeneous crystal nucleation in polymers.
Schick, Christoph; Androsch, R; Schmelzer, Juern W P
2017-07-14
The pathway of crystal nucleation significantly influences the structure and properties of semi-crystalline polymers. Crystal nucleation is normally heterogeneous at low supercooling, and homogeneous at high supercooling, of the polymer melt. Homogeneous nucleation in bulk polymers has been, so far, hardly accessible experimentally, and was even doubted to occur at all. This topical review summarizes experimental findings on homogeneous crystal nucleation in polymers. Recently developed fast scanning calorimetry, with cooling and heating rates up to 10⁶ K s⁻¹, allows for detailed investigations of nucleation near and even below the glass transition temperature, including analysis of nuclei stability. As for other materials, the maximum homogeneous nucleation rate for polymers is located close to the glass transition temperature. In the experiments discussed here, it is shown that polymer nucleation is homogeneous at such temperatures. Homogeneous nucleation in polymers is discussed in the framework of classical nucleation theory. The majority of our observations are consistent with the theory. The discrepancies may guide further research, particularly experiments to progress theoretical development. Progress in the understanding of homogeneous nucleation is much needed, since most of the modelling approaches dealing with polymer crystallization exclusively consider homogeneous nucleation. This is also the basis for advancing theoretical approaches to the much more complex phenomena governing heterogeneous nucleation. © 2017 IOP Publishing Ltd.
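The classical-nucleation-theory framework invoked above can be sketched numerically: the homogeneous barrier for a spherical nucleus and the resulting Arrhenius-type rate. All parameter values below are illustrative stand-ins, not fitted polymer data.

```python
# Sketch: classical nucleation theory for homogeneous nucleation.
#   DeltaG* = 16*pi*sigma^3 / (3*dg^2)   (spherical nucleus barrier)
#   J = J0 * exp(-DeltaG* / (k_B * T))   (steady-state nucleation rate)
# sigma: interfacial free energy (J/m^2); dg: bulk driving force (J/m^3).

import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def barrier(sigma, dg):
    """Homogeneous nucleation barrier for a spherical nucleus, J."""
    return 16 * math.pi * sigma ** 3 / (3 * dg ** 2)

def rate(j0, sigma, dg, temp):
    """Steady-state nucleation rate J = J0 exp(-DeltaG*/kT)."""
    return j0 * math.exp(-barrier(sigma, dg) / (K_B * temp))

# Deeper supercooling -> larger |dg| -> lower barrier -> faster nucleation.
shallow = rate(1e30, 0.01, 5e6, 400.0)
deep = rate(1e30, 0.01, 2e7, 350.0)
print(deep > shallow)  # → True
```

The strong exponential dependence on the barrier is why the maximum nucleation rate sits near the glass transition: the driving force grows on cooling until molecular mobility (absorbed into J0 in real treatments) shuts the process down.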
Energy Technology Data Exchange (ETDEWEB)
Thomas, D.; Pickup, S.; Zhou, R.; Glickson, J. [Dept. of Radiology, Univ. of Pennsylvania, Philadelphia, PA (United States); Ferrari, V.A. [Dept. of Radiology, Univ. of Pennsylvania, Philadelphia, PA (United States); Dept. of Medicine, Div. of Cardiology, Univ. of Pennsylvania (United States)
2005-07-01
Purpose: to compare strain analysis and wall thickening (WT) analysis in differentiating the infarcted, adjacent, and remote zones in a rat model of myocardial infarction (MI). Material and methods: three normal (NL) rats and ten rats subjected to myocardial infarction were imaged on a 4.7T scanner. Gradient-echo and SPAMM-tagged cine images were acquired at three short-axis levels of the left ventricle (LV). A homogeneous strain analysis (principal strains {lambda}{sub 1} and {lambda}{sub 2}, displacement D, angle {beta}) and a WT analysis (mm- and %-thickening) were performed in all slices demonstrating MI. Regional function was compared between infarcted rats (infarcted, adjacent, and remote zones) and corresponding regions in the NL rats. Additional segmental analysis was performed in the NL rats for the anterior, lateral, inferior, and septal walls. Results: in the NL rats, {lambda}{sub 1} (greatest radial thickening) was greatest in the lateral and anterior walls. WT analysis showed a pattern of function similar to {lambda}{sub 1}; however, regional differences using WT analysis were not significant. {lambda}{sub 2} (greatest circumferential shortening) was most negative in the anterior wall. D was greatest in the lateral and inferior walls. The angle {beta} was radially directed in all segments. In the infarcted rats, both strain and WT analyses revealed significant impairment of function in the infarcted and adjacent zones as compared to NL (p < 0.001). However, only the strain analysis ({lambda}{sub 1}, {lambda}{sub 2}, p < 0.001) detected significant remote myocardial dysfunction. Myocardial function differed significantly between the infarcted and adjacent zones and between the infarcted and remote regions. Strain analysis ({lambda}{sub 2}, D, {beta}, p < 0.001) also identified significant functional differences between the adjacent and remote zones; however, no statistically significant differences were found using WT analysis. (orig.)
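The principal strains and angle reported above follow from the eigen-decomposition of the 2-D strain tensor; a minimal sketch with made-up tensor components (not values from the study):

```python
# Sketch: principal strains (lambda_1 >= lambda_2) and principal angle
# from a symmetric 2-D strain tensor [[exx, exy], [exy, eyy]].

import math

def principal_strains(exx, eyy, exy):
    """Closed-form eigenvalues and principal angle (degrees)."""
    mean = (exx + eyy) / 2.0
    r = math.hypot((exx - eyy) / 2.0, exy)   # Mohr's circle radius
    lam1, lam2 = mean + r, mean - r
    beta = 0.5 * math.degrees(math.atan2(2 * exy, exx - eyy))
    return lam1, lam2, beta

# Radial thickening (positive) with circumferential shortening (negative):
lam1, lam2, beta = principal_strains(0.30, -0.12, 0.05)
print(lam1 > 0 > lam2)  # → True
```

A healthy segment in this convention shows {lambda}{sub 1} > 0 (thickening) and {lambda}{sub 2} < 0 (shortening); impaired segments pull both toward zero.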
Photometric redshift analysis in the Dark Energy Survey Science Verification data
Sánchez, C; Lin, H; Miquel, R; Abdalla, F B; Amara, A; Banerji, M; Bonnett, C; Brunner, R; Capozzi, D; Carnero, A; Castander, F J; da Costa, L A N; Cunha, C; Fausti, A; Gerdes, D; Greisel, N; Gschwend, J; Hartley, W; Jouvel, S; Lahav, O; Lima, M; Maia, M A G; Martí, P; Ogando, R L C; Ostrovski, F; Pellegrini, P; Rau, M M; Sadeh, I; Seitz, S; Sevilla-Noarbe, I; Sypniewski, A; de Vicente, J; Abbot, T; Allam, S S; Atlee, D; Bernstein, G; Bernstein, J P; Buckley-Geer, E; Burke, D; Childress, M J; Davis, T; DePoy, D L; Dey, A; Desai, S; Diehl, H T; Doel, P; Estrada, J; Evrard, A; Fernández, E; Finley, D; Flaugher, B; Gaztanaga, E; Glazebrook, K; Honscheid, K; Kim, A; Kuehn, K; Kuropatkin, N; Lidman, C; Makler, M; Marshall, J L; Nichol, R C; Roodman, A; Sánchez, E; Santiago, B X; Sako, M; Scalzo, R; Smith, R C; Swanson, M E C; Tarle, G; Thomas, D; Tucker, D L; Uddin, S A; Valdés, F; Walker, A; Yuan, F; Zuntz, J
2014-01-01
We present results from a study of the photometric redshift performance of the Dark Energy Survey (DES), using the early data from a Science Verification (SV) period of observations in late 2012 and early 2013 that provided science-quality images for almost 200 sq. deg. at the nominal depth of the survey. We assess the photometric redshift performance using about 15000 galaxies with spectroscopic redshifts available from other surveys. These galaxies are used, in different configurations, as a calibration sample, and photo-$z$'s are obtained and studied using most of the existing photo-$z$ codes. A weighting method in a multi-dimensional color-magnitude space is applied to the spectroscopic sample in order to evaluate the photo-$z$ performance with sets that mimic the full DES photometric sample, which is on average significantly deeper than the calibration sample due to the limited depth of spectroscopic surveys. Empirical photo-$z$ methods using, for instance, Artificial Neural Networks or Random Forests, y...
Markel, Vadim A
2013-01-01
Reflection and refraction of electromagnetic waves by artificial periodic composites (metamaterials) can be accurately modeled by an effective medium theory only if the boundary of the medium is explicitly taken into account and the two effective parameters of the medium -- the index of refraction and the impedance -- are correctly determined. Theories that consider infinite periodic composites do not satisfy the above condition. As a result, they cannot model reflection and transmission by finite samples with the desired accuracy and are not useful for design of metamaterial-based devices. As an instructive case in point, we consider the "current-driven" homogenization theory, which has recently gained popularity. We apply this theory to the case of one-dimensional periodic medium wherein both exact and homogenization results can be obtained analytically in closed form. We show that, beyond the well-understood zero-cell limit, the current-driven homogenization result is inconsistent with the exact reflection...
Directory of Open Access Journals (Sweden)
Yaodeng Chen
2014-01-01
There are two different approaches to formulating an adjoint numerical model (ANM). To address the disputes arising from these construction methods, the differences between the nonlinear shallow water equation and its adjoint equation are analyzed, and the hyperbolicity and homogeneity of the adjoint equation are discussed. Then, based on unstructured meshes and the finite volume method, a new adjoint model is advanced by obtaining the numerical model of the adjoint equations directly. Using a gradient check, the correctness of the adjoint model was verified. The results of twin experiments to invert the bottom friction coefficient (Manning's roughness coefficient) indicate that the adjoint model can extract the observation information and produce good-quality inversions. The reasons for the disputes about the construction methods of ANMs are also discussed in the paper.
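The gradient check mentioned above compares the adjoint-derived gradient with a finite difference of the cost function; it is sketched here for a toy scalar cost rather than the shallow-water functional of the paper.

```python
# Sketch: gradient check. The ratio of the centered finite difference to
# the analytic (adjoint-equivalent) gradient should approach 1 as the
# perturbation eps shrinks; systematic deviation flags an adjoint bug.

def cost(m):
    """Toy cost function of a scalar control parameter m."""
    return (m - 2.0) ** 2 + 0.5 * m ** 4

def adjoint_grad(m):
    """Analytic gradient of the toy cost (stands in for the adjoint run)."""
    return 2.0 * (m - 2.0) + 2.0 * m ** 3

def gradient_check(m, eps):
    fd = (cost(m + eps) - cost(m - eps)) / (2.0 * eps)
    return fd / adjoint_grad(m)

ratios = [gradient_check(1.3, 10.0 ** -k) for k in range(1, 6)]
print(all(abs(r - 1.0) < 1e-3 for r in ratios[2:]))  # → True
```

In a real twin experiment the "gradient" is with respect to the full Manning-coefficient field, and the same ratio test is applied component by component or along random perturbation directions.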
Homogeneous Spaces and Equivariant Embeddings
Timashev, DA
2011-01-01
Homogeneous spaces of linear algebraic groups lie at the crossroads of algebraic geometry, the theory of algebraic groups, classical projective and enumerative geometry, harmonic analysis, and representation theory. By standard reasons of algebraic geometry, in order to solve various problems on a homogeneous space it is natural and helpful to compactify it while keeping track of the group action, i.e. to consider equivariant completions or, more generally, open embeddings of a given homogeneous space. Such equivariant embeddings are the subject of this book. We focus on the classification of equivariant embeddings.
Energy Technology Data Exchange (ETDEWEB)
Jamsran, Narankhuu; Lim, Ock Taeck [University of Ulsan, Ulsan (Korea, Republic of)
2012-06-15
We investigated the efficacy of fuel stratification in a pre-mixture of dimethyl ether (DME) and n-butane, which have different autoignition characteristics, for reducing the pressure rise rate (PRR) of homogeneous charge compression ignition engines. A new chemical reaction model was created by mixing DME and n-butane and compared with existing chemical reaction models to verify the effects observed. The maximum PRR depended on the mixture ratio. When DME was charged stratified and n-butane was charged homogeneously, the maximum PRR was the lowest among all the mixtures studied. Calculations were performed using CHEMKIN and modified using SENKIN software.
Pierzga, M. J.
1980-05-01
To verify the results of a streamline curvature numerical analysis method, an investigation has been conducted in which comparisons are made between analytical and experimental data of an axial flow fan. Using loss model calculations to determine the proper outlet flow deviation angles, the flow field in the hub to tip plane of the turbomachine was calculated. These deviation angle calculations allow the inviscid streamline curvature (SLC) analysis to model a real fluid with viscous losses. The verification of this calculated flow field is the primary objective of the investigation; however, in addition to the hub to tip flow field, the numerical analysis of the blade-to-blade flow field was also investigated in some detail. To verify the accuracy of the numerical results, detailed flow surveys were conducted upstream and downstream of the test rotor of the axial flow fan. To obtain the necessary data to verify the blade-to-blade solution, internal blade row data were also collected. The internal blade row measurements were obtained by using a rotating circumferential traversing mechanism which was designed and implemented during this investigation. Along with these two sets of survey data, the static pressure distributions on the pressure and suction surfaces of the test rotor were also obtained.
Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex
2008-01-01
Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
Automated verification of flight software. User's manual
Saib, S. H.
1982-01-01
AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
Fingerprint verification based on wavelet subbands
Huang, Ke; Aviyente, Selin
2004-08-01
Fingerprint verification has been deployed in a variety of security applications. Traditional minutiae-detection-based verification algorithms do not utilize the rich discriminatory texture structure of fingerprint images. Furthermore, minutiae detection requires substantial image-quality enhancement and is thus error-prone. In this paper, we propose an algorithm for fingerprint verification using the statistics of subbands from wavelet analysis. One important feature for each frequency subband is the distribution of the wavelet coefficients, which can be modeled with a Generalized Gaussian Density (GGD) function. A fingerprint verification algorithm that combines the GGD parameters from different subbands is proposed to match two fingerprints. The verification algorithm in this paper is tested on a set of 1,200 fingerprint images. Experimental results indicate that wavelet analysis provides useful features for the task of fingerprint verification.
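Fitting a GGD to subband coefficients is commonly done by moment matching on Mallat's ratio; a sketch follows. The subband decomposition itself (e.g. via a wavelet library) is omitted, and the paper's exact estimator may differ.

```python
# Sketch: moment-matching estimate of the GGD shape parameter b from
# wavelet coefficients. For a zero-mean GGD,
#   F(b) = (E|x|)^2 / E[x^2] = Gamma(2/b)^2 / (Gamma(1/b)*Gamma(3/b)),
# a monotone function of b (b = 1: Laplacian, b = 2: Gaussian), so b is
# recovered by inverting F at the sample moment ratio.

import math

def ggd_ratio(b):
    g = math.gamma
    return g(2.0 / b) ** 2 / (g(1.0 / b) * g(3.0 / b))

def shape_from_ratio(r, lo=0.1, hi=10.0):
    """Invert the monotone ratio function by bisection."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if ggd_ratio(mid) < r:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def estimate_shape(coeffs):
    """Estimate b from a list of subband coefficients."""
    m1 = sum(abs(c) for c in coeffs) / len(coeffs)
    m2 = sum(c * c for c in coeffs) / len(coeffs)
    return shape_from_ratio(m1 * m1 / m2)

# Sanity check: inverting F at F(2.0) must recover the Gaussian shape.
print(round(shape_from_ratio(ggd_ratio(2.0)), 3))  # → 2.0
```

A per-subband feature vector of fitted (shape, scale) pairs can then be compared between two fingerprints with a simple distance measure.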
DEFF Research Database (Denmark)
Christensen, A H; Gjørup, T; Hilden, J;
1992-01-01
that uncertain statements were less frequently (61%) confirmed than were certain ones (85%). It is concluded that observer homogeneity is only moderate with regard to the histologic diagnosis of H. pylori, which should be considered both in daily clinical routine and in scientific studies. Disagreement between...
Dance, Michael J.
Sun Nuclear's SNC Patient (TM) analysis software. MapCHECK 2 showed a slightly better agreement with planned data for IMRT verifications, with a mean pass rate of 99.4% for the clinically used acceptance criteria of 3%/3mm. MapCHECK 2's 99.4% mean pass rate for IMRT verifications was 1.4% higher than ArcCHECK's mean pass rate. For VMAT verifications, the MapCHECK 2 had mean pass rates of 99.6% and 100% for the two arcs respectively, resulting in 1.25% to 1.92% higher mean passing rates than those measured by the ArcCHECK using acceptance criteria of 3%/3mm. MapCHECK 2 showed consistently higher ROI-specific mean gamma passing rates, ranging from +0.2% to +5.6%. While neither diode array showed any advantage in regards to D95 measurements within the PTV, MapCHECK 2 again showed closer comparison data in the CTV/GTV, with an absolute deviation of -1.14 Gy compared to -3.39 Gy as measured by the ArcCHECK. Lastly, while the MapCHECK 2 and ArcCHECK both closely matched the reference doses within the PTV and CTV/GTV, the ArcCHECK consistently overestimated the maximum absolute dose to all ROI, from 0.026 Gy to 2.243 Gy. In conclusion, the MapCHECK 2 diode array's measured data more closely matched planned data compared to the ArcCHECK diode array for IMRT verifications. While MapCHECK 2 showed marginally better gamma passing rates than the ArcCHECK diode array, the ArcCHECK's ability to simultaneously measure flatness, symmetry, output, and MLC positional accuracy as a function of gantry angle makes it a more realistic and efficient measurement device for VMAT verifications.
Ma, Yu-Ting; Pei, Zhi-Guo; Chen, Zhong-Xiang
2017-07-01
A piezoelectric centrifugal pump was developed previously to overcome the low frequency responses of piezoelectric pumps with check valves and the liquid reflux of conventional valveless piezoelectric pumps. However, the electro-mechanical-fluidic analysis of this pump had not been done. Therefore, multi-field analysis and experimental verification of piezoelectrically actuated centrifugal valveless pumps are conducted for liquid transport applications. The valveless pump consists of two piezoelectric sheets and a metal tube, with the piezoelectric elements pushing the metal tube to swing at its first bending resonant frequency. The centrifugal force generated by the swinging motion forces the liquid out of the metal tube. The governing equations for the solid and fluid domains are established, and the coupling relations of the mechanical, electrical and fluid fields are described. The bending resonant frequency and bending mode in the solid domain are discussed, and the liquid flow rate, velocity profile, and gauge pressure are investigated in the fluid domain. The working frequency and flow rate for different component sizes are analyzed and verified through experiments to guide the pump design. A fabricated prototype with an outer diameter of 2.2 mm and a length of 80 mm produced a largest flow rate of 13.8 mL/min at a backpressure of 0.8 kPa with a driving voltage of 80 Vpp. By solving the electro-mechanical-fluidic coupling problem, the model developed can provide theoretical guidance on the optimization of centrifugal valveless pump characteristics.
Rahmati, A. H.; Mohammadimehr, M.
2014-05-01
Electro-thermo-mechanical vibration analysis of a non-uniform and non-homogeneous boron nitride nanorod (BNNR) embedded in an elastic medium is presented. The steady-state heat transfer equation without an external heat source is developed for a non-homogeneous rod, and the temperature distribution is derived. Using Maxwell's equation and nonlocal elasticity theory, the coupled displacement and electrical potential equations are presented. The differential quadrature method (DQM) is implemented to evaluate the natural frequencies. The effects of attached mass, lower and higher vibrational modes, elastic medium, piezoelectric coefficient, dielectric coefficient, cross-section coefficient, non-homogeneity parameter, and small scale parameter on the natural frequency are investigated. Appropriate values of the Winkler and Pasternak moduli for axial vibration of the boron nitride nanorod are reported. A mass sensitivity limit of 10⁻¹ zg is derived for BNNR-based nano-electro-mechanical sensors. The results show that BNNRs with the C-F boundary condition are more sensitive to attached mass than those with the C-C boundary condition, and the sensitivity range for BNNRs is identified. It is concluded that the frequency ratio decreases under electro-thermal loading, and that electrical loading is more effective in non-uniform nanorods than in uniform ones. The natural frequency of BNNRs can be varied using different cross-section coefficients and non-homogeneity parameters. This fact can be employed as a practical tool to design and control nano-devices and nano-electro-mechanical systems (NEMS) to prevent the resonance phenomenon.
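The differential quadrature method reduces derivatives at grid points to weighted sums of nodal values. The first-derivative weighting matrix built from Lagrange polynomials, the core ingredient of any DQM frequency analysis, can be sketched and checked on a polynomial:

```python
# Sketch: DQM first-derivative weighting matrix D (Quan-Chang/Shu form):
#   D[i][j] = P(x_i) / ((x_i - x_j) * P(x_j))  for i != j,
#   D[i][i] = -sum_{j != i} D[i][j],
# where P(x_i) = prod_{k != i} (x_i - x_k). Then f'(x_i) ~ sum_j D[i][j]*f(x_j),
# exact for polynomials up to degree n-1.

def dqm_first_derivative_matrix(x):
    n = len(x)
    P = [1.0] * n
    for i in range(n):
        for k in range(n):
            if k != i:
                P[i] *= x[i] - x[k]
    D = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j:
                D[i][j] = P[i] / ((x[i] - x[j]) * P[j])
        D[i][i] = -sum(D[i][j] for j in range(n) if j != i)
    return D

x = [0.0, 0.25, 0.5, 0.75, 1.0]
D = dqm_first_derivative_matrix(x)
f = [xi ** 2 for xi in x]                       # test on f(x) = x^2
df = [sum(D[i][j] * f[j] for j in range(len(x))) for i in range(len(x))]
print(all(abs(df[i] - 2 * x[i]) < 1e-9 for i in range(len(x))))  # → True
```

For the vibration problem, the second-derivative matrix (D applied twice, or its direct analogue) turns the governing equation plus boundary conditions into an algebraic eigenvalue problem whose eigenvalues give the natural frequencies.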
Salmonson, J D
1993-01-01
Herein I present numerical calculations of lightcurves of homogeneous and structured afterglows with various lateral expansion rates as seen from any vantage point. Such calculations allow for direct simulation of observable quantities for complex afterglows with arbitrary energy distributions and lateral expansion paradigms. A simple, causal model is suggested for lateral expansion of the jet as it evolves; namely, that the lateral expansion kinetic energy derives from the forward kinetic energy. As such the homogeneous jet model shows that lateral expansion is important at all times in the afterglow evolution and that analytical scaling laws do a poor job at describing the afterglow decay before and after the break. In particular, I find that lateral expansion does not cause a break in the lightcurve as had been predicted. A primary purpose of this paper is to study structured afterglows, which do a good job of reproducing global relationships and correlations in the data and thus suggest the possibility of...
Directory of Open Access Journals (Sweden)
Dewei Tang
2017-03-01
The main task of the third Chinese lunar exploration project is to obtain soil samples greater than two meters in length and to acquire bedding information from the surface of the moon. The driving component is the power output unit of the drilling system in the lander; it provides drilling power for the core drilling tools. High temperatures can cause the sensors, permanent magnet, gears, and bearings to suffer irreversible damage. In this paper, a thermal analysis model for this driving component, based on the thermal network method (TNM), was established, and the model was solved using the quasi-Newton method. A vacuum test platform was built, and an experimental verification method (EVM) was applied to measure the surface temperature of the driving component. Then, the TNM was optimized based on the principle of heat distribution. Through comparative analyses, the reasonableness of the TNM is validated. Finally, the static temperature field of the driving component was predicted and the “safe working times” of every mode are given.
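A thermal network of the kind the TNM constructs reduces to nodal heat balances between lumped nodes joined by conductances. A minimal three-node steady-state sketch with illustrative conductances follows; the paper solves its (larger, nonlinear) network with a quasi-Newton method, but simple Gauss-Seidel iteration suffices for this toy linear case.

```python
# Sketch: steady-state thermal network. For each free node i:
#   Q_i + sum_j G[i][j]*(T_j - T_i) = 0  ->  T_i = (Q_i + sum G*T_j)/sum G

def solve_network(G, Q, fixed, T0=300.0, iters=2000):
    """Gauss-Seidel solve; G is the symmetric conductance matrix (W/K),
    Q the dissipation per node (W), fixed maps node -> held temperature."""
    n = len(G)
    T = [T0] * n
    for node, temp in fixed.items():
        T[node] = temp
    for _ in range(iters):
        for i in range(n):
            if i in fixed:
                continue
            g_sum = sum(G[i][j] for j in range(n) if j != i)
            T[i] = (Q[i] + sum(G[i][j] * T[j]
                               for j in range(n) if j != i)) / g_sum
    return T

# Hypothetical nodes: 0 = motor winding (2 W), 1 = housing, 2 = sink.
G = [[0.0, 0.5, 0.0],
     [0.5, 0.0, 1.0],
     [0.0, 1.0, 0.0]]   # W/K
Q = [2.0, 0.0, 0.0]     # W
T = solve_network(G, Q, fixed={2: 300.0})
print(round(T[0], 1), round(T[1], 1))  # → 306.0 302.0
```

The 2 W dissipated at node 0 must cross the 0.5 W/K and 1 W/K links in series, giving temperature rises of 4 K and 2 K above the 300 K sink, which the iteration reproduces.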
Institute of Scientific and Technical Information of China (English)
CAO Jiaokun; DING Shuiting
2012-01-01
This paper presents an application of global sensitivity analysis to the system safety analysis of a reciprocating aircraft engine. Compared with local sensitivity analysis, global sensitivity analysis provides more information on parameter interactions, which are significant in complex system safety analysis. First, a deterministic thermodynamic model of an aviation reciprocating engine is developed and the parameters of interest are defined as random variables. Then, samples of the engine model parameters are generated by the Monte Carlo method on the basis of the defined factor distributions. Finally, results from the engine model are generated and importance indices are calculated. Based on the analysis results, the design is improved to satisfy the airworthiness requirements. The results reveal that global sensitivity analysis can rank the parameters with respect to their importance, including first-order indices and total sensitivity indices. By reducing the uncertainty of parameters and adjusting the range of inputs, the safety criteria can be satisfied.
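The workflow described above (sample the inputs, run the model, compute first-order and total importance indices) can be illustrated with a pure-Python Saltelli-type estimator. The engine thermodynamics model is stood in for by a toy additive function whose exact Sobol indices are known, so the estimates can be checked; this is an assumption for illustration, not the paper's model.

```python
import random

random.seed(42)

def model(x1, x2):
    # Toy stand-in for the engine model: for Y = X1 + 2*X2 with
    # X1, X2 ~ U(0,1), the exact indices are S1 = 0.2 and S2 = 0.8.
    return x1 + 2.0 * x2

N = 20000
A = [(random.random(), random.random()) for _ in range(N)]
B = [(random.random(), random.random()) for _ in range(N)]
fA = [model(*a) for a in A]
fB = [model(*b) for b in B]

mean_y = sum(fA) / N
var_y = sum((y - mean_y) ** 2 for y in fA) / N

def sobol_indices(i):
    """Saltelli estimators: first-order and total index of input i."""
    # Evaluate on matrix A with its i-th column taken from B.
    fABi = [model(*(a[:i] + (b[i],) + a[i + 1:])) for a, b in zip(A, B)]
    first = sum(fb * (fabi - fa)
                for fb, fabi, fa in zip(fB, fABi, fA)) / (N * var_y)
    total = sum((fa - fabi) ** 2
                for fa, fabi in zip(fA, fABi)) / (2 * N * var_y)
    return first, total

s1, st1 = sobol_indices(0)
s2, st2 = sobol_indices(1)
```

Ranking the inputs by these indices is exactly the importance ranking the abstract refers to; for interacting (non-additive) models the total index exceeds the first-order index.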
Tene, Yair; Tene, Noam; Tene, G.
1993-08-01
An interactive data fusion methodology of video, audio, and nonlinear structural dynamic analysis for potential application in forensic engineering is presented. The methodology was developed and successfully demonstrated in the analysis of the collapse of a heavy transportable bridge during preparation for testing. Multiple bridge element failures were identified after the collapse, including fracture, cracking, and rupture of high-performance structural materials. A videotape recording by a hand-held camcorder was the only source of information about the collapse sequence. The interactive data fusion methodology extracted the relevant information from the videotape and from dynamic nonlinear structural analysis, leading to a full account of the sequence of events during the bridge collapse.
Verification Support for Object Database Design
Spelt, David
1999-01-01
In this thesis we have developed a verification theory and a tool for the automated analysis of assertions about object-oriented database schemas. The approach is inspired by the work of [SS89], in which a theorem prover is used to automate the verification of invariants for transactions on a relational database…
Energy Technology Data Exchange (ETDEWEB)
Ullo, J.J.; Hardy, J. Jr.
1977-10-01
Thirty-two U233 and U235 homogeneous aqueous critical experiments were analyzed with ENDF/B-IV data. Calculated eigenvalues for both fuel types increased by nearly 2 percent over the range of hydrogen/uranium atomic ratio covered (from 2106 to 27.1). This is attributed mostly to an underprediction of fast leakage, with some contribution from the fission and capture resonance integrals of ENDF/B-IV U235. Eigenvalue sensitivities to several nuclear data changes were examined. Values of the thermal criticality parameter constraint K2 for U233 and U235 were derived from the Gwin-Magnuson critical experiments at the zero leakage limit.
Directory of Open Access Journals (Sweden)
Lhoucine Boutahar
2016-03-01
Full Text Available Some functionally graded materials contain pores as a result of processing; this influences their elastic and mechanical properties. Therefore, it may be very useful to examine the vibration behavior of thin clamped-clamped functionally graded annular plates (CCFGAP) including porosities. In the present study, the rule of mixture is modified to take the effect of porosity into account and to approximate the material properties, assumed to be graded in the thickness direction of the examined annular plate. A semi-analytical model based on Hamilton’s principle and spectral analysis is adopted, using a homogenization procedure to reduce the problem under consideration to that of an equivalent isotropic homogeneous annular plate. The problem is solved by a numerical iterative method. The effects of porosity, material properties, and elastic foundation characteristics on the CCFGAP axisymmetric large-deflection response are presented and discussed in detail.
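A porosity-modified rule of mixture of the kind mentioned above can be sketched as follows, using the "even porosity" power-law form that is common in the FGM literature. The paper's exact modification and the material values below are assumptions for illustration.

```python
# Porosity-modified power-law rule of mixture (even-porosity form, assumed).
E_CERAMIC, E_METAL = 380.0e9, 70.0e9   # Young's moduli, Pa (alumina / aluminum)
P_INDEX = 2.0                          # power-law volume-fraction index (assumed)
H = 0.01                               # plate thickness, m (assumed)

def youngs_modulus(z, alpha):
    """Effective modulus at through-thickness coordinate z in [-H/2, +H/2];
    alpha is the porosity volume fraction."""
    vc = (z / H + 0.5) ** P_INDEX                    # ceramic volume fraction
    return ((E_CERAMIC - E_METAL) * vc + E_METAL
            - (E_CERAMIC + E_METAL) * alpha / 2.0)   # even-porosity reduction

e_mid_porous = youngs_modulus(0.0, alpha=0.1)   # mid-plane, 10% porosity
e_mid_dense = youngs_modulus(0.0, alpha=0.0)    # mid-plane, no porosity
```

With alpha = 0 the expression reduces to the classical power-law rule of mixture, spanning the metal modulus at the bottom face to the ceramic modulus at the top face; porosity lowers the stiffness uniformly through the thickness.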
Gunawan, Endra; Meilano, Irwan; Hanifa, Nuraini Rahma; Widiyantoro, Sri
2017-10-01
We simulate surface displacements calculated on homogeneous and layered half-space and spherical models as applied to the coseismic and postseismic (afterslip and viscoelastic relaxation) of the 2006 Java tsunami earthquake. Our analysis of coseismic and afterslip deformation suggests that the homogeneous half-space model generates a much broader displacement effect than the layered half-space and spherical models. Also, though the result for surface displacements is similar for the layered half-space and spherical models, noticeable displacements still occurred on top of the coseismic fault patches. Our displacement result in afterslip modeling suggests that significant displacements occurred on top of the main afterslip fault patches, differing from the viscoelastic relaxation model, which has displacements in the front region of coseismic fault patches. We propose this characteristic as one of the important features differentiating a postseismic deformation signal from afterslip and viscoelastic relaxation detected by geodetic data.
Problems and Analysis of Food Packaging QS On-Site Verification
Institute of Scientific and Technical Information of China (English)
Mao Zhifang; Wang Fengling; Xiang Bin
2012-01-01
Food safety issues attract wide public attention because they bear on people's livelihood and social stability, and food packaging safety is an important component of food safety. Based on problems found during on-site QS verification of plastic food packaging, this paper analyzes the causes of those problems in order to improve the quality of verification.
Directory of Open Access Journals (Sweden)
N.C. Lenin
2010-06-01
Full Text Available In this paper, a finite element analysis is carried out on a new stator geometry of a three-phase longitudinal-flux linear switched reluctance motor (LSRM). In the new geometry, pole shoes are affixed to the stator poles. Static and dynamic characteristics of the proposed structure are highlighted, and motor performance under variable load conditions is discussed. The two-dimensional (2-D) finite element analysis (FEA) and the experimental results of this paper prove that LSRMs are one of the strong candidates for linear propulsion drives.
Poelmans, J.; Dedene, G.; Snoeck, M.; Viaene, S.; Fox, R.; Golubski, W.
2010-01-01
One of the first steps in a software engineering process is the elaboration of the conceptual domain model. In this paper, we investigate how Formal Concept Analysis can be used to formally underpin the construction of a conceptual domain model. In particular, we demonstrate that intuitive verification…
Czyżewska, Urszula; Konończuk, Joanna; Teul, Joanna; Drągowski, Paweł; Pawlak-Morka, Renata; Surażyński, Arkadiusz; Miltyk, Wojciech
2015-05-01
Propolis is a resin that is collected by honeybees from various plant sources. Due to its pharmacological properties, it is used in the commercial production of nutritional supplements in the pharmaceutical industry. In this study, gas chromatography-mass spectrometry was applied for quality control analysis of three commercial specimens containing aqueous-alcoholic extracts of bee propolis. More than 230 constituents were detected in the analyzed products, including flavonoids, chalcones, cinnamic acids and their esters, phenylpropenoid glycerides, and phenylpropenoid sesquiterpenoids. An allergenic benzyl cinnamate ester was also identified in all tested samples. This analytical method makes it possible to evaluate the biological activity and potential allergenic components of bee glue simultaneously. Studies on the chemical composition of propolis samples may provide a new approach to quality and safety control in the production of propolis supplementary specimens.
Verification of C. G. Jung's analysis of Rowland Hazard and the history of Alcoholics Anonymous.
Bluhm, Amy Colwell
2006-11-01
Extant historical scholarship in the Jungian literature and the Alcoholics Anonymous (AA) literature does not provide a complete picture of the treatment of Rowland Hazard by C. G. Jung, an analysis that AA co-founder Bill Wilson claimed was integral to the foundation of AA in theory and practice. Wilson's original report resulted in archivists and historians incorrectly calibrating their searches to the wrong date. The current work definitively solves the mystery of the timing of Hazard's treatment with Jung by placing his preliminary analysis with Jung in the year 1926, rather than 1930 or 1931. Previously unexamined correspondence originating from Jung, Hazard, his cousin Leonard Bacon, his uncle Irving Fisher, and his aunt Margaret Hazard Fisher is supplemented by relevant primary and secondary source material.
Institute of Scientific and Technical Information of China (English)
WEI Yan-fang; GUO Si-ling; XUE Yu
2007-01-01
In this article, a traffic hydrodynamic model considering the driver's reaction time was applied to traffic analysis at intersections on real roads. In the numerical simulation with the model, a pinch effect of the right-turning vehicle flow was found, which mainly leads to traffic jams in the straight lane. The results, in accordance with the empirical data, confirm the applicability of this model.
C/SiC Component & Material Analysis, Attachment Verification, & Blisk Turbopump Testing
Effinger, Michael R.; Genge, Gary; Gregg, Wayne; Jordan, William
1999-01-01
Ceramic composite blisk components constructed of carbon fibers and a silicon carbide ceramic matrix have been fabricated and tested. Room-temperature and cryogenic torque testing has verified the attachment configuration. The polar and quasi-isotropic blisks have been proof-spun, balanced, and tested in the Simplex turbopump. A foreign-particle impact analysis was conducted for C/SiC. The data and manufacturing lessons learned are significantly benefiting the development of the Reusable Launch Vehicle's ceramic matrix composite integrally bladed disk.
Verification of radiation heat transfer analysis in KSTAR PFC and vacuum vessel during baking
Energy Technology Data Exchange (ETDEWEB)
Yoo, S.Y. [Chungnam National University, 79 Daehak-ro, Yuseong-gu, Daejeon 34167 (Korea, Republic of); Kim, Y.J., E-mail: k43689@nfri.re.kr [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of); Kim, S.T.; Jung, N.Y.; Im, D.S.; Gong, J.D.; Lee, J.M.; Park, K.R.; Oh, Y.K. [National Fusion Research Institute, 169-148 Gwahang-ro, Yuseong-gu, Daejeon 34133 (Korea, Republic of)
2016-11-01
Highlights: • A thermal network is used to analyze heat transfer from the PFC to the VV. • Three heat transfer rate equations are derived based on the thermal network. • The equations are verified using experimental data and design documents. • Most of the heat lost in the tokamak is transferred to the experimental-room air. • The heat loss to the air is 101 kW of the total heat loss of 154 kW in the tokamak. - Abstract: The KSTAR PFC (Plasma Facing Component) and VV (Vacuum Vessel) did not reach their target temperatures in the bake-out phase, which are 300 °C and 110 °C, respectively. The purpose of this study is to find out why the target temperatures were not reached. A thermal network analysis is used to investigate the radiation heat transfer from the PFC to the VV, and the thermal network is drawn up based on the actual KSTAR tokamak. The analysis model consists of three equations and is solved using EES (Engineering Equation Solver). The heat transfer rates obtained with the analysis model are verified using experimental data from the KSTAR bake-out phase. The analyzed radiation heat transfer rates from the PFC to the VV agree quite well with those of the experiment throughout the bake-out phase. The heat loss from the PFC to the experimental-room air via the flanges of the VV is also calculated and compared, and is found to be the main reason for the gap between the target temperature and the actually attained temperature of the KSTAR PFC.
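The radiation leg of such a thermal network is the standard two-gray-surface exchange relation. The sketch below uses that textbook form with illustrative areas and emissivities; these are assumptions, not KSTAR design values, and the paper's three coupled equations additionally include the conduction paths.

```python
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

# Illustrative surface data (NOT KSTAR design values): surface 1 (PFC)
# enclosed by surface 2 (VV), so the view factor F12 is 1.
A1, A2 = 300.0, 450.0    # radiating areas, m^2 (assumed)
EPS1, EPS2 = 0.4, 0.3    # emissivities (assumed)
F12 = 1.0

def radiation_exchange(t1_c, t2_c):
    """Net radiative heat transfer rate (W) between two gray surfaces,
    from the two-surface thermal network: surface, space, surface resistances
    in series."""
    t1, t2 = t1_c + 273.15, t2_c + 273.15
    resistance = ((1.0 - EPS1) / (EPS1 * A1)
                  + 1.0 / (A1 * F12)
                  + (1.0 - EPS2) / (EPS2 * A2))
    return SIGMA * (t1**4 - t2**4) / resistance

q_bake = radiation_exchange(300.0, 110.0)   # at the bake-out target temperatures
```

Equating this exchange term with the conductive and convective legs at each node gives the kind of heat-rate equation system the abstract says was solved in EES.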
Fault Tree Analysis for Safety/Security Verification in Aviation Software
Directory of Open Access Journals (Sweden)
Andrew J. Kornecki
2013-01-01
Full Text Available The Next Generation Air Traffic Management system (NextGen is a blueprint of the future National Airspace System. Supporting NextGen is a nation-wide Aviation Simulation Network (ASN, which allows integration of a variety of real-time simulations to facilitate development and validation of the NextGen software by simulating a wide range of operational scenarios. The ASN system is an environment including both simulated and human-in-the-loop real-life components (pilots and air traffic controllers. Real Time Distributed Simulation (RTDS, developed at Embry Riddle Aeronautical University, a suite of applications providing low- and medium-fidelity en-route simulation capabilities, is one of the simulations contributing to the ASN. To support the interconnectivity with the ASN, we designed and implemented a dedicated gateway acting as an intermediary, providing logic for two-way communication and message transfer between RTDS and the ASN, as well as storage for the exchanged data. It has been necessary to develop and analyze safety/security requirements for the gateway software based on analysis of system assets, hazards, threats and attacks related to the ultimate real-life future implementation. Due to the nature of the system, the focus was placed on communication security and the related safety of the impacted aircraft in the simulation scenario. To support development of the safety/security requirements, the well-established fault tree analysis technique was used. This fault tree model-based analysis, supported by a commercial tool, was a foundation for proposing mitigations assuring the gateway system's safety and security.
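Quantitatively, a fault tree reduces to combining basic-event probabilities through AND and OR gates. The sketch below shows that arithmetic on a hypothetical two-gate tree; the event names and probabilities are invented for illustration and are not the paper's actual gateway tree.

```python
# Minimal fault-tree evaluation: independent basic events through AND/OR gates.
def or_gate(*probs):
    """P(at least one event occurs), assuming independent inputs."""
    p = 1.0
    for q in probs:
        p *= (1.0 - q)
    return 1.0 - p

def and_gate(*probs):
    """P(all events occur), assuming independent inputs."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Hypothetical basic events (illustrative probabilities, not from the paper)
p_spoofed_msg = 1e-3    # forged message reaches the gateway
p_auth_bypass = 1e-4    # authentication check fails
p_store_fault = 5e-4    # exchanged-data store corrupted

# Top event: unsafe data enters the simulation, via intrusion OR storage fault
p_intrusion = and_gate(p_spoofed_msg, p_auth_bypass)
p_top = or_gate(p_intrusion, p_store_fault)
```

Mitigations are then targeted at whichever minimal cut set dominates the top-event probability (here, the storage fault rather than the two-event intrusion path).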
FEM modeling for 3D dynamic analysis of deep-ocean mining pipeline and its experimental verification
Institute of Scientific and Technical Information of China (English)
[No author listed]
2007-01-01
3D dynamic analysis models of a 1000 m deep-ocean mining pipeline, including the steel lift pipe, pump, buffer, and flexible hose, were established by the finite element method (FEM). The coupling effect of the steel lift pipe and flexible hose and the main external loads on the pipeline were considered in the models, such as gravity, buoyancy, hydrodynamic forces, internal and external fluid pressures, concentrated suspension buoyancy on the flexible hose, and the torsional moment and axial force induced by pump operation. Relevant FEM models and solution techniques were developed for the various 3D transient behaviors of the integrated deep-ocean mining pipeline, including the towing motions of track-keeping operation and the launch process of the pipeline. Meanwhile, an experimental verification system in a towing water tank, with characteristics similar to those of the designed mining pipeline, was developed to verify the accuracy of the FEM models and dynamic simulation. The experiments show that the measured and simulated pipe stresses coincide. Further simulations of the 1000 m deep-ocean mining pipeline show that, to form a saddle-shaped configuration, the total concentrated suspension buoyancy on the flexible hose should be 95%-105% of the gravity of the flexible hose in water, with the first suspension point carrying 1/3 of the total buoyancy and the second suspension point carrying 2/3. When the towing velocity of the mining system is less than 0.5 m/s, the towing track of the buffer coincides on the whole with the set route of the ship, and the configuration of the flexible hose is also kept well.
Verification of a three-dimensional viscous flow analysis for a single stage compressor
Matsuoka, Akinori; Hashimoto, Keisuke; Nozaki, Osamu; Kikuchi, Kazuo; Fukuda, Masahiro; Tamura, Atsuhiro
1992-12-01
A transonic flowfield around the rotor blades of a highly loaded single-stage axial compressor was numerically analyzed by a three-dimensional compressible Navier-Stokes code using a Chakravarthy-Osher-type total variation diminishing (TVD) scheme. A stage analysis, which calculates the flowfields around the inlet guide vane (IGV) and the rotor blades simultaneously, was carried out. Compared with design values and experimental data, the computed results show slight quantitative differences, but the numerical calculation simulates well the pressure-rise characteristics of the compressor and its flow pattern, including a strong shock surface.
Directory of Open Access Journals (Sweden)
Abdollah YOUSEFI
2015-01-01
Full Text Available The hip joint is one of the most stable joints in the human body. It has intrinsic stability provided by its relatively rigid ball-and-socket configuration. The hip joint also has a wide range of motion, which allows normal locomotion and daily activities. The location of the hip joint center (HJC) is an important parameter in gait analysis and in biomechanical and clinical research laboratories for calculating human lower-extremity kinematics and kinetics. Inaccuracies in the estimation of the hip joint center are shown to propagate errors into the kinematic and kinetic calculations of the lower extremities.
Soeiro, Bruno T; Boen, Thaís R; Wagner, Roger; Lima-Pallone, Juliana A
2009-01-01
The aim of the present work was to determine parameters of the corn and wheat flour matrix, such as protein, lipid, moisture, ash, and carbohydrate content, together with folic acid and iron contents. Three principal components explained 91% of the total variance. Wheat flours were characterized by high protein and moisture content; the corn flours, on the other hand, had higher carbohydrate, lipid, and folic acid levels. The concentrations of folic acid were lower than the declared value for wheat flours, whereas corn flours presented extremely high values. The iron concentration was higher than that recommended in Brazilian legislation. Poor homogenization of folic acid and iron was observed in the enriched flours. This study could be useful to help governmental authorities in the evaluation of enriched-food programs.
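The "explained variance" figure quoted above comes from the eigenvalues of the covariance matrix of the measured variables. The sketch below shows that computation for a two-variable toy dataset (the study used several compositional variables; the numbers here are invented), where the 2x2 eigenvalues have a closed form.

```python
import math

# Illustrative flour measurements (assumed values, g/100 g), strongly correlated
# so that the first principal component captures most of the variance.
protein = [10.2, 11.5, 9.8, 12.0, 10.9, 11.2]
moisture = [12.1, 13.0, 11.8, 13.4, 12.6, 12.9]

def covariance(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return sum((a - mx) * (b - my) for a, b in zip(x, y)) / (n - 1)

a = covariance(protein, protein)     # var(protein)
b = covariance(protein, moisture)    # cov(protein, moisture)
c = covariance(moisture, moisture)   # var(moisture)

# Eigenvalues of the 2x2 covariance matrix in closed form
d = math.sqrt(((a - c) / 2.0) ** 2 + b ** 2)
lam1, lam2 = (a + c) / 2.0 + d, (a + c) / 2.0 - d

explained_pc1 = lam1 / (lam1 + lam2)   # fraction of variance on PC1
```

With more variables the same ratio is computed from the leading eigenvalues of the full covariance (or correlation) matrix; "three components explain 91%" means the top three eigenvalues sum to 91% of the trace.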
Song, Yi
2011-01-01
Cognitive radio (CR) technology is regarded as a promising solution to the spectrum scarcity problem. Due to the spectrum-varying nature of CR networks, unlicensed users are required to perform spectrum handoffs when licensed users reuse the spectrum. In this paper, we study the performance of the spectrum handoff process in a CR ad hoc network under homogeneous primary traffic. We propose a novel three-dimensional discrete-time Markov chain to characterize the process of spectrum handoffs and analyze the performance of unlicensed users. Since a dedicated common control channel is not practical in real CR networks, our model implements a network coordination scheme in which no dedicated common control channel is needed. Moreover, in wireless communications, collisions among simultaneous transmissions cannot be immediately detected and whole collided packets need to be retransmitted, which greatly affects network performance. With this observation, we also consider the retransmissions of the collided…
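Performance metrics from a discrete-time Markov chain model of this kind are read off its stationary distribution. The sketch below computes one by power iteration for a 3-state toy chain standing in for the paper's three-dimensional chain; the states and transition probabilities are assumptions for illustration.

```python
# Toy discrete-time Markov chain: states (idle, transmitting, handoff),
# with assumed transition probabilities (rows sum to 1).
P = [
    [0.7, 0.2, 0.1],   # from idle
    [0.3, 0.5, 0.2],   # from transmitting
    [0.4, 0.1, 0.5],   # from handoff (back off, then retry)
]

def stationary(P, iters=500):
    """Power iteration: repeatedly apply the transition matrix to a uniform
    initial distribution until it converges to the stationary distribution."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

pi = stationary(P)   # long-run fraction of slots spent in each state
```

The long-run handoff probability, for instance, is the stationary mass on the handoff state; the paper's 3-D chain does the same with a much larger state space indexing channel, queue, and handoff status jointly.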
Verification of the optimum tropospheric parameters setting for the kinematic PPP analysis
Hirata, Y.; Ohta, Y.
2015-12-01
Kinematic GNSS analysis is useful for extracting crustal deformation phenomena on time scales between seconds and one day, such as coseismic and postseismic deformation after a large earthquake. Kinematic GNSS analysis, however, has a fundamental difficulty in separating unknown parameters such as the site coordinates and tropospheric parameters, which are strongly correlated with each other. We therefore focused on improving the separation between the coordinate time series of kinematic PPP and the wet zenith tropospheric delay (WZTD) through a comprehensive search of the parameter space. We used the GIPSY-OASIS II Ver. 6.3 software for kinematic PPP processing of all GEONET sites on 10 March 2011. We applied 6-hourly nominal WZTD values as a priori information, based on the ECMWF global numerical climate model. For the coordinate time series and tropospheric parameters, we assumed white noise and random walk stochastic processes, respectively. These unknown parameters are very sensitive to the process noise assumed for each stochastic process. We therefore searched for the optimum values of two variable parameters, the wet zenith tropospheric parameter (named TROP) and its gradient (named GRAD), defining the optimum parameters as those that minimized the standard deviation of the coordinate time series. We first checked the spatial distribution of the optimum pairs of TROP and GRAD. Even though the optimum parameters fell within a certain range (TROP: 2×10-8 ~ 6×10-7 (horizontal), 5.5×10-9 ~ 2×10-8 (vertical); GRAD: 2×10-10 ~ 6×10-9 (horizontal), 2×10-10 ~ 1×10-8 (vertical) (unit: km·s-½)), they showed large diversity, suggesting strong heterogeneity of the atmospheric state. We also estimated temporal variations of the optimum TROP and GRAD at a specific site, analyzing data through 2010 at GEONET station 940098, located in the southernmost part of Kyushu, Japan. The obtained time series of optimum GRAD showed a clear annual variation, and the…
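The search strategy described above is a two-parameter grid search minimizing a scatter metric. The sketch below mimics that loop with a synthetic objective; in the real workflow each evaluation is a full kinematic PPP run whose coordinate time series scatter is measured, so the objective function here is purely an assumed stand-in.

```python
import math

def coord_scatter(trop, grad):
    """Synthetic stand-in for 'standard deviation of the coordinate time
    series after a PPP run with this process-noise pair'; it has a known
    minimum at (1e-8, 1e-9) by construction."""
    return (math.log10(trop) + 8.0) ** 2 + (math.log10(grad) + 9.0) ** 2

# Logarithmic grids over plausible process-noise magnitudes (assumed ranges)
trop_grid = [10.0 ** e for e in range(-10, -5)]   # 1e-10 ... 1e-6
grad_grid = [10.0 ** e for e in range(-11, -6)]   # 1e-11 ... 1e-7

# Exhaustive search: keep the (TROP, GRAD) pair minimizing the scatter
best = min(
    ((trop, grad) for trop in trop_grid for grad in grad_grid),
    key=lambda pair: coord_scatter(*pair),
)
```

Repeating this per station and mapping the winning pairs gives the kind of spatial distribution of optimum TROP and GRAD the abstract discusses.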
Monazite RW-1: a homogenous natural reference material for SIMS U-Pb and Th-Pb isotopic analysis
Ling, Xiao-Xiao; Huyskens, Magdalena H.; Li, Qiu-Li; Yin, Qin-Zhu; Werner, Ronald; Liu, Yu; Tang, Guo-Qiang; Yang, Ya-Nan; Li, Xian-Hua
2016-10-01
Well-characterized, matrix-matched natural mineral reference materials of known age are an important prerequisite for SIMS (secondary ion mass spectrometry) U-Th-Pb dating. We have characterized RW-1, a 44 g yellowish-brown single monazite specimen from a Norwegian pegmatite, as an excellent high-Th reference material for SIMS U-Th-Pb dating. A total of 206 SIMS analyses over six analytical sessions were performed on different monazite fragments of RW-1. The analyses yielded 207Pb-based common-lead-corrected 206Pb/238U ages and Th-Pb ages with overall 2% (2 SD = standard deviation) variations, indicating good homogeneity of the U-Th-Pb system. The homogeneity of the Th content of 11.8 ± 1.0 wt% (2 SD) and Th/U of 42 ± 3 (2 SD) makes this crystal a good compositional reference material as well. We used the combined ID-TIMS(Pb)/ID-MC-ICP-MS(U) technique (i.e., isotope dilution thermal ionization mass spectrometry for Pb, and isotope dilution multi-collector inductively coupled plasma mass spectrometry for U) to determine U-Pb ages of the monazite samples studied. The mean 207Pb/235U age of 904.15 ± 0.26 Ma (95% confidence level) is recommended as the best estimate of the crystallization age of RW-1 monazite. Considering that the most commonly distributed U-Pb monazite reference materials have rather low ThO2, we suggest that RW-1 monazite, with its ThO2 of 13.5 wt%, is a suitable reference material, giving investigators more confidence when dating high-Th monazite unknowns.
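A recommended mean age like the one above is conventionally an inverse-variance weighted mean of independent determinations, with an MSWD to check that the scatter is consistent with the assigned errors. The sketch below shows that arithmetic on invented values, not the actual RW-1 ID-TIMS data.

```python
import math

# Illustrative independent age determinations (assumed, Ma) with 1-sigma errors
ages = [904.1, 904.3, 904.0, 904.2]
sigma = [0.4, 0.5, 0.3, 0.4]

weights = [1.0 / s**2 for s in sigma]          # inverse-variance weights
wmean = sum(w * a for w, a in zip(weights, ages)) / sum(weights)
wmean_sigma = math.sqrt(1.0 / sum(weights))    # 1-sigma of the weighted mean

# MSWD (reduced chi-square): ~1 means scatter matches the assigned errors
mswd = sum(w * (a - wmean) ** 2
           for w, a in zip(weights, ages)) / (len(ages) - 1)
```

An MSWD well above 1 would signal excess geological or analytical scatter, in which case the quoted uncertainty is usually expanded accordingly.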
Verification analysis of thermoluminescent albedo neutron dosimetry at MOX fuel facilities.
Nakagawa, Takahiro; Takada, Chie; Tsujimura, Norio
2011-07-01
Radiation workers engaged in the fabrication of MOX fuels at the Japan Atomic Energy Agency Nuclear Fuel Cycle Engineering Laboratories are exposed to neutrons, and thermoluminescent albedo dosemeters (TLADs) are accordingly used for individual neutron dosimetry. Because dose estimation using TLADs is susceptible to variation of the neutron energy spectrum, the authors have provided TLADs incorporating solid-state nuclear track detectors (SSNTDs) to selected workers who are routinely exposed to neutrons and have continued analysis of the relationship between the SSNTD and TLAD readings (T/R(f)) over the 6 y from 2004 to 2009. The T/R(f) value in each year was lower than the values recorded during 1991-1993, although the neutron spectra had not changed since then. This decrease of T/R(f) implies that the ratio of operation time near gloveboxes to total work time has decreased.
Logic analysis and verification of n-input genetic logic circuits
DEFF Research Database (Denmark)
Baig, Hasan; Madsen, Jan
2017-01-01
Nature uses genetic logic circuits to regulate the fundamental processes of life. These genetic logic circuits are triggered by a combination of external signals, such as chemicals, proteins, light, and temperature, to emit signals that control other gene expressions or metabolic pathways accordingly. As compared to electronic circuits, genetic circuits exhibit stochastic behavior and do not always behave as intended. Therefore, there is a growing interest in being able to analyze and verify the logical behavior of a genetic circuit model prior to its physical implementation in a laboratory. In this paper, we present an approach to analyze and verify the Boolean logic of a genetic circuit from the data obtained through stochastic analog circuit simulations. The usefulness of this analysis is demonstrated through different case studies illustrating how our approach can be used to verify the expected…
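The essential step in going from analog simulation data to Boolean verification is thresholding the simulated expression levels and comparing them against the intended truth table. The sketch below illustrates this for a hypothetical two-input genetic AND gate; the signal values and threshold are assumptions, and the paper's actual pipeline works on full stochastic trajectories rather than single steady-state levels.

```python
THRESHOLD = 0.5   # normalized expression level separating logic 0 from 1 (assumed)

# (input_a, input_b) -> steady-state output level from an analog simulation
# (illustrative values for a hypothetical genetic AND gate)
simulated = {
    (0, 0): 0.04,
    (0, 1): 0.11,
    (1, 0): 0.08,
    (1, 1): 0.93,
}

expected_and = {(a, b): a & b for a in (0, 1) for b in (0, 1)}

def verify(traces, truth_table, threshold=THRESHOLD):
    """Digitize each analog output and return the input combinations whose
    digitized value disagrees with the truth table (empty list = verified)."""
    return [inputs for inputs, level in traces.items()
            if (1 if level >= threshold else 0) != truth_table[inputs]]

mismatches = verify(simulated, expected_and)   # -> []
```

Because genetic circuits are stochastic, a practical verifier repeats this check across many simulation runs and reports how often the circuit realizes the intended logic rather than a single pass/fail.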
Implementation and verification of a comprehensive helicopter coupled rotor - Fuselage analysis
Wood, E. R.; Banerjee, D.; Shamie, J.; Straub, F.; Dinyavari, M. A. H.
1985-01-01
The analytical basis and the application of a Rotor/Airframe Comprehensive Aeroelastic Program (RACAP) are described in detail. The rationale behind each analytical choice is outlined and the modular procedure is described. The program is verified by application to the AH-1G helicopter. The applicability of various airload prediction models is examined, and both the steady and vibratory responses of the blade are compared with flight test data. Reasonable correlation is found between measured and calculated blade response, with excellent correlation for vibration amplitudes at various locations on the fuselage such as engine, pilot seat, and gunner. Within the analytical model, comparisons are drawn between an isolated blade analysis and a coupled rotor/fuselage model. The deficiency of the former in the context of the AH-1G is highlighted.
Control analysis and experimental verification of a practical dc–dc boost converter
Directory of Open Access Journals (Sweden)
Saswati Swapna Dash
2015-12-01
Full Text Available This paper presents detailed open-loop and closed-loop analyses of a boost dc–dc converter for both voltage-mode and current-mode control. Here the boost dc–dc converter is a practical converter considering all possible parasitic elements, such as ESR and on-state voltage drops. Open-loop control, closed-loop current-mode control, and closed-loop voltage-mode control are verified, and a comparative study of all control techniques is presented. The PI compensator for closed-loop current-mode control is designed using classical techniques such as the root locus and the Bode diagram. The simulation results are validated against experimental results of voltage-mode control for both open-loop and closed-loop operation.
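The effect of the parasitics mentioned above shows up already in the steady-state conversion ratio. The sketch below contrasts the ideal CCM boost relation Vo = Vin/(1-D) with the standard averaged-model correction for inductor series resistance; the component values are illustrative, not the paper's hardware.

```python
# Steady-state output voltage of a boost converter in continuous conduction
# mode (CCM), with an optional inductor series resistance r_l (averaged-model
# result; component values are illustrative assumptions).
def boost_vout(vin, d, r_l=0.0, r_load=float("inf")):
    """vin: input voltage (V), d: duty cycle (0 <= d < 1),
    r_l: inductor ESR (ohm), r_load: load resistance (ohm)."""
    ideal = vin / (1.0 - d)                                  # lossless ratio
    loss = 1.0 / (1.0 + r_l / (((1.0 - d) ** 2) * r_load))   # ESR correction
    return ideal * loss

v_ideal = boost_vout(12.0, 0.5)                        # -> 24.0 V
v_real = boost_vout(12.0, 0.5, r_l=0.2, r_load=10.0)   # lower than ideal
```

The ESR term also caps the achievable gain at high duty cycles, which is one reason the closed-loop compensator design must use the practical, parasitic-aware model rather than the ideal one.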
DEFF Research Database (Denmark)
Riisgaard, Benjamin; Georgakis, Christos; Stang, Henrik
2007-01-01
Compact Reinforced Composite (CRC) is a high-strength cement-based composite that holds an enormous flexural and energy-absorbing capacity due to its closely spaced high-strength steel reinforcement and its high-strength cement-based fiber DSP matrix. The material has been used in various constructions without breaching. This paper introduces an efficient method for implementing high fractions of polymer shock reinforcement in a CRC element. Experimental tests and explicit finite element analysis are used to demonstrate the potential of this material. The paper also provides the reader with the information and data needed to formulate a simple material model for high-strength fiber-reinforced concrete suitable for predicting the responses of polymer-reinforced CRC under close-in detonations, using the general-purpose transient dynamic finite element program LS-DYNA.
Energy Technology Data Exchange (ETDEWEB)
Nagle, J.; Whitfield, R.
1983-05-01
This report was developed as a management tool for use by the Federal Emergency Management Agency (FEMA) Region II staff. The analysis summarized in this report was undertaken to verify the extent to which procedures, training programs, and resources set forth in the County Radiological Emergency Response Plans (CRERPs) for Orange, Putnam, and Westchester counties in New York had been realized prior to the March 9, 1983, exercise of the Indian Point Nuclear Power Station near Buchanan, New York. To this end, a telephone survey of county emergency response organizations was conducted between January 19 and February 22, 1983. This report presents the results of responses obtained from this survey of county emergency response organizations.
Energy Technology Data Exchange (ETDEWEB)
Garcia, A.; Minning, C.
1981-11-01
The construction of optical and electrical verification test coupons of encapsulated solar cells and potted copper is detailed. Testing of these coupons was completed and the results are presented. Additionally, a thermal simulation of roof mounted array conditions was done and the results documented.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain Energy Star lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to the consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding Energy Star specifications. This report includes the experimental procedure and data results of Cycle Three and Cycle Four of the PEARL program during the period of April 2003 to October 2003, along with the description of apparatus used, equipment calibration process, experimental methodology, and research findings from the testing. The parameter tested for Cycle Three is lumen maintenance at 40% rated life, and the parameters tested for Cycle Four are all parameters required in the Energy Star specifications except lumen maintenance at 40% rated life.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain Energy Star lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to the consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding Energy Star specifications. This report includes the experimental procedure and data results of Cycle Five and Cycle Six of PEARL program during the period of April 2004 to October 2004, along with the description of apparatus used, equipment calibration process, experimental methodology, and research findings from the testing. The parameter tested for Cycle Five is lumen maintenance at 40% rated life, and parameters tested for Cycle Six are Efficacy, CCT, CRI, Power Factor, Start Time, Warm-up Time, and Rapid Cycle Stress Test for CFLs.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain ENERGY STAR lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding ENERGY STAR specifications. This report includes the experimental procedure and data results of Cycle Six of the PEARL program during the period of October 2004 to April 2005, along with a description of the apparatus used, the equipment calibration process, the experimental methodology, and the research findings from the testing. The parameters tested for CFL models in Cycle Six are 1000-hour Lumen Maintenance, Lumen Maintenance at 40% Rated Life, and the Interim Life Test, along with a series of parameters verified, such as ballast electrical parameters and the ENERGY STAR label.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain ENERGY STAR lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding ENERGY STAR specifications. This report includes the experimental procedure and data results of Cycle 6 and the Reflector CFL In-situ Testing of the PEARL program during the period of April 2005 to October 2005, along with a description of the apparatus used, the equipment calibration process, the experimental methodology, and the research findings from the testing. LRC performed testing of the fixture samples in Cycle 6 against the ENERGY STAR residential fixture specifications during this period. LRC subcontracted the Reflector CFL In-situ Testing to Luminaire Testing Laboratories, located in Allentown, PA, and supervised this test.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-05-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain ENERGY STAR lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding ENERGY STAR specifications. This report includes the experimental procedure of Cycle 7 of the PEARL program during the period of October 2005 to March 2006, along with a description of the apparatus used, the equipment calibration process, the experimental methodology, and the research findings from the testing. LRC administered the purchasing of CFL samples to test in Cycle 7, performed 100-hour seasoning for most of the CFL samples received by March 2006, and performed sphere testing for some of the CFL samples at 100 hours of life (initial measurement).
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain ENERGY STAR lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding ENERGY STAR specifications. This report includes the experimental procedure and data results of Cycles Four and Five of the PEARL program during the period of October 2003 to April 2004, along with a description of the apparatus used, the equipment calibration process, the experimental methodology, and the research findings from the testing. The parameter tested for Cycle Four is lumen maintenance at 40% rated life, and the parameters tested for Cycle Five are all parameters required in the ENERGY STAR specifications except lumen maintenance at 40% rated life.
Energy Technology Data Exchange (ETDEWEB)
Conan O'Rourke; Yutao Zhou
2006-03-01
The Program for the Evaluation and Analysis of Residential Lighting (PEARL) is a watchdog program. It was created in response to complaints received by utility program managers about the performance of certain ENERGY STAR lighting products being promoted within their service territories and the lack of a self-policing mechanism within the lighting industry that would ensure the reliability of these products and their compliance with ENERGY STAR specifications. To remedy these problems, PEARL purchases and tests products that are available to consumers in the marketplace. The Lighting Research Center (LRC) tests the selected products against the corresponding ENERGY STAR specifications. This report includes the experimental procedure and data results of Cycle Three of the PEARL program during the period of October 2002 to April 2003, along with a description of the apparatus used, the equipment calibration process, the experimental methodology, and the research findings from the testing. The products tested are 20 models of screw-based compact fluorescent lamps (CFLs) of various types and wattages made or marketed by 12 different manufacturers, and ten models of residential lighting fixtures from eight different manufacturers.
Verification of the helioseismic Fourier-Legendre analysis for meridional flow measurements
Roth, Markus; Hartlep, Thomas
2016-01-01
Measuring the Sun's internal meridional flow is one of the key issues of helioseismology, and Fourier-Legendre analysis is one technique for addressing this problem. We validate this technique with the help of artificial helioseismic data. The analysed data set was obtained by numerically simulating the effect of the meridional flow on the seismic wave field in the full volume of the Sun; in this way, a 51.2-hour-long time series was generated. The resulting surface velocity field is then analysed in various settings: two $360^\circ \times 90^\circ$ half-spheres, two $120^\circ \times 60^\circ$ patches on the front and far side of the Sun (North and South, respectively), and two $120^\circ \times 60^\circ$ patches on the northern and southern front side only. We compare two possible measurement setups: observations from Earth and from an additional spacecraft on the solar far side, and observations from Earth only, in which case the full information of the global solar oscillation wave field was available. We ...
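The core of a Fourier-Legendre decomposition is projecting the surface wave field onto Legendre polynomials. The toy sketch below is not the authors' pipeline; the synthetic field, quadrature order, and degree cutoff are invented for illustration. It recovers the coefficients of a field built from known Legendre components via Gauss-Legendre quadrature:

```python
import numpy as np

def legendre_coeffs(f_vals, x, w, lmax):
    """Project sampled values f_vals (at Gauss-Legendre nodes x with
    weights w, x = cos(theta) on [-1, 1]) onto Legendre polynomials
    P_0..P_lmax, using orthogonality: a_l = (2l+1)/2 * integral(f * P_l)."""
    coeffs = []
    for l in range(lmax + 1):
        P = np.polynomial.legendre.Legendre.basis(l)(x)
        coeffs.append((2 * l + 1) / 2.0 * np.sum(w * f_vals * P))
    return np.array(coeffs)

# Synthetic field with known content: 0.5*P_2 + 1.2*P_5
x, w = np.polynomial.legendre.leggauss(64)
f = (0.5 * np.polynomial.legendre.Legendre.basis(2)(x)
     + 1.2 * np.polynomial.legendre.Legendre.basis(5)(x))
c = legendre_coeffs(f, x, w, 8)
```

With 64 quadrature nodes the projection is exact for these low-degree components, so `c[2]` and `c[5]` recover the input amplitudes and all other coefficients vanish to machine precision.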
Automated Verification of Virtualized Infrastructures
DEFF Research Database (Denmark)
Bleikertz, Sören; Gross, Thomas; Mödersheim, Sebastian Alexander
2011-01-01
Virtualized infrastructures and clouds present new challenges for security analysis and formal verification: they are complex environments that continuously change their shape, and that give rise to non-trivial security goals such as isolation and failure resilience requirements. We present a pla...
Synthetic spider silk sustainability verification by techno-economic and life cycle analysis
Edlund, Alan
Major ampullate spider silk represents a promising biomaterial with diverse commercial potential, ranging from textiles to medical devices, due to the excellent physical and thermal properties arising from the protein structure. Recent advancements in synthetic biology have facilitated the development of recombinant spider silk proteins from Escherichia coli (E. coli), alfalfa, and goats. This study specifically investigates the economic feasibility and environmental impact of synthetic spider silk manufacturing. Pilot-scale data were used to validate an engineering process model that includes all of the required sub-processing steps for synthetic fiber manufacture: production, harvesting, purification, drying, and spinning. Modeling was constructed modularly to support assessment of alternative protein production methods (alfalfa and goats) as well as alternative down-stream processing technologies. The techno-economic analysis indicates minimum sale prices from pioneer and optimized E. coli plants of 761 kg⁻¹ and 23 kg⁻¹, with greenhouse gas emissions of 572 kg CO₂-eq. kg⁻¹ and 55 kg CO₂-eq. kg⁻¹, respectively. Spider silk sale price estimates from goat pioneer and optimized plants are 730 kg⁻¹ and 54 kg⁻¹, respectively, while pioneer and optimized alfalfa plants yield 207 kg⁻¹ and 9.22 kg⁻¹, respectively. Elevated costs and emissions from the pioneer plant can be directly tied to the high material consumption and low protein yield. Decreased production costs associated with the optimized plants reflect improved protein yield, process optimization, and an Nth-plant assumption. Discussion focuses on the commercial potential of spider silk, the production performance requirements for commercialization, and the impact of alternative technologies on the sustainability of the system.
Experimental Verification and Analysis of Solar Parabolic Collector for Water Distillation
Directory of Open Access Journals (Sweden)
Mr. Mohd. Rizwan
2014-01-01
Full Text Available The paper is concerned with an experimental study of a parabolic trough collector with its sun-tracking system, designed and manufactured to facilitate rapid diffusion and widespread use of solar energy. The paper focuses on the use of an alternative source of energy (the sun's radiation) which is easy to install, operate, and maintain. Also, to improve the performance of the solar concentrator, different geometries were evaluated with respect to their optical and energy conversion efficiency. To assure good performance and a long technical lifetime of a concentrating system, the solar reflectance of the reflectors must be high and stable over the long term. During the research carried out, focus shifted from evaluation of the performance of the concentrating solar collector to analysis of the optical properties of the reflector and absorbing materials. The shift of focus was motivated by the need to assess long-term system performance and possibilities of optimizing the optical efficiency or reducing costs by using new types of reflector and absorbing materials. The Solar Parabolic Trough Collector (SPTC) was fabricated in local workshops, and the sun-tracking system was assembled using electric and electronic components from the market, while the mechanical components making up the driving system were procured from the local market. The objective of the research is to obtain distilled water by heating it to a higher temperature with the solar parabolic trough collector. Solar distillation is used to produce potable water, water for lead-acid batteries, or water for chemical laboratories, as in this case. The level of dissolved solids in solar-distilled water is less than 3 ppm, and the water is bacteria-free. The requirements for this specific design are a target for distilling water regularly with low maintenance.
Lifting locally homogeneous geometric structures
McKay, Benjamin
2011-01-01
We prove that under some purely algebraic conditions every locally homogeneous structure modelled on some homogeneous space is induced by a locally homogeneous structure modelled on a different homogeneous space.
Functionality and homogeneity.
2011-01-01
Functionality and homogeneity are two of the five Sustainable Safety principles. The functionality principle aims for roads to have but one exclusive function and distinguishes between traffic function (flow) and access function (residence). The homogeneity principle aims at differences in mass, spe
Directory of Open Access Journals (Sweden)
Lenka Kovářová
2012-09-01
Full Text Available BACKGROUND: The triathlon is a combination of three different types of sport: swimming, cycling, and running. Each of these requires different top-level predispositions, and a complex approach to talent selection is a rather difficult process. Attempts to identify assumptions in the triathlon have so far been specific and focused only on some groups of predispositions (physiology, motor tests, and psychology). The latest studies missed the structural approach and were based on determinants of sport performance, theory of sports training, and expert assessment. OBJECTIVE: The aim of our study was to verify the model of predispositions in the short triathlon for talent assessment of young male athletes aged 17–20 years. METHODS: The research sample consisted of 55 top-level triathletes (men) who were included in the government-supported sports talent programme in the Czech Republic at the age of 17–20 years. We used confirmatory factor analysis (FA) and a path diagram to verify the model, which allows us to explain mutual relationships among observed variables. For statistical data processing we used structural equation modeling (SEM) with the software Lisrel L88. RESULTS: The study confirms the best structural model for talent selection in triathlon for men aged 17–20 years, which comprised seventeen indicators (tests) and explained 91% of all cross-correlations (Goodness of Fit Index /GFI/ 0.91, Root Mean Square Residual /RMSR/ 0.13). Tests for predispositions in triathlons were grouped into five items: three motor predispositions (swimming, cycling, and running skills), aerobic predispositions, and psychological predispositions. Aerobic predispositions showed the highest importance to the assumptions to the general factor (1.00; 0). Running predispositions were measured as a very significant factor (–0.85; 0.28), which confirms the importance of this critical stage of the race. Lower factor weights were shown by the clusters of swimming (–0.61; 0.63) and cycling (0.53; 0
Hellmich, Christian; Fritsch, Andreas; Dormieux, Luc
Biomimetics deals with the application of nature-made "design solutions" to the realm of engineering. In the quest to understand mechanical implications of structural hierarchies found in biological materials, multiscale mechanics may hold the key to understand "building plans" inherent to entire material classes, here bone and bone replacement materials. Analyzing a multitude of biophysical hierarchical and biomechanical experiments through homogenization theories for upscaling stiffness and strength properties reveals the following design principles: The elementary component "collagen" induces, right at the nanolevel, the mechanical anisotropy of bone materials, which is amplified by fibrillar collagen-based structures at the 100-nm scale, and by pores in the micrometer-to-millimeter regime. Hydroxyapatite minerals are poorly organized, and provide stiffness and strength in a quasi-brittle manner. Water layers between hydroxyapatite crystals govern the inelastic behavior of the nanocomposite, unless the "collagen reinforcement" breaks. Bone replacement materials should mimic these "microstructural mechanics" features as closely as possible if an imitation of the natural form of bone is desired (Gebeshuber et al., Adv Mater Res 74:265-268, 2009).
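Homogenization-based upscaling of stiffness, as used in the multiscale analyses described above, can be illustrated with the classical Voigt and Reuss bounds, which bracket the effective stiffness of a multiphase composite. The phase moduli and volume fractions below are hypothetical placeholders for a collagen/mineral/water mix, not values from the cited studies:

```python
def voigt_bound(moduli, fractions):
    """Upper (Voigt) bound: volume-weighted arithmetic mean of phase stiffnesses."""
    return sum(f * m for m, f in zip(moduli, fractions))

def reuss_bound(moduli, fractions):
    """Lower (Reuss) bound: volume-weighted harmonic mean of phase stiffnesses."""
    return 1.0 / sum(f / m for m, f in zip(moduli, fractions))

# Hypothetical phase stiffnesses (GPa) and volume fractions:
# collagen, hydroxyapatite, water -- illustrative numbers only.
moduli = [5.0, 110.0, 2.3]
fractions = [0.45, 0.45, 0.10]

upper = voigt_bound(moduli, fractions)
lower = reuss_bound(moduli, fractions)
```

Any admissible homogenized stiffness estimate for this composite must lie between `lower` and `upper`; tighter schemes (e.g. self-consistent or Mori-Tanaka) refine this bracket.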
Nørskov-Lauritsen, Lenea; Thomsen, Alex Rojas Bie; Bräuner-Osborne, Hans
2014-02-13
Studying multidimensional signaling of G protein-coupled receptors (GPCRs) in search of new and better treatments requires flexible, reliable and sensitive assays in high throughput screening (HTS) formats. Today, more than half of the detection techniques used in HTS are based on fluorescence, because of the high sensitivity and rich signal, but quenching, optical interferences and light scattering are serious drawbacks. In the 1990s the HTRF® (Cisbio Bioassays, Codolet, France) technology based on Förster resonance energy transfer (FRET) in a time-resolved homogeneous format was developed. This improved technology diminished the traditional drawbacks. The optimized protocol described here based on HTRF® technology was used to study the activation and signaling pathways of the calcium-sensing receptor, CaSR, a GPCR responsible for maintaining calcium homeostasis. Stimulation of the CaSR by agonists activated several pathways, which were detected by measuring accumulation of the second messengers D-myo-inositol 1-phosphate (IP1) and cyclic adenosine 3',5'-monophosphate (cAMP), and by measuring the phosphorylation of extracellular signal-regulated kinase 1 and 2 (ERK1/2). Here we show how an optimized HTRF® platform with numerous advantages compared to previous assays provides a substantial and robust mode of investigating GPCR signaling. It is furthermore discussed how these assays can be optimized and miniaturized to meet HTS requirements and for screening compound libraries.
Institute of Scientific and Technical Information of China (English)
MING Zhimao; TAO Junyong; ZHANG Yunan; YI Xiaoshan; CHEN Xun
2009-01-01
New armament systems are subjected to the method for dealing with multi-stage system reliability-growth statistical problems of diverse populations in order to improve reliability before starting mass production. Aiming at the test process, which is expensive and has small sample sizes in the development of complex systems, specific methods are studied for processing the statistical information of Bayesian reliability growth regarding diverse populations. Firstly, according to the characteristics of reliability growth during product development, the Bayesian method is used to integrate the testing information of multiple stages and the order relations of distribution parameters. A Gamma-Beta prior distribution is then proposed based on a non-homogeneous Poisson process (NHPP) corresponding to the reliability growth process. The posterior distribution of the reliability parameters is obtained for different stages of the product, and the reliability parameters are evaluated based on the posterior distribution. Finally, the Bayesian approach proposed in this paper for multi-stage reliability growth testing is applied to a small-sample-size test process in the astronautics field. The results of a numerical example show that the presented model can make use of the diverse information synthetically and pave the way for the application of the Bayesian model to multi-stage reliability growth test evaluation with small sample sizes. The method is useful for evaluating multi-stage system reliability and making reliability growth plans rationally.
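The paper's Gamma-Beta NHPP model is more elaborate, but the flavor of integrating multi-stage test information in a Bayesian way can be sketched with a simpler conjugate Gamma-Poisson update, where each stage's observed failures and exposure hours sequentially refine the failure-rate posterior. All numbers below are invented for illustration:

```python
def gamma_poisson_update(alpha, beta, failures, exposure):
    """Conjugate update of a Gamma(alpha, beta) prior on the failure rate,
    given `failures` observed over `exposure` test hours (Poisson model)."""
    return alpha + failures, beta + exposure

# Hypothetical three-stage growth test: (failures, test hours) per stage,
# with fewer failures per hour as the design matures.
stages = [(3, 100.0), (1, 150.0), (0, 200.0)]

alpha, beta = 1.0, 50.0  # assumed prior
for failures, hours in stages:
    alpha, beta = gamma_poisson_update(alpha, beta, failures, hours)

posterior_mean_rate = alpha / beta  # failures per hour after all stages
```

Each stage's posterior serves as the next stage's prior, which is the basic mechanism by which diverse stage-wise information is pooled; the paper additionally encodes order relations between stage parameters, which this sketch omits.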
Yu, XiaoChun; Bai, YuGuang; Cui, Miao; Gao, XiaoWei
2013-05-01
This paper presents a new inverse-analysis approach to sensitivity analysis and material property identification in transient non-homogeneous and non-linear heat conduction Boundary Element Method (BEM) analysis, based on the Complex Variable Differentiation Method (CVDM). In this approach, the material properties are taken as the optimization variables, and the sensitivity coefficients are computed by CVDM. The advantages of using CVDM are that the computation of partial derivatives of an implicit function is reduced to function evaluation in the complex domain, and that the parameter sensitivity coefficients can be determined more accurately than with the traditional Finite Difference Method (FDM). Based on BEM and CVDM in the evaluation of the sensitivity matrix of heat flux, parameters such as thermal conductivity can be accurately identified. Six numerical examples are given to demonstrate the potential of the proposed approach. The results indicate that the presented method is efficient for identifying the thermal conductivity with single or multiple parameters.
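The CVDM (complex-step) idea can be sketched independently of the BEM context: perturb the input along the imaginary axis and read the derivative off the imaginary part, avoiding the subtractive cancellation that limits finite differences. This is a generic illustration with an invented test function, not the authors' code:

```python
def complex_step_derivative(f, x, h=1e-20):
    """Complex-step approximation of f'(x): Im[f(x + i*h)] / h.
    No subtraction of nearly equal numbers occurs, so h can be
    taken extremely small without loss of accuracy."""
    return f(complex(x, h)).imag / h

def forward_difference(f, x, h=1e-8):
    """Traditional forward finite difference, for comparison."""
    return (f(x + h) - f(x)) / h

f = lambda x: x**3 - 2.0 * x          # analytic derivative: 3x^2 - 2
exact = 3.0 * 1.5**2 - 2.0

cs_err = abs(complex_step_derivative(f, 1.5) - exact)
fd_err = abs(forward_difference(f, 1.5) - exact)
```

The complex-step result matches the analytic derivative to machine precision, while the finite-difference error is limited by the trade-off between truncation and round-off in the choice of `h`, which is the accuracy advantage the abstract attributes to CVDM over FDM.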
Liu, Z.; Bambha, R.; Pinto, J. P.; Zeng, T.; Michelsen, H. A.
2013-12-01
The synergy between emissions-verification exercises for fossil-fuel CO2 and traditional air pollutants (TAPs, e.g., NOx, SO2, CO, and PM) stems from the common physical processes underlying the generation, transport, and perturbations of their emissions. Better understanding and characterizing this synergetic relationship is of great interest and benefit for science and policy. To this end, we have been developing a modeling framework that allows for studying CO2 along with TAPs on regional-through-urban scales. The framework is based on the EPA Community Multiscale Air Quality (CMAQ) modeling system and has been implemented on a domain over the contiguous US, where abundant observational data and complete emissions information are available. In this presentation, we will show results from a comprehensive analysis of atmospheric CO2 and an array of TAPs observed from multiple networks and platforms (in situ and satellite observations) and those simulated by CMAQ over the contiguous US for the full year of 2007. We will first present the model configurations and input data used for the CMAQ CO2 simulations and the results from model evaluations [1]. In light of the unique properties of CO2 compared to TAPs, we tested the sensitivity of model-simulated CO2 to different initial and boundary conditions, biosphere-atmosphere bidirectional fluxes, and fossil-fuel emissions. We then examined the variability of CO2 and TAPs simulated by CMAQ and observed from the NOAA ESRL tall-tower network, the EPA AQS network, and satellites (e.g., SCIAMACHY and OMI) at various spatial and temporal scales. Finally, we diagnosed in CMAQ the roles of fluxes and transport in regulating the covariance between CO2 and TAPs manifested in both surface concentrations and column-integrated densities. We will discuss the implications of these results for understanding the trends and characteristics of fossil-fuel emissions by exploiting and combining currently available observational and modeling
The Vehicle Verification after the Analysis of Simultaneous Engineering (SE) for Painting
Institute of Scientific and Technical Information of China (English)
薛杰; 王云飞; 栗玉山
2012-01-01
During the digital mock-up analysis stage, the equipment change requests (ECRs) submitted were verified on the complete vehicle, and the successful contents of the simultaneous engineering (SE) analysis were summarized in order to provide a basis for the development of subsequent new or modified vehicle models.
Energy Technology Data Exchange (ETDEWEB)
Unnikrishnan, V. K.; Nayak, Rajesh; Kartha, V. B.; Santhosh, C., E-mail: santhosh.cls@manipal.edu, E-mail: unnikrishnan.vk@manipal.edu [Department of Atomic and Molecular Physics, Manipal University, Manipal (India); Sonavane, M. S. [Nuclear Recycle Board, Bhabha Atomic Research Centre, Mumbai (India); Yeotikar, R. G. [Process Development Division, Bhabha Atomic Research Centre, Mumbai (India); Shah, M. L.; Gupta, G. P.; Suri, B. M. [Laser and Plasma Technology Division, Bhabha Atomic Research Centre, Mumbai (India)
2014-09-15
Laser-induced breakdown spectroscopy (LIBS), an atomic emission spectroscopy method, has rapidly grown into one of the best elemental analysis techniques over the past two decades. Homogeneity testing and quantitative analysis of manganese (Mn) in manganese-doped glasses have been carried out using an optimized LIBS system employing a nanosecond ultraviolet Nd:YAG laser as the source of excitation. The glass samples were prepared using conventional vitrification methods. The laser pulse irradiance on the surface of the glass samples, placed in air at atmospheric pressure, was about 1.7×10⁹ W/cm². The spatially integrated plasma emission was collected and imaged onto the spectrograph slit using an optical-fiber-based collection system. Homogeneity was checked by recording LIBS spectra from different sites on the sample surface and analyzing the elemental emission intensities for concentration determination. Validation of the observed LIBS results was done by comparison with scanning electron microscopy-energy dispersive X-ray spectroscopy (SEM-EDX) surface elemental mapping. The analytical performance of the LIBS system has been evaluated through the correlation of the LIBS-determined concentrations of Mn with its certified values. The results are found to be in very good agreement with the certified concentrations.
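Quantitative LIBS analysis of the kind described typically rests on a calibration curve relating emission-line intensity to certified concentration. The sketch below uses invented intensities and concentrations and a simple linear fit, purely to illustrate the workflow of determining an unknown Mn concentration from a measured intensity:

```python
import numpy as np

# Hypothetical calibration data: certified Mn concentrations (wt%) and
# corresponding measured Mn emission-line intensities (arbitrary units).
conc = np.array([0.5, 1.0, 2.0, 4.0])
intensity = np.array([120.0, 240.0, 480.0, 960.0])

# Linear calibration curve: intensity = slope * concentration + offset
slope, offset = np.polyfit(conc, intensity, 1)

def concentration_from_intensity(i):
    """Invert the calibration curve for an unknown sample."""
    return (i - offset) / slope

unknown = concentration_from_intensity(600.0)
```

In practice the homogeneity check described in the abstract amounts to applying this inversion to spectra from multiple surface sites and examining the spread of the resulting concentrations.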
Verification Games: Crowd-Sourced Formal Verification
2016-03-01
Report documentation page (abstract not recoverable). Dates covered: JUN 2012 – SEP 2015. Contract number: FA8750 (truncated).
Jekova, Irena; Bortolan, Giovanni
2015-01-01
Traditional means for identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: I (rI), II (rII), the first principal ECG component calculated from them (rPCA), and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied, and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people, and there was not a significant decrease for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
Directory of Open Access Journals (Sweden)
Irena Jekova
2015-01-01
Full Text Available Traditional means for identity validation (PIN codes, passwords, and physiological and behavioral biometric characteristics (fingerprint, iris, and speech are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on correlation of present-to-previous limb ECG leads: I (rI, II (rII, calculated from them first principal ECG component (rPCA, linear and nonlinear combinations between rI, rII, and rPCA. For the verification task, the one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task supposes one-to-many scenario and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension has been considered. In addition a common reference PTB dataset (14 healthy individuals with short time interval between the two acquisitions has been taken into account. The results on ECG-ILSA database were satisfactory with healthy people, and there was not a significant decrease in nonhealthy patients, demonstrating the robustness of the proposed method. With PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%.
Jekova, Irena; Bortolan, Giovanni
2015-01-01
Traditional means of identity validation (PIN codes, passwords) and physiological and behavioral biometric characteristics (fingerprint, iris, and speech) are susceptible to hacker attacks and/or falsification. This paper presents a method for person verification/identification based on the correlation of present-to-previous limb ECG leads: I (rI) and II (rII), the first principal ECG component (rPCA) calculated from them, and linear and nonlinear combinations of rI, rII, and rPCA. For the verification task, a one-to-one scenario is applied and threshold values for rI, rII, and rPCA and their combinations are derived. The identification task assumes a one-to-many scenario, and the tested subject is identified according to the maximal correlation with a previously recorded ECG in a database. The population-based ECG-ILSA database of 540 patients (147 healthy subjects, 175 patients with cardiac diseases, and 218 with hypertension) has been considered. In addition, a common reference PTB dataset (14 healthy individuals) with a short time interval between the two acquisitions has been taken into account. The results on the ECG-ILSA database were satisfactory for healthy people and did not decrease significantly for non-healthy patients, demonstrating the robustness of the proposed method. With the PTB database, the method provides an identification accuracy of 92.9% and a verification sensitivity and specificity of 100% and 89.9%. PMID:26568954
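The matching scheme described above can be sketched in a few lines. This is a minimal illustration, assuming pre-extracted, equal-length lead templates and a hypothetical acceptance threshold of 0.9; the paper derives its thresholds from the data and also uses the rPCA component and lead combinations, which are not reproduced here.

```python
import math

def pearson_r(x, y):
    """Pearson correlation between two equal-length, non-constant signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def verify(present, enrolled, threshold=0.9):
    """One-to-one scenario: accept if the present-to-previous lead
    correlation exceeds a pre-derived threshold (0.9 is illustrative)."""
    return pearson_r(present, enrolled) >= threshold

def identify(present, database):
    """One-to-many scenario: return the subject whose stored ECG lead has
    the maximal correlation with the present recording."""
    return max(database, key=lambda subj: pearson_r(present, database[subj]))
```

A usage example: `identify(new_lead, {"subj1": lead1, "subj2": lead2})` returns the best-matching enrolled subject.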
Homogeneity of Inorganic Glasses
DEFF Research Database (Denmark)
Jensen, Martin; Zhang, L.; Keding, Ralf;
2011-01-01
Homogeneity of glasses is a key factor determining their physical and chemical properties and overall quality. However, quantification of the homogeneity of a variety of glasses is still a challenge for glass scientists and technologists. Here, we show a simple approach by which the homogeneity of different glass products can be quantified and ranked. This approach is based on determination of both the optical intensity and dimension of the striations in glasses. These two characteristic values are obtained using the image processing method established recently. The logarithmic ratio between the dimension and the intensity is used to quantify and rank the homogeneity of glass products. Compared with the refractive index method, the image processing method has a wider detection range and a lower statistical uncertainty.
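The ranking idea can be sketched numerically. Note the abstract only states that a "logarithmic ratio between the dimension and the intensity" is used; the log base, units, and the direction of the ratio below are assumptions made for illustration, and the sample values are invented.

```python
import math

def homogeneity_index(striation_dimension, striation_intensity):
    """Assumed form of the index: log10 of striation dimension over optical
    striation intensity. Under this assumption, weaker (lower-intensity)
    striations give a larger index, i.e. a more homogeneous glass."""
    return math.log10(striation_dimension / striation_intensity)

# Hypothetical glass products: (striation dimension, optical intensity).
samples = {"float": (120.0, 0.02), "container": (80.0, 0.15)}
ranking = sorted(samples, key=lambda s: homogeneity_index(*samples[s]),
                 reverse=True)  # most homogeneous first
```

The point of the log ratio is that it collapses two measured quantities into a single scalar by which heterogeneous products can be ranked.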
Benchmarking monthly homogenization algorithms
Directory of Open Access Journals (Sweden)
V. K. C. Venema
2011-08-01
Full Text Available The COST (European Cooperation in Science and Technology) Action ES0601: Advances in homogenization methods of climate series: an integrated approach (HOME) has executed a blind intercomparison and validation study for monthly homogenization algorithms. Time series of monthly temperature and precipitation were evaluated because of their importance for climate studies and because they represent two important types of statistics (additive and multiplicative). The algorithms were validated against a realistic benchmark dataset. The benchmark contains real inhomogeneous data as well as simulated data with inserted inhomogeneities. Random break-type inhomogeneities were added to the simulated datasets modeled as a Poisson process with normally distributed breakpoint sizes. To approximate real world conditions, breaks were introduced that occur simultaneously in multiple station series within a simulated network of station data. The simulated time series also contained outliers, missing data periods and local station trends. Further, a stochastic nonlinear global (network-wide) trend was added.
Participants provided 25 separate homogenized contributions as part of the blind study as well as 22 additional solutions submitted after the details of the imposed inhomogeneities were revealed. These homogenized datasets were assessed by a number of performance metrics including (i) the centered root mean square error relative to the true homogeneous value at various averaging scales, (ii) the error in linear trend estimates and (iii) traditional contingency skill scores. The metrics were computed both using the individual station series as well as the network average regional series. The performance of the contributions depends significantly on the error metric considered. Contingency scores by themselves are not very informative. Although relative homogenization algorithms typically improve the homogeneity of temperature data, only the best ones improve
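The first performance metric listed above, the centered root mean square error, can be sketched for a single station series. This is a minimal stdlib-only illustration: each series is reduced to anomalies about its own mean so that an arbitrary constant offset between the homogenized and true series is not penalized (the benchmark also evaluates at various averaging scales and on network averages, which is not shown).

```python
import math

def centered_rmse(homogenized, truth):
    """Centered RMSE between a homogenized series and the true homogeneous
    series: both are first reduced to anomalies about their own means."""
    n = len(homogenized)
    mh = sum(homogenized) / n
    mt = sum(truth) / n
    return math.sqrt(sum(((h - mh) - (t - mt)) ** 2
                         for h, t in zip(homogenized, truth)) / n)
```

A series that differs from the truth only by a constant bias scores a centered RMSE of zero, which is exactly the intended behavior for homogenization, where the absolute reference level is unknowable.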
Energy Technology Data Exchange (ETDEWEB)
Copps, Kevin D.
2011-12-01
For a CASL grid-to-rod fretting problem, Sandia's Percept software was used in conjunction with the Sierra Mechanics suite to analyze the convergence behavior of the data transfer from a fluid simulation to a solid mechanics simulation. An analytic function, with properties relatively close to numerically computed fluid approximations, was chosen to represent the pressure solution in the fluid domain. The analytic pressure was interpolated on a sequence of grids on the fluid domain and transferred onto a separate sequence of grids in the solid domain. The error in the resulting pressure in the solid domain was measured with respect to the analytic pressure. The error in pressure approached zero as both the fluid and solid meshes were refined. The convergence of the transfer algorithm was limited by whether the source grid resolution was the same as, or finer than, the target grid resolution. In addition, using a feature coverage analysis, we found gaps in the solid mechanics code verification test suite directly relevant to the prototype CASL GTRF simulations.
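The verification procedure described above can be sketched in one dimension: sample an analytic "pressure" on a source (fluid) grid, transfer it by interpolation to a target (solid) grid, and measure the error against the analytic solution as both grids refine. This is a stdlib-only analogue of the study, not the Percept/Sierra transfer algorithm itself; the grids and the sine-profile pressure are illustrative.

```python
import math

def interp(xs, ys, x):
    """Piecewise-linear interpolation of samples (xs, ys) at point x."""
    for i in range(len(xs) - 1):
        if xs[i] <= x <= xs[i + 1]:
            t = (x - xs[i]) / (xs[i + 1] - xs[i])
            return (1 - t) * ys[i] + t * ys[i + 1]
    raise ValueError("x outside source grid")

def transfer_error(n_src, n_tgt, p=lambda x: math.sin(math.pi * x)):
    """Sample the analytic pressure on a source grid of n_src cells,
    transfer it to a target grid of n_tgt cells on [0, 1], and return the
    max error against the analytic pressure at the target nodes."""
    xs = [i / n_src for i in range(n_src + 1)]
    ys = [p(x) for x in xs]
    xt = [j / n_tgt for j in range(n_tgt + 1)]
    return max(abs(interp(xs, ys, x) - p(x)) for x in xt)

coarse = transfer_error(8, 16)
fine = transfer_error(32, 64)   # refining both grids shrinks the error
```

As in the study, the error here is bounded by the source (coarser) resolution: refining only the target grid cannot recover information the source grid never carried.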
3D DVH-based metric analysis versus per-beam planar analysis in IMRT pretreatment verification
Energy Technology Data Exchange (ETDEWEB)
Carrasco, Pablo; Jornet, Nuria; Latorre, Artur; Eudaldo, Teresa; Ruiz, Agusti; Ribas, Montserrat [Servei de Radiofisica i Radioproteccio, Hospital de la Santa Creu i Sant Pau, Sant Antoni Maria Claret, 167, 08025 Barcelona (Spain)
2012-08-15
Purpose: To evaluate methods of pretreatment IMRT analysis, using real measurements performed with a commercial 2D detector array, for clinical relevance and accuracy by comparing clinical DVH parameters. Methods: We divided the work into two parts. The first part consisted of six in-phantom tests aimed to study the sensitivity of the different analysis methods. Beam fluences, 3D dose distribution, and DVH of an unaltered original plan were compared to those of the delivered plan, in which an error had been intentionally introduced. The second part consisted of comparing gamma analysis with DVH metrics for 17 patient plans from various sites. Beam fluences were measured with the MapCHECK 2 detector, per-beam planar analysis was performed with the MapCHECK software, and 3D gamma analysis and the DVH evaluation were performed using 3DVH software. Results: In a per-beam gamma analysis some of the tests yielded false positives or false negatives. However, the 3DVH software correctly described the DVH of the plan which included the error. The measured DVH from the plan with controlled error agreed with the planned DVH within 2% dose or 2% volume. We also found that a gamma criterion of 3%/3 mm was too lax to detect some of the forced errors. Global analysis masked some problems, while local analysis magnified irrelevant errors at low doses. Small hotspots were missed for all metrics due to the spatial resolution of the detector panel. DVH analysis for patient plans revealed small differences between treatment plan calculations and 3DVH results, with the exception of very small volume structures such as the eyes and the lenses. Target coverage (D{sub 98} and D{sub 95}) of the measured plan was systematically lower than that predicted by the treatment planning system, while other DVH characteristics varied depending on the parameter and organ. Conclusions: We found no correlation between the gamma index and the clinical impact of a discrepancy for any of the gamma index
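For readers unfamiliar with the gamma metric discussed above, a one-dimensional sketch follows. The real test interpolates a densely sampled reference dose distribution in 2D/3D; this stdlib-only version evaluates the standard gamma formula (distance-to-agreement combined with dose difference, here 3 mm / 3% with global normalization) at a single point against discrete reference samples, all values being illustrative.

```python
import math

def gamma_index(x_eval, d_eval, ref_x, ref_d, dta=3.0, dd=0.03, d_norm=1.0):
    """1-D gamma for one evaluation point (position in mm, dose):
    minimum over reference samples of
      sqrt((dx / DTA)^2 + (dD / (dd * D_norm))^2).
    gamma <= 1 means the point passes the dta/dd criterion."""
    return min(math.sqrt(((x_eval - xr) / dta) ** 2 +
                         ((d_eval - dr) / (dd * d_norm)) ** 2)
               for xr, dr in zip(ref_x, ref_d))
```

Note how a 4% dose error at the correct position already fails a 3%/3 mm criterion (gamma > 1), while a correct dose displaced by 1 mm passes comfortably; this is the trade-off between spatial and dosimetric agreement the abstract probes.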
Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao
2014-05-26
To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model has been proposed. However, that approach can only include cohort studies with information for estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivity, and specificity in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.
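To make the partial-verification problem concrete, the classical single-study correction (the Begg and Greenes approach, which assumes verification depends only on the index test result) can be sketched as follows. This is not the paper's Bayesian hierarchical model, only the standard building block it generalizes; the counts in the test are invented.

```python
def begg_greenes(n_pos, n_neg, ver_pos_d, ver_pos_nd, ver_neg_d, ver_neg_nd):
    """Begg-Greenes correction for partial verification bias.
    n_pos/n_neg: all subjects testing positive/negative on the index test.
    ver_*: verified subjects by index result and true disease status.
    Estimates P(disease | test result) from the verified subset, scales it
    to all tested subjects, then recovers sensitivity and specificity."""
    v_pos = ver_pos_d + ver_pos_nd
    v_neg = ver_neg_d + ver_neg_nd
    d_pos = n_pos * ver_pos_d / v_pos   # estimated diseased among all test+
    d_neg = n_neg * ver_neg_d / v_neg   # estimated diseased among all test-
    nd_pos = n_pos - d_pos
    nd_neg = n_neg - d_neg
    sens = d_pos / (d_pos + d_neg)
    spec = nd_neg / (nd_pos + nd_neg)
    return sens, spec
```

With full verification the formula reduces to the naive estimates; with selective verification of test-positives it undoes the usual inflation of apparent sensitivity.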
Directory of Open Access Journals (Sweden)
N.V. Chernysheva
2016-08-01
Full Text Available The application of finite element method (FEM) or boundary element method (BEM) algorithms reveals some peculiar difficulties in the numerical solution of three-dimensional problems in infinite domains. Various algorithms avoid such problems by combining different methods and equations. The algorithm of 3D analysis developed to solve an external boundary problem by applying the combined method, based on incorporating the FEM and Somigliana’s integral formula, is considered. The algorithm is modified for the case of the interaction of a structure with an inhomogeneous medium. The efficiency of the software implementation of both algorithms has been tested. A stress-strain analysis of an inhomogeneous medium with a cavity has been carried out to illustrate the given approach.
Pardasani, Deepak; Gupta, Arvinda K; Palit, Meehir; Shakya, Purushottam; Kanaujia, Pankaj K; Sekhar, K; Dubey, Devendra K
2005-01-01
This paper describes the synthesis and gas chromatography/electron ionization mass spectrometric (GC/EI-MS) analysis of methyl esters of N,N-dialkylaminoethane-2-sulfonic acids (DAESAs). These sulfonic acids are important environmental signatures of nerve agent VX and its toxic analogues, hence GC/EI-MS analysis of their methyl esters is of paramount importance for verification of the Chemical Weapons Convention. DAESAs were prepared by condensation of 2-bromoethane sulfonic acid with dialkylamines, and by condensation of dialkylaminoethyl chloride with sodium bisulfite. GC/EI-MS analysis of methyl esters of DAESAs yielded mass spectra; based on these spectra, generalized fragmentation routes are proposed that rationalize most of the characteristic ions.
Parada, N. D. J. (Principal Investigator); Cappelletti, C. A.
1982-01-01
A stratification oriented to crop area and yield estimation problems was performed using an algorithm of clustering. The variables used were a set of agroclimatological characteristics measured in each one of the 232 municipalities of the State of Rio Grande do Sul, Brazil. A nonhierarchical cluster analysis was used and the pseudo F-statistics criterion was implemented for determining the "cut point" in the number of strata.
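The pseudo F-statistic mentioned above (the Calinski-Harabasz criterion: between-stratum variance over within-stratum variance, penalized by the number of strata) can be sketched for a given partition; the "cut point" is the number of strata that maximizes it. The sample points and labels below are illustrative, not the Rio Grande do Sul agroclimatological data.

```python
def pseudo_f(points, labels):
    """Calinski-Harabasz pseudo F-statistic for a partition of points
    (tuples of equal dimension) into k >= 2 clusters:
      (BGSS / (k - 1)) / (WGSS / (n - k)).
    Larger values indicate tighter, better-separated strata."""
    n = len(points)
    dim = len(points[0])
    ks = sorted(set(labels))
    k = len(ks)
    grand = [sum(p[d] for p in points) / n for d in range(dim)]
    bgss = wgss = 0.0
    for c in ks:
        members = [p for p, l in zip(points, labels) if l == c]
        cen = [sum(p[d] for p in members) / len(members) for d in range(dim)]
        bgss += len(members) * sum((cen[d] - grand[d]) ** 2 for d in range(dim))
        wgss += sum(sum((p[d] - cen[d]) ** 2 for d in range(dim))
                    for p in members)
    return (bgss / (k - 1)) / (wgss / (n - k))
```

In practice one runs the nonhierarchical clustering for several candidate numbers of strata and keeps the partition with the largest pseudo F.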
Directory of Open Access Journals (Sweden)
Xue Liang
Full Text Available PURPOSE: To investigate the pattern of spontaneous neural activity in patients with end-stage renal disease (ESRD) with and without neurocognitive dysfunction using resting-state functional magnetic resonance imaging (rs-fMRI) with a regional homogeneity (ReHo) algorithm. MATERIALS AND METHODS: rs-fMRI data were acquired in 36 ESRD patients (minimal nephro-encephalopathy [MNE], n = 19, 13 male, 37±12.07 years; non-nephro-encephalopathy [non-NE], n = 17, 11 male, 38±12.13 years) and 20 healthy controls (13 male, 7 female, 36±10.27 years). Neuropsychological (number connection test type A [NCT-A], digit symbol test [DST]) and laboratory tests were performed in all patients. The Kendall's coefficient of concordance (KCC) was used to measure the regional homogeneity for each subject. The regional homogeneity maps were compared using ANOVA tests among MNE, non-NE, and healthy control groups and post hoc t-tests between each pair in a voxel-wise way. A multiple regression analysis was performed to evaluate the relationships between ReHo index and NCT-A, DST scores, serum creatinine and urea levels, and disease and dialysis duration. RESULTS: Compared with healthy controls, both MNE and non-NE patients showed decreased ReHo in multiple areas of the bilateral frontal, parietal and temporal lobes. Compared with the non-NE, MNE patients showed decreased ReHo in the right inferior parietal lobe (IPL), medial frontal cortex (MFC) and left precuneus (PCu). The NCT-A scores and serum urea levels of ESRD patients negatively correlated with ReHo values in the frontal and parietal lobes, while DST scores positively correlated with ReHo values in the bilateral PCC/precuneus, MFC and inferior parietal lobe (IPL) (all P < 0.05, AlphaSim corrected). CONCLUSION: Diffusely decreased ReHo values were found in both MNE and non-NE patients. The progressively decreased ReHo in the default mode network (DMN), frontal and parietal lobes might be trait-related in MNE. The Re
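The ReHo measure used above is the standard Kendall coefficient of concordance, W = 12S / (K^2 (n^3 - n)), computed over the K voxel time series of a small neighbourhood cluster with n time points. A stdlib-only sketch (ignoring the tie-correction term that a full implementation would include for heavily tied data):

```python
def rank(xs):
    """Average ranks of a sequence (ties share the mean rank)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(xs):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for t in range(i, j + 1):
            ranks[order[t]] = avg
        i = j + 1
    return ranks

def reho_kcc(timeseries):
    """Kendall's W across the K voxel time series of a cluster:
    1.0 means the neighbouring voxels' signals rise and fall in perfect
    rank agreement (maximal regional homogeneity)."""
    k = len(timeseries)           # voxels in the neighbourhood
    n = len(timeseries[0])        # time points
    ranked = [rank(ts) for ts in timeseries]
    rank_sums = [sum(r[t] for r in ranked) for t in range(n)]
    mean_rs = sum(rank_sums) / n
    s = sum((rs - mean_rs) ** 2 for rs in rank_sums)
    return 12 * s / (k * k * (n ** 3 - n))
```

A ReHo map assigns this value to every voxel, computed over that voxel and its immediate neighbours (typically 27 voxels in 3D).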
1979-01-01
Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.
Homogeneity and composition of AlInGaN: A multiprobe nanostructure study
Energy Technology Data Exchange (ETDEWEB)
Krause, Florian F., E-mail: f.krause@ifp.uni-bremen.de [Institut für Festkörperphysik, Universität Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Ahl, Jan-Philipp [OSRAM Opto Semiconductors GmbH, Leibnizstr. 4, 93055 Regensburg (Germany); Tytko, Darius; Choi, Pyuck-Pa [Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Str. 1, 40237 Düsseldorf (Germany); Egoavil, Ricardo [EMAT, Universiteit Antwerpen, Groenenborgerlaan 171, 2020 Antwerpen (Belgium); Schowalter, Marco; Mehrtens, Thorsten; Müller-Caspary, Knut [Institut für Festkörperphysik, Universität Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany); Verbeeck, Johan [EMAT, Universiteit Antwerpen, Groenenborgerlaan 171, 2020 Antwerpen (Belgium); Raabe, Dierk [Max-Planck-Institut für Eisenforschung GmbH, Max-Planck-Str. 1, 40237 Düsseldorf (Germany); Hertkorn, Joachim; Engl, Karl [OSRAM Opto Semiconductors GmbH, Leibnizstr. 4, 93055 Regensburg (Germany); Rosenauer, Andreas [Institut für Festkörperphysik, Universität Bremen, Otto-Hahn-Allee 1, 28359 Bremen (Germany)
2015-09-15
The electronic properties of quaternary AlInGaN devices significantly depend on the homogeneity of the alloy. The identification of compositional fluctuations or verification of random-alloy distribution is hence of grave importance. Here, a comprehensive multiprobe study of composition and compositional homogeneity is presented, investigating AlInGaN layers with indium concentrations ranging from 0 to 17 at% and aluminium concentrations between 0 and 39 at%, employing high-angle annular dark field scanning transmission electron microscopy (HAADF STEM), energy dispersive X-ray spectroscopy (EDX) and atom probe tomography (APT). EDX mappings reveal distributions of local concentrations which are in good agreement with random alloy atomic distributions. This was hence investigated further with HAADF STEM by comparison with theoretical random-alloy expectations using statistical tests. To validate the performance of these tests, HAADF STEM image simulations were carried out for the case of a random-alloy distribution of atoms and for the case of In-rich clusters with nanometer dimensions. The investigated samples, which were grown by metal-organic vapor phase epitaxy (MOVPE), were thereby found to be homogeneous on this nanometer scale. Analysis of reconstructions obtained from APT measurements yielded matching results. HAADF STEM alone, however, only allows the possible combinations of indium and aluminium concentrations to be narrowed to the proximity of isolines in the two-dimensional composition space. The observed ranges of composition are in good agreement with the EDX and APT results within the respective precisions. - Highlights: • Composition and homogeneity of AlInGaN layers investigated with various techniques. • Random alloy distribution shown with HAADF STEM by comparison to simulation. • Composition of MOVPE grown quaternary AlInGaN found to be fully homogeneous. • Homogeneity also confirmed by APT measurements. • Concentrations determined by EDX, APT and HAADF
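The logic of a random-alloy statistical test can be sketched simply: for a random alloy, the number of In atoms found in a projected atomic column of N mixed-lattice sites is binomial, so the observed column-to-column variance should match N·p·(1−p). This is a conceptual illustration, not the actual test of the paper; the counts below are invented.

```python
def random_alloy_variance_ratio(counts, n_sites):
    """Ratio of the observed variance of per-column In counts to the
    binomial variance expected for a random alloy with the same mean
    concentration. A ratio near 1 is consistent with random mixing;
    a ratio well above 1 hints at clustering (e.g. In-rich regions)."""
    m = len(counts)
    p = sum(counts) / (m * n_sites)          # mean In site fraction
    mean = sum(counts) / m
    var = sum((c - mean) ** 2 for c in counts) / (m - 1)
    expected = n_sites * p * (1 - p)
    return var / expected
```

A formal version would turn this ratio into a chi-square statistic with m−1 degrees of freedom to attach a significance level.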
Verification Account Management System (VAMS)
Social Security Administration — The Verification Account Management System (VAMS) is the centralized location for maintaining SSA's verification and data exchange accounts. VAMS account management...
Verification of ceramic structures
Behar-Lafenetre, S.; Cornillon, L.; Rancurel, M.; Graaf, D. de; Hartmann, P.; Coe, G.; Laine, B.
2012-01-01
In the framework of the "Mechanical Design and Verification Methodologies for Ceramic Structures" contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instr
DEFF Research Database (Denmark)
Lei, Anders; Xu, Ruichao; Pedersen, C.M.
2011-01-01
This work presents a high-yield wafer-scale fabrication of MEMS-based unimorph silicon/PZT thick-film vibrational energy harvesters aimed towards vibration sources with peak frequencies in the range of a few hundred Hz. By combining KOH etching with mechanical front side protection, SOI wafer… to accurately define the thickness of the silicon part of the harvester, and a silicon-compatible PZT thick-film screen-printing technique, we are able to fabricate energy harvesters on wafer scale with a yield higher than 90%. The characterization of the fabricated harvesters is focused towards the full wafer/mass-production aspect; hence the analysis of uniformity in harvested power and resonant frequency.
Świrniak, Grzegorz; Mroczka, Janusz
2017-07-01
This work provides a numerical study of the scattering of low-coherence light by an infinite right circular cylinder and various types of optical fiber (with step- and graded-index profiles) in the vicinity of primary rainbows, caused by light that has undergone one internal reflection. The scattered intensity is analyzed in terms of the Fourier transform as well as in the time domain (by examining the impulse response of a fiber), with the aim of obtaining detailed information about the scattering process. The analysis reveals a wealth of information that is not apparent when a fiber is illuminated by a temporally coherent light source. The results also suggest an approach to characterizing the core size of step-index optical fibers.
Energy Technology Data Exchange (ETDEWEB)
Talley, Darren G.
2017-04-01
This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.
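The first of the coupled equation sets mentioned above, the point reactor kinetics equations, can be sketched in isolation. This is a one-delayed-group, explicit-Euler toy model for illustration only; the kinetics parameters are generic illustrative values, not ACRR data, and Razorback itself couples these equations to heat transfer, thermal expansion, and coolant flow.

```python
def point_kinetics(rho, t_end, dt=1e-4, beta=0.0075, lam=0.08, Lambda=1e-5):
    """One-delayed-group point reactor kinetics, explicit Euler:
        dn/dt = ((rho - beta) / Lambda) * n + lam * c
        dc/dt = (beta / Lambda) * n - lam * c
    n: normalized power, c: precursor concentration, rho: reactivity,
    beta: delayed fraction, lam: precursor decay constant,
    Lambda: prompt neutron generation time. Returns n(t_end)."""
    n = 1.0
    c = beta / (Lambda * lam)   # precursor level in equilibrium with n = 1
    t = 0.0
    while t < t_end:
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dt * dn
        c += dt * dc
        t += dt
    return n
```

With zero reactivity the power stays at its initial value (the code starts from exact equilibrium); a small positive reactivity produces the familiar prompt jump followed by a slow rise on the delayed-neutron time scale.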
Bhattacharya, Bishnupriya; Hall, Gary; Heaton, Nick; Kashai, Yaron; Khan Neyaz; Kirshenbaum, Zeev; Shneydor, Efrat
2011-01-01
The Accellera Universal Verification Methodology (UVM) standard is architected to scale, but verification is growing in more than just the digital design dimension. It is growing in the SoC dimension to include low-power and mixed-signal, and in the system integration dimension to include multi-language support and acceleration. These items and others all contribute to the quality of the SoC, so the Metric-Driven Verification (MDV) methodology is needed to unify it all into a coherent verification plan. This book is for verification engineers and managers familiar with the UVM and the benefits it brings to digital verification but who also need to tackle specialized tasks. It is also written for the SoC project manager who is tasked with building an efficient worldwide team. While the task continues to become more complex, Advanced Verification Topics describes methodologies outside of the Accellera UVM standard, but that build on it, to provide a way for SoC teams to stay productive and profitable.
Verification of predicted robustness and accuracy of multivariate analysis
Markiewicz, P.J.; Matthews, J.C.; Declerck, J.; Herholz, K.
2012-01-01
The assessment of accuracy and robustness of multivariate analysis of FDG-PET brain images as presented in [Markiewicz, P.J., Matthews, J.C., Declerck, J., Herholz, K., 2009. Robustness of multivariate image analysis assessed by resampling techniques and applied to FDG-PET scans of patients with Alzheimer's disease. Neuroimage 46, 472–485.] using a homogeneous sample (from one centre) of small size is here verified using a heterogeneous sample (from multiple centres) of much larger size. Originally the analysis, which included principal component analysis (PCA) and Fisher discriminant analysis (FDA), was established using a sample of 42 subjects (19 Normal Controls (NCs) and 23 Alzheimer's disease (AD) patients) and here the analysis is verified using an independent sample of 166 subjects (86 NCs and 80 ADs) obtained from the ADNI database. It is shown that bootstrap resampling combined with the metric of the largest principal angle between PCA subspaces as well as the deliberate clinical misdiagnosis simulation can predict robustness of the multivariate analysis when used with new datasets. Cross-validation (CV) and the .632 bootstrap overestimated the predictive accuracy encouraging less robust solutions. Also, it is shown that the type of PET scanner and image reconstruction method has an impact on such analysis and affects the accuracy of the verification sample. PMID:21338696
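The subspace-stability metric mentioned above, the largest principal angle between PCA subspaces, can be sketched for the simple case of two 2-dimensional subspaces given by orthonormal bases; the angle follows from the singular values of M = Q1^T Q2 (smallest singular value, largest angle), here computed analytically for the 2x2 case. The full study combines this with bootstrap resampling, which is not shown.

```python
import math

def largest_principal_angle(q1, q2):
    """Largest principal angle (radians) between two 2-D subspaces of R^n,
    each given as two orthonormal basis vectors. Forms M = Q1^T Q2 (2x2)
    and takes arccos of its smallest singular value, obtained from the
    closed-form eigenvalues of the symmetric matrix M^T M."""
    m = [[sum(a * b for a, b in zip(u, v)) for v in q2] for u in q1]
    a = m[0][0] ** 2 + m[1][0] ** 2            # (M^T M)[0][0]
    c = m[0][1] ** 2 + m[1][1] ** 2            # (M^T M)[1][1]
    b = m[0][0] * m[0][1] + m[1][0] * m[1][1]  # (M^T M)[0][1]
    lo = (a + c - math.sqrt((a - c) ** 2 + 4 * b * b)) / 2
    sigma_min = math.sqrt(max(lo, 0.0))
    return math.acos(min(1.0, sigma_min))
```

Identical subspaces give an angle of 0; subspaces sharing only one direction give pi/2. In the robustness analysis, small angles between subspaces fitted to resampled datasets indicate a stable PCA solution.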
Dynamics of homogeneous nucleation
DEFF Research Database (Denmark)
Toxværd, Søren
2015-01-01
The classical nucleation theory for homogeneous nucleation is formulated as a theory for a density fluctuation in a supersaturated gas at a given temperature. But molecular dynamics simulations reveal that it is small cold clusters which initiate the nucleation. The temperature in the nucleating clusters fluctuates, but the mean temperature remains below the temperature of the supersaturated gas until they reach the critical nucleation size. The critical nuclei have, however, a temperature equal to that of the supersaturated gas. The kinetics of homogeneous nucleation is caused not only by growth or shrinkage through accretion or evaporation of single monomers, but also by an exponentially declining change in cluster size per time step, matching the cluster distribution in the supersaturated gas.
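For reference, the classical theory against which the simulations are compared balances a volume (bulk) term against a surface term in the free energy of a spherical cluster of radius $r$; the expressions below are the textbook forms (with $\sigma$ the surface tension, $v$ the molecular volume, and $S$ the supersaturation ratio), not results from the paper:

```latex
\Delta G(r) = -\frac{4\pi r^{3}}{3v}\,k_{B}T\ln S + 4\pi r^{2}\sigma,
\qquad
r^{*} = \frac{2\sigma v}{k_{B}T\ln S},
\qquad
\Delta G^{*} = \frac{16\pi \sigma^{3} v^{2}}{3\,(k_{B}T\ln S)^{2}}
```

The critical radius $r^*$ maximizes $\Delta G(r)$: clusters smaller than $r^*$ tend to evaporate, larger ones tend to grow, which is why the sub-critical cluster temperatures reported above matter for the nucleation rate.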
Homogeneous group, research, institution
Directory of Open Access Journals (Sweden)
Francesca Natascia Vasta
2014-09-01
Full Text Available The work outlines the complex connection among empirical research, therapeutic programs and the host institution, and considers the current state of research in Italy. The Italian research field is analyzed and critical issues are outlined: a lack of results regarding both the therapeutic processes and the effectiveness of group-analytic treatment of eating disorders. The work investigates a homogeneous eating disorders group, conducted in an eating disorders outpatient service. First, we present the methodological steps the research is based on, including the strong connection between theory and clinical tools. Secondly, the clinical tools are described and the results commented upon. Finally, our results suggest the necessity of validating some more specific hypotheses: verifying the relationship between clinical improvement (reduction of the sense of exclusion and of painful emotions) and specific group therapeutic processes; and verifying the relationship between depressive feelings, relapses and the transition through a more differentiated groupal field. Keywords: Homogeneous group; Eating disorders; Institutional field; Therapeutic outcome
Homogenous finitary symmetric groups
Directory of Open Access Journals (Sweden)
Otto. H. Kegel
2015-03-01
Full Text Available We characterize strictly diagonal types of embeddings of finitary symmetric groups in terms of cardinality and the characteristic. Namely, we prove the following. Let $\kappa$ be an infinite cardinal. If $G=\bigcup_{i=1}^{\infty}G_i$, where $G_i=\mathrm{FSym}(\kappa n_i)$ ($H=\bigcup_{i=1}^{\infty}H_i$, where $H_i=\mathrm{Alt}(\kappa n_i)$), is a group of strictly diagonal type and $\xi=(p_1,p_2,\ldots)$ is an infinite sequence of primes, then $G$ is isomorphic to the homogeneous finitary symmetric group $\mathrm{FSym}(\kappa)(\xi)$ ($H$ is isomorphic to the homogeneous alternating group $\mathrm{Alt}(\kappa)(\xi)$), where $n_0=1$ and $n_i=p_1p_2\ldots p_i$.
Directory of Open Access Journals (Sweden)
VIORICA POPESCU
2013-05-01
Full Text Available When Romania joined the European Union on 1 January 2007, a Cooperation and Verification Mechanism (hereinafter "CVM") was established in order to support Romania in remedying certain shortcomings in the areas of judicial reform and the fight against corruption, and to monitor the progress achieved through periodic reports. Though the reforms of human resources management in the Romanian legal system were conceived in a coherent framework, the main changes in this area often did not complement each other, their implementation being sometimes inconsistent with previous measures. In this context, the study aims to provide a short analysis of the way in which the reform of human resources management was reflected in the European Commission's reports, pointing out the measures adopted by the Romanian authorities.
Energy Technology Data Exchange (ETDEWEB)
Liu, Fei; Parkinson, B. A.; Divan, Ralu; Roberts, John; Liang, Yanping
2016-12-01
Interdigitated array (IDA) electrodes have been applied to study EC′ (electron transfer followed by a catalytic reaction) mechanisms, and a new method for the quantitative analysis of IDA results was developed. In this new method, currents on the IDA generator and collector electrodes for an EC′ mechanism are derived from the number of redox cycles and the contribution of the non-catalytic current, and the fractions of bipotential recycling species and catalytically active species are calculated, which helps in understanding the catalytic reaction mechanism. The homogeneous hydrogen evolution reaction catalyzed by the [Ni(PPh2NBn2)2]2+ electrocatalyst (where PPh2NBn2 is 1,5-dibenzyl-3,7-diphenyl-1,5-diaza-3,7-diphosphacyclooctane) was examined and analyzed with IDA electrodes. In addition, the existence of reaction intermediates in the catalytic cycle is inferred from the electrochemical behavior of glassy carbon disk electrodes and carbon IDA electrodes. This quantitative analysis of IDA electrode cyclic voltammetry currents can be used as a simple and straightforward method for determining the reaction mechanism in other catalytic systems as well.
Radhakrishnan, Krishnan
1994-01-01
LSENS, the Lewis General Chemical Kinetics and Sensitivity Analysis Code, has been developed for solving complex, homogeneous, gas-phase chemical kinetics problems and contains sensitivity analysis for a variety of problems, including nonisothermal situations. This report is part 1 of a series of three reference publications that describe LSENS, provide a detailed guide to its usage, and present many example problems. Part 1 derives the governing equations and describes the numerical solution procedures for the types of problems that can be solved. The accuracy and efficiency of LSENS are examined by means of various test problems, and comparisons with other methods and codes are presented. LSENS is a flexible, convenient, accurate, and efficient solver for chemical reaction problems such as static systems; steady, one-dimensional, inviscid flow; reaction behind an incident shock wave, including boundary layer correction; and the perfectly stirred (highly backmixed) reactor. In addition, the chemical equilibrium state can be computed for the following assigned states: temperature and pressure, enthalpy and pressure, temperature and volume, and internal energy and volume. For static problems the code computes the sensitivity coefficients of the dependent variables and their temporal derivatives with respect to the initial values of the dependent variables and/or the three rate coefficient parameters of the chemical reactions.
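The core of a static (constant-volume, isothermal) kinetics problem of the kind LSENS solves is an ODE system built from the rate laws. A stdlib-only sketch for the two-step first-order sequence A -> B -> C, integrated with the midpoint (RK2) method, follows; this illustrates the equation structure only, not LSENS's stiff-ODE solver or its sensitivity machinery, and the rate coefficients are invented.

```python
def integrate_kinetics(k1, k2, y0, t_end, dt=1e-3):
    """Midpoint (RK2) integration of the isothermal sequence A -> B -> C:
        dA/dt = -k1*A,  dB/dt = k1*A - k2*B,  dC/dt = k2*B
    Returns the concentrations (A, B, C) at t_end."""
    def f(y):
        a, b, c = y
        return (-k1 * a, k1 * a - k2 * b, k2 * b)

    y = list(y0)
    t = 0.0
    while t < t_end:
        # Evaluate the slope at the midpoint of the step.
        half = [yi + 0.5 * dt * fi for yi, fi in zip(y, f(y))]
        y = [yi + dt * fi for yi, fi in zip(y, f(half))]
        t += dt
    return y

a, b, c = integrate_kinetics(1.0, 0.5, (1.0, 0.0, 0.0), 2.0)
```

Because each reaction only moves material between species, the total A + B + C is conserved by construction, which is a handy built-in sanity check for any kinetics integrator.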
Regression Verification Using Impact Summaries
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
Homogenization of resonant chiral metamaterials
DEFF Research Database (Denmark)
Andryieuski, Andrei; Menzel, C.; Rockstuhl, Carsten
2010-01-01
Homogenization of metamaterials is a crucial issue as it allows to describe their optical response in terms of effective wave parameters as, e.g., propagation constants. In this paper we consider the possible homogenization of chiral metamaterials. We show that for meta-atoms of a certain size...... an analytical criterion for performing the homogenization and a tool to predict the homogenization limit. We show that strong coupling between meta-atoms of chiral metamaterials may prevent their homogenization at all....
Homogeneous Clifford structures
Moroianu, Andrei; Pilca, Mihaela
2012-01-01
We give an upper bound for the rank r of homogeneous (even) Clifford structures on compact manifolds of non-vanishing Euler characteristic. More precisely, we show that if r = 2^a · b with b odd, then r ≤ 9 for a = 0, r ≤ 10 for a = 1, r ≤ 12 for a = 2 and r ≤ 16 for a ≥ 3. Moreover, we describe the four limiting cases and show that there is exactly one solution in each case.
Figueroa-O'Farrill, José
2015-01-01
Motivated by the search for new gravity duals to M2 branes with $N>4$ supersymmetry --- equivalently, M-theory backgrounds with Killing superalgebra $\mathfrak{osp}(N|4)$ for $N>4$ --- we classify (except for a small gap) homogeneous M-theory backgrounds with symmetry Lie algebra $\mathfrak{so}(n) \oplus \mathfrak{so}(3,2)$ for $n=5,6,7$. We find that there are no new backgrounds with $n=6,7$ but we do find a number of new (to us) backgrounds with $n=5$. All backgrounds are metrically products of the form $\operatorname{AdS}_4 \times P^7$, with $P$ riemannian and homogeneous under the action of $\operatorname{SO}(5)$, or $S^4 \times Q^7$ with $Q$ lorentzian and homogeneous under the action of $\operatorname{SO}(3,2)$. At least one of the new backgrounds is supersymmetric (albeit with only $N=2$) and we show that it can be constructed from a supersymmetric Freund--Rubin background via a Wick rotation. Two of the new backgrounds have only been approximated numerically.
Verification measurements of an eMC algorithm using a 2D ion chamber array.
Wanklyn, Mark D; Kidane, Ghirmay; Crees, Liz
2016-09-08
The aim of this study was to assess the suitability of the Im'RT MatriXX 2D ion chamber array for performing verification measurements on the Varian Eclipse electron Monte Carlo (eMC) algorithm for a range of clinical energies (6, 12, and 20 MeV) on a Varian 2100iX linear accelerator. Firstly, the suitability of the MatriXX for measuring percentage depth doses (PDD) in water was assessed, including characterization of the inherent buildup found in the MatriXX. Secondly, the suitability of the MatriXX for measuring dose distributions in homogeneous and heterogeneous phantoms was assessed using gamma analysis at 3%/3 mm. It was found that, after adjusting the PDD curves for the inherent buildup, the position of R50,D measured using the MatriXX agreed to within 1 mm with the PDDs generated using the eMC algorithm for all energies used in this study. Gamma analysis at 3%/3 mm showed very good agreement (> 95%) for all cases in both homogeneous and heterogeneous phantoms. It was concluded that the Im'RT MatriXX is a suitable device for performing eMC verification and could potentially be used for routine energy checks of electron beams.
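The 3%/3 mm gamma criterion used above has a standard definition: each reference point passes if some evaluated point lies within 3 mm whose dose differs by less than 3% of the maximum dose. A minimal 1D sketch (the Gaussian profiles are hypothetical, not the study's measured data):

```python
import math

def gamma_1d(ref_dose, eval_dose, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    """Toy 1D gamma analysis (global 3%/3 mm by default).

    ref_dose / eval_dose: dose profiles on the same grid (arbitrary units);
    spacing_mm: grid spacing. Returns the gamma pass rate (fraction <= 1).
    """
    d_max = max(ref_dose)
    passed = 0
    for i, d_ref in enumerate(ref_dose):
        # gamma is the minimum combined distance-to-agreement / dose
        # difference over all evaluated points.
        gamma = min(
            math.sqrt(((i - j) * spacing_mm / dta_mm) ** 2
                      + ((d_eval - d_ref) / (dose_tol * d_max)) ** 2)
            for j, d_eval in enumerate(eval_dose)
        )
        if gamma <= 1.0:
            passed += 1
    return passed / len(ref_dose)

# Hypothetical profiles: evaluated dose shifted ~1 mm and ~1% off.
ref = [100 * math.exp(-(x - 25) ** 2 / 200.0) for x in range(50)]
ev = [101 * math.exp(-(x - 26) ** 2 / 200.0) for x in range(50)]
print(gamma_1d(ref, ev, spacing_mm=1.0))
```

A 1 mm shift and 1% dose offset sit well inside the 3%/3 mm tolerance, so the pass rate here comes out near 1, consistent with the > 95% agreement reported above.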
Kume, Hideaki; Muraoka, Satoshi; Kuga, Takahisa; Adachi, Jun; Narumi, Ryohei; Watanabe, Shio; Kuwano, Masayoshi; Kodera, Yoshio; Matsushita, Kazuyuki; Fukuoka, Junya; Masuda, Takeshi; Ishihama, Yasushi; Matsubara, Hisahiro; Nomura, Fumio; Tomonaga, Takeshi
2014-06-01
Recent advances in quantitative proteomic technology have enabled the large-scale validation of biomarkers. We here performed a quantitative proteomic analysis of membrane fractions from colorectal cancer tissue to discover biomarker candidates, and then extensively validated the candidate proteins identified. A total of 5566 proteins were identified in six tissue samples, each of which was obtained from polyps and cancer with and without metastasis. GO cellular component analysis predicted that 3087 of these proteins were membrane proteins, whereas TMHMM algorithm predicted that 1567 proteins had a transmembrane domain. Differences were observed in the expression of 159 membrane proteins and 55 extracellular proteins between polyps and cancer without metastasis, while the expression of 32 membrane proteins and 17 extracellular proteins differed between cancer with and without metastasis. A total of 105 of these biomarker candidates were quantitated using selected (or multiple) reaction monitoring (SRM/MRM) with stable synthetic isotope-labeled peptides as an internal control. The results obtained revealed differences in the expression of 69 of these proteins, and this was subsequently verified in an independent set of patient samples (polyps (n = 10), cancer without metastasis (n = 10), cancer with metastasis (n = 10)). Significant differences were observed in the expression of 44 of these proteins, including ITGA5, GPRC5A, PDGFRB, and TFRC, which have already been shown to be overexpressed in colorectal cancer, as well as proteins with unknown function, such as C8orf55. The expression of C8orf55 was also shown to be high not only in colorectal cancer, but also in several cancer tissues using a multicancer tissue microarray, which included 1150 cores from 14 cancer tissues. This is the largest verification study of biomarker candidate membrane proteins to date; our methods for biomarker discovery and subsequent validation using SRM/MRM will contribute to the
Social Security Administration — The SSN Verification Service is used by Java applications to execute the GUVERF02 service using the WebSphere/CICS Interface. It accepts several input data fields...
Martin, Edward J.
2008-01-15
A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, which can be accomplished without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.
Apel, Sven; Wendler, Philipp; von Rhein, Alexander; Beyer, Dirk
2011-01-01
A software product line is a set of software products that are distinguished in terms of features (i.e., end-user-visible units of behavior). Feature interactions, situations in which the combination of features leads to emergent and possibly critical behavior, are a major source of failures in software product lines. We explore how feature-aware verification can improve the automatic detection of feature interactions in software product lines. Feature-aware verification uses product-line verification techniques and supports the specification of feature properties along with the features in separate and composable units. It integrates the technique of variability encoding to verify a product line without generating and checking a possibly exponential number of feature combinations. We developed the tool suite SPLverifier for feature-aware verification, which is based on standard model-checking technology. We applied it to an e-mail system that incorporates domain knowledge of AT&T. We found that feat...
Standard Verification System (SVS)
Social Security Administration — SVS is a mainframe program that accesses the NUMIDENT to perform SSN verifications. This program is called by SSA Internal applications to verify SSNs. There is also...
Energy Technology Data Exchange (ETDEWEB)
1979-03-01
The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.
Energy Technology Data Exchange (ETDEWEB)
Takahashi, Atsushi; Igarashi, Shukuro; Ueki, Yasuo [Ibaraki Univ., Hitachi (Japan). Faculty of Engineering; Yamaguchi, Hitoshi [National Research Inst. for Metals, Ibaraki (Japan)
2000-11-01
A homogeneous liquid-liquid extraction method for 36 metal ions with diethyldithiocarbamate was studied. As a result, 11 metal ions were extracted as metal-chelates. Under the experimental conditions, the maximum concentration factor was 500 (i.e., 0.1 mL of sedimented liquid phase was produced from 50 mL of aqueous phase). Moreover, the proposed method was utilized as a preconcentration method for X-ray fluorescence analysis of these metals. The recovery of each metal was ca. 97-100%. All calibration curves were linear over the range of 5.0 x 10{sup -7} mol L{sup -1} to 1.0 x 10{sup -5} mol L{sup -1}. The detection limits were at the 10{sup -8} mol L{sup -1} levels and the relative standard deviations were below 5% (5 determinations). When the proposed method was used for the determination of contaminants in a synthetic sample (Al-based alloy model) and of components in an Au-Pd alloy, the results were satisfactory. (orig.)
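The concentration factor quoted above is a simple volume ratio, which can be made explicit. The recovery value below reuses the abstract's ca. 97% figure; the helper names are mine:

```python
def concentration_factor(v_aqueous_ml, v_sedimented_ml):
    # Volume-ratio concentration factor for homogeneous liquid-liquid
    # extraction: analyte from the aqueous phase ends up in the much
    # smaller sedimented phase.
    return v_aqueous_ml / v_sedimented_ml

def concentrated(c_aqueous_mol_l, cf, recovery=1.0):
    # Analyte concentration in the sedimented phase, allowing for
    # incomplete recovery.
    return c_aqueous_mol_l * cf * recovery

cf = concentration_factor(50.0, 0.1)
print(cf)  # 500.0, matching the abstract's maximum concentration factor
print(concentrated(5.0e-7, cf, recovery=0.97))  # lowest calibration point, 97% recovery
```

So even the lowest calibration concentration, 5.0 x 10^-7 mol/L, is lifted to roughly 2.4 x 10^-4 mol/L in the sedimented phase before X-ray fluorescence analysis.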
Mashonkina, L.; Jablonka, P.; Pakhomov, Yu.; Sitnova, T.; North, P.
2017-08-01
We present a homogeneous set of accurate atmospheric parameters for a complete sample of very and extremely metal-poor stars in the dwarf spheroidal galaxies (dSphs) Sculptor, Ursa Minor, Sextans, Fornax, Boötes I, Ursa Major II, and Leo IV. We also deliver a Milky Way (MW) comparison sample of giant stars covering the - 4 - 3.5 regime, the Ti I/Ti II ionisation equilibrium is fulfilled in the NLTE calculations. In the log g - Teff plane, all the stars sit on the giant branch of the evolutionary tracks corresponding to [Fe/H] = - 2 to - 4, in line with their metallicities. For some of the most metal-poor stars of our sample, we obtain inconsistent NLTE abundances from the two ionisation stages for both iron and titanium. We suggest that this is a consequence of the uncertainty in the Teff-colour relation at those metallicities. The results of this work provide the basis for a detailed abundance analysis presented in a companion paper. Tables A.1 and A.2 are also available at the CDS via anonymous ftp to http://cdsarc.u-strasbg.fr (http://130.79.128.5) or via http://cdsarc.u-strasbg.fr/viz-bin/qcat?J/A+A/604/A129
Qiu, Shi; Yang, Wen-Zhi; Yao, Chang-Liang; Qiu, Zhi-Dong; Shi, Xiao-Jian; Zhang, Jing-Xian; Hou, Jin-Jun; Wang, Qiu-Rong; Wu, Wan-Ying; Guo, De-An
2016-07-01
A key segment in authentication of herbal medicines is the establishment of robust biomarkers that embody the intrinsic metabolites difference independent of the growing environment or processing techniques. We present a strategy by nontargeted metabolomics and "Commercial-homophyletic" comparison-induced biomarkers verification with new bioinformatic vehicles, to improve the efficiency and reliability in authentication of herbal medicines. The chemical differentiation of five different parts (root, leaf, flower bud, berry, and seed) of Panax ginseng was illustrated as a case study. First, an optimized ultra-performance liquid chromatography/quadrupole time-of-flight-MS(E) (UPLC/QTOF-MS(E)) approach was established for global metabolites profiling. Second, UNIFI™ combined with search of an in-house library was employed to automatically characterize the metabolites. Third, pattern recognition multivariate statistical analysis of the MS(E) data of different parts of commercial and homophyletic samples were separately performed to explore potential biomarkers. Fourth, potential biomarkers deduced from commercial and homophyletic root and leaf samples were cross-compared to infer robust biomarkers. Fifth, discriminating models by artificial neural network (ANN) were established to identify different parts of P. ginseng. Consequently, 164 compounds were characterized, and 11 robust biomarkers enabling the differentiation among root, leaf, flower bud, and berry, were discovered by removing those structurally unstable and possibly processing-related ones. The ANN models using the robust biomarkers managed to exactly discriminate four different parts and root adulterant with leaf as well. Conclusively, biomarkers verification using homophyletic samples contributes to the discovery of robust biomarkers. The integrated strategy facilitates authentication of herbal medicines in a more efficient and more intelligent manner. Copyright © 2016 Elsevier B.V. All rights reserved.
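The cross-comparison step (the fourth step above) is, at its core, an intersection: only candidates flagged in both the commercial and the homophyletic comparisons survive as robust biomarkers. A minimal sketch; the ginsenoside names are placeholders, not the paper's actual 11 markers:

```python
def robust_biomarkers(commercial_candidates, homophyletic_candidates):
    """Cross-compare candidate sets: keep only markers flagged in BOTH
    the commercial and the homophyletic sample comparisons. A set
    intersection stands in for the paper's cross-comparison step."""
    return sorted(set(commercial_candidates) & set(homophyletic_candidates))

# Hypothetical candidate lists (names are placeholders).
commercial = ["Rb1", "Rg1", "Re", "Rc", "Rd"]
homophyletic = ["Rb1", "Rg1", "Re", "F2", "Rg3"]
print(robust_biomarkers(commercial, homophyletic))  # ['Rb1', 'Re', 'Rg1']
```

Markers that appear in only one comparison (here "Rc", "Rd", "F2", "Rg3") are exactly the "structurally unstable and possibly processing-related" candidates the strategy removes.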
Butler, S. L.
2017-08-01
It is instructive to consider the sensitivity function for a homogeneous half space for resistivity since it has a simple mathematical formula and it does not require a priori knowledge of the resistivity of the ground. Past analyses of this function have allowed visualization of the regions that contribute most to apparent resistivity measurements with given array configurations. The horizontally integrated form of this equation gives the sensitivity function for an infinitesimally thick horizontal slab with a small resistivity contrast and analysis of this function has admitted estimates of the depth of investigation for a given electrode array. Recently, it has been shown that the average of the vertical coordinate over this function yields a simple formula that can be used to estimate the depth of investigation. The sensitivity function for a vertical inline slab has also been previously calculated. In this contribution, I show that the sensitivity function for a homogeneous half-space can also be integrated so as to give sensitivity functions to semi-infinite vertical slabs that are perpendicular to the array axis. These horizontal sensitivity functions can, in turn, be integrated over the spatial coordinates to give the mean horizontal positions of the sensitivity functions. The mean horizontal positions give estimates for the centres of the regions that affect apparent resistivity measurements for arbitrary array configuration and can be used as horizontal positions when plotting pseudosections even for non-collinear arrays. The mean of the horizontal coordinate that is perpendicular to a collinear array also gives a simple formula for estimating the distance over which offline resistivity anomalies will have a significant effect. The root mean square (rms) widths of the sensitivity functions are also calculated in each of the coordinate directions as an estimate of the inverse of the resolution of a given array. For depth and in the direction perpendicular
Deng, Shaoqiang
2012-01-01
"Homogeneous Finsler Spaces" is the first book to emphasize the relationship between Lie groups and Finsler geometry, and the first to show the validity in using Lie theory for the study of Finsler geometry problems. This book contains a series of new results obtained by the author and collaborators during the last decade. The topic of Finsler geometry has developed rapidly in recent years. One of the main reasons for its surge in development is its use in many scientific fields, such as general relativity, mathematical biology, and phycology (study of algae). This monograph introduc
Energy Technology Data Exchange (ETDEWEB)
Bershadskii, A.G.
1985-06-01
An exact solution for the nonlinear problem of the spectral energy function of a homogeneous turbulence is derived under the assumption that energy transfer under the effect of inertial forces is determined mainly by the interactions among vortices whose wavenumbers are only slightly different from each other. The results are experimentally verified for turbulence behind grids. Similar problems are solved for MHD turbulence and for a nonstationary spectral energy function. It is shown that at the initial stage of degeneration, the spectral energy function is little influenced by the Stewart number; this agrees with experimental data for the damping of longitudinal velocity pulsations behind a grid in a longitudinal magnetic field. 15 references.
Burnette, Matthew C.; Genereux, David P.; Birgand, François
2016-08-01
The hydraulic conductivity (K) of streambeds is a critical variable controlling interaction of groundwater and surface water. The Hvorslev analysis for estimating K from falling-head test data has been widely used since the 1950s, but its performance in layered sandy sediments common in streams and lakes has not previously been examined. Our numerical simulations and laboratory experiments show that the Hvorslev analysis yields accurate K values in both homogeneous sediment (for which the analysis was originally derived) and layered deposits with low-K sand over high-K sand. K from the Hvorslev analysis deviated significantly from true K only when two conditions were present together: (1) high-K sand was present over low-K sand, and (2) the bottom of the permeameter in which K was measured was at or very near the interface between high-K and low-K. When this combination of conditions exists, simulation and laboratory sand tank results show that in-situ Hvorslev K underestimates the true K of the sediment within a permeameter, because the falling-head test is affected by low-K sediment outside of (below the bottom of) the permeameter. In simulation results, the maximum underestimation (occurring when the bottom of the permeameter was at the interface of high K over low K) was by a factor of 0.91, 0.59, and 0.12 when the high-K to low-K ratio was 2, 10, and 100, respectively. In laboratory sand tank experiments, the underestimation was by a factor of about 0.83 when the high-K to low-K ratio was 2.3. Also, this underestimation of K by the Hvorslev analysis was about the same whether the underlying low-K layer was 2 cm or 174 cm thick (1% or 87% of the domain thickness). Numerical model simulations were useful in the interpretation of in-situ field K profiles at streambed sites with layering; specifically, scaling the model results to the maximum measured K at the top of the field K profiles helped constrain the likely ratio of high K to low K at field locations with
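For reference, one commonly used form of the Hvorslev falling-head equation for an open-ended streambed permeameter is sketched below. The shape-factor term pi*d/(11*m) and all the numbers are assumptions for illustration, not taken from this study; consult the original Hvorslev shape-factor analysis before using any specific form.

```python
import math

def hvorslev_k(d_cm, lv_cm, h1_cm, h2_cm, t1_s, t2_s, m=1.0):
    """Falling-head Hvorslev estimate of hydraulic conductivity (cm/s)
    for an open-ended permeameter in streambed sediment.

    One common form (assumed here; m = sqrt(Kh/Kv), isotropy gives m = 1):
        K = (pi*d/(11*m) + lv) / (t2 - t1) * ln(h1/h2)
    d: permeameter inner diameter; lv: length of sediment inside the tube;
    h1, h2: heads above ambient level at times t1, t2.
    """
    shape_term = math.pi * d_cm / (11.0 * m)  # end-effect shape factor
    return (shape_term + lv_cm) / (t2_s - t1_s) * math.log(h1_cm / h2_cm)

# Hypothetical test: head falls from 60 cm to 30 cm in 20 minutes.
print(hvorslev_k(d_cm=5.0, lv_cm=30.0, h1_cm=60.0, h2_cm=30.0, t1_s=0.0, t2_s=1200.0))
```

The study's point is that this K reflects sediment below the permeameter bottom as well; when a low-K layer sits just beneath the tube, the formula above can underestimate the true K inside the tube by the factors quoted in the abstract.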
CPAchecker: A Tool for Configurable Software Verification
Beyer, Dirk
2009-01-01
Configurable software verification is a recent concept for expressing different program analysis and model checking approaches in one single formalism. This paper presents CPAchecker, a tool and framework that aims at easy integration of new verification components. Every abstract domain, together with the corresponding operations, is required to implement the interface of configurable program analysis (CPA). The main algorithm is configurable to perform a reachability analysis on arbitrary combinations of existing CPAs. The major design goal during the development was to provide a framework for developers that is flexible and easy to extend. We hope that researchers find it convenient and productive to implement new verification ideas and algorithms using this platform and that it advances the field by making it easier to perform practical experiments. The tool is implemented in Java and runs as command-line tool or as Eclipse plug-in. We evaluate the efficiency of our tool on benchmarks from the software mo...
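The configurable-program-analysis idea described above, an abstract domain plus transfer, merge, and stop operators driving one reachability loop, can be sketched compactly. This is my own toy rendering of the published CPA algorithm, not CPAchecker's actual Java interface; the instantiation below uses "merge-sep" (never merge) and stop-by-membership, which reduces to explicit-state search.

```python
def cpa_algorithm(init_states, transfer, merge, stop):
    # Core CPA reachability loop: the merge and stop operators are
    # configurable, so the same loop covers model checking (merge-sep)
    # and classic data-flow analysis (merge-join).
    reached = set(init_states)
    waitlist = list(init_states)
    while waitlist:
        state = waitlist.pop()
        for succ in transfer(state):
            # Let existing reached states absorb/widen the successor.
            updated = [(old, merge(succ, old)) for old in reached
                       if merge(succ, old) != old]
            for old, new in updated:
                reached.discard(old)
                reached.add(new)
                waitlist.append(new)
            if not stop(succ, reached):  # succ not yet covered
                reached.add(succ)
                waitlist.append(succ)
    return reached

# Instantiation: toy program over (pc, parity-of-x) abstract states.
def transfer(state):
    pc, parity = state
    if pc == 0:                       # loop head: x = x + 1
        return [(1, 1 - parity)]
    if pc == 1:                       # branch back or exit
        return [(0, parity), (2, parity)]
    return []                         # pc == 2: end

merge_sep = lambda new, old: old               # "merge-sep": keep states separate
stop_member = lambda s, reached: s in reached  # covered iff already reached

reached = cpa_algorithm([(0, 0)], transfer, merge_sep, stop_member)
print(sorted(reached))
```

Swapping in a joining merge and a subsumption-based stop over the same loop is what lets one framework express both abstract-domain analyses and model checking.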
Embedded software verification and debugging
Winterholer, Markus
2017-01-01
This book provides comprehensive coverage of verification and debugging techniques for embedded software, which is frequently used in safety critical applications (e.g., automotive), where failures are unacceptable. Since the verification of complex systems needs to encompass the verification of both hardware and embedded software modules, this book focuses on verification and debugging approaches for embedded software with hardware dependencies. Coverage includes the entire flow of design, verification and debugging of embedded software and all key approaches to debugging, dynamic, static, and hybrid verification. This book discusses the current, industrial embedded software verification flow, as well as emerging trends with focus on formal and hybrid verification and debugging approaches. Includes in a single source the entire flow of design, verification and debugging of embedded software; Addresses the main techniques that are currently being used in the industry for assuring the quality of embedded softw...
Symmetries of homogeneous cosmologies
Cotsakis, S; Pantazi, H; Cotsakis, Spiros; Leach, Peter; Pantazi, Hara
1998-01-01
We reformulate the dynamics of homogeneous cosmologies with a scalar field matter source with an arbitrary self-interaction potential in the language of jet bundles and extensions of vector fields. In this framework, the Bianchi-scalar field equations become subsets of the second Bianchi jet bundle, $J^2$, and every Bianchi cosmology is naturally extended to live on a variety of $J^2$. We are interested in the existence and behaviour of extensions of arbitrary Bianchi-Lie and variational vector fields acting on the Bianchi variety and accordingly we classify all such vector fields corresponding to both Bianchi classes $A$ and $B$. We give examples of functions defined on Bianchi jet bundles which are constant along some Bianchi models (first integrals) and use these to find particular solutions in the Bianchi total space. We discuss how our approach could be used to shed new light to questions like isotropization and the nature of singularities of homogeneous cosmologies by examining the behaviour of the vari...
Energy Technology Data Exchange (ETDEWEB)
Jones, J.L.; Calley, M.B.; Capps, E.L.; Zeigler, S.L.; Galyean, W.J.; Novack, S.D.; Smith, C.L.; Wolfram, L.M. [Lockheed Idaho Technologies Co., Idaho Falls, ID (United States)
1995-03-01
A verification and validation (V&V) process has been performed for the System Analysis Programs for Hands-on Integrated Reliability Evaluation (SAPHIRE) Version 5.0. SAPHIRE is a set of four computer programs that NRC developed for performing probabilistic risk assessments. They allow an analyst to perform many of the functions necessary to create, quantify, and evaluate the risk associated with a facility or process being analyzed. The programs are Integrated Reliability and Risk Analysis System (IRRAS), System Analysis and Risk Assessment (SARA), Models And Results Database (MAR-D), and Fault tree, Event tree, and Piping and instrumentation diagram (FEP) graphical editor. The intent of this program is to perform a V&V of successive versions of SAPHIRE. Previous efforts have been the V&V of SAPHIRE Version 4.0. The SAPHIRE 5.0 V&V plan is based on the SAPHIRE 4.0 V&V plan with revisions to incorporate lessons learned from the previous effort. Also, the SAPHIRE 5.0 vital and nonvital test procedures are based on the test procedures from SAPHIRE 4.0 with revisions to include the new SAPHIRE 5.0 features as well as to incorporate lessons learned from the previous effort. Most results from the testing were acceptable; however, some discrepancies between expected code operation and actual code operation were identified. Modifications made to SAPHIRE are identified.
Indian Academy of Sciences (India)
DEVENDRA P MISHRA; ANCHAL SRIVASTAVA; R K SHUKLA
2017-07-01
This paper describes the spectroscopic ($^{1}$H and $^{13}$C NMR, FT-IR and UV–Visible), chemical, nonlinear optical and thermodynamic properties of D-Myo-Inositol using quantum chemical technique and its experimental verification. The structural parameters of the compound are determined from the optimized geometry by B3LYP method with $6-311++G(d,p)$ basis set. It was found that the optimized parameters thus obtained are almost in agreement with the experimental ones. A detailed interpretation of the infrared spectra of D-Myo-Inositol is also reported in the present work. After optimization, the proton and carbon NMR chemical shifts of the studied compound are calculated using GIAO and 6-311++G(d,p) basis set. The search of organic materials with improved charge transfer properties requires precise quantum chemical calculations of space-charge density distribution, state and transition dipole moments and HOMO–LUMO states. The nature of the transitions in the observed UV–Visible spectrum of the compound has been studied by the time-dependent density functional theory (TD-DFT). The global reactivity descriptors like chemical potential, electronegativity, hardness, softness and electrophilicity index, have been calculated using DFT. The thermodynamic calculation related to the title compound was also performed at $B3LYP/6-311++G(d,p)$ level of theory. The standard statistical thermodynamic functions like heat capacity at constant pressure, entropy and enthalpy change were obtained from the theoretical harmonic frequencies of the optimized molecule. It is observed that the values of heat capacity, entropy and enthalpy increase with increase in temperature from 100 to 1000 K, which is attributed to the enhancement of molecular vibration with the increase in temperature.
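The global reactivity descriptors named above have standard frontier-orbital definitions via Koopmans' approximation (I = -E_HOMO, A = -E_LUMO). A sketch with hypothetical orbital energies, not the paper's B3LYP values; note that softness conventions vary (1/eta vs. 1/(2*eta)), and 1/(2*eta) is assumed here:

```python
def reactivity_descriptors(e_homo_ev, e_lumo_ev):
    """Global reactivity descriptors from frontier-orbital energies (eV)
    via Koopmans' approximation. Softness taken as S = 1/(2*eta)."""
    ionization = -e_homo_ev            # I = -E_HOMO
    affinity = -e_lumo_ev              # A = -E_LUMO
    electronegativity = (ionization + affinity) / 2.0
    chemical_potential = -electronegativity
    hardness = (ionization - affinity) / 2.0
    softness = 1.0 / (2.0 * hardness)
    electrophilicity = chemical_potential ** 2 / (2.0 * hardness)
    return {
        "ionization": ionization,
        "affinity": affinity,
        "electronegativity": electronegativity,
        "chemical_potential": chemical_potential,
        "hardness": hardness,
        "softness": softness,
        "electrophilicity": electrophilicity,
    }

# Hypothetical frontier energies (eV), for illustration only:
print(reactivity_descriptors(-7.0, -1.0))
```

With E_HOMO = -7 eV and E_LUMO = -1 eV this gives chemical potential -4 eV, hardness 3 eV and electrophilicity 8/3 eV, illustrating how a large HOMO-LUMO gap translates into high hardness and low softness.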
A Computationally Based Approach to Homogenizing Advanced Alloys
Energy Technology Data Exchange (ETDEWEB)
Jablonski, P D; Cowen, C J
2011-02-27
We have developed a computationally based approach to optimizing the homogenization heat treatment of complex alloys. The Scheil module within the Thermo-Calc software is used to predict the as-cast segregation present within alloys, and DICTRA (Diffusion Controlled TRAnsformations) is used to model the homogenization kinetics as a function of time, temperature and microstructural scale. We will discuss this approach as it is applied to both Ni based superalloys as well as the more complex (computationally) case of alloys that solidify with more than one matrix phase as a result of segregation. Such is the case typically observed in martensitic steels. With these alloys it is doubly important to homogenize them correctly, especially at the laboratory scale, since they are austenitic at high temperature and thus constituent elements will diffuse slowly. The computationally designed heat treatment and the subsequent verification with real castings are presented.
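The full Scheil/DICTRA workflow requires the Thermo-Calc software, but the scaling it exploits can be shown with the textbook single-sinusoid homogenization model: a segregation profile of wavelength lambda (the dendrite arm spacing) decays as exp(-4*pi^2*D*t/lambda^2). All numbers below (D0, Q, spacing, temperature) are hypothetical, roughly representative of a substitutional solute in an fcc matrix, not values from this work:

```python
import math

R = 8.314  # gas constant, J/(mol K)

def diffusivity(d0_m2s, q_j_mol, temp_k):
    # Arrhenius diffusivity D = D0 * exp(-Q / (R*T)).
    return d0_m2s * math.exp(-q_j_mol / (R * temp_k))

def homogenization_time_h(spacing_m, d_m2s, residual=0.01):
    """Hours for a sinusoidal segregation profile of wavelength `spacing`
    to decay to `residual` of its initial amplitude:
        delta(t) = exp(-4*pi^2*D*t / lambda^2)
    A textbook single-wavelength estimate, not a Scheil/DICTRA result."""
    t_s = -spacing_m ** 2 * math.log(residual) / (4 * math.pi ** 2 * d_m2s)
    return t_s / 3600.0

d = diffusivity(d0_m2s=1.0e-4, q_j_mol=280e3, temp_k=1473.0)
print(homogenization_time_h(spacing_m=100e-6, d_m2s=d))  # anneal time, hours
```

Because the time scales with the square of the arm spacing, halving the dendrite arm spacing cuts the required anneal time by a factor of four, which is why microstructural scale enters the DICTRA calculation explicitly.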
Akhundov, V. M.
2017-01-01
Results of an analysis of form changes of a toroidal body highly filled with fibers at large torsional deformations and rotational motions are presented. The body is reinforced in the meridional direction. The investigation was carried out by using the two-level carcass theory of fibrous media at large deformations, according to which the macroscopic fields of a reinforced body are determined by its internal fields. These fields are represented by the material configurations of nodal material blocks of the body, for which, on the basis of the model of a piecewise homogeneous medium, boundary-value problems of the micromechanical level of the theory are solved. The results obtained are compared with those for a homogeneous body. The congruent deformation of the homogeneous body at which its initial form and dimensions are practically restored upon superposition of torsion and rotation is determined.
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with the other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect the malicious sensors by over 90% when sensors in the network have five or more neighbors.
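The mutually-shared-region idea can be illustrated geometrically: a claimed location is plausible only if it lies in the region both parties can reach, i.e. the overlap of the claimant's and the verifier's radio disks. The check below is a toy stand-in of my own, not the actual MSRLV protocol, and all coordinates and ranges are hypothetical:

```python
import math

def dist(a, b):
    # Euclidean distance between two 2D points.
    return math.hypot(a[0] - b[0], a[1] - b[1])

def in_shared_region(claimed, claimant_pos_est, verifier, radio_range):
    """Toy region-based location check: accept a claimed location only if
    it falls inside the overlap of the claimant's (locally estimated) and
    the verifier's radio disks. A geometric stand-in, not MSRLV itself."""
    return (dist(claimed, claimant_pos_est) <= radio_range and
            dist(claimed, verifier) <= radio_range)

verifier = (0.0, 0.0)
honest_claim, spoofed_claim = (20.0, 10.0), (90.0, 0.0)
print(in_shared_region(honest_claim, (18.0, 12.0), verifier, radio_range=50.0))   # True
print(in_shared_region(spoofed_claim, (18.0, 12.0), verifier, radio_range=50.0))  # False
```

A sensor claiming a position far outside the shared region (the spoofed claim above) is rejected without involving any third sensor, which is the overhead reduction the scheme targets.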
Formal Verification of NTRUEncrypt Scheme
Directory of Open Access Journals (Sweden)
Gholam Reza Moghissi
2016-04-01
In this paper we explore a mechanized verification of the NTRUEncrypt scheme, with the formal proof system Isabelle/HOL. More precisely, the functional correctness of this algorithm, in its reduced form, is formally verified with computer support. We show that this scheme is correct, which is a necessary condition for the usefulness of any cryptographic encryption scheme. Besides, we present a convenient and application-specific formalization of the NTRUEncrypt scheme in the Isabelle/HOL system that can be used in further study around the functional and security analysis of the NTRUEncrypt family.
A verification environment for bigraphs
DEFF Research Database (Denmark)
Perrone, Gian David; Debois, Søren; Hildebrandt, Thomas
2013-01-01
We present the BigMC tool for bigraphical reactive systems that may be instantiated as a verification tool for any formalism or domain-specific modelling language encoded as a bigraphical reactive system. We introduce the syntax and use of BigMC, and exemplify its use with two small examples......: a textbook “philosophers” example, and an example motivated by a ubiquitous computing application. We give a tractable heuristic with which to approximate interference between reaction rules, and prove this analysis to be safe. We provide a mechanism for state reachability checking of bigraphical reactive...
Wind gust warning verification
Primo, Cristina
2016-07-01
Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to alert the public to extreme weather situations that might occur and lead to damage and losses. By forecasting these extreme events, meteorological centres help their potential users prevent the damage or losses they might otherwise suffer. However, verifying these warnings requires specific methods, not only because such events happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that may arise in warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are then applied to a real-life example, the verification of wind gust warnings at the German meteorological service ("Deutscher Wetterdienst"). Finally, the results obtained are discussed.
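A common starting point for warning verification is the 2x2 contingency table of hits, misses, and false alarms, with an event counted as a hit when it falls inside a warning's time window. A minimal sketch (the function name and scoring conventions are illustrative; operational schemes are more elaborate):

```python
def verify_warnings(warning_windows, event_times):
    """Score warnings against observed events via a 2x2 contingency table.
    warning_windows: list of (start, end) times a warning was in force.
    event_times: times at which the extreme event was actually observed."""
    hits = sum(any(s <= t <= e for s, e in warning_windows) for t in event_times)
    misses = len(event_times) - hits
    false_alarms = sum(not any(s <= t <= e for t in event_times)
                       for s, e in warning_windows)
    pod = hits / (hits + misses) if hits + misses else float("nan")
    far = false_alarms / (hits + false_alarms) if hits + false_alarms else float("nan")
    return pod, far  # probability of detection, false-alarm ratio

# Two warnings, three events; one event falls outside any warning window,
# and one warning covers no event.
pod, far = verify_warnings([(0, 6), (10, 12)], [2, 5, 8])
print(round(pod, 3), round(far, 3))  # 0.667 0.333
```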
Nuclear disarmament verification
Energy Technology Data Exchange (ETDEWEB)
DeVolpi, A.
1993-12-31
Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.
Energy Technology Data Exchange (ETDEWEB)
Chang Liyun; Ho, Sheng-Yow; Chui, Chen-Shou; Du, Yi-Chun; Chen Tainsong [Institute of Biochemical Engineering, National Cheng-Kung University, Tainan 701, Taiwan (China) and Department of Radiation Oncology, Sinlau Christian Hospital, Tainan 701, Taiwan (China); Department of Radiation Oncology, Sinlau Christian Hospital, Tainan 701, Taiwan (China); Department of Medical Physics, Sun Yat-Sen Cancer Center, Taipei 112, Taiwan (China); Institute of Biomedical Engineering, National Cheng-Kung University, Tainan 701, Taiwan (China)
2009-09-15
A method is presented that employs standard linac QA tools to verify the accuracy of the film reconstruction algorithms used in a brachytherapy planning system. Verification of reconstruction techniques is important, as suggested in ESTRO Booklet 8: "The institution should verify the full process of any reconstruction technique employed clinically." Error modeling was also performed to analyze seed-position errors. The "isocentric beam checker" device was used in this work. It has a two-dimensional array of steel balls embedded on its surface. The checker was placed on the simulator couch with its center ball coincident with the simulator isocenter and one axis of its cross marks parallel to the axis of gantry rotation. The gantry of the simulator was rotated to make the checker behave like a three-dimensional array of balls. Three algorithms used in the ABACUS treatment planning system were tested: orthogonal film, 2-films-with-variable-angle, and 3-films-with-variable-angle. After exposing and digitizing the films, the position of each steel ball on the checker was reconstructed and compared to its true position, which can be accurately calculated. The results showed that the error depends on the object-isocenter distance, but not on the magnification of the object. The averaged errors were less than 1 mm, within the tolerance level defined by Roué et al. ["The EQUAL-ESTRO audit on geometric reconstruction techniques in brachytherapy," Radiother. Oncol. 78, 78-83 (2006)]. However, according to the error modeling, the theoretical error would be greater than 2 mm if the objects were located more than 20 cm away from the isocenter with a 0.5 deg reading error of the gantry and collimator angles. Thus, in addition to carefully performing the QA of the gantry and collimator angle indicators, it is suggested that the patient, together with the applicators or seeds inside, should be placed close to
Verification and validation benchmarks.
Energy Technology Data Exchange (ETDEWEB)
Oberkampf, William Louis; Trucano, Timothy Guy
2007-02-01
Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operation of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
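A classic code-verification benchmark of the kind recommended here checks the observed convergence order of a discretization against a manufactured or analytical solution. A sketch for a second-order finite-difference Poisson solver, assuming u = sin(πx) as the exact solution (the solver and grid sizes are illustrative):

```python
import numpy as np

def solve_poisson(n):
    """Solve -u'' = f on (0,1), u(0)=u(1)=0, with the standard second-order
    central difference; f is chosen so the exact solution is sin(pi x)."""
    h = 1.0 / n
    x = np.linspace(0, 1, n + 1)[1:-1]          # interior nodes
    f = np.pi**2 * np.sin(np.pi * x)            # manufactured source term
    A = (np.diag(np.full(n - 1, 2.0)) -
         np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))  # max-norm error

# Halving h should divide the error by ~4; the observed order should be ~2.
e1, e2 = solve_poisson(32), solve_poisson(64)
order = np.log2(e1 / e2)
print(round(order, 2))  # ≈ 2.0 for a correctly implemented second-order scheme
```

A coding bug typically shows up as an observed order below the formal order of the scheme, which is exactly what such a benchmark is designed to detect.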
Application of particle size analysis in the milk homogenization process
Institute of Scientific and Technical Information of China (English)
马姜; 高原; 王红静
2015-01-01
Objective: To ensure the effectiveness of homogenization and find an effective method to monitor milk particle sizes before and after homogenization. Methods: The milk homogenization process was monitored using a HORIBA LA-950 laser particle-size analyzer. Results: Particle-size distribution charts were obtained quickly by laser measurement, clearly showing that particle sizes after homogenization were markedly smaller than before homogenization. Conclusion: In the milk homogenization process, laser particle-size measurement proved to be a quick and effective method for real-time monitoring of the particle-size distribution.
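A typical reduction of a laser-diffraction result is the median diameter D50 of the volume-weighted distribution. A sketch with made-up bin data (the numbers are illustrative, not HORIBA output):

```python
def d50(diameters, volume_fractions):
    """Median particle diameter (D50) from a volume-weighted histogram.
    diameters: bin centres in micrometres, ascending; fractions sum to 1."""
    cumulative = 0.0
    for d, v in zip(diameters, volume_fractions):
        cumulative += v
        if cumulative >= 0.5:          # first bin reaching half the volume
            return d
    return diameters[-1]

# Illustrative (made-up) data: homogenization shifts volume to smaller bins.
bins = [0.5, 1, 2, 4, 8]                       # micrometres
before = [0.05, 0.10, 0.20, 0.35, 0.30]
after = [0.25, 0.40, 0.25, 0.08, 0.02]
print(d50(bins, before), d50(bins, after))     # 4 1
```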
Open verification methodology cookbook
Glasser, Mark
2009-01-01
Functional verification is an art as much as a science. It requires not only creativity and cunning, but also a clear methodology to approach the problem. The Open Verification Methodology (OVM) is a leading-edge methodology for verifying designs at multiple levels of abstraction. It brings together ideas from electrical, systems, and software engineering to provide a complete methodology for verifying large scale System-on-Chip (SoC) designs. OVM defines an approach for developing testbench architectures so they are modular, configurable, and reusable. This book is designed to help both novic
Verification test report on a solar heating and hot water system
1978-01-01
Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification covers performance, efficiencies, and the various methods used (such as similarity, analysis, inspection, and test) that are applicable to satisfying the verification requirements.
Energy Technology Data Exchange (ETDEWEB)
Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)
2016-06-15
In Part I of this paper, the two-temperature homogenized model for the fully ceramic microencapsulated fuel, in which tristructural isotropic particles are randomly dispersed in a fine lattice stochastic structure, was discussed. In this model, the fuel-kernel and silicon carbide matrix temperatures are distinguished. Moreover, the obtained temperature profiles are more realistic than those obtained using other models. Using the temperature-dependent thermal conductivities of uranium nitride and the silicon carbide matrix, temperature-dependent homogenized parameters were obtained. In Part II of the paper, coupled with the COREDAX code, a reactor core loaded by fully ceramic microencapsulated fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure is analyzed via a two-temperature homogenized model at steady and transient states. The results are compared with those from harmonic- and volumetric-average thermal conductivity models; i.e., we compare keff eigenvalues, power distributions, and temperature profiles in the hottest single channel at a steady state. At transient states, we compare total power, average energy deposition, and maximum temperatures in the hottest single channel obtained by the different thermal analysis models. The different thermal analysis models and the availability of fuel-kernel temperatures in the two-temperature homogenized model for Doppler temperature feedback lead to significant differences.
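The harmonic- and volumetric-average comparison models mentioned above can be illustrated with a two-phase mixture: these averages are the classical series and parallel bounds on the effective thermal conductivity. The conductivities and volume fractions below are illustrative, not FCM design values:

```python
def volumetric_avg(conductivities, fractions):
    # Arithmetic (parallel/Voigt) mixture: upper bound on k_eff.
    return sum(ki * vi for ki, vi in zip(conductivities, fractions))

def harmonic_avg(conductivities, fractions):
    # Harmonic (series/Reuss) mixture: lower bound on k_eff.
    return 1.0 / sum(vi / ki for ki, vi in zip(conductivities, fractions))

# Illustrative values only (W/m-K): conductive SiC matrix, less-conductive kernel.
k = [30.0, 5.0]        # [SiC matrix, fuel kernel]
vol = [0.6, 0.4]       # volume fractions
print(volumetric_avg(k, vol))            # 20.0
print(round(harmonic_avg(k, vol), 2))    # 10.0
```

The factor-of-two gap between the bounds in this toy case shows why the choice of averaging model, and hence the homogenized model of the paper, matters for predicted temperatures.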
Projective duality and homogeneous spaces
Tevelev, E A
2006-01-01
Projective duality is a very classical notion naturally arising in various areas of mathematics, such as algebraic and differential geometry, combinatorics, topology, analytical mechanics, and invariant theory, and the results in this field were until now scattered across the literature. Thus the appearance of a book specifically devoted to projective duality is a long-awaited and welcome event. Projective Duality and Homogeneous Spaces covers a vast and diverse range of topics in the field of dual varieties, ranging from differential geometry to Mori theory and from topology to the theory of algebras. It gives a very readable and thorough account and the presentation of the material is clear and convincing. For the most part of the book the only prerequisites are basic algebra and algebraic geometry. This book will be of great interest to graduate and postgraduate students as well as professional mathematicians working in algebra, geometry and analysis.
Directory of Open Access Journals (Sweden)
Ahmad A. Mahfouz
2013-04-01
Accurate control of motion is a fundamental concern in mechatronics applications, where placing an object at the exact desired location, with the exact required force and torque, at the correct time is essential for efficient system operation. Accurate modeling, simulation, and dynamic analysis of actuators for mechatronics motion control applications is therefore a major concern. This paper addresses the different approaches used to derive mathematical models of the basic open-loop electric DC motor system used in mechatronics motion control applications, to build the corresponding Simulink models, and to analyze the system dynamics, in particular for the design, construction, and control of a mechatronics robot arm with a single degree of freedom, with verification in MATLAB/Simulink. To simplify and accelerate the process of DC motor sizing, selection, dynamic analysis, and evaluation for different motion applications, different mathematical models in terms of output position, speed, current, acceleration, and torque, as well as the corresponding Simulink models, a supporting MATLAB m-file, and general function-block models, are introduced. The introduced models were verified using MATLAB/Simulink. These models are intended for research purposes as well as for application in the educational process. This paper is Part I of the writers' research on mechatronics motion control; the ultimate goal of this research is to address the design, modeling, simulation, dynamic analysis, and controller selection and design of a mechatronics single-joint robot arm, where an electric DC motor is used and a control system is selected and designed to move the robot arm to a desired output position θ corresponding to an applied input voltage V_in while satisfying all required design specifications.
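The open-loop armature-controlled DC motor model referred to above couples an electrical and a mechanical first-order ODE. A minimal sketch in Python rather than Simulink, with illustrative parameter values, integrated by forward Euler:

```python
# Armature-controlled DC motor:  L di/dt = V - R i - Ke w
#                                J dw/dt = Kt i - b w
R, L, Ke, Kt, J, b = 1.0, 0.5, 0.01, 0.01, 0.01, 0.1   # illustrative values
V = 12.0                       # step input voltage
i = w = theta = 0.0            # current, angular speed, shaft position
dt = 1e-3
for _ in range(20_000):        # 20 s of simulated time, forward Euler
    di = (V - R * i - Ke * w) / L
    dw = (Kt * i - b * w) / J
    i += di * dt
    w += dw * dt
    theta += w * dt            # output position of the open-loop system

# Steady-state speed: w = Kt*V / (R*b + Ke*Kt) ≈ 1.199 rad/s
print(round(w, 3))
```

In a controller-design workflow, the same two equations would be entered as a transfer function or state-space block in Simulink; the Euler loop here is only a quick numerical check of the model.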
Energy Technology Data Exchange (ETDEWEB)
Lee, Yoon Hee; Cho, Bum Hee; Cho, Nam Zin [Dept. of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Daejeon (Korea, Republic of)
2016-06-15
As a type of accident-tolerant fuel, fully ceramic microencapsulated (FCM) fuel was proposed after the Fukushima accident in Japan. The FCM fuel consists of tristructural isotropic particles randomly dispersed in a silicon carbide (SiC) matrix. For a fuel element with such high heterogeneity, we have proposed a two-temperature homogenized model using the particle transport Monte Carlo method for the heat conduction problem. This model distinguishes between fuel-kernel and SiC matrix temperatures. Moreover, the obtained temperature profiles are more realistic than those of other models. In Part I of the paper, homogenized parameters for the FCM fuel in which tristructural isotropic particles are randomly dispersed in the fine lattice stochastic structure are obtained by (1) matching steady-state analytic solutions of the model with the results of particle transport Monte Carlo method for heat conduction problems, and (2) preserving total enthalpies in fuel kernels and SiC matrix. The homogenized parameters have two desirable properties: (1) they are insensitive to boundary conditions such as coolant bulk temperatures and thickness of cladding, and (2) they are independent of operating power density. By performing the Monte Carlo calculations with the temperature-dependent thermal properties of the constituent materials of the FCM fuel, temperature-dependent homogenized parameters are obtained.
Moricke, E.; Lappenschaar, G.A.; Swinkels, S.H.N.; Rommelse, N.N.J.; Buitelaar, J.K.
2013-01-01
Precursors of child psychiatric disorders are often present in infancy, but little is known about the prevalence and course of general psychopathology in population-based samples of children 0-3 years. We examined whether homogeneous behavioural and developmental profiles could be identified in chil
The MODUS approach to formal verification
DEFF Research Database (Denmark)
Brewka, Lukasz Jerzy; Soler, José; Berger, Michael Stübert
2014-01-01
in the process of providing competitive products Objectives: In relation to this, MODUS (Method and supporting toolset advancing embedded systems quality) project aims to provide small and medium-sized businesses ways to improve their position in the embedded market through a pragmatic and viable solution...... Methods/Approach: This paper will describe the MODUS project with focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of the characteristics of the system and by controlling the choice of the existing opensource model...... verification engines, model verification producing inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; the familiarity with tools, the ease of use and compatibility/interoperability remain among the most important criteria when selecting the development...
Energy Technology Data Exchange (ETDEWEB)
Cox, Daryl [ORNL
2009-05-01
The United States Department of Energy, Industrial Technologies Program has invested in emerging Process Design and Optimization Technologies (PDOT) to encourage the development of new initiatives that might result in energy savings in industrial processes. Gas-fired furnaces present a harsh environment, often making accurate determination of correct air/fuel ratios a challenge. Operation with the correct air/fuel ratio, and especially with balanced burners in multi-burner combustion equipment, can result in improved system efficiency, yielding lower operating costs and reduced emissions. Flame Image Analysis (FIA) offers a way to improve individual burner performance by identifying and correcting fuel-rich burners. The anticipated benefits of this technology are improved furnace thermal efficiency and lower NOx emissions. Independent validation and verification (V&V) testing of the FIA technology was performed at Missouri Forge, Inc., in Doniphan, Missouri by Environ International Corporation (V&V contractor) and Enterprise Energy and Research (EE&R), the developer of the technology. The test site was selected by the technology developer and accepted by Environ after a meeting held at Missouri Forge. As stated in the solicitation for the V&V contractor: "The objective of this activity is to provide independent verification and validation of the performance of this new technology when demonstrated in industrial applications. A primary goal for the V&V process will be to independently evaluate if this technology, when demonstrated in an industrial application, can be utilized to save a significant amount of the operating energy cost. The Seller will also independently evaluate the other benefits of the demonstrated technology that were previously identified by the developer, including those related to product quality, productivity, environmental impact, etc." A test plan was provided by the technology developer and is included as an appendix to the summary report
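A common field check related to air/fuel-ratio tuning infers excess air from the measured dry flue-gas O2 using the rule of thumb EA ≈ O2/(20.9 − O2). A sketch (this approximation is general combustion practice, not part of the FIA method itself):

```python
def excess_air_pct(o2_pct):
    """Approximate excess air (%) from dry flue-gas O2 (%), using the
    common rule of thumb EA ~ O2 / (20.9 - O2)."""
    return 100.0 * o2_pct / (20.9 - o2_pct)

# A burner reading 3% O2 runs at roughly 17% excess air; a reading near
# 0% O2 with CO present would indicate the kind of fuel-rich burner that
# flame-image analysis aims to identify.
print(round(excess_air_pct(3.0), 1))  # 16.8
```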
Quantitative Verification in Practice
Haverkort, Boudewijn R.; Katoen, Joost-Pieter; Larsen, Kim G.
2010-01-01
Soon after the birth of model checking, the first theoretical achievements were reported on the automated verification of quantitative system aspects such as discrete probabilities and continuous time. These theories have been extended in various dimensions, such as continuous probabilities
Technical analysis and verification improvement of domestic voltage regulator tubes
Institute of Scientific and Technical Information of China (English)
李洪艳
2014-01-01
Taking integrated-circuit voltage regulator tubes as an example, this paper studies the selection of domestically produced substitute devices. Through long-term investigation and experimental verification, the operating principle of the voltage regulator circuit was analyzed and tests were repeated; based on analysis of the test data, off-the-shelf products were selected from domestic manufacturers with mature production processes and strong qualifications. Through a series of process-test campaigns, domestic voltage regulator tubes were shown to meet the product's usage requirements, and a substantive breakthrough was achieved in the domestic substitution of voltage regulator tubes.
Energy Technology Data Exchange (ETDEWEB)
Uwaba, Tomoyuki, E-mail: uwaba.tomoyuki@jaea.go.jp [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ito, Masahiro; Nemoto, Junichi [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan); Ichikawa, Shoichi [Japan Atomic Energy Agency, 2-1, Shiraki, Tsuruga-shi, Fukui 919-1279 (Japan); Katsuyama, Kozo [Japan Atomic Energy Agency, 4002, Narita-cho, Oarai-machi, Ibaraki 311-1393 (Japan)
2014-09-15
The BAMBOO computer code was verified by results for the out-of-pile bundle compression test with large diameter pin bundle deformation under the bundle–duct interaction (BDI) condition. The pin diameters of the examined test bundles were 8.5 mm and 10.4 mm, which are targeted as preliminary fuel pin diameters for the upgraded core of the prototype fast breeder reactor (FBR) and for demonstration and commercial FBRs studied in the FaCT project. In the bundle compression test, bundle cross-sectional views were obtained from X-ray computer tomography (CT) images and local parameters of bundle deformation such as pin-to-duct and pin-to-pin clearances were measured by CT image analyses. In the verification, calculation results of bundle deformation obtained by the BAMBOO code analyses were compared with the experimental results from the CT image analyses. The comparison showed that the BAMBOO code reasonably predicts deformation of large diameter pin bundles under the BDI condition by assuming that pin bowing and cladding oval distortion are the major deformation mechanisms, the same as in the case of small diameter pin bundles. In addition, the BAMBOO analysis results confirmed that cladding oval distortion effectively suppresses BDI in large diameter pin bundles as well as in small diameter pin bundles.
Energy Technology Data Exchange (ETDEWEB)
Ruedig, Elizabeth [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)
2017-08-30
Public Law 105-119 directs the U.S. Department of Energy (DOE) to convey or transfer parcels of land to the Incorporated County of Los Alamos or their designees and to the Department of the Interior, Bureau of Indian Affairs, in trust for the Pueblo de San Ildefonso. Los Alamos National Security is tasked to support DOE in conveyance and/or transfer of identified land parcels no later than September 2022. Under DOE Order 458.1, Radiation Protection of the Public and the Environment (O458.1, 2013) and Los Alamos National Laboratory (LANL or the Laboratory) implementing Policy 412 (P412, 2014), real property with the potential to contain residual radioactive material must meet the criteria for clearance and release to the public. This Sampling and Analysis Plan (SAP) describes a second investigation of Tract A-18-2 for the purpose of verifying the previous sampling results (LANL 2017). This sampling plan requires 18 project-specific soil samples for use in radiological clearance decisions consistent with LANL Procedure ENV-ES-TP-238 (2015a) and guidance in the Multi-Agency Radiation Survey and Site Investigation Manual (MARSSIM, 2000). The sampling work will be conducted by LANL, and samples will be evaluated by a LANL-contracted independent lab. In addition, there will be federal review (verification) of all steps of the sampling process.
Using timing information in speaker verification
CSIR Research Space (South Africa)
Van Heerden, CJ
2005-11-01
This paper presents an analysis of temporal information as a feature for use in speaker verification systems. The relevance of temporal information in a speaker's utterances is investigated, both with regard to improving the robustness of modern...
9 CFR 417.8 - Agency verification.
2010-01-01
... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part...
Energy Technology Data Exchange (ETDEWEB)
Weir, A.; Lopasso, E.; Gho, C. [Departamento de Ingenieria Nuclear, Comision Nacional de Energia Atomica, Av. Bustillo 9500 Centro Atomico Bariloche, 8400 (Argentina)]. e-mail: weira@ib.cnea.gov.ar
2007-07-01
{sup 99m}Tc is the most widely used radioisotope in nuclear medicine, employed in 80% of nuclear medicine procedures worldwide, owing to its practically ideal characteristics for diagnostics. {sup 99m}Tc is obtained by decay of {sup 99}Mo, which can be produced by irradiating targets enriched in {sup 98}Mo, or as a fission product by irradiating uranium targets or by means of homogeneous solution reactors. The reactor model used in the neutronic analysis has a liquid fuel composed of uranyl nitrate dissolved in water with an addition of nitric acid. This solution is contained in a cylindrical stainless steel vessel reflected with light water. The reactor is cooled by means of a helical heat exchanger immersed in the fuel solution. Heat is removed from the fuel by natural convection, while the circulation of the water inside the exchanger is forced. The reactor control system consists of six independent cadmium rods with water followers. An auxiliary control system could be the level of the fuel solution inside the container tank, but it was not included in the model under study. The variations of system reactivity due to different phenomena were studied. An important factor during normal operation of the reactor is the variation of temperature, leading to volumetric expansion of the fuel and spectral effects within it. Another phenomenon causing changes in reactivity is the variation of the uranium concentration in the fuel solution. An important phenomenon in this type of reactor is the void fraction in the liquid core due to radiolysis and possible boiling of the water in the fuel solution. Some possible cases of abnormal operation were also studied, such as loss of coolant in the secondary circuit of the heat exchanger and the introduction and evaporation of water in the core. The reactivity variations were studied using the calculation codes MCNP, WIMS
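Reactivity variations such as those studied here are conventionally expressed as ρ = (k − 1)/k and quoted in pcm or in dollars. A sketch (the keff and β_eff values below are illustrative, not results from this analysis):

```python
def reactivity(keff):
    # rho = (k - 1) / k, dimensionless
    return (keff - 1.0) / keff

k = 1.002                        # e.g. from a criticality calculation
rho = reactivity(k)
beta_eff = 0.0065                # illustrative delayed-neutron fraction
print(round(rho * 1e5))          # in pcm (1 pcm = 1e-5): 200
print(round(rho / beta_eff, 2))  # in dollars: 0.31
```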
Hosseini, Hadi S; Dünki, Andreas; Fabech, Jonas; Stauber, Martin; Vilayphiou, Nicolas; Pahr, Dieter; Pretterklieber, Michael; Wandel, Jasmin; Rietbergen, Bert van; Zysset, Philippe K
2017-01-07
Fractures of the distal section of the radius (Colles' fractures) occur earlier in life than other osteoporotic fractures. Therefore, they can be interpreted as a warning signal for later, more deleterious fractures of the vertebral bodies or the femoral neck. In the past decade, the advent of HR-pQCT allowed a detailed architectural analysis of the distal radius and an automated but time-consuming estimation of its strength with linear micro-finite element (μFE) analysis. Recently, a second generation of HR-pQCT scanner (XtremeCT II, SCANCO Medical, Switzerland) with a resolution of 61 μm became available for even more refined biomechanical investigations in vivo. This raises the question of how biomechanical outcome variables compare between the original (LR) and the new (HR) scanner resolution. Accordingly, the aim of this work was to validate experimentally a patient-specific homogenized finite element (hFE) analysis of the distal section of the human radius for the fast prediction of Colles' fracture load based on the latest-generation HR-pQCT. Fourteen pairs of fresh frozen forearms (mean age 77.5 ± 9 years) were scanned intact using the high- (61 μm) and low- (82 μm) resolution protocols that correspond to the new and original HR-pQCT systems. From each forearm, the most distal 20 mm section of the radius was dissected out, scanned with μCT at 16.4 μm, and tested experimentally under compression up to failure for assessment of stiffness and ultimate load. Linear and nonlinear hFE models, together with linear micro-finite element (μFE) models, were then generated based on the μCT and HR-pQCT reconstructions to predict the aforementioned mechanical properties of the 24 sections. Precision errors of the short-term reproducibility of the FE analyses were measured based on repeated scans of 12 sections. The calculated failure loads correlated strongly with those measured in the experiments: accounting for donor as a random factor, the nonlinear hFE provided a
Indian Academy of Sciences (India)
C MAHESH; K GOVINDARAJULU; V BALAKRISHNA MURTHY
2016-06-01
In this study, a homogenization approach is proposed to analyse fibre waviness in predicting the effective thermal conductivities of a composite. Composites with wavy fibres were analysed by the finite element method to establish equivalence between micro- and macro-mechanics principles; thereby, it is possible to minimize the computational effort required to solve the problem through the micro-mechanics approach alone. In the present work, the influence of crest offset and wavy span on the thermal conductivities of the composite for different volume fractions and thermal conductivity mismatch ratios was also studied. It is observed that the homogenization results are in good agreement, with minimal percentage error, with those obtained through the pure micro-mechanics approach, at the cost of lower computational requirements and less processing time for converged solutions.
A Tutorial on Text-Independent Speaker Verification
Bimbot, Frédéric; Bonastre, Jean-François; Fredouille, Corinne; Gravier, Guillaume; Magrin-Chagnolleau, Ivan; Meignier, Sylvain; Merlin, Teva; Ortega-García, Javier; Petrovska-Delacrétaz, Dijana; Reynolds, Douglas A.
2004-12-01
This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step to deal with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications relative to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
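The equal error rate that summarizes a DET curve is the operating point where the miss rate equals the false-alarm rate. A minimal threshold-sweep sketch on toy score lists (the scores are made up; real systems evaluate thousands of trials):

```python
def equal_error_rate(target_scores, impostor_scores):
    """Approximate the EER by sweeping a decision threshold over all scores.
    Returns the error rate where miss and false-alarm rates are closest."""
    thresholds = sorted(set(target_scores) | set(impostor_scores))
    best = None
    for t in thresholds:
        miss = sum(s < t for s in target_scores) / len(target_scores)
        fa = sum(s >= t for s in impostor_scores) / len(impostor_scores)
        if best is None or abs(miss - fa) < abs(best[0] - best[1]):
            best = (miss, fa)
    return (best[0] + best[1]) / 2

targets = [0.9, 0.8, 0.75, 0.6, 0.55]    # genuine-speaker scores (toy data)
impostors = [0.7, 0.5, 0.4, 0.3, 0.2]    # impostor scores (toy data)
print(equal_error_rate(targets, impostors))  # 0.2
```

Score normalization, discussed in the abstract, matters precisely because these rates are computed against a single global threshold.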
A Tutorial on Text-Independent Speaker Verification
Directory of Open Access Journals (Sweden)
Frédéric Bimbot
2004-04-01
Full Text Available This paper presents an overview of a state-of-the-art text-independent speaker verification system. First, an introduction proposes a modular scheme of the training and test phases of a speaker verification system. Then, the speech parameterization most commonly used in speaker verification, namely cepstral analysis, is detailed. Gaussian mixture modeling, which is the speaker modeling technique used in most systems, is then explained. A few speaker modeling alternatives, namely neural networks and support vector machines, are mentioned. Normalization of scores is then explained, as this is a very important step for dealing with real-world data. The evaluation of a speaker verification system is then detailed, and the detection error trade-off (DET) curve is explained. Several extensions of speaker verification are then enumerated, including speaker tracking and segmentation by speakers. Then, some applications of speaker verification are proposed, including on-site applications, remote applications, applications related to structuring audio information, and games. Issues concerning the forensic area are then recalled, as we believe it is very important to inform people about the actual performance and limitations of speaker verification systems. This paper concludes by giving a few research trends in speaker verification for the next couple of years.
Energy Technology Data Exchange (ETDEWEB)
Sundberg, Jan; Wrafter, John; Laendell, Maerta (Geo Innova AB (Sweden)); Back, Paer-Erik; Rosen, Lars (Sweco AB (Sweden))
2008-11-15
This temperature dependence tends to decrease as the thermal conductivity decreases. - Heat capacity: Domains RFM029 and RFM045 have mean heat capacities of 2.06 MJ/(m3K) and 2.15 MJ/(m3K), respectively. - The mean in situ temperatures at 400 m, 500 m and 600 m depth are estimated at 10.5 deg C, 11.6 deg C and 12.8 deg C, respectively, and are therefore unchanged compared to model stage 2.2. - The estimates of the TRC (thermal rock class) proportions in domain RFM029 are considerably more reliable than those for domain RFM045. For the latter, the small number of boreholes in combination with the higher degree of lithological heterogeneity results in rather large uncertainties in the estimated proportions. - The aspect of the thermal model with the highest confidence is the thermal conductivity distribution of domain RFM029, because of its higher degree of lithological and thermal homogeneity compared to domain RFM045. - The aspect of the thermal model with the lowest confidence is the lower tail of the thermal conductivity distribution for rock domain RFM045. This uncertainty is related to the spatial and size distribution of amphibolite in domain RFM045.
2013-08-28
... regulatory program under the Federal Meat Inspection Act (FMIA) (21 U.S.C. 601 et seq.) that is intended to ensure that meat and meat food products distributed in commerce are wholesome; not adulterated; and... ``Pathogen Reduction; Hazard Analysis and Critical Control Point (PR/HACCP) Systems,'' which FSIS...
Energy Technology Data Exchange (ETDEWEB)
Kim, Chang Hyo; Hong, In Seob; Han, Beom Seok; Jeong, Jong Seong [Seoul National University, Seoul (Korea)
2002-03-01
The objective of this project is to verify the neutronics characteristics of the SMART core design by comparing computational results of the MCNAP code with those of the MASTER code. To achieve this goal, we analyze the neutronics characteristics of the SMART core using the MCNAP code and compare the results with those of the MASTER code. We improved the parallel computing module and developed an error analysis module for the MCNAP code. We analyzed the mechanism of error propagation through depletion computation and developed a calculation module for quantifying these errors. We performed depletion analysis for fuel pins and assemblies of the SMART core. We modeled the 3-D structure of the SMART core, accounted for the variation of material compositions due to control-rod operation, and performed depletion analysis of the SMART core. We computed control-rod worths of assemblies and of the reactor core for operation of individual control-rod groups. We computed core reactivity coefficients (MTC, FTC) and compared the results with those of the MASTER code. To verify the error analysis module of the MCNAP code, we analyzed error propagation through depletion of the SMART B-type assembly. 18 refs., 102 figs., 36 tabs. (Author)
Spherical coverage verification
Petkovic, Marko D; Latecki, Longin Jan
2011-01-01
We consider the problem of covering a hypersphere by a set of spherical hypercaps. This problem has numerous practical applications, such as error-correcting codes and the reverse k-nearest-neighbor problem. Using a reduction to a nondegenerate concave quadratic programming (QP) problem, we demonstrate that spherical coverage verification is NP-hard. We propose a recursive algorithm based on reducing the problem to several lower-dimensional subproblems. We test the performance of the proposed algorithm on a number of generated constellations and demonstrate that, despite its exponential worst-case complexity, it is applicable in practice. In contrast, our results indicate that spherical coverage verification using QP solvers that rely on heuristics may, due to numerical instability, produce false positives.
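Since exact verification is NP-hard, a cheap randomized complement is Monte Carlo sampling: draw uniform points on the sphere and test each against every cap. This sketch is not the authors' exact algorithm; it can only certify non-coverage (by finding a witness point outside all caps), never coverage:

```python
import numpy as np

def sampled_coverage(caps, n_samples=20000, dim=3, seed=0):
    """Monte Carlo check. caps: list of (unit_center_vector, half_angle_rad).
    Returns False if a sampled point lies outside every cap; True means no
    counterexample was found (NOT a proof of coverage)."""
    rng = np.random.default_rng(seed)
    pts = rng.normal(size=(n_samples, dim))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # uniform on the sphere
    centers = np.array([c for c, _ in caps])
    cos_ha = np.cos([a for _, a in caps])
    # A point p is inside cap i iff <p, c_i> >= cos(half_angle_i).
    covered = (pts @ centers.T >= cos_ha).any(axis=1)
    return bool(covered.all())

# Two antipodal caps of half-angle just over 90 deg cover the 2-sphere:
caps_full = [(np.array([0., 0., 1.]), np.radians(91)),
             (np.array([0., 0., -1.]), np.radians(91))]
print(sampled_coverage(caps_full))   # True

# A single small cap clearly does not:
caps_small = [(np.array([0., 0., 1.]), np.radians(30))]
print(sampled_coverage(caps_small))  # False
```

A "True" here is only statistical evidence; the paper's QP-based reduction is what makes the answer rigorous.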
Distorted Fingerprint Verification System
Directory of Open Access Journals (Sweden)
Divya KARTHIKAESHWARAN
2011-01-01
Full Text Available Fingerprint verification is one of the most reliable personal identification methods. Fingerprint matching is affected by the non-linear distortion introduced into the fingerprint impression during the image acquisition process. This non-linear deformation changes both the position and orientation of minutiae. The proposed system operates in three stages: alignment-based fingerprint matching, fuzzy clustering and a classifier framework. First, an enhanced input fingerprint image is aligned with the template fingerprint image and a matching score is computed. To improve the performance of the system, fuzzy clustering based on distance and density is used to cluster the feature set obtained from the fingerprint matcher. Finally, a classifier framework is developed, and the cost-sensitive classifier is found to produce the best results. The system has been evaluated on a fingerprint database, and the experimental results show that it achieves a verification rate of 96%. Such a system plays an important role in forensic and civilian applications.
Energy Technology Data Exchange (ETDEWEB)
Zauner, C., E-mail: zauner@krp-m.de [KRP-Mechatec Engineering GbR, D-85748 Garching (Germany); Klammer, J. [KRP-Mechatec Engineering GbR, D-85748 Garching (Germany); Hartl, M.; Kampf, D. [Kayser-Threde GmbH, D-81379 Munich (Germany); Huber, A.; Mertens, Ph.; Schweer, B.; Terra, A. [Institute of Energy and Climate Research – Plasma Physics, Forschungszentrum Jülich, EURATOM Association, Trilateral Euregio Cluster, D-52425 Jülich (Germany); Balshaw, N. [Euratom-CCFE Fusion Association, Culham Science Centre, Abingdon, OX14 3DB (United Kingdom)
2013-10-15
Considering multi-physics requirements and loads in the early design phase, as well as during later experimental verification, is especially important in the design of fusion devices because of the extreme environmental conditions and loads. Typical disciplines in the design of fusion devices are thermodynamics, structural mechanics, electro-magnetics, and optics. The interaction of these disciplines, as well as an efficient approach to implementing this interaction in numerical and experimental simulations, is presented as applied to the new JET KL11 divertor endoscope design and verification process. The endoscope's first pictures already demonstrated the very good performance of the instrument.
Directory of Open Access Journals (Sweden)
V. A. Solov'ev
2015-01-01
Full Text Available The paper presents the results of analysing the implementation potential and evaluating the reliability of virtual electric devices in circuit simulation of pulsed DC electrical devices in the NI Multisim 10.1 environment. It analyses the metrological properties of the electric measuring devices and sensors of the NI Multisim 10.1 environment. Mathematical expressions have been defined to calculate reliable parameters of periodic non-sinusoidal electrical quantities based on their physical feasibility. To verify the virtual electric devices, a circuit model of the power section of a buck DC converter, with the devices under consideration enabled at its input and output, is used as a consumer of pulsed current of trapezoidal or triangular form. It serves as an example of a technique for verifying the readings of virtual electric measuring devices in the NI Multisim 10.1 environment. It is found that, when simulating pulsed DC electric devices, it is advisable to use the probe to measure average and RMS values of supply voltage and current consumption. The electric device power consumption read from the virtual power meter is equal to its average value, and the displayed power factor is inversely proportional to the input current form factor. To determine the RMS value of a pulsed DC current with an ammeter or multimeter, it is necessary to measure the current with these devices in DC and AC modes, and then determine the RMS value from the two measurement results. Verification of the virtual electric devices has shown that they can be applied to determine the energy performance of transistor converters for various purposes in circuit simulation in the NI Multisim 10.1 environment, thus saving design time.
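The final recommendation, combining DC-mode and AC-mode meter readings, rests on the identity I_rms^2 = I_dc^2 + I_ac,rms^2, where I_dc is the mean current and I_ac,rms is the RMS of the ripple. A quick numerical check on a hypothetical pulsed-DC waveform (the duty cycle and current levels are illustrative assumptions):

```python
import numpy as np

# Hypothetical pulsed-DC current waveform: one period, 40% duty cycle.
t = np.linspace(0.0, 1.0, 10000, endpoint=False)
i = np.where(t < 0.4, 5.0, 1.0)   # 5 A during the pulse, 1 A otherwise

i_rms_direct = np.sqrt(np.mean(i**2))       # true RMS of the waveform
i_dc = np.mean(i)                           # what a DC-mode meter reads
i_ac_rms = np.sqrt(np.mean((i - i_dc)**2))  # what an AC-mode meter reads

i_rms_combined = np.hypot(i_dc, i_ac_rms)   # sqrt(i_dc**2 + i_ac_rms**2)
print(abs(i_rms_direct - i_rms_combined) < 1e-9)  # True: identity holds
```

The identity holds for any periodic waveform, which is why two meter readings suffice to recover the true RMS.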
Donarski, James A; Jones, Stephen A; Charlton, Adrian J
2008-07-23
Proton nuclear magnetic resonance spectroscopy ((1)H NMR) and multivariate analysis techniques have been used to classify honey into two groups by geographical origin. Honey from Corsica (Miel de Corse) was used as an example of a protected designation of origin product. Mathematical models were constructed to determine the feasibility of distinguishing between honey from Corsica and that from other geographical locations in Europe, using (1)H NMR spectroscopy. Honey from 10 different regions within five countries was analyzed. (1)H NMR spectra were used as input variables for projection to latent structures (PLS) followed by linear discriminant analysis (LDA) and genetic programming (GP). Models were generated using three methods, PLS-LDA, two-stage GP, and a combination of PLS and GP (PLS-GP). The PLS-GP model used variables selected by PLS for subsequent GP calculations. All models were generated using Venetian blind cross-validation. Overall classification rates for the discrimination of Corsican and non-Corsican honey of 75.8, 94.5, and 96.2% were determined using PLS-LDA, two-stage GP, and PLS-GP, respectively. The variables utilized by PLS-GP were related to their (1)H NMR chemical shifts, and this led to the identification of trigonelline in honey for the first time.
Darabi, Amir; Leamy, Michael J.
2017-03-01
This paper introduces an analytical framework for predicting wave energy harvested by a circular piezoelectric disk attached to a thin plate. A harmonic point source excitation generates waves that are then incident on a piezoelectric disk—summing responses due to all such excitation enables general forcing profiles to be considered. The analysis approach decomposes the coupled system into two subdomains, one being the piezoelectric disk, and the other an infinite plate for which a Green's function is readily available. Interaction forces between the two subdomains couple the problems and lead to a closed-form solution for the propagation, transmission, and reflection of waves over the entire domain. In addition, the voltage generated by the harvester is calculated using coupled electromechanical equations. The analysis approach is first validated by comparing predicted response quantities to those computed using numerical simulations, documenting good agreement. The system is then studied in the frequency domain and the optimum harvester resistance is found for generating the most electrical power. Representative experiments are carried out to demonstrate the validity of the analytical approach and verify the harvested power versus resistance trend.
Directory of Open Access Journals (Sweden)
Joung Tae-Hwan
2014-06-01
Full Text Available This paper examines the suitability of using the Computational Fluid Dynamics (CFD) tool ANSYS-CFX as an initial analysis tool for predicting the drag and propulsion performance (thrust and torque) of a concept underwater vehicle design. In order to select an appropriate thruster that will achieve the required speed of the Underwater Disk Robot (UDR), the ANSYS-CFX tools were used to predict the drag force of the UDR. Vertical Planar Motion Mechanism (VPMM) test simulations (i.e. pure heaving and pure pitching motion) were carried out with the CFD software using its motion analysis capability. The CFD results reveal the distribution of hydrodynamic values (velocity, pressure, etc.) of the UDR for these motion studies. Finally, CFD bollard pull test simulations were performed and compared with experimental bollard pull test results obtained in a model basin. The experimental results confirm the suitability of using the ANSYS-CFX tools for predicting the behavior of concept vehicles early in their design process.
Energy Technology Data Exchange (ETDEWEB)
Lee, Won Jae; Chung, Bub Dong; Jeong, Jae Jun; Ha, Kwi Seok [Korea Atomic Energy Research Institute, Taejon (Korea)
1998-06-01
A multi-dimensional realistic thermal-hydraulic system analysis code, MARS version 1.3, has been developed. The main purpose of the MARS 1.3 development is to provide a realistic analysis capability for transient two-phase thermal-hydraulics of Pressurized Water Reactors (PWRs), especially during Large Break Loss of Coolant Accidents (LBLOCAs), where multi-dimensional phenomena dominate the transients. MARS is a unified version of the USNRC-developed COBRA-TF, a three-dimensional (3D) reactor vessel analysis code, and RELAP5/MOD3.2.1.2, a one-dimensional (1D) reactor system analysis code. Developmental requirements for MARS were chosen not only to best utilize the existing capability of the codes but also to enhance code maintenance, user accessibility, user friendliness, code portability, code readability, and code flexibility. To maintain the existing codes' capability and to enhance code maintenance, user accessibility and user friendliness, MARS has been unified into a single code consisting of a 1D module (RELAP5) and a 3D module (COBRA-TF). This is realized by implicitly integrating the system pressure matrix equations of the hydrodynamic models and solving them simultaneously, by modifying the 1D/3D calculation sequence to operate under a single Central Processing Unit (CPU), and by unifying the input structure and the light water property routines of both modules. In addition, the code structure of the 1D module has been completely restructured using the modular data structure of standard FORTRAN 90, which greatly improves code maintainability, readability and portability. For code flexibility, a dynamic memory management scheme is applied in both modules. MARS 1.3 now runs on PC/Windows and HP/UNIX platforms having a single CPU, and users have the option to select the 3D module to model the 3D thermal-hydraulics in the reactor vessel or other
Zhang, De-Li; Ji, Liang; Li, Yan-Da
2004-05-01
Through homologous BLAST of our cloned genes against the non-redundant (nr) database, we found that computer-annotated human genome coding regions in the public domain contain many kinds of errors, including insertions, deletions or mutations of one base pair or of a sequence segment at the cDNA level, or various combinations of these errors. We use three main means to validate and identify errors in the model genes appearing in NCBI GENOME ANNOTATION PROJECT REFSEQs: (1) evaluating the degree of support from human EST clustering and BLAST against the draft human genome; (2) preparing chromosomal mappings of our verified genes and analysing the genomic organization of the genes: all exon/intron boundaries should be consistent with the GT/AG rule, and consensus sequences surrounding the splice boundaries should be found as well; (3) experimental verification by RT-PCR of the in silico cloned genes, and further by cDNA sequencing. We then use three further means as reference: (1) web searching or in silico cloning of the genes of different species, especially mouse and rat homologous genes, thus judging gene existence by ontology; (2) using released genes in the public domain that are highly homologous to our verified genes, especially the released human genes appearing in NCBI GENOME ANNOTATION PROJECT REFSEQs, as a standard: we try to clone a highly homologous complete gene similar to each released gene according to the strategy developed in this paper; if we cannot, our verified gene may be correct and the released gene in the public domain may be wrong; (3) to find more evidence, verifying our cloned genes by RT-PCR or hybridization. Here we list some errors we found in NCBI GENOME ANNOTATION PROJECT REFSEQs: (1) a base inserted into the ORF by mistake, causing a frame shift in the coded amino acids; in detail, a base in the ORF of a gene is a redundant insertion, which causes a reading frame
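The GT/AG splice-boundary rule invoked in means (2) is straightforward to check programmatically: every intron inferred from exon coordinates should begin with GT and end with AG. A minimal sketch with a toy sequence and hypothetical exon coordinates (purely illustrative, not the authors' pipeline):

```python
def introns_obey_gt_ag(genomic_seq, exons):
    """exons: list of (start, end) 0-based half-open intervals in ascending
    order. Returns True iff every intron between consecutive exons starts
    with GT and ends with AG (the canonical splice-site rule)."""
    for (_, exon1_end), (exon2_start, _) in zip(exons, exons[1:]):
        intron = genomic_seq[exon1_end:exon2_start].upper()
        if not (intron.startswith("GT") and intron.endswith("AG")):
            return False
    return True

# Toy example: exon1 = [0,4), intron = [4,12), exon2 = [12,16)
seq = "ATGC" + "GTAAATAG" + "CCGA"
print(introns_obey_gt_ag(seq, [(0, 4), (12, 16)]))                      # True
print(introns_obey_gt_ag("ATGC" + "CTAAATAC" + "CCGA", [(0, 4), (12, 16)]))  # False
```

A real annotation check would additionally score the consensus context around each boundary, as the abstract notes.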
Continuous verification using multimodal biometrics.
Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep
2007-04-01
Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system.
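The key idea that trust in a logged-in user should decay between biometric observations can be illustrated with a toy score model. This is only a sketch of the decay concept; it is not the fusion rule or architecture of the cited system, and the timestamps, scores, and time constant are invented:

```python
import math

def trust(history, now, tau=30.0):
    """Toy continuous-verification score. history: list of (timestamp,
    match_score in [0,1]) from any modality (face, fingerprint, ...).
    Evidence loses weight exponentially with age, so trust decays unless
    fresh evidence arrives. Illustrative sketch only."""
    if not history:
        return 0.0
    return max(s * math.exp(-(now - t) / tau) for t, s in history)

obs = [(0.0, 0.95), (10.0, 0.90)]   # e.g. a face match, then a fingerprint match
fresh = trust(obs, now=10.0)        # recent evidence -> high trust
stale = trust(obs, now=120.0)       # no evidence for a while -> low trust
print(fresh > 0.8 and stale < 0.1)  # True
```

A system would compare this score against a threshold and lock the session when it falls below, which is exactly why conventional accept/reject error rates alone cannot characterize continuous verification.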
Institute of Scientific and Technical Information of China (English)
QIN Xiao-hong; JIA Lin; LI Ni
2010-01-01
In this paper, theoretical analysis proves that the relationship between the jet radius r and the axial distance z from the onset of whipping instability follows an allometric law of the form r ∝ z^(-1/4), whatever the surface charge parameter a is. Polyvinyl alcohol (PVA) was used to study the effect of surface charge, adjusted by adding LiCl, on the variation of jet diameter with axial coordinate after the onset of whipping instability during electrospinning. The experiment shows that the relationship between jet radius r and axial distance z from the onset of whipping instability also follows the law r ∝ z^(-1/4) when the LiCl content is between 0.2 wt% and 4 wt%. That is, the law does not depend on the salt content, and the theoretical prediction agrees quite well with the experimental data.
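An allometric (power) law like r ∝ z^(-1/4) is easiest to check on data via a log-log fit, where it appears as a straight line of slope -1/4. A sketch on synthetic noisy data (the prefactor, range, and noise level are illustrative assumptions, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(2)

z = np.linspace(1.0, 50.0, 100)                      # axial distance (arb. units)
r = 3.0 * z**-0.25 * rng.lognormal(0.0, 0.02, z.size)  # r ~ z^(-1/4) + 2% noise

# In log-log space the power law is linear: log r = slope * log z + const.
slope, _ = np.polyfit(np.log(z), np.log(r), 1)
print(abs(slope + 0.25) < 0.05)                      # fitted slope close to -1/4
```

The same fit applied to measured jet radii is how the experimental exponent would be extracted and compared with the theoretical -1/4.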
Peretti, L. F.; Dowell, E. H.
1992-01-01
An experiment was performed on a rigid-wall rectangular acoustic cavity driven by a flexible plate mounted in a quarter of one end wall and excited by white noise. The experiment was designed so that the assumptions of Asymptotic Modal Analysis (AMA) were satisfied for certain bandwidths and center frequencies. Measurements of sound pressure levels at points along the boundaries and incrementally into the interior were taken. These were compared with the theoretical results predicted with AMA and found to be in good agreement, particularly for moderate (1/3 octave) bandwidths and sufficiently high center frequencies. Sound pressure level measurements were also taken well into the cavity interior at various points along the five totally rigid walls. The AMA theory, including boundary intensification effects, was shown to be accurate provided the assumption of a large number of acoustic modes is satisfied and variables such as the power spectrum of the wall acceleration, frequency, and damping are slowly varying over the frequency bandwidth.
Peretti, L. F.; Dowell, E. H.
1992-10-01
An experiment was performed on a rigid-wall rectangular acoustic cavity driven by a flexible plate mounted in a quarter of one end wall and excited by white noise. The experiment was designed so that the assumptions of Asymptotic Modal Analysis (AMA) were satisfied for certain bandwidths and center frequencies. Measurements of sound pressure levels at points along the boundaries and incrementally into the interior were taken. These were compared with the theoretical results predicted with AMA and found to be in good agreement, particularly for moderate (1/3 octave) bandwidths and sufficiently high center frequencies. Sound pressure level measurements were also taken well into the cavity interior at various points along the five totally rigid walls. The AMA theory, including boundary intensification effects, was shown to be accurate provided the assumption of a large number of acoustic modes is satisfied and variables such as the power spectrum of the wall acceleration, frequency, and damping are slowly varying over the frequency bandwidth.
DEFF Research Database (Denmark)
Peng, Liang; Wang, Jingyu; Zhejiang University, Hangzhou, China, L.
2011-01-01
In this paper, the efficiency analysis of a mid-range wireless energy transfer system is performed through non-resonant magnetic coupling. It is shown that the self-resistance of the coils and the mutual inductance are critical in achieving a high efficiency, as indicated by our theoretical formulation and verified in our experiments. It is experimentally shown that high efficiency, up to 65%, can be realized even in a non-resonant wireless energy system that employs a device part with a moderate or low quality factor. We also address some aspects of a practical wireless energy transfer system and show that careful design of the de-tuned system can intrinsically minimize the power dissipated in the source part. The non-resonant scheme presented in this paper allows flexible design and fabrication of wireless energy transfer systems with transfer distance being several times of the coils
Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang
2014-01-01
A microgrid is an effective way to integrate distributed energy resources into the utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method.
Luquain, Alexandra; Magnin, Sandrine; Guenat, David; Prétet, Jean-Luc; Viennet, Gabriel; Valmary-Degano, Séverine; Mougin, Christiane
2015-01-01
Promoter methylation of the MGMT gene, encoding the enzyme O6-methylguanine-DNA methyltransferase, is a theranostic good-prognosis marker of glioblastomas treated with alkylating chemotherapy (temozolomide, Temodal(®)). Among methylation analysis techniques, pyrosequencing is a reproducible and sensitive quantitative method. As part of the accreditation of the hospital platform of molecular genetics of cancer in Besançon, our objective was to verify the performance of the commercial pyrosequencing kit therascreen(®) MGMT Pyro(®) (Qiagen) in terms of repeatability, reproducibility, limit of blank (LOB), limit of detection (LOD), linearity and contamination, following guide SH GTA 04 issued by Cofrac. The repeatability tests show a mean methylation of 3.22% [standard deviation (SD) = 0.41, coefficient of variation (CV) = 12.75%] for the unmethylated control and of 70.16% (SD = 2.20, CV = 3.14%) for the methylated control. Reproducibility shows a mean methylation of 1.39% (SD = 0.25, CV = 18.25%) for the unmethylated control and of 94.03% (SD = 2.56, CV = 2.73%) for the methylated control. The LOB and LOD are 3.43% and 6.22% methylation, respectively. A regression coefficient of 0.983 confirms the linearity of the assay from 0% to 100% methylation. No contamination was observed. Over 40% of the glioblastomas studied in 2013 in our laboratory showed a methylated MGMT gene. Our results confirm that the therascreen(®) MGMT Pyro(®) kit (Qiagen) performs in compliance with the quality requirements of NF EN ISO 15189 for routine analysis of the MGMT methylation status in glioblastomas.
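The repeatability and reproducibility figures follow the usual definition CV = 100 x SD / mean, which reproduces the reported percentages to within rounding:

```python
def cv_percent(sd, mean):
    """Coefficient of variation, in percent."""
    return 100.0 * sd / mean

# Repeatability, unmethylated control: mean 3.22 %, SD 0.41
print(round(cv_percent(0.41, 3.22), 2))   # ~12.7 (reported as 12.75)

# Repeatability, methylated control: mean 70.16 %, SD 2.20
print(round(cv_percent(2.20, 70.16), 2))  # ~3.14 (reported as 3.14)
```

The small residual difference on the first value comes from rounding of SD and mean in the abstract.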
Hryniewicka, Marta; Karpinska, Agnieszka; Kijewska, Marta; Turkowicz, Monika Joanna; Karpinska, Joanna
2016-11-01
This study shows the results of applying liquid chromatography-tandem mass spectrometry (LC/MS/MS) to assay the content of α-tocopherol and coenzyme Q10 in bee products of animal origin, i.e. royal jelly, beebread and drone homogenate. The biological matrix was removed using extraction with n-hexane. It was found that drone homogenate is a rich source of coenzyme Q10: it contains only 8 ± 1 µg/g of α-tocopherol but 20 ± 2 µg/g of coenzyme Q10. The contents of the assayed compounds in royal jelly were 16 ± 3 and 8 ± 0.2 µg/g of α-tocopherol and coenzyme Q10, respectively. Beebread appeared to be the richest in α-tocopherol: its level was 80 ± 30 µg/g, while the level of coenzyme Q10 was only 11.5 ± 0.3 µg/g.
Scalable Techniques for Formal Verification
Ray, Sandip
2010-01-01
This book presents state-of-the-art approaches to formal verification that seamlessly integrate different formal verification methods within a single logical foundation. It should benefit researchers and practitioners looking for a broad overview of the spectrum of formal verification techniques, as well as approaches to combining such techniques within a single framework. Coverage includes a range of case studies showing how such combination is fruitful in developing a scalable verification methodology for industrial designs. This book outlines both theoretical and practical issues
Frandsen, Benjamin A.; Brunelli, Michela; Page, Katharine; Uemura, Yasutomo J.; Staunton, Julie B.; Billinge, Simon J. L.
2016-05-01
We present a temperature-dependent atomic and magnetic pair distribution function (PDF) analysis of neutron total scattering measurements of antiferromagnetic MnO, an archetypal strongly correlated transition-metal oxide. The known antiferromagnetic ground-state structure fits the low-temperature data closely with refined parameters that agree with conventional techniques, confirming the reliability of the newly developed magnetic PDF method. The measurements performed in the paramagnetic phase reveal significant short-range magnetic correlations on a ˜1 nm length scale that differ substantially from the low-temperature long-range spin arrangement. Ab initio calculations using a self-interaction-corrected local spin density approximation of density functional theory predict magnetic interactions dominated by Anderson superexchange and reproduce the measured short-range magnetic correlations to a high degree of accuracy. Further calculations simulating an additional contribution from a direct exchange interaction show much worse agreement with the data. The Anderson superexchange model for MnO is thus verified by experimentation and confirmed by ab initio theory.
Ji, Jing; Zhang, Jianhui; Xia, Qixiao; Wang, Shouyin; Huang, Jun; Zhao, Chunsheng
2014-05-01
Existing research on no-moving-part valves in valve-less piezoelectric pumps mainly concentrates on pipeline valves and chamber-bottom valves, which leads to a complex structure and manufacturing process for the pump channel and chamber bottom. Furthermore, valves whose positions are fixed with respect to the inlet and outlet also worsen the adjustability and controllability of the flow rate. In order to overcome these shortcomings, this paper puts forward a novel implantable structure of valve-less piezoelectric pump with hemisphere-segments in the pump chamber. Based on the theory of flow around a bluff body, the flow resistance on the spherical and round surfaces of a hemisphere-segment differs when fluid flows through, and the macroscopic flow resistance differences thus formed are also different. A novel valve-less piezoelectric pump with hemisphere-segment bluff-body (HSBB) is presented and designed; the HSBB acts as the no-moving-part valve. By the method of volume and momentum comparison, the stress on the bluff body in the pump chamber is analyzed, the essential reason for unidirectional fluid pumping is expounded, and the flow rate formula is obtained. To verify the theory, a prototype was produced and used for experimental research on the relationship between flow rate, pressure difference, voltage, and frequency, which confirms the correctness of the above theory. The prototype has six hemisphere-segments in a chamber filled with water, and the effective diameter of the piezoelectric bimorph is 30 mm. The experiment shows that the flow rate can reach 0.50 mL/s at a frequency of 6 Hz and a voltage of 110 V, and the pressure difference can reach 26.2 mm H2O at a frequency of 6 Hz and a voltage of 160 V. This research proposes a valve-less piezoelectric pump with hemisphere-segment bluff-body, and its validity and feasibility are verified through theoretical analysis and experiment.
Homogenization of Partial Differential Equations
Kaiser, Gerald
2005-01-01
A comprehensive study of homogenized problems, focusing on the construction of nonstandard models: non-local models, multicomponent models, and models with memory. This work is intended for graduate students, applied mathematicians, physicists, and engineers.
Dong, Lei
1995-01-01
The successful management of cancer with radiation relies on the accurate deposition of a prescribed dose to a prescribed anatomical volume within the patient. Treatment set-up errors are inevitable because the alignment of field-shaping devices with the patient must be repeated daily, up to eighty times, during the course of a fractionated radiotherapy treatment. With the invention of electronic portal imaging devices (EPIDs), a patient's portal images can be visualized daily in real time after only a small fraction of the radiation dose has been delivered to each treatment field. However, the accuracy of human visual evaluation of low-contrast portal images has been found to be inadequate. The goal of this research is to develop automated image analysis tools to detect both treatment field shape errors and patient anatomy placement errors with an EPID. A moments method has been developed to align treatment field images to compensate for the lack of repositioning precision of the image detector. A figure of merit has also been established to verify the shape and rotation of the treatment fields. Following proper alignment of treatment field boundaries, a cross-correlation method has been developed to detect shifts of the patient's anatomy relative to the treatment field boundary. Phantom studies showed that the moments method aligned the radiation fields to within 0.5 mm of translation and 0.5° of rotation, and that the cross-correlation method aligned anatomical structures inside the radiation field to within 1 mm of translation and 1° of rotation. A new procedure of generating and using digitally reconstructed radiographs (DRRs) at megavoltage energies as reference images was also investigated. The procedure allowed a direct comparison between a designed treatment portal and the actual patient setup positions detected by an EPID. Phantom studies confirmed the feasibility of the methodology. Both the moments method and the cross-correlation technique were
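The core of a moments method can be illustrated in a few lines: first-order image moments give each field image's centroid, and the centroid difference recovers a translation (second-order moments would similarly give orientation). A toy numpy sketch, not the dissertation's implementation:

```python
import numpy as np

def centroid(img):
    """First-order image moments -> intensity-weighted centroid (row, col)."""
    total = img.sum()
    rows, cols = np.indices(img.shape)
    return (rows * img).sum() / total, (cols * img).sum() / total

# A toy rectangular "treatment field" mask and a copy shifted by (5, -3) pixels.
field = np.zeros((100, 100))
field[30:70, 40:80] = 1.0
shifted = np.roll(field, shift=(5, -3), axis=(0, 1))

dr = centroid(shifted)[0] - centroid(field)[0]
dc = centroid(shifted)[1] - centroid(field)[1]
print(round(dr), round(dc))   # 5 -3
```

Because moments are integrals over the whole field, this estimate is robust to the pixel-level noise typical of low-contrast portal images.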
Operator estimates in homogenization theory
Zhikov, V. V.; Pastukhova, S. E.
2016-06-01
This paper gives a systematic treatment of two methods for obtaining operator estimates: the shift method and the spectral method. Though substantially different in mathematical technique and physical motivation, these methods produce basically the same results. Besides the classical formulation of the homogenization problem, other formulations of the problem are also considered: homogenization in perforated domains, the case of an unbounded diffusion matrix, non-self-adjoint evolution equations, and higher-order elliptic operators. Bibliography: 62 titles.
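A representative operator estimate of the kind surveyed can be written as follows (stated here for a divergence-form elliptic operator with ε-periodic coefficients; the notation is generic, not taken from the paper):

```latex
A_\varepsilon = -\operatorname{div}\bigl(a(x/\varepsilon)\,\nabla\bigr),
\qquad
A_0 = -\operatorname{div}\bigl(a^{\mathrm{hom}}\,\nabla\bigr),
\qquad
\bigl\| (A_\varepsilon + I)^{-1} - (A_0 + I)^{-1} \bigr\|_{L^2(\mathbb{R}^d)\to L^2(\mathbb{R}^d)} \le C\,\varepsilon ,
```

i.e. the resolvent of the oscillating operator converges to that of the homogenized operator in operator norm, at a rate linear in the period ε.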
Survey on Offline Finger Print Verification System
Suman, R.; Kaur, R.
2012-01-01
In fingerprint verification, "verification" means matching a user's fingerprint against the single fingerprint associated with the identity that the user claims. Biometrics can be classified into two types: behavioral (signature verification, keystroke dynamics, etc.) and physiological (
Verification of LHS distributions.
Energy Technology Data Exchange (ETDEWEB)
Swiler, Laura Painton
2006-04-01
This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
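The kind of check described, sampling and then testing distribution correctness, can be sketched in a few lines; `lhs_uniform` below is a one-dimensional stand-in for Sandia's LHS code, and the Kolmogorov-Smirnov statistic is computed directly:

```python
import numpy as np

def lhs_uniform(n, rng):
    """One-dimensional Latin hypercube sample of U(0, 1): one point drawn
    in each of n equal-probability strata, then shuffled."""
    u = (np.arange(n) + rng.random(n)) / n
    rng.shuffle(u)
    return u

def ks_statistic(sample):
    """Kolmogorov-Smirnov distance between the empirical CDF and the
    U(0, 1) CDF, computed directly from the sorted sample."""
    x = np.sort(sample)
    n = len(x)
    above = np.max(np.arange(1, n + 1) / n - x)  # ECDF above the CDF
    below = np.max(x - np.arange(0, n) / n)      # ECDF below the CDF
    return max(above, below)

rng = np.random.default_rng(42)
sample = lhs_uniform(1000, rng)
d = ks_statistic(sample)  # stratification keeps this within 1/n of zero
```

Because each stratum contributes exactly one point, the KS distance is bounded by 1/n regardless of the random draws, which is exactly the sort of property such a verification suite confirms.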
Lerner, Sorin; Kundu, Sudipta
2011-01-01
Given the growing size and heterogeneity of Systems on Chip (SOC), the design process from initial specification to chip fabrication has become increasingly complex. This growing complexity provides incentive for designers to use high-level languages such as C, SystemC, and SystemVerilog for system-level design. While a major goal of these high-level languages is to enable verification at a higher level of abstraction, allowing early exploration of system-level designs, the focus so far for validation purposes has been on traditional testing techniques such as random testing and scenario-based
Puntel, Robson Luiz; Roos, Daniel Henrique; Grotto, Denise; Garcia, Solange C; Nogueira, Cristina Wayne; Rocha, Joao Batista Teixeira
2007-06-13
A variety of Krebs cycle intermediates have been shown to possess antioxidant properties in different in vivo and in vitro systems. Here we examined whether citrate, succinate, malate, oxaloacetate, fumarate and alpha-ketoglutarate could modulate malonate-induced thiobarbituric acid-reactive species (TBARS) production in rat brain homogenate. The mechanisms involved in their antioxidant activity were also determined using two analytical methods: 1) a popular spectrophotometric method (Ohkawa, H., Ohishi, N., Yagi, K., 1979. Assay for lipid peroxides in animal tissues by thiobarbituric acid reaction. Analytical Biochemistry 95, 351-358) and 2) a high-performance liquid chromatographic (HPLC) procedure (Grotto, D., Santa Maria, L. D., Boeira, S., Valentini, J., Charão, M. F., Moro, A. M., Nascimento, P. C., Pomblum, V. J., Garcia, S. C., 2006. Rapid quantification of malondialdehyde in plasma by high performance liquid chromatography-visible detection. Journal of Pharmaceutical and Biomedical Analysis 43, 619-624). Citrate, malate, and oxaloacetate reduced both basal and malonate-induced TBARS production. Their effects were not changed by pre-treatment of rat brain homogenates at 100 degrees C for 10 min. alpha-Ketoglutarate increased basal TBARS without changing malonate-induced TBARS production in fresh and heat-treated homogenates. Succinate reduced basal TBARS production without altering the malonate-induced production; its antioxidant activity was abolished by KCN or heat treatment. Fumarate reduced malonate-induced TBARS production in fresh homogenates; however, its effect was completely abolished by heat treatment. There were minimal differences among the studied methods. Citrate, oxaloacetate, malate, alpha-ketoglutarate and malonate showed iron-chelating activity. We suggest that the antioxidant properties of citrate, malate and oxaloacetate were due to their ability to cancel iron redox activity by forming inactive complexes, whereas alpha-ketoglutarate and malonate pro
Non-linear waves in heterogeneous elastic rods via homogenization
Quezada de Luna, Manuel
2012-03-01
We consider the propagation of a planar loop on a heterogeneous elastic rod with a periodic microstructure consisting of two alternating homogeneous regions with different material properties. The analysis is carried out using a second-order homogenization theory based on a multiple scale asymptotic expansion. © 2011 Elsevier Ltd. All rights reserved.
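The multiple-scale ansatz on which such a second-order homogenization theory rests has the generic form (symbols are illustrative, not the paper's own notation):

```latex
u^\varepsilon(x,t) \sim u_0(x, y, t) + \varepsilon\, u_1(x, y, t) + \varepsilon^2\, u_2(x, y, t),
\qquad y = x/\varepsilon ,
```

where each $u_i$ is periodic in the fast variable $y$; substituting the expansion into the governing equations and collecting powers of $\varepsilon$ yields the effective-medium equations together with second-order (dispersive) corrections.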
Yamasaki, Yudai; Iida, Norimasa
The present study focuses on clarifying the combustion mechanism of the homogeneous-charge compression-ignition (HCCI) engine in order to control ignition and combustion as well as to reduce HC and CO emissions and to maintain high combustion efficiency by calculating the chemical kinetics of elementary reactions. For the calculations, n-butane was selected as the fuel since it has the smallest carbon number in the alkane family that shows two-stage autoignition (heat release with a low-temperature reaction (LTR) and with a high-temperature reaction (HTR)), similar to higher hydrocarbons such as gasoline. The CHEMKIN code was used for the calculations, assuming zero dimensions in the combustion chamber and adiabatic change. The results reveal the heat release mechanism of the LTR and HTR, the control factors of ignition timing and combustion speed, and the conditions needed to reduce HC and CO emissions and to maintain high combustion efficiency.
Directory of Open Access Journals (Sweden)
Jacek Hunicz
2015-01-01
Full Text Available In this study we summarize and analyze experimental observations of cyclic variability in homogeneous charge compression ignition (HCCI combustion in a single-cylinder gasoline engine. The engine was configured with negative valve overlap (NVO to trap residual gases from prior cycles and thus enable auto-ignition in successive cycles. Correlations were developed between different fuel injection strategies and cycle average combustion and work output profiles. Hypothesized physical mechanisms based on these correlations were then compared with trends in cycle-by-cycle predictability as revealed by sample entropy. The results of these comparisons help to clarify how fuel injection strategy can interact with prior cycle effects to affect combustion stability and so contribute to design control methods for HCCI engines.
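Sample entropy, the predictability measure used above, can be sketched directly (an O(n²) reference implementation with the common m and r conventions; this is not the authors' code):

```python
import math
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): -ln(A/B), where B counts pairs of length-m templates
    within tolerance r*std(x) (Chebyshev distance) and A counts the same
    for length m+1. Larger values indicate a less predictable series."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()

    def matches(length):
        t = np.array([x[i:i + length] for i in range(len(x) - length)])
        count = 0
        for i in range(len(t) - 1):
            dist = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            count += int(np.sum(dist <= tol))
        return count

    b, a = matches(m), matches(m + 1)
    return math.inf if a == 0 or b == 0 else -math.log(a / b)

rng = np.random.default_rng(7)
regular = np.sin(np.linspace(0.0, 8.0 * np.pi, 300))  # highly predictable
noisy = rng.uniform(-1.0, 1.0, 300)                   # unpredictable
```

A regular (here sinusoidal) series scores lower than noise, which is why the measure is useful for ranking cycle-by-cycle predictability.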
Analysis of the Online Game Industry Homogenization Phenomenon
Institute of Scientific and Technical Information of China (English)
马莹莹
2012-01-01
After ten years of development, online gaming, one of the leisure activities built on the network platform, has grown into an industry of enormous scale and has become one of the world's most important entertainment industries, on a par with film and music. Behind the thriving appearance of the domestic online game market, however, rapid growth has brought various serious problems, among which homogenization is especially pronounced. This article describes the concrete manifestations of the homogenization phenomenon, explores its causes from the perspectives of industrial development, network technology, and talent demand, infers its effects, and proposes the appropriate attitude toward the phenomenon along with corresponding countermeasures.
Spatial verification using wavelet transforms: a review
Weniger, Michael; Kapp, Florian; Friederichs, Petra
2017-01-01
Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we review the state of the art in one- and two-dimensional wavelet analysis and its application with emphasis on spatial verification. We further use a technique developed for texture analysis in the context of high-resolution quantitative precipitation forecasts, which is able to assess structural characteristics of the precipitation fields and allows efficient clustering of ensemble data.
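A single level of the scale separation these techniques rely on can be sketched with an orthonormal 2-D Haar transform (operational verification uses deeper transforms and smoother wavelets; this is a minimal NumPy illustration):

```python
import numpy as np

def haar2d(field):
    """One level of an orthonormal 2-D Haar transform: returns the coarse
    approximation and the horizontal/vertical/diagonal detail fields."""
    a = field[0::2, 0::2]
    b = field[0::2, 1::2]
    c = field[1::2, 0::2]
    d = field[1::2, 1::2]
    approx = (a + b + c + d) / 2.0
    detail_h = (a - b + c - d) / 2.0
    detail_v = (a + b - c - d) / 2.0
    detail_d = (a - b - c + d) / 2.0
    return approx, (detail_h, detail_v, detail_d)

# Scale-separated verification: decompose a forecast-minus-observation
# error field and attribute squared error to coarse vs. fine scales.
rng = np.random.default_rng(3)
error_field = rng.standard_normal((32, 32))
approx, details = haar2d(error_field)
coarse_energy = np.sum(approx**2)
fine_energy = sum(np.sum(d**2) for d in details)
```

Because the transform is orthonormal, the two energies partition the total squared error exactly, which is what lets forecast error be attributed to distinct scales.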
Enhanced verification test suite for physics simulation codes
Energy Technology Data Exchange (ETDEWEB)
Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.; Cotrell, David L.; Johnson, Bryan; Knupp, Patrick; Rider, William J.; Trucano, Timothy G.; Weirs, V. Gregory
2008-09-01
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.
HDL to verification logic translator
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
Systems Approach to Arms Control Verification
Energy Technology Data Exchange (ETDEWEB)
Allen, K; Neimeyer, I; Listner, C; Stein, G; Chen, C; Dreicer, M
2015-05-15
Using the decades of experience of developing concepts and technologies for verifying bilateral and multilateral arms control agreements, a broad conceptual systems approach is being developed that takes into account varying levels of information and risk. The IAEA has already demonstrated the applicability of a systems approach by implementing safeguards at the State level, with acquisition path analysis as the key element. In order to test whether such an approach could also be implemented for arms control verification, an exercise was conducted in November 2014 at the JRC ITU Ispra. Based on the scenario of a hypothetical treaty between two model nuclear weapons states aimed at capping their nuclear arsenals at existing levels, the goal of this exercise was to explore how to use acquisition path analysis in an arms control context. Our contribution will present the scenario, objectives and results of this exercise, and attempt to define future workshops aimed at further developing verification measures that will deter or detect treaty violations.
Spatial Verification Using Wavelet Transforms: A Review
Weniger, Michael; Friederichs, Petra
2016-01-01
Due to the emergence of new high resolution numerical weather prediction (NWP) models and the availability of new or more reliable remote sensing data, the importance of efficient spatial verification techniques is growing. Wavelet transforms offer an effective framework to decompose spatial data into separate (and possibly orthogonal) scales and directions. Most wavelet based spatial verification techniques have been developed or refined in the last decade and concentrate on assessing forecast performance (i.e. forecast skill or forecast error) on distinct physical scales. Particularly during the last five years, a significant growth in meteorological applications could be observed. However, a comparison with other scientific fields such as feature detection, image fusion, texture analysis, or facial and biometric recognition, shows that there is still a considerable, currently unused potential to derive useful diagnostic information. In order to tap the full potential of wavelet analysis, we review the stat...
Formal Verification, Engineering and Business Value
Directory of Open Access Journals (Sweden)
Ralf Huuck
2012-12-01
Full Text Available How to apply automated verification technology such as model checking and static program analysis to millions of lines of embedded C/C++ code? How to package this technology in a way that it can be used by software developers and engineers who might have no background in formal verification? And how to convince business managers to actually pay for such software? This work addresses a number of those questions. Based on our own experience developing and distributing the Goanna source code analyzer for detecting software bugs and security vulnerabilities in C/C++ code, we explain the underlying technology of model checking, static analysis and SMT solving, and the steps involved in creating industrial-strength tools.
Directory of Open Access Journals (Sweden)
Abhishek Jain
2012-12-01
Full Text Available In this paper, we present a generic SystemVerilog Universal Verification Methodology based reusable verification environment for efficient verification of Image Signal Processing IPs/SoCs. With the tight schedules on all projects it is important to have a strong verification methodology which contributes to first-silicon success. Deploying methodologies which enforce full functional coverage and verification of corner cases through pseudo-random test scenarios is required. Also, standardization of the verification flow is needed. Previously, inside the imaging group of ST, a Specman-e/Verilog based verification environment for IP/subsystem-level verification and a C/C++/Verilog based directed verification environment for SoC-level verification were used for functional verification. Different verification environments were used at the IP level and the SoC level. Different verification/validation methodologies were used for SoC verification across multiple sites. Verification teams were also looking for ways to catch bugs earlier in the design cycle. Thus, a generic SystemVerilog Universal Verification Methodology (UVM) based reusable verification environment is required to avoid the problem of having so many methodologies, providing a standard unified solution which compiles on all tools.
Clinical Verification of Homeopathy
Directory of Open Access Journals (Sweden)
Michel Van Wassenhoven
2011-07-01
Full Text Available The world is changing! This is certainly true regarding homeopathic practice and access to homeopathic medicine. Therefore our first priority at the ECH-LMHI [1] has been to produce a yearly report on the scientific framework of homeopathy. In the 2010 version a new chapter about epidemic diseases has been added, including the leptospirosis survey on the Cuban population. A second priority has been to review the definition of homeopathic medicines respecting the new framework generated by the official registration procedure and the WHO report. We are now working on a documented (Materia Medica and provings) list of homeopathic remedies to facilitate the registration of our remedies. The new challenges are: first of all, more good research proposals and, as such, more funding (possible through ISCHI and the Blackie Foundation, as examples [2]); international acceptance of new guidelines for provings and clinical verification of homeopathic symptoms (proposals are ready for discussion); and a total reconsideration of the homeopathic repertories, including results of the clinical verification of the symptoms. The world is changing, we are part of the world, and changes are needed also for homeopathy!
Energy Technology Data Exchange (ETDEWEB)
Bob Hardage; M.M. Backus; M.V. DeAngelo; R.J. Graebner; S.E. Laubach; Paul Murray
2004-02-01
Fractures within the producing reservoirs at McElroy Field could not be studied with the industry-provided 3C3D seismic data used as a cost-sharing contribution in this study. The signal-to-noise character of the converted-SV data across the targeted reservoirs in these contributed data was not adequate for interpreting azimuth-dependent data effects. After illustrating the low signal quality of the converted-SV data at McElroy Field, the seismic portion of this report abandons the McElroy study site and defers to 3C3D seismic data acquired across a different fractured carbonate reservoir system to illustrate how 3C3D seismic data can provide useful information about fracture systems. Using these latter data, we illustrate how fast-S and slow-S data effects can be analyzed in the prestack domain to recognize fracture azimuth, and then demonstrate how fast-S and slow-S data volumes can be analyzed in the poststack domain to estimate fracture intensity. In the geologic portion of the report, we analyze published regional stress data near McElroy Field and numerous Formation MicroImager (FMI) logs acquired across McElroy to develop possible fracture models for the McElroy system. Regional stress data imply a fracture orientation different from the orientations observed in most of the FMI logs. This report culminates Phase 2 of the study, "Combining a New 3-D Seismic S-Wave Propagation Analysis for Remote Fracture Detection with a Robust Subsurface Microfracture-Based Verification Technique". Phase 3 will not be initiated because wells were to be drilled in Phase 3 of the project to verify the validity of fracture-orientation maps and fracture-intensity maps produced in Phase 2. Such maps cannot be made across McElroy Field because of the limitations of the available 3C3D seismic data at the depth level of the reservoir target.
AQUEOUS HOMOGENEOUS REACTOR TECHNICAL PANEL REPORT
Energy Technology Data Exchange (ETDEWEB)
Diamond, D.J.; Bajorek, S.; Bakel, A.; Flanagan, G.; Mubayi, V.; Skarda, R.; Staudenmeier, J.; Taiwo, T.; Tonoike, K.; Tripp, C.; Wei, T.; Yarsky, P.
2010-12-03
Considerable interest has been expressed for developing a stable U.S. production capacity for medical isotopes, and particularly for molybdenum-99 (99Mo). This is motivated by recent reductions in production and supply worldwide. Consistent with U.S. nonproliferation objectives, any new production capability should not use highly enriched uranium fuel or targets. Consequently, Aqueous Homogeneous Reactors (AHRs) are under consideration for potential 99Mo production using low-enriched uranium. Although the Nuclear Regulatory Commission (NRC) has guidance to facilitate the licensing process for non-power reactors, that guidance is focused on reactors with fixed, solid fuel and is hence not applicable to an AHR. A panel was convened to study the technical issues associated with normal operation and potential transients and accidents of an AHR that might be designed for isotope production. The panel has produced the requisite AHR licensing guidance for three chapters that exist now for non-power reactor licensing: Reactor Description, Reactor Coolant Systems, and Accident Analysis. The guidance is in two parts for each chapter: 1) the standard format and content a licensee would use and 2) the standard review plan the NRC staff would use. This guidance takes into account the unique features of an AHR, such as the fuel being in solution; the fission product barriers being the vessel and attached systems; the production and release of radiolytic and fission product gases and their impact on operations and their control by a gas management system; and the movement of fuel into and out of the reactor vessel.
Molinelli, S.; Mairani, A.; Mirandola, A.; Vilches Freixas, G.; Tessonnier, T.; Giordanengo, S.; Parodi, K.; Ciocca, M.; Orecchia, R.
2013-06-01
During one year of clinical activity at the Italian National Center for Oncological Hadron Therapy 31 patients were treated with actively scanned proton beams. Results of patient-specific quality assurance procedures are presented here which assess the accuracy of a three-dimensional dose verification technique with the simultaneous use of multiple small-volume ionization chambers. To investigate critical cases of major deviations between treatment planning system (TPS) calculated and measured data points, a Monte Carlo (MC) simulation tool was implemented for plan verification in water. Starting from MC results, the impact of dose calculation, dose delivery and measurement set-up uncertainties on plan verification results was analyzed. All resulting patient-specific quality checks were within the acceptance threshold, which was set at 5% for both mean deviation between measured and calculated doses and standard deviation. The mean deviation between TPS dose calculation and measurement was less than ±3% in 86% of the cases. When all three sources of uncertainty were accounted for, simulated data sets showed a high level of agreement, with mean and maximum absolute deviation lower than 2.5% and 5%, respectively.
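The acceptance test described (mean deviation and standard deviation both within the 5% threshold) can be sketched as follows; the dose values are hypothetical, not measured data from the study:

```python
import numpy as np

def plan_passes(measured, calculated, tol=0.05):
    """Patient-specific QA check: relative deviations between measured and
    TPS-calculated point doses; both the mean deviation and its standard
    deviation must stay within the acceptance threshold (5% here)."""
    dev = (measured - calculated) / calculated
    return bool(abs(dev.mean()) <= tol and dev.std() <= tol)

# Hypothetical chamber readings vs. TPS predictions (Gy):
measured = np.array([1.98, 2.03, 2.05, 1.97, 2.01])
calculated = np.array([2.00, 2.00, 2.02, 2.00, 2.00])
result = plan_passes(measured, calculated)
```

A plan with a systematic 20% overdose, for instance, would fail the mean-deviation part of the same check.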
Genetic Homogenization of Composite Materials
Directory of Open Access Journals (Sweden)
P. Tobola
2009-04-01
Full Text Available The paper is focused on numerical studies of electromagnetic properties of composite materials used for the construction of small airplanes. Discussions concentrate on the genetic homogenization of composite layers and composite layers with a slot. The homogenization aims to reduce the CPU-time demands of EMC computational models of electrically large airplanes. First, a methodology for creating a 3-dimensional numerical model of a composite material in CST Microwave Studio is proposed, focusing on a sufficient accuracy of the model. Second, a proper implementation of a genetic optimization in Matlab is discussed. Third, an association of the optimization script with a simplified 2-dimensional homogeneous equivalent model in Comsol Multiphysics is proposed considering EMC issues. Results of computations are experimentally verified.
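A minimal real-coded genetic algorithm of the kind such a homogenization search uses can be sketched as follows; the fitness function and parameter bounds are stand-ins, not the paper's CST/Comsol cost function:

```python
import numpy as np

def genetic_minimize(fitness, bounds, pop=40, gens=80, seed=0):
    """Minimal real-coded genetic algorithm: tournament selection, blend
    crossover, Gaussian mutation, and elitism. A generic sketch of the
    scheme, not the Matlab optimization script discussed in the paper."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    x = rng.uniform(lo, hi, size=(pop, len(lo)))
    best_x, best_f = x[0].copy(), float("inf")
    for _ in range(gens):
        f = np.array([fitness(ind) for ind in x])
        if f.min() < best_f:
            best_f, best_x = float(f.min()), x[f.argmin()].copy()
        i, j = rng.integers(pop, size=(2, pop))
        parents = np.where((f[i] < f[j])[:, None], x[i], x[j])  # tournaments
        alpha = rng.random((pop, len(lo)))
        children = alpha * parents + (1 - alpha) * np.roll(parents, 1, axis=0)
        children += rng.normal(0.0, 0.02 * (hi - lo), children.shape)  # mutate
        x = np.clip(children, lo, hi)
        x[0] = best_x  # elitism: never lose the best individual
    return best_x, best_f

# Stand-in fitness: distance of (permittivity, loss) parameters from a
# hypothetical target the homogeneous equivalent layer should match.
params, residual = genetic_minimize(
    lambda p: (p[0] - 3.2) ** 2 + (p[1] - 0.8) ** 2,
    bounds=[(1.0, 10.0), (0.0, 2.0)])
```

In the paper's setting the fitness would instead compare the full composite model against the homogeneous equivalent, which is far more expensive per evaluation; the algorithmic skeleton is the same.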
Directory of Open Access Journals (Sweden)
del Río Merino, M.
2002-12-01
Full Text Available The company Vetrotex, in collaboration with the Department of Architectural Constructions and their Control (E.U.A.T.) of the Polytechnic University of Madrid, has carried out in-depth research on plaster reinforced with E-glass fiber.
The conclusions on the influence of the dispersibility of the glass fiber on the mechanical behavior of the composite and on its workability were published in an earlier article.
Here now are the results and conclusions of a study on mixed reinforcements of E and AR glass fibers combined at 50%, as an alternative to homogeneous reinforcement.
The company Vetrotex, through its technical staff and in collaboration with the Department of Architectural Constructions and their Control (E.U.A.T.) of the UPM, decided to undertake an in-depth study of plaster reinforced with E-glass fiber.
A first article presented the conclusions on the influence of the degree of dispersibility of the glass fibers on the mechanical behavior of the composite and on its workability.
This second article presents the results and conclusions of the study of mixed reinforcements of E-glass fibers combined at 50% with AR-glass fibers, as an alternative to the current homogeneous reinforcements.
Qin, Shitong; Li, Renxian; Yang, Ruiping; Ding, Chunying
2017-07-01
The interaction of an axicon-generated vector Bessel beam (AGVBB) with a homogeneous sphere is investigated in the framework of generalized Lorenz-Mie theory (GLMT). An analytical expression for the beam shape coefficients (BSCs) is derived using the angular spectrum decomposition method (ASDM), and the scattering coefficients are expanded using the Debye series (DSE) in order to isolate the contribution of each single scattering process. The internal and near-surface electric fields are numerically analyzed, and the effects of beam location, polarization, beam order, half-cone angle, and scattering process (namely, Debye mode p) are discussed. Numerical results show that a curve formed by extreme peaks can be observed, and the electric fields can be locally enhanced after the interaction of AGVBBs with the particle. Internal and near-surface fields, especially their local enhancement, are very sensitive to the beam parameters, including polarization, order, and half-cone angle. The internal fields can also be enhanced by the various scattering processes (or Debye modes p). These results have important applications in various fields, including particle sizing and optical tweezers.
Wang, Shuai; Wang, Guodong; Lv, Hailong; Wu, Renrong; Zhao, Jingping; Guo, Wenbin
2016-06-08
Subjects with psychosis risk syndrome (PRS) have structural and functional abnormalities in several brain regions. However, regional functional synchronization in PRS has not been clarified. We recruited 34 PRS subjects and 37 healthy controls. Regional homogeneity (ReHo) of resting-state functional magnetic resonance scans was employed to analyze regional functional synchronization in these participants. Receiver operating characteristic curves and support vector machines were used to detect whether abnormal regional functional synchronization could be utilized to separate PRS subjects from healthy controls. We observed that PRS subjects showed significant ReHo decreases in the left inferior temporal gyrus and increases in the right inferior frontal gyrus and right putamen compared with the controls. No correlations were found between abnormal regional functional synchronization in these brain regions and clinical characteristics. A combination of the ReHo values in the three brain regions showed sensitivity, specificity, and accuracy of 88.24%, 91.89%, and 90.14%, respectively, for discriminating PRS subjects from healthy controls. We inferred that abnormal regional functional synchronization exists in the cerebrum of PRS subjects, and a combination of ReHo values in these abnormal regions could be applied as a potential imaging biomarker to identify PRS subjects from healthy controls.
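The discrimination step can be illustrated with simulated ReHo features and a simple nearest-centroid classifier (a stand-in for the study's support vector machine; the group means, effect sizes, and noise level are invented for the sketch):

```python
import numpy as np

rng = np.random.default_rng(1)
# Simulated 3-region ReHo feature vectors. Effect directions follow the
# study (a decrease in one region, increases in two); magnitudes invented.
controls = rng.normal([0.0, 0.0, 0.0], 0.5, size=(37, 3))
prs = rng.normal([-1.0, 1.0, 1.0], 0.5, size=(34, 3))

mu_c = controls.mean(axis=0)
mu_p = prs.mean(axis=0)

def predict_prs(x):
    """Nearest-centroid rule on the combined 3-region feature vector (1 = PRS)."""
    return int(np.linalg.norm(x - mu_p) < np.linalg.norm(x - mu_c))

sens = np.mean([predict_prs(x) for x in prs])             # sensitivity
spec = 1.0 - np.mean([predict_prs(x) for x in controls])  # specificity
acc = (sens * len(prs) + spec * len(controls)) / (len(prs) + len(controls))
```

Combining the three regions into one feature vector is what lifts accuracy above what any single region would give, mirroring the study's use of the combined ReHo values.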
Cappelle, J; Caron, A; Servan De Almeida, R; Gil, P; Pedrono, M; Mundava, J; Fofana, B; Balança, G; Dakouo, M; Ould El Mamy, A B; Abolnik, C; Maminiaina, O F; Cumming, G S; De Visscher, M-N; Albina, E; Chevalier, V; Gaidet, N
2015-04-01
Newcastle disease (ND) is one of the most important poultry diseases worldwide and can lead to annual losses of up to 80% of backyard chickens in Africa. All bird species are considered susceptible to ND virus (NDV) infection but little is known about the role that wild birds play in the epidemiology of the virus. We present a long-term monitoring of 9000 wild birds in four African countries. Overall, 3·06% of the birds were PCR-positive for NDV infection, with prevalence ranging from 0% to 10% depending on the season, the site and the species considered. Our study shows that ND is circulating continuously and homogeneously in a large range of wild bird species. Several genotypes of NDV circulate concurrently in different species and are phylogenetically closely related to strains circulating in local domestic poultry, suggesting that wild birds may play several roles in the epidemiology of different NDV strains in Africa. We recommend that any strategic plan aiming at controlling ND in Africa should take into account the potential role of the local wild bird community in the transmission of the disease.
Institute of Scientific and Technical Information of China (English)
王昌达; 华明辉; 周从华; 宋香梅; 鞠时光
2011-01-01
To enable rapid security analysis of access control policies, predicate abstraction with verification-space division is presented: the analysis performed on the original state-machine model of a policy is transferred to an abstract state-machine model that contains far fewer states. Verification-space division is further introduced to reduce the dimensionality of model checking. Both theoretical analysis and experimental data show that the time and space required for security analysis are effectively reduced. Compared with known methods, this methodology is faster and more highly automated.
Finite Countermodel Based Verification for Program Transformation (A Case Study)
Directory of Open Access Journals (Sweden)
Alexei P. Lisitsa
2015-12-01
Full Text Available Both automatic program verification and program transformation are based on program analysis. In the past decade a number of approaches using various automatic general-purpose program transformation techniques (partial deduction, specialization, supercompilation) for verification of unreachability properties of computing systems were introduced and demonstrated. On the other hand, the semantics-based unfold/fold program transformation methods pose diverse kinds of reachability tasks and try to solve them, aiming at improving the semantics tree of the program being transformed. That means some general-purpose verification methods may be used to strengthen program transformation techniques. This paper considers the question of how the finite-countermodel method for safety verification might be used in Turchin's supercompilation method. We extract a number of supercompilation sub-algorithms that try to solve reachability problems and demonstrate the use of an external countermodel finder for solving some of them.
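As a toy illustration of the reachability questions both communities pose, an explicit-state safety check can be sketched with breadth-first search (a stand-in far simpler than countermodel finding or supercompilation):

```python
from collections import deque

def unreachable(initial, step, bad, limit=100_000):
    """Explicit-state safety check: breadth-first exploration of the
    reachable state space; returns True iff no `bad` state is reachable.
    A toy stand-in for the countermodel-based machinery discussed above."""
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        state = frontier.popleft()
        if bad(state):
            return False
        for nxt in step(state):
            if nxt not in seen:
                if len(seen) >= limit:
                    raise RuntimeError("state space too large for explicit search")
                seen.add(nxt)
                frontier.append(nxt)
    return True

# Toy transition system on Z/8: from s one may double it or add 2 (mod 8).
# Starting from 0, only even states are ever reachable.
step = lambda s: ((2 * s) % 8, (s + 2) % 8)
safe = unreachable(0, step, lambda s: s == 3)  # an odd "bad" state
```

Countermodel-based methods answer the same question symbolically, by exhibiting a finite model in which the bad state is provably unreachable, instead of enumerating states.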
International Space Station Requirement Verification for Commercial Visiting Vehicles
Garguilo, Dan
2017-01-01
The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (reduced from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
Online fingerprint verification.
Upendra, K; Singh, S; Kumar, V; Verma, H K
2007-01-01
As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications.
Shift Verification and Validation
Energy Technology Data Exchange (ETDEWEB)
Pandya, Tara M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Evans, Thomas M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Davidson, Gregory G [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Johnson, Seth R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Godfrey, Andrew T. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
2016-09-07
This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
Palm Vein Verification Using Gabor Filter
Directory of Open Access Journals (Sweden)
Ali Mohsin Al-Juboori
2013-01-01
Full Text Available Palm vein authentication is one of the modern biometric techniques, which employs the vein pattern in the human palm to verify a person. The merits of palm veins over classical biometrics (e.g. fingerprint, iris, face) are a low risk of falsification, difficulty of duplication, and stability. In this research, a new method is proposed for personal verification based on palm vein features. In the proposed method, the palm vein images are first enhanced and the features are then extracted using a bank of Gabor filters. Fisher Discriminant Analysis (FDA) is then used to reduce the dimension of the feature vectors. For vein pattern verification, this work uses the Nearest Neighbors method. The EER of the proposed method is 0.2335%.
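As a rough illustration of this kind of pipeline (a sketch under assumed parameters, not the authors' implementation: the enhancement and FDA stages are omitted, and the kernel size, filter frequency, block grid, and decision threshold are all invented for the example), Gabor filter-bank feature extraction followed by a distance-based verification decision might look like:

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(size, theta, freq, sigma):
    """Real part of a 2-D Gabor kernel: Gaussian envelope times an oriented cosine."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)          # coordinate along orientation theta
    env = np.exp(-(x**2 + y**2) / (2.0 * sigma**2))
    return env * np.cos(2.0 * np.pi * freq * xr)

def gabor_features(img, n_orient=4, freq=0.1, sigma=4.0, grid=4):
    """Filter with a small bank of oriented Gabor kernels, then pool mean
    filter energy over a coarse grid of blocks into one feature vector."""
    img = (img - img.mean()) / (img.std() + 1e-8)       # crude contrast normalisation
    h, w = img.shape
    bh, bw = h // grid, w // grid
    feats = []
    for k in range(n_orient):
        kern = gabor_kernel(15, k * np.pi / n_orient, freq, sigma)
        resp = np.abs(convolve2d(img, kern, mode="same"))
        for i in range(grid):
            for j in range(grid):
                feats.append(resp[i*bh:(i+1)*bh, j*bw:(j+1)*bw].mean())
    return np.asarray(feats)

def verify(probe_feat, template_feat, threshold=1.0):
    """Accept if the probe's feature vector is close enough to the enrolled template."""
    return np.linalg.norm(probe_feat - template_feat) < threshold
```

In a real system the threshold would be tuned on an enrollment set (trading false accepts against false rejects to reach an operating point such as the EER), and a learned projection like FDA would be applied to the feature vectors before matching.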
RPG 行星减速器齿面应力均化分析%The homogenization analysis for tooth surface stress of RPG planetary reductor
Institute of Scientific and Technical Information of China (English)
孔霞; 蔡云龙
2015-01-01
The RPG planetary reductor is a special reductor operating under conditions of large power, low speed, and large output torque. Many components of its output structure deform easily, so an unbalanced distribution of tooth surface stress is inevitable. These problems can worsen transmission conditions, raising tooth surface contact stress, lowering tooth-root flexural strength, and unbalancing the loads on the planetary gear bearings. Taking the RPG39-50 planetary reductor as an example, the finite element method is used to build a flexible model of the reductor and to analyse the deformation of key components such as the planetary gear frame, the planetary gear shafts, the planetary gear bearings, and the output gears. Targeted tooth-surface profile modification is then performed. The result homogenizes the tooth surface stress and effectively improves the life and reliability of the gear transmission system.
40 CFR 1065.307 - Linearity verification.
2010-07-01
... linearity verification generally consists of introducing a series of at least 10 reference values to a... reference values of the linearity verification. For pressure, temperature, dewpoint, and GC-ECD linearity verifications, we recommend at least three reference values. For all other linearity verifications select...
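The statistical core of such a linearity verification is an ordinary least-squares fit of the measured values against the reference values, with the slope, intercept, standard error of estimate (SEE), and coefficient of determination compared against acceptance criteria. A minimal sketch (the regulation's acceptance thresholds vary by measured quantity and are not reproduced here):

```python
def linearity_check(reference, measured):
    """Least-squares fit measured = a0 + a1 * reference over the reference points,
    returning (slope, intercept, standard error of estimate, r squared)."""
    n = len(reference)
    xbar = sum(reference) / n
    ybar = sum(measured) / n
    sxx = sum((x - xbar) ** 2 for x in reference)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(reference, measured))
    a1 = sxy / sxx                       # slope
    a0 = ybar - a1 * xbar                # intercept
    resid = [y - (a0 + a1 * x) for x, y in zip(reference, measured)]
    see = (sum(r * r for r in resid) / (n - 2)) ** 0.5   # standard error of estimate
    syy = sum((y - ybar) ** 2 for y in measured)
    r2 = 1.0 - sum(r * r for r in resid) / syy
    return a1, a0, see, r2
```

An instrument passes when the slope is near 1, the intercept near 0, and SEE and 1 - r² are below the applicable limits for that quantity.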
Reconfigurable system design and verification
Hsiung, Pao-Ann; Huang, Chun-Hsian
2009-01-01
Reconfigurable systems have pervaded nearly all fields of computation and will continue to do so for the foreseeable future. Reconfigurable System Design and Verification provides a compendium of design and verification techniques for reconfigurable systems, allowing you to quickly search for a technique and determine if it is appropriate to the task at hand. It bridges the gap between the need for reconfigurable computing education and the burgeoning development of numerous different techniques in the design and verification of reconfigurable systems in various application domains. The text e
The Approach to Steady State Using Homogeneous and Cartesian Coordinates
Directory of Open Access Journals (Sweden)
D. F. Gochberg
2013-01-01
Full Text Available Repeating an arbitrary sequence of RF pulses and magnetic field gradients will eventually lead to a steady-state condition in any magnetic resonance system. While numerical methods can quantify this trajectory, analytic analysis provides significantly more insight and a means for faster calculation. Recently, an analytic analysis using homogeneous coordinates was published. The current work further develops this line of thought and compares the relative merits of using a homogeneous or a Cartesian coordinate system.
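Schematically (a generic illustration of the homogeneous-coordinate idea, not the paper's specific derivation), one repetition period acts on the magnetization as an affine map whose fixed point is the steady state:

```latex
M_{k+1} = A\,M_k + b,
\qquad
M_{\mathrm{ss}} = (I - A)^{-1}\, b .
```

Appending a constant coordinate converts this affine map into a single linear operator, so the steady state becomes the eigenvector of the augmented matrix with eigenvalue 1:

```latex
\tilde M_k = \begin{pmatrix} M_k \\ 1 \end{pmatrix},
\qquad
\tilde T = \begin{pmatrix} A & b \\ 0 & 1 \end{pmatrix},
\qquad
\tilde M_{k+1} = \tilde T \,\tilde M_k .
```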
一种嵌入式系统模型的安全性分析验证方法%A Verification Method of Security Analysis for Embedded System Model
Institute of Scientific and Technical Information of China (English)
石娇洁; 胡军; 刘雪; 马金晶; 黄志球; 程桢
2015-01-01
Because the design cycle of embedded system models is becoming shorter while their functionality grows more complex, security analysis and verification methods for such models have become an important research topic in safety-critical systems engineering in recent years. In view of this, this paper proposes a model-driven-architecture-based security analysis and verification method for SysML/MARTE state machines. Specifically, a state machine metamodel with SysML/MARTE extension semantics is constructed, together with a metamodel of GTS, the semantic model of the high-level safety modelling and analysis language AltaRica. Semantic-mapping model transformation rules from SysML/MARTE state machine models to AltaRica models are then established, and the model transformation for SysML/MARTE state machines and a framework for formal system security verification are implemented on the AMMA platform using the fault tree analysis tool XFTA. Finally, a security verification example is given for the design model of a civil aircraft wheel brake system. Experimental results show that the proposed security analysis and verification method for embedded system design models is representative and executable.
Gomes, M F; Valva, V N; Vieira, E M M; Giannasi, L C; Salgado, M A C; Vilela-Goulart, M G
2016-02-01
This study evaluated the effects of homogenous demineralized dentin matrix (HDDM) slices and platelet-rich plasma (PRP) in surgical defects created in the parietal bones of alloxan-induced diabetic rabbits, treated with a guided bone regeneration technique. Biochemical, radiographic, and histological analyses were performed. Sixty adult New Zealand rabbits were divided into five groups of 12: normoglycaemic (control, C), diabetic (D), diabetic with a PTFE membrane (DM), diabetic with a PTFE membrane and HDDM slices (DM-HDDM), and diabetic with PTFE membrane and PRP (DM-PRP). The quantity and quality of bone mass was greatest in the DM-HDDM group (respective radiographic and histological analyses: at 15 days, 71.70 ± 16.50 and 50.80 ± 1.52; 30 days, 62.73 ± 16.51 and 54.20 ± 1.23; 60 days, 63.03 ± 11.04 and 59.91 ± 3.32; 90 days, 103.60 ± 24.86 and 78.99 ± 1.34), followed by the DM-PRP group (respective radiographic and histological analyses: at 15 days 23.00 ± 2.74 and 20.66 ± 7.45; 30 days 31.92 ± 6.06 and 25.31 ± 5.59; 60 days 25.29 ± 16.30 and 46.73 ± 2.07; 90 days 38.10 ± 14.04 and 53.38 ± 9.20). PRP greatly enhanced vascularization during the bone repair process. Abnormal calcium metabolism was statistically significant in the DM-PRP group (P<0.001) for all four time intervals studied, especially when compared to the DM-HDDM group. Alkaline phosphatase activity was significantly higher in the DM-HDDM group (P<0.001) in comparison to the C, D, and DM-PRP groups, confirming the findings of intense osteoblastic activity and increased bone mineralization. Thus, HDDM promoted superior bone architectural microstructure in bone defects in diabetic rabbits due to its effective osteoinductive and osteoconductive activity, whereas PRP stimulated angiogenesis and red bone marrow formation. Copyright © 2015 International Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Fleischhauer, Robert; Božić, Marko; Kaliske, Michael
2016-11-01
The paper introduces a novel approach to computational homogenization by bridging the scales from microscale to macroscale. Whenever the microstructure is in an equilibrium state, the macrostructure needs to be in equilibrium, too. The novel approach is based on the concept of representative volume elements, stating that an assemblage of representative elements should be able to resemble the macrostructure. The resulting key assumption is the continuity of the appropriate kinematic fields across both scales. This assumption motivates the following idea. In contrast to existing approaches, where mostly constitutive quantities are homogenized, the balance equations, that drive the considered field quantities, are homogenized. The approach is applied to the fully coupled partial differential equations of thermomechanics solved by the finite element (FE) method. A novel consistent finite homogenization element is given with respect to discretized residual formulations and linearization terms. The presented FE has no restrictions regarding the thermomechanical constitutive laws that are characterizing the microstructure. A first verification of the presented approach is carried out against semi-analytical and reference solutions within the range of one-dimensional small strain thermoelasticity. Further verification is obtained by a comparison to the classical FE^2 method and its different types of boundary conditions within a finite deformation setting of purely mechanical problems. Furthermore, the efficiency of the novel approach is investigated and compared. Finally, structural examples are shown in order to demonstrate the applicability of the presented homogenization framework in case of finite thermo-inelasticity at different length scales.
Generic interpreters and microprocessor verification
Windley, Phillip J.
1990-01-01
The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.
Numident Online Verification Utility (NOVU)
Social Security Administration — NOVU is a mainframe application that accesses the NUMIDENT to perform real-time SSN verifications. This program is called by other SSA online programs that serve as...
Biometric verification with correlation filters
Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit
2004-01-01
Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.
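The matching step common to these correlation-filter schemes can be sketched as frequency-domain cross-correlation followed by a peak-to-sidelobe ratio (PSR) decision. The sketch below uses a raw stored template rather than a trained synthetic discriminant function filter, and the sidelobe exclusion window is an arbitrary choice for illustration:

```python
import numpy as np

def correlate(template, probe):
    """Circular cross-correlation via the FFT: the core operation of
    correlation-filter matching (here with a raw template as the 'filter')."""
    F = np.fft.fft2(template)
    P = np.fft.fft2(probe)
    return np.real(np.fft.ifft2(np.conj(F) * P))

def peak_to_sidelobe(corr, exclude=2):
    """Peak-to-sidelobe ratio: a sharp, high correlation peak relative to the
    surrounding correlation plane indicates an authentic match."""
    py, px = np.unravel_index(corr.argmax(), corr.shape)
    mask = np.ones(corr.shape, dtype=bool)
    mask[max(0, py - exclude):py + exclude + 1,
         max(0, px - exclude):px + exclude + 1] = False  # cut out the peak region
    side = corr[mask]
    return (corr.max() - side.mean()) / (side.std() + 1e-12)
```

A verification decision would threshold the PSR; advanced filter designs (e.g. synthetic discriminant functions) replace the raw template spectrum with one trained over several enrollment images to tolerate expression and illumination changes.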
Energy Technology Data Exchange (ETDEWEB)
Quiros Higueras, J. D.; Marco Blancas, N. de; Ruiz Rodriguez, J. C.
2011-07-01
One of the essential tests in the quality control of brachytherapy afterloading equipment is verification of the intrinsic positioning of the radioactive source. A classic evaluation method uses x-ray film, measuring the distance between the marks left by autoradiography of the source and a reference. Our centre has developed an automated measurement method based on scanning radiochromic film and running a macro developed in Matlab, in order to optimize time and reduce measurement uncertainty. The purpose of this paper is to describe the method developed, assess its uncertainty, and quantify its advantages over the manual method. (Author)
Homogeneous Biosensing Based on Magnetic Particle Labels
Schrittwieser, Stefan
2016-06-06
The growing availability of biomarker panels for molecular diagnostics is leading to an increasing need for fast and sensitive biosensing technologies that are applicable to point-of-care testing. In that regard, homogeneous measurement principles are especially relevant as they usually do not require extensive sample preparation procedures, thus reducing the total analysis time and maximizing ease-of-use. In this review, we focus on homogeneous biosensors for the in vitro detection of biomarkers. Within this broad range of biosensors, we concentrate on methods that apply magnetic particle labels. The advantage of such methods lies in the added possibility to manipulate the particle labels by applied magnetic fields, which can be exploited, for example, to decrease incubation times or to enhance the signal-to-noise-ratio of the measurement signal by applying frequency-selective detection. In our review, we discriminate the corresponding methods based on the nature of the acquired measurement signal, which can either be based on magnetic or optical detection. The underlying measurement principles of the different techniques are discussed, and biosensing examples for all techniques are reported, thereby demonstrating the broad applicability of homogeneous in vitro biosensing based on magnetic particle label actuation.
Homogeneity and plane-wave limits
Figueroa-O'Farrill, J M; Philip, S; Farrill, Jos\\'e Figueroa-O'; Meessen, Patrick; Philip, Simon
2005-01-01
We explore the plane-wave limit of homogeneous spacetimes. For plane-wave limits along homogeneous geodesics the limit is known to be homogeneous and we exhibit the limiting metric in terms of Lie algebraic data. This simplifies many calculations and we illustrate this with several examples. We also investigate the behaviour of (reductive) homogeneous structures under the plane-wave limit.
The MODUS Approach to Formal Verification
Directory of Open Access Journals (Sweden)
Brewka Lukasz
2014-03-01
Full Text Available Background: Software reliability is of great importance for the development of embedded systems, which are often used in applications with safety requirements. Since the life cycle of embedded products is becoming shorter, productivity and quality are simultaneously required in the process of providing competitive products. Objectives: In relation to this, the MODUS project (Method and supporting toolset advancing embedded systems quality) aims to provide small and medium-sized businesses with a pragmatic and viable way to improve their position in the embedded market. Methods/Approach: This paper describes the MODUS project with a focus on the technical methodologies that can assist formal verification and formal model checking. Results: Based on automated analysis of system characteristics, and by controlling the choice among the existing open-source model verification engines, model verification produces inputs to be fed into these engines. Conclusions: The MODUS approach is aligned with present market needs; familiarity with tools, ease of use, and compatibility/interoperability remain among the most important criteria when selecting the development environment for a project.
Enhanced Verification Test Suite for Physics Simulation Codes
Energy Technology Data Exchange (ETDEWEB)
Kamm, J R; Brock, J S; Brandon, S T; Cotrell, D L; Johnson, B; Knupp, P; Rider, W; Trucano, T; Weirs, V G
2008-10-10
This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of
Homogenization method for elastic materials
Directory of Open Access Journals (Sweden)
Seifrt F.
2007-11-01
Full Text Available In the paper we study the homogenization method and its potential for research into phenomena connected with periodic elastic materials. This method will be applied to partial differential equations that describe the deformation of a periodic composite material. The next part of the paper deals with applications of the homogenization method. The importance of the method is discussed in more detail for the exploration of so-called bandgaps. A bandgap is a phenomenon which may appear during vibrations of some periodically heterogeneous materials. This phenomenon is not only observable during vibrations of the aforementioned materials; similar effects may also be observed in the propagation of electromagnetic waves through heterogeneous dielectric media.
Giant dielectric anisotropy via homogenization
Mackay, Tom G
2014-01-01
A random mixture of two isotropic dielectric materials, one composed of oriented spheroidal particles of relative permittivity $\\epsilon_a$ and the other composed of oriented spheroidal particles of relative permittivity $\\epsilon_b$, was considered in the long wavelength regime. The permittivity dyadic of the resulting homogenized composite material (HCM) was estimated using the Bruggeman homogenization formalism. The HCM was an orthorhombic biaxial material if the symmetry axes of the two populations of spheroids were mutually perpendicular and a uniaxial material if these two axes were mutually aligned. The degree of anisotropy of the HCM, as gauged by the ratio of the eigenvalues of the HCM's permittivity dyadic, increased as the shape of the constituent particles became more eccentric. The greatest degrees of HCM anisotropy were achieved for the limiting cases wherein the constituent particles were shaped as needles or discs. In these instances explicit formulas for the HCM anisotropy were derived from t...
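For the simpler special case of spherical (rather than spheroidal) inclusions with real permittivities, the Bruggeman condition f_a(eps_a - e)/(eps_a + 2e) + f_b(eps_b - e)/(eps_b + 2e) = 0 reduces to a quadratic in the effective permittivity e with one physical root; the spheroidal case of the paper would instead require depolarization factors along each particle axis. A sketch:

```python
import math

def bruggeman_spheres(eps_a, eps_b, f_a):
    """Effective permittivity of a random two-phase mixture of spherical
    inclusions (real permittivities), from the Bruggeman condition
    f_a*(eps_a - e)/(eps_a + 2e) + (1 - f_a)*(eps_b - e)/(eps_b + 2e) = 0,
    which rearranges to the quadratic 2*e**2 - b*e - eps_a*eps_b = 0."""
    f_b = 1.0 - f_a
    b = f_a * (2.0 * eps_a - eps_b) + f_b * (2.0 * eps_b - eps_a)
    # positive root of the quadratic is the physical effective permittivity
    return (b + math.sqrt(b * b + 8.0 * eps_a * eps_b)) / 4.0
```

The formula correctly reduces to the pure-phase limits at f_a = 0 and f_a = 1 and interpolates between them for intermediate volume fractions.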
Woodward Effect Experimental Verifications
March, Paul
2004-02-01
The work of J. F. Woodward (1990 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT as well as gravitational / inertia like Wheeler-Feynman radiation reaction forces hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations or ``Woodward effect'' (W-E). Later, in collaboration with his ex-graduate student T. Mahood, they also pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, will also be presented.
CSIR Research Space (South Africa)
Every, AG
2010-01-01
Full Text Available International Congress on Ultrasonics, Universidad de Santiago de Chile, January 2009. Progress in the analysis of non-axisymmetric wave propagation in a...
Stability properties of autonomous homogeneous polynomial differential systems
Samardzija, Nikola
A geometrical approach is used to derive a generalized characteristic value problem for dynamic systems described by homogeneous polynomials. It is shown that a nonlinear homogeneous polynomial system possesses eigenvectors and eigenvalues, quantities normally associated with a linear system. These quantities are then employed in studying stability properties. The necessary and sufficient conditions for all forms of stability characteristic of a two-dimensional system are provided. This result, together with the classical theorem of Frommer, completes a stability analysis for two-dimensional homogeneous polynomial systems.
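The idea that a homogeneous polynomial field F can have eigenpairs F(v) = lambda*v (rays along which the flow is purely radial) can be illustrated by a brute-force angular scan in two dimensions. The sample field, grid resolution, and tolerance below are arbitrary choices for the illustration, not the paper's method:

```python
import math

def eigendirections(F, n=3600, tol=1e-3):
    """Scan unit vectors v on the circle and keep those where F(v) is parallel
    to v (2-D cross product ~ 0); for such v, lambda = F(v) . v since |v| = 1."""
    found = []
    for k in range(n):
        t = 2.0 * math.pi * k / n
        v = (math.cos(t), math.sin(t))
        fx, fy = F(v)
        cross = v[0] * fy - v[1] * fx        # parallelism test
        if abs(cross) < tol:
            lam = v[0] * fx + v[1] * fy      # eigenvalue along this ray
            found.append((v, lam))
    return found

# Hypothetical degree-2 example field: F(x, y) = (x**2, y**2).
F = lambda v: (v[0] ** 2, v[1] ** 2)
rays = eigendirections(F)
```

For this example field the coordinate axes and the main diagonal are eigendirections; the sign of each eigenvalue then indicates whether trajectories along that ray move toward or away from the origin.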
Organics Verification Study for Sinclair and Dyes Inlets, Washington
Energy Technology Data Exchange (ETDEWEB)
Kohn, Nancy P.; Brandenberger, Jill M.; Niewolny, Laurie A.; Johnston, Robert K.
2006-09-28
Sinclair and Dyes Inlets near Bremerton, Washington, are on the State of Washington 1998 303(d) list of impaired waters because of fecal coliform contamination in marine water, metals in sediment and fish tissue, and organics in sediment and fish tissue. Because significant cleanup and source control activities have been conducted in the inlets since the data supporting the 1998 303(d) listings were collected, two verification studies were performed to address the 303(d) segments that were listed for metal and organic contaminants in marine sediment. The Metals Verification Study (MVS) was conducted in 2003; the final report, Metals Verification Study for Sinclair and Dyes Inlets, Washington, was published in March 2004 (Kohn et al. 2004). This report describes the Organics Verification Study that was conducted in 2005. The study approach was similar to the MVS in that many surface sediment samples were screened for the major classes of organic contaminants, and then the screening results and other available data were used to select a subset of samples for quantitative chemical analysis. Because the MVS was designed to obtain representative data on concentrations of contaminants in surface sediment throughout Sinclair Inlet, Dyes Inlet, Port Orchard Passage, and Rich Passage, aliquots of the 160 MVS sediment samples were used in the analysis for the Organics Verification Study. However, unlike metals screening methods, organics screening methods are not specific to individual organic compounds, and are not available for some target organics. Therefore, only the quantitative analytical results were used in the organics verification evaluation. The results of the Organics Verification Study showed that sediment quality outside of Sinclair Inlet is unlikely to be impaired because of organic contaminants. Similar to the results for metals, in Sinclair Inlet, the distribution of residual organic contaminants is generally limited to nearshore areas already within the
Zipay, John J.; Bernstein, Karen S.; Bruno, Erica E.; Deloo, Phillipe; Patin, Raymond
2012-01-01
The International Space Station (ISS) can be considered one of the structural engineering wonders of the world. On par with the World Trade Center, the Colossus of Rhodes, the Statue of Liberty, the Great Pyramids, the Petronas towers and the Burj Khalifa skyscraper of Dubai, the ambition and scope of the ISS structural design, verification and assembly effort is a truly global success story. With its on-orbit life projected to be from its beginning in 1998 to the year 2020 (and perhaps beyond), all of those who participated in its development can consider themselves part of an historic engineering achievement representing all of humanity. The structural design and verification of the ISS could be the subject of many scholarly papers. Several papers have been written on the structural dynamic characterization of the ISS once it was assembled on-orbit [1], but the ground-based activities required to assure structural integrity and structural life of the individual elements from delivery to orbit through assembly and planned on-orbit operations have never been totally summarized. This paper is intended to give the reader an overview of some of the key decisions made during the structural verification planning for the elements of the U.S. On-Orbit Segment (USOS) as well as to summarize the many structural tests and structural analyses that were performed on its major elements. An effort is made for this paper to be summarily comprehensive, but as with all knowledge capture efforts of this kind, there are bound to be errors of omission. Should the reader discover any of these, please feel free to contact the principal author. The ISS (Figure 1) is composed of pre-integrated truss segments and pressurized elements supplied by NASA, the Russian Federal Space Agency (RSA), the European Space Agency (ESA) and the Japanese Aerospace Exploration Agency (JAXA). Each of these elements was delivered to orbit by a launch vehicle and connected to one another either robotically or
Energy Technology Data Exchange (ETDEWEB)
Kolev, S.D. (Sofia Univ. (Bulgaria). Khimicheski Fakultet); Nagy, Geza; Pungor, Ernoe (Budapesti Mueszaki Egyetem, Budapest (Hungary). Altalanos es Analitikai Kemia Tanszek)
1991-11-20
Glucose and urea electrodes, prepared by two different enzyme immobilization techniques and used as detectors in a single-line flow-injection manifold, were experimentally investigated for elucidating the influence of their most important parameters, i.e., the initial substrate concentration in the sample, the enzyme concentration in the reaction layer and its thickness and the buffer concentration, on the output signal. The results obtained were compared with the theoretical predictions based on simulations of the model for single-line flow-injection systems with single-layer enzyme electrode detection. The good qualitative agreement which was observed is a convincing experimental verification of this model and the guidelines for the production of flow-through biocatalytic electrodes with optimum design based upon it. (author). 12 refs.; 6 figs.
Technical safety requirements control level verification
Energy Technology Data Exchange (ETDEWEB)
STEWART, J.L.
1999-05-21
A Technical Safety Requirement (TSR) control level verification process was developed for the Tank Waste Remediation System (TWRS) TSRs at the Hanford Site in Richland, WA, at the direction of the U.S. Department of Energy, Richland Operations Office (RL). The objective of the effort was to develop a process to ensure that the TWRS TSR controls are designated and managed at the appropriate levels as Safety Limits (SLs), Limiting Control Settings (LCSs), Limiting Conditions for Operation (LCOs), Administrative Controls (ACs), or Design Features. The TSR control level verification process was developed and implemented by a team of contractor personnel with the participation of Fluor Daniel Hanford, Inc. (FDH), the Project Hanford Management Contract (PHMC) integrating contractor, and RL representatives. The team was composed of individuals with the following experience base: nuclear safety analysis; licensing; nuclear industry and DOE-complex TSR preparation/review experience; tank farm operations; FDH policy and compliance; and RL-TWRS oversight. Each TSR control level designation was completed utilizing TSR control logic diagrams and TSR criteria checklists based on DOE Orders, Standards, Contractor TSR policy, and other guidance. The control logic diagrams and criteria checklists were reviewed and modified by team members during team meetings. The TSR control level verification process was used to systematically evaluate 12 LCOs, 22 AC programs, and approximately 100 program key elements identified in the TWRS TSR document. The verification of each TSR control required a team consensus. Based on the results of the process, refinements were identified and the TWRS TSRs were modified as appropriate. A final report documenting key assumptions and the control level designation for each TSR control was prepared and is maintained on file for future reference. The results of the process were used as a reference in the RL review of the final TWRS TSRs and control suite. RL
Peripheral nerve magnetic stimulation: influence of tissue non-homogeneity
Directory of Open Access Journals (Sweden)
Papazov Sava P
2003-12-01
Full Text Available Abstract Background Peripheral nerves are situated in a highly non-homogeneous environment, including muscles, bones, blood vessels, etc. Time-varying magnetic field stimulation of the median and ulnar nerves in the carpal region is studied, with special consideration of the influence of non-homogeneities. Methods A detailed three-dimensional finite element model (FEM) of the anatomy of the wrist region was built to assess the distribution of currents induced by external magnetic stimulation. The electromagnetic field distribution in the non-homogeneous domain was defined as an internal Dirichlet problem using the finite element method. The boundary conditions were obtained by analysis of the vector potential field excited by external current-driven coils. Results The results include evaluation and graphical representation of the induced current field distribution at various stimulation coil positions. A comparative study of the real non-homogeneous structure, with anisotropic tissue conductivities, and a mock homogeneous medium is also presented. The possibility of achieving selective stimulation of either of the two nerves is assessed. Conclusion The model developed could be useful for theoretical prediction of the current distribution in the nerves during diagnostic stimulation and therapeutic procedures involving electromagnetic excitation. The errors introduced by modeling a homogeneous domain rather than the real non-homogeneous biological structure are demonstrated. The practical implications of the applied approach are valid for any arbitrary weakly conductive medium.
Heterotic strings on homogeneous spaces
Israel, D; Orlando, D; Petropoulos, P M; Israel, Dan; Kounnas, Costas; Orlando, Domenico
2004-01-01
We construct heterotic string backgrounds corresponding to families of homogeneous spaces as exact conformal field theories. They contain left cosets of compact groups by their maximal tori supported by NS-NS 2-forms and gauge field fluxes. We give the general formalism and modular-invariant partition functions, then we consider some examples such as SU(2)/U(1) ~ S^2 (already described in a previous paper) and the SU(3)/U(1)^2 flag space. As an application we construct new supersymmetric string vacua with magnetic fluxes and a linear dilaton.
Homogeneous orbit closures and applications
Lindenstrauss, Elon
2011-01-01
We give new classes of examples of orbits of the diagonal group in the space of unit volume lattices in R^d for d > 2 with nice (homogeneous) orbit closures, as well as examples of orbits with explicitly computable but irregular orbit closures. We give Diophantine applications of the former; for instance we show that if x is the cube root of 2 then for any y, z in R, liminf_{|n| → ∞} |n| ⟨nx - y⟩ ⟨nx^2 - z⟩ = 0, where ⟨c⟩ denotes the distance of a real number c to the integers.
Pastor, F.; Anoukou, K.; Pastor, J.; Kondo, D.
2016-06-01
This second part of the two-part study is devoted to the numerical Limit Analysis of a hollow sphere model with a Mohr-Coulomb matrix and its use for the assessment of theoretical results. Brief background and fundamentals of the static and kinematic approaches in the context of numerical limit analysis are first recalled. We then present the hollow sphere model, together with its axisymmetric FEM discretization and its mechanical position. A conic programming adaptation of a previous iterative static approach, based on a piecewise linearization (PWL) of the plasticity criterion, was first realized. Unfortunately, the resulting code, like the PWL one, did not allow sufficiently refined meshes, owing to loss of convergence of the conic optimizer. This problem was solved by using the projection algorithm of Ben-Tal and Nemirovski (BTN) and the (interior point) linear programming code XA. For the kinematic approach, a first conic adaptation also proved inefficient. An original mixed (but fully kinematic) approach dedicated to the general Mohr-Coulomb axisymmetric problem was therefore elaborated. The final conic mixed code appears much more robust than the classic one when using the conic code MOSEK, allowing us to take into account refined numerical meshes. After a careful validation in the case of spherical cavities and isotropic loadings (for which the exact solution is known) and comparison with previous (partial) results, numerical lower and upper bounds (verified a posteriori) of the macroscopic strength are provided. These bounds are used to assess and validate the theoretical results of the companion (part I) paper. Effects of the friction angle as well as of the porosity are illustrated.
Energy Technology Data Exchange (ETDEWEB)
Amano, Hikaru; Saito, Kimiaki (eds.) [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment
2001-03-01
This workshop was organized and sponsored by the Japan Atomic Energy Research Institute (JAERI) and the Chernobyl Science and Technology Center for International Research (CHESCIR). JAERI and CHESCIR conducted eight years of research cooperation, from 1992 to 1999, concerning the assessment and analysis of environmental radiological consequences and the verification of an assessment system, focusing on the Chernobyl contaminated area. The cooperation comprised three research subjects. Subject-1, initiated in 1992, focused on measurements and evaluation of environmental external exposure after a nuclear accident. Subject-2, initiated in 1992, focused on the validation of assessment models in an environmental consequence assessment methodology for nuclear accidents. Subject-3, initiated in 1995, focused on the migration of radionuclides released into the terrestrial and aquatic environments after nuclear accidents. This workshop was held to summarize the research cooperation between JAERI and CHESCIR and to discuss future research needs in this field. (author)
Predicting SMT Solver Performance for Software Verification
Directory of Open Access Journals (Sweden)
Andrew Healy
2017-01-01
Full Text Available The Why3 IDE and verification system facilitates the use of a wide range of Satisfiability Modulo Theories (SMT solvers through a driver-based architecture. We present Where4: a portfolio-based approach to discharge Why3 proof obligations. We use data analysis and machine learning techniques on static metrics derived from program source code. Our approach benefits software engineers by providing a single utility to delegate proof obligations to the solvers most likely to return a useful result. It does this in a time-efficient way using existing Why3 and solver installations - without requiring low-level knowledge about SMT solver operation from the user.
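A portfolio chooser of this kind can be sketched as a nearest-neighbour lookup over static metrics; everything below (the metric vector, the training pairs, and the association of solver labels with metrics) is an illustrative assumption, not Where4's actual feature set or learned model:

```python
import math

# Hypothetical training data: static program metrics -> fastest solver.
# Metric tuples are (operator count, quantifiers, array accesses); values
# and labels are invented for illustration.
TRAIN = [
    ((12, 3, 0), "Z3"),
    ((40, 9, 2), "CVC4"),
    ((7, 0, 0), "Alt-Ergo"),
    ((55, 12, 4), "CVC4"),
    ((15, 1, 0), "Z3"),
]

def predict_solver(metrics):
    """1-nearest-neighbour choice of the solver most likely to succeed
    on a proof obligation with the given static metrics."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAIN, key=lambda t: dist(t[0], metrics))[1]
```

A real portfolio would also rank the remaining solvers and schedule them by predicted runtime, but the lookup above captures the core delegation step.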
State of the Art: Signature Biometrics Verification
Directory of Open Access Journals (Sweden)
Nourddine Guersi
2010-04-01
Full Text Available This paper presents a comparative analysis of the performance of three estimation algorithms: Expectation Maximization (EM, Greedy EM Algorithm (GEM and Figueiredo-Jain Algorithm (FJ - based on the Gaussian mixture models (GMMs for signature biometrics verification. The simulation results have shown significant performance achievements. The test performance of EER=5.49 % for "EM", EER=5.04 % for "GEM" and EER=5.00 % for "FJ", shows that the behavioral information scheme of signature biometrics is robust and has a discriminating power, which can be explored for identity authentication.
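The EM algorithm at the heart of these GMM-based verifiers can be sketched for the simplest case, a two-component mixture in one dimension. This is an illustrative toy only: signature features are multivariate, and the GEM and FJ variants add greedy component splitting and annihilation on top of the basic iteration shown here.

```python
import math

def em_gmm_1d(data, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.
    Returns (means, standard deviations, mixing weights)."""
    mus = [min(data), max(data)]          # deterministic initialisation
    sigmas = [1.0, 1.0]
    weights = [0.5, 0.5]
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in data:
            ps = [w * math.exp(-((x - m) ** 2) / (2 * s * s)) / (s * math.sqrt(2 * math.pi))
                  for w, m, s in zip(weights, mus, sigmas)]
            tot = sum(ps)
            resp.append([p / tot for p in ps])
        # M-step: re-estimate means, variances and mixing weights
        for j in range(2):
            nj = sum(r[j] for r in resp)
            mus[j] = sum(r[j] * x for r, x in zip(resp, data)) / nj
            var = sum(r[j] * (x - mus[j]) ** 2 for r, x in zip(resp, data)) / nj
            sigmas[j] = max(math.sqrt(var), 1e-6)
            weights[j] = nj / len(data)
    return mus, sigmas, weights
```

On two well-separated clusters the responsibilities become nearly binary and the estimated means converge to the cluster means.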
Homogenization scheme for acoustic metamaterials
Yang, Min
2014-02-26
We present a homogenization scheme for acoustic metamaterials that is based on reproducing the lowest orders of scattering amplitudes from a finite volume of metamaterials. This approach is noted to differ significantly from that of coherent potential approximation, which is based on adjusting the effective-medium parameters to minimize scatterings in the long-wavelength limit. With the aid of metamaterials’ eigenstates, the effective parameters, such as mass density and elastic modulus can be obtained by matching the surface responses of a metamaterial's structural unit cell with a piece of homogenized material. From the Green's theorem applied to the exterior domain problem, matching the surface responses is noted to be the same as reproducing the scattering amplitudes. We verify our scheme by applying it to three different examples: a layered lattice, a two-dimensional hexagonal lattice, and a decorated-membrane system. It is shown that the predicted characteristics and wave fields agree almost exactly with numerical simulations and experiments and the scheme's validity is constrained by the number of dominant surface multipoles instead of the usual long-wavelength assumption. In particular, the validity extends to the full band in one dimension and to regimes near the boundaries of the Brillouin zone in two dimensions.
ISOTOPE METHODS IN HOMOGENEOUS CATALYSIS.
Energy Technology Data Exchange (ETDEWEB)
BULLOCK,R.M.; BENDER,B.R.
2000-12-01
The use of isotope labels has had a fundamentally important role in the determination of mechanisms of homogeneously catalyzed reactions. Mechanistic data is valuable since it can assist in the design and rational improvement of homogeneous catalysts. There are several ways to use isotopes in mechanistic chemistry. Isotopes can be introduced into controlled experiments and followed to see where they go or do not go; in this way, Libby, Calvin, Taube and others used isotopes to elucidate mechanistic pathways for very different, yet important chemistries. Another important isotope method is the study of kinetic isotope effects (KIEs) and equilibrium isotope effects (EIEs). Here the mere observation of where a label winds up is no longer enough - what matters is how much slower (or faster) a labeled molecule reacts than the unlabeled material. The most careful studies essentially involve the measurement of isotope fractionation between a reference ground state and the transition state. Thus kinetic isotope effects provide unique data unavailable from other methods, since information about the transition state of a reaction is obtained. Because getting an experimental glimpse of transition states is really tantamount to understanding catalysis, kinetic isotope effects are very powerful.
Lee, Chong-Yong; Bullock, John P; Kennedy, Gareth F; Bond, Alan M
2010-09-23
Large-amplitude ac voltammograms contain a wealth of kinetic information concerning electrode processes and can provide unique mechanistic insights compared to other techniques. This paper describes the effects homogeneous chemical processes have on ac voltammetry in general and provides experimental examples using two well-known chemical systems: one simple and one complex. Oxidation of [Cp*Fe(CO)(2)](2) (Cp* = η(5)-pentamethylcyclopentadienyl) in noncoordinating media is a reversible one-electron process; in the presence of nucleophiles, however, the resulting ligand-induced disproportionation changes the process to a multiple step regeneration. The chemical kinetic parameters of the regeneration mechanism were discerned via analysis of the third and higher harmonics of Fourier-transformed ac voltammetry data. Comparison of experimental data to digital simulations provides clear evidence that the reaction proceeds via a rapid pre-equilibrium between the electrogenerated monocation and the coordinating ligand; simultaneous fitting of the first nine harmonics indicates that k(f) = 7500 M(-1) s(-1) and k(r) = 100 s(-1), and that the unimolecular decomposition of the corresponding intermediate occurs with a rate constant of 2.2 s(-1). The rapid cis(+) → trans(+) isomerization of the electrogenerated cis-[W(CO)(2)(dpe)(2)](+), where dpe = 1,2-diphenylphosphinoethane, was examined to illustrate the effects of a simpler EC mechanism on the higher harmonics; a rate constant of 280 s(-1) was determined. These results not only shed new light on the chemistry of these systems, but provide a clear demonstration that the higher harmonics of ac voltammetry provide mechanistic insights into coupled homogeneous processes far more detailed than those that are readily accessible with dc techniques.
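Isolating the higher harmonics that carry this kinetic information amounts to evaluating the Fourier transform of the measured current at multiples of the excitation frequency. A minimal sketch on a synthetic signal follows (a naive DFT on invented numbers, with no relation to the actual FTACV analysis software):

```python
import math

def harmonic_amplitudes(signal, fundamental_bin, n_harmonics):
    """Naive DFT evaluated only at multiples of the fundamental frequency bin.
    Returns the amplitude of each harmonic of a real periodic signal."""
    N = len(signal)
    amps = []
    for h in range(1, n_harmonics + 1):
        k = h * fundamental_bin
        re = sum(signal[n] * math.cos(2 * math.pi * k * n / N) for n in range(N))
        im = sum(signal[n] * math.sin(2 * math.pi * k * n / N) for n in range(N))
        amps.append(2 * math.hypot(re, im) / N)
    return amps

# Synthetic response: a fundamental plus a weak second harmonic, the kind of
# nonlinear distortion a coupled chemical step imprints on the current.
N = 64
signal = [math.cos(2 * math.pi * 4 * n / N) + 0.25 * math.cos(2 * math.pi * 8 * n / N)
          for n in range(N)]
```

In practice one would use an FFT and fit the simulated harmonic envelopes to the measured ones, but the extraction step is exactly this evaluation at harmonic frequencies.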
Homogenization of a nonlinear degenerate parabolic equation
Institute of Scientific and Technical Information of China (English)
无
2005-01-01
The homogenization of one kind of nonlinear parabolic equation is studied. The weak convergence and corrector results are obtained by combining carefully the compactness method and two-scale convergence method in the homogenization theory.
The Calkin algebra is not countably homogeneous
Farah, Ilijas; Hirshberg, Ilan
2015-01-01
We show that the Calkin algebra is not countably homogeneous, in the sense of continuous model theory. We furthermore show that the connected component of the unitary group of the Calkin algebra is not countably homogeneous.
10 CFR 300.11 - Independent verification.
2010-01-01
... skills required for verification are often cross-disciplinary. For example, an individual verifier...) Companies that provide verification services must use professionals that possess the necessary skills and...: American Institute of Certified Public Accountants; American National Standards Institute's...
On acoustic band gaps in homogenized piezoelectric phononic materials
Directory of Open Access Journals (Sweden)
Rohan E.
2010-07-01
Full Text Available We consider a composite medium made of weakly piezoelectric inclusions periodically distributed in a matrix made of a different piezoelectric material. The medium is subject to a periodic excitation with an incidence wave frequency independent of the scale ε of the microscopic heterogeneities. The two-scale method of homogenization is applied to obtain the limit homogenized model, which describes acoustic wave propagation in the piezoelectric medium when ε → 0. In analogy with the purely elastic composite, the resulting model allows the existence of acoustic band gaps. These are identified for certain frequency ranges whenever the so-called homogenized mass becomes negative. The homogenized model can be used for band gap prediction and for dispersion analysis at low wave numbers. Modeling such composite materials appears promising in the context of Smart Materials design.
Fractal Dimension as a measure of the scale of Homogeneity
Yadav, Jaswant K; Khandai, Nishikanta
2010-01-01
In the multi-fractal analysis of large-scale matter distribution, the scale of transition to homogeneity is defined as the scale above which the fractal dimension of the underlying point distribution is equal to the ambient dimension of the space in which the points are distributed. With the finite-sized, weakly clustered distributions of tracers obtained from galaxy redshift surveys, it is difficult to achieve this equality. Recently we have defined the scale of homogeneity to be the scale above which the deviation of the fractal dimension from the ambient dimension becomes smaller than the statistical dispersion. In this paper we use the relation between the fractal dimensions and the correlation function to compute the dispersion for any given model in the limit of weak clustering amplitude. We compare the deviation and dispersion for the LCDM model and discuss the implication of this comparison for the expected scale of homogeneity in the concordant model of cosmology. We estimate the upper limit to the scale of homogeneity...
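The link between pair counts and fractal dimension can be illustrated with a toy correlation-integral estimator, using the scaling C(r) ~ r^D2. This is a sketch only; the multi-fractal analysis discussed above is considerably more sophisticated and corrects for finite size and weak clustering.

```python
import math

def correlation_dimension(points, r1, r2):
    """Toy correlation-dimension estimate from pair counts C(r) ~ r^D2:
    D2 is approximated by log(C(r2)/C(r1)) / log(r2/r1)."""
    def pair_count(r):
        n = len(points)
        return sum(1 for i in range(n) for j in range(i + 1, n)
                   if math.dist(points[i], points[j]) < r)
    return math.log(pair_count(r2) / pair_count(r1)) / math.log(r2 / r1)

# Points spread along a line embedded in the plane should give a dimension
# near 1, well below the ambient dimension 2.
line = [(i / 100, 0.0) for i in range(100)]
```

For a genuinely homogeneous distribution the estimate approaches the ambient dimension, which is exactly the transition this record's analysis looks for.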
Biometric Technologies and Verification Systems
Vacca, John R
2007-01-01
Biometric Technologies and Verification Systems is organized into nine parts composed of 30 chapters, including an extensive glossary of biometric terms and acronyms. It discusses the current state-of-the-art in biometric verification/authentication, identification and system design principles. It also provides a step-by-step discussion of how biometrics works; how biometric data in human beings can be collected and analyzed in a number of ways; how biometrics are currently being used as a method of personal identification in which people are recognized by their own unique corporal or behavior
Coherence delay augmented laser beam homogenizer
Rasmussen, P.; Bernhardt, A.
1993-06-29
The geometrical restrictions on a laser beam homogenizer are relaxed by using a coherence delay line to separate a coherent input beam into several components each having a path length difference equal to a multiple of the coherence length with respect to the other components. The components recombine incoherently at the output of the homogenizer, and the resultant beam has a more uniform spatial intensity suitable for microlithography and laser pantogography. Also disclosed is a variable aperture homogenizer, and a liquid filled homogenizer.
Energy Technology Data Exchange (ETDEWEB)
Le Campion, J.M.
1996-01-30
To provide safety analysis of complex real-time systems which have been developed for the protection of French nuclear plants, the CEA is interested in software testing and validation techniques. These series of tests are made by a purely software simulation of the system. The purpose is to establish the truth of some critical properties of the programs, either at simulation run time or after execution. The operator is able to describe the variation of some input parameters of the programs and view the results with graphics facilities. An important need was to formally describe some categories of properties, expressed in terms of academic examples. We thought that a logical textual language was appropriate for this formal expression. This thesis describes a new data-flow language called EFRI, extending the semantics of interval temporal logics. We then describe a calculus using regular languages on arrays which associates a regular expression to each formula of the EFRI language. With this method, the verification of a property described by a formula of EFRI can be viewed as a classical problem of language theory: does a word belong to a regular language? We can then build a finite automaton to recognize complex temporal diagrams. (author). 38 refs., 7 tabs., 4 appends.
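The membership test described above can be sketched with Python's re module, which compiles a regular expression to an automaton internally. The event alphabet and the property below are invented for illustration; EFRI formulas are of course richer than this:

```python
import re

# Hypothetical property over an event alphabet {r, a, i}: requests 'r' and
# acknowledgements 'a' must strictly alternate, with idle steps 'i' allowed
# anywhere. (EFRI formulas map to regular expressions in a comparable way.)
PROPERTY = re.compile(r"i*(?:ri*ai*)*")

def satisfies(trace: str) -> bool:
    """Membership test: does the execution trace, encoded as a word,
    belong to the property's regular language?"""
    return PROPERTY.fullmatch(trace) is not None
```

Checking a simulated run then costs a single linear scan of the recorded trace.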
Orthogonality Measurement for Homogenous Projects-Bases
Ivan, Ion; Sandu, Andrei; Popa, Marius
2009-01-01
The homogenous projects-base concept is defined. Next, the necessary steps to create a homogenous projects-base are presented. A metric system is built, which then will be used for analyzing projects. The indicators which are meaningful for analyzing a homogenous projects-base are selected. The given hypothesis is experimentally verified. The…
Improving homogeneity by dynamic speed limit systems.
Nes, N. van; Brandenberg, S. & Twisk, D.A.M.
2010-01-01
Homogeneity of driving speeds is an important variable in determining road safety; more homogeneous driving speeds increase road safety. This study investigates the effect of introducing dynamic speed limit systems on homogeneity of driving speeds. A total of 46 subjects twice drove a route along 12
Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System
Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li
Ultrasonic flaw detection equipment with a remote-control interface is studied, and an automatic verification system for it is developed. By using the Extensible Markup Language to build an instruction-set protocol library and a data-analysis method database in the system software, the design becomes controllable and the diversity of unpublished device interfaces and protocols is accommodated. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. The operating results of the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome manual operations of the traditional verification process and reducing the workload of test personnel.
Energy interactions in homogeneously sheared magnetohydrodynamic flows
Collard, Diane; Praturi, Divya Sri; Girimaji, Sharath
2016-11-01
We investigate the behavior of homogeneously sheared magnetohydrodynamic (MHD) flows subject to perturbations in various directions. We perform rapid distortion theory (RDT) analysis and direct numerical simulations (DNS) to examine the interplay between magnetic, kinetic, and internal energies. For perturbation wavevectors oriented along the spanwise direction, RDT analysis shows that the magnetic and velocity fields are decoupled. In the case of streamwise wavevectors, the magnetic and velocity fields are tightly coupled. The coupling is "harmonic" in nature. DNS is then used to confirm the RDT findings. Computations of spanwise perturbations indeed exhibit behavior that is impervious to the magnetic field. Computed streamwise perturbations exhibit oscillatory evolution of kinetic and magnetic energies for low magnetic field strength. As the strength of magnetic field increases, the oscillatory behavior intensifies even as the energy magnitude decays, indicating strong stabilization.
RELAP-7 Software Verification and Validation Plan
Energy Technology Data Exchange (ETDEWEB)
Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Choi, Yong-Joon [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support; Zou, Ling [Idaho National Lab. (INL), Idaho Falls, ID (United States). Risk, Reliability, and Regulatory Support
2014-09-25
This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.
Homogeneous models for bianisotropic crystals
Ponti, S; Oldano, C
2002-01-01
We extend to bianisotropic structures a formalism already developed, based on the Bloch method for defining the effective dielectric tensor of anisotropic crystals in the long-wavelength approximation. More precisely, we provide a homogenization scheme which yields a wavevector-dependent effective medium for any 3D, 2D, or 1D bianisotropic crystal. We illustrate our procedure by applying this to a 1D magneto-electric smectic C*-type structure. The resulting equations confirm that the presence of dielectric and magnetic susceptibilities in the periodic structures generates magneto-electric pseudo-tensors for the effective medium. Their contribution to the optical activity of structurally chiral media can be of the same order of magnitude as the one present in dielectric helix-shaped crystals. Simple analytical expressions are found for the most important optical properties of smectic C*-type structures which are simultaneously dielectric and magnetic.
Effective Gravity and Homogenous Solutions
Müller, Daniel
2013-01-01
Near the singularity, gravity should be modified to an effective theory, in the same sense as the Euler-Heisenberg electrodynamics. This effective gravity amounts to a higher-derivative theory and, as is well known, one with a much richer solution space. On the other hand, as a highly nonlinear theory, the understanding of this solution space must go beyond the linearized approach. In this talk we will present some results previously published by collaborators and myself concerning solutions for vacuum spatially homogeneous cases of Bianchi types $I$ and $VII_A$. These are the anisotropic generalizations of the spatially "flat" and "open" cosmological models, respectively. The solutions present isotropisation in a weak sense, depending on the initial condition. Also, depending on the initial condition, singular solutions are obtained.
Effective Gravity and Homogenous Solutions
Müller, Daniel
2015-01-01
Near the singularity, gravity should be modified to an effective theory, in the same sense as the Euler-Heisenberg electrodynamics. This effective gravity amounts to a higher-derivative theory and, as is well known, one with a much richer solution space. On the other hand, as a highly nonlinear theory, the understanding of this solution space must go beyond the linearized approach. In this talk we will present some results previously published by collaborators and myself concerning solutions for vacuum spatially homogeneous cases of Bianchi types I and VIIA. These are the anisotropic generalizations of the spatially "flat" and "open" cosmological models, respectively. The solutions present isotropisation in a weak sense, depending on the initial condition. Also, depending on the initial condition, singular solutions are obtained.
Formal verification of complex properties on PLC programs
Darvas, D; Voros, A; Bartha, T; Blanco Vinuela, E; Gonzalez Suarez, V M
2014-01-01
Formal verification has become a recommended practice in the safety-critical application areas. However, due to the complexity of practical control and safety systems, the state space explosion often prevents the use of formal analysis. In this paper we extend our former verification methodology with effective property preserving reduction techniques. For this purpose we developed general rule-based reductions and a customized version of the Cone of Influence (COI) reduction. Using these methods, the verification of complex requirements formalised with temporal logics (e.g. CTL, LTL) can be orders of magnitude faster. We use the NuSMV model checker on a real-life PLC program from CERN to demonstrate the performance of our reduction techniques.
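The Cone of Influence idea, dropping every state variable that cannot affect the property, reduces to backward reachability over a variable dependency graph. A minimal sketch follows; the variable names and the dependency graph are hypothetical, not taken from the CERN PLC program:

```python
from collections import deque

# Hypothetical transition-system dependencies: next(v) reads DEPENDS_ON[v].
DEPENDS_ON = {
    "alarm":  {"sensor", "mode"},
    "sensor": {"input"},
    "mode":   {"switch"},
    "motor":  {"clock"},   # unrelated to the property below
    "clock":  set(),
    "input":  set(),
    "switch": set(),
}

def cone_of_influence(property_vars):
    """Backward reachability: keep only variables the property can observe,
    directly or transitively; everything else can be pruned before
    model checking."""
    keep, queue = set(property_vars), deque(property_vars)
    while queue:
        v = queue.popleft()
        for dep in DEPENDS_ON.get(v, ()):
            if dep not in keep:
                keep.add(dep)
                queue.append(dep)
    return keep
```

Pruning `motor` and `clock` here halves the state variables without changing the truth of any property over `alarm`, which is the mechanism behind the order-of-magnitude speedups reported above.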
Verification of the Space Shuttle entry GN&C system
Van Hoften, J. D. A.; Moyles, J. A.
1981-01-01
The certification procedures for the initial Shuttle flight are discussed. Particular attention is paid to the entry guidance, navigation, and control (GNC) verification, comprising tests, analysis, demonstration, inspection, and simulation. Flow diagrams for the verification and operational flight sequences are provided, along with a block diagram of the GNC circuitry interfaces. The development of the test matrix software for the GNC is outlined, noting the constant interplay between software verification and spacecraft reconfiguration to meet simulated performance requirements. Comparison of GNC performance predictions with actual entry flight data showed a good match in all performance areas except for sideslip excursions, bank overshoots, an area of transonic buffet, and an increased lift/drag ratio in the preflare to landing flight phase.
Verification of Embedded Memory Systems using Efficient Memory Modeling
Ganai, Malay K; Ashar, Pranav
2011-01-01
We describe verification techniques for embedded memory systems using efficient memory modeling (EMM), without explicitly modeling each memory bit. We extend our previously proposed approach of EMM in Bounded Model Checking (BMC) for a single read/write port single memory system, to more commonly occurring systems with multiple memories, having multiple read and write ports. More importantly, we augment such EMM to providing correctness proofs, in addition to finding real bugs as before. The novelties of our verification approach are in a) combining EMM with proof-based abstraction that preserves the correctness of a property up to a certain analysis depth of SAT-based BMC, and b) modeling arbitrary initial memory state precisely and thereby, providing inductive proofs using SAT-based BMC for embedded memory systems. Similar to the previous approach, we construct a verification model by eliminating memory arrays, but retaining the memory interface signals with their control logic and adding constraints on tho...
Modified Homogeneous Data Set of Coronal Intensities
Dorotovič, I.; Minarovjech, M.; Lorenc, M.; Rybanský, M.
2014-07-01
The Astronomical Institute of the Slovak Academy of Sciences has published the intensities, recalibrated with respect to a common intensity scale, of the 530.3 nm (Fe xiv) green coronal line observed at ground-based stations up to the year 2008. The name of this publication is Homogeneous Data Set (HDS). We have developed a method that allows one to successfully substitute the ground-based observations by satellite observations and, thus, continue with the publication of the HDS. For this purpose, the observations of the Extreme-ultraviolet Imaging Telescope (EIT), onboard the Solar and Heliospheric Observatory (SOHO) satellite, were exploited. Among other data the EIT instrument provides almost daily 28.4 nm (Fe xv) emission-line snapshots of the corona. The Fe xiv and Fe xv data (4051 observation days) taken in the period 1996 - 2008 have been compared and good agreement was found. The method to obtain the individual data for the HDS follows from the correlation analysis described in this article. The resulting data, now under the name of Modified Homogeneous Data Set (MHDS), are identical up to 1996 to those in the HDS. The MHDS can be used further for studies of the coronal solar activity and its cycle. These data are available at http://www.suh.sk.
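The substitution of satellite for ground-based intensities rests on an ordinary correlation and regression analysis between the two series. A minimal least-squares sketch follows, with synthetic numbers rather than actual EIT or HDS values:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit y ≈ slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    return slope, my - slope * mx

# Synthetic example: proxy-line intensities vs. green-line intensities
# on the days both are available; the fit then converts proxy values
# into the scale of the original series.
proxy = [1.0, 2.0, 3.0, 4.0]
green = [3.1, 5.0, 6.9, 9.0]
slope, intercept = linear_fit(proxy, green)
```

Once calibrated on the overlap period, the fitted line supplies HDS-scale values for days with satellite observations only.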
Homogenization in micro-magneto-mechanics
Sridhar, A.; Keip, M.-A.; Miehe, C.
2016-07-01
Ferromagnetic materials are characterized by a heterogeneous micro-structure that can be altered by external magnetic and mechanical stimuli. The understanding and the description of the micro-structure evolution is of particular importance for the design and the analysis of smart materials with magneto-mechanical coupling. The macroscopic response of the material results from complex magneto-mechanical interactions occurring on smaller length scales, which are driven by magnetization reorientation and associated magnetic domain wall motions. The aim of this work is to directly base the description of the macroscopic magneto-mechanical material behavior on the micro-magnetic domain evolution. This will be realized by the incorporation of a ferromagnetic phase-field formulation into a macroscopic Boltzmann continuum by the use of computational homogenization. The transition conditions between the two scales are obtained via rigorous exploitation of rate-type and incremental variational principles, which incorporate an extended version of the classical Hill-Mandel macro-homogeneity condition covering the phase field on the micro-scale. An efficient two-scale computational scenario is developed based on an operator splitting scheme that includes a predictor for the magnetization on the micro-scale. Two- and three-dimensional numerical simulations demonstrate the performance of the method. They investigate micro-magnetic domain evolution driven by macroscopic fields as well as the associated overall hysteretic response of ferromagnetic solids.
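The Hill-Mandel macro-homogeneity condition mentioned above requires that the volume-averaged microscopic power equal the macroscopic power. A schematic magneto-mechanical version (the textbook form, not the authors' exact phase-field extension) reads:

```latex
% Mechanical and magnetic power equivalence between scales (schematic form);
% angle brackets denote volume averages over the micro-structure.
\langle \boldsymbol{\sigma} : \dot{\boldsymbol{\varepsilon}} \rangle
  = \langle \boldsymbol{\sigma} \rangle : \langle \dot{\boldsymbol{\varepsilon}} \rangle ,
\qquad
\langle \mathbf{H} \cdot \dot{\mathbf{B}} \rangle
  = \langle \mathbf{H} \rangle \cdot \langle \dot{\mathbf{B}} \rangle
```

The extended condition in the paper additionally covers the phase field on the micro-scale, which constrains the admissible boundary conditions of the micro-problem.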
Eggspectation : organic egg verification tool
Ruth, van S.M.; Hoogenboom, L.A.P.
2011-01-01
In 2009 RIKILT conducted a study on about 2,000 eggs to evaluate three different analytical verification methods: carotenoid profiling, fatty acid profiling and isotope ratio mass spectrometry. The eggs were collected from about 50 Dutch farms. The selection was based on the farms’ location and size
Verification tests for a solar-heating system
1980-01-01
Report describes method of verification of solar space heating and hot-water systems using similarity comparison, mathematical analysis, inspections, and tests. Systems, subsystems, and components were tested for performance, durability, safety, and other factors. Tables and graphs complement test materials.
9 CFR 417.4 - Validation, Verification, Reassessment.
2010-01-01
... AGRICULTURE REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification, Reassessment. (a) Every establishment shall validate the HACCP plan's adequacy in controlling the food...
Verification of flood damage modelling using insurance data
DEFF Research Database (Denmark)
Zhou, Qianqian; Petersen, Toke E. P.; Thorsen, Bo J.
2012-01-01
This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...
Verification of flood damage modelling using insurance data
DEFF Research Database (Denmark)
Zhou, Qianqian; Panduro, T. E.; Thorsen, B. J.
2013-01-01
This paper presents the results of an analysis using insurance data for damage description and risk model verification, based on data from a Danish case. The results show that simple, local statistics of rainfall are not able to describe the variation in individual cost per claim, but are, howeve...
Comparison of particulate verification techniques study
Rivera, Rachel
2006-08-01
The efficacy of five particulate verification techniques on four types of materials was studied. Statistical Analysis Software/JMP 6.0 was used to create a statistically valid design of experiments. In doing so, 35 witness coupons consisting of the four types of materials being studied were intentionally contaminated with particulate fallout. Image analysis was used to characterize the extent of particulate fallout on the coupons and to establish a baseline, or basis of comparison, against the five techniques that were studied. The five particulate verification techniques were the Tapelift, the Particulate Solvent Rinse, the GelPak lift, an in-line vacuum filtration probe, and the Infinity Focusing Microscope (IFM). The four types of materials consisted of magnesium fluoride (MgF2) coated mirrors, composite coated silver aluminum (CCAg), Z93 and NS43G coated aluminum, and silicon (Si) wafers. The vacuum probe was determined to be most effective for Z93, the Tapelift or vacuum probe for MgF2, and the GelPak lift for CCAg and Si substrates. A margin of error for each technique, based on experimental data from two experiments on Si wafer substrates, yielded the following: Tapelift - 67%, Solvent Rinse - 58%, GelPak - 26%, Vacuum Probe - 93%, IFM - to be determined.
Institute of Scientific and Technical Information of China (English)
陈权; 张于; 卢从俊
2013-01-01
This paper first introduces the current state of mining anemometer verification in China and analyzes in detail the working principle of the mine low-speed wind tunnel. Based on this working principle, the verification regulation currently in use, JJG (coal) 01-96 "Mining Anemometer", is carefully studied; this regulation ignores the influence of humidity on verification results. The effect of humidity is therefore investigated here. The analysis shows that changes in humidity affect the air density, and changes in air density directly affect the verification results of the anemometer. Ignoring the influence of humidity on air density thus compromises the accuracy of anemometer verification results.
Challenges in Decomposing Encodings of Verification Problems
Directory of Open Access Journals (Sweden)
Peter Schrammel
2016-07-01
Full Text Available Modern program verifiers use logic-based encodings of the verification problem that are discharged by a back end reasoning engine. However, instances of such encodings for large programs can quickly overwhelm these back end solvers. Hence, we need techniques to make the solving process scale to large systems, such as partitioning (divide-and-conquer and abstraction. In recent work, we showed how decomposing the formula encoding of a termination analysis can significantly increase efficiency. The analysis generates a sequence of logical formulas with existentially quantified predicates that are solved by a synthesis-based program analysis engine. However, decomposition introduces abstractions in addition to those required for finding the unknown predicates in the formula, and can hence deteriorate precision. We discuss the challenges associated with such decompositions and their interdependencies with the solving process.
Inertial particles in homogeneous shear turbulence
Energy Technology Data Exchange (ETDEWEB)
Nicolai, Claudia; Jacob, Boris [CNR-INSEAN, via di Vallerano 139, 00128 Rome (Italy); Gualtieri, Paolo; Piva, Renzo, E-mail: claudia.nicolai@uniroma1.it [DMA, Sapienza Universita di Roma, Via Eudossiana 18, 00184 Rome (Italy)
2011-12-22
The characteristics of the inertial particle distribution in a uniformly sheared turbulent flow are investigated, with the aim of quantifying the effects associated with the large-scale anisotropy induced by the mean velocity gradient. The focus of the analysis is on clustering aspects, and in particular on the dependence of the radial distribution function on both the directionality and the magnitude of the observation scale. We discuss experimental data measured in a homogeneous shear flow seeded with particles of size comparable with the Kolmogorov length scale and Stokes number St ≈ 0.3, and discuss their distribution properties in comparison with results provided by related one-way coupled direct numerical simulations which make use of the point-force approximation.
Garg, Prabhat; Purohit, Ajay; Tak, Vijay K; Dubey, D K
2009-11-06
N,N-Dialkylamino alcohols, N-methyldiethanolamine, N-ethyldiethanolamine and triethanolamine are the precursors of VX-type nerve agents and of three different nitrogen mustards, respectively. Their detection and identification are of paramount importance for verification analysis under the Chemical Weapons Convention. GC-FTIR is used as a complementary technique to GC-MS analysis for identification of these analytes. One constraint of GC-FTIR, its low sensitivity, was overcome by converting the analytes to their fluorinated derivatives. Owing to high absorptivity in the IR region, these derivatives facilitated detection by GC-FTIR analysis. Derivatizing reagents having trimethylsilyl, trifluoroacyl and heptafluorobutyryl groups on an imidazole moiety were screened. The derivatives formed were analyzed by GC-FTIR quantitatively. Of the reagents studied, heptafluorobutyrylimidazole (HFBI) produced the greatest increase in sensitivity for GC-FTIR detection: a 60-125-fold sensitivity enhancement was observed for the analytes after HFBI derivatization. Absorbances due to the various functional groups responsible for the enhanced sensitivity were compared by determining their corresponding relative molar extinction coefficients ( [Formula: see text] ), assuming uniform optical path length. The RSDs for intraday repeatability and interday reproducibility for the various derivatives were 0.2-1.1% and 0.3-1.8%, respectively. A limit of detection (LOD) of 10-15 ng was achieved, and the applicability of the method was tested with unknown samples obtained in international proficiency tests.
Reciprocity theory of homogeneous reactions
Agbormbai, Adolf A.
1990-03-01
The reciprocity formalism is applied to the homogeneous gaseous reactions in which the structure of the participating molecules changes upon collision with one another, resulting in a change in the composition of the gas. The approach is applied to various classes of dissociation, recombination, rearrangement, ionizing, and photochemical reactions. It is shown that for the principle of reciprocity to be satisfied it is necessary that all chemical reactions exist in complementary pairs which consist of the forward and backward reactions. The backward reaction may be described by either the reverse or inverse process. The forward and backward processes must satisfy the same reciprocity equation. Because the number of dynamical variables is usually unbalanced on both sides of a chemical equation, it is necessary that this balance be established by including as many of the dynamical variables as needed before the reciprocity equation can be formulated. Statistical transformation models of the reactions are formulated. The models are classified under the titles free exchange, restricted exchange and simplified restricted exchange. The special equations for the forward and backward processes are obtained. The models are consistent with the H theorem and Le Chatelier's principle. The models are also formulated in the context of the direct simulation Monte Carlo method.
Pharmaceutical Industry Oriented Homogeneous Catalysis
Institute of Scientific and Technical Information of China (English)
Zhang Xumu
2004-01-01
Chiral therapeutics already make up over one-third of pharmaceutical drugs currently sold worldwide. This is a growing industry, with global chiral drug sales for 2002 increasing by 12% to $160 billion (Technology Catalysts International) out of a total drug market of $410 billion. The increasing demand to produce enantiomerically pure pharmaceuticals, agrochemicals, flavors, and other fine chemicals has advanced the field of asymmetric catalytic technologies. We aim to become a high-value technology provider and partner in the chiral therapeutics industry by offering proprietary catalysts, novel building blocks, and collaborative synthetic solutions. Over the past decade, we have developed a set of novel chiral homogeneous phosphorus ligands such as Binaphane, Me-KetalPhos, TangPhos, f-Binaphane, Me-f-KetalPhos, C4TunePhos and Binapine, which we call the Chiral Ligand ToolKit. Complementing the ToolKit, (R,S,S,R)-DIOP*, T-Phos, o-BIPHEP, o-BINAPO and FAP were added recently [1]. These ligands can be applied to a broad variety of drug structural features by asymmetric hydrogenation of dehydroamino acid derivatives, enamides, unsaturated acids and esters, ketones, beta-ketoesters, imines and cyclic imines. The ligand FAP has also been applied successfully in allylic alkylation and [3+2] cycloaddition.
Energy Technology Data Exchange (ETDEWEB)
King, David A.
2012-08-16
Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. IV activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).
Homogeneous sample preparation of raw shrimp using dry ice.
Bunch, E A; Altwein, D M; Johnson, L E; Farley, J R; Hammersmith, A A
1995-01-01
Sample homogeneity is critical to accurate and reproducible analysis of trace residues in foods. A method of uniform sample preparation using dry ice is described for shrimp. Other sample preparation techniques for raw shrimp produce nonhomogeneous samples. Sample homogeneity was determined through analysis of chloramphenicol added to intact tiger or white shrimp prior to sample preparation. Simulated chloramphenicol residue levels were 50, 15, 10, and 5 ppb. No significant differences were noted when analyses of shrimp inoculated with chloramphenicol prior to sample preparation with dry ice were compared with analyses of shrimp spiked after grinding with dry ice. Grinding shrimp with dry ice produced samples with homogeneous chloramphenicol residues. This technique should be applicable to other tissues and vegetable products.
Institute of Scientific and Technical Information of China (English)
李世锐; 任丽霞; 胡文军; 乔鹏瑞
2016-01-01
CONTAIN-LMR is an integrated code for containment accident analysis of sodium-cooled fast reactors. The version of CONTAIN-LMR currently used in China was imported from France around 2000, and systematic code development and verification at the engineering-design level has not yet been undertaken. This paper analyzes in detail the models in CONTAIN-LMR that simulate sodium pool fire accidents and validates them against international sodium pool fire experiments. The validation results show that CONTAIN-LMR can accurately simulate the temperature and pressure rise and the radioactive sodium aerosol behavior in the sodium process rooms caused by sodium pool fire accidents. The results of this study indicate preliminarily that CONTAIN-LMR can be used for sodium fire accident analysis in sodium-cooled fast reactors.
A Class of Homogeneous Einstein Manifolds
Institute of Scientific and Technical Information of China (English)
Yifang KANG; Ke LIANG
2006-01-01
A Riemannian manifold (M,g) is called an Einstein manifold if its Ricci tensor satisfies r=c·g for some constant c. General existence results are hard to obtain, e.g., it is as yet unknown whether every compact manifold admits an Einstein metric. A natural approach is to impose additional homogeneity assumptions. M. Y. Wang and W. Ziller have obtained some results on compact homogeneous spaces G/H. They investigate standard homogeneous metrics, the metric induced by the Killing form on G/H, and get some classification results. In this paper some more general homogeneous metrics on some homogeneous spaces G/H are studied, and a necessary and sufficient condition for such a metric to be Einstein is given. The authors also give some examples of Einstein manifolds with non-standard homogeneous metrics.
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SHARPE MANUFACTURING TITANIUM T1-CG SPRAY GUN
Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ANEST IWATA CORPORATION W400-LV SPRAY GUN
Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, the pollution prevention capabilities of a high transfer efficiency liquid spray gun was tested. This ...
Homogeneity and thermodynamic identities in geometrothermodynamics
Quevedo, Hernando; Quevedo, María N.; Sánchez, Alberto
2017-03-01
We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics.
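The generalized homogeneous functions mentioned above satisfy a generalized Euler identity, the textbook form of which underlies the modified Gibbs-Duhem relation; writing the fundamental equation as Φ with degrees β_i, it reads:

```latex
% Generalized homogeneity of the fundamental equation and the resulting
% generalized Euler identity (standard form for generalized homogeneous
% functions; the paper's geometrothermodynamic application is not reproduced).
\Phi(\lambda^{\beta_1} x_1, \ldots, \lambda^{\beta_n} x_n)
  = \lambda^{\beta_\Phi}\, \Phi(x_1, \ldots, x_n)
\quad \Longrightarrow \quad
\sum_{i=1}^{n} \beta_i\, x_i\, \frac{\partial \Phi}{\partial x_i}
  = \beta_\Phi\, \Phi .
```

Ordinary systems correspond to the special case β_1 = … = β_n = β_Φ = 1, which recovers the classical Euler identity of first-degree homogeneous thermodynamic potentials.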
Homogeneity and thermodynamic identities in geometrothermodynamics
Energy Technology Data Exchange (ETDEWEB)
Quevedo, Hernando [Universidad Nacional Autonoma de Mexico, Instituto de Ciencias Nucleares (Mexico); Universita di Roma ' ' La Sapienza' ' , Dipartimento di Fisica, Rome (Italy); ICRANet, Rome (Italy); Quevedo, Maria N. [Universidad Militar Nueva Granada, Departamento de Matematicas, Facultad de Ciencias Basicas, Bogota (Colombia); Sanchez, Alberto [CIIDET, Departamento de Posgrado, Queretaro (Mexico)
2017-03-15
We propose a classification of thermodynamic systems in terms of the homogeneity properties of their fundamental equations. Ordinary systems correspond to homogeneous functions and non-ordinary systems are given by generalized homogeneous functions. This affects the explicit form of the Gibbs-Duhem relation and Euler's identity. We show that these generalized relations can be implemented in the formalism of black hole geometrothermodynamics in order to completely fix the arbitrariness present in Legendre invariant metrics. (orig.)
Formal Verification of Continuous Systems
DEFF Research Database (Denmark)
Sloth, Christoffer
2012-01-01
The purpose of this thesis is to develop a method for verifying timed temporal properties of continuous dynamical systems, and to develop a method for verifying the safety of an interconnection of continuous systems. The methods must be scalable in the number of continuous variables, and the verification procedures should be algorithmically synthesizable. Autonomous control plays an important role in many safety-critical systems. This implies that a malfunction in the control system can have catastrophic consequences, e.g., in space applications where a design flaw can result in large economic losses. Furthermore, a malfunction in the control system of a surgical robot may cause death of patients. The previous examples involve complex systems that are required to operate according to complex specifications. The systems cannot be formally verified by modern verification techniques, due...
The Homogeneity Scale of the universe
Ntelis, Pierros
2016-01-01
In this study, we probe the cosmic homogeneity with the BOSS CMASS galaxy sample in the redshift region of $0.43 < z < 0.7$. We use the normalised counts-in-spheres estimator $\mathcal{N}(
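As a rough illustration of the counts-in-spheres idea, the sketch below estimates the correlation dimension D2 from counts at two radii around a point of a homogeneous toy distribution; for a homogeneous sample, N(<r) ∝ r³, so D2 approaches 3 on large scales. The lattice data and radii are invented for illustration and have nothing to do with the CMASS catalogue.

```python
import math

def counts_in_sphere(points, center, r):
    """Number of points within distance r of center (brute force)."""
    cx, cy, cz = center
    r2 = r * r
    return sum(1 for (x, y, z) in points
               if (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r2)

# Homogeneous toy "catalogue": a regular 3D lattice of points.
pts = [(x, y, z) for x in range(-10, 11)
                 for y in range(-10, 11)
                 for z in range(-10, 11)]

n1 = counts_in_sphere(pts, (0, 0, 0), 4.0)
n2 = counts_in_sphere(pts, (0, 0, 0), 8.0)
d2 = math.log(n2 / n1) / math.log(2.0)   # scaling exponent between the two radii
assert 2.8 < d2 < 3.2                    # close to 3, as expected for homogeneity
```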
Program Verification of Numerical Computation
Pantelis, Garry
2014-01-01
These notes outline a formal method for program verification of numerical computation. It forms the basis of the software package VPC in its initial phase of development. Much of the style of presentation is in the form of notes that outline the definitions and rules upon which VPC is based. The initial motivation of this project was to address some practical issues of computation, especially of numerically intensive programs that are commonplace in computer models. The project evolved into a...
Kleene Algebra and Bytecode Verification
2016-04-27
Bytecode 2005 Preliminary Version. Kleene Algebra and Bytecode Verification. Lucja Kot and Dexter Kozen, Department of Computer Science, Cornell University. ... first-order methods that inductively annotate program points with abstract values. In [6] we introduced a second-order approach based on Kleene algebra ... form a left-handed Kleene algebra. The dataflow labeling is not achieved by inductively labeling the program with abstract values, but rather by ...
Verification of Loop Diagnostics
Winebarger, A.; Lionello, R.; Mok, Y.; Linker, J.; Mikic, Z.
2014-01-01
Many different techniques have been used to characterize the plasma in the solar corona: density-sensitive spectral line ratios are used to infer the density, the evolution of coronal structures in different passbands is used to infer the temperature evolution, and the simultaneous intensities measured in multiple passbands are used to determine the emission measure. All these analysis techniques assume that the intensity of the structures can be isolated through background subtraction. In this paper, we use simulated observations from a 3D hydrodynamic simulation of a coronal active region to verify these diagnostics. The density and temperature from the simulation are used to generate images in several passbands and spectral lines. We identify loop structures in the simulated images and calculate the loop background. We then determine the density, temperature and emission measure distribution as a function of time from the observations and compare with the true temperature and density of the loop. We find that the overall characteristics of the temperature, density, and emission measure are recovered by the analysis methods, but the details of the true temperature and density are not. For instance, the emission measure curves calculated from the simulated observations are much broader than the true emission measure distribution, though the average temperature evolution is similar. These differences are due, in part, to inadequate background subtraction, but also indicate a limitation of the analysis methods.
Nuclear Data Verification and Standardization
Energy Technology Data Exchange (ETDEWEB)
Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.
2011-10-01
The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear-based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.
Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results
Jablonski, Paul D.; Hawk, Jeffrey A.
2017-01-01
Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and then diffusion controlled transformations is used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
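As a hedged illustration of the kinetic side of such calculations, the sketch below uses the classical single-sinusoid approximation, in which a composition profile of wavelength L (the dendrite arm spacing) decays as exp(-4π²Dt/L²). This is a textbook simplification, not the multi-component Thermo-Calc/diffusion model described above, and all parameter values are hypothetical.

```python
import math

def residual_segregation(D, t, L):
    """Fraction of the as-cast composition amplitude remaining after time t,
    for a sinusoidal profile of wavelength L and diffusivity D."""
    return math.exp(-4.0 * math.pi ** 2 * D * t / L ** 2)

def time_to_homogenize(D, L, target=0.01):
    """Hold time needed to reduce the amplitude to `target` of its initial value."""
    return -(L ** 2) * math.log(target) / (4.0 * math.pi ** 2 * D)

D = 1.0e-14    # m^2/s, hypothetical solute diffusivity at the hold temperature
L = 100.0e-6   # m, hypothetical secondary dendrite arm spacing
t = time_to_homogenize(D, L)                      # hold time for 99% reduction
assert abs(residual_segregation(D, t, L) - 0.01) < 1e-9
```

Because the hold time scales with L², refining the as-cast structure (smaller arm spacing) shortens the required heat treatment dramatically, which is one reason such schedules are tailored to each casting.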
Homogenizing Advanced Alloys: Thermodynamic and Kinetic Simulations Followed by Experimental Results
Jablonski, Paul D.; Hawk, Jeffrey A.
2016-11-01
Segregation of solute elements occurs in nearly all metal alloys during solidification. The resultant elemental partitioning can severely degrade as-cast material properties and lead to difficulties during post-processing (e.g., hot shorts and incipient melting). Many cast articles are subjected to a homogenization heat treatment in order to minimize segregation and improve their performance. Traditionally, homogenization heat treatments are based upon past practice or time-consuming trial and error experiments. Through the use of thermodynamic and kinetic modeling software, NETL has designed a systematic method to optimize homogenization heat treatments. Use of the method allows engineers and researchers to homogenize casting chemistries to levels appropriate for a given application. The method also allows for the adjustment of heat treatment schedules to fit limitations on in-house equipment (capability, reliability, etc.) while maintaining clear numeric targets for segregation reduction. In this approach, the Scheil module within Thermo-Calc is used to predict the as-cast segregation present within an alloy, and then diffusion controlled transformations is used to model homogenization kinetics as a function of time and temperature. Examples of computationally designed heat treatments and verification of their effects on segregation and properties of real castings are presented.
Are Heterogeneous or Homogeneous Groups More Beneficial to Students?
Schullery, Nancy M.; Schullery, Stephen E.
2006-01-01
This study investigated the relative benefits to the student of working in homogeneous versus heterogeneous classroom groups. Correlation analysis of 18 desirable outcomes versus 8 personality-based heterogeneity variables reveals that heterogeneity associates with advantages as well as disadvantages. Ways in which group composition might be…
Homogeneity of Moral Judgment? Apprentices Solving Business Conflicts.
Beck, Klaus; Heinrichs, Karin; Minnameier, Gerhard; Parche-Kawik, Kirsten
In an ongoing longitudinal study that started in 1994, the moral development of business apprentices is being studied. The focal point of this project is a critical analysis of L. Kohlberg's thesis of homogeneity, according to which people should judge every moral issue from the point of view of their "modal" stage (the most frequently…
Self-consolidating concrete homogeneity
Directory of Open Access Journals (Sweden)
Jarque, J. C.
2007-08-01
Full Text Available Concrete instability may lead to the non-uniform distribution of its properties. The homogeneity of self-consolidating concrete in vertically cast members was therefore explored in this study, analyzing both resistance to segregation and pore structure uniformity. To this end, two series of concretes were prepared, self-consolidating and traditional vibrated materials, with different w/c ratios and types of cement. The results showed that self-consolidating concretes exhibit high resistance to segregation, albeit slightly lower than found in the traditional mixtures. The pore structure in the former, however, tended to be slightly more uniform, probably as a result of less intense bleeding. Such concretes are also characterized by greater bulk density, lower porosity and smaller mean pore size, which translates into a higher resistance to pressurized water. For pore diameters of over about 0.5 μm, however, the pore size distribution was found to be similar to the distribution in traditional concretes, with similar absorption rates.
Energy Technology Data Exchange (ETDEWEB)
Sigrist, J.F. [DCN Propulsion, Service Technique et Scientifique, 44 - La Montagne (France); Broc, D. [CEA Saclay, Lab. d' Etude Mecanique et Sismique, 91 - Gif-sur-Yvette (France)
2007-03-15
A homogenization method is presented and validated in order to perform the dynamic analysis of a nuclear pressure vessel with a 'reduced' numerical model accounting for inertial fluid-structure coupling and describing the geometrical details of the internal structures, periodically embedded within the nuclear reactor. Homogenization techniques have been widely used in nuclear engineering to model confinement effects in reactor cores or tube bundles. Application of such techniques to reactor internals is investigated in the present paper. The theoretical bases of the method are first recalled. Adaptation of the homogenization approach to the case of reactor internals is then exposed: it is shown that in this case, confinement effects can be modelled by a suitable modification of the classical fluid-structure symmetric formulation. The method is then validated by comparison of 3D and 2D calculations. In the latter, a 'reduced' model with homogenized fluid is used, whereas in the former, a full finite element model of the nuclear pressure vessel with internal structures is elaborated. The homogenization approach is proved to be efficient from the numerical point of view and accurate from the physical point of view. Confinement effects in the industrial case can then be highlighted. (authors)
Methods of Verification, Accountability and Control of Special Nuclear Material
Energy Technology Data Exchange (ETDEWEB)
Stewart, J.E.
1999-05-03
This session demonstrates nondestructive assay (NDA) measurement, surveillance and analysis technology required to protect, control and account (MPC and A) for special nuclear materials (SNM) in sealed containers. These measurements, observations and analyses comprise state-of-the art, strengthened, SNM safeguards systems. Staff member specialists, actively involved in research, development, training and implementation worldwide, will present six NDA verification systems and two software tools for integration and analysis of facility MPC and A data.
Dynamic spreading behavior of homogeneous and heterogeneous networks
Institute of Scientific and Technical Information of China (English)
XIA Chengyi; LIU Zhongxin; CHEN Zengqiang; YUAN Zhuzhi
2007-01-01
A detailed investigation of dynamic epidemic spreading on homogeneous and heterogeneous networks was carried out. After an analysis of the basic epidemic models, the susceptible-infected-susceptible (SIS) model on homogeneous and heterogeneous networks is established, and the dynamical evolution of the density of infected individuals in these two kinds of networks is analyzed theoretically. The analysis indicates that epidemics propagate more readily on heterogeneous networks, and that the leading spreading behavior is dictated by exponential growth in the initial outbreak. Large-scale simulations show that the infection spreads much faster on heterogeneous networks than on homogeneous ones, meaning that network topology can have a significant effect on epidemics taking place on complex networks. Some containment strategies for epidemic outbreaks are presented according to the theoretical analyses and numerical simulations.
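The threshold contrast the abstract describes can be illustrated with a minimal degree-based mean-field sketch of the SIS dynamics. All parameter values below (infection rate λ = 0.15, mean degree 5, truncated power-law exponent 3) are illustrative assumptions, not taken from the paper.

```python
# Degree-based mean-field SIS dynamics, integrated with forward Euler.
# Homogeneous case: every node has degree k_mean, so the epidemic
# threshold is 1/k_mean. Heterogeneous (annealed) case: P(k) ~ k**-gamma,
# so the threshold drops to <k>/<k^2>.

def sis_homogeneous(lam, k_mean, rho0=0.01, dt=0.01, steps=2000):
    """Prevalence after integration when every node has degree k_mean."""
    rho = rho0
    for _ in range(steps):
        rho += dt * (-rho + lam * k_mean * rho * (1.0 - rho))
    return rho

def sis_heterogeneous(lam, kmin=3, kmax=100, gamma=3.0,
                      rho0=0.01, dt=0.01, steps=2000):
    """Prevalence on an annealed network with degree distribution ~ k**-gamma."""
    ks = list(range(kmin, kmax + 1))
    weights = [k ** -gamma for k in ks]
    z = sum(weights)
    pk = [w / z for w in weights]          # normalized degree distribution
    k_mean = sum(k * p for k, p in zip(ks, pk))
    rho_k = [rho0] * len(ks)               # prevalence in each degree class
    for _ in range(steps):
        # Probability that a randomly chosen edge points to an infected node.
        theta = sum(k * p * r for k, p, r in zip(ks, pk, rho_k)) / k_mean
        rho_k = [r + dt * (-r + lam * k * (1.0 - r) * theta)
                 for k, r in zip(ks, rho_k)]
    return sum(p * r for p, r in zip(pk, rho_k))
```

With λ = 0.15 and mean degree 5, the homogeneous threshold 1/⟨k⟩ = 0.2 is not reached and the outbreak dies out, while the heterogeneous threshold ⟨k⟩/⟨k²⟩ is far lower for this power-law distribution, so the same λ sustains an endemic state there.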
Energy Technology Data Exchange (ETDEWEB)
Krhounek, V.; Zdarek, J.; Pecinka, L. [Nuclear Research Institute, Rez (Czech Republic)
1997-04-01
Loss of coolant accident (LOCA) experiments performed as part of a Leak Before Break (LBB) analysis are very briefly summarized. The aim of these experiments was to postulate the leak rates of the coolant. Through-wall cracks were introduced into pipes by fatigue cycling and hydraulically loaded in a test device. Measurements included coolant pressure and temperature, quantity of leaked coolant, displacement of a specimen, and acoustic emission. Small cracks were plugged with particles in the coolant during testing. It is believed that plugging will have no effect in cracks with leak rates above 35 liters per minute. The leak rate safety margin of 10 is sufficient for cracks in which the leak rate is more than 5 liters per minute.
CLASSIFICATION OF CUBIC PARAMETERIZED HOMOGENEOUS VECTOR FIELDS
Institute of Scientific and Technical Information of China (English)
Kamal H. Yasir; TANG Yun
2002-01-01
In this paper the cubic homogeneous parameterized vector fields are studied. The classification of the phase portrait near the critical point is presented. This classification is an extension of the result given by Takens to the cubic homogeneous parameterized vector fields with six parameters.
Investigations into homogenization of electromagnetic metamaterials
DEFF Research Database (Denmark)
Clausen, Niels Christian Jerichau
This dissertation encompasses homogenization methods, with a special interest into their applications to metamaterial homogenization. The first method studied is the Floquet-Bloch method, that is based on the assumption of a material being infinite periodic. Its field can then be expanded in term...
The Case Against Homogeneous Sets in Mathematics
Jackman, M. K.
1973-01-01
A point-by-point criticism is made of F. H. Flynn's article, "The Case for Homogeneous Sets in Mathematics" (Mathematics in School, Volume 1 Number 2, 1972), in an attempt to show that the arguments used in trying to justify homogeneous grouping in mathematics are invalid. (Editor/DT)
The homogeneous geometries of real hyperbolic space
DEFF Research Database (Denmark)
Castrillón López, Marco; Gadea, Pedro Martínez; Swann, Andrew Francis
We describe the holonomy algebras of all canonical connections of homogeneous structures on real hyperbolic spaces in all dimensions. The structural results obtained then lead to a determination of the types, in the sense of Tricerri and Vanhecke, of the corresponding homogeneous tensors. We use ...
DETERMINISTIC HOMOGENIZATION OF QUASILINEAR DAMPED HYPERBOLIC EQUATIONS
Institute of Scientific and Technical Information of China (English)
Gabriel Nguetseng; Hubert Nnang; Nils Svanstedt
2011-01-01
Deterministic homogenization is studied for quasilinear monotone hyperbolic problems with a linear damping term. It is shown by the sigma-convergence method that the sequence of solutions to a class of multi-scale, highly oscillatory hyperbolic problems converges to the solution of a homogenized quasilinear hyperbolic problem.
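The setting described above can be sketched in a generic form (an illustrative statement only; the paper's exact operators, scalings, and admissibility conditions may differ):

```latex
\partial_t^2 u_\varepsilon
  - \operatorname{div}\, a\!\left(\tfrac{x}{\varepsilon},\, \nabla u_\varepsilon\right)
  + \beta\,\partial_t u_\varepsilon = f,
\qquad
u_\varepsilon \xrightarrow[\varepsilon \to 0]{} u,
\qquad
\partial_t^2 u - \operatorname{div}\, b(\nabla u) + \beta\,\partial_t u = f,
```

where $a(y,\cdot)$ is a monotone operator oscillating at the small scale $\varepsilon$, $\beta > 0$ is the linear damping coefficient, and $b$ is the effective operator identified in the sigma-convergence limit.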