WorldWideScience

Sample records for integration method based

  1. Towards risk-based structural integrity methods for PWRs

    International Nuclear Information System (INIS)

    Chapman, O.J.V.; Lloyd, R.B.

    1992-01-01

    This paper describes the development of risk-based structural integrity assurance methods and their application to Pressurized Water Reactor (PWR) plant. In-service inspection is introduced as a way of reducing the failure probability of high risk sites and the latter are identified using reliability analysis; the extent and interval of inspection can also be optimized. The methodology is illustrated by reference to the aspect of reliability of weldments in PWR systems. (author)

  2. A numerical integration-based yield estimation method for integrated circuits

    International Nuclear Information System (INIS)

    Liang Tao; Jia Xinzhang

    2011-01-01

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)
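
    The workflow sketched in this abstract, transform the simulated performance samples toward multivariate normality with a Box-Cox transformation, fit the joint density, and evaluate the probability mass inside the acceptability region, can be illustrated with a minimal Python sketch. The OA-MLHS sampling step is omitted, and the performance metrics and spec limits below are invented for illustration; this is not the authors' implementation.

```python
# Minimal sketch of integration-based yield estimation (not the authors' code):
# 1) Box-Cox-transform each simulated performance metric toward normality,
# 2) fit a multivariate normal to the transformed samples,
# 3) estimate yield as the probability mass inside the (transformed) spec region.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Illustrative "simulated performance data": two positive, skewed metrics.
perf = np.column_stack([
    rng.lognormal(mean=0.0, sigma=0.25, size=5000),   # e.g. a gain error
    rng.gamma(shape=4.0, scale=0.5, size=5000),       # e.g. a settling time
])
spec_upper = np.array([1.8, 4.0])   # hypothetical upper spec limits per metric

# Box-Cox transformation (BCT) of each column; keep the fitted lambdas.
transformed, lambdas = [], []
for j in range(perf.shape[1]):
    t, lam = stats.boxcox(perf[:, j])
    transformed.append(t)
    lambdas.append(lam)
transformed = np.column_stack(transformed)

# Fit a multivariate normal to the transformed data.
mu = transformed.mean(axis=0)
cov = np.cov(transformed, rowvar=False)

# Map the spec limits into the transformed space and estimate the yield by
# sampling the fitted density (a stand-in for integrating it over the
# acceptability region).
spec_t = np.array([stats.boxcox(np.array([u]), lmbda=lam)[0]
                   for u, lam in zip(spec_upper, lambdas)])
samples = rng.multivariate_normal(mu, cov, size=200_000)
yield_est = np.mean(np.all(samples <= spec_t, axis=1))
print(f"estimated parametric yield: {yield_est:.3f}")
```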

  3. A numerical integration-based yield estimation method for integrated circuits

    Energy Technology Data Exchange (ETDEWEB)

    Liang Tao; Jia Xinzhang, E-mail: tliang@yahoo.cn [Key Laboratory of Ministry of Education for Wide Bandgap Semiconductor Materials and Devices, School of Microelectronics, Xidian University, Xi' an 710071 (China)

    2011-04-15

    A novel integration-based yield estimation method is developed for yield optimization of integrated circuits. This method tries to integrate the joint probability density function on the acceptability region directly. To achieve this goal, the simulated performance data of unknown distribution should be converted to follow a multivariate normal distribution by using Box-Cox transformation (BCT). In order to reduce the estimation variances of the model parameters of the density function, orthogonal array-based modified Latin hypercube sampling (OA-MLHS) is presented to generate samples in the disturbance space during simulations. The principle of variance reduction of model parameters estimation through OA-MLHS together with BCT is also discussed. Two yield estimation examples, a fourth-order OTA-C filter and a three-dimensional (3D) quadratic function are used for comparison of our method with Monte Carlo based methods including Latin hypercube sampling and importance sampling under several combinations of sample sizes and yield values. Extensive simulations show that our method is superior to other methods with respect to accuracy and efficiency under all of the given cases. Therefore, our method is more suitable for parametric yield optimization. (semiconductor integrated circuits)

  4. A method for establishing integrity in software-based systems

    International Nuclear Information System (INIS)

    Staple, B.D.; Berg, R.S.; Dalton, L.J.

    1997-01-01

    In this paper, the authors present a digital system requirements specification method that has demonstrated a potential for improving the completeness of requirements while reducing ambiguity. It assists with making proper digital system design decisions, including the defense against specific digital system failure modes, and helps define the technical rationale for all of the component and interface requirements. This approach is a procedural method that abstracts key features, which are then expanded in a partitioning that identifies and characterizes hazards and safety system function requirements. The key system features are subjected to a hierarchy that progressively defines their detailed characteristics and components. This process produces a set of requirements specifications for the system and all of its components. Based on application to nuclear power plants, the approach described here uses two ordered domains: plant safety followed by safety system integrity. Plant safety refers to those systems defined to meet the safety goals for the protection of the public. Safety system integrity refers to systems defined to ensure that the system can meet the safety goals. Within each domain, a systematic process is used to identify hazards and define the corresponding means of defense and mitigation. In both domains, the approach and structure are focused on the completeness of information and on eliminating ambiguities in the generation of safety system requirements that will achieve the plant safety goals.

  5. Integrated method for the measurement of trace nitrogenous atmospheric bases

    Directory of Open Access Journals (Sweden)

    D. Key

    2011-12-01

    Full Text Available Nitrogenous atmospheric bases are thought to play a key role in the global nitrogen cycle, but their sources, transport, and sinks remain poorly understood. Of the many methods available to measure such compounds in ambient air, few meet the current need of being applicable to the complete range of potential analytes and fewer still are convenient to implement using instrumentation that is standard to most laboratories. In this work, an integrated approach to measuring trace, atmospheric, gaseous nitrogenous bases has been developed and validated. The method uses a simple acid scrubbing step to capture and concentrate the bases as their phosphite salts, which then are derivatized and analyzed using GC/MS and/or LC/MS. The advantages of both techniques in the context of the present measurements are discussed. The approach is sensitive, selective, reproducible, and convenient to implement, and has been validated for different sampling strategies. The limits of detection for the families of tested compounds are suitable for ambient measurement applications (e.g., methylamine, 1 pptv; ethylamine, 2 pptv; morpholine, 1 pptv; aniline, 1 pptv; hydrazine, 0.1 pptv; methylhydrazine, 2 pptv), as supported by field measurements in an urban park and in the exhaust of on-road vehicles.

  6. A study of compositional verification based IMA integration method

    Science.gov (United States)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system testing, so the IMA test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, fault isolation in an IMA system is difficult, so a critical problem in IMA verification is how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily cover the whole system, but for a large, highly integrated system a complete test is hard to achieve. This paper therefore applies compositional-verification theory to IMA system testing in order to reduce the number of test processes, improve efficiency, and consequently lower the cost of IMA system integration.

  7. [Development method of healthcare information system integration based on business collaboration model].

    Science.gov (United States)

    Li, Shasha; Nie, Hongchao; Lu, Xudong; Duan, Huilong

    2015-02-01

    Integration of heterogeneous systems is the key to hospital information construction due to complexity of the healthcare environment. Currently, during the process of healthcare information system integration, people participating in integration project usually communicate by free-format document, which impairs the efficiency and adaptability of integration. A method utilizing business process model and notation (BPMN) to model integration requirement and automatically transforming it to executable integration configuration was proposed in this paper. Based on the method, a tool was developed to model integration requirement and transform it to integration configuration. In addition, an integration case in radiology scenario was used to verify the method.

  8. Overlay improvement methods with diffraction based overlay and integrated metrology

    Science.gov (United States)

    Nam, Young-Sun; Kim, Sunny; Shin, Ju Hee; Choi, Young Sin; Yun, Sang Ho; Kim, Young Hoon; Shin, Si Woo; Kong, Jeong Heung; Kang, Young Seog; Ha, Hun Hwan

    2015-03-01

    To meet the new requirement of securing more overlay margin, optical overlay measurement not only faces technical limitations in representing the behavior of cell patterns, but also requires larger measurement samples to minimize statistical errors and to better estimate the conditions within a lot. For these reasons, diffraction based overlay (DBO) and integrated metrology (IM) are proposed in this paper as new approaches for overlay enhancement.

  9. Integrative health care method based on combined complementary ...

    African Journals Online (AJOL)

    Background: There are various models of health care, such as the ... sociological, economic, systemic of Neuman, cognitive medicine or ecological, ayurvedic, ... 2013, with a comprehensive approach in 64 patients using the clinical method.

  10. Structural reliability calculation method based on the dual neural network and direct integration method.

    Science.gov (United States)

    Li, Haibin; He, Yun; Nie, Xiaobo

    2018-01-01

    Structural reliability analysis under uncertainty receives wide attention from engineers and scholars because it reflects the structural characteristics and the actual loading conditions. The direct integration method, which starts from the definition of reliability theory, is easy to understand, but the calculation of the multiple integrals remains mathematically difficult. Therefore, a dual neural network method for calculating the multiple integrals is proposed in this paper. The dual neural network consists of two neural networks: network A learns the integrand function, and network B simulates the original (antiderivative) function. According to the derivative relationship between the network output and the network input, network B is derived from network A. On this basis, a normalized performance function is employed in the proposed method to overcome the difficulty of multiple integration and to improve the accuracy of the reliability calculation. Comparisons between the proposed method and the Monte Carlo simulation method, the Hasofer-Lind method, and the mean value first-order second-moment method demonstrate that the proposed method is an efficient and accurate method for structural reliability problems.
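
    For reference, the direct integration the abstract starts from writes the failure probability as Pf = ∫ I[g(x) ≤ 0] f(x) dx. The sketch below evaluates that integral on a grid and checks it against Monte Carlo and the exact value for an illustrative two-variable linear limit state; the dual-neural-network surrogate itself is not reproduced, and all parameter values are invented.

```python
# Sketch of the direct-integration definition of failure probability,
# P_f = integral of I[g(x) <= 0] * f(x) dx, for an illustrative 2-D limit
# state g(x1, x2) = x1 - x2 with independent normal variables.  This is the
# baseline the dual-neural-network method accelerates; the network surrogate
# itself is not reproduced here.
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

mu1, s1 = 10.0, 1.5   # "resistance"
mu2, s2 = 6.0, 2.0    # "load"

def g(x1, x2):
    return x1 - x2     # failure when g <= 0

# Direct numerical integration on a grid.
x1 = np.linspace(mu1 - 6 * s1, mu1 + 6 * s1, 801)
x2 = np.linspace(mu2 - 6 * s2, mu2 + 6 * s2, 801)
X1, X2 = np.meshgrid(x1, x2, indexing="ij")
pdf = stats.norm.pdf(X1, mu1, s1) * stats.norm.pdf(X2, mu2, s2)
indicator = (g(X1, X2) <= 0.0)
pf_grid = trapezoid(trapezoid(pdf * indicator, x2, axis=1), x1)

# Monte Carlo reference.
rng = np.random.default_rng(1)
n = 1_000_000
g_samples = g(rng.normal(mu1, s1, n), rng.normal(mu2, s2, n))
pf_mc = np.mean(g_samples <= 0.0)

# Exact value for this linear limit state.
pf_exact = stats.norm.cdf(0.0, loc=mu1 - mu2, scale=np.hypot(s1, s2))
print(f"grid: {pf_grid:.5f}  MC: {pf_mc:.5f}  exact: {pf_exact:.5f}")
```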

  11. Hybrid Pixel-Based Method for Cardiac Ultrasound Fusion Based on Integration of PCA and DWT

    Directory of Open Access Journals (Sweden)

    Samaneh Mazaheri

    2015-01-01

    Full Text Available Medical image fusion is the procedure of combining several images from one or multiple imaging modalities. In spite of numerous attempts at automating ventricle segmentation and tracking in echocardiography, the task remains challenging because of low-quality images with missing anatomical details, speckle noise, and a restricted field of view. This paper presents a fusion method which particularly aims to increase the segment-ability of echocardiography features such as the endocardium and to improve image contrast. In addition, it tries to expand the field of view, decrease the impact of noise and artifacts, and enhance the signal-to-noise ratio of the echo images. The proposed algorithm weights the image information according to an integration feature between all the overlapping images, using a combination of principal component analysis and the discrete wavelet transform. For evaluation, the results of several well-known techniques are compared with those of the proposed method, and different metrics are implemented to evaluate the performance of the proposed algorithm. It is concluded that the presented pixel-based method based on the integration of PCA and DWT gives the best results for the segment-ability of cardiac ultrasound images and better performance in all metrics.
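
    A minimal pixel-level sketch of a PCA-plus-DWT fusion rule of the kind described above is given below: a single-level Haar DWT, PCA-derived weights on the approximation bands, and a maximum-absolute rule on the detail bands. It is a generic illustration rather than the authors' exact pipeline, and it assumes the PyWavelets (pywt) package; the input "views" are synthetic.

```python
# Minimal sketch of pixel-based fusion combining PCA and DWT
# (generic illustration of the idea, not the paper's exact algorithm).
# Requires: numpy, PyWavelets (pywt).
import numpy as np
import pywt

def fuse_pca_dwt(img_a: np.ndarray, img_b: np.ndarray, wavelet: str = "haar") -> np.ndarray:
    """Fuse two registered, equally sized grayscale images."""
    # 1) Single-level 2-D DWT of both images.
    cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a.astype(float), wavelet)
    cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b.astype(float), wavelet)

    # 2) PCA weights for the approximation bands: the components of the
    #    dominant eigenvector of the 2x2 covariance matrix act as weights.
    data = np.vstack([cA_a.ravel(), cA_b.ravel()])
    eigvals, eigvecs = np.linalg.eigh(np.cov(data))
    v = np.abs(eigvecs[:, np.argmax(eigvals)])
    w_a, w_b = v / v.sum()
    cA_f = w_a * cA_a + w_b * cA_b

    # 3) Detail bands: keep the coefficient with larger magnitude (sharper detail).
    def pick(d_a, d_b):
        return np.where(np.abs(d_a) >= np.abs(d_b), d_a, d_b)

    fused = (cA_f, (pick(cH_a, cH_b), pick(cV_a, cV_b), pick(cD_a, cD_b)))
    return pywt.idwt2(fused, wavelet)

# Toy usage with two synthetic overlapping "echo" views.
rng = np.random.default_rng(0)
base = np.clip(rng.normal(0.5, 0.1, (128, 128)), 0, 1)
view_a = base + 0.05 * rng.standard_normal(base.shape)
view_b = np.roll(base, 1, axis=1) + 0.05 * rng.standard_normal(base.shape)
print(fuse_pca_dwt(view_a, view_b).shape)
```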

  12. Review of Polynomial Chaos-Based Methods for Uncertainty Quantification in Modern Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Arun Kaintura

    2018-02-01

    Full Text Available Advances in manufacturing process technology are key ensembles for the production of integrated circuits in the sub-micrometer region. It is of paramount importance to assess the effects of tolerances in the manufacturing process on the performance of modern integrated circuits. The polynomial chaos expansion has emerged as a suitable alternative to standard Monte Carlo-based methods that are accurate, but computationally cumbersome. This paper provides an overview of the most recent developments and challenges in the application of polynomial chaos-based techniques for uncertainty quantification in integrated circuits, with particular focus on high-dimensional problems.
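
    To make the idea concrete, here is a minimal one-dimensional polynomial chaos expansion in a Gaussian parameter, using probabilists' Hermite polynomials and Gauss-Hermite quadrature. The "circuit response" is an invented stand-in for a real simulator output, and the truncation order is arbitrary.

```python
# Minimal 1-D polynomial chaos expansion (PCE) sketch for a response y(xi),
# where xi ~ N(0, 1) models a normalized process parameter.  The "circuit
# response" below is an illustrative stand-in for a real simulator call.
import numpy as np
from numpy.polynomial import hermite_e as He  # probabilists' Hermite, weight exp(-x^2/2)
from math import factorial, sqrt, pi

def response(xi):
    """Made-up performance metric, e.g. a gain that depends on the parameter."""
    return 10.0 + 1.5 * xi + 0.4 * xi**2 - 0.1 * xi**3

order = 5                                  # truncation order of the expansion
nodes, weights = He.hermegauss(order + 5)  # Gauss-Hermite_e quadrature rule

# Spectral projection: c_n = E[y(xi) He_n(xi)] / E[He_n(xi)^2], with
# E[He_n^2] = n! and the quadrature weights summing to sqrt(2*pi).
coeffs = []
for n in range(order + 1):
    basis = He.hermeval(nodes, [0] * n + [1])            # He_n at the nodes
    proj = np.sum(weights * response(nodes) * basis) / sqrt(2 * pi)
    coeffs.append(proj / factorial(n))

# Statistics fall out of the coefficients directly.
mean_pce = coeffs[0]
var_pce = sum(factorial(n) * coeffs[n] ** 2 for n in range(1, order + 1))

# Monte Carlo check.
rng = np.random.default_rng(0)
samples = response(rng.standard_normal(1_000_000))
print(f"PCE  mean={mean_pce:.4f}  var={var_pce:.4f}")
print(f"MC   mean={samples.mean():.4f}  var={samples.var():.4f}")
```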

  13. An Accurate Integral Method for Vibration Signal Based on Feature Information Extraction

    Directory of Open Access Journals (Sweden)

    Yong Zhu

    2015-01-01

    Full Text Available After summarizing the advantages and disadvantages of current integration methods, a novel vibration signal integration method based on feature information extraction is proposed. The method takes full advantage of the self-adaptive filtering characteristic and waveform-correction feature of ensemble empirical mode decomposition in dealing with nonlinear and nonstationary signals, and combines the strengths of kurtosis, mean square error, energy, and singular value decomposition for signal feature extraction. The values of these four indexes are combined into a feature vector; the meaningful characteristic components in the vibration signal are then accurately extracted by a Euclidean distance search, and the desired integrated signals are precisely reconstructed. With this method, the interference from invalid signal components such as trend items and noise, which plagues traditional methods, is largely eliminated; the large cumulative error of traditional time-domain integration is overcome, and the large low-frequency error of traditional frequency-domain integration is avoided. Compared with traditional integration methods, this method is better at removing noise while retaining useful feature information, and shows higher accuracy.
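
    For context, the conventional frequency-domain (omega-arithmetic) integration whose low-frequency error the proposed method addresses can be sketched as follows; the test signal, sampling rate, and high-pass cutoff are illustrative.

```python
# Sketch of conventional frequency-domain integration of a vibration signal
# (the baseline whose low-frequency error the feature-extraction method
# addresses): divide the spectrum by j*omega and suppress the band below a
# cutoff.  Signal, sample rate and cutoff are illustrative.
import numpy as np

fs = 2048.0                      # sampling rate [Hz]
t = np.arange(0, 2.0, 1.0 / fs)
acc = (2 * np.pi * 50) ** 2 * 0.001 * np.sin(2 * np.pi * 50 * t) \
      + 0.5 * np.random.default_rng(0).standard_normal(t.size)   # acceleration + noise

def integrate_fd(x, fs, f_cut=5.0):
    """Single integration in the frequency domain with a low-frequency cutoff."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, d=1.0 / fs)
    omega = 2 * np.pi * f
    Y = np.zeros_like(X)
    band = f >= f_cut                      # discard the error-prone low band
    Y[band] = X[band] / (1j * omega[band])
    return np.fft.irfft(Y, n=x.size)

vel = integrate_fd(acc, fs)                # acceleration -> velocity
disp = integrate_fd(vel, fs)               # velocity -> displacement
print(f"peak displacement ~ {np.max(np.abs(disp)):.2e} m (tone amplitude was 1.0e-03 m)")
```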

  14. System and method for integrating hazard-based decision making tools and processes

    Science.gov (United States)

    Hodgin, C Reed [Westminster, CO

    2012-03-20

    A system and method for inputting, analyzing, and disseminating information necessary for identified decision-makers to respond to emergency situations. This system and method provides consistency and integration among multiple groups, and may be used for both initial consequence-based decisions and follow-on consequence-based decisions. The system and method in a preferred embodiment also provides tools for accessing and manipulating information that are appropriate for each decision-maker, in order to achieve more reasoned and timely consequence-based decisions. The invention includes processes for designing and implementing a system or method for responding to emergency situations.

  15. Review of Polynomial Chaos-Based Methods for Uncertainty Quantification in Modern Integrated Circuits

    OpenAIRE

    Arun Kaintura; Tom Dhaene; Domenico Spina

    2018-01-01

    Advances in manufacturing process technology are key ensembles for the production of integrated circuits in the sub-micrometer region. It is of paramount importance to assess the effects of tolerances in the manufacturing process on the performance of modern integrated circuits. The polynomial chaos expansion has emerged as a suitable alternative to standard Monte Carlo-based methods that are accurate, but computationally cumbersome. This paper provides an overview of the most recent developm...

  16. Research on Integrated Analysis Method for Equipment and Tactics Based on Intervention Strategy Discussion

    Institute of Scientific and Technical Information of China (English)

    陈超; 张迎新; 毛赤龙

    2012-01-01

    With the increasing complexity of information warfare, its intervention strategies need to be designed in an integrated environment. However, current research tends to break the internal relation between equipment and tactics, which makes it difficult to meet the requirements of their integrated analysis. In this paper, the state of research on the integrated analysis of equipment and tactics is reviewed first, the shortcomings of current methods are then summarized, and an evolution mechanism for the integrated analysis of equipment and tactics is given. On this basis, a framework for integrated analysis is proposed, and its effectiveness is validated with an example.

  17. Advanced image based methods for structural integrity monitoring: Review and prospects

    Science.gov (United States)

    Farahani, Behzad V.; Sousa, Pedro José; Barros, Francisco; Tavares, Paulo J.; Moreira, Pedro M. G. P.

    2018-02-01

    There is a growing trend in engineering to develop methods for structural integrity monitoring and characterization of in-service mechanical behaviour of components. The fast growth in recent years of image processing techniques and image-based sensing for experimental mechanics, brought about a paradigm change in phenomena sensing. Hence, several widely applicable optical approaches are playing a significant role in support of experiment. The current review manuscript describes advanced image based methods for structural integrity monitoring, and focuses on methods such as Digital Image Correlation (DIC), Thermoelastic Stress Analysis (TSA), Electronic Speckle Pattern Interferometry (ESPI) and Speckle Pattern Shearing Interferometry (Shearography). These non-contact full-field techniques rely on intensive image processing methods to measure mechanical behaviour, and evolve even as reviews such as this are being written, which justifies a special effort to keep abreast of this progress.

  18. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    Directory of Open Access Journals (Sweden)

    Shan Yang

    2016-01-01

    Full Text Available Power flow calculation and short circuit calculation are the basis of theoretical research on distribution networks with inverter-based distributed generation. The similarity of the equivalent models of inverter-based distributed generation under normal and fault conditions of the distribution network, and the differences between power flow and short circuit calculation, are analyzed in this paper. An integrated power flow and short circuit calculation method for distribution networks with inverter-based distributed generation is then proposed. The proposed method represents the inverter-based distributed generation as an Iθ bus, which makes it suitable for calculating the power flow of a distribution network with current-limited inverter-based distributed generation; the low voltage ride-through capability of the inverter-based units can be considered as well. Finally, power flow and short circuit current calculations are performed on a 33-bus distribution network. The results of the proposed method are compared with those of the traditional method and of simulation, which verifies the effectiveness of the integrated method suggested in this paper.
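
    The key modelling choice, treating the inverter-based DG unit as a constant-current (Iθ) injection inside an otherwise ordinary backward/forward sweep power flow, can be sketched on a small radial feeder. The feeder data below are invented and much smaller than the paper's 33-bus case.

```python
# Sketch: backward/forward sweep power flow on a tiny radial feeder in which
# an inverter-based DG unit is modelled as a constant-current (I-theta)
# injection.  Feeder data are illustrative, not the paper's 33-bus network.
import numpy as np

V_base = 12.66e3                      # nominal feeder voltage, volts (illustrative)
# Branches: (from_bus, to_bus, R_ohm, X_ohm); bus 0 is the slack/source.
branches = [(0, 1, 0.1, 0.2), (1, 2, 0.15, 0.3), (2, 3, 0.2, 0.4)]
# Constant-power loads per bus [VA, complex].
S_load = np.array([0, 200e3 + 80e3j, 150e3 + 60e3j, 100e3 + 40e3j])
# Inverter-based DG at bus 3, modelled as a current source with fixed
# magnitude and angle relative to its terminal voltage (current-limited).
dg_bus, I_dg_mag, dg_angle = 3, 25.0, 0.0   # amps, radians

V = np.full(4, V_base + 0j)           # flat start
for _ in range(50):
    # DG injection current referenced to its terminal-voltage angle.
    I_dg = I_dg_mag * np.exp(1j * (np.angle(V[dg_bus]) + dg_angle))
    # Backward sweep: nodal currents, then accumulate branch currents.
    I_node = np.conj(S_load / V)
    I_node[dg_bus] -= I_dg            # injection reduces the net demand current
    I_branch = np.zeros(len(branches), dtype=complex)
    for k in range(len(branches) - 1, -1, -1):
        f, t_, r, x = branches[k]
        downstream = sum(I_branch[j] for j, (fj, _, _, _) in enumerate(branches) if fj == t_)
        I_branch[k] = I_node[t_] + downstream
    # Forward sweep: update voltages from the slack bus outward.
    V_new = V.copy()
    for k, (f, t_, r, x) in enumerate(branches):
        V_new[t_] = V_new[f] - (r + 1j * x) * I_branch[k]
    converged = np.max(np.abs(V_new - V)) < 1e-6
    V = V_new
    if converged:
        break

print(np.abs(V) / V_base)             # per-unit voltage profile with DG support
```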

  19. Integrated Power Flow and Short Circuit Calculation Method for Distribution Network with Inverter Based Distributed Generation

    OpenAIRE

    Yang, Shan; Tong, Xiangqian

    2016-01-01

    Power flow calculation and short circuit calculation are the basis of theoretical research for distribution network with inverter based distributed generation. The similarity of equivalent model for inverter based distributed generation during normal and fault conditions of distribution network and the differences between power flow and short circuit calculation are analyzed in this paper. Then an integrated power flow and short circuit calculation method for distribution network with inverte...

  20. Third-party Reverse logistics platform and method Based on Bilateral Resource Integration

    Directory of Open Access Journals (Sweden)

    Zheng Hong Zhen

    2016-01-01

    Full Text Available The dispersion of reverse logistics resources makes it difficult to create relationships between demanders and providers; as a result, personalized demands for the construction of enterprise reverse logistics cannot be satisfied and service quality cannot be guaranteed. Aiming at these problems, this paper presents a platform and method for enterprise reverse logistics based on bilateral resource integration (RLBRI). The method creates a third-party reverse logistics platform that brings together a large number of reverse logistics demanders and providers; the platform integrates the bilateral resources and acts as an intermediary to establish relationships between the two sides. Through the platform, a complete and high-quality business chain for enterprise reverse logistics can be built efficiently. Finally, an effective strategy for non-defective reverse logistics is put forward that depends on the integrity-checking service provided by third-party logistics; this strategy can shorten the distance of non-defective reverse transportation. Computational tests validate the strategy.

  1. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    Science.gov (United States)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors exist in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory and optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various kinds of incomplete information. The Bayesian approach is therefore utilized to incorporate such incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.

  2. A Nonparametric, Multiple Imputation-Based Method for the Retrospective Integration of Data Sets

    Science.gov (United States)

    Carrig, Madeline M.; Manrique-Vallier, Daniel; Ranby, Krista W.; Reiter, Jerome P.; Hoyle, Rick H.

    2015-01-01

    Complex research questions often cannot be addressed adequately with a single data set. One sensible alternative to the high cost and effort associated with the creation of large new data sets is to combine existing data sets containing variables related to the constructs of interest. The goal of the present research was to develop a flexible, broadly applicable approach to the integration of disparate data sets that is based on nonparametric multiple imputation and the collection of data from a convenient, de novo calibration sample. We demonstrate proof of concept for the approach by integrating three existing data sets containing items related to the extent of problematic alcohol use and associations with deviant peers. We discuss both necessary conditions for the approach to work well and potential strengths and weaknesses of the method compared to other data set integration approaches. PMID:26257437

  3. Research on Linux Trusted Boot Method Based on Reverse Integrity Verification

    Directory of Open Access Journals (Sweden)

    Chenlin Huang

    2016-01-01

    Full Text Available Trusted computing aims to build a trusted computing environment for information systems with the help of secure hardware TPM, which has been proved to be an effective way against network security threats. However, the TPM chips are not yet widely deployed in most computing devices so far, thus limiting the applied scope of trusted computing technology. To solve the problem of lacking trusted hardware in existing computing platform, an alternative security hardware USBKey is introduced in this paper to simulate the basic functions of TPM and a new reverse USBKey-based integrity verification model is proposed to implement the reverse integrity verification of the operating system boot process, which can achieve the effect of trusted boot of the operating system in end systems without TPMs. A Linux operating system booting method based on reverse integrity verification is designed and implemented in this paper, with which the integrity of data and executable files in the operating system are verified and protected during the trusted boot process phase by phase. It implements the trusted boot of operation system without TPM and supports remote attestation of the platform. Enhanced by our method, the flexibility of the trusted computing technology is greatly improved and it is possible for trusted computing to be applied in large-scale computing environment.
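
    The basic measurement step, hashing each boot component and comparing it against stored reference values so that tampering anywhere along the chain is detected, can be illustrated generically as below. This is not the paper's USBKey protocol; the component paths and reference hashes are placeholders.

```python
# Generic illustration of measuring boot-chain components by hashing them and
# comparing against stored reference values.  This is NOT the paper's
# USBKey-based reverse verification protocol; paths and reference values are
# placeholders that only show the measure-and-compare step.
import hashlib
from pathlib import Path

# Hypothetical boot chain, listed in boot order.
BOOT_COMPONENTS = ["/boot/grub/core.img", "/boot/vmlinuz", "/boot/initrd.img"]
# Reference measurements would normally live in tamper-resistant storage
# (TPM NVRAM, a USBKey, ...); here they are just a dict of placeholder hashes.
REFERENCE = {name: "0" * 64 for name in BOOT_COMPONENTS}

def measure(path: str) -> str:
    """SHA-256 of a component (streamed, so large images are fine)."""
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def verify_chain(components, reference, reverse=True):
    """Check each stage against its reference; reverse=True walks the chain
    from the last-loaded component back toward the first, mirroring the
    'reverse verification' ordering described in the abstract."""
    order = reversed(components) if reverse else components
    for name in order:
        if not Path(name).exists():
            return False, f"missing component: {name}"
        if measure(name) != reference.get(name):
            return False, f"integrity violation in: {name}"
    return True, "boot chain verified"

ok, msg = verify_chain(BOOT_COMPONENTS, REFERENCE)
print(ok, msg)
```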

  4. Study on Method of Geohazard Change Detection Based on Integrating Remote Sensing and GIS

    International Nuclear Information System (INIS)

    Zhao, Zhenzhen; Yan, Qin; Liu, Zhengjun; Luo, Chengfeng

    2014-01-01

    Following a comprehensive literature review, this paper looks at the analysis of geohazards using remote sensing information. It compares the basic types and methods of change detection, explores the principles of the common methods, and analyzes the characteristics and shortcomings of the methods commonly used in geohazard applications. Using the JieGu earthquake as a case study, this paper proposes a geohazard change detection method that integrates RS and GIS. When detecting changes between pre-earthquake and post-earthquake remote sensing images acquired at different phases, it is crucial to set an appropriate threshold. The method adopts a self-adapting algorithm to determine this threshold: a training region obtained after pixel-level comparison is selected, and the threshold value that best separates the changed pixels is derived from it. The threshold is then applied to the entire image, which maximizes the change detection accuracy, and the result is output to the GIS system for change analysis. The experimental results show that this geohazard change detection method based on integrating remote sensing and GIS information has higher accuracy, with obvious advantages compared with the traditional methods.
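
    The thresholding step described above, deriving a threshold from a training region of the difference image and then applying it to the whole scene, can be sketched as follows; synthetic arrays stand in for the pre- and post-event imagery, and the maximum-separation criterion is replaced by a simple two-class midpoint.

```python
# Sketch of change detection with a threshold learned from a training region:
# build the absolute difference image, derive a threshold from a small region
# known to contain both changed and unchanged pixels, apply it to the full
# scene.  Synthetic arrays stand in for pre-/post-event imagery.
import numpy as np

rng = np.random.default_rng(0)
pre = rng.normal(100, 5, (256, 256))
post = pre + rng.normal(0, 5, (256, 256))
post[100:140, 100:160] += 40          # simulated damaged area

diff = np.abs(post - pre)

# "Training region" straddling changed and unchanged ground; in practice this
# would be picked after visual pixel comparison, as the abstract describes.
train = diff[90:150, 90:170]
# Self-adapting threshold: midpoint between the means of the low (unchanged)
# and high (changed) halves of the training region, a simple stand-in for the
# paper's maximum-separation criterion.
lo = train[train <= np.median(train)].mean()
hi = train[train > np.median(train)].mean()
threshold = 0.5 * (lo + hi)

change_mask = diff > threshold        # apply to the entire image
print(f"threshold={threshold:.2f}, changed pixels={change_mask.sum()}")
# The binary mask would then be exported to the GIS layer for change analysis.
```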

  5. The Integrative Method Based on the Module-Network for Identifying Driver Genes in Cancer Subtypes

    Directory of Open Access Journals (Sweden)

    Xinguo Lu

    2018-01-01

    Full Text Available With advances in next-generation sequencing (NGS) technologies, a large number of multiple types of high-throughput genomics data are available. A great challenge in exploring cancer progression is to identify driver genes from among the variant genes by analyzing and integrating multiple types of genomics data. Breast cancer is known as a heterogeneous disease, and the identification of subtype-specific driver genes is critical to guide diagnosis, prognosis assessment, and treatment of breast cancer. We developed an integrated framework based on gene expression profiles and copy number variation (CNV) data to identify breast cancer subtype-specific driver genes. In this framework, a statistical machine-learning method is employed to select gene subsets, and a module-network analysis method is used to identify potential candidate driver genes; the final subtype-specific driver genes are obtained by pairwise comparison between subtypes. To validate the specificity of the driver genes, the gene expression data of these genes were used to classify the patient samples with 10-fold cross-validation, and enrichment analysis was also conducted on the identified driver genes. The experimental results show that the proposed integrative method can identify potential driver genes, and that a classifier built on these genes achieves better performance than one built on genes identified by other methods.

  6. New Internet search volume-based weighting method for integrating various environmental impacts

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Changyoon, E-mail: changyoon@yonsei.ac.kr; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr

    2016-01-15

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.

  7. New Internet search volume-based weighting method for integrating various environmental impacts

    International Nuclear Information System (INIS)

    Ji, Changyoon; Hong, Taehoon

    2016-01-01

    Weighting is one of the steps in life cycle impact assessment that integrates various characterized environmental impacts as a single index. Weighting factors should be based on the society's preferences. However, most previous studies consider only the opinion of some people. Thus, this research proposes a new weighting method that determines the weighting factors of environmental impact categories by considering public opinion on environmental impacts using the Internet search volumes for relevant terms. To validate the new weighting method, the weighting factors for six environmental impacts calculated by the new weighting method were compared with the existing weighting factors. The resulting Pearson's correlation coefficient between the new and existing weighting factors was from 0.8743 to 0.9889. It turned out that the new weighting method presents reasonable weighting factors. It also requires less time and lower cost compared to existing methods and likewise meets the main requirements of weighting methods such as simplicity, transparency, and reproducibility. The new weighting method is expected to be a good alternative for determining the weighting factor. - Highlight: • A new weighting method using Internet search volume is proposed in this research. • The new weighting method reflects the public opinion using Internet search volume. • The correlation coefficient between new and existing weighting factors is over 0.87. • The new weighting method can present the reasonable weighting factors. • The proposed method can be a good alternative for determining the weighting factors.
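
    The arithmetic behind the proposed weighting is easy to reproduce: normalize the search volumes of the impact-category terms into weighting factors and compare them with an existing weight set using Pearson's correlation. The category names, volumes, and reference weights below are invented for illustration.

```python
# Sketch of the search-volume weighting idea: normalize Internet search
# volumes for impact-category terms into weighting factors, then compare with
# an existing weight set via Pearson's r.  All numbers are invented.
import numpy as np

categories = ["global warming", "acidification", "eutrophication",
              "ozone depletion", "photochemical smog", "resource depletion"]
search_volume = np.array([880.0, 120.0, 95.0, 140.0, 60.0, 210.0])   # hypothetical
existing_weights = np.array([0.55, 0.07, 0.06, 0.09, 0.04, 0.19])    # hypothetical

new_weights = search_volume / search_volume.sum()   # normalized to sum to 1

r = np.corrcoef(new_weights, existing_weights)[0, 1]
for c, w in zip(categories, new_weights):
    print(f"{c:20s} {w:.3f}")
print(f"Pearson r vs. existing weighting factors: {r:.4f}")
```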

  8. Research on Multi - Person Parallel Modeling Method Based on Integrated Model Persistent Storage

    Science.gov (United States)

    Qu, MingCheng; Wu, XiangHu; Tao, YongChao; Liu, Ying

    2018-03-01

    This paper studies a multi-person parallel modeling method based on persistent storage of an integrated model. The integrated model refers to a set of MDDT modeling graphics systems that can describe aerospace general embedded software from multiple angles, at multiple levels, and across multiple stages. Persistent storage refers to converting the data model in memory into a storage model and converting the storage model back into a data model in memory, where the data model is the object model and the storage model is a binary stream. Multi-person parallel modeling addresses the need for multi-person collaboration, separation of roles, and even real-time remote synchronized modeling.

  9. Improved Genetic Algorithm-Based Unit Commitment Considering Uncertainty Integration Method

    Directory of Open Access Journals (Sweden)

    Kyu-Hyung Jo

    2018-05-01

    Full Text Available In light of the increasing amount of renewable energy connected to the power grid, it has become necessary to consider the uncertainty of renewable generation in the unit commitment (UC) problem. A methodology for solving the UC problem is presented that considers various uncertainties, assumed to be normally distributed, by using a Monte Carlo simulation. Based on the constructed scenarios for load, wind, solar, and generator outages, a combination of scenarios is found that meets the reserve requirement needed to secure the power balance of the grid. Among those scenarios, the uncertainty integration method (UIM) identifies the best combination by minimizing the additional reserve requirements caused by the uncertainty of the power sources. An integration process for the uncertainties is formulated for stochastic unit commitment (SUC) problems and optimized by the improved genetic algorithm (IGA). The IGA is composed of five procedures and finds the optimal combination of unit statuses at the scheduled times, based on the given source data. For the various numbers of units considered, the IGA demonstrates better performance than the other optimization methods by applying reserve repairing and an approximation process. To assess the proposed method, various UC strategies are tested with a modified 24-h UC test system and compared.

  10. Adaptive Sliding Mode Control Method Based on Nonlinear Integral Sliding Surface for Agricultural Vehicle Steering Control

    Directory of Open Access Journals (Sweden)

    Taochang Li

    2014-01-01

    Full Text Available Automatic steering control is the key factor and essential condition in the realization of the automatic navigation control of agricultural vehicles. In order to get satisfactory steering control performance, an adaptive sliding mode control method based on a nonlinear integral sliding surface is proposed in this paper for agricultural vehicle steering control. First, the vehicle steering system is modeled as a second-order mathematic model; the system uncertainties and unmodeled dynamics as well as the external disturbances are regarded as the equivalent disturbances satisfying a certain boundary. Second, a transient process of the desired system response is constructed in each navigation control period. Based on the transient process, a nonlinear integral sliding surface is designed. Then the corresponding sliding mode control law is proposed to guarantee the fast response characteristics with no overshoot in the closed-loop steering control system. Meanwhile, the switching gain of sliding mode control is adaptively adjusted to alleviate the control input chattering by using the fuzzy control method. Finally, the effectiveness and the superiority of the proposed method are verified by a series of simulation and actual steering control experiments.
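
    A stripped-down version of the control structure, a second-order steering model with an integral sliding surface and a smoothed switching term to limit chattering, is sketched below. The gains are constant rather than fuzzy-adapted, the sliding surface is linear rather than the paper's nonlinear one, and all parameter values are illustrative.

```python
# Stripped-down sliding mode steering controller on a second-order model
# x_ddot = u + d, with an integral sliding surface and a smoothed switching
# term.  The paper adapts the switching gain with fuzzy logic and uses a
# nonlinear integral surface; here the gains are constant and illustrative.
import numpy as np

dt, T = 0.01, 5.0
c, ki, K, phi = 4.0, 2.0, 3.0, 0.05    # surface and switching parameters
x = x_dot = e_int = 0.0                # steering angle, rate, error integral
x_ref = np.deg2rad(10.0)               # desired steering angle (step command)

history = []
for k in range(int(T / dt)):
    d = 0.5 * np.sin(2 * np.pi * 0.5 * k * dt)       # bounded disturbance
    e = x - x_ref
    e_dot = x_dot                                     # x_ref is constant
    e_int += e * dt
    s = e_dot + c * e + ki * e_int                    # integral sliding surface
    # Equivalent control plus smoothed switching term (tanh limits chattering).
    u = -c * e_dot - ki * e - K * np.tanh(s / phi)
    x_dot += (u + d) * dt                             # plant: x_ddot = u + d
    x += x_dot * dt
    history.append(x)

print(f"final angle = {np.rad2deg(history[-1]):.2f} deg (target 10 deg)")
```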

  11. A peaking-regulation-balance-based method for wind & PV power integrated accommodation

    Science.gov (United States)

    Zhang, Jinfang; Li, Nan; Liu, Jun

    2018-02-01

    The rapid development of China's new energy, now and in the future, should focus on the coordination of wind and PV power. Based on an analysis of the system peaking balance, combined with the statistical features of wind and PV power output characteristics, a method for the comprehensive integrated accommodation analysis of wind and PV power is put forward. The wind power installed capacity is determined first from the electric power balance during the night peak load period of a typical day; the PV power installed capacity is then obtained from the midday peak load hours. This effectively resolves the uncertainty that arises when traditional methods try to determine the combination of wind and solar power simultaneously. The simulation results validate the effectiveness of the proposed method.

  12. Learning Algorithm of Boltzmann Machine Based on Spatial Monte Carlo Integration Method

    Directory of Open Access Journals (Sweden)

    Muneki Yasuda

    2018-04-01

    Full Text Available The machine learning techniques for Markov random fields are fundamental in various fields involving pattern recognition, image processing, sparse modeling, and earth science, and the Boltzmann machine is one of the most important models in Markov random fields. However, the inference and learning problems in the Boltzmann machine are NP-hard. The investigation of an effective learning algorithm for the Boltzmann machine is one of the most important challenges in the field of statistical machine learning. In this paper, we study Boltzmann machine learning based on the (first-order) spatial Monte Carlo integration method, referred to as the 1-SMCI learning method, which was proposed in the author’s previous paper. In the first part of this paper, we compare the method with the maximum pseudo-likelihood estimation (MPLE) method using theoretical and numerical approaches, and show that the 1-SMCI learning method is more effective than MPLE. In the latter part, we compare the 1-SMCI learning method with other effective methods, ratio matching and minimum probability flow, using a numerical experiment, and show that the 1-SMCI learning method outperforms them.

  13. Integrated base stations and a method of transmitting data units in a communications system for mobile devices

    NARCIS (Netherlands)

    Bosch, H.G.P.; Mullender, Sape J.; Narlikar, G.J.; Samuel, L.G.; Yagati, L.N.

    2006-01-01

    Integrated base stations and a method of transmitting data units in a communications system for mobile devices. In one embodiment, an integrated base station includes a communications processor having a protocol stack configured with a media access control layer and a physical layer.

  14. Thermally-isolated silicon-based integrated circuits and related methods

    Science.gov (United States)

    Wojciechowski, Kenneth; Olsson, Roy H.; Clews, Peggy J.; Bauer, Todd

    2017-05-09

    Thermally isolated devices may be formed by performing a series of etches on a silicon-based substrate. As a result of the series of etches, silicon material may be removed from underneath a region of an integrated circuit (IC). The removal of the silicon material from underneath the IC forms a gap between remaining substrate and the integrated circuit, though the integrated circuit remains connected to the substrate via a support bar arrangement that suspends the integrated circuit over the substrate. The creation of this gap functions to release the device from the substrate and create a thermally-isolated integrated circuit.

  15. Method of making thermally-isolated silicon-based integrated circuits

    Science.gov (United States)

    Wojciechowski, Kenneth; Olsson, Roy; Clews, Peggy J.; Bauer, Todd

    2017-11-21

    Thermally isolated devices may be formed by performing a series of etches on a silicon-based substrate. As a result of the series of etches, silicon material may be removed from underneath a region of an integrated circuit (IC). The removal of the silicon material from underneath the IC forms a gap between remaining substrate and the integrated circuit, though the integrated circuit remains connected to the substrate via a support bar arrangement that suspends the integrated circuit over the substrate. The creation of this gap functions to release the device from the substrate and create a thermally-isolated integrated circuit.

  16. A new method to identify the foot of continental slope based on an integrated profile analysis

    Science.gov (United States)

    Wu, Ziyin; Li, Jiabiao; Li, Shoujun; Shang, Jihong; Jin, Xiaobin

    2017-06-01

    A new method is proposed to automatically identify the foot of the continental slope (FOS) based on the integrated analysis of topographic profiles. Using the extremum points of the second derivative and the Douglas-Peucker algorithm, it simplifies the topographic profiles and then calculates the second derivative of both the original profiles and the D-P profiles. Seven steps are proposed to simplify the original profiles. Meanwhile, multiple identification criteria are proposed to determine the FOS points, including the gradient, water depth, and second-derivative values of the data points, as well as the concavity and convexity, continuity, and segmentation of the topographic profiles. The method can comprehensively and automatically analyze the topographic profiles and their derived slopes, second derivatives, and D-P profiles, and on this basis it can analyze the essential properties of every data point in a profile. Furthermore, it is proposed to remove the concave points of the curve and, in addition, to implement six FOS judgment criteria.
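
    The two ingredients named above, Douglas-Peucker simplification of the depth profile and inspection of second-derivative extrema, can be combined in a compact sketch that flags a candidate FOS point on a synthetic shelf-slope-rise profile. The paper's full multi-criteria decision logic is not reproduced.

```python
# Sketch: simplify a bathymetric profile with Douglas-Peucker, compute the
# second derivative of depth along distance, and flag the maximum-curvature
# point at the base of the slope as a candidate foot-of-slope (FOS).
# The paper's full multi-criteria logic (gradient, depth, concavity,
# continuity, segmentation) is not reproduced; the profile is synthetic.
import numpy as np

def douglas_peucker(points, eps):
    """Recursive Douglas-Peucker polyline simplification; points is (N, 2)."""
    start, end = points[0], points[-1]
    dx, dy = end - start
    norm = np.hypot(dx, dy)
    # Perpendicular distance of every point from the chord start -> end.
    dists = np.abs(dy * (points[:, 0] - start[0]) - dx * (points[:, 1] - start[1])) / norm
    idx = int(np.argmax(dists))
    if dists[idx] > eps:
        left = douglas_peucker(points[: idx + 1], eps)
        right = douglas_peucker(points[idx:], eps)
        return np.vstack([left[:-1], right])
    return np.vstack([start, end])

# Synthetic profile: shelf (0-50 km), steep slope (50-90 km), gentle rise.
x = np.linspace(0, 200, 2001)                       # distance [km]
depth = np.where(x < 50, -200 - 0.5 * x,
         np.where(x < 90, -225 - 95 * (x - 50),
                  -4025 - 2 * (x - 90)))
depth = depth + 5 * np.random.default_rng(0).standard_normal(x.size)

simplified = douglas_peucker(np.column_stack([x, depth]), eps=30.0)

# Second derivative of the simplified profile, resampled back onto x.
depth_s = np.interp(x, simplified[:, 0], simplified[:, 1])
d2 = np.gradient(np.gradient(depth_s, x), x)

# FOS candidate: maximum positive curvature seaward of the steepest gradient
# (i.e. where the slope flattens onto the rise).
steepest = int(np.argmin(np.gradient(depth_s, x)))
fos_idx = steepest + int(np.argmax(d2[steepest:]))
print(f"candidate FOS at x = {x[fos_idx]:.1f} km, depth = {depth_s[fos_idx]:.0f} m")
```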

  17. Integration of sampling based battery state of health estimation method in electric vehicles

    International Nuclear Information System (INIS)

    Ozkurt, Celil; Camci, Fatih; Atamuradov, Vepa; Odorry, Christopher

    2016-01-01

    Highlights: • Presentation of a prototype system with full charge discharge cycling capability. • Presentation of SoH estimation results for systems degraded in the lab. • Discussion of integration alternatives of the presented method in EVs. • Simulation model based on presented SoH estimation for a real EV battery system. • Optimization of number of battery cells to be selected for SoH test. - Abstract: Battery cost is one of the crucial parameters affecting high deployment of Electric Vehicles (EVs) negatively. Accurate State of Health (SoH) estimation plays an important role in reducing the total ownership cost, availability, and safety of the battery avoiding early disposal of the batteries and decreasing unexpected failures. A circuit design for SoH estimation in a battery system that bases on selected battery cells and its integration to EVs are presented in this paper. A prototype microcontroller has been developed and used for accelerated aging tests for a battery system. The data collected in the lab tests have been utilized to simulate a real EV battery system. Results of accelerated aging tests and simulation have been presented in the paper. The paper also discusses identification of the best number of battery cells to be selected for SoH estimation test. In addition, different application options of the presented approach for EV batteries have been discussed in the paper.

  18. Method for assessing wind power integration in a hydro based power system

    International Nuclear Information System (INIS)

    Norheim, I.; Palsson, M.; Tande, J.O.G.; Uhlen, K.

    2006-01-01

    The present paper demonstrates a method for assessing how much wind power can be integrated in a system with limited transmission capacity. Based on hydro inflow data and wind measurements (for different locations of planned wind farms in an area), it is possible to assess how much wind power can be fed into a given point in the transmission network without violating the transmission capacity limits. The proposed method combines market modelling and detailed network analysis in order to assess the probability of network congestion rather than focusing on extreme cases. By computing the probability distribution of the power flow on critical corridors in the grid, it is possible to assess the likelihood of network congestion and the amount of energy that must be curtailed to fulfil power system security requirements (n-1). In this way the assessment is not limited to worst-case scenarios that assume maximal flow from hydro plants together with maximal wind power production. As extreme-case scenarios are short-term and may be handled by market mechanisms or automatic system protection schemes (disconnection of wind power or hydro power), the proposed method may reveal that it would be economic to install more wind power than an analysis based only on worst-case scenarios would suggest. (orig.)

  19. Ensuring the integrity of information resources based on methods of two-symbol structural data encoding

    Directory of Open Access Journals (Sweden)

    О.К. Юдін

    2009-01-01

    Full Text Available Methods are developed for estimating the noise immunity of structural code constructions and for correcting their distortions during data transmission in information and communication systems and networks, taking into account the need to ensure the integrity of the information resource.

  20. A computational method based on the integration of heterogeneous networks for predicting disease-gene associations.

    Directory of Open Access Journals (Sweden)

    Xingli Guo

    Full Text Available The identification of disease-causing genes is a fundamental challenge in human health and of great importance in improving medical care, and provides a better understanding of gene functions. Recent computational approaches based on the interactions among human proteins and disease similarities have shown their power in tackling the issue. In this paper, a novel systematic and global method that integrates two heterogeneous networks for prioritizing candidate disease-causing genes is provided, based on the observation that genes causing the same or similar diseases tend to lie close to one another in a network of protein-protein interactions. In this method, the association score function between a query disease and a candidate gene is defined as the weighted sum of all the association scores between similar diseases and neighbouring genes. Moreover, the topological correlation of these two heterogeneous networks can be incorporated into the definition of the score function, and finally an iterative algorithm is designed for this issue. This method was tested with 10-fold cross-validation on all 1,126 diseases that have at least a known causal gene, and it ranked the correct gene as one of the top ten in 622 of all the 1,428 cases, significantly outperforming a state-of-the-art method called PRINCE. The results brought about by this method were applied to study three multi-factorial disorders: breast cancer, Alzheimer disease and diabetes mellitus type 2, and some suggestions of novel causal genes and candidate disease-causing subnetworks were provided for further investigation.
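
    The scoring idea, a gene's association score is a weighted sum of the scores of similar diseases and neighbouring genes, iterated until convergence, is closely related to network propagation. The toy sketch below runs such an iteration on a miniature PPI network with a prior derived from disease similarity; it is a simplified illustration, not the paper's exact algorithm, and all data are invented.

```python
# Toy sketch of iterative score propagation for disease-gene prioritization:
# a gene's score mixes (a) prior evidence from diseases similar to the query
# and (b) the scores of its neighbours in a PPI network.  This is a simplified
# illustration of the general idea, not the paper's exact algorithm; the
# 5-gene network and similarity-derived priors are toy data.
import numpy as np

genes = ["G1", "G2", "G3", "G4", "G5"]
# Symmetric PPI adjacency (toy).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 1, 0],
              [1, 1, 0, 0, 0],
              [0, 1, 0, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
# Prior score per gene: weighted sum of its known associations with diseases
# similar to the query disease (already collapsed into one number here).
prior = np.array([0.0, 0.8, 0.0, 0.2, 0.0])
prior = prior / prior.sum()

# Column-normalize the adjacency so propagation conserves total score mass.
W = A / A.sum(axis=0, keepdims=True)

alpha = 0.6                       # balance between network and prior evidence
score = prior.copy()
for _ in range(100):
    new = alpha * W @ score + (1 - alpha) * prior
    if np.max(np.abs(new - score)) < 1e-9:
        score = new
        break
    score = new

for g, s in sorted(zip(genes, score), key=lambda t: -t[1]):
    print(f"{g}: {s:.3f}")        # ranked candidate disease genes
```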

  1. INTEGRATIVE METHOD OF TEACHING INFORMATION MODELING IN PRACTICAL HEALTH SERVICE BASED ON MICROSOFT ACCESS QUERIES

    Directory of Open Access Journals (Sweden)

    Svetlana A. Firsova

    2016-06-01

    Full Text Available Introduction: this article explores the pedagogical technology employed to teach medical students the foundations of working with MICROSOFT ACCESS databases. The technology is based on an integrative approach to information modeling in public health practice, drawing upon basic didactic concepts that pertain to the objects and tools of databases created in MICROSOFT ACCESS. The article examines successive steps in teaching the topic “Queries in MICROSOFT ACCESS”, from simple queries to complex ones. The main attention is paid to such components of the methodological system as the principles and teaching methods, classified according to the degree of learners’ active cognitive activity. Of particular interest is the diagram of the relationships between learning principles, teaching methods, and specific types of queries. Materials and Methods: the authors used comparative analysis of the literature, syllabi, and curricula in medical informatics taught at leading medical universities in Russia. Results: an original technique for teaching the construction of queries against MICROSOFT ACCESS databases is presented for the analysis of information models in practical health care. Discussion and Conclusions: it is argued that the proposed pedagogical technology will significantly improve the effectiveness of teaching the course “Medical Informatics”, which includes the development and application of models to simulate the operation of certain facilities and services of the health system and, in turn, increases the level of information culture of practitioners.

  2. Protein complex detection in PPI networks based on data integration and supervised learning method.

    Science.gov (United States)

    Yu, Feng; Yang, Zhi; Hu, Xiao; Sun, Yuan; Lin, Hong; Wang, Jian

    2015-01-01

    Revealing protein complexes is important for understanding the principles of cellular organization and function. High-throughput experimental techniques have produced a large amount of protein interactions, which makes it possible to predict protein complexes from protein-protein interaction (PPI) networks. However, the small amount of known physical interactions may limit protein complex detection. New PPI networks are therefore constructed by integrating PPI datasets with the large and readily available PPI data from the biomedical literature, and the less reliable PPIs between two proteins are then filtered out based on the semantic similarity and topological similarity of the two proteins. Finally, supervised learning protein complex detection (SLPC), which can make full use of the information in available known complexes, is applied to detect protein complexes on the new PPI networks. The experimental results of SLPC on two different categories of yeast PPI networks demonstrate the effectiveness of the approach: compared with the original PPI networks, the best average improvements of 4.76, 6.81 and 15.75 percentage units in the F-score, accuracy and maximum matching ratio (MMR) are achieved, respectively; compared with the denoising PPI networks, the best average improvements of 3.91, 4.61 and 12.10 percentage units in the F-score, accuracy and MMR are achieved, respectively; compared with ClusterONE, the state-of-the-art complex detection method, on the denoising extended PPI networks, average improvements of 26.02 and 22.40 percentage units in the F-score and MMR are achieved, respectively. The experimental results show that the performance of SLPC improves substantially when new PPI data retrieved from the biomedical literature are integrated into the original and denoising PPI networks. In addition, our protein complex detection method achieves better performance than ClusterONE.

  3. Integral equation methods for electromagnetics

    CERN Document Server

    Volakis, John

    2012-01-01

    This text/reference is a detailed look at the development and use of integral equation methods for electromagnetic analysis, specifically for antennas and radar scattering. Developers and practitioners will appreciate the broad-based approach to understanding and utilizing integral equation methods and the unique coverage of historical developments that led to the current state-of-the-art. In contrast to existing books, Integral Equation Methods for Electromagnetics lays the groundwork in the initial chapters so students and basic users can solve simple problems and work their way up to the mo

  4. A comparison of sequential and information-based methods for determining the co-integration rank in heteroskedastic VAR MODELS

    DEFF Research Database (Denmark)

    Cavaliere, Giuseppe; Angelis, Luca De; Rahbek, Anders

    2015-01-01

    In this article, we investigate the behaviour of a number of methods for estimating the co-integration rank in VAR systems characterized by heteroskedastic innovation processes. In particular, we compare the efficacy of the most widely used information criteria, such as Akaike Information Criterion....... The relative finite-sample properties of the different methods are investigated by means of a Monte Carlo simulation study. For the simulation DGPs considered in the analysis, we find that the BIC-based procedure and the bootstrap sequential test procedure deliver the best overall performance in terms......-based method to over-estimate the co-integration rank in relatively small sample sizes....

  5. Depth extraction method with high accuracy in integral imaging based on moving array lenslet technique

    Science.gov (United States)

    Wang, Yao-yao; Zhang, Juan; Zhao, Xue-wei; Song, Li-pei; Zhang, Bo; Zhao, Xing

    2018-03-01

    In order to improve depth extraction accuracy, a method using a moving array lenslet technique (MALT) in the pickup stage is proposed, which can decrease the depth interval caused by pixelation. In this method, the lenslet array is moved along the horizontal and vertical directions simultaneously N times within a pitch to obtain N sets of elemental images. A computational integral imaging reconstruction method for MALT is used to obtain the slice images of the 3D scene, and the sum modulus (SMD) blur metric is applied to these slice images to obtain the depth information of the 3D scene. Simulation and optical experiments are carried out to verify the feasibility of this method.
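
    The depth-readout step, scoring each computationally reconstructed slice with the SMD sharpness metric and taking the best-focused slice as the object depth, can be sketched as follows. Synthetic blurred images stand in for the reconstructed slices, and SMD is implemented here as the common sum of absolute neighbour differences.

```python
# Sketch of the depth-readout step: evaluate an SMD-style sharpness metric on
# each reconstructed slice and take the sharpest slice as the estimated object
# depth.  Synthetic blurred copies of a textured scene stand in for the
# computational integral imaging reconstructions.
import numpy as np
from scipy.ndimage import gaussian_filter

def smd(img: np.ndarray) -> float:
    """Sum of absolute differences between neighbouring pixels (horizontal
    and vertical) -- largest for the slice that is in focus."""
    return float(np.abs(np.diff(img, axis=0)).sum() + np.abs(np.diff(img, axis=1)).sum())

rng = np.random.default_rng(0)
scene = rng.random((128, 128))              # richly textured test scene
depths = np.arange(60, 181, 5)              # candidate reconstruction depths [mm]
true_depth = 120                            # depth at which the scene is in focus

slices = []
for d in depths:
    sigma = abs(d - true_depth) / 10.0      # defocus blur grows away from focus
    img = gaussian_filter(scene, sigma) if sigma > 0 else scene.copy()
    slices.append(img + 0.005 * rng.standard_normal(scene.shape))  # sensor noise

scores = np.array([smd(s) for s in slices])
best = depths[np.argmax(scores)]
print(f"estimated depth = {best} mm (true focus depth = {true_depth} mm)")
```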

  6. Integrated analyzing method for the progress event based on subjects and predicates in events

    International Nuclear Information System (INIS)

    Minowa, Hirotsugu; Munesawa, Yoshiomi

    2014-01-01

    It is expected that knowledge extracted by analyzing past mistakes can be used to prevent the recurrence of accidents. Currently, the main style of analysis is one in which experts examine individual accident cases in depth, while cross-case analysis has gone no further than extracting the factors common to the accident cases. In this study we propose an integrated analysis method for progress events that analyzes across accidents. Our method integrates many accident cases by connecting the common keywords, called 'Subjects' and 'Predicates', that are extracted from each progress event in accident cases or near-miss cases. From the integrated accident data, the method can analyze and visualize partial risk identification, the frequency with which accidents are caused, and the risk assessment. Applying the method to the PEC-SAFER accident cases identified 8 hazardous factors that can again arise from tanks, and visualized the most frequent factors, damage of the tank (26%) followed by corrosion (21%), and the highest risks, damage (3.3 x 10^-2 [risk rank/year]) followed by destruction (2.5 x 10^-2 [risk rank/year]). (author)

  7. Electrolocation-based underwater obstacle avoidance using wide-field integration methods

    International Nuclear Information System (INIS)

    Dimble, Kedar D; Faddy, James M; Humbert, J Sean

    2014-01-01

    Weakly electric fish are capable of efficiently performing obstacle avoidance in dark and navigationally challenging aquatic environments using electrosensory information. This sensory modality enables extraction of relevant proximity information about surrounding obstacles by interpretation of perturbations induced to the fish’s self-generated electric field. In this paper, reflexive obstacle avoidance is demonstrated by extracting relative proximity information using spatial decompositions of the perturbation signal, also called an electric image. Electrostatics equations were formulated for mathematically expressing electric images due to a straight tunnel to the electric field generated with a planar electro-sensor model. These equations were further used to design a wide-field integration based static output feedback controller. The controller was implemented in quasi-static simulations for environments with complicated geometries modelled using finite element methods to demonstrate sense and avoid behaviours. The simulation results were confirmed by performing experiments using a computer operated gantry system in environments lined with either conductive or non-conductive objects acting as global stimuli to the field of the electro-sensor. The proposed approach is computationally inexpensive and readily implementable, making underwater autonomous navigation in real-time feasible. (paper)
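
    The following is only a conceptual sketch of the wide-field integration idea described in the abstract: the measured electric-image perturbation along the sensor body is projected onto a few spatial weighting functions, and the resulting low-dimensional outputs drive a static output-feedback steering command. The weighting functions, gains and signals below are hypothetical and are not taken from the paper.

```python
import numpy as np

# sensor locations along the body and a synthetic electric-image perturbation
theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
E = 0.3 * np.cos(theta) + 0.1 * np.sin(2 * theta)      # hypothetical perturbation signal

# wide-field integration: project the signal onto a few spatial weighting functions
weights = np.vstack([np.sin(theta), np.cos(theta), np.sin(2 * theta)])
y = weights @ E * (theta[1] - theta[0])                 # low-dimensional WFI outputs

# static output feedback: steering command is a fixed linear combination of the outputs
K = np.array([1.5, 0.0, 2.0])                           # hypothetical gains
u_steer = -K @ y
print(y, u_steer)
```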

  8. A Novel Evaluation Method for Building Construction Project Based on Integrated Information Entropy with Reliability Theory

    Directory of Open Access Journals (Sweden)

    Xiao-ping Bai

    2013-01-01

    Full Text Available Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction project. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.
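
    The abstract appeals to information entropy theory for weighting the evaluation indexes; below is a minimal sketch of the standard entropy-weight computation (the index matrix is illustrative, not the paper's data, and the formulation is not necessarily the authors' exact one).

```python
import numpy as np

def entropy_weights(X: np.ndarray) -> np.ndarray:
    """Standard entropy-weight computation for an m x n index matrix X
    (m schemes, n indexes); larger index values are assumed to be better."""
    P = X / X.sum(axis=0)                      # normalise each index column
    m = X.shape[0]
    logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(m)    # entropy of each index
    d = 1.0 - e                                # degree of diversification
    return d / d.sum()                         # entropy weights

# hypothetical example: 4 construction schemes scored on cost, progress, quality, safety
X = np.array([[0.7, 0.8, 0.9, 0.6],
              [0.9, 0.6, 0.7, 0.8],
              [0.6, 0.9, 0.8, 0.7],
              [0.8, 0.7, 0.6, 0.9]])
print(entropy_weights(X))
```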

  9. A novel evaluation method for building construction project based on integrated information entropy with reliability theory.

    Science.gov (United States)

    Bai, Xiao-ping; Zhang, Xi-wei

    2013-01-01

    Selecting construction schemes of the building engineering project is a complex multiobjective optimization decision process, in which many indexes need to be selected to find the optimum scheme. Aiming at this problem, this paper selects cost, progress, quality, and safety as the four first-order evaluation indexes, uses the quantitative method for the cost index, uses integrated qualitative and quantitative methodologies for progress, quality, and safety indexes, and integrates engineering economics, reliability theories, and information entropy theory to present a new evaluation method for building construction project. Combined with a practical case, this paper also presents detailed computing processes and steps, including selecting all order indexes, establishing the index matrix, computing score values of all order indexes, computing the synthesis score, sorting all selected schemes, and making analysis and decision. The presented method can offer valuable references for risk computation in building construction projects.

  10. Social network extraction based on Web: 3. the integrated superficial method

    Science.gov (United States)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although it relies only on the limited information disclosed by search engines, in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but also enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or produce surmise-laden social networks and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.

  11. Tensor Product Model Transformation Based Adaptive Integral-Sliding Mode Controller: Equivalent Control Method

    Directory of Open Access Journals (Sweden)

    Guoliang Zhao

    2013-01-01

    Full Text Available This paper proposes new methodologies for the design of adaptive integral-sliding mode control. A tensor product model transformation based adaptive integral-sliding mode control law with respect to uncertainties and perturbations is studied, while upper bounds on the perturbations and uncertainties are assumed to be unknown. The advantage of the proposed controllers consists in having a dynamical adaptive control gain that establishes a sliding mode right at the beginning of the process. The gain dynamics ensure a reasonable adaptive gain with respect to the uncertainties. Finally, the efficacy of the proposed controller is verified by simulations on an uncertain nonlinear system model.

  12. Integral-equation based methods for parameter estimation in output pulses of radiation detectors: Application in nuclear medicine and spectroscopy

    Science.gov (United States)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar

    2018-04-01

    Model based analysis methods are relatively new approaches for processing the output data of radiation detectors in nuclear medicine imaging and spectroscopy. A class of such methods requires fast algorithms for fitting pulse models to experimental data. In order to apply integral-equation based methods for processing the preamplifier output pulses, this article proposes a fast and simple method for estimating the parameters of the well-known bi-exponential pulse model by solving an integral equation. The proposed method needs samples from only three points of the recorded pulse as well as its first and second order integrals. After optimizing the sampling points, the estimation results were calculated and compared with two traditional integration-based methods. Different noise levels (signal-to-noise ratios from 10 to 3000) were simulated for testing the functionality of the proposed method, then it was applied to a set of experimental pulses. Finally, the effect of quantization noise was assessed by studying different sampling rates. Promising results by the proposed method endorse it for future real-time applications.
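
    For orientation only, the sketch below fits the bi-exponential pulse model mentioned in the abstract to a synthetic noisy pulse with a generic least-squares routine; the paper's fast integral-equation estimator, which uses three samples plus first- and second-order integrals, is not reproduced here, and all parameter values are made up.

```python
import numpy as np
from scipy.optimize import curve_fit

def biexp_pulse(t, A, tau_rise, tau_decay):
    """Bi-exponential preamplifier pulse model."""
    return A * (np.exp(-t / tau_decay) - np.exp(-t / tau_rise))

# synthetic noisy pulse (hypothetical amplitude and time constants)
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 500)
y = biexp_pulse(t, 1.0, 0.2, 2.0) + rng.normal(0.0, 0.01, t.size)

# generic least-squares fit -- a baseline, not the paper's integral-equation method
popt, _ = curve_fit(biexp_pulse, t, y, p0=[1.0, 0.1, 1.0])
print("A=%.3f tau_rise=%.3f tau_decay=%.3f" % tuple(popt))
```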

  13. Models, methods and software for distributed knowledge acquisition for the automated construction of integrated expert systems knowledge bases

    International Nuclear Information System (INIS)

    Dejneko, A.O.

    2011-01-01

    Based on an analysis of existing models, methods and means of acquiring knowledge, a base method of automated knowledge acquisition has been chosen. On the basis of this method, a new approach to integrating information acquired from knowledge sources of different typologies has been proposed, and the concept of distributed knowledge acquisition, with the aim of computerized formation of the most complete and consistent models of problem areas, has been introduced. An original algorithm for distributed knowledge acquisition from databases, based on the construction of binary decision trees, has been developed. [ru]

  14. Matrix based method for synthesis of main intensified and integrated distillation sequences

    International Nuclear Information System (INIS)

    Khalili-Garakani, Amirhossein; Kasiri, Norollah; Ivakpour, Javad

    2016-01-01

    The objective of many studies in this area has been a column-sequencing algorithm that enables designers and researchers alike to generate a wide range of sequences in a broad search space, and that is as mathematical and as automated as possible for programming purposes, with good generality. In the present work an algorithm previously developed by the authors, called the matrix method, has been developed much further. The new version of the algorithm includes thermally coupled, thermodynamically equivalent, intensified, simultaneous heat- and mass-integrated and divided-wall column sequences, which are of broad application and provide vast saving potential in capital investment, operating costs and energy usage in industrial applications. To demonstrate the much wider searchable space now accessible, a three-component separation has been thoroughly examined as a case study, always resulting in an integrated sequence being proposed as the optimum.

  15. BDS/GPS Integrated Positioning Method Research Based on Nonlinear Kalman Filtering

    Science.gov (United States)

    Ma, Y.; Yuan, W.; Sun, H.

    2017-09-01

    In order to realize fast and accurate BDS/GPS integrated positioning, it is necessary to overcome the adverse effects of signal attenuation, multipath effects and echo interference to ensure continuous and accurate navigation and positioning. In this paper, pseudo-range positioning is used as the mathematical model. In the data preprocessing stage, precise and smooth carrier-phase measurements are used to refine the rough pseudo-range measurements without requiring ambiguity resolution. Finally, the Extended Kalman Filter (EKF), the Unscented Kalman Filter (UKF) and the Particle Filter (PF) algorithms are applied in the integrated positioning method for higher positioning accuracy. The experimental results show that the positioning accuracy of PF is the highest, and UKF is better than EKF.

  16. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E.; Re, Matteo

    2014-01-01

    Objective: In the context of “network medicine”, gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. Materials and methods: We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. Results: The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different “informativeness” embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Conclusions: Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further
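
    As one concrete example of the guilt-by-association algorithms mentioned above, here is a minimal random-walk-with-restart sketch on a single (or pre-combined) gene network; the adjacency matrix, seed genes and restart probability are illustrative and do not come from the paper.

```python
import numpy as np

def random_walk_with_restart(W, seed_idx, restart=0.7, tol=1e-8, max_iter=1000):
    """Random walk with restart on an adjacency matrix W (n x n).
    Returns a steady-state proximity score for every node."""
    W = np.asarray(W, dtype=float)
    col_sums = W.sum(axis=0)
    col_sums[col_sums == 0] = 1.0
    T = W / col_sums                          # column-normalised transition matrix
    p0 = np.zeros(W.shape[0])
    p0[list(seed_idx)] = 1.0 / len(seed_idx)  # known disease genes as restart seeds
    p = p0.copy()
    for _ in range(max_iter):
        p_new = (1.0 - restart) * T @ p + restart * p0
        if np.abs(p_new - p).sum() < tol:
            break
        p = p_new
    return p

# toy example: 4-gene network, gene 0 is a known disease gene
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
print(random_walk_with_restart(W, seed_idx=[0]))
```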

  17. An Integrated DEMATEL-VIKOR Method-Based Approach for Cotton Fibre Selection and Evaluation

    Science.gov (United States)

    Chakraborty, Shankar; Chatterjee, Prasenjit; Prasad, Kanika

    2018-01-01

    Selection of the most appropriate cotton fibre type for yarn manufacturing is often treated as a multi-criteria decision-making (MCDM) problem as the optimal selection decision needs to be taken in presence of several conflicting fibre properties. In this paper, two popular MCDM methods in the form of decision making trial and evaluation laboratory (DEMATEL) and VIse Kriterijumska Optimizacija kompromisno Resenje (VIKOR) are integrated to aid the cotton fibre selection decision. DEMATEL method addresses the interrelationships between various physical properties of cotton fibres while segregating them into cause and effect groups, whereas, VIKOR method helps in ranking all the considered 17 cotton fibres from the best to the worst. The derived ranking of cotton fibre alternatives closely matches with that obtained by the past researchers. This model can assist the spinning industry personnel in the blending process while making accurate fibre selection decision when cotton fibre properties are numerous and interrelated.
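
    A minimal sketch of the DEMATEL step described above (the VIKOR ranking is omitted): a direct-influence matrix is normalised, the total-relation matrix is computed, and criteria are split into cause and effect groups. The 4 x 4 influence matrix is hypothetical, not the fibre-property data of the paper.

```python
import numpy as np

# hypothetical direct-influence matrix among four criteria
D = np.array([[0, 3, 2, 1],
              [2, 0, 3, 2],
              [1, 2, 0, 3],
              [1, 1, 2, 0]], dtype=float)

X = D / D.sum(axis=1).max()                    # normalised direct-influence matrix
T = X @ np.linalg.inv(np.eye(4) - X)           # total-relation matrix T = X (I - X)^(-1)

r, c = T.sum(axis=1), T.sum(axis=0)            # dispatched and received influence
prominence, relation = r + c, r - c            # relation > 0 -> cause group, < 0 -> effect group
for i, (p, rel) in enumerate(zip(prominence, relation)):
    group = "cause" if rel > 0 else "effect"
    print(f"criterion {i}: prominence={p:.2f}, relation={rel:+.2f} ({group})")
```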

  18. MEMS-based fuel cells with integrated catalytic fuel processor and method thereof

    Science.gov (United States)

    Jankowski, Alan F [Livermore, CA; Morse, Jeffrey D [Martinez, CA; Upadhye, Ravindra S [Pleasanton, CA; Havstad, Mark A [Davis, CA

    2011-08-09

    Described herein is a means to incorporate catalytic materials into the fuel flow field structures of MEMS-based fuel cells, which enable catalytic reforming of a hydrocarbon based fuel, such as methane, methanol, or butane. Methods of fabrication are also disclosed.

  19. Wing aeroelasticity analysis based on an integral boundary-layer method coupled with Euler solver

    Directory of Open Access Journals (Sweden)

    Ma Yanfeng

    2016-10-01

    Full Text Available An interactive boundary-layer method, which solves the unsteady flow, is developed for aeroelastic computation in the time domain. The coupled method combines the Euler solver with the integral boundary-layer solver (Euler/BL) in a “semi-inverse” manner to compute flows with the inviscid and viscous interaction. Unsteady boundary conditions on moving surfaces are taken into account by utilizing the approximate small-perturbation method without moving the computational grids. The steady and unsteady flow calculations for the LANN wing are presented. The wing tip displacement of the high Reynolds number aero-structural dynamics (HIRENASD) project is simulated under different angles of attack. The flutter-boundary predictions for the AGARD 445.6 wing are provided. The results of the interactive boundary-layer method are compared with those of the Euler method and experimental data. The study shows that viscous effects are significant for these cases and the further data analysis confirms the validity and practicability of the coupled method.

  20. GIS-Based Integration of Subjective and Objective Weighting Methods for Regional Landslides Susceptibility Mapping

    Directory of Open Access Journals (Sweden)

    Suhua Zhou

    2016-04-01

    Full Text Available The development of landslide susceptibility maps is of great importance due to rapid urbanization. The purpose of this study is to present a method to integrate the subjective weight with objective weight for regional landslide susceptibility mapping on the geographical information system (GIS platform. The analytical hierarchy process (AHP, which is subjective, was employed to weight predictive factors’ contribution to landslide occurrence. The frequency ratio (FR method, which is objective, was used to derive subclasses’ frequency ratio with respect to landslides that indicate the relative importance of a subclass within each predictive factor. A case study was carried out at Tsushima Island, Japan, using a historical inventory of 534 landslides and seven predictive factors: elevation, slope, aspect, terrain roughness index (TRI, lithology, land cover and mean annual precipitation (MAP. The landslide susceptibility index (LSI was calculated using the weighted linear combination of factors’ weights and subclasses’ weights. The study area was classified into five susceptibility zones according to the LSI. In addition, the produced susceptibility map was compared with maps generated using the conventional FR and AHP method and validated using the relative landslide index (RLI. The validation result showed that the proposed method performed better than the conventional application of the FR method and AHP method. The obtained landslide susceptibility maps could serve as a scientific basis for urban planning and landslide hazard management.
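
    The weighted linear combination behind the landslide susceptibility index (LSI) can be sketched as below, assuming AHP weights for the predictive factors and per-cell frequency-ratio rasters are already available; all factor names and numbers are illustrative.

```python
import numpy as np

# AHP-derived factor weights (hypothetical values for illustration)
factor_weights = {"slope": 0.30, "lithology": 0.25, "land_cover": 0.20,
                  "elevation": 0.15, "precipitation": 0.10}

# frequency-ratio value of the subclass observed in each raster cell,
# one raster per predictive factor (here tiny 2 x 2 rasters)
fr_rasters = {
    "slope":         np.array([[1.8, 0.6], [1.2, 0.9]]),
    "lithology":     np.array([[1.1, 1.4], [0.7, 0.5]]),
    "land_cover":    np.array([[0.9, 1.3], [1.0, 0.8]]),
    "elevation":     np.array([[1.5, 0.4], [1.1, 1.0]]),
    "precipitation": np.array([[1.2, 1.2], [0.6, 1.4]]),
}

# landslide susceptibility index: weighted linear combination of FR values
lsi = sum(w * fr_rasters[f] for f, w in factor_weights.items())
print(lsi)
```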

  1. An extensive analysis of disease-gene associations using network integration and fast kernel-based gene prioritization methods.

    Science.gov (United States)

    Valentini, Giorgio; Paccanaro, Alberto; Caniza, Horacio; Romero, Alfonso E; Re, Matteo

    2014-06-01

    In the context of "network medicine", gene prioritization methods represent one of the main tools to discover candidate disease genes by exploiting the large amount of data covering different types of functional relationships between genes. Several works proposed to integrate multiple sources of data to improve disease gene prioritization, but to our knowledge no systematic studies focused on the quantitative evaluation of the impact of network integration on gene prioritization. In this paper, we aim at providing an extensive analysis of gene-disease associations not limited to genetic disorders, and a systematic comparison of different network integration methods for gene prioritization. We collected nine different functional networks representing different functional relationships between genes, and we combined them through both unweighted and weighted network integration methods. We then prioritized genes with respect to each of the considered 708 medical subject headings (MeSH) diseases by applying classical guilt-by-association, random walk and random walk with restart algorithms, and the recently proposed kernelized score functions. The results obtained with classical random walk algorithms and the best single network achieved an average area under the curve (AUC) across the 708 MeSH diseases of about 0.82, while kernelized score functions and network integration boosted the average AUC to about 0.89. Weighted integration, by exploiting the different "informativeness" embedded in different functional networks, outperforms unweighted integration at 0.01 significance level, according to the Wilcoxon signed rank sum test. For each MeSH disease we provide the top-ranked unannotated candidate genes, available for further bio-medical investigation. Network integration is necessary to boost the performances of gene prioritization methods. Moreover the methods based on kernelized score functions can further enhance disease gene ranking results, by adopting both

  2. Primary exploration of the application of case based learning method in clinical probation teaching of the integrated curriculum of hematology

    Institute of Scientific and Technical Information of China (English)

    Zi-zhen XU; Ye-fei WANG; Yan WANG; Shu CHENG; Yi-qun HU; Lei DING

    2015-01-01

    Objective: To explore the application and the effect of the case-based learning (CBL) method in clinical probation teaching of the integrated curriculum of hematology among eight-year-program medical students. Methods: The CBL method was applied to the experimental group, and the traditional approach to the control group. After the lecture, a questionnaire survey was conducted to evaluate the teaching effect in the two groups. Results: The CBL method efficiently increased the students' interest in learning and autonomous learning ability, enhanced their ability to solve clinical problems with basic theoretical knowledge and cultivated their clinical thinking ability. Conclusion: The CBL method can improve the quality of clinical probation teaching of the integrated curriculum of hematology among eight-year-program medical students.

  3. Secure method for biometric-based recognition with integrated cryptographic functions.

    Science.gov (United States)

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  4. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    Directory of Open Access Journals (Sweden)

    Shin-Yan Chiou

    2013-01-01

    Full Text Available Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems need not achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to combine cryptography and biometrics, but these methods require the synchronization of internal systems and are vulnerable to power analysis attacks, fault-based cryptanalysis, and replay attacks. This paper presents a new secure cryptographic authentication method using biometric features. The proposed system combines the advantages of biometric identification and cryptographic techniques. By adding a subsystem to existing biometric recognition systems, we can simultaneously achieve the security of cryptographic technology and the error tolerance of biometric recognition. This method can be used for biometric data encryption, signatures, and other types of cryptographic computation. The method offers a high degree of security with protection against power analysis attacks, fault-based cryptanalysis, and replay attacks. Moreover, it can be used to improve the confidentiality of biological data storage and biodata identification processes. Remote biometric authentication can also be safely applied.

  5. A Novel Ship Detection Method Based on Gradient and Integral Feature for Single-Polarization Synthetic Aperture Radar Imagery

    Directory of Open Access Journals (Sweden)

    Hao Shi

    2018-02-01

    Full Text Available With the rapid development of remote sensing technologies, SAR satellites like China’s Gaofen-3 satellite have more imaging modes and higher resolution. With the availability of high-resolution SAR images, automatic ship target detection has become an important topic in maritime research. In this paper, a novel ship detection method based on gradient and integral features is proposed. This method is mainly composed of three steps. First, in the preprocessing step, a filter is employed to smooth the clutter, and the smoothing effect can be adaptively adjusted according to the statistical information of the sub-window. Thus, it can retain details while achieving noise suppression. Second, in the candidate area extraction step, a sea-land segmentation method based on gradient enhancement is presented. The integral image method is employed to accelerate computation. Finally, in the ship target identification step, a feature extraction strategy based on Haar-like gradient information and a Radon transform is proposed. This strategy decreases the number of templates found in traditional Haar-like methods. Experiments were performed using Gaofen-3 single-polarization SAR images, and the results showed that the proposed method has high detection accuracy and rapid computational efficiency. In addition, this method has the potential for on-board processing.
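
    The integral-image (summed-area table) trick used to accelerate the computation can be sketched as follows; the sum over any rectangular window is then obtained with four look-ups. This illustrates the generic technique only, not the paper's full detection pipeline.

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    """Summed-area table with a zero top row/left column for easy indexing."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii: np.ndarray, r0: int, c0: int, r1: int, c1: int) -> float:
    """Sum of img[r0:r1, c0:c1] in O(1) using four look-ups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
assert box_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()
print(box_sum(ii, 1, 1, 3, 3))
```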

  6. Adaptive method for multi-dimensional integration and selection of a base of chaos polynomials

    International Nuclear Information System (INIS)

    Crestaux, T.

    2011-01-01

    This research thesis addresses the propagation of uncertainty in numerical simulations and its treatment within a probabilistic framework by a functional approach based on functions of random variables. The author reports the use of the spectral method to represent random variables by an expansion in polynomial chaos. More precisely, the author uses the method of non-intrusive projection, which uses the orthogonality of chaos polynomials to compute the expansion coefficients by approximation of scalar products. The approach is applied to a cavity and to waste storage. [fr]
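
    A minimal sketch of non-intrusive projection onto a polynomial chaos basis for a single standard-normal input: the expansion coefficients are approximated with Gauss-Hermite quadrature, using the orthogonality of the (probabilists') Hermite polynomials. The model function is a toy example, not the cavity or waste-storage applications of the thesis.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermegauss, hermeval
from math import factorial, sqrt, pi

def pce_coefficients(f, order, n_quad=20):
    """Non-intrusive projection of f(X), X ~ N(0,1), onto probabilists'
    Hermite polynomials: c_k = E[f(X) He_k(X)] / k!."""
    x, w = hermegauss(n_quad)          # nodes/weights for the weight exp(-x^2/2)
    w = w / sqrt(2.0 * pi)             # normalise so the weights integrate the N(0,1) density
    coeffs = []
    for k in range(order + 1):
        basis = hermeval(x, [0] * k + [1])      # He_k evaluated at the quadrature nodes
        coeffs.append(np.sum(w * f(x) * basis) / factorial(k))
    return np.array(coeffs)

# toy model: f(x) = x^2 has exact expansion He_0 + He_2 -> coefficients [1, 0, 1, 0, 0]
print(np.round(pce_coefficients(lambda x: x**2, order=4), 6))
```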

  7. Improvement of Bioactive Compound Classification through Integration of Orthogonal Cell-Based Biosensing Methods

    Directory of Open Access Journals (Sweden)

    Goran N. Jovanovic

    2007-01-01

    Full Text Available Lack of specificity for different classes of chemical and biological agents, and false positives and negatives, can limit the range of applications for cell-based biosensors. This study suggests that the integration of results from algal cells (Mesotaenium caldariorum and fish chromatophores (Betta splendens improves classification efficiency and detection reliability. Cells were challenged with paraquat, mercuric chloride, sodium arsenite and clonidine. The two detection systems were independently investigated for classification of the toxin set by performing discriminant analysis. The algal system correctly classified 72% of the bioactive compounds, whereas the fish chromatophore system correctly classified 68%. The combined classification efficiency was 95%. The algal sensor readout is based on fluorescence measurements of changes in the energy producing pathways of photosynthetic cells, whereas the response from fish chromatophores was quantified using optical density. Change in optical density reflects interference with the functioning of cellular signal transduction networks. Thus, algal cells and fish chromatophores respond to the challenge agents through sufficiently different mechanisms of action to be considered orthogonal.

  8. Integral methods in low-frequency electromagnetics

    CERN Document Server

    Solin, Pavel; Karban, Pavel; Ulrych, Bohus

    2009-01-01

    A modern presentation of integral methods in low-frequency electromagnetics. This book provides state-of-the-art knowledge on integral methods in low-frequency electromagnetics. Blending theory with numerous examples, it introduces key aspects of the integral methods used in engineering as a powerful alternative to PDE-based models. Readers will get complete coverage of: the electromagnetic field and its basic characteristics; an overview of solution methods; solutions of electromagnetic fields by integral expressions; and integral and integrodifferential methods.

  9. Statistical Methods in Integrative Genomics

    Science.gov (United States)

    Richardson, Sylvia; Tseng, George C.; Sun, Wei

    2016-01-01

    Statistical methods in integrative genomics aim to answer important biology questions by jointly analyzing multiple types of genomic data (vertical integration) or aggregating the same type of data across multiple studies (horizontal integration). In this article, we introduce different types of genomic data and data resources, and then review statistical methods of integrative genomics, with emphasis on the motivation and rationale of these methods. We conclude with some summary points and future research directions. PMID:27482531

  10. Integrated Building Energy Design of a Danish Office Building Based on Monte Carlo Simulation Method

    DEFF Research Database (Denmark)

    Sørensen, Mathias Juul; Myhre, Sindre Hammer; Hansen, Kasper Kingo

    2017-01-01

    The focus on reducing buildings' energy consumption is gradually increasing, and the optimization of a building’s performance and maximizing its potential leads to great challenges between architects and engineers. In this study, we collaborate with a group of architects on a design project of a new […] office building located in Aarhus, Denmark. Building geometry, floor plans and employee schedules were obtained from the architects, which is the basis for this study. This study aims to simplify the iterative design process that is based on the traditional trial and error method in the late design phases […]

  11. Diverse methods for integrable models

    NARCIS (Netherlands)

    Fehér, G.

    2017-01-01

    This thesis is centered around three topics, sharing integrability as a common theme. This thesis explores different methods in the field of integrable models. The first two chapters are about integrable lattice models in statistical physics. The last chapter describes an integrable quantum chain.

  12. Nature-based integration

    DEFF Research Database (Denmark)

    Pitkänen, Kati; Oratuomi, Joose; Hellgren, Daniela

    Increased attention to, and careful planning of, the integration of migrants into Nordic societies is ever more important. Nature-based integration is a new solution to respond to this need. This report presents the results of a Nordic survey and workshop and illustrates current practices of nature-based integration by case study descriptions from Denmark, Sweden, Norway and Finland. Across Nordic countries several practical projects and initiatives have been launched to promote the benefits of nature in integration and there is also growing academic interest in the topic. Nordic countries have […] the potential of becoming real forerunners in nature-based integration even at the global scale.

  13. A modified precise integration method based on Magnus expansion for transient response analysis of time varying dynamical structure

    International Nuclear Information System (INIS)

    Yue, Cong; Ren, Xingmin; Yang, Yongfeng; Deng, Wangqun

    2016-01-01

    This paper provides a precise and efficacious methodology for obtaining the forced vibration response of a time-variant linear rotational structure subjected to unbalanced excitation. A modified algorithm based on the time-step precise integration method and the Magnus expansion is developed for instantaneous dynamic problems. The iterative solution is achieved using transition and dimensional-increment matrices. Numerical examples on a typical accelerating rotation system, considering the gyroscopic moment and mass unbalance force, demonstrate the validity, effectiveness and accuracy of the approach in comparison with the Newmark-β method. It is shown that the proposed algorithm has high accuracy without loss of efficiency.

  14. Bending analysis of embedded nanoplates based on the integral formulation of Eringen's nonlocal theory using the finite element method

    Science.gov (United States)

    Ansari, R.; Torabi, J.; Norouzzadeh, A.

    2018-04-01

    Due to the capability of Eringen's nonlocal elasticity theory to capture the small length scale effect, it is widely used to study the mechanical behaviors of nanostructures. Previous studies have indicated that in some cases, the differential form of this theory cannot correctly predict the behavior of structure, and the integral form should be employed to avoid obtaining inconsistent results. The present study deals with the bending analysis of nanoplates resting on elastic foundation based on the integral formulation of Eringen's nonlocal theory. Since the formulation is presented in a general form, arbitrary kernel functions can be used. The first order shear deformation plate theory is considered to model the nanoplates, and the governing equations for both integral and differential forms are presented. Finally, the finite element method is applied to solve the problem. Selected results are given to investigate the effects of elastic foundation and to compare the predictions of integral nonlocal model with those of its differential nonlocal and local counterparts. It is found that by the use of proposed integral formulation of Eringen's nonlocal model, the paradox observed for the cantilever nanoplate is resolved.
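
    For intuition, the integral (nonlocal) constitutive relation at the heart of the formulation can be illustrated in one dimension as a convolution of the local field with an attenuation kernel; the exponential kernel, the local stress field and the discretisation below are assumptions for illustration only, while the paper itself treats plates with FEM.

```python
import numpy as np

# 1-D illustration of the integral nonlocal relation:
#   sigma_nl(x) = \int alpha(|x - x'|, ell) * sigma_l(x') dx'
# with a common exponential kernel (assumed for illustration)
L, n, ell = 1.0, 401, 0.05
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
sigma_local = np.sin(np.pi * x)                       # some hypothetical local stress field

kernel = lambda r: np.exp(-np.abs(r) / ell) / (2.0 * ell)
A = kernel(x[:, None] - x[None, :]) * dx              # discretised integral operator
sigma_nonlocal = A @ sigma_local                      # nonlocal field = smoothed local field
print(sigma_nonlocal[:5])
```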

  15. A clustering based method to evaluate soil corrosivity for pipeline external integrity management

    International Nuclear Information System (INIS)

    Yajima, Ayako; Wang, Hui; Liang, Robert Y.; Castaneda, Homero

    2015-01-01

    One important category of transportation infrastructure is underground pipelines. Corrosion of these buried pipeline systems may cause pipeline failures with the attendant hazards of property loss and fatalities. Therefore, developing the capability to estimate the soil corrosivity is important for designing and preserving materials and for risk assessment. The deterioration rate of metal is highly influenced by the physicochemical characteristics of a material and the environment of its surroundings. In this study, the field data obtained from the southeast region of Mexico was examined using various data mining techniques to determine the usefulness of these techniques for clustering soil corrosivity level. Specifically, the soil was classified into different corrosivity level clusters by k-means and Gaussian mixture model (GMM). In terms of physical space, GMM shows better separability; therefore, the distributions of the material loss of the buried petroleum pipeline walls were estimated via the empirical density within GMM clusters. The soil corrosivity levels of the clusters were determined based on the medians of metal loss. The proposed clustering method was demonstrated to be capable of classifying the soil into different levels of corrosivity severity. - Highlights: • The clustering approach is applied to the data extracted from a real-life pipeline system. • Soil properties in the right-of-way are analyzed via clustering techniques to assess corrosivity. • GMM is selected as the preferred method for detecting the hidden pattern of in-situ data. • K–W test is performed for significant difference of corrosivity level between clusters
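
    A minimal sketch of the clustering step, assuming a table of soil measurements and the corresponding pipeline wall loss are available (the features and numbers below are synthetic, not the Mexican field data): fit a Gaussian mixture model and rank the clusters by the median metal loss of their members.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# hypothetical soil measurements: [resistivity (ohm-m), pH, moisture (%)]
rng = np.random.default_rng(0)
soil = np.vstack([rng.normal([10, 5.5, 30], [3, 0.3, 5], (50, 3)),
                  rng.normal([60, 7.0, 12], [10, 0.4, 4], (50, 3))])
metal_loss = np.concatenate([rng.normal(0.8, 0.2, 50),    # mm, hypothetical wall loss
                             rng.normal(0.2, 0.1, 50)])

gmm = GaussianMixture(n_components=2, random_state=0).fit(soil)
labels = gmm.predict(soil)

# rank clusters by the median wall loss of their members -> corrosivity level
for c in range(2):
    print(f"cluster {c}: median metal loss = {np.median(metal_loss[labels == c]):.2f} mm")
```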

  16. A METEOROLOGICAL RISK ASSESSMENT METHOD FOR POWER LINES BASED ON GIS AND MULTI-SENSOR INTEGRATION

    Directory of Open Access Journals (Sweden)

    Z. Lin

    2016-06-01

    Full Text Available Power lines, exposed in the natural environment, are vulnerable to various kinds of meteorological factors. Traditional research mainly deals with the influence of a single meteorological condition on the power line and lacks a comprehensive evaluation and analysis of the combined effects of multiple meteorological factors. In this paper, we use multiple meteorological monitoring data obtained by multi-sensors to implement meteorological risk assessment and early warning for power lines. Firstly, we generate meteorological raster maps from discrete meteorological monitoring data using spatial interpolation. Secondly, the expert-scoring-based analytic hierarchy process is used to compute the power line risk index for all kinds of meteorological conditions and to establish the mathematical model of meteorological risk. By adopting this model in the raster calculator of ArcGIS, we obtain a raster map showing the overall meteorological risk for the power line. Finally, by overlaying the power line buffer layer on that raster map, we obtain the exact risk index around a certain part of the power line, which provides significant guidance for power line risk management. In the experiment, based on five kinds of observation data gathered from meteorological stations in Guizhou Province of China, including wind, lightning, rain, ice and temperature, we carry out the meteorological risk analysis for real power lines, and the experimental results prove the feasibility and validity of the proposed method.

  17. Integrating Inquiry-Based Science and Education Methods Courses in a "Science Semester" for Future Elementary Teachers

    Science.gov (United States)

    Madsen, J.; Fifield, S.; Allen, D.; Brickhouse, N.; Dagher, Z.; Ford, D.; Shipman, H.

    2001-05-01

    In this NSF-funded project we will adapt problem-based learning (PBL) and other inquiry-based approaches to create an integrated science and education methods curriculum ("science semester") for elementary teacher education majors. Our goal is to foster integrated understandings of science and pedagogy that future elementary teachers need to effectively use inquiry-based approaches in their classrooms. This project responds to calls to improve science education for all students by making preservice teachers' experiences in undergraduate science courses more consistent with reforms at the K-12 level. The involved faculty teach three science courses (biology, earth science, physical science) and an elementary science education methods course that are degree requirements for elementary teacher education majors. Presently, students take the courses in variable sequences and at widely scattered times. Too many students fail to appreciate the value of science courses to their future careers as teachers, and when they reach the methods course in the junior year they often retain little of the science content studied earlier. These episodic encounters with science make it difficult for students to learn the content, and to translate their understandings of science into effective, inquiry-based teaching strategies. To encourage integrated understandings of science concepts and pedagogy we will coordinate the science and methods courses in a junior-year science semester. Traditional subject matter boundaries will be crossed to stress shared themes that teachers must understand to teach standards-based elementary science. We will adapt exemplary approaches that support both learning science and learning how to teach science. Students will work collaboratively on multidisciplinary PBL activities that place science concepts in authentic contexts and build learning skills. "Lecture" meetings will be large group active learning sessions that help students understand difficult

  18. A new essential protein discovery method based on the integration of protein-protein interaction and gene expression data

    Directory of Open Access Journals (Sweden)

    Li Min

    2012-03-01

    Full Text Available Background: Identification of essential proteins is always a challenging task since it requires experimental approaches that are time-consuming and laborious. With the advances in high-throughput technologies, a large number of protein-protein interactions are available, which have produced unprecedented opportunities for detecting proteins' essentialities at the network level. There have been a series of computational approaches proposed for predicting essential proteins based on network topologies. However, the network topology-based centrality measures are very sensitive to the robustness of the network. Therefore, a new robust essential protein discovery method would be of great value. Results: In this paper, we propose a new centrality measure, named PeC, based on the integration of protein-protein interaction and gene expression data. The performance of PeC is validated based on the protein-protein interaction network of Saccharomyces cerevisiae. The experimental results show that the predicted precision of PeC clearly exceeds that of the other fifteen previously proposed centrality measures: Degree Centrality (DC), Betweenness Centrality (BC), Closeness Centrality (CC), Subgraph Centrality (SC), Eigenvector Centrality (EC), Information Centrality (IC), Bottle Neck (BN), Density of Maximum Neighborhood Component (DMNC), Local Average Connectivity-based method (LAC), Sum of ECC (SoECC), Range-Limited Centrality (RL), L-index (LI), Leader Rank (LR), Normalized α-Centrality (NC), and Moduland-Centrality (MC). Especially, the improvement of PeC over the classic centrality measures (BC, CC, SC, EC, and BN) is more than 50% when predicting no more than 500 proteins. Conclusions: We demonstrate that the integration of the protein-protein interaction network and gene expression data can help improve the precision of predicting essential proteins. The new centrality measure, PeC, is an effective essential protein discovery method.
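
    A sketch of a PeC-style score following the description above: for each protein, the contributions of its edges are summed, each edge weighted by an edge clustering coefficient (topology) times the Pearson correlation of the two proteins' expression profiles. The exact normalisation and the handling of negative correlations are assumptions, and the toy network and expression data are made up.

```python
import numpy as np
import networkx as nx

def pec_scores(G: nx.Graph, expr: dict) -> dict:
    """PeC-style centrality: sum over each node's edges of
    (edge clustering coefficient) x (expression Pearson correlation)."""
    scores = {}
    for u in G:
        s = 0.0
        for v in G[u]:
            common = len(list(nx.common_neighbors(G, u, v)))      # triangles on edge (u, v)
            denom = min(G.degree[u] - 1, G.degree[v] - 1)
            ecc = common / denom if denom > 0 else 0.0
            pcc = np.corrcoef(expr[u], expr[v])[0, 1]
            s += ecc * pcc       # treatment of negative correlations is an assumption
        scores[u] = s
    return scores

# toy example with 4 proteins and 5 expression samples each (hypothetical data)
G = nx.Graph([("a", "b"), ("a", "c"), ("b", "c"), ("c", "d")])
rng = np.random.default_rng(1)
expr = {p: rng.random(5) for p in G}
print(pec_scores(G, expr))
```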

  19. Systematic Evaluation of Methods for Integration of Transcriptomic Data into Constraint-Based Models of Metabolism

    DEFF Research Database (Denmark)

    Machado, Daniel; Herrgard, Markus

    2014-01-01

    […] of these methods has not been critically evaluated and compared. This work presents a survey of recently published methods that use transcript levels to try to improve metabolic flux predictions either by generating flux distributions or by creating context-specific models. A subset of these methods […] is then systematically evaluated using published data from three different case studies in E. coli and S. cerevisiae. The flux predictions made by different methods using transcriptomic data are compared against experimentally determined extracellular and intracellular fluxes (from 13C-labeling data). The sensitivity […] of the results to method-specific parameters is also evaluated, as well as their robustness to noise in the data. The results show that none of the methods outperforms the others for all cases. Also, it is observed that for many conditions, the predictions obtained by simple flux balance analysis using growth […]

  20. Creating a memory of causal relationships an integration of empirical and explanation-based learning methods

    CERN Document Server

    Pazzani, Michael J

    2014-01-01

    This book presents a theory of learning new causal relationships by making use of perceived regularities in the environment, general knowledge of causality, and existing causal knowledge. Integrating ideas from the psychology of causation and machine learning, the author introduces a new learning procedure called theory-driven learning that uses abstract knowledge of causality to guide the induction process. Known as OCCAM, the system uses theory-driven learning when new experiences conform to common patterns of causal relationships, empirical learning to learn from novel experiences, and expl

  1. Automatic numerical integration methods for Feynman integrals through 3-loop

    International Nuclear Information System (INIS)

    De Doncker, E; Olagbemi, O; Yuasa, F; Ishikawa, T; Kato, K

    2015-01-01

    We give numerical integration results for Feynman loop diagrams through 3-loop such as those covered by Laporta [1]. The methods are based on automatic adaptive integration, using iterated integration and extrapolation with programs from the QUADPACK package, or multivariate techniques from the ParInt package. The Dqags algorithm from QUADPACK accommodates boundary singularities of fairly general types. ParInt is a package for multivariate integration layered over MPI (Message Passing Interface), which runs on clusters and incorporates advanced parallel/distributed techniques such as load balancing among processes that may be distributed over a network of nodes. Results are included for 3-loop self-energy diagrams without IR (infra-red) or UV (ultra-violet) singularities. A procedure based on iterated integration and extrapolation yields a novel method of numerical regularization for integrals with UV terms, and is applied to a set of 2-loop self-energy diagrams with UV singularities. (paper)
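
    To illustrate the iterated-integration idea only (not the QUADPACK/ParInt machinery or any diagram from the paper), the sketch below evaluates a schematic two-dimensional integral over a Feynman-parameter simplex by nesting one-dimensional adaptive quadratures; the integrand and parameter values are made up.

```python
from scipy import integrate

# schematic integral over the simplex x >= 0, y >= 0, x + y <= 1
# with a nonsingular denominator (illustrative only)
m2, s = 1.0, -1.0

def integrand(y, x):
    return 1.0 / (m2 - s * x * y) ** 2

def inner(x):
    # adaptive 1-D quadrature over y for fixed x (the iterated-integration step)
    val, _ = integrate.quad(integrand, 0.0, 1.0 - x, args=(x,))
    return val

value, abserr = integrate.quad(inner, 0.0, 1.0)
print(value, abserr)
```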

  2. Using high-order polynomial basis in 3-D EM forward modeling based on volume integral equation method

    Science.gov (United States)

    Kruglyakov, Mikhail; Kuvshinov, Alexey

    2018-05-01

    3-D interpretation of electromagnetic (EM) data of different origin and scale becomes a common practice worldwide. However, 3-D EM numerical simulations (modeling)—a key part of any 3-D EM data analysis—with realistic levels of complexity, accuracy and spatial detail still remains challenging from the computational point of view. We present a novel, efficient 3-D numerical solver based on a volume integral equation (IE) method. The efficiency is achieved by using a high-order polynomial (HOP) basis instead of the zero-order (piecewise constant) basis that is invoked in all routinely used IE-based solvers. We demonstrate that usage of the HOP basis allows us to decrease substantially the number of unknowns (preserving the same accuracy), with corresponding speed increase and memory saving.

  3. Secure Method for Biometric-Based Recognition with Integrated Cryptographic Functions

    OpenAIRE

    Chiou, Shin-Yan

    2013-01-01

    Biometric systems refer to biometric technologies which can be used to achieve authentication. Unlike cryptography-based technologies, the ratio for certification in biometric systems needs not to achieve 100% accuracy. However, biometric data can only be directly compared through proximal access to the scanning device and cannot be combined with cryptographic techniques. Moreover, repeated use, improper storage, or transmission leaks may compromise security. Prior studies have attempted to c...

  4. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    The increased level of complexity in almost every discipline and operation today raises the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering in order to represent concepts or reality, remains an excellent way of converging knowledge from domain experts. The nuclear verification domain represents ever more a matter of great importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections. The work presented also contributes to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles. This fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  5. A chlorophyll fluorescence-based method for the integrated characterization of the photophysiological response to light stress.

    Science.gov (United States)

    Serôdio, João; Schmidt, William; Frankenbach, Silja

    2017-02-01

    This work introduces a new experimental method for the comprehensive description of the physiological responses to light of photosynthetic organisms. It allows the integration in a single experiment of the main established manipulative chlorophyll fluorescence-based protocols. It enables the integrated characterization of the photophysiology of samples regarding photoacclimation state (generating non-sequential light-response curves of effective PSII quantum yield, electron transport rate or non-photochemical quenching), photoprotection capacity (running light stress-recovery experiments, quantifying non-photochemical quenching components) and the operation of photoinactivation and photorepair processes (measuring rate constants of photoinactivation and repair for different light levels and the relative quantum yield of photoinactivation). The new method is based on a previously introduced technique, combining the illumination of a set of replicated samples with spatially separated actinic light beams of different intensity, and the simultaneous measurement of the fluorescence emitted by all samples using an imaging fluorometer. The main novelty described here is the independent manipulation of light intensity and duration of exposure for each sample, and the control of the cumulative light dose applied. The results demonstrate the proof of concept for the method, by comparing the responses of cultures of Chlorella vulgaris acclimated to low and high light regimes, highlighting the mapping of light stress responses over a wide range of light intensity and exposure conditions, and the rapid generation of paired light-response curves of photoinactivation and repair rate constants. This approach represents a chlorophyll fluorescence 'protocol of everything', contributing towards the high throughput characterization of the photophysiology of photosynthetic organisms.

  6. Quantitative assessment of key parameters in qualitative vulnerability methods applied in karst systems based on an integrated numerical modelling approach

    Science.gov (United States)

    Doummar, Joanna; Kassem, Assaad

    2017-04-01

    In the framework of a three-year PEER (USAID/NSF) funded project, flow in a karst system in Lebanon (Assal) dominated by snow and semi-arid conditions was simulated and successfully calibrated using an integrated numerical model (MIKE-She 2016) based on high-resolution input data and detailed catchment characterization. Point source infiltration and fast flow pathways were simulated by a bypass function and a highly conductive lens, respectively. The approach consisted of identifying all the factors used in qualitative vulnerability methods (COP, EPIK, PI, DRASTIC, GOD) applied in karst systems and assessing their influence on recharge signals in the different hydrological karst compartments (atmosphere, unsaturated zone and saturated zone) based on the integrated numerical model. These parameters are usually attributed different weights according to their estimated impact on groundwater vulnerability. The aim of this work is to quantify the importance of each of these parameters and to outline parameters that are not accounted for in standard methods, but that might play a role in the vulnerability of a system. The spatial distribution of the detailed evapotranspiration, infiltration, and recharge signals from atmosphere to unsaturated zone to saturated zone was compared and contrasted among different surface settings and under varying flow conditions (e.g., varying slopes, land cover, precipitation intensity, and soil properties as well as point source infiltration). Furthermore, a sensitivity analysis of individual or coupled major parameters allows quantifying their impact on recharge and indirectly on vulnerability. The preliminary analysis yields a new methodology that accounts for most of the factors influencing vulnerability while refining the weights attributed to each one of them, based on a quantitative approach.

  7. Integrated Case Based and Rule Based Reasoning for Decision Support

    OpenAIRE

    Eshete, Azeb Bekele

    2009-01-01

    This project is a continuation of my specialization project, which focused on studying theoretical concepts related to the case-based reasoning method, the rule-based reasoning method and their integration. The integration of rule-based and case-based reasoning methods has shown a substantial improvement in performance over the individual methods. Verdande Technology AS wants to try integrating the rule-based reasoning method with an existing case-based system. This project focu...

  8. A numerical method for resonance integral calculations

    International Nuclear Information System (INIS)

    Tanbay, Tayfun; Ozgener, Bilge

    2013-01-01

    A numerical method has been proposed for resonance integral calculations and a cubic fit based on least squares approximation to compute the optimum Bell factor is given. The numerical method is based on the discretization of the neutron slowing down equation. The scattering integral is approximated by taking into account the location of the upper limit in energy domain. The accuracy of the method has been tested by performing computations of resonance integrals for uranium dioxide isolated rods and comparing the results with empirical values. (orig.)

  9. Experimental validation of alternate integral-formulation method for predicting acoustic radiation based on particle velocity measurements.

    Science.gov (United States)

    Ni, Zhi; Wu, Sean F

    2010-09-01

    This paper presents experimental validation of an alternate integral-formulation method (AIM) for predicting acoustic radiation from an arbitrary structure based on the particle velocities specified on a hypothetical surface enclosing the target source. Both the normal and tangential components of the particle velocity on this hypothetical surface are measured and taken as the input to AIM codes to predict the acoustic pressures in both exterior and interior regions. The results obtained are compared with the benchmark values measured by microphones at the same locations. To gain some insight into practical applications of AIM, laser Doppler anemometer (LDA) and double hotwire sensor (DHS) are used as measurement devices to collect the particle velocities in the air. Measurement limitations of using LDA and DHS are discussed.

  10. The Breakthrough Series Collaborative on Service Integration: A Mixed Methods Study of a Strengths-Based Initiative

    Directory of Open Access Journals (Sweden)

    Cynthia A. Lietz

    2010-11-01

    Full Text Available Arizona’s Department of Economic Security (DES) engaged in a strengths-based initiative to increase quality and integration of human services. Twenty teams including employees from state agencies, community leaders, and families were brought together to discuss and implement improvements to a variety of social services. A mixed methods study was conducted to explore the complex process of forming diverse teams to strengthen social services. Specifically, the research team conducted focus groups to collect qualitative data from a purposive sample of the teams to explore their experiences in greater depth. Analysis of the data led to the development of an online survey instrument that allowed all collaborative members an opportunity to participate in the study. Findings suggest that while the teams faced many challenges, a commitment to the process brought perseverance, communication, and creativity allowing this collaborative to initiate 105 activities to bring about positive changes in social services within their communities.

  11. MADM Technique Integrated with Grey- based Taguchi method for Selection of Alluminium alloys to minimize deburring cost during Drilling

    Directory of Open Access Journals (Sweden)

    Reddy Sreenivasulu

    2015-06-01

    Full Text Available Traditionally, burr problems had been considered unavoidable, so most efforts were made to remove the burr as a post process. Nowadays, the trend in manufacturing is integration of the whole production flow from design to end product, and manufacturing problems are handled at various stages, even from the design stage. Therefore, methods of describing the burr have received much attention in recent years as part of a systematic approach to resolving the burr problem at various manufacturing stages. The main objective of this paper is to explore the basic concepts of MADM methods. In this study, five parameters, namely speed, feed, drill size, and the drill geometry parameters point angle and clearance angle, were identified as influencing burr formation most during drilling. An L18 orthogonal array was selected and experiments were conducted as per the Taguchi experimental plan for aluminium alloys of the 2014, 6061, 5035 and 7075 series. The experiments were performed on a CNC machining center with HSS twist drills. Burr size (height and thickness) was measured at the exit of each hole. An optimal combination of process parameters was obtained to minimize the burr size via grey relational analysis. The output from the grey-based Taguchi method was fed as input to the MADM. Apart from burr size, strength and temperature were also considered as attributes. Finally, the results generated by the MADM suggest the most suitable aluminium alloy, which results in lower deburring cost, high strength and high resistance at elevated temperatures.
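
    As a generic illustration of the grey relational analysis step (with made-up burr measurements rather than the paper's L18 data), a minimal Python sketch:

    ```python
    import numpy as np

    # Hypothetical burr measurements (rows: experiments, cols: [height, thickness]);
    # placeholders, not the paper's L18 data.
    data = np.array([
        [0.42, 0.10],
        [0.35, 0.08],
        [0.50, 0.12],
        [0.28, 0.07],
    ])

    # Smaller-is-better normalization.
    norm = (data.max(axis=0) - data) / (data.max(axis=0) - data.min(axis=0))

    # Grey relational coefficients with distinguishing coefficient zeta = 0.5.
    delta = np.abs(1.0 - norm)          # deviation from the ideal (normalized value 1)
    zeta = 0.5
    grc = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    # Grey relational grade: equally weighted average over the responses.
    grg = grc.mean(axis=1)
    print("grey relational grades:", grg)
    print("best experiment:", int(grg.argmax()) + 1)
    ```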

  12. Subdomain Precise Integration Method for Periodic Structures

    Directory of Open Access Journals (Sweden)

    F. Wu

    2014-01-01

    Full Text Available A subdomain precise integration method is developed for the dynamical responses of periodic structures comprising many identical structural cells. The proposed method is based on the precise integration method, the subdomain scheme, and the repeatability of the periodic structures. In the proposed method, each structural cell is seen as a super element that is solved using the precise integration method, considering the repeatability of the structural cells. The computational efforts and the memory size of the proposed method are reduced, while high computational accuracy is achieved. Therefore, the proposed method is particularly suitable to solve the dynamical responses of periodic structures. Two numerical examples are presented to demonstrate the accuracy and efficiency of the proposed method through comparison with the Newmark and Runge-Kutta methods.
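
    As a generic illustration of the precise-integration idea at the heart of the method above (a Zhong-style 2^N matrix exponential for a single cell, not the authors' subdomain code), a minimal Python sketch:

    ```python
    import numpy as np

    def precise_integration_exp(H, dt, N=20):
        """Matrix exponential exp(H*dt) via the precise integration (2^N) algorithm."""
        n = H.shape[0]
        tau = dt / 2.0**N
        Ht = H * tau
        # Truncated Taylor series of exp(H*tau) - I, kept in incremental form
        # to avoid losing digits when adding to the identity.
        Ta = Ht + Ht @ Ht @ (np.eye(n) + Ht / 3.0 + Ht @ Ht / 12.0) / 2.0
        # Repeated doubling: exp(2t) - I = 2*(exp(t) - I) + (exp(t) - I)^2.
        for _ in range(N):
            Ta = 2.0 * Ta + Ta @ Ta
        return np.eye(n) + Ta

    # Example: single-DOF oscillator x'' + x = 0 written in first-order form.
    H = np.array([[0.0, 1.0], [-1.0, 0.0]])
    T = precise_integration_exp(H, dt=0.1)
    print(np.allclose(T, [[np.cos(0.1), np.sin(0.1)], [-np.sin(0.1), np.cos(0.1)]]))
    ```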

  13. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.H.P.

    2015-01-01

    Let $A$ be a Dedekind domain, $K$ the fraction field of $A$, and $f\in A[x]$ a monic irreducible separable polynomial. For a given non-zero prime ideal $\mathfrak{p}$ of $A$ we present in this paper a new method to compute a $\mathfrak{p}$-integral basis of the extension of $K$ determined by $f$.

  14. Modeling erosion and accretion along the Illinois Lake Michigan shore using integrated airborne, waterborne and ground-based method

    Science.gov (United States)

    Mwakanyamale, K. E.; Brown, S.; Larson, T. H.; Theuerkauf, E.; Ntarlagiannis, D.; Phillips, A.; Anderson, A.

    2017-12-01

    Sediment distribution along the Illinois Lake Michigan shoreline is constantly changing in response to increased human activities and complex natural coastal processes associated with wave action, short- and long-term fluctuations in lake level, and the influence of coastal ice. Understanding changes in the volume, distribution and thickness of sand along the shore through time is essential for modeling shoreline changes and predicting changes due to extreme weather events and lake-level fluctuation. The helicopter transient electromagnetic (HTEM) method, integrated with ground-based and waterborne geophysical and geologic methods, provides the high-resolution, spatially rich data required for modeling the extent of erosion and accretion in this dynamic coastal system. Analysis and interpretation of the HTEM, ground-based and waterborne geophysical and geological data identify the spatial distribution and thickness of beach and lake-bottom sand. The results provide information on the existence of littoral sand deposits and identify coastal hazards such as lakebed down-cutting in sand-starved areas.

  15. Analytical solutions for prediction of the ignition time of wood particles based on a time and space integral method

    NARCIS (Netherlands)

    Haseli, Y.; Oijen, van J.A.; Goey, de L.P.H.

    2012-01-01

    The main idea of this paper is to establish a simple approach for prediction of the ignition time of a wood particle, assuming that the thermo-physical properties remain constant and ignition takes place at a characteristic ignition temperature. Using a time and space integral method, explicit relationships are derived for computation of the ignition time of particles of three common shapes (slab, cylinder and sphere), which may be characterized as thermally thin or thermally thick.

  16. An Integrated Start-Up Method for Pumped Storage Units Based on a Novel Artificial Sheep Algorithm

    Directory of Open Access Journals (Sweden)

    Zanbin Wang

    2018-01-01

    Full Text Available Pumped storage units (PSUs) are an important storage tool for power systems containing large-scale renewable energy, and their merit of rapid start-up enables PSUs to modulate and stabilize the power system. In this paper, PSU start-up strategies are studied and a new integrated start-up method is proposed for achieving swift and smooth start-up. A two-phase closed-loop start-up strategy, composed of switching between a proportional-integral (PI) and a proportional-integral-derivative (PID) controller, is designed, and an integrated optimization scheme is proposed for synchronous optimization of the parameters in the strategy. To enhance the optimization performance, a novel meta-heuristic called the Artificial Sheep Algorithm (ASA) is proposed and applied to the optimization task after verification against seven popular meta-heuristic algorithms on 13 typical benchmark functions. A simulation model has been built for a Chinese PSU, and comparative experiments are conducted to evaluate the proposed integrated method. Results show that start-up performance is significantly improved on both the overshoot and start-up-time indices, with up to a 34% reduction in time consumption under different working conditions. This significant improvement in PSU start-up is promising for further application on real units.
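
    A minimal sketch of a two-phase switching controller of the kind described above (assumed gains, switching band and plant interface; not the paper's PSU model or its optimized parameters):

    ```python
    # Phase 1: PI speed ramp-up; phase 2: PID fine regulation near rated speed.
    def two_phase_startup(speed_ref, speed_meas, state, dt,
                          kp1=2.0, ki1=0.5,            # phase-1 PI gains (assumed)
                          kp2=1.0, ki2=0.3, kd2=0.05,  # phase-2 PID gains (assumed)
                          switch_band=0.05):           # switching threshold (assumed)
        err = speed_ref - speed_meas
        state["i"] += err * dt                          # shared integrator
        if abs(err) > switch_band * speed_ref:          # phase 1: PI
            u = kp1 * err + ki1 * state["i"]
        else:                                           # phase 2: PID
            d = (err - state["prev"]) / dt
            u = kp2 * err + ki2 * state["i"] + kd2 * d
        state["prev"] = err
        return u

    state = {"i": 0.0, "prev": 0.0}
    print(two_phase_startup(1.0, 0.2, state, dt=0.02))
    ```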

  17. Gaussian-windowed frame based method of moments formulation of surface-integral-equation for extended apertures

    Energy Technology Data Exchange (ETDEWEB)

    Shlivinski, A., E-mail: amirshli@ee.bgu.ac.il [Department of Electrical and Computer Engineering, Ben-Gurion University of the Negev, Beer-Sheva 84105 (Israel); Lomakin, V., E-mail: vlomakin@eng.ucsd.edu [Department of Electrical and Computer Engineering, University of California, San Diego, 9500 Gilman Drive, La Jolla, CA 92093-0407 (United States)

    2016-03-01

    Scattering or coupling of an electromagnetic beam-field at a surface discontinuity separating two homogeneous or inhomogeneous media with different propagation characteristics is formulated using surface integral equations, which are solved by the Method of Moments with the aid of the Gabor-based Gaussian window frame set of basis and testing functions. The application of the Gaussian window frame provides (i) a mathematically exact and robust tool for spatial-spectral phase-space formulation and analysis of the problem; (ii) a system of linear equations in a transmission-line-like form relating mode-like wave objects of one medium with mode-like wave objects of the second medium; (iii) for an appropriate setting of the frame parameters, mode-like wave objects that blend plane-wave properties (as if solving in the spectral domain) with Green's function properties (as if solving in the spatial domain); and (iv) a representation of the scattered field with Gaussian-beam propagators that may be used in many large (in terms of wavelengths) systems.

  18. Undergraduate physiotherapy students' competencies, attitudes and perceptions after integrated educational pathways in evidence-based practice: a mixed methods study.

    Science.gov (United States)

    Bozzolan, M; Simoni, G; Balboni, M; Fiorini, F; Bombardi, S; Bertin, N; Da Roit, M

    2014-11-01

    This mixed methods study aimed to explore perceptions/attitudes, to evaluate knowledge/ skills, to investigate clinical behaviours of undergraduate physiotherapy students exposed to a composite education curriculum on evidence-based practice (EBP). Students' knowledge and skills were assessed before and after integrated learning activities, using the Adapted Fresno test, whereas their behaviour in EBP was evaluated by examining their internship documentation. Students' perceptions and attitudes were explored through four focus groups. Sixty-two students agreed to participate in the study. The within group mean differences (A-Fresno test) were 34.2 (95% CI 24.4 to 43.9) in the first year and 35.1 (95% CI 23.2 to 47.1) in the second year; no statistically significant change was observed in the third year. Seventy-six percent of the second year and 88% of the third year students reached the pass score. Internship documentation gave evidence of PICOs and database searches (95-100%), critical appraisal of internal validity (25-75%) but not of external validity (5-15%). The correct application of these items ranged from 30 to 100%. Qualitative analysis of the focus groups indicated students valued EBP, but perceived many barriers, with clinicians being both an obstacle and a model. Key elements for changing students' behaviours seem to be internship environment and possibility of continuous practice and feedback.

  19. Interface-based software integration

    Directory of Open Access Journals (Sweden)

    Aziz Ahmad Rais

    2016-07-01

    Full Text Available Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become, which leads to increasingly complex enterprise architecture development and maintenance. Application software architecture is one type of architecture that, along with business architecture, data architecture and technology architecture, composes enterprise architecture. From the perspective of integration, enterprise architecture can be considered a system of interaction between multiple instances of application software. Therefore, effective software integration is a very important basis for the future success of the enterprise architecture in question. This article presents interface-based integration practices in order to help simplify the process of building such a software integration system. The main goal of interface-based software integration is to address the problems that arise in defining software integration requirements and developing a software integration architecture.

  20. The simulation methods based on 1D/3D collaborative computing for the vehicle integrated thermal management

    International Nuclear Information System (INIS)

    Lu, Pengyu; Gao, Qing; Wang, Yan

    2016-01-01

    Highlights: • A 1D/3D collaborative computing simulation method for vehicle thermal management. • Analysis of the influence of the thermodynamic systems and the engine compartment geometry on vehicle performance. • A basis for matching the energy consumption of the thermodynamic systems in the underhood. - Abstract: Vehicle integrated thermal management, covering the engine cooling circuit, the air conditioning circuit, the turbocharged inter-cooled circuit, the engine lubrication circuit, etc., is an important means of enhancing power performance, improving economy, saving energy and reducing emissions. In this study, a 1D/3D collaborative simulation method is proposed, taking the engine cooling circuit and the air conditioning circuit as the research objects. The mathematical characterization of the multiple thermodynamic systems is achieved by 1D calculation and the underhood structure is described by 3D simulation. By analyzing the integrated heat transfer process in the engine compartment, the model of the integrated thermal management system is formed after coupling the cooling circuit and the air conditioning circuit. This collaborative simulation method establishes a structured correlation of engine-cooling and air-conditioning thermal dissipation in the engine compartment, comprehensively analyzing the engine working process and the air conditioning operational process in order to study their interaction. In the calculation examples, performance evaluations of the engine cooling circuit and the air conditioning circuit are carried out by describing the influence of system thermomechanical parameters and operating duty on the underhood heat transfer process, supporting integrated design optimization and performance prediction of the multiple thermal systems.

  1. Detection of concrete dam leakage using an integrated geophysical technique based on flow-field fitting method

    Science.gov (United States)

    Dai, Qianwei; Lin, Fangpeng; Wang, Xiaoping; Feng, Deshan; Bayless, Richard C.

    2017-05-01

    An integrated geophysical investigation was performed at the S dam, located in the Dadu basin in China, to assess the condition of the dam curtain. The key methodology of the integrated technique was the flow-field fitting method, which allowed identification of the hydraulic connections between the dam foundation and surface water sources (upstream and downstream), and location of the anomalous leakage outlets in the dam foundation. Limitations of the flow-field fitting method were complemented with resistivity logging to identify internal erosion which had not yet developed into seepage pathways. The results of the flow-field fitting method and resistivity logging were consistent when compared with data provided by seismic tomography, borehole television, water injection tests, and rock quality designation.

  2. PLUME-MoM 1.0: A new integral model of volcanic plumes based on the method of moments

    Science.gov (United States)

    de'Michieli Vitturi, M.; Neri, A.; Barsotti, S.

    2015-08-01

    In this paper a new integral mathematical model for volcanic plumes, named PLUME-MoM, is presented. The model describes the steady-state dynamics of a plume in a 3-D coordinate system, accounting for continuous variability in particle size distribution of the pyroclastic mixture ejected at the vent. Volcanic plumes are composed of pyroclastic particles of many different sizes ranging from a few microns up to several centimeters and more. A proper description of such a multi-particle nature is crucial when quantifying changes in grain-size distribution along the plume and, therefore, for better characterization of source conditions of ash dispersal models. The new model is based on the method of moments, which allows for a description of the pyroclastic mixture dynamics not only in the spatial domain but also in the space of parameters of the continuous size distribution of the particles. This is achieved by formulation of fundamental transport equations for the multi-particle mixture with respect to the different moments of the grain-size distribution. Different formulations, in terms of the distribution of the particle number, as well as of the mass distribution expressed in terms of the Krumbein log scale, are also derived. Comparison between the new moments-based formulation and the classical approach, based on the discretization of the mixture in N discrete phases, shows that the new model allows for the same results to be obtained with a significantly lower computational cost (particularly when a large number of discrete phases is adopted). Application of the new model, coupled with uncertainty quantification and global sensitivity analyses, enables the investigation of the response of four key output variables (mean and standard deviation of the grain-size distribution at the top of the plume, plume height and amount of mass lost by the plume during the ascent) to changes in the main input parameters (mean and standard deviation) characterizing the
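
    As a toy illustration of tracking moments of a grain-size distribution (an assumed Gaussian mass distribution in Krumbein phi units, not the PLUME-MoM source code), a short Python sketch:

    ```python
    import numpy as np

    # Grain size in the Krumbein phi scale, phi = -log2(d / d0) with d0 = 1 mm.
    phi = np.linspace(-4, 6, 201)                      # grain-size classes (phi units)
    pdf = np.exp(-0.5 * ((phi - 1.0) / 1.5) ** 2)      # assumed Gaussian mass distribution
    pdf /= np.trapz(pdf, phi)                          # normalize to unit mass

    # Integer moments M_k = integral of phi^k * pdf(phi) d(phi).
    moments = [np.trapz(phi**k * pdf, phi) for k in range(3)]
    mean = moments[1] / moments[0]
    std = np.sqrt(moments[2] / moments[0] - mean**2)
    print(f"M0={moments[0]:.3f}, mean phi={mean:.3f}, std phi={std:.3f}")
    ```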

  3. Integration of Microchip Electrophoresis with Electrochemical Detection Using an Epoxy-Based Molding Method to Embed Multiple Electrode Materials

    Science.gov (United States)

    Johnson, Alicia S.; Selimovic, Asmira; Martin, R. Scott

    2012-01-01

    This paper describes the use of epoxy-encapsulated electrodes to integrate microchip-based electrophoresis with electrochemical detection. Devices with various electrode combinations can easily be developed. This includes a palladium decoupler with a downstream working electrode material of either gold, mercury/gold, platinum, glassy carbon, or a carbon fiber bundle. Additional device components such as the platinum wires for the electrophoresis separation and the counter electrode for detection can also be integrated into the epoxy base. The effect of the decoupler configuration was studied in terms of the separation performance, detector noise, and the ability to analyze samples of a high ionic strength. The ability of both glassy carbon and carbon fiber bundle electrodes to analyze a complex mixture was demonstrated. It was also shown that a PDMS-based valving microchip can be used along with the epoxy embedded electrodes to integrate microdialysis sampling with microchip electrophoresis and electrochemical detection, with the microdialysis tubing also being embedded in the epoxy substrate. This approach enables one to vary the detection electrode material as desired in a manner where the electrodes can be polished and modified in a similar fashion to electrochemical flow cells used in liquid chromatography. PMID:22038707

  4. Integral Methods in Science and Engineering

    CERN Document Server

    Constanda, Christian

    2011-01-01

    An enormous array of problems encountered by scientists and engineers are based on the design of mathematical models using many different types of ordinary differential, partial differential, integral, and integro-differential equations. Accordingly, the solutions of these equations are of great interest to practitioners and to science in general. Presenting a wealth of cutting-edge research by a diverse group of experts in the field, Integral Methods in Science and Engineering: Computational and Analytic Aspects gives a vivid picture of both the development of theoretical integral techniques

  5. ASPECTS OF INTEGRATION MANAGEMENT METHODS

    Directory of Open Access Journals (Sweden)

    Artemy Varshapetian

    2015-10-01

    Full Text Available For manufacturing companies to succeed in today's unstable economic environment, it is necessary to restructure the main components of their activities: designing innovative products, production using modern reconfigurable manufacturing systems, a business model that takes into account the global strategy, and management methods using modern management models and tools. The first three components are discussed in numerous publications, for example (Koren, 2010), and are therefore not considered in this article. A large number of publications are devoted to the methods and tools of production management, for example (Halevi, 2007). On this basis, the article discusses the possibility of integrating three methods that have become the most widely used in recent years, namely: the Six Sigma method (SS) (George et al., 2005) and its supplement, Design for Six Sigma (DFSS) (Taguchi, 2003); Lean production, which evolved into "Lean management" and further into "Lean thinking" (Lean) (Hirano et al., 2006); and the Theory of Constraints (TOC) developed by E. Goldratt (Dettmer, 2001). The article investigates some aspects of this integration: applications in diverse fields, positive features, changes in management structure, etc.

  6. Methods for enhancing numerical integration

    International Nuclear Information System (INIS)

    Doncker, Elise de

    2003-01-01

    We give a survey of common strategies for numerical integration (adaptive, Monte Carlo, quasi-Monte Carlo), and attempt to delineate their realm of applicability. The inherent accuracy and error bounds for basic integration methods are given via such measures as the degree of precision of cubature rules, the index of a family of lattice rules, and the discrepancy of uniformly distributed point sets. Strategies incorporating these basic methods often use paradigms to reduce the error by, e.g., increasing the number of points in the domain or decreasing the mesh size, locally or uniformly. For these processes the order of convergence of the strategy is determined by the asymptotic behavior of the error, and may be too slow in practice for the type of problem at hand. For certain problem classes we may be able to improve the effectiveness of the method or strategy by such techniques as transformations, absorbing a difficult part of the integrand into a weight function, suitable partitioning of the domain, and extrapolation or convergence acceleration. Situations warranting the use of these techniques (possibly in an 'automated' way) are described and illustrated by sample applications
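
    As a small, generic example of the "adaptive" strategy surveyed above (not code from the paper), a recursive adaptive Simpson rule that refines only where the local error estimate is large:

    ```python
    import math

    def adaptive_simpson(f, a, b, tol=1e-8):
        """Tiny recursive adaptive Simpson quadrature."""
        def simpson(fa, fm, fb, a, b):
            return (b - a) / 6.0 * (fa + 4.0 * fm + fb)

        def recurse(a, b, fa, fm, fb, whole, tol):
            m, lm, rm = (a + b) / 2, (3 * a + b) / 4, (a + 3 * b) / 4
            flm, frm = f(lm), f(rm)
            left = simpson(fa, flm, fm, a, m)
            right = simpson(fm, frm, fb, m, b)
            if abs(left + right - whole) <= 15.0 * tol:
                return left + right + (left + right - whole) / 15.0
            return (recurse(a, m, fa, flm, fm, left, tol / 2) +
                    recurse(m, b, fm, frm, fb, right, tol / 2))

        fa, fm, fb = f(a), f((a + b) / 2), f(b)
        return recurse(a, b, fa, fm, fb, simpson(fa, fm, fb, a, b), tol)

    # Example: integral of sqrt(x) on [0, 1] = 2/3; the rule refines near x = 0.
    print(adaptive_simpson(math.sqrt, 0.0, 1.0), 2.0 / 3.0)
    ```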

  7. The research of the test-class method based on interface object in the software integration test of the large container inspection system

    International Nuclear Information System (INIS)

    Sun Shaohua; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2000-01-01

    Software testing is an important stage in the software process. There is mature theory, method and practice for unit testing, but for integration testing there is no regular method to adhere to. The author presents a new method, developed during the development of the large container inspection system, named the test-class method based on interface objects. In this method, a set of basic test-classes based on the concept of class in the object-oriented method is established, and the combination of the interface graph and the class set is used to describe the test process, so that strict control and scientific management of the test process are achieved. The concept of a test database is introduced in this method, which improves the traceability and repeatability of the test process.

  8. Integrated data base program

    International Nuclear Information System (INIS)

    Notz, K.J.

    1981-01-01

    The IDB Program provides direct support to the DOE Nuclear Waste Management and Fuel Cycle Programs and their lead sites and support contractors by providing and maintaining a current, integrated data base of spent fuel and radioactive waste inventories and projections. All major waste types (HLW, TRU, and LLW) and sources (government, commercial fuel cycle, and I/I) are included. A major data compilation was issued in September 1981: Spent Fuel and Radioactive Waste Inventories and Projections as of December 31, 1980, DOE/NE-0017. This report includes chapters on Spent Fuel, HLW, TRU Waste, LLW, Remedial Action Waste, Active Uranium Mill Tailings, and Airborne Waste, plus appendices with more detailed data in selected areas such as isotopics, radioactivity, thermal power, projections, and land usage. The LLW sections include volumes, radioactivity, thermal power, current inventories, projected inventories and characteristics, source terms, land requirements, and a breakdown in terms of government/commercial and defense/fuel cycle/I and I

  9. An integrating factor matrix method to find first integrals

    International Nuclear Information System (INIS)

    Saputra, K V I; Quispel, G R W; Van Veen, L

    2010-01-01

    In this paper we develop an integrating factor matrix method to derive conditions for the existence of first integrals. We use this novel method to obtain first integrals, along with the conditions for their existence, for two- and three-dimensional Lotka-Volterra systems with constant terms. The results are compared to previous results obtained by other methods.
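
    For a flavour of what a first integral is, the following symbolic check (illustrative only, and for the classical two-dimensional Lotka-Volterra system without constant terms, i.e. not the systems with constant terms treated in the paper) verifies that a well-known conserved quantity has zero time derivative along trajectories:

    ```python
    import sympy as sp

    # Classical 2-D Lotka-Volterra system and its known first integral
    # V = d*x - c*ln(x) + b*y - a*ln(y); we check that dV/dt = 0.
    x, y, a, b, c, d = sp.symbols('x y a b c d', positive=True)
    xdot = x * (a - b * y)
    ydot = y * (-c + d * x)
    V = d * x - c * sp.log(x) + b * y - a * sp.log(y)
    dVdt = sp.simplify(sp.diff(V, x) * xdot + sp.diff(V, y) * ydot)
    print(dVdt)  # -> 0
    ```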

  10. Integrating experimental and numerical methods for a scenario-based quantitative assessment of subsurface energy storage options

    Science.gov (United States)

    Kabuth, Alina; Dahmke, Andreas; Hagrey, Said Attia al; Berta, Márton; Dörr, Cordula; Koproch, Nicolas; Köber, Ralf; Köhn, Daniel; Nolde, Michael; Tilmann Pfeiffer, Wolf; Popp, Steffi; Schwanebeck, Malte; Bauer, Sebastian

    2016-04-01

    Within the framework of the transition to renewable energy sources ("Energiewende"), the German government defined the target of producing 60 % of the final energy consumption from renewable energy sources by the year 2050. However, renewable energies are subject to natural fluctuations. Energy storage can help to buffer the resulting time shifts between production and demand. Subsurface geological structures provide large potential capacities for energy stored in the form of heat or gas on daily to seasonal time scales. In order to explore this potential sustainably, the possible induced effects of energy storage operations have to be quantified for both specified normal operation and events of failure. The ANGUS+ project therefore integrates experimental laboratory studies with numerical approaches to assess subsurface energy storage scenarios and monitoring methods. Subsurface storage options for gas, i.e. hydrogen, synthetic methane and compressed air in salt caverns or porous structures, as well as subsurface heat storage are investigated with respect to site prerequisites, storage dimensions, induced effects, monitoring methods and integration into spatial planning schemes. The conceptual interdisciplinary approach of the ANGUS+ project towards the integration of subsurface energy storage into a sustainable subsurface planning scheme is presented here, and this approach is then demonstrated using the examples of two selected energy storage options: Firstly, the option of seasonal heat storage in a shallow aquifer is presented. Coupled thermal and hydraulic processes induced by periodic heat injection and extraction were simulated in the open-source numerical modelling package OpenGeoSys. Situations of specified normal operation as well as cases of failure in operational storage with leaking heat transfer fluid are considered. Bench-scale experiments provided parameterisations of temperature dependent changes in shallow groundwater hydrogeochemistry. As a

  11. Rapid detection of Salmonella in pet food: design and evaluation of integrated methods based on real-time PCR detection.

    Science.gov (United States)

    Balachandran, Priya; Friberg, Maria; Vanlandingham, V; Kozak, K; Manolis, Amanda; Brevnov, Maxim; Crowley, Erin; Bird, Patrick; Goins, David; Furtado, Manohar R; Petrauskene, Olga V; Tebbs, Robert S; Charbonneau, Duane

    2012-02-01

    Reducing the risk of Salmonella contamination in pet food is critical for both companion animals and humans, and its importance is reflected by the substantial increase in the demand for pathogen testing. Accurate and rapid detection of foodborne pathogens improves food safety, protects the public health, and benefits food producers by assuring product quality while facilitating product release in a timely manner. Traditional culture-based methods for Salmonella screening are laborious and can take 5 to 7 days to obtain definitive results. In this study, we developed two methods for the detection of low levels of Salmonella in pet food using real-time PCR: (i) detection of Salmonella in 25 g of dried pet food in less than 14 h with an automated magnetic bead-based nucleic acid extraction method and (ii) detection of Salmonella in 375 g of composite dry pet food matrix in less than 24 h with a manual centrifugation-based nucleic acid preparation method. Both methods included a preclarification step using a novel protocol that removes food matrix-associated debris and PCR inhibitors and improves the sensitivity of detection. Validation studies revealed no significant differences between the two real-time PCR methods and the standard U.S. Food and Drug Administration Bacteriological Analytical Manual (chapter 5) culture confirmation method.

  12. A measurement-based method for predicting margins and uncertainties for unprotected accidents in the Integral Fast Reactor concept

    International Nuclear Information System (INIS)

    Vilim, R.B.

    1990-01-01

    A measurement-based method for predicting the response of an LMR core to unprotected accidents has been developed. The method processes plant measurements taken at normal operation to generate a stochastic model for the core dynamics. This model can be used to predict three sigma confidence intervals for the core temperature and power response. Preliminary numerical simulations performed for EBR-2 appear promising. 6 refs., 2 figs

  13. The research of the test-class method based on interface object in the software integration test of the large container inspection system

    International Nuclear Information System (INIS)

    Sun Shaohua; Chen Zhiqiang; Zhang Li; Gao Wenhuan; Kang Kejun

    2001-01-01

    Software testing is an important stage in the software process. There is mature theory, method and practice for unit testing, but for integration testing there is no regular method to adhere to. The author presents a new method, developed during the development of the large container inspection system, named the test-class method based on interface objects. A set of basic test-classes based on the concept of class in the object-oriented method is established, and the combination of the interface graph and the class set is used to describe the test process, so that strict control and scientific management of the test process are achieved. The concept of a test database is introduced in this method, thus improving the traceability and repeatability of the test process.

  14. Setting value optimization method in integration for relay protection based on improved quantum particle swarm optimization algorithm

    Science.gov (United States)

    Yang, Guo Sheng; Wang, Xiao Yang; Li, Xue Dong

    2018-03-01

    With the establishment of the integrated model of relay protection and the expanding scale of the power system, the global setting and optimization of relay protection is an extremely difficult task. This paper presents an application of an improved quantum particle swarm optimization algorithm to the global optimization of relay protection, taking inverse-time overcurrent protection as an example. Reliability, selectivity, speed of action and flexibility of the relay protection are selected as the four requirements used to establish the optimization targets, and the protection setting values of the whole system are optimized. Finally, for an actual power system, the optimized setting values obtained by the proposed method are compared with those of the standard particle swarm algorithm. The results show that the improved quantum particle swarm optimization algorithm has strong search ability and good robustness, and is suitable for optimizing setting values in the relay protection of the whole power system.
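
    For orientation, a bare-bones (standard, non-quantum) particle swarm optimizer on a toy objective; the paper's improved quantum variant and its protection-setting objective are not reproduced here, and all gains and sizes below are assumed:

    ```python
    import numpy as np

    def pso(f, dim, n=30, iters=200, lb=-5.0, ub=5.0, w=0.7, c1=1.5, c2=1.5, seed=0):
        """Minimize f over [lb, ub]^dim with a basic global-best PSO."""
        rng = np.random.default_rng(seed)
        x = rng.uniform(lb, ub, (n, dim))
        v = np.zeros((n, dim))
        pbest, pval = x.copy(), np.apply_along_axis(f, 1, x)
        gbest = pbest[pval.argmin()].copy()
        for _ in range(iters):
            r1, r2 = rng.random((n, dim)), rng.random((n, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lb, ub)
            val = np.apply_along_axis(f, 1, x)
            improved = val < pval
            pbest[improved], pval[improved] = x[improved], val[improved]
            gbest = pbest[pval.argmin()].copy()
        return gbest, pval.min()

    # Toy stand-in for a setting-value objective (sphere function).
    best, best_val = pso(lambda z: float(np.sum(z**2)), dim=4)
    print(best, best_val)
    ```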

  15. Fuzzy knowledge bases integration based on ontology

    OpenAIRE

    Ternovoy, Maksym; Shtogrina, Olena

    2012-01-01

    The paper describes an approach to fuzzy knowledge base integration based on an ontology. The approach uses a metadata base to integrate different knowledge bases through a common ontology. The design process of the metadata base is described.

  16. A Decomposition-Based Pricing Method for Solving a Large-Scale MILP Model for an Integrated Fishery

    Directory of Open Access Journals (Sweden)

    M. Babul Hasan

    2007-01-01

    The IFP can be decomposed into a trawler-scheduling subproblem and a fish-processing subproblem in two different ways by relaxing different sets of constraints. We tried conventional decomposition techniques including subgradient optimization and Dantzig-Wolfe decomposition, both of which were unacceptably slow. We then developed a decomposition-based pricing method for solving the large fishery model, which gives excellent computation times. Numerical results for several planning horizon models are presented.

  17. Levels of reconstruction as complementarity in mixed methods research: a social theory-based conceptual framework for integrating qualitative and quantitative research.

    Science.gov (United States)

    Carroll, Linda J; Rothe, J Peter

    2010-09-01

    Like other areas of health research, injury and injury prevention research has seen increasing use of qualitative methods, and the integration of qualitative and quantitative research (mixed methods) is beginning to assume a more prominent role in public health studies. Using mixed methods has great potential for gaining a broad and comprehensive understanding of injuries and their prevention. However, qualitative and quantitative research methods are based on two inherently different paradigms, and their integration requires a conceptual framework that permits the unity of these two methods. We present a theory-driven framework for viewing qualitative and quantitative research, which enables us to integrate them in a conceptually sound and useful manner. This framework has its foundation within the philosophical concept of complementarity, as espoused in the physical and social sciences, and draws on Bergson's metaphysical work on the 'ways of knowing'. Through understanding how data are constructed and reconstructed, and the different levels of meaning that can be ascribed to qualitative and quantitative findings, we can use a mixed-methods approach to gain a conceptually sound, holistic knowledge about injury phenomena that will enhance our development of relevant and successful interventions.

  18. Energy saving analysis and management modeling based on index decomposition analysis integrated energy saving potential method: Application to complex chemical processes

    International Nuclear Information System (INIS)

    Geng, Zhiqiang; Gao, Huachao; Wang, Yanqing; Han, Yongming; Zhu, Qunxiong

    2017-01-01

    Highlights: • An integrated framework that combines IDA with the energy-saving potential method is proposed. • An energy saving analysis and management framework for complex chemical processes is obtained. • The proposed method is efficient for energy optimization and carbon emission reduction in complex chemical processes. - Abstract: Energy saving and management of complex chemical processes play a crucial role in sustainable development. In order to analyze the effect that technology, management level, and production structure have on energy efficiency and energy saving potential, this paper proposes a novel integrated framework that combines index decomposition analysis (IDA) with the energy saving potential method. The IDA method can effectively obtain the energy activity, energy hierarchy and energy intensity levels in a data-driven way to reflect the impact of energy usage. The energy saving potential method can verify the correctness of the improvement direction proposed by the IDA method. Meanwhile, energy efficiency improvement, energy consumption reduction and energy savings can be visually discovered by the proposed framework. A demonstration analysis of ethylene production has verified the practicality of the proposed method. Moreover, we can obtain the corresponding improvements for the ethylene production based on the demonstration analysis. The energy efficiency index and the energy saving potential of the worst-performing months can be increased by 6.7% and 7.4%, respectively, and the carbon emissions can be reduced by 7.4–8.2%.

  19. Analysis Method for Integrating Components of Product

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jun Ho [Inzest Co. Ltd, Seoul (Korea, Republic of); Lee, Kun Sang [Kookmin Univ., Seoul (Korea, Republic of)

    2017-04-15

    This paper presents methods for integrating the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships between component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside the product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product reflecting the relation function structure was actually created.

  20. Analysis Method for Integrating Components of Product

    International Nuclear Information System (INIS)

    Choi, Jun Ho; Lee, Kun Sang

    2017-01-01

    This paper presents methods for integrating the parts constituting a product. A new relation function concept and its structure are introduced to analyze the relationships between component parts. This relation function carries three types of information, which can be used to establish a relation function structure. The relation function structure of the analysis criteria was established to analyze and present the data. The priority components determined by the analysis criteria can be integrated. The analysis criteria were divided based on their number and orientation, as well as their direct or indirect character. This paper presents a design algorithm for component integration. The algorithm was applied to actual products, and the components inside the product were integrated. The proposed algorithm was then used in research to improve bicycle brake discs. As a result, an improved product reflecting the relation function structure was actually created.

  1. First integral method for an oscillator system

    Directory of Open Access Journals (Sweden)

    Xiaoqian Gong

    2013-04-01

    Full Text Available In this article, we consider the nonlinear Duffing-van der Pol-type oscillator system by means of the first integral method. This system has physical relevance as a model in certain flow-induced structural vibration problems, and includes the van der Pol oscillator and the damped Duffing oscillator as particular cases. Firstly, we apply the Division Theorem for two variables in the complex domain, which is based on the ring theory of commutative algebra, to explore a quasi-polynomial first integral of an equivalent autonomous system. Then, by solving an algebraic system, we derive the first integral of the Duffing-van der Pol-type oscillator system under a certain parametric condition.

  2. An Integrated Approach Using Chaotic Map & Sample Value Difference Method for Electrocardiogram Steganography and OFDM Based Secured Patient Information Transmission.

    Science.gov (United States)

    Pandey, Anukul; Saini, Barjinder Singh; Singh, Butta; Sood, Neetu

    2017-10-18

    This paper presents a patient confidential data hiding scheme in electrocardiogram (ECG) signals and its subsequent wireless transmission. The patient's confidential data is embedded in the ECG (called the stego-ECG) using a chaotic map and the sample value difference approach. The sample value difference approach effectually hides the patient's confidential data in ECG sample pairs at predefined locations. The chaotic map generates these predefined locations through the use of selective control parameters. Subsequently, the wireless transmission of the stego-ECG is analyzed using the Orthogonal Frequency Division Multiplexing (OFDM) system in a Rayleigh fading scenario for telemedicine applications. Evaluation of the proposed method on all 48 records of the MIT-BIH arrhythmia ECG database demonstrates that the embedding does not alter the diagnostic features of the cover ECG. The imperceptibility of the secret data in the stego-ECG is evident through the statistical and clinical performance measures. Statistical measures comprise Percentage Root-mean-square Difference (PRD), Peak Signal to Noise Ratio (PSNR), and Kullback-Leibler Divergence (KL-Div), while clinical metrics include wavelet Energy Based Diagnostic Distortion (WEDD) and Wavelet based Weighted PRD (WWPRD). Various channel signal-to-noise ratio scenarios are simulated for wireless communication of the stego-ECG in the OFDM system. Over all 48 records of the MIT-BIH arrhythmia database the proposed method resulted in, on average, PRD = 0.26, PSNR = 55.49, KL-Div = 3.34 × 10^-6, WEDD = 0.02, and WWPRD = 0.10 with a secret data size of 21 kb. Further, a comparative analysis of the proposed method and recent existing works was also performed; the results clearly demonstrated the superiority of the proposed method.
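
    As a toy illustration of using a chaotic map to select embedding locations (a generic logistic map with assumed parameters, not the authors' scheme), a short Python sketch:

    ```python
    import numpy as np

    def chaotic_locations(n_samples, n_bits, x0=0.631, r=3.99):
        """Return n_bits key-dependent sample indices in [0, n_samples)."""
        x, seen, locs = x0, set(), []
        while len(locs) < n_bits:
            x = r * x * (1.0 - x)              # logistic map iteration
            idx = int(x * n_samples)
            if idx not in seen:                # avoid reusing a sample location
                seen.add(idx)
                locs.append(idx)
        return locs

    ecg = np.random.default_rng(1).normal(size=3600)   # stand-in for an ECG segment
    print(chaotic_locations(len(ecg), n_bits=8))
    ```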

  3. Computation of integral bases

    NARCIS (Netherlands)

    Bauch, J.D.

    2016-01-01

    Let A be a Dedekind domain, K the fraction field of A, and f ∈ A[x] a monic irreducible separable polynomial. For a given non-zero prime ideal p of A we present in this paper a new characterization of a p-integral basis of the extension of K determined by f. This characterization yields in an

  4. An Association Rule Based Method to Integrate Metro-Public Bicycle Smart Card Data for Trip Chain Analysis

    Directory of Open Access Journals (Sweden)

    De Zhao

    2018-01-01

    Full Text Available Smart card data provide valuable insights and massive samples for enhancing the understanding of transfer behavior between metro and public bicycle. However, smart cards for metro and public bicycle are often issued and managed by independent companies, and this results in the same commuter having different identity tags in the metro and public bicycle smart card systems. The primary objective of this study is to develop a data fusion methodology for matching metro and public bicycle smart cards belonging to the same commuter using historical smart card data. A novel method using association rules to match the data derived from the two systems is proposed and validated. The results showed that our proposed method successfully matched 573 pairs of smart cards with an accuracy of 100%. We also validated the association rules method through visualization of individual metro and public bicycle trips. Based on the matched cards, interesting findings on metro-bicycle transfer have been derived, including the spatial pattern of the public bicycle as a first/last-mile solution as well as the duration of a metro trip chain.
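
    A toy sketch of the underlying matching idea (made-up records, station codes and thresholds, not the paper's association rules): a metro card and a bike card are matched when bike pickups repeatedly follow that metro card's exits at the same station within a short transfer window.

    ```python
    from collections import defaultdict

    metro_taps = [  # (metro_card, station, exit_time_in_minutes) -- made-up data
        ("M1", "S5", 480), ("M1", "S5", 1020), ("M2", "S3", 485),
    ]
    bike_taps = [   # (bike_card, station, pickup_time_in_minutes) -- made-up data
        ("B9", "S5", 484), ("B9", "S5", 1026), ("B7", "S3", 600),
    ]

    WINDOW = 10  # minutes allowed between metro exit and bike pickup (assumed)
    support = defaultdict(int)
    for m_card, m_sta, m_t in metro_taps:
        for b_card, b_sta, b_t in bike_taps:
            if m_sta == b_sta and 0 <= b_t - m_t <= WINDOW:
                support[(m_card, b_card)] += 1

    # Keep card pairs observed on at least 2 independent occasions.
    matches = {pair for pair, count in support.items() if count >= 2}
    print(matches)   # {('M1', 'B9')}
    ```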

  5. Analytical solutions for prediction of the ignition time of wood particles based on a time and space integral method

    International Nuclear Information System (INIS)

    Haseli, Y.; Oijen, J.A. van; Goey, L.P.H. de

    2012-01-01

    Highlights: ► A simple model for prediction of the ignition time of a wood particle is presented. ► The formulation is given for both thermally thin and thermally thick particles. ► Transition from thermally thin to thick regime occurs at a critical particle size. ► The model is validated against a numerical model and various experimental data. - Abstract: The main idea of this paper is to establish a simple approach for prediction of the ignition time of a wood particle assuming that the thermo-physical properties remain constant and ignition takes place at a characteristic ignition temperature. Using a time and space integral method, explicit relationships are derived for computation of the ignition time of particles of three common shapes (slab, cylinder and sphere), which may be characterized as thermally thin or thermally thick. It is shown through a dimensionless analysis that the dimensionless ignition time can be described as a function of non-dimensional ignition temperature, reactor temperature or external incident heat flux, and parameter K which represents the ratio of conduction heat transfer to the external radiation heat transfer. The numerical results reveal that for the dimensionless ignition temperature between 1.25 and 2.25 and for values of K up to 8000 (corresponding to woody materials), the variation of the ignition time of a thermally thin particle with K and the dimensionless ignition temperature is linear, whereas the dependence of the ignition time of a thermally thick particle on the above two parameters obeys a quadratic function. Furthermore, it is shown that the transition from the regime of thermally thin to the regime of thermally thick occurs at K cr (corresponding to a critical size of particle) which is found to be independent of the particle shape. The model is validated by comparing the predicted and the measured ignition time of several wood particles obtained from different sources. Good agreement is achieved which

  6. Model-driven design method for integrated modular avionics systems based on a cosimulation approach

    Science.gov (United States)

    Bao, Lin

    In the aerospace industry, as avionic systems become more and more complex, the integrated modular avionics (IMA) architecture was proposed to replace its predecessor, the federated architecture, in order to reduce the weight, power consumption and dimensions of the avionics equipment. The research work presented in this thesis, which is part of the research project AVIO509, aims to propose to the aviation industry a set of time-effective and cost-effective solutions for the development and the functional validation of IMA systems. The proposed methodologies mainly focus on two design flows based on: 1) the concept of model-driven engineering and 2) a cosimulation platform. In the first design flow, the modeling language AADL is used to describe the IMA architecture. The OCARINA environment, a code generator initially designed for POK, was modified so that it can generate avionic applications from an AADL model for the simulator SIMA (an IMA simulator compliant with the ARINC 653 standards). In the second design flow, Simulink is used to simulate the external world of the IMA module, thanks to the availability of an avionics library offering many avionics sensors and actuators, as well as its effectiveness in creating Simulink models. The cosimulation platform is composed of two simulators: Simulink for the simulation of peripherals and SIMA for the simulation of the IMA module, the latter being considered an ideal alternative to the very expensive commercial development environments. For good portability, a SIMA partition is reserved to act as an "adapter" that synchronizes the communication between the two simulators via the TCP/IP protocol. When the avionics applications are ported to the implementation platform (such as PikeOS) after simulation, only the "adapter" needs to be modified, because the internal communication and the system configuration remain the same. An avionics

  7. [Ideas and methods on efficient screening of traditional medicines for anti-osteoporosis activity based on M-Act/Tox integrated evaluation using zebrafish].

    Science.gov (United States)

    Wang, Mo; Ling, Jie; Chen, Ying; Song, Jie; Sun, E; Shi, Zi-Qi; Feng, Liang; Jia, Xiao-Bin; Wei, Ying-Jie

    2017-11-01

    The increasingly apparent liver injury problems of bone-strengthening Chinese medicines have brought challenges for clinical application, and it is necessary to consider both effectiveness and safety when screening anti-osteoporosis Chinese medicines. Metabolic transformation is closely related to drug efficacy and toxicity, so it is important to comprehensively consider metabolism-action/toxicity (M-Act/Tox) when screening anti-osteoporosis Chinese medicines. The current evaluation models and the large number of compounds (including metabolites) severely restrict efficient screening in vivo. By referring to previous relevant research and domestic and foreign literature, a zebrafish M-Act/Tox integrative method is put forward for efficiently screening anti-osteoporosis herbal medicines, which organically integrates a zebrafish metabolism model, an osteoporosis model and a toxicity evaluation method. This method can break through the bottleneck and blind spots whereby trace components cannot be evaluated efficiently and integrally in vivo, and realize efficient and comprehensive screening of anti-osteoporosis traditional medicines based on in vivo processes, taking both safety and effectiveness into account. This is significant for accelerating the discovery of effective and safe innovative traditional Chinese medicines for osteoporosis. Copyright© by the Chinese Pharmaceutical Association.

  8. Vulnerability assessment of archaeological sites to earthquake hazard: An indicator based method integrating spatial and temporal aspects

    Directory of Open Access Journals (Sweden)

    Despina Minos-Minopoulos

    2017-07-01

    Full Text Available Across the world, numerous sites of cultural heritage value are at risk from a variety of human-induced and natural hazards such as war and earthquakes. Here we present and test a novel indicator-based method for assessing the vulnerability of archaeological sites to earthquakes. Vulnerability is approached as a dynamic element assessed through a combination of spatial and temporal parameters. The spatial parameters examine the susceptibility of the sites to the secondary Earthquake Environmental Effects of ground liquefaction, landslides and tsunami and are expressed through the Spatial Susceptibility Index (SSi). Parameters of physical vulnerability, economic importance and visitor density examine the temporal vulnerability of the sites, expressed through the Temporal Vulnerability Index (TVi). The equally weighted sum of the spatial and temporal indexes represents the total Archaeological Site Vulnerability Index (A.S.V.I.). The A.S.V.I. method is applied at 16 archaeological sites across Greece, allowing an assessment of their vulnerability. This then allows the establishment of a regional and national priority list for considering future risk mitigation. Results indicate that (i) the majority of the sites have low to moderate vulnerability to earthquake hazard, (ii) Neratzia Fortress on Kos and Heraion on Samos are characterised as highly vulnerable and should be prioritised for further studies and mitigation measures, and (iii) the majority of the sites are susceptible to at least one Earthquake Environmental Effect and present relatively high physical vulnerability attributed to the existing limited conservation works. This approach highlights the necessity for an effective vulnerability assessment methodology within the existing framework of disaster risk management for cultural heritage.
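
    The index arithmetic described above is simple enough to state directly; a minimal sketch (with hypothetical normalized sub-scores, and equal weights taken as 0.5 each, which may differ from the paper's normalization):

    ```python
    def asvi(ssi: float, tvi: float) -> float:
        """Archaeological Site Vulnerability Index as the equally weighted
        combination of the spatial (SSi) and temporal (TVi) indexes."""
        return 0.5 * ssi + 0.5 * tvi

    # Hypothetical normalized scores in [0, 1] for one site.
    print(asvi(ssi=0.6, tvi=0.3))   # -> 0.45
    ```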

  9. Continual integration method in the polaron model

    International Nuclear Information System (INIS)

    Kochetov, E.A.; Kuleshov, S.P.; Smondyrev, M.A.

    1981-01-01

    The article is devoted to the investigation of a polaron system on the basis of a variational approach formulated in the language of continuum integration. A variational method generalizing Feynman's to the case of nonzero total momentum of the system has been formulated. The polaron state has been investigated at zero temperature. The problem of the bound state of two polarons exchanging quanta of a scalar field, as well as the problem of polaron scattering by an external field in the Born approximation, have been considered. The thermodynamics of the polaron system has been investigated; namely, high-temperature expansions for the mean energy and the effective polaron mass have been studied [ru]

  10. Using an Integrated Group Decision Method Based on SVM, TFN-RS-AHP, and TOPSIS-CD for Cloud Service Supplier Selection

    Directory of Open Access Journals (Sweden)

    Lian-hui Li

    2017-01-01

    Full Text Available To solve the cloud service supplier selection problem arising with the emergence of cloud computing, an integrated group decision method is proposed. The cloud service supplier selection index framework is built from the two perspectives of technology and technology management. A support vector machine (SVM)-based classification model is applied for preliminary screening to reduce the number of candidate suppliers. A triangular fuzzy number-rough sets-analytic hierarchy process (TFN-RS-AHP) method is designed to calculate the suppliers' index values from experts' wisdom and experience. The index weights are determined by criteria importance through intercriteria correlation (CRITIC). The suppliers are evaluated by an improved TOPSIS replacing Euclidean distance with connection distance (TOPSIS-CD). A case from an electric power enterprise is given to illustrate the correctness and feasibility of the proposed method.
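
    For reference, a compact generic TOPSIS implementation (using standard Euclidean distances rather than the connection-distance variant of the paper; the decision matrix and weights below are made up):

    ```python
    import numpy as np

    def topsis(matrix, weights, benefit):
        """matrix: candidates x criteria; benefit[j] is True if larger is better."""
        m = matrix / np.linalg.norm(matrix, axis=0)          # vector-normalize columns
        v = m * weights                                      # weight the criteria
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
        d_pos = np.linalg.norm(v - ideal, axis=1)
        d_neg = np.linalg.norm(v - anti, axis=1)
        return d_neg / (d_pos + d_neg)                       # closeness coefficient

    scores = topsis(
        matrix=np.array([[0.8, 120.0, 0.90],
                         [0.6,  90.0, 0.95],
                         [0.9, 150.0, 0.85]]),
        weights=np.array([0.5, 0.2, 0.3]),
        benefit=np.array([True, False, True]),   # middle criterion is a cost
    )
    print(scores, "best supplier:", int(scores.argmax()) + 1)
    ```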

  11. An integrated approach coupling physically based models and probabilistic method to assess quantitatively landslide susceptibility at different scale: application to different geomorphological environments

    Science.gov (United States)

    Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine

    2016-04-01

    Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (i.e. inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (i.e. data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions, and for some models it is possible to integrate land use and climate change. In contrast, major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
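
    As a rough sketch of the Monte Carlo treatment of parameter variability (using the simple infinite-slope factor of safety instead of the Morgenstern-Price method implemented in ALICE®; all parameter values below are assumed):

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000
    phi = np.radians(rng.normal(30.0, 3.0, n))     # friction angle [deg -> rad]
    c = rng.normal(5.0e3, 1.5e3, n)                # cohesion [Pa]
    gamma, z, beta, m = 19.0e3, 2.0, np.radians(25.0), 0.5  # unit weight, depth, slope, saturation
    gamma_w = 9.81e3                               # unit weight of water [N/m^3]

    # Infinite-slope factor of safety with pore pressure u = m * gamma_w * z * cos^2(beta).
    fos = (c + (gamma * z - m * gamma_w * z) * np.cos(beta)**2 * np.tan(phi)) / (
        gamma * z * np.sin(beta) * np.cos(beta))

    print("P(FoS < 1) =", np.mean(fos < 1.0))
    ```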

  12. Improvements of the integral transport theory method

    International Nuclear Information System (INIS)

    Kavenoky, A.; Lam-Hime, M.; Stankovski, Z.

    1979-01-01

    The integral transport theory is widely used in practical reactor design calculations; however, it is computationally expensive for two-dimensional calculations of large media. In the first part of this report a new treatment is presented; it is based on the Galerkin method: inside each region the total flux is expanded over a three-component basis. Numerical comparison shows that this method can considerably reduce the computing time. The second part of this report is devoted to homogenization theory: a straightforward calculation of the fundamental mode for a heterogeneous cell is presented. First, a general presentation of the problem is given; then it is simplified to plane geometry and numerical results are presented

  13. Decision tree-based method for integrating gene expression, demographic, and clinical data to determine disease endotypes

    Science.gov (United States)

    Complex diseases are often difficult to diagnose, treat, and study due to the multi-factorial nature of their etiology. Significant challenges exist with regard to how to segregate individuals into suitable subtypes of the disease. Here, we examine a range of methods for evaluati...

  14. Selective Integration in the Material-Point Method

    DEFF Research Database (Denmark)

    Andersen, Lars; Andersen, Søren; Damkilde, Lars

    2009-01-01

    The paper deals with stress integration in the material-point method. In order to avoid parasitic shear in bending, a formulation is proposed, based on selective integration in the background grid that is used to solve the governing equations. The suggested integration scheme is compared...... to a traditional material-point-method computation in which the stresses are evaluated at the material points. The deformation of a cantilever beam is analysed, assuming elastic or elastoplastic material behaviour....

  15. Efficient orbit integration by manifold correction methods.

    Science.gov (United States)

    Fukushima, Toshio

    2005-12-01

    Triggered by a desire to investigate, numerically, the planetary precession through a long-term numerical integration of the solar system, we developed a new formulation of numerical integration of orbital motion named manifold correction methods. The main trick is to rigorously retain the consistency of physical relations, such as the orbital energy, the orbital angular momentum, or the Laplace integral, of a binary subsystem. This maintenance is done by applying a correction to the integrated variables at each integration step. Typical corrections are certain geometric transformations, such as spatial scaling and spatial rotation, which are commonly used in the comparison of reference frames, or mathematically reasonable operations, such as modularization of angle variables into the standard domain [-pi, pi). The form into which the manifold correction methods finally evolved is the orbital longitude methods, which enable us to conduct an extremely precise integration of orbital motions. In unperturbed orbits, the integration errors are suppressed at the machine-epsilon level for an indefinitely long period. In perturbed cases, on the other hand, the errors initially grow in proportion to the square root of time and then increase more rapidly, the onset of which depends on the type and magnitude of the perturbations. This feature is also realized for highly eccentric orbits by applying the same idea as used in KS-regularization. In particular, the introduction of time elements greatly enhances the performance of numerical integration of KS-regularized orbits, whether the scaling is applied or not.
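
    One simple variant of the manifold-correction idea, rescaling the velocity after every ordinary step so that the orbital energy of a Kepler two-body problem is conserved exactly, can be sketched as follows. This is only an illustration of the principle, not the orbital longitude formulation described in the abstract.

```python
# Minimal sketch: leapfrog integration of a Kepler orbit with an energy-scaling
# manifold correction applied after every step (velocity rescaled so that the
# energy integral keeps its initial value exactly).
import numpy as np

mu = 1.0                       # gravitational parameter (normalised units)

def accel(r):
    return -mu * r / np.linalg.norm(r) ** 3

def leapfrog_step(r, v, dt):
    v_half = v + 0.5 * dt * accel(r)
    r_new = r + dt * v_half
    v_new = v_half + 0.5 * dt * accel(r_new)
    return r_new, v_new

def energy(r, v):
    return 0.5 * np.dot(v, v) - mu / np.linalg.norm(r)

r, v = np.array([1.0, 0.0]), np.array([0.0, 1.1])   # mildly eccentric bound orbit
E0 = energy(r, v)
dt, n_steps = 0.01, 100000

for _ in range(n_steps):
    r, v = leapfrog_step(r, v, dt)
    # manifold correction: scale |v| so the orbital energy stays on its manifold
    v *= np.sqrt(2.0 * (E0 + mu / np.linalg.norm(r))) / np.linalg.norm(v)

print("relative energy error after correction:", abs(energy(r, v) - E0) / abs(E0))
```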

  16. METHODS OF INTEGRATED OPTIMIZATION MAGLEV TRANSPORT SYSTEMS

    Directory of Open Access Journals (Sweden)

    A. Lasher

    2013-09-01

    example, this research proved the sustainability of the proposed integrated optimization of transport system parameters. This approach could be applied not only to MTS but also to other transport systems. Originality. The basis of the complex optimization of transport presented here is a new system of universal scientific methods and approaches that ensure high accuracy and reliability of calculations when simulating transport systems and transport networks, taking into account the dynamics of their development. Practical value. The development of the theoretical and technological bases for complex optimization of transport makes it possible to create a scientific tool that supports automated simulation and calculation of the technical and economic structure and operating technology of different transport objects, including their infrastructure.

  17. A review on microscale polymerase chain reaction based methods in molecular diagnosis, and future prospects for the fabrication of fully integrated portable biomedical devices.

    Science.gov (United States)

    Lee, Nae Yoon

    2018-05-08

    Since the advent of microfabrication technology and soft lithography, the lab-on-a-chip concept has emerged as a state-of-the-art miniaturized tool for conducting the multiple functions associated with micro total analysis of nucleic acids, in series, in a seamless manner with a minuscule volume of sample. The enhanced surface-to-volume ratio inside a microchannel enables fast reactions owing to increased heat dissipation, allowing rapid amplification. For this reason, PCR has been one of the first applications to be miniaturized in a portable format. However, the nature of the basic working principle of microscale PCR, such as the complicated temperature control and the use of a thermal cycler, has hindered its total integration with other components into a micro total analysis system (μTAS). This review (with 179 references) surveys the diverse forms of PCR microdevices constructed on the basis of different working principles and evaluates their performance. The first two main sections cover the state of the art in chamber-type PCR microdevices and in continuous-flow PCR microdevices. Methods are then discussed that lead to microdevices with upstream sample purification and downstream detection schemes, with a particular focus on rapid on-site detection of foodborne pathogens. Next, the potential for miniaturizing and automating heaters and pumps is examined. The review concludes with sections on aspects of complete functional integration in conjunction with nanomaterial-based sensing, a discussion on future prospects, and conclusions. Graphical abstract In recent years, thermocycler-based PCR systems have been miniaturized to palm-sized, disposable polymer platforms. In addition, operational accessories such as heaters and mechanical pumps have been simplified to realize semi-automated, stand-alone, portable biomedical diagnostic microdevices that are directly applicable in the field. This review summarizes the progress made and the current state of this

  18. Method of manufacturing Josephson junction integrated circuits

    International Nuclear Information System (INIS)

    Jillie, D.W. Jr.; Smith, L.N.

    1985-01-01

    Josephson junction integrated circuits of the current-injection type and of the magnetically controlled type utilize a superconductive layer that forms both the Josephson junction electrodes for the Josephson junction devices on the integrated circuit and a ground plane for the integrated circuit. Large-area Josephson junctions are utilized for effecting contact to lower superconductive layers, and islands are formed in superconductive layers to provide isolation between the ground-plane function and the Josephson junction electrode function as well as to effect crossovers. A superconductor-barrier-superconductor trilayer patterned by local anodization is also utilized, with additional layers formed thereover. Methods of manufacturing the embodiments of the invention are disclosed

  19. A dynamic integrated fault diagnosis method for power transformers.

    Science.gov (United States)

    Gao, Wensheng; Bai, Cuifen; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that the fault diagnosis process in reality is multistep, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Different from the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which gives the most effective diagnostic test to be performed in the next step. Therefore, it can reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and the validity of this method is verified.
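
    A toy version of the multistep evidence-selection idea can be sketched with a flat Bayes update instead of a full Bayesian network: the next diagnostic test is the one with the largest expected reduction in entropy of the fault posterior. All failure modes, symptoms and probabilities below are invented for illustration and are not the paper's model.

```python
# Toy sequential diagnosis sketch (not the paper's Bayesian network): Bayes
# updates over three hypothetical failure modes, with the next test chosen by
# expected entropy reduction.
import numpy as np

modes = ["winding fault", "insulation aging", "core overheating"]
prior = np.array([0.3, 0.4, 0.3])

# P(symptom positive | failure mode), one row per diagnostic test (made up)
tests = {
    "dissolved gas C2H2 high":  np.array([0.85, 0.20, 0.40]),
    "partial discharge":        np.array([0.70, 0.60, 0.10]),
    "top-oil temperature high": np.array([0.30, 0.25, 0.90]),
}

def entropy(p):
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

def posterior(prior, lik, positive=True):
    like = lik if positive else 1.0 - lik
    post = prior * like
    return post / post.sum()

def best_next_test(prior, remaining):
    """Pick the test with the largest expected entropy reduction."""
    scores = {}
    for name in remaining:
        lik = tests[name]
        p_pos = (prior * lik).sum()
        exp_h = p_pos * entropy(posterior(prior, lik, True)) \
              + (1 - p_pos) * entropy(posterior(prior, lik, False))
        scores[name] = entropy(prior) - exp_h
    return max(scores, key=scores.get)

belief, remaining = prior, set(tests)
observed = {"dissolved gas C2H2 high": True, "partial discharge": False,
            "top-oil temperature high": True}          # hypothetical outcomes
while remaining:
    t = best_next_test(belief, remaining)
    belief = posterior(belief, tests[t], observed[t])
    remaining.discard(t)
    print(f"after {t!r}: most probable mode = {modes[int(belief.argmax())]}")
```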

  20. A Dynamic Integrated Fault Diagnosis Method for Power Transformers

    Science.gov (United States)

    Gao, Wensheng; Liu, Tong

    2015-01-01

    In order to diagnose transformer faults efficiently and accurately, a dynamic integrated fault diagnosis method based on a Bayesian network is proposed in this paper. First, an integrated fault diagnosis model is established based on the causal relationships among abnormal working conditions, failure modes, and failure symptoms of transformers, aimed at obtaining the most probable failure mode. Then, considering that the evidence input into the diagnosis model is acquired gradually and that the fault diagnosis process in reality is multistep, a dynamic fault diagnosis mechanism is proposed based on the integrated fault diagnosis model. Different from the existing one-step diagnosis mechanism, it includes a multistep evidence-selection process, which gives the most effective diagnostic test to be performed in the next step. Therefore, it can reduce unnecessary diagnostic tests and improve the accuracy and efficiency of diagnosis. Finally, the dynamic integrated fault diagnosis method is applied to actual cases, and the validity of this method is verified. PMID:25685841

  1. Variational method for integrating radial gradient field

    Science.gov (United States)

    Legarda-Saenz, Ricardo; Brito-Loeza, Carlos; Rivera, Mariano; Espinosa-Romero, Arturo

    2014-12-01

    We propose a variational method for integrating information obtained from a circular fringe pattern. The proposed method is a suitable choice for objects with radial symmetry. First, we analyze the information contained in the fringe pattern captured by the experimental setup, and then we formulate the problem of recovering the wavefront using techniques from the calculus of variations. The performance of the method is demonstrated by numerical experiments with both synthetic and real data.

  2. Recovery actions in PRA [probabilistic risk assessment] for the Risk Methods Integration and Evaluation Program (RMIEP): Volume 1, Development of the data-based method

    International Nuclear Information System (INIS)

    Weston, L.M.; Whitehead, D.W.; Graves, N.L.

    1987-06-01

    In a probabilistic risk assessment (PRA) for a nuclear power plant, the analyst identifies a set of potential core damage events consisting of equipment failures and human errors and their estimated probabilities of occurrence. If operator recovery from an event within some specified time is considered, then the probability of this recovery can be included in the PRA. This report provides PRA analysts with an improved methodology for including recovery actions in a PRA. A recovery action can be divided into two distinct phases: a Diagnosis Phase (realizing that there is a problem with a critical parameter and deciding upon the correct course of action) and an Action Phase (physically accomplishing the required action). In this methodology, simulator data are used to estimate recovery probabilities for the diagnosis phase. Different time-reliability curves showing the probability of failure of diagnosis as a function of time from the compelling cue for the event are presented. These curves are based on simulator exercises, and the actions are grouped based upon their operational similarities. This is an improvement over existing diagnosis models that rely greatly upon subjective judgment to obtain such estimates. The action phase is modeled using estimates from available sources. The methodology also includes a recommendation on where and when to apply the recovery action in the PRA process
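
    A time-reliability curve of the kind described can be represented, for illustration, as the survival function of a lognormal diagnosis-time distribution. The median and spread below are assumptions for the sketch, not the simulator-derived values of the report.

```python
# Hypothetical time-reliability curve sketch: probability that the crew has NOT
# diagnosed the event t minutes after the compelling cue, modelled with an
# assumed lognormal diagnosis-time distribution.
import numpy as np
from scipy.stats import lognorm

median_minutes = 10.0       # assumed median time to diagnose
sigma = 0.8                 # assumed lognormal shape (spread)
diag_time = lognorm(s=sigma, scale=median_minutes)

def p_diagnosis_failure(t_available):
    """P(diagnosis not completed within the available time)."""
    return diag_time.sf(t_available)

for t in (5, 10, 20, 40, 60):
    print(f"{t:3d} min available -> P(non-diagnosis) = {p_diagnosis_failure(t):.3f}")
```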

  3. New method for calculation of integral characteristics of thermal plumes

    DEFF Research Database (Denmark)

    Zukowska, Daria; Popiolek, Zbigniew; Melikov, Arsen Krikor

    2008-01-01

    A method for calculation of integral characteristics of thermal plumes is proposed. The method allows for determination of the integral parameters of plumes based on speed measurements performed with omnidirectional low velocity thermoanemometers. The method includes a procedure for calculation...... of the directional velocity (upward component of the mean velocity). The method is applied for determination of the characteristics of an asymmetric thermal plume generated by a sitting person. The method was validated in full-scale experiments in a climatic chamber with a thermal manikin as a simulator of a sitting...

  4. Mining method selection by integrated AHP and PROMETHEE method.

    Science.gov (United States)

    Bogdanovic, Dejan; Nikolic, Djordje; Ilic, Ivana

    2012-03-01

    Selecting the best mining method among many alternatives is a multicriteria decision-making problem. The aim of this paper is to demonstrate the implementation of an integrated approach that employs AHP and PROMETHEE together for selecting the most suitable mining method for the "Coka Marin" underground mine in Serbia. The related problem includes five possible mining methods and eleven criteria to evaluate them. The criteria are carefully chosen to cover the most important parameters that impact mining method selection, such as geological and geotechnical properties, economic parameters and geographical factors. AHP is used to analyze the structure of the mining method selection problem and to determine the weights of the criteria, and the PROMETHEE method is used to obtain the final ranking and to perform a sensitivity analysis by changing the weights. The results have shown that the proposed integrated method can be successfully used in solving mining engineering problems.
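
    The ranking step can be sketched as a compact PROMETHEE II computation in which the criterion weights are taken as given (in the paper they come from AHP pairwise comparisons). The alternatives, criteria and numbers below are illustrative only.

```python
# PROMETHEE II sketch with a linear preference function; weights are assumed to
# come from an AHP step.  All numbers are illustrative.
import numpy as np

def promethee_ii(X, w, p=None):
    """Net outranking flows; all criteria treated as 'benefit' criteria."""
    n, _ = X.shape
    p = np.ptp(X, axis=0) if p is None else p       # preference thresholds
    pref = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            d = X[i] - X[j]
            # linear preference function: 0 below 0, linear up to threshold p
            pi = np.clip(d / np.where(p == 0, 1.0, p), 0.0, 1.0)
            pref[i, j] = np.dot(w, pi)
    phi_plus = pref.sum(axis=1) / (n - 1)
    phi_minus = pref.sum(axis=0) / (n - 1)
    return phi_plus - phi_minus

if __name__ == "__main__":
    # rows = candidate mining methods, columns = evaluation criteria
    X = np.array([[6.0, 0.70, 3.2],
                  [7.5, 0.55, 2.8],
                  [5.8, 0.80, 3.9],
                  [6.9, 0.65, 3.0]])
    w = np.array([0.5, 0.3, 0.2])                   # e.g. AHP-derived weights
    net_flow = promethee_ii(X, w)
    print("net flows:", np.round(net_flow, 3))
    print("ranking (best first):", np.argsort(-net_flow))
```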

  5. Approximation of the exponential integral (well function) using sampling methods

    Science.gov (United States)

    Baalousha, Husam Musa

    2015-04-01

    The exponential integral (also known as the well function) is often used in hydrogeology to solve the Theis and Hantush equations. Many methods have been developed to approximate the exponential integral. Most of these methods are based on numerical approximations and are valid for a certain range of the argument value. This paper presents a new approach to approximating the exponential integral, based on sampling methods. Three different sampling methods, Latin Hypercube Sampling (LHS), Orthogonal Array (OA), and Orthogonal Array-based Latin Hypercube (OA-LH), have been used to approximate the function. Different argument values, covering a wide range, have been used. The results of the sampling methods were compared with results obtained with Mathematica software, which was used as a benchmark. All three sampling methods converge to the result obtained by Mathematica, at different rates. It was found that the orthogonal array (OA) method has the fastest convergence rate compared with LHS and OA-LH. The root mean square error (RMSE) of OA was on the order of 1E-08. This method can be used with any argument value, and can be used to solve other integrals in hydrogeology, such as the leaky aquifer integral.
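
    One way such a sampling approximation can be constructed (not necessarily the construction used in the paper) is to write E1(u) = exp(-u) E[1/(u + S)] with S exponentially distributed and estimate the expectation with Latin Hypercube samples, using scipy's exp1 as the benchmark:

```python
# Sampling-based sketch of the exponential integral (well function) E1(u) using
# one-dimensional Latin Hypercube samples of an Exp(1) variable.
import numpy as np
from scipy.special import exp1

rng = np.random.default_rng(0)

def lhs_uniform(n):
    """1-D Latin Hypercube sample of U(0, 1): one point per stratum, shuffled."""
    u = (np.arange(n) + rng.random(n)) / n
    rng.shuffle(u)
    return u

def well_function_lhs(u, n=2000):
    s = -np.log(1.0 - lhs_uniform(n))        # Exp(1) via inverse-CDF transform
    return np.exp(-u) * np.mean(1.0 / (u + s))

for u in (0.1, 0.5, 1.0, 5.0):
    approx, exact = well_function_lhs(u), exp1(u)
    print(f"u={u:4.1f}  LHS={approx:.6f}  exact={exact:.6f}  "
          f"rel.err={abs(approx - exact) / exact:.2e}")
```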

  6. Nonlinear structural analysis using integrated force method

    Indian Academy of Sciences (India)

    A new formulation termed the Integrated Force Method (IFM) was proposed by Patnaik ... nated "Structure (n, m)" where (n, m) are the force and displacement degrees of ..... Patnaik S N, Yadagiri S 1976 Frequency analysis of structures.

  7. A Theoretical and Empirical Integrated Method to Select the Optimal Combined Signals for Geometry-Free and Geometry-Based Three-Carrier Ambiguity Resolution.

    Science.gov (United States)

    Zhao, Dongsheng; Roberts, Gethin Wyn; Lau, Lawrence; Hancock, Craig M; Bai, Ruibin

    2016-11-16

    Twelve GPS Block IIF satellites in the current constellation can transmit three-frequency signals (L1, L2, L5). Taking advantage of these signals, Three-Carrier Ambiguity Resolution (TCAR) is expected to bring much benefit to ambiguity resolution. One research area is to find the optimal combined signals for better ambiguity resolution in geometry-free (GF) and geometry-based (GB) mode. However, existing research selects the signals through either pure theoretical analysis or testing with simulated data, which might be biased, as the real observation conditions could differ from theoretical prediction or simulation. In this paper, we propose a theoretical and empirical integrated method, which first selects the possible optimal combined signals in theory and then refines these signals with real triple-frequency GPS data, observed at eleven baselines of different lengths. An interpolation technique is also adopted in order to show changes in the AR performance with increasing baseline length. The results show that the AR success rate can be improved by 3% in GF mode and 8% in GB mode at certain intervals of baseline length. Therefore, TCAR can perform better by adopting the combined signals proposed in this paper when the baseline meets the length condition.

  8. Indirect methods for wake potential integration

    International Nuclear Information System (INIS)

    Zagorodnov, I.

    2006-05-01

    The development of modern accelerator and free-electron laser projects requires the consideration of wake fields of very short bunches in arbitrary three-dimensional structures. Obtaining the wake numerically by direct integration is difficult, since it takes a long time for the scattered fields to catch up to the bunch. On the other hand, no general algorithm for indirect wake field integration is available in the literature so far. In this paper we review the known indirect methods to compute wake potentials in rotationally symmetric and cavity-like three-dimensional structures. For arbitrary three-dimensional geometries we introduce several new techniques and test them numerically. (Orig.)

  9. Numerical methods for engine-airframe integration

    International Nuclear Information System (INIS)

    Murthy, S.N.B.; Paynter, G.C.

    1986-01-01

    Various papers on numerical methods for engine-airframe integration are presented. The individual topics considered include: the scientific computing environment for the 1980s, an overview of the prediction of complex turbulent flows, numerical solutions of the compressible Navier-Stokes equations, elements of computational engine/airframe integration, computational requirements for efficient engine installation, application of CAE and CFD techniques to complete tactical missile design, CFD applications to engine/airframe integration, and application of a second-generation low-order panel method to powerplant installation studies. Also addressed are: three-dimensional flow analysis of turboprop inlet and nacelle configurations, application of computational methods to the design of large turbofan engine nacelles, comparison of full potential and Euler solution algorithms for aeropropulsive flow field computations, subsonic/transonic and supersonic nozzle flows and nozzle integration, subsonic/transonic prediction capabilities for nozzle/afterbody configurations, three-dimensional viscous design methodology for supersonic inlet systems for advanced technology aircraft, and a user's technology assessment

  10. Risk assessment of water pollution sources based on an integrated k-means clustering and set pair analysis method in the region of Shiyan, China.

    Science.gov (United States)

    Li, Chunhui; Sun, Lian; Jia, Junxiang; Cai, Yanpeng; Wang, Xuan

    2016-07-01

    Source water areas face many potential water pollution risks. Risk assessment is an effective method to evaluate such risks. In this paper an integrated model based on k-means clustering analysis and set pair analysis was established for evaluating the risks associated with water pollution in source water areas, in which the weights of indicators were determined through the entropy weight method. The proposed model was then applied to assess water pollution risks in the region of Shiyan, in which the Danjiangkou Reservoir, China's key source water area for the middle route of the South-to-North Water Diversion Project, is located. The results showed that eleven sources with relatively high risk values were identified. At the regional scale, Shiyan City and Danjiangkou City had high risk values in terms of industrial discharge. Comparatively, Danjiangkou City and Yunxian County had high risk values in terms of agricultural pollution. Overall, the risk values of the northern areas close to the main stream and reservoir of the region of Shiyan were higher than those in the south. The risk levels indicated that five sources were at a lower risk level (level II), two at a moderate risk level (level III), one at a higher risk level (level IV) and three at the highest risk level (level V). Risks from industrial discharge are also higher than those from the agricultural sector. It is thus essential to manage the pillar industries of the region of Shiyan and certain agricultural companies in the vicinity of the reservoir to reduce the water pollution risks of source water areas. Copyright © 2016 Elsevier B.V. All rights reserved.
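
    The entropy weight step mentioned in the abstract can be sketched directly: indicators whose values vary more across sources carry more information and receive larger weights. The matrix below is illustrative, not the study's data.

```python
# Entropy weight method sketch: rows = pollution sources, columns = risk
# indicators; more dispersed indicators receive larger weights.
import numpy as np

def entropy_weights(X):
    """X: sources x indicators, all indicators assumed 'larger is riskier'."""
    P = X / X.sum(axis=0)                     # proportion within each indicator
    n = X.shape[0]
    with np.errstate(divide="ignore", invalid="ignore"):
        logP = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logP).sum(axis=0) / np.log(n)   # Shannon entropy per indicator
    d = 1.0 - e                               # degree of diversification
    return d / d.sum()

if __name__ == "__main__":
    # illustrative matrix: e.g. pollutant load, toxicity index, proximity score
    X = np.array([[12.0, 0.8, 3.0],
                  [30.0, 0.4, 5.0],
                  [18.0, 0.9, 2.0],
                  [25.0, 0.5, 4.0]])
    print("entropy weights:", np.round(entropy_weights(X), 3))
```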

  11. Permutation statistical methods an integrated approach

    CERN Document Server

    Berry, Kenneth J; Johnston, Janis E

    2016-01-01

    This research monograph provides a synthesis of a number of statistical tests and measures, which, at first consideration, appear disjoint and unrelated. Numerous comparisons of permutation and classical statistical methods are presented, and the two methods are compared via probability values and, where appropriate, measures of effect size. Permutation statistical methods, compared to classical statistical methods, do not rely on theoretical distributions, avoid the usual assumptions of normality and homogeneity of variance, and depend only on the data at hand. This text takes a unique approach to explaining statistics by integrating a large variety of statistical methods, and establishing the rigor of a topic that to many may seem to be a nascent field in statistics. This topic is new in that it took modern computing power to make permutation methods available to people working in the mainstream of research. This research monograph addresses a statistically-informed audience, and can also easily serve as a ...
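
    The core resampling idea is easy to sketch with a two-sample permutation test of a difference in means, where the reference distribution comes from the data themselves rather than from a theoretical distribution:

```python
# Minimal two-sample permutation test of a difference in means.
import numpy as np

rng = np.random.default_rng(1)

def permutation_test(x, y, n_perm=10000):
    observed = x.mean() - y.mean()
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        diff = perm[:len(x)].mean() - perm[len(x):].mean()
        if abs(diff) >= abs(observed):
            count += 1
    return observed, (count + 1) / (n_perm + 1)   # add-one adjusted p-value

x = rng.normal(0.5, 1.0, 20)     # simulated treatment group
y = rng.normal(0.0, 1.0, 25)     # simulated control group
diff, p = permutation_test(x, y)
print(f"observed difference = {diff:.3f}, permutation p-value = {p:.4f}")
```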

  12. Rule-based Information Integration

    NARCIS (Netherlands)

    de Keijzer, Ander; van Keulen, Maurice

    2005-01-01

    In this report, we show the process of information integration. We specifically discuss the language used for integration. We show that integration consists of two phases, the schema mapping phase and the data integration phase. We formally define transformation rules, conversion, evolution and

  13. Continuous integration congestion cost allocation based on sensitivity

    International Nuclear Information System (INIS)

    Wu, Z.Q.; Wang, Y.N.

    2004-01-01

    Congestion cost allocation is a very important topic in congestion management. Allocation methods based on the Aumann-Shapley value use discrete numerical integration, which needs to solve the incremental OPF problem many times and as such is not suitable for practical application to large-scale systems. The optimal solution and the tendency of its sensitivity to change during congestion removal using a DC optimal power flow (OPF) process are analysed. A simple continuous integration method based on the sensitivity is proposed for congestion cost allocation. The proposed sensitivity analysis method needs less computation time than the method based on the quadratic method and interior-point iteration. The proposed congestion cost allocation method uses continuous integration rather than discrete numerical integration. The method does not need to solve incremental OPF solutions, which allows its use in large-scale systems. The method can also be used for AC OPF congestion management. (author)
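
    The Aumann-Shapley allocation that such methods approximate can be sketched generically: each participant pays its demand times the integral of the marginal cost along the proportional scaling path. The quadratic cost below is a hypothetical stand-in for the DC-OPF congestion cost, and Gauss-Legendre quadrature stands in for the paper's sensitivity-based continuous integration.

```python
# Aumann-Shapley allocation sketch: allocation_i = d_i * integral_0^1 of the
# marginal cost dC(t*d)/dd_i dt, evaluated by Gauss-Legendre quadrature.
import numpy as np

def congestion_cost(d):
    """Hypothetical quadratic congestion cost of demand vector d (stands in for an OPF)."""
    Q = np.array([[2.0, 0.5, 0.1],
                  [0.5, 1.5, 0.3],
                  [0.1, 0.3, 1.0]])
    return 0.5 * d @ Q @ d

def marginal_cost(d, eps=1e-6):
    """Central-difference gradient of the cost w.r.t. each participant's demand."""
    g = np.zeros_like(d)
    for i in range(len(d)):
        e = np.zeros_like(d); e[i] = eps
        g[i] = (congestion_cost(d + e) - congestion_cost(d - e)) / (2 * eps)
    return g

def aumann_shapley_allocation(d, n_nodes=8):
    t, w = np.polynomial.legendre.leggauss(n_nodes)
    t, w = 0.5 * (t + 1.0), 0.5 * w          # map nodes/weights from [-1,1] to [0,1]
    prices = sum(wk * marginal_cost(tk * d) for tk, wk in zip(t, w))
    return d * prices

d = np.array([10.0, 6.0, 4.0])
alloc = aumann_shapley_allocation(d)
print("allocated costs:", np.round(alloc, 3))
print("sum of allocations:", round(alloc.sum(), 3),
      "total cost:", round(congestion_cost(d), 3))
```

    For a quadratic cost the allocations recover the total cost exactly, which is the cost-recovery property that makes the Aumann-Shapley scheme attractive for congestion charging.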

  14. Assessing Backwards Integration as a Method of KBO Family Finding

    Science.gov (United States)

    Benfell, Nathan; Ragozzine, Darin

    2018-04-01

    The age of young asteroid collisional families can sometimes be determined by using backwards n-body integrations of the solar system. This method is not used for discovering young asteroid families and is limited by the unpredictable influence of the Yarkovsky effect on individual asteroids over time. Since these limitations are not as important for objects in the Kuiper belt, Marcus et al. 2011 suggested that backwards integration could be used to discover and characterize collisional families in the outer solar system. But various challenges present themselves when running precise and accurate 4+ Gyr integrations of Kuiper Belt objects. We have created simulated families of Kuiper Belt Objects with identical starting locations and velocity distributions, based on the Haumea Family. We then ran several long-term test integrations to observe the effect of various simulation parameters on integration results. These integrations were then used to investigate which parameters are significant enough to require inclusion in the integration. Thereby we determined how to construct long-term integrations that both yield significant results and require manageable processing power. Additionally, we have tested the use of backwards integration as a method of discovery of potential young families in the Kuiper Belt.

  15. Classification Method in Integrated Information Network Using Vector Image Comparison

    Directory of Open Access Journals (Sweden)

    Zhou Yuan

    2014-05-01

    Full Text Available A Wireless Integrated Information Network (WMN) consists of integrated information nodes that can acquire data, such as images and voice, from their surroundings. Transmitting this information requires large resources, which decreases the service time of the network. In this paper we present a Classification Approach based on Vector Image Comparison (VIC) for WMN that improves the service time of the network. Methods for sub-region selection and conversion are also proposed.

  16. Agent-based enterprise integration

    Energy Technology Data Exchange (ETDEWEB)

    N. M. Berry; C. M. Pancerella

    1998-12-01

    The authors are developing and deploying software agents in an enterprise information architecture such that the agents manage enterprise resources and facilitate user interaction with these resources. The enterprise agents are built on top of a robust software architecture for data exchange and tool integration across heterogeneous hardware and software. The resulting distributed multi-agent system serves as a method of enhancing enterprises in the following ways: providing users with knowledge about enterprise resources and applications; accessing the dynamically changing enterprise; locating enterprise applications and services; and improving search capabilities for applications and data. Furthermore, agents can access non-agents (i.e., databases and tools) through the enterprise framework. The ultimate target of the effort is the user; they are attempting to increase user productivity in the enterprise. This paper describes their design and early implementation and discusses the planned future work.

  17. A Machine Learning Application Based in Random Forest for Integrating Mass Spectrometry-Based Metabolomic Data: A Simple Screening Method for Patients With Zika Virus

    Directory of Open Access Journals (Sweden)

    Carlos Fernando Odir Rodrigues Melo

    2018-04-01

    Full Text Available Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources, such as kits and reagents, that, in endemic areas, may result in a substantial financial burden on affected individuals and health services, veering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients that bear similar symptomatic conditions but are not necessarily infected with the disease. By feeding mass spectrometric data into the developed decision-making algorithm, we were able to provide a set of features that work as a “fingerprint” for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning are well-established and widely utilized tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a faster and more accurate diagnostic screening with improved cost-effectiveness when compared to existing technologies.

  18. A Machine Learning Application Based in Random Forest for Integrating Mass Spectrometry-Based Metabolomic Data: A Simple Screening Method for Patients With Zika Virus.

    Science.gov (United States)

    Melo, Carlos Fernando Odir Rodrigues; Navarro, Luiz Claudio; de Oliveira, Diogo Noin; Guerreiro, Tatiane Melina; Lima, Estela de Oliveira; Delafiori, Jeany; Dabaja, Mohamed Ziad; Ribeiro, Marta da Silva; de Menezes, Maico; Rodrigues, Rafael Gustavo Martins; Morishita, Karen Noda; Esteves, Cibele Zanardi; de Amorim, Aline Lopes Lucas; Aoyagui, Caroline Tiemi; Parise, Pierina Lorencini; Milanez, Guilherme Paier; do Nascimento, Gabriela Mansano; Ribas Freitas, André Ricardo; Angerami, Rodrigo; Costa, Fábio Trindade Maranhão; Arns, Clarice Weis; Resende, Mariangela Ribeiro; Amaral, Eliana; Junior, Renato Passini; Ribeiro-do-Valle, Carolina C; Milanez, Helaine; Moretti, Maria Luiza; Proenca-Modena, Jose Luiz; Avila, Sandra; Rocha, Anderson; Catharino, Rodrigo Ramos

    2018-01-01

    Recent Zika outbreaks in South America, accompanied by unexpectedly severe clinical complications, have brought much interest in fast and reliable screening methods for ZIKV (Zika virus) identification. Reverse-transcriptase polymerase chain reaction (RT-PCR) is currently the method of choice to detect ZIKV in biological samples. This approach, nonetheless, demands a considerable amount of time and resources, such as kits and reagents, that, in endemic areas, may result in a substantial financial burden on affected individuals and health services, veering them away from RT-PCR analysis. This study presents a powerful combination of high-resolution mass spectrometry and a machine-learning prediction model for data analysis to assess the existence of ZIKV infection across a series of patients that bear similar symptomatic conditions but are not necessarily infected with the disease. By feeding mass spectrometric data into the developed decision-making algorithm, we were able to provide a set of features that work as a "fingerprint" for this specific pathophysiological condition, even after the acute phase of infection. Since both mass spectrometry and machine learning are well-established and widely utilized tools within their respective fields, this combination of methods emerges as a distinct alternative for clinical applications, providing a faster and more accurate diagnostic screening with improved cost-effectiveness when compared to existing technologies.
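
    The machine-learning side of such a workflow can be sketched with scikit-learn; the feature matrix below is simulated and merely stands in for the per-sample m/z feature intensities extracted from the spectra.

```python
# Schematic random forest screening sketch on simulated spectral features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(7)

# simulated feature matrix: 120 patients x 300 spectral features
n_samples, n_features = 120, 300
X = rng.normal(size=(n_samples, n_features))
y = rng.integers(0, 2, size=n_samples)          # 0 = negative, 1 = ZIKV-positive
X[y == 1, :10] += 1.5                           # make 10 features informative

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3,
                                          stratify=y, random_state=0)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_tr, y_tr)

print(classification_report(y_te, clf.predict(X_te)))
# the most important features play the role of the "fingerprint"
top = np.argsort(clf.feature_importances_)[::-1][:5]
print("top discriminating features (indices):", top)
```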

  19. Explicit integration of extremely stiff reaction networks: partial equilibrium methods

    International Nuclear Information System (INIS)

    Guidry, M W; Hix, W R; Billings, J J

    2013-01-01

    In two preceding papers (Guidry et al 2013 Comput. Sci. Disc. 6 015001 and Guidry and Harris 2013 Comput. Sci. Disc. 6 015002), we have shown that when reaction networks are well removed from equilibrium, explicit asymptotic and quasi-steady-state approximations can give algebraically stabilized integration schemes that rival standard implicit methods in accuracy and speed for extremely stiff systems. However, we also showed that these explicit methods remain accurate but are no longer competitive in speed as the network approaches equilibrium. In this paper, we analyze this failure and show that it is associated with the presence of fast equilibration timescales that neither asymptotic nor quasi-steady-state approximations are able to remove efficiently from the numerical integration. Based on this understanding, we develop a partial equilibrium method to deal effectively with the approach to equilibrium and show that explicit asymptotic methods, combined with the new partial equilibrium methods, give an integration scheme that can plausibly deal with the stiffest networks, even in the approach to equilibrium, with accuracy and speed competitive with that of implicit methods. Thus we demonstrate that such explicit methods may offer alternatives to implicit integration of even extremely stiff systems and that these methods may permit integration of much larger networks than have been possible before in a number of fields. (paper)
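
    The asymptotic update at the heart of these explicit schemes can be illustrated on a single stiff equation dy/dt = F+ - k*y: when k*dt is large, the explicit Euler step is replaced by y_new = (y + dt*F+)/(1 + dt*k). The test problem below is a textbook linear example, not one of the paper's reaction networks.

```python
# Explicit asymptotic update sketch for a single stiff equation with a known
# exact solution y(t) = cos(t).
import numpy as np

k = 1000.0                                   # stiff destruction rate

def f_plus(t):
    return k * np.cos(t) - np.sin(t)         # chosen so that y(t) = cos(t) exactly

def integrate(t_end=10.0, dt=0.01, switch=1.0):
    n = int(round(t_end / dt))
    y = 1.0
    for i in range(n):
        t = i * dt
        if k * dt > switch:                  # stiff regime -> asymptotic update
            y = (y + dt * f_plus(t)) / (1.0 + dt * k)
        else:                                # non-stiff regime -> explicit Euler
            y = y + dt * (f_plus(t) - k * y)
    return y

y_num = integrate()
print("numerical:", y_num, " exact:", np.cos(10.0),
      " error:", abs(y_num - np.cos(10.0)))
```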

  20. Multistep Methods for Integrating the Solar System

    Science.gov (United States)

    1988-07-01

    Technical Report 1055: Multistep Methods for Integrating the Solar System, Panayotis A. Skordos, MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA 02139. The report describes research done at the Artificial Intelligence Laboratory of the Massachusetts Institute of Technology, supported by the Advanced Research Projects

  1. ISFAHAN HEALTHY HEART PROGRAM:A COMPREHENSIVE INTEGRATED COMMUNITY-BASED PROGRAM FOR CARDIOVASCULAR DISEASE PREVENTION AND CONTROL. DESIGN, METHODS AND INITIAL EXPERIENCE 2000-2001

    Directory of Open Access Journals (Sweden)

    N MOHAMMADI FARD

    2002-03-01

    Full Text Available The Isfahan Healthy Heart Program (IHHP) is a five- to six-year comprehensive, integrated, community-based program for preventing and controlling cardiovascular diseases (CVD) by reducing CVD risk factors and improving cardiovascular health behaviors in the target population. IHHP was started in 1999 and will run until 2004. A primary survey was done to collect baseline data from the intervention (Isfahan and Najafabad cities) and reference (Arak) communities. In a multistage sampling method, we randomly selected 5 to 10 percent of households in clusters. Individuals aged 19 years or older were then selected to enter the survey. In this way, data from 12,600 individuals (6,300 in the intervention counties and 6,300 in the reference county) were collected and stratified by living area (urban vs. rural) and by age and sex group. Cardiovascular risk factors (hypercholesterolemia, smoking, hypertension, diabetes mellitus, obesity) were investigated by laboratory tests (lipid profile, FBS, OGTT), physical examination and standard questionnaires in all participants. Nutritional habits, socioeconomic status, physical activity profiles and other health behaviors related to cardiovascular disease were assessed by validated questionnaires via interviews with all individuals. A twelve-lead electrocardiogram was recorded for all persons older than 35 years. The prevalence of CVD and the distribution of CVD risk factors were estimated in this phase. In the second phase, based on the primary survey findings, we arranged a series of teams (worksite, children, women, health personnel, high-risk patients, nutrition) for planning and implementation of the program throughout the intervention community for a five-year period. Every team has its own target population and objectives and monitors its process during the study. At intervals (annually), some small local surveys with random sampling will be conducted to assess and monitor the program and its potency to cope with

  2. Integrated computer-aided design in automotive development development processes, geometric fundamentals, methods of CAD, knowledge-based engineering data management

    CERN Document Server

    Mario, Hirz; Gfrerrer, Anton; Lang, Johann

    2013-01-01

    The automotive industry faces constant pressure to reduce development costs and time while still increasing vehicle quality. To meet this challenge, engineers and researchers in both science and industry are developing effective strategies and flexible tools by enhancing and further integrating powerful, computer-aided design technology. This book provides a valuable overview of the development tools and methods of today and tomorrow. It is targeted not only towards professional project and design engineers, but also to students and to anyone who is interested in state-of-the-art computer-aided development. The book begins with an overview of automotive development processes and the principles of virtual product development. Focusing on computer-aided design, a comprehensive outline of the fundamentals of geometry representation provides a deeper insight into the mathematical techniques used to describe and model geometrical elements. The book then explores the link between the demands of integrated design pr...

  3. The integrated quality assessment of Chinese commercial dry red wine based on a method of online HPLC-DAD-CL combined with HPLC-ESI-MS.

    Science.gov (United States)

    Yu, Hai-Xiang; Sun, Li-Qiong; Qi, Jin

    2014-07-01

    The aim was to apply an integrated quality assessment strategy to investigate the quality of multiple Chinese commercial dry red wine samples. A comprehensive method was developed by combining a high performance liquid chromatography-diode array detector-chemiluminescence (HPLC-DAD-CL) online hyphenated system with an HPLC-ESI-MS technique. Chromatographic and H2O2-scavenging activity fingerprints of thirteen batches of different, commercially available Chinese dry red wine samples were obtained and analyzed. Twenty-five compounds, including eighteen antioxidants, were identified and evaluated. The dominant and characteristic antioxidants in the samples were identified. The relationships between antioxidant potency and the cultivated grape variety, producing area, cellaring period, and trademark are also discussed. The results demonstrate the feasibility of an integrated quality assessment strategy for the efficient and objective quality assessment (especially of antioxidant activity) and identification of dry red wine. Copyright © 2014 China Pharmaceutical University. Published by Elsevier B.V. All rights reserved.

  4. Integral methods in science and engineering theoretical and practical aspects

    CERN Document Server

    Constanda, C; Rollins, D

    2006-01-01

    Presents a series of analytic and numerical methods of solution constructed for important problems arising in science and engineering, based on the powerful operation of integration. This volume is meant for researchers and practitioners in applied mathematics, physics, and mechanical and electrical engineering, as well as graduate students.

  5. Educational integrating projects as a method of interactive learning

    Directory of Open Access Journals (Sweden)

    Иван Николаевич Куринин

    2013-12-01

    Full Text Available The article describes a method of interactive learning based on educational integrating projects. Some examples of content of such projects for the disciplines related to the study of information and Internet technologies and their application in management are presented.

  6. Integrative methods for analyzing big data in precision medicine.

    Science.gov (United States)

    Gligorijević, Vladimir; Malod-Dognin, Noël; Pržulj, Nataša

    2016-03-01

    We provide an overview of recent developments in big data analyses in the context of precision medicine and health informatics. With the advance in technologies capturing molecular and medical data, we entered the area of "Big Data" in biology and medicine. These data offer many opportunities to advance precision medicine. We outline key challenges in precision medicine and present recent advances in data integration-based methods to uncover personalized information from big data produced by various omics studies. We survey recent integrative methods for disease subtyping, biomarkers discovery, and drug repurposing, and list the tools that are available to domain scientists. Given the ever-growing nature of these big data, we highlight key issues that big data integration methods will face. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  7. Integral Method of Boundary Characteristics: Neumann Condition

    Science.gov (United States)

    Kot, V. A.

    2018-05-01

    A new algorithm, based on systems of identical equalities with integral and differential boundary characteristics, is proposed for solving boundary-value problems of heat conduction in bodies of canonical shape under a Neumann boundary condition. Results of a numerical analysis of the accuracy of solving heat-conduction problems with variable boundary conditions using this algorithm are presented. The solutions obtained with it can be considered exact, because their errors amount to hundredths and ten-thousandths of a percent over a wide range of the problem parameters.

  8. The integral equation method applied to eddy currents

    International Nuclear Information System (INIS)

    Biddlecombe, C.S.; Collie, C.J.; Simkin, J.; Trowbridge, C.W.

    1976-04-01

    An algorithm for the numerical solution of eddy current problems is described, based on the direct solution of the integral equation for the potentials. In this method only the conducting and iron regions need to be divided into elements, and there are no boundary conditions. Results from two computer programs using this method for iron-free problems in various two-dimensional geometries are presented and compared with analytic solutions. (author)

  9. Collaborative teaching of an integrated methods course

    Directory of Open Access Journals (Sweden)

    George Zhou

    2011-03-01

    Full Text Available With an increasing diversity in American schools, teachers need to be able to collaborate in teaching. University courses are widely considered as a stage to demonstrate or model the ways of collaboration. To respond to this call, three authors team-taught an integrated methods course at an urban public university in the city of New York. Following a qualitative research design, this study explored both instructors' and pre-service teachers' experiences with this course. Study findings indicate that collaborative teaching of an integrated methods course is feasible and beneficial to both instructors and pre-service teachers. For instructors, this collaborative teaching was a reciprocal learning process where they were engaged in thinking about teaching in a broader and innovative way. For pre-service teachers, this collaborative course not only helped them understand how three different subjects could be related to each other, but also provided opportunities for them to actually see how collaboration could take place in teaching. Their understanding of collaborative teaching was enhanced after the course.

  10. Understanding integrated care: a comprehensive conceptual framework based on the integrative functions of primary care.

    Science.gov (United States)

    Valentijn, Pim P; Schepman, Sanneke M; Opheij, Wilfrid; Bruijnzeels, Marc A

    2013-01-01

    Primary care has a central role in integrating care within a health system. However, conceptual ambiguity regarding integrated care hampers a systematic understanding. This paper proposes a conceptual framework that combines the concepts of primary care and integrated care, in order to understand the complexity of integrated care. The search method involved a combination of electronic database searches, hand searches of reference lists (snowball method) and contacting researchers in the field. The process of synthesizing the literature was iterative, to relate the concepts of primary care and integrated care. First, we identified the general principles of primary care and integrated care. Second, we connected the dimensions of integrated care and the principles of primary care. Finally, to improve content validity we held several meetings with researchers in the field to develop and refine our conceptual framework. The conceptual framework combines the functions of primary care with the dimensions of integrated care. Person-focused and population-based care serve as guiding principles for achieving integration across the care continuum. Integration plays complementary roles on the micro (clinical integration), meso (professional and organisational integration) and macro (system integration) level. Functional and normative integration ensure connectivity between the levels. The presented conceptual framework is a first step to achieve a better understanding of the inter-relationships among the dimensions of integrated care from a primary care perspective.

  11. Evaluating base widening methods.

    Science.gov (United States)

    2013-12-01

    The surface transportation system is the biggest infrastructure investment in the United States, of which the roadway pavement forms an integral part. Maintaining the roadways can involve rehabilitation in the form of widening, which requires a ...

  12. Comparisons of Modeling and State of Charge Estimation for Lithium-Ion Battery Based on Fractional Order and Integral Order Methods

    Directory of Open Access Journals (Sweden)

    Renxin Xiao

    2016-03-01

    Full Text Available In order to properly manage the lithium-ion batteries of electric vehicles (EVs), it is essential to build a battery model and estimate the state of charge (SOC). In this paper, fractional order forms of the Thevenin and partnership for a new generation of vehicles (PNGV) models are built, whose model parameters, including the fractional orders and the corresponding resistance and capacitance values, are simultaneously identified using a genetic algorithm (GA). The relationships between the different model parameters and SOC are established and analyzed. The calculation precision of the fractional order model (FOM) and the integral order model (IOM) is validated and compared under hybrid test cycles. Finally, an extended Kalman filter (EKF) is employed to estimate the SOC based on the different models. The results prove that the FOMs can simulate the output voltage more accurately and that the fractional order EKF (FOEKF) can estimate the SOC more precisely under dynamic conditions.
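
    The discrete fractional-order dynamics behind such models are usually built on the Grünwald-Letnikov approximation, in which the fractional derivative is a weighted sum over the signal's history. The short sketch below checks that approximation against a known closed form and is not tied to the paper's specific battery model.

```python
# Grünwald-Letnikov fractional derivative sketch, compared against the known
# result D^alpha t = t^(1-alpha) / Gamma(2 - alpha).
import numpy as np
from math import gamma

def gl_fractional_derivative(f_vals, alpha, h):
    """Grünwald-Letnikov derivative of order alpha on a uniform grid of step h."""
    n = len(f_vals)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):                       # recursive binomial-type weights
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    out = np.empty(n)
    for k in range(n):
        out[k] = np.dot(w[:k + 1], f_vals[k::-1]) / h ** alpha
    return out

alpha, h = 0.5, 0.001
t = np.arange(1, 2001) * h                      # avoid t = 0 where the result diverges
numeric = gl_fractional_derivative(t, alpha, h)
exact = t ** (1 - alpha) / gamma(2 - alpha)
mask = t >= 0.5
print("max relative error for t >= 0.5:",
      np.max(np.abs(numeric - exact)[mask] / exact[mask]))
```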

  13. Gait recognition based on integral outline

    Science.gov (United States)

    Ming, Guan; Fang, Lv

    2017-02-01

    Biometric identification technology is replacing traditional security technology, and this has become a trend. Gait recognition has become a research hot spot because gait features are difficult to imitate or steal. This paper presents a gait recognition system based on the integral outline of the human body. The system has three important aspects: the preprocessing of the gait image, feature extraction and classification. Finally, a polling method is used to evaluate the performance of the system, and the problems existing in gait recognition and future directions of development are summarized.

  14. Parallel Jacobi EVD Methods on Integrated Circuits

    Directory of Open Access Journals (Sweden)

    Chi-Chia Sun

    2014-01-01

    Full Text Available Design strategies for parallel iterative algorithms are presented. In order to further study different tradeoff strategies in design criteria for integrated circuits, a 10 × 10 Jacobi Brent-Luk-EVD array with the simplified μ-CORDIC processor is used as an example. The experimental results show that using the μ-CORDIC processor is beneficial for the design criteria, as it yields a smaller area, faster overall computation time, and less energy consumption than the regular CORDIC processor. It is worth noting that the proposed parallel EVD method can be applied to real-time and low-power array signal processing algorithms performing beamforming or DOA estimation.
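
    The algorithm that the systolic array parallelises can be sketched in plain NumPy as a cyclic Jacobi eigenvalue iteration with exact 2x2 rotations (the μ-CORDIC processor approximates these rotations in hardware):

```python
# Cyclic Jacobi eigenvalue decomposition sketch for a symmetric matrix.
import numpy as np

def jacobi_evd(A, sweeps=10, tol=1e-12):
    A = A.astype(float).copy()
    n = A.shape[0]
    V = np.eye(n)
    for _ in range(sweeps):
        off = np.sqrt(max((A**2).sum() - (np.diag(A)**2).sum(), 0.0))
        if off < tol:
            break
        for p in range(n - 1):
            for q in range(p + 1, n):
                if abs(A[p, q]) < 1e-30:
                    continue
                # rotation angle that annihilates A[p, q]
                theta = 0.5 * np.arctan2(2 * A[p, q], A[q, q] - A[p, p])
                c, s = np.cos(theta), np.sin(theta)
                J = np.eye(n)
                J[p, p] = J[q, q] = c
                J[p, q], J[q, p] = s, -s
                A = J.T @ A @ J
                V = V @ J
    return np.diag(A), V

rng = np.random.default_rng(3)
M = rng.normal(size=(10, 10))
S = (M + M.T) / 2                      # symmetric test matrix
evals, evecs = jacobi_evd(S)
print("max eigenvalue difference vs numpy:",
      np.max(np.abs(np.sort(evals) - np.sort(np.linalg.eigvalsh(S)))))
```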

  15. Value of image fusion using single photon emission computed tomography with integrated low dose computed tomography in comparison with a retrospective voxel-based method in neuroendocrine tumours

    International Nuclear Information System (INIS)

    Amthauer, H.; Denecke, T.; Ruf, J.; Gutberlet, M.; Felix, R.; Lemke, A.J.; Rohlfing, T.; Boehmig, M.; Ploeckinger, U.

    2005-01-01

    The objective was the evaluation of single photon emission computed tomography (SPECT) with integrated low dose computed tomography (CT) in comparison with a retrospective fusion of SPECT and high-resolution CT and a side-by-side analysis for lesion localisation in patients with neuroendocrine tumours. Twenty-seven patients were examined by multidetector CT. Additionally, as part of somatostatin receptor scintigraphy (SRS), an integrated SPECT-CT was performed. SPECT and CT data were fused using software with a registration algorithm based on normalised mutual information. The reliability of the topographic assignment of lesions in SPECT-CT, retrospective fusion and side-by-side analysis was evaluated by two blinded readers. Two patients were not enrolled in the final analysis because of misregistrations in the retrospective fusion. Eighty-seven foci were included in the analysis. For the anatomical assignment of foci, SPECT-CT and retrospective fusion revealed overall accuracies of 91 and 94% (side-by-side analysis 86%). The correct identification of foci as lymph node manifestations (n=25) was more accurate by retrospective fusion (88%) than from SPECT-CT images (76%) or by side-by-side analysis (60%). Both modalities of image fusion appear to be well suited for the localisation of SRS foci and are superior to side-by-side analysis of non-fused images especially concerning lymph node manifestations. (orig.)
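
    The registration metric mentioned in the abstract, normalised mutual information, can be sketched from a joint intensity histogram; the "images" below are synthetic, and a real retrospective registration would maximise this quantity over the transformation parameters.

```python
# Normalised mutual information sketch: NMI = (H(A) + H(B)) / H(A, B),
# estimated from a joint intensity histogram of two images.
import numpy as np

def normalised_mutual_information(a, b, bins=32):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    p_ab = joint / joint.sum()
    p_a, p_b = p_ab.sum(axis=1), p_ab.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -(p * np.log(p)).sum()

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab)

rng = np.random.default_rng(5)
img = rng.normal(size=(64, 64))
shifted = np.roll(img, 3, axis=0) + 0.1 * rng.normal(size=(64, 64))

print("NMI(image, itself)       :", round(normalised_mutual_information(img, img), 3))
print("NMI(image, shifted+noise):", round(normalised_mutual_information(img, shifted), 3))
```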

  16. Direct integration multiple collision integral transport analysis method for high energy fusion neutronics

    International Nuclear Information System (INIS)

    Koch, K.R.

    1985-01-01

    A new analysis method specially suited for the inherent difficulties of fusion neutronics was developed to provide detailed studies of the fusion neutron transport physics. These studies should provide a better understanding of the limitations and accuracies of typical fusion neutronics calculations. The new analysis method is based on the direct integration of the integral form of the neutron transport equation and employs a continuous energy formulation with the exact treatment of the energy angle kinematics of the scattering process. In addition, the overall solution is analyzed in terms of uncollided, once-collided, and multi-collided solution components based on a multiple collision treatment. Furthermore, the numerical evaluations of integrals use quadrature schemes that are based on the actual dependencies exhibited in the integrands. The new DITRAN computer code was developed on the Cyber 205 vector supercomputer to implement this direct integration multiple-collision fusion neutronics analysis. Three representative fusion reactor models were devised and the solutions to these problems were studied to provide suitable choices for the numerical quadrature orders as well as the discretized solution grid and to understand the limitations of the new analysis method. As further verification and as a first step in assessing the accuracy of existing fusion-neutronics calculations, solutions obtained using the new analysis method were compared to typical multigroup discrete ordinates calculations

  17. Application of Stochastic Sensitivity Analysis to Integrated Force Method

    Directory of Open Access Journals (Sweden)

    X. F. Wei

    2012-01-01

    Full Text Available As a new formulation in structural analysis, the Integrated Force Method has been successfully applied to many structures in civil, mechanical, and aerospace engineering due to its accurate estimation of forces. It is now being further extended to the probabilistic domain. For the assessment of uncertainty effects in system optimization and identification, the probabilistic sensitivity analysis of IFM was further investigated in this study. A set of stochastic sensitivity analysis formulations of the Integrated Force Method was developed using the perturbation method. Numerical examples are presented to illustrate its application. Its efficiency and accuracy were also substantiated with direct Monte Carlo simulations and the reliability-based sensitivity method. The numerical algorithm was shown to be readily adaptable to the existing program, since the models of stochastic finite elements and stochastic design sensitivity are almost identical.

  18. Second Order Generalized Integrator Based Reference Current Generation Method for Single-Phase Shunt Active Power Filters Under Adverse Grid Conditions

    DEFF Research Database (Denmark)

    Golestan, Saeed; Monfared, Mohammad; Guerrero, Josep M.

    2013-01-01

    The reference current generation (RCG) is a crucial part in the control of a shunt active power filter (APF). A variety of RCG techniques have been proposed in literature. Among these, the instantaneous reactive power theory, called pq theory, is probably the most widely used technique. The pq...... theory offers advantages such as satisfactory steady-state and dynamic performance, and at the same time simple digital implementation, however its application was limited to three-phase systems. To exploit the advantages of pq theory in single-phase systems, the single-phase pq theory has been proposed...... recently. In this paper, a simple and effective implementation of the single phase pq theory for single-phase shunt APFs is proposed. The suggested approach is based on employing second order generalized integrators (SOGI), and a phase locked loop (PLL). To fine tune the control parameters, a systematic...
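
    The SOGI quadrature signal generator that such a scheme relies on can be sketched as two coupled integrators producing an in-phase and a 90-degree-shifted signal from the single-phase input. The forward-Euler discretisation, gain and frequencies below are illustrative choices, not the paper's tuned parameters.

```python
# Discrete-time SOGI quadrature signal generator sketch:
#   d(v_alpha)/dt = omega * (k*(v - v_alpha) - v_beta)
#   d(v_beta)/dt  = omega * v_alpha
import numpy as np

f_grid = 50.0
omega = 2 * np.pi * f_grid          # assumed resonance frequency (from a PLL)
k = 1.41                            # assumed SOGI damping gain
fs = 10_000.0                       # sampling frequency
dt = 1.0 / fs

t = np.arange(0, 0.2, dt)
v_in = np.sin(omega * t) + 0.2 * np.sin(3 * omega * t)   # fundamental + 3rd harmonic

v_alpha, v_beta = 0.0, 0.0          # in-phase and quadrature states
alpha_out, beta_out = [], []
for v in v_in:
    dv_alpha = (k * (v - v_alpha) - v_beta) * omega
    dv_beta = omega * v_alpha
    v_alpha += dv_alpha * dt
    v_beta += dv_beta * dt
    alpha_out.append(v_alpha)
    beta_out.append(v_beta)

alpha_out, beta_out = np.array(alpha_out), np.array(beta_out)
# after the transient the two outputs are near-sinusoidal and 90 degrees apart
amp = np.sqrt(alpha_out[-200:] ** 2 + beta_out[-200:] ** 2)
print("steady-state amplitude estimate:", round(amp.mean(), 3),
      "(input fundamental amplitude = 1.0)")
```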

  19. Universal integrals based on copulas

    Czech Academy of Sciences Publication Activity Database

    Klement, E.P.; Mesiar, Radko; Spizzichino, F.; Stupňanová, A.

    2014-01-01

    Roč. 13, č. 3 (2014), s. 273-286 ISSN 1568-4539 R&D Projects: GA ČR GAP402/11/0378 Institutional support: RVO:67985556 Keywords : capacity * copula * universal integral Subject RIV: BA - General Mathematics Impact factor: 2.163, year: 2014 http://library.utia.cas.cz/separaty/2014/E/mesiar-0432228.pdf

  20. Numerov iteration method for second order integral-differential equation

    International Nuclear Information System (INIS)

    Zeng Fanan; Zhang Jiaju; Zhao Xuan

    1987-01-01

    In this paper, a Numerov iteration method for second-order integral-differential equations and systems of equations is constructed. Numerical examples show that this method is better than the direct method (Gauss elimination) in CPU time and memory requirements. Therefore, it is an efficient method for solving integral-differential equations in nuclear physics
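
    The Numerov recursion underlying such an iteration scheme can be sketched for the differential part y'' + g(x) y = 0 and checked on a problem with a known solution:

```python
# Numerov recursion sketch for y'' + g(x) y = 0, tested on y'' + y = 0
# with exact solution y = sin(x).
import numpy as np

def numerov(g, y0, y1, x):
    """March the Numerov recursion over the uniform grid x."""
    h = x[1] - x[0]
    y = np.empty_like(x)
    y[0], y[1] = y0, y1
    f = 1.0 + (h ** 2 / 12.0) * g(x)
    for n in range(1, len(x) - 1):
        y[n + 1] = ((12.0 - 10.0 * f[n]) * y[n] - f[n - 1] * y[n - 1]) / f[n + 1]
    return y

x = np.linspace(0.0, 10.0, 1001)
# start from the exact values at the first two nodes
y = numerov(lambda x: np.ones_like(x), np.sin(x[0]), np.sin(x[1]), x)
print("max |numerov - sin(x)| =", np.max(np.abs(y - np.sin(x))))
```

    In the full iteration scheme the integral term would be re-evaluated from the current solution and fed back into g(x) (or a source term) until the sweep converges; the recursion above is the differential building block of that loop.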

  1. An Integrated Method for Airfoil Optimization

    Science.gov (United States)

    Okrent, Joshua B.

    Design exploration and optimization is a large part of the initial engineering and design process. To evaluate the aerodynamic performance of a design, viscous Navier-Stokes solvers can be used. However this method can prove to be overwhelmingly time consuming when performing an initial design sweep. Therefore, another evaluation method is needed to provide accurate results at a faster pace. To accomplish this goal, a coupled viscous-inviscid method is used. This thesis proposes an integrated method for analyzing, evaluating, and optimizing an airfoil using a coupled viscous-inviscid solver along with a genetic algorithm to find the optimal candidate. The method proposed is different from prior optimization efforts in that it greatly broadens the design space, while allowing the optimization to search for the best candidate that will meet multiple objectives over a characteristic mission profile rather than over a single condition and single optimization parameter. The increased design space is due to the use of multiple parametric airfoil families, namely the NACA 4 series, CST family, and the PARSEC family. Almost all possible airfoil shapes can be created with these three families allowing for all possible configurations to be included. This inclusion of multiple airfoil families addresses a possible criticism of prior optimization attempts since by only focusing on one airfoil family, they were inherently limiting the number of possible airfoil configurations. By using multiple parametric airfoils, it can be assumed that all reasonable airfoil configurations are included in the analysis and optimization and that a global and not local maximum is found. Additionally, the method used is amenable to customization to suit any specific needs as well as including the effects of other physical phenomena or design criteria and/or constraints. This thesis found that an airfoil configuration that met multiple objectives could be found for a given set of nominal

  2. Nuclear methods - an integral part of the NBS certification program

    International Nuclear Information System (INIS)

    Gills, T.E.

    1984-01-01

    Within the past twenty years, new techniques and methods have emerged in response to new technologies that are based upon the performance of high-purity and well-characterized materials. The National Bureau of Standards, through its Standard Reference Materials (SRM) Program, provides standards in the form of many of these materials to ensure accuracy and the compatibility of measurements throughout the US and the world. These standards, known as Standard Reference Materials (SRMs), are developed by using state-of-the-art methods and procedures for both preparation and analysis. Nuclear methods, in particular activation analysis, constitute an integral part of that analysis process

  3. An integrated model of water resources optimization allocation based on projection pursuit model - Grey wolf optimization method in a transboundary river basin

    Science.gov (United States)

    Yu, Sen; Lu, Hongwei

    2018-04-01

    Under the effects of global change, water crisis ranks as the top global risk of the coming decade, and water conflict in transboundary river basins, as well as the geostrategic competition it drives, is of particular concern. This study presents an innovative integrated PPMGWO model of water resources optimization allocation in a transboundary river basin, which combines the projection pursuit model (PPM) and the Grey wolf optimization (GWO) method. This study uses the Songhua River basin and its 25 control units as an example, adopting the proposed PPMGWO model to allocate the water quantity. Using water consumption in all control units of the Songhua River basin in 2015 as the reference and comparing against the optimization allocation results of the firefly algorithm (FA), Particle Swarm Optimization (PSO) and the PPMGWO model, the results indicate that the average differences between the corresponding allocation results and the reference values are 0.195 bil m3, 0.151 bil m3, and 0.085 bil m3, respectively. The average difference of the PPMGWO model is clearly the lowest and its optimization allocation result is closest to reality, which further confirms the reasonability, feasibility, and accuracy of the PPMGWO model. The PPMGWO model is then adopted to simulate the allocation of available water quantity in the Songhua River basin in 2018, 2020, and 2030. The simulation results show that the water quantity which could be allocated to all control units demonstrates an overall increasing trend, with reasonable and equitable exploitation and utilization of water resources in the Songhua River basin in the future. In addition, this study provides a useful reference for comprehensive management and water resources allocation in other transboundary river basins.
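
    The following sketch shows only a minimal Grey wolf optimization loop on a made-up allocation-style objective; it is not the paper's PPMGWO model, and the projection pursuit component, the control-unit data and the real constraints are omitted.

        import numpy as np

        def gwo(f, lo, hi, n_wolves=20, n_iter=200, seed=0):
            # Minimal Grey wolf optimization loop for minimizing f over a box [lo, hi].
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            X = rng.uniform(lo, hi, size=(n_wolves, lo.size))
            for t in range(n_iter):
                fit = np.array([f(x) for x in X])
                alpha, beta, delta = X[np.argsort(fit)[:3]]      # three best wolves
                a = 2.0 - 2.0 * t / n_iter                       # linearly decreasing coefficient
                for i in range(n_wolves):
                    cand = []
                    for leader in (alpha, beta, delta):
                        r1, r2 = rng.random(lo.size), rng.random(lo.size)
                        A, C = 2.0 * a * r1 - a, 2.0 * r2
                        cand.append(leader - A * np.abs(C * leader - X[i]))
                    X[i] = np.clip(np.mean(cand, axis=0), lo, hi)
            fit = np.array([f(x) for x in X])
            return X[fit.argmin()], fit.min()

        # made-up allocation-style objective: track a target split under a soft budget cap
        target = np.array([1.2, 0.8, 2.5])
        obj = lambda x: np.sum((x - target)**2) + 10.0 * max(0.0, x.sum() - 5.0)**2
        best, val = gwo(obj, [0.0, 0.0, 0.0], [3.0, 3.0, 3.0])
        print(best, val)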

  4. Microcontroller based Integrated Circuit Tester

    OpenAIRE

    Yousif Taha Yousif Elamin; Abdelrasoul Jabar Alzubaidi

    2015-01-01

    The digital integrated circuit (IC) tester is implemented by using the ATmega32 microcontroller. The microcontroller processes the inputs and outputs and displays the results on a Liquid Crystal Display (LCD). The basic function of the digital IC tester is to test a digital IC for correct logical functioning as described in the truth table and/or function table. The designed model can test digital ICs having 14 pins. Since it is programmable, any number of ICs can be tested. Thi...

  5. A Multi-Objective Optimization Method to integrate Heat Pumps in Industrial Processes

    OpenAIRE

    Becker, Helen; Spinato, Giulia; Maréchal, François

    2011-01-01

    Aim of process integration methods is to increase the efficiency of industrial processes by using pinch analysis combined with process design methods. In this context, appropriate integrated utilities offer promising opportunities to reduce energy consumption, operating costs and pollutants emissions. Energy integration methods are able to integrate any type of predefined utility, but so far there is no systematic approach to generate potential utilities models based on their technology limit...

  6. Ontology based heterogeneous materials database integration and semantic query

    Science.gov (United States)

    Zhao, Shuai; Qian, Quan

    2017-10-01

    Materials digital data, high throughput experiments and high throughput computations are regarded as the three key pillars of materials genome initiatives. With the fast growth of materials data, the integration and sharing of data have become very urgent and have gradually become a hot topic of materials informatics. Due to the lack of semantic description, it is difficult to integrate data deeply at the semantic level when adopting conventional heterogeneous database integration approaches such as federated databases or data warehouses. In this paper, a semantic integration method is proposed that creates the semantic ontology by extracting the database schema semi-automatically. Other heterogeneous databases are integrated into the ontology by means of relational algebra and the rooted graph. Based on the integrated ontology, semantic queries can be performed using SPARQL. In the experiments, two world-famous first-principles computational databases, OQMD and Materials Project, are used as the integration targets, which shows the availability and effectiveness of our method.
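
    A minimal sketch of the kind of SPARQL query the abstract refers to, written with the rdflib Python library over a hypothetical materials ontology; the namespace, property names and values are invented for illustration and are not the schema used by the authors or by OQMD and Materials Project.

        from rdflib import Graph, Literal, Namespace, RDF

        # hypothetical materials namespace; not the schema used in the paper
        MAT = Namespace("http://example.org/materials#")

        g = Graph()
        g.add((MAT.Si, RDF.type, MAT.Material))
        g.add((MAT.Si, MAT.bandGap_eV, Literal(1.12)))
        g.add((MAT.GaAs, RDF.type, MAT.Material))
        g.add((MAT.GaAs, MAT.bandGap_eV, Literal(1.42)))

        # SPARQL query over the integrated graph
        q = """
        PREFIX mat: <http://example.org/materials#>
        SELECT ?m ?gap WHERE {
            ?m a mat:Material ;
               mat:bandGap_eV ?gap .
            FILTER (?gap > 1.2)
        }"""
        for row in g.query(q):
            print(row.m, row.gap)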

  7. Integrating computational methods to retrofit enzymes to synthetic pathways.

    Science.gov (United States)

    Brunk, Elizabeth; Neri, Marilisa; Tavernelli, Ivano; Hatzimanikatis, Vassily; Rothlisberger, Ursula

    2012-02-01

    Microbial production of desired compounds provides an efficient framework for the development of renewable energy resources. To be competitive with traditional chemistry, one requirement is to utilize the full capacity of the microorganism to produce target compounds with high yields and turnover rates. We use integrated computational methods to generate and quantify the performance of novel biosynthetic routes that contain highly optimized catalysts. Engineering a novel reaction pathway entails addressing feasibility on multiple levels, which involves handling the complexity of large-scale biochemical networks while respecting the critical chemical phenomena at the atomistic scale. To pursue this multi-layer challenge, our strategy merges knowledge-based metabolic engineering methods with computational chemistry methods. By bridging multiple disciplines, we provide an integral computational framework that could accelerate the discovery and implementation of novel biosynthetic production routes. Using this approach, we have identified and optimized a novel biosynthetic route for the production of 3HP from pyruvate. Copyright © 2011 Wiley Periodicals, Inc.

  8. System integrational and migrational concepts and methods within healthcare

    DEFF Research Database (Denmark)

    Endsleff, F; Loubjerg, P

    1997-01-01

    In this paper an overview and comparison of the basic concepts and methods behind different system integration implementations is given, including the DHE, which is based on the coming Healthcare Information Systems Architecture pre-standard HISA, developed by CEN TC251. This standard and the DHE...... (Distributed Healthcare Environment) not only provide highly relevant standards, but also provide an efficient and well structured platform for Healthcare IT Systems....

  9. A geometrical method towards first integrals for dynamical systems

    International Nuclear Information System (INIS)

    Labrunie, S.; Conte, R.

    1996-01-01

    We develop a method, based on Darboux's and Liouville's works, to find first integrals and/or invariant manifolds for a physically relevant class of dynamical systems, without making any assumption on these elements' forms. We apply it to three dynamical systems: Lotka-Volterra, Lorenz and Rikitake. copyright 1996 American Institute of Physics
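
    As a small companion example, the sketch below numerically checks the classical first integral of the Lotka-Volterra system; it does not reproduce the geometrical construction described in the record, and the parameter values are arbitrary.

        import numpy as np
        from scipy.integrate import solve_ivp

        a, b, g, d = 1.0, 0.4, 0.8, 0.2        # arbitrary Lotka-Volterra parameters

        def rhs(t, z):
            x, y = z
            return [x * (a - b * y), y * (d * x - g)]

        def first_integral(z):
            # classical invariant V(x, y) = d*x - g*ln(x) + b*y - a*ln(y)
            x, y = z
            return d * x - g * np.log(x) + b * y - a * np.log(y)

        sol = solve_ivp(rhs, (0.0, 50.0), [2.0, 1.0], rtol=1e-10, atol=1e-12)
        V = np.array([first_integral(z) for z in sol.y.T])
        print(V.max() - V.min())               # drift of the invariant stays near zero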

  10. IMP: Integrated method for power analysis

    Energy Technology Data Exchange (ETDEWEB)

    1989-03-01

    An integrated, easy to use, economical package of microcomputer programs has been developed which can be used by small hydro developers to evaluate potential sites for small scale hydroelectric plants in British Columbia. The programs enable evaluation of sites located far from the nearest stream gauging station, for which streamflow data are not available. For each of the province's 6 hydrologic regions, a streamflow record for one small watershed is provided in the data base. The program can then be used to generate synthetic streamflow records and to compare results obtained by the modelling procedure with the actual data. The program can also be used to explore the significance of modelling parameters and to develop a detailed appreciation for the accuracy which can be obtained under various circumstances. The components of the program are an atmospheric model of precipitation; a watershed model that will generate a continuous series of streamflow data, based on information from the atmospheric model; a flood frequency analysis system that uses site-specific topographic data plus information from the atmospheric model to generate a flood frequency curve; a hydroelectric power simulation program which determines daily energy output for a run-of-river or reservoir storage site based on selected generation facilities and the time series generated in the watershed model; and a graphic analysis package that provides direct visualization of data and modelling results. This report contains a description of the programs, a user guide, the theory behind the model, the modelling methodology, and results from a workshop that reviewed the program package. 32 refs., 16 figs., 18 tabs.

  11. Attribute-Based Methods

    Science.gov (United States)

    Thomas P. Holmes; Wiktor L. Adamowicz

    2003-01-01

    Stated preference methods of environmental valuation have been used by economists for decades where behavioral data have limitations. The contingent valuation method (Chapter 5) is the oldest stated preference approach, and hundreds of contingent valuation studies have been conducted. More recently, and especially over the last decade, a class of stated preference...

  12. Boundary integral methods for unsaturated flow

    International Nuclear Information System (INIS)

    Martinez, M.J.; McTigue, D.F.

    1990-01-01

    Many large simulations may be required to assess the performance of Yucca Mountain as a possible site for the nation's first high level nuclear waste repository. A boundary integral equation method (BIEM) is described for numerical analysis of quasilinear steady unsaturated flow in homogeneous material. The applicability of the exponential model for the dependence of hydraulic conductivity on pressure head is discussed briefly. This constitutive assumption is at the heart of the quasilinear transformation. Materials which display a wide distribution in pore size are described reasonably well by the exponential. For materials with a narrow range in pore size, the exponential is suitable over more limited ranges in pressure head. The numerical implementation of the BIEM is used to investigate the infiltration from a strip source to a water table. The net infiltration of moisture into a finite-depth layer is well described by results for a semi-infinite layer if αD > 4, where α is the sorptive number and D is the depth to the water table. The distribution of moisture exhibits a similar dependence on αD. 11 refs., 4 figs.

  13. Structure of an E. coli integral membrane sulfurtransferase and its structural transition upon SCN− binding defined by EPR-based hybrid method

    Science.gov (United States)

    Ling, Shenglong; Wang, Wei; Yu, Lu; Peng, Junhui; Cai, Xiaoying; Xiong, Ying; Hayati, Zahra; Zhang, Longhua; Zhang, Zhiyong; Song, Likai; Tian, Changlin

    2016-01-01

    Electron paramagnetic resonance (EPR)-based hybrid experimental and computational approaches were applied to determine the structure of a full-length E. coli integral membrane sulfurtransferase, dimeric YgaP, and its structural and dynamic changes upon ligand binding. The solution NMR structures of the YgaP transmembrane domain (TMD) and cytosolic catalytic rhodanese domain were reported recently, but the tertiary fold of full-length YgaP was not yet available. Here, systematic site-specific EPR analysis defined a helix-loop-helix secondary structure of the YgaP-TMD monomers using mobility, accessibility and membrane immersion measurements. The tertiary folds of dimeric YgaP-TMD and full-length YgaP in detergent micelles were determined through inter- and intra-monomer distance mapping and rigid-body computation. Further EPR analysis demonstrated the tight packing of the two YgaP second transmembrane helices upon binding of the catalytic product SCN−, which provides insight into the thiocyanate exportation mechanism of YgaP in the E. coli membrane. PMID:26817826

  14. Mapping pan-Arctic CH4 emissions using an adjoint method by integrating process-based wetland and lake biogeochemical models and atmospheric CH4 concentrations

    Science.gov (United States)

    Tan, Z.; Zhuang, Q.; Henze, D. K.; Frankenberg, C.; Dlugokencky, E. J.; Sweeney, C.; Turner, A. J.

    2015-12-01

    Understanding CH4 emissions from wetlands and lakes is critical for the estimation of the Arctic carbon balance under fast warming climatic conditions. To date, our knowledge about these two CH4 sources is built almost solely on the upscaling of discontinuous measurements in limited areas to the whole region. Many studies indicated that the controls of CH4 emissions from wetlands and lakes, including soil moisture, lake morphology and substrate content and quality, are notoriously heterogeneous; thus the accuracy of those simple estimates could be questionable. Here we apply a high spatial resolution atmospheric inverse model (nested-grid GEOS-Chem Adjoint) over the Arctic by integrating SCIAMACHY and NOAA/ESRL CH4 measurements to constrain the CH4 emissions estimated with process-based wetland and lake biogeochemical models. Our modeling experiments using different wetland CH4 emission schemes and satellite and surface measurements show that the total amount of CH4 emitted from the Arctic wetlands is well constrained, but the spatial distribution of CH4 emissions is sensitive to priors. For CH4 emissions from lakes, our high-resolution inversion shows that the models overestimate CH4 emissions in Alaskan coastal lowlands and East Siberian lowlands. Our study also indicates that the precision and coverage of measurements need to be improved to achieve more accurate high-resolution estimates.

  15. Construction of possible integrated predictive index based on EGFR and ANXA3 polymorphisms for chemotherapy response in fluoropyrimidine-treated Japanese gastric cancer patients using a bioinformatic method

    International Nuclear Information System (INIS)

    Takahashi, Hiro; Kaniwa, Nahoko; Saito, Yoshiro; Sai, Kimie; Hamaguchi, Tetsuya; Shirao, Kuniaki; Shimada, Yasuhiro; Matsumura, Yasuhiro; Ohtsu, Atsushi; Yoshino, Takayuki; Doi, Toshihiko; Takahashi, Anna; Odaka, Yoko; Okuyama, Misuzu; Sawada, Jun-ichi; Sakamoto, Hiromi; Yoshida, Teruhiko

    2015-01-01

    Variability in drug response between individual patients is a serious concern in medicine. To identify single-nucleotide polymorphisms (SNPs) related to drug response variability, many genome-wide association studies have been conducted. We previously applied a knowledge-based bioinformatic approach to a pharmacogenomics study in which 119 fluoropyrimidine-treated gastric cancer patients were genotyped at 109,365 SNPs using the Illumina Human-1 BeadChip. We identified the SNP rs2293347 in the human epidermal growth factor receptor (EGFR) gene as a novel genetic factor related to chemotherapeutic response. In the present study, we reanalyzed these hypothesis-free genomic data using extended knowledge. We identified rs2867461 in the annexin A3 (ANXA3) gene as another candidate. Using logistic regression, we confirmed that the performance of the rs2867461 + rs2293347 model was superior to those of the single-factor models. Furthermore, we propose a novel integrated predictive index (iEA) based on these two polymorphisms in EGFR and ANXA3. The p value for iEA was 1.47 × 10⁻⁸ by Fisher's exact test. Recent studies showed that mutations in EGFR are associated with high expression of dihydropyrimidine dehydrogenase, which is an inactivating and rate-limiting enzyme for fluoropyrimidine, and suggested that the combination of chemotherapy with fluoropyrimidine and EGFR-targeting agents is effective against EGFR-overexpressing gastric tumors, while ANXA3 overexpression confers resistance to tyrosine kinase inhibitors targeting the EGFR pathway. These results suggest that the iEA index or a combination of polymorphisms in EGFR and ANXA3 may serve as predictive factors of drug response, and therefore could be useful for optimal selection of chemotherapy regimens. The online version of this article (doi:10.1186/s12885-015-1721-z) contains supplementary material, which is available to authorized users

  16. Integral Equation Methods for Electromagnetic and Elastic Waves

    CERN Document Server

    Chew, Weng; Hu, Bin

    2008-01-01

    Integral Equation Methods for Electromagnetic and Elastic Waves is an outgrowth of several years of work. There have been no recent books on integral equation methods. There are books written on integral equations, but either they have been around for a while, or they were written by mathematicians. Much of the knowledge in integral equation methods still resides in journal papers. With this book, important relevant knowledge for integral equations is consolidated in one place, and researchers need only read the pertinent chapters in this book to gain important knowledge needed for integral eq

  17. Analytic methods to generate integrable mappings

    Indian Academy of Sciences (India)

    essential integrability features of an integrable differential equation is a .... With this in mind we first write x_3(t) as a cubic polynomial in (x_{n-1}, x_n, x_{n+1}) and then ..... coefficients, the quadratic equation in x_{n+N} has real and distinct roots which in ...

  18. Invited Article: A novel calibration method for the JET real-time far infrared polarimeter and integration of polarimetry-based line-integrated density measurements for machine protection of a fusion plant.

    Science.gov (United States)

    Boboc, A; Bieg, B; Felton, R; Dalley, S; Kravtsov, Yu

    2015-09-01

    In this paper, we present the implementation of a new calibration for the JET real-time polarimeter based on the complex amplitude ratio technique and a new self-validation mechanism for the data. This allowed easy integration of the polarimetry measurements into the JET plasma density control (gas feedback control) as well as into the machine protection systems (neutral beam injection heating safety interlocks). The new addition was used successfully during the 2014 JET campaign and it is envisaged that it will operate routinely from the 2015 campaign onwards in any plasma condition (including ITER-relevant scenarios). This mode of operation elevated the importance of polarimetry as a diagnostic tool in view of future fusion experiments.

  19. Invited Article: A novel calibration method for the JET real-time far infrared polarimeter and integration of polarimetry-based line-integrated density measurements for machine protection of a fusion plant

    Energy Technology Data Exchange (ETDEWEB)

    Boboc, A., E-mail: Alexandru.Boboc@ccfe.ac.uk; Felton, R.; Dalley, S. [EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); CCFE, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Bieg, B.; Kravtsov, Yu. [EUROfusion Consortium, JET, Culham Science Centre, Abingdon OX14 3DB (United Kingdom); Institute of Physics, Maritime University of Szczecin, Szczecin (Poland)

    2015-09-15

    In this paper, we present the implementation of a new calibration for the JET real-time polarimeter based on the complex amplitude ratio technique and a new self-validation mechanism for the data. This allowed easy integration of the polarimetry measurements into the JET plasma density control (gas feedback control) as well as into the machine protection systems (neutral beam injection heating safety interlocks). The new addition was used successfully during the 2014 JET campaign and it is envisaged that it will operate routinely from the 2015 campaign onwards in any plasma condition (including ITER-relevant scenarios). This mode of operation elevated the importance of polarimetry as a diagnostic tool in view of future fusion experiments.

  20. Interface-based software integration

    OpenAIRE

    Aziz Ahmad Rais

    2016-01-01

    Enterprise architecture frameworks define the goals of enterprise architecture in order to make business processes and IT operations more effective, and to reduce the risk of future investments. These enterprise architecture frameworks offer different architecture development methods that help in building enterprise architecture. In practice, the larger organizations become, the larger their enterprise architecture and IT become. This leads to an increasingly complex system of enterprise arch...

  1. A Cloud Based Data Integration Framework

    OpenAIRE

    Jiang, Nan; Xu, Lai; Vrieze, Paul; Lim, Mian-Guan; Jarabo, Oscar

    2012-01-01

    Part 7: Cloud-Based Support; International audience; Virtual enterprise (VE) relies on resource sharing and collaboration across geographically dispersed and dynamically allied businesses in order to better respond to market opportunities. It is generally considered that effective data integration and management is crucial to realise the value of VE. This paper describes a cloud-based data integration framework that can be used for supporting VE to discover, explore and respond more emerging ...

  2. Integrating Mainframe Data Bases on a Microcomputer

    OpenAIRE

    Marciniak, Thomas A.

    1985-01-01

    Microcomputers support user-friendly software for interrogating their resident data bases. Many medical data bases currently consist of files on less accessible mainframe computers with more limited inquiry capabilities. We discuss the transferring and integrating of mainframe data into microcomputer data base systems in one medical environment.

  3. Recent Advances in the Method of Forces: Integrated Force Method of Structural Analysis

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.

    1998-01-01

    Stress that can be induced in an elastic continuum can be determined directly through the simultaneous application of the equilibrium equations and the compatibility conditions. In the literature, this direct stress formulation is referred to as the integrated force method. This method, which uses forces as the primary unknowns, complements the popular equilibrium-based stiffness method, which considers displacements as the unknowns. The integrated force method produces accurate stress, displacement, and frequency results even for modest finite element models. This version of the force method should be developed as an alternative to the stiffness method because the latter method, which has been researched for the past several decades, may have entered its developmental plateau. Stress plays a primary role in the development of aerospace and other products, and its analysis is difficult. Therefore, it is advisable to use both methods to calculate stress and eliminate errors through comparison. This paper examines the role of the integrated force method in analysis, animation and design.
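
    A toy illustration of the idea of solving equilibrium equations and compatibility conditions simultaneously for forces: two springs in series between rigid walls with a load at the middle node. This one-redundancy example, with invented numbers, only shows the flavor of a force-based formulation; the integrated force method for finite element models is far more general.

        import numpy as np

        # Two springs in series between rigid walls, load P at the middle node.
        k1, k2, P = 1000.0, 3000.0, 50.0       # hypothetical stiffnesses (N/m) and load (N)

        # Row 1: node equilibrium   F1 - F2 = P       (member forces, tension positive)
        # Row 2: compatibility      F1/k1 + F2/k2 = 0 (total elongation between the walls is zero)
        S = np.array([[1.0, -1.0],
                      [1.0 / k1, 1.0 / k2]])
        F1, F2 = np.linalg.solve(S, np.array([P, 0.0]))
        u = F1 / k1                            # node displacement recovered from the forces
        print(F1, F2, u)                       # u equals P/(k1 + k2), as the stiffness method gives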

  4. Numerical method for solving integral equations of neutron transport. II

    International Nuclear Information System (INIS)

    Loyalka, S.K.; Tsai, R.W.

    1975-01-01

    In a recent paper it was pointed out that the weakly singular integral equations of neutron transport can be quite conveniently solved by a method based on subtraction of singularity. This previous paper was devoted entirely to the consideration of simple one-dimensional isotropic-scattering and one-group problems. The present paper constitutes interesting extensions of the previous work in that in addition to a typical two-group anisotropic-scattering albedo problem in the slab geometry, the method is also applied to an isotropic-scattering problem in the x-y geometry. These results are compared with discrete S_N (ANISN or TWOTRAN-II) results, and for the problems considered here, the proposed method is found to be quite effective. Thus, the method appears to hold considerable potential for future applications. (auth)

  5. A Grid Synchronization PLL Method Based on Mixed Second- and Third-Order Generalized Integrator for DC-Offset Elimination and Frequency Adaptability

    DEFF Research Database (Denmark)

    Zhang, Chunjiang; Zhao, Xiaojun; Wang, Xiaohuan

    2018-01-01

    The second order generalized integrator (SOGI) has been widely used to implement grid synchronization for grid-connected inverters, and from grid voltages it is able to extract the fundamental components with an output of two orthogonal sinusoidal signals. However, if there is a dc offset existing in the grid voltages, the general SOGI's performance suffers from its generated dc effect in the lagging sine signal at the output. Therefore, in this paper, a mixed second- and third-order generalized integrator (MSTOGI) is proposed to eliminate this effect caused by the dc offset of grid voltages......

  6. Integrated circuit and method of arbitration in a network on an integrated circuit.

    NARCIS (Netherlands)

    2011-01-01

    The invention relates to an integrated circuit and to a method of arbitration in a network on an integrated circuit. According to the invention, a method of arbitration in a network on an integrated circuit is provided, the network comprising a router unit, the router unit comprising a first input

  7. Integrals of Frullani type and the method of brackets

    Directory of Open Access Journals (Sweden)

    Bravo Sergio

    2017-01-01

    Full Text Available The method of brackets is a collection of heuristic rules, some of which have been made rigorous, that provide a flexible, direct method for the evaluation of definite integrals. The present work uses this method to establish classical formulas due to Frullani which provide the values of a specific family of integrals. Some generalizations are established.
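
    As a quick numerical companion (not the method of brackets itself, which is a symbolic procedure), the sketch below checks the classical Frullani formula ∫₀^∞ (f(ax) − f(bx))/x dx = (f(0) − f(∞))·ln(b/a) for f(x) = exp(−x) and arbitrary parameters.

        import numpy as np
        from scipy.integrate import quad

        a, b = 2.0, 5.0                        # arbitrary positive parameters
        val, err = quad(lambda x: (np.exp(-a * x) - np.exp(-b * x)) / x, 0.0, np.inf)
        print(val, np.log(b / a))              # numerical value vs the Frullani result ln(b/a)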

  8. Formation of elements of integrated acousto-optic cell based on LiNbO3 films by methods of nanotechnology

    International Nuclear Information System (INIS)

    Ageev, O A; Zamburg, E G; Kolomiytsev, A S; Suchkov, D O; Shipulin, I A; Shumov, A V

    2015-01-01

    In the experiments we determined the process modes and developed the technology for forming the elements for input-output of laser emission and the microlenses of an integrated acousto-optic cell by Pulsed Laser Deposition and Focused Ion Beams, using a nanotechnology cluster complex that allows controlled creation of the elements in a single process cycle. (paper)

  9. Accurate Electromagnetic Modeling Methods for Integrated Circuits

    NARCIS (Netherlands)

    Sheng, Z.

    2010-01-01

    The present development of modern integrated circuits (IC’s) is characterized by a number of critical factors that make their design and verification considerably more difficult than before. This dissertation addresses the important questions of modeling all electromagnetic behavior of features on

  10. The Efficacy of Three Learning Methods Collaborative, Context-Based Learning and Traditional, on Learning, Attitude and Behaviour of Undergraduate Nursing Students: Integrating Theory and Practice.

    Science.gov (United States)

    Hasanpour-Dehkordi, Ali; Solati, Kamal

    2016-04-01

    Communication skills training, responsibility, respect, and self-awareness are important indexes of changing learning behaviours in modern approaches. The aim of this study was to investigate the efficacy of three learning approaches, collaborative, context-based learning (CBL), and traditional, on the learning, attitude, and behaviour of undergraduate nursing students. This study was a clinical trial with a pretest-posttest control group design. The participants were senior nursing students. The samples were randomly assigned to three groups: CBL, collaborative, and traditional. To gather data, a standard questionnaire of students' behaviour and attitude was administered prior to and after the intervention. Also, the rate of learning was investigated by a researcher-developed questionnaire prior to and after the intervention in the three groups. In the CBL and collaborative training groups, the mean scores of behaviour and attitude increased after the intervention. In contrast, no significant difference was observed between the mean scores of behaviour and attitude prior to and after the intervention in the traditional group. The mean learning score, however, increased significantly in the CBL, collaborative, and traditional groups after the study in comparison to before the study. Both the CBL and collaborative approaches were useful in terms of increased respect, self-awareness, self-evaluation, communication skills and responsibility, as well as increased motivation and learning scores, in comparison to the traditional method.

  11. Integrated Phoneme Subspace Method for Speech Feature Extraction

    Directory of Open Access Journals (Sweden)

    Park Hyunsin

    2009-01-01

    Full Text Available Speech feature extraction has been a key focus in robust speech recognition research. In this work, we discuss data-driven linear feature transformations applied to feature vectors in the logarithmic mel-frequency filter bank domain. Transformations are based on principal component analysis (PCA), independent component analysis (ICA), and linear discriminant analysis (LDA). Furthermore, this paper introduces a new feature extraction technique that collects the correlation information among phoneme subspaces and reconstructs the feature space to represent phonemic information efficiently. The proposed speech feature vector is generated by projecting an observed vector onto an integrated phoneme subspace (IPS) based on PCA or ICA. The performance of the new feature was evaluated for isolated word speech recognition. The proposed method provided higher recognition accuracy than conventional methods in clean and reverberant environments.

  12. Adaptive integral equation methods in transport theory

    International Nuclear Information System (INIS)

    Kelley, C.T.

    1992-01-01

    In this paper, an adaptive multilevel algorithm for integral equations is described that has been developed with the Chandrasekhar H equation and its generalizations in mind. The algorithm maintains good performance when the Frechet derivative of the nonlinear map is singular at the solution, as happens in radiative transfer with conservative scattering and in critical neutron transport. Numerical examples that demonstrate the algorithm's effectiveness are presented
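
    For context, the sketch below solves the Chandrasekhar H-equation by plain fixed-point iteration on a single uniform grid; the adaptive multilevel algorithm of the record, and its behavior in the conservative-scattering limit, are not reproduced, and the parameter c = 0.9 is arbitrary.

        import numpy as np

        def chandrasekhar_h(c=0.9, n=200, tol=1e-12, max_iter=1000):
            # Fixed-point iteration for the H-equation on midpoint quadrature nodes.
            mu = (np.arange(n) + 0.5) / n
            w = np.full(n, 1.0 / n)
            H = np.ones(n)
            for _ in range(max_iter):
                integral = ((w * H)[None, :] / (mu[:, None] + mu[None, :])).sum(axis=1)
                H_new = 1.0 / (1.0 - 0.5 * c * mu * integral)
                if np.max(np.abs(H_new - H)) < tol:
                    break
                H = H_new
            return mu, H_new

        mu, H = chandrasekhar_h(c=0.9)
        print(H[0], H[-1])                     # H grows monotonically from ~1 near mu = 0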

  13. A symplectic integration method for elastic filaments

    Science.gov (United States)

    Ladd, Tony; Misra, Gaurav

    2009-03-01

    Elastic rods are a ubiquitous coarse-grained model of semi-flexible biopolymers such as DNA, actin, and microtubules. The Worm-Like Chain (WLC) is the standard numerical model for semi-flexible polymers, but it is only a linearized approximation to the dynamics of an elastic rod, valid for small deflections; typically the torsional motion is neglected as well. In the standard finite-difference and finite-element formulations of an elastic rod, the continuum equations of motion are discretized in space and time, but it is then difficult to ensure that the Hamiltonian structure of the exact equations is preserved. Here we discretize the Hamiltonian itself, expressed as a line integral over the contour of the filament. This discrete representation of the continuum filament can then be integrated by one of the explicit symplectic integrators frequently used in molecular dynamics. The model systematically approximates the continuum partial differential equations, but has the same level of computational complexity as molecular dynamics and is constraint free. Numerical tests show that the algorithm is much more stable than a finite-difference formulation and can be used for high aspect ratio filaments, such as actin. We present numerical results for the deterministic and stochastic motion of single filaments.
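
    A minimal sketch in the spirit of the abstract: a bead-spring discretization of a filament integrated with velocity Verlet, an explicit symplectic scheme, so that the discrete energy stays bounded. Bending and torsional terms of the actual elastic-rod Hamiltonian are omitted, and all parameter values are arbitrary.

        import numpy as np

        k_s, a0, m, dt = 100.0, 1.0, 1.0, 1e-3     # arbitrary stiffness, rest length, mass, step

        def forces(r):
            # Harmonic stretching forces of a bead-spring filament (bending omitted).
            f = np.zeros_like(r)
            d = r[1:] - r[:-1]
            L = np.linalg.norm(d, axis=1, keepdims=True)
            fb = k_s * (L - a0) * d / L            # force along each bond
            f[:-1] += fb
            f[1:] -= fb
            return f

        def energy(r, v):
            L = np.linalg.norm(r[1:] - r[:-1], axis=1)
            return 0.5 * m * np.sum(v**2) + 0.5 * k_s * np.sum((L - a0)**2)

        n = 10
        r = np.column_stack([np.arange(n, dtype=float), np.zeros(n), np.zeros(n)])
        r[0, 1] += 0.3                             # perturb one end to excite dynamics
        v = np.zeros_like(r)
        f = forces(r)
        E0 = energy(r, v)
        for _ in range(20000):                     # velocity Verlet: explicit and symplectic
            v += 0.5 * dt * f / m
            r += dt * v
            f = forces(r)
            v += 0.5 * dt * f / m
        print(abs(energy(r, v) - E0))              # energy error stays bounded and small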

  14. A comparison of non-integrating reprogramming methods

    Science.gov (United States)

    Schlaeger, Thorsten M; Daheron, Laurence; Brickler, Thomas R; Entwisle, Samuel; Chan, Karrie; Cianci, Amelia; DeVine, Alexander; Ettenger, Andrew; Fitzgerald, Kelly; Godfrey, Michelle; Gupta, Dipti; McPherson, Jade; Malwadkar, Prerana; Gupta, Manav; Bell, Blair; Doi, Akiko; Jung, Namyoung; Li, Xin; Lynes, Maureen S; Brookes, Emily; Cherry, Anne B C; Demirbas, Didem; Tsankov, Alexander M; Zon, Leonard I; Rubin, Lee L; Feinberg, Andrew P; Meissner, Alexander; Cowan, Chad A; Daley, George Q

    2015-01-01

    Human induced pluripotent stem cells (hiPSCs1–3) are useful in disease modeling and drug discovery, and they promise to provide a new generation of cell-based therapeutics. To date there has been no systematic evaluation of the most widely used techniques for generating integration-free hiPSCs. Here we compare Sendai-viral (SeV)4, episomal (Epi)5 and mRNA transfection (mRNA)6 methods using a number of criteria. All methods generated high-quality hiPSCs, but significant differences existed in aneuploidy rates, reprogramming efficiency, reliability and workload. We discuss the advantages and shortcomings of each approach, and present and review the results of a survey of a large number of human reprogramming laboratories on their independent experiences and preferences. Our analysis provides a valuable resource to inform the use of specific reprogramming methods for different laboratories and different applications, including clinical translation. PMID:25437882

  15. Methods in Logic Based Control

    DEFF Research Database (Denmark)

    Christensen, Georg Kronborg

    1999-01-01

    Design and theory of Logic Based Control systems: Boolean algebra, Karnaugh maps, the Quine-McCluskey algorithm. Sequential control design. Logic Based Control Method, Cascade Control Method. Implementation techniques: relay, pneumatic, TTL/CMOS, PAL, and PLC- and Soft_PLC implementation. PLC...
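
    As a small illustration of the Boolean minimization topics listed above, the sketch below uses sympy's SOPform (essentially a Quine-McCluskey style minimizer) on a hypothetical truth table; it is only a stand-in for the manual Karnaugh-map and Quine-McCluskey work covered in the course material.

        from sympy import symbols
        from sympy.logic.boolalg import SOPform

        # minimal sum-of-products cover of a hypothetical 3-input truth table
        a, b, c = symbols('a b c')
        minterms = [[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1], [1, 1, 0]]
        print(SOPform([a, b, c], minterms))        # prints c | (a & b)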

  16. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong; Sørensen, Jens Nørkær

    2014-01-01

    This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and a tip speed ratio, optimal airfoils are designed based on the local speed ratios. To achieve a high power performance at low cost, the airfoils are designed...... with the objectives of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated which forms the basic input to the blade design. The new airfoils are designed based on a previous in-house designed airfoil family which was optimized at a Reynolds number...... of 3 million. A novel shape perturbation function is introduced to optimize the geometry based on the existing airfoils which simplifies the design procedure. The viscous/inviscid interactive code XFOIL is used as the aerodynamic tool for airfoil optimization at a Reynolds number of 16 million...

  17. S-bases as a tool to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, A.V.; Smirnov, V.A.

    2006-01-01

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined

  18. S-bases as a tool to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, A.V. [Scientific Research Computing Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, V.A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-10-15

    We suggest a mathematical definition of the notion of master integrals and present a brief review of algorithmic methods to solve reduction problems for Feynman integrals based on integration by parts relations. In particular, we discuss a recently suggested reduction algorithm which uses Groebner bases. New results obtained with its help for a family of three-loop Feynman integrals are outlined.

  19. Activity based costing (ABC Method

    Directory of Open Access Journals (Sweden)

    Prof. Ph.D. Saveta Tudorache

    2008-05-01

    Full Text Available In the present paper the need for, and the advantages of, using the Activity Based Costing (ABC) method are presented, a need arising from the problem of information pertinence. This issue has occurred due to the limitations of classic methods in this field, limitations also reflected in the disadvantages of such classic methods in establishing complete costs.
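
    A minimal worked example of the allocation arithmetic behind activity based costing, with invented cost pools, drivers and products; it only illustrates the mechanics, not the paper's argument about information pertinence.

        # Hypothetical figures; they only illustrate the allocation arithmetic of ABC.
        pools = {                                  # cost pool -> (overhead cost, activity driver)
            "machine setup":   (20000.0, "setups"),
            "quality control": (12000.0, "inspections"),
        }
        usage = {                                  # activity consumption per product
            "product A": {"setups": 30, "inspections": 100},
            "product B": {"setups": 10, "inspections": 300},
        }

        totals = {d: sum(u[d] for u in usage.values()) for d in ("setups", "inspections")}
        for product, use in usage.items():
            cost = sum(amount * use[d] / totals[d] for amount, d in pools.values())
            print(product, round(cost, 2))         # overhead charged to each product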

  20. Numerical method of singular problems on singular integrals

    International Nuclear Information System (INIS)

    Zhao Huaiguo; Mou Zongze

    1992-02-01

    As the first part of numerical research on singular problems, a numerical method is proposed for singular integrals. It is shown that the procedure is quite powerful for physics calculations involving singularities, such as the plasma dispersion function. Useful quadrature formulas for some classes of singular integrals are derived. In general, integrals with more complex singularities can be dealt with easily by this method
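
    As an illustration of handling such singular integrands (not the quadrature formulas derived in the report), the sketch below computes the real part of the plasma dispersion function by subtracting the pole before applying a standard quadrature, and checks the result against the Dawson-function identity; the truncation length L and the evaluation point are arbitrary.

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import dawsn

        def re_Z(x, L=8.0):
            # Real part of the plasma dispersion function via subtraction of the pole:
            # the smooth part is integrated numerically, the subtracted pole analytically.
            smooth = lambda t: ((np.exp(-t*t) - np.exp(-x*x)) / (t - x)
                                if t != x else -2.0 * x * np.exp(-x*x))
            val, _ = quad(smooth, -L, L, points=[x], limit=200)
            val += np.exp(-x*x) * np.log((L - x) / (L + x))   # principal value of the pole term
            return val / np.sqrt(np.pi)

        x = 0.7
        print(re_Z(x), -2.0 * dawsn(x))            # both match the Dawson-function identity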

  1. Life cycle integrated thermoeconomic assessment method for energy conversion systems

    International Nuclear Information System (INIS)

    Kanbur, Baris Burak; Xiang, Liming; Dubey, Swapnil; Choo, Fook Hoong; Duan, Fei

    2017-01-01

    Highlights: • A new LCA integrated thermoeconomic approach is presented. • The new unit fuel cost is found 4.8 times higher than with the classic method. • The newly defined parameter increased the sustainability index by 67.1%. • The case studies are performed for countries with different CO2 prices. - Abstract: Life cycle assessment (LCA) based thermoeconomic modelling has been applied for the evaluation of energy conversion systems since it provides more comprehensive and applicable assessment criteria. This study proposes an improved thermoeconomic method, named life cycle integrated thermoeconomic assessment (LCiTA), which combines the LCA based enviroeconomic parameters in the production steps of the system components and fuel with the conventional thermoeconomic method for energy conversion systems. A micro-cogeneration system is investigated and analyzed with the LCiTA method; the comparative studies show that the unit cost of fuel obtained with the LCiTA method is 3.8 times higher than with the conventional thermoeconomic model. It is also realized that the enviroeconomic parameters during the operation of the system components do not have significant impacts on the system streams, since the exergetic parameters are dominant in the thermoeconomic calculations. Moreover, the improved sustainability index is found to be roughly 67.2% higher than the previously defined sustainability index, suggesting that the enviroeconomic and thermoeconomic parameters decrease the impact of the exergy destruction in the sustainability index definition. To find feasible operation conditions for the micro-cogeneration system, different assessment strategies are presented. Furthermore, a case study for Singapore is conducted to see the impact of the forecasted carbon dioxide prices on the thermoeconomic performance of the micro-cogeneration system.

  2. Train integrity detection risk analysis based on PRISM

    Science.gov (United States)

    Wen, Yuan

    2018-04-01

    The GNSS based Train Integrity Monitoring System (TIMS) is an effective and low-cost scheme for train integrity detection. However, as an external auxiliary system of CTCS, GNSS may be influenced by external environments, such as the uncertainty of wireless communication channels, which may lead to communication and positioning failures. In order to guarantee the reliability and safety of train operation, a risk analysis method for train integrity detection based on PRISM is proposed in this article. First, we analyze and model the risk factors in the GNSS communication process and the on-board communication process. Then, we evaluate the performance of the model in PRISM based on field data. Finally, we discuss how these risk factors influence the train integrity detection process.

  3. Hierarchical Matrices Method and Its Application in Electromagnetic Integral Equations

    Directory of Open Access Journals (Sweden)

    Han Guo

    2012-01-01

    Full Text Available The hierarchical (H-) matrices method is a general mathematical framework providing a highly compact representation and efficient numerical arithmetic. When applied in integral-equation (IE)-based computational electromagnetics, H-matrices can be regarded as a fast algorithm; therefore, both the CPU time and the memory requirement are reduced significantly. Its kernel-independent feature also makes it suitable for any kind of integral equation. To solve H-matrix systems, Krylov iteration methods can be employed with appropriate preconditioners, and direct solvers based on the hierarchical structure of H-matrices are also available along with high efficiency and accuracy, which is a unique advantage compared to other fast algorithms. In this paper, a novel sparse approximate inverse (SAI) preconditioner in multilevel fashion is proposed to accelerate the convergence rate of Krylov iterations for solving H-matrix systems in electromagnetic applications, and a group of parallel fast direct solvers are developed for dealing with multiple right-hand-side cases. Finally, numerical experiments are given to demonstrate the advantages of the proposed multilevel preconditioner compared to conventional “single level” preconditioners and the practicability of the fast direct solvers for arbitrarily complex structures.

  4. Momentum integral network method for thermal-hydraulic transient analysis

    International Nuclear Information System (INIS)

    Van Tuyle, G.J.

    1983-01-01

    A new momentum integral network method has been developed, and tested in the MINET computer code. The method was developed in order to facilitate the transient analysis of complex fluid flow and heat transfer networks, such as those found in the balance of plant of power generating facilities. The method employed in the MINET code is a major extension of a momentum integral method reported by Meyer. Meyer integrated the momentum equation over several linked nodes, called a segment, and used a segment average pressure, evaluated from the pressures at both ends. Nodal mass and energy conservation determined nodal flows and enthalpies, accounting for fluid compression and thermal expansion

  5. Integrated Temperature Sensors based on Heat Diffusion

    NARCIS (Netherlands)

    Van Vroonhoven, C.P.L.

    2015-01-01

    This thesis describes the theory, design and implementation of a new class of integrated temperature sensors, based on heat diffusion. In such sensors, temperature is sensed by measuring the time it takes for heat to diffuse through silicon. An on-chip thermal delay can be determined by geometry and

  6. Integrated airfoil and blade design method for large wind turbines

    DEFF Research Database (Denmark)

    Zhu, Wei Jun; Shen, Wen Zhong

    2013-01-01

    This paper presents an integrated method for designing airfoil families of large wind turbine blades. For a given rotor diameter and tip speed ratio, the optimal airfoils are designed based on the local speed ratios. To achieve high power performance at low cost, the airfoils are designed...... with an objective of high Cp and small chord length. When the airfoils are obtained, the optimum flow angle and rotor solidity are calculated, which forms the basic input to the blade design. The new airfoils are designed based on the previous in-house airfoil family which was optimized at a Reynolds number of 3...... million. A novel shape perturbation function is introduced to optimize the geometry on the existing airfoils and thus simplify the design procedure. The viscous/inviscid code Xfoil is used as the aerodynamic tool for airfoil optimization where the Reynolds number is set at 16 million with a free...

  7. Service tailoring: a method and tool for user-centric creation of integrated IT-based homecare services to support independent living of elderly

    NARCIS (Netherlands)

    Zarifi Eslami, Mohammed

    2013-01-01

    This thesis addresses the problem of supporting independent living of elderly people through IT-based homecare services. Independent living is seen as one way to deal with the consequences of an aging population (especially in industrialized countries), which include rising healthcare expenditures

  8. Achieving Integration in Mixed Methods Designs—Principles and Practices

    OpenAIRE

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participato...

  9. Integrated management of thesis using clustering method

    Science.gov (United States)

    Astuti, Indah Fitri; Cahyadi, Dedy

    2017-02-01

    The thesis is one of the major requirements for students pursuing their bachelor degree. In fact, finishing the thesis involves a long process including consultation, writing the manuscript, conducting the chosen method, seminar scheduling, searching for references, and the appraisal process by the board of mentors and examiners. Unfortunately, most students find it hard to match all the lecturers' free time to sit together in a seminar room in order to examine the thesis. Therefore, the seminar scheduling process should be at the top of the priorities to be solved. A manual mechanism for this task no longer fulfills the need. People on campus, including students, staff, and lecturers, demand a system in which all the stakeholders can interact with each other and manage the thesis process without conflicting timetables. A branch of computer science named Management Information Systems (MIS) could provide a breakthrough in dealing with thesis management. This research applies a method called clustering to distinguish certain categories using mathematical formulas. A system is then developed along with the method to create a well-managed tool providing the main facilities such as seminar scheduling, the consultation and review process, thesis approval, the assessment process, and also a reliable database of theses. The database plays an important role for present and future purposes.

  10. Elements for successful sensor-based process control {Integrated Metrology}

    International Nuclear Information System (INIS)

    Butler, Stephanie Watts

    1998-01-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended

  11. Elements for successful sensor-based process control {Integrated Metrology}

    Science.gov (United States)

    Butler, Stephanie Watts

    1998-11-01

    Current productivity needs have stimulated development of alternative metrology, control, and equipment maintenance methods. Specifically, sensor applications provide the opportunity to increase productivity, tighten control, reduce scrap, and improve maintenance schedules and procedures. Past experience indicates a complete integrated solution must be provided for sensor-based control to be used successfully in production. In this paper, Integrated Metrology is proposed as the term for an integrated solution that will result in a successful application of sensors for process control. This paper defines and explores the perceived four elements of successful sensor applications: business needs, integration, components, and form. Based upon analysis of existing successful commercially available controllers, the necessary business factors have been determined to be strong, measurable industry-wide business needs whose solution is profitable and feasible. This paper examines why the key aspect of integration is the decision making process. A detailed discussion is provided of the components of most importance to sensor based control: decision-making methods, the 3R's of sensors, and connectivity. A metric for one of the R's (resolution) is proposed to allow focus on this important aspect of measurement. A form for these integrated components which synergistically partitions various aspects of control at the equipment and MES levels to efficiently achieve desired benefits is recommended.

  12. Deterministic methods to solve the integral transport equation in neutronic

    International Nuclear Information System (INIS)

    Warin, X.

    1993-11-01

    We present a synthesis of the methods used to solve the integral transport equation in neutronic. This formulation is above all used to compute solutions in 2D in heterogeneous assemblies. Three kinds of methods are described: - the collision probability method; - the interface current method; - the current coupling collision probability method. These methods don't seem to be the most effective in 3D. (author). 9 figs

  13. Gain Scheduling of Observer-Based Controllers with Integral Action

    DEFF Research Database (Denmark)

    Trangbæk, Klaus; Stoustrup, Jakob; Bendtsen, Jan Dimon

    2006-01-01

    This paper presents a method for continuous gain scheduling of observer-based controllers with integral action. Given two stabilising controllers for a given system, explicit state space formulae are presented, allowing one to change gradually from one controller to the other while preserving...

  14. Acoustic 3D modeling by the method of integral equations

    Science.gov (United States)

    Malovichko, M.; Khokhlov, N.; Yavich, N.; Zhdanov, M.

    2018-02-01

    This paper presents a parallel algorithm for frequency-domain acoustic modeling by the method of integral equations (IE). The algorithm is applied to seismic simulation. The IE method reduces the size of the problem but leads to a dense system matrix. A tolerable memory consumption and numerical complexity were achieved by applying an iterative solver, accompanied by an effective matrix-vector multiplication operation based on the fast Fourier transform (FFT). We demonstrate that the IE system matrix is better conditioned than that of the finite-difference (FD) method, and discuss its relation to a specially preconditioned FD matrix. We considered several methods of matrix-vector multiplication for the free-space and layered host models. The developed algorithm and computer code were benchmarked against the FD time-domain solution. It was demonstrated that the method could accurately calculate the seismic field for models with sharp material boundaries and a point source and receiver located close to the free surface. We used OpenMP to speed up the matrix-vector multiplication, while MPI was used to speed up the solution of the system equations, and also for parallelizing across multiple sources. Practical examples and efficiency tests are presented as well.
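
    A one-dimensional sketch of the FFT-accelerated matrix-vector product mentioned in the abstract: a Toeplitz matrix arising from a translation-invariant kernel is embedded in a circulant one, so the product costs O(n log n). The kernel and sizes are arbitrary, and the paper's 3D layered-medium operators are far more involved.

        import numpy as np

        def toeplitz_matvec(c, r, x):
            # Multiply a Toeplitz matrix (first column c, first row r) by x in O(n log n)
            # by embedding it in a circulant matrix and using the FFT.
            n = len(x)
            col = np.concatenate([c, [0.0], r[1:][::-1]])      # first column of the circulant
            y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(np.concatenate([x, np.zeros(n)])))
            return y[:n].real

        n = 256
        kernel = 1.0 / (1.0 + np.arange(n)**2)                 # arbitrary translation-invariant kernel
        x = np.random.default_rng(1).standard_normal(n)
        T = kernel[np.abs(np.arange(n)[:, None] - np.arange(n)[None, :])]   # dense reference matrix
        print(np.max(np.abs(T @ x - toeplitz_matvec(kernel, kernel, x))))   # agrees to round-off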

  15. Quadratic algebras in the noncommutative integration method of wave equation

    International Nuclear Information System (INIS)

    Varaksin, O.L.

    1995-01-01

    The paper deals with the investigation of applications of the method of noncommutative integration of linear partial differential equations. A nontrivial example is considered: the integration of the three-dimensional wave equation with the use of non-Abelian quadratic algebras

  16. INTEGRATED FUSION METHOD FOR MULTIPLE TEMPORAL-SPATIAL-SPECTRAL IMAGES

    Directory of Open Access Journals (Sweden)

    H. Shen

    2012-08-01

    Full Text Available Data fusion techniques have been widely researched and applied in the remote sensing field. In this paper, an integrated fusion method for remotely sensed images is presented. Differently from existing methods, the proposed method is able to integrate the complementary information in multiple temporal-spatial-spectral images. In order to represent and process the images in one unified framework, two general image observation models are first presented, and then the maximum a posteriori (MAP) framework is used to set up the fusion model. The gradient descent method is employed to solve for the fused image. The efficacy of the proposed method is validated using simulated images.

  17. Evaluation of time integration methods for transient response analysis of nonlinear structures

    International Nuclear Information System (INIS)

    Park, K.C.

    1975-01-01

    Recent developments in the evaluation of direct time integration methods for the transient response analysis of nonlinear structures are presented. These developments, which are based on local stability considerations of an integrator, show that the interaction between temporal step size and nonlinearities of structural systems has a pronounced effect on both accuracy and stability of a given time integration method. The resulting evaluation technique is applied to a model nonlinear problem, in order to: 1) demonstrate that it eliminates the present costly process of evaluating time integrator for nonlinear structural systems via extensive numerical experiments; 2) identify the desirable characteristics of time integration methods for nonlinear structural problems; 3) develop improved stiffly-stable methods for application to nonlinear structures. Extension of the methodology for examination of the interaction between a time integrator and the approximate treatment of nonlinearities (such as due to pseudo-force or incremental solution procedures) is also discussed. (Auth.)

  18. Alternative containment integrity test methods, an overview of possible techniques

    International Nuclear Information System (INIS)

    Spletzer, B.L.

    1986-01-01

    A study is being conducted to develop and analyze alternative methods for testing of containment integrity. The study is focused on techniques for continuously monitoring containment integrity to provide rapid detection of existing leaks, thus providing greater certainty of the integrity of the containment at any time. The study is also intended to develop techniques applicable to the currently required Type A integrated leakage rate tests. A brief discussion of the range of alternative methods currently being considered is presented. The methods include applicability to all major containment types, operating and shutdown plant conditions, and quantitative and qualitative leakage measurements. The techniques are analyzed in accordance with the current state of knowledge of each method. The bulk of the techniques discussed are in the conceptual stage, have not been tested in actual plant conditions, and are presented here as a possible future direction for evaluating containment integrity. Of the methods considered, no single method provides optimum performance for all containment types. Several methods are limited in the types of containment for which they are applicable. The results of the study to date indicate that techniques for continuous monitoring of containment integrity exist for many plants and may be implemented at modest cost

  19. Two pricing methods for solving an integrated commercial fishery ...

    African Journals Online (AJOL)

    a model (Hasan and Raffensperger, 2006) to solve this problem: the integrated ... planning and labour allocation for that processing firm, but did not consider any fleet- .... the DBONP method actually finds such price information, and uses it.

  20. Critical Analysis of Methods for Integrating Economic and Environmental Indicators

    NARCIS (Netherlands)

    Huguet Ferran, Pau; Heijungs, Reinout; Vogtländer, Joost G.

    2018-01-01

    The application of environmental strategies requires scoring and evaluation methods that provide an integrated vision of the economic and environmental performance of systems. The vector optimisation, ratio and weighted addition of indicators are the three most prevalent techniques for addressing

  1. A simple flow-concentration modelling method for integrating water ...

    African Journals Online (AJOL)

    A simple flow-concentration modelling method for integrating water quality and ... flow requirements are assessed for maintenance low flow, drought low flow ... the instream concentrations of chemical constituents that will arise from different ...

  2. APPLICATION OF BOUNDARY INTEGRAL EQUATION METHOD FOR THERMOELASTICITY PROBLEMS

    Directory of Open Access Journals (Sweden)

    Vorona Yu.V.

    2015-12-01

    Full Text Available The Boundary Integral Equation Method is used to solve analytically the problems of coupled thermoelastic spherical wave propagation. The resulting mathematical expressions coincide with the solutions obtained in a conventional manner.

  3. New Approaches to Aluminum Integral Foam Production with Casting Methods

    Directory of Open Access Journals (Sweden)

    Ahmet Güner

    2015-08-01

    Full Text Available Integral foam has been used in the production of polymer materials for a long time. Metal integral foam casting systems are obtained by transferring and adapting polymer injection technology. Metal integral foam produced by casting has a solid skin at the surface and a foam core. Producing near-net shape reduces production expenses. Insurance companies nowadays want the automotive industry to use metallic foam parts because of their higher impact energy absorption properties. In this paper, manufacturing processes of aluminum integral foam with casting methods will be discussed.

  4. Microprocessor-based integrated LMFBR core surveillance

    International Nuclear Information System (INIS)

    Gmeiner, L.

    1984-06-01

    This report results from a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. Due to new developments in microelectronics and related software an approach to LMFBR core surveillance can be conceived that combines a number of measurements into a more intelligent decision-making data processing system. The following techniques are considered to contribute essentially to an integrated core surveillance system: - subassembly state and thermal hydraulics performance monitoring, - temperature noise analysis, - acoustic core surveillance, - failure characterization and failure prediction based on DND- and cover gas signals, and - flux tilting techniques. Starting from a description of these techniques it is shown that by combination and correlation of these individual techniques a higher degree of cost-effectiveness, reliability and accuracy can be achieved. (orig./GL) [de

  5. Tau method approximation of the Hubbell rectangular source integral

    International Nuclear Information System (INIS)

    Kalla, S.L.; Khajah, H.G.

    2000-01-01

    The Tau method is applied to obtain expansions, in terms of Chebyshev polynomials, which approximate the Hubbell rectangular source integral I(a,b) = ∫_0^b [1/√(1+x²)] arctan[a/√(1+x²)] dx. This integral corresponds to the response of an omni-directional radiation detector situated over a corner of a plane isotropic rectangular source. A discussion of the error in the Tau method approximation follows
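
    For reference (not the Tau/Chebyshev expansion of the paper itself): the integral written above can be evaluated directly by adaptive quadrature to produce comparison values.

    import numpy as np
    from scipy.integrate import quad

    def hubbell_I(a, b):
        # Direct numerical evaluation of
        # I(a, b) = ∫_0^b arctan(a / sqrt(1 + x^2)) / sqrt(1 + x^2) dx.
        integrand = lambda x: np.arctan(a / np.sqrt(1.0 + x * x)) / np.sqrt(1.0 + x * x)
        value, _ = quad(integrand, 0.0, b)
        return value

    print(hubbell_I(1.0, 1.0))   # reference value to check a truncated Chebyshev expansion against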

  6. Integrating financial theory and methods in electricity resource planning

    Energy Technology Data Exchange (ETDEWEB)

    Felder, F.A. [Economics Resource Group, Cambridge, MA (United States)

    1996-02-01

    Decision makers throughout the world are introducing risk and market forces in the electric power industry to lower costs and improve services. Incentive-based regulation (IBR), which replaces cost-of-service ratemaking with an approach that divorces costs from revenues, exposes the utility to the risk of profits or losses depending on its performance. Regulators are also allowing competition within the industry, most notably in the wholesale market and possibly in the retail market. Two financial approaches that incorporate risk in resource planning are evaluated: risk-adjusted discount rates (RADR) and options theory (OT). These two complementary approaches are an improvement over the standard present value revenue requirement (PVRR). However, each method has some important limitations. By correctly using RADR and OT and understanding their limitations, decision makers can improve their ability to value risk properly in power plant projects and integrated resource plans. (Author)
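
    For illustration only (the numbers are hypothetical and not from the record): the RADR idea amounts to discounting a project's cash flows at a rate raised above the riskless rate to reflect project risk, in contrast to a single PVRR-style rate.

    def present_value(cash_flows, discount_rate):
        # Present value of yearly cash flows at a constant annual discount rate.
        return sum(cf / (1.0 + discount_rate) ** t for t, cf in enumerate(cash_flows, start=1))

    cash_flows = [120.0] * 20                       # hypothetical yearly net benefit of a plant
    pv_base = present_value(cash_flows, 0.05)       # low-risk discount rate
    pv_radr = present_value(cash_flows, 0.09)       # risk-adjusted (higher) discount rate
    print(pv_base, pv_radr)                         # the riskier valuation is markedly lower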

  7. An integrated lean-methods approach to hospital facilities redesign.

    Science.gov (United States)

    Nicholas, John

    2012-01-01

    Lean production methods for eliminating waste and improving processes in manufacturing are now being applied in healthcare. As the author shows, the methods are appropriate for redesigning hospital facilities. When used in an integrated manner and employing teams of mostly clinicians, the methods produce facility designs that are custom-fit to patient needs and caregiver work processes, and reduce operational costs. The author reviews lean methods and an approach for integrating them in the redesign of hospital facilities. A case example of the redesign of an emergency department shows the feasibility and benefits of the approach.

  8. Iterative algorithm for the volume integral method for magnetostatics problems

    International Nuclear Information System (INIS)

    Pasciak, J.E.

    1980-11-01

    Volume integral methods for solving nonlinear magnetostatics problems are considered in this paper. The integral method is discretized by a Galerkin technique. Estimates are given which show that the linearized problems are well conditioned and hence easily solved using iterative techniques. Comparison of iterative algorithms with the elimination method of GFUN3D shows that the iterative method gives an order of magnitude improvement in computational time as well as memory requirements for large problems. Computational experiments for a test problem as well as a double layer dipole magnet are given. Error estimates for the linearized problem are also derived
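
    For illustration only (the system below is synthetic, not a GFUN3D problem): when the linearized system is well conditioned, a Krylov iteration such as conjugate gradients reaches the direct-solve answer in few iterations, which is the source of the time and memory advantage claimed in the record.

    import numpy as np
    from scipy.sparse.linalg import cg

    # Hypothetical symmetric positive definite system with eigenvalues clustered near 1.
    n = 500
    rng = np.random.default_rng(0)
    M = rng.standard_normal((n, n)) / np.sqrt(n)
    A = np.eye(n) + 0.1 * (M @ M.T)
    b = rng.standard_normal(n)

    x_direct = np.linalg.solve(A, b)     # elimination (direct) solve
    x_iter, info = cg(A, b)              # conjugate gradient iteration
    print(info, np.linalg.norm(x_iter - x_direct))   # info == 0 means converged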

  9. Integrated Data Base: Status and waste projections

    International Nuclear Information System (INIS)

    Klein, J.A.

    1990-01-01

    The Integrated Data Base (IDB) is the official US Department of Energy (DOE) data base for spent fuel and radioactive waste inventories and projections. DOE low-level waste (LLW) is just one of the many waste types that are documented with the IDB. Summary-level tables and figures are presented illustrating historical and projected volume changes of DOE LLW. This information is readily available through the annual IDB publication. Other presentation formats are also available to the DOE community through a request to the IDB Program. 4 refs., 6 figs., 5 tabs

  10. Integrating Globality and Locality for Robust Representation Based Classification

    Directory of Open Access Journals (Sweden)

    Zheng Zhang

    2014-01-01

    Full Text Available The representation based classification method (RBCM) has shown huge potential for face recognition since it first emerged. The linear regression classification (LRC) method and the collaborative representation classification (CRC) method are two well-known RBCMs. LRC and CRC exploit the training samples of each class and all the training samples, respectively, to represent the testing sample, and subsequently conduct classification on the basis of the representation residual. The LRC method can be viewed as a “locality representation” method because it just uses the training samples of each class to represent the testing sample, and it cannot embody the effectiveness of the “globality representation.” On the contrary, it seems that the CRC method cannot own the benefit of locality of the general RBCM. Thus we propose to integrate CRC and LRC to perform more robust representation based classification. The experimental results on benchmark face databases substantially demonstrate that the proposed method achieves high classification accuracy.
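
    For illustration only (a simplified sketch, not the authors' exact formulation): CRC codes the test sample over all training samples with a ridge penalty and classifies by class-wise residuals; LRC would instead solve a separate least-squares problem per class.

    import numpy as np

    def crc_classify(X_train, y_train, x_test, lam=0.01):
        # Collaborative representation: code x_test over ALL training samples
        # (columns of X_train), then pick the class whose partial reconstruction
        # has the smallest residual.
        A = X_train
        coef = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ x_test)
        best_class, best_residual = None, np.inf
        for c in np.unique(y_train):
            mask = (y_train == c)
            residual = np.linalg.norm(x_test - A[:, mask] @ coef[mask])
            if residual < best_residual:
                best_class, best_residual = c, residual
        return best_class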

  11. An integrated approach for facilities planning by ELECTRE method

    Science.gov (United States)

    Elbishari, E. M. Y.; Hazza, M. H. F. Al; Adesta, E. Y. T.; Rahman, Nur Salihah Binti Abdul

    2018-01-01

    Facility planning is concerned with the design, layout, and accommodation of people, machines and activities of a system. Most researchers investigate the production area layout and the related facilities. However, few of them investigate the relationship between the production space and the service departments. The aim of this research is to integrate different approaches in order to evaluate, analyse and select the best facilities planning method that is able to explain the relationship between the production area and other supporting departments and its effect on human effort. To achieve the objective of this research, two different approaches have been integrated: Apple’s layout procedure, one of the effective tools in planning factories, and the ELECTRE method, one of the Multi Criteria Decision Making (MCDM) methods, used to minimize the risk of producing a poor facilities plan. Dalia Industries was selected as a case study to implement this integration; the factory was divided into two main areas: the whole facility (layout A) and the manufacturing area (layout B). This article is concerned with the manufacturing area layout (layout B). After analysing the gathered data, the manufacturing area was divided into 10 activities. The alternatives were compared on five factors: inter-department satisfaction level, total distance travelled by workers, total distance travelled by the product, total travel time for workers, and total travel time for the product. Three different layout alternatives were developed in addition to the original layouts. Apple’s layout procedure was used to study and evaluate the different alternative layouts; the evaluation was done by calculating scores for each of the factors. After obtaining the scores from evaluating the layouts, the ELECTRE method was used to compare the proposed alternatives with each other and with
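
    For illustration only (weights and scores below are hypothetical, and the discordance step of ELECTRE is omitted for brevity): the pairwise concordance matrix at the core of ELECTRE can be computed as follows, assuming all criteria are oriented so that higher scores are better (cost-type criteria such as distances and times would be negated first).

    import numpy as np

    def concordance_matrix(scores, weights):
        # C[i, j] = total weight of the criteria on which alternative i is at
        # least as good as alternative j.
        n = scores.shape[0]
        C = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                if i != j:
                    C[i, j] = weights[scores[i] >= scores[j]].sum()
        return C

    # Hypothetical example: 4 layout alternatives scored on 5 factors.
    scores = np.array([[7, 5, 6, 8, 4],
                       [6, 6, 7, 5, 5],
                       [8, 4, 5, 6, 6],
                       [5, 7, 8, 7, 3]], dtype=float)
    weights = np.full(5, 0.2)
    print(concordance_matrix(scores, weights))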

  12. Review of Statistical Learning Methods in Integrated Omics Studies (An Integrated Information Science).

    Science.gov (United States)

    Zeng, Irene Sui Lan; Lumley, Thomas

    2018-01-01

    Integrated omics is becoming a new channel for investigating the complex molecular system in modern biological science and sets a foundation for systematic learning for precision medicine. The statistical/machine learning methods that have emerged in the past decade for integrated omics are not only innovative but also multidisciplinary, with integrated knowledge in biology, medicine, statistics, machine learning, and artificial intelligence. Here, we review the nontrivial classes of learning methods from the statistical aspects and streamline these learning methods within the statistical learning framework. The intriguing findings from the review are that the methods used are generalizable to other disciplines with complex systematic structure, and that integrated omics is part of an integrated information science which has collated and integrated different types of information for inferences and decision making. We review the statistical learning methods of exploratory and supervised learning from 42 publications. We also discuss the strengths and limitations of the extended principal component analysis, cluster analysis, network analysis, and regression methods. Statistical techniques such as penalization for sparsity induction when there are fewer observations than the number of features, and the use of a Bayesian approach when there is prior knowledge to be integrated, are also included in the commentary. For the completeness of the review, a table of currently available software and packages from 23 publications for omics is summarized in the appendix.
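
    For illustration only (synthetic data, not from any of the reviewed studies): the penalization-for-sparsity technique mentioned above, in its simplest form, is an L1-penalized regression fitted when the number of features far exceeds the number of samples.

    import numpy as np
    from sklearn.linear_model import Lasso

    # Hypothetical integrated-omics design: 50 samples, 500 concatenated features.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((50, 500))
    beta = np.zeros(500)
    beta[:5] = 2.0                                # only a few truly informative features
    y = X @ beta + rng.standard_normal(50)

    model = Lasso(alpha=0.1).fit(X, y)            # L1 penalty induces sparsity
    print(np.flatnonzero(model.coef_)[:10])       # indices of the selected features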

  13. Achieving integration in mixed methods designs-principles and practices.

    Science.gov (United States)

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-12-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs-exploratory sequential, explanatory sequential, and convergent-and through four advanced frameworks-multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. © Health Research and Educational Trust.

  14. Achieving Integration in Mixed Methods Designs—Principles and Practices

    Science.gov (United States)

    Fetters, Michael D; Curry, Leslie A; Creswell, John W

    2013-01-01

    Mixed methods research offers powerful tools for investigating complex processes and systems in health and health care. This article describes integration principles and practices at three levels in mixed methods research and provides illustrative examples. Integration at the study design level occurs through three basic mixed method designs—exploratory sequential, explanatory sequential, and convergent—and through four advanced frameworks—multistage, intervention, case study, and participatory. Integration at the methods level occurs through four approaches. In connecting, one database links to the other through sampling. With building, one database informs the data collection approach of the other. When merging, the two databases are brought together for analysis. With embedding, data collection and analysis link at multiple points. Integration at the interpretation and reporting level occurs through narrative, data transformation, and joint display. The fit of integration describes the extent the qualitative and quantitative findings cohere. Understanding these principles and practices of integration can help health services researchers leverage the strengths of mixed methods. PMID:24279835

  15. The Integrated Multi-Level Bilingual Teaching of "Social Research Methods"

    Science.gov (United States)

    Zhu, Yanhan; Ye, Jian

    2012-01-01

    "Social Research Methods," as a methodology course, combines theories and practices closely. Based on the synergy theory, this paper tries to establish an integrated multi-level bilingual teaching mode. Starting from the transformation of teaching concepts, we should integrate interactions, experiences, and researches together and focus…

  16. Phase-integral method allowing nearlying transition points

    CERN Document Server

    Fröman, Nanny

    1996-01-01

    The efficiency of the phase-integral method developed by the present authors has been shown both analytically and numerically in many publications. With the inclusion of supplementary quantities, closely related to new Stokes constants and obtained with the aid of comparison equation technique, important classes of problems in which transition points may approach each other become accessible to accurate analytical treatment. The exposition in this monograph is of a mathematical nature but has important physical applications, some examples of which are found in the adjoined papers. Thus, we would like to emphasize that, although we aim at mathematical rigor, our treatment is made primarily with physical needs in mind. To introduce the reader into the background of this book, we start by describing the phase-integral approximation of arbitrary order generated from an unspecified base function. This is done in Chapter 1, which is reprinted, after minor changes, from a review article. Chapter 2 is the re...

  17. Entropy-based benchmarking methods

    NARCIS (Netherlands)

    Temurshoev, Umed

    2012-01-01

    We argue that benchmarking sign-volatile series should be based on the principle of movement and sign preservation, which states that a benchmarked series should reproduce the movement and signs in the original series. We show that the widely used variants of Denton (1971) method and the growth

  18. Integrated project delivery methods for energy renovation of social housing

    Directory of Open Access Journals (Sweden)

    Tadeo Baldiri Salcedo Rahola

    2015-11-01

    renting them. As such, SHOs are used to dealing with renovations on a professional basis. The limited financial capacity of SHOs to realise energy renovations magnifies the importance of improving process performance in order to get the best possible outcomes. In the last 30 years numerous authors have addressed the need to improve the performance of traditional construction processes via alternative project delivery methods. However, very little is known about the specifics of renovation processes for social housing, the feasibility of applying innovative construction management methods and the consequences for the process, for the role of all the actors involved and for the results of the projects. The aim of this study is to provide an insight into the project delivery methods available for SHOs when they are undertaking energy renovation projects and to evaluate how these methods could facilitate the achievement of a higher process performance. The main research question is: How can Social Housing Organisations improve the performance of energy renovation processes using more integrated project delivery methods? The idea of a PhD thesis about social housing renovation processes originated from the participation of TU Delft as research partner in the Intelligent Energy Europe project SHELTER1 which was carried out between 2010 and 2013. The aim of the SHELTER project was to promote and facilitate the use of new models of cooperation, inspired by integrated design, for the energy renovation of social housing. The SHELTER project was a joint effort between six social housing organisations (Arte Genova, Italy; Black Country Housing Group, United Kingdom; Bulgarian Housing Association, Bulgaria; Dynacité, France; Logirep, France and Société Wallonne du Logement, Belgium), three European professional federations based in Brussels (Architects Council of Europe, Cecodhas Housing Europe and European Builders Confederation) and one research partner (Delft University of

  19. Computation of rectangular source integral by rational parameter polynomial method

    International Nuclear Information System (INIS)

    Prabha, Hem

    2001-01-01

    Hubbell et al. (J. Res. Nat. Bureau Standards 64C (1960) 121) obtained a series expansion for the calculation of the radiation field generated by a plane isotropic rectangular source (plaque), in which the leading term is the integral H(a,b). In this paper another integral I(a,b), which is related to the integral H(a,b), has been solved by the rational parameter polynomial method. From I(a,b), we compute H(a,b). Using this method the integral I(a,b) is expressed in the form of a polynomial of a rational parameter. Generally, a function f(x) is expressed in terms of x; in this method it is expressed in terms of x/(1+x). In this way, the accuracy of the expression is good over a wide range of x as compared to the earlier approach. The results for I(a,b) and H(a,b) are given for a sixth-degree polynomial and are found to be in good agreement with the results obtained by numerically integrating the integral. Accuracy could be increased either by increasing the degree of the polynomial or by dividing the range of integration. The results of H(a,b) and I(a,b) are given for values of b and a up to 2.0 and 20.0, respectively
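
    For illustration only (an ordinary least-squares fit, not the authors' construction or coefficients): the benefit of the rational parameter t = x/(1+x), which maps x in [0, ∞) into t in [0, 1), can be sketched as follows.

    import numpy as np

    def fit_rational_parameter(f, degree=6, x_max=20.0, n_pts=400):
        # Fit f(x) with a polynomial in t = x / (1 + x) so that the approximation
        # stays well-behaved over a wide range of x.
        x = np.linspace(0.0, x_max, n_pts)
        t = x / (1.0 + x)
        coeffs = np.polyfit(t, f(x), degree)
        return lambda x_new: np.polyval(coeffs, x_new / (1.0 + x_new))

    # Smooth test function (not the integral of the record), fitted with a sixth-degree polynomial.
    approx = fit_rational_parameter(np.arctan)
    print(approx(10.0), np.arctan(10.0))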

  20. Presentation planning using an integrated knowledge base

    Science.gov (United States)

    Arens, Yigal; Miller, Lawrence; Sondheimer, Norman

    1988-01-01

    A description is given of user interface research aimed at bringing together multiple input and output modes in a way that handles mixed mode input (commands, menus, forms, natural language), interacts with a diverse collection of underlying software utilities in a uniform way, and presents the results through a combination of output modes including natural language text, maps, charts and graphs. The system, Integrated Interfaces, derives much of its ability to interact uniformly with the user and the underlying services and to build its presentations, from the information present in a central knowledge base. This knowledge base integrates models of the application domain (Navy ships in the Pacific region, in the current demonstration version); the structure of visual displays and their graphical features; the underlying services (data bases and expert systems); and interface functions. The emphasis is on a presentation planner that uses the knowledge base to produce multi-modal output. There has been a flurry of recent work in user interface management systems. (Several recent examples are listed in the references). Existing work is characterized by an attempt to relieve the software designer of the burden of handcrafting an interface for each application. The work has generally focused on intelligently handling input. This paper deals with the other end of the pipeline - presentations.

  1. An integration weighting method to evaluate extremum coordinates

    International Nuclear Information System (INIS)

    Ilyushchenko, V.I.

    1990-01-01

    The numerical version of the Laplace asymptotics has been used to evaluate the coordinates of extrema of multivariate continuous and discontinuous test functions. The computer experiments performed demonstrate the high efficiency of the proposed integration method. The saturating dependence of the extremum coordinates on parameters such as the number of integration subregions and the value of K going (theoretically) to infinity has been studied in detail, the limitand being a ratio of two Laplace integrals with exponentiated K. The given method is an integral equivalent of the method of weighted means. As opposed to standard optimization methods of zeroth, first and second order, the proposed method can be successfully applied to optimize discontinuous objective functions, too. There are possibilities of applying the integration method in cases when the conventional techniques fail due to poor analytical properties of the objective functions near extremal points. The proposed method is efficient in searching for both local and global extrema of multimodal objective functions. 12 refs.; 4 tabs.
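
    For illustration only (a one-dimensional sketch with a hypothetical objective): the ratio of two Laplace integrals with exponentiated K reduces, on a grid, to a weighted mean that concentrates at the maximiser as K grows, and it works even when the objective is discontinuous.

    import numpy as np

    def weighted_extremum(f, a, b, K=200.0, n=10001):
        # Estimate argmax of f on [a, b] as the ratio of two Laplace-type integrals,
        #   x* ≈ ∫ x exp(K f(x)) dx / ∫ exp(K f(x)) dx,
        # evaluated on a uniform grid (the grid spacing cancels in the ratio).
        x = np.linspace(a, b, n)
        w = np.exp(K * (f(x) - f(x).max()))   # subtract the max for numerical stability
        return (x * w).sum() / w.sum()

    f = lambda x: np.where(x < 0.3, x, 1.0 - x)   # discontinuous at x = 0.3, maximum at 0.3
    print(weighted_extremum(f, 0.0, 1.0))         # ≈ 0.3 (biased slightly inward for finite K)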

  2. Integration of Small-Diameter Wood Harvesting in Early Thinnings using the Two pile Cutting Method

    Energy Technology Data Exchange (ETDEWEB)

    Kaerhae, Kalle (Metsaeteho Oy, P.O. Box 101, FI-00171 Helsinki (Finland))

    2008-10-15

    Metsaeteho Oy studied the integrated harvesting of industrial roundwood (pulpwood) and energy wood based on a two-pile cutting method, i.e. pulpwood and energy wood fractions are stacked into two separate piles when cutting a first-thinning stand. The productivity and cost levels of the integrated, two-pile cutting method were determined, and the harvesting costs of the two-pile method were compared with those of conventional separate wood harvesting methods. In the time study, when the size of removal was 50 dm3, the productivity in conventional whole-tree cutting was 6% higher than in integrated cutting. With a stem size of 100 dm3, the productivity in whole-tree cutting was 7% higher than in integrated cutting. The results indicated, however, that integrated harvesting based on the two-pile method enables harvesting costs to be decreased to below the current cost level of separate pulpwood harvesting in first thinning stands. The greatest cost-saving potential lies in small-sized first thinnings. The results showed that, when integrated wood harvesting based on the two-pile method is applied, the removals of both energy wood and pulpwood should be more than 15-20 m3/ha at the harvesting sites in order to achieve economically viable integrated procurement

  3. A web-based personalized risk communication and decision-making tool for women with dense breasts: Design and methods of a randomized controlled trial within an integrated health care system.

    Science.gov (United States)

    Knerr, Sarah; Wernli, Karen J; Leppig, Kathleen; Ehrlich, Kelly; Graham, Amanda L; Farrell, David; Evans, Chalanda; Luta, George; Schwartz, Marc D; O'Neill, Suzanne C

    2017-05-01

    Mammographic breast density is one of the strongest risk factors for breast cancer after age and family history. Mandatory breast density disclosure policies are increasing nationally without clear guidance on how to communicate density status to women. Coupling density disclosure with personalized risk counseling and decision support through a web-based tool may be an effective way to allow women to make informed, values-consistent risk management decisions without increasing distress. This paper describes the design and methods of Engaged, a prospective, randomized controlled trial examining the effect of online personalized risk counseling and decision support on risk management decisions in women with dense breasts and increased breast cancer risk. The trial is embedded in a large integrated health care system in the Pacific Northwest. A total of 1250 female health plan members aged 40-69 with a recent negative screening mammogram who are at increased risk for interval cancer based on their 5-year breast cancer risk and BI-RADS® breast density will be randomly assigned to access either a personalized web-based counseling and decision support tool or standard educational content. Primary outcomes will be assessed using electronic health record data (i.e., chemoprevention and breast MRI utilization) and telephone surveys (i.e., distress) at baseline, six weeks, and twelve months. Engaged will provide evidence about whether a web-based personalized risk counseling and decision support tool is an effective method for communicating with women about breast density and risk management. An effective intervention could be disseminated with minimal clinical burden to align with density disclosure mandates. Clinical Trials Registration Number:NCT03029286. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Higher-Order Integral Equation Methods in Computational Electromagnetics

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Meincke, Peter

    Higher-order integral equation methods have been investigated. The study has focused on improving the accuracy and efficiency of the Method of Moments (MoM) applied to electromagnetic problems. A new set of hierarchical Legendre basis functions of arbitrary order is developed. The new basis...

  5. Two pricing methods for solving an integrated commercial fishery ...

    African Journals Online (AJOL)

    In this paper, we develop two novel pricing methods for solving an integer program. We demonstrate the methods by solving an integrated commercial fishery planning model (IFPM). In this problem, a fishery manager must schedule fishing trawlers (determine when and where the trawlers should go fishing, and when the ...

  6. Method for integrating a train of fast, nanosecond wide pulses

    International Nuclear Information System (INIS)

    Rose, C.R.

    1987-01-01

    This paper describes a method used to integrate a train of fast, nanosecond-wide pulses. The pulses come from current transformers in an RF LINAC beamline. Because they are ac signals and have no dc component, true mathematical integration would yield zero over the pulse train period, or an equally erroneous value because of a dc baseline shift. The circuit used to integrate the pulse train first stretches the pulses to 35 ns FWHM. The signals are then fed into a high-speed, precision rectifier which restores a true dc baseline for the following stage - a fast, gated integrator. The rectifier is linear over 55 dB in excess of 25 MHz, and the gated integrator is linear over a 60 dB range with input pulse widths as short as 16 ns. The assembled system is linear over 30 dB with a 6 MHz input signal

  7. Microprocessor-based integrated LMFBR core surveillance. Pt. 2

    International Nuclear Information System (INIS)

    Elies, V.

    1985-12-01

    This report is the result of the KfK part of a joint study of KfK and INTERATOM. The aim of this study is to explore the advantages of microprocessors and microelectronics for a more sophisticated core surveillance, which is based on the integration of separate surveillance techniques. After a description of the experimental results gained with the different surveillance techniques so far, it is shown which kinds of correlation can be done using the evaluation results obtained from the single surveillance systems. The main part of this report contains the systems analysis of a microcomputer-based system integrating different surveillance methods. After an analysis of the hardware requirements a hardware structure for the integrated system is proposed. The software structure is then described for the subsystem performing the different surveillance algorithms as well as for the system which does the correlation thus deriving additional information from the single results. (orig.) [de

  8. Decision-Based Design Integrating Consumer Preferences into Engineering Design

    CERN Document Server

    Chen, Wei; Wassenaar, Henk Jan

    2013-01-01

    Building upon the fundamental principles of decision theory, Decision-Based Design: Integrating Consumer Preferences into Engineering Design presents an analytical approach to enterprise-driven Decision-Based Design (DBD) as a rigorous framework for decision making in engineering design.  Once the related fundamentals of decision theory, economic analysis, and econometrics modelling are established, the remaining chapters describe the entire process, the associated analytical techniques, and the design case studies for integrating consumer preference modeling into the enterprise-driven DBD framework. Methods for identifying key attributes, optimal design of human appraisal experiments, data collection, data analysis, and demand model estimation are presented and illustrated using engineering design case studies. The scope of the chapters also provides: •A rigorous framework of integrating the interests from both producer and consumers in engineering design, •Analytical techniques of consumer choice model...

  9. Spatial Data Integration Using Ontology-Based Approach

    Science.gov (United States)

    Hasani, S.; Sadeghi-Niaraki, A.; Jelokhani-Niaraki, M.

    2015-12-01

    In today's world, the necessity of spatial data for various organizations is becoming so crucial that many of these organizations have begun to produce spatial data for that purpose. In some circumstances, the need to obtain real-time integrated data requires a sustainable mechanism for processing real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in the mentioned situation is the high degree of heterogeneity between the data of different organizations. To solve this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps: the first step is identification of the objects in a relational database, then the semantic relationships between them are modelled and, subsequently, the ontology of each database is created. In the second step, the relative ontology will be inserted into the database and the relationship of each class of the ontology will be inserted into the newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data. This is done by using the concept of ontology mapping. The proposed approach, in addition to being fast and low-cost, makes the process of data integration easy while the data remain unchanged, thus taking advantage of the legacy applications provided.

  10. SPATIAL DATA INTEGRATION USING ONTOLOGY-BASED APPROACH

    Directory of Open Access Journals (Sweden)

    S. Hasani

    2015-12-01

    Full Text Available In today's world, the necessity of spatial data for various organizations is becoming so crucial that many of these organizations have begun to produce spatial data for that purpose. In some circumstances, the need to obtain real-time integrated data requires a sustainable mechanism for processing real-time integration. A case in point is disaster management, which requires obtaining real-time data from various sources of information. One of the problematic challenges in the mentioned situation is the high degree of heterogeneity between the data of different organizations. To solve this issue, we introduce an ontology-based method to provide sharing and integration capabilities for the existing databases. In addition to resolving semantic heterogeneity, better access to information is also provided by our proposed method. Our approach consists of three steps: the first step is identification of the objects in a relational database, then the semantic relationships between them are modelled and, subsequently, the ontology of each database is created. In the second step, the relative ontology will be inserted into the database and the relationship of each class of the ontology will be inserted into the newly created column in the database tables. The last step consists of a platform based on service-oriented architecture, which allows integration of data. This is done by using the concept of ontology mapping. The proposed approach, in addition to being fast and low-cost, makes the process of data integration easy while the data remain unchanged, thus taking advantage of the legacy applications provided.

  11. Integration of gas chromatography mass spectrometry methods for differentiating ricin preparation methods.

    Science.gov (United States)

    Wunschel, David S; Melville, Angela M; Ehrhardt, Christopher J; Colburn, Heather A; Victry, Kristin D; Antolick, Kathryn C; Wahl, Jon H; Wahl, Karen L

    2012-05-07

    The investigation of crimes involving chemical or biological agents is infrequent, but presents unique analytical challenges. The protein toxin ricin is encountered more frequently than other agents and is found in the seeds of Ricinus communis, commonly known as the castor plant. Typically, the toxin is extracted from castor seeds utilizing a variety of different recipes that result in varying purity of the toxin. Moreover, these various purification steps can also leave or differentially remove a variety of exogenous and endogenous residual components with the toxin that may indicate the type and number of purification steps involved. We have applied three gas chromatography-mass spectrometry (GC-MS) based analytical methods to measure the variation in seed carbohydrates and castor oil ricinoleic acid, as well as the presence of solvents used for purification. These methods were applied to the same samples prepared using four previously identified toxin preparation methods, starting from four varieties of castor seeds. The individual data sets for seed carbohydrate profiles, ricinoleic acid, or acetone amount each provided information capable of differentiating different types of toxin preparations across seed types. However, the integration of the data sets using multivariate factor analysis provided a clear distinction of all samples based on the preparation method, independent of the seed source. In particular, the abundance of mannose, arabinose, fucose, ricinoleic acid, and acetone were shown to be important differentiating factors. These complementary tools provide a more confident determination of the method of toxin preparation than would be possible using a single analytical method.

  12. 14th International Conference on Integral Methods in Science and Engineering

    CERN Document Server

    Riva, Matteo; Lamberti, Pier; Musolino, Paolo

    2017-01-01

    This contributed volume contains a collection of articles on the most recent advances in integral methods.  The first of two volumes, this work focuses on the construction of theoretical integral methods. Written by internationally recognized researchers, the chapters in this book are based on talks given at the Fourteenth International Conference on Integral Methods in Science and Engineering, held July 25-29, 2016, in Padova, Italy. A broad range of topics is addressed, such as: • Integral equations • Homogenization • Duality methods • Optimal design • Conformal techniques This collection will be of interest to researchers in applied mathematics, physics, and mechanical and electrical engineering, as well as graduate students in these disciplines, and to other professionals who use integration as an essential tool in their work.

  13. INTEGRATED SENSOR EVALUATION CIRCUIT AND METHOD FOR OPERATING SAID CIRCUIT

    OpenAIRE

    Krüger, Jens; Gausa, Dominik

    2015-01-01

    WO15090426A1 Sensor evaluation device and method for operating said device Integrated sensor evaluation circuit for evaluating a sensor signal (14) received from a sensor (12), having a first connection (28a) for connection to the sensor and a second connection (28b) for connection to the sensor. The integrated sensor evaluation circuit comprises a configuration data memory (16) for storing configuration data which describe signal properties of a plurality of sensor control signals (26a-c). T...

  14. Mathematical methods linear algebra normed spaces distributions integration

    CERN Document Server

    Korevaar, Jacob

    1968-01-01

    Mathematical Methods, Volume I: Linear Algebra, Normed Spaces, Distributions, Integration focuses on advanced mathematical tools used in applications and the basic concepts of algebra, normed spaces, integration, and distributions.The publication first offers information on algebraic theory of vector spaces and introduction to functional analysis. Discussions focus on linear transformations and functionals, rectangular matrices, systems of linear equations, eigenvalue problems, use of eigenvectors and generalized eigenvectors in the representation of linear operators, metric and normed vector

  15. User's guide to Monte Carlo methods for evaluating path integrals

    Science.gov (United States)

    Westbroek, Marise J. E.; King, Peter R.; Vvedensky, Dimitri D.; Dürr, Stephan

    2018-04-01

    We give an introduction to the calculation of path integrals on a lattice, with the quantum harmonic oscillator as an example. In addition to providing an explicit computational setup and corresponding pseudocode, we pay particular attention to the existence of autocorrelations and the calculation of reliable errors. The over-relaxation technique is presented as a way to counter strong autocorrelations. The simulation methods can be extended to compute observables for path integrals in other settings.
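
    For illustration only (a stripped-down sketch of the kind of computation described, without the autocorrelation analysis, error estimation or over-relaxation treated in the guide): Metropolis sampling of the discretized Euclidean action of the harmonic oscillator on a periodic lattice.

    import numpy as np

    def qho_metropolis(n_sites=64, a=0.5, m=1.0, omega=1.0,
                       n_sweeps=5000, step=0.5, seed=1):
        # Returns an estimate of <x^2> for the lattice harmonic oscillator.
        rng = np.random.default_rng(seed)
        x = np.zeros(n_sites)
        x2_samples = []

        def local_action(i, xi):
            # Part of the Euclidean action that depends on site i (periodic boundaries).
            xp, xm = x[(i + 1) % n_sites], x[i - 1]
            kinetic = m * ((xp - xi) ** 2 + (xi - xm) ** 2) / (2.0 * a)
            potential = 0.5 * a * m * omega ** 2 * xi ** 2
            return kinetic + potential

        for sweep in range(n_sweeps):
            for i in range(n_sites):
                proposal = x[i] + rng.uniform(-step, step)
                dS = local_action(i, proposal) - local_action(i, x[i])
                if dS < 0.0 or rng.random() < np.exp(-dS):
                    x[i] = proposal
            if sweep > n_sweeps // 5:              # crude thermalization cut
                x2_samples.append(np.mean(x ** 2))
        return np.mean(x2_samples)

    print(qho_metropolis())   # roughly 1/(2*m*omega) = 0.5, up to lattice-spacing corrections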

  16. Computerized integrated data base production system (COMPINDAS)

    Energy Technology Data Exchange (ETDEWEB)

    Marek, D; Buerk, K [Fachinformationszentrum Karlsruhe, Gesellschaft fuer Wissenschaftlich-Technische Information mbH, Eggenstein-Leopoldshafen (Germany)

    1990-05-01

    Based on many years of experience, and with the main objective of guaranteeing long-term database quality and efficiency of input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, control and update procedures; and support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis of ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines including masks. (author). 7 figs.

  17. Computerized integrated data base production system (COMPINDAS)

    International Nuclear Information System (INIS)

    Marek, D.; Buerk, K.

    1990-05-01

    Based on many years of experience, and with the main objective of guaranteeing long-term database quality and efficiency of input processes, Fachinformationszentrum Karlsruhe is developing an integrated interactive data management system for bibliographic and factual databases. Its concept includes the following range of applications: subject analysis with computer-assisted classification, indexing and translation; technical procedures with online acquisition and management of literature and factual data, recording by means of optical scanning, computer-assisted bibliographic description, control and update procedures; and support of the whole process by continuous surveillance of document flow. All these procedures will be performed in an integrated manner. The system is to meet high standards for flexibility, data integrity and effectiveness of system functions. Independent of the type of data, the appropriate database or the subject field to be handled, all data will be stored in one large pool. One main goal is to avoid duplication of work and redundancy of data storage. The system will work online, interactively and conversationally. COMPINDAS is being established on the basis of ADABAS as the database management system for storage and retrieval. The applications are being generated by means of aDis of ASTEC in Munich. aDis is used for the definition of the data structures, checking routines, coupling processes, and the design of dialogue and batch routines including masks. (author). 7 figs

  18. Fibonacci-regularization method for solving Cauchy integral equations of the first kind

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Fariborzi Araghi

    2017-09-01

    Full Text Available In this paper, a novel scheme is proposed to solve the first-kind Cauchy integral equation over a finite interval. For this purpose, the regularization method is considered. Then, the collocation method with Fibonacci base functions is applied to solve the obtained second-kind singular integral equation. Also, the error estimate of the proposed scheme is discussed. Finally, some sample Cauchy integral equations stemming from the theory of airfoils in fluid mechanics are presented and solved to illustrate the importance and applicability of the given algorithm. The tables in the examples show the efficiency of the method.

  19. Multifeature Fusion Vehicle Detection Algorithm Based on Choquet Integral

    Directory of Open Access Journals (Sweden)

    Wenhui Li

    2014-01-01

    Full Text Available Vision-based multivehicle detection plays an important role in Forward Collision Warning Systems (FCWS) and Blind Spot Detection Systems (BSDS). The performance of these systems depends on the real-time capability, accuracy, and robustness of vehicle detection methods. To improve the accuracy of the vehicle detection algorithm, we propose a multifeature fusion vehicle detection algorithm based on the Choquet integral. This algorithm divides the vehicle detection problem into two phases: feature similarity measure and multifeature fusion. In the feature similarity measure phase, we first propose a taillight-based vehicle detection method, and then the vehicle taillight feature similarity measure is defined. Second, combining with the definition of the Choquet integral, the vehicle symmetry similarity measure and the HOG + AdaBoost feature similarity measure are defined. Finally, these three features are fused together by the Choquet integral. Evaluated on public test collections and our own test images, the experimental results show that our method achieves effective and robust multivehicle detection in complicated environments. Our method can not only improve the detection rate but also reduce the false alarm rate, which meets the engineering requirements of Advanced Driving Assistance Systems (ADAS).
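
    For illustration only (the fuzzy measure below is hypothetical, not the one used in the paper): the discrete Choquet integral that fuses the per-feature confidence scores can be computed as follows.

    import numpy as np

    def choquet_integral(scores, measure):
        # Discrete Choquet integral of `scores` (one confidence per feature) with
        # respect to a fuzzy measure: a dict mapping frozensets of feature indices
        # to values in [0, 1], with measure of the empty set 0 and of the full set 1.
        order = np.argsort(scores)                 # ascending
        total, prev = 0.0, 0.0
        for k, i in enumerate(order):
            subset = frozenset(order[k:])          # features scoring at least scores[i]
            total += (scores[i] - prev) * measure[subset]
            prev = scores[i]
        return total

    # Hypothetical fusion of three cues: 0 = taillight, 1 = symmetry, 2 = HOG + AdaBoost.
    measure = {frozenset(): 0.0,
               frozenset({0}): 0.3, frozenset({1}): 0.2, frozenset({2}): 0.5,
               frozenset({0, 1}): 0.6, frozenset({0, 2}): 0.8, frozenset({1, 2}): 0.7,
               frozenset({0, 1, 2}): 1.0}
    print(choquet_integral(np.array([0.9, 0.4, 0.7]), measure))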

  20. The Integral Method, a new approach to quantify bactericidal activity.

    Science.gov (United States)

    Gottardi, Waldemar; Pfleiderer, Jörg; Nagl, Markus

    2015-08-01

    The bactericidal activity (BA) of antimicrobial agents is generally derived from the results of killing assays. A reliable quantitative characterization, and particularly a comparison of these substances, is impossible with this information alone. We here propose a new method that takes into account the course of the complete killing curve for assaying BA and that allows a clear-cut quantitative comparison of antimicrobial agents with only one number. The new Integral Method, based on the reciprocal area below the killing curve, reliably calculates an average BA [log10 CFU/min] and, by incorporating the agent's concentration C, the average specific bactericidal activity SBA = BA/C [log10 CFU/min/mM]. Based on experimental killing data, the pertaining BA and SBA values of exemplary active halogen compounds were established, allowing quantitative assertions. N-chlorotaurine (NCT), chloramine T (CAT), monochloramine (NH2Cl), and iodine (I2) showed extremely diverging SBA values of 0.0020±0.0005, 1.11±0.15, 3.49±0.22, and 291±137 log10 CFU/min/mM, respectively, against Staphylococcus aureus. This immediately demonstrates an approximately 550-fold stronger activity of CAT, 1730-fold of NH2Cl, and 150,000-fold of I2 compared to NCT. The inferred quantitative assertions and conclusions prove the new method suitable for characterizing bactericidal activity. Its applications comprise the effect of defined agents on various bacteria, the consequence of temperature shifts, the influence of varying drug structure, dose-effect relationships, ranking of isosteric agents, comparison of competing commercial antimicrobial formulations, and the effect of additives. Copyright © 2015 Elsevier B.V. All rights reserved.
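
    For illustration only: the abstract defines BA through the reciprocal area below the killing curve but does not spell out the normalization, so the scaling used below (squared initial log count over twice the area, which reproduces the average kill rate for a straight-line killing curve) is an assumption for this sketch, not necessarily the authors' formula.

    import numpy as np

    def average_bactericidal_activity(t_min, log10_cfu):
        # Trapezoidal area below the killing curve (log10 CFU versus minutes),
        # converted to an average killing rate in log10 CFU per minute.
        # NOTE: the final scaling is an assumed normalization, see the text above.
        area = np.sum(0.5 * (log10_cfu[1:] + log10_cfu[:-1]) * np.diff(t_min))
        return log10_cfu[0] ** 2 / (2.0 * area)

    t = np.array([0.0, 5.0, 10.0, 20.0, 30.0])
    counts = np.array([6.0, 4.5, 3.0, 1.0, 0.0])     # hypothetical killing curve
    print(average_bactericidal_activity(t, counts))  # log10 CFU/min; divide by C (mM) for SBA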

  1. Tokamak plasma shape identification based on the boundary integral equations

    International Nuclear Information System (INIS)

    Kurihara, Kenichi; Kimura, Toyoaki

    1992-05-01

    A necessary condition for tokamak plasma shape identification is discussed and a new identification method is proposed in this article. This method is based on the boundary integral equations governing a vacuum region around a plasma with only the measurement of either magnetic fluxes or magnetic flux intensities. It can identify various plasmas with low to high ellipticities with the precision determined by the number of the magnetic sensors. This method is applicable to real-time control and visualization using a 'table-look-up' procedure. (author)

  2. An integral nodal variational method for multigroup criticality calculations

    International Nuclear Information System (INIS)

    Lewis, E.E.; Tsoulfanidis, N.

    2003-01-01

    An integral formulation of the variational nodal method is presented and applied to a series of benchmark criticality problems. The method combines an integral transport treatment of the even-parity flux within the spatial node with an odd-parity spherical harmonics expansion of the Lagrange multipliers at the node interfaces. The response matrices that result from this formulation are compatible with those in the VARIANT code at Argonne National Laboratory. Either homogeneous or heterogeneous nodes may be employed. In general, for calculations requiring higher-order angular approximations, the integral method yields solutions with comparable accuracy while requiring substantially less CPU time and memory than the standard spherical harmonics expansion using the same spatial approximations. (author)

  3. Cultural adaptation and translation of measures: an integrated method.

    Science.gov (United States)

    Sidani, Souraya; Guruge, Sepali; Miranda, Joyal; Ford-Gilboe, Marilyn; Varcoe, Colleen

    2010-04-01

    Differences in the conceptualization and operationalization of health-related concepts may exist across cultures. Such differences underscore the importance of examining conceptual equivalence when adapting and translating instruments. In this article, we describe an integrated method for exploring conceptual equivalence within the process of adapting and translating measures. The integrated method involves five phases including selection of instruments for cultural adaptation and translation; assessment of conceptual equivalence, leading to the generation of a set of items deemed to be culturally and linguistically appropriate to assess the concept of interest in the target community; forward translation; back translation (optional); and pre-testing of the set of items. Strengths and limitations of the proposed integrated method are discussed. (c) 2010 Wiley Periodicals, Inc.

  4. Stress estimation in reservoirs using an integrated inverse method

    Science.gov (United States)

    Mazuyer, Antoine; Cupillard, Paul; Giot, Richard; Conin, Marianne; Leroy, Yves; Thore, Pierre

    2018-05-01

    Estimating the stress in reservoirs and their surroundings prior to the production is a key issue for reservoir management planning. In this study, we propose an integrated inverse method to estimate such initial stress state. The 3D stress state is constructed with the displacement-based finite element method assuming linear isotropic elasticity and small perturbations in the current geometry of the geological structures. The Neumann boundary conditions are defined as piecewise linear functions of depth. The discontinuous functions are determined with the CMA-ES (Covariance Matrix Adaptation Evolution Strategy) optimization algorithm to fit wellbore stress data deduced from leak-off tests and breakouts. The disregard of the geological history and the simplified rheological assumptions mean that only the stress field, statically admissible and matching the wellbore data should be exploited. The spatial domain of validity of this statement is assessed by comparing the stress estimations for a synthetic folded structure of finite amplitude with a history constructed assuming a viscous response.
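
    For illustration only (the forward model below is a placeholder, not the finite-element solver of the study, and the numbers are invented): the CMA-ES loop that adjusts the piecewise-linear boundary conditions to fit wellbore data can be sketched with the cma package.

    import numpy as np
    import cma   # pip install cma

    # Hypothetical wellbore observations: depths (m) and stress magnitudes (MPa).
    wellbore_depths = np.array([500.0, 1000.0, 1500.0, 2000.0])
    wellbore_stress = np.array([11.0, 23.0, 34.0, 47.0])

    def forward_model(params, depths):
        # Placeholder for the FE solver: boundary stress as an affine function of depth.
        intercept, gradient = params
        return intercept + gradient * depths

    def misfit(params):
        predicted = forward_model(params, wellbore_depths)
        return float(np.sum((predicted - wellbore_stress) ** 2))

    es = cma.CMAEvolutionStrategy([0.0, 0.01], 1.0)   # initial guess and step size
    es.optimize(misfit)
    print(es.result.xbest)                            # best-fit boundary-condition parameters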

  5. Development the method of integral evaluation the efficiency of management of financing the enterprise

    Directory of Open Access Journals (Sweden)

    Opeshko Nataliya Sergiivna

    2017-06-01

    Full Text Available The essence of the concept of «financing the enterprise» was defined. A system of indicators for measuring the effectiveness of management of financing the enterprise was developed. The internal structure of linkages in the scorecard was investigated. A method of integral evaluation of the management of financing the enterprise was developed. The usefulness of the developed method of integral evaluation was demonstrated on the basis of the experiments conducted.

  6. Applications of integral equation methods for the numerical solution of magnetostatic and eddy current problems

    International Nuclear Information System (INIS)

    Trowbridge, C.W.

    1976-06-01

    Various integral equation methods are described. For magnetostatic problems three formulations are considered in detail: (a) the direct solution method for the magnetisation distribution in permeable materials, (b) a method based on a scalar potential, and (c) the use of an integral equation derived from Green's Theorem, i.e. the so-called Boundary Integral Method (BIM). In the case of (a), results are given for two- and three-dimensional non-linear problems with comparisons against measurement. For methods (b) and (c), which both lead to a more economical use of the computer than (a), some preliminary results are given for simple cases. For eddy current problems various methods are discussed and some results are given from a computer program based on a vector potential formulation. (author)

  7. Applications of integral equation methods for the numerical solution of magnetostatic and eddy current problems

    Energy Technology Data Exchange (ETDEWEB)

    Trowbridge, C W

    1976-06-01

    Various integral equation methods are described. For magnetostatic problems three formulations are considered in detail, (a) the direct solution method for the magnetisation distribution in permeable materials, (b) a method based on a scalar potential, and (c) the use of an integral equation derived from Green's Theorem, i.e. the so-called Boundary Integral Method (BIM). In the case of (a) results are given for two-and three-dimensional non-linear problems with comparisons against measurement. For methods (b) and (c), which both lead to a more economical use of the computer than (a), some preliminary results are given for simple cases. For eddy current problems various methods are discussed and some results are given from a computer program based on a vector potential formulation.

  8. Integrated circuits based on conjugated polymer monolayer.

    Science.gov (United States)

    Li, Mengmeng; Mangalore, Deepthi Kamath; Zhao, Jingbo; Carpenter, Joshua H; Yan, Hongping; Ade, Harald; Yan, He; Müllen, Klaus; Blom, Paul W M; Pisula, Wojciech; de Leeuw, Dago M; Asadi, Kamal

    2018-01-31

    It is still a great challenge to fabricate conjugated polymer monolayer field-effect transistors (PoM-FETs) due to intricate crystallization and film formation of conjugated polymers. Here we demonstrate PoM-FETs based on a single monolayer of a conjugated polymer. The resulting PoM-FETs are highly reproducible and exhibit charge carrier mobilities reaching 3 cm² V⁻¹ s⁻¹. The high performance is attributed to the strong interactions of the polymer chains present already in solution leading to pronounced edge-on packing and well-defined microstructure in the monolayer. The high reproducibility enables the integration of discrete unipolar PoM-FETs into inverters and ring oscillators. Real logic functionality has been demonstrated by constructing a 15-bit code generator in which hundreds of self-assembled PoM-FETs are addressed simultaneously. Our results provide the state-of-the-art example of integrated circuits based on a conjugated polymer monolayer, opening prospective pathways for bottom-up organic electronics.

  9. The 3D Lagrangian Integral Method

    DEFF Research Database (Denmark)

    Rasmussen, Henrik Koblitz

    2003-01-01

    …These are processes such as thermo-forming, gas-assisted injection moulding and all kinds of simultaneous multi-component polymer processing operations. In all polymer processing operations, free surfaces (or interfaces) are present, and the dynamics of these surfaces are of interest. In the "3D Lagrangian Integral Method" used to simulate viscoelastic flow, the governing equations are solved for the particle positions (Lagrangian kinematics). Therefore, the transient motion of surfaces can be followed in a particularly simple fashion even in 3D viscoelastic flow. The "3D Lagrangian Integral Method" is described…

  10. Integrated Management System as a base for customer satisfaction

    International Nuclear Information System (INIS)

    Grabelnikov, Konstantin V.

    2012-01-01

    In this article, JSC NCCP integrated management system procedures are presented. The unique possibility of collecting different Customer Voices, based on different organisational cultures and approaches, to improve the technological process, the design of nuclear fuel, quality control methods and instrumentation is presented. As a result of these mutual efforts, we have a stable and valuable decrease of leaking FA at the NPPs of our Customers. On the other hand, we have a stable and valuable increase of the Customer Satisfaction Level.

  11. Activity-based costing method

    Directory of Open Access Journals (Sweden)

    Čuchranová Katarína

    2001-06-01

    Activity-based costing is a method of identifying and tracking the operating costs directly associated with processing items. It is the practice of focusing on some unit of output, such as a purchase order or an assembled automobile, and attempting to determine its total cost as precisely as possible based on the fixed and variable costs of the inputs. ABC is used to identify, quantify and analyze the various cost drivers (such as labor, materials, administrative overhead and rework) and to determine which of them are candidates for reduction. A process is any activity that accepts inputs, adds value to these inputs for customers and produces outputs for these customers. The customer may be either internal or external to the organization. Every activity within an organization comprises one or more processes; inputs, controls and resources are all supplied to the process. A process owner is the person responsible for performing and/or controlling the activity. Tracing costs through their connection to particular activities and processes is a modern theme today, and the introduction of this method is connected with important changes in a firm's processes. The ABC method is an instrument that brings competitive advantages to the firm.
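
    As a toy numerical illustration of the cost-driver idea described above (with entirely hypothetical activities, cost pools and driver volumes, not data from the article), overhead can be traced to products as follows:

```python
# Activity-based costing sketch: allocate overhead to products via cost drivers.
# All activities, rates and quantities below are hypothetical.

activity_cost_pools = {          # total overhead collected per activity
    "machine setup": 20000.0,
    "quality inspection": 8000.0,
    "purchase orders": 6000.0,
}
driver_totals = {                # total driver volume per activity
    "machine setup": 100,        # setups
    "quality inspection": 400,   # inspections
    "purchase orders": 300,      # orders
}
# Driver consumption per product
products = {
    "product A": {"machine setup": 60, "quality inspection": 150, "purchase orders": 100},
    "product B": {"machine setup": 40, "quality inspection": 250, "purchase orders": 200},
}

# Cost per unit of driver for each activity
driver_rates = {a: activity_cost_pools[a] / driver_totals[a] for a in activity_cost_pools}

for name, usage in products.items():
    overhead = sum(driver_rates[a] * qty for a, qty in usage.items())
    print(f"{name}: allocated overhead = {overhead:,.2f}")
```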

  12. Real-time hybrid simulation using the convolution integral method

    International Nuclear Information System (INIS)

    Kim, Sung Jig; Christenson, Richard E; Wojtkiewicz, Steven F; Johnson, Erik A

    2011-01-01

    This paper proposes a real-time hybrid simulation method that will allow complex systems to be tested within the hybrid test framework by employing the convolution integral (CI) method. The proposed CI method is potentially transformative for real-time hybrid simulation. The CI method can allow real-time hybrid simulation to be conducted regardless of the size and complexity of the numerical model and for numerical stability to be ensured in the presence of high frequency responses in the simulation. This paper presents the general theory behind the proposed CI method and provides experimental verification of the proposed method by comparing the CI method to the current integration time-stepping (ITS) method. Real-time hybrid simulation is conducted in the Advanced Hazard Mitigation Laboratory at the University of Connecticut. A seismically excited two-story shear frame building with a magneto-rheological (MR) fluid damper is selected as the test structure to experimentally validate the proposed method. The building structure is numerically modeled and simulated, while the MR damper is physically tested. Real-time hybrid simulation using the proposed CI method is shown to provide accurate results
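
    The convolution idea at the core of the CI method can be illustrated on a single-degree-of-freedom substitute: the response of a linear numerical substructure is the convolution of its unit-impulse response with the force history. The sketch below is a generic discrete Duhamel integral with arbitrary parameters, not the authors' real-time implementation.

```python
# Discrete convolution (Duhamel) integral for a damped SDOF oscillator:
# x(t) = integral of h(t - tau) * f(tau) dtau, with h the unit-impulse response.
# Parameters and force history are illustrative only.
import numpy as np

m, k, zeta = 1000.0, 4.0e5, 0.05           # mass [kg], stiffness [N/m], damping ratio
wn = np.sqrt(k / m)                         # natural frequency [rad/s]
wd = wn * np.sqrt(1.0 - zeta**2)            # damped natural frequency

dt = 0.001
t = np.arange(0.0, 5.0, dt)
f = 1.0e3 * np.sin(2.0 * np.pi * 1.5 * t)   # external force history [N]

# Unit-impulse response of the SDOF system
h = np.exp(-zeta * wn * t) * np.sin(wd * t) / (m * wd)

# Displacement response by discrete convolution (rectangle rule)
x = np.convolve(f, h)[: len(t)] * dt

print("peak displacement [m]:", np.max(np.abs(x)))
```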

  13. An approximation method for nonlinear integral equations of Hammerstein type

    International Nuclear Information System (INIS)

    Chidume, C.E.; Moore, C.

    1989-05-01

    The solution of a nonlinear integral equation of Hammerstein type in Hilbert spaces is approximated by means of a fixed point iteration method. Explicit error estimates are given and, in some cases, convergence is shown to be at least as fast as a geometric progression. (author). 25 refs
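
    To make the fixed-point iteration concrete, the following toy sketch (our own example, not the equation treated in the paper) solves a Hammerstein-type equation u(x) = f(x) + ∫₀¹ k(x,y) g(u(y)) dy by successive approximations on a quadrature grid; the kernel is scaled so that the iteration is a contraction and converges geometrically, in line with the error estimates mentioned above.

```python
# Picard (successive approximation) iteration for a Hammerstein integral equation
#   u(x) = f(x) + \int_0^1 k(x, y) g(u(y)) dy
# discretized with the trapezoidal rule. Toy kernel and nonlinearity, chosen so
# that the fixed-point map is a contraction.
import numpy as np

n = 201
x = np.linspace(0.0, 1.0, n)
w = np.full(n, 1.0 / (n - 1)); w[0] *= 0.5; w[-1] *= 0.5   # trapezoidal weights

f = np.sin(np.pi * x)                                # forcing term
K = 0.2 * np.exp(-np.abs(x[:, None] - x[None, :]))   # small kernel -> contraction
g = np.tanh                                          # Lipschitz nonlinearity

u = f.copy()
for it in range(200):
    u_new = f + K @ (w * g(u))                       # one Picard step
    if np.max(np.abs(u_new - u)) < 1e-12:
        break
    u = u_new

print(f"converged in {it + 1} iterations")
```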

  14. The philosophy and method of integrative humanism and religious ...

    African Journals Online (AJOL)

    This paper titled “Philosophy and Method of Integrative Humanism and Religious Crises in Nigeria: Picking the Essentials”, acknowledges the damaging effects of religious bigotry, fanaticism and creed differences on the social, political and economic development of the country. The need for the cessation of religious ...

  15. An Integrated Approach to Research Methods and Capstone

    Science.gov (United States)

    Postic, Robert; McCandless, Ray; Stewart, Beth

    2014-01-01

    In 1991, the AACU issued a report on improving undergraduate education suggesting, in part, that a curriculum should be both comprehensive and cohesive. Since 2008, we have systematically integrated our research methods course with our capstone course in an attempt to accomplish the twin goals of comprehensiveness and cohesion. By taking this…

  16. Confluent education: an integrative method for nursing (continuing) education.

    NARCIS (Netherlands)

    Francke, A.L.; Erkens, T.

    1994-01-01

    Confluent education is presented as a method to bridge the gap between cognitive and affective learning. Attention is focused on three main characteristics of confluent education: (a) the integration of four overlapping domains in a learning process (readiness, the cognitive domain, the affective

  17. On the solution of high order stable time integration methods

    Czech Academy of Sciences Publication Activity Database

    Axelsson, Owe; Blaheta, Radim; Sysala, Stanislav; Ahmad, B.

    2013-01-01

    Vol. 108, No. 1 (2013), pp. 1-22. ISSN 1687-2770. Institutional support: RVO:68145535. Keywords: evolution equations; preconditioners for quadratic matrix polynomials; a stiffly stable time integration method. Subject RIV: BA - General Mathematics. Impact factor: 0.836, year: 2013. http://www.boundaryvalueproblems.com/content/2013/1/108

  18. Integrating Expressive Methods in a Relational-Psychotherapy

    Directory of Open Access Journals (Sweden)

    Richard G. Erskine

    2011-06-01

    Therapeutic Involvement is an integral part of all effective psychotherapy. This article is written to illustrate the concept of Therapeutic Involvement in working within a therapeutic relationship – within the transference – and with active expressive and experiential methods to resolve traumatic experiences, relational disturbances and life-shaping decisions.

  19. Integrated Curriculum and Subject-based Curriculum: Achievement and Attitudes

    Science.gov (United States)

    Casady, Victoria

    The research conducted for this mixed-method study, qualitative and quantitative, analyzed the results of an academic year-long study to determine whether the use of an integrated fourth-grade curriculum would benefit student achievement in the areas of English language arts, social studies, and science more than a subject-based traditional curriculum. The research was conducted based on the international, national, and state test scores, which show a slowing or lack of growth. Through pre- and post-assessments, student questionnaires, and administrative interviews, the researcher analyzed the phenomenological experiences of the students to determine if the integrated curriculum was a beneficial restructuring of the curriculum. The research questions for this study focused on the achievement and attitudes of the students in the study and whether the curriculum they were taught impacted their achievement and attitudes over the course of one school year. The curricula for the study were organized to cover the current standards, where the integrated curriculum focused on connections between subject areas to help students make connections between what they are learning and the world beyond the classroom. The findings of this study indicated that utilizing the integrated curriculum could increase achievement as well as students' attitudes toward specific content areas. The ANOVA analysis for English language arts was not significant, although greater growth was recorded for the students in the integrated curriculum setting. The ANOVA for social studies (0.05) and the paired t-tests (0.001) for science both indicated significant positive differences. The qualitative analysis led to the discovery that the experiences of the students from the integrated curriculum setting were more positive. The evaluation of the data from this study led the researcher to determine that the integrated curriculum was a worthwhile endeavor to increase achievement and attitudes.

  20. Integrated Data Base Program: a status report

    International Nuclear Information System (INIS)

    Notz, K.J.; Klein, J.A.

    1984-06-01

    The Integrated Data Base (IDB) Program provides official Department of Energy (DOE) data on spent fuel and radioactive waste inventories, projections, and characteristics. The accomplishments of FY 1983 are summarized for three broad areas: (1) upgrading and issuing of the annual report on spent fuel and radioactive waste inventories, projections, and characteristics, including ORIGEN2 applications and a quality assurance plan; (2) creation of a summary data file in user-friendly format for use on a personal computer and enhancing user access to program data; and (3) optimizing and documentation of the data handling methodology used by the IDB Program and providing direct support to other DOE programs and sites in data handling. Plans for future work in these three areas are outlined. 23 references, 11 figures

  1. A new integral method for solving the point reactor neutron kinetics equations

    International Nuclear Information System (INIS)

    Li Haofeng; Chen Wenzhen; Luo Lei; Zhu Qian

    2009-01-01

    A numerical integral method that efficiently provides the solution of the point kinetics equations by using the better basis function (BBF) for the approximation of the neutron density in one-time-step integrations is described and investigated. The approach is based on an exact analytic integration of the neutron density equation, where the stiffness of the equations is overcome by the fully implicit formulation. The procedure is tested by using a variety of reactivity functions, including step reactivity insertion, ramp input and oscillatory reactivity changes. The solution of the better basis function method is compared to other analytical and numerical solutions of the point reactor kinetics equations. The results show that selecting a better basis function can improve the efficiency and accuracy of this integral method. The better basis function method can be used in real-time forecasting for power reactors in order to prevent reactivity accidents.
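
    For context, the equations being solved are the standard stiff point kinetics system for the neutron density and the delayed-neutron precursor concentrations. The sketch below integrates a one-delayed-group version with a step reactivity insertion using a generic stiff solver from SciPy; it is not the better-basis-function scheme of the paper, and the kinetic parameters are merely illustrative.

```python
# One-delayed-group point reactor kinetics with a step reactivity insertion,
# integrated with a stiff ODE solver (illustrative parameters, not the paper's
# BBF integral scheme).
import numpy as np
from scipy.integrate import solve_ivp

beta, lam, Lambda = 0.0065, 0.08, 1.0e-4   # delayed fraction, decay constant [1/s], generation time [s]
rho0 = 0.003                               # step reactivity insertion (sub-prompt-critical)

def point_kinetics(t, y):
    n, C = y
    rho = rho0 if t >= 0.0 else 0.0
    dn = (rho - beta) / Lambda * n + lam * C
    dC = beta / Lambda * n - lam * C
    return [dn, dC]

# Start from steady state: C0 = beta * n0 / (lam * Lambda)
n0 = 1.0
y0 = [n0, beta * n0 / (lam * Lambda)]
sol = solve_ivp(point_kinetics, (0.0, 10.0), y0, method="BDF", rtol=1e-8, atol=1e-10)

print("relative power at t = 10 s:", sol.y[0, -1] / n0)
```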

  2. Development of an Agent-Based Model (ABM) to Simulate the Immune System and Integration of a Regression Method to Estimate the Key ABM Parameters by Fitting the Experimental Data

    Science.gov (United States)

    Tong, Xuming; Chen, Jinghang; Miao, Hongyu; Li, Tingting; Zhang, Le

    2015-01-01

    Agent-based models (ABM) and differential equations (DE) are two commonly used methods for immune system simulation. However, it is difficult for ABM to estimate key parameters of the model by incorporating experimental data, whereas the differential equation model is incapable of describing the complicated immune system in detail. To overcome these problems, we developed an integrated ABM regression model (IABMR). It can combine the advantages of ABM and DE by employing ABM to mimic the multi-scale immune system with various phenotypes and types of cells as well as using the input and output of ABM to build up the Loess regression for key parameter estimation. Next, we employed the greedy algorithm to estimate the key parameters of the ABM with respect to the same experimental data set and used ABM to describe a 3D immune system similar to previous studies that employed the DE model. These results indicate that IABMR not only has the potential to simulate the immune system at various scales, phenotypes and cell types, but can also accurately infer the key parameters as a DE model does. Therefore, this study innovatively developed a complex system development mechanism that can simulate the complicated immune system in detail like an ABM and validate the reliability and efficiency of the model like a DE by fitting the experimental data. PMID:26535589
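
    The regression step of such a framework, relating an ABM input parameter to a simulated output, can be imitated with an off-the-shelf LOWESS smoother. The sketch below fits statsmodels' lowess to synthetic input-output pairs standing in for ABM runs; it only illustrates the surrogate-fitting idea, not the published IABMR model or its greedy parameter search.

```python
# Fit a LOWESS (locally weighted regression) surrogate to synthetic
# ABM input-output samples. The "ABM" here is a noisy toy function.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Pretend each (theta, response) pair is one agent-based simulation run
theta = rng.uniform(0.0, 1.0, 300)                                   # model parameter per run
response = np.exp(2.0 * theta) + rng.normal(0.0, 0.3, theta.size)    # simulated output

# LOWESS returns the smoothed response evaluated at the sorted inputs
fit = lowess(response, theta, frac=0.3, return_sorted=True)
theta_grid, response_hat = fit[:, 0], fit[:, 1]

# The surrogate can be used to estimate the parameter matching a target
# "experimental" output, here by nearest match on the smoothed curve.
target = 4.0
best_theta = theta_grid[np.argmin(np.abs(response_hat - target))]
print("estimated parameter:", best_theta)
```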

  3. High temperature spectral emissivity measurement using integral blackbody method

    Science.gov (United States)

    Pan, Yijie; Dong, Wei; Lin, Hong; Yuan, Zundong; Bloembergen, Pieter

    2016-10-01

    Spectral emissivity is a critical thermo-physical property of a material for heat design and radiation thermometry. A prototype instrument based upon an integral blackbody method was developed to measure a material's spectral emissivity above 1000 °C. The system was implemented with an optimized commercial variable-high-temperature blackbody, a high speed linear actuator, a linear pyrometer, and an in-house designed synchronization circuit. A sample was placed in a crucible at the bottom of the blackbody furnace, by which the sample and the tube formed a simulated blackbody with an effective total emissivity greater than 0.985. During the measurement, the sample was pushed to the end opening of the tube by a graphite rod which was actuated through a pneumatic cylinder. A linear pyrometer was used to monitor the brightness temperature of the sample surface throughout the measurement, and the corresponding opto-converted voltage signal was fed to and recorded by a digital multi-meter. A physical model was proposed to numerically evaluate the temperature drop during the process. The tube was discretized into several isothermal cylindrical rings, and the temperature profile of the tube was measured. View factors between the sample and the rings were calculated and updated along the whole pushing process. The actual surface temperature of the sample at the end opening was thereby obtained. Taking advantage of the measured voltage profile and the calculated true temperature, the spectral emissivity at this temperature was calculated.
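
    The final step, converting the measured signal into a spectral emissivity, amounts to taking the ratio of the sample's spectral radiance to the Planck blackbody radiance at the evaluated true temperature. The sketch below shows that ratio for a single wavelength with made-up numbers; it does not reproduce the pyrometer calibration or the view-factor model of the instrument.

```python
# Spectral emissivity as the ratio of measured sample radiance to the Planck
# blackbody radiance at the same wavelength and (corrected) true temperature.
# The measured radiance value below is hypothetical.
import math

h = 6.62607015e-34    # Planck constant [J s]
c = 2.99792458e8      # speed of light [m/s]
kB = 1.380649e-23     # Boltzmann constant [J/K]

def planck_radiance(wavelength_m, T):
    """Spectral radiance of a blackbody [W / (m^2 sr m)]."""
    return (2.0 * h * c**2 / wavelength_m**5) / math.expm1(h * c / (wavelength_m * kB * T))

wavelength = 0.9e-6          # pyrometer working wavelength [m]
T_true = 1400.0 + 273.15     # corrected sample surface temperature [K]
L_measured = 0.6 * planck_radiance(wavelength, T_true)   # stand-in for the pyrometer reading

emissivity = L_measured / planck_radiance(wavelength, T_true)
print(f"spectral emissivity at {wavelength * 1e6:.1f} um: {emissivity:.3f}")
```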

  4. Methods for assessing NPP containment pressure boundary integrity

    International Nuclear Information System (INIS)

    Naus, D.J.; Ellingwood, B.R.; Graves, H.L.

    2004-01-01

    Research is being conducted to address aging of the containment pressure boundary in light-water reactor plants. Objectives of this research are to (1) understand the significant factors relating to corrosion occurrence, efficacy of inspection, and structural capacity reduction of steel containments and of liners of concrete containments; (2) provide the U.S. Nuclear Regulatory Commission (USNRC) reviewers a means of establishing current structural capacity margins or estimating future residual structural capacity margins for steel containments and concrete containments as limited by liner integrity; and (3) provide recommendations, as appropriate, on information to be requested of licensees for guidance that could be utilized by USNRC reviewers in assessing the seriousness of reported incidences of containment degradation. Activities include development of a degradation assessment methodology; reviews of techniques and methods for inspection and repair of containment metallic pressure boundaries; evaluation of candidate techniques for inspection of inaccessible regions of containment metallic pressure boundaries; establishment of a methodology for reliability-based condition assessments of steel containments and liners; and fragility assessments of steel containments with localized corrosion

  5. Mixed methods in psychotherapy research: A review of method(ology) integration in psychotherapy science.

    Science.gov (United States)

    Bartholomew, Theodore T; Lockard, Allison J

    2018-06-13

    Mixed methods can foster depth and breadth in psychological research. However, its use remains in development in psychotherapy research. Our purpose was to review the use of mixed methods in psychotherapy research. Thirty-one studies were identified via the PRISMA systematic review method. Using Creswell & Plano Clark's typologies to identify design characteristics, we assessed each study for rigor and how each used mixed methods. Key features of mixed methods designs and these common patterns were identified: (a) integration of clients' perceptions via mixing; (b) understanding group psychotherapy; (c) integrating methods with cases and small samples; (d) analyzing clinical data as qualitative data; and (e) exploring cultural identities in psychotherapy through mixed methods. The review is discussed with respect to the value of integrating multiple data in single studies to enhance psychotherapy research. © 2018 Wiley Periodicals, Inc.

  6. 3-D computer graphics based on integral photography.

    Science.gov (United States)

    Naemura, T; Yoshida, T; Harashima, H

    2001-02-12

    Integral photography (IP), which is one of the ideal 3-D photographic technologies, can be regarded as a method of capturing and displaying light rays passing through a plane. The NHK Science and Technical Research Laboratories have developed a real-time IP system using an HDTV camera and an optical fiber array. In this paper, the authors propose a method of synthesizing arbitrary views from IP images captured by the HDTV camera. This is a kind of image-based rendering system, founded on the 4-D data space representation of light rays. Experimental results show the potential to improve the quality of images rendered by computer graphics techniques.

  7. Integral staggered point-matching method for millimeter-wave reflective diffraction gratings on electron cyclotron heating systems

    International Nuclear Information System (INIS)

    Xia, Donghui; Huang, Mei; Wang, Zhijiang; Zhang, Feng; Zhuang, Ge

    2016-01-01

    Highlights: • The integral staggered point-matching method for design of polarizers on the ECH systems is presented. • The availability of the integral staggered point-matching method is checked by numerical calculations. • Two polarizers are designed with the integral staggered point-matching method and the experimental results are given. - Abstract: The reflective diffraction gratings are widely used in the high power electron cyclotron heating systems for polarization strategy. This paper presents a method which we call “the integral staggered point-matching method” for design of reflective diffraction gratings. This method is based on the integral point-matching method. However, it effectively removes the convergence problems and tedious calculations of the integral point-matching method, making it easier to be used for a beginner. A code is developed based on this method. The calculation results of the integral staggered point-matching method are compared with the integral point-matching method, the coordinate transformation method and the low power measurement results. It indicates that the integral staggered point-matching method can be used as an optional method for the design of reflective diffraction gratings in electron cyclotron heating systems.

  8. Integral staggered point-matching method for millimeter-wave reflective diffraction gratings on electron cyclotron heating systems

    Energy Technology Data Exchange (ETDEWEB)

    Xia, Donghui [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, 430074 Wuhan (China); Huang, Mei [Southwestern Institute of Physics, 610041 Chengdu (China); Wang, Zhijiang, E-mail: wangzj@hust.edu.cn [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, 430074 Wuhan (China); Zhang, Feng [Southwestern Institute of Physics, 610041 Chengdu (China); Zhuang, Ge [State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Huazhong University of Science and Technology, 430074 Wuhan (China)

    2016-10-15

    Highlights: • The integral staggered point-matching method for design of polarizers on the ECH systems is presented. • The availability of the integral staggered point-matching method is checked by numerical calculations. • Two polarizers are designed with the integral staggered point-matching method and the experimental results are given. - Abstract: The reflective diffraction gratings are widely used in the high power electron cyclotron heating systems for polarization strategy. This paper presents a method which we call “the integral staggered point-matching method” for design of reflective diffraction gratings. This method is based on the integral point-matching method. However, it effectively removes the convergence problems and tedious calculations of the integral point-matching method, making it easier to be used for a beginner. A code is developed based on this method. The calculation results of the integral staggered point-matching method are compared with the integral point-matching method, the coordinate transformation method and the low power measurement results. It indicates that the integral staggered point-matching method can be used as an optional method for the design of reflective diffraction gratings in electron cyclotron heating systems.

  9. Integrated failure probability estimation based on structural integrity analysis and failure data: Natural gas pipeline case

    International Nuclear Information System (INIS)

    Dundulis, Gintautas; Žutautaitė, Inga; Janulionis, Remigijus; Ušpuras, Eugenijus; Rimkevičius, Sigitas; Eid, Mohamed

    2016-01-01

    In this paper, the authors present an approach as an overall framework for the estimation of the failure probability of pipelines based on: the results of the deterministic-probabilistic structural integrity analysis (taking into account loads, material properties, geometry, boundary conditions, crack size, and defected zone thickness), the corrosion rate, the number of defects and failure data (incorporated into the model via application of the Bayesian method). The proposed approach is applied to estimate the failure probability of a selected part of the Lithuanian natural gas transmission network. The presented approach for the estimation of the integrated failure probability is a combination of several different analyses allowing us to obtain the critical crack length and depth, the failure probability of the defected zone thickness, and the dependency of the failure probability on the age of the natural gas transmission pipeline. A model uncertainty analysis and an uncertainty propagation analysis are performed as well. - Highlights: • Degradation mechanisms of natural gas transmission pipelines. • Fracture mechanics analysis of the pipe with a crack. • Stress evaluation of the pipe with a critical crack. • Deterministic-probabilistic structural integrity analysis of a gas pipeline. • Integrated estimation of pipeline failure probability by the Bayesian method.

  10. Dhage Iteration Method for Generalized Quadratic Functional Integral Equations

    Directory of Open Access Journals (Sweden)

    Bapurao C. Dhage

    2015-01-01

    In this paper we prove the existence as well as approximations of the solutions for a certain nonlinear generalized quadratic functional integral equation. An algorithm for the solutions is developed, and it is shown that the sequence of successive approximations starting at a lower or upper solution converges monotonically to the solutions of the related quadratic functional integral equation under some suitable mixed hybrid conditions. Our main result relies on the Dhage iteration method embodied in a recent hybrid fixed point theorem of Dhage (2014) in partially ordered normed linear spaces. An example is also provided to illustrate the abstract theory developed in the paper.

  11. Entropic sampling in the path integral Monte Carlo method

    International Nuclear Information System (INIS)

    Vorontsov-Velyaminov, P N; Lyubartsev, A P

    2003-01-01

    We have extended the entropic sampling Monte Carlo method to the case of path integral representation of a quantum system. A two-dimensional density of states is introduced into path integral form of the quantum canonical partition function. Entropic sampling technique within the algorithm suggested recently by Wang and Landau (Wang F and Landau D P 2001 Phys. Rev. Lett. 86 2050) is then applied to calculate the corresponding entropy distribution. A three-dimensional quantum oscillator is considered as an example. Canonical distributions for a wide range of temperatures are obtained in a single simulation run, and exact data for the energy are reproduced
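
    The Wang-Landau flat-histogram update at the heart of the entropic sampling step can be demonstrated on a toy discrete system. The sketch below estimates the density of states of N independent two-state spins whose "energy" is the number of up spins, for which the exact answer is a binomial coefficient; it illustrates only the algorithm of Wang and Landau, not its path-integral extension described above.

```python
# Wang-Landau estimation of the density of states g(E) for a toy system of
# N two-state spins whose energy E is the number of up spins.
# Exact result: g(E) = binomial(N, E).
import math
import numpy as np

rng = np.random.default_rng(1)
N = 20
spins = rng.integers(0, 2, N)
E = int(spins.sum())

log_g = np.zeros(N + 1)     # running estimate of ln g(E)
hist = np.zeros(N + 1)      # visit histogram for the flatness check
ln_f = 1.0                  # modification factor, reduced until small

while ln_f > 1e-6:
    for _ in range(20000):
        i = rng.integers(N)
        E_new = E + (1 - 2 * spins[i])          # a flip changes E by +/- 1
        # Accept with probability min(1, g(E)/g(E_new))
        if np.log(rng.random()) < log_g[E] - log_g[E_new]:
            spins[i] ^= 1
            E = E_new
        log_g[E] += ln_f                        # update the visited level
        hist[E] += 1
    if hist.min() > 0.8 * hist.mean():          # flat-histogram criterion
        ln_f *= 0.5                             # f -> sqrt(f)
        hist[:] = 0

# Compare with the exact ln binomial coefficients (up to an additive constant)
log_g -= log_g[0]
exact = np.array([math.lgamma(N + 1) - math.lgamma(k + 1) - math.lgamma(N - k + 1)
                  for k in range(N + 1)])
print("max abs error in ln g(E):", np.max(np.abs(log_g - exact)))
```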

  12. High sensitive quench detection method using an integrated test wire

    International Nuclear Information System (INIS)

    Fevrier, A.; Tavergnier, J.P.; Nithart, H.; Kiblaire, M.; Duchateau, J.L.

    1981-01-01

    A highly sensitive quench detection method which works even in the presence of an external perturbing magnetic field is reported. The quench signal is obtained from the difference in voltages at the superconducting winding terminals and at the terminals of a secondary winding strongly coupled to the primary. The secondary winding could consist of a "zero-current strand" of the superconducting cable not connected to one of the winding terminals, or of an integrated normal test wire inside the superconducting cable. Experimental results on quench detection obtained by this method are described. It is shown that the integrated test wire method leads to efficient and sensitive quench detection, especially in the presence of an external perturbing magnetic field.

  13. 13th International Conference on Integral Methods in Science and Engineering

    CERN Document Server

    Kirsch, Andreas

    2015-01-01

    This contributed volume contains a collection of articles on state-of-the-art developments on the construction of theoretical integral techniques and their application to specific problems in science and engineering.  Written by internationally recognized researchers, the chapters in this book are based on talks given at the Thirteenth International Conference on Integral Methods in Science and Engineering, held July 21–25, 2014, in Karlsruhe, Germany.   A broad range of topics is addressed, from problems of existence and uniqueness for singular integral equations on domain boundaries to numerical integration via finite and boundary elements, conservation laws, hybrid methods, and other quadrature-related approaches.   This collection will be of interest to researchers in applied mathematics, physics, and mechanical and electrical engineering, as well as graduate students in these disciplines and other professionals for whom integration is an essential tool.

  14. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be of high resolution and process based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  15. Integrated data base for spent fuel and radwaste: inventories

    International Nuclear Information System (INIS)

    Notz, K.J.; Carter, W.L.; Kibbey, A.H.

    1982-01-01

    The Integrated Data Base (IDB) program provides and maintains current, integrated data on spent reactor fuel and radwaste, including historical data, current inventories, projected inventories, and material characteristics. The IDB program collects, organizes, integrates, and - where necessary - reconciles inventory and projection (I/P) and characteristics information to provide a coherent, self-consistent data base on spent fuel and radwaste

  16. Integrated Arts-Based Teaching (IAT) Model for Brain-Based Learning

    Science.gov (United States)

    Inocian, Reynaldo B.

    2015-01-01

    This study analyzes teaching strategies among the eight books in Principles and Methods of Teaching recommended for use in the College of Teacher Education in the Philippines. It seeks to answer the following objectives: (1) identify the most commonly used teaching strategies congruent with the integrated arts-based teaching (IAT) and (2) design…

  17. PROSPECTS OF THE REGIONAL INTEGRATION POLICY BASED ON CLUSTER FORMATION

    Directory of Open Access Journals (Sweden)

    Elena Tsepilova

    2018-01-01

    The purpose of this article is to develop the theoretical foundations of regional integration policy and to determine its prospects on the basis of cluster formation. The authors use research methods such as systematization, comparative and complex analysis, synthesis, and the statistical method. Within the framework of the research, the concept of regional integration policy is specified, and its integration core – the cluster – is identified. The authors work out an algorithm of regional clustering, which will ensure the growth of the economy and tax income. Measures have been proposed to optimize the organizational mechanism of interaction between the participants of the territorial cluster and the authorities, which make it possible to ensure the effective functioning of clusters, including taxation clusters. Based on the results of studying the existing methods for assessing the effectiveness of cluster policy, the authors propose their own approach to evaluating the consequences of implementing the regional integration policy, according to which a list of quantitative and qualitative indicators is defined. The present article systematizes the experience and results of the cluster policy of certain European countries, which made it possible to determine the prospects and the synergetic effect of the development of clusters as an integration foundation of regional policy in the Russian Federation. The authors carry out an analysis of the activity of cluster formations using the example of the Rostov region – a leader in the formation of conditions for cluster policy development in the Southern Federal District. Eleven clusters and cluster initiatives are developing in this region. As a result, the authors propose measures for the support of the already existing clusters and the creation of new ones.

  18. Characterization methods of integrated optics for mid-infrared interferometry

    Science.gov (United States)

    Labadie, Lucas; Kern, Pierre Y.; Schanen-Duport, Isabelle; Broquin, Jean-Emmanuel

    2004-10-01

    This article deals with one of the important instrumentation challenges of the stellar interferometry mission IRSI-Darwin of the European Space Agency: the necessity to have a reliable and high-performance system for beam combination has highlighted the advantages of an integrated optics solution, which is already in use for ground-based interferometry in the near infrared. Integrated optics also provides interesting features in terms of filtering, which is a main issue for the deep null to be reached by Darwin. However, Darwin will operate in the mid-infrared range from 4 microns to 20 microns, where no integrated optics functions are available off-the-shelf. This requires extending the integrated optics concept and the underlying technology to this spectral range. This work has started with the IODA project (Integrated Optics for Darwin) under ESA contract and aims to provide a first component for interferometry. In this paper we present the guidelines of the characterization work implemented to test and validate the performance of a component at each step of the development phase. We also present an example of a characterization experiment used within the frame of this work, its theoretical approach and some results.

  19. An Integrated Method of Supply Chains Vulnerability Assessment

    Directory of Open Access Journals (Sweden)

    Jiaguo Liu

    2016-01-01

    Supply chain vulnerability identification and evaluation are extremely important for mitigating supply chain risk. We present an integrated method to assess supply chain vulnerability. The potential failure modes of supply chain vulnerability are analyzed through the SCOR model. Combining fuzzy theory and gray theory, the correlation degree of each vulnerability indicator can be calculated and targeted improvements can be carried out. In order to verify the effectiveness of the proposed method, we use Kendall's tau coefficient to measure the agreement between the different methods. The result shows that the presented method has the highest consistency in the assessment compared with the other two methods.
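
    The consistency check mentioned above, comparing how different assessment methods rank the same vulnerability indicators, can be reproduced with Kendall's tau. The sketch below uses SciPy on made-up indicator scores; the numbers and method names are illustrative, not the paper's data.

```python
# Compare the agreement between vulnerability rankings produced by different
# assessment methods using Kendall's tau (hypothetical scores).
from scipy.stats import kendalltau

# Scores assigned by three methods to the same six vulnerability indicators
reference     = [0.82, 0.55, 0.91, 0.40, 0.67, 0.73]   # e.g. an expert benchmark
method_fuzzy  = [0.79, 0.58, 0.88, 0.35, 0.70, 0.69]
method_simple = [0.60, 0.62, 0.85, 0.52, 0.48, 0.75]

for name, scores in [("fuzzy-gray", method_fuzzy), ("simple weighting", method_simple)]:
    tau, p_value = kendalltau(reference, scores)
    print(f"{name}: tau = {tau:.3f}, p = {p_value:.3f}")
```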

  20. Field Method for Integrating the First Order Differential Equation

    Institute of Scientific and Technical Information of China (English)

    JIA Li-qun; ZHENG Shi-wang; ZHANG Yao-yu

    2007-01-01

    An important modern method in analytical mechanics for finding integrals, called the field method, is used to study the solution of a first-order differential equation. First, by introducing an intermediate variable, a more complicated first-order differential equation can be expressed by two simple first-order differential equations; the field method in analytical mechanics is then introduced for solving these two equations. The conclusion shows that the field method in analytical mechanics can be fully used to find the solutions of a first-order differential equation, and thus a new method for finding the solutions of first-order differential equations is provided.

  1. Comparison of Students' Satisfaction with Traditional and Integrated Teaching Methods in a Physiology Course

    Directory of Open Access Journals (Sweden)

    Keshavarzi Z.

    2016-02-01

    Aims: Different education methods play crucial roles in improving education quality and students' satisfaction. In recent years, medical education has changed considerably through new education methods. The aim of this study was to compare medical students' satisfaction with traditional and integrated methods of teaching the physiology course. Instrument and Methods: In this descriptive analytical study, fifty 4th-semester medical students of Bojnourd University of Medical Sciences were studied in 2015. The subjects were randomly selected based on availability. Data were collected by two researcher-made questionnaires; their validity and reliability were confirmed. Questionnaire 1 was completed by the students after presenting renal and endocrinology topics via traditional and integrated methods. Questionnaire 2 was only completed by the students after presenting the course via the integrated method. Data were analyzed by SPSS 16 software using the dependent t-test. Findings: The mean score of the students' satisfaction with the traditional method (24.80±3.48) was higher than with the integrated method (22.30±4.03; p<0.0001). In the integrated method, most of the students agreed or completely agreed with telling stories from daily life (76%), the sitting mode in the classroom (48%), the attribution of cell roles to the students (60%), showing movies and animations (76%), using models (84%), and using real animal parts (72%) during teaching, as well as expressing clinical items to enhance learning motivation (76%). Conclusion: The favorable satisfaction of the students with the traditional lecture method for understanding the issues, as well as their acceptance of new and active methods of learning, shows the effectiveness and efficiency of the traditional method and the requirement for its enhancement by integrated methods.

  2. Computing thermal Wigner densities with the phase integration method

    International Nuclear Information System (INIS)

    Beutier, J.; Borgis, D.; Vuilleumier, R.; Bonella, S.

    2014-01-01

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems

  3. Computing thermal Wigner densities with the phase integration method.

    Science.gov (United States)

    Beutier, J; Borgis, D; Vuilleumier, R; Bonella, S

    2014-08-28

    We discuss how the Phase Integration Method (PIM), recently developed to compute symmetrized time correlation functions [M. Monteferrante, S. Bonella, and G. Ciccotti, Mol. Phys. 109, 3015 (2011)], can be adapted to sampling/generating the thermal Wigner density, a key ingredient, for example, in many approximate schemes for simulating quantum time dependent properties. PIM combines a path integral representation of the density with a cumulant expansion to represent the Wigner function in a form calculable via existing Monte Carlo algorithms for sampling noisy probability densities. The method is able to capture highly non-classical effects such as correlation among the momenta and coordinates parts of the density, or correlations among the momenta themselves. By using alternatives to cumulants, it can also indicate the presence of negative parts of the Wigner density. Both properties are demonstrated by comparing PIM results to those of reference quantum calculations on a set of model problems.
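
    For reference, the benchmark against which such sampling schemes are commonly checked is the closed-form thermal Wigner density of a one-dimensional harmonic oscillator, which in standard notation (not quoted from the paper) reads

```latex
W_\beta(q,p) \;=\; \frac{\tanh(\beta\hbar\omega/2)}{\pi\hbar}\,
\exp\!\left[-\frac{2\tanh(\beta\hbar\omega/2)}{\hbar\omega}
\left(\frac{p^{2}}{2m}+\frac{1}{2}\,m\omega^{2}q^{2}\right)\right],
```

    which is correctly normalized over phase space and reduces to the classical Boltzmann distribution in the high-temperature limit.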

  4. Methods for Developing Emissions Scenarios for Integrated Assessment Models

    Energy Technology Data Exchange (ETDEWEB)

    Prinn, Ronald [MIT; Webster, Mort [MIT

    2007-08-20

    The overall objective of this research was to contribute data and methods to support the future development of new emissions scenarios for integrated assessment of climate change. Specifically, this research had two main objectives: 1. Use historical data on economic growth and energy efficiency changes, and develop probability density functions (PDFs) for the appropriate parameters for two or three commonly used integrated assessment models. 2. Using the parameter distributions developed through the first task and previous work, we will develop methods of designing multi-gas emission scenarios that usefully span the joint uncertainty space in a small number of scenarios. Results on the autonomous energy efficiency improvement (AEEI) parameter are summarized, an uncertainty analysis of elasticities of substitution is described, and the probabilistic emissions scenario approach is presented.

  5. Methods in Entrepreneurship Education Research: A Review and Integrative Framework

    DEFF Research Database (Denmark)

    Blenker, Per; Trolle Elmholdt, Stine; Frederiksen, Signe Hedeboe

    2014-01-01

    is fragmented both conceptually and methodologically. Findings suggest that the methods applied in entrepreneurship education research cluster in two groups: 1. quantitative studies of the extent and effect of entrepreneurship education, and 2. qualitative single case studies of different courses and programmes....... It integrates qualitative and quantitative techniques, the use of research teams consisting of insiders (teachers studying their own teaching) and outsiders (research collaborators studying the education) as well as multiple types of data. To gain both in-depth and analytically generalizable studies...... a variety of helpful methods, explore the potential relation between insiders and outsiders in the research process, and discuss how different types of data can be combined. The integrated framework urges researchers to extend investments in methodological efforts and to enhance the in-depth understanding...

  6. Measurement of integrated healthcare delivery: a systematic review of methods and future research directions

    Directory of Open Access Journals (Sweden)

    Martin Strandberg-Larsen

    2009-02-01

    Background: Integrated healthcare delivery is a policy goal of healthcare systems. There is no consensus on how to measure the concept, which makes it difficult to monitor progress. Purpose: To identify the different types of methods used to measure integrated healthcare delivery, with emphasis on structural, cultural and process aspects. Methods: Medline/Pubmed, EMBASE, Web of Science, Cochrane Library, WHOLIS, and conventional internet search engines were systematically searched for methods to measure integrated healthcare delivery (published – April 2008). Results: Twenty-four published scientific papers and documents met the inclusion criteria. In the 24 references we identified 24 different measurement methods; however, 5 methods shared a theoretical framework. The methods can be categorized according to type of data source: (a) questionnaire survey data, (b) automated register data, or (c) mixed data sources. The variety of concepts measured reflects the significant conceptual diversity within the field, and most methods lack information regarding validity and reliability. Conclusion: Several methods have been developed to measure integrated healthcare delivery; 24 methods are available and some are highly developed. The objective governs the method best used. Criteria for sound measures are suggested, and further developments should be based on an explicit conceptual framework and focus on simplifying and validating existing methods.

  7. INTEGRATED APPLICATION OF OPTICAL DIAGNOSTIC METHODS IN ULCERATIVE COLITIS

    Directory of Open Access Journals (Sweden)

    E. V. Velikanov

    2013-01-01

    Our results suggest that the combined use of optical coherence tomography (OCT) and fluorescence diagnosis helps to refine the nature and boundaries of the pathological process in the tissue of the colon in ulcerative colitis. Studies have shown that integrated optical diagnostics allows us to differentiate lesions in accordance with histology and to decide on the need for a biopsy and its site. This method is most appropriate in cases that are difficult to diagnose.

  8. Digital Integration Method (DIM): A new method for the precise correlation of OCT and fluorescein angiography

    International Nuclear Information System (INIS)

    Hassenstein, A.; Richard, G.; Inhoffen, W.; Scholz, F.

    2007-01-01

    The new digital integration method (DIM) provides for the first time the anatomically precise integration of the OCT scan position into the angiogram (fluorescein angiography, FLA), using reference markers at corresponding vessel crossings. Therefore, an exact correlation of angiographic and morphological pathological findings is possible and leads to a better understanding of OCT and FLA. Patients with occult findings in FLA were the group that profited most. For occult leakages, DIM could provide additional information, such as serous detachment of the retinal pigment epithelium (RPE) in a topography. So far it was unclear whether the same localization in the lesion was examined by FLA and OCT, especially when different staff were performing and interpreting the examinations. Using DIM, this problem could be solved with objective markers. This technique is a requirement for follow-up examinations by OCT. Using DIM for an objective, reliable and precise correlation of OCT and FLA findings, it is now possible to provide the identical scan position at follow-up. Therefore, for follow-up in clinical studies it is mandatory to use DIM to improve the evidence-based statement of OCT and the quality of the study. (author) [de]
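
    The registration step, mapping OCT scan coordinates into the angiogram through reference markers at corresponding vessel crossings, can be approximated by a least-squares similarity transform (scale, rotation, translation) estimated from the marker pairs. The sketch below implements that standard estimation on hypothetical marker coordinates; it is not the authors' DIM software.

```python
# Least-squares similarity transform (scale, rotation, translation) between two
# 2-D point sets, e.g. vessel-crossing markers in the OCT frame and the FLA frame.
# All marker coordinates below are hypothetical.
import numpy as np

oct_pts = np.array([[120.0,  80.0], [310.0,  95.0], [200.0, 240.0], [ 90.0, 300.0]])
fla_pts = np.array([[233.0, 151.0], [421.0, 175.0], [305.0, 317.0], [195.0, 372.0]])

def fit_similarity(src, dst):
    """Return scale s, rotation R (2x2), translation t such that dst ~= s * R @ src + t."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d
    H = src_c.T @ dst_c                       # cross-covariance of centred point sets
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))    # guard against reflections
    D = np.diag([1.0, d])
    R = Vt.T @ D @ U.T
    s = np.trace(D @ np.diag(S)) / np.sum(src_c ** 2)
    t = mu_d - s * R @ mu_s
    return s, R, t

s, R, t = fit_similarity(oct_pts, fla_pts)
mapped = (s * (R @ oct_pts.T)).T + t          # OCT markers mapped into the FLA frame
print("RMS registration error [px]:",
      np.sqrt(np.mean(np.sum((mapped - fla_pts) ** 2, axis=1))))
```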

  9. On a numerical method for solving the Faddeev integral equation without deformation of contour

    International Nuclear Information System (INIS)

    Belyaev, V.O.; Moller, K.

    1976-01-01

    A numerical method is proposed for solving the Faddeev equation for separable potentials at positive total energy. The method is based on the fact that, after applying a simple interpolation procedure, the logarithmic singularities in the kernel of the integral equation can be extracted in the same way as the pole singularity is usually extracted. The method has been applied to calculate the eigenvalues of the Faddeev kernel.

  10. Evaluation of the filtered leapfrog-trapezoidal time integration method

    International Nuclear Information System (INIS)

    Roache, P.J.; Dietrich, D.E.

    1988-01-01

    An analysis and evaluation are presented for a new method of time integration for fluid dynamics proposed by Dietrich. The method, called the filtered leapfrog-trapezoidal (FLT) scheme, is analyzed for the one-dimensional constant-coefficient advection equation and is shown to have some advantages for quasi-steady flows. A modification (FLTW) using a weighted combination of FLT and leapfrog is developed which retains the advantages for steady flows, increases accuracy for time-dependent flows, and involves little coding effort. Merits and applicability are discussed.
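
    The general structure of a filtered leapfrog step for the one-dimensional constant-coefficient advection equation used in the analysis can be sketched as follows. This is a plain leapfrog scheme with a Robert-Asselin time filter, written for illustration only; the FLT and FLTW schemes evaluated in the paper combine the leapfrog and trapezoidal steps differently.

```python
# Leapfrog advection of a periodic profile with a Robert-Asselin time filter.
# Illustrative only: this is not the filtered leapfrog-trapezoidal (FLT) scheme itself.
import numpy as np

nx, c = 200, 1.0
dx = 1.0 / nx
dt = 0.4 * dx / c                 # Courant number 0.4 (stable for leapfrog)
gamma = 0.06                      # Robert-Asselin filter coefficient

x = np.arange(nx) * dx
u_old = np.sin(2.0 * np.pi * x)   # u at time level n-1
# One forward-Euler starter step to obtain level n
u = u_old - c * dt / dx * 0.5 * (np.roll(u_old, -1) - np.roll(u_old, 1))

for _ in range(2000):
    # Leapfrog step: u^{n+1} = u^{n-1} - (c*dt/dx) * (u^n_{j+1} - u^n_{j-1})
    u_new = u_old - c * dt / dx * (np.roll(u, -1) - np.roll(u, 1))
    # Robert-Asselin filter damps the spurious computational (2*dt) mode
    u_filtered = u + gamma * (u_new - 2.0 * u + u_old)
    u_old, u = u_filtered, u_new

print("max |u| after 2000 steps:", np.abs(u).max())
```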

  11. Investigation of Optimal Integrated Circuit Raster Image Vectorization Method

    Directory of Open Access Journals (Sweden)

    Leonas Jasevičius

    2011-03-01

    Visual analysis of an integrated circuit layer requires a raster image vectorization stage to extract layer topology data for CAD tools. In this paper, vectorization problems of raster IC layer images are presented. Various algorithms for extracting lines from raster images and their properties are discussed. An optimal raster image vectorization method was developed which allows utilization of common vectorization algorithms to achieve the best possible match between the extracted vector data and perfect manual vectorization results. To develop the optimal method, the dependence of vectorized data quality on the initial raster image skeleton filter selection was assessed. (Article in Lithuanian)

  12. Integral equation models for image restoration: high accuracy methods and fast algorithms

    International Nuclear Information System (INIS)

    Lu, Yao; Shen, Lixin; Xu, Yuesheng

    2010-01-01

    Discrete models are consistently used as practical models for image restoration. They are piecewise constant approximations of true physical (continuous) models, and hence, inevitably impose bottleneck model errors. We propose to work directly with continuous models for image restoration aiming at suppressing the model errors caused by the discrete models. A systematic study is conducted in this paper for the continuous out-of-focus image models which can be formulated as an integral equation of the first kind. The resulting integral equation is regularized by the Lavrentiev method and the Tikhonov method. We develop fast multiscale algorithms having high accuracy to solve the regularized integral equations of the second kind. Numerical experiments show that the methods based on the continuous model perform much better than those based on discrete models, in terms of PSNR values and visual quality of the reconstructed images
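
    The regularization step can be illustrated on a one-dimensional first-kind integral equation with a Gaussian blurring kernel: discretize the integral operator and solve the Tikhonov-regularized normal equations. The sketch below is a generic discrete illustration, not the continuous-model multiscale solver developed in the paper.

```python
# Tikhonov regularization of a discretized first-kind integral equation
#   (A u)(x) = \int k(x - y) u(y) dy = b(x),  with a Gaussian blur kernel k.
import numpy as np

n = 200
x = np.linspace(0.0, 1.0, n)
h = x[1] - x[0]

# Discretized blur operator (midpoint rule) and a piecewise-constant "true" signal
sigma = 0.03
A = h * np.exp(-((x[:, None] - x[None, :]) ** 2) / (2.0 * sigma**2)) / (sigma * np.sqrt(2.0 * np.pi))
u_true = ((x > 0.3) & (x < 0.5)).astype(float) + 0.5 * ((x > 0.65) & (x < 0.8))

rng = np.random.default_rng(0)
b = A @ u_true + 1e-3 * rng.standard_normal(n)      # blurred, noisy data

# Tikhonov: minimize ||A u - b||^2 + alpha ||u||^2  ->  (A^T A + alpha I) u = A^T b
alpha = 1e-4
u_rec = np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

print("relative reconstruction error:",
      np.linalg.norm(u_rec - u_true) / np.linalg.norm(u_true))
```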

  13. Applying Groebner bases to solve reduction problems for Feynman integrals

    International Nuclear Information System (INIS)

    Smirnov, Alexander V.; Smirnov, Vladimir A.

    2006-01-01

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential
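
    The basic Groebner-basis reduction underlying this approach, rewriting a given element as a combination of basis elements plus a remainder, can be tried out with a computer algebra system. The sketch below uses SymPy on an ordinary commutative polynomial example; the shift-operator algebras actually needed for Feynman integral reduction require the generalized Buchberger algorithm described above.

```python
# Groebner basis computation and polynomial reduction with SymPy
# (commutative toy example; not the shift-operator setting of the paper).
from sympy import symbols, groebner, reduced

x, y = symbols("x y")
polys = [x**2 + y**2 - 1, x*y - 2]

G = groebner(polys, x, y, order="lex")
print("Groebner basis:", list(G))

# Reduce another polynomial modulo the basis: p = sum(q_i * g_i) + remainder
p = x**3 * y + y**2
quotients, remainder = reduced(p, list(G), x, y, order="lex")
print("remainder:", remainder)
```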

  14. Applying Groebner bases to solve reduction problems for Feynman integrals

    Energy Technology Data Exchange (ETDEWEB)

    Smirnov, Alexander V. [Mechanical and Mathematical Department and Scientific Research Computer Center of Moscow State University, Moscow 119992 (Russian Federation); Smirnov, Vladimir A. [Nuclear Physics Institute of Moscow State University, Moscow 119992 (Russian Federation)

    2006-01-15

    We describe how Groebner bases can be used to solve the reduction problem for Feynman integrals, i.e. to construct an algorithm that provides the possibility to express a Feynman integral of a given family as a linear combination of some master integrals. Our approach is based on a generalized Buchberger algorithm for constructing Groebner-type bases associated with polynomials of shift operators. We illustrate it through various examples of reduction problems for families of one- and two-loop Feynman integrals. We also solve the reduction problem for a family of integrals contributing to the three-loop static quark potential.

  15. Performance Criteria of Spatial Development Projects Based on Interregional Integration

    Directory of Open Access Journals (Sweden)

    Elena Viktorovna Kurushina

    2018-03-01

    The search for efficient ways to develop regional socio-economic space is a relevant problem. The authors consider the models of spatial organization according to the Spatial Development Strategy of the Russian Federation until 2030 and conduct a comparative analysis of scenarios for polarized and diversified spatial growth. Many investigations consider the concepts of polarized and endogenous growth. This study proposes a methodology to assess the development of macroregions and to increase the viability of interregional integration projects. To develop this methodology, we formulate scientific principles and indirect criteria of project performance conforming to the theory of regional integration. In addition to territorial community and the complementarity of development potentials, regional integration in the country should be based on the principles of security, networking, a limited number of potential project participants, and their awareness. Integration should ensure synergetic effects and take into account the cultural and historical closeness that manifests itself in the common mentality and existing economic relations among regions. The calculation results regarding the indirect criteria are obtained using the methods of classification and spatial correlation. This study confirms the hypothesis that the formation of the Western Siberian and Ural macro-regions is appropriate. We have concluded this on the basis of the criteria of economic development, economic integration, the similarity of regional spaces as habitats, and the number of participants for the subjects of the Ural Federal District. The projection of the patterns of international economic integration to the interregional level allows predicting the highest probability of successful cooperation among the Western Siberian regions with a high level of economic development. The authors' method has revealed a high synchronization between the economies of

  16. Comparison of Three Different Methods for Pile Integrity Testing on a Cylindrical Homogeneous Polyamide Specimen

    Science.gov (United States)

    Lugovtsova, Y. D.; Soldatov, A. I.

    2016-01-01

    Three different methods for pile integrity testing are compared on a cylindrical homogeneous polyamide specimen. The methods are low strain pile integrity testing, multichannel pile integrity testing, and testing with a shaker system. Since low strain pile integrity testing is a well-established and standardized method, its results are used as a reference for the other two methods.

  17. Method to integrate clinical guidelines into the electronic health record (EHR) by applying the archetypes approach.

    Science.gov (United States)

    Garcia, Diego; Moro, Claudia Maria Cabral; Cicogna, Paulo Eduardo; Carvalho, Deborah Ribeiro

    2013-01-01

    Clinical guidelines are documents that assist healthcare professionals, facilitating and standardizing diagnosis, management, and treatment in specific areas. Computerized guidelines as decision support systems (DSS) attempt to increase the performance of tasks and facilitate the use of guidelines. Most DSS are not integrated into the electronic health record (EHR), requiring some degree of rework, especially related to data collection. This study's objective was to present a method for integrating clinical guidelines into the EHR. The study first developed a way to identify the data and rules contained in the guidelines, and then incorporated the rules into an archetype-based EHR. The proposed method was tested on anemia treatment in the Chronic Kidney Disease Guideline. The phases of the method are: data and rules identification; archetype elaboration; rule definition and inclusion in an inference engine; and DSS-EHR integration and validation. The main feature of the proposed method is that it is generic and can be applied to any type of guideline.

  18. Integrated optical isolators based on two-mode interference couplers

    International Nuclear Information System (INIS)

    Sun, Yiling; Zhou, Haifeng; Jiang, Xiaoqing; Hao, Yinlei; Yang, Jianyi; Wang, Minghua

    2010-01-01

    This paper presents an optical waveguide isolator based on two-mode interference (TMI) couplers, utilizing the magneto-optical nonreciprocal phase shift (NPS). The operating principle of this device is to exploit the difference between the nonreciprocal phase shifts of the two lowest-order modes. A two-dimensional (2D) semi-vectorial finite difference method is used to calculate this difference and to optimize the parameters. The proposed device may play an important role in integrated optical devices and optical communication systems.

  19. A simple reliability block diagram method for safety integrity verification

    International Nuclear Information System (INIS)

    Guo Haitao; Yang Xianhui

    2007-01-01

    IEC 61508 requires safety integrity verification for safety related systems to be a necessary procedure in the safety life cycle. PFD_avg must be calculated to verify the safety integrity level (SIL). Since IEC 61508-6 does not give detailed explanations of the definitions and PFD_avg calculations for its examples, it is difficult for common reliability or safety engineers to understand when they use the standard as guidance in practice. A method using reliability block diagrams is investigated in this study in order to provide a clear and feasible way of calculating PFD_avg and to help those who take IEC 61508-6 as their guidance. The method first finds the mean down times (MDTs) of both the channel and the voted group, and then PFD_avg. The calculated results of various voted groups are compared with those in IEC 61508 part 6 and Ref. [Zhang T, Long W, Sato Y. Availability of systems with self-diagnostic components-applying Markov model to IEC 61508-6. Reliab Eng System Saf 2003;80(2):133-41]. An interesting outcome can be realized from the comparison. Furthermore, although differences in the MDT of voted groups exist between IEC 61508-6 and this paper, the PFD_avg values of the voted groups are comparatively close. With the detailed description given, the RBD method presented can be applied to quantitative SIL verification, showing a similarity to the method in IEC 61508-6.
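
    The record above describes computing PFD_avg for channels and voted groups. As a minimal illustration of what such a calculation looks like, the sketch below uses the standard textbook rare-event approximations for 1oo1, 1oo2 and 2oo3 architectures, ignoring common-cause and diagnostic-coverage terms; the failure rate and proof-test interval are illustrative values, not figures from this paper or from IEC 61508-6.

```python
# Minimal sketch: average probability of failure on demand (PFDavg) for simple
# voted groups, using textbook rare-event approximations (no common-cause or
# diagnostic-coverage terms). Numbers are illustrative only.

def pfd_1oo1(lambda_du, t_proof):
    """Single channel: PFDavg ~ lambda_DU * T / 2."""
    return lambda_du * t_proof / 2.0

def pfd_1oo2(lambda_du, t_proof):
    """1-out-of-2 voted group: PFDavg ~ (lambda_DU * T)**2 / 3."""
    return (lambda_du * t_proof) ** 2 / 3.0

def pfd_2oo3(lambda_du, t_proof):
    """2-out-of-3 voted group: PFDavg ~ (lambda_DU * T)**2."""
    return (lambda_du * t_proof) ** 2

if __name__ == "__main__":
    lam = 2.0e-6     # dangerous undetected failure rate [1/h] (illustrative)
    t1 = 8760.0      # proof test interval: one year [h]
    for name, fn in [("1oo1", pfd_1oo1), ("1oo2", pfd_1oo2), ("2oo3", pfd_2oo3)]:
        print(f"{name}: PFDavg = {fn(lam, t1):.2e}")
```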

  20. Numerical Simulation of Antennas with Improved Integral Equation Method

    International Nuclear Information System (INIS)

    Ma Ji; Fang Guang-You; Lu Wei

    2015-01-01

    Simulating antennas around a conducting object is a challenging task in computational electromagnetism, which is concerned with the behaviour of electromagnetic fields. To analyze this model efficiently, an improved integral equation-fast Fourier transform (IE-FFT) algorithm is presented in this paper. The proposed scheme employs two Cartesian grids of different size and location to enclose the antenna and the other object, respectively. On the one hand, the IE-FFT technique is used to store the matrix in a sparse form and accelerate the matrix-vector multiplication for each sub-domain independently. On the other hand, the mutual interaction between sub-domains is taken as an additional exciting voltage in each matrix equation. By updating the integral equations several times, the whole electromagnetic system reaches a stable state. Finally, the validity of the presented method is verified through the analysis of typical antennas in the presence of a conducting object. (paper)

  1. Ray-based approach to integrated 3D visual communication

    Science.gov (United States)

    Naemura, Takeshi; Harashima, Hiroshi

    2001-02-01

    For a high sense of reality in next-generation communications, it is very important to realize three-dimensional (3D) spatial media instead of existing 2D image media. In order to comprehensively deal with a variety of 3D visual data formats, the authors first introduce the concept of "Integrated 3D Visual Communication," which reflects the necessity of developing a neutral representation method independent of input/output systems. The following discussion then concentrates on the ray-based approach to this concept, in which any visual sensation is considered to be derived from a set of light rays. This approach is a simple and straightforward solution to the problem of how to represent 3D space, an issue shared by various fields including 3D image communications, computer graphics, and virtual reality. This paper mainly presents several developments in this approach, including efficient methods of representing ray data, a real-time video-based rendering system, an interactive rendering system based on integral photography, a concept of a virtual object surface for the compression of the tremendous amount of data, and a light ray capturing system using a telecentric lens. Experimental results demonstrate the effectiveness of the proposed techniques.

  2. Development of an integrated method for long-term water quality prediction using seasonal climate forecast

    Directory of Open Access Journals (Sweden)

    J. Cho

    2016-10-01

    Full Text Available The APEC Climate Center (APCC) produces climate prediction information utilizing a multi-climate model ensemble (MME) technique. In this study, four different downscaling methods, in accordance with the degree of utilizing the seasonal climate prediction information, were developed in order to improve predictability and to refine the spatial scale. These methods include: (1) the Simple Bias Correction (SBC) method, which directly uses APCC's dynamic prediction data with a 3 to 6 month lead time; (2) the Moving Window Regression (MWR) method, which indirectly utilizes dynamic prediction data; (3) the Climate Index Regression (CIR) method, which predominantly uses observation-based climate indices; and (4) the Integrated Time Regression (ITR) method, which uses predictors selected from both CIR and MWR. Then, a sampling-based temporal downscaling was conducted using the Mahalanobis distance method in order to create daily weather inputs to the Soil and Water Assessment Tool (SWAT) model. Long-term predictability of water quality within the Wecheon watershed of the Nakdong River Basin was evaluated. According to the Korean Ministry of Environment's Provisions of Water Quality Prediction and Response Measures, modeling-based predictability was evaluated by using 3-month lead prediction data issued in February, May, August, and November as model input of SWAT. Finally, an integrated approach, which takes into account various climate information and downscaling methods for water quality prediction, was presented. This integrated approach can be used to prevent potential problems caused by extreme climate in advance.
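
    The exact SBC formulation used with the APCC data is not reproduced in the record above; as a rough illustration of the idea, the sketch below applies a generic mean-bias correction of a forecast against observations, with made-up monthly values.

```python
import numpy as np

# Generic mean-bias correction of a seasonal forecast against observations.
# This is a common, minimal form of "simple bias correction"; the exact SBC
# formulation used with the APCC MME data may differ.

def bias_correct(hindcast, observed, forecast):
    """Shift the forecast by the mean bias estimated over the hindcast period."""
    bias = np.mean(hindcast) - np.mean(observed)
    return forecast - bias

# Illustrative numbers only (monthly precipitation, mm).
hindcast = np.array([120.0, 135.0, 110.0, 128.0])   # model over past seasons
observed = np.array([100.0, 118.0, 95.0, 112.0])    # gauge observations
forecast = np.array([140.0, 125.0, 132.0])          # new 3-month lead forecast

print(bias_correct(hindcast, observed, forecast))
```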

  3. Property Valuation: Integration of Methods and Determination of Depreciation

    NARCIS (Netherlands)

    Tempelmans Plat, H.; Verhaegh, M.

    2000-01-01

    Property valuation up to now is a global guess. On the one hand we have the Investment Method which regards a property as just a sum of money, on the other hand we have the Contractor's Method which is based on the actual new construction costs of the building and the actual value of the land. Both

  4. Improved parallel solution techniques for the integral transport matrix method

    Energy Technology Data Exchange (ETDEWEB)

    Zerr, R. Joseph, E-mail: rjz116@psu.edu [Department of Mechanical and Nuclear Engineering, The Pennsylvania State University, University Park, PA (United States); Azmy, Yousry Y., E-mail: yyazmy@ncsu.edu [Department of Nuclear Engineering, North Carolina State University, Burlington Engineering Laboratories, Raleigh, NC (United States)

    2011-07-01

    Alternative solution strategies to the parallel block Jacobi (PBJ) method for the solution of the global problem with the integral transport matrix method operators have been designed and tested. The most straightforward improvement to the Jacobi iterative method is the Gauss-Seidel alternative. The parallel red-black Gauss-Seidel (PGS) algorithm can improve on the number of iterations and reduce work per iteration by applying an alternating red-black color-set to the subdomains and assigning multiple sub-domains per processor. A parallel GMRES(m) method was implemented as an alternative to stationary iterations. Computational results show that the PGS method can improve on the PBJ method execution time by up to 10× when eight sub-domains per processor are used. However, compared to traditional source iterations with diffusion synthetic acceleration, it is still approximately an order of magnitude slower. The best-performing cases are optically thick because sub-domains decouple, yielding faster convergence. Further tests revealed that 64 sub-domains per processor was the best-performing level of sub-domain division. An acceleration technique that improves the convergence rate would greatly improve the ITMM. The GMRES(m) method with a diagonal block preconditioner consumes approximately the same time as the PBJ solver but could be improved by an as yet undeveloped, more efficient preconditioner. (author)
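
    The ITMM operators themselves are not shown in this record; the sketch below only illustrates the red-black coloring idea behind the PGS algorithm on a simple 1D Poisson problem, where each color can in principle be updated in parallel because same-colored points do not depend on each other.

```python
import numpy as np

# Generic red-black Gauss-Seidel for the 1D Poisson problem -u'' = f on a
# uniform grid with zero Dirichlet boundaries. This only illustrates the
# coloring idea used by the PGS algorithm, not the ITMM operators themselves.

def red_black_gauss_seidel(f, h, sweeps=5000):
    n = len(f)
    u = np.zeros(n + 2)                 # includes boundary points u[0] = u[n+1] = 0
    for _ in range(sweeps):
        for color in (1, 2):            # 1 = "red" (odd interior), 2 = "black" (even)
            for i in range(color, n + 1, 2):
                u[i] = 0.5 * (u[i - 1] + u[i + 1] + h * h * f[i - 1])
    return u[1:-1]

n = 63
h = 1.0 / (n + 1)
x = np.linspace(h, 1.0 - h, n)
f = np.pi ** 2 * np.sin(np.pi * x)      # exact solution is sin(pi x)
u = red_black_gauss_seidel(f, h)
print("max error:", np.max(np.abs(u - np.sin(np.pi * x))))
```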

  5. Improved parallel solution techniques for the integral transport matrix method

    International Nuclear Information System (INIS)

    Zerr, R. Joseph; Azmy, Yousry Y.

    2011-01-01

    Alternative solution strategies to the parallel block Jacobi (PBJ) method for the solution of the global problem with the integral transport matrix method operators have been designed and tested. The most straightforward improvement to the Jacobi iterative method is the Gauss-Seidel alternative. The parallel red-black Gauss-Seidel (PGS) algorithm can improve on the number of iterations and reduce work per iteration by applying an alternating red-black color-set to the subdomains and assigning multiple sub-domains per processor. A parallel GMRES(m) method was implemented as an alternative to stationary iterations. Computational results show that the PGS method can improve on the PBJ method execution time by up to 10× when eight sub-domains per processor are used. However, compared to traditional source iterations with diffusion synthetic acceleration, it is still approximately an order of magnitude slower. The best-performing cases are optically thick because sub-domains decouple, yielding faster convergence. Further tests revealed that 64 sub-domains per processor was the best-performing level of sub-domain division. An acceleration technique that improves the convergence rate would greatly improve the ITMM. The GMRES(m) method with a diagonal block preconditioner consumes approximately the same time as the PBJ solver but could be improved by an as yet undeveloped, more efficient preconditioner. (author)

  6. Teaching Basic Quantum Mechanics in Secondary School Using Concepts of Feynman Path Integrals Method

    Science.gov (United States)

    Fanaro, Maria de los Angeles; Otero, Maria Rita; Arlego, Marcelo

    2012-01-01

    This paper discusses the teaching of basic quantum mechanics in high school. Rather than following the usual formalism, our approach is based on Feynman's path integral method. Our presentation makes use of simulation software and avoids sophisticated mathematical formalism. (Contains 3 figures.)
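
    To make the "sum over paths" idea concrete, the toy sketch below, which is not taken from the paper, adds one unit phasor exp(i·2π·L/λ) per alternative path through a two-slit arrangement and takes the squared modulus as the detection probability; all geometry and the wavelength are invented for illustration.

```python
import numpy as np

# Toy "sum of arrows" illustration in the spirit of the path-integral teaching
# approach: two slits, each alternative path contributes a phasor
# exp(i*2*pi*L/lambda), and the detection probability is the squared modulus
# of the summed amplitude. All dimensions are invented.

wavelength = 0.5          # arbitrary units
slit_separation = 5.0
screen_distance = 100.0
slits = np.array([[0.0, +slit_separation / 2],
                  [0.0, -slit_separation / 2]])      # (x, y) of the two slits
source = np.array([-100.0, 0.0])

def intensity(y_screen):
    point = np.array([screen_distance, y_screen])
    amplitude = 0j
    for slit in slits:
        path_length = np.linalg.norm(slit - source) + np.linalg.norm(point - slit)
        amplitude += np.exp(1j * 2 * np.pi * path_length / wavelength)
    return abs(amplitude) ** 2

for y in np.linspace(-10, 10, 9):
    print(f"y = {y:+6.2f}   relative intensity = {intensity(y):.3f}")
```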

  7. Demonstrating the Effectiveness of an Integrated and Intensive Research Methods and Statistics Course Sequence

    Science.gov (United States)

    Pliske, Rebecca M.; Caldwell, Tracy L.; Calin-Jageman, Robert J.; Taylor-Ritzler, Tina

    2015-01-01

    We developed a two-semester series of intensive (six-contact hours per week) behavioral research methods courses with an integrated statistics curriculum. Our approach includes the use of team-based learning, authentic projects, and Excel and SPSS. We assessed the effectiveness of our approach by examining our students' content area scores on the…

  8. SODIM: Service Oriented Data Integration based on MapReduce

    Directory of Open Access Journals (Sweden)

    Ghada ElSheikh

    2013-09-01

    Data integration systems can benefit from innovative dynamic infrastructure solutions such as Clouds, with their greater agility, lower cost, device independency, location independency, and scalability. This study consolidates data integration, Service Orientation, and distributed processing to develop a new data integration system called Service Oriented Data Integration based on MapReduce (SODIM), which improves system performance, especially with a large number of data sources, and which can efficiently be hosted on modern dynamic infrastructures such as Clouds.
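
    SODIM's own architecture is not detailed in this record; the sketch below only shows the generic map/reduce pattern for merging records from two hypothetical sources keyed on a shared identifier, which is the style of processing such a system would distribute.

```python
from collections import defaultdict

# Minimal map/reduce pattern for merging records from two hypothetical data
# sources keyed on a shared identifier. This illustrates the processing style
# SODIM distributes; it is not the SODIM implementation.

source_a = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
source_b = [{"id": 1, "email": "alice@example.org"}, {"id": 3, "email": "carol@example.org"}]

def map_phase(records):
    # Emit (key, partial record) pairs.
    for record in records:
        yield record["id"], {k: v for k, v in record.items() if k != "id"}

def reduce_phase(pairs):
    # Group by key and merge the partial records for each key.
    grouped = defaultdict(dict)
    for key, partial in pairs:
        grouped[key].update(partial)
    return dict(grouped)

pairs = list(map_phase(source_a)) + list(map_phase(source_b))
print(reduce_phase(pairs))
```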

  9. A nodal method based on matrix-response method

    International Nuclear Information System (INIS)

    Rocamora Junior, F.D.; Menezes, A.

    1982-01-01

    A nodal method based on the matrix-response method is presented, and its application to spatial gradient problems, such as those that exist in fast reactors near the core-blanket interface, is investigated. (E.G.) [pt

  10. Application of the critical pathway and integrated case teaching method to nursing orientation.

    Science.gov (United States)

    Goodman, D

    1997-01-01

    Nursing staff development programs must be responsive to current changes in healthcare. New nursing staff must be prepared to manage continuous change and to function competently in clinical practice. The orientation pathway, based on a case management model, is used as a structure for the orientation phase of staff development. The integrated case is incorporated as a teaching strategy in orientation. The integrated case method is based on discussion and analysis of patient situations with emphasis on role modeling and integration of theory and skill. The orientation pathway and integrated case teaching method provide a useful framework for orientation of new staff. Educators, preceptors and orientees find the structure provided by the orientation pathway very useful. Orientation that is developed, implemented and evaluated based on a case management model with the use of an orientation pathway and incorporation of an integrated case teaching method provides a standardized structure for orientation of new staff. This approach is designed for the adult learner, promotes conceptual reasoning, and encourages the social and contextual basis for continued learning.

  11. A high-order boundary integral method for surface diffusions on elastically stressed axisymmetric rods.

    Science.gov (United States)

    Li, Xiaofan; Nie, Qing

    2009-07-01

    Many applications in materials science involve surface diffusion of elastically stressed solids. Study of singularity formation and long-time behavior of such solid surfaces requires accurate simulations in both space and time. Here we present a high-order boundary integral method for an elastically stressed solid with axi-symmetry due to surface diffusions. In this method, the boundary integrals for isotropic elasticity in axi-symmetric geometry are approximated through modified alternating quadratures along with an extrapolation technique, leading to an arbitrarily high-order quadrature; in addition, a high-order (temporal) integration factor method, based on an explicit representation of the mean curvature, is used to reduce the stability constraint on the time-step. To apply this method to a periodic (in the axial direction) and axi-symmetric elastically stressed cylinder, we also present a fast and accurate summation method for the periodic Green's functions of isotropic elasticity. Using the high-order boundary integral method, we demonstrate that in the absence of elasticity the cylinder surface pinches in finite time at the axis of symmetry, and the universal cone angle of the pinching is found to be consistent with previous studies based on a self-similar assumption. In the presence of elastic stress, we show that a finite-time geometrical singularity occurs well before the cylindrical solid collapses onto the axis of symmetry, and the angle of the corner singularity on the cylinder surface is also estimated.
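
    The paper's high-order integration factor scheme for the curvature term is considerably more elaborate than can be shown here; the sketch below illustrates the basic idea on a scalar stiff model problem u' = -k·u + g(u), where the stiff linear term is handled exactly by the exponential factor, removing its step-size restriction. The nonlinearity g and all numbers are invented.

```python
import numpy as np

# Basic (first-order) integration factor / exponential time differencing step
# for a stiff scalar ODE u' = -k*u + g(u). The stiff linear term is integrated
# exactly, so it imposes no step-size restriction; only this idea, not the
# paper's high-order curvature-based scheme, is illustrated.

def g(u):
    return np.sin(u)            # illustrative nonlinear, non-stiff part

def integrate(u0, k, dt, steps):
    u = u0
    for _ in range(steps):
        # Exact treatment of the linear term, explicit treatment of g (ETD1):
        # u_{n+1} = e^{-k dt} u_n + (1 - e^{-k dt}) / k * g(u_n)
        u = np.exp(-k * dt) * u + (1.0 - np.exp(-k * dt)) / k * g(u)
    return u

print(integrate(u0=1.0, k=1.0e4, dt=0.01, steps=100))   # stable despite k*dt >> 1
```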

  12. Integration method of 3D MR spectroscopy into treatment planning system for glioblastoma IMRT dose painting with integrated simultaneous boost

    International Nuclear Information System (INIS)

    Ken, Soléakhéna; Cassol, Emmanuelle; Delannes, Martine; Celsis, Pierre; Cohen-Jonathan, Elizabeth Moyal; Laprie, Anne; Vieillevigne, Laure; Franceries, Xavier; Simon, Luc; Supper, Caroline; Lotterie, Jean-Albert; Filleron, Thomas; Lubrano, Vincent; Berry, Isabelle

    2013-01-01

    To integrate 3D MR spectroscopy imaging (MRSI) in the treatment planning system (TPS) for glioblastoma dose painting to guide simultaneous integrated boost (SIB) in intensity-modulated radiation therapy (IMRT). For sixteen glioblastoma patients, we simulated three types of dosimetry plans: one conventional 60-Gy plan in 3D conformal radiotherapy (3D-CRT), one 60-Gy plan in IMRT and one 72-Gy plan in SIB-IMRT. All sixteen MRSI metabolic maps were integrated into the TPS, using normalization with color-space conversion and threshold-based segmentation. The fusion between the metabolic maps and the planning CT scans was assessed. Dosimetry comparisons were performed between the different plans of 60-Gy 3D-CRT, 60-Gy IMRT and 72-Gy SIB-IMRT; the last plan was targeted on MRSI abnormalities and contrast enhancement (CE). Fusion assessment was performed for 160 transformations. It resulted in maximum differences <1.00 mm for translation parameters and ≤1.15° for rotation. Dosimetry plans of 72-Gy SIB-IMRT and 60-Gy IMRT showed a significantly decreased maximum dose to the brainstem (44.00 and 44.30 vs. 57.01 Gy) and decreased high dose-volumes to normal brain (19 and 20 vs. 23% and 7 and 7 vs. 12%) compared to 60-Gy 3D-CRT (p < 0.05). Delivering standard doses to the conventional target and higher doses to new target volumes characterized by MRSI and CE is now possible and does not increase dose to organs at risk. MRSI and CE abnormalities are now integrated for glioblastoma SIB-IMRT, concomitant with temozolomide, in an ongoing multi-institutional phase-III clinical trial. Our method of MR spectroscopy map integration into the TPS is robust and reliable; integration into neuronavigation systems with this method could also improve glioblastoma resection or guide biopsies.

  13. Method for deposition of a conductor in integrated circuits

    Science.gov (United States)

    Creighton, J. Randall; Dominguez, Frank; Johnson, A. Wayne; Omstead, Thomas R.

    1997-01-01

    A method is described for fabricating integrated semiconductor circuits and, more particularly, for the selective deposition of a conductor onto a substrate employing a chemical vapor deposition process. By way of example, tungsten can be selectively deposited onto a silicon substrate. At the onset of loss of selectivity of deposition of tungsten onto the silicon substrate, the deposition process is interrupted and unwanted tungsten which has deposited on a mask layer with the silicon substrate can be removed employing a halogen etchant. Thereafter, a plurality of deposition/etch back cycles can be carried out to achieve a predetermined thickness of tungsten.

  14. Integrated Data Collection Analysis (IDCA) Program - SSST Testing Methods

    Energy Technology Data Exchange (ETDEWEB)

    Sandstrom, Mary M. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Brown, Geoffrey W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Preston, Daniel N. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Pollard, Colin J. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Warner, Kirstin F. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Remmers, Daniel L. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Sorensen, Daniel N. [Naval Surface Warfare Center (NSWC), Indian Head, MD (United States). Indian Head Division; Whinnery, LeRoy L. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Phillips, Jason J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Shelley, Timothy J. [Bureau of Alcohol, Tobacco and Firearms (ATF), Huntsville, AL (United States); Reyes, Jose A. [Applied Research Associates, Tyndall AFB, FL (United States); Hsu, Peter C. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Reynolds, John G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-03-25

    The Integrated Data Collection Analysis (IDCA) program is conducting a proficiency study for Small- Scale Safety and Thermal (SSST) testing of homemade explosives (HMEs). Described here are the methods used for impact, friction, electrostatic discharge, and differential scanning calorimetry analysis during the IDCA program. These methods changed throughout the Proficiency Test and the reasons for these changes are documented in this report. The most significant modifications in standard testing methods are: 1) including one specified sandpaper in impact testing among all the participants, 2) diversifying liquid test methods for selected participants, and 3) including sealed sample holders for thermal testing by at least one participant. This effort, funded by the Department of Homeland Security (DHS), is putting the issues of safe handling of these materials in perspective with standard military explosives. The study is adding SSST testing results for a broad suite of different HMEs to the literature. Ultimately the study will suggest new guidelines and methods and possibly establish the SSST testing accuracies needed to develop safe handling practices for HMEs. Each participating testing laboratory uses identical test materials and preparation methods wherever possible. The testing performers involved are Lawrence Livermore National Laboratory (LLNL), Los Alamos National Laboratory (LANL), Indian Head Division, Naval Surface Warfare Center, (NSWC IHD), Sandia National Laboratories (SNL), and Air Force Research Laboratory (AFRL/RXQL). These tests are conducted as a proficiency study in order to establish some consistency in test protocols, procedures, and experiments and to compare results when these testing variables cannot be made consistent.

  15. Use of integral data in the development of design methods for fast reactors

    International Nuclear Information System (INIS)

    Doncals, R.A.; Lake, J.A.; Paik, N.C.

    1978-01-01

    The paper describes an LMFBR nuclear design methodology which has been strongly influenced by the availability of integral data, by the expansion of the differential nuclear data base, by improvements in large nuclear design computer codes, and by the specific reactor under consideration. The accuracy of the nuclear data base has been improved as the result of detailed differential measurements as well as extensive integral testing in the ZPR and ZPPR criticals. Due to the increased interest in radial parfait designs, the applicability of the design data and methods to the analysis of heterogeneous LMFBR systems has been explored. The ability of the design data and methods to predict integral parameters in ZPPR is also discussed.

  16. Integrated Markov-neural reliability computation method: A case for multiple automated guided vehicle system

    International Nuclear Information System (INIS)

    Fazlollahtabar, Hamed; Saidi-Mehrabad, Mohammad; Balakrishnan, Jaydeep

    2015-01-01

    This paper proposes an integrated Markovian and back-propagation neural network approach to compute the reliability of a system. Since the states in which failures occur are significant elements for accurate reliability computation, a Markovian-based reliability assessment method is designed. Due to the drawbacks of the Markovian model for steady-state reliability computations and of the neural network for the initial training pattern, an integration called Markov-neural is developed and evaluated. To show the efficiency of the proposed approach, comparative analyses are performed. Also, for managerial implication purposes, an application case for multiple automated guided vehicles (AGVs) in manufacturing networks is conducted. - Highlights: • Integrated Markovian and back propagation neural network approach to compute reliability. • Markovian based reliability assessment method. • Managerial implication is shown in an application case for multiple automated guided vehicles (AGVs) in manufacturing networks
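
    The integrated Markov-neural model is not reproduced here; as a minimal illustration of the Markovian building block, the sketch below solves a two-state (up/down) availability model, whose steady-state availability is mu/(lambda+mu), with failure and repair rates invented rather than taken from the AGV case study.

```python
import numpy as np

# Two-state (up/down) Markov availability model: the simplest building block of
# a Markovian reliability assessment. lambda_f = failure rate, mu_r = repair rate.
# Rates are illustrative, not taken from the AGV application case.

lambda_f = 0.01    # failures per hour
mu_r = 0.5         # repairs per hour

# Generator matrix Q for states [up, down]; the Kolmogorov equations are dp/dt = p Q.
Q = np.array([[-lambda_f, lambda_f],
              [ mu_r,    -mu_r   ]])

def transient_availability(t, steps=10000):
    p = np.array([1.0, 0.0])          # start in the "up" state
    dt = t / steps
    for _ in range(steps):
        p = p + dt * p @ Q            # forward Euler on dp/dt = p Q
    return p[0]

print("availability after 10 h:", transient_availability(10.0))
print("steady-state availability:", mu_r / (lambda_f + mu_r))
```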

  17. Research on Sustainable Development Level Evaluation of Resource-based Cities Based on Shapley Entropy and Choquet Integral

    Science.gov (United States)

    Zhao, Hui; Qu, Weilu; Qiu, Weiting

    2018-03-01

    In order to evaluate the sustainable development level of resource-based cities, an evaluation method with Shapley entropy and the Choquet integral is proposed. First, a systematic index system is constructed; the importance of each attribute is calculated based on the maximum Shapley entropy principle, and then the Choquet integral is introduced to calculate the comprehensive evaluation value of each city from the bottom up. Finally, the method is applied to 10 typical resource-based cities in China. The empirical results show that the evaluation method is scientific and reasonable, which provides theoretical support for the sustainable development path and reform direction of resource-based cities.
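
    A minimal discrete Choquet integral is sketched below; the capacity values and the city scores are invented for illustration, and the identification of the capacity via the maximum Shapley-entropy principle described in the record is a separate step not shown.

```python
# Minimal discrete Choquet integral. The fuzzy measure (capacity) and the city
# scores are invented for illustration; identifying the capacity via the maximum
# Shapley-entropy principle is a separate step not shown here.

def choquet(scores, capacity):
    """scores: dict criterion -> value in [0, 1]; capacity: dict frozenset -> weight."""
    items = sorted(scores.items(), key=lambda kv: kv[1])    # ascending by score
    total, previous = 0.0, 0.0
    for i, (criterion, value) in enumerate(items):
        remaining = frozenset(c for c, _ in items[i:])       # criteria with score >= value
        total += (value - previous) * capacity[remaining]
        previous = value
    return total

criteria = ("economy", "environment", "society")
capacity = {
    frozenset(criteria): 1.0,
    frozenset({"economy", "environment"}): 0.8,
    frozenset({"economy", "society"}): 0.7,
    frozenset({"environment", "society"}): 0.6,
    frozenset({"economy"}): 0.5,
    frozenset({"environment"}): 0.4,
    frozenset({"society"}): 0.3,
    frozenset(): 0.0,
}
city_scores = {"economy": 0.7, "environment": 0.4, "society": 0.6}
print("comprehensive evaluation value:", choquet(city_scores, capacity))
```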

  18. A VGI data integration framework based on linked data model

    Science.gov (United States)

    Wan, Lin; Ren, Rongrong

    2015-12-01

    This paper addresses a geographic data integration and sharing method for multiple online VGI data sets. We propose a semantic-enabled framework for an online VGI sources cooperative application environment to solve a target class of geospatial problems. Based on linked data technologies - one of the core components of the semantic web - we can construct relationship links among geographic features distributed in diverse VGI platforms by using linked data modeling methods, then deploy these semantic-enabled entities on the web, and eventually form an interconnected geographic data network to support geospatial information cooperative application across multiple VGI data sources. The mapping and transformation from VGI sources to the RDF linked data model is presented to guarantee a unified data representation model among different online social geographic data sources. We propose a mixed strategy which combines spatial distance similarity and feature name attribute similarity as the measure standard to compare and match geographic features in various VGI data sets. Our work focuses on how to apply Markov logic networks to achieve interlinks of the same linked data in different VGI-based linked data sets. In our method, the automatic generation of a co-reference object identification model according to geographic linked data is discussed in more detail. The result is a large geographic linked data network across loosely-coupled VGI web sites. The results of an experiment built on our framework and the evaluation of our method show that the framework is reasonable and practicable.
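
    As a small illustration of the mixed matching strategy mentioned above, the sketch below combines a spatial-distance similarity with a name-string similarity into one score; the weights, decay scale, decision threshold and example features are invented, and real VGI matching would also need coordinate-system handling.

```python
import math
from difflib import SequenceMatcher

# Mixed matching score combining spatial distance and name similarity, in the
# spirit of the strategy described above. Weights, threshold and the example
# features are invented.

def name_similarity(a, b):
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def spatial_similarity(p, q, scale=100.0):
    """Map Euclidean distance (metres) into (0, 1]; 'scale' controls the decay."""
    distance = math.hypot(p[0] - q[0], p[1] - q[1])
    return math.exp(-distance / scale)

def match_score(feature_a, feature_b, w_space=0.6, w_name=0.4):
    return (w_space * spatial_similarity(feature_a["xy"], feature_b["xy"])
            + w_name * name_similarity(feature_a["name"], feature_b["name"]))

osm_feature = {"name": "Central Park Cafe", "xy": (1203.0, 884.0)}
wiki_feature = {"name": "Cafe Central Park", "xy": (1215.0, 880.0)}
score = match_score(osm_feature, wiki_feature)
print("match score:", round(score, 3), "-> same feature?", score > 0.7)
```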

  19. Conservative multi-implicit integral deferred correction methods with adaptive mesh refinement

    International Nuclear Information System (INIS)

    Layton, A.T.

    2004-01-01

    In most models of reacting gas dynamics, the characteristic time scales of chemical reactions are much shorter than the hydrodynamic and diffusive time scales, rendering the reaction part of the model equations stiff. Moreover, nonlinear forcings may introduce into the solutions sharp gradients or shocks, the robust behavior and correct propagation of which require the use of specialized spatial discretization procedures. This study presents high-order conservative methods for the temporal integration of model equations of reacting flows. By means of a method of lines discretization on the flux difference form of the equations, these methods compute approximations to the cell-averaged or finite-volume solution. The temporal discretization is based on a multi-implicit generalization of integral deferred correction methods. The advection term is integrated explicitly, and the diffusion and reaction terms are treated implicitly but independently, with the splitting errors present in traditional operator splitting methods reduced via the integral deferred correction procedure. To reduce computational cost, time steps used to integrate processes with widely-differing time scales may differ in size. (author)

  20. [Bases and methods of suturing].

    Science.gov (United States)

    Vogt, P M; Altintas, M A; Radtke, C; Meyer-Marcotty, M

    2009-05-01

    If pharmaceutic modulation of scar formation does not improve the quality of the healing process over conventional healing, the surgeon must rely on personal skill and experience. Therefore a profound knowledge of wound healing based on experimental and clinical studies supplemented by postsurgical means of scar management and basic techniques of planning incisions, careful tissue handling, and thorough knowledge of suturing remain the most important ways to avoid abnormal scarring. This review summarizes the current experimental and clinical bases of surgical scar management.

  1. Ontology-based geographic data set integration

    NARCIS (Netherlands)

    Uitermark, H.T.J.A.; Uitermark, Harry T.; Oosterom, Peter J.M.; Mars, Nicolaas; Molenaar, Martien; Molenaar, M.

    1999-01-01

    In order to develop a system to propagate updates we investigate the semantic and spatial relationships between independently produced geographic data sets of the same region (data set integration). The goal of this system is to reduce operator intervention in update operations between corresponding

  2. Modeling of Graphene Planar Grating in the THz Range by the Method of Singular Integral Equations

    Science.gov (United States)

    Kaliberda, Mstislav E.; Lytvynenko, Leonid M.; Pogarsky, Sergey A.

    2018-04-01

    Diffraction of the H-polarized electromagnetic wave by the planar graphene grating in the THz range is considered. The scattering and absorption characteristics are studied. The scattered field is represented in the spectral domain via unknown spectral function. The mathematical model is based on the graphene surface impedance and the method of singular integral equations. The numerical solution is obtained by the Nystrom-type method of discrete singularities.

  3. The reduced basis method for the electric field integral equation

    International Nuclear Information System (INIS)

    Fares, M.; Hesthaven, J.S.; Maday, Y.; Stamm, B.

    2011-01-01

    We introduce the reduced basis method (RBM) as an efficient tool for parametrized scattering problems in computational electromagnetics for problems where field solutions are computed using a standard Boundary Element Method (BEM) for the parametrized electric field integral equation (EFIE). This combination enables an algorithmic cooperation which results in a two step procedure. The first step consists of a computationally intense assembling of the reduced basis, that needs to be effected only once. In the second step, we compute output functionals of the solution, such as the Radar Cross Section (RCS), independently of the dimension of the discretization space, for many different parameter values in a many-query context at very little cost. Parameters include the wavenumber, the angle of the incident plane wave and its polarization.

  4. Integrating Multiple Teaching Methods into a General Chemistry Classroom

    Science.gov (United States)

    Francisco, Joseph S.; Nicoll, Gayle; Trautmann, Marcella

    1998-02-01

    In addition to the traditional lecture format, three other teaching strategies (class discussions, concept maps, and cooperative learning) were incorporated into a freshman level general chemistry course. Student perceptions of their involvement in each of the teaching methods, as well as their perceptions of the utility of each method were used to assess the effectiveness of the integration of the teaching strategies as received by the students. Results suggest that each strategy serves a unique purpose for the students and increased student involvement in the course. These results indicate that the multiple teaching strategies were well received by the students and that all teaching strategies are necessary for students to get the most out of the course.

  5. Integral equation methods for vesicle electrohydrodynamics in three dimensions

    Science.gov (United States)

    Veerapaneni, Shravan

    2016-12-01

    In this paper, we develop a new boundary integral equation formulation that describes the coupled electro- and hydro-dynamics of a vesicle suspended in a viscous fluid and subjected to external flow and electric fields. The dynamics of the vesicle are characterized by a competition between the elastic, electric and viscous forces on its membrane. The classical Taylor-Melcher leaky-dielectric model is employed for the electric response of the vesicle and the Helfrich energy model combined with local inextensibility is employed for its elastic response. The coupled governing equations for the vesicle position and its transmembrane electric potential are solved using a numerical method that is spectrally accurate in space and first-order in time. The method uses a semi-implicit time-stepping scheme to overcome the numerical stiffness associated with the governing equations.
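
    The vesicle solver itself couples spectrally accurate boundary integrals with the time stepping, which cannot be condensed here; the sketch below only shows the basic semi-implicit (IMEX) idea on a scalar stiff model problem u' = -k·u + g(u), treating the stiff linear term implicitly and the rest explicitly. The model problem and all values are invented.

```python
import math

# Semi-implicit (IMEX) first-order step for a stiff model problem
# u' = -k*u + g(u): the stiff linear term is treated implicitly, the nonlinear
# term explicitly. This only illustrates the time-stepping idea, not the
# vesicle electrohydrodynamics solver.

def g(u):
    return math.cos(u)          # illustrative non-stiff nonlinearity

def imex_euler(u0, k, dt, steps):
    u = u0
    for _ in range(steps):
        u = (u + dt * g(u)) / (1.0 + k * dt)   # implicit in -k*u, explicit in g
    return u

print(imex_euler(u0=1.0, k=1.0e4, dt=0.01, steps=200))  # stable despite k*dt = 100
```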

  6. Integrated knowledge base tool for acquisition and verification of NPP alarm systems

    International Nuclear Information System (INIS)

    Park, Joo Hyun; Seong, Poong Hyun

    1998-01-01

    Knowledge acquisition and knowledge base verification are important activities in developing knowledge-based systems such as alarm processing systems. In this work, we developed an integrated tool for knowledge acquisition and verification of NPP alarm processing systems using the G2 tool. The tool integrates the document analysis method and the ECPN matrix analysis method for knowledge acquisition and knowledge verification, respectively. This tool enables knowledge engineers to perform their tasks from knowledge acquisition to knowledge verification consistently.

  7. Accelerometer method and apparatus for integral display and control functions

    Science.gov (United States)

    Bozeman, Richard J., Jr.

    1992-06-01

    Vibration analysis has been used for years to provide a determination of the proper functioning of different types of machinery, including rotating machinery and rocket engines. A determination of a malfunction, if detected at a relatively early stage in its development, will allow changes in operating mode or a sequenced shutdown of the machinery prior to a total failure. Such preventative measures result in less extensive and/or less expensive repairs, and can also prevent a sometimes catastrophic failure of equipment. Standard vibration analyzers are generally rather complex, expensive, and of limited portability. They also usually result in displays and controls being located remotely from the machinery being monitored. Consequently, a need exists for improvements in accelerometer electronic display and control functions which are more suitable for operation directly on machines and which are not so expensive and complex. The invention includes methods and apparatus for detecting mechanical vibrations and outputting a signal in response thereto. The apparatus includes an accelerometer package having integral display and control functions. The accelerometer package is suitable for mounting upon the machinery to be monitored. Display circuitry provides signals to a bar graph display which may be used to monitor machine condition over a period of time. Control switches may be set which correspond to elements in the bar graph to provide an alert if vibration signals increase over the selected trip point. The circuitry is shock mounted within the accelerometer housing. The method provides for outputting a broadband analog accelerometer signal, integrating this signal to produce a velocity signal, integrating and calibrating the velocity signal before application to a display driver, and selecting a trip point at which a digitally compatible output signal is generated. The benefits of a vibration recording and monitoring system with controls and displays readily
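
    As a rough sketch of the signal path described above (integrate the broadband acceleration signal to velocity, then compare a vibration level against a selectable trip point), the code below uses a trapezoidal integration and an RMS level; the test signal, sample rate and trip level are invented and are not from the patent.

```python
import numpy as np

# Sketch of the described signal path: integrate a broadband acceleration
# signal to velocity (trapezoidal rule), compute an RMS vibration level, and
# compare it against a selectable trip point. All values are invented.

fs = 1000.0                                               # sample rate [Hz]
t = np.arange(0, 1.0, 1.0 / fs)
acceleration = 0.5 * np.sin(2 * np.pi * 50 * t)           # 50 Hz vibration [m/s^2]
acceleration += 0.05 * np.random.randn(t.size)            # broadband noise

# Numerically integrate acceleration to velocity, then remove any drift (mean).
velocity = np.concatenate(([0.0], np.cumsum((acceleration[1:] + acceleration[:-1]) / 2) / fs))
velocity -= velocity.mean()

rms_velocity = np.sqrt(np.mean(velocity ** 2))            # [m/s]
trip_point = 2.0e-3                                       # selectable alert level [m/s]
print(f"RMS velocity = {rms_velocity * 1e3:.2f} mm/s, alert = {rms_velocity > trip_point}")
```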

  8. Application of the method of integral equations to calculating the electrodynamic characteristics of periodically corrugated waveguides

    International Nuclear Information System (INIS)

    Belov, V.E.; Rodygin, L.V.; Fil'chenko, S.E.; Yunakovskii, A.D.

    1988-01-01

    A method is described for calculating the electrodynamic characteristics of periodically corrugated waveguide systems. This method is based on representing the field as the solution of the Helmholtz vector equation in the form of a simple layer potential, transformed with the use of the Floquet conditions. Systems of compound integral equations based on a weighted vector function of the simple layer potential are derived for waveguides with azimuthally symmetric and helical corrugations. A numerical realization of the Fourier method is cited for seeking the dispersion relation of azimuthally symmetric waves of a circular corrugated waveguide

  9. An Analysis of Delay-based and Integrator-based Sequence Detectors for Grid-Connected Converters

    DEFF Research Database (Denmark)

    Khazraj, Hesam; Silva, Filipe Miguel Faria da; Bak, Claus Leth

    2017-01-01

    Detecting and separating positive and negative sequence components of the grid voltage or current is of vital importance in the control of grid-connected power converters, HVDC systems, etc. To this end, several techniques have been proposed in recent years. These techniques can be broadly classified into two main classes: integrator-based techniques and delay-based techniques. The complex-coefficient filter-based technique, the dual second-order generalized integrator-based method and the multiple reference frame approach are the main members of the integrator-based sequence detectors, while delayed-signal cancellation operators are the main members of the delay-based sequence detectors. The aim of this paper is to provide a theoretical and experimental comparative study between integrator- and delay-based sequence detectors. The theoretical analysis is conducted based on small-signal modelling.
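
    The sketch below shows the standard delayed-signal-cancellation (DSC) positive-sequence extraction in the stationary alpha-beta frame, one of the delay-based detectors mentioned above: v_plus(t) = 0.5·(v_ab(t) + j·v_ab(t - T/4)), with v_ab represented as the complex signal v_alpha + j·v_beta. The unbalanced test voltage is invented and this is not the paper's implementation.

```python
import numpy as np

# Delayed-signal-cancellation (DSC) positive-sequence extraction in the
# stationary alpha-beta frame:
#   v_plus(t) = 0.5 * ( v_ab(t) + 1j * v_ab(t - T/4) )
# A quarter-cycle delay cancels the negative-sequence component exactly in
# steady state. The unbalanced test voltage below is invented.

f = 50.0                       # grid frequency [Hz]
fs = 10000.0                   # sample rate [Hz]
T = 1.0 / f
t = np.arange(0, 0.1, 1.0 / fs)

V_pos, V_neg = 1.0, 0.3        # per-unit positive and negative sequence magnitudes
v_ab = V_pos * np.exp(1j * 2 * np.pi * f * t) + V_neg * np.exp(-1j * 2 * np.pi * f * t)

delay = int(round(fs * T / 4))                           # quarter-cycle delay in samples
v_delayed = np.concatenate((np.zeros(delay, dtype=complex), v_ab[:-delay]))
v_plus = 0.5 * (v_ab + 1j * v_delayed)

# After the initial quarter-cycle transient, |v_plus| settles near V_pos.
print("estimated |v+| =", np.round(np.abs(v_plus[-1]), 3))
```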

  10. Based on Penalty Function Method

    Directory of Open Access Journals (Sweden)

    Ishaq Baba

    2015-01-01

    Full Text Available The dual response surface approach for simultaneously optimizing the mean and variance models as separate functions suffers from some deficiencies in handling the tradeoffs between the bias and variance components of mean squared error (MSE). In this paper, the accuracy of the predicted response is given serious attention in the determination of the optimum setting conditions. We consider four different objective functions for the dual response surface optimization approach. The essence of the proposed method is to reduce the influence of the variance of the predicted response by minimizing the variability relative to the quality characteristics of interest while at the same time achieving the specified target output. The basic idea is to convert the constrained optimization function into an unconstrained problem by adding the constraint to the original objective function. Numerical examples and simulation studies are carried out to compare the performance of the proposed method with some existing procedures. Numerical results show that the performance of the proposed method is encouraging and exhibits clear improvement over the existing approaches.
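
    The basic idea of folding a constraint into the objective can be illustrated with a quadratic penalty, as in the sketch below; the objective, constraint and penalty weights are generic examples and are not the dual-response-surface models of the paper.

```python
import numpy as np
from scipy.optimize import minimize

# Quadratic-penalty illustration: a constrained problem
#   minimize f(x)  subject to  h(x) = 0
# is replaced by the unconstrained problem  f(x) + r * h(x)**2  for an
# increasing penalty weight r. The functions below are generic examples.

def f(x):                      # objective, e.g. a predicted-variance surrogate
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

def h(x):                      # equality constraint, e.g. the mean hits its target
    return x[0] + x[1] - 2.0

def penalized(x, r):
    return f(x) + r * h(x) ** 2

for r in (1.0, 10.0, 1000.0):
    res = minimize(penalized, x0=np.zeros(2), args=(r,))
    print(f"r = {r:7.1f}  x = {np.round(res.x, 4)}  h(x) = {h(res.x):+.4f}")
```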

  11. COMPANY VALUATION METHODS BASED ON PATRIMONY

    Directory of Open Access Journals (Sweden)

    SUCIU GHEORGHE

    2013-02-01

    Full Text Available The methods used for company valuation can be divided into 3 main groups: methods based on patrimony, methods based on financial performance, and methods based both on patrimony and on performance. The company valuation methods based on patrimony are implemented taking into account the balance sheet or the financial statement. The financial statement refers to that type of balance in which the assets are arranged according to liquidity, and the liabilities according to their financial maturity date. The patrimonial methods are based on the principle that the value of the company equals that of the patrimony it owns. From a legal point of view, the patrimony refers to all the rights and obligations of a company. The valuation of companies based on their financial performance can be done in 3 ways: the return value, the yield value, and the present value of the cash flows. The mixed methods depend both on patrimony and on financial performance or can make use of other methods.

  12. Methods for external event screening quantification: Risk Methods Integration and Evaluation Program (RMIEP) methods development

    International Nuclear Information System (INIS)

    Ravindra, M.K.; Banon, H.

    1992-07-01

    In this report, the scoping quantification procedures for external events in probabilistic risk assessments of nuclear power plants are described. External event analysis in a PRA has three important goals; (1) the analysis should be complete in that all events are considered; (2) by following some selected screening criteria, the more significant events are identified for detailed analysis; (3) the selected events are analyzed in depth by taking into account the unique features of the events: hazard, fragility of structures and equipment, external-event initiated accident sequences, etc. Based on the above goals, external event analysis may be considered as a three-stage process: Stage I: Identification and Initial Screening of External Events; Stage II: Bounding Analysis; Stage III: Detailed Risk Analysis. In the present report, first, a review of published PRAs is given to focus on the significance and treatment of external events in full-scope PRAs. Except for seismic, flooding, fire, and extreme wind events, the contributions of other external events to plant risk have been found to be negligible. Second, scoping methods for external events not covered in detail in the NRC's PRA Procedures Guide are provided. For this purpose, bounding analyses for transportation accidents, extreme winds and tornadoes, aircraft impacts, turbine missiles, and chemical release are described

  13. Consequence Based Design. An approach for integrating computational collaborative models (Integrated Dynamic Models) in the building design phase

    DEFF Research Database (Denmark)

    Negendahl, Kristoffer

    relies on various advancements in the area of integrated dynamic models. It also relies on the application and test of the approach in practice to evaluate the Consequence based design and the use of integrated dynamic models. As a result, the Consequence based design approach has been applied in five...... and define new ways to implement integrated dynamic models for the following project. In parallel, seven different developments of new methods, tools and algorithms have been performed to support the application of the approach. The developments concern: Decision diagrams – to clarify goals and the ability...... affect the design process and collaboration between building designers and simulationists. Within the limits of applying the approach of Consequence based design to five case studies, followed by documentation based on interviews, surveys and project related documentations derived from internal reports...

  14. Developments of integrity evaluation technology for pressurized components in nuclear power plant and IT based integrity evaluation system

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Young Jin; Choi, Jae Boong; Shim, Do Jun [Sungkyunkwan Univ., Seoul (Korea, Republic of)] (and others)

    2003-03-15

    The objective of this research is to develop an efficient evaluation technology and to investigate the applicability of newly-developed technology, such as an internet-based cyber platform, to operating power plants. Efficient evaluation systems for Nuclear Power Plant components, based on structural integrity assessment techniques, are increasingly demanded for safe operation as the operating period of Nuclear Power Plants increases. The following five topics are covered in this project: development of an assessment method for wall-thinned nuclear piping based on pipe tests; development of a structural integrity program for steam generator tubes with cracks of various shapes; development of a fatigue life evaluation system for main components of NPPs; development of an internet-based cyber platform and integrity program for primary components of NPPs; and the effect of aging on the strength of dissimilar welds.

  15. Integrating forest inventory and analysis data into a LIDAR-based carbon monitoring system

    Science.gov (United States)

    Kristofer D. Johnson; Richard Birdsey; Andrew O Finley; Anu Swantaran; Ralph Dubayah; Craig Wayson; Rachel. Riemann

    2014-01-01

    Forest Inventory and Analysis (FIA) data may be a valuable component of a LIDAR-based carbon monitoring system, but integration of the two observation systems is not without challenges. To explore integration methods, two wall-to-wall LIDAR-derived biomass maps were compared to FIA data at both the plot and county levels in Anne Arundel and Howard Counties in Maryland...

  16. A Numerical Study of Quantization-Based Integrators

    Directory of Open Access Journals (Sweden)

    Barros Fernando

    2014-01-01

    Full Text Available Adaptive step size solvers are nowadays considered fundamental to achieve efficient ODE integration. While, traditionally, ODE solvers have been designed based on discrete time machines, new approaches based on discrete event systems have been proposed. Quantization provides an efficient integration technique based on signal threshold crossing, leading to independent and modular solvers communicating through discrete events. These solvers can benefit from the large body of knowledge on discrete event simulation techniques, like parallelization, to obtain efficient numerical integration. In this paper we introduce new solvers based on quantization and adaptive sampling techniques. Preliminary numerical results comparing these solvers are presented.
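
    A minimal first-order quantized-state (QSS1) integrator is sketched below on the test equation x' = -x: the state advances between threshold-crossing events instead of fixed time steps, which is the quantization idea the record refers to. This is a toy sketch, not any of the solvers studied in the paper.

```python
import math

# First-order quantized-state (QSS1) integration of the test equation
#   x'(t) = -x(t),  x(0) = 1.
# The state is advanced between threshold-crossing (quantization) events
# rather than fixed time steps. Minimal sketch, not a production QSS solver.

def qss1(x0, t_end, quantum=0.01):
    t, x = 0.0, x0
    q = x                          # quantized state used to evaluate the derivative
    n_events = 0
    while t < t_end:
        dx = -q                    # derivative f(q) for the test equation x' = -x
        if dx == 0.0:
            break                  # the quantized state no longer changes
        dt = quantum / abs(dx)     # time for x to drift one quantum away from q
        if t + dt >= t_end:
            x += dx * (t_end - t)  # advance only to the end of the interval
            t = t_end
        else:
            t += dt
            x += dx * dt           # x has now moved exactly one quantum
            q = x                  # re-quantize: this is the discrete event
            n_events += 1
    return x, n_events

x_end, n_events = qss1(1.0, t_end=5.0, quantum=0.01)
print(f"x(5) = {x_end:.4f}  (exact {math.exp(-5.0):.4f}), quantization events = {n_events}")
```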

  17. A fast method for linear waves based on geometrical optics

    NARCIS (Netherlands)

    Stolk, C.C.

    2009-01-01

    We develop a fast method for solving the one-dimensional wave equation based on geometrical optics. From geometrical optics (e.g., Fourier integral operator theory or WKB approximation) it is known that high-frequency waves split into forward and backward propagating parts, each propagating with the

  18. Integration issues of information engineering based I-CASE tools

    OpenAIRE

    Kurbel, Karl; Schnieder, Thomas

    1994-01-01

    Problems and requirements regarding integration of methods and tools across phases of the software-development life cycle are discussed. Information engineering (IE) methodology and I-CASE (integrated CASE) tools supporting IE claim to have an integrated view across major stages of enterprise-wide information-system development: information strategy planning, business area analysis, system design, and construction. In the main part of this paper, two comprehensive I-CASE tools, ADW (Applicati...

  19. Integration Processes of Delay Differential Equation Based on Modified Laguerre Functions

    Directory of Open Access Journals (Sweden)

    Yeguo Sun

    2012-01-01

    Full Text Available We propose long-time convergent numerical integration processes for delay differential equations. We first construct an integration process based on modified Laguerre functions. Then we establish its global convergence in certain weighted Sobolev space. The proposed numerical integration processes can also be used for systems of delay differential equations. We also developed a technique for refinement of modified Laguerre-Radau interpolations. Lastly, numerical results demonstrate the spectral accuracy of the proposed method and coincide well with analysis.

  20. Rapid and Green Analytical Method for the Determination of Quinoline Alkaloids from Cinchona succirubra Based on Microwave-Integrated Extraction and Leaching (MIEL) Prior to High Performance Liquid Chromatography

    Directory of Open Access Journals (Sweden)

    Farid Chemat

    2011-11-01

    Full Text Available Quinas contains several compounds, such as quinoline alkaloids, principally quinine, quinidine, cinchonine and cinchonidine. Identified from barks of Cinchona, quinine is still commonly used to treat human malaria. Microwave-Integrated Extraction and Leaching (MIEL) is proposed for the extraction of quinoline alkaloids from bark of Cinchona succirubra. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. Optimal conditions for extraction were obtained using a response surface methodology reached from a central composite design. The MIEL extraction has been compared with a conventional technique, Soxhlet extraction. The extracts of quinoline alkaloids from C. succirubra obtained by these two different methods were compared by HPLC. The extracts obtained by MIEL in 32 min were quantitatively (yield) and qualitatively (quinine, quinidine, cinchonine, cinchonidine) similar to those obtained by conventional Soxhlet extraction in 3 hours. MIEL is a green technology that serves as a good alternative for the extraction of Cinchona alkaloids.

  1. Rapid and green analytical method for the determination of quinoline alkaloids from Cinchona succirubra based on Microwave-Integrated Extraction and Leaching (MIEL) prior to high performance liquid chromatography.

    Science.gov (United States)

    Fabiano-Tixier, Anne-Sylvie; Elomri, Abdelhakim; Blanckaert, Axelle; Seguin, Elisabeth; Petitcolas, Emmanuel; Chemat, Farid

    2011-01-01

    Quinas contains several compounds, such as quinoline alkaloids, principally quinine, quinidine, cinchonine and cinchonidine. Identified from barks of Cinchona, quinine is still commonly used to treat human malaria. Microwave-Integrated Extraction and Leaching (MIEL) is proposed for the extraction of quinoline alkaloids from bark of Cinchona succirubra. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. Optimal conditions for extraction were obtained using a response surface methodology reached from a central composite design. The MIEL extraction has been compared with a conventional technique, Soxhlet extraction. The extracts of quinoline alkaloids from C. succirubra obtained by these two different methods were compared by HPLC. The extracts obtained by MIEL in 32 min were quantitatively (yield) and qualitatively (quinine, quinidine, cinchonine, cinchonidine) similar to those obtained by conventional Soxhlet extraction in 3 hours. MIEL is a green technology that serves as a good alternative for the extraction of Cinchona alkaloids.

  2. Apparatus and method for defect testing of integrated circuits

    Science.gov (United States)

    Cole, Jr., Edward I.; Soden, Jerry M.

    2000-01-01

    An apparatus and method for defect and failure-mechanism testing of integrated circuits (ICs) is disclosed. The apparatus provides an operating voltage, V_DD, to an IC under test and measures a transient voltage component, V_DDT, signal that is produced in response to switching transients that occur as test vectors are provided as inputs to the IC. The amplitude or time delay of the V_DDT signal can be used to distinguish between defective and defect-free (i.e. known good) ICs. The V_DDT signal is measured with a transient digitizer, a digital oscilloscope, or with an IC tester that is also used to input the test vectors to the IC. The present invention has applications for IC process development, for the testing of ICs during manufacture, and for qualifying ICs for reliability.

  3. ARE METHODS USED TO INTEGRATE STANDARDIZED MANAGEMENT SYSTEMS A CONDITIONING FACTOR OF THE LEVEL OF INTEGRATION? AN EMPIRICAL STUDY

    Directory of Open Access Journals (Sweden)

    Merce Bernardo

    2011-09-01

    Full Text Available Organizations are increasingly implementing multiple Management System Standards (MSSs) and considering managing the related Management Systems (MSs) as a single system. The aim of this paper is to analyze whether the methods used to integrate standardized MSs condition the level of integration of those MSs. A descriptive methodology has been applied to 343 Spanish organizations registered to, at least, ISO 9001 and ISO 14001. Seven groups of these organizations using different combinations of methods have been analyzed. Results show that these organizations have a high level of integration of their MSs. The most common method used was the process map. Organizations using a combination of different methods achieve higher levels of integration than those using a single method. However, no evidence has been found to confirm a relationship between the method used and the integration level achieved.

  4. Ultrafast method of calculating the dynamic spectral line shapes for integrated modelling of plasmas

    International Nuclear Information System (INIS)

    Lisitsa, V.S.

    2009-01-01

    An ultrafast code for spectral line shape calculations is presented to be used in the integrated modelling of plasmas. The code is based on the close analogy between two mechanisms: (i) Dicke narrowing of the Doppler-broadened spectral lines and (ii) transition from static to impact regime in the Stark broadening. The analogy makes it possible to describe the dynamic Stark broadening in terms of an analytical functional of the static line shape. A comparison of new method with the widely used Frequency Fluctuating Method (FFM) developed by the Marseille University group (B. Talin, R. Stamm, et al.) shows good agreement, with the new method being faster than the standard FFM by nearly two orders of magnitude. The method proposed may significantly simplify the radiation transport modeling and opens new possibilities for integrated modeling of the edge and divertor plasma in tokamaks. (author)

  5. Null Space Integration Method for Constrained Multibody Systems with No Constraint Violation

    International Nuclear Information System (INIS)

    Terze, Zdravko; Lefeber, Dirk; Muftic, Osman

    2001-01-01

    A method for integrating the equations of motion of constrained multibody systems with no constraint violation is presented. A mathematical model, shaped as a differential-algebraic system of index 1, is transformed into a system of ordinary differential equations using the null-space projection method. Equations of motion are set in a non-minimal form. During integration, violations of constraints are corrected by solving the constraint equations at the position and velocity level, utilizing the metric of the system's configuration space and a projective criterion for the coordinate partitioning method. The method is applied to the dynamic simulation of a 3D constrained biomechanical system. The simulation results are evaluated by comparing them to the values of characteristic parameters obtained by kinematic analysis of the analyzed motion based on measured kinematics data.

  6. Improving Allergen Prediction in Main Crops Using a Weighted Integrative Method.

    Science.gov (United States)

    Li, Jing; Wang, Jing; Li, Jing

    2017-12-01

    As a public health problem, food allergy is frequently caused by food allergen proteins, which trigger a type-I hypersensitivity reaction in the immune system of atopic individuals. The food allergens in our daily lives come mainly from crops including rice, wheat, soybean and maize. However, allergens in these main crops are far from fully uncovered. Although some bioinformatics tools or methods for predicting the potential allergenicity of proteins have been proposed, each method has its limitations. In this paper, we built a novel algorithm, PREAL_W, which integrates PREAL, the FAO/WHO criteria and a motif-based method by a weighted average score, to combine the advantages of the different methods. Our results illustrate that PREAL_W performs significantly better in the crops' allergen prediction. This integrative allergen prediction algorithm could be useful for critical food safety matters. PREAL_W can be accessed at http://lilab.life.sjtu.edu.cn:8080/prealw .
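
    The sketch below only illustrates the style of a weighted integrative score combining three component predictors; the component scores, weights and decision threshold are invented and are not the values used by PREAL_W.

```python
# Sketch of a weighted integrative score in the spirit of PREAL_W: three
# component predictors are combined by a weighted average. Scores, weights
# and the decision threshold are invented, not the PREAL_W values.

def weighted_allergenicity(scores, weights):
    """scores/weights: dicts keyed by method name; scores normalized to [0, 1]."""
    total_weight = sum(weights.values())
    return sum(weights[m] * scores[m] for m in scores) / total_weight

protein_scores = {
    "PREAL": 0.82,            # machine-learning score (hypothetical)
    "FAO_WHO": 1.00,          # 1 if the FAO/WHO sequence-identity rule fires, else 0
    "motif": 0.65,            # motif-based similarity score (hypothetical)
}
weights = {"PREAL": 0.5, "FAO_WHO": 0.3, "motif": 0.2}

score = weighted_allergenicity(protein_scores, weights)
print(f"integrated score = {score:.3f} -> predicted allergen: {score >= 0.5}")
```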

  7. Extending product modeling methods for integrated product development

    DEFF Research Database (Denmark)

    Bonev, Martin; Wörösch, Michael; Hauksdóttir, Dagný

    2013-01-01

    Despite great efforts within the modeling domain, the majority of methods often address the uncommon design situation of an original product development. However, studies illustrate that development tasks are predominantly related to redesigning, improving, and extending already existing products...... and PVM methods, in a presented Product Requirement Development model some of the individual drawbacks of each method could be overcome. Based on the UML standard, the model enables the representation of complex hierarchical relationships in a generic product model. At the same time it uses matrix....... Updated design requirements have then to be made explicit and mapped against the existing product architecture. In this paper, existing methods are adapted and extended through linking updated requirements to suitable product models. By combining several established modeling techniques, such as the DSM...

  8. Systems and methods for switched-inductor integrated voltage regulators

    Science.gov (United States)

    Shepard, Kenneth L.; Sturcken, Noah Andrew

    2017-12-12

    Power controller includes an output terminal having an output voltage, at least one clock generator to generate a plurality of clock signals and a plurality of hardware phases. Each hardware phase is coupled to the at least one clock generator and the output terminal and includes a comparator. Each hardware phase is configured to receive a corresponding one of the plurality of clock signals and a reference voltage, combine the corresponding clock signal and the reference voltage to produce a reference input, generate a feedback voltage based on the output voltage, compare the reference input and the feedback voltage using the comparator and provide a comparator output to the output terminal, whereby the comparator output determines a duty cycle of the power controller. An integrated circuit including the power controller is also provided.

  9. Challenges and promises of integrating knowledge engineering and qualitative methods

    Science.gov (United States)

    Lundberg, C. Gustav; Holm, Gunilla

    Our goal is to expose some of the close ties that exist between knowledge engineering (KE) and qualitative methodology (QM). Many key concepts of qualitative research, for example meaning, commonsense, understanding, and everyday life, overlap with central research concerns in artificial intelligence. These shared interests constitute a largely unexplored avenue for interdisciplinary cooperation. We compare and take some steps toward integrating two historically diverse methodologies by exploring the commonalities of KE and QM both from a substantive and a methodological/technical perspective. In the second part of this essay, we address knowledge acquisition problems and procedures. Knowledge acquisition within KE has been based primarily on cognitive psychology/science foundations, whereas knowledge acquisition within QM has a broader foundation in phenomenology, symbolic interactionism, and ethnomethodology. Our discussion and examples are interdisciplinary in nature. We do not suggest that there is a clash between the KE and QM frameworks, but rather that the lack of communication potentially may limit each framework's future development.

  10. Integrated crop protection and environment exposure to pesticides: methods to reduce use and impact of pesticides in arable farming

    NARCIS (Netherlands)

    Wijnands, F.G.

    1997-01-01

    Prototypes of Integrated Farming Systems for arable farming are being developed in the Netherlands based on a coherent methodology elaborated in an European Union concerted action. The role of crop protection in Integrated systems is, additional to all other methods, to efficiently control the

  11. Path integral methods via the use of the central limit theorem and application

    International Nuclear Information System (INIS)

    Thrapsaniotis, E G

    2008-01-01

    We consider a path integral in phase space, possibly with an influence functional in it, and we use a method based on applying the central limit theorem to the phase of the path integral representation to extract an equivalent expression which can be used in numerical calculations. Moreover, we give conditions under which we can extract closed analytical results. As a specific application we consider a general system of two coupled and forced harmonic oscillators with coupling of the form x₁x₂^α, and we derive the relevant sign-solved propagator.

  12. Detailed comparison between decay heat data calculated by the summation method and integral measurements

    International Nuclear Information System (INIS)

    Rudstam, G.

    1979-01-01

    The fission product library FPLIB has been used for a calculation of the decay heat effect in nuclear fuel. The results are compared with integral determinations and with results obtained using the ENDF/B-IV database. For the beta part, and also for the total decay heat, the FPLIB data seem to be superior to the ENDF/B-IV data. The experimental integral data are in many cases reproduced within the combined limits of error of the methods. (author)
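
    The summation method itself reduces to adding, over all fission products, the activity times the mean energy released per decay. The sketch below shows that sum for a tiny invented inventory; the nuclides, inventories and energies are placeholders, not FPLIB or ENDF/B data.

```python
import numpy as np

# Minimal sketch of the summation method: after shutdown the decay heat is
# P(t) = sum_i lambda_i * N_i(t) * E_i, summed over fission products, where
# E_i is the mean (beta + gamma) energy released per decay.  The tiny set of
# nuclides and the numbers below are illustrative, not data from FPLIB or ENDF/B.

MEV_TO_J = 1.602176634e-13

# nuclide: (decay constant [1/s], inventory at shutdown [atoms], mean energy [MeV/decay])
inventory = {
    "FP-1": (1.0e-2, 5.0e20, 0.8),
    "FP-2": (5.0e-4, 2.0e21, 1.2),
    "FP-3": (1.0e-5, 8.0e21, 0.5),
}

def decay_heat(t):
    """Decay heat [W] at time t [s] after shutdown, ignoring precursor feeding."""
    return sum(lam * N0 * np.exp(-lam * t) * E * MEV_TO_J
               for lam, N0, E in inventory.values())

for t in (1.0, 10.0, 100.0, 1000.0):
    print(f"t = {t:7.1f} s   P = {decay_heat(t):.3e} W")
```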

  13. Development of integrated cask body and base plate

    International Nuclear Information System (INIS)

    Sasaki, T.; Koyama, Y.; Yoshida, T.; Wada, T.

    2015-01-01

    The average occupancy of spent-fuel storage at nuclear power plants has reached 70 percent, and the demand for metal casks for the storage and transportation of spent fuel is anticipated to rise after operations resume. The main parts of a metal cask are the main body, the neutron shield and the external cylinder. We have developed a manufacturing technology for an Integrated Cask Body and Base Plate, in which the cask body and base plate are produced as a monolithic forging, with the goals of cost reduction, a shorter manufacturing period and further improvement of reliability. Here, we report the manufacturing technology, code compliance and obtained properties of the Integrated Cask Body and Base Plate. (author)

  14. Model-Based Integration and Interpretation of Data

    DEFF Research Database (Denmark)

    Petersen, Johannes

    2004-01-01

    Data integration and interpretation plays a crucial role in supervisory control. The paper defines a set of generic inference steps for the data integration and interpretation process based on a three-layer model of system representations. The three-layer model is used to clarify the combination...... of constraint and object-centered representations of the work domain throwing new light on the basic principles underlying the data integration and interpretation process of Rasmussen's abstraction hierarchy as well as other model-based approaches combining constraint and object-centered representations. Based...

  15. Methods of assessing total doses integrated across pathways

    International Nuclear Information System (INIS)

    Grzechnik, M.; Camplin, W.; Clyne, F.; Allott, R.; Webbe-Wood, D.

    2006-01-01

    Calculated doses for comparison with limits resulting from discharges into the environment should be summed across all relevant pathways and food groups to ensure adequate protection. The current methodology for assessments used in the Radioactivity in Food and the Environment (RIFE) reports separates doses from pathways related to liquid discharges of radioactivity to the environment from those due to gaseous releases. Surveys of local inhabitants' food consumption and occupancy rates are conducted in the vicinity of nuclear sites. The information is recorded in an integrated way, such that the data for each individual are recorded for all pathways of interest. These can include consumption of foods such as fish, crustaceans, molluscs, fruit and vegetables, milk and meats. Occupancy times over beach sediments and time spent in close proximity to the site are also recorded, for inclusion of the external and inhalation dose pathways. The integrated habits survey data may be combined with monitored environmental radionuclide concentrations to calculate total dose. The criteria for successful adoption of a method for this calculation were: Reproducibility (can others easily use the approach and reassess doses?); Rigour and realism (how good is the match with reality?); Transparency (a measure of the ease with which others can understand how the calculations are performed and what they mean); Homogeneity (is the group receiving the dose relatively homogeneous with respect to age, diet and those aspects that affect the dose received?). Five methods of total dose calculation were compared and ranked according to their suitability. Each method was labelled (A to E) and given a short, relevant name for identification. The methods are described below. A) Individual: doses to individuals are calculated and critical group selection depends on the dose received. B) Individual Plus: as in A, but consumption and occupancy rates for high doses are used to derive rates for application in
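
    A minimal sketch of the underlying arithmetic of a total-dose calculation for one surveyed individual is given below: ingestion pathways contribute consumption rate times concentration times dose coefficient, and external pathways contribute occupancy time times dose rate. All numbers are illustrative placeholders, not RIFE assessment data.

```python
# Minimal sketch of summing a total dose across pathways for one individual
# from an integrated habits survey.  All rates, concentrations and dose
# coefficients below are illustrative placeholders, not RIFE assessment data.

# ingestion pathway: consumption [kg/y] * concentration [Bq/kg] * coefficient [Sv/Bq]
ingestion = {
    "fish":     (30.0, 120.0, 1.3e-8),
    "molluscs": ( 5.0, 400.0, 1.3e-8),
    "milk":     (90.0,   2.0, 1.3e-8),
}
# external pathway: occupancy [h/y] * dose rate over sediment [Sv/h]
external = {
    "beach sediment": (200.0, 5.0e-8),
}

def total_dose_sv_per_year():
    dose = sum(c * conc * coeff for c, conc, coeff in ingestion.values())
    dose += sum(hours * rate for hours, rate in external.values())
    return dose

print(f"total individual dose: {total_dose_sv_per_year() * 1e3:.3f} mSv/y")
```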

  16. Skill-based immigration, economic integration, and economic performance

    OpenAIRE

    Aydemir, Abdurrahman

    2014-01-01

    Studies for major immigrant-receiving countries provide evidence on the comparative economic performance of immigrant classes (skill-, kinship-, and humanitarian-based). Developed countries are increasingly competing for high-skilled immigrants, who perform better in the labor market. However, there are serious challenges to their economic integration, which highlights a need for complementary immigration and integration policies.

  17. Integrating knowledge based functionality in commercial hospital information systems.

    Science.gov (United States)

    Müller, M L; Ganslandt, T; Eich, H P; Lang, K; Ohmann, C; Prokosch, H U

    2000-01-01

    Successful integration of knowledge-based functions in the electronic patient record depends on direct and context-sensitive accessibility and availability to clinicians and must suit their workflow. In this paper we describe an exemplary integration of an existing standalone scoring system for acute abdominal pain into two different commercial hospital information systems using Java/CORBA technology.

  18. Integrated lecturing within clerkship course, a new learning method in nurse-anesthesia teaching

    Directory of Open Access Journals (Sweden)

    Mahmood Akhlaghi

    2015-06-01

    Full Text Available Background and purpose: Traditional lecture-based teaching has long been used to transmit theoretical knowledge to participants. Because of some problems with this didactic approach, some believe that integration within an active method is more valuable in nursing education. In this study, we hypothesized that integrating lecture-based teaching within a clerkship course would enhance nurse-anesthesia students' knowledge. Methods: A prospective randomized study was conducted. Twenty-four students of a two-year nurse-anesthesia program participated in the study. All of the students received either didactic lectures or lectures integrated within the clerkship course during a four-month semester of their educational curriculum. Their knowledge of the anesthesia course was assessed at the end of the course using the Wilcoxon rank test. Results: The integrated method improved students' final scores at the end of the semester (p=0.004). Moreover, their scores were much better when taxonomy-2 questions were compared (p=0.001). Conclusion: Incorporating didactic lectures within the anesthesia clerkship course improves participants' knowledge of the anesthesia course. Keywords: Anesthesia, Lecture, Knowledge, Anesthesia course, Clerkship course

  19. Application of Nemerow Index Method and Integrated Water Quality Index Method in Water Quality Assessment of Zhangze Reservoir

    Science.gov (United States)

    Zhang, Qian; Feng, Minquan; Hao, Xiaoyan

    2018-03-01

    [Objective] Based on historical water quality data from the Zhangze Reservoir over the last five years, the water quality was assessed by the integrated water quality identification index method and the Nemerow pollution index method. The results of the different evaluation methods were analyzed and compared, and the characteristics of each method were identified. [Methods] The suitability of the water quality assessment methods was compared and analyzed based on these results. [Results] The water quality tended to decrease over time, with 2016 being the year with the worst water quality. The sections with the worst water quality were the southern and northern sections. [Conclusion] The results produced by the traditional Nemerow index method fluctuated greatly across the water quality monitoring sections and therefore could not effectively reveal the trend of water quality at each section. The combination of qualitative and quantitative measures in the comprehensive pollution index identification method meant it could evaluate the degree of water pollution as well as determine whether the river water was black and odorous; however, its evaluation results indicated relatively low water pollution. The results from the improved Nemerow index evaluation were better, as the single indicators and evaluation results are in strong agreement; therefore the method is able to objectively reflect the water quality of each monitoring section and is more suitable for the water quality evaluation of the reservoir.
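
    For reference, the Nemerow pollution index for one monitoring section is computed from single-factor indices P_i = C_i/S_i (measured concentration over the standard limit), combined through their mean and maximum. The sketch below shows the calculation with invented concentrations and limits; it is not the improved variant discussed in the paper.

```python
import numpy as np

# Sketch of the Nemerow pollution index for one monitoring section.  The
# single-factor index is P_i = C_i / S_i (measured concentration over the
# standard limit); the Nemerow index combines the mean and the maximum:
#     P_N = sqrt((P_mean^2 + P_max^2) / 2).
# Concentrations and standard limits below are illustrative, not Zhangze data.

measured = {"COD": 28.0, "NH3-N": 1.8, "TP": 0.25}   # mg/L, assumed values
standard = {"COD": 20.0, "NH3-N": 1.0, "TP": 0.20}   # assumed standard limits, mg/L

def nemerow_index(c, s):
    p = np.array([c[k] / s[k] for k in c])
    return float(np.sqrt((p.mean() ** 2 + p.max() ** 2) / 2.0))

print(f"Nemerow index P_N = {nemerow_index(measured, standard):.2f}")
```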

  20. A study of non destructive integrity assessment method for structural materials of nuclear reactor. Part 2

    International Nuclear Information System (INIS)

    Totsuka, Nobuo; Matsuzaki, Akihiro

    2011-01-01

    Hardness measurement is one of the most effective ways of performing non-destructive integrity assessment of the structural materials of nuclear power plants before and after an earthquake. An actual evaluation method using a portable hardness tester, and its effectiveness, were reported in the previous journal. In this study, we report the development of a method that can evaluate more accurately the amount of plastic deformation of the material caused by an earthquake, based on experimental results on the hardness change of the material, considering the thermal aging due to plant operation and the cyclic deformation suffered during an earthquake. (author)

  1. Integration of image exposure time into a modified laser speckle imaging method

    Energy Technology Data Exchange (ETDEWEB)

    Ramírez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J [Optics Department, INAOE, Puebla (Mexico); Huang, Y C [Department of Electrical Engineering and Computer Science, University of California, Irvine, CA (United States); Choi, B, E-mail: jcram@inaoep.m [Beckman Laser Institute and Medical Clinic, University of California, Irvine, CA (United States)

    2010-11-21

    Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.

  2. Integration of image exposure time into a modified laser speckle imaging method

    International Nuclear Information System (INIS)

    Ramírez-San-Juan, J C; Salazar-Hermenegildo, N; Ramos-Garcia, R; Munoz-Lopez, J; Huang, Y C; Choi, B

    2010-01-01

    Speckle-based methods have been developed to characterize tissue blood flow and perfusion. One such method, called modified laser speckle imaging (mLSI), enables computation of blood flow maps with relatively high spatial resolution. Although it is known that the sensitivity and noise in LSI measurements depend on image exposure time, a fundamental disadvantage of mLSI is that it does not take into account this parameter. In this work, we integrate the exposure time into the mLSI method and provide experimental support of our approach with measurements from an in vitro flow phantom.
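
    The following sketch is not the authors' mLSI algorithm; it only illustrates how an exposure time T can enter a speckle analysis explicitly: the local speckle contrast K = sigma/<I> is computed over a sliding window, and the commonly used single-exposure model K^2 = (tau_c/2T)[1 - exp(-2T/tau_c)] is inverted for the correlation time. The synthetic frame and the exposure time are assumptions.

```python
import numpy as np
from scipy.optimize import brentq

# Hedged sketch (not the authors' mLSI algorithm): compute the local speckle
# contrast K = sigma / <I> in a sliding window and invert the commonly used
# single-exposure model  K^2 = (tau_c / (2*T)) * (1 - exp(-2*T / tau_c))
# for the correlation time tau_c, so that the exposure time T enters explicitly.

def local_contrast(img, win=7):
    """Speckle contrast map using a simple box window of size win x win."""
    img = img.astype(float)
    pad = win // 2
    out = np.zeros_like(img)
    for i in range(pad, img.shape[0] - pad):
        for j in range(pad, img.shape[1] - pad):
            w = img[i - pad:i + pad + 1, j - pad:j + pad + 1]
            out[i, j] = w.std() / (w.mean() + 1e-12)
    return out

def correlation_time(K, T):
    """Invert K^2(tau_c; T) numerically for tau_c (seconds)."""
    f = lambda tau: (tau / (2 * T)) * (1 - np.exp(-2 * T / tau)) - K ** 2
    return brentq(f, 1e-9, 10.0)

rng = np.random.default_rng(0)
frame = rng.gamma(shape=4.0, scale=50.0, size=(64, 64))   # synthetic speckle-like frame
T = 5e-3                                                  # 5 ms exposure (assumed)
K_map = local_contrast(frame)
K = float(np.median(K_map[3:-3, 3:-3]))
print(f"median contrast K = {K:.3f},  estimated tau_c = {correlation_time(K, T):.2e} s")
```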

  3. Two-dimensional parasitic capacitance extraction for integrated circuit with dual discrete geometric methods

    International Nuclear Information System (INIS)

    Ren Dan; Ren Zhuoxiang; Qu Hui; Xu Xiaoyu

    2015-01-01

    Capacitance extraction is one of the key issues in integrated circuits and is also a typical electrostatic problem. Dual discrete geometric methods (DGMs) are investigated to provide solutions in two-dimensional unstructured mesh space. Their energy-complementary characteristic and the fast field-energy computation based on it are emphasized. A contrastive analysis between the dual finite-element methods and the dual DGMs is presented, both from theoretical derivation and through case studies. The DGM, which takes the scalar potential as the unknown on dual interlocked meshes and has a simple form and good accuracy, is expected to become one of the mainstream methods in the associated areas. (paper)
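
    The energy route to capacitance alluded to above can be sketched independently of any geometric-method machinery: solve Laplace's equation for the potential, compute the field energy W, and use C = 2W/V^2. The plain finite-difference relaxation below (not a DGM, and with an invented plate geometry) shows the idea for a 2-D parallel-plate cross-section per unit depth.

```python
import numpy as np

# Hedged sketch (plain finite differences, not a discrete geometric method):
# capacitance per unit depth of a 2-D cross-section obtained from the field
# energy,  C = 2 W / V^2  with  W = (eps0/2) * integral |grad phi|^2 dA,
# i.e. the capacitance of the driven plate against all grounded conductors.

eps0 = 8.8541878128e-12
n, h, V = 81, 1e-3 / 80, 1.0          # 81 x 81 grid over a 1 mm x 1 mm box

phi = np.zeros((n, n))
plate_lo, plate_hi = 30, 50            # electrodes: two horizontal segments
phi[plate_hi, 20:60] = V               # top plate at 1 V, bottom plate stays at 0 V
fixed = np.zeros((n, n), bool)
fixed[plate_lo, 20:60] = fixed[plate_hi, 20:60] = True
fixed[0, :] = fixed[-1, :] = fixed[:, 0] = fixed[:, -1] = True  # grounded box

for _ in range(5000):                  # Jacobi relaxation; enough for a rough answer
    new = 0.25 * (np.roll(phi, 1, 0) + np.roll(phi, -1, 0)
                  + np.roll(phi, 1, 1) + np.roll(phi, -1, 1))
    phi = np.where(fixed, phi, new)

gy, gx = np.gradient(phi, h)
W = 0.5 * eps0 * np.sum(gx**2 + gy**2) * h * h      # energy per unit depth [J/m]
print(f"C = 2W/V^2 = {2 * W / V**2 * 1e12:.2f} pF per metre depth")
```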

  4. Runge-Kutta Integration of the Equal Width Wave Equation Using the Method of Lines

    Directory of Open Access Journals (Sweden)

    M. A. Banaja

    2015-01-01

    Full Text Available The equal width (EW) equation governs nonlinear wave phenomena such as waves in shallow water. A numerical solution of the EW equation is obtained using the method of lines (MOL) based on Runge-Kutta integration. Using von Neumann stability analysis, the scheme is found to be unconditionally stable. Solitary wave motion and the interaction of two solitary waves are studied using the proposed method. The three invariants of the motion are evaluated to determine the conservation properties of the generated scheme. The accuracy of the proposed method is discussed by computing the L2 and L∞ error norms. The results are found to be in good agreement with the exact solution.
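
    A minimal method-of-lines sketch for the EW equation u_t + u u_x - mu u_xxt = 0 is shown below: the equation is rewritten as (I - mu D2) u_t = -u u_x on a periodic grid and advanced with an explicit Runge-Kutta integrator (scipy's RK45 here, rather than the paper's specific scheme). Grid size, mu and the sech^2 initial pulse are illustrative choices, and the two printed functionals are the commonly quoted mass- and energy-like invariants.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal method-of-lines sketch for the equal width (EW) equation
#   u_t + u*u_x - mu*u_xxt = 0,
# rewritten as  (I - mu*D2) u_t = -u*u_x  on a periodic grid, with D2 the
# second-difference matrix.  Time integration uses an explicit Runge-Kutta
# method (scipy's RK45).  Grid size, mu and the sech^2 initial pulse are
# illustrative choices, not the paper's test cases.

mu, L, N = 1.0, 60.0, 256
x = np.linspace(0.0, L, N, endpoint=False)
dx = x[1] - x[0]

# periodic second-difference operator D2 and the operator (I - mu*D2)
D2 = (np.diag(np.full(N - 1, 1.0), 1) + np.diag(np.full(N - 1, 1.0), -1)
      - 2.0 * np.eye(N))
D2[0, -1] = D2[-1, 0] = 1.0
D2 /= dx ** 2
A_inv = np.linalg.inv(np.eye(N) - mu * D2)   # small N, so a dense inverse is fine

def rhs(t, u):
    ux = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)   # periodic central difference
    return A_inv @ (-u * ux)

u0 = 0.75 / np.cosh(0.5 * (x - 20.0)) ** 2               # sech^2-shaped pulse
sol = solve_ivp(rhs, (0.0, 20.0), u0, method="RK45", rtol=1e-6, atol=1e-8)

u_end = sol.y[:, -1]
ux_end = (np.roll(u_end, -1) - np.roll(u_end, 1)) / (2.0 * dx)
I1 = np.sum(u_end) * dx                                   # mass invariant
I2 = np.sum(u_end ** 2 + mu * ux_end ** 2) * dx           # energy-like invariant
print(f"I1 = {I1:.6f},  I2 = {I2:.6f} at t = {sol.t[-1]:.1f}")
```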

  5. Smartphone-based integrated PDR/GPS/Bluetooth pedestrian location

    Science.gov (United States)

    Li, Xianghong; Wei, Dongyan; Lai, Qifeng; Xu, Ying; Yuan, Hong

    2017-02-01

    The typical indoor location method is fingerprinting, and the traditional outdoor location system is GPS. Both have limited accuracy and are restricted to either indoor or outdoor environments. Since smartphones are equipped with MEMS sensors, pedestrian dead reckoning (PDR) can be widely used. In this paper, a smartphone-based integrated PDR/GPS/Bluetooth algorithm for pedestrian location in indoor and outdoor environments is proposed, which is expected to realize seamless indoor/outdoor localization of the pedestrian. In addition, we provide techniques to estimate orientation with the magnetometer and gyroscope and to detect context from the sensor outputs. Extensive experimental results show that the proposed algorithm can realize seamless indoor/outdoor localization.
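
    The PDR component of such a system can be reduced to a very small core: detect steps from accelerometer-magnitude peaks and advance the position by a step length along the current heading. The sketch below does only that on synthetic data; the threshold, step length and headings are assumptions, and the fusion with GPS/Bluetooth fixes (e.g. in a Kalman filter) is omitted.

```python
import numpy as np

# Minimal pedestrian dead-reckoning (PDR) sketch: detect steps from peaks of
# the accelerometer magnitude and advance the position by a fixed step length
# along the current heading.  The threshold, step length and synthetic data
# are assumptions for illustration; fusing the PDR track with GPS/Bluetooth
# fixes is omitted here.

rng = np.random.default_rng(1)
fs, n_steps = 50, 20                              # 50 Hz, 20 simulated steps
t = np.arange(0, n_steps, 1 / fs)
acc_mag = 9.81 + 2.0 * np.maximum(np.sin(2 * np.pi * 1.0 * t), 0.0) \
          + 0.2 * rng.standard_normal(t.size)     # roughly one step per second
heading = np.deg2rad(np.where(t < 10, 0.0, 90.0)) # north for 10 s, then east; pos = [E, N]

STEP_LEN, THRESH = 0.7, 11.0                      # metres, m/s^2 (assumed)
pos = np.zeros(2)
last_peak = -fs
track = [pos.copy()]
for k in range(1, acc_mag.size - 1):
    is_peak = (acc_mag[k] > THRESH and acc_mag[k] >= acc_mag[k - 1]
               and acc_mag[k] >= acc_mag[k + 1])
    if is_peak and k - last_peak > fs // 2:       # ~0.5 s refractory between steps
        pos = pos + STEP_LEN * np.array([np.sin(heading[k]), np.cos(heading[k])])
        track.append(pos.copy())
        last_peak = k

print(f"detected {len(track) - 1} steps, final position (E, N) = "
      f"({pos[0]:.1f}, {pos[1]:.1f}) m")
```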

  6. Undergraduate medical student's perception about an integrated method of teaching at a medical school in Oman

    Directory of Open Access Journals (Sweden)

    Harshal Sabane

    2015-01-01

    Full Text Available Objective: In recent years, there has been a gradual but definitive shift in medical schools all over the globe to promote a more integrated way of teaching. Integration of medical disciplines promotes a holistic understanding of the medical curriculum in students and helps them better understand and appreciate the importance and role of each medical subject. Method: The study was conducted among the 5th year pre-clinical students. The questionnaire consisted of 4 questions on the level of integration, 5 questions on various aspects of the assessment, and some questions which tested the level of awareness of the integrated method. Result: Out of a total of 72 students present on the day of data collection, 65 participated in the study, giving a response rate of 90.27%. After primary data cleansing, 4 questionnaires had to be omitted. Most of the students rated the questions on integration and its attributes as "good" or "very good". Only 27 (44%) were aware of an integrated curriculum being taught in other medical schools in the Gulf. Similar findings were observed for the assessment-related questions. A reduction in the number of block exams is unpopular among the students, and only 6% agreed to 3, 4, or 5 non-summative block assessments. Opinion regarding the usefulness of integrated teaching for the IFOM-based OMSB entrance examination was mixed, with greater variance in the responses. 43% of students indicated that they would like to spend more time on PDCI. Conclusion: The students of our institution seem to have a favourable opinion of the integrated system of teaching. Satisfaction with the conduct of examinations and its related variables was found to be high. A reduction in the number of block exams, however, is unpopular among the target group, and they would appreciate a greater time allocation for PDCI and Pharmacology.

  7. Integration of Evidence Base into a Probabilistic Risk Assessment

    Science.gov (United States)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data are the most desirable input for the Integrated Medical Model. Non-attributable inflight data are collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data are unavailable, cohort studies, other models and Bayesian analyses are used, in addition to subject matter experts' input on occasion. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1-5, where the highest level is one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands the capabilities of the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for IMM. The IMM Database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM Database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list

  8. Nucleic Acid-based Detection of Bacterial Pathogens Using Integrated Microfluidic Platform Systems

    Directory of Open Access Journals (Sweden)

    Carl A. Batt

    2009-05-01

    Full Text Available The advent of nucleic acid-based pathogen detection methods offers increased sensitivity and specificity over traditional microbiological techniques, driving the development of portable, integrated biosensors. The miniaturization and automation of integrated detection systems presents a significant advantage for rapid, portable field-based testing. In this review, we highlight current developments and directions in nucleic acid-based micro total analysis systems for the detection of bacterial pathogens. Recent progress in the miniaturization of microfluidic processing steps for cell capture, DNA extraction and purification, polymerase chain reaction, and product detection are detailed. Discussions include strategies and challenges for implementation of an integrated portable platform.

  9. Simulation electromagnetic scattering on bodies through integral equation and neural networks methods

    Science.gov (United States)

    Lvovich, I. Ya; Preobrazhenskiy, A. P.; Choporov, O. N.

    2018-05-01

    The paper deals with the issue of electromagnetic scattering from a perfectly conducting diffractive body of complex shape. The scattering performance of the body is calculated using the integral equation method. A Fredholm equation of the second kind was used to calculate the electric current density. While solving the integral equation by the method of moments, the authors properly treated the kernel singularity. Piecewise constant functions were chosen as the basis functions, and the resulting equation was solved by the method of moments. Within the Kirchhoff integral approach it is possible to determine the scattered electromagnetic field from the obtained electric currents. The sector of observation angles lies in the front hemisphere of the diffractive body. To improve the characteristics of the diffractive body, the authors used a neural network. All the neurons used a log-sigmoid activation function and weighted sums as discriminant functions. The paper presents the matrix of weighting factors of the connectionist model, as well as the resulting optimized dimensions of the diffractive body. The paper also presents the basic steps of the calculation technique for diffractive bodies, based on the combination of the integral equation and neural network methods.
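
    The discretization step described above (piecewise-constant basis functions plus the method of moments) can be illustrated on a generic one-dimensional Fredholm equation of the second kind with a smooth kernel, so that the careful singularity treatment needed for the actual current-density equation does not arise. The kernel, right-hand side and lambda below are arbitrary choices.

```python
import numpy as np

# Hedged sketch of the method of moments (point collocation with piecewise-
# constant basis functions) for a generic 1-D Fredholm equation of the
# second kind,  phi(x) - lam * int_0^1 K(x, y) phi(y) dy = f(x).
# A smooth kernel is used, so no singularity treatment is shown here.

N, lam = 200, 0.5
x = (np.arange(N) + 0.5) / N          # collocation points = segment midpoints
w = 1.0 / N                           # width of each piecewise-constant segment

K = np.exp(-np.abs(x[:, None] - x[None, :]))   # smooth kernel K(x, y)
f = np.sin(np.pi * x)                          # right-hand side

A = np.eye(N) - lam * w * K            # discretised operator
phi = np.linalg.solve(A, f)

# residual check at the collocation points
residual = phi - lam * w * (K @ phi) - f
print(f"max residual = {np.abs(residual).max():.2e},  phi(0.5) ≈ {phi[N // 2]:.4f}")
```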

  10. Integrating ICT in Agriculture for Knowledge-Based Economy

    African Journals Online (AJOL)

    agriculture-based livelihoods, demands the integration of ICT knowledge with agriculture. .... (CGIAR) shows the vital role of Agricultural development in Rwanda's ... Network, Rwanda National Backbone Project, Regional Communication.

  11. Leisure market segmentation : an integrated preferences/constraints-based approach

    NARCIS (Netherlands)

    Stemerding, M.P.; Oppewal, H.; Beckers, T.A.M.; Timmermans, H.J.P.

    1996-01-01

    Traditional segmentation schemes are often based on a grouping of consumers with similar preference functions. The research steps, ultimately leading to such segmentation schemes, are typically independent. In the present article, a new integrated approach to segmentation is introduced, which

  12. A Review on Methods of Risk Adjustment and their Use in Integrated Healthcare Systems

    Science.gov (United States)

    Juhnke, Christin; Bethge, Susanne

    2016-01-01

    Introduction: Effective risk adjustment is an aspect that is given more and more weight against the background of competitive health insurance systems and vital healthcare systems. The objective of this review was to obtain an overview of existing models of risk adjustment as well as of the crucial weights used in risk adjustment. Moreover, the predictive performance of selected methods in international healthcare systems should be analysed. Theory and methods: A comprehensive, systematic literature review on methods of risk adjustment was conducted as an encompassing, interdisciplinary examination of the related disciplines. Results: In general, several distinctions can be made: in terms of risk horizons, in terms of risk factors, or in terms of the combination of indicators included. Within these, a further differentiation into three levels seems reasonable: methods based on mortality risks, methods based on morbidity risks, and those based on information on (self-reported) health status. Conclusions and discussion: The final examination of the different methods of risk adjustment showed that the methodology used to adjust risks varies. The models differ greatly in terms of the morbidity indicators they include. The findings of this review can be used in the evaluation of integrated healthcare delivery systems and can be integrated into quality- and patient-oriented reimbursement of care providers in the design of healthcare contracts. PMID:28316544

  13. A new integrated evaluation method of heavy metals pollution control during melting and sintering of MSWI fly ash.

    Science.gov (United States)

    Li, Rundong; Li, Yanlong; Yang, Tianhua; Wang, Lei; Wang, Weiyun

    2015-05-30

    Evaluations of technologies for heavy metal control mainly examine the residual and leaching rates of a single heavy metal, so the evaluation methods developed lack coordination and uniqueness and are therefore unsuitable for evaluating hazard control effects. An overall pollution toxicity index (OPTI) was established in this paper; based on this index, an integrated evaluation method for heavy metal pollution control was established. Application of this method to the melting and sintering of fly ash revealed the following results: the integrated control efficiency of the melting process was higher in all instances than that of the sintering process; the lowest integrated control efficiency of melting was 56.2%, and the highest integrated control efficiency of sintering was 46.6%. For the same technology, higher integrated control efficiencies were achieved at lower temperatures and shorter times. This study demonstrated the unification and consistency of the method. Copyright © 2015 Elsevier B.V. All rights reserved.
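
    The abstract does not give the OPTI formula, so the following is only a hypothetical illustration of a toxicity-weighted integrated index: per-metal control efficiency is taken as the reduction of the leachable fraction relative to raw fly ash, weighted by an assumed toxicity factor. All numbers are invented.

```python
# Purely hypothetical sketch: the paper does not give the OPTI formula, so
# this only illustrates the general idea of a toxicity-weighted integrated
# index.  "Control efficiency" per metal is taken as the reduction of the
# leachable fraction relative to the raw fly ash; toxicity weights and all
# numbers are invented for illustration.

toxicity_weight = {"Pb": 5.0, "Cd": 30.0, "Zn": 1.0, "Cr": 2.0}      # assumed
leach_raw =      {"Pb": 12.0, "Cd": 0.9, "Zn": 40.0, "Cr": 3.0}      # mg/L, assumed
leach_treated =  {"Pb": 2.0,  "Cd": 0.1, "Zn": 15.0, "Cr": 1.2}      # mg/L, assumed

def integrated_control_efficiency(raw, treated, weights):
    num = sum(weights[m] * (raw[m] - treated[m]) for m in raw)
    den = sum(weights[m] * raw[m] for m in raw)
    return num / den

eff = integrated_control_efficiency(leach_raw, leach_treated, toxicity_weight)
print(f"integrated control efficiency = {eff * 100:.1f} %")
```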

  14. Photometric method for determination of acidity constants through integral spectra analysis.

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-15

    An express method for the determination of acidity constants of organic acids, based on analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple and low-cost instrumentation. The optical part of the experimental setup has been optimized by excluding the monochromator device. It thus takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. The application limitations and reliability of the method have been tested for a series of organic acids of various nature. Copyright © 2015 Elsevier B.V. All rights reserved.

  15. Photometric method for determination of acidity constants through integral spectra analysis

    Science.gov (United States)

    Zevatskiy, Yuriy Eduardovich; Ruzanov, Daniil Olegovich; Samoylov, Denis Vladimirovich

    2015-04-01

    An express method for the determination of acidity constants of organic acids, based on analysis of the integral transmittance vs. pH dependence, is developed. The integral value is registered as the photocurrent of a photometric device simultaneously with potentiometric titration. The proposed method allows pKa to be obtained using only simple and low-cost instrumentation. The optical part of the experimental setup has been optimized by excluding the monochromator device. It thus takes only 10-15 min to obtain one pKa value, with an absolute error of less than 0.15 pH units. The application limitations and reliability of the method have been tested for a series of organic acids of various nature.
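
    One way to turn an integral transmittance vs. pH curve into a pKa, consistent with the description above though not necessarily the authors' exact data treatment, is to fit a Henderson-Hasselbalch-type sigmoid. The sketch below does this on synthetic data; the transmittance levels, noise and "true" pKa are assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hedged sketch: extract pKa by fitting a Henderson-Hasselbalch-type sigmoid
# to the integral transmittance vs. pH curve,
#   T(pH) = T_HA + (T_A - T_HA) / (1 + 10**(pKa - pH)),
# i.e. a weighted average of the transmittances of the protonated and
# deprotonated forms.  The synthetic "titration" data below are invented; the
# paper's actual data treatment may differ in detail.

def transmittance(pH, T_HA, T_A, pKa):
    frac_A = 1.0 / (1.0 + 10.0 ** (pKa - pH))
    return T_HA + (T_A - T_HA) * frac_A

rng = np.random.default_rng(3)
pH = np.linspace(2.0, 8.0, 25)
true = dict(T_HA=0.35, T_A=0.80, pKa=4.76)            # acetic-acid-like pKa (assumed)
data = transmittance(pH, **true) + 0.01 * rng.standard_normal(pH.size)

popt, pcov = curve_fit(transmittance, pH, data, p0=(0.3, 0.9, 5.0))
perr = np.sqrt(np.diag(pcov))
print(f"fitted pKa = {popt[2]:.2f} +/- {perr[2]:.2f}")
```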

  16. Direct integral linear least square regression method for kinetic evaluation of hepatobiliary scintigraphy

    International Nuclear Information System (INIS)

    Shuke, Noriyuki

    1991-01-01

    In hepatobiliary scintigraphy, kinetic model analysis, which provides kinetic parameters such as the hepatic extraction or excretion rate, has been performed for the quantitative evaluation of liver function. In this analysis, the unknown model parameters are usually determined using the nonlinear least squares regression method (NLS method), which requires iterative calculation and initial estimates for the unknown parameters. As a simple alternative to the NLS method, the direct integral linear least squares regression method (DILS method), which can determine model parameters by a simple calculation without initial estimates, is proposed, and its applicability to the analysis of hepatobiliary scintigraphy is tested. In order to see whether the DILS method could determine model parameters as well as the NLS method, and to determine an appropriate weight for the DILS method, simulated theoretical data based on prefixed parameters were fitted to a one-compartment model using both the DILS method with various weightings and the NLS method. The parameter values obtained were then compared with the prefixed values used for data generation. The effect of various weights on the error of the parameter estimates was examined, and the inverse of time was found to be the best weight for minimizing the error. With this weight, the DILS method gave parameter values close to those obtained by the NLS method, and both sets of parameter values were very close to the prefixed values. With appropriate weighting, the DILS method can provide reliable parameter estimates which are relatively insensitive to data noise. In conclusion, the DILS method can be used as a simple alternative to the NLS method, providing reliable parameter estimates. (author)
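
    The direct-integral idea can be made concrete for a generic one-compartment model: integrating dL/dt = k1 B(t) - k2 L(t) once gives L(t) = k1 ∫B ds - k2 ∫L ds, which is linear in (k1, k2) and can be solved by weighted linear least squares without iteration or initial guesses. The sketch below uses an assumed input curve, noise level and the 1/t weighting suggested in the abstract; the clinical model may differ in detail.

```python
import numpy as np
from scipy.integrate import solve_ivp, cumulative_trapezoid

# Hedged sketch of the direct-integral linear least-squares (DILS) idea for a
# one-compartment model.  Assume liver activity L(t) follows
#   dL/dt = k1*B(t) - k2*L(t),  L(0) = 0,
# where B(t) is the measured blood (input) curve.  Integrating once gives
#   L(t) = k1*int_0^t B ds - k2*int_0^t L ds,
# which is linear in (k1, k2) and can be solved by weighted least squares
# without iteration or initial guesses.  Model form, weights and noise level
# are illustrative; the paper's clinical model may differ in detail.

k1_true, k2_true = 0.30, 0.05
B = lambda t: np.exp(-0.10 * t)                    # assumed input function

t = np.linspace(0.0, 60.0, 121)
sol = solve_ivp(lambda s, L: k1_true * B(s) - k2_true * L, (0, 60), [0.0],
                t_eval=t, rtol=1e-9, atol=1e-12)
rng = np.random.default_rng(7)
L = sol.y[0] + 0.002 * rng.standard_normal(t.size) # noisy "measurement"

intB = cumulative_trapezoid(B(t), t, initial=0.0)
intL = cumulative_trapezoid(L, t, initial=0.0)

# weighted linear least squares, weight = 1/t (as suggested in the abstract)
m = t > 0
w = 1.0 / t[m]
X = np.column_stack([intB[m], -intL[m]]) * w[:, None]
y = L[m] * w
k1_hat, k2_hat = np.linalg.lstsq(X, y, rcond=None)[0]
print(f"k1 = {k1_hat:.3f} (true {k1_true}),  k2 = {k2_hat:.3f} (true {k2_true})")
```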

  17. Agent-Based Data Integration Framework

    Directory of Open Access Journals (Sweden)

    Łukasz Faber

    2014-01-01

    Full Text Available Combining data from diverse, heterogeneous sources while facilitating unified access to it is an important (albeit difficult) task. There are various ways of performing it. In this publication, we propose and describe an agent-based framework dedicated to acquiring and processing distributed, heterogeneous data collected from diverse sources (e.g., the Internet, external software, relational and document databases). Using this multi-agent-based approach in the aspects of the general architecture (the organization and management of the framework), we create a proof-of-concept implementation. The approach is presented using a sample scenario in which the system is used to search for personal and professional profiles of scientists.

  18. Nodal integral method for the neutron diffusion equation in cylindrical geometry

    International Nuclear Information System (INIS)

    Azmy, Y.Y.

    1987-01-01

    The nodal methodology is based on retaining a higher degree of analyticity in the process of deriving the discrete-variable equations than conventional numerical methods. As a result, extensive numerical testing of nodal methods developed for a wide variety of partial differential equations, and comparison of the results to conventional methods, have established the superior accuracy of nodal methods on coarse meshes. Moreover, these tests have shown that nodal methods are more computationally efficient than finite difference and finite-element methods in the sense that they require shorter CPU times to achieve comparable accuracy in the solutions. However, nodal formalisms and the final discrete-variable equations they produce are, in general, more complicated than their conventional counterparts. This, together with anticipated difficulties in applying the transverse-averaging procedure in curvilinear coordinates, has so far limited the applications of nodal methods to Cartesian geometry, and with additional approximations to hexagonal geometry. In this paper the authors report recent progress in deriving and numerically implementing a nodal integral method (NIM) for solving the neutron diffusion equation in cylindrical r-z geometry. Also presented are comparisons of numerical solutions to two test problems with those obtained by the Exterminator-2 code, which indicate the superior accuracy of the nodal integral method solutions on much coarser meshes

  19. Nonlinear moments method for the isotropic Boltzmann equation and the invariance of collision integral

    International Nuclear Information System (INIS)

    Ehnder, A.Ya.; Ehnder, I.A.

    1999-01-01

    A new approach to developing a nonlinear moments method for solving the Boltzmann equation is presented. This approach is based on the invariance of the collision integral with respect to the selection of the basis functions. Sonine polynomials with a Maxwellian weighting function are selected as the basis functions. It is shown that for arbitrary interaction cross sections the matrix elements corresponding to the moments of the nonlinear collision integral are related by simple recurrence relations, enabling all nonlinear matrix elements to be expressed in terms of the linear ones. As a result, a highly efficient numerical scheme for calculating the nonlinear matrix elements is obtained. The presented approach makes it possible both to calculate relaxation processes in the high-speed range and to address more complex kinetic problems

  20. Developing integrated methods to address complex resource and environmental issues

    Science.gov (United States)

    Smith, Kathleen S.; Phillips, Jeffrey D.; McCafferty, Anne E.; Clark, Roger N.

    2016-02-08

    Introduction: This circular provides an overview of selected activities that were conducted within the U.S. Geological Survey (USGS) Integrated Methods Development Project, an interdisciplinary project designed to develop new tools and conduct innovative research requiring integration of geologic, geophysical, geochemical, and remote-sensing expertise. The project was supported by the USGS Mineral Resources Program, and its products and acquired capabilities have broad applications to missions throughout the USGS and beyond. In addressing challenges associated with understanding the location, quantity, and quality of mineral resources, and in investigating the potential environmental consequences of resource development, a number of field and laboratory capabilities and interpretative methodologies evolved from the project that have applications to traditional resource studies as well as to studies related to ecosystem health, human health, disaster and hazard assessment, and planetary science. New or improved tools and research findings developed within the project have been applied to other projects and activities. Specifically, geophysical equipment and techniques have been applied to a variety of traditional and nontraditional mineral- and energy-resource studies, military applications, environmental investigations, and applied research activities that involve climate change, mapping techniques, and monitoring capabilities. Diverse applied geochemistry activities provide a process-level understanding of the mobility, chemical speciation, and bioavailability of elements, particularly metals and metalloids, in a variety of environmental settings. Imaging spectroscopy capabilities maintained and developed within the project have been applied to traditional resource studies as well as to studies related to ecosystem health, human health, disaster assessment, and planetary science. Brief descriptions of capabilities and laboratory facilities and summaries of some

  1. Content-Based Personalization Services Integrating Folksonomies

    Science.gov (United States)

    Musto, Cataldo; Narducci, Fedelucio; Lops, Pasquale; de Gemmis, Marco; Semeraro, Giovanni

    Basic content-based personalization consists in matching up the attributes of a user profile, in which preferences and interests are stored, with the attributes of a content object. The Web 2.0 (r)evolution has changed the game for personalization, from ‘elitary’ Web 1.0, written by few and read by many, to web content generated by everyone (user-generated content - UGC), since the role of people has evolved from passive consumers of information to that of active contributors.

  2. Metriplectic Gyrokinetics and Discretization Methods for the Landau Collision Integral

    Science.gov (United States)

    Hirvijoki, Eero; Burby, Joshua W.; Kraus, Michael

    2017-10-01

    We present two important results for the kinetic theory and numerical simulation of warm plasmas: 1) we provide a metriplectic formulation of collisional electrostatic gyrokinetics that is fully consistent with the First and Second Laws of Thermodynamics; 2) we provide a metriplectic temporal and velocity-space discretization for the particle phase-space Landau collision integral that conserves energy, momentum, and particle densities to machine precision, and guarantees the existence of a numerical H-theorem. These properties are demonstrated algebraically. The two results have important implications: 1) numerical methods addressing the Vlasov-Maxwell-Landau system of equations, or its reduced gyrokinetic versions, should start from a metriplectic formulation to preserve the fundamental physical principles at the discrete level; 2) the plasma physics community should search for a metriplectic reduction theory that would serve a purpose similar to that of the existing Lagrangian and Hamiltonian reduction theories in gyrokinetics. The discovery of a metriplectic formulation of collisional electrostatic gyrokinetics is strong evidence in favor of such a theory and, if uncovered, the theory would be invaluable in constructing reduced plasma models. Supported by U.S. DOE Contract Nos. DE-AC02-09-CH11466 (EH) and DE-AC05-06OR23100 (JWB) and by European Union's Horizon 2020 research and innovation Grant No. 708124 (MK).

  3. Boundary integral method for torsion of composite shafts

    International Nuclear Information System (INIS)

    Chou, S.I.; Mohr, J.A.

    1987-01-01

    The Saint-Venant torsion problem for homogeneous shafts with simply or multiply connected regions has received a great deal of attention in the past. However, because of the mathematical difficulties inherent in the problem, very few problems of torsion of shafts with composite cross sections have been solved analytically. Muskhelishvili (1963) studied the torsion problem for shafts with cross sections having several solid inclusions surrounded by an elastic material. The problems of a circular shaft reinforced by a non-concentric round inclusion and of a rectangular shaft composed of two rectangular parts made of different materials were solved. In this paper, a boundary integral equation method, which can be used to solve problems more complex than those considered by Katsikadelis et al., is developed. A square shaft with two dissimilar rectangular parts and a square shaft with a square inclusion are solved, and the results are compared with those given in the reference cited above. Finally, a square shaft composed of two rectangular parts with a circular inclusion is solved. (orig./GL)

  4. Integration of rock typing methods for carbonate reservoir characterization

    International Nuclear Information System (INIS)

    Aliakbardoust, E; Rahimpour-Bonab, H

    2013-01-01

    Reservoir rock typing is the most important part of all reservoir modelling. For integrated reservoir rock typing, static and dynamic properties need to be combined, but sometimes these two are incompatible. The failure is due to misunderstanding the crucial parameters that control the dynamic behaviour of the reservoir rock and thus selecting inappropriate methods for defining static rock types. In this study, rock types were defined by combining the SCAL data with the rock properties, particularly rock fabric and pore types. First, air-displacing-water capillary pressure curves were classified, because they are representative of fluid saturation and behaviour under capillary forces. Next, the most important rock properties which control the fluid flow and saturation behaviour (rock fabric and pore types) were combined with the defined classes. Corresponding petrophysical properties were also attributed to the reservoir rock types and, eventually, the defined rock types were compared with relative permeability curves. This study focused on the importance of the pore system, specifically pore types, in fluid saturation and entrapment in the reservoir rock. The most common tests in static rock typing, such as electrofacies analysis and porosity–permeability correlation, were carried out, and the results indicate that these are not appropriate approaches for reservoir rock typing in carbonate reservoirs with a complicated pore system. (paper)

  5. High-integrity software, computation and the scientific method

    International Nuclear Information System (INIS)

    Hatton, L.

    2012-01-01

    Computation rightly occupies a central role in modern science. Datasets are enormous and the processing implications of some algorithms are equally staggering. Given the continuing difficulties in quantifying the results of complex computations, it is of increasing importance to understand the role of computation in the essentially Popperian scientific method. In this paper, some of the problems with computation, for example the long-term unquantifiable presence of undiscovered defects, problems with programming languages, and process issues, are explored with numerous examples. One of the aims of the paper is to understand the implications of trying to produce high-integrity software and the limitations which still exist. Unfortunately, computer science itself suffers from an inability to be suitably critical of its practices and has operated in a largely measurement-free vacuum since its earliest days. Within computer science itself, this has not been so damaging in that it simply leads to unconstrained creativity and a rapid turnover of new technologies. In the applied sciences, however, which have to depend on computational results, such unquantifiability significantly undermines trust. It is time this particular demon was put to rest. (author)

  6. Indoor integrated navigation and synchronous data acquisition method for Android smartphone

    Science.gov (United States)

    Hu, Chunsheng; Wei, Wenjian; Qin, Shiqiao; Wang, Xingshu; Habib, Ayman; Wang, Ruisheng

    2015-08-01

    Smartphones are widely used at present. Most smartphones have cameras and various sensors, such as a gyroscope, an accelerometer and a magnetometer. Smartphone-based indoor navigation is therefore very important and valuable. According to the features of the smartphone and of indoor navigation, a new indoor integrated navigation method is proposed, which uses the MEMS (Micro-Electro-Mechanical Systems) IMU (Inertial Measurement Unit), camera and magnetometer of the smartphone. The proposed navigation method mainly involves data acquisition, camera calibration, image measurement, IMU calibration, initial alignment, strapdown integration, zero velocity update and integrated navigation. Synchronous data acquisition from the sensors (gyroscope, accelerometer and magnetometer) and the camera is the basis of indoor navigation on the smartphone. A camera data acquisition method is introduced, which uses the Android camera class to record images and timestamps from the smartphone camera. Two kinds of sensor data acquisition methods are introduced and compared. The first method records sensor data and time with the Android SensorManager. The second method implements the open, close, data-receiving and saving functions in C, and calls the sensor functions from Java through the JNI interface. Data acquisition software was developed with the JDK (Java Development Kit), Android ADT (Android Development Tools) and the NDK (Native Development Kit). The software can record camera data, sensor data and time simultaneously. Data acquisition experiments have been carried out with the developed software and a Samsung Note 2 smartphone. The experimental results show that the first method of sensor data acquisition is convenient but sometimes loses sensor data, whereas the second method has much better real-time performance and much less data loss. A checkerboard image is recorded, and the corner points of the checkerboard are detected with the Harris method. The sensor data of the gyroscope, accelerometer and magnetometer have

  7. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas

    Science.gov (United States)

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-01-01

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area. PMID:26580644

  8. Method for Assessing the Integrated Risk of Soil Pollution in Industrial and Mining Gathering Areas.

    Science.gov (United States)

    Guan, Yang; Shao, Chaofeng; Gu, Qingbao; Ju, Meiting; Zhang, Qian

    2015-11-13

    Industrial and mining activities are recognized as major sources of soil pollution. This study proposes an index system for evaluating the inherent risk level of polluting factories and introduces an integrated risk assessment method based on human health risk. As a case study, the health risk, polluting factories and integrated risks were analyzed in a typical industrial and mining gathering area in China, namely, Binhai New Area. The spatial distribution of the risk level was determined using a Geographic Information System. The results confirmed the following: (1) Human health risk in the study area is moderate to extreme, with heavy metals posing the greatest threat; (2) Polluting factories pose a moderate to extreme inherent risk in the study area. Such factories are concentrated in industrial and urban areas, but are irregularly distributed and also occupy agricultural land, showing a lack of proper planning and management; (3) The integrated risks of soil are moderate to high in the study area.

  9. SIG-VISA: Signal-based Vertically Integrated Seismic Monitoring

    Science.gov (United States)

    Moore, D.; Mayeda, K. M.; Myers, S. C.; Russell, S.

    2013-12-01

    Traditional seismic monitoring systems rely on discrete detections produced by station processing software; however, while such detections may constitute a useful summary of station activity, they discard large amounts of information present in the original recorded signal. We present SIG-VISA (Signal-based Vertically Integrated Seismic Analysis), a system for seismic monitoring through Bayesian inference on seismic signals. By directly modeling the recorded signal, our approach incorporates additional information unavailable to detection-based methods, enabling higher sensitivity and more accurate localization using techniques such as waveform matching. SIG-VISA's Bayesian forward model of seismic signal envelopes includes physically-derived models of travel times and source characteristics as well as Gaussian process (kriging) statistical models of signal properties that combine interpolation of historical data with extrapolation of learned physical trends. Applying Bayesian inference, we evaluate the model on earthquakes as well as the 2009 DPRK test event, demonstrating a waveform matching effect as part of the probabilistic inference, along with results on event localization and sensitivity. In particular, we demonstrate increased sensitivity from signal-based modeling, in which the SIG-VISA signal model finds statistical evidence for arrivals even at stations for which the IMS station processing failed to register any detection.

  10. Graphene-based integrated electrodes for flexible lithium ion batteries

    International Nuclear Information System (INIS)

    Shi, Ying; Wen, Lei; Zhou, Guangmin; Chen, Jing; Pei, Songfeng; Huang, Kun; Cheng, Hui-Ming; Li, Feng

    2015-01-01

    We have prepared flexible free-standing electrodes with anode and cathode active materials deposited on a highly conductive graphene membrane by a two-step filtration method. Compared with conventional electrodes using metal as current collectors, these electrodes have displayed stronger adhesion, superior electrochemical performance, higher energy density, and better flexibility. A full lithium ion battery assembled by adopting these graphene-based electrodes has showed high rate capability and long cyclic life. We have also assembled a thin, lightweight, and flexible lithium ion battery with poly-(dimethyl siloxane) sheets as packaging material to light a red light-emitting diode. This flexible battery can be easily bent without structural failure or performance loss and operated well under a bent state. The fabrication process of these graphene-based integrated electrodes only has two filtration steps; thus it is easy to scale up. These results suggest great potential for these graphene-based flexible batteries in lightweight, bendable, and wearable electronic devices. (paper)

  11. Computational methods assuring nuclear power plant structural integrity and safety: an overview of the recent activities at VTT

    International Nuclear Information System (INIS)

    Keinaenen, H.; Talja, H.; Rintamaa, R.

    1998-01-01

    Numerical, simplified engineering and standardised methods are applied in the safety analyses of primary circuit components and reactor pressure vessels. The integrity assessment procedures require input relating both to the steady-state and transient loading, actual material property data and precise knowledge of the size and geometry of defects. Current procedures hold extensive information regarding these aspects. It is important to verify the accuracy of the different assessment methods, especially in the case of complex structures and loading. The focus of this paper is on recent results and the development of computational fracture assessment methods at VTT Manufacturing Technology. The methods include effective engineering-type tools for rapid structural integrity assessments and more sophisticated finite-element-based methods. An integrated PC-based program system, MASI, for engineering fracture analysis is described. A summary of the verification of the methods in computational benchmark analyses and against the results of large-scale experiments is presented. (orig.)

  12. On the integration scheme along a trajectory for the characteristics method

    International Nuclear Information System (INIS)

    Le Tellier, Romain; Hebert, Alain

    2006-01-01

    The issue of the integration scheme along a trajectory, which arises in all tracking-based transport methods, is discussed from the point of view of the method of characteristics. The analogy with the discrete ordinates method in slab geometry is highlighted, along with the practical limitations in transposing high-order SN schemes to a trajectory-based method. We derive an example of such a transposition starting from the linear characteristic scheme. This new scheme is compared with the standard flat-source approximation of the step characteristic scheme and with the diamond differencing scheme. The numerical study covers a 1D analytical case, 2D one-group critical and fixed-source benchmarks and, finally, a realistic multigroup calculation on a BWR-MOX assembly.
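
    For a single trajectory segment of optical thickness tau = sigma_t * ds with a flat source q, the two reference schemes compared in the paper reduce to short closed-form updates: the step characteristic propagates the incoming angular flux exactly for a constant source, while diamond differencing closes the balance with the average of the segment-edge fluxes and can turn negative for thick segments. The sketch below evaluates both; the numbers are arbitrary.

```python
import numpy as np

# Sketch of two integration schemes along a single trajectory segment of a
# tracking-based transport sweep: the (flat-source) step characteristic
# scheme, which is exact for a constant source, and the diamond-difference
# scheme, which can go negative for optically thick segments.  The segment
# balance used is  psi_out - psi_in = q*ds - sigma_t*ds*psi_avg.

def step_characteristic(psi_in, q, sigma_t, ds):
    tau = sigma_t * ds
    psi_out = psi_in * np.exp(-tau) + (q / sigma_t) * (1.0 - np.exp(-tau))
    psi_avg = q / sigma_t + (psi_in - psi_out) / tau
    return psi_out, psi_avg

def diamond_difference(psi_in, q, sigma_t, ds):
    tau = sigma_t * ds
    psi_out = ((1.0 - 0.5 * tau) * psi_in + q * ds) / (1.0 + 0.5 * tau)
    psi_avg = 0.5 * (psi_in + psi_out)
    return psi_out, psi_avg

psi_in, q, sigma_t = 1.0, 0.2, 1.0
for ds in (0.1, 1.0, 4.0):                 # thin to optically thick segments
    sc = step_characteristic(psi_in, q, sigma_t, ds)
    dd = diamond_difference(psi_in, q, sigma_t, ds)
    print(f"tau = {sigma_t*ds:4.1f}:  SC out/avg = {sc[0]:.4f}/{sc[1]:.4f}   "
          f"DD out/avg = {dd[0]:.4f}/{dd[1]:.4f}")
```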

  13. Scalable electro-photonic integration concept based on polymer waveguides

    Science.gov (United States)

    Bosman, E.; Van Steenberge, G.; Boersma, A.; Wiegersma, S.; Harmsma, P.; Karppinen, M.; Korhonen, T.; Offrein, B. J.; Dangel, R.; Daly, A.; Ortsiefer, M.; Justice, J.; Corbett, B.; Dorrestein, S.; Duis, J.

    2016-03-01

    A novel method for fabricating a single mode optical interconnection platform is presented. The method comprises the miniaturized assembly of optoelectronic single dies, the scalable fabrication of polymer single mode waveguides and the coupling to glass fiber arrays providing the I/O's. The low cost approach for the polymer waveguide fabrication is based on the nano-imprinting of a spin-coated waveguide core layer. The assembly of VCSELs and photodiodes is performed before waveguide layers are applied. By embedding these components in deep reactive ion etched pockets in the silicon substrate, the planarity of the substrate for subsequent layer processing is guaranteed and the thermal path of chip-to-substrate is minimized. Optical coupling of the embedded devices to the nano-imprinted waveguides is performed by laser ablating 45 degree trenches which act as optical mirror for 90 degree deviation of the light from VCSEL to waveguide. Laser ablation is also implemented for removing parts of the polymer stack in order to mount a custom fabricated connector containing glass fiber arrays. A demonstration device was built to show the proof of principle of the novel fabrication, packaging and optical coupling principles as described above, combined with a set of sub-demonstrators showing the functionality of the different techniques separately. The paper represents a significant part of the electro-photonic integration accomplishments in the European 7th Framework project "Firefly" and not only discusses the development of the different assembly processes described above, but the efforts on the complete integration of all process approaches into the single device demonstrator.

  14. Optimizing some 3-stage W-methods for the time integration of PDEs

    Science.gov (United States)

    Gonzalez-Pinto, S.; Hernandez-Abreu, D.; Perez-Rodriguez, S.

    2017-07-01

    The optimization of some W-methods for the time integration of time-dependent PDEs in several spatial variables is considered. In [2, Theorem 1] several three-parametric families of three-stage W-methods for the integration of IVPs in ODEs were studied. Besides, the optimization of several specific methods for PDEs when the Approximate Matrix Factorization Splitting (AMF) is used to define the approximate Jacobian matrix (W ≈ f_y(y_n)) was carried out. Also, some convergence and stability properties were presented [2]. The derived methods were optimized on the basis that the underlying explicit Runge-Kutta method is the one having the largest monotonicity interval among the three-stage order-three Runge-Kutta methods [1]. Here, we propose an optimization of the methods by imposing an additional order condition [7] to keep order three for parabolic PDE problems [6], but at the price of substantially reducing the length of the nonlinear monotonicity interval of the underlying explicit Runge-Kutta method.
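
    For readers unfamiliar with the structure of such schemes, the sketch below implements a generic s-stage W-method step. The coefficients used in the demo are illustrative placeholders whose underlying explicit Runge-Kutta method (recovered for W = 0) is Kutta's three-stage, order-three scheme; they are not the optimized families derived in the paper and do not satisfy its additional order condition.

```python
import numpy as np

def w_method_step(f, W, y, h, alpha, gamma, b):
    """One step of an s-stage W-method (Rosenbrock-W form):
        (I - h*gamma[i,i]*W) k_i = h*f(y + sum_{j<i} alpha[i,j]*k_j)
                                   + h*W @ (sum_{j<i} gamma[i,j]*k_j)
        y_next = y + sum_i b[i]*k_i
    W may be any approximation of the Jacobian f_y(y), e.g. an AMF-type splitting."""
    s, n = len(b), y.size
    k = np.zeros((s, n))
    for i in range(s):
        acc_a = np.zeros(n)
        acc_g = np.zeros(n)
        for j in range(i):
            acc_a += alpha[i, j] * k[j]
            acc_g += gamma[i, j] * k[j]
        rhs = h * f(y + acc_a) + h * (W @ acc_g)
        k[i] = np.linalg.solve(np.eye(n) - h * gamma[i, i] * W, rhs)
    return y + k.T @ b

# Illustrative coefficients only (NOT the optimized sets of the paper):
# the underlying explicit RK is Kutta's 3-stage, order-3 method.
alpha = np.array([[0.0, 0.0, 0.0],
                  [0.5, 0.0, 0.0],
                  [-1.0, 2.0, 0.0]])
b = np.array([1/6, 2/3, 1/6])
gamma = 0.5 * np.eye(3)

A = np.array([[-100.0, 1.0], [0.0, -2.0]])   # mildly stiff linear test problem y' = A y
f = lambda y: A @ y
W = A                                        # exact Jacobian here; AMF would approximate it
y = np.array([1.0, 1.0])
for _ in range(20):
    y = w_method_step(f, W, y, 0.05, alpha, gamma, b)
print(y)
```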

  15. The iso-response method: measuring neuronal stimulus integration with closed-loop experiments

    Science.gov (United States)

    Gollisch, Tim; Herz, Andreas V. M.

    2012-01-01

    Throughout the nervous system, neurons integrate high-dimensional input streams and transform them into an output of their own. This integration of incoming signals involves filtering processes and complex non-linear operations. The shapes of these filters and non-linearities determine the computational features of single neurons and their functional roles within larger networks. A detailed characterization of signal integration is thus a central ingredient to understanding information processing in neural circuits. Conventional methods for measuring single-neuron response properties, such as reverse correlation, however, are often limited by the implicit assumption that stimulus integration occurs in a linear fashion. Here, we review a conceptual and experimental alternative that is based on exploring the space of those sensory stimuli that result in the same neural output. As demonstrated by recent results in the auditory and visual system, such iso-response stimuli can be used to identify the non-linearities relevant for stimulus integration, disentangle consecutive neural processing steps, and determine their characteristics with unprecedented precision. Automated closed-loop experiments are crucial for this advance, allowing rapid search strategies for identifying iso-response stimuli during experiments. Prime targets for the method are feed-forward neural signaling chains in sensory systems, but the method has also been successfully applied to feedback systems. Depending on the specific question, “iso-response” may refer to a predefined firing rate, single-spike probability, first-spike latency, or other output measures. Examples from different studies show that substantial progress in understanding neural dynamics and coding can be achieved once rapid online data analysis and stimulus generation, adaptive sampling, and computational modeling are tightly integrated into experiments. PMID:23267315
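
    A hypothetical sketch of the closed-loop search: along each direction of a two-dimensional stimulus space, a bisection finds the intensity that drives a model neuron to a predefined target rate, and the resulting points trace an iso-response curve. The model neuron, its squaring nonlinearity and the target rate are illustrative assumptions, not data from the reviewed experiments.

```python
import numpy as np

def model_rate(s1, s2):
    """Stand-in for the experimental measurement: firing rate of a hypothetical
    neuron that combines two inputs through squaring nonlinearities."""
    return (0.8 * s1) ** 2 + (1.5 * s2) ** 2

def find_iso_intensity(direction, target_rate, r_max=10.0, tol=1e-3):
    """Closed-loop bisection along one stimulus direction for the intensity
    that yields the predefined target response."""
    lo, hi = 0.0, r_max
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        rate = model_rate(mid * np.cos(direction), mid * np.sin(direction))
        lo, hi = (mid, hi) if rate < target_rate else (lo, mid)
    return 0.5 * (lo + hi)

# trace an iso-response curve at an arbitrary target rate of 4.0
for angle in np.linspace(0.0, np.pi / 2, 9):
    r = find_iso_intensity(angle, target_rate=4.0)
    print(f"direction {angle:4.2f} rad -> iso-intensity {r:6.3f}")
```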

  16. Coal mine enterprise integration based on strategic alliance

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Q.; Sun, J.; Xu, S. [Tsinghua University, Beijing (China). Dept. of Computer Science and Technology

    2003-07-01

    The relationship between coal mines and related enterprises was analysed. Aiming at the competitive world market as well as dynamic requirements, a coal mine enterprise integration strategy and an enterprise strategic alliance were proposed for the product-providing service business pattern. The modelling method of the enterprise strategic alliance was proposed, including the relationship view model, the information view model and the business process view model. The idea of an enterprise strategic alliance is useful for enterprise integration. 6 refs., 2 figs.

  17. Graphene based integrated tandem supercapacitors fabricated directly on separators

    KAUST Repository

    Chen, Wei

    2015-04-09

    It is of great importance to fabricate integrated supercapacitors with extended operation voltages as high energy density storage devices. In this work, we develop a novel direct electrode deposition on separator (DEDS) process to fabricate graphene-based integrated tandem supercapacitors for the first time. The DEDS process generates compact graphene-polyaniline electrodes directly on the separators to form integrated supercapacitors. The integrated graphene-polyaniline tandem supercapacitors demonstrate an ultrahigh volumetric energy density of 52.5 Wh L^(−1) at a power density of 6037 W L^(−1) and an excellent gravimetric energy density of 26.1 Wh kg^(−1) at a power density of 3002 W kg^(−1), with outstanding electrochemical stability over 10000 cycles. This study shows great promise for the future development of integrated energy storage devices.

  18. Broadband image sensor array based on graphene-CMOS integration

    Science.gov (United States)

    Goossens, Stijn; Navickaite, Gabriele; Monasterio, Carles; Gupta, Shuchi; Piqueras, Juan José; Pérez, Raúl; Burwell, Gregory; Nikitskiy, Ivan; Lasanta, Tania; Galán, Teresa; Puma, Eric; Centeno, Alba; Pesquera, Amaia; Zurutuza, Amaia; Konstantatos, Gerasimos; Koppens, Frank

    2017-06-01

    Integrated circuits based on complementary metal-oxide-semiconductors (CMOS) are at the heart of the technological revolution of the past 40 years, enabling compact and low-cost microelectronic circuits and imaging systems. However, the diversification of this platform into applications other than microcircuits and visible-light cameras has been impeded by the difficulty of combining semiconductors other than silicon with CMOS. Here, we report the monolithic integration of a CMOS integrated circuit with graphene, operating as a high-mobility phototransistor. We demonstrate a high-resolution, broadband image sensor and operate it as a digital camera that is sensitive to ultraviolet, visible and infrared light (300-2,000 nm). The demonstrated graphene-CMOS integration is pivotal for incorporating 2D materials into next-generation microelectronics, sensor arrays, low-power integrated photonics and CMOS imaging systems covering visible, infrared and terahertz frequencies.

  19. A Fuzzy Adaptive Tightly-Coupled Integration Method for Mobile Target Localization Using SINS/WSN

    Directory of Open Access Journals (Sweden)

    Wei Li

    2016-11-01

    Full Text Available In recent years, mobile target localization for enclosed environments has been of growing interest. In this paper, we propose a fuzzy adaptive tightly-coupled integration (FATCI) method for positioning and tracking applications using a strapdown inertial navigation system (SINS) and a wireless sensor network (WSN). Wireless signal outages and severe multipath propagation in the WSN often degrade the accuracy of the measured distances and make WSN positioning difficult. Note also that SINS is known for its error drift over time. Starting from the well-known loosely-coupled integration method, we have built a tightly-coupled integrated positioning system for SINS/WSN based on the measured distances between anchor nodes and the mobile node. The measured distance values of the WSN are corrected with a least squares regression (LSR) algorithm, with the aim of decreasing the systematic error of the measured distance. Additionally, the statistical covariance of the measured distance values is used to adjust the observation covariance matrix of a Kalman filter through a fuzzy inference system (FIS) based on the statistical characteristics. The tightly-coupled integration model can then adaptively adjust the confidence level of a measurement according to the accuracy of the distance measurements. Hence the FATCI system is achieved using SINS/WSN. This innovative approach is verified in real scenarios. Experimental results show that the proposed positioning system has better accuracy and stability compared with the loosely-coupled and traditional tightly-coupled integration models under WSN short-term failure or normal conditions.
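
    A simplified sketch, under assumed parameters, of the two correction ideas described in the abstract: a least-squares regression that removes the systematic error of the raw WSN ranges, and a scaling of the Kalman observation covariance from the scatter of recent range residuals (a crisp stand-in for the fuzzy inference step).

```python
import numpy as np

def fit_range_correction(measured, true):
    """Least-squares regression d_true ~ a*d_measured + b to remove systematic ranging error."""
    A = np.column_stack([measured, np.ones_like(measured)])
    (a, b), *_ = np.linalg.lstsq(A, true, rcond=None)
    return a, b

def adaptive_measurement_cov(recent_residuals, r_nominal=0.01):
    """Inflate the Kalman observation covariance when recent range residuals are noisy;
    a fuzzy inference system would perform this mapping smoothly (crisp stand-in here)."""
    var = np.var(recent_residuals, ddof=1)
    return r_nominal * max(1.0, var / r_nominal)

# calibration with hypothetical ranging data: biased and noisy distances
rng = np.random.default_rng(0)
true_d = rng.uniform(2.0, 30.0, 200)
meas_d = 1.05 * true_d + 0.3 + rng.normal(0.0, 0.2, true_d.size)
a, b = fit_range_correction(meas_d, true_d)
corrected = a * meas_d + b
print("bias model: a=%.3f b=%.3f" % (a, b))
print("adaptive R:", adaptive_measurement_cov(corrected[-20:] - true_d[-20:]))
```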

  20. Methods for Addressing Uncertainty and Variability to Characterize Potential Health Risk From Trichloroethylene-Contaminated Ground Water Beale Air Force Base in California: Integration of Uncertainty and Variability in Pharmacokinetics and Dose-Response; TOPICAL

    International Nuclear Information System (INIS)

    Bogen, K.T.

    1999-01-01

    Traditional estimates of health risk are typically inflated, particularly if cancer is the dominant endpoint and there is fundamental uncertainty as to mechanism(s) of action. Risk is more realistically characterized if it accounts for joint uncertainty and interindividual variability after applying a unified probabilistic approach to the distributed parameters of all (linear as well as nonlinear) risk-extrapolation models involved. Such an approach was applied to characterize risks to potential future residents posed by trichloroethylene (TCE) in ground water at an inactive landfill site on Beale Air Force Base in California. Variability and uncertainty were addressed in exposure-route-specific estimates of applied dose, in pharmacokinetically based estimates of route-specific metabolized fractions of absorbed TCE, and in corresponding biologically effective doses estimated under a genotoxic/linear (MA_G) vs. a cytotoxic/nonlinear (MA_C) mechanistic assumption for TCE-induced cancer. Increased risk conditional on effective dose was estimated under MA_G based on seven rodent-bioassay data sets, and under MA_C based on mouse hepatotoxicity data. Mean and upper-bound estimates of combined risk calculated by the unified approach were <10^(-6) and <10^(-4), respectively, while corresponding estimates based on traditional deterministic methods were >10^(-5) and >10^(-4), respectively. It was estimated that no TCE-related harm is likely to occur due to any plausible residential exposure scenario involving the site. The unified approach illustrated is particularly suited to characterizing risks that involve uncertain and/or diverse mechanisms of action.

  1. Integration of relational and textual biomedical sources. A pilot experiment using a semi-automated method for logical schema acquisition.

    Science.gov (United States)

    García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J

    2010-01-01

    Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.

  2. Exact and approximate interior corner problem in neutron diffusion by integral transform methods

    International Nuclear Information System (INIS)

    Bareiss, E.H.; Chang, K.S.J.; Constatinescu, D.A.

    1976-09-01

    The mathematical solution of the neutron diffusion equation exhibits singularities in its derivatives at material corners. A mathematical treatment of the nature of these singularities and their impact on coarse-network approximation methods in computational work is presented. The mathematical behavior is deduced from Green's functions, based on a generalized theory for two space dimensions and the resulting systems of integral equations, as well as from the Kontorovich-Lebedev transform. The effect on numerical calculations is demonstrated for finite difference and finite element methods for a two-region corner problem.

  3. Equivalent Method of Integrated Power Generation System of Wind, Photovoltaic and Energy Storage in Power Flow Calculation and Transient Simulation

    Institute of Scientific and Technical Information of China (English)

    2012-01-01

    The integrated power generation system of wind, photovoltaic (PV) and energy storage is composed of several wind turbines, PV units and energy storage units. The detailed model of integrated generation is not suitable for large-scale power system simulation because of the model's complexity and long computation time. An equivalent method for power flow calculation and transient simulation of the integrated generation system is proposed based on actual projects, so as to establish the foundation for the simulation and analysis of such integrated systems.

  4. Diagrammatical methods within the path integral representation for quantum systems

    International Nuclear Information System (INIS)

    Alastuey, A

    2014-01-01

    The path integral representation has been successfully applied to the study of equilibrium properties of quantum systems for a long time. In particular, such a representation allowed Ginibre to prove the convergence of the low-fugacity expansions for systems with short-range interactions. First, I will show that the crucial trick underlying Ginibre's proof is the introduction of an equivalent classical system made with loops. Within the Feynman-Kac formula for the density matrix, such loops naturally emerge by collecting together the paths followed by particles exchanged in a given cyclic permutation. Two loops interact via an average of two-body genuine interactions between particles belonging to different loops, while the interactions between particles inside a given loop are accounted for in a loop fugacity. It turns out that the grand-partition function of the genuine quantum system exactly reduces to its classical counterpart for the gas of loops. The corresponding so-called magic formula can be combined with standard Mayer diagrammatics for the classical gas of loops. This provides low-density representations for the quantum correlations or thermodynamical functions, which are quite useful when collective effects must be taken into account properly. Indeed, resummations and/or reorganizations of Mayer graphs can be performed by exploiting their remarkable topological and combinatorial properties, while statistical weights and bonds are purely c-numbers. The interest of that method will be illustrated through a brief description of its application to two long-standing problems, namely recombination in Coulomb systems and condensation in the interacting Bose gas.

  5. Pixel extraction based integral imaging with controllable viewing direction

    International Nuclear Information System (INIS)

    Ji, Chao-Chao; Deng, Huan; Wang, Qiong-Hua

    2012-01-01

    We propose pixel extraction based integral imaging with a controllable viewing direction. The proposed integral imaging can provide viewers three-dimensional (3D) images in a very small viewing angle. The viewing angle and the viewing direction of the reconstructed 3D images are controlled by the pixels extracted from an elemental image array. Theoretical analysis and a 3D display experiment of the viewing direction controllable integral imaging are carried out. The experimental results verify the correctness of the theory. A 3D display based on the integral imaging can protect the viewer’s privacy and has huge potential for a television to show multiple 3D programs at the same time. (paper)

  6. System and method for integrating and accessing multiple data sources within a data warehouse architecture

    Science.gov (United States)

    Musick, Charles R [Castro Valley, CA; Critchlow, Terence [Livermore, CA; Ganesh, Madhaven [San Jose, CA; Slezak, Tom [Livermore, CA; Fidelis, Krzysztof [Brentwood, CA

    2006-12-19

    A system and method is disclosed for integrating and accessing multiple data sources within a data warehouse architecture. The metadata formed by the present method provide a way to declaratively present domain specific knowledge, obtained by analyzing data sources, in a consistent and useable way. Four types of information are represented by the metadata: abstract concepts, databases, transformations and mappings. A mediator generator automatically generates data management computer code based on the metadata. The resulting code defines a translation library and a mediator class. The translation library provides a data representation for domain specific knowledge represented in a data warehouse, including "get" and "set" methods for attributes that call transformation methods and derive a value of an attribute if it is missing. The mediator class defines methods that take "distinguished" high-level objects as input and traverse their data structures and enter information into the data warehouse.
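
    A toy Python sketch of the generated translation-library pattern described above: attribute access goes through get/set methods, and a get on a missing attribute calls a registered transformation to derive the value. Class and attribute names are hypothetical; the actual system generates such code from the metadata.

```python
class TranslatedRecord:
    """Minimal stand-in for a generated translation-library class."""

    def __init__(self, **attrs):
        self._attrs = dict(attrs)
        self._derivations = {}          # attribute name -> transformation callable

    def register_derivation(self, name, func):
        self._derivations[name] = func

    def set(self, name, value):
        self._attrs[name] = value

    def get(self, name):
        if name not in self._attrs and name in self._derivations:
            # derive the missing value from other attributes via a transformation
            self._attrs[name] = self._derivations[name](self)
        return self._attrs.get(name)

# hypothetical usage: derive a normalized gene symbol when only a raw name is stored
rec = TranslatedRecord(raw_gene_name="  brca1 ")
rec.register_derivation("gene_symbol", lambda r: r.get("raw_gene_name").strip().upper())
print(rec.get("gene_symbol"))   # -> "BRCA1"
```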

  7. Integrating adaptive governance and participatory multicriteria methods: a framework for climate adaptation governance

    Directory of Open Access Journals (Sweden)

    Stefania Munaretto

    2014-06-01

    Full Text Available Climate adaptation is a dynamic social and institutional process where the governance dimension is receiving growing attention. Adaptive governance is an approach that promises to reduce uncertainty by improving the knowledge base for decision making. As uncertainty is an inherent feature of climate adaptation, adaptive governance seems to be a promising approach for improving climate adaptation governance. However, the adaptive governance literature has so far paid little attention to decision-making tools and methods, and the literature on the governance of adaptation is in its infancy in this regard. We argue that climate adaptation governance would benefit from systematic and yet flexible decision-making tools and methods such as participatory multicriteria methods for the evaluation of adaptation options, and that these methods can be linked to key adaptive governance principles. Moving from these premises, we propose a framework that integrates key adaptive governance features into participatory multicriteria methods for the governance of climate adaptation.

  8. Practical Integration-Free Episomal Methods for Generating Human Induced Pluripotent Stem Cells.

    Science.gov (United States)

    Kime, Cody; Rand, Tim A; Ivey, Kathryn N; Srivastava, Deepak; Yamanaka, Shinya; Tomoda, Kiichiro

    2015-10-06

    The advent of induced pluripotent stem (iPS) cell technology has revolutionized biomedicine and basic research by yielding cells with embryonic stem (ES) cell-like properties. The use of iPS-derived cells for cell-based therapies and modeling of human disease holds great potential. While the initial description of iPS cells involved overexpression of four transcription factors via viral vectors that integrated within genomic DNA, advances in recent years by our group and others have led to safer and higher quality iPS cells with greater efficiency. Here, we describe commonly practiced methods for non-integrating induced pluripotent stem cell generation using nucleofection of episomal reprogramming plasmids. These methods are adapted from recent studies that demonstrate increased hiPS cell reprogramming efficacy with the application of three powerful episomal hiPS cell reprogramming factor vectors and the inclusion of an accessory vector expressing EBNA1. Copyright © 2015 John Wiley & Sons, Inc.

  9. A nodal method based on the response-matrix method

    International Nuclear Information System (INIS)

    Cunha Menezes Filho, A. da; Rocamora Junior, F.D.

    1983-02-01

    A nodal approach based on the Response-Matrix method is presented with the purpose of investigating the possibility of mixing two different allocations in the same problem. It is found that the use of allocation of albedo combined with allocation of direct reflection produces good results for homogeneous fast reactor configurations. (Author) [pt

  10. Ontology Based Resolution of Semantic Conflicts in Information Integration

    Institute of Scientific and Technical Information of China (English)

    LU Han; LI Qing-zhong

    2004-01-01

    Semantic conflict is the conflict caused by using different ways in heterogeneous systems to express the same entity in reality. This prevents information integration from accomplishing semantic coherence. Since ontology helps to solve semantic problems, this area has become a hot topic in information integration. In this paper, we introduce semantic conflict into information integration of heterogeneous applications. We discuss the origins and categories of the conflict, and present an ontology-based schema mapping approach to eliminate semantic conflicts.

  11. Evidence-based integrative medicine in clinical veterinary oncology.

    Science.gov (United States)

    Raditic, Donna M; Bartges, Joseph W

    2014-09-01

    Integrative medicine is the combined use of complementary and alternative medicine with conventional or traditional Western medicine systems. The demand for integrative veterinary medicine is growing, but evidence-based research on its efficacy is limited. In veterinary clinical oncology, such research could be translated to human medicine, because veterinary patients with spontaneous tumors are valuable translational models for human cancers. An overview of specific herbs, botanics, dietary supplements, and acupuncture evaluated in dogs, in vitro canine cells, and other relevant species both in vivo and in vitro is presented for their potential use as integrative therapies in veterinary clinical oncology. Published by Elsevier Inc.

  12. Beyond vertical integration--Community based medical education.

    Science.gov (United States)

    Kennedy, Emma Margaret

    2006-11-01

    The term 'vertical integration' is used broadly in medical education, sometimes when discussing community based medical education (CBME). This article examines the relevance of the term 'vertical integration' and provides an alternative perspective on the complexities of facilitating the CBME process. The principles of learner centredness, patient centredness and flexibility are fundamental to learning in the diverse contexts of 'community'. Vertical integration as a structural concept is helpful for academic organisations but has less application to education in the community setting; a different approach illuminates the strengths and challenges of CBME that need consideration by these organisations.

  13. Synchronization method for grid integrated battery storage systems during asymmetrical grid faults

    Directory of Open Access Journals (Sweden)

    Popadić Bane

    2017-01-01

    Full Text Available This paper aims at presenting a robust and reliable synchronization method for battery storage systems during asymmetrical grid faults. For this purpose, a Matlab/Simulink based model for testing of the power electronic interface between the grid and the battery storage systems has been developed. The synchronization method proposed in the paper is based on the proportional integral resonant controller with delay signal cancellation. The validity of the synchronization method has been verified using the advanced laboratory station for the control of grid connected distributed energy sources. The proposed synchronization method has eliminated unfavourable components from the estimated grid angular frequency, leading to more accurate and reliable tracking of the positive sequence of the grid voltage vector during both normal operation and operation during asymmetrical grid faults. [Project of the Serbian Ministry of Education, Science and Technological Development, Grant no. III 042004: Integrated and Interdisciplinary Research entitled: Smart Electricity Distribution Grids Based on Distribution Management System and Distributed Generation]
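
    A small numerical sketch, with assumed signal parameters, of the delay signal cancellation idea used by the synchronization method: in the stationary alpha-beta frame, combining the voltage with a quarter-period delayed copy cancels the negative-sequence component that appears during asymmetrical faults (the proportional-integral resonant control loop itself is not reproduced here).

```python
import numpy as np

f_grid, fs = 50.0, 10_000            # grid frequency [Hz], sampling rate [Hz]
t = np.arange(0, 0.1, 1 / fs)

# unbalanced voltage in the stationary alpha-beta frame, complex form v = v_alpha + j*v_beta
v_pos = 1.00 * np.exp(1j * (2 * np.pi * f_grid * t + 0.3))    # positive sequence
v_neg = 0.35 * np.exp(-1j * (2 * np.pi * f_grid * t + 1.1))   # negative sequence (asymmetrical fault)
v = v_pos + v_neg

def dsc_positive_sequence(v, fs, f_grid):
    """Delay signal cancellation: v_plus(t) = 0.5*(v(t) + 1j*v(t - T/4)),
    which cancels the negative-sequence component in the alpha-beta frame."""
    d = int(round(fs / (4 * f_grid)))                 # quarter-period delay in samples
    v_delayed = np.concatenate([np.zeros(d, dtype=complex), v[:-d]])
    return 0.5 * (v + 1j * v_delayed)

v_plus = dsc_positive_sequence(v, fs, f_grid)
d = int(round(fs / (4 * f_grid)))
print("max error after settling:", np.max(np.abs(v_plus[2 * d:] - v_pos[2 * d:])))
```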

  14. Graph-based sequence annotation using a data integration approach.

    Science.gov (United States)

    Pesch, Robert; Lysenko, Artem; Hindle, Matthew; Hassani-Pak, Keywan; Thiele, Ralf; Rawlings, Christopher; Köhler, Jacob; Taubert, Jan

    2008-08-25

    The automated annotation of data from high throughput sequencing and genomics experiments is a significant challenge for bioinformatics. Most current approaches rely on sequential pipelines of gene finding and gene function prediction methods that annotate a gene with information from different reference data sources. Each function prediction method contributes evidence supporting a functional assignment. Such approaches generally ignore the links between the information in the reference datasets. These links, however, are valuable for assessing the plausibility of a function assignment and can be used to evaluate the confidence in a prediction. We are working towards a novel annotation system that uses the network of information supporting the function assignment to enrich the annotation process for use by expert curators and predicting the function of previously unannotated genes. In this paper we describe our success in the first stages of this development. We present the data integration steps that are needed to create the core database of integrated reference databases (UniProt, PFAM, PDB, GO and the pathway database Ara-Cyc) which has been established in the ONDEX data integration system. We also present a comparison between different methods for integration of GO terms as part of the function assignment pipeline and discuss the consequences of this analysis for improving the accuracy of gene function annotation. The methods and algorithms presented in this publication are an integral part of the ONDEX system which is freely available from http://ondex.sf.net/.

  15. EVALUATION OF THE POUNDING FORCES DURING EARTHQUAKE USING EXPLICIT DYNAMIC TIME INTEGRATION METHOD

    Directory of Open Access Journals (Sweden)

    Nica George Bogdan

    2017-09-01

    Full Text Available Pounding effects during an earthquake are a subject of high significance for structural engineers working in urban areas. In this paper, two ways to account for structural pounding are used in a MATLAB code, namely the classical stereomechanics approach and a nonlinear viscoelastic impact element. The numerical study is performed on SDOF structures excited by the El Centro recording. While most of the studies available in the literature use the Newmark implicit time integration method, in this study the equations of motion are numerically integrated using the central finite difference method, an explicit method, whose main advantage is that the displacement at the (i+1)th step is calculated based on the loads from the ith step. Thus, the collision is checked and the pounding forces are taken into account in the equation of motion in an easier manner than in an implicit integration method. First, a comparison is made using available data in the literature. Both the linear and nonlinear behavior of the structures during the earthquake is further investigated. Several layout scenarios are also investigated, in which one or more weak buildings are adjacent to a stiffer building. One of the main findings of this paper relates to the behavior of a weak structure located between two stiff structures.
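
    A minimal sketch, with assumed parameters, of the explicit scheme described above: two SDOF oscillators integrated with the central difference method, a gap check at every step, and, for brevity, a linear contact spring in place of the stereomechanical and nonlinear viscoelastic pounding models compared in the paper; the excitation is a toy decaying sine, not the El Centro record.

```python
import numpy as np

def central_difference_pounding(m, k, c, gap, k_pound, ag, dt):
    """Explicit central-difference integration of two adjacent SDOF structures with a
    gap check; pounding is modeled by a linear contact spring (simplification)."""
    n = len(ag)
    u = np.zeros((n, 2))          # displacements of the two structures
    f_pound = np.zeros(n)
    for i in range(1, n - 1):
        # contact check: a pounding force appears only when the gap is closed
        overlap = u[i, 0] - u[i, 1] - gap
        fc = k_pound * overlap if overlap > 0.0 else 0.0
        f_pound[i] = fc
        for s, sign in ((0, -1.0), (1, +1.0)):
            v = (u[i, s] - u[i - 1, s]) / dt            # backward-difference velocity
            a = (-m[s] * ag[i] - c[s] * v - k[s] * u[i, s] + sign * fc) / m[s]
            u[i + 1, s] = 2 * u[i, s] - u[i - 1, s] + dt**2 * a
    return u, f_pound

# toy excitation and structural parameters (assumptions, not the paper's data)
dt = 0.002
t = np.arange(0, 10, dt)
ag = 2.0 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.2 * t)
u, fp = central_difference_pounding(
    m=[2.0e5, 1.0e5], k=[8.0e7, 4.0e6], c=[4.0e4, 2.0e4],
    gap=0.02, k_pound=1.0e9, ag=ag, dt=dt)
print("peak pounding force [N]:", fp.max())
```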

  16. Integrated navigation method of a marine strapdown inertial navigation system using a star sensor

    International Nuclear Information System (INIS)

    Wang, Qiuying; Diao, Ming; Gao, Wei; Zhu, Minghong; Xiao, Shu

    2015-01-01

    This paper presents an integrated navigation method for a strapdown inertial navigation system (SINS) using a star sensor. According to the principle of SINS, its own navigation information contains an error that increases with time. Hence, the inertial attitude matrix from the star sensor is introduced as reference information to correct the growing SINS error. For the integrated navigation method, the vehicle’s attitude can be obtained in two ways: one is calculated from the SINS; the other, which we have called the star sensor attitude, is obtained as the product of the SINS position and the inertial attitude matrix from the star sensor. Therefore, the SINS position error is introduced into the star sensor attitude error. Based on the characteristics of the star sensor attitude error and the mathematical derivation, the SINS navigation errors can be obtained by the coupling calculation between the SINS attitude and the star sensor attitude. Unlike several current techniques, the navigation process of this method is non-radiating and invulnerable to jamming. The effectiveness of this approach was demonstrated by simulation and experimental study. The results show that this integrated navigation method can estimate the attitude error and the position error of the SINS. Therefore, the SINS navigation accuracy is improved. (paper)

  17. VALUE - Validating and Integrating Downscaling Methods for Climate Change Research

    Science.gov (United States)

    Maraun, Douglas; Widmann, Martin; Benestad, Rasmus; Kotlarski, Sven; Huth, Radan; Hertig, Elke; Wibig, Joanna; Gutierrez, Jose

    2013-04-01

    Our understanding of global climate change is mainly based on General Circulation Models (GCMs) with a relatively coarse resolution. Since climate change impacts are mainly experienced on regional scales, high-resolution climate change scenarios need to be derived from GCM simulations by downscaling. Several projects have been carried out over the last years to validate the performance of statistical and dynamical downscaling, yet several aspects have not been systematically addressed: variability on sub-daily, decadal and longer time-scales, extreme events, spatial variability and inter-variable relationships. Different downscaling approaches such as dynamical downscaling, statistical downscaling and bias correction approaches have not been systematically compared. Furthermore, collaboration between different communities, in particular regional climate modellers, statistical downscalers and statisticians has been limited. To address these gaps, the EU Cooperation in Science and Technology (COST) action VALUE (www.value-cost.eu) has been brought to life. VALUE is a research network with participants from currently 23 European countries running from 2012 to 2015. Its main aim is to systematically validate and develop downscaling methods for climate change research in order to improve regional climate change scenarios for use in climate impact studies. Inspired by the co-design idea of the international research initiative "future earth", stakeholders of climate change information have been involved in the definition of research questions to be addressed and are actively participating in the network. The key idea of VALUE is to identify the relevant weather and climate characteristics required as input for a wide range of impact models and to define an open framework to systematically validate these characteristics. Based on a range of benchmark data sets, in principle every downscaling method can be validated and compared with competing methods. The results of

  18. Earth Systems Science in an Integrated Science Content and Methods Course for Elementary Education Majors

    Science.gov (United States)

    Madsen, J. A.; Allen, D. E.; Donham, R. S.; Fifield, S. J.; Shipman, H. L.; Ford, D. J.; Dagher, Z. R.

    2004-12-01

    With funding from the National Science Foundation, we have designed an integrated science content and methods course for sophomore-level elementary teacher education (ETE) majors. This course, the Science Semester, is a 15-credit sequence that consists of three science content courses (Earth, Life, and Physical Science) and a science teaching methods course. The goal of this integrated science and education methods curriculum is to foster holistic understandings of science and pedagogy that future elementary teachers need to effectively use inquiry-based approaches in teaching science in their classrooms. During the Science Semester, traditional subject matter boundaries are crossed to stress shared themes that teachers must understand to teach standards-based elementary science. Exemplary approaches that support both learning science and learning how to teach science are used. In the science courses, students work collaboratively on multidisciplinary problem-based learning (PBL) activities that place science concepts in authentic contexts and build learning skills. In the methods course, students critically explore the theory and practice of elementary science teaching, drawing on their shared experiences of inquiry learning in the science courses. An earth system science approach is ideally adapted for the integrated, inquiry-based learning that takes place during the Science Semester. The PBL investigations that are the hallmark of the Science Semester provide the backdrop through which fundamental earth system interactions can be studied. For example in the PBL investigation that focuses on energy, the carbon cycle is examined as it relates to fossil fuels. In another PBL investigation centered on kids, cancer, and the environment, the hydrologic cycle with emphasis on surface runoff and ground water contamination is studied. In a PBL investigation that has students learning about the Delaware Bay ecosystem through the story of the horseshoe crab and the biome

  19. A discontinuous Galerkin finite element method with an efficient time integration scheme for accurate simulations

    KAUST Repository

    Liu, Meilin; Bagci, Hakan

    2011-01-01

    A discontinuous Galerkin finite element method (DG-FEM) with a highly-accurate time integration scheme is presented. The scheme achieves its high accuracy using numerically constructed predictor-corrector integration coefficients. Numerical results

  20. Integrating Evidence Within and Across Evidence Streams Using Qualitative Methods

    Science.gov (United States)

    There is high demand in environmental health for adoption of a structured process that evaluates and integrates evidence while making decisions transparent. The Grading of Recommendations Assessment, Development and Evaluation (GRADE) framework holds promise to address this deman...

  1. Integral methods of solving boundary-value problems of nonstationary heat conduction and their comparative analysis

    Science.gov (United States)

    Kot, V. A.

    2017-11-01

    The modern state of approximate integral methods used in applications, where the processes of heat conduction and heat and mass transfer are of primary importance, is considered. Integral methods have found wide utility in different fields of knowledge: problems of heat conduction with different heat-exchange conditions, simulation of thermal protection, Stefan-type problems, microwave heating of a substance, problems on a boundary layer, simulation of a fluid flow in a channel, thermal explosion, laser and plasma treatment of materials, simulation of the formation and melting of ice, inverse heat problems, determination of the temperature and thermal properties of nanoparticles and nanofluids, and others. Moreover, polynomial solutions are of interest because the determination of a temperature (concentration) field is an intermediate stage in the mathematical description of any other process. The following main methods were investigated on the basis of the error norms: the Tsoi and Postol’nik methods, the method of integral relations, the Goodman integral method of heat balance, the improved Volkov integral method, the matched integral method, the modified Hristov method, the Mayer integral method, the Kudinov method of additional boundary conditions, the Fedorov boundary method, the method of the weighted temperature function, and the integral method of boundary characteristics. It was established that the two last-mentioned methods are characterized by high convergence and frequently give solutions whose accuracy is not worse than the accuracy of numerical solutions.

  2. The clinically-integrated randomized trial: proposed novel method for conducting large trials at low cost

    Directory of Open Access Journals (Sweden)

    Scardino Peter T

    2009-03-01

    Full Text Available Abstract Introduction Randomized controlled trials provide the best method of determining which of two comparable treatments is preferable. Unfortunately, contemporary randomized trials have become increasingly expensive, complex and burdened by regulation, so much so that many trials are of doubtful feasibility. Discussion Here we present a proposal for a novel, streamlined approach to randomized trials: the "clinically-integrated randomized trial". The key aspect of our methodology is that the clinical experience of the patient and doctor is virtually indistinguishable whether or not the patient is randomized, primarily because outcome data are obtained from routine clinical data, or from short, web-based questionnaires. Integration of a randomized trial into routine clinical practice also implies that there should be an attempt to randomize every patient, a corollary of which is that eligibility criteria are minimized. The similar clinical experience of patients on- and off-study also entails that the marginal cost of putting an additional patient on trial is negligible. We propose examples of how the clinically-integrated randomized trial might be applied in four distinct areas of medicine: comparisons of surgical techniques, "me too" drugs, rare diseases and lifestyle interventions. Barriers to implementing clinically-integrated randomized trials are discussed. Conclusion The proposed clinically-integrated randomized trial may allow us to enlarge dramatically the number of clinical questions that can be addressed by randomization.

  3. TL glow ratios at different temperature intervals of integration in thermoluminescence method. Comparison of Japanese standard (MHLW notified) method with CEN standard methods

    International Nuclear Information System (INIS)

    Todoriki, Setsuko; Saito, Kimie; Tsujimoto, Yuka

    2008-01-01

    The effect of the temperature interval used to integrate TL intensities on the TL glow ratio was examined by comparing the notified method of the Ministry of Health, Labour and Welfare (MHLW method) with EN1788. Two kinds of un-irradiated geological standard rock and three kinds of spices (black pepper, turmeric, and oregano) irradiated at 0.3 kGy or 1.0 kGy were subjected to TL analysis. Although the TL glow ratio of the andesite exceeded 0.1 when calculated by the MHLW notified method (integration interval 70-490°C), the maxima of the first glow curves were observed at 300°C or more, which was attributed to the influence of natural radioactivity and distinguished from food irradiation. When the integration interval was set to 166-227°C according to EN1788, the TL glow ratios became remarkably smaller than 0.1, and the evaluation of the un-irradiated samples became clearer. For spices, the TL glow ratios obtained with the MHLW notified method fell below 0.1 in un-irradiated samples and exceeded 0.1 in irradiated ones. Moreover, the Glow 1 maximum temperatures of the irradiated samples were observed in the range of 168-196°C, and those of un-irradiated samples were 258°C or more. Therefore, all samples were correctly judged by the criteria of the MHLW method. However, with the integration temperature range defined by EN1788, the TL glow ratios of the un-irradiated samples became remarkably smaller than with the MHLW method, and the discrimination of irradiated from non-irradiated samples became clearer. (author)

  4. Developing a Self-Report-Based Sequential Analysis Method for Educational Technology Systems: A Process-Based Usability Evaluation

    Science.gov (United States)

    Lin, Yi-Chun; Hsieh, Ya-Hui; Hou, Huei-Tse

    2015-01-01

    The development of a usability evaluation method for educational systems or applications, called the self-report-based sequential analysis, is described herein. The method aims to extend the current practice by proposing self-report-based sequential analysis as a new usability method, which integrates the advantages of self-report in survey…

  5. GaN-based integrated photonics chip with suspended LED and waveguide

    Science.gov (United States)

    Li, Xin; Wang, Yongjin; Hane, Kazuhiro; Shi, Zheng; Yan, Jiang

    2018-05-01

    We propose a GaN-based integrated photonics chip with a suspended LED and straight waveguides with different geometric parameters. The integrated photonics chip is prepared by a double-side process. The light transmission performance of the integrated chip versus current is quantitatively analyzed by capturing the light transmitted to the waveguide tip and by BPM (beam propagation method) simulation. Reducing the waveguide width from 8 μm to 4 μm results in a more than linear reduction of the light output power, while doubling the length from 250 μm to 500 μm results in only an under-linear decrease of the output power. The integrated chip is capable of free-space transmission of an 80 Mbps random binary sequence, achieving high-speed data transmission via visible light. This study provides a potential approach for GaN-based integrated photonics chips as micro light sources and passive optical devices in VLC (visible light communication).

  6. The integrated circuit IC EMP transient state disturbance effect experiment method investigates

    International Nuclear Information System (INIS)

    Li Xiaowei

    2004-01-01

    The study of transient-state disturbance characteristics of integrated circuits (ICs) needs to start from their coupling paths. Through cable (antenna) coupling, an EMP is converted into a pulsed current/voltage and impacts the I/O ports of the integrated circuit via the cable. Considering the structural features of armament systems, the EMP effect on the ICs inside such a system is analyzed. The current-injection method for IC EMP effect experiments is investigated and several experimental methods are given. (authors)

  7. Method of mechanical quadratures for solving singular integral equations of various types

    Science.gov (United States)

    Sahakyan, A. V.; Amirjanyan, H. A.

    2018-04-01

    The method of mechanical quadratures is proposed as a common approach intended for solving the integral equations defined on finite intervals and containing Cauchy-type singular integrals. This method can be used to solve singular integral equations of the first and second kind, equations with generalized kernel, weakly singular equations, and integro-differential equations. The quadrature rules for several different integrals represented through the same coefficients are presented. This allows one to reduce the integral equations containing integrals of different types to a system of linear algebraic equations.
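
    As a concrete instance (a standard Gauss-Chebyshev, Erdogan-Gupta type discretization for the first-kind dominant equation with index +1; the paper's own quadrature rules cover more equation types), the sketch below collocates at Chebyshev points and reduces the equation to a linear algebraic system; the test problem with f(x) = x has the known density g(t) = t^2 - 1/2.

```python
import numpy as np

def solve_cauchy_sie(f, kernel, n):
    """Mechanical quadrature (Gauss-Chebyshev / Erdogan-Gupta type) for
        (1/pi) * int_{-1}^{1} phi(t)/(t - x) dt + (1/pi) * int k(t, x) phi(t) dt = f(x),
    with phi(t) = g(t)/sqrt(1 - t**2) (index +1).  Returns the Chebyshev nodes t_k and
    the values g(t_k).  The extra row imposes sum g(t_k) = 0, a crack-type
    single-valuedness condition assumed here for definiteness."""
    kk = np.arange(1, n + 1)
    t = np.cos((2 * kk - 1) * np.pi / (2 * n))      # quadrature nodes (zeros of T_n)
    x = np.cos(np.arange(1, n) * np.pi / n)         # collocation points (zeros of U_{n-1})
    A = np.zeros((n, n))
    rhs = np.zeros(n)
    for r, xr in enumerate(x):
        A[r, :] = (1.0 / n) * (1.0 / (t - xr) + kernel(t, xr))
        rhs[r] = f(xr)
    A[n - 1, :] = 1.0                               # closure condition
    rhs[n - 1] = 0.0
    return t, np.linalg.solve(A, rhs)

# dominant equation (zero regular kernel) with f(x) = x; exact density is g(t) = t^2 - 1/2
t, g = solve_cauchy_sie(f=lambda x: x, kernel=lambda t, x: 0.0 * t, n=12)
print("max error vs analytic density:", np.max(np.abs(g - (t**2 - 0.5))))
```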

  8. Color image definition evaluation method based on deep learning method

    Science.gov (United States)

    Liu, Di; Li, YingChun

    2018-01-01

    In order to evaluate different blurring levels of color images and improve image definition evaluation, this paper proposes a method based on a deep learning framework and a BP neural network classification model, and presents a no-reference color image clarity evaluation method. Firstly, the VGG16 net is used as the feature extractor to extract 4,096-dimensional features of the images; then the extracted features and labeled images are used to train a BP neural network, finally achieving color image definition evaluation. The method is evaluated using images from the CSIQ database. The images are blurred at different levels, giving 4,000 images after processing. The 4,000 images are divided into three categories, each category representing a blur level. For each category, 300 out of 400 high-dimensional feature samples are used for training with the VGG16 net and BP neural network, and the remaining 100 samples are used for testing. The experimental results show that the method can take full advantage of the learning and characterization capability of deep learning. In contrast to the major existing image clarity evaluation methods, which manually design and extract features, the method in this paper extracts image features automatically and achieves excellent image quality classification accuracy on the test data set, with an accuracy rate of 96%. Moreover, the predicted quality levels of the original color images are similar to the perception of the human visual system.
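
    A rough sketch of the described pipeline, assuming TensorFlow/Keras, Pillow and scikit-learn are available (the VGG16 ImageNet weights are downloaded on first use); randomly generated images blurred at three levels stand in for the CSIQ-derived data set, and the MLP plays the role of the BP neural network classifier.

```python
import numpy as np
from PIL import Image, ImageFilter
from tensorflow.keras.applications.vgg16 import VGG16, preprocess_input
from tensorflow.keras.models import Model
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# 4096-D features from VGG16's fc2 layer, as described in the abstract
base = VGG16(weights="imagenet", include_top=True)
extractor = Model(inputs=base.input, outputs=base.get_layer("fc2").output)

def features(img):
    x = preprocess_input(np.asarray(img, dtype=np.float32)[None, ...])
    return extractor.predict(x, verbose=0)[0]

# synthetic stand-in for the blurred image set: three blur levels -> labels 0, 1, 2
rng = np.random.default_rng(0)
X, y = [], []
for _ in range(30):                        # small demo set; the paper uses 4,000 images
    base_img = Image.fromarray(rng.integers(0, 256, (224, 224, 3), dtype=np.uint8))
    for label, radius in enumerate([0.5, 2.0, 5.0]):
        X.append(features(base_img.filter(ImageFilter.GaussianBlur(radius))))
        y.append(label)

X, y = np.array(X), np.array(y)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)

# "BP neural network" stage: a small fully connected classifier on the deep features
clf = MLPClassifier(hidden_layer_sizes=(256,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print("blur-level classification accuracy:", clf.score(X_te, y_te))
```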

  9. Comprehensive profiling of retroviral integration sites using target enrichment methods from historical koala samples without an assembled reference genome

    Directory of Open Access Journals (Sweden)

    Pin Cui

    2016-03-01

    Full Text Available Background. Retroviral integration into the host germline results in permanent viral colonization of vertebrate genomes. The koala retrovirus (KoRV) is currently invading the germline of the koala (Phascolarctos cinereus) and provides a unique opportunity for studying retroviral endogenization. Previous analyses of KoRV integration patterns in modern koalas demonstrate that they share integration sites primarily if they are related, indicating that the process is currently driven by vertical transmission rather than infection. However, due to methodological challenges, KoRV integrations have not been comprehensively characterized. Results. To overcome these challenges, we applied and compared three target enrichment techniques coupled with next generation sequencing (NGS) and a newly customized sequence-clustering based computational pipeline to determine the integration sites for 10 museum Queensland and New South Wales (NSW) koala samples collected between the 1870s and the late 1980s. A secondary aim of this study sought to identify common integration sites across modern and historical specimens by comparing our dataset to previously published studies. Several million sequences were processed, and the KoRV integration sites in each koala were characterized. Conclusions. Although the three enrichment methods each exhibited bias in integration site retrieval, a combination of two methods, Primer Extension Capture and hybridization capture, is recommended for future studies on historical samples. Moreover, identification of integration sites shows that the proportion of integration sites shared between any two koalas is quite small.

  10. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    Science.gov (United States)

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated

  11. Comparative analysis of methods for integrating various environmental impacts as a single index in life cycle assessment

    International Nuclear Information System (INIS)

    Ji, Changyoon; Hong, Taehoon

    2016-01-01

    Previous studies have proposed several methods for integrating characterized environmental impacts as a single index in life cycle assessment. Each of them, however, may lead to different results. This study presents internal and external normalization methods, weighting factors proposed by panel methods, and a monetary valuation based on an endpoint life cycle impact assessment method as the integration methods. Furthermore, this study investigates the differences among the integration methods and identifies the causes of the differences through a case study in which five elementary school buildings were used. As a result, when using internal normalization with weighting factors, the weighting factors had a significant influence on the total environmental impacts whereas the normalization had little influence on the total environmental impacts. When using external normalization with weighting factors, the normalization had more significant influence on the total environmental impacts than weighing factors. Due to such differences, the ranking of the five buildings varied depending on the integration methods. The ranking calculated by the monetary valuation method was significantly different from that calculated by the normalization and weighting process. The results aid decision makers in understanding the differences among these integration methods, and, finally, help them select the method most appropriate for the goal at hand.
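
    A small numerical sketch, with made-up impact values, weights and reference values (not data from the case study), of the two normalization routes compared above; the point is that the weighted single index, and hence the ranking of alternatives, can differ between internal and external normalization.

```python
import numpy as np

# characterized impacts of three hypothetical buildings for three impact categories
# rows: buildings; columns: e.g. GWP [kg CO2-eq], AP [kg SO2-eq], EP [kg PO4-eq]
impacts = np.array([[3.2e5, 1.1e3, 2.0e2],
                    [2.9e5, 1.5e3, 1.6e2],
                    [3.6e5, 0.9e3, 2.4e2]])
weights = np.array([0.5, 0.3, 0.2])              # panel-based weights (illustrative)
external_refs = np.array([8.0e3, 5.0e1, 3.0e0])  # external reference values (illustrative)

# internal normalization: divide by the maximum among the alternatives under study
internal_index = (impacts / impacts.max(axis=0)) @ weights

# external normalization: divide by reference values outside the study
external_index = (impacts / external_refs) @ weights

print("single index (internal):", internal_index, "ranking:", np.argsort(internal_index))
print("single index (external):", external_index, "ranking:", np.argsort(external_index))
```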

  12. Comparative analysis of methods for integrating various environmental impacts as a single index in life cycle assessment

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Changyoon, E-mail: changyoon@yonsei.ac.kr; Hong, Taehoon, E-mail: hong7@yonsei.ac.kr

    2016-02-15

    Previous studies have proposed several methods for integrating characterized environmental impacts as a single index in life cycle assessment. Each of them, however, may lead to different results. This study presents internal and external normalization methods, weighting factors proposed by panel methods, and a monetary valuation based on an endpoint life cycle impact assessment method as the integration methods. Furthermore, this study investigates the differences among the integration methods and identifies the causes of the differences through a case study in which five elementary school buildings were used. As a result, when using internal normalization with weighting factors, the weighting factors had a significant influence on the total environmental impacts whereas the normalization had little influence on the total environmental impacts. When using external normalization with weighting factors, the normalization had more significant influence on the total environmental impacts than weighing factors. Due to such differences, the ranking of the five buildings varied depending on the integration methods. The ranking calculated by the monetary valuation method was significantly different from that calculated by the normalization and weighting process. The results aid decision makers in understanding the differences among these integration methods, and, finally, help them select the method most appropriate for the goal at hand.

  13. History based batch method preserving tally means

    International Nuclear Information System (INIS)

    Shim, Hyung Jin; Choi, Sung Hoon

    2012-01-01

    In Monte Carlo (MC) eigenvalue calculations, the sample variance of a tally mean calculated from its cycle-wise estimates is biased because of the inter-cycle correlations of the fission source distribution (FSD). Recently, we proposed a new real variance estimation method, named the history-based batch method, in which a MC run is treated as multiple runs with a small number of histories per cycle to generate independent tally estimates. In this paper, the history-based batch method based on the weight correction is presented to preserve the tally mean from the original MC run. The effectiveness of the new method is examined for the weakly coupled fissile array problem as a function of the dominance ratio and the batch size, in comparison with other available schemes.
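
    A generic sketch of the batching idea (without the weight-correction step that preserves the tally mean): correlated cycle-wise estimates are grouped into batches, and the sample variance of the batch means gives a less biased estimate of the real variance; the AR(1) series below merely mimics inter-cycle correlation of the FSD.

```python
import numpy as np

def batch_variance_of_mean(cycle_estimates, batch_size):
    """Real-variance estimate of the tally mean from correlated cycle-wise estimates,
    obtained by grouping consecutive cycles into batches."""
    x = np.asarray(cycle_estimates)
    n_batches = x.size // batch_size
    batches = x[: n_batches * batch_size].reshape(n_batches, batch_size).mean(axis=1)
    return batches.var(ddof=1) / n_batches

# AR(1)-correlated cycle estimates stand in for correlated MC tallies
rng = np.random.default_rng(1)
rho, n = 0.8, 4000
e = rng.normal(size=n)
x = np.empty(n)
x[0] = e[0]
for i in range(1, n):
    x[i] = rho * x[i - 1] + np.sqrt(1 - rho**2) * e[i]

naive = x.var(ddof=1) / n                      # biased low when cycles are correlated
print("naive:", naive, "batched:", batch_variance_of_mean(x, batch_size=50))
```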

  14. Model-Based Method for Sensor Validation

    Science.gov (United States)

    Vatan, Farrokh

    2012-01-01

    Fault detection, diagnosis, and prognosis are essential tasks in the operation of autonomous spacecraft, instruments, and in situ platforms. One of NASA's key mission requirements is robust state estimation. Sensing, using a wide range of sensors and sensor fusion approaches, plays a central role in robust state estimation, and there is a need to diagnose sensor failure as well as component failure. Sensor validation can be considered to be part of the larger effort of improving reliability and safety. The standard methods for solving the sensor validation problem are based on probabilistic analysis of the system, from which the method based on Bayesian networks is most popular. Therefore, these methods can only predict the most probable faulty sensors, which are subject to the initial probabilities defined for the failures. The method developed in this work is based on a model-based approach and provides the faulty sensors (if any), which can be logically inferred from the model of the system and the sensor readings (observations). The method is also more suitable for systems where it is hard, or even impossible, to find the probability functions of the system. The method starts with a new mathematical description of the problem and develops a very efficient and systematic algorithm for its solution. The method builds on the concepts of analytical redundant relations (ARRs).

  15. Recruitment recommendation system based on fuzzy measure and indeterminate integral

    Science.gov (United States)

    Yin, Xin; Song, Jinjie

    2017-08-01

    In this study, we propose a comprehensive evaluation approach based on the indeterminate integral. By introducing the related concepts of the indeterminate integral and their formulas into the recruitment recommendation system, we can calculate the suitability of each job for different applicants, taking the defined importance of each criterion listed in the job advertisements, the associations between different criteria, and subjective assessments as prerequisites. We can then recommend jobs to the applicants ranked by suitability score from high to low. Finally, we exemplify the usefulness and practicality of this system with samples.

  16. A SysML-based Integration Framework for the Engineering of Mechatronic Systems

    OpenAIRE

    Chami, Muhammad; Seemüller, Holger; Voos, Holger

    2010-01-01

    The engineering discipline of mechatronics is one of the main innovation drivers in industry today. Given the need for an optimal synergetic integration of the involved disciplines, the engineering process of mechatronic systems faces increasing complexity arising from the interdisciplinary nature of these systems. New methods and techniques have to be developed to deal with these challenges. This document presents an approach of a SysML-based integration framework that s...

  17. Leaky Integrate-and-Fire Neuron Circuit Based on Floating-Gate Integrator

    Science.gov (United States)

    Kornijcuk, Vladimir; Lim, Hyungkwang; Seok, Jun Yeong; Kim, Guhyun; Kim, Seong Keun; Kim, Inho; Choi, Byung Joon; Jeong, Doo Seok

    2016-01-01

    The artificial spiking neural network (SNN) is promising and has attracted the attention of the theoretical neuroscience and neuromorphic engineering research communities. In this light, we propose a new type of artificial spiking neuron based on leaky integrate-and-fire (LIF) behavior. A distinctive feature of the proposed FG-LIF neuron is the use of a floating-gate (FG) integrator rather than a capacitor-based one. The relaxation time of the charge on the FG relies mainly on the tunnel barrier profile, e.g., barrier height and thickness (rather than the area). This opens up the possibility of large-scale integration of neurons. The circuit simulations produced biologically plausible spiking activity, with the circuit subject to possible types of noise, e.g., thermal noise and burst noise. The simulation results indicated remarkable distributional features of interspike intervals that are fitted to Gamma distribution functions, similar to biological neurons in the neocortex. PMID:27242416
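    To make the leaky integrate-and-fire behavior concrete, here is a minimal software LIF model with generic textbook parameters; it is not a model of the floating-gate circuit from the paper.

```python
import numpy as np

def simulate_lif(i_input, dt=1e-4, tau=0.02, r_m=1e7, v_rest=0.0, v_th=0.02, v_reset=0.0):
    """Leaky integrate-and-fire: tau * dV/dt = -(V - v_rest) + R_m * I(t); spike and reset at v_th."""
    v = v_rest
    spikes, trace = [], []
    for step, i_t in enumerate(i_input):
        v += (-(v - v_rest) + r_m * i_t) * dt / tau   # leaky integration of the input current
        if v >= v_th:
            spikes.append(step * dt)                  # record spike time
            v = v_reset                               # reset membrane potential
        trace.append(v)
    return np.array(trace), spikes

# A constant 3 nA drive for 0.5 s produces regular spiking; adding noise would spread the ISIs.
current = np.full(5000, 3e-9)
_, spike_times = simulate_lif(current)
print(f"{len(spike_times)} spikes, mean ISI = {np.diff(spike_times).mean():.4f} s")
```

    In the hardware neuron the leak is set by the tunnel barrier rather than an RC time constant, but the integrate, threshold, and reset cycle is the same.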

  18. A discontinuous galerkin time domain-boundary integral method for analyzing transient electromagnetic scattering

    KAUST Repository

    Li, Ping

    2014-07-01

    This paper presents an algorithm hybridizing the discontinuous Galerkin time domain (DGTD) method and a time domain boundary integral (BI) algorithm for 3-D open-region electromagnetic scattering analysis. The computational domain of DGTD is rigorously truncated by analytically evaluating the incoming numerical flux from outside the truncation boundary through the BI method based on Huygens' principle. The advantages of the proposed method are that it allows the truncation boundary to be conformal to arbitrary (convex/concave) scattering objects, and that well-separated scatterers can be truncated by their local meshes without losing the physics (such as coupling/multiple scattering) of the problem, thus reducing the total number of mesh elements. Furthermore, low frequency waves can be efficiently absorbed, and the field outside the truncation domain can be conveniently calculated using the same BI formulation. Numerical examples are benchmarked to demonstrate the accuracy and versatility of the proposed method.

  19. The Use of Conditional Probability Integral Transformation Method for Testing Accelerated Failure Time Models

    Directory of Open Access Journals (Sweden)

    Abdalla Ahmed Abdel-Ghaly

    2016-06-01

    Full Text Available This paper suggests the use of the conditional probability integral transformation (CPIT) method as a goodness of fit (GOF) technique in the field of accelerated life testing (ALT), specifically for validating the underlying distributional assumption in the accelerated failure time (AFT) model. The method is based on transforming the data into independent and identically distributed (i.i.d.) Uniform(0, 1) random variables and then applying the modified Watson statistic to test the uniformity of the transformed random variables. This technique is used to validate each of the exponential, Weibull and lognormal distributions' assumptions in the AFT model under constant stress and complete sampling. The performance of the CPIT method is investigated via a simulation study. It is concluded that this method performs well in case of exponential and lognormal distributions. Finally, a real life example is provided to illustrate the application of the proposed procedure.
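    The core idea can be sketched in a few lines: transform the data through the fitted model's CDF and test the result for uniformity. The modified Watson statistic is not readily available in common libraries, so the Kolmogorov-Smirnov test is used below purely as a stand-in, and the simple unconditional transform shown here is a simplification of the conditional transformation the paper actually uses.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.exponential(scale=2.0, size=200)        # simulated failure times

# Probability integral transformation under the fitted exponential model.
scale_hat = x.mean()                            # MLE of the exponential scale
u = 1.0 - np.exp(-x / scale_hat)                # roughly Uniform(0, 1) if the model fits

# The paper applies a modified Watson statistic; the KS test serves only as a simple stand-in here.
stat, p_value = stats.kstest(u, "uniform")
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
```

    A small p-value would indicate that the assumed lifetime distribution is inconsistent with the observed failure times.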

  20. Modular Architecture for Integrated Model-Based Decision Support.

    Science.gov (United States)

    Gaebel, Jan; Schreiber, Erik; Oeser, Alexander; Oeltze-Jafra, Steffen

    2018-01-01

    Model-based decision support systems promise to be a valuable addition to oncological treatments and the implementation of personalized therapies. For the integration and sharing of decision models, the involved systems must be able to communicate with each other. In this paper, we propose a modularized architecture of dedicated systems for the integration of probabilistic decision models into existing hospital environments. These systems interconnect via web services and provide model sharing and processing capabilities for clinical information systems. Along the lines of IHE integration profiles from other disciplines and the meaningful reuse of routinely recorded patient data, our approach aims for the seamless integration of decision models into hospital infrastructure and the physicians' daily work.

  1. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work-flows. The use of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.

  2. A highly accurate boundary integral equation method for surfactant-laden drops in 3D

    Science.gov (United States)

    Sorgentone, Chiara; Tornberg, Anna-Karin

    2018-05-01

    The presence of surfactants alters the dynamics of viscous drops immersed in an ambient viscous fluid. This is specifically true at small scales, such as in applications of droplet based microfluidics, where the interface dynamics become of increased importance. At such small scales, viscous forces dominate and inertial effects are often negligible. Considering Stokes flow, a numerical method based on a boundary integral formulation is presented for simulating 3D drops covered by an insoluble surfactant. The method is able to simulate drops with different viscosities and close interactions, automatically controlling the time step size and maintaining high accuracy even when substantial drop deformation occurs. To achieve this, the drop surfaces as well as the surfactant concentration on each surface are represented by spherical harmonics expansions. A novel reparameterization method is introduced to ensure a high-quality representation of the drops also under deformation, specialized quadrature methods for singular and nearly singular integrals that appear in the formulation are invoked, and the adaptive time stepping scheme for the coupled drop and surfactant evolution is designed with a preconditioned implicit treatment of the surfactant diffusion.

  3. Social Ecology of Asthma: Engaging Stakeholders in Integrating Health Behavior Theories and Practice-Based Evidence through Systems Mapping

    Science.gov (United States)

    Gillen, Emily M.; Hassmiller Lich, Kristen; Yeatts, Karin B.; Hernandez, Michelle L.; Smith, Timothy W.; Lewis, Megan A.

    2014-01-01

    This article describes a process for integrating health behavior and social science theories with practice-based insights using participatory systems thinking and diagramming methods largely inspired by system dynamics methods. This integration can help close the gap between research and practice in health education and health behavior by offering…

  4. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    Science.gov (United States)

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  5. [A web-based integrated clinical database for laryngeal cancer].

    Science.gov (United States)

    E, Qimin; Liu, Jialin; Li, Yong; Liang, Chuanyu

    2014-08-01

    To establish an integrated database for laryngeal cancer and to provide an information platform for clinical and fundamental research on laryngeal cancer; the database should also meet the needs of clinical and scientific use. Under the guidance of clinical experts, we constructed a web-based integrated clinical database for laryngeal carcinoma on the basis of clinical data standards, Apache+PHP+MySQL technology, laryngeal cancer specialist characteristics, and tumor genetic information. A web-based integrated clinical database for laryngeal carcinoma has been developed. This database has a user-friendly interface, and data can be entered and queried conveniently. In addition, the system uses clinical data standards and exchanges information with the existing electronic medical record system to avoid information silos. Furthermore, the database forms integrate laryngeal cancer specialist characteristics and tumor genetic information. The web-based integrated clinical database for laryngeal carcinoma has comprehensive specialist information, strong expandability, and high technical feasibility, and it conforms to the clinical characteristics of the laryngeal cancer specialty. By using clinical data standards and structured handling of clinical data, the database can better meet the needs of scientific research and facilitate information exchange, and the information collected about tumor patients is highly informative.

  6. Ship Integration of Energy Scavenging Technology for Sea Base Operations

    Science.gov (United States)

    2009-07-01

    Lastly, wind turbines are a promising method of harvesting energy; horizontal axis wind turbines are currently the most developed...

  7. Integrated Ecological River Health Assessments, Based on Water Chemistry, Physical Habitat Quality and Biological Integrity

    Directory of Open Access Journals (Sweden)

    Ji Yoon Kim

    2015-11-01

    Full Text Available This study evaluated integrative river ecosystem health using stressor-based models of physical habitat health, chemical water health, and the biological health of fish, and identified multiple-stressor indicators influencing the ecosystem health. Integrated health responses (IHRs), based on a star-plot approach, were calculated from the qualitative habitat evaluation index (QHEI), nutrient pollution index (NPI), and index of biological integrity (IBI) in four different longitudinal regions (Groups I–IV). For the calculation of IHR values, multi-metric QHEI, NPI, and IBI models were developed and their criteria for the diagnosis of health were determined. The longitudinal patterns of the river were analyzed by a self-organizing map (SOM) model and the key major stressors in the river were identified by principal component analysis (PCA). Our model scores of integrated health responses (IHRs) suggested that the mid-stream and downstream regions were impaired, and the key stressors were closely associated with nutrient enrichment (N and P) and organic matter pollution from domestic wastewater disposal plants and urban sewage. This modeling approach of IHRs may be used as an effective tool for evaluations of integrative ecological river health.
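    The record describes the star-plot combination only qualitatively; the rough sketch below stands in for it by scaling the three sub-indices to a common range and averaging them into one score. The scaling ranges and site values are invented for illustration and the area-based star-plot calculation is not reproduced.

```python
import numpy as np

def integrated_health_response(qhei, npi, ibi, qhei_max=200, npi_max=8, ibi_max=50):
    """Scale QHEI, NPI and IBI to [0, 1] and average them into one IHR score.
    NPI is inverted because higher nutrient pollution means worse health."""
    scaled = np.array([qhei / qhei_max, 1.0 - npi / npi_max, ibi / ibi_max])
    return scaled.mean()

# Hypothetical upstream vs. downstream sites.
print(f"upstream   IHR = {integrated_health_response(qhei=165, npi=1.5, ibi=42):.2f}")
print(f"downstream IHR = {integrated_health_response(qhei=90,  npi=6.0, ibi=18):.2f}")
```

    A lower integrated score for the downstream site mirrors the impairment pattern the study reports for the mid- and downstream regions.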

  8. Piloting a method to evaluate the implementation of integrated water ...

    African Journals Online (AJOL)

    A methodology with a set of principles, change areas and measures was developed as a performance assessment tool. ... Keywords: Integrated water resource management, Inkomati River Basin, South Africa, Swaziland ...

  9. A joint classification method to integrate scientific and social networks

    NARCIS (Netherlands)

    Neshati, Mahmood; Asgari, Ehsaneddin; Hiemstra, Djoerd; Beigy, Hamid

    In this paper, we address the problem of scientific-social network integration to find a matching relationship between members of these networks. Utilizing several name similarity patterns and contextual properties of these networks, we design a focused crawler to find highly probable matching pairs,
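    The crawler itself is not described further in this record; a simple name-similarity check of the kind such matching typically relies on (illustrative only, not the patterns used in the paper) is:

```python
from difflib import SequenceMatcher

def name_similarity(a: str, b: str) -> float:
    """Crude string similarity in [0, 1] between two author/member names."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

candidates = [("M. Neshati", "Mahmood Neshati"), ("D. Hiemstra", "Djoerd Hiemstra"), ("J. Smith", "Jane Doe")]
for sci_name, social_name in candidates:
    score = name_similarity(sci_name, social_name)
    verdict = "candidate match" if score > 0.6 else "unlikely"
    print(f"{sci_name!r} vs {social_name!r}: {score:.2f} -> {verdict}")
```

    In practice such string scores would be combined with contextual evidence (affiliations, co-authors, shared links) before declaring a match across the two networks.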

  10. Fringe integral equation method for a truncated grounded dielectric slab

    DEFF Research Database (Denmark)

    Jørgensen, Erik; Maci, S.; Toccafondi, A.

    2001-01-01

    The problem of scattering by a semi-infinite grounded dielectric slab illuminated by an arbitrary incident TMz polarized electric field is studied by solving a new set of “fringe” integral equations (F-IEs), whose functional unknowns are physically associated with the wave diffraction processes…

  11. Fourier path-integral Monte Carlo methods: Partial averaging

    International Nuclear Information System (INIS)

    Doll, J.D.; Coalson, R.D.; Freeman, D.L.

    1985-01-01

    Monte Carlo Fourier path-integral techniques are explored. It is shown that fluctuation renormalization techniques provide an effective means for treating the effects of high-order Fourier contributions. The resulting formalism is rapidly convergent, is computationally convenient, and has potentially useful variational aspects

  12. Writing Integrative Reviews of the Literature: Methods and Purposes

    Science.gov (United States)

    Torraco, Richard J.

    2016-01-01

    This article discusses the integrative review of the literature as a distinctive form of research that uses existing literature to create new knowledge. As an expansion and update of a previously published article on this topic, it acknowledges the growth and appeal of this form of research to scholars, it identifies the main components of the…

  13. Integral reactor system and method for fuel cells

    Science.gov (United States)

    Fernandes, Neil Edward; Brown, Michael S; Cheekatamarla, Praveen; Deng, Thomas; Dimitrakopoulos, James; Litka, Anthony F

    2013-11-19

    A reactor system is integrated internally within an anode-side cavity of a fuel cell. The reactor system is configured to convert hydrocarbons to smaller species while mitigating the production of solid carbon. The reactor system may incorporate one or more of a pre-reforming section, an anode exhaust gas recirculation device, and a reforming section.

  14. Spectrum estimation method based on marginal spectrum

    International Nuclear Information System (INIS)

    Cai Jianhua; Hu Weiwen; Wang Xianchun

    2011-01-01

    The FFT method cannot meet the basic requirements of power spectrum estimation for non-stationary and short signals. A new spectrum estimation method based on the marginal spectrum from the Hilbert-Huang transform (HHT) was proposed. The procedure for obtaining the marginal spectrum in the HHT method was given, and the linear property of the marginal spectrum was demonstrated. Compared with the FFT method, the physical meaning and the frequency resolution of the marginal spectrum were further analyzed. Then the Hilbert spectrum estimation algorithm was discussed in detail, and simulation results were given. Theory and simulation show that, for short and non-stationary signals, the frequency resolution and estimation precision of the HHT method are better than those of the FFT method. (authors)
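    Empirical mode decomposition itself requires a dedicated package, so the sketch below assumes the intrinsic mode functions (IMFs) are already available and only shows how a marginal spectrum can be accumulated from the instantaneous amplitude and frequency obtained via the Hilbert transform; the binning scheme is a simplification for illustration.

```python
import numpy as np
from scipy.signal import hilbert

def marginal_spectrum(imfs, fs, n_bins=256):
    """Accumulate Hilbert-spectrum amplitude over time into a marginal spectrum.
    imfs: array of shape (n_imfs, n_samples) produced by EMD (not computed here)."""
    freq_edges = np.linspace(0.0, fs / 2.0, n_bins + 1)
    spectrum = np.zeros(n_bins)
    for imf in imfs:
        analytic = hilbert(imf)
        amplitude = np.abs(analytic)[:-1]
        phase = np.unwrap(np.angle(analytic))
        inst_freq = np.diff(phase) * fs / (2.0 * np.pi)          # instantaneous frequency
        idx = np.clip(np.digitize(inst_freq, freq_edges) - 1, 0, n_bins - 1)
        np.add.at(spectrum, idx, amplitude)                       # integrate amplitude over time per bin
    centers = 0.5 * (freq_edges[:-1] + freq_edges[1:])
    return centers, spectrum / fs

# Example with a two-tone signal whose components serve as ready-made "IMFs".
fs = 1000
t = np.arange(0, 1, 1 / fs)
imfs = np.vstack([np.sin(2 * np.pi * 50 * t), 0.5 * np.sin(2 * np.pi * 120 * t)])
f, h = marginal_spectrum(imfs, fs)
print("dominant frequencies:", f[np.argsort(h)[-2:]])
```

    Unlike the FFT, the marginal spectrum is built from instantaneous frequencies, which is why it remains meaningful for short and non-stationary records.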

  15. Reconstruction of biological networks based on life science data integration

    Directory of Open Access Journals (Sweden)

    Kormeier Benjamin

    2010-06-01

    Full Text Available For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we introduce the applications BioDWH - an integration toolkit for building life science data warehouses, CardioVINEdb - an information system for biological data on cardiovascular disease, and VANESA - a network editor for modeling and simulation of biological networks. Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease-related gene-regulated biological network is also presented.

  16. Reconstruction of biological networks based on life science data integration.

    Science.gov (United States)

    Kormeier, Benjamin; Hippe, Klaus; Arrigo, Patrizio; Töpel, Thoralf; Janowski, Sebastian; Hofestädt, Ralf

    2010-10-27

    For the implementation of the virtual cell, the fundamental question is how to model and simulate complex biological networks. Therefore, based on relevant molecular databases and information systems, biological data integration is an essential step in constructing biological networks. In this paper, we introduce the applications BioDWH--an integration toolkit for building life science data warehouses, CardioVINEdb--an information system for biological data on cardiovascular disease, and VANESA--a network editor for modeling and simulation of biological networks. Based on this integration process, the system supports the generation of biological network models. A case study of a cardiovascular-disease-related gene-regulated biological network is also presented.

  17. ADVANCING THE STUDY OF VIOLENCE AGAINST WOMEN USING MIXED METHODS: INTEGRATING QUALITATIVE METHODS INTO A QUANTITATIVE RESEARCH PROGRAM

    Science.gov (United States)

    Testa, Maria; Livingston, Jennifer A.; VanZile-Tamsen, Carol

    2011-01-01

    A mixed methods approach, combining quantitative with qualitative data methods and analysis, offers a promising means of advancing the study of violence. Integrating semi-structured interviews and qualitative analysis into a quantitative program of research on women’s sexual victimization has resulted in valuable scientific insight and generation of novel hypotheses for testing. This mixed methods approach is described and recommendations for integrating qualitative data into quantitative research are provided. PMID:21307032

  18. Integration of capillary electrophoresis with gold nanoparticle-based colorimetry.

    Science.gov (United States)

    Li, Tong; Wu, Zhenglong; Qin, Weidong

    2017-12-01

    A method integrating capillary electrophoresis (CE) and gold nanoparticle aggregation-based colorimetry (AuNP-ABC) was described. By using a dual-sheath interface, the running buffer was isolated from the colorimetric reaction solution so that CE and AuNP-ABC would not interfere with each other. The proof-of-concept was validated by assay of polyamidoamine (PAMAM) dendrimers that were fortified in human urine samples. The factors influencing the CE-AuNP-ABC performances were investigated and optimized. Under the optimal conditions, the dendrimers were separated within 8 min, with detection limits of 0.5, 1.2 and 2.6 μg mL⁻¹ for PAMAM G1.0, G2.0 and G3.0, respectively. The sensitivity of CE-AuNP-ABC was comparable to or even better than those of liquid chromatography-fluorimetry and liquid chromatography-mass spectrometry. The results suggested that the proposed strategy can be applied to facile and quick determination of analytes of similar properties in complex matrices. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. Integrating Usage Control with SIP-Based Communications

    Directory of Open Access Journals (Sweden)

    A. Lakas

    2008-11-01

    Full Text Available The Session Initiation Protocol (SIP) is a signaling protocol used for establishing and maintaining communication sessions involving two or more participants. SIP was initially designed for voice over IP and multimedia conferencing, and was then extended to support other services such as instant messaging and presence management. Today, SIP is also adopted for use with 3G wireless networks, thus becoming an integral protocol for ubiquitous environments. SIP has various methods that support a variety of applications such as subscribing to a service, notification of an event, status update, and location and presence services. However, when it comes to security, the use of wireless and mobile communication technologies and the pervasive nature of this environment introduce higher risks to security than those of the old, simple environment. In this paper, we introduce a new architecture that implements a new type of access control called usage access control (UCON) to control access to SIP-based communication at pre-connection, during connection, and post-connection. This enables subscribers of SIP services to control who can identify their locations, to approve or disapprove their subsequent connections, and to set parameters that determine whether a certain communication can continue or should terminate.
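    The architecture itself is not detailed in this record; the toy sketch below only illustrates the usage-control idea of re-evaluating policy before and during a session. The attributes and policies are hypothetical and the code is not tied to any real SIP stack.

```python
from dataclasses import dataclass

@dataclass
class Session:
    caller: str
    callee_allows_location: bool
    minutes_used: float = 0.0

def pre_decision(session: Session) -> bool:
    # Pre-connection: the subscriber decides who may learn their location / reach them.
    return session.callee_allows_location

def ongoing_decision(session: Session, max_minutes: float = 30.0) -> bool:
    # During the connection: usage attributes are re-evaluated and may force termination.
    return session.minutes_used < max_minutes

s = Session(caller="alice", callee_allows_location=True)
print("connect allowed:", pre_decision(s))
s.minutes_used = 45.0
print("continue allowed:", ongoing_decision(s))
```

    The continuous re-evaluation during the session is what distinguishes usage control from a one-time access-control check at connection setup.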

  20. Hybrid perturbation methods based on statistical time series models

    Science.gov (United States)

    San-Juan, Juan Félix; San-Martín, Montserrat; Pérez, Iván; López, Rosario

    2016-04-01

    In this work we present a new methodology for orbit propagation, the hybrid perturbation theory, based on the combination of an integration method and a prediction technique. The former, which can be a numerical, analytical or semianalytical theory, generates an initial approximation that contains some inaccuracies derived from the fact that, in order to simplify the expressions and subsequent computations, not all the involved forces are taken into account and only low-order terms are considered, not to mention the fact that mathematical models of perturbations do not always reproduce physical phenomena with absolute precision. The prediction technique, which can be based on either statistical time series models or computational intelligence methods, is aimed at modelling and reproducing the missing dynamics in the previously integrated approximation. This combination results in the precision improvement of conventional numerical, analytical and semianalytical theories for determining the position and velocity of any artificial satellite or space debris object. In order to validate this methodology, we present a family of three hybrid orbit propagators formed by the combination of three different orders of approximation of an analytical theory and a statistical time series model, and analyse their capability to process the effect produced by the flattening of the Earth. The three considered analytical components are the integration of the Kepler problem, a first-order and a second-order analytical theory, whereas the prediction technique is the same in the three cases, namely an additive Holt-Winters method.
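    In miniature, the hybrid idea amounts to fitting a time-series model to the residuals between a simplified propagator and reference data and then using its forecasts as a correction. The sketch below assumes statsmodels is available and uses synthetic residuals standing in for the error a low-order analytical theory might leave; it is not the propagator from the paper.

```python
import numpy as np
from statsmodels.tsa.holtwinters import ExponentialSmoothing

# Synthetic residuals: a drift plus a periodic term, mimicking the error left by a simplified theory.
n, period = 400, 40
t = np.arange(n)
residuals = (1e-3 * t + 5e-2 * np.sin(2 * np.pi * t / period)
             + np.random.default_rng(1).normal(0, 5e-3, n))

# Fit an additive Holt-Winters model to the past residuals...
model = ExponentialSmoothing(residuals[:300], trend="add", seasonal="add",
                             seasonal_periods=period).fit()
forecast = model.forecast(100)

# ...and add the predicted residuals to the analytical approximation as a correction.
analytical_output = np.zeros(100)            # placeholder for the analytical theory's prediction
hybrid_output = analytical_output + forecast
print("mean absolute residual left after correction:",
      np.abs(forecast - residuals[300:]).mean())
```

    The remaining error after the correction is what the hybrid propagators in the paper aim to shrink relative to the purely analytical theories.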