WorldWideScience

Sample records for assignment algorithm asset-based

  1. A Method Based on Dial's Algorithm for Multi-time Dynamic Traffic Assignment

    Directory of Open Access Journals (Sweden)

    Rongjie Kuang

    2014-03-01

    Full Text Available Because static traffic assignment reflects actual conditions poorly and fully dynamic traffic assignment can incur excessive computational cost, multi-time dynamic traffic assignment, which combines the static and dynamic approaches, balances precision against cost effectively. This article proposes a method based on Dial's logit algorithm to solve the dynamic stochastic user equilibrium problem in dynamic traffic assignment. First, a fitting function that approximately reflects overloaded link traffic conditions is proposed and used to build the corresponding model. A numerical example illustrates the heuristic procedure of the method and compares its results with those of the same example solved by an algorithm from the literature. The results show that the method based on Dial's algorithm is preferable.
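
    The logit route split at the core of Dial's algorithm can be sketched as follows. This is a minimal illustration, not the paper's model: paths are enumerated explicitly and the dispersion parameter `theta` is an assumed constant.

    ```python
    import math

    def logit_split(path_costs, demand, theta=1.0):
        """Split a fixed travel demand across alternative paths with a
        multinomial logit model: each path's share is proportional to
        exp(-theta * cost). Dial's algorithm computes this kind of split
        implicitly over a network without enumerating paths."""
        weights = [math.exp(-theta * c) for c in path_costs]
        total = sum(weights)
        return [demand * w / total for w in weights]

    # Two equal-cost paths receive equal flow; a cheaper path attracts more.
    flows = logit_split([10.0, 10.0], demand=100.0)
    ```

    Larger `theta` concentrates flow on the cheapest path, while `theta` near zero spreads demand almost uniformly.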

  2. Asset Mapping: A Course Assignment and Community Assessment

    Science.gov (United States)

    Crozier, Mary; Melchior, Florence

    2013-01-01

    Asset mapping is a relatively new data collection strategy to identify services, staff capacity, programs, resources, values, and other protective factors in a geographic area that can be juxtaposed to risk factors when initiating community planning. A substance abuse prevention course for undergraduates added an assignment of assessing community…

  3. An efficient randomized algorithm for contact-based NMR backbone resonance assignment.

    Science.gov (United States)

    Kamisetty, Hetunandan; Bailey-Kellogg, Chris; Pandurangan, Gopal

    2006-01-15

    Backbone resonance assignment is a critical bottleneck in studies of protein structure, dynamics and interactions by nuclear magnetic resonance (NMR) spectroscopy. A minimalist approach to assignment, which we call 'contact-based', seeks to dramatically reduce experimental time and expense by replacing the standard suite of through-bond experiments with the through-space (nuclear Overhauser enhancement spectroscopy, NOESY) experiment. In the contact-based approach, spectral data are represented in a graph with vertices for putative residues (of unknown relation to the primary sequence) and edges for hypothesized NOESY interactions, such that observed spectral peaks could be explained if the residues were 'close enough'. Due to experimental ambiguity, several incorrect edges can be hypothesized for each spectral peak. An assignment is derived by identifying consistent patterns of edges (e.g. for alpha-helices and beta-sheets) within a graph and by mapping the vertices to the primary sequence. The key algorithmic challenge is to be able to uncover these patterns even when they are obscured by significant noise. This paper develops, analyzes and applies a novel algorithm for the identification of polytopes representing consistent patterns of edges in a corrupted NOESY graph. Our randomized algorithm aggregates simplices into polytopes and fixes inconsistencies with simple local modifications, called rotations, that maintain most of the structure already uncovered. In characterizing the effects of experimental noise, we employ an NMR-specific random graph model in proving that our algorithm gives optimal performance in expected polynomial time, even when the input graph is significantly corrupted. We confirm this analysis in simulation studies with graphs corrupted by up to 500% noise. Finally, we demonstrate the practical application of the algorithm on several experimental beta-sheet datasets. 
Our approach is able to eliminate a large majority of noise edges and to

  4. An algorithm for ranking assignments using reoptimization

    DEFF Research Database (Denmark)

    Pedersen, Christian Roed; Nielsen, Lars Relund; Andersen, Kim Allan

    2008-01-01

    We consider the problem of ranking assignments according to cost in the classical linear assignment problem. An algorithm partitioning the set of possible assignments, as suggested by Murty, is presented where, for each partition, the optimal assignment is calculated using a new reoptimization technique. Computational results for the new algorithm are presented...
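
    The ranking task itself can be illustrated by brute force on a toy matrix: enumerate every assignment and return the k cheapest, which is exactly what Murty's partitioning with reoptimization computes without full enumeration. The sketch below is illustrative only and infeasible beyond small n.

    ```python
    from itertools import permutations

    def rank_assignments(cost, k):
        """Return the k cheapest row->column assignments of a square cost
        matrix, in nondecreasing order of cost (brute-force enumeration)."""
        n = len(cost)
        ranked = sorted(
            (sum(cost[i][p[i]] for i in range(n)), p)
            for p in permutations(range(n))
        )
        return ranked[:k]

    # The three cheapest assignments of a 3x3 cost matrix.
    best = rank_assignments([[4, 1, 3], [2, 0, 5], [3, 2, 2]], k=3)
    ```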

  5. Automated sequence-specific protein NMR assignment using the memetic algorithm MATCH

    International Nuclear Information System (INIS)

    Volk, Jochen; Herrmann, Torsten; Wuethrich, Kurt

    2008-01-01

    MATCH (Memetic Algorithm and Combinatorial Optimization Heuristics) is a new memetic algorithm for automated sequence-specific polypeptide backbone NMR assignment of proteins. MATCH employs local optimization for tracing partial sequence-specific assignments within a global, population-based search environment, where the simultaneous application of local and global optimization heuristics guarantees high efficiency and robustness. MATCH thus makes combined use of the two predominant concepts in use for automated NMR assignment of proteins. Dynamic transition and inherent mutation are new techniques that enable automatic adaptation to variable quality of the experimental input data. The concept of dynamic transition is incorporated in all major building blocks of the algorithm, where it enables switching between local and global optimization heuristics at any time during the assignment process. Inherent mutation restricts the intrinsically required randomness of the evolutionary algorithm to those regions of the conformation space that are compatible with the experimental input data. Using intact and artificially deteriorated APSY-NMR input data of proteins, MATCH performed sequence-specific resonance assignment with high efficiency and robustness

  6. Automated backbone assignment of labeled proteins using the threshold accepting algorithm

    International Nuclear Information System (INIS)

    Leutner, Michael; Gschwind, Ruth M.; Liermann, Jens; Schwarz, Christian; Gemmecker, Gerd; Kessler, Horst

    1998-01-01

    The sequential assignment of backbone resonances is the first step in the structure determination of proteins by heteronuclear NMR. For larger proteins, an assignment strategy based on proton side-chain information is no longer suitable for use in an automated procedure. Our program PASTA (Protein ASsignment by Threshold Accepting) is therefore designed to partially or fully automate the sequential assignment of proteins, based on the analysis of NMR backbone resonances plus Cβ information. In order to overcome the problems caused by peak overlap and missing signals in an automated assignment process, PASTA uses threshold accepting, a combinatorial optimization strategy which is superior to simulated annealing due to generally faster convergence and better solutions. The reliability of this algorithm is shown by reproducing the complete sequential backbone assignment of several proteins from published NMR data. The robustness of the algorithm against misassigned signals, noise, spectral overlap and missing peaks is shown by repeating the assignment with reduced sequential information and increased chemical shift tolerances. The performance of the program on real data is finally demonstrated with automatically picked peak lists of human nonpancreatic synovial phospholipase A2, a protein with 124 residues.
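
    Threshold accepting itself is compact: unlike simulated annealing's probabilistic acceptance rule, it deterministically accepts any move whose deterioration stays below the current threshold. The sketch below applies it to a toy linear assignment cost, not to PASTA's NMR scoring; the threshold schedule and swap neighborhood are assumptions.

    ```python
    import random

    def threshold_accepting(cost, thresholds, steps_per_level, seed=0):
        """Minimize the cost of a row->column assignment: repeatedly swap two
        assignments and accept any move whose cost increase is below the
        current threshold, lowering the threshold level by level."""
        rng = random.Random(seed)
        n = len(cost)
        perm = list(range(n))
        def total(p):
            return sum(cost[i][p[i]] for i in range(n))
        current = total(perm)
        for t in thresholds:
            for _ in range(steps_per_level):
                i, j = rng.sample(range(n), 2)
                perm[i], perm[j] = perm[j], perm[i]
                candidate = total(perm)
                if candidate - current < t:      # accept if deterioration < threshold
                    current = candidate
                else:                            # undo the swap
                    perm[i], perm[j] = perm[j], perm[i]
        return perm, current

    perm, final_cost = threshold_accepting(
        [[4, 1, 3], [2, 0, 5], [3, 2, 2]],
        thresholds=[3.0, 1.0, 0.001], steps_per_level=200)
    ```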

  7. Digital asset management.

    Science.gov (United States)

    Humphrey, Clinton D; Tollefson, Travis T; Kriet, J David

    2010-05-01

    Facial plastic surgeons are accumulating massive digital image databases with the evolution of photodocumentation and widespread adoption of digital photography. Managing and maximizing the utility of these vast data repositories, or digital asset management (DAM), is a persistent challenge. Developing a DAM workflow that incorporates a file naming algorithm and metadata assignment will increase the utility of a surgeon's digital images. Copyright 2010 Elsevier Inc. All rights reserved.

  8. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of species migration to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy, but this process often ruins the quality of solutions to the QAP. In this paper, we propose a hybrid technique that overcomes this weakness of the classical BBO algorithm by replacing the mutation operator with a tabu search procedure. Our experiments on the benchmark instances from QAPLIB show that the proposed hybrid method finds good solutions within reasonable computational times. Out of 61 benchmark instances tested, the proposed method obtains the best known solutions for 57 of them.
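
    The tabu search component that replaces mutation can be sketched as follows: explore the pairwise-swap neighborhood of a facility-to-location permutation, forbid recently used swaps for a fixed tenure, and keep the best permutation seen. This is a generic sketch with made-up flow and distance matrices, not the paper's hybrid BBO.

    ```python
    import itertools
    import random

    def qap_cost(perm, flow, dist):
        """QAP objective: sum of flow[i][j] * dist[perm[i]][perm[j]]."""
        n = len(perm)
        return sum(flow[i][j] * dist[perm[i]][perm[j]]
                   for i in range(n) for j in range(n))

    def tabu_search_qap(flow, dist, iters=50, tenure=3, seed=0):
        """Tabu search over the swap neighbourhood with an aspiration rule:
        a tabu swap is still allowed if it beats the best cost seen."""
        rng = random.Random(seed)
        n = len(flow)
        perm = list(range(n))
        rng.shuffle(perm)
        best, best_cost = perm[:], qap_cost(perm, flow, dist)
        tabu = {}                       # swap -> iteration until which it is tabu
        for it in range(iters):
            move, move_cost = None, None
            for i, j in itertools.combinations(range(n), 2):
                perm[i], perm[j] = perm[j], perm[i]
                c = qap_cost(perm, flow, dist)
                perm[i], perm[j] = perm[j], perm[i]
                allowed = tabu.get((i, j), -1) < it or c < best_cost
                if allowed and (move is None or c < move_cost):
                    move, move_cost = (i, j), c
            i, j = move
            perm[i], perm[j] = perm[j], perm[i]
            tabu[(i, j)] = it + tenure
            if move_cost < best_cost:
                best, best_cost = perm[:], move_cost
        return best, best_cost

    flow = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
    dist = [[0, 22, 53, 53], [22, 0, 40, 62], [53, 40, 0, 55], [53, 62, 55, 0]]
    best, best_cost = tabu_search_qap(flow, dist)
    ```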

  9. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of species migration to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy, but this process often ruins the quality of solutions to the QAP. In this paper, we propose a hybrid technique that overcomes this weakness of the classical BBO algorithm by replacing the mutation operator with a tabu search procedure. Our experiments on the benchmark instances from QAPLIB show that the proposed hybrid method finds good solutions within reasonable computational times. Out of 61 benchmark instances tested, the proposed method obtains the best known solutions for 57 of them. PMID:26819585

  10. Dynamic traffic assignment : genetic algorithms approach

    Science.gov (United States)

    1997-01-01

    Real-time route guidance is a promising approach to alleviating congestion on the nation's highways. A dynamic traffic assignment model is central to the development of guidance strategies. The artificial intelligence technique of genetic algorithm...

  11. Tactical Asset Allocation mit Genetischen Algorithmen

    OpenAIRE

    Manuel Ammann; Christian Zenkner

    2003-01-01

    In this study of tactical asset allocation, we use a genetic algorithm to implement a market timing strategy. The algorithm makes a daily decision whether to invest in the market index or in a riskless asset. The market index is represented by the S&P500 Composite Index, the riskless asset by a 3-month T-Bill. The decision of the genetic algorithm is based on fundamental macroeconomic variables. The association of fundamental variables with a set of operators creates a space of possible strat...

  12. Asymmetry in some common assignment algorithms: the dispersion factor solution

    OpenAIRE

    T de la Barra; B Pérez

    1986-01-01

    Many common assignment algorithms are based on Dial's original design to determine the paths that trip makers will follow from a given origin to destination centroids. The purpose of this paper is to show that the rules that have to be applied result in two unwanted properties. The first is that trips assigned from an origin centroid i to a destination centroid j can be dramatically different from those resulting from centroid j to centroid i, even if the number of trips is the same and the network is ...

  13. A robust algorithm to solve the signal setting problem considering different traffic assignment approaches

    Directory of Open Access Journals (Sweden)

    Adacher Ludovica

    2017-12-01

    Full Text Available In this paper we extend a stochastic discrete optimization algorithm to tackle the signal setting problem. Signalized junctions are critical points of an urban transportation network, and the efficiency of their traffic signal settings influences overall network performance. Since road congestion usually arises at or near junctions, improved signal settings contribute to better travel times, driver comfort, fuel efficiency, emissions and safety. In a traffic network, the signal control strategy affects travel times on the roads and influences drivers’ route choice behavior. The paper presents an algorithm for optimizing the signal settings of signalized junctions in a congested road network. The objective function used in this work is a weighted sum of the delays caused by the signalized intersections. We propose an iterative procedure that solves the problem by alternately updating signal settings based on fixed flows and traffic assignment based on fixed signal settings. To show the robustness of our method, we consider two different assignment methods: one based on user equilibrium assignment, well established in the literature as well as in practice, and the other based on a platoon simulation model with vehicular flow propagation and spill-back. Our optimization algorithm is also compared with others well known in the literature for this problem: the surrogate method (SM), particle swarm optimization (PSO) and the genetic algorithm (GA) are compared on a combined problem of global optimization of signal settings and traffic assignment (GOSSTA). Numerical experiments on a real test network are reported.
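
    The alternating procedure described above has a simple skeleton: assign traffic under fixed signal settings, re-optimize settings under the resulting fixed flows, repeat until the objective stops improving. In this sketch, `assign` and `optimize_settings` are invented one-dimensional stand-ins for a traffic-assignment model and a signal-setting optimizer, chosen only so that the loop terminates.

    ```python
    def alternate_optimization(initial_settings, assign, optimize_settings,
                               max_iters=20, tol=1e-6):
        """Alternate between assignment (fixed settings) and signal
        optimization (fixed flows) until the objective stops improving."""
        settings = initial_settings
        prev = float("inf")
        for _ in range(max_iters):
            flows = assign(settings)
            settings, objective = optimize_settings(flows)
            if prev - objective < tol:   # no further improvement
                break
            prev = objective
        return settings, prev

    def assign(s):
        # toy assignment: flow on the approach responds to the green split s
        return 60.0 - 10.0 * s

    def optimize_settings(q):
        # toy optimizer: delay q/s + 100*s is minimized at s = sqrt(q/100)
        s = (q / 100.0) ** 0.5
        return s, q / s + 100.0 * s

    settings, objective = alternate_optimization(0.5, assign, optimize_settings)
    ```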

  14. 12 CFR 615.5210 - Risk-adjusted assets.

    Science.gov (United States)

    2010-01-01

    ... appropriate credit conversion factor in § 615.5212, is assigned to one of the risk categories specified in... risk-based capital requirement for the credit-enhanced assets, the risk-based capital required under..., determine the appropriate risk weight for any asset or credit equivalent amount that does not fit wholly...

  15. ZAP: a distributed channel assignment algorithm for cognitive radio networks

    Directory of Open Access Journals (Sweden)

    Munaretto Anelise

    2011-01-01

    Full Text Available Abstract We propose ZAP, an algorithm for the distributed channel assignment in cognitive radio (CR) networks. CRs are capable of identifying underutilized licensed bands of the spectrum, allowing their reuse by secondary users without interfering with primary users. In this context, efficient channel assignment is challenging as ideally it must be simple, incur acceptable communication overhead, provide timely response, and be adaptive to accommodate frequent changes in the network. Another challenge is the optimization of network capacity through interference minimization. In contrast to related work, ZAP addresses these challenges with a fully distributed approach based only on local (neighborhood) knowledge, while significantly reducing computational costs and the number of messages required for channel assignment. Simulations confirm the efficiency of ZAP in terms of (i) the performance tradeoff between different metrics and (ii) the fast achievement of a suitable assignment solution regardless of network size and density.

  16. Simulated annealing algorithm for solving chambering student-case assignment problem

    Science.gov (United States)

    Ghazali, Saadiah; Abdul-Rahman, Syariza

    2015-12-01

    The project assignment problem is a popular practical problem, and solving it becomes more challenging as preferences, real-world constraints, and problem size grow. This study solves a chambering student-case assignment problem, a variant of the project assignment problem, using a simulated annealing algorithm. The project assignment problem is a hard combinatorial optimization problem, and solving it with a metaheuristic approach is advantageous because a good solution can be returned in reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. Law graduates must read in chambers before they are qualified to become legal counsel, so assigning chambering students to cases is critical, especially when many preferences are involved. Hence, this study presents a preliminary treatment of the proposed project assignment problem. The objective is to minimize the total completion time for all students over the given cases. A minimum-cost greedy heuristic constructs a feasible initial solution, and a simulated annealing algorithm then improves the solution quality. Analysis of the results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic, demonstrating the advantage of solving project assignment problems with metaheuristic techniques.
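
    The improvement phase can be sketched with a generic simulated annealing loop over a permutation encoding: swap two assignments, always accept improvements, and accept worsenings with probability exp(-delta/T) under geometric cooling. The cost matrix, starting solution, and schedule below are assumptions, not the study's data or its greedy construction.

    ```python
    import math
    import random

    def sa_assignment(cost, t0=5.0, cooling=0.95, iters=1500, seed=1):
        """Simulated annealing for a row->column assignment encoded as a
        permutation; returns the best permutation seen and its cost."""
        rng = random.Random(seed)
        n = len(cost)
        perm = list(range(n))
        def total(p):
            return sum(cost[i][p[i]] for i in range(n))
        current = total(perm)
        best, best_cost = perm[:], current
        t = t0
        for _ in range(iters):
            i, j = rng.sample(range(n), 2)
            perm[i], perm[j] = perm[j], perm[i]
            delta = total(perm) - current
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                current += delta                 # accept the swap
                if current < best_cost:
                    best, best_cost = perm[:], current
            else:
                perm[i], perm[j] = perm[j], perm[i]   # undo the swap
            t *= cooling                         # geometric cooling
        return best, best_cost

    best, best_cost = sa_assignment([[9, 2, 7], [6, 4, 3], [5, 8, 1]])
    ```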

  17. AN EVOLUTIONARY ALGORITHM FOR CHANNEL ASSIGNMENT PROBLEM IN WIRELESS MOBILE NETWORKS

    Directory of Open Access Journals (Sweden)

    Yee Shin Chia

    2012-12-01

    Full Text Available The channel assignment problem in wireless mobile networks is the assignment of appropriate frequency spectrum to incoming calls while maintaining a satisfactory level of electromagnetic compatibility (EMC) constraints. An effective channel assignment strategy is important because of the limited capacity of the frequency spectrum in wireless mobile networks. Most existing channel assignment strategies are based on deterministic methods. In this paper, an adaptive genetic algorithm (GA)-based channel assignment strategy is introduced for resource management and to reduce the effect of EMC interference. The most significant advantage of the proposed optimization method is its capability to handle both the reassignment of channels for existing calls and the allocation of a channel to a new incoming call in an adaptive process that maximizes the utility of the limited resources. It can adapt the population size to the number of eligible channels for a particular cell upon new call arrivals to achieve reasonable convergence speed. MATLAB simulations on a 49-cell network model, for both uniform and nonuniform call traffic distributions, showed that the proposed channel optimization method always achieves a lower average blocking probability for new incoming calls than the deterministic channel assignment strategy.

  18. A fuzzy logic algorithm to assign confidence levels to heart and respiratory rate time series

    International Nuclear Information System (INIS)

    Liu, J; McKenna, T M; Gribok, A; Reifman, J; Beidleman, B A; Tharion, W J

    2008-01-01

    We have developed a fuzzy logic-based algorithm to qualify the reliability of heart rate (HR) and respiratory rate (RR) vital-sign time-series data by assigning a confidence level to the data points while they are measured as a continuous data stream. The algorithm's membership functions are derived from physiology-based performance limits and mass-assignment-based data-driven characteristics of the signals. The assigned confidence levels are based on the reliability of each HR and RR measurement as well as the relationship between them. The algorithm was tested on HR and RR data collected from subjects undertaking a range of physical activities, and it showed acceptable performance in detecting four types of faults that result in low-confidence data points (receiver operating characteristic areas under the curve ranged from 0.67 (SD 0.04) to 0.83 (SD 0.03), mean and standard deviation (SD) over all faults). The algorithm is sensitive to noise in the raw HR and RR data and will flag many data points as low confidence if the data are noisy; prior processing of the data to reduce noise allows identification of only the most substantial faults. Depending on how HR and RR data are processed, the algorithm can be applied as a tool to evaluate sensor performance or to qualify HR and RR time-series data in terms of their reliability before use in automated decision-assist systems
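
    The membership-function idea can be sketched with trapezoidal memberships combined by a fuzzy AND (minimum). The breakpoints below are illustrative round numbers, not the physiology-based limits or the mass-assignment characteristics used by the algorithm.

    ```python
    def trapezoid(x, a, b, c, d):
        """Trapezoidal fuzzy membership: 0 outside [a, d], 1 on [b, c],
        linear on the two shoulders."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if x < b:
            return (x - a) / (b - a)
        return (d - x) / (d - c)

    def confidence(hr, rr):
        """Confidence for a paired HR/RR sample: the minimum (fuzzy AND) of
        the two physiologic-plausibility memberships."""
        hr_ok = trapezoid(hr, 25, 40, 180, 220)   # beats per minute
        rr_ok = trapezoid(rr, 2, 5, 45, 60)       # breaths per minute
        return min(hr_ok, rr_ok)
    ```

    A normal resting pair such as HR 70, RR 15 receives full confidence, while an implausible heart rate drives the combined confidence toward zero regardless of the respiratory rate.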

  19. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing.

    Science.gov (United States)

    Wang, Zhaocai; Pu, Jun; Cao, Liling; Tan, Jian

    2015-10-23

    The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m &lt; n), such that a minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP) complete problem in operation management and applied mathematics, having numerous real life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and get the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneity to simplify the complexity of the computation.

  20. Application of a fast sorting algorithm to the assignment of mass spectrometric cross-linking data.

    Science.gov (United States)

    Petrotchenko, Evgeniy V; Borchers, Christoph H

    2014-09-01

    Cross-linking combined with MS involves enzymatic digestion of cross-linked proteins and identifying cross-linked peptides. Assignment of cross-linked peptide masses requires a search of all possible binary combinations of peptides from the cross-linked proteins' sequences, which becomes impractical with increasing complexity of the protein system and/or if digestion enzyme specificity is relaxed. Here, we describe the application of a fast sorting algorithm to search large sequence databases for cross-linked peptide assignments based on mass. This same algorithm has been used previously for assigning disulfide-bridged peptides (Choi et al.), but has not previously been applied to cross-linking studies. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
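
    The sorted-search idea can be sketched as follows: instead of testing all binary combinations, sort the candidate peptide masses once and sweep two pointers to find pairs whose sum (plus the cross-linker mass) matches an observed precursor mass within a tolerance. The masses, linker mass, and tolerance below are made-up numbers, not the published tool's chemistry.

    ```python
    def find_crosslink_pairs(peptide_masses, target, linker_mass, tol=0.01):
        """Return pairs (m1, m2), m1 <= m2, with m1 + m2 + linker_mass
        within tol of the target mass, via sort plus two pointers.
        lo == hi is allowed, so a peptide may pair with itself."""
        masses = sorted(peptide_masses)
        want = target - linker_mass
        pairs = []
        lo, hi = 0, len(masses) - 1
        while lo <= hi:
            s = masses[lo] + masses[hi]
            if abs(s - want) <= tol:
                pairs.append((masses[lo], masses[hi]))
                hi -= 1            # keep scanning for further matches
            elif s < want:
                lo += 1
            else:
                hi -= 1
        return pairs

    # Precursor consistent with 500 + 1000 (or 700 + 800) plus the linker.
    pairs = find_crosslink_pairs([500.0, 700.0, 800.0, 1000.0],
                                 target=1638.07, linker_mass=138.07)
    ```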

  1. Performance of multiobjective computational intelligence algorithms for the routing and wavelength assignment problem

    Directory of Open Access Journals (Sweden)

    Jorge Patiño

    2016-01-01

    Full Text Available This paper presents a performance evaluation of computational intelligence algorithms based on multiobjective theory for solving the Routing and Wavelength Assignment (RWA) problem in optical networks. The study evaluates the firefly algorithm, the differential evolution algorithm, the simulated annealing algorithm and two versions of the particle swarm optimization algorithm. The paper provides a description of the multiobjective algorithms and then evaluates their performance against mono-objective approaches under different traffic loads, different numbers of wavelengths, and wavelength conversion over the NSFNet topology. Simulation results show that mono-objective algorithms properly solve the RWA problem for low data traffic and few wavelengths. However, the multiobjective approaches adapt better to online traffic as the number of wavelengths available in the network increases and when wavelength conversion is implemented in the nodes.

  2. Study on store-space assignment based on logistic AGV in e-commerce goods to person picking pattern

    Science.gov (United States)

    Xu, Lijuan; Zhu, Jie

    2017-10-01

    This paper studies store-space assignment based on logistic AGVs in the e-commerce goods-to-person picking pattern. It establishes a store-space assignment model that minimizes picking cost and designs a store-space assignment algorithm that follows a cluster analysis based on similarity coefficients. An example analysis then compares the picking cost of the proposed algorithm with assignment by item number and with storage by ABC classification, verifying the effectiveness of the proposed store-space assignment algorithm.

  3. Fast and Rigorous Assignment Algorithm Multiple Preference and Calculation

    Directory of Open Access Journals (Sweden)

    Ümit Çiftçi

    2010-03-01

    Full Text Available The goal of this paper is to develop an algorithm that evaluates students and then places them in their desired choices according to their ordered preferences. The developed algorithm is also used to implement software. The success and accuracy of the software, as well as of the algorithm, were tested by applying it to the ability test at Beykent University. This ability test is repeated several times in order to fill all available places in the Fine Arts Faculty departments each academic year. The algorithm has been shown to be fast and rigorous through its application in the 2008-2009 and 2009-2010 academic years. Key Words: Assignment algorithm, student placement, ability test

  4. Ant Colony Algorithm and Simulation for Robust Airport Gate Assignment

    Directory of Open Access Journals (Sweden)

    Hui Zhao

    2014-01-01

    Full Text Available Airport gate assignment is a core task of airport ground operations. Because flight departure and arrival times may be influenced by many random factors, a gate assignment scheme may encounter gate conflicts and other problems. This paper aims at finding a robust solution to the airport gate assignment problem. A mixed-integer model is proposed to formulate the problem, and an ant colony algorithm is designed to solve this model. Simulation results show that, with robustness taken into account, the assignment scheme's resistance to disturbances is much improved.

  5. 12 CFR 615.5211 - Risk categories-balance sheet assets.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Risk categories-balance sheet assets. 615.5211...—balance sheet assets. Section 615.5210(c) specifies certain balance sheet assets that are not assigned to the risk categories set forth below. All other balance sheet assets are assigned to the percentage...

  6. Algorithms for selecting informative marker panels for population assignment.

    Science.gov (United States)

    Rosenberg, Noah A

    2005-11-01

    Given a set of potential source populations, genotypes of an individual of unknown origin at a collection of markers can be used to predict the correct source population of the individual. For improved efficiency, informative markers can be chosen from a larger set of markers to maximize the accuracy of this prediction. However, selecting the loci that are individually most informative does not necessarily produce the optimal panel. Here, using genotypes from eight species--carp, cat, chicken, dog, fly, grayling, human, and maize--this univariate accumulation procedure is compared to new multivariate "greedy" and "maximin" algorithms for choosing marker panels. The procedures generally suggest similar panels, although the greedy method often recommends inclusion of loci that are not chosen by the other algorithms. In seven of the eight species, when applied to five or more markers, all methods achieve at least 94% assignment accuracy on simulated individuals, with one species--dog--producing this level of accuracy with only three markers, and the eighth species--human--requiring approximately 13-16 markers. The new algorithms produce substantial improvements over use of randomly selected markers; where differences among the methods are noticeable, the greedy algorithm leads to slightly higher probabilities of correct assignment. Although none of the approaches necessarily chooses the panel with optimal performance, the algorithms all likely select panels with performance near enough to the maximum that they all are suitable for practical use.
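
    The greedy procedure can be sketched with a generic forward-selection loop over a set-valued score. The toy `info` and `overlap` values below are invented; in the study, a panel's score would be assignment accuracy computed from genotype data. The example also shows why the jointly best panel can differ from the top-k individually informative markers.

    ```python
    def greedy_panel(candidates, score, k):
        """Greedy forward selection: starting from the empty panel,
        repeatedly add the marker that most increases the panel's score
        until k markers are chosen."""
        panel = []
        for _ in range(k):
            best = max((m for m in candidates if m not in panel),
                       key=lambda m: score(panel + [m]))
            panel.append(best)
        return panel

    # Toy score with diminishing returns: A and B are largely redundant.
    info = {"A": 0.9, "B": 0.8, "C": 0.5}
    overlap = {("A", "B"): 0.7}

    def score(panel):
        s = sum(info[m] for m in panel)
        for pair, r in overlap.items():
            if all(m in panel for m in pair):
                s -= r
        return s

    panel = greedy_panel(list(info), score, k=2)
    ```

    Although B is individually more informative than C, the greedy panel picks A then C, because B adds little on top of the redundant A.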

  7. A Parallel Biological Optimization Algorithm to Solve the Unbalanced Assignment Problem Based on DNA Molecular Computing

    Directory of Open Access Journals (Sweden)

    Zhaocai Wang

    2015-10-01

    Full Text Available The unbalanced assignment problem (UAP) is to optimally resolve the problem of assigning n jobs to m individuals (m &lt; n), such that a minimum cost or maximum profit is obtained. It is a vitally important Non-deterministic Polynomial (NP) complete problem in operation management and applied mathematics, having numerous real life applications. In this paper, we present a new parallel DNA algorithm for solving the unbalanced assignment problem using DNA molecular operations. We reasonably design flexible-length DNA strands representing different jobs and individuals, take appropriate steps, and get the solutions of the UAP in the proper length range and O(mn) time. We extend the application of DNA molecular operations and simultaneity to simplify the complexity of the computation.

  8. Computer Based Asset Management System For Commercial Banks

    Directory of Open Access Journals (Sweden)

    Amanze

    2015-08-01

    Full Text Available ABSTRACT The Computer-based Asset Management System is a web-based system that allows commercial banks to keep track of their assets. The main advantages of this system are effective asset management through record keeping and efficient information retrieval. In this research, information was gathered to define the requirements of the new application and to examine how commercial banks manage their assets.

  9. ZAP: a distributed channel assignment algorithm for cognitive radio networks

    OpenAIRE

    Junior, Paulo Roberto; Fonseca, Mauro; Munaretto, Anelise; Viana, Aline; Ziviani, Artur

    2011-01-01

    Abstract We propose ZAP, an algorithm for the distributed channel assignment in cognitive radio (CR) networks. CRs are capable of identifying underutilized licensed bands of the spectrum, allowing their reuse by secondary users without interfering with primary users. In this context, efficient channel assignment is challenging as ideally it must be simple, incur acceptable communication overhead, provide timely response, and be adaptive to accommodate frequent changes in the network. Another ...

  10. Resonance assignment of the NMR spectra of disordered proteins using a multi-objective non-dominated sorting genetic algorithm

    International Nuclear Information System (INIS)

    Yang, Yu; Fritzsching, Keith J.; Hong, Mei

    2013-01-01

    A multi-objective genetic algorithm is introduced to predict the assignment of protein solid-state NMR (SSNMR) spectra with partial resonance overlap and missing peaks due to broad linewidths, molecular motion, and low sensitivity. This non-dominated sorting genetic algorithm II (NSGA-II) aims to identify all possible assignments that are consistent with the spectra and to compare the relative merit of these assignments. Our approach is modeled after the recently introduced Monte-Carlo simulated-annealing (MC/SA) protocol, with the key difference that NSGA-II simultaneously optimizes multiple assignment objectives instead of searching for possible assignments based on a single composite score. The multiple objectives include maximizing the number of consistently assigned peaks between multiple spectra (“good connections”), maximizing the number of used peaks, minimizing the number of inconsistently assigned peaks between spectra (“bad connections”), and minimizing the number of assigned peaks that have no matching peaks in the other spectra (“edges”). Using six SSNMR protein chemical shift datasets with varying levels of imperfection that was introduced by peak deletion, random chemical shift changes, and manual peak picking of spectra with moderately broad linewidths, we show that the NSGA-II algorithm produces a large number of valid and good assignments rapidly. For high-quality chemical shift peak lists, NSGA-II and MC/SA perform similarly well. However, when the peak lists contain many missing peaks that are uncorrelated between different spectra and have chemical shift deviations between spectra, the modified NSGA-II produces a larger number of valid solutions than MC/SA, and is more effective at distinguishing good from mediocre assignments by avoiding the hazard of suboptimal weighting factors for the various objectives. These two advantages, namely diversity and better evaluation, lead to a higher probability of predicting the correct
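
    The ranking step at the core of NSGA-II, non-dominated sorting, can be sketched on small objective vectors. Everything is treated as minimization here, so a maximized objective such as the number of "good connections" would be negated first; the vectors below are illustrative, not NMR scores.

    ```python
    def non_dominated_sort(points):
        """Partition objective vectors (all minimized) into successive
        Pareto fronts; the first front holds the non-dominated vectors."""
        def dominates(p, q):
            return (all(a <= b for a, b in zip(p, q))
                    and any(a < b for a, b in zip(p, q)))
        remaining = list(range(len(points)))
        fronts = []
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(points[j], points[i])
                                for j in remaining)]
            fronts.append(front)
            remaining = [i for i in remaining if i not in front]
        return fronts

    # Two mutually non-dominated vectors form the first front.
    fronts = non_dominated_sort([(1, 2), (2, 1), (2, 2), (3, 3)])
    ```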

  11. Resonance assignment of the NMR spectra of disordered proteins using a multi-objective non-dominated sorting genetic algorithm.

    Science.gov (United States)

    Yang, Yu; Fritzsching, Keith J; Hong, Mei

    2013-11-01

    A multi-objective genetic algorithm is introduced to predict the assignment of protein solid-state NMR (SSNMR) spectra with partial resonance overlap and missing peaks due to broad linewidths, molecular motion, and low sensitivity. This non-dominated sorting genetic algorithm II (NSGA-II) aims to identify all possible assignments that are consistent with the spectra and to compare the relative merit of these assignments. Our approach is modeled after the recently introduced Monte-Carlo simulated-annealing (MC/SA) protocol, with the key difference that NSGA-II simultaneously optimizes multiple assignment objectives instead of searching for possible assignments based on a single composite score. The multiple objectives include maximizing the number of consistently assigned peaks between multiple spectra ("good connections"), maximizing the number of used peaks, minimizing the number of inconsistently assigned peaks between spectra ("bad connections"), and minimizing the number of assigned peaks that have no matching peaks in the other spectra ("edges"). Using six SSNMR protein chemical shift datasets with varying levels of imperfection that was introduced by peak deletion, random chemical shift changes, and manual peak picking of spectra with moderately broad linewidths, we show that the NSGA-II algorithm produces a large number of valid and good assignments rapidly. For high-quality chemical shift peak lists, NSGA-II and MC/SA perform similarly well. However, when the peak lists contain many missing peaks that are uncorrelated between different spectra and have chemical shift deviations between spectra, the modified NSGA-II produces a larger number of valid solutions than MC/SA, and is more effective at distinguishing good from mediocre assignments by avoiding the hazard of suboptimal weighting factors for the various objectives. These two advantages, namely diversity and better evaluation, lead to a higher probability of predicting the correct assignment for a
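The core machinery NSGA-II relies on is fast non-dominated sorting of candidate assignments by their objective vectors. A minimal stdlib-only sketch of that sorting step (generic, not the authors' code; the objective tuples below are invented stand-ins for scores such as "good connections" and "used peaks", both maximized):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and better in at least one.
    All objectives are to be maximized here."""
    return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Return Pareto fronts as lists of indices, front 0 being the best."""
    n = len(points)
    dominated_by = [[] for _ in range(n)]   # indices that i dominates
    dom_count = [0] * n                     # how many points dominate i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            if dominates(points[i], points[j]):
                dominated_by[i].append(j)
            elif dominates(points[j], points[i]):
                dom_count[i] += 1
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Toy example: (good connections, used peaks) for four candidate assignments.
pts = [(5, 8), (6, 7), (4, 9), (3, 3)]
print(non_dominated_sort(pts))  # three mutually non-dominated solutions, one dominated
```

The first front contains all candidates that no other candidate beats on every objective; NSGA-II then ranks the population front by front instead of collapsing the objectives into one weighted score.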

  12. Portfolio management fees: assets or profits based compensation?

    OpenAIRE

    Gil-Bazo, Javier

    2001-01-01

    This paper compares assets-based portfolio management fees to profits-based fees. Whilst both forms of compensation can provide appropriate risk incentives, fund managers' limited liability induces more excess risk-taking under a profits-based fee contract. On the other hand, an assets-based fee is more costly to investors. In Spain, where the law explicitly permits both forms of retribution, assets-based fees are observed far more frequently. Under this type of compensation, the paper provid...

  13. A Tutorial on Nonlinear Time-Series Data Mining in Engineering Asset Health and Reliability Prediction: Concepts, Models, and Algorithms

    Directory of Open Access Journals (Sweden)

    Ming Dong

    2010-01-01

    Full Text Available The primary objective of engineering asset management is to optimize assets' service-delivery potential and to minimize the related risks and costs over their entire life through the development and application of asset health and usage management, in which health and reliability prediction plays an important role. In real-life situations, where an engineering asset operates under dynamic operational and environmental conditions, its lifetime is generally described by monitored nonlinear time-series data and is subject to high levels of uncertainty and unpredictability. Data mining techniques have proved very useful for extracting relevant features that can be used as parameters for asset diagnosis and prognosis. In this paper, a tutorial on nonlinear time-series data mining in engineering asset health and reliability prediction is given. Besides an overview of health and reliability prediction techniques for engineering assets, the tutorial focuses on concepts, models, algorithms, and applications of hidden Markov models (HMMs) and hidden semi-Markov models (HSMMs) in engineering asset health prognosis, which are representative of recent engineering asset health prediction techniques.
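As a toy illustration of the HMM machinery the tutorial covers, the forward algorithm below scores an observation sequence under a small, entirely hypothetical two-state health model (states, probabilities, and the discretized vibration feature are invented for illustration):

```python
def forward(obs, states, start_p, trans_p, emit_p):
    """Forward algorithm: probability of the observation sequence under the HMM."""
    alpha = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    for o in obs[1:]:
        prev = alpha[-1]
        alpha.append({s: emit_p[s][o] * sum(prev[r] * trans_p[r][s] for r in states)
                      for s in states})
    return sum(alpha[-1].values())

# Hypothetical two-state health model: 'healthy' vs 'degraded' (absorbing),
# emitting a discretized vibration feature 'low' or 'high'.
states = ('healthy', 'degraded')
start_p = {'healthy': 0.9, 'degraded': 0.1}
trans_p = {'healthy': {'healthy': 0.95, 'degraded': 0.05},
           'degraded': {'healthy': 0.0, 'degraded': 1.0}}
emit_p = {'healthy': {'low': 0.8, 'high': 0.2},
          'degraded': {'low': 0.3, 'high': 0.7}}

p = forward(('low', 'low', 'high'), states, start_p, trans_p, emit_p)
print(round(p, 4))
```

In prognosis one would compare such likelihoods (or posterior state probabilities) across competing degradation models; HSMMs extend this by modeling explicit state-duration distributions.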

  14. A branch-and-cut algorithm for the Time Window Assignment Vehicle Routing Problem

    NARCIS (Netherlands)

    K. Dalmeijer (Kevin); R. Spliet (Remy)

    2016-01-01

    This paper presents a branch-and-cut algorithm for the Time Window Assignment Vehicle Routing Problem (TWAVRP), the problem of assigning time windows for delivery before demand volume becomes known. A novel set of valid inequalities, the precedence inequalities, is introduced and

  15. Asset management using genetic algorithm: Evidence from Tehran Stock Exchange

    Directory of Open Access Journals (Sweden)

    Abbas Sarijaloo

    2014-02-01

    Full Text Available This paper presents an empirical investigation of asset management based on the Markowitz theorem. The study uses the information of the 50 best performers on the Tehran Stock Exchange over the period 2006-2009; using the Markowitz theorem, the efficient asset allocations are determined and the results are analyzed. The proposed model has been solved using a genetic algorithm. The results indicate that the Tehran Stock Exchange managed to perform much better than the average world market in most years of the study, especially in 2009. The results of our investigation also indicate that one could reach outstanding results using the GA to form an efficient portfolio.
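A minimal sketch of the idea, assuming a mean-variance (Markowitz-style) objective optimized by a simple generational GA; the return means, covariance matrix, and GA hyperparameters are invented, and the paper's actual model and data differ:

```python
import random

def fitness(w, means, cov, risk_aversion=3.0):
    """Mean-variance objective: expected return minus a risk penalty."""
    ret = sum(wi * m for wi, m in zip(w, means))
    var = sum(w[i] * w[j] * cov[i][j]
              for i in range(len(w)) for j in range(len(w)))
    return ret - risk_aversion * var

def normalize(w):
    """Clip negative weights and rescale so the portfolio weights sum to 1."""
    w = [max(x, 0.0) for x in w]
    s = sum(w)
    return [x / s for x in w] if s else [1.0 / len(w)] * len(w)

def ga_portfolio(means, cov, pop=40, gens=120, seed=1):
    rng = random.Random(seed)
    n = len(means)
    population = [normalize([rng.random() for _ in range(n)]) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=lambda w: fitness(w, means, cov), reverse=True)
        survivors = population[:pop // 2]            # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            # arithmetic crossover plus Gaussian mutation
            child = [(x + y) / 2 + rng.gauss(0, 0.02) for x, y in zip(a, b)]
            children.append(normalize(child))
        population = survivors + children
    return max(population, key=lambda w: fitness(w, means, cov))

# Hypothetical annual return means and covariance for three assets.
means = [0.12, 0.10, 0.04]
cov = [[0.10, 0.02, 0.00],
       [0.02, 0.08, 0.00],
       [0.00, 0.00, 0.01]]
best = ga_portfolio(means, cov)
print([round(w, 2) for w in best])
```

The GA evolves weight vectors on the simplex; the risk-aversion coefficient plays the role of picking one point on the Markowitz efficient frontier.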

  16. Distributed Schemes for Crowdsourcing-Based Sensing Task Assignment in Cognitive Radio Networks

    Directory of Open Access Journals (Sweden)

    Linbo Zhai

    2017-01-01

    Full Text Available Spectrum sensing is an important issue in cognitive radio networks. The unlicensed users can access the licensed wireless spectrum only when the licensed wireless spectrum is sensed to be idle. Since mobile terminals such as smartphones and tablets are popular among people, spectrum sensing can be assigned to these mobile intelligent terminals, which is called the crowdsourcing method. Based on the crowdsourcing method, this paper studies distributed schemes to assign the spectrum sensing task to mobile terminals such as smartphones and tablets. Considering the fact that mobile terminals’ positions may influence the sensing results, a precise sensing effect function is designed for the crowdsourcing-based sensing task assignment. We aim to maximize the sensing effect function and cast this optimization problem to address crowdsensing task assignment in cognitive radio networks. This problem is difficult to solve because its complexity increases exponentially with the number of mobile terminals. To assign the crowdsensing task, we propose four distributed algorithms with different transition probabilities and use a Markov chain to analyze the approximation gap of our proposed schemes. Simulation results evaluate the average performance of our proposed algorithms and validate the algorithms’ convergence.

  17. Identifying asset-based trends in sustainable programmes which ...

    African Journals Online (AJOL)

    We indicate the similarities between the asset-based approach and current discourses focusing on the notion of schools as nodes of support and care.1 We conclude by suggesting that knowledge of asset-based good practices could be shared with families in school-based sessions, thereby developing schools', families' ...

  18. Minimum Interference Channel Assignment Algorithm for Multicast in a Wireless Mesh Network

    Directory of Open Access Journals (Sweden)

    Sangil Choi

    2016-12-01

    Full Text Available Wireless mesh networks (WMNs) have been considered one of the key technologies for the configuration of wireless machines since they emerged. In a WMN, wireless routers provide multi-hop wireless connectivity between hosts in the network and also allow them to access the Internet via gateway devices. Wireless routers are typically equipped with multiple radios operating on different channels to increase network throughput. Multicast is a form of communication that delivers data from a source to a set of destinations simultaneously. It is used in a number of applications, such as distributed games, distance education, and video conferencing. In this study, we address a channel assignment problem for multicast in multi-radio multi-channel WMNs. In a multi-radio multi-channel WMN, two nearby nodes will interfere with each other and cause a throughput decrease when they transmit on the same channel. Thus, an important goal for multicast channel assignment is to reduce the interference among networked devices. We have developed a minimum interference channel assignment (MICA) algorithm for multicast that accurately models the interference relationship between pairs of multicast tree nodes using the concept of the interference factor and assigns channels to tree nodes to minimize interference within the multicast tree. Simulation results show that MICA achieves higher throughput and lower end-to-end packet delay compared with an existing channel assignment algorithm named multi-channel multicast (MCM). In addition, MICA achieves much lower throughput variation among the destination nodes than MCM.
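The interference-aware assignment idea can be caricatured in a few lines: greedily give each tree node the channel least used by the nodes it interferes with. This is a simplified stand-in for MICA's interference-factor model, with an invented topology and channel set:

```python
def assign_channels(tree_edges, interference_pairs, channels):
    """Greedily assign a channel to each multicast-tree node, picking for each
    node the channel least used by the nodes it interferes with (a simplified
    stand-in for an interference-factor model)."""
    nodes = sorted({u for e in tree_edges for u in e})
    neighbors = {u: set() for u in nodes}
    for u, v in interference_pairs:
        neighbors[u].add(v)
        neighbors[v].add(u)
    assignment = {}
    for u in nodes:
        # count how many interfering neighbors already use each channel
        cost = {c: sum(1 for v in neighbors[u] if assignment.get(v) == c)
                for c in channels}
        assignment[u] = min(channels, key=lambda c: cost[c])
    return assignment

# Invented 4-node multicast tree; channels 1/6/11 as in 2.4 GHz Wi-Fi.
tree = [(0, 1), (1, 2), (1, 3)]
interference = [(0, 1), (1, 2), (1, 3), (2, 3)]
chans = [1, 6, 11]
print(assign_channels(tree, interference, chans))
```

With enough channels this greedy pass gives every interfering pair distinct channels; the published algorithm instead weighs a continuous interference factor between tree-node pairs rather than a binary conflict.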

  19. On the use of genetic algorithm to optimize industrial assets lifecycle management under safety and budget constraints

    International Nuclear Information System (INIS)

    Lonchampt, J.; Fessart, K.

    2013-01-01

    The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may be preventive maintenance tasks, asset enhancements, or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1. the measure of the profitability of a portfolio of investments; 2. the selection and planning of an optimal set of investments; 3. the measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP tool is synthesised in the Net Present Value (NPV) indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages...) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes an investment portfolio as input and gives its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately, there are two sources of dependency. The first is introduced by the spare part model: while components are indeed independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical or logistic constraints, such as a global maintenance budget limit or a safety requirement limiting the residual risk of failure of a component or group of components, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm.
After a description
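A toy GA for the investment-selection step described above (generic and stdlib-only; the NPV values, costs, and budget are invented, and the real tool's pseudo-Markov NPV function and constraint handling are far richer):

```python
import random

def ga_investments(npv, cost, budget, pop=30, gens=80, seed=0):
    """Toy GA: pick a portfolio of investments (one bit per investment)
    maximizing total NPV subject to a budget limit; infeasible portfolios
    score minus infinity."""
    rng = random.Random(seed)
    n = len(npv)

    def score(bits):
        c = sum(b * x for b, x in zip(bits, cost))
        return sum(b * v for b, v in zip(bits, npv)) if c <= budget else float('-inf')

    population = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=score, reverse=True)
        elite = population[:pop // 2]
        offspring = []
        while len(elite) + len(offspring) < pop:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)            # one-point crossover
            child = a[:cut] + b[cut:]
            child[rng.randrange(n)] ^= 1         # single-bit mutation
            offspring.append(child)
        population = elite + offspring
    return max(population, key=score)

# Hypothetical NPVs and costs (same monetary units); budget forces a trade-off.
npv = [60, 100, 120, 30]
cost = [10, 20, 30, 5]
best = ga_investments(npv, cost, budget=50)
print(best, sum(b * v for b, v in zip(best, npv)))
```

The hard budget constraint is handled here by a death penalty; the paper's coupling through shared spare-part inventories would instead make the score of one investment depend on which others are selected.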

  20. On the use of genetic algorithm to optimize industrial assets lifecycle management under safety and budget constraints

    Energy Technology Data Exchange (ETDEWEB)

    Lonchampt, J.; Fessart, K. [EDF R and D, Departement MRI, 6, quai Watier, 78401 Chatou cedex (France)

    2013-07-01

    The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may be preventive maintenance tasks, asset enhancements, or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1. the measure of the profitability of a portfolio of investments; 2. the selection and planning of an optimal set of investments; 3. the measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP tool is synthesised in the Net Present Value (NPV) indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages...) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes an investment portfolio as input and gives its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately, there are two sources of dependency. The first is introduced by the spare part model: while components are indeed independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical or logistic constraints, such as a global maintenance budget limit or a safety requirement limiting the residual risk of failure of a component or group of components, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm.
After a description

  1. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    Science.gov (United States)

    Kwok, Kwan S.; Driessen, Brian J.; Phillips, Cynthia A.; Tovey, Craig A.

    1997-09-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. We wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which we must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. We have found these mobile robot problems to be a very interesting application of network optimization methods, and we expect this to be a fruitful area for future research.
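The bottleneck objective (minimize the maximum distance traveled) can be solved by binary-searching the distance threshold and testing each candidate with a bipartite matching. A small stdlib-only sketch under that formulation (robot and slot coordinates are invented, and the paper's own matching machinery may differ):

```python
from itertools import product

def max_matching(adj, n_left, n_right):
    """Maximum bipartite matching via augmenting paths (Kuhn's algorithm)."""
    match_r = [-1] * n_right
    def try_augment(u, seen):
        for v in adj[u]:
            if v in seen:
                continue
            seen.add(v)
            if match_r[v] == -1 or try_augment(match_r[v], seen):
                match_r[v] = u
                return True
        return False
    size = 0
    for u in range(n_left):
        if try_augment(u, set()):
            size += 1
    return size, match_r

def bottleneck_assign(robots, slots):
    """Assign robots to grid slots minimizing the maximum travel distance:
    binary-search the threshold over all pairwise distances, checking each
    candidate with a perfect-matching test (feasibility is monotone in the
    threshold, so binary search is valid)."""
    n = len(robots)
    dist = [[((rx - sx) ** 2 + (ry - sy) ** 2) ** 0.5
             for (sx, sy) in slots] for (rx, ry) in robots]
    cands = sorted({dist[i][j] for i, j in product(range(n), range(n))})
    lo, hi = 0, len(cands) - 1
    best = None
    while lo <= hi:
        mid = (lo + hi) // 2
        adj = [[j for j in range(n) if dist[i][j] <= cands[mid]] for i in range(n)]
        size, match_r = max_matching(adj, n, n)
        if size == n:
            best, hi = (cands[mid], match_r), mid - 1
        else:
            lo = mid + 1
    return best  # (optimal max distance, slot -> robot matching)

robots = [(0, 0), (5, 0), (0, 5)]
slots = [(1, 0), (0, 1), (4, 1)]
d, match_r = bottleneck_assign(robots, slots)
print(round(d, 3), match_r)
```

Note that a standard min-sum assignment solver would minimize total travel; the threshold search above is what turns it into the min-max (bottleneck) objective the abstract describes.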

  2. Analyzing the multiple-target-multiple-agent scenario using optimal assignment algorithms

    International Nuclear Information System (INIS)

    Kwok, K.S.; Driessen, B.J.; Phillips, C.A.; Tovey, C.A.

    1997-01-01

    This work considers the problem of maximum utilization of a set of mobile robots with limited sensor-range capabilities and limited travel distances. The robots are initially in random positions. A set of robots properly guards or covers a region if every point within the region is within the effective sensor range of at least one vehicle. The authors wish to move the vehicles into surveillance positions so as to guard or cover a region, while minimizing the maximum distance traveled by any vehicle. This problem can be formulated as an assignment problem, in which they must optimally decide which robot to assign to which slot of a desired matrix of grid points. The cost function is the maximum distance traveled by any robot. Assignment problems can be solved very efficiently. Solution times for one hundred robots took only seconds on a Silicon Graphics Crimson workstation. The initial positions of all the robots can be sampled by a central base station and their newly assigned positions communicated back to the robots. Alternatively, the robots can establish their own coordinate system with the origin fixed at one of the robots and orientation determined by the compass bearing of another robot relative to this robot. This paper presents example solutions to the multiple-target-multiple-agent scenario using a matching algorithm. Two separate cases with one hundred agents in each were analyzed using this method. They have found these mobile robot problems to be a very interesting application of network optimization methods, and they expect this to be a fruitful area for future research.

  3. A multiobjective approach towards weapon assignment in a ground ...

    African Journals Online (AJOL)

    A typical ground-based air defence (GBAD) environment comprises defended assets on the ground which require protection from enemy aircraft entering the defended airspace. ... of computerised threat evaluation and weapon assignment (TEWA) decision support systems (DSSs) within the context of a GBAD system.

  4. Novel density-based and hierarchical density-based clustering algorithms for uncertain data.

    Science.gov (United States)

    Zhang, Xianchao; Liu, Han; Zhang, Xiaotong

    2017-09-01

    Uncertain data has posed a great challenge to traditional clustering algorithms. Recently, several algorithms have been proposed for clustering uncertain data, and among them density-based techniques seem promising for handling data uncertainty. However, some issues like losing uncertain information, high time complexity and nonadaptive threshold have not been addressed well in the previous density-based algorithm FDBSCAN and hierarchical density-based algorithm FOPTICS. In this paper, we firstly propose a novel density-based algorithm PDBSCAN, which improves the previous FDBSCAN from the following aspects: (1) it employs a more accurate method to compute the probability that the distance between two uncertain objects is less than or equal to a boundary value, instead of the sampling-based method in FDBSCAN; (2) it introduces new definitions of probability neighborhood, support degree, core object probability, direct reachability probability, thus reducing the complexity and solving the issue of nonadaptive threshold (for core object judgement) in FDBSCAN. Then, we modify the algorithm PDBSCAN to an improved version (PDBSCANi), by using a better cluster assignment strategy to ensure that every object will be assigned to the most appropriate cluster, thus solving the issue of nonadaptive threshold (for direct density reachability judgement) in FDBSCAN. Furthermore, as PDBSCAN and PDBSCANi have difficulties for clustering uncertain data with non-uniform cluster density, we propose a novel hierarchical density-based algorithm POPTICS by extending the definitions of PDBSCAN, adding new definitions of fuzzy core distance and fuzzy reachability distance, and employing a new clustering framework. POPTICS can reveal the cluster structures of the datasets with different local densities in different regions better than PDBSCAN and PDBSCANi, and it addresses the issues in FOPTICS. Experimental results demonstrate the superiority of our proposed algorithms over the existing

  5. An Asset-Based Approach to Tribal Community Energy Planning

    Energy Technology Data Exchange (ETDEWEB)

    Gutierrez, Rachael A. [Pratt Inst., Brooklyn, NY (United States). City and Regional Planning; Martino, Anthony [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Materials, Devices, and Energy Technologies; Begay, Sandra K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Materials, Devices, and Energy Technologies

    2016-08-01

    Community energy planning is a vital component of successful energy resource development and project implementation. Planning can help tribes develop a shared vision and strategies to accomplish their energy goals. This paper explores the benefits of an asset-based approach to tribal community energy planning. While a framework for community energy planning and federal funding already exists, some areas of difficulty in the planning cycle have been identified. This paper focuses on developing a planning framework that offsets those challenges. The asset-based framework described here takes inventory of a tribe’s capital assets, such as land capital, human capital, financial capital, and political capital. Such an analysis evaluates how being rich in a specific type of capital can offer a tribe unique advantages in implementing its energy vision. Finally, a tribal case study demonstrates the practical application of an asset-based framework.

  6. A Temporal Domain Decomposition Algorithmic Scheme for Large-Scale Dynamic Traffic Assignment

    Directory of Open Access Journals (Sweden)

    Eric J. Nava

    2012-03-01

    This paper presents a temporal decomposition scheme for large spatial- and temporal-scale dynamic traffic assignment, in which the entire analysis period is divided into epochs. Vehicle assignment is performed sequentially in each epoch, thus improving the model scalability and confining the peak run-time memory requirement regardless of the total analysis period. A proposed self-tuning scheme adaptively searches for the run-time-optimal epoch setting during iterations regardless of the characteristics of the modeled network. Extensive numerical experiments confirm the promising performance of the proposed algorithmic schemes.

  7. A Rule-Based Model for Bankruptcy Prediction Based on an Improved Genetic Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Yudong Zhang

    2013-01-01

    Full Text Available In this paper, we propose a hybrid system to predict corporate bankruptcy. The whole procedure consists of the following four stages: first, sequential forward selection was used to extract the most important features; second, a rule-based model was chosen to fit the given dataset since it can present physical meaning; third, a genetic ant colony algorithm (GACA) was introduced; the fitness scaling strategy and the chaotic operator were incorporated with GACA, forming a new algorithm, fitness-scaling chaotic GACA (FSCGACA), which was used to seek the optimal parameters of the rule-based model; and finally, the stratified K-fold cross-validation technique was used to enhance the generalization of the model. Simulation experiments on 1000 corporations’ data collected from 2006 to 2009 demonstrated that the proposed model was effective. It selected the 5 most important factors as “net income to stock broker’s equality,” “quick ratio,” “retained earnings to total assets,” “stockholders’ equity to total assets,” and “financial expenses to sales.” The total misclassification error of the proposed FSCGACA was only 7.9%, outperforming genetic algorithm (GA), ant colony algorithm (ACA), and GACA. The average computation time of the model is 2.02 s.

  8. Tolerance based algorithms for the ATSP

    NARCIS (Netherlands)

    Goldengorin, B; Sierksma, G; Turkensteen, M; Hromkovic, J; Nagl, M; Westfechtel, B

    2004-01-01

    In this paper we use arc tolerances, instead of arc costs, to improve Branch-and-Bound type algorithms for the Asymmetric Traveling Salesman Problem (ATSP). We derive new tighter lower bounds based on exact and approximate bottleneck upper tolerance values of the Assignment Problem (AP). It is shown

  9. Particle Swarm Optimization Algorithm for Optimizing Assignment of Blood in Blood Banking System

    Science.gov (United States)

    Olusanya, Micheal O.; Arasomwan, Martins A.; Adewumi, Aderemi O.

    2015-01-01

    This paper reports the performance of particle swarm optimization (PSO) for the assignment of blood to meet patients' blood transfusion requests. While the drive for blood donation lingers, there is a need for effective and efficient management of available blood in blood banking systems. Moreover, the inherent danger of transfusing wrong blood types to patients, unnecessary importation of blood units from external sources, and wastage of blood products due to nonusage necessitate the development of mathematical models and techniques for effective handling of blood distribution among available blood types in order to minimize wastage and importation from external sources. This gives rise to the blood assignment problem (BAP), recently introduced in the literature. We propose queue and multiple-knapsack models with a PSO-based solution to address this challenge. Simulation is based on sets of randomly generated data that mimic the real-world population distribution of blood types. Results obtained show the efficiency of the proposed algorithm for BAP, with no blood units wasted and very low importation, where necessary, from outside the blood bank. The results can therefore serve as a benchmark and basis for decision support tools for real-life deployment. PMID:25815046
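A rough sketch of binary PSO on a knapsack-style toy (this is not the paper's queue and multiple-knapsack model; the values, weights, capacity, and PSO coefficients are all invented):

```python
import math
import random

def binary_pso(value, weight, capacity, n_particles=25, iters=60, seed=3):
    """Toy binary PSO for a knapsack-style assignment: each bit says whether a
    unit is assigned; sigmoid-mapped velocities give bit-set probabilities."""
    rng = random.Random(seed)
    n = len(value)

    def score(bits):
        w = sum(b * x for b, x in zip(bits, weight))
        return sum(b * v for b, v in zip(bits, value)) if w <= capacity else -1

    pos = [[rng.randint(0, 1) for _ in range(n)] for _ in range(n_particles)]
    vel = [[0.0] * n for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = max(pos, key=score)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.4 * r2 * (gbest[d] - pos[i][d]))
                prob = 1.0 / (1.0 + math.exp(-vel[i][d]))   # sigmoid map
                pos[i][d] = 1 if rng.random() < prob else 0
            if score(pos[i]) > score(pbest[i]):
                pbest[i] = pos[i][:]
            if score(pos[i]) > score(gbest):
                gbest = pos[i][:]
    return gbest, score(gbest)

# Hypothetical demand values and unit counts; capacity caps total units used.
value = [10, 7, 12, 4, 8]
weight = [3, 2, 4, 1, 3]
best, total = binary_pso(value, weight, capacity=8)
print(best, total)
```

The sigmoid mapping is the standard trick for applying continuous PSO velocity updates to a discrete 0/1 assignment; infeasible particles are simply penalized.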

  10. Particle swarm optimization algorithm for optimizing assignment of blood in blood banking system.

    Science.gov (United States)

    Olusanya, Micheal O; Arasomwan, Martins A; Adewumi, Aderemi O

    2015-01-01

    This paper reports the performance of particle swarm optimization (PSO) for the assignment of blood to meet patients' blood transfusion requests. While the drive for blood donation lingers, there is a need for effective and efficient management of available blood in blood banking systems. Moreover, the inherent danger of transfusing wrong blood types to patients, unnecessary importation of blood units from external sources, and wastage of blood products due to nonusage necessitate the development of mathematical models and techniques for effective handling of blood distribution among available blood types in order to minimize wastage and importation from external sources. This gives rise to the blood assignment problem (BAP), recently introduced in the literature. We propose queue and multiple-knapsack models with a PSO-based solution to address this challenge. Simulation is based on sets of randomly generated data that mimic the real-world population distribution of blood types. Results obtained show the efficiency of the proposed algorithm for BAP, with no blood units wasted and very low importation, where necessary, from outside the blood bank. The results can therefore serve as a benchmark and basis for decision support tools for real-life deployment.

  11. VALUE-BASED APPROACH TO MANAGING CURRENT ASSETS OF CORPORATE CONSTRUCTION COMPANIES

    Directory of Open Access Journals (Sweden)

    Galyna Shapoval

    2017-09-01

    Full Text Available In modern conditions of management, the value of an enterprise becomes the main indicator, studied not only by scientists but also by enterprise owners and potential investors. Current assets take a very important place among the factors that affect the value of an enterprise, so management of current assets becomes more pressing in view of its impact on enterprise value. The purpose of the paper is to develop a system of value-based management of corporate construction companies’ current assets. The main tasks are: studying the impact of current assets on the value of corporate construction companies, defining a value-based approach to managing the current assets of corporate enterprises, and developing a value-based management system for corporate construction companies’ current assets by element. General scientific and special research methods were used in writing the work. Value-based management of current assets involves value-based management of the elements of current assets. Value-based inventory management includes the following stages: assessing the reliability of, and choosing, a supplier according to the criterion of cash flow maximization, classifying stocks in management accounting according to the rhythm of supply, and establishing the periodicity of supplies in accordance with the needs of the construction process. Value-based management of accounts receivable includes the following stages: assessing the efficiency of investing working capital into accounts receivable, assessing customers' loyalty, and defining credit conditions and monitoring receivables by construction and debt instruments. Value-based cash management involves determining the required level of cash to ensure the continuity of the construction process and assessing the effectiveness of cash use according to the criterion of maximizing cash flow, as well as budget

  12. Incorporating a modified uniform crossover and 2-exchange neighborhood mechanism in a discrete bat algorithm to solve the quadratic assignment problem

    Directory of Open Access Journals (Sweden)

    Mohammed Essaid Riffi

    2017-11-01

    Full Text Available The bat algorithm is one of the recent nature-inspired algorithms, which has emerged as a powerful search method for solving continuous as well as discrete problems. The quadratic assignment problem is a well-known NP-hard problem in combinatorial optimization. The goal of this problem is to assign n facilities to n locations in such a way as to minimize the assignment cost. For that purpose, this paper introduces a novel discrete variant of the bat algorithm to deal with this combinatorial optimization problem. The proposed algorithm was evaluated on a set of benchmark instances from the QAPLIB library and its performance was compared to that of other algorithms. The empirical results of exhaustive experiments were promising and illustrated the efficacy of the suggested approach.
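The 2-exchange neighborhood mentioned in the title is easy to sketch: repeatedly swap two facilities whenever the swap lowers the QAP cost. A stdlib-only local-search fragment (the flow and distance matrices are invented, and the full algorithm wraps such moves inside the bat-algorithm search rather than using them alone):

```python
def qap_cost(perm, flow, dist):
    """Cost of assigning facility i to location perm[i]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def two_exchange(perm, flow, dist):
    """2-exchange local search: accept any facility swap that reduces the
    assignment cost, until no swap helps (a local optimum)."""
    perm = perm[:]
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(i + 1, len(perm)):
                perm[i], perm[j] = perm[j], perm[i]
                c = qap_cost(perm, flow, dist)
                if c < best:
                    best, improved = c, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]  # undo the swap
    return perm, best

# Invented symmetric 3x3 instance: flow between facilities, distance between locations.
flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
perm, cost = two_exchange([2, 0, 1], flow, dist)
print(perm, cost)  # descends from cost 32 to the optimum of this tiny instance
```

In the paper this neighborhood is combined with a modified uniform crossover inside the discrete bat framework; on its own it only guarantees a local optimum.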

  13. A new automated assign and analysing method for high-resolution rotationally resolved spectra using genetic algorithms

    NARCIS (Netherlands)

    Meerts, W.L.; Schmitt, M.

    2006-01-01

    This paper describes a numerical technique that has recently been developed to automatically assign and fit high-resolution spectra. The method makes use of genetic algorithms (GA). The current algorithm is compared with previously used analysing methods. The general features of the GA and its

  14. Research and design on system of asset management based on RFID

    Science.gov (United States)

    Guan, Peng; Du, HuaiChang; Jing, Hua; Zhang, MengYue; Zhang, Meng; Xu, GuiXian

    2011-10-01

    After analyzing the problems in current asset management, this paper proposes applying RFID technology to asset management in order to raise the level of automation and informatization. A 433 MHz RFID tag and reader for equipment identification are designed on the basis of a detailed study of tag and reader circuits, and the overall asset management system is illustrated. An RS232-to-Ethernet converter transfers data to the PC monitoring software, and the asset management system is implemented with Web techniques (PHP and MySQL).

  15. Large-Scale Portfolio Optimization Using Multiobjective Evolutionary Algorithms and Preselection Methods

    Directory of Open Access Journals (Sweden)

    B. Y. Qu

    2017-01-01

    Full Text Available Portfolio optimization problems involve the selection of different assets to invest in, so as to maximize the overall return and minimize the overall risk simultaneously. The complexity of the optimal asset allocation problem increases with the number of assets available to select from for investing. The optimization problem becomes computationally challenging when there are more than a few hundred assets to select from. To reduce the complexity of large-scale portfolio optimization, this paper proposes two asset preselection procedures that consider the return and risk of individual assets and their pairwise correlations to remove assets that are unlikely to be selected into any portfolio. With these asset preselection methods, the number of assets considered for inclusion in a portfolio can be increased to thousands. To test the effectiveness of the proposed methods, a Normalized Multiobjective Evolutionary Algorithm based on Decomposition (NMOEA/D) and several other commonly used multiobjective evolutionary algorithms are applied and compared. Six experiments with different settings are carried out. The experimental results show that with the proposed methods the simulation time is reduced while return-risk trade-off performances are significantly improved. Meanwhile, the NMOEA/D is able to outperform the other compared algorithms on all experiments according to the comparative analysis.
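    A preselection rule of the kind described, dropping an asset that is almost perfectly correlated with a better return/risk asset, can be sketched with a simple correlation filter; the return/risk ranking and the 0.95 cutoff are illustrative assumptions, not the paper's exact procedures:

    ```python
    import statistics as st

    def pearson(x, y):
        """Pearson correlation of two equally long return series."""
        mx, my = st.fmean(x), st.fmean(y)
        num = sum((a - mx) * (b - my) for a, b in zip(x, y))
        den = (sum((a - mx) ** 2 for a in x)
               * sum((b - my) ** 2 for b in y)) ** 0.5
        return num / den

    def preselect(returns, corr_cut=0.95):
        """Keep assets in decreasing order of return/risk ratio, dropping
        any asset almost perfectly correlated with one already kept
        (an illustrative rule, not the paper's exact procedure)."""
        names = sorted(returns,
                       key=lambda k: st.fmean(returns[k]) / st.pstdev(returns[k]),
                       reverse=True)
        kept = []
        for name in names:
            if all(abs(pearson(returns[name], returns[k])) < corr_cut
                   for k in kept):
                kept.append(name)
        return kept
    ```

    With thousands of assets, such a filter shrinks the search space that the evolutionary algorithm must explore.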

  16. Robust MST-Based Clustering Algorithm.

    Science.gov (United States)

    Liu, Qidong; Zhang, Ruisheng; Zhao, Zhili; Wang, Zhenghai; Jiao, Mengyao; Wang, Guangjing

    2018-06-01

    Minimax similarity stresses the connectedness of points via mediating elements rather than favoring high mutual similarity. This grouping principle yields superior clustering results when mining arbitrarily shaped clusters in data. However, it is not robust against noise and outliers in the data. There are two main problems with the grouping principle: first, a single object that is far away from all other objects defines a separate cluster, and second, two connected clusters can be regarded as two parts of one cluster. In order to solve such problems, we propose a robust minimum spanning tree (MST)-based clustering algorithm in this letter. First, we separate the connected objects by applying a density-based coarsening phase, resulting in a low-rank matrix in which each element denotes a supernode formed by combining a set of nodes. Then a greedy method is presented to partition those supernodes by working on the low-rank matrix. Instead of removing the longest edges from the MST, our algorithm groups the data set based on minimax similarity. Finally, the assignment of all data points can be achieved through their corresponding supernodes. Experimental results on many synthetic and real-world data sets show that our algorithm consistently outperforms the compared clustering algorithms.

  17. TaDb: A time-aware diffusion-based recommender algorithm

    Science.gov (United States)

    Li, Wen-Jun; Xu, Yuan-Yuan; Dong, Qiang; Zhou, Jun-Lin; Fu, Yan

    2015-02-01

    Traditional recommender algorithms usually employ the early and recent records indiscriminately, which overlooks the change of user interests over time. In this paper, we show that the interests of a user remain stable in a short-term interval and drift during a long-term period. Based on this observation, we propose a time-aware diffusion-based (TaDb) recommender algorithm, which assigns different temporal weights to the leading links existing before the target user's collection and the following links appearing after that in the diffusion process. Experiments on four real datasets, Netflix, MovieLens, FriendFeed and Delicious show that TaDb algorithm significantly improves the prediction accuracy compared with the algorithms not considering temporal effects.
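    The core idea of weighting a link by its temporal distance from the target user's collection time can be sketched as below; the exponential decay form and the tau scale are assumptions for illustration, not the weighting function used in the paper:

    ```python
    import math

    def temporal_weight(link_time, target_time, tau=30.0):
        """Weight a link by how far (e.g. in days) it lies from the
        target user's collection time; tau is a hypothetical decay scale."""
        return math.exp(-abs(target_time - link_time) / tau)

    def weighted_score(link_times, target_time, tau=30.0):
        """Aggregate diffusion score of an item as the sum of its
        temporally weighted links (illustrative, not the paper's rule)."""
        return sum(temporal_weight(t, target_time, tau) for t in link_times)
    ```

    Links close in time to the target collection contribute nearly their full weight, while early or much later links are discounted, which is the distinction between "leading" and "following" links the abstract draws.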

  18. and Asset-based Poverty Dynamics in Ethiopia

    African Journals Online (AJOL)

    poverty status based on consumption and asset ownership. Using panel data ... In recent years Ethiopia has experienced remarkable economic growth with a ...... Understanding the relationship between household demographics and poverty ...

  19. An Airway Network Flow Assignment Approach Based on an Efficient Multiobjective Optimization Framework

    Directory of Open Access Journals (Sweden)

    Xiangmin Guan

    2015-01-01

    Full Text Available Aiming to reduce airspace congestion and flight delay simultaneously, this paper formulates the airway network flow assignment (ANFA) problem as a multiobjective optimization model and presents a new multiobjective optimization framework to solve it. Firstly, an effective multi-island parallel evolution algorithm with multiple evolution populations is employed to improve the optimization capability. Secondly, the nondominated sorting genetic algorithm II is applied for each population. In addition, a cooperative coevolution algorithm is adapted to divide the ANFA problem into several low-dimensional biobjective optimization problems that are easier to deal with. Finally, in order to maintain the diversity of solutions and to avoid prematurity, a dynamic adjustment operator based on solution congestion degree is specifically designed for the ANFA problem. Simulation results using real traffic data from the China air route network and daily flight plans demonstrate that the proposed approach can improve the solution quality effectively, showing superiority to existing approaches such as the multiobjective genetic algorithm, the well-known multiobjective evolutionary algorithm based on decomposition, and a cooperative coevolution multiobjective algorithm, as well as other parallel evolution algorithms with different migration topologies.

  20. On the Determinations of Class-Based Storage Assignments in AS/RS having two I/O Locations

    NARCIS (Netherlands)

    Ashayeri, J.; Heuts, R.M.J.; Beekhof, M.; Wilhelm, M.R.

    2001-01-01

    This paper presents the use and extension of a geometrical-based algorithmic approach for determining the expected S/R machine cycle times, and therefore warehouse throughput, for class-based storage assignment layouts in an AS/RS. The approach was designed for the purpose of solving a practical

  1. Accounting providing of statistical analysis of intangible assets renewal under marketing strategy

    Directory of Open Access Journals (Sweden)

    I.R. Polishchuk

    2016-12-01

    Full Text Available The article analyzes the content of the Regulations on accounting policies of the surveyed enterprises with respect to operations concerning the amortization of intangible assets, using the following criteria: assessment on admission, determination of useful life, period of depreciation, residual value, depreciation method, reflection in the financial statements, unit of account, revaluation, and formation of fair value. The factors affecting the accounting policies and determining the mechanism for evaluating the completeness and timeliness of intangible assets renewal are characterized. An algorithm for selecting the method of intangible assets amortization is proposed. The knowledge base for statistical analysis of the timeliness and completeness of intangible assets renewal is expanded in terms of the developed internal reporting. Statistical indicators to assess the effectiveness of the amortization policy for intangible assets are proposed. Marketing strategies are described depending on the condition and amount of intangible assets, with a view to increasing marketing potential for the continuity of economic activity.

  2. Two efficient label-equivalence-based connected-component labeling algorithms for 3-D binary images.

    Science.gov (United States)

    He, Lifeng; Chao, Yuyan; Suzuki, Kenji

    2011-08-01

    Whenever one wants to distinguish, recognize, and/or measure objects (connected components) in binary images, labeling is required. This paper presents two efficient label-equivalence-based connected-component labeling algorithms for 3-D binary images. One is voxel based and the other is run based. For the voxel-based one, we present an efficient method of deciding the order for checking voxels in the mask. For the run-based one, instead of assigning each foreground voxel, we assign each run a provisional label. Moreover, we use run data to label foreground voxels without scanning any background voxel in the second scan. Experimental results have demonstrated that our voxel-based algorithm is efficient for 3-D binary images with complicated connected components, that our run-based one is efficient for those with simple connected components, and that both are much more efficient than conventional 3-D labeling algorithms.
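    The provisional-label-plus-equivalence idea can be illustrated in 2-D with 4-connectivity; this is a generic two-scan union-find sketch, not the paper's optimized 3-D voxel- or run-based implementation:

    ```python
    def label_components(img):
        """Two-scan, label-equivalence connected-component labeling
        (4-connectivity, 2-D sketch; the paper works in 3-D with
        voxel-based and run-based variants)."""
        h, w = len(img), len(img[0])
        labels = [[0] * w for _ in range(h)]
        parent = [0]  # parent[k] = representative of provisional label k

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]  # path halving
                x = parent[x]
            return x

        next_label = 1
        for y in range(h):               # first scan: provisional labels
            for x in range(w):
                if not img[y][x]:
                    continue
                up = labels[y - 1][x] if y else 0
                left = labels[y][x - 1] if x else 0
                if up and left:
                    labels[y][x] = find(up)
                    parent[find(left)] = find(up)   # record equivalence
                elif up or left:
                    labels[y][x] = find(up or left)
                else:
                    parent.append(next_label)       # new provisional label
                    labels[y][x] = next_label
                    next_label += 1
        for y in range(h):               # second scan: resolve labels
            for x in range(w):
                if labels[y][x]:
                    labels[y][x] = find(labels[y][x])
        return labels
    ```

    The run-based variant described in the abstract applies the same equivalence bookkeeping to whole runs of foreground voxels instead of individual voxels.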

  3. Regime-Based Versus Static Asset Allocation: Letting the Data Speak

    DEFF Research Database (Denmark)

    Nystrup, Peter; Hansen, Bo William; Madsen, Henrik

    2015-01-01

    Regime shifts present a big challenge to traditional strategic asset allocation. This article investigates whether regime-based asset allocation can effectively respond to changes in financial regimes at the portfolio level, in an effort to provide better long-term results than more static...... approaches can offer. The authors center their regime-based approach around a regime-switching model with time-varying parameters that can match financial markets’ tendency to change behavior abruptly and the fact that the new behavior often persists for several periods after a change. In an asset universe...

  4. Optimization of Task Scheduling Algorithm through QoS Parameters for Cloud Computing

    Directory of Open Access Journals (Sweden)

    Monika

    2016-01-01

    Full Text Available Cloud computing is an emerging technology that has spread widely among researchers. It furnishes clients with infrastructure, platform and software as services that are easily available via the web. A cloud is a kind of parallel and distributed system consisting of a collection of virtualized computers that are used to execute various tasks so as to achieve good execution time, meet deadlines and utilize resources well. The scheduling problem can be viewed as finding an optimal assignment of tasks over the available set of resources such that the desired goals for the tasks can be achieved. This paper presents an optimal algorithm for scheduling tasks so as to minimize their waiting time as a QoS parameter. The algorithm is simulated using the Cloudsim simulator, and experiments are carried out to help clients identify the bottleneck of using a number of virtual machines in parallel.
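    Waiting time as a QoS metric can be illustrated with a greedy sketch that always places the next task on the earliest-free VM; this is a stand-in for, not a reproduction of, the paper's Cloudsim-based algorithm:

    ```python
    def schedule(tasks, n_vms):
        """Greedy scheduling sketch: place each task on the VM that
        becomes free earliest, so the task's waiting time is minimal.
        tasks: list of execution times; returns total waiting time."""
        finish = [0.0] * n_vms          # time at which each VM is free
        total_wait = 0.0
        for t in tasks:
            vm = min(range(n_vms), key=lambda i: finish[i])
            total_wait += finish[vm]    # task waits until the VM is free
            finish[vm] += t
        return total_wait
    ```

    With as many VMs as tasks, waiting time drops to zero, which is the parallelism trade-off the experiments explore.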

  5. Research on bulk-cargo-port berth assignment based on priority of resource allocation

    Directory of Open Access Journals (Sweden)

    Chunfang Guo

    2013-03-01

    Full Text Available Purpose: The purpose of this paper is to propose a Priority of Resource Allocation model for utilizing the resources of a port efficiently, through an improvement of the traditional ant colony algorithm and a ship-berth matching relation constraint matrix formed by ontology reasoning. Design/methodology/approach: Through questionnaires, exploratory factor analysis (EFA) and principal component analysis, the authors extract the importance of the goods, the importance of the customer, and the type of trade as the main factors of ship operating priority. The authors then combine the berth assignment problem with the improved ant colony algorithm and use the model to improve ship scheduling quality. Finally, the authors verify the model with physical data from a bulk cargo port in China. Findings: Tested with real data from the bulk cargo port, the results show that ships' resource-use priority and the length of waiting time are consistent, indicating that the priority of resource allocation plays a prominent role in improving ship scheduling quality. Research limitations: The questionnaire survey is limited to only one port group; more related influencing factors should be considered to extend the conclusions. Practical implications: The Priority of Resource Allocation model in this paper can be used to improve the efficiency of dynamic berth assignment. Originality: This paper takes minimizing ships' time in port as the key optimization indicator and establishes a dynamic berth assignment model based on the improved ant colony algorithm and the ontology reasoning model.

  6. Critical asset and portfolio risk analysis: an all-hazards framework.

    Science.gov (United States)

    Ayyub, Bilal M; McGill, William L; Kaminskiy, Mark

    2007-08-01

    This article develops a quantitative all-hazards framework for critical asset and portfolio risk analysis (CAPRA) that considers both natural and human-caused hazards. Following a discussion on the nature of security threats, the need for actionable risk assessments, and the distinction between asset and portfolio-level analysis, a general formula for all-hazards risk analysis is obtained that resembles the traditional model based on the notional product of consequence, vulnerability, and threat, though with clear meanings assigned to each parameter. Furthermore, a simple portfolio consequence model is presented that yields first-order estimates of interdependency effects following a successful attack on an asset. Moreover, depending on the needs of the decisions being made and available analytical resources, values for the parameters in this model can be obtained at a high level or through detailed systems analysis. Several illustrative examples of the CAPRA methodology are provided.
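    The notional product of consequence, vulnerability, and threat, together with the first-order portfolio sum, can be written down directly; the numbers in the usage note are hypothetical, and the sketch deliberately omits the interdependency effects the article models:

    ```python
    def asset_risk(consequence, vulnerability, threat):
        """Notional all-hazards risk of one asset: consequence of a
        successful event, times P[loss | event] (vulnerability), times
        P[event] (threat). Units follow the consequence measure."""
        return consequence * vulnerability * threat

    def portfolio_risk(assets):
        """First-order portfolio risk: sum of per-asset risks, ignoring
        the interdependency effects discussed in the article.
        assets: iterable of (consequence, vulnerability, threat)."""
        return sum(asset_risk(*a) for a in assets)
    ```

    For example, an asset with consequence 100, vulnerability 0.5, and threat likelihood 0.1 contributes a risk of 5 to the portfolio total.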

  7. Habit-based Asset Pricing with Limited Participation Consumption

    DEFF Research Database (Denmark)

    Bach, Christian; Møller, Stig Vinther

    We calibrate and estimate a consumption-based asset pricing model with habit formation using limited participation consumption data. Based on survey data of a representative sample of American households, we distinguish between assetholder and non-assetholder consumption, as well as the standard...

  8. Habit-based asset pricing with limited participation consumption

    DEFF Research Database (Denmark)

    Møller, Stig Vinther; Bach, Christian

    2011-01-01

    We calibrate and estimate a consumption-based asset pricing model with habit formation using limited participation consumption data. Based on survey data of a representative sample of American households, we distinguish between assetholder and non-assetholder consumption, as well as the standard...

  9. An O(NlogN) Algorithm for Region Definition Using Channels/Switchboxes and Ordering Assignment

    Directory of Open Access Journals (Sweden)

    Jin-Tai Yan

    1996-01-01

    Full Text Available For a building block placement, the routing space can be further partitioned into channels and switchboxes. In general, the definition of switchboxes releases the cyclic channel precedence constraints and further yields a safe routing ordering process. However, switchbox routing is more difficult than channel routing. In this paper, an O(NlogN) region definition and ordering assignment (RDAOA) algorithm is proposed to minimize the number of switchboxes for the routing phase, where N is the number of vertices in a channel precedence graph. Several examples have been tested on the proposed algorithm, and the experimental results are listed and compared.

  10. Comparison of four machine learning algorithms for their applicability in satellite-based optical rainfall retrievals

    Science.gov (United States)

    Meyer, Hanna; Kühnlein, Meike; Appelhans, Tim; Nauss, Thomas

    2016-03-01

    Machine learning (ML) algorithms have successfully been demonstrated to be valuable tools in satellite-based rainfall retrievals which show the practicability of using ML algorithms when faced with high dimensional and complex data. Moreover, recent developments in parallel computing with ML present new possibilities for training and prediction speed and therefore make their usage in real-time systems feasible. This study compares four ML algorithms - random forests (RF), neural networks (NNET), averaged neural networks (AVNNET) and support vector machines (SVM) - for rainfall area detection and rainfall rate assignment using MSG SEVIRI data over Germany. Satellite-based proxies for cloud top height, cloud top temperature, cloud phase and cloud water path serve as predictor variables. The results indicate an overestimation of rainfall area delineation regardless of the ML algorithm (averaged bias = 1.8) but a high probability of detection ranging from 81% (SVM) to 85% (NNET). On a 24-hour basis, the performance of the rainfall rate assignment yielded R2 values between 0.39 (SVM) and 0.44 (AVNNET). Though the differences in the algorithms' performance were rather small, NNET and AVNNET were identified as the most suitable algorithms. On average, they demonstrated the best performance in rainfall area delineation as well as in rainfall rate assignment. NNET's computational speed is an additional advantage in work with large datasets such as in remote sensing based rainfall retrievals. However, since no single algorithm performed considerably better than the others we conclude that further research in providing suitable predictors for rainfall is of greater necessity than an optimization through the choice of the ML algorithm.

  11. Steam generator asset management: integrating technology and asset management

    International Nuclear Information System (INIS)

    Shoemaker, P.; Cislo, D.

    2006-01-01

    Asset Management is an established but often misunderstood discipline that is gaining momentum within the nuclear generation industry. The global impetus behind the movement toward asset management is sustainability. The discipline of asset management is based upon three fundamental aspects; key performance indicators (KPI), activity-based cost accounting, and cost benefits/risk analysis. The technology associated with these three aspects is fairly well-developed, in all but the most critical area; cost benefits/risk analysis. There are software programs that calculate, trend, and display key-performance indicators to ensure high-level visibility. Activity-based costing is a little more difficult; requiring a consensus on the definition of what comprises an activity and then adjusting cost accounting systems to track. In the United States, the Nuclear Energy Institute's Standard Nuclear Process Model (SNPM) serves as the basis for activity-based costing. As a result, the software industry has quickly adapted to develop tracking systems that include the SNPM structure. Both the KPI's and the activity-based cost accounting feed the cost benefits/risk analysis to allow for continuous improvement and task optimization; the goal of asset management. In the case where the benefits and risks are clearly understood and defined, there has been much progress in applying technology for continuous improvement. Within the nuclear generation industry, more specialized and unique software systems have been developed for active components, such as pumps and motors. Active components lend themselves well to the application of asset management techniques because failure rates can be established, which serves as the basis to quantify risk in the cost-benefits/risk analysis. A key issue with respect to asset management technologies is only now being understood and addressed, that is how to manage passive components. 
Passive components, such as nuclear steam generators, reactor vessels

  12. Entropy-based financial asset pricing.

    Directory of Open Access Journals (Sweden)

    Mihály Ormos

    Full Text Available We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases in the function of the number of securities involved in a portfolio in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For empirical investigation we use daily returns of 150 randomly selected securities for a period of 27 years. Our regression results show that entropy has a higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore we show the time varying behavior of the beta along with entropy.
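    A simple histogram estimator shows how a continuous-entropy risk number can be computed from a return series; the binning choices below are assumptions for illustration, not the authors' estimator:

    ```python
    import math
    import random

    def sample_entropy(returns, bins=20):
        """Histogram estimate of the differential entropy of a return
        series: -sum p_i * ln(p_i / bin_width), with p_i the fraction
        of observations in bin i (an illustrative estimator)."""
        lo, hi = min(returns), max(returns)
        width = (hi - lo) / bins or 1.0   # guard against a flat series
        counts = [0] * bins
        for r in returns:
            counts[min(int((r - lo) / width), bins - 1)] += 1
        n = len(returns)
        return -sum(c / n * math.log(c / (n * width)) for c in counts if c)
    ```

    Consistent with entropy as a risk measure, a wider (riskier) return distribution yields a higher entropy estimate than a narrow one.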

  13. Entropy-based financial asset pricing.

    Science.gov (United States)

    Ormos, Mihály; Zibriczky, Dávid

    2014-01-01

    We investigate entropy as a financial risk measure. Entropy explains the equity premium of securities and portfolios in a simpler way and, at the same time, with higher explanatory power than the beta parameter of the capital asset pricing model. For asset pricing we define the continuous entropy as an alternative measure of risk. Our results show that entropy decreases in the function of the number of securities involved in a portfolio in a similar way to the standard deviation, and that efficient portfolios are situated on a hyperbola in the expected return-entropy system. For empirical investigation we use daily returns of 150 randomly selected securities for a period of 27 years. Our regression results show that entropy has a higher explanatory power for the expected return than the capital asset pricing model beta. Furthermore we show the time varying behavior of the beta along with entropy.

  14. FindFoci: a focus detection algorithm with automated parameter training that closely matches human assignments, reduces human inconsistencies and increases speed of analysis.

    Directory of Open Access Journals (Sweden)

    Alex D Herbert

    Full Text Available Accurate and reproducible quantification of the accumulation of proteins into foci in cells is essential for data interpretation and for biological inferences. To improve reproducibility, much emphasis has been placed on the preparation of samples, but less attention has been given to reporting and standardizing the quantification of foci. The current standard to quantitate foci in open-source software is to manually determine a range of parameters based on the outcome of one or a few representative images and then apply the parameter combination to the analysis of a larger dataset. Here, we demonstrate the power and utility of using machine learning to train a new algorithm (FindFoci to determine optimal parameters. FindFoci closely matches human assignments and allows rapid automated exploration of parameter space. Thus, individuals can train the algorithm to mirror their own assignments and then automate focus counting using the same parameters across a large number of images. Using the training algorithm to match human assignments of foci, we demonstrate that applying an optimal parameter combination from a single image is not broadly applicable to analysis of other images scored by the same experimenter or by other experimenters. Our analysis thus reveals wide variation in human assignment of foci and their quantification. To overcome this, we developed training on multiple images, which reduces the inconsistency of using a single or a few images to set parameters for focus detection. FindFoci is provided as an open-source plugin for ImageJ.

  15. A block chain based architecture for asset management in coalition operations

    Science.gov (United States)

    Verma, Dinesh; Desai, Nirmit; Preece, Alun; Taylor, Ian

    2017-05-01

    To support dynamic communities of interests in coalition operations, new architectures for efficient sharing of ISR assets are needed. The use of blockchain technology in wired business environments, such as digital currency systems, offers an interesting solution by creating a way to maintain a distributed shared ledger without requiring a single trusted authority. In this paper, we discuss how a blockchain-based system can be modified to provide a solution for dynamic asset sharing amongst coalition members, enabling the creation of a logically centralized asset management system by a seamless policy-compliant federation of different coalition systems. We discuss the use of blockchain for three different types of assets in a coalition context, showing how blockchain can offer a suitable solution for sharing assets in those environments. We also discuss the limitations in the current implementations of blockchain which need to be overcome for the technology to become more effective in a decentralized tactical edge environment.

  16. Interactive visual exploration and refinement of cluster assignments.

    Science.gov (United States)

    Kern, Michael; Lex, Alexander; Gehlenborg, Nils; Johnson, Chris R

    2017-09-12

    With ever-increasing amounts of data produced in biology research, scientists are in need of efficient data analysis methods. Cluster analysis, combined with visualization of the results, is one such method that can be used to make sense of large data volumes. At the same time, cluster analysis is known to be imperfect and depends on the choice of algorithms, parameters, and distance measures. Most clustering algorithms don't properly account for ambiguity in the source data, as records are often assigned to discrete clusters, even if an assignment is unclear. While there are metrics and visualization techniques that allow analysts to compare clusterings or to judge cluster quality, there is no comprehensive method that allows analysts to evaluate, compare, and refine cluster assignments based on the source data, derived scores, and contextual data. In this paper, we introduce a method that explicitly visualizes the quality of cluster assignments, allows comparisons of clustering results and enables analysts to manually curate and refine cluster assignments. Our methods are applicable to matrix data clustered with partitional, hierarchical, and fuzzy clustering algorithms. Furthermore, we enable analysts to explore clustering results in context of other data, for example, to observe whether a clustering of genomic data results in a meaningful differentiation in phenotypes. Our methods are integrated into Caleydo StratomeX, a popular, web-based, disease subtype analysis tool. We show in a usage scenario that our approach can reveal ambiguities in cluster assignments and produce improved clusterings that better differentiate genotypes and phenotypes.

  17. A Localization Algorithm Based on AOA for Ad-Hoc Sensor Networks

    Directory of Open Access Journals (Sweden)

    Yang Sun Lee

    2012-01-01

    Full Text Available Knowledge of the positions of sensor nodes in Wireless Sensor Networks (WSNs) makes possible many applications such as asset monitoring, object tracking and routing. In WSNs, errors may occur in the measurement of distances and angles between pairs of nodes, and because these errors propagate to other nodes, estimating the positions of sensor nodes can be difficult and highly inaccurate. In this paper, we propose a localization algorithm based on both the distance and the angle to a landmark. We introduce a method based on the incident angle to a landmark, together with an algorithm that exchanges physical data such as distances and incident angles and updates a node's position by utilizing multiple landmarks and multiple paths to landmarks.
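    Localization from a landmark's distance and incident angle reduces to basic trigonometry; the sketch below, including the simple averaging of estimates from multiple landmarks, is a minimal illustration rather than the paper's algorithm:

    ```python
    import math

    def position_from_landmark(lx, ly, distance, angle):
        """Estimate a node position from one landmark at (lx, ly), given
        the measured distance and incident angle (radians from the
        x-axis, landmark toward node)."""
        return lx + distance * math.cos(angle), ly + distance * math.sin(angle)

    def average_estimate(measurements):
        """Fuse estimates from several landmarks/paths by averaging,
        mirroring the idea of updating a position from multiple landmarks.
        measurements: iterable of (lx, ly, distance, angle)."""
        pts = [position_from_landmark(*m) for m in measurements]
        n = len(pts)
        return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)
    ```

    With noisy distance and angle readings, each landmark gives a slightly different estimate, and averaging over landmarks and paths damps the propagated error.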

  18. A Plagiarism Detection Algorithm based on Extended Winnowing

    Directory of Open Access Journals (Sweden)

    Duan Xuliang

    2017-01-01

    Full Text Available Plagiarism is a common problem faced by academia and education. Mature commercial plagiarism detection systems are comprehensive and highly accurate, but their high detection costs make them unsuitable for real-time, lightweight application environments such as plagiarism detection for student assignments. This paper introduces a method for extending the classic Winnowing plagiarism detection algorithm, expanding its functionality. The extended algorithm retains text location and length information from the original document while extracting the document's fingerprints, so that locating and marking plagiarized text fragments becomes much easier. Experimental results and several years of running practice show that the extension has little effect on performance, and a PC with a normal hardware configuration is able to meet the requirements of small and medium-sized applications. Building on Winnowing's light weight, high efficiency, reliability and flexibility, the extended algorithm further enhances adaptability and broadens the application areas.
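    The classic Winnowing scheme, keeping the (hash, position) pairs that make locating matched fragments possible, can be sketched as follows; the parameters k and w, the tie-breaking rule, and the Jaccard-style similarity are illustrative choices, not the paper's extension:

    ```python
    def winnow(text, k=5, w=4):
        """Classic winnowing sketch: hash every k-gram, then keep the
        minimum hash in each window of w consecutive hashes. Storing
        (hash, position) pairs is what lets matches be located later."""
        grams = [text[i:i + k] for i in range(len(text) - k + 1)]
        hashes = [hash(g) & 0xFFFFFFFF for g in grams]
        fingerprints = set()
        for i in range(len(hashes) - w + 1):
            window = hashes[i:i + w]
            j = min(range(w), key=lambda x: window[x])
            fingerprints.add((window[j], i + j))  # (fingerprint, k-gram pos)
        return fingerprints

    def similarity(a, b, k=5, w=4):
        """Jaccard-style overlap of the fingerprint hashes of two texts."""
        fa = {h for h, _ in winnow(a, k, w)}
        fb = {h for h, _ in winnow(b, k, w)}
        return len(fa & fb) / len(fa | fb) if fa | fb else 0.0
    ```

    Note that Python's built-in `hash` is randomized per process, so fingerprints are only comparable within one run; a real system would use a stable rolling hash.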

  19. Land of Addicts? An Empirical Investigation of Habit-Based Asset Pricing Behavior

    OpenAIRE

    Xiaohong Chen; Sydney C. Ludvigson

    2004-01-01

    This paper studies the ability of a general class of habit-based asset pricing models to match the conditional moment restrictions implied by asset pricing theory. We treat the functional form of the habit as unknown and estimate it along with the rest of the model's finite-dimensional parameters. Using quarterly data on consumption growth, asset returns and instruments, our empirical results indicate that the estimated habit function is nonlinear and that habit formation is better described...

  20. Binary Bees Algorithm - bioinspiration from the foraging mechanism of honeybees to optimize a multiobjective multidimensional assignment problem

    Science.gov (United States)

    Xu, Shuo; Ji, Ze; Truong Pham, Duc; Yu, Fan

    2011-11-01

    The simultaneous mission assignment and home allocation problem for hospital service robots studied here is a Multidimensional Assignment Problem (MAP) with multiple objectives and multiple constraints. A population-based metaheuristic, the Binary Bees Algorithm (BBA), is proposed to optimize this NP-hard problem. Inspired by the foraging mechanism of honeybees, the BBA's most important feature is an explicit functional partitioning between global search and local search for exploration and exploitation, respectively. Its key parts consist of adaptive global search, three-step elitism selection (constraint handling, non-dominated solution selection, and diversity preservation), and elites-centred local search within a Hamming neighbourhood. Two comparative experiments were conducted to investigate its single-objective optimization, optimization effectiveness (indexed by the S-metric and C-metric) and optimization efficiency (indexed by computational burden and CPU time) in detail. The BBA outperformed its competitors in almost all the quantitative indices. Hence, the overall scheme, and particularly the search-history-adapted global search strategy, was validated.
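    The elites-centred local search within a Hamming neighbourhood can be sketched for a single objective as below; the bit-flip sampling and the greedy acceptance rule are assumptions for illustration, not the BBA's exact operators:

    ```python
    import random

    def hamming_local_search(elite, fitness, flips=1, trials=20, seed=0):
        """Exploitation sketch: sample binary neighbours within a small
        Hamming distance of an elite solution and keep the best found
        (single-objective stand-in for the BBA's local search step)."""
        rng = random.Random(seed)
        best, best_fit = elite[:], fitness(elite)
        for _ in range(trials):
            cand = best[:]
            for i in rng.sample(range(len(cand)), flips):
                cand[i] ^= 1                      # flip a bit
            f = fitness(cand)
            if f > best_fit:                      # greedy acceptance
                best, best_fit = cand, f
        return best, best_fit
    ```

    In the full BBA, this exploitation step is interleaved with the adaptive global search and the three-step elitism selection described in the abstract.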

  1. Augmented Reality for Searching Potential Assets in Medan using GPS based Tracking

    Science.gov (United States)

    Muchtar, M. A.; Syahputra, M. F.; Syahputra, N.; Ashrafia, S.; Rahmat, R. F.

    2017-01-01

    Every city is required to introduce its various potential assets so that people know how to utilize or develop their area. Potential assets include the infrastructure, facilities, people, communities, organizations and customs that affect the characteristics and way of life in Medan. Due to a lack of socialization and information, most people in Medan know only a few of these assets. Recently, many mobile apps have provided location search and mapping used to find potential assets in the user's area. However, the available information, such as text and digital maps, sometimes does not help the user clearly and dynamically. Therefore, Augmented Reality technology, which can display information over a real-world view, is implemented in this research so that the information becomes more interactive and easily understood. The technology is implemented in a mobile app using GPS-based tracking, defining the coordinates of the user's smartphone as a marker, so that people can dynamically and easily find the locations of potential assets in the nearest area based on the direction of the user's camera view.

  2. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-10-15

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of both safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), i.e. digital assets related to safety, security, and emergency preparedness. Critical digital assets are estimated to make up over 60% of all digital assets in a nuclear power plant, so prioritizing them is required to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to this analysis, the digital systems related to core damage are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS); these can be identified as Vital Digital Assets (VDAs). To develop a general methodology for identifying accident-related VDAs among CDAs, (1) a method using the minimal cut sets of the PRA model will be studied, and (2) a method quantifying the results of a digital I&C PRA, performed so that the fault trees reflect all system-related digital cabinets, will be studied.

  3. Research on Methodology to Prioritize Critical Digital Assets based on Nuclear Risk Assessment

    International Nuclear Information System (INIS)

    Kim, Wonjik; Kwon, Kookheui; Kim, Hyundoo

    2016-01-01

    Digital systems are used in nuclear facilities to monitor and control various types of field devices, as well as to obtain and store vital information. It is therefore increasingly important for nuclear facilities to protect digital systems from cyber-attack, in terms of both safe operation and public health, since cyber compromise of these systems could lead to unacceptable radiological consequences. Based on KINAC/RS-015, a cyber security regulatory standard, regulatory activities for cyber security at nuclear facilities generally focus on critical digital assets (CDAs), i.e. digital assets related to safety, security, and emergency preparedness. Critical digital assets are estimated to make up over 60% of all digital assets in a nuclear power plant, so prioritizing them is required to improve the efficiency of regulation and implementation. This paper introduces the status of research on a methodology for prioritizing critical digital assets based on nuclear risk assessment. To identify the digital assets that directly affect accidents, PRA results (event trees, fault trees, and minimal cut sets) are analyzed. According to this analysis, the digital systems related to core damage are the ESF-CCS (safety-related component control system) and the Process-CCS (non-safety-related component control system), as well as the Engineered Safety Features Actuation System (ESFAS); these can be identified as Vital Digital Assets (VDAs). To develop a general methodology for identifying accident-related VDAs among CDAs, (1) a method using the minimal cut sets of the PRA model will be studied, and (2) a method quantifying the results of a digital I&C PRA, performed so that the fault trees reflect all system-related digital cabinets, will be studied.

  4. A New Approach to Weapon-Target Assignment in Cooperative Air Combat

    Directory of Open Access Journals (Sweden)

    Yi-zhe Chang

    2017-01-01

    Full Text Available A new approach to solving the weapon-target assignment (WTA) problem is proposed in this paper. First, the relative superiority that lays the foundation for assignment is calculated based on the combat power energy of the fighters. Based on this relative superiority, the WTA problem is formulated. A hybrid algorithm combining an improved artificial fish swarm algorithm (AFSA) with an improved harmony search (HS) is then introduced and applied to solve the assignment formulation. Finally, the proposed approach is validated on eight representative benchmark functions and two concrete cooperative air combat examples. The results show that the proposed approach performs well in solving the WTA problem in cooperative air combat.
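
    The paper's hybrid AFSA/HS solver is involved, but the underlying assignment objective is easy to illustrate on small instances with a brute-force search over one-to-one fighter-to-target assignments. A minimal sketch (the 3x3 superiority matrix is hypothetical, not from the paper):

    ```python
    from itertools import permutations

    def best_assignment(superiority):
        """Exhaustively search one-to-one fighter-to-target assignments,
        maximizing total relative superiority (viable only for small n)."""
        n = len(superiority)
        best, best_perm = float("-inf"), None
        for perm in permutations(range(n)):
            total = sum(superiority[i][perm[i]] for i in range(n))
            if total > best:
                best, best_perm = total, perm
        return best, best_perm

    # Hypothetical relative-superiority matrix: rows = fighters, cols = targets
    S = [[0.6, 0.2, 0.1],
         [0.3, 0.7, 0.4],
         [0.5, 0.1, 0.8]]
    print(best_assignment(S))  # best total and the fighter->target permutation
    ```

    Metaheuristics such as the paper's AFSA/HS hybrid become necessary because this exhaustive search grows factorially with the number of fighters.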

  5. IoT-based Asset Management System for Healthcare-related Industries

    Directory of Open Access Journals (Sweden)

    Lee Carman Ka Man

    2015-11-01

    Full Text Available The healthcare industry has been focusing efforts on optimizing inventory management procedures through the incorporation of Information and Communication Technology, in the form of tracking devices and data mining, to establish ideal inventory models. In this paper, a roadmap is developed towards a technological assessment of the Internet of Things (IoT) in the healthcare industry, 2010–2020. According to the roadmap, an IoT-based healthcare asset management system (IoT-HAMS) is proposed and developed based on an Artificial Neural Network (ANN) and Fuzzy Logic (FL), incorporating IoT technologies for asset management to optimize the supply of resources.

  6. Waste-aware fluid volume assignment for flow-based microfluidic biochips

    DEFF Research Database (Denmark)

    Schneider, Alexander Rüdiger; Pop, Paul; Madsen, Jan

    2017-01-01

    complex Fluidic Units (FUs) such as switches, micropumps, mixers and separators can be constructed. When running a biochemical application on a FBMB, fluid volumes are dispensed from input reservoirs and used by the FUs. Given a biochemical application and a biochip, we are interested in determining...... the fluid volume assignment for each operation of the application, such that the FUs volume requirements are satisfied, while over- and underflow are avoided and the total volume of fluid used is minimized. We propose an algorithm for this fluid assignment problem. Compared to previous work, our method...

  7. Integer Linear Programming for Constrained Multi-Aspect Committee Review Assignment

    Science.gov (United States)

    Karimzadehgan, Maryam; Zhai, ChengXiang

    2011-01-01

    Automatic review assignment can significantly improve the productivity of many people such as conference organizers, journal editors and grant administrators. A general setup of the review assignment problem involves assigning a set of reviewers on a committee to a set of documents to be reviewed under the constraint of review quota so that the reviewers assigned to a document can collectively cover multiple topic aspects of the document. No previous work has addressed such a setup of committee review assignments while also considering matching multiple aspects of topics and expertise. In this paper, we tackle the problem of committee review assignment with multi-aspect expertise matching by casting it as an integer linear programming problem. The proposed algorithm can naturally accommodate any probabilistic or deterministic method for modeling multiple aspects to automate committee review assignments. Evaluation using a multi-aspect review assignment test set constructed using ACM SIGIR publications shows that the proposed algorithm is effective and efficient for committee review assignments based on multi-aspect expertise matching. PMID:22711970

  8. "Asset Ownership Across Generations"

    OpenAIRE

    Ngina S. Chiteji; Frank P. Stafford

    2000-01-01

    This paper examines cross-generational connections in asset ownership. It begins by presenting a theoretical framework that develops the distinction between the intergenerational transfer of knowledge about financial assets and the direct transfer of dollars from parents to children. Its analysis of data from the Panel Study of Income Dynamics (PSID) reveals intergenerational correlations in asset ownership, and we find evidence to suggest that parental asset ownership or family-based exposur...

  9. A Relative-Localization Algorithm Using Incomplete Pairwise Distance Measurements for Underwater Applications

    Directory of Open Access Journals (Sweden)

    Kae Y. Foo

    2010-01-01

    Full Text Available The task of localizing underwater assets involves the relative localization of each unit using only pairwise distance measurements, usually obtained from time-of-arrival or time-delay-of-arrival measurements. In the fluctuating underwater environment, a complete set of pair-wise distance measurements can often be difficult to acquire, thus hindering a straightforward closed-form solution in deriving the assets' relative coordinates. An iterative multidimensional scaling approach is presented based upon a weighted-majorization algorithm that tolerates missing or inaccurate distance measurements. Substantial modifications are proposed to optimize the algorithm, while the effects of refractive propagation paths are considered. A parametric study of the algorithm based upon simulation results is shown. An acoustic field-trial was then carried out, presenting field measurements to highlight the practical implementation of this algorithm.
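
    As a rough illustration of the idea (not the paper's weighted-majorization algorithm itself), the stress over only the measured pairs can be minimized by plain gradient descent, which naturally skips missing measurements; all coordinates and distances below are made up:

    ```python
    import math
    import random

    def relative_localize(dist, known, dim=2, iters=3000, lr=0.02, seed=1):
        """Toy relative localization: gradient descent on
        stress = sum over *measured* pairs of (||xi - xj|| - d_ij)^2.
        Pairs with known[i][j] == False are simply skipped, so an
        incomplete set of pairwise distances is tolerated."""
        rng = random.Random(seed)
        n = len(dist)
        X = [[rng.uniform(-1.0, 1.0) for _ in range(dim)] for _ in range(n)]
        for _ in range(iters):
            grad = [[0.0] * dim for _ in range(n)]
            for i in range(n):
                for j in range(i + 1, n):
                    if not known[i][j]:
                        continue                    # missing measurement
                    diff = [X[i][k] - X[j][k] for k in range(dim)]
                    d = math.sqrt(sum(c * c for c in diff)) or 1e-12
                    g = 2.0 * (d - dist[i][j]) / d
                    for k in range(dim):
                        grad[i][k] += g * diff[k]
                        grad[j][k] -= g * diff[k]
            for i in range(n):
                for k in range(dim):
                    X[i][k] -= lr * grad[i][k]
        return X
    ```

    The recovered coordinates are only defined up to rotation, translation and reflection, which is exactly the "relative" localization the abstract describes; the paper's majorization update replaces the fixed-step gradient descent used here.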

  10. A Path-Based Gradient Projection Algorithm for the Cost-Based System Optimum Problem in Networks with Continuously Distributed Value of Time

    Directory of Open Access Journals (Sweden)

    Wen-Xiang Wu

    2014-01-01

    Full Text Available The cost-based system optimum problem in networks with continuously distributed value of time is formulated in a path-based form, which cannot be solved by the Frank-Wolfe algorithm. In light of the order-of-magnitude improvement in available computer memory in recent years, path-based algorithms have come to be regarded as a viable approach for traffic assignment problems with reasonably large network sizes. We develop a path-based gradient projection algorithm for solving the cost-based system optimum model, based on the Goldstein-Levitin-Polyak method, which has been successfully applied to standard user equilibrium and system optimum problems. The Sioux Falls network is used to verify the effectiveness of the algorithm.

  11. Optimising investment in asset management using the multivariate asset management assessment topography

    Directory of Open Access Journals (Sweden)

    Bam, Wouter Gideon

    2014-08-01

    Full Text Available The multivariate asset management assessment topography (MAMAT) was developed to quantify, and represent graphically, the development, adoption, and performance of a business' asset management (AM) systems, as described by standards such as PAS 55. The MAMAT provides a way to visualise clearly the strengths and weaknesses of a business' asset management system. Building on the MAMAT, a model describing the relationship between the commitment of resources and the corresponding improvement in the MAMAT assessment outcome is proposed. The goal is to develop an optimisation model that maximises financial benefits by improving the MAMAT assessment score achieved by a business, while minimising the investment required to attain this improvement. This is achieved by determining the optimal allocation of resources to the different subcategories of the MAMAT assessment framework. The multi-objective cross-entropy method (MOO CEM) is used to find the Pareto set of solutions for this problem. To showcase the intended industry application and use of the optimisation model, a hypothetical case study is executed and described in this paper. From this application, it was found that the MOO CEM finds useful solutions that can support the implementation of standards such as PAS 55 by prioritising and assigning resources to implementation activities.

  12. PID feedback controller used as a tactical asset allocation technique: The G.A.M. model

    Science.gov (United States)

    Gandolfi, G.; Sabatini, A.; Rossolini, M.

    2007-09-01

    The objective of this paper is to illustrate a tactical asset allocation technique utilizing the PID controller. The proportional-integral-derivative (PID) controller is widely applied in most industrial processes; it has been used successfully for over 50 years and is employed in the vast majority (over 95%) of plant control processes. It is a robust and easily understood algorithm that can provide excellent control performance in spite of the diverse dynamic characteristics of the process plant. In finance, the process plant controlled by the PID controller can be represented by the financial market assets forming a portfolio; more specifically, in the present work, the plant is represented by a risk-adjusted return variable. The main target of money and portfolio managers is to achieve a relevant risk-adjusted return, and numerous kinds of return/risk ratios are studied and used in the literature and in the financial industry. The aim of this work is to develop a tactical asset allocation technique that optimizes risk-adjusted return by means of asset allocation methodologies based on the model-free PID feedback control procedure. The process plant does not need to be mathematically modeled: the PID control action lies in altering the portfolio asset weights, according to the PID algorithm and its Ziegler-Nichols-tuned parameters, in order to approach the desired portfolio risk-adjusted return efficiently.
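
    A minimal sketch of the feedback idea, assuming a hypothetical linear "plant" that maps equity weight to risk-adjusted return (the paper's plant is model-free and market-driven, and its Ziegler-Nichols-tuned gains are not reproduced here; the gains and target below are made up):

    ```python
    class PID:
        """Textbook discrete PID controller: u = Kp*e + Ki*sum(e)*dt + Kd*de/dt."""
        def __init__(self, kp, ki, kd, dt=1.0):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_err = 0.0, None

        def step(self, error):
            self.integral += error * self.dt
            deriv = 0.0 if self.prev_err is None else (error - self.prev_err) / self.dt
            self.prev_err = error
            return self.kp * error + self.ki * self.integral + self.kd * deriv

    def risk_adjusted_return(weight):
        # Hypothetical linear plant: portfolio risk-adjusted return as a
        # function of the equity weight (stand-in for the market response).
        return 0.02 + 0.08 * weight

    target = 0.06   # desired risk-adjusted return
    weight = 0.2    # initial equity allocation
    pid = PID(kp=2.0, ki=0.5, kd=0.0)
    for _ in range(100):
        error = target - risk_adjusted_return(weight)
        weight += pid.step(error)   # the PID action alters the asset weight
    ```

    In practice the weight would also be clamped to admissible bounds (with anti-windup on the integral term), and the error would be computed from observed, noisy returns rather than a known plant.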

  13. Computational Aspects of Assigning Agents to a Line

    DEFF Research Database (Denmark)

    Aziz, Haris; Hougaard, Jens Leth; Moreno-Ternero, Juan D.

    2017-01-01

    -egalitarian assignments. The approach relies on an algorithm which is shown to be faster than general purpose algorithms for the assignment problem. We also extend the approach to probabilistic assignments and explore the computational features of existing, as well as new, methods for this setting....

  15. Capacity constrained assignment in spatial databases

    DEFF Research Database (Denmark)

    U, Leong Hou; Yiu, Man Lung; Mouratidis, Kyriakos

    2008-01-01

    large to fit in main memory. Motivated by this fact, we propose efficient algorithms for optimal assignment that employ novel edge-pruning strategies, based on the spatial properties of the problem. Additionally, we develop approximate (i.e., suboptimal) CCA solutions that provide a trade-off between...

  16. A pathway to a more sustainable water sector: sustainability-based asset management.

    Science.gov (United States)

    Marlow, D R; Beale, D J; Burn, S

    2010-01-01

    The water sectors of many countries face two overarching challenges simultaneously: the need to undertake effective asset management, and the broader need to evolve business processes so as to embrace sustainability principles. Research has thus been undertaken into the role sustainability principles play in asset management. As part of this research, a series of 25 in-depth interviews was undertaken with water sector professionals from around Australia. Drawing on the results of these interviews, this paper outlines the conceptual relationship between asset management and sustainability, along with a synthesis of the relevant opinions voiced in the interviews. The interviews indicated that the participating water authorities have made a strong commitment to sustainability, but that change processes are needed to embed sustainability principles into business-as-usual practices. Interviewees also noted that asset management and sustainability are interlinked from a number of perspectives, especially in the way decision making is undertaken with respect to assets and service provision. The interviews also provided insights into the research needed to develop a holistic sustainability-based asset management framework.

  17. Performance-based contracting for maintaining transportation assets with emphasis on bridges

    Directory of Open Access Journals (Sweden)

    Alsharqawi Mohammed

    2017-01-01

    Full Text Available With a large number of aging transportation infrastructure assets in North America and the growing problem of deterioration across the globe, managing these assets has been the subject of ongoing research. There is an overwhelming amount of maintenance and rehabilitation work to be done, and selecting a suitable maintenance, repair or replacement (MRR) strategy is one of the most challenging tasks for decision makers; limited budgets and resources make the decision-making process even more challenging. The need to maintain infrastructure in the highest possible condition while investing the minimum amount of money has promoted innovative contracting approaches. Transportation agencies have increased private sector involvement through long-term performance-based maintenance contracts, an approach known as Performance-Based Contracting (PBC). PBC is a type of contract that pays a contractor based on the results achieved, not on the methods used to perform the maintenance work. The literature shows that agencies are expanding the amount of contracting they do in order to maintain and achieve a better standard of infrastructure facilities. Therefore, the objective of this paper is to review performance-based contracting for transportation infrastructure, with emphasis on bridge assets.

  18. De novo clustering methods outperform reference-based methods for assigning 16S rRNA gene sequences to operational taxonomic units

    Directory of Open Access Journals (Sweden)

    Sarah L. Westcott

    2015-12-01

    Full Text Available Background. 16S rRNA gene sequences are routinely assigned to operational taxonomic units (OTUs) that are then used to analyze complex microbial communities. A number of methods have been employed to carry out the assignment of 16S rRNA gene sequences to OTUs, leading to confusion over which method is optimal. A recent study suggested that a clustering method should be selected based on its ability to generate stable OTU assignments that do not change as additional sequences are added to the dataset. In contrast, we contend that the quality of the OTU assignments, i.e. the ability of the method to properly represent the distances between the sequences, is more important. Methods. Our analysis implemented six de novo clustering algorithms (single linkage, complete linkage, average linkage, abundance-based greedy clustering, distance-based greedy clustering, and Swarm) as well as the open- and closed-reference methods. Using two previously published datasets, we used the Matthews correlation coefficient (MCC) to assess the stability and quality of OTU assignments. Results. The stability of OTU assignments did not reflect the quality of the assignments. Depending on the dataset being analyzed, the average linkage and the distance- and abundance-based greedy clustering methods generated OTUs that were more likely to represent the actual distances between sequences than the open- and closed-reference methods. We also demonstrated that for the greedy algorithms VSEARCH produced assignments comparable to those produced by USEARCH, making VSEARCH a viable free and open source alternative to USEARCH. Further interrogation of the reference-based methods indicated that when USEARCH or VSEARCH was used to identify the closest reference, the OTU assignments were sensitive to the order of the reference sequences, because reference sequences can be identical over the region being considered.
More troubling was the observation that while both USEARCH and
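
    The Matthews correlation coefficient used here to score OTU assignment quality is computed directly from confusion-matrix counts; a minimal sketch with made-up counts:

    ```python
    import math

    def mcc(tp, tn, fp, fn):
        """Matthews correlation coefficient from confusion-matrix counts:
        (TP*TN - FP*FN) / sqrt((TP+FP)(TP+FN)(TN+FP)(TN+FN)).
        Returns 0.0 when any marginal is empty (undefined denominator)."""
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom

    print(mcc(90, 80, 10, 20))  # made-up counts, value in [-1, 1]
    ```

    In the OTU setting, a "true positive" is a pair of sequences within the distance threshold that the method placed in the same OTU, so the MCC rewards clusterings that faithfully reproduce the pairwise distances.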

  19. AUTOBA: automation of backbone assignment from HN(C)N suite of experiments.

    Science.gov (United States)

    Borkar, Aditi; Kumar, Dinesh; Hosur, Ramakrishna V

    2011-07-01

    Development of efficient strategies and automation represent important milestones of progress in rapid structure determination efforts in proteomics research. In this context, we present here an efficient algorithm named as AUTOBA (Automatic Backbone Assignment) designed to automate the assignment protocol based on HN(C)N suite of experiments. Depending upon the spectral dispersion, the user can record 2D or 3D versions of the experiments for assignment. The algorithm uses as inputs: (i) protein primary sequence and (ii) peak-lists from user defined HN(C)N suite of experiments. In the end, one gets H(N), (15)N, C(α) and C' assignments (in common BMRB format) for the individual residues along the polypeptide chain. The success of the algorithm has been demonstrated, not only with experimental spectra recorded on two small globular proteins: ubiquitin (76 aa) and M-crystallin (85 aa), but also with simulated spectra of 27 other proteins using assignment data from the BMRB.

  20. DNABIT Compress - Genome compression algorithm.

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-22

    Data compression is concerned with how information is organized in data; efficient storage means removing redundancy from the data stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences, based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences of large genomes. Significantly better compression results show that "DNABIT Compress" is the best among the existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), the new DNABIT Compress algorithm also significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique bit code) to fragments of a DNA sequence (exact repeats and reverse repeats) is a concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the best existing methods could not achieve a ratio below 1.72 bits/base.
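
    The 2 bits/base baseline that repeat-aware schemes such as DNABIT Compress improve on can be illustrated with a plain fixed-width packer. This is not the paper's algorithm (which additionally assigns bit codes to exact and reverse repeat fragments); it only shows the naive binary encoding of the four bases:

    ```python
    CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

    def pack(seq):
        """Pack a DNA string into bytes at exactly 2 bits/base."""
        out, acc, nbits = bytearray(), 0, 0
        for base in seq:
            acc = (acc << 2) | CODE[base]
            nbits += 2
            if nbits == 8:
                out.append(acc)
                acc, nbits = 0, 0
        if nbits:                        # pad the final partial byte
            out.append(acc << (8 - nbits))
        return bytes(out), len(seq)      # length needed to undo padding

    def unpack(packed, n):
        """Recover the original n-base DNA string from packed bytes."""
        bases = "ACGT"
        seq = []
        for i in range(n):
            byte = packed[i // 4]
            shift = 6 - 2 * (i % 4)
            seq.append(bases[(byte >> shift) & 0b11])
        return "".join(seq)
    ```

    Beating this 2 bits/base floor, as the reported 1.58 bits/base does, requires exploiting repeats: a repeated fragment is replaced by a short code referencing its earlier occurrence instead of being stored base by base.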

  1. End-of-life conversations and care: an asset-based model for community engagement.

    Science.gov (United States)

    Matthiesen, Mary; Froggatt, Katherine; Owen, Elaine; Ashton, John R

    2014-09-01

    Public awareness work regarding palliative and end-of-life care is increasingly promoted within national strategies for palliative care. Different approaches to undertaking this work are being used, often based upon broader educational principles, but little is known about how to undertake such initiatives in a way that equally engages both the health and social care sector and the local communities. An asset-based community engagement approach has been developed that facilitates community-led awareness initiatives concerning end-of-life conversations and care by identifying and connecting existing skills and expertise. (1) To describe the processes and features of an asset-based community engagement approach that facilitates community-led awareness initiatives with a focus on end-of-life conversations and care; and (2) to identify key community-identified priorities for sustainable community engagement processes. An asset-based model of community engagement specific to end-of-life issues using a four-step process is described (getting started, coming together, action planning and implementation). The use of this approach, in two regional community engagement programmes, based across rural and urban communities in the northwest of England, is described. The assets identified in the facilitated community engagement process encompassed people's talents and skills, community groups and networks, government and non-government agencies, physical and economic assets and community values and stories. Five priority areas were addressed to ensure active community engagement work: information, outreach, education, leadership and sustainability. A facilitated, asset-based approach of community engagement for end-of-life conversations and care can catalyse community-led awareness initiatives. This occurs through the involvement of community and local health and social care organisations as co-creators of this change across multiple sectors in a sustainable way. 
This approach

  2. Genetic Algorithms for Agent-Based Infrastructure Interdependency Modeling and Analysis

    Energy Technology Data Exchange (ETDEWEB)

    May Permann

    2007-03-01

    Today’s society relies greatly upon an array of complex national and international infrastructure networks such as transportation, electric power, telecommunication, and financial networks. This paper describes initial research combining agent-based infrastructure modeling software and genetic algorithms (GAs) to help optimize infrastructure protection and restoration decisions. This research proposes to apply GAs to the problem of infrastructure modeling and analysis in order to determine the optimum assets to restore or protect from attack or other disaster. This research is just commencing and therefore the focus of this paper is the integration of a GA optimization method with a simulation through the simulation’s agents.

  3. Semi-flocking algorithm for motion control of mobile sensors in large-scale surveillance systems.

    Science.gov (United States)

    Semnani, Samaneh Hosseini; Basir, Otman A

    2015-01-01

    The ability of sensors to self-organize is an important asset in surveillance sensor networks. Self-organization implies self-control at the sensor level and coordination at the network level. Biologically inspired approaches have recently gained significant attention as a tool to address the issue of sensor control and coordination in sensor networks. These approaches are exemplified by two well-known algorithms, the Flocking algorithm and the Anti-Flocking algorithm. Although these two biologically inspired algorithms have demonstrated promising performance, they show deficiencies in their ability to maintain simultaneous, robust dynamic area coverage and target coverage; these two coverage objectives are inherently conflicting. This paper presents Semi-Flocking, a biologically inspired algorithm that benefits from key characteristics of both the Flocking and Anti-Flocking algorithms. The Semi-Flocking algorithm approaches the problem by assigning a small flock of sensors to each target while leaving some sensors free to explore the environment. This allows the algorithm to strike a balance between robust area coverage and target coverage, a balance facilitated via flock-sensor coordination. The performance of the proposed Semi-Flocking algorithm is examined and compared with the other two flocking-based algorithms, once using randomly moving targets and once using a standard walking-pedestrian dataset. The results of both experiments show that the Semi-Flocking algorithm outperforms both the Flocking and Anti-Flocking algorithms with respect to the area coverage and target coverage objectives. Furthermore, the results show that the proposed algorithm achieves shorter target detection times and fewer undetected targets than the other two flocking-based algorithms.
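
    The partitioning step can be sketched as a nearest-sensors assignment: each target claims a small flock, and whatever remains stays free to explore. The coordinates and flock size below are hypothetical, and the real algorithm adds the motion-control dynamics on top of this split:

    ```python
    import math

    def semi_flock_assign(sensors, targets, flock_size=2):
        """Assign the `flock_size` nearest unassigned sensors to each target
        (the exploitation flocks); the rest stay free to explore the area.
        A simplified reading of the Semi-Flocking partitioning step."""
        free = set(range(len(sensors)))
        flocks = {}
        for t, (tx, ty) in enumerate(targets):
            nearest = sorted(
                free, key=lambda s: math.hypot(sensors[s][0] - tx, sensors[s][1] - ty)
            )[:flock_size]
            flocks[t] = nearest
            free -= set(nearest)
        return flocks, sorted(free)

    # Hypothetical layout: five sensors, two targets
    flocks, free = semi_flock_assign(
        sensors=[(0, 0), (1, 0), (5, 5), (6, 5), (10, 0)],
        targets=[(0.4, 0), (5.4, 5)],
    )
    ```

    The free sensors would then run the exploratory (anti-flocking-like) behaviour for area coverage, while each flock tracks its target.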

  4. Framework, process and tool for managing technology-based assets

    CSIR Research Space (South Africa)

    Kfir, R

    2000-10-01

    Full Text Available ) and the intellectual property (IP) of the organisation, The study describes a framework linking the core processes supporting the management of technology-based assets and offerings with other organisational elements such as leadership, strategy, and culture. Specific...

  5. Ship Block Transportation Scheduling Problem Based on Greedy Algorithm

    Directory of Open Access Journals (Sweden)

    Chong Wang

    2016-05-01

    Full Text Available Ship block transportation problems are crucial to address in reducing construction cost and improving the productivity of shipyards. Shipyards aim to maximize the workload balance of transporters under the time constraint that all blocks be transported during the planning horizon. This process leads to three types of penalty time: empty transporter travel time, delay time, and tardy time, and this study aims to minimize their sum. First, the study formulates the ship block transportation problem, generalizing the block transportation restrictions to multiple transporter types. Second, the problem is transformed into the classical traveling salesman problem plus an assignment problem through a reasonable model simplification and by adding a virtual node to the proposed directed graph. A heuristic based on the greedy algorithm is then proposed to assign blocks to available transporters and to sequence the blocks for each transporter simultaneously. Finally, numerical experiments are used to validate the model; the results show that the proposed algorithm is effective in realizing efficient use of the transporters in shipyards, improving transporter utilization and reducing the cost of ship block logistics.
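
    The greedy idea (hand each block to the transporter whose empty trip to the block's pickup is currently cheapest) can be sketched as follows. The node names and travel times are hypothetical, and the paper's heuristic additionally accounts for delay and tardy penalties and multiple transporter types:

    ```python
    def greedy_schedule(blocks, transporters, travel):
        """Greedily assign each block (in order) to the transporter whose
        current position makes the empty trip to the pickup cheapest;
        a simplified version of the greedy assignment/sequencing step.
        `travel[a][b]` is a hypothetical travel-time lookup."""
        pos = dict(transporters)              # current location per transporter
        plan = {t: [] for t in transporters}
        empty_time = 0
        for pickup, dropoff in blocks:
            t = min(pos, key=lambda tr: travel[pos[tr]][pickup])
            empty_time += travel[pos[t]][pickup]
            plan[t].append((pickup, dropoff))
            pos[t] = dropoff
        return plan, empty_time

    travel = {  # hypothetical symmetric travel times between yard nodes
        "A": {"A": 0, "B": 2, "C": 5, "D": 6},
        "B": {"A": 2, "B": 0, "C": 4, "D": 3},
        "C": {"A": 5, "B": 4, "C": 0, "D": 1},
        "D": {"A": 6, "B": 3, "C": 1, "D": 0},
    }
    plan, empty_time = greedy_schedule(
        blocks=[("A", "B"), ("C", "D"), ("B", "A")],
        transporters={1: "A", 2: "C"},
        travel=travel,
    )
    ```

    In this toy instance each pickup happens to coincide with a transporter's current position, so the accumulated empty travel time is zero; on realistic instances the greedy choice only minimizes each step locally, which is why the paper validates it against numerical experiments.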

  6. Tuning and performance evaluation of PID controller for superheater steam temperature control of 200 MW boiler using gain phase assignment algorithm

    Science.gov (United States)

    Begum, A. Yasmine; Gireesh, N.

    2018-04-01

    In a superheater, steam temperature is controlled in a cascade control loop consisting of PI and PID controllers. To improve superheater steam temperature control, the controllers' gains in the cascade loop have to be tuned efficiently. The mathematical model of the superheater is given by sets of nonlinear partial differential equations. The tuning methods studied here are designed for a first-order-plus-time-delay (FOPTD) transfer function model; hence, an FOPTD model is derived from the dynamical model of the superheater using the frequency response method. Optimal controller gains are then found with the Chien-Hrones-Reswick tuning algorithm and the Gain-Phase Assignment algorithm, based on the least value of the integral of time-weighted absolute error (ITAE).
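
    The ITAE criterion used to compare the tuning methods is straightforward to evaluate from a sampled error signal, e.g. with the trapezoidal rule (the sample values below are made up):

    ```python
    def itae(times, errors):
        """Integral of Time-weighted Absolute Error, ITAE = integral of t*|e(t)| dt,
        approximated by the trapezoidal rule over sampled (t, e) pairs."""
        total = 0.0
        samples = list(zip(times, errors))
        for (t0, e0), (t1, e1) in zip(samples, samples[1:]):
            total += 0.5 * (t0 * abs(e0) + t1 * abs(e1)) * (t1 - t0)
        return total

    # Made-up error samples from a step response: error decays toward zero
    print(itae([0, 1, 2], [2, -1, 0]))
    ```

    Because the error is weighted by time, ITAE penalizes lingering steady-state error more heavily than the initial transient, which favours tunings that settle quickly, the property compared between the two algorithms above.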

  7. Saving-Based Asset Pricing

    DEFF Research Database (Denmark)

    Dreyer, Johannes Kabderian; Schneider, Johannes; T. Smith, William

    2013-01-01

    This paper explores the implications of a novel class of preferences for the behavior of asset prices. Following a suggestion by Marshall (1920), we entertain the possibility that people derive utility not only from consumption, but also from the very act of saving. These ‘‘saving-based’’ prefere...

  8. [Health promotion based on assets: how to work with this perspective in local interventions?

    Science.gov (United States)

    Cofiño, Rafael; Aviñó, Dory; Benedé, Carmen Belén; Botello, Blanca; Cubillo, Jara; Morgan, Antony; Paredes-Carbonell, Joan Josep; Hernán, Mariano

    2016-11-01

    An asset-based approach could be useful to revitalise health promotion or community health interventions, combining work with multiple partnerships, positive health, community engagement, equity, and an orientation toward health determinants. We offer recommendations on how to incorporate the assets model into programmes, projects and interventions in health promotion. Some techniques for asset mapping are described, and experiences with this methodology being developed in different regions are systematised. We propose the term "Asset-based Health Promotion/Community Health" as an operational definition for working at the local level with a community engagement and participatory approach, building alliances between institutions at the state-regional level, and creating a framework for action that generates evaluations and evidence for population-level interventions from the perspective of positive health. Copyright © 2016 SESPAS. All rights reserved.

  9. Improving the asset pricing ability of the Consumption-Capital Asset Pricing Model?

    DEFF Research Database (Denmark)

    Rasmussen, Anne-Sofie Reng

    This paper compares the asset pricing ability of the traditional consumption-based capital asset pricing model to models from two strands of literature attempting to improve on the poor empirical results of the C-CAPM. One strand is based on the intertemporal asset pricing model of Campbell (1993...... able to price assets conditionally as suggested by Cochrane (1996) and Lettau and Ludvigson (2001b). The unconditional C-CAPM is rewritten as a scaled factor model using the approximate log consumption-wealth ratio cay, developed by Lettau and Ludvigson (2001a), as scaling variable. The models...... and composite. Thus, there is no unambiguous solution to the pricing ability problems of the C-CAPM. Models from both the alternative literature strands are found to outperform the traditional C-CAPM on average pricing errors. However, when weighting pricing errors by the full variance-covariance matrix...

  10. Evaluation of the Jonker-Volgenant-Castanon (JVC) assignment algorithm for track association

    Science.gov (United States)

    Malkoff, Donald B.

    1997-07-01

    The Jonker-Volgenant-Castanon (JVC) assignment algorithm was used by Lockheed Martin Advanced Technology Laboratories (ATL) for track association in the Rotorcraft Pilot's Associate (RPA) program. RPA is Army Aviation's largest science and technology program, involving an integrated hardware/software system approach for a next-generation helicopter containing advanced sensor equipment and applying artificial intelligence `associate' technologies. ATL is responsible for the multisensor, multitarget, onboard/offboard track fusion. McDonnell Douglas Helicopter Systems is the prime contractor and Lockheed Martin Federal Systems is responsible for developing much of the cognitive decision aiding and controls-and-displays subsystems. RPA is scheduled for flight testing beginning in 1997. RPA is unique in requiring real-time tracking and fusion for large numbers of highly maneuverable ground (and air) targets in a target-dense environment. It uses diverse sensors and is concerned with a large area of interest. Target class and identification data are tightly integrated with spatial and kinematic data throughout the processing. Because of platform constraints, processing hardware for track fusion was quite limited. No previous experience using JVC in this type of environment had been reported. ATL performed extensive testing of the JVC, concentrating on error rates and run-times under a variety of conditions. These included wide-ranging numbers and types of targets, sensor uncertainties, target attributes, differing degrees of target maneuverability, and diverse combinations of sensors. Testing utilized Monte Carlo approaches, as well as many kinds of challenging scenarios. Comparisons were made with a nearest-neighbor algorithm and a new, proprietary algorithm (the `Competition' algorithm). The JVC proved to be an excellent choice for the RPA environment, providing a good balance between speed of operation and accuracy of results.
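The class of problem the JVC algorithm solves can be illustrated with SciPy, whose `linear_sum_assignment` implements a modified Jonker-Volgenant method. The toy tracks, measurements, and squared-distance cost below are illustrative, not RPA data:

```python
# Sketch: track-to-measurement association as a linear assignment problem.
import numpy as np
from scipy.optimize import linear_sum_assignment

# Predicted track positions and incoming sensor measurements (toy data).
tracks = np.array([[0.0, 0.0], [10.0, 10.0], [20.0, 5.0]])
measurements = np.array([[9.8, 10.3], [0.2, -0.1], [19.5, 5.4]])

# Cost matrix: cost[i, j] = squared distance from track i to measurement j.
cost = ((tracks[:, None, :] - measurements[None, :, :]) ** 2).sum(axis=2)

# Minimum-cost one-to-one assignment over the whole cost matrix.
row, col = linear_sum_assignment(cost)
for t, m in zip(row, col):
    print(f"track {t} -> measurement {m}")
```

Unlike a greedy nearest-neighbor pass, the solver minimizes the total assignment cost globally, which is the property the abstract's accuracy comparisons hinge on.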

  11. Learning-based meta-algorithm for MRI brain extraction.

    Science.gov (United States)

    Shi, Feng; Wang, Li; Gilmore, John H; Lin, Weili; Shen, Dinggang

    2011-01-01

    Multiple-segmentation-and-fusion methods have been widely used for brain extraction, tissue segmentation, and region of interest (ROI) localization. However, such studies are hindered in practice by their computational complexity, mainly coming from the steps of template selection and template-to-subject nonlinear registration. In this study, we address these two issues and propose a novel learning-based meta-algorithm for MRI brain extraction. Specifically, we first use exemplars to represent the entire template library, and assign the most similar exemplar to the test subject. Second, a meta-algorithm combining two existing brain extraction algorithms (BET and BSE) is proposed to conduct multiple extractions directly on the test subject. Effective parameter settings for the meta-algorithm are learned from the training data and propagated to the subject through exemplars. We further develop a level-set-based fusion method to combine multiple candidate extractions into a closed smooth surface, yielding the final result. Experimental results show that, with only a small portion of subjects for training, the proposed method produces more accurate and robust brain extraction results, with a Jaccard index of 0.956 +/- 0.010 on a total of 340 subjects under 6-fold cross-validation, compared to those of BET and BSE even using their best parameter combinations.

  12. Exploring consumption- and asset-based poverty dynamics in Ethiopia

    African Journals Online (AJOL)

    This paper examines the dynamics of wellbeing in Ethiopia by assessing changes in poverty status based on consumption and asset ownership. Using panel data from the first two waves of the Ethiopia Socioeconomic Survey (ESS), we discover that although the cross-sectional poverty remains relatively unchanged ...

  13. DNABIT Compress – Genome compression algorithm

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-01

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, “DNABIT Compress”, for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences of larger genomes. Significantly better compression results show that “DNABIT Compress” is the best among the existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new algorithm also significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique bit code) to fragments of a DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the existing best methods could not achieve a ratio below 1.72 bits/base. PMID:21383923
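The core idea of assigning short bit codes to DNA bases can be sketched as follows. The 2-bit table below is illustrative rather than the paper's actual code assignments; DNABIT Compress adds special codes for exact and reverse repeats to push below the naive 2 bits/base shown here:

```python
# Illustrative 2-bit codes; the published algorithm's actual code table,
# including its repeat codes, differs.
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq):
    """Pack a DNA string into bytes, 4 bases per byte (2 bits each).

    For brevity this assumes len(seq) is a multiple of 4.
    """
    out = bytearray()
    for i in range(0, len(seq), 4):
        byte = 0
        for base in seq[i:i + 4]:
            byte = (byte << 2) | CODE[base]
        out.append(byte)
    return bytes(out)

def unpack(data, n):
    """Recover the first n bases from packed bytes."""
    inv = {v: k for k, v in CODE.items()}
    bases = []
    for byte in data:
        for shift in (6, 4, 2, 0):  # high bits hold the earliest base
            bases.append(inv[(byte >> shift) & 0b11])
    return "".join(bases[:n])

seq = "ACGTACGTACGT"
packed = pack(seq)
print(len(seq), "bases ->", len(packed), "bytes")  # 12 bases -> 3 bytes
```

Four bases per byte is the 2 bits/base baseline; the abstract's 1.58 bits/base comes from replacing repeated fragments with shorter dedicated codes.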

  14. A Depth Map Generation Algorithm Based on Saliency Detection for 2D to 3D Conversion

    Science.gov (United States)

    Yang, Yizhong; Hu, Xionglou; Wu, Nengju; Wang, Pengfei; Xu, Dong; Rong, Shen

    2017-09-01

    In recent years, 3D movies have attracted more and more attention because of their immersive stereoscopic experience. However, 3D content is still scarce, so estimating depth information for 2D-to-3D conversion of video is increasingly important. In this paper, we present a novel algorithm to estimate depth information from a video via a scene classification algorithm. In order to obtain perceptually reliable depth information for viewers, the algorithm first classifies scenes into three categories: landscape type, close-up type, and linear perspective type. For a landscape-type image, a specific algorithm divides the image into many blocks and assigns depth values using the relative-height cue. For a close-up-type image, a saliency-based method is adopted to enhance the foreground, and the result is combined with a global depth gradient to generate the final depth map. For a linear-perspective-type image, vanishing line detection yields a vanishing point, which is regarded as the farthest point from the viewer and assigned the deepest depth value; the rest of the image is assigned depth values according to the distance of each point from the vanishing point. Finally, depth-image-based rendering is employed to generate stereoscopic virtual views after bilateral filtering. Experiments show that the proposed algorithm achieves realistic 3D effects and yields satisfactory results, with perception scores of the anaglyph images between 6.8 and 7.8.

  15. INNOVATION IN ACCOUNTING BIOLOGIC ASSETS

    OpenAIRE

    Stolуarova M. A.; Shcherbina I. D.

    2016-01-01

    The article describes the innovations in the classification and measurement of biological assets according to IFRS (IAS) 41 "Agriculture". The difficulties faced by agricultural producers applying the standard are set out in the article. Under the classification based on the adopted amendments, fruit-bearing plants previously accounted for as biological assets measured at fair value are included in the category of fixed assets. The structure of biological assets and main means has bee...

  16. The Sociological Imagination and Community-Based Learning: Using an Asset-Based Approach

    Science.gov (United States)

    Garoutte, Lisa

    2018-01-01

    Fostering a sociological imagination in students is a central goal for most introductory sociology courses and sociology departments generally, yet success is difficult to achieve. This project suggests that elements of asset-based community development can be used in sociology classrooms to develop a sociological perspective. After…

  17. Fuzzy Weight Cluster-Based Routing Algorithm for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Teng Gao

    2015-01-01

    Full Text Available Cluster-based protocols are an important kind of routing in wireless sensor networks. However, due to the uneven distribution of cluster heads in classical clustering algorithms, some nodes may run out of energy too early, which is not suitable for large-scale wireless sensor networks. In this paper, a distributed clustering algorithm based on fuzzy weighted attributes is put forward to ensure both energy efficiency and extensibility. With all attributes comprehensively considered, the corresponding weight of each parameter is assigned using the direct method of fuzzy engineering theory. Each node then works out its property value. These property values are mapped to the time axis and trigger a timer for broadcasting cluster-head announcements. At the same time, the radio coverage method is adopted in order to avoid collisions and to ensure the symmetrical distribution of cluster heads. The aggregated data are forwarded to the sink node in multihop fashion. The simulation results demonstrate that the clustering algorithm based on fuzzy weighted attributes achieves a longer network lifetime and better extensibility than LEACH-like algorithms.
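The weighted-attribute scoring step can be sketched as below. The attribute names and weights are assumed for illustration, standing in for the values the paper derives with the direct method of fuzzy engineering theory; higher-scoring nodes fire their cluster-head timer earlier:

```python
# Assumed attribute weights, standing in for the fuzzy-derived values.
WEIGHTS = {"energy": 0.5, "degree": 0.3, "sink_proximity": 0.2}

def property_value(node):
    """Weighted sum of normalized attributes in [0, 1]; higher is better."""
    return sum(WEIGHTS[k] * node[k] for k in WEIGHTS)

def head_delay(node, max_delay=1.0):
    """Map the score to a broadcast timer: better nodes fire sooner."""
    return max_delay * (1.0 - property_value(node))

nodes = {
    "n1": {"energy": 0.9, "degree": 0.6, "sink_proximity": 0.8},
    "n2": {"energy": 0.4, "degree": 0.9, "sink_proximity": 0.5},
}
for name, attrs in nodes.items():
    print(name, round(property_value(attrs), 3), round(head_delay(attrs), 3))
```

Mapping the score onto a timer is what makes the election distributed: each node decides locally when to announce, and earlier announcements suppress later ones within radio range.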

  18. Packets Distributing Evolutionary Algorithm Based on PSO for Ad Hoc Network

    Science.gov (United States)

    Xu, Xiao-Feng

    2018-03-01

    Wireless communication networks have such features as limited bandwidth, changing channels, and dynamic topology. Ad hoc networks face many difficulties in access control, bandwidth distribution, resource assignment, and congestion control. Therefore, a wireless packet-distributing evolutionary algorithm based on PSO (DPSO) for ad hoc networks is proposed. First, the parameters that affect network performance are analyzed to obtain an effective network performance function. Second, the improved PSO evolutionary algorithm is used to solve the optimization problem from local to global in the process of distributing network packets. The simulation results show that the algorithm can ensure fairness and timeliness of network transmission, as well as improve the integrated utilization efficiency of ad hoc network resources.
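The generic PSO core that such an algorithm builds on can be sketched as follows; the sphere objective is a stand-in for the paper's network-performance function, which the abstract does not give:

```python
# Minimal particle swarm optimization sketch (generic PSO core).
import random

def objective(x):
    return sum(v * v for v in x)  # toy stand-in objective; lower is better

def pso(dim=2, swarm=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]                 # each particle's best-so-far
    pbest_f = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]    # swarm's best-so-far
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                # inertia + pull toward personal best + pull toward global best
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

best, best_f = pso()
print("best value:", best_f)
```

A DPSO-style variant would replace the continuous position vector with a packet-distribution encoding and the toy objective with the derived network performance function.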

  19. submission of art studio-based assignments: students experience

    African Journals Online (AJOL)

    PUBLICATIONS1

    are reluctant to complete their studio assignments on time are critically ... tative and qualitative data, derived from survey and interviews were used to ... is therefore exploratory and studio based. It ... mogenous group of students who report pro- ... Assignment management .... The analyses in this study are based on data.

  20. A note on ranking assignments using reoptimization

    DEFF Research Database (Denmark)

    Pedersen, Christian Roed; Nielsen, L.R.; Andersen, K.A.

    2005-01-01

    We consider the problem of ranking assignments according to cost in the classical linear assignment problem. An algorithm partitioning the set of possible assignments, as suggested by Murty, is presented where, for each partition, the optimal assignment is calculated using a new reoptimization...

  1. A Stone Resource Assignment Model under the Fuzzy Environment

    Directory of Open Access Journals (Sweden)

    Liming Yao

    2012-01-01

    to tackle a stone resource assignment problem with the aim of decreasing dust and waste water emissions. On the upper level, the local government wants to assign a reasonable exploitation amount to each stone plant so as to minimize total emissions and maximize employment and economic profit. On the lower level, stone plants must reasonably assign stone resources to produce different stone products under the exploitation constraint. To deal with inherent uncertainties, the objective functions and constraints are defuzzified using a possibility measure. A fuzzy simulation-based improved simulated annealing algorithm (FS-ISA) is designed to search for the Pareto optimal solutions. Finally, a case study is presented to demonstrate the practicality and efficiency of the model. Results and a comparison analysis are presented to highlight the performance of the optimization method, which proves to be very efficient compared with other algorithms.

  2. Local natural and cultural heritage assets and community based ...

    African Journals Online (AJOL)

    Community-based tourism (CBT) is seen as an opportunity, which mass tourism does not offer, for especially rural communities to develop their natural and cultural assets into tourism activities for the benefit of the community. The point of CBT is that the community, collectively and individually, gains a livelihood from ...

  3. Asset Pricing in Markets with Illiquid Assets

    OpenAIRE

    Longstaff, Francis A

    2005-01-01

    Many important classes of assets are illiquid in the sense that they cannot always be traded immediately. Thus, a portfolio position in these types of illiquid investments becomes at least temporarily irreversible. We study the asset-pricing implications of illiquidity in a two-asset exchange economy with heterogeneous agents. In this market, one asset is always liquid. The other asset can be traded initially, but then not again until after a “blackout” period. Illiquidity has a dramatic e...

  4. Digital asset ecosystems rethinking crowds and cloud

    CERN Document Server

    Blanke, Tobias

    2014-01-01

    Digital asset management is undergoing a fundamental transformation. Near universal availability of high-quality web-based assets makes it important to pay attention to the new world of digital ecosystems and what it means for managing, using and publishing digital assets. The Ecosystem of Digital Assets reflects on these developments and what the emerging 'web of things' could mean for digital assets. The book is structured into three parts, each covering an important aspect of digital assets. Part one introduces the emerging ecosystems of digital assets. Part two examines digital asset manag

  5. Dynamic Sequence Assignment.

    Science.gov (United States)

    1983-12-01

    AD-A136 548: Dynamic Sequence Assignment (U). Advanced Information and Decision Systems, Mountain View, CA 94040. C. A. O'Reilly et al. Unclassified, December 1983. ... reviews some important heuristic algorithms developed for faster solution of the sequence assignment problem. 3.1 Dynamic Programming Formulation for ...

  6. Dynamic routing and spectrum assignment based on multilayer virtual topology and ant colony optimization in elastic software-defined optical networks

    Science.gov (United States)

    Wang, Fu; Liu, Bo; Zhang, Lijia; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun

    2017-07-01

    Elastic software-defined optical networks greatly improve the flexibility of optical switching networks while bringing challenges to routing and spectrum assignment (RSA). A multilayer virtual topology model is proposed to solve the RSA problem. Two RSA algorithms based on the virtual topology are proposed: an ant colony optimization (ACO) algorithm of minimum consecutiveness loss and an ACO algorithm of maximum spectrum consecutiveness. Owing to the computing power of the control layer in a software-defined network, the routing algorithm avoids frequent exchange of link-state information between routers. Based on the effect of spectrum consecutiveness loss on the pheromone in the ACO, the path and spectrum with the least impact on the network are selected for each service request. The proposed algorithms have been compared with other algorithms. The results show that the proposed algorithms reduce the blocking rate by at least 5% and perform better in spectrum efficiency. Moreover, the proposed algorithms effectively decrease spectrum fragmentation and enhance available spectrum consecutiveness.

  7. Effects of cash transfers on Children's health and social protection in Sub-Saharan Africa: differences in outcomes based on orphan status and household assets.

    Science.gov (United States)

    Crea, Thomas M; Reynolds, Andrew D; Sinha, Aakanksha; Eaton, Jeffrey W; Robertson, Laura A; Mushati, Phyllis; Dumba, Lovemore; Mavise, Gideon; Makoni, J C; Schumacher, Christina M; Nyamukapa, Constance A; Gregson, Simon

    2015-05-28

    Unconditional and conditional cash transfer programmes (UCT and CCT) show potential to improve the well-being of orphans and other children made vulnerable by HIV/AIDS (OVC). We address the gap in current understanding about the extent to which household-based cash transfers differentially impact individual children's outcomes, according to risk or protective factors such as orphan status and household assets. Data were obtained from a cluster-randomised controlled trial in eastern Zimbabwe, with random assignment to three study arms - UCT, CCT or control. The sample included 5,331 children ages 6-17 from 1,697 households. Generalized linear mixed models were specified to predict OVC health vulnerability (child chronic illness and disability) and social protection (birth registration and 90% school attendance). Models included child-level risk factors (age, orphan status); household risk factors (adults with chronic illnesses and disabilities, greater household size); and household protective factors (including asset-holding). Interactions were systematically tested. Orphan status was associated with decreased likelihood for birth registration, and paternal orphans and children for whom both parents' survival status was unknown were less likely to attend school. In the UCT arm, paternal orphans fared better in likelihood of birth registration compared with non-paternal orphans. Effects of study arms on outcomes were not moderated by any other risk or protective factors. High household asset-holding was associated with decreased likelihood of child's chronic illness and increased birth registration and school attendance, but household assets did not moderate the effects of cash transfers on risk or protective factors. Orphaned children are at higher risk for poor social protection outcomes even when cared for in family-based settings. UCT and CCT each produced direct effects on children's social protection which are not moderated by other child- and household

  8. Inferential backbone assignment for sparse data

    International Nuclear Information System (INIS)

    Vitek, Olga; Bailey-Kellogg, Chris; Craig, Bruce; Vitek, Jan

    2006-01-01

    This paper develops an approach to protein backbone NMR assignment that effectively assigns large proteins while using limited sets of triple-resonance experiments. Our approach handles proteins with large fractions of missing data and many ambiguous pairs of pseudoresidues, and provides a statistical assessment of confidence in global and position-specific assignments. The approach is tested on an extensive set of experimental and synthetic data of up to 723 residues, with match tolerances of up to 0.5 ppm for Cα and Cβ resonance types. The tests show that the approach is particularly helpful when data contain experimental noise and require large match tolerances. The keys to the approach are an empirical Bayesian probability model that rigorously accounts for uncertainty in the data at all stages in the analysis, and a hybrid stochastic tree-based search algorithm that effectively explores the large space of possible assignments.

  9. Determinants of investment in fixed assets and in intangible assets for high-tech firms

    Directory of Open Access Journals (Sweden)

    Paulo Maçãs Nunes

    2017-05-01

    Full Text Available Based on a sample of 141 Portuguese high-tech firms for the period 2004-2012 and using GMM system (1998) and LSDVC (2005) dynamic estimators, this paper studies whether the determinants of high-tech firms' investment in fixed assets are identical to the determinants of their investment in intangible assets. The empirical evidence obtained allows us to conclude that the determinants of investment in fixed assets differ considerably from those of investment in intangible assets. Debt is a determinant stimulating investment in fixed assets, with age being a determinant restricting such investment. Size, age, internal finance and GDP are determinants stimulating investment in intangible assets, whereas debt and interest rates restrict such investment. These results let us make important suggestions for the owners/managers of high-tech firms, and also for policy-makers.

  10. Semantic based cluster content discovery in description first clustering algorithm

    International Nuclear Information System (INIS)

    Khan, M.W.; Asif, H.M.S.

    2017-01-01

    In the field of data analytics, grouping of similar documents in textual data is a serious problem. A lot of work has been done in this field and many algorithms have been proposed. One category of algorithms first groups documents on the basis of similarity and then assigns meaningful labels to those groups. Description-first clustering algorithms belong to the category in which a meaningful description is deduced first and relevant documents are then assigned to that description. LINGO (Label Induction Grouping algorithm) is a description-first clustering algorithm used for the automatic grouping of documents obtained from search results. It uses LSI (Latent Semantic Indexing), an IR (Information Retrieval) technique, for induction of meaningful cluster labels, and VSM (Vector Space Model) for cluster content discovery. In this paper we present LINGO using LSI during both the cluster label induction and cluster content discovery phases. Finally, we compare the results obtained from the algorithm when it uses VSM versus latent semantic analysis during the cluster content discovery phase. (author)
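The LSI step LINGO relies on can be illustrated with a plain SVD: the left singular vectors of a term-document matrix expose latent "concepts", and their top-weighted terms can seed cluster labels. The tiny corpus and term list below are illustrative:

```python
# Sketch: latent "concepts" from the SVD of a term-document matrix,
# the LSI idea behind LINGO's label induction. Corpus is illustrative.
import numpy as np

terms = ["asset", "pricing", "nmr", "backbone", "assignment"]
# Columns are documents; entries are term frequencies.
A = np.array([
    [2, 1, 0, 0],  # asset
    [1, 2, 0, 0],  # pricing
    [0, 0, 2, 1],  # nmr
    [0, 0, 1, 2],  # backbone
    [0, 0, 1, 1],  # assignment
], dtype=float)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
for k in range(2):  # two strongest latent concepts
    top = np.argsort(-np.abs(U[:, k]))[:2]  # top-weighted terms
    print(f"concept {k}: " + ", ".join(terms[i] for i in top))
```

On this block-structured matrix the two concepts recover the two topical groups, which is exactly the separation a label-induction phase would exploit.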

  11. Combining ambiguous chemical shift mapping with structure-based backbone and NOE assignment from 15N-NOESY

    KAUST Repository

    Jang, Richard

    2011-01-01

    Chemical shift mapping is an important technique in NMR-based drug screening for identifying the atoms of a target protein that potentially bind to a drug molecule upon the molecule's introduction in increasing concentrations. The goal is to obtain a mapping of peaks with known residue assignment from the reference spectrum of the unbound protein to peaks with unknown assignment in the target spectrum of the bound protein. Although a series of perturbed spectra help to trace a path from reference peaks to target peaks, a one-to-one mapping generally is not possible, especially for large proteins, due to errors such as noise peaks, missing peaks, peaks that go missing and then reappear, overlapped peaks, and new peaks not associated with any peaks in the reference. Due to these difficulties, the mapping is typically done manually or semi-automatically. However, automated methods are necessary for high-throughput drug screening. We present PeakWalker, a novel peak walking algorithm for fast-exchange systems that models the errors explicitly and performs many-to-one mapping. On the proteins hBclXL, UbcH5B, and histone H1, it achieves an average accuracy of over 95% with less than 1.5 residues predicted per target peak. Given these mappings as input, we present PeakAssigner, a novel combined structure-based backbone resonance and NOE assignment algorithm that uses just 15N-NOESY, while avoiding TOCSY experiments and 13C-labeling, to resolve the ambiguities for a one-to-one mapping. On the three proteins, it achieves an average accuracy of 94% or better. Copyright © 2011 ACM.

  12. Using satellite imagery to evaluate land-based camouflage assets

    CSIR Research Space (South Africa)

    Baumbach, J

    2006-02-01

    Full Text Available Using Satellite Imagery to Evaluate Land-based Camouflage Assets. J. Baumbach, M. Lubbe. CSIR Defence, Peace, Safety and Security, PO Box 395, Pretoria, 0001, South Africa. Email: jbaumbac@csir.co.za. ABSTRACT: A camouflage field trial experiment was conducted. For the experiment... analysis, change detection, unsupervised classification, supervised classification and object-based classification. RESULTS: The following table shows a summary of the different targets, and whether each was detected ( ) or not detected (x), using...

  13. Segment-based dose optimization using a genetic algorithm

    International Nuclear Information System (INIS)

    Cotrutz, Cristian; Xing Lei

    2003-01-01

    Intensity modulated radiation therapy (IMRT) inverse planning is conventionally done in two steps. First, the intensity maps of the treatment beams are optimized using a dose optimization algorithm. Each map is then decomposed into a number of segments using a leaf-sequencing algorithm for delivery. An alternative approach is to pre-assign a fixed number of field apertures and optimize directly the shapes and weights of the apertures. While the latter approach has the advantage of eliminating the leaf-sequencing step, the optimization of aperture shapes is less straightforward than beamlet-based optimization because of the complex dependence of the dose on the field shapes and their weights. In this work we report a genetic algorithm for segment-based optimization. Different from a gradient iterative approach or simulated annealing, the algorithm finds the optimum solution from a population of candidate plans. In this technique, each solution is encoded using three chromosomes: one for the positions of the left-bank leaves of each segment, the second for the positions of the right-bank leaves, and the third for the weights of the segments defined by the first two chromosomes. Convergence towards the optimum is realized by crossover and mutation operators that ensure proper exchange of information between the three chromosomes of all the solutions in the population. The algorithm is applied to a phantom and a prostate case, and the results are compared with those obtained using beamlet-based optimization. The main conclusion drawn from this study is that genetic optimization of segment shapes and weights can produce highly conformal dose distributions. In addition, our study also confirms previous findings that fewer segments are generally needed to generate plans comparable with those obtained using beamlet-based optimization. Thus the technique may have useful applications in facilitating IMRT treatment planning.
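The three-chromosome encoding can be sketched on a toy 1D problem: each candidate holds left-leaf positions, right-leaf positions, and segment weights, and a GA with segment-wise crossover and mutation fits a target "dose" profile. The encoding, target, and GA settings below are illustrative, not the paper's clinical setup:

```python
# Toy GA sketch of the three-chromosome segment encoding.
import random

BINS, SEGS = 10, 3
TARGET = [0, 1, 2, 3, 3, 3, 3, 2, 1, 0]  # toy target "dose" profile

def dose(ind):
    left, right, weight = ind
    d = [0.0] * BINS
    for l, r, w in zip(left, right, weight):
        for b in range(l, r):
            d[b] += w
    return d

def fitness(ind):
    """Squared error against the target profile; lower is better."""
    return sum((a - t) ** 2 for a, t in zip(dose(ind), TARGET))

def random_ind(rng):
    left = [rng.randrange(BINS) for _ in range(SEGS)]
    right = [rng.randrange(l + 1, BINS + 1) for l in left]  # keep l < r
    weight = [rng.uniform(0, 3) for _ in range(SEGS)]
    return [left, right, weight]  # the three "chromosomes"

def mutate(ind, rng):
    ind = [c[:] for c in ind]
    s = rng.randrange(SEGS)  # re-randomize one segment
    ind[0][s] = rng.randrange(BINS)
    ind[1][s] = rng.randrange(ind[0][s] + 1, BINS + 1)
    ind[2][s] = rng.uniform(0, 3)
    return ind

def crossover(a, b, rng):
    """Swap whole segments (all three chromosomes together) between parents."""
    child = [c[:] for c in a]
    for s in range(SEGS):
        if rng.random() < 0.5:
            for c in range(3):
                child[c][s] = b[c][s]
    return child

rng = random.Random(0)
pop = [random_ind(rng) for _ in range(40)]
init_err = min(map(fitness, pop))
for _ in range(200):
    pop.sort(key=fitness)
    elite = pop[:10]  # elitism preserves the best plan found so far
    pop = elite + [mutate(crossover(rng.choice(elite), rng.choice(elite), rng), rng)
                   for _ in range(30)]
best = min(pop, key=fitness)
print("initial best error:", round(init_err, 3), "-> final:", round(fitness(best), 3))
```

Swapping whole segments keeps the leaf-pair and weight genes consistent with each other, which mirrors the information exchange the abstract attributes to its crossover operator.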

  14. Optimization of Consignment-Store-Based Supply Chain with Black Hole Algorithm

    Directory of Open Access Journals (Sweden)

    Ágota Bányai

    2017-01-01

    Full Text Available The globalization of the economy and markets has led to increased networking in the field of manufacturing and services. These manufacturing and service processes, including the supply chain, have become more and more complex, and the supply chain in many cases includes consignment stores. The design and operation of these complex supply chain processes can be described as NP-hard optimization problems, which can be solved using sophisticated models and methods based on metaheuristic algorithms. This research proposes an integrated supply model based on consignment stores. After a careful literature review, this paper introduces a mathematical model to formulate the problem of consignment-store-based supply chain optimization. The integrated model includes facility location and assignment problems to be solved. Next, an enhanced black hole algorithm dealing with the multiobjective supply chain model is presented. A sensitivity analysis of the heuristic black hole optimization method is also described to check the efficiency of new operators in increasing the convergence of the algorithm. Numerical results with different datasets demonstrate how the proposed model supports the efficiency, flexibility, and reliability of the consignment-store-based supply chain.
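The black hole heuristic the paper enhances can be sketched as follows: candidate "stars" drift toward the current best solution (the black hole), and any star falling inside the event horizon is re-randomized to keep exploring. The toy objective stands in for the paper's supply chain cost model:

```python
# Sketch of the basic black hole metaheuristic.
import random

def objective(x):
    return sum(v * v for v in x)  # toy cost stand-in; lower is better

def black_hole(dim=2, stars=15, iters=200, lo=-10.0, hi=10.0, seed=3):
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(stars)]
    bh = min(pop, key=objective)  # best candidate is the black hole
    for _ in range(iters):
        for i, star in enumerate(pop):
            # move the star a random fraction of the way to the black hole
            pop[i] = [s + rng.random() * (b - s) for s, b in zip(star, bh)]
            if objective(pop[i]) < objective(bh):
                bh, pop[i] = pop[i], bh  # star becomes the new black hole
        # event horizon radius: black hole cost relative to total cost
        total = sum(objective(s) for s in pop) or 1e-12
        radius = objective(bh) / total
        for i, star in enumerate(pop):
            dist = sum((s - b) ** 2 for s, b in zip(star, bh)) ** 0.5
            if dist < radius and star is not bh:
                # swallowed star: replace with a fresh random candidate
                pop[i] = [rng.uniform(lo, hi) for _ in range(dim)]
    return bh, objective(bh)

best, cost = black_hole()
print("best cost:", cost)
```

The re-randomization inside the horizon is the mechanism that trades exploitation for exploration, and it is the kind of operator a sensitivity analysis like the paper's would vary.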

  15. Adaptive protection algorithm and system

    Science.gov (United States)

    Hedrick, Paul [Pittsburgh, PA; Toms, Helen L [Irwin, PA; Miller, Roger M [Mars, PA

    2009-04-28

    An adaptive protection algorithm and system for protecting electrical distribution systems traces the flow of power through a distribution system, assigns a value (or rank) to each circuit breaker in the system and then determines the appropriate trip set points based on the assigned rank.
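The ranking idea can be illustrated on a radial feeder modeled as a tree: trace power flow from the source with a breadth-first search, rank each breaker by its distance from the source, and derive a trip set point from the rank. The topology, names, and set-point rule below are hypothetical, not the patented system's actual values:

```python
# Sketch: rank breakers by tracing power flow, then derive trip set points.
from collections import deque

# Hypothetical feeder topology: breaker -> downstream breakers.
FEEDER = {"main": ["feederA", "feederB"], "feederA": ["branchA1"],
          "feederB": [], "branchA1": []}

def rank_breakers(root):
    """Breadth-first trace from the source; rank = distance from source."""
    rank, queue = {root: 0}, deque([root])
    while queue:
        b = queue.popleft()
        for child in FEEDER[b]:
            rank[child] = rank[b] + 1
            queue.append(child)
    return rank

def trip_setpoint(rank, base_amps=400.0):
    """Illustrative rule: breakers farther from the source trip at lower current."""
    return base_amps / (2 ** rank)

ranks = rank_breakers("main")
for b, r in sorted(ranks.items(), key=lambda kv: kv[1]):
    print(b, "rank", r, "trips at", trip_setpoint(r), "A")
```

Because ranks are recomputed from the traced topology, reconfiguring the feeder automatically yields new set points, which is the adaptive behavior the patent abstract describes.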

  16. Arbitrage Pricing, Capital Asset Pricing, and Agricultural Assets

    OpenAIRE

    Louise M. Arthur; Colin A. Carter; Fay Abizadeh

    1988-01-01

    A new asset pricing model, the arbitrage pricing theory, has been developed as an alternative to the capital asset pricing model. The arbitrage pricing theory model is used to analyze the relationship between risk and return for agricultural assets. The major conclusion is that the arbitrage pricing theory results support previous capital asset pricing model findings that the estimated risk associated with agricultural assets is low. This conclusion is more robust for the arbitrage pricing th...

  17. Towards fully automated structure-based NMR resonance assignment of 15N-labeled proteins from automatically picked peaks

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2011-01-01

    In NMR resonance assignment, an indispensable step in NMR protein studies, manually processed peaks from both N-labeled and C-labeled spectra are typically used as inputs. However, the use of homologous structures can allow one to use only N-labeled NMR data and avoid the added expense of using C-labeled data. We propose a novel integer programming framework for structure-based backbone resonance assignment using N-labeled data. The core consists of a pair of integer programming models: one for spin system forming and amino acid typing, and the other for backbone resonance assignment. The goal is to perform the assignment directly from spectra without any manual intervention via automatically picked peaks, which are much noisier than manually picked peaks, so methods must be error-tolerant. In the case of semi-automated/manually processed peak data, we compare our system with the Xiong-Pandurangan-Bailey-Kellogg contact replacement (CR) method, which is the most error-tolerant method for structure-based resonance assignment. Our system, on average, reduces the error rate of the CR method fivefold on their data set. In addition, by using an iterative algorithm, our system has the added capability of using the NOESY data to correct assignment errors due to errors in predicting the amino acid and secondary structure type of each spin system. On a publicly available data set for human ubiquitin, where the typing accuracy is 83%, we achieve 91% accuracy, compared to the 59% accuracy obtained without correcting for such errors. In the case of automatically picked peaks, using assignment information from yeast ubiquitin, we achieve a fully automatic assignment with 97% accuracy. To our knowledge, this is the first system that can achieve fully automatic structure-based assignment directly from spectra. This has implications in NMR protein mutant studies, where the assignment step is repeated for each mutant. © Copyright 2011, Mary Ann Liebert, Inc.
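At its smallest, the matching step in such frameworks — deciding which spin system maps to which residue so that the total typing score is maximized — is an assignment problem. A brute-force stand-in for the paper's integer programs (the score matrix is made up; real systems use ILP solvers and must tolerate noise and missing data):

```python
from itertools import permutations

def best_assignment(score):
    """Exhaustively match spin systems to residues, maximizing total typing score.

    score[i][j] = hypothetical likelihood that spin system i belongs to residue j.
    """
    n = len(score)
    best = max(permutations(range(n)),
               key=lambda p: sum(score[i][p[i]] for i in range(n)))
    return list(best)

# toy 3x3 typing score matrix (invented values)
score = [[0.9, 0.1, 0.0],
         [0.2, 0.8, 0.1],
         [0.1, 0.3, 0.7]]
# best_assignment(score) -> [0, 1, 2]
```

Exhaustive search is only viable for tiny n; the point of the integer programming formulation in the abstract is to solve the same matching at protein scale with structural constraints attached.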

  19. Treatment Algorithms Based on Tumor Molecular Profiling: The Essence of Precision Medicine Trials.

    Science.gov (United States)

    Le Tourneau, Christophe; Kamal, Maud; Tsimberidou, Apostolia-Maria; Bedard, Philippe; Pierron, Gaëlle; Callens, Céline; Rouleau, Etienne; Vincent-Salomon, Anne; Servant, Nicolas; Alt, Marie; Rouzier, Roman; Paoletti, Xavier; Delattre, Olivier; Bièche, Ivan

    2016-04-01

    With the advent of high-throughput molecular technologies, several precision medicine (PM) studies are currently ongoing that include molecular screening programs and PM clinical trials. Molecular profiling programs establish the molecular profile of patients' tumors with the aim to guide therapy based on identified molecular alterations. The aim of prospective PM clinical trials is to assess the clinical utility of tumor molecular profiling and to determine whether treatment selection based on molecular alterations produces superior outcomes compared with unselected treatment. These trials use treatment algorithms to assign patients to specific targeted therapies based on tumor molecular alterations. These algorithms should be governed by fixed rules to ensure standardization and reproducibility. Here, we summarize key molecular, biological, and technical criteria that, in our view, should be addressed when establishing treatment algorithms based on tumor molecular profiling for PM trials. © The Author 2015. Published by Oxford University Press.

  20. Rigorous Progress on Algorithms Based Routing and Wavelength Assignment in Trans-Egypt Network (TEGYNET) Management

    OpenAIRE

    Abd El–Naser A. Mohammed; Ahmed Nabih Zaki Rashed; Osama S. Fragallah; Mohamed G. El-Abyad

    2013-01-01

    In simple wavelength-division multiplexed (WDM) networks, a connection must be established along a route using a common wavelength on all of the links along the route. The introduction of wavelength converters into WDM cross connects increases the hardware cost and complexity. Given a set of connection requests, the routing and wavelength assignment problem involves finding a route (routing) and assigning a wavelength to each request. This paper has presented the WDM technology is being exten...

  1. Dynamic Allocation or Diversification: A Regime-Based Approach to Multiple Assets

    DEFF Research Database (Denmark)

    Nystrup, Peter; Hansen, Bo William; Larsen, Henrik Olejasz

    2018-01-01

    ’ behavior and a new, more intuitive way of inferring the hidden market regimes. The empirical results show that regime-based asset allocation is profitable, even when compared to a diversified benchmark portfolio. The results are robust because they are based on available market data with no assumptions...... about forecasting skills....

  2. Asset sales, asset exchanges, and shareholder wealth in China

    Directory of Open Access Journals (Sweden)

    Weiting Huang

    2012-01-01

    Full Text Available In this paper, we study a sample of 1376 corporate asset sales and 250 asset exchanges in China between 1998 and 2006. We find that corporate asset sales in China enhance firm value, with a cumulative abnormal return (CAR) of 0.46% for the pre-announcement five-day period, which is consistent with the evidence discovered in both the U.K. and the U.S. For companies that exchanged assets during the sample period, the pre-announcement five-day CAR of 1.32% is statistically significant. We also discover that gains from divesting assets are positively related to managerial performance, measured by Tobin's q ratio, and to the relative size of the asset sold or exchanged. Well-managed (high-q) companies are more likely to sell or exchange assets in a value-maximizing fashion than poorly managed (low-q) companies. Furthermore, asset-seller gains are not related to enhancing corporate focus, but improving corporate focus by exchanging for core assets enhances firm value.
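The CAR figures quoted in such event studies are obtained by summing abnormal (stock-minus-benchmark) returns over the event window. A minimal sketch with invented daily returns; a real event study would subtract a market-model expected return rather than the raw market return:

```python
def cumulative_abnormal_return(stock, market, window):
    """CAR over an event window: sum of (stock return - benchmark return)."""
    lo, hi = window
    return sum(stock[t] - market[t] for t in range(lo, hi + 1))

# hypothetical daily returns over a five-day pre-announcement window
stock  = [0.010, 0.012, 0.020, 0.015, 0.008]
market = [0.008, 0.010, 0.012, 0.011, 0.007]
car = cumulative_abnormal_return(stock, market, (0, 4))
```

A positive CAR over the window is the evidence of value enhancement the abstract reports; statistical significance is then assessed across the full sample of events.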

  3. A parametric visualization software for the assignment problem

    Directory of Open Access Journals (Sweden)

    Papamanthou Charalampos

    2005-01-01

    Full Text Available In this paper we present a parametric visualization software package used to assist the teaching of the Network Primal Simplex Algorithm for the assignment problem (AP). The assignment problem is a special case of the balanced transportation problem. The main functions of the algorithm and the design techniques are also presented. Through this process, we aim to underline the importance and necessity of using such educational methods to improve the teaching of computer algorithms.

  4. Efficiently Inefficient Markets for Assets and Assets Management

    DEFF Research Database (Denmark)

    Garleanu, Nicolae; Heje Pedersen, Lasse

    We consider a model where investors can invest directly or search for an asset manager, information about assets is costly, and managers charge an endogenous fee. The efficiency of asset prices is linked to the efficiency of the asset management market: if investors can find managers more easily......, more money is allocated to active management, fees are lower, and asset prices are more efficient. Informed managers outperform after fees, uninformed managers underperform after fees, and the net performance of the average manager depends on the number of "noise allocators." Finally, we show why large...

  5. Combining automated peak tracking in SAR by NMR with structure-based backbone assignment from 15N-NOESY

    KAUST Repository

    Jang, Richard; Gao, Xin; Li, Ming

    2012-01-01

    Background: Chemical shift mapping is an important technique in NMR-based drug screening for identifying the atoms of a target protein that potentially bind to a drug molecule upon the molecule's introduction in increasing concentrations. The goal is to obtain a mapping of peaks with known residue assignment from the reference spectrum of the unbound protein to peaks with unknown assignment in the target spectrum of the bound protein. Although a series of perturbed spectra help to trace a path from reference peaks to target peaks, a one-to-one mapping generally is not possible, especially for large proteins, due to errors, such as noise peaks, missing peaks, missing but then reappearing, overlapped, and new peaks not associated with any peaks in the reference. Due to these difficulties, the mapping is typically done manually or semi-automatically, which is not efficient for high-throughput drug screening. Results: We present PeakWalker, a novel peak walking algorithm for fast-exchange systems that models the errors explicitly and performs many-to-one mapping. On the proteins hBclXL, UbcH5B, and histone H1, it achieves an average accuracy of over 95% with less than 1.5 residues predicted per target peak. Given these mappings as input, we present PeakAssigner, a novel combined structure-based backbone resonance and NOE assignment algorithm that uses just 15N-NOESY, while avoiding TOCSY experiments and 13C-labeling, to resolve the ambiguities for a one-to-one mapping. On the three proteins, it achieves an average accuracy of 94% or better. Conclusions: Our mathematical programming approach for modeling chemical shift mapping as a graph problem, while modeling the errors directly, is potentially a time- and cost-effective first step for high-throughput drug screening based on limited NMR data and homologous 3D structures. © 2012 Jang et al.; licensee BioMed Central Ltd.

  7. APPECT: An Approximate Backbone-Based Clustering Algorithm for Tags

    DEFF Research Database (Denmark)

    Zong, Yu; Xu, Guandong; Jin, Pin

    2011-01-01

    algorithm for Tags (APPECT). The main steps of APPECT are: (1) we execute the K-means algorithm on a tag similarity matrix for M times and collect a set of tag clustering results Z={C1,C2,…,Cm}; (2) we form the approximate backbone of Z by executing a greedy search; (3) we fix the approximate backbone...... as the initial tag clustering result and then assign the rest tags into the corresponding clusters based on the similarity. Experimental results on three real world datasets namely MedWorm, MovieLens and Dmoz demonstrate the effectiveness and the superiority of the proposed method against the traditional...... Agglomerative Clustering on tagging data, which possess the inherent drawbacks, such as the sensitivity of initialization. In this paper, we instead make use of the approximate backbone of tag clustering results to find out better tag clusters. In particular, we propose an APProximate backbonE-based Clustering...
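Step (2) above, forming the approximate backbone from the M clustering results, amounts to keeping the tag pairs that land in the same cluster in every run. A small stdlib sketch (the tag names and runs are invented, and APPECT's actual greedy search operates on whole clusters rather than exhaustive pairs):

```python
from itertools import combinations

def approximate_backbone(clusterings):
    """Tags that co-cluster in every one of the M runs form stable 'backbone' pairs."""
    tags = sorted({t for run in clusterings for cluster in run for t in cluster})

    def together(a, b, run):
        return any(a in cluster and b in cluster for cluster in run)

    return [(a, b) for a, b in combinations(tags, 2)
            if all(together(a, b, run) for run in clusterings)]

# three hypothetical K-means runs over five tags
runs = [
    [{"nmr", "protein", "assign"}, {"stock", "price"}],
    [{"nmr", "protein"}, {"assign", "stock", "price"}],
    [{"nmr", "protein"}, {"assign"}, {"stock", "price"}],
]
# approximate_backbone(runs) -> [("nmr", "protein"), ("price", "stock")]
```

The backbone pairs then seed the final clustering, and the remaining tags ("assign" here) are attached by similarity, which is what makes the result insensitive to any single K-means initialization.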

  8. Detection of algorithmic trading

    Science.gov (United States)

    Bogoev, Dimitar; Karam, Arzé

    2017-10-01

    We develop a new approach to reflect the behavior of algorithmic traders. Specifically, we provide an analytical and tractable way to infer patterns of quote volatility and price momentum consistent with different types of strategies employed by algorithmic traders, and we propose two ratios to quantify these patterns. Quote volatility ratio is based on the rate of oscillation of the best ask and best bid quotes over an extremely short period of time; whereas price momentum ratio is based on identifying patterns of rapid upward or downward movement in prices. The two ratios are evaluated across several asset classes. We further run a two-stage Artificial Neural Network experiment on the quote volatility ratio; the first stage is used to detect the quote volatility patterns resulting from algorithmic activity, while the second is used to validate the quality of signal detection provided by our measure.
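As a rough illustration of the quote volatility ratio, one can count how often the best bid or ask flips direction between consecutive updates. The paper's exact definition is not given here, so treat this proxy, and the toy quote series, as assumptions:

```python
def quote_volatility_ratio(best_bid, best_ask):
    """Fraction of consecutive quote updates where the bid or ask reverses
    direction -- a proxy for the rapid oscillation associated with
    algorithmic quoting strategies."""
    flips = sum(
        1 for i in range(2, len(best_ask))
        if (best_ask[i] - best_ask[i - 1]) * (best_ask[i - 1] - best_ask[i - 2]) < 0
        or (best_bid[i] - best_bid[i - 1]) * (best_bid[i - 1] - best_bid[i - 2]) < 0
    )
    return flips / (len(best_ask) - 2)

# hypothetical quotes: the ask oscillates every update, the bid is static
bid = [10.00, 10.00, 10.00, 10.00, 10.00]
ask = [10.02, 10.03, 10.02, 10.03, 10.02]
r = quote_volatility_ratio(bid, ask)
```

A ratio near 1 over an extremely short window is the kind of oscillation pattern the abstract associates with algorithmic activity; human quoting produces far lower values.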

  9. Single machine scheduling with slack due dates assignment

    Science.gov (United States)

    Liu, Weiguo; Hu, Xiangpei; Wang, Xuyin

    2017-04-01

    This paper considers a single machine scheduling problem in which each job is assigned an individual due date based on a common flow allowance (i.e. all jobs have slack due dates). The goal is to find a sequence of jobs, together with a due date assignment, that minimizes a non-regular criterion comprising the total weighted absolute lateness and the common flow allowance cost, where the weights are position-dependent. In order to solve this problem, an ? time algorithm is proposed. Some extensions of the problem are also shown.
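Under slack (SLK) due-date assignment, each job's due date is its own processing time plus a common allowance q, so evaluating a candidate sequence is direct. A sketch under assumed forms for the two cost terms (linear allowance cost alpha·q and positional weights w); the paper's exact criterion may differ:

```python
def evaluate(seq, p, q, w, alpha):
    """Cost of a job sequence under slack due dates d_j = p_j + q:
    position-weighted absolute lateness plus an assumed flow-allowance cost."""
    t = 0
    cost = alpha * q                   # assumed linear cost per unit of allowance
    for pos, j in enumerate(seq):
        t += p[j]                      # completion time C_j
        lateness = abs(t - (p[j] + q)) # |C_j - d_j| with d_j = p_j + q
        cost += w[pos] * lateness      # w is position-dependent
    return cost

p = [3, 1, 2]                          # processing times (invented)
w = [1.0, 1.0, 1.0]                    # positional weights (invented)
c = evaluate([1, 2, 0], p, q=3, w=w, alpha=0.5)
```

An optimization algorithm over this objective then searches jointly for the sequence and the allowance q, which is the pairing the abstract describes.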

  10. Rule-Based Analytic Asset Management for Space Exploration Systems (RAMSES), Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — Payload Systems Inc. (PSI) and the Massachusetts Institute of Technology (MIT) were selected to jointly develop the Rule-based Analytic Asset Management for Space...

  11. An evolutionary algorithm technique for intelligence, surveillance, and reconnaissance plan optimization

    Science.gov (United States)

    Langton, John T.; Caroli, Joseph A.; Rosenberg, Brad

    2008-04-01

    To support an Effects Based Approach to Operations (EBAO), Intelligence, Surveillance, and Reconnaissance (ISR) planners must optimize collection plans within an evolving battlespace. A need exists for a decision support tool that allows ISR planners to rapidly generate and rehearse high-performing ISR plans that balance multiple objectives and constraints to address dynamic collection requirements for assessment. To meet this need we have designed an evolutionary algorithm (EA)-based "Integrated ISR Plan Analysis and Rehearsal System" (I2PARS) to support Effects-based Assessment (EBA). I2PARS supports ISR mission planning and dynamic replanning to coordinate assets and optimize their routes, allocation and tasking. It uses an evolutionary algorithm to address the large parametric space of route-finding problems which is sometimes discontinuous in the ISR domain because of conflicting objectives such as minimizing asset utilization yet maximizing ISR coverage. EAs are uniquely suited for generating solutions in dynamic environments and also allow user feedback. They are therefore ideal for "streaming optimization" and dynamic replanning of ISR mission plans. I2PARS uses the Non-dominated Sorting Genetic Algorithm (NSGA-II) to automatically generate a diverse set of high performing collection plans given multiple objectives, constraints, and assets. Intended end users of I2PARS include ISR planners in the Combined Air Operations Centers and Joint Intelligence Centers. Here we show the feasibility of applying the NSGA-II algorithm and EAs in general to the ISR planning domain. Unique genetic representations and operators for optimization within the ISR domain are presented along with multi-objective optimization criteria for ISR planning. Promising results of the I2PARS architecture design, early software prototype, and limited domain testing of the new algorithm are discussed. We also present plans for future research and development, as well as technology
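The NSGA-II machinery referenced above starts from non-dominated sorting. Extracting the first Pareto front of candidate plans, with both objectives minimized (say, asset utilization and uncovered targets; the plan tuples are invented), looks like:

```python
def pareto_front(points):
    """First non-dominated front (minimization in every objective) --
    the core ranking step of NSGA-II."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# hypothetical ISR plans scored as (assets used, targets left uncovered)
plans = [(3, 9), (4, 4), (6, 2), (5, 5), (7, 1)]
front = pareto_front(plans)
# front -> [(3, 9), (4, 4), (6, 2), (7, 1)]
```

NSGA-II repeats this sorting to rank the whole population into successive fronts, then uses crowding distance within each front to keep the set of high-performing plans diverse, which is why the abstract emphasizes generating a diverse set rather than a single optimum.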

  12. A demand assignment control in international business satellite communications network

    Science.gov (United States)

    Nohara, Mitsuo; Takeuchi, Yoshio; Takahata, Fumio; Hirata, Yasuo

    An experimental system is being developed for use in an international business satellite (IBS) communications network based on demand-assignment (DA) and TDMA techniques. This paper discusses its system design, in particular from the viewpoints of a network configuration, a DA control, and a satellite channel-assignment algorithm. A satellite channel configuration is also presented along with a tradeoff study on transmission rate, HPA output power, satellite resource efficiency, service quality, and so on.

  13. Content-Based Multi-Channel Network Coding Algorithm in the Millimeter-Wave Sensor Network

    Directory of Open Access Journals (Sweden)

    Kai Lin

    2016-07-01

    Full Text Available With the development of wireless technology, the widespread use of 5G is already an irreversible trend, and millimeter-wave sensor networks are becoming more and more common. However, due to their high complexity and bandwidth bottlenecks, millimeter-wave sensor networks still face numerous problems. In this paper, we propose a novel content-based multi-channel network coding algorithm, which uses data fusion, multiple channels, and network coding to improve data transmission; the algorithm is referred to as content-based multi-channel network coding (CMNC). The CMNC algorithm provides a fusion-driven model based on Dempster-Shafer (D-S) evidence theory to classify the sensor nodes into different classes according to the data content. Using the result of the classification, the CMNC algorithm also provides the channel assignment strategy and uses network coding to further improve the quality of data transmission in the millimeter-wave sensor network. Extensive simulations are carried out and compared with other methods. Our simulation results show that the proposed CMNC algorithm can effectively improve the quality of data transmission and has better performance than the compared methods.
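The D-S fusion step can be sketched with Dempster's rule of combination over two mass functions. The focal sets below (e.g. "video" vs. "scalar" node classes) and the mass values are invented for illustration; CMNC's actual evidence sources come from the sensed data content:

```python
def ds_combine(m1, m2):
    """Dempster's rule of combination for mass functions whose focal
    elements are frozensets; conflicting mass is renormalized away."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb          # mass assigned to the empty set
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}

A, B = frozenset({"video"}), frozenset({"scalar"})
m1 = {A: 0.6, frozenset(A | B): 0.4}         # evidence from one feature
m2 = {A: 0.5, B: 0.3, frozenset(A | B): 0.2} # evidence from another
m = ds_combine(m1, m2)
```

The fused masses then drive the classification: each node is assigned the class with the highest combined belief before channels are allocated per class.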

  14. Cost approach of health care entity intangible asset valuation.

    Science.gov (United States)

    Reilly, Robert F

    2012-01-01

    In the valuation synthesis and conclusion process, the analyst should consider the following question: Do the selected valuation approach(es) and method(s) accomplish the analyst's assignment? Also, does the selected valuation approach and method actually quantify the desired objective of the intangible asset analysis? The analyst should also consider whether the selected valuation approach and method analyzes the appropriate bundle of legal rights, and whether sufficient empirical data were available to perform the selected valuation approach and method. The valuation synthesis should consider whether sufficient data were available to make the analyst comfortable with the value conclusion, and whether the selected approach and method will be understandable to the intended audience. In the valuation synthesis and conclusion, the analyst should also consider which approaches and methods deserve the greatest consideration with respect to the intangible asset's RUL. The intangible asset RUL is a consideration in each valuation approach. In the income approach, the RUL may affect the projection period for the intangible asset income subject to either yield capitalization or direct capitalization. In the cost approach, the RUL may affect the total amount of obsolescence, if any, from the estimated cost measure (that is, the intangible asset's reproduction cost new or replacement cost new). In the market approach, the RUL may affect the selection, rejection, and/or adjustment of the comparable or guideline intangible asset sale and license transactional data.
The experienced valuation analyst will use professional judgment to weight the various value indications to conclude a final intangible asset value, based on: The analyst's confidence in the quantity and quality of available data; The analyst's level of due diligence performed on that data; The relevance of the valuation method to the intangible asset life cycle stage and

  15. Implementation of ASSET concept in India

    International Nuclear Information System (INIS)

    Koley, J.

    1997-01-01

    The paper presents a retrospective assessment of the use of the ASSET methodology in India since the first ASSET seminar organized by the IAEA in collaboration with the Atomic Energy Regulatory Board, India (AERB) in May 1994. The first ASSET seminar was organized to initiate the spread of the idea among operating and research organizations and regulatory body personnel. The participants were carefully chosen from various fields and with different levels of experience to form teams with a sufficiently wide spectrum of knowledge. AERB took the initiative in leading by example and formed ASSET teams to carry out the first ASSET reviews in India. These teams, at the instance of AERB, carried out ASSET reviews of three Safety Related Events, two at Nuclear Power Plants and one at a Research Reactor. This paper describes the outcome of these ASSET studies and the subsequent implementation of the recommendations. The initiative taken by the regulatory body has led to the formation of ASSET teams by the utilities to carry out ASSET studies on their own. The results of these studies are yet to be assessed by the regulatory body. The ASSET experience reveals that it has further potential for improving safety performance and safety culture and for bringing fresh enthusiasm to the safety professionals of Indian nuclear utilities.

  17. Asset management using an extended Markowitz theorem

    Directory of Open Access Journals (Sweden)

    Paria Karimi

    2014-06-01

    Full Text Available The Markowitz theorem is one of the most popular techniques for asset management and has been applied successfully in many settings. In this paper, we present a multi-objective Markowitz model to determine asset allocation under cardinality constraints. The resulting model is NP-hard, so the proposed study uses two metaheuristics, namely the genetic algorithm (GA) and particle swarm optimization (PSO), to find efficient solutions. The study is applied to data collected from the Tehran Stock Exchange over the period 2009-2011 and considers four objectives: cash return, 12-month return, 36-month return, and Lower Partial Moment (LPM). The results indicate that there was no statistically significant difference between the PSO and GA implementations.
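A cardinality-constrained Markowitz objective can be prototyped with a simple random search standing in for the paper's GA/PSO. The expected returns, covariance matrix, and risk-aversion parameter lam below are invented for illustration:

```python
import random

def fitness(w, mu, cov, lam=0.5):
    """Mean-variance objective: expected return minus lam * portfolio variance."""
    n = len(w)
    ret = sum(w[i] * mu[i] for i in range(n))
    var = sum(w[i] * cov[i][j] * w[j] for i in range(n) for j in range(n))
    return ret - lam * var

def random_search(mu, cov, K, iters=2000, seed=7):
    """Stand-in for GA/PSO: sample portfolios holding exactly K assets."""
    rng = random.Random(seed)
    n, best, best_f = len(mu), None, float("-inf")
    for _ in range(iters):
        picks = rng.sample(range(n), K)          # cardinality enforced directly
        raw = [rng.random() + 1e-9 for _ in picks]
        total = sum(raw)
        w = [0.0] * n
        for i, r in zip(picks, raw):
            w[i] = r / total                     # weights sum to one
        f = fitness(w, mu, cov)
        if f > best_f:
            best, best_f = w, f
    return best

mu = [0.10, 0.12, 0.07, 0.15]                    # invented expected returns
cov = [[0.05, 0.01, 0.00, 0.02],
       [0.01, 0.06, 0.01, 0.02],
       [0.00, 0.01, 0.03, 0.01],
       [0.02, 0.02, 0.01, 0.08]]
w = random_search(mu, cov, K=2)
```

GA and PSO replace the blind sampling loop with population-based moves over the same fitness landscape; the cardinality constraint is what makes the problem NP-hard and motivates metaheuristics in the first place.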

  18. Case Based Asset Maintenance for the Electric Equipment

    International Nuclear Information System (INIS)

    Kim, Ji-Hyeon; Jung, Jae-Cheon; Chang, Young-Woo; Chang, Hoon-Seon; Kim, Jae-Cheol; Kim, Hang-Bae; Kim, Kyu-Ho; Hur, Yong; Lee, Dong-Chul

    2006-01-01

    Electric equipment maintenance strategies are changing from PM (Preventive Maintenance) or CM (Corrective Maintenance) to CBM (Condition-Based Maintenance). The main benefits of CBM are a reduced possibility of service failures of critical equipment and reduced maintenance costs. In CBM, the equipment status needs to be monitored continuously, and a decision must be made on whether a piece of equipment needs to be repaired or replaced. For this maintenance decision making, a CBR (Case-Based Reasoning) system is introduced. The CBR system receives the current equipment status and searches the case-based historic database to determine any possible equipment failure under current conditions. In retrieving the case-based historic data, the suggested DSS (Decision Support System) uses a reasoning engine with an equipment/asset ontology that describes the equipment subsumption relationships

  19. DNABIT Compress – Genome compression algorithm

    OpenAIRE

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-01

    Data compression is concerned with how information is organized in data. Efficient storage means removing redundancy from the data stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel algorithm of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our ...
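The basic bit-assignment idea behind such DNA compressors is that a four-letter alphabet needs only 2 bits per base, i.e. 4 bases per byte. The sketch below shows that baseline encoding only; DNABIT Compress's actual scheme assigns variable bit patterns to longer segments and handles repeats, which this toy round-trip does not:

```python
CODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}

def pack(seq):
    """Pack a DNA string at 2 bits per base (4 bases per byte)."""
    out = bytearray()
    for i in range(0, len(seq), 4):
        group = seq[i:i + 4]
        byte = 0
        for base in group:
            byte = (byte << 2) | CODE[base]
        out.append(byte << 2 * (4 - len(group)))   # left-pad the final byte
    return bytes(out)

def unpack(data, n):
    """Recover the first n bases from the packed bytes."""
    bases = "ACGT"
    seq = []
    for byte in data:
        for shift in (6, 4, 2, 0):
            seq.append(bases[(byte >> shift) & 0b11])
    return "".join(seq[:n])

s = "GATTACA"
assert unpack(pack(s), len(s)) == s   # round-trip: 7 bases fit in 2 bytes
```

Even this naive packing achieves 4:1 over one-byte-per-character storage; segment-level bit assignment is what lets DNA-specific compressors do better on repetitive sequences.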

  20. Research on Single Base-Station Distance Estimation Algorithm in Quasi-GPS Ultrasonic Location System

    Energy Technology Data Exchange (ETDEWEB)

    Cheng, X C; Su, S J; Wang, Y K; Du, J B [Instrument Department, College of Mechatronics Engineering and Automation, National University of Defense Technology, ChangSha, Hunan, 410073 (China)

    2006-10-15

    In order to identify each base-station in the quasi-GPS ultrasonic location system, a unique pseudo-random code is assigned to each base-station. This article primarily studies the distance estimation problem between an Autonomous Guided Vehicle (AGV) and a single base-station, and then establishes the ultrasonic spread-spectrum distance measurement Time Delay Estimation (TDE) model. Based on the above model, the envelope correlation fast TDE algorithm based on the FFT is presented and analyzed. Experiments show that when the m-sequence used in the received signal is the same as that of the reference signal, there will be a sharp correlation value in their envelope correlation function after they are processed by the above algorithm; otherwise, there will be no prominent correlation value. Thus, the AGV can identify each base-station easily.
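The identification step, correlating the received signal against a base-station's PN code and looking for a sharp peak at the delay, can be sketched directly in the time domain. The paper computes the envelope correlation via the FFT for speed; the direct O(n²) correlation, toy code, and noiseless signal below are simplifying assumptions:

```python
def cross_correlation_delay(ref, rx):
    """Estimate the time delay as the lag maximizing the correlation of the
    received signal with the reference PN code (direct-form stand-in for
    the FFT-based envelope correlation)."""
    n = len(ref)
    best_lag, best_val = 0, float("-inf")
    for lag in range(len(rx) - n + 1):
        val = sum(ref[i] * rx[lag + i] for i in range(n))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

pn = [1, -1, 1, 1, -1, -1, 1]         # toy pseudo-random code
rx = [0, 0, 0] + pn + [0, 0]          # code arrives after 3 samples
delay = cross_correlation_delay(pn, rx)
# delay -> 3
```

A mismatched code produces no prominent peak, which is exactly how the AGV distinguishes base-stations; the delay in samples, times the speed of sound over the sample rate, gives the distance.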

  2. Initial cash/asset ratio and asset prices: an experimental study.

    Science.gov (United States)

    Caginalp, G; Porter, D; Smith, V

    1998-01-20

    A series of experiments, in which nine participants trade an asset over 15 periods, test the hypothesis that an initial imbalance of asset/cash will influence the trading price over an extended time. Participants know at the outset that the asset or "stock" pays a single dividend with fixed expectation value at the end of the 15th period. In experiments with a greater total value of cash at the start, the mean prices during the trading periods are higher, compared with those with greater amount of asset, with a high degree of statistical significance. The difference is most significant at the outset and gradually tapers near the end of the experiment. The results are very surprising from a rational expectations and classical game theory perspective, because the possession of a large amount of cash does not lead to a simple motivation for a trader to bid excessively on a financial instrument. The gradual erosion of the difference toward the end of trading, however, suggests that fundamental value is approached belatedly, offering some consolation to the rational expectations theory. It also suggests that there is a time scale on which an evolution toward fundamental value occurs. The experimental results are qualitatively compatible with the price dynamics predicted by a system of differential equations based on asset flow. The results have broad implications for the marketing of securities, particularly initial and secondary public offerings, government bonds, etc., where excess supply has been conjectured to suppress prices.

  3. Joint Channel Assignment and Routing in Multiradio Multichannel Wireless Mesh Networks: Design Considerations and Approaches

    Directory of Open Access Journals (Sweden)

    Omar M. Zakaria

    2016-01-01

    Multiradio wireless mesh network is a promising architecture that improves the network capacity by exploiting multiple radio channels concurrently. Channel assignment and routing are underlying challenges in multiradio architectures, since both determine the traffic distribution over links and channels. The interdependency between channel assignment and routing motivates joint solutions for efficient configurations. This paper presents an in-depth review of joint approaches to channel assignment and routing in multiradio wireless mesh networks. First, the key design issues, modeling, and approaches are identified and discussed. Second, existing algorithms for joint channel assignment and routing are presented and classified based on the channel assignment types. Furthermore, the set of reconfiguration algorithms to adapt to network traffic dynamics is also discussed. Finally, the paper presents some multiradio practical implementations and test-beds and points out future research directions.

  4. Automated solid-state NMR resonance assignment of protein microcrystals and amyloids

    Energy Technology Data Exchange (ETDEWEB)

    Schmidt, Elena [Goethe University Frankfurt am Main, Center for Biomolecular Magnetic Resonance, Institute of Biophysical Chemistry (Germany); Gath, Julia [ETH Zurich, Physical Chemistry (Switzerland); Habenstein, Birgit [UMR 5086 CNRS/Universite de Lyon 1, Institut de Biologie et Chimie des Proteines (France); Ravotti, Francesco; Szekely, Kathrin; Huber, Matthias [ETH Zurich, Physical Chemistry (Switzerland); Buchner, Lena [Goethe University Frankfurt am Main, Center for Biomolecular Magnetic Resonance, Institute of Biophysical Chemistry (Germany); Boeckmann, Anja, E-mail: a.bockmann@ibcp.fr [UMR 5086 CNRS/Universite de Lyon 1, Institut de Biologie et Chimie des Proteines (France); Meier, Beat H., E-mail: beme@ethz.ch [ETH Zurich, Physical Chemistry (Switzerland); Guentert, Peter, E-mail: guentert@em.uni-frankfurt.de [Goethe University Frankfurt am Main, Center for Biomolecular Magnetic Resonance, Institute of Biophysical Chemistry (Germany)

    2013-07-15

    Solid-state NMR is an emerging structure determination technique for crystalline and non-crystalline protein assemblies, e.g., amyloids. Resonance assignment constitutes the first and often very time-consuming step to a structure. We present ssFLYA, a generally applicable algorithm for automatic assignment of protein solid-state NMR spectra. Application to microcrystals of ubiquitin and the Ure2 prion C-terminal domain, as well as amyloids of HET-s(218-289) and α-synuclein yielded 88-97 % correctness for the backbone and side-chain assignments that are classified as self-consistent by the algorithm, and 77-90 % correctness if also assignments classified as tentative by the algorithm are included.

  5. Efficiently Inefficient Markets for Assets and Asset Management

    DEFF Research Database (Denmark)

    Garleanu, Nicolae; Pedersen, Lasse Heje

    We consider a model where investors can invest directly or search for an asset manager, information about assets is costly, and managers charge an endogenous fee. The efficiency of asset prices is linked to the efficiency of the asset management market: if investors can find managers more easily, more money is allocated to active management, fees are lower, and asset prices are more efficient. Informed managers outperform after fees, uninformed managers underperform after fees, and the net performance of the average manager depends on the number of "noise allocators." Small investors should be passive, but large and sophisticated investors benefit from searching for informed active managers since their search cost is low relative to capital. Hence, managers with larger and more sophisticated investors are expected to outperform.

  6. Investments Portfolio Optimal Planning for industrial assets management: Method and Tool

    International Nuclear Information System (INIS)

    Lonchampt, Jerome; Fessart, Karine

    2012-01-01

    The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may be preventive maintenance tasks, asset enhancements, or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1. the measure of the profitability of a portfolio of investments; 2. the selection and planning of an optimal set of investments; 3. the measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP (registered) tool is synthesized in the Net Present Value (NPV) indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages...) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes an investment portfolio as input and returns its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately, there are two sources of dependency. The first one is introduced by the spare part model: although components are independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical, or logistic constraints, such as a global maintenance budget limit or a precedence constraint between two investments, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm.
After a description of the features of the software a
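As an illustration of the genetic-algorithm step, here is a minimal sketch with made-up costs and NPV gains and a single global budget constraint; the real NPV function in the paper comes from the reliability and spare-parts models, not this toy surrogate:

```python
import random

# Hypothetical stand-ins: each candidate investment has a cost and an NPV gain.
random.seed(1)
N = 12
cost = [random.uniform(1, 5) for _ in range(N)]
gain = [random.uniform(0, 8) for _ in range(N)]
BUDGET = 18.0

def npv(portfolio):
    """Toy NPV function: sum of gains, with infeasible portfolios (over budget)
    heavily penalized so the GA discards them."""
    c = sum(cost[i] for i in range(N) if portfolio[i])
    if c > BUDGET:
        return -1e9
    return sum(gain[i] for i in range(N) if portfolio[i])

def evolve(pop_size=40, generations=60, p_mut=0.05):
    """Simple generational GA over bit-string portfolios."""
    pop = [[random.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=npv, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, N)          # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = survivors + children
    return max(pop, key=npv)

best = evolve()
print(npv(best))
```

The penalty-based handling of the budget constraint is one common choice; repair operators are another.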

  7. Developing Subdomain Allocation Algorithms Based on Spatial and Communicational Constraints to Accelerate Dust Storm Simulation

    Science.gov (United States)

    Gui, Zhipeng; Yu, Manzhu; Yang, Chaowei; Jiang, Yunfeng; Chen, Songqing; Xia, Jizhe; Huang, Qunying; Liu, Kai; Li, Zhenlong; Hassan, Mohammed Anowarul; Jin, Baoxuan

    2016-01-01

    Dust storms have serious disastrous impacts on the environment, human health, and assets. The development and application of dust storm models have contributed significantly to better understanding and predicting the distribution, intensity, and structure of dust storms. However, dust storm simulation is a data- and computing-intensive process. To improve computing performance, high performance computing has been widely adopted by dividing the entire study area into multiple subdomains and allocating each subdomain to different computing nodes in a parallel fashion. Inappropriate allocation may introduce imbalanced task loads and unnecessary communications among computing nodes. Therefore, allocation is a key factor that may impact the efficiency of the parallel process. An allocation algorithm is expected to consider the computing cost and communication cost for each computing node to minimize total execution time and reduce overall communication cost for the entire simulation. This research introduces three algorithms to optimize the allocation by considering the spatial and communicational constraints: 1) an Integer Linear Programming (ILP) based algorithm from a combinatorial optimization perspective; 2) a K-Means and Kernighan-Lin combined heuristic algorithm (K&K) integrating geometric and coordinate-free methods by merging local and global partitioning; 3) an automatic seeded region growing based geometric and local partitioning algorithm (ASRG). The performance and effectiveness of the three algorithms are compared based on different factors. Further, we adopt the K&K algorithm as the demonstration algorithm for the experiment of dust model simulation with the non-hydrostatic mesoscale model (NMM-dust) and compare the performance with the MPI default sequential allocation. The results demonstrate that the K&K method significantly improves the simulation performance with better subdomain allocation.
This method can also be adopted for other relevant atmospheric and numerical
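The K-Means half of the K&K approach, i.e. geometric partitioning of grid cells into subdomains, can be sketched as follows (a plain NumPy illustration, not the authors' code; the Kernighan-Lin refinement and communication-cost terms are omitted):

```python
import numpy as np

def kmeans_partition(points, k, iters=50, seed=0):
    """Geometric partitioning of grid cells into k subdomains via K-means."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        # assign each cell to its nearest subdomain center
        d = np.linalg.norm(points[:, None, :] - centers[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        # move each center to the centroid of its cells
        for j in range(k):
            if np.any(labels == j):
                centers[j] = points[labels == j].mean(axis=0)
    return labels

# a 20x20 simulation grid split among 4 computing nodes
xs, ys = np.meshgrid(np.arange(20), np.arange(20))
cells = np.stack([xs.ravel(), ys.ravel()], axis=1).astype(float)
labels = kmeans_partition(cells, 4)
print(np.bincount(labels))  # cells per subdomain (the load balance)
```

Compact, roughly equal-sized clusters keep both task loads and inter-node boundary communication low, which is the motivation for the geometric stage.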

  8. Community Asset Mapping. Trends and Issues Alert.

    Science.gov (United States)

    Kerka, Sandra

    Asset mapping involves documenting tangible and intangible resources of a community viewed as a place with assets to be preserved and enhanced, not deficits to be remedied. Kretzmann and McKnight (1993) are credited with developing the concept of asset-based community development (ABCD) that draws on appreciative inquiry; recognition of social…

  9. Towards the Automatic Detection of Efficient Computing Assets in a Heterogeneous Cloud Environment

    OpenAIRE

    Iglesias, Jesus Omana; Stokes, Nicola; Ventresque, Anthony; Murphy, Liam, B.E.; Thorburn, James

    2013-01-01

    In a heterogeneous cloud environment, the manual grading of computing assets is the first step in the process of configuring IT infrastructures to ensure optimal utilization of resources. Grading the efficiency of computing assets is, however, a difficult, subjective, and time-consuming manual task. Thus, an automatic efficiency grading algorithm is highly desirable. In this paper, we compare the effectiveness of the different criteria used in the manual gr...

  10. Asset management: the big picture.

    Science.gov (United States)

    Deinstadt, Deborah C

    2005-10-01

    To develop a comprehensive asset management plan, you need, first of all, to understand the asset management continuum. A key preliminary step is to thoroughly assess the existing equipment base. A critical objective is to ensure that there are open lines of communication among the teams charged with managing the plan's various phases.

  11. System and Method for Monitoring Distributed Asset Data

    Science.gov (United States)

    Gorinevsky, Dimitry (Inventor)

    2015-01-01

    A computer-based monitoring system and monitoring method implemented in computer software for detecting, estimating, and reporting the condition states, their changes, and anomalies for many assets. The assets are of the same type, are operated over a period of time, and are outfitted with data collection systems. The proposed monitoring method accounts for the variability of working conditions for each asset by using a regression model that characterizes asset performance. The assets are of the same type but not identical. The proposed monitoring method accounts for asset-to-asset variability; it also accounts for drifts and trends in the asset condition and data. The proposed monitoring system can perform distributed processing of massive amounts of historical data without discarding any useful information, in cases where moving all the asset data into one central computing system might be infeasible. The overall processing includes distributed preprocessing of data records from each asset to produce compressed data.
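The regression-baseline idea, fitting a model of performance versus working conditions and flagging records with outsized residuals, might look like this minimal sketch (simulated data; ordinary least squares is an assumed model form, not necessarily the patented method's):

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_baseline(X, y):
    """Least-squares regression model of asset performance vs working conditions."""
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def anomaly_score(coef, X, y):
    """Standardized absolute residuals: large values flag anomalous records."""
    A = np.column_stack([X, np.ones(len(X))])
    resid = y - A @ coef
    return np.abs(resid) / resid.std()

# simulated fleet data: performance depends linearly on a working condition
X = rng.uniform(0, 10, size=(200, 1))
y = 2.0 * X[:, 0] + 5.0 + 0.1 * rng.standard_normal(200)
coef = fit_baseline(X, y)

y_new = y.copy()
y_new[42] += 3.0                      # inject a condition-state anomaly
scores = anomaly_score(coef, X, y_new)
print(int(scores.argmax()))           # → 42, the anomalous record
```

Conditioning on working conditions is what separates genuine condition changes from ordinary operating variability.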

  12. Static and dynamic factors in an information-based multi-asset artificial stock market

    Science.gov (United States)

    Ponta, Linda; Pastore, Stefano; Cincotti, Silvano

    2018-02-01

    An information-based multi-asset artificial stock market characterized by different types of stocks and populated by heterogeneous agents is presented. In the market, agents trade risky assets in exchange for cash. Besides the amount of cash and of stocks owned, each agent is characterized by sentiments, and agents share their sentiments by means of interactions that are determined by sparsely connected networks. A central market maker (clearing house mechanism) determines the price processes for each stock at the intersection of the demand and the supply curves. Single stock price processes exhibit volatility clustering and fat-tailed distribution of returns, whereas the multivariate price process exhibits both static and dynamic stylized facts, i.e., the presence of static factors and common trends. Static factors are studied with reference to the cross-correlation of returns of different stocks. The common trends are investigated considering the variance-covariance matrix of prices. Results point out that the probability distribution of eigenvalues of the cross-correlation matrix of returns shows the presence of sectors, similar to those observed in real empirical data. As regards the dynamic factors, the variance-covariance matrix of prices points to a limited number of asset price series that are independent integrated processes, in close agreement with the empirical evidence from asset price time series of real stock markets. These results highlight the crucial dependence of the statistical properties of a multi-asset stock market on the agents' interaction structure.

  13. Whether and How to Select Inertia and Acceleration of Discrete Particle Swarm Optimization Algorithm: A Study on Channel Assignment

    Directory of Open Access Journals (Sweden)

    Min Jin

    2014-01-01

    There is recently a great deal of interest and excitement in understanding the role of inertia and acceleration in the motion equation of discrete particle swarm optimization (DPSO) algorithms. It still remains unknown whether the inertia section should be abandoned and how to select the appropriate acceleration in order for DPSO to show the best convergence performance. Adopting channel assignment as a case study, this paper systematically conducts experimental filtering research on this issue. Compared with other channel assignment schemes, the proposed scheme and the selection of inertia and acceleration are verified to be advantageous for channel assignment in three respects: convergence rate, convergence speed, and independence of the quality of the initial solution. Furthermore, the experimental results imply that DPSO might have the best convergence performance when its motion equation includes an inertia section with a lower-medium weight, a bigger acceleration coefficient for the global-search optimum, and a smaller acceleration coefficient for the individual-search optimum.
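A minimal discrete PSO of the kind discussed, with an inertia section of modest weight and a larger acceleration coefficient on the global-best term, could be sketched as follows (a toy one-max objective rather than channel assignment; the coefficient values are illustrative, not the paper's tuned ones):

```python
import math
import random

random.seed(7)
DIM, SWARM, ITERS = 20, 15, 80
W, C1, C2 = 0.4, 1.0, 2.0   # modest inertia; bigger acceleration on the global term

def fitness(x):              # toy objective: maximize the number of ones
    return sum(x)

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

pos = [[random.randint(0, 1) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = max(pos, key=fitness)[:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            # motion equation: inertia + cognitive + social sections
            vel[i][d] = (W * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            # binary position update via sigmoid of the velocity
            pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
        if fitness(pos[i]) > fitness(pbest[i]):
            pbest[i] = pos[i][:]
            if fitness(pos[i]) > fitness(gbest):
                gbest = pos[i][:]

print(fitness(gbest))
```

Dropping the `W * vel[i][d]` term is exactly the "abandon the inertia section" variant the paper evaluates experimentally.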

  14. THE PROBLEMS OF FIXED ASSETS CLASSIFICATION FOR ACCOUNTING

    Directory of Open Access Journals (Sweden)

    Sophiia Kafka

    2016-06-01

    This article provides a critical analysis of research in the accounting of fixed assets; the basic issues of fixed assets accounting developed by Ukrainian scientists during 1999-2016 have been determined. It is established that the problems of non-current assets taxation and their classification are the most noteworthy. In the dissertations, the issues of fixed assets classification are of an exclusively branch-specific nature, so its improvement is important. The purpose of the article is to develop a science-based classification of fixed assets for accounting purposes, since their composition is quite diverse. The classification of fixed assets for accounting purposes has been summarized and developed in Figure 1 according to the results of the research. The analysis of existing approaches to the classification of fixed assets has made it possible to specify its basic types and justify the classification criteria for the main objects of fixed assets. Key words: non-current assets, fixed assets, accounting, valuation, classification of fixed assets. JEL: G, M41

  15. Automated solid-state NMR resonance assignment of protein microcrystals and amyloids

    International Nuclear Information System (INIS)

    Schmidt, Elena; Gath, Julia; Habenstein, Birgit; Ravotti, Francesco; Székely, Kathrin; Huber, Matthias; Buchner, Lena; Böckmann, Anja; Meier, Beat H.; Güntert, Peter

    2013-01-01

    Solid-state NMR is an emerging structure determination technique for crystalline and non-crystalline protein assemblies, e.g., amyloids. Resonance assignment constitutes the first and often very time-consuming step to a structure. We present ssFLYA, a generally applicable algorithm for automatic assignment of protein solid-state NMR spectra. Application to microcrystals of ubiquitin and the Ure2 prion C-terminal domain, as well as amyloids of HET-s(218–289) and α-synuclein yielded 88–97 % correctness for the backbone and side-chain assignments that are classified as self-consistent by the algorithm, and 77–90 % correctness if also assignments classified as tentative by the algorithm are included

  16. A Novel Memetic Algorithm Based on Decomposition for Multiobjective Flexible Job Shop Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Chun Wang

    2017-01-01

    A novel multiobjective memetic algorithm based on decomposition (MOMAD) is proposed to solve the multiobjective flexible job shop scheduling problem (MOFJSP), which simultaneously minimizes makespan, total workload, and critical workload. Firstly, a population is initialized by employing an integration of different machine assignment and operation sequencing strategies. Secondly, the multiobjective memetic algorithm based on decomposition is presented by introducing a local search to MOEA/D. The Tchebycheff approach of MOEA/D converts the three-objective optimization problem to several single-objective optimization subproblems, and the weight vectors are grouped by K-means clustering. Some good individuals corresponding to different weight vectors are selected by the tournament mechanism of a local search. In the experiments, the influence of three different aggregation functions is first studied. Moreover, the effect of the proposed local search is investigated. Finally, MOMAD is compared with eight state-of-the-art algorithms on a series of well-known benchmark instances, and the experimental results show that the proposed algorithm outperforms or at least has comparable performance to the other algorithms.
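The Tchebycheff decomposition step mentioned above, which turns the three objectives into scalar subproblems, can be illustrated directly (the objective vectors and weights below are made up for illustration):

```python
def tchebycheff(f, weights, ideal):
    """Tchebycheff aggregation: scalarizes a multiobjective vector f into one
    subproblem value relative to the ideal point; smaller is better."""
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, ideal))

# hypothetical objective vectors for two candidate schedules:
# (makespan, total workload, critical workload)
ideal = (40.0, 200.0, 60.0)            # best value seen so far per objective
sched_a = (45.0, 230.0, 70.0)
sched_b = (60.0, 205.0, 62.0)

w = (0.5, 0.2, 0.3)                    # one weight vector = one subproblem
print(tchebycheff(sched_a, w, ideal))  # → 6.0
print(tchebycheff(sched_b, w, ideal))  # → 10.0
```

Under this particular weight vector, schedule A is preferred; a different weight vector defines a different subproblem and may prefer B, which is how MOEA/D covers the Pareto front.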

  17. Regionalisation of asset values for risk analyses

    Directory of Open Access Journals (Sweden)

    A. H. Thieken

    2006-01-01

    In risk analysis there is a spatial mismatch between hazard data, which are commonly modelled on an explicit raster level, and exposure data, which are often only available for aggregated units, e.g. communities. Dasymetric mapping techniques that use ancillary information to disaggregate data within a spatial unit help to bridge this gap. This paper presents dasymetric maps showing the population density and a unit value of residential assets for the whole of Germany. A dasymetric mapping approach, which uses land cover data (CORINE Land Cover) as the ancillary variable, was adapted and applied to regionalize aggregated census data that are provided for all communities in Germany. The results were validated by two approaches. First, it was ascertained whether population data disaggregated at the community level can be used to estimate population in postcodes. Secondly, disaggregated population and asset data were used for a loss evaluation of two flood events that occurred in 1999 and 2002, respectively. It must be concluded that the algorithm tends to underestimate the population in urban areas and to overestimate population in other land cover classes. Nevertheless, flood loss evaluations demonstrate that the approach is capable of providing realistic estimates of the number of exposed people and assets. Thus, the maps are sufficient for applications in large-scale risk assessments such as the estimation of population and assets exposed to natural and man-made hazards.
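The core dasymetric step, splitting an aggregated census total across raster cells according to land-cover weights, can be sketched as follows (the weight values are invented for illustration and are not those calibrated in the paper):

```python
def dasymetric_disaggregate(total, cells, weight_by_class):
    """Split an aggregated census total over raster cells in proportion to
    land-cover-dependent weights (the ancillary variable)."""
    scores = [weight_by_class[c] for c in cells]
    s = sum(scores)
    return [total * sc / s for sc in scores]

# assumed weights: urban cells receive far more population than fields/forest
weights = {"urban": 16.0, "suburban": 4.0, "agricultural": 2.0, "forest": 1.0}
cells = ["urban", "urban", "suburban", "agricultural", "forest"]

pop = dasymetric_disaggregate(3900, cells, weights)
print(pop)  # → [1600.0, 1600.0, 400.0, 200.0, 100.0]
```

Because the weights only redistribute the community total, the cell values always sum back to the official aggregate, which is what makes the maps usable in large-scale risk assessment.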

  18. THEORETICAL ASPECTS REGARDING THE VALUATION OF INTANGIBLE ASSETS

    Directory of Open Access Journals (Sweden)

    HOLT GHEORGHE

    2015-03-01

    Valuation of intangible assets represents one of the most delicate problems in assessing a company. Usually, valuation of intangible assets takes place within the process of evaluating the enterprise as a whole. Therefore, intangible asset valuers must have detailed knowledge of business valuation, in particular the income-based valuation methods (capitalization/discounting of net cash flow). Valuation of intangible assets is the objective of the International Valuation Standards GN 4 Valuation of Intangible Assets (revised 2010). Alongside it, GN 16 Valuation of Intangible Assets for IFRS Reporting was recently proposed. International Accounting Standard (IAS) 38 Intangible Assets prescribes the accounting treatment for intangible assets, analyzes the criteria that an intangible asset must meet to be recognized, specifies the carrying amount of intangible assets, and sets out requirements for the disclosure of intangible assets. From an accounting perspective, the relevant professional accounting standards also include IFRS 3 Business Combinations, IAS 36 Impairment of Assets, and SFAS 157 Fair Value Measurement, developed by the FASB. The provisions of IAS 38 are closely aligned with those contained in GN 4. Therefore, a good intangible asset valuation professional must know thoroughly the conditions, principles, criteria, and assessment methods recognized by those standards.

  19. Single-Sex Schools, Student Achievement, and Course Selection: Evidence from Rule-Based Student Assignments in Trinidad and Tobago

    OpenAIRE

    C. Kirabo Jackson

    2011-01-01

    Existing studies on single-sex schooling suffer from biases because students who attend single-sex schools differ in unmeasured ways from those who do not. In Trinidad and Tobago students are assigned to secondary schools based on an algorithm allowing one to address self-selection bias and estimate the causal effect of attending a single-sex school versus a similar coeducational school. While students (particularly females) with strong expressed preferences for single-sex schools benefit, mo...

  20. One of My Favorite Assignments: Automated Teller Machine Simulation.

    Science.gov (United States)

    Oberman, Paul S.

    2001-01-01

    Describes an assignment for an introductory computer science class that requires the student to write a software program that simulates an automated teller machine. Highlights include an algorithm for the assignment; sample file contents; language features used; assignment variations; and discussion points. (LRW)

  1. Web-Based Problem-Solving Assignment and Grading System

    Science.gov (United States)

    Brereton, Giles; Rosenberg, Ronald

    2014-11-01

    In engineering courses with very specific learning objectives, such as fluid mechanics and thermodynamics, it is conventional to reinforce concepts and principles with problem-solving assignments and to measure success in problem solving as an indicator of student achievement. While the modern-day ease of copying and searching for online solutions can undermine the value of traditional assignments, web-based technologies also provide opportunities to generate individualized well-posed problems with an infinite number of different combinations of initial/final/boundary conditions, so that the probability of any two students being assigned identical problems in a course is vanishingly small. Such problems can be designed and programmed to be: single or multiple-step, self-grading, allow students single or multiple attempts; provide feedback when incorrect; selectable according to difficulty; incorporated within gaming packages; etc. In this talk, we discuss the use of a homework/exam generating program of this kind in a single-semester course, within a web-based client-server system that ensures secure operation.
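A problem generator of the kind described, seeded per student so that no two assignments coincide, with self-grading by regeneration, might be sketched like this (the problem template and grading tolerance are illustrative assumptions, not the course's actual system):

```python
import random

def make_problem(student_id):
    """Generate an individualized, well-posed problem from a student-ID seed,
    so the probability of two students getting identical problems is tiny."""
    rng = random.Random(student_id)            # reproducible per student
    m = rng.choice([1.0, 1.5, 2.0, 2.5])       # mass, kg
    v = rng.randrange(4, 21)                   # speed, m/s
    text = f"A {m} kg cart moves at {v} m/s. Find its kinetic energy in J."
    answer = 0.5 * m * v ** 2
    return text, answer

def grade(student_id, submitted, tol=0.01):
    """Self-grading: regenerate the answer from the same seed and compare
    within a relative tolerance."""
    _, answer = make_problem(student_id)
    return abs(submitted - answer) <= tol * abs(answer)

text, ans = make_problem("A123")
print(grade("A123", ans))        # → True
```

Because the answer is regenerated from the seed rather than stored, the server never needs a solutions database, and multiple attempts with feedback are straightforward to layer on top.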

  2. Implementation of Asset Management System Based on Wireless Sensor Technology

    Directory of Open Access Journals (Sweden)

    Nan WANG

    2014-02-01

    RFID technology is regarded as one of the top ten key technologies of the 21st century and has extensive application prospects in various fields, including asset management and public safety. Through analyzing the current problems existing in asset management, this paper proposes applying RFID technology to device management to effectively improve the level of automation and informatization of device management, and designs a scheme for an equipment monitoring system based on 433 MHz RFID electronic tags and readers. The hardware part of the monitoring system consists of the RFID sensor terminals attached to the devices and the readers distributed at each monitoring site. The reader uploads the information collected by the tags to the backend server and the management system, allowing managers and decision makers to understand the usage rate and location of the experimental instruments and providing them with a scientific basis for decision making, which effectively addresses the currently backward state of device management.

  3. THEORETICAL ASPECTS REGARDING THE VALUATION OF INTANGIBLE ASSETS

    OpenAIRE

    HOLT GHEORGHE

    2015-01-01

    Valuation of intangible assets represents one of the most delicate problems of assessing a company. Usually, valuation of intangible assets is in the process of evaluating enterprise as a whole. Therefore, Intangible Asset Valuers must have detailed knowledge on business valuation, in particular, the income-based valuation methods (capitalization / updating net cash flow). Valuation of Intangible Assets is the objective of the International Valuation Standards (GN) 4 Valuation of Intangible A...

  4. SPECIFIC OF ACCOUNTING OF NON-FINANCIAL ASSETS IN HEALTH INSTITUTIONS

    Directory of Open Access Journals (Sweden)

    Natalya Pryadka

    2016-11-01

    The purpose of the paper is to analyze the current state of accounting of non-financial assets in health care institutions during the period of medical reforms. The accounting of non-financial assets has its specifics in medical institutions. The reforms implemented in Ukraine affect the accounting of non-financial assets. Medical institutions belong to the general government sector. Methodology. The survey is based on a comparison of data from national and international reforms in the medical industry. Results of the survey showed that the dictionary of V. Raizberg gives the following definition: "the public sector is a set of business units that carry out economic activity, are in public ownership, and are controlled by public authorities or by designated and hired persons" (Raizberg, 1999). The Commercial Code of Ukraine indicates: "The subjects of the public sector of the economy are subjects that operate on the basis of public ownership only, as well as subjects in which the state share in the authorized capital exceeds fifty percent or constitutes a value that gives the state decisive influence on the economic activity of these subjects" (СС, 2003). Practical implications. The structure of the public sector is specified by the International Public Sector Accounting Standards. The meanings of the concept "public sector" in national and international practice are consistent (Poznyakovska, 2009). Accounting in health institutions has specific features inherent to the government sector, determined by the types of activity and the terms of assignation (Pasichnik, 2005). Medical services must be accessible to all strata of the population. They must be provided free of charge, as a public good independent of an individual's ability to pay. Therefore, the lion's share of total health care expenditures is used to provide medical services to the population. At the same time, "upgrading of medical services lies inplane providing of

  5. Casting a Resource-Based View on Intangible Assets and Export Behaviour

    Directory of Open Access Journals (Sweden)

    Seyyed Mohammad Tabatabaei Nasab

    2013-09-01

    Prosperous companies in the 21st century have come to recognize intangible assets as an important factor in achieving sustainable competitive advantage and a constant presence in international markets. Hence, the purpose of this paper is to examine intangible assets and evaluate their relationship with export behaviour in terms of export intensity (export-sales ratio) and export type (permanent, occasional, and periodical). The population under study includes all export firms during 2002 until 2010 in Yazd province, Iran. Research data were collected by questionnaire, and in order to answer the research questions and test the hypotheses, MCDM techniques (i.e. AHP and TOPSIS) and statistical analysis (i.e. ANOVA) were utilized. According to the research results, human capital, relational capital, technological capital, corporate reputation, and structural capital ranked as the first to fifth significant factors, respectively. Findings revealed that there is a significant difference between permanent and occasional presence in international markets regarding intangible assets, as the mean of intangible assets in firms with permanent export is higher than that in firms with occasional export. However, there is no significant difference between intangible assets and export intensity.

  6. Gaussian mixture models-based ship target recognition algorithm in remote sensing infrared images

    Science.gov (United States)

    Yao, Shoukui; Qin, Xiaojuan

    2018-02-01

    Since the resolution of remote sensing infrared images is low, the features of ship targets become unstable. The issue of how to recognize ships with fuzzy features is an open problem. In this paper, we propose a novel ship target recognition algorithm based on Gaussian mixture models (GMMs). The proposed algorithm has mainly two steps. In the first step, the Hu moments of the ship target images are calculated, and the GMMs are trained on the moment features of ships. In the second step, the moment feature of each ship image is assigned to the trained GMMs for recognition. Because of the scale, rotation, and translation invariance of Hu moments and the powerful feature-space description ability of GMMs, the GMMs-based ship target recognition algorithm can recognize ships reliably. Experimental results on a large simulated image set show that our approach is effective in distinguishing different ship types and obtains satisfactory ship recognition performance.
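The second, assignment step, scoring a moment feature under each trained GMM and picking the best-scoring class, can be sketched with hand-specified mixture parameters (toy 2-D features standing in for Hu moments; real models would be fitted to training data):

```python
import numpy as np

def gmm_logpdf(x, weights, means, variances):
    """Log-likelihood of feature x under a diagonal-covariance GMM."""
    comp = []
    for w, m, v in zip(weights, means, variances):
        ll = -0.5 * np.sum(np.log(2 * np.pi * v) + (x - m) ** 2 / v)
        comp.append(np.log(w) + ll)
    return np.logaddexp.reduce(comp)  # log-sum-exp over mixture components

def classify(x, models):
    """Assign the moment feature to the class whose trained GMM scores highest."""
    return max(models, key=lambda name: gmm_logpdf(x, *models[name]))

# hand-specified (weights, means, variances) stand-ins for two ship types
models = {
    "cargo":   ([0.6, 0.4], np.array([[0.0, 0.0], [1.0, 1.0]]),
                np.array([[0.2, 0.2], [0.2, 0.2]])),
    "carrier": ([1.0],      np.array([[4.0, 4.0]]),
                np.array([[0.3, 0.3]])),
}
print(classify(np.array([3.8, 4.1]), models))  # → carrier
```

In practice the per-class mixtures would be fitted by expectation-maximization on Hu-moment features of labeled training images; only the scoring rule is shown here.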

  7. Accounting valuation development of specific assets

    Directory of Open Access Journals (Sweden)

    I.V. Zhigley

    2017-12-01

    The current issues of the development of accounting valuation are considered. The necessity of developing accounting valuation in the context of non-institutional theory principles is substantiated on the basis of a number of reasons. The reasons for the deterioration of the reputation of accounting as a separate socio-economic institution, in the context of developing the methodology for accounting of specific assets, are identified. The system of normative regulation of the accounting valuation of enterprise non-current assets in the case of impairment is analyzed. The procedure for determining and accounting for the impairment of assets in accordance with IAS 36 «Impairment of Assets» is developed. The features of the joint use of the concepts of «value in use» and «fair value» in the accounting system are disclosed. The procedure for determining the recoverable amount depending on the degree of specificity of assets is developed. The necessity of clarifying the features that indicate possible impairment of specific assets (termination or early termination of the contract for the use of a specific asset) is substantiated.

  8. Robust species taxonomy assignment algorithm for 16S rRNA NGS reads: application to oral carcinoma samples

    Directory of Open Access Journals (Sweden)

    Nezar Noor Al-Hebshi

    2015-09-01

    Full Text Available Background: The usefulness of next-generation sequencing (NGS) in assessing bacteria associated with oral squamous cell carcinoma (OSCC) has been undermined by the inability to classify reads to the species level. Objective: The purpose of this study was to develop a robust algorithm for species-level classification of NGS reads from oral samples and to pilot test it for profiling bacteria within OSCC tissues. Methods: Bacterial 16S V1-V3 libraries were prepared from three OSCC DNA samples and sequenced using 454's FLX chemistry. High-quality, well-aligned, and non-chimeric reads ≥350 bp were classified using a novel, multi-stage algorithm that involves matching reads to reference sequences in revised versions of the Human Oral Microbiome Database (HOMD), HOMD extended (HOMDEXT), and Greengenes Gold (GGG) at alignment coverage and percentage identity ≥98%, followed by assignment to the species level based on top-hit reference sequences. Priority was given to hits in HOMD, then HOMDEXT, and finally GGG. Unmatched reads were subjected to operational taxonomic unit analysis. Results: Nearly 92.8% of the reads were matched to updated-HOMD 13.2, 1.83% to trusted-HOMDEXT, and 1.36% to modified-GGG. Of all matched reads, 99.6% were classified to the species level. A total of 228 species-level taxa were identified, representing 11 phyla; the most abundant were Proteobacteria, Bacteroidetes, Firmicutes, Fusobacteria, and Actinobacteria. Thirty-five species-level taxa were detected in all samples. On average, Prevotella oris, Neisseria flava, Neisseria flavescens/subflava, Fusobacterium nucleatum ss polymorphum, Aggregatibacter segnis, Streptococcus mitis, and Fusobacterium periodontium were the most abundant. Bacteroides fragilis, a species rarely isolated from the oral cavity, was detected in two samples. Conclusion: This multi-stage algorithm maximizes the fraction of reads classified to the species level while ensuring reliable classification by giving priority to the
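The priority cascade described in the Methods (HOMD first, then HOMDEXT, then GGG, with a high identity threshold) can be sketched roughly as below. This is only an illustration of the cascade logic: the identity function assumes pre-aligned, gap-free, equal-length sequences, and the 10-mer "reference databases" are purely hypothetical:

```python
def percent_identity(read, ref):
    """Naive identity over an equal-length, pre-aligned pair (no gaps)."""
    return 100.0 * sum(a == b for a, b in zip(read, ref)) / len(read)

def classify(read, databases, threshold=98.0):
    """Try reference sets in priority order; return (database, species) for the
    top hit in the first database that clears the threshold, else None."""
    for name, refs in databases:                 # e.g. HOMD, then HOMDEXT, then GGG
        best_pid, best_sp = max((percent_identity(read, seq), sp)
                                for sp, seq in refs.items())
        if best_pid >= threshold:
            return name, best_sp
    return None                                  # route to OTU analysis instead

# Toy 10-mer "reference sequences" (purely illustrative)
dbs = [("HOMD",    {"S. mitis": "ACGTACGTAC"}),
       ("HOMDEXT", {"P. oris":  "ACGTACGTTT"})]
print(classify("ACGTACGTTT", dbs))   # only the second-priority database matches
```

A read matching neither database above the threshold returns None and would fall through to de novo OTU clustering, as in the record.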

  9. Evaluation of Dynamic Channel and Power Assignment for Cognitive Networks

    Energy Technology Data Exchange (ETDEWEB)

    Syed A. Ahmad; Umesh Shukla; Ryan E. Irwin; Luiz A. DaSilva; Allen B. MacKenzie

    2011-03-01

    In this paper, we develop a unifying optimization formulation to describe the Dynamic Channel and Power Assignment (DCPA) problem and an evaluation method for comparing DCPA algorithms. DCPA refers to the allocation of transmit power and frequency channels to links in a cognitive network so as to maximize the total number of feasible links while minimizing the aggregate transmit power. We apply our evaluation method to five algorithms representative of DCPA approaches in the literature. This comparison illustrates the tradeoffs between control modes (centralized versus distributed) and channel/power assignment techniques. We estimate the complexity of each algorithm. Through simulations, we evaluate the effectiveness of the algorithms in achieving feasible link allocations in the network, as well as their power efficiency. Our results indicate that, when few channels are available, the effectiveness of all algorithms is comparable, so the one with the lowest complexity should be selected. The Least Interfering Channel and Iterative Power Assignment (LICIPA) algorithm does not require cross-link gain information, has the lowest overall run time, and has the highest feasibility ratio of all the distributed algorithms; however, this comes at the cost of higher average power per link.
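A toy sketch of the two ideas behind LICIPA-style assignment — pick the least-interfering channel, then iteratively raise power toward an SINR target — is given below. This is not the published LICIPA algorithm: the channel count, power cap, SINR target, and noise floor are all invented. It also illustrates the record's observation that feasibility suffers when too many links share too few channels:

```python
CHANNELS, P_MAX, SINR_TARGET, NOISE = 3, 10.0, 2.0, 0.1

def least_interfering_channel(link, links):
    """Pick the channel on which this link currently senses the least power."""
    sensed = {c: sum(o["p"] for o in links if o is not link and o["ch"] == c)
              for c in range(CHANNELS)}
    return min(sensed, key=sensed.get)

def iterate_power(links, rounds=50):
    """Each link raises power in small steps until its SINR target is met."""
    for _ in range(rounds):
        for l in links:
            interf = sum(o["p"] for o in links if o is not l and o["ch"] == l["ch"])
            if l["p"] / (NOISE + interf) < SINR_TARGET and l["p"] < P_MAX:
                l["p"] = min(P_MAX, l["p"] + 0.5)

links = [{"ch": None, "p": 0.5} for _ in range(4)]
for l in links:                          # sequential channel-assignment pass
    l["ch"] = least_interfering_channel(l, links)
iterate_power(links)

def is_feasible(l):
    interf = sum(o["p"] for o in links if o is not l and o["ch"] == l["ch"])
    return l["p"] / (NOISE + interf) >= SINR_TARGET

print(sum(is_feasible(l) for l in links), "of", len(links), "links feasible")
```

With four links and three channels, two links end up sharing a channel and chase each other to the power cap without reaching the target, so only the two isolated links are feasible.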

  10. A priority-based heuristic algorithm (PBHA) for optimizing the integrated process planning and scheduling problem

    Directory of Open Access Journals (Sweden)

    Muhammad Farhan Ausaf

    2015-12-01

    Full Text Available Process planning and scheduling are two important components of a manufacturing setup, and it is important to integrate them to achieve better global optimality and improved system performance. Numerous algorithm-based approaches exist for finding optimal solutions to the integrated process planning and scheduling (IPPS) problem, most of which apply existing meta-heuristic algorithms. Although these approaches have been shown to be effective in optimizing the IPPS problem, there is still room for improvement in solution quality and algorithm efficiency, especially for more complicated problems. Dispatching rules have been used successfully for solving complicated scheduling problems but have not been considered extensively for the IPPS problem. This approach incorporates dispatching rules with the concept of prioritizing jobs, in an algorithm called the priority-based heuristic algorithm (PBHA). PBHA establishes job and machine priorities for selecting operations. Priority assignment and a set of dispatching rules are used simultaneously to generate both the process plans and the schedules for all jobs and machines. The algorithm was tested on a series of benchmark problems. The proposed algorithm achieved superior results for most complex problems presented in recent literature while using fewer computational resources.
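A minimal sketch of dispatching-rule-driven integrated planning/scheduling is shown below (not the PBHA itself): each job's next operation is tentatively placed on the machine that lets it finish earliest, and among all ready operations the earliest-finishing one is dispatched first. The jobs, machines, and processing times are invented:

```python
# Each job is an ordered list of (operation, {machine: processing time}) pairs,
# so machine choice (process planning) and sequencing (scheduling) interact.
jobs = {
    "J1": [("cut", {"M1": 3, "M2": 4}), ("drill", {"M1": 2, "M2": 2})],
    "J2": [("cut", {"M1": 2, "M2": 3}), ("mill",  {"M2": 4})],
}

machine_free = {"M1": 0, "M2": 0}          # time each machine becomes free
job_ready = {j: 0 for j in jobs}           # time each job's next op can start
pending = {j: list(ops) for j, ops in jobs.items()}
schedule = []

while any(pending.values()):
    # Dispatching rule: for every job's next operation, pick the machine that
    # lets it finish earliest, then dispatch the operation finishing earliest.
    cands = []
    for j, ops in pending.items():
        if not ops:
            continue
        name, alts = ops[0]
        m = min(alts, key=lambda mm: max(machine_free[mm], job_ready[j]) + alts[mm])
        finish = max(machine_free[m], job_ready[j]) + alts[m]
        cands.append((finish, j, name, m, alts[m]))
    finish, j, name, m, dur = min(cands)
    schedule.append((j, name, m, finish - dur, finish))
    machine_free[m], job_ready[j] = finish, finish
    pending[j].pop(0)

makespan = max(end for *_, end in schedule)
print(schedule, "makespan =", makespan)
```

Swapping in a different rule (e.g. most-work-remaining first) only changes how `cands` is ranked, which is the kind of rule set the PBHA combines with job priorities.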

  11. Moving beyond "Bookish Knowledge": Using Film-Based Assignments to Promote Deep Learning

    Science.gov (United States)

    Olson, Joann S.; Autry, Linda; Moe, Jeffry

    2016-01-01

    This article investigates the effectiveness of a film-based assignment given to adult learners in a graduate-level group counseling class. Semi-structured interviews were conducted with four students; data analysis suggested film-based assignments may promote deep approaches to learning (DALs). Participants indicated the assignment helped them…

  12. Monitoring highway assets using remote sensing technology : research spotlight.

    Science.gov (United States)

    2014-04-01

    Collecting inventory data about roadway assets is a critical part of MDOT's asset management efforts, which help the department operate, maintain and upgrade these assets cost-effectively. Federal law requires that states develop a risk-based...

  13. Looking for Synergy with Momentum in Main Asset Classes

    OpenAIRE

    Lukas Macijauskas; Dimitrios I. Maditinos

    2014-01-01

    As correlations between the main asset classes falter during turbulent market conditions, classical asset management concepts seem unreliable. This problem stimulates the search for non-discretionary asset allocation methods. The aim of the paper is to test whether the momentum phenomenon could be used as a stand-alone investment strategy across all main asset classes. The study is based on exploring historical prices of various asset classes; statistical data analysis methods are used. Result...

  14. Three-dimensional GIS approach for management of assets

    Science.gov (United States)

    Lee, S. Y.; Yee, S. X.; Majid, Z.; Setan, H.

    2014-02-01

    Assets play an important role in human life, especially for organizations. Organizations strive to improve their operations and asset management. GIS technology has become a powerful management tool, as it can provide a complete inventory for managing assets together with location-based information. Spatial information is one of the requirements for decision making in various areas, including asset management in buildings. This paper describes a 3D GIS approach for the management of assets. An asset management system was developed by integrating the GIS concept with 3D asset models. The purpose of 3D visualization in managing assets is to facilitate analysis and understanding of a complex environment. Behind the 3D asset models is a database that stores the asset information. A user-friendly interface was also designed to make the application easier to operate. In the application developed, the location of each individual asset can easily be tracked through the associated spatial information and 3D viewing. The 3D GIS approach described in this paper would certainly be useful in asset management. Systematic management of assets can be carried out, leading to less time-consuming and more cost-effective operations. The results in this paper show a new approach to improving asset management.

  15. Three-dimensional GIS approach for management of assets

    International Nuclear Information System (INIS)

    Lee, S Y; Yee, S X; Majid, Z; Setan, H

    2014-01-01

    Assets play an important role in human life, especially for organizations. Organizations strive to improve their operations and asset management. GIS technology has become a powerful management tool, as it can provide a complete inventory for managing assets together with location-based information. Spatial information is one of the requirements for decision making in various areas, including asset management in buildings. This paper describes a 3D GIS approach for the management of assets. An asset management system was developed by integrating the GIS concept with 3D asset models. The purpose of 3D visualization in managing assets is to facilitate analysis and understanding of a complex environment. Behind the 3D asset models is a database that stores the asset information. A user-friendly interface was also designed to make the application easier to operate. In the application developed, the location of each individual asset can easily be tracked through the associated spatial information and 3D viewing. The 3D GIS approach described in this paper would certainly be useful in asset management. Systematic management of assets can be carried out, leading to less time-consuming and more cost-effective operations. The results in this paper show a new approach to improving asset management

  16. Prudent management of utility assets -- Problem or promise?

    International Nuclear Information System (INIS)

    Hatch, D.; Serwinowski, M.

    1998-01-01

    As utilities move into a deregulated market, the extent and nature of their asset base, as well as the manner in which they have managed it, may be a key factor in the form of regulatory recovery. Utilities must face the issue of stranded assets. One way of addressing this issue is using ''EVA'' (Economic Value Added) as a mechanism for building financial models for prudent asset management. The authors present an approach to this challenging aspect of deregulation, focusing on the following utility assets: buildings/facilities and excess real physical assets. Focusing primarily on Niagara Mohawk, two or three case studies are used to demonstrate how proactive management and EVA analysis transform underperforming utility assets. These are presented in a way that shows benefits for all utility stakeholders, such as cost avoidance, load growth, real estate tax savings, stranded asset reductions, environmental gains, corporate image enhancement, and regulatory/governmental gains, over and above possible economic gains. Examples include the transformation of utility assets into award-winning commercial, residential, and industrial developments as well as recreational/park lands and greenways. Similarly, other examples show the many tangible and intangible benefits of an effective investment recovery and waste stream management program. Various strategies are also presented that detail how utilities can begin to develop a comprehensive plan for their asset portfolio. The first step in realizing and maximizing EVA across a portfolio of assets is a change in corporate policy--from passive ownership to active prudent management. Service and cost will drive the competition resulting from full deregulation. To drive down costs, utilities will need to become more efficient in dealing with their asset base. By embracing an EVA model for an entire asset portfolio, utilities can prepare for and excel in the newly shaped marketplace

  17. Solar Asset Management Software

    Energy Technology Data Exchange (ETDEWEB)

    Iverson, Aaron [Ra Power Management, Inc., Oakland, CA (United States); Zviagin, George [Ra Power Management, Inc., Oakland, CA (United States)

    2016-09-30

    Ra Power Management (RPM) has developed a cloud based software platform that manages the financial and operational functions of third party financed solar projects throughout their lifecycle. RPM’s software streamlines and automates the sales, financing, and management of a portfolio of solar assets. The software helps solar developers automate the most difficult aspects of asset management, leading to increased transparency, efficiency, and reduction in human error. More importantly, our platform will help developers save money by improving their operating margins.

  18. The application of statistical methods to assess economic assets

    Directory of Open Access Journals (Sweden)

    D. V. Dianov

    2017-01-01

    Full Text Available The article is devoted to the consideration and valuation of machinery, equipment and special equipment, methodological aspects of the use of standards for the assessment of buildings and structures in current prices, the valuation of residential and specialized houses and office premises, the assessment and reassessment of active and inactive military assets, and the application of statistical methods to obtain the relevant cost estimates. The objective of the article is to consider the possible application of statistical tools in the valuation of the assets composing the core group of elements of national wealth: fixed assets. Capital tangible assets constitute the basis of the material base of new value creation, products and non-financial services. The gain accumulated from tangible assets of a capital nature is part of the gross domestic product, and from its volume and share in GDP one can judge the scope of reproductive processes in the country. Based on the methodological materials of the state statistics bodies of the Russian Federation and the provisions of the theory of statistics, which describe methods of statistical analysis such as indexes, average values and regression, a methodical approach is structured for applying statistical tools to obtain value estimates of property, plant and equipment with significant accumulated depreciation. Until now, the use of statistical methodology in the practice of economic assessment of assets has been only fragmentary. This applies both to federal legislation (Federal Law № 135 «On Valuation Activities in the Russian Federation» dated 16.07.1998, as amended 05.07.2016) and to the methodological documents and regulations on valuation activities, in particular the valuation standards. A particular problem is the use of the digital database of Rosstat (Federal State Statistics Service), as for specific fixed assets the comparison should be carried

  19. ASSET guidelines

    International Nuclear Information System (INIS)

    1990-11-01

    The IAEA Assessment of Safety Significant Events Team (ASSET) Service provides advice and assistance to Member States to enhance the overall level of plant safety while dealing with the policy of prevention of incidents at nuclear power plants. The ASSET programme, initiated in 1986, is not restricted to any particular group of Member States, whether developing or industrialized, but is available to all countries with nuclear power plants in operation or approaching commercial operation. The IAEA Safety Series publications form a common basis for the ASSET reviews, including the Nuclear Safety Standards (NUSS) and the Basic Safety Principles (Recommendations of Safety Series No. 75-INSAG-3). The ASSET Guidelines provide overall guidance for the experts to ensure the consistency and comprehensiveness of their review of incident investigations. Additional guidance and reference material is provided by the IAEA to complement the expertise of the ASSET members. ASSET reviews accept different approaches that contribute to ensuring an effective prevention of incidents at plants. Suggestions are offered to enhance plant safety performance. Commendable good practices are identified and generic lessons are communicated to other plants, where relevant, for long term improvement

  20. Fuzzy Rules for Ant Based Clustering Algorithm

    Directory of Open Access Journals (Sweden)

    Amira Hamdi

    2016-01-01

    Full Text Available This paper provides a new intelligent technique for the semisupervised data clustering problem that combines the Ant System (AS) algorithm with the fuzzy c-means (FCM) clustering algorithm. Our proposed approach, called the F-ASClass algorithm, is a distributed algorithm inspired by the foraging behavior observed in ant colonies; the ability of ants to find the shortest path forms the basis of our proposed approach. In the first step, several colonies of cooperating entities, called artificial ants, are used to find shortest paths in a complete graph that we call the graph-data. The number of colonies used in F-ASClass is equal to the number of clusters in the dataset. In the second step, the partition matrix of the dataset found by the artificial ants is given to the fuzzy c-means technique in order to assign the objects left unclassified in the first step. The proposed approach is tested on artificial and real datasets, and its performance is compared with those of the K-means, K-medoid, and FCM algorithms. The experimental section shows that F-ASClass performs better in terms of classification error rate, accuracy, and separation index.

  1. Comprehensive transportation asset management : risk-based inventory expansion and data needs.

    Science.gov (United States)

    2011-12-01

    Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...

  2. Development of an international scale of socio-economic position based on household assets.

    Science.gov (United States)

    Townend, John; Minelli, Cosetta; Harrabi, Imed; Obaseki, Daniel O; El-Rhazi, Karima; Patel, Jaymini; Burney, Peter

    2015-01-01

    The importance of studying associations between socio-economic position and health has often been highlighted. Previous studies have linked the prevalence and severity of lung disease with national wealth and with socio-economic position within some countries, but there has been no systematic evaluation of the association between lung function and poverty at the individual level on a global scale. The BOLD study has collected data on lung function for individuals in a wide range of countries; however, a barrier to relating this to personal socio-economic position is the need for a suitable measure to compare individuals within and between countries. In this paper we test a method for assessing socio-economic position based on the scalability of a set of durable assets (Mokken scaling), and compare its usefulness across countries of varying gross national income per capita. Ten out of 15 candidate asset questions included in the questionnaire were found to form a Mokken-type scale closely associated with GNI per capita (Spearman's rank rs = 0.91, p = 0.002). The same set of assets conformed to a scale in 7 out of the 8 countries, the remaining country being Saudi Arabia, where most respondents owned most of the assets. There was good consistency in the rank ordering of ownership of the assets in the different countries (Cronbach's alpha = 0.96). Scores on the Mokken scale were highly correlated with scores developed using principal component analysis (rs = 0.977). Mokken scaling is a potentially valuable tool for uncovering links between disease and socio-economic position within and between countries. It provides an alternative to currently used methods such as principal component analysis for combining personal asset data to give an indication of individuals' relative wealth. Relative strengths of the Mokken scale method were considered to be ease of interpretation, adaptability for comparison with other datasets, and reliability of imputation for even quite
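One ingredient of Mokken scaling is Loevinger's scalability coefficient H, which compares observed Guttman errors (owning a "harder" asset without the "easier" one) with the number expected under independence. The sketch below, on invented 0/1 asset data, computes only this single coefficient; a full Mokken analysis (e.g. the R mokken package) does considerably more:

```python
import numpy as np

def loevinger_H(X):
    """Loevinger's scalability coefficient H for a respondents-by-items 0/1
    matrix: H = 1 - (observed Guttman errors) / (errors expected under
    independence), summed over all item pairs ordered easiest-first."""
    X = np.asarray(X)
    n, k = X.shape
    X = X[:, np.argsort(-X.sum(axis=0))]      # most-owned (easiest) item first
    p = X.mean(axis=0)
    F = E = 0.0
    for i in range(k):
        for j in range(i + 1, k):             # item j is harder than item i
            F += np.sum((X[:, i] == 0) & (X[:, j] == 1))  # harder without easier
            E += n * (1 - p[i]) * p[j]
    return 1 - F / E

# A perfect Guttman scale (ownership strictly cumulative) gives H = 1.
perfect = [[1, 0, 0], [1, 1, 0], [1, 1, 1], [0, 0, 0]]
assert loevinger_H(perfect) == 1.0
# Scrambled responses score far below the usual H >= 0.3 scalability cut-off.
noisy = [[0, 1, 1], [1, 0, 1], [0, 1, 0], [1, 0, 0]]
print(round(loevinger_H(noisy), 2))
```

Respondents' positions on a scale that passes this check can then be summarized by their simple asset counts, which is what makes the scale usable across surveys.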

  3. A risk-based approach to sanitary sewer pipe asset management.

    Science.gov (United States)

    Baah, Kelly; Dubey, Brajesh; Harvey, Richard; McBean, Edward

    2015-02-01

    Wastewater collection systems are an important component of proper management of wastewater to prevent environmental and human health implications from mismanagement of anthropogenic waste. Due to aging and inadequate asset management practices, the wastewater collection assets of many cities around the globe are in a state of rapid decline and in need of urgent attention. Risk management is a tool which can help prioritize resources to better manage and rehabilitate wastewater collection systems. In this study, a risk matrix and a weighted sum multi-criteria decision-matrix are used to assess the consequence and risk of sewer pipe failure for a mid-sized city, using ArcGIS. The methodology shows that six percent of the uninspected sewer pipe assets of the case study have a high consequence of failure while four percent of the assets have a high risk of failure and hence provide priorities for inspection. A map incorporating risk of sewer pipe failure and consequence is developed to facilitate future planning, rehabilitation and maintenance programs. The consequence of failure assessment also includes a novel failure impact factor which captures the effect of structurally defective stormwater pipes on the failure assessment. The methodology recommended in this study can serve as a basis for future planning and decision making and has the potential to be universally applied by municipal sewer pipe asset managers globally to effectively manage the sanitary sewer pipe infrastructure within their jurisdiction. Copyright © 2014 Elsevier B.V. All rights reserved.
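The risk-matrix idea in this record — combining a likelihood-of-failure score with a consequence-of-failure score to band and rank pipes for inspection — can be sketched as below. The 1-5 scoring, band thresholds, and pipe records are invented, not the paper's calibrated values:

```python
def risk_band(likelihood, consequence, high=15, medium=8):
    """Map 1-5 likelihood and consequence scores to a risk band via their product."""
    score = likelihood * consequence
    if score >= high:
        return "high"
    return "medium" if score >= medium else "low"

# Hypothetical pipes: (id, likelihood of failure, consequence of failure)
pipes = [("P-101", 4, 5), ("P-102", 2, 4), ("P-103", 1, 3)]
ranked = sorted(pipes, key=lambda p: p[1] * p[2], reverse=True)
for pid, l, c in ranked:
    print(pid, risk_band(l, c))   # inspection priority order
```

In practice each score would itself be a weighted sum of criteria (age, material, proximity to sensitive receptors, and so on), as in the paper's weighted multi-criteria decision matrix.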

  4. Genetic Algorithms Principles Towards Hidden Markov Model

    Directory of Open Access Journals (Sweden)

    Nabil M. Hewahi

    2011-10-01

    Full Text Available In this paper we propose a general approach based on Genetic Algorithms (GAs) to evolve Hidden Markov Models (HMMs). The problem arises when experts assign probability values for an HMM: they use only limited inputs, so the assigned values might not be accurate for other cases in the same domain. We introduce a GA-based approach to find suitable probability values for the HMM so that it is correct in more cases than those used to assign the original probability values.
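A heavily simplified sketch of the idea is shown below: a GA evolves the four free probabilities of a tiny two-state HMM (two state-switch probabilities and two emission probabilities) to maximize the forward-algorithm likelihood of some training sequences. The sequences, population size, and mutation operator are invented, not taken from the paper:

```python
import random
random.seed(0)

OBS = [[0, 0, 1, 1], [0, 1, 1, 1]]   # training sequences over symbols {0, 1}

def likelihood(theta, seq):
    """Forward algorithm for a 2-state HMM. theta = (a01, a10, b0, b1), where
    a01/a10 are state-switch probabilities and b_i = P(symbol 1 | state i)."""
    a01, a10, b0, b1 = theta
    A = [[1 - a01, a01], [a10, 1 - a10]]
    B = [[1 - b0, b0], [1 - b1, b1]]
    alpha = [0.5 * B[i][seq[0]] for i in range(2)]        # uniform start
    for o in seq[1:]:
        alpha = [sum(alpha[i] * A[i][j] for i in range(2)) * B[j][o]
                 for j in range(2)]
    return sum(alpha)

def fitness(theta):
    f = 1.0
    for seq in OBS:
        f *= likelihood(theta, seq)
    return f

def mutate(theta, s=0.1):
    return tuple(min(1.0, max(0.0, g + random.uniform(-s, s))) for g in theta)

pop = [tuple(random.random() for _ in range(4)) for _ in range(20)]
for _ in range(100):                   # elitist GA: keep top half, mutate it
    pop.sort(key=fitness, reverse=True)
    pop = pop[:10] + [mutate(random.choice(pop[:10])) for _ in range(10)]

best = max(pop, key=fitness)
print("best fitness:", fitness(best))
```

Elitism makes the best fitness monotonically non-decreasing, which is what lets the evolved probabilities improve on an expert's initial guess.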

  5. Single-Sex Schools, Student Achievement, and Course Selection: Evidence from Rule-Based Student Assignments in Trinidad and Tobago. NBER Working Paper No. 16817

    Science.gov (United States)

    Jackson, C. Kirabo

    2011-01-01

    Existing studies on single-sex schooling suffer from biases due to student selection to schools and single-sex schools being better in unmeasured ways. In Trinidad and Tobago students are assigned to secondary schools based on an algorithm allowing one to address self-selection bias and cleanly estimate an upper-bound single-sex school effect. The…

  6. Global Tactical Cross-Asset Allocation: Applying Value and Momentum Across Asset Classes

    NARCIS (Netherlands)

    D.C. Blitz (David); P. van Vliet (Pim)

    2008-01-01

    textabstractIn this paper we examine global tactical asset allocation (GTAA) strategies across a broad range of asset classes. Contrary to market timing for single asset classes and tactical allocation across similar assets, this topic has received little attention in the existing literature. Our

  7. Development of transportation asset management decision support tools : final report.

    Science.gov (United States)

    2017-08-09

    This study developed a web-based prototype decision support platform to demonstrate the benefits of transportation asset management in monitoring asset performance, supporting asset funding decisions, planning budget tradeoffs, and optimizing resourc...

  8. Using Photovoice and Asset Mapping to Inform a Community-Based Diabetes Intervention, Boston, Massachusetts, 2015.

    Science.gov (United States)

    Florian, Jana; Roy, Nicole M St Omer; Quintiliani, Lisa M; Truong, Ve; Feng, Yi; Bloch, Philippe P; Russinova, Zlatka L; Lasser, Karen E

    2016-08-11

    Diabetes self-management takes place within a complex social and environmental context.  This study's objective was to examine the perceived and actual presence of community assets that may aid in diabetes control. We conducted one 6-hour photovoice session with 11 adults with poorly controlled diabetes in Boston, Massachusetts.  Participants were recruited from census tracts with high numbers of people with poorly controlled diabetes (diabetes "hot spots").  We coded the discussions and identified relevant themes.  We further explored themes related to the built environment through community asset mapping.  Through walking surveys, we evaluated 5 diabetes hot spots related to physical activity resources, walking environment, and availability of food choices in restaurants and food stores. Community themes from the photovoice session were access to healthy food, restaurants, and prepared foods; food assistance programs; exercise facilities; and church.  Asset mapping identified 114 community assets including 22 food stores, 22 restaurants, and 5 exercise facilities.  Each diabetes hot spot contained at least 1 food store with 5 to 9 varieties of fruits and vegetables.  Only 1 of the exercise facilities had signage regarding hours or services.  Memberships ranged from free to $9.95 per month.  Overall, these findings were inconsistent with participants' reports in the photovoice group. We identified a mismatch between perceptions of community assets and built environment and the objective reality of that environment. Incorporating photovoice and community asset mapping into a community-based diabetes intervention may bring awareness to underused neighborhood resources that can help people control their diabetes.

  9. Global Tactical Cross-Asset Allocation: Applying Value and Momentum Across Asset Classes

    OpenAIRE

    Blitz, D.C.; van Vliet, P.

    2008-01-01

    textabstractIn this paper we examine global tactical asset allocation (GTAA) strategies across a broad range of asset classes. Contrary to market timing for single asset classes and tactical allocation across similar assets, this topic has received little attention in the existing literature. Our main finding is that momentum and value strategies applied to GTAA across twelve asset classes deliver statistically and economically significant abnormal returns. For a long top-quartile and short b...

  10. An Accurate and Impartial Expert Assignment Method for Scientific Project Review

    Directory of Open Access Journals (Sweden)

    Mingliang Yue

    2017-12-01

    Full Text Available Purpose: This paper proposes an expert assignment method for scientific project review that considers both accuracy and impartiality. As impartial and accurate peer review is extremely important to ensure the quality and feasibility of scientific projects, enhanced methods for managing the process are needed. Design/methodology/approach: To ensure both accuracy and impartiality, we design four criteria, the reviewers’ fitness degree, research intensity, academic association, and potential conflict of interest, to express the characteristics of an appropriate peer review expert. We first formalize the expert assignment problem as an optimization problem based on the designed criteria, and then propose a randomized algorithm for identifying adequate reviewers. Findings: Simulation results show that the proposed method is quite accurate and impartial during expert assignment. Research limitations: Although the criteria used in this paper properly capture the characteristics of an appropriate peer review expert, more criteria/conditions could be included in the proposed scheme to further enhance the accuracy and impartiality of the expert assignment. Practical implications: The proposed method can help project funding agencies (e.g. the National Natural Science Foundation of China) find better experts for project peer review. Originality/value: To the authors’ knowledge, this is the first publication that proposes an algorithm that applies an impartial approach to the project review expert assignment process. The simulation results show the effectiveness of the proposed method.
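A toy version of the randomized assignment idea might look like the following: repeated randomized greedy passes assign each proposal to the best-scoring eligible expert, subject to conflict-of-interest exclusions and a per-expert load cap, keeping the highest-fitness plan found. All scores, conflicts, and caps are invented, and the paper's four criteria are collapsed here into a single fitness number per expert-proposal pair:

```python
import random
random.seed(1)

experts   = ["E1", "E2", "E3"]
proposals = ["P1", "P2", "P3", "P4"]
fit = {("E1", "P1"): 9, ("E1", "P2"): 4, ("E1", "P3"): 7, ("E1", "P4"): 2,
       ("E2", "P1"): 3, ("E2", "P2"): 8, ("E2", "P3"): 5, ("E2", "P4"): 6,
       ("E3", "P1"): 5, ("E3", "P2"): 2, ("E3", "P3"): 9, ("E3", "P4"): 8}
conflict = {("E3", "P3")}        # e.g. E3 collaborates with P3's applicant
MAX_LOAD = 2                     # proposals per expert

def random_assignment():
    """One randomized greedy pass: visit proposals in random order and give
    each to the best-scoring expert that is non-conflicted and has capacity."""
    load, plan = {e: 0 for e in experts}, {}
    for p in random.sample(proposals, len(proposals)):
        ok = [e for e in experts if (e, p) not in conflict and load[e] < MAX_LOAD]
        plan[p] = max(ok, key=lambda e: fit[(e, p)])
        load[plan[p]] += 1
    return plan

def total_fit(plan):
    return sum(fit[(e, p)] for p, e in plan.items())

best = max((random_assignment() for _ in range(200)), key=total_fit)
assert all((e, p) not in conflict for p, e in best.items())
print(best, total_fit(best))
```

The randomized restarts explore different assignment orders, so a strong plan is found without enumerating all assignments, while the conflict set guarantees impartiality constraints are never violated.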

  11. The model of asset management of commercial banks

    OpenAIRE

    Shaymardanov, Shakhzod; Nuriddinov, Sadriddin; Mamadaliev, Donierbek; Murodkhonov, Mukhammad

    2018-01-01

    The main objective of the commercial bank's policy in the sphere of asset and liability management is to maintain the optimal structure of assets and liabilities, ensure the compliance of amounts, terms and currency of attracting and allocating resources. The objectives and principles of asset and liability management are based on the bank's strategy and the fundamental principles of the risk management policy.

  12. An Optimization Method of Passenger Assignment for Customized Bus

    OpenAIRE

    Yang Cao; Jian Wang

    2017-01-01

    This study proposes an optimization method of passenger assignment on customized buses (CB). Our proposed method guarantees benefits to passengers by balancing the elements of travel time, waiting time, delay, and economic cost. The optimization problem was solved using a Branch and Bound (B&B) algorithm based on the shortest path for the selected stations. A simulation-based evaluation of the proposed optimization method was conducted. We find that a CB service can save 38.33% in average tra...

  13. A Bio-Inspired Approach to Traffic Network Equilibrium Assignment Problem.

    Science.gov (United States)

    Zhang, Xiaoge; Mahadevan, Sankaran

    2018-04-01

    Finding an equilibrium state of the traffic assignment plays a significant role in the design of transportation networks. We adapt the path-finding mathematical model of the slime mold Physarum polycephalum to solve the traffic equilibrium assignment problem. We make three contributions in this paper. First, we propose a generalized Physarum model to solve the shortest path problem in directed and asymmetric graphs. Second, we extend it further to solve the network design problem with multiple source nodes and sink nodes. Finally, we demonstrate that the Physarum solver converges to the user-optimized (Wardrop) equilibrium by dynamically updating the costs of links in the network. In addition, convergence of the developed algorithm is proved. Numerical examples are used to demonstrate the efficiency of the proposed algorithm. The superiority of the proposed algorithm is demonstrated in comparison with several other algorithms, including the Frank-Wolfe algorithm, the conjugate Frank-Wolfe algorithm, the biconjugate Frank-Wolfe algorithm, and the gradient projection algorithm.
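For orientation, the Frank-Wolfe family that this record benchmarks against can be illustrated on a two-route toy network with a method-of-successive-averages (MSA) step, which converges to the same Wardrop equilibrium the Physarum solver targets. The demand and cost functions below are invented:

```python
# Two parallel routes sharing a demand of 3 units, with linear travel times
#   t1(x1) = 1 + x1,  t2(x2) = 2 + x2.
# Wardrop equilibrium: 1 + x1 = 2 + x2 and x1 + x2 = 3  =>  x = (2, 1),
# at which point both used routes cost 3.
def cost(x):
    return (1.0 + x[0], 2.0 + x[1])

x = [3.0, 0.0]                                       # start: all flow on route 1
for k in range(1, 5000):
    c = cost(x)
    y = [3.0, 0.0] if c[0] <= c[1] else [0.0, 3.0]   # all-or-nothing loading
    step = 1.0 / (k + 1)                             # MSA step size
    x = [xi + step * (yi - xi) for xi, yi in zip(x, y)]

print([round(v, 2) for v in x])                      # approaches [2.0, 1.0]
```

Full Frank-Wolfe replaces the fixed MSA step with a line search along the all-or-nothing direction; both drive the flows toward equal costs on all used routes.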

  14. The strategic importance of identifying knowledge-based and intangible assets for generating value, competitiveness and innovation in sub-Saharan Africa

    Directory of Open Access Journals (Sweden)

    Nicoline Ondari-Okemwa

    2011-01-01

    Full Text Available This article discusses the strategic importance of identifying intangible assets for creating value and enhancing competitiveness and innovation in science and technology in a knowledge economy, with particular reference to the sub-Saharan Africa region. It has always been difficult to gather the prerequisite information to manage such assets and create value from them. The paper discusses the nature of intangible assets, the characteristics of a knowledge economy and the role of knowledge workers in a knowledge economy. It also discusses the importance of identifying intangible assets in relation to capturing the value of such assets, the transfer of intangible assets to other owners and the challenges of managing organizational intangible assets. Objectives of the article include: underscoring the strategic importance of identifying intangible assets in sub-Saharan Africa; examining the performance of intangible assets in a knowledge economy; exploring how intangible assets may generate competitiveness, economic growth and innovation; and assessing how knowledge workers are becoming a dominant factor in the knowledge economy. An extensive literature review was employed to collect data for this article. The article concludes that organizations and governments in sub-Saharan Africa should treat knowledge-based assets as strategic resources, even though traditional accounting systems may still have problems in determining the exact book value of such assets. It is recommended that organizations and government departments in sub-Saharan Africa implement a system for reporting the value of intangible organizational assets, just as they report the value of tangible assets, and that organizations in sub-Saharan Africa use knowledge to produce “smart products and services” which command premium prices.

  15. Opposition-Based Adaptive Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Chibing Gong

    2016-07-01

Full Text Available A fireworks algorithm (FWA) is a recent swarm intelligence algorithm that is inspired by observing fireworks explosions. An adaptive fireworks algorithm (AFWA) introduces additional adaptive amplitudes to improve the performance of the enhanced fireworks algorithm (EFWA). The purpose of this paper is to add opposition-based learning (OBL) to AFWA with the goal of further boosting performance and achieving global optimization. Twelve benchmark functions are tested using an opposition-based adaptive fireworks algorithm (OAFWA). The final results conclude that OAFWA significantly outperformed EFWA and AFWA in terms of solution accuracy. Additionally, OAFWA was compared with a bat algorithm (BA), differential evolution (DE), self-adapting control parameters in differential evolution (jDE), a firefly algorithm (FA), and a standard particle swarm optimization 2011 (SPSO2011) algorithm. The research results indicate that OAFWA ranks the highest of the six algorithms for both solution accuracy and runtime cost.
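The OBL step the abstract refers to can be sketched generically: each random candidate x in [a, b] competes with its opposite a + b - x, and the fitter of the two survives. This is a minimal illustration of opposition-based initialization under assumed parameters (the `sphere` objective, bounds and population size are invented here), not the paper's OAFWA implementation.

```python
import random

def opposition_based_init(fitness, dim, bounds, pop_size, seed=0):
    """Opposition-based initialization: for each random candidate x,
    also evaluate its opposite a + b - x and keep whichever of the
    two scores better (lower fitness)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = []
    for _ in range(pop_size):
        x = [rng.uniform(lo, hi) for _ in range(dim)]
        x_opp = [lo + hi - xi for xi in x]       # the opposite point
        pop.append(min(x, x_opp, key=fitness))   # keep the fitter of the two
    return pop

# Illustrative objective: the sphere function
sphere = lambda v: sum(xi * xi for xi in v)
pop = opposition_based_init(sphere, dim=3, bounds=(-5.0, 5.0), pop_size=10)
best = min(pop, key=sphere)
```

In OAFWA-style algorithms the same candidate-versus-opposite comparison is also applied during iterations, which is what "further boosting performance" refers to.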

  16. Opposition-Based Adaptive Fireworks Algorithm

    OpenAIRE

    Chibing Gong

    2016-01-01

    A fireworks algorithm (FWA) is a recent swarm intelligence algorithm that is inspired by observing fireworks explosions. An adaptive fireworks algorithm (AFWA) proposes additional adaptive amplitudes to improve the performance of the enhanced fireworks algorithm (EFWA). The purpose of this paper is to add opposition-based learning (OBL) to AFWA with the goal of further boosting performance and achieving global optimization. Twelve benchmark functions are tested in use of an opposition-based a...

  17. Optimal algorithmic trading and market microstructure

    OpenAIRE

    Labadie , Mauricio; Lehalle , Charles-Albert

    2010-01-01

The efficient frontier is a core concept in Modern Portfolio Theory. Based on this idea, we will construct optimal trading curves for different types of portfolios. These curves correspond to the algorithmic trading strategies that minimize the expected transaction costs, i.e. the joint effect of market impact and market risk. We will study five portfolio trading strategies. For the first three (single-asset, general multi-asset and balanced portfolios) we will assume that the underlyings fo...

  18. Network formation in a multi-asset artificial stock market

    Science.gov (United States)

    Wu, Songtao; He, Jianmin; Li, Shouwei; Wang, Chao

    2018-04-01

    A multi-asset artificial stock market is developed. In the market, stocks are assigned to a number of sectors and traded by heterogeneous investors. The mechanism of continuous double auction is employed to clear order book and form daily closed prices. Simulation results of prices at the sector level show an intra-sector similarity and inter-sector distinctiveness, and returns of individual stocks have stylized facts that are ubiquitous in the real-world stock market. We find that the market risk factor has critical impact on both network topology transition and connection formation, and that sector risk factors account for the formation of intra-sector links and sector-based local interaction. In addition, the number of community in threshold-based networks is correlated negatively and positively with the value of correlation coefficients and the ratio of intra-sector links, which are respectively determined by intensity of sector risk factors and the number of sectors.
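The threshold-based networks mentioned above can be sketched under the common convention that two stocks are linked when their return correlation exceeds a threshold. The two-sector factor model and all numbers below are invented for illustration; they are not the paper's simulation, but they reproduce the intra-sector/inter-sector contrast the abstract describes.

```python
import numpy as np

def threshold_network(returns, theta):
    """Build a threshold-based market network: stocks i and j are
    linked when their return correlation exceeds theta.
    returns: (T, N) array of daily returns for N stocks."""
    corr = np.corrcoef(returns.T)              # (N, N) return correlations
    adj = (corr > theta) & ~np.eye(len(corr), dtype=bool)
    return adj.astype(int)

# Toy example: two "sectors", each driven by its own risk factor
rng = np.random.default_rng(0)
T = 500
f1, f2 = rng.normal(size=(2, T))               # two sector risk factors
sector1 = f1[:, None] + 0.5 * rng.normal(size=(T, 3))
sector2 = f2[:, None] + 0.5 * rng.normal(size=(T, 3))
rets = np.hstack([sector1, sector2])           # 6 stocks, 3 per sector
adj = threshold_network(rets, theta=0.5)
```

With these loadings the within-sector correlation is about 0.8 while cross-sector correlation is near zero, so the resulting graph contains only intra-sector links, mirroring the sector-based local interaction described in the record.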

  19. A controllable sensor management algorithm capable of learning

    Science.gov (United States)

    Osadciw, Lisa A.; Veeramacheneni, Kalyan K.

    2005-03-01

Sensor management technology progress is challenged by the geographic space it spans, the heterogeneity of the sensors, and the real-time timeframes within which plans controlling the assets are executed. This paper presents a new sensor management paradigm and demonstrates its application in a sensor management algorithm designed for a biometric access control system. This approach consists of an artificial intelligence (AI) algorithm focused on uncertainty measures, which makes the high-level decisions to reduce uncertainties and interfaces with the user, integrated cohesively with a bottom-up evolutionary algorithm, which optimizes the sensor network's operation as determined by the AI algorithm. The sensor management algorithm presented is composed of a Bayesian network (the AI algorithm component) and a swarm optimization algorithm (the evolutionary algorithm). Thus, the algorithm can change its own performance goals in real time and will modify its own decisions based on observed measures within the sensor network. The definition of the measures as well as the Bayesian network determine the robustness of the algorithm and its utility in reacting dynamically to changes in the global system.

  20. Child-Centered Group Play Therapy: Impact on Social-Emotional Assets of Kindergarten Children

    Science.gov (United States)

    Cheng, Yi-Ju; Ray, Dee C.

    2016-01-01

The current study explored the effects of child-centered group play therapy (CCGPT) on social-emotional assets of kindergarten children and the therapeutic aspect of group sizes in CCGPT outcome. A total of 43 participants were randomly assigned to either the intervention or waitlist control groups. We used Parent and Teacher forms of Social…

  1. Risk Management of Assets Dependency Based on Copulas Function

    Directory of Open Access Journals (Sweden)

    Cheng Lei

    2017-01-01

Full Text Available As two important instruments of the financial market, financial securities such as stocks and bonds carry risk that has been a hot topic in the financial field; at the same time, because many factors influence financial assets, the correlation between portfolio returns has attracted growing research attention. This paper presents a Copula-SV-t model that uses the SV-t model to measure the marginal distributions and the Copula-t method to obtain the high-dimensional joint distribution. It not only corrects the bias incurred when the ARCH family of models is used to calculate portfolio risk, but also avoids the overestimation of risk that arises when extreme value theory is used to study financial risk. The empirical research concludes that the model better describes the assets and their tail characteristics, and is more in line with the reality of the market. Furthermore, the empirical evidence also shows that if the correlation within the portfolio is relatively strong, the ability to diversify portfolio risk is relatively weak.

  2. Dynamic route guidance algorithm based on artificial immune system

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

To improve the performance of the K-shortest paths search in intelligent traffic guidance systems, this paper proposes an optimal search algorithm based on intelligent optimization search theory and the memory mechanism of vertebrate immune systems. This algorithm, applied to the urban traffic network model established by the node-expanding method, can conveniently realize K-shortest paths search in urban traffic guidance systems. Owing to the immune memory and global parallel search ability of artificial immune systems, the K shortest paths can be found without any repetition, which clearly indicates the superiority of the algorithm over conventional ones. Not only does it offer better parallelism, the algorithm also prevents the premature convergence that often occurs in genetic algorithms. Thus, it is especially suitable for the real-time requirements of traffic guidance systems and other engineering optimization applications. A case study verifies the efficiency and practicability of the aforementioned algorithm.
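The K-shortest-paths kernel of such a guidance system can be illustrated with a simple best-first enumeration over loopless paths. This is a didactic sketch, not the immune-inspired algorithm of the paper (which avoids repeats via immune memory), and the toy road graph is invented; exhaustive enumeration like this is exponential in the worst case.

```python
import heapq

def k_shortest_paths(graph, source, target, K):
    """Best-first enumeration of the K shortest loopless paths.
    graph: {node: [(neighbor, weight), ...]}.  Partial simple paths
    are popped in nondecreasing cost order, so paths reach the target
    in ranked order."""
    heap = [(0, [source])]
    found = []
    while heap and len(found) < K:
        cost, path = heapq.heappop(heap)
        node = path[-1]
        if node == target:
            found.append((cost, path))
            continue
        for nbr, w in graph.get(node, []):
            if nbr not in path:                 # keep paths loopless
                heapq.heappush(heap, (cost + w, path + [nbr]))
    return found

# Toy road network
g = {'A': [('B', 1), ('C', 4)],
     'B': [('C', 1), ('D', 5)],
     'C': [('D', 1)],
     'D': []}
paths = k_shortest_paths(g, 'A', 'D', 3)
# ranked: A-B-C-D cost 3, then A-C-D cost 5, then A-B-D cost 6
```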

  3. Solving multiconstraint assignment problems using learning automata.

    Science.gov (United States)

    Horn, Geir; Oommen, B John

    2010-02-01

    This paper considers the NP-hard problem of object assignment with respect to multiple constraints: assigning a set of elements (or objects) into mutually exclusive classes (or groups), where the elements which are "similar" to each other are hopefully located in the same class. The literature reports solutions in which the similarity constraint consists of a single index that is inappropriate for the type of multiconstraint problems considered here and where the constraints could simultaneously be contradictory. This feature, where we permit possibly contradictory constraints, distinguishes this paper from the state of the art. Indeed, we are aware of no learning automata (or other heuristic) solutions which solve this problem in its most general setting. Such a scenario is illustrated with the static mapping problem, which consists of distributing the processes of a parallel application onto a set of computing nodes. This is a classical and yet very important problem within the areas of parallel computing, grid computing, and cloud computing. We have developed four learning-automata (LA)-based algorithms to solve this problem: First, a fixed-structure stochastic automata algorithm is presented, where the processes try to form pairs to go onto the same node. This algorithm solves the problem, although it requires some centralized coordination. As it is desirable to avoid centralized control, we subsequently present three different variable-structure stochastic automata (VSSA) algorithms, which have superior partitioning properties in certain settings, although they forfeit some of the scalability features of the fixed-structure algorithm. All three VSSA algorithms model the processes as automata having first the hosting nodes as possible actions; second, the processes as possible actions; and, third, attempting to estimate the process communication digraph prior to probabilistically mapping the processes. 
This paper, which, we believe, comprehensively reports the
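As a rough illustration of the variable-structure automata the abstract describes, the classic linear reward-inaction (L_RI) update can be sketched as below: on reward the probability mass shifts toward the chosen action, on penalty nothing changes. The three "node" actions and their reward probabilities are hypothetical, and this sketch is not the authors' mapping algorithm.

```python
import random

def lri_automaton(actions, reward_prob, lr=0.05, steps=4000, seed=5):
    """Linear reward-inaction (L_RI) learning automaton.
    reward_prob maps each action to the environment's chance of
    rewarding it; the automaton converges toward the best action."""
    rng = random.Random(seed)
    p = {a: 1.0 / len(actions) for a in actions}
    for _ in range(steps):
        a = rng.choices(actions, weights=[p[x] for x in actions])[0]
        if rng.random() < reward_prob[a]:    # reward: move mass toward a
            for b in actions:
                p[b] *= (1.0 - lr)
            p[a] += lr
        # penalty: inaction -- probabilities stay unchanged
    return p

# Hypothetical: a process choosing among three hosting nodes
probs = lri_automaton(['node0', 'node1', 'node2'],
                      {'node0': 0.2, 'node1': 0.8, 'node2': 0.4})
```

With a small learning rate the automaton is epsilon-optimal: the action probability vector concentrates on the most frequently rewarded action, which is the mechanism the paper's VSSA variants build on.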

  4. Analytical Provision of Management of Intangible Assets

    Directory of Open Access Journals (Sweden)

    Shelest Viktoriya S.

    2013-11-01

Full Text Available The goal of the article is to study the process of economic analysis of such a complex product of the innovation and information society as objects of intellectual property, which are recognized in business accounting as intangible assets. Pervasive integration processes in the economy and the large-scale propagation of information technologies influence the capital structure. Thus, recognizing intangible assets as a driving factor of competitiveness, enterprises prefer these assets and reduce the share of tangible assets. With this in mind, scholars have thoroughly studied the economic analysis of intangible assets, since the resulting data are the main source of the accounting and analytical information required for making sound managerial decisions. At the same time, the issues of the authenticity, accuracy, efficiency and transparency of the obtained results become topical. The article shows the information content of accounting and analytical data arising from the recognition and economic analysis of intangible assets. It reviews the current state of methods for analyzing intangible assets based on the views of researchers, characterizes the economic and legal state of development of licence agreements in Ukraine, justifies the economic expediency of using such agreements, and outlines ways of making efficient managerial decisions on the use of intangible assets in the economic activity of business entities.

  5. Contact replacement for NMR resonance assignment.

    Science.gov (United States)

    Xiong, Fei; Pandurangan, Gopal; Bailey-Kellogg, Chris

    2008-07-01

Complementing its traditional role in structural studies of proteins, nuclear magnetic resonance (NMR) spectroscopy is playing an increasingly important role in functional studies. NMR dynamics experiments characterize motions involved in target recognition, ligand binding, etc., while NMR chemical shift perturbation experiments identify and localize protein-protein and protein-ligand interactions. The key bottleneck in these studies is to determine the backbone resonance assignment, which allows spectral peaks to be mapped to specific atoms. This article develops a novel approach to address that bottleneck, exploiting an available X-ray structure or homology model to assign the entire backbone from a set of relatively fast and cheap NMR experiments. We formulate contact replacement for resonance assignment as the problem of computing correspondences between a contact graph representing the structure and an NMR graph representing the data; the NMR graph is a significantly corrupted, ambiguous version of the contact graph. We first show that by combining connectivity and amino acid type information, and exploiting the random structure of the noise, one can provably determine unique correspondences in polynomial time with high probability, even in the presence of significant noise (a constant number of noisy edges per vertex). We then detail an efficient randomized algorithm and show that, over a variety of experimental and synthetic datasets, it is robust to typical levels of structural variation (1-2 Å), noise (250-600%) and missing data (10-40%). Our algorithm achieves very good overall assignment accuracy: above 80% in alpha-helices, 70% in beta-sheets and 60% in loop regions. Our contact replacement algorithm is implemented in platform-independent Python code. The software can be freely obtained for academic use by request from the authors.

  6. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses this information to help the artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.
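For readers unfamiliar with the baseline being hybridized, a bare-bones ABC sketch (employed-bee search plus scout restarts, without the copula EDA layer) might look as follows. The objective function and all parameter values are illustrative assumptions, not the paper's experimental setup.

```python
import random

def abc_minimize(f, dim, bounds, n_food=10, limit=20, iters=200, seed=1):
    """Minimal artificial bee colony sketch: each food source is
    perturbed in one random dimension toward a random partner source;
    sources that fail to improve `limit` times are abandoned and
    re-scouted at a random position."""
    rng = random.Random(seed)
    lo, hi = bounds
    foods = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    trials = [0] * n_food
    for _ in range(iters):
        for i in range(n_food):
            k = rng.randrange(n_food - 1)
            if k >= i:
                k += 1                              # random partner != i
            j = rng.randrange(dim)
            phi = rng.uniform(-1.0, 1.0)
            cand = foods[i][:]
            cand[j] += phi * (foods[i][j] - foods[k][j])
            cand[j] = min(hi, max(lo, cand[j]))     # clip to bounds
            if f(cand) < f(foods[i]):               # greedy replacement
                foods[i], trials[i] = cand, 0
            else:
                trials[i] += 1
            if trials[i] > limit:                   # scout: restart source
                foods[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                trials[i] = 0
    return min(foods, key=f)

sphere = lambda v: sum(x * x for x in v)
best = abc_minimize(sphere, dim=2, bounds=(-5.0, 5.0))
```

ACABC replaces the purely local partner-based perturbation with samples from a copula-estimated distribution model, which is what injects knowledge of variable correlations into the search.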

  7. Evaluation of the Effect of Non-Current Fixed Assets on Profitability and Asset Management Efficiency

    Science.gov (United States)

    Lubyanaya, Alexandra V.; Izmailov, Airat M.; Nikulina, Ekaterina Y.; Shaposhnikov, Vladislav A.

    2016-01-01

The purpose of this article is to investigate how non-current fixed assets affect profitability and asset management efficiency. Tangible assets, intangible assets and financial assets are all included in non-current fixed assets. The aim of the research is to identify the impact of estimates and valuation in…

  8. Explorations of the implementation of a parallel IDW interpolation algorithm in a Linux cluster-based parallel GIS

    Science.gov (United States)

    Huang, Fang; Liu, Dingsheng; Tan, Xicheng; Wang, Jian; Chen, Yunping; He, Binbin

    2011-04-01

    To design and implement an open-source parallel GIS (OP-GIS) based on a Linux cluster, the parallel inverse distance weighting (IDW) interpolation algorithm has been chosen as an example to explore the working model and the principle of algorithm parallel pattern (APP), one of the parallelization patterns for OP-GIS. Based on an analysis of the serial IDW interpolation algorithm of GRASS GIS, this paper has proposed and designed a specific parallel IDW interpolation algorithm, incorporating both single process, multiple data (SPMD) and master/slave (M/S) programming modes. The main steps of the parallel IDW interpolation algorithm are: (1) the master node packages the related information, and then broadcasts it to the slave nodes; (2) each node calculates its assigned data extent along one row using the serial algorithm; (3) the master node gathers the data from all nodes; and (4) iterations continue until all rows have been processed, after which the results are outputted. According to the experiments performed in the course of this work, the parallel IDW interpolation algorithm can attain an efficiency greater than 0.93 compared with similar algorithms, which indicates that the parallel algorithm can greatly reduce processing time and maximize speed and performance.
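To make the row-wise decomposition concrete, here is a minimal serial sketch of IDW with grid rows as independent work units. The plain `map` standing in for the master/slave distribution is an illustrative simplification (the paper targets a Linux cluster and GRASS GIS, which are not reproduced here), and the sample points and grid are invented.

```python
import math

def idw_row(y, xs, samples, power=2.0):
    """Interpolate one grid row by inverse distance weighting:
    z(p) = sum(w_i * z_i) / sum(w_i), with w_i = 1 / d(p, s_i)^power.
    A point coinciding with a sample takes the sample value exactly."""
    row = []
    for x in xs:
        num = den = 0.0
        for sx, sy, sz in samples:
            d = math.hypot(x - sx, y - sy)
            if d == 0.0:
                num, den = sz, 1.0           # exact hit on a sample
                break
            w = d ** -power
            num += w * sz
            den += w
        row.append(num / den)
    return row

# Master/slave flavor: rows are independent, so the `map` below could
# be a process pool's map distributing rows to slave nodes; here it
# runs serially for clarity.
samples = [(0, 0, 1.0), (4, 4, 5.0)]
xs = [0, 1, 2, 3, 4]
grid = [idw_row(y, xs, samples) for y in [0, 1, 2, 3, 4]]
```

Because each output row depends only on the (broadcast) sample set, the master can scatter row indices, gather finished rows, and iterate, exactly the four-step loop described in the abstract.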

  9. Multi-Working Modes Product-Color Planning Based on Evolutionary Algorithms and Swarm Intelligence

    Directory of Open Access Journals (Sweden)

    Man Ding

    2010-01-01

Full Text Available In order to assist designers in color planning during product development, a novel synthesized evaluation method is presented to evaluate color-combination schemes of multi-working-mode products (MMPs). The proposed evaluation method considers color-combination images in different working modes as evaluation attributes, to which corresponding weights are assigned for synthesized evaluation. A mathematical model is then developed to search for optimal color-combination schemes of MMPs based on the proposed evaluation method and two powerful search techniques known as Evolutionary Algorithms (EAs) and Swarm Intelligence (SI). In the experiments, we present a comparative study of two EAs, namely the Genetic Algorithm (GA) and Differential Evolution (DE), and one SI algorithm, namely Particle Swarm Optimization (PSO), on searching for color-combination schemes for the MMP problem. All of the algorithms are evaluated against a test scenario, namely an arm-type aerial work platform, which has two working modes. The results show that DE obtains superior solutions to the other two algorithms for the color-combination scheme searching problem in terms of optimization accuracy and computational robustness. Simulation results demonstrate that the proposed method is feasible and efficient.
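Since the record compares GA, DE and PSO only at a high level, a generic DE/rand/1/bin sketch may help readers unfamiliar with the winning algorithm. The continuous sphere objective stands in for the paper's weighted color-scheme evaluation, and all parameter values are illustrative assumptions.

```python
import random

def de_minimize(f, dim, bounds, pop_size=15, F=0.5, CR=0.9, gens=150, seed=3):
    """Classic DE/rand/1/bin: mutant v = a + F*(b - c), binomial
    crossover with rate CR (one gene forced from the mutant), then
    greedy one-to-one selection against the parent."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([p for j, p in enumerate(pop) if j != i], 3)
            jrand = rng.randrange(dim)           # force one mutant gene
            trial = [
                min(hi, max(lo, a[j] + F * (b[j] - c[j])))
                if (rng.random() < CR or j == jrand) else pop[i][j]
                for j in range(dim)
            ]
            if f(trial) <= f(pop[i]):            # greedy selection
                pop[i] = trial
    return min(pop, key=f)

sphere = lambda v: sum(x * x for x in v)
best = de_minimize(sphere, dim=2, bounds=(-5.0, 5.0))
```

The difference-vector mutation adapts step sizes to the population's spread, which is one commonly cited reason DE tends to be robust on continuous weighted-attribute objectives like the one in this record.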

  10. SPHINX--an algorithm for taxonomic binning of metagenomic sequences.

    Science.gov (United States)

    Mohammed, Monzoorul Haque; Ghosh, Tarini Shankar; Singh, Nitin Kumar; Mande, Sharmila S

    2011-01-01

    Compared with composition-based binning algorithms, the binning accuracy and specificity of alignment-based binning algorithms is significantly higher. However, being alignment-based, the latter class of algorithms require enormous amount of time and computing resources for binning huge metagenomic datasets. The motivation was to develop a binning approach that can analyze metagenomic datasets as rapidly as composition-based approaches, but nevertheless has the accuracy and specificity of alignment-based algorithms. This article describes a hybrid binning approach (SPHINX) that achieves high binning efficiency by utilizing the principles of both 'composition'- and 'alignment'-based binning algorithms. Validation results with simulated sequence datasets indicate that SPHINX is able to analyze metagenomic sequences as rapidly as composition-based algorithms. Furthermore, the binning efficiency (in terms of accuracy and specificity of assignments) of SPHINX is observed to be comparable with results obtained using alignment-based algorithms. A web server for the SPHINX algorithm is available at http://metagenomics.atc.tcs.com/SPHINX/.

  11. Japanese views on ASSET

    Energy Technology Data Exchange (ETDEWEB)

Hirano, M [Department of Reactor Safety Research, Japan Atomic Energy Research Inst. (Japan)]

    1997-10-01

    The presentation briefly reviews the following aspects directed to ensuring NPP safety: Japanese participation in ASSET activities; views to ASSET activities; recent operating experience in Japan; future ASSET activities.

  12. Japanese views on ASSET

    International Nuclear Information System (INIS)

    Hirano, M.

    1997-01-01

    The presentation briefly reviews the following aspects directed to ensuring NPP safety: Japanese participation in ASSET activities; views to ASSET activities; recent operating experience in Japan; future ASSET activities

  13. Estimating the value of a Country's built assets: investment-based exposure modelling for global risk assessment

    Science.gov (United States)

    Daniell, James; Pomonis, Antonios; Gunasekera, Rashmin; Ishizawa, Oscar; Gaspari, Maria; Lu, Xijie; Aubrecht, Christoph; Ungar, Joachim

    2017-04-01

In order to quantify disaster risk, there is a demand and need to determine a consistent and reliable economic value of the built assets exposed to natural hazards at the national or sub-national level. The value of the built stock of a city or a country is critical for risk modelling applications, as it allows the upper bound on potential losses to be established. Under the World Bank probabilistic disaster risk assessment Country Disaster Risk Profiles (CDRP) Program and rapid post-disaster loss analyses in CATDAT, key methodologies have been developed that quantify the asset exposure of a country. In this study, we assess two complementary methods of determining the value of the building stock: capital investment data versus aggregated ground-up values based on built area and unit cost of construction. Different approaches to modelling exposure around the world have resulted in estimated values of the built assets of some countries differing by order(s) of magnitude. Using the aforementioned methodology of comparing investment-based capital stock and bottom-up unit-cost-of-construction values per square meter of assets, a suitable range of capital stock estimates for built assets has been created. A blind test format was undertaken to compare the two types of approaches, top-down (investment) and bottom-up (construction cost per unit). In many cases, census, demographic, engineering and construction cost data are key for bottom-up calculations from previous years. Similarly, for the top-down investment approach, distributed GFCF (Gross Fixed Capital Formation) data are also required. Over the past few years, numerous studies have been undertaken through the World Bank Caribbean and Central America disaster risk assessment program adopting this methodology, initially developed by Gunasekera et al. (2015). The range of values of the building stock is tested for around 15 countries. In addition, three types of costs - Reconstruction cost

  14. AN ECOSYSTEM PERSPECTIVE ON ASSET MANAGEMENT INFORMATION

    Directory of Open Access Journals (Sweden)

    Lasse METSO

    2017-07-01

    Full Text Available Big Data and Internet of Things will increase the amount of data on asset management exceedingly. Data sharing with an increased number of partners in the area of asset management is important when developing business opportunities and new ecosystems. An asset management ecosystem is a complex set of relationships between parties taking part in asset management actions. In this paper, the current barriers and benefits of data sharing are identified based on the results of an interview study. The main benefits are transparency, access to data and reuse of data. New services can be created by taking advantage of data sharing. The main barriers to sharing data are an unclear view of the data sharing process and difficulties to recognize the benefits of data sharing. For overcoming the barriers in data sharing, this paper applies the ecosystem perspective on asset management information. The approach is explained by using the Swedish railway industry as an example.

  15. An Ecosystem Perspective On Asset Management Information

    Science.gov (United States)

    Metso, Lasse; Kans, Mirka

    2017-09-01

    Big Data and Internet of Things will increase the amount of data on asset management exceedingly. Data sharing with an increased number of partners in the area of asset management is important when developing business opportunities and new ecosystems. An asset management ecosystem is a complex set of relationships between parties taking part in asset management actions. In this paper, the current barriers and benefits of data sharing are identified based on the results of an interview study. The main benefits are transparency, access to data and reuse of data. New services can be created by taking advantage of data sharing. The main barriers to sharing data are an unclear view of the data sharing process and difficulties to recognize the benefits of data sharing. For overcoming the barriers in data sharing, this paper applies the ecosystem perspective on asset management information. The approach is explained by using the Swedish railway industry as an example.

  16. A study on intangible assets disclosure: An evidence from Indian companies

    Directory of Open Access Journals (Sweden)

    Subash Chander

    2011-04-01

Full Text Available Purpose: India has emerged at the top of the pedestal in the present knowledge-driven global marketplace, where intangible assets hold much more value than physical assets. The objective of this study is to determine the extent of intangible asset disclosure by companies in India. Design/methodology/approach: This study relates to the years 2003-04 and 2007-08 and is based on 243 companies selected from the BT-500 companies. The annual reports of these companies were analyzed using content analysis so as to examine the level of disclosure of intangible asset information. An intangible assets disclosure index based on the intangible assets framework given by Sveiby (1997), as used and tested by Guthrie and Petty (2000) and many subsequent studies, was modified and used for this study. Findings: The results showed that external capital is the most disclosed intangible asset category, with a disclosure score of 37.90% and 35.83% in the years 2003-04 and 2007-08 respectively. Infosys Technologies Ltd. is the company with the highest intangible assets reporting for both years (2003-04: 68.52%; 2007-08: 81.48%). Further, the reporting of intangible assets is unorganized and unsystematic, and there is a lack of an appropriate framework for disclosing intangible asset information in annual reports. Originality/value: This is perhaps the first comprehensive study on intangible assets disclosure based on a large sample of companies from India. The literature reveals that intangible assets now play an increasingly significant role in the decision-making process of various users of corporate reports. This study shows that the overall disclosure of intangible assets is low in India. Thus this study may be of value to the corporate sector in India in exploring the areas of intangible assets disclosure so that they can provide useful and relevant information to the users of annual reports.

  17. Fluidity models in ancient Greece and current practices of sex assignment.

    Science.gov (United States)

    Chen, Min-Jye; McCann-Crosby, Bonnie; Gunn, Sheila; Georgiadis, Paraskevi; Placencia, Frank; Mann, David; Axelrad, Marni; Karaviti, L P; McCullough, Laurence B

    2017-06-01

    Disorders of sexual differentiation such as androgen insensitivity and gonadal dysgenesis can involve an intrinsic fluidity at different levels, from the anatomical and biological to the social (gender) that must be considered in the context of social constraints. Sex assignment models based on George Engel's biopsychosocial aspects model of biology accept fluidity of gender as a central concept and therefore help establish expectations within the uncertainty of sex assignment and anticipate potential changes. The biology underlying the fluidity inherent to these disorders should be presented to parents at diagnosis, an approach that the gender medicine field should embrace as good practice. Greek mythology provides many accepted archetypes of change, and the ancient Greek appreciation of metamorphosis can be used as context with these patients. Our goal is to inform expertise and optimal approaches, knowing that this fluidity may eventually necessitate sex reassignment. Physicians should provide sex assignment education based on different components of sexual differentiation, prepare parents for future hormone-triggered changes in their children, and establish a sex-assignment algorithm. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Fluidity models in ancient Greece and current practices of sex assignment

    Science.gov (United States)

    Chen, Min-Jye; McCann-Crosby, Bonnie; Gunn, Sheila; Georgiadis, Paraskevi; Placencia, Frank; Mann, David; Axelrad, Marni; Karaviti, L.P; McCullough, Laurence B.

    2018-01-01

    Disorders of sexual differentiation such as androgen insensitivity and gonadal dysgenesis can involve an intrinsic fluidity at different levels, from the anatomical and biological to the social (gender) that must be considered in the context of social constraints. Sex assignment models based on George Engel’s biopsychosocial aspects model of biology accept fluidity of gender as a central concept and therefore help establish expectations within the uncertainty of sex assignment and anticipate potential changes. The biology underlying the fluidity inherent to these disorders should be presented to parents at diagnosis, an approach that the gender medicine field should embrace as good practice. Greek mythology provides many accepted archetypes of change, and the ancient Greek appreciation of metamorphosis can be used as context with these patients. Our goal is to inform expertise and optimal approaches, knowing that this fluidity may eventually necessitate sex reassignment. Physicians should provide sex assignment education based on different components of sexual differentiation, prepare parents for future hormone-triggered changes in their children, and establish a sex-assignment algorithm. PMID:28478088

  19. Automated Bug Assignment: Ensemble-based Machine Learning in Large Scale Industrial Contexts

    OpenAIRE

    Jonsson, Leif; Borg, Markus; Broman, David; Sandahl, Kristian; Eldh, Sigrid; Runeson, Per

    2016-01-01

    Bug report assignment is an important part of software maintenance. In particular, incorrect assignments of bug reports to development teams can be very expensive in large software development projects. Several studies propose automating bug assignment techniques using machine learning in open source software contexts, but no study exists for large-scale proprietary projects in industry. The goal of this study is to evaluate automated bug assignment techniques that are based on machine learni...

  20. Engineering Asset Management and Infrastructure Sustainability : Proceedings of the 5th World Congress on Engineering Asset Management

    CERN Document Server

    Ma, Lin; Tan, Andy; Weijnen, Margot; Lee, Jay

    2012-01-01

Engineering Asset Management 2010 represents state-of-the-art trends and developments in the emerging field of engineering asset management as presented at the Fifth World Congress on Engineering Asset Management (WCEAM). The proceedings of WCEAM 2010 are an excellent reference for practitioners, researchers and students in the multidisciplinary field of asset management, covering topics such as: asset condition monitoring and intelligent maintenance; asset data warehousing, data mining and fusion; asset performance and level-of-service models; design and life-cycle integrity of physical assets; education and training in asset management; engineering standards in asset management; fault diagnosis and prognostics; financial analysis methods for physical assets; human dimensions in integrated asset management; information quality management; information systems and knowledge management; intelligent sensors and devices; maintenance strategies in asset management; optimisation decisions in asset management; risk management ...

  1. Adaptive algorithm for predicting increases in central loads of electrical energy systems

    Energy Technology Data Exchange (ETDEWEB)

    Arbachyauskene, N A; Pushinaytis, K V

    1982-01-01

    An adaptive algorithm, based on the Kalman filter, is suggested for predicting increases in central loads of an electrical energy system as part of state evaluation. The a priori noise characteristics, which are assigned with low accuracy, are used only at the beginning of the calculation to compute the filter gain; thereafter the gain is computed from the innovation sequence. This approach makes it possible to correct errors in the assigned statistical noise characteristics and to follow their changes. The algorithm is verified experimentally.
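
    The scheme described above, in which a poorly assigned a priori noise level is corrected from the innovation sequence, can be sketched as follows. This is an illustrative scalar filter, not the authors' implementation; the random-walk load model, window size, and noise values are assumptions.

```python
import random

def adaptive_kalman(measurements, q=0.01, r_init=1.0, window=10):
    """Scalar Kalman filter whose measurement-noise estimate R is
    re-estimated from the recent innovation sequence, so a poor
    a priori noise assignment is corrected as data arrive."""
    x, p, r = 0.0, 1.0, r_init
    innovations, estimates = [], []
    for z in measurements:
        # Predict step (random-walk model for the load increment).
        p += q
        # Innovation, gain, and update.
        nu = z - x
        k = p / (p + r)
        x += k * nu
        p *= (1.0 - k)
        # Adapt R from the sample variance of recent innovations.
        innovations.append(nu)
        recent = innovations[-window:]
        if len(recent) >= window:
            mean = sum(recent) / len(recent)
            var = sum((v - mean) ** 2 for v in recent) / len(recent)
            r = max(var - p, 1e-6)
        estimates.append(x)
    return estimates

# Synthetic load measurements around 5.0, with a deliberately wrong
# prior R to show the adaptation at work.
random.seed(1)
zs = [5.0 + random.gauss(0.0, 0.5) for _ in range(200)]
est = adaptive_kalman(zs, r_init=50.0)
```

Despite starting with R off by two orders of magnitude, the filter settles near the true load once the innovation window fills.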

  2. Class Schedule Assignment Based on Students Learning Rhythms Using A Genetic Algorithm Asignación de horarios de clase basado en los ritmos de aprendizaje de los estudiantes usando un algoritmo genético

    Directory of Open Access Journals (Sweden)

    Victor F. Suarez Chilma

    2013-03-01

    Full Text Available The objective of this proposal is to implement a school-day schedule that accounts for the learning rhythms of elementary and secondary school students, using a genetic algorithm. The methodology takes into account the legal requirements and constraints on the assignment of teachers and classrooms in public educational institutions in Colombia. In addition, a set of constraints focused on cognitive rhythms is provided, so that subjects are scheduled at the most convenient times according to their area of knowledge. The genetic algorithm evolves through a process of mutation and selection and builds a complete solution from the best solutions for each group. Sixteen groups in one school are tested, and the resulting class schedule assignments are presented. The quality of the solution obtained through the established approach is validated by comparing the results with the solutions obtained using another algorithm.
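
    The mutate-and-select loop described above can be sketched for a single group. The subjects, cognitive-demand weights, and penalty function below are hypothetical stand-ins for the paper's constraint set.

```python
import random

SLOTS = 6              # periods in the school day
SUBJECTS = ["math", "science", "language", "art", "music", "sport"]
# Hypothetical cognitive-rhythm weights: high-demand subjects are
# penalised when placed in late periods.
DEMAND = {"math": 3, "science": 3, "language": 2,
          "art": 1, "music": 1, "sport": 1}

def cost(schedule):
    # Penalty = cognitive demand x lateness of the assigned period.
    return sum(DEMAND[s] * slot for slot, s in enumerate(schedule))

def mutate(schedule):
    # Swap two periods, preserving the set of subjects.
    a, b = random.sample(range(SLOTS), 2)
    child = schedule[:]
    child[a], child[b] = child[b], child[a]
    return child

def evolve(generations=300, pop_size=20, seed=0):
    random.seed(seed)
    pop = [random.sample(SUBJECTS, SLOTS) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # selection
        pop = survivors + [mutate(random.choice(survivors))
                           for _ in range(pop_size - len(survivors))]
    return min(pop, key=cost)

best = evolve()
```

For these weights the optimum places the demanding subjects in the earliest periods, which the swap mutation reaches quickly.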

  3. Accounting treatment of intangible assets

    OpenAIRE

    Gorgieva-Trajkovska, Olivera; Koleva, Blagica; Georgieva Svrtinov, Vesna

    2015-01-01

    The accounting for fixed assets is, in many cases, a straightforward exercise, but it isn’t always so when it comes to the issue of intangible fixed assets and recognizing such assets on the balance sheet. IAS 38, In¬tan¬gi¬ble Assets, outlines the accounting re¬quire¬ments for in¬tan¬gi¬ble assets, which are non-mon¬e¬tary assets which are without physical substance and iden¬ti¬fi¬able (either being separable or arising from con¬trac¬tual or other legal rights). In¬tan¬gi¬ble assets meeting ...

  4. Regret Theory and Equilibrium Asset Prices

    Directory of Open Access Journals (Sweden)

    Jiliang Sheng

    2014-01-01

    Full Text Available Regret theory is a behavioral approach to decision making under uncertainty. In this paper we assume that there are two representative investors in a frictionless market: a representative active investor who selects his optimal portfolio based on regret theory, and a representative passive investor who invests only in the benchmark portfolio. In a partial equilibrium setting, the objective of the representative active investor is modeled as minimization of the regret about final wealth relative to the benchmark portfolio. In equilibrium this optimal strategy gives rise to a behavioral asset pricing model. We show that the market beta and the benchmark beta that is related to the investor’s regret are the determinants of equilibrium asset prices. We also extend our model to a market with multiple benchmark portfolios. Empirical tests using stock price data from the Shanghai Stock Exchange show strong support for the asset pricing model based on regret theory.

  5. A Scheduling Algorithm for Cloud Computing System Based on the Driver of Dynamic Essential Path.

    Science.gov (United States)

    Xie, Zhiqiang; Shao, Xia; Xin, Yu

    2016-01-01

    To solve the task scheduling problem in cloud computing systems, this paper proposes a scheduling algorithm based on the driver of dynamic essential path (DDEP). The algorithm applies a predecessor-task layer priority strategy to handle the constraint relations among task nodes: each task node is assigned a priority value reflecting the scheduling order imposed by those constraints, and the task node list is generated from these priority values. To break ties among task nodes with the same priority value, a dynamic essential long path strategy is proposed. This strategy computes the dynamic essential path of the pre-scheduled task nodes from the actual computation and communication costs of the task nodes observed during scheduling. The task node with the longest dynamic essential path is scheduled first, since the completion time of the task graph is indirectly determined by the finishing times of the task nodes on that path. Finally, we evaluate the proposed algorithm via simulation experiments in Matlab. The experimental results indicate that the proposed algorithm can effectively reduce the task makespan in most cases and meets a high-quality performance objective.
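
    A single-machine sketch of the priority idea: tasks become ready when their predecessors finish, and ties are broken in favour of the task with the longest remaining ("essential") path. The task graph and costs below are invented for illustration, and the multi-machine cost updates of the actual DDEP algorithm are omitted.

```python
from collections import defaultdict
from functools import lru_cache

# Hypothetical task graph: edges give precedence, costs are run times.
edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
cost = {"A": 2, "B": 3, "C": 1, "D": 2}

@lru_cache(maxsize=None)
def essential_path(task):
    """Length of the longest path from `task` to the exit node,
    including the task's own cost."""
    return cost[task] + max((essential_path(s) for s in edges[task]),
                            default=0)

def schedule(tasks):
    """List-schedule: among ready tasks, run the one whose essential
    path is longest (a one-machine sketch of the DDEP tie-break)."""
    indeg = defaultdict(int)
    for t in tasks:
        for s in edges[t]:
            indeg[s] += 1
    ready = [t for t in tasks if indeg[t] == 0]
    order, clock = [], 0
    while ready:
        ready.sort(key=essential_path, reverse=True)
        t = ready.pop(0)
        order.append(t)
        clock += cost[t]
        for s in edges[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return order, clock

order, makespan = schedule(list(edges))
```

Here B (essential path 5) is preferred over C (essential path 3) once both are ready.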

  6. Accounting of Long-Term Biological Assets

    OpenAIRE

    Valeriy Mossakovskyy; Vasyl Korytnyy

    2015-01-01

    The article generalizes experience in the valuation of long-term biological assets in plant growing and animal breeding, and offers suggestions for improving accounting in this field. Recommendations on accounting for such assets are given based on a study of accounting practice at a specific agricultural company over a long period of time. The authors believe that fair value is applicable only if the price level for agricultural products is fixed by the gov...

  7. Covariance-Based Measurement Selection Criterion for Gaussian-Based Algorithms

    Directory of Open Access Journals (Sweden)

    Fernando A. Auat Cheein

    2013-01-01

    Full Text Available Process modeling by means of Gaussian-based algorithms often suffers from redundant information, which usually increases the computational complexity of the estimation without significantly improving its performance. In this article, a non-arbitrary measurement selection criterion for Gaussian-based algorithms is proposed. The criterion determines the most significant measurement from both an estimation convergence perspective and the covariance matrix associated with the measurement, and it is independent of the nature of the measured variable. The criterion is used in conjunction with three Gaussian-based algorithms: the EIF (Extended Information Filter), the EKF (Extended Kalman Filter), and the UKF (Unscented Kalman Filter), although it can also be applied to other Gaussian-based algorithms. While this work focuses on environment modeling, the results carry over to other applications. Mathematical descriptions and implementation results that validate the proposal are included.

  8. Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Surafel Luleseged Tilahun

    2012-01-01

    Full Text Available The firefly algorithm is one of the newer metaheuristic algorithms for optimization problems, inspired by the flashing behavior of fireflies. In the algorithm, randomly generated solutions are treated as fireflies, and brightness is assigned based on their performance on the objective function. One of the rules used to construct the algorithm is that a firefly is attracted to any brighter firefly, and if no brighter firefly exists, it moves randomly. In this paper we modify this random movement of the brightest firefly by generating random directions in order to determine a direction in which the brightness increases; if no such direction is found, the firefly remains in its current position. Furthermore, the assignment of attractiveness is modified so that the effect of the objective function is magnified. Simulation results show that the modified firefly algorithm performs better than the standard one, finding the best solution in less CPU time.
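
    The modification described above, where the brightest firefly tries random directions and moves only when brightness improves, can be sketched as follows. The parameter values and the sphere test function are assumptions, not the paper's settings.

```python
import random

def modified_firefly(objective, dim=2, n=15, iters=100, seed=0):
    """Firefly search where the brightest firefly, instead of moving
    randomly, samples several random directions and keeps only moves
    that improve the objective (lower is brighter here)."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    beta, alpha = 0.5, 0.2   # attraction strength, step size
    for _ in range(iters):
        pop.sort(key=objective)          # pop[0] is the brightest
        for i in range(1, n):
            # Each firefly moves toward every brighter firefly.
            for j in range(i):
                pop[i] = [xi + beta * (xj - xi) + alpha * rng.uniform(-1, 1)
                          for xi, xj in zip(pop[i], pop[j])]
        # Brightest firefly: try random directions, accept improvements.
        best = pop[0]
        for _ in range(10):
            trial = [x + alpha * rng.uniform(-1, 1) for x in best]
            if objective(trial) < objective(best):
                best = trial
        pop[0] = best
    return min(pop, key=objective)

sphere = lambda x: sum(v * v for v in x)
best = modified_firefly(sphere)
```

Because the brightest firefly only accepts improving moves, its objective value decreases monotonically.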

  9. Substantiation of Biological Assets Classification Indexes for Enhancing Their Accounting Efficiency

    OpenAIRE

    Rayisa Tsyhan; Olha Chubka

    2013-01-01

    Present-day national agricultural companies sell their products in both domestic and foreign markets, which has a significant impact on the specifics of biological assets accounting. The article presents the biological assets classification provided in the Practical Guide to Accounting for Biological Assets, as well as specifications proposed by various scientists. Based on the analysis, the biological assets classification has been supplemented with new classification factors, and their appropriateness ha...

  10. A fast global fitting algorithm for fluorescence lifetime imaging microscopy based on image segmentation.

    Science.gov (United States)

    Pelet, S; Previte, M J R; Laiho, L H; So, P T C

    2004-10-01

    Global fitting algorithms have been shown to effectively improve the accuracy and precision of the analysis of fluorescence lifetime imaging microscopy data. Global analysis performs better than unconstrained data fitting when prior information exists, such as the spatial invariance of the lifetimes of individual fluorescent species. However, the highly coupled nature of global analysis often results in significantly slower convergence of the data fitting algorithm as compared with unconstrained analysis. Convergence can be greatly accelerated by providing appropriate initial guesses. Realizing that image morphology often correlates with fluorophore distribution, a global fitting algorithm has been developed that assigns initial guesses throughout an image based on a segmentation analysis. This algorithm was tested on both simulated data sets and time-domain lifetime measurements. We have successfully measured fluorophore distribution in fibroblasts stained with Hoechst and calcein. The method further allows second harmonic generation from collagen and elastin autofluorescence to be differentiated in fluorescence lifetime imaging microscopy images of ex vivo human skin. In our experimental measurements, this algorithm increased convergence speed by over two orders of magnitude and achieved significantly better fits. Copyright 2004 Biophysical Society
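
    The segmentation-driven initial-guess idea can be sketched on synthetic single-exponential decays: fit each segment's pooled decay once, then hand that lifetime to every pixel in the segment as a starting value for the subsequent fit. The log-linear fit and the noise-free data below are simplifications of the paper's iterative global analysis.

```python
import math

def lifetime_loglinear(decay, dt=1.0):
    """Estimate a single-exponential lifetime by a log-linear
    least-squares fit to the decay curve."""
    ys = [math.log(max(v, 1e-9)) for v in decay]
    n = len(ys)
    xs = list(range(n))
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -dt / slope

def segment_initial_guesses(pixels, labels, dt=1.0):
    """Segmentation-driven initial guesses: pool the decays of each
    segment, fit the pooled decay once, and assign that lifetime to
    every pixel in the segment as its starting value."""
    by_label = {}
    for decay, lab in zip(pixels, labels):
        by_label.setdefault(lab, []).append(decay)
    guess = {lab: lifetime_loglinear(
                 [sum(col) / len(col) for col in zip(*ds)], dt)
             for lab, ds in by_label.items()}
    return [guess[lab] for lab in labels]

# Two synthetic segments with lifetimes 2.0 and 5.0 (noise-free).
pix = ([[math.exp(-t / 2.0) for t in range(10)] for _ in range(4)]
       + [[math.exp(-t / 5.0) for t in range(10)] for _ in range(4)])
labs = [0] * 4 + [1] * 4
g = segment_initial_guesses(pix, labs)
```

On noise-free data the segment fit recovers the lifetimes exactly, so every pixel starts the refinement step essentially at the solution.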

  11. Financier-led asset lease model

    NARCIS (Netherlands)

    Zhao, X.; Angelov, S.A.; Grefen, P.W.P.J.; Meersman, R.A.; Dillon, T.S.

    2010-01-01

    Nowadays, the business globalisation trend drives organisations to spread their business worldwide, which in turn generates vast asset demands. In this context, broader asset channels and higher financial capacities are required to boost the asset lease sector to meet the increasing asset demands.

  12. Asset Pricing - A Brief Review

    OpenAIRE

    Li, Minqiang

    2010-01-01

    I first introduce the early-stage and modern classical asset pricing and portfolio theories. These include: the capital asset pricing model (CAPM), the arbitrage pricing theory (APT), the consumption capital asset pricing model (CCAPM), the intertemporal capital asset pricing model (ICAPM), and some other important modern concepts and techniques. Finally, I discuss the most recent development during the last decade and the outlook in the field of asset pricing.

  13. Assessing Asset Pricing Anomalies

    NARCIS (Netherlands)

    W.A. de Groot (Wilma)

    2017-01-01

    One of the most important challenges in the field of asset pricing is to understand anomalies: empirical patterns in asset returns that cannot be explained by standard asset pricing models. Currently, there is no consensus in the academic literature on the underlying causes of

  14. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is “are there any algorithms that can design evolutionary algorithms automatically?” A more complete statement of the question is “can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?” In this paper, a novel evolutionary algorithm based on automatic designing of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems are conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which shows that algorithms designed automatically by computers can compete with algorithms designed by human beings.

  15. Performance evaluation of distributed wavelength assignment in WDM optical networks

    Science.gov (United States)

    Hashiguchi, Tomohiro; Wang, Xi; Morikawa, Hiroyuki; Aoyama, Tomonori

    2004-04-01

    In WDM wavelength-routed networks, prior to a data transfer, a call setup procedure is required to reserve a wavelength path between the source–destination node pair. A distributed approach to connection setup can achieve very high speed while improving the reliability and reducing the implementation cost of the network. Along with these advantages, however, the distributed scheme poses major challenges in how the management and allocation of wavelengths can be carried out efficiently. In this paper, we apply a distributed wavelength assignment algorithm named priority-based wavelength assignment (PWA), originally proposed for burst-switched optical networks, to the problem of reserving wavelengths in path reservation protocols for distributed-control optical networks. Instead of assigning wavelengths randomly, this approach lets each node select the "safest" wavelengths based on the history of wavelength utilization, so that unnecessary future contention is prevented. The simulation results presented in this paper show that the proposed protocol can enhance the performance of the system without introducing any apparent drawbacks.
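
    A minimal sketch of the "safest wavelength" idea: each node keeps per-wavelength counters of past reservation outcomes and prefers the wavelength with the best history among those currently free. The counter scheme below is an assumed stand-in for the PWA priority function, which the abstract does not specify.

```python
class PWANode:
    """Priority-based wavelength assignment sketch: rather than pick a
    free wavelength at random, pick the one whose reservation history
    (successes vs. contentions) makes it the 'safest' choice."""

    def __init__(self, n_wavelengths):
        # Laplace-smoothed counters so untried wavelengths score 0.5.
        self.success = [1] * n_wavelengths
        self.failure = [1] * n_wavelengths

    def priority(self, w):
        return self.success[w] / (self.success[w] + self.failure[w])

    def choose(self, free):
        # Among currently free wavelengths, take the best history.
        return max(free, key=self.priority)

    def record(self, w, ok):
        (self.success if ok else self.failure)[w] += 1

node = PWANode(4)
# Hypothetical history: wavelength 2 rarely contended, wavelength 0 often.
for _ in range(8):
    node.record(0, False)
    node.record(2, True)
chosen = node.choose(free=[0, 1, 2, 3])
```

The node steers new reservations toward wavelength 2, away from the contention-prone wavelength 0.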

  16. Capital Structure and Assets

    DEFF Research Database (Denmark)

    Flor, Christian Riis

    2008-01-01

    This paper analyzes a firm's capital structure choice when assets have outside value. Valuable assets implicitly provide a collateral and increase tax shield exploitation. The key feature in this paper is asset value uncertainty, implying that it is unknown ex ante whether the equity holders ex p...

  17. Normalization based K means Clustering Algorithm

    OpenAIRE

    Virmani, Deepali; Taneja, Shweta; Malhotra, Geetika

    2015-01-01

    K-means is an effective clustering technique used to separate similar data into groups based on initial centroids of clusters. In this paper, a normalization-based K-means clustering algorithm (N-K means) is proposed. The proposed algorithm applies normalization to the available data prior to clustering, and calculates the initial centroids based on weights. Experimental results demonstrate the improvement of the proposed N-K means clustering algorithm over existing...
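
    The two ingredients named above, normalization before clustering and non-random initial centroids, can be sketched as follows. The deterministic, rank-based initialisation below is an assumption standing in for the paper's weight-based formula, which the abstract does not give.

```python
def minmax_normalize(data):
    """Scale every feature to [0, 1] before clustering, so that no
    single large-valued feature dominates the distance computation."""
    cols = list(zip(*data))
    lo = [min(c) for c in cols]
    span = [(max(c) - m) or 1.0 for c, m in zip(cols, lo)]
    return [[(v - m) / s for v, m, s in zip(row, lo, span)]
            for row in data]

def kmeans(data, k, iters=20):
    # Deterministic initial centroids: evenly spaced points of the
    # data ranked by feature sum (a stand-in for weight-based
    # initialisation).
    ranked = sorted(data, key=sum)
    cents = [ranked[i * (len(data) - 1) // max(k - 1, 1)]
             for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in data:
            j = min(range(k),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, cents[c])))
            groups[j].append(p)
        cents = [[sum(col) / len(col) for col in zip(*g)] if g
                 else cents[j] for j, g in enumerate(groups)]
    return cents, groups

# Second feature is on a far larger scale; normalization equalises it.
raw = [[1, 1000], [2, 1100], [10, 50000], [11, 52000]]
cents, groups = kmeans(minmax_normalize(raw), k=2)
```

Without the normalization step, the second feature alone would decide every assignment.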

  18. Comprehensive transportation asset management : making a business case and prioritizing assets for inclusion in formal asset management programs.

    Science.gov (United States)

    2011-12-01

    Several agencies are applying asset management principles as a business tool and paradigm to help them define goals and prioritize agency resources in decision making. Previously, transportation asset management (TAM) has focused more on big ticke...

  19. CLUSTER ANALYSIS OF TOTAL ASSETS PROVIDED BY BANKS FROM FOUR CONTINENTS

    Directory of Open Access Journals (Sweden)

    MIRELA CĂTĂLINA TÜRKEȘ

    2017-08-01

    Full Text Available The paper analyses the total assets achieved in 2016 by the 96 strongest banks from four continents: Europe, America, Asia and Africa. It aims to evaluate the level of total assets provided by banks in 2016 and the degree of differentiation among the continental banking markets, in order to determine the overall condition of the banks. The methodology used in this study is based on cluster and descriptive analysis. The data set was built from the information on total assets reported by the banks. The results indicate that most total banking assets are found in Asia and the fewest in Africa. At the end of 2016, the top 16 global banks, which form cluster 1 in the data set, owned total assets of $30.19 trillion, with centroid (2.25, 2.11, 3.06, 0.01).

  20. A new algorithm for coding geological terminology

    Science.gov (United States)

    Apon, W.

    The Geological Survey of The Netherlands has developed an algorithm to convert the plain geological language of lithologic well logs into codes suitable for computer processing and to link these to existing plotting programs. The algorithm is based on the "direct method" and operates in three steps: (1) searching for defined word combinations and assigning codes; (2) deleting duplicated codes; (3) correcting incorrect code combinations. Two simple auxiliary files are used. A simple PC demonstration program is included to enable readers to experiment with this algorithm. The Department of Quaternary Geology of the Geological Survey of The Netherlands possesses a large database of shallow lithologic well logs in plain language and has been using a program based on this algorithm for about 3 yr. Erroneous codes resulting from this algorithm amount to less than 2%.
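
    The three-step direct method can be sketched as follows. The lexicon and the invalid-combination rule are invented miniatures of the Survey's two auxiliary files.

```python
# Hypothetical lexicon and compatibility rule; the Survey's actual
# auxiliary files are far larger.
LEXICON = {"coarse sand": "S3", "fine sand": "S1",
           "sand": "S2", "clay": "C1"}
INVALID = {("S1", "S3")}   # 'fine' and 'coarse' codes cannot co-occur

def code_log(description):
    """Three-step direct method: (1) match defined word combinations,
    longest first, and assign codes; (2) delete duplicated codes;
    (3) correct incorrect code combinations by dropping later codes."""
    text, codes = description.lower(), []
    # Step 1: longest phrases first, so 'fine sand' wins over 'sand'.
    for phrase in sorted(LEXICON, key=len, reverse=True):
        while phrase in text:
            codes.append(LEXICON[phrase])
            text = text.replace(phrase, " ", 1)
    # Step 2: delete duplicates, preserving first occurrence.
    seen, unique = set(), []
    for c in codes:
        if c not in seen:
            seen.add(c)
            unique.append(c)
    # Step 3: drop any code that forms an invalid combination with an
    # already-accepted code.
    result = []
    for c in unique:
        if all((p, c) not in INVALID and (c, p) not in INVALID
               for p in result):
            result.append(c)
    return result

codes = code_log("Fine sand with clay traces, fine sand layers")
```

The repeated "fine sand" yields a duplicate S1, which step 2 removes; step 3 would reject a later "coarse sand" code.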

  1. Proactive pavement asset management with climate change aspects

    Science.gov (United States)

    Zofka, Adam

    2018-05-01

    A Pavement Asset Management System is a systematic and objective tool for managing a pavement network based on rational engineering and economic principles. Once implemented and mature, a Pavement Asset Management System serves the entire range of users, from maintenance engineers to decision-makers. Such a system is necessary to coordinate an agency's management strategy, including proactive maintenance. Basic inputs in the majority of existing Pavement Asset Management System approaches comprise the actual pavement inventory with associated construction history and condition, traffic information, and various economic parameters. Some approaches also include weather aspects, which are of particular importance given ongoing climate change. This paper presents the challenges of implementing a Pavement Asset Management System for those National Road Administrations that manage their pavement assets using more traditional strategies, e.g. the worst-first approach. Special consideration is given to weather-related inputs and the associated analysis to demonstrate the effects of climate change over short- and long-term ranges. Based on the presented examples, this paper concludes that National Road Administrations should account for weather-related factors in their Pavement Management Systems, as these have a significant impact on system outcomes from a safety and economic perspective.

  2. Securitization of residential solar photovoltaic assets: Costs, risks and uncertainty

    International Nuclear Information System (INIS)

    Alafita, T.; Pearce, J.M.

    2014-01-01

    Limited access to low-cost financing is an impediment to high-velocity technological diffusion and high grid penetration of solar photovoltaic (PV) technology. Securitization of solar assets provides a potential solution to this problem. This paper assesses the viability of solar asset-backed securities (ABS) as a lower-cost financing mechanism and identifies policies that could facilitate the implementation of securitization. First, traditional solar financing is examined to provide a baseline for cost comparisons. Next, the securitization process is modeled. The model enables identification of several junctures at which risk and uncertainty influence costs. Next, parameter values are assigned and used to generate cost estimates. Results show that, under reasonable assumptions, securitization of solar power purchase agreements (PPA) can significantly reduce project financing costs, suggesting that securitization is a viable mechanism for improving the financing of PV projects. The clear impediment to the successful launch of a solar ABS is measuring and understanding the riskiness of the underlying assets. This study identifies three classes of policy intervention that lower the cost of ABS by reducing risk or by improving the measurement of risk: (i) standardization of contracts and the contracting process, (ii) improved access to contract and equipment performance data, and (iii) geographic diversification. - Highlights: • Limited access to low-cost financing is hampering penetration of solar PV. • Solar asset-backed securities (ABS) provide a low-cost financing mechanism. • Results for securitization of solar leases and power purchase agreements (PPA). • Securitization can significantly reduce project financing costs. • Identifies policy interventions that lower the cost of ABS by reducing risk

  3. A multi-objective approach to the assignment of stock keeping units to unidirectional picking lines

    Directory of Open Access Journals (Sweden)

    Le Roux, G. J.

    2017-05-01

    Full Text Available An order picking system in a distribution centre consisting of parallel unidirectional picking lines is considered. The objectives are to minimise the walking distance of the pickers, the largest volume of stock on a picking line over all picking lines, the number of small packages, and the total penalty incurred for late distributions. The problem is formulated as a multi-objective multiple knapsack problem that is not solvable in a realistic time. Population-based algorithms, including the artificial bee colony algorithm and the genetic algorithm, are also implemented. The results obtained from all algorithms indicate a substantial improvement on all objectives relative to historical assignments. The genetic algorithm delivers the best performance.

  4. Higher Order Expectations in Asset Pricing

    OpenAIRE

    Philippe BACCHETTA; Eric VAN WINCOOP

    2004-01-01

    We examine formally Keynes' idea that higher order beliefs can drive a wedge between an asset price and its fundamental value based on expected future payoffs. Higher order expectations add an additional term to a standard asset pricing equation. We call this the higher order wedge, which depends on the difference between higher and first order expectations of future payoffs. We analyze the determinants of this wedge and its impact on the equilibrium price. In the context of a dynamic noisy r...

  5. Rational Asset Pricing Bubbles Revisited

    OpenAIRE

    Jan Werner

    2012-01-01

    Price bubble arises when the price of an asset exceeds the asset's fundamental value, that is, the present value of future dividend payments. The important result of Santos and Woodford (1997) says that price bubbles cannot exist in equilibrium in the standard dynamic asset pricing model with rational agents as long as assets are in strictly positive supply and the present value of total future resources is finite. This paper explores the possibility of asset price bubbles when either one of ...

  6. A Branch-and-Price algorithm for stable workforce assignments with hierarchical skills

    NARCIS (Netherlands)

    Firat, M.; Briskorn, D.; Laugier, A.

    2016-01-01

    This paper deals with assigning hierarchically skilled technicians to jobs by considering preferences. We investigate stability definitions in multi-skill workforce assignments stemming from the notion of blocking pairs as stated in the Marriage model of Gale–Shapley. We propose a Branch-and-Price

  7. Prediction of future asset prices

    Science.gov (United States)

    Seong, Ng Yew; Hin, Pooi Ah; Ching, Soo Huei

    2014-12-01

    This paper attempts to incorporate trading volume as an additional predictor of asset prices. Denoting by r(t) the vector consisting of the time-t values of the trading volume and price of a given asset, we model the time-(t+1) asset price as dependent on the present and l-1 past values r(t), r(t-1), ..., r(t-l+1) via a conditional distribution derived from a (2l+1)-dimensional power-normal distribution. A prediction interval based on the 100(α/2)% and 100(1-α/2)% points of the conditional distribution is then obtained. By examining the average lengths of the prediction intervals found using the composite indices of the Malaysian stock market for the period 2008 to 2013, we found that the value 2 appears to be a good choice for l. With the omission of the trading volume from the vector r(t), the corresponding prediction interval exhibits a slightly longer average length, showing that it is desirable to keep trading volume as a predictor. From the above conditional distribution, the probability that the time-(t+1) asset price will be larger than the time-t asset price is next computed. When this probability differs from 0 (or 1) by less than 0.03, the observed time-(t+1) change in price tends to be negative (or positive). Thus the above probability has good potential to be used as a market indicator in technical analysis.
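
    As a simplified Gaussian stand-in for the power-normal model (the l = 1 case), the next price can be regressed on the current price and trading volume, with an interval formed from the residual spread. The AR(1) test data and the normal-theory z-value are assumptions; the paper's actual conditional distribution is not reproduced here.

```python
import random

def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination."""
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        piv = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        for r in range(3):
            if r != i:
                f = M[r][i] / M[i][i]
                M[r] = [a - f * c for a, c in zip(M[r], M[i])]
    return [M[i][3] / M[i][i] for i in range(3)]

def prediction_interval(prices, volumes, z=1.96):
    """Gaussian stand-in for the power-normal model (l = 1): regress
    the next price on the current price and volume, then form an
    approximate 95% interval from the residual spread."""
    X = [(1.0, p, v) for p, v in zip(prices[:-1], volumes[:-1])]
    y = prices[1:]
    # Ordinary least squares via the normal equations X'X b = X'y.
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)]
           for i in range(3)]
    Xty = [sum(r[i] * yi for r, yi in zip(X, y)) for i in range(3)]
    beta = solve3(XtX, Xty)
    resid = [yi - sum(b * xi for b, xi in zip(beta, r))
             for r, yi in zip(X, y)]
    s = (sum(e * e for e in resid) / (len(resid) - 3)) ** 0.5
    mid = sum(b * xi for b, xi in
              zip(beta, (1.0, prices[-1], volumes[-1])))
    return mid - z * s, mid + z * s

# Synthetic AR(1) prices around 100 with uninformative volumes.
random.seed(0)
prices, volumes = [100.0], [1.5]
for _ in range(299):
    volumes.append(random.uniform(1.0, 2.0))
    prices.append(0.9 * prices[-1] + 10.0 + 0.5 * random.gauss(0, 1))
lo, hi = prediction_interval(prices, volumes)
```

Comparing interval widths with and without the volume column mirrors the paper's test of whether volume is worth keeping as a predictor.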

  8. The assets-based approach: furthering a neoliberal agenda or rediscovering the old public health? A critical examination of practitioner discourses.

    Science.gov (United States)

    Roy, Michael J

    2017-08-08

    The 'assets-based approach' to health and well-being has, on the one hand, been presented as a potentially empowering means to address the social determinants of health while, on the other, been criticised for obscuring structural drivers of inequality and encouraging individualisation and marketisation; in essence, for being a tool of neoliberalism. This study looks at how this apparent contestation plays out in practice through a critical realist-inspired examination of practitioner discourses, specifically of those working within communities to address social vulnerabilities that we know impact upon health. The study finds that practitioners interact with the assets-based policy discourse in interesting ways. Rather than unwitting tools of neoliberalism, they considered their work to be about mitigating the worst effects of poverty and social vulnerability in ways that enhance collectivism and solidarity, concepts that neoliberalism arguably seeks to disrupt. Furthermore, rather than a different, innovative, way of working, they consider the assets-based approach to simply be a re-labelling of what they have been doing anyway, for as long as they can remember. So, for practitioners, rather than a 'new' approach to public health, the assets-based public health movement seems to be a return to recognising and appreciating the role of community within public health policy and practice; ideals that predate neoliberalism by quite some considerable time.

  9. Tapping the Value Potential of Extended Asset Services - Experiences from Finnish Companies

    Science.gov (United States)

    Kortelainen, Helena; Hanski, Jyri; Valkokari, Pasi; Ahonen, Toni

    2017-09-01

    Recent developments in information technology and business models enable a wide variety of new services for companies looking for growth in services. Currently, manufacturing companies have been actively developing and providing novel asset based services such as condition monitoring and remote control. However, there is still untapped potential in extending the service delivery to the long-term co-operative development of physical assets over the whole lifecycle. Close collaboration with the end-customer and other stakeholders is needed in order to understand the value generation options. In this paper, we assess some of the asset services manufacturing companies are currently developing. The descriptions of the asset services are based on the results of an industrial workshop in which the companies presented their service development plans. The service propositions are compared with the Total Cost of Ownership and the closed loop life cycle frameworks. Based on the comparison, gaps that indicate potential for extended asset service concepts are recognised. In conclusion, we argue that the manufacturing companies do not recognise the whole potential for asset based services and for optimizing the performance of the end customers' processes.

  10. Bus Timetabling as a Fuzzy Multiobjective Optimization Problem Using Preference-based Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Surafel Luleseged Tilahun

    2012-05-01

    Full Text Available Transportation plays a vital role in the development of a country, and the car is the most commonly used means. However, in developing countries long waiting times for public buses are a common problem, especially when people need to switch buses. The problem becomes critical for buses connecting different villages and cities. Theoretically this problem could be solved by assigning more buses to a route, which is not possible for economic reasons. Another option is to schedule the buses so that customers who want to switch buses at junction cities do not have to wait long. This paper discusses how to model bus timetabling on single-frequency routes as a fuzzy multiobjective optimization problem and how to solve it using a preference-based genetic algorithm, by assigning appropriate fuzzy preferences to the needs of the customers. The idea is elaborated with an example.
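
    The fuzzy-preference idea can be sketched with a simple membership function for transfer waiting time, aggregated by customer preference weights into the single fitness value a genetic algorithm would maximise. The ideal and worst wait times below are assumed, not taken from the paper.

```python
def satisfaction(wait, ideal=5, worst=30):
    """Fuzzy membership for a transfer wait (in minutes): full
    satisfaction at or below the ideal wait, falling linearly to zero
    at the worst tolerable wait."""
    if wait <= ideal:
        return 1.0
    if wait >= worst:
        return 0.0
    return (worst - wait) / (worst - ideal)

def weighted_fitness(waits, prefs):
    """Preference-weighted aggregation of the fuzzy objectives: each
    junction's satisfaction is scaled by the customers' preference
    weight, giving one scalar fitness for the genetic algorithm."""
    total = sum(prefs)
    return sum(p * satisfaction(w) for p, w in zip(prefs, waits)) / total

# Three junctions; the busy middle junction carries double weight.
f = weighted_fitness(waits=[5, 10, 30], prefs=[1.0, 2.0, 1.0])
```

A timetable shift that shortens the heavily weighted junction's wait raises this fitness more than the same shift elsewhere.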

  11. Managing Assets in The Infrastructure Sector

    Directory of Open Access Journals (Sweden)

    T.P. van Houten

    2010-09-01

    Full Text Available In view of the importance of asset management and the lack of research on it in the infrastructure sector, we develop an asset management model in this study. The model is developed in line with the unique characteristics of infrastructure assets and with asset management principles and criteria. In the proposed model, we consider activities at three levels: the strategic, tactical and operational levels. Interviews with experts in asset management and officials in several Dutch organizations have demonstrated the potential of our asset management model.

  12. Methodological aspects of network assets accounting

    Directory of Open Access Journals (Sweden)

    Yuhimenko-Nazaruk I.A.

    2017-08-01

    Full Text Available The necessity of using innovative tools for processing and representing information about network assets is substantiated. Suggestions for displaying network assets in accounts are presented. The main reasons for the need to display network assets in the financial statements of all members of the network structure (the economic essence of network assets as an object of accounting; the non-additive model for the formation of the value of network assets; the internetworking mechanism for the formation of the value of network assets) are identified. The stages of accounting valuation of network assets are allocated and substantiated. An analytical table for estimating the value of network assets and additional network capital in accounting is developed. The order of reflecting additional network capital in accounting is developed. The method of revaluation of network assets in accounting in the broad sense is revealed. The order of accounting for network assets when the number of participants in the network structure increases or decreases is determined.

  13. Basel III and Asset Securitization

    Directory of Open Access Journals (Sweden)

    M. Mpundu

    2013-01-01

    Full Text Available Asset securitization via special purpose entities involves the process of transforming assets into securities that are issued to investors. These investors hold the rights to payments supported by the cash flows from an asset pool held by the said entity. In this paper, we discuss the mechanism by which low- and high-quality entities securitize low- and high-quality assets, respectively, into collateralized debt obligations. During the 2007–2009 financial crisis, asset securitization was seriously inhibited. In response to this, for instance, new Basel III capital and liquidity regulations were introduced. Here, we find that we can explicitly determine the transaction costs related to low-quality asset securitization. Also, in the case of dynamic and static multipliers, the effects of unexpected negative shocks such as rating downgrades on asset price and input, debt obligation price and output, and profit will be quantified. In this case, we note that Basel III has been designed to provide countercyclical capital buffers to negate procyclicality. Moreover, we will develop an illustrative example of low-quality asset securitization for subprime mortgages. Furthermore, numerical examples to illustrate the key results will be provided. In addition, connections between Basel III and asset securitization will be highlighted.

  14. Due Date Assignment in a Dynamic Job Shop with the Orthogonal Kernel Least Squares Algorithm

    Science.gov (United States)

    Yang, D. H.; Hu, L.; Qian, Y.

    2017-06-01

    Meeting due dates is a key goal in the manufacturing industries. This paper proposes a method for due date assignment (DDA) using the Orthogonal Kernel Least Squares Algorithm (OKLSA). A simulation model is built to imitate the production process of a highly dynamic job shop. Several factors describing job characteristics and system state are extracted as attributes to predict job flow-times. A number of experiments under varying dispatching rules and a 90% shop utilization level have been carried out to evaluate the effectiveness of OKLSA applied to DDA. The prediction performance of OKLSA is compared with those of five conventional DDA models and a back-propagation neural network (BPNN). The experimental results indicate that OKLSA is statistically superior to the other DDA models in terms of mean absolute lateness and root mean square lateness in most cases. The only exception occurs when the shortest processing time rule is used for dispatching jobs; in that case, the difference between OKLSA and BPNN is not statistically significant.
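
OKLSA itself is not specified in the abstract, so the following sketch only shows the general kernel least-squares idea behind such flow-time prediction: a plain (non-orthogonalized) Gaussian-kernel ridge fit on made-up training pairs of job attributes and observed flow-times.

```python
import math

# Illustrative training data: (job attributes -> observed flow-time).
# Attributes here stand in for, e.g., work content and shop utilization.
TRAIN_X = [[2.0, 0.3], [5.0, 0.7], [3.0, 0.5], [8.0, 0.9]]
TRAIN_Y = [10.0, 28.0, 17.0, 46.0]

def gauss_kernel(a, b, gamma=0.5):
    return math.exp(-gamma * sum((x - y) ** 2 for x, y in zip(a, b)))

def solve(A, b):
    """Tiny Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit(ridge=1e-3):
    K = [[gauss_kernel(a, b) for b in TRAIN_X] for a in TRAIN_X]
    for i in range(len(K)):
        K[i][i] += ridge                  # regularization keeps K invertible
    return solve(K, TRAIN_Y)              # dual weights alpha

def predict(alpha, x):
    return sum(a * gauss_kernel(t, x) for a, t in zip(alpha, TRAIN_X))

alpha = fit()
predicted_flowtime = predict(alpha, [4.0, 0.6])  # flow-time for a new job
```

With a small ridge term the fit nearly interpolates the training points, which is the least-squares property the DDA method relies on.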

  15. Comparison of neural network applications for channel assignment in cellular TDMA networks and dynamically sectored PCS networks

    Science.gov (United States)

    Hortos, William S.

    1997-04-01

    The use of artificial neural networks (NNs) to address the channel assignment problem (CAP) for cellular time-division multiple access and code-division multiple access networks has previously been investigated by this author and many others. The investigations to date have been based on a hexagonal cell structure established by omnidirectional antennas at the base stations. No account was taken of the spatial isolation enabled by directional antennas to reduce interference between mobiles. Any reduction in interference translates into increased capacity and consequently alters the performance of the NNs. Previous studies have sought to improve the performance of Hopfield-Tank network algorithms and self-organizing feature map algorithms applied primarily to static channel assignment (SCA) for cellular networks that handle uniformly distributed, stationary traffic in each cell for a single type of service. The resulting algorithms minimize energy functions representing interference constraints and ad hoc conditions that promote convergence to optimal solutions. While the structures of the derived neural network algorithms (NNAs) offer the potential advantages of inherent parallelism and adaptability to changing system conditions, this potential has yet to be fulfilled for the CAP of emerging mobile networks. Next-generation communication infrastructures must accommodate dynamic operating conditions. Macrocell topologies are being refined to microcells and picocells that can be dynamically sectored by adaptively controlled, directional antennas and programmable transceivers. These networks must support time-varying demands for personal communication services (PCS) that simultaneously carry voice, data and video and, thus, require new dynamic channel assignment (DCA) algorithms. 
This paper examines the impact of dynamic cell sectoring and geometric conditioning on NNAs developed for SCA in omnicell networks with stationary traffic to improve the metrics
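
The co-channel interference constraint at the heart of SCA can be illustrated with a simple greedy baseline (not the Hopfield-Tank formulation the record discusses); the seven-cell hexagonal layout and per-cell demands below are illustrative assumptions:

```python
# Hexagonal cluster: cell 0 surrounded by cells 1..6. Neighboring cells must
# not share a channel (co-channel interference constraint of SCA).
NEIGHBORS = {
    0: {1, 2, 3, 4, 5, 6},
    1: {0, 2, 6}, 2: {0, 1, 3}, 3: {0, 2, 4},
    4: {0, 3, 5}, 5: {0, 4, 6}, 6: {0, 5, 1},
}
DEMAND = {0: 2, 1: 1, 2: 1, 3: 1, 4: 1, 5: 1, 6: 1}  # channels requested per cell

def greedy_assign():
    """Give each cell the lowest channels unused by already-assigned neighbors."""
    assignment = {}
    for cell in sorted(DEMAND, key=lambda c: -DEMAND[c]):   # hardest cells first
        forbidden = set()
        for nb in NEIGHBORS[cell]:
            forbidden |= assignment.get(nb, set())
        chans, ch = set(), 0
        while len(chans) < DEMAND[cell]:
            if ch not in forbidden:
                chans.add(ch)
            ch += 1
        assignment[cell] = chans
    return assignment

def interference_free(assignment):
    return all(not (assignment[c] & assignment[n])
               for c in assignment for n in NEIGHBORS[c])

plan = greedy_assign()
```

The neural approaches in the record minimize an energy function encoding the same constraint; the greedy pass merely shows what a feasible (interference-free) assignment looks like.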

  16. Managing assets in the infrastructure sector

    NARCIS (Netherlands)

    van Houten, T.P.; Zhang, L.

    2010-01-01

    In view of the importance of managing assets and the lack of research in managing assets in the infrastructure sector, we develop an asset management model in this study. This model is developed in line with the unique characteristics of the infrastructure assets and asset management principles and

  17. Effectiveness of infrastructure asset management: challenges for public agencies

    NARCIS (Netherlands)

    Schraven, Daan; Hartmann, Andreas; Dewulf, Geert P.M.R.

    2011-01-01

    Purpose: The aim of this research is to better understand the decisions in infrastructure asset management at public agencies and the challenges of these agencies to improve the effectiveness of their decision making. Design/methodology/approach: Based on a literature review on asset management at

  18. Empowering file-based radio production through media asset management systems

    Science.gov (United States)

    Muylaert, Bjorn; Beckers, Tom

    2006-10-01

    In recent years, IT-based production and archiving of media has matured to a level which enables broadcasters to switch over from tape- or CD-based to file-based workflows for the production of their radio and television programs. This technology is essential for the future of broadcasters as it provides the flexibility and speed of execution the customer demands by enabling, among others, concurrent access and production, faster than real-time ingest, edit during ingest, centrally managed annotation and quality preservation of media. In terms of automation of program production, the radio department is the most advanced within the VRT, the Flemish broadcaster. For several years, the radio department has been working with digital equipment and producing its programs mainly on standard IT equipment. Historically, the shift from analogue to digital-based production has been a step-by-step process initiated and coordinated by each radio station separately, resulting in a multitude of tools and metadata collections, some of them developed in-house, lacking integration. To make matters worse, each of those stations adopted a slightly different production methodology. The planned introduction of a company-wide Media Asset Management System allows a coordinated overhaul to a unified production architecture. Benefits include the centralized ingest and annotation of audio material and the uniform, integrated (in terms of IT infrastructure) workflow model. Needless to say, the ingest strategy, metadata management and integration with radio production systems play a major role in the level of success of any improvement effort. This paper presents a data model for audio-specific concepts relevant to radio production. It includes an investigation of ingest techniques and strategies. 
Cooperation with external, professional production tools is demonstrated through a use-case scenario: the integration of an existing, multi-track editing tool with a commercially available

  19. Single-step reinitialization and extending algorithms for level-set based multi-phase flow simulations

    Science.gov (United States)

    Fu, Lin; Hu, Xiangyu Y.; Adams, Nikolaus A.

    2017-12-01

    We propose efficient single-step formulations for reinitialization and extending algorithms, which are critical components of level-set based interface-tracking methods. The level-set field is reinitialized with a single-step (non-iterative) "forward tracing" algorithm. A minimum set of cells is defined that describes the interface, and reinitialization employs only data from these cells. Fluid states are extrapolated or extended across the interface by a single-step "backward tracing" algorithm. Both algorithms, which are motivated by analogy to ray-tracing, avoid multiple block-boundary data exchanges that are inevitable for iterative reinitialization and extending approaches within a parallel-computing environment. The single-step algorithms are combined with a multi-resolution conservative sharp-interface method and validated by a wide range of benchmark test cases. We demonstrate that the proposed reinitialization method achieves second-order accuracy in conserving the volume of each phase. The interface location is invariant to reapplication of the single-step reinitialization. Generally, we observe smaller absolute errors than for standard iterative reinitialization on the same grid. The computational efficiency is higher than for the standard and typical high-order iterative reinitialization methods. We observe a 2- to 6-times efficiency improvement over the standard method for serial execution. The proposed single-step extending algorithm, which is commonly employed for assigning data to ghost cells with ghost-fluid or conservative interface interaction methods, shows about 10-times efficiency improvement over the standard method while maintaining the same accuracy. Despite their simplicity, the proposed algorithms offer an efficient and robust alternative to iterative reinitialization and extending methods for level-set based multi-phase simulations.
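
A one-dimensional analogue conveys the single-step idea: locate the interface once from the distorted field, then rebuild the signed-distance field directly from the interface cells, with no iteration. This toy sketch is not the paper's "forward tracing" algorithm:

```python
def reinitialize_1d(phi, dx=1.0):
    """Rebuild a 1D signed-distance field from interface crossings of phi."""
    # Interface cells: sign change between neighbors; the sub-cell crossing
    # position comes from linear interpolation (the minimal interface data).
    crossings = []
    for i in range(len(phi) - 1):
        if phi[i] * phi[i + 1] < 0:
            crossings.append((i + phi[i] / (phi[i] - phi[i + 1])) * dx)
    # Single pass: each cell gets its distance to the nearest crossing,
    # signed by the original phase indicator.
    out = []
    for i, p in enumerate(phi):
        d = min(abs(i * dx - x) for x in crossings)
        out.append(d if p > 0 else -d)
    return out

# Distorted level-set field whose zero crossing sits at x = 2.5
distorted = [-5.0, -2.0, -0.5, 0.5, 4.0, 9.0]
sdf = reinitialize_1d(distorted)
```

Because the interface location is taken from the field itself, reapplying the routine leaves the result unchanged, mirroring the invariance property reported in the abstract.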

  20. Algorithms for worst-case tolerance optimization

    DEFF Research Database (Denmark)

    Schjær-Jacobsen, Hans; Madsen, Kaj

    1979-01-01

    New algorithms are presented for the solution of optimum tolerance assignment problems. The problems considered are defined mathematically as a worst-case problem (WCP), a fixed tolerance problem (FTP), and a variable tolerance problem (VTP). The basic optimization problem without tolerances is denoted the zero tolerance problem (ZTP). For solution of the WCP we suggest application of interval arithmetic and also alternative methods. For solution of the FTP an algorithm is suggested which is conceptually similar to algorithms previously developed by the authors for the ZTP. Finally, the VTP is solved by a double-iterative algorithm in which the inner iteration is performed by the FTP algorithm. The application of the algorithm is demonstrated by means of relatively simple numerical examples. Basic properties, such as convergence properties, are displayed based on the examples.
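
The interval-arithmetic approach suggested for the WCP can be sketched by propagating parameter tolerances through a response function to bound its worst-case deviation; the RC-circuit response used here is an illustrative assumption, not an example from the paper:

```python
import math

class Interval:
    """Minimal interval arithmetic: multiplication and reciprocal only."""
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi
    def __mul__(self, other):
        p = [self.lo * other.lo, self.lo * other.hi,
             self.hi * other.lo, self.hi * other.hi]
        return Interval(min(p), max(p))
    def __rtruediv__(self, num):          # num / interval, interval > 0
        return Interval(num / self.hi, num / self.lo)

def with_tolerance(nominal, tol):
    """Interval for a parameter with relative tolerance tol (0.05 = 5%)."""
    return Interval(nominal * (1 - tol), nominal * (1 + tol))

# Worst-case cutoff frequency f = 1 / (2*pi*R*C)
# with a 5% resistor and a 10% capacitor.
R = with_tolerance(1000.0, 0.05)          # ohms
C = with_tolerance(1e-6, 0.10)            # farads
two_pi = Interval(2 * math.pi, 2 * math.pi)
f = 1.0 / (two_pi * (R * C))              # guaranteed frequency bounds
```

The resulting interval brackets the nominal cutoff (about 159 Hz) and gives rigorous worst-case bounds, which is exactly the guarantee interval arithmetic contributes to the WCP.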

  1. Optimal Management Of Renewable-Based Mgs An Intelligent Approach Through The Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Mehdi Nafar

    2015-08-01

    Full Text Available Abstract- This article proposes a probabilistic framework built on scenario generation to account for the uncertainties in the optimal operation management of Micro Grids (MGs). The MG contains different renewable energy resources such as Wind Turbine (WT), Micro Turbine (MT), Photovoltaic (PV), Fuel Cell (FC), and one battery as the storage device. The proposed framework is based on scenario generation and the Roulette wheel mechanism to produce different scenarios for handling the uncertainties of the relevant factors, using the normal distribution as the probability distribution function of the random variables. The uncertainties considered in this paper are grid bid variations, load demand forecasting error, and PV and WT output power productions. It is worth noting that solving the MG problem for 24 hours of a day while considering diverse uncertainties and constraints requires a powerful optimization method that converges fast without falling into local optima. Accordingly, a Group Search Optimization (GSO) method, inspired by the group behavior of animals, is presented to search the whole solution space globally, and one modification to the GSO procedure is also proposed. The proposed framework and method are applied to one test grid-connected MG as a typical grid.

  2. Using genetic algorithm to solve a new multi-period stochastic optimization model

    Science.gov (United States)

    Zhang, Xin-Li; Zhang, Ke-Cun

    2009-09-01

    This paper presents a new asset allocation model based on the CVaR risk measure and transaction costs. Institutional investors manage their strategic asset mix over time to achieve favorable returns subject to various uncertainties, policy and legal constraints, and other requirements. One may use a multi-period portfolio optimization model in order to determine an optimal asset mix. Recently, an alternative stochastic programming model with simulated paths was proposed by Hibiki [N. Hibiki, A hybrid simulation/tree multi-period stochastic programming model for optimal asset allocation, in: H. Takahashi, (Ed.) The Japanese Association of Financial Econometrics and Engineering, JAFFE Journal (2001) 89-119 (in Japanese); N. Hibiki, A hybrid simulation/tree stochastic optimization model for dynamic asset allocation, in: B. Scherer (Ed.), Asset and Liability Management Tools: A Handbook for Best Practice, Risk Books, 2003, pp. 269-294], which was called a hybrid model. However, transaction costs were not considered in that paper. In this paper, we improve Hibiki's model in the following aspects: (1) the risk measure CVaR is introduced to control the wealth-loss risk while maximizing the expected utility; (2) typical market imperfections such as short-sale constraints and proportional transaction costs are considered simultaneously; (3) the application of a genetic algorithm to solve the resulting model is discussed in detail. Numerical results show the suitability and feasibility of our methodology.
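
The two ingredients added to the hybrid model, the CVaR measure and proportional transaction costs, can be sketched directly; the scenario losses and the cost rate below are illustrative assumptions:

```python
def cvar(losses, beta=0.9):
    """CVaR at level beta: mean loss over the worst (1 - beta) tail
    of simulated scenario losses."""
    tail_n = max(1, int(round(len(losses) * (1 - beta))))
    worst = sorted(losses, reverse=True)[:tail_n]
    return sum(worst) / len(worst)

def rebalance_cost(old_weights, new_weights, rate=0.002):
    """Proportional transaction cost for moving between asset mixes."""
    return rate * sum(abs(a - b) for a, b in zip(old_weights, new_weights))

# Ten simulated end-of-period portfolio losses (positive = loss)
losses = [-0.04, -0.02, 0.01, 0.03, -0.01, 0.06, 0.02, -0.03, 0.08, 0.00]
risk = cvar(losses, beta=0.9)             # mean of the worst 10% of scenarios
cost = rebalance_cost([0.6, 0.4], [0.5, 0.5])
```

In the paper's optimization these two quantities enter the objective and constraints; here they are evaluated in isolation to show the mechanics.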

  3. Identification of the vital digital assets based on PSA results analysis

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Moon Kyoung; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of); Son, Han Seong [Joongbu Univiersity, Geumsan (Korea, Republic of); Kim, Hyundoo [Korea Institute of Nuclear Nonproliferation and Control, Daejeon (Korea, Republic of)

    2016-10-15

    As the main systems responsible for the overall operation, control, monitoring, measurement, and safety functions in an emergency, instrumentation and control (I and C) systems in nuclear power plants have been gradually digitalized for precise operation and convenience. The digitalization of infrastructure makes systems vulnerable to cyber threats and hybrid attacks. According to the ICS-CERT report, the number of vulnerabilities in ICS industries increases rapidly over time. Recently, due to the digitalization of I and C, the need for cyber security in the digitalized I and C of NPPs has begun to rise. However, there are too many critical digital assets (CDAs) in NPPs: more than 60% of the total critical systems are digital systems. Addressing more than 100 security controls for each CDA requires too much effort for both licensee and inspector. It is necessary to focus on the more significant CDAs for effective regulation. Probabilistic Safety Analysis (PSA) results are analyzed in order to identify the more significant CDAs that could evoke an accident in NPPs through digital malfunction or cyber-attacks. By eliciting minimal cut sets using fault tree analyses, accident-related CDAs are drawn. Also, the CDAs that must be secured from outsiders are elicited for certain accident scenarios. It is expected that effective cyber security regulation based on the graded approach can be implemented. Furthermore, defense-in-depth of digital assets for NPP safety can be built up. Digital technologies such as computers, control systems, and data networks currently play essential roles in modern NPPs. Further, the introduction of new digitalized technologies is also being considered. These digital technologies make the operation of NPPs more convenient and economical; however, they are inherently susceptible to problems such as digital malfunction of components or cyber-attacks. 
Recently, needs for cyber security on digitalized nuclear Instrumentation and Control (I and C
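
The minimal-cut-set step described above can be sketched with a top-down expansion of a toy fault tree; the gates and asset names are illustrative, not a real NPP model:

```python
from itertools import product

# Toy fault tree: the top event occurs if the pump controller fails, or if
# the sensor network and PLC A fail together. Leaves are digital assets.
TREE = {
    "TOP": ("OR",  ["G1", "PUMP_CTRL"]),
    "G1":  ("AND", ["SENSOR_NET", "PLC_A"]),
}

def cut_sets(node):
    if node not in TREE:                      # basic event (a digital asset)
        return [frozenset([node])]
    gate, children = TREE[node]
    child_sets = [cut_sets(c) for c in children]
    if gate == "OR":                          # union of the children's cut sets
        return [s for sets in child_sets for s in sets]
    combined = []                             # AND: cross-product merge
    for combo in product(*child_sets):
        combined.append(frozenset().union(*combo))
    return combined

def minimize(sets):
    """Keep only cut sets with no proper subset among the others."""
    return [s for s in sets if not any(o < s for o in sets)]

mcs = minimize(cut_sets("TOP"))
critical_assets = set().union(*mcs)           # CDAs appearing in any cut set
```

Assets appearing in small cut sets (here the single-point failure `PUMP_CTRL`) would be ranked as the most significant CDAs under the graded approach.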

  4. Identification of the vital digital assets based on PSA results analysis

    International Nuclear Information System (INIS)

    Choi, Moon Kyoung; Seong, Poong Hyun; Son, Han Seong; Kim, Hyundoo

    2016-01-01

    As the main systems responsible for the overall operation, control, monitoring, measurement, and safety functions in an emergency, instrumentation and control (I and C) systems in nuclear power plants have been gradually digitalized for precise operation and convenience. The digitalization of infrastructure makes systems vulnerable to cyber threats and hybrid attacks. According to the ICS-CERT report, the number of vulnerabilities in ICS industries increases rapidly over time. Recently, due to the digitalization of I and C, the need for cyber security in the digitalized I and C of NPPs has begun to rise. However, there are too many critical digital assets (CDAs) in NPPs: more than 60% of the total critical systems are digital systems. Addressing more than 100 security controls for each CDA requires too much effort for both licensee and inspector. It is necessary to focus on the more significant CDAs for effective regulation. Probabilistic Safety Analysis (PSA) results are analyzed in order to identify the more significant CDAs that could evoke an accident in NPPs through digital malfunction or cyber-attacks. By eliciting minimal cut sets using fault tree analyses, accident-related CDAs are drawn. Also, the CDAs that must be secured from outsiders are elicited for certain accident scenarios. It is expected that effective cyber security regulation based on the graded approach can be implemented. Furthermore, defense-in-depth of digital assets for NPP safety can be built up. Digital technologies such as computers, control systems, and data networks currently play essential roles in modern NPPs. Further, the introduction of new digitalized technologies is also being considered. These digital technologies make the operation of NPPs more convenient and economical; however, they are inherently susceptible to problems such as digital malfunction of components or cyber-attacks. 
Recently, needs for cyber security on digitalized nuclear Instrumentation and Control (I and C

  5. Identifying Assets Associated with Quality Extension Programming at the Local Level

    Directory of Open Access Journals (Sweden)

    Amy Harder

    2017-10-01

    Full Text Available County Extension offices are responsible for the majority of programming delivered in the United States. The purpose of this study was to identify and explore assets influencing the quality of county Extension programs. A basic qualitative research design was followed to conduct constant comparative analysis of five Extension county program review reports. Using the appreciative inquiry process as the lens through which to view the county program review reports revealed multiple assets leading to quality programming. Assets of the reviewed county Extension programs were found to cluster within the following themes: competent and enthusiastic Extension faculty, community partnerships, engaged and supportive stakeholders, effective resource management, sufficient and stable workforce, meeting stakeholder needs, positive reputation, access to facilities, positive relationships between county and state faculty, and innovative practices. The use of both needs-based and assets-based paradigms will provide Extension organizations with a more holistic understanding of their assets and a research-based foundation from which to make decisions about strengthening the organization at all levels.

  6. Three results on frequency assignment in linear cellular networks

    Czech Academy of Sciences Publication Activity Database

    Chrobak, M.; Sgall, Jiří

    2010-01-01

    Roč. 411, č. 1 (2010), s. 131-137 ISSN 0304-3975 R&D Projects: GA MŠk(CZ) 1M0545; GA AV ČR IAA100190902 Keywords : frequency assignment * approximation algorithms * online algorithms Subject RIV: BA - General Mathematics Impact factor: 0.838, year: 2010 http://www.sciencedirect.com/science/article/pii/S0304397509006574

  7. Wavelet-LMS algorithm-based echo cancellers

    Science.gov (United States)

    Seetharaman, Lalith K.; Rao, Sathyanarayana S.

    2002-12-01

    This paper presents echo cancellers based on the Wavelet-LMS algorithm. The performance of the Least Mean Square (LMS) algorithm in the wavelet transform domain is observed and its application to echo cancellation is analyzed. The Widrow-Hoff LMS algorithm is the most widely used algorithm for adaptive filters that function as echo cancellers. Present-day communication signals are widely non-stationary in nature, and some errors crop up when the LMS algorithm is used for echo cancellers handling such signals. The analysis of non-stationary signals often involves a compromise in how well transitions or discontinuities can be located. The multi-scale or multi-resolution signal analysis, which is the essence of the wavelet transform, makes wavelets popular in non-stationary signal analysis. In this paper, we present a Wavelet-LMS algorithm wherein the wavelet coefficients of a signal are modified adaptively using the LMS algorithm and then reconstructed to give an echo-free signal. The echo canceller based on this algorithm is found to have better convergence and a comparatively lower mean square error (MSE).
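
The LMS core of such an echo canceller can be sketched in the time domain (the paper adapts wavelet coefficients instead); the 3-tap echo path and step size below are made-up assumptions:

```python
import random

rng = random.Random(0)
ECHO_PATH = [0.6, -0.3, 0.1]                  # unknown echo impulse response
N_TAPS = len(ECHO_PATH)
mu = 0.05                                     # LMS step size

w = [0.0] * N_TAPS                            # adaptive filter weights
history = [0.0] * N_TAPS                      # recent far-end samples
errors = []
for n in range(2000):
    x = rng.uniform(-1, 1)                    # far-end signal stand-in
    history = [x] + history[:-1]
    echo = sum(h * s for h, s in zip(ECHO_PATH, history))
    y = sum(wi * s for wi, s in zip(w, history))
    e = echo - y                              # residual echo after cancellation
    w = [wi + mu * e * s for wi, s in zip(w, history)]  # Widrow-Hoff update
    errors.append(e * e)

early = sum(errors[:100]) / 100               # MSE before convergence
late = sum(errors[-100:]) / 100               # MSE after convergence
```

As the weights converge toward the echo path, the residual (and hence the MSE) collapses; the wavelet-domain variant applies the same update to wavelet coefficients to cope better with non-stationary signals.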

  8. PeerShield: determining control and resilience criticality of collaborative cyber assets in networks

    Science.gov (United States)

    Cam, Hasan

    2012-06-01

    As attackers get more coordinated and advanced in cyber attacks, cyber assets are required to have much more resilience, control effectiveness, and collaboration in networks. Such a requirement makes it essential to take a comprehensive and objective approach to measuring the individual and relative performance of cyber security assets in network nodes. To this end, this paper presents four techniques for measuring the relative importance of cyber assets more comprehensively and objectively by considering together the main variables of risk assessment (e.g., threats, vulnerabilities), multiple attributes (e.g., resilience, control, and influence), and network connectivity and controllability among collaborative cyber assets in networks. In the first technique, a Bayesian network is used to include the random variables for control, recovery, and resilience attributes of nodes, in addition to the random variables of threats, vulnerabilities, and risk. The second technique shows how graph matching and coloring can be utilized to form collaborative pairs of nodes to shield together against threats and vulnerabilities. The third technique ranks the security assets of nodes by incorporating multiple weights and thresholds of attributes into a decision-making algorithm. In the fourth technique, the hierarchically well-separated tree is enhanced to first identify critical nodes of a network with respect to their attributes and network connectivity, and then select some nodes as driver nodes for network controllability.

  9. Peak picking multidimensional NMR spectra with the contour geometry based algorithm CYPICK

    International Nuclear Information System (INIS)

    Würz, Julia M.; Güntert, Peter

    2017-01-01

    The automated identification of signals in multidimensional NMR spectra is a challenging task, complicated by signal overlap, noise, and spectral artifacts, for which no universally accepted method is available. Here, we present a new peak picking algorithm, CYPICK, that follows, as far as possible, the manual approach taken by a spectroscopist who analyzes peak patterns in contour plots of the spectrum, but is fully automated. Human visual inspection is replaced by the evaluation of geometric criteria applied to contour lines, such as local extremality, approximate circularity (after appropriate scaling of the spectrum axes), and convexity. The performance of CYPICK was evaluated for a variety of spectra from different proteins by systematic comparison with peak lists obtained by other, manual or automated, peak picking methods, as well as by analyzing the results of automated chemical shift assignment and structure calculation based on input peak lists from CYPICK. The results show that CYPICK yielded peak lists that compare in most cases favorably to those obtained by other automated peak pickers with respect to the criteria of finding a maximal number of real signals, a minimal number of artifact peaks, and maximal correctness of the chemical shift assignments and the three-dimensional structure obtained by fully automated assignment and structure calculation.
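
The local-extremality criterion at the core of the contour-based approach can be sketched on a synthetic 2D spectrum; the circularity and convexity tests of the real CYPICK algorithm are omitted here:

```python
# Synthetic spectrum matrix with two genuine peaks and low-level noise.
SPECTRUM = [
    [0.1, 0.2, 0.1, 0.0, 0.1],
    [0.2, 0.9, 0.3, 0.1, 0.0],
    [0.1, 0.3, 0.2, 0.4, 0.1],
    [0.0, 0.1, 0.3, 1.2, 0.3],
    [0.1, 0.0, 0.1, 0.3, 0.2],
]

def pick_peaks(spec, threshold=0.5):
    """Accept an interior grid point as a peak if it is above the noise
    threshold and a strict local maximum over its 8 neighbors."""
    peaks = []
    for i in range(1, len(spec) - 1):
        for j in range(1, len(spec[0]) - 1):
            v = spec[i][j]
            if v < threshold:
                continue
            neighbors = [spec[i + di][j + dj]
                         for di in (-1, 0, 1) for dj in (-1, 0, 1)
                         if (di, dj) != (0, 0)]
            if all(v > nb for nb in neighbors):   # strict local extremality
                peaks.append((i, j))
    return peaks

peaks = pick_peaks(SPECTRUM)
```

The threshold plays the role of the lowest contour level: noise ridges below it never enter the candidate set, and only isolated local maxima survive.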

  10. Peak picking multidimensional NMR spectra with the contour geometry based algorithm CYPICK

    Energy Technology Data Exchange (ETDEWEB)

    Würz, Julia M.; Güntert, Peter, E-mail: guentert@em.uni-frankfurt.de [Goethe University Frankfurt am Main, Institute of Biophysical Chemistry, Center for Biomolecular Magnetic Resonance (Germany)

    2017-01-15

    The automated identification of signals in multidimensional NMR spectra is a challenging task, complicated by signal overlap, noise, and spectral artifacts, for which no universally accepted method is available. Here, we present a new peak picking algorithm, CYPICK, that follows, as far as possible, the manual approach taken by a spectroscopist who analyzes peak patterns in contour plots of the spectrum, but is fully automated. Human visual inspection is replaced by the evaluation of geometric criteria applied to contour lines, such as local extremality, approximate circularity (after appropriate scaling of the spectrum axes), and convexity. The performance of CYPICK was evaluated for a variety of spectra from different proteins by systematic comparison with peak lists obtained by other, manual or automated, peak picking methods, as well as by analyzing the results of automated chemical shift assignment and structure calculation based on input peak lists from CYPICK. The results show that CYPICK yielded peak lists that compare in most cases favorably to those obtained by other automated peak pickers with respect to the criteria of finding a maximal number of real signals, a minimal number of artifact peaks, and maximal correctness of the chemical shift assignments and the three-dimensional structure obtained by fully automated assignment and structure calculation.

  11. Handoff Triggering and Network Selection Algorithms for Load-Balancing Handoff in CDMA-WLAN Integrated Networks

    Directory of Open Access Journals (Sweden)

    Khalid Qaraqe

    2008-10-01

    Full Text Available This paper proposes a novel vertical handoff algorithm between WLAN and CDMA networks to enable the integration of these networks. The proposed vertical handoff algorithm assumes a handoff decision process (handoff triggering and network selection). The handoff trigger is decided based on the received signal strength (RSS). To reduce the likelihood of unnecessary false handoffs, the distance criterion is also considered. As a network selection mechanism, based on the wireless channel assignment algorithm, this paper proposes a context-based network selection algorithm and the corresponding communication algorithms between WLAN and CDMA networks. This paper focuses on a handoff triggering criterion which uses both the RSS and distance information, and a network selection method which uses context information such as the dropping probability, blocking probability, GoS (grade of service), and number of handoff attempts. As a decision-making criterion, the velocity threshold is determined to optimize the system performance. The optimal velocity threshold is adjusted to assign the available channels to the mobile stations using four handoff strategies. The four handoff strategies are evaluated and compared with each other in terms of GoS. Finally, the proposed scheme is validated by computer simulations.
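
The two-part trigger (RSS below a threshold and distance beyond a bound) can be sketched directly; the threshold values and the sample trajectory are illustrative assumptions, not parameters from the paper:

```python
# Handoff fires only when the WLAN signal is weak AND the mobile has moved
# far from the access point, which suppresses false handoffs caused by
# momentary fades close to the AP.
RSS_THRESHOLD = -80.0        # dBm: below this the WLAN link is considered weak
DIST_THRESHOLD = 90.0        # meters from the WLAN access point

def should_handoff(rss_dbm, distance_m):
    return rss_dbm < RSS_THRESHOLD and distance_m > DIST_THRESHOLD

samples = [                   # (RSS, distance) along the mobile's path
    (-70.0, 40.0),            # strong signal, close: stay on WLAN
    (-85.0, 60.0),            # deep fade but still close: no false handoff
    (-85.0, 120.0),           # weak and far: hand off to CDMA
]
decisions = [should_handoff(r, d) for r, d in samples]
```

The second sample is the case the distance criterion exists for: RSS alone would trigger a handoff, but the distance check vetoes it.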

  12. Handoff Triggering and Network Selection Algorithms for Load-Balancing Handoff in CDMA-WLAN Integrated Networks

    Directory of Open Access Journals (Sweden)

    Kim Jang-Sub

    2008-01-01

    Full Text Available This paper proposes a novel vertical handoff algorithm between WLAN and CDMA networks to enable the integration of these networks. The proposed vertical handoff algorithm assumes a handoff decision process (handoff triggering and network selection). The handoff trigger is decided based on the received signal strength (RSS). To reduce the likelihood of unnecessary false handoffs, the distance criterion is also considered. As a network selection mechanism, based on the wireless channel assignment algorithm, this paper proposes a context-based network selection algorithm and the corresponding communication algorithms between WLAN and CDMA networks. This paper focuses on a handoff triggering criterion which uses both the RSS and distance information, and a network selection method which uses context information such as the dropping probability, blocking probability, GoS (grade of service), and number of handoff attempts. As a decision-making criterion, the velocity threshold is determined to optimize the system performance. The optimal velocity threshold is adjusted to assign the available channels to the mobile stations using four handoff strategies. The four handoff strategies are evaluated and compared with each other in terms of GoS. Finally, the proposed scheme is validated by computer simulations.

  13. Asset management techniques

    International Nuclear Information System (INIS)

    Schneider, Joachim; Gaul, Armin J.; Neumann, Claus; Hograefer, Juergen; Wellssow, Wolfram; Schwan, Michael; Schnettler, Armin

    2006-01-01

    Deregulation and increasing competition in electricity markets urge energy suppliers to optimize the utilization of their equipment, focusing on technical and cost-effectiveness aspects. In response to these requirements, utilities are introducing methods formerly used by investment managers or insurance companies. The article describes the usage of these methods, particularly with regard to asset management and risk management within electrical grids. The essential information needed to set up an appropriate asset management system and the differences between asset management systems in transmission and distribution systems are discussed. The bulk of the costs in electrical grids are costs for maintenance and capital depreciation. A comprehensive approach to asset management in transmission systems thus focuses on the 'life-cycle costs' of the individual equipment. The objective of the life management process is the optimal utilisation of the remaining lifetime for a given reliability of service and a constant distribution of costs for reinvestment and maintenance, ensuring a suitable return. In distribution systems, the high number of components would require an enormous effort if individual components were considered, so statistical approaches have been used successfully in practical applications. The newest insights gained from a German research project on asset management systems in distribution grids give an outlook on future developments. (author)

  14. Asset management techniques

    Energy Technology Data Exchange (ETDEWEB)

    Schneider, Joachim; Gaul, Armin J. [RWE Energy AG, Assetmanagement, Dortmund (Germany); Neumann, Claus [RWE Transportnetz Strom GmbH, Dortmund (Germany); Hograefer, Juergen [SAG Energieversorgungsloesungen GmbH, Langen (Germany); Wellssow, Wolfram; Schwan, Michael [Siemens AG, Power Transmission and Distribution, Erlangen (Germany); Schnettler, Armin [RWTH-Aachen, Institut fuer Hochspannungstechnik, Aachen (Germany)

    2006-11-15

    Deregulation and increasing competition in electricity markets urge energy suppliers to optimize the utilization of their equipment, focusing on technical and cost-effectiveness aspects. In response to these requirements, utilities are introducing methods formerly used by investment managers or insurance companies. The article describes the usage of these methods, particularly with regard to asset management and risk management within electrical grids. The essential information needed to set up an appropriate asset management system and the differences between asset management systems in transmission and distribution systems are discussed. The bulk of the costs in electrical grids are costs for maintenance and capital depreciation. A comprehensive approach to asset management in transmission systems thus focuses on the 'life-cycle costs' of the individual equipment. The objective of the life management process is the optimal utilisation of the remaining lifetime for a given reliability of service and a constant distribution of costs for reinvestment and maintenance, ensuring a suitable return. In distribution systems, the high number of components would require an enormous effort if individual components were considered, so statistical approaches have been used successfully in practical applications. The newest insights gained from a German research project on asset management systems in distribution grids give an outlook on future developments. (author)

  15. Valuation of intangible assets

    OpenAIRE

    Karlíková, Jitka

    2010-01-01

    The thesis is focused on the valuation of intangible assets, particularly trademarks and copyrights. It begins by addressing the problems of valuing intangible assets. The main part of the thesis provides an overview of methods for the valuation of intangible assets. This part is followed by a practical section that illustrates the procedure for valuing trademarks and copyrights with a concrete example.

  16. Asset Management as a Precondition for Knowledge Management

    International Nuclear Information System (INIS)

    Bajramovic, E.; Waedt, K.; Gupta, D.; Gao, Y.; Parekh, M.

    2016-01-01

    Full text: Smart sensors and extensively configurable devices are gradually imposed by the automation market. Except for safety systems, they find their way into the next instrumentation and control (I&C) generation. The understanding and handling of these devices require an extensive knowledge management (KM). This will be outlined for security, testing and training. For legacy systems, security often relates to vetting and access control. For digital devices, a refined asset management is needed, e.g., down to board-level support chipsets. Firmware and system/application software have their own configurations, versions and patch levels. So, here, as a first step of the KM, a user needs to know the firmware configurability. Then, training can address when to apply patches, perform regression tests and what to focus on, based on accumulated experience. While assets are often addressed implicitly, this document justifies an explicit and semiformal representation of primary and supporting assets (the asset portfolio) and the establishment of an asset management system as a basis for a robust knowledge management. (author)

  17. Problems of intangible assets commercialization accounting

    Directory of Open Access Journals (Sweden)

    S.F. Legenchyk

    2016-03-01

    Full Text Available The growing role of intangible assets in a post-industrial global economy is substantiated. The problems of accounting for intangible assets are singled out. The basic tasks of accounting for the commercialization of intangible assets are determined. The difference between the commercialization of intellectual property and of intangible assets is considered. The basic approaches to understanding the essence of intangible asset commercialization are singled out and grounded. The basic forms and methods of intangible asset commercialization researched by the author are analyzed. The accounting treatment of licensee royalties is considered. The factors influencing the accounting of intangible asset commercialization are determined. The necessity of resolving the accounting treatment of lease payments for computer programs provided through a SaaS environment is grounded. The prospects for further studies of accounting for intangible asset commercialization are determined.

  18. Risk-based asset management methodology for highway infrastructure systems.

    Science.gov (United States)

    2004-01-01

    Maintaining the infrastructure of roads, highways, and bridges is paramount to ensuring that these assets will remain safe and reliable in the future. If maintenance costs remain the same or continue to escalate, and additional funding is not made av...

  19. Financial Integration and Asset Returns

    OpenAIRE

    P Martin; H Rey

    2000-01-01

    The paper investigates the impact of financial integration on asset return, risk diversification and breadth of financial markets. We analyse a three-country macroeconomic model in which (i) the number of financial assets is endogenous; (ii) assets are imperfect substitutes; (iii) cross-border asset trade entails some transaction costs; (iv) the investment technology is indivisible. In such an environment, lower transaction costs between two financial markets translate to higher demand for as...

  20. Development of algorithm for depreciation costs allocation in dynamic input-output industrial enterprise model

    Directory of Open Access Journals (Sweden)

    Keller Alevtina

    2017-01-01

    Full Text Available The article considers the issue of the allocation of depreciation costs in a dynamic input-output model of an industrial enterprise. Accounting for depreciation costs in such a model improves the policy of fixed asset management. Developing an algorithm for the allocation of depreciation costs is particularly relevant in the construction of a dynamic input-output model of an industrial enterprise, since such enterprises have a significant amount of fixed assets. Provided the algorithm is adequate, it allows evaluating the appropriateness of investments in fixed assets and studying the final financial results of an industrial enterprise, depending on management decisions in the depreciation policy. It is necessary to note that the model in question is always degenerate for the enterprise. This is caused by the presence of zero rows in the matrix of capital expenditures for the structural elements unable to generate fixed assets (part of the service units, households, corporate consumers). The paper presents the algorithm for the allocation of depreciation costs for the model. This algorithm was developed by the authors and served as the basis for a flowchart for subsequent software implementation. The construction of such an algorithm and its use for dynamic input-output models of industrial enterprises is motivated by the internationally accepted effectiveness of input-output models for national and regional economic systems. This is what allows us to consider that the solutions discussed in the article are of interest to economists of various industrial enterprises.
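
    As an illustration of the allocation idea only (not the authors' algorithm, whose details are not given in the abstract): a total depreciation charge can be spread across structural elements in proportion to each element's fixed-asset value, so that the zero rows mentioned above, i.e. elements without fixed assets, receive nothing.

```python
# Illustrative proportional allocation of depreciation costs.
# Elements with zero fixed assets (service units, households,
# corporate consumers) correspond to the zero rows of the
# capital-expenditure matrix and are allocated nothing.

def allocate_depreciation(total_depreciation, fixed_assets):
    """Split total_depreciation across elements pro rata to fixed_assets."""
    total_assets = sum(fixed_assets)
    if total_assets == 0:
        return [0.0] * len(fixed_assets)
    return [total_depreciation * a / total_assets for a in fixed_assets]
```

    A proportional rule is only one possible choice; straight-line or usage-based schedules would change the weights but not the structure of the allocation.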

  1. Pension plan asset valuation

    OpenAIRE

    Owadally, M. I; Haberman, S.

    2001-01-01

    Various asset valuation methods are used in the context of funding valuations. The motivation for such methods and their properties are briefly described. Some smoothed value or market-related methods based on arithmetic averaging and exponential smoothing are considered and their effect on funding is discussed. Suggestions for further research are also made.

  2. Inflation, Index-Linked Bonds, and Asset Allocation

    OpenAIRE

    Zvi Bodie

    1988-01-01

    The recent introduction of CPI-linked bonds by several financial institutions is a milestone in the history of the U.S. financial system. It has potentially far-reaching effects on individual and institutional asset allocation decisions because these securities represent the only true long-run hedge against inflation risk. CPI-linked bonds make possible the creation of additional financial innovations that would use them as the asset base. One such innovation that seems likely is inflation-pr...

  3. Hybrid employment recommendation algorithm based on Spark

    Science.gov (United States)

    Li, Zuoquan; Lin, Yubei; Zhang, Xingming

    2017-08-01

    To enable real-time application of the collaborative filtering (CF) employment recommendation algorithm, a clustering collaborative filtering recommendation algorithm (CCF) is developed, which applies hierarchical clustering to CF and narrows the query range of neighbour items. In addition, to solve the cold-start problem of the content-based recommendation algorithm (CB), a content-based algorithm with users’ information (CBUI) is introduced for job recommendation. Furthermore, a hybrid recommendation algorithm (HRA) which combines the CCF and CBUI algorithms is proposed and implemented on the Spark platform. The experimental results show that HRA can overcome the problems of cold start and data sparsity, and achieve good recommendation accuracy and scalability for employment recommendation.
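
    The hybrid idea can be sketched as a score blend with a cold-start fallback. This is a hedged illustration: the abstract does not give the combination rule, so the linear blend, the weight alpha, and the function names below are assumptions.

```python
# Hedged sketch of a CF/content-based hybrid: blend the collaborative
# score with the content-based score, and fall back to content alone
# for cold-start users who have no interaction history.

def hybrid_score(cf_score, cb_score, n_user_interactions, alpha=0.7):
    """Weighted blend of CF and content-based scores (alpha is assumed)."""
    if n_user_interactions == 0:      # cold start: no CF signal available
        return cb_score
    return alpha * cf_score + (1 - alpha) * cb_score
```

    In the paper's design the CF side is additionally accelerated by hierarchical clustering, which restricts the neighbour search to the user's cluster rather than the full item set.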

  4. Asset planning performance measurement framework

    NARCIS (Netherlands)

    Arthur, D.; Hodkiewicz, M.; Schoenmaker, R.; Muruvan, S.

    2014-01-01

    The international asset management standard ISO 55001, introduced in early 2014, outlines the requirement for an effective Asset Management System. Asset Management practitioners are seeking guidance on implementing one of the key requirements of the standard: the “line of sight” between the

  5. Toronto Hydro-Electric System Limited, 2010 asset condition assessment audit

    Energy Technology Data Exchange (ETDEWEB)

    Lotho, K.; Wang, F. [Kinectrics Inc., Toronto, ON (Canada)

    2010-07-15

    Toronto Hydro-Electric System Limited (THESL) has long been devoted to the enhancement of its asset management program. In 2006, Kinectrics Incorporated (Kinectrics) performed a full asset condition assessment (ACA) for important distribution assets. Subsequently, THESL made efforts to follow the recommendations given by the 2006 ACA and to enhance the quality of its asset condition data. THESL also created an application that measures the health indices of assets based on current and best available inspection data. In 2009, THESL performed a new ACA with this health index calculator. Kinectrics was requested to evaluate the improvement achieved by THESL between 2006 and 2009, and to compare the results obtained from the two ACAs performed. An examination of the changes and ACA results between 2009 and 2010 has been conducted by Kinectrics. The Kinectrics findings were reported in the 2010 asset condition assessment audit report. The Health Index (HI) formulation and the results obtained between 2009 and 2010 were examined for twenty-one asset categories. The health index formulation including condition parameters, condition parameter weights and condition criteria, the granularity within the asset category, the percentage of the population presenting sufficient condition data and the health index classification distribution were compared for each of the asset categories between 2009 and 2010. This report provides recommendations to facilitate future improvements.
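
    A health index of the kind audited here is typically a weighted sum of normalized condition-parameter scores expressed as a percentage. The sketch below shows the generic form only; the parameter names, scores, and weights are placeholders, not THESL's actual formulation.

```python
# Generic health-index calculation: weighted average of condition
# scores (each normalized to [0, 1]), scaled to a 0-100 percentage.
# Scores and weights below are illustrative, not from the audit.

def health_index(scores, weights):
    """Weighted average of condition scores, as a percentage."""
    assert len(scores) == len(weights)
    total_weight = sum(weights)
    return 100.0 * sum(s * w for s, w in zip(scores, weights)) / total_weight
```

    The audit's comparison points (condition parameters, their weights, and the condition criteria) correspond to the `scores` and `weights` inputs of such a formula.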

  6. 24 CFR 990.270 - Asset management.

    Science.gov (United States)

    2010-04-01

    ... 24 Housing and Urban Development 4 2010-04-01 2010-04-01 false Asset management. 990.270 Section... THE PUBLIC HOUSING OPERATING FUND PROGRAM Asset Management § 990.270 Asset management. As owners, PHAs have asset management responsibilities that are above and beyond property management activities. These...

  7. [ASSET experience in China]

    International Nuclear Information System (INIS)

    Zhang Shanming

    1996-01-01

    The ASSET philosophy for the prevention of nuclear safety incidents is being implemented in our nuclear power plant, as in other nuclear power plants internationally, and the in-depth analysis of operational events in order to find and eliminate root causes is considered a priority in the plant's safety management. Some observations made during the implementation of the ASSET philosophy and the ASSET approach in our nuclear power plant are discussed.

  8. [ASSET experience in China]

    Energy Technology Data Exchange (ETDEWEB)

    Shanming, Zhang [Dayabay NPP (China)

    1997-12-31

    The ASSET philosophy for the prevention of nuclear safety incidents is being implemented in our nuclear power plant, as in other nuclear power plants internationally, and the in-depth analysis of operational events in order to find and eliminate root causes is considered a priority in the plant's safety management. Some observations made during the implementation of the ASSET philosophy and the ASSET approach in our nuclear power plant are discussed.

  9. Asset Opacity and Liquidity

    OpenAIRE

    Stenzel, A.; Wagner, W.B.

    2013-01-01

    Abstract: We consider a model of private information acquisition in which the cost of information depends on an asset's opacity. The model generates a hump-shaped relationship between opacity and the equilibrium amount of private information. In particular, the incentives to acquire information are largest for assets of intermediate opacity; such assets hence display low liquidity in the secondary market due to adverse selection. We also show that costly information acquisition generates ince...

  10. Competitive Procurement and Asset Specificity

    NARCIS (Netherlands)

    Sorana, V.

    2003-01-01

    This paper studies the effects of asset specificity on the performance of procurement auctions with subcontracting and asset sales.The analysis highlights the role of several asset features like transfer costs, type of alternative uses and maintenance requirements.It is argued that, if bargaining

  11. Modeling multiple visual words assignment for bag-of-features based medical image retrieval

    KAUST Repository

    Wang, Jim Jing-Yan

    2012-01-01

    In this paper, we investigate the bag-of-features based medical image retrieval methods, which represent an image as a collection of local features, such as image patches and key points with SIFT descriptors. To improve the bag-of-features method, we first model the assignment of local descriptors as contribution functions, and then propose a new multiple assignment strategy. By assuming the local feature can be reconstructed by its neighboring visual words in the vocabulary, we solve the reconstruction weights as a QP problem and then use the solved weights as contribution functions, which results in a new assignment method called the QP assignment. We carry out our experiments on the ImageCLEFmed datasets. Experiments' results show that our proposed method exceeds the performance of traditional solutions and works well for bag-of-features based medical image retrieval tasks.
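
    The multiple-assignment idea can be illustrated with a simplified soft assignment: each local descriptor contributes to its k nearest visual words with weights that sum to one. Note this sketch uses inverse-distance weights as a stand-in; the paper instead obtains the weights by solving a reconstruction QP.

```python
# Simplified multiple assignment for a bag-of-features histogram:
# each descriptor spreads unit mass over its k nearest visual words.
# Inverse-distance weighting replaces the paper's QP-solved weights.

def soft_assign(descriptor, vocabulary, k=2):
    """Return a histogram contribution over the vocabulary for one descriptor."""
    dists = [(sum((d - w) ** 2 for d, w in zip(descriptor, word)) ** 0.5, i)
             for i, word in enumerate(vocabulary)]
    dists.sort()                       # k nearest visual words first
    nearest = dists[:k]
    inv = [1.0 / (d + 1e-9) for d, _ in nearest]
    total = sum(inv)
    hist = [0.0] * len(vocabulary)
    for (_, i), w in zip(nearest, inv):
        hist[i] = w / total            # contributions sum to 1
    return hist
```

    Summing these per-descriptor contributions over all local features of an image yields the soft bag-of-features histogram used for retrieval.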

  12. Modeling multiple visual words assignment for bag-of-features based medical image retrieval

    KAUST Repository

    Wang, Jim Jing-Yan; Almasri, Islam

    2012-01-01

    In this paper, we investigate the bag-of-features based medical image retrieval methods, which represent an image as a collection of local features, such as image patches and key points with SIFT descriptors. To improve the bag-of-features method, we first model the assignment of local descriptors as contribution functions, and then propose a new multiple assignment strategy. By assuming the local feature can be reconstructed by its neighboring visual words in the vocabulary, we solve the reconstruction weights as a QP problem and then use the solved weights as contribution functions, which results in a new assignment method called the QP assignment. We carry out our experiments on the ImageCLEFmed datasets. Experiments' results show that our proposed method exceeds the performance of traditional solutions and works well for bag-of-features based medical image retrieval tasks.

  13. Macroeconomic Dynamics of Assets, Leverage and Trust

    Science.gov (United States)

    Rozendaal, Jeroen C.; Malevergne, Yannick; Sornette, Didier

    A macroeconomic model based on the economic variables (i) assets, (ii) leverage (defined as debt over asset) and (iii) trust (defined as the maximum sustainable leverage) is proposed to investigate the role of credit in the dynamics of economic growth, and how credit may be associated with both economic performance and confidence. Our first notable finding is the mechanism of reward/penalty associated with patience, as quantified by the return on assets. In regular economies where the EBITA/Assets ratio is larger than the cost of debt, starting with a trust higher than leverage results in the highest long-term return on assets (which can be seen as a proxy for economic growth). Therefore, patient economies that first build trust and then increase leverage are positively rewarded. Our second main finding concerns a recommendation for the reaction of a central bank to an external shock that negatively affects economic growth. We find that late policy intervention in the model economy results in the highest long-term return on assets. However, this comes at the cost of suffering longer from the crisis until the intervention occurs. The phenomenon that late intervention is most effective to attain a high long-term return on assets can be ascribed to the fact that postponing intervention allows trust to increase first, and it is most effective to intervene when trust is high. These results are derived from two fundamental assumptions underlying our model: (a) trust tends to increase when it is above leverage; (b) economic agents learn optimally to adjust debt for a given level of trust and amount of assets. Using a Markov Switching Model for the EBITA/Assets ratio, we have successfully calibrated our model to the empirical data of the return on equity of the EURO STOXX 50 for the time period 2000-2013. We find that the dynamics of leverage and trust can be highly non-monotonic, with curved trajectories, as a result of the nonlinear coupling between the variables. This
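
    The two stated assumptions can be captured in a deliberately minimal toy iteration, purely to illustrate the coupling: trust drifts upward while it exceeds leverage and downward otherwise, and leverage is adjusted toward trust (the maximum sustainable leverage). The functional forms and rates below are illustrative assumptions, not the paper's calibrated model.

```python
# Toy dynamic for the two model assumptions:
# (a) trust grows while trust > leverage, shrinks otherwise;
# (b) agents adjust leverage toward the current level of trust.
# Rates k_trust and k_debt are illustrative, not calibrated values.

def step(leverage, trust, k_trust=0.05, k_debt=0.2):
    """One period of the toy leverage/trust dynamic."""
    drift = 1.0 if trust > leverage else -1.0
    trust_next = trust + k_trust * drift * trust
    leverage_next = leverage + k_debt * (trust - leverage)
    return leverage_next, trust_next
```

    Starting with trust above leverage, trust keeps rising and leverage chases it, which mirrors the "patient economy" trajectory rewarded in the paper's regular-economy regime.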

  14. A multidisciplinary, expert-based approach for the identification of lifetime impacts in asset life cycle management

    NARCIS (Netherlands)

    Ruitenburg, Richard Jacob; Braaksma, Anne Johannes Jan; van Dongen, Leonardus Adriana Maria

    2014-01-01

    Everyday our lives are dependent on countless physical structures. These assets represent an enormous value for their owners and for society at large. To grasp the full potential of these assets, a deep and thorough understanding of an asset's complete lifetime is needed. Problems with data

  15. Midwifery students' evaluation of team-based academic assignments involving peer-marking.

    Science.gov (United States)

    Parratt, Jenny A; Fahy, Kathleen M; Hastie, Carolyn R

    2014-03-01

    Midwives should be skilled team workers in maternity units and in group practices. Poor teamwork skills are a significant cause of adverse maternity care outcomes. Despite Australian and international regulatory requirements that all midwifery graduates be competent in teamwork, the systematic teaching and assessment of teamwork skills is lacking in higher education. How do midwifery students evaluate participation in team-based academic assignments, which include giving and receiving peer feedback? First and third year Bachelor of Midwifery students who volunteered (24 of 56 students). Participatory Action Research with data collection via anonymous online surveys. There was general agreement that team-based assignments: (i) should have peer-marking, (ii) help clarify what is meant by teamwork, (iii) develop communication skills, (iv) promote student-to-student learning. Third year students strongly agreed that teams: (i) are valuable preparation for teamwork in practice, (ii) help meet Australian midwifery competency 8, and (iii) were enjoyable. The majority of third year students agreed with statements that their teams were effectively coordinated and team members shared responsibility for work equally; first year students strongly disagreed with these statements. Students' qualitative comments substantiated and expanded on these findings. The majority of students valued teacher feedback on well-developed drafts of the team's assignment prior to marking. Based on these findings we changed practice and created more clearly structured team-based assignments with specific marking criteria. We are developing supporting lessons to teach specific teamwork skills: together these resources are called "TeamUP". TeamUP should be implemented in all pre-registration Midwifery courses to foster students' teamwork skills and readiness for practice. Copyright © 2013 Australian College of Midwives. Published by Elsevier Ltd. All rights reserved.

  16. Financing Asset Sales and Business Cycles

    OpenAIRE

    Arnold, Marc; Hackbarth, Dirk; Puhan, Tatjana-Xenia

    2013-01-01

    This paper analyzes the decision of firms to sell assets to fund investments (financing asset sales). For a sample of U.S. manufacturing firms during the 1971-2010 period, we document new stylized facts about financing asset sales that cannot be explained by traditional motives for selling assets, such as financial distress or financing constraints. Using a structural model of financing, investment, and macroeconomic risk, we show that financing asset sales attenuate the debt overhang problem...

  17. Optimizing fixed observational assets in a coastal observatory

    Science.gov (United States)

    Frolov, Sergey; Baptista, António; Wilkin, Michael

    2008-11-01

    Proliferation of coastal observatories necessitates an objective approach to managing observational assets. In this article, we used our experience in the coastal observatory for the Columbia River estuary and plume to identify and address common problems in managing fixed observational assets, such as salinity, temperature, and water level sensors attached to pilings and moorings. Specifically, we addressed the following problems: assessing the quality of an existing array, adding stations to an existing array, removing stations from an existing array, validating an array design, and targeting an array toward data assimilation or monitoring. Our analysis was based on a combination of methods from the oceanographic and statistical literature, mainly on the statistical machinery of the best linear unbiased estimator. The key information required for our analysis was the covariance structure for a field of interest, which was computed from the output of assimilated and non-assimilated models of the Columbia River estuary and plume. The network optimization experiments in the Columbia River estuary and plume proved to be successful, largely withstanding the scrutiny of sensitivity and validation studies, and hence providing valuable insight into the optimization and operation of the existing observational network. Our success in the Columbia River estuary and plume suggests that algorithms for optimal placement of sensors are reaching maturity and are likely to play a significant role in the design of emerging ocean observatories, such as the United States' Ocean Observatories Initiative (OOI) and Integrated Ocean Observing System (IOOS) observatories, and smaller regional observatories.
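
    The covariance-based machinery can be illustrated with a tiny station-selection sketch: given a field covariance matrix, pick the candidate site whose (noise-free) observation most reduces the average error variance, using the best linear unbiased estimator update for a single observation. The toy covariance matrix and the single-observation simplification are assumptions for illustration; the study works with model-derived covariances and richer designs.

```python
# Covariance-based array design, single-observation BLUE update:
# observing site i reduces the variance at field point j to
#   cov[j][j] - cov[i][j]**2 / cov[i][i]
# Pick the site that minimizes the resulting average variance.

def best_station(cov):
    """Index of the candidate site whose observation is most informative."""
    n = len(cov)

    def avg_posterior_var(i):
        return sum(cov[j][j] - cov[i][j] ** 2 / cov[i][i]
                   for j in range(n)) / n

    return min(range(n), key=avg_posterior_var)
```

    Repeating this greedily on the updated covariance gives a simple heuristic for adding stations to an existing array; removing stations can be scored with the reverse question.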

  18. Japanese views on ASSET

    Energy Technology Data Exchange (ETDEWEB)

    Hirano, Masashi [Department of Reactor Safety Research, Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan)

    1997-12-31

    In general, the ASSET has had a positive effect on the enhancement of operating experience feedback. The ASSET has played an important role in supplying information to the IAEA Extra Budgetary Program. However, this role has come to an end, since the needs for safety upgrading have been identified and prioritized. For ASSET missions in the future, linkage among the various safety missions should be sought in order to avoid duplication and to enhance the effective usage of a limited budget and human resources.

  19. Modifying Older Adults’ Daily Sedentary Behaviour Using an Asset-based Solution: Views from Older Adults

    Directory of Open Access Journals (Sweden)

    Dawn A Skelton

    2016-08-01

    Full Text Available Objective: There is a growing public health focus on the promotion of successful and active ageing. Interventions to reduce sedentary behaviour (SB) in older adults are feasible and are improved by tailoring to individuals’ context and circumstances. SB is ubiquitous; therefore part of the tailoring process is to ensure individuals’ daily sedentary routine can be modified. The aim of this study was to understand the views of older adults and identify important considerations when creating a solution to modify daily sedentary patterns. Method: This was a qualitative research study. Fifteen older adult volunteers (mean age = 78 years) participated in 1 of 4 focus groups to identify solutions to modify daily sedentary routine. Two researchers conducted the focus groups whilst a third took detailed fieldnotes on a flipchart to member check the findings. Data were recorded and analysed thematically. Results: Participants wanted a solution with a range of options which could be tailored to individual needs and circumstances. The strategy suggested was to use the activities of daily routine and the reasons why individuals already naturally interrupt their SB, collectively framed as assets. These assets were categorised into 5 sub-themes: physical assets (eg. standing up to reduce stiffness); psychological assets (eg. standing up to reduce feelings of guilt); interpersonal assets (eg. standing up to answer the phone); knowledge assets (eg. standing up due to knowing the benefits of breaking SB) and activities of daily living assets (eg. standing up to get a drink). Conclusion: This study provides important considerations from older adults’ perspectives to modify their daily sedentary patterns. The assets identified by participants could be used to co-create a tailored intervention with older adults to reduce SB, which may increase effectiveness and adherence.

  20. Modifying Older Adults' Daily Sedentary Behaviour Using an Asset-based Solution: Views from Older Adults.

    Science.gov (United States)

    Leask, Calum F; Sandlund, Marlene; Skelton, Dawn A; Tulle, Emmanuelle; Chastin, Sebastien Fm

    2016-01-01

    There is a growing public health focus on the promotion of successful and active ageing. Interventions to reduce sedentary behaviour (SB) in older adults are feasible and are improved by tailoring to individuals' context and circumstances. SB is ubiquitous; therefore part of the tailoring process is to ensure individuals' daily sedentary routine can be modified. The aim of this study was to understand the views of older adults and identify important considerations when creating a solution to modify daily sedentary patterns. This was a qualitative research study. Fifteen older adult volunteers (mean age = 78 years) participated in 1 of 4 focus groups to identify solutions to modify daily sedentary routine. Two researchers conducted the focus groups whilst a third took detailed fieldnotes on a flipchart to member check the findings. Data were recorded and analysed thematically. Participants wanted a solution with a range of options which could be tailored to individual needs and circumstances. The strategy suggested was to use the activities of daily routine and the reasons why individuals already naturally interrupt their SB, collectively framed as assets. These assets were categorised into 5 sub-themes: physical assets (eg. standing up to reduce stiffness); psychological assets (eg. standing up to reduce feelings of guilt); interpersonal assets (eg. standing up to answer the phone); knowledge assets (eg. standing up due to knowing the benefits of breaking SB) and activities of daily living assets (eg. standing up to get a drink). This study provides important considerations from older adults' perspectives to modify their daily sedentary patterns. The assets identified by participants could be used to co-create a tailored intervention with older adults to reduce SB, which may increase effectiveness and adherence.

  1. Making sense in asset markets: Strategies for Implicit Organizations

    Directory of Open Access Journals (Sweden)

    Johannes M. Lehner

    2015-12-01

    Full Text Available While asset markets are traditionally left to economic inquiry, the paper shows that there is both a legal possibility and an incentive for organizing within such markets and for exercising market share-based strategic maneuvering. It proposes, based on sensemaking theory, Implicit Organizations in asset markets to exploit equivocality for momentum trading strategies. An Implicit Organization fulfills the criteria of an organization, while maintaining the image of a perfect market. Its members coordinate via market signals and fixed investment time windows to ensure positive returns to strategic maneuvering in asset markets. In support of hypotheses derived from sensemaking theory, results of empirical studies from two different investment contexts (Xetra and NYSE) provide evidence that equivocal analysts’ recommendations predict investment returns after a fixed time period.

  2. Consumption-based macroeconomic models of asset pricing theory

    Directory of Open Access Journals (Sweden)

    Đorđević Marija

    2016-01-01

    Full Text Available The family of consumption-based asset pricing models yields a stochastic discount factor proportional to the marginal rate of intertemporal substitution of consumption. In examining the empirical performance of this class of models, several puzzles are discovered. In this literature review we present the canonical model, the corresponding empirical tests, and different extensions to this model that propose a resolution of these puzzles.

  3. RFID Application of Smart Grid for Asset Management

    Directory of Open Access Journals (Sweden)

    Xiwei Wang

    2013-01-01

Full Text Available RFID technology research has resolved practical application issues in the power industry such as asset management, working-environment control, and vehicle networking. It also provides technical reserves for the convergence of ERP and CPS. With the development of RFID and location-based services technology, RFID is converging with a variety of sensing, communication, and information technologies. Indoor positioning applications are under rapid development. Micromanaging the environment of assets is a useful practice combining RFID and positioning. In this paper, a model for RFID applications is analyzed for the microenvironment management of data centers and electric vehicle batteries, and an optimization scheme for enterprise asset management is also proposed.

  4. EFFICIENCY OF CURRENCY ASSET CLASSES

    Directory of Open Access Journals (Sweden)

    Mohammad R. Safarzadeh

    2013-04-01

Full Text Available Analyzing the risk and return for the S&P Currency Index Arbitrage and the Merk Absolute Return Currency Fund, this study intends to determine whether currency asset classes are worthwhile investments. To determine where the efficient currency portfolios lie in the risk and return spectrum, this paper compares the two portfolios to fixed income and equity asset portfolios. The results lead to a baffling conclusion that, in general, the returns to low-risk currency asset portfolios are higher than those of equity asset portfolios of the same risk level.
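The kind of risk-return comparison described can be illustrated with a small, self-contained sketch that computes the mean return, volatility, and annualized Sharpe ratio of two portfolios. The monthly return series below are illustrative assumptions, not data from the study:

```python
# Hedged sketch: comparing portfolios by volatility and risk-adjusted
# return. The return series are made-up monthly figures for illustration.
from statistics import mean, stdev

def sharpe(returns, rf=0.0):
    """Annualized Sharpe ratio from monthly returns (12 periods/year)."""
    excess = [r - rf for r in returns]
    return mean(excess) / stdev(excess) * 12 ** 0.5

currency = [0.004, 0.006, 0.002, 0.005, 0.003, 0.004]   # low-risk currency portfolio
equity   = [0.02, -0.03, 0.04, -0.01, 0.03, -0.02]      # equity portfolio

for name, rets in (("currency", currency), ("equity", equity)):
    print(f"{name}: mean={mean(rets):.4f} sd={stdev(rets):.4f} sharpe={sharpe(rets):.2f}")
```

With these illustrative numbers the currency series has a far lower standard deviation and therefore a much higher Sharpe ratio, mirroring the paper's qualitative conclusion.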

  5. A rural transit asset management system

    Science.gov (United States)

    2005-01-01

This report describes the research undertaken to create an interactive, geographic information system based asset management system for the Alabama Department of Transportation to manage vehicles purchased and operated through Section 5310 and 5311 federal gr...

  6. Fair value versus historical cost-based valuation for biological assets: predictability of financial information

    Directory of Open Access Journals (Sweden)

    Josep M. Argilés

    2011-08-01

This paper performs an empirical study with a sample of Spanish farms valuing biological assets at HC and a sample applying FV, finding no significant differences between the two valuation methods in assessing future cash flows. However, most tests reveal greater predictive power for future earnings under fair valuation of biological assets, which is not explained by differences in the volatility of earnings and profitability. The study also provides evidence of flawed HC accounting practices for biological assets in agriculture, which suggests scarce information content of this valuation method in the predominantly small business units existing in the agricultural sector in advanced Western countries.

  7. Asset prices and priceless assets

    NARCIS (Netherlands)

    Penasse, J.N.G.

    2014-01-01

    The doctoral thesis studies several aspects of asset returns dynamics. The first three chapters focus on returns in the fine art market. The first chapter provides evidence for the existence of a slow-moving fad component in art prices that induces short-term return predictability. The article has

  8. Flow Oriented Channel Assignment for Multi-radio Wireless Mesh Networks

    Directory of Open Access Journals (Sweden)

    Niu Zhisheng

    2010-01-01

Full Text Available We investigate channel assignment for a multichannel wireless mesh network backbone, where each router is equipped with multiple interfaces. Of particular interest is the development of channel assignment heuristics for multiple flows. We present an optimization formulation and then propose two iterative flow-oriented heuristics for the conflict-free and interference-aware cases, respectively. To maximize the aggregate useful end-to-end flow rates, both algorithms identify and resolve congestion at the instantaneous bottleneck link in each iteration. The link rate is then optimally allocated among the contending flows that share this link by solving a linear programming (LP) problem. A thorough performance evaluation is undertaken as a function of the number of channels, the number of interfaces per node, and the number of contending flows. The performance of our algorithm is shown to be significantly superior to the best known algorithm in its class in multichannel, limited-radio scenarios.
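The bottleneck-resolution step can be sketched without the full LP formulation: once a congested link is identified, its rate must be divided among the flows that share it. The paper solves a linear program for this; the water-filling max-min fair split below is a simplified stand-in, with made-up capacities and demands:

```python
def max_min_allocate(capacity, demands):
    """Water-filling sketch: split a link's capacity among contending
    flows so no flow exceeds its demand and spare rate is shared equally.
    (A simplified stand-in for the paper's LP-based rate allocation.)"""
    alloc = dict.fromkeys(demands, 0.0)
    active = set(demands)
    remaining = capacity
    while active:
        share = remaining / len(active)
        # flows whose residual demand fits within the equal share are satisfied
        done = [f for f in active if demands[f] - alloc[f] <= share]
        if not done:
            for f in active:              # remaining flows are rate-limited equally
                alloc[f] += share
            break
        for f in done:
            remaining -= demands[f] - alloc[f]
            alloc[f] = demands[f]
            active.remove(f)
    return alloc

print(max_min_allocate(10.0, {"f1": 2.0, "f2": 4.0, "f3": 10.0}))
# → {'f1': 2.0, 'f2': 4.0, 'f3': 4.0}
```

Flows f1 and f2 are demand-limited; the leftover 4 units go to f3, which is bottlenecked by the link.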

  9. Application of a Dynamic Programming Algorithm for Weapon Target Assignment

    Science.gov (United States)

    2016-02-01

    evaluation and weapon assignment in maritime combat scenarios. Lloyd also acts as a liaison for the Weapons and Combat Systems Division with the ANZAC...positively identified a number of targets as threats, whether they are an enemy ship (i.e., specifically, its weapon launcher systems) or a directed

  10. Algorithmic approach to diagram techniques

    International Nuclear Information System (INIS)

    Ponticopoulos, L.

    1980-10-01

    An algorithmic approach to diagram techniques of elementary particles is proposed. The definition and axiomatics of the theory of algorithms are presented, followed by the list of instructions of an algorithm formalizing the construction of graphs and the assignment of mathematical objects to them. (T.A.)

  11. AUTOMATING ASSET KNOWLEDGE WITH MTCONNECT.

    Science.gov (United States)

    Venkatesh, Sid; Ly, Sidney; Manning, Martin; Michaloski, John; Proctor, Fred

    2016-01-01

In order to maximize assets, manufacturers should use real-time knowledge garnered from ongoing and continuous collection and evaluation of factory-floor machine status data. In discrete parts manufacturing, factory machine monitoring has been difficult, due primarily to closed, proprietary automation equipment that makes integration difficult. Recently, there has been a push in applying the data acquisition concepts of MTConnect to the real-time acquisition of machine status data. MTConnect is an open, free specification aimed at overcoming the "Islands of Automation" dilemma on the shop floor. With automated asset analysis, manufacturers can improve production to become lean, efficient, and effective. The focus of this paper will be on the deployment of MTConnect to collect real-time machine status to automate asset management. In addition, we will leverage the ISO 22400 standard, which defines an asset and quantifies asset performance metrics. In conjunction with these goals, the deployment of MTConnect in a large aerospace manufacturing facility will be studied with emphasis on asset management and understanding the impact of machine Overall Equipment Effectiveness (OEE) on manufacturing.

  12. Comparison of greedy algorithms for α-decision tree construction

    KAUST Repository

    Alkhalid, Abdulaziz; Chikalov, Igor; Moshkov, Mikhail

    2011-01-01

A comparison of different heuristics used by greedy algorithms that construct approximate decision trees (α-decision trees) is presented. The comparison is conducted using decision tables based on 24 data sets from the UCI Machine Learning Repository [2]. Complexity of decision trees is estimated relative to several cost functions: depth, average depth, number of nodes, number of nonterminal nodes, and number of terminal nodes. Costs of trees built by greedy algorithms are compared with minimum costs calculated by an algorithm based on dynamic programming. The results of experiments assign to each cost function a set of potentially good heuristics that minimize it. © 2011 Springer-Verlag.
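To make "heuristics used by greedy algorithms" concrete, here is a toy sketch that scores candidate splits of a small decision table under two impurity measures a greedy tree builder might use. The table and the two measures are illustrative assumptions, not the paper's heuristics or its 24 UCI data sets:

```python
# Toy sketch: two split-selection heuristics for a greedy decision tree
# builder, applied to one tiny (illustrative) decision table.
from collections import Counter
from math import log2

def entropy(labels):
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def misclass(labels):
    """Number of rows a majority-vote leaf would misclassify."""
    return len(labels) - max(Counter(labels).values())

def score_split(rows, labels, attr, impurity):
    """Total impurity of the groups produced by splitting on `attr`."""
    groups = {}
    for row, y in zip(rows, labels):
        groups.setdefault(row[attr], []).append(y)
    return sum(impurity(g) for g in groups.values())

rows = [{"a": 0, "b": 0}, {"a": 0, "b": 1}, {"a": 1, "b": 0}, {"a": 1, "b": 1}]
labels = [0, 0, 1, 1]   # the label equals attribute "a"

for h in (entropy, misclass):
    best = min(("a", "b"), key=lambda attr: score_split(rows, labels, attr, h))
    print(f"{h.__name__} picks attribute {best!r}")
```

Here both heuristics agree (splitting on "a" yields pure groups); the paper's point is that on realistic tables different heuristics can prefer different splits, and hence produce trees of different cost.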

  13. Generalised Assignment Matrix Methodology in Linear Programming

    Science.gov (United States)

    Jerome, Lawrence

    2012-01-01

    Discrete Mathematics instructors and students have long been struggling with various labelling and scanning algorithms for solving many important problems. This paper shows how to solve a wide variety of Discrete Mathematics and OR problems using assignment matrices and linear programming, specifically using Excel Solvers although the same…

  14. Formularity: Software for Automated Formula Assignment of Natural and Other Organic Matter from Ultrahigh-Resolution Mass Spectra

    Energy Technology Data Exchange (ETDEWEB)

    Tolic, Nikola; Liu, Yina; Liyu, Andrey V.; Shen, Yufeng; Tfaily, Malak M.; Kujawinski, Elizabeth B.; Longnecker, Krista; Kuo, Li-Jung; Robinson, Errol W.; Pasa Tolic, Ljiljana; Hess, Nancy J.

    2017-11-13

Ultrahigh-resolution mass spectrometry, such as Fourier transform ion-cyclotron resonance mass spectrometry (FT-ICR MS), can resolve thousands of molecular ions in complex organic matrices. A Compound Identification Algorithm (CIA) was previously developed for automated elemental formula assignment for natural organic matter (NOM). In this work we describe a user-friendly interface for CIA, titled Formularity, which adds the ability to search for formulas based on an Isotopic Pattern Algorithm (IPA). While CIA assigns elemental formulas for compounds containing C, H, O, N, S, and P, IPA is capable of assigning formulas for compounds containing other elements. We used halogenated organic compounds (HOC), a chemical class that is ubiquitous in natural as well as anthropogenic systems, as an example to demonstrate the capability of Formularity with IPA. A HOC standard mix was used to evaluate the identification confidence of IPA. HOC spikes in NOM and tap water were used to assess HOC identification in natural and anthropogenic matrices. Strategies for the reconciliation of CIA and IPA assignments are discussed. Software and sample databases with documentation are freely available from the PNNL OMICS software repository https://omics.pnl.gov/software/formularity.
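The core idea of elemental formula assignment can be sketched in a few lines: enumerate candidate CcHhNnOo compositions and keep those whose monoisotopic mass matches the measured mass within a ppm tolerance. This is a brute-force illustration of the concept only, not the CIA or IPA algorithms; the atom-count ranges are arbitrary assumptions:

```python
# Hedged sketch of mass-based formula assignment (CHNO only, brute force).
# Monoisotopic masses in daltons.
MASS = {"C": 12.0, "H": 1.00782503, "N": 14.00307401, "O": 15.99491462}

def assign(target, ppm=5.0, max_c=40, max_n=10, max_o=15):
    """Return (c, h, n, o) tuples whose mass is within `ppm` of `target`."""
    tol = target * ppm / 1e6
    hits = []
    for c in range(max_c):
        for n in range(max_n):
            for o in range(max_o):
                base = c * MASS["C"] + n * MASS["N"] + o * MASS["O"]
                h = round((target - base) / MASS["H"])   # best hydrogen count
                if h < 0:
                    continue
                mass = base + h * MASS["H"]
                if abs(mass - target) <= tol:
                    hits.append((c, h, n, o))
    return hits

print(assign(180.06339))   # neutral monoisotopic mass of glucose, C6H12O6
```

Real assigners such as CIA add chemical constraints (valence rules, element ratio limits) and IPA adds isotopic-pattern matching; this sketch shows only the mass-tolerance search that underlies them.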

  15. 2014 World Congress on Engineering Asset Management

    CERN Document Server

    Hoohlo, Changela; Mathew, Joe

    2015-01-01

Engineering asset management encompasses all types of engineered assets including the built environment, infrastructure, plant, equipment, hardware systems and components. Following the release of the ISO 5500x set of standards, the 9th WCEAM addresses the hugely important issue of what constitutes the body of knowledge in Engineering Asset Management. Topics discussed by Congress delegates are grouped into a number of tracks including strategies for investment and divestment of assets, operations and maintenance of assets, assessment of asset condition, risk and vulnerability, technologies and systems for the management of assets, standards, education, training and certification. These proceedings include a sample of the wide range of topics presented during the 9th World Congress on Engineering Asset Management in Pretoria, South Africa, 28-31 October 2014, and complement other emerging publications and standards that embrace the wide-ranging issues concerning the management of engineered physical assets.

  16. Simulation-Based Dynamic Passenger Flow Assignment Modelling for a Schedule-Based Transit Network

    Directory of Open Access Journals (Sweden)

    Xiangming Yao

    2017-01-01

Full Text Available The online operation management and the offline policy evaluation of complex transit networks require an effective dynamic traffic assignment (DTA) method that can capture the temporal-spatial nature of traffic flows. The objective of this work is to propose a simulation-based dynamic passenger assignment framework and models for such applications in the context of schedule-based rail transit systems. In the simulation framework, travellers are regarded as individual agents who are able to obtain complete information on current traffic conditions. A combined route selection model integrating pre-trip route selection and en-route route switching is established for achieving the dynamic network flow equilibrium status. Train agents operate strictly according to the timetable, and train capacity limitations are considered. A continuous time-driven simulator based on the proposed framework and models is developed, whose performance is illustrated through the large-scale network of the Beijing subway. The results indicate that more than 0.8 million individual passengers and thousands of trains can be simulated simultaneously at a speed ten times faster than real time. This study provides an efficient approach to analyze the dynamic demand-supply relationship for large schedule-based transit networks.
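The pre-trip route-selection step in simulators of this kind is commonly a logit choice model (the same family as Dial's algorithm mentioned earlier in these records): demand splits among candidate routes in proportion to exp(-theta * cost). A minimal sketch, with illustrative routes, costs, and an assumed dispersion parameter theta, follows:

```python
# Hedged sketch of a logit route-choice split for one origin-destination
# pair. Route names, costs, and theta are illustrative assumptions.
from math import exp

def logit_split(costs, theta=0.5):
    """Multinomial logit: route share proportional to exp(-theta * cost)."""
    w = {r: exp(-theta * c) for r, c in costs.items()}
    total = sum(w.values())
    return {r: wi / total for r, wi in w.items()}

routes = {"via_A": 32.0, "via_B": 35.0, "via_C": 40.0}  # generalized costs (min)
shares = logit_split(routes)
demand = 1000   # passengers departing in this interval
flows = {r: round(demand * p) for r, p in shares.items()}
print(shares, flows)
```

In a full simulator this split would be recomputed each time interval from simulated (congested, capacity-constrained) travel times, which is what drives the dynamic equilibrium described in the abstract.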

  17. Dukovany ASSET mission preparation

    International Nuclear Information System (INIS)

    Kouklik, I.

    1996-01-01

We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they would be helpful to other plants that host ASSET missions in the future

  18. Dukovany ASSET mission preparation

    Energy Technology Data Exchange (ETDEWEB)

Kouklik, I. [NPP Dukovany (Czech Republic)]

    1997-12-31

We are in the final stages of the Dukovany ASSET mission 1996 preparation. I would like to present some of our recent experiences. Maybe they would be helpful to other plants that host ASSET missions in the future.

  19. Implementace Asset managementu

    OpenAIRE

    Fuxa, Lukáš

    2016-01-01

This master's thesis proposes the implementation of Asset management in ServiceNow at an unnamed multinational company. The aim of the thesis is to analyse the company's requirements and to find a suitable way of implementing Asset management within the existing tools. In conclusion, the thesis assesses whether the selected tool can be used.

  20. Study Regarding the Financial Reporting of Intangible Assets. Case of Romanian Pharmaceutical Industry

    OpenAIRE

    Alina Gheorghe Ancuţa; Florentina Moisescu; Florina Varlanuta

    2017-01-01

The accounting treatment of intangible assets is a particularly complex and important issue for today's knowledge-based economy. For today's companies, these assets are drivers of success and an important factor in achieving competitive advantage. These assets are also an important part of the financial statements. With the increasing weight and importance of intangible assets, the need for financial information among financial statements' users has changed, and the current acc...

  1. 6th World Congress on Engineering Asset Management

    CERN Document Server

    Ni, Jun; Sarangapani, Jagnathan; Mathew, Joseph

    2014-01-01

This text represents state-of-the-art trends and developments in the emerging field of engineering asset management as presented at the Sixth World Congress on Engineering Asset Management (WCEAM) held in Cincinnati, OH, USA from October 3-5, 2011. The Proceedings of the WCEAM 2011 is an excellent reference for practitioners, researchers and students in the multidisciplinary field of asset management, covering topics such as: • Asset condition monitoring and intelligent maintenance • Asset data warehousing, data mining and fusion • Asset performance and level-of-service models • Design and lifecycle integrity of physical assets • Deterioration and preservation models for assets • Education and training in asset management • Engineering standards in asset management • Fault diagnosis and prognostics • Financial analysis methods for physical assets • Human dimensions in integrated asset management • Information quality management • Information systems and knowledge management • Intellig...

  2. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with FST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than FST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using FST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
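A minimal sketch of the FST-style baseline in this workflow: rank markers by a crude differentiation statistic (absolute allele-frequency difference here, standing in for FST), keep a top-k panel, and measure self-assignment accuracy with a naive allele-frequency likelihood. All data below are simulated with assumed frequencies; the study's random-forest variants are not reproduced:

```python
# Hedged toy sketch of marker ranking + self-assignment. Simulated data;
# |freq difference| stands in for F_ST, a naive likelihood for the test.
import math
import random

random.seed(1)
N_SNP, N_IND = 200, 40

p1 = [0.5] * N_SNP        # population A allele frequencies
p2 = [0.5] * N_SNP        # population B allele frequencies
for i in range(10):       # SNPs 0-9 are truly informative
    p1[i], p2[i] = 0.9, 0.1

def genotype(p):
    """Diploid genotype: allele count 0/1/2 at each SNP."""
    return [sum(random.random() < pi for _ in range(2)) for pi in p]

pops = {"A": [genotype(p1) for _ in range(N_IND)],
        "B": [genotype(p2) for _ in range(N_IND)]}

def freq(genos, i):
    return sum(g[i] for g in genos) / (2 * len(genos))

# Rank SNPs by absolute allele-frequency difference and keep a panel.
rank = sorted(range(N_SNP),
              key=lambda i: abs(freq(pops["A"], i) - freq(pops["B"], i)),
              reverse=True)
panel = rank[:20]

def loglik(g, genos):
    """Binomial log-likelihood of genotype g under a population's frequencies."""
    ll = 0.0
    for i in panel:
        p = min(max(freq(genos, i), 0.01), 0.99)
        ll += g[i] * math.log(p) + (2 - g[i]) * math.log(1 - p)
    return ll

correct = total = 0
for src, genos in pops.items():
    for g in genos:
        guess = max(pops, key=lambda k: loglik(g, pops[k]))
        correct += guess == src
        total += 1
print(f"panel of {len(panel)} SNPs: self-assignment accuracy {correct / total:.0%}")
```

The study's point is that replacing the ranking step with random-forest feature importance can reach the same accuracy with fewer markers; the assignment step would be unchanged.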

  3. Improvement of the methods for company’s fixed assets analysis

    Directory of Open Access Journals (Sweden)

    T. A. Zhurkina

    2018-01-01

Full Text Available Fixed assets are an integral component of the productive capacity of any enterprise. The financial results of the enterprise largely depend on the intensity and efficiency of their use. The analysis of fixed assets is usually carried out using an integrated and systematic approach, based on their availability, their movement, and the efficiency of their use (including their active part). In the opinion of some authors, the traditional methods of analyzing fixed assets have a number of shortcomings, since they do not take into account the life cycle of an enterprise, the ecological aspects of the operation of fixed assets, or the operational specifics of the individual divisions of a company and its branches. In order to improve the methodology for analyzing fixed assets, the authors propose using formalized and non-formalized criteria for analyzing the risks associated with fixed asset use. A survey questionnaire was designed to determine the likelihood of the risk of economic losses associated with the use of fixed assets. The authors propose using an integral indicator for analyzing the risk of using fixed assets in dynamics. In order to improve the audit procedure, the authors propose segregating economic transactions with fixed assets according to their cycles, in accordance with the stage of their reproduction. Operational analysis is important for managing the efficiency of fixed asset use, especially during a critical period. Analyzing the regularity of grain combine performance would reduce losses during harvesting, complete the work within a strictly defined time frame, and reward employees for high-quality and intensive performance of their tasks.

  4. Optimization of linear consecutive-k-out-of-n system with a Birnbaum importance-based genetic algorithm

    International Nuclear Information System (INIS)

    Cai, Zhiqiang; Si, Shubin; Sun, Shudong; Li, Caitao

    2016-01-01

The optimization of a linear consecutive-k-out-of-n (Lin/Con/k/n) system is to find an optimal component arrangement in which n components are assigned to n positions so as to maximize the system reliability. With the interchangeability of components in practical systems, the optimization of Lin/Con/k/n systems is becoming widely applied in engineering practice; it is also a typical component assignment problem studied by many researchers. This paper proposes a Birnbaum importance-based genetic algorithm (BIGA) to search for a near globally optimal solution for Lin/Con/k/n systems. First, the operation procedures and corresponding execution methods of BIGA are described in detail. Then, comprehensive simulation experiments are implemented on both small and large systems to evaluate the performance of the BIGA by comparing it with the Birnbaum importance-based two-stage approach and the Birnbaum importance-based genetic local search algorithm. Third, further experiments are provided to discuss the applicability of BIGA for Lin/Con/k/n systems with different k and n. Finally, a case study on an oil transportation system is implemented to demonstrate the application of BIGA to the optimization of a Lin/Con/k/n system. - Highlights: • BIGA integrates BI and GA to solve Lin/Con/k/n system optimization problems. • The experiment results show that the BIGA performs well in most conditions. • Suggestions are given for the application of BIGA and BITA with different k and n. • The application procedure of BIGA is demonstrated by the oil transportation system.
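The two ingredients of this problem can be sketched compactly: the reliability of a Lin/Con/k/n:F arrangement (the system fails iff at least k consecutive components fail) is computable by a small dynamic program over trailing-failure run lengths, and a permutation genetic algorithm can then search over arrangements. The GA below uses plain swap mutation and truncation selection, not the paper's Birnbaum importance-based operators; the component reliabilities are illustrative:

```python
# Hedged sketch: Lin/Con/k/n:F reliability (DP) + a plain permutation GA.
import random

def reliability(rel, k):
    """System fails iff >= k consecutive components fail.
    state[j] = P(system alive so far, trailing failure run of length j)."""
    state = [1.0] + [0.0] * (k - 1)
    for r in rel:
        new = [0.0] * k
        new[0] = r * sum(state)               # component works: run resets
        for j in range(k - 1):
            new[j + 1] = (1 - r) * state[j]   # component fails: run grows
        state = new                           # mass reaching run == k is lost
    return sum(state)

def ga(rel, k, pop=30, gens=200, seed=0):
    """Plain permutation GA: truncation selection + swap mutation."""
    rng = random.Random(seed)
    n = len(rel)
    fit = lambda perm: reliability([rel[i] for i in perm], k)
    population = [rng.sample(range(n), n) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fit, reverse=True)
        parents = population[: pop // 2]
        children = []
        for p in parents:
            child = p[:]
            i, j = rng.sample(range(n), 2)
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = parents + children
    return max(population, key=fit)

rels = [0.70, 0.75, 0.80, 0.85, 0.90, 0.95]
best = ga(rels, k=2)
print("arrangement:", best, "reliability:", round(reliability([rels[i] for i in best], 2), 6))
```

The paper's BIGA differs in that Birnbaum importance guides which positions are mutated, which is what makes the search effective on large n; the fitness evaluation is the same kind of reliability computation.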

  5. Modelling the Costs of Preserving Digital Assets

    DEFF Research Database (Denmark)

    Kejser, Ulla Bøgvad; Nielsen, Anders Bo; Thirifays, Alex

    2012-01-01

Information is increasingly being produced in digital form, and some of it must be preserved for the long term. Digital preservation includes a series of actively managed activities that require on-going funding. To obtain sufficient resources, there is a need for assessing the costs and the benefits accrued by preserving the assets. Cost data is also needed for optimizing activities and comparing the costs of different preservation alternatives. The purpose of this study is to analyse generic requirements for modelling the cost of preserving digital assets. The analysis was based

  6. The Evaluation of Company's Intangible Assets' influence for Business Value

    Directory of Open Access Journals (Sweden)

    Živilė Savickaitė

    2014-12-01

Full Text Available Mismeasurement of intangible assets may result in high costs for a company and the loss of its competitiveness and market position. Conventional evaluation methods cannot reliably identify intangible-intensive business value because of the specificity of such assets. Therefore, adjusting the business assessment process to be comprehensive and to include intangible asset valuation methods is critical: it allows companies to be evaluated better and increases the efficiency and quality of business management. The article states the importance of further scientific research in the areas of intangible value resources, business valuation, and intangible asset valuation methods and models. The creation of intangible assets at the firm level, and how they meet the changing needs of the company's owners, capital market investors, politicians and other interest groups in the intangible-intensive economy, should be analysed, as should how economic systems based on intangible assets operate. Special attention should also be given to strengthening the cooperation between scientific research and business. It is important to avoid duplicating guidelines, methods, models and systems for measuring intangible assets and valuing businesses, and to eliminate their disadvantages, in order to create and establish a universal system for effective intangible-intensive business valuation.

  7. Retrading, production, and asset market performance.

    Science.gov (United States)

    Gjerstad, Steven D; Porter, David; Smith, Vernon L; Winn, Abel

    2015-11-24

    Prior studies have shown that traders quickly converge to the price-quantity equilibrium in markets for goods that are immediately consumed, but they produce speculative price bubbles in resalable asset markets. We present a stock-flow model of durable assets in which the existing stock of assets is subject to depreciation and producers may produce additional units of the asset. In our laboratory experiments inexperienced consumers who can resell their units disregard the consumption value of the assets and compete vigorously with producers, depressing prices and production. Consumers who have first participated in experiments without resale learn to heed their consumption values and, when they are given the option to resell, trade at equilibrium prices. Reproducibility is therefore the most natural and most effective treatment for suppression of bubbles in asset market experiments.

  8. The 5C model: A new approach to asset integrity management

    International Nuclear Information System (INIS)

    Rahim, Yousif; Refsdal, Ingbjorn; Kenett, Ron S.

    2010-01-01

    As organizations grow more complex in operation and more global in scope, assets and technical integrity become key success factors. A company's asset integrity business process needs to be mapped in order to 1) provide a proper overview of operation and business processes, 2) identify all critical interfaces and 3) ensure that all gaps and overlaps in processes are eliminated. Achieving asset integrity requires companies to sustain their activities and identify the hazards, weaknesses and objectives of their strategic assets. Technical integrity emphasizes a complete overview of technical conditions and related information, and the ability of the companies to document the technical state of its assets. It is based on an integrated view of the current state of operations, and the identification of all critical interfaces, in order to ensure that all gaps and unnecessary overlaps in processes are eliminated. Companies look increasingly at their asset integrity management system as a means to extend the life of their assets, beyond the original design conditions and production capacity. Establishing an asset integrity management system requires the documentation of the company's technical integrity management, a strategy and the processes for carrying it out, identifying gaps; selecting corrective interventions and conducting follow up actions. The paper discusses various aspects of asset integrity management, including its planning and implementation. We begin with an introduction to asset technical integrity, provide some theoretical backgrounds, present a model we call 5C and conclude with a summary and discussion.

  9. The pricing of illiquidity and illiquid assets : Essays on empirical asset pricing

    NARCIS (Netherlands)

    Tuijp, Patrick

    2016-01-01

    This dissertation studies the pricing of liquidity and illiquid assets. For this thesis, liquidity will generally refer to the ease with which an asset can be traded. The first chapter investigates the role of the investment horizon in the impact of illiquidity on stock prices. We obtain a clientele

  10. Best practices in geographic information systems-based transportation asset management

    Science.gov (United States)

    2012-01-31

    This report provides background on GIS and asset management, describes how public agencies have been integrating the two, and identifies benefits and challenges to doing so. The information presented is gleaned from a literature review and interviews...

  11. INTANGIBLE ASSETS THROUGH THE COHESION POLICY

    Directory of Open Access Journals (Sweden)

Popescu (Stingaciu) Ana-Maria

    2012-07-01

Full Text Available Intangible assets in general and intellectual capital in particular are important to both society and organizations. It can be a source of competitive advantage for business and stimulate innovation that leads to wealth generation. Technological revolutions, the rise of the knowledge-based economy and the networked society have all led to the same conclusion that intangibles and how they contribute to value creation have to be appreciated so that the appropriate decisions can be made to protect and enhance them. The Cohesion Policy represents the main EU measure to ensure a balanced and sustainable growth in Europe by promoting harmonious development and reducing the regional disparities. The general objective of the paper is to highlight the important role of the Cohesion Policy in the development of intangible assets. The objectives and the instruments of the Cohesion Policy are designed to support programs on regional development, economic change, enhanced competitiveness and territorial cooperation through the European Union, to develop human resources and employability. Keywords: intangible assets, intellectual capital, Cohesion policy, development; JEL Classification: O43, G32, D24, O34

  12. Review of UK participation in ASSET activities 1995/96 for the annual workshop on ASSET experience

    International Nuclear Information System (INIS)

    Phipps, C.R.

    1996-01-01

    With the restructuring of the Nuclear Generation Industry in the UK over the last 12 months it has been difficult to provide support to international activities including ASSET. This is likely to continue for a further 12 months whilst consolidation of the privatised part of the industry takes place and Magnox Electric plc is merged with British Nuclear Fuels Ltd. Having made that statement I would confirm that the UK is fully supportive of the ASSET methodology and will continue to be a participant in as many ASSET activities as possible. It was noted that during 1995 ASSET completed its 100th mission and the UK would like to congratulate the staff in the IAEA on this achievement. Discussions are at present ongoing within Magnox Electric plc, regarding the possibility of hosting an ASSET Peer Review mission, at one of the UK's Magnox plants, in 1997/98. During the 1995/96 period the UK participated in a number of ASSET activities as detailed below

  13. Review of UK participation in ASSET activities 1995/96 for the annual workshop on ASSET experience

    Energy Technology Data Exchange (ETDEWEB)

    Phipps, C R

    1997-12-31

With the restructuring of the Nuclear Generation Industry in the UK over the last 12 months it has been difficult to provide support to international activities including ASSET. This is likely to continue for a further 12 months whilst consolidation of the privatised part of the industry takes place and Magnox Electric plc is merged with British Nuclear Fuels Ltd. Having made that statement I would confirm that the UK is fully supportive of the ASSET methodology and will continue to be a participant in as many ASSET activities as possible. It was noted that during 1995 ASSET completed its 100th mission and the UK would like to congratulate the staff in the IAEA on this achievement. Discussions are at present ongoing within Magnox Electric plc, regarding the possibility of hosting an ASSET Peer Review mission, at one of the UK's Magnox plants, in 1997/98. During the 1995/96 period the UK participated in a number of ASSET activities as detailed below.

  14. Optimization of the test intervals of a nuclear safety system by genetic algorithms, solution clustering and fuzzy preference assignment

    International Nuclear Information System (INIS)

    Zio, E.; Bazzo, R.

    2010-01-01

    In this paper, a procedure is developed for identifying a number of representative solutions manageable for decision-making in a multiobjective optimization problem concerning the test intervals of the components of a safety system of a nuclear power plant. Pareto Front solutions are identified by a genetic algorithm and then clustered by subtractive clustering into 'families'. On the basis of the decision maker's preferences, each family is then synthetically represented by a 'head of the family' solution. This is done by introducing a scoring system that ranks the solutions with respect to the different objectives: a fuzzy preference assignment is employed to this purpose. Level Diagrams are then used to represent, analyze and interpret the Pareto Fronts reduced to the head-of-the-family solutions
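The scoring idea can be sketched as follows: map each objective value of a Pareto solution to [0,1] with a triangular fuzzy membership, sum the memberships into a score, and pick the best-scoring solution in each cluster as the "head of the family". The clustering below is a trivial one-dimensional grouping rather than subtractive clustering, and all numbers (objectives, membership shapes, thresholds) are illustrative assumptions:

```python
# Hedged sketch of fuzzy-preference scoring of Pareto solutions.
def tri(x, lo, peak, hi):
    """Triangular fuzzy membership: 0 outside [lo, hi], 1 at peak."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

# Pareto solutions as (unavailability, cost) pairs -- illustrative numbers.
front = [(0.010, 9.0), (0.012, 7.5), (0.020, 6.0), (0.035, 5.0), (0.050, 4.5)]

def score(sol):
    u, c = sol
    # preference: unavailability best near 0.01, cost best near 4.5
    return tri(u, 0.0, 0.01, 0.06) + tri(c, 4.0, 4.5, 10.0)

# Trivial 1-D grouping into two "families" (stand-in for subtractive clustering).
clusters = {0: [], 1: []}
for sol in front:
    clusters[0 if sol[0] < 0.02 else 1].append(sol)

heads = {k: max(v, key=score) for k, v in clusters.items()}
print(heads)
```

Each family is thus reduced to one representative solution, which is what makes the reduced Pareto front manageable for the decision maker.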

  15. Trends in asset structure between not-for-profit and investor-owned hospitals.

    Science.gov (United States)

    Song, Paula H; Reiter, Kristin L

    2010-12-01

    The delivery of health care is a capital-intensive industry, and thus, hospital investment strategy continues to be an important area of interest for both health policy and research. Much attention has been given to hospitals' capital investment policies with relatively little attention to investments in financial assets, which serve an important role in not-for-profit (NFP) hospitals. This study describes and analyzes trends in aggregate asset structure between NFP and investor-owned (IO) hospitals during the post-capital-based prospective payment system implementation period, providing the first documentation of long-term trends in hospital investment. The authors find hospitals' aggregate asset structure differs significantly based on ownership, size, and profitability. For both NFP and IO hospitals, financial securities have remained consistent over time, while fixed asset representation has declined in IO hospitals.

  16. Trends in Asset Structure between Not-for-Profit and Investor Owned Hospitals

    Science.gov (United States)

    Song, Paula H.; Reiter, Kristin L.

    2010-01-01

    The delivery of health care is a capital intensive industry and thus hospital investment strategy continues to be an important area of interest for both health policy and research. Much attention has been given to hospitals’ capital investment policies with relatively little attention to investments in financial assets, which serve an important role in NFP hospitals. This study describes and analyzes trends in aggregate asset structure between NFP and IO hospitals during the post-capital based PPS implementation period, providing the first documentation of long-term trends in hospital investment. We find hospitals’ aggregate asset structure differs significantly based on ownership, size, and profitability. For both NFP and IO hospitals, financial securities have remained consistent over time, while fixed asset representation has declined in IO hospitals. PMID:20519429

  17. 18 CFR 346.3 - Asset retirement obligations.

    Science.gov (United States)

    2010-04-01

    ... related to asset retirement obligations that would impact the calculation of rate base, such as carrier property and related accumulated depreciation and accumulated deferred income taxes, may not be reflected...

  18. 18 CFR 35.18 - Asset retirement obligations.

    Science.gov (United States)

    2010-04-01

    ... related to asset retirement obligations that would impact the calculation of rate base, such as electric plant and related accumulated depreciation and accumulated deferred income taxes, may not be reflected...

  19. Asset life cycle plans: twelve steps to assist strategic decision-making in asset life cycle management

    NARCIS (Netherlands)

    Ruitenburg, Richard Jacob; Braaksma, Anne Johannes Jan; van Dongen, Leonardus Adriana Maria; Carnero, Maria Carmen; Gonzalez-Prida, Vicente

    2017-01-01

    Effective management of physical assets should deliver maximum business value. Therefore, Asset Management standards such as PAS 55 and ISO 55000 ask for a life cycle approach. However, most existing methods focus only on the short term of the asset's life or the estimation of its remaining life.

  20. Forecasting financial asset processes: stochastic dynamics via learning neural networks.

    Science.gov (United States)

    Giebel, S; Rainer, M

    2010-01-01

    Models for financial asset dynamics usually take into account their inherent unpredictable nature by including a suitable stochastic component into their process. Unknown (forward) values of financial assets (at a given time in the future) are usually estimated as expectations of the stochastic asset under a suitable risk-neutral measure. This estimation requires the stochastic model to be calibrated to some history of sufficient length in the past. Apart from inherent limitations, due to the stochastic nature of the process, the predictive power is also limited by the simplifying assumptions of the common calibration methods, such as maximum likelihood estimation and regression methods, often performed without weights on the historic time series, or with static weights only. Here we propose a novel method of "intelligent" calibration, using learning neural networks in order to dynamically adapt the parameters of the stochastic model. Hence we have a stochastic process with time-dependent parameters, the dynamics of the parameters being themselves learned continuously by a neural network. The backpropagation in training the previous weights is limited to a certain memory length (in the examples we consider 10 previous business days), which is similar to the maximal time lag of autoregressive processes. We demonstrate the learning efficiency of the new algorithm by tracking the next-day forecasts for each of the EUR-TRY and EUR-HUF exchange rates.
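
A minimal sketch of the rolling-window idea: a single drift parameter is re-fitted each day by one gradient step over the last 10 observations, standing in for the paper's learning network. The window length, learning rate, and synthetic return series are assumptions for illustration:

```python
import random

def rolling_calibration(returns, window=10, lr=0.5):
    """Re-fit a single drift parameter each day from the last `window`
    observations with one gradient step on squared error -- a crude
    stand-in for the paper's learning network (illustrative only)."""
    mu = 0.0
    history = []
    for r in returns:
        history.append(r)
        recent = history[-window:]
        # gradient of mean((r - mu)^2) w.r.t. mu, single step
        grad = -2.0 * sum(x - mu for x in recent) / len(recent)
        mu -= lr * grad
    return mu

random.seed(0)
daily = [0.01 + random.gauss(0.0, 0.002) for _ in range(100)]
mu_hat = rolling_calibration(daily)
```

With `lr=0.5` each step lands exactly on the window mean, so the parameter tracks the recent regime rather than the full history, which is the point of the limited memory length.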

  1. A Logical Deduction Based Clause Learning Algorithm for Boolean Satisfiability Problems

    Directory of Open Access Journals (Sweden)

    Qingshan Chen

    2017-01-01

    Full Text Available Clause learning is the key component of modern SAT solvers, while conflict analysis based on the implication graph is the mainstream technology to generate the learnt clauses. Whenever a clause in the clause database is falsified by the current variable assignments, the SAT solver will try to analyze the reason by using different cuts (i.e., the Unique Implication Points) on the implication graph. Those schemes reflect only the conflict in the current search subspace and do not capture the inherent conflicts in the remaining space. In this paper, we propose a new advanced clause learning algorithm based on conflict analysis and logical deduction, which reconstructs a linear logical deduction by analyzing the relationship of different decision variables between the backjumping level and the current decision level. The logical deduction result is then added into the clause database as a newly learnt clause. The resulting implementation in Minisat improves the state-of-the-art performance in SAT solving.
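
The core operation behind clause learning is resolution along the implication graph. The sketch below shows a single resolution step on a toy conflict; the clause encoding (signed integers) and the example clauses are illustrative, not the paper's linear-deduction scheme itself:

```python
def resolve(c1, c2, pivot):
    """Resolve two clauses (sets of signed ints; -x means NOT x) on
    `pivot`: requires pivot in c1 and its negation in c2."""
    assert pivot in c1 and -pivot in c2
    return (c1 - {pivot}) | (c2 - {-pivot})

# toy conflict: clause (NOT x1 OR NOT x2) is falsified while x2 was
# implied by the antecedent clause (x2 OR NOT x3), i.e. x3 -> x2
conflict = {-1, -2}
reason_x2 = {2, -3}
learnt = resolve(reason_x2, conflict, 2)   # (NOT x1 OR NOT x3)
```

Repeating such resolutions back to a unique implication point yields the standard 1UIP learnt clause; the paper's contribution is an additional logical deduction relating decision variables across the backjumping and current decision levels.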

  2. Essays on asset pricing

    NARCIS (Netherlands)

    Nazliben, Kamil

    2015-01-01

    The dissertation consists of three chapters that represent separate papers in the area of asset pricing. The first chapter studies investors optimal asset allocation problem in which mean reversion in stock prices is captured by explicitly modeling transitory and permanent shocks. The second chapter

  3. Ambiguity and Volatility : Asset Pricing Implications

    NARCIS (Netherlands)

    Pataracchia, B.

    2011-01-01

    Using a simple dynamic consumption-based asset pricing model, this paper explores the implications of a representative investor with smooth ambiguity averse preferences [Klibanoff, Marinacci and Mukerji, Econometrica (2005)] and provides a comparative analysis of risk aversion and ambiguity aversion.

  4. Evaluation on Optimal Scale of Rural Fixed-asset Investment-Based on Microcosmic Perspective of Farmers’ Income Increase

    Institute of Scientific and Technical Information of China (English)

    Jinqian DENG; Kangkang SHAN; Yan ZHANG

    2014-01-01

    The rural fundamental and productive fixed-asset investment not only actively influences the changes in farmers' operational, wage and property income, but it also has an optimal scale range for farmers' income increase. From the perspective of farmers' income increase, this article evaluates the optimal scale of rural fixed-asset investment by setting up a model with statistical data, and the results show that the optimal scale of per capita rural fixed-asset investment is 76.35% of the per capita net income of rural residents, which had already been reached in China in 2009. Therefore, compared with adding to rural fixed-asset investment, a better income-increase effect can be achieved through adjusting the structure of rural fixed-asset investment.

  5. Mutually catalyzed birth of population and assets in exchange-driven growth

    Science.gov (United States)

    Lin, Zhenquan; Ke, Jianhong; Ye, Gaoxiang

    2006-10-01

    We propose an exchange-driven aggregation growth model of population and assets with mutually catalyzed birth to study the interaction between the population and assets in their exchange-driven processes. In this model, monomer (or equivalently, individual) exchange occurs between any pair of aggregates of the same species (population or assets). The rate kernels of the exchanges of population and assets are K(k,l)=Kkl and L(k,l)=Lkl, respectively, at which one monomer migrates from an aggregate of size k to another of size l. Meanwhile, an aggregate of one species can yield a new monomer by the catalysis of an arbitrary aggregate of the other species. The rate kernel of asset-catalyzed population birth is I(k,l)=Ikl^μ [and that of population-catalyzed asset birth is J(k,l)=Jkl^ν], at which an aggregate of size k gains a monomer birth when it meets a catalyst aggregate of size l. The kinetic behaviors of the population and asset aggregates are solved based on the rate equations. The evolution of the aggregate size distributions of population and assets is found to fall into one of three categories for different parameters μ and ν: (i) population (asset) aggregates evolve according to the conventional scaling form in the case of μ⩽0 (ν⩽0), (ii) population (asset) aggregates evolve according to a modified scaling form in the case of ν=0 and μ>0 (μ=0 and ν>0), and (iii) both population and asset aggregates undergo gelation transitions at a finite time in the case of μ=ν>0.
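
The exchange part of the model can be illustrated with a tiny Monte Carlo simulation using the product kernel K(k,l)=Kkl. The catalyzed-birth terms are omitted for brevity, and the initial aggregate sizes and step count are arbitrary choices:

```python
import random

def exchange_step(sizes, rng):
    """One exchange event with product kernel K(k,l) = k*l: pick a
    donor/acceptor pair with probability proportional to k*l and
    migrate a single monomer (birth terms omitted for brevity)."""
    pairs = [(i, j, sizes[i] * sizes[j])
             for i in range(len(sizes))
             for j in range(len(sizes)) if i != j]
    total = sum(w for _, _, w in pairs)
    r = rng.random() * total
    for i, j, w in pairs:
        r -= w
        if r <= 0:
            break
    sizes[i] -= 1                       # monomer migrates i -> j
    sizes[j] += 1
    return [k for k in sizes if k > 0]  # empty aggregates vanish

rng = random.Random(1)
pop = [5, 5, 5, 5]
mass0 = sum(pop)
for _ in range(200):
    if len(pop) < 2:
        break
    pop = exchange_step(pop, rng)
```

Pure exchange conserves the total monomer number; it is the catalyzed-birth kernels Ikl^μ and Jkl^ν that make the totals grow and, for μ=ν>0, drive the gelation transition.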

  6. Seizure detection algorithms based on EMG signals

    DEFF Research Database (Denmark)

    Conradsen, Isa

    Background: the currently used non-invasive seizure detection methods are not reliable. Muscle fibers are directly connected to the nerves, whereby electric signals are generated during activity. Therefore, an alarm system based on electromyography (EMG) signals is a theoretical possibility. Objective...... on the amplitude of the signal. The other algorithm was based on information of the signal in the frequency domain, and it focused on synchronisation of the electrical activity in a single muscle during the seizure. Results: The amplitude-based algorithm reliably detected seizures in 2 of the patients, while...... the frequency-based algorithm was efficient for detecting the seizures in the third patient. Conclusion: Our results suggest that EMG signals could be used to develop an automatic seizure detection system. However, different patients might require different types of algorithms/approaches....
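
An amplitude-based detector of the kind described can be sketched as a threshold-plus-duration rule. The sampling rate, threshold, minimum duration, and synthetic signal below are assumptions for illustration, not values from the study:

```python
def detect_by_amplitude(signal, fs, threshold, min_dur=0.5):
    """Flag a seizure when the rectified EMG amplitude stays above
    `threshold` for at least `min_dur` seconds; returns the sample
    index where the qualifying run began, or None."""
    need = int(min_dur * fs)
    run = 0
    for t, x in enumerate(signal):
        run = run + 1 if abs(x) >= threshold else 0
        if run >= need:
            return t - need + 1
    return None

fs = 100                               # Hz (assumed)
quiet = [0.05] * 200                   # 2 s of low-amplitude baseline
burst = [1.2, -1.1] * 100              # 2 s of high-amplitude activity
onset = detect_by_amplitude(quiet + burst, fs, threshold=0.5)
```

The duration requirement is what keeps brief voluntary bursts from triggering the alarm; the frequency-domain algorithm mentioned above would instead examine synchronisation of activity within the muscle.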

  7. Incarceration and Household Asset Ownership.

    Science.gov (United States)

    Turney, Kristin; Schneider, Daniel

    2016-12-01

    A considerable literature documents the deleterious economic consequences of incarceration. However, little is known about the consequences of incarceration for household assets-a distinct indicator of economic well-being that may be especially valuable to the survival of low-income families-or about the spillover economic consequences of incarceration for families. In this article, we use longitudinal data from the Fragile Families and Child Wellbeing Study to examine how incarceration is associated with asset ownership among formerly incarcerated men and their romantic partners. Results, which pay careful attention to the social forces that select individuals into incarceration, show that incarceration is negatively associated with ownership of a bank account, vehicle, and home among men and that these consequences for asset ownership extend to the romantic partners of these men. These associations are concentrated among men who previously held assets. Results also show that post-incarceration changes in romantic relationships are an important pathway by which even short-term incarceration depletes assets.

  8. Intelligent tactical asset allocation support system

    NARCIS (Netherlands)

    Hiemstra, Y.

    1995-01-01

    This paper presents an advanced support system for Tactical Asset Allocation. Asset allocation explains over 90% of portfolio performance (Brinson, Hood and Beebower, 1988). Tactical asset allocation adjusts a strategic portfolio on the basis of short term market outlooks. The system includes

  9. Work at Forsmark since ASSET 1996

    Energy Technology Data Exchange (ETDEWEB)

    Loewenhielm, G; Andersson, O [Forsmark Kraftgrupp AB, Oesthammar (Sweden)

    1997-10-01

    The following directions of work at Forsmark since ASSET 1996 are briefly described: peer review follow-up; work related to peer review, Forsmark 2 mini-ASSET; MTO(man-technology-organization)-analysis method, concept development, combination of MTO and ASSET methods; Forsmark INES manual.

  10. Application of Securitization of Leasing Assets

    Directory of Open Access Journals (Sweden)

    Igor Viktorovich Linev

    2014-01-01

    Full Text Available Securitization of leasing assets has been widely adopted abroad in recent decades. It is usually understood as the process of forming a portfolio based on the future leasing payments of one or more leasing companies and selling securities to investors for the subsequent refinancing of leasing operations. These securities can be bonds, shares or bills, with the leased asset serving as their backing. The range of leased property includes office, medical (above all dental), training and video equipment, as well as cars and other motor vehicles, cellular communication towers, heavy machinery products and computers. The essence of securitization of leasing assets consists in isolating the streams of leasing payments from the risk of bankruptcy of the leasing company. Since this mechanism is most developed in the USA, examining the experience of its application in that country is especially relevant. Particular attention is paid to reducing the investor's credit risk, which is addressed through various forms of external and internal credit support. The participants' interest in the securitization of leasing assets lies in the distribution of risks among them, a new source of financing, a lower cost of attracted resources, higher liquidity of the leasing portfolio and better management of the enterprise's balance sheet. The appeal of this tool to a leasing company that has no available funds of its own for business development is of particular interest. Securitization allows the leasing company to expand its sources of capital and build a reserve for the future, as well as to broaden its sphere of activity and open new opportunities for project financing. Widespread introduction of securitization schemes into the practice of Russian leasing business requires development, and on some

  11. Valuing carbon assets for high-tech with application to the wind energy industry

    International Nuclear Information System (INIS)

    Han, Liyan; Liu, Yang; Lin, Qiang; Huang, Gubo

    2015-01-01

    In contrast to the traditional methods for high-tech evaluation, we introduce a new, more active idea for considering the carbon asset effect, in addition to the economic and technological considerations for strategic significance. The method proposed in this paper considers a reduced amount of carbon emissions, less than that of the current industry baseline, to be an asset that is beneficial to a firm that adopts a new technology. The measured carbon asset values vary across different technologies, in different industries and over time. The new method is applied to the valuing of wind energy technology and uses the Weibull distribution to estimate the wind energy capacity and a concrete sensitivity analysis. These applications support the validity of the new method and show that the impact of the fluctuations of carbon sinks on the values of carbon assets is significantly greater than that of volatility in the production output. The paper also presents some policy recommendations based on the results. - Highlights: • Carbon asset dimension for high-tech evaluation. • Valuing wind energy technology by Weibull distribution. • Greater impact of the carbon sink price on the carbon asset value than that of production output. • The environmental risk could be measured based on the carbon asset assessment.
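
The Weibull-based wind capacity estimate can be sketched by integrating an idealised turbine power curve against the Weibull wind-speed density. All turbine parameters (cut-in, rated, cut-out speeds, rated power) and the shape/scale values below are illustrative assumptions, not the paper's:

```python
import math

def weibull_pdf(v, k, c):
    """Weibull wind-speed density with shape k and scale c (m/s)."""
    return (k / c) * (v / c) ** (k - 1) * math.exp(-((v / c) ** k))

def expected_power(k, c, rated=2.0, v_in=3.0, v_rated=12.0, v_out=25.0,
                   dv=0.01):
    """Expected output (MW) of an idealised turbine: cubic ramp between
    cut-in and rated speed, constant at rated, zero past cut-out."""
    e, v = 0.0, 0.0
    while v < 40.0:                    # density is negligible beyond this
        if v_in <= v < v_rated:
            p = rated * ((v - v_in) / (v_rated - v_in)) ** 3
        elif v_rated <= v < v_out:
            p = rated
        else:
            p = 0.0
        e += p * weibull_pdf(v, k, c) * dv
        v += dv
    return e

p_avg = expected_power(k=2.0, c=8.0)
cap_factor = p_avg / 2.0
```

The expected output (or capacity factor) is the quantity from which avoided emissions, and hence the carbon asset value, would be derived; the paper's sensitivity analysis then varies the carbon sink price around such a baseline.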

  12. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for iterative reconstruction of nonnegative sparse signals using parity check matrices of low-density parity check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  13. ISO 55000: Creating an asset management system.

    Science.gov (United States)

    Bradley, Chris; Main, Kevin

    2015-02-01

    In the October 2014 issue of HEJ, Keith Hamer, group vice-president, Asset Management & Engineering at Sodexo, and marketing director at Asset Wisdom, Kevin Main, argued that the new ISO 55000 standards present facilities managers with an opportunity to create 'a joined-up, whole lifecycle approach' to managing and delivering value from assets. In this article, Kevin Main and Chris Bradley, who runs various asset management projects, examine the process of creating an asset management system.

  14. Assessment of Random Assignment in Training and Test Sets using Generalized Cluster Analysis Technique

    Directory of Open Access Journals (Sweden)

    Sorana D. BOLBOACĂ

    2011-06-01

    Full Text Available Aim: The appropriateness of the random assignment of compounds to training and validation sets was assessed using the generalized cluster technique. Material and Method: A quantitative Structure-Activity Relationship model using the Molecular Descriptors Family on Vertices was evaluated in terms of the assignment of carboquinone derivatives to training and test sets during the leave-many-out analysis. Assignment of compounds was investigated using five variables: observed anticancer activity and four structure descriptors. Generalized cluster analysis with the K-means algorithm was applied in order to investigate whether the assignment of compounds was proper. The Euclidean distance and maximization of the initial distance using a cross-validation with a v-fold of 10 were applied. Results: All five variables included in the analysis proved to have a statistically significant contribution to the identification of clusters. Three clusters were identified, each of them containing carboquinone derivatives belonging to the training as well as to the test sets. The observed activity of carboquinone derivatives proved to be normally distributed within every cluster. The presence of training and test sets in all clusters identified using generalized cluster analysis with the K-means algorithm, and the distribution of observed activity within clusters, support a proper assignment of compounds to training and test sets. Conclusion: Generalized cluster analysis using the K-means algorithm proved to be a valid method for assessing the random assignment of carboquinone derivatives to training and test sets.
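
The assessment can be sketched as: cluster the pooled compounds with K-means, then check that every cluster contains both training and test members. The 2-D toy data and the deterministic initialisation below are assumptions for illustration:

```python
def kmeans_assign(points, k=2, iters=10):
    """Lloyd's algorithm (minimal sketch); initialising with the first
    and last points is adequate for this well-separated toy data."""
    centres = [points[0], points[-1]]
    assign = [0] * len(points)
    for _ in range(iters):
        for i, p in enumerate(points):
            assign[i] = min(range(k),
                            key=lambda c: sum((a - b) ** 2
                                              for a, b in zip(p, centres[c])))
        for c in range(k):
            members = [points[i] for i in range(len(points)) if assign[i] == c]
            if members:
                centres[c] = tuple(sum(col) / len(members)
                                   for col in zip(*members))
    return assign

# two well-separated "compound" clouds; train/test labels alternate
feats = ([(0.1 * i, 0.0) for i in range(6)]
         + [(5.0 + 0.1 * i, 5.0) for i in range(6)])
labels = ['train' if i % 2 else 'test' for i in range(len(feats))]
assign = kmeans_assign(feats)
proper = all({labels[i] for i in range(len(feats)) if assign[i] == c}
             == {'train', 'test'} for c in set(assign))
```

If some cluster contained only training (or only test) compounds, the split would be covering the descriptor space unevenly, which is exactly what the study rules out.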

  15. Low Access Delay Anti-Collision Algorithm for Reader in RFID systems

    DEFF Research Database (Denmark)

    Galiotto, Carlo; Marchetti, Nicola; Prasad, Neeli R.

    2010-01-01

    Radio Frequency Identification (RFID) is a technology which is spreading more and more as a medium to identify, locate and track assets through the productive chain. As all the wireless communication devices sharing the same transmission channel, RFID readers and tags experience collisions whenever...... deployed over the same area. In this work, the RFID reader collision problem is studied and a centralized scheduling-based algorithm is proposed as possible candidate solution, especially for those scenarios involving static or low mobility readers. Taking into account the circuitry limitations of the tags......, which do not allow to use frequency or code division multiple access schemes in the RFID systems, this paper proposes an algorithm aiming to prevent the readers collisions, while keeping the access delay of the readers to the channel possibly low. The simulation results show that this algorithm performs...
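
A centralized scheduler of the kind described can be sketched as greedy graph colouring: readers whose interrogation zones interfere are given different time slots, and keeping the number of slots small keeps each reader's access delay low. The conflict graph below is a made-up example, not a topology from the paper:

```python
def schedule_readers(conflicts, n_readers):
    """Greedy colouring of the reader-interference graph: neighbouring
    readers get different time slots; few slots -> low access delay."""
    slot = {}
    for r in range(n_readers):
        taken = {slot[v] for v in conflicts.get(r, ()) if v in slot}
        s = 0
        while s in taken:
            s += 1
        slot[r] = s
    return slot

# four readers in a row; adjacent readers interfere (made-up topology)
graph = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
slots = schedule_readers(graph, 4)
```

Time-division scheduling of this sort is attractive here precisely because, as the abstract notes, tag circuitry rules out frequency- or code-division multiple access.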

  16. CONTRADICTORY ASPECTS ASSESSMENT ON INTANGIBLE ASSETS

    OpenAIRE

    Ecaterina Necşulescu

    2011-01-01

    In Romania, the evaluation of intangible assets is rarely performed due to extremely poor casuistry. Of a sample of 100 companies we analyzed, only 4.5% revealed the existence of intangible assets, worth less than 3% of total assets, and none of the companies had revalued these assets. The study concludes that in crisis conditions company value decreases (bad will), while economic growth increases the value of companies (good will). An effective leadership in the crisis assessment may be ...

  17. Gradient Evolution-based Support Vector Machine Algorithm for Classification

    Science.gov (United States)

    Zulvia, Ferani E.; Kuo, R. J.

    2018-03-01

    This paper proposes a classification algorithm based on support vector machine (SVM) and gradient evolution (GE) algorithms. The SVM algorithm has been widely used in classification. However, its result is significantly influenced by its parameters. Therefore, this paper aims to propose an improvement of the SVM algorithm which can find the best SVM parameters automatically. The proposed algorithm employs a GE algorithm to automatically determine the SVM parameters. The GE algorithm acts as a global optimizer in finding the best parameters, which are then used by the SVM algorithm. The proposed GE-SVM algorithm is verified using some benchmark datasets and compared with other metaheuristic-based SVM algorithms. The experimental results show that the proposed GE-SVM algorithm obtains better results than the other algorithms tested in this paper.
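
The tuning loop can be sketched with a simple population search standing in for the GE optimiser, and a quadratic stand-in for the cross-validated SVM error over (log C, log γ); a real run would train an SVM for every candidate pair. Everything below is illustrative, not the published GE operators:

```python
import random

def tune(objective, bounds, pop_size=12, gens=40, seed=3):
    """Population search standing in for the paper's gradient evolution
    (GE) optimiser: candidates drift toward the best-so-far with a
    random step. Illustrative only -- not the published GE operators."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = list(min(pop, key=objective))
    for _ in range(gens):
        for ind in pop:
            for d, (lo, hi) in enumerate(bounds):
                step = rng.random() * (best[d] - ind[d]) + rng.gauss(0.0, 0.1)
                ind[d] = min(hi, max(lo, ind[d] + step))
        cand = min(pop, key=objective)
        if objective(cand) < objective(best):
            best = list(cand)
    return best

# stand-in for cross-validated SVM error over (log C, log gamma);
# a real run would train an SVM for every candidate pair
def cv_error(p):
    return (p[0] - 1.5) ** 2 + (p[1] + 2.0) ** 2

best_params = tune(cv_error, bounds=[(-5.0, 5.0), (-5.0, 5.0)])
```

The design point is the division of labour the abstract describes: the evolutionary search is the global optimizer, while the (here simulated) SVM evaluation only scores candidates.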

  18. Contingency Factors Influencing Implementation of Physical Asset Management Practices

    Directory of Open Access Journals (Sweden)

    Maletič Damjan

    2017-02-01

    Full Text Available Purpose: The purpose of this empirical study is to examine the role of two contingency factors, i.e. uncertainty and competitiveness, in relation to physical asset management (PAM) practices as well as to maintenance key performance indicators. The research is based on a premise that PAM, which was defined by risk management practices, performance assessment practices, life cycle management practices, and policy & strategy practices, has become an indispensable element of strategic thinking of asset owners as well as maintenance and asset managers. The purpose of this study is to advance the understanding of how organizations that face high or low level of uncertainty and competitiveness respond in terms of PAM deployment.

  19. A Trust-region-based Sequential Quadratic Programming Algorithm

    DEFF Research Database (Denmark)

    Henriksen, Lars Christian; Poulsen, Niels Kjølstad

    This technical note documents the trust-region-based sequential quadratic programming algorithm used in other works by the authors. The algorithm seeks to minimize a convex nonlinear cost function subject to linear inequality constraints and nonlinear equality constraints.

  20. Empirical Equation Based Chirality (n, m Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    Directory of Open Access Journals (Sweden)

    Md Shamsul Arefin

    2012-12-01

    Full Text Available This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between existing experimental and theoretical Kataura plots.
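
A simplified version of such an assignment, matching only the tube diameter implied by the radial breathing mode (ω ≈ A/d) and skipping the optical-transition check the paper relies on, can be sketched as follows. The constant A = 248 cm⁻¹·nm is one common empirical choice, not necessarily the paper's:

```python
import math

A_RBM = 248.0      # cm^-1 * nm; one common empirical omega = A/d relation
A_CC = 0.246       # graphene lattice constant in nm

def diameter(n, m):
    """Nanotube diameter in nm from the chiral indices (n, m)."""
    return A_CC * math.sqrt(n * n + n * m + m * m) / math.pi

def assign_chirality(omega_rbm, n_max=20):
    """Return the semiconducting (n, m) whose diameter best matches the
    RBM frequency; simplified to diameter matching only, without the
    optical-transition equations used in the paper."""
    d_target = A_RBM / omega_rbm
    best, err = None, float('inf')
    for n in range(1, n_max + 1):
        for m in range(0, n + 1):
            if (n - m) % 3 == 0:
                continue               # metallic tubes are excluded
            e = abs(diameter(n, m) - d_target)
            if e < err:
                best, err = (n, m), e
    return best

# RBM frequency corresponding to a (10, 5) tube, d ~ 1.036 nm
guess = assign_chirality(A_RBM / diameter(10, 5))
```

Diameter alone leaves near-degenerate (n, m) candidates for larger tubes, which is why the paper couples the RBM frequency with the first or second optical transition energy to pin down a unique assignment.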

  1. Empirical Equation Based Chirality (n, m) Assignment of Semiconducting Single Wall Carbon Nanotubes from Resonant Raman Scattering Data

    Science.gov (United States)

    Arefin, Md Shamsul

    2012-01-01

    This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between existing experimental and theoretical Kataura plot. PMID:28348319

  2. A simulation framework for asset management in climate-change adaptation of transportation infrastructure

    NARCIS (Netherlands)

    Bhamidipati, S.K.

    2014-01-01

    An asset management framework, in an agent-based model with multiple assets, is presented as a tool that can assist in developing long-term climate change adaptation strategies for transportation infrastructure.

  3. Asset transformation and the challenges to servitize a utility business model

    International Nuclear Information System (INIS)

    Helms, Thorsten

    2016-01-01

    The traditional energy utility business model is under pressure, and energy services are expected to play an important role for the energy transition. Experts and scholars argue that utilities need to innovate their business models, and transform from commodity suppliers to service providers. The transition from a product-oriented, capital-intensive business model based on tangible assets, towards a service-oriented, expense-intensive business model based on intangible assets may present great managerial and organizational challenges. Little research exists about such transitions for capital-intensive commodity providers, and particularly energy utilities, where the challenges to servitize are expected to be greatest. This qualitative paper explores the barriers to servitization within selected Swiss and German utility companies through a series of interviews with utility managers. One of them is ‘asset transformation’, the shift from tangible to intangible assets as major input factor for the value proposition, which is proposed as a driver for the complexity of business model transitions. Managers need to carefully manage those challenges, and find ways to operate both new service and established utility business models aside. Policy makers can support the transition of utilities through more favorable regulatory frameworks for energy services, and by supporting the exchange of knowledge in the industry. - Highlights: •The paper analyses the expected transformation of utilities into service-providers. •Service and utility business models possess very different attributes. •The former is based on intangible, the latter on tangible assets. •The transformation into a service-provider is related with great challenges. •Asset transformation is proposed as a barrier for business model innovation.

  4. 12 CFR 560.160 - Asset classification.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 5 2010-01-01 2010-01-01 false Asset classification. 560.160 Section 560.160... Lending and Investment Provisions Applicable to all Savings Associations § 560.160 Asset classification... consistent with, or reconcilable to, the asset classification system used by OTS in its Thrift Activities...

  5. Design of SVC Controller Based on Improved Biogeography-Based Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Feifei Dong

    2014-01-01

    Full Text Available Considering that common subsynchronous resonance controllers cannot adapt to the characteristics of the time-varying and nonlinear behavior of a power system, the cosine migration model, the improved migration operator, and the mutative scale of chaos and Cauchy mutation strategy are introduced into an improved biogeography-based optimization (IBBO) algorithm in order to design an optimal subsynchronous damping controller based on the mechanism of suppressing SSR by static var compensator (SVC). The effectiveness of the improved controller is verified by eigenvalue analysis and electromagnetic simulations. The simulation results of the Jinjie plant indicate that the subsynchronous damping controller optimized by the IBBO algorithm can remarkably improve the damping of torsional modes and thus effectively depress SSR, and ensure the safety and stability of units and power grid operation. Moreover, the IBBO algorithm has the merits of a faster searching speed and higher searching accuracy in seeking the optimal control parameters over traditional algorithms, such as BBO algorithm, PSO algorithm, and GA algorithm.
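
The migration operator at the heart of basic BBO, which the IBBO variant above refines, can be sketched as follows; the sphere cost function and all parameter choices are illustrative:

```python
import random

def bbo_migrate(pop, cost, rng):
    """One migration sweep of canonical BBO: habitats are ranked by
    cost; poorly ranked habitats immigrate decision variables from
    habitats picked in proportion to their emigration rates."""
    n = len(pop)
    order = sorted(range(n), key=lambda i: cost(pop[i]))      # best first
    rank = {h: r for r, h in enumerate(order)}
    new = [list(ind) for ind in pop]
    for i in range(n):
        lam = rank[i] / (n - 1)                # immigration: best 0, worst 1
        for d in range(len(pop[i])):
            if rng.random() < lam:
                weights = [1.0 - rank[j] / (n - 1) for j in range(n)]
                r = rng.random() * sum(weights)
                for j, w in enumerate(weights):
                    r -= w
                    if r <= 0:
                        break
                new[i][d] = pop[j][d]          # feature migrates j -> i
    return new

def sphere(x):                                  # toy cost to minimise
    return sum(v * v for v in x)

rng = random.Random(7)
pop = [[rng.uniform(-3.0, 3.0) for _ in range(2)] for _ in range(10)]
best0 = min(sphere(ind) for ind in pop)
for _ in range(30):
    pop = bbo_migrate(pop, sphere, rng)
best = min(sphere(ind) for ind in pop)
```

The paper's IBBO augments this linear migration model with a cosine migration model, chaos-based mutative scaling, and Cauchy mutation to escape the stagnation that migration alone exhibits.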

  6. A New Evolutionary Algorithm Based on Bacterial Evolution and Its Application for Scheduling A Flexible Manufacturing System

    Directory of Open Access Journals (Sweden)

    Chandramouli Anandaraman

    2012-01-01

    Full Text Available A new evolutionary computation algorithm, the Superbug algorithm, which simulates the evolution of bacteria in a culture, is proposed. The algorithm is developed for solving large-scale optimization problems such as scheduling, transportation and assignment problems. In this work, the algorithm optimizes machine schedules in a Flexible Manufacturing System (FMS) by minimizing the makespan. The FMS comprises four machines and two identical Automated Guided Vehicles (AGVs). AGVs are used for carrying jobs between the Load/Unload (L/U) station and the machines. Experimental results indicate that the optimization performance of the proposed algorithm in scheduling is noticeably superior to that of other evolutionary algorithms, compared against the best results reported in the literature for FMS scheduling.
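
The makespan objective, together with an elitist mutation-only evolutionary search standing in for the bacteria-inspired Superbug operators (which the abstract does not specify in detail), can be sketched as follows; the job times and machine count are made-up, and AGV travel times are ignored:

```python
import random

def makespan(assign, times, n_machines):
    """Completion time of the busiest machine for a job->machine map."""
    load = [0] * n_machines
    for job, m in enumerate(assign):
        load[m] += times[job]
    return max(load)

def evolve(times, n_machines, pop_size=20, gens=60, seed=5):
    """Elitist mutation-only evolutionary search over assignments --
    a toy stand-in for the bacteria-inspired Superbug operators."""
    rng = random.Random(seed)
    pop = [[rng.randrange(n_machines) for _ in times]
           for _ in range(pop_size)]
    for _ in range(gens):
        children = []
        for a in pop:
            child = list(a)
            child[rng.randrange(len(times))] = rng.randrange(n_machines)
            children.append(child)
        pop = sorted(pop + children,
                     key=lambda a: makespan(a, times, n_machines))[:pop_size]
    return pop[0]

times = [4, 7, 2, 5, 3, 6, 1, 8]       # processing times for 8 jobs
best = evolve(times, n_machines=4)
span = makespan(best, times, 4)
```

For these times the load lower bound is 9 (36 units over 4 machines), so any schedule near that bound is close to optimal; a faithful FMS model would add the AGV transport times the paper accounts for.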

  7. Portfolio Choice with Illiquid Assets

    OpenAIRE

    Andrew Ang; Dimitris Papanikolaou; Mark Westerfield

    2013-01-01

    We present a model of optimal allocation over liquid and illiquid assets, where illiquidity is the restriction that an asset cannot be traded for intervals of uncertain duration. Illiquidity leads to increased and state-dependent risk aversion, and reduces the allocation to both liquid and illiquid risky assets. Uncertainty about the length of the illiquidity interval, as opposed to a deterministic non-trading interval, is a primary determinant of the cost of illiquidity. We allow market liqu...

  8. Nuclear industry strategic asset management: Managing nuclear assets in a competitive environment

    International Nuclear Information System (INIS)

    Mueller, H.; Hunt, E.W. Jr.; Oatman, E.N.

    1999-01-01

    The former Electric Power Research Institute took the lead in developing an approach now widely known as strategic asset management (SAM). The SAM methodology applies the tools of decision/risk analysis used in the financial community to clarify effective use of physical assets and resources to create value: to build a clear line of sight to value creation. SAM processes have been used in both the power and other industries. The rapid change taking place in the nuclear business creates the need for competitive decision making regarding the management of nuclear assets. The nuclear industry is moving into an era in which shareholder value is determined by the net revenues earned on power marketed in a highly competitive and frequently low-priced power market environment

  9. 13 CFR 120.546 - Loan asset sales.

    Science.gov (United States)

    2010-01-01

    ... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Loan asset sales. 120.546 Section....546 Loan asset sales. (a) General. Loan asset sales are governed by § 120.545(b)(4) and by this... consented to SBA's sale of the loan (guaranteed and unguaranteed portions) in an asset sale conducted or...

  10. An algorithm for link restoration of wavelength routing optical networks

    DEFF Research Database (Denmark)

    Limal, Emmanuel; Stubkjær, Kristian

    1999-01-01

We present an algorithm for restoration of single link failure in wavelength routing multihop optical networks. The algorithm is based on an innovative study of networks using graph theory. It has the following original features: it (i) assigns working and spare channels simultaneously, (ii) prevents the search for unacceptable routing paths by pointing out channels required for restoration, (iii) offers a high utilization of the capacity resources and (iv) allows a trivial search for the restoration paths. The algorithm is for link restoration of networks without wavelength translation. Its...

  11. Preparing for asset retirement.

    Science.gov (United States)

    Luecke, Randall W; Reinstein, Alan

    2003-04-01

    Statement of Financial Accounting Standards (SFAS) No. 143 requires organizations to recognize a liability for an asset retirement obligation when it is incurred--even if that occurs far in advance of the asset's planned retirement. For example, organizations must recognize future costs associated with medical equipment disposal that carries hazardous material legal obligations.

  12. Bag-of-features based medical image retrieval via multiple assignment and visual words weighting

    KAUST Repository

    Wang, Jingyan

    2011-11-01

Bag-of-features based approaches have become prominent for image retrieval and image classification tasks in the past decade. Such methods represent an image as a collection of local features, such as image patches and key points with scale invariant feature transform (SIFT) descriptors. To improve the bag-of-features methods, we first model the assignments of local descriptors as contribution functions, and then propose a novel multiple assignment strategy. Assuming the local features can be reconstructed by their neighboring visual words in a vocabulary, reconstruction weights can be solved by quadratic programming. The weights are then used to build contribution functions, resulting in a novel assignment method, called quadratic programming (QP) assignment. We further propose a novel visual word weighting method. The discriminative power of each visual word is analyzed by the sub-similarity function in the bin that corresponds to the visual word. Each sub-similarity function is then treated as a weak classifier. A strong classifier is learned by boosting methods that combine those weak classifiers. The weighting factors of the visual words are learned accordingly. We evaluate the proposed methods on medical image retrieval tasks. The methods are tested on three well-known data sets, i.e., the ImageCLEFmed data set, the 304 CT Set, and the basal-cell carcinoma image set. Experimental results demonstrate that the proposed QP assignment outperforms the traditional nearest neighbor assignment, the multiple assignment, and the soft assignment, whereas the proposed boosting-based weighting strategy outperforms the state-of-the-art weighting methods, such as the term frequency weights and the term frequency-inverse document frequency weights. © 2011 IEEE.
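The reconstruction-weight step behind QP assignment can be illustrated with a small sketch. This is hypothetical code, not from the paper: it assumes the sum-to-one constraint is solved in closed form (as in locally linear embedding) rather than by a general QP solver, and the function name is invented for illustration.

```python
import numpy as np

def reconstruction_weights(feature, words):
    """Sketch of the QP-style assignment step: reconstruct a local
    descriptor from its neighboring visual words, with weights that
    sum to one (closed-form solve, LLE-style; the paper's actual QP
    formulation may differ)."""
    diffs = words - feature                          # (k, d) displacements
    G = diffs @ diffs.T                              # (k, k) Gram matrix
    G = G + 1e-8 * np.trace(G) * np.eye(len(words))  # regularize for stability
    w = np.linalg.solve(G, np.ones(len(words)))      # solve G w = 1
    return w / w.sum()                               # normalized contributions
```

With the weights in hand, each local feature contributes to several histogram bins instead of only its single nearest visual word.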

  13. Bag-of-features based medical image retrieval via multiple assignment and visual words weighting

    KAUST Repository

    Wang, Jingyan; Li, Yongping; Zhang, Ying; Wang, Chao; Xie, Honglan; Chen, Guoling; Gao, Xin

    2011-01-01

Bag-of-features based approaches have become prominent for image retrieval and image classification tasks in the past decade. Such methods represent an image as a collection of local features, such as image patches and key points with scale invariant feature transform (SIFT) descriptors. To improve the bag-of-features methods, we first model the assignments of local descriptors as contribution functions, and then propose a novel multiple assignment strategy. Assuming the local features can be reconstructed by their neighboring visual words in a vocabulary, reconstruction weights can be solved by quadratic programming. The weights are then used to build contribution functions, resulting in a novel assignment method, called quadratic programming (QP) assignment. We further propose a novel visual word weighting method. The discriminative power of each visual word is analyzed by the sub-similarity function in the bin that corresponds to the visual word. Each sub-similarity function is then treated as a weak classifier. A strong classifier is learned by boosting methods that combine those weak classifiers. The weighting factors of the visual words are learned accordingly. We evaluate the proposed methods on medical image retrieval tasks. The methods are tested on three well-known data sets, i.e., the ImageCLEFmed data set, the 304 CT Set, and the basal-cell carcinoma image set. Experimental results demonstrate that the proposed QP assignment outperforms the traditional nearest neighbor assignment, the multiple assignment, and the soft assignment, whereas the proposed boosting-based weighting strategy outperforms the state-of-the-art weighting methods, such as the term frequency weights and the term frequency-inverse document frequency weights. © 2011 IEEE.

  14. Fractional-moment Capital Asset Pricing model

    International Nuclear Information System (INIS)

    Li Hui; Wu Min; Wang Xiaotian

    2009-01-01

In this paper, we introduce the definition of the 'α-covariance' and present the fractional-moment versions of the Capital Asset Pricing Model, which can be used to price assets when asset return distributions are likely to be stable Levy (or Student-t) distributions, as during the panics and stampedes in worldwide security markets in 2008. Furthermore, if asset returns are truly governed by the infinite-variance stable Levy distributions, life is fundamentally riskier than in a purely Gaussian world. Sudden price movements like the worldwide security market crash in 2008 turn into real-world possibilities.

  15. Simple sorting algorithm test based on CUDA

    OpenAIRE

    Meng, Hongyu; Guo, Fangjin

    2015-01-01

With the development of computing technology, CUDA has become a very important tool. In computer programming, sorting algorithms are widely used. There are many simple sorting algorithms, such as enumeration sort, bubble sort and merge sort. In this paper, we test some simple sorting algorithms based on CUDA and draw some useful conclusions.
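As a plain-Python reference for one of the simple algorithms mentioned (the paper's CUDA kernels are not reproduced here; this sketch is not from the paper), merge sort looks like:

```python
def merge_sort(a):
    """Classic top-down merge sort: split the list, sort each half
    recursively, then merge the two sorted halves."""
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]   # append the remaining tail
```

The merge step is the part that parallelizes poorly in this naive form, which is why GPU implementations restructure it.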

  16. The Earnings/Price Risk Factor in Capital Asset Pricing Models

    Directory of Open Access Journals (Sweden)

    Rafael Falcão Noda

    2015-01-01

Full Text Available This article integrates the ideas from two major lines of research on cost of equity and asset pricing: multi-factor models and ex ante accounting models. The earnings/price ratio is used as a proxy for the ex ante cost of equity, in order to explain realized returns of Brazilian companies within the period from 1995 to 2013. The initial finding was that stocks with high (low) earnings/price ratios have higher (lower) risk-adjusted realized returns, already controlled for the capital asset pricing model's beta. The results show that selecting stocks based on high earnings/price ratios has led to significantly higher risk-adjusted returns in the Brazilian market, with average abnormal returns close to 1.3% per month. We design asset pricing models including an earnings/price risk factor, i.e. high earnings minus low earnings, based on the Fama and French three-factor model. We conclude that such a risk factor is significant in explaining returns on portfolios, even when controlled for size and market/book ratios. Models including the high earnings minus low earnings risk factor were better at explaining stock returns in Brazil when compared to the capital asset pricing model and to the Fama and French three-factor model, having the lowest number of significant intercepts. These findings may be due to the impact of historically high inflation rates, which reduce the information content of book values, thus making the models based on earnings/price ratios better than those based on market/book ratios. Such results are different from those obtained in more developed markets, and the superiority of the earnings/price ratio for asset pricing may also exist in other emerging markets.
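A "high earnings minus low earnings" factor of this kind can be sketched as follows. This is a hypothetical illustration only: the 30%/70% breakpoints, equal weighting, and function name are assumptions, not taken from the article.

```python
import pandas as pd

def earnings_price_factor(returns, ep_ratio):
    """Sketch of a high-minus-low E/P factor for one period: long the
    top-30% earnings/price stocks, short the bottom-30%, equal-weighted.
    `returns` and `ep_ratio` are aligned pandas Series over stocks."""
    hi = ep_ratio >= ep_ratio.quantile(0.7)   # high-E/P leg
    lo = ep_ratio <= ep_ratio.quantile(0.3)   # low-E/P leg
    return returns[hi].mean() - returns[lo].mean()
```

Repeating this each month yields the factor-return series used on the right-hand side of a Fama-French-style regression.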

  17. SIFT based algorithm for point feature tracking

    Directory of Open Access Journals (Sweden)

    Adrian BURLACU

    2007-12-01

Full Text Available In this paper a tracking algorithm for SIFT features in image sequences is developed. For each point feature extracted using the SIFT algorithm, a descriptor is computed using information from its neighborhood. Point features are tracked throughout the image sequences using an algorithm based on minimizing the distance between two descriptors. Experimental results, obtained from image sequences that capture the scaling of objects of different geometrical types, reveal the performance of the tracking algorithm.
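The distance-minimization matching step can be sketched as below. This is hypothetical code, not the paper's implementation: the rejection threshold and names are assumptions, and real SIFT descriptors would be 128-dimensional.

```python
import numpy as np

def match_descriptors(desc_a, desc_b, max_dist=0.5):
    """Match each descriptor in frame A to the descriptor in frame B at
    minimum Euclidean distance; matches farther than max_dist are rejected.
    desc_a, desc_b: (n, d) and (m, d) arrays of feature descriptors."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)  # distances to all of B
        j = int(np.argmin(dists))                   # nearest candidate
        if dists[j] <= max_dist:
            matches.append((i, j))
    return matches
```

Chaining these matches frame to frame yields the feature tracks; a ratio test against the second-nearest neighbor is a common refinement.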

  18. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

Full Text Available A method in which the real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed, to overcome the defect that the gradient descent method makes the algorithm easily fall into a local optimum in the learning process. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the speed of calculation is reduced by the coding and decoding processes. So, RQGA is introduced to explore the search space, and an improved varied learning rate is adopted to train the BP neural network. A simulation test shows that the proposed algorithm effectively and rapidly converges to a solution that conforms to the constraint conditions.

  19. Investments in fixed assets and depreciation of fixed assets: theoretical and practical aspects of study and analysis

    Directory of Open Access Journals (Sweden)

    Irina D. Demina

    2017-01-01

Full Text Available It is indicated that the domestic economy is experiencing a shortage of investment. The acceleration of the processes of import substitution is one of the most important challenges facing the domestic economy at present. Investments, especially capital investments, and the related investment relations constitute the basis for the development of the national economy and for improving the efficiency of social production as a whole. The problem of forming the amortization fund remains relevant at the moment. In the modern scientific and educational literature, the amortization fund means a fund whose resources are used for the complete restoration and repair of fixed assets. This paper analyzes the situation in the area of investment in fixed capital that has developed in Russia over the past several years. The aim of this paper is to study the investment climate in the country based on an analysis of investments in fixed capital by source of financing and type of economic activity. The work is based on a dynamic and structural analysis of analytical and statistical information on the processes occurring in this field. As a result, it can be noted that, in spite of a number of efforts being made, there are in general low growth rates in industry and a deficit of investment in fixed assets. Most investments in fixed assets are financed from the organizations' own funds. A significant number of economic entities do not have the means necessary for technological renewal. Unfortunately, the regulatory framework in the field of accounting for fixed assets and the accrual of depreciation does not provide for the use of a special account for the accumulation and, most importantly, the purposeful control of the use of the depreciation fund. First of all, this is necessary for companies with state participation and monopoly organizations. The lack of control over the targeted use of the depreciation fund

  20. Learning Agents for Autonomous Space Asset Management (LAASAM)

    Science.gov (United States)

    Scally, L.; Bonato, M.; Crowder, J.

    2011-09-01

    Current and future space systems will continue to grow in complexity and capabilities, creating a formidable challenge to monitor, maintain, and utilize these systems and manage their growing network of space and related ground-based assets. Integrated System Health Management (ISHM), and in particular, Condition-Based System Health Management (CBHM), is the ability to manage and maintain a system using dynamic real-time data to prioritize, optimize, maintain, and allocate resources. CBHM entails the maintenance of systems and equipment based on an assessment of current and projected conditions (situational and health related conditions). A complete, modern CBHM system comprises a number of functional capabilities: sensing and data acquisition; signal processing; conditioning and health assessment; diagnostics and prognostics; and decision reasoning. In addition, an intelligent Human System Interface (HSI) is required to provide the user/analyst with relevant context-sensitive information, the system condition, and its effect on overall situational awareness of space (and related) assets. Colorado Engineering, Inc. (CEI) and Raytheon are investigating and designing an Intelligent Information Agent Architecture that will provide a complete range of CBHM and HSI functionality from data collection through recommendations for specific actions. The research leverages CEI’s expertise with provisioning management network architectures and Raytheon’s extensive experience with learning agents to define a system to autonomously manage a complex network of current and future space-based assets to optimize their utilization.

  1. Eigenvalue Decomposition-Based Modified Newton Algorithm

    Directory of Open Access Journals (Sweden)

    Wen-jun Wang

    2013-01-01

Full Text Available When the Hessian matrix is not positive definite, the Newton direction may not be a descent direction. A new method, named the eigenvalue decomposition-based modified Newton algorithm, is presented, which first takes the eigenvalue decomposition of the Hessian matrix, then replaces the negative eigenvalues with their absolute values, and finally reconstructs the Hessian matrix and modifies the search direction. The new search direction is always a descent direction. The convergence of the algorithm is proven and a conclusion on the convergence rate is presented qualitatively. Finally, a numerical experiment is given comparing the convergence domains of the modified algorithm and the classical algorithm.
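The modification step described above can be sketched in a few lines of NumPy. This is a hypothetical illustration (function and variable names are not from the paper), and it assumes no eigenvalue is exactly zero so the modified Hessian is invertible.

```python
import numpy as np

def modified_newton_direction(hessian, grad):
    """Eigenvalue-decomposition-based Newton step: flip negative
    eigenvalues to their absolute values, rebuild the Hessian, and
    return the resulting (always descent) search direction."""
    w, V = np.linalg.eigh(hessian)        # H = V diag(w) V^T (symmetric H)
    w_mod = np.abs(w)                     # replace negative eigenvalues
    H_mod = V @ np.diag(w_mod) @ V.T      # reconstructed positive-definite Hessian
    return -np.linalg.solve(H_mod, grad)  # d = -H_mod^{-1} g
```

Since every eigenvalue of the reconstructed matrix is positive, the returned direction satisfies d·g < 0 for any nonzero gradient.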

  2. The capital-asset-pricing model and arbitrage pricing theory: a unification.

    Science.gov (United States)

    Ali Khan, M; Sun, Y

    1997-04-15

We present a model of a financial market in which naive diversification, based simply on portfolio size and obtained as a consequence of the law of large numbers, is distinguished from efficient diversification, based on mean-variance analysis. This distinction yields a valuation formula involving only the essential risk embodied in an asset's return, where the overall risk can be decomposed into a systematic and an unsystematic part, as in the arbitrage pricing theory; and the systematic component further decomposed into an essential and an inessential part, as in the capital-asset-pricing model. The two theories are thus unified, and their individual asset-pricing formulas shown to be equivalent to the pervasive economic principle of no arbitrage. The factors in the model are endogenously chosen by a procedure analogous to the Karhunen-Loève expansion of continuous time stochastic processes; it has an optimality property justifying the use of a relatively small number of them to describe the underlying correlational structures. Our idealized limit model is based on a continuum of assets indexed by a hyperfinite Loeb measure space, and it is asymptotically implementable in a setting with a large but finite number of assets. Because the difficulties in the formulation of the law of large numbers with a standard continuum of random variables are well known, the model uncovers some basic phenomena not amenable to classical methods, and whose approximate counterparts are not already, or even readily, apparent in the asymptotic setting.

  3. Complex-based OCT angiography algorithm recovers microvascular information better than amplitude- or phase-based algorithms in phase-stable systems.

    Science.gov (United States)

    Xu, Jingjiang; Song, Shaozhen; Li, Yuandong; Wang, Ruikang K

    2017-12-19

Optical coherence tomography angiography (OCTA) is increasingly becoming a popular inspection tool for biomedical imaging applications. By exploring the amplitude, phase and complex information available in OCT signals, numerous algorithms have been proposed that contrast functional vessel networks within microcirculatory tissue beds. However, it is not clear which algorithm delivers optimal imaging performance. Here, we investigate systematically how amplitude and phase information have an impact on the OCTA imaging performance, to establish the relationship of amplitude and phase stability with OCT signal-to-noise ratio (SNR), time interval and particle dynamics. With either repeated A-scan or repeated B-scan imaging protocols, the amplitude noise increases with the increase of OCT SNR; however, the phase noise does the opposite, i.e. it increases with the decrease of OCT SNR. Coupled with experimental measurements, we utilize a simple Monte Carlo (MC) model to simulate the performance of amplitude-, phase- and complex-based algorithms for OCTA imaging, the results of which suggest that complex-based algorithms deliver the best performance when the phase noise is below a certain threshold. In such phase-stable systems, the complex-based algorithm delivers better performance than either the amplitude- or phase-based algorithms for both the repeated A-scan and the B-scan imaging protocols, which agrees well with the conclusion drawn from the MC simulations.

  4. Low-Energy Real-Time OS Using Voltage Scheduling Algorithm for Variable Voltage Processors

    OpenAIRE

    Okuma, Takanori; Yasuura, Hiroto

    2001-01-01

This paper presents a real-time OS based on μITRON using the proposed voltage scheduling algorithm for variable voltage processors, which can vary the supply voltage dynamically. The proposed voltage scheduling algorithm assigns a voltage level to each task dynamically in order to minimize energy consumption under timing constraints. Using the presented real-time OS, running tasks with a low supply voltage leads to drastic energy reduction. In addition, the presented voltage scheduling algorithm is ...

  5. European Interoperability Assets Register and Quality Framework Implementation.

    Science.gov (United States)

    Moreno-Conde, Alberto; Thienpont, Geert; Lamote, Inge; Coorevits, Pascal; Parra, Carlos; Kalra, Dipak

    2016-01-01

Interoperability assets is the term applied to refer to any resource that can support the design, implementation and successful adoption of eHealth services that can exchange data meaningfully. Some examples may include functional requirements, specifications, standards, clinical models and term lists, guidance on how standards may be used concurrently, implementation guides, educational resources, and other resources. Unfortunately, these are largely accessible in ad hoc ways and result in scattered fragments of a solution space that urgently need to be brought together. At present, it is well known that new initiatives and projects will reinvent assets of which they were unaware, while those assets which were potentially of great value are forgotten, not maintained and eventually fall into disuse. This research has defined a quality in use model and assessed the suitability of this quality framework based on the feedback and opinion of a representative sample of potential end users. This quality framework covers the following domains of asset development and adoption: (i) Development process, (ii) Maturity level, (iii) Trustworthiness, (iv) Support & skills, (v) Sustainability, (vi) Semantic interoperability, (vii) Cost & effort of adoption, (viii) Maintenance. When participants were requested to evaluate the overall quality in use framework, 70% said they would recommend using the register to their colleagues, 70% felt that it could provide relevant benefits for discovering new assets, and 50% responded that it would support their decision making about the recommended asset to adopt or implement in their organisation. Several European projects have expressed interest in using the register, which will now be sustained and promoted by the European Institute for Innovation through Health Data.

  6. Scalable and Cost-Effective Assignment of Mobile Crowdsensing Tasks Based on Profiling Trends and Prediction: The ParticipAct Living Lab Experience.

    Science.gov (United States)

    Bellavista, Paolo; Corradi, Antonio; Foschini, Luca; Ianniello, Raffaele

    2015-07-30

Nowadays, sensor-rich smartphones potentially enable the harvesting of huge amounts of valuable sensing data in urban environments, by opportunistically involving citizens to play the role of mobile virtual sensors to cover Smart City areas of interest. This paper proposes an in-depth study of the challenging technical issues related to the efficient assignment of Mobile Crowd Sensing (MCS) data collection tasks to volunteers in a crowdsensing campaign. In particular, the paper originally describes how to increase the effectiveness of the proposed sensing campaigns through the inclusion of several new facilities, including accurate participant selection algorithms able to profile and predict user mobility patterns, gaming techniques, and timely geo-notification. The reported results show the feasibility of exploiting profiling trends/prediction techniques from volunteers' behavior; moreover, they quantitatively compare different MCS task assignment strategies based on large-scale and real MCS data campaigns run in the ParticipAct living lab, an ongoing MCS real-world experiment that involved more than 170 students of the University of Bologna for more than one year.

  7. Scalable and Cost-Effective Assignment of Mobile Crowdsensing Tasks Based on Profiling Trends and Prediction: The ParticipAct Living Lab Experience

    Directory of Open Access Journals (Sweden)

    Paolo Bellavista

    2015-07-01

Full Text Available Nowadays, sensor-rich smartphones potentially enable the harvesting of huge amounts of valuable sensing data in urban environments, by opportunistically involving citizens to play the role of mobile virtual sensors to cover Smart City areas of interest. This paper proposes an in-depth study of the challenging technical issues related to the efficient assignment of Mobile Crowd Sensing (MCS) data collection tasks to volunteers in a crowdsensing campaign. In particular, the paper originally describes how to increase the effectiveness of the proposed sensing campaigns through the inclusion of several new facilities, including accurate participant selection algorithms able to profile and predict user mobility patterns, gaming techniques, and timely geo-notification. The reported results show the feasibility of exploiting profiling trends/prediction techniques from volunteers’ behavior; moreover, they quantitatively compare different MCS task assignment strategies based on large-scale and real MCS data campaigns run in the ParticipAct living lab, an ongoing MCS real-world experiment that involved more than 170 students of the University of Bologna for more than one year.

  8. A proposed change to the NASA strategy for servicing space assets

    Science.gov (United States)

    Levin, George C.

    1989-01-01

    Given the limitations of the present Shuttle manifest, it is necessary for NASA to consider revision of its previous strategy for servicing satellites. This is particularly important in a period of tight budgets, when space assets will be difficult to replace. Therefore on-orbit assets take on additional value and keeping these assets operational will take on added importance. The key to maintaining these assets will be the long term strategy of developing a remote servicing capability which is space based and has a minimum reliance on the Shuttle. Such a strategy will require that the users of this servicing system design serviceable spacecraft at a high level and that these assets be located in or be capable of reaching orbits that are accessible to or compatible with the proposed servicing infrastructure. The infrastructure required to support this type of remote servicing architecture and the development of the necessary systems, tools, and procedures required to support a remote servicing architecture of this type are addressed.

  9. Single-machine common/slack due window assignment problems with linear decreasing processing times

    Science.gov (United States)

    Zhang, Xingong; Lin, Win-Chin; Wu, Wen-Hsiang; Wu, Chin-Chia

    2017-08-01

    This paper studies linear non-increasing processing times and the common/slack due window assignment problems on a single machine, where the actual processing time of a job is a linear non-increasing function of its starting time. The aim is to minimize the sum of the earliness cost, tardiness cost, due window location and due window size. Some optimality results are discussed for the common/slack due window assignment problems and two O(n log n) time algorithms are presented to solve the two problems. Finally, two examples are provided to illustrate the correctness of the corresponding algorithms.

  10. Forecasting Energy CO2 Emissions Using a Quantum Harmony Search Algorithm-Based DMSFE Combination Model

    Directory of Open Access Journals (Sweden)

    Xingsheng Gu

    2013-03-01

Full Text Available The accurate forecasting of carbon dioxide (CO2) emissions from fossil fuel energy consumption is a key requirement for making energy policy and environmental strategy. In this paper, a novel quantum harmony search (QHS) algorithm-based discounted mean square forecast error (DMSFE) combination model is proposed. In the DMSFE combination forecasting model, almost all investigations assign the discounting factor (β) arbitrarily, since β varies between 0 and 1, and adopt one value for all individual models and forecasting periods. The original method doesn't consider the influences of the individual model and the forecasting period. This work contributes by changing β from one value to a matrix, taking the different models and forecasting periods into consideration, and by presenting a way of searching for the optimal β values using the QHS algorithm through optimizing the mean absolute percent error (MAPE) objective function. The QHS algorithm-based optimization DMSFE combination forecasting model is established and tested by forecasting the CO2 emissions of the world's top-5 CO2 emitters. Evaluation indexes such as MAPE, root mean squared error (RMSE) and mean absolute error (MAE) are employed to test the performance of the presented approach. The empirical analyses confirm the validity of the presented method, and the forecasting accuracy can be increased to a certain degree.
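The standard DMSFE weighting scheme that the β-matrix variant generalizes can be sketched as follows. This is hypothetical code, not from the paper: the discounting convention (oldest error discounted most) and names are assumptions.

```python
import numpy as np

def dmsfe_weights(errors, beta=0.9):
    """Sketch of discounted-MSFE combination weights.
    errors: (models, periods) array of past forecast errors, oldest first.
    A model's weight is inversely proportional to its discounted MSFE,
    with beta in (0, 1] discounting older errors more heavily."""
    n_models, T = errors.shape
    discounts = beta ** np.arange(T, 0, -1)       # beta^T, ..., beta^1
    dmsfe = (discounts * errors**2).sum(axis=1)   # discounted MSFE per model
    inv = 1.0 / dmsfe
    return inv / inv.sum()                        # weights sum to one
```

The paper's contribution amounts to letting β differ per model and period and searching those values with the QHS algorithm instead of fixing a single scalar.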

  11. Development and Implementation of a Condition Based Maintenance Program for Geothermal Power Plants; FINAL

    International Nuclear Information System (INIS)

    Steve Miller; Jim Eddy; Murray Grande; Shawn Bratt; Manuchehr Shirmohamadi

    2002-01-01

This report describes the development of the RCM team, the identification of plant assets and development of an asset hierarchy, the development of sample Failure Mode Effects Analyses (FMEAs), the identification and prioritization of plant systems and components for RCM analysis, and the identification of RCM/CBM software/hardware vendors. It also includes the Failure Mode Effects Analysis (FMEA) for all Class I systems, maintenance task assignments, the use of Condition-Based Maintenance (CBM) tools and displays, and the RCM software system development to date

  12. FLEET ASSIGNMENT MODELLING

    Directory of Open Access Journals (Sweden)

    2016-01-01

Full Text Available The article is devoted to the airline scheduling process and methods of its modeling. This article describes the main stages of the airline scheduling process (scheduling, fleet assignment, revenue management, operations), their features and interactions. The main part of the scheduling process is fleet assignment. The optimal solution of the fleet assignment problem enables airlines to increase their incomes by up to 3% due to improved quality of connections and execution of the planned number of flights with fewer aircraft than usual or than planned earlier. The fleet assignment stage of the scheduling process is examined and the Conventional Leg-Based Fleet Assignment Model is analyzed. Finally, the strong and weak aspects of the model (SWOT) are revealed and applied. The article gives a critical analysis of the FAM model, with the purpose of identifying possible options and constraints of its use (for example, in cases of short-term and long-term planning, changing the schedule or replacing the aircraft), as well as possible ways to improve the model.

  13. Asset management trends and challenges

    Energy Technology Data Exchange (ETDEWEB)

    Rijks, E. [Continuon, Arnhem (Netherlands); Ford, G.L. [PowerNex Associates Inc., Toronto, ON (Canada); Sanchis, G. [Reseau de Transport d' Electricite, Paris (France)

    2007-07-01

    Recent business and regulatory changes in the electric power industry have affected the operation of electric utilities. Most have accepted competition and commercialization. Various strategies have emerged as companies strive to improve performance and retain profitability in an environment where competition or regulatory pressure is reducing revenues at a time when customer expectation is increasing. As focus shifts away from engineering excellence towards commercial performance, the new business ideology for electric utilities is to optimize asset management. This paper identified asset management technology trends, opportunities and challenges. Although many utilities are currently comfortable with their existing asset management processes, regulators are increasingly scrutinizing utilities as they seek approval for rates and investments in aging infrastructure. Much more rigorous financial analysis methods are needed to justify the large investments that are needed. In addition, the credibility of the processes and methods used by utilities will be increasingly questioned. In recognition of the growing importance of asset management, several initiatives have been launched to provide forums for sharing information and to provide a unifying force to asset management methods. The International Council on Large Electric Systems (CIGRE) was one of the first to recognize the importance of asset management. This paper summarized recent CIGRE activities as well as the developments of publicly available specification (PAS) 55 in the United Kingdom. It was concluded that utilities that adopt standardized approaches will be more credible in the eyes of regulatory authorities. 3 refs., 4 figs.

  14. Pilot-based parametric channel estimation algorithm for DCO-OFDM-based visible light communications

    Science.gov (United States)

    Qian, Xuewen; Deng, Honggui; He, Hailang

    2017-10-01

    Due to the wide modulation bandwidth in optical communication, multipath channels may be non-sparse and heavily deteriorate communication performance. Traditional compressive-sensing-based channel estimation algorithms cannot be employed in this situation. In this paper, we propose a practical parametric channel estimation algorithm for orthogonal frequency division multiplexing (OFDM)-based visible light communication (VLC) systems, based on a modified zero correlation code (ZCC) pair that has an impulse-like correlation property. Simulation results show that the proposed algorithm achieves better performance than the existing least-squares (LS)-based algorithm in both bit error ratio (BER) and frequency response estimation.

  15. Optimisation of timetable-based, stochastic transit assignment models based on MSA

    DEFF Research Database (Denmark)

    Nielsen, Otto Anker; Frederiksen, Rasmus Dyhr

    2006-01-01

    (CRM), such a large-scale transit assignment model was developed and estimated. The Stochastic User Equilibrium problem was solved by the Method of Successive Averages (MSA). However, the model suffered from very large calculation times. The paper focuses on how to optimise transit assignment models...

  16. Selection of asset investment models by hospitals: examination of influencing factors, using Switzerland as an example.

    Science.gov (United States)

    Eicher, Bernhard

    2016-10-01

    Hospitals are responsible for a remarkable part of the annual increase in healthcare expenditure. This article examines one of the major cost drivers, the expenditure for investment in hospital assets. The study, conducted in Switzerland, identifies factors that influence hospitals' investment decisions. A suggestion on how to categorize asset investment models is presented based on the life cycle of an asset, with its influencing factors defined on the basis of transaction cost economics. The influence of five factors (human asset specificity, physical asset specificity, uncertainty, bargaining power, and privacy of ownership) on the selection of an asset investment model is examined using a two-step fuzzy-set Qualitative Comparative Analysis. The research shows that outsourcing-oriented asset investment models are particularly favored in the presence of two combinations of influencing factors: First, if technological uncertainty is high and both human asset specificity and bargaining power of a hospital are low. Second, if assets are very specific, technological uncertainty is high and there is a private hospital with low bargaining power, outsourcing-oriented asset investment models are favored too. Using Qualitative Comparative Analysis, it can be demonstrated that investment decisions of hospitals do not depend on isolated influencing factors but on a combination of factors. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Processing Risk In Asset Management : Exploring The Boundaries Of Risk Based Optimization Under Uncertainty For An Energy Infrastructure Asset Manager

    NARCIS (Netherlands)

    Wijnia, Y.C.

    2016-01-01

    In the liberalized energy market Distribution Network Operators (DNOs) are confronted with income reductions by the regulator. The common response to this challenge is the implementation of asset management, which can be regarded as systematically applying Cost Benefit Analysis (CBA) to the risks in

  18. Structure-Based Algorithms for Microvessel Classification

    KAUST Repository

    Smith, Amy F.

    2015-02-01

    © 2014 The Authors. Microcirculation published by John Wiley & Sons Ltd. Objective: Recent developments in high-resolution imaging techniques have enabled digital reconstruction of three-dimensional sections of microvascular networks down to the capillary scale. To better interpret these large data sets, our goal is to distinguish branching trees of arterioles and venules from capillaries. Methods: Two novel algorithms are presented for classifying vessels in microvascular anatomical data sets without requiring flow information. The algorithms are compared with a classification based on observed flow directions (considered the gold standard), and with an existing resistance-based method that relies only on structural data. Results: The first algorithm, developed for networks with one arteriolar and one venular tree, performs well in identifying arterioles and venules and is robust to parameter changes, but incorrectly labels a significant number of capillaries as arterioles or venules. The second algorithm, developed for networks with multiple inlets and outlets, correctly identifies more arterioles and venules, but is more sensitive to parameter changes. Conclusions: The algorithms presented here can be used to classify microvessels in large microvascular data sets lacking flow information. This provides a basis for analyzing the distinct geometrical properties and modelling the functional behavior of arterioles, capillaries, and venules.

  19. Agility in asset management, or: how to be flexible with assets designed for stability

    NARCIS (Netherlands)

    Ruitenburg, Richard Jacob; Braaksma, Anne Johannes Jan; van Dongen, Leonardus Adriana Maria

    2016-01-01

    Agility is increasingly important in manufacturing. However, thus far little attention has been paid to the agility of the physical assets used in production, which are typically designed for decades of operation in a stable context. This paper investigates the topic of agile Asset Management using

  20. WDM Multicast Tree Construction Algorithms and Their Comparative Evaluations

    Science.gov (United States)

    Makabe, Tsutomu; Mikoshi, Taiju; Takenaka, Toyofumi

    We propose novel tree construction algorithms for multicast communication in photonic networks. Since multicast communications consume many more link resources than unicast communications, effective algorithms for route selection and wavelength assignment are required. We propose a novel tree construction algorithm, called the Weighted Steiner Tree (WST) algorithm, and a variation of the WST algorithm, called the Composite Weighted Steiner Tree (CWST) algorithm. Because these algorithms are based on the Steiner Tree algorithm, link resources among source and destination pairs tend to be commonly used and link utilization ratios are improved. Because of this, these algorithms can accept many more multicast requests than other multicast tree construction algorithms based on the Dijkstra algorithm. However, under certain delay constraints, the blocking characteristics of the proposed Weighted Steiner Tree algorithm deteriorate, since some light paths between the source and destinations use many hops and cannot satisfy the delay constraint. In order to adapt the approach to delay-sensitive environments, we have devised the Composite Weighted Steiner Tree algorithm, combining the Weighted Steiner Tree algorithm and the Dijkstra algorithm for use in a delay-constrained environment such as an IPTV application. In this paper, we also give the results of simulation experiments which demonstrate the superiority of the proposed Composite Weighted Steiner Tree algorithm compared with the Distributed Minimum Hop Tree (DMHT) algorithm, from the viewpoint of light-tree request blocking.
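
The abstract describes the WST idea only at a high level; the sketch below illustrates its core mechanism (re-weighting links already in the tree so later destinations prefer to share existing branches) on a hypothetical toy network. It is not the authors' implementation, and the CWST delay-constraint fallback to plain Dijkstra paths is omitted.

```python
import heapq

def dijkstra(adj, src, weight):
    """Shortest paths from src with a per-edge weight function (u, v, cost)."""
    dist, prev, pq = {src: 0.0}, {}, [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue
        for v, c in adj[u]:
            nd = d + weight(u, v, c)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(pq, (nd, v))
    return dist, prev

def weighted_steiner_tree(adj, src, dests, reuse_discount=0.1):
    """Greedy heuristic in the spirit of the WST idea: edges already in the
    tree are discounted, so each new destination tends to reuse them."""
    tree = set()  # set of frozenset({u, v}) edges
    for dst in dests:
        w = lambda u, v, c: c * reuse_discount if frozenset((u, v)) in tree else c
        _, prev = dijkstra(adj, src, w)
        node = dst
        while node != src:  # walk the shortest path back, adding its edges
            tree.add(frozenset((prev[node], node)))
            node = prev[node]
    return tree

# Toy 4-node network: sharing the s-a link is cheaper than two direct paths.
edges = [("s", "a", 1), ("a", "b", 1), ("a", "c", 1), ("s", "b", 2.5), ("s", "c", 2.5)]
adj = {}
for u, v, c in edges:
    adj.setdefault(u, []).append((v, c))
    adj.setdefault(v, []).append((u, c))
tree = weighted_steiner_tree(adj, "s", ["b", "c"])
print(sorted(sorted(e) for e in tree))  # → [['a', 'b'], ['a', 'c'], ['a', 's']]
```

With the discount, the second destination routes through the already-used s–a link instead of its direct 2.5-cost edge, which is exactly the link-sharing effect the abstract credits for the improved utilization ratios.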

  1. Dynamic service contracting for on-demand asset delivery

    NARCIS (Netherlands)

    Zhao, X.; Angelov, S.A.; Grefen, P.W.P.J.

    2014-01-01

    Traditional financial asset lease operates in an asset provider centred mode, in which financiers passively provide financial solutions to the customers of their allied asset vendors. To capture the highly customised asset lease demands from the mass market, this paper advocates adopting a

  2. Variants of Evolutionary Algorithms for Real-World Applications

    CERN Document Server

    Weise, Thomas; Michalewicz, Zbigniew

    2012-01-01

    Evolutionary Algorithms (EAs) are population-based, stochastic search algorithms that mimic natural evolution. Due to their ability to find excellent solutions for conventionally hard and dynamic problems within acceptable time, EAs have attracted interest from many researchers and practitioners in recent years. This book “Variants of Evolutionary Algorithms for Real-World Applications” aims to promote the practitioner’s view on EAs by providing a comprehensive discussion of how EAs can be adapted to the requirements of various applications in the real-world domains. It comprises 14 chapters, including an introductory chapter re-visiting the fundamental question of what an EA is and other chapters addressing a range of real-world problems such as production process planning, inventory system and supply chain network optimisation, task-based jobs assignment, planning for CNC-based work piece construction, mechanical/ship design tasks that involve runtime-intense simulations, data mining for the predictio...

  3. The Role of Agribusiness Assets in Investment Portfolios

    OpenAIRE

    Johnson, Michael; Malcolm, Bill; O'Connor, Ian

    2006-01-01

    Investment in agribusiness assets has grown significantly in recent years. The question of interest is whether including agribusiness assets in investment portfolios provides benefits. The effects of diversification from including agribusiness assets in two investment portfolios, a mixed asset portfolio and a diversified share portfolio, were investigated using Markowitz’s (1952) Modern Portfolio Theory (MPT) of mean-variance optimization. To measure the performance of agribusiness assets, an in...

  4. Novel prediction- and subblock-based algorithm for fractal image compression

    International Nuclear Information System (INIS)

    Chung, K.-L.; Hsu, C.-H.

    2006-01-01

    Fractal encoding is the most time-consuming part of fractal image compression. In this paper, a novel two-phase prediction- and subblock-based fractal encoding algorithm is presented. Initially, the original gray image is partitioned into a set of variable-size blocks according to the S-tree- and interpolation-based decomposition principle. In the first phase, each variable-size range block tries to find the best-matched domain block using the proposed prediction-based search strategy, which utilizes the relevant neighboring variable-size domain blocks. This first phase leads to a significant computation-saving effect. If the domain block found within the predicted search space is unacceptable, in the second phase a subblock strategy is employed to partition the current variable-size range block into smaller blocks to improve image quality. Experimental results show that our proposed prediction- and subblock-based fractal encoding algorithm outperforms the conventional full search algorithm and the recently published spatial-correlation-based algorithm by Truong et al. in terms of encoding time and image quality. In addition, a performance comparison among our proposed algorithm and two other algorithms, the no-search-based algorithm and the quadtree-based algorithm, is also presented.

  5. The Educational Asset Market: A Finance Perspective on Human Capital Investment

    DEFF Research Database (Denmark)

    Christiansen, Charlotte; Nielsen, Helena Skyt

    2002-01-01

    Like the stock market, the human capital market consists of a wide range of assets, i.e. educations. Each young individual chooses the educational asset that matches his preferred combination of risk and return in terms of future income. A unique register-based data set with exact information on type and level of education enables us to focus on the shared features between human capital and stock investments. An innovative finance-labor approach is applied to study the educational asset market. A risk-return trade-off is revealed which is not directly related to the length of education.

  6. Multiobjective Order Assignment Optimization in a Global Multiple-Factory Environment

    Directory of Open Access Journals (Sweden)

    Rong-Chang Chen

    2014-01-01

    Full Text Available In response to radically increasing competition, many manufacturers of time-sensitive products have expanded their production plants to worldwide sites. Given this environment, how to aggregate customer orders from around the globe and assign them quickly to the most appropriate plants is currently a crucial issue. This study proposes an effective method to solve the order assignment problem of companies with multiple plants distributed worldwide. A multiobjective genetic algorithm (MOGA) is used to find solutions. To validate the effectiveness of the proposed approach, this study employs real data, provided by a well-known garment company in Taiwan, as a basis for experiments. In addition, the influence of orders with a wide range of demanded quantities is discussed. The results show that feasible solutions can be obtained effectively and efficiently. Moreover, if managers aim at lower total costs, they can divide a large customer order into several smaller manufacturing orders.
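
The abstract does not detail the MOGA itself; the following is a deliberately small stand-in that assigns orders to plants with a genetic algorithm but scalarises the two objectives (cost and delay) instead of maintaining a Pareto front. All cost and delay figures are invented for illustration.

```python
import random

def ga_assign(costs, delays, w_cost=0.5, generations=60, pop_size=30, seed=7):
    """Assign each order to a plant, minimising a weighted sum of total
    production cost and total delivery delay (a scalarised stand-in for
    the paper's multiobjective GA)."""
    rng = random.Random(seed)
    n_orders, n_plants = len(costs), len(costs[0])

    def fitness(assign):
        c = sum(costs[o][p] for o, p in enumerate(assign))
        d = sum(delays[o][p] for o, p in enumerate(assign))
        return w_cost * c + (1 - w_cost) * d

    # Random initial population of order -> plant assignments.
    pop = [[rng.randrange(n_plants) for _ in range(n_orders)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_orders)       # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                 # occasional mutation
                child[rng.randrange(n_orders)] = rng.randrange(n_plants)
            children.append(child)
        pop = survivors + children
    return min(pop, key=fitness)

costs  = [[4, 2, 6], [3, 5, 1], [2, 4, 3]]   # order x plant production cost (invented)
delays = [[1, 3, 2], [2, 1, 4], [3, 2, 1]]   # order x plant delivery delay (invented)
best = ga_assign(costs, delays)
print(best)
```

A real MOGA would instead rank the population by Pareto dominance and return a front of trade-off solutions; the scalarised version above only shows the encoding and the genetic operators.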

  7. MVDR Algorithm Based on Estimated Diagonal Loading for Beamforming

    Directory of Open Access Journals (Sweden)

    Yuteng Xiao

    2017-01-01

    Beamforming algorithms are widely used in many signal processing fields. At present, the typical beamforming algorithm is MVDR (Minimum Variance Distortionless Response). However, the performance of the MVDR algorithm relies on an accurate covariance matrix, and it degrades dramatically when the covariance matrix is inaccurate. To solve this problem, after studying the beamforming array signal model and the MVDR algorithm, we improve the MVDR algorithm with estimated diagonal loading. An MVDR optimization model based on diagonal loading compensation is established, and the interval of the diagonal loading compensation value is deduced on the basis of matrix theory. The optimal diagonal loading value within this interval is then determined experimentally. The experimental results show that, compared with existing algorithms, the proposed algorithm is practical and effective.
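
As background, diagonal loading replaces the sample covariance R with R + λI before forming the standard MVDR weights w = (R + λI)⁻¹a / (aᴴ(R + λI)⁻¹a). A minimal NumPy sketch follows; the loading value and steering vector are illustrative, and the paper's procedure for estimating the optimal λ interval is not reproduced.

```python
import numpy as np

def mvdr_weights(R, a, loading=0.0):
    """MVDR beamformer weights with optional diagonal loading.

    R: (N, N) sample covariance matrix of the array snapshots.
    a: (N,) steering vector toward the desired direction.
    loading: non-negative value added to R's diagonal to regularise it.
    """
    Rl = R + loading * np.eye(R.shape[0])
    Ri_a = np.linalg.solve(Rl, a)          # (R + loading*I)^{-1} a
    return Ri_a / (a.conj() @ Ri_a)        # normalise so that w^H a = 1

# Toy check with a random sample covariance and a hypothetical steering vector.
rng = np.random.default_rng(0)
N = 8
X = rng.standard_normal((N, 200)) + 1j * rng.standard_normal((N, 200))
R = X @ X.conj().T / 200
a = np.exp(1j * np.pi * np.arange(N) * np.sin(0.3))
w = mvdr_weights(R, a, loading=0.1)
print(abs(w.conj() @ a))  # ≈ 1.0: the distortionless constraint holds by construction
```

The loading term bounds the condition number of the inverted matrix, which is why the weights stay well-behaved even when the sample covariance is estimated from few snapshots.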

  8. Laplace transform analysis of a multiplicative asset transfer model

    Science.gov (United States)

    Sokolov, Andrey; Melatos, Andrew; Kieu, Tien

    2010-07-01

    We analyze a simple asset transfer model in which the transfer amount is a fixed fraction f of the giver’s wealth. The model is analyzed in a new way by Laplace transforming the master equation, solving it analytically and numerically for the steady-state distribution, and exploring the solutions for various values of f∈(0,1). The Laplace transform analysis is superior to agent-based simulations as it does not depend on the number of agents, enabling us to study entropy and inequality in regimes that are costly to address with simulations. We demonstrate that Boltzmann entropy is not a suitable (e.g. non-monotonic) measure of disorder in a multiplicative asset transfer system and suggest an asymmetric stochastic process that is equivalent to the asset transfer model.
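
The agent-based simulation that the Laplace-transform analysis is compared against can be sketched as follows; the agent count, the value of f and the step count are arbitrary choices for illustration.

```python
import random

def simulate_transfers(n_agents=100, f=0.25, steps=50_000, seed=1):
    """Monte Carlo version of the multiplicative transfer model: at each
    step a randomly chosen giver passes a fixed fraction f of its wealth
    to a randomly chosen receiver. Total wealth is conserved."""
    rng = random.Random(seed)
    w = [1.0] * n_agents
    for _ in range(steps):
        i, j = rng.randrange(n_agents), rng.randrange(n_agents)
        if i == j:
            continue
        amount = f * w[i]       # giver always keeps a (1 - f) share
        w[i] -= amount
        w[j] += amount
    return w

w = simulate_transfers()
print(round(sum(w), 6))   # total wealth is conserved: 100.0
print(max(w) > min(w))    # the initially equal wealths have spread out: True
```

This is exactly the regime where the abstract notes simulations become costly: resolving the tails of the steady-state distribution requires many agents and many steps, whereas the Laplace-transform solution is independent of the number of agents.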

  9. 76 FR 78594 - Reporting of Specified Foreign Financial Assets

    Science.gov (United States)

    2011-12-19

    ... Reporting of Specified Foreign Financial Assets AGENCY: Internal Revenue Service (IRS), Treasury. ACTION... foreign financial assets and the value of those assets is more than the applicable reporting threshold... hold specified foreign financial assets generally will be excepted from reporting such assets under...

  10. Incentive-based demand response programs designed by asset-light retail electricity providers for the day-ahead market

    International Nuclear Information System (INIS)

    Fotouhi Ghazvini, Mohammad Ali; Faria, Pedro; Ramos, Sergio; Morais, Hugo; Vale, Zita

    2015-01-01

    Following the deregulation experience of retail electricity markets in most countries, the majority of new entrants to the liberalized retail market were pure REPs (retail electricity providers). These entities were subject to financial risks because of unexpected price variations, price spikes, volatile loads and the potential for market power exertion by GENCOs (generation companies). A REP can manage the market risks by employing DR (demand response) programs and using its generation and storage assets at the distribution network to serve the customers. The proposed model suggests how a REP with light physical assets, such as DG (distributed generation) units and ESSs (energy storage systems), can survive in a competitive retail market. The paper discusses effective risk management strategies for REPs to deal with the uncertainties of the DAM (day-ahead market) and how to hedge financial losses in the market. A two-stage stochastic programming problem is formulated. It aims to establish the financial incentive-based DR programs and the optimal dispatch of the DG units and ESSs. The uncertainty of the forecasted day-ahead load demand and electricity price is also taken into account with a scenario-based approach. The principal advantage of this model for REPs is reducing the risk of financial losses in DAMs, and the main benefit for the whole system is market power mitigation by virtually increasing the price elasticity of demand and reducing the peak demand. - Highlights: • Asset-light electricity retail providers are subject to financial risks. • An incentive-based demand response program manages the financial risks. • Maximizing the payoff of electricity retail providers in the day-ahead market. • Mixed-integer nonlinear programming to manage the risks

  11. Application of a greedy algorithm to military aircraft fleet retirements

    NARCIS (Netherlands)

    Newcamp, J.M.; Verhagen, W.J.C.; Udluft, H.; Curran, Ricky

    2017-01-01

    This article presents a retirement analysis model for aircraft fleets. By employing a greedy algorithm, the presented solution is capable of identifying individually weak assets in a fleet of aircraft with inhomogeneous historical utilization. The model forecasts future retirement scenarios
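
The article does not spell out the greedy criterion in this abstract; one plausible minimal sketch, using accumulated flight hours as a hypothetical stand-in for "individually weak" assets, is the following.

```python
def greedy_retirement(fleet, retire_per_year, years):
    """Greedy retirement sketch: each year, retire the airframes with the
    highest accumulated utilization (the weakest assets go first).

    fleet: dict of tail number -> accumulated flight hours (illustrative
    metric; the paper's model uses historical utilization data).
    Returns a year-by-year retirement plan.
    """
    remaining = dict(fleet)
    plan = []
    for _ in range(years):
        # Pick the most-used airframes still in service this year.
        chosen = sorted(remaining, key=remaining.get, reverse=True)[:retire_per_year]
        for tail in chosen:
            del remaining[tail]
        plan.append(chosen)
    return plan

fleet = {"A1": 9200, "A2": 4100, "A3": 8800, "A4": 5100, "A5": 7600}
plan = greedy_retirement(fleet, retire_per_year=2, years=2)
print(plan)  # → [['A1', 'A3'], ['A5', 'A4']]
```

A greedy rule like this is attractive for inhomogeneously utilized fleets precisely because each year's decision needs only the current ranking, not a full combinatorial search over retirement sequences.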

  12. Asset management techniques for transformers

    International Nuclear Information System (INIS)

    Abu-Elanien, Ahmed E.B.; Salama, M.M.A.

    2010-01-01

    In a deregulated/reformed environment, electric utilities are under constant pressure to reduce operating costs, enhance the reliability of transmission and distribution equipment, and improve the quality of power and services to the customer. Moreover, the risk involved in running the system without proper attention to the integrity of assets in service is quite high. Additionally, the probability of losing any equipment vital to the transmission and distribution system, such as power and distribution transformers, is increasing, especially with the aging of power system assets. Today the focus of operating the power system has changed, and efforts are being directed to exploring new approaches/techniques of monitoring, diagnosis, condition evaluation, maintenance, life assessment, and the possibility of extending the life of existing assets. In this paper, a comprehensive illustration of transformer asset management activities is presented. The importance of each activity, together with the latest research done in the area, is highlighted. (author)

  13. Are Accounting Metrics Applicable to Human Resources? The Case of Return on Valuing Assignments

    Directory of Open Access Journals (Sweden)

    Adam Steen

    2011-09-01

    The importance of accounting for human resources has long been recognised by the Accounting profession. Until recently, the Human Resource Accounting (HRA) literature has been dominated by discussion as to whether humans fit the traditional definition of assets, and how to measure and report them. We investigate the concept of human capital and its measurement through a review of the HRA literature, as well as the literature in Human Resources (HR). This paper then draws on the findings of a small exploratory study into the measurement of Return on Investment (ROI) for international assignments. Interview data reveal that intangible costs and benefits are problematic when applying such a metric, and that much of the outcome from the assignment is intellectual capital in its broad sense, and therefore difficult to isolate and effectively measure.

  14. The Q theory of investment, the capital asset pricing model, and asset valuation: a synthesis.

    Science.gov (United States)

    McDonald, John F

    2004-05-01

    The paper combines Tobin's Q theory of real investment with the capital asset pricing model to produce a new and relatively simple procedure for the valuation of real assets using the income approach. Applications of the new method are provided.

  15. Asset pricing restrictions on predictability : Frictions matter

    NARCIS (Netherlands)

    de Roon, F.A.; Szymanowska, M.

    2012-01-01

    U.S. stock portfolios sorted on size; momentum; transaction costs; market-to-book, investment-to-assets, and return-on-assets (ROA) ratios; and industry classification show considerable levels and variation of return predictability, inconsistent with asset pricing models. This means that a

  16. Spectral decomposition of optimal asset-liability management

    NARCIS (Netherlands)

    Decamps, M.; de Schepper, A.; Goovaerts, M.

    2009-01-01

    This paper concerns optimal asset-liability management when the assets and the liabilities are modeled by means of correlated geometric Brownian motions as suggested in Gerber and Shiu [2003. Geometric Brownian motion models for assets and liabilities: from pension funding to optimal dividends.

  17. Asset Condition, Information Systems and Decision Models

    CERN Document Server

    Willett, Roger; Brown, Kerry; Mathew, Joseph

    2012-01-01

    Asset Condition, Information Systems and Decision Models is the second volume of the Engineering Asset Management Review Series. The manuscripts provide examples of implementations of asset information systems as well as some practical applications of condition data for diagnostics and prognostics. The increasing trend is towards prognostics rather than diagnostics, hence the need for assessment and decision models that promote the conversion of condition data into prognostic information to improve life-cycle planning for engineered assets. The research papers included here serve to support the on-going development of Condition Monitoring standards. This volume comprises selected papers from the 1st, 2nd, and 3rd World Congresses on Engineering Asset Management, which were convened under the auspices of ISEAM in collaboration with a number of organisations, including CIEAM Australia, Asset Management Council Australia, BINDT UK, and Chinese Academy of Sciences, Beijing University of Chemical Technology, Chin...

  18. DNATCO: assignment of DNA conformers at dnatco.org.

    Science.gov (United States)

    Černý, Jiří; Božíková, Paulína; Schneider, Bohdan

    2016-07-08

    The web service DNATCO (dnatco.org) classifies local conformations of DNA molecules beyond their traditional sorting into the A, B and Z DNA forms. DNATCO provides an interface to robust algorithms assigning conformation classes called NtC to dinucleotides extracted from DNA-containing structures uploaded in PDB format version 3.1 or above. The assigned dinucleotide NtC classes are further grouped into the DNA structural alphabet NtA, to the best of our knowledge the first DNA structural alphabet. The results are presented at two levels: in the form of a user-friendly visualization and analysis of the assignment, and in the form of a downloadable, more detailed table for further analysis offline. The website is free and open to all users and there is no login requirement. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  19. Algorithm Research of Individualized Travelling Route Recommendation Based on Similarity

    Directory of Open Access Journals (Sweden)

    Xue Shan

    2015-01-01

    Although commercial recommendation systems have made certain achievements in travelling route development, recommendation systems are facing a series of challenges because of people’s increasing interest in travelling. The core content of a recommendation system is its recommendation algorithm, whose strengths largely determine the effectiveness of the system. On this basis, this paper analyses the traditional collaborative filtering algorithm. After illustrating the deficiencies of that algorithm, such as rating unicity and rating matrix sparsity, this paper proposes an improved algorithm combining a user-based multi-similarity algorithm with a user-based element similarity algorithm, so as to compensate for the deficiencies of the traditional algorithm within a controllable range. Experimental results show that the improved algorithm has obvious advantages in comparison with the traditional one, and in particular an obvious effect in remedying rating matrix sparsity and rating unicity.
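
As a baseline for the improvements the abstract discusses, the traditional user-based collaborative filtering step can be sketched as follows. The ratings are invented, and the paper's multi-similarity and element-similarity extensions are not reproduced.

```python
import math

def cosine(u, v):
    """Cosine similarity over the items both users have rated."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    num = sum(u[i] * v[i] for i in common)
    den = math.sqrt(sum(u[i] ** 2 for i in common)) * \
          math.sqrt(sum(v[i] ** 2 for i in common))
    return num / den

def predict(ratings, user, item):
    """Similarity-weighted average of other users' ratings for `item`."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = cosine(ratings[user], r)
        num += s * r[item]
        den += abs(s)
    return num / den if den else None

# Invented user x route ratings on a 1-5 scale.
ratings = {
    "u1": {"route_a": 5, "route_b": 3},
    "u2": {"route_a": 4, "route_b": 2, "route_c": 4},
    "u3": {"route_a": 1, "route_b": 5, "route_c": 2},
}
print(round(predict(ratings, "u1", "route_c"), 2))  # ≈ 3.19
```

The sparsity problem the paper targets is visible even here: the prediction for u1 rests on only two neighbours, so any extra similarity signal (the "multi-similarity" idea) directly improves its reliability.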

  20. Analysis System for Self-Efficacy Training (ASSET). Assessing treatment fidelity of self-management interventions.

    Science.gov (United States)

    Zinken, Katarzyna M; Cradock, Sue; Skinner, T Chas

    2008-08-01

    The paper presents the development of a coding tool for self-efficacy oriented interventions in diabetes self-management programmes (Analysis System for Self-Efficacy Training, ASSET) and explores its construct validity and clinical utility. Based on four sources of self-efficacy (i.e., mastery experience, role modelling, verbal persuasion and physiological and affective states), published self-efficacy based interventions for diabetes care were analysed in order to identify specific verbal behavioural techniques. Video-recorded facilitating behaviours were evaluated using ASSET. The reliability between four coders was high (κ = 0.71). ASSET enabled assessment of both self-efficacy based techniques and participants' response to those techniques. Individual patterns of delivery and shifts over time across facilitators were found. In the presented intervention we observed that self-efficacy utterances were followed by longer patient verbal responses than non-self-efficacy utterances. These detailed analyses with ASSET provide rich data and give the researcher an insight into the underlying mechanism of the intervention process. By providing a detailed description of self-efficacy strategies, ASSET can be used by health care professionals to guide reflective practice and support training programmes.

  1. Duality based optical flow algorithms with applications

    DEFF Research Database (Denmark)

    Rakêt, Lars Lau

    We consider the popular TV-L1 optical flow formulation, and the so-called duality based algorithm for minimizing the TV-L1 energy. The original formulation is extended to allow for vector valued images, and minimization results are given. In addition we consider different definitions of total variation regularization, and related formulations of the optical flow problem that may be used with a duality based algorithm. We present a highly optimized algorithmic setup to estimate optical flows, and give five novel applications. The first application is registration of medical images, where X-ray images of different hands, taken using different imaging devices, are registered using a TV-L1 optical flow algorithm. We propose to regularize the input images, using sparsity enhancing regularization of the image gradient to improve registration results. The second application is registration of 2D

  2. A hardware-algorithm co-design approach to optimize seizure detection algorithms for implantable applications.

    Science.gov (United States)

    Raghunathan, Shriram; Gupta, Sumeet K; Markandeya, Himanshu S; Roy, Kaushik; Irazoqui, Pedro P

    2010-10-30

    Implantable neural prostheses that deliver focal electrical stimulation upon demand are rapidly emerging as an alternate therapy for roughly a third of the epileptic patient population that is medically refractory. Seizure detection algorithms enable feedback mechanisms to provide focally and temporally specific intervention. Real-time feasibility and computational complexity often limit most reported detection algorithms to implementations using computers for bedside monitoring or external devices communicating with the implanted electrodes. A comparison of algorithms based on detection efficacy does not present a complete picture of the feasibility of the algorithm with limited computational power, as is the case with most battery-powered applications. We present a two-dimensional design optimization approach that takes into account both detection efficacy and hardware cost in evaluating algorithms for their feasibility in an implantable application. Detection features are first compared for their ability to detect electrographic seizures from micro-electrode data recorded from kainate-treated rats. Circuit models are then used to estimate the dynamic and leakage power consumption of the compared features. A score is assigned based on detection efficacy and the hardware cost for each of the features, then plotted on a two-dimensional design space. An optimal combination of compared features is used to construct an algorithm that provides maximal detection efficacy per unit hardware cost. The methods presented in this paper would facilitate the development of a common platform to benchmark seizure detection algorithms for comparison and feasibility analysis in the next generation of implantable neuroprosthetic devices to treat epilepsy. Copyright © 2010 Elsevier B.V. All rights reserved.
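
The paper's two-dimensional scoring idea (detection efficacy against hardware cost) can be sketched as a simple ranking; the feature names, the numbers, and the ratio-based score below are all illustrative, not the paper's actual metric or results.

```python
def score(features, alpha=1.0):
    """Rank candidate seizure-detection features by efficacy per unit
    hardware cost.

    features: name -> (detection efficacy in [0, 1], estimated power cost
    in microwatts). alpha weights how strongly power is penalised.
    """
    return sorted(features,
                  key=lambda f: features[f][0] / features[f][1] ** alpha,
                  reverse=True)

# Hypothetical features with invented efficacy/power figures.
features = {
    "line_length":   (0.90, 2.0),
    "signal_energy": (0.85, 1.0),
    "wavelet_coef":  (0.95, 8.0),
}
print(score(features))  # → ['signal_energy', 'line_length', 'wavelet_coef']
```

The ranking shows the design trade-off the paper emphasises: the most accurate feature (the wavelet coefficient here) drops to last place once its power cost is taken into account, which is why a battery-powered implant may prefer a cheaper, slightly less accurate feature set.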

  3. Application of quantum master equation for long-term prognosis of asset-prices

    Science.gov (United States)

    Khrennikova, Polina

    2016-05-01

    This study combines the disciplines of behavioral finance and an extension of econophysics, namely the concepts and mathematical structure of quantum physics. We apply the formalism of quantum theory to model the dynamics of some correlated financial assets, where the proposed model can be potentially applied for developing a long-term prognosis of asset price formation. At the informational level, the asset price states interact with each other by the means of a “financial bath”. The latter is composed of agents' expectations about the future developments of asset prices on the finance market, as well as financially important information from mass-media, society, and politicians. One of the essential behavioral factors leading to the quantum-like dynamics of asset prices is the irrationality of agents' expectations operating on the finance market. These expectations lead to a deeper type of uncertainty concerning the future price dynamics of the assets, than given by a classical probability theory, e.g., in the framework of the classical financial mathematics, which is based on the theory of stochastic processes. The quantum dimension of the uncertainty in price dynamics is expressed in the form of the price-states superposition and entanglement between the prices of the different financial assets. In our model, the resolution of this deep quantum uncertainty is mathematically captured with the aid of the quantum master equation (its quantum Markov approximation). We illustrate our model of preparation of a future asset price prognosis by a numerical simulation, involving two correlated assets. Their returns interact more intensively, than understood by a classical statistical correlation. The model predictions can be extended to more complex models to obtain price configuration for multiple assets and portfolios.

  4. Defining ecosystem assets for natural capital accounting

    NARCIS (Netherlands)

    Hein, Lars; Bagstad, Ken; Edens, Bram; Obst, Carl; Jong, de Rixt; Lesschen, Jan Peter

    2016-01-01

    In natural capital accounting, ecosystems are assets that provide ecosystem services to people. Assets can be measured using both physical and monetary units. In the international System of Environmental-Economic Accounting, ecosystem assets are generally valued on the basis of the net present

  5. QUANTITATIVE INDICATORS OF THE SECURITIZATION OF ASSETS

    Directory of Open Access Journals (Sweden)

    Denis VOSTRICOV

    2018-02-01

    Full Text Available Securitization is instrumental in increasing return on capital through the withdrawal of lending activities from the balance sheet, accompanied by an off-balance flow of fee income, which is less capital-intensive. The purpose of this paper is to analyze the quantitative indicators characterizing the securitization of assets. For drafting this article, the method of analysis, the method of synthesis, the logical and dialectical method, the normative method, the study of statistical samples and time series of expert evaluations (Standard and Poor's), personal observations, and monographic studies have been used. The main difference between the securitization of assets and traditional ways of financing is the achievement of a number of secondary goals in attracting financial resources, which can play a significant role in choosing to favour the securitization of assets over other types of financing. In particular, it gives a possibility to write off the assets from the balance sheet along with the relevant obligations under the securities, to expand the range of potential investors accompanied by a reduction of credit risk, interest rate risk and liquidity risk, as well as to improve the quality of management of assets, liabilities and risks. All of these secondary effects are achieved by the isolation of selected assets from the total credit risk of the enterprise raising funds, which forms the pivotal actuality and significance of asset securitization. The article contains demonstrations of quantitative and qualitative indicators characterizing the securitization of assets.

  6. Making the Invisible Visible: the Intangible Assets Recognition, the Valuation and Reporting in Romania

    OpenAIRE

    Nicoleta Radneantu

    2009-01-01

    The emergence of knowledge-based companies increased the importance of intangible assets, assets that were considered the most competitive advantages of companies. So, in this paper I tried to answer the following question: Which is the Romanian accounting reality about the intangible assets recognition, evaluation and reporting? What can we do that traditional financial statements do not become mostly useless for their end users?

  7. Developing Asset Life Cycle Management capabilities through the implementation of Asset Life Cycle Plans – an Action Research project

    OpenAIRE

    Ruitenburg, Richard; Braaksma, Anne Johannes Jan

    2017-01-01

    Asset Life Cycle Management is a strategic approach to managing physical assets over their complete life cycle. However, the literature and the recent ISO 55,000 standard do not offer guidance as to how to develop such an approach. This paper investigates the main capabilities for Asset Life Cycle Management by means of a four year Action Research project implementing Asset Life Cycle Plans. Five main capabilities emerged: 1. strategic information use; 2. alignment of operations and strategy;...

  8. Experience with the ASSET service in Slovakia

    International Nuclear Information System (INIS)

    Misak, J.

    1996-01-01

    The experience with the ASSET service in Slovakia is described, including the following: ASSET follow-up mission to Bohunice Unit 1-2 NPP; IAEA peer review of the national Incident Reporting System in the Slovak Republic; ASSET seminar on prevention of incidents, Bratislava, January 8-12, 1996

  9. Experience with the ASSET service in Slovakia

    Energy Technology Data Exchange (ETDEWEB)

    Misak, J [Nuclear Regulatory Authority, Bratislava (Slovakia)

    1997-12-31

    The experience with the ASSET service in Slovakia is described, including the following: ASSET follow-up mission to Bohunice Unit 1-2 NPP; IAEA peer review of the national Incident Reporting System in the Slovak Republic; ASSET seminar on prevention of incidents, Bratislava, January 8-12, 1996.

  10. Command and Control of Space Assets Through Internet-Based Technologies Demonstrated

    Science.gov (United States)

    Foltz, David A.

    2002-01-01

    The NASA Glenn Research Center successfully demonstrated a transmission-control-protocol/ Internet-protocol- (TCP/IP) based approach to the command and control of on-orbit assets over a secure network. This is a significant accomplishment because future NASA missions will benefit by using Internet-standards-based protocols. Benefits of this Internet-based space command and control system architecture include reduced mission costs and increased mission efficiency. The demonstration proved that this communications architecture is viable for future NASA missions. This demonstration was a significant feat involving multiple NASA organizations and industry. Phillip Paulsen, from Glenn's Project Development and Integration Office, served as the overall project lead, and David Foltz, from Glenn's Satellite Networks and Architectures Branch, provided the hybrid networking support for the required Internet connections. The goal was to build a network that would emulate a connection between a space experiment on the International Space Station and a researcher accessing the experiment from anywhere on the Internet, as shown. The experiment was interfaced to a wireless 802.11 network inside the demonstration area. The wireless link provided connectivity to the Tracking and Data Relay Satellite System (TDRSS) Internet Link Terminal (TILT) satellite uplink terminal located 300 ft away in a parking lot on top of a panel van. TILT provided a crucial link in this demonstration. Leslie Ambrose, NASA Goddard Space Flight Center, provided the TILT/TDRSS support. The TILT unit transmitted the signal to TDRS 6 and was received at the White Sands Second TDRSS Ground Station. This station provided the gateway to the Internet. Coordination also took place at the White Sands station to install a Veridian Firewall and automated security incident measurement (ASIM) system to the Second TDRSS Ground Station Internet gateway. The firewall provides a trusted network for the simulated space

  11. Historical development of derivatives’ underlying assets

    Directory of Open Access Journals (Sweden)

    Sylvie Riederová

    2011-01-01

    Full Text Available Derivative transactions are able to eliminate the unexpected risk arising from the price volatility of an asset; this need for risk elimination underlies the application of derivatives. This paper is focused on derivatives' underlying assets themselves. With a plain description, supported by progressive summarization, the authors analysed the relevant theoretical sources dealing with derivatives, their underlying assets and their development over the centuries. Starting in ancient history, around 2000 BC, the first non-standard transactions, very close to today's understanding of derivatives, began to be concluded between counterparties. Over time, in different kingdoms and empires, derivatives came to play a significant role in daily life, helping to reduce the uncertainty of the future. But the real golden era for derivatives started with the so-called 'New derivative markets' and computer-supported trading. They have extended their form from simple tools to most complex structures, without changing their main purpose: hedging and risk reduction. For the main purpose of this paper it is impossible to separate the development of derivatives from the very wide extension of underlying assets. The change of these assets was one of the main drivers in derivatives development. Understanding the dynamic character of these assets helps to understand the world of derivatives.

  12. The missing link between maintenance contracts and flexible asset management

    OpenAIRE

    Marttonen-Arola, Salla; Viskari, Sari; Kärri, Timo

    2013-01-01

    The paper shows how additional value can be created in maintenance collaboration through integrating the features of flexible asset management into maintenance contracts. We expand the traditional typology of maintenance contracts and introduce a new contract type, flexible asset management contracts. Also value sharing in the new contract type is discussed. Our logic for sharing the value is based on reaching for win-win situations in industrial maintenance collaboration. Finally, we present...

  13. Evidence-based algorithm for heparin dosing before cardiopulmonary bypass. Part 1: Development of the algorithm.

    Science.gov (United States)

    McKinney, Mark C; Riley, Jeffrey B

    2007-12-01

    The incidence of heparin resistance during adult cardiac surgery with cardiopulmonary bypass has been reported at 15%-20%. The consistent use of a clinical decision-making algorithm may increase the consistency of patient care and likely reduce the total required heparin dose and other problems associated with heparin dosing. After a directed survey of practicing perfusionists regarding treatment of heparin resistance and a literature search for high-level evidence regarding the diagnosis and treatment of heparin resistance, an evidence-based decision-making algorithm was constructed. The face validity of the algorithm decisive steps and logic was confirmed by a second survey of practicing perfusionists. The algorithm begins with review of the patient history to identify predictors for heparin resistance. The definition for heparin resistance contained in the algorithm is an activated clotting time 450 IU/kg heparin loading dose. Based on the literature, the treatment for heparin resistance used in the algorithm is anti-thrombin III supplement. The algorithm seems to be valid and is supported by high-level evidence and clinician opinion. The next step is a human randomized clinical trial to test the clinical procedure guideline algorithm vs. current standard clinical practice.

  14. A range-based predictive localization algorithm for WSID networks

    Science.gov (United States)

    Liu, Yuan; Chen, Junjie; Li, Gang

    2017-11-01

    Most studies on localization algorithms are conducted on the sensor networks with densely distributed nodes. However, the non-localizable problems are prone to occur in the network with sparsely distributed sensor nodes. To solve this problem, a range-based predictive localization algorithm (RPLA) is proposed in this paper for the wireless sensor networks syncretizing the RFID (WSID) networks. The Gaussian mixture model is established to predict the trajectory of a mobile target. Then, the received signal strength indication is used to reduce the residence area of the target location based on the approximate point-in-triangulation test algorithm. In addition, collaborative localization schemes are introduced to locate the target in the non-localizable situations. Simulation results verify that the RPLA achieves accurate localization for the network with sparsely distributed sensor nodes. The localization accuracy of the RPLA is 48.7% higher than that of the APIT algorithm, 16.8% higher than that of the single Gaussian model-based algorithm and 10.5% higher than that of the Kalman filtering-based algorithm.
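    The approximate point-in-triangulation test referenced above ultimately reduces to deciding whether a node lies inside a triangle formed by three anchors. A minimal sign-based version of that geometric predicate, as an illustrative sketch rather than the RPLA implementation:

```python
def point_in_triangle(p, a, b, c):
    """Sign-based point-in-triangle test (the geometric building block of
    APIT-style localization): p is inside iff it lies on the same side of
    all three directed edges. Coordinates are (x, y) tuples; this is an
    illustrative sketch, not the paper's implementation."""
    def cross(o, u, v):
        # z-component of (u - o) x (v - o)
        return (u[0] - o[0]) * (v[1] - o[1]) - (u[1] - o[1]) * (v[0] - o[0])
    d1, d2, d3 = cross(a, b, p), cross(b, c, p), cross(c, a, p)
    has_neg = (d1 < 0) or (d2 < 0) or (d3 < 0)
    has_pos = (d1 > 0) or (d2 > 0) or (d3 > 0)
    return not (has_neg and has_pos)   # all same sign (or on an edge)

print(point_in_triangle((1, 1), (0, 0), (4, 0), (0, 4)))  # True
print(point_in_triangle((5, 5), (0, 0), (4, 0), (0, 4)))  # False
```

    In APIT-style schemes this predicate is evaluated for many anchor triangles, and the target's residence area is intersected from the triangles that contain it.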

  15. Setting Optimal Bounds on Risk in Asset Allocation - a Convex Program

    Directory of Open Access Journals (Sweden)

    James E. Falk

    2002-10-01

    Full Text Available The 'Portfolio Selection Problem' is traditionally viewed as selecting a mix of investment opportunities that maximizes the expected return subject to a bound on risk. However, in reality, portfolios are made up of a few 'asset classes' that consist of similar opportunities. The asset classes are managed by individual 'sub-managers', under guidelines set by an overall portfolio manager. Once a benchmark (the 'strategic' allocation) has been set, an overall manager may choose to allow the sub-managers some latitude in which opportunities make up the classes. He may choose some overall bound on risk (as measured by the variance) and wish to set bounds that constrain the sub-managers. Mathematically, we show that the problem is equivalent to finding a hyper-rectangle of maximal volume within an ellipsoid. It is a convex program, albeit with potentially a large number of constraints. We suggest a cutting-plane algorithm to solve the problem and include computational results on a set of randomly generated problems as well as a real-world problem taken from the literature.
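    In the special case of an axis-aligned (diagonal) ellipsoid, the maximal-volume inscribed box described above has a closed form, which makes the geometry concrete; the rotated general case is the convex program the authors attack with cutting planes. A sketch of the diagonal case, with illustrative risk bounds:

```python
import math

# Closed-form special case of the paper's geometric problem: the maximal-volume
# axis-aligned box [-d, d] inside the ellipsoid sum_i p_i * x_i^2 <= 1.
# Maximizing prod(d_i) subject to the corner constraint sum_i p_i * d_i^2 <= 1
# gives d_i = 1 / sqrt(n * p_i) by Lagrange multipliers. The general, rotated
# case requires the cutting-plane method of the paper; this sketch covers only
# the diagonal case, with made-up coefficients.

def max_box_in_ellipsoid(p):
    n = len(p)
    return [1.0 / math.sqrt(n * pi) for pi in p]

p = [4.0, 1.0, 0.25]          # illustrative per-class risk coefficients
d = max_box_in_ellipsoid(p)
corner = sum(pi * di ** 2 for pi, di in zip(p, d))
print(d)        # half-widths of the optimal box
print(corner)   # the corner constraint is active (equals 1)
```

    The fact that the risk constraint is active at the optimum mirrors the intuition that the overall manager's risk budget is fully distributed among the sub-managers.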

  16. DE and NLP Based QPLS Algorithm

    Science.gov (United States)

    Yu, Xiaodong; Huang, Dexian; Wang, Xiong; Liu, Bo

    As a novel evolutionary computing technique, Differential Evolution (DE) has been considered an effective optimization method for complex optimization problems, and has achieved many successful applications in engineering. In this paper, a new algorithm for Quadratic Partial Least Squares (QPLS) based on Nonlinear Programming (NLP) is presented, and DE is used to solve the NLP so as to calculate the optimal input weights and the parameters of the inner relationship. The simulation results, based on the soft measurement of the diesel oil solidifying point on a real crude distillation unit, demonstrate the superiority of the proposed algorithm over linear PLS and over QPLS based on Sequential Quadratic Programming (SQP) in terms of fitting accuracy and computational cost.
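    For readers unfamiliar with DE, the following is a minimal DE/rand/1/bin sketch minimizing a stand-in objective (the sphere function); the paper's actual use embeds DE inside the QPLS/NLP formulation, which is not reproduced here.

```python
import random

# Minimal DE/rand/1/bin sketch (illustrative only). For each population member,
# a mutant is built from three distinct others, crossed over coordinate-wise,
# and kept if it is no worse than the parent.

def sphere(x):
    return sum(v * v for v in x)

def differential_evolution(f, dim, pop_size=20, F=0.8, CR=0.9,
                           generations=300, seed=1):
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jrand = rng.randrange(dim)          # force at least one mutated coordinate
            trial = [pop[a][j] + F * (pop[b][j] - pop[c][j])
                     if (rng.random() < CR or j == jrand) else pop[i][j]
                     for j in range(dim)]
            if f(trial) <= f(pop[i]):           # greedy selection
                pop[i] = trial
    return min(pop, key=f)

best = differential_evolution(sphere, dim=5)
print(sphere(best))   # close to 0
```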

  17. A difference tracking algorithm based on discrete sine transform

    Science.gov (United States)

    Liu, HaoPeng; Yao, Yong; Lei, HeBing; Wu, HaoKun

    2018-04-01

    Target tracking is an important field of computer vision. Template-matching tracking algorithms based on the sum of squared differences (SSD) and the normalized correlation coefficient (NCC) are very sensitive to changes in image gray level: when the brightness or gray level changes, the tracking algorithm is disturbed by high-frequency information, tracking accuracy is reduced, and the tracked target may be lost. In this paper, a difference tracking algorithm based on the discrete sine transform is proposed to reduce the influence of changes in image gray level or brightness. The algorithm, which combines the discrete sine transform with a difference operation, maps the target image into a one-dimensional digital sequence. A Kalman filter predicts the target position, the Hamming distance measures the degree of similarity between a candidate and the template, and the window closest to the template is selected as the target to be tracked; the selected target then updates the template. Target tracking is achieved on this basis. The algorithm is tested in this paper. Compared with the SSD and NCC template-matching algorithms, it tracks the target stably when the image gray level or brightness changes, and its tracking speed can meet real-time requirements.
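    Two of the ingredients described above can be sketched briefly: a discrete sine transform maps a window to a one-dimensional coefficient sequence, and the Hamming distance between binarized sequences scores similarity. The transform matrix, the sign binarization, and the test data below are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

# Illustrative sketch (assumptions, not the paper's method): a DST-I matrix,
# built explicitly, turns a 1-D window into transform coefficients; the signs
# of the coefficients form a binary signature, and the Hamming distance between
# signatures scores how well a candidate window matches the template.

def dst_signature(x):
    n = len(x)
    k = np.arange(1, n + 1)
    S = np.sin(np.pi * np.outer(k, k) / (n + 1))   # DST-I basis matrix
    return (S @ x) > 0                             # sign-binarized coefficients

def hamming(a, b):
    return int(np.sum(a != b))

rng = np.random.default_rng(0)
signal = np.cumsum(rng.normal(size=200))           # synthetic 1-D "image" sequence
template = signal[80:112]                          # the pattern being tracked
sig_t = dst_signature(template)

# slide a window and keep the position with the smallest Hamming distance
w = len(template)
best = min(range(len(signal) - w),
           key=lambda i: hamming(dst_signature(signal[i:i + w]), sig_t))
print(best)   # index of the best-matching window
```

    Because only coefficient signs are compared, a proportional change in amplitude leaves the signature unchanged, which illustrates the abstract's robustness claim.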

  18. Macroeconomic influences on optimal asset allocation

    OpenAIRE

    Flavin, Thomas; Wickens, M.R.

    2003-01-01

    We develop a tactical asset allocation strategy that incorporates the effects of macroeconomic variables. The joint distribution of financial asset returns and the macroeconomic variables is modelled using a VAR with a multivariate GARCH (M-GARCH) error structure. As a result, the portfolio frontier is time varying and subject to contagion from the macroeconomic variable. Optimal asset allocation requires that this be taken into account. We illustrate how to do this using three ri...

  19. Managing corporate assets to maximize value

    International Nuclear Information System (INIS)

    Rubin, L.

    1992-01-01

    As the utility industry environment becomes more complex, pressures grow for managers to make more effective use of all their assets - including fuel, equipment, and personnel. Improving the management of assets leads to the delivery of greater value to ratepayers, stockholders, and society. EPRI is sponsoring a broad research program to help utilities effectively apply the tools needed in these changing business conditions, especially the latest in cost and quality management and asset management techniques

  20. Intelligent tactical asset allocation support system

    OpenAIRE

    Hiemstra, Y.

    1995-01-01

    This paper presents an advanced support system for Tactical Asset Allocation. Asset allocation explains over 90% of portfolio performance (Brinson, Hood and Beebower, 1988). Tactical asset allocation adjusts a strategic portfolio on the basis of short-term market outlooks. The system includes a prediction model that forecasts quarterly excess returns on the S&P 500, an optimization model that adjusts a user-specified strategic portfolio on the basis of the excess return forecast, and a compo...

  1. Approaches of Improving University Assets Management Efficiency

    Science.gov (United States)

    Wang, Jingliang

    2015-01-01

    University assets management, as an important content of modern university management, is generally confronted with the issue of low efficiency. Currently, addressing the problems exposed in university assets management and taking appropriate corrective measures is an urgent issue facing Chinese university assets management sectors. In this…

  2. Rule-Based Analytic Asset Management for Space Exploration Systems (RAMSES), Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Currently, the state-of-the-art in space asset tracking and information management is bar-coding with relational database support. To support NASA's need for...

  3. Current approaches to assessing intangible assets

    OpenAIRE

    Урусова, Зінаїда Петрівна

    2013-01-01

    The article analyzes methods of assessing intangible assets in Ukraine as well as in accordance with International Financial Reporting Standards. Contemporary approaches to assessing intangible assets have been researched.

  4. Protein secondary structure assignment revisited: a detailed analysis of different assignment methods

    Directory of Open Access Journals (Sweden)

    de Brevern Alexandre G

    2005-09-01

    Full Text Available Abstract Background A number of methods are now available to perform automatic assignment of periodic secondary structures from atomic coordinates, based on different characteristics of the secondary structures. In general these methods exhibit a broad consensus as to the location of most helix and strand core segments in protein structures. However the termini of the segments are often ill-defined and it is difficult to decide unambiguously which residues at the edge of the segments have to be included. In addition, there is a "twilight zone" where secondary structure segments depart significantly from the idealized models of Pauling and Corey. For these segments, one has to decide whether the observed structural variations are merely distortions or whether they constitute a break in the secondary structure. Methods To address these problems, we have developed a method for secondary structure assignment, called KAKSI. Assignments made by KAKSI are compared with assignments given by DSSP, STRIDE, XTLSSTR, PSEA and SECSTR, as well as secondary structures found in PDB files, on 4 datasets (X-ray structures with different resolution ranges, NMR structures). Results A detailed comparison of KAKSI assignments with those of STRIDE and PSEA reveals that KAKSI assigns slightly longer helices and strands than STRIDE in case of one-to-one correspondence between the segments. However, KAKSI tends also to favor the assignment of several short helices when STRIDE and PSEA assign longer, kinked, helices. Helices assigned by KAKSI have geometrical characteristics close to those described in the PDB. They are more linear than helices assigned by other methods. The same tendency to split long segments is observed for strands, although less systematically. We present a number of cases of secondary structure assignments that illustrate this behavior. Conclusion Our method provides valuable assignments which favor the regularity of secondary structure segments.

  5. ACCOUNTING POLICIES APPLIED BY ENTITIES AND THE VALUE OF FIXED ASSETS

    Directory of Open Access Journals (Sweden)

    Partenie Dumbravă

    2012-01-01

    Full Text Available The objective of this paper is to present the accounting policies applied by medium taxpayers in Covasna county and to analyse the factors influencing the choice of accounting policies related to tangible assets. The study contains an analysis of the answers given by respondents to the questionnaire sent, with regard to: evaluation bases, revaluation of the tangible assets of the entities, amortization methods used, and the depreciable value. The results obtained show, among other things, that the greatest effect on accounting policy choices comes from influence factor no. 7 – whether the tangible assets' value in the financial statements presents a true and fair view of them.

  6. Optimal density assignment to 2D diode array detector for different dose calculation algorithms in patient specific VMAT QA

    International Nuclear Information System (INIS)

    Park, So Yeon; Park, Jong Min; Choi, Chang Heon; Chun, MinSoo; Han, Ji Hye; Cho, Jin Dong; Kim, Jung In

    2017-01-01

    The purpose of this study is to assign an appropriate density to virtual phantom for 2D diode array detector with different dose calculation algorithms to guarantee the accuracy of patient-specific QA. Ten VMAT plans with 6 MV photon beam and ten VMAT plans with 15 MV photon beam were selected retrospectively. The computed tomography (CT) images of MapCHECK2 with MapPHAN were acquired to design the virtual phantom images. For all plans, dose distributions were calculated for the virtual phantoms with four different materials by AAA and AXB algorithms. The four materials were polystyrene, 455 HU, Jursinic phantom, and PVC. Passing rates for several gamma criteria were calculated by comparing the measured dose distribution with calculated dose distributions of four materials. For validation of AXB modeling in clinic, the mean percentages of agreement in the cases of dose difference criteria of 1.0% and 2.0% for 6 MV were 97.2%±2.3%, and 99.4%±1.1%, respectively while those for 15 MV were 98.5%±0.85% and 99.8%±0.2%, respectively. In the case of 2%/2 mm, all mean passing rates were more than 96.0% and 97.2% for 6 MV and 15 MV, respectively, regardless of the virtual phantoms of different materials and dose calculation algorithms. The passing rates in all criteria slightly increased for AXB as well as AAA when using 455 HU rather than polystyrene. The virtual phantom which had a 455 HU value showed high passing rates for all gamma criteria. To guarantee the accuracy of patient-specific VMAT QA, each institution should fine-tune the mass density or HU values of this device.

  7. Optimal density assignment to 2D diode array detector for different dose calculation algorithms in patient specific VMAT QA

    Energy Technology Data Exchange (ETDEWEB)

    Park, So Yeon; Park, Jong Min; Choi, Chang Heon; Chun, MinSoo; Han, Ji Hye; Cho, Jin Dong; Kim, Jung In [Dept. of Radiation Oncology, Seoul National University Hospital, Seoul (Korea, Republic of)

    2017-03-15

    The purpose of this study is to assign an appropriate density to virtual phantom for 2D diode array detector with different dose calculation algorithms to guarantee the accuracy of patient-specific QA. Ten VMAT plans with 6 MV photon beam and ten VMAT plans with 15 MV photon beam were selected retrospectively. The computed tomography (CT) images of MapCHECK2 with MapPHAN were acquired to design the virtual phantom images. For all plans, dose distributions were calculated for the virtual phantoms with four different materials by AAA and AXB algorithms. The four materials were polystyrene, 455 HU, Jursinic phantom, and PVC. Passing rates for several gamma criteria were calculated by comparing the measured dose distribution with calculated dose distributions of four materials. For validation of AXB modeling in clinic, the mean percentages of agreement in the cases of dose difference criteria of 1.0% and 2.0% for 6 MV were 97.2%±2.3%, and 99.4%±1.1%, respectively while those for 15 MV were 98.5%±0.85% and 99.8%±0.2%, respectively. In the case of 2%/2 mm, all mean passing rates were more than 96.0% and 97.2% for 6 MV and 15 MV, respectively, regardless of the virtual phantoms of different materials and dose calculation algorithms. The passing rates in all criteria slightly increased for AXB as well as AAA when using 455 HU rather than polystyrene. The virtual phantom which had a 455 HU value showed high passing rates for all gamma criteria. To guarantee the accuracy of patient-specific VMAT QA, each institution should fine-tune the mass density or HU values of this device.

  8. The Accounting Practices of Heritage Assets

    OpenAIRE

    Hassan, Nor Laili; Saad, Natrah; Ahmad, Halimah Nasibah; Salleh, Md. Suhaimi Md.; Ismail, Mohamad Sharofi

    2016-01-01

    Accrual-based accounting is introduced to government agencies with the intention of upholding prudent fiscal management and improving the efficiency of financial management and accounting of the Malaysian Government. For that purpose, the Malaysian Public Sector Accounting Standards (MPSAS) were introduced as a main reference in applying accrual-based accounting. MPSAS 17, which deals with heritage assets, will take effect in 2017. The study intended to discover how overseas museums report the...

  9. On the management and operation of enterprises intangible asset

    Science.gov (United States)

    Zhu, Yu; Wang, Hong

    2011-10-01

    Since the advent of the knowledge economy, the management of intangible assets has become an important part of enterprise management. This article discusses the problems of managing intangible assets, the properties of intangible assets, and the channels for managing and operating intangible assets, and stresses the important role of intangible assets in the development and innovation of the enterprise.

  10. Generalized phase retrieval algorithm based on information measures

    OpenAIRE

    Shioya, Hiroyuki; Gohara, Kazutoshi

    2006-01-01

    An iterative phase retrieval algorithm based on the maximum entropy method (MEM) is presented. Introducing a new generalized information measure, we derive a novel class of algorithms which includes the conventionally used error reduction algorithm and a MEM-type iterative algorithm which is presented for the first time. These different phase retrieval methods are unified on the basis of the framework of information measures used in information theory.
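    The conventionally used error-reduction algorithm that this class of methods generalizes alternates between enforcing the measured Fourier magnitude and an object-domain support constraint. A minimal one-dimensional sketch with synthetic data (the MEM-type generalization itself is not reproduced here):

```python
import numpy as np

# Minimal error-reduction (Gerchberg-Saxton type) phase retrieval sketch:
# alternate between imposing the known Fourier magnitude and a support
# constraint in the object domain. Signal, support, and iteration count
# are illustrative choices.

rng = np.random.default_rng(0)
n = 64
support = np.zeros(n, dtype=bool)
support[:16] = True
x_true = np.zeros(n)
x_true[support] = rng.normal(size=16)
mag = np.abs(np.fft.fft(x_true))        # "measured" Fourier magnitude

def mag_err(v):
    """Fourier-domain magnitude residual of an iterate."""
    return float(np.linalg.norm(np.abs(np.fft.fft(v)) - mag))

x = rng.normal(size=n)                  # random start
err0 = mag_err(x)
for _ in range(500):
    X = np.fft.fft(x)
    X = mag * np.exp(1j * np.angle(X))  # impose the measured magnitude
    x = np.real(np.fft.ifft(X))
    x[~support] = 0.0                   # impose the object-domain support

print(err0, mag_err(x))                 # the residual decreases
```

    Error reduction is known to be non-increasing in this residual, which is why it serves as the baseline that the generalized information-measure algorithms aim to improve upon.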

  11. Efficient sampling algorithms for Monte Carlo based treatment planning

    International Nuclear Information System (INIS)

    DeMarco, J.J.; Solberg, T.D.; Chetty, I.; Smathers, J.B.

    1998-01-01

    Efficient sampling algorithms are necessary for producing a fast Monte Carlo based treatment planning code. This study evaluates several aspects of a photon-based tracking scheme and the effect of optimal sampling algorithms on the efficiency of the code. Four areas were tested: pseudo-random number generation, generalized sampling of a discrete distribution, sampling from the exponential distribution, and delta scattering as applied to photon transport through a heterogeneous simulation geometry. Generalized sampling of a discrete distribution using the cutpoint method can produce speedup gains of one order of magnitude versus conventional sequential sampling. Photon transport modifications based upon the delta scattering method were implemented and compared with a conventional boundary and collision checking algorithm. The delta scattering algorithm is faster by a factor of six versus the conventional algorithm for a boundary size of 5 mm within a heterogeneous geometry. A comparison of portable pseudo-random number algorithms and exponential sampling techniques is also discussed
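    The cutpoint method credited above with an order-of-magnitude speedup can be sketched as follows: precompute, for m equal slices of [0, 1), the first CDF index reachable from that slice, so each draw needs only a short forward scan instead of a full sequential search. The probabilities below are illustrative.

```python
import random

# Sketch of the cutpoint method for sampling a discrete distribution.
# build_cutpoints precomputes, for each slice [i/m, (i+1)/m), the first CDF
# index whose cumulative value exceeds i/m; sample jumps there and scans
# forward briefly. Probabilities are illustrative, not from the study.

def build_cutpoints(probs, m):
    cdf, acc = [], 0.0
    for p in probs:
        acc += p
        cdf.append(acc)
    cut, j = [], 0
    for i in range(m):
        u = i / m
        while cdf[j] <= u:
            j += 1
        cut.append(j)
    return cdf, cut

def sample(cdf, cut, m, rng):
    u = rng.random()
    j = cut[int(u * m)]          # jump close to the right CDF entry
    while cdf[j] < u:            # short forward scan
        j += 1
    return j

probs = [0.05, 0.1, 0.2, 0.4, 0.15, 0.1]
m = 64
cdf, cut = build_cutpoints(probs, m)
rng = random.Random(42)
counts = [0] * len(probs)
for _ in range(100_000):
    counts[sample(cdf, cut, m, rng)] += 1
print([c / 100_000 for c in counts])  # close to probs
```

    Larger m shortens the expected scan at the cost of table memory, the same speed/storage trade-off that makes the method attractive inside a Monte Carlo transport loop.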

  12. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Shi-hua Zhan

    2016-01-01

    Full Text Available Simulated annealing (SA algorithm is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor for its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA algorithm to solve the traveling salesman problem (TSP. The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
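    The core of the list-based cooling schedule can be sketched on a small random TSP instance: keep a list of temperatures, always use the current maximum in the Metropolis test, and replace that maximum with the cost of each accepted uphill move. The neighborhood (2-opt) and list-update details below are simplified assumptions, not the paper's exact procedure.

```python
import math
import random

# Simplified LBSA sketch for TSP (illustrative assumptions: 2-opt moves,
# temperature list seeded from random move costs, max temperature replaced
# by each accepted uphill cost).

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def lbsa_tsp(pts, list_len=20, iters=5000, seed=3):
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    rng.shuffle(tour)
    cost = tour_length(tour, pts)
    temps = []                                   # seed the temperature list
    for _ in range(list_len):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j][::-1] + tour[j:]
        temps.append(abs(tour_length(cand, pts) - cost) + 1e-9)
    temps.sort(reverse=True)
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))   # random 2-opt move
        cand = tour[:i] + tour[i:j][::-1] + tour[j:]
        c = tour_length(cand, pts)
        if c <= cost or rng.random() < math.exp((cost - c) / temps[0]):
            if c > cost:                         # accepted uphill move:
                temps[0] = c - cost              # adapt the list
                temps.sort(reverse=True)
            tour, cost = cand, c
    return tour, cost

rng = random.Random(0)
pts = [(rng.random(), rng.random()) for _ in range(25)]
tour, cost = lbsa_tsp(pts)
print(cost)   # much shorter than a random tour
```

    Because accepted uphill moves shrink over time, the maximum of the list falls automatically, which is how the schedule anneals without a hand-tuned cooling rate.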

  13. Developing Asset Life Cycle Management capabilities through the implementation of Asset Life Cycle Plans – an Action Research project

    NARCIS (Netherlands)

    Ruitenburg, Richard; Braaksma, Anne Johannes Jan

    2017-01-01

    Asset Life Cycle Management is a strategic approach to managing physical assets over their complete life cycle. However, the literature and the recent ISO 55,000 standard do not offer guidance as to how to develop such an approach. This paper investigates the main capabilities for Asset Life Cycle

  14. Research on personalized recommendation algorithm based on spark

    Science.gov (United States)

    Li, Zeng; Liu, Yu

    2018-04-01

    With the increasing amount of data in recent years, traditional recommendation algorithms have become unable to meet people's needs. How to better recommend products to interested users has therefore become both an opportunity and a challenge of the big-data era. At present, each platform enterprise has its own recommendation algorithm, but how to push information efficiently and accurately is still an urgent problem for personalized recommendation systems. In this paper, a hybrid algorithm combining user-based collaborative filtering and content-based recommendation is proposed on Spark, improving the efficiency and accuracy of recommendation by weighted processing. The experiments show that recommendation under this scheme is more efficient and accurate.
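    The weighted hybrid idea can be illustrated without Spark: compute a user-based collaborative-filtering score and a content-based score, then blend them with a fixed weight. The tiny rating matrix, item feature vectors, and weight below are made-up examples, not data or parameters from the paper.

```python
import math

# Illustrative weighted hybrid recommender (all data and the weight are
# made-up): blend a user-based CF prediction with a content-based score.

ratings = {                      # user -> {item: rating}
    "u1": {"a": 5, "b": 3, "c": 4},
    "u2": {"a": 4, "b": 2, "c": 5},
    "u3": {"a": 1, "b": 5},
}
features = {"a": [1, 0], "b": [0, 1], "c": [1, 1]}   # item content vectors

def cosine(u, v):
    dot = sum(x * y for x, y in zip(u, v))
    nu = math.sqrt(sum(x * x for x in u))
    nv = math.sqrt(sum(y * y for y in v))
    return dot / (nu * nv) if nu and nv else 0.0

def cf_score(user, item):
    """User-based CF: similarity-weighted average of neighbors' ratings."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        shared = sorted(set(ratings[user]) & set(r))
        if not shared:
            continue
        sim = cosine([ratings[user][i] for i in shared], [r[i] for i in shared])
        num += sim * r[item]
        den += abs(sim)
    return num / den if den else 0.0

def content_score(user, item):
    """Content-based: rating-weighted similarity to the user's rated items."""
    num = sum(r * cosine(features[i], features[item])
              for i, r in ratings[user].items() if i != item)
    den = sum(ratings[user].values())
    return num / den if den else 0.0

def hybrid(user, item, w=0.7):
    return w * cf_score(user, item) + (1 - w) * content_score(user, item)

print(hybrid("u3", "c"))   # blended prediction for an unrated item
```

    In a Spark setting each of the two scoring passes maps naturally onto distributed joins over the rating and feature tables, with the weighting applied at the end.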

  15. Fast image matching algorithm based on projection characteristics

    Science.gov (United States)

    Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun

    2011-06-01

    Based on an analysis of the traditional template-matching algorithm, this paper identifies the key factors restricting matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, this algorithm converts the two-dimensional information of the image into one-dimensional information, and then matches and identifies through one-dimensional correlation; moreover, because normalization has been applied, it still matches correctly when the image brightness or signal amplitude increases in proportion. Experimental results show that the projection-based image registration method proposed in this article greatly improves matching speed while maintaining matching accuracy.
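    The projection idea reads concretely as follows: column-sum the grayscale image to a one-dimensional profile, then slide the template's profile along it with normalized correlation, which (after mean removal and normalization) is insensitive to proportional brightness changes. A synthetic-data sketch, not the paper's implementation:

```python
import numpy as np

# Projection-based matching sketch: reduce the 2-D search to 1-D by summing
# gray levels down each column, then score candidate offsets with normalized
# 1-D correlation. The scene and template are synthetic.

def col_projection(img):
    return img.sum(axis=0).astype(float)

def normalized_corr(a, b):
    a = a - a.mean()
    b = b - b.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

rng = np.random.default_rng(1)
scene = rng.integers(0, 256, size=(64, 256))
template = scene[:, 100:140]              # 40-pixel-wide template cut at x = 100

p_scene, p_t = col_projection(scene), col_projection(template)
w = len(p_t)
scores = [normalized_corr(p_scene[x:x + w], p_t)
          for x in range(len(p_scene) - w + 1)]
best_x = int(np.argmax(scores))
print(best_x)   # 100: the template's true horizontal position
```

    Mean removal and normalization make the score invariant to a proportional gain in brightness, matching the abstract's claim; a second projection along rows would localize the vertical coordinate the same way.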

  16. GPU-based parallel algorithm for blind image restoration using midfrequency-based methods

    Science.gov (United States)

    Xie, Lang; Luo, Yi-han; Bao, Qi-liang

    2013-08-01

    GPU-based general-purpose computing is a new branch of modern parallel computing, so the study of parallel algorithms specially designed for the GPU hardware architecture is of great significance. To address the high computational complexity and poor real-time performance of blind image restoration, the midfrequency-based algorithm for blind image restoration was analyzed and improved in this paper. A midfrequency-based filtering method is used to restore the image with hardly any recursion or iteration. Combining the algorithm's data intensiveness and data-parallel structure with the GPU's single-instruction, multiple-thread execution model, a new parallel midfrequency-based algorithm for blind image restoration is proposed that is well suited to GPU stream computing. In this algorithm, the GPU accelerates both the estimation of class-G point spread functions and the midfrequency-based filtering. For better management of GPU threads, the threads in a grid are scheduled according to the decomposition of the filtering data in the frequency domain, after optimizing data access and host-device communication. The kernel parallelism structure is determined by this decomposition so that the transmission rate works around the memory bandwidth limitation. The results show that the new algorithm significantly increases operational speed and effectively improves the real-time performance of image restoration, especially for high-resolution images.

  17. Assets and Educational Achievement: Theory and Evidence

    Science.gov (United States)

    Elliott, William; Sherraden, Michael

    2013-01-01

    This special issue of Economics of Education Review explores the role of savings and asset holding in post-secondary educational achievement. Most college success research has focused on income rather than assets as a predictor, and most college financing policy has focused on tuition support and educational debt, rather than asset accumulation.…

  18. DEPRECIATION AS THE SOURCE OF REPLENISHMENT OF ENTERPRISE CURRENT ASSETS

    Directory of Open Access Journals (Sweden)

    KAFKA Sofiіa

    2017-06-01

    Full Text Available Along with the classical approach to understanding depreciation as an economic category, we consider whether amortization is a source of fixed asset renewal. For the purposes of the study we used methods of analysis and synthesis, a systematic approach to studying the working-capital processes of enterprises, and logical and simulation methods to systematize the information supporting these processes. The investigation of the economic substance of depreciation established that depreciation is a recovery of working capital that was at one time withdrawn from circulation to purchase fixed assets, so depreciation cannot be treated as a source of investment in fixed assets. Profit is certainly a source of an enterprise's own working capital, and depreciation only in the case of profitable economic activity. When it comes to funding sources, including for the acquisition (creation) of fixed assets, only working capital can be the source, which is why it is important to use it efficiently under conditions of limited resources. The practical significance lies in developing scientifically based recommendations on the formation and sources of enterprises' working capital and the reproduction of fixed assets, providing methodological support for forming the depreciation policy of enterprises.

  19. EFFICIENT MULTI-OBJECTIVE EVOLUTIONARY ALGORITHM FOR JOB SHOP SCHEDULING

    Institute of Scientific and Technical Information of China (English)

    Lei Deming; Wu Zhiming

    2005-01-01

    A new representation method is first presented based on priority rules. According to this method, each entry in the chromosome indicates that, in the procedure of the Giffler and Thompson (GT) algorithm, the conflict occurring in the corresponding machine is resolved by the corresponding priority rule. Then a crowding-measure multi-objective evolutionary algorithm (CMOEA) is designed, in which both archive maintenance and fitness assignment use the crowding measure. Finally, comparisons between CMOEA and SPEA on 15 scheduling problems demonstrate that CMOEA is well suited to job shop scheduling.
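
    The crowding measure used for archive maintenance and fitness assignment can be illustrated with the standard NSGA-II-style crowding distance; the paper's exact crowding measure may differ in detail, and the sample points are invented.

```python
def crowding_distance(objs):
    """Crowding distance per individual for a list of objective vectors.

    Boundary individuals get infinite distance so they are always retained;
    interior individuals get the normalized size of the gap around them.
    """
    n = len(objs)
    m = len(objs[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: objs[i][k])
        lo, hi = objs[order[0]][k], objs[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float('inf')   # keep extremes
        if hi == lo:
            continue
        for j in range(1, n - 1):
            dist[order[j]] += (objs[order[j + 1]][k] - objs[order[j - 1]][k]) / (hi - lo)
    return dist

pts = [(1, 5), (2, 3), (4, 2), (6, 1)]   # hypothetical bi-objective values
d = crowding_distance(pts)
```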

  20. Quantum Behaved Particle Swarm Optimization Algorithm Based on Artificial Fish Swarm

    OpenAIRE

    Yumin, Dong; Li, Zhao

    2014-01-01

    The quantum behaved particle swarm algorithm is a new intelligent optimization algorithm with few parameters that is easily implemented. In view of the premature-convergence problem of the existing quantum behaved particle swarm optimization algorithm, we put forward a quantum particle swarm optimization algorithm based on artificial fish swarm. The new algorithm builds on the quantum behaved particle swarm algorithm, introducing the swarming and following activities, meanwhile using the a...

  1. Operational management of offshore energy assets

    Science.gov (United States)

    Kolios, A. J.; Martinez Luengo, M.

    2016-02-01

    Energy assets and especially those deployed offshore are subject to a variety of harsh operational and environmental conditions which lead to deterioration of their performance and structural capacity over time. The aim of reduction of CAPEX in new installations shifts focus to operational management to monitor and assess performance of critical assets ensuring their fitness for service throughout their service life and also to provide appropriate and effective information towards requalification or other end of life scenarios, optimizing the OPEX. Over the last decades, the offshore oil & gas industry has developed and applied various approaches in operational management of assets through Structural Health and Condition Monitoring (SHM/CM) systems which can be, at a certain level, transferable to offshore renewable installations. This paper aims to highlight the key differences between offshore oil & gas and renewable energy assets from a structural integrity and reliability perspective, provide a comprehensive overview of different approaches that are available and applicable, and distinguish the benefits of such systems in the efficient operation of offshore energy assets.

  2. Algorithm of Particle Data Association for SLAM Based on Improved Ant Algorithm

    Directory of Open Access Journals (Sweden)

    KeKe Gen

    2015-01-01

    Full Text Available The article considers the data association problem for simultaneous localization and mapping in determining the route of unmanned aerial vehicles (UAVs). Currently, such vehicles are already widely used but are mainly controlled by a remote operator, so an urgent task is to develop a control system that allows autonomous flight. The SLAM (simultaneous localization and mapping) algorithm, which predicts the location, speed, flight parameters, and coordinates of landmarks and obstacles in an unknown environment, is one of the key technologies for achieving truly autonomous UAV flight. The aim of this work is to study the possibility of solving this problem with an improved ant algorithm. Data association for SLAM establishes a matching between the set of observed landmarks and the landmarks in the state vector. The ant algorithm is a widely used optimization algorithm with positive feedback and the ability to search in parallel, which makes it suitable for the data association problem in SLAM. However, the traditional ant algorithm easily falls into local optima while searching for routes. Adding random perturbations when updating the global pheromone helps avoid local optima, and setting limits on the pheromone along a route increases the search space at a reasonable computational cost for finding the optimal route. The paper proposes a local data association algorithm for SLAM based on an improved ant algorithm. To increase computation speed, local rather than global data association is used. The first stage of the algorithm identifies targets in the matching space and the observed landmarks that may be associated according to the individual compatibility (IC) criterion. The second stage determines the matched landmarks and their coordinates using the improved ant algorithm. Simulation results confirm the efficiency and effectiveness of the proposed approach.

  3. Sinopec Goes After Oil Assets Worldwide

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    US$2.45b deal to gain reserves of 393m barrels of crude equivalent. China's enterprises are eyeing global expansion via mergers and acquisitions in 2010 as the country's economic power increases. China Petrochemical Corporation (Sinopec), Asia's largest oil refiner, plans to purchase the entire oil and gas assets of the Argentinean arm of US-based Occidental Petroleum Corp.

  4. Collaborative design of Open Educational Practices: An Assets based approach

    Directory of Open Access Journals (Sweden)

    Kate Helen Miller

    2018-04-01

    Full Text Available This paper outlines a collaborative approach to the design of open educational resources (OER with community stakeholders so they can be shared with other community practitioners openly, online and repurposed for other contexts. We view curriculum not as something that educationalists provide but rather something that emerges as learners engage with an educational context. We draw on a Project consisting of a partnership between five European Institutions of Higher Education and a range of community stakeholder groups. The partnership will develop a suite of OER for community workers who are implementing assets based approaches in different contexts. We argue that these approaches are negotiated in that one cannot decide how they might operate in a given context without engaging in deliberative discussion. The challenge for us as open education practitioners is how to turn those deliberations into OER and to highlight the important pedagogical aspect of the design process.

  5. Intangible assets linked to consumers: Acknowledgement and evaluation in the business combination detached from Goodwill

    Directory of Open Access Journals (Sweden)

    Spasić Dejan

    2012-01-01

    Full Text Available The development of International Financial Reporting Standards poses new challenges for the accounting profession. One of those challenges refers to the accounting treatment of business combinations and the related possibility of recognizing intangible assets detached from goodwill. Since these are usually internally generated intangible assets to which the recognition principle of IAS 38 cannot be applied, the author focuses on the conditions under which they can be recognized in the financial reports of the acquirer in a business combination according to IFRS 3. Significant attention is also dedicated to methods for determining the fair value of recognizable intangible assets. As the application of market-based methods is limited, the focus in this paper is on income-based methods and their applicability to the valuation of consumer-related intangible assets. Limitations of intangible asset valuation methods in both stable and crisis conditions are presented at the end of the paper.

  6. Parameter Selection for Ant Colony Algorithm Based on Bacterial Foraging Algorithm

    Directory of Open Access Journals (Sweden)

    Peng Li

    2016-01-01

    Full Text Available The optimal performance of the ant colony algorithm (ACA mainly depends on suitable parameters; therefore, parameter selection for ACA is important. We propose a parameter selection method for ACA based on the bacterial foraging algorithm (BFA, considering the effects of coupling between different parameters. Firstly, parameters for ACA are mapped into a multidimensional space, using a chemotactic operator to ensure that each parameter group approaches the optimal value, speeding up the convergence for each parameter set. Secondly, the operation speed for optimizing the entire parameter set is accelerated using a reproduction operator. Finally, the elimination-dispersal operator is used to strengthen the global optimization of the parameters, which avoids falling into a local optimal solution. In order to validate the effectiveness of this method, the results were compared with those using a genetic algorithm (GA and a particle swarm optimization (PSO, and simulations were conducted using different grid maps for robot path planning. The results indicated that parameter selection for ACA based on BFA was the superior method, able to determine the best parameter combination rapidly, accurately, and effectively.

  7. An algorithm and a Tool for Wavelength Allocation in OMS-SP Ring Architecture

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2006-01-01

    OMS-SP ring is one of the well-known architectures in Wavelength Division Multiplexing based optical fiber networks. The architecture supports a restorable full mesh in an optical fiber ring using multiple light wavelengths. The paper presents an algorithm to allocate wavelengths in the OMS-SP ring architecture. A tool is also introduced which implements the algorithm and assigns wavelengths. The proposed algorithm uses a smaller number of wavelengths than the classical allocation method. The algorithm is described and results are presented.

  8. A Flocking Based algorithm for Document Clustering Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Gao, Jinzhu [ORNL; Potok, Thomas E [ORNL

    2006-01-01

    Social animals or insects in nature often exhibit a form of emergent collective behavior known as flocking. In this paper, we present a novel flocking-based approach for document clustering analysis. Our flocking clustering algorithm uses stochastic and heuristic principles discovered from observing bird flocks and fish schools. Unlike partition clustering algorithms such as K-means, the flocking-based algorithm does not require initial partition seeds. The algorithm generates a clustering of a given set of data by embedding the high-dimensional data items on a two-dimensional grid for easy retrieval and visualization of clustering results. Inspired by the self-organized behavior of bird flocks, we represent each document object with a flock boid. The simple local rules followed by each flock boid lead the entire document flock to generate complex global behaviors, which eventually produce a clustering of the documents. We evaluate the efficiency of our algorithm with both a synthetic dataset and a real document collection of 100 news articles collected from the Internet. Our results show that the flocking clustering algorithm achieves better performance than K-means and the Ant clustering algorithm for real document clustering.
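
    A minimal sketch of the local boid rules, with document similarity steering separation, alignment, and cohesion; the neighborhood radius, rule weights, and similarity values are invented for illustration and are not the paper's parameters.

```python
import numpy as np

def boid_step(pos, vel, sim, r=3.0, w_sep=0.3, w_ali=0.2, w_coh=0.2):
    """One synchronous update of document boids on a 2-D plane.

    sim[i, j] in [0, 1] is document similarity: similar neighbors attract
    (cohesion, alignment) while dissimilar ones repel (separation).
    """
    new_vel = vel.copy()
    for i in range(len(pos)):
        d = np.linalg.norm(pos - pos[i], axis=1)
        nbr = (d < r) & (d > 0)                              # local neighborhood only
        if not nbr.any():
            continue
        s = sim[i][nbr][:, None]
        sep = ((pos[i] - pos[nbr]) * (1 - s)).sum(axis=0)    # push unlike boids apart
        ali = (vel[nbr] * s).mean(axis=0)                    # match similar boids' velocity
        coh = (pos[nbr] * s).sum(axis=0) / s.sum() - pos[i]  # move toward similar boids
        new_vel[i] = vel[i] + w_sep * sep + w_ali * ali + w_coh * coh
    return pos + new_vel, new_vel

pos = np.array([[0.0, 0.0], [1.0, 0.0], [10.0, 10.0]])
vel = np.zeros((3, 2))
sim = np.array([[1.0, 0.9, 0.1],
                [0.9, 1.0, 0.1],
                [0.1, 0.1, 1.0]])
new_pos, new_vel = boid_step(pos, vel, sim)
```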

  9. AdaBoost-based algorithm for network intrusion detection.

    Science.gov (United States)

    Hu, Weiming; Hu, Wei; Maybank, Steve

    2008-04-01

    Network intrusion detection aims at distinguishing the attacks on the Internet from normal use of the Internet. It is an indispensable part of the information security system. Due to the variety of network behaviors and the rapid development of attack fashions, it is necessary to develop fast machine-learning-based intrusion detection algorithms with high detection rates and low false-alarm rates. In this correspondence, we propose an intrusion detection algorithm based on the AdaBoost algorithm. In the algorithm, decision stumps are used as weak classifiers. The decision rules are provided for both categorical and continuous features. By combining the weak classifiers for continuous features and the weak classifiers for categorical features into a strong classifier, the relations between these two different types of features are handled naturally, without any forced conversions between continuous and categorical features. Adaptable initial weights and a simple strategy for avoiding overfitting are adopted to improve the performance of the algorithm. Experimental results show that our algorithm has low computational complexity and error rates, as compared with algorithms of higher computational complexity, as tested on the benchmark sample data.
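
    A minimal AdaBoost with threshold stumps over one continuous feature illustrates the weighting scheme described above; the toy data and round count are invented, and the paper's joint handling of categorical features and its overfitting strategy are omitted.

```python
import numpy as np

def stump_predict(x, thresh, sign):
    """Decision stump on one continuous feature: predicts sign above thresh."""
    return np.where(x > thresh, sign, -sign)

def adaboost_train(x, y, rounds=10):
    """Minimal AdaBoost with threshold stumps as weak classifiers.

    x: 1-D continuous feature, y in {-1, +1} (e.g. normal vs. attack).
    """
    n = len(x)
    w = np.full(n, 1.0 / n)                      # sample weights
    model = []
    for _ in range(rounds):
        best = None                              # exhaustive stump search
        for thresh in np.unique(x):
            for sign in (1, -1):
                err = w[stump_predict(x, thresh, sign) != y].sum()
                if best is None or err < best[0]:
                    best = (err, thresh, sign)
        err, thresh, sign = best
        err = min(max(err, 1e-10), 1 - 1e-10)    # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)    # weak-classifier weight
        pred = stump_predict(x, thresh, sign)
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified samples
        w /= w.sum()
        model.append((alpha, thresh, sign))
    return model

def adaboost_predict(model, x):
    score = sum(a * stump_predict(x, t, s) for a, t, s in model)
    return np.sign(score)

x = np.array([0.1, 0.3, 0.4, 0.8, 0.9, 1.2])     # e.g. one traffic feature
y = np.array([-1, -1, -1, 1, 1, 1])              # -1 normal, +1 attack
model = adaboost_train(x, y, rounds=5)
pred = adaboost_predict(model, x)
```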

  10. A Novel Spectrum Scheduling Scheme with Ant Colony Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Liping Liu

    2018-01-01

    Full Text Available Cognitive radio is a promising technology for improving spectrum utilization, allowing cognitive users to access the licensed spectrum while primary users are absent. In this paper, we design a resource allocation framework based on graph theory for spectrum assignment in cognitive radio networks. The framework takes into account the constraints of interference to primary users and possible collisions among cognitive users. Based on the proposed model, we formulate a system utility function to maximize the system benefit. For this objective problem, we design an improved ant colony optimization algorithm (IACO) with two enhancements: first, we introduce a differential evolution (DE) process with a monitoring mechanism to accelerate convergence; then we design a variable neighborhood search (VNS) process to prevent the algorithm from falling into local optima. Simulation results demonstrate that the improved algorithm achieves better performance.

  11. Q-learning-based adjustable fixed-phase quantum Grover search algorithm

    International Nuclear Information System (INIS)

    Guo Ying; Shi Wensha; Wang Yijun; Hu, Jiankun

    2017-01-01

    We demonstrate that the rotation phase can be suitably chosen to increase the efficiency of the phase-based quantum search algorithm, leading to a dynamic balance between iterations and success probabilities of the fixed-phase quantum Grover search algorithm with Q-learning for a given number of solutions. In this search algorithm, the proposed Q-learning algorithm, a model-free reinforcement learning strategy in essence, performs a matching between the fraction of marked items λ and the rotation phase α. After establishing the policy function α = π(λ), we complete the fixed-phase Grover algorithm, where the phase parameter is selected via the learned policy. Simulation results show that the Q-learning-based Grover search algorithm (QLGA) requires fewer iterations and yields higher success probabilities. Compared with conventional Grover algorithms, it avoids locally optimal situations, enabling success probabilities to approach one. (author)
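
    For intuition on the iteration/success-probability tradeoff the abstract refers to, here is the textbook relation for the standard (phase-π) Grover iteration; the adjustable-phase and Q-learning machinery of the paper is not reproduced, and the example λ is arbitrary.

```python
import math

def grover_success(lam, k):
    """Success probability after k standard Grover iterations.

    lam is the fraction of marked items: p = sin^2((2k + 1) * theta),
    where sin(theta) = sqrt(lam).
    """
    theta = math.asin(math.sqrt(lam))
    return math.sin((2 * k + 1) * theta) ** 2

lam = 1 / 256                                          # e.g. 1 marked item of 256
k_opt = round(math.pi / (4 * math.asin(math.sqrt(lam))) - 0.5)
p = grover_success(lam, k_opt)                         # near-certain success
```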

  12. Sequential backbone assignment based on dipolar amide-to-amide correlation experiments

    Energy Technology Data Exchange (ETDEWEB)

    Xiang, ShengQi; Grohe, Kristof; Rovó, Petra; Vasa, Suresh Kumar; Giller, Karin; Becker, Stefan; Linser, Rasmus, E-mail: rali@nmr.mpibpc.mpg.de [Max Planck Institute for Biophysical Chemistry, Department for NMR-Based Structural Biology (Germany)

    2015-07-15

    Proton detection in solid-state NMR has seen a tremendous increase in popularity in recent years. New experimental techniques allow protons to be exploited as an additional source of information on structure, dynamics, and protein interactions with their surroundings. In addition, sensitivity is mostly improved and ambiguity in assignment experiments is reduced. We show here that, in the solid state, sequential amide-to-amide correlations turn out to be an excellent, complementary way to exploit amide shifts for unambiguous backbone assignment. For a general assessment, we compare amide-to-amide experiments with the more common ¹³C-shift-based methods. Exploiting efficient CP magnetization transfers rather than less efficient INEPT periods, our results suggest that the approach is very feasible for solid-state NMR.

  13. Sequential backbone assignment based on dipolar amide-to-amide correlation experiments

    International Nuclear Information System (INIS)

    Xiang, ShengQi; Grohe, Kristof; Rovó, Petra; Vasa, Suresh Kumar; Giller, Karin; Becker, Stefan; Linser, Rasmus

    2015-01-01

    Proton detection in solid-state NMR has seen a tremendous increase in popularity in recent years. New experimental techniques allow protons to be exploited as an additional source of information on structure, dynamics, and protein interactions with their surroundings. In addition, sensitivity is mostly improved and ambiguity in assignment experiments is reduced. We show here that, in the solid state, sequential amide-to-amide correlations turn out to be an excellent, complementary way to exploit amide shifts for unambiguous backbone assignment. For a general assessment, we compare amide-to-amide experiments with the more common ¹³C-shift-based methods. Exploiting efficient CP magnetization transfers rather than less efficient INEPT periods, our results suggest that the approach is very feasible for solid-state NMR

  14. Overlap Algorithms in Flexible Job-shop Scheduling

    Directory of Open Access Journals (Sweden)

    Celia Gutierrez

    2014-06-01

    Full Text Available The flexible Job-shop Scheduling Problem (fJSP) considers the execution of jobs by a set of candidate resources while satisfying time and technological constraints. This work, which follows the hierarchical architecture, is based on an algorithm where each objective (resource allocation, start-time assignment) is solved by a genetic algorithm (GA) that optimizes a particular fitness function, and enhances the results by executing a set of heuristics that evaluate and repair each scheduling constraint on each operation. The aim of this work is to analyze the impact of some algorithmic features of the overlap constraint heuristics, in order to achieve the objectives to the highest degree. To demonstrate the efficiency of this approach, experiments have been performed and compared with similar cases, with the GA parameters tuned accordingly.

  15. APPROACHES CONCERNING ACCOUNTING OF INTANGIBLE ASSETS

    Directory of Open Access Journals (Sweden)

    Gheorghe MOROSAN

    2016-02-01

    Full Text Available Given the importance of intangible assets to the company, the paper aims to establish criteria for recognizing and measuring these assets through which the company can reflect both their true value and their carrying amount. The main objective is to formulate a logical definition of intangible assets in accounting terms that allows their recognition in financial reporting, helping build an accurate image of the company. It will demonstrate how important intangible assets are for a successful company and how they can help develop the economy, especially the Romanian economy. The secondary objectives are: setting limits on the valuation of intellectual capital from the points of view of internal and external control, and creating a new balance sheet item to include this related value.

  16. Low-cost asset tracking using location-aware camera phones

    Science.gov (United States)

    Chen, David; Tsai, Sam; Kim, Kyu-Han; Hsu, Cheng-Hsin; Singh, Jatinder Pal; Girod, Bernd

    2010-08-01

    Maintaining an accurate and up-to-date inventory of one's assets is a labor-intensive, tedious, and costly operation. To ease this difficult but important task, we design and implement a mobile asset tracking system for automatically generating an inventory by snapping photos of the assets with a smartphone. Since smartphones are becoming ubiquitous, construction and deployment of our inventory management solution is simple and cost-effective. Automatic asset recognition is achieved by first segmenting individual assets out of the query photo and then performing bag-of-visual-features (BoVF) image matching on the segmented regions. The smartphone's sensor readings, such as digital compass and accelerometer measurements, can be used to determine the location of each asset, and this location information is stored in the inventory for each recognized asset. As a special case study, we demonstrate a mobile book tracking system, where users snap photos of books stacked on bookshelves to generate a location-aware book inventory. It is shown that segmenting the book spines is very important for accurate feature-based image matching into a database of book spines. Segmentation also provides the exact orientation of each book spine, so more discriminative upright local features can be employed for improved recognition. This system's mobile client has been implemented for smartphones running the Symbian or Android operating systems. The client enables a user to snap a picture of a bookshelf and to subsequently view the recognized spines in the smartphone's viewfinder. Two different pose estimates, one from BoVF geometric matching and the other from segmentation boundaries, are both utilized to accurately draw the boundary of each spine in the viewfinder for easy visualization. The BoVF representation also allows matching each photo of a bookshelf rack against a photo of the entire bookshelf, and the resulting feature matches are used in conjunction with the smartphone

  17. A modeling of dynamic storage assignment for order picking in beverage warehousing with Drive-in Rack system

    Science.gov (United States)

    Hadi, M. Z.; Djatna, T.; Sugiarto

    2018-04-01

    This paper develops a dynamic storage assignment model to solve the storage assignment problem (SAP) for beverage order picking in a drive-in rack warehousing system, determining the appropriate storage location and space for each beverage product dynamically so that the performance of the system can be improved. The study constructs a graph model to represent drive-in rack storage positions, then combines association rule mining, class-based storage policies, and an arrangement rule algorithm to determine appropriate storage locations and product arrangements according to dynamic customer orders. The performance of the proposed model is measured as rule adjacency accuracy, travel distance (for the picking process), and the probability that a product expires, using a Last Come First Serve (LCFS) queue approach. Finally, the proposed model is implemented through computer simulation, and its performance is compared against different storage assignment methods. The results indicate that the proposed model outperforms the other storage assignment methods.
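
    The class-based storage policy mentioned above can be sketched as an ABC split by pick frequency; the product names, frequencies, and the 80%/95% cumulative thresholds are invented for illustration and are not taken from the paper.

```python
def abc_classes(freq, a=0.8, b=0.95):
    """Class-based storage: rank products by pick frequency and split them
    into A/B/C classes by cumulative share of total picks.

    Fast movers (class A) would be assigned rack positions nearest the aisle.
    """
    total = sum(freq.values())
    classes, cum = {}, 0.0
    for product, f in sorted(freq.items(), key=lambda kv: -kv[1]):
        cum += f / total
        classes[product] = 'A' if cum <= a else ('B' if cum <= b else 'C')
    return classes

# Hypothetical monthly pick counts per beverage product
freq = {'cola': 50, 'water': 30, 'juice': 10, 'tea': 6, 'soda': 4}
cls = abc_classes(freq)
```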

  18. Investment in Transportation Assets : Briefing Paper

    Science.gov (United States)

    2017-11-21

    Highways, streets, railroad lines, transit systems, ports, and other transportation fixed assets enable the movement of people and goods. Investment in transportation fixed assets helps build and maintain these critical resources. The pattern of tran...

  19. On-demand high-capacity ride-sharing via dynamic trip-vehicle assignment.

    Science.gov (United States)

    Alonso-Mora, Javier; Samaranayake, Samitha; Wallar, Alex; Frazzoli, Emilio; Rus, Daniela

    2017-01-17

    Ride-sharing services are transforming urban mobility by providing timely and convenient transportation to anybody, anywhere, and anytime. These services present enormous potential for positive societal impacts with respect to pollution, energy consumption, congestion, etc. Current mathematical models, however, do not fully address the potential of ride-sharing. Recently, a large-scale study highlighted some of the benefits of car pooling but was limited to static routes with two riders per vehicle (optimally) or three (with heuristics). We present a more general mathematical model for real-time high-capacity ride-sharing that (i) scales to large numbers of passengers and trips and (ii) dynamically generates optimal routes with respect to online demand and vehicle locations. The algorithm starts from a greedy assignment and improves it through a constrained optimization, quickly returning solutions of good quality and converging to the optimal assignment over time. We quantify experimentally the tradeoff between fleet size, capacity, waiting time, travel delay, and operational costs for low- to medium-capacity vehicles, such as taxis and van shuttles. The algorithm is validated with ∼3 million rides extracted from the New York City taxicab public dataset. Our experimental study considers ride-sharing with rider capacity of up to 10 simultaneous passengers per vehicle. The algorithm applies to fleets of autonomous vehicles and also incorporates rebalancing of idling vehicles to areas of high demand. This framework is general and can be used for many real-time multivehicle, multitask assignment problems.
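
    The greedy starting assignment described above can be sketched as follows; the cost matrix is hypothetical, and the paper's constrained-optimization improvement and rebalancing steps are omitted.

```python
def greedy_assign(cost):
    """Greedy trip-to-vehicle assignment: repeatedly take the cheapest
    remaining feasible (vehicle, trip) pair.

    This is only the starting point that the full method then improves
    through constrained optimization.
    """
    pairs = sorted(
        ((c, v, t) for v, row in enumerate(cost) for t, c in enumerate(row)),
        key=lambda p: p[0],
    )
    used_v, used_t, assignment = set(), set(), {}
    for c, v, t in pairs:
        if v not in used_v and t not in used_t:
            assignment[v] = t
            used_v.add(v)
            used_t.add(t)
    return assignment

# cost[v][t]: e.g. added travel delay if vehicle v serves trip t (invented)
cost = [[2.0, 9.0, 5.0],
        [8.0, 1.0, 6.0],
        [7.0, 4.0, 3.0]]
assignment = greedy_assign(cost)
```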

  20. Risk and return in oilfield asset holdings

    International Nuclear Information System (INIS)

    Kretzschmar, Gavin L.; Kirchner, Axel; Reusch, Hans

    2008-01-01

    Convention suggests that emerging market investment should provide commensurately lower risk or higher returns than comparable assets in developed countries. This study demonstrates that emerging markets contain regulatory specificities that challenge asset valuation model convergence and potentially invert risk return convention. 292 oilfield assets are used to provide evidence that, under upward oil prices, emerging markets are characterized by progressive state participation in oilfield cash flows. Specifically, this work advances the low oil price paradigm of prior oil and gas asset valuation studies and provides evidence that emerging market state participation terms limit the corporate value of globalization for the sector. (author)