WorldWideScience

Sample records for sophisticated processing techniques

  1. Sophisticated Players and Sophisticated Agents

    NARCIS (Netherlands)

    Rustichini, A.

    1998-01-01

    A sophisticated player is an individual who takes the actions of the opponents in a strategic situation as determined by the decisions of rational opponents, and acts accordingly. A sophisticated agent is rational in the choice of his action, but ignores the fact that he is part of a strategic

  2. STOCK EXCHANGE LISTING INDUCES SOPHISTICATION OF CAPITAL BUDGETING

    Directory of Open Access Journals (Sweden)

    Wesley Mendes-da-Silva

    2014-08-01

    This article compares capital budgeting techniques employed in listed and unlisted companies in Brazil. We surveyed the Chief Financial Officers (CFOs) of 398 listed companies and 300 large unlisted companies, and based on 91 respondents, the results suggest that the CFOs of listed companies tend to use less simplistic methods, such as NPV and the CAPM, more often, and that CFOs of unlisted companies are less likely to estimate the cost of equity, despite being large companies. These findings indicate that stock exchange listing may require greater sophistication of the capital budgeting process.
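
    The two "less simplistic" methods the survey asks about can be sketched in a few lines: CAPM for the cost of equity, then NPV at that rate. All figures below are hypothetical, for illustration only.

```python
def capm_cost_of_equity(risk_free, beta, market_return):
    """CAPM: r_e = r_f + beta * (E[r_m] - r_f)."""
    return risk_free + beta * (market_return - risk_free)

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the time-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Hypothetical project: cost of equity via CAPM, then discount the cash flows.
r_e = capm_cost_of_equity(risk_free=0.06, beta=1.2, market_return=0.12)  # 0.132
project_npv = npv(r_e, [-1000.0, 400.0, 400.0, 400.0, 400.0])            # positive: accept
```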

  3. Comparison of process estimation techniques for on-line calibration monitoring

    International Nuclear Information System (INIS)

    Shumaker, B. D.; Hashemian, H. M.; Morton, G. W.

    2006-01-01

    The goal of on-line calibration monitoring is to reduce the number of unnecessary calibrations performed each refueling cycle on pressure, level, and flow transmitters in nuclear power plants. The effort requires a baseline for determining calibration drift and thereby the need for a calibration. There are two ways to establish the baseline: averaging and modeling. Averaging techniques have proven to be highly successful in applications where there are a large number of redundant transmitters; but for systems with little or no redundancy, averaging methods are not always reliable. That is, for non-redundant transmitters, more sophisticated process estimation techniques are needed to augment or replace the averaging techniques. This paper explores three well-known process estimation techniques, namely Independent Component Analysis (ICA), Auto-Associative Neural Networks (AANN), and Auto-Associative Kernel Regression (AAKR). Using experience and data from an operating nuclear plant, the paper presents an evaluation of the effectiveness of these methods in detecting transmitter drift under actual plant conditions. (authors)
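
    Of the three techniques, AAKR is the simplest to sketch: the "fault-free" estimate of a query observation is a kernel-weighted average of historical fault-free observations, and the residual (observed minus estimated) serves as the drift indicator. A minimal sketch with made-up numbers, not the plant data used in the paper:

```python
import numpy as np

def aakr_predict(memory, query, bandwidth=1.0):
    """Auto-Associative Kernel Regression: estimate the fault-free value of
    `query` as a Gaussian-kernel-weighted average of `memory` observations."""
    d = np.linalg.norm(memory - query, axis=1)       # distance to each stored state
    w = np.exp(-d ** 2 / (2 * bandwidth ** 2))       # Gaussian kernel weights
    return (w / w.sum()) @ memory                    # weighted average of memory vectors

# Two-channel example: the second transmitter reads high relative to history.
memory = np.array([[1.0, 2.0], [1.1, 2.1], [0.9, 1.9]])
query = np.array([1.0, 2.5])
residual = query - aakr_predict(memory, query)       # large residual on channel 2
```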

  4. Pension fund sophistication and investment policy

    NARCIS (Netherlands)

    de Dreu, J.|info:eu-repo/dai/nl/364537906; Bikker, J.A.|info:eu-repo/dai/nl/06912261X

    This paper assesses the sophistication of pension funds’ investment policies using data on 748 Dutch pension funds during the 1999–2006 period. We develop three indicators of sophistication: gross rounding of investment choices, investments in alternative sophisticated asset classes and ‘home bias’.

  5. In Praise of the Sophists.

    Science.gov (United States)

    Gibson, Walker

    1993-01-01

    Discusses the thinking of the Greek Sophist philosophers, particularly Gorgias and Protagoras, and their importance and relevance for contemporary English instructors. Considers the problem of language as signs of reality in the context of Sophist philosophy. (HB)

  6. Background field removal technique using regularization enabled sophisticated harmonic artifact reduction for phase data with varying kernel sizes.

    Science.gov (United States)

    Kan, Hirohito; Kasai, Harumasa; Arai, Nobuyuki; Kunitomo, Hiroshi; Hirose, Yasujiro; Shibamoto, Yuta

    2016-09-01

    An effective background field removal technique is desired for more accurate quantitative susceptibility mapping (QSM) prior to dipole inversion. The aim of this study was to evaluate the accuracy of the regularization enabled sophisticated harmonic artifact reduction for phase data with varying spherical kernel sizes (REV-SHARP) method using a three-dimensional head phantom and human brain data. The proposed REV-SHARP method used the spherical mean value operation and Tikhonov regularization in the deconvolution process, with kernel sizes varying from 2 to 14 mm. The kernel sizes were gradually reduced, similar to the SHARP with varying spherical kernel (VSHARP) method. We determined the relative errors and relationships between the true local field and estimated local field in REV-SHARP, VSHARP, projection onto dipole fields (PDF), and regularization enabled SHARP (RESHARP). A human experiment was also conducted using REV-SHARP, VSHARP, PDF, and RESHARP. The relative errors in the numerical phantom study were 0.386, 0.448, 0.838, and 0.452 for REV-SHARP, VSHARP, PDF, and RESHARP, respectively. The REV-SHARP result exhibited the highest correlation between the true local field and estimated local field. The linear regression slopes were 1.005, 1.124, 0.988, and 0.536 for REV-SHARP, VSHARP, PDF, and RESHARP in regions of interest on the three-dimensional head phantom. In human experiments, no obvious errors due to artifacts were present in REV-SHARP. The proposed REV-SHARP is a new method that combines a variable spherical kernel size with Tikhonov regularization. This technique may enable more accurate background field removal and help achieve better QSM accuracy. Copyright © 2016 Elsevier Inc. All rights reserved.
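
    The Tikhonov-regularized deconvolution step at the core of such methods can be sketched in one dimension. This is a generic illustration of the regularized inverse filter, not the authors' implementation; the kernel, sizes, and regularization parameter are assumptions.

```python
import numpy as np

# Given y = k * x (a signal blurred by kernel k), recover x by minimizing
# ||k*x - y||^2 + lam*||x||^2, which in the Fourier domain gives
# X = conj(K).Y / (|K|^2 + lam).

def tikhonov_deconvolve(y, kernel, lam=1e-2):
    K = np.fft.fft(kernel, n=len(y))
    Y = np.fft.fft(y)
    X = np.conj(K) * Y / (np.abs(K) ** 2 + lam)
    return np.real(np.fft.ifft(X))

# Blur a spike with a 5-point moving average, then deconvolve it back.
x = np.zeros(64)
x[10] = 1.0
kernel = np.ones(5) / 5
y = np.real(np.fft.ifft(np.fft.fft(x) * np.fft.fft(kernel, 64)))
x_hat = tikhonov_deconvolve(y, kernel, lam=1e-3)   # peak restored at index 10
```

    Where the kernel's spectrum is strong the inverse is applied almost exactly; where it is weak, the regularization term suppresses noise amplification instead of dividing by near-zero values.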

  7. The value of multivariate model sophistication

    DEFF Research Database (Denmark)

    Rombouts, Jeroen; Stentoft, Lars; Violante, Francesco

    2014-01-01

    We assess the predictive accuracies of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 444 multivariate models that differ in their spec.... In addition to investigating the value of model sophistication in terms of dollar losses directly, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performances.

  8. Sophistication and Performance of Italian Agri‐food Exports

    Directory of Open Access Journals (Sweden)

    Anna Carbone

    2012-06-01

    Nonprice competition is increasingly important in world food markets. Recently, the expression 'export sophistication' has been introduced in the economic literature to refer to a wide set of attributes that increase product value. An index has been proposed to measure sophistication in an indirect way through the per capita GDP of exporting countries (Lall et al., 2006; Haussmann et al., 2007). The paper applies the sophistication measure to the Italian food export sector, moving from an analysis of trends and performance of Italian food exports. An original way to disentangle different components in the temporal variation of the sophistication index is also proposed. Results show that the sophistication index offers original insights on recent trends in world food exports and with respect to Italian core food exports.

  9. The First Sophists and the Uses of History.

    Science.gov (United States)

    Jarratt, Susan C.

    1987-01-01

    Reviews the history of intellectual views on the Greek sophists in three phases: (1) their disparagement by Plato and Aristotle as the morally disgraceful "other"; (2) nineteenth century British positivists' reappraisal of these relativists as ethically and scientifically superior; and (3) twentieth century versions of the sophists as…

  10. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  11. Does Investors' Sophistication Affect Persistence and Pricing of Discretionary Accruals?

    OpenAIRE

    Lanfeng Kao

    2007-01-01

    This paper examines whether the sophistication of market investors influences management's strategy on discretionary accounting choices, and thus changes the persistence of discretionary accruals. The results show that the persistence of discretionary accruals for firms faced with naive investors is lower than that for firms faced with sophisticated investors. The results also demonstrate that sophisticated investors indeed incorporate the implications of current earnings components into future ...

  12. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; musically sophisticated speakers do, however, show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and greater duration modulation in a complex task. These results are consistent with a ceiling effect for certain sound features, set by the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in real-world musical situations. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the interaction of specific language features with musical experience. PMID:28450829

  13. Musical Sophistication and the Effect of Complexity on Auditory Discrimination in Finnish Speakers.

    Science.gov (United States)

    Dawson, Caitlin; Aalto, Daniel; Šimko, Juraj; Vainio, Martti; Tervaniemi, Mari

    2017-01-01

    Musical experiences and native language are both known to affect auditory processing. The present work aims to disentangle the influences of native language phonology and musicality on behavioral and subcortical sound feature processing in a population of musically diverse Finnish speakers, as well as to investigate the specificity of enhancement from musical training. Finnish speakers are highly sensitive to duration cues, since in Finnish vowel and consonant duration determine word meaning. Using a correlational approach with a set of behavioral sound feature discrimination tasks, brainstem recordings, and a musical sophistication questionnaire, we find no evidence for an association between musical sophistication and more precise duration processing in Finnish speakers, either in the auditory brainstem response or in behavioral tasks; musically sophisticated speakers do, however, show enhanced pitch discrimination compared to Finnish speakers with less musical experience, and greater duration modulation in a complex task. These results are consistent with a ceiling effect for certain sound features, set by the phonology of the native language, leaving an opportunity for music experience-based enhancement of sound features not explicitly encoded in the language (such as pitch, which is not explicitly encoded in Finnish). Finally, the pattern of duration modulation in more musically sophisticated Finnish speakers suggests integrated feature processing for greater efficiency in real-world musical situations. These results have implications for research into the specificity of plasticity in the auditory system, as well as into the interaction of specific language features with musical experience.

  14. The conceptualization and measurement of cognitive health sophistication.

    Science.gov (United States)

    Bodie, Graham D; Collins, William B; Jensen, Jakob D; Davis, Lashara A; Guntzviller, Lisa M; King, Andy J

    2013-01-01

    This article develops a conceptualization and measure of cognitive health sophistication--the complexity of an individual's conceptual knowledge about health. Study 1 provides initial validity evidence for the measure--the Healthy-Unhealthy Other Instrument--by showing its association with other cognitive health constructs indicative of higher health sophistication. Study 2 presents data from a sample of low-income adults to provide evidence that the measure does not depend heavily on health-related vocabulary or ethnicity. Results from both studies suggest that the Healthy-Unhealthy Other Instrument can be used to capture variability in the sophistication or complexity of an individual's health-related schematic structures on the basis of responses to two simple open-ended questions. Methodological advantages of the Healthy-Unhealthy Other Instrument and suggestions for future research are highlighted in the discussion.

  15. Obfuscation, Learning, and the Evolution of Investor Sophistication

    OpenAIRE

    Bruce Ian Carlin; Gustavo Manso

    2011-01-01

    Investor sophistication has lagged behind the growing complexity of retail financial markets. To explore this, we develop a dynamic model to study the interaction between obfuscation and investor sophistication in mutual fund markets. Taking into account different learning mechanisms within the investor population, we characterize the optimal timing of obfuscation for financial institutions who offer retail products. We show that educational initiatives that are directed to facilitate learnin...

  16. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  17. Automatically Assessing Lexical Sophistication: Indices, Tools, Findings, and Application

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott A.

    2015-01-01

    This study explores the construct of lexical sophistication and its applications for measuring second language lexical and speaking proficiency. In doing so, the study introduces the Tool for the Automatic Analysis of LExical Sophistication (TAALES), which calculates text scores for 135 classic and newly developed lexical indices related to word…

  18. The role of sophisticated accounting system in strategy management

    OpenAIRE

    Naranjo Gil, David

    2004-01-01

    Organizations are designing more sophisticated accounting information systems to meet the strategic goals and enhance their performance. This study examines the effect of accounting information system design on the performance of organizations pursuing different strategic priorities. The alignment between sophisticated accounting information systems and organizational strategy is analyzed. The enabling effect of the accounting information system on performance is also examined. Relationships ...

  19. Financial Literacy and Financial Sophistication in the Older Population

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S.; Curto, Vilsa

    2017-01-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications. PMID:28553191

  20. Financial Literacy and Financial Sophistication in the Older Population.

    Science.gov (United States)

    Lusardi, Annamaria; Mitchell, Olivia S; Curto, Vilsa

    2014-10-01

    Using a special-purpose module implemented in the Health and Retirement Study, we evaluate financial sophistication in the American population over the age of 50. We combine several financial literacy questions into an overall index to highlight which questions best capture financial sophistication and examine the sensitivity of financial literacy responses to framing effects. Results show that many older respondents are not financially sophisticated: they fail to grasp essential aspects of risk diversification, asset valuation, portfolio choice, and investment fees. Subgroups with notable deficits include women, the least educated, non-Whites, and those over age 75. In view of the fact that retirees increasingly must take on responsibility for their own retirement security, such meager levels of knowledge have potentially serious and negative implications.

  1. The Impact of Financial Sophistication on Adjustable Rate Mortgage Ownership

    Science.gov (United States)

    Smith, Hyrum; Finke, Michael S.; Huston, Sandra J.

    2011-01-01

    The influence of a financial sophistication scale on adjustable-rate mortgage (ARM) borrowing is explored. Descriptive statistics and regression analysis using recent data from the Survey of Consumer Finances reveal that ARM borrowing is driven by both the least and most financially sophisticated households but for different reasons. Less…

  2. Cognitive Load and Strategic Sophistication

    OpenAIRE

    Allred, Sarah; Duffy, Sean; Smith, John

    2013-01-01

    We study the relationship between the cognitive load manipulation and strategic sophistication. The cognitive load manipulation is designed to reduce the subject's cognitive resources that are available for deliberation on a choice. In our experiment, subjects are placed under a large cognitive load (given a difficult number to remember) or a low cognitive load (given a number which is not difficult to remember). Subsequently, the subjects play a one-shot game then they are asked to recall...

  3. Moral foundations and political attitudes: The moderating role of political sophistication.

    Science.gov (United States)

    Milesi, Patrizia

    2016-08-01

    Political attitudes can be associated with moral concerns. This research investigated whether people's level of political sophistication moderates this association. Based on the Moral Foundations Theory, this article examined whether political sophistication moderates the extent to which reliance on moral foundations, as categories of moral concerns, predicts judgements about policy positions. With this aim, two studies examined four policy positions shown by previous research to be best predicted by the endorsement of Sanctity, that is, the category of moral concerns focused on the preservation of physical and spiritual purity. The results showed that reliance on Sanctity predicted political sophisticates' judgements, as opposed to those of unsophisticates, on policy positions dealing with equal rights for same-sex and unmarried couples and with euthanasia. Political sophistication also interacted with Fairness endorsement, which includes moral concerns for equal treatment of everybody and reciprocity, in predicting judgements about equal rights for unmarried couples, and interacted with reliance on Authority, which includes moral concerns for obedience and respect for traditional authorities, in predicting opposition to stem cell research. Those findings suggest that, at least for these particular issues, endorsement of moral foundations can be associated with political attitudes more strongly among sophisticates than unsophisticates. © 2015 International Union of Psychological Science.
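
    The moderation analyses described above amount to testing an interaction term in a regression of policy judgements on a moral foundation score and political sophistication. A hedged sketch on simulated data; the variable names and effect sizes are illustrative, not the study's:

```python
import numpy as np

# A reliably non-zero coefficient on the product term is the moderation effect.
rng = np.random.default_rng(0)
n = 500
sanctity = rng.normal(size=n)
sophistication = rng.normal(size=n)
# Simulated world: Sanctity predicts the judgement more strongly at higher
# levels of sophistication (pure interaction, no main effects).
judgement = 0.8 * sanctity * sophistication + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), sanctity, sophistication,
                     sanctity * sophistication])
beta, *_ = np.linalg.lstsq(X, judgement, rcond=None)
interaction = beta[3]          # recovers roughly 0.8
```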

  4. Hazardous Waste Landfill Siting using GIS Technique and Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Ozeair Abessi

    2010-07-01

    Disposal of the large amount of hazardous waste generated in power plants has always received communities' and authorities' attention. In this paper, using a site screening method and the Analytical Hierarchy Process (AHP), a sophisticated approach for siting hazardous waste landfills in large areas is presented. This approach demonstrates how evaluation criteria such as physical, socio-economic, technical and environmental factors, and their regulatory sub-criteria, can be introduced into an overlay technique to screen a limited number of appropriate zones in the area. Then, in order to find the optimal site among the initially screened sites, a Multiple Criteria Decision Making (MCDM) method for the hierarchy computations of the process is recommended. Using the introduced method, an accurate siting procedure for environmental planning of landfills in an area is enabled. In this study the approach was utilized for the disposal of hazardous wastes from the Shahid Rajaee thermal power plant, located in Qazvin province in the west-central part of Iran. As a result, 10 suitable zones were first screened in the area; then, using the analytical hierarchy process, a site near the power plant was chosen as the optimal site for landfilling of hazardous wastes in Qazvin province.
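
    The hierarchy computations of AHP reduce to extracting the principal eigenvector of a pairwise comparison matrix and checking its consistency ratio. A minimal sketch; the three criteria and the pairwise judgement values are illustrative, not those of the Qazvin study:

```python
import numpy as np

# Pairwise comparisons on Saaty's 1-9 scale, e.g. environmental vs.
# socio-economic vs. technical criteria (values made up for illustration).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights = weights / weights.sum()          # normalized priority vector

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)       # consistency index
cr = ci / 0.58                             # random index RI = 0.58 for n = 3
                                           # judgements acceptable if cr < 0.1
```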

  5. Aristotle and Social-Epistemic Rhetoric: The Systematizing of the Sophistic Legacy.

    Science.gov (United States)

    Allen, James E.

    While Aristotle's philosophical views are more foundational than those of many of the Older Sophists, Aristotle's rhetorical theories inherit and incorporate many of the central tenets ascribed to Sophistic rhetoric, albeit in a more systematic fashion, as represented in the "Rhetoric." However, Aristotle was more than just a rhetorical…

  6. The predictors of economic sophistication: media, interpersonal communication and negative economic experiences

    NARCIS (Netherlands)

    Kalogeropoulos, A.; Albæk, E.; de Vreese, C.H.; van Dalen, A.

    2015-01-01

    In analogy to political sophistication, it is imperative that citizens have a certain level of economic sophistication, especially in times of heated debates about the economy. This study examines the impact of different influences (media, interpersonal communication and personal experiences) on

  7. Novel food processing techniques

    Directory of Open Access Journals (Sweden)

    Vesna Lelas

    2006-12-01

    Recently, a lot of research has focused on the development of novel mild food processing techniques with the aim of obtaining high-quality food products. It is also presumed that they could substitute for some of the traditional processes in the food industry. The investigations are primarily directed at the use of high hydrostatic pressure, ultrasound, tribomechanical micronization, microwaves, and pulsed electric fields. The results of the scientific research indicate that application of some of these processes in a particular food industry can yield many benefits: significant energy savings, shorter process duration, mild thermal conditions, and food products with better sensory characteristics and higher nutritional value can be achieved. As some of these techniques also act on the molecular level, changing the conformation, structure and electrical potential of organic as well as inorganic materials, the improvement of some functional properties of these components may occur. Common characteristics of all of these techniques are treatment at ambient or only slightly higher temperatures and short processing times (1 to 10 minutes). High hydrostatic pressure applied to various foodstuffs can destroy some microorganisms, successfully modify molecular conformation, and consequently improve the functional properties of foods. At the same time it acts positively on food products intended for freezing. Tribomechanical treatment causes micronization of various solid materials, resulting in nanoparticles and changes in the structure and electrical potential of molecules; therefore, significant improvement of some rheological and functional properties of materials occurs. Ultrasound treatment has proved to be a potentially very successful food processing technique. It can be used as a pretreatment to drying (it decreases drying time and improves the functional properties of food), and as an extraction process for various components

  8. PAUL AND SOPHISTIC RHETORIC: A PERSPECTIVE ON HIS ...

    African Journals Online (AJOL)

    use of modern rhetorical theories but analyses the letter in terms of the clas- ..... If a critical reader would have had the traditional anti-sophistic arsenal ..... pressions and that 'rhetoric' is mainly a matter of communicating these thoughts.

  9. Applicability of statistical process control techniques

    NARCIS (Netherlands)

    Schippers, W.A.J.

    1998-01-01

    This paper concerns the application of Process Control Techniques (PCTs) for the improvement of the technical performance of discrete production processes. Successful applications of these techniques, such as Statistical Process Control Techniques (SPC), can be found in the literature. However, some
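
    The best-known SPC technique, the Shewhart control chart, can be sketched in a few lines: estimate the process mean and spread from in-control data, then flag new observations outside the three-sigma limits. The baseline measurements below are illustrative:

```python
# Shewhart individuals chart sketch: 3-sigma control limits estimated from
# in-control baseline data; new points outside the limits raise an alarm.

def control_limits(samples):
    n = len(samples)
    mean = sum(samples) / n
    sigma = (sum((x - mean) ** 2 for x in samples) / (n - 1)) ** 0.5
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(baseline, new_points):
    lcl, ucl = control_limits(baseline)
    return [x for x in new_points if x < lcl or x > ucl]

baseline = [10.1, 9.9, 10.0, 10.2, 9.8, 10.0, 10.1, 9.9]
alarms = out_of_control(baseline, [10.05, 11.5])     # only 11.5 is flagged
```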

  10. Isocratean Discourse Theory and Neo-Sophistic Pedagogy: Implications for the Composition Classroom.

    Science.gov (United States)

    Blair, Kristine L.

    With the recent interest in the fifth century B.C. theories of Protagoras and Gorgias come assumptions about the philosophical affinity of the Greek educator Isocrates to this pair of older sophists. Isocratean education in discourse, with its emphasis on collaborative political discourse, falls within recent definitions of a sophist curriculum.…

  11. SMEs and new ventures need business model sophistication

    DEFF Research Database (Denmark)

    Kesting, Peter; Günzel-Jensen, Franziska

    2015-01-01

    , and Spreadshirt, this article develops a framework that introduces five business model sophistication strategies: (1) uncover additional functions of your product, (2) identify strategic benefits for third parties, (3) take advantage of economies of scope, (4) utilize cross-selling opportunities, and (5) involve...

  12. Developments in operator assistance techniques for nuclear power plant control and operation

    International Nuclear Information System (INIS)

    Poujol, A.; Papin, B.; Beltranda, G.; Soldermann, R.

    1989-01-01

    This paper describes an approach which has been developed in order to improve nuclear power plant control and monitoring in normal and abnormal situations. These developments take full advantage of the trend towards the computerization of control rooms in industrial continuous processes. This research program consists of a thorough exploration of different information processing techniques, ranging from the rather simple visual synthesis of information on graphic displays to sophisticated Artificial Intelligence (AI) techniques. These techniques are applied to solve man-machine interface problems in the different domains of plant operation

  13. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

    In this era, there are many business process modeling techniques. This article presents research on the differences between business process modeling techniques. For each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented for Somerleyton Animal Park. Each technique is discussed along with its advantages and disadvantages. The final conclusion recommends business process modeling techniques that are easy to use and that can serve as the basis for evaluating further modelling techniques.

  14. Analysis of Low Probability of Intercept (LPI) Radar Signals Using Cyclostationary Processing

    National Research Council Canada - National Science Library

    Lime, Antonio

    2002-01-01

    ... problem in the battle space. To detect these types of radar, new digital receivers that use sophisticated signal processing techniques are required. This thesis investigates the use of cyclostationary...
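
    The cyclostationary signature such a receiver exploits can be sketched with the cyclic autocorrelation: a periodic component buried in noise produces a distinct feature at its cycle frequency. A minimal illustration; the carrier frequency and noise level are made up, not drawn from the thesis:

```python
import numpy as np

# Cyclic autocorrelation R_x^alpha(tau) = <x[n+tau] x[n] exp(-j*2*pi*alpha*n)>.
# A sinusoid at f0 hidden in noise yields a strong feature at alpha = 2*f0.

def cyclic_autocorr(x, alpha, tau):
    n = np.arange(len(x) - tau)
    return np.mean(x[n + tau] * x[n] * np.exp(-2j * np.pi * alpha * n))

f0 = 0.05                                   # normalized carrier frequency
t = np.arange(4096)
rng = np.random.default_rng(1)
x = np.cos(2 * np.pi * f0 * t) + rng.normal(scale=1.0, size=t.size)   # noisy signal

feature = abs(cyclic_autocorr(x, 2 * f0, 0))        # strong: on-cycle frequency
background = abs(cyclic_autocorr(x, 0.123, 0))      # weak: off-cycle frequency
```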

  15. Cognitive ability rivals the effect of political sophistication on ideological voting

    DEFF Research Database (Denmark)

    Hebbelstrup Rye Rasmussen, Stig

    2016-01-01

    This article examines the impact of cognitive ability on ideological voting. We find, using a US sample and a Danish sample, that the effect of cognitive ability rivals the effect of the traditionally strongest predictor of ideological voting, political sophistication. Furthermore, the results are consistent with the effect of cognitive ability being partly mediated by political sophistication. Much of the effect of cognitive ability nevertheless remains, and is not explained by differences in education or Openness to experience either. The implications of these results for democratic theory are discussed.

  16. ActionScript 3.0 Design Patterns: Object Oriented Programming Techniques

    CERN Document Server

    Sanders, William

    2008-01-01

    If you're an experienced Flash or Flex developer ready to tackle sophisticated programming techniques with ActionScript 3.0, this hands-on introduction to design patterns takes you step by step through the process. You learn about various types of design patterns and construct small abstract examples before trying your hand at building full-fledged working applications outlined in the book.
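
    To give a flavor of the material, one of the classic patterns such a book covers, Observer, is sketched below in Python rather than ActionScript 3.0; the example is illustrative and not taken from the book:

```python
# Observer pattern sketch: a Subject holds observers and pushes events to them.

class Subject:
    def __init__(self):
        self._observers = []

    def attach(self, observer):
        self._observers.append(observer)

    def notify(self, event):
        for observer in self._observers:
            observer(event)

log = []
button = Subject()
button.attach(lambda e: log.append(f"clicked: {e}"))
button.notify("play")        # every attached observer receives the event
```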

  17. Business Process Customization using Process Merging Techniques

    NARCIS (Netherlands)

    Bulanov, Pavel; Lazovik, Alexander; Aiello, Marco

    2012-01-01

    One of the important applications of service composition techniques lies in the field of business process management. Essentially, a business process can be considered as a composition of services, which is usually prepared by domain experts, and many tasks still have to be performed manually. These

  18. The Relationship between Logistics Sophistication and Drivers of the Outsourcing of Logistics Activities

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2008-10-01

    Full Text Available A strong link has been established between operational excellence and the degree of sophistication of logistics organization, a function of factors such as performance monitoring, investment in Information Technology [IT] and the formalization of logistics organization, as proposed in the Bowersox, Daugherty, Dröge, Germain and Rogers (1992) Leading Edge model. At the same time, shippers have been increasingly outsourcing their logistics activities to third party providers. This paper, based on a survey of large Brazilian shippers, addresses a gap in the literature by investigating the relationship between dimensions of logistics organization sophistication and drivers of logistics outsourcing. To this end, the dimensions behind the logistics sophistication construct were first investigated. Results from factor analysis led to the identification of six dimensions of logistics sophistication. By means of multivariate logistic regression analyses it was possible to relate some of these dimensions, such as the formalization of the logistics organization, to certain drivers of the outsourcing of logistics activities of Brazilian shippers, such as cost savings. These results indicate the possibility of segmenting shippers according to characteristics of their logistics organization, which may be particularly useful to logistics service providers.

  19. Lexical Complexity Development from Dynamic Systems Theory Perspective: Lexical Density, Diversity, and Sophistication

    Directory of Open Access Journals (Sweden)

    Reza Kalantari

    2017-10-01

    Full Text Available This longitudinal case study explored Iranian EFL learners’ lexical complexity (LC) through the lenses of Dynamic Systems Theory (DST). Fifty independent essays written by five intermediate to advanced female EFL learners in a TOEFL iBT preparation course over six months constituted the corpus of this study. Three Coh-Metrix indices (Graesser, McNamara, Louwerse, & Cai, 2004; McNamara & Graesser, 2012), three Lexical Complexity Analyzer indices (Lu, 2010, 2012; Lu & Ai, 2011), and four Vocabprofile indices (Cobb, 2000) were selected to measure different dimensions of LC. Results of repeated measures analysis of variance (RM ANOVA) indicated an improvement with regard to only lexical sophistication. Positive and significant relationships were found between time and mean values in Academic Word List and Beyond-2000 as indicators of lexical sophistication. The remaining seven indices of LC, falling short of significance, tended to flatten over the course of this writing program. Correlation analyses among LC indices indicated that lexical density enjoyed positive correlations with lexical sophistication. However, lexical diversity revealed no significant correlations with either lexical density or lexical sophistication. This study suggests that the DST perspective provides a viable foundation for analyzing lexical complexity.

  20. Fractional Processes and Fractional-Order Signal Processing Techniques and Applications

    CERN Document Server

    Sheng, Hu; Qiu, TianShuang

    2012-01-01

    Fractional processes are widely found in science, technology and engineering systems. In Fractional Processes and Fractional-order Signal Processing, some complex random signals, characterized by the presence of a heavy-tailed distribution or non-negligible dependence between distant observations (local and long memory), are introduced and examined from the ‘fractional’ perspective using simulation, fractional-order modeling and filtering and realization of fractional-order systems. These fractional-order signal processing (FOSP) techniques are based on fractional calculus, the fractional Fourier transform and fractional lower-order moments. Fractional Processes and Fractional-order Signal Processing: • presents fractional processes of fixed, variable and distributed order studied as the output of fractional-order differential systems; • introduces FOSP techniques and the fractional signals and fractional systems point of view; • details real-world-application examples of FOSP techniques to demonstr...
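The FOSP techniques described above build on fractional calculus; their discrete workhorse is the Grünwald-Letnikov fractional difference, which generalizes the ordinary difference operator to non-integer order. A minimal sketch (illustrative, not the book's code):

```python
import numpy as np

def gl_fractional_diff(x, alpha):
    """Grünwald-Letnikov fractional difference of order alpha (unit step size).

    Coefficients follow the recursion w_0 = 1, w_k = w_{k-1} * (k - 1 - alpha) / k,
    so alpha = 1 reduces to the ordinary first difference, while non-integer
    alpha yields a long (slowly decaying) memory of past samples.
    """
    n = len(x)
    w = np.empty(n)
    w[0] = 1.0
    for k in range(1, n):
        w[k] = w[k - 1] * (k - 1 - alpha) / k
    # y[n] = sum_k w[k] * x[n-k]: a causal convolution with the weights.
    return np.convolve(x, w)[:n]

step = np.array([0.0, 0.0, 1.0, 1.0, 1.0, 1.0])  # a unit step
print(gl_fractional_diff(step, 1.0))  # ordinary first difference: step -> impulse
print(gl_fractional_diff(step, 0.5))  # half-order: jump followed by a slowly decaying tail
```

Because the operator corresponds to the transfer function (1 - z^-1)^alpha, applying the half-order difference twice reproduces the ordinary first difference.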

  1. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Directory of Open Access Journals (Sweden)

    Daniel Müllensiefen

    Full Text Available Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  2. The musicality of non-musicians: an index for assessing musical sophistication in the general population.

    Science.gov (United States)

    Müllensiefen, Daniel; Gingras, Bruno; Musil, Jason; Stewart, Lauren

    2014-01-01

    Musical skills and expertise vary greatly in Western societies. Individuals can differ in their repertoire of musical behaviours as well as in the level of skill they display for any single musical behaviour. The types of musical behaviours we refer to here are broad, ranging from performance on an instrument and listening expertise, to the ability to employ music in functional settings or to communicate about music. In this paper, we first describe the concept of 'musical sophistication' which can be used to describe the multi-faceted nature of musical expertise. Next, we develop a novel measurement instrument, the Goldsmiths Musical Sophistication Index (Gold-MSI) to assess self-reported musical skills and behaviours on multiple dimensions in the general population using a large Internet sample (n = 147,636). Thirdly, we report results from several lab studies, demonstrating that the Gold-MSI possesses good psychometric properties, and that self-reported musical sophistication is associated with performance on two listening tasks. Finally, we identify occupation, occupational status, age, gender, and wealth as the main socio-demographic factors associated with musical sophistication. Results are discussed in terms of theoretical accounts of implicit and statistical music learning and with regard to social conditions of sophisticated musical engagement.

  3. The New Toxicology of Sophisticated Materials: Nanotoxicology and Beyond

    Science.gov (United States)

    Maynard, Andrew D.; Warheit, David B.; Philbert, Martin A.

    2011-01-01

    It has long been recognized that the physical form of materials can mediate their toxicity—the health impacts of asbestiform materials, industrial aerosols, and ambient particulate matter are prime examples. Yet over the past 20 years, toxicology research has suggested complex and previously unrecognized associations between material physicochemistry at the nanoscale and biological interactions. With the rapid rise of the field of nanotechnology and the design and production of increasingly complex nanoscale materials, it has become ever more important to understand how the physical form and chemical composition of these materials interact synergistically to determine toxicity. As a result, a new field of research has emerged—nanotoxicology. Research within this field is highlighting the importance of material physicochemical properties in how dose is understood, how materials are characterized in a manner that enables quantitative data interpretation and comparison, and how materials move within, interact with, and are transformed by biological systems. Yet many of the substances that are the focus of current nanotoxicology studies are relatively simple materials that are at the vanguard of a new era of complex materials. Over the next 50 years, there will be a need to understand the toxicology of increasingly sophisticated materials that exhibit novel, dynamic and multifaceted functionality. If the toxicology community is to meet the challenge of ensuring the safe use of this new generation of substances, it will need to move beyond “nano” toxicology and toward a new toxicology of sophisticated materials. Here, we present a brief overview of the current state of the science on the toxicology of nanoscale materials and focus on three emerging toxicology-based challenges presented by sophisticated materials that will become increasingly important over the next 50 years: identifying relevant materials for study, physicochemical characterization, and

  4. Building the competitive intelligence knowledge: processes and activities in a corporate organisation

    OpenAIRE

    Sreenivasulu, V.

    1999-01-01

    This paper discusses the process of building and developing comprehensive tools, techniques, support systems, and better methods of harnessing the competitive intelligence knowledge processes. The author stresses the need for building sophisticated methodological competitive intelligence knowledge acquisition, systematic collection of competitive intelligence knowledge from various sources for critical analysis, process, organization, synthesis, assessment, screening, filtering and interpreta...

  5. Library of sophisticated functions for analysis of nuclear spectra

    Science.gov (United States)

    Morháč, Miroslav; Matoušek, Vladislav

    2009-10-01

    In the paper we present a compact library for analysis of nuclear spectra. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. The functions can process one- and two-dimensional spectra. The software described in the paper comprises a number of conventional as well as newly developed methods needed to analyze experimental data. Program summary Program title: SpecAnalysLib 1.1 Catalogue identifier: AEDZ_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEDZ_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 42 154 No. of bytes in distributed program, including test data, etc.: 2 379 437 Distribution format: tar.gz Programming language: C++ Computer: Pentium 3 PC 2.4 GHz or higher, Borland C++ Builder v. 6. A precompiled Windows version is included in the distribution package Operating system: Windows 32 bit versions RAM: 10 MB Word size: 32 bits Classification: 17.6 Nature of problem: The demand for advanced, highly effective experimental data analysis functions is enormous. The library package represents one approach to give physicists the possibility to use the advanced routines simply by calling them from their own programs. SpecAnalysLib is a collection of functions for analysis of one- and two-parameter γ-ray spectra, but they can be used for other types of data as well. The library consists of sophisticated functions for background elimination, smoothing, peak searching, deconvolution, and peak fitting. Solution method: The algorithms of background estimation are based on the Sensitive Non-linear Iterative Peak (SNIP) clipping algorithm. The smoothing algorithms are based on the convolution of the original data with several types of filters and algorithms based on discrete
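The SNIP clipping algorithm named above has a compact core: each channel is repeatedly clipped to the mean of its neighbours at a growing distance, which removes narrow peaks while leaving the smooth background intact. A simplified sketch (illustrative; production implementations such as this library typically also apply an LLS transform and offer 2D variants):

```python
import numpy as np

def snip_background(spectrum, m):
    """Estimate a smooth background with SNIP-style iterative clipping.

    At each iteration p = 1..m, every channel is clipped to the mean of
    its neighbours p channels away, progressively removing peaks narrower
    than roughly m channels while preserving a slowly varying background.
    """
    v = spectrum.astype(float).copy()
    n = len(v)
    for p in range(1, m + 1):
        clipped = v.copy()
        for i in range(p, n - p):
            clipped[i] = min(v[i], 0.5 * (v[i - p] + v[i + p]))
        v = clipped
    return v

# Synthetic spectrum: linear background plus one Gaussian peak.
ch = np.arange(200)
background = 50.0 + 0.1 * ch
peak = 300.0 * np.exp(-0.5 * ((ch - 100) / 3.0) ** 2)
spectrum = background + peak

est = snip_background(spectrum, m=20)
print(np.max(np.abs(est - background)))  # residual is tiny next to the 300-count peak
```

Subtracting the estimate from the spectrum leaves the peaks on a near-zero baseline, ready for peak searching and fitting.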

  6. Procles the Carthaginian: A North African Sophist in Pausanias’ Periegesis

    Directory of Open Access Journals (Sweden)

    Juan Pablo Sánchez Hernández

    2010-11-01

    Full Text Available Procles, cited by Pausanias (in the imperfect tense) about a display in Rome and for an opinion about Pyrrhus of Epirus, was probably not a historian of Hellenistic date, but a contemporary sophist whom Pausanias encountered in person in Rome.

  7. Does underground storage still require sophisticated studies?

    International Nuclear Information System (INIS)

    Marsily, G. de

    1997-01-01

    Most countries agree on the necessity of burying high- or medium-level wastes in geological layers situated a few hundred meters below ground level. The advantages and disadvantages of different types of rock such as salt, clay, granite and volcanic material are examined. Sophisticated studies are conducted to determine the best geological confinement, but questions arise about the time for which safety must be ensured. France has chosen 3 possible sites. These sites are geologically described in the article. The final site will be proposed after a testing phase of about 5 years in an underground facility. (A.C.)

  8. Finding the Fabulous Few: Why Your Program Needs Sophisticated Research.

    Science.gov (United States)

    Pfizenmaier, Emily

    1981-01-01

    Fund raising, it is argued, needs sophisticated prospect research. Professional prospect researchers play an important role in helping to identify prospective donors and also in helping to stimulate interest in gift giving. A sample of an individual work-up on a donor and a bibliography are provided. (MLW)

  9. Reading wild minds: A computational assay of Theory of Mind sophistication across seven primate species.

    Directory of Open Access Journals (Sweden)

    Marie Devaine

    2017-11-01

    Full Text Available Theory of Mind (ToM), i.e. the ability to understand others' mental states, endows humans with highly adaptive social skills such as teaching or deceiving. Candidate evolutionary explanations have been proposed for the unique sophistication of human ToM among primates. For example, the Machiavellian intelligence hypothesis states that the increasing complexity of social networks may have induced a demand for sophisticated ToM. This type of scenario ignores neurocognitive constraints that may eventually be crucial limiting factors for ToM evolution. In contradistinction, the cognitive scaffolding hypothesis asserts that a species' opportunity to develop sophisticated ToM is mostly determined by its general cognitive capacity (on which ToM is scaffolded). However, the actual relationships between ToM sophistication and either brain volume (a proxy for general cognitive capacity) or social group size (a proxy for social network complexity) are unclear. Here, we let 39 individuals sampled from seven non-human primate species (lemurs, macaques, mangabeys, orangutans, gorillas and chimpanzees) engage in simple dyadic games against artificial ToM players (via a familiar human caregiver). Using computational analyses of primates' choice sequences, we found that the probability of exhibiting a ToM-compatible learning style is mainly driven by species' brain volume (rather than by social group size). Moreover, primates' social cognitive sophistication culminates in a precursor form of ToM, which still falls short of humans' fully-developed ToM abilities.

  10. Evaluation of EMG processing techniques using Information Theory.

    Science.gov (United States)

    Farfán, Fernando D; Politti, Julio C; Felice, Carmelo J

    2010-11-12

    Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields, as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were recorded, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.
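The four processing techniques compared in the paper are all simple windowed statistics. A minimal sketch of computing them over non-overlapping windows (illustrative; the toy signal below stands in for a real EMG recording):

```python
import numpy as np

def emg_features(x, fs, window_ms=200):
    """Windowed EMG features: the four techniques evaluated in the paper.

    The signal is segmented into non-overlapping windows (200 ms was found
    optimal for static contractions) and each feature is computed per window.
    """
    win = int(fs * window_ms / 1000)
    feats = {"AMV": [], "RMS": [], "VAR": [], "DAMV": []}
    for k in range(len(x) // win):
        seg = x[k * win:(k + 1) * win]
        feats["AMV"].append(np.mean(np.abs(seg)))            # absolute mean value
        feats["RMS"].append(np.sqrt(np.mean(seg ** 2)))      # root mean square
        feats["VAR"].append(np.var(seg))                     # variance
        feats["DAMV"].append(np.mean(np.abs(np.diff(seg))))  # difference absolute mean value
    return {name: np.array(vals) for name, vals in feats.items()}

# Toy surrogate signal: zero-mean noise whose amplitude doubles halfway
# through, mimicking an increase in contraction intensity.
rng = np.random.default_rng(1)
fs = 1000.0
x = np.concatenate([rng.standard_normal(1000), 2.0 * rng.standard_normal(1000)])
features = emg_features(x, fs)
print(features["RMS"].round(2))  # the RMS track roughly doubles mid-signal
```

The paper's information-theoretic evaluation then asks how much information such a feature track preserves about the underlying contraction.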

  11. Lexical Sophistication as a Multidimensional Phenomenon: Relations to Second Language Lexical Proficiency, Development, and Writing Quality

    Science.gov (United States)

    Kim, Minkyung; Crossley, Scott A.; Kyle, Kristopher

    2018-01-01

    This study conceptualizes lexical sophistication as a multidimensional phenomenon by reducing numerous lexical features of lexical sophistication into 12 aggregated components (i.e., dimensions) via a principal component analysis approach. These components were then used to predict second language (L2) writing proficiency levels, holistic lexical…

  12. Few remarks on chiral theories with sophisticated topology

    International Nuclear Information System (INIS)

    Golo, V.L.; Perelomov, A.M.

    1978-01-01

    Two classes of two-dimensional Euclidean chiral field theories are singled out: 1) the field phi(x) takes values in a compact Hermitian symmetric space; 2) the field phi(x) takes values in an orbit of the adjoint representation of a non-compact Lie group. The theories have sophisticated topological and rich analytical structures. They are considered with the help of topological invariants (topological charges). Explicit formulae for the topological charges are given, and a lower bound estimate for the action is derived.

  13. Evaluation of EMG processing techniques using Information Theory

    Directory of Open Access Journals (Sweden)

    Felice Carmelo J

    2010-11-01

    Full Text Available Abstract Background Electromyographic signals can be used in the biomedical engineering and/or rehabilitation fields, as potential sources of control for prosthetics and orthotics. In such applications, digital processing techniques are necessary to follow efficiently and effectively the changes in the physiological characteristics produced by a muscular contraction. In this paper, two methods based on information theory are proposed to evaluate the processing techniques. Methods These methods determine the amount of information that a processing technique is able to extract from EMG signals. The processing techniques evaluated with these methods were: absolute mean value (AMV), RMS values, variance values (VAR) and difference absolute mean value (DAMV). EMG signals from the middle deltoid during abduction and adduction movement of the arm in the scapular plane were recorded, for static and dynamic contractions. The optimal window length (segmentation), abduction and adduction movements and inter-electrode distance were also analyzed. Results Using the optimal segmentation (200 ms and 300 ms in static and dynamic contractions, respectively) the best processing techniques were: RMS, AMV and VAR in static contractions, and only the RMS in dynamic contractions. Using the RMS of the EMG signal, variations in the amount of information between the abduction and adduction movements were observed. Conclusions Although the evaluation methods proposed here were applied to standard processing techniques, these methods can also be considered as alternative tools to evaluate new processing techniques in different areas of electrophysiology.

  14. Reactive polymer coatings: A robust platform towards sophisticated surface engineering for biotechnology

    Science.gov (United States)

    Chen, Hsien-Yeh

    Functionalized poly(p-xylylenes), or so-called reactive polymers, can be synthesized via chemical vapor deposition (CVD) polymerization. The resulting ultra-thin coatings are pinhole-free and can be conformally deposited onto a wide range of substrates and materials. More importantly, the equipped functional groups can serve as anchoring sites for tailoring the surface properties, making these reactive coatings a robust platform that can deal with sophisticated challenges faced in biointerfaces. In the work presented herein, surface coatings presenting various functional groups were prepared by the CVD process. Such surfaces include an aldehyde-functionalized coating to precisely immobilize saccharide molecules onto well-defined areas and an alkyne-functionalized coating to click azide-modified molecules via the Huisgen 1,3-dipolar cycloaddition reaction. Moreover, CVD copolymerization has been conducted to prepare multifunctional coatings, and their specific functions were demonstrated by the immobilization of biotin and NHS-ester molecules. By using a photodefinable coating, polyethylene oxides were immobilized onto a wide range of substrates through photo-immobilization. Spatially controlled protein-resistant properties were characterized by selective adsorption of fibrinogen and bovine serum albumin as model systems. Alternatively, surface initiator coatings were used for polymer grafting of poly(ethylene glycol) methyl ether methacrylate, and the resultant protein- and cell-resistant properties were characterized by adsorption of kinesin motor proteins, fibrinogen, and murine fibroblasts (NIH3T3). Accessibility of reactive coatings within confined microgeometries was systematically studied, and the preparation of homogeneous polymer thin films on the inner surface of microchannels was demonstrated. Moreover, these advanced coatings were applied to develop a dry adhesion process for microfluidic devices. This process provides (i) excellent bonding strength, (ii) extended

  15. Differential ethnic associations between maternal flexibility and play sophistication in toddlers born very low birth weight

    Science.gov (United States)

    Erickson, Sarah J.; Montague, Erica Q.; Maclean, Peggy C.; Bancroft, Mary E.; Lowe, Jean R.

    2013-01-01

    Children born very low birth weight (VLBW) are at risk for delays in the development of self-regulation and effective functional skills, and play serves as an important avenue of early intervention. The current study investigated associations between maternal flexibility and toddler play sophistication in Caucasian, Spanish-speaking Hispanic, English-speaking Hispanic, and Native American toddlers (18-22 months adjusted age) in a cross-sectional cohort of 73 toddlers born VLBW and their mothers. We found that the association between maternal flexibility and toddler play sophistication differed by ethnicity (F(3,65) = 3.34, p = .02). In particular, Spanish-speaking Hispanic dyads evidenced a significant positive association between maternal flexibility and play sophistication of medium effect size. Results for Native Americans were parallel to those of Spanish-speaking Hispanic dyads: the relationship between flexibility and play sophistication was positive and of small-medium effect size. Findings indicate that for Caucasians and English-speaking Hispanics, flexibility evidenced a non-significant (negative and small effect size) association with toddler play sophistication. Significant follow-up contrasts revealed that the associations for Caucasian and English-speaking Hispanic dyads were significantly different from those of the other two ethnic groups. Results remained unchanged after adjusting for the amount of maternal language, an index of maternal engagement and stimulation; and after adjusting for birth weight, gestational age, gender, test age, and cognitive ability, as well as maternal age, education, and income. Our results provide preliminary evidence that ethnicity and acculturation may mediate the association between maternal interactive behavior such as flexibility and toddler developmental outcomes, as indexed by play sophistication. Addressing these association differences is particularly important in children born VLBW because interventions targeting parent interaction strategies such as

  16. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

  17. Image processing in diabetic related causes

    CERN Document Server

    Kumar, Amit

    2016-01-01

    This book is a collection of the experimental results and analyses carried out on medical images of diabetic-related causes. The experimental investigations have been carried out on images using techniques ranging from very basic image processing, such as image enhancement, to sophisticated image segmentation methods. This book is intended to create awareness of diabetes and its related causes, and of the image processing methods used to detect and forecast them, in a very simple way. This book is useful to researchers, engineers, medical doctors and bioinformatics researchers.

  18. Sophisticating a naive Liapunov function

    International Nuclear Information System (INIS)

    Smith, D.; Lewins, J.D.

    1985-01-01

    The art of the direct method of Liapunov to determine system stability is to construct a suitable Liapunov or V function, where V is to be positive definite (PD), shrinking to a center, which may be conveniently chosen as the origin, and where its time derivative dV/dt is negative definite (ND). One aid to the art is to solve an approximation to the system equations in order to provide a candidate V function. It can happen, however, that dV/dt is not strictly ND but vanishes at a finite number of isolated points. Naively, one anticipates that stability has been demonstrated, since the trajectory of the system at such points is only momentarily tangential and immediately enters a region of inward directed trajectories. To demonstrate stability rigorously requires the construction of a sophisticated Liapunov function from what can be called the naive original choice. In this paper, the authors demonstrate the method of perturbing the naive function in the context of the well-known second-order oscillator and then apply the method to a more complicated problem based on a prompt jump model for a nuclear fission reactor.
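A standard textbook illustration of this situation, a naive candidate V whose derivative fails to be strictly negative definite, is the damped oscillator (the example is generic and not taken from the paper):

```latex
% Damped oscillator \ddot{x} + 2\zeta\dot{x} + x = 0 with the naive energy candidate:
\[
  V(x,\dot{x}) = \tfrac{1}{2}\dot{x}^{2} + \tfrac{1}{2}x^{2},
  \qquad
  \dot{V} = \dot{x}\,\ddot{x} + x\,\dot{x}
          = \dot{x}\bigl(-2\zeta\dot{x} - x\bigr) + x\,\dot{x}
          = -2\zeta\,\dot{x}^{2} \;\le\; 0 .
\]
```

Here the derivative vanishes wherever the velocity is zero, not only at the origin, yet trajectories cross that set only tangentially and immediately re-enter the region where the derivative is negative; making such an argument rigorous is precisely what perturbing the naive V accomplishes.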

  19. Strategic sophistication of individuals and teams. Experimental evidence

    Science.gov (United States)

    Sutter, Matthias; Czermak, Simon; Feri, Francesco

    2013-01-01

    Many important decisions require strategic sophistication. We examine experimentally whether teams act more strategically than individuals. We let individuals and teams make choices in simple games, and also elicit first- and second-order beliefs. We find that teams play the Nash equilibrium strategy significantly more often, and their choices are more often a best response to stated first order beliefs. Distributional preferences make equilibrium play less likely. Using a mixture model, the estimated probability to play strategically is 62% for teams, but only 40% for individuals. A model of noisy introspection reveals that teams differ from individuals in higher order beliefs. PMID:24926100

  20. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to
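As an illustration of the matching task itself (not of the probabilistic evaluation the paper proposes), a naive first-pass matcher can pair activities whose labels are sufficiently similar; the activity labels below are hypothetical:

```python
from difflib import SequenceMatcher

def match_activities(model_a, model_b, threshold=0.6):
    """Naive label-based matcher: pair activities whose lowercased labels
    exceed a string-similarity threshold. Real matchers add semantic,
    structural and behavioural evidence; this only illustrates the task.
    """
    pairs = []
    for a in model_a:
        for b in model_b:
            score = SequenceMatcher(None, a.lower(), b.lower()).ratio()
            if score >= threshold:
                pairs.append((a, b, round(score, 2)))
    return pairs

# Two hypothetical loan-handling process models.
loan_a = ["Receive application", "Check credit history", "Approve loan"]
loan_b = ["Receive loan application", "Check credit", "Grant loan"]
pairs = match_activities(loan_a, loan_b)
print(pairs)
```

The paper's contribution addresses how to score such candidate pairs against gold standards in which human annotators themselves disagree.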

  1. Development Strategies for Tourism Destinations: Tourism Sophistication vs. Resource Investments

    OpenAIRE

    Rainer Andergassen; Guido Candela

    2010-01-01

    This paper investigates the effectiveness of development strategies for tourism destinations. We argue that resource investments unambiguously increase tourism revenues and that increasing the degree of tourism sophistication, that is increasing the variety of tourism related goods and services, increases tourism activity and decreases the perceived quality of the destination's resource endowment, leading to an ambiguous effect on tourism revenues. We disentangle these two effects and charact...

  2. Do organizations adopt sophisticated capital budgeting practices to deal with uncertainty in the investment decision? : A research note

    NARCIS (Netherlands)

    Verbeeten, Frank H M

    This study examines the impact of uncertainty on the sophistication of capital budgeting practices. While the theoretical applications of sophisticated capital budgeting practices (defined as the use of real option reasoning and/or game theory decision rules) have been well documented, empirical

  3. Signal processing techniques for sodium boiling noise detection

    International Nuclear Information System (INIS)

    1989-05-01

    At the Specialists' Meeting on Sodium Boiling Detection organized by the International Working Group on Fast Reactors (IWGFR) of the International Atomic Energy Agency at Chester in the United Kingdom in 1981, various methods of detecting sodium boiling were reported. However, it was not possible to make a comparative assessment of these methods because the signal conditions differed from one experiment to another. That is why participants of this meeting recommended that a benchmark test should be carried out in order to evaluate and compare signal processing methods for boiling detection. Organization of the Co-ordinated Research Programme (CRP) on signal processing techniques for sodium boiling noise detection was also recommended at the 16th meeting of the IWGFR. The CRP on Signal Processing Techniques for Sodium Boiling Noise Detection was set up in 1984. Eight laboratories from six countries agreed to participate in this CRP. The overall objective of the programme was the development of reliable on-line signal processing techniques which could be used for the detection of sodium boiling in an LMFBR core. During the first stage of the programme a number of existing processing techniques used by different countries were compared and evaluated. In the course of further work, an algorithm for implementation of this sodium boiling detection system in the nuclear reactor will be developed. It was also considered that the acoustic signal processing techniques developed for boiling detection could well make a useful contribution to other acoustic applications in the reactor. This publication consists of two parts. Part I is the final report of the co-ordinated research programme on signal processing techniques for sodium boiling noise detection. Part II contains two introductory papers and 20 papers presented at four research co-ordination meetings since 1985. A separate abstract was prepared for each of these 22 papers. Refs, figs and tabs

  4. Evaluation of stabilization techniques for ion implant processing

    Science.gov (United States)

    Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Narcy, Mark E.; Livesay, William R.

    1999-06-01

With the integration of high current ion implant processing into volume CMOS manufacturing, the need for photoresist stabilization to achieve a stable ion implant process is critical. This study compares electron beam stabilization, a non-thermal process, with more traditional thermal stabilization techniques such as hot plate baking and vacuum oven processing. The electron beam processing is carried out in a flood exposure system with no active heating of the wafer. These stabilization techniques are applied to typical ion implant processes that might be found in a CMOS production process flow. The stabilization processes are applied to a 1.1 micrometer thick PFI-38A i-line photoresist film prior to ion implant processing. Post-stabilization CD variation is detailed with respect to wall slope and feature integrity. SEM photographs detail the effects of the stabilization technique on photoresist features. The thermal stability of the photoresist is shown for different levels of stabilization and post-stabilization thermal cycling. Thermal flow stability of the photoresist is detailed via SEM photographs. A significant improvement in thermal stability is achieved with the electron beam process, such that photoresist features are stable to temperatures in excess of 200 degrees C. Ion implant processing parameters are evaluated and compared for the different stabilization methods. Ion implant system end-station chamber pressure is detailed as a function of ion implant process and stabilization condition. The ion implant process conditions are detailed for varying factors such as ion current, energy, and total dose. A reduction in the ion implant system's end-station chamber pressure is achieved with the electron beam stabilization process over the other techniques considered. This reduction in end-station chamber pressure is shown to provide a reduction in total process time for a given ion implant dose. Improvements in the ion implant process are detailed across

  5. Systematization and sophistication of a comprehensive sensitivity analysis program. Phase 2

    International Nuclear Information System (INIS)

    Oyamada, Kiyoshi; Ikeda, Takao

    2004-02-01

This study developed refined estimates by applying a comprehensive sensitivity analysis program to the reliability of TRU waste repository concepts in a crystalline rock setting. We examined each component and groundwater scenario of the geological repository and prepared systematic bases for examining reliability from the standpoint of comprehensiveness. Models and data were refined for the reliability examination. Based on an existing TRU waste repository concept, the effects of parameters on nuclide migration were quantitatively classified. These parameters, which will eventually be fixed quantitatively, include the site characteristics of the natural barrier and the design specifications of the engineered barriers. Considering the feasibility of those specifications, reliability is re-examined for combinations of those parameters within a practical range. Future issues are: comprehensive representation of a hybrid geosphere model including the fractured medium and the permeable matrix medium, and refinement of tools to develop reliable combinations of parameters. It is important to continue this study so that the disposal concepts and specifications for TRU-nuclide-containing waste at various sites can be determined rationally and safely. (author)

  6. The Value of Multivariate Model Sophistication: An Application to pricing Dow Jones Industrial Average options

    DEFF Research Database (Denmark)

    Rombouts, Jeroen V.K.; Stentoft, Lars; Violante, Francesco

We assess the predictive accuracy of a large number of multivariate volatility models in terms of pricing options on the Dow Jones Industrial Average. We measure the value of model sophistication in terms of dollar losses by considering a set of 248 multivariate models that differ… … innovation for a Laplace innovation assumption improves the pricing in a smaller way. Apart from investigating directly the value of model sophistication in terms of dollar losses, we also use the model confidence set approach to statistically infer the set of models that delivers the best pricing performance.

  7. Digital processing optical transmission and coherent receiving techniques

    CERN Document Server

    Binh, Le Nguyen

    2013-01-01

    With coherent mixing in the optical domain and processing in the digital domain, advanced receiving techniques employing ultra-high speed sampling rates have progressed tremendously over the last few years. These advances have brought coherent reception systems for lightwave-carried information to the next stage, resulting in ultra-high capacity global internetworking. Digital Processing: Optical Transmission and Coherent Receiving Techniques describes modern coherent receiving techniques for optical transmission and aspects of modern digital optical communications in the most basic lines. The

  8. Sophistic Ethics in the Technical Writing Classroom: Teaching "Nomos," Deliberation, and Action.

    Science.gov (United States)

    Scott, J. Blake

    1995-01-01

    Claims that teaching ethics is particularly important to technical writing. Outlines a classical, sophistic approach to ethics based on the theories and pedagogies of Protagoras, Gorgias, and Isocrates, which emphasizes the Greek concept of "nomos," internal and external deliberation, and responsible action. Discusses problems and…

  9. Sophisticated Fowl: The Complex Behaviour and Cognitive Skills of Chickens and Red Junglefowl

    Directory of Open Access Journals (Sweden)

    Laura Garnham

    2018-01-01

Full Text Available The world’s most numerous bird, the domestic chicken, and their wild ancestor, the red junglefowl, have long been used as model species for animal behaviour research. Recently, this research has advanced our understanding of the social behaviour, personality, and cognition of fowl, and demonstrated their sophisticated behaviour and cognitive skills. Here, we overview some of this research, starting by describing research investigating the well-developed senses of fowl, before presenting how socially and cognitively complex they can be. The realisation that domestic chickens, our most abundant production animal, are behaviourally and cognitively sophisticated should encourage greater general appreciation of, and fascination with, them. In turn, this should inspire increased use of them as both research and hobby animals, as well as improvements in their unfortunately often poor welfare.

  10. A Framework for WWW Query Processing

    Science.gov (United States)

    Wu, Binghui Helen; Wharton, Stephen (Technical Monitor)

    2000-01-01

Query processing is the most common operation in a DBMS. Sophisticated query processing has mainly targeted single-enterprise environments providing centralized control over data and metadata. Query submission by anonymous users on the web differs in that load balancing and DBMS access control become the key issues. This paper provides a solution by introducing a framework for WWW query processing. The success of this framework lies in the utilization of query optimization techniques and an ontological approach. This methodology has proved to be cost effective at the NASA Goddard Space Flight Center Distributed Active Archive Center (GDAAC).

  11. Microstructure of cheese: Processing, technological and microbiological considerations

    OpenAIRE

    Pereira, Cláudia I.; Gomes, Ana M. P.; Malcata, F. Xavier

    2009-01-01

    Cheese is a classical dairy product, which is strongly judged by its appearance and texture; hence, a renewed interest in its microstructure has been on the rise, as sophisticated techniques of analysis become more and more informative and widely available. Processing parameters that affect microstructure play a dominant role upon the features exhibited by the final product as perceived by the consumer; rational relationships between microstructure (which includes biochem...

  12. Processing data collected from radiometric experiments by multivariate technique

    International Nuclear Information System (INIS)

    Urbanski, P.; Kowalska, E.; Machaj, B.; Jakowiuk, A.

    2005-01-01

Multivariate techniques applied to data collected from radiometric experiments can provide more efficient extraction of the information contained in the spectra. Several techniques are considered: (i) multivariate calibration using Partial Least Squares Regression and Artificial Neural Networks, (ii) standardization of the spectra, (iii) smoothing of the collected spectra, where the autocorrelation function and the bootstrap were used for assessment of the processed data, and (iv) image processing using Principal Component Analysis. Application of these techniques is illustrated with examples from industrial applications. (author)
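As a flavour of the multivariate processing listed above, here is a minimal sketch on synthetic data, using PCA via SVD (a stand-in for the authors' PLS/ANN tooling): a set of simulated two-peak spectra is denoised by truncating the principal components. All spectra, peak positions, and noise levels are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate 50 noisy spectra: two Gaussian peaks with varying amplitudes.
channels = np.arange(256)
peak = lambda c, w: np.exp(-0.5 * ((channels - c) / w) ** 2)
clean = (np.outer(rng.uniform(5, 10, 50), peak(80, 6))
         + np.outer(rng.uniform(1, 4, 50), peak(180, 9)))
noisy = clean + rng.normal(0, 0.3, clean.shape)

# PCA via SVD on mean-centred spectra; keep the first k components.
mean = noisy.mean(axis=0)
U, s, Vt = np.linalg.svd(noisy - mean, full_matrices=False)
k = 2  # two independent sources of variation in this simulation
denoised = mean + (U[:, :k] * s[:k]) @ Vt[:k]

err_noisy = np.sqrt(np.mean((noisy - clean) ** 2))
err_denoised = np.sqrt(np.mean((denoised - clean) ** 2))
print(err_denoised < err_noisy)  # truncated PCA suppresses channel noise
```

Because the clean signal here is exactly rank 2, the two leading components capture it and most of the channel noise falls in the discarded components.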

  13. Image processing for medical diagnosis of human organs

    International Nuclear Information System (INIS)

    Tamura, Shin-ichi

    1989-01-01

The report first describes expectations and needs for diagnostic imaging in the field of clinical medicine, radiation medicine in particular, as viewed by the author, an image processing expert working at a medical institute. Then, medical image processing techniques are discussed in relation to the advanced information processing techniques currently drawing much attention in the field of engineering. Finally, practical applications of image processing techniques to diagnosis are discussed. In the field of clinical diagnosis, advanced equipment such as PACS (picture archiving and communication systems) has come into wider use, and efforts have been made to shift from visual examination to more quantitative and objective diagnosis by means of such advanced systems. In clinical medicine, practical, robust systems are more useful than sophisticated ones. It is difficult, though important, to develop completely automated diagnostic systems. The urgent, realistic goal, therefore, is to develop effective diagnosis support systems. In particular, operation support systems equipped with three-dimensional displays will be very useful. (N.K.)

  14. Pretreatment techniques of biodegradable municipal wastewater for sustainable development of surface and groundwater resources: a survey/case studies (abstract)

    International Nuclear Information System (INIS)

    Rashid, A.; Sajjad, M.R.

    1999-01-01

Water being a scarce commodity, recharge of groundwater with clean surface water is important to maintain good quality water resources. This paper reviews and discusses the advantages and disadvantages of different techniques for the treatment of municipal wastewaters in developing countries. The processes discussed range from simple stabilization ponds and land treatment to aerated lagoons and oxidation ditches. More sophisticated techniques of activated sludge and anaerobic digestion are also discussed. The feasibility of these techniques in terms of cost, land area, removal of pathogens, effluent quality and need for technical expertise is discussed. (author)

  15. Assessing Epistemic Sophistication by Considering Domain-Specific Absolute and Multiplicistic Beliefs Separately

    Science.gov (United States)

    Peter, Johannes; Rosman, Tom; Mayer, Anne-Kathrin; Leichner, Nikolas; Krampen, Günter

    2016-01-01

    Background: Particularly in higher education, not only a view of science as a means of finding absolute truths (absolutism), but also a view of science as generally tentative (multiplicism) can be unsophisticated and obstructive for learning. Most quantitative epistemic belief inventories neglect this and understand epistemic sophistication as…

  16. A measurement technique for counting processes

    International Nuclear Information System (INIS)

    Cantoni, V.; Pavia Univ.; De Lotto, I.; Valenziano, F.

    1980-01-01

    A technique for the estimation of first and second order properties of a stationary counting process is presented here which uses standard instruments for analysis of a continuous stationary random signal. (orig.)
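The idea of estimating first- and second-order properties of a stationary counting process with tools built for continuous signals can be sketched digitally as follows; the binned Poisson event train, rate, and bin width are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(7)

# Bin a stationary Poisson event train so it can be treated as a sampled
# continuous random signal.
T, dt = 1000.0, 0.1                  # observation time and bin width, s
rate = 5.0                           # true event rate, events/s
counts = rng.poisson(rate * dt, int(T / dt))

# First-order property: the mean rate.
rate_hat = counts.mean() / dt

# Second-order property: autocovariance of the binned signal at lag 1;
# for independent Poisson increments it should be close to zero.
x = counts - counts.mean()
acov1 = np.dot(x[:-1], x[1:]) / (len(x) - 1)
print(round(rate_hat, 2), round(acov1, 3))
```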

  17. A review on diagnostic techniques for brucellosis | Kaltungo | African ...

    African Journals Online (AJOL)

    ... but has not been validated for standard laboratory use. This paper highlights useful samples and, especially the different conventional to more sophisticated molecular techniques for the diagnosis of brucellosis. Keywords: Brucellosis, diagnosis, techniques. African Journal of Biotechnology, Vol. 13(1), pp. 1-10, 1 January, ...

  18. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes. More details...
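A representative example of the core techniques such a textbook covers is an explicit finite-difference solution of the hillslope diffusion equation dz/dt = D d²z/dx²; the diffusivity, grid, time step, and initial scarp below are illustrative choices, not values from the book.

```python
import numpy as np

# Hillslope diffusion dz/dt = D d2z/dx2, explicit finite differences.
D = 0.01             # diffusivity, m^2/yr (illustrative)
dx, dt = 1.0, 10.0   # grid spacing (m) and time step (yr); needs dt < dx^2/(2D)
z = np.zeros(101)
z[40:61] = 10.0      # initial 10 m high plateau / fault scarp

for _ in range(2000):                # 20,000 years of relaxation
    curv = np.zeros_like(z)
    curv[1:-1] = (z[2:] - 2 * z[1:-1] + z[:-2]) / dx**2
    z[1:-1] += D * dt * curv[1:-1]   # fixed-elevation boundary cells

print(z.max() < 10.0)  # the crest has been lowered by diffusion
```

The stability number here is D·dt/dx² = 0.1, comfortably below the 0.5 limit of the explicit scheme.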

  19. Multibeam swath bathymetry signal processing techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Ranade, G.; Sudhakar, T.

    Mathematical advances and the advances in the real time signal processing techniques in the recent times, have considerably improved the state of art in the bathymetry systems. These improvements have helped in developing high resolution swath...

  20. Specific features of NDT data and processing algorithms: new remedies to old ills

    International Nuclear Information System (INIS)

    Georgel, B.

    1994-01-01

    Non destructive testing data from in-service inspections have specific features that require the most sophisticated techniques of signal and image processing. Each step in the overall information extraction process must be optimized by using recent approaches such like data decomposition and modelization, compression, sensor fusion and knowledge based systems. This can be achieved by means of wavelet transform, inverse problems formulation, standard compression algorithms, combined detection and estimation, neural networks and expert systems. These techniques are briefly presented through a number of Electricite de France applications or through recent literature results. (author). 1 fig., 20 refs

  1. Sophisticated Calculation of the 1oo4-architecture for Safety-related Systems Conforming to IEC61508

    International Nuclear Information System (INIS)

    Hayek, A; Al Bokhaiti, M; Schwarz, M H; Boercsoek, J

    2012-01-01

With the publication and enforcement of the standard IEC 61508 for safety-related systems, recent system architectures have been presented and evaluated. Among a number of techniques and measures for evaluating the safety integrity level (SIL) of safety-related systems, measures such as reliability block diagrams and Markov models are used to analyze the probability of failure on demand (PFD) and the mean time to failure (MTTF) in conformance with IEC 61508. The current paper deals with the quantitative analysis of the novel 1oo4 (one-out-of-four) architecture presented in recent work, and introduces sophisticated calculations for the required parameters. The 1oo4 architecture represents an advanced safety architecture based on on-chip redundancy which is 3-failure safe: at least one of the four channels has to work correctly in order to trigger the safety function.
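The flavour of such a PFD calculation can be sketched with the textbook simplified formula for a 1ooN group of identical channels. This sketch deliberately ignores the common-cause (beta-factor) and diagnostic-coverage terms that a full IEC 61508 treatment like the paper's would include, and the failure rate and proof-test interval are assumed values.

```python
# Simplified average probability of failure on demand (PFD_avg) for a 1ooN
# architecture of identical channels. Each channel fails dangerously
# undetected at rate lam; failures accumulate until a proof test at
# interval T, so PFD(t) ~ (lam*t)**N and the average over [0, T] is
# (lam*T)**N / (N + 1). Common-cause failures and diagnostics are ignored.

def pfd_avg_1ooN(lam: float, T: float, N: int) -> float:
    return (lam * T) ** N / (N + 1)

lam = 1e-6   # dangerous undetected failure rate per hour (assumed)
T = 8760.0   # one-year proof-test interval, hours (assumed)

for N in (1, 2, 3, 4):
    print(f"1oo{N}: PFD_avg = {pfd_avg_1ooN(lam, T, N):.3e}")
```

The 1oo4 value is orders of magnitude below 1oo1, which is the qualitative point of the redundant architecture; in practice common-cause terms dominate and flatten this gain.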

  2. Reacting to Neighborhood Cues?: Political Sophistication Moderates the Effect of Exposure to Immigrants

    DEFF Research Database (Denmark)

    Danckert, Bolette; Dinesen, Peter Thisted; Sønderskov, Kim Mannemar

    2017-01-01

    is founded on politically sophisticated individuals having a greater comprehension of news and other mass-mediated sources, which makes them less likely to rely on neighborhood cues as sources of information relevant for political attitudes. Based on a unique panel data set with fine-grained information...

  3. Power plant siting; an application of the nominal group process technique

    International Nuclear Information System (INIS)

    Voelker, A.H.

    1976-01-01

    The application of interactive group processes to the problem of facility siting is examined by this report. Much of the discussion is abstracted from experience gained in applying the Nominal Group Process Technique, an interactive group technique, to the identification and rating of factors important in siting nuclear power plants. Through this experience, interactive group process techniques are shown to facilitate the incorporation of the many diverse factors which play a role in siting. In direct contrast to mathematical optimization, commonly represented as the ultimate siting technique, the Nominal Group Process Technique described allows the incorporation of social, economic, and environmental factors and the quantification of the relative importance of these factors. The report concludes that the application of interactive group process techniques to planning and resource management will affect the consideration of social, economic, and environmental concerns and ultimately lead to more rational and credible siting decisions

  4. Advances in process research by radionuclide techniques

    International Nuclear Information System (INIS)

    Merz, A.; Vogg, H.

    1978-01-01

    Modifications and transformations of materials and their technical implementation in process systems require movement of materials. Radionuclide techniques can greatly help in understanding and describing these mechanisms. The specialized measuring technique is demonstrated by three examples selected from various fields of process technology. Radioactive tracer studies performed on a rotary kiln helped, inter alia, to establish a subdivision into process zones and to pinpoint areas of dust generation. Mixing and feeding actions were studied in a twin screw extruder equipped with a special screw and mixer disk arrangement. Tracer experiments conducted in two secondary settling basins indicate the differences in the mechanisms of movement of the aqueous phase if the mean residence time and the residence time distribution may be influenced not only by hydraulic loads, but also by design variants of the overflow flumes. (orig./HP) [de

  5. Classification of alarm processing techniques and human performance issues

    International Nuclear Information System (INIS)

    Kim, I.S.; O'Hara, J.M.

    1993-01-01

    Human factors reviews indicate that conventional alarm systems based on the one sensor, one alarm approach, have many human engineering deficiencies, a paramount example being too many alarms during major disturbances. As an effort to resolve these deficiencies, various alarm processing systems have been developed using different techniques. To ensure their contribution to operational safety, the impacts of those systems on operating crew performance should be carefully evaluated. This paper briefly reviews some of the human factors research issues associated with alarm processing techniques and then discusses a framework with which to classify the techniques. The dimensions of this framework can be used to explore the effects of alarm processing systems on human performance

  8. Signal Processing in Medical Ultrasound B-mode Imaging

    International Nuclear Information System (INIS)

    Song, Tai Kyong

    2000-01-01

Ultrasonic imaging is the most widely used modality among modern imaging devices for medical diagnosis, and system performance has improved dramatically since the early 1990s due to rapid advances in DSP performance and VLSI technology that made it possible to employ more sophisticated algorithms. This paper describes 'main stream' digital signal processing functions along with the associated implementation considerations in modern medical ultrasound imaging systems. Topics covered include signal processing methods for resolution improvement, ultrasound imaging system architectures, the roles and necessity of DSP and VLSI technology in the development of medical ultrasound imaging systems, and array signal processing techniques for ultrasound focusing.
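One of the focusing techniques referred to above, delay-and-sum receive beamforming, can be sketched as follows; the array geometry, pulse shape, and sampling values are illustrative, and the "echo" is simulated rather than measured.

```python
import numpy as np

# Delay-and-sum receive focusing: the core beamforming step in B-mode
# imaging, on simulated channel data (all values illustrative).
c = 1540.0                                  # speed of sound in tissue, m/s
fs = 40e6                                   # sampling rate, Hz
elements_x = np.linspace(-0.01, 0.01, 32)   # 32-element linear array, m
focus = np.array([0.0, 0.03])               # focal point 30 mm deep

# Simulate element signals: the echo from the focal point arrives at each
# element with a geometry-dependent delay.
t = np.arange(2048) / fs
dist = np.hypot(elements_x - focus[0], focus[1])
delays = (focus[1] + dist) / c              # transmit depth + receive path
pulse = lambda tt: np.sin(2 * np.pi * 5e6 * tt) * np.exp(-(tt * 5e6) ** 2)
rf = np.array([pulse(t - d) for d in delays])

# Focusing: undo each element's geometric delay, then sum coherently.
aligned = np.array([np.interp(t, t - d, ch) for d, ch in zip(delays, rf)])
beamformed = aligned.sum(axis=0)

print(beamformed.max() > rf.max())  # coherent gain over any single channel
```

After delay compensation the 32 channels add in phase, so the focused sum is far larger than any single channel, which is exactly the SNR/resolution gain receive focusing buys.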

  9. Natural language processing techniques for automatic test ...

    African Journals Online (AJOL)

    Natural language processing techniques for automatic test questions generation using discourse connectives. ... PROMOTING ACCESS TO AFRICAN RESEARCH. AFRICAN JOURNALS ... Journal of Computer Science and Its Application.

  10. Application of on-line analytical processing technique in accelerator

    International Nuclear Information System (INIS)

    Xie Dong; Li Weimin; He Duohui; Liu Gongfa; Xuan Ke

    2005-01-01

A method of applying the on-line analytical processing technique to an accelerator is described, covering data pre-processing, construction of the data warehouse, and the on-line analytical processing itself. (authors)

  11. Potential of modern sonographic techniques in paediatric uroradiology

    Energy Technology Data Exchange (ETDEWEB)

    Riccabona, Michael E-mail: michael.riccabona@kfunigraz.ac.at

    2002-08-01

Objective: To describe the potential of modern sonographic techniques in paediatric uroradiology. Method: Ultrasound (US)--now being the primary imaging tool--has revolutionised imaging diagnostics of the urinary tract. Constant developments and technical refinements have secured the role of US in uroradiology. Colour Doppler Sonography (CDS) and innovative applications such as the transperineal approach or application of m-mode US to the urinary tract have helped to develop US from just a basic tool to a sophisticated and respected method. The ongoing introduction of new and even more sophisticated methods further enhances the sonographic potential, which shall be demonstrated by a more detailed discussion of these methods. Results: Harmonic imaging, extended field of view US, amplitude coded CDS, echo-enhanced US, and three-dimensional US as the most recent new sonographic techniques are successfully applicable to paediatric urinary tract disease. They improve sonographic diagnosis in many conditions, such as detection of vesico-ureteral reflux, renal parenchymal volume assessment, comprehensive visualisation of hydronephrosis and complex pathology, evaluation of renal perfusional disturbances or defects, superior documentation with improved comparability for follow-up, or simply by offering clearer tissue delineation and differentiation. Conclusion: Modern US techniques are successfully applicable to neonates, infants, and children, further boosting the value of US in the paediatric urinary tract. However, as handling became more sophisticated, and artefacts have to be considered, modern urosonography became not only a more powerful, but also a more demanding method, with the need for expert knowledge and dedicated training.

  13. Financial Sophistication and the Distribution of the Welfare Cost of Inflation

    OpenAIRE

    Paola Boel; Gabriele Camera

    2009-01-01

    The welfare cost of anticipated inflation is quantified in a calibrated model of the U.S. economy that exhibits tractable equilibrium dispersion in wealth and earnings. Inflation does not generate large losses in societal welfare, yet its impact varies noticeably across segments of society depending also on the financial sophistication of the economy. If money is the only asset, then inflation hurts mostly the wealthier and more productive agents, while those poorer and less productive may ev...

  14. Development of process diagnostic techniques for piping and equipment

    International Nuclear Information System (INIS)

    Yotsutsuji, Mitoshi

    1987-01-01

What is required for using the facilities of a plant over a long period with confidence is to quantitatively grasp their present condition and to take the necessary measures in advance. For this purpose, diagnostic techniques that quickly and accurately detect the condition of facilities are necessary, and the development of process diagnostic techniques has been desired. The process diagnostic techniques discussed here are those for diagnosing the contamination, clogging and performance of towers, tanks, heat exchangers and other equipment. Idemitsu Engineering Co. developed a simplified diagnostic instrument for detecting the state of fouling in piping in 1982: a gamma-ray transmission gauge named the Scale Checker. By further improving it, process diagnostic techniques for piping and equipment were developed. This report covers the course of development and examination, the principle of detection, the constitution, and the remodeling of the Scale Checker. As cases of process diagnosis in plant facilities, the diagnosis of clogging in process piping and the diagnosis of distillation tower performance were carried out; the contents and results of these cases are explained. (Kako, I.)
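The physics behind a gamma-ray transmission gauge of the Scale Checker kind reduces to Beer-Lambert attenuation, I = I0·exp(-mu·x): a deposit layer in the beam path shows up as an extra drop in transmitted intensity, from which its thickness can be inferred. The numbers below are illustrative, not instrument data.

```python
import math

# Beer-Lambert inversion for deposit thickness (illustrative values).
mu_scale = 0.2           # linear attenuation coefficient of the deposit, 1/cm
I0, I = 10000.0, 8500.0  # transmitted counts without / with deposit in path

x = -math.log(I / I0) / mu_scale   # inferred deposit thickness, cm
print(round(x, 2))  # 0.81
```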

  15. Tuning of PID controller using optimization techniques for a MIMO process

    Science.gov (United States)

    Thulasi dharan, S.; Kavyarasan, K.; Bagyaveereswaran, V.

    2017-11-01

In this paper, two processes are considered: the quadruple tank process and the CSTR (Continuous Stirred Tank Reactor) process. Both are widely used in industrial applications across various domains; the CSTR, in particular, is common in chemical plants. First, a mathematical model of each process is derived, followed by linearization, since both are MIMO processes. The controllers are the means of driving the whole process to the desired operating point for a given application, so tuning of the controller plays a major role in the overall process. For tuning the parameters we use two optimization techniques, Particle Swarm Optimization and Genetic Algorithms, which are widely applied to obtain the best-tuned values among many candidates. Finally, we compare the performance of each process under both techniques.
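A minimal sketch of controller tuning by a stochastic optimizer: plain random search stands in for the paper's PSO/GA, and a first-order process stands in for the CSTR model. The cost is the integral of squared error (ISE) for a unit step setpoint; all gains, ranges, and plant constants are illustrative.

```python
import random

# Tune a PI controller on dy/dt = (-y + K*u)/tau by minimising ISE for a
# unit step setpoint, using random search as a stand-in for PSO/GA.

def ise(kp: float, ki: float, K: float = 2.0, tau: float = 5.0,
        dt: float = 0.05, steps: int = 2000) -> float:
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y                    # error against unit setpoint
        integ += e * dt
        u = kp * e + ki * integ        # PI control law
        y += dt * (-y + K * u) / tau   # Euler step of the process model
        cost += e * e * dt
    return cost

random.seed(1)
candidates = [(random.uniform(0, 10), random.uniform(0, 2)) for _ in range(300)]
best = min(candidates, key=lambda g: ise(*g))
print(ise(*best) < ise(1.0, 0.0))  # tuned gains beat a naive untuned guess
```

PSO or a GA would replace the independent random draws with a guided population search over the same cost surface, but the evaluation loop is identical.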

  16. Recent Advances in Techniques for Hyperspectral Image Processing

    Science.gov (United States)

Plaza, Antonio; Benediktsson, Jon Atli; Boardman, Joseph W.; Brazile, Jason; Bruzzone, Lorenzo; Camps-Valls, Gustavo; Chanussot, Jocelyn; Fauvel, Mathieu; Gamba, Paolo; Gualtieri, Anthony; et al.

    2009-01-01

Imaging spectroscopy, also known as hyperspectral imaging, has been transformed in less than 30 years from being a sparse research tool into a commodity product available to a broad user community. Currently, there is a need for standardized data processing techniques able to take into account the special properties of hyperspectral data. In this paper, we provide a seminal view on recent advances in techniques for hyperspectral image processing. Our main focus is on the design of techniques able to deal with the high-dimensional nature of the data, and to integrate the spatial and spectral information. Performance of the discussed techniques is evaluated in different analysis scenarios. To satisfy time-critical constraints in specific applications, we also develop efficient parallel implementations of some of the discussed algorithms. Combined, these parts provide an excellent snapshot of the state-of-the-art in those areas, and offer a thoughtful perspective on future potentials and emerging challenges in the design of robust hyperspectral imaging algorithms.
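One of the standard dimensionality-handling steps such surveys discuss, PCA over the spectral axis, can be sketched on a synthetic scene; the two endmember spectra, mixing model, and noise level below are fabricated for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic 20x20x64 hyperspectral cube: linear mixture of two endmembers.
bands = np.linspace(0, 1, 64)
grass = np.exp(-((bands - 0.3) / 0.1) ** 2)
soil = 0.4 + 0.5 * bands
abund = rng.uniform(0, 1, (20, 20, 1))        # per-pixel mixing fraction
cube = abund * grass + (1 - abund) * soil + rng.normal(0, 0.01, (20, 20, 64))

# Flatten pixels, mean-centre, project onto the leading principal component.
X = cube.reshape(-1, 64)
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = (s ** 2) / (s ** 2).sum()

k = 1  # one mixing degree of freedom in this two-endmember scene
scores = Xc @ Vt[:k].T                        # 400 pixels x k features
print(explained[0] > 0.9, scores.shape)
```

With only one mixing degree of freedom, a single component captures nearly all the variance; real scenes need more components and, as the survey stresses, spatial information as well.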

  17. Future trends in power plant process computer techniques

    International Nuclear Information System (INIS)

    Dettloff, K.

    1975-01-01

    The development of new process computer concepts has advanced in great steps, in three areas: hardware, software, and the application concept. In hardware, new computers with new peripherals, such as colour layer equipment, have been developed. In software, a decisive step has been made in the area of 'automation software'. Through these components, progress has also been made in incorporating the process computer into the structure of the overall power plant control technique. (orig./LH) [de

  18. [The process of professional qualification for the critical care nurse].

    Science.gov (United States)

    Santana, Neuranides; Fernandes, Josicélia Dumêt

    2008-01-01

    A qualitative study based on dialectical historical materialism, aimed at analyzing the configuration of the professional credentialing process of critical care nurses at a hospital in Salvador, BA, Brazil. The subjects were 29 nurses. The analysis was based on Content Analysis, using the Thematic Analysis technique and guided by the dialectical method. Three categories related to credentialing emerged: technological sophistication; the individual and the collective organization; and credentialing as product and instrument of the work process. The results showed that the institution stimulates the credentialing process; however, administrative policies hinder its effective implementation for the nurses.

  19. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  20. Putin’s Russia: Russian Mentality and Sophisticated Imperialism in Military Policies

    OpenAIRE

    Szénási, Lieutenant-Colonel Endre

    2016-01-01

    In my experience, the Western world hopelessly fails to understand Russian mentality, or misinterprets it. During my analysis of the Russian way of thinking I devoted special attention to the examination of military mentality. I have connected the issue of the Russian way of thinking to the contemporary imperial policies of Putin’s Russia. I have also attempted to prove the level of sophistication of both. I hope that a better understanding of both the Russian mentality and imperi...

  1. Document Examination: Applications of Image Processing Systems.

    Science.gov (United States)

    Kopainsky, B

    1989-12-01

    Dealing with images is a familiar business for an expert in questioned documents: microscopic, photographic, infrared, and other optical techniques generate images containing the information he or she is looking for. A recent method for extracting most of this information is digital image processing, ranging from simple contrast and contour enhancement to the advanced restoration of blurred texts. When combined with a sophisticated physical imaging system, an image processing system has proven to be a powerful and fast tool for routine non-destructive scanning of suspect documents. This article reviews frequent applications, comprising techniques to increase legibility, two-dimensional spectroscopy (ink discrimination, alterations, erased entries, etc.), comparison techniques (stamps, typescript letters, photo substitution), and densitometry. Computerized comparison of handwriting is not included. Copyright © 1989 Central Police University.

  2. Recent advances in electronic nose techniques for monitoring of fermentation process.

    Science.gov (United States)

    Jiang, Hui; Zhang, Hang; Chen, Quansheng; Mei, Congli; Liu, Guohai

    2015-12-01

    Microbial fermentation processes are often sensitive to even slight changes of conditions that may result in unacceptable end-product quality. Thus, monitoring of the process is critical for discovering unfavorable deviations as early as possible and taking the appropriate measures. However, the use of traditional analytical techniques is often time-consuming and labor-intensive. In this sense, the most effective way of developing rapid, accurate and relatively economical methods for quality assurance in microbial fermentation is the use of novel chemical sensor systems. Electronic nose techniques have particular advantages for non-invasive monitoring of microbial fermentation processes. Therefore, in this review, we present an overview of the most important contributions dealing with quality control in microbial fermentation using electronic nose techniques. After a brief description of the fundamentals of the sensor techniques, some examples of potential monitoring applications are provided, including the implementation of control strategies and the combination with other monitoring tools (i.e. sensor fusion). Finally, on the basis of the review, the electronic nose techniques are critically assessed, and their strengths and weaknesses highlighted. In addition, on the basis of the observed trends, we also outline the technical challenges and future outlook for electronic nose techniques.
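
    As a minimal illustration of the pattern-recognition step such sensor systems rely on (the six-sensor readings and the two fermentation states below are invented for the sketch), principal component analysis projects multi-sensor responses into a low-dimensional space where process states separate:

```python
import numpy as np

def pca_scores(X, n_components=2):
    """Project sensor-array responses onto their leading principal
    components, a usual first step in e-nose pattern recognition."""
    Xc = X - X.mean(axis=0)                      # center each sensor channel
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T              # scores in PC space

# hypothetical readings from a 6-sensor array under two fermentation states
rng = np.random.default_rng(42)
normal = rng.normal([1.0, 0.8, 0.5, 1.2, 0.9, 0.4], 0.05, size=(20, 6))
deviant = rng.normal([1.6, 0.3, 0.9, 0.7, 1.4, 0.8], 0.05, size=(20, 6))
scores = pca_scores(np.vstack([normal, deviant]))
```

    On this toy data the first principal component cleanly separates the two states, which is the basis for the monitoring and classification schemes the review surveys.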

  3. Technical and economic benefits of nuclear techniques in ore processing

    International Nuclear Information System (INIS)

    1989-08-01

    This report is the outcome of an Advisory Group Meeting organized by the Agency and hosted by the Institute of Physics and Nuclear Techniques, the Academy of Mining and Metallurgy in Krakow, Poland. The purpose of the meeting was to assess the technical and economic benefits of applying nuclear techniques in ore processing industry. Nucleonic control systems and nuclear on-line analytical techniques as well as radioisotope tracer tests and their applications in metallic ore-processing, coal production, and cement fabrication were discussed. This report contains a summary and the presentations dealing with nuclear techniques for process control made at this meeting. Using a number of case-histories as examples, it illustrates technical and economic benefits obtainable by the installation of nuclear process control instrumentation. It is expected to be useful for everybody dealing with ore and coal production, but especially for administrative personnel and engineers who plan and implement national development programmes related to mineral resources. Refs, figs and tabs

  4. The relation between maturity and sophistication shall be properly dealt with in nuclear power development

    International Nuclear Information System (INIS)

    Li Yongjiang

    2009-01-01

    The paper analyses the advantages and disadvantages, in terms of safety and economy, of the second generation improved technologies and the third generation technologies mainly developed in China. The paper also discusses the maturity of the second generation improved technologies and the sophistication of the third generation technologies. It proposes that the advantages and disadvantages of both should be carefully weighed and that the relationship between maturity and sophistication should be properly dealt with at the current stage. A two-step strategy should be adopted to solve the problem of insufficient nuclear power capacity while tracking and developing the third generation technologies, so as to ensure the sound and fast development of nuclear power. (authors)

  5. Spectroscopic analysis technique for arc-welding process control

    Science.gov (United States)

    Mirapeix, Jesús; Cobo, Adolfo; Conde, Olga; Quintela, María Ángeles; López-Higuera, José-Miguel

    2005-09-01

    The spectroscopic analysis of the light emitted by thermal plasmas has found many applications, from chemical analysis to monitoring and control of industrial processes. In particular, it has been demonstrated that analysis of the thermal plasma generated during arc or laser welding can supply information about the process and, thus, about the quality of the weld. In some critical applications (e.g. the aerospace sector), an early, real-time detection of defects in the weld seam (oxidation, porosity, lack of penetration, ...) is highly desirable as it can reduce expensive non-destructive testing (NDT). Among other techniques, full spectroscopic analysis of the plasma emission is known to offer rich information about the process itself, but it is also very demanding in terms of real-time implementation. In this paper, we propose a technique for the analysis of the plasma emission spectrum that is able to detect, in real time, changes in the process parameters that could lead to the formation of defects in the weld seam. It is based on the estimation of the electronic temperature of the plasma through the analysis of the emission peaks from multiple atomic species. Unlike traditional techniques, which usually involve peak fitting to Voigt functions using the Levenberg-Marquardt recursive method, we employ the LPO (Linear Phase Operator) sub-pixel algorithm to accurately estimate the central wavelength of the peaks (allowing automatic identification of each atomic species) and cubic-spline interpolation of the noisy data to obtain the intensity and width of the peaks. Experimental tests on TIG welding, using fiber-optic capture of light and a low-cost CCD-based spectrometer, show that some typical defects can be easily detected and identified with this technique, whose typical processing time for multiple peak analysis is less than 20 ms running on a conventional PC.
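
    The electronic-temperature estimate at the heart of such approaches can be sketched with the classic two-line Boltzmann method, assuming local thermodynamic equilibrium. The line data and intensities below are hypothetical placeholders, not values from the paper:

```python
import math

K_B_EV = 8.617e-5  # Boltzmann constant in eV/K

def two_line_temperature(i1, i2, line1, line2):
    """Estimate the plasma excitation temperature (K) from the relative
    intensity of two emission lines of the same species, assuming LTE.
    Each line is (wavelength_nm, transition_probability_A, statistical_weight_g,
    upper_level_energy_eV)."""
    lam1, a1, g1, e1 = line1
    lam2, a2, g2, e2 = line2
    # Boltzmann relation: I ~ (A * g / lambda) * exp(-E / (k * T)),
    # so the intensity ratio isolates T.
    ratio = (i1 * a2 * g2 * lam1) / (i2 * a1 * g1 * lam2)
    return (e2 - e1) / (K_B_EV * math.log(ratio))

# hypothetical line data and measured intensities (NOT tabulated values)
line_a = (700.0, 1.0e7, 3, 13.0)
line_b = (720.0, 1.0e7, 3, 13.5)
t_plasma = two_line_temperature(100.0, 20.0, line_a, line_b)
```

    Using emission peaks from multiple atomic species, as the paper does, amounts to fitting this same Boltzmann relation over more than two points, which makes the estimate less sensitive to noise on any single peak.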

  6. Handbook of thin film deposition processes and techniques principles, methods, equipment and applications

    CERN Document Server

    Seshan, Krishna

    2002-01-01

    New second edition of the popular book on deposition (first edition by Klaus Schruegraf) for engineers, technicians, and plant personnel in the semiconductor and related industries. This book traces the technology behind the spectacular growth in the silicon semiconductor industry and the continued trend in miniaturization over the last 20 years. This growth has been fueled in large part by improved thin film deposition techniques and the development of highly specialized equipment to enable this deposition. The book includes much cutting-edge material. Entirely new chapters on contamination and contamination control describe the basics and the issues: as feature sizes shrink to sub-micron dimensions, cleanliness and particle elimination have to keep pace. A new chapter on metrology explains the growth of sophisticated, automatic tools capable of measuring thickness and spacing of sub-micron dimensions. The book also covers PVD, laser and e-beam assisted deposition, MBE, and ion beam methods to bring together a...

  7. Nurturing Opportunity Identification for Business Sophistication in a Cross-disciplinary Study Environment

    Directory of Open Access Journals (Sweden)

    Karine Oganisjana

    2012-12-01

    Full Text Available Opportunity identification is the key element of the entrepreneurial process; therefore developing this skill in students is a crucial task in contemporary European education, which has recognized entrepreneurship as one of the lifelong learning key competences. The earlier opportunity identification becomes a habitual way of thinking and behaving across a broad range of contexts, the more likely it is that an entrepreneurial disposition will steadily reside in students. In order to nurture opportunity identification in students so that they are able to organize sophisticated businesses in the future, certain demands ought to be put on the teacher as well, the person who is to promote these qualities in students. The paper reflects some findings of research conducted within the framework of a workplace learning project for the teachers of a secondary school in Riga (Latvia). The main goal of the project was to teach the teachers to identify hidden inner links between apparently unrelated things, phenomena and events within the 10th grade study curriculum, connect them together and create new opportunities. The creation and solution of cross-disciplinary tasks were the means for achieving this goal.

  8. Demonstration of laser processing technique combined with water jet technique for retrieval of fuel debris at Fukushima Daiichi Nuclear Power Station

    International Nuclear Information System (INIS)

    Hanari, Toshihide; Takebe, Toshihiko; Yamada, Tomonori; Daido, Hiroyuki; Ishizuka, Ippei; Ohmori, Shinya; Kurosawa, Koichi; Sasaki, Go; Nakada, Masahiro; Sakai, Hideaki

    2017-01-01

    In the decommissioning of Fukushima Daiichi Nuclear Power Station, the retrieval of fuel debris from the Primary Containment Vessel by remote operation is one of the key issues. In this process, preventing the spread of radioactive materials is an important consideration, and any applicable technique must also maintain reasonable processing efficiency. We propose a combined technique using a laser beam and a water jet for retrieval of the fuel debris. The laser processing technique combined with a repetitive pulsed water jet could perform efficient retrieval processing. Our experimental results encourage us to promote further development of the technique towards real application at Fukushima Daiichi Nuclear Power Station. (author)

  9. Experimental data processing techniques by a personal computer

    International Nuclear Information System (INIS)

    Matsuura, Kiyokata; Tsuda, Kenzo; Abe, Yoshihiko; Kojima, Tsuyoshi; Nishikawa, Akira; Shimura, Hitoshi; Hyodo, Hiromi; Yamagishi, Shigeru.

    1989-01-01

    A personal computer (16-bit, about 1 MB memory) can be used at low cost for experimental data processing. This report surveys the important techniques of A/D and D/A conversion and of display, storage and transfer of experimental data. Items to be considered in the software are also discussed. Practical programs written in BASIC and assembler are given as examples. We present some techniques to speed up processing in BASIC and show that a system combining BASIC and assembler is useful in practical experiments. System performance, such as processing speed and flexibility in setting operating conditions, depends strongly on the programming language. We have tested processing speed with some typical programming languages: BASIC (interpreted), C, FORTRAN and assembler. For calculation, FORTRAN has the best performance, comparable to or better than assembler even on a personal computer. (author)

  10. Image processing techniques for remote sensing data

    Digital Repository Service at National Institute of Oceanography (India)

    RameshKumar, M.R.

    interpretation and for processing of scene data for autonomous machine perception. The techniques of digital image processing are used for automatic character/pattern recognition, industrial robots for product assembly and inspection, military reconnaissance... and spatial co-ordinates into discrete components. The mathematical concepts involved are sampling and transform theory. Two-dimensional transforms are used for image enhancement, restoration, encoding and description too. The main objective of the image...
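
    The role of the two-dimensional transforms mentioned here can be illustrated with a minimal FFT-based enhancement sketch (toy data; the cutoff choice is arbitrary and the scene is invented):

```python
import numpy as np

def fft_lowpass(image, cutoff=0.2):
    """2-D Fourier-domain smoothing: transform, zero every frequency
    component beyond `cutoff` (as a fraction of Nyquist), transform back."""
    F = np.fft.fftshift(np.fft.fft2(image))
    h, w = image.shape
    y, x = np.ogrid[-(h // 2):h - h // 2, -(w // 2):w - w // 2]
    mask = (x / (w / 2)) ** 2 + (y / (h / 2)) ** 2 <= cutoff ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))

# toy scene: a low-frequency pattern corrupted by wide-band noise
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 3 * np.arange(64) / 64)[None, :] * np.ones((64, 1))
noisy = clean + 0.1 * rng.normal(size=clean.shape)
smooth = fft_lowpass(noisy, cutoff=0.2)
```

    The same transform-domain machinery supports the restoration and encoding uses listed above; only the mask (or more generally the filter applied to the spectrum) changes.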

  11. Reasoning about objects using process calculus techniques

    DEFF Research Database (Denmark)

    Kleist, Josva

    This thesis investigates the applicability of techniques known from the world of process calculi to reason about properties of object-oriented programs. The investigation is performed upon a small object-oriented language - The Sigma-calculus of Abadi and Cardelli. The investigation is twofold: We......-calculus turns out to be insufficient. Based on our experiences, we present a translation of a typed imperative Sigma-calculus, which looks promising. We are able to provide simple proofs of the equivalence of different Sigma-calculus objects using this translation. We use a labelled transition system adapted...... to the Sigma-calculus to investigate the use of process calculi techniques directly on the Sigma-calculus. The results obtained are of a fairly theoretical nature. We investigate the connection between the operational and denotational semantics for a typed functional Sigma-calculus. The result is that Abadi...

  12. Roman sophisticated surface modification methods to manufacture silver counterfeited coins

    Science.gov (United States)

    Ingo, G. M.; Riccucci, C.; Faraldi, F.; Pascucci, M.; Messina, E.; Fierro, G.; Di Carlo, G.

    2017-11-01

    By means of the combined use of X-ray photoelectron spectroscopy (XPS), optical microscopy (OM) and scanning electron microscopy (SEM) coupled with energy dispersive X-ray spectroscopy (EDS), the surface and subsurface chemical and metallurgical features of silver counterfeited Roman Republican coins are investigated to decipher some aspects of the manufacturing methods and to evaluate the technological ability of the Roman metallurgists to produce thin silver coatings. The results demonstrate that over 2000 years ago important advances in the technology of thin layer deposition on metal substrates were attained by the Romans. The ancient metallurgists produced counterfeited coins by combining sophisticated micro-plating methods and tailored surface chemical modification based on the mercury-silvering process. The results reveal that the Romans were able to chemically and metallurgically manipulate alloys systematically at the micro scale to produce adherent precious metal layers with a uniform thickness of up to a few micrometers. The results converge to reveal that the production of forgeries was aimed firstly at saving expensive metals as much as possible, allowing profitable large-scale production at a lower cost. The driving forces could have been a lack of precious metals, an unexpected need to circulate coins for trade, and/or a combination of social, political and economic factors that required a change in money supply. Finally, some information on corrosion products has been obtained that is useful for selecting materials and methods for the conservation of these important witnesses of technology and economy.

  13. A new processing technique for airborne gamma-ray data

    DEFF Research Database (Denmark)

    Hovgaard, Jens

    1997-01-01

    The mathematical-statistical background for a new technique for processing gamma-ray spectra is presented. The technique - Noise Adjusted Singular Value Decomposition - decomposes a set of gamma-ray spectra into a few basic spectra - the spectral components. The spectral components can be proce...
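
    A minimal sketch of the NASVD idea, under the usual assumption that the Poisson counting noise can be approximately whitened by scaling each channel with the square root of the mean spectrum (the spectra below are invented toy data, not from the paper):

```python
import numpy as np

def nasvd(spectra, n_components=3):
    """Noise Adjusted Singular Value Decomposition (sketch).
    Each row is a measured gamma-ray spectrum (counts per channel).
    Poisson noise is whitened by dividing each channel by the square root
    of the mean spectrum; an SVD then concentrates the signal in a few
    leading components, and transforming back yields denoised spectra."""
    mean = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean, 1e-12))     # avoid division by zero
    U, s, Vt = np.linalg.svd(spectra / scale, full_matrices=False)
    approx = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    return approx * scale                        # undo the noise adjustment

# toy data: one underlying spectral shape, Poisson counting noise
rng = np.random.default_rng(0)
shape = np.exp(-np.linspace(0, 4, 128))            # falling continuum
true = np.outer(50 + 10 * rng.random(200), shape)  # 200 noiseless spectra
noisy = rng.poisson(true).astype(float)
denoised = nasvd(noisy, n_components=1)
```

    Because the toy data really is rank one, a single spectral component recovers the signal and discards most of the counting noise, which is the behaviour the technique exploits on airborne survey data.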

  14. Application of nonlinear reduction techniques in chemical process modeling: a review

    International Nuclear Information System (INIS)

    Muhaimin, Z; Aziz, N.; Abd Shukor, S.R.

    2006-01-01

    Model reduction techniques have been used widely in engineering fields, in electrical and mechanical as well as chemical engineering. The basic idea of a reduction technique is to replace the original system by an approximating system of much smaller state-space dimension. A reduced-order model is more practical for process control in industrial settings. This paper provides a review of applications of nonlinear reduction techniques in chemical processes. The advantages and disadvantages of each technique reviewed are also highlighted.
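
    As one concrete illustration of the basic idea (a generic projection-based reduction, not a technique singled out by the review), Proper Orthogonal Decomposition replaces the original state space by the span of a few dominant snapshot modes; the 50-state linear process below is invented toy data:

```python
import numpy as np

def pod_basis(snapshots, n_modes):
    """Proper Orthogonal Decomposition: the SVD of a snapshot matrix
    gives an orthonormal basis capturing most of the state variance.
    Returns the basis and the fraction of variance ('energy') retained."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    energy = (s[:n_modes] ** 2).sum() / (s ** 2).sum()
    return U[:, :n_modes], energy

# toy high-order linear process dx/dt = A x, simulated by Euler stepping
rng = np.random.default_rng(0)
n = 50
A = -np.eye(n) + 0.1 * rng.normal(size=(n, n))
x = rng.normal(size=n)
snaps = []
for _ in range(200):
    x = x + 0.01 * (A @ x)
    snaps.append(x.copy())
snaps = np.array(snaps).T            # (n_states, n_snapshots)

Phi, energy = pod_basis(snaps, n_modes=5)
A_r = Phi.T @ A @ Phi                # reduced 5x5 operator via projection
```

    The reduced operator `A_r` evolves only 5 coordinates instead of 50; nonlinear variants project the nonlinear right-hand side through the same basis.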

  15. Actinide recovery techniques utilizing electromechanical processes

    International Nuclear Information System (INIS)

    Westphal, B.R.; Benedict, R.W.

    1994-01-01

    Under certain conditions, the separation of actinides using electromechanical techniques may be an effective means of residue processing. The separation of granular mixtures of actinides and other materials is based on appreciable differences in the magnetic and electrical properties of the actinide elements. In addition, the high density of actinides, particularly uranium and plutonium, may render a simultaneous separation based on mutually complementary parameters. Both high intensity magnetic separation and electrostatic separation have been investigated for the concentration of an actinide waste stream. Waste stream constituents include an actinide metal alloy and broken quartz shards. The investigation of these techniques is in support of the Integral Fast Reactor (IFR) concept currently being developed at Argonne National Laboratory under the auspices of the Department of Energy

  16. When not to copy: female fruit flies use sophisticated public information to avoid mated males

    Science.gov (United States)

    Loyau, Adeline; Blanchet, Simon; van Laere, Pauline; Clobert, Jean; Danchin, Etienne

    2012-10-01

    Semen limitation (lack of semen to fertilize all of a female's eggs) imposes high fitness costs on female partners. Females should therefore avoid mating with semen-limited males. This can be achieved by using public information extracted from watching individual males' previous copulating activities. This adaptive preference should be flexible, given that semen limitation is temporary. We first demonstrate that the number of offspring produced by male Drosophila melanogaster gradually decreases over successive copulations. We then show that females avoid mating with males they just watched copulating and that visual public cues are sufficient to elicit this response. Finally, after males were given time to replenish their sperm reserves, females no longer avoided the males they had previously seen copulating. These results suggest that female fruit flies may have evolved sophisticated behavioural processes of resistance to semen-limited males, and demonstrate unsuspected adaptive context-dependent mate choice in an invertebrate.

  17. Close to the Clothes : Materiality and Sophisticated Archaism in Alexander van Slobbe’s Design Practices

    NARCIS (Netherlands)

    Baronian, M.-A.

    This article looks at the work of contemporary Dutch fashion designer Alexander van Slobbe (1959) and examines how, since the 1990s, his fashion practices have consistently and consciously put forward a unique reflection on questions related to materiality, sophisticated archaism, luxury,

  19. Hard X-ray techniques suitable for polymer experiments

    Energy Technology Data Exchange (ETDEWEB)

    Bras, W; Goossens, H; Goderis, B, E-mail: Wim.Bras@esrf.fr [Netherlands Organisation for Scientific Research (NWO) (Netherlands); DUBBLE-ESRF, BP 220, F38043 Grenoble Cedex (France); Department of Chemical Engineering and Chemistry, Eindhoven University of Technology, PO Box 513, 5600 MB Eindhoven (Netherlands); Molecular and Nanomaterials, Chemistry Department, Catholic University of Leuven, Celestijnenlaan 200F (Belgium)

    2010-11-15

    Polymers have been studied since 1979 with 8-12 keV synchrotron radiation X-ray scattering methods and the number and sophistication of the experiments have rapidly grown ever since. More recently, new experimental techniques have been developed that use softer or harder X-rays in less conventional ways. This article provides a brief overview of the possibilities of hard X-ray techniques and indicates some areas that might gain from further developments.

  1. Noncontaminating technique for making holes in existing process systems

    Science.gov (United States)

    Hecker, T. P.; Czapor, H. P.; Giordano, S. M.

    1972-01-01

    Technique is developed for making cleanly-contoured holes in assembled process systems without introducing chips or other contaminants into system. Technique uses portable equipment and does not require dismantling of system. Method was tested on Inconel, stainless steel, ASTMA-53, and Hastelloy X in all positions.

  2. Plasma transport studies using transient techniques

    International Nuclear Information System (INIS)

    Simonen, T.C.; Brower, D.L.; Efthimion, P.

    1991-01-01

    Selected topics from the Transient Transport sessions of the Transport Task Force Workshop, held February 19-23, 1990, in Hilton Head, South Carolina are summarized. Presentations on sawtooth propagation, ECH modulation, particle modulation, and H-mode transitions are included. The research results presented indicated a growing theoretical understanding and experimental sophistication in the application of transient techniques to transport studies. (Author)

  3. Processing techniques for data from the GKSS pressure suppression experiments

    International Nuclear Information System (INIS)

    Holman, G.S.; McCauley, E.W.

    1980-01-01

    This report describes techniques developed at LLNL for processing data from large-scale steam condensation experiments being performed by the GKSS Research Center in the Federal Republic of Germany. In particular, the computer code GKPLOT, a special evaluation program for generating time-history plots and numerical output files of GKSS data, will be discussed, together with tape handling techniques to unblock the data into a form compatible with the LLNL octopus computer network. Using these data processing techniques, we have provided a convenient means of independently examining and analyzing a very extensive data base for steam condensation phenomena. In addition, the techniques developed for handling the GKSS data are applicable to the treatment of similar, but perhaps differently structured, experimental data sets

  4. New developments in techniques for information processing in radionuclide imaging

    International Nuclear Information System (INIS)

    Di Paola, R.; Todd-Pokropek, A.E.; CEA, 91 - Orsay

    1981-01-01

    Processing of scintigraphic data has passed through different stages in the past fifteen years. After a 'euphoric' era when large off-line computer facilities were used to process very low-quality rectilinear scan pictures, a much more critical period followed the introduction of on-line minicomputer systems to acquire, process and visualize scintillation camera data. A selection of the available techniques that could improve the routine extraction of information from scintigraphic examinations is presented. Tomography has been excluded. As examples, the different techniques of (a) inhomogeneity correction of camera response and (b) respiratory motion correction are used to show one evolutionary process in the use of computer systems. Filtering has long been the major area of research in scintigraphic image processing. Only very simple (usually smoothing) filters are widely distributed. Little use of more 'powerful' filters on clinical data has been made, and very few serious evaluations have been published. Nevertheless, the number of installed minicomputer and microprocessor systems is increasing rapidly, although in general they perform tasks other than filtering. The reasons for this (relative) failure are examined. Some 'new' techniques of image processing are presented. The compression of scintigraphic information is important because of the expected need in the near future to handle large numbers of pictures, as in dynamic and tomographic studies. For dynamic information processing, the present methodology has been narrowed to those techniques that permit the entire 'data space' to be manipulated (as opposed to curve fitting after region-of-interest definition). 'Functional' imaging was the first step in this process. 'Factor analysis' could be the next. The results obtained by various research laboratories are reviewed. (author)

  5. A Snapshot of Serial Rape: An Investigation of Criminal Sophistication and Use of Force on Victim Injury and Severity of the Assault.

    Science.gov (United States)

    de Heer, Brooke

    2016-02-01

    Prior research on rapes reported to law enforcement has identified criminal sophistication and the use of force against the victim as possible unique identifiers of serial rape versus one-time rape. This study sought to contribute to the current literature on reported serial rape by investigating how the rapist's level of criminal sophistication and use of force were associated with two important outcomes of rape: victim injury and overall severity of the assault. In addition, it was evaluated whether rapist and victim ethnicity affected these relationships. A nation-wide sample of serial rape cases reported to law enforcement collected by the Federal Bureau of Investigation (FBI) was analyzed (108 rapists, 543 victims). Results indicated that serial rapists typically used a limited amount of force against the victim and displayed a high degree of criminal sophistication. In addition, the more criminally sophisticated the perpetrator was, the more sexual acts he performed on his victim. Finally, rapes involving a White rapist and White victim were found to exhibit higher levels of criminal sophistication and were more severe in terms of the number and types of sexual acts committed. These findings provide a more in-depth understanding of serial rape that can inform both academics and practitioners in the field about contributors to victim injury and severity of the assault. © The Author(s) 2014.

  6. Grapefruit (Citrus paradisi Macfad) phytochemicals composition is modulated by household processing techniques.

    Science.gov (United States)

    Uckoo, Ram M; Jayaprakasha, Guddadarangavvanahally K; Balasubramaniam, V M; Patil, Bhimanagouda S

    2012-09-01

    Grapefruits (Citrus paradisi Macfad) contain several phytochemicals known to have health-maintaining properties. Given consumer interest in obtaining high levels of these phytochemicals, it is important to understand how their levels change under common household processing techniques. Therefore, mature Texas "Rio Red" grapefruits were processed by common household practices such as blending, juicing, and hand squeezing, and analyzed for their phytochemical content by high performance liquid chromatography (HPLC). Results suggest that grapefruit juice processed by blending had significantly higher levels of flavonoids (narirutin, naringin, hesperidin, neohesperidin, didymin, and poncirin) and limonin compared to juicing and hand squeezing. No significant variation in their content was noticed between juice processed by juicing and by hand squeezing. Ascorbic acid and citric acid were significantly higher in juice processed by juicing and blending, respectively. Furthermore, hand squeezed fruit juice had significantly higher contents of dihydroxybergamottin (DHB) than juice processed by juicing and blending. Bergamottin and 5-methoxy-7 gernoxycoumarin (5-M-7-GC) were significantly higher in blended juice compared to juicing and hand squeezing. Therefore, consuming grapefruit juice processed by blending may provide higher levels of health beneficial phytochemicals such as naringin, narirutin, and poncirin. In contrast, juice processed by hand squeezing and juicing provides lower levels of limonin, bergamottin, and 5-M-7-GC. These results suggest that processing techniques significantly influence the levels of phytochemicals, and blending is a better technique for obtaining higher levels of health beneficial phytochemicals from grapefruits. Practical Application: Blending, squeezing, and juicing are common household processing techniques used for obtaining fresh grapefruit juice. Understanding the levels of health beneficial phytochemicals

  7. Actinide recovery techniques utilizing electromechanical processes

    International Nuclear Information System (INIS)

    Westphal, B.R.; Benedict, R.W.

    1994-01-01

    Under certain conditions, the separation of actinides using electromechanical techniques may be an effective means of residue processing. The separation of granular mixtures of actinides and other materials discussed in this report is based on appreciable differences in the magnetic and electrical properties of the actinide elements. In addition, the high density of actinides, particularly uranium and plutonium, may render a simultaneous separation based on mutually complementary parameters. Both high intensity magnetic separation and electrostatic separation have been investigated for the concentration of an actinide waste stream. Waste stream constituents include an actinide metal alloy and broken quartz shards. The investigation of these techniques is in support of the Integral Fast Reactor (IFR) concept currently being developed at Argonne National Laboratory under the auspices of the Department of Energy

  8. Electrochemical Techniques in Textile Processes and Wastewater Treatment

    Directory of Open Access Journals (Sweden)

    Mireia Sala

    2012-01-01

    Full Text Available The textile industry uses electrochemical techniques both in textile processes (such as manufacturing fibers, dyeing processes, and decolorizing fabrics) and in wastewater treatment (color removal). Electrochemical reduction reactions are mostly used in sulfur and vat dyeing, but in some cases they are applied to effluent discoloration. However, the main applications of electrochemical treatments in the textile sector are based on oxidation reactions. Most electrochemical oxidation processes involve indirect reactions which imply the generation of hypochlorite or hydroxyl radicals in situ. These electrogenerated species are able to bleach indigo-dyed denim fabrics and to degrade dyes in wastewater in order to achieve effluent color removal. The aim of this paper is to review the electrochemical techniques applied to the textile industry; in particular, they are an efficient method to remove color from textile effluents. Reuse of the discolored effluent is possible, which implies an important saving of salt and water (i.e., by means of the “UVEC Cell”).

  9. Risk-assessment techniques and the reactor licensing process

    International Nuclear Information System (INIS)

    Levine, S.

    1979-01-01

    A brief description of the Reactor Safety Study (WASH-1400), concentrating on the engineering aspects of the contributions to reactor accident risks, is followed by some comments on how we have applied the insights and techniques developed in this study to prepare a program to improve the safety of nuclear power plants. Some new work we are just beginning on the application of risk-assessment techniques to stabilize the reactor licensing process is also discussed

  10. Generalized hardware post-processing technique for chaos-based pseudorandom number generators

    KAUST Repository

    Barakat, Mohamed L.

    2013-06-01

    This paper presents a generalized post-processing technique for enhancing the pseudorandomness of digital chaotic oscillators through a nonlinear XOR-based operation with rotation and feedback. The technique allows full utilization of the chaotic output as pseudorandom number generators and improves throughput without a significant area penalty. Digital design of a third-order chaotic system with maximum function nonlinearity is presented with verified chaotic dynamics. The proposed post-processing technique eliminates statistical degradation in all output bits, thus maximizing throughput compared to other processing techniques. Furthermore, the technique is applied to several fully digital chaotic oscillators with performance surpassing previously reported systems in the literature. The enhancement in the randomness is further examined in a simple image encryption application resulting in a better security performance. The system is verified through experiment on a Xilinx Virtex 4 FPGA with throughput up to 15.44 Gbit/s and logic utilization less than 0.84% for 32-bit implementations. © 2013 ETRI.
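
    The nonlinear XOR-based post-processing with rotation and feedback described above can be sketched in a few lines. In the sketch below, a logistic map stands in for the paper's third-order digital chaotic oscillator, and the rotation amount of 7 is an arbitrary choice; this illustrates the structure of the technique, not the authors' exact design.

```python
def rotl32(x, r):
    """32-bit left rotation."""
    return ((x << r) | (x >> (32 - r))) & 0xFFFFFFFF

def chaotic_words(n, x=0.123456789):
    """Stand-in chaotic source: logistic-map iterates quantized to 32-bit words.
    (The paper uses a third-order digital chaotic oscillator instead.)"""
    words = []
    for _ in range(n):
        x = 3.99999 * x * (1.0 - x)
        words.append(int(x * (1 << 32)) & 0xFFFFFFFF)
    return words

def postprocess(words, rot=7):
    """XOR each raw chaotic word with a rotated feedback of the previous output."""
    feedback, out = 0, []
    for w in words:
        y = w ^ rotl32(feedback, rot)
        feedback = y
        out.append(y)
    return out
```

    Because each output word is folded back into the next, statistical defects that are stationary in individual bits of the raw chaotic stream get diffused across the whole word, which is the intuition behind the reported throughput gain.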

  11. The Recoverability of P-Technique Factor Analysis

    Science.gov (United States)

    Molenaar, Peter C. M.; Nesselroade, John R.

    2009-01-01

    It seems that just when we are about to lay P-technique factor analysis finally to rest as obsolete because of newer, more sophisticated multivariate time-series models using latent variables--dynamic factor models--it rears its head to inform us that an obituary may be premature. We present the results of some simulations demonstrating that even…

  12. Adaptive multiparameter control: application to a Rapid Thermal Processing process; Commande Adaptative Multivariable: Application a un Procede de Traitement Thermique Rapide

    Energy Technology Data Exchange (ETDEWEB)

    Morales Mago, S J

    1995-12-20

    In this work the problem of temperature uniformity control in rapid thermal processing is addressed by means of multivariable adaptive control. Rapid Thermal Processing (RTP) is a set of techniques proposed for semiconductor fabrication processes such as annealing, oxidation, chemical vapour deposition and others. The product quality depends on two main issues: precise trajectory following and spatial temperature uniformity. RTP is a fabrication technique that requires a sophisticated real-time multivariable control system to achieve acceptable results. Modelling the thermal behaviour of the process leads to very complex mathematical models. These are the reasons why adaptive control techniques are chosen. A multivariable linear discrete-time model of the highly non-linear process is identified on-line, using an identification scheme which includes supervisory actions. This identified model, combined with a multivariable predictive control law, makes the controller robust to system variations. The control laws are obtained by minimization of a quadratic cost function or by pole placement. In some of these control laws, a partial state reference model was included; this reference model makes it possible to incorporate an appropriate tracking capability into the control law. Experimental results of the application of the multivariable adaptive control laws on an RTP system are presented. (author) refs
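
    The on-line identification step in such adaptive schemes is typically a recursive least-squares (RLS) update with a forgetting factor. The sketch below shows the generic scalar-output RLS recursion; the thesis' actual scheme wraps supervisory logic around it, which is omitted here.

```python
def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive least-squares step with forgetting factor `lam`.

    Model: y ~ theta^T phi.  `P` is the (symmetric) covariance matrix,
    stored as a list of rows.  Returns the updated (theta, P).
    """
    n = len(phi)
    Pphi = [sum(P[i][j] * phi[j] for j in range(n)) for i in range(n)]
    denom = lam + sum(phi[i] * Pphi[i] for i in range(n))
    k = [v / denom for v in Pphi]                       # gain vector
    err = y - sum(t * p for t, p in zip(theta, phi))    # prediction error
    theta = [t + ki * err for t, ki in zip(theta, k)]
    P = [[(P[i][j] - k[i] * Pphi[j]) / lam for j in range(n)] for i in range(n)]
    return theta, P
```

    Feeding it noiseless samples of, say, y = 2x drives the single parameter estimate toward 2 within a few steps; a forgetting factor below 1 lets the estimate track the slow plant variations that motivate adaptive control in RTP.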

  13. Particle Handling Techniques in Microchemical Processes

    Directory of Open Access Journals (Sweden)

    Brian S. Flowers

    2012-08-01

    Full Text Available The manipulation of particulates in microfluidics is a challenge that continues to impact applications ranging from fine chemicals manufacturing to the materials and the life sciences. Heterogeneous operations carried out in microreactors involve high surface-to-volume characteristics that minimize the heat and mass transport resistances, offering precise control of the reaction conditions. Considerable advances have been made towards the engineering of techniques that control particles in microscale laminar flow, yet there remain tremendous opportunities for improvements in the area of chemical processing. Strategies that have been developed to successfully advance systems involving heterogeneous materials are reviewed and an outlook provided in the context of the challenges of continuous flow fine chemical processes.

  14. Expert system and process optimization techniques for real-time monitoring and control of plasma processes

    Science.gov (United States)

    Cheng, Jie; Qian, Zhaogang; Irani, Keki B.; Etemad, Hossein; Elta, Michael E.

    1991-03-01

    To meet the ever-increasing demand of the rapidly growing semiconductor manufacturing industry, it is critical to have a comprehensive methodology integrating techniques for process optimization, real-time monitoring, and adaptive process control. To this end we have accomplished an integrated knowledge-based approach combining the latest expert system technology, machine learning methods, and traditional statistical process control (SPC) techniques. This knowledge-based approach is advantageous in that it makes it possible for the task of process optimization and adaptive control to be performed consistently and predictably. Furthermore, this approach can be used to construct high-level, qualitative descriptions of processes and thus make the process behavior easy to monitor, predict, and control. Two software packages, RIST (Rule Induction and Statistical Testing) and KARSM (Knowledge Acquisition from Response Surface Methodology), have been developed and incorporated with two commercially available packages: G2 (a real-time expert system) and ULTRAMAX (a tool for sequential process optimization).

  15. The tool for the automatic analysis of lexical sophistication (TAALES): version 2.0.

    Science.gov (United States)

    Kyle, Kristopher; Crossley, Scott; Berger, Cynthia

    2017-07-11

    This study introduces the second release of the Tool for the Automatic Analysis of Lexical Sophistication (TAALES 2.0), a freely available and easy-to-use text analysis tool. TAALES 2.0 is housed on a user's hard drive (allowing for secure data processing) and is available on most operating systems (Windows, Mac, and Linux). TAALES 2.0 adds 316 indices to the original tool. These indices are related to word frequency, word range, n-gram frequency, n-gram range, n-gram strength of association, contextual distinctiveness, word recognition norms, semantic network, and word neighbors. In this study, we validated TAALES 2.0 by investigating whether its indices could be used to model both holistic scores of lexical proficiency in free writes and word choice scores in narrative essays. The results indicated that the TAALES 2.0 indices could be used to explain 58% of the variance in lexical proficiency scores and 32% of the variance in word-choice scores. Newly added TAALES 2.0 indices, including those related to n-gram association strength, word neighborhood, and word recognition norms, featured heavily in these predictor models, suggesting that TAALES 2.0 represents a substantial upgrade.
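
    As a toy illustration of one index family TAALES computes, a mean log-frequency score for a text's n-grams against a reference count table might look like the following. The function and the count table are made up for illustration; TAALES draws its counts from large reference corpora and computes hundreds of related indices.

```python
import math

def ngram_frequency_index(text, reference_counts, n=2):
    """Mean log frequency of the text's n-grams in a reference count table.
    Higher values indicate more frequent (roughly: less sophisticated)
    word combinations; add-one smoothing handles unseen n-grams."""
    tokens = text.lower().split()
    grams = [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]
    if not grams:
        return 0.0
    return sum(math.log(reference_counts.get(g, 0) + 1) for g in grams) / len(grams)
```

    A predictor model of the kind validated in the study would feed many such indices, computed over each essay, into a regression against the human scores.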

  16. A novel data processing technique for image reconstruction of penumbral imaging

    Science.gov (United States)

    Xie, Hongwei; Li, Hongyun; Xu, Zeping; Song, Guzhou; Zhang, Faqiang; Zhou, Lin

    2011-06-01

    CT image reconstruction techniques were applied to the data processing of penumbral imaging. Compared with traditional processing techniques for penumbral coded-pinhole images, such as Wiener, Lucy-Richardson and blind deconvolution, this approach is brand new: for the first time, the coded-aperture processing method is used independently of the point spread function of the imaging diagnostic system. In this way, the technical obstacle in traditional coded-pinhole image processing caused by the uncertainty of the point spread function of the imaging diagnostic system is overcome. Based on the theoretical study, simulations of penumbral imaging and image reconstruction were carried out and provided fairly good results. In the visible-light experiment, a point source of light was used to irradiate a 5 mm × 5 mm object after diffuse scattering and volume scattering, and penumbral imaging was performed with an aperture size of ~20 mm. Finally, the CT image reconstruction technique was used for image reconstruction and provided a fairly good result.

  17. A Document Imaging Technique for Implementing Electronic Loan Approval Process

    Directory of Open Access Journals (Sweden)

    J. Manikandan

    2015-04-01

    Full Text Available Image processing is one of the leading technologies of computer applications. Image processing is a type of signal processing: the input to an image processor is an image or video frame, and the output is an image or a subset of an image [1]. Computer graphics and computer vision processes use image processing techniques. Image processing systems are used in various environments such as medicine, computer-aided design (CAD), research, crime investigation, and the military. In this paper, we propose a document image processing technique for establishing an electronic loan approval process (E-LAP) [2]. The loan approval process has been a tedious one, and the E-LAP system attempts to reduce its complexity. Customers log in to fill in the loan application form online with all details and submit the form. The loan department then processes the submitted form and sends an acknowledgement mail via E-LAP to the requesting customer with the list of documents required for loan approval [3]. The customer can then upload scanned copies of all required documents. All interaction between customer and bank takes place through the E-LAP system.

  18. FY 1998 annual summary report on photon measuring/processing techniques. Development of the techniques for high-efficiency production processes; 1998 nendo foton keisoku kako gijutsu seika hokokusho. Kokoritsu seisan process gijutsu kaihatsu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-03-01

    The objectives are set to develop the techniques for energy-efficient laser-aided processing; techniques for high-precision, real-time measurement to improve quality control for production processes and increase their efficiency; and the techniques for generating/controlling photon of high efficiency and quality as the laser beam sources therefor, in order to promote energy saving at and improve efficiency of production processes consuming large quantities of energy, e.g., welding, joining, surface treatment and production of fine particles. The R and D themes are microscopic processing technology: simulation technology for laser welding phenomena; microscopic processing technology: synthesis of technology for quantum dot functional structures; in-situ status measuring technology: fine particle elements and size measurement technology; high-power all-solid-state laser technology: efficient rod type LD-pumping laser modules and pumping chamber of a slab-type laser; tightly-focusing all-solid-state laser technology: improvement of E/O efficiency of laser diode, high-quality nonlinear crystal growth technology and fabrication technology for nonlinear crystal; and comprehensive investigation of photonics engineering: high-efficiency harmonic generation technology. (NEDO)

  19. Development of food preservation and processing techniques by radiation

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Lee, Ju Woon; Kim, Dong Ho [KAERI, Taejon (Korea, Republic of); Yook, Hong Sun [Chungnam National Univ., Taejon (Korea, Republic of); Kim, Hak Soo [Sogang Univ., Seoul (Korea, Republic of); Lee, Cherl Ho; Park, Hyun Jin [Korea Univ., Seoul (Korea, Republic of); Kang, Il Jun [Hallym Univ., Chuncheon (Korea, Republic of); Kwon, Jung Ho [Kyungbook National Univ., Taegu (Korea, Republic of)

    2002-05-01

    To secure national food resources, we studied the development of energy-saving food processing and preservation technologies, the establishment of methods to improve national health and safety through alternatives to chemical treatments, and the foundation for producing hygienic food and public-health-related products by irradiation technology. Results at the current stage are as follows. Processing techniques for low-salted and fermented fish using gamma irradiation were developed, and the superiority of irradiation over conventional food processing methods was established. A processing technique for value-added functional materials for the manufacture of food or public health products using combined RT/BT/NT technology was developed. The basic theories for technologies to reduce toxic or undesirable compounds in food, such as allergens or carcinogens, were established. Methods for exterminating quarantine organisms in herbs/spices were established, and the quality evaluation and detection conditions in quarantine treatment were set. From studies on a 'program of public understanding' based on the safety of gamma-irradiated food, information for public relations supporting greater consumer acceptance and the peaceful use of nuclear energy was secured. The results of the project will contribute to the competitiveness of the domestic food industry and its export market. They are also expected to improve public health by preventing food-borne diseases and to benefit the national economy and industry through direct and indirect productivity gains.

  1. Application of hydrometallurgy techniques in quartz processing and purification: a review

    Science.gov (United States)

    Lin, Min; Lei, Shaomin; Pei, Zhenyu; Liu, Yuanyuan; Xia, Zhangjie; Xie, Feixiang

    2018-04-01

    Although there have been numerous studies on the separation and purification of metallic minerals by hydrometallurgical techniques, applications of these chemical techniques to the separation and purification of non-metallic minerals are rarely reported. This paper reviews disparate areas of study into the processing and purification of quartz (a typical non-metallic ore) in an attempt to summarize current work, as well as to suggest potential for future consolidation in the field. The review encompasses chemical techniques of quartz processing, including the current situation, progress, leaching mechanisms, scopes of application, and advantages and drawbacks of micro-bioleaching, high temperature leaching, high temperature pressure leaching, and catalyzed high temperature pressure leaching. Traditional leaching techniques, including micro-bioleaching and high temperature leaching, cannot meet the modern glass industry's demand for quartz concentrate quality, because the quartz products have to be further processed. High temperature pressure leaching and catalyzed high temperature pressure leaching provide new ways to produce high-grade quartz sand in a single process with lower acid consumption. Furthermore, catalyzed high temperature pressure leaching achieves effective purification of quartz with extremely low acid consumption (without using HF or any fluoride). It is proposed that integrating the different chemical processes of quartz processing and elucidating their leaching mechanisms and scopes of application would benefit both the research field and the industry.

  2. The sophisticated control of the tram bogie on track

    Directory of Open Access Journals (Sweden)

    Radovan DOLECEK

    2015-09-01

    Full Text Available The paper deals with the routing control algorithms of a new conception of tram vehicle bogie. The main goal of these research activities is the reduction of rail wheel and track wear, the reduction of traction energy losses, and increased running comfort. An experimental tram vehicle with a special bogie construction, powered by a traction battery, is utilized for these purposes. This vehicle has a rotary bogie with independent rotating wheels driven by permanent-magnet synchronous motors and a solid axle. The wheel forces in the bogie are measured by a large number of sensors placed on the experimental tram vehicle. The designed control algorithms are now implemented in the vehicle's supervisory control system. The traction requirements and track characteristics affect these control algorithms. This control, including sophisticated routing, brings further improvements, which are verified and corrected according to individual traction and driving characteristics, and it opens new possibilities.

  3. Transfer of physics detector models into CAD systems using modern techniques

    International Nuclear Information System (INIS)

    Dach, M.; Vuoskoski, J.

    1996-01-01

    Designing high energy physics detectors for future experiments requires sophisticated computer aided design and simulation tools. In order to satisfy the future demands in this domain, modern techniques, methods, and standards have to be applied. We present an interface application, designed and implemented using object-oriented techniques, for the widely used GEANT physics simulation package. It converts GEANT detector models into the future industrial standard, STEP. (orig.)

  4. Statistic techniques of process control for MTR type

    International Nuclear Information System (INIS)

    Oliveira, F.S.; Ferrufino, F.B.J.; Santos, G.R.T.; Lima, R.M.

    2002-01-01

    This work aims to introduce some improvements in the fabrication of MTR-type fuel plates by applying statistical techniques of process control. The work was divided into four steps, whose data were analyzed: fabrication of U3O8 fuel plates; fabrication of U3Si2 fuel plates; rolling of small lots of fuel plates; and application of statistical tools and standard specifications to perform a comparative study of these processes. (author)
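
    A minimal example of the kind of statistical tool involved is a Shewhart individuals chart, with control limits estimated from the average moving range (using the standard d2 = 1.128 constant for subgroups of two). This is a generic SPC sketch, not the authors' procedure:

```python
def control_limits(samples):
    """Shewhart individuals-chart limits: mean +/- 3 sigma, with sigma
    estimated from the average moving range (d2 = 1.128 for subgroups of 2)."""
    mean = sum(samples) / len(samples)
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    sigma = (sum(moving_ranges) / len(moving_ranges)) / 1.128
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(samples, new_value):
    """True if a new observation falls outside the control limits."""
    lo, hi = control_limits(samples)
    return not (lo <= new_value <= hi)
```

    In a fabrication context, `samples` would be a baseline run of a measured plate characteristic (e.g. thickness), and each new plate's measurement would be checked against the limits before the lot proceeds.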

  5. Techniques and software architectures for medical visualisation and image processing

    NARCIS (Netherlands)

    Botha, C.P.

    2005-01-01

    This thesis presents a flexible software platform for medical visualisation and image processing, a technique for the segmentation of the shoulder skeleton from CT data and three techniques that make contributions to the field of direct volume rendering. Our primary goal was to investigate the use

  6. Signals and Systems in Biomedical Engineering Signal Processing and Physiological Systems Modeling

    CERN Document Server

    Devasahayam, Suresh R

    2013-01-01

    The use of digital signal processing is ubiquitous in the field of physiology and biomedical engineering. The application of such mathematical and computational tools requires a formal or explicit understanding of physiology. Formal models and analytical techniques are interlinked in physiology as in any other field. This book takes a unitary approach to physiological systems, beginning with signal measurement and acquisition, followed by signal processing, linear systems modelling, and computer simulations. The signal processing techniques range across filtering, spectral analysis and wavelet analysis. Emphasis is placed on fundamental understanding of the concepts as well as solving numerical problems. Graphs and analogies are used extensively to supplement the mathematics. Detailed models of nerve and muscle at the cellular and systemic levels provide examples for the mathematical methods and computer simulations. Several of the models are sufficiently sophisticated to be of value in understanding real wor...

  8. Processing ultrafine-grained Aluminum alloy using Multi-ECAP-Conform technique

    International Nuclear Information System (INIS)

    Fakhretdinova, Elvira; Raab, Georgy; Valiev, Ruslan; Ryzhikov, Oleg

    2014-01-01

    The stress-strain state (SSS), contact, and force parameters of a new SPD technique – Multi-ECAP-Conform – have been studied. The new technique ensures a high level of accumulated strain, ε = 4...5, per processing cycle. Physical modeling and computer modeling by the finite element method in Deform-3D software were applied to evaluate the parameters. It is shown that the results of physical and computer modeling correlate with each other. The equipment has been upgraded, and experimental samples of an Al-Mg-Si system alloy have been processed
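
    The quoted accumulated strain can be cross-checked against the standard ECAP strain estimate of Iwahashi et al. Assuming, purely for illustration, four passes per cycle through a 90° channel with a sharp outer corner (the pass count and angles are assumptions, not the authors' geometry), the equivalent strain lands in the quoted 4...5 range:

```python
import math

def ecap_strain(n_passes, phi_deg=90.0, psi_deg=0.0):
    """Equivalent strain after N ECAP passes (Iwahashi et al. relation):
    eps = N/sqrt(3) * (2*cot(Phi/2 + Psi/2) + Psi*csc(Phi/2 + Psi/2)),
    where Phi is the channel angle and Psi the outer corner angle."""
    phi = math.radians(phi_deg)
    psi = math.radians(psi_deg)
    a = phi / 2.0 + psi / 2.0
    return n_passes / math.sqrt(3.0) * (2.0 / math.tan(a) + psi / math.sin(a))
```

    With Phi = 90° and Psi = 0°, each pass contributes about 1.15 equivalent strain, so four passes give roughly 4.6.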

  9. Development of laser materials processing and laser metrology techniques

    International Nuclear Information System (INIS)

    Kim, Cheol Jung; Chung, Chin Man; Kim, Jeong Mook; Kim, Min Suk; Kim, Kwang Suk; Baik, Sung Hoon; Kim, Seong Ouk; Park, Seung Kyu

    1997-09-01

    The applications of remote laser materials processing and metrology have been investigated in nuclear industry from the beginning of laser invention because they can reduce the risks of workers in the hostile environment by remote operation. The objective of this project is the development of laser material processing and metrology techniques for repairing and inspection to improve the safety of nuclear power plants. As to repairing, we developed our own laser sleeve welding head and innovative optical laser weld monitoring techniques to control the sleeve welding process. Furthermore, we designed and fabricated a 800 W Nd:YAG and a 150 W Excimer laser systems for high power laser materials processing in nuclear industry such as cladding and decontamination. As to inspection, we developed an ESPI and a laser triangulation 3-D profile measurement system for defect detection which can complement ECT and UT inspections. We also developed a scanning laser vibrometer for remote vibration measurement of large structures and tested its performance. (author). 58 refs., 16 tabs., 137 figs

  10. Towards emergence phenomenon in business process management

    Directory of Open Access Journals (Sweden)

    Koryl Maciej

    2017-06-01

    Full Text Available A standard solution for business process management automation in enterprises is the use of workflow management systems based on the Rule-Based Reasoning approach. In such systems, the process model, which is designed entirely before implementation, has to meet all needs deriving from the business activity of the organization. In practice, this greatly limits process control abilities, especially in a dynamic business environment. New kinds of workflow systems may therefore help, which typically work in a more agile way, e.g. following the Case-Based Reasoning approach. The paper shows another possible solution – the use of emergence theory, which indicates, among the other conditions required to be fulfilled, stimulation of the system (for example, the business environment) to run grass-roots processes that lead to the arising of new, more sophisticated organizing forms. The paper also points to the opportunity of using techniques such as complex event processing to fulfill the key conditions indicated by emergence theory.
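
    Complex event processing, mentioned here as an enabling technique, can be illustrated by a minimal pattern matcher that flags one event type following another within a time window. This is a toy sketch of the idea, not a production CEP engine:

```python
def detect_sequences(events, first, second, window):
    """Emit (i, j) index pairs whenever an occurrence of `second` follows an
    occurrence of `first` within `window` steps.  Pending `first` events are
    consumed (or discarded as stale) at each `second` event."""
    matches, pending = [], []
    for j, e in enumerate(events):
        if e == first:
            pending.append(j)
        if e == second:
            pending = [i for i in pending if j - i <= window]
            matches.extend((i, j) for i in pending)
            pending = []
    return matches
```

    In a workflow setting, `events` would be a stream of process-level occurrences, and a detected pattern could trigger a grass-roots (case-based) handling path instead of the rigid predefined model.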

  11. STUDY OF ELECTROPOLIMERIZATION PROCESSES OF PYRROLE BY CYCLIC VOLTAMMETRIC TECHNIQUE

    Directory of Open Access Journals (Sweden)

    Adhitasari Suratman

    2010-06-01

    Full Text Available The electropolymerization processes and electrochemical properties of polypyrrole as an electroactive polymer have been studied by the cyclic voltammetric technique. Pyrrole was electropolymerized to form polypyrrole in a water-based solvent containing sodium perchlorate as supporting electrolyte at several pH values, which were varied using Britton-Robinson buffer. The results showed that the oxidation potential limit of the electropolymerization of pyrrole was 1220 mV vs the Ag/AgCl reference electrode. The cyclic voltammetric response of the polypyrrole membrane prepared by electropolymerization of pyrrole at a scanning rate of 100 mV/s was stable. Electropolymerization of pyrrole at varying pH showed that the best conditions were in the pH range of 2 - 6.   Keywords: polypyrrole, electropolymer, voltammetric technique

  12. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    Science.gov (United States)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used in various search processes, such as search engines, sorting, machine learning, neural networks, and so on. During the search process data collisions may occur, and one of several ways to prevent them is the overflow technique. This technique was applied to data of varying lengths, and it can prevent the occurrence of data collisions.
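
    The collision-handling idea the abstract describes — a closed (fixed-size) hash table paired with a separate overflow area — can be sketched as follows. This is an illustrative structure; the paper's exact scheme may differ:

```python
class OverflowHashTable:
    """Closed hashing with an overflow area: each key hashes to one primary
    slot, and colliding keys go to a separate overflow list instead of
    probing other primary slots."""

    def __init__(self, size=11):
        self.size = size
        self.primary = [None] * size   # fixed primary area of (key, value)
        self.overflow = []             # separate overflow area

    def _h(self, key):
        return hash(key) % self.size

    def put(self, key, value):
        i = self._h(key)
        if self.primary[i] is None or self.primary[i][0] == key:
            self.primary[i] = (key, value)
            return
        # Collision: insert or update in the overflow area.
        for j, (k, _) in enumerate(self.overflow):
            if k == key:
                self.overflow[j] = (key, value)
                return
        self.overflow.append((key, value))

    def get(self, key):
        i = self._h(key)
        if self.primary[i] is not None and self.primary[i][0] == key:
            return self.primary[i][1]
        for k, v in self.overflow:
            if k == key:
                return v
        return None
```

    Lookups first check the single primary slot, then fall back to a linear scan of the overflow area, so a collision never overwrites another key's data.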

  13. Detection of Glaucoma Using Image Processing Techniques: A Critique.

    Science.gov (United States)

    Kumar, B Naveen; Chauhan, R P; Dahiya, Nidhi

    2018-01-01

    The primary objective of this article is to present a summary of different types of image processing methods employed for the detection of glaucoma, a serious eye disease. Glaucoma affects the optic nerve in which retinal ganglion cells die, and this leads to loss of vision. The principal cause is the increase in intraocular pressure, which occurs in open-angle and angle-closure glaucoma, the two major types affecting the optic nerve. In the early stages of glaucoma, no perceptible symptoms appear. As the disease progresses, vision starts to become hazy, leading to blindness. Therefore, early detection of glaucoma is needed for prevention. Manual analysis of ophthalmic images is fairly time-consuming, and accuracy depends on the expertise of the professionals. Automatic analysis of retinal images is therefore an important tool. Automation aids in the detection, diagnosis, and prevention of risks associated with the disease. Fundus images obtained from a fundus camera have been used for the analysis. Requisite pre-processing techniques have been applied to the image and, depending upon the technique, various classifiers have been used to detect glaucoma. The techniques mentioned in the present review have certain advantages and disadvantages. Based on this study, one can determine which technique provides an optimum result.
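    As an illustration of the kind of requisite pre-processing mentioned above, here is a minimal histogram-equalization sketch for an 8-bit grayscale fundus image (the function name and NumPy-only approach are illustrative assumptions, not taken from the review):

    ```python
    import numpy as np

    def equalize_histogram(img):
        """Contrast enhancement by histogram equalization, a common
        pre-processing step for 8-bit grayscale fundus images."""
        hist = np.bincount(img.ravel(), minlength=256)
        cdf = hist.cumsum()
        cdf_min = cdf[cdf > 0].min()
        # map intensities so the cumulative distribution becomes ~uniform
        lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min) * 255).astype(np.uint8)
        return lut[img]
    ```

    The lookup table stretches the occupied part of the intensity range over the full 0-255 scale, which helps later segmentation of the optic disc and cup.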

  14. Policy and process of innovation in techniques

    International Nuclear Information System (INIS)

    Kim, In Su; Lee, Jin Ju

    1982-09-01

    This book covers the policy and process of innovation in techniques. It presents an introduction; a macroscopic analysis of the development of science and technology, including analysis of existing research on systems for the development of science and technology and a new development system for science and technology; and a macroscopic analysis of the development of science and technology in Korea. It also examines innovation of technology in Korean industry from various perspectives.

  15. Wind Erosion Processes and Control Techniques in the Sahelian Zone of Niger

    NARCIS (Netherlands)

    Sterk, G.; Stroosnijder, L.; Raats, P.A.C.

    1999-01-01

    The objective of this paper is to present the main results and conclusions from three years of field research on wind erosion processes and control techniques in the

  16. Process automation

    International Nuclear Information System (INIS)

    Moser, D.R.

    1986-01-01

    Process automation technology has been pursued in the chemical processing industries and to a very limited extent in nuclear fuel reprocessing. Its effective use has been restricted in the past by the lack of diverse and reliable process instrumentation and the unavailability of sophisticated software designed for process control. The Integrated Equipment Test (IET) facility was developed by the Consolidated Fuel Reprocessing Program (CFRP) in part to demonstrate new concepts for control of advanced nuclear fuel reprocessing plants. A demonstration of fuel reprocessing equipment automation using advanced instrumentation and a modern, microprocessor-based control system is nearing completion in the facility. This facility provides for the synergistic testing of all chemical process features of a prototypical fuel reprocessing plant that can be attained with unirradiated uranium-bearing feed materials. The unique equipment and mission of the IET facility make it an ideal test bed for automation studies. This effort will provide for the demonstration of the plant automation concept and for the development of techniques for similar applications in a full-scale plant. A set of preliminary recommendations for implementing process automation has been compiled. Some of these concepts are not generally recognized or accepted. The automation work now under way in the IET facility should be useful to others in helping avoid costly mistakes because of the underutilization or misapplication of process automation. 6 figs

  17. Applying traditional signal processing techniques to social media exploitation for situational understanding

    Science.gov (United States)

    Abdelzaher, Tarek; Roy, Heather; Wang, Shiguang; Giridhar, Prasanna; Al Amin, Md. Tanvir; Bowman, Elizabeth K.; Kolodny, Michael A.

    2016-05-01

    Signal processing techniques such as filtering, detection, estimation and frequency domain analysis have long been applied to extract information from noisy sensor data. This paper describes the exploitation of these signal processing techniques to extract information from social networks, such as Twitter and Instagram. Specifically, we view social networks as noisy sensors that report events in the physical world. We then present a data processing stack for detection, localization, tracking, and veracity analysis of reported events using social network data. We show using a controlled experiment that the behavior of social sources as information relays varies dramatically depending on context. In benign contexts, there is general agreement on events, whereas in conflict scenarios, a significant amount of collective filtering is introduced by conflicted groups, creating a large data distortion. We describe signal processing techniques that mitigate such distortion, resulting in meaningful approximations of actual ground truth, given noisy reported observations. Finally, we briefly present an implementation of the aforementioned social network data processing stack in a sensor network analysis toolkit, called Apollo. Experiences with Apollo show that our techniques are successful at identifying and tracking credible events in the physical world.
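    A minimal sketch of the filtering-plus-detection idea applied to social data (illustrative only, not the Apollo toolkit): treat the per-interval count of mentions of a topic as a noisy sensor signal, low-pass filter it, and flag intervals that stand out from the series' own statistics.

    ```python
    import numpy as np

    def detect_events(counts, window=5, threshold=3.0):
        """Smooth a noisy per-interval count of topic mentions with a moving
        average, then flag intervals where the smoothed signal exceeds a
        z-score threshold relative to the series' own mean and std."""
        kernel = np.ones(window) / window
        smoothed = np.convolve(counts, kernel, mode="same")   # low-pass filter
        mu, sigma = smoothed.mean(), smoothed.std()
        z = (smoothed - mu) / (sigma if sigma > 0 else 1.0)
        return np.flatnonzero(z > threshold)                  # detected intervals
    ```

    Real pipelines would add localization and veracity analysis on top, but the detection step reduces to classic filtering and thresholding.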

  18. Reliability modeling of digital component in plant protection system with various fault-tolerant techniques

    International Nuclear Information System (INIS)

    Kim, Bo Gyung; Kang, Hyun Gook; Kim, Hee Eun; Lee, Seung Jun; Seong, Poong Hyun

    2013-01-01

    Highlights: • Integrated fault coverage is introduced to reflect the characteristics of fault-tolerant techniques in the reliability model of the digital protection system in NPPs. • The integrated fault coverage considers the process of fault-tolerant techniques from detection to fail-safe generation. • With integrated fault coverage, the unavailability of a repairable component of the DPS can be estimated. • The newly developed reliability model can reveal the effects of fault-tolerant techniques explicitly for risk analysis. • The reliability model makes it possible to confirm changes in unavailability according to the variation of diverse factors. - Abstract: With the improvement of digital technologies, the digital protection system (DPS) incorporates multiple sophisticated fault-tolerant techniques (FTTs) in order to increase fault detection and to help the system safely perform the required functions in spite of the possible presence of faults. Fault detection coverage is a vital factor of an FTT's reliability. However, fault detection coverage alone is insufficient to reflect the effects of various FTTs in the reliability model. To reflect the characteristics of FTTs in the reliability model, integrated fault coverage is introduced. The integrated fault coverage considers the process of an FTT from detection to fail-safe generation. A model has been developed to estimate the unavailability of a repairable component of the DPS using the integrated fault coverage. The newly developed model can quantify unavailability under a diversity of conditions. Sensitivity studies are performed to ascertain the important variables which affect the integrated fault coverage and unavailability
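    The role of fault coverage can be illustrated with a standard unavailability approximation for a periodically tested, repairable component (a textbook-style sketch, not the paper's integrated model): faults caught by the FTTs (fraction `coverage`) are repaired within the MTTR, while undetected faults stay latent for half a test interval on average.

    ```python
    def unavailability(lam, coverage, mttr, test_interval):
        """Approximate steady-state unavailability of a repairable component
        whose fault-tolerant techniques detect a fraction `coverage` of
        faults on-line (repaired within `mttr` hours); the remaining faults
        stay latent until the next periodic test (interval T), contributing
        T/2 on average.  Valid for lam * test_interval << 1."""
        detected = coverage * lam * mttr
        undetected = (1.0 - coverage) * lam * test_interval / 2.0
        return detected + undetected
    ```

    Because the test interval is usually much longer than the MTTR, raising the on-line detection coverage directly lowers the unavailability, which is the effect the paper's sensitivity studies examine.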

  19. Removable partial denture alloys processed by laser-sintering technique.

    Science.gov (United States)

    Alageel, Omar; Abdallah, Mohamed-Nur; Alsheghri, Ammar; Song, Jun; Caron, Eric; Tamimi, Faleh

    2018-04-01

    Removable partial dentures (RPDs) are traditionally made using a casting technique. New additive manufacturing processes based on laser sintering have been developed for quick fabrication of RPD metal frameworks at low cost. The objective of this study was to characterize the mechanical, physical, and biocompatibility properties of RPD cobalt-chromium (Co-Cr) alloys produced by two laser-sintering systems and compare them to those prepared using traditional casting methods. The laser-sintered Co-Cr alloys were processed by the selective laser-sintering method (SLS) and the direct metal laser-sintering (DMLS) method using the Phenix system (L-1) and EOS system (L-2), respectively. L-1 and L-2 techniques were 8 and 3.5 times more precise than the casting (CC) technique (p laser-sintered and cast alloys were biocompatible. In conclusion, laser-sintered alloys are more precise and present better mechanical and fatigue properties than cast alloys for RPDs. © 2017 Wiley Periodicals, Inc. J Biomed Mater Res Part B: Appl Biomater, 106B: 1174-1185, 2018. © 2017 Wiley Periodicals, Inc.

  20. New computing techniques in physics research

    International Nuclear Information System (INIS)

    Becks, Karl-Heinz; Perret-Gallix, Denis

    1994-01-01

    New techniques were highlighted by the ''Third International Workshop on Software Engineering, Artificial Intelligence and Expert Systems for High Energy and Nuclear Physics'' in Oberammergau, Bavaria, Germany, from October 4 to 8. It was the third workshop in the series; the first was held in Lyon in 1990 and the second at France-Telecom site near La Londe les Maures in 1992. This series of workshops covers a broad spectrum of problems. New, highly sophisticated experiments demand new techniques in computing, in hardware as well as in software. Software Engineering Techniques could in principle satisfy the needs for forthcoming accelerator experiments. The growing complexity of detector systems demands new techniques in experimental error diagnosis and repair suggestions; Expert Systems seem to offer a way of assisting the experimental crew during data-taking

  1. Effects of novel processing techniques on glucosinolates and membrane associated myrosinases in broccoli

    OpenAIRE

    Frandsen, Heidi Blok; Markedal, Keld Ejdrup; Martín Belloso, Olga; Sánchez Vega, Rogelio; Soliva-Fortuny, Robert; Sørensen, Hilmer; Sørensen, Susanne; Sørensen, Jens Christian

    2014-01-01

    High pressure/high temperature (HP/HT) and pulsed electric field (PEF) treatment of food are among the novel processing techniques considered as alternatives to conventional thermal food processing. Introduction of new processing techniques with fast and gentle processing steps may reveal new possibilities for preservation of healthy bioactive compounds in processed food. However, effects on various food components due to autolysis and fast reactions prior to the applied HP/HT or PEF need to ...

  2. Investigation of HIV-1 infected and uninfected cells using the optical trapping technique

    CSIR Research Space (South Africa)

    Ombinda-Lemboumba, Saturnin

    2017-02-01

    Full Text Available Optical trapping has emerged as an essential tool for manipulating single biological materials and performing sophisticated spectroscopic analysis on individual cells. The optical trapping technique has been used to grab and immobilize cells from a...

  3. Team training process for nuclear plant technicians

    International Nuclear Information System (INIS)

    Macris, A.C.

    1987-01-01

    The purpose of team training is the cooperative and coordinated action of individuals to attain a common goal. Such training requires the development of more sophisticated educational techniques than those previously established for training individuals alone. Extensive research has been conducted to devise methods and techniques to bring about effective team training. This paper discusses current team training methods and presents an instructional strategy for the application of effective team training techniques.

  4. The Effective Ransomware Prevention Technique Using Process Monitoring on Android Platform

    Directory of Open Access Journals (Sweden)

    Sanggeun Song

    2016-01-01

    Full Text Available Due to recent indiscriminate attacks of ransomware, damage cases including encryption of users’ important files are constantly increasing. Existing vaccine systems are vulnerable to attacks of new-pattern ransomware because they can only detect ransomware of existing patterns. A more effective technique is required to prevent modified ransomware. In this paper, an effective method is proposed to prevent attacks of modified ransomware on the Android platform. The proposed technique specifies and intensively monitors processes and specific file directories using statistical methods based on processor usage, memory usage, and I/O rates, so that processes with abnormal behaviors can be detected. If a process running suspected ransomware is detected, the proposed system stops the process and asks the user to confirm deletion of the programs associated with it. The information on suspected and exceptional processes confirmed by users is stored in a database. The proposed technique can detect ransomware even if its patterns have not been saved. Its detection is very fast because it is implemented in the Android source code rather than as a mobile application. In addition, it can effectively detect modified patterns of ransomware and provide protection with minimum damage.
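    The statistical monitoring idea can be sketched as follows (class name, window sizes, and thresholds are hypothetical, not from the paper): each process's CPU and I/O usage is compared against its own recent history, and a simultaneous spike on both metrics, as produced by bulk file encryption, marks the process as suspicious.

    ```python
    from collections import deque
    from statistics import mean, stdev

    class ProcessMonitor:
        """Per-process anomaly check on (CPU, I/O) readings; a sample is
        suspicious when it exceeds mean + k*std of the process's own
        history on BOTH metrics at once (ransomware-like behavior)."""

        def __init__(self, history=30, k=3.0):
            self.history = history
            self.k = k
            self.baseline = {}   # pid -> deque of (cpu, io) samples

        def is_suspicious(self, pid, cpu, io):
            samples = self.baseline.setdefault(pid, deque(maxlen=self.history))
            suspicious = False
            if len(samples) >= 5:   # need some history before judging
                cpus = [s[0] for s in samples]
                ios = [s[1] for s in samples]
                cpu_hi = mean(cpus) + self.k * (stdev(cpus) or 1.0)
                io_hi = mean(ios) + self.k * (stdev(ios) or 1.0)
                suspicious = cpu > cpu_hi and io > io_hi
            # note: in this simplified sketch flagged samples also enter the
            # baseline; a real system would quarantine them first
            samples.append((cpu, io))
            return suspicious
    ```

    A real implementation would run inside the platform, suspend the flagged process, and ask the user before deleting anything, as the paper describes.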

  5. Cleanex process: a versatile solvent extraction process for recovery and purification of lanthanides, americium, and curium

    International Nuclear Information System (INIS)

    Bigelow, J.E.; Collins, E.D.; King, L.J.

    1979-01-01

    At a concentration of 1 M in a straight-chain hydrocarbon diluent, HDEHP will extract americium, curium, and other trivalent actinide and lanthanide elements from dilute acid or salt solutions. The solute is back-extracted with more concentrated acid, either nitric or hydrochloric. The process has been used in the continuous, countercurrent mode, but its greatest advantage arises in batch extractions where the excess acid can be titrated with NaOH to produce a final acidity of about 0.03 M. Under these conditions, 99% recovery can be achieved, usually in one stage. Cleanex was used on the 50-liter scale at the Transuranium Processing Plant at Oak Ridge for 12 years to provide a broad-spectrum cleanup of transuranium elements before applying more sophisticated techniques for separating individual products. The process is also used routinely to recover excessive losses of curium and/or californium from plant waste streams. The solvent system is relatively resistant to radiation damage, being usable up to 200 W-h/liter
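    The 99% single-stage recovery quoted above follows directly from the mass balance of a batch extraction; a minimal sketch (the function name and inputs are illustrative, assuming a distribution ratio D = [M]org/[M]aq and the two phase volumes):

    ```python
    def single_stage_recovery(D, v_org, v_aq):
        """Fraction of a solute recovered into the organic phase in one
        batch contact, from the standard mass-balance result:
        R = D*(Vorg/Vaq) / (1 + D*(Vorg/Vaq))."""
        ratio = D * v_org / v_aq
        return ratio / (1.0 + ratio)
    ```

    At equal phase volumes, a distribution ratio of about 99 is what yields the 99% recovery in a single stage.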

  6. Low Level RF Including a Sophisticated Phase Control System for CTF3

    CERN Document Server

    Mourier, J; Nonglaton, J M; Syratchev, I V; Tanner, L

    2004-01-01

    CTF3 (CLIC Test Facility 3), currently under construction at CERN, is a test facility designed to demonstrate the key feasibility issues of the CLIC (Compact LInear Collider) two-beam scheme. When completed, this facility will consist of a 150 MeV linac followed by two rings for bunch-interleaving, and a test stand where 30 GHz power will be generated. In this paper, the work that has been carried out on the linac's low power RF system is described. This includes, in particular, a sophisticated phase control system for the RF pulse compressor to produce a flat-top rectangular pulse over 1.4 µs.

  7. Process techniques of charge transfer time reduction for high speed CMOS image sensors

    International Nuclear Information System (INIS)

    Cao Zhongxiang; Li Quanliang; Han Ye; Qin Qi; Feng Peng; Liu Liyuan; Wu Nanjian

    2014-01-01

    This paper proposes pixel process techniques to reduce the charge transfer time in high speed CMOS image sensors. These techniques increase the lateral conductivity of the photo-generated carriers in a pinned photodiode (PPD) and the voltage difference between the PPD and the floating diffusion (FD) node by controlling and optimizing the N doping concentration in the PPD and the threshold voltage of the reset transistor, respectively. The techniques effectively shorten the charge transfer time from the PPD to the FD node. The proposed process techniques do not need extra masks and do not harm the fill factor. A sub-array of 32 × 64 pixels was designed and implemented in the 0.18 μm CIS process with five implantation conditions splitting the N region in the PPD. The simulation and measurement results demonstrate that the charge transfer time can be decreased by using the proposed techniques. Comparing the charge transfer time of pixels with the different implantation conditions of the N region, a charge transfer time of 0.32 μs is achieved and image lag is reduced by 31% by using the proposed process techniques. (semiconductor devices)

  8. TOF-SIMS imaging technique with information entropy

    International Nuclear Information System (INIS)

    Aoyagi, Satoka; Kawashima, Y.; Kudo, Masahiro

    2005-01-01

    Time-of-flight secondary ion mass spectrometry (TOF-SIMS) is capable, in principle, of chemical imaging of proteins on insulated samples. However, the selection of the specific peaks related to a particular protein, which is necessary for chemical imaging, out of numerous candidates had been difficult without an appropriate spectrum analysis technique. Therefore multivariate analysis techniques, such as principal component analysis (PCA), and analysis with mutual information as defined by information theory, have been applied to interpret SIMS spectra of protein samples. In this study mutual information was applied to select specific peaks related to proteins in order to obtain chemical images. Proteins on insulated materials were measured with TOF-SIMS and the SIMS spectra were then analyzed by a comparison method based on mutual information. Chemical mapping of each protein was obtained using the specific peaks selected on the basis of their mutual information values. The resulting TOF-SIMS images of proteins on the materials provide useful information on the properties of protein adsorption, the optimality of immobilization processes, and reactions between proteins. Thus chemical images of proteins by TOF-SIMS contribute to understanding interactions between material surfaces and proteins and to developing sophisticated biomaterials
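    A minimal sketch of peak selection by mutual information (an illustrative reconstruction, not the authors' code): score each peak by the MI between its presence in a spectrum and the sample label, and keep the highest-scoring peaks for imaging. The binarization by the median intensity is an assumption for the sketch.

    ```python
    import numpy as np

    def mutual_information(x, y):
        """MI (in bits) between two discrete label arrays of equal length."""
        mi = 0.0
        for xv in np.unique(x):
            for yv in np.unique(y):
                pxy = np.mean((x == xv) & (y == yv))
                px, py = np.mean(x == xv), np.mean(y == yv)
                if pxy > 0:
                    mi += pxy * np.log2(pxy / (px * py))
        return mi

    def select_peaks(intensities, labels, top=5):
        """Rank m/z peaks (columns of `intensities`, one spectrum per row)
        by MI between their presence (intensity above the column median)
        and the sample label (e.g. which protein was deposited)."""
        scores = []
        for j in range(intensities.shape[1]):
            present = intensities[:, j] > np.median(intensities[:, j])
            scores.append(mutual_information(present, labels))
        return np.argsort(scores)[::-1][:top]
    ```

    A peak whose presence perfectly tracks the protein label scores 1 bit for a two-class problem, while an uninformative peak scores near zero.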

  9. Sophisticated visualization algorithms for analysis of multidimensional experimental nuclear spectra

    International Nuclear Information System (INIS)

    Morhac, M.; Kliman, J.; Matousek, V.; Turzo, I.

    2004-01-01

    This paper describes graphical models for the visualization of 2-, 3-, and 4-dimensional scalar data used in the nuclear data acquisition, processing, and visualization system developed at the Institute of Physics, Slovak Academy of Sciences. It focuses on the presentation of nuclear spectra (histograms); however, it can also be applied successfully to the visualization of arrays of other data types. In the paper we present conventional as well as newly developed surface and volume rendering visualization techniques (Authors)

  10. Harmonizing the Writing Process with Music Training Techniques

    Science.gov (United States)

    Riecken, Nancy

    2009-01-01

    Can music help students become better thinkers and writers? Over the past three years, the author has incorporated some basic music training techniques in her classrooms to help her teach the writing process to students who would otherwise click her off. The students have developed clearer thinking and organizational skills, and have increased…

  11. Nanosilver conductive lines made by spray coating and aerosol jet printing technique

    Science.gov (United States)

    Krzeminski, Jakub; Wroblewski, Grzegorz; Dybowska-Sarapuk, Lucja; Lepak, Sandra; Jakubowska, Malgorzata

    2017-08-01

    Printed electronics, even though printing techniques have been known for a long time, is gaining in importance. The possibility of making electronic circuits on flexible, large-area substrates with an efficient and cheap technology makes it attractive for the electronics industry. Spray coating, as one of the printing methods, additionally provides the chance to print on non-flat substrates with complicated shapes. Although spray coating is mostly used to print large pads, it is feasible to spray separate conductive lines, both as quickly produced prototypes and as fully manufactured circuits. Our work presents lines printed directly with the spray coating technique. A self-made ink was used for the printing process. We tested three different approaches to line formation and compared them in terms of line edge, resistivity, and thickness. Line profiles provide information about the roughness and the line size. Finally, we show an aerosol-jet-printed meander to give an overview of this similar, but more sophisticated, technique.

  12. Electron beam application in industrial polymer processing - Review and outlook

    International Nuclear Information System (INIS)

    Gielenz, G.

    2001-01-01

    Full text: The various established industrial electron beam (EB) applications related to polymers, and their corresponding material and process fundamentals, are discussed in this paper. The basics of today's most common irradiation processes, which are, for continuous stranded products: Single Beam, Rotary Technique; Single Beam, Multiple Pass Technique; Dual Beam, Multiple Pass Technique; and Single Beam, Single (Multiple) Pass Technique by means of a conveyor belt or cart system for discontinuous goods, are briefly addressed together with some typical examples for illustration. Some comments on the (dis)advantages and the future economic optimization potential that EB processing technologies could provide to the respective polymer processing industries are presented with respect to materials, accelerator equipment, and related product handling hardware. The future competitiveness of irradiation crosslinking technologies, which offer numerous advantages in comparison to conventional CV curing and silane crosslinking technologies, can only be maintained by increasing their economic attractiveness, namely: high processing speeds, high material throughput at low production costs, and comparatively low capital investment in the hardware involved. Other, more sophisticated irradiation process proposals found in the literature and in respective patent publications will be briefly presented, although all of them more or less lack practical evidence of economic and reliable industrial application. Finally, the author's vision of a more efficient, economical EB-process design, combining quasi state-of-the-art EB-equipment components with a novel beam deflection system to practically achieve a 'Dual Beam, Four Side Crossfiring Process' for continuous strand products, will be presented. (author)

  13. Flame analysis using image processing techniques

    Science.gov (United States)

    Her Jie, Albert Chang; Zamli, Ahmad Faizal Ahmad; Zulazlan Shah Zulkifli, Ahmad; Yee, Joanne Lim Mun; Lim, Mooktzeng

    2018-04-01

    This paper presents image processing techniques with the use of fuzzy logic and a neural network approach to perform flame analysis. Flame diagnostics is important in industry to extract relevant information from flame images. Experimental tests were carried out in a model industrial burner at different flow rates. Flame features such as luminous and spectral parameters are extracted using image processing and the Fast Fourier Transform (FFT). Flame images are acquired using a FLIR infrared camera. Non-linearities such as thermo-acoustic oscillations and background noise affect the stability of the flame. Flame velocity is one of the important characteristics that determine flame stability. In this paper, an image processing method is proposed to determine flame velocity. The power spectral density (PSD) graph is a good tool for vibration analysis, from which flame stability can be approximated. However, a more intelligent diagnostic system is needed to automatically determine flame stability. In this paper, flame features at different flow rates are compared and analyzed. The selected flame features are used as inputs to the proposed fuzzy inference system to determine flame stability. A neural network is used to test the performance of the fuzzy inference system.
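    The PSD step described above can be sketched with a plain periodogram (illustrative, not the authors' implementation); the dominant peak of a flame-luminosity time series approximates the oscillation frequency used to judge stability.

    ```python
    import numpy as np

    def power_spectral_density(signal, fs):
        """One-sided PSD estimate of a flame-luminosity time series via the
        windowed periodogram; the dominant peak approximates the flame
        oscillation frequency."""
        n = len(signal)
        windowed = (signal - signal.mean()) * np.hanning(n)  # detrend + taper
        spectrum = np.abs(np.fft.rfft(windowed)) ** 2
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)
        psd = spectrum / (fs * np.sum(np.hanning(n) ** 2))
        return freqs, psd
    ```

    In practice one would average periodograms over segments (Welch's method) to reduce variance before feeding the spectral features to the fuzzy inference system.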

  14. Sophisticated Approval Voting, Ignorance Priors, and Plurality Heuristics: A Behavioral Social Choice Analysis in a Thurstonian Framework

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R.; Tsetlin, Ilia

    2007-01-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two…

  15. Sound Is Sound: Film Sound Techniques and Infrasound Data Array Processing

    Science.gov (United States)

    Perttu, A. B.; Williams, R.; Taisne, B.; Tailpied, D.

    2017-12-01

    A multidisciplinary collaboration between earth scientists and a sound designer/composer was established to explore the possibilities of audification analysis of infrasound array data. Through the process of audification of the infrasound we began to experiment with techniques and processes borrowed from cinema to manipulate the noise content of the signal. The results of this posed the question: "Would the accuracy of infrasound data array processing be enhanced by employing these techniques?". So a new area of research was born from this collaboration and highlights the value of these interactions and the unintended paths that can occur from them. Using a reference event database, infrasound data were processed using these new techniques and the results were compared with existing techniques to assess if there was any improvement to detection capability for the array. With just under one thousand volcanoes, and a high probability of eruption, Southeast Asia offers a unique opportunity to develop and test techniques for regional monitoring of volcanoes with different technologies. While these volcanoes are monitored locally (e.g. seismometer, infrasound, geodetic and geochemistry networks) and remotely (e.g. satellite and infrasound), there are challenges and limitations to the current monitoring capability. Not only is there a high fraction of cloud cover in the region, making plume observation more difficult via satellite, there have been examples of local monitoring networks and telemetry being destroyed early in the eruptive sequence. The success of local infrasound studies to identify explosions at volcanoes, and calculate plume heights from these signals, has led to an interest in retrieving source parameters for the purpose of ash modeling with a regional network independent of cloud cover.

  16. Development of radiation techniques in the 80's

    International Nuclear Information System (INIS)

    Wiesner, L.

    1990-01-01

    The application of radiation for the purpose of sterilisation in the medical field and for the enhancement of material properties, particularly for polymer products, has been in operation for decades. Electron accelerators and gamma radiation devices are the radiation sources. The technology is clearly concentrated in Asia, especially Japan, (surface coatings and crosslinking for cable insulators). In industry, the process is commonly used in curing, drying, crosslinking, grafting and vulcanisation. Radiation technology is an increasingly important factor in the low-cost production of top quality advanced products for sophisticated areas of manufacturing. It has already been introduced in high-tech applications in the manufacture of megabit chips, which are undoubtedly one of the main reasons for Japan's leading position in that field. The Japanese industry has already set out along the road towards the manufacture of ultra-highly integrated circuits, which open up completely new opportunities in the field of data and information processing. This has been made possible by Japanese mastery of the technique of building and operating electron accelerators to provide synchrotron beams. (orig./DG) [de

  17. Power system stabilizers based on modern control techniques

    Energy Technology Data Exchange (ETDEWEB)

    Malik, O P; Chen, G P; Zhang, Y; El-Metwally, K [Calgary Univ., AB (Canada). Dept. of Electrical and Computer Engineering

    1994-12-31

    Developments in digital technology have made it feasible to develop and implement improved controllers based on sophisticated control techniques. Power system stabilizers based on adaptive control, fuzzy logic and artificial neural networks are being developed. Each of these control techniques possesses unique features and strengths. In this paper, the relative performance of power system stabilizers based on adaptive control, fuzzy logic and a neural network, both in simulation studies and in real-time tests on a physical model of a power system, is presented and compared to that of a fixed-parameter conventional power system stabilizer. (author) 16 refs., 45 figs., 3 tabs.

  18. Towards a Business Process Modeling Technique for Agile Development of Case Management Systems

    Directory of Open Access Journals (Sweden)

    Ilia Bider

    2017-12-01

    Full Text Available A modern organization needs to adapt its behavior to changes in the business environment by changing its Business Processes (BPs) and the corresponding Business Process Support (BPS) systems. One way of achieving such adaptability is via separation of the system code from the process description/model by applying the concept of executable process models. Furthermore, to ease the introduction of changes, such a process model should separate different perspectives, for example the control-flow, human resources, and data perspectives, from each other. In addition, when developing a completely new process, it should be possible to start with a reduced process model to get a BPS system quickly running, and then continue to develop it in an agile manner. This article consists of two parts. The first sets requirements on modeling techniques that could be used in tools that support agile development of BPs and BPS systems. The second part suggests a business process modeling technique that makes it possible to start modeling with the data/information perspective, which is appropriate for processes supported by Case or Adaptive Case Management (CM/ACM) systems. In a model produced by this technique, called a data-centric business process model, a process instance/case is defined as a sequence of states in a specially designed instance database, while the process model is defined as a set of rules that set restrictions on allowed states and transitions between them. The article details the background of the project of developing the data-centric process modeling technique, presents an outline of the structure of the model, and gives formal definitions for a substantial part of the model.
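    A minimal sketch of such a data-centric model (illustrative, not the article's formal definition): the case is a sequence of states of an instance database, and the model is a set of rules restricting allowed states and transitions. The claim-handling example and all names are hypothetical.

    ```python
    class CaseModel:
        """Process model as rules over states and transitions; a case is
        the history of states of its instance database."""

        def __init__(self, state_rules, transition_rules):
            self.state_rules = state_rules            # predicates a state must satisfy
            self.transition_rules = transition_rules  # predicates on (old, new)

        def step(self, case, changes):
            new_state = {**case[-1], **changes}       # next state of the instance DB
            if not all(rule(new_state) for rule in self.state_rules):
                raise ValueError("state violates model")
            if not all(rule(case[-1], new_state) for rule in self.transition_rules):
                raise ValueError("transition not allowed")
            case.append(new_state)                    # the case is the state history
            return case

    # Example: a claim may only be 'decided' once an assessment exists.
    model = CaseModel(
        state_rules=[lambda s: s["status"] in {"open", "assessed", "decided"}],
        transition_rules=[
            lambda old, new: not (new["status"] == "decided"
                                  and old.get("assessment") is None),
        ],
    )
    ```

    Starting from a reduced rule set and adding rules incrementally matches the agile, data-first development style the article argues for.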

  19. Comparative study of resist stabilization techniques for metal etch processing

    Science.gov (United States)

    Becker, Gerry; Ross, Matthew F.; Wong, Selmer S.; Minter, Jason P.; Marlowe, Trey; Livesay, William R.

    1999-06-01

    This study investigates resist stabilization techniques as they are applied to a metal etch application. The techniques that are compared are conventional deep-UV/thermal stabilization, or UV bake, and electron beam stabilization. The electron beam tool used in this study, an ElectronCure system from AlliedSignal Inc., Electron Vision Group, utilizes a flood electron source and a non-thermal process. These stabilization techniques are compared with respect to a metal etch process. In this study, two types of resist are considered for stabilization and etch: a g/i-line resist, Shipley SPR-3012, and an advanced i-line, Shipley SPR 955-Cm. For each of these resists, the effects of stabilization on resist features are evaluated by post-stabilization SEM analysis. Etch selectivity in all cases is evaluated by using a timed metal etch and measuring the resist remaining relative to the total metal thickness etched. Etch selectivity is presented as a function of stabilization condition. Analyses of the effects of the type of stabilization on this method of selectivity measurement are also presented. SEM analysis was also performed on the features after a complete etch process and is detailed as a function of stabilization condition. Post-etch cleaning is also an important factor impacted by pre-etch resist stabilization. Results of post-etch cleaning are presented for both stabilization methods. SEM inspection is also detailed for the metal features after resist removal processing.

  20. Sampling phased array a new technique for signal processing and ultrasonic imaging

    OpenAIRE

    Bulavinov, A.; Joneit, D.; Kröning, M.; Bernus, L.; Dalichow, M.H.; Reddy, K.M.

    2006-01-01

    Different signal processing and image reconstruction techniques are applied in ultrasonic non-destructive material evaluation. In recent years, rapid development in the fields of microelectronics and computer engineering has led to wide application of phased array systems. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for Non-Destructive Testing. It realizes a unique approach to the measurement and processing of ultrasonic signals. The sampling...

  1. Using of Natural Language Processing Techniques in Suicide Research

    Directory of Open Access Journals (Sweden)

    Azam Orooji

    2017-09-01

    Full Text Available It is estimated that each year many people, most of whom are teenagers and young adults, die by suicide worldwide. Suicide receives special attention, with many countries developing national strategies for prevention. Since more and more medical information is available in text form, preventing the growing trend of suicide in communities requires analyzing various textual resources, such as patient records, information on the web, or questionnaires. For this purpose, this study systematically reviews recent studies related to the use of natural language processing techniques in the area of the health of people who have completed suicide or are at risk. After electronically searching the PubMed and ScienceDirect databases and screening of the articles by two reviewers, 21 articles matched the inclusion criteria. This study revealed that, if a suitable data set is available, natural language processing techniques are well suited for various types of suicide-related research.

  2. Purex process modelling - do we really need speciation data?

    International Nuclear Information System (INIS)

    Taylor, R.J.; May, I.

    2001-01-01

    The design of reprocessing flowsheets has become a complex process requiring sophisticated simulation models, containing both chemical and engineering features. Probably the most basic chemical data needed is the distribution of process species between solvent and aqueous phases at equilibrium, which is described by mathematical algorithms. These algorithms have been constructed from experimentally determined distribution coefficients over a wide range of conditions. Distribution algorithms can either be empirical fits of the data or semi-empirical equations, which describe extraction as functions of process variables such as temperature, activity coefficients, uranium loading, etc. Speciation data is not strictly needed in the accumulation of distribution coefficients, which are simple ratios of analyte concentration in the solvent phase to that in the aqueous phase. However, as we construct process models of increasing complexity, speciation data becomes much more important both to raise confidence in the model and to understand the process chemistry at a more fundamental level. UV/vis/NIR spectrophotometry has been our most commonly used speciation method since it is a well-established method for the analysis of actinide ion oxidation states in solution at typical process concentrations. However, with the increasing availability to actinide science of more sophisticated techniques (e.g. NMR; EXAFS) complementary structural information can often be obtained. This paper will, through examples, show how we have used spectrophotometry as a primary tool in distribution and kinetic experiments to obtain data for process models, which are then validated through counter-current flowsheet trials. It will also discuss how spectrophotometry and other speciation methods are allowing us to study the link between molecular structure and extraction behaviour, showing how speciation data really is important in PUREX process modelling. (authors)
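The distribution coefficients described above are plain phase-concentration ratios, and empirical distribution algorithms are fits of such data against process variables. A minimal sketch of that arithmetic (all numbers are made up for illustration; they are not PUREX data):

```python
# Distribution coefficient D = analyte concentration in the solvent phase
# divided by that in the aqueous phase, plus a simple empirical fit of D
# against one process variable (here, assumed nitric acid molarity).
import numpy as np

def distribution_coefficient(c_solvent, c_aqueous):
    """D as a simple ratio of phase concentrations."""
    return c_solvent / c_aqueous

acidity = np.array([0.5, 1.0, 2.0, 3.0])   # mol/L HNO3 (assumed values)
D = np.array([0.8, 1.9, 4.2, 6.5])         # measured D values (assumed)

# Empirical fit: log D linear in log acidity (one common empirical form)
slope, intercept = np.polyfit(np.log(acidity), np.log(D), 1)
predict = lambda a: np.exp(intercept) * a ** slope
```

Semi-empirical algorithms replace the power-law fit with functions of temperature, activity coefficients, uranium loading, and so on, but the underlying data are still these ratios.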

  3. Purification through Emotions: The Role of Shame in Plato's "Sophist" 230B4-E5

    Science.gov (United States)

    Candiotto, Laura

    2018-01-01

    This article proposes an analysis of Plato's "Sophist" (230b4--e5) that underlines the bond between the logical and the emotional components of the Socratic "elenchus", with the aim of depicting the social valence of this philosophical practice. The use of emotions characterizing the 'elenctic' method described by Plato is…

  4. Sampling phased array - a new technique for ultrasonic signal processing and imaging

    OpenAIRE

    Verkooijen, J.; Boulavinov, A.

    2008-01-01

    Over the past 10 years, improvements in the fields of microelectronics and computer engineering have led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called 'Sampling Phased Array', has been developed at the Fraunhofer Institute for Non-Destructive Testing [1]. It realises a unique approach to the measurement and processing of ultrasonic signals. Th...

  5. Condition monitoring of a check valve for nuclear power plants by means of acoustic emission technique

    International Nuclear Information System (INIS)

    Lee, Min Rae; Lee, Jun Hyun; Kim, Jung Tack; Kim, Jung Soo; Luk, V. K.

    2003-01-01

    This work was performed in support of the International Nuclear Energy Research Initiative (INERI) program, which was to develop and demonstrate advanced sensor and computational technology for on-line monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). The primary objective of this work is to investigate advanced condition monitoring systems based on acoustic emission detection that can provide timely detection of check valve degradation and service aging so that maintenance/replacement can be performed prior to loss of safety function. The research is focused on the capability of the AE technique to provide diagnostic information useful in determining check valve aging and degradation, check valve failures, and undesirable operating modes. This work also includes the investigation and adaptation of several advanced sensor technologies, such as accelerometers and advanced ultrasonic techniques. In addition, this work will develop sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms for detecting check valve degradation.

  6. Condition monitoring of a check valve for nuclear power plants by means of acoustic emission technique

    International Nuclear Information System (INIS)

    Lee, M. R.; Lee, J. H.; Kim, J. T.; Kim, J. S.; Luk, V. K.

    2003-01-01

    This work was performed in support of the International Nuclear Energy Research Initiative (INERI) program, which was to develop and demonstrate advanced sensor and computational technology for on-line monitoring of the condition of components, structures, and systems in advanced and next-generation nuclear power plants (NPPs). The primary objective of this work is to investigate advanced condition monitoring systems based on acoustic emission detection that can provide timely detection of check valve degradation and service aging so that maintenance/replacement can be performed prior to loss of safety function. The research is focused on the capability of the AE technique to provide diagnostic information useful in determining check valve aging and degradation, check valve failures, and undesirable operating modes. This work also includes the investigation and adaptation of several advanced sensor technologies, such as accelerometers and advanced ultrasonic techniques. In addition, this work will develop sophisticated signal processing, noise reduction, and pattern recognition techniques and algorithms for detecting check valve degradation.

  7. Automated synthesis of image processing procedures using AI planning techniques

    Science.gov (United States)

    Chien, Steve; Mortensen, Helen

    1994-01-01

    This paper describes the Multimission VICAR (Video Image Communication and Retrieval) Planner (MVP) (Chien 1994) system, which uses artificial intelligence planning techniques (Iwasaki & Friedland, 1985, Pemberthy & Weld, 1992, Stefik, 1981) to automatically construct executable complex image processing procedures (using models of the smaller constituent image processing subprograms) in response to image processing requests made to the JPL Multimission Image Processing Laboratory (MIPL). The MVP system allows the user to specify the image processing requirements in terms of the various types of correction required. Given this information, MVP derives unspecified required processing steps and determines appropriate image processing programs and parameters to achieve the specified image processing goals. This information is output as an executable image processing program which can then be executed to fill the processing request.
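The core planning idea — modeling each subprogram as an operator with preconditions and effects, and chaining operators until the requested processing goals are achieved — can be sketched as follows. The operator names and conditions are illustrative assumptions, not MVP's actual models.

```python
# Minimal planning sketch: each image processing subprogram is an operator
# with preconditions ("pre") and effects ("add"); a plan is built by greedy
# forward chaining until the goal conditions are satisfied.

OPERATORS = [
    {"name": "radiometric_correction", "pre": {"raw"},
     "add": {"radiometrically_corrected"}},
    {"name": "geometric_correction", "pre": {"radiometrically_corrected"},
     "add": {"geometrically_corrected"}},
    {"name": "mosaic", "pre": {"geometrically_corrected"},
     "add": {"mosaicked"}},
]

def plan(initial, goals):
    """Greedy forward state-space search over the operator models."""
    state, steps = set(initial), []
    while not goals <= state:
        applicable = [op for op in OPERATORS
                      if op["pre"] <= state and not op["add"] <= state]
        if not applicable:
            raise ValueError("no plan found")
        op = applicable[0]
        steps.append(op["name"])
        state |= op["add"]
    return steps

print(plan({"raw"}, {"mosaicked"}))
# -> ['radiometric_correction', 'geometric_correction', 'mosaic']
```

A real planner searches rather than chaining greedily, and MVP additionally derives unspecified required steps and program parameters, but the operator-model structure is the same.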

  8. Monitoring of Lactic Fermentation Process by Ultrasonic Technique

    Science.gov (United States)

    Alouache, B.; Touat, A.; Boutkedjirt, T.; Bennamane, A.

    Non-destructive control using ultrasound techniques has become of great importance in the food industry. In this work, ultrasound has been used for quality control and for monitoring the fermentation stages of yogurt, which is a highly consumed product. In contrast to physico-chemical methods, where the measurement instruments are introduced directly into the sample, ultrasound techniques have the advantage of being non-destructive and contactless, thus reducing the risk of contamination. Results obtained in this study by using ultrasound seem to be in good agreement with those obtained by physico-chemical methods such as acidity measurement using a pH meter. This leads us to conclude that the ultrasound method may be an alternative for healthy control of the yogurt fermentation process.

  9. Development of a technique for three-dimensional image reconstruction from emission computed tomograms (ECT)

    International Nuclear Information System (INIS)

    Gerischer, R.

    1987-01-01

    The described technique for three-dimensional image reconstruction from ECT sections is based on a simple procedure, which can be carried out with the aid of any standard-type computer used in nuclear medicine and requires no sophisticated arithmetic approach. (TRV) [de

  10. Application of signal processing techniques for islanding detection of distributed generation in distribution network: A review

    International Nuclear Information System (INIS)

    Raza, Safdar; Mokhlis, Hazlie; Arof, Hamzah; Laghari, J.A.; Wang, Li

    2015-01-01

    Highlights: • Pros & cons of conventional islanding detection techniques (IDTs) are discussed. • Signal processing techniques (SPTs) ability in detecting islanding is discussed. • SPTs ability in improving performance of passive techniques are discussed. • Fourier, s-transform, wavelet, HHT & tt-transform based IDTs are reviewed. • Intelligent classifiers (ANN, ANFIS, Fuzzy, SVM) application in SPT are discussed. - Abstract: High penetration of distributed generation resources (DGR) in the distribution network provides many benefits in terms of high power quality, efficiency, and low carbon emissions in the power system. However, efficient islanding detection and immediate disconnection of DGR is critical in order to avoid equipment damage, grid protection interference, and personnel safety hazards. Islanding detection techniques are mainly classified into remote, passive, active, and hybrid techniques. Of these, passive techniques are more advantageous due to lower power quality degradation, lower cost, and widespread usage by power utilities. However, the main limitation of these techniques is that they possess large non-detection zones and require threshold setting. Various signal processing techniques and intelligent classifiers have been used to overcome the limitations of passive islanding detection. Signal processing techniques, in particular, are adopted due to their versatility, stability, cost effectiveness, and ease of modification. This paper presents a comprehensive overview of signal processing techniques used to improve common passive islanding detection techniques. A performance comparison between the signal-processing-based islanding detection techniques and existing techniques is also provided. Finally, this paper outlines the relative advantages and limitations of the signal processing techniques in order to provide basic guidelines for researchers and field engineers in determining the best method for their system
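As a toy illustration of how a signal processing technique can support passive islanding detection, the sketch below estimates the grid frequency of a voltage waveform via the FFT and flags an excessive deviation from nominal. The signal parameters and threshold are assumptions for the example, not values from the review.

```python
# FFT-based frequency estimation as a passive islanding indicator: after
# islanding, the local frequency typically drifts away from the nominal
# grid frequency, which a simple threshold can detect.
import numpy as np

def estimate_frequency(v, fs):
    """Estimate the dominant frequency of v (sampled at fs) via the FFT peak."""
    spectrum = np.abs(np.fft.rfft(v))
    freqs = np.fft.rfftfreq(len(v), 1.0 / fs)
    return freqs[np.argmax(spectrum[1:]) + 1]   # skip the DC bin

def islanded(v, fs, nominal=50.0, limit=0.5):
    """Flag islanding when frequency deviation exceeds `limit` Hz (assumed)."""
    return abs(estimate_frequency(v, fs) - nominal) > limit

fs = 5000.0
t = np.arange(0, 1.0, 1.0 / fs)
healthy = np.sin(2 * np.pi * 50.0 * t)
drifted = np.sin(2 * np.pi * 51.5 * t)   # frequency drift after islanding
print(islanded(healthy, fs), islanded(drifted, fs))  # False True
```

The techniques surveyed in the paper (s-transform, wavelets, HHT, tt-transform) replace the plain FFT with representations that shrink the non-detection zone and ease threshold selection.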

  11. Process acceptance and adjustment techniques for Swiss automatic screw machine parts. Final report

    International Nuclear Information System (INIS)

    Robb, J.M.

    1976-01-01

    Product tolerance requirements for small, cylindrical piece parts produced on Swiss automatic screw machines have progressed to the reliability limits of inspection equipment. The miniature size, configuration, and tolerance requirements (plus or minus 0.0001 in. (0.00254 mm)) of these parts preclude the use of screening techniques to accept product or adjust processes during setup and production runs; therefore, existing means of product acceptance and process adjustment must be refined or new techniques must be developed. The purpose of this endeavor has been to determine the benefits gained through the implementation of a process acceptance technique (PAT) in Swiss automatic screw machine processes. PAT is a statistical approach developed for the purpose of accepting product and centering processes for parts produced by selected, controlled processes. Through this endeavor a determination has been made of the conditions under which PAT can benefit a controlled process and of some specific types of screw machine processes to which PAT could be applied. However, it was also determined that PAT, if used indiscriminately, may become a record-keeping burden when applied to more than one dimension at a given machining operation

  12. Specific features of NDT data and processing algorithms: new remedies to old ills; Caracteristiques specifiques des donnees de controle non destructif et algorithmes de traitement: nouveaux remedes aux vielles douleurs

    Energy Technology Data Exchange (ETDEWEB)

    Georgel, B

    1994-12-31

    Non-destructive testing data from in-service inspections have specific features that require the most sophisticated techniques of signal and image processing. Each step in the overall information extraction process must be optimized by using recent approaches such as data decomposition and modeling, compression, sensor fusion, and knowledge-based systems. This can be achieved by means of wavelet transforms, inverse problem formulations, standard compression algorithms, combined detection and estimation, neural networks, and expert systems. These techniques are briefly presented through a number of Electricité de France applications or through recent literature results. (author). 1 fig., 20 refs.

  13. Commercial Applications of X Ray Spectrometric Techniques

    Energy Technology Data Exchange (ETDEWEB)

    Wegrzynek, D., E-mail: D.Wedgrznek@iaea.org [International Atomic Energy Agency, The IAEA Laboratories, Seibersdorf, Vienna (Austria)

    2013-07-15

    In the 21st century, the X-ray fluorescence (XRF) technique is widely used in process control, industrial applications, and routine elemental analysis. The technique has a multielement capability, detecting elements with Z ≥ 10, with a few instruments also capable of detecting elements with Z ≥ 5. It is characterized by a non-destructive analysis process and relatively good detection limits, typically one part per million, for a wide range of elements. The first commercial XRF instruments were introduced to the market about 50 years ago. They were wavelength dispersive X-ray fluorescence (WDXRF) spectrometers utilizing Bragg's law and reflection on crystal lattices for sequential elemental analysis of sample composition. The advances made in radiation detector technology, especially the introduction of semiconductor detectors, improvements in signal processing electronics, and the availability and exponential growth of the personal computer market led to the invention of the energy dispersive X-ray fluorescence (EDXRF) technique. EDXRF is more cost-effective than WDXRF. It also allows for designing compact instruments. Such instruments can be easily tailored to the needs of different customers, integrated with industrial installations, and also miniaturized for the purpose of in-situ applications. The versatility of the technique has been confirmed in a spectacular way by using the XRF and X-ray spectrometric techniques, among a few others, during the NASA and ESA missions in the search for evidence of life and the presence of water on the surface of Mars. The XRF technique has achieved its strong position within the atomic spectroscopy group of analytical techniques not only due to its versatility but also due to relatively low running costs, as compared to commonly used methods, e.g., atomic absorption spectrometry (AAS) or inductively coupled plasma atomic emission/mass spectrometry (ICP-AES/MS). Presently, the XRF technique together with X ray

  14. Microstructure characterisation of processed fruits and vegetables by complementary imaging techniques

    NARCIS (Netherlands)

    Voda, A.; Nijsse, J.; Dalen, van G.; As, van H.; Duynhoven, van J.P.M.

    2011-01-01

    The assessment of the microstructural impact of processing on fruits and vegetables is a prerequisite for understanding the relation between processing and textural quality. By combining complementary imaging techniques, one can obtain a multi scale and real-time structural view on the impact of

  15. PROCESS PERFORMANCE EVALUATION USING HISTOGRAM AND TAGUCHI TECHNIQUE IN LOCK MANUFACTURING COMPANY

    Directory of Open Access Journals (Sweden)

    Hagos Berhane

    2013-12-01

    Full Text Available Process capability analysis is a vital part of an overall quality improvement program. It is a technique that has application in many segments of the product cycle, including product and process design, vendor sourcing, production or manufacturing planning, and manufacturing. Frequently, a process capability study involves observing a quality characteristic of the product. Since this information usually pertains to the product rather than the process, such an analysis should strictly speaking be called a product analysis study. A true process capability study in this context would involve collecting data that relate to process parameters so that remedial actions can be identified on a timely basis. The present study attempts to analyze the performance of drilling, pressing, and reaming operations carried out in the manufacturing of two major lock components, viz. the handle and the lever plate, at Gaurav International, Aligarh (India). The data collected for depth of hole on handle, central hole diameter, and key hole diameter are used to construct histograms. Next, the information available in the frequency distribution table, the process mean, the process capability from calculations, and the specification limits provided by the manufacturing concern are used with the Taguchi technique. The data obtained from the histograms and the Taguchi technique combined are used to evaluate the performance of the manufacturing process. Results of this study indicated that the performance of all the processes used to produce depth of hole on handle, key hole diameter, and central hole diameter is potentially incapable, as the process capability indices are found to be 0.54, 0.54, and 0.76 respectively. The numbers of nonconforming parts expressed in terms of parts per million (ppm) that have fallen out of the specification limits are found to be 140000, 26666.66, and 146666.66 for depth of hole on handle, central hole diameter, and key hole diameter respectively. As a result, the total loss incurred
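The capability arithmetic used in such studies can be sketched directly: Cp compares the specification spread to the natural process spread, and the expected ppm nonconforming follows from a normal model. The numbers below are illustrative assumptions, not the study's data.

```python
# Process capability index and expected ppm nonconforming under a normal
# model. Spec limits and process mean below are invented for illustration.
import math

def cp(usl, lsl, sigma):
    """Process capability index for a centered process: (USL-LSL)/(6*sigma)."""
    return (usl - lsl) / (6.0 * sigma)

def ppm_nonconforming(usl, lsl, mu, sigma):
    """Expected parts per million outside the spec limits (normal model)."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    frac_out = phi((lsl - mu) / sigma) + (1.0 - phi((usl - mu) / sigma))
    return 1e6 * frac_out

usl, lsl, mu = 10.3, 9.7, 10.0          # assumed spec limits and process mean
sigma = (usl - lsl) / (6.0 * 0.54)      # sigma chosen so that Cp = 0.54
print(cp(usl, lsl, sigma))              # ~0.54
print(ppm_nonconforming(usl, lsl, mu, sigma))  # on the order of 1e5 ppm
```

A Cp below 1 means the natural ±3σ spread is wider than the specification band, which is why the study classifies the three operations as potentially incapable.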

  16. Effects of processing techniques on the radioactive contamination of food

    International Nuclear Information System (INIS)

    Bovard, P.; Delmas, J.; Grauby, A.

    Following contamination of cultures of rice, grapes and various vegetables by 90Sr and 137Cs, the effect of processing and cooking techniques on the contamination of the foodstuff was investigated [fr

  17. Analysis of pulse-shape discrimination techniques for BC501A using GHz digital signal processing

    International Nuclear Information System (INIS)

    Rooney, B.D.; Dinwiddie, D.R.; Nelson, M.A.; Rawool-Sullivan, Mohini W.

    2001-01-01

    A comparison study of pulse-shape analysis techniques was conducted for a BC501A scintillator using digital signal processing (DSP). In this study, output signals from a preamplifier were input directly into a 1 GHz analog-to-digital converter. The digitized data obtained with this method was post-processed for both pulse-height and pulse-shape information. Several different analysis techniques were evaluated for neutron and gamma-ray pulse-shape discrimination. It was surprising that one of the simplest and fastest techniques resulted in some of the best pulse-shape discrimination results. This technique, referred to here as the Integral Ratio technique, was able to effectively process several thousand detector pulses per second. This paper presents the results and findings of this study for various pulse-shape analysis techniques with digitized detector signals.
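The Integral Ratio idea can be sketched on synthetic pulses: neutron pulses in BC501A carry a larger fraction of their charge in the slow (tail) component, so the ratio of the tail integral to the total integral separates neutrons from gamma rays. The decay constants, component fractions, and tail window below are assumptions for illustration, not values from the study.

```python
# Integral Ratio pulse-shape discrimination on synthetic digitized pulses:
# compute tail integral / total integral and compare particle types.
import numpy as np

def integral_ratio(pulse, tail_start):
    """Tail-to-total charge ratio of a digitized pulse."""
    return pulse[tail_start:].sum() / pulse.sum()

t = np.arange(200)  # samples at 1 GHz -> a 200 ns window (assumed)
def pulse(fast_frac, tau_fast=5.0, tau_slow=50.0):
    """Two-component scintillation pulse: fast + slow exponentials."""
    return fast_frac * np.exp(-t / tau_fast) + (1 - fast_frac) * np.exp(-t / tau_slow)

gamma = pulse(0.95)    # mostly fast component
neutron = pulse(0.75)  # enhanced slow component

r_gamma = integral_ratio(gamma, tail_start=20)
r_neutron = integral_ratio(neutron, tail_start=20)
print(r_neutron > r_gamma)  # True: neutrons have the larger tail fraction
```

The appeal noted in the paper is that this requires only two running sums per pulse, which is why it can process thousands of pulses per second.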

  18. Vertically contacting ultrathin semiconductor nanomembranes by rolled-up metallic contacts incorporating selective etching techniques

    Energy Technology Data Exchange (ETDEWEB)

    Thurmer, Dominic J.; Bof Bufon, Carlos Cesar; Deneke, Christoph [IFW Dresden, Dresden (Germany); Schmidt, Oliver G. [IFW Dresden, Dresden (Germany); TU Chemnitz, Chemnitz (Germany)

    2011-07-01

    Merging modern self-assembly techniques with well-established top-down processing methods is paving the way for more sophisticated device generations in the future. Nanomembranes, composed of many different material classes, have already been shown to provide the necessary framework for a diverse range of structures and devices incorporating wrinkling, buckling, folding and rolling of thin films. In the past decade, an elegant symbiosis of bottom-up and top-down methods has emerged to fabricate hybrid layer systems incorporating the controlled release and rearrangement of inherently strained layers. Using selective III-V etchants in combination with inherently strained layers, we are able to fabricate structures that allow us to make contacts through single- and multi-material semiconductor nanomembranes, creating many devices in parallel on the original semiconductor substrate. We demonstrate this technique by creating hybrid superconducting junctions in which the semiconductor nanomembrane is sandwiched between two superconducting contacts. Using solely optical lithography techniques, we are able to form junctions with lateral dimensions of a few micrometers and a semiconductor barrier thickness down to 5 nm.

  19. Evaluation of alternative drying techniques for the earthworm flour processing

    Directory of Open Access Journals (Sweden)

    Laura Suárez Hernández

    2016-01-01

    Full Text Available Production of earthworm flour includes several steps, among which the most critical is the drying process, due to factors such as time and energy requirements. In addition, the information available about this process is quite limited. Thus, this work evaluated four drying techniques likely to be implemented by earthworm farmers (lombricultores): sun drying, oven drying, tunnel drying and microwave-assisted drying. Drying kinetics were obtained for all techniques, and specific parameters were evaluated for each: drying tray material (stainless steel and ceramic) for sun drying; microwave power (30 %, 50 % and 80 %) and amount of material to be dried (72 and 100 g) for microwave-assisted drying; temperature (50, 65, 90 and 100 °C) for oven drying; and temperature (50 and 63 °C) and air speed (2.9 to 3.6 m/s) for tunnel drying. It was determined that the most efficient technique is the drying tunnel, because it allows the combination of heat transfer by conduction and convection and enables control of the operating parameters. Finally, nutritional analyses were performed on samples obtained by each drying technique. The crude protein contents for sun drying, microwave-assisted drying, oven drying and tunnel drying were 66.36 %, 67.91 %, 60.35 % and 62.33 % respectively, indicating that the drying method and operating parameters do not significantly affect the crude protein content.

  20. Analysis of the Growth Process of Neural Cells in Culture Environment Using Image Processing Techniques

    Science.gov (United States)

    Mirsafianf, Atefeh S.; Isfahani, Shirin N.; Kasaei, Shohreh; Mobasheri, Hamid

    Here we present an approach for processing neural cell images to analyze their growth process in a culture environment. We have applied several image processing techniques for: 1- environmental noise reduction, 2- neural cell segmentation, 3- neural cell classification based on their dendrites' growth conditions, and 4- neuron feature extraction and measurement (e.g., cell body area, number of dendrites, axon length, and so on). Due to the large amount of noise in the images, we have used feed-forward artificial neural networks to detect edges more precisely.
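Two of the listed steps, noise reduction and segmentation, can be sketched on a synthetic image using a mean filter, thresholding, and connected-component labeling. All parameters and the synthetic "cells" are illustrative; the paper's pipeline, including its neural-network edge detection, is more elaborate.

```python
# Noise reduction (3x3 mean filter) + segmentation (threshold and
# 4-connected component labeling) on a synthetic noisy cell image.
import numpy as np
from collections import deque

def mean_filter3(a):
    """3x3 mean filter via shifted sums (noise reduction)."""
    p = np.pad(a, 1, mode="edge")
    return sum(p[i:i + a.shape[0], j:j + a.shape[1]]
               for i in range(3) for j in range(3)) / 9.0

def label(mask):
    """4-connected component labeling via BFS (segmentation)."""
    labels = np.zeros(mask.shape, dtype=int)
    count = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue
        count += 1
        labels[start] = count
        q = deque([start])
        while q:
            r, c = q.popleft()
            for rr, cc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if (0 <= rr < mask.shape[0] and 0 <= cc < mask.shape[1]
                        and mask[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = count
                    q.append((rr, cc))
    return labels, count

# Synthetic image with two bright "cell bodies" plus Gaussian noise
rng = np.random.default_rng(0)
img = np.zeros((40, 40))
img[5:12, 5:12] = 1.0
img[25:33, 20:28] = 1.0
noisy = img + 0.15 * rng.standard_normal(img.shape)

mask = mean_filter3(noisy) > 0.5
labels, n_cells = label(mask)
areas = [int((labels == k).sum()) for k in range(1, n_cells + 1)]
print(n_cells)  # 2
```

Counting labeled components and summing their pixels gives the cell count and cell body areas; the dendrite and axon measurements in the paper require further morphological analysis.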

  1. Fluid Structure Interaction Techniques For Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Coupez, Thierry

    2007-05-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes found to be located inside the so-called immersed domain, each sub-domain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing the computation of a fill factor usable as in the VOF methodology. This technique, combined with the use of parallel computing, allows computing the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  2. Alarm processing system using AI techniques for nuclear power plant

    International Nuclear Information System (INIS)

    Yang, Joon On; Chang, Soon Heung

    1990-01-01

    An alarm processing system (APS) has been developed using artificial intelligence (AI) techniques. The alarms of nuclear power plants (NPPs) are classified into generalized and special alarms. The generalized alarms are further classified into global and local alarms. For each type of alarm, specific processing rules are applied to filter and suppress unnecessary and potentially misleading alarms. The local processing is based on 'model-based reasoning'. The global and special alarms are processed by using general cause-consequence check rules. The priorities of the alarms are determined according to the plant state and the consistencies between them
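The cause-consequence filtering idea can be sketched with a toy rule base: a consequence alarm is suppressed when its known cause is itself active, and the surviving alarms are ranked by priority. The alarm names, rules, and priorities below are invented for illustration, not taken from the APS.

```python
# Rule-based alarm filtering: suppress consequence alarms whose cause is
# already alarming, then rank survivors by an assumed priority table.

CAUSE_OF = {"low_steam_pressure": "reactor_trip",
            "low_generator_load": "reactor_trip"}

PRIORITY = {"reactor_trip": 1, "low_steam_pressure": 5, "low_generator_load": 6}

def process_alarms(active):
    """Filter consequence alarms, then sort the rest by priority."""
    survivors = [a for a in active if CAUSE_OF.get(a) not in active]
    return sorted(survivors, key=lambda a: PRIORITY.get(a, 99))

print(process_alarms({"reactor_trip", "low_steam_pressure"}))
# -> ['reactor_trip']: the consequence alarm is suppressed
```

In the APS the priorities depend on the plant state rather than a fixed table, and local alarms go through model-based reasoning instead of a static cause map.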

  3. Processing of dual-orthogonal cw polarimetric radar signals

    NARCIS (Netherlands)

    Babur, G.

    2009-01-01

    The thesis consists of two parts. The first part is devoted to the theory of dual-orthogonal polarimetric radar signals with continuous waveforms. The thesis presents a comparison of the signal compression techniques, namely correlation and de-ramping methods, for the dual-orthogonal sophisticated

  4. Criteria for assessing the quality of signal processing techniques for acoustic leak detection

    International Nuclear Information System (INIS)

    Prabhakar, R.; Singh, O.P.

    1990-01-01

    In this paper the criteria used in the first IAEA coordinated research programme to assess the quality of signal processing techniques for sodium boiling noise detection are highlighted. Signal processing techniques developed at the Indira Gandhi Centre for Atomic Research, which use new features sensitive to boiling and a new approach for achieving higher reliability of detection, are also presented. 10 refs, 3 figs, 2 tabs

  5. State of the art of toshiba maintenance techniques for reactor internals

    International Nuclear Information System (INIS)

    Maekawa, Osamu; Hattori, Yasuhiro; Sudo, Akira

    2002-01-01

    As the number of aged plants increases, maintaining the integrity of the reactor pressure vessel and reactor internals in aged plants has become an essential issue to ensure continued stable operation and achieve higher plant operability. A major issue with regard to reactor internals is stress corrosion cracks (SCCs). Laser-applying techniques have many features suitable for preventive maintenance work on reactor internals. Toshiba has developed various laser-applying preventive maintenance techniques and accumulated considerable field experience utilizing these techniques in various aged plants. Moreover, in view of the importance of confirming the soundness of reactor internals in aged plants, Toshiba has developed and applied sophisticated nondestructive testing techniques for this purpose. (author)

  6. Measurement techniques in dry-powdered processing of spent nuclear fuels

    International Nuclear Information System (INIS)

    Bowers, D. L.; Hong, J.-S.; Kim, H.-D.; Persiani, P. J.; Wolf, S. F.

    1999-01-01

    High-performance liquid chromatography (HPLC) with inductively coupled plasma mass spectrometry (ICPMS) detection, α-spectrometry (α-S), and γ-spectrometry (γ-S) were used for the determination of nuclide content in five samples excised from a high-burnup fuel rod taken from a pressurized water reactor (PWR). The samples were prepared for analysis by dissolution of dry-powdered samples. The measurement techniques required no separation of the plutonium, uranium, and fission products. The sample preparation and analysis techniques showed promise for in-line analysis of highly irradiated spent fuels in a dry-powdered process. The analytical results allowed the determination of fuel burnup based on 148Nd, Pu, and U content. A goal of this effort is to develop the HPLC-ICPMS method for direct fissile material accountancy in the dry-powdered processing of spent nuclear fuel
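The burnup determination from 148Nd rests on simple arithmetic: 148Nd is a stable fission product monitor, so the number of fissions follows from its atom count divided by an effective fission yield, and percent FIMA follows from that. A sketch with illustrative numbers (the yield value and atom fractions are assumptions, not the study's measurements):

```python
# Burnup (% FIMA) from the 148Nd fission monitor: fissions = N(148Nd)/yield,
# and burnup is fissions per initial heavy-metal atom.

Y_ND148 = 0.0170   # assumed effective cumulative fission yield of 148Nd

def burnup_fima(n_nd148, n_u, n_pu):
    """Percent fissions per initial metal atom from measured atom contents."""
    fissions = n_nd148 / Y_ND148
    initial_heavy = n_u + n_pu + fissions   # heavy atoms remaining + fissioned
    return 100.0 * fissions / initial_heavy

# Example atom contents (arbitrary relative units, invented)
print(burnup_fima(n_nd148=1.2e-3, n_u=0.94, n_pu=0.01))  # a few percent FIMA
```

The appeal of the method in this context is that the same HPLC-ICPMS run supplies the Nd, U, and Pu contents, so the burnup falls out of one measurement.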

  7. Sophisticated Online Learning Scheme for Green Resource Allocation in 5G Heterogeneous Cloud Radio Access Networks

    KAUST Repository

    Alqerm, Ismail

    2018-01-23

    5G is the upcoming evolution of the current cellular networks that aims at satisfying the future demand for data services. Heterogeneous cloud radio access networks (H-CRANs) are envisioned as a new trend of 5G that exploits the advantages of heterogeneous and cloud radio access networks to enhance spectral and energy efficiency. Remote radio heads (RRHs) are small cells utilized to provide high data rates for users with high quality of service (QoS) requirements, while a high-power macro base station (BS) is deployed for coverage maintenance and service of low-QoS users. Inter-tier interference between macro BSs and RRHs and energy efficiency are critical challenges that accompany resource allocation in H-CRANs. Therefore, we propose an efficient resource allocation scheme using online learning, which mitigates interference and maximizes energy efficiency while maintaining QoS requirements for all users. The resource allocation includes resource blocks (RBs) and power. The proposed scheme is implemented using two approaches: centralized, where the resource allocation is processed at a controller integrated with the baseband processing unit, and decentralized, where macro BSs cooperate to achieve the optimal resource allocation strategy. To foster the performance of such a sophisticated scheme with model-free learning, we consider users' priority in RB allocation and a compact state representation learning methodology to improve the speed of convergence and account for the curse of dimensionality during the learning process. The proposed scheme, including both approaches, is implemented using a software defined radio testbed. The obtained results and simulation results confirm that the proposed resource allocation solution in H-CRANs increases the energy efficiency significantly and maintains users' QoS.
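A heavily simplified, bandit-style sketch of the online-learning idea: an agent learns which (resource block, power) choices yield high energy efficiency under inter-tier interference. The environment model, reward, and all parameters are invented for illustration; the paper's scheme, with its compact state representation and user priorities, is far richer.

```python
# Stateless Q-learning over (RB, power) actions: the agent learns to avoid
# the resource block that suffers macro-BS interference because it yields
# lower energy efficiency. Everything below is a toy model.
import random

ACTIONS = [(rb, p) for rb in range(3) for p in (0.5, 1.0)]  # (RB, power level)
Q = {a: 0.0 for a in ACTIONS}
alpha, epsilon = 0.1, 0.2
random.seed(0)

def reward(action):
    """Energy-efficiency proxy: rate per unit power (invented model)."""
    rb, power = action
    interference = 0.8 if rb == 0 else 0.1   # RB 0 shared with the macro BS
    rate = power / (0.1 + interference)      # crude SINR-like rate proxy
    return rate / power

for _ in range(2000):  # epsilon-greedy online updates
    a = random.choice(ACTIONS) if random.random() < epsilon else max(Q, key=Q.get)
    Q[a] += alpha * (reward(a) - Q[a])

best_rb, best_power = max(Q, key=Q.get)
print(best_rb != 0)  # the agent learns to avoid the interfered RB
```

The real scheme learns over plant-like network states rather than a single bandit, which is exactly why the paper needs the compact state representation to tame the curse of dimensionality.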

  8. Studies on atom deceleration process by using the Zeeman-tuned technique

    International Nuclear Information System (INIS)

    Bagnato, V.S.

    1990-01-01

    The Zeeman-tuned technique for slowing an atomic beam of sodium was studied in detail. A new technique to study the deceleration, which consists in monitoring the fluorescence along the deceleration path, is used. This allows direct observation of the process and opens possibilities to investigate the adiabatic following of atoms in the magnetic field, among other important aspects of the process. With a single laser and some modification of the magnetic field profile it is possible to stop atoms outside the slower solenoid, which makes many experiments much simpler. A systematic study of the optical pumping effects and adiabatic following conditions allows the production of a very intense beam of slow atoms. (author)

  9. Novel process intensification techniques in solvent extraction. Contributed Paper IT-09

    International Nuclear Information System (INIS)

    Ghosh, S.K.

    2014-01-01

    Process intensification can be briefly described as any chemical engineering development that leads to substantially smaller, cleaner and more energy efficient technology. Process intensification in active nuclear material processing offers the additional benefit of reduced containment volume. The intensification can be realized either by use of novel equipment or by novel operating techniques. The feasibility of hollow fiber (HF) modules and microchannel (microfluidic) devices for process intensification of solvent extraction operations in the nuclear fuel cycle will be explained.

  10. A Monte Carlo Sampling Technique for Multi-phonon Processes

    Energy Technology Data Exchange (ETDEWEB)

    Hoegberg, Thure

    1961-12-15

    A sampling technique for selecting scattering angle and energy gain in Monte Carlo calculations of neutron thermalization is described. It is supposed that the scattering is separated into processes involving different numbers of phonons. The number of phonons involved is first determined. Scattering angle and energy gain are then chosen by using special properties of the multi-phonon term.
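
    A minimal sketch of this two-stage sampling scheme might look as follows; the weights and the stand-in energy/angle distributions are illustrative assumptions, not the multi-phonon cross-section data of the original report:

    ```python
    import math
    import random

    def sample_scattering(phonon_weights, phonon_energy_ev=0.025):
        """Two-stage sampling: first pick the number of phonons n with
        probability proportional to its partial cross-section weight, then
        draw an energy gain and scattering angle for the n-phonon term.
        The exponential energy distribution and isotropic angle are crude
        stand-ins for the term-specific distributions."""
        total = sum(phonon_weights)
        r = random.random() * total
        n, acc = len(phonon_weights), 0.0
        for i, w in enumerate(phonon_weights, start=1):
            acc += w
            if r <= acc:
                n = i
                break
        # Energy gain: sum of n sampled phonon energies (placeholder model).
        energy_gain = sum(random.expovariate(1.0 / phonon_energy_ev)
                          for _ in range(n))
        # Scattering angle: isotropic, i.e. cos(theta) uniform on [-1, 1].
        cos_theta = 2.0 * random.random() - 1.0
        return n, energy_gain, math.acos(cos_theta)
    ```

    Each call returns one sampled (phonon number, energy gain, angle) triple, mirroring the per-collision selection described in the abstract.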

  11. Process mining techniques: an application to time management

    Science.gov (United States)

    Khowaja, Ali Raza

    2018-04-01

    In any environment, people must ensure that all of their work is completed on time and to an acceptable quality. To realize the potential of process mining, one needs to understand such processes in detail. Personal information and communication have always been prominent issues on the internet; in daily life, information and communication tools capture daily schedules, location data, and environmental context, and social media applications make such data available through event logs, both for data analysis and for process analysis combining environmental and location information. Process mining can exploit these real-life processes with the help of event logs already available in such datasets, whether user-censored or user-labeled. These analyses can be used to redesign a user's workflow and to understand the underlying processes in more detail. One way to increase the quality of the processes we go through in our daily lives is to examine each process closely and, after analysis, make changes to obtain better results. In this work, we applied process mining techniques to a dataset combining seven different subjects, collected in Korea. The paper comments on the efficiency of the processes in the event logs with respect to time management.

  12. Electron sterilization validation techniques using the controlled depth of sterilization process

    International Nuclear Information System (INIS)

    Cleghorn, D.A.; Nablo, S.V.

    1990-01-01

    Many pharmaceutical products, especially parenteral drugs, cannot be sterilized with gamma rays or high energy electrons due to the concomitant product degradation. In view of the well-controlled electron energy spectrum available in modern electron processors, it is practical to deliver sterilizing doses over depths considerably less than those defining the thickness of blister-pack constructions or pharmaceutical containers. Because bremsstrahlung and X-ray production are minimized at these low electron energies and in these low-Z materials, very high ratios of electron dose to penetrating X-ray dose are possible for the application of the technique. Thin film dosimetric techniques have been developed utilizing radiochromic film in the 10-60 g/m² range for determining the surface dose distribution in occluded surface areas where direct electron illumination is not possible. Procedures for validation of the process using dried spore inoculum on the product as well as in good geometry are employed to determine the process lethality and its dependence on product surface geometry. Applications of the process to labile pharmaceuticals in glass and polystyrene syringes are reviewed. The process has been applied to the sterilization of commercial sterile products since 1987, and the advantages and natural limitations of the technique are discussed. (author)

  13. Lessons from the masters current concepts in astronomical image processing

    CERN Document Server

    2013-01-01

    There are currently thousands of amateur astronomers around the world engaged in astrophotography at increasingly sophisticated levels. Their ranks far outnumber professional astronomers doing the same and their contributions both technically and artistically are the dominant drivers of progress in the field today. This book is a unique collaboration of individuals, all world-renowned in their particular area, and covers in detail each of the major sub-disciplines of astrophotography. This approach offers the reader the greatest opportunity to learn the most current information and the latest techniques directly from the foremost innovators in the field today.   The book as a whole covers all types of astronomical image processing, including processing of eclipses and solar phenomena, extracting detail from deep-sky, planetary, and widefield images, and offers solutions to some of the most challenging and vexing problems in astronomical image processing. Recognized chapter authors include deep sky experts su...

  14. Making the PACS workstation a browser of image processing software: a feasibility study using inter-process communication techniques.

    Science.gov (United States)

    Wang, Chunliang; Ritter, Felix; Smedby, Orjan

    2010-07-01

    To enhance the functional expandability of a picture archiving and communication systems (PACS) workstation and to facilitate the integration of third-part image-processing modules, we propose a browser-server style method. In the proposed solution, the PACS workstation shows the front-end user interface defined in an XML file while the image processing software is running in the background as a server. Inter-process communication (IPC) techniques allow an efficient exchange of image data, parameters, and user input between the PACS workstation and stand-alone image-processing software. Using a predefined communication protocol, the PACS workstation developer or image processing software developer does not need detailed information about the other system, but will still be able to achieve seamless integration between the two systems and the IPC procedure is totally transparent to the final user. A browser-server style solution was built between OsiriX (PACS workstation software) and MeVisLab (Image-Processing Software). Ten example image-processing modules were easily added to OsiriX by converting existing MeVisLab image processing networks. Image data transfer using shared memory added communication based on IPC techniques is an appealing method that allows PACS workstation developers and image processing software developers to cooperate while focusing on different interests.
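
    As a hedged illustration of the shared-memory half of such a design, the sketch below uses Python's `multiprocessing.shared_memory`: a "workstation" publishes pixel data in a named block, a stand-in "processing server" attaches to it by name and transforms it in place, and the workstation reads the result back. The pixel inversion and all names are illustrative; the paper's actual OsiriX/MeVisLab protocol is not reproduced here.

    ```python
    from multiprocessing import shared_memory

    def serve_invert(shm_name, length):
        """'Image processing server' side: attach to the shared block by
        name and invert the pixel values in place."""
        shm = shared_memory.SharedMemory(name=shm_name)
        for i in range(length):
            shm.buf[i] = 255 - shm.buf[i]
        shm.close()

    def workstation_roundtrip(pixels):
        """'PACS workstation' side: publish pixels in shared memory, hand
        the block name to the server (here a direct call; in a real system
        the name would travel over the IPC control channel), then read the
        processed result back without copying the image over a socket."""
        shm = shared_memory.SharedMemory(create=True, size=len(pixels))
        shm.buf[:len(pixels)] = pixels
        serve_invert(shm.name, len(pixels))  # server attaches by name
        result = bytes(shm.buf[:len(pixels)])
        shm.close()
        shm.unlink()
        return result
    ```

    The design point this illustrates is that only the block name crosses the control channel; the image bytes themselves never leave shared memory.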

  15. Evaluation of Anomaly Detection Techniques for SCADA Communication Resilience

    OpenAIRE

    Shirazi, Syed Noor Ul Hassan; Gouglidis, Antonios; Syeda, Kanza Noor; Simpson, Steven; Mauthe, Andreas Ulrich; Stephanakis, Ioannis M.; Hutchison, David

    2016-01-01

    Attacks on critical infrastructures’ Supervisory Control and Data Acquisition (SCADA) systems are beginning to increase. They are often initiated by highly skilled attackers, who are capable of deploying sophisticated attacks to exfiltrate data or even to cause physical damage. In this paper, we rehearse the rationale for protecting against cyber attacks and evaluate a set of Anomaly Detection (AD) techniques in detecting attacks by analysing traffic captured in a SCADA network. For this purp...

  16. An image processing technique for the radiographic assessment of vertebral derangements

    Energy Technology Data Exchange (ETDEWEB)

    Breen, A.C. (Anglo-European Coll. of Chiropractic, Bournemouth (UK)); Allen, R. (Southampton Univ. (UK). Dept. of Mechanical Engineering); Morris, A. (Odstock Hospital, Salisbury (UK). Dept. of Radiology)

    1989-01-01

    A technique for measuring inter-vertebral motion by the digitization and processing of intensifier images is described. The technique reduces the time and X-ray dosage currently required to make such assessments. The errors associated with computing kinematic indices at increments of coronal plane rotations in the lumbar spine have been calculated using a calibration model designed to produce a facsimile of in vivo conditions in terms of image quality and geometric distortion. (author).

  17. Surveillance of arthropod vector-borne infectious diseases using remote sensing techniques: a review.

    Directory of Open Access Journals (Sweden)

    Satya Kalluri

    2007-10-01

    Full Text Available Epidemiologists are adopting new remote sensing techniques to study a variety of vector-borne diseases. Associations between satellite-derived environmental variables such as temperature, humidity, and land cover type and vector density are used to identify and characterize vector habitats. The convergence of factors such as the availability of multi-temporal satellite data and georeferenced epidemiological data, collaboration between remote sensing scientists and biologists, and the availability of sophisticated, statistical geographic information system and image processing algorithms in a desktop environment creates a fertile research environment. The use of remote sensing techniques to map vector-borne diseases has evolved significantly over the past 25 years. In this paper, we review the status of remote sensing studies of arthropod vector-borne diseases due to mosquitoes, ticks, blackflies, tsetse flies, and sandflies, which are responsible for the majority of vector-borne diseases in the world. Examples of simple image classification techniques that associate land use and land cover types with vector habitats, as well as complex statistical models that link satellite-derived multi-temporal meteorological observations with vector biology and abundance, are discussed here. Future improvements in remote sensing applications in epidemiology are also discussed.

  18. Development of safety analysis and constraint detection techniques for process interaction errors

    Energy Technology Data Exchange (ETDEWEB)

    Fan, Chin-Feng, E-mail: csfanc@saturn.yzu.edu.tw [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China); Tsai, Shang-Lin; Tseng, Wan-Hui [Computer Science and Engineering Dept., Yuan-Ze University, Taiwan (China)

    2011-02-15

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched, but highly risky, interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  19. Development of safety analysis and constraint detection techniques for process interaction errors

    International Nuclear Information System (INIS)

    Fan, Chin-Feng; Tsai, Shang-Lin; Tseng, Wan-Hui

    2011-01-01

    Among the new failure modes introduced by computers into safety systems, the process interaction error is the most unpredictable and complicated failure mode, which may cause disastrous consequences. This paper presents safety analysis and constraint detection techniques for process interaction errors among hardware, software, and human processes. Among interaction errors, the most dreadful ones are those that involve run-time misinterpretation from a logic process. We call them the 'semantic interaction errors'. Such abnormal interaction is not adequately emphasized in current research. In our static analysis, we provide a fault tree template focusing on semantic interaction errors by checking conflicting pre-conditions and post-conditions among interacting processes. Thus, far-fetched, but highly risky, interaction scenarios involving interpretation errors can be identified. For run-time monitoring, a range of constraint types is proposed for checking abnormal signs at run time. We extend current constraints to a broader relational level and a global level, considering process/device dependencies and physical conservation rules in order to detect process interaction errors. The proposed techniques can reduce abnormal interactions; they can also be used to assist in safety-case construction.

  20. Generic technique to grow III-V semiconductor nanowires in a closed glass vessel

    Directory of Open Access Journals (Sweden)

    Kan Li

    2016-06-01

    Full Text Available Crystalline III-V semiconductor nanowires have great potential for the fabrication of nanodevices for applications in nanoelectronics and optoelectronics, and for studies of novel physical phenomena. Sophisticated epitaxy techniques with precisely controlled growth conditions are often used to prepare high quality III-V nanowires; such experiments therefore require dedicated growth processes and are very costly. Here, we report a simple but generic method to synthesize III-V nanowires with high crystal quality. The technique employs a closed evacuated tube vessel containing one small tube carrier with a solid source of materials and another with a growth substrate. The growth of nanowires is achieved by heating the closed vessel in a furnace to a preset high temperature and then cooling it down naturally to room temperature. The technique has been employed to grow InAs, GaAs, and GaSb nanowires on Si/SiO2 substrates. The as-grown nanowires were analyzed by SEM, TEM and Raman spectroscopy, and the results show that the nanowires are high quality zincblende single crystals. No particular condition needs to be adjusted and controlled in the experiments. This technique provides a convenient way to synthesize III-V semiconductor nanowires with high material quality for a wide range of applications.

  1. Applications of process improvement techniques to improve workflow in abdominal imaging.

    Science.gov (United States)

    Tamm, Eric Peter

    2016-03-01

    Major changes in the management and funding of healthcare are underway that will markedly change the way radiology studies will be reimbursed. The result will be the need to deliver radiology services in a highly efficient manner while maintaining quality. The science of process improvement provides a practical approach to improve the processes utilized in radiology. This article will address in a step-by-step manner how to implement process improvement techniques to improve workflow in abdominal imaging.

  2. Monitoring alloy formation during mechanical alloying process by x-ray diffraction techniques

    International Nuclear Information System (INIS)

    Abdul Kadir Masrom; Noraizam Md Diah; Mazli Mustapha

    2002-01-01

    Mechanical alloying (MA) is a novel processing technique that uses a high energy impact ball mill to produce alloys with enhanced properties and microscopically homogeneous materials starting from various powder mixtures. The mechanical alloying process was originally developed to produce oxide dispersion strengthened nickel superalloys. In principle, in a high-energy ball milling process, an alloy is formed as the result of repeated welding, fracturing and rewelding of powder particles in a high energy ball mill. In this process a powder mixture in a ball mill is subjected to high-energy collisions among balls. MA has been shown to be capable of synthesizing a variety of materials and of preparing equilibrium and non-equilibrium phases starting from blended elemental or prealloyed powders. The ability of the process to produce highly metastable materials such as amorphous alloys and nanostructured materials has made it attractive, and it has been considered a promising material processing technique that could produce many advanced materials at low cost. The present study explores the conditions under which aluminum alloy formation occurs by ball milling of blended aluminum and alloying element powders. In this work, an attempt was made to produce aluminum 2024 alloy by milling blended elemental aluminum powder of 2024 composition in a stainless steel container under argon atmosphere for up to 210 minutes. X-ray diffraction together with thermal analysis techniques was used to monitor phase changes in the milled powder. Results indicate that, using our predetermined milling parameters, alloys were formed after 120 minutes of milling. The thermal analysis data are also presented in this report. (Author)

  3. "SOCRATICS" AS ADDRESSES OF ISOCRATES’ EPIDEICTIC SPEECHES (Against the Sophists, Encomium of Helen, Busiris

    Directory of Open Access Journals (Sweden)

    Anna Usacheva

    2012-06-01

    Full Text Available This article analyses three epideictic orations of Isocrates, which are in themselves a precious testimony of the quality of intellectual life at the close of the fourth century before Christ. To this period belong also the Socratics, who are generally seen as an important link between Socrates and Plato. The author of this article proposes a more productive approach to the study of Antisthenes, Euclid of Megara and other so-called Socratics, revealing them not as independent thinkers but rather as adherents of the sophistic school and also as teachers, thereby including them among those who took part in the educative activity of their time.

  4. Mapping innovation processes: Visual techniques for opening and presenting the black box of service innovation processes

    DEFF Research Database (Denmark)

    Olesen, Anne Rørbæk

    2017-01-01

    This chapter argues for the usefulness of visual mapping techniques for performing qualitative analysis of complex service innovation processes. Different mapping formats are presented, namely, matrices, networks, process maps, situational analysis maps and temporal situational analysis maps. For the purpose of researching service innovation processes, the three latter formats are argued to be particularly interesting. Process maps can give an overview of different periods and milestones in a process in one carefully organized location. Situational analysis maps and temporal situational analysis maps can open up complexities of service innovation processes, as well as close them down for presentational purposes. The mapping formats presented are illustrated by displaying maps from an exemplary research project, and the chapter is concluded with a brief discussion of the limitations and pitfalls

  5. Four-hour processing of clinical/diagnostic specimens for electron microscopy using microwave technique.

    Science.gov (United States)

    Giberson, R T; Demaree, R S; Nordhausen, R W

    1997-01-01

    A protocol for routine 4-hour microwave tissue processing of clinical or other samples for electron microscopy was developed. Specimens are processed by using a temperature-restrictive probe that can be set to automatically cycle the magnetron to maintain any designated temperature restriction (temperature maximum). In addition, specimen processing during fixation is performed in 1.7-ml microcentrifuge tubes followed by subsequent processing in flow-through baskets. Quality control is made possible during each step through the addition of an RS232 port to the microwave, allowing direct connection of the microwave oven to any personal computer. The software provided with the temperature probe enables the user to monitor time and temperature on a real-time basis. Tissue specimens, goat placenta, mouse liver, mouse kidney, and deer esophagus were processed by conventional and microwave techniques in this study. In all instances, the results for the microwave-processed samples were equal to or better than those achieved by routine processing techniques.

  6. Fabrication and processing of polymer solar cells: A review of printing and coating techniques

    DEFF Research Database (Denmark)

    Krebs, Frederik C

    2009-01-01

    Polymer solar cells are reviewed in the context of the processing techniques leading to complete devices. A distinction is made between the film-forming techniques that are used currently, such as spincoating, doctor blading and casting, and the, from a processing point of view, more desirable film-forming techniques. These are described with focus on the particular advantages and disadvantages associated with each case.

  7. Commercial Applications of X Ray Spectrometric Techniques

    International Nuclear Information System (INIS)

    Wegrzynek, D.

    2013-01-01

    In the 21st century, the X-ray fluorescence (XRF) technique is widely used in process control, industrial applications and routine elemental analysis. The technique has a multielement capability, detecting elements with Z ≥ 10, with a few instruments also capable of detecting elements with Z ≥ 5. It is characterized by a non-destructive analysis process and relatively good detection limits, typically one part per million, for a wide range of elements. The first commercial XRF instruments were introduced to the market about 50 years ago. They were the wavelength dispersive X ray fluorescence (WDXRF) spectrometers utilizing Bragg’s law and reflection on crystal lattices for sequential elemental analysis of sample composition. The advances made in radiation detector technology, especially the introduction of semiconductor detectors, improvements in signal processing electronics, and the availability and exponential growth of the personal computer market led to the invention of the energy dispersive X ray fluorescence (EDXRF) technique. EDXRF is more cost effective than WDXRF, and it also allows for designing compact instruments. Such instruments can be easily tailored to the needs of different customers, integrated with industrial installations, and also miniaturized for the purpose of in-situ applications. The versatility of the technique has been confirmed in a spectacular way by using XRF and X-ray spectrometric techniques, among a few others, during the NASA and ESA missions in search for the evidence of life and presence of water on the surface of Mars. The XRF technique has achieved its strong position within the atomic spectroscopy group of analytical techniques not only due to its versatility but also due to relatively low running costs, as compared to the commonly used methods, e.g., atomic absorption spectrometry (AAS) or inductively coupled plasma atomic emission/mass spectrometry (ICP-AES/MS). Presently, the XRF technique together with X ray

  8. Air Conditioning Compressor Air Leak Detection by Image Processing Techniques for Industrial Applications

    Directory of Open Access Journals (Sweden)

    Pookongchai Kritsada

    2015-01-01

    Full Text Available This paper presents a method to detect air leakage in an air conditioning compressor using image processing techniques. A quality air conditioning compressor should have no air leakage. To test an air conditioning compressor for leaks, air is pumped into the compressor, which is then submerged in a water tank. If air bubbles appear at the surface of the compressor, the leaking compressor must be returned for maintenance. In this work a new method to detect leakage and locate the leakage point with high accuracy, speed, and precision is proposed. In a preprocessing procedure to detect the air bubbles, threshold and median filter techniques are used. A connected component labeling technique detects the air bubbles, while blob analysis searches for and analyzes groups of air bubbles in sequential images. Experiments were conducted with the proposed algorithm to determine the leakage point of an air conditioning compressor. The location of the leakage point is reported as a coordinate point. The results demonstrate that the leakage point could be accurately detected, with an estimation error of less than 5% compared to the real leakage point.
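
    The threshold-plus-labeling pipeline described above can be sketched in pure Python: threshold the grayscale frame, then label 4-connected bright regions (candidate bubbles) by flood fill and return their centroids. The threshold value and connectivity are illustrative choices, not the paper's exact parameters:

    ```python
    def detect_bubbles(image, threshold=128):
        """Threshold a grayscale image (list of pixel rows) and label
        connected bright regions (candidate air bubbles) using a
        4-connectivity flood fill.  Returns one (row, col) centroid per
        detected bubble, in scan order."""
        h, w = len(image), len(image[0])
        binary = [[1 if image[r][c] >= threshold else 0 for c in range(w)]
                  for r in range(h)]
        seen = [[False] * w for _ in range(h)]
        centroids = []
        for r in range(h):
            for c in range(w):
                if binary[r][c] and not seen[r][c]:
                    # Flood-fill one connected component of bright pixels.
                    stack, pixels = [(r, c)], []
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        pixels.append((y, x))
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
                    cy = sum(p[0] for p in pixels) / len(pixels)
                    cx = sum(p[1] for p in pixels) / len(pixels)
                    centroids.append((cy, cx))
        return centroids
    ```

    Tracking these centroids across sequential frames would correspond to the blob-analysis stage the abstract mentions.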

  9. Visual air quality simulation techniques

    Science.gov (United States)

    Molenar, John V.; Malm, William C.; Johnson, Christopher E.

    Visual air quality is primarily a human perceptual phenomenon beginning with the transfer of image-forming information through an illuminated, scattering and absorbing atmosphere. Visibility, especially the visual appearance of industrial emissions or the degradation of a scenic view, is the principal atmospheric characteristic through which humans perceive air pollution, and is more sensitive to changing pollution levels than any other air pollution effect. Every attempt to quantify economic costs and benefits of air pollution has indicated that good visibility is a highly valued and desired environmental condition. Measurement programs can at best approximate the state of the ambient atmosphere at a few points in a scenic vista viewed by an observer. To fully understand the visual effect of various changes in the concentration and distribution of optically important atmospheric pollutants requires the use of aerosol and radiative transfer models. Communication of the output of these models to scientists, decision makers and the public is best done by applying modern image-processing systems to generate synthetic images representing the modeled air quality conditions. This combination of modeling techniques has been under development for the past 15 yr. Initially, visual air quality simulations were limited by a lack of computational power to simplified models depicting Gaussian plumes or uniform haze conditions. Recent explosive growth in low cost, high powered computer technology has allowed the development of sophisticated aerosol and radiative transfer models that incorporate realistic terrain, multiple scattering, non-uniform illumination, varying spatial distribution, concentration and optical properties of atmospheric constituents, and relative humidity effects on aerosol scattering properties. This paper discusses these improved models and image-processing techniques in detail. Results addressing uniform and non-uniform layered haze conditions in both

  10. The application of neural networks with artificial intelligence technique in the modeling of industrial processes

    International Nuclear Information System (INIS)

    Saini, K. K.; Saini, Sanju

    2008-01-01

    Neural networks are a relatively new artificial intelligence technique that emulates the behavior of biological neural systems in digital software or hardware. These networks can 'learn', automatically, complex relationships among data. This feature makes the technique very useful in modeling processes for which mathematical modeling is difficult or impossible. The work described here outlines some examples of the application of neural networks with artificial intelligence technique in the modeling of industrial processes.

  11. Short-term Local Forecasting by Artificial Intelligence Techniques and Assess Related Social Effects from Heterogeneous Data

    OpenAIRE

    Gong, Bing

    2017-01-01

    This work aims to use sophisticated artificial intelligence and statistical techniques to forecast pollution and assess its social impact. To achieve the target of the research, this study is divided into several research sub-objectives as follows: First research sub-objective: propose a framework for relocating and reconfiguring the existing pollution monitoring networks by using feature selection, artificial intelligence techniques, and information theory. Second research sub-objective: c...

  12. The use of tomographic techniques in the mineral processing Industry. A review

    International Nuclear Information System (INIS)

    Witika, L.K.; Jere, E.H.

    2002-01-01

    Process tomographic techniques may be used to analyse the internal state of most multiphase process engineering systems, such as material segregation in a reactor, multiphase flow in pipes and the spatial resolution of mineral grains in multiphase particles. These techniques include radiation computed tomography (X-ray or γ-ray), electrical methods (capacitance, impedance and inductive tomography), positron emission tomography, optical tomography, microwave tomography, acoustic tomographic methods and many more. Many potential applications exist for process tomographic instrumentation for quantitative analysis and fault-detection purposes. Amongst these, electrical methods are widely used for those mineral processes deserving particular attention, such as dense-medium separation, hydrocyclones, flotation cells and columns, gas-liquid absorbers, solvent extraction and other liquid-liquid processes, filtration and other solid-liquid processes, grinding mills (both dry and wet), conveyors and hoppers. Developments in on-line measurement instrumentation now allow direct observation of the behaviour of fluids inside mineral separation equipment. This offers the possibility to acquire process data to enable models to be devised, to verify theoretical computational fluid dynamics predictions, and to control various unit processes. In this review, the most important tomographic sensing methods are reviewed. Examples of the implementation of some electrical methods are illustrated. (authors)

  13. Comparison Of Several Metrology Techniques For In-line Process Monitoring Of Porous SiOCH

    International Nuclear Information System (INIS)

    Fossati, D.; Imbert, G.; Beitia, C.; Yu, L.; Plantier, L.; Volpi, F.; Royer, J.-C.

    2007-01-01

    As porous SiOCH is a widely used inter-metal dielectric for 65 nm nodes and below, controlling its elaboration process by in-line monitoring is necessary to guarantee successful integration of the material. In this paper, the sensitivities of several non-destructive metrology techniques to drifts in the film elaboration process are investigated. It appears that the two steps of the process should be monitored separately and that the corona charge method is the most sensitive of the techniques reviewed for this application.

  14. COCONUT WATER VINEGAR: NEW ALTERNATIVE WITH IMPROVED PROCESSING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    MUHAMMAD ANAS OTHAMAN

    2014-06-01

    Full Text Available Vinegar is a condiment made from various sugary and starchy materials by alcoholic and subsequent acetic fermentation. Vinegar can be produced via different methods and from various types of raw material. A new alternative substrate for vinegar production, namely mature coconut water, has been tested and compared with two common substrates, coconut sap and pineapple juice. Substrates such as sap and juices have been found to have a high amount of total soluble solids, corresponding to a high sugar content of more than 14 °Brix. Therefore, both substrates could be used directly for vinegar production without requiring other carbon sources. However, coconut water, which showed a low Brix value, needs to be adjusted to 14 °Brix by adding sucrose prior to the fermentation process. Substrates fermented with Saccharomyces cerevisiae yielded 7-8% alcohol within 7-10 days of aerobic incubation at room temperature. The alcoholic medium was then used as a seed broth for acetic fermentation with Acetobacter aceti as the inoculum and fermented for approximately 2 months to obtain at least 4% acetic acid. The effects of inoculum size and of a back-slopping technique on the processing method for coconut water vinegar production were investigated. The results show that a 10% inoculum size was best for the acetic acid fermentation and that the back-slopping technique helped to reduce the process time of coconut water vinegar production.
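
    The 14 °Brix adjustment described above is a simple solids mass balance. A minimal sketch (not from the paper; `sucrose_to_add` is a hypothetical helper, and °Brix is treated as grams of soluble solids per 100 g of solution):

```python
def sucrose_to_add(mass_g: float, brix_now: float, brix_target: float = 14.0) -> float:
    """Grams of sucrose needed to raise `mass_g` grams of solution from
    `brix_now` to `brix_target` °Brix (Brix = g soluble solids / 100 g solution)."""
    if brix_target <= brix_now:
        return 0.0
    solids_now = mass_g * brix_now / 100.0
    t = brix_target / 100.0
    # After adding x g sucrose: (solids_now + x) / (mass_g + x) = t
    return (t * mass_g - solids_now) / (1.0 - t)
```

    For example, 1 kg of coconut water at 5 °Brix would need roughly 105 g of sucrose to reach 14 °Brix under these assumptions.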

  15. Image Processing Based Signature Verification Technique to Reduce Fraud in Financial Institutions

    Directory of Open Access Journals (Sweden)

    Hussein Walid

    2016-01-01

    Full Text Available The broad use of handwritten signatures for personal verification in financial institutions creates the need for a robust automatic signature verification tool. This tool aims to reduce fraud in all related financial transaction sectors. This paper proposes an online, robust, and automatic signature verification technique using recent advances in image processing and machine learning. Once the image of a handwritten signature for a customer is captured, several pre-processing steps are performed on it, including filtration and detection of the signature edges. Afterwards, a feature extraction process is applied to the image to extract Speeded Up Robust Features (SURF) and Scale-Invariant Feature Transform (SIFT) features. Finally, a verification process is developed and applied to compare the extracted image features with those stored in the database for the specified customer. Results indicate the high accuracy, simplicity, and rapidity of the developed technique, which are the main criteria for judging a signature verification tool in banking and other financial institutions.
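
    The feature-comparison step of such a verification pipeline can be illustrated with a minimal nearest-neighbour descriptor matcher using Lowe's ratio test. This is a generic sketch, not the paper's implementation; the function names and the 0.75 ratio threshold are assumptions:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length descriptor vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def match_descriptors(query, reference, ratio=0.75):
    """Match each query descriptor to its nearest reference descriptor,
    keeping only matches that pass Lowe's ratio test (nearest distance
    must be well below the second-nearest distance)."""
    matches = []
    for qi, q in enumerate(query):
        dists = sorted((euclidean(q, r), ri) for ri, r in enumerate(reference))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1]))
    return matches

def similarity_score(query, reference, ratio=0.75):
    """Fraction of query descriptors confidently matched: a crude
    stand-in for the final verification decision."""
    return len(match_descriptors(query, reference, ratio)) / max(len(query), 1)
```

    A real system would extract the descriptors with SIFT/SURF and apply a learned decision threshold to the score.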

  16. Simulation of land mine detection processes using nuclear techniques

    International Nuclear Information System (INIS)

    Aziz, M.

    2005-01-01

    Computer models were designed to study the processes of landmine detection using nuclear techniques. Parameters that affect the detection were analyzed. Mines of different masses at different depths in the soil are considered using two types of sources, 252 Cf and a 14 MeV neutron source. The capability to differentiate between mines and other objects such as concrete, iron, wood, aluminum, water, and polyethylene was analyzed and studied

  17. Applying advanced digital signal processing techniques in industrial radioisotopes applications

    International Nuclear Information System (INIS)

    Mahmoud, H.K.A.E.

    2012-01-01

    Radioisotopes can be used to obtain signals or images in order to recognize the information inside industrial systems. The main problems of using these techniques are the difficulty of identifying the obtained signals or images and the requirement of skilled experts for interpreting the output data of these applications. At present, the interpretation of the output data from these applications is performed mainly manually, depending heavily on the skills and experience of trained operators. This process is time consuming and the results typically suffer from inconsistency and errors. The objective of the thesis is to apply advanced digital signal processing techniques to improve the treatment and interpretation of the output data from different Industrial Radioisotopes Applications (IRA). This thesis focuses on two IRA: the Residence Time Distribution (RTD) measurement and the defect inspection of welded pipes using a gamma source (gamma radiography). In the RTD measurement application, this thesis presents methods for signal pre-processing and modeling of the RTD signals. Simulation results have been presented for two case studies. The first case study is a laboratory experiment for measuring the RTD in a water flow rig. The second case study is an experiment for measuring the RTD in a phosphate production unit. The thesis proposes an approach for RTD signal identification in the presence of noise. In this approach, after signal processing, the Mel Frequency Cepstral Coefficients (MFCCs) and polynomial coefficients are extracted from the processed signal or from one of its transforms. The Discrete Wavelet Transform (DWT), Discrete Cosine Transform (DCT), and Discrete Sine Transform (DST) have been tested and compared for efficient feature extraction. Neural networks have been used for matching of the extracted features. Furthermore, the Power Density Spectrum (PDS) of the RTD signal has also been used instead of the discrete
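
    The transform-then-truncate feature extraction the thesis compares (the DCT among them) can be sketched in pure Python; the coefficient count below is an arbitrary illustrative choice, not the thesis's parameter:

```python
import math

def dct2(signal):
    """Type-II discrete cosine transform (unnormalized), commonly used
    to compact a signal's energy into a few low-order coefficients."""
    N = len(signal)
    return [sum(x * math.cos(math.pi * k * (2 * n + 1) / (2 * N))
                for n, x in enumerate(signal))
            for k in range(N)]

def dct_features(signal, n_coeffs=8):
    """Low-order DCT coefficients as a compact fixed-length feature vector,
    suitable as input to a neural-network matcher."""
    return dct2(signal)[:n_coeffs]
```

    Feature vectors of this kind (alongside MFCCs and polynomial coefficients) would then be fed to the neural-network matching stage.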

  18. Quantification of UV-Visible and Laser Spectroscopic Techniques for Materials Accountability and Process Control

    International Nuclear Information System (INIS)

    Czerwinski, Kenneth; Weck, Phil

    2013-01-01

    Ultraviolet-visible spectroscopy (UV-Visible) and time-resolved laser fluorescence spectroscopy (TRLFS) optical techniques can permit on-line analysis of actinide elements in a solvent extraction process in real time. These techniques have been used for measuring actinide speciation and concentration under laboratory conditions and are easily adaptable to multiple sampling geometries, such as dip probes, fiber-optic sample cells, and flow-through cell geometries. To fully exploit these techniques, researchers must determine the fundamental speciation of target actinides and the resulting influence on spectroscopic properties. Detection limits, process conditions, and speciation of key actinide components can be established and utilized in a range of areas, particularly those related to materials accountability and process control. Through this project, researchers will develop tools and spectroscopic techniques to evaluate solution extraction conditions and concentrations of U, Pu, and Cm in extraction processes, addressing areas of process control and materials accountability. The team will evaluate UV-Visible spectroscopy and TRLFS for use in solvent extraction-based separations. Ongoing research is examining the efficacy of UV-Visible spectroscopy for evaluating uranium and plutonium speciation under conditions found in the UREX process and of TRLFS for evaluating Cm speciation and concentration in the TALSPEAK process. A uranyl and plutonium nitrate UV-Visible spectroscopy study met with success, which supports the utility and continued exploration of spectroscopic methods for evaluating actinide concentrations and solution conditions in other aspects of the UREX+ solvent extraction scheme. This project will examine U and Pu absorbance in TRUEX and TALSPEAK, perform a detailed examination of Cm in TRUEX and TALSPEAK, study U laser fluorescence, and apply project data to contactors. The team will also determine peak ratios as a function of solution concentrations for the UV
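
    The concentration estimates underlying such on-line UV-Visible monitoring rest on the Beer-Lambert law, A = εlc, valid in the dilute linear regime. A minimal sketch (the numbers in the test are illustrative, not project data):

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_length_cm=1.0):
    """Solve the Beer-Lambert law A = epsilon * l * c for the
    concentration c (mol/L), given absorbance A, molar absorptivity
    epsilon (L/(mol*cm)), and optical path length l (cm)."""
    return absorbance / (molar_absorptivity * path_length_cm)
```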

  19. Trends in data processing of comprehensive two-dimensional chromatography: state of the art.

    Science.gov (United States)

    Matos, João T V; Duarte, Regina M B O; Duarte, Armando C

    2012-12-01

    The operation of advanced chromatographic systems, namely comprehensive two-dimensional (2D) chromatography coupled to multidimensional detectors, yields a great deal of data that needs special care in processing in order to characterize and quantify the analytes under study as fully as possible. The aim of this review is to identify the main trends, research needs, and gaps in the techniques for data processing of multidimensional data sets obtained from comprehensive 2D chromatography. The following topics have been identified as the most promising for new developments in the near future: data acquisition and handling, peak detection and quantification, measurement of the overlapping of 2D peaks, and data analysis software for 2D chromatography. The rationale supporting most of the data processing techniques is based on the generalization of one-dimensional (1D) chromatography, although algorithms such as the inverted watershed algorithm use the 2D chromatographic data as such. However, processing more complex N-way data requires more sophisticated techniques. Apart from applying other concepts from 1D chromatography that have not yet been tested for 2D chromatography, there is still room for improvements and developments in algorithms and software for dealing with comprehensive 2D chromatographic data. Copyright © 2012 Elsevier B.V. All rights reserved.
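
    The simplest 2D peak-detection approach, against which watershed-style algorithms are usually compared, is local-maximum picking on the 2D intensity grid. A minimal sketch (the threshold and the 8-neighbourhood are illustrative assumptions):

```python
def find_2d_peaks(grid, threshold=0.0):
    """Return (row, col) positions of points strictly greater than all of
    their 8 neighbours and above `threshold` in a 2D intensity grid
    given as a list of equal-length lists."""
    peaks = []
    rows, cols = len(grid), len(grid[0])
    for i in range(rows):
        for j in range(cols):
            v = grid[i][j]
            if v <= threshold:
                continue
            neighbours = [grid[a][b]
                          for a in range(max(0, i - 1), min(rows, i + 2))
                          for b in range(max(0, j - 1), min(cols, j + 2))
                          if (a, b) != (i, j)]
            if all(v > n for n in neighbours):
                peaks.append((i, j))
    return peaks
```

    Real chromatographic data would need smoothing and merging of co-eluting maxima first, which is exactly where the more sophisticated methods the review surveys come in.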

  20. Physical evaluations of Co-Cr-Mo parts processed using different additive manufacturing techniques

    Science.gov (United States)

    Ghani, Saiful Anwar Che; Mohamed, Siti Rohaida; Harun, Wan Sharuzi Wan; Noar, Nor Aida Zuraimi Md

    2017-12-01

    In recent years, additive manufacturing with a high degree of design customization has become an important fabrication technique in the aerospace and medical fields. Despite the ability of the process to produce complex components with highly controlled geometrical features, maintaining part accuracy, fabricating fully functional high-density components, and overcoming inferior surface quality are the major obstacles in producing final parts using additive manufacturing for any selected application. This study aims to evaluate the physical properties of cobalt-chrome-molybdenum (Co-Cr-Mo) alloy parts fabricated by different additive manufacturing techniques. The full dense Co-Cr-Mo parts were produced by Selective Laser Melting (SLM) and Direct Metal Laser Sintering (DMLS) with default process parameters. The density and relative density of the samples were calculated using Archimedes' principle, while the surface roughness on the top and side surfaces was measured using a surface profiler. The roughness average (Ra) of the top surface is 3.4 µm for SLM-produced parts and 2.83 µm for DMLS-produced parts. The Ra of the side surfaces is 4.57 µm for SLM-produced parts and 9.0 µm for DMLS-produced parts. The higher Ra values on the side surfaces compared with the top faces for both manufacturing techniques are due to the balling effect phenomenon. The relative density of the Co-Cr-Mo parts produced by both SLM and DMLS is 99.3%. Higher energy density contributed to the higher density of the samples produced by the SLM and DMLS processes. The findings of this work demonstrate that the SLM and DMLS processes with default process parameters effectively produced full dense Co-Cr-Mo parts with high density, good geometrical accuracy, and good surface finish. Although both manufacturing processes yielded components with high density, the current findings show that the SLM technique can produce components with smoother surface quality compared with DMLS
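
    The Archimedes density measurement used above reduces to weighing the part in air and submerged in water. A hedged sketch (the water density and the theoretical Co-Cr-Mo density below are typical textbook values, not taken from the paper):

```python
def archimedes_density(mass_air_g, mass_water_g, water_density=0.9982):
    """Bulk density (g/cm^3) from the dry weight and the apparent weight
    submerged in water (Archimedes' principle); default water density
    is for ~20 degrees C."""
    return mass_air_g * water_density / (mass_air_g - mass_water_g)

def relative_density(measured, theoretical=8.3):
    """Relative density in percent; 8.3 g/cm^3 is a typical theoretical
    density for a Co-Cr-Mo alloy (an assumption, not the paper's value)."""
    return 100.0 * measured / theoretical
```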

  1. Volumetric image processing: A new technique for three-dimensional imaging

    International Nuclear Information System (INIS)

    Fishman, E.K.; Drebin, B.; Magid, D.; St Ville, J.A.; Zerhouni, E.A.; Siegelman, S.S.; Ney, D.R.

    1986-01-01

    Volumetric three-dimensional (3D) image processing was performed on CT scans of 25 normal hips, and image quality and potential diagnostic applications were assessed. In contrast to surface-detection 3D techniques, volumetric processing preserves every pixel of transaxial CT data, replacing the gray scale with transparent ''gels'' and shading. Anatomically accurate 3D images can be rotated and manipulated in real time, including simulated tissue layer ''peeling'' and mock surgery or disarticulation. This pilot study suggests that volumetric rendering is a major advance in signal processing of medical image data, producing a high-quality, uniquely maneuverable image that is useful for fracture interpretation, soft-tissue analysis, surgical planning, and surgical rehearsal.
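
    The core of volumetric rendering described here, assigning each voxel an opacity (the transparent ''gel'') and accumulating along viewing rays, can be sketched as front-to-back alpha compositing. This is the generic graphics formulation, not the authors' exact algorithm:

```python
def composite_ray(samples):
    """Front-to-back alpha compositing along one ray.
    `samples` is a list of (intensity, opacity) pairs ordered front to back;
    opacity 0.0 is fully transparent, 1.0 fully opaque."""
    color, transmittance = 0.0, 1.0
    for intensity, alpha in samples:
        color += transmittance * alpha * intensity
        transmittance *= (1.0 - alpha)
        if transmittance < 1e-4:      # early ray termination: nothing behind is visible
            break
    return color
```

    Because no surface is extracted, every sample along the ray can contribute, which is what lets soft tissue show through ''peeled'' layers.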

  2. Processing of combustible radioactive waste using incineration techniques

    International Nuclear Information System (INIS)

    Maestas, E.

    1981-01-01

    Among the OECD Nuclear Energy Agency Member countries, numerous incineration concepts are being studied as potential methods for conditioning alpha-bearing and other types of combustible radioactive waste. The common objective of these different processes is volume reduction and the transformation of the waste into a more acceptable waste form. Because the combustion processes reduce the mass and volume of waste to a form which is generally more inert than the feed material, the resulting waste can be more uniformly compatible with safe handling, packaging, storage, and/or disposal techniques. The number of different types of combustion processes designed and operating specifically for alpha-bearing wastes is somewhat small compared with those for non-alpha radioactive wastes; however, research and development is under way in a number of countries to develop and improve alpha incinerators. This paper provides an overview of most alpha-incineration concepts in operation or under development in OECD/NEA Member countries. The special features of each concept are briefly discussed. A table containing characteristic data of incinerators is presented so that a comparison of the major programmes can be made. The table includes the incinerator name and location, process type, capacity throughput, operational status, and application. (author)

  3. Combinatorial techniques to efficiently investigate and optimize organic thin film processing and properties.

    Science.gov (United States)

    Wieberger, Florian; Kolb, Tristan; Neuber, Christian; Ober, Christopher K; Schmidt, Hans-Werner

    2013-04-08

    In this article we present several developed and improved combinatorial techniques to optimize processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is well suited to investigating organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  4. Combinatorial Techniques to Efficiently Investigate and Optimize Organic Thin Film Processing and Properties

    Directory of Open Access Journals (Sweden)

    Hans-Werner Schmidt

    2013-04-01

    Full Text Available In this article we present several developed and improved combinatorial techniques to optimize processing conditions and material properties of organic thin films. The combinatorial approach allows investigation of multi-variable dependencies and is well suited to investigating organic thin films for high-performance applications. In this context we develop and establish the reliable preparation of gradients of material composition, temperature, exposure, and immersion time. Furthermore, we demonstrate the smart application of combinations of composition and processing gradients to create combinatorial libraries. First, a binary combinatorial library is created by applying two gradients perpendicular to each other. A third gradient is carried out in very small areas and arranged matrix-like over the entire binary combinatorial library, resulting in a ternary combinatorial library. Ternary combinatorial libraries allow precise trends to be identified for the optimization of multi-variable-dependent processes, which is demonstrated on the lithographic patterning process. Here we verify conclusively the strong interaction, and thus the interdependency, of variables in the preparation and properties of complex organic thin film systems. The established gradient preparation techniques are not limited to lithographic patterning. It is possible to utilize and transfer the reported combinatorial techniques to other multi-variable-dependent processes and to investigate and optimize thin film layers and devices for optical, electro-optical, and electronic applications.

  5. A novel eco-friendly technique for efficient control of lime water softening process.

    Science.gov (United States)

    Ostovar, Mohamad; Amiri, Mohamad

    2013-12-01

    Lime softening is an established water treatment process. The performance of this process is highly dependent on the lime dosage. Currently, the lime dosage is adjusted manually based on chemical tests, aimed at maintaining the phenolphthalein (P) and total (M) alkalinities within a certain range (2P − M ≥ 5). In this paper, a critical study of the softening process is presented, and it is shown that the current method is frequently incorrect. Furthermore, electrical conductivity (EC) is introduced as a novel indicator for effectively characterizing the lime softening process. This novel technique has several advantages over the current alkalinities method. Because the EC measurement is a simple test and no chemical reagents are needed for titration, there is a considerable reduction in test costs. Additionally, there is a reduction in the treated water hardness and in the sludge generated during the lime softening process. The technique is therefore highly eco-friendly and a very cost-effective alternative for efficient control of the lime softening process.
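
    The conventional alkalinity rule that the paper criticizes can be written down directly. A sketch (function names are hypothetical; P and M in mg/L as CaCO3):

```python
def excess_lime_indicator(p_alk, m_alk):
    """2P - M from phenolphthalein (P) and total (M) alkalinities;
    positive values indicate a hydroxide (excess lime) contribution."""
    return 2.0 * p_alk - m_alk

def lime_dose_ok(p_alk, m_alk, minimum=5.0):
    """The conventional control rule: accept the current lime dose
    when 2P - M >= `minimum`."""
    return excess_lime_indicator(p_alk, m_alk) >= minimum
```

    The paper's point is that this titration-based rule is often misleading, motivating EC as a reagent-free control indicator instead.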

  6. A sophisticated simulation for the fracture behavior of concrete material using XFEM

    Science.gov (United States)

    Zhai, Changhai; Wang, Xiaomin; Kong, Jingchang; Li, Shuang; Xie, Lili

    2017-10-01

    The development of a powerful numerical model to simulate the fracture behavior of concrete material has long been one of the dominant research areas in earthquake engineering. A reliable model should be able to adequately represent the discontinuous characteristics of cracks and simulate various failure behaviors under complicated loading conditions. In this paper, a numerical formulation, which incorporates a sophisticated rigid-plastic interface constitutive model coupling cohesion softening, contact, friction and shear dilatation into the XFEM, is proposed to describe various crack behaviors of concrete material. An effective numerical integration scheme for accurately assembling the contribution to the weak form on both sides of the discontinuity is introduced. The effectiveness of the proposed method has been assessed by simulating several well-known experimental tests. It is concluded that the numerical method can successfully capture the crack paths and accurately predict the fracture behavior of concrete structures. The influence of mode-II parameters on the mixed-mode fracture behavior is further investigated to better determine these parameters.

  7. Changes in attentional processing and affective reactivity in pregnancy and postpartum

    Directory of Open Access Journals (Sweden)

    Gollan JK

    2014-11-01

    Full Text Available Jackie K Gollan, Laina Rosebrock, Denada Hoxha, Katherine L Wisner Asher Center for the Study and Treatment of Depressive Disorders, Department of Psychiatry and Behavioral Sciences, Northwestern University Feinberg School of Medicine, Chicago, IL, USA Abstract: The aim of this review is to provide an overview of the research in attentional processing and affective reactivity in pregnancy and postpartum to inform future research. Numerous changes occur in attentional processing and affective reactivity across the childbearing period. This review focuses on the definition and methods of measuring attentional processing and affective reactivity. We discuss research studies that have examined the changes in these two processes during the perinatal phases of pregnancy and postpartum, with and without depression and anxiety. We evaluate the importance of using multiple levels of measurement, including physiological and neuroimaging techniques, to study these processes via implicit and explicit tasks. Research that has identified regions of brain activation using functional magnetic resonance imaging as well as other physiological assessments is integrated into the discussion. The importance of using sophisticated methodological techniques in future studies, such as multiple mediation models, for the purpose of elucidating mechanisms of change during these processes in pregnancy and postpartum is emphasized. We conclude with a discussion of the effect of these processes on maternal psychological functioning and infant outcomes. These processes support a strategy for individualizing treatment for pregnant and postpartum women suffering from depression and anxiety. Keywords: attentional processing, emotion, affective reactivity, depression, pregnancy, postpartum

  8. The training and learning process of transseptal puncture using a modified technique.

    Science.gov (United States)

    Yao, Yan; Ding, Ligang; Chen, Wensheng; Guo, Jun; Bao, Jingru; Shi, Rui; Huang, Wen; Zhang, Shu; Wong, Tom

    2013-12-01

    As transseptal (TS) puncture has become an integral part of many types of cardiac interventional procedures, its technique, initially reported for measurement of left atrial pressure in the 1950s, continues to evolve. Our laboratory adopted a modified technique which uses only a coronary sinus catheter as the landmark for accomplishing TS punctures under fluoroscopy. The aim of this study is to prospectively evaluate the training and learning process for TS puncture guided by this modified technique. Guided by the training protocol, TS puncture was performed in 120 consecutive patients by three trainees without previous personal experience in TS catheterization, with one experienced trainer as a controller. We analysed the following parameters: one-puncture success rate, total procedure time, fluoroscopic time, and radiation dose. The learning curve was analysed using curve-fitting methodology. The first attempt at TS crossing was successful in 74 patients (82%), a second attempt was successful in 11 (12%), and 5 patients ultimately could not undergo puncture of the interatrial septum. The average starting process time was 4.1 ± 0.8 min, and the estimated mean learning plateau was 1.2 ± 0.2 min. The estimated mean learning rate for process time was 25 ± 3 cases. Important aspects of the learning curve can be estimated by fitting inverse curves for TS puncture. The study demonstrated that this technique is a simple, safe, economical, and effective approach for learning TS puncture. Based on the statistical analysis, approximately 29 TS punctures will be needed for a trainee to pass the steepest area of the learning curve.
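
    Fitting an inverse learning curve of the form time = plateau + rate_const / case_number reduces to ordinary least squares after the substitution u = 1/x. A generic sketch (not the authors' statistical software):

```python
def fit_inverse_curve(case_numbers, times):
    """Least-squares fit of time = plateau + rate_const / case_number.
    Transforms x -> 1/x, then applies simple linear regression.
    Returns (plateau, rate_const)."""
    xs = [1.0 / x for x in case_numbers]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(times) / n
    rate_const = (sum((x - mx) * (y - my) for x, y in zip(xs, times))
                  / sum((x - mx) ** 2 for x in xs))
    plateau = my - rate_const * mx
    return plateau, rate_const
```

    The fitted plateau corresponds to the asymptotic procedure time and the rate constant to how quickly the trainee approaches it.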

  9. Implications of a ''Noisy'' observer to data processing techniques

    International Nuclear Information System (INIS)

    Goodenough, D.J.; Metz, C.E.

    1975-01-01

    It is attempted to show how an internal noise source (dark light and threshold jitter) would tend to explain experimental data concerning the visual detection of noise-limited signals in diagnostic imaging. The interesting conclusion can be drawn that internal noise sets the upper limit to the utility of data processing techniques designed to reduce image noise. Moreover, there should be instances where contrast enhancement techniques may be far more useful to the human observer than corresponding reductions in noise amplitude, especially at high count rates (σ_p ≤ σ_D). The limitations imposed on the human observer by an internal noise source may also point towards the need for additional methods (e.g. computer/microdensitometer) of interpreting images of high photon density so that the highest possible signal-to-noise ratio might be obtained
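
    The argument follows from adding observer noise in quadrature with image noise. A small sketch showing why contrast gain can outperform noise reduction once internal noise dominates (symbols and numbers are illustrative, not the paper's):

```python
import math

def effective_snr(signal, sigma_photon, sigma_internal):
    """SNR seen by an observer whose internal noise adds in quadrature
    with the image (photon) noise."""
    return signal / math.sqrt(sigma_photon ** 2 + sigma_internal ** 2)
```

    When sigma_internal dominates, halving sigma_photon barely raises the effective SNR, while doubling the signal (contrast enhancement) doubles it.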

  10. A study on hybrid split-spectrum processing technique for enhanced reliability in ultrasonic signal analysis

    International Nuclear Information System (INIS)

    Huh, Hyung; Koo, Kil Mo; Cheong, Yong Moo; Kim, G. J.

    1995-01-01

    Many signal-processing techniques have been found to be useful in ultrasonic nondestructive evaluation. Among the most popular techniques are signal averaging, spatial compounding, matched filters, and homomorphic processing. One significant newer process is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several engineering materials. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented. Results of SNR enhancement are presented for signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the processing technique is also discussed. The minimization algorithm, which provides an excellent SNR enhancement when used either in conjunction with other SSP algorithms like polarity-check or by itself, is also presented.
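
    The SSP minimization recombination can be sketched in pure Python: split the spectrum with overlapping Gaussian bandpass filters, then keep the band output of smallest magnitude at each sample. The band placement, filter width, and band count below are illustrative assumptions, not the papers' parameters:

```python
import math
import cmath

def dft(x):
    """Naive discrete Fourier transform (O(N^2); fine for a sketch)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT, returning the real part of each sample."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)).real / N
            for n in range(N)]

def split_spectrum_minimization(signal, n_bands=8, f_lo=0.05, f_hi=0.45, sigma=0.03):
    """Split-spectrum processing with the minimization recombination rule:
    filter the spectrum with overlapping Gaussian bandpass filters, then,
    at each sample, keep the band output of smallest magnitude. Coherent
    grain noise varies strongly across bands and is suppressed, while a
    true flaw echo persists in all bands."""
    N = len(signal)
    X = dft(signal)
    bands = []
    for b in range(n_bands):
        fc = f_lo + (f_hi - f_lo) * b / (n_bands - 1)
        H = []
        for k in range(N):
            f = k / N
            f = min(f, 1.0 - f)          # mirror for the negative-frequency half
            H.append(math.exp(-((f - fc) ** 2) / (2 * sigma ** 2)))
        bands.append(idft([Xk * Hk for Xk, Hk in zip(X, H)]))
    return [min((band[n] for band in bands), key=abs) for n in range(N)]
```

    Variants such as polarity-check additionally zero any sample where the band outputs disagree in sign.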

  11. A Study on Hybrid Split-Spectrum Processing Technique for Enhanced Reliability in Ultrasonic Signal Analysis

    International Nuclear Information System (INIS)

    Huh, H.; Koo, K. M.; Kim, G. J.

    1996-01-01

    Many signal-processing techniques have been found to be useful in ultrasonic nondestructive evaluation. Among the most popular techniques are signal averaging, spatial compounding, matched filters, and homomorphic processing. One significant newer process is split-spectrum processing (SSP), which can be equally useful for signal-to-noise ratio (SNR) improvement and grain characterization in several specimens. The purpose of this paper is to explore the utility of SSP in ultrasonic NDE. A wide variety of engineering problems are reviewed, and suggestions for implementation of the technique are provided. SSP exploits the frequency-dependent response of the interfering coherent noise produced by unresolvable scatterers in the resolution range cell of a transducer. It is implemented by splitting the frequency spectrum of the received signal using Gaussian bandpass filters. The theoretical basis for the potential of SSP for grain characterization in SUS 304 material is discussed, and some experimental evidence for the feasibility of the approach is presented. Results of SNR enhancement are presented for signals obtained from four real samples of SUS 304. The influence of various processing parameters on the performance of the processing technique is also discussed. The minimization algorithm, which provides an excellent SNR enhancement when used either in conjunction with other SSP algorithms like polarity-check or by itself, is also presented

  12. The effect of starting point placement technique on thoracic transverse process strength: an ex vivo biomechanical study

    Directory of Open Access Journals (Sweden)

    Burton Douglas C

    2010-07-01

    Full Text Available Abstract Background The use of thoracic pedicle screws in spinal deformity, trauma, and tumor reconstruction is becoming more common. Unsuccessful screw placement may require salvage techniques utilizing transverse process hooks. The effect of different starting point placement techniques on the strength of the transverse process has not previously been reported. The purpose of this paper is to determine the biomechanical properties of the thoracic transverse process following various pedicle screw starting point placement techniques. Methods Forty-seven fresh-frozen human cadaveric thoracic vertebrae from T2 to T9 were disarticulated and matched by bone mineral density (BMD) and transverse process (TP) cross-sectional area. Specimens were randomized to one of four groups: A, control, and three others based on thoracic pedicle screw placement technique: B, straightforward; C, funnel; and D, in-out-in. Initial cortical bone removal for pedicle screw placement was made using a burr at the location on the transverse process or transverse process-laminar junction as published in the original description of each technique. The transverse process was tested by measuring load-to-failure, simulating a hook in compression mode. Analysis of covariance and Pearson correlation coefficients were used to examine the data. Results Technique was a significant predictor of load-to-failure (P = 0.0007). The least squares mean (LS mean) load-to-failure of group A (control) was 377 N; group B (straightforward), 355 N; group C (funnel), 229 N; and group D (in-out-in), 301 N. Significant differences were noted between groups A and C, A and D, B and C, and C and D. BMD (0.925 g/cm2 [range, 0.624-1.301 g/cm2]) was also a significant predictor of load-to-failure for all specimens grouped together (P < 0.05). Level and side tested were not found to significantly correlate with load-to-failure.
    Conclusions The residual coronal plane compressive strength of the thoracic transverse process

  13. The Validation of Vapor Phase Hydrogen Peroxide Microbial Reduction for Planetary Protection and a Proposed Vacuum Process Specification

    Science.gov (United States)

    Chung, Shirley; Barengoltz, Jack; Kern, Roger; Koukol, Robert; Cash, Howard

    2006-01-01

    The Jet Propulsion Laboratory, in conjunction with the NASA Planetary Protection Officer, has selected the vapor phase hydrogen peroxide sterilization process for continued development as a NASA approved sterilization technique for spacecraft subsystems and systems. The goal is to include this technique, with an appropriate specification, in NPR 8020.12C as a low temperature complementary technique to the dry heat sterilization process. To meet microbial reduction requirements for all Mars in-situ life detection and sample return missions, various planetary spacecraft subsystems will have to be exposed to a qualified sterilization process. This process could be the elevated temperature dry heat sterilization process (115 C for 40 hours) which was used to sterilize the Viking lander spacecraft. However, with utilization of such elements as highly sophisticated electronics and sensors in modern spacecraft, this process presents significant materials challenges and is thus an undesirable bioburden reduction method to design engineers. The objective of this work is to introduce vapor hydrogen peroxide (VHP) as an alternative to dry heat microbial reduction to meet planetary protection requirements. The VHP process is widely used by the medical industry to sterilize surgical instruments and biomedical devices, but high doses of VHP may degrade the performance of flight hardware, or compromise material properties. Our goal for this study was to determine the minimum VHP process conditions to achieve microbial reduction levels acceptable for planetary protection.

  14. Surface analytical techniques applied to minerals processing

    International Nuclear Information System (INIS)

    Smart, R.St.C.

    1991-01-01

    An understanding of the chemical and physical forms of the chemically altered layers on the surfaces of base metal sulphides, particularly in the form of hydroxides, oxyhydroxides and oxides, and the changes that occur in them during minerals processing lies at the core of a complete description of flotation chemistry. This paper reviews the application of a variety of surface-sensitive techniques and methodologies applied to the study of surface layers on single minerals, mixed minerals, synthetic ores and real ores. Evidence from combined XPS/SAM/SEM studies has provided images and analyses of three forms of oxide, oxyhydroxide and hydroxide products on the surfaces of single sulphide minerals, mineral mixtures and complex sulphide ores. 4 refs., 2 tabs., 4 figs

  15. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

    The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding, support systematic appraisal, and identify areas of improvement in a business process. Unified modelling language (UML) is a general-purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  16. Advanced signal processing techniques for acoustic detection of sodium/water reaction

    International Nuclear Information System (INIS)

    Yughay, V.S.; Gribok, A.V.; Volov, A.N.

    1997-01-01

    In this paper results of development of a neural network technique for processing of acoustic background noise and injection noise of various media (argon, water steam, hydrogen) at test rigs and industrial steam generator are presented. (author). 3 refs, 9 figs, 3 tabs

  17. Material accountancy measurement techniques in dry-powdered processing of nuclear spent fuels

    International Nuclear Information System (INIS)

    Wolf, S. F.

    1999-01-01

    The paper addresses the development of inductively coupled plasma-mass spectrometry (ICPMS), thermal ionization-mass spectrometry (TIMS), alpha-spectrometry, and gamma spectrometry techniques for in-line analysis of highly irradiated (18 to 64 GWD/T) PWR spent fuels in a dry-powdered processing cycle. The dry-powdered technique for direct elemental and isotopic accountancy assay measurements was implemented without the need for separation of the plutonium, uranium and fission product elements in the bulk powdered process. The analyses allow the determination of fuel burn-up based on the isotopic composition of neodymium and/or cesium. An objective of the program is to develop the ICPMS method for direct fissile nuclear materials accountancy in the dry-powdered processing of spent fuel. The ICPMS measurement system may be applied to the KAERI DUPIC (direct use of spent PWR fuel in CANDU reactors) experiment, and in a near-real-time mode for international safeguards verification and non-proliferation policy concerns
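Burn-up determination from a stable fission-product monitor such as 148Nd, as mentioned above, reduces to an atom-balance calculation. The sketch below is a generic FIMA estimate under an assumed effective fission yield, not the accountancy code used in the programme; the yield and energy-conversion factor are illustrative nominal values.

```python
def burnup_fima(n_nd148, n_heavy, y_nd148=0.0167):
    """Percent of initial heavy-metal atoms fissioned (FIMA), estimated from
    the stable burn-up monitor 148Nd.

    n_nd148 : measured 148Nd atoms (per unit of fuel)
    n_heavy : residual U + Pu atoms measured in the same unit of fuel
    y_nd148 : assumed effective fission yield of 148Nd (~1.67% for thermal 235U)
    """
    fissions = n_nd148 / y_nd148          # one 148Nd atom per y_nd148 fissions
    return 100.0 * fissions / (n_heavy + fissions)

def fima_to_gwd_per_t(fima_percent, energy_factor=9.6):
    """Rough conversion: about 9.6 GWd per tonne of heavy metal per atom-% FIMA."""
    return fima_percent * energy_factor
```

For example, a sample in which the measured 148Nd corresponds to one fission per 99 residual heavy atoms gives 1 atom-% FIMA, or roughly 10 GWd/t.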

  18. Impact of sophisticated fog spray models on accident analyses

    International Nuclear Information System (INIS)

    Roblyer, S.P.; Owzarski, P.C.

    1978-01-01

    The N-Reactor confinement system release dose to the public in a postulated accident is reduced by washing the confinement atmosphere with fog sprays. This allows a low-pressure release of confinement atmosphere containing fission products through filters and out an elevated stack. The current accident analysis required revision of the CORRAL code and other codes such as CONTEMPT to properly model the N Reactor confinement as a system of multiple fog-sprayed compartments. In revising these codes, more sophisticated models for the fog sprays and iodine plateout were incorporated to remove some of the conservatism in the steam condensing rate, fission product washout, and iodine plateout assumptions used in previous studies. The CORRAL code, which was used to describe the transport and deposition of airborne fission products in LWR containment systems for the Rasmussen Study, was revised to describe fog spray removal of molecular iodine (I2) and particulates in multiple compartments for sprays having individual characteristics of on-off times, flow rates, fall heights, and drop sizes in changing containment atmospheres. During postulated accidents, the code determined the fission product removal rates internally rather than from input decontamination factors. A discussion is given of how the calculated plateout and washout rates vary with time throughout the analysis. The results of the accident analyses indicated that more credit could be given to fission product washout and plateout. An important finding was that the release of fission products to the atmosphere and adsorption of fission products on the filters were significantly lower than previous studies had indicated
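Spray washout and plateout of the kind described above are commonly modeled as first-order removal processes acting in parallel, so the airborne concentration decays as C(t) = C0·exp(-(λ_spray + λ_plateout)·t). A minimal sketch with illustrative removal constants (not the values computed by CORRAL):

```python
import math

def airborne_fraction(t_min, lam_spray_per_min, lam_plateout_per_min):
    """Fraction of molecular iodine still airborne after first-order removal
    by fog-spray washout and surface plateout acting in parallel."""
    return math.exp(-(lam_spray_per_min + lam_plateout_per_min) * t_min)

def decontamination_factor(t_min, lam_spray_per_min, lam_plateout_per_min):
    """DF = initial airborne inventory / remaining airborne inventory."""
    return 1.0 / airborne_fraction(t_min, lam_spray_per_min, lam_plateout_per_min)

# Illustrative constants: spray removal 0.2/min, plateout 0.05/min, 30 min of spraying
print(decontamination_factor(30.0, 0.2, 0.05))
```

A time-dependent code like CORRAL effectively recomputes the removal constants at each step from spray flow rates, fall heights, and drop sizes rather than holding them fixed as this sketch does.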

  19. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Science.gov (United States)

    de Sá, Fábio P; Zina, Juliana; Haddad, Célio F B

    2016-01-01

    Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  20. Sophisticated Communication in the Brazilian Torrent Frog Hylodes japi.

    Directory of Open Access Journals (Sweden)

    Fábio P de Sá

    Full Text Available Intraspecific communication in frogs plays an important role in the recognition of conspecifics in general and of potential rivals or mates in particular and therefore with relevant consequences for pre-zygotic reproductive isolation. We investigate intraspecific communication in Hylodes japi, an endemic Brazilian torrent frog with territorial males and an elaborate courtship behavior. We describe its repertoire of acoustic signals as well as one of the most complex repertoires of visual displays known in anurans, including five new visual displays. Previously unknown in frogs, we also describe a bimodal inter-sexual communication system where the female stimulates the male to emit a courtship call. As another novelty for frogs, we show that in addition to choosing which limb to signal with, males choose which of their two vocal sacs will be used for visual signaling. We explain how and why this is accomplished. Control of inflation also provides additional evidence that vocal sac movement and color must be important for visual communication, even while producing sound. Through the current knowledge on visual signaling in Neotropical torrent frogs (i.e. hylodids), we discuss and highlight the behavioral diversity in the family Hylodidae. Our findings indicate that communication in species of Hylodes is undoubtedly more sophisticated than we expected and that visual communication in anurans is more widespread than previously thought. This is especially true in tropical regions, most likely due to the higher number of species and phylogenetic groups and/or to ecological factors, such as higher microhabitat diversity.

  1. Radioactive tracer technique in process optimization: applications in the chemical industry

    International Nuclear Information System (INIS)

    Charlton, J.S.

    1989-01-01

    Process optimization is concerned with the selection of the most appropriate technological design of the process and with controlling its operation to obtain maximum benefit. The role of radioactive tracers in process optimization is discussed and the various circumstances under which such techniques may be beneficially applied are identified. Case studies are presented which illustrate how radioisotopes may be used to monitor plant performance under dynamic conditions to improve production efficiency and to investigate the cause of production limitations. In addition, the use of sealed sources to provide information complementary to the tracer study is described. (author)

  2. Sophisticated fuel handling system evolved

    International Nuclear Information System (INIS)

    Ross, D.A.

    1988-01-01

    The control systems at Sellafield fuel handling plant are described. The requirements called for built-in diagnostic features as well as the ability to handle a large sequencing application. Speed was also important; responses better than 50ms were required. The control systems are used to automate operations within each of the three main process caves - two Magnox fuel decanners and an advanced gas-cooled reactor fuel dismantler. The fuel route within the fuel handling plant is illustrated and described. ASPIC (Automated Sequence Package for Industrial Control) which was developed as a controller for the plant processes is described. (U.K.)

  3. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    Science.gov (United States)

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers that spanned a range of polymer size and charge properties was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.
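A decreasing monotonic trend like the FBRM-CST relationship above is naturally quantified with a Spearman rank correlation, which does not assume linearity. The sketch below uses illustrative numbers, not the study's measurements, and omits tie handling for brevity.

```python
def spearman_rho(xs, ys):
    """Spearman rank correlation coefficient (no tied values assumed)."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# Hypothetical data: higher FBRM fines removal (%) pairing with lower CST (s)
fines_removed = [80, 72, 65, 50, 40]
cst_seconds = [12, 15, 19, 27, 35]
print(spearman_rho(fines_removed, cst_seconds))  # → -1.0 (perfectly monotone)
```

A rho near -1 is exactly the "highest removal, lowest CST" pattern the abstract reports.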

  4. Improvements in techniques and processes

    International Nuclear Information System (INIS)

    Cairon, B.; Nolin, D.

    2003-01-01

    The paper presents the de-construction and decontamination techniques used at COGEMA-La Hague for dismantling and decontamination of plant UP2 400. Interventions under water, particularly interventions from the edge of the pool, are described; significant radiological constraints due to the presence of fuel were observed. Under-water fuel operations were undertaken to recover pieces of UNGG fuel and miscellaneous technological waste under 5 m of water and with reduced visibility. Here remote works implying reduced dosimetry and increased security were carried out. Specific issues concerning tools and procedures are addressed as follows: Pendulous telescopic tool holder on runway channel 215.40; HP cutting under water; Cutting machine set up in the facility; Suction of sludge; Gripping and handling system for the slider and lid; Dredging the Sludge; tests in facility; Control console; Shock absorbing units; Moving the shock absorbing mattresses using slings; Decontamination of large areas of stainless steel walls; Cutting bulky parts in air; Cutting a tubular structure under water; Compacting the drums; Concrete skinning using skinning machines; Concrete skinning using the BRH, hydraulic rock breaker; Concrete skinning using shot blasting; Dismantling the process cell using the 'ATENA' remote power carrier; Removing openings through dry core sample drilling; Removing openings through demolition

  5. Progress of fusion fuel processing system development at the Japan Atomic Energy Research Institute

    International Nuclear Information System (INIS)

    Nishi, Masataka; Yamanishi, Toshihiko; Kawamura, Yoshinori; Iwai, Yasunori; Isobe, Kanetsugu; O'Hira, Shigeru; Hayashi, Takumi; Nakamura, Hirofumi; Kobayashi, Kazuhiro; Suzuki, Takumi; Yamada, Masayuki; Konishi, Satoshi

    2000-01-01

    The Tritium Process Laboratory (TPL) at the Japan Atomic Energy Research Institute has been working on the development of fuel processing technology for fusion reactors as a major activity. A fusion fuel processing loop was installed and is being tested with tritium under reactor relevant conditions. The loop at the TPL consists of ZrCo based tritium storage beds, a plasma exhaust processing system using a palladium diffuser and an electrolytic reactor, cryogenic distillation columns for isotope separation, and analytical systems based on newly developed micro gas chromatographs and Raman Spectroscopy. Several extended demonstration campaigns were performed under realistic reactor conditions to test tritiated impurity processing. A sophisticated control technique of distillation column was performed at the same time, and integrated fuel circulation was successfully demonstrated. Major recent design work on the International Thermonuclear Experimental Reactor (ITER) tritium plant at the TPL is devoted to water detritiation based on liquid phase catalytic exchange for improved tritium removal from waste water

  6. Integrative techniques related to positive processes in psychotherapy.

    Science.gov (United States)

    Cromer, Thomas D

    2013-09-01

    This review compiles and evaluates a number of therapist interventions that have been found to significantly contribute to positive psychotherapy processes (i.e., increased alliance, patient engagement/satisfaction, and symptomatic improvement). Four forms of intervention are presented: Affect-focused, Supportive, Exploratory, and Patient-Therapist Interaction. The intention of this review is to link specific interventions to applied practice so that integrative clinicians can potentially use these techniques to improve their clinical work. To this end, there is the inclusion of theory and empirical studies from a range of orientations including Emotionally Focused, Psychodynamic, Client-Centered, Cognitive-Behavioral, Interpersonal, Eclectic, and Motivational Interviewing. Each of the four sections will include the theoretical basis and proposed mechanism of change for the intervention, research that supports its positive impact on psychotherapy processes, and conclude with examples demonstrating its use in actual practice. Clinical implications and considerations regarding the use of these interventions will also be presented. 2013 APA, all rights reserved

  7. A high-resolution neutron spectra unfolding method using the Genetic Algorithm technique

    CERN Document Server

    Mukherjee, B

    2002-01-01

    The Bonner sphere spectrometers (BSS) are commonly used to determine the neutron spectra within various nuclear facilities. Sophisticated mathematical tools are used to unfold the neutron energy distribution from the output data of the BSS. This paper highlights a novel high-resolution neutron spectra-unfolding method using the Genetic Algorithm (GA) technique. The GA imitates the biological evolution process prevailing in the nature to solve complex optimisation problems. The GA method was utilised to evaluate the neutron energy distribution, average energy, fluence and equivalent dose rates at important work places of a DIDO class research reactor and a high-energy superconducting heavy ion cyclotron. The spectrometer was calibrated with a 241Am/Be (alpha,n) neutron standard source. The results of the GA method agreed satisfactorily with the results obtained by using the well-known BUNKI neutron spectra unfolding code.
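GA-based unfolding of the kind described above searches for a flux vector whose folding through the sphere response matrix reproduces the measured counts. The toy sketch below uses an invented 3x4 response matrix and a plain elitist GA (truncation selection, uniform crossover, Gaussian mutation); it is a minimal illustration of the idea, not the author's method or a calibrated BSS response set.

```python
import random

random.seed(0)

# Toy Bonner-sphere response matrix (3 spheres x 4 energy groups) -- illustrative
# values only, not calibrated responses.
R = [[0.9, 0.4, 0.1, 0.0],
     [0.3, 0.8, 0.5, 0.2],
     [0.0, 0.2, 0.6, 0.9]]
true_flux = [1.0, 2.0, 0.5, 1.5]
counts = [sum(r * f for r, f in zip(row, true_flux)) for row in R]

def residual(flux):
    """Sum of squared differences between folded flux and measured counts."""
    return sum((sum(r * f for r, f in zip(row, flux)) - c) ** 2
               for row, c in zip(R, counts))

def evolve(pop_size=60, generations=400, sigma=0.1):
    """Minimal GA: keep the fittest half, breed children by uniform crossover
    plus Gaussian mutation, and clip fluxes at zero (no negative fluence)."""
    pop = [[random.uniform(0.0, 3.0) for _ in range(4)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=residual)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]
            child = [max(0.0, g + random.gauss(0.0, sigma)) for g in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=residual)

best = evolve()
```

Because a few spheres constrain many energy groups, the problem is under-determined; real unfolding codes add regularization or a prior spectrum, which this sketch omits.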

  8. EU-Korea FTA and Its Impact on V4 Economies. A Comparative Analysis of Trade Sophistication and Intra-Industry Trade

    Directory of Open Access Journals (Sweden)

    Michalski Bartosz

    2018-03-01

    Full Text Available This paper investigates selected short- and mid-term effects in trade in goods between the Visegrad countries (V4: the Czech Republic, Hungary, Poland and the Slovak Republic) and the Republic of Korea under the framework of the Free Trade Agreement between the European Union and the Republic of Korea. This Agreement is described in the “Trade for All” (2015: 9) strategy as the most ambitious trade deal ever implemented by the EU. The primary purpose of our analysis is to identify, compare, and evaluate the evolution of the technological sophistication of bilateral exports and imports. Another dimension of the paper concentrates on the developments within intra-industry trade. Moreover, these objectives are approached taking into account the context of the South Korean direct investment inflow to the V4. The evaluation of technological sophistication is based on UNCTAD’s methodology, while the intensity of intra-industry trade is measured by the GL-index and identification of its subcategories (horizontal and vertical trade). The analysis covers the timespan 2001–2015. The novelty of the paper lies in the fact that the study of South Korean-V4 trade relations has not so far been carried out from this perspective. Thus this paper investigates interesting phenomena identified in the trade between the Republic of Korea (ROK) and V4 economies. The main findings imply an impact of South Korean direct investments on trade. This is represented by the trade deficit of the V4 with ROK and the structure of bilateral trade in terms of its technological sophistication. South Korean investments might also have had positive consequences for the evolution of IIT, particularly in the machinery sector. The political interpretation indicates that they may strengthen common threats associated with the middle-income trap, particularly the technological gap and the emphasis placed on lower costs of production.
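The GL-index used above is the Grubel-Lloyd measure, GL = 1 - |X - M|/(X + M) for a product group with exports X and imports M. A minimal sketch, with the trade flows below invented for illustration:

```python
def gl_index(exports, imports):
    """Grubel-Lloyd index for one product group: 1 means pure intra-industry
    (two-way) trade, 0 means pure inter-industry (one-way) trade."""
    total = exports + imports
    if total == 0:
        return 0.0
    return 1.0 - abs(exports - imports) / total

def aggregate_gl(flows):
    """Trade-weighted aggregate GL index over (exports, imports) pairs."""
    total = sum(x + m for x, m in flows)
    return sum((x + m) / total * gl_index(x, m) for x, m in flows)

# Illustrative sectors: balanced machinery trade vs. one-way electronics imports
print(gl_index(100, 100))   # → 1.0
print(gl_index(0, 150))     # → 0.0
```

Splitting the GL figure into horizontal and vertical IIT, as the paper does, additionally requires comparing unit values of matched export and import flows, which this sketch does not attempt.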

  9. Multi-factor models and signal processing techniques application to quantitative finance

    CERN Document Server

    Darolles, Serges; Jay, Emmanuelle

    2013-01-01

    With recent outbreaks of multiple large-scale financial crises, amplified by interconnected risk sources, a new paradigm of fund management has emerged. This new paradigm leverages "embedded" quantitative processes and methods to provide more transparent, adaptive, reliable and easily implemented "risk assessment-based" practices. This book surveys the most widely used factor models employed within the field of financial asset pricing. Through the concrete application of evaluating risks in the hedge fund industry, the authors demonstrate that signal processing techniques are an intere

  10. Correlation techniques for the improvement of signal-to-noise ratio in measurements with stochastic processes

    CERN Document Server

    Reddy, V R; Reddy, T G; Reddy, P Y; Reddy, K R

    2003-01-01

    An AC modulation technique is described to convert stochastic signal variations into an amplitude variation and its retrieval through Fourier analysis. It is shown that this AC detection of signals of stochastic processes, when processed through auto- and cross-correlation techniques, improves the signal-to-noise ratio; the correlation techniques serve a similar purpose of frequency and phase filtering as that of phase-sensitive detection. A few model calculations applied to nuclear spectroscopy measurements such as Angular Correlations, Mossbauer spectroscopy and Pulse Height Analysis reveal considerable improvement in the sensitivity of signal detection. Experimental implementation of the technique is presented in terms of amplitude variations of harmonics representing the derivatives of normal spectra. Improved detection sensitivity to spectral variations is shown to be significant. These correlation techniques are general and can be made applicable to all the fields of particle counting where measurements ar...
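The SNR gain from autocorrelation can be seen in a toy numerical experiment: at a lag of one modulation period the coherent (modulated) component reinforces itself while uncorrelated noise averages toward zero. The signal model below is illustrative, not one of the paper's spectroscopy calculations.

```python
import math
import random

random.seed(1)
N = 4000
f = 0.05                                  # modulation frequency, cycles per sample
clean = [math.sin(2 * math.pi * f * n) for n in range(N)]
noisy = [s + random.gauss(0.0, 2.0) for s in clean]   # noise power well above signal

def autocorr(x, lag):
    """Biased autocorrelation estimate of a sequence at the given lag."""
    return sum(x[n] * x[n + lag] for n in range(len(x) - lag)) / len(x)

# At lag = one modulation period the sinusoid contributes ~A^2/2 = 0.5,
# while the lag-0 value is dominated by the noise variance.
period = int(round(1 / f))
print(autocorr(noisy, 0), autocorr(noisy, period))
```

The periodic component, invisible in the raw trace, stands out clearly in the autocorrelation at multiples of the modulation period, which is the filtering effect the abstract attributes to correlation processing.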

  11. Fully Solution-Processable Fabrication of Multi-Layered Circuits on a Flexible Substrate Using Laser Processing

    Directory of Open Access Journals (Sweden)

    Seok Young Ji

    2018-02-01

    Full Text Available The development of printing technologies has enabled the realization of electric circuit fabrication on a flexible substrate. However, the current technique remains restricted to single-layer patterning. In this paper, we demonstrate a fully solution-processable patterning approach for multi-layer circuits using a combined method of laser sintering and ablation. Selective laser sintering of silver (Ag) nanoparticle-based ink is applied to make conductive patterns on a heat-sensitive substrate and insulating layer. The laser beam path and irradiation fluence are controlled to create circuit patterns for flexible electronics. Microvia drilling using femtosecond laser through the polyvinylphenol-film insulating layer by laser ablation, as well as sequential coating of Ag ink and laser sintering, achieves an interlayer interconnection between multi-layer circuits. The dimension of microvia is determined by a sophisticated adjustment of the laser focal position and intensity. Based on these methods, a flexible electronic circuit with chip-size-package light-emitting diodes was successfully fabricated and demonstrated to have functional operations.

  12. Fully Solution-Processable Fabrication of Multi-Layered Circuits on a Flexible Substrate Using Laser Processing

    Science.gov (United States)

    Ji, Seok Young; Choi, Wonsuk; Jeon, Jin-Woo; Chang, Won Seok

    2018-01-01

    The development of printing technologies has enabled the realization of electric circuit fabrication on a flexible substrate. However, the current technique remains restricted to single-layer patterning. In this paper, we demonstrate a fully solution-processable patterning approach for multi-layer circuits using a combined method of laser sintering and ablation. Selective laser sintering of silver (Ag) nanoparticle-based ink is applied to make conductive patterns on a heat-sensitive substrate and insulating layer. The laser beam path and irradiation fluence are controlled to create circuit patterns for flexible electronics. Microvia drilling using femtosecond laser through the polyvinylphenol-film insulating layer by laser ablation, as well as sequential coating of Ag ink and laser sintering, achieves an interlayer interconnection between multi-layer circuits. The dimension of microvia is determined by a sophisticated adjustment of the laser focal position and intensity. Based on these methods, a flexible electronic circuit with chip-size-package light-emitting diodes was successfully fabricated and demonstrated to have functional operations. PMID:29425144

  13. Sophisticated approval voting, ignorance priors, and plurality heuristics: a behavioral social choice analysis in a Thurstonian framework.

    Science.gov (United States)

    Regenwetter, Michel; Ho, Moon-Ho R; Tsetlin, Ilia

    2007-10-01

    This project reconciles historically distinct paradigms at the interface between individual and social choice theory, as well as between rational and behavioral decision theory. The authors combine a utility-maximizing prescriptive rule for sophisticated approval voting with the ignorance prior heuristic from behavioral decision research and two types of plurality heuristics to model approval voting behavior. When using a sincere plurality heuristic, voters simplify their decision process by voting for their single favorite candidate. When using a strategic plurality heuristic, voters strategically focus their attention on the 2 front-runners and vote for their preferred candidate among these 2. Using a hierarchy of Thurstonian random utility models, the authors implemented these different decision rules and tested them statistically on 7 real world approval voting elections. They cross-validated their key findings via a psychological Internet experiment. Although a substantial number of voters used the plurality heuristic in the real elections, they did so sincerely, not strategically. Moreover, even though Thurstonian models do not force such agreement, the results show, in contrast to common wisdom about social choice rules, that the sincere social orders by Condorcet, Borda, plurality, and approval voting are identical in all 7 elections and in the Internet experiment. PsycINFO Database Record (c) 2007 APA, all rights reserved.

  14. Microstructure and associated properties of YBa2Cu3Ox superconductors prepared by melt-processing techniques

    International Nuclear Information System (INIS)

    Balachandran, U.; Zhong, W.; Youngdahl, C.A.; Poeppel, R.B.

    1993-03-01

    From the standpoint of applications, melt-processed bulk YBa2Cu3Ox (YBCO) superconductors are of considerable interest. We have studied the microstructure and levitation force of melt-processed YBCO, YBCO plus Y2BaCuO5, and YBCO plus Pt samples. Large single crystalline samples, grown using a seeding technique, were also studied. The levitation force is highest in melt-processed samples made by the seeding technique. 6 figs, 24 refs

  15. The Design and Development of Test Platform for Wheat Precision Seeding Based on Image Processing Techniques

    OpenAIRE

    Li , Qing; Lin , Haibo; Xiu , Yu-Feng; Wang , Ruixue; Yi , Chuijie

    2009-01-01

    International audience; The test platform of wheat precision seeding based on image processing techniques is designed to develop the wheat precision seed metering device with high efficiency and precision. Using image processing techniques, this platform gathers images of seeds (wheat) on the conveyer belt which are falling from seed metering device. Then these data are processed and analyzed to calculate the qualified rate, reseeding rate and leakage sowing rate, etc. This paper introduces t...
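The three performance rates named above follow directly from counting how many seeds the vision system detects per metering cell. A minimal sketch under the usual convention (assumed here, since the abstract does not define it): exactly one seed is qualified, more than one is reseeding, and an empty cell is leakage sowing.

```python
def seeding_rates(seeds_per_cell):
    """Classify each metering cell by the number of seeds counted in its image
    and return (qualified, reseeding, leakage) as fractions of all cells."""
    n = len(seeds_per_cell)
    qualified = sum(1 for c in seeds_per_cell if c == 1) / n
    reseeding = sum(1 for c in seeds_per_cell if c > 1) / n
    leakage = sum(1 for c in seeds_per_cell if c == 0) / n
    return qualified, reseeding, leakage

# Hypothetical counts from five consecutive cells on the conveyor belt
print(seeding_rates([1, 1, 0, 2, 1]))  # → (0.6, 0.2, 0.2)
```

The image-processing stage reduces each belt frame to such a per-cell seed count, after which the rate calculation is trivial bookkeeping.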

  16. Recent developments in numerical simulation techniques of thermal recovery processes

    Energy Technology Data Exchange (ETDEWEB)

    Tamim, M. [Bangladesh University of Engineering and Technology, Bangladesh (Bangladesh); Abou-Kassem, J.H. [Chemical and Petroleum Engineering Department, UAE University, Al-Ain 17555 (United Arab Emirates); Farouq Ali, S.M. [University of Alberta, Alberta (Canada)

    2000-05-01

    Numerical simulation of thermal processes (steam flooding, steam stimulation, SAGD, in-situ combustion, electrical heating, etc.) is an integral part of a thermal project design. The general tendency in the last 10 years has been to use commercial simulators. During the last decade, only a few new models have been reported in the literature. More work has been done to modify and refine solutions to existing problems to improve the efficiency of simulators. The paper discusses some of the recent developments in simulation techniques of thermal processes such as grid refinement, grid orientation, effect of temperature on relative permeability, mathematical models, and solution methods. The various aspects of simulation discussed here promote better understanding of the problems encountered in the simulation of thermal processes and will be of value to both simulator users and developers.

  17. Acoustic levitation technique for containerless processing at high temperatures in space

    Science.gov (United States)

    Rey, Charles A.; Merkley, Dennis R.; Hammarlund, Gregory R.; Danley, Thomas J.

    1988-01-01

    High temperature processing of a small specimen without a container has been demonstrated in a set of experiments using an acoustic levitation furnace in the microgravity of space. This processing technique includes the positioning, heating, melting, cooling, and solidification of a material supported without physical contact with a container or other surface. The specimen is supported in a potential energy well, created by an acoustic field, which is sufficiently strong to position the specimen in the microgravity environment of space. This containerless processing apparatus has been successfully tested on the Space Shuttle during the STS-61A mission. In that experiment, three samples were successfully levitated and processed at temperatures from 600 to 1500 °C. Experiment data and results are presented.

  18. Comparison and Evaluation of Various Tritium Decontamination Techniques and Processes

    International Nuclear Information System (INIS)

    Gentile, C.A.; Langish, S.W.; Skinner, C.H.; Ciebiera, L.P.

    2004-01-01

    In support of fusion energy development, various techniques and processes have been developed over the past two decades for the removal and decontamination of tritium from a variety of items, surfaces, and components. Tritium decontamination, by chemical, physical, mechanical, or a combination of these methods, is driven by two underlying motivational forces. The first of these motivational forces is safety. Safety is paramount to the established culture associated with fusion energy. The second of these motivational forces is cost. In all aspects, less tritium contamination equals lower operational and disposal costs. This paper will discuss and evaluate the various processes employed for tritium removal and decontamination

  20. Multi-beam backscatter image data processing techniques employed to EM 1002 system

    Digital Repository Service at National Institute of Oceanography (India)

    Fernandes, W.A.; Chakraborty, B.

    to compensate outer-beam backscatter strength data in such a way that the effect of angular backscatter strength is removed. In this work we have developed backscatter data processing techniques for EM1002 multi-beam system...

  1. Development of an Advanced, Automatic, Ultrasonic NDE Imaging System via Adaptive Learning Network Signal Processing Techniques

    Science.gov (United States)

    1981-03-13

    ...in concert with a sophisticated detector... Whalen, M.F., L.J. O'Brien, and A.N. Mucciardi, "Application of Adaptive Learning Networks for the Characterization of Two

  2. Opportunities and applications of medical imaging and image processing techniques for nondestructive testing

    International Nuclear Information System (INIS)

    Song, Samuel Moon Ho; Cho, Jung Ho; Son, Sang Rock; Sung, Je Jonng; Ahn, Hyung Keun; Lee, Jeong Soon

    2002-01-01

    Nondestructive testing (NDT) of structures strives to extract all relevant data regarding the state of the structure without altering its form or properties. The success enjoyed by imaging and image processing technologies in the field of modern medicine forecasts similar success of image processing related techniques both in research and practice of NDT. In this paper, we focus on two particular instances of such applications: a modern vision technique for 3-D profile and shape measurement, and ultrasonic imaging with rendering for 3-D visualization. Ultrasonic imaging of 3-D structures for nondestructive evaluation purposes must provide readily recognizable 3-D images with enough details to clearly show various faults that may or may not be present. As a step towards improving conspicuity and thus detection of faults, we propose a pulse-echo ultrasonic imaging technique to generate a 3-D image of the 3-D object under evaluation through strategic scanning and processing of the pulse-echo data. This three-dimensional processing and display improves conspicuity of faults and, in addition, provides manipulation capabilities such as pan and rotation of the 3-D structure. As a second application, we consider an image based three-dimensional shape determination system. The shape, and thus the three-dimensional coordinate information of the 3-D object, is determined solely from captured images of the 3-D object from a prescribed set of viewpoints. The approach is based on the shape from silhouette (SFS) technique and the efficacy of the SFS method is tested using a sample data set. This system may be used to visualize the 3-D object efficiently, or to quickly generate initial CAD data for reverse engineering purposes. The proposed system potentially may be used in three dimensional design applications such as 3-D animation and 3-D games.
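The shape-from-silhouette idea can be sketched as voxel carving: a voxel survives only if it projects inside the silhouette of every view. A minimal illustration with two hypothetical orthographic views (the projections and masks are invented for the example, not taken from the paper):

```python
import numpy as np

def carve(voxels, silhouettes, projections):
    """Keep only the voxels that project inside every silhouette mask."""
    keep = np.ones(len(voxels), dtype=bool)
    for sil, proj in zip(silhouettes, projections):
        uv = proj(voxels)                     # (N, 2) integer pixel coordinates
        keep &= sil[uv[:, 1], uv[:, 0]] > 0   # inside-silhouette test per view
    return voxels[keep]

# A 3x3x3 voxel grid observed from two orthographic viewpoints
voxels = np.array([(x, y, z) for x in range(3) for y in range(3) for z in range(3)])
top = np.zeros((3, 3)); top[1, 1] = 1         # top view: only the centre pixel filled
front = np.ones((3, 3))                       # front view: everything filled
carved = carve(voxels,
               [top, front],
               [lambda v: v[:, [0, 1]],       # top view sees (x, y)
                lambda v: v[:, [0, 2]]])      # front view sees (x, z)
```

Real systems use calibrated perspective projections; the result here is the vertical column of voxels consistent with both silhouettes.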

  3. Processing techniques for data from the Kuosheng Unit 1 shakedown safety-relief-valve tests

    International Nuclear Information System (INIS)

    McCauley, E.W.; Rompel, S.L.; Weaver, H.J.; Altenbach, T.J.

    1982-08-01

    This report describes techniques developed at the Lawrence Livermore National Laboratory, Livermore, CA for processing original data from the Taiwan Power Company's Kuosheng MKIII Unit 1 Safety Relief Valve Shakedown Tests conducted in April/May 1981. The computer codes used, TPSORT, TPPLOT, and TPPSD, form a special evaluation system for treating the data from its original packed binary form to ordered, calibrated ASCII transducer files and then to production of time-history plots, numerical output files, and spectral analyses. Using the data processing techniques described, a convenient means of independently examining and analyzing a unique data base for steam condensation phenomena in the MARKIII wetwell is described. The techniques developed for handling these data are applicable to the treatment of similar, but perhaps differently structured, experiment data sets
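The calibration-then-spectral-analysis stage of such a pipeline can be sketched briefly. This is not the TPSORT/TPPLOT/TPPSD code itself: a linear calibration and a plain one-sided periodogram are assumed for illustration.

```python
import numpy as np

def calibrate(raw_counts, gain, offset):
    """Convert raw transducer counts to engineering units (linear calibration)."""
    return gain * np.asarray(raw_counts, dtype=float) + offset

def psd(signal, fs):
    """One-sided power spectral density via the periodogram (no windowing)."""
    n = len(signal)
    spec = np.fft.rfft(signal)
    p = (np.abs(spec) ** 2) / (fs * n)
    p[1:-1] *= 2.0                              # fold in negative frequencies
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, p

# A pure 50 Hz tone should concentrate its power near the 50 Hz bin
fs = 1000.0
t = np.arange(1024) / fs
f, p = psd(np.sin(2 * np.pi * 50 * t), fs)
```

Production code would add windowing and segment averaging (Welch's method) to reduce variance.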

  4. ABAQUS2MATLAB: A Novel Tool for Finite Element Post-Processing

    DEFF Research Database (Denmark)

    Martínez Pañeda, Emilio; Papazafeiropoulos, George; Muniz-Calvente, Miguel

    2017-01-01

    A novel piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes benefits from the image processing and the integrated graph-plotting features of Matlab. Examples are provided to demonstrate its capabilities. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  5. Control System Design for Cylindrical Tank Process Using Neural Model Predictive Control Technique

    Directory of Open Access Journals (Sweden)

    M. Sridevi

    2010-10-01

    Full Text Available Chemical manufacturing and process industry requires innovative technologies for process identification. This paper deals with model identification and control of a cylindrical tank process. Model identification of the process was done using the ARMAX technique. A neural model predictive controller was designed for the identified model. The performance of the controllers was evaluated using MATLAB software. The performance of the NMPC controller was compared with that of a Smith predictor controller and an IMC controller based on rise time, settling time, overshoot and ISE, and it was found that the NMPC controller is better suited for this process.
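The identification step can be illustrated with a least-squares fit of an ARX model, which is the ARMAX structure without its moving-average noise polynomial (a simplification for the sketch; the paper's exact estimator is not specified here):

```python
import numpy as np

def fit_arx(u, y, na=1, nb=1):
    """Least-squares fit of y[k] = -a1*y[k-1] - ... + b1*u[k-1] + ...
    Returns [a1..a_na, b1..b_nb]."""
    n = max(na, nb)
    rows, targets = [], []
    for k in range(n, len(y)):
        row = [-y[k - i] for i in range(1, na + 1)]
        row += [u[k - i] for i in range(1, nb + 1)]
        rows.append(row)
        targets.append(y[k])
    theta, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
    return theta

# Recover the parameters of a known first-order plant from input/output data
rng = np.random.default_rng(0)
u = rng.standard_normal(200)
y = np.zeros(200)
for k in range(1, 200):
    y[k] = 0.8 * y[k - 1] + 0.5 * u[k - 1]    # true plant
a1, b1 = fit_arx(u, y)
```

With noiseless data the estimate is exact: a1 = -0.8 (note the sign convention) and b1 = 0.5.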

  6. The Use of Plasma Technique in Nitridation Process of Metal Alloy DIN 42CrMo4

    International Nuclear Information System (INIS)

    Purwanto; Malau, Viktor; Tjipto Sujitno

    2003-01-01

    Nitridation with the plasma technique is one method of surface treatment of a material. Research on the plasma technique for the nitridation process has been carried out to determine the effect of nitridation on the properties of metal alloy DIN 42CrMo4. The nitridation process with the plasma technique was conducted in a vacuum tube under the following conditions: a pressure of 0.36 torr, a temperature of 300 °C, and nitridation times of 1, 2, and 3 hours. The nitridation process was followed by hardness measurement using a High Quality Micro Hardness Tester machine, serial number MM-0054, as well as microstructure examination using a Scanning Electron Microscope (SEM) coupled with Energy Dispersive Spectroscopy (EDS) EDAX-DX4. The results showed that surface hardness increased after the nitridation process. For nitridation times of 1, 2, and 3 hours, the hardness increased from 291 kg/mm² to 303 kg/mm², 324 kg/mm² and 403 kg/mm², respectively. Microstructure observation showed that a new phase of iron nitride (Fe₄N) was formed, with 4.17% nitrogen by weight (14.73% nitrogen atoms) and a layer thickness of 5.71 μm, 5.08% nitrogen by weight (17.51% nitrogen atoms) and 6.78 μm thickness, and 5.69% nitrogen by weight (19.24% nitrogen atoms) and 8.57 μm thickness, respectively. (author)

  7. Diazo processing of LANDSAT imagery: A low-cost instructional technique

    Science.gov (United States)

    Lusch, D. P.

    1981-01-01

    Diazo processing of LANDSAT imagery is a relatively simple and cost-effective method of producing enhanced renditions of the visual LANDSAT products. This technique is capable of producing a variety of image enhancements which have value in a teaching laboratory environment. Additionally, with the appropriate equipment, applications research which relies on accurate and repeatable results is possible. Exposure and development equipment options, diazo materials, and enhancement routines are discussed.

  8. Measurable Disturbances Compensation: Analysis and Tuning of Feedforward Techniques for Dead-Time Processes

    Directory of Open Access Journals (Sweden)

    Andrzej Pawlowski

    2016-04-01

    Full Text Available In this paper, measurable disturbance compensation techniques are analyzed, focusing on the input-output and disturbance-output time delays. The feedforward compensation method is evaluated for the common structures that appear between the disturbance and process dynamics. Due to the presence of time delays, the study includes causality and instability phenomena that can arise when a classical approach for disturbance compensation is used. Different feedforward configurations are analyzed for two feedback control techniques, PID (Proportional-Integral-Derivative) and MPC (Model Predictive Control), that are widely used for industrial process-control applications. A specific tuning methodology for the analyzed process structure is used to obtain disturbance rejection performance improved with respect to classical approaches. The evaluation of the introduced disturbance rejection schemes is performed through simulation, considering process constraints in order to highlight the advantages and drawbacks in common scenarios. The performance of the analyzed structures is expressed with different indexes that allow direct comparisons. The obtained results show that proper design and tuning of the feedforward action helps to significantly improve the overall control performance in process control tasks.
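The causality issue mentioned above can be made concrete with the classical static feedforward rule: compensator gain -kd/kp and delay equal to the difference of the disturbance and process dead times, clipped to zero when the required delay would be negative (non-causal). This is the textbook design, not the paper's specific tuning.

```python
def feedforward_design(kp, theta_p, kd, theta_d):
    """Static feedforward for a measurable disturbance: gain -kd/kp and
    delay theta_d - theta_p. A negative required delay is non-causal and
    is clipped to zero, the usual practical fallback."""
    gain = -kd / kp
    delay = theta_d - theta_p
    causal = delay >= 0
    return gain, max(delay, 0.0), causal

# Disturbance reaches the output later than the input does, so the design is causal
gain, delay, causal = feedforward_design(kp=2.0, theta_p=1.0, kd=0.5, theta_d=3.0)
```

When `causal` is False, perfect compensation is impossible and only the residual effect can be reduced, which is exactly the regime the paper analyzes.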

  9. A safeguards verification technique for solution homogeneity and volume measurements in process tanks

    International Nuclear Information System (INIS)

    Suda, S.; Franssen, F.

    1987-01-01

    A safeguards verification technique is being developed for determining whether process-liquid homogeneity has been achieved in process tanks and for authenticating volume-measurement algorithms involving temperature corrections. It is proposed that, in new designs for bulk-handling plants employing automated process lines, bubbler probes and thermocouples be installed at several heights in key accountability tanks. High-accuracy measurements of density using an electromanometer can now be made which match or even exceed analytical-laboratory accuracies. Together with regional determination of tank temperatures, these measurements provide density, liquid-column weight and temperature gradients over the fill range of the tank that can be used to ascertain when the tank solution has reached equilibrium. Temperature-correction algorithms can be authenticated by comparing the volumes obtained from the several bubbler-probe liquid-height measurements, each based on different amounts of liquid above and below the probe. The verification technique is based on the automated electromanometer system developed by Brookhaven National Laboratory (BNL). The IAEA has recently approved the purchase of a stainless-steel tank equipped with multiple bubbler and thermocouple probes for installation in its Bulk Calibration Laboratory at IAEA Headquarters, Vienna. The verification technique is scheduled for preliminary trials in late 1987
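The density and level computations behind such bubbler-probe measurements follow directly from hydrostatics: the density is the differential pressure between two probes divided by g times their vertical separation. A sketch with illustrative numbers (not plant data):

```python
G = 9.80665  # standard gravity, m/s^2

def density_from_bubblers(p_lower, p_upper, dz):
    """Solution density (kg/m^3) from the differential pressure (Pa) between
    two bubbler probes separated vertically by dz metres."""
    return (p_lower - p_upper) / (G * dz)

def level_above_probe(p_probe, rho):
    """Liquid height (m) above a bubbler probe from its gauge pressure (Pa)."""
    return p_probe / (rho * G)

# Probes 0.5 m apart in a solution of roughly 1200 kg/m^3 (illustrative values)
rho = density_from_bubblers(p_lower=14710.0, p_upper=8826.0, dz=0.5)
h = level_above_probe(14710.0, rho)
```

Comparing the heights inferred from several probes at different elevations is the consistency check the verification technique relies on.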

  10. Processing Techniques and Applications of Silk Hydrogels in Bioengineering

    Directory of Open Access Journals (Sweden)

    Michael Floren

    2016-09-01

    Full Text Available Hydrogels are an attractive class of tunable material platforms that, combined with their structural and functional likeness to biological environments, have a diversity of applications in bioengineering. Several polymers, natural and synthetic, can be used, the material selection being based on the required functional characteristics of the prepared hydrogels. Silk fibroin (SF) is an attractive natural polymer for its excellent processability, biocompatibility, controlled degradation, mechanical properties and tunable formats, making it a good candidate for the fabrication of hydrogels. Tremendous effort has been made to control the structural and functional characteristics of silk hydrogels, integrating novel biological features with advanced processing techniques, to develop the next generation of functional SF hydrogels. Here, we review the several processing methods developed to prepare advanced SF hydrogel formats, emphasizing a bottom-up approach beginning with critical structural characteristics of silk proteins and their behavior under specific gelation environments. Additionally, the preparation of SF hydrogel blends and other advanced formats will also be discussed. We conclude with a brief description of the attractive utility of SF hydrogels in relevant bioengineering applications.

  11. Ignition and monitoring technique for plasma processing of multicell superconducting radio-frequency cavities

    Science.gov (United States)

    Doleans, Marc

    2016-12-01

    An in-situ plasma processing technique has been developed at the Spallation Neutron Source (SNS) to improve the performance of the superconducting radio-frequency (SRF) cavities in operation. The technique uses a low-density reactive neon-oxygen plasma at room-temperature to improve the surface work function, to help remove adsorbed gases on the RF surface, and to reduce its secondary emission yield. SNS SRF cavities have six accelerating cells and the plasma typically ignites in the cell where the electric field is the highest. This article details the technique to ignite and monitor the plasma in each cell of the SNS cavities.

  12. RSYST: From nuclear reactor calculations towards a highly sophisticated scientific software integration environment

    International Nuclear Information System (INIS)

    Noack, M.; Seybold, J.; Ruehle, R.

    1996-01-01

    The software environment RSYST was originally used to solve problems of reactor physics. The consideration of advanced scientific simulation requirements and the strict application of modern software design principles led to a system which is perfectly suitable to solve problems in various complex scientific problem domains. Starting with a review of the early days of RSYST, we describe the straight evolution driven by the need for a software environment which combines the advantages of a high-performance database system with the capability to integrate sophisticated scientific technical applications. The RSYST architecture is presented and the data modelling capabilities are described. To demonstrate the powerful possibilities and flexibility of the RSYST environment, we describe a wide range of RSYST applications, e.g., mechanical simulations of multibody systems, which are used in biomechanical research, civil engineering and robotics. In addition, a hypermedia system which is used for scientific technical training and documentation is presented. (orig.) [de]

  13. Developments in functional neuroimaging techniques

    International Nuclear Information System (INIS)

    Aine, C.J.

    1995-01-01

    A recent review of neuroimaging techniques indicates that new developments have primarily occurred in the area of data acquisition hardware/software technology. For example, new pulse sequences on standard clinical imagers and high-powered, rapidly oscillating magnetic field gradients used in echo planar imaging (EPI) have advanced MRI into the functional imaging arena. Significant developments in tomograph design have also been achieved for monitoring the distribution of positron-emitting radioactive tracers in the body (PET). Detector sizes, which pose a limit on spatial resolution, have become smaller (e.g., 3--5 mm wide) and a new emphasis on volumetric imaging has emerged which affords greater sensitivity for determining locations of positron annihilations and permits smaller doses to be utilized. Electromagnetic techniques have also witnessed growth in the ability to acquire data from the whole head simultaneously. EEG techniques have increased their electrode coverage (e.g., 128 channels rather than 16 or 32) and new whole-head systems are now in use for MEG. But the real challenge now is in the design and implementation of more sophisticated analyses to effectively handle the tremendous amount of physiological/anatomical data that can be acquired. Furthermore, such analyses will be necessary for integrating data across techniques in order to provide a truly comprehensive understanding of the functional organization of the human brain

  14. Image processing techniques for thermal, x-rays and nuclear radiations

    International Nuclear Information System (INIS)

    Chadda, V.K.

    1998-01-01

    The paper describes image acquisition techniques for the non-visible range of the electromagnetic spectrum, especially thermal, x-rays and nuclear radiations. Thermal imaging systems are valuable tools used for applications ranging from PCB inspection, hot spot studies, fire identification and satellite imaging to defense applications. Penetrating radiations like x-rays and gamma rays are used in NDT, baggage inspection, CAT scan, cardiology, radiography, nuclear medicine etc. Neutron radiography complements conventional x-ray and gamma radiography. For these applications, image processing and computed tomography are employed for 2-D and 3-D image interpretation respectively. The paper also covers main features of image processing systems for quantitative evaluation of gray level and binary images. (author)

  15. Parallel preconditioning techniques for sparse CG solvers

    Energy Technology Data Exchange (ETDEWEB)

    Basermann, A.; Reichel, B.; Schelthoff, C. [Central Institute for Applied Mathematics, Juelich (Germany)

    1996-12-31

    Conjugate gradient (CG) methods to solve sparse systems of linear equations play an important role in numerical methods for solving discretized partial differential equations. The large size and the condition of many technical or physical applications in this area result in the need for efficient parallelization and preconditioning techniques of the CG method. In particular for very ill-conditioned matrices, sophisticated preconditioners are necessary to obtain both acceptable convergence and accuracy of CG. Here, we investigate variants of polynomial and incomplete Cholesky preconditioners that markedly reduce the iterations of the simply diagonally scaled CG and are shown to be well suited for massively parallel machines.
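The diagonally scaled CG baseline mentioned above can be sketched in a few lines: a Jacobi (diagonal) preconditioner inside the standard preconditioned CG loop. The polynomial and incomplete Cholesky variants studied in the paper replace only the preconditioner application step.

```python
import numpy as np

def pcg(A, b, tol=1e-10, max_iter=200):
    """Preconditioned conjugate gradients with a Jacobi preconditioner,
    i.e. the 'simply diagonally scaled CG'. A must be symmetric positive definite."""
    m_inv = 1.0 / np.diag(A)           # preconditioner: inverse of diag(A)
    x = np.zeros_like(b)
    r = b - A @ x
    z = m_inv * r
    p = z.copy()
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = m_inv * r                  # apply preconditioner to the new residual
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x

# Small SPD test system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x = pcg(A, b)
```

For a dense example the matrix-vector products are written with `@`; a sparse implementation would swap in a sparse matrix type without changing the algorithm.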

  16. Development of spent solvent treatment process by a submerged combustion technique

    International Nuclear Information System (INIS)

    Uchiyama, Gunzo; Maeda, Mitsuru; Fujine, Sachio; Amakawa, Masayuki; Uchida, Katsuhide; Chida, Mitsuhisa

    1994-01-01

    An experimental study using bench-scale equipment processing 1 kg of simulated spent solvent per hour has been conducted in order to evaluate the applicability of a submerged combustion technique to the treatment of spent solvents contaminated with TRU elements. This report describes the experimental results on the combustion characteristics of the simulated spent solvents of tri-n-butyl phosphate and/or n-dodecane, and on the distribution behaviors of combustion products such as phosphoric acid, Ru, I, Zr and lanthanides as TRU simulants in the submerged combustion process. The experimental results of TRU separation from phosphoric acid solution by co-precipitation using bismuth phosphate are also reported. It was shown that the submerged combustion technique was applicable to the treatment of spent solvents including the distillation residues of the solvent. Based on the experimental data, a new treatment process of spent solvent was proposed which consisted of submerged combustion, co-precipitation using bismuth phosphate, ceramic membrane filtration, cementation of TRU lean phosphate, and vitrification of TRU rich waste. (author)

  17. Financial planning and analysis techniques of mining firms: a note on Canadian practice

    Energy Technology Data Exchange (ETDEWEB)

    Blanco, H.; Zanibbi, L.R. (Laurentian University, Sudbury, ON (Canada). School of Commerce and Administration)

    1992-06-01

    This paper reports on the results of a survey of the financial planning and analysis techniques in use in the mining industry in Canada. The study was undertaken to determine the current status of these practices within mining firms in Canada and to investigate the extent to which the techniques are grouped together within individual firms. In addition, tests were performed on the relationship between these groups of techniques and both organizational size and price volatility of end product. The results show that a few techniques are widely utilized in this industry but that the techniques used most frequently are not as sophisticated as reported in previous, more broadly based surveys. The results also show that firms tend to use 'bundles' of techniques and that the relative use of some of these groups of techniques is weakly associated with both organizational size and type of end product. 19 refs., 7 tabs.

  18. Seismic qualification using digital signal processing/modal testing and finite element techniques

    International Nuclear Information System (INIS)

    Steedman, J.B.; Edelstein, A.

    1983-01-01

    A systematic procedure in which digital signal processing, modal testing and finite element techniques can be used to seismically qualify Class IE equipment for use in nuclear generating stations is presented. A new method was also developed in which measured transmissibility functions and Fourier transformation techniques were combined to compute instrument response spectra. As an illustrative example of the qualification method, the paper follows the qualification of a safety related Class IE Heating, Ventilating, and Air Conditioning (HVAC) Control Panel subjected to both seismic and hydrodynamic loading conditions

  19. Digital Image Processing Technique for Breast Cancer Detection

    Science.gov (United States)

    Guzmán-Cabrera, R.; Guzmán-Sepúlveda, J. R.; Torres-Cisneros, M.; May-Arrioja, D. A.; Ruiz-Pinales, J.; Ibarra-Manzano, O. G.; Aviña-Cervantes, G.; Parada, A. González

    2013-09-01

    Breast cancer is the most common cause of death in women and the second leading cause of cancer deaths worldwide. Primary prevention in the early stages of the disease becomes complex as the causes remain almost unknown. However, some typical signatures of this disease, such as masses and microcalcifications appearing on mammograms, can be used to improve early diagnostic techniques, which is critical for women’s quality of life. X-ray mammography is the main test used for screening and early diagnosis, and its analysis and processing are the keys to improving breast cancer prognosis. As masses and benign glandular tissue typically appear with low contrast and often very blurred, several computer-aided diagnosis schemes have been developed to support radiologists and internists in their diagnosis. In this article, an approach is proposed to effectively analyze digital mammograms based on texture segmentation for the detection of early stage tumors. The proposed algorithm was tested over several images taken from the digital database for screening mammography for cancer research and diagnosis, and it was found to be well suited to distinguishing masses and microcalcifications from the background tissue using morphological operators and then extracting them through machine learning techniques and a clustering algorithm for intensity-based segmentation.
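One common building block of such intensity-based segmentation is Otsu's threshold, which separates bright structures from background tissue by maximizing the between-class variance. A self-contained sketch on a synthetic image (an illustration of the general technique, not the paper's algorithm):

```python
import numpy as np

def otsu_threshold(img):
    """Otsu's threshold on an 8-bit image: pick the gray level that
    maximizes the between-class variance of the two resulting classes."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()   # class weights
        if w0 == 0 or w1 == 0:
            continue
        mu0 = (np.arange(t) * prob[:t]).sum() / w0
        mu1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (mu0 - mu1) ** 2          # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return best_t

# Synthetic "mammogram": dim background tissue with one bright 10x10 mass
img = np.full((64, 64), 40, dtype=np.uint8)
img[20:30, 20:30] = 200
mask = img >= otsu_threshold(img)
```

On real mammograms the resulting mask would then be cleaned with morphological operators before feature extraction.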

  20. Uncertainty in safety : new techniques for the assessment and optimisation of safety in process industry

    NARCIS (Netherlands)

    Rouvroye, J.L.; Nieuwenhuizen, J.K.; Brombacher, A.C.; Stavrianidis, P.; Spiker, R.Th.E.; Pyatt, D.W.

    1995-01-01

    At this moment there is no standardised method for the assessment of safety in the process industry. Many companies and institutes use qualitative techniques for safety analysis while others use quantitative techniques. The authors of this paper will compare different techniques.

  1. Exploring the Role of Usability in the Software Process: A Study of Irish Software SMEs

    Science.gov (United States)

    O'Connor, Rory V.

    This paper explores the software processes and usability techniques used by Small and Medium Enterprises (SMEs) that develop web applications. The significance of this research is that it looks at development processes used by SMEs in order to assess to what degree usability is integrated into the process. This study seeks to gain an understanding of the level of awareness of usability within SMEs today and their commitment to usability in practice. The motivation for this research is to explore the current development processes used by SMEs in developing web applications and to understand how usability is represented in those processes. The background for this research is provided by the growth of the web application industry beyond informational web sites to more sophisticated applications delivering a broad range of functionality. This paper presents an analysis of the practices of several Irish SMEs that develop web applications through a series of case studies, focusing on SMEs that develop web applications as Management Information Systems rather than e-commerce sites, informational sites, online communities or web portals. This study gathered data about the usability techniques practiced by these companies and their awareness of usability in the context of the software process in those SMEs. The contribution of this study is to further the understanding of the current role of usability within the software development processes of SMEs that develop web applications.

  2. Liquid argon TPC signal formation, signal processing and reconstruction techniques

    Science.gov (United States)

    Baller, B.

    2017-07-01

    This document describes a reconstruction chain that was developed for the ArgoNeuT and MicroBooNE experiments at Fermilab. These experiments study accelerator neutrino interactions that occur in a Liquid Argon Time Projection Chamber. Reconstructing the properties of particles produced in these interactions benefits from the knowledge of the micro-physics processes that affect the creation and transport of ionization electrons to the readout system. A wire signal deconvolution technique was developed to convert wire signals to a standard form for hit reconstruction, to remove artifacts in the electronics chain and to remove coherent noise. A unique clustering algorithm reconstructs line-like trajectories and vertices in two dimensions which are then matched to create 3D objects. These techniques and algorithms are available to all experiments that use the LArSoft suite of software.
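The wire-signal deconvolution step can be illustrated with a Wiener-regularized frequency-domain filter: divide out the electronics response while suppressing noise-dominated frequencies. This is a simplified stand-in for the LArSoft implementation, with an invented exponential response and SNR parameter.

```python
import numpy as np

def wiener_deconvolve(signal, kernel, snr=100.0):
    """Frequency-domain deconvolution of a measured waveform by a known
    response kernel, with Wiener regularization to tame noisy frequencies."""
    n = len(signal)
    S = np.fft.rfft(signal)
    K = np.fft.rfft(kernel, n)
    W = np.conj(K) / (np.abs(K) ** 2 + 1.0 / snr)   # Wiener inverse filter
    return np.fft.irfft(S * W, n)

# A unit charge deposit smeared by a hypothetical exponential electronics response
n = 256
kernel = np.exp(-np.arange(n) / 5.0)
charge = np.zeros(n); charge[50] = 1.0
measured = np.convolve(charge, kernel)[:n]
recovered = wiener_deconvolve(measured, kernel)
```

The recovered waveform restores a sharp pulse at the original sample, which is the "standard form" that downstream hit finding expects.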

  3. Enhancement of crack detection in stud bolts of nuclear reactor by ultrasonic signal processing technique

    International Nuclear Information System (INIS)

    Lee, J.H.; Oh, W.D.; Choi, S.W.; Park, M.H.

    2004-01-01

    'Full-text:' The stud bolt is one of the most critical parts for the safety of reactor vessels in nuclear power plants. However, in applying ultrasonic techniques for crack detection in stud bolts, a difficulty encountered is distinguishing crack signals from the signals reflected from the threaded part of the bolt. In this study, a shadow effect technique combined with a new signal processing method is investigated to enhance the detectability of small cracks initiated at the thread roots of stud bolts. The key idea of the signal processing is based on the fact that the waveforms from the threads are uniform, since the shape of the threads in a bolt is the same. If cracks exist in the threads, the flaw signals differ from the reference signals. It is demonstrated that small flaws are efficiently detected by the novel ultrasonic technique combined with this new signal processing concept. (author)
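The underlying idea, that healthy threads produce near-identical echoes while a crack perturbs the waveform, can be sketched as a normalized deviation test against a healthy-thread reference. The waveforms and the 0.2 threshold below are illustrative, not values from the paper.

```python
import numpy as np

def flag_cracks(ascans, reference, threshold=0.2):
    """Flag each thread echo whose waveform deviates from the healthy-thread
    reference by more than `threshold` (RMS difference normalized by the
    reference RMS)."""
    ref = np.asarray(reference, dtype=float)
    scale = np.sqrt(np.mean(ref ** 2))
    flags = []
    for a in ascans:
        dev = np.sqrt(np.mean((np.asarray(a, float) - ref) ** 2)) / scale
        flags.append(dev > threshold)
    return flags

# Healthy threads repeat the reference; a cracked thread adds an extra echo
reference = np.sin(np.linspace(0, 4 * np.pi, 100))
cracked = reference.copy(); cracked[60:70] += 1.0
flags = flag_cracks([reference, cracked], reference)
```

Real A-scans would first be time-aligned (e.g. by cross-correlation) before the comparison.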

  4. Development and Quantification of UV-Visible and Laser Spectroscopic Techniques for Materials Accountability and Process Control

    International Nuclear Information System (INIS)

    Czerwinski, Ken; Weck, Phil; Poineau, Frederic

    2010-01-01

    Ultraviolet-Visible Spectroscopy (UV-Visible) and Time Resolved Laser Fluorescence Spectroscopy (TRLFS) optical techniques can permit on-line, real-time analysis of the actinide elements in a solvent extraction process. UV-Visible and TRLFS techniques have been used for measuring the speciation and concentration of the actinides under laboratory conditions. These methods are easily adaptable to multiple sampling geometries, such as dip probes, fiber-optic sample cells, and flow-through cell geometries. To fully exploit these techniques for GNEP applications, the fundamental speciation of the target actinides and the resulting influence on spectroscopic properties must be determined. Through this effort, detection limits, process conditions, and speciation of key actinide components can be established and utilized in a range of areas of interest to GNEP, especially in areas related to materials accountability and process control.
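Concentration determination from UV-Visible spectra rests on the Beer-Lambert law, A = ε·l·c, solved in the least-squares sense when several species absorb at several wavelengths. The molar absorptivities below are made-up illustrative numbers, not measured actinide values.

```python
import numpy as np

def concentrations(absorbance, epsilons, path_cm=1.0):
    """Multi-component Beer-Lambert inversion by least squares.
    Rows of `epsilons` are wavelengths, columns are species."""
    c, *_ = np.linalg.lstsq(np.asarray(epsilons) * path_cm,
                            np.asarray(absorbance), rcond=None)
    return c

# Two species observed at three wavelengths in a 1 cm cell
eps = np.array([[120.0, 10.0],
                [ 30.0, 90.0],
                [ 60.0, 60.0]])       # molar absorptivities, L mol^-1 cm^-1
true_c = np.array([0.002, 0.001])     # mol/L
A = eps @ true_c                      # ideal (noise-free) absorbances
c = concentrations(A, eps)
```

Using more wavelengths than species makes the inversion overdetermined, which is what gives the method robustness against measurement noise in on-line use.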

  5. Application of Electroporation Technique in Biofuel Processing

    Directory of Open Access Journals (Sweden)

    Yousuf Abu

    2017-01-01

    Full Text Available Biofuel production is mostly based on fermentation, which requires fermentable sugar as a nutrient for microbial growth. Lignocellulosic biomass (LCB) represents the most attractive, low-cost feedstock for biofuel production and is now arousing great interest. The cellulose embedded in the lignin matrix has an insoluble, highly crystalline structure, so it is difficult to hydrolyze into fermentable sugar or cell protein. On the other hand, microbial lipids have been studied as substitutes for plant oils or animal fats in biodiesel production, and it remains a great challenge to extract the maximum lipid from microbial cells (yeast, fungi, algae) with minimum energy input. Electroporation (EP) of LCB results in a significant increase in cell conductivity and permeability caused by the application of an external electric field. EP alters the size and structure of the biomass, reduces the crystallinity of the cellulose, and increases its porosity as well as changing its chemical composition, so that hydrolysis of the carbohydrate fraction to monomeric sugars can be achieved rapidly and with greater yields. Furthermore, EP has great potential to disrupt microbial cell walls within a few seconds, releasing the intracellular materials (lipids) into the solution. Therefore, this study aims to describe the challenges and prospects of applying the EP technique in biofuel processing.

  6. A very simple technique to repair Grynfeltt-Lesshaft hernia.

    Science.gov (United States)

    Solaini, Leonardo; di Francesco, F; Gourgiotis, S; Solaini, Luciano

    2010-08-01

    A very simple technique to repair a superior lumbar hernia is described. The location of this type of hernia, also known as the Grynfeltt-Lesshaft hernia, is defined by a triangle in the lumbar region. An unusual case of a 67-year-old woman with a superior lumbar hernia is reported; the diagnosis was made by physical examination. The defect in the posterior abdominal wall was repaired with a polypropylene dart mesh, and the patient had no evidence of recurrence at 11-month follow-up. The surgical approach described in this paper is simple and easy to perform, and its results are comparable with those of much more sophisticated techniques. No cases of the use of dart mesh to repair a Grynfeltt-Lesshaft hernia have been reported in surgical journals indexed in PubMed.

  7. Modern techniques for condition monitoring of railway vehicle dynamics

    International Nuclear Information System (INIS)

    Ngigi, R W; Pislaru, C; Ball, A; Gu, F

    2012-01-01

    A modern railway system relies on sophisticated monitoring systems for maintenance and renewal activities. Some of the existing condition monitoring techniques perform fault detection using advanced filtering, system identification and signal analysis methods. These theoretical approaches do not require complex mathematical models of the system and can overcome potential difficulties associated with nonlinearities and parameter variations in the system. Practical condition monitoring tools use sensors mounted either on the track or on the rolling stock: monitoring wheelset dynamics, for instance, can be done with track-mounted sensors, while vehicle-based sensors are preferred for monitoring the infrastructure. This paper attempts to collate and critically appraise the modern techniques used for condition monitoring of railway vehicle dynamics by analysing the advantages and shortcomings of these methods.

  8. Measurement of spatial correlation functions using image processing techniques

    International Nuclear Information System (INIS)

    Berryman, J.G.

    1985-01-01

    A procedure for using digital image processing techniques to measure the spatial correlation functions of composite heterogeneous materials is presented. Methods for eliminating undesirable biases and warping in digitized photographs are discussed. Fourier transform methods and array processor techniques for calculating the spatial correlation functions are treated. By introducing a minimal set of lattice-commensurate triangles, a method of sorting and storing the values of three-point correlation functions in a compact one-dimensional array is developed. Examples are presented at each stage of the analysis using synthetic photographs of cross sections of a model random material (the penetrable sphere model) for which the analytical form of the spatial correlation functions is known. Although results depend somewhat on magnification and on relative volume fraction, it is found that photographs digitized with 512 x 512 pixels generally have sufficiently good statistics for most practical purposes. To illustrate the use of the correlation functions, bounds on conductivity for the penetrable sphere model are calculated with a general numerical scheme developed for treating the singular three-dimensional integrals which must be evaluated.
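The Fourier-transform route to the two-point correlation function that the abstract mentions can be illustrated briefly. This is a generic Wiener-Khinchin sketch (periodic boundaries assumed), not Berryman's array-processor code.

```python
import numpy as np

def two_point_correlation(image):
    """Two-point correlation S2 of a binary (0/1) image via the
    Wiener-Khinchin theorem: the autocorrelation equals the inverse FFT
    of the power spectrum (periodic boundaries). S2[0, 0] is the volume
    fraction of the '1' phase."""
    f = np.fft.fft2(image)
    return np.fft.ifft2(f * np.conj(f)).real / image.size

rng = np.random.default_rng(0)
img = (rng.random((256, 256)) < 0.3).astype(float)  # ~30% volume fraction

s2 = two_point_correlation(img)
print(s2[0, 0])  # equals img.mean(), the volume fraction, up to roundoff
```

For uncorrelated pixels, S2 at nonzero lag decays to the square of the volume fraction, which is a quick sanity check of the normalization.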

  9. A Case Study on E - Banking Security – When Security Becomes Too Sophisticated for the User to Access Their Information

    OpenAIRE

    Aaron M. French

    2012-01-01

    While eBanking security continues to increase in sophistication to protect against threats, the usability of eBanking decreases, resulting in poor security behaviors by users. The current research evaluates security risks and measures taken for eBanking solutions. A case study is presented describing how increased complexity decreases vulnerabilities online but increases vulnerabilities from internal threats and eBanking users.

  10. Performance of pile-up mitigation techniques for jets in pp collisions with the ATLAS detector

    CERN Document Server

    Testa, Marianna; The ATLAS collaboration

    2015-01-01

    The large rate of multiple simultaneous proton-proton interactions, or pile-up, generated by the Large Hadron Collider in Run I required the development of many new techniques to mitigate the adverse effects of these conditions. This presentation shows the methods employed to correct for the impact of pile-up on jet energy, jet shapes, and even spurious additional jets. Energy correction techniques that incorporate sophisticated estimates of the average pile-up energy density and tracking information are described in detail. Jet-to-vertex association techniques are also presented. We also describe the extension of these techniques to ameliorate the effect of pile-up on jet shapes using both subtraction and grooming procedures. Prospects for pile-up suppression at the HL-LHC will also be discussed.

  11. Process Integration Design Methods for Water Conservation and Wastewater Reduction in Industry. Part 3: Experience of Industrial Application

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Dunn, Russell; Gottrup, Lene

    2002-01-01

    This paper is Part 3 in a three-part series of papers addressing operational techniques for applying mass integration principles to design in industry, with special focus on water conservation and wastewater reduction. The presented techniques derive from merging US and Danish experience with industrial application. Experience with defining the scope of the system and with identifying water flow constraints and water quality constraints is discussed. It is shown how physical constraints for the system design often set a limit on the sophistication of the water recycle network, and thereby also a limit on how sophisticated the method for system design should be. Finally, pinch analysis and system designs for water recycling in a practical case study are shown, documenting large water saving potentials and achievements.

  12. Abaqus2Matlab: A suitable tool for finite element post-processing

    DEFF Research Database (Denmark)

    Papazafeiropoulos, George; Muñiz-Calvente, Miguel; Martínez Pañeda, Emilio

    2017-01-01

    A suitable piece of software is presented to connect Abaqus, a sophisticated finite element package, with Matlab, the most comprehensive program for mathematical analysis. This interface between these well-known codes not only benefits from the image processing and the integrated graph-plotting features of Matlab … crack propagation in structural materials by means of a cohesive zone approach. The source code, detailed documentation and a large number of tutorials can be freely downloaded from www.abaqus2matlab.com.

  13. An experimental evaluation of the generalizing capabilities of process discovery techniques and black-box sequence models

    NARCIS (Netherlands)

    Tax, N.; van Zelst, S.J.; Teinemaa, I.; Gulden, Jens; Reinhartz-Berger, Iris; Schmidt, Rainer; Guerreiro, Sérgio; Guédria, Wided; Bera, Palash

    2018-01-01

    A plethora of automated process discovery techniques have been developed which aim to discover a process model based on event data originating from the execution of business processes. The aim of the discovered process models is to describe the control-flow of the underlying business process. At the

  14. Online Process Scaffolding and Students' Self-Regulated Learning with Hypermedia.

    Science.gov (United States)

    Azevedo, Roger; Cromley, Jennifer G.; Thomas, Leslie; Seibert, Diane; Tron, Myriam

    This study examined the role of different scaffolding instructional interventions in facilitating students' shift to more sophisticated mental models as indicated by both performance and process data. Undergraduate students (n=53) were randomly assigned to 1 of 3 scaffolding conditions (adaptive content and process scaffolding (ACPS), adaptive…

  15. Nondestructive evaluation of reinforced plastics by a radiometric measurement technique

    International Nuclear Information System (INIS)

    Entine, Gerald; Afshari, Sia; Verlinden, Matt

    1990-01-01

    The demand for new high-performance plastics has greatly increased with advances in the performance characteristics of sophisticated reinforced engineering resins. However, conventional methods for evaluating the glass and filler contents of reinforced plastics are destructive, labor intensive, and time consuming. To address this problem, we have developed a new instrument that provides rapid, accurate, and nondestructive measurement of glass or filler content in reinforced plastics. This instrument utilizes radiation transmission and scattering techniques for the analytical measurement of glass, graphite and other fillers used in reinforced plastics. (author)

  16. Sorption and chromatographic techniques for processing liquid waste of nuclear fuel cycle

    International Nuclear Information System (INIS)

    Gelis, V.M.; Milyutin, V.V.; Chuveleva, E.A.; Maslova, G.B.; Kudryavtseva, S.P.; Firsova, L.A.; Kozlitin, E.A.

    2000-01-01

    Spent nuclear fuel processing generates significant quantities of high-level liquid waste containing long-lived, highly toxic radionuclides of cesium, strontium, promethium, americium, curium, etc. Separating these radionuclides from the waste not merely simplifies further safe waste handling but also reduces waste processing costs, owing to the market value of certain individual radionuclide preparations. Recovery and separation of high-purity, long-lived radionuclide preparations is frequently performed by means of chromatographic techniques. (authors)

  17. Patented Techniques for Acrylamide Mitigation in High-Temperature Processed Foods

    DEFF Research Database (Denmark)

    Mariotti, Salome; Pedreschi, Franco; Antonio Carrasco, José

    2011-01-01

    Heating foods has many advantages: it adds taste, color and texture and minimizes harmful germs, among others. Flavor and aroma compounds are produced via the Maillard reaction, in which various hazardous compounds, such as acrylamide, may form as well. The Maillard reaction is believed to be the main route for acrylamide formation, involving reducing sugars (glucose and fructose), sucrose, and the amino acid asparagine; consequently, a variety of technologies have been developed to reduce acrylamide concentration in thermally processed foods, based either on (i) changing process parameters (e.g. …). Patented techniques for acrylamide reduction in foods processed at high temperatures are mentioned and briefly analyzed in order to develop new mitigation techniques for acrylamide in different food matrixes.

  18. Fuzzy-based HAZOP study for process industry

    Energy Technology Data Exchange (ETDEWEB)

    Ahn, Junkeon; Chang, Daejun, E-mail: djchang@kaist.edu

    2016-11-05

    Highlights: • HAZOP is an important technique for evaluating system safety and its risks during process operations. • Fuzzy theory can handle the inherent uncertainties of process systems in a HAZOP. • The fuzzy-based HAZOP considers aleatory and epistemic uncertainties and provides risk levels with less uncertainty. • Risk acceptance criteria should account for the transition region of each risk. - Abstract: This study proposed a fuzzy-based HAZOP for analyzing process hazards. Fuzzy theory was used to express uncertain states and proved a useful approach to overcoming the inherent uncertainty in HAZOP analyses. Fuzzy logic contrasts sharply with classical logic and yields diverse risk values according to membership degree. Appropriate process parameters and guidewords were selected to describe the frequency and consequence of an accident, and fuzzy modeling calculated risks based on the relationships between the variables of an accident. The modeling was based on the mean expected value, trapezoidal fuzzy numbers, IF-THEN rules, and the center-of-gravity method. A cryogenic LNG (liquefied natural gas) testing facility was the target process for the fuzzy-based and conventional HAZOPs; the most significant index in determining risk is frequency. The comparison results showed that the fuzzy-based HAZOP provides more refined risk estimates than the conventional HAZOP, and the fuzzy risk matrix presents the significance of risks, negligible risks, and the necessity of risk reduction.
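The ingredients named in the abstract (trapezoidal fuzzy numbers, IF-THEN rules, center-of-gravity defuzzification) can be sketched as follows. The fuzzy sets, rule strengths, and risk scale below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

def trapezoid(x, a, b, c, d):
    """Membership degree of x in the trapezoidal fuzzy number (a, b, c, d)."""
    return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0.0, 1.0)

def centroid_defuzzify(x, membership):
    """Center-of-gravity defuzzification over the discretized universe x."""
    return np.sum(x * membership) / np.sum(membership)

# Hypothetical risk universe 0..10 with two output sets and two fired rules
x = np.linspace(0.0, 10.0, 1001)
medium = trapezoid(x, 2, 4, 5, 7)
high = trapezoid(x, 6, 8, 9, 10)

# IF-THEN rule outputs scaled by (hypothetical) firing strengths, max-aggregated
aggregated = np.maximum(0.3 * medium, 0.7 * high)
risk = centroid_defuzzify(x, aggregated)
print(round(risk, 1))  # crisp risk level, pulled toward the 'high' set
```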

  19. A history of engraving and etching techniques: developments of manual intaglio printmaking processes, 1400-2000

    NARCIS (Netherlands)

    Stijnman, A.C.J.

    2012-01-01

    This book surveys the history of the techniques of engraving, etching and plate printing - i.e. that of manual intaglio printmaking processes - from its beginning in the 1430s until today. These developments are observed in the light of the coherence between the technique of the intaglio print (such

  20. Column ratio mapping: a processing technique for atomic resolution high-angle annular dark-field (HAADF) images.

    Science.gov (United States)

    Robb, Paul D; Craven, Alan J

    2008-12-01

    An image processing technique is presented for atomic resolution high-angle annular dark-field (HAADF) images that have been acquired using scanning transmission electron microscopy (STEM). This technique is termed column ratio mapping and involves the automated process of measuring atomic column intensity ratios in high-resolution HAADF images. This technique was developed to provide a fuller analysis of HAADF images than the usual method of drawing single intensity line profiles across a few areas of interest. For instance, column ratio mapping reveals the compositional distribution across the whole HAADF image and allows a statistical analysis and an estimation of errors. This has proven to be a very valuable technique as it can provide a more detailed assessment of the sharpness of interfacial structures from HAADF images. The technique of column ratio mapping is described in terms of a [110]-oriented zinc-blende structured AlAs/GaAs superlattice using the 1 Å-scale resolution capability of the aberration-corrected SuperSTEM 1 instrument.
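A minimal sketch of the column-ratio idea: integrate the HAADF intensity around each column of a pair and take the ratio. Automatic peak finding, the disc integration window, and all names are assumptions for illustration, not the published algorithm.

```python
import numpy as np

def column_ratio_map(image, pair_coords, radius=3):
    """For each pair of atomic-column positions, return the ratio of
    integrated HAADF intensities (dimmer column over brighter one).
    `pair_coords` holds ((r1, c1), (r2, c2)) peak positions."""
    rr, cc = np.mgrid[0:image.shape[0], 0:image.shape[1]]
    ratios = []
    for (r1, c1), (r2, c2) in pair_coords:
        i1 = image[(rr - r1) ** 2 + (cc - c1) ** 2 <= radius ** 2].sum()
        i2 = image[(rr - r2) ** 2 + (cc - c2) ** 2 <= radius ** 2].sum()
        ratios.append(min(i1, i2) / max(i1, i2))
    return np.array(ratios)

# Synthetic dumbbell: one bright and one dimmer Gaussian column
rr, cc = np.mgrid[0:32, 0:32]
img = 1.0 * np.exp(-((rr - 16) ** 2 + (cc - 10) ** 2) / 4.0) \
    + 0.5 * np.exp(-((rr - 16) ** 2 + (cc - 20) ** 2) / 4.0)

print(round(column_ratio_map(img, [((16, 10), (16, 20))])[0], 2))  # 0.5
```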

  2. Sampling phased array, a new technique for ultrasonic signal processing and imaging now available to industry

    OpenAIRE

    Verkooijen, J.; Bulavinov, A.

    2008-01-01

    Over the past 10 years the improvement in the field of microelectronics and computer engineering has led to significant advances in ultrasonic signal processing and image construction techniques that are currently being applied to non-destructive material evaluation. A new phased array technique, called "Sampling Phased Array", has been developed at the Fraunhofer Institute for non-destructive testing [1]. It realizes a unique approach to the measurement and processing of ultrasonic signals. The s...

  3. Managing complex processing of medical image sequences by program supervision techniques

    Science.gov (United States)

    Crubezy, Monica; Aubry, Florent; Moisan, Sabine; Chameroy, Virginie; Thonnat, Monique; Di Paola, Robert

    1997-05-01

    Our objective is to offer clinicians wider access to evolving medical image processing (MIP) techniques, which are crucial to improving the assessment and quantification of physiological processes but difficult for non-specialists in MIP to handle. Based on artificial intelligence techniques, our approach consists of the development of a knowledge-based program supervision system automating the management of MIP libraries. It comprises a library of programs, a knowledge base capturing the expertise about programs and data, and a supervision engine. It selects, organizes and executes the appropriate MIP programs given a goal to achieve and a data set, with dynamic feedback based on the results obtained. It also advises users in the development of new procedures chaining MIP programs. We have experimented with the approach in an application of factor analysis of medical image sequences as a means of predicting the response of osteosarcoma to chemotherapy, with both MRI and NM dynamic image sequences. As a result, our program supervision system frees clinical end-users from performing tasks outside their competence, permitting them to concentrate on clinical issues. Therefore our approach enables better exploitation of the possibilities offered by MIP and higher quality results, both in terms of robustness and reliability.

  4. Some fuzzy techniques for staff selection process: A survey

    Science.gov (United States)

    Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.

    2013-04-01

    With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, there is some information that cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection process. It classifies several existing research methods and identifies areas where there is a gap and further research is needed. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.
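One simple member of the family of methods such a survey covers is a weighted average of triangular fuzzy ratings. The rating vocabulary, weights, and candidates below are hypothetical illustrations, not a specific surveyed algorithm.

```python
# Triangular fuzzy numbers (l, m, u) encode imprecise ratings such as "good";
# the vocabulary and weights below are assumed for illustration.
RATINGS = {"poor": (0, 1, 3), "fair": (3, 5, 7), "good": (5, 7, 9), "excellent": (7, 9, 10)}

def fuzzy_score(ratings, weights):
    """Weighted average of triangular fuzzy ratings, defuzzified by the
    centroid of a triangle: (l + m + u) / 3."""
    l, m, u = (sum(w * RATINGS[r][i] for r, w in zip(ratings, weights)) for i in range(3))
    return (l + m + u) / 3

# Two candidates rated on (experience, skills, adaptability); weights sum to 1
weights = [0.5, 0.3, 0.2]
alice = fuzzy_score(["good", "excellent", "fair"], weights)
bob = fuzzy_score(["fair", "good", "excellent"], weights)
print(alice > bob)  # True: Alice ranks higher under these weights
```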

  5. Clinical Processes - The Killer Application for Constraint-Based Process Interactions

    DEFF Research Database (Denmark)

    Jiménez-Ramírez, Andrés; Barba, Irene; Reichert, Manfred

    2018-01-01

    For more than a decade, the interest in aligning information systems in a process-oriented way has been increasing. To enable operational support for business processes, the latter are usually specified in an imperative way. The resulting process models, however, tend to be too rigid to meet … examples. However, to the best of our knowledge, declarative languages have not been used to model complex, real-world scenarios that comprise constraints going beyond control-flow. In this paper, we propose the use of a declarative language for modeling a sophisticated healthcare process scenario from the real world. The scenario is subject to complex temporal constraints and entails the need for coordinating the constraint-based interactions among the processes related to a patient treatment process. As demonstrated in this work, the selected real process scenario can be suitably modeled through a declarative approach.

  6. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Science.gov (United States)

    Hernandez, Wilmar

    2007-01-01

    In this paper, a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot happen overnight, because some open research issues still have to be solved. This paper draws attention to one of these open issues and aims to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
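As a concrete example of the "optimal filter" family contrasted here with classical fixed-coefficient filters, a scalar Kalman filter smoothing a noisy sensor reading can be sketched. The noise parameters and signal are assumed for illustration, not taken from the survey.

```python
import random

def kalman_1d(measurements, q=1e-4, r=0.04):
    """Scalar Kalman filter for a slowly varying sensor reading; q and r
    (process and measurement noise variances) are assumed values."""
    x, p = measurements[0], 1.0
    estimates = []
    for z in measurements:
        p += q               # predict: uncertainty grows
        k = p / (p + r)      # Kalman gain
        x += k * (z - x)     # update with measurement z
        p *= 1.0 - k
        estimates.append(x)
    return estimates

random.seed(1)
true_value = 9.81
noisy = [true_value + random.gauss(0.0, 0.2) for _ in range(200)]
filtered = kalman_1d(noisy)
print(abs(filtered[-1] - true_value) < 0.2)  # True: estimate converges near 9.81
```

Unlike a fixed moving average, the gain k adapts as the uncertainty p evolves, which is what makes the filter "optimal" for the assumed noise model.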

  7. Spray Drying as a Processing Technique for Syndiotactic Polystyrene to Powder Form for Part Manufacturing Through Selective Laser Sintering

    Science.gov (United States)

    Mys, N.; Verberckmoes, A.; Cardon, L.

    2017-03-01

    Selective laser sintering (SLS) is a rapidly expanding field of three-dimensional printing. One stumbling block in the evolution of the technique is the limited range of materials available for processing by SLS, which keeps the application window small. This article aims to identify syndiotactic polystyrene (sPS) as a promising material. sPS pellets were processed into powder form with a lab-scale spray dryer with a vibrating nozzle. This technique is the focus of this work, as it almost eliminates the agglomeration phenomenon often encountered with solution-based processing techniques. The microspheres obtained were characterized in shape and size by scanning electron microscopy and evaluation of the particle size distribution, and the effect the processing technique imparts on the intrinsic properties of the material was examined by differential scanning calorimetry analysis.

  8. Development of food preservation and processing techniques by radiation

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Yook, Hong Sun; Lee, Ju Woon and others

    1999-03-01

    Development of food preservation and processing techniques by radiation was performed. Gamma irradiation at 2-10 kGy is considered an effective method to control pathogenic bacteria, including Escherichia coli O157:H7, and irradiation at 5 kGy completely eliminated pathogenic bacteria in beef. Gamma irradiation at such doses, with subsequent storage below 4 deg C, could ensure hygienic quality and prolong the microbiological shelf-life by reducing spoilage microorganisms. Gamma irradiation of pre-rigor beef shortens the aging period, improves tenderness and enhances beef quality, and new beef processing methods using gamma irradiation, such as low-salt sausage and hygienic beef patties, were developed. In safety tests of gamma-irradiated meats (beef: 0-5 kGy; pork: 0-30 kGy), endpoints such as genotoxicity, acute toxicity, four-week oral toxicity, rat hepatocarcinogenesis and the antioxidative defense system were not affected by gamma irradiation. To pre-establish an alternative to the toxic fumigant methyl bromide (MBr), the current quarantine measure for imported and exported agricultural products, selected products such as chestnuts, acorns, red beans and mung beans were subjected to a preliminary study comparing the effects of gamma irradiation and MBr fumigation on disinfestation and quality, thereby preparing basic data for a practical approach. The current fumigation (MBr) was perfect in its disinfecting capability but caused detrimental effects on the physical quality of agricultural produce, whereas irradiation doses suitable for controlling pests did not induce any significant changes in product quality. (author)

  9. Analysis of two dimensional charged particle scintillation using video image processing techniques

    International Nuclear Information System (INIS)

    Sinha, A.; Bhave, B.D.; Singh, B.; Panchal, C.G.; Joshi, V.M.; Shyam, A.; Srinivasan, M.

    1993-01-01

    A novel method is presented for video recording of individual charged-particle scintillation images and their offline analysis using digital image processing techniques to obtain position, time and energy information. Results of an exploratory experiment conducted using ²⁴¹Am and ²³⁹Pu alpha sources are presented. (author). 3 figs., 4 tabs
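The offline analysis the abstract describes (position and energy per scintillation) can be sketched as a threshold-and-centroid computation on a single video frame. The threshold, synthetic frame, and names are assumptions for illustration.

```python
import numpy as np

def analyze_scintillation(frame, threshold):
    """Position (intensity-weighted centroid) and an energy estimate
    (summed intensity) of a scintillation in one video frame; returns
    None if nothing exceeds the threshold."""
    mask = frame > threshold
    if not mask.any():
        return None
    rr, cc = np.nonzero(mask)
    weights = frame[rr, cc]
    energy = weights.sum()
    row = float((rr * weights).sum() / energy)
    col = float((cc * weights).sum() / energy)
    return (row, col), float(energy)

# Synthetic frame: one alpha scintillation as a small bright spot
frame = np.zeros((64, 64))
frame[30:33, 40:43] = [[1, 2, 1], [2, 8, 2], [1, 2, 1]]

pos, energy = analyze_scintillation(frame, 0.5)
print(pos, energy)  # (31.0, 41.0) 20.0
```

Timing information would come from the frame index in the recorded video sequence.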

  10. Feedback correction of injection errors using digital signal-processing techniques

    Directory of Open Access Journals (Sweden)

    N. S. Sereno

    2007-01-01

    Full Text Available Efficient transfer of electron beams from one accelerator to another is important for 3rd-generation light sources that operate using top-up. In top-up mode, a constant amount of charge is injected at regular intervals into the storage ring to replenish beam lost primarily due to Touschek scattering. Top-up therefore requires that the complex of injector accelerators that fill the storage ring transport beam with a minimum amount of loss. Injection can be a source of significant beam loss if not carefully controlled. In this note we describe a method of processing injection transient signals produced by beam-position monitors and using the processed data in feedback. Feedback control using the technique described here has been incorporated in the Advanced Photon Source (APS booster synchrotron to correct injection transients.

  11. Controlled Fabrication of Metallic Electrodes with Atomic Separation

    DEFF Research Database (Denmark)

    Morpurgo, A.; Robinson, D.; M. Marcus, C.

    1998-01-01

    We report a new technique for fabricating metallic electrodes on insulating substrates with separations on the 1 nm scale. The fabrication technique, which combines lithographic and electrochemical methods, provides atomic resolution without requiring sophisticated instrumentation. The process is...

  12. Newer techniques of mechanical ventilation: an overview.

    Science.gov (United States)

    Donn, Steven M; Sinha, Sunil K

    2002-10-01

    The introduction of newer, state-of-the-art, microprocessor controlled ventilator systems provides clinicians with opportunities to apply a number of advanced ventilatory modalities which were not previously available for treating newborns. Some of these techniques will need further scientific evaluation in controlled trials, but this should not preclude their use in clinical settings, as their safety has already been proved by "standard setters" for use in neonates. There is a firm physiological rationale for their use, and individual centres have already acquired substantial experience in the application of these modalities. The trend towards increasing sophistication and greater versatility is likely to continue, and clinicians involved in the care of sick newborn infants must keep abreast of these developments.

  13. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    International Nuclear Information System (INIS)

    Sudowe, Ralf; Roman, Audrey; Dailey, Ashlee; Go, Elaine

    2013-01-01

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which a rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples as well as mass loading, flow rate and resin performance is being studied. In addition, the potential to automate these procedures utilizing a robotic platform is evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of possible synergistic effects in the presence of iron.

  14. The Ansel Adams zone system: HDR capture and range compression by chemical processing

    Science.gov (United States)

    McCann, John J.

    2010-02-01

    We tend to think of digital imaging and the tools of Photoshop™ as a new phenomenon in imaging. We are also familiar with multiple-exposure HDR techniques intended to capture a wider range of scene information than conventional film photography. We know about tone-scale adjustments to make better pictures. We tend to think of everyday, consumer, silver-halide photography as a fixed window of scene capture with a limited, standard range of response. This description of photography is certainly true, between 1950 and 2000, for instant films and negatives processed at the drugstore. These systems had fixed dynamic range and fixed tone-scale response to light. All pixels in the film have the same response to light, so the same light exposure from different pixels was rendered as the same film density. Ansel Adams, along with Fred Archer, formulated the Zone System starting in 1940. It predates the trillions of consumer photos of the second half of the 20th century, yet it was much more sophisticated than today's digital techniques. This talk will describe the chemical mechanisms of the Zone System in the parlance of digital image processing. It will describe the Zone System's chemical techniques for image synthesis. It also discusses dodging and burning techniques to fit the HDR scene into the LDR print. Although current HDR imaging shares some of the Zone System's achievements, it usually does not achieve all of them.

  15. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    Energy Technology Data Exchange (ETDEWEB)

    Schelbert, H R

    1986-08-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo.

  16. Assessment of myocardial metabolism by PET - a sophisticated dream or clinical reality

    International Nuclear Information System (INIS)

    Schelbert, H.R.

    1986-01-01

    This symposium reviewed radionuclide techniques for the noninvasive study of regional myocardial metabolism and spanned a wide range of topics. New radiotracers for probing different metabolic pathways or selected biochemical reaction steps were presented. New information on tracers already in use was forthcoming. Because the imaging device can measure only concentrations of radiolabel in tissue, other studies examined relationships between uptake and turnover of radioactivity in tissue as an externally observed signal, the chemical fate of the label, and the biologic process under study. Other studies formulated these relationships through tracer compartment models, which are fundamental to quantifying regional physiologic processes externally. Other investigations applied radiotracer methods to experimental models of cardiac disease and to patients. They described findings of regional or global alterations in substrate metabolism. These observations highlighted the potential clinical value of this new methodology. At the same time, several of these observations remain at present without mechanistic explanation; yet they form the foundation on which working hypotheses can be built, which in turn can be tested in vivo. (orig.)

  17. A modified bonded-interface technique with improved features for studying indentation damage of materials

    International Nuclear Information System (INIS)

    Low, I.M.

    1998-01-01

    A modified 'bonded-interface' technique with improved features for studying contact damage of ceramic (Al2O3, graded Al2TiO5/Al2O3, Ti3SiC2) and non-ceramic (epoxy, tooth) materials is developed and compared with the conventional method. This technique enables the surface damage around and below an indentor to be studied. When used in conjunction with Nomarski illumination and atomic force microscopy, this technique can reveal substantial information on the topography of indentation surface damage. In particular, it is ideal for monitoring the evolution of deformation and microfracture damage in quasi-plastic materials. The technique is much less sophisticated, less time-consuming, and more user-friendly, and it does not require a highly experienced user to be proficient in the procedure. When compared with the conventional tool-clamp method, this modified technique gives similar, if not identical, results. Copyright (1998) Australasian Ceramic Society

  18. Pipelines programming paradigms: Prefab plumbing

    International Nuclear Information System (INIS)

    Boeheim, C.

    1991-08-01

    Mastery of CMS Pipelines is a process of learning increasingly sophisticated tools and techniques that can be applied to your problem. This paper presents a compilation of techniques that can be used as a reference for solving similar problems.

  19. Mathematical Foundation Based Inter-Connectivity modelling of Thermal Image processing technique for Fire Protection

    Directory of Open Access Journals (Sweden)

    Sayantan Nath

    2015-09-01

    Full Text Available In this paper, the integration between multiple image processing functions and their statistical parameters for an intelligent alarming-series-based fire detection system is presented. The proper inter-connectivity mapping between processing elements of imagery, based on a classification factor for temperature monitoring and a multilevel intelligent alarm sequence, is introduced by an abstractive canonical approach. The flow of image processing components between the core implementation of an intelligent alarming system with temperature-wise area segmentation as well as boundary detection techniques is not yet fully explored in the present era of thermal imaging. In the light of the analytical perspective of convolutive functionalism in thermal imaging, the abstract-algebra-based inter-mapping model between event-calculus-supported DAGSVM classification for step-by-step generation of alarm series with a gradual monitoring technique and the segmentation of regions with their affected boundaries in a thermographic image of coal with respect to temperature distinctions is discussed. The connectedness of the multifunctional image processing operations of a compatible fire protection system with a proper monitoring sequence is investigated here. The mathematical models representing the relation between the temperature-affected areas and their boundaries in the obtained thermal image, defined in partial-derivative fashion, are the core contribution of this study. The thermal image of a coal sample was obtained in a real-life scenario by a self-assembled thermographic camera in this study. The amalgamation of area segmentation, boundary detection and alarm series is described in abstract algebra. The principal objective of this paper is to understand the dependency pattern and the working principles of the image processing components, and to structure an inter-connected modelling technique for those components with the help of a mathematical foundation.

  20. assessment of environmental impacts in comfortable furniture production process using life cycle assessment (LCA technique

    Directory of Open Access Journals (Sweden)

    hejhar abbasi

    2016-12-01

    Full Text Available The furniture industry annually releases a large amount of volatile organic compounds into the environment due to the use of adhesives, textiles, paints and coating materials. There are several methods to measure the pollution load and the environmental impacts; life cycle assessment (LCA) is one of the best techniques. LCA is a technique in which all environmental impacts related to a product are assessed over its entire life cycle, from cradle to grave, and it can ultimately be used to improve the production process and to prevent adverse environmental impacts. In summary, it can be concluded that the use of this technique is a basis for sustainable development and for improving social, economic, and environmental indices. This study focused on collecting comprehensive life cycle inventory data for comfortable furniture in two different production processes (B1 and B2) located in Tehran province, and analyzed the environmental impacts during the production process as a gate-to-gate investigation. The results revealed that emissions in production process B1 were higher than those of production process B2. The reason is that basic operations such as sawing and frame assembling, along with the final operation, were done in the same unit in case B1. Textile production and usage, and polyurethane foam were identified as the main hotspots, respectively. Moreover, the results showed that the comfortable furniture production process has the highest effects on ecosystem quality, human health, and resources (fossil fuels and mines), respectively.

  1. A novel technique for die-level post-processing of released optical MEMS

    International Nuclear Information System (INIS)

    Elsayed, Mohannad Y; Beaulieu, Philippe-Olivier; Briere, Jonathan; Ménard, Michaël; Nabki, Frederic

    2016-01-01

    This work presents a novel die-level post-processing technique for dies that include released movable structures. The procedure was applied to microelectromechanical systems (MEMS) chips fabricated in a commercial process, SOIMUMPs from MEMSCAP. It allows a clean DRIE etch of sidewalls to be performed on the diced chips, enabling optical testing of the pre-released MEMS mirrors through the chip edges. The etched patterns are defined by photolithography using photoresist spray coating. The photoresist thickness is tuned to create photoresist bridges over the pre-released gaps, protecting the released structures during subsequent wet processing steps. Then, the chips are subjected to a sequence of wet and dry etching steps prior to dry photoresist removal in O2 plasma. Processed micromirrors were tested and found to rotate similarly to devices without processing, demonstrating that the post-processing procedure does not significantly affect the mechanical performance of the devices. (technical note)

  2. Comparative evaluation of scatter correction techniques in 3D positron emission tomography

    CERN Document Server

    Zaidi, H

    2000-01-01

    Much research and development has been concentrated on the scatter compensation required for quantitative 3D PET. Increasingly sophisticated scatter correction procedures are under investigation, particularly those based on accurate scatter models, and iterative reconstruction-based scatter compensation approaches. The main difference among the correction methods is the way in which the scatter component in the selected energy window is estimated. Monte Carlo methods give further insight and might in themselves offer a possible correction procedure. Methods: five scatter correction methods are compared in this paper where applicable: the dual-energy window (DEW) technique, the convolution-subtraction (CVS) method, two variants of the Monte Carlo-based scatter correction technique (MCBSC1 and MCBSC2), and our newly developed statistical reconstruction-based scatter correction (SRBSC) method. These scatter correction techniques are evaluated using Monte Carlo simulation studies, experimental phantom measurements...
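Of the methods named above, convolution-subtraction is the simplest to illustrate. The following is a minimal 1-D sketch under stated assumptions: the profile, kernel and scatter fraction are invented toy values, and real CVS operates on 2-D projection data with a measured scatter kernel.

```python
def smooth(x, kernel):
    """1-D convolution with a normalised kernel (zero padding at the edges)."""
    half = len(kernel) // 2
    out = []
    for i in range(len(x)):
        acc = 0.0
        for j, w in enumerate(kernel):
            k = i + j - half
            if 0 <= k < len(x):
                acc += w * x[k]
        out.append(acc)
    return out

def convolution_subtraction(measured, kernel, scatter_fraction, iterations=3):
    """Iterative CVS: estimate scatter by smoothing the current primary
    estimate, scale it by the scatter fraction, and subtract it from the
    measured data; repeating refines the primary estimate."""
    primary = measured[:]
    for _ in range(iterations):
        scatter = [scatter_fraction * s for s in smooth(primary, kernel)]
        primary = [max(m - s, 0.0) for m, s in zip(measured, scatter)]
    return primary

profile = [0, 0, 1, 5, 20, 5, 1, 0, 0]   # toy measured projection profile
kernel = [0.25, 0.5, 0.25]               # toy broad scatter kernel
corrected = convolution_subtraction(profile, kernel, 0.3)
```

The corrected profile keeps its peak position but loses the scaled scatter estimate, which is the essential behaviour of the CVS family of methods.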

  3. Discrete wavelet transform-based denoising technique for advanced state-of-charge estimator of a lithium-ion battery in electric vehicles

    International Nuclear Information System (INIS)

    Lee, Seongjun; Kim, Jonghoon

    2015-01-01

    Sophisticated data on the experimental DCV (discharging/charging voltage) of a lithium-ion battery is required for high-accuracy SOC (state-of-charge) estimation algorithms based on the state-space ECM (electrical circuit model) in BMSs (battery management systems). However, when sensing noisy DCV signals, erroneous SOC estimation (which results in low BMS performance) is inevitable. Therefore, this manuscript describes the design and implementation of a DWT (discrete wavelet transform)-based denoising technique for DCV signals. The steps for denoising a noisy DCV measurement in the proposed approach are as follows. First, using MRA (multi-resolution analysis), the noise-riding DCV signal is decomposed into different frequency sub-bands (low- and high-frequency components, A_n and D_n). Specifically, signal processing of the high-frequency component D_n, which focuses on a short time interval, is necessary to reduce noise in the DCV measurement. Second, a hard-thresholding-based denoising rule is applied to adjust the wavelet coefficients of the DWT to achieve a clear separation between the signal and the noise. Third, the desired de-noised DCV signal is reconstructed by taking the IDWT (inverse discrete wavelet transform) of the filtered detail coefficients. Finally, this signal is sent to the ECM-based SOC estimation algorithm using an EKF (extended Kalman filter). Experimental results indicate the robustness of the proposed approach for reliable SOC estimation. - Highlights: • Sophisticated data on the experimental DCV is required for high-accuracy SOC. • A DWT (discrete wavelet transform)-based denoising technique is newly investigated. • Three steps for denoising a noisy DCV measurement are implemented in this work. • Experimental results indicate the robustness of the proposed work for reliable SOC
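The three denoising steps can be sketched with a single-level Haar transform, a toy stand-in for the paper's multi-resolution analysis; the signal and the threshold value below are invented for illustration only.

```python
def haar_dwt(x):
    """One-level Haar DWT: approximation (A1) and detail (D1) coefficients."""
    s = 2 ** 0.5
    approx = [(x[i] + x[i + 1]) / s for i in range(0, len(x), 2)]
    detail = [(x[i] - x[i + 1]) / s for i in range(0, len(x), 2)]
    return approx, detail

def hard_threshold(coeffs, t):
    """Hard thresholding: zero every coefficient whose magnitude is below t."""
    return [c if abs(c) > t else 0.0 for c in coeffs]

def haar_idwt(approx, detail):
    """Inverse one-level Haar DWT."""
    s = 2 ** 0.5
    x = []
    for a, d in zip(approx, detail):
        x.append((a + d) / s)
        x.append((a - d) / s)
    return x

# Toy noisy "DCV" ramp: small alternating noise riding on a slow trend.
signal = [i * 0.1 + (0.05 if i % 2 else -0.05) for i in range(8)]
a1, d1 = haar_dwt(signal)                 # step 1: decompose into sub-bands
d1 = hard_threshold(d1, 0.2)              # step 2: noise lives in the detail band
denoised = haar_idwt(a1, d1)              # step 3: reconstruct the clean signal
```

After thresholding, each sample pair collapses to its local mean, so the alternating noise is removed while the underlying trend survives.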

  4. Ultra-mini PNL (UMP): Material, indications, technique, advantages and results.

    Science.gov (United States)

    Desai, Janak D

    2017-01-01

    Stone disease has afflicted mankind for centuries; records from the ancient civilisations of India and Egypt have shown stones in human bodies. The scientific mind of humans has always made smart endeavours to remove kidney stones. From large instruments shaped like the beaks of different animals and birds in 600 BC (Indian civilisation) to the extremely sophisticated and miniaturised endoscopic instruments of today, the human race has travelled a long way. The theme has always been to remove the stones with minimal morbidity and mortality and with minimum pain to the patient. The article takes you through the journey of instruments used from 600 BC until today. The story of instrumentation is a symbiosis of medical minds and engineering advances. The story of miniaturisation could not have moved further without the development of lasers, fiberoptics and sophisticated cameras. As the field stands today, we remove more complex stones by larger endoscopic intervention and smaller stones by miniaturised instruments. The article discusses the merits and shortcomings of various techniques: from open surgery to standard PCNL to Mini PCNL to Ultra-Mini PCNL to Micro-PCNL.

  5. SOFT: smooth OPC fixing technique for ECO process

    Science.gov (United States)

    Zhang, Hongbo; Shi, Zheng

    2007-03-01

    SOFT (Smooth OPC Fixing Technique) is a new OPC flow developed from the basic OPC framework. It provides a new method to reduce the computation cost and complexity of ECO-OPC (Engineering Change Order - Optical Proximity Correction). In this paper, we introduce polygon comparison to extract the necessary but possibly lost fragmentation and offset information of the previous post-OPC layout. By reusing these data, we can start the modification of each segment from a more accurate initial offset. In addition, the fragmentation method at the boundary of the patch in the previous OPC process is thereby available for engineers to stitch the regional ECO-OPC result back into the whole post-OPC layout seamlessly. For the ripple effect in OPC, by comparing each segment's movement in each loop, we largely free the fixing speed from the limitation of the patch size. We handle layout re-modification, especially in three basic kinds of ECO-OPC processes, while maintaining other design closure. Our experimental results show that, by utilizing the previous post-OPC layout, full-chip ECO-OPC can achieve an over 5X acceleration, and the regional ECO-OPC result can be stitched back into the whole layout seamlessly with the ripple effect of the lithography interaction.

  6. Congestion estimation technique in the optical network unit registration process.

    Science.gov (United States)

    Kim, Geunyong; Yoo, Hark; Lee, Dongsoo; Kim, Youngsun; Lim, Hyuk

    2016-07-01

    We present a congestion estimation technique (CET) to estimate the optical network unit (ONU) registration success ratio for the ONU registration process in passive optical networks. An optical line terminal (OLT) estimates the number of collided ONUs via the proposed scheme during the serial number state. The OLT can thus obtain the congestion level among the ONUs to be registered, and this information may be exploited to change the size of the quiet window to decrease the collision probability. We verified the efficiency of the proposed method through simulation and experimental results.
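The abstract gives no algorithmic detail, but the collision/quiet-window interplay it describes can be modelled with a toy simulation. Everything below (slot counts, window sizes, the window-doubling rule) is an assumption for illustration, not the paper's CET scheme.

```python
import random

def registration_round(num_onus, window_slots, rng):
    """One registration round: each unregistered ONU answers in a random slot
    of the quiet window; only an ONU whose slot is unshared registers, the
    rest collide and must retry in a later round."""
    slots = {}
    for onu in range(num_onus):
        slots.setdefault(rng.randrange(window_slots), []).append(onu)
    return sum(1 for members in slots.values() if len(members) == 1)

rng = random.Random(42)
remaining, window, rounds = 50, 16, 0
while remaining > 0 and rounds < 1000:
    registered = registration_round(remaining, window, rng)
    if registered < remaining:   # collisions observed -> widen the quiet window
        window *= 2
    remaining -= registered
    rounds += 1
```

Widening the window after observed collisions lowers the per-slot contention, which is the qualitative effect the congestion estimate is meant to drive.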

  7. Sophisticated Epistemologies of Physics versus High-Stakes Tests: How Do Elite High School Students Respond to Competing Influences about How to Learn Physics?

    Science.gov (United States)

    Yerdelen-Damar, Sevda; Elby, Andrew

    2016-01-01

    This study investigates how elite Turkish high school physics students claim to approach learning physics when they are simultaneously (i) engaged in a curriculum that led to significant gains in their epistemological sophistication and (ii) subject to a high-stakes college entrance exam. Students reported taking surface (rote) approaches to…

  8. A Survey on Optimal Signal Processing Techniques Applied to Improve the Performance of Mechanical Sensors in Automotive Applications

    Directory of Open Access Journals (Sweden)

    Wilmar Hernandez

    2007-01-01

    Full Text Available In this paper a survey on recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is made. Here, a comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today’s cars need is presented through several experimental results that show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be done overnight because there are some open research issues that have to be solved. This paper draws attention to one of the open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.

  9. Process sensors characterization based on noise analysis technique and artificial intelligence

    International Nuclear Information System (INIS)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos

    2005-01-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals in different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural networks scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)

  10. Process sensors characterization based on noise analysis technique and artificial intelligence

    Energy Technology Data Exchange (ETDEWEB)

    Mesquita, Roberto N. de; Perillo, Sergio R.P.; Santos, Roberto C. dos [Instituto de Pesquisas Energeticas e Nucleares (IPEN), Sao Paulo, SP (Brazil)]. E-mail: rnavarro@ipen.br; sperillo@ipen.br; rcsantos@ipen.br

    2005-07-01

    The time response of pressure and temperature sensors in the Reactor Protection System (RPS) is a requirement that must be satisfied in nuclear power plants; furthermore, it is an indicator of sensor degradation and remaining life. The nuclear power industry and others have been eager to implement smart sensor technologies and digital instrumentation concepts to reduce the manpower and effort currently spent on testing and calibration. Process parameter fluctuations during normal operation of a reactor are caused by random variations in neutron flux, heat transfer and other sources. The output sensor noise can be considered as the response of the system to an input representing the statistical nature of the underlying process, which can be modeled using a time series model. Since the noise signal measurements are influenced by many factors, such as the location of sensors, extraneous noise interference, and randomness in temperature and pressure fluctuations, the quantitative estimate of the time response using autoregressive noise modeling is subject to error. This technique has been used as a means of sensor monitoring. In this work a set of pressure sensors installed in an experimental loop adapted from a flow calibration setup is used to test and analyze signals in a new approach using artificial intelligence techniques. A set of measurements of dynamic signals in different experimental conditions is used to distinguish and identify underlying process sources. A methodology that uses Blind Separation of Sources with a neural networks scheme is being developed to improve the reliability of time response estimates in noise analysis. (author)
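The autoregressive idea behind this kind of noise analysis can be sketched in miniature: for a first-order sensor, the AR(1) coefficient of its noise record maps to a time constant. All numbers below are invented for illustration; real plant-noise analysis uses higher-order AR fits and calibrated sampling rates.

```python
import math
import random

def ar1_phi(x):
    """Estimate the lag-1 AR coefficient from the series' autocovariances."""
    n = len(x)
    mean = sum(x) / n
    c0 = sum((v - mean) ** 2 for v in x) / n
    c1 = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1)) / n
    return c1 / c0

# Synthetic sensor noise: an AR(1) process with a known coefficient.
rng = random.Random(0)
phi_true, dt = 0.9, 0.1          # assumed sampling interval dt (seconds)
x, prev = [], 0.0
for _ in range(20000):
    prev = phi_true * prev + rng.gauss(0.0, 1.0)
    x.append(prev)

phi_hat = ar1_phi(x)
tau = -dt / math.log(phi_hat)    # first-order time constant from the AR pole
```

With enough samples the estimated coefficient converges to the true one, and the recovered time constant is the quantity whose drift would flag sensor degradation.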

  11. On the Interface Between Automated Predictive Demand Planning Techniques and Humans in Collaborative Planning Processes

    DEFF Research Database (Denmark)

    Schorsch, Timm; Wallenburg, Carl Marcus; Wieland, Andreas

    The introduction of big data and predictive analytics techniques in the supply chain context constitutes a “hot topic” in both research and practice. Without arguing against this euphoria, this paper critically assesses the consequences of confronting human actors with an increasing usage...... of these techniques. The underlying case of this paper refers to collaborative supply chain processes that are predestined for integrating new big data and predictive analytics techniques. By building a theoretical framework for deriving sound hypotheses and introducing and testing the experimental design...

  12. Review of Palm Kernel Oil Processing And Storage Techniques In South East Nigeria

    Directory of Open Access Journals (Sweden)

    Okeke CG

    2017-06-01

    Full Text Available An assessment of palm kernel processing and storage in South-Eastern Nigeria was carried out by an investigative survey approach. The survey basically ascertained the extent of mechanization applicable in the area to enable the palm kernel processors and agricultural policy makers to devise the modalities for improving palm kernel processing in the area. According to the results obtained from the study, in Abia state, 85% of the respondents use the mechanical method while 15% use the manual method in cracking their kernels. In Imo state, 83% of the processors use the mechanical method while 17% use the manual method. In Enugu and Ebonyi states, 70% and 50% of the processors respectively use the mechanical method. It is only in Anambra state that a greater number of the processors (50%) use the manual method while 45% use mechanical means. It is observable from the results that palm kernel oil extraction has not received much attention in mechanization. The ANOVA of the palm kernel oil extraction techniques in South-East Nigeria showed significant differences in both the study area and the oil extraction techniques at the 5% level of probability. Results further revealed that in Abia state, 70% of the processors use the complete fractional process in refining the palm kernel oil; 25% and 5% respectively use the incomplete fractional process and the zero refining process. In Anambra, 60% of the processors use the complete fractional process and 40% use the incomplete fractional process. The zero refining method is not applicable in Anambra state. In Enugu state, 53% use the complete fractional process while 25% and 22% respectively use the zero refining and incomplete fractional processes in refining the palm kernel oil. Imo state mostly uses the complete fractional process (85%) in refining palm kernel oil; about 10% use the zero refining method while 5% of the processors use the incomplete fractional process. Plastic containers and metal drums are predominantly used in most areas in south-east Nigeria for the storage of palm kernel oil.

  13. Three-dimensional MR imaging of the cerebrospinal system with the RARE technique

    International Nuclear Information System (INIS)

    Hennig, J.; Ott, D.; Ylayasski, J.

    1987-01-01

    Three-dimensional RARE myelography is a fast technique for high-resolution imaging of the cerebrospinal fluid. A data set with 1 x 1 x 1-mm resolution can be generated with a 12-minute acquisition time. Sophisticated three-dimensional display algorithms allow reconstruction of planes at arbitrary angles and full three-dimensional displays, which yield extremely useful information for neurosurgical planning. Additionally, the injection of contrast agent can be simulated on the computer and communication pathways between structures of interest can be found noninvasively

  14. Plant process computer replacements - techniques to limit installation schedules and costs

    International Nuclear Information System (INIS)

    Baker, M.D.; Olson, J.L.

    1992-01-01

    Plant process computer systems, a standard fixture in all nuclear power plants, are used to monitor and display important plant process parameters. Scanning thousands of field sensors and alarming out-of-limit values, these computer systems are heavily relied on by control room operators. The original nuclear steam supply system (NSSS) vendor for the power plant often supplied the plant process computer. Designed using sixties and seventies technology, a plant's original process computer has been obsolete for some time. Driven by increased maintenance costs and new US Nuclear Regulatory Commission regulations such as NUREG-0737, Suppl. 1, many utilities have replaced their process computers with more modern computer systems. Given that computer systems are by their nature prone to rapid obsolescence, this replacement cycle will likely repeat. A process computer replacement project can be a significant capital expenditure and must be performed during a scheduled refueling outage. The object of the installation process is to install a working system on schedule. Experience gained by supervising several computer replacement installations has taught lessons that, if applied, will shorten the schedule and limit the risk of costly delays. Examples illustrating this technique are given. This paper and these examples deal only with the installation process and assume that the replacement computer system has been adequately designed and has undergone development and factory testing.

  15. Functional imaging of the pancreas. Image processing techniques and clinical evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Nakanishi, Fumiko

    1984-02-01

    An image processing technique for functional imaging of the pancreas was developed and is reported here. In this paper, the clinical efficacy of the technique for detecting pancreatic abnormality is evaluated in comparison with conventional pancreatic scintigraphy and CT. For quantitative evaluation, the functional rate, i.e. the rate of normally functioning pancreatic area, was calculated from the functional image and the subtraction image. Two hundred and ninety-five cases were studied using this technique. Conventional imaging had a sensitivity of 65% and a specificity of 78%, while the use of functional imaging improved sensitivity to 88% and specificity to 88%. The mean functional rate in patients with pancreatic disease was significantly lower (33.3 ± 24.5 in patients with chronic pancreatitis, 28.1 ± 26.9 in patients with acute pancreatitis, 43.4 ± 22.3 in patients with diabetes mellitus, 20.4 ± 23.4 in patients with pancreatic cancer) than in cases without pancreatic disease (86.4 ± 14.2). It is suggested that the functional image of the pancreas reflects pancreatic exocrine function and that the functional rate is a useful indicator of pancreatic exocrine function.
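The functional rate is, in essence, the percentage of pancreatic area judged to be functioning normally. A minimal sketch of that calculation follows; the threshold and pixel values are invented, and the paper's actual rate is derived from both the functional and subtraction images.

```python
def functional_rate(functional_image, threshold):
    """Percentage of pixels whose functional value exceeds the threshold,
    a toy proxy for the 'rate of normally functioning pancreatic area'."""
    pixels = [p for row in functional_image for p in row]
    working = sum(1 for p in pixels if p > threshold)
    return 100.0 * working / len(pixels)

# Toy 3x3 "functional image": 6 of 9 pixels exceed the 0.5 cut-off.
image = [[0.9, 0.8, 0.2],
         [0.7, 0.6, 0.1],
         [0.8, 0.55, 0.0]]
rate = functional_rate(image, 0.5)
```

A healthy pancreas would score near 100, matching the study's contrast between diseased (roughly 20 to 43) and disease-free (about 86) mean rates.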

  16. A deconvolution technique for processing small intestinal transit data

    Energy Technology Data Exchange (ETDEWEB)

    Brinch, K. [Department of Clinical Physiology and Nuclear Medicine, Glostrup Hospital, University Hospital of Copenhagen (Denmark); Larsson, H.B.W. [Danish Research Center of Magnetic Resonance, Hvidovre Hospital, University Hospital of Copenhagen (Denmark); Madsen, J.L. [Department of Clinical Physiology and Nuclear Medicine, Hvidovre Hospital, University Hospital of Copenhagen (Denmark)

    1999-03-01

    The deconvolution technique can be used to compute small intestinal impulse response curves from scintigraphic data. Previously suggested approaches, however, are sensitive to noise in the data. We investigated whether deconvolution based on a new simple iterative convolving technique can be recommended. Eight healthy volunteers ingested a meal that contained indium-111 diethylene triamine penta-acetic acid labelled water and technetium-99m stannous colloid labelled omelette. Imaging was performed at 30-min intervals until all radioactivity was located in the colon. A Fermi function, F(t) = (1 + e^(-αβ))/(1 + e^((t-α)β)), was chosen to characterize the small intestinal impulse response function. By changing only two parameters, α and β, it is possible to obtain configurations ranging from nearly a square function to nearly a monoexponential function. The small intestinal input function was obtained from the gastric emptying curve and convolved with the Fermi function. The sum of least squares was used to find the α and β yielding the best fit of the convolved curve to the observed small intestinal time-activity curve. Finally, a small intestinal mean transit time was calculated from the fitted Fermi function. In all cases, we found an excellent fit of the convolved curve to the observed small intestinal time-activity curve, i.e. the Fermi function reflected the small intestinal impulse response curve. The small intestinal mean transit time of the liquid marker (median 2.02 h) was significantly shorter than that of the solid marker (median 2.99 h; P<0.02). The iterative convolving technique seems to be an attractive alternative to ordinary approaches for the processing of small intestinal transit data. (orig.) With 2 figs., 13 refs.
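The iterative convolving step described above can be sketched as a grid search over the two Fermi parameters. This is a minimal sketch under stated assumptions: the input curve, the parameter grids, and the unit time step are invented for illustration, not taken from the paper.

```python
import math

def fermi(t, alpha, beta):
    """Fermi impulse response from the paper: (1+e^(-ab)) / (1+e^((t-a)b))."""
    return (1 + math.exp(-alpha * beta)) / (1 + math.exp((t - alpha) * beta))

def convolve(inp, alpha, beta):
    """Convolve the intestinal input with the Fermi response (unit time step)."""
    n = len(inp)
    resp = [fermi(i, alpha, beta) for i in range(n)]
    return [sum(inp[k] * resp[i - k] for k in range(i + 1)) for i in range(n)]

def fit(inp, observed, alphas, betas):
    """Grid search for (alpha, beta) minimising the sum of squared residuals
    between the convolved curve and the observed time-activity curve."""
    best, best_sse = None, float("inf")
    for a in alphas:
        for b in betas:
            model = convolve(inp, a, b)
            sse = sum((m - o) ** 2 for m, o in zip(model, observed))
            if sse < best_sse:
                best, best_sse = (a, b), sse
    return best

# Synthetic check: a curve generated with (alpha, beta) = (4.0, 1.5)
# should be recovered exactly by the search.
inp = [1.0, 0.8, 0.5, 0.3, 0.1, 0.0, 0.0, 0.0, 0.0, 0.0]
observed = convolve(inp, 4.0, 1.5)
alpha, beta = fit(inp, observed,
                  [a * 0.5 for a in range(2, 17)],   # alpha grid: 1.0 .. 8.0
                  [b * 0.5 for b in range(1, 9)])    # beta grid: 0.5 .. 4.0
```

A coarse grid stands in for the least-squares optimiser; a real implementation would refine the fit with a gradient-based or simplex search.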

  17. Image Processing Techniques for Assessing Contractility in Isolated Adult Cardiac Myocytes

    Directory of Open Access Journals (Sweden)

    Carlos Bazan

    2009-01-01

    The physiologic application of the methodology is evaluated by assessing overall contraction in enzymatically dissociated adult rat cardiocytes. Our results demonstrate the effectiveness of the proposed approach in characterizing the true, two-dimensional, “shortening” in the contraction process of adult cardiocytes. We compare the performance of the proposed method to that of a popular edge detection system in the literature. The proposed method not only provides a more comprehensive assessment of the myocyte contraction process but also can potentially eliminate historical concerns and sources of errors caused by myocyte rotation or translation during contraction. Furthermore, the versatility of the image processing techniques makes the method suitable for determining myocyte shortening in cells that usually bend or move during contraction. The proposed method can be utilized to evaluate changes in contractile behavior resulting from drug intervention, disease modeling, transgeneity, or other common applications to mammalian cardiocytes.

  18. Measurement and image processing evaluation of surface modifications of dental implants G4 pure titanium created by different techniques

    Energy Technology Data Exchange (ETDEWEB)

    Bulutsuz, A. G., E-mail: asligunaya@gmail.com [Department of Mechanical Engineering, Yildiz Technical University, 34349 Besiktas, İstanbul (Turkey); Demircioglu, P., E-mail: pinar.demircioglu@adu.edu.tr; Bogrekci, I., E-mail: ismail.bogrekci@adu.edu.tr [Adnan Menderes University, Faculty of Engineering, Department of Mechanical Engineering, Aytepe, 09010, Aydin (Turkey); Durakbasa, M. N., E-mail: durakbasa@gmx.at [Department of Interchangeable Manufacturing and Industrial Metrology, Institute for Production Engineering and Laser Technology, Vienna University of Technology, Karlsplatz 13/3113 A-1040 Wien (Austria); Katiboglu, A. B., E-mail: abkatiboglu@hotmail.com [Istanbul University, Faculty of Dentistry, Department of Oral and Maxillofacial Surgery, Istanbul (Turkey)

    2015-03-30

    The interaction between foreign substances placed into the jaw to eliminate tooth loss and the surrounding organic tissue is a highly complex process; many biological reactions take place alongside the biomechanical forces that influence this formation. Osseointegration denotes the direct structural and functional association between living bone and the surface of a load-bearing artificial implant. Taking into consideration the requirements of implant manufacturing processes, this study investigates surface characterization with high-precision measurement techniques and emphasizes the importance of these processes for the long-term success of dental implants. Detailed surface characterization was performed to identify the dependence of surface properties on the manufacturing technique, using image processing methods together with scanning electron microscopy (SEM) for morphological properties in 3D and a Taylor Hobson stylus profilometer for roughness properties in 2D. Three implant surfaces fabricated by different manufacturing techniques were inspected, and a machined surface was included in the study as a reference specimen. The results indicated that the different surface treatments strongly influenced surface morphology, highlighting the importance of precise 2D and 3D inspection techniques for surface characterization. Different image analysis techniques, such as the dark-light technique, were used to verify the surface measurement results. The computational phase was performed using the image processing toolbox in Matlab, with precise evaluation of the roughness of the implant surfaces. The relationship between the number of black and white pixels and surface roughness is presented. FFT image processing and analysis results explicitly imply that the technique is useful in the determination of surface roughness. The results showed that the number of black pixels in the image increases with increasing surface roughness.
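The black-pixel count idea can be sketched as below. This is an illustrative stand-in for the authors' Matlab analysis, not their code; the threshold value and the synthetic "images" are assumptions for demonstration only:

```python
import numpy as np

def dark_pixel_fraction(img, threshold=128):
    """Binarize a grayscale image and return the fraction of black pixels,
    used here as a rough texture indicator."""
    black = img < threshold
    return black.sum() / img.size

# Synthetic 8-bit patches: a uniform "smooth" surface vs a speckled "rough" one
rng = np.random.default_rng(0)
smooth = np.full((64, 64), 180, dtype=np.uint8)
rough = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

print(dark_pixel_fraction(smooth))  # 0.0
print(dark_pixel_fraction(rough))   # ~0.5
```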

  19. Microstructures and Mechanical Properties of Co-Cr Dental Alloys Fabricated by Three CAD/CAM-Based Processing Techniques

    Directory of Open Access Journals (Sweden)

    Hae Ri Kim

    2016-07-01

    Full Text Available The microstructures and mechanical properties of cobalt-chromium (Co-Cr) alloys produced by three CAD/CAM-based processing techniques were investigated in comparison with those produced by the traditional casting technique. Four groups of disc-shaped (microstructures) or dumbbell-shaped (mechanical properties) specimens made of Co-Cr alloys were prepared using casting (CS), milling (ML), selective laser melting (SLM), and milling/post-sintering (ML/PS). For each technique, the corresponding commercial alloy material was used. The microstructures of the specimens were evaluated via X-ray diffractometry, optical and scanning electron microscopy with energy-dispersive X-ray spectroscopy, and electron backscattered diffraction pattern analysis. The mechanical properties were evaluated using a tensile test according to ISO 22674 (n = 6). The microstructure of the alloys was strongly influenced by the manufacturing processes. Overall, the SLM group showed superior mechanical properties, with the ML/PS group being nearly comparable. The mechanical properties of the ML group were inferior to those of the CS group. The microstructures and mechanical properties of Co-Cr alloys were greatly dependent on the manufacturing technique as well as the chemical composition. The SLM and ML/PS techniques may be considered promising alternatives to the Co-Cr alloy casting process.

  20. Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction

    NARCIS (Netherlands)

    Van Gog, Tamara; Kester, Liesbeth; Nievelstein, Fleurie; Giesbers, Bas; Fred, Paas

    2009-01-01

    Van Gog, T., Kester, L., Nievelstein, F., Giesbers, B., & Paas, F. (2009). Uncovering cognitive processes: Different techniques that can contribute to cognitive load research and instruction. Computers in Human Behavior, 25, 325-331.

  1. Preparing systems engineering and computing science students in disciplined methods, quantitative, and advanced statistical techniques to improve process performance

    Science.gov (United States)

    McCray, Wilmon Wil L., Jr.

    The research was prompted by a need to assess the process improvement, quality management, and analytical techniques taught to students in undergraduate and graduate systems engineering and computing science (e.g., software engineering, computer science, and information technology) degree programs at U.S. colleges and universities that can be applied to quantitatively manage processes for performance. Everyone involved in executing repeatable processes in the software and systems development lifecycle needs to become familiar with the concepts of quantitative management, statistical thinking, process improvement methods, and how they relate to process performance. Organizations are starting to embrace the de facto Software Engineering Institute (SEI) Capability Maturity Model Integration (CMMI®) models as process improvement frameworks to improve business process performance. High maturity process areas in the CMMI model imply the use of analytical, statistical, and quantitative management techniques and process performance modeling to identify and eliminate sources of variation, continually improve process performance, reduce cost, and predict future outcomes. The study identifies and provides a detailed discussion of the gap analysis findings on process improvement and quantitative analysis techniques taught in U.S. university systems engineering and computing science degree programs, gaps that exist in the literature, and a comparison analysis identifying the gaps between the SEI's "healthy ingredients" of a process performance model and the courses taught in U.S. university degree programs. The research also heightens awareness that academicians have conducted little research on applicable statistics and quantitative techniques that can be used to demonstrate high maturity as implied in the CMMI models.
The research also includes a Monte Carlo simulation optimization

  2. Algorithms for image processing and computer vision

    CERN Document Server

    Parker, J R

    2010-01-01

    A cookbook of algorithms for common image processing applications Thanks to advances in computer hardware and software, algorithms have been developed that support sophisticated image processing without requiring an extensive background in mathematics. This bestselling book has been fully updated with the newest of these, including 2D vision methods in content-based searches and the use of graphics cards as image processing computational aids. It's an ideal reference for software engineers and developers, advanced programmers, graphics programmers, scientists, and other specialists wh

  3. Early skin tumor detection from microscopic images through image processing

    International Nuclear Information System (INIS)

    Siddiqi, A.A.; Narejo, G.B.; Khan, A.M.

    2017-01-01

    This research was done to provide an appropriate detection technique for skin tumors. The work was done using the image processing toolbox of MATLAB. Skin tumors are unwanted skin growths with different causes and varying extents of malignant cells; they represent a condition in which skin cells lose the ability to divide and grow normally. Early detection of a tumor is the most important factor affecting the survival of a patient. Studying the pattern of skin cells is a fundamental problem in medical image analysis, and the study of skin tumors has been of great interest to researchers. DIP (Digital Image Processing) allows the use of much more complex algorithms for image processing, and hence can offer both more sophisticated performance at simple tasks and the implementation of methods which would be impossible by analog means. It allows a much wider range of algorithms to be applied to the input data and can avoid problems such as the build-up of noise and signal distortion during processing. The study shows that little work has been done on the cellular scale for images of skin. This research contributes checks for the early detection of skin tumors using microscopic images, after testing and observing various algorithms. Analytical evaluation showed that the proposed checks are time-efficient techniques appropriate for tumor detection. The algorithm applied provides promising results in less time, with accuracy. The GUI (Graphical User Interface) generated for the algorithm makes the system user friendly. (author)
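One common building block for this kind of detection is automatic thresholding of a grayscale micrograph. The sketch below implements a minimal Otsu threshold; it is a generic illustration under assumed synthetic data, not the paper's undisclosed checks:

```python
import numpy as np

def otsu_threshold(img):
    """Return the 8-bit threshold maximizing between-class variance."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    best_t, best_var = 0, -1.0
    for t in range(1, 256):
        w0, w1 = prob[:t].sum(), prob[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (np.arange(t) * prob[:t]).sum() / w0
        m1 = (np.arange(t, 256) * prob[t:]).sum() / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_t, best_var = t, var
    return best_t

# Synthetic frame: bright background with a dark "lesion" patch
img = np.full((100, 100), 200, dtype=np.uint8)
img[40:60, 40:60] = 50
t = otsu_threshold(img)
mask = img < t          # candidate tumor pixels
print(t, mask.sum())    # threshold falls between the two modes; 400 pixels
```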

  4. Plasma processing techniques for deposition of carbonic thin protective coatings on structural nuclear materials

    International Nuclear Information System (INIS)

    Andrei, V.; Oncioiu, G.; Coaca, E.; Rusu, O.; Lungu, C.

    2009-01-01

    Full text of publication follows: The production of nano-structured surface films with controlled properties is crucial for the development of materials necessary for Advanced Systems for Nuclear Energy. Since the surface of a material is the zone through which it interacts with the environment, surface science and surface engineering techniques play an essential role in the understanding and control of the processes involved. Complex surface structures were developed, by various plasma processing methods, on stainless steels used as structural nuclear materials: Fe-based austenitic stainless steels, austenitic steels with a high content of Cr, and corrosion-resistant ferrites. The methods include: - Plasma Electrolytic (PE) treatments: the steel substrates were modified by nitriding and nitro-carburizing plasma diffusion treatments; - carbonic film deposition in Thermionic Vacuum Arc plasma. The results of the characterization of surface structures obtained under various experimental conditions for improvement of the properties (corrosion resistance, hardness, wear properties) are reported: the processes and structures were characterized by correlating the results of the complementary techniques XPS, 'depth profiling', SEM, XRD, and EIS. An overall description of the processes involved in the surface property improvements, and some considerations about new materials development for energy technologies, are presented

  5. Noise Suppression in ECG Signals through Efficient One-Step Wavelet Processing Techniques

    Directory of Open Access Journals (Sweden)

    E. Castillo

    2013-01-01

    Full Text Available This paper illustrates the application of the discrete wavelet transform (DWT) for baseline-wander and noise suppression in electrocardiographic (ECG) signals. A novel one-step implementation is presented, which improves the overall denoising process. In addition, an exhaustive study is carried out, defining threshold limits and thresholding rules for optimal wavelet denoising using the presented technique. The system has been tested using synthetic ECG signals, which allow accurate measurement of the effect of the proposed processing. Moreover, results from real abdominal ECG signals acquired from pregnant women are presented in order to validate the presented approach.
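The general idea behind DWT-based denoising can be sketched with a single-level Haar transform and soft thresholding. This is a generic illustration, not the paper's one-step implementation (whose details are not given in the abstract); the test signal is a synthetic sine rather than an ECG:

```python
import numpy as np

def haar_dwt(x):
    """One-level Haar transform: approximation and detail coefficients."""
    x = x.reshape(-1, 2)
    a = (x[:, 0] + x[:, 1]) / np.sqrt(2)
    d = (x[:, 0] - x[:, 1]) / np.sqrt(2)
    return a, d

def haar_idwt(a, d):
    out = np.empty(a.size * 2)
    out[0::2] = (a + d) / np.sqrt(2)
    out[1::2] = (a - d) / np.sqrt(2)
    return out

def denoise(x):
    a, d = haar_dwt(x)
    # Universal threshold, with noise level estimated from the details
    sigma = np.median(np.abs(d)) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(x.size))
    d = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)   # soft thresholding
    return haar_idwt(a, d)

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 512, endpoint=False)
clean = np.sin(2 * np.pi * 5 * t)
noisy = clean + 0.2 * rng.standard_normal(t.size)
den = denoise(noisy)
print(np.mean((noisy - clean) ** 2) > np.mean((den - clean) ** 2))  # True
```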

  6. Microbial safety of minimally processed foods

    National Research Council Canada - National Science Library

    Novak, John S; Sapers, Gerald M; Juneja, Vijay K

    2003-01-01

    ...-course meals. All are expected to be portioned and minimally processed to balance the naturalness of unaltered foods with a concern for safety. Yet the responsibility for proper food preparation and handling remains with the naïve modern consumer, who may be less adept in food preparations than his or her less sophisticated ancestors. As a result,...

  7. Assessment of the impact strength of the denture base resin polymerized by various processing techniques

    Directory of Open Access Journals (Sweden)

    Rajashree Jadhav

    2013-01-01

    Full Text Available Aim: To measure the impact strength of denture base resins polymerized using short and long curing cycles by water bath, pressure cooker and microwave techniques. Materials and Methods: For impact strength testing, 60 samples were made. The sample dimensions were 60 mm × 12 mm × 3 mm, as standardized by the American Standards for Testing and Materials (ASTM). A digital caliper was used to locate the midpoint of each sample. The impact strength was measured with an IZOD-type CEAST impact tester. The pendulum struck the sample and broke it; the energy required to break the sample was measured in joules. Data were analyzed using Student's t test. Results: There was a statistically significant difference in the impact strength of denture base resins polymerized by the long curing cycle and the short curing cycle in each technique, with the long curing processing being the best. Conclusion: The polymerization technique plays an important role in the impact strength of the denture base resin. This research demonstrates that the denture base resin polymerized by the microwave processing technique possessed the highest impact strength.
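A comparison of mean impact strength between two curing cycles can be sketched with a two-sample t statistic (here Welch's form, one common variant; the study's exact test parameters are not stated). The sample values below are invented for illustration and are not the study's data:

```python
import math

def welch_t(x, y):
    """Welch's two-sample t statistic for unequal-variance samples."""
    nx, ny = len(x), len(y)
    mx, my = sum(x) / nx, sum(y) / ny
    vx = sum((v - mx) ** 2 for v in x) / (nx - 1)
    vy = sum((v - my) ** 2 for v in y) / (ny - 1)
    return (mx - my) / math.sqrt(vx / nx + vy / ny)

long_cycle = [2.1, 2.3, 2.2, 2.4, 2.2, 2.3]    # impact energy, J (invented)
short_cycle = [1.8, 1.9, 1.7, 2.0, 1.8, 1.9]
print(welch_t(long_cycle, short_cycle))  # large positive t: long cycle stronger
```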

  8. Multiple-output all-optical header processing technique based on two-pulse correlation principle

    NARCIS (Netherlands)

    Calabretta, N.; Liu, Y.; Waardt, de H.; Hill, M.T.; Khoe, G.D.; Dorren, H.J.S.

    2001-01-01

    A serial all-optical header processing technique based on a two-pulse correlation principle in a semiconductor laser amplifier in a loop mirror (SLALOM) configuration that can have a large number of output ports is presented. The operation is demonstrated experimentally at a 10Gbit/s Manchester

  9. Myocardial tagging by Cardiovascular Magnetic Resonance: evolution of techniques--pulse sequences, analysis algorithms, and applications

    Directory of Open Access Journals (Sweden)

    Ibrahim El-Sayed H

    2011-07-01

    Full Text Available Abstract Cardiovascular magnetic resonance (CMR) tagging has been established as an essential technique for measuring regional myocardial function. It allows quantification of local intramyocardial motion measures, e.g. strain and strain rate. The invention of CMR tagging came in the late eighties, when the technique allowed, for the first time, visualization of transmural myocardial movement without having to implant physical markers. This new idea opened the door for a series of developments and improvements that continue up to the present time. Different tagging techniques are currently available that are more extensive, improved, and sophisticated than they were twenty years ago. Each of these techniques has different versions for improved resolution, signal-to-noise ratio (SNR), scan time, anatomical coverage, three-dimensional capability, and image quality. The tagging techniques covered in this article can be broadly divided into two main categories: (1) basic techniques, which include magnetization saturation, spatial modulation of magnetization (SPAMM), delay alternating with nutations for tailored excitation (DANTE), and complementary SPAMM (CSPAMM); and (2) advanced techniques, which include harmonic phase (HARP), displacement encoding with stimulated echoes (DENSE), and strain encoding (SENC). Although most of these techniques were developed by separate groups and evolved from different backgrounds, they are in fact closely related to each other, and they can be interpreted from more than one perspective. Some of these techniques even followed parallel paths of development, as illustrated in the article. As each technique has its own advantages, some efforts have been made to combine different techniques together for improved image quality or composite information acquisition. In this review, different developments in pulse sequences and related image processing techniques are described along with the necessities that led to their invention

  10. SENSE-MAKING TECHNIQUES IN EDUCATIONAL PROCESS AND THEIR IMPACT ON THE PERSONAL CHARACTERISTICS OF STUDENTS

    Directory of Open Access Journals (Sweden)

    Irina V. Abakumova

    2017-12-01

    Full Text Available This study looks into psychotechnics used in education that contribute to initiating logic among students and to their personal growth, and it characterizes the psychological features of "sense-deducting". It reviews sense-making techniques, considered as one category of psychotechnics. The described techniques are based on human psychology; they improve the quality of instruction, create a favorable and unique system of values, take into account the individual characteristics of all types of education, and influence the development of the sense-making process among children. Sense-making techniques are presented in the author's classification and extended by practical methods. The study of the psychological features of the influence of sense-making techniques on the personality of a student reveals new patterns in the personal, subjective and "meta-subjective" results of mastering the school program via the transformation and development of the value/logic consciousness of a child. The work emphasizes that the use of sense-making techniques is effective in the educational and after-school activities of an educational organization. The achieved results make it possible to understand and substantiate the naturalness and relevance of the sense-technical approach according to the personal and academic indicators of students. With competent and correct use of these semantic techniques, the best, most productive, high-quality pedagogical experience can be conveyed, along with the prospect of innovative developments in the psychological and pedagogical sciences. Thanks to sense-techniques, information starts to be personal in nature for children and adolescents, knowledge is objectified, and learning activity becomes an individual need.

  11. Acoustic Emission Signal Processing Technique to Characterize Reactor In-Pile Phenomena

    Energy Technology Data Exchange (ETDEWEB)

    Vivek Agarwal; Magdy Samy Tawfik; James A Smith

    2014-07-01

    Existing and developing advanced sensor technologies and instrumentation will allow non-intrusive in-pile measurement of temperature, extension, and fission gases when coupled with advanced signal processing algorithms. Sensor signals transmitted from inside to outside the containment structure are corrupted by noise and attenuated, thereby reducing the signal strength and signal-to-noise ratio. Identification and extraction of the actual signal (representative of an in-pile phenomenon) is a challenging and complicated process. In this paper, an empirical mode decomposition technique is proposed to reconstruct the actual sensor signal by partially combining intrinsic mode functions. The reconstructed signal corresponds to phenomena and/or failure modes occurring inside the reactor. In addition, it allows accurate non-intrusive monitoring and trending of in-pile phenomena.
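The partial-reconstruction step can be sketched as follows. Here the intrinsic mode functions (IMFs) that an EMD stage would produce are replaced by known synthetic components (an assumption for illustration), and the selection rule, correlation with the raw signal, is one plausible criterion rather than the paper's stated one:

```python
import numpy as np

def partial_reconstruct(imfs, signal, min_corr=0.3):
    """Sum the IMFs whose correlation with the raw signal exceeds min_corr."""
    kept = [imf for imf in imfs
            if abs(np.corrcoef(imf, signal)[0, 1]) > min_corr]
    return np.sum(kept, axis=0)

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 1000)
component = np.sin(2 * np.pi * 3 * t)       # stands in for an in-pile phenomenon
noise = 0.1 * rng.standard_normal(t.size)   # stands in for a noise-dominated IMF
signal = component + noise
recon = partial_reconstruct([component, noise], signal)
print(np.allclose(recon, component))  # True: the noise "IMF" is discarded
```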

  12. Multi-disciplinary communication networks for skin risk assessment in nursing homes with high IT sophistication.

    Science.gov (United States)

    Alexander, Gregory L; Pasupathy, Kalyan S; Steege, Linsey M; Strecker, E Bradley; Carley, Kathleen M

    2014-08-01

    The role of nursing home (NH) information technology (IT) in quality improvement has not been clearly established, and its impacts on communication between caregivers and on patient outcomes in these settings deserve further attention. In this research, we describe a mixed-method approach to explore communication strategies used by healthcare providers for resident skin risk in NHs with high IT sophistication (ITS). The sample included NHs participating in the statewide survey of ITS. We incorporated rigorous observation of 8- and 12-h shifts, and focus groups, to identify how NH IT and a range of synchronous and asynchronous tools are used. Social network analysis tools and qualitative analysis were used to analyze the data and identify relationships between ITS dimensions and communication interactions between care providers. Two of the nine ITS dimensions (resident care-technological and administrative activities-technological) and total ITS were significantly negatively correlated with the number of unique interactions: as more processes in resident care and administrative activities are supported by technology, the number of observed unique interactions falls. Additionally, four thematic areas emerged from staff focus groups that demonstrate how important IT is to resident care in these facilities, including providing resident-centered care, teamwork and collaboration, maintaining safety and quality, and using standardized information resources. Our findings confirm prior research that as technology support (resident care and administrative activities) and overall ITS increase, observed interactions between staff members decrease. Conversations during staff interviews focused on how technology facilitated resident-centered care through enhanced information sharing, greater virtual collaboration between team members, and improved care delivery.
These results provide evidence for improving the design and implementation of IT in long term care systems to support

  13. Health care logistics and space: accounting for the physical build environment

    NARCIS (Netherlands)

    Boucherie, Richardus J.; Hans, Elias W.; Hartmann, Timo; Larqoque, C.; Himmelspach, J.; Pasupathy, R.; Rose, O.; Uhrmacher, A.M.

    2012-01-01

    Planning and scheduling of health care processes has improved considerably using operations research techniques. Besides analytical and optimization tools, a substantial number of sophisticated discrete event simulation tools supporting (re-)design of existing logistical processes in and around

  14. An improved algorithm of image processing technique for film thickness measurement in a horizontal stratified gas-liquid two-phase flow

    Energy Technology Data Exchange (ETDEWEB)

    Kuntoro, Hadiyan Yusuf, E-mail: hadiyan.y.kuntoro@mail.ugm.ac.id; Majid, Akmal Irfan; Deendarlianto, E-mail: deendarlianto@ugm.ac.id [Center for Energy Studies, Gadjah Mada University, Sekip K-1A Kampus UGM, Yogyakarta 55281 (Indonesia); Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia); Hudaya, Akhmad Zidni; Dinaryanto, Okto [Department of Mechanical and Industrial Engineering, Faculty of Engineering, Gadjah Mada University, Jalan Grafika 2, Yogyakarta 55281 (Indonesia)

    2016-06-03

    Due to the importance of two-phase flow research for industrial safety analysis, many researchers have developed various methods and techniques to study two-phase flow phenomena in industrial cases, such as those in the chemical, petroleum, and nuclear industries. One of these developing methods is the image processing technique. This technique is widely used in two-phase flow research owing to its non-intrusive capability to process large amounts of visualization data containing many complexities. Moreover, it allows capturing direct visual information about the flow that is difficult to obtain by other methods and techniques. The main objective of this paper is to present an improved algorithm of the image processing technique, building on a preceding algorithm, for stratified flow cases. The present algorithm can measure the film thickness (h{sub L}) of stratified flow as well as the geometrical properties of the interfacial waves with lower processing time and random-access memory (RAM) usage than the preceding algorithm. The measurement results are also aimed at developing a high-quality database of stratified flow, which is still scarce. In the present work, the measurement results showed satisfactory agreement with previous works.
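A basic film-thickness extraction of this kind can be sketched by thresholding a side-view grayscale frame and counting liquid pixels per column. This is an illustrative stand-in (the authors' algorithm is not reproduced in the abstract); the threshold, calibration factor, and synthetic frame are assumptions:

```python
import numpy as np

def film_thickness(frame, threshold, mm_per_px):
    """Return film thickness h_L (mm) for each column of a grayscale frame.

    Liquid pixels are assumed darker than the threshold and contiguous
    from the channel bottom (last image row)."""
    liquid = frame < threshold
    return liquid.sum(axis=0) * mm_per_px

# Synthetic 40x60 frame: bright gas above, dark liquid film 10 px deep
frame = np.full((40, 60), 220, dtype=np.uint8)
frame[30:, :] = 60                 # liquid layer at the bottom
h = film_thickness(frame, threshold=128, mm_per_px=0.5)
print(h[0])   # 10 px * 0.5 mm/px = 5.0 mm
```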

  15. Neural engineering from advanced biomaterials to 3D fabrication techniques

    CERN Document Server

    Kaplan, David

    2016-01-01

    This book covers the principles of advanced 3D fabrication techniques, stem cells and biomaterials for neural engineering. Renowned contributors cover topics such as neural tissue regeneration, peripheral and central nervous system repair, brain-machine interfaces and in vitro nervous system modeling. Within these areas, focus remains on exciting and emerging technologies such as highly developed neuroprostheses and the communication channels between the brain and prostheses, enabling technologies that are beneficial for development of therapeutic interventions, advanced fabrication techniques such as 3D bioprinting, photolithography, microfluidics, and subtractive fabrication, and the engineering of implantable neural grafts. There is a strong focus on stem cells and 3D bioprinting technologies throughout the book, including working with embryonic, fetal, neonatal, and adult stem cells and a variety of sophisticated 3D bioprinting methods for neural engineering applications. There is also a strong focus on b...

  16. xSyn: A Software Tool for Identifying Sophisticated 3-Way Interactions From Cancer Expression Data

    Directory of Open Access Journals (Sweden)

    Baishali Bandyopadhyay

    2017-08-01

    Full Text Available Background: Constructing gene co-expression networks from cancer expression data is important for investigating the genetic mechanisms underlying cancer. However, correlation coefficients and linear regression models are not able to capture sophisticated relationships among gene expression profiles. Here, we address the 3-way interaction in which 2 genes' expression levels are clustered in different spatial locations under the control of a third gene's expression levels. Results: We present xSyn, a software tool for identifying such 3-way interactions from cancer gene expression data, based on an optimization procedure involving UPGMA (Unweighted Pair Group Method with Arithmetic Mean) and synergy. The effectiveness is demonstrated by application to 2 real gene expression data sets. Conclusions: xSyn is a useful tool for decoding the complex relationships among gene expression profiles. xSyn is available at http://www.bdxconsult.com/xSyn.html .

  17. Estudo teórico das transições eletrônicas usando métodos simples e sofisticados Theoretical study of electronic transitions using simple and sophisticated methods

    Directory of Open Access Journals (Sweden)

    Nelson H. Morgon

    2013-01-01

    Full Text Available In this paper, the use of both simple and sophisticated models in the study of electronic transitions was explored for a set of molecular systems: C2H4, C4H4, C4H6, C6H6, C6H8, "C8", C60, and [H2NCHCH(CHCHkCHNH2]+, where k = 0 to 4. The simple model of the free particle (1D, 2D, and 3D boxes, rings or spherical surfaces), considering the boundary conditions, was found to yield results similar to those of sophisticated theoretical methods such as EOM-CCSD/6-311++G** or TD(NStates=5,Root=1-M06-2X/6-311++G**.
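The simplest model mentioned, the 1D free-electron box, gives a transition energy ΔE = h²(2n+1)/(8mL²) between the HOMO (level n) and LUMO. A hedged numerical illustration for a linear polyene follows; the box length is an assumed textbook-style value, not a parameter taken from the paper:

```python
H = 6.62607015e-34       # Planck constant (J s)
ME = 9.1093837015e-31    # electron mass (kg)
C = 2.99792458e8         # speed of light (m/s)

def box_transition_wavelength_nm(n_pi, box_length_nm):
    """HOMO->LUMO wavelength for n_pi electrons in a 1D box of given length."""
    L = box_length_nm * 1e-9
    n_homo = n_pi // 2                                   # 2 electrons per level
    delta_e = H**2 * (2 * n_homo + 1) / (8 * ME * L**2)  # E_{n+1} - E_n
    return H * C / delta_e * 1e9

# 1,3-butadiene (C4H6): 4 pi electrons, box length ~0.58 nm (assumption)
print(round(box_transition_wavelength_nm(4, 0.58)))  # ~222 nm
```

The result lands near the observed UV absorption of butadiene, which is the kind of qualitative agreement the paper exploits.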

  18. DaqProVis, a toolkit for acquisition, interactive analysis, processing and visualization of multidimensional data

    Energy Technology Data Exchange (ETDEWEB)

    Morhac, M. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia)]. E-mail: fyzimiro@savba.sk; Matousek, V. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Turzo, I. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia); Kliman, J. [Institute of Physics, Slovak Academy of Sciences, Dubravska cesta 9, 845 11 Bratislava (Slovakia)

    2006-04-01

    A multidimensional data acquisition, processing, and visualization system for analyzing experimental data in nuclear physics is described. It includes a large number of sophisticated algorithms for multidimensional spectra processing, including background elimination, deconvolution, peak searching, and fitting.
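One of the listed operations, peak searching, can be sketched minimally as flagging local maxima above a significance level. This is a generic 1D illustration, not DaqProVis code; the synthetic spectrum is an assumption:

```python
import numpy as np

def find_peaks(spectrum, min_height):
    """Indices of strict local maxima above min_height."""
    s = np.asarray(spectrum, dtype=float)
    idx = np.arange(1, s.size - 1)
    mask = (s[idx] > s[idx - 1]) & (s[idx] > s[idx + 1]) & (s[idx] > min_height)
    return idx[mask]

# Flat background of 10 counts with two Gaussian-like peaks
chan = np.arange(200)
spectrum = (10 + 100 * np.exp(-0.5 * ((chan - 50) / 3.0) ** 2)
               + 60 * np.exp(-0.5 * ((chan - 140) / 4.0) ** 2))
print(find_peaks(spectrum, min_height=30))  # peaks at channels 50 and 140
```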

  19. Reduce of adherence problems in galvanised processes through data mining techniques

    International Nuclear Information System (INIS)

    Martinez de Pison, F. J.; Ordieres, J.; Pernia, A.; Alba, F.; Torre, V.

    2007-01-01

    This paper presents an example of the application of data mining techniques to obtain hidden knowledge from the historical data of a hot dip galvanizing process and to establish rules to improve quality in the final product and to reduce errors in the process. For this purpose, the tuning records of a hot dip galvanizing line where coils with adherence problems in the zinc coating had been identified were used as the starting point. From the database of the process, the classical data mining approach was applied to obtain and analyze a number of decision trees that classified two types of coils, i.e. those with the right adherence and those with irregular adherence. The variables and values that might have influenced the quality of the coating were extracted from these trees. Several rules that may be applied to reduce the number of faulty coils with adherence problems were also established. (Author) 24 refs
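The core of decision-tree induction, choosing the binary split that best separates good from faulty coils, can be sketched as below. The variable names and toy records are invented; the line's actual tuning data is not public:

```python
def gini(labels):
    """Gini impurity of a binary label list (1 = faulty coil)."""
    p = sum(labels) / len(labels)
    return 2 * p * (1 - p)

def best_split(rows, labels):
    """Return (feature index, threshold) minimizing weighted Gini impurity."""
    best = (None, None, float("inf"))
    for f in range(len(rows[0])):
        for thr in sorted({r[f] for r in rows}):
            left = [l for r, l in zip(rows, labels) if r[f] <= thr]
            right = [l for r, l in zip(rows, labels) if r[f] > thr]
            if not left or not right:
                continue
            score = (len(left) * gini(left)
                     + len(right) * gini(right)) / len(rows)
            if score < best[2]:
                best = (f, thr, score)
    return best[0], best[1]

# Toy records: [zinc bath temperature (C), strip speed (m/min)], 1 = faulty
rows = [[455, 90], [448, 95], [455, 120], [452, 125], [449, 118], [461, 92]]
faulty = [0, 0, 1, 1, 1, 0]
print(best_split(rows, faulty))  # (1, 95): faulty coils run above 95 m/min
```

Rules of the kind the paper describes are then read off the chosen splits.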

  20. Measurement of pharyngeal sensory cortical processing: technique and physiologic implications

    Directory of Open Access Journals (Sweden)

    Ringelstein E Bernd

    2009-07-01

    Full Text Available Abstract Background Dysphagia is a major complication of different diseases affecting both the central and peripheral nervous system. Pharyngeal sensory impairment is one of the main features of neurogenic dysphagia. Therefore an objective technique to examine the cortical processing of pharyngeal sensory input would be a helpful diagnostic tool in this context. We developed a simple paradigm to perform pneumatic stimulation to both sides of the pharyngeal wall. Whole-head MEG was employed to study changes in cortical activation during this pharyngeal stimulation in nine healthy subjects. Data were analyzed by means of synthetic aperture magnetometry (SAM and the group analysis of individual SAM data was performed using a permutation test. Results Our results revealed bilateral activation of the caudolateral primary somatosensory cortex following sensory pharyngeal stimulation with a slight lateralization to the side of stimulation. Conclusion The method introduced here is simple and easy to perform and might be applicable in the clinical setting. The results are in keeping with previous findings showing bihemispheric involvement in the complex task of sensory pharyngeal processing. They might also explain changes in deglutition after hemispheric strokes. The ipsilaterally lateralized processing is surprising and needs further investigation.

  1. Method for automatically evaluating a transition from a batch manufacturing technique to a lean manufacturing technique

    Science.gov (United States)

    Ivezic, Nenad; Potok, Thomas E.

    2003-09-30

    A method for automatically evaluating a manufacturing technique comprises the steps of: receiving from a user manufacturing process step parameters characterizing a manufacturing process; accepting from the user a selection for an analysis of a particular lean manufacturing technique; automatically compiling process step data for each process step in the manufacturing process; automatically calculating process metrics from a summation of the compiled process step data for each process step; and, presenting the automatically calculated process metrics to the user. A method for evaluating a transition from a batch manufacturing technique to a lean manufacturing technique can comprise the steps of: collecting manufacturing process step characterization parameters; selecting a lean manufacturing technique for analysis; communicating the selected lean manufacturing technique and the manufacturing process step characterization parameters to an automatic manufacturing technique evaluation engine having a mathematical model for generating manufacturing technique evaluation data; and, using the lean manufacturing technique evaluation data to determine whether to transition from an existing manufacturing technique to the selected lean manufacturing technique.
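The compile-and-summarize step of the method (compile per-step data, then calculate process metrics from their summation) can be sketched as follows. The step records, field names, and the two metrics are illustrative assumptions, not the patent's actual mathematical model.

```python
# Hypothetical process-step records entered by the user: each step has a
# cycle time and a flag marking whether it adds value (a common lean notion).
steps = [
    {"name": "stamp", "cycle_time_s": 30.0, "value_added": True},
    {"name": "queue", "cycle_time_s": 600.0, "value_added": False},
    {"name": "weld",  "cycle_time_s": 45.0, "value_added": True},
]

def process_metrics(steps):
    """Sum the compiled per-step data into overall process metrics."""
    total = sum(s["cycle_time_s"] for s in steps)
    value_added = sum(s["cycle_time_s"] for s in steps if s["value_added"])
    return {
        "total_cycle_time_s": total,
        # Value-added ratio: one lean indicator of how much of the lead
        # time a batch process actually spends transforming the product.
        "value_added_ratio": value_added / total,
    }

metrics = process_metrics(steps)
```

An evaluation engine of the kind described would compare such metrics between the existing batch configuration and the proposed lean configuration before recommending a transition.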

  2. Influence of different processing techniques on the mechanical properties of used tires in embankment construction

    International Nuclear Information System (INIS)

    Edincliler, Ayse; Baykal, Goekhan; Saygili, Altug

    2010-01-01

    Use of the processed used tires in embankment construction is becoming an accepted way of beneficially recycling scrap tires due to shortages of natural mineral resources and increasing waste disposal costs. Using these used tires in construction requires an awareness of the properties and the limitations associated with their use. The main objective of this paper is to assess the different processing techniques on the mechanical properties of used tires-sand mixtures to improve the engineering properties of the available soil. In the first part, a literature study on the mechanical properties of the processed used tires such as tire shreds, tire chips, tire buffings and their mixtures with sand are summarized. In the second part, large-scale direct shear tests are performed to evaluate shear strength of tire crumb-sand mixtures where information is not readily available in the literature. The test results with tire crumb were compared with the other processed used tire-sand mixtures. Sand-used tire mixtures have higher shear strength than that of the sand alone and the shear strength parameters depend on the processing conditions of used tires. Three factors are found to significantly affect the mechanical properties: normal stress, processing techniques, and the used tire content.

  3. Influence of different processing techniques on the mechanical properties of used tires in embankment construction.

    Science.gov (United States)

    Edinçliler, Ayşe; Baykal, Gökhan; Saygili, Altug

    2010-06-01

    Use of the processed used tires in embankment construction is becoming an accepted way of beneficially recycling scrap tires due to shortages of natural mineral resources and increasing waste disposal costs. Using these used tires in construction requires an awareness of the properties and the limitations associated with their use. The main objective of this paper is to assess the different processing techniques on the mechanical properties of used tires-sand mixtures to improve the engineering properties of the available soil. In the first part, a literature study on the mechanical properties of the processed used tires such as tire shreds, tire chips, tire buffings and their mixtures with sand are summarized. In the second part, large-scale direct shear tests are performed to evaluate shear strength of tire crumb-sand mixtures where information is not readily available in the literature. The test results with tire crumb were compared with the other processed used tire-sand mixtures. Sand-used tire mixtures have higher shear strength than that of the sand alone and the shear strength parameters depend on the processing conditions of used tires. Three factors are found to significantly affect the mechanical properties: normal stress, processing techniques, and the used tire content. Copyright 2009. Published by Elsevier Ltd.

  4. Domain Immersion Technique And Free Surface Computations Applied To Extrusion And Mixing Processes

    Science.gov (United States)

    Valette, Rudy; Vergnes, Bruno; Basset, Olivier; Coupez, Thierry

    2007-04-01

    This work focuses on the development of numerical techniques devoted to the simulation of mixing processes of complex fluids such as twin-screw extrusion or batch mixing. In mixing process simulation, the absence of symmetry of the moving boundaries (the screws or the rotors) implies that their rigid body motion has to be taken into account by using a special treatment. We therefore use a mesh immersion technique (MIT), which consists in using a P1+/P1-based (MINI-element) mixed finite element method for solving the velocity-pressure problem and then solving the problem in the whole barrel cavity by imposing a rigid motion (rotation) on nodes found to be located inside the so-called immersed domain, each subdomain (screw, rotor) being represented by a surface CAD mesh (or its mathematical equation in simple cases). The independent meshes are immersed into a unique background computational mesh by computing the distance function to their boundaries. Intersections of meshes are accounted for, allowing a fill factor to be computed as in the VOF methodology. This technique, combined with the use of parallel computing, allows computation of the time-dependent flow of generalized Newtonian fluids, including yield stress fluids, in a complex system such as a twin-screw extruder, including moving free surfaces, which are treated by a "level set" and Hamilton-Jacobi method.

  5. An Automated Energy Detection Algorithm Based on Morphological and Statistical Processing Techniques

    Science.gov (United States)

    2018-01-09

    100 kHz, 1 MHz; 100 MHz–1 GHz; 100 kHz … 3. Statistical Processing. 3.1 Statistical Analysis. Statistical analysis is the mathematical science … quantitative terms. In commercial prognostics and diagnostic vibrational monitoring applications, statistical techniques that are mainly used for alarm … Balakrishnan N, editors. Handbook of statistics. Amsterdam (Netherlands): Elsevier Science; 1998. p. 555–602 (order statistics and their applications) …

  6. Extension of ERIM multispectral data processing capabilities through improved data handling techniques

    Science.gov (United States)

    Kriegler, F. J.

    1973-01-01

    The improvement and extension of the capabilities of the Environmental Research Institute of Michigan processing facility in handling multispectral data are discussed. Improvements consisted of implementing hardware modifications which permitted more rapid access to the recorded data through improved numbering and indexing of such data. In addition, techniques are discussed for handling data from sources other than the ERIM M-5 and M-7 scanner systems.

  7. New analysis methods to push the boundaries of diagnostic techniques in the environmental sciences

    International Nuclear Information System (INIS)

    Lungaroni, M.; Peluso, E.; Gelfusa, M.; Malizia, A.; Talebzadeh, S.; Gaudio, P.; Murari, A.; Vega, J.

    2016-01-01

    In recent years, new and more sophisticated measurements have been at the basis of major progress in various disciplines related to the environment, such as remote sensing and thermonuclear fusion. To maximize the effectiveness of the measurements, new data analysis techniques are required. Data processing tasks performed first, such as filtering and fitting, are of primary importance, since they can have a strong influence on the rest of the analysis. Although Support Vector Regression (SVR) is a method devised and refined at the end of the 1990s, a systematic comparison with more traditional non-parametric regression methods has never been reported. In this paper, a series of systematic tests is described, which indicates that SVR is a very competitive method of non-parametric regression that can usefully complement and often outperform more consolidated approaches. The performance of Support Vector Regression as a method of filtering is investigated first, comparing it with the most popular alternative techniques. Then Support Vector Regression is applied to the problem of non-parametric regression to analyse Lidar surveys for the environmental measurement of particulate matter due to wildfires. The proposed approach has given very positive results and provides new perspectives to the interpretation of the data.
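One of the consolidated alternatives SVR is compared against is classical kernel smoothing. As a hedged illustration of that baseline (not the paper's SVR implementation), the sketch below applies a Nadaraya-Watson filter to a synthetic noisy signal of the kind a Lidar return might produce:

```python
import numpy as np

def nadaraya_watson(x_train, y_train, x_eval, bandwidth):
    """Gaussian-kernel local average: a classical non-parametric filter."""
    out = np.empty(len(x_eval), dtype=float)
    for i, x in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_train - x) / bandwidth) ** 2)
        out[i] = np.sum(w * y_train) / np.sum(w)
    return out

rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(0.0, 0.3, x.size)   # synthetic noisy signal
y_smooth = nadaraya_watson(x, y, x, bandwidth=0.3)
```

The bandwidth plays the role that the kernel and regularisation parameters play in SVR: it trades bias (oversmoothing) against variance (noise leaking through).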

  8. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Science.gov (United States)

    Fushing, Hsieh; McAssey, Michael P; Beisner, Brianne; McCowan, Brenda

    2011-03-15

    We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  9. Ranking network of a captive rhesus macaque society: a sophisticated corporative kingdom.

    Directory of Open Access Journals (Sweden)

    Hsieh Fushing

    2011-03-01

    Full Text Available We develop a three-step computing approach to explore a hierarchical ranking network for a society of captive rhesus macaques. The computed network is sufficiently informative to address the question: Is the ranking network for a rhesus macaque society more like a kingdom or a corporation? Our computations are based on a three-step approach. These steps are devised to deal with the tremendous challenges stemming from the transitivity of dominance as a necessary constraint on the ranking relations among all individual macaques, and the very high sampling heterogeneity in the behavioral conflict data. The first step simultaneously infers the ranking potentials among all network members, which requires accommodation of heterogeneous measurement error inherent in behavioral data. Our second step estimates the social rank for all individuals by minimizing the network-wide errors in the ranking potentials. The third step provides a way to compute confidence bounds for selected empirical features in the social ranking. We apply this approach to two sets of conflict data pertaining to two captive societies of adult rhesus macaques. The resultant ranking network for each society is found to be a sophisticated mixture of both a kingdom and a corporation. Also, for validation purposes, we reanalyze conflict data from twenty longhorn sheep and demonstrate that our three-step approach is capable of correctly computing a ranking network by eliminating all ranking error.

  10. HARM processing techniques for MEMS and MOEMS devices using bonded SOI substrates and DRIE

    Science.gov (United States)

    Gormley, Colin; Boyle, Anne; Srigengan, Viji; Blackstone, Scott C.

    2000-08-01

    Silicon-on-Insulator (SOI) MEMS devices (1) are rapidly gaining popularity in realizing numerous solutions for MEMS, especially in the optical and inertia application fields. BCO recently developed a DRIE trench etch, utilizing the Bosch process, and refill process for high voltage dielectric isolation integrated circuits on thick SOI substrates. In this paper we present our most recently developed DRIE processes for MEMS and MOEMS devices. These advanced etch techniques are initially described and their integration with silicon bonding demonstrated. This has enabled process flows that are currently being utilized to develop optical router and filter products for fiber optics telecommunications and high precision accelerometers.

  11. Characterization of stress degradation products of benazepril by using sophisticated hyphenated techniques.

    Science.gov (United States)

    Narayanam, Mallikarjun; Sahu, Archana; Singh, Saranjit

    2013-01-04

    Benazepril, an anti-hypertensive drug, was subjected to forced degradation studies. The drug was unstable under hydrolytic conditions, yielding benazeprilat, which is a known major degradation product (DP) and an active metabolite. It also underwent photochemical degradation in acid and neutral pH conditions, resulting in multiple minor DPs. The products were separated on a reversed-phase (C18) column in a gradient mode, and subjected to LC-MS and LC-NMR studies. Initially, a comprehensive mass fragmentation pathway of the drug was established with the support of high resolution mass spectrometric (HR-MS) and multistage tandem mass spectrometric (MS(n)) data. The DPs were also subjected to LC-MS/TOF studies to obtain their accurate masses. In addition, on-line H/D exchange data were obtained to ascertain the number of exchangeable hydrogens in each molecule. LC-(1)H NMR and LC-2D NMR data were additionally acquired in a fraction loop mode. This information was successfully employed for the characterization of all the DPs. A complete degradation pathway of the drug was also established. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Effects of processing techniques on oxidative stability of Prunus pedunculatus seed oil

    Directory of Open Access Journals (Sweden)

    J. Yan

    2017-09-01

    Full Text Available This paper investigated the effects of Prunus pedunculatus (P. pedunculatus) seed pre-treatment, including microwaving (M), roasting (R), steaming (S) and roasting plus steaming (RS), on crude oil quality in terms of yield, color change, fatty acid composition, and oxidative stability. The results showed an increase in monounsaturated fatty acid content and oxidative stability of the oils obtained from different processing treatments compared to the oil obtained from raw seeds (RW) without processing. The oils obtained from pretreated seeds had higher conjugated diene (CD) and 2-thiobarbituric acid (2-TBA) values, compared to that obtained from RW, when stored in a Schaal oven at 65 °C for 168 h. However, polyphenol and tocopherol contents decreased in all oil samples, processed or unprocessed. The effect of pre-treating the seeds was most prominent in the oil sample obtained through the RS technique, which showed higher oxidative stability than the other processed oils and the oil from RW.

  13. From research to industry - the establishment of a radiation processing industry in South Africa

    International Nuclear Information System (INIS)

    Du Plessis, T.A.; Stevens, R.C.B.

    1983-01-01

    In the late sixties the South African Atomic Energy Board, in pursuing its objective to promote the peaceful application of nuclear energy in general, established a research group with the specific purpose of investigating and developing radiation processing as a new technique. During the early years it was realised that the economic and technological facets of establishing a new industry were equally important and, in addition to fundamental research, strong emphasis was placed on the necessity of marketing this new technology. Although the initial emphasis was put on gamma sterilization, which today still forms the backbone of the radiation processing industry, the fields of polymer modification and food irradiation also hold considerable promise. Following ten years of successfully introducing and providing a radiation service, the South African Atomic Energy Board decided in 1980 to transfer its service to the private sector. These developments in South Africa are a good example of how a small country, through initial government involvement, can acquire a sophisticated new private industry. (author)

  14. Fetal Cardiac Doppler Signal Processing Techniques: Challenges and Future Research Directions

    Directory of Open Access Journals (Sweden)

    Saeed Abdulrahman Alnuaimi

    2017-12-01

    Full Text Available The fetal Doppler Ultrasound (DUS) is commonly used for monitoring fetal heart rate and can also be used for identifying the event timings of fetal cardiac valve motions. In early-stage fetuses, the detected Doppler signal suffers from noise and signal loss due to the fetal movements and changing fetal location during the measurement procedure. The fetal cardiac intervals, which can be estimated by measuring the fetal cardiac event timings, are the most important markers of fetal development and well-being. To advance DUS-based fetal monitoring methods, several powerful and well-advanced signal processing and machine learning methods have recently been developed. This review provides an overview of the existing techniques used in fetal cardiac activity monitoring and a comprehensive survey on fetal cardiac Doppler signal processing frameworks. The review is structured with a focus on their shortcomings and advantages, which helps in understanding fetal Doppler cardiogram signal processing methods and the related Doppler signal analysis procedures by providing valuable clinical information. Finally, a set of recommendations are suggested for future research directions and the use of fetal cardiac Doppler signal analysis, processing, and modeling to address the underlying challenges.

  15. Determination of some process parameters in a tyre-cord plant using radiotracer technique

    International Nuclear Information System (INIS)

    Kirti; Madhavankutti, C.K.; Eapen, A.C.

    1979-01-01

    In the process industry, it is often necessary to study the process parameters such as the residence time, flow rate, etc., under different operating conditions and equipment. The tracer technique represents in this respect an outstanding and even sometimes singular means of determining some of the above parameters. A method consisting of the introduction of a radioactive tracer at the input of a flow system under study and subsequently determining the distribution of activity with time at the output end is described. The form of the activity time curve depends on the parameters of the installation and the mode of operation. A study conducted at a multi-stage viscose rayon processing plant is described in detail. (auth.)
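The residence-time determination described here reduces to normalising the measured activity-time curve at the output into a residence time distribution and taking its first moment. The detector response below is synthetic, with an assumed exponential-tail shape; a real curve would come from the activity-versus-time record at the output end.

```python
import numpy as np

# Synthetic outlet response to a pulse of radiotracer injected at the inlet.
t = np.linspace(0.0, 120.0, 1201)      # time, s
dt = t[1] - t[0]
c = t * np.exp(-t / 10.0)              # hypothetical activity-time curve

# Normalise the curve to the residence time distribution E(t), then take
# its first moment to get the mean residence time.
e = c / (c.sum() * dt)
mean_residence_time = (t * e).sum() * dt
```

Higher moments of E(t) characterise the spread of residence times, which is how such tracer curves also reveal dead zones or channelling in the installation.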

  16. An introduction to acoustic emission technology for in-process inspection of welds

    International Nuclear Information System (INIS)

    Goswami, G.L.

    1983-01-01

    Weld quality monitoring, as it stands today, is primarily done by X-ray radiography and ultrasonic testing, which are applied after welding is complete. The Acoustic Emission Technique (AET) presents a possible alternative for weld quality monitoring that can be used during welding. Acoustic signals are generated during welding, and the sound waves produced by weld defects are picked up using AE sensors. With the introduction of sophisticated instrumentation in AET, it is possible to carry out the test even in noisy shop-floor environments. The large number of reports on the subject of acoustic emission in recent years is a clear indication that it is gaining importance in the welding industry. The present-day status of acoustic emission technology as an on-line weld quality monitoring technique is reviewed. This report discusses the technique and system along with the acoustic emission parameters important for weld quality analysis. It also deals with the application of this technique in different welding processes such as TIG, resistance, electroslag and submerged arc welding. It has been reported that monitoring of emission during welding can detect crack formation, crack growth and lack of fusion precisely. Static defects like porosity and inclusions do not generate very strong acoustic signals and are therefore difficult to intercept; however, they have lately been detected successfully. (author)

  17. Continuous process tracing and the Iowa Gambling Task: Extending response dynamics to multialternative choice

    Directory of Open Access Journals (Sweden)

    Gregory J. Koop

    2011-12-01

    Full Text Available The history of judgment and decision making is defined by a trend toward increasingly nuanced explanations of the decision making process. Recently, process models have become incredibly sophisticated, yet the tools available to directly test these models have not kept pace. These increasingly complex process models require increasingly complex process data by which they can be adequately tested. We propose a new class of data collection that will facilitate evaluation of sophisticated process models. Tracking mouse paths during a continuous response provides an implicit measure of the growth of preference that produces a choice---rather than the current practice of recording just the button press that indicates that choice itself. Recent research in cognitive science (Spivey and Dale, 2006) has shown that cognitive processing can be revealed in these dynamic motor responses. Unlike current process methodologies, these response dynamics studies can demonstrate continuous competition between choice options and even online preference reversals. Here, in order to demonstrate the mechanics and utility of the methodology, we present an example response dynamics experiment utilizing a common multi-alternative decision task.
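A standard summary of such mouse paths in the response-dynamics literature is the maximum deviation of the trajectory from the straight line between its start and end points; large deviations indicate attraction toward a competing option. The sketch below uses a synthetic trajectory, and the measure is a general one from that literature rather than anything specific to this paper.

```python
import numpy as np

def max_deviation(xy):
    """Largest perpendicular distance of trajectory points from the start-end chord."""
    start, end = xy[0], xy[-1]
    cx, cy = end - start
    chord_len = float(np.hypot(cx, cy))
    # Signed perpendicular distance via the 2-D cross product.
    dev = (cx * (xy[:, 1] - start[1]) - cy * (xy[:, 0] - start[0])) / chord_len
    return float(np.max(np.abs(dev)))

# Hypothetical trajectory that bows toward a competing response option.
s = np.linspace(0.0, 1.0, 101)
trajectory = np.stack([s, s + 0.4 * np.sin(np.pi * s)], axis=1)
md = max_deviation(trajectory)
```

Time-resolved versions of the same idea (deviation as a function of time) are what reveal the continuous competition and online preference reversals mentioned above.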

  18. [Motor capacities involved in the psychomotor skills of the cardiopulmonary resuscitation technique: recommendations for the teaching-learning process].

    Science.gov (United States)

    Miyadahira, A M

    2001-12-01

    It is a bibliographic study on the identification of the motor capacities involved in the psychomotor skills of cardiopulmonary resuscitation (CPR), which aims to obtain subsidies for the planning of the teaching-learning process of this skill. It was found that the motor capacities involved in the psychomotor skill of the CPR technique are predominantly cognitive and motor, involving 9 perceptive-motor capacities and 8 physical proficiency capacities. The CPR technique is a psychomotor skill classified as open and serial, and categorized as a fine, global skill; the teaching-learning process of the CPR technique has an elevated degree of complexity.

  19. Accelerated decomposition techniques for large discounted Markov decision processes

    Science.gov (United States)

    Larach, Abdelhadi; Chafik, S.; Daoui, C.

    2017-12-01

    Many hierarchical techniques to solve large Markov decision processes (MDPs) are based on the partition of the state space into strongly connected components (SCCs) that can be classified into some levels. In each level, smaller problems named restricted MDPs are solved, and then these partial solutions are combined to obtain the global solution. In this paper, we first propose a novel algorithm, which is a variant of Tarjan's algorithm that simultaneously finds the SCCs and their belonging levels. Second, a new definition of the restricted MDPs is presented to ameliorate some hierarchical solutions in discounted MDPs using value iteration (VI) algorithm based on a list of state-action successors. Finally, a robotic motion-planning example and the experiment results are presented to illustrate the benefit of the proposed decomposition algorithms.
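The first step, finding the SCCs together with the levels they belong to, can be sketched as follows. This is a plain recursive Tarjan variant followed by a level-assignment pass over the condensation DAG, applied to a toy graph; it is not the authors' simultaneous single-pass algorithm, and the level definition (longest path from a source SCC) is one plausible choice.

```python
def tarjan_scc(graph):
    """Return the SCCs (lists of nodes) of a dict-adjacency graph, in reverse topological order."""
    index, low, on_stack, stack, sccs = {}, {}, set(), [], []
    counter = [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]; counter[0] += 1
        stack.append(v); on_stack.add(v)
        for w in graph[v]:
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:          # v is the root of an SCC
            comp = []
            while True:
                w = stack.pop(); on_stack.discard(w); comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in graph:
        if v not in index:
            strongconnect(v)
    return sccs

def scc_levels(graph, sccs):
    """Level of an SCC = longest path to it from a source SCC in the condensation DAG."""
    comp_of = {v: i for i, comp in enumerate(sccs) for v in comp}
    level = [0] * len(sccs)
    # Tarjan emits SCCs in reverse topological order, so iterate backwards
    # to visit the condensation DAG in topological order.
    for i in reversed(range(len(sccs))):
        for v in sccs[i]:
            for w in graph[v]:
                j = comp_of[w]
                if j != i:
                    level[j] = max(level[j], level[i] + 1)
    return level

# Toy transition graph: two 2-cycles feeding an absorbing state.
graph = {1: [2], 2: [1, 3], 3: [4], 4: [3, 5], 5: []}
sccs = tarjan_scc(graph)          # components {1,2}, {3,4}, {5}
levels = scc_levels(graph, sccs)
```

In the hierarchical scheme, restricted MDPs would then be solved level by level (highest level first, since those states' values do not depend on earlier levels), and the partial solutions combined.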

  20. Optical modulation techniques for analog signal processing and CMOS compatible electro-optic modulation

    Science.gov (United States)

    Gill, Douglas M.; Rasras, Mahmoud; Tu, Kun-Yii; Chen, Young-Kai; White, Alice E.; Patel, Sanjay S.; Carothers, Daniel; Pomerene, Andrew; Kamocsai, Robert; Beattie, James; Kopa, Anthony; Apsel, Alyssa; Beals, Mark; Mitchel, Jurgen; Liu, Jifeng; Kimerling, Lionel C.

    2008-02-01

    Integrating electronic and photonic functions onto a single silicon-based chip using techniques compatible with mass-production CMOS electronics will enable new design paradigms for existing system architectures and open new opportunities for electro-optic applications with the potential to dramatically change the management, cost, footprint, weight, and power consumption of today's communication systems. While broadband analog system applications represent a smaller volume market than that for digital data transmission, there are significant deployments of analog electro-optic systems for commercial and military applications. Broadband linear modulation is a critical building block in optical analog signal processing and also could have significant applications in digital communication systems. Recently, broadband electro-optic modulators on a silicon platform have been demonstrated based on the plasma dispersion effect. The use of the plasma dispersion effect within a CMOS compatible waveguide creates new challenges and opportunities for analog signal processing since the index and propagation loss change within the waveguide during modulation. We will review the current status of silicon-based electrooptic modulators and also linearization techniques for optical modulation.

  1. Human Milk Processing: A Systematic Review of Innovative Techniques to Ensure the Safety and Quality of Donor Milk.

    Science.gov (United States)

    Peila, Chiara; Emmerik, Nikki E; Giribaldi, Marzia; Stahl, Bernd; Ruitenberg, Joost E; van Elburg, Ruurd M; Moro, Guido E; Bertino, Enrico; Coscia, Alessandra; Cavallarin, Laura

    2017-03-01

    Pasteurization, performed at 62.5°C for 30 minutes (holder pasteurization), is currently recommended in all international human milk banks guidelines, but it affects some human milk bioactive and nutritive components. The present systematic review is aimed at critically reviewing evidence on the suitability of human milk processing techniques other than holder pasteurization, both thermal and nonthermal, to ensure microbiological safety, and on the effects of these techniques on biologically active donor milk components. A systematic review of English and non-English articles using Medline, PubMed, Embase, SCOPUS, and CAB Abstracts, with no restriction in publication date was performed. Search terms included: human, breast, donor, or banked milk, breastmilk, breast fed, breastfed, breastfeed; HTST, Flash, High Pressure, UV, ultrasonic or nonthermal; process, pasteuris, pasteuriz. Only primary research articles published in peer-reviewed journals were included, providing or not a comparison with holder pasteurized human milk, provided that the pasteurization technique was clearly described, and not intended for domestic use. Additional studies were identified by searching bibliographies of relevant articles. Twenty-six studies were identified as being relevant. Two examined both High Pressure Processing and High-Temperature-Short-Time pasteurization; 10 only examined High Pressure Processing; 10 only examined High-Temperature-Short-Time; 2 articles examined ultraviolet irradiation; 2 articles examined (thermo-)ultrasonic processing. The results indicate that data about safety for microbiological control are still scarce for most of the novel technologies, and that consensus on processing conditions is necessary for nonthermal technologies, before any conclusions on the qualitative and nutritional advantages of these techniques can be drawn.

  2. Comparison of various techniques for the extraction of umbelliferone and herniarin in Matricaria chamomilla processing fractions.

    Science.gov (United States)

    Molnar, Maja; Mendešević, Nikolina; Šubarić, Drago; Banjari, Ines; Jokić, Stela

    2017-08-05

    Chamomile, a well-known medicinal plant, is a rich source of bioactive compounds, among which two coumarin derivatives, umbelliferone and herniarin, are often found in its extracts. Chamomile extracts have found different uses in the cosmetic industry, as has umbelliferone itself, which, due to its strong absorption of UV light, is usually added to sunscreens, while herniarin (7-methoxycoumarin) is also known for its biological activity. Therefore, chamomile extracts with a certain herniarin and umbelliferone content could be of interest for application in pharmaceutical and cosmetic products. The aim of this study was to compare the extracts of different chamomile fractions (unprocessed chamomile flowers first class, processed chamomile flowers first class, pulvis and processing waste) and to identify the best material and method of extraction to obtain herniarin and umbelliferone. Various extraction techniques such as Soxhlet extraction, hydrodistillation, maceration and supercritical CO2 extraction were used in this study. Umbelliferone and herniarin content was determined by high performance liquid chromatography (HPLC). The highest yields of umbelliferone (11.80 mg/100 g) and herniarin (82.79 mg/100 g) were obtained from chamomile processing waste using the maceration technique with a 50% aqueous ethanol solution, and this extract was also shown to possess antioxidant activity (61.5% DPPH scavenging activity). This study shows the potential for utilization of waste from chamomile processing by applying different extraction techniques.

  3. Influence of Processing Techniques on Microstructure and Mechanical Properties of a Biodegradable Mg-3Zn-2Ca Alloy.

    Science.gov (United States)

    Doležal, Pavel; Zapletal, Josef; Fintová, Stanislava; Trojanová, Zuzanka; Greger, Miroslav; Roupcová, Pavla; Podrábský, Tomáš

    2016-10-28

    A new Mg-3Zn-2Ca magnesium alloy was prepared using different processing techniques: gravity casting as well as squeeze casting in the liquid and semisolid states. The materials were further thermally treated; thermal treatment of the gravity cast alloy was additionally combined with equal channel angular pressing (ECAP). Alloys processed by squeeze casting in the liquid as well as the semisolid state exhibit improved plasticity; the ECAP processing positively influenced both the tensile and compressive characteristics of the alloy. The applied heat treatment influenced the distribution and chemical composition of the present intermetallic phases. The influence of the particular processing techniques, heat treatment, and intermetallic phase distribution is thoroughly discussed in relation to the mechanical behavior of the presented alloys.

  4. Dysphagia Screening: Contributions of Cervical Auscultation Signals and Modern Signal-Processing Techniques

    Science.gov (United States)

    Dudik, Joshua M.; Coyle, James L.

    2015-01-01

    Cervical auscultation is the recording of sounds and vibrations caused by the human body from the throat during swallowing. While traditionally done by a trained clinician with a stethoscope, much work has been put towards developing more sensitive and clinically useful methods to characterize the data obtained with this technique. The eventual goal of the field is to improve the effectiveness of screening algorithms designed to predict the risk that swallowing disorders pose to individual patients’ health and safety. This paper provides an overview of these signal processing techniques and summarizes recent advances made with digital transducers in hopes of organizing the highly varied research on cervical auscultation. It investigates where on the body these transducers are placed in order to record a signal as well as the collection of analog and digital filtering techniques used to further improve the signal quality. It also presents the wide array of methods and features used to characterize these signals, ranging from simply counting the number of swallows that occur over a period of time to calculating various descriptive features in the time, frequency, and phase space domains. Finally, this paper presents the algorithms that have been used to classify this data into ‘normal’ and ‘abnormal’ categories. Both linear as well as non-linear techniques are presented in this regard. PMID:26213659
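The simplest of the features listed above, counting swallows over a period of time, can be sketched by thresholding a smoothed envelope of the recorded signal and counting rising edges. The signal here is synthetic (bursts of a 60 Hz tone in noise) and the constants are illustrative, not values from the surveyed studies.

```python
import numpy as np

fs = 1000                                           # sampling rate, Hz
t = np.arange(0.0, 10.0, 1.0 / fs)
signal = 0.05 * np.random.default_rng(2).normal(size=t.size)   # background noise
for onset in (2.0, 5.0, 8.0):                       # three synthetic "swallow" events
    burst = (t > onset) & (t < onset + 0.5)
    signal[burst] += np.sin(2 * np.pi * 60.0 * t[burst])

# Rectify and smooth to get an amplitude envelope (0.2 s moving average).
envelope = np.convolve(np.abs(signal), np.ones(200) / 200, mode="same")
above = envelope > 0.3
# Each rising edge of the thresholded envelope counts as one detected event.
n_swallows = int(np.sum(above[1:] & ~above[:-1]))
```

The time, frequency, and phase-space features surveyed in the paper refine exactly this kind of segmented event signal before any normal/abnormal classification is attempted.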

  5. Phase distribution measurements in narrow rectangular channels using image processing techniques

    International Nuclear Information System (INIS)

    Bentley, C.; Ruggles, A.

    1991-01-01

    Many high-flux research reactor fuel assemblies are cooled by systems of parallel narrow rectangular channels. The HFIR is cooled by single-phase forced convection under normal operating conditions. However, two-phase forced convection or two-phase mixed convection can occur in the fueled region as a result of some hypothetical accidents. Such flow conditions would occur only at decay power levels, with a system pressure of around 0.15 MPa. Phase distribution of air-water flow in a narrow rectangular channel is examined using image processing techniques. Ink is added to the water and clear channel walls are used, allowing high-speed still photographs and video tape to be taken of the air-water flow field. Flow field images are digitized and stored in a Macintosh IIci computer using a frame-grabber board. Local grey levels are related to liquid thickness in the flow channel using a calibration fixture. Image processing shareware is used to calculate the spatially averaged liquid thickness from the image of the flow field. Time-averaged spatial liquid distributions are calculated using image calculation algorithms, and the spatially averaged liquid distribution is calculated from the time-averaged spatial liquid distribution to form the combined temporally and spatially averaged fraction values. The temporally and spatially averaged liquid fractions measured using this technique compare well with those predicted from pressure gradient measurements at zero superficial liquid velocity.
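The grey-level-to-thickness step described above can be sketched as a simple lookup: a calibration fixture gives grey levels at known liquid thicknesses, each pixel of a digitized frame is mapped through interpolation, and the result is averaged in time and space. The calibration numbers and channel gap below are invented for illustration only.

```python
import numpy as np

# Hypothetical calibration: grey levels recorded at known liquid thicknesses (mm).
cal_grey = np.array([30.0, 80.0, 140.0, 200.0, 250.0])
cal_thickness_mm = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

def frame_to_thickness(frame):
    """Map each pixel's grey level to liquid thickness via the calibration curve."""
    return np.interp(frame, cal_grey, cal_thickness_mm)

def averaged_liquid_fraction(frames, channel_gap_mm):
    """Temporally and spatially averaged liquid fraction over a stack of frames."""
    thickness = np.mean([frame_to_thickness(f) for f in frames], axis=0)  # time average
    return float(np.mean(thickness) / channel_gap_mm)                     # space average

# Two identical 4x4 frames at the mid-calibration grey level (1.0 mm of liquid)
# inside a hypothetical 2.0 mm channel gap give a liquid fraction of 0.5.
frames = [np.full((4, 4), 140.0), np.full((4, 4), 140.0)]
alpha_liquid = averaged_liquid_fraction(frames, channel_gap_mm=2.0)
```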

  6. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  7. Radiotracer techniques in mineral processing

    International Nuclear Information System (INIS)

    Przewlocki, K.

    1991-01-01

    The value of the smelter metal content in currently exploited polymetallic ores mostly does not exceed 2%. Before metallurgical treatment, the ore must pass through a concentration process. The beneficiation process usually starts with the comminution of excavated material and terminates with the flotation and drying of the concentrate. These operations consume vast quantities of energy. To be economically justified, the process requires optimization and, if possible, automatic control. Radioactive tracers were found to be useful in the identification of particular technological subsystems and their subsequent optimization, and a great deal of experience has been gathered in this field. The industrial radiotracer test (RTT) is carried out using very sensitive multidetector recording systems with digital data acquisition capabilities. The optimization strategy consists of periodically adjusting the technological process and the set points of controlled variables according to certain improvement procedures. If computer facilities are available, data interpretation and calibration of the mathematical models describing the technological process can be performed on the spot. This significantly accelerates the whole procedure, as the RTT may be repeated for particular system configurations. The procedure of plant optimization by means of RTT is illustrated in the paper using the example of the copper ore enrichment process, assuming that it is representative of the whole mineral industry. Identification by RTT of the three main operations involved in the ore enrichment process (comminution, flotation and granular classification) is discussed in detail as particular case studies. It is also shown how the technological process can be adjusted to be most efficient. (author). 14 refs, 7 figs

  8. Feasibility of Johnson Noise Thermometry based on Digital Signal Processing Techniques

    International Nuclear Information System (INIS)

    Hwang, In Koo; Kim, Yang Mo

    2014-01-01

    This paper presents an implementation strategy for noise thermometry based on a digital signal processing technique and demonstrates its feasibility. A key factor in its development is how to extract the small thermal noise signal from other noise sources, for example, random noise from amplifiers and continuous electromagnetic interference from the environment. The proposed system consists of two identical amplifiers and uses a cross-correlation function to cancel the random noise of the amplifiers. The external interference is then eliminated by discriminating the difference in peaks between the thermal signal and the external noise. The gain of the amplifiers is estimated by injecting a known pilot signal. Experimental simulation results of the signal processing methods have demonstrated that the proposed approach is effective in eliminating external noise and performing gain correction for the development of the thermometry.
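The two-amplifier cross-correlation idea can be illustrated with a minimal numerical sketch, under invented signal levels: both channels see the same thermal noise, but each adds its own independent amplifier noise, so averaging the product of the two channels converges to the thermal noise power while the uncorrelated amplifier noise averages out.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
thermal_power = 1.0   # variance of the shared Johnson-noise signal (arbitrary units)
amp_power = 4.0       # variance of each amplifier's independent noise

s = rng.normal(0.0, np.sqrt(thermal_power), n)    # shared thermal signal
x1 = s + rng.normal(0.0, np.sqrt(amp_power), n)   # channel 1: signal + amp noise
x2 = s + rng.normal(0.0, np.sqrt(amp_power), n)   # channel 2: signal + amp noise

# Cross-correlation at zero lag: amplifier noise is uncorrelated between the
# channels, so the product average estimates only the thermal power.
cross_estimate = float(np.mean(x1 * x2))          # ~ thermal_power
single_channel = float(np.mean(x1 * x1))          # ~ thermal_power + amp_power
```

The single-channel power estimate is biased by the full amplifier noise power; the cross-correlated estimate is not, which is why the scheme works even when the amplifier noise dominates the thermal signal.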

  9. Image processing and pattern recognition with CVIPtools MATLAB toolbox: automatic creation of masks for veterinary thermographic images

    Science.gov (United States)

    Mishra, Deependra K.; Umbaugh, Scott E.; Lama, Norsang; Dahal, Rohini; Marino, Dominic J.; Sackman, Joseph

    2016-09-01

    CVIPtools is a software package for the exploration of computer vision and image processing, developed in the Computer Vision and Image Processing Laboratory at Southern Illinois University Edwardsville. CVIPtools is available in three variants - a) the CVIPtools Graphical User Interface, b) the CVIPtools C library and c) the CVIPtools MATLAB toolbox - which makes it accessible to a variety of different users. It offers students, faculty, researchers and any other user a free and easy way to explore computer vision and image processing techniques. Many functions have been implemented and are updated on a regular basis, and the library has reached a level of sophistication that makes it suitable for both educational and research purposes. In this paper, a detailed list of the functions available in the CVIPtools MATLAB toolbox is presented, along with how these functions can be used in image analysis and computer vision applications. The CVIPtools MATLAB toolbox allows the user to gain practical experience to better understand underlying theoretical problems in image processing and pattern recognition. As an example application, an algorithm for the automatic creation of masks for veterinary thermographic images is presented.

  10. Digital Signal Processing For Low Bit Rate TV Image Codecs

    Science.gov (United States)

    Rao, K. R.

    1987-06-01

    In view of the 56 KBPS digital switched network services and the ISDN, low bit rate codecs for providing real-time full-motion color video are under various stages of development, and some companies have already brought codecs to the market. They are being used by industry and some Federal agencies for video teleconferencing. In general, these codecs have various features such as multiplexing of audio and data, high-resolution graphics, encryption, error detection and correction, self-diagnostics, freeze-frame, split video, text overlay, etc. Transmitting the original color video over a 56 KBPS network requires a bit rate reduction of the order of 1400:1. Such large-scale bandwidth compression can be realized only by implementing a number of sophisticated digital signal processing techniques. This paper provides an overview of such techniques and outlines the newer concepts being investigated. Before resorting to data compression, various preprocessing operations such as noise filtering, composite-component transformation, and horizontal and vertical blanking interval removal must be implemented. Invariably, spatio-temporal subsampling is achieved by appropriate filtering. Transform and/or predictive coding, coupled with motion estimation and strengthened by adaptive features, are some of the tools in the arsenal of data reduction methods. Other essential blocks in the system are the quantizer, bit allocation, buffer, multiplexer, channel coding, etc.
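As a toy illustration of the prediction-plus-quantization building block mentioned above (not the design of any particular codec), the sketch below DPCM-encodes a 1-D sample sequence: each sample is predicted from the previous reconstruction, and only the quantized prediction error is "transmitted". The step size and sequence are invented.

```python
def dpcm_encode(samples, step):
    """Quantize each sample's prediction residual against the previous
    decoder-matched reconstruction (simple first-order DPCM)."""
    codes, prev = [], 0
    for s in samples:
        code = round((s - prev) / step)   # quantized prediction error
        codes.append(code)
        prev += code * step               # track what the decoder will see
    return codes

def dpcm_decode(codes, step):
    """Rebuild the signal by accumulating dequantized residuals."""
    out, prev = [], 0
    for code in codes:
        prev += code * step
        out.append(prev)
    return out

signal = [0, 3, 7, 8, 6, 2]
codes = dpcm_encode(signal, step=2)
recon = dpcm_decode(codes, step=2)
# Reconstruction error stays within half the quantizer step.
```

Real codecs add motion-compensated prediction, adaptive quantizers, and entropy coding on top of this skeleton, but the encoder-side decoder loop shown here is the essential idea.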

  11. Using Data-Driven and Process Mining Techniques for Identifying and Characterizing Problem Gamblers in New Zealand

    Directory of Open Access Journals (Sweden)

    Suriadi Suriadi

    2016-12-01

    Full Text Available This article uses data-driven techniques combined with established theory to analyse the gambling behaviour of 91 thousand individuals in a real-world fixed-odds gambling dataset from New Zealand. This research uniquely integrates a mixture of process mining, data mining and confirmatory statistical techniques to categorise different sub-groups of gamblers, with the explicit motivation of identifying problem gambling behaviours and reporting on the challenges and lessons learned from our case study. We demonstrate how techniques from various disciplines can be combined to gain insight into the behavioural patterns exhibited by different types of gamblers, as well as to provide assurances of the correctness of our approach and findings. A highlight of this case study is both the methodology, which demonstrates how such a combination of techniques provides a rich and effective set of tools for an exploratory, open-ended data analysis project guided by the process cube concept, and the findings themselves, which indicate that the contribution that problem gamblers make to the total volume, expenditure, and revenue is higher than previous studies have maintained.

  12. Intelligent Technique for Signal Processing to Identify the Brain Disorder for Epilepsy Captures Using Fuzzy Systems

    Directory of Open Access Journals (Sweden)

    Gurumurthy Sasikumar

    2016-01-01

    Full Text Available Understanding the signals created by brain activity is one of the main tasks in brain signal processing. Among all neurological disorders, epilepsy is considered one of the most prevalent, and an automated artificial-intelligence detection technique is essential because of the erratic and unpredictable occurrence of epileptic seizures. We propose an improved fuzzy firefly algorithm, which enhances the classification of brain signals efficiently with a minimum number of iterations. An important clustering technique based on fuzzy logic is fuzzy c-means. Features obtained from multichannel EEG signals were combined, in both the feature domain and the spatial domain, by means of fuzzy algorithms. For a more precise segmentation process, the firefly algorithm is applied to optimize the fuzzy c-means membership function, and convergence criteria are set for efficient clustering. On the whole, the proposed technique yields more accurate results, which gives it an edge over other techniques. The results of the proposed algorithm are compared with those of other algorithms such as the fuzzy c-means algorithm and PSO.
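A minimal sketch of the standard fuzzy c-means membership update that this work builds on (without the firefly optimization step), assuming the common fuzzifier value m = 2; the data and variable names are illustrative.

```python
import numpy as np

def fcm_memberships(X, centers, m=2.0):
    """Standard fuzzy c-means membership update: u_ik is proportional to
    1 / d_ik^(2/(m-1)), normalized so each point's memberships sum to 1."""
    d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)  # (n, c)
    d = np.maximum(d, 1e-12)                                         # avoid /0
    inv = d ** (-2.0 / (m - 1.0))
    return inv / inv.sum(axis=1, keepdims=True)

# Three 2-D points against two cluster centers (toy data).
X = np.array([[0.0, 0.0], [10.0, 10.0], [0.1, 0.0]])
centers = np.array([[0.0, 0.0], [10.0, 10.0]])
U = fcm_memberships(X, centers)
```

In the full algorithm this update alternates with a center update until convergence; the paper's contribution is using a firefly search to tune the memberships instead of relying on that alternation alone.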

  13. Implementation of quality improvement techniques for management and technical processes in the ACRV project

    Science.gov (United States)

    Raiman, Laura B.

    1992-12-01

    Total Quality Management (TQM) is a cooperative form of doing business that relies on the talents of everyone in an organization to continually improve quality and productivity, using teams and an assortment of statistical and measurement tools. The objective of the activities described in this paper was to implement effective improvement tools and techniques in order to build work processes which support good management and technical decisions and actions which are crucial to the success of the ACRV project. The objectives were met by applications in both the technical and management areas. The management applications involved initiating focused continuous improvement projects with widespread team membership. The technical applications involved applying proven statistical tools and techniques to the technical issues associated with the ACRV Project. Specific activities related to the objective included working with a support contractor team to improve support processes, examining processes involved in international activities, a series of tutorials presented to the New Initiatives Office and support contractors, a briefing to NIO managers, and work with the NIO Q+ Team. On the technical side, work included analyzing data from the large-scale W.A.T.E.R. test, landing mode trade analyses, and targeting probability calculations. The results of these efforts will help to develop a disciplined, ongoing process for producing fundamental decisions and actions that shape and guide the ACRV organization.

  15. Toward Meaningful Manufacturing Variation Data in Design - Feature Based Description of Variation in Manufacturing Processes

    DEFF Research Database (Denmark)

    Eifler, Tobias; Boorla, Srinivasa Murthy; Howard, Thomas J.

    2016-01-01

    The need to mitigate the effects of manufacturing variation already in design is nowadays commonly acknowledged and has led to a wide use of predictive modeling techniques, tolerancing approaches, etc. in industry. The trustworthiness of corresponding variation analyses is, however, not ensured by the availability of sophisticated methods and tools alone, but evidently also depends on the accuracy of the input information used. As existing approaches for the description of manufacturing variation focus almost exclusively on monitoring and controlling production processes, there is frequently a lack of objective variation data in design. As a result, variation analyses and tolerancing activities rely on numerous assumptions made to fill the gaps of missing or incomplete data. To overcome this hidden subjectivity, a schema for a consistent and standardised description of manufacturing...

  16. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Science.gov (United States)

    Schepen, Andrew; Zhao, Tongtiegang; Wang, Quan J.; Robertson, David E.

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
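The Schaake Shuffle step mentioned above can be sketched as a rank reordering: the calibrated ensemble members at each lead time are rearranged so that their ranks follow those of a historical template, restoring realistic temporal correlation across lead times. This is an illustrative sketch of the general technique, not the RPP-S implementation; the arrays are invented.

```python
import numpy as np

def schaake_shuffle(ensemble, template):
    """Reorder each column (lead time) of `ensemble` so that its ranks match
    the ranks of the corresponding column of `template`.
    Both arrays have shape (n_members, n_lead_times)."""
    out = np.empty_like(ensemble)
    for t in range(ensemble.shape[1]):
        sorted_vals = np.sort(ensemble[:, t])
        ranks = np.argsort(np.argsort(template[:, t]))  # rank of each template row
        out[:, t] = sorted_vals[ranks]
    return out

# Toy forecast ensemble (3 members, 2 lead times) and a historical template
# whose row-wise rank structure we want the output to inherit.
ens = np.array([[5.0, 1.0], [2.0, 9.0], [8.0, 4.0]])
tmpl = np.array([[0.7, 0.2], [0.1, 0.5], [0.3, 0.9]])
shuffled = schaake_shuffle(ens, tmpl)
```

The shuffle changes only the pairing of values across lead times, never the marginal distribution at any single lead time, which is why it can be applied after any univariate calibration.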

  17. Computer processing of the scintigraphic image using digital filtering techniques

    International Nuclear Information System (INIS)

    Matsuo, Michimasa

    1976-01-01

    The theory of digital filtering was studied as a method for the computer processing of scintigraphic images. The characteristics and design techniques of finite impulse response (FIR) digital filters with linear phase were examined using the z-transform. The conventional data processing method, smoothing, can be recognized as one kind of linear-phase FIR low-pass digital filtering. Ten representative FIR low-pass digital filters with various cut-off frequencies were scrutinized in the frequency domain in one and two dimensions. These filters were applied to phantom studies with cold targets, using a Scinticamera-Minicomputer on-line system. These studies revealed that the resultant images had a direct connection with the magnitude response of the filter; that is, they could be estimated fairly well from the frequency response of the digital filter used. The filter estimated from the phantom studies as optimal for liver scintigrams using 198Au colloid was successfully applied in clinical use for detecting true cold lesions and, at the same time, for eliminating spurious images. (J.P.N.)
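The observation that smoothing is a linear-phase FIR low-pass filter can be checked directly: a symmetric kernel has unit gain at DC (constants pass unchanged) and attenuates high frequencies. A minimal numpy sketch with an invented 5-tap kernel, not one of the paper's ten filter designs:

```python
import numpy as np

# A symmetric 5-tap smoothing kernel: a linear-phase FIR low-pass filter.
h = np.array([1.0, 2.0, 3.0, 2.0, 1.0])
h /= h.sum()                                  # normalize to unit gain at DC

# Frequency response H(w) = sum_n h[n] * exp(-j * w * n), for w in [0, pi].
w = np.linspace(0.0, np.pi, 256)
H = np.array([np.sum(h * np.exp(-1j * wk * np.arange(len(h)))) for wk in w])

dc_gain = abs(H[0])        # 1.0: smoothing preserves a constant image region
nyquist_gain = abs(H[-1])  # (1 - 2 + 3 - 2 + 1)/9 = 1/9: high frequencies damped
# Symmetric taps give linear phase, so edges shift but are not distorted.
```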

  18. Exploring Graduate Students' Perspectives towards Using Gamification Techniques in Online Learning

    Directory of Open Access Journals (Sweden)

    Daniah ALABBASI

    2017-07-01

    Full Text Available Teachers and educational institutions are attempting to find appropriate strategies to motivate and engage students in the learning process. Institutions are encouraging the use of gamification in education for the purpose of improving intrinsic motivation and engagement; however, the students' perspective on the issue is under-investigated. The purpose of this research study was to explore graduate students' perspectives on the use of gamification techniques in online learning. The study used an exploratory research design with a survey as the data collection tool. Forty-seven graduate students (n = 47) enrolled in an instructional technology program studied in a learning management system that supports gamification (TalentLMS). Average total percentages were calculated for each survey section to compose the overall perspective of the included students. The results showed a positive perception of the use of gamification tools in online learning among graduate students. Students require effort-demanding, challenging, sophisticated learning systems that increase competency and enhance recall, concentration, attentiveness, commitment, and social interaction. Limitations of the study are identified, highlighting the need for further research on the subject.

  19. SALP (Sensitivity Analysis by List Processing), a computer assisted technique for binary systems reliability analysis

    International Nuclear Information System (INIS)

    Astolfi, M.; Mancini, G.; Volta, G.; Van Den Muyzenberg, C.L.; Contini, S.; Garribba, S.

    1978-01-01

    A computerized technique is described which allows the modelling, by AND, OR, NOT binary trees, of various complex situations encountered in safety and reliability assessment. Through the use of list processing, numerical and non-numerical types of information are used together. By proper marking of gates and primary events, stand-by systems, common-cause failures and multiphase systems can be analyzed. The basic algorithms used in this technique are shown in detail. An application to a stand-by and multiphase system is then illustrated.
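The AND/OR/NOT binary-tree modelling described above can be sketched as a small recursive evaluator over nested tuples. The gate structure and event names below are illustrative inventions, not taken from SALP.

```python
def evaluate(node, events):
    """Evaluate an AND/OR/NOT binary tree over primary-event states.
    A node is either a primary-event name (looked up in `events`)
    or a tuple ('AND' | 'OR' | 'NOT', child, ...)."""
    if isinstance(node, str):
        return events[node]
    op, *children = node
    if op == "AND":
        return all(evaluate(c, events) for c in children)
    if op == "OR":
        return any(evaluate(c, events) for c in children)
    if op == "NOT":
        return not evaluate(children[0], events)
    raise ValueError(f"unknown gate {op!r}")

# Hypothetical top event: the pump fails AND (power is lost OR the
# stand-by unit is NOT available).
tree = ("AND", "pump_fails", ("OR", "power_lost", ("NOT", "standby_available")))
state = {"pump_fails": True, "power_lost": False, "standby_available": False}
top = evaluate(tree, state)   # pump failed and the stand-by is unavailable
```

A list-processing implementation like SALP would additionally carry non-numerical annotations (phase markers, common-cause tags) on each gate and event node rather than plain booleans.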

  20. Prediction of UV spectra and UV-radiation damage in actual plasma etching processes using on-wafer monitoring technique

    International Nuclear Information System (INIS)

    Jinnai, Butsurin; Fukuda, Seiichi; Ohtake, Hiroto; Samukawa, Seiji

    2010-01-01

    UV radiation during plasma processing affects the surface of materials. Nevertheless, the interaction of UV photons with surfaces is not clearly understood because of the difficulty of monitoring photons during plasma processing. For this purpose, we previously proposed an on-wafer monitoring technique for UV photons. In this study, using the combination of this on-wafer monitoring technique and a neural network, we established a relationship between the data obtained from the on-wafer monitoring technique and UV spectra. We also obtained absolute intensities of UV radiation by calibrating the arbitrary units of UV intensity with a 126 nm excimer lamp. As a result, UV spectra and their absolute intensities could be predicted with on-wafer monitoring. Furthermore, we developed a prediction system based on the on-wafer monitoring technique to simulate UV-radiation damage in dielectric films during plasma etching; UV-induced damage in SiOC films was predicted in this study. Our prediction results show that UV spectra and their absolute intensities are the key cause of damage in SiOC films. In addition, UV-radiation damage in SiOC films strongly depends on the geometry of the etched structure. The on-wafer monitoring technique should be useful for understanding the interaction of UV radiation with surfaces and for optimizing plasma processing by controlling UV radiation.

  1. Evaluating Acoustic Emission Signals as an in situ process monitoring technique for Selective Laser Melting (SLM)

    Energy Technology Data Exchange (ETDEWEB)

    Fisher, Karl A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Candy, Jim V. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Guss, Gabe [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Mathews, M. J. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2016-10-14

    In situ real-time monitoring of the Selective Laser Melting (SLM) process has significant implications for the AM community. The ability to adjust SLM process parameters during a build (in real time) can save time and money and eliminate expensive material waste. A feedback loop in the process would allow the system to potentially 'fix' problem regions before the next powder layer is added. In this study we have investigated acoustic emission (AE) phenomena generated during the SLM process and evaluated the results in terms of a single process parameter, as a basis for an in situ process monitoring technique.

  2. Test/score/report: Simulation techniques for automating the test process

    Science.gov (United States)

    Hageman, Barbara H.; Sigman, Clayton B.; Koslosky, John T.

    1994-01-01

    A Test/Score/Report capability is currently being developed for the Transportable Payload Operations Control Center (TPOCC) Advanced Spacecraft Simulator (TASS) system which will automate testing of the Goddard Space Flight Center (GSFC) Payload Operations Control Center (POCC) and Mission Operations Center (MOC) software in three areas: telemetry decommutation, spacecraft command processing, and spacecraft memory load and dump processing. Automated computer control of the acceptance test process is one of the primary goals of a test team. With the proper simulation tools and user interface, acceptance testing, regression testing, and repeating specific test procedures of a ground data system become simpler tasks. Ideally, the goal for complete automation would be to plug the operational deliverable into the simulator, press the start button, execute the test procedure, accumulate and analyze the data, score the results, and report the results to the test team along with a go/no-go recommendation. In practice, this may not be possible because of inadequate test tools, schedule pressures, limited resources, etc. Most tests are accomplished using a certain degree of automation together with test procedures that are labor intensive. This paper discusses some simulation techniques that can improve the automation of the test process. The TASS system tests the POCC/MOC software and provides a score based on the test results. The TASS system displays statistics on the success of the POCC/MOC system processing in each of the three areas, as well as event messages pertaining to the Test/Score/Report processing. The TASS system also provides formatted reports documenting each step performed during the tests and the results of each step. A prototype of the Test/Score/Report capability is available and currently being used to test some POCC/MOC software deliveries. When this capability is fully operational, it should greatly reduce the time necessary

  3. Pre-writing Techniques In The Writing Process For The L2 Classroom

    OpenAIRE

    Gülşah Geyimci

    2014-01-01

    This study investigated pre-writing techniques in the learning process for improving the written communication skills of learners, using qualitative research methods. The study was performed at a public school, Suphi Öner Primary School, in Mersin, Turkey. The students were a seventh-grade class of twenty at the pre-intermediate level. The study took three weeks, during which the students' samples, drawings and blogs were documented by the students. In order to examine the results, ...

  4. The Impact of Services on Economic Complexity: Service Sophistication as Route for Economic Growth.

    Science.gov (United States)

    Stojkoski, Viktor; Utkovski, Zoran; Kocarev, Ljupco

    2016-01-01

    Economic complexity reflects the amount of knowledge that is embedded in the productive structure of an economy. By combining tools from network science and econometrics, a robust and stable relationship between a country's productive structure and its economic growth has been established. Here we report that not only goods but also services are important for predicting the rate at which countries will grow. By adopting a terminology which classifies manufactured goods and delivered services as products, we investigate the influence of services on the country's productive structure. In particular, we provide evidence that complexity indices for services are in general higher than those for goods, which is reflected in a general tendency to rank countries with developed service sector higher than countries with economy centred on manufacturing of goods. By focusing on country dynamics based on experimental data, we investigate the impact of services on the economic complexity of countries measured in the product space (consisting of both goods and services). Importantly, we show that diversification of service exports and its sophistication can provide an additional route for economic growth in both developing and developed countries.

  5. The State of Nursing Home Information Technology Sophistication in Rural and Nonrural US Markets.

    Science.gov (United States)

    Alexander, Gregory L; Madsen, Richard W; Miller, Erin L; Wakefield, Douglas S; Wise, Keely K; Alexander, Rachel L

    2017-06-01

    To test for significant differences in information technology sophistication (ITS) in US nursing homes (NH) based on location. We administered a primary survey January 2014 to July 2015 to NH in each US state. The survey was cross-sectional and examined 3 dimensions (IT capabilities, extent of IT use, degree of IT integration) among 3 domains (resident care, clinical support, administrative activities) of ITS. ITS was broken down by NH location. Mean responses were compared across 4 NH categories (Metropolitan, Micropolitan, Small Town, and Rural) for all 9 ITS dimensions and domains. Least square means and Tukey's method were used for multiple comparisons. Methods yielded 815/1,799 surveys (45% response rate). In every health care domain (resident care, clinical support, and administrative activities) statistical differences in facility ITS occurred in larger (metropolitan or micropolitan) and smaller (small town or rural) populated areas. This study represents the most current national assessment of NH IT since 2004. Historically, NH IT has been used solely for administrative activities and much less for resident care and clinical support. However, results are encouraging as ITS in other domains appears to be greater than previously imagined. © 2016 National Rural Health Association.

  6. Supercritical fluid processing: a new dry technique for photoresist developing

    Science.gov (United States)

    Gallagher-Wetmore, Paula M.; Wallraff, Gregory M.; Allen, Robert D.

    1995-06-01

    Supercritical fluid (SCF) technology is investigated as a dry technique for photoresist developing. Because of their unique combination of gas-like and liquid-like properties, these fluids offer comparable or improved efficiency relative to liquid developers, and carbon dioxide in particular would have a tremendously beneficial impact on the environment and on worker safety. Additionally, SCF technology offers the potential for processing advanced resist systems currently under investigation, as well as those that may have been abandoned due to problems associated with conventional developers. An investigation of various negative and positive photoresist systems is ongoing. Initially, supercritical carbon dioxide (SC CO2) was explored as a developer for polysilane resists because the exposure products, polysiloxanes, are generally soluble in this fluid. These initial studies demonstrated the viability of the SCF technique with both single-layer and bilayer systems. Subsequently, the investigation focused on using SC CO2 to produce negative images with polymers that would typically be considered positive resists. Polymers such as styrenes and methacrylates were chemically modified by fluorination and/or copolymerization to render them soluble in SC CO2. Siloxane copolymers and siloxane-modified methacrylates were examined as well. The preliminary findings reported here indicate the feasibility of using SC CO2 for photoresist developing.

  7. A Bayesian modelling method for post-processing daily sub-seasonal to seasonal rainfall forecasts from global climate models and evaluation for 12 Australian catchments

    Directory of Open Access Journals (Sweden)

    A. Schepen

    2018-03-01

    Rainfall forecasts are an integral part of hydrological forecasting systems at sub-seasonal to seasonal timescales. In seasonal forecasting, global climate models (GCMs) are now the go-to source for rainfall forecasts. For hydrological applications, however, GCM forecasts are often biased and unreliable in uncertainty spread, and calibration is therefore required before use. There are sophisticated statistical techniques for calibrating monthly and seasonal aggregations of the forecasts. However, calibration of seasonal forecasts at the daily time step typically uses very simple statistical methods or climate analogue methods. These methods generally lack the sophistication to achieve unbiased, reliable and coherent forecasts of daily amounts and seasonal accumulated totals. In this study, we propose and evaluate a Rainfall Post-Processing method for Seasonal forecasts (RPP-S), which is based on the Bayesian joint probability modelling approach for calibrating daily forecasts and the Schaake Shuffle for connecting the daily ensemble members of different lead times. We apply the method to post-process ACCESS-S forecasts for 12 perennial and ephemeral catchments across Australia and for 12 initialisation dates. RPP-S significantly reduces bias in raw forecasts and improves both skill and reliability. RPP-S forecasts are also more skilful and reliable than forecasts derived from ACCESS-S forecasts that have been post-processed using quantile mapping, especially for monthly and seasonal accumulations. Several opportunities to improve the robustness and skill of RPP-S are identified. The new RPP-S post-processed forecasts will be used in ensemble sub-seasonal to seasonal streamflow applications.
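
    The Schaake Shuffle step mentioned above can be sketched in a few lines: independently calibrated ensemble values at each lead time are reordered so that their ranks follow a historical template, restoring a plausible temporal correlation structure. This is a minimal sketch, not the paper's implementation.

```python
import numpy as np

def schaake_shuffle(fcst, template):
    """fcst, template: (n_members, n_leadtimes) arrays.
    Reorder each lead time's ensemble so its ranks copy the template's."""
    out = np.empty_like(fcst, dtype=float)
    for t in range(fcst.shape[1]):
        ranks = np.argsort(np.argsort(template[:, t]))  # rank of each template member
        out[:, t] = np.sort(fcst[:, t])[ranks]          # sorted forecasts placed by rank
    return out
```

    The marginal distribution at each lead time is untouched; only the ordering of members across lead times changes, which is exactly what makes daily amounts accumulate coherently into seasonal totals.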

  8. Development of Electronic Nose and Near Infrared Spectroscopy Analysis Techniques to Monitor the Critical Time in SSF Process of Feed Protein

    Directory of Open Access Journals (Sweden)

    Hui Jiang

    2014-10-01

    In order to assure the consistency of final product quality, fast and effective process monitoring is a growing need in the solid state fermentation (SSF) industry. This work investigated the potential of non-invasive techniques combined with chemometric methods to monitor time-related changes that occur during the SSF process of feed protein. Four fermentation trials were monitored by an electronic nose device and a near infrared spectroscopy (NIRS) spectrometer. Firstly, principal component analysis (PCA) and independent component analysis (ICA) were applied to feature extraction and information fusion, respectively. Then, the BP_AdaBoost algorithm was used to develop the fused model for monitoring the critical time in the SSF process of feed protein. Experimental results showed that the identification results of the fusion model were much better than those of the single-technique models in both the training and validation sets, and the complexity of the fusion model was also lower than that of the single-technique models. The overall results demonstrate that integrating electronic nose and NIRS techniques has high potential for online monitoring of the critical time in the SSF process, and that data fusion from multiple techniques can significantly improve monitoring performance.
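
    The processing chain (PCA per instrument, ICA for fusion, then a boosted classifier) can be sketched with scikit-learn. The data below are synthetic stand-ins for e-nose and NIRS measurements, and a default tree-based AdaBoost replaces the paper's BP_AdaBoost.

```python
import numpy as np
from sklearn.decomposition import PCA, FastICA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# synthetic stand-ins: 3 fermentation phases, e-nose (16 sensors), NIRS (60 wavelengths)
n_per, phases = 60, 3
y = np.repeat(np.arange(phases), n_per)
nose = rng.normal(0, 1, (phases * n_per, 16)) + y[:, None] * 1.5
nirs = rng.normal(0, 1, (phases * n_per, 60)) + y[:, None] * 1.5

# feature extraction per instrument (PCA), then fusion of the pooled features (ICA)
f_nose = PCA(n_components=5, random_state=0).fit_transform(nose)
f_nirs = PCA(n_components=5, random_state=0).fit_transform(nirs)
fused = FastICA(n_components=6, random_state=0, max_iter=1000).fit_transform(
    np.hstack([f_nose, f_nirs]))

Xtr, Xte, ytr, yte = train_test_split(fused, y, test_size=0.3, random_state=0, stratify=y)
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(Xtr, ytr)
acc = clf.score(Xte, yte)  # phase-identification accuracy on held-out samples
```

    With well-separated synthetic phases the fused classifier identifies the fermentation stage almost perfectly; on real SSF data the gain of fusion over single-technique models is what the paper quantifies.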

  9. Performance evaluation of WDXRF as a process control technique for MOX fuel fabrication

    International Nuclear Information System (INIS)

    Pandey, A.; Khan, F.A.; Das, D.K.; Behere, P.G.; Afzal, Mohd

    2015-01-01

    This paper presents studies on Wavelength Dispersive X-Ray Fluorescence (WDXRF) as a powerful non-destructive technique (NDT) for the compositional analysis of various types of MOX fuels. Samples taken after the mixing and milling of UO2 and PuO2 powder are analysed for plutonium content as a process control step in the fabrication of (U, Pu)O2 mixed oxide (MOX) fuel. A WDXRF method was established as a process control technique for the characterization of heavy metals in various MOX fuels. The attractiveness of our system is that it can analyse samples in both solid and liquid form. The system is adapted in a glove box for the handling of plutonium-based fuels. The glove-box-adapted system was optimized with uranium- and thorium-based MOX samples before the introduction of Pu. Uranium oxide and thorium oxide were estimated in uranium-thorium MOX samples. Standard deviations for the analysis of U3O8 and ThO2 were found to be 0.14 and 0.15, respectively. The results are validated against conventional wet chemical methods of analysis. (author)

  10. Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes

    Directory of Open Access Journals (Sweden)

    Jose M. Bernal-de-Lázaro

    2016-05-01

    This article summarizes the main contributions of the PhD thesis titled "Application of learning techniques based on kernel methods for the fault diagnosis in industrial processes". The thesis focuses on the analysis and design of fault diagnosis systems (DDF) based on historical data. Specifically, the thesis provides: (1) new criteria for adjusting the kernel methods used to select features with high discriminative capacity for fault diagnosis tasks; (2) a process monitoring approach using multivariate statistical techniques that incorporates reinforced information on the dynamics of the Hotelling's T2 and SPE statistics, whose combination with kernel methods improves the detection of small-magnitude faults; and (3) a robustness index for comparing the performance of diagnosis classifiers, taking into account their insensitivity to possible noise and disturbance in historical data.
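
    A minimal sketch of the T2/SPE monitoring statistics referred to in (2), assuming a plain PCA model fitted on normal-operation data (the thesis's kernel extensions and control limits are omitted):

```python
import numpy as np

def pca_t2_spe(X_train, X_new, n_comp):
    # standardize, fit PCA by SVD on normal data, then score new samples
    mu, sd = X_train.mean(0), X_train.std(0)
    Z = (X_train - mu) / sd
    _, S, Vt = np.linalg.svd(Z, full_matrices=False)
    P = Vt[:n_comp].T                           # retained loadings
    lam = S[:n_comp] ** 2 / (len(Z) - 1)        # variances of retained scores
    Zn = (np.atleast_2d(X_new) - mu) / sd
    T = Zn @ P
    t2 = np.sum(T ** 2 / lam, axis=1)           # Hotelling's T2: distance inside the model
    spe = np.sum((Zn - T @ P.T) ** 2, axis=1)   # SPE (Q): residual outside the model
    return t2, spe

# demo: 4 correlated variables; a fault shifts the pattern off the normal structure
rng = np.random.default_rng(3)
b = rng.normal(0, 1, (500, 2))
X_train = np.hstack([b, b + rng.normal(0, 0.1, (500, 2))])
bn = rng.normal(0, 1, (50, 2))
normal = np.hstack([bn, bn + rng.normal(0, 0.1, (50, 2))])
fault = normal + np.array([3.0, 0.0, 0.0, 3.0])
t2n, spen = pca_t2_spe(X_train, normal, 2)
t2f, spef = pca_t2_spe(X_train, fault, 2)
```

    T2 flags deviation within the retained components, SPE flags breakage of the correlation structure; combining both is what makes small-magnitude faults easier to detect.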

  11. Modified technique for processing multiangle lidar data measured in clear and moderately polluted atmospheres

    Science.gov (United States)

    Vladimir Kovalev; Cyle Wold; Alexander Petkov; Wei Min Hao

    2011-01-01

    We present a modified technique for processing multiangle lidar data that is applicable for relatively clear atmospheres, where the utilization of the conventional Kano-Hamilton method meets significant issues. Our retrieval algorithm allows computing the two-way transmission and the corresponding extinction-coefficient profile in any slope direction searched during...

  12. Performance enhancement of various real-time image processing techniques via speculative execution

    Science.gov (United States)

    Younis, Mohamed F.; Sinha, Purnendu; Marlowe, Thomas J.; Stoyenko, Alexander D.

    1996-03-01

    In real-time image processing, an application must satisfy a set of timing constraints while ensuring the semantic correctness of the system. Because of the natural structure of digital data, pure data and task parallelism have been used extensively in real-time image processing to accelerate the handling time of image data. These types of parallelism are based on splitting the execution load performed by a single processor across multiple nodes. However, execution of all parallel threads is mandatory for correctness of the algorithm. On the other hand, speculative execution is an optimistic execution of part(s) of the program based on assumptions on program control flow or variable values. Rollback may be required if the assumptions turn out to be invalid. Speculative execution can enhance average, and sometimes worst-case, execution time. In this paper, we target various image processing techniques to investigate applicability of speculative execution. We identify opportunities for safe and profitable speculative execution in image compression, edge detection, morphological filters, and blob recognition.
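
    As an illustration of the idea (not the paper's implementation), the sketch below runs an optimistic branch concurrently with the check of its control-flow assumption and rolls back to a safe branch if the assumption fails; the image and thresholds are made up.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def speculative(optimistic, validate, fallback):
    # start the optimistic path immediately; validate the assumption in parallel;
    # on mis-speculation, discard the result and recompute via the safe path
    with ThreadPoolExecutor(max_workers=2) as ex:
        fut = ex.submit(optimistic)
        ok = validate()
        result = fut.result()
    return result if ok else fallback()

# toy example: speculatively threshold an image assuming sparse foreground
img = np.zeros((8, 8)); img[2:4, 2:4] = 255.0
fast = lambda: img > 128                   # cheap fixed-threshold path
check = lambda: (img > 128).mean() < 0.5   # assumption: foreground is sparse
slow = lambda: img > img.mean()            # general path used on rollback
mask = speculative(fast, check, slow)
```

    The payoff is average-case speed: when the assumption usually holds, the validated speculative result is ready by the time the check completes, and only mis-speculations pay for the rollback.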

  13. Distinguishing the cognitive processes of mindfulness: Developing a standardised mindfulness technique for use in longitudinal randomised control trials.

    Science.gov (United States)

    Isbel, Ben; Summers, Mathew J

    2017-07-01

    A capacity model of mindfulness is adopted to differentiate the cognitive faculty of mindfulness from the metacognitive processes required to cultivate this faculty in mindfulness training. The model provides an explanatory framework incorporating both the developmental progression from focussed attention to open monitoring styles of mindfulness practice, along with the development of equanimity and insight. A standardised technique for activating these processes without the addition of secondary components is then introduced. Mindfulness-based interventions currently available for use in randomised control trials introduce components ancillary to the cognitive processes of mindfulness, limiting their ability to draw clear causative inferences. The standardised technique presented here does not introduce such ancillary factors, rendering it a valuable tool with which to investigate the processes activated in mindfulness practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Comparative of signal processing techniques for micro-Doppler signature extraction with automotive radar systems

    Science.gov (United States)

    Rodriguez-Hervas, Berta; Maile, Michael; Flores, Benjamin C.

    2014-05-01

    In recent years, the automotive industry has experienced an evolution toward more powerful driver assistance systems that provide enhanced vehicle safety. These systems typically operate in the optical and microwave regions of the electromagnetic spectrum and have demonstrated high efficiency in collision and risk avoidance. Microwave radar systems are particularly relevant due to their operational robustness under adverse weather or illumination conditions. Our objective is to study different signal processing techniques suitable for the extraction of accurate micro-Doppler signatures of slow moving objects in dense urban environments. Selection of the appropriate signal processing technique is crucial for the extraction of accurate micro-Doppler signatures that will lead to better results in a radar classifier system. For this purpose, we perform simulations of typical radar detection responses in common driving situations and conduct the analysis with several signal processing algorithms, including the short-time Fourier transform, the continuous wavelet transform, and kernel-based analysis methods. We take into account factors such as the relative movement between the host vehicle and the target, and the non-stationary nature of the target's movement. A comparison of results reveals that the short-time Fourier transform would be the best approach for detection and tracking purposes, while the continuous wavelet transform would be best suited for classification purposes.
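
    A short-time Fourier transform sketch of a micro-Doppler ridge, applied to a simulated sinusoidally modulated return (carrier, micro-motion rate, and deviation are illustrative values, not the paper's simulation parameters):

```python
import numpy as np
from scipy.signal import stft

fs = 1024.0                                  # sample rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)
f0, fm, dev = 200.0, 2.0, 50.0               # carrier, micro-motion rate, Doppler deviation
# instantaneous frequency f0 + dev*sin(2*pi*fm*t); integrate it to get the phase
phase = 2.0 * np.pi * f0 * t - (dev / fm) * np.cos(2.0 * np.pi * fm * t)
x = np.cos(phase)

f, frames, Z = stft(x, fs=fs, nperseg=128)   # 8 Hz frequency resolution
track = f[np.argmax(np.abs(Z), axis=0)]      # micro-Doppler ridge: peak frequency per frame
```

    The ridge oscillates around the carrier with the micro-motion period; the nperseg choice is the usual STFT trade-off between time and frequency resolution that the comparison in the paper turns on.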

  15. Experimental techniques for cement hydration studies

    Directory of Open Access Journals (Sweden)

    Andreas Luttge

    2011-10-01

    Cement hydration kinetics is a complex problem of dissolution, nucleation and growth that is still not well understood, particularly in a quantitative way. While cement systems are unique in certain aspects, they are also comparable to natural mineral systems. Therefore, geochemistry, and particularly the study of mineral dissolution and growth, may be able to provide insight and methods that can be utilized in cement hydration research. Here, we review mainly what is not known or what is currently used and applied in a problematic way. Examples are the typical Avrami approach, the application of Transition State Theory (TST) to overall reaction kinetics, and the problem of reactive surface area. Finally, we suggest an integrated approach that combines vertical scanning interferometry (VSI) with other sophisticated analytical techniques such as atomic force microscopy (AFM) and theoretical model calculations based on a stochastic treatment.
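
    For reference, the "typical Avrami approach" the authors critique models the reacted fraction with the JMAK equation; a minimal implementation (k and n are illustrative fit parameters):

```python
import numpy as np

def avrami(t, k, n):
    # JMAK / Avrami equation: fraction reacted X(t) = 1 - exp(-k * t**n)
    return 1.0 - np.exp(-k * np.asarray(t, dtype=float) ** n)
```

    Fitting k and n to calorimetry data yields an overall kinetic curve, but, as the review argues, it lumps dissolution, nucleation, and growth into a single empirical expression.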

  16. Comparison of optimization techniques for MRR and surface roughness in wire EDM process for gear cutting

    Directory of Open Access Journals (Sweden)

    K.D. Mohapatra

    2016-11-01

    The objective of the present work is to use a suitable method to optimize process parameters such as pulse on time (TON), pulse off time (TOFF), wire feed rate (WF), wire tension (WT) and servo voltage (SV) so as to attain the maximum value of MRR and the minimum value of surface roughness during the production of a fine-pitch spur gear made of copper. The spur gear has a pressure angle of 20° and a pitch circle diameter of 70 mm. The wire has a diameter of 0.25 mm and is made of brass. Experiments were conducted according to Taguchi's orthogonal array concept with five factors and two levels. The Taguchi quality loss design technique was used to optimize the output responses obtained from the experiments. Another optimization technique, desirability with grey Taguchi, was also used to optimize the process parameters. The two optimized results are compared to find the best combination of MRR and surface roughness. A confirmation test was carried out to identify the significant improvement in machining performance in the case of Taguchi quality loss. Finally, it was concluded that desirability with grey Taguchi produced a better result for MRR, while Taguchi quality loss gave a better result for surface roughness. The quality of the wire after the cutting operation is presented in the scanning electron microscopy (SEM) figure.
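
    The Taguchi signal-to-noise computations behind such an analysis reduce to two standard formulas: larger-the-better for MRR and smaller-the-better for surface roughness. A minimal sketch:

```python
import numpy as np

def sn_larger_is_better(y):
    # Taguchi S/N for responses to maximise (e.g. MRR)
    return -10.0 * np.log10(np.mean(1.0 / np.asarray(y, dtype=float) ** 2))

def sn_smaller_is_better(y):
    # Taguchi S/N for responses to minimise (e.g. surface roughness)
    return -10.0 * np.log10(np.mean(np.asarray(y, dtype=float) ** 2))
```

    For each control factor, the level with the higher mean S/N across the orthogonal-array runs is preferred, which is how the optimal parameter combination is read off.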

  17. Building Models in the Classroom: Taking Advantage of Sophisticated Geomorphic Numerical Tools Using a Simple Graphical User Interface

    Science.gov (United States)

    Roy, S. G.; Koons, P. O.; Gerbi, C. C.; Capps, D. K.; Tucker, G. E.; Rogers, Z. A.

    2014-12-01

    Sophisticated numerical tools exist for modeling geomorphic processes and linking them to tectonic and climatic systems, but they are often seen as inaccessible for users with an exploratory level of interest. We have improved the accessibility of landscape evolution models by producing a simple graphics user interface (GUI) that takes advantage of the Channel-Hillslope Integrated Landscape Development (CHILD) model. Model access is flexible: the user can edit values for basic geomorphic, tectonic, and climate parameters, or obtain greater control by defining the spatiotemporal distributions of those parameters. Users can make educated predictions by choosing their own parametric values for the governing equations and interpreting the results immediately through model graphics. This method of modeling allows users to iteratively build their understanding through experimentation. Use of this GUI is intended for inquiry and discovery-based learning activities. We discuss a number of examples of how the GUI can be used at the upper high school, introductory university, and advanced university level. Effective teaching modules initially focus on an inquiry-based example guided by the instructor. As students become familiar with the GUI and the CHILD model, the class can shift to more student-centered exploration and experimentation. To make model interpretations more robust, digital elevation models can be imported and direct comparisons can be made between CHILD model results and natural topography. The GUI is available online through the University of Maine's Earth and Climate Sciences website, through the Community Surface Dynamics Modeling System (CSDMS) model repository, or by contacting the corresponding author.

  18. Application of non-destructive liner thickness measurement technique for manufacturing and inspection process of zirconium lined cladding tube

    International Nuclear Information System (INIS)

    Nakazawa, Norio; Fukuda, Akihiro; Fujii, Noritsugu; Inoue, Koichi

    1986-01-01

    Recently, in order to meet fluctuations in electric power demand, large-scale load-following operation has become necessary. Cladding tubes that withstand power variation have therefore been developed; as a result, zirconium-lined Zircaloy-2 cladding tubes are now available. To reduce the sensitivity to stress corrosion cracking, these zirconium-lined cladding tubes require a uniform liner thickness over the whole surface and whole length. Kobe Steel Ltd. developed a non-destructive liner thickness measuring technique based on ultrasonic and eddy current flaw detection. This equipment was applied to the manufacturing and inspection processes of the zirconium-lined cladding tubes and has demonstrated its superiority in the control and assurance of the liner thickness of products. Zirconium-lined cladding tubes, the development of the measuring technique for guaranteeing uniform liner thickness, and liner thickness control in the manufacturing and inspection processes are described. (Kako, I.)
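
    The ultrasonic part of such a measurement reduces to a pulse-echo time-of-flight calculation; a minimal sketch (the 4650 m/s longitudinal velocity for zirconium is an assumed, calibration-dependent value):

```python
def liner_thickness_um(tof_ns, velocity_m_per_s=4650.0):
    # pulse-echo: the pulse crosses the liner twice (down and back),
    # so thickness = velocity * time-of-flight / 2
    thickness_m = velocity_m_per_s * (tof_ns * 1e-9) / 2.0
    return thickness_m * 1e6  # metres -> micrometres
```

    An echo delay of 40 ns, for example, corresponds to a 93 µm liner at this assumed velocity; in practice the velocity is calibrated against a standard of known thickness.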

  19. Comparing the Cognitive Process of Circular Causality in Two Patients with Strokes through Qualitative Analysis.

    Science.gov (United States)

    Derakhshanrad, Seyed Alireza; Piven, Emily; Ghoochani, Bahareh Zeynalzadeh

    2017-10-01

    Walter J. Freeman pioneered the neurodynamic model of brain activity when he described the brain dynamics of cognitive information transfer as a process of circular causality at the intention, meaning, and perception (IMP) levels. This view contributed substantially to the establishment of the Intention, Meaning, and Perception Model of Neuro-occupation in occupational therapy. As described by the model, the IMP levels are three components of the brain dynamics system, with nonlinear connections that enable cognitive function to be processed in a circular causality fashion, known as the Cognitive Process of Circular Causality (CPCC). Although considerable research has been devoted to studying brain dynamics with sophisticated computerized imaging techniques, less attention has been paid to studying it by investigating the adaptation process of thoughts and behaviors. To explore how CPCC manifested in thinking and behavioral patterns, a qualitative case study was conducted on two matched female participants with strokes, who were of comparable ages, affected sides, and other characteristics, except for their resilience and motivational behaviors. CPCC was compared by matrix analysis between the two participants, using content analysis with pre-determined categories. Different patterns of thinking and behavior may have arisen from disparate regulation of CPCC between the two participants.

  20. Exponential models applied to automated processing of radioimmunoassay standard curves

    International Nuclear Information System (INIS)

    Morin, J.F.; Savina, A.; Caroff, J.; Miossec, J.; Legendre, J.M.; Jacolot, G.; Morin, P.P.

    1979-01-01

    An improved computer procedure is described for fitting radioimmunoassay standard curves by means of an exponential model on a desk-top calculator. This method has been applied to a variety of radioassays, and the results are in accordance with those obtained with more sophisticated models.

  1. Rapid, low-cost, image analysis through video processing

    International Nuclear Information System (INIS)

    Levinson, R.A.; Marrs, R.W.; Grantham, D.G.

    1976-01-01

    Remote sensing now provides the data necessary to solve many resource problems. However, many of the complex image processing and analysis functions used in analysis of remotely-sensed data are accomplished using sophisticated image analysis equipment. The high cost of this equipment places many of these techniques beyond the means of most users. A new, more economical, video system capable of performing complex image analysis has now been developed. This report describes the functions, components, and operation of that system. Processing capability of the new video image analysis system includes many of the tasks previously accomplished with optical projectors and digital computers. Video capabilities include: color separation, color addition/subtraction, contrast stretch, dark level adjustment, density analysis, edge enhancement, scale matching, image mixing (addition and subtraction), image ratioing, and construction of false-color composite images. Rapid input of non-digital image data, instantaneous processing and display, relatively low initial cost, and low operating cost give the video system a competitive advantage over digital equipment. Complex pre-processing, pattern recognition, and statistical analyses must still be handled through digital computer systems. The video system at the University of Wyoming has undergone extensive testing, comparison to other systems, and has been used successfully in practical applications ranging from analysis of x-rays and thin sections to production of color composite ratios of multispectral imagery. Potential applications are discussed including uranium exploration, petroleum exploration, tectonic studies, geologic mapping, hydrology, sedimentology and petrography, anthropology, and studies on vegetation and wildlife habitat.
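
    Two of the listed operations, contrast stretching and image ratioing, are easy to sketch with NumPy as digital analogues of the video system's functions:

```python
import numpy as np

def contrast_stretch(img, lo_pct=2.0, hi_pct=98.0):
    # linear stretch between two percentiles, clipped to the 8-bit range
    lo, hi = np.percentile(img, [lo_pct, hi_pct])
    out = np.clip((img - lo) / (hi - lo), 0.0, 1.0)
    return (out * 255.0).astype(np.uint8)

def band_ratio(band_a, band_b, eps=1e-6):
    # pixel-wise ratio of two spectral bands, guarded against divide-by-zero
    return band_a.astype(float) / (band_b.astype(float) + eps)
```

    Band ratios suppress illumination differences and highlight spectral contrasts, which is why ratio composites are useful in mineral and vegetation mapping.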

  2. Devices, materials, and processes for nano-electronics: characterization with advanced X-ray techniques using lab-based and synchrotron radiation sources

    International Nuclear Information System (INIS)

    Zschech, E.; Wyon, C.; Murray, C.E.; Schneider, G.

    2011-01-01

    Future nano-electronics manufacturing at extraordinary length scales, new device structures, and advanced materials will provide challenges to process development and engineering but also to process control and physical failure analysis. Advanced X-ray techniques, using lab systems and synchrotron radiation sources, will play a key role for the characterization of thin films, nano-structures, surfaces, and interfaces. The development of advanced X-ray techniques and tools will reduce risk and time for the introduction of new technologies. Eventually, time-to-market for new products will be reduced by the timely implementation of the best techniques for process development and process control. The development and use of advanced methods at synchrotron radiation sources will be increasingly important, particularly for research and development in the field of advanced processes and new materials but also for the development of new X-ray components and procedures. The application of advanced X-ray techniques, in-line, in out-of-fab analytical labs and at synchrotron radiation sources, for research, development, and manufacturing in the nano-electronics industry is reviewed. The focus of this paper is on the study of nano-scale device and on-chip interconnect materials, and materials for 3D IC integration as well. (authors)

  3. Enhancing Student Learning of Enterprise Integration and Business Process Orientation through an ERP Business Simulation Game

    Science.gov (United States)

    Seethamraju, Ravi

    2011-01-01

    The sophistication of the integrated world of work and increased recognition of business processes as critical corporate assets require graduates to develop "process orientation" and an "integrated view" of business. Responding to these dynamic changes in business organizations, business schools are also continuing to modify…

  4. First of all: Do not harm! Use of simulation for the training of regional anaesthesia techniques: Which skills can be trained without the patient as substitute for a mannequin.

    Science.gov (United States)

    Sujatta, Susanne

    2015-03-01

    The character of clinical skills training is always influenced by technical improvement and cultural change. Over the last years, two trends have changed the traditional apprenticeship-style training in regional anaesthesia: firstly, the development of ultrasound-guided regional anaesthesia, and secondly, the reduced acceptance of using patients as mannequins for invasive techniques. Against this background, simulation techniques are explored, ranging from simple low-fidelity part-task training models for practising needle handling to highly sophisticated virtual reality models – the full range is covered. This review discusses all available options with their benefits and drawbacks. The task in clinical practice lies in choosing the right level of sophistication for the desired approach and trainee level. However, the transfer of simulated skills to clinical practice has not been evaluated, and it remains to be proven whether simulation-trained skills can ultimately reduce the risk to patients. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. Reducing the absorbed dose in analogue radiography of infant chest images by improving the image quality, using image processing techniques

    International Nuclear Information System (INIS)

    Karimian, A.; Yazdani, S.; Askari, M. A.

    2011-01-01

    Radiographic inspection is one of the most widely employed medical testing techniques. Because of the poor contrast and high unsharpness of radiographic film images, converting radiographs to a digital format and applying digital image processing is the best way to enhance image quality and assist the interpreter in their evaluation. In this research, radiographic films of 70 infant chest images with defects of different sizes were selected. The chest images were digitised and processed with two classes of algorithms: (i) spatial-domain and (ii) frequency-domain techniques. The MATLAB environment was used for the digital processing. Our results showed that with these two techniques, defects of small dimensions are detectable. The suggested techniques may therefore help medical specialists to diagnose defects at an early stage and help to prevent repeat X-ray examinations of paediatric patients. (authors)
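
    The two classes of enhancement can be illustrated with NumPy: histogram equalization as a spatial-domain example and a hard high-pass mask as a frequency-domain example (simplified sketches, not the paper's MATLAB routines):

```python
import numpy as np

def hist_equalize(img):
    # spatial domain: spread the grey-level histogram of an 8-bit image
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum().astype(float)
    cdf_min = cdf[np.nonzero(hist)[0][0]]          # cdf at the lowest occurring level
    lut = np.round((cdf - cdf_min) / (cdf[-1] - cdf_min + 1e-12) * 255.0)
    return np.clip(lut, 0, 255).astype(np.uint8)[img]

def highpass(img, cutoff=0.1):
    # frequency domain: zero low frequencies to emphasise fine detail
    F = np.fft.fftshift(np.fft.fft2(img.astype(float)))
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    r = np.hypot(y - h / 2.0, x - w / 2.0) / max(h, w)
    return np.real(np.fft.ifft2(np.fft.ifftshift(F * (r > cutoff))))
```

    Equalization restores contrast in low-contrast film scans, while high-pass filtering emphasises edges; in practice a smooth filter (e.g. Butterworth) replaces the hard mask to avoid ringing.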

  6. An intelligent signal processing and pattern recognition technique for defect identification using an active sensor network

    Science.gov (United States)

    Su, Zhongqing; Ye, Lin

    2004-08-01

    The practical utilization of elastic waves, e.g. Rayleigh-Lamb waves, in high-performance structural health monitoring techniques is somewhat impeded due to the complicated wave dispersion phenomena, the existence of multiple wave modes, the high susceptibility to diverse interferences, the bulky sampled data and the difficulty in signal interpretation. An intelligent signal processing and pattern recognition (ISPPR) approach using the wavelet transform and artificial neural network algorithms was developed; this was actualized in a signal processing package (SPP). The ISPPR technique comprehensively functions as signal filtration, data compression, characteristic extraction, information mapping and pattern recognition, capable of extracting essential yet concise features from acquired raw wave signals and further assisting in structural health evaluation. For validation, the SPP was applied to the prediction of crack growth in an alloy structural beam and construction of a damage parameter database for defect identification in CF/EP composite structures. It was clearly apparent that the elastic wave propagation-based damage assessment could be dramatically streamlined by introduction of the ISPPR technique.
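
    The wavelet-based filtration and data-compression stage of such an ISPPR pipeline can be sketched with a plain Haar transform in NumPy (a toy stand-in for the wavelet processing described, not the authors' SPP code):

```python
import numpy as np

def haar_step(x):
    # one level of the Haar wavelet transform (even-length input)
    x = np.asarray(x, dtype=float)
    a = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation (low-pass)
    d = (x[0::2] - x[1::2]) / np.sqrt(2.0)   # detail (high-pass)
    return a, d

def wavelet_compress(x, levels=3, keep=0.1):
    # multi-level decomposition, then zero all but the largest coefficients --
    # the filtration/compression step before feature extraction
    a, details = np.asarray(x, dtype=float), []
    for _ in range(levels):
        a, d = haar_step(a)
        details.append(d)
    coeffs = np.concatenate([a] + details[::-1])
    thr = np.quantile(np.abs(coeffs), 1.0 - keep)
    return np.where(np.abs(coeffs) >= thr, coeffs, 0.0)
```

    The surviving coefficients form a compact feature vector of the kind that can then be mapped to damage classes by a neural network.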

  7. Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

    CERN Document Server

    Althouse, L P

    1979-01-01

    Study of the aging processes in polyurethane adhesives using thermal treatment and differential calorimetric, dielectric, and mechanical techniques ; 1, identifying the aging processes ; 2, quantifying the aging effect

  8. Impurities in sugar cane and their influence on industrial processing evaluated by nuclear techniques

    International Nuclear Information System (INIS)

    Bacchi, M.A.; Fernandes, E.A.N.; Ferraz, E.S.B.

    1990-01-01

    During the cutting and loading operations, impurities, mainly soil, are added to sugar cane in amounts that can impair industrial processing due to excessive wear of metallic members and contamination of juice and bagasse. Mechanization of loading operation has showed a considerable enhancement of the impurity content, leading to the improvement of cane washing technology. Nevertheless, for a correct understanding of the problem and the process optimization, it is necessary and exact and fast quantification of these impurities as well as of its consequences. Nuclear techniques, in special neutron activation analysis, have been proved to be appropriate for estimating soil level in sugar cane, washing process efficiency and wearing of cases and moving parts. (author)

  9. DATA ANALYSIS TECHNIQUES IN SERVICE QUALITY LITERATURE: ESSENTIALS AND ADVANCES

    Directory of Open Access Journals (Sweden)

    Mohammed naved Khan

    2013-05-01

    Academic and business researchers have long debated the most appropriate data analysis techniques for empirical research in the domain of services marketing. On the basis of an exhaustive review of the literature, the present paper attempts to provide a concise and schematic portrayal of the data analysis techniques generally followed in the service quality literature. Collectively, the extant literature suggests a growing trend among researchers to rely on higher-order multivariate techniques, viz. confirmatory factor analysis, structural equation modeling, etc., to generate and analyze complex models, while at times ignoring very basic and yet powerful procedures such as the mean, t-test, ANOVA, and correlation. The marked shift towards sophisticated analytical techniques can largely be attributed to competition within the community of researchers in the social sciences in general, and those working in the area of service quality in particular, as well as the growing demands of journal reviewers. From a pragmatic viewpoint, the paper should serve as a useful source of information and provide deeper insights to academic researchers, consultants, and practitioners interested in modelling patterns of service quality and arriving at optimal solutions to increasingly complex management problems.
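
    As a reminder of how far the "basic and yet powerful" procedures go, the snippet below runs a t-test, a one-way ANOVA, and a correlation on hypothetical service-quality scores with SciPy (data are simulated, not drawn from any study):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
# hypothetical service-quality scores from three customer segments
seg_a = rng.normal(4.1, 0.5, 80)
seg_b = rng.normal(3.7, 0.5, 80)
seg_c = rng.normal(3.9, 0.5, 80)

t_stat, p_t = stats.ttest_ind(seg_a, seg_b)        # do two segments differ in mean score?
f_stat, p_f = stats.f_oneway(seg_a, seg_b, seg_c)  # do any of the three segments differ?
satisfaction = 0.8 * seg_a + rng.normal(0.0, 0.2, 80)
r, p_r = stats.pearsonr(seg_a, satisfaction)       # quality-satisfaction association
```

    Three lines of analysis answer three distinct substantive questions; structural equation modelling is warranted only when the research question genuinely involves latent constructs and paths among them.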

  10. Security Transition Program Office (STPO), technology transfer of the STPO process, tools, and techniques

    Energy Technology Data Exchange (ETDEWEB)

    Hauth, J.T.; Forslund, C.R.J.; Underwood, J.A.

    1994-09-01

    In 1990, with the transition from a defense mission to environmental restoration, the U.S. Department of Energy's (DOE's) Hanford Site began a significant effort to diagnose, redesign, and implement new safeguards and security (SAS) processes. In 1992 the Security Transition Program Office (STPO) was formed to address the sweeping changes that were being identified. Comprised of SAS and other contractor staff with extensive experience and supported by staff experienced in organizational analysis and work process redesign, STPO undertook a series of tasks designed to make fundamental changes to SAS processes throughout the Hanford Site. The goal of STPO is to align the SAS work and organization with the new Site mission. This report describes the key strategy, tools, methods, and techniques used by STPO to change SAS processes at Hanford. A particular focus of this review is transferring STPO's experience to other DOE sites and federal agency efforts: that is, to extract, analyze, and provide a critical review of the approach, tools, and techniques used by STPO that will be useful to other DOE sites and national laboratories in transitioning from a defense production mode to environmental restoration and other missions. In particular, what lessons does STPO provide as a pilot study or model for implementing change in other transition activities throughout the DOE complex? More broadly, what theoretical and practical contributions do DOE transition efforts, such as STPO, provide to federal agency streamlining efforts and attempts to "reinvent" government enterprises in the public sector? The approach used by STPO should provide valuable information to those examining their own processes in light of new mission requirements.

  11. A New Screening Methodology for Improved Oil Recovery Processes Using Soft-Computing Techniques

    Science.gov (United States)

    Parada, Claudia; Ertekin, Turgay

    2010-05-01

    The first stage of production of any oil reservoir involves oil displacement by natural drive mechanisms such as solution gas drive, gas cap drive and gravity drainage. Typically, improved oil recovery (IOR) methods are applied to oil reservoirs that have been depleted naturally. In more recent years, IOR techniques have also been applied to reservoirs before their natural drive energy is exhausted by primary depletion. Descriptive screening criteria for IOR methods are used to select the appropriate recovery technique according to the fluid and rock properties. This methodology helps in assessing the most suitable recovery process for field deployment of a candidate reservoir. However, previously published screening guidelines neither provide information about the expected reservoir performance nor suggest a set of project design parameters that can be used toward optimization of the process. In this study, artificial neural networks (ANN) are used to build a high-performance neuro-simulation tool for screening different improved oil recovery techniques: miscible injection (CO2 and N2), waterflooding and steam injection processes. The simulation tool consists of proxy models that implement a multilayer cascade feedforward back propagation network algorithm. The tool is intended to narrow the range of possible scenarios to be modeled using conventional simulation, reducing the extensive time and energy spent in dynamic reservoir modeling. A commercial reservoir simulator is used to generate the data to train and validate the artificial neural networks. The proxy models are built considering four different well patterns with different well operating conditions as the field design parameters. A different expert system is developed for each well pattern. The screening networks predict oil production rate and cumulative oil production profiles for a given set of rock and fluid properties, and design parameters. The results of this study show that the networks are
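
    The proxy-model idea can be illustrated with a minimal sketch: a small feedforward network trained by plain backpropagation to map normalized reservoir and design parameters to a production estimate. The feature names, the toy target relation, and the network size below are illustrative assumptions, not the cascade architecture or data used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical normalized inputs: e.g. porosity, permeability, oil
# viscosity, well spacing (names are illustrative, not from the paper).
X = rng.uniform(0.0, 1.0, size=(200, 4))
y = (0.6 * X[:, 0] + 0.3 * X[:, 1] - 0.2 * X[:, 2]).reshape(-1, 1)  # toy target

# One hidden tanh layer with a linear output -- a stand-in for the
# multilayer cascade feedforward architecture described above.
W1 = rng.normal(0.0, 0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(0.0, 0.5, size=(8, 1)); b2 = np.zeros(1)

lr = 0.1
for _ in range(500):                       # full-batch gradient descent
    H = np.tanh(X @ W1 + b1)               # hidden activations
    pred = H @ W2 + b2                     # production estimate
    err = pred - y                         # backpropagate squared error
    gW2 = H.T @ err / len(X); gb2 = err.mean(axis=0)
    dH = (err @ W2.T) * (1.0 - H ** 2)     # tanh derivative
    gW1 = X.T @ dH / len(X); gb1 = dH.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

mse = float(np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2))
```

    In the study itself a commercial reservoir simulator supplies the training targets; the toy linear relation here only stands in for that data.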

  12. Quantitative identification and analysis of sub-seismic extensional structure system: technique schemes and processes

    International Nuclear Information System (INIS)

    Chenghua, Ou; Chen, Wei; Ma, Zhonggao

    2015-01-01

    Quantitative characterization of the complex sub-seismic extensional structure systems that essentially control petroleum exploitation is difficult to implement in seismic profile interpretation. This research, based on a case study in block M of Myanmar, established a set of quantitative treatment schemes and technique processes for the identification of sub-seismic low-displacement (SSLD) extensional faults or fractures through structural deformation restoration and geometric inversion. Firstly, the master-subsidiary inheritance relations and configuration of the seismic-scale extensional fault systems are determined by analyzing the structural pattern. Secondly, the three-dimensional (3D) pattern and characteristics of the seismic-scale extensional structure are illustrated by a 3D structural model built from seismic sections. Finally, based on the dilatancy obtained from structural restoration using the inclined shear method, as well as the fracture-flow index, potential SSLD extensional faults or fractures are quantitatively identified. Application of these technique processes to the sub-seismic low-displacement extensional structures in block M in Myanmar shows how SSLD extensional structure systems can be quantitatively interpreted in practice. (paper)

  13. Set Theory : Techniques and Applications : Curaçao 1995 and Barcelona 1996 Conferences

    CERN Document Server

    Larson, Jean; Bagaria, Joan; Mathias, A

    1998-01-01

    During the past 25 years, set theory has developed in several interesting directions. The most outstanding results cover the application of sophisticated techniques to problems in analysis, topology, infinitary combinatorics and other areas of mathematics. This book contains a selection of contributions, some of which are expository in nature, embracing various aspects of the latest developments. Amongst topics treated are forcing axioms and their applications, combinatorial principles used to construct models, and a variety of other set theoretical tools including inner models, partitions and trees. Audience: This book will be of interest to graduate students and researchers in foundational problems of mathematics.

  14. Qubit Manipulations Techniques for Trapped-Ion Quantum Information Processing

    Science.gov (United States)

    Gaebler, John; Tan, Ting; Lin, Yiheng; Bowler, Ryan; Jost, John; Meier, Adam; Knill, Emanuel; Leibfried, Dietrich; Wineland, David; Ion Storage Team

    2013-05-01

    We report recent results on qubit manipulation techniques for trapped ions toward scalable quantum information processing (QIP). We demonstrate a platform-independent benchmarking protocol for evaluating the performance of Clifford gates, which form a basis for fault-tolerant QIP. We report a demonstration of an entangling gate scheme proposed by Bermudez et al. [Phys. Rev. A 85, 040302 (2012)] and achieve a fidelity of 0.974(4). This scheme takes advantage of dynamical decoupling, which protects the qubit against dephasing errors. It can be applied directly to magnetic-field-insensitive states, and provides a number of simplifications in experimental implementation compared to some other entangling gates with trapped ions. We also report preliminary results on dissipative creation of entanglement with trapped ions. Creation of an entangled pair does not require discrete logic gates and thus could reduce the level of quantum-coherent control needed for large-scale QIP. Supported by IARPA, ARO contract No. EAO139840, ONR, and the NIST Quantum Information Program.

  15. Triaxial testing system for pressure core analysis using image processing technique

    Science.gov (United States)

    Yoneda, J.; Masui, A.; Tenma, N.; Nagao, J.

    2013-11-01

    In this study, a newly developed triaxial testing system for investigating the strength, deformation behavior, and/or permeability of gas-hydrate-bearing sediments in the deep sea is described. The system transports the pressure core from the storage chamber to the interior of the sealing sleeve of a triaxial cell without depressurization. An image processing technique was used to capture the motion and local deformation of a specimen in a transparent acrylic triaxial pressure cell, and digital photographs were obtained at each strain level during the compression test. The material strength was successfully measured and the failure mode was evaluated under high confining and pore water pressures.

  16. Time resolved techniques: An overview

    International Nuclear Information System (INIS)

    Larson, B.C.; Tischler, J.Z.

    1990-06-01

    Synchrotron sources provide exceptional opportunities for carrying out time-resolved x-ray diffraction investigations. The high intensity, high angular resolution, and continuously tunable energy spectrum of synchrotron x-ray beams lend themselves directly to carrying out sophisticated time-resolved x-ray scattering measurements on a wide range of materials and phenomena. When these attributes are coupled with the pulsed time-structure of synchrotron sources, entirely new time-resolved scattering possibilities are opened. Synchrotron beams typically consist of sub-nanosecond pulses of x-rays separated in time by a few tens of nanoseconds to a few hundred nanoseconds, so that these beams appear as continuous x-ray sources for investigations of phenomena on time scales ranging from hours down to microseconds. Studies requiring time resolution ranging from microseconds to fractions of a nanosecond can be carried out in a triggering mode by stimulating the phenomena under investigation in coincidence with the x-ray pulses. Time resolution on the picosecond scale can, in principle, be achieved through the use of streak camera techniques, in which the individual x-ray pulses are viewed as quasi-continuous sources of ∼100--200 picoseconds duration. Techniques for carrying out time-resolved scattering measurements on time scales varying from picoseconds to kiloseconds at present and proposed synchrotron sources are discussed, and examples of time-resolved studies are cited. 17 refs., 8 figs

  17. West Java Snack Mapping based on Snack Types, Main Ingredients, and Processing Techniques

    Science.gov (United States)

    Nurani, A. S.; Subekti, S.; Ana

    2016-04-01

    The research was motivated by the lack of literature on archipelago snacks, especially those from West Java. It aims to explore the snack types, processing techniques, and main ingredients in order to plan learning material on archipelago cakes, especially those from West Java. The research methods used are descriptive observation and interviews. The samples were randomly chosen from all regions in West Java. The findings identify traditional snacks from West Java, including: 1. snack types which are similar in all sampled regions, namely opak, rangginang, nagasari, aliagrem, cuhcur, keripik, semprong, wajit, dodol, kecimpring, combro, tape ketan, and surabi; typical regional snacks include burayot (Garut), simping kaum (Purwakarta), surabi hejo (Karawang), papais cisaat (Subang), papais moyong and opak bakar (Kuningan), opak oded and ranggesing (Sumedang), gapit and tapel (Cirebon), gulampo and kue aci (Tasikmalaya), wajit cililin and gurilem (West Bandung), and borondong (Bandung District); 2. various processing techniques, namely steaming, boiling, frying, caramelizing, baking, grilling, roasting, and sugaring; 3. various main ingredients, namely rice, local glutinous rice, rice flour, glutinous rice flour, starch, wheat flour, hunkue flour, cassava, sweet potato, banana, nuts, and corn; 4. a snack classification for West Java, namely (1) traditional snacks, (2) creation snacks, (3) modification snacks, and (4) outside-influence snacks.

  18. Future of radiation processing of polymers

    International Nuclear Information System (INIS)

    Chapiro, A.; Tabata, Y.; Stannett, V.; Dole, M.; Dobo, J.; Charlesby, A.

    1990-01-01

    The present development of radiation processing in the polymer field is discussed, including well-established technologies with large-scale production and substantial markets, such as crosslinking, curing of monomer-polymer formulations, and sterilization of plastic supplies. Also reviewed are the manufacture of sophisticated devices with low-volume production but large added value (electronic devices, resistors) and several promising applications for which only small commercial productions are on stream today (chain degradation, polymerization, graft copolymerization). (author)

  19. An Address Event Representation-Based Processing System for a Biped Robot

    Directory of Open Access Journals (Sweden)

    Uziel Jaramillo-Avila

    2016-02-01

    Full Text Available In recent years, several important advances have been made in the fields of both biologically inspired sensorial processing and locomotion systems, such as Address Event Representation-based cameras (or Dynamic Vision Sensors) and human-like robot locomotion, e.g., the walking of a biped robot. However, making these fields merge properly is not an easy task. In this regard, Neuromorphic Engineering is a fast-growing research field, the main goal of which is the biologically inspired design of hybrid hardware systems in order to mimic neural architectures and to process information in the manner of the brain. However, few robotic applications exist to illustrate them. The main goal of this work is to demonstrate, by creating a closed-loop system using only bio-inspired techniques, how such applications can work properly. We present an algorithm using Spiking Neural Networks (SNN) for a biped robot equipped with a Dynamic Vision Sensor, which is designed to follow a line drawn on the floor. Line following is a commonly used method for demonstrating control techniques; most implementations are fairly simple and do not require very sophisticated components, yet the task can still serve as a good test in more elaborate circumstances. In addition, the proposed locomotion system is able to coordinately control the six DOFs of a biped robot while switching between basic forms of movement. The latter has been implemented as an FPGA-based neuromorphic system. Numerical tests and hardware validation are presented.
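
    The event-driven control loop described above can be sketched with a single leaky integrate-and-fire (LIF) neuron per side of the image: whichever neuron spikes indicates where the line's DVS events are concentrated, and hence which way to steer. The event counts, neuron parameters, and input weight below are illustrative assumptions, far simpler than the SNN in the paper.

```python
def lif_step(v, i_in, tau=20.0, v_th=1.0, dt=1.0):
    """One Euler step of a leaky integrate-and-fire neuron.
    Returns (new membrane potential, spiked?)."""
    v = v + dt * (-v / tau + i_in)
    if v >= v_th:
        return 0.0, True           # reset to rest after a spike
    return v, False

# Hypothetical DVS event stream: counts of events falling left/right of
# the image centre in successive time bins (values are illustrative).
left_events  = [0, 2, 3, 3, 1, 0]
right_events = [1, 0, 0, 1, 3, 4]

v_l = v_r = 0.0
steering = []                      # +1 steer left, -1 steer right, 0 straight
for l, r in zip(left_events, right_events):
    v_l, s_l = lif_step(v_l, 0.4 * l)   # 0.4 = assumed synaptic weight
    v_r, s_r = lif_step(v_r, 0.4 * r)
    steering.append((1 if s_l else 0) - (1 if s_r else 0))
```

    A real implementation runs many such neurons in neuromorphic hardware; the point here is only the event-to-spike-to-action loop.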

  20. Modeling of outpatient prescribing process in iran: a gateway toward electronic prescribing system.

    Science.gov (United States)

    Ahmadi, Maryam; Samadbeik, Mahnaz; Sadoughi, Farahnaz

    2014-01-01

    Implementation of an electronic prescribing system can overcome many problems of the paper prescribing system and provide numerous opportunities for more effective and advantageous prescribing. Successful implementation of such a system requires a complete and deep understanding of the work content, human resources, and workflow of paper prescribing. The current study was designed to model the current business process of outpatient prescribing in Iran and clarify the different actions during this process. In order to describe the prescribing process and the system features in Iran, the methodology of business process modeling and analysis was used in the present study. The results of the process documentation were analyzed using a conceptual model of workflow elements and the technique of modeling "As-Is" business processes. Analysis of the current (as-is) prescribing process demonstrated that Iran stands at the first level of sophistication in the graduated levels of electronic prescribing, namely electronic prescription reference, and that there are problematic areas including bottlenecks, redundant and duplicated work, concentration of decision nodes, and communicative weaknesses among stakeholders of the process. Using information technology in some activities of medication prescription in Iran has not eliminated the dependence of the stakeholders on paper-based documents and prescriptions. Therefore, it is necessary to implement proper system programming in order to support change management and solve the problems in the existing prescribing process. To this end, a suitable basis should be provided for reorganization and improvement of the prescribing process for future electronic systems.

  1. Process Management Plans

    Directory of Open Access Journals (Sweden)

    Tomasz Miksa

    2014-07-01

    Full Text Available In the era of research infrastructures and big data, sophisticated data management practices are becoming essential building blocks of successful science. Most practices follow a data-centric approach, which does not take into account the processes that created, analysed and presented the data. This fact limits the possibilities for reliable verification of results. Furthermore, it does not guarantee the reuse of research, which is one of the key aspects of credible data-driven science. For that reason, we propose the introduction of the new concept of Process Management Plans, which focus on the identification, description, sharing and preservation of the entire scientific processes. They enable verification and later reuse of result data and processes of scientific experiments. In this paper we describe the structure and explain the novelty of Process Management Plans by showing in what way they complement existing Data Management Plans. We also highlight key differences, major advantages, as well as references to tools and solutions that can facilitate the introduction of Process Management Plans.

  2. The use of artificial intelligence techniques to improve the multiple payload integration process

    Science.gov (United States)

    Cutts, Dannie E.; Widgren, Brian K.

    1992-01-01

    A maximum return of science and products with a minimum expenditure of time and resources is a major goal of mission payload integration. A critical component then, in successful mission payload integration is the acquisition and analysis of experiment requirements from the principal investigator and payload element developer teams. One effort to use artificial intelligence techniques to improve the acquisition and analysis of experiment requirements within the payload integration process is described.

  3. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique.

    Science.gov (United States)

    Intarapanich, Apichart; Kaewkamnerd, Saowaluck; Shaw, Philip J; Ukosakit, Kittipat; Tragoonrung, Somvong; Tongsima, Sissades

    2015-01-01

    DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. 
This work presents an automated genotyping tool from DNA
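
    The three-step workflow (lane segmentation, band extraction, genotype classification) can be sketched on a synthetic gel image using simple column- and row-profile thresholding. This is a deliberate simplification: GELect itself must handle lane distortion, band deformity, and doublets, which plain thresholding does not.

```python
import numpy as np

# Synthetic "gel image": dark background, three bright lanes at
# hypothetical centre columns 5, 15 and 25.
img = np.zeros((40, 30))
for c in (5, 15, 25):
    img[:, c - 2:c + 3] = 0.2              # lane background intensity
    img[10, c - 2:c + 3] = 1.0             # one DNA band in every lane
    if c != 15:
        img[22, c - 2:c + 3] = 1.0         # middle lane lacks the second band

# Step 1 -- lane segmentation: contiguous runs of bright columns.
profile = img.sum(axis=0)
in_lane = profile > 0.5 * profile.max()
edges = np.flatnonzero(np.diff(in_lane.astype(int)))
lanes = [(int(edges[i]) + 1, int(edges[i + 1])) for i in range(0, len(edges), 2)]

# Step 2 -- band extraction: rows whose mean intensity stands out in a lane.
bands = []
for lo, hi in lanes:
    rows = img[:, lo:hi + 1].mean(axis=1)
    bands.append(np.flatnonzero(rows > 0.6).tolist())

# Step 3 -- genotyping would compare each lane's band rows against a
# clustered reference band set; here the band positions are the "calls".
```

    The column-profile threshold (half the maximum) and the band threshold (0.6) are illustrative choices for this clean synthetic image.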

  4. Automatic DNA Diagnosis for 1D Gel Electrophoresis Images using Bio-image Processing Technique

    Science.gov (United States)

    2015-01-01

    Background DNA gel electrophoresis is a molecular biology technique for separating different sizes of DNA fragments. Applications of DNA gel electrophoresis include DNA fingerprinting (genetic diagnosis), size estimation of DNA, and DNA separation for Southern blotting. Accurate interpretation of DNA banding patterns from electrophoretic images can be laborious and error prone when a large number of bands are interrogated manually. Although many bio-imaging techniques have been proposed, none of them can fully automate the typing of DNA owing to the complexities of migration patterns typically obtained. Results We developed an image-processing tool that automatically calls genotypes from DNA gel electrophoresis images. The image processing workflow comprises three main steps: 1) lane segmentation, 2) extraction of DNA bands and 3) band genotyping classification. The tool was originally intended to facilitate large-scale genotyping analysis of sugarcane cultivars. We tested the proposed tool on 10 gel images (433 cultivars) obtained from polyacrylamide gel electrophoresis (PAGE) of PCR amplicons for detecting intron length polymorphisms (ILP) on one locus of the sugarcanes. These gel images demonstrated many challenges in automated lane/band segmentation in image processing including lane distortion, band deformity, high degree of noise in the background, and bands that are very close together (doublets). Using the proposed bio-imaging workflow, lanes and DNA bands contained within are properly segmented, even for adjacent bands with aberrant migration that cannot be separated by conventional techniques. The software, called GELect, automatically performs genotype calling on each lane by comparing with an all-banding reference, which was created by clustering the existing bands into the non-redundant set of reference bands. The automated genotype calling results were verified by independent manual typing by molecular biologists. Conclusions This work presents an

  5. Statistical techniques for automating the detection of anomalous performance in rotating machinery

    International Nuclear Information System (INIS)

    Piety, K.R.; Magette, T.E.

    1978-01-01

    Surveillance techniques which extend the sophistication of existing automated monitoring systems for industrial rotating equipment are described. The monitoring system automatically established limiting criteria during an initial learning period of a few days; subsequently, while monitoring the test rotor during an extended period of normal operation, it experienced a false alarm rate of 0.5%. At the same time, the monitoring system successfully detected all fault types that were introduced into the test setup. Tests on real equipment are needed to provide final verification of the monitoring techniques. There are areas that would profit from additional investigation in the laboratory environment. A comparison of the relative value of alternate descriptors under given fault conditions would be worthwhile. This should be pursued in conjunction with extending the set of fault types available, e.g., bearing problems. Other tests should examine the effects of using fewer (more coarse) intervals to define the lumped operational states. Finally, techniques to diagnose the most probable fault should be developed by drawing upon the extensive data automatically logged by the monitoring system
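
    The automatically established limiting criteria can be sketched as a simple statistical band learned from normal-operation data; the 4-sigma width and the descriptor values below are illustrative choices, not the thresholds used in the report.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Learning period": a vibration descriptor sampled during normal operation
# (mean 5.0, spread 0.2 are made-up values for a healthy machine).
baseline = rng.normal(loc=5.0, scale=0.2, size=1000)

# Limiting criteria established automatically from the learning data.
mu, sigma = baseline.mean(), baseline.std()
lo, hi = mu - 4 * sigma, mu + 4 * sigma    # 4-sigma band, illustrative choice

def is_anomalous(sample):
    """Flag a descriptor value that falls outside the learned band."""
    return sample < lo or sample > hi

# During surveillance: 5.1 and 4.9 are normal, 6.5 is far outside the band.
alarms = [is_anomalous(x) for x in (5.1, 4.9, 6.5)]
```

    Widening the band lowers the false alarm rate at the cost of missing subtle faults; the 0.5% rate quoted above reflects exactly that trade-off.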

  6. Report of the subgroup on fast processing

    International Nuclear Information System (INIS)

    Gibbard, B.G.; Kirsch, L.E.; Moneti, G.; Plano, R.J.; Rabin, M.S.Z.; Willen, E.

    1977-01-01

    A study was made of the flow of data and the simultaneous processing needed to reduce the 10⁷ to 10⁸ triggers per second expected at ISABELLE to a number of events on the order of 10 to 100 per second which would be written on magnetic tape. It was assumed that within 100 ns of the event a fast pretrigger would have reduced the data rate to at most 10⁷ per second. At that point, data from all sense elements in the experiment would be fed into a 1-μs-long pipeline. Within the first 1 μs (while the data are in the first pipeline) another level of triggering would reduce the trigger rate to at most 10⁶ per second. The data would then be fed into a second pipeline which is 50 μs long. During the 50 μs that the data are in the second pipeline, a more sophisticated level of triggering (slow trigger) would reduce the trigger rate to a level that can be handled by standard data processing techniques (microprocessors or larger machines), i.e., 10² to 10³ per second. The pipelines and the buffer between them, a sequential address memory, are described first, and then several alternative schemes for the pretrigger and slow trigger are presented. 10 figures
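
    The staged rate reduction can be tallied directly; the per-stage rejection factors below are the round numbers implied by the rates quoted above, used only as a back-of-the-envelope check.

```python
# Event rate surviving each trigger level of the pipeline scheme.
input_rate = 1e7                      # triggers/s after the fast pretrigger
stages = [
    ("first-pipeline trigger", 10),   # 10^7 -> 10^6 per second
    ("slow trigger", 1000),           # 10^6 -> 10^3 per second
    ("software filtering", 10),       # 10^3 -> 10^2 per second
]

rate = input_rate
for name, rejection in stages:
    rate /= rejection
# rate is now 100 events/s, a level standard processing can write to tape
```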

  7. Notes on the history of the radiological study of Egyptian mummies: from X-rays to new imaging techniques.

    Science.gov (United States)

    Cosmacini, P; Piacentini, P

    2008-08-01

    A few centuries after the practice of mummification was finally abolished in the seventh century A.D., mummies began to capture the collective imagination, exerting a mysterious fascination that continues to this day. From the beginning, the radiological study of Egyptian mummies permitted the collection not only of medical data but also of anthropological and archaeological evidence. The first radiological study of an Egyptian mummy was performed by Flinders Petrie shortly after the discovery of X-rays in 1895, and since then, radiology has never stopped investigating these special patients. By the end of the 1970s, computed tomography (CT) scanning permitted more in-depth studies to be carried out without requiring the mummies to be removed from their cartonnage. CT images can be used to obtain a three-dimensional reconstruction of the mummy that provides important new information, in part thanks to the virtual endoscopy technique known as "fly through". Moreover, starting from CT data and using sophisticated graphics software, one can reconstruct an image of the face of the mummified individual at the time of his or her death. The history of imaging, from its origins until now, from the simplest to the most sophisticated technique, allows us to appreciate why these studies have been, and still are, fundamental in the study of Egyptian mummies.

  8. Improving Vintage Seismic Data Quality through Implementation of Advance Processing Techniques

    Science.gov (United States)

    Latiff, A. H. Abdul; Boon Hong, P. G.; Jamaludin, S. N. F.

    2017-10-01

    It is essential in petroleum exploration to have high-resolution subsurface images, both vertically and horizontally, to uncover new geological and geophysical aspects of the subsurface. A lack of exploration success may stem from poor imaging quality, which leads to inaccurate analysis and interpretation. In this work, we re-processed the existing seismic dataset with an emphasis on two objectives. Firstly, to produce better 3D seismic data quality with full retention of relative amplitudes and significantly reduced seismic and structural uncertainty. Secondly, to facilitate further prospect delineation through enhanced data resolution, fault definition and event continuity, particularly in the syn-rift section and basement cover contacts, and in turn to better understand the geology of the subsurface, especially the distribution of the fluvial and channel sands. By adding recent, state-of-the-art broadband processing techniques such as source and receiver de-ghosting, high-density velocity analysis and shallow-water de-multiple, the final results produced better overall reflection detail and frequency in specific target zones, particularly in the deeper section.

  9. The application of irradiation techniques for food preservation and processing improvement

    Energy Technology Data Exchange (ETDEWEB)

    Byun, Myung Woo; Cho, Han Ok; Jo, Sung Ki; Yook, Hong Sun; Kwon, Oh Jin; Yang, Jae Seung; Kim, Sung; Im, Sung Il

    1997-09-01

    This project intended to develop alternative techniques, based on safe irradiation methods, to be used in the food industry for food processing and utilization. For improvement of the rheology and processing of corn starch by irradiation, the production of modified starch with low viscosity as well as excellent viscosity stability became feasible by controlling the gamma irradiation dose level and the amount of inorganic peroxides added to the starch. The project also developed methods for improving the hygienic quality and long-term storage of dried red pepper by gamma irradiation. In Korean medicinal plants, 10 kGy gamma irradiation was effective for improving sanitary quality and increasing the extraction yield of major components. For the sanitization of health and convenience foods, gamma irradiation was more effective than ozone treatment in the decontamination of microorganisms, with minimal effect on the physicochemical properties analysed. In the evaluation of wholesomeness, the gamma-irradiated Korean medicinal plants were found to be safe from the genotoxic point of view. Thirteen groups of irradiated foods were approved for human consumption by the Korean Ministry of Health and Welfare. (author). 81 refs., 74 tabs.

  10. The application of irradiation techniques for food preservation and processing improvement

    International Nuclear Information System (INIS)

    Byun, Myung Woo; Cho, Han Ok; Jo, Sung Ki; Yook, Hong Sun; Kwon, Oh Jin; Yang, Jae Seung; Kim, Sung; Im, Sung Il.

    1997-09-01

    This project intended to develop alternative techniques, based on safe irradiation methods, to be used in the food industry for food processing and utilization. For improvement of the rheology and processing of corn starch by irradiation, the production of modified starch with low viscosity as well as excellent viscosity stability became feasible by controlling the gamma irradiation dose level and the amount of inorganic peroxides added to the starch. The project also developed methods for improving the hygienic quality and long-term storage of dried red pepper by gamma irradiation. In Korean medicinal plants, 10 kGy gamma irradiation was effective for improving sanitary quality and increasing the extraction yield of major components. For the sanitization of health and convenience foods, gamma irradiation was more effective than ozone treatment in the decontamination of microorganisms, with minimal effect on the physicochemical properties analysed. In the evaluation of wholesomeness, the gamma-irradiated Korean medicinal plants were found to be safe from the genotoxic point of view. Thirteen groups of irradiated foods were approved for human consumption by the Korean Ministry of Health and Welfare. (author). 81 refs., 74 tabs

  11. Sophisticated Search Capabilities in the ADS Abstract Service

    Science.gov (United States)

    Eichhorn, G.; Accomazzi, A.; Grant, C. S.; Henneken, E.; Kurtz, M. J.; Murray, S. S.

    2003-12-01

    The ADS provides access to over 940,000 references from astronomy and planetary sciences publications and 1.5 million records from physics publications. It is funded by NASA and provides free access to these references, as well as to 2.4 million scanned pages from the astronomical literature. These include most of the major astronomy and several planetary sciences journals, as well as many historical observatory publications. The references now include the abstracts from all volumes of the Journal of Geophysical Research (JGR) since the beginning of 2002. We receive these abstracts on a regular basis. The Kluwer journal Solar Physics has been scanned back to volume 1 and is available through the ADS. We have extracted the reference lists from this and many other journals and included them in the reference and citation database of the ADS. We have recently scanned Earth, Moon and Planets, another Kluwer journal, and will scan other Kluwer journals in the future as well. We plan on extracting references from these journals in the near future. The ADS has many sophisticated query features. These allow the user to formulate complex queries. Using results lists to get further information about the selected articles provides the means to quickly find important and relevant articles from the database. Three advanced feedback queries are available from the bottom of the ADS results list (in addition to the regular feedback queries already available from the abstract page and from the bottom of the results list): 1. Get reference list for selected articles: This query returns all known references for the selected articles (or for all articles in the first list). The resulting list will be ranked according to how often each article is referred to and will show the most referenced articles in the field of study that created the first list. It presumably shows the most important articles in that field. 2.
Get citation list for selected articles: This returns all known articles

  12. Signalign: An Ontology of DNA as Signal for Comparative Gene Structure Prediction Using Information-Coding-and-Processing Techniques.

    Science.gov (United States)

    Yu, Ning; Guo, Xuan; Gu, Feng; Pan, Yi

    2016-03-01

    Conventional character-analysis-based techniques in genome analysis manifest three main shortcomings: inefficiency, inflexibility, and incompatibility. In our previous research, a general framework called DNA As X was proposed for character-analysis-free techniques to overcome these shortcomings, where X is an intermediate representation such as digit, code, signal, vector, tree, graph, network, and so on. In this paper, we further implement an ontology of DNA As Signal by designing a tool named Signalign for comparative gene structure analysis, in which DNA sequences are converted into signal series, processed by a modified method of dynamic time warping, and measured by signal-to-noise ratio (SNR). The ontology of DNA As Signal integrates the principles and concepts of other disciplines, including information coding theory and signal processing, into sequence analysis and processing. Compared with conventional character-analysis-based methods, Signalign not only achieves equivalent or superior performance but also enriches the tools and the knowledge library of computational biology by extending the domain from characters/strings to diverse areas. The evaluation results validate the success of the character-analysis-free technique for improved performances in comparative gene structure prediction.
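
    The core conversion-and-alignment step described above can be sketched as follows. The base-to-number encoding here is an illustrative assumption, not Signalign's actual scheme, and plain (unmodified) dynamic time warping is used.

```python
# Hypothetical numeric encoding of bases (Signalign's real encoding differs).
BASE_LEVELS = {"A": 0.0, "C": 1.0, "G": 2.0, "T": 3.0}

def dna_to_signal(seq):
    """Convert a DNA string into a numeric series ("DNA as signal")."""
    return [BASE_LEVELS[b] for b in seq.upper()]

def dtw_distance(x, y):
    """Classic O(len(x)*len(y)) dynamic-time-warping distance."""
    n, m = len(x), len(y)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(x[i - 1] - y[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

a = dna_to_signal("ACGTACGT")
b = dna_to_signal("ACGTCACGT")  # one inserted base
print(dtw_distance(a, a))  # identical sequences -> 0.0
print(dtw_distance(a, b))  # small positive warping cost
```

    The warping step absorbs small insertions and deletions, which is what makes the signal view attractive for comparing gene structures of different lengths.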

  13. Fuzzy logic and image processing techniques for the interpretation of seismic data

    International Nuclear Information System (INIS)

    Orozco-del-Castillo, M G; Ortiz-Alemán, C; Rodríguez-Castellanos, A; Urrutia-Fucugauchi, J

    2011-01-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool for dealing with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of an MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and the various image processing techniques used are presented. We conclude that this is a well-suited problem for fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC obtained from expert manual interpretation.
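
    A single fuzzy if-then rule of the kind described can be sketched as follows. The attributes, membership breakpoints, and the rule itself are illustrative assumptions, not the authors' actual system.

```python
def tri(x, a, b, c):
    """Triangular membership function peaking at b, zero outside (a, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def mtc_score(amplitude, continuity):
    """Degree to which a pixel looks like mass-transport-complex fill.

    Assumed rule, for illustration only: IF amplitude is LOW AND reflector
    continuity is LOW THEN MTC likelihood is HIGH (min implements AND).
    """
    amp_low = tri(amplitude, -0.1, 0.0, 0.5)
    cont_low = tri(continuity, -0.1, 0.0, 0.5)
    return min(amp_low, cont_low)

print(round(mtc_score(0.1, 0.05), 2))  # chaotic, dim region -> 0.8
print(mtc_score(0.9, 0.9))             # bright, continuous reflector -> 0.0
```

    A full Mamdani system would fire many such rules per pixel and defuzzify the aggregated output, but each rule reduces to membership lookups and a min/max combination as above.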

  14. Mass Detection in Mammographic Images Using Wavelet Processing and Adaptive Threshold Technique.

    Science.gov (United States)

    Vikhe, P S; Thool, V R

    2016-04-01

    Detection of masses in mammograms for early diagnosis of breast cancer is a significant task in the reduction of the mortality rate. However, in some cases, screening for masses is difficult for the radiologist owing to variations in contrast, fuzzy edges and noisy mammograms. Masses and micro-calcifications are the distinctive signs for diagnosis of breast cancer. This paper presents a method for mass enhancement using a piecewise linear operator in combination with wavelet processing of mammographic images. The method includes artifact suppression and pectoral muscle removal based on morphological operations. Finally, mass segmentation using an adaptive threshold technique is carried out to separate the mass from the background. The proposed method has been tested on 130 (45 + 85) images, achieving 90.9 and 91 % True Positive Fraction (TPF) at 2.35 and 2.1 average False Positives Per Image (FP/I) on two different databases, namely the Mammographic Image Analysis Society (MIAS) and the Digital Database for Screening Mammography (DDSM). The obtained results show that the proposed technique gives improved diagnosis in early breast cancer detection.
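
    A piecewise linear intensity operator of the kind used for mass enhancement can be sketched as below; the breakpoints r1/s1/r2/s2 are illustrative assumptions, chosen to stretch the mid-gray band where masses typically appear and compress the rest.

```python
def piecewise_linear(p, r1=0.3, s1=0.15, r2=0.7, s2=0.9):
    """Map a normalized intensity p in [0, 1] through three linear segments:
    compress [0, r1), stretch [r1, r2), compress [r2, 1]."""
    if p < r1:
        return s1 * p / r1
    if p < r2:
        return s1 + (s2 - s1) * (p - r1) / (r2 - r1)
    return s2 + (1.0 - s2) * (p - r2) / (1.0 - r2)

print(round(piecewise_linear(0.5), 3))  # mid-band intensity stretched -> 0.525
```

    Applied pixel-wise (often to wavelet-reconstructed detail bands), this raises the contrast of mid-intensity mass regions before the adaptive-threshold segmentation step.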

  15. Influence of the properties of granite and sandstone in the desalination process by electrokinetic technique

    DEFF Research Database (Denmark)

    Feijoo, J.; Ottosen, Lisbeth M.; Pozo-Antonio, J.S.

    2015-01-01

    ) achieved in both stones.From the results obtained, it was possible to find those inherent factors to each stone which could have an influence on the efficacy of the treatment. With this technique it was possible to reduce the salt concentration in the granite almost to 100%. However, in the sandstone...... samples the decreases were not equally high, mainly at the intermediate levels where slight enrichments were observed. The results indicate that although the used technique is efficient for salt removal regardless of the porosimetric distribution of the rock, the better interconnection between the pores...... in the granite samples (favored a faster desalination process)....

  16. Thresholding: A Pixel-Level Image Processing Methodology Preprocessing Technique for an OCR System for the Brahmi Script

    Directory of Open Access Journals (Sweden)

    H. K. Anasuya Devi

    2006-12-01

    Full Text Available In this paper we study the methodology employed for preprocessing the archaeological images. We present the various algorithms used in the low-level processing stage of image analysis for an Optical Character Recognition system for the Brahmi script. The image preprocessing technique covered in this paper is thresholding. We also analyze the results obtained by the pixel-level processing algorithms.
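
    The paper treats thresholding generally; as one concrete global-threshold choice, Otsu's method (used here purely as an example, not necessarily the authors' algorithm) picks the gray level that maximizes the between-class variance of the image histogram:

```python
def otsu_threshold(hist):
    """Return the threshold t maximizing between-class variance.

    hist: pixel counts per gray level (e.g., 256 bins); pixels <= t
    are assigned to the background class.
    """
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    w_bg = 0.0      # background weight (pixel count so far)
    sum_bg = 0.0    # background intensity sum so far
    best_t, best_var = 0, -1.0
    for t, h in enumerate(hist):
        w_bg += h
        if w_bg == 0:
            continue
        w_fg = total - w_bg
        if w_fg == 0:
            break
        sum_bg += t * h
        mu_bg = sum_bg / w_bg
        mu_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mu_bg - mu_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

hist = [0] * 16
hist[2], hist[12] = 100, 100   # toy bimodal histogram: ink vs background
print(otsu_threshold(hist))    # -> 2, separating the two modes
```

    For degraded archaeological images a single global threshold is often insufficient, which is why the paper's pixel-level (locally adaptive) variants matter; the global method above is the usual baseline.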

  17. Applying industrial process improvement techniques to increase efficiency in a surgical practice.

    Science.gov (United States)

    Reznick, David; Niazov, Lora; Holizna, Eric; Siperstein, Allan

    2014-10-01

    The goal of this study was to examine how industrial process improvement techniques could help streamline the preoperative workup. Lean process improvement was used to streamline patient workup at an endocrine surgery service at a tertiary medical center, utilizing multidisciplinary collaboration. The program consisted of several major changes in how patients are processed in the department. The goal was to shorten the wait time between initial call and consult visit and between consult and surgery. A total of 1,438 patients were enrolled in the program. The wait time from the initial call until consult was reduced from 18.3 ± 0.7 to 15.4 ± 0.9 days. Wait time from consult until operation was reduced from 39.9 ± 1.5 to 33.9 ± 1.3 days for the overall practice and to 15.0 ± 4.8 days for low-risk patients. Patient cancellations were reduced from 27.9 ± 2.4% to 17.3 ± 2.5%. Overall patient flow increased from 30.9 ± 5.1 to 52.4 ± 5.8 consults per month (all differences statistically significant). Using process improvement methodology, surgery patients can benefit from an improved, streamlined process with significant reduction in wait time from call to initial consult and initial consult to surgery, with reduced cancellations. This generalized process has resulted in increased practice throughput and efficiency and is applicable to any surgery practice. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Nuclear techniques used in study of biological processes in Sinapis alba culture

    International Nuclear Information System (INIS)

    Giosanu, D.; Fleancu, M.

    2001-01-01

    The aim of the present paper is to study different nuclear techniques, in particular the influence of gamma radiation upon germination, growth and respiration processes in a Sinapis alba culture. The dependence of these phenomena on the dose of gamma irradiation was studied. Research was done on dry seeds of mustard (Sinapis alba). The doses of gamma irradiation were 20 krad, 40 krad, 60 krad, 80 krad and 100 krad. The subsequent evolution of the irradiated samples was compared with the evolution of unirradiated (control) samples. The irradiation was applied evenly, in a single phase. Treatment of the dry mustard seeds with gamma radiation caused a diminution of the energy of germination: it was 57-73% in the gamma-treated batches versus 81% in the control batch. Likewise, the faculty of germination decreased from 92% in the control batch to 83% in the irradiated batches. The growth process (length of roots and hypocotyl) was also studied. For 100 krad gamma irradiation the rate of this process was lower than that of the control batch, both on the first and on the fourth day after irradiation. The inhibition effect on the germination and growth processes of gamma-treated dry mustard seeds is determined by modifications in membrane permeability. The intensity of the respiration process in the irradiated lots was lower than that of the control lot. The inhibition of the respiration process following gamma irradiation could be explained by the enzymatic activity of the mustard seeds. (authors)

  19. Solving real-life problems: future mobile technology sophistication

    International Nuclear Information System (INIS)

    Shafiq, F.; Ahsan, K.; Nadeem, A.

    2016-01-01

    Almost all domains of real life take advantage of the latest technologies to enhance their processes, procedures and operations. This integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides an opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities can serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education as well. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends in routine life activities and their potential strength to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising minimized error, quick information assembly, quick response based on technology manipulation, and prioritized action. (author)

  20. Solving Real-Life Problems: Future Mobile Technology Sophistication

    Directory of Open Access Journals (Sweden)

    FARHAN SHAFIQ

    2016-07-01

    Full Text Available Almost all domains of real life take advantage of the latest technologies to enhance their processes, procedures and operations. This integration of technological innovations provides ease of access, flexibility, transparency, reliability and speed for the processes and procedures concerned. The rapid growth of ICT (Information and Communication Technology) and MT (Mobile Technology) provides an opportunity to redesign and re-engineer the processes and procedures of routine life activities. Technology integration and adoption in routine life activities can serve as a compensatory mechanism to assist the population in different ways, such as monitoring older adults and children at home, providing security assistance, monitoring and recording patients' vital signs automatically, controlling and monitoring equipment and devices, and providing assistance in shopping, banking and education as well. Disasters happen suddenly and destroy everything indiscriminately. Adoption and integration of the latest technologies, including ICT and MT, can enhance current disaster management processes, procedures and operations. This research study focuses on the impact of the latest and emerging technology trends in routine life activities and their potential strength to improve and enhance disaster management activities. MT provides a promising platform for facilitating people in enhancing their routine life activities. This research argues that the integration and adoption of mobile computing in the disaster management domain can enhance disaster management activities, promising minimized error, quick information assembly, quick response based on technology manipulation, and prioritized action.

  1. Energy meshing techniques for processing ENDF/B-VI cross sections using the AMPX code system

    International Nuclear Information System (INIS)

    Dunn, M.E.; Greene, N.M.; Leal, L.C.

    1999-01-01

    Modern techniques for the establishment of criticality safety for fissile systems invariably require the use of neutronic transport codes with applicable cross-section data. Accurate cross-section data are essential for solving the Boltzmann Transport Equation for fissile systems. In the absence of applicable critical experimental data, the use of independent calculational methods is crucial for the establishment of subcritical limits. Moreover, there are various independent modern transport codes available to the criticality safety analyst (e.g., KENO V.a., MCNP, and MONK). In contrast, there is currently only one complete software package that processes data from the Version 6 format of the Evaluated Nuclear Data File (ENDF) to a format useable by criticality safety codes. To facilitate independent cross-section processing, Oak Ridge National Laboratory (ORNL) is upgrading the AMPX code system to enable independent processing of Version 6 formats using state-of-the-art procedures. The AMPX code system has been in continuous use at ORNL since the early 1970s and is the premier processor for providing multigroup cross sections for criticality safety analysis codes. Within the AMPX system, the module POLIDENT is used to access the resonance parameters in File 2 of an ENDF/B library, generate point cross-section data, and combine the cross sections with File 3 point data. At the heart of any point cross-section processing code is the generation of a suitable energy mesh for representing the data. The purpose of this work is to facilitate the AMPX upgrade through the development of a new and innovative energy meshing technique for processing point cross-section data
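
    The energy-meshing step at the heart of point cross-section processing can be sketched as generic adaptive bisection: keep halving intervals until linear interpolation reproduces the point cross section to a set tolerance. This is only an illustration of the idea, not POLIDENT's actual algorithm.

```python
def adaptive_mesh(f, e_lo, e_hi, tol=1e-3, depth=30):
    """Return an ascending energy grid on [e_lo, e_hi] such that linear
    interpolation of f between neighbors matches f at each interval
    midpoint to relative tolerance tol (depth caps the recursion)."""
    def refine(a, b, d):
        mid = 0.5 * (a + b)
        lin = 0.5 * (f(a) + f(b))      # linear-interpolation estimate at mid
        exact = f(mid)
        if d == 0 or abs(lin - exact) <= tol * max(abs(exact), 1e-30):
            return [b]                  # interval is fine; keep right endpoint
        return refine(a, mid, d - 1) + refine(mid, b, d - 1)
    return [e_lo] + refine(e_lo, e_hi, depth)

sigma = lambda e: e ** -0.5   # a 1/v-like cross section, for illustration
mesh = adaptive_mesh(sigma, 0.01, 1.0, tol=1e-2)
print(mesh[0], mesh[-1], len(mesh))   # endpoints preserved; grid densest at low E
```

    Real processing codes must additionally seed the mesh with resonance-peak energies from File 2 so that narrow resonances are never stepped over, but the refine-until-linear criterion above is the common core.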

  2. Signal and image processing in medical applications

    CERN Document Server

    Kumar, Amit; Rahim, B Abdul; Kumar, D Sravan

    2016-01-01

    This book highlights recent findings on and analyses conducted on signals and images in the area of medicine. The experimental investigations involve a variety of signals and images and their methodologies range from very basic to sophisticated methods. The book explains how signal and image processing methods can be used to detect and forecast abnormalities in an easy-to-follow manner, offering a valuable resource for researchers, engineers, physicians and bioinformatics researchers alike.

  3. Full characterization of the photorefractive bright soliton formation process using a digital holographic technique

    International Nuclear Information System (INIS)

    Merola, F; Miccio, L; Paturzo, M; Ferraro, P; De Nicola, S

    2009-01-01

    An extensive characterization of the photorefractive bright soliton writing process in a lithium niobate crystal is presented. An interferometric approach based on a digital holographic technique has been used to reconstruct the complex wavefield at the exit face of the crystal. Temporal evolution of both intensity and phase profile of the writing beam has been analysed. The effective changes of the refractive index of the medium during the writing process and after the soliton formation are determined from the optical phase distribution. This method provides a reliable way to observe the process of soliton formation, whereas the determination of the intensity distribution of the output beam does not show clearly whether the soliton regime has been achieved or not. Furthermore, a detailed analysis of the soliton in a steady-state situation and under different writing conditions is presented and discussed

  4. Ultra-processed family foods in Australia: nutrition claims, health claims and marketing techniques.

    Science.gov (United States)

    Pulker, Claire Elizabeth; Scott, Jane Anne; Pollard, Christina Mary

    2018-01-01

    To objectively evaluate voluntary nutrition and health claims and marketing techniques present on packaging of high-market-share ultra-processed foods (UPF) in Australia for their potential impact on public health. Cross-sectional. Packaging information from five high-market-share food manufacturers and one retailer was obtained from supermarket and manufacturers' websites. Ingredients lists for 215 UPF were examined for the presence of added sugar. Packaging information was categorised using a taxonomy of nutrition and health information which included nutrition and health claims and five common food marketing techniques. Compliance of statements and claims with the Australia New Zealand Food Standards Code and with Health Star Ratings (HSR) was assessed for all products. Almost all UPF (95 %) contained added sugars described in thirty-four different ways; 55 % of UPF displayed a HSR; 56 % had nutrition claims (18 % were compliant with regulations); 25 % had health claims (79 % were compliant); and 97 % employed common food marketing techniques. Packaging of 47 % of UPF was designed to appeal to children. UPF carried a mean of 1·5 health and nutrition claims (range 0-10) and 2·6 marketing techniques (range 0-5), and 45 % had HSR≤3·0/5·0. Most UPF packaging featured nutrition and health statements or claims despite the high prevalence of added sugars and moderate HSR. The degree of inappropriate or inaccurate statements and claims present is concerning, particularly on packaging designed to appeal to children. Public policies to assist parents to select healthy family foods should address the quality and accuracy of information provided on UPF packaging.

  5. Applying modern psychometric techniques to melodic discrimination testing: Item response theory, computerised adaptive testing, and automatic item generation.

    Science.gov (United States)

    Harrison, Peter M C; Collins, Tom; Müllensiefen, Daniel

    2017-06-15

    Modern psychometric theory provides many useful tools for ability testing, such as item response theory, computerised adaptive testing, and automatic item generation. However, these techniques have yet to be integrated into mainstream psychological practice. This is unfortunate, because modern psychometric techniques can bring many benefits, including sophisticated reliability measures, improved construct validity, avoidance of exposure effects, and improved efficiency. In the present research we therefore use these techniques to develop a new test of a well-studied psychological capacity: melodic discrimination, the ability to detect differences between melodies. We calibrate and validate this test in a series of studies. Studies 1 and 2 respectively calibrate and validate an initial test version, while Studies 3 and 4 calibrate and validate an updated test version incorporating additional easy items. The results support the new test's viability, with evidence for strong reliability and construct validity. We discuss how these modern psychometric techniques may also be profitably applied to other areas of music psychology and psychological science in general.
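
    The ingredients named above, item response theory plus computerised adaptive testing, reduce at their simplest to a two-parameter-logistic (2PL) response model and a maximum-information item-selection rule. The item parameters below are made up for illustration.

```python
import math

def p_correct(theta, a, b):
    """2PL probability that a person of ability theta answers correctly
    (a = discrimination, b = difficulty)."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def item_information(theta, a, b):
    """Fisher information of a 2PL item at ability theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items):
    """Adaptive-testing step: pick the most informative remaining item.
    items: list of (a, b) pairs; returns the index of the best item."""
    return max(range(len(items)), key=lambda i: item_information(theta, *items[i]))

items = [(1.0, -2.0), (1.2, 0.0), (0.9, 2.5)]  # hypothetical (a, b) pairs
print(next_item(0.1, items))  # -> 1: the b=0 item is most informative near theta=0
```

    After each response the ability estimate theta is updated (e.g., by maximum likelihood) and `next_item` is called again, which is why adaptive tests reach a target reliability with far fewer items than fixed forms.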

  6. Gasoline classification using near infrared (NIR) spectroscopy data: Comparison of multivariate techniques

    Energy Technology Data Exchange (ETDEWEB)

    Balabin, Roman M., E-mail: balabin@org.chem.ethz.ch [Department of Chemistry and Applied Biosciences, ETH Zurich, 8093 Zurich (Switzerland); Safieva, Ravilya Z. [Gubkin Russian State University of Oil and Gas, 119991 Moscow (Russian Federation); Lomakina, Ekaterina I. [Faculty of Computational Mathematics and Cybernetics, Lomonosov Moscow State University, 119992 Moscow (Russian Federation)

    2010-06-25

    Near infrared (NIR) spectroscopy is a non-destructive (vibrational spectroscopy based) measurement technique for many multicomponent chemical systems, including products of petroleum (crude oil) refining and petrochemicals, food products (tea, fruits, e.g., apples, milk, wine, spirits, meat, bread, cheese, etc.), pharmaceuticals (drugs, tablets, bioreactor monitoring, etc.), and combustion products. In this paper we have compared the abilities of nine different multivariate classification methods: linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), regularized discriminant analysis (RDA), soft independent modeling of class analogy (SIMCA), partial least squares (PLS) classification, K-nearest neighbor (KNN), support vector machines (SVM), probabilistic neural network (PNN), and multilayer perceptron (ANN-MLP) - for gasoline classification. Three sets of near infrared (NIR) spectra (450, 415, and 345 spectra) were used for classification of gasolines into 3, 6, and 3 classes, respectively, according to their source (refinery or process) and type. The 14,000-8000 cm⁻¹ NIR spectral region was chosen. In all cases NIR spectroscopy was found to be effective for gasoline classification purposes, when compared with nuclear magnetic resonance (NMR) spectroscopy or gas chromatography (GC). KNN, SVM, and PNN techniques for classification were found to be among the most effective ones. Artificial neural network (ANN-MLP) approach based on principal component analysis (PCA), which was believed to be efficient, has shown much worse results. We hope that the results obtained in this study will help both further chemometric (multivariate data analysis) investigations and investigations in the sphere of applied vibrational (infrared/IR, near-IR, and Raman) spectroscopy of sophisticated multicomponent systems.
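
    K-nearest-neighbor classification, one of the best-performing methods in this comparison, is simple enough to sketch in full. The two-feature "spectra" and class names below are toy stand-ins for real multi-point NIR absorbance vectors.

```python
def knn_predict(train, labels, x, k=3):
    """Majority vote among the k training vectors nearest to x
    (squared Euclidean distance)."""
    dist = lambda u, v: sum((ui - vi) ** 2 for ui, vi in zip(u, v))
    nearest = sorted(range(len(train)), key=lambda i: dist(train[i], x))[:k]
    votes = {}
    for i in nearest:
        votes[labels[i]] = votes.get(labels[i], 0) + 1
    return max(votes, key=votes.get)

# Toy two-feature "spectra"; hypothetical class names, not the study's labels.
train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = ["regular", "regular", "premium", "premium"]
print(knn_predict(train, labels, (0.15, 0.15)))  # -> regular
```

    KNN needs no training phase and makes no distributional assumptions, which may explain its robustness on high-dimensional NIR data relative to the PCA-fed ANN-MLP noted above.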

  7. Gasoline classification using near infrared (NIR) spectroscopy data: Comparison of multivariate techniques

    International Nuclear Information System (INIS)

    Balabin, Roman M.; Safieva, Ravilya Z.; Lomakina, Ekaterina I.

    2010-01-01

    Near infrared (NIR) spectroscopy is a non-destructive (vibrational spectroscopy based) measurement technique for many multicomponent chemical systems, including products of petroleum (crude oil) refining and petrochemicals, food products (tea, fruits, e.g., apples, milk, wine, spirits, meat, bread, cheese, etc.), pharmaceuticals (drugs, tablets, bioreactor monitoring, etc.), and combustion products. In this paper we have compared the abilities of nine different multivariate classification methods: linear discriminant analysis (LDA), quadratic discriminant analysis (QDA), regularized discriminant analysis (RDA), soft independent modeling of class analogy (SIMCA), partial least squares (PLS) classification, K-nearest neighbor (KNN), support vector machines (SVM), probabilistic neural network (PNN), and multilayer perceptron (ANN-MLP) - for gasoline classification. Three sets of near infrared (NIR) spectra (450, 415, and 345 spectra) were used for classification of gasolines into 3, 6, and 3 classes, respectively, according to their source (refinery or process) and type. The 14,000-8000 cm⁻¹ NIR spectral region was chosen. In all cases NIR spectroscopy was found to be effective for gasoline classification purposes, when compared with nuclear magnetic resonance (NMR) spectroscopy or gas chromatography (GC). KNN, SVM, and PNN techniques for classification were found to be among the most effective ones. Artificial neural network (ANN-MLP) approach based on principal component analysis (PCA), which was believed to be efficient, has shown much worse results. We hope that the results obtained in this study will help both further chemometric (multivariate data analysis) investigations and investigations in the sphere of applied vibrational (infrared/IR, near-IR, and Raman) spectroscopy of sophisticated multicomponent systems.

  8. Technique for Determination of Rational Boundaries in Combining Construction and Installation Processes Based on Quantitative Estimation of Technological Connections

    Science.gov (United States)

    Gusev, E. V.; Mukhametzyanov, Z. R.; Razyapov, R. V.

    2017-11-01

    The problems with existing methods for determining which technologically interlinked construction processes and activities can be combined are considered under the modern conditions of constructing various facilities. The necessity of identifying common parameters that characterize the nature of interaction of all technology-related construction and installation processes and activities is shown. Research into the technologies of construction and installation processes for buildings and structures was conducted with the goal of determining a common parameter for evaluating the relationship between technologically interconnected processes and construction works. The result of this research is a quantitative evaluation of the interaction of construction and installation processes and activities: the minimum technologically necessary volume of the preceding process that allows one to plan and organize the execution of a subsequent, technologically interconnected process. This quantitative evaluation is used as the basis for calculating the optimum range over which processes and activities can be combined. The calculation method is based on graph theory. The authors applied a generic characterization parameter to reveal the technological links between construction and installation processes, and the proposed technique has adaptive properties which are key for wide use in forming organizational decisions. The article also describes the practical significance of the developed technique.

  9. Microwave Photonic Architecture for Direction Finding of LPI Emitters: Post-Processing for Angle of Arrival Estimation

    Science.gov (United States)

    2016-09-01

    ...it easily detectable. The transmission of LPI signals with sophisticated modulation and high processing gain enable a good detection range and low... and data collection process. The software simulation of the system, which supports the hypothesis that the physical system is capable of detecting...

  10. Conceptual design study and evaluation of an advanced treatment process applying a submerged combustion technique for spent solvents

    International Nuclear Information System (INIS)

    Uchiyama, Gunzo; Maeda, Mitsuru; Fijine, Sachio; Chida, Mitsuhisa; Kirishima, Kenji.

    1993-10-01

    An advanced treatment process based on a submerged combustion technique was proposed for spent solvents and distillation residues containing transuranium (TRU) nuclides. A conceptual design study and a preliminary cost estimation of a treatment facility applying the process were conducted. Based on the results of the study, the process was evaluated with respect to technical features such as safety, TRU waste volume reduction and economics. The key requirements for practical use were also summarized. The process was shown to have the following features: the simplified treatment and solidification steps will not generate secondary aqueous wastes; the volume of TRU solid waste will be reduced to less than one tenth of that of a reference technique (pyrolysis process); and the facility construction cost is less than 1 % of the total construction cost of a future large-scale reprocessing plant. As for the low-level calcium phosphate wastes, it was shown that further removal of β · γ nuclides together with TRU nuclides from the wastes would be required for safety in interim storage and transportation and to reduce the shielding load. (author)

  11. Technology for the product and process data base

    Science.gov (United States)

    Barnes, R. D.

    1984-01-01

    The computerized product and process data base is increasingly recognized to be the cornerstone component of an overall system aimed at the integrated automation of the industrial processes of a given company or enterprise. The technology needed to support these more effective computer integrated design and manufacturing methods, especially the concept of 3-D computer-sensible product definitions rather than engineering drawings, is not fully available and rationalized. Progress is being made, however, in bridging this technology gap with concentration on the modeling of sophisticated information and data structures, high-performance interactive user interfaces and comprehensive tools for managing the resulting computerized product definition and process data base.

  12. Dosimetric Verification and Evaluation of the 3-D Conformal Parotid Gland-Sparing Irradiation Technique for Bilateral Neck Treatment at University Hospital Centre Zagreb

    International Nuclear Information System (INIS)

    Kovacevic, N; Hrsak, H.; Bibic, J.

    2011-01-01

    The 3-D Conformal Parotid Gland-Sparing Irradiation Technique for the Bilateral Neck (ConPas) is an alternative to intensity-modulated radiotherapy (IMRT) and is in routine use at University Hospital Centre Rebro (KBC-Rebro), Zagreb. This technique includes highly asymmetric wedged conformal multi-leaf fields and demands very precise application. The aim of this paper is to present the dosimetric verification method for ConPas (and an evaluation of its applicability) as performed at KBC, taking into account the precision of the Treatment Planning System (TPS), the capabilities of the linear accelerator and patient set-up error. Results for two patients are shown in some detail. ConPas is a rather sophisticated method and demands high precision throughout the radiotherapy process. Verification of ConPas using an IMRT Verification Matrix Phantom shows good agreement between measured and predicted doses inside and outside PTV regions of the head and neck. Furthermore, careful tracking of positioning during treatment shows that the overall set-up error is very small (practically negligible). When possible, one parotid gland may be partially spared, and therefore its function preserved at least to some extent. (author)

  13. Developing the technique of image processing for the study of bubble dynamics in subcooled flow boiling

    International Nuclear Information System (INIS)

    Donevski, Bozin; Saga, Tetsuo; Kobayashi, Toshio; Segawa, Shigeki

    1998-01-01

    This study presents the development of an image processing technique for studying the dynamic behavior of vapor bubbles in a two-phase bubbly flow. It focuses on the quantitative assessment of basic parameters such as local bubble size and size distribution in the void fraction range 0.03 < α < 0.07. The image processing methodology is based upon computer evaluation of high-speed motion pictures obtained from the flow field in the region of underdeveloped subcooled flow boiling for a variety of experimental conditions. This technique has the advantage of providing computer-based measurement and extraction of the bubbles in the two-phase bubbly flow. The method appears promising for determining the governing mechanisms in subcooled flow boiling, particularly near the point of net vapor generation. The data collected by the image analysis software can be incorporated into the new models and computer codes currently under development which aim to incorporate the effects of vapor generation and condensation separately. (author)
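
    Once frames are binarized, per-bubble sizes follow from connected-component labeling; this flood-fill sketch illustrates the idea (the abstract does not describe the authors' software at this level of detail, so the implementation below is an assumption).

```python
def bubble_sizes(mask):
    """mask: 2-D list of 0/1 pixels from a binarized frame.
    Returns the sorted pixel counts of each 4-connected bubble blob."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    sizes = []
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                stack, count = [(r, c)], 0   # iterative flood fill
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    count += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols \
                                and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                sizes.append(count)
    return sorted(sizes)

mask = [[1, 1, 0, 0],
        [0, 0, 0, 1],
        [0, 1, 0, 1]]
print(bubble_sizes(mask))  # -> [1, 2, 2]
```

    With a known pixel-to-millimeter calibration, these pixel counts convert directly into the local bubble-size distributions the study reports for void fractions between 0.03 and 0.07.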

  14. Automated processing of label-free Raman microscope images of macrophage cells with standardized regression for high-throughput analysis.

    Science.gov (United States)

    Milewski, Robert J; Kumagai, Yutaro; Fujita, Katsumasa; Standley, Daron M; Smith, Nicholas I

    2010-11-19

    Macrophages represent the front lines of our immune system; they recognize and engulf pathogens or foreign particles, thus initiating the immune response. Imaging macrophages presents unique challenges, as most optical techniques require labeling or staining of the cellular compartments in order to resolve organelles, and such stains or labels have the potential to perturb the cell, particularly in cases where incomplete information exists regarding the precise cellular reaction under observation. Label-free imaging techniques such as Raman microscopy are thus valuable tools for studying the transformations that occur in immune cells upon activation, at both the molecular and organelle levels. Due to extremely low signal levels, however, Raman microscopy requires sophisticated image processing techniques for noise reduction and signal extraction. To date, efficient, automated algorithms for resolving sub-cellular features in noisy, multi-dimensional image sets have not been explored extensively. We show that hybrid z-score normalization and standardized regression (Z-LSR) can highlight the spectral differences within the cell and provide image contrast dependent on spectral content. In contrast to typical Raman image processing methods using multivariate analysis, such as singular value decomposition (SVD), our implementation of the Z-LSR method can operate in near real-time. In spite of its computational simplicity, Z-LSR can automatically remove background and bias in the signal, improve the resolution of spatially distributed spectral differences, and enable sub-cellular features to be resolved in Raman microscopy images of mouse macrophage cells. Significantly, the Z-LSR processed images automatically exhibited subcellular architectures, whereas SVD in general requires human assistance in selecting the components of interest. The computational efficiency of Z-LSR enables automated resolution of sub-cellular features in large Raman microscopy data sets without human intervention.
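    The general idea of combining per-spectrum z-score normalization with a least-squares regression against a reference spectrum can be sketched as follows. This is a hedged illustration of the concept only, not the authors' Z-LSR implementation; the function name and the single-reference formulation are assumptions.

```python
import numpy as np

def zscore_lsr_map(cube, reference):
    """Contrast map from a hyperspectral image via z-score + least squares.

    cube:      (H, W, K) image stack with K spectral channels per pixel
    reference: (K,) spectrum of the component of interest
    Returns an (H, W) map of regression slopes: how strongly each pixel's
    standardized spectrum resembles the standardized reference."""
    H, W, K = cube.shape
    X = cube.reshape(-1, K).astype(float)
    # Standardize each pixel spectrum along the spectral axis (z-score);
    # this removes per-pixel background offset and intensity scale.
    X = (X - X.mean(axis=1, keepdims=True)) / (X.std(axis=1, keepdims=True) + 1e-12)
    r = (reference - reference.mean()) / (reference.std() + 1e-12)
    # Ordinary least-squares slope of each standardized spectrum on r
    coef = X @ r / (r @ r)
    return coef.reshape(H, W)
```

Pixels whose spectra match the reference map to values near 1, anticorrelated spectra to values near -1, giving content-dependent contrast without any component selection by hand.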

  15. A comparative analysis of pre-processing techniques in colour retinal images

    International Nuclear Information System (INIS)

    Salvatelli, A; Bizai, G; Barbosa, G; Drozdowicz, B; Delrieux, C

    2007-01-01

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which is most often discovered only at an advanced stage, when most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are key to any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images in which these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means and local Hurst (self-correlation) coefficients for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.
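    The illumination-compensation step can be sketched with a morphological background estimate: a large grayscale opening suppresses small bright structures (vessels, lesions) and retains the slowly varying illumination field, which is then subtracted. This is a generic illustration, not the authors' exact pipeline; the structuring-element size is an assumption.

```python
import numpy as np
from scipy import ndimage

def correct_illumination(img, size=31):
    """Compensate non-uniform illumination in a grayscale image.

    Estimates the slowly varying background with a large morphological
    opening, subtracts it, and shifts the result back to the original
    mean intensity so the overall brightness is preserved."""
    background = ndimage.grey_opening(img, size=(size, size))
    corrected = img - background
    return corrected - corrected.mean() + img.mean()
```

On a uniformly lit image the correction is a no-op; on an image with an illumination gradient, the gradient is absorbed into the background estimate and removed.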

  16. A comparative analysis of pre-processing techniques in colour retinal images

    Energy Technology Data Exchange (ETDEWEB)

    Salvatelli, A [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Bizai, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Barbosa, G [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Drozdowicz, B [Artificial Intelligence Group, Facultad de Ingenieria, Universidad Nacional de Entre Rios (Argentina); Delrieux, C [Electric and Computing Engineering Department, Universidad Nacional del Sur, Alem 1253, BahIa Blanca, (Partially funded by SECyT-UNS) (Argentina)], E-mail: claudio@acm.org

    2007-11-15

    Diabetic retinopathy (DR) is a chronic disease of the ocular retina, which is most often discovered only at an advanced stage, when most of the damage is irreversible. For that reason, early diagnosis is paramount for avoiding the most severe consequences of DR, of which complete blindness is not uncommon. Unsupervised or supervised processing of retinal images emerges as a feasible tool for this diagnosis. The preprocessing stages are key to any further assessment, since these images exhibit several defects, including non-uniform illumination, sampling noise, uneven contrast due to pigmentation loss during sampling, and many others. Any feasible diagnosis system should work with images in which these defects have been compensated. In this work we analyze and test several correction techniques. Non-uniform illumination is compensated using morphology and homomorphic filtering; uneven contrast is compensated using morphology and local enhancement. We tested our processing stages using Fuzzy C-Means and local Hurst (self-correlation) coefficients for unsupervised segmentation of the abnormal blood vessels. The results over a standard set of DR images are more than promising.

  17. Collimation method using an image processing technique for an assembling-type antenna

    Science.gov (United States)

    Okuyama, Toshiyuki; Kimura, Shinichi; Fukase, Yutaro; Ueno, Hiroshi; Harima, Kouichi; Sato, Hitoshi; Yoshida, Tetsuji

    1998-10-01

    To construct highly precise space structures, such as antennas, it is essential to be able to collimate them with high precision by remote operation. Surveying techniques commonly used for collimating ground-based antennas cannot be applied to space systems, since they require relatively sensitive and complex instruments. In this paper, we propose a collimation method applied to mark-patterns mounted on an antenna dish for detecting very slight displacements. By calculating a cross-correlation function between the target and reference mark-patterns, and by interpolating this calculated function, we can measure the displacement of the target mark-pattern with sub-pixel precision. We developed a test-bed for the measuring system and evaluated several mark-patterns suitable for our image processing technique. A mark-pattern that enabled displacement detection with an RMS error of 1/100 pixel was found. Several tests conducted using this chosen pattern verified the robustness of the method under different lighting conditions and alignment errors. This collimating method is designed for application to an assembling-type antenna which is being developed by the Communications Research Laboratory.
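    The correlate-then-interpolate idea can be sketched in one dimension: locate the integer peak of the cross-correlation, then refine it with a three-point parabolic fit to obtain a sub-pixel estimate. This is a hedged illustration, not the authors' implementation; the FFT-based circular correlation and the parabolic peak fit are assumptions (the same idea extends to 2-D mark-patterns).

```python
import numpy as np

def subpixel_shift_1d(reference, target):
    """Estimate the sub-pixel displacement of `target` relative to
    `reference` via circular cross-correlation and a three-point
    parabolic interpolation around the correlation peak."""
    ref = reference - reference.mean()
    tgt = target - target.mean()
    n = len(ref)
    # Circular cross-correlation computed with the FFT
    corr = np.fft.ifft(np.fft.fft(tgt) * np.conj(np.fft.fft(ref))).real
    k = int(np.argmax(corr))
    # Parabola through the peak sample and its two neighbors
    ym, y0, yp = corr[(k - 1) % n], corr[k], corr[(k + 1) % n]
    delta = 0.5 * (ym - yp) / (ym - 2.0 * y0 + yp)
    shift = k + delta
    # Wrap to a signed displacement in [-n/2, n/2]
    return shift - n if shift > n / 2 else shift

# Gaussian "mark-pattern" shifted by 3 samples
x = np.arange(128)
reference = np.exp(-0.5 * ((x - 64.0) / 4.0) ** 2)
target = np.exp(-0.5 * ((x - 67.0) / 4.0) ** 2)
est = subpixel_shift_1d(reference, target)   # ≈ 3.0
```

For smooth, well-sampled patterns this simple peak fit already reaches small fractions of a pixel, which is the regime the mark-pattern evaluation above is concerned with.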

  18. A Systematic Approach to Applying Lean Techniques to Optimize an Office Process at the Y-12 National Security Complex

    Energy Technology Data Exchange (ETDEWEB)

    Credille, Jennifer [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States); Owens, Elizabeth [Y-12 National Security Complex, Oak Ridge, TN (United States); Univ. of Tennessee, Knoxville, TN (United States)

    2017-10-11

    This capstone introduces Lean concepts to an office activity to demonstrate the versatility of Lean. Traditionally, Lean has been associated with process improvements in an industrial setting. However, this paper demonstrates that implementing Lean concepts within an office activity can also result in significant process improvements. Lean first emerged with the conception of the Toyota Production System, an innovative approach designed to improve productivity in the automotive industry by eliminating waste and variation. Lean has also been applied to office environments; however, the limited literature reveals that most Lean applications within an office are restricted to one or two techniques. Our capstone confronts these restrictions by introducing a systematic approach that utilizes multiple Lean concepts. The approach incorporates system analysis, system reliability, system requirements, and system feasibility. The methodical Lean outline provides tools for a successful outcome, ensuring the process is thoroughly dissected, and can be applied to any process in any work environment.

  19. Investigating deformation processes in AM60 magnesium alloy using the acoustic emission technique

    International Nuclear Information System (INIS)

    Mathis, K.; Chmelik, F.; Janecek, M.; Hadzima, B.; Trojanova, Z.; Lukac, P.

    2006-01-01

    Microstructure changes in an AM60 magnesium alloy were monitored using the acoustic emission (AE) technique during tensile tests in the temperature range from 20 to 300 deg. C. The correlation between the AE signal and the deformation processes is discussed. It is shown, using transmission electron and light microscopy, that the character of the AE response is associated with various modes of mechanical twinning at lower temperatures, whereas at higher temperatures the influence of non-basal dislocations on the AE response must also be taken into account

  20. The development of written word processing: the case of deaf children

    Directory of Open Access Journals (Sweden)

    Jacqueline Leybaert

    2008-04-01

    Full Text Available Reading is a highly complex, flexible and sophisticated cognitive activity, and word recognition constitutes only a small and limited part of the whole process. It seems, however, that for various reasons word recognition is worth studying separately from other components. Considering that writing systems are secondary codes representing the language, word recognition mechanisms may appear as an interface between printed material and general language capabilities, and thus specific difficulties in reading and spelling acquisition should be located at the level of isolated word identification (see e.g. Crowder, 1982, for discussion). Moreover, it appears that a prominent characteristic of poor readers is their lack of efficiency in the processing of isolated words (Mitchell, 1982; Stanovich, 1982). And finally, word recognition seems to be a more automatic and less controlled component of the whole reading process.