WorldWideScience

Sample records for hydrogasification process analysis

  1. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Xiaolei [Arizona Public Service Company, Phoenix, AZ (United States); Rink, Nancy [Arizona Public Service Company, Phoenix, AZ (United States)

    2011-04-30

    This report presents the results of the research and development conducted on an Advanced Hydrogasification Process (AHP) conceived and developed by Arizona Public Service Company (APS) under U.S. Department of Energy (DOE) contract DE-FC26-06NT42759 for Substitute Natural Gas (SNG) production from western coal. A double-wall (i.e., a hydrogasification reactor contained within a pressure shell) down-flow hydrogasification reactor was designed, engineered, constructed, commissioned and operated by APS, Phoenix, AZ. The reactor is ASME-certified under Section VIII with a rating of 1150 pounds per square inch gage (psig) maximum allowable working pressure at 1950 degrees Fahrenheit (°F). The reaction zone had a 1.75 inch inner diameter and a 13 foot length. Initial testing of a sub-bituminous coal demonstrated ~50% carbon conversion and ~10% methane yield in the product gas at 1625°F and 1000 psig, with an 11 second (s) residence time and a 0.4 hydrogen-to-coal mass ratio. Liquid by-products mainly contained benzene, toluene and xylene (BTX) and tar. Char collected from the bottom of the reactor had a heating value of 9000 British thermal units per pound (Btu/lb). A three-dimensional (3D) computational fluid dynamics simulation of the hydrodynamics around the reactor head was used to design the hydrogen-injection nozzles, optimizing gas-solid mixing for improved carbon conversion. The report also presents the evaluation of using algae for carbon dioxide (CO2) management and biofuel production. Nannochloropsis, Selenastrum and Scenedesmus were determined to be the best algae strains for the project's purpose and were studied in an outdoor system which included a 6-meter (6M) radius cultivator with a total surface area of 113 square meters (m2) and a total culture volume between 10,000 and 15,000 liters (L); a CO2 on-demand feeding system; an on-line data collection system for temperature, p
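
    A quick plausibility check on the reported residence time can be made from the published reactor geometry alone. The sketch below (Python; the geometry and residence time are taken from the abstract, but the calculation itself is illustrative and not from the report) computes the reaction-zone volume and the mean axial gas velocity implied by an 11 s residence time:

        import math

        # Reaction zone geometry from the abstract: 1.75 in inner diameter, 13 ft long.
        diameter_m = 1.75 * 0.0254   # inches to metres
        length_m = 13 * 0.3048       # feet to metres

        volume_m3 = math.pi * (diameter_m / 2) ** 2 * length_m
        print(f"reaction zone volume: {volume_m3 * 1000:.1f} L")  # ~6.1 L

        # An 11 s residence time over the 13 ft zone implies this mean axial velocity.
        velocity = length_m / 11.0
        print(f"implied mean axial gas velocity: {velocity:.2f} m/s")  # ~0.36 m/s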

  2. Development of a Hydrogasification Process for Co-Production of Substitute Natural Gas (SNG) and Electric Power from Western Coals-Phase I

    Energy Technology Data Exchange (ETDEWEB)

    Raymond Hobbs

    2007-05-31

    The Advanced Hydrogasification Process (AHP), the conversion of coal to methane, is being developed through NETL with a DOE grant and has successfully completed its first phase of development. The results so far are encouraging and have led to a commitment by DOE/NETL to begin a second phase: bench-scale reactor vessel testing, expanded engineering analysis and economic perspective review. During the next decade new means of generating electricity, and other forms of energy, will be introduced. The members of the AHP Team envision a need for expanded sources of natural gas, or substitutes for natural gas, to fuel power generating plants. The initial work the team has completed on a process to use hydrogen to convert coal to methane (pipeline-ready gas) shows promising potential. The Team has intentionally slanted its efforts toward the needs of US electric utilities, particularly on fuels that can be used near urban centers where the greatest need for new electric generation is found. The process, as it has evolved, would produce methane from coal by adding hydrogen. The process appears to be efficient using western coals for conversion to a highly sought-after fuel with significantly reduced CO2 emissions. Utilities have a natural interest in the preservation of their industry, which will require a dramatic reduction in stack emissions and an increase in sustainable technologies. Utilities tend to rank long-term stable supplies of fuel higher than most industries and are willing to trade some ratio of cost for stability. The need for sustainability, stability and environmentally compatible production are key drivers in the formation and progression of the AHP development. In Phase II, the team will add a focus on water conservation to determine how the basic gasification process can best be integrated with all the plant components to minimize water consumption during SNG production. The process allows for several CO2 reduction options including consumption of

  3. A simple kinetic analysis of syngas during steam hydrogasification of biomass using a novel inverted batch reactor with instant high pressure feeding.

    Science.gov (United States)

    Fan, Xin; Liu, Zhongzhe; Norbeck, Joseph M; Park, Chan S

    2016-01-01

    A newly designed inverted batch reactor equipped with a pressure-driven feeding system was built for investigating the kinetics of syngas formation during the steam hydrogasification (SHR) of biomass. The system could instantly load the feedstock into the reactor at high temperature and pressure, simulating the way feedstock is transported into a hot, pressurized gasifier. Experiments were conducted from 600°C to 700°C. The inverted reactor achieved a very high heating rate, enhancing carbon conversion and syngas production. The kinetic study showed that the rates of CH4, CO and CO2 formation during SHR increased with gasification temperature. SHR had a comparatively low activation energy for CH4 production. The activation energies for CH4, CO and CO2 formation during SHR were 42.8, 51.8 and 14 kJ/mol, respectively.
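
    Because the abstract reports Arrhenius activation energies, the relative temperature sensitivity of each formation rate over the 600-700°C window follows directly from k = A exp(-Ea/RT). A minimal sketch (Python; only the Ea values come from the abstract, and the unknown pre-exponential factors cancel in the ratio):

        import math

        R = 8.314  # gas constant, J/(mol K)
        Ea = {"CH4": 42.8e3, "CO": 51.8e3, "CO2": 14.0e3}  # J/mol, from the abstract

        T1, T2 = 600 + 273.15, 700 + 273.15  # K

        for gas, ea in Ea.items():
            # k(T2)/k(T1): the pre-exponential factor A cancels out of the ratio.
            ratio = math.exp(ea / R * (1 / T1 - 1 / T2))
            print(f"{gas}: formation rate rises ~{ratio:.2f}x from 600 to 700 deg C")

    The low Ea for CO2 implies its formation rate is the least sensitive to temperature, consistent with the trends reported above.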

  4. Advancement of flash hydrogasification: Task VIII. Performance testing

    Energy Technology Data Exchange (ETDEWEB)

    Falk, A.Y.; Schuman, M.D.; Kahn, D.R.

    1986-06-01

    This topical report documents the technical effort required to investigate and verify the reaction chemistry associated with the Rockwell Advanced Flash Hydropyrolysis (AFHP) concept for the production of substitute natural gas (SNG) from coal. The testing phase of the program included 5 preburner performance evaluation tests (14 test conditions) and 11 coal-fed reactor tests (19 test conditions). The reactor test parameters investigated spanned exit temperatures from 1775 to 2050°F, residence times from 2 to 8 s, inlet gas-to-coal ratios from 0.15 to 0.27 lb-mole/lb, and inlet steam-to-H2 mole ratios from 0.15 to 0.86. One test was conducted to investigate the effect of CH4 addition to the hydrogen feed stream (22 mole % CH4), with subsequent partial oxidation of the CH4 to COx in the preburner system, on the AFHP reactor chemistry and product gas composition. Overall carbon conversion and total carbon conversion to gases (namely, CH4, C2H6, CO, and CO2) ranged from 53 to 68% and 35 to 68%, respectively. The gas produced was primarily CH4 (31 to 53% carbon conversion to CH4). Carbon conversion to total liquids was strongly dependent on reactor exit temperature and to a lesser extent on residence time, with values ranging from about 20% at 1775°F and 2 s residence time to zero at 1975°F and residence times greater than 5 s. Carbon conversion to C6H6 as high as 11.2% was obtained. Carbon conversion to COx ranged from 3.5 to 29.4%. Methane addition was found not to significantly affect the AFHP reactor chemistry. As a result of this program, Rockwell has expanded its data base and significantly improved its correlation model describing the processes occurring during flash hydropyrolysis. The correlation provides an excellent tool for subsequent process evaluations to determine the economic potential of the Rockwell coal hydrogasification process. 23 refs., 51 figs.

  5. Achievement report for fiscal 1997 on investigative research on society compatibility of development of coal hydrogasification technology; 1997 nendo sekitan suiso tenka gas ka gijutsu kaihatsu shakai tekigosei ni kansuru chosa kenkyu seika hokokusho

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-03-01

    In view of the possibility of future tightness in natural gas supply, the final objective is to establish a coal gasification technology that can cheaply and stably supply high-quality substitute natural gas using coal, which exists abundantly throughout the world, as the raw material. Investigative research is being carried out under a five-year plan on the social acceptability required to assess the possibility of its practical application. Fiscal 1997 continued the previous year's 'survey on process level elevation' and 'survey on social acceptability'. This report summarizes the achievements. In the investigative research on process level elevation, Shell's methane synthesis process based on an oxygen-blown, dry-feed coal gasifier was evaluated, and the material balance calculation for a hydrogasification reactor, as performed in the 'survey on developing the coal hydrogasification technology', was pursued and its reasonableness verified. In the survey on the social acceptability of the process, natural gas (including non-conventional methane hydrate and coal bed methane) and coals were surveyed as raw materials for hydrogasification. (NEDO)

  6. Novel approach to coal gasification using chemically incorporated catalysts (Phase II). Appendix A-F. Final report, May 1978-June 1981

    Energy Technology Data Exchange (ETDEWEB)

    Feldmann, H.F.; Conkle, H.N.; Appelbaum, H.R.; Chauhan, S.P.

    1981-01-01

    This volume contains six appendices: experimental apparatus, test conditions, and results of catalytic coal treatment; direct hydrogasification; summary of test runs for hydrogasification of BTC; summary of test runs for hydrogasification of char; summary of steam/O2 gasification runs; and process analysis. Forty tables and nine figures are also included.

  7. Achievement report for fiscal 1999 on auxiliary New Sunshine Project. Development of coal hydrogasification technology (Survey and research on its social acceptability); 1999 nendo New Sunshine keikaku hojo jigyo seika hokokusho. Sekitan suiso tenka gaska gijutsu kaihatsu - Shakai tekigo sei ni kansuru chosa kenkyu

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2000-03-01

    With the objective of evaluating the practical feasibility and economics of the coal hydrogasification technology (the ARCH process), surveys and research have been performed. This paper summarizes the achievements in fiscal 1999. In the survey on social acceptability, the future trends in demand, supply and price of LNG, LPG, and coal for hydrogasification were surveyed. As a result, it was found that the price of LNG imported into Japan is effectively linked to the crude oil price, and that Saudi Arabia is the price leader for LPG. With respect to the survey on the possibility of international cooperation, surveys were conducted on the prospects for long-term demand and supply in China and on its natural gas resources and their demand and supply. The feasibility study estimated the product gas manufacturing cost after process improvements. In the trial calculation of the three-mode cost, it was found that, although the profit from byproducts is large, the BTX-maximized mode gives a manufacturing cost higher by as much as 2 to 3 yen per Nm3 than the other modes because of higher raw material unit consumption and higher construction cost. (NEDO)

  8. Badge Office Process Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Haurykiewicz, John Paul [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Dinehart, Timothy Grant [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Parker, Robert Young [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    The purpose of this process analysis was to analyze the Badge Offices’ current processes from a systems perspective and consider ways of pursuing objectives set forth by SEC-PS, namely increased customer flow (throughput) and reduced customer wait times. Information for the analysis was gathered for the project primarily through Badge Office Subject Matter Experts (SMEs), and in-person observation of prevailing processes. Using the information gathered, a process simulation model was constructed to represent current operations and allow assessment of potential process changes relative to factors mentioned previously. The overall purpose of the analysis was to provide SEC-PS management with information and recommendations to serve as a basis for additional focused study and areas for potential process improvements in the future.

  9. Chemical process hazards analysis

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1996-02-01

    The Office of Worker Health and Safety (EH-5) under the Assistant Secretary for the Environment, Safety and Health of the US Department of Energy (DOE) has published two handbooks for use by DOE contractors managing facilities and processes covered by the Occupational Safety and Health Administration (OSHA) Rule for Process Safety Management of Highly Hazardous Chemicals (29 CFR 1910.119), herein referred to as the PSM Rule. The PSM Rule contains an integrated set of chemical process safety management elements designed to prevent chemical releases that can lead to catastrophic fires, explosions, or toxic exposures. The purpose of the two handbooks, "Process Safety Management for Highly Hazardous Chemicals" and "Chemical Process Hazards Analysis," is to facilitate implementation of the provisions of the PSM Rule within the DOE. The purpose of this handbook, "Chemical Process Hazards Analysis," is to facilitate, within the DOE, the performance of chemical process hazards analyses (PrHAs) as required under the PSM Rule. It provides basic information for the performance of PrHAs, and should not be considered a complete resource on PrHA methods. Likewise, to determine if a facility is covered by the PSM Rule, the reader should refer to the handbook "Process Safety Management for Highly Hazardous Chemicals" (DOE-HDBK-1101-96). Promulgation of the PSM Rule has heightened the awareness of chemical safety management issues within the DOE. This handbook is intended for use by DOE facilities and processes covered by the PSM Rule to facilitate contractor implementation of the PrHA element of the PSM Rule. However, contractors whose facilities and processes are not covered by the PSM Rule may also use this handbook as a basis for conducting process hazards analyses as part of their good management practices. This handbook explains the minimum requirements for PrHAs outlined in the PSM Rule. Nowhere have requirements been added beyond what is specifically required by the rule.

  10. Maintenance Process Strategic Analysis

    Science.gov (United States)

    Jasiulewicz-Kaczmarek, M.; Stachowiak, A.

    2016-08-01

    The performance and competitiveness of manufacturing companies is dependent on the availability, reliability and productivity of their production facilities. Low productivity, downtime, and poor machine performance are often linked to inadequate plant maintenance, which in turn can lead to reduced production levels, increasing costs, lost market opportunities, and lower profits. These pressures have given firms worldwide the motivation to explore and embrace proactive maintenance strategies over the traditional reactive firefighting methods. The traditional view of maintenance has shifted into an overall view that encompasses Overall Equipment Efficiency, Stakeholder Management and Life Cycle Assessment. From a practical point of view, this requires changes in managers' approach to maintenance and in the actions performed within the maintenance area. Managers have to understand that maintenance is not only about repairing and conserving machines and devices, but also about striving for more efficient resource management and caring for the safety and health of employees. The purpose of this work is to present a strategic analysis based on SWOT analysis to identify the opportunities and strengths of the maintenance process, so as to benefit from them as much as possible, as well as to identify weaknesses and threats, so that they can be eliminated or minimized.

  11. Dynamic analysis of process reactors

    Energy Technology Data Exchange (ETDEWEB)

    Shadle, L.J.; Lawson, L.O.; Noel, S.D.

    1995-06-01

    The approach and methodology of conducting a dynamic analysis is presented in this poster session in order to describe how this type of analysis can be used to evaluate the operation and control of process reactors. Dynamic analysis of the PyGas™ gasification process is used to illustrate the utility of this approach. PyGas™ is the gasifier being developed for the Gasification Product Improvement Facility (GPIF) by Jacobs-Siffine Engineering and Riley Stoker. In the first step of the analysis, process models are used to calculate the steady-state conditions and associated sensitivities for the process. For the PyGas™ gasifier, the process models are non-linear mechanistic models of the jetting fluidized-bed pyrolyzer and the fixed-bed gasifier. These process sensitivities are key inputs, in the form of gain parameters or transfer functions, to the dynamic engineering models.

  12. Semantic multimedia analysis and processing

    CERN Document Server

    Spyrou, Evaggelos; Mylonas, Phivos

    2014-01-01

    Broad in scope, Semantic Multimedia Analysis and Processing provides a complete reference of techniques, algorithms, and solutions for the design and the implementation of contemporary multimedia systems. Offering a balanced, global look at the latest advances in semantic indexing, retrieval, analysis, and processing of multimedia, the book features the contributions of renowned researchers from around the world. Its contents are based on four fundamental thematic pillars: 1) information and content retrieval, 2) semantic knowledge exploitation paradigms, 3) multimedia personalization, and 4)

  13. Fourier analysis and stochastic processes

    CERN Document Server

    Brémaud, Pierre

    2014-01-01

    This work is unique as it provides a uniform treatment of the Fourier theories of functions (Fourier transforms and series, z-transforms), finite measures (characteristic functions, convergence in distribution), and stochastic processes (including ARMA series and point processes). It emphasises the links between these three themes. The chapter on the Fourier theory of point processes and signals structured by point processes is a novel addition to the literature on Fourier analysis of stochastic processes. It also connects the theory with recent lines of research such as biological spike signals and ultrawide-band communications. Although the treatment is mathematically rigorous, the convivial style makes the book accessible to a large audience. In particular, it will be interesting to anyone working in electrical engineering and communications, biology (point process signals) and econometrics (ARMA models). A careful review of the prerequisites (integration and probability theory in the appendix, Hilbert spa...

  14. Command Process Modeling & Risk Analysis

    Science.gov (United States)

    Meshkat, Leila

    2011-01-01

    Commanding errors may be caused by a variety of root causes. It is important to understand the relative significance of each of these causes for making institutional investment decisions. One of these causes is the lack of standardized processes and procedures for command and control. We mitigate this problem by building periodic tables and models corresponding to key functions within the command and control process. These models include simulation analysis and probabilistic risk assessment models.

  15. Preliminary hazards analysis -- vitrification process

    Energy Technology Data Exchange (ETDEWEB)

    Coordes, D.; Ruggieri, M.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1994-06-01

    This paper presents a Preliminary Hazards Analysis (PHA) for mixed waste vitrification by joule heating. The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title 1 and 2 design. The PSAR then leads to performance of the Final Safety Analysis Report performed during the facility's construction and testing. It should be completed before routine operation of the facility commences. This PHA addresses the first four chapters of the safety analysis process, in accordance with the requirements of DOE Safety Guidelines in SG 830.110. The hazards associated with vitrification processes are evaluated using standard safety analysis methods which include: identification of credible potential hazardous energy sources; identification of preventative features of the facility or system; identification of mitigative features; and analyses of credible hazards. Maximal facility inventories of radioactive and hazardous materials are postulated to evaluate worst case accident consequences. These inventories were based on DOE-STD-1027-92 guidance and the surrogate waste streams defined by Mayberry et al. Radiological assessments indicate that a facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous materials assessment indicates that a Mixed Waste Vitrification facility will be a Low Hazard facility having minimal impacts to offsite personnel and the environment.

  16. RISK ANALYSIS IN MILK PROCESSING

    Directory of Open Access Journals (Sweden)

    I. PIRVUTOIU

    2008-05-01

    This paper aimed to evaluate bankruptcy risk using the "score method" based on Conan and Holder's model. The data were collected from the balance sheet and profit and loss account for the period 2005-2007, recorded by a meat processing plant (Rador Commercial Company). The study has put in evidence the financial situation of the company and the levels of the main financial ratios underlying the calculation of the Z score function value in the three years. The low values of the Z score function recorded every year reflect that the company is still facing bankruptcy. However, the worst situation was recorded in the years 2005 and 2006, when bankruptcy risk ranged between 70 and 80%. In the year 2007, the bankruptcy risk was lower, ranging between 50 and 70%, as the Z function recorded a value lower than 4. For meat processing companies such an analysis is compulsory at present, as the business environment is very risky in our country.

  17. Experimental analysis of armouring process

    Science.gov (United States)

    Lamberti, Alberto; Paris, Ennio

    Preliminary results from an experimental investigation on armouring processes are presented. In particular, the process of development and formation of the armour layer under different steady flow conditions has been analyzed in terms of grain size variations and the sediment transport rate associated with each size fraction.

  18. Analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    To enable the development of automated support for design, a challenge is to model and analyse dynamics of design processes in a formal manner. This paper contributes a declarative, logical approach for specification of dynamic properties of design processes, supported by a formal temporal

  19. Analysis of Hospital Processes with Process Mining Techniques.

    Science.gov (United States)

    Orellana García, Arturo; Pérez Alfonso, Damián; Larrea Armenteros, Osvaldo Ulises

    2015-01-01

    Process mining allows for discovering, monitoring and improving processes identified in information systems from their event logs. In hospital environments, process analysis has been a crucial factor for cost reduction, control and proper use of resources, better patient care, and achieving service excellence. This paper presents a new component for event log generation in the Hospital Information System (HIS) developed at the University of Informatics Sciences. The event logs obtained are used for analysis of hospital processes with process mining techniques. The proposed solution aims to generate high-quality event logs in the system. The analyses performed allowed functions in the system to be redefined and a proper flow of information to be proposed. The study exposed the need to incorporate process mining techniques in hospital systems to analyze process execution. Moreover, we illustrate its application for making clinical and administrative decisions for the management of hospital activities.

  20. Formal analysis of design process dynamics

    NARCIS (Netherlands)

    Bosse, T.; Jonker, C.M.; Treur, J.

    2010-01-01

    This paper presents a formal analysis of design process dynamics. Such a formal analysis is a prerequisite to come to a formal theory of design and for the development of automated support for the dynamics of design processes. The analysis was geared toward the identification of dynamic design

  1. Group Process: A Systematic Analysis.

    Science.gov (United States)

    Roark, Albert E.; Radl, Myrna C.

    1984-01-01

    Identifies components of group process and describes leader functions. Discusses personal elements, focus of interaction/psychological distance, group development, content, quality of interaction, and self-reflective/meaning attribution, illustrated by a case study of a group of persons (N=5) arrested for drunk driving. (JAC)

  2. Process Correlation Analysis Model for Process Improvement Identification

    Directory of Open Access Journals (Sweden)

    Su-jin Choi

    2014-01-01

    software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  3. Mesh Processing in Medical Image Analysis

    DEFF Research Database (Denmark)

    The following topics are dealt with: mesh processing; medical image analysis; interactive freeform modeling; statistical shape analysis; clinical CT images; statistical surface recovery; automated segmentation; cerebral aneurysms; and real-time particle-based representation.

  4. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
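
    The central computation in such a model, finding which process elements move together across assessments, can be illustrated with a toy correlation matrix. The sketch below (Python/NumPy) is a hypothetical illustration only; the authors' actual model is built from CMMI-based assessment results and empirical improvement data:

        import numpy as np

        # Hypothetical assessment scores (rows: process elements, columns: projects).
        elements = ["requirements mgmt", "project planning", "verification"]
        scores = np.array([
            [2.0, 3.0, 3.5, 4.0],
            [2.5, 3.0, 3.0, 4.5],
            [1.0, 2.0, 2.5, 3.0],
        ])

        # Pairwise Pearson correlations between process elements across projects.
        corr = np.corrcoef(scores)
        for i in range(len(elements)):
            for j in range(i + 1, len(elements)):
                print(f"{elements[i]} <-> {elements[j]}: r = {corr[i, j]:.2f}")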

  5. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  6. Alternative process schemes for coal conversion. Progress report No. 1, October 1, 1978--January 31, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Sansone, M.J.

    1979-02-01

    On the basis of simple, first-approximation calculations, it has been shown that catalytic gasification and hydrogasification are inherently superior to conventional gasification with respect to carbon utilization and thermal efficiency. However, most processes which are directed toward the production of substitute natural gas (SNG) by direct combination of coal with steam at low temperatures (catalytic processes) or with hydrogen (hydrogasification) will require a step for separation of product SNG from a recycle stream. The success or failure of the process could well depend upon the economics of this separation scheme. The energetics of the separation of mixtures of ideal gases have been considered in some detail. Minimum energies for complete separation of representative effluent mixtures have been calculated, as well as energies for separation into product and recycle streams. The gas mixtures include binary systems of H2 and CH4 and ternary mixtures of H2, CH4, and CO. A brief summary of a number of different real separation schemes has also been included. We have arbitrarily divided these into five categories: liquefaction, absorption, adsorption, chemical, and diffusional methods. These separation methods will be screened and the more promising methods examined in more detail in later reports. Finally, a brief mention of alternative coal conversion processes concludes this report.
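
    For ideal gases, the minimum (reversible, isothermal) work to fully separate a mixture is the negative of the free energy of mixing, W_min = -RT * sum(x_i ln x_i) per mole of mixture. The sketch below (Python) evaluates this for an assumed 60/40 H2/CH4 recycle stream at 600 K; the composition and temperature are illustrative examples, not figures from the report:

        import math

        R = 8.314  # gas constant, J/(mol K)

        def w_min_per_mole(x, T):
            """Minimum isothermal work to fully separate an ideal-gas mixture
            with mole fractions x at temperature T, in J per mole of mixture."""
            return -R * T * sum(xi * math.log(xi) for xi in x if xi > 0)

        # Assumed example: 60/40 H2/CH4 at 600 K.
        print(f"W_min = {w_min_per_mole([0.6, 0.4], 600):.0f} J/mol")  # ~3360 J/mol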

  7. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Waagepetersen, Rasmus Plenge

    2005-01-01

    Discussion of the paper "Residual analysis for spatial point processes" by A. Baddeley, M. Hazelton, J. Møller and R. Turner. Journal of the Royal Statistical Society, Series B, vol. 67, pages 617-666, 2005.

  8. Artificial intelligence applied to process signal analysis

    Science.gov (United States)

    Corsberg, Dan

    1988-01-01

    Many space station processes are highly complex systems subject to sudden, major transients. In any complex process control system, a critical aspect of the human/machine interface is the analysis and display of process information. Human operators can be overwhelmed by large clusters of alarms that inhibit their ability to diagnose and respond to a disturbance. Using artificial intelligence techniques and a knowledge base approach to this problem, the power of the computer can be used to filter and analyze plant sensor data. This will provide operators with a better description of the process state. Once a process state is recognized, automatic action could be initiated and proper system response monitored.

  9. Radar signal analysis and processing using Matlab

    CERN Document Server

    Mahafza, Bassem R

    2008-01-01

    Offering radar-related software for the analysis and design of radar waveform and signal processing, this book provides comprehensive coverage of radar signals and signal processing techniques and algorithms. It contains numerous graphical plots, common radar-related functions, table format outputs, and end-of-chapter problems. The complete set of MATLAB® functions and routines are available for download online.

  10. Vygotsky's Analysis of Children's Meaning Making Processes

    Science.gov (United States)

    Mahn, Holbrook

    2012-01-01

    Vygotsky's work is extensive and covers many aspects of the development of children's meaning-making processes in social and cultural contexts. However, his main focus is on the examination of the unification of speaking and thinking processes. His investigation centers on the analysis of the entity created by this unification--an internal…

  11. IRB Process Improvements: A Machine Learning Analysis.

    Science.gov (United States)

    Shoenbill, Kimberly; Song, Yiqiang; Cobb, Nichelle L; Drezner, Marc K; Mendonca, Eneida A

    2017-06-01

    Clinical research involving humans is critically important, but it is a lengthy and expensive process. Most studies require institutional review board (IRB) approval. Our objective is to identify predictors of delays or accelerations in the IRB review process and apply this knowledge to inform process change in an effort to improve IRB efficiency, transparency, consistency and communication. We analyzed timelines of protocol submissions to determine protocol or IRB characteristics associated with different processing times. Our evaluation included single-variable analysis to identify significant predictors of IRB processing time and machine learning methods to predict processing times through the IRB review system. Based on initially identified predictors, changes to IRB workflow and staffing procedures were instituted and we repeated our analysis. Our analysis identified several predictors of delays in the IRB review process, including the type of IRB review to be conducted, whether a protocol falls under Veterans Administration purview, and the specific staff in charge of a protocol's review. We have identified several predictors of delays in IRB protocol review processing times using statistical and machine learning methods. Application of this knowledge to process improvement efforts in two IRBs has led to increased efficiency in protocol review. The workflow and system enhancements that are being made support our four-part goal of improving IRB efficiency, consistency, transparency, and communication.
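
    A minimal sketch of the kind of predictive step described, fitting a learner to protocol attributes and inspecting which ones drive processing time, might look as follows (Python/scikit-learn). The feature names and data are hypothetical, and the paper does not state that this particular learner was used:

        import numpy as np
        from sklearn.tree import DecisionTreeRegressor

        # Hypothetical protocols: [review_type, va_purview, staff_id] encoded as
        # integers; the target is IRB processing time in days.
        X = np.array([[0, 0, 1], [1, 0, 2], [1, 1, 2], [0, 1, 3],
                      [2, 0, 1], [2, 1, 3], [0, 0, 2], [1, 0, 3]])
        y = np.array([14, 45, 60, 30, 90, 120, 12, 50])

        model = DecisionTreeRegressor(max_depth=3, random_state=0).fit(X, y)

        # Feature importances hint at which attributes predict delays.
        for name, imp in zip(["review_type", "va_purview", "staff_id"],
                             model.feature_importances_):
            print(f"{name}: importance {imp:.2f}")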

  12. Profitability Analysis of Soybean Oil Processes

    Directory of Open Access Journals (Sweden)

    Ming-Hsun Cheng

    2017-10-01

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.

  13. Profitability Analysis of Soybean Oil Processes.

    Science.gov (United States)

    Cheng, Ming-Hsun; Rosentrater, Kurt A

    2017-10-07

    Soybean oil production is the basic process for soybean applications. Cash flow analysis is used to estimate the profitability of a manufacturing venture. Besides capital investments, operating costs, and revenues, the interest rate is the factor used to estimate the net present value (NPV), break-even points, and payback time, which are benchmarks for profitability evaluation. A positive NPV and reasonable payback time represent a profitable process and provide an acceptable projection for real operation. Additionally, the capacity of the process is another critical factor. The extruding-expelling process and hexane extraction are the two typical approaches used in industry. When the capacities of annual oil production are larger than 12 and 173 million kg respectively, these two processes are profitable. The solvent-free approach, known as the enzyme assisted aqueous extraction process (EAEP), is profitable when the capacity is larger than 17 million kg of annual oil production.
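
    The benchmarks named in these two records (NPV, break-even, payback time) reduce to a short discounted-cash-flow computation. The sketch below (Python) uses made-up cash flows and a 10% interest rate as assumptions; it is not drawn from the papers' data:

        def npv(rate, cashflows):
            """Net present value; cashflows[0] is the year-0 investment (negative)."""
            return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

        def payback_years(cashflows):
            """First year in which cumulative undiscounted cash flow turns positive."""
            total = 0.0
            for t, cf in enumerate(cashflows):
                total += cf
                if total >= 0:
                    return t
            return None  # never pays back within the horizon

        # Assumed example: a $50M plant earning $9M net per year for 10 years.
        flows = [-50e6] + [9e6] * 10
        print(f"NPV at 10%: ${npv(0.10, flows) / 1e6:.1f}M")  # positive -> profitable
        print(f"payback: {payback_years(flows)} years")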

  14. Flux Analysis in Process Models via Causality

    Directory of Open Access Journals (Sweden)

    Ozan Kahramanoğulları

    2010-02-01

    We present an approach for flux analysis in process algebra models of biological systems. We perceive flux as the flow of resources in stochastic simulations. We resort to an established correspondence between event structures, a broadly recognised model of concurrency, and state transitions of process models, seen as Petri nets. We show that in this way we can extract the causal resource dependencies between individual state transitions in simulations as partial orders of events. We propose transformations on the partial orders that provide means for further analysis, and introduce a software tool which implements these ideas. By means of an example of a published model of the Rho GTP-binding proteins, we argue that this approach can provide a substitute for flux analysis techniques on ordinary differential equation models within the stochastic setting of process algebras.

  15. Systemic analysis of the caulking assembly process

    Directory of Open Access Journals (Sweden)

    Rodean Claudiu

    2017-01-01

    The present paper highlights the importance of the caulking process, which is nowadays little studied in comparison with the growth of its usage in the automotive industry. Because the caulking operation is used in domains of high importance such as shock absorbers and brake systems, this paper details the parameters that characterize the process, viewed as input data and output data, and the requirements imposed on the final product. The paper presents the actual measurement methods used to analyse the performance of the caulking assembly. All these parameters lead to a performance analysis algorithm established for the caulking process, which is used later in the paper for experimental research. The study is a basis for further research aimed at optimizing subsequent processing.

  16. Sneak analysis applied to process systems

    Science.gov (United States)

    Whetton, Cris

    Traditional safety analyses, such as HAZOP, FMEA, FTA, and MORT, are less than effective at identifying hazards resulting from incorrect 'flow' - whether this be flow of information, actions, electric current, or even the literal flow of process fluids. Sneak Analysis (SA) has existed since the mid nineteen-seventies as a means of identifying such conditions in electric circuits; in which area, it is usually known as Sneak Circuit Analysis (SCA). This paper extends the ideas of Sneak Circuit Analysis to a general method of Sneak Analysis applied to process plant. The methods of SA attempt to capitalize on previous work in the electrical field by first producing a pseudo-electrical analog of the process and then analyzing the analog by the existing techniques of SCA, supplemented by some additional rules and clues specific to processes. The SA method is not intended to replace any existing method of safety analysis; instead, it is intended to supplement such techniques as HAZOP and FMEA by providing systematic procedures for the identification of a class of potential problems which are not well covered by any other method.

  17. Applications of random process excursion analysis

    CERN Document Server

    Brainina, Irina S

    2013-01-01

    This book addresses one of the key problems in signal processing, the problem of identifying statistical properties of excursions in a random process in order to simplify the theoretical analysis and make it suitable for engineering applications. Precise and approximate formulas are explained, which are relatively simple and can be used for engineering applications such as the design of devices which can overcome the high initial uncertainty of the self-training period. The information presented in the monograph can be used to implement adaptive signal processing devices capable of d

  18. Analysis and evaluation of collaborative modeling processes

    NARCIS (Netherlands)

    Ssebuggwawo, D.

    2012-01-01

    Analysis and evaluation of collaborative modeling processes is confronted with many challenges. On the one hand, many systems design and re-engineering projects require collaborative modeling approaches that can enhance their productivity. But, such collaborative efforts, which often consist of the

  19. Exergy analysis of nutrient recovery processes.

    Science.gov (United States)

    Hellström, D

    2003-01-01

    In an exergy analysis, the actual consumption of resources in physical and chemical processes is calculated. Energy and chemical elements are not consumed in the processes--they are only transformed into other forms with lower quality. The principles of exergy analysis are illustrated by comparing different wastewater treatment systems for nutrient recovery. One system represents an end-of-pipe structure, whereas other systems include source separation of grey water, black water, and urine. The exergy flows analysed in this paper are those related to management and treatment of organic matter and nutrients. The study shows that the total exergy consumption is lowest for the system with source separation of urine and faeces and greatest for the conventional wastewater treatment system complemented by processes for nutrient recovery.
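
    For the physical (as opposed to chemical) part of such an analysis, the specific flow exergy of a stream relative to a reference environment at T0 is ex = (h - h0) - T0*(s - s0). A minimal sketch (Python; the enthalpy and entropy values are made up for illustration and are not from the paper):

        def flow_exergy(h, s, h0, s0, T0):
            """Specific flow exergy in kJ/kg, given enthalpy h (kJ/kg),
            entropy s (kJ/(kg K)) and dead-state values h0, s0 at T0 (K)."""
            return (h - h0) - T0 * (s - s0)

        # Made-up stream and dead-state properties for illustration.
        ex = flow_exergy(h=2800.0, s=6.5, h0=105.0, s0=0.37, T0=298.15)
        print(f"specific flow exergy: {ex:.0f} kJ/kg")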

  20. Decision analysis in the formulary process.

    Science.gov (United States)

    Kessler, J M

    1997-11-15

    The use of decision analysis as a tool in making formulary decisions is discussed. Decision analysis is best applied in formulary decisions when factors other than acquisition costs are important in determining overall treatment costs for two products. The decision-analysis process assigns probabilities and costs to various treatments and outcomes. In the case of acute myocardial infarction, the decision analyst would gather data on angioplasty and thrombolysis and assign probabilities and costs for each treatment and subsequent endpoints on the basis of clinical trial data. When such data do not exist, estimates may be generated by expert panels. Applying clinical trial data to an individual hospital is not straightforward because of differences between clinical trials and clinical practice. Analysts and clinicians should evaluate any proposed model for its robustness and adaptability to local conditions and practitioner variation. Access to internal hospital data is essential in developing the model. An ideal decision-analysis model includes all important available interventions and defines and discloses the analyst's time frame and financial perspective. After implementation of the formulary decision, the results can be monitored and, if necessary, adjustments can be made in the allocation of resources. Barriers to effective decision analysis include lack of data and differences in sources of cost and outcome data. Despite the current limitations of decision analysis, clinicians and policymakers may find this technique increasingly useful in the complex formulary process.
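
    At its core the method rolls probabilities and costs back through a tree of treatments and outcomes to obtain an expected cost per strategy. The sketch below (Python) uses entirely hypothetical probabilities and costs for the angioplasty-versus-thrombolysis example mentioned above:

        # Expected-cost roll-back for a two-arm treatment decision.
        # All probabilities and costs are hypothetical placeholders.
        arms = {
            "thrombolysis": [
                # (probability of outcome, total cost of that pathway in $)
                (0.80, 12_000),  # uncomplicated recovery
                (0.15, 30_000),  # reinfarction, rescue procedure needed
                (0.05, 55_000),  # major complication
            ],
            "angioplasty": [
                (0.90, 20_000),
                (0.07, 35_000),
                (0.03, 60_000),
            ],
        }

        for arm, outcomes in arms.items():
            assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
            expected = sum(p * c for p, c in outcomes)
            print(f"{arm}: expected cost ${expected:,.0f}")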

  1. Integrating human factors into process hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kariuki, S.G. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany); Loewe, K. [Technische Universitaet Berlin, Institute of Process and Plant Technology, Sekr. TK0-1, Strasse des 17. Juni 135, 10623 Berlin (Germany)]. E-mail: katharina.loewe@tu-berlin.de

    2007-12-15

    A comprehensive process hazard analysis (PHA) needs to address human factors. This paper describes an approach that systematically identifies human error in process design and the human factors that influence its production and propagation. It is deductive in nature and therefore considers human error as a top event. The combinations of different factors that may lead to this top event are analysed. It is qualitative in nature and is used in combination with other PHA methods. The method has an advantage because it does not look at the operator error as the sole contributor to the human failure within a system but a combination of all underlying factors.

  2. Neyman, Markov processes and survival analysis.

    Science.gov (United States)

    Yang, Grace

    2013-07-01

    J. Neyman used stochastic processes extensively in his applied work. One example is the Fix and Neyman (F-N) competing risks model (1951) that uses finite homogeneous Markov processes to analyse clinical trials with breast cancer patients. We revisit the F-N model, and compare it with the Kaplan-Meier (K-M) formulation for right censored data. The comparison offers a way to generalize the K-M formulation to include risks of recovery and relapses in the calculation of a patient's survival probability. The generalization is to extend the F-N model to a nonhomogeneous Markov process. Closed-form solutions of the survival probability are available in special cases of the nonhomogeneous processes, like the popular multiple decrement model (including the K-M model) and Chiang's staging model, but these models do not consider recovery and relapses while the F-N model does. An analysis of sero-epidemiology current status data with recurrent events is illustrated. Fix and Neyman used Neyman's RBAN (regular best asymptotic normal) estimates for the risks, and provided a numerical example showing the importance of considering both the survival probability and the length of time a patient lives a normal life in the evaluation of clinical trials. The said extension would result in a complicated model, and analytical closed-form solutions for survival analysis are unlikely to be found. With ever increasing computing power, numerical methods offer a viable way of investigating the problem.
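
    The homogeneous-Markov core of the F-N model is compact: with a generator matrix Q over health states, state-occupancy probabilities at time t are P(t) = exp(Qt). The sketch below (Python/SciPy) uses an invented three-state generator whose ill-to-healthy rate is the "recovery" transition that plain multiple-decrement models omit:

        import numpy as np
        from scipy.linalg import expm

        # Invented transition rates per year over [healthy, ill, dead]; each row
        # sums to zero, and the dead state is absorbing.
        Q = np.array([
            [-0.30,  0.25, 0.05],
            [ 0.40, -0.60, 0.20],   # 0.40 is the recovery rate (ill -> healthy)
            [ 0.00,  0.00, 0.00],
        ])

        P5 = expm(Q * 5.0)  # transition probability matrix over 5 years
        # Survival for a patient starting healthy: 1 - P(healthy -> dead).
        print(f"5-year survival from 'healthy': {1 - P5[0, 2]:.3f}")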

  3. Automatic processing, analysis, and recognition of images

    Science.gov (United States)

    Abrukov, Victor S.; Smirnov, Evgeniy V.; Ivanov, Dmitriy G.

    2004-11-01

    New approaches and computer codes (A&CC) for automatic processing, analysis and recognition of images are offered. The A&CC are based on presenting an object image as a collection of pixels of various colours and consecutively, automatically painting distinguished parts of the image. The A&CC have technical objectives centred on such directions as: 1) image processing, 2) image feature extraction, 3) image analysis, and some others, in any sequence and combination. The A&CC allow various geometrical and statistical parameters of an object image and its parts to be obtained. Additional possibilities of A&CC usage involve artificial neural network technologies. We believe that the A&CC can be used in creating systems of testing and control in various fields of industry and military applications (airborne imaging systems, tracking of moving objects), in medical diagnostics, in creating new software for CCDs, in industrial vision and decision-making systems, etc. The capabilities of the A&CC have been tested in image analysis of model fires and plumes of sprayed fluid and ensembles of particles, in decoding interferometric images, in digitizing paper diagrams of electrical signals, in text recognition, in noise elimination and image filtering, in analysis of astronomical images and aerial photography, and in object detection.

  4. Preliminary Hazards Analysis Plasma Hearth Process

    Energy Technology Data Exchange (ETDEWEB)

    Aycock, M.; Coordes, D.; Russell, J.; TenBrook, W.; Yimbo, P. [Science Applications International Corp., Pleasanton, CA (United States)

    1993-11-01

    This Preliminary Hazards Analysis (PHA) for the Plasma Hearth Process (PHP) follows the requirements of United States Department of Energy (DOE) Order 5480.23 (DOE, 1992a), DOE Order 5480.21 (DOE, 1991d), DOE Order 5480.22 (DOE, 1992c), DOE Order 5481.1B (DOE, 1986), and the guidance provided in DOE Standard DOE-STD-1027-92 (DOE, 1992b). Consideration is given to the proposed regulations published as 10 CFR 830 (DOE, 1993) and DOE Safety Guide SG 830.110 (DOE, 1992b). The purpose of performing a PHA is to establish an initial hazard categorization for a DOE nuclear facility and to identify those processes and structures which may have an impact on or be important to safety. The PHA is typically performed during and provides input to project conceptual design. The PHA is then followed by a Preliminary Safety Analysis Report (PSAR) performed during Title I and II design. This PSAR then leads to performance of the Final Safety Analysis Report performed during construction, testing, and acceptance and completed before routine operation. Radiological assessments indicate that a PHP facility, depending on the radioactive material inventory, may be an exempt, Category 3, or Category 2 facility. The calculated impacts would result in no significant impact to offsite personnel or the environment. Hazardous material assessments indicate that a PHP facility will be a Low Hazard facility having no significant impacts either onsite or offsite to personnel and the environment.

  5. Warpage analysis in injection moulding process

    Science.gov (United States)

    Hidayah, M. H. N.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    This study concentrated on the effects of process parameters in the plastic injection moulding process on the warpage problem, using Autodesk Moldflow Insight (AMI) software for the simulation. In this study, a plastic dental floss dispenser was analysed, with the thermoplastic Polypropylene (PP) as the moulded material, and the detailed properties of an 80-tonne Nissei NEX 1000 injection moulding machine were used. The variable parameters of the process are packing pressure, packing time, melt temperature and cooling time. Minimization of warpage was obtained from the optimization and analysis of data in the Design Expert software. The method used in this study integrates Response Surface Methodology (RSM) and Central Composite Design (CCD) with polynomial models obtained from the Design of Experiments (DOE). The results show that packing pressure is the main factor contributing to the formation of warpage in the x-axis and y-axis, while in the z-axis the main factor is melt temperature; packing time is the least significant of the four parameters in the x, y and z-axes. With the optimal processing parameters, warpage in the x, y and z-axes was improved by 21.60%, 26.45% and 24.53%, respectively.
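
    The RSM step described above amounts to fitting a second-order polynomial to the DOE results and locating its stationary point. The one-factor sketch below (Python/NumPy) uses invented packing-pressure/warpage data, not the paper's measurements:

        import numpy as np

        # Invented DOE results: packing pressure (MPa) vs. measured warpage (mm).
        pressure = np.array([60.0, 70.0, 80.0, 90.0, 100.0])
        warpage = np.array([0.42, 0.31, 0.27, 0.29, 0.38])

        # Fit a quadratic response surface W(p) = c2*p^2 + c1*p + c0.
        c2, c1, c0 = np.polyfit(pressure, warpage, deg=2)

        # Stationary point: dW/dp = 0 gives p* = -c1 / (2*c2).
        p_opt = -c1 / (2 * c2)
        print(f"predicted warpage-minimizing packing pressure: {p_opt:.1f} MPa")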

  6. Mathematical Analysis and Optimization of Infiltration Processes

    Science.gov (United States)

    Chang, H.-C.; Gottlieb, D.; Marion, M.; Sheldon, B. W.

    1997-01-01

    A variety of infiltration techniques can be used to fabricate solid materials, particularly composites. In general these processes can be described with at least one time-dependent partial differential equation describing the evolution of the solid phase, coupled to one or more partial differential equations describing mass transport through a porous structure. This paper presents a detailed mathematical analysis of a relatively simple set of equations which is used to describe chemical vapor infiltration. The results demonstrate that the process is controlled by only two parameters, alpha and beta. The optimization problem associated with minimizing the infiltration time is also considered. Allowing alpha and beta to vary with time leads to significant reductions in the infiltration time, compared with the conventional case where alpha and beta are treated as constants.

  7. Sustainable process design & analysis of hybrid separations

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Befort, Bridgette; Garg, Nipun

    2016-01-01

    Distillation is an energy intensive operation in chemical process industries. There are around 40,000 distillation columns in operation in the US, requiring approximately 40% of the total energy consumption in US chemical process industries. However, analysis of separations by distillation has shown that more than 50% of energy is spent in purifying the last 5-10% of the distillate product. Membrane modules on the other hand can achieve high purity separations at lower energy costs, but if the flux is high, it requires large membrane area. A hybrid scheme where distillation and membrane modules are combined such that each operates at its highest efficiency, has the potential for significant energy reduction without significant increase of capital costs. This paper presents a method for sustainable design of hybrid distillation-membrane schemes with guaranteed reduction of energy.

  8. Mathematical foundations of image processing and analysis

    CERN Document Server

    Pinoli, Jean-Charles

    2014-01-01

    Mathematical Imaging is currently a rapidly growing field in applied mathematics, with an increasing need for theoretical mathematics. This book, the second of two volumes, emphasizes the role of mathematics as a rigorous basis for imaging sciences. It provides a comprehensive and convenient overview of the key mathematical concepts, notions, tools and frameworks involved in the various fields of gray-tone and binary image processing and analysis, by proposing a large, but coherent, set of symbols and notations, a complete list of subjects and a detailed bibliography. It establishes a bridg

  9. Auxetic polyurethane foam: Manufacturing and processing analysis

    Science.gov (United States)

    Jahan, Md Deloyer

    experimental design approach to identify significant processing parameters, followed by optimization of those processing parameters in the fabrication of auxetic PU foam. A split-plot factorial design was selected for screening purposes. Response Surface Methodology (RSM) was utilized to optimize the processing parameters in the fabrication of auxetic PU foam. Two different designs, Box-Behnken and I-optimal, were employed for this analysis. The results obtained with these designs show that the I-optimal design provides more accurate and realistic results than the Box-Behnken design when experiments are performed in a split-plot manner. Finally, a near-stationary ridge system is obtained by optimization analysis. As a result, a set of operating conditions is obtained that produces a similar minimum Poisson's ratio in auxetic PU foam.

  10. Quantum Chemical Strain Analysis For Mechanochemical Processes.

    Science.gov (United States)

    Stauch, Tim; Dreuw, Andreas

    2017-04-18

    The use of mechanical force to initiate a chemical reaction is an efficient alternative to the conventional sources of activation energy, i.e., heat, light, and electricity. Applications of mechanochemistry in academic and industrial laboratories are diverse, ranging from chemical syntheses in ball mills and ultrasound baths to direct activation of covalent bonds using an atomic force microscope. The vectorial nature of force is advantageous because specific covalent bonds can be preconditioned for rupture by selective stretching. However, the influence of mechanical force on single molecules is still not understood at a fundamental level, which limits the applicability of mechanochemistry. As a result, many chemists still resort to rules of thumb when it comes to conducting mechanochemical syntheses. In this Account, we show that comprehension of mechanochemistry at the molecular level can be tremendously advanced by quantum chemistry, in particular by using quantum chemical force analysis tools. One such tool is the JEDI (Judgement of Energy DIstribution) analysis, which provides a convenient approach to analyze the distribution of strain energy in a mechanically deformed molecule. Based on the harmonic approximation, the strain energy contribution is calculated for each bond length, bond angle and dihedral angle, thus providing a comprehensive picture of how force affects molecules. This Account examines the theoretical foundations of quantum chemical force analysis and provides a critical overview of the performance of the JEDI analysis in various mechanochemical applications. We explain in detail how this analysis tool is to be used to identify the "force-bearing scaffold" of a distorted molecule, which allows both the rationalization and the optimization of diverse mechanochemical processes. More precisely, we show that the inclusion of every bond, bending and torsion of a molecule allows a particularly insightful discussion of the distribution of mechanical
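
    Under the harmonic approximation mentioned above, the strain energy stored in each internal coordinate is E_i = 1/2 * k_i * (dq_i)^2. The sketch below (Python) tabulates such a per-coordinate breakdown with invented force constants and displacements; it is an illustration of the idea, not the actual JEDI implementation:

        # Harmonic per-coordinate strain energies, E_i = 0.5 * k_i * dq_i**2.
        # Labels, force constants and displacements are invented (atomic units).
        coords = [
            ("C1-C2 stretch", 0.35, 0.08),
            ("C2-C3 stretch", 0.33, 0.02),
            ("C1-C2-C3 bend", 0.10, 0.05),
        ]

        energies = [(label, 0.5 * k * dq ** 2) for label, k, dq in coords]
        total = sum(e for _, e in energies)

        # The largest share identifies the "force-bearing scaffold".
        for label, e in energies:
            print(f"{label}: {e:.5f} Ha ({100 * e / total:.0f}% of total strain)")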

  11. Multivariate Analysis for the Processing of Signals

    Directory of Open Access Journals (Sweden)

    Beattie J.R.

    2014-01-01

    Full Text Available Real-world experiments are becoming increasingly complex, needing techniques capable of tracking this complexity. Signal-based measurements are often used to capture this complexity, where a signal is a record of a sample's response to a parameter (e.g. time, displacement, voltage, wavelength) that is varied over a range of values. In signals, the responses at each value of the varied parameter are related to each other, depending on the composition or state of the sample being measured. Since signals contain multiple information points, they have rich information content but are generally complex to comprehend. Multivariate Analysis (MA) has profoundly transformed their analysis by allowing gross simplification of the tangled web of variation. In addition, MA is much more robust to the influence of noise than univariate methods of analysis. In recent years, there has been a growing awareness that the nature of multivariate methods allows their benefits to be exploited for purposes other than data analysis, such as pre-processing of signals with the aim of eliminating irrelevant variations prior to analysis of the signal of interest. It has been shown that exploiting multivariate data reduction in an appropriate way can allow high-fidelity denoising (removal of irreproducible non-signals), consistent and reproducible noise-insensitive correction of baseline distortions (removal of reproducible non-signals), accurate elimination of interfering signals (removal of reproducible but unwanted signals) and the standardisation of signal amplitude fluctuations. At present, the field is relatively small but the possibilities for much wider application are considerable. Where signal properties are suitable for MA (such as the signal being stationary along the x-axis), these signal-based corrections have the potential to be highly reproducible and highly adaptable, and are applicable in situations where the data is noisy …
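
    A minimal sketch of the multivariate denoising idea mentioned above: truncating a singular value decomposition of a set of replicate signals keeps the reproducible variation and discards irreproducible noise. The signal shape and noise level are invented for illustration.

```python
# Rank-truncation denoising of replicate signals (synthetic example).
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)
clean = np.exp(-((t - 0.5) / 0.05) ** 2)                 # one reproducible peak
signals = clean + 0.2 * rng.standard_normal((50, 200))   # 50 noisy replicates

U, s, Vt = np.linalg.svd(signals, full_matrices=False)
k = 1                                   # keep only the dominant, reproducible component
denoised = U[:, :k] * s[:k] @ Vt[:k]    # low-rank reconstruction

print("residual vs. truth before:", (signals - clean).std().round(3),
      "after:", (denoised - clean).std().round(3))
```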

  12. Traffic analysis and control using image processing

    Science.gov (United States)

    Senthilkumar, K.; Ellappan, Vijayan; Arun, A. R.

    2017-11-01

    This paper describes work on traffic analysis and control to date. It presents an approach to regulating traffic using image processing and MATLAB. The idea is to compare computational images with reference images of the street in order to determine the traffic-level percentage and set the traffic-signal timing accordingly, thereby reducing stoppage at traffic lights. The concept proposes to address real-life street scenarios by enriching traffic lights with image receivers, such as HD cameras, and image processors. The input is imported into MATLAB and used to calculate the traffic on the roads. The results are then used to adjust the traffic-light timings on a particular street, in line with other similar proposals but with the added value of solving a real, large instance.
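
    A toy sketch of the image-comparison idea in this record: the fraction of pixels that differ from an empty-road reference image serves as a traffic-level percentage, which is then mapped to a green-light duration. The threshold and the timing rule below are invented, not taken from the paper.

```python
# Reference-image comparison for traffic level (thresholds are invented).
import numpy as np

def traffic_level(reference, current, threshold=30):
    """Percentage of pixels that differ from the empty-road reference frame."""
    diff = np.abs(current.astype(int) - reference.astype(int))
    return 100.0 * np.mean(diff > threshold)

def green_time(level_pct, t_min=10.0, t_max=60.0):
    """Linear mapping from traffic percentage to green-signal duration (seconds)."""
    return t_min + (t_max - t_min) * level_pct / 100.0

rng = np.random.default_rng(0)
empty = rng.integers(0, 50, (480, 640), dtype=np.uint8)  # stand-in for a camera frame
busy = empty.copy()
busy[200:300, 100:500] = 255                             # "vehicles" in the lane
print(green_time(traffic_level(empty, busy)))
```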

  13. Analysis and Optimization of Central Processing Unit Process Parameters

    Science.gov (United States)

    Kaja Bantha Navas, R.; Venkata Chaitana Vignan, Budi; Durganadh, Margani; Rama Krishna, Chunduri

    2017-05-01

    The rapid growth of computing has made it possible to process more data, which increases heat dissipation. Hence the CPU in the system unit must be cooled to stay within its operating temperature. This paper presents a novel approach for the optimization of Central Processing Unit operating parameters with a single response, based on the response graph method. The proposed approach follows a series of steps capable of decreasing the uncertainty caused by engineering judgment in the Taguchi method. Orthogonal array values were taken from an ANSYS report. The method shows good convergence between the experimental and the optimum process parameters.
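
    For context, Taguchi-style analyses of this kind typically score each orthogonal-array run with a signal-to-noise ratio. A minimal sketch for a smaller-is-better response such as CPU temperature; the readings are invented.

```python
# Smaller-is-better Taguchi S/N ratio (replicate readings are invented).
import numpy as np

def sn_smaller_is_better(y):
    """S/N = -10 log10(mean(y^2)); larger S/N means a cooler, more robust setting."""
    y = np.asarray(y, dtype=float)
    return -10.0 * np.log10(np.mean(y ** 2))

print(sn_smaller_is_better([62.1, 63.4, 61.8]))   # one orthogonal-array run
```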

  14. Quantitative Risk Analysis: Method And Process

    Directory of Open Access Journals (Sweden)

    Anass BAYAGA

    2010-03-01

    Full Text Available Recent and past studies (King III report, 2009: 73-75; Stoney, 2007; Committee of Sponsoring Organisations (COSO), 2004; Bartell, 2003; Liebenberg and Hoyt, 2003; Reason, 2000; Markowitz, 1957) lament that, although the introduction of quantifying risk to enhance the degree of objectivity in finance, for instance, was quite parallel to its development in the manufacturing industry, the same is not true in Higher Education Institutions (HEIs). In this regard, the objective of the paper was to demonstrate the methods and process of Quantitative Risk Analysis (QRA) through the likelihood of occurrence of risk (phase I). This paper serves as the first of a two-phased study, which sampled one hundred (100) risk analysts in a university in the greater Eastern Cape Province of South Africa. The analysis of likelihood of occurrence of risk by logistic regression and percentages was conducted to investigate whether there was a significant difference between groups (analysts) in respect of QRA. The Hosmer and Lemeshow test was non-significant, with a chi-square of χ² = 8.181 and p = 0.300, which indicated a good model fit, since the data did not significantly deviate from the model. The study concluded that to derive an overall likelihood rating indicating the probability that a potential risk may be exercised within the construct of an associated threat environment, the following governing factors must be considered: (1) threat-source motivation and capability, (2) nature of the vulnerability, and (3) existence and effectiveness of current controls (methods and process).
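
    A sketch of the Hosmer-Lemeshow calculation reported above: observations are grouped by predicted probability and observed counts are compared with expected counts. The group count and the small stabilizing constant are implementation choices, not taken from the paper.

```python
# Hosmer-Lemeshow goodness-of-fit for a fitted logistic model (illustrative).
import numpy as np
from scipy.stats import chi2

def hosmer_lemeshow(y, p, g=10):
    """y: 0/1 outcomes, p: predicted probabilities, g: number of groups."""
    y, p = np.asarray(y), np.asarray(p)
    order = np.argsort(p)
    stat = 0.0
    for idx in np.array_split(order, g):
        n, obs, exp = len(idx), y[idx].sum(), p[idx].sum()
        stat += (obs - exp) ** 2 / (exp * (1.0 - exp / n) + 1e-12)
    return stat, chi2.sf(stat, df=g - 2)

rng = np.random.default_rng(0)
p = rng.uniform(0.05, 0.95, 100)          # stand-in predicted probabilities
y = rng.uniform(size=100) < p             # outcomes generated consistently with p
print(hosmer_lemeshow(y.astype(int), p))  # large p-value -> no evidence of misfit
```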

  15. Robust analysis of semiparametric renewal process models.

    Science.gov (United States)

    Lin, Feng-Chang; Truong, Young K; Fine, Jason P

    2013-09-01

    A rate model is proposed for a modulated renewal process comprising a single long sequence, where the covariate process may not capture the dependencies in the sequence as in standard intensity models. We consider partial likelihood-based inferences under a semiparametric multiplicative rate model, which has been widely studied in the context of independent and identically distributed data. Under an intensity model, gap times in a single long sequence may be used naively in the partial likelihood, with variance estimation utilizing the observed information matrix. Under a rate model, the gap times cannot be treated as independent and studying the partial likelihood is much more challenging. We employ a mixing condition in the application of limit theory for stationary sequences to obtain consistency and asymptotic normality. The estimator's variance is quite complicated owing to the unknown gap-time dependence structure. We adapt block bootstrapping and cluster variance estimators to the partial likelihood. Simulation studies and an analysis of a semiparametric extension of a popular model for neural spike train data demonstrate the practical utility of the rate approach in comparison with the intensity approach.

  16. Developing an intelligence analysis process through social network analysis

    Science.gov (United States)

    Waskiewicz, Todd; LaMonica, Peter

    2008-04-01

    Intelligence analysts are tasked with making sense of enormous amounts of data and gaining an awareness of a situation that can be acted upon. This process can be extremely difficult and time consuming. Trying to differentiate between important pieces of information and extraneous data only complicates the problem. When dealing with data containing entities and relationships, social network analysis (SNA) techniques can be employed to make this job easier. Applying network measures to social network graphs can identify the most significant nodes (entities) and edges (relationships) and help the analyst further focus on key areas of concern. Strange developed a model that identifies high value targets such as centers of gravity and critical vulnerabilities. SNA lends itself to the discovery of these high value targets and the Air Force Research Laboratory (AFRL) has investigated several network measures such as centrality, betweenness, and grouping to identify centers of gravity and critical vulnerabilities. Using these network measures, a process for the intelligence analyst has been developed to aid analysts in identifying points of tactical emphasis. Organizational Risk Analyzer (ORA) and Terrorist Modus Operandi Discovery System (TMODS) are the two applications used to compute the network measures and identify the points to be acted upon. Therefore, the result of leveraging social network analysis techniques and applications will provide the analyst and the intelligence community with more focused and concentrated analysis results allowing them to more easily exploit key attributes of a network, thus saving time, money, and manpower.
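
    A minimal sketch of the network-measure step described above, using networkx (not the ORA or TMODS tools named in the record) on an invented entity-relationship graph:

```python
# Centrality screening on an invented entity-relationship graph.
import networkx as nx

G = nx.Graph([("A", "B"), ("B", "C"), ("B", "D"), ("D", "E"), ("C", "E"), ("E", "F")])

degree = nx.degree_centrality(G)            # locally well-connected entities
betweenness = nx.betweenness_centrality(G)  # brokers between parts of the network

# rank candidate high-value nodes (centers of gravity) by betweenness
for node, score in sorted(betweenness.items(), key=lambda kv: -kv[1]):
    print(node, round(score, 3), "degree:", round(degree[node], 3))
```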

  17. Entrepreneurship Learning Process by using SWOT Analysis

    Directory of Open Access Journals (Sweden)

    Jajat Sudrajat

    2016-03-01

    Full Text Available The research objective was to produce a model of entrepreneurship learning using SWOT analysis, which was currently being run with the concept of large classes and small classes. The study was expected to be useful for the Binus Entrepreneurship Center (BEC) unit in creating a development map for entrepreneurship learning. The influence of the SWOT analysis is broad, as are the benefits of implementing large and small classes for students and faculty. Participants in this study were Binus students of various majors who were taking courses EN001 and EN002. The study used research and development, examining the theoretical components of entrepreneurship education (the teaching and learning dimension), with six survey dimensions forming the fundamental elements in determining the framework of entrepreneurship education. The research finds that the factor matrix yields at least eight strategies for improving the entrepreneurship learning process. One of these strategies is to increase BEC collaboration with family support. This strategy is supported by survey results from the three majors following EN001 and EN002, in which more than 85% of the students are willing to take an aptitude test to determine their strengths and weaknesses for self-development, and more than 54% of the students are not willing to accept their parents' wishes when these do not correspond to their own ambitions. Based on these results, it is suggested that further research develop entrepreneurship research by analyzing other dimensions.

  18. Integrated Process Design, Control and Analysis of Intensified Chemical Processes

    DEFF Research Database (Denmark)

    Mansouri, Seyed Soheil

    distillation column. Next, these design methods are extended using the element concept to also include ternary as well as multicomponent reactive distillation processes. The element concept is used to translate a ternary system of compounds (A + B ↔ C) to a binary system of elements (WA and WB). When only two elements are needed to represent the reacting system of more than two compounds, a binary element system is identified. In the case of multi-element reactive distillation processes (where more than two elements are encountered) the equivalent element concept is used to translate a multicomponent (multi-element) system of compounds (A + B ↔ C + D) to a binary system of key elements (elements WHK and WLK). For an energy-efficient design, non-reactive driving force (for binary non-reactive distillation), reactive driving force (for binary element systems) and binary-equivalent driving force (for multicomponent …

  19. Process Simulation Analysis of HF Stripping

    Directory of Open Access Journals (Sweden)

    Thaer A. Abdulla

    2013-05-01

    Full Text Available The HYSYS process simulator is used for the analysis of an existing HF stripping column in a LAB plant (Arab Detergent Company, Baiji, Iraq). Simulated column performance and profile curves are constructed. The variables considered are the thermodynamic model option, bottom temperature, feed temperature, and the column profiles for temperature, vapor flow rate, liquid flow rate and composition. The five thermodynamic model options used (Margules, UNIQUAC, van Laar, Antoine, and Zudkevitch-Joffee) affect the results within a 0.1-58% variation in most cases. The simulated results show that about 4% of the paraffin (C10 & C11) is present in the top stream, which may cause a problem in the LAB production plant. The largest variations were noticed for the total top vapor flow rate with bottom temperature and with feed composition. The column profiles remain fairly constant from tray 5 to tray 18. The study gives evidence of a successful simulation with HYSYS, because the results correspond with the real plant operating data.
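
    Of the thermodynamic options listed, the Antoine model is simple enough to sketch directly. The coefficients below are placeholders, not vetted values for any component in the HF stripping column.

```python
# Antoine vapor-pressure correlation (coefficients are illustrative placeholders).
def antoine_pressure(T, A, B, C):
    """log10(P) = A - B / (T + C); units follow the coefficient set."""
    return 10.0 ** (A - B / (T + C))

print(antoine_pressure(25.0, A=7.0, B=1200.0, C=230.0))  # e.g., T in degC, P in mmHg
```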

  20. Goals Analysis Procedure Guidelines for Applying the Goals Analysis Process

    Science.gov (United States)

    Motley, Albert E., III

    2000-01-01

    One of the key elements of successful project management is the establishment of the "right set of requirements": requirements that reflect the true customer needs and are consistent with the strategic goals and objectives of the participating organizations. A viable set of requirements implies that each individual requirement is a necessary element in satisfying the stated goals and that the entire set of requirements, taken as a whole, is sufficient to satisfy the stated goals. Unfortunately, it is the author's experience that during project formulation phases, many of the Systems Engineering customers do not conduct a rigorous analysis of the goals and objectives that drive the system requirements. As a result, the Systems Engineer is often provided with requirements that are vague, incomplete, and internally inconsistent. To complicate matters, most systems development methodologies assume that the customer provides unambiguous, comprehensive and concise requirements. This paper describes the specific steps of a Goals Analysis process applied by Systems Engineers at the NASA Langley Research Center during the formulation of requirements for research projects. The objective of Goals Analysis is to identify and explore all of the influencing factors that ultimately drive the system's requirements.

  1. Risk Analysis for Nonthermal process interventions

    Science.gov (United States)

    Over the last few years a number of nonthermal process interventions including ionizing radiation and ultraviolet light, high pressure processing, pulsed-electric and radiofrequency electric fields, microwave and infrared technologies, bacteriophages, etc. have been approved by regulatory agencies, ...

  2. Software Process Improvement Using Force Field Analysis ...

    African Journals Online (AJOL)

    Software process improvement is a necessity especially since the dynamic nature of today's hardware demands reciprocal improvements in the underlying software systems. Several process improvement models exist where organizations perform an introspective study of the current software development process and ...

  3. Second order analysis for spatial Hawkes processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Torrisi, Giovanni Luca

    We derive summary statistics for stationary Hawkes processes which can be considered as spatial versions of classical Hawkes processes. In particular, we derive the intensity, the pair correlation function and the Bartlett spectrum. Our results for Gaussian fertility rates and the extension to marked Hawkes processes are discussed.
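
    For readers unfamiliar with Hawkes processes, the sketch below simulates a one-dimensional Hawkes process with an exponential kernel by Ogata thinning. This is background material, not the spatial construction of the paper; stability requires alpha/beta < 1.

```python
# Ogata thinning for a 1-D Hawkes process with exponential kernel
# lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
import numpy as np

def simulate_hawkes(mu, alpha, beta, T, seed=0):
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        # intensity just after t upper-bounds the intensity until the next event
        lam_bar = mu + alpha * np.sum(np.exp(-beta * (t - np.asarray(events))))
        t += rng.exponential(1.0 / lam_bar)
        if t > T:
            return events
        lam_t = mu + alpha * np.sum(np.exp(-beta * (t - np.asarray(events))))
        if rng.uniform() < lam_t / lam_bar:   # accept with probability lambda(t)/bound
            events.append(t)

print(len(simulate_hawkes(mu=1.0, alpha=0.5, beta=2.0, T=100.0)))
```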

  4. The Development of Manufacturing Process Analysis: Lesson Learned from Process Mining

    Directory of Open Access Journals (Sweden)

    Bernardo Nugroho Yahya

    2014-01-01

    Full Text Available Process analysis is recognized as a major stage in business process reengineering that has developed over the last two decades. In the field of manufacturing, manufacturing process analysis (MPA) is defined as performance analysis of the production process. Performance analysis distills data and knowledge into useful forms that can be broadly applied in manufacturing sectors. Process mining, an emerging tool focusing on the process and resource perspectives, is a way to analyze systems based on event logs. The objective of this study is to extend the existing process analysis framework by considering the attribute perspective. This study also aims to draw lessons from experiences with process mining in manufacturing industries. The results will help manufacturing organizations utilize the process mining approach for analyzing their respective processes.
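
    A minimal sketch of the event-log starting point of process mining described above: from a table of (case, activity, timestamp) rows, directly-follows counts are the raw material for discovering a process model. The log below is invented.

```python
# Directly-follows counts from a (case, activity, timestamp) event log (invented).
import pandas as pd

log = pd.DataFrame({
    "case":     ["c1", "c1", "c1", "c2", "c2", "c2"],
    "activity": ["cut", "weld", "paint", "cut", "paint", "weld"],
    "time":     pd.to_datetime(["2024-01-01 08:00", "2024-01-01 09:00",
                                "2024-01-01 11:00", "2024-01-01 08:30",
                                "2024-01-01 10:00", "2024-01-01 12:00"]),
}).sort_values(["case", "time"])

log["next"] = log.groupby("case")["activity"].shift(-1)
dfg = log.dropna(subset=["next"]).groupby(["activity", "next"]).size()
print(dfg)   # edge counts of the directly-follows graph, the basis of model discovery
```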

  5. Articulating the Resources for Business Process Analysis and Design

    Science.gov (United States)

    Jin, Yulong

    2012-01-01

    Effective process analysis and modeling are important phases of the business process management lifecycle. When many activities and multiple resources are involved, it is very difficult to build a correct business process specification. This dissertation provides a resource perspective of business processes. It aims at a better process analysis…

  6. Introduction to image processing and analysis

    CERN Document Server

    Russ, John C

    2007-01-01

    ADJUSTING PIXEL VALUES Optimizing Contrast Color Correction Correcting Nonuniform Illumination Geometric Transformations Image Arithmetic NEIGHBORHOOD OPERATIONS Convolution Other Neighborhood Operations Statistical Operations IMAGE PROCESSING IN THE FOURIER DOMAIN The Fourier Transform Removing Periodic Noise Convolution and Correlation Deconvolution Other Transform Domains Compression BINARY IMAGES Thresholding Morphological Processing Other Morphological Operations Boolean Operations MEASUREMENTS Global Measurements Feature Measurements Classification APPENDIX: SOFTWARE REFERENCES AND LITERATURE INDEX.

  7. Profitability Analysis of Groundnuts Processing in Maiduguri ...

    African Journals Online (AJOL)

    This paper examines the profitability of groundnuts processing in Maiduguri Metropolitan Council of Borno State. The specific objectives of the study were to examine the socioeconomic characteristics of groundnut processors, estimate the costs and returns in groundnut processing and determine the return on investment in ...

  8. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    We define residuals for point process models fitted to spatial point pattern data, and propose diagnostic plots based on these residuals. The techniques apply to any Gibbs point process model, which may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Our … or covariate effects. Q-Q plots of the residuals are effective in diagnosing interpoint interaction. Some existing ad hoc statistics of point patterns (quadrat counts, scan statistic, kernel smoothed intensity, Berman's diagnostic) are recovered as special cases.

  9. Sequential decision analysis for nonstationary stochastic processes

    Science.gov (United States)

    Schaefer, B.

    1974-01-01

    A formulation of the problem of making decisions concerning the state of nonstationary stochastic processes is given. An optimal decision rule, for the case in which the stochastic process is independent of the decisions made, is derived. It is shown that this rule is a generalization of the Bayesian likelihood ratio test, and an analog to Wald's sequential likelihood ratio test is given, in which the optimal thresholds may vary with time.
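
    The classical fixed-threshold version of Wald's sequential likelihood ratio test mentioned above can be sketched as follows; the nonstationary generalization in the paper would let the thresholds vary with time. The normal-mean test data are invented.

```python
# Classical Wald SPRT with fixed thresholds (illustrative normal-mean test).
import numpy as np
from scipy.stats import norm

def sprt(samples, logpdf0, logpdf1, alpha=0.05, beta=0.05):
    a = np.log(beta / (1 - alpha))      # lower threshold -> accept H0
    b = np.log((1 - beta) / alpha)      # upper threshold -> accept H1
    llr, n = 0.0, 0
    for n, x in enumerate(samples, 1):
        llr += logpdf1(x) - logpdf0(x)  # accumulate the log-likelihood ratio
        if llr <= a:
            return "accept H0", n
        if llr >= b:
            return "accept H1", n
    return "undecided", n

rng = np.random.default_rng(0)
data = rng.normal(0.5, 1.0, 200)        # truth: mean 0.5
print(sprt(data, norm(0, 1).logpdf, norm(0.5, 1).logpdf))
```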

  10. Residual analysis for spatial point processes

    DEFF Research Database (Denmark)

    Baddeley, A.; Turner, R.; Møller, Jesper

    2005-01-01

    We define residuals for point process models fitted to spatial point pattern data, and we propose diagnostic plots based on them. The residuals apply to any point process model that has a conditional intensity; the model may exhibit spatial heterogeneity, interpoint interaction and dependence on spatial covariates. Some existing ad hoc methods for model checking (quadrat counts, scan statistic, kernel smoothed intensity and Berman's diagnostic) are recovered as special cases. Diagnostic tools are developed systematically, by using an analogy between our spatial residuals and the usual residuals for (non-spatial) generalized linear models. The conditional intensity $\lambda$ plays the role of the mean response. This makes it possible to adapt existing knowledge about model validation for generalized linear models to the spatial point process context, giving recommendations for diagnostic plots …

  11. Laser processing and analysis of materials

    CERN Document Server

    Duley, W W

    1983-01-01

    It has often been said that the laser is a solution searching for a problem. The rapid development of laser technology over the past dozen years has led to the availability of reliable, industrially rated laser sources with a wide variety of output characteristics. This, in turn, has resulted in new laser applications as the laser becomes a familiar processing and analytical tool. The field of materials science, in particular, has become a fertile one for new laser applications. Laser annealing, alloying, cladding, and heat treating were all but unknown 10 years ago. Today, each is a separate, dynamic field of research activity with many of the early laboratory experiments resulting in the development of new industrial processing techniques using laser technology. Ten years ago, chemical processing was in its infancy awaiting, primarily, the development of reliable tunable laser sources. Now, with tunability over the entire spectrum from the vacuum ultraviolet to the far infrared, photochemistry is undergoing …

  12. 300 Area process trench sediment analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Zimmerman, M.G.; Kossik, C.D.

    1987-12-01

    This report describes the results of a sampling program for the sediments underlying the Process Trenches serving the 300 Area on the Hanford reservation. These Process Trenches were the subject of a Closure Plan submitted to the Washington State Department of Ecology and to the US Environmental Protection Agency in lieu of a Part B permit application on November 8, 1985. The closure plan described a proposed sampling plan for the underlying sediments and potential remedial actions to be determined by the sample analyses results. The results and proposed remedial action plan are presented and discussed in this report. 50 refs., 6 figs., 8 tabs.

  13. Exergy analysis in industrial food processing

    NARCIS (Netherlands)

    Zisopoulos, F.K.

    2016-01-01

    The sustainable provision of food on a global scale in the near future is a very serious challenge. This thesis focuses on the assessment and design of sustainable industrial food production chains and processes by using the concept of exergy, which is an objective metric based on the first and second laws of thermodynamics.

  14. Representative process sampling for reliable data analysis

    DEFF Research Database (Denmark)

    Julius, Lars Petersen; Esbensen, Kim

    2005-01-01

    Process sampling of moving streams of particulate matter, fluids and slurries (over time or space) or stationary one-dimensional (1-D) lots is often carried out according to existing tradition or protocol, not taking the theory of sampling (TOS) into account. In many situations, sampling errors …

  15. Simplified Processing Method for Meter Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, Kimberly M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Colotelo, Alison H. A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Downs, Janelle L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ham, Kenneth D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Henderson, Jordan W. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Montgomery, Sadie A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Vernon, Christopher R. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Parker, Steven A. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-11-01

    A simple, quick metered-data processing method that can be used for Army Metered Data Management System (MDMS) and Logistics Innovation Agency data, but may also be useful for other large data sets. It is intended for large data sets where the analyst has little information about the buildings.

  16. SPAN C - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN-C, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on cards.

  17. SPAN - Terminal sterilization process analysis program

    Science.gov (United States)

    1969-01-01

    Computer program, SPAN, measures the dry heat thermal sterilization process applied to a planetary capsule and calculates the time required for heat application, steady state conditions, and cooling. The program is based on the logarithmic survival of micro-organisms. Temperature profiles must be input on tape.

  18. Fingerprint Analysis with Marked Point Processes

    DEFF Research Database (Denmark)

    Forbes, Peter G. M.; Lauritzen, Steffen; Møller, Jesper

    We present a framework for fingerprint matching based on marked point process models. An efficient Monte Carlo algorithm is developed to calculate the marginal likelihood ratio for the hypothesis that two observed prints originate from the same finger against the hypothesis that they originate from different fingers …

  19. Kinetic Analysis of Mica Tape Curing Process

    Directory of Open Access Journals (Sweden)

    Radek Polansky

    2008-01-01

    Full Text Available The curing program of thermoset insulating materials, and its proper setting, is of key importance for assuring the high quality and reliability of electrical devices. In practice, the parameters of this program (the temperature and time of curing) can be determined in several ways, most of them based on kinetic analysis. The main aim of this paper is to compare the results of selected kinetic-analysis methods with residual enthalpy measurements. Two insulating tapes were chosen for this study. The tapes correspond in composition (glass fabric, mica and epoxy binder) but differ in the type of curing agent. Simultaneous thermal analysis (STA) was used during the measurements. The results demonstrate the advantages and disadvantages of the particular methods.
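
    As an example of the kind of kinetic analysis compared in this paper, the Kissinger method estimates an apparent activation energy from the shift of the exotherm peak with heating rate. The peak temperatures below are invented, not taken from the study.

```python
# Kissinger analysis: ln(beta / Tp^2) = -Ea / (R * Tp) + const (invented peaks).
import numpy as np

R = 8.314                                    # J/(mol K)
heating_rates = np.array([5.0, 10.0, 20.0])  # K/min
Tp = np.array([410.0, 421.0, 433.0])         # exotherm peak temperatures, K

slope, _ = np.polyfit(1.0 / Tp, np.log(heating_rates / Tp**2), 1)
Ea = -slope * R                              # slope of the Kissinger line gives Ea
print(f"apparent activation energy: {Ea / 1000:.1f} kJ/mol")
```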

  20. Knee joint vibroarthrographic signal processing and analysis

    CERN Document Server

    Wu, Yunfeng

    2015-01-01

    This book presents the cutting-edge technologies of knee joint vibroarthrographic signal analysis for the screening and detection of knee joint injuries. It describes a number of effective computer-aided methods for analysis of the nonlinear and nonstationary biomedical signals generated by complex physiological mechanics. This book also introduces several popular machine learning and pattern recognition algorithms for biomedical signal classifications. The book is well-suited for all researchers looking to better understand knee joint biomechanics and the advanced technology for vibration arthrometry. Dr. Yunfeng Wu is an Associate Professor at the School of Information Science and Technology, Xiamen University, Xiamen, Fujian, China.

  1. Processing Cost Analysis for Biomass Feedstocks

    Energy Technology Data Exchange (ETDEWEB)

    Badger, P.C.

    2002-11-20

    The receiving, handling, storing, and processing of woody biomass feedstocks is an overlooked component of biopower systems. The purpose of this study was twofold: (1) to identify and characterize all the receiving, handling, storing, and processing steps required to make woody biomass feedstocks suitable for use in direct combustion and gasification applications, including small modular biopower (SMB) systems, and (2) to estimate the capital and operating costs at each step. Since biopower applications can be varied, a number of conversion systems and feedstocks required evaluation. In addition to limiting this study to woody biomass feedstocks, the boundaries of this study were from the power plant gate to the feedstock entry point into the conversion device. Although some power plants are sited at a source of wood waste fuel, it was assumed for this study that all wood waste would be brought to the power plant site. This study was also confined to the following three feedstocks (1) forest residues, (2) industrial mill residues, and (3) urban wood residues. Additionally, the study was confined to grate, suspension, and fluidized bed direct combustion systems; gasification systems; and SMB conversion systems. Since scale can play an important role in types of equipment, operational requirements, and capital and operational costs, this study examined these factors for the following direct combustion and gasification system size ranges: 50, 20, 5, and 1 MWe. The scope of the study also included: Specific operational issues associated with specific feedstocks (e.g., bark and problems with bridging); Opportunities for reducing handling, storage, and processing costs; How environmental restrictions can affect handling and processing costs (e.g., noise, commingling of treated wood or non-wood materials, emissions, and runoff); and Feedstock quality issues and/or requirements (e.g., moisture, particle size, presence of non-wood materials). The study found that over the

  2. Advanced Color Image Processing and Analysis

    CERN Document Server

    2013-01-01

    This volume does much more than survey modern advanced color processing. Starting with a historical perspective on ways we have classified color, it sets out the latest numerical techniques for analyzing and processing colors, the leading edge in our search to accurately record and print what we see. The human eye perceives only a fraction of available light wavelengths, yet we live in a multicolor world of myriad shining hues. Colors rich in metaphorical associations make us “purple with rage” or “green with envy” and cause us to “see red.” Defining colors has been the work of centuries, culminating in today’s complex mathematical coding that nonetheless remains a work in progress: only recently have we possessed the computing capacity to process the algebraic matrices that reproduce color more accurately. With chapters on dihedral color and image spectrometers, this book provides technicians and researchers with the knowledge they need to grasp the intricacies of today’s color imaging.

  3. Dynamical analysis of the exclusive queueing process.

    Science.gov (United States)

    Arita, Chikashi; Schadschneider, Andreas

    2011-05-01

    Recently, the stationary state of a parallel-update totally asymmetric simple exclusion process with varying system length, which can be regarded as a queueing process with an excluded-volume effect (exclusive queueing process), was obtained [C. Arita and D. Yanagisawa, J. Stat. Phys. 141, 829 (2010)]. In this paper, we analyze the dynamical properties of the number of particles [N(t)] and the position of the last particle (the system length) [L(t)], using an analytical method (generating function technique) as well as a phenomenological description based on domain-wall dynamics and Monte Carlo simulations. The system exhibits two phases, corresponding to linear convergence or divergence of [N(t)] and [L(t)]. Both phases can be further subdivided into high-density and maximal-current subphases. The predictions of the domain-wall theory are in very good quantitative agreement with results from Monte Carlo simulations in the convergent phase. In the divergent phase, only the prediction for [N(t)] agrees with simulations.

  4. The analysis of thermally stimulated processes

    CERN Document Server

    Chen, R; Pamplin, Brian

    1981-01-01

    Thermally stimulated processes include a number of phenomena - either physical or chemical in nature - in which a certain property of a substance is measured during controlled heating from a 'low' temperature. Workers and graduate students in a wide spectrum of fields require an introduction to methods of extracting information from such measurements. This book gives an interdisciplinary approach to various methods which may be applied to analytical chemistry, including radiation dosimetry and the determination of archaeological and geological ages. In addition, recent advances are included, such …

  5. Identifying Organizational Inefficiencies with Pictorial Process Analysis (PPA

    Directory of Open Access Journals (Sweden)

    David John Patrishkoff

    2013-11-01

    Full Text Available Pictorial Process Analysis (PPA) was created by the author in 2004. PPA is a unique methodology which offers ten layers of additional analysis when compared to standard process mapping techniques. The goal of PPA is to identify and eliminate waste, inefficiencies and risk in manufacturing or transactional business processes at five levels of an organization. The highest level assessed is process management, followed by the process work environment, detailed work habits, process performance metrics and general attitudes towards the process. This detailed process assessment and analysis is carried out during process improvement brainstorming efforts and Kaizen events. PPA creates a detailed visual efficiency rating for each step of the process under review. A selection of 54 pictorial Inefficiency Icons (cards) is available to highlight major inefficiencies and risks present in the business process under review. These inefficiency icons were identified during the author's independent research on the topic of why things go wrong in business. This paper highlights how PPA was developed and shows the steps required to conduct a Pictorial Process Analysis on a sample manufacturing process. The author has successfully used PPA to dramatically improve business processes in over 55 different industries since 2004.

  6. Digital image sequence processing, compression, and analysis

    CERN Document Server

    Reed, Todd R

    2004-01-01

    IntroductionTodd R. ReedCONTENT-BASED IMAGE SEQUENCE REPRESENTATIONPedro M. Q. Aguiar, Radu S. Jasinschi, José M. F. Moura, andCharnchai PluempitiwiriyawejTHE COMPUTATION OF MOTIONChristoph Stiller, Sören Kammel, Jan Horn, and Thao DangMOTION ANALYSIS AND DISPLACEMENT ESTIMATION IN THE FREQUENCY DOMAINLuca Lucchese and Guido Maria CortelazzoQUALITY OF SERVICE ASSESSMENT IN NEW GENERATION WIRELESS VIDEO COMMUNICATIONSGaetano GiuntaERROR CONCEALMENT IN DIGITAL VIDEOFrancesco G.B. De NataleIMAGE SEQUENCE RESTORATION: A WIDER PERSPECTIVEAnil KokaramVIDEO SUMMARIZATIONCuneyt M. Taskiran and Edward

  7. ANALYSIS ON TECHNOLOGICAL PROCESSES CLEANING OIL PIPELINES

    Directory of Open Access Journals (Sweden)

    Mariana PǍTRAŞCU

    2015-05-01

    Full Text Available In this paper, research concerning the technological processes for cleaning oil pipelines is presented. Several technologies and materials are known for cleaning sludge deposits, iron and manganese oxides, dross, stone, etc. from the inner walls of drinking-water or industrial pipes. For the oil industry, the removal of waste materials from pipes and from liquid- and gas-transport networks has long been known as a tedious and expensive operation. The main methods and their associated problems can be summarized as follows: (1) blowing with compressed air; (2) manual or mechanical brushing, or sanding, wet or dry; (3) washing with a high-pressure water jet, solvent or chemical solution to remove stone and hard deposits; (4) combined methods using cleaning machines with water jets, cutters, chains, rotary cutter heads, etc.

  8. Automatic image analysis of multicellular apoptosis process.

    Science.gov (United States)

    Ziraldo, Riccardo; Link, Nichole; Abrams, John; Ma, Lan

    2014-01-01

    Apoptotic programmed cell death (PCD) is a common and fundamental aspect of developmental maturation. Image processing techniques have been developed to detect apoptosis at the single-cell level in a single still image, while an efficient algorithm to automatically analyze the temporal progression of apoptosis in a large population of cells is unavailable. In this work, we have developed an ImageJ-based program that can quantitatively analyze time-lapse microscopy movies of live tissues undergoing apoptosis with a fluorescent cellular marker, and subsequently extract the temporospatial pattern of multicellular response. The protocol is applied to characterize apoptosis of Drosophila wing epithelium cells at eclosion. Using natural anatomic structures as reference, we identify dynamic patterns in the progression of apoptosis within the wing tissue, which not only confirms the previously observed collective cell behavior from a quantitative perspective for the first time, but also reveals a plausible role played by the anatomic structures in Drosophila apoptosis.
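
    A minimal sketch of the per-frame quantification idea, using scikit-image rather than the ImageJ program of the paper: threshold each frame of a fluorescence time-lapse stack and count connected components to obtain a temporal apoptosis profile.

```python
# Per-frame cell counting on a fluorescence time-lapse stack (scikit-image).
import numpy as np
from skimage.filters import threshold_otsu
from skimage.measure import label

def count_cells(frame):
    """Otsu threshold + connected-component count for one frame."""
    mask = frame > threshold_otsu(frame)
    return int(label(mask).max())

def apoptosis_profile(stack):
    """Counts per frame of a (frames, height, width) stack."""
    return np.array([count_cells(f) for f in stack])
```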

  9. Uncertainty Budget Analysis for Dimensional Inspection Processes (U)

    Energy Technology Data Exchange (ETDEWEB)

    Valdez, Lucas M. [Los Alamos National Laboratory

    2012-07-26

    This paper provides guidance on preparing an uncertainty analysis of a dimensional inspection process using an uncertainty budget analysis. The uncertainty analysis follows the methodology of the ISO GUM standard for calibration and testing. A specific distinction is drawn between how Type A and Type B uncertainty analyses are used in a general and a specific process. Theory and applications are presented both as a generalized approach to estimating measurement uncertainty and as guidance on reporting and presenting these estimates for dimensional measurements in a dimensional inspection process. The analysis of this uncertainty budget shows that a well-controlled dimensional inspection process produces a conservative process uncertainty, which can be attributed to the necessary assumptions in place for best possible results.
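
    The arithmetic at the heart of an uncertainty budget is the GUM root-sum-of-squares combination of standard uncertainties, followed by expansion with a coverage factor. A sketch with an invented budget:

```python
# GUM-style combination of an uncertainty budget (values invented).
import numpy as np

budget_um = {                       # standard uncertainties, micrometers
    "repeatability (Type A)":     0.8,
    "CMM calibration (Type B)":   0.5,
    "thermal expansion (Type B)": 0.4,
    "fixturing (Type B)":         0.3,
}

u_c = np.sqrt(sum(u**2 for u in budget_um.values()))   # combined standard uncertainty
U = 2.0 * u_c                                          # expanded, k = 2 (~95 % coverage)
print(f"u_c = {u_c:.2f} um, U(k=2) = {U:.2f} um")
```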

  10. Advanced information processing and analysis steering group: intelligence community

    Science.gov (United States)

    Kees, Terry S.; Rose, Russell R.

    1994-03-01

    Today's intelligence analysis environment is more complex with an ever increasing focus on technology to solve the analyst's problems and to make the information processing and analysis simpler. The analytic emphasis is heavily oriented toward document selection, data extraction and data monitoring, as well as toward the drafting, coordinating and editing of written reports and similar intelligence products. The Advanced Information Processing and Analysis Steering Group (AIPASG) desires to have an impact on technology development and on the technology insertion to solve high priority information processing and analysis problems.

  11. Computer software for process hazards analysis.

    Science.gov (United States)

    Hyatt, N

    2000-10-01

    Computerized software tools are assuming major significance in conducting HAZOPs. This is because they have the potential to offer better online presentations and performance to HAZOP teams, as well as better documentation and downstream tracking. The chances of something being "missed" are greatly reduced. We know, only too well, that HAZOP sessions can be like the industrial equivalent of a trip to the dentist. Sessions can (and usually do) become arduous and painstaking. To make the process easier for all those involved, we need all the help computerized software can provide. In this paper I have outlined the challenges addressed in the production of Windows software for performing HAZOP and other forms of PHA. The object is to produce more "intelligent", more user-friendly software for performing HAZOP where technical interaction between team members is of key significance. HAZOP techniques, having already proven themselves, are extending into the field of computer control and human error. This makes further demands on HAZOP software and emphasizes its importance.

  12. Spectral Components Analysis of Diffuse Emission Processes

    Energy Technology Data Exchange (ETDEWEB)

    Malyshev, Dmitry; /KIPAC, Menlo Park

    2012-09-14

    We develop a novel method to separate the components of a diffuse emission process based on an association with the energy spectra. Most existing methods use some information about the spatial distribution of components, e.g., closeness to an external template, independence of components, etc., in order to separate them. In this paper we propose a method where one puts conditions on the spectra only. The advantages of our method are: 1) it is internal: the maps of the components are constructed as combinations of data in different energy bins, 2) the components may be correlated among each other, 3) the method is semi-blind: in many cases, it is sufficient to assume a functional form of the spectra and determine the parameters from a maximization of a likelihood function. As an example, we derive the CMB map and the foreground maps for seven years of WMAP data. In an Appendix, we present a generalization of the method, where one can also add a number of external templates.
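
    A simplified sketch of the separation idea: if the component spectra are assumed known, the component maps follow from a per-pixel linear fit across energy bins. The paper maximizes a likelihood and also fits spectral parameters; ordinary least squares is used here only to show the structure.

```python
# Component maps from assumed spectra via per-pixel least squares (simplified).
import numpy as np

def separate_components(data, F):
    """data: (n_bins, n_pixels) maps over energy bins;
    F: (n_bins, n_components) assumed component spectra.
    Returns (n_components, n_pixels) component maps."""
    maps, *_ = np.linalg.lstsq(F, data, rcond=None)
    return maps

rng = np.random.default_rng(0)
F = np.array([[1.0, 0.2], [0.8, 0.5], [0.5, 1.0]])      # two invented spectra, 3 bins
truth = rng.random((2, 1000))                           # two invented component maps
data = F @ truth + 0.01 * rng.standard_normal((3, 1000))
print(np.abs(separate_components(data, F) - truth).max())
```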

  13. Imaging Heat and Mass Transfer Processes Visualization and Analysis

    CERN Document Server

    Panigrahi, Pradipta Kumar

    2013-01-01

    Imaging Heat and Mass Transfer Processes: Visualization and Analysis applies Schlieren and shadowgraph techniques to complex heat and mass transfer processes. Several applications are considered where thermal and concentration fields play a central role. These include vortex shedding and suppression from stationary and oscillating bluff bodies such as cylinders, convection around crystals growing from solution, and buoyant jets. Many of these processes are unsteady and three dimensional. The interpretation and analysis of images recorded are discussed in the text.

  14. Selected Tools for Risk Analysis in Logistics Processes

    Science.gov (United States)

    Kulińska, Ewa

    2012-03-01

    As each organization aims at managing effective logistics processes, risk factors can and should be controlled through proper system of risk management. Implementation of complex approach to risk management allows for the following: - evaluation of significant risk groups associated with logistics processes implementation, - composition of integrated strategies of risk management, - composition of tools for risk analysis in logistics processes.

  15. Signal Analysis and Processing Platform Based on LabVIEW

    Directory of Open Access Journals (Sweden)

    Xu Yang

    2014-06-01

    Full Text Available A signal analysis and processing platform was developed in this paper. The platform was designed in LabVIEW 2012 and covers many signal analysis and processing functions, such as filtering and spectrum analysis. After testing and practical application, the platform's interface proved flexible, intuitive and easy to operate, meeting the needs of universities and research laboratories.

  16. Analysis of thermal process of pozzolan production

    Directory of Open Access Journals (Sweden)

    Mejía De Gutiérrez, R.

    2004-06-01

    Full Text Available The objective of this study was to evaluate the effect of heat-treatment parameters on the pozzolanic activity of natural kaolin clays. The experimental design included three factors: kaolin type, temperature and time. Five types of Colombian kaolin clays were thermally treated from 400 to 1000 °C for 1, 2, and 3 hours. The raw materials and the products obtained were characterized by X-Ray Diffraction (XRD), Fourier Transform Infrared Spectroscopy (FTIR) and Differential Thermal / Thermogravimetric Analysis (DTA/TGA). The pozzolanic activity of the thermally treated samples was investigated using chemical and mechanical tests.

    The objective of this study was to characterize the production variables of a metakaolin with high pozzolanic reactivity. The experimental design used a factorial model with three factors: kaolin type (C), temperature and time. Drawing on knowledge of the kaolin sources and on contact with national suppliers and distributors, five representative samples of kaolinitic clays were selected and subjected to thermal treatment between 400 and 1,000 ºC (six temperature levels) for three exposure times, 1, 2 and 3 hours. The source kaolins and the products of each thermal process were evaluated by physical and chemical techniques: X-ray diffraction, FTIR infrared spectroscopy, and differential thermal / thermogravimetric analysis (DTA, TGA). In addition, the pozzolanic activity, both chemical and mechanical, of the product obtained at the different study temperatures was evaluated.

  17. Assessment of hospital processes using a process mining technique: Outpatient process analysis at a tertiary hospital.

    Science.gov (United States)

    Yoo, Sooyoung; Cho, Minsu; Kim, Eunhye; Kim, Seok; Sim, Yerim; Yoo, Donghyun; Hwang, Hee; Song, Minseok

    2016-04-01

    Many hospitals are increasing their efforts to improve processes because processes play an important role in enhancing work efficiency and reducing costs. However, to date, a quantitative tool has not been available to examine the before and after effects of processes and environmental changes, other than the use of indirect indicators, such as mortality rate and readmission rate. This study used process mining technology to analyze process changes based on changes in the hospital environment, such as the construction of a new building, and to measure the effects of environmental changes in terms of consultation wait time, time spent per task, and outpatient care processes. Using process mining technology, electronic health record (EHR) log data of outpatient care before and after constructing a new building were analyzed, and the effectiveness of the technology in terms of the process was evaluated. Using the process mining technique, we found that the total time spent in outpatient care did not increase significantly compared to that before the construction of a new building, considering that the number of outpatients increased, and the consultation wait time decreased. These results suggest that the operation of the outpatient clinic was effective after changes were implemented in the hospital environment. We further identified improvements in processes using the process mining technique, thereby demonstrating the usefulness of this technique for analyzing complex hospital processes at a low cost. This study confirmed the effectiveness of process mining technology at an actual hospital site. In future studies, the use of process mining technology will be expanded by applying this approach to a larger variety of process change situations. Copyright © 2016. Published by Elsevier Ireland Ltd.
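
    A toy sketch of the consultation wait-time measure used in the study, computed from an event log with pandas rather than a process mining suite; the log excerpt is invented.

```python
# Consultation wait time from an EHR-style event log (invented excerpt).
import pandas as pd

ehr = pd.DataFrame({
    "visit": ["v1", "v1", "v2", "v2"],
    "event": ["arrival", "consult_start", "arrival", "consult_start"],
    "time":  pd.to_datetime(["2016-01-04 09:00", "2016-01-04 09:25",
                             "2016-01-04 09:10", "2016-01-04 09:55"]),
})

wide = ehr.pivot(index="visit", columns="event", values="time")
wait_min = (wide["consult_start"] - wide["arrival"]).dt.total_seconds() / 60.0
print(wait_min.describe())   # compare distributions before vs. after the change
```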

  18. AN ENHANCED HAZARD ANALYSIS PROCESS FOR THE HANFORD TANK FARMS

    Energy Technology Data Exchange (ETDEWEB)

    SHULTZ MV

    2008-05-15

    CH2M HILL Hanford Group, Inc., has expanded the scope and increased the formality of process hazards analyses performed on new or modified Tank Farm facilities, designs, and processes. The CH2M HILL process hazard analysis emphasis has been altered to reflect its use as a fundamental part of the engineering and change control process instead of simply being a nuclear safety analysis tool. The scope has been expanded to include identification of accidents/events that impact the environment, or require emergency response, in addition to those with significant impact to the facility worker, the offsite, and the 100-meter receptor. Also, there is now an expectation that controls will be identified to address all types of consequences. To ensure that the process has an appropriate level of rigor and formality, a new engineering standard for process hazards analysis was created. This paper discusses the role of process hazards analysis as an information source not only for nuclear safety, but also for worker-safety management programs, emergency management, and environmental programs. This paper also discusses the role of process hazards analysis in the change control process, including identifying when and how it should be applied to changes in design or process.

  19. Analysis of bilinear stochastic systems. [involving multiplicative noise processes

    Science.gov (United States)

    Willsky, A. S.; Marcus, S. I.; Martin, D. N.

    1974-01-01

    Analysis of stochastic dynamical systems that involve multiplicative (bilinear) noise processes is considered. After defining the systems of interest, the evolution of the moments of such systems, the question of stochastic stability, and estimation for bilinear stochastic systems are discussed. Both exact and approximate methods of analysis are introduced, and, in particular, the uses of Lie-theoretic concepts and harmonic analysis are discussed.
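
    A scalar example of a bilinear (multiplicative-noise) system is dx = a x dt + b x dW. The Euler-Maruyama sketch below simulates an ensemble and compares the empirical moments against the exact moment equations m1' = a m1 and m2' = (2a + b^2) m2; all parameter values are invented.

```python
# Euler-Maruyama ensemble for dx = a*x dt + b*x dW (multiplicative noise).
import numpy as np

def simulate(a, b, x0, T, steps, paths=10000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / steps
    x = np.full(paths, float(x0))
    for _ in range(steps):
        x += a * x * dt + b * x * np.sqrt(dt) * rng.standard_normal(paths)
    return x

xT = simulate(a=-1.0, b=0.5, x0=1.0, T=1.0, steps=1000)
# exact solutions of the moment equations at T = 1:
print(xT.mean(), np.exp(-1.0))            # first moment vs. exp(a*T)
print((xT**2).mean(), np.exp(-1.75))      # second moment vs. exp((2a + b^2)*T)
```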

  20. Recurrence Quantification for the Analysis of Coupled Processes in Aging.

    Science.gov (United States)

    Brick, Timothy R; Gray, Allison L; Staples, Angela D

    2017-09-06

    Aging is a complex phenomenon, with numerous simultaneous processes that interact with each other on a moment-to-moment basis. One way to quantify the interactions of these processes is by measuring how much a process is similar to its own past states or the past states of another system through the analysis of recurrence. This paper presents an introduction to recurrence quantification analysis (RQA) and cross-recurrence quantification analysis (CRQA), two dynamical systems analysis techniques that provide ways to characterize the self-similar nature of each process and the properties of their mutual temporal co-occurrence. We present RQA and CRQA and demonstrate their effectiveness with an example of conversational movements across age groups. RQA and CRQA provide methods of analyzing the repetitive processes that occur in day-to-day life, describing how different processes co-occur, synchronize, or predict each other and comparing the characteristics of those processes between groups. With intensive longitudinal data becoming increasingly available, it is possible to examine how the processes of aging unfold. RQA and CRQA provide information about how one process may show patterns of internal repetition or echo the patterning of another process and how those characteristics may change across the process of aging.
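
    A minimal sketch of the recurrence computations behind RQA: build a recurrence matrix from pairwise distances, then summarize it with recurrence rate and determinism. This simplified version works on scalar series and, unlike standard practice, does not exclude the main diagonal.

```python
# Recurrence matrix plus two standard RQA summaries (simplified scalar version).
import numpy as np

def recurrence_matrix(x, eps):
    """R[i, j] = 1 when states i and j are within eps of each other."""
    d = np.abs(x[:, None] - x[None, :])
    return (d < eps).astype(int)

def recurrence_rate(R):
    return R.mean()

def determinism(R, lmin=2):
    """Fraction of recurrent points lying on diagonal lines of length >= lmin."""
    n, on_lines = R.shape[0], 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diag(R, k)) + [0]:   # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / max(R.sum(), 1)

x = np.sin(np.linspace(0, 20 * np.pi, 400))   # a strongly periodic test series
R = recurrence_matrix(x, eps=0.1)
print(recurrence_rate(R), determinism(R))     # periodic input -> high determinism
```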

  1. Document analysis using an aggregative and iterative process.

    Science.gov (United States)

    Rasmussen, Philippa; Muir-Cochrane, Eimear; Henderson, Ann

    2012-06-01

    This paper is a descriptive commentary concerning the use of document analysis in qualitative research aimed at developing an understanding of the role of child and adolescent mental health nursing in an inpatient setting. The document analysis was undertaken using thematic analysis with both an iterative process (Attride-Stirling) and an aggregative process, the Joanna Briggs Institute Thematic Analysis Program (TAP). After the initial iterative process, the data were entered into an online software program, TAP, for aggregation and further analysis. The TAP software consists of a three-step approach to the analysis of data: extraction of illustrations, aggregation into categories and synthesis of categories into themes. A TAP chart was generated displaying the connections between the illustrations, categories and themes. The advantages and limitations of utilising the TAP software compared with Computer Assisted Qualitative Data Analysis Software were discussed. The program afforded direct involvement by the researcher in the cognitive process of the analysis, rather than just the technical process. A limitation of the program is the volume of data if the research involves a vast amount of data. The TAP program was a clearly defined three-step software program that was appropriate for the document analysis in this research. The program would have wide application for facilitating the thematic analysis of documents, although it is best suited to smaller amounts of data. © 2012 The Authors. International Journal of Evidence-Based Healthcare © 2012 The Joanna Briggs Institute.

  2. Computational Fluid Dynamics Analysis of Freeze Drying Process and Equipment

    OpenAIRE

    Varma, Nikhil P.

    2014-01-01

    Freeze drying is an important, but expensive, inefficient and time-consuming process in the pharmaceutical, chemical and food processing industries. Computational techniques could be a very effective tool for predictive design and analysis of both the freeze drying process and equipment. This work is an attempt at using Computational Fluid Dynamics (CFD) and numerical simulations as a tool for freeze drying process and equipment design. Pressure control is critical in freeze dryers, keeping in view …

  3. Lithography process calibration with applications in defect printability analysis

    Science.gov (United States)

    Wu, Shao-Po; Liu, Hua-Yu; Chang, Fang C.; Karklin, Linard

    1998-12-01

    Lithography process simulation has proven to be a useful and effective tool for process characterization, namely, properly characterizing critical dimension (CD) variations from the design that are caused by proximity effects and by distortions introduced by the patterning tool, reticle, resist processing and etching. An accurate lithography process simulator further enables process engineers to automate the tasks of advanced mask design, verification and inspection used in deep-sub-micron semiconductor manufacturing. However, to get the most benefit from process simulations, the simulation model should be properly calibrated to the process being characterized. That is, given a representative set of CD measurements obtained from the process, we fine-tune the process model parameters so that the simulated/predicted CDs match the measured CDs well. By doing so, we can ensure to some extent that process simulations give sensible results for design analysis, verification and inspection applications. In this paper, we demonstrate the possibility of obtaining an accurate process model for lithography process simulations via model calibration. We also demonstrate the accuracy of calibrated process simulations by applying the calibrated model to mask defect printability analysis. For simplicity, the process model and the algorithms used in model calibration are not discussed in this article but will appear in our future publications. In Section 2, we present the characterization and calibration of a 0.18 micrometer DUV lithography process using positive chemically amplified resist (APEX-E) as an example. We describe the test pattern selection, the calibration process, and the performance of the calibrated model in predicting CD measurements for given test patterns. In Section 3, we briefly describe the technology of defect printability analysis based on process simulations. We demonstrate that with the help of calibrated …
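
    The calibration loop described in Section 2 amounts to a nonlinear least-squares fit of model parameters to measured CDs. A toy sketch with an invented stand-in for the simulator (the real model and algorithms are deferred to the authors' future publications):

```python
# Toy calibration loop: fit model parameters to measured CDs (stand-in model).
import numpy as np
from scipy.optimize import least_squares

def cd_model(p, nominal):
    """Invented stand-in for a lithography simulator: CD shrinks linearly with
    dose and grows quadratically with defocus."""
    dose, defocus = p
    return nominal * (1.0 - 0.01 * dose) + 5.0 * defocus**2

nominal_cd = np.array([180.0, 200.0, 250.0])   # drawn line widths, nm
measured_cd = np.array([172.0, 191.0, 239.0])  # "measured" values, invented

fit = least_squares(lambda p: cd_model(p, nominal_cd) - measured_cd, x0=[1.0, 0.1])
print("calibrated parameters:", fit.x)
```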

  4. Process Knowledge Discovery Using Sparse Principal Component Analysis

    DEFF Research Database (Denmark)

    Gao, Huihui; Gajjar, Shriram; Kulahci, Murat

    2016-01-01

    As the goals of ensuring process safety and energy efficiency become ever more challenging, engineers increasingly rely on data collected from such processes for informed decision making. During recent decades, extracting and interpreting valuable process information from large historical data sets … SPCA approach that helps uncover the underlying process knowledge regarding variable relations. This approach systematically determines the optimal sparse loadings for each sparse PC while improving interpretability and minimizing information loss. The salient features of the proposed approach are demonstrated through the Tennessee Eastman process simulation. The results indicate how knowledge and process insight can be discovered through a systematic analysis of sparse loadings …
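
    A minimal sketch of extracting sparse loadings with scikit-learn's SparsePCA (the paper's own algorithm for choosing the optimal sparsity is not reproduced here); the data are simulated.

```python
# Sparse loadings with scikit-learn (simulated data with one correlated group).
import numpy as np
from sklearn.decomposition import SparsePCA

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 12))         # 12 "process variables"
X[:, :3] += rng.standard_normal((500, 1))  # variables 0-2 co-vary

spca = SparsePCA(n_components=3, alpha=1.0, random_state=0).fit(X)
print(np.round(spca.components_, 2))   # zeros in the loadings expose variable groups
```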

  5. Energy analysis handbook. CAC document 214. [Combining process analysis with input-output analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bullard, C. W.; Penner, P. S.; Pilati, D. A.

    1976-10-01

    Methods are presented for calculating the energy required, directly and indirectly, to produce all types of goods and services. Procedures for combining process analysis with input-output analysis are described. This enables the analyst to focus data acquisition cost-effectively, and to achieve a specified degree of accuracy in the results. The report presents sample calculations and provides the tables and charts needed to perform most energy cost calculations, including the cost of systems for producing or conserving energy.
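
    The central computation behind such energy cost tables, total (direct plus indirect) energy intensity, combines direct energy coefficients with the Leontief inverse from input-output analysis. A minimal sketch with invented two-sector numbers:

```python
# Minimal Leontief input-output sketch: total energy intensities satisfy
# epsilon = e (I - A)^(-1), where A is the inter-industry requirements
# matrix and e holds direct energy use per dollar of each sector's output.
import numpy as np

A = np.array([[0.10, 0.20],      # column j = inputs bought per dollar
              [0.30, 0.05]])     # of sector j's output (invented)
e = np.array([5.0, 1.0])         # direct energy (MJ) per dollar of output

epsilon = e @ np.linalg.inv(np.eye(2) - A)   # direct + indirect intensity
print(np.round(epsilon, 2))  # MJ embodied per dollar of final demand
```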

  6. Pedagogical issues for effective teaching of biosignal processing and analysis.

    Science.gov (United States)

    Sandham, William A; Hamilton, David J

    2010-01-01

    Biosignal processing and analysis is generally perceived by many students as a challenging topic, both to understand and to become adept in the necessary analytical skills. This is a direct consequence of its high mathematical content and many abstract features. The MATLAB and Mathcad software packages offer an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of MATLAB and Mathcad for teaching and research is illustrated with a number of examples.

  7. Image processing and analysis with graphs theory and practice

    CERN Document Server

    Lézoray, Olivier

    2012-01-01

    Covering the theoretical aspects of image processing and analysis through the use of graphs in the representation and analysis of objects, Image Processing and Analysis with Graphs: Theory and Practice also demonstrates how these concepts are indispensable for the design of cutting-edge solutions for real-world applications. Explores new applications in computational photography, image and video processing, computer graphics, recognition, medical and biomedical imaging. With the explosive growth in image production, in everything from digital photographs to medical scans, there has been a drast

  8. THE PROCESS CAPABILITY ANALYSIS - A TOOL FOR PROCESS PERFORMANCE MEASURES AND METRICS - A CASE STUDY

    Directory of Open Access Journals (Sweden)

    Yerriswamy Wooluru

    2014-09-01

    Full Text Available Process capability can be evaluated through the computation of various process capability ratios and indices. The four basic capability indices commonly used in manufacturing industries are Cp, Cpk, Cpm, and Cpmk. Process capability indices are intended to provide a single-number assessment of the ability of a process to meet specification limits on quality characteristics of interest. Thus, they identify opportunities for improving quality and productivity. Interest in process capability analysis has increased considerably over the last decade, but the literature reveals the importance of understanding the concepts, methodologies, and critical assumptions involved when implementing it in a manufacturing process. The objective of this paper is to conduct a process capability analysis for a boring operation, with attention to these concepts, methodologies, and critical assumptions.
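
    For reference, all four indices can be computed directly from a sample and the specification limits; the formulas below are the standard ones, and the bore-diameter data are hypothetical.

```python
# Standard capability indices for an (approximately normal, in-control)
# sample against lower/upper specification limits.
import numpy as np

def capability(x, lsl, usl, target=None):
    mu, sigma = np.mean(x), np.std(x, ddof=1)
    t = (usl + lsl) / 2 if target is None else target
    tau = np.sqrt(sigma**2 + (mu - t) ** 2)       # spread around the target
    return {
        "Cp":   (usl - lsl) / (6 * sigma),                # potential capability
        "Cpk":  min(usl - mu, mu - lsl) / (3 * sigma),    # accounts for centering
        "Cpm":  (usl - lsl) / (6 * tau),                  # Taguchi-type index
        "Cpmk": min(usl - mu, mu - lsl) / (3 * tau),
    }

rng = np.random.default_rng(1)
bores = rng.normal(25.02, 0.01, size=50)   # hypothetical bore diameters (mm)
print(capability(bores, lsl=24.97, usl=25.03))
```

    A Cpk noticeably below Cp, as in this off-center example, is exactly the kind of improvement opportunity the paper's analysis is meant to expose.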

  9. THE KNOWLEDGE CONVERSION SECI PROCESS AS INNOVATION INDICATOR ANALYSIS FACTOR

    National Research Council Canada - National Science Library

    Elaine da Silva; Marta Lígia Pomim Valentim

    2013-01-01

    .... We analyzed the 80 variables distributed across the seven GII pillars, trying to identify direct, indirect, or null occurrences of the knowledge conversion modes described by the SECI Process...

  10. THE ANALYSIS OF TECHNICAL PROCESS OF TERMINAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    N. Yu. Shramenko

    2010-10-01

    Full Text Available The analysis of the technological processes of terminal systems is conducted, their features are identified, and the basic problems of the current technology are determined. Directions for improving the technology of terminal system operation are proposed.

  11. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    Science.gov (United States)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.

  12. The Process Writing Approach: A Meta-Analysis

    Science.gov (United States)

    Graham, Steve; Sandmel, Karin

    2011-01-01

    The process approach to writing instruction is one of the most popular methods for teaching writing. The authors conducted a meta-analysis of 29 experimental and quasi-experimental studies conducted with students in Grades 1-12 to examine whether process writing instruction improves the quality of students' writing and motivation to write. For students…

  13. Iterated Process Analysis over Lattice-Valued Regular Expressions

    DEFF Research Database (Denmark)

    Midtgaard, Jan; Nielson, Flemming; Nielson, Hanne Riis

    2016-01-01

    We present an iterated approach to statically analyze programs of two processes communicating by message passing. Our analysis operates over a domain of lattice-valued regular expressions, and computes increasingly better approximations of each process's communication behavior. Overall the work...

  14. Modeling and Dynamic Analysis on Animals’ Repeated Learning Process

    OpenAIRE

    Lin, Mu; Yang, Jinqiao; Xu, Bin

    2011-01-01

    Dynamical modeling is used to describe the process of animals' repeated learning. Theoretical analysis is carried out to explore the dynamic properties of this process, such as the limit sets and their stability. The ranges of the variables are provided for different practical purposes, together with the necessary numerical simulations.

  15. Explaining discontinuity in organizational learning : A process analysis

    NARCIS (Netherlands)

    Berends, Hans; Lammers, Irene

    2010-01-01

    This paper offers a process analysis of organizational learning as it unfolds in a social and temporal context. Building upon the 4I framework (Crossan et al. 1999), we examine organizational learning processes in a longitudinal case study of an implementation of knowledge management in an

  16. An analysis of processes that can shape higher education research ...

    African Journals Online (AJOL)

    An analysis of processes that can shape higher education research utilising as case-study an investigation into postgraduates from the rest of Africa at University ... Following a narrative account of this process, the article analyses the institutional settings of such research, utilizing some categorization from U Teichler (2000).

  17. Computer teaching process optimization strategy analysis of thinking ability

    National Research Council Canada - National Science Library

    Luo, Liang

    2016-01-01

    .... Therefore, this article further discusses and analyzes how to optimize the cultivation of college students' thinking ability in the process of computer teaching, and then explores strategies and methods to promote the cultivation of thinking ability in computer teaching and optimize the computer

  18. Evaluating The Effectiveness Of Production Process Using Pareto Analysis

    Directory of Open Access Journals (Sweden)

    Polák Pavel

    2015-03-01

    Full Text Available The aim of this paper is to present the possibilities of using the Pareto method to evaluate the effectiveness of machine production processes. The paper deals with a production process for material cutting using a progressive technology and the subsequent evaluation of its effectiveness and quality. In the production process, we used abrasive water jet cutting. Pareto analysis was used to eliminate the shortcomings in the quality of the final part.
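
    A minimal sketch of the Pareto computation, with hypothetical defect categories for a water-jet-cut part: rank causes by frequency and accumulate percentages to isolate the "vital few" that dominate the quality shortcomings.

```python
# Pareto analysis sketch: hypothetical defect counts from a water-jet
# cutting operation, ranked with cumulative percentages.
from collections import Counter

defects = Counter({
    "kerf taper": 42, "surface striations": 31, "burr": 11,
    "dimensional error": 8, "delamination": 5, "other": 3,
})
total = sum(defects.values())

cumulative = 0.0
for cause, count in defects.most_common():
    cumulative += 100.0 * count / total
    marker = "  <- vital few" if cumulative <= 80.0 else ""
    print(f"{cause:20s} {count:3d}  cum {cumulative:5.1f}%{marker}")
```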

  19. Thermodynamic analysis of resources used in manufacturing processes.

    Science.gov (United States)

    Gutowski, Timothy G; Branham, Matthew S; Dahmus, Jeffrey B; Jones, Alissa J; Thiriez, Alexandre

    2009-03-01

    In this study we use a thermodynamic framework to characterize the material and energy resources used in manufacturing processes. The analysis and data span a wide range of processes from "conventional" processes such as machining, casting, and injection molding, to the so-called "advanced machining" processes such as electrical discharge machining and abrasive waterjet machining, and to the vapor-phase processes used in semiconductor and nanomaterials fabrication. In all, 20 processes are analyzed. The results show that the intensity of materials and energy used per unit of mass of material processed (measured either as specific energy or exergy) has increased by at least 6 orders of magnitude over the past several decades. The increase of material/energy intensity use has been primarily a consequence of the introduction of new manufacturing processes, rather than changes in traditional technologies. This phenomenon has been driven by the desire for precise small-scale devices and product features and enabled by stable and declining material and energy prices over this period. We illustrate the relevance of thermodynamics (including exergy analysis) for all processes in spite of the fact that long-lasting focus in manufacturing has been on product quality--not necessarily energy/material conversion efficiency. We promote the use of thermodynamics tools for analysis of manufacturing processes within the context of rapidly increasing relevance of sustainable human enterprises. We confirm that exergy analysis can be used to identify where resources are lost in these processes, which is the first step in proposing and/or redesigning new more efficient processes.

  20. Problem based learning: Cognitive and metacognitive processes during problem analysis

    NARCIS (Netherlands)

    W.S. de Grave; H.P.A. Boshuizen (Henny); H.G. Schmidt (Henk)

    1996-01-01

    An important phase of problem-based learning in a tutorial group is problem analysis. This article describes a study investigating the ongoing cognitive and metacognitive processes during problem analysis, by analysing the verbal communication among group members, and their thinking

  1. Comparative analysis of genomic signal processing for microarray data clustering.

    Science.gov (United States)

    Istepanian, Robert S H; Sungoor, Ala; Nebel, Jean-Christophe

    2011-12-01

    Genomic signal processing is a new area of research that combines advanced digital signal processing methodologies for enhanced genetic data analysis. It has many promising applications in bioinformatics and the next generation of healthcare systems, in particular in the field of microarray data clustering. In this paper we present a comparative performance analysis of enhanced digital spectral analysis methods for robust clustering of gene expression across multiple microarray data samples. Three digital signal processing methods: linear predictive coding, wavelet decomposition, and fractal dimension are studied to provide a comparative evaluation of the clustering performance of these methods on several microarray datasets. The results of this study show that the fractal approach provides the best clustering accuracy compared to other digital signal processing and well known statistical methods.
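
    As an illustration of one of the three compared methods, the sketch below clusters synthetic expression profiles on wavelet-decomposition features, with PyWavelets and scikit-learn as stand-ins; the paper's datasets and exact feature construction are not reproduced here.

```python
# Wavelet-feature clustering sketch on synthetic stand-ins for microarray
# expression profiles (two hidden families of profiles).
import numpy as np
import pywt
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 64)
group_a = np.sin(2 * np.pi * 2 * t) + 0.3 * rng.normal(size=(20, 64))
group_b = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.normal(size=(20, 64))
profiles = np.vstack([group_a, group_b])

def wavelet_features(signal, wavelet="db4", level=3):
    # Concatenate approximation and detail coefficients as the feature vector.
    return np.concatenate(pywt.wavedec(signal, wavelet, level=level))

features = np.array([wavelet_features(p) for p in profiles])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
print(labels)  # the two families should separate into two clusters
```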

  2. Materials, process, product analysis of coal process technology. Phase I final report

    Energy Technology Data Exchange (ETDEWEB)

    Saxton, J. C.; Roig, R. W.; Loridan, A.; Leggett, N. E.; Capell, R. G.; Humpstone, C. C.; Mudry, R. N.; Ayres, E.

    1976-02-01

    The purpose of materials-process-product analysis is a systematic evaluation of alternative manufacturing processes--in this case processes for converting coal into energy and material products that can supplement or replace petroleum-based products. The methodological steps in the analysis include: Definition of functional operations that enter into coal conversion processes, and modeling of alternative, competing methods to accomplish these functions; compilation of all feasible conversion processes that can be assembled from combinations of competing methods for the functional operations; systematic, iterative evaluation of all feasible conversion processes under a variety of economic situations, environmental constraints, and projected technological advances; and aggregative assessments (economic and environmental) of various industrial development scenarios. An integral part of the present project is additional development of the existing computer model to include: A data base for coal-related materials and coal conversion processes; and an algorithmic structure that facilitates the iterative, systematic evaluations in response to exogenously specified variables, such as tax policy, environmental limitations, and changes in process technology and costs. As an analytical tool, the analysis is intended to satisfy the needs of an analyst working at the process selection level, for example, with respect to the allocation of RD&D funds to competing technologies.

  3. Mathematical principles of signal processing Fourier and wavelet analysis

    CERN Document Server

    Brémaud, Pierre

    2002-01-01

    Fourier analysis is one of the most useful tools in many applied sciences. The recent developments of wavelet analysis indicate that in spite of its long history and well-established applications, the field is still one of active research. This text bridges the gap between engineering and mathematics, providing a rigorously mathematical introduction of Fourier analysis, wavelet analysis and related mathematical methods, while emphasizing their uses in signal processing and other applications in communications engineering. The interplay between Fourier series and Fourier transforms is at the heart of signal processing, which is couched most naturally in terms of the Dirac delta function and Lebesgue integrals. The exposition is organized into four parts. The first is a discussion of one-dimensional Fourier theory, including the classical results on convergence and the Poisson sum formula. The second part is devoted to the mathematical foundations of signal processing - sampling, filtering, digital signal proc...

  4. Computer-Aided Sustainable Process Synthesis-Design and Analysis

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan

    Process synthesis involves the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as taking decisions on sequencing the involved separation operations. For an effective, efficient and flexible...... focuses on the development and application of a computer-aided framework for sustainable synthesis-design and analysis of process flowsheets by generating feasible alternatives covering the entire search space and includes analysis tools for sustainability, LCA and economics. The synthesis method is based...... on group contribution and a hybrid approach, where chemical process flowsheets are synthesized in the same way as atoms or groups of atoms are synthesized to form molecules in computer aided molecular design (CAMD) techniques. The building blocks in flowsheet synthesis problem are called as process...

  5. Analysis of Interpersonal Communication Processes in Digital Factory Environments

    Science.gov (United States)

    Schütze, Jens; Baum, Heiko; Laue, Martin; Müller, Egon

    The paper outlines the scope of influence of the digital factory on interpersonal communication processes and gives an exemplary description of them. Following a brief description of the basic theoretical concepts of the digital factory, the communicative features within the digital factory are illustrated. Practical interrelations of interpersonal communication were analyzed from a human-oriented view in a pilot project at Volkswagen AG in Wolfsburg. A modeling method was developed as part of the process analysis; it makes it possible to visualize interpersonal communication and its human-oriented attributes in a technically focused workflow. Based on the results of an inquiry developed for communication analysis and on the process models of the modeling methods, it was possible to structure the processes in a way suitable for humans and to achieve a positive effect on the communication processes.

  6. Identifying influential factors of business process performance using dependency analysis

    Science.gov (United States)

    Wetzstein, Branimir; Leitner, Philipp; Rosenberg, Florian; Dustdar, Schahram; Leymann, Frank

    2011-02-01

    We present a comprehensive framework for identifying influential factors of business process performance. In particular, our approach combines monitoring of process events and Quality of Service (QoS) measurements with dependency analysis to effectively identify influential factors. The framework uses data mining techniques to construct tree structures to represent dependencies of a key performance indicator (KPI) on process and QoS metrics. These dependency trees allow business analysts to determine how process KPIs depend on lower-level process metrics and QoS characteristics of the IT infrastructure. The structure of the dependencies enables a drill-down analysis of single factors of influence to gain a deeper knowledge why certain KPI targets are not met.
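
    A minimal sketch of the dependency-tree idea, using a regression tree as the data-mining step; the metric names and data are hypothetical, and the printed tree is the drill-down structure the abstract describes.

```python
# Dependency-analysis sketch: learn how a KPI depends on lower-level
# process and QoS metrics via a regression tree, then read the tree.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

rng = np.random.default_rng(0)
n = 400
service_latency = rng.uniform(50, 500, n)   # ms, QoS metric (hypothetical)
queue_length = rng.integers(0, 20, n)       # process metric (hypothetical)
retries = rng.integers(0, 5, n)             # process metric (hypothetical)
# KPI (e.g., order fulfillment time) driven mainly by latency and queueing.
kpi = 2.0 * service_latency + 30.0 * queue_length + rng.normal(0, 20, n)

X = np.column_stack([service_latency, queue_length, retries])
tree = DecisionTreeRegressor(max_depth=3).fit(X, kpi)

names = ["service_latency", "queue_length", "retries"]
print(export_text(tree, feature_names=names))   # drill-down structure
print(dict(zip(names, np.round(tree.feature_importances_, 2))))
```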

  7. Applied Fourier analysis from signal processing to medical imaging

    CERN Document Server

    Olson, Tim

    2017-01-01

    The first of its kind, this focused textbook serves as a self-contained resource for teaching from scratch the fundamental mathematics of Fourier analysis and illustrating some of its most current, interesting applications, including medical imaging and radar processing. Developed by the author from extensive classroom teaching experience, it provides a breadth of theory that allows students to appreciate the utility of the subject, but at as accessible a depth as possible. With myriad applications included, this book can be adapted to a one or two semester course in Fourier Analysis or serve as the basis for independent study. Applied Fourier Analysis assumes no prior knowledge of analysis from its readers, and begins by making the transition from linear algebra to functional analysis. It goes on to cover basic Fourier series and Fourier transforms before delving into applications in sampling and interpolation theory, digital communications, radar processing, medical i maging, and heat and wave equations. Fo...

  8. ANALYSIS OF A TRANSPORT PROCESS USING HYBRID PETRI NETS

    Directory of Open Access Journals (Sweden)

    Elisabeta Mihaela CIORTEA

    2013-05-01

    Full Text Available Purpose of the paper is to analyze the Petri net model, to describe the transport process, part of amanufacturing system and its dynamics.A hibrid Petri net model is built to describe the dinamics of the transport process manufacturingsystem. Mathematical formulation of the dinamycs processes a detailed description. Based on this model, theanalysis of the transport process is designed to be able to execute a production plan and resolve any conflictsthat may arise in the system.In the analysis dinamics known two stages: in the continuous variables are discrete hybrid system in thehibrid discrete variables are used as safety control with very well defined responsibilities.In terms of the chosen model, analyze transport process is designed to help execute a production planand resolve conflicts that may arise in the process, and then the ones in the system

  9. Development of economic consequence methodology for process risk analysis.

    Science.gov (United States)

    Zadakbar, Omid; Khan, Faisal; Imtiaz, Syed

    2015-04-01

    A comprehensive methodology for economic consequence analysis with appropriate models for risk analysis of process systems is proposed. This methodology uses loss functions to relate process deviations in a given scenario to economic losses. It consists of four steps: definition of a scenario, identification of losses, quantification of losses, and integration of losses. In this methodology, the process deviations that contribute to a given accident scenario are identified and mapped to assess potential consequences. Losses are assessed with an appropriate loss function (revised Taguchi, modified inverted normal) for each type of loss. The total loss is quantified by integrating different loss functions. The proposed methodology has been examined on two industrial case studies. Implementation of this new economic consequence methodology in quantitative risk assessment will provide better understanding and quantification of risk. This will improve design, decision making, and risk management strategies. © 2014 Society for Risk Analysis.
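
    The record names revised Taguchi and modified inverted normal loss functions; the sketch below uses the generic (unmodified) forms to show how process deviations map to bounded or unbounded economic losses. All numbers are invented.

```python
# Generic loss-function sketch (not the authors' exact revised variants):
# map process deviations in a scenario to economic losses, then sum them.
import numpy as np

def inverted_normal_loss(y, target, k_max, gamma):
    """Bounded loss: ~0 at the target, saturating at k_max far from it."""
    return k_max * (1.0 - np.exp(-((y - target) ** 2) / (2.0 * gamma ** 2)))

def quadratic_taguchi_loss(y, target, k):
    """Classical unbounded Taguchi loss."""
    return k * (y - target) ** 2

# Hypothetical scenario: reactor pressure excursions (bar) during an upset.
pressures = np.array([10.2, 11.5, 13.0, 14.8])
loss = inverted_normal_loss(pressures, target=10.0, k_max=5e5, gamma=2.0)
print(np.round(loss, 0), "total:", round(float(loss.sum()), 0), "USD")
```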

  10. Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.

    Science.gov (United States)

    Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J

    2016-12-22

    Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
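
    A minimal sketch of the serial structure, with scikit-learn's PCA and KernelPCA as stand-ins and the monitoring statistics and control limits omitted: linear features are extracted first, then KPCA is fitted on what remains in the residual subspace.

```python
# Serial PCA sketch: PCA for linear features, then KPCA on the residuals.
import numpy as np
from sklearn.decomposition import PCA, KernelPCA

rng = np.random.default_rng(0)
n = 300
lin = rng.normal(size=(n, 1)) @ rng.normal(size=(1, 5))           # linear part
nl = np.sin(rng.normal(size=(n, 1))) @ rng.normal(size=(1, 5))    # nonlinear part
X = lin + nl + 0.05 * rng.normal(size=(n, 5))

pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)                         # linear features
residual = X - pca.inverse_transform(scores)      # residual subspace (RS)

kpca = KernelPCA(n_components=2, kernel="rbf", gamma=0.5).fit(residual)
nonlinear_scores = kpca.transform(residual)       # nonlinear features

print(scores.shape, nonlinear_scores.shape)       # both feed the fault statistics
```

    In the paper, both feature sets then drive detection statistics and a similarity factor for fault identification; only the feature-extraction skeleton is shown here.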

  11. Monitoring a municipal wastewater treatment process using a trend analysis.

    Science.gov (United States)

    Tomperi, Jani; Juuso, Esko; Kuokkanen, Anna; Leiviskä, Kauko

    2017-09-13

    New monitoring methods are required to enhance the operation of the wastewater treatment process and to meet the constantly tightening regulations for effluent discharges. An on-line optical monitoring device that analyses the morphological parameters of the flocs has been shown to be a potential tool for assessing the wastewater quality and the state of the activated sludge process. In this paper, the previously presented trend analysis method is applied to the operating conditions, the treatment results, and the optical monitoring variables of a full-scale biological wastewater treatment process. The trend episodes and deviation indices resulting from the trend analysis provide warnings of changes in the monitored variables, and the information received can be used to assist in operating the treatment process and in avoiding harmful environmental risks.

  12. Mathematical methods in time series analysis and digital image processing

    CERN Document Server

    Kurths, J; Maass, P; Timmer, J

    2008-01-01

    The aim of this volume is to bring together research directions in theoretical signal and imaging processing developed rather independently in electrical engineering, theoretical physics, mathematics and the computer sciences. In particular, mathematically justified algorithms and methods, the mathematical analysis of these algorithms and methods, as well as the investigation of connections between methods from time series analysis and image processing are reviewed. An interdisciplinary comparison of these methods, drawing upon common sets of test problems from medicine and geophysical/environmental sciences, is also addressed. This volume coherently summarizes work carried out in the field of theoretical signal and image processing. It focuses on non-linear and non-parametric models for time series as well as on adaptive methods in image processing.

  13. Strategic Joint Staff Force Posture and Readiness Process Analysis

    Science.gov (United States)

    2014-03-31

    assessed against standardised Measures of Capability (Annex C) in terms of: Scale of Effect; Survivability; Reach; Persistence; Strategic Joint...be varied to test the capabilities using the same resources organized differently. A further complementary analysis process is being established which...consider using the standardised Measures of Capability and the PRICIE metrics. Other DND Information System Data 51. Moving forward, the FP&R process

  14. Processing and analysis techniques involving in-vessel material generation

    Science.gov (United States)

    Schabron, John F [Laramie, WY; Rovani, Jr., Joseph F.

    2011-01-25

    In at least one embodiment, the inventive technology relates to in-vessel generation of a material from a solution of interest as part of a processing and/or analysis operation. Preferred embodiments of the in-vessel material generation (e.g., in-vessel solid material generation) include precipitation; in certain embodiments, analysis and/or processing of the solution of interest may include dissolution of the material, perhaps as part of a successive dissolution protocol using solvents of increasing dissolving ability. Applications include, but are by no means limited to, estimation of a coking onset and solution (e.g., oil) fractionation.

  15. Inferring Group Processes from Computer-Mediated Affective Text Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Schryver, Jack C [ORNL; Begoli, Edmon [ORNL; Jose, Ajith [Missouri University of Science and Technology; Griffin, Christopher [Pennsylvania State University

    2011-02-01

    Political communications in the form of unstructured text convey rich connotative meaning that can reveal underlying group social processes. Previous research has focused on sentiment analysis at the document level, but we extend this analysis to sub-document levels through a detailed analysis of affective relationships between entities extracted from a document. Instead of pure sentiment analysis, which is just positive or negative, we explore nuances of affective meaning in 22 affect categories. Our affect propagation algorithm automatically calculates and displays extracted affective relationships among entities in graphical form in our prototype (TEAMSTER), starting with seed lists of affect terms. Several useful metrics are defined to infer underlying group processes by aggregating affective relationships discovered in a text. Our approach has been validated with annotated documents from the MPQA corpus, achieving a performance gain of 74% over comparable random guessers.

  16. Choosing Appropriate Hazards Analysis Techniques For Your Process

    Science.gov (United States)

    1996-08-21

    Study (HAZOP); (v) Failure Mode and Effects Analysis (FMEA); (vi) Fault Tree Analysis; or (vii) an appropriate equivalent methodology.” The safety...CFR 1910.119: Checklist; What-if; What-if Checklist; Hazards and Operability Study (HAZOP); Fault Tree / Logic Diagram; Failure Modes and...than the other methods and are more appropriate for a simple process. The HAZOP has found much use in the petroleum and chemical industries and the

  17. Digital image processing and analysis for activated sludge wastewater treatment.

    Science.gov (United States)

    Khan, Muhammad Burhan; Lee, Xue Yong; Nisar, Humaira; Ng, Choon Aun; Yeap, Kim Ho; Malik, Aamir Saeed

    2015-01-01

    The activated sludge system is generally used in wastewater treatment plants for processing domestic influent. Conventionally, activated sludge wastewater treatment is monitored by measuring physico-chemical parameters like total suspended solids (TSSol), sludge volume index (SVI), and chemical oxygen demand (COD). For these measurements, tests are conducted in the laboratory and take many hours to yield a final value. Digital image processing and analysis offers a better alternative, not only to monitor and characterize the current state of the activated sludge but also to predict its future state. Characterization by image processing and analysis is done by correlating the time evolution of parameters extracted by image analysis of flocs and filaments with the physico-chemical parameters. This chapter briefly reviews activated sludge wastewater treatment and the procedures of image acquisition, preprocessing, segmentation, and analysis in this specific context. In the latter part, additional procedures such as z-stacking and image stitching, not previously used in the context of activated sludge, are introduced for wastewater image preprocessing. Different preprocessing and segmentation techniques are proposed, along with a survey of the imaging procedures reported in the literature. Finally, the image-analysis-based morphological parameters and their correlation with regard to monitoring and prediction of activated sludge are discussed. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.

  18. Modeling and stability analysis of the nonlinear reactive sputtering process

    Directory of Open Access Journals (Sweden)

    György Katalin

    2011-12-01

    Full Text Available The model of the reactive sputtering process has been determined from the dynamic equilibrium of the reactive gas inside the chamber and the dynamic equilibrium of the sputtered metal atoms which form the compound with the reactive gas atoms on the surface of the substrate. The analytically obtained dynamical model is a system of nonlinear differential equations which can result in a hysteresis-type input/output nonlinearity. The reactive sputtering process has been simulated by integrating these differential equations. Linearization has been applied for classical analysis of the sputtering process and control system design.

  19. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.
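
    One of the performance bounds such graph analysis yields can be illustrated with a critical-path computation on a small, hypothetical dataflow DAG; task costs are folded into edge weights for simplicity (networkx as a stand-in for the paper's tool).

```python
# Critical-path sketch: the longest weighted path through a dataflow DAG
# lower-bounds the schedule length regardless of processor count.
import networkx as nx

g = nx.DiGraph()
g.add_weighted_edges_from([
    ("read", "filter", 3), ("read", "fft", 5),
    ("filter", "mix", 2), ("fft", "mix", 4), ("mix", "out", 1),
])

path = nx.dag_longest_path(g, weight="weight")
bound = nx.dag_longest_path_length(g, weight="weight")
print(path, "critical-path bound:", bound)
```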

  20. Large-Scale Graph Processing Analysis using Supercomputer Cluster

    Science.gov (United States)

    Vildario, Alfrido; Fitriyani; Nugraha Nurkahfi, Galih

    2017-01-01

    Graphs are widely used in various sectors such as automotive, traffic, image processing, and many more. These applications produce large-scale graphs, so processing them requires long computational times and high-specification resources. This research addresses the analysis of large-scale graph processing using a supercomputer cluster. We implemented graph processing using the Breadth-First Search (BFS) algorithm for the single-destination shortest path problem. The parallel BFS implementation with the Message Passing Interface (MPI) used the supercomputer cluster at the High Performance Computing Laboratory, Computational Science, Telkom University, and the Stanford Large Network Dataset Collection. The results show that the implementation achieves an average speedup of more than 30 times and an efficiency of almost 90%.
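
    A serial, level-synchronous BFS sketch showing the frontier-expansion structure that an MPI implementation partitions across ranks (each rank owns a vertex subset and exchanges newly discovered frontier vertices every level); this is only the algorithmic core, not the parallel code.

```python
# Level-synchronous BFS for hop distances on an unweighted graph.
from collections import deque

def bfs_levels(adj, source):
    """adj: dict vertex -> iterable of neighbours; returns hop distances."""
    dist = {source: 0}
    frontier = deque([source])
    while frontier:
        u = frontier.popleft()
        for v in adj[u]:
            if v not in dist:           # first visit = shortest hop count
                dist[v] = dist[u] + 1
                frontier.append(v)
    return dist

adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}
print(bfs_levels(adj, source=0))   # {0: 0, 1: 1, 2: 1, 3: 2, 4: 3}
```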

  1. Modelling of pyrolysis of coal-biomass blends using thermogravimetric analysis.

    Science.gov (United States)

    Sadhukhan, Anup Kumar; Gupta, Parthapratim; Goyal, Tripurari; Saha, Ranajit Kumar

    2008-11-01

    The primary objective of this work was to develop an appropriate model to explain the co-pyrolysis behaviour of lignite coal-biomass blends with different proportions using a thermogravimetric analyzer. A new parallel-series kinetic model was proposed to predict the pyrolysis behaviour of biomass over the entire pyrolysis regime, while a kinetic model similar to that of Anthony and Howard [Anthony, D.B., Howard, J.B., 1976. Coal devolatilization and hydrogasification. AIChE Journal 22(4), 625-656] was used for pyrolysis of coal. Analysis of mass loss history of blends showed an absence of synergistic effect between coal and biomass. Co-pyrolysis mass-loss profiles of the blends were predicted using the estimated kinetic parameters of coal and biomass. Excellent agreement was found between the predicted and the experimental results.
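
    A sketch of the building block underlying such kinetic models: a single first-order devolatilization reaction integrated over a linear TGA heating ramp. The paper's parallel-series model stacks several such reactions; the kinetic constants and volatile fraction here are illustrative only.

```python
# One first-order devolatilization reaction under a linear heating ramp:
# dalpha/dt = A exp(-E/RT) (1 - alpha), with T = T0 + beta*t.
import numpy as np
from scipy.integrate import solve_ivp

A, E, R = 1.0e8, 1.2e5, 8.314      # 1/s, J/mol, J/(mol K) -- illustrative
beta, T0 = 10.0 / 60.0, 300.0      # 10 K/min ramp in K/s, start temperature

def dalpha_dt(t, alpha):
    T = T0 + beta * t
    return A * np.exp(-E / (R * T)) * (1.0 - alpha)

sol = solve_ivp(dalpha_dt, (0.0, 4800.0), [0.0], max_step=10.0)
T = T0 + beta * sol.t
mass = 1.0 - 0.7 * sol.y[0]        # assume a 70% volatile fraction
print(f"normalized mass at {T[-1]:.0f} K: {mass[-1]:.3f}")
```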

  2. Model-based risk analysis of coupled process steps.

    Science.gov (United States)

    Westerberg, Karin; Broberg-Hansen, Ernst; Sejergaard, Lars; Nilsson, Bernt

    2013-09-01

    A section of a biopharmaceutical manufacturing process involving the enzymatic coupling of a polymer to a therapeutic protein was characterized with regard to the process parameter sensitivity and design space. To minimize the formation of unwanted by-products in the enzymatic reaction, the substrate was added in small amounts and unreacted protein was separated using size-exclusion chromatography (SEC) and recycled to the reactor. The quality of the final recovered product was thus a result of the conditions in both the reactor and the SEC, and a design space had to be established for both processes together. This was achieved by developing mechanistic models of the reaction and SEC steps, establishing the causal links between process conditions and product quality. Model analysis was used to complement the qualitative risk assessment, and design space and critical process parameters were identified. The simulation results gave an experimental plan focusing on the "worst-case regions" in terms of product quality and yield. In this way, the experiments could be used to verify both the suggested process and the model results. This work demonstrates the necessary steps of model-assisted process analysis, from model development through experimental verification. Copyright © 2013 Wiley Periodicals, Inc.

  3. Conducting Qualitative Data Analysis: Qualitative Data Analysis as a Metaphoric Process

    Science.gov (United States)

    Chenail, Ronald J.

    2012-01-01

    In the second of a series of "how-to" essays on conducting qualitative data analysis, Ron Chenail argues the process can best be understood as a metaphoric process. From this orientation he suggests researchers follow Kenneth Burke's notion of metaphor and see qualitative data analysis as the analyst systematically considering the "this-ness" of…

  4. Processing cassava into chips for industry and export: analysis of ...

    African Journals Online (AJOL)

    Processing cassava into chips for industry and export: analysis of acceptance of technology among small holder processors in Imo State, Nigeria. ... The study revealed a medium level of acceptance (3.52) of the technology among small scale processors in the area. Finally, the results of weighted least square regression ...

  5. Army Information Technology Procurement: A Business Process Analysis

    Science.gov (United States)

    2015-03-27

    Brent Langhals, Lt Col, USAF (Member); John Elshaw, PhD (Member). Abstract: The integration of Information and Communication Technology (ICT)...the lack of transparency in how resources are allocated. This thesis presents a business process analysis of the Army's ICT procurement system. The...

  6. Cumulative analysis of measurement processes and a correcting filtration

    OpenAIRE

    MEHDIYEVA A.M.

    2016-01-01

    The proposed systematic approach to the creation of information-measuring systems for the considered parameters consists of a cumulative analysis of the measurement processes and a correcting filtration, with the aim of achieving balanced metrological, structural-algorithmic, and functional efficiency indicators for the developed means.

  7. Design and Analysis of Elliptical Nozzle in AJM Process using ...

    African Journals Online (AJOL)

    ... material removal rate (MRR), so this research mainly focuses on designing the nozzle geometry to improve the flow rate and MRR in the AJM process. An elliptical nozzle has been designed and analyzed using computational fluid dynamics (CFD) software. CFD is an efficient tool for flow rate analysis.

  8. Economic analysis of locust bean processing and marketing in Iwo ...

    African Journals Online (AJOL)

    This study was designed to carry out an economic analysis of locust bean processing and marketing in Iwo Local Government Area of Osun State, Nigeria. Primary data were used, and a purposive sampling technique was adopted to select the respondents for the study. A total of 60 respondents were interviewed ...

  9. Data Analysis as the Search for Signals in Noisy Processes.

    Science.gov (United States)

    Konold, Clifford; Pollatsek, Alexander

    2002-01-01

    Explores challenges of learning to think about data as signal and noise. Examines the signal/noise metaphor in the context of three different statistical processes: (1) repeated measures; (2) measuring individuals; and (3) dichotomous events. Makes several recommendations for research and instruction on the basis of this analysis. (Author/KHR)

  10. Multivariate statistical analysis of a multi-step industrial processes

    DEFF Research Database (Denmark)

    Reinikainen, S.P.; Høskuldsson, Agnar

    2007-01-01

    multivariate multi-step processes, where results from each step are used to evaluate future results, is presented. The methods presented are based on Priority PLS Regression. The basic idea is to compute the weights in the regression analysis for given steps, but adjust all data by the resulting score vectors...

  11. Adaptive processing of geochemical analysis by means of spline model

    Directory of Open Access Journals (Sweden)

    А. Я. Білецький

    2000-09-01

    Full Text Available An adaptive algorithm for processing geochemical analyses by means of a spline model is proposed. The possibility of splines revealing heterogeneous fields in the composition of iron ore is demonstrated; because of the considerable dispersion of the random component, such fields are practically impossible to reveal
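
    A smoothing-spline sketch under stated assumptions, with SciPy's UnivariateSpline standing in for the paper's adaptive spline model and a synthetic iron-ore assay profile:

```python
# Smoothing spline over a noisy geochemical profile: the fitted trend
# exposes a grade step hidden by the dispersion of the random component.
import numpy as np
from scipy.interpolate import UnivariateSpline

rng = np.random.default_rng(2)
depth = np.linspace(0, 100, 80)                    # borehole depth, m
fe = 30 + 5 * np.tanh((depth - 60) / 8)            # hidden grade step, wt%
assay = fe + 2.5 * rng.normal(size=depth.size)     # noisy Fe assays

# Smoothing factor s ~ m * sigma^2 is a common rule of thumb.
spline = UnivariateSpline(depth, assay, s=depth.size * 2.5 ** 2)
print(f"fitted grade at 40 m: {float(spline(40.0)):.1f} wt%,"
      f" at 80 m: {float(spline(80.0)):.1f} wt%")
```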

  12. Production yield analysis in the poultry processing industry

    NARCIS (Netherlands)

    Somsen, D.J.; Capelle, A.; Tramper, J.

    2004-01-01

    The paper outlines a case study where the PYA-method (production yield analysis) was implemented at a poultry-slaughtering line, processing 9000 broiler chicks per hour. It was shown that the average live weight of a flock of broilers could be used to predict the maximum production yield of the

  13. Sampling and sample processing in pesticide residue analysis.

    Science.gov (United States)

    Lehotay, Steven J; Cook, Jo Marie

    2015-05-13

    Proper sampling and sample processing in pesticide residue analysis of food and soil have always been essential to obtain accurate results, but the subject is becoming a greater concern as approximately 100 mg test portions are being analyzed with automated high-throughput analytical methods by agrochemical industry and contract laboratories. As global food trade and the importance of monitoring increase, the food industry and regulatory laboratories are also considering miniaturized high-throughput methods. In conjunction with a summary of the symposium "Residues in Food and Feed - Going from Macro to Micro: The Future of Sample Processing in Residue Analytical Methods" held at the 13th IUPAC International Congress of Pesticide Chemistry, this is an opportune time to review sampling theory and sample processing for pesticide residue analysis. If collected samples and test portions do not adequately represent the actual lot from which they came and provide meaningful results, then all costs, time, and efforts involved in implementing programs using sophisticated analytical instruments and techniques are wasted and can actually yield misleading results. This paper is designed to briefly review the often-neglected but crucial topic of sample collection and processing and put the issue into perspective for the future of pesticide residue analysis. It also emphasizes that analysts should demonstrate the validity of their sample processing approaches for the analytes/matrices of interest and encourages further studies on sampling and sample mass reduction to produce a test portion.

  14. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    Science.gov (United States)

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the

  15. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    Energy Technology Data Exchange (ETDEWEB)

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre [Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France) and DOSIsoft SA, 94230 Cachan (France); Research Laboratory for Innovative Processes (ERPI), Nancy University, EA 3767, 5400 Nancy Cedex (France); Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France); DOSIsoft SA, 94230 Cachan (France); Research Center for Automatic Control (CRAN), Nancy University, CNRS, 54516 Vandoeuvre-les-Nancy, France and Department of Medical Physics, Alexis Vautrin Cancer Center, 54511 Vandoeuvre-les-Nancy Cedex (France)

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should
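
    For illustration, a minimal EWMA chart over per-patient QC dose deviations, assuming an in-control target of 0% and a known process sigma of 1%; the numbers are invented, and the paper derives its action levels from the full SPC formalism rather than these defaults.

```python
# EWMA control chart sketch: z_i = lam*x_i + (1-lam)*z_{i-1}, with
# steady-state limits target +/- L*sigma*sqrt(lam/(2-lam)).
import numpy as np

def ewma_chart(deviations, target=0.0, sigma=1.0, lam=0.2, nsigma=3.0):
    z, prev = [], target
    for x in deviations:
        prev = lam * x + (1 - lam) * prev
        z.append(prev)
    half = nsigma * sigma * np.sqrt(lam / (2 - lam))
    return np.array(z), target - half, target + half

# Hypothetical % deviations between measured and calculated doses.
qc = [0.5, -0.3, 0.8, 0.2, 1.1, 1.4, 1.9, 2.3, 2.6, 3.0]
z, lcl, ucl = ewma_chart(qc)
for i, zi in enumerate(z, start=1):
    flag = "  <- drift detected" if not (lcl <= zi <= ucl) else ""
    print(f"QC {i:2d}: EWMA = {zi:+.2f}%{flag}")
```

    Note how the chart flags the upward drift while every raw deviation is still well inside the ±4% clinical tolerance, which is exactly the early-warning behavior the study reports.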

  16. Example process hazard analysis of a Department of Energy water chlorination process

    Energy Technology Data Exchange (ETDEWEB)

    1993-09-01

    On February 24, 1992, the Occupational Safety and Health Administration (OSHA) released a revised version of Section 29 Code of Federal Regulations CFR Part 1910 that added Section 1910.119, entitled "Process Safety Management of Highly Hazardous Chemicals" (the PSM Rule). Because US Department of Energy (DOE) Orders 5480.4 and 5483.1A prescribe OSHA 29 CFR 1910 as a standard in DOE, the PSM Rule is mandatory in the DOE complex. A major element in the PSM Rule is the process hazard analysis (PrHA), which is required for all chemical processes covered by the PSM Rule. The PrHA element of the PSM Rule requires the selection and application of appropriate hazard analysis methods to systematically identify hazards and potential accident scenarios associated with processes involving highly hazardous chemicals (HHCs). The analysis in this report is an example PrHA performed to meet the requirements of the PSM Rule. The PrHA method used in this example is the hazard and operability (HAZOP) study, and the process studied is the new Hanford 300-Area Water Treatment Facility chlorination process, which is currently in the design stage. The HAZOP study was conducted on May 18--21, 1993, by a team from the Westinghouse Hanford Company (WHC), Battelle-Columbus, the DOE, and Pacific Northwest Laboratory (PNL). The chlorination process was chosen as the example process because it is common to many DOE sites, and because quantities of chlorine at those sites generally exceed the OSHA threshold quantities (TQs).

  17. Data Farming Process and Initial Network Analysis Capabilities

    Directory of Open Access Journals (Sweden)

    Gary Horne

    2016-01-01

    Full Text Available Data farming, network applications, and ways of integrating network analysis into the data farming paradigm are presented as approaches to address complex system questions. Data farming is a quantified approach that examines questions in large possibility spaces using modeling and simulation. It evaluates whole landscapes of outcomes to draw insights from outcome distributions and outliers. Social network analysis and graph theory are widely used techniques for the evaluation of social systems. Incorporation of these techniques into the data farming process provides analysts examining complex systems with a powerful new suite of tools for more fully exploring and understanding the effect of interactions in complex systems. The integration of network analysis with data farming techniques provides modelers with the capability to gain insight into the effect of network attributes, whether the network is explicitly defined or emergent, on the breadth of the model outcome space and the effect of model inputs on the resultant network statistics.
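
    A sketch of folding network statistics into a data-farming loop (networkx): a random graph stands in for the interaction network produced by one simulation excursion, and the swept parameter and replicate counts are invented.

```python
# For each design point and replicate, compute graph statistics of the
# (possibly emergent) interaction network alongside the outcome measures.
import networkx as nx

records = []
for p in [0.05, 0.10, 0.20]:            # a swept model input (hypothetical)
    for rep in range(3):                # replicates per design point
        g = nx.erdos_renyi_graph(n=50, p=p, seed=rep)   # stand-in "run"
        records.append({
            "p": p,
            "rep": rep,
            "density": nx.density(g),
            "components": nx.number_connected_components(g),
            "mean_degree": sum(d for _, d in g.degree()) / g.number_of_nodes(),
        })

for r in records[:4]:
    print(r)   # landscape of network statistics across the possibility space
```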

  18. Process synthesis, design and analysis using process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario Richard; Gani, Rafiqul

    2014-01-01

    This paper describes the development and application of a framework for synthesis, design, and analysis of chemical and biochemical processes. The framework is based on the principle of group contribution used for prediction of physical properties. The fundamental pillars of this methodology are ...

  19. Integration of human reliability analysis into the high consequence process

    Energy Technology Data Exchange (ETDEWEB)

    Houghton, F.K.; Morzinski, J.

    1998-12-01

    When performing a hazards analysis (HA) for a high consequence process, human error often plays a significant role in the hazards analysis. In order to integrate human error into the hazards analysis, a human reliability analysis (HRA) is performed. Human reliability is the probability that a person will correctly perform a system-required activity in a required time period and will perform no extraneous activity that will affect the correct performance. Even though human error is a very complex subject that can only approximately be addressed in risk assessment, an attempt must be made to estimate the effect of human errors. The HRA provides data that can be incorporated in the hazard analysis event. This paper will discuss the integration of HRA into a HA for the disassembly of a high explosive component. The process was designed to use a retaining fixture to hold the high explosive in place during a rotation of the component. This tool was designed as a redundant safety feature to help prevent a drop of the explosive. This paper will use the retaining fixture to demonstrate the following HRA methodology's phases. The first phase is to perform a task analysis. The second phase is the identification of the potential human, both cognitive and psychomotor, functions performed by the worker. During the last phase the human errors are quantified. In reality, the HRA process is an iterative process in which the stages overlap and information gathered in one stage may be used to refine a previous stage. The rationale for the decision to use or not use the retaining fixture and the role the HRA played in the decision will be discussed.

  20. Digital-image processing and image analysis of glacier ice

    Science.gov (United States)

    Fitzpatrick, Joan J.

    2013-01-01

    This document provides a methodology for extracting grain statistics from 8-bit color and grayscale images of thin sections of glacier ice—a subset of physical properties measurements typically performed on ice cores. This type of analysis is most commonly used to characterize the evolution of ice-crystal size, shape, and intercrystalline spatial relations within a large body of ice sampled by deep ice-coring projects from which paleoclimate records will be developed. However, such information is equally useful for investigating the stress state and physical responses of ice to stresses within a glacier. The methods of analysis presented here go hand-in-hand with the analysis of ice fabrics (aggregate crystal orientations) and, when combined with fabric analysis, provide a powerful method for investigating the dynamic recrystallization and deformation behaviors of bodies of ice in motion. The procedures described in this document compose a step-by-step handbook for a specific image acquisition and data reduction system built in support of U.S. Geological Survey ice analysis projects, but the general methodology can be used with any combination of image processing and analysis software. The specific approaches in this document use the FoveaPro 4 plug-in toolset to Adobe Photoshop CS5 Extended but it can be carried out equally well, though somewhat less conveniently, with software such as the image processing toolbox in MATLAB, Image-Pro Plus, or ImageJ.
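
    The handbook documents a Photoshop/FoveaPro workflow, but the same kind of grain statistics can be sketched with open tools; below, a synthetic stand-in for a thin-section image is thresholded, labeled, and measured with scikit-image.

```python
# Generic grain-statistics sketch (not the USGS toolchain): segment bright
# "grains" in a synthetic grayscale image and extract size/shape statistics.
import numpy as np
from skimage import filters, measure, morphology

rng = np.random.default_rng(0)
img = np.zeros((200, 200))
for _ in range(25):                      # paint 25 roughly circular grains
    r, c = rng.integers(20, 180, size=2)
    rr, cc = np.ogrid[:200, :200]
    img[(rr - r) ** 2 + (cc - c) ** 2 < rng.integers(30, 120)] = 1.0
img += 0.1 * rng.normal(size=img.shape)  # imaging noise

mask = img > filters.threshold_otsu(img)        # global threshold
mask = morphology.remove_small_objects(mask, 20)
labels = measure.label(mask)                    # one label per grain

props = measure.regionprops(labels)
areas = [p.area for p in props]
print(f"{len(props)} grains, mean area {np.mean(areas):.0f} px,"
      f" mean eccentricity {np.mean([p.eccentricity for p in props]):.2f}")
```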

  1. Surplus analysis of Sparre Andersen insurance risk processes

    CERN Document Server

    Willmot, Gordon E

    2017-01-01

    This carefully written monograph covers the Sparre Andersen process in an actuarial context using the renewal process as the model for claim counts. A unified reference on Sparre Andersen (renewal risk) processes is included, often missing from existing literature. The authors explore recent results and analyse various risk theoretic quantities associated with the event of ruin, including the time of ruin and the deficit of ruin. Particular attention is given to the explicit identification of defective renewal equation components, which are needed to analyse various risk theoretic quantities and are also relevant in other subject areas of applied probability such as dams and storage processes, as well as queuing theory. Aimed at researchers interested in risk/ruin theory and related areas, this work will also appeal to graduate students in classical and modern risk theory and Gerber-Shiu analysis.
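
    A Monte Carlo sketch of the quantities the monograph studies analytically: a Sparre Andersen surplus process with gamma renewal interarrivals and exponential claims (both distributions and all parameters invented), estimating the finite-horizon ruin probability, time of ruin, and deficit at ruin.

```python
# Simulate surplus U(t) = u + c*t - S(t), claims arriving as a renewal
# process; record ruin events over a finite horizon.
import numpy as np

rng = np.random.default_rng(0)
u, c, horizon = 5.0, 1.5, 50.0       # initial surplus, premium rate, horizon
n_paths, ruined = 5000, 0
ruin_times, deficits = [], []

for _ in range(n_paths):
    t, claims = 0.0, 0.0
    while True:
        t += rng.gamma(shape=2.0, scale=0.5)    # renewal interarrival time
        if t >= horizon:
            break
        claims += rng.exponential(scale=1.2)    # claim size
        surplus = u + c * t - claims
        if surplus < 0:                         # ruin at this claim instant
            ruined += 1
            ruin_times.append(t)
            deficits.append(-surplus)
            break

print(f"P(ruin before T={horizon:.0f}) ~= {ruined / n_paths:.3f}")
if ruined:
    print(f"mean time of ruin {np.mean(ruin_times):.1f},"
          f" mean deficit at ruin {np.mean(deficits):.2f}")
```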

  2. System Analysis of Flat Grinding Process with Wheel Face

    Directory of Open Access Journals (Sweden)

    T. N. Ivanova

    2014-01-01

    Full Text Available The paper presents a system analysis of flat grinding with the wheel face, considering the state parameters and the input and output variables of the subsystems, namely: machine tool, workpiece, grinding wheel, cutting fluids, and the contact area. It reveals the factors influencing the temperature and power conditions of the grinding process. Aim: the system analysis of the flat grinding process with the wheel face is expected to enable the development of a system of grinding process parameters as a technical system, which will make it possible to evaluate each parameter individually and to optimize the entire system. One of the most important criteria in defining the optimal process conditions is the grinding temperature, which, to avoid the appearance of defects on the surface of the component, should not exceed critical temperature values to be determined experimentally. The temperature criterion can be useful for choosing the conditions for maximum defect-free performance of mechanical face grinding. Other criteria for maximum defect-free grinding performance include a critical power density, indirectly reflecting the allowable thermal stress of the grinding process; the structure of the ground surface, which reflects the presence or absence of a defect layer and is determined after a large number of experiments; and the flow range of the diamond layer. Optimal conditions should not exceed those of defect-free grinding. It is found that maximum performance depends on the characteristics of the wheels and the grade of the processed material, as well as on the contact area and grinding conditions. Optimal performance depends on the diamond value (cost) and the specific consumption of diamonds in a wheel. The above criteria require formalization as functions of the variable parameters of the grinding process. There is an option for an inter-criteria compromise of optimality, thereby providing a set of acceptable solutions, from

  3. Computer analysis system of the physician-patient consultation process.

    Science.gov (United States)

    Katsuyama, Kimiko; Koyama, Yuichi; Hirano, Yasushi; Mase, Kenji; Kato, Ken; Mizuno, Satoshi; Yamauchi, Kazunobu

    2010-01-01

    Measurements of the quality of physician-patient communication are important in assessing patient outcomes, but the quality of communication is difficult to quantify. The aim of this paper is to develop a computer analysis system for the physician-patient consultation process (CASC), which uses a quantitative method to quantify and analyze communication exchanges between physicians and patients during the consultation process. CASC is based on the concept of narrative-based medicine and uses a computer-mediated communication (CMC) technique from a cognitive dialog processing system. Effective and ineffective consultation samples from the works of Saito and Kleinman were tested with CASC in order to establish its validity for use in clinical practice. After validity was confirmed, three researchers compared their assessments of consultation processes in a physician's office with those of CASC. Consultations of 56 migraine patients were recorded with permission; for this study, the consultations of 29 patients that included more than 50 words were used. Transcribed data from the 29 consultations were input into CASC, resulting in two diagrams of concept structure and concept space used to assess the quality of consultation. The concordance rate between the assessments by CASC and the researchers was 75 percent. In this study, a computer-based communication analysis system was established that efficiently quantifies the quality of the physician-patient consultation process. The system is promising as an effective tool for evaluating the quality of physician-patient communication in clinical and educational settings.

  4. Stochastic analysis for gaussian random processes and fields with applications

    CERN Document Server

    Mandrekar, Vidyadhar S

    2015-01-01

    Stochastic Analysis for Gaussian Random Processes and Fields: With Applications presents Hilbert space methods to study deep analytic properties connecting probabilistic notions. In particular, it studies Gaussian random fields using reproducing kernel Hilbert spaces (RKHSs). The book begins with preliminary results on covariance and the associated RKHS before introducing the Gaussian process and Gaussian random fields. The authors use chaos expansion to define the Skorokhod integral, which generalizes the Itô integral. They show how the Skorokhod integral is a dual operator of Skorokhod differentiation.
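    For orientation, the duality in question can be stated compactly in standard Malliavin-calculus notation, for a smooth random variable F and a process u in the domain of the Skorokhod integral δ:

```latex
% Duality between the Malliavin derivative D and the Skorokhod integral \delta:
\mathbb{E}\bigl[\,F\,\delta(u)\,\bigr] \;=\; \mathbb{E}\bigl[\,\langle DF,\,u\rangle_{H}\,\bigr]
```

    When u is adapted, δ(u) coincides with the Itô integral, which is the sense in which it generalizes it.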

  5. THE ANALYSIS OF RISK MANAGEMENT PROCESS WITHIN MANAGEMENT

    Directory of Open Access Journals (Sweden)

    ROMANESCU MARCEL LAURENTIU

    2016-10-01

    Full Text Available This article highlights risk analysis within management, focusing on how a company can practically integrate risk management into its existing leadership processes. It then illustrates how to manage risk effectively, which brings numerous advantages to all firms, including an improved decision-making process. All this leads to the conclusion that the degree of risk facing companies is very high, but if managers make sound decisions they can diminish it, so that business activity and income are not influenced by factors that could disturb them in a negative way.

  6. Analysis of hazardous substances released during CFRP laser processing

    Science.gov (United States)

    Hustedt, Michael; Walter, Juergen; Bluemel, Sven; Jaeschke, Peter; Kaierle, Stefan

    2017-02-01

    Due to their outstanding mechanical properties, in particular their high specific strength parallel to the carbon fibers, carbon fiber reinforced plastics (CFRP) have a high potential for resource-efficient lightweight construction. Consequently, these composite materials are increasingly finding application in important industrial branches such as the aircraft, automotive and wind energy industries. However, the processing of these materials is highly demanding. On the one hand, mechanical processing methods such as milling or drilling are sometimes rather slow, and they are associated with notable tool wear. On the other hand, thermal processing methods are critical because the two components, matrix and reinforcement, have widely differing thermophysical properties, possibly leading to damage of the composite structure in the form of pores or delamination. An emerging innovative method for processing CFRP materials is laser technology. As a principally thermal method, laser processing is associated with the release of potentially hazardous gaseous and particulate substances. Detailed knowledge of these process emissions is the basis for ensuring the protection of man and the environment, according to the existing legal regulations. This knowledge will help to realize adequate protective measures and thus strengthen the development of CFRP laser processing. In this work, selected measurement methods and results of the analysis of the exhaust air and the air at the workplace during different laser processes with CFRP materials are presented. The investigations were performed in the course of different cooperative projects, funded by the German Federal Ministry of Education and Research (BMBF) within the funding initiative "Photonic Processes and Tools for Resource-Efficient Lightweight Structures".

  7. Towards an amplitude analysis of exclusive γγ processes

    Energy Technology Data Exchange (ETDEWEB)

    Pennington, M.R.

    1988-06-01

    The potential of two-photon processes to shed light on the parton content of resonances, we maintain, can only be realized in practice by moving towards an Amplitude Analysis of experimental data. By using the process γγ → ππ as an example, the way to do this is discussed. Presently claimed uncertainties in the γγ width of even the well-known f₂(1270) are shown to be over-optimistic, and the fitted couplings of the overlapping scalar states in the 1 GeV region meaningless. Only the use of Amplitude Analysis techniques on the new higher-statistics data from SLAC and DESY can resolve these uncertainties and lead to definite and significant results. 37 refs., 18 figs.

  8. Safety Analysis of Soybean Processing for Advanced Life Support

    Science.gov (United States)

    Hentges, Dawn L.

    1999-01-01

    Soybean (cv. Hoyt) is one of the crops planned for food production within the Advanced Life Support System Integration Testbed (ALSSIT), a proposed habitat simulation for long-duration lunar/Mars missions. Soybeans may be processed into a variety of food products, including soymilk, tofu, and tempeh. Due to the closed environmental system and the importance of crew health maintenance, food safety is a primary concern on long-duration space missions. Identification of the food safety hazards and critical control points associated with the closed ALSSIT system is essential for the development of safe food processing techniques and equipment. A Hazard Analysis Critical Control Point (HACCP) model was developed to reflect proposed production and processing protocols for ALSSIT soybeans. Soybean processing was placed in the type III risk category. During the processing of ALSSIT-grown soybeans, critical control points were identified to control microbiological hazards, particularly mycotoxins, and chemical hazards from antinutrients. Critical limits were suggested at each CCP. Food safety recommendations regarding the hazards and risks associated with growing, harvesting, and processing soybeans; biomass management; and the use of multifunctional equipment were made in consideration of the limitations and restraints of the closed ALSSIT.

  9. Spatial Analysis of Depots for Advanced Biomass Processing

    Energy Technology Data Exchange (ETDEWEB)

    Hilliard, Michael R. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Brandt, Craig C. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Webb, Erin [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Sokhansanj, Shahabaddine [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Eaton, Laurence M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Martinez Gonzalez, Maria I. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2014-09-01

    The objective of this work was to perform a spatial analysis of the total feedstock cost at the conversion reactor for biomass supplied by a conventional system and an advanced system with depots to densify biomass into pellets. From these cost estimates, the conditions (feedstock cost and availability) for which advanced processing depots make it possible to achieve cost and volume targets can be identified.

  10. Heat Balance Analysis of EPS Products Shaping Process

    Directory of Open Access Journals (Sweden)

    Władysiak R.

    2013-09-01

    Full Text Available This work is part of research into reducing energy consumption in the production of EPS through the modernization of the technological equipment used. The paper presents the results of research and analysis of the heat transfer process between the water vapour supplied to the machine, the mould, the product and the environment. It shows the calculation of the heat balance of the production cycle for two types of mould: standard and modernized. The performance tests used an infrared imaging camera, and the results were processed with computer image analysis and statistical analysis. The paper presents the main stages of the production process and the construction of the technological equipment used, the changes in the mould surface temperature field during the production cycle, and the structure of the heat balance for the mould and its instrumentation. It has been shown that the modernization of the technological equipment reduced the temperature field and consequently decreased the demand for process steam in the production cycle.

  11. Contingency Analysis Post-Processing With Advanced Computing and Visualization

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Yousu; Glaesemann, Kurt R.; Fitzhenry, Erin B.

    2017-07-03

    Contingency analysis is a critical function widely used in energy management systems to assess the impact of power system component failures. Its outputs are important for power system operation for improved situational awareness, power system planning studies, and power market operations. With the increased complexity of power system modeling and simulation caused by increased energy production and demand, the penetration of renewable energy and fast deployment of smart grid devices, and the trend of operating grids closer to their capacity for better efficiency, more and more contingencies must be executed and analyzed quickly in order to ensure grid reliability and accuracy for the power market. Currently, many researchers have proposed different techniques to accelerate the computational speed of contingency analysis, but not much work has been published on how to post-process the large amount of contingency outputs quickly. This paper proposes a parallel post-processing function that can analyze contingency analysis outputs faster and display them in a web-based visualization tool, helping power engineers improve their work efficiency through faster information digestion. Case studies using an ESCA-60 bus system and a WECC planning system are presented to demonstrate the functionality of the parallel post-processing technique and the web-based visualization tool.

  12. Numerical and experimental microscale analysis of the incremental forming process

    Science.gov (United States)

    Szyndler, Joanna; Delannay, Laurent; Muszka, Krzysztof; Madej, Lukasz

    2017-10-01

    Development of a 2D concurrent multiscale numerical model of the novel incremental forming (IF) process is the main aim of the paper. The IF process is used to obtain light and durable integral parts, which are especially useful in the aerospace and automotive industries. Particular attention in the present work is put on numerical investigation of material behavior at both the macro and micro scale levels. A Finite Element Method (FEM) supported by the Digital Material Representation (DMR) concept is used during the investigation. Crystal Plasticity (CP) theory is also applied to describe material flow at the grain level. Examples of obtained results from both the macro and micro scales are presented in the form of strain distributions, grain shapes and pole figures at different process stages. Moreover, Electron Backscatter Diffraction (EBSD) analysis is used to obtain detailed information regarding material morphology changes during incremental forming for comparison purposes.

  13. An image-processing analysis of skin textures.

    Science.gov (United States)

    Sparavigna, A; Marazzato, R

    2010-05-01

    This paper discusses an image-processing method applied to skin texture analysis. Considering that the characterisation of human skin texture is a task approached only recently by image processing, our goal is to lay out the benefits of this technique for quantitative evaluations of skin features and localisation of defects. We propose a method based on a statistical approach to image pattern recognition. The results of our statistical calculations on the grey-tone distributions of the images are presented in specific diagrams, the coherence length diagrams. Using the coherence length diagrams, we were able to determine the grain size and anisotropy of skin textures. Maps showing the localisation of defects are also proposed. According to the chosen statistical parameters of the grey-tone distribution, several procedures for defect detection can be proposed. Here, we compare the local coherence lengths with their average values. More sophisticated procedures, suggested by clinical experience, can be used to improve the image processing.

  14. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to the development and application of systematic model-based solution approaches for product-process design are discussed, and the need for a hybrid model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals-based formulated products is also given.

  15. Memory systems, processes, and tasks: taxonomic clarification via factor analysis.

    Science.gov (United States)

    Bruss, Peter J; Mitchell, David B

    2009-01-01

    The nature of various memory systems was examined using factor analysis. We reanalyzed data from 11 memory tasks previously reported in Mitchell and Bruss (2003). Four well-defined factors emerged, closely resembling episodic and semantic memory and conceptual and perceptual implicit memory, in line with both memory systems and transfer-appropriate processing accounts. To explore taxonomic issues, we ran separate analyses on the implicit tasks. Using a cross-format manipulation (pictures vs. words), we identified 3 prototypical tasks. Word fragment completion and picture fragment identification tasks were "factor pure," tapping perceptual processes uniquely. Category exemplar generation revealed its conceptual nature, yielding both cross-format priming and a picture superiority effect. In contrast, word stem completion and picture naming were more complex, revealing attributes of both processes.

  16. Thermoreflectance spectroscopy—Analysis of thermal processes in semiconductor lasers

    Science.gov (United States)

    Pierścińska, D.

    2018-01-01

    This review focuses on the theoretical foundations, experimental implementation and an overview of experimental results of thermoreflectance spectroscopy as a powerful technique for temperature monitoring and analysis of thermal processes in semiconductor lasers. This is an optical, non-contact, high spatial resolution technique providing high temperature resolution and mapping capabilities. Thermoreflectance is a thermometric technique based on measuring the relative change in reflectivity of the laser facet surface, which provides thermal images useful for hot-spot detection and reliability studies. In this paper, the principles and experimental implementation of the technique as a thermography tool are discussed. Some exemplary applications of TR to various types of lasers are presented, proving that the thermoreflectance technique provides new insight into heat management problems in semiconductor lasers and, in particular, that it allows studying thermal degradation processes occurring at laser facets. Additionally, thermal processes and basic mechanisms of degradation of semiconductor lasers are discussed.

  17. Process Equipment Failure Mode Analysis in a Chemical Industry

    Directory of Open Access Journals (Sweden)

    J. Nasl Seraji

    2008-04-01

    Full Text Available Background and aims: Prevention of potential accidents and safety promotion in chemical processes requires systematic safety management. The main objective of this study was to analyse the failure modes and effects of important process equipment components in the process of isolating H2S and CO2 from extracted natural gas. Methods: This study was done in the sweetening unit of an Iranian gas refinery. Failure Mode and Effect Analysis (FMEA) was used for the identification of process equipment failures. Results: In total, 30 failures were identified and evaluated using FMEA. Breaking of the P-1 blower's blades and tight movement of the sour-gas pressure control valve bearing had the maximum risk priority numbers (RPN); P-1 body corrosion and an increasing lower-side plug angle of the rich DEA level control valve in Tower 1 had the minimum calculated RPNs. Conclusion: By providing a reliable documentation system for recording equipment failures and incidents, basic information would be maintained for later safety assessments. Also, the probability of failures and their effects could be minimized by conducting preventive maintenance.
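    To make the ranking step concrete, here is a minimal sketch of the RPN arithmetic used in classical FMEA; the failure names and the severity/occurrence/detection scores below are hypothetical, not the study's worksheet values.

```python
# Hypothetical severity, occurrence and detection scores on a 1-10 scale.
failures = {
    "blower blade breaking":          (9, 4, 5),
    "control valve bearing seizure":  (7, 5, 5),
    "vessel body corrosion":          (6, 2, 3),
}

def rpn(severity, occurrence, detection):
    """Risk Priority Number: the product used to rank failure modes."""
    return severity * occurrence * detection

# Rank failure modes from highest to lowest risk priority.
for name, (s, o, d) in sorted(failures.items(), key=lambda kv: -rpn(*kv[1])):
    print(f"{name:32s} RPN = {rpn(s, o, d)}")
```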

  18. ACTINIDE REMOVAL PROCESS SAMPLE ANALYSIS, CHEMICAL MODELING, AND FILTRATION EVALUATION

    Energy Technology Data Exchange (ETDEWEB)

    Martino, C.; Herman, D.; Pike, J.; Peters, T.

    2014-06-05

    Filtration within the Actinide Removal Process (ARP) currently limits the throughput in interim salt processing at the Savannah River Site. In this process, batches of salt solution with Monosodium Titanate (MST) sorbent are concentrated by crossflow filtration. The filtrate is subsequently processed to remove cesium in the Modular Caustic Side Solvent Extraction Unit (MCU) followed by disposal in saltstone grout. The concentrated MST slurry is washed and sent to the Defense Waste Processing Facility (DWPF) for vitrification. During recent ARP processing, there has been a degradation of filter performance manifested as the inability to maintain high filtrate flux throughout a multi-batch cycle. The objectives of this effort were to characterize the feed streams, to determine if solids (in addition to MST) are precipitating and causing the degraded performance of the filters, and to assess the particle size and rheological data to address potential filtration impacts. Equilibrium modelling with OLI Analyzer™ and OLI ESP™ was performed to determine chemical components at risk of precipitation and to simulate the ARP process. The performance of ARP filtration was evaluated to review potential causes of the observed filter behavior. Task activities for this study included extensive physical and chemical analysis of samples from the Late Wash Pump Tank (LWPT) and the Late Wash Hold Tank (LWHT) within ARP as well as samples of the tank farm feed from Tank 49H. The samples from the LWPT and LWHT were obtained from several stages of processing of Salt Batch 6D, Cycle 6, Batch 16.

  19. Sustainable Process Design under uncertainty analysis: targeting environmental indicators

    DEFF Research Database (Denmark)

    L. Gargalo, Carina; Gani, Rafiqul

    2015-01-01

    This study focuses on the uncertainty analysis of environmental indicators used to support sustainable process design efforts. To this end, the Life Cycle Assessment methodology is extended with a comprehensive uncertainty analysis to propagate the uncertainties in input LCA data to the environmental indicators. The resulting uncertainties in the environmental indicators are then represented by empirical cumulative distribution functions, which provide a probabilistic basis for the interpretation of the indicators. In order to highlight the main features of the extended LCA, the production of biodiesel from algae biomass is used as a case study. The results indicate there are considerable uncertainties in the calculated environmental indicators, as revealed by the CDFs. The underlying sources of these uncertainties are indeed the significant variation in the databases used for the LCA analysis.
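    A minimal sketch of the propagation step described above: sample uncertain LCA inputs, compute an indicator, and build its empirical CDF. The distributions and parameter values are assumptions for illustration, not data from the study.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Illustrative input uncertainties (lognormal, common for LCA data);
# all parameter values are assumed, per kg of biodiesel.
electricity_kwh = rng.lognormal(mean=np.log(0.8), sigma=0.2, size=n)
ef_electricity  = rng.lognormal(mean=np.log(0.5), sigma=0.3, size=n)  # kg CO2-eq/kWh
methanol_kg     = rng.lognormal(mean=np.log(0.1), sigma=0.1, size=n)
ef_methanol     = rng.lognormal(mean=np.log(1.6), sigma=0.2, size=n)  # kg CO2-eq/kg

# Global warming indicator for each Monte Carlo sample.
gwp = electricity_kwh * ef_electricity + methanol_kg * ef_methanol

# Empirical CDF of the indicator, as in the extended LCA described above.
x = np.sort(gwp)
cdf = np.arange(1, n + 1) / n
print(f"median GWP = {x[n // 2]:.3f}, 95th percentile = {np.quantile(gwp, 0.95):.3f}")
```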

  20. Integrated genomic analysis of mitochondrial RNA processing in human cancers.

    Science.gov (United States)

    Idaghdour, Youssef; Hodgkinson, Alan

    2017-04-18

    The mitochondrial genome is transcribed as continuous polycistrons of RNA containing multiple genes. As a consequence, post-transcriptional events are critical for the regulation of gene expression and therefore all aspects of mitochondrial function. One particularly important process is the m1A/m1G RNA methylation of the ninth position of different mitochondrial tRNAs, which allows efficient processing of mitochondrial mRNAs and protein translation; de-regulation of genes involved in these processes has been associated with altered mitochondrial function. Although mitochondria play a key role in cancer, the status of mitochondrial RNA processing in tumorigenesis is unknown. We measure and assess mitochondrial RNA processing using integrated genomic analysis of RNA sequencing and genotyping data from 1226 samples across 12 different cancer types. We focus on the levels of m1A and m1G RNA methylation in mitochondrial tRNAs in normal and tumor samples and use supervised and unsupervised statistical analysis to compare the levels of these modifications to patient whole-genome genotypes, nuclear gene expression, and survival outcomes. We find significant changes to m1A and m1G RNA methylation levels in mitochondrial tRNAs in tumor tissues across all cancers. Pathways of RNA processing are strongly associated with methylation levels in normal tissues (P = 3.27 × 10⁻³¹), yet these associations are lost in tumors. Furthermore, we report 18 gene-by-disease-state interactions where altered RNA methylation levels occur under cancer status conditional on genotype, implicating genes associated with mitochondrial function or cancer (e.g., CACNA2D2, LMO2, and FLT3) and suggesting that nuclear genetic variation can potentially modulate an individual's ability to maintain unaltered rates of mitochondrial RNA processing under cancer status. Finally, we report a significant association between the magnitude of methylation level changes in tumors and patient survival.

  1. Gear distortion analysis due to heat treatment process

    Science.gov (United States)

    Guterres, Natalino F. D. S.; Rusnaldy, Widodo, Achmad

    2017-01-01

    One way to extend the lifetime of a gear is to minimize distortion during the manufacturing process, and one of the most important manufacturing steps for gears is heat treatment. The purpose of this study is to analyze the distortion of a gear after the heat treatment process. The gear material is AISI 1045, and the gear was designed with a module (m) of 1.75 and a number of teeth (z) of 29. The gear was heat-treated in a furnace at a temperature of 800°C with a holding time of 30 minutes, and then quenched in water. Furthermore, a surface hardening process was also performed on the gear teeth at a temperature of 820°C with a holding time of 35 seconds, and a similar analysis procedure was conducted. The average hardness of the gear after heat treatment was 63.2 HRC, and the tooth surface hardness after induction hardening was 64.9 HRC at a case depth of 1 mm. The microstructure of the tested gear consists of martensite and pearlite. The largest distortion, a tooth-thickness change greater than 0.063 mm, degrades the precision of the tooth contact; the shrinkage of the tooth thickness also affects the contact angle, because the gear size then falls outside the specified tolerance.

  2. Analysis of business process maturity and organisational performance relations

    Directory of Open Access Journals (Sweden)

    Kalinowski T. Bartosz

    2016-12-01

    Full Text Available The paper aims to present the results of a study on business process maturity in relation to organisational performance. A two-phase methodology based on a literature review and a survey was used. The literature is a source of knowledge about business process maturity and organisational performance, whereas the research on process maturity vs organisational performance in Polish enterprises provides findings based on 84 surveyed companies. The main areas of the research covered the identification and analysis of maturity-related variables and the identification of organisational performance perspectives and their relation to process maturity. The study shows that there is a significant positive relation between process maturity and organisational performance. Although studies of such a relation are available, they are scarce and have some significant limitations in terms of the research sample or the scope of maturity or organisational performance covered. This publication is part of a project funded by the National Science Centre awarded by decision number DEC-2011/01/D/HS4/04070.

  3. Image processing analysis of traditional Gestalt vision experiments

    Science.gov (United States)

    McCann, John J.

    2002-06-01

    In the late 19th century, Gestalt psychology rebelled against the popular new science of psychophysics. The Gestalt revolution used many fascinating visual examples to illustrate that the whole is greater than the sum of its parts. Color constancy was an important example. The physical interpretation of sensations and their quantification by JNDs and Weber fractions were met with innumerable examples in which two 'identical' physical stimuli did not look the same. The fact that large changes in the color of the illumination failed to change color appearance in real scenes demanded something more than quantifying the psychophysical response of a single pixel. The debate continues today with proponents of both physical, pixel-based colorimetry and perceptual, image-based cognitive interpretations. Modern instrumentation has made colorimetric pixel measurement universal. As well, new examples of unconscious inference continue to be reported in the literature. Image processing provides a new way of analyzing familiar Gestalt displays. Since the pioneering experiments of Fergus Campbell and Land, we know that human vision has independent spatial channels and independent color channels. Color matching data from color constancy experiments agree with spatial comparison analysis. In this analysis, simple spatial processes can explain the different appearances of 'identical' stimuli by analyzing the multiresolution spatial properties of their surrounds. Benary's Cross, White's Effect, the Checkerboard Illusion and the Dungeon Illusion can all be understood by the analysis of their low-spatial-frequency components. Just as with color constancy, these Gestalt images are most simply described by the analysis of spatial components. Simple spatial mechanisms account for the appearance of 'identical' stimuli in complex scenes. It does not require complex, cognitive processes to calculate appearances in familiar Gestalt experiments.

  4. Fractal texture analysis of the healing process after bone loss.

    Science.gov (United States)

    Borowska, Marta; Szarmach, Janusz; Oczeretko, Edward

    2015-12-01

    The aim was radiological assessment of the treatment effectiveness of the guided bone regeneration (GBR) method in post-resection and post-cystectomy bone loss cases, observed for one year. A group of 25 patients (17 females and 8 males) who underwent root resection with cystectomy was evaluated. The following combination therapy of intraosseous deficits was used, consisting of bone augmentation with xenogenic material together with covering regenerative membranes and tight wound closure. The bone regeneration process was estimated by comparing images taken on the day of the surgery and 12 months later by means of a Kodak RVG 6100 digital radiography set. The interpretation of a radiovisiographic image depends on the evaluation ability of the eye looking at it, which leaves a large margin of uncertainty. Therefore, several texture analysis techniques were developed and applied sequentially to the radiographic images. For each method, the results were the mean over the 25 images. These methods compute the fractal dimension (D), each one having its own theoretical basis. We used five techniques for calculating fractal dimension: the power spectral density method, the triangular prism surface area method, the blanket method, the intensity difference scaling method and variogram analysis. Our study showed a decrease of fractal dimension during the healing process after bone loss. We also found evidence that various methods of calculating fractal dimension give different results. During the healing process after bone loss, the surfaces of radiographic images became smoother. The results obtained show that our findings may be of great importance for diagnostic purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
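    Of the five estimators listed, the box-counting method is the easiest to illustrate; a sketch follows, with the threshold and the synthetic input image as assumptions (the study itself uses spectral, blanket, prism and variogram estimators on radiographs).

```python
import numpy as np

def box_counting_dimension(img, threshold=128):
    """Estimate the fractal (box-counting) dimension of a 2-D grayscale
    image by counting occupied boxes at a sequence of box sizes."""
    binary = np.asarray(img) > threshold
    box = min(binary.shape) // 2
    sizes, counts = [], []
    while box >= 2:
        # Count boxes of side `box` containing at least one foreground pixel.
        h = binary.shape[0] // box * box
        w = binary.shape[1] // box * box
        blocks = binary[:h, :w].reshape(h // box, box, w // box, box)
        occupied = blocks.any(axis=(1, 3)).sum()
        sizes.append(box)
        counts.append(max(occupied, 1))
        box //= 2
    # Slope of log N(s) versus log(1/s) gives the dimension D.
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
    return slope

# Example on synthetic noise (D should be close to 2 for a filled plane).
rng = np.random.default_rng(1)
print(box_counting_dimension(rng.integers(0, 256, size=(256, 256))))
```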

  5. Process integration and pinch analysis in sugarcane industry

    Energy Technology Data Exchange (ETDEWEB)

    Prado, Adelk de Carvalho; Pinheiro, Ricardo Brant [UFMG, Departamento de Engenharia Nuclear, Programa de Pos-Graduacao em Ciencias e Tecnicas Nucleares, Belo Horizonte, MG (Brazil)], E-mail: rbp@nuclear.ufmg.br

    2010-07-01

    Process integration techniques were applied to the sugarcane industry, particularly through the Pinch Analysis method. Research was performed on harvest data from an agroindustrial complex which processes in excess of 3.5 million metric tons of sugarcane per year, producing motor-fuel-grade ethanol and standard quality sugar, and delivering excess electric power to the grid. Pinch Analysis was used in assessing internal heat recovery as well as external utility demand targets, while keeping the lowest economically achievable targets for entropy increase. Efficiency in the use of energy was evaluated for the plant as it was found (the base case) as well as for five selected process and/or plant design modifications, always with the guidance of the method. The first alternative design (case 2) was proposed to evaluate equipment mean idle time in the base case, to support subsequent comparisons. Cases 3 and 4 were used to estimate the upper limits of combined heat and power generation while the raw material supply of the base case is kept; neither case proved worth implementing. Cases 5 and 6 were devised to deal with the bottleneck of the plant, namely boiler capacity, in order to allow for some production increment. The inexpensive, minor modifications considered in case 5 were found unable to produce a reasonable outcome gain. Nevertheless, proper changes in the cane juice evaporation section (case 6) could allow combined sugar and ethanol production to rise by up to 9.1% relative to the base case, without reducing cogenerated power. (author)
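    The utility-targeting step of Pinch Analysis can be illustrated with the standard problem-table (heat cascade) algorithm; the stream data and ΔTmin below are textbook-style assumptions, not the sugarcane plant's values.

```python
# Streams: (supply T, target T, heat capacity flowrate CP in kW/K).
hot_streams  = [(180.0, 80.0, 2.0), (130.0, 40.0, 4.0)]
cold_streams = [(60.0, 160.0, 3.0), (30.0, 120.0, 2.6)]
dt_min = 10.0  # minimum approach temperature, K

# Shift hot streams down and cold streams up by dt_min/2;
# cold streams carry a negative CP so sums give net surplus CP.
shifted = [(ts - dt_min / 2, tt - dt_min / 2, cp) for ts, tt, cp in hot_streams] + \
          [(ts + dt_min / 2, tt + dt_min / 2, -cp) for ts, tt, cp in cold_streams]

# Temperature interval boundaries, hottest first.
bounds = sorted({t for ts, tt, _ in shifted for t in (ts, tt)}, reverse=True)

# Net heat surplus in each shifted-temperature interval.
surplus = []
for hi, lo in zip(bounds, bounds[1:]):
    net_cp = sum(cp for ts, tt, cp in shifted
                 if min(ts, tt) <= lo and max(ts, tt) >= hi)
    surplus.append(net_cp * (hi - lo))

# Cascade the surpluses: the most negative cumulative value sets the
# minimum hot utility; the cold utility follows from the overall balance.
cum, worst = 0.0, 0.0
for s in surplus:
    cum += s
    worst = min(worst, cum)
qh_min = -worst
qc_min = qh_min + sum(surplus)
print(f"minimum hot utility  = {qh_min:.1f} kW")   # 20.0 for these streams
print(f"minimum cold utility = {qc_min:.1f} kW")   # 46.0 for these streams
```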

  6. Exergy-Based Efficiency Analysis of Pyrometallurgical Processes

    Science.gov (United States)

    Klaasen, Bart; Jones, Peter-Tom; Durinck, Dirk; Dewulf, Jo; Wollants, Patrick; Blanpain, Bart

    2010-12-01

    Exergy-based efficiency analysis provides a powerful tool for optimizing industrial processes. In this article, the use of this technique for pyrometallurgical applications is explored in four steps. First, the exergy concept is introduced, the outline of exergy calculations is presented, and the role of a reference state is discussed. Second, it is shown that an unambiguous exergy calculation for pyrometallurgical streams with a complex, unknown phase composition is not straightforward. Hence, a practical methodology is proposed in which a suitable phase-based stream description is estimated prior to the actual exergy calculation. For this, the equilibrium phase composition is calculated, whereas all known stream properties are incorporated as boundary conditions. Third, the proposed methodology is validated by recalculating literature results. This reveals significant deviations between exergy values of the same pyrometallurgical streams. Our results are probably more accurate because of the incorporation of additional phase-related information. Fourth, a full analysis of a zinc-recycling process is presented. In a base case scenario, the total exergetic efficiency turns out to be only 1.2 pct. Based on this result, different process modifications are suggested and evaluated quantitatively. We find that significant efficiency gains are possible.
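    For reference, the physical (thermomechanical) exergy of a stream relative to the reference state (T₀, p₀) that underlies such calculations is:

```latex
% Physical exergy of a stream relative to the reference state (T_0, p_0):
Ex_{\mathrm{ph}} \;=\; (H - H_0) \;-\; T_0\,(S - S_0)
```

    Chemical exergy terms are added when the stream composition differs from that of the reference environment, which is where the phase-based stream description discussed above becomes essential.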

  7. Integrating Process Mining and Cognitive Analysis to Study EHR Workflow.

    Science.gov (United States)

    Furniss, Stephanie K; Burton, Matthew M; Grando, Adela; Larson, David W; Kaufman, David R

    2016-01-01

    There are numerous methods to study workflow. However, few produce the kinds of in-depth analyses needed to understand EHR-mediated workflow. Here we investigated variations in clinicians' EHR workflow by integrating quantitative analysis of patterns of users' EHR interactions with in-depth qualitative analysis of user performance. We characterized 6 clinicians' patterns of information-gathering using a sequential process-mining approach. The analysis revealed 519 different screen transition patterns performed across 1569 patient cases. No one pattern was followed for more than 10% of patient cases, the 15 most frequent patterns accounted for over half of patient cases (53%), and 27% of cases exhibited unique patterns. By triangulating quantitative and qualitative analyses, we found that participants' EHR-interactive behavior was associated with their routine processes, patient case complexity, and EHR default settings. The proposed approach has significant potential to inform resource allocation for observation and training. In-depth observations helped us to explain variation across users.
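    A minimal sketch of the pattern-counting step in such a sequential process-mining analysis: each case's ordered screen transitions form one pattern whose frequency is tallied. The screen names and the tiny event log are hypothetical, not the study's data.

```python
from collections import Counter

# Illustrative event log: one list of visited EHR screens per patient case.
cases = [
    ["notes", "labs", "meds", "orders"],
    ["notes", "meds", "labs", "orders"],
    ["notes", "labs", "meds", "orders"],
    ["labs", "notes", "meds"],
]

# Treat each case's full screen-transition sequence as one pattern,
# mirroring the per-case pattern statistics reported above.
patterns = Counter(tuple(seq) for seq in cases)

for pattern, count in patterns.most_common():
    share = count / len(cases)
    print(f"{' -> '.join(pattern):40s} {count} cases ({share:.0%})")
```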

  8. Principal Component Analysis of Process Datasets with Missing Values

    Directory of Open Access Journals (Sweden)

    Kristen A. Severson

    2017-07-01

    Full Text Available Datasets with missing values arising from causes such as sensor failure, inconsistent sampling rates, and merging data from different systems are common in the process industry. Methods for handling missing data typically operate during data pre-processing, but can also occur during model building. This article considers missing data within the context of principal component analysis (PCA), a method originally developed for complete data that has widespread industrial application in multivariate statistical process control. Due to the prevalence of missing data and the success of PCA for handling complete data, several PCA algorithms that can act on incomplete data have been proposed. Here, algorithms for applying PCA to datasets with missing values are reviewed. A case study is presented to demonstrate the performance of the algorithms, and suggestions are made with respect to choosing which algorithm is most appropriate for particular settings. An alternating algorithm based on the singular value decomposition achieved the best results in the majority of test cases involving process datasets.
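    A compact sketch of the alternating SVD-based family the review found most effective: fill missing cells, fit a low-rank model, re-impute from the model, and repeat until the imputed values stop changing. This is an illustrative implementation under those assumptions, not the paper's code.

```python
import numpy as np

def pca_with_missing(X, rank=1, n_iter=200, tol=1e-8):
    """Alternating SVD imputation for PCA on data with NaNs."""
    X = np.array(X, dtype=float)
    mask = np.isnan(X)
    col_means = np.nanmean(X, axis=0)
    X[mask] = np.take(col_means, np.nonzero(mask)[1])   # start from column means
    filled_prev = X[mask].copy()
    for _ in range(n_iter):
        center = X.mean(axis=0)
        U, s, Vt = np.linalg.svd(X - center, full_matrices=False)
        approx = (U[:, :rank] * s[:rank]) @ Vt[:rank] + center
        X[mask] = approx[mask]                          # re-impute missing cells only
        if np.linalg.norm(X[mask] - filled_prev) < tol:
            break
        filled_prev = X[mask].copy()
    return X

# Nearly rank-1 toy data with two missing entries.
X = [[1.0, 2.0, 3.0],
     [2.0, np.nan, 6.1],
     [3.0, 6.0, np.nan],
     [4.0, 8.1, 12.0]]
print(np.round(pca_with_missing(X, rank=1), 2))
```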

  9. The Numerical Analysis and Experiment of Shock Processing for Beef

    Directory of Open Access Journals (Sweden)

    Y Yamashita

    2016-09-01

    Full Text Available When shock wave processing is applied to food, changes in various physical properties can be obtained. For instance, when tough beef is processed by an underwater shock wave, tenderization of the meat can be expected. The long-term goal is for the shock wave processor to become widespread as a home appliance. In designing pressure vessels suitable for food processing, the phenomena inside the vessel are complex and multiphysical in nature; numerical calculation therefore requires many parameters for the vessel material and the various foods. In this study, we chose beef as a sample for food processing. First, we obtained an unknown material parameter of the beef by measuring the shock front and the shock wave speed in the sample. We then present numerical results for shock loading of beef obtained using LS-DYNA3D. The experiments were carried out using a high-speed image converter camera, a high-speed video camera and explosive experimental facilities.

  10. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to purely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain "mapping" between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that certain functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes the time components into account when modeling business process models. An example is used throughout the paper to illustrate the proposed method.
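    As a toy illustration of the reachability side of such verification: once a BPMN model has been mapped to a Petri net, reachability can be checked by search over markings. The three-transition net below is hypothetical, and the timing information of a TPN is omitted in this sketch.

```python
from collections import deque

# Each transition: (input places with token counts, output places).
transitions = {
    "start":   ({"p_init": 1}, {"p_task": 1}),
    "do_task": ({"p_task": 1}, {"p_done": 1}),
    "finish":  ({"p_done": 1}, {"p_end": 1}),
}
initial = {"p_init": 1}
target  = {"p_end": 1}

def fire(marking, pre, post):
    """Return the successor marking, or None if the transition is disabled."""
    if any(marking.get(p, 0) < n for p, n in pre.items()):
        return None
    m = dict(marking)
    for p, n in pre.items():
        m[p] -= n
        if m[p] == 0:
            del m[p]
    for p, n in post.items():
        m[p] = m.get(p, 0) + n
    return m

def reachable(initial, target):
    """Breadth-first search over markings (terminates for bounded nets)."""
    seen = {frozenset(initial.items())}
    queue = deque([initial])
    while queue:
        m = queue.popleft()
        if all(m.get(p, 0) >= n for p, n in target.items()):
            return True
        for pre, post in transitions.values():
            nxt = fire(m, pre, post)
            if nxt is not None and frozenset(nxt.items()) not in seen:
                seen.add(frozenset(nxt.items()))
                queue.append(nxt)
    return False

print(reachable(initial, target))   # True: the end place is reachable
```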

  11. Numerical analysis of stress fields generated by quenching process

    Directory of Open Access Journals (Sweden)

    A. Bokota

    2011-04-01

    Full Text Available The numerical models of tool steel hardening processes presented in this work take into account the mechanical phenomena generated by thermal phenomena and phase transformations. In the model of mechanical phenomena, apart from thermal, plastic and structural strains, transformation plasticity was also taken into account. The stress and strain fields are obtained by a Finite Element Method solution of the equilibrium equations in rate form. The thermophysical constants occurring in the constitutive relation depend on temperature and phase composition. For the determination of plastic strain, the Huber-Mises condition with isotropic strengthening was applied, whereas for the determination of transformation plasticity a modified Leblond model was used. In order to evaluate the quality and usefulness of the presented models, a numerical analysis of the stresses and strains associated with the hardening process of a cone-shaped lathe fang made of tool steel was carried out.

  12. Analysis of problems with dry fermentation process for biogas production

    Science.gov (United States)

    Pilát, Peter; Patsch, Marek; Jandačka, Jozef

    2012-04-01

    The technology of dry anaerobic fermentation is still met with some scepticism, and therefore most biogas plants use wet fermentation technology. The fermentation process cannot proceed properly without optimally controlled conditions: dry matter content, density, pH and, in particular, the reaction temperature. To assess whether the distrust of dry fermentation is justified, an experimental small-scale biogas station was built at the Department of Power Engineering at the University of Zilina; it allows analysis of the optimal parameters of dry anaerobic fermentation, in particular the effect of the reaction temperature on the yield and quality of biogas.

  13. Analysis of Mental Processes Represented in Models of Artificial Consciousness

    Directory of Open Access Journals (Sweden)

    Luana Folchini da Costa

    2013-12-01

    Full Text Available The concept of Artificial Consciousness has been used in the engineering field as an evolution of Artificial Intelligence. However, consciousness is a complex subject and is often used without formalism. As its main contribution, this work proposes an analysis of four recent models of artificial consciousness published in the engineering literature. The mental processes represented by these models are highlighted and correlated with the theoretical perspective of cognitive psychology. Finally, considerations about consciousness in such models are discussed.

  14. Analysis of Transport Processes Management for a Romanian Food Market

    Directory of Open Access Journals (Sweden)

    Maria NEAGU

    2012-12-01

    Full Text Available This paper presents a study of the optimization of product transportation for a Romanian food market. The vehicle routing problem was solved using Lingo 13.0 software, and an analysis was conducted in order to determine the optimal routes for the vehicles under varying product demand. The developed program considers one depot from which the products are transported to six delivery points using three vehicles. Each vehicle has a constant capacity and a constant travel velocity.
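    The paper solves the routing model in Lingo; purely as an illustration of the underlying problem, here is a greedy nearest-neighbour sketch for one depot, six delivery points and capacity-limited vehicles. The coordinates, demands and capacity are made-up values, and the heuristic stands in for the exact Lingo model.

```python
import math

points = {"depot": (0, 0), "A": (2, 4), "B": (5, 1), "C": (6, 5),
          "D": (1, 7), "E": (8, 3), "F": (4, 8)}
demand = {"A": 3, "B": 4, "C": 2, "D": 5, "E": 3, "F": 4}
capacity = 8   # per vehicle; every single demand is assumed to fit

def dist(a, b):
    (x1, y1), (x2, y2) = points[a], points[b]
    return math.hypot(x1 - x2, y1 - y2)

unserved = set(demand)
routes = []
while unserved:
    route, load, here = [], 0, "depot"
    while True:
        # Candidates that still fit in the remaining vehicle capacity.
        options = [p for p in unserved if load + demand[p] <= capacity]
        if not options:
            break
        nxt = min(options, key=lambda p: dist(here, p))  # nearest neighbour
        route.append(nxt)
        load += demand[nxt]
        unserved.discard(nxt)
        here = nxt
    routes.append(route)

for i, r in enumerate(routes, 1):
    print(f"vehicle {i}: depot -> {' -> '.join(r)} -> depot")
```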

  15. Static Analysis for Event-Based XML Processing

    DEFF Research Database (Denmark)

    Møller, Anders

    2008-01-01

    Event-based processing of XML data - as exemplified by the popular SAX framework - is a powerful alternative to using W3C's DOM or similar tree-based APIs. The event-based approach processes documents in a streaming fashion with minimal memory consumption. This paper discusses challenges in creating program analyses for SAX applications. In particular, we consider the problem of statically guaranteeing that a given SAX program always produces only well-formed and valid XML output. We propose an analysis technique based on existing analyses of Servlets, string operations, and XML graphs.
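    For readers unfamiliar with the event-based style, a minimal SAX pass is shown below. SAX originated in Java, which is the setting the analysis targets; Python's built-in xml.sax offers the same event model and is used here purely as a generic illustration of streaming processing, not as the paper's tooling.

```python
import xml.sax

class ElementCounter(xml.sax.ContentHandler):
    """Counts element names; the document is never built as a tree."""
    def __init__(self):
        super().__init__()
        self.counts = {}

    def startElement(self, name, attrs):
        # One callback per opening tag; no DOM node is ever created.
        self.counts[name] = self.counts.get(name, 0) + 1

handler = ElementCounter()
xml.sax.parseString(b"<root><item/><item/><note>hi</note></root>", handler)
print(handler.counts)   # {'root': 1, 'item': 2, 'note': 1}
```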

  16. Astro-H Data Analysis, Processing and Archive

    Science.gov (United States)

    Angelini, Lorella; Terada, Yukikatsu; Loewenstein, Michael; Miller, Eric D.; Yamaguchi, Hiroya; Yaqoob, Tahir; Krimm, Hans; Harrus, Ilana; Takahashi, Hiromitsu; Nobukawa, Masayoshi; hide

    2016-01-01

    Astro-H (Hitomi) is an X-ray/gamma-ray mission led by Japan with international participation, launched on February 17, 2016. The payload consists of four different instruments (SXS, SXI, HXI and SGD) that operate simultaneously to cover the energy range from 0.3 keV up to 600 keV. This paper presents the analysis software and the data processing pipeline created to calibrate and analyze the Hitomi science data, along with the plan for the archive and user support. These activities have been a collaborative effort shared between scientists and software engineers working in several institutes in Japan and the USA.

  17. Analysis of clusterization and networking processes in developing intermodal transportation

    Directory of Open Access Journals (Sweden)

    Sinkevičius Gintaras

    2016-06-01

    Full Text Available Analysis of the processes of clusterization and networking draws attention to the necessity of integrating railway transport into the intermodal or multimodal transport chain. One of the most widespread forms of combined transport is the interoperation of railway and road transport. The objective is to create an uninterrupted transport chain by combining several modes of transport, with the aim of saving energy resources and forming an effective, competitive, client-attractive, safe and environmentally friendly transport system.

  18. Stochastic analysis in production process and ecology under uncertainty

    CERN Document Server

    Bieda, Bogusław

    2014-01-01

    The monograph addresses the problem of stochastic analysis based on uncertainty assessment by simulation, and the application of this method to ecology and the steel industry under uncertainty. The first chapter defines the Monte Carlo (MC) method and random variables in stochastic models. Chapter two deals with contaminant transport in porous media; a stochastic approach for modeling the transit time of municipal solid waste contaminants using MC simulation has been worked out. The third chapter describes the risk analysis of the waste-to-energy facility proposal for the city of Konin, including the financial aspects. An environmental impact assessment of the ArcelorMittal steel power plant in Kraków is given in chapter four; here, four scenarios of the energy mix production processes were studied. Chapter five contains examples of using ecological Life Cycle Assessment (LCA) - a relatively new method of environmental impact assessment - which helps in preparing a pro-ecological strategy, and which can lead to reducing t...

  19. Market potential analysis: Case of food-processing industry companies

    Directory of Open Access Journals (Sweden)

    Nikolić Milan

    2006-01-01

    Full Text Available The paper presents an analysis of the market potentials of domestic food-processing companies. The analysis was performed on the basis of data provided by surveying 150 experts from companies operating in the foodstuffs industry. The assessments and ranks of the parameters (company features) describing the market potentials are shown; assessments and ranks of some other parameters are shown for ease of reference and comparison. All assessments are obtained as mean values of the individual assessments given by the experts for the analyzed parameters; the experts quantitatively rated each parameter on a 0 to 10 scale according to the situation in their companies. Product quality and the company's ambitions are the parameters with the best assessments, while promotion and presence on foreign markets are the weakest.

  20. Analysis of the processes in pneumatic moulding sand reclamation

    Directory of Open Access Journals (Sweden)

    H. Szlumczyk

    2008-07-01

    Full Text Available This article covers the analysis of pneumatic reclamation of moulding sands made with different types of binders. The research was carried out on sand with a resin binder (furan resin) as well as on sand with water glass hardened with flodur. Reclamation was carried out at technical scale in a pneumatic conveying system with a linear regenerator equipped with an abrasion impact disc. The effectiveness of the reclamation was evaluated on the basis of tests determining the content of the binder's components before and after the process and on the basis of sieve analysis. The subject of this publication is a comparison of the effectiveness of these two types of solutions.

  1. Point process analysis of noise in early invertebrate vision.

    Directory of Open Access Journals (Sweden)

    Kris V Parag

    2017-10-01

    Full Text Available Noise is a prevalent and sometimes even dominant aspect of many biological processes. While many natural systems have adapted to attenuate or even usefully integrate noise, the variability it introduces often still delimits the achievable precision across biological functions. This is particularly so for visual phototransduction, the process responsible for converting photons of light into usable electrical signals (quantum bumps). Here, the randomness of both the photon inputs (regarded as extrinsic noise) and the conversion process (intrinsic noise) are seen as two distinct, independent and significant limitations on visual reliability. Past research has attempted to quantify the relative effects of these noise sources by using approximate methods that do not fully account for the discrete, point-process and time-ordered nature of the problem. As a result, the conclusions drawn from these different approaches have led to inconsistent expositions of phototransduction noise performance. This paper provides a fresh and complete analysis of the relative impact of intrinsic and extrinsic noise in invertebrate phototransduction using minimum mean squared error reconstruction techniques based on Bayesian point-process (Snyder) filters. An integrate-and-fire based algorithm is developed to reliably estimate photon times from quantum bumps, and Snyder filters are then used to causally estimate random light intensities both at the front and back end of the phototransduction cascade. Comparison of these estimates reveals that the dominant noise source transitions from extrinsic to intrinsic as light intensity increases. By extending the filtering techniques to account for delays, it is further found that among the intrinsic noise components, which include bump latency (mean delay and jitter) and shape (amplitude and width) variance, it is the mean delay that is critical to noise performance. As the timeliness of visual information is important for real-time action, this
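    The separation of noise sources described above can be mimicked in simulation: photon arrivals supply the extrinsic randomness, while bump latency, jitter and amplitude supply the intrinsic randomness. The gamma-like bump shape and all parameter values below are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

rate, duration, dt = 50.0, 1.0, 1e-3            # photons/s, s, s

# Extrinsic noise: Poisson photon arrivals.
n_photons = rng.poisson(rate * duration)
photon_times = np.sort(rng.uniform(0, duration, n_photons))

# Intrinsic noise: per-bump latency (mean delay + jitter) and amplitude.
latency = rng.normal(0.03, 0.005, n_photons)     # seconds
amplitude = rng.lognormal(0.0, 0.3, n_photons)

t = np.arange(0, duration, dt)
response = np.zeros_like(t)
for t0, a in zip(photon_times + latency, amplitude):
    # Each quantum bump modeled as a gamma-like impulse response.
    tau = np.clip(t - t0, 0, None)
    response += a * (tau / 0.01) ** 2 * np.exp(-tau / 0.01)

print(f"{n_photons} photons -> peak response {response.max():.2f} (arb. units)")
```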

  2. NeoAnalysis: a Python-based toolbox for quick electrophysiological data processing and analysis.

    Science.gov (United States)

    Zhang, Bo; Dai, Ji; Zhang, Tao

    2017-11-13

    In a typical electrophysiological experiment, especially one that includes studying animal behavior, the data collected normally contain spikes, local field potentials, behavioral responses and other associated data. In order to obtain informative results, the data must be analyzed simultaneously with the experimental settings. However, most open-source toolboxes currently available for data analysis were developed to handle only a portion of the data and did not take into account the sorting of experimental conditions. Additionally, these toolboxes require that the input data be in a specific format, which can be inconvenient to users. Therefore, the development of a highly integrated toolbox that can process multiple types of data regardless of input data format and perform basic analysis for general electrophysiological experiments is incredibly useful. Here, we report the development of a Python-based open-source toolbox, referred to as NeoAnalysis, to be used for quick electrophysiological data processing and analysis. The toolbox can import data from different data acquisition systems regardless of their formats and automatically combine different types of data into a single file with a standardized format. In cases where additional spike sorting is needed, NeoAnalysis provides a module to perform efficient offline sorting with a user-friendly interface. Then, NeoAnalysis can perform regular analog signal processing, spike train and local field potential analysis, behavioral response (e.g. saccade) detection and extraction, with several options available for data plotting and statistics. Particularly, it can automatically generate sorted results without requiring users to manually sort data beforehand. In addition, NeoAnalysis can organize all of the relevant data into an informative table on a trial-by-trial basis for data visualization. Finally, NeoAnalysis supports analysis at the population level. With the multitude of general-purpose functions provided
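    As an example of the kind of routine spike-train computation such a toolbox automates, here is a peri-stimulus time histogram in plain NumPy. This is an independent illustration and does not use the NeoAnalysis API; the toy data are synthetic.

```python
import numpy as np

def psth(spike_times, event_times, window=(-0.2, 0.5), bin_size=0.01):
    """Peri-stimulus time histogram in spikes/s, averaged over events."""
    edges = np.arange(window[0], window[1] + bin_size, bin_size)
    counts = np.zeros(len(edges) - 1)
    for ev in event_times:
        counts += np.histogram(spike_times - ev, bins=edges)[0]
    return edges[:-1], counts / (len(event_times) * bin_size)

# Toy data: homogeneous Poisson-like spiking at ~20 Hz and regular events.
rng = np.random.default_rng(3)
spikes = np.sort(rng.uniform(0, 100, 2000))   # 2000 spikes over 100 s
events = np.arange(5, 95, 5.0)
bins, rate = psth(spikes, events)
print(f"mean rate in window: {rate.mean():.1f} spikes/s")   # about 20
```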

  3. Multiobjective Optimization of ELID Grinding Process Using Grey Relational Analysis Coupled with Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    S. Prabhu

    2014-06-01

    Full Text Available A carbon nanotube (CNT) mixed grinding wheel has been used in the electrolytic in-process dressing (ELID) grinding process to analyze the surface characteristics of AISI D2 tool steel. The CNT grinding wheel has excellent thermal conductivity and good mechanical properties, which are exploited to improve the surface finish of the workpiece. Multiobjective optimization based on grey relational analysis coupled with principal component analysis has been used to optimize the process parameters of the ELID grinding process. Based on the Taguchi design of experiments, an L9 orthogonal array was chosen for the experiments. The confirmation experiment verifies that the proposed grey-based Taguchi method has the ability to find the optimal process parameters with multiple quality characteristics of surface roughness and metal removal rate. Analysis of variance (ANOVA) has been used to verify and validate the model. An empirical model for the prediction of the output parameters has been developed using regression analysis, and the results were compared with and without the use of the CNT grinding wheel in the ELID grinding process.
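    A compact sketch of the grey relational computation: normalize each response, form grey relational coefficients, and average them into a grade. The response values below are made up, and equal weights stand in for the paper's PCA-derived weights.

```python
import numpy as np

# Responses for 9 Taguchi trials (illustrative numbers, not the study's):
# column 0 = surface roughness (smaller-is-better),
# column 1 = metal removal rate (larger-is-better).
y = np.array([[0.42, 11.0], [0.38, 12.5], [0.51, 10.2],
              [0.35, 13.1], [0.47, 12.0], [0.40, 11.4],
              [0.33, 13.8], [0.45, 10.8], [0.36, 12.9]])

# Step 1: normalize each response to [0, 1] according to its objective.
norm = np.empty_like(y)
norm[:, 0] = (y[:, 0].max() - y[:, 0]) / (y[:, 0].max() - y[:, 0].min())
norm[:, 1] = (y[:, 1] - y[:, 1].min()) / (y[:, 1].max() - y[:, 1].min())

# Step 2: grey relational coefficients (distinguishing coefficient 0.5).
delta = 1.0 - norm
grc = (delta.min() + 0.5 * delta.max()) / (delta + 0.5 * delta.max())

# Step 3: grey relational grade with equal weights; the best trial has
# the highest grade.
grade = grc.mean(axis=1)
print("best trial:", int(np.argmax(grade)) + 1, "grades:", np.round(grade, 3))
```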

  4. [Analysis of complaints in primary care using statistical process control].

    Science.gov (United States)

    Valdivia Pérez, Antonio; Arteaga Pérez, Lourdes; Escortell Mayor, Esperanza; Monge Corella, Susana; Villares Rodríguez, José Enrique

    2009-08-01

    To analyze patient complaints in a Primary Health Care District (PHCD) using statistical process control methods compared to multivariate methods, as regards their results and the feasibility of application in this context. Descriptive study based on an aggregate analysis of administrative complaints received between January 2005 and August 2008 by the Customer Management Department in the 3rd PHCD Management Office, Madrid Health Services. Complaints are registered through Itrack, a computer software tool used throughout the Community of Madrid. The variables were the total number of complaints, complaints sorted by reason and by Primary Health Care Team (PHCT), and the total number of patient visits (including on-demand visits, appointment visits and home visits) by PHCT and per month and year. Multivariate analysis and control charts were used. The result was a 44-month time series with a mean of 76 complaints per month, an increasing trend in the first three years and decreases during summer months. Poisson regression detected an excess of complaints in 8 of the 44 months in the series; the control chart detected the same 8 months plus two additional ones. Statistical process control can be useful for detecting an excess of complaints in a PHCD and enables comparisons between different PHC teams. As it is a simple technique, it can be used for ongoing monitoring of customer-perceived quality.
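    A minimal sketch of a count-based control chart of the kind used in this monitoring: a c-chart with three-sigma Poisson limits. The monthly counts are synthetic, and since the study also tracks visit volume, a u-chart (counts per visit) would be the natural refinement.

```python
import numpy as np

# Synthetic monthly complaint counts standing in for the 44-month series.
counts = np.array([60, 72, 68, 81, 75, 70, 55, 50, 77, 83, 79, 74,
                   69, 88, 72, 76, 71, 64, 52, 49, 80, 86, 90, 78])

# c-chart: for Poisson counts the centre line is the mean and the
# control limits are c_bar +/- 3*sqrt(c_bar).
c_bar = counts.mean()
ucl = c_bar + 3 * np.sqrt(c_bar)
lcl = max(c_bar - 3 * np.sqrt(c_bar), 0)

out_of_control = np.nonzero((counts > ucl) | (counts < lcl))[0]
print(f"centre = {c_bar:.1f}, LCL = {lcl:.1f}, UCL = {ucl:.1f}")
print("months signalling special-cause variation:", out_of_control + 1)
```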

  5. Techno-Economic Analysis of a Secondary Air Stripper Process

    Energy Technology Data Exchange (ETDEWEB)

    Heberle, J.R. [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States); Nikolic, Heather [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Thompson, Jesse [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Liu, Kunlei [Center for Applied Energy Research, University of Kentucky, Lexington, KY (United States); Pinkerton, Lora L. [WorleyParsons, Reading, PA (United States); Brubaker, David [WorleyParsons, Reading, PA (United States); Simpson, James C. [WorleyParsons, Reading, PA (United States); Wu, Song [Mitsubishi Hitachi Power Systems America, Inc, Basking Ridge, NJ (United States); Bhown, Abhoyjit S. [Electric Power Research Inst. (EPRI), Palo Alto, CA (United States)

    2017-08-22

    We present results of an initial techno-economic assessment of a post-combustion CO2 capture process developed by the Center for Applied Energy Research (CAER) at the University of Kentucky using Mitsubishi Hitachi Power Systems' H3-1 aqueous amine solvent. The analysis is based on data collected at a 0.7 MWe pilot unit combined with laboratory data and process simulations. The process adds a secondary air stripper to a conventional solvent process, which increases the cyclic loading of the solvent in two ways. First, air strips additional CO2 from the solvent downstream of the conventional steam-heated thermal stripper. This extra stripping of CO2 reduces the lean loading entering the absorber. Second, the CO2-enriched air is then sent to the boiler for use as secondary air. This recycling of CO2 results in a higher concentration of CO2 in the flue gas sent to the absorber, and hence a higher rich loading of the solvent exiting the absorber. A process model was incorporated into a full-scale supercritical pulverized coal power plant model to determine the plant performance and heat and mass balances. The performance and heat and mass balance data were used to size equipment and develop cost estimates for capital and operating costs. Lifecycle costs were considered through a levelized cost of electricity (LCOE) assessment based on the capital cost estimate and modeled performance. The results of the simulations show that the CAER process yields a regeneration energy of 3.12 GJ/t CO2, a $53.05/t CO2 capture cost, and an LCOE of $174.59/MWh. This compares to the U.S. Department of Energy's projected costs (Case 10) of a regeneration energy of 3.58 GJ/t CO2, a $61.31/t CO2 capture cost, and an LCOE of $189.59/MWh. For H3-1, the CAER process results in a regeneration energy of 2.62 GJ/t CO2 with a stripper pressure of 5.2 bar, a capture cost of $46.93/t CO2, and an LCOE of $164.33/MWh.
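    As a reminder of how the levelized figures quoted above are formed, here is a generic LCOE calculation from annualized costs. The formulation is a textbook sketch and the numbers are illustrative plant-scale magnitudes, not the DOE costing methodology or the study's inputs.

```python
def lcoe(capital, fcr, fixed_om, var_om, fuel, mwh_per_year):
    """Levelized cost of electricity in $/MWh.
    capital: total plant cost ($); fcr: fixed charge rate (1/year);
    the remaining cost arguments are annual totals ($/year)."""
    annual_cost = capital * fcr + fixed_om + var_om + fuel
    return annual_cost / mwh_per_year

# Illustrative numbers only: $2.5B plant, 7.1% fixed charge rate,
# $60M fixed O&M, $30M variable O&M, $200M fuel, 3.9 TWh/year net.
print(f"{lcoe(2.5e9, 0.071, 6.0e7, 3.0e7, 2.0e8, 3.9e6):.2f} $/MWh")
```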

  6. Computer-aided analysis of cutting processes for brittle materials

    Science.gov (United States)

    Ogorodnikov, A. I.; Tikhonov, I. N.

    2017-12-01

    This paper is focused on 3D computer simulation of cutting processes for brittle materials and silicon wafers. Computer-aided analysis of wafer scribing and dicing is carried out with the ANSYS CAE (computer-aided engineering) software, and a parametric model of the processes is created by means of the internal ANSYS APDL programming language. Different types of tool tip geometry are analyzed to obtain the internal stresses, such as a four-sided pyramid with an included angle of 120° and a tool inclination angle to the normal axis of 15°. The quality of the workpieces after cutting is studied by optical microscopy to verify the FE (finite-element) model. The disruption of the material structure during scribing occurs near the scratch and propagates into the wafer or over its surface at short range. The deformation area along the scratch looks like a ragged band, but the width of the stressed zone is rather small. The theory of cutting brittle semiconductor and optical materials is developed on the basis of the advanced theory of metal turning. The fall in stress intensity along the normal from the tip point to the scribe line can be predicted using the developed theory and the verified FE model. The crystal quality and the dimensions of defects are determined by the mechanics of scratching, which depends on the shape of the diamond tip, the scratching direction, the velocity of the cutting tool and the applied force loads. The disruption is a rate-sensitive process, and it depends on the cutting thickness. The application of numerical techniques, such as FE analysis, to cutting problems enhances understanding and promotes the further development of existing machining technologies.

  7. Process variability band analysis for quantitative optimization of exposure conditions

    Science.gov (United States)

    Sturtevant, John L.; Jayaram, Srividya; Hong, Le

    2009-03-01

    One of the critical challenges facing lithographers is how to optimize the numerical aperture (NA) and the illumination source intensity and polarization distribution to deliver the maximum process window for a given design in manufacturing. While the maximum NA has topped out at 1.35, the available illuminator options continue to increase, including the eventual possibility of dynamically programmable pixelized illumination to deliver nearly any imaginable source shape profile. New approaches to leverage this capability and simultaneously optimize the source and mask shapes (SMO) on a per-design basis are actively being developed. Even with the available "standard" illumination source primitive shapes, however, there exists a huge range of possible choices available to the lithographer. In addition, there are multiple conceivable cost functions which could be considered when determining which illumination to utilize for a specified technology and mask layer. These are related to the primary lithographic variables of exposure dose, focus, and mask size, and include depth of focus (DOF), exposure latitude (EL), normalized image log slope (NILS), image contrast, and mask error enhancement factor (MEEF). The net result can be a very large quantity of simulation data which can prove difficult to assess and often manifests different extrema, depending upon which cost function is emphasized. We report here on the use of several analysis methods, including process variability bands, as convenient metrics to optimize full-chip post-OPC CD control in conjunction with illumination optimization tooling. The result is a more thorough and versatile statistical analysis capability than what has traditionally been possible with a CD cutline approach. The method is analogous to the conventional process window CD plots used in lithography for many years.

  8. Simulated interprofessional education: an analysis of teaching and learning processes.

    Science.gov (United States)

    van Soeren, Mary; Devlin-Cop, Sandra; Macmillan, Kathleen; Baker, Lindsay; Egan-Lee, Eileen; Reeves, Scott

    2011-11-01

    Simulated learning activities are increasingly being used in health professions and interprofessional education (IPE). Specifically, IPE programs are frequently adopting role-play simulations as a key learning approach. Despite this widespread adoption, there is little empirical evidence exploring the teaching and learning processes embedded within this type of simulation. This exploratory study provides insight into the nature of these processes through the use of qualitative methods. A total of 152 clinicians, 101 students and 9 facilitators, representing a range of health professions, participated in video-recorded role-plays and debrief sessions. Videotapes were analyzed to explore emerging issues and themes in the teaching and learning processes of this type of interprofessional simulated learning experience. In addition, three focus groups were conducted with a subset of participants to explore perceptions of their educational experiences. Five key themes emerged from the data analysis: enthusiasm and motivation, professional role assignment, scenario realism, facilitator style and background, and team facilitation. Our findings suggest that program developers need to be mindful of these five themes when using role-plays in an interprofessional context, and point to the importance of deliberate and skilled facilitation in meeting desired learning outcomes.

  9. Exergetic analysis of distillation processes - A case study

    Energy Technology Data Exchange (ETDEWEB)

    Araujo, Antonio B. [Braskem Company, Avenue Assis Chateaubriand 5260, Pontal da Barra, Maceio, Alagoas 57010 800 (Brazil); Brito, Romildo P.; Vasconcelos, Luis S. [Department of Chemical Engineering, Federal University of Campina Grande, Av Aprigio Veloso, 882, Campina Grande, PB 58109 970 (Brazil)

    2007-07-15

    The concept of exergy has been introduced to establish a universal standard for quality and efficient use of energy. In this work, applications of this concept to compression, heat exchange, and separation processes, in addition to the computation of their irreversibility rate and thermodynamic efficiency, are considered. An industrial case study on the purification of 1,2-ethylenedichloride (EDC) in a high-purity distillation column is presented. Due to its large throughput, this distillation column consumes a large amount of thermal energy (steam to the reboiler) and in order to reduce the energy requirements without large process modifications, a new configuration using a vapour compression heat pump is proposed which yields considerable improvement in the use of energy. Both configurations were implemented using the commercial simulator Aspen Plus™; the results of the original configuration were validated with data extracted from the plant. The objective of this work was to compare the original configuration and the new proposed one, from a thermodynamic approach. Furthermore, two forms of process thermodynamic analysis based on the concept of exergy were applied to the new proposed configuration. (author)

  11. Efficient planning and numerical analysis of industrial hemming processes

    Science.gov (United States)

    Burchitz, Igor; Fritsche, David; Grundmann, Göran; Hillmann, Matthias

    2011-08-01

    Hemming is a forming operation used in the automotive industry to join inner and outer components during the assembly of closures. These are typically opening parts of the body-in-white with additional requirements on their visual appearance. A suitable production concept for the hemming operation, satisfying quality, capacity and cost requirements, is determined during hemming planning. This paper presents a digital tool to facilitate these activities and minimize the amount of trial-and-error iteration in the try-out phase. The tool can be used to define the process plan, active tool surfaces and suitable process parameters for both die hemming and roll hemming operations. In early feasibility studies, when the die layout of the drawing operation is not yet available, the 3D part geometry is used directly to develop the hemming process concept. Advanced validation studies, aimed at process optimization and controlling defects associated with hemming, can be based on a complete simulation of all forming operations. Validation and analysis of the developed hemming concepts is done using the standard AutoForm-Incremental solver. A submesh strategy and a special algorithm for contact description between the inner and outer parts were implemented to ensure that accurate simulation results can be obtained within reasonable calculation time. The performance of the new software tool for hemming planning and the accuracy of the simulation results are demonstrated using several simple benchmarks and a real industrial part. It is shown that the new software tool can help to secure an efficient production launch by providing adequate support in the try-out phase.

  12. Simulation and analysis of conjunctive use with MODFLOW's farm process

    Science.gov (United States)

    Hanson, R.T.; Schmid, W.; Faunt, C.C.; Lockwood, B.

    2010-01-01

    The extension of MODFLOW onto the landscape with the Farm Process (MF-FMP) facilitates fully coupled simulation of the use and movement of water from precipitation, streamflow and runoff, groundwater flow, and consumption by natural and agricultural vegetation throughout the hydrologic system at all times. This allows for a more complete analysis of conjunctive-use water-resource systems than previously possible with MODFLOW, by combining relevant aspects of the landscape with the groundwater and surface-water components. The analysis is accomplished using distributed, cell-by-cell, supply-constrained and demand-driven components across the landscape within "water-balance subregions" comprised of one or more model cells that can represent a single farm, a group of farms, or other hydrologic or geopolitical entities. Simulations of micro-agriculture in the Pajaro Valley and macro-agriculture in the Central Valley are used to demonstrate the utility of MF-FMP. For the Pajaro Valley, the simulation of an aquifer storage and recovery system and a related coastal water distribution system to supplant coastal pumpage was analyzed subject to climate variations and additional supplemental sources such as local runoff. For the Central Valley, analysis of conjunctive use in the different hydrologic settings of the northern and southern subregions shows how and when precipitation, surface water, and groundwater are important to conjunctive use. The examples show that, through MF-FMP's ability to simulate natural and anthropogenic components of the hydrologic cycle, the distribution and dynamics of supply and demand can be analyzed, understood, and managed. Such analysis of conjunctive use would be difficult without embedding these components in the simulation, as they are difficult to estimate a priori. Journal compilation © 2010 National Ground Water Association. No claim to original US government works.

  13. Analysis of the requirements generation process for the Logistics Analysis and Wargame Support Tool

    OpenAIRE

    Swan, Jonathan M.

    2017-01-01

    Approved for public release; distribution is unlimited. This thesis conducts an analysis of the system requirements for the Logistics Analysis and Wargame Support Tool (LAWST). It studies the process used to develop those requirements, and the potential requirements had a systems engineering (SE) approach been used. The original requirements for LAWST are found in documentation provided by the Marine Corps Expeditionary Energy Office (E2O), along with information indicating the sources of thos...

  14. Marketing Mix Formulation for Higher Education: An Integrated Analysis Employing Analytic Hierarchy Process, Cluster Analysis and Correspondence Analysis

    Science.gov (United States)

    Ho, Hsuan-Fu; Hung, Chia-Chi

    2008-01-01

    Purpose: The purpose of this paper is to examine how a graduate institute at National Chiayi University (NCYU), by using a model that integrates analytic hierarchy process, cluster analysis and correspondence analysis, can develop effective marketing strategies. Design/methodology/approach: This is primarily a quantitative study aimed at…

  15. Alternative process schemes for coal conversion. Progress report No. 2, February 1-April 30, 1979

    Energy Technology Data Exchange (ETDEWEB)

    Sansone, M.J.

    1979-05-01

    The importance of gas separation methods to the economics of hydrogasification and catalytic gasification processes has been emphasized. This importance is due to the fact that these processes require large amounts of recycled hydrogen, or hydrogen and carbon monoxide, from which the product methane must be removed via some economical method. For example, the Exxon catalytic gasification process utilizes a cryogenic distillation to achieve the separation of CH4 from H2 and CO. In this report, the energetics of a cryogenic separation process for hydrogen-methane mixtures are calculated and compared with the energy requirements for the separation of H2/CH4 and H2/CO/CH4 mixtures using a gas hydrate separation scheme. It must be stated at the outset that the success of the proposed hydrate process depends upon the kinetics of hydrate formation, for which we have no data. Nevertheless, it is still worthwhile to examine such a process within a thermodynamic framework to determine if such a scheme is at least energetically, if not kinetically, feasible.
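
    A natural starting point for such energetic comparisons is the reversible (minimum) work of separating an ideal gas mixture, sketched below in Python; the H2/CH4 composition used here is an illustrative assumption, not a figure from the report.

      import numpy as np

      # Ideal minimum (reversible) work to fully separate an ideal binary gas
      # mixture at temperature T: the thermodynamic lower bound from which any
      # real separation scheme (cryogenic, hydrate, ...) departs.
      R, T = 8.314, 298.15                 # J/(mol*K), K
      x = np.array([0.6, 0.4])             # assumed mole fractions of H2 and CH4
      w_min = -R * T * np.sum(x * np.log(x))
      print(f"minimum separation work: {w_min:.0f} J per mole of mixture")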

  16. Analysis of trajectory entropy for continuous stochastic processes at equilibrium.

    Science.gov (United States)

    Haas, Kevin R; Yang, Haw; Chu, Jhih-Wei

    2014-07-17

    The analytical expression for the trajectory entropy of the overdamped Langevin equation is derived via two approaches. The first route goes through the Fokker-Planck equation that governs the propagation of the conditional probability density, while the second method goes through the path integral of the Onsager-Machlup action. The agreement of these two approaches in the continuum limit underscores the equivalence between the partial differential equation and the path integral formulations for stochastic processes in the context of trajectory entropy. The values obtained using the analytical expression are also compared with those calculated with numerical solutions for arbitrary time resolutions of the trajectory. Quantitative agreement is clearly observed consistently across different models as the time interval between snapshots in the trajectories decreases. Furthermore, analysis of different scenarios illustrates how the deterministic and stochastic forces in the Langevin equation contribute to the variation in dynamics measured by the trajectory entropy.
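
    For orientation, one common textbook convention for the overdamped Langevin equation and its Onsager-Machlup action is sketched below in LaTeX; this is the standard form, not an expression taken from the paper itself.

      % Overdamped Langevin dynamics with drift f(x) and diffusion constant D:
      %   dx_t = f(x_t)\,dt + \sqrt{2D}\,dW_t
      % Onsager-Machlup action of a path x(t), 0 <= t <= T (midpoint convention):
      S_{\mathrm{OM}}[x] = \int_0^T \left[
          \frac{\bigl(\dot{x}(t) - f(x(t))\bigr)^2}{4D}
          + \frac{1}{2}\, f'(x(t))
      \right] \mathrm{d}t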

  17. Extended Poisson process modelling and analysis of grouped binary data.

    Science.gov (United States)

    Faddy, Malcolm J; Smith, David M

    2012-05-01

    A simple extension of the Poisson process results in binomially distributed counts of events in a time interval. A further extension generalises this to probability distributions under- or over-dispersed relative to the binomial distribution. Substantial levels of under-dispersion are possible with this modelling, but only modest levels of over-dispersion - up to Poisson-like variation. Although simple analytical expressions for the moments of these probability distributions are not available, approximate expressions for the mean and variance are derived, and used to re-parameterise the models. The modelling is applied in the analysis of two published data sets, one showing under-dispersion and the other over-dispersion. More appropriate assessment of the precision of estimated parameters and reliable model checking diagnostics follow from this more general modelling of these data sets. © 2012 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Accident sequences and causes analysis in a hydrogen production process

    Energy Technology Data Exchange (ETDEWEB)

    Jae, Moo Sung; Hwang, Seok Won; Kang, Kyong Min; Ryu, Jung Hyun; Kim, Min Soo; Cho, Nam Chul; Jeon, Ho Jun; Jung, Gun Hyo; Han, Kyu Min; Lee, Seng Woo [Hanyang Univ., Seoul (Korea, Republic of)

    2006-03-15

    Since a hydrogen production facility using the IS process requires high-temperature heat from a nuclear power plant, a safety assessment should be performed to guarantee the safety of the facility. First of all, accident cases in hydrogen production and utilization were surveyed. Based on the results, risk factors arising in a hydrogen production facility were identified, and the correlations between risk factors were schematized using an influence diagram. Initiating events of the hydrogen production facility were also identified, and accident scenario development and quantification were performed. PSA methodology was used for the identification of initiating events, a master logic diagram was used to select them, and event tree analysis was used to quantify the accident scenarios. The sum of all leakage frequencies is 1.22×10⁻⁴, which is similar to the value (1.0×10⁻⁴) for core damage frequency that the International Nuclear Safety Advisory Group of the IAEA suggested as a criterion.

  19. Bony change of apical lesion healing process using fractal analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Ji Min; Park, Hyok; Jeong, Ho Gul; Kim, Kee Deog; Park, Chang Seo [Yonsei University College of Medicine, Seoul (Korea, Republic of)

    2005-06-15

    To investigate the change in the bone healing process after endodontic treatment of teeth with apical lesions using fractal analysis. Radiographic images of 35 teeth from 33 patients, taken at first diagnosis and at 6 months and 1 year after endodontic treatment, were selected. Radiographic images were taken with the JUPITER computerized dental X-ray system. Fractal dimensions were calculated three times at each area with the Scion Image PC program. Rectangular regions of interest (30 × 30) were selected at the apical lesion and at the normal apex of each image. The fractal dimension at the apical lesion at first diagnosis (L0) was 0.940 ± 0.361 and that of the normal area (N0) was 1.186 ± 0.727 (p<0.05). The fractal dimension at the apical lesion 6 months after endodontic treatment (L1) was 1.076 ± 0.069 and that of the normal area (N1) was 1.192 ± 0.055 (p<0.05). The fractal dimension at the apical lesion 1 year after endodontic treatment (L2) was 1.163 ± 0.074 and that of the normal area (N2) was 1.225 ± 0.079 (p<0.05). After endodontic treatment, the fractal dimensions at the apical lesions showed statistically significant differences over time, and there were statistically significant differences between the normal area and the apical lesion at first diagnosis, 6 months and 1 year after treatment, but these differences grew smaller with time. The prognosis after endodontic treatment of an apical lesion is evaluated through bone regeneration in the apical region. Fractal analysis was attempted to overcome the limits of subjective reading, and as a result the change in the bone during the healing process could be detected objectively and quantitatively.
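
    The record does not describe the internal algorithm of the Scion Image program used here, but a generic box-counting estimate of the fractal dimension of a binary region of interest, sketched below in Python, conveys the idea.

      import numpy as np

      def box_counting_dimension(binary_roi):
          """Estimate the fractal dimension of a 2-D binary pattern by box counting."""
          n = binary_roi.shape[0]                 # assume a square ROI
          sizes = [s for s in (2, 4, 8, 16) if s < n]
          counts = []
          for s in sizes:
              # Count boxes of side s containing at least one foreground pixel.
              boxes = 0
              for i in range(0, n, s):
                  for j in range(0, n, s):
                      if binary_roi[i:i+s, j:j+s].any():
                          boxes += 1
              counts.append(boxes)
          # Dimension = -slope of log(count) versus log(box size).
          slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
          return -slope

      rng = np.random.default_rng(0)
      roi = rng.random((32, 32)) > 0.5            # stand-in for a thresholded ROI
      print(f"estimated dimension: {box_counting_dimension(roi):.3f}")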

  20. Characterization of the biosolids composting process by hyperspectral analysis.

    Science.gov (United States)

    Ilani, Talli; Herrmann, Ittai; Karnieli, Arnon; Arye, Gilboa

    2016-02-01

    Composted biosolids are widely used as a soil supplement to improve soil quality. However, the application of immature or unstable compost can cause the opposite effect. To date, determining compost maturity is time consuming and cannot be done at the composting site. Hyperspectral spectroscopy has been suggested as a simple tool for assessing compost maturity and quality. Nevertheless, there is still a gap in knowledge regarding several compost maturation characteristics, such as dissolved organic carbon, NO3, and NH4 contents. In addition, this approach had not yet been tested on samples at their natural water content. Therefore, in the current study, hyperspectral analysis was employed to characterize the biosolids composting process as a function of composting time. This goal was achieved by correlating the reflectance spectra in the range of 400-2400 nm, using a partial least squares regression (PLS-R) model, with the chemical properties of wet and oven-dried biosolid samples. The results showed that the proposed method can be used as a reliable means to evaluate compost maturity and stability. Specifically, the PLS-R model was found to be an adequate tool for evaluating the biosolids' total carbon and dissolved organic carbon, total nitrogen and dissolved nitrogen, and nitrate content, as well as the absorbance ratio at 254/365 nm (E2/E3) and the C/N ratio in the dry and wet samples. It failed, however, to predict the ammonium content in the dry samples, since the ammonium evaporated during the drying process. In contrast to what is commonly assumed, the spectral analysis of the wet samples can also be successfully used to build a model for predicting the maturity of the biosolids compost. Copyright © 2015 Elsevier Ltd. All rights reserved.

  1. Analysis of DIRAC's behavior using model checking with process algebra

    Science.gov (United States)

    Remenska, Daniela; Templon, Jeff; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Graciani Diaz, Ricardo; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-12-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple; the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally, or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for the analysis of an abstract model of a system. Unlike conventional testing, it allows full control over the execution of the parallel processes, and supports exhaustive state-space exploration. We used the mCRL2 language and toolset to model the behavior of two related DIRAC subsystems: the workload and storage management systems. Based on process algebra, mCRL2 allows defining custom data types as well as functions over them. This makes it suitable for modeling the data manipulations made by DIRAC's agents. By visualizing the state space and replaying scenarios with the toolkit's simulator, we have detected race conditions and deadlocks in these systems which, in several cases, were confirmed to occur in reality. Several properties of interest were formulated and verified with the tool. Our future direction is automating the translation from DIRAC to a formal model.
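
    The power of the approach lies in exhaustive state-space exploration. As a language-neutral illustration (a toy system in Python, not DIRAC's actual model and not mCRL2 syntax), the sketch below exhaustively searches a two-agent locking system for deadlock states.

      from collections import deque

      def find_deadlocks(initial, successors):
          """Exhaustively explore a finite transition system, collecting states
          with no outgoing transitions: the core idea behind model checking."""
          seen, frontier, deadlocks = {initial}, deque([initial]), []
          while frontier:
              state = frontier.popleft()
              next_states = successors(state)
              if not next_states:
                  deadlocks.append(state)
              for nxt in next_states:
                  if nxt not in seen:
                      seen.add(nxt)
                      frontier.append(nxt)
          return deadlocks

      # Toy example: two agents that may each acquire one of two locks (a, b).
      # State: (locks held by agent 1, locks held by agent 2) as frozensets.
      def successors(state):
          held1, held2 = state
          taken = held1 | held2
          moves = []
          for lock in ("a", "b"):
              if lock not in taken:
                  moves.append((held1 | {lock}, held2))
                  moves.append((held1, held2 | {lock}))
          # An agent holding both locks makes progress and releases them;
          # one lock each, with both agents waiting, is the classic deadlock.
          if held1 == {"a", "b"} or held2 == {"a", "b"}:
              moves.append((frozenset(), frozenset()))
          return moves

      start = (frozenset(), frozenset())
      print(find_deadlocks(start, successors))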

  2. NMR data visualization, processing, and analysis on mobile devices.

    Science.gov (United States)

    Cobas, Carlos; Iglesias, Isaac; Seoane, Felipe

    2015-08-01

    Touch-screen computers are emerging as a popular platform for many applications, including those in chemistry and the analytical sciences. In this work, we present our implementation of a new NMR 'app' designed for hand-held and portable touch-controlled devices, such as smartphones and tablets. It features a flexible architecture formed by a powerful NMR processing and analysis kernel and an intuitive user interface that makes full use of the smart device's haptic capabilities. Routine 1D and 2D NMR spectra acquired on most NMR instruments can be processed in a fully unattended way. More advanced experiments, such as non-uniformly sampled NMR spectra, are also supported through a very efficient parallelized Modified Iterative Soft Thresholding algorithm. Specific technical development features, as well as the overall feasibility of using NMR software apps, are also discussed. All aspects considered, the functionality of the app allows it to work as a stand-alone tool or as a 'companion' to more advanced desktop applications such as Mnova NMR. Copyright © 2015 John Wiley & Sons, Ltd.

  3. An Accident Precursor Analysis Process Tailored for NASA Space Systems

    Science.gov (United States)

    Groen, Frank; Stamatelatos, Michael; Dezfuli, Homayoon; Maggio, Gaspare

    2010-01-01

    Accident Precursor Analysis (APA) serves as the bridge between existing risk modeling activities, which are often based on historical or generic failure statistics, and system anomalies, which provide crucial information about the failure mechanisms that are actually operative in the system and which may differ in frequency or type from those in the various models. These discrepancies between the models (perceived risk) and the system (actual risk) provide the leading indication of an underappreciated risk. This paper presents an APA process developed specifically for NASA Earth-to-Orbit space systems. The purpose of the process is to identify and characterize potential sources of system risk as evidenced by anomalous events which, although not necessarily presenting an immediate safety impact, may indicate that an unknown or insufficiently understood risk-significant condition exists in the system. Such anomalous events are considered accident precursors because they signal the potential for severe consequences that may occur in the future, due to causes that are discernible from their occurrence today. Their early identification allows them to be integrated into the overall system risk model used to inform decisions relating to safety.

  4. Quantitative analysis of histopathological findings using image processing software.

    Science.gov (United States)

    Horai, Yasushi; Kakimoto, Tetsuhiro; Takemoto, Kana; Tanaka, Masaharu

    2017-10-01

    In evaluating pathological changes in drug efficacy and toxicity studies, morphometric analysis can be quite robust. In this experiment, we examined whether morphometric changes of major pathological findings in various tissue specimens stained with hematoxylin and eosin could be recognized and quantified using image processing software. Using Tissue Studio, hypertrophy of hepatocytes and adrenocortical cells could be quantified based on the method of a previous report, but the regions of red pulp, white pulp, and marginal zones in the spleen could not be recognized when using a single setting condition. Using Image-Pro Plus, lipid-derived vacuoles in the liver and mucin-derived vacuoles in the intestinal mucosa could be quantified using two criteria (area and/or roundness). Vacuoles derived from phospholipid could not be quantified when small lipid deposition coexisted in the liver and adrenal cortex. Mononuclear inflammatory cell infiltration in the liver could be quantified to some extent, except for specimens with many clustered infiltrating cells. Adipocyte size and the mean linear intercept could be quantified easily and efficiently using morphological processing and the macro tool provided in Image-Pro Plus. These methodologies are expected to form a base system that can recognize morphometric features and quantitatively analyze pathological findings through the use of information technology.

  5. Parallel Latent Semantic Analysis using a Graphics Processing Unit

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Xiaohui [ORNL; Potok, Thomas E [ORNL; Cavanagh, Joseph M [ORNL

    2009-01-01

    Latent Semantic Analysis (LSA) can be used to reduce the dimensions of large term-document datasets using Singular Value Decomposition. However, with the ever-expanding size of datasets, current implementations are not fast enough to quickly and easily compute the results on a standard PC. The Graphics Processing Unit (GPU) can solve some highly parallel problems much faster than the traditional sequential processor (CPU). Thus, a deployable system using a GPU to speed up large-scale LSA processes would be a much more effective choice (in terms of cost/performance ratio) than using a computer cluster. In this paper, we present a parallel LSA implementation on the GPU, using NVIDIA's Compute Unified Device Architecture (CUDA) and Compute Unified Basic Linear Algebra Subprograms (CUBLAS). The performance of this implementation is compared to a traditional LSA implementation on the CPU using an optimized Basic Linear Algebra Subprograms library. For large matrices that have dimensions divisible by 16, the GPU algorithm ran five to six times faster than the CPU version.
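
    The core computation is a truncated SVD of the term-document matrix. A minimal CPU sketch in Python/NumPy with toy counts is shown below; the paper's GPU version routes the same linear algebra through CUDA/CUBLAS kernels instead.

      import numpy as np

      # Toy term-document matrix: rows are terms, columns are documents.
      terms_docs = np.array([
          [2, 0, 1, 0],
          [0, 3, 0, 1],
          [1, 1, 0, 2],
          [0, 0, 4, 1],
      ], dtype=float)

      k = 2                                        # latent dimensions to keep
      U, s, Vt = np.linalg.svd(terms_docs, full_matrices=False)
      docs_reduced = (np.diag(s[:k]) @ Vt[:k]).T   # documents in k-dim latent space

      # Cosine similarity between documents 0 and 2 in the reduced space.
      a, b = docs_reduced[0], docs_reduced[2]
      print(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))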

  6. Sensitivity analysis on parameters and processes affecting vapor intrusion risk

    KAUST Repository

    Picone, Sara

    2012-03-30

    A one-dimensional numerical model was developed and used to identify the key processes controlling vapor intrusion risks by means of a sensitivity analysis. The model simulates the fate of a dissolved volatile organic compound present below the ventilated crawl space of a house. In contrast to the vast majority of previous studies, this model accounts for vertical variation of soil water saturation and includes aerobic biodegradation. The attenuation factor (ratio between concentration in the crawl space and source concentration) and the characteristic time to approach maximum concentrations were calculated and compared for a variety of scenarios. These concepts allow an understanding of controlling mechanisms and aid in the identification of critical parameters to be collected for field situations. The relative distance of the source to the nearest gas-filled pores of the unsaturated zone is the most critical parameter because diffusive contaminant transport is significantly slower in water-filled pores than in gas-filled pores. Therefore, attenuation factors decrease and characteristic times increase with increasing relative distance of the contaminant dissolved source to the nearest gas diffusion front. Aerobic biodegradation may decrease the attenuation factor by up to three orders of magnitude. Moreover, the occurrence of water table oscillations is of importance. Dynamic processes leading to a retreating water table increase the attenuation factor by two orders of magnitude because of the enhanced gas phase diffusion. © 2012 SETAC.

  7. Radionuclides in Bayer process residues: previous analysis for radiological protection

    Energy Technology Data Exchange (ETDEWEB)

    Cuccia, Valeria; Rocha, Zildete, E-mail: vc@cdtn.b, E-mail: rochaz@cdtn.b [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil); Oliveira, Arno H. de, E-mail: heeren@nuclear.ufmg.b [Universidade Federal de Minas Gerais (DEN/UFMG), Belo Horizonte, MG (Brazil). Dept. de Engenharia Nuclear

    2011-07-01

    Naturally occurring radionuclides are present in many natural resources. Human activities may enhance the concentrations of radionuclides and/or the potential for exposure to naturally occurring radioactive material (NORM). Industrial residues containing radionuclides have been receiving considerable global attention because of the large amounts of NORM-containing wastes and the potential long-term risks of long-lived radionuclides. Within this global concern, this work focuses on the characterization of radioactivity in the main residues of the Bayer process for alumina production: red mud and sand samples. Usually the residues of the Bayer process are named red mud in their totality; however, in the industry where the samples were collected, there is an additional separation of the residues into sand and red mud. The analytical techniques used were gamma spectrometry (HPGe detector) and neutron activation analysis. The concentrations of radionuclides are higher in the red mud than in the sand, and these solid residues present enhanced activity concentrations compared to bauxite. Further use of the residues as building material must be evaluated more thoroughly from the radiological point of view, due to its potential enhancement of radiological exposure, especially through radon emission. (author)

  8. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes

    DEFF Research Database (Denmark)

    Abdul Samad, Noor Asma Fazli Bin; Sin, Gürkan; Gernaey, Krist

    2013-01-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty
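
    A minimal Python sketch of Monte Carlo input-uncertainty propagation of the kind described follows; the power-law growth model and the parameter distributions are illustrative assumptions, not those of the cited framework.

      import numpy as np

      rng = np.random.default_rng(42)
      n_samples = 5000

      # Assume a power-law crystal growth rate G = kg * S**g with uncertain kg, g.
      kg = rng.normal(loc=1.0e-7, scale=1.0e-8, size=n_samples)   # growth constant
      g  = rng.normal(loc=1.5,    scale=0.1,    size=n_samples)   # growth order
      S  = 1.2                                                    # supersaturation

      G = kg * S**g                      # propagate the samples through the model
      lo, hi = np.percentile(G, [2.5, 97.5])
      print(f"growth rate G: mean={G.mean():.3e}, 95% interval=({lo:.3e}, {hi:.3e})")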

  9. A signal processing analysis of Purkinje cells in vitro

    Directory of Open Access Journals (Sweden)

    Ze'ev R Abrams

    2010-05-01

    Cerebellar Purkinje cells in vitro fire recurrent sequences of Sodium and Calcium spikes. Here, we analyze the Purkinje cell using harmonic analysis, and our experiments reveal that its output signal comprises three distinct frequency bands, which are combined using Amplitude and Frequency Modulation (AM/FM). We find that the three characteristic frequencies (Sodium, Calcium and Switching) occur in various combinations in all waveforms observed using whole-cell current-clamp recordings. We found that the Calcium frequency can display a doubling of its frequency mode, and that the Switching frequency can act as a possible generator of the pauses that are typically seen in Purkinje output recordings. Using a reversibly photo-switchable kainate receptor agonist, we demonstrate the external modulation of the Calcium and Switching frequencies. These experiments and Fourier analysis suggest that the Purkinje cell can be understood as a harmonic signal oscillator, enabling a higher level of interpretation of Purkinje signaling based on modern signal processing techniques.
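
    The band decomposition rests on standard Fourier analysis. The Python sketch below recovers three dominant frequencies from a synthetic trace; the band frequencies and amplitudes are invented stand-ins for the Sodium, Calcium and Switching components.

      import numpy as np

      fs = 1000.0                          # sampling rate, Hz
      t = np.arange(0, 2.0, 1 / fs)
      signal = (1.0 * np.sin(2 * np.pi * 150 * t) +     # fast "Sodium-like" band
                0.5 * np.sin(2 * np.pi * 10 * t) +      # slower "Calcium-like" band
                0.2 * np.sin(2 * np.pi * 1 * t))        # slow "Switching-like" band

      spectrum = np.abs(np.fft.rfft(signal))
      freqs = np.fft.rfftfreq(signal.size, d=1 / fs)

      # Report the three strongest spectral peaks.
      peaks = freqs[np.argsort(spectrum)[-3:]]
      print(f"dominant frequencies: {np.sort(peaks)} Hz")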

  10. Visual Analysis and Processing of Clusters Structures in Multidimensional Datasets

    Science.gov (United States)

    Bondarev, A. E.

    2017-05-01

    The article is devoted to problems of visual analysis of cluster structures in multidimensional datasets. For the visual analysis, the elastic maps approach [1,2] is applied, which is quite suitable for processing and visualizing multidimensional datasets. To analyze clusters in the original data volume, elastic maps are used to map the original data points onto enclosed manifolds of lower dimensionality. By diminishing the elasticity parameters, one can design a map surface that approximates the multidimensional dataset in question much better. The points of the dataset are then projected onto the map, and unfolding the designed map onto a flat plane gives insight into the cluster structure of the multidimensional dataset. The elastic maps approach does not require any a priori information about the data in question and does not depend on the data's nature, origin, etc. Elastic maps are usually combined with the PCA approach; presented in the space spanned by the first three principal components, the elastic maps provide quite good results. The article describes the results of applying the elastic maps approach to the visual analysis of clusters in different multidimensional datasets, including medical data.

  11. A meta-analysis of motivational interviewing process: Technical, relational, and conditional process models of change.

    Science.gov (United States)

    Magill, Molly; Apodaca, Timothy R; Borsari, Brian; Gaume, Jacques; Hoadley, Ariel; Gordon, Rebecca E F; Tonigan, J Scott; Moyers, Theresa

    2018-02-01

    In the present meta-analysis, we test the technical and relational hypotheses of Motivational Interviewing (MI) efficacy. We also propose an a priori conditional process model in which heterogeneity of technical path effect sizes should be explained by interpersonal/relational (i.e., empathy, MI Spirit) and intrapersonal (i.e., client treatment-seeking status) moderators. A systematic review identified k = 58 reports, describing 36 primary studies and 40 effect sizes (N = 3,025 participants). Statistical methods calculated the inverse-variance-weighted pooled correlation coefficient for the therapist-to-client and the client-to-outcome paths across multiple target behaviors (i.e., alcohol use, other drug use, other behavior change). Therapist MI-consistent skills were correlated with more client change talk (r = .55, p < .001), and the technical hypothesis was supported. Specifically, proportion MI consistency was related to higher proportion change talk (r = .11, p = .004), and higher proportion change talk was related to reductions in risk behavior at follow-up (r = -.16, p < .001). Heterogeneity of the technical hypothesis path effect sizes was partially explained by inter- and intrapersonal moderators. This meta-analysis provides additional support for the technical hypothesis of MI efficacy; future research on the relational hypothesis should occur in the field rather than in the context of clinical trials. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
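
    The pooled effect sizes above are inverse-variance-weighted correlations. Below is a minimal Python sketch of the standard fixed-effect computation (Fisher z-transform, weights n - 3) on toy inputs, not the meta-analysis data.

      import numpy as np

      def pooled_correlation(rs, ns):
          """Inverse-variance-weighted pooled correlation via Fisher's z transform."""
          rs, ns = np.asarray(rs, float), np.asarray(ns, float)
          z = np.arctanh(rs)            # Fisher r-to-z transform
          w = ns - 3                    # inverse variance of z is 1/(n - 3)
          z_pooled = np.sum(w * z) / np.sum(w)
          return np.tanh(z_pooled)      # back-transform to r

      print(pooled_correlation(rs=[0.55, 0.40, 0.20], ns=[120, 80, 60]))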

  12. Analysis of the medication reconciliation process conducted at hospital admission

    Directory of Open Access Journals (Sweden)

    María Beatriz Contreras Rey

    2016-07-01

    Objective: To analyze the outcomes of a medication reconciliation process at admission in the hospital setting, and to assess the role of the pharmacist in detecting reconciliation errors and preventing any adverse events entailed. Method: A retrospective study was conducted to analyze the medication reconciliation activity during the previous six months. The study included those patients for whom an apparently unjustified discrepancy was detected at admission, after comparing the hospital medication prescribed with the home treatment stated in their hospital clinical records. Those patients for whom the physician ordered the introduction of home medication without any specification were also considered. In order to conduct the reconciliation process, the pharmacist prepared the best possible pharmacotherapeutic history, reviewing all available information about the medication the patient could have been taking before admission, and completing the process with a clinical interview. The discrepancies requiring clarification were reported to the physician. The reconciliation proposal was considered accepted if the relevant modification was made at the physician's next visit, or within 24-48 hours at most; such a case was then labeled a reconciliation error. For the descriptive analysis, the SPSS Statistics® program, version 17.0, was used. Outcomes: 494 medications were reconciled in 220 patients, with a mean of 2.25 medications per patient. More than half of the patients (59.5%) had some discrepancy that required clarification; the most frequent was the omission of a medication that the patient was taking before admission (86.2%), followed by an unjustified modification in dosing or route of administration (5.9%). In total, 312 discrepancies required clarification; of these, 93 (29.8%) were accepted and considered reconciliation errors, 126 (40%) were not accepted, and in 93 cases (29.8%) acceptance was not relevant due to a change in

  13. Qualitative process analysis and modelling of emergency care workflow and interface management: identification of critical process steps.

    Science.gov (United States)

    Möckel, Martin; Searle, Julia; Hüttner, Ingo; Vollert, Joern O

    2015-04-01

    Over the past few years, the number of patients attending emergency services has increased steadily. As a result, emergency departments (EDs) worldwide face frequent crowding, with the risk of reduced treatment quality and impaired patient outcomes, patient and staff dissatisfaction, and inefficient use of ED resources. A qualitative process analysis and process modelling were used to detect critical process steps in the ED with respect to time and efficiency. The analysis was carried out by independent external process experts. Over a period of 1 week, the complete treatment process of 25 patients was recorded. The monitoring of overall activities, decision points, causalities and interfaces was based on the treatment of 100 additional patients and on interviews with nurses and physicians. The project closed with the identification of the three most critical process steps and the modelling of the standard emergency care process as an event-process chain (EPC). The most time-critical steps detected by the analysis were the process of developing a tentative diagnosis, including consultation and advice-seeking by inexperienced physicians, the interface to imaging diagnostics, and the search for hospital beds for inpatients. The results were visualized by standardized EPC modelling. The process analysis helped to identify inefficient process steps in the ED. Modelling with EPCs is a useful tool to visualize and understand the complexity of emergency medical care and to identify key performance indicators for effective quality management.

  14. HYBRID SULFUR PROCESS REFERENCE DESIGN AND COST ANALYSIS

    Energy Technology Data Exchange (ETDEWEB)

    Gorensek, M.; Summers, W.; Boltrunis, C.; Lahoda, E.; Allen, D.; Greyvenstein, R.

    2009-05-12

    This report documents a detailed study to determine the expected efficiency and product costs for producing hydrogen via water-splitting using energy from an advanced nuclear reactor. It was determined that the overall efficiency from nuclear heat to hydrogen is high, and the cost of hydrogen is competitive under a high energy cost scenario. It would require over 40% more nuclear energy to generate an equivalent amount of hydrogen using conventional water-cooled nuclear reactors combined with water electrolysis compared to the proposed plant design described herein. There is a great deal of interest worldwide in reducing dependence on fossil fuels, while also minimizing the impact of the energy sector on global climate change. One potential opportunity to contribute to this effort is to replace the use of fossil fuels for hydrogen production by the use of water-splitting powered by nuclear energy. Hydrogen production is required for fertilizer (e.g. ammonia) production, oil refining, synfuels production, and other important industrial applications. It is typically produced by reacting natural gas, naphtha or coal with steam, which consumes significant amounts of energy and produces carbon dioxide as a byproduct. In the future, hydrogen could also be used as a transportation fuel, replacing petroleum. New processes are being developed that would permit hydrogen to be produced from water using only heat or a combination of heat and electricity produced by advanced, high temperature nuclear reactors. The U.S. Department of Energy (DOE) is developing these processes under a program known as the Nuclear Hydrogen Initiative (NHI). The Republic of South Africa (RSA) also is interested in developing advanced high temperature nuclear reactors and related chemical processes that could produce hydrogen fuel via water-splitting. This report focuses on the analysis of a nuclear hydrogen production system that combines the Pebble Bed Modular Reactor (PBMR), under development by

  15. Analysis of Wide-Band Signals Using Wavelet Array Processing

    Science.gov (United States)

    Nisii, V.; Saccorotti, G.

    2005-12-01

    Wavelet transforms allow for precise time-frequency localization in the analysis of non-stationary signals. In wavelet analysis, the trade-off between frequency bandwidth and time duration, also known as the Heisenberg inequality, is bypassed using a fully scalable modulated window, which solves the signal-cutting problem of the windowed Fourier transform. We propose a new seismic array data processing procedure capable of displaying the localized spatial coherence of the signal in both the time and frequency domains, in turn deriving the propagation parameters of the most coherent signals crossing the array. The procedure consists of: a) wavelet coherence analysis for each station pair of the array, aimed at retrieving the frequency and time localization of coherent signals. To this purpose, we use the normalized wavelet cross-power spectrum, smoothed along the time and scale domains. We calculate different coherence spectra adopting smoothing windows of increasing lengths; a final, robust estimate of the time-frequency localization of spatially coherent signals is eventually retrieved from the stack of the individual coherence distributions. This step allows for quick and reliable signal discrimination: wave groups propagating across the network manifest as high-coherence patches spanning the corresponding time-scale region. b) Once the signals have been localized in the time and frequency domains, their propagation parameters are estimated using a modified MUSIC (MUltiple SIgnal Characterization) algorithm. We select the MUSIC approach as it has demonstrated superior performance in the case of low-SNR signals, multiple plane waves simultaneously impinging on the array, and closely separated sources. The narrow-band Coherent Signal Subspace technique is applied to the complex Continuous Wavelet Transform of multichannel data to improve the singularity of the estimated cross-covariance matrix and the accuracy of the estimated signal eigenvectors. Using

  17. Kinetic analysis of manure pyrolysis and combustion processes.

    Science.gov (United States)

    Fernandez-Lopez, M; Pedrosa-Castro, G J; Valverde, J L; Sanchez-Silva, L

    2016-12-01

    Due to the depletion of fossil fuel reserves and the environmental issues derived from their use, biomass seems to be an excellent source of renewable energy. In this work, the kinetics of the pyrolysis and combustion of three different biomass waste samples (two dairy manure samples taken before (Pre) and after (Dig R) anaerobic digestion, and one swine manure sample (SW)) were studied by means of thermogravimetric analysis. Three iso-conversional methods (Friedman, Flynn-Wall-Ozawa (FWO) and Kissinger-Akahira-Sunose (KAS)) were compared with the Coats-Redfern method. The Ea values of the devolatilization stages were in the range of 152-170 kJ/mol, 148-178 kJ/mol and 156-209 kJ/mol for samples Pre, Dig R and SW, respectively. Concerning the combustion process, the char oxidation stages showed lower Ea values than those obtained for the combustion devolatilization stage, being in the range of 140-175 kJ/mol, 178-199 kJ/mol and 122-144 kJ/mol for samples Pre, Dig R and SW, respectively. These results were practically the same for samples Pre and Dig R, which means that the kinetics of the thermochemical processes were not affected by anaerobic digestion. Finally, the distributed activation energy model (DAEM) and the pseudo-multi-component stage model (PMSM) were applied to predict the weight loss curves of pyrolysis and combustion; DAEM was the model that best fitted the experimental data. Copyright © 2016 Elsevier Ltd. All rights reserved.
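
    The iso-conversional idea can be illustrated with the Flynn-Wall-Ozawa relation, ln(beta) = const - 1.052*Ea/(R*T) at a fixed conversion level, so Ea follows from the slope of ln(beta) against 1/T. The Python sketch below uses made-up heating rates and temperatures, not the paper's measurements.

      import numpy as np

      R = 8.314  # J/(mol*K)

      def fwo_activation_energy(betas, temps_K):
          """FWO estimate of Ea at one fixed conversion level:
          ln(beta) = const - 1.052 * Ea / (R * T)  =>  Ea = -slope * R / 1.052."""
          slope, _ = np.polyfit(1.0 / np.asarray(temps_K), np.log(betas), 1)
          return -slope * R / 1.052

      betas = [5, 10, 20, 40]                # heating rates, K/min (illustrative)
      temps = [590.0, 602.0, 615.0, 628.0]   # T at a fixed conversion (illustrative)
      print(f"Ea ~ {fwo_activation_energy(betas, temps) / 1000:.0f} kJ/mol")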

  18. Design of process displays based on risk analysis techniques

    Energy Technology Data Exchange (ETDEWEB)

    Lundtang Paulsen, J

    2004-05-01

    This thesis deals with the problems of designing display systems for process plants. We state the reasons why it is important to discuss information systems for operators in a control room, especially in view of the enormous amount of information available in computer-based supervision systems. The state of the art is discussed: how are supervision systems designed today, and why? Which strategies are used? What kind of research is going on? Four different plants and their display systems, designed by the author, are described and discussed. Next, we outline different methods for eliciting knowledge of a plant, particularly its risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics, etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the information necessary to the operator. The method, a combination of a goal tree and a fault tree, is described in some detail. Finally, we address the problem of where to put the dots and the lines: when all the information is on the table, how should it be presented most adequately? Included as an appendix is a paper concerning the analysis of maintenance reports and the visualization of their information; its purpose was to develop a software tool for the maintenance supervision of components in a nuclear power plant. (au)

  19. Processing, analysis, recognition, and automatic understanding of medical images

    Science.gov (United States)

    Tadeusiewicz, Ryszard; Ogiela, Marek R.

    2004-07-01

    This paper presents some new ideas introducing automatic understanding of the semantic content of medical images. The idea under consideration is the next step on a path that starts from capturing images in digital form as two-dimensional data structures, goes through image processing as a tool for enhancing image visibility and readability, applies image analysis algorithms for extracting selected features of the images (or parts of images, e.g. objects), and ends with algorithms devoted to image classification and recognition. In the paper we try to explain why all the procedures mentioned above cannot give full satisfaction in many important medical problems, where we need to understand the semantic sense of an image, not only describe it in terms of selected features and/or classes. The general idea of automatic image understanding is presented, as well as some remarks about successful applications of such ideas for increasing the potential capabilities and performance of computer vision systems dedicated to advanced medical image analysis. This is achieved by means of a linguistic description of the merit content of the picture. After this, we try to use new AI methods to undertake the task of automatic understanding of image semantics in intelligent medical information systems. Successfully obtaining the crucial semantic content of a medical image may contribute considerably to the creation of new intelligent multimedia cognitive medical systems. Thanks to the new idea of cognitive resonance between the stream of data extracted from the image using linguistic methods and the expectations taken from the representation of medical knowledge, it is possible to understand the merit content of the image even if the form of the image is very different from any known pattern.

  20. Sensitivity Analysis of Proposed LNG liquefaction Processes for LNG FPSO

    OpenAIRE

    Pwaga, Sultan Seif

    2011-01-01

    Four liquefaction processes proposed as good candidates for LNG FPSO are simulated and evaluated: a single mixed refrigerant (SMR), a dual mixed refrigerant (DMR), Niche LNG (a CH4 and N2 process) and a dual nitrogen expander. Steady-state HYSYS simulations of the processes were undertaken to ensure that each simulated liquefaction process was compared on identical parameters. An in-depth optimization has not been conducted, but the simulation was aimed at obtain...

  1. Human movement analysis with image processing in real time

    Science.gov (United States)

    Fauvet, Eric; Paindavoine, Michel; Cannard, F.

    1991-04-01

    In the field of the human sciences, many applications need to know the kinematic characteristics of human movements. Psychology associates these characteristics with the control mechanism; sport and biomechanics associate them with the performance of the sportsman or of the patient. Trainers or doctors can thus correct the subject's gesture to obtain a better performance if they know the motion properties. Roberton's studies show the evolution of children's motion [2]. Several investigation methods are able to measure human movement, but most studies are now based on image processing. Often the systems work at the TV standard (50 frames per second), so they only permit the study of very slow gestures. A human operator manually analyses the digitized film sequence, a very expensive, long and imprecise operation. On these different grounds, many human movement analysis systems were implemented. They consist of: markers, which are fixed to the anatomically interesting points on the subject in motion; and image compression, which is the art of coding picture data. Generally the compression is limited to calculating the centroid coordinates for each marker. These systems differ from one another in image acquisition and marker detection.

  2. Probabilistic Analysis of the Hard Rock Disintegration Process

    Directory of Open Access Journals (Sweden)

    K. Frydrýšek

    2008-01-01

    Full Text Available This paper focuses on a numerical analysis of the hard rock (ore) disintegration process. The bit moves and sinks into the hard rock (mechanical contact with friction between the ore and the cutting bit) and subsequently disintegrates it. The disintegration (i.e. the stress-strain relationship, contact forces, reaction forces and fracture of the ore) is solved via the FEM (MSC.Marc/Mentat software) and the SBRA (Simulation-Based Reliability Assessment) method (Monte Carlo simulations, Anthill and Mathcad software). The ore is disintegrated by deactivating the finite elements which satisfy the fracture condition. The material properties of the ore (i.e. yield stress, fracture limit, Young's modulus and Poisson's ratio) are given by bounded histograms (i.e. stochastic inputs which better describe reality). The results (reaction forces in the cutting bit) are also stochastic quantities and are compared with experimental measurements. Application of the SBRA method in this area is a modern and innovative trend in mechanics. However, it takes a long time to solve this problem (due to material and structural nonlinearities, the large number of elements, many iteration steps and many Monte Carlo simulations). Parallel computers were therefore used to handle the large computational needs of this problem.
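
    The SBRA ingredients named above (material inputs given by bounded histograms, Monte Carlo propagation, a stochastic output) can be sketched in a few lines. A minimal illustration in Python with NumPy; the bounds, means and the algebraic response surface are hypothetical placeholders, whereas in the paper the reaction force comes from a full FEM contact solution:

        import numpy as np

        rng = np.random.default_rng(42)
        N = 100_000  # number of Monte Carlo simulations

        # Bounded histograms are approximated here by rejection sampling:
        # draw normals, discard values outside the physical bounds.
        def bounded_normal(mean, std, lo, hi, size):
            out = np.empty(0)
            while out.size < size:
                draw = rng.normal(mean, std, size)
                out = np.concatenate([out, draw[(draw >= lo) & (draw <= hi)]])
            return out[:size]

        yield_stress = bounded_normal(80e6, 8e6, 60e6, 100e6, N)    # Pa
        fracture_lim = bounded_normal(0.02, 0.004, 0.01, 0.03, N)   # strain

        # Placeholder algebraic surrogate for the FEM response.
        bit_area = 1e-4  # m^2, hypothetical
        reaction_force = yield_stress * bit_area * (1.0 + 5.0 * fracture_lim)

        print(f"mean reaction force: {reaction_force.mean():.0f} N")
        print("95% interval:", np.percentile(reaction_force, [2.5, 97.5]))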

  3. FOREWORD: Focus on Materials Analysis and Processing in Magnetic Fields Focus on Materials Analysis and Processing in Magnetic Fields

    Science.gov (United States)

    Sakka, Yoshio; Hirota, Noriyuki; Horii, Shigeru; Ando, Tsutomu

    2009-03-01

    Recently, interest in the applications of feeble (diamagnetic and paramagnetic) magnetic materials has grown, whereas the popularity of ferromagnetic materials remains steady and high. This trend is due to the progress of superconducting magnet technology, particularly liquid-helium-free superconducting magnets that can generate magnetic fields of 10 T and higher. As the magnetic energy is proportional to the square of the applied magnetic field, the magnetic energy of such 10 T magnets is in excess of 10 000 times that of conventional 0.1 T permanent magnets. Consequently, many interesting phenomena have been observed over the last decade, such as the Moses effect, magnetic levitation and the alignment of feeble magnetic materials. Researchers in this area are widely spread around the world, but their number in Japan is relatively high, which might explain the success of magnetic field science and technology in Japan. Processing in magnetic fields is a rapidly expanding research area with a wide range of promising applications in materials science. The 3rd International Workshop on Materials Analysis and Processing in Magnetic Fields (MAP3), which was held on 14-16 May 2008 at the University of Tokyo, Japan, focused on various topics including magnetic field effects on chemical, physical, biological, electrochemical, thermodynamic and hydrodynamic phenomena; magnetic field effects on the crystal growth and processing of materials; diamagnetic levitation, the magneto-Archimedes effect, spin chemistry, magnetic orientation, control of structure by magnetic fields, magnetic separation and purification, magnetic-field-induced phase transitions, properties of materials in high magnetic fields, the development of NMR and MRI, medical applications of magnetic fields, novel magnetic phenomena, physical property measurement by magnetic fields, and the generation of high magnetic fields. This focus issue compiles 13 key papers selected from the proceedings of MAP3. Other

  4. Exergy analysis for a freeze-drying process

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Yongzhong; Zhao, Yanfei; Feng, Xiao [Department of Chemical Engineering, Xi' an Jiaotong University, 28, Xianning West Road, Xi' an, Shaanxi 710049 (China)

    2008-05-15

    A mathematical model for exergy loss analysis of a freeze-drying process was established to evaluate the exergy losses in the individual operations and the distribution of exergy losses in a freeze-dryer. The exergy losses of five operations, namely freezing, primary drying, secondary drying and vapor condensation as well as vacuum pumping, were included in the model. The unique feature of the model is the incorporation of dynamics into the expressions of the exergy loss analyses for the freezing, primary drying and secondary drying stages. The distribution of exergy losses at various operating parameters of freeze-drying was investigated using this model, taking freeze-drying of beef as an example. The effects of various operating conditions on the exergy losses in the three stages were investigated. The results show that the exergy consumption in the primary drying reaches 35.69% of the total exergy input, while the exergy consumption in vapor condensing is 31.76% of the total exergy input. Vacuum pumping consumes 23.29% of the total exergy input. In contrast, the exergy consumption in the freezing and the secondary drying is only 3.56% and 5.71% of the total exergy input, respectively. The exergy analyses based on various operating parameters show that the exergy losses of the drying process can be remarkably reduced by increasing the temperature of the cooling source in the vapor condenser. In this study, when the temperature of the cooling source in the vapor condenser increases from -70 °C to -25 °C, the total exergy losses fall from 1409 kJ/kg (moist basis) to 604 kJ/kg (moist basis). This indicates that increasing the temperature of vapor condensation is an effective way to reduce the total exergy losses of the system, as long as the conditions of drying dynamics are satisfied during drying. Moreover, there is an optimal surface temperature corresponding to the minimum total exergy losses during the primary and secondary drying stages. If the surface

  5. School Board Decision Making: An Analysis of the Process

    Science.gov (United States)

    Crum, Karen S.

    2007-01-01

    The goal of this study was to analyze the characteristics in the school board decision-making process and to discover whether school board members are aware of the characteristics surrounding the school board's decision-making process. Specifically, this study examines the decision-making process of a school board in Virginia, and it provides…

  6. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
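
    The relationship between a process safety factor and its reliability level that such a study elucidates can be sketched compactly. A minimal Monte Carlo example in Python with NumPy; the lognormal uncertainty model and the coefficients of variation are hypothetical placeholders, not taken from the paper:

        import numpy as np

        rng = np.random.default_rng(0)
        N = 200_000

        # Hypothetical lognormal uncertainty in treatment capacity and load;
        # reliability = P(capacity >= load).
        def reliability(safety_factor, cov_capacity=0.3, cov_load=0.2):
            sigma_c = np.sqrt(np.log(1 + cov_capacity**2))
            sigma_l = np.sqrt(np.log(1 + cov_load**2))
            capacity = rng.lognormal(np.log(safety_factor), sigma_c, N)
            load = rng.lognormal(0.0, sigma_l, N)
            return (capacity >= load).mean()

        for sf in (1.0, 1.5, 2.0, 3.0):
            print(f"safety factor {sf:.1f} -> reliability {reliability(sf):.3f}")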

  7. Discovery and analysis of e-mail-driven business processes

    NARCIS (Netherlands)

    Stuit, Marco; Wortmann, Hans

    E-mail is used as the primary tool for business communication and collaboration. This paper presents a novel e-mail interaction mining method to discover and analyze e-mail-driven business processes. An e-mail-driven business process is perceived as a human collaboration process that consists of

  8. Micromagnetic Modeling and Analysis for Memory and Processing Applications

    Science.gov (United States)

    Lubarda, Marko V.

    Magnetic nanostructures are vital components of numerous existing and prospective magnetic devices, including hard disk drives, magnetic sensors, and microwave generators. The ability to examine and predict the behavior of magnetic nanostructures is essential for improving existing devices and exploring new technologies and areas of application. This thesis consists of three parts. In part I, key concepts of magnetism are covered (chapter 1), followed by an introduction to micromagnetics (chapter 2). Key interactions are discussed. The Landau-Lifshitz-Gilbert equation is introduced, and the variational approach of W. F. Brown is presented. Part II is devoted to computational micromagnetics. Interaction energies, fields and torques, introduced in part I, are transcribed from the continuum to their finite element form. The validity of developed models is discussed with reference to physical assumptions and discretization criteria. Chapter 3 introduces finite element modeling, and provides derivations of micromagnetic fields in the linear basis representation. Spin transfer torques are modeled in chapter 4. Thermal effects are included in the computational framework in chapter 5. Chapter 6 discusses an implementation of the nudged elastic band method for the computation of energy barriers. A model accounting for polycrystallinity is developed in chapter 7. The model takes into account the wide variety of distributions and imperfections which characterize true systems. The modeling presented in chapters 3-7 forms a general framework for the computational study of diverse magnetic phenomena in contemporary structures and devices. Chapter 8 concludes part II with an outline of powerful acceleration schemes, which were essential for the large-scale micromagnetic simulations presented in part III. Part III begins with the analysis of the perpendicular magnetic recording system (chapter 9). A simulation study of the recording process with readback analysis is presented
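
    The Landau-Lifshitz-Gilbert equation mentioned above is, in its standard Gilbert form (a textbook statement, not quoted from the thesis),

        \frac{\partial \mathbf{m}}{\partial t}
            = -\gamma \, \mathbf{m} \times \mathbf{H}_{\mathrm{eff}}
              + \alpha \, \mathbf{m} \times \frac{\partial \mathbf{m}}{\partial t},

    where \mathbf{m} is the unit magnetization vector, \mathbf{H}_{\mathrm{eff}} the effective field collecting the exchange, anisotropy, magnetostatic and Zeeman contributions, \gamma the gyromagnetic ratio and \alpha the dimensionless Gilbert damping constant.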

  9. Analysis of chemical coal cleaning processes. Final report

    Energy Technology Data Exchange (ETDEWEB)

    1980-06-01

    Six chemical coal cleaning processes were examined. Conceptual designs and costs were prepared for these processes and coal preparation facilities, including physical cleaning and size reduction. Transportation of fine coal in agglomerated and unagglomerated forms was also discussed. The chemical cleaning processes were: Pittsburgh Energy Technology Center, Ledgemont, Ames Laboratory, Jet Propulsion Laboratory (two versions), and the Guth Process (KVB). Three of the chemical cleaning processes are similar in concept: PETC, Ledgemont, and Ames. Each of these is based on the reaction of sulfur with pressurized oxygen, with the controlling factor being the partial pressure of oxygen in the reactor. All of the processes appear technically feasible. Economic feasibility is less certain. The recovery of process chemicals is vital to the JPL and Guth processes. All of the processes consume significant amounts of energy in the form of electric power and coal. Energy recovery and increased efficiency are potential areas for study in future, more detailed designs. The Guth process (formerly designated KVB) appears to be the simplest of the systems evaluated. All of the processes require future engineering to better determine methods for scaling laboratory designs/results to commercial-scale operations. A major area for future engineering is to resolve problems related to handling, feeding, and flow control of the fine and often hot coal.

  10. Individual differences in emotion word processing: A diffusion model analysis.

    Science.gov (United States)

    Mueller, Christina J; Kuchinke, Lars

    2016-06-01

    The exploratory study investigated individual differences in implicit processing of emotional words in a lexical decision task. A processing advantage for positive words was observed, and differences between happy and fear-related words in response times were predicted by individual differences in specific variables of emotion processing: Whereas more pronounced goal-directed behavior was related to a specific slowdown in processing of fear-related words, the rate of spontaneous eye blinks (indexing brain dopamine levels) was associated with a processing advantage of happy words. Estimating diffusion model parameters revealed that the drift rate (rate of information accumulation) captures unique variance of processing differences between happy and fear-related words, with highest drift rates observed for happy words. Overall emotion recognition ability predicted individual differences in drift rates between happy and fear-related words. The findings emphasize that a significant amount of variance in emotion processing is explained by individual differences in behavioral data.
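
    The diffusion model referred to here can be illustrated with a short simulation. Below is a minimal sketch in Python with NumPy; the boundary, noise and drift values are hypothetical placeholders, not the study's fitted parameters. It shows how a higher drift rate (as reported for happy words) produces faster responses:

        import numpy as np

        rng = np.random.default_rng(1)

        def simulate_ddm(drift, boundary=1.0, start=0.5, noise=1.0,
                         dt=0.001, non_decision=0.3, n_trials=2000):
            """Simulate first-passage times of a Wiener diffusion process."""
            rts, hits = [], []
            for _ in range(n_trials):
                x, t = start, 0.0
                while 0.0 < x < boundary:        # accumulate evidence
                    x += drift * dt + noise * np.sqrt(dt) * rng.normal()
                    t += dt
                rts.append(t + non_decision)     # add non-decision time
                hits.append(x >= boundary)       # upper boundary = "word"
            return np.array(rts), np.array(hits)

        # Hypothetical drift rates, ordered as the study reports:
        # happy > fear-related.
        for label, v in [("happy", 2.2), ("fear-related", 1.6)]:
            rt, hit = simulate_ddm(v)
            print(f"{label:13s} mean RT {rt.mean():.3f} s, "
                  f"upper-boundary rate {hit.mean():.3f}")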

  11. Process synthesis, design and analysis using a process-group contribution method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario R.; Gani, Rafiqul

    2015-01-01

    This paper describes the development and application of a process-group contribution method to model, simulate and synthesize chemical processes. Process flowsheets are generated in the same way as atoms or groups of atoms are combined to form molecules in computer aided molecular design (CAMD) t...

  12. Analysis of production flow process with lean manufacturing approach

    Science.gov (United States)

    Siregar, Ikhsan; Arif Nasution, Abdillah; Prasetio, Aji; Fadillah, Kharis

    2017-09-01

    This research was conducted in a company engaged in the production of Fast Moving Consumer Goods (FMCG). The production process in the company still contains several activities that cause waste. Non-value-added activities (NVA) are still widely found in the implementation, so the cycle time needed to make the product is longer. One form of improvement on the production line is to apply the lean manufacturing method to identify waste along the value stream and find non-value-added activities. Non-value-added activities can be eliminated and reduced by utilizing value stream mapping and identifying them with process activity mapping. The results show that 26% of activities are value-added and 74% are non-value-added. The current state map of the production process gives a process lead time of 678.11 minutes and a processing time of 173.94 minutes. The improvement proposal raises the share of value-added time to 41% of production process activities, with non-value-added time at 59%. The future state map of the production process gives a process lead time of 426.69 minutes and a processing time of 173.89 minutes.
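
    The value-added percentages quoted above follow directly from the ratio of processing time to process lead time; a quick check in Python using the figures from the record:

        pairs = {
            "current state": (173.94, 678.11),
            "future state":  (173.89, 426.69),
        }
        for state, (processing, lead) in pairs.items():
            va = processing / lead
            print(f"{state}: value added {va:.0%}, non value added {1 - va:.0%}")

        # current state: value added 26%, non value added 74%
        # future state:  value added 41%, non value added 59%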

  13. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    for the main process metrics, providing feedback to the research and development team and setting goals for experimental efforts. The present study proposes a methodology for performing such a "retro" techno-economic analysis. It consists of choosing the most important variables of the process and finding...

  14. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way these principles have been implemented in the main features of the created methodology is described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems encountered during the project. Drawing on these problems together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  15. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support graphical representation of the process model. However, such models lack support for the graphical representation of resources, which processes use during simulation or execution of a process instance. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. The approach is based on BPMN, where business process instances use resources concurrently.

  16. Potential analysis of stable processes and its extensions

    CERN Document Server

    Stos, Andrzej

    2009-01-01

    Stable Lévy processes and related stochastic processes play an important role in stochastic modelling in applied sciences, in particular in financial mathematics. This book is about the potential theory of stable stochastic processes. It also deals with related topics, such as the subordinate Brownian motions (including the relativistic process) and Feynman–Kac semigroups generated by certain Schroedinger operators. The authors focus on classes of stable and related processes that contain the Brownian motion as a special case. This is the first book devoted to the probabilistic potential theory of stable stochastic processes, and, from the analytical point of view, of the fractional Laplacian. The introduction is accessible to non-specialists and provides a general presentation of the fundamental objects of the theory. Besides recent and deep scientific results the book also provides a didactic approach to its topic, as all chapters have been tested on a wide audience, including young mathematicians at a C...

  17. Resource Analysis of Cognitive Process Flow Used to Achieve Autonomy

    Science.gov (United States)

    2016-03-01

    after it has been created (i.e., post-fabrication programming). The von Neumann general purpose processing (GPP) architecture heavily shares (e.g. ...) ... In the second part, we introduce and analyze a canonical architecture called the context switching cognitive processing architecture that exploits heterogeneous

  18. Simulation analysis of resource flexibility on healthcare processes

    Directory of Open Access Journals (Sweden)

    Simwita YW

    2016-10-01

    Full Text Available Yusta W Simwita, Berit I Helgheim, Department of Logistics, Molde University College, Molde, Norway. Purpose: This paper uses discrete event simulation to explore the best resource flexibility scenario and examine the effect of implementing resource flexibility on different stages of the patient treatment process. Specifically we investigate the effect of resource flexibility on patient waiting time and throughput in an orthopedic care process. We further explore how implementation of resource flexibility in patient treatment processes affects patient access to healthcare services. We focus on two resources, namely the orthopedic surgeon and the operating room. Methods: An observational approach was used to collect process data. The developed model was validated by comparing the simulation output with actual patient data collected from the studied orthopedic care process. We developed different scenarios to identify the best resource flexibility scenario and explore the effect of resource flexibility on patient waiting time, throughput, and future changes in demand. The developed scenarios focused on creating flexibility in the service capacity of this care process by altering the amount of additional human resource capacity at different stages of the patient care process and extending the use of operating room capacity. Results: The study found that resource flexibility can improve responsiveness to patient demand in the treatment process. Testing different scenarios showed that the introduction of resource flexibility reduces patient waiting time and improves throughput. The simulation results show that patient access to health services can be improved by implementing resource flexibility at different stages of the patient treatment process. Conclusion: This study contributes to the current health care literature by explaining how implementing resource flexibility at different stages of patient care processes can improve ability to respond to increasing
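
    A discrete event simulation of this kind can be sketched with the SimPy library. The sketch below models patients passing through a surgeon consultation and an operating room, the two resources the study flexes; the arrival rate, service times and capacities are hypothetical placeholders, not the study's data:

        import simpy
        import random

        random.seed(7)
        WAITS = []

        def patient(env, surgeon, theatre):
            arrival = env.now
            with surgeon.request() as req:                      # consultation
                yield req
                yield env.timeout(random.expovariate(1 / 0.5))  # ~0.5 h
            with theatre.request() as req:                      # surgery
                yield req
                WAITS.append(env.now - arrival)                 # wait to surgery
                yield env.timeout(random.expovariate(1 / 2.0))  # ~2 h

        def generator(env, surgeon, theatre):
            while True:
                yield env.timeout(random.expovariate(1 / 1.0))  # ~1 patient/h
                env.process(patient(env, surgeon, theatre))

        env = simpy.Environment()
        surgeon = simpy.Resource(env, capacity=2)  # flexibility: add a surgeon
        theatre = simpy.Resource(env, capacity=1)
        env.process(generator(env, surgeon, theatre))
        env.run(until=500)
        print(f"patients: {len(WAITS)}, mean wait to surgery: "
              f"{sum(WAITS) / len(WAITS):.2f} h")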

  19. Contract Management Process Maturity: Analysis of Recent Organizational Assessments

    OpenAIRE

    Rendon, Rene G.

    2009-01-01

    Proceedings Paper (for Acquisition Research Program) Approved for public release; distribution unlimited. This research builds upon the emerging body of knowledge on organizational assessments of contract management processes. Since the development of the Contract Management Maturity Model in 2003, several DoD, Air Force, Navy, Army, and defense contractor organizations have undergone contract management process assessments as a part of their process-improvement effort. The assessm...

  20. A Lean Six Sigma Analysis of Student In-Processing

    Science.gov (United States)

    2012-12-01

    Rehabilitation Program; SIPOC: Suppliers, Inputs, Process, Outputs, Customers; TOC: Theory of Constraints; TQM: Total Quality Management; VA: Value... has its origins in total quality management (TQM) and statistical process control (Maleyeff, 2007, p. 9). Made famous by Motorola, the process is...

  1. Infrared Signature Analysis: Real Time Monitoring Of Manufacturing Processes

    Science.gov (United States)

    Bangs, Edmund R.

    1988-01-01

    The ability to monitor manufacturing processes in an adaptive control mode and perform inspection in real time is of interest to fabricators in the pressure vessel, aerospace, automotive, nuclear and shipbuilding industries. Results of a series of experiments using infrared thermography as the principal sensing mode are presented to show how the information contained in infrared isotherms captures critical process variables. Image-processing software development has demonstrated, in a spot welding application, how the process can be monitored and controlled in real time. The IR vision sensor program is now under way. Research thus far has focused on fusion welding, resistance spot welding and metal removal.

  2. Stage efficiency in the analysis of thermochemical water decomposition processes

    Science.gov (United States)

    Conger, W. L.; Funk, J. E.; Carty, R. H.; Soliman, M. A.; Cox, K. E.

    1976-01-01

    The procedure for analyzing thermochemical water-splitting processes using the figure of merit is expanded to include individual stage efficiencies and loss coefficients. The use of these quantities to establish the thermodynamic insufficiencies of each stage is shown. A number of processes are used to illustrate these concepts and procedures and to demonstrate the facility with which process steps contributing most to the cycle efficiency are found. The procedure allows attention to be directed to those steps of the process where the greatest increase in total cycle efficiency can be obtained.

  3. The Humanization Processes: A Social, Behavioral Analysis of Children's Problems.

    Science.gov (United States)

    Hamblin, Robert L.; And Others

    This book treats research and development work performed by the authors as employees of the Central Midwestern Regional Educational Laboratory, concerning the acculturation processes through which children develop essential human characteristics, in particular the humane processes of humanization. The 10 chapters of the book are: 1.…

  4. Analysis of profitability and poverty reduction of yoghurt processing ...

    African Journals Online (AJOL)

    The study assessed the profitability of yoghurt processing with a view of determining its potentials for reducing poverty in Maiduguri Metropolitan Area. Data were collected from a survey of 10 yoghurt processing firms in Maiduguri and analysed using profit model and descriptive statistics. Results revealed that yoghurt ...

  5. Profitability Analysis of Rice Processing and Marketing in Kano State ...

    African Journals Online (AJOL)

    ABSTRACT: The study determined the profitability of rice processing and marketing in Kano State. The objectives of the study were to assess the profitability levels of rice processing and marketing, evaluate the value added to the commodity at each stage in the study area and determine the most efficient services produced.

  6. Socio-economic analysis of processing Pachyrhizus erosus (L.) Urb ...

    African Journals Online (AJOL)

    protein, iron, zinc, etc.) that are used in various kinds of food processing. The objective of this study was to analyse producers' and processors' perception regarding processing P. erosus tubers into gari in on-farm conditions and its financial ...

  7. Simple process capability analysis and quality validation of ...

    African Journals Online (AJOL)

    Many approaches can be applied to improve a process, and one of them is choosing the correct Six Sigma design of experiments (DOE). In this study, Taguchi's experimental design was applied to achieve a high percentage of cell viability in the fermentation experiment. The process capability of this study was later analyzed by ...

  8. A Practical Decision-Analysis Process for Forest Ecosystem Management

    Science.gov (United States)

    H. Michael Rauscher; F. Thomas Lloyd; David L. Loftis; Mark J. Twery

    2000-01-01

    Many authors have pointed out the need to firm up the 'fuzzy' ecosystem management paradigm and develop operationally practical processes to allow forest managers to accommodate more effectively the continuing rapid change in societal perspectives and goals. There are three spatial scales where clear, precise, practical ecosystem management processes are...

  9. Guidelines and cost analysis for catalyst production in biocatalytic processes

    DEFF Research Database (Denmark)

    Tufvesson, Pär; Lima Ramos, Joana; Nordblad, Mathias

    2011-01-01

    Biocatalysis is an emerging area of technology, and to date few reports have documented the economics of such processes. As it is a relatively new technology, many processes do not immediately fulfill the economic requirements for commercial operation. Hence, early-stage economic assessment could...

  10. The analytical hierarchy process applied for design analysis

    NARCIS (Netherlands)

    Ciftcioglu, O.; Sariyildiz, I.S.

    2005-01-01

    Being an intelligent activity, design is a complex process to accomplish. The complexity stems from the elusive character of this activity, which cannot be explained in precise terms, in general. In a design process, the determined relationships among the design elements provide important
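
    The analytic hierarchy process itself is compact enough to sketch: priority weights are the principal eigenvector of a pairwise comparison matrix. A minimal example in Python with NumPy; the comparison values for three design criteria are hypothetical:

        import numpy as np

        # Hypothetical pairwise comparison matrix (Saaty's 1-9 scale;
        # A[i, j] = importance of criterion i relative to criterion j).
        A = np.array([
            [1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0],
        ])

        eigvals, eigvecs = np.linalg.eig(A)
        k = np.argmax(eigvals.real)        # principal eigenvalue
        w = np.abs(eigvecs[:, k].real)
        w /= w.sum()                       # priority weights

        # Saaty's consistency index: CI = (lambda_max - n) / (n - 1).
        n = A.shape[0]
        CI = (eigvals[k].real - n) / (n - 1)
        RI = 0.58                          # random index for n = 3
        print("weights:", np.round(w, 3), "consistency ratio:", round(CI / RI, 3))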

  11. Economic analysis of fish processing and marketing in Ogun ...

    African Journals Online (AJOL)

    Despite the high profitability of the business, fish processors identified lack of collateral security for bank loan (96.5%), erratic power supply (92.0%) and lack of modern fish processing facilities (43.4%) as their most prevailing problems. With this high level of profitability and viability in fish processing and marketing, it is ...

  12. Simulation and Flexibility Analysis of Milk Production Process

    DEFF Research Database (Denmark)

    Cheng, Hongyuan; Friis, Alan

    In this work, a process simulation method is used to simulate a pasteurised market milk production line. A commercial process simulation tool - Pro/II from Simulation Sciences Inc. - is used in the simulation work. In the simulation, a new model is used to calculate the thermal properties of milk. In this work, a simulator is obtained for the milk production line. Using the simulator, different milk processing situations can be quantitatively simulated and investigated, such as production of different products, capacity changes, fat content changes in raw milk, energy cost at different operating conditions, etc. As the pasteurised market milk production line involves typical milk processing steps, such as pasteurisation, centrifugal separation and standardisation, the simulator can be modified to simulate similar milk processing lines. In many cases, the rapidly changing market requires a flexible milk production line...

  13. Stability Analysis of Radial Turning Process for Superalloys

    Science.gov (United States)

    Jiménez, Alberto; Boto, Fernando; Irigoien, Itziar; Sierra, Basilio; Suarez, Alfredo

    2017-09-01

    Stability detection in machining processes is an essential component for the design of efficient machining processes. Automatic methods are able to determine when instability is happening and prevent possible machine failures. In this work a variety of methods are proposed for detecting stability anomalies based on the measured forces in the radial turning process of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspaloy, Haynes 282 and Inconel 718. Experimental data, in both Conventional and High Pressure Coolant (HPC) environments, are set in four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). Results reveal that the PCA method is useful for visualization of the process and detection of anomalies in online processes.
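
    A PCA-based anomaly detector of the kind the record describes can be sketched briefly: fit principal components on force-feature windows from stable cuts, then flag windows whose reconstruction error is unusually large. A minimal sketch in Python with NumPy; the synthetic data and the 99th-percentile threshold are hypothetical placeholders, not the paper's data:

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic stand-in for windowed force features (rows = windows).
        stable = rng.normal(0, 1, (200, 10))
        test = np.vstack([rng.normal(0, 1, (45, 10)),
                          rng.normal(4, 3, (5, 10))])  # last 5 = unstable

        # PCA by SVD on the centred stable data; keep k components.
        mean = stable.mean(axis=0)
        U, S, Vt = np.linalg.svd(stable - mean, full_matrices=False)
        P = Vt[:3]                                     # principal axes

        def residual(X):
            Xc = X - mean
            return np.linalg.norm(Xc - (Xc @ P.T) @ P, axis=1)

        threshold = np.percentile(residual(stable), 99)
        print("flagged windows:", np.where(residual(test) > threshold)[0])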

  14. STABILITY ANALYSIS OF RADIAL TURNING PROCESS FOR SUPERALLOYS

    Directory of Open Access Journals (Sweden)

    Alberto JIMÉNEZ

    2017-07-01

    Full Text Available Stability detection in machining processes is an essential component for the design of efficient machining processes. Automatic methods are able to determine when instability is happening and prevent possible machine failures. In this work a variety of methods are proposed for detecting stability anomalies based on the measured forces in the radial turning process of superalloys. Two different methods are proposed to determine instabilities. Each one is tested on real data obtained in the machining of Waspaloy, Haynes 282 and Inconel 718. Experimental data, in both Conventional and High Pressure Coolant (HPC) environments, are set in four different states depending on material grain size and hardness (LGA, LGS, SGA and SGS). Results reveal that the PCA method is useful for visualization of the process and detection of anomalies in online processes.

  15. Geometric anisotropic spatial point pattern analysis and Cox processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Toftaker, Håkon

    We consider spatial point processes with a pair correlation function g(u) which depends only on the lag vector u between a pair of points. Our interest is in statistical models with a special kind of ‘structured’ anisotropy: g is geometrically anisotropic if it is elliptical but not spherical. In particular we study Cox process models with an elliptical pair correlation function, including shot noise Cox processes and log Gaussian Cox processes, and we develop estimation procedures using summary statistics and Bayesian methods. Our methodology is illustrated on real and synthetic datasets of spatial...
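
    In the notation of this record, a standard way to write geometric anisotropy (a generic formulation, not copied from the paper) is

        g(u) = g_0\!\left( \sqrt{u^{\top} \Sigma^{-1} u} \right),

    where g_0 is an isotropic pair correlation function and \Sigma is a symmetric positive definite matrix; the level sets of g are then ellipses, and g is spherical (isotropic) exactly when \Sigma is proportional to the identity matrix.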

  16. Analysis of Alternatives (AoA) Process Improvement Study

    Science.gov (United States)

    2016-12-01

    CAA) • Assistant Secretary of the Army (Acquisition, Logistics and Technology) ASA(ALT) • The Office of Cost Assessment and Program Evaluation (CAPE) ... Analysis Directorate (SLAD) ARL/SLAD • The Deputy Assistant Secretary of the Army for Cost and Economics (DASA-CE) • Army Capabilities Integration Center ... a broad set of solutions, key trades among cost, schedule, performance, affordability analysis, risk analysis, and planning for risk mitigation. The

  17. Analysis of Student Satisfaction in The Process of Teaching and Learning Using Importance Performance Analysis

    Science.gov (United States)

    Sembiring, P.; Sembiring, S.; Tarigan, G.; Sembiring, OD

    2017-12-01

    This study aims to determine the level of student satisfaction with the learning process at the University of Sumatera Utara, Indonesia. The study sample consisted of 1204 students. Students' responses were measured through questionnaires adapted to a 5-point Likert scale and through direct interviews with the respondents. The SERVQUAL method was used to measure the quality of service along five dimensions of service characteristics, namely physical evidence, reliability, responsiveness, assurance and concern. The results of the Importance Performance Analysis reveal that six service attributes must be corrected by the policy makers of the University of Sumatera Utara. The quality of service is still considered low by students.

  18. Process Synthesis, Design and Analysis using Process-Group Contribution Method

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Eden, Mario Richard; Gani, Rafiqul

    Process synthesis implies the investigation of chemical reactions needed to produce the desired product, selection of the separation techniques needed for downstream processing, as well as making decisions on sequencing the involved reaction and separation operations. This work highlights the de...

  19. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    Directory of Open Access Journals (Sweden)

    Chuzlov Vjacheslav

    2016-01-01

    Full Text Available An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. Kinetic and thermodynamic research on the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in an oil refinery flow diagram was estimated.

  20. Analysis of Optimal Process Flow Diagrams of Light Naphtha Isomerization Process by Mathematic Modelling Method

    OpenAIRE

    Chuzlov Vjacheslav; Molotov Konstantin

    2016-01-01

    An approach to the simulation of catalytic reactors for hydrocarbon refining processes is presented. Kinetic and thermodynamic research on the light naphtha isomerization process was conducted. The kinetic parameters of the chemical conversion of hydrocarbon feedstock on different types of platinum-containing catalysts were established. The efficiency of including different types of isomerization technologies in an oil refinery flow diagram was estimated.

  1. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    Full Text Available When developing an information system it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rule and business process modeling languages. According to selected modeling aspects, the article compares the different business process modeling languages and business rule representation languages. Finally, the best-fitting set of languages is selected for a three-layer framework for business rule based software modeling.

  2. Formal concept analysis in knowledge processing: a survey on applications

    NARCIS (Netherlands)

    Poelmans, J.; Ignatov, D.I.; Kuznetsov, S.O.; Dedene, G.

    2013-01-01

    This is the second part of a large survey paper in which we analyze recent literature on Formal Concept Analysis (FCA) and some closely related disciplines using FCA. We collected 1072 papers published between 2003 and 2011 mentioning terms related to Formal Concept Analysis in the title, abstract

  3. Formative Research on the Heuristic Task Analysis Process.

    Science.gov (United States)

    Reigeluth, Charles M.; Lee, Ji-Yeon; Peterson, Bruce; Chavez, Michael

    Corporate and educational settings increasingly require decision making, problem solving and other complex cognitive skills to handle ill-structured, or heuristic, tasks, but the growing need for heuristic task expertise has outpaced the refinement of task analysis methods for heuristic expertise. The Heuristic Task Analysis (HTA) Method was…

  4. Reachability for Finite-State Process Algebras Using Static Analysis

    DEFF Research Database (Denmark)

    Skrypnyuk, Nataliya; Nielson, Flemming

    2011-01-01

    of the Data Flow Analysis are used in order to “cut off” some of the branches in the reachability analysis that are not important for determining, whether or not a state is reachable. In this way, it is possible for our reachability algorithm to avoid building large parts of the system altogether and still...

  5. Thermodynamic analysis of lignocellulosic biofuel production via a biochemical process: guiding technology selection and research focus.

    Science.gov (United States)

    Sohel, M Imroz; Jack, Michael W

    2011-02-01

    The aim of this paper is to present an exergy analysis of a bioethanol production process from lignocellulosic feedstock via a biochemical route, to assess the overall thermodynamic efficiency and identify the main loss processes. The thermodynamic efficiency of the biochemical process was found to be 35%, and the major inefficiencies of this process were identified as the combustion of lignin for process heat and power production and the simultaneous saccharification and co-fermentation process, accounting for 67% and 27% of the lost exergy, respectively. These results were also compared with a previous analysis of a thermochemical process for producing biofuel. Despite fundamental differences, the biochemical and thermochemical processes considered here had similar levels of thermodynamic efficiency. Process heat and power production was the major contributor to exergy loss in both processes. Unlike the thermochemical process, the overall efficiency of the biochemical process largely depends on how the lignin is utilized. Copyright © 2010 Elsevier Ltd. All rights reserved.
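
    For reference, the overall exergetic efficiency used in analyses of this kind is conventionally defined as (a standard definition, not reproduced from the paper)

        \eta_{\mathrm{ex}} = \frac{Ex_{\mathrm{products}}}{Ex_{\mathrm{inputs}}},
        \qquad
        \text{loss share}_i = \frac{Ex_{\mathrm{lost},i}}{\sum_j Ex_{\mathrm{lost},j}},

    so the 35% figure is the ratio of the exergy leaving in useful products to the exergy entering with the feedstock and utilities, while the 67% and 27% figures are shares of the total lost exergy attributed to the individual process steps.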

  6. Reliability Analysis and Standardization of Spacecraft Command Generation Processes

    Science.gov (United States)

    Meshkat, Leila; Grenander, Sven; Evensen, Ken

    2011-01-01

    • In order to reduce commanding errors that are caused by humans, we create an approach and corresponding artifacts for standardizing the command generation process and conducting risk management during the design and assurance of such processes. • The literature review conducted during the standardization process revealed that very few atomic-level human activities are associated with even a broad set of missions. • Applicable human reliability metrics for performing these atomic-level tasks are available. • The process for building a "Periodic Table" of Command and Control Functions as well as Probabilistic Risk Assessment (PRA) models is demonstrated. • The PRA models are executed using data from human reliability data banks. • The Periodic Table is related to the PRA models via Fault Links.

  7. Socially Grounded Analysis of Knowledge Management Systems and Processes

    NARCIS (Netherlands)

    Guizzardi, R.S.S.; Perini, A.; Dignum, V.

    2008-01-01

    In the struggle to survive and compete in the face of constant technological changes and unstable business environments, organizations recognize knowledge as their most valuable asset. Consequently, these organizations often invest in Knowledge Management (KM), seeking to enhance their internal processes

  8. Analysis of paper machine process waters; Paperikoneen prosessivesianalytiikka - MPKT 09

    Energy Technology Data Exchange (ETDEWEB)

    Knuutinen, J.; Alen, R.; Harjula, P.; Kilpinen, J.; Pallonen, R.; Jurvela, V.

    1998-12-31

    The closure of paper machine circuits demands a better knowledge of the chemical structures and behaviour of organic compounds in pulp mill process waters. Nonionic or negatively charged detrimental substances (anionic trash), which will eventually cause runnability and paper quality problems, are of special interest. The main purpose of the project was to develop routine `fingerprint` analytical procedures to study various process waters. Our major interest was focused on low molecular weight carboxylic acids, carbohydrates and lignin-based material. The `fingerprints` (chromatograms and electropherograms) can be used to differentiate various process waters or to find out changes in the composition of organic compounds at various stages of the papermaking process. So far the most characteristic `fingerprints` were obtained by capillary zone electrophoresis (CZE) and by pyrolysis - gas chromatography - mass spectrometry (Py-GC/MS). Examples of using these techniques are briefly discussed. (orig.)

  9. THE MARKOV PROCESS ANALYSIS OF LUCHAK’S QUEUING MODEL.

    Science.gov (United States)

    negative exponentially distributed time. This system is analyzed by considering the bivariate Markov process formed by the queue-length and the number of residual service phases of the customer at the counter. (Author)
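
    The bivariate state (queue length, residual service phases) lends itself to a compact simulation. A minimal sketch in Python with NumPy of an M/E_k/1 queue, a common concrete case of the phase-type setup described above; the rates are hypothetical:

        import numpy as np

        rng = np.random.default_rng(5)

        # M/E_k/1: Poisson arrivals, Erlang-k service (k exponential phases).
        lam, mu, k = 0.8, 1.0, 3
        phase_rate = k * mu            # mean service time = 1/mu

        t, t_end = 0.0, 200_000.0
        queue, phases, area = 0, 0, 0.0

        while t < t_end:
            total = lam + (phase_rate if phases > 0 else 0.0)
            dt = rng.exponential(1 / total)
            area += queue * dt         # time-average of queue length
            t += dt
            if rng.random() < lam / total:     # arrival
                queue += 1
                if phases == 0:
                    phases = k                 # customer enters service
            else:                              # a service phase completes
                phases -= 1
                if phases == 0:
                    queue -= 1                 # departure
                    if queue > 0:
                        phases = k             # next customer starts

        print(f"simulated mean number in system: {area / t:.3f}")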

  10. Techno-economic analysis of decentralized biomass processing depots.

    Science.gov (United States)

    Lamers, Patrick; Roni, Mohammad S; Tumuluru, Jaya S; Jacobson, Jacob J; Cafferty, Kara G; Hansen, Jason K; Kenney, Kevin; Teymouri, Farzaneh; Bals, Bryan

    2015-10-01

    Decentralized biomass processing facilities, known as biomass depots, may be necessary to achieve feedstock cost, quantity, and quality required to grow the future U.S. bioeconomy. In this paper, we assess three distinct depot configurations for technical difference and economic performance. The depot designs were chosen to compare and contrast a suite of capabilities that a depot could perform ranging from conventional pelleting to sophisticated pretreatment technologies. Our economic analyses indicate that depot processing costs are likely to range from ∼US$30 to US$63 per dry metric tonne (Mg), depending upon the specific technology implemented and the energy consumption for processing equipment such as grinders and dryers. We conclude that the benefits of integrating depots into the overall biomass feedstock supply chain will outweigh depot processing costs and that incorporation of this technology should be aggressively pursued. Copyright © 2015. Published by Elsevier Ltd.

  11. Automatic physiological waveform processing for FMRI noise correction and analysis.

    Directory of Open Access Journals (Sweden)

    Daniel J Kelley

    2008-03-01

    Full Text Available Functional MRI resting state and connectivity studies of the brain focus on neural fluctuations at low frequencies which share power with physiological fluctuations originating from the lungs and heart. Due to the lack of automated software to process physiological signals collected at high magnetic fields, a gap exists in the processing pathway between the acquisition of physiological data and its use in fMRI software for both physiological noise correction and functional analyses of brain activation and connectivity. To fill this gap, we developed an open source physiological signal processing program, called PhysioNoise, in the Python language. We tested its automated processing algorithms and dynamic signal visualization on resting monkey cardiac and respiratory waveforms. PhysioNoise consistently identifies physiological fluctuations for fMRI noise correction and also generates covariates for subsequent analyses of brain activation and connectivity.

  12. Synthesis and analysis of processes with electrolyte mixtures

    DEFF Research Database (Denmark)

    Thomsen, Kaj; Gani, Rafiqul; Rasmussen, Peter

    1995-01-01

    A computer aided system for synthesis, design and simulation of crystallization and fractional crystallization processes with electrolyte mixtures is presented. The synthesis methodology is based on the use of computed solubility diagrams for the corresponding electrolyte systems....

  13. [Psychosocial analysis of the health-disease process].

    Science.gov (United States)

    Sawaia, B B

    1994-04-01

    This article is a reflection on the transdisciplinary paradigms of the health-illness process, noting the symbolic mediation between the reactions of the biological organism and socio-environmental factors, including the pathogenic ones. The symbolic-affective mediation is analyzed from the perspective of Social Representation theory, allowing one to comprehend the references of individual and collective actions in the health-illness process.

  14. Analysis of Uncertainty in Dynamic Processes Development of Banks Functioning

    Directory of Open Access Journals (Sweden)

    Aleksei V. Korovyakovskii

    2013-01-01

    Full Text Available The paper offers an approach to estimating the measure of uncertainty in the dynamic processes of bank functioning, using statistical data on the indicators of different banking operations. To calculate the measure of uncertainty in the dynamic processes of bank functioning, the phase images of the relevant sets of statistical data are considered. It is shown that the form of the phase image of the studied sets of statistical data can act as a basis for estimating the measure of uncertainty in the dynamic processes of bank functioning. A set of analytical characteristics is offered to formalize the definition of the form of the phase image of the studied sets of statistical data. It is shown that the offered analytical characteristics take into account the inequality of changes in the values of the studied sets of statistical data, which is one of the ways uncertainty manifests itself in the development of dynamic processes. Invariant estimates of the measure of uncertainty in the dynamic processes of bank functioning were obtained, taking into account significant changes in the absolute values of the same indicators for different banks. Examples of calculating the measure of uncertainty in the dynamic processes of the functioning of concrete banks are cited.

  15. Analysis of the ATR fuel element swaging process

    Energy Technology Data Exchange (ETDEWEB)

    Richins, W.D.; Miller, G.K.

    1995-12-01

    This report documents a detailed evaluation of the swaging process used to connect fuel plates to side plates in Advanced Test Reactor (ATR) fuel elements. The swaging is a mechanical process that begins with fitting a fuel plate into grooves in the side plates. Once a fuel plate is positioned, a lip on each of two side plate grooves is pressed into the fuel plate using swaging wheels to form the joints. Each connection must have a specified strength (measured in terms of a pullout force capacity) to assure that these joints do not fail during reactor operation. The purpose of this study is to analyze the swaging process and associated procedural controls, and to provide recommendations to assure that the manufacturing process produces swaged connections that meet the minimum strength requirement. The current fuel element manufacturer, Babcock and Wilcox (B&W) of Lynchburg, Virginia, follows established procedures that include quality inspections and process controls in swaging these connections. The procedures have been approved by Lockheed Martin Idaho Technologies and are designed to assure repeatability of the process and structural integrity of each joint. Prior to July 1994, ATR fuel elements were placed in the Hydraulic Test Facility (HTF) at the Idaho National Engineering Laboratory (INEL), Test Reactor Area (TRA), for application of Boehmite (an aluminum oxide) film and for checking structural integrity before placement of the elements into the ATR. The results presented in this report demonstrate that the pullout strength of the swaged connections is assured by the current manufacturing process (with several recommended enhancements) without the need for testing each element in the HTF.

  16. Analysis of the Transitional Process into Naval Electrical Equipment

    Directory of Open Access Journals (Sweden)

    Florenţiu Deliu

    2010-10-01

    Full Text Available The analysis is based on a naval power system with a synchronous generator and consumers of various powers. The paper presents a systemic approach to naval power systems based on mathematical models of specific generators and consumers.

  17. Bioactive phytochemicals in wheat: Extraction, analysis, processing, and functional properties

    Science.gov (United States)

    Whole wheat provides a rich source of bioactive phytochemicals namely, phenolic acids, carotenoids, tocopherols, alkylresorcinols, arabinoxylans, benzoxazinoids, phytosterols, and lignans. This review provides information on the distribution, extractability, analysis, and nutraceutical properties of...

  18. Histopathological Image Analysis Using Image Processing Techniques: An Overview

    OpenAIRE

    A. D. Belsare; M.M. Mushrif

    2012-01-01

    This paper reviews computer-assisted histopathology image analysis for cancer detection and classification. Histopathology refers to the examination of invasive or less invasive biopsy samples by a pathologist under a microscope for locating, analyzing and classifying most diseases, like cancer. The analysis of histopathological images is done manually by the pathologist to detect disease, which leads to subjective diagnosis of the sample and varies with the level of expertise of the examine...

  19. Domain Endurants: An Analysis and Description Process Model

    DEFF Research Database (Denmark)

    Bjørner, Dines

    2014-01-01

    We present a summary, Sect. 2, of a structure of domain analysis and description concepts: techniques and tools. And we link, in Sect. 3, these concepts, embodied in domain analysis prompts and domain description prompts, in a model of how a diligent domain analyser cum describer would use them. We claim that both sections, Sects. 2–3, contribute to a methodology of software engineering....

  20. Dynamic analysis of a guided projectile during engraving process

    Directory of Open Access Journals (Sweden)

    Tao Xue

    2014-06-01

    Full Text Available The reliability of the electronic components inside a guided projectile is highly affected by the launch dynamics of the guided projectile. The engraving process plays a crucial role in determining the ballistic performance and projectile stability. This paper analyzes the dynamic response of a guided projectile during the engraving process. By considering the movement of the projectile's center of gravity during the engraving process, a dynamics model is established, coupled with the interior ballistic equations. The results detail the stress situation of the guided projectile band during the engraving process. Meanwhile, the axial dynamic response of the projectile in the several milliseconds following the engraving process is also researched. To further explore how different performance of the engraving band can affect the dynamics of the guided projectile, this paper focuses on two aspects: (a) the effects caused by different band geometry; and (b) the effects caused by different band materials. The time domain and frequency domain responses show that the dynamics of the projectile are quite sensitive to the engraving band width. A material with a small modulus of elasticity is more stable than one with a high modulus of elasticity.

  1. A comparative analysis of neural taste processing in animals

    Science.gov (United States)

    de Brito Sanchez, Gabriela; Giurfa, Martin

    2011-01-01

    Understanding taste processing in the nervous system is a fundamental challenge of modern neuroscience. Recent research on the neural bases of taste coding in invertebrates and vertebrates allows discussion of whether labelled-line or across-fibre pattern encoding applies to taste perception. While the former posits that each gustatory receptor responds to one stimulus or a very limited range of stimuli and sends a direct ‘line’ to the central nervous system to communicate taste information, the latter postulates that each gustatory receptor responds to a wider range of stimuli so that the entire population of taste-responsive neurons participates in the taste code. Tastes are represented in the brain of the fruitfly and of the rat by spatial patterns of neural activity containing both distinct and overlapping regions, which are in accord with both labelled-line and across-fibre pattern processing of taste, respectively. In both animal models, taste representations seem to relate to the hedonic value of the tastant (e.g. palatable versus non-palatable). Thus, although the labelled-line hypothesis can account for peripheral taste processing, central processing remains either unknown or differs from a pure labelled-line coding. The essential task for a neuroscience of taste is, therefore, to determine the connectivity of taste-processing circuits in central nervous systems. Such connectivity may determine coding strategies that differ significantly from both the labelled-line and the across-fibre pattern models. PMID:21690133

  2. Analysis of the migration process of Colombian families in Spain

    Directory of Open Access Journals (Sweden)

    Adelina Gimeno Collado

    2014-04-01

    Full Text Available This study analyses migration as a process centred on the transnational family, as told by its main characters: migrants - parents and children - and their families in Colombia. The study is based on the systemic model and the methodology of the Grounded Theory approach. The migration process is triggered by a combination of push and pull factors. The pioneers, mainly women, have very diverse profiles. We highlight the difficulty of their first experiences, which they overcome via personal tenacity and external support. Despite the difficulties of the acculturation process, the overall outcome is positive, especially regarding their expectations for their children, who wish to stay in Spain having overcome the initial challenges of adaptation. Children experience their own acculturation process, but there is no conflict between children and parents despite their different acculturation levels. Despite hopes that their integration process in Spain would have been better, they are thankful for the support received. Decisions are made and adaptation occurs in the private domain, i.e., the family; however, there is a lack of group awareness or joint social action to improve conditions in the country of origin or to improve integration in the host country.

  3. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  4. A Selection Process for Genetic Algorithm Using Clustering Analysis

    Directory of Open Access Journals (Sweden)

    Adam Chehouri

    2017-11-01

    Full Text Available This article presents a newly proposed selection process for genetic algorithms on a class of unconstrained optimization problems. The k-means genetic algorithm selection process (KGA) is composed of four essential stages: clustering, membership phase, fitness scaling and selection. Inspired by the hypothesis that clustering the population helps to preserve a selection pressure throughout the evolution of the population, a membership probability index is assigned to each individual following the clustering phase. Fitness scaling converts the membership scores into a range suitable for the selection function, which selects the parents of the next generation. Two versions of the KGA process are presented: using a fixed number of clusters K (KGAf) and via an optimal partitioning Kopt (KGAo) determined by two different internal validity indices. The performance of each method is tested on seven benchmark problems.
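
    To make the four stages concrete, here is a minimal sketch, assuming maximization, a plain k-means step, and illustrative membership and scaling rules (the paper's exact formulas are not reproduced):

      import numpy as np

      def kmeans_labels(X, k, iters=50, seed=0):
          # Plain k-means: returns a cluster label for each individual.
          rng = np.random.default_rng(seed)
          centers = X[rng.choice(len(X), k, replace=False)].astype(float)
          for _ in range(iters):
              labels = ((X[:, None, :] - centers[None]) ** 2).sum(-1).argmin(1)
              for j in range(k):
                  if (labels == j).any():
                      centers[j] = X[labels == j].mean(0)
          return labels

      def kga_select(pop, fitness, k=3, n_parents=10, seed=0):
          # Clustering stage.
          labels = kmeans_labels(pop, k, seed=seed)
          # Membership stage: credit each individual with its cluster's mean
          # fitness (an illustrative stand-in for the membership index).
          cluster_mean = np.array([fitness[labels == j].mean()
                                   if (labels == j).any() else 0.0
                                   for j in range(k)])
          score = fitness + cluster_mean[labels]
          # Fitness scaling: shift scores into a positive range for sampling.
          p = score - score.min() + 1e-9
          p = p / p.sum()
          # Selection: sample parents proportionally to the scaled score.
          rng = np.random.default_rng(seed)
          return pop[rng.choice(len(pop), size=n_parents, p=p)]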

  5. Finite Element Approach to Analysis of Axisymmetric Reverse Drawing Process

    Directory of Open Access Journals (Sweden)

    Keran, Z.

    2006-01-01

    Full Text Available The intention of this research is to analyse the deep drawing process of Cr-Ni stainless steel. The research concerns the forces that arise in the machine tool during the process, as well as the stress in the material and its behaviour. The results are taken from two sources and compared: the first source is experiments performed on a hydraulic press, and the second is a finite element model (FEM) of the process simulated in the MSC Marc Mentat program package. The measurements were made for different reduction coefficients and different tool materials. The comparison covers punch and pressure-plate forces, and the state of material stress is also observed for each reduction coefficient. Datasheets and force diagrams present the results, and the material stress can be seen in figures produced by the simulation.

  6. Analysis and optimisation of a mixed fluid cascade (MFC) process

    Science.gov (United States)

    Ding, He; Sun, Heng; Sun, Shoujun; Chen, Cheng

    2017-04-01

    A mixed fluid cascade (MFC) process that comprises three refrigeration cycles has great capacity for large-scale LNG production, which consumes a great amount of energy. Therefore, any performance enhancement of the liquefaction process will significantly reduce the energy consumption. The MFC process is simulated and analysed by use of proprietary software, Aspen HYSYS. The effects of feed gas pressure, LNG storage pressure, water-cooler outlet temperature, different pre-cooling regimes, and liquefaction and sub-cooling refrigerant composition on MFC performance are investigated and presented. The excellent numerical calculation ability and user-friendly interface of MATLAB™ are combined with the powerful thermo-physical property package of Aspen HYSYS. A genetic algorithm is then invoked to optimise the MFC process globally. After optimisation, the unit power consumption can be reduced to 4.655 kW h/kmol, or 4.366 kW h/kmol, on condition that the compressor adiabatic efficiency is 80% or 85%, respectively. Additionally, to improve the process further with regard to its thermodynamic efficiency, configuration optimisation is conducted for the MFC process and several configurations are established. By analysing heat transfer and thermodynamic performance, the configuration entailing a pre-cooling cycle with three pressure levels and liquefaction and sub-cooling cycles with one pressure level is identified as the most efficient and thus optimal: its unit power consumption is 4.205 kW h/kmol. Additionally, the mechanism responsible for the weak performance of the suggested liquefaction cycle configuration lies in the unbalanced distribution of cold energy in the liquefaction temperature range.
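
    As a rough, runnable illustration of this optimisation loop (not the authors' code), the sketch below wraps a dummy stand-in for the HYSYS flowsheet evaluation in a simple genetic algorithm; in the study the decision vector would carry refrigerant compositions and pressure levels, and unit_power would come from the simulator:

      import numpy as np

      def unit_power(x):
          # Placeholder for the Aspen HYSYS evaluation: in the paper the
          # candidate refrigerant compositions/pressures are simulated and
          # the compressor work per kmol LNG is returned. A dummy quadratic
          # stands in here so the script runs on its own.
          return 4.2 + ((x - 0.3) ** 2).sum()

      def genetic_minimise(f, n_var, bounds=(0.0, 1.0), pop=40, gens=100, seed=1):
          rng = np.random.default_rng(seed)
          lo, hi = bounds
          X = rng.uniform(lo, hi, (pop, n_var))
          for _ in range(gens):
              cost = np.apply_along_axis(f, 1, X)
              X = X[np.argsort(cost)]
              elite = X[: pop // 2]
              # Crossover: blend random elite pairs; mutation: Gaussian noise.
              pa = elite[rng.integers(0, len(elite), pop)]
              pb = elite[rng.integers(0, len(elite), pop)]
              X = np.clip((pa + pb) / 2 + rng.normal(0, 0.05, (pop, n_var)), lo, hi)
              X[0] = elite[0]          # elitism: keep the best individual
          return X[0], f(X[0])

      best_x, best_power = genetic_minimise(unit_power, n_var=6)
      print(best_x, best_power)        # approaches the dummy optimum of 4.2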

  7. Analysis of green liquor influence on coal steam gasification process

    Directory of Open Access Journals (Sweden)

    Karczewski Mateusz

    2017-01-01

    Full Text Available Gasification is a clean and efficient technology with a long history dating back to the 19th century. The possible applications of this process range from gas production and chemical synthesis to the energy sector, and the technology therefore holds noticeable potential for future applications. In order to advance it, new efficient approaches to this complex process are necessary. Among the possible methods, process-enhancing additives such as alkali and alkaline earth metals seem a promising way of achieving this goal, but in practice they may prove wasteful in terms of metal consumption, especially in large-scale production. This paper presents an alkali-abundant waste material, green liquor dregs, as a viable substitute. Green liquor dregs is a waste material known for its low potential as a fuel when used separately, owing to its low organic content, but its high ash content, abundant in alkali and alkaline earth elements, makes it a suitable candidate for application in coal gasification processes. The aim of this work is to evaluate the suitability of green liquor dregs as a potential process-enhancing additive for coal steam gasification. In the experiments, three blends of hard coal and green liquor dregs were selected, with consideration for low corrosive potential and possibly high catalytic activity. The mixtures were gasified in steam at four different temperatures. Their syngas yields, coal conversion degrees and activation energies were calculated with use of the Random Pore Model (RPM) and the Grain Model (GM), which allowed for their comparison.
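
    For reference, both kinetic models named above have simple rate forms; a small sketch under the usual formulations, with made-up rate constants (the paper's fitted values are not reproduced), shows how the activation energy follows from an Arrhenius fit:

      import numpy as np

      R = 8.314  # gas constant, J/(mol K)

      def rpm_rate(X, k, psi):
          # Random Pore Model: dX/dt = k (1 - X) sqrt(1 - psi ln(1 - X))
          return k * (1.0 - X) * np.sqrt(1.0 - psi * np.log(1.0 - X))

      def grain_rate(X, k):
          # Grain Model (shrinking spherical grains): dX/dt = k (1 - X)^(2/3)
          return k * (1.0 - X) ** (2.0 / 3.0)

      def activation_energy(temps_K, rate_constants):
          # Arrhenius fit: ln k = ln A - Ea/(R T), so the slope of
          # ln k versus 1/T gives -Ea/R.
          slope, _ = np.polyfit(1.0 / np.asarray(temps_K),
                                np.log(rate_constants), 1)
          return -slope * R  # J/mol

      # Illustrative rate constants at four gasification temperatures:
      print(activation_energy([1073, 1123, 1173, 1223],
                              [0.011, 0.019, 0.032, 0.051]))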

  8. Students’ views on the block evaluation process: A descriptive analysis

    Directory of Open Access Journals (Sweden)

    Ntefeleng E. Pakkies

    2016-02-01

    Full Text Available Background: Higher education institutions have executed policies and practices intended to determine and promote good teaching. Students' evaluation of the teaching and learning process is seen as one measure of evaluating the quality and effectiveness of instruction and courses. Policies and procedures guiding this process are discernible in universities, but this is often not the case for nursing colleges. Objective: To analyse and describe the views of nursing students on block evaluation, and how feedback obtained from this process was managed. Method: A quantitative descriptive study was conducted amongst nursing students (n = 177) in their second to fourth year of training from one nursing college in KwaZulu-Natal. A questionnaire was administered by the researcher and data were analysed using the Statistical Package for the Social Sciences Version 19.0. Results: The response rate was 145 (81.9%). The participants perceived the aim of block evaluation as improving the quality of teaching and enhancing their experiences as students. They questioned the significance of their input as stakeholders, given that they had never been consulted about the development or review of the evaluation tool or the administration process, and they often did not receive feedback from the evaluations they participated in. Conclusion: The college management should develop a clear organisational structure with supporting policies and operational guidelines for administering the evaluation process. The administration, implementation procedures, reporting of results and follow-up mechanisms should be made transparent and communicated to all concerned. Reports and actions related to these evaluations should provide feedback into relevant courses or programmes. Keywords: student evaluation of teaching; perceptions; undergraduate nursing students; evaluation process

  9. The Analysis of a Real Life Declarative Process

    DEFF Research Database (Denmark)

    Debois, Søren; Slaats, Tijs

    2015-01-01

    This paper reports on a qualitative study of the use of declarative process notations used in a commercial setting. Specifically, we investigate the actual use of a system implemented in terms of DCR graphs for the Danish "Dreyer Foundation" by our industry partner Exformatics A/S. The study...... by the declarative model, and (2) use process discovery techniques to examine if a perfect-fitness flow-based model representing the main business constraints is in fact easy to come by. For (1), we find evidence in various forms, most notably an apparent change in best practices by end-users allowed by the model...

  10. Thermomechanical Stresses Analysis of a Single Event Burnout Process

    Science.gov (United States)

    Tais, Carlos E.; Romero, Eduardo; Demarco, Gustavo L.

    2009-06-01

    This work analyzes the thermal and mechanical effects arising in a power Diffusion Metal Oxide Semiconductor (DMOS) transistor during a Single Event Burnout (SEB) process. For studying these effects we propose a more detailed simulation structure than that previously used by other authors, solving the mathematical models by means of the Finite Element Method. We use a cylindrical heat generation region, with 5 W, 10 W, 50 W and 100 W, to emulate the thermal phenomena occurring during SEB processes, avoiding the complexity of the mathematical treatment of the ion-semiconductor interaction.

  11. Multi-fluid CFD analysis in Process Engineering

    Science.gov (United States)

    Hjertager, B. H.

    2017-12-01

    An overview of the modelling and simulation of flow processes in gas/particle and gas/liquid systems is presented. Particular emphasis is given to computational fluid dynamics (CFD) models that use multi-dimensional multi-fluid techniques. Turbulence modelling strategies for gas/particle flows based on the kinetic theory of granular flows are given. Sub-models for the interfacial transfer processes and chemical kinetics modelling are presented. Examples are shown for some gas/particle systems, including flow and chemical reaction in risers, as well as gas/liquid systems, including bubble columns and stirred tanks.

  12. Reflector antenna analysis using physical optics on Graphics Processing Units

    DEFF Research Database (Denmark)

    Borries, Oscar Peter; Sørensen, Hans Henrik Brandenborg; Dammann, Bernd

    2014-01-01

    The Physical Optics approximation is a widely used asymptotic method for calculating the scattering from electrically large bodies. It requires significant computational work and little memory, and is thus well suited for application on a Graphics Processing Unit. Here, we investigate the performance...

  13. Risk analysis of the thermal sterilization process. Analysis of factors affecting the thermal resistance of microorganisms.

    Science.gov (United States)

    Akterian, S G; Fernandez, P S; Hendrickx, M E; Tobback, P P; Periago, P M; Martinez, A

    1999-03-01

    A risk analysis was applied to experimental heat resistance data. This analysis is an approach for processing experimental thermobacteriological data in order to study the variability of the D and z values of target microorganisms depending on the range of deviations of environmental factors, to determine the critical factors and to specify their critical tolerances. The analysis is based on sets of sensitivity functions applied to a specific case of experimental data related to the thermoresistance of Clostridium sporogenes and Bacillus stearothermophilus spores. The effects of the following factors were analyzed: the type of target microorganism, the nature of the heating substrate, pH, temperature, the type of acid employed and the NaCl concentration. The type of target microorganism to be inactivated, the nature of the substrate (reference or real food) and the heating temperature were identified as critical factors, determining about 90% of the alteration of the microbiological risk. The effects of the type of acid used for the acidification of products and of the NaCl concentration can be assumed to be negligible for the purposes of engineering calculations. The critical non-uniformity in temperature during thermobacteriological studies was set at 0.5%, and the critical tolerances of the pH value and NaCl concentration were 5%. These results relate to a specific case study; for that reason their direct generalization is not correct.
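
    The D and z values at the centre of this analysis follow from standard log-linear fits to survivor curves; a minimal sketch with illustrative data (not the paper's measurements):

      import numpy as np

      def d_value(times, counts):
          # D = heating time for a tenfold drop in survivors at constant T:
          # fit log10(N) against t; D = -1/slope.
          slope, _ = np.polyfit(times, np.log10(counts), 1)
          return -1.0 / slope

      def z_value(temps, d_values):
          # z = temperature rise for a tenfold drop in D:
          # fit log10(D) against T; z = -1/slope.
          slope, _ = np.polyfit(temps, np.log10(d_values), 1)
          return -1.0 / slope

      # Illustrative survivor counts (CFU/mL) at three temperatures (deg C):
      D110 = d_value([0, 5, 10, 15], [1e6, 3.2e5, 1.0e5, 3.2e4])  # ~10 min
      D115 = d_value([0, 2, 4, 6], [1e6, 2.1e5, 4.5e4, 1.0e4])    # ~3 min
      D121 = d_value([0, 1, 2, 3], [1e6, 1.0e5, 1.0e4, 1.0e3])    # ~1 min
      print(z_value([110, 115, 121], [D110, D115, D121]))         # ~11 deg C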

  14. Process analysis and optimization mapping through design of experiments and its application to a polymerization process

    Directory of Open Access Journals (Sweden)

    K. V. Pontes

    2011-03-01

    Full Text Available The technique of experimental design is used on an ethylene polymerization process model in order to map the feasible optimal region as preliminary information for process optimization. Through the use of this statistical tool, together with a detailed deterministic model validated with industrial data, it is possible to identify the most relevant variables to be considered as degrees of freedom for the optimization and also to acquire significant process knowledge, which is valuable not only for future explicit optimization but also for current operational practice. The responses evaluated by the experimental design approach include the objective function and the constraints of the optimization, which also consider the polymer properties. A Plackett-Burman design with 16 trials is first carried out in order to identify the most important inlet variables. This reduces the number of decision variables, hence the complexity of the optimization model. In order to carry out a deeper investigation of the process, complete factorial designs are further implemented. They provide valuable process knowledge because interaction effects, including highly non-linear interactions between the variables, are treated methodically and are easily observed.
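
    For illustration, a Plackett-Burman screening design is generated from a single row by cyclic shifts, and the main effects used to rank factors are simple contrasts. The sketch below builds the common 12-run design (the study itself used a 16-trial design) with placeholder responses:

      import numpy as np

      def plackett_burman_12():
          # 12-run Plackett-Burman design for up to 11 two-level factors,
          # built from the standard generating row by cyclic shifts plus a
          # final all-minus run.
          gen = np.array([+1, +1, -1, +1, +1, +1, -1, -1, -1, +1, -1])
          rows = [np.roll(gen, i) for i in range(11)]
          rows.append(-np.ones(11, dtype=int))
          return np.array(rows, dtype=int)

      def main_effects(design, response):
          # Effect of each factor = mean(y at +1) - mean(y at -1).
          y = np.asarray(response, dtype=float)
          return np.array([y[col == 1].mean() - y[col == -1].mean()
                           for col in design.T])

      X = plackett_burman_12()
      y = np.random.default_rng(0).normal(size=12)  # stand-in for measured responses
      print(main_effects(X, y))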

  15. Energy and environmental analysis of a rapeseed biorefinery conversion process

    DEFF Research Database (Denmark)

    Boldrin, Alessio; Balzan, Alberto; Astrup, Thomas Fruergaard

    2013-01-01

    positive effects on the greenhouse gases (GHG) footprint of the biorefinery system, with improvements in the range of 9 % to 29 %, depending on the considered alternative. The mass and energy balances showed the potential for improvement of straw treatment processes (hydrothermal pre-treatment and dark...

  16. Cognitive Task Analysis of the Battalion Level Visualization Process

    Science.gov (United States)

    2007-10-01

    Lecture Notes in Artificial Intelligence 1889. Berlin: Springer. Nonaka, I. & Takeuchi, H. (1995). The Knowledge-Creating Company. New York: Oxford... organizational sense-making (cf. Weick, 1995; Weick & Sutcliffe, 2001; Choo, 1998; Klein, Phillips, Rall, & Peluso, in preparation; and Nonaka & Takeuchi, 1995).

  17. The Entrepreneurial Process: An International Analysis of Entry and Exit

    NARCIS (Netherlands)

    P.W. van der Zwan (Peter)

    2011-01-01

    textabstractThis thesis deals with the entrepreneurial process from an international perspective. The first part explores which people decide to enter entrepreneurship. A distinction is made between two modes of entrepreneurial entry: taking over an existing firm and starting a new firm. The second

  18. Finite element analysis for three dimensional welding processes

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Ju Wan; Cho, Young Sam; Kim, Hyun Gyu; Choi, Kang Hyouk; Im, Se Young [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    2001-07-01

    We propose an implicit numerical implementation of Leblond's transformation plasticity constitutive equations, which are widely used for welded steel structures. We apply the generalized trapezoidal rule to integrate the equations and determine the consistent tangent moduli. The implementation may be used with an updated Lagrangian formulation. We test a simple butt-welding process to compare with SYSWELD and discuss the accuracy.
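
    The generalized trapezoidal (theta) rule referred to above interpolates between explicit and implicit Euler; a generic sketch for a scalar ODE, with the implicit update solved by fixed-point iteration (the paper applies the rule to the transformation plasticity equations, not to this toy problem):

      import numpy as np

      def theta_step(f, y, t, dt, theta=0.5, iters=20):
          # Generalized trapezoidal rule for y' = f(t, y):
          # theta = 0 is explicit Euler, 0.5 the trapezoidal rule,
          # 1 implicit Euler.
          y_new = y + dt * f(t, y)  # explicit predictor
          for _ in range(iters):    # fixed-point iteration on the implicit part
              y_new = y + dt * ((1 - theta) * f(t, y) + theta * f(t + dt, y_new))
          return y_new

      # Stiff test problem y' = -50 y with exact solution exp(-50 t):
      f = lambda t, y: -50.0 * y
      y, t, dt = 1.0, 0.0, 0.01
      for _ in range(100):
          y = theta_step(f, y, t, dt, theta=0.5)
          t += dt
      print(y, np.exp(-50 * t))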

  19. Experimental and Numerical Analysis of Fracture Processes in Concrete

    NARCIS (Netherlands)

    Schlangen, H.E.J.G.

    1993-01-01

    A combined experimental and numerical approach is adopted to investigate fracture processes in concrete. The experimental programme focuses on the failure of concrete subjected to mixed mode I and II loading. The influence of shear load on the nucleation and propagation of cracks in concrete is

  20. Analysis of an Intelligent Temperature Transmitter for Process Control

    African Journals Online (AJOL)

    It also identifies low power microprocessor and analog to digital converters working with the basic sensor circuit as the key propellants in the advancement of transmitter technology. Despite several sensors available in the process control industry, the authors focus on temperature sensors and analyze a typical Rosemount ...

  1. Performance analysis of a dynamic query processing scheme

    NARCIS (Netherlands)

    M.L. Kersten (Martin); S. Shair-Ali; C.A. van den Berg

    1990-01-01

    textabstractTraditional query optimizers produce a fixed query evaluation plan based on assumptions about data distribution and processor workloads. However, these assumptions may not hold at query execution time. In this paper, we propose a dynamic query processing scheme and we present the

  2. Analysis of decision making process for a systematic engineering design

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Spitas, Christos; Roozenburg, Norbert; Chen, Lin-Lin; Stappers, Pieter Jan

    2011-01-01

    In this paper, we present a new methodology for a systematic design process that shows if the acquired knowledge, modeled in the K-space, is enough or further exploration of the C-space is required. We treat the uncertainty [4] and apply it to the Cold Facts project [5] as a distinguished project

  3. A Review of Literature on analysis of JIG Grinding Process

    DEFF Research Database (Denmark)

    Sudheesh, P. K.; Puthumana, Govindan

    2016-01-01

    Jig grinding is a process practically used by tool and die makers in the creation of jigs or mating holes and pegs on dies. The abrasives normally used in jig grinding are divided into natural abrasives and artificial abrasives. Artificial abrasives are preferred in the manufacturing of grinding wheels ...

  4. Yield analysis at a poultry processing plant in Harare, Zimbabwe ...

    African Journals Online (AJOL)

    This investigation was conducted to establish the yield of parts or organs of chickens brought for slaughter at a poultry processing plant in Harare. Results of the study will furnish management and other poultry farmers with information that will enable them to identify yield losses and sustainable ways of minimizing resultant ...

  5. Joint time frequency analysis in digital signal processing

    DEFF Research Database (Denmark)

    Pedersen, Flemming

    In order to obtain a simultaneous time and frequency energy distribution of a signal, Joint Time Frequency Analysis (JTFA) is often employed. The standard method is the Fourier Spectrogram, where the time and frequency resolution can be adjusted by changing a window function. A problem with this technique is that the resolution is limited because of distortion. To overcome the resolution limitations of the Fourier Spectrogram, many new distributions have been developed. In spite of this, the Fourier Spectrogram is by far the prime method for the analysis of signals whose spectral content is varying...
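
    A minimal spectrogram computation showing the window-length trade-off discussed above, using SciPy (signal and parameter values are arbitrary):

      import numpy as np
      from scipy.signal import spectrogram

      fs = 1000.0                                  # sampling rate, Hz
      t = np.arange(0, 2.0, 1.0 / fs)
      x = np.sin(2 * np.pi * (50 + 100 * t) * t)   # chirp: frequency rises with time

      # nperseg sets the window length: longer windows sharpen frequency
      # resolution but blur time resolution, and vice versa.
      f, tt, Sxx = spectrogram(x, fs=fs, nperseg=256, noverlap=192)
      print(Sxx.shape)                             # (frequency bins, time frames)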

  6. The Analysis of Net Factors Influence on Remote Process Monitoring

    Directory of Open Access Journals (Sweden)

    Baluch Dušan

    2001-09-01

    Full Text Available The contribution deals with process monitoring based on www technologies. It also investigates the influence of the number of clients and the network transfer rate on the quality of process monitoring. The monitoring is realized by a distributed application. The server part of the application (written in Delphi) acquires the currently measured data and sends them through a socket communication channel to the client. The client part is realized as an applet (written in Java), which receives the data from the server and processes them. Several factors influence the quality of the client-server communication on both sides, such as the number of running tasks, the use of system resources, the number of connected clients and the network rate. Their influences are presented in graphical form: the first course represents the monitored signal, the second the accuracy of the server's send times, the third the accuracy of the client's receive times, and the fourth the duration of packet transfer between client and server. The computer working as the server is denoted S, the client K. The study of client quality showed that the usable sample period is about 0.1 s; the study of the quality of server S1 gave an applicable sample period of 0.17 s. For shorter sample periods the client is not capable of processing the received data, so data buffering occurs, which causes a time shift relative to the signal on the server side; in this region monitoring is not advisable. The study of the number of clients showed that 19 to 33 clients can operate correctly at the same time at sample periods of 0.085 s to 0.33 s. The study of the transfer rate showed that monitoring is applicable both for local and for remote clients.
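
    The server-push scheme described here can be sketched in a few lines. The following is a hypothetical Python stand-in for the Delphi server (read_signal is a stub for the acquisition call); each record carries a server timestamp so a client can estimate the packet transfer duration:

      import math, socket, struct, time

      SAMPLE_PERIOD = 0.1   # s; roughly the usable limit found in the study

      def read_signal():
          # Stand-in for the real data-acquisition routine.
          return math.sin(time.time())

      def serve(host="0.0.0.0", port=5000):
          srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
          srv.bind((host, port))
          srv.listen(1)
          conn, _ = srv.accept()
          with conn:
              while True:
                  # 16-byte record: (server timestamp, sample), both doubles.
                  conn.sendall(struct.pack("!dd", time.time(), read_signal()))
                  time.sleep(SAMPLE_PERIOD)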

  7. Implementing SCRUM using Business Process Management and Pattern Analysis Methodologies

    Directory of Open Access Journals (Sweden)

    Ron S. Kenett

    2013-11-01

    Full Text Available The National Institute of Standards and Technology in the US has estimated that software defects and problems cost the U.S. economy $59.5 billion annually (http://www.abeacha.com/NIST_press_release_bugs_cost.htm). The study is only one of many that demonstrate the need for significant improvements in software development processes and practices. US Federal agencies, which depend on IT to support their missions and spent at least $76 billion on IT in fiscal year 2011, experienced numerous examples of lengthy IT projects that incurred cost overruns and schedule delays while contributing little to mission-related outcomes (www.gao.gov/products/GAO-12-681). To reduce the risk of such problems, the US Office of Management and Budget recommended deploying agile software delivery, which calls for producing software in small, short increments (GAO, 2012). Consistent with this recommendation, this paper is about the application of Business Process Management to the improvement of software and system development through SCRUM or agile techniques. It focuses on how organizational behavior and process management techniques can be integrated with knowledge management approaches to deploy agile development. The context of this work is a global company developing software solutions for service operators such as cellular phone operators. For a related paper with a comprehensive overview of agile methods in project management see Stare (2013). Through this comprehensive case study we demonstrate how such an integration can be achieved. SCRUM is a paradigm shift in many organizations in that it results in a new balance between focus on results and focus on processes. In order to describe this new paradigm of business processes this work refers to Enterprise Knowledge Development (EKD), a comprehensive approach to map and document organizational patterns. In that context, the paper emphasizes the concept of patterns, reviews the main elements of SCRUM and shows how

  8. Using the design of simulation experiments to failures interactive effects analysis in process: a hypothetical case

    OpenAIRE

    Fabiano Leal; Dagoberto Alves De Almeida; José Arnaldo Barra Montevechi; Fernando Augusto Silva Marins

    2007-01-01

    This work presents an analysis of the interactive effects of failures in a process, by means of simulation experiments. The chosen process is a hypothetical system, simulated using the software Promodel®. Two conceptual models are generated, representing the system (mapping process) and the failures considered in the system (Fault Tree Analysis). These conceptual models are translated into a computerized model, for the analysis of individual and combined effects on the variable "number of produced pieces". This exp...

  9. Rethinking the process of operational research & systems analysis

    CERN Document Server

    Tomlinson, R

    1984-01-01

    Invited contributions from distinguished practitioners and methodologists of operational research and applied systems analysis which represent a true state-of-the-art and which provide, perhaps for the first time, a coherent, interlocking, set of ideas which may be considered the foundations of the subject as a science in its own right.

  10. analysis of profitability and poverty reduction of yoghurt processing ...

    African Journals Online (AJOL)

    Admin

    mostly males (70%) who were in their active age group of 36-45 years. Profitability analysis revealed that yoghurt ... 2007). Poverty, food insecurity and malnutrition are prevalent throughout Nigeria (Innovative .... Storage problem caused by the epileptic power supply in Maiduguri and improper packaging of products.

  11. Unraveling cell processes: interference imaging interwoven with data analysis

    DEFF Research Database (Denmark)

    Brazhe, Nadezda; Brazhe, Alexey; Pavlov, A N

    2006-01-01

    study the modulation of the 1 Hz rhythm in neurons and reveal its changes under depolarization and hyperpolarization of the plasma membrane. We conclude that interference microscopy combined with wavelet analysis is a useful technique for non-invasive cell studies, cell visualization, and investigation...

  12. Sensitivity Analysis of Down Woody Material Data Processing Routines

    Science.gov (United States)

    Christopher W. Woodall; Duncan C. Lutes

    2005-01-01

    Weight per unit area (load) estimates of Down Woody Material (DWM) are the most common requests by users of the USDA Forest Service's Forest Inventory and Analysis (FIA) program's DWM inventory. Estimating DWM loads requires the uniform compilation of DWM transect data for the entire United States. DWM weights may vary by species, level of decay, woody...

  13. Carbon dioxide capture processes: Simulation, design and sensitivity analysis

    DEFF Research Database (Denmark)

    Zaman, Muhammad; Lee, Jay Hyung; Gani, Rafiqul

    2012-01-01

    Carbon dioxide is the main greenhouse gas and its major source is combustion of fossil fuels for power generation. The objective of this study is to carry out the steady-state sensitivity analysis for chemical absorption of carbon dioxide capture from flue gas using monoethanolamine solvent. First...

  14. Stochastic Process Analysis of Interactive Discourse in Early Counseling Interviews.

    Science.gov (United States)

    Friedlander, Myrna L.; Phillips, Susan D.

    1984-01-01

    Examined patterns of interactive discourse to suggest how client and counselor establish a working alliance in their early interviews. Based on classification of 312 conversational turns from 14 dyads, a stochastic analysis was conducted. Results showed the sequences of talk were highly stable and predictable. (JAC)

  15. Human Processes in Intelligence Analysis: Phase I Overview

    Science.gov (United States)

    1979-12-01

  16. Mesh Processing in Medical-Image Analysis-a Tutorial

    DEFF Research Database (Denmark)

    Levine, Joshua A.; Paulsen, Rasmus Reinhold; Zhang, Yongjie

    2012-01-01

    Medical-image analysis requires an understanding of sophisticated scanning modalities, constructing geometric models, building meshes to represent domains, and downstream biological applications. These four steps form an image-to-mesh pipeline. For research in this field to progress, the imaging...

  17. Efficient implicit finite element analysis of sheet forming processes

    NARCIS (Netherlands)

    van den Boogaard, Antonius H.; Meinders, Vincent T.; Huetink, Han

    2003-01-01

    The computation time for implicit finite element analyses tends to increase disproportionally with increasing problem size. This is due to the repeated solution of linear sets of equations, if direct solvers are used. By using iterative linear equation solvers the total analysis time can be reduced
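
    The kind of iterative solver meant here is, for example, the conjugate gradient method, which replaces the repeated factorizations of a direct solver with matrix-vector products; a minimal sketch on a random symmetric positive definite system:

      import numpy as np

      def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
          # Solves A x = b for symmetric positive definite A using only
          # matrix-vector products, never factorizing A.
          x = np.zeros_like(b)
          r = b - A @ x
          p = r.copy()
          rs = r @ r
          for _ in range(max_iter):
              Ap = A @ p
              alpha = rs / (p @ Ap)
              x += alpha * p
              r -= alpha * Ap
              rs_new = r @ r
              if np.sqrt(rs_new) < tol:
                  break
              p = r + (rs_new / rs) * p
              rs = rs_new
          return x

      rng = np.random.default_rng(0)
      M = rng.normal(size=(50, 50))
      A = M @ M.T + 50 * np.eye(50)        # well-conditioned SPD test matrix
      b = rng.normal(size=50)
      print(np.allclose(A @ conjugate_gradient(A, b), b, atol=1e-6))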

  18. The National Health Educator Job Analysis 2010: Process and Outcomes

    Science.gov (United States)

    Doyle, Eva I.; Caro, Carla M.; Lysoby, Linda; Auld, M. Elaine; Smith, Becky J.; Muenzen, Patricia M.

    2012-01-01

    The National Health Educator Job Analysis 2010 was conducted to update the competencies model for entry- and advanced-level health educators. Qualitative and quantitative methods were used. Structured interviews, focus groups, and a modified Delphi technique were implemented to engage 59 health educators from diverse work settings and experience…

  19. Analysis of some methods for reduced rank Gaussian process regression

    DEFF Research Database (Denmark)

    Quinonero-Candela, J.; Rasmussen, Carl Edward

    2005-01-01

    While there is strong motivation for using Gaussian Processes (GPs) due to their excellent performance in regression and classification problems, their computational complexity makes them impractical when the size of the training set exceeds a few thousand cases. This has motivated the recent proliferation of a number of cost-effective approximations to GPs, both for classification and for regression. In this paper we analyze one popular approximation to GPs for regression: the reduced rank approximation. While generally GPs are equivalent to infinite linear models, we show that Reduced Rank Gaussian Processes (RRGPs) are equivalent to finite sparse linear models. We also introduce the concept of degenerate GPs and show that they correspond to inappropriate priors. We show how to modify the RRGP to prevent it from being degenerate at test time. Training RRGPs consists both in learning...
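
    A compact sketch of the reduced-rank idea, treating the kernel evaluated at m support inputs as the basis functions of a finite linear model (a subset-of-regressors style approximation; hyperparameters are fixed here rather than learned, and the predictive variance is omitted):

      import numpy as np

      def rbf(A, B, ell=1.0):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-0.5 * d2 / ell ** 2)

      def rrgp_predict(X, y, Xtest, m=20, ell=1.0, noise=0.1, seed=0):
          # Pick m support inputs; the model is f(x) = sum_i w_i k(x, x_i),
          # a finite sparse linear model in kernel basis functions.
          rng = np.random.default_rng(seed)
          sup = X[rng.choice(len(X), m, replace=False)]
          Kmn = rbf(sup, X, ell)                    # m x n
          Kmm = rbf(sup, sup, ell)                  # m x m
          A = Kmn @ Kmn.T + noise ** 2 * Kmm        # posterior precision (scaled)
          w = np.linalg.solve(A + 1e-8 * np.eye(m), Kmn @ y)
          return rbf(Xtest, sup, ell) @ w           # predictive mean

      rng = np.random.default_rng(1)
      X = rng.uniform(-3, 3, (200, 1))
      y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
      Xtest = np.linspace(-3, 3, 5)[:, None]
      print(rrgp_predict(X, y, Xtest))              # roughly sin(Xtest)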

  20. Analysis of reaction and transport processes in zinc air batteries

    CERN Document Server

    Schröder, Daniel

    2016-01-01

    This book contains a novel combination of experimental and model-based investigations, elucidating the complex processes inside zinc air batteries. The work presented helps to answer which battery composition and which air composition should be adjusted to maintain stable and efficient charge/discharge cycling. In detail, electrochemical investigations and X-ray transmission tomography are applied to button cell zinc air batteries and in-house set-ups. Moreover, model-based investigations of the battery anode and of the impact of relative humidity, active operation, carbon dioxide and oxygen on zinc air battery operation are presented. The techniques used in this work complement each other well and yield an unprecedented understanding of zinc air batteries. The methods applied are adaptable and can potentially be applied to gain further understanding of other metal air batteries. Contents: Introduction on Zinc Air Batteries; Characterizing Reaction and Transport Processes; Identifying Factors for Long-Term Stable O...

  1. An electrical test system for conductor formation process analysis

    Energy Technology Data Exchange (ETDEWEB)

    Estes, T.A. [Sandia National Labs., Albuquerque, NM (United States); Rhodes, R.J. [AT and T Bell Labs., Whippany, NJ (United States)

    1994-03-01

    Sandia National Laboratories has designed and built an electrical test system which fulfills a requirement to quickly, accurately and precisely measure the resistance of conductors formed on Printed Wiring Board (PWB) substrates. This requirement stems from the need to measure small variations in conductors and thus to determine the source of the variations. With this test technology, experiments can be conducted with new materials, equipment, and processes in a timely and scientific manner. Conductor formation processes can be optimized for both conductor yield and uniformity, and process equipment can be fine-tuned prior to processing product to ensure that conductor attributes fulfill requirements. Significant resources have been spent by Sandia National Laboratories and Texas Instruments modifying commercially available two-probe testers. AT&T has built a two-probe tester and obtained a commercially available "bed-of-nails" test system. The two-probe systems have limitations in speed and precision; the "bed-of-nails" system has proved to be superior to the two-probe designs but is expensive, and lacks test pattern flexibility and ease of use. Due to the need to establish a testing technology which meets the requirements of Sandia National Laboratories and the National Center for Manufacturing Sciences PWB Consortium Imaging Team (current Imaging Team members: AT&T, Texas Instruments, AlliedSignal, IBM, and Sandia National Laboratories), a prototype test system was designed and built by Sandia. This paper will discuss the design and performance of the test system and the results of a comparison to other test systems.

  2. Using Critical Path Analysis (CPA) in Place Marketing process

    OpenAIRE

    Metaxas, Theodore; Deffner, Alex

    2013-01-01

    The article presents the use of CPA as a methodological tool in Place Marketing implementation. Taking into account that Place Marketing is a strategic process built around the notion of a 'project', with particular actions in a specific time horizon, the article proposes that CPA has the capacity to satisfy this hypothesis. For this reason, the article develops a hypothetical scenario of CPA in four phases, planning, programming, implementation and feedback, taking as a case study the city of Rostock in Germa...

  3. An Analysis of the Department of Defense Strategic Management Process

    Science.gov (United States)

    1976-06-01

    work. Henri Fayol added the administrative views of a pyramidal form, unity of command, exception principle, authority delegation, and span of control... Decision-making is largely judgmental and based on a look at all relevant inputs to the problem-solving process. Henry Mintzberg authored an article in the... an attempt at overcoming these problems. Mr. McNamara, following WW II, had been hired by Henry Ford II, along with several other ex-air force...

  4. Writing In Efl: An Analysis Of Developing Cognitive Processes

    Directory of Open Access Journals (Sweden)

    Tans Feliks

    2016-02-01

    Full Text Available This study aims at finding out a writer's developing cognitive processes in EFL writing. The data are analyzed based on the classic theory of Odell (1977). It was found that the writer develops better in using: (1) grammatical subjects; (2) connectors and superlative forms; (3) lexicons showing similarity, resemblance, and class; (4) physical words; and (5) sequence. He is less developed, or stagnant, in using comparisons and negatives, syntax, and lexicons showing difference, change, paradox, contrast and examples.

  5. Performance Analysis of Alignment Process of MEMS IMU

    Directory of Open Access Journals (Sweden)

    Vadim Bistrov

    2012-01-01

    Full Text Available The procedure of determining the initial values of the attitude angles (pitch, roll, and heading) is known as the alignment. It is essential to align an inertial system before the start of navigation: unless the inertial system is aligned with the vehicle, the information provided by MEMS (microelectromechanical system) sensors is not useful for navigating the vehicle. At the moment MEMS gyroscopes have poor characteristics, and it is necessary to develop specific algorithms in order to obtain the attitude information of the object. Most of the standard algorithms for attitude estimation are not suitable when using MEMS inertial sensors. The wavelet technique, the Kalman filter, and the quaternion are not new in navigation data processing, but their joint use for MEMS sensor data processing can give some new results. In this paper the performance of a developed algorithm for attitude estimation using a MEMS IMU (inertial measurement unit) is tested. The obtained results are compared with the attitude output of another commercial GPS/IMU device by Xsens. The impact of MEMS sensor measurement noises on the alignment process is analysed. Some recommendations for tuning the Kalman filter algorithm to decrease the standard deviation of the attitude estimation are given.
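
    As a much-simplified illustration of the estimation problem (not the paper's algorithm, which also employs wavelet de-noising and quaternions), a one-state Kalman filter can fuse the pitch-rate gyro with the accelerometer-derived pitch during a static alignment; the sign conventions below are assumed:

      import numpy as np

      def kalman_pitch(gyro, acc_x, acc_z, dt=0.01, q=1e-4, r=0.05):
          # One-state Kalman filter: predict pitch with the gyro, correct it
          # with the gravity direction seen by the accelerometer.
          theta, P, out = 0.0, 1.0, []
          for w, ax, az in zip(gyro, acc_x, acc_z):
              theta += w * dt               # predict
              P += q
              z = np.arctan2(-ax, az)       # measured pitch (static assumption)
              K = P / (P + r)               # Kalman gain
              theta += K * (z - theta)      # correct
              P *= 1.0 - K
              out.append(theta)
          return np.array(out)

      # Static alignment at a true pitch of 5 degrees, noisy sensors:
      n, pitch = 1000, np.deg2rad(5.0)
      rng = np.random.default_rng(0)
      gyro = rng.normal(0.0, 0.02, n)                    # rad/s noise, no motion
      ax = rng.normal(-np.sin(pitch), 0.05, n)           # normalized specific force
      az = rng.normal(np.cos(pitch), 0.05, n)
      print(np.rad2deg(kalman_pitch(gyro, ax, az)[-1]))  # close to 5 degrees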

  6. Functional genomic analysis of human mitochondrial RNA processing.

    Science.gov (United States)

    Wolf, Ashley R; Mootha, Vamsi K

    2014-05-08

    Both strands of human mtDNA are transcribed in continuous, multigenic units that are cleaved into the mature rRNAs, tRNAs, and mRNAs required for respiratory chain biogenesis. We sought to systematically identify nuclear-encoded proteins that contribute to processing of mtRNAs within the organelle. First, we devised and validated a multiplex MitoString assay that quantitates 27 mature and precursor mtDNA transcripts. Second, we applied MitoString profiling to evaluate the impact of silencing each of 107 mitochondrial-localized, predicted RNA-binding proteins. With the resulting data set, we rediscovered the roles of recently identified RNA-processing enzymes, detected unanticipated roles of known disease genes in RNA processing, and identified new regulatory factors. We demonstrate that one such factor, FASTKD4, modulates the half-lives of a subset of mt-mRNAs and associates with mtRNAs in vivo. MitoString profiling may be useful for diagnosing and deciphering the pathogenesis of mtDNA disorders. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.

  7. Functional Genomic Analysis of Human Mitochondrial RNA Processing

    Directory of Open Access Journals (Sweden)

    Ashley R. Wolf

    2014-05-01

    Full Text Available Both strands of human mtDNA are transcribed in continuous, multigenic units that are cleaved into the mature rRNAs, tRNAs, and mRNAs required for respiratory chain biogenesis. We sought to systematically identify nuclear-encoded proteins that contribute to processing of mtRNAs within the organelle. First, we devised and validated a multiplex MitoString assay that quantitates 27 mature and precursor mtDNA transcripts. Second, we applied MitoString profiling to evaluate the impact of silencing each of 107 mitochondrial-localized, predicted RNA-binding proteins. With the resulting data set, we rediscovered the roles of recently identified RNA-processing enzymes, detected unanticipated roles of known disease genes in RNA processing, and identified new regulatory factors. We demonstrate that one such factor, FASTKD4, modulates the half-lives of a subset of mt-mRNAs and associates with mtRNAs in vivo. MitoString profiling may be useful for diagnosing and deciphering the pathogenesis of mtDNA disorders.

  8. Analysis of lagoon sludge characteristics for choice of treatment process

    Energy Technology Data Exchange (ETDEWEB)

    Park, J. H.; Hwang, D. S.; Choi, Y. D.; Lee, K. I.; Hwang, S. T.; Jung, K. J. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-04-01

    The Korea Atomic Energy Research Institute has launched a decommissioning program for its uranium conversion plant. One of the important tasks in the decommissioning program is the treatment of the sludge which was generated during operation and stored in the two ponds of the lagoon. The treatment requires the volume reduction of the lagoon sludges, for the low cost of the program, and the conversion of the chemical forms, including uranium, for acceptance at the final disposal site. The physical properties, such as densities, were measured, and the chemical compositions and radiological properties were analyzed. Denitration was a candidate process which would satisfy the requirements for sludge treatment, and the characteristics of thermal decomposition and dissolution with water were analyzed. The main compounds of the sludge were ammonium and sodium nitrate from the conversion plant, and calcium nitrate and calcium carbonate from Ca precipitation and impurities of the yellow cake. The content of uranium, thorium and Ra-226 was high in pond-1 and low in pond-2 because those were removed during Ca precipitation. On the basis of the characteristics of the sludge and the available technologies, reviewed in this study and being developed at the Korea Atomic Energy Research Institute, two processes were proposed and evaluated in terms of the expected technological difficulties, and the treatment costs were estimated for both processes. 79 refs., 44 figs., 37 tabs. (Author)

  9. SPORT EDUCATION INSTITUTIONS BOLOGNA PROCESS APPLICATION EXPERIENCES AND PROBLEMS ANALYSIS

    Directory of Open Access Journals (Sweden)

    Vladislav Ilić

    2008-08-01

    Full Text Available Current changes in education legislation, and efforts to align the domestic educational system with European Union legislation and the Bologna declaration, were broadly welcomed in scientific institutions as a positive and necessary step towards modernization of the educational system. However, together with the implementation of the new Higher Education law, the start of the accreditation process and the modification of the education system, several important problems came to attention. Although the time frame since the beginning of the changes is relatively short, certain conclusions and experiences regarding current problems can be presented. According to current experience, the new legislation was insufficiently precise in properly categorizing sport, considering its distinctiveness as a multidisciplinary and specific scientific area. It also failed to recognize the needs and differences of sport higher education institutions in connection with student and teaching staff profiles and quality. The above-mentioned factors caused problems in the process of accreditation, in knowledge transfer and in acquiring adequate teaching staff, with the danger of potentially lowering the number and quality of future graduates. As a conclusion, it can be said that prompt improvements and changes to the current legislation are needed in order to meet the true needs of sport and sport education.

  10. Analysis of a biorefinery integration in a bisulfite pulp process

    Energy Technology Data Exchange (ETDEWEB)

    Perin-Levasseur, Z. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre; Marechal, F. [Ecole Polytechnique Federale de Lausanne (Switzerland). Laboratoire d' Energetique Industrielle; Paris, J. [Ecole Polytechnique, Montreal, PQ (Canada). Dept. de Genie Chimique

    2010-05-15

    This study used process integration techniques to analyze a bisulfite mill that produced pulp along with bioethanol, lignosulfonate, and yeast. The study examined the integration of the chemical recycling loops which have had a significant impact on the mill's process energy balances. Scenarios were developed to determine the trade-off between the conversion of materials and the conversion of energy. A systematic definition of process heat transfer requirements was developed. Heat requirements for the by-products were computed as a function of by-product flow rates as well as by accounting for the maximum production capacity of each by-product. Operating costs were determined as a function of the mass flow rate, electrical power, the cost of imported steam, purchased fuel, electricity, and operating times. The study showed that the use of heat recovery equipment will lead to higher energy savings. The energy penalty is compensated by the increased income from selling value-added bioproducts. Efficient energy conversion is crucial for the economic implementation of biorefineries in chemical pulp mills. 2 refs., 3 tabs., 2 figs.

  11. Rheological analysis of hybrid hydrogels during polymerization processes

    Directory of Open Access Journals (Sweden)

    Illner Sabine

    2017-09-01

    Full Text Available Development of new implant coatings with temperature-controlled drug release to treat infections after device implantation can be triggered by highly elastic hydrogels with adequate stability and adhesive strength in the swollen state. By using an ionic liquid (IL), [ViPrIm]+[Br]−, as an additive to N-isopropylacrylamide (NIPAAm), unique effects on volumetric changes and mechanical properties, as well as on the thermoresponsive drug release of the obtained hybrid hydrogels, were observed. In this context, rheological measurements allow the monitoring of gelation processes as well as of chemical, mechanical, and thermal treatments and the effects of additives. Hybrid hydrogels of pNIPAAm and poly(ionic liquid) (PIL) were prepared by radical emulsion polymerization with N,N′-methylenebis(acrylamide) as a 3D crosslinking agent. By varying monomer, initiator and crosslinker amounts, the multi-compound system was monitored during polymerization by oscillatory time sweep experiments. The time dependence of the storage modulus (G′) and the loss modulus (G″) was measured, whereby the intersection of G′ and G″ indicates the sol-gel transition. The viscoelastic behavior and complex viscosity of crosslinked and non-crosslinked hydrogels were obtained. Within material characterization, rheology can be used to determine process capability and optimal working conditions. For biomedical applications, complete hydrogelation inter-connecting all compounds can be achieved, providing the possibility to process mechanically stable, swellable implant coatings or wound closures.

  12. From the analysis of verbal data to the analysis of organizations: organizing as a dialogical process.

    Science.gov (United States)

    Lorino, Philippe

    2014-12-01

    The analysis of conversational turn-taking and its implications on time (the speaker cannot completely anticipate the future effects of her/his speech) and sociality (the speech is co-produced by the various speakers rather than by the speaking individual) can provide a useful basis to analyze complex organizing processes and collective action: the actor cannot completely anticipate the future effects of her/his acts and the act is co-produced by multiple actors. This translation from verbal to broader classes of interaction stresses the performativity of speeches, the importance of the situation, the role of semiotic mediations to make temporally and spatially distant "ghosts" present in the dialog, and the dissymmetrical relationship between successive conversational turns, due to temporal irreversibility.

  13. An Analysis of Department of Energy Cost Proposal Process and Effectiveness

    Science.gov (United States)

    2011-10-11

    Linkov, I. (2005). Application of multicriteria decision analysis in environmental decision making. Integrated Environmental Assessment and Management... Multicriteria decision analysis: A comprehensive decision approach for management of contaminated sediments. Risk Analysis, 26(1), 61–78. Lock, D. (2007... An Analysis of Department of Energy Cost Proposal Process and Effectiveness, 11 October 2011, by Dr. Timothy Reed, Professor, Graduate

  14. Analysis of DIRAC's behavior using model checking with process algebra

    CERN Document Server

    Remenska, Daniela; Willemse, Tim; Bal, Henri; Verstoep, Kees; Fokkink, Wan; Charpentier, Philippe; Diaz, Ricardo Graciani; Lanciotti, Elisa; Roiser, Stefan; Ciba, Krzysztof

    2012-01-01

    DIRAC is the grid solution developed to support LHCb production activities as well as user data analysis. It consists of distributed services and agents delivering the workload to the grid resources. Services maintain database back-ends to store dynamic state information of entities such as jobs, queues, staging requests, etc. Agents use polling to check and possibly react to changes in the system state. Each agent's logic is relatively simple, the main complexity lies in their cooperation. Agents run concurrently, and collaborate using the databases as shared memory. The databases can be accessed directly by the agents if running locally or through a DIRAC service interface if necessary. This shared-memory model causes entities to occasionally get into inconsistent states. Tracing and fixing such problems becomes formidable due to the inherent parallelism present. We propose more rigorous methods to cope with this. Model checking is one such technique for analysis of an abstract model of a system. Unlike con...

  15. Design of Process Displays based on Risk Analysis Techniques

    DEFF Research Database (Denmark)

    Paulsen, Jette Lundtang

    ...of a plant, particularly the risks, which is necessary information for the display designer. A chapter presents an overview of the various types of operation references: constitutive equations, set points, design parameters, component characteristics etc., and their validity in different situations. On the basis of her experience with the design of display systems, with risk analysis methods, and from 8 years as an engineer-on-shift at a research reactor, the author developed a method to elicit the necessary information to the operator. The method, a combination of a Goal-Tree and a Fault-Tree, is described in some detail. Finally we address the problem of where to put the dot and the lines: when all information is 'on the table', how should it be presented most adequately. Included as an appendix is a paper concerning the analysis of maintenance reports and visualization of their information. The purpose...

  16. Nonlinear analysis and control of a continuous fermentation process

    DEFF Research Database (Denmark)

    Szederkényi, G.; Kristensen, Niels Rode; Hangos, K.M

    2002-01-01

    Different types of nonlinear controllers are designed and compared for a simple continuous bioreactor operating near optimal productivity. This operating point is located close to a fold bifurcation point. Nonlinear analysis of stability, controllability and zero dynamics is used to investigate o...... are recommended for the simple fermenter. Passivity based controllers have been found to be globally stable, not very sensitive to the uncertainties in the reaction rate and controller parameter but they require full nonlinear state feedback....

  17. Human Modeling for Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Stambolian, Damon B.; Lawrence, Brad A.; Stelges, Katrine S.; Steady, Marie-Jeanne O.; Ridgwell, Lora C.; Mills, Robert E.; Henderson, Gena; Tran, Donald; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for design of spacecraft. The key methods used for this are motion capture and computer generated human models. The focus of this paper is to explain the human modeling currently used at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs

  18. Human Modeling For Ground Processing Human Factors Engineering Analysis

    Science.gov (United States)

    Tran, Donald; Stambolian, Damon; Henderson, Gena; Barth, Tim

    2011-01-01

    There have been many advancements and accomplishments over the last few years using human modeling for human factors engineering analysis for the design of spacecraft and launch vehicles. The key methods used for this are motion capture and computer generated human models. The focus of this paper is to explain the different types of human modeling used currently and in the past at Kennedy Space Center (KSC), and to explain the future plans for human modeling for future spacecraft designs.

  19. Processed Apple Product Marketing Analysis: Hard Cider and Apple Wine

    OpenAIRE

    Rowles, Kristin

    2000-01-01

    Hard cider and apple wine offer new value-added marketing opportunities to the apple industry. Both products are situated in rapidly growing categories of the beverage industry. The development of effective marketing strategies for these products requires an understanding of the forces driving competition in these markets. This paper provides background information to support competitive analysis and strategy development. Development of these markets will be positive for the apple industry, b...

  20. Integrated System for Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Ishikawa, T.

    2000-01-01

    A thermodynamic insights based algorithm for integrated design and analysis of crystallization processes with electrolyte systems is presented. This algorithm consists of a thermodynamic calculation part, a process design/analysis part and a process simulation part, which are integrated through a calculation engine. The main feature of the algorithm is the use of thermodynamic insights, not only to identify and generate the feasible process alternatives, but also to obtain good initial estimates for the process simulation part, and for visualization of process synthesis/design. The main steps...

  1. Theoretical analysis of transcription process with polymerase stalling

    Science.gov (United States)

    Li, Jingwei; Zhang, Yunxin

    2015-05-01

    Experimental evidence shows that in gene transcription RNA polymerase has the possibility to be stalled at a certain position of the transcription template. This may be due to the template damage or protein barriers. Once stalled, polymerase may backtrack along the template to the previous nucleotide to wait for the repair of the damaged site, simply bypass the barrier or damaged site and consequently synthesize an incorrect messenger RNA, or degrade and detach from the template. Thus, the effective transcription rate (the rate to synthesize correct product mRNA) and the transcription effectiveness (the ratio of the effective transcription rate to the effective transcription initiation rate) are both influenced by polymerase stalling events. So far, no theoretical model has been given to discuss the gene transcription process including polymerase stalling. In this study, based on the totally asymmetric simple exclusion process, the transcription process including polymerase stalling is analyzed theoretically. The dependence of the effective transcription rate, effective transcription initiation rate, and transcription effectiveness on the transcription initiation rate, termination rate, as well as the backtracking rate, bypass rate, and detachment (degradation) rate when stalling, are discussed in detail. The results showed that backtracking restart after polymerase stalling is an ideal mechanism to increase both the effective transcription rate and the transcription effectiveness. Without backtracking, detachment of stalled polymerase can also help to increase the effective transcription rate and transcription effectiveness. Generally, the increase of the bypass rate of the stalled polymerase will lead to the decrease of the effective transcription rate and transcription effectiveness. However, when both detachment rate and backtracking rate of the stalled polymerase vanish, the effective transcription rate may also be increased by the bypass mechanism.
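
    A toy Monte Carlo version of such a model can be built on the exclusion process directly; in this sketch (rates are illustrative, and backtracking is simplified to an in-place restart) a stalled polymerase either restarts, bypasses, or detaches, and the effective transcription rate is the rate of correct completions:

      import numpy as np

      def tasep_with_stalling(L=200, alpha=0.3, beta=0.5, stall=0.01,
                              back=0.05, bypass=0.02, detach=0.01,
                              steps=200_000, seed=0):
          rng = np.random.default_rng(seed)
          lattice = np.zeros(L, dtype=int)   # 0 empty, 1 active, -1 stalled
          correct = 0
          for _ in range(steps):
              i = rng.integers(-1, L)        # -1 means an initiation attempt
              if i == -1:
                  if lattice[0] == 0 and rng.random() < alpha:
                      lattice[0] = 1
              elif lattice[i] == 1:
                  if rng.random() < stall:
                      lattice[i] = -1        # polymerase stalls
                  elif i == L - 1:
                      if rng.random() < beta:
                          lattice[i] = 0
                          correct += 1       # a correct mRNA is completed
                  elif lattice[i + 1] == 0:
                      lattice[i], lattice[i + 1] = 0, 1   # hop right
              else:                          # stalled polymerase
                  u = rng.random()
                  if u < back:
                      lattice[i] = 1         # backtracking restart after repair
                  elif u < back + bypass:
                      lattice[i] = 1         # bypass: transcript now incorrect
                      # (a full model would tag this polymerase as error-carrying)
                  elif u < back + bypass + detach:
                      lattice[i] = 0         # degradation and detachment
          return correct / steps             # completions per update attempt

      print(tasep_with_stalling())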

  2. Analysis of Tire Contact Parameters Using Visual Processing

    Directory of Open Access Journals (Sweden)

    Valentin Ivanov

    2010-01-01

    The first part of this paper presents the results of experimental estimation of the contact patch area depending on the normal wheel load and inflation pressure for different car tires. The data were obtained under test bench conditions on the basis of visual processing of the tread footprint. Further, the contact length in the cohesion area during wheel rolling, for single points on the tire profile, was chosen as a benchmark criterion. The paper analyzes the influence of the wheel normal load and tire inflation pressure on the contact length at small rolling velocities. The results of the investigations are given for winter and racing tires with different grades of wear.

  3. PROCESSING AND ANALYSIS OF THE MEASURED ALIGNMENT ERRORS FOR RHIC.

    Energy Technology Data Exchange (ETDEWEB)

    PILAT, F.; HEMMER, M.; PTITSIN, V.; TEPIKIAN, S.; TRBOJEVIC, D.

    1999-03-29

    All elements of the Relativistic Heavy Ion Collider (RHIC) have been installed in ideal survey locations, which are defined as the optimum locations of the fiducials with respect to the positions generated by the design. The alignment process included the presurvey of all elements which could affect the beams. During this procedure special attention was paid to the precise determination of the quadrupole centers as well as the roll angles of the quadrupoles and dipoles. After installation the machine was surveyed, and the resulting as-built measured positions of the fiducials have been stored and structured in the survey database. We describe how the alignment errors, inferred by comparison of ideal and as-built data, have been processed and analyzed by including them in the RHIC modeling software. The RHIC model, which also includes individual measured errors for all magnets in the machine and is automatically generated from databases, allows the study of the impact of the measured alignment errors on the machine.

  4. Analysis and optimization of coagulation and flocculation process

    Science.gov (United States)

    Saritha, V.; Srinivas, N.; Srikanth Vuppala, N. V.

    2017-03-01

    Natural coagulants have been the focus of research of many investigators through the last decade owing to the problems caused by chemical coagulants. Optimization of process parameters is vital for the effectiveness of the coagulation process. In the present study, optimization of parameters like pH, coagulant dose and mixing speed was studied using the natural coagulants sago and chitin in comparison with alum. Jar test apparatus was used to perform the coagulation. The results showed that turbidity removal was up to 99 % with both alum and chitin at lower coagulant doses, i.e., 0.1-0.3 g/L, whereas sago showed a reduction of 70-100 % at doses of 0.1 and 0.2 g/L. The optimum pH values observed for sago were 6 and 7, whereas chitin was stable over the whole pH range; both performed best at lower coagulant doses (0.1-0.3 g/L) with rapid mixing at 100 rpm for 10 min followed by slow mixing at 20 rpm for 20 min. Hence, it can be concluded that sago and chitin can be used for treating water even with large seasonal variation in turbidity.

  5. Embodying analysis: the body and the therapeutic process.

    Science.gov (United States)

    Martini, Salvatore

    2016-02-01

    This paper considers the transfer of somatic effects from patient to analyst, which gives rise to embodied countertransference, functioning as an organ of primitive communication. By means of processes of projective identification, the analyst experiences somatic disturbances within himself or herself that are connected to the split-off complexes of the analysand. The analyst's own attempt at mind-body integration ushers the patient towards a progressive understanding and acceptance of his or her inner suffering. Such experiences of psychic contagion between patient and analyst are related to Jung's 'psychology of the transference' and the idea of the 'subtle body' as an unconscious shared area. The re-attribution of meaning to pre-verbal psychic experiences within the 'embodied reverie' of the analyst enables the analytic dyad to reach the archetypal energies and structuring power of the collective unconscious. A detailed case example is presented of how the emergence of the vitalizing connection between the psyche and the soma, severed through traumatic early relations with parents or carers, allows the instinctual impulse of the Self to manifest, thereby reactivating the process of individuation. © 2016, The Society of Analytical Psychology.

  6. Experimental analysis of drilling process in cortical bone.

    Science.gov (United States)

    Wang, Wendong; Shi, Yikai; Yang, Ning; Yuan, Xiaoqing

    2014-02-01

    Bone drilling is an essential part of orthopaedics, traumatology and bone biopsy. Prediction and control of drilling forces and torque are critical to the success of operations involving bone drilling. This paper studied the drilling force, torque and drilling process with automatic and manual drills penetrating bovine cortical bone. The tests were performed on a drilling system which is used to drill and to measure forces and torque during drilling. The effects of drilling speed, feed rate and drill bit diameter on force and torque were discussed separately. The experimental results were shown to be in accordance with the mathematical expressions introduced in this paper. The automatic drilling saved drilling time by 30-60% in the tested range and created less vibration compared to manual drilling. The deviation between maximum and average force was 5 N for automatic drilling but 25 N for manual drilling. To conclude, the automatic method has significant advantages in controlling drilling force, torque and the drilling process in bone drilling. Copyright © 2013 IPEM. Published by Elsevier Ltd. All rights reserved.

  7. Multitask Gaussian processes for multivariate physiological time-series analysis.

    Science.gov (United States)

    Dürichen, Robert; Pimentel, Marco A F; Clifton, Lei; Schweikard, Achim; Clifton, David A

    2015-01-01

    Gaussian process (GP) models are a flexible means of performing nonparametric Bayesian regression. However, GP models in healthcare are often only used to model a single univariate output time series, denoted as single-task GPs (STGP). Due to an increasing prevalence of sensors in healthcare settings, there is an urgent need for robust multivariate time-series tools. Here, we propose a method using multitask GPs (MTGPs) which can model multiple correlated multivariate physiological time series simultaneously. The flexible MTGP framework can learn the correlation between multiple signals even though they might be sampled at different frequencies and have training sets available for different intervals. Furthermore, prior knowledge of any relationship between the time series such as delays and temporal behavior can be easily integrated. A novel normalization is proposed to allow interpretation of the various hyperparameters used in the MTGP. We investigate MTGPs for physiological monitoring with synthetic data sets and two real-world problems from the field of patient monitoring and radiotherapy. The results are compared with standard Gaussian processes and other existing methods in the respective biomedical application areas. In both cases, we show that our framework learned the correlation between physiological time series efficiently, outperforming the existing state of the art.
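
    As a minimal illustration of the multitask idea, the sketch below implements GP regression with an intrinsic coregionalization (ICM) kernel, K((x,i),(x',j)) = B[i,j] * k(x,x'), so a sparsely sampled signal borrows statistical strength from a densely sampled, correlated one. The task covariance B, the kernel length-scale and the noise level are illustrative assumptions; the MTGP framework described above additionally learns such hyperparameters from data.

```python
import numpy as np

def rbf(x1, x2, ell=1.0):
    d = x1[:, None] - x2[None, :]
    return np.exp(-0.5 * (d / ell) ** 2)

def mtgp_mean(x_tr, y_tr, task_tr, x_te, task_te, B, ell=1.0, noise=0.1):
    # Joint Gram matrix over (time, task) pairs, plus observation noise.
    K = B[np.ix_(task_tr, task_tr)] * rbf(x_tr, x_tr, ell)
    K += noise ** 2 * np.eye(len(x_tr))
    Ks = B[np.ix_(task_te, task_tr)] * rbf(x_te, x_tr, ell)
    return Ks @ np.linalg.solve(K, y_tr)

# Two correlated vital-sign-like series sampled at different rates:
rng = np.random.default_rng(0)
x1 = np.linspace(0, 10, 50)
x2 = np.linspace(0, 10, 12)                  # sparsely sampled second task
y1 = np.sin(x1) + 0.1 * rng.normal(size=x1.size)
y2 = 0.8 * np.sin(x2) + 0.1 * rng.normal(size=x2.size)

x_tr = np.concatenate([x1, x2])
y_tr = np.concatenate([y1, y2])
task_tr = np.array([0] * x1.size + [1] * x2.size)
B = np.array([[1.0, 0.9],                    # assumed task covariance
              [0.9, 1.0]])

x_te = np.linspace(0, 10, 5)
pred_task1 = mtgp_mean(x_tr, y_tr, task_tr, x_te, np.full(x_te.size, 1), B)
print(pred_task1.round(2))   # task-1 estimate borrows the dense task-0 data
```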

  8. Tracking the aging process by multiple 3D scans analysis

    Science.gov (United States)

    Bunsch, Eryk; Sitnik, Robert; Michonski, Jakub

    2012-03-01

    Currently, many different 3D scanning devices are used for 3D acquisition of the surface shape and color of art artifacts. Each of them has different technical parameters, starting from the measurement principle (structured light, laser triangulation, interferometry, holography) and ending with parameters like measurement volume size, spatial resolution and precision of output data, and color information. Some 3D scanners can capture additional information like surface normal vectors, BRDF distribution or multispectral color. In this paper, we present results of measurements with selected sampling densities, together with a discussion of the problem of recognition and assessment of the aging process. We focus on features that are important for art conservators to define the state of preservation of the object, as well as to assess changes on the surface between the last and previous measurements. Different materials and finishing techniques also require different algorithms for the detection and localization of aging changes. In this paper we consider exemplary stone samples to visualize what object features can be detected and tracked during the aging process. The changes in sandstone surface shape caused by salt weathering are presented, as well as possibilities of identifying surface degradation on a real object (a garden relief made of sandstone).

  9. Analysis for Cellinoid shape model in inverse process from lightcurves

    Science.gov (United States)

    Lu, Xiao-Ping; Ip, Wing-Huen; Huang, Xiang-Jie; Zhao, Hai-Bin

    2017-01-01

    Based on the special shape first introduced by Alberto Cellino, which consists of eight ellipsoidal octants with the constraint that adjacent octants must have two identical semi-axes, an efficient algorithm to derive the physical parameters, such as the rotational period, pole orientation, and overall shape, from either lightcurves or sparse photometric data of asteroids was developed by Lu et al. and named the 'Cellinoid' shape model. To thoroughly investigate the relationship between the morphology of the synthetic lightcurves generated by the Cellinoid shape and its six semi-axes, as well as the rotational period and pole, numerical tests are implemented in this article to compare the synthetic lightcurves generated by three Cellinoid models with different parameters. Furthermore, from the synthetic lightcurves generated by two convex shape models of (6) Hebe and (4179) Toutatis, the inverse process based on the Cellinoid shape model is applied to search for the best-fit parameters. In particular, for better simulation of real observations, the synthetic lightcurves are generated under the orbit limits of the two asteroids. By comparing the results derived from synthetic lightcurves observed in one apparition and in multiple apparitions, the performance of the Cellinoid shape model is confirmed and suggestions for observations are presented. Finally, the whole process is also applied to real observed lightcurves of (433) Eros and the derived results are consistent with the known results.

  10. Topographic analysis of eyelid position using digital image processing software.

    Science.gov (United States)

    Chun, Yeoun Sook; Park, Hong Hyun; Park, In Ki; Moon, Nam Ju; Park, Sang Joon; Lee, Jeong Kyu

    2017-11-01

    To propose a novel analysis technique for objective quantification of topographic eyelid position with an algorithmically calculated scheme and to determine its feasibility. One hundred normal eyelids from 100 patients were segmented using a graph cut algorithm, and 11 shape features of eyelids were semi-automatically quantified using in-house software. To evaluate the intra- and inter-examiner reliability of this software, intra-class correlation coefficients (ICCs) were used. To evaluate the diagnostic value of this scheme, the correlations between semi-automatic and manual measurements of margin reflex distance 1 (MRD1) and margin reflex distance 2 (MRD2) were analysed using a Bland-Altman analysis. To determine the degree of agreement according to manual MRD length, the relationship between the variance of the semi-automatic measurements and the manual measurements was evaluated using linear regression. Intra- and inter-examiner reliability were excellent, with ICCs ranging from 0.913 to 0.980 in 11 shape features including MRD1, MRD2, palpebral fissure, lid perimeter, upper and lower lid lengths, roundness, total area, and medial, central, and lateral areas. The correlations between semi-automatic and manual MRDs were also excellent, with better correlation in MRD1 than in MRD2 (R = 0.893 and 0.823, respectively). In addition, significant positive relationships were observed between the variance and the length of MRD1 and 2; the longer the MRD length, the greater the variance. The proposed novel optimized integrative scheme, which is shown to have high repeatability and reproducibility, is useful for topographic analysis of eyelid position. © 2017 Acta Ophthalmologica Scandinavica Foundation. Published by John Wiley & Sons Ltd.
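
    The agreement statistic used here is straightforward to compute. Below is a minimal sketch of a Bland-Altman comparison between manual and semi-automatic MRD measurements; the values are made-up numbers for illustration, not data from the study.

```python
import numpy as np

def bland_altman(manual, semi_auto):
    """Bland-Altman agreement stats for two measurement methods.

    Returns the mean bias and the 95% limits of agreement
    (bias +/- 1.96 * SD of the differences).
    """
    manual, semi_auto = np.asarray(manual), np.asarray(semi_auto)
    diff = semi_auto - manual
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Illustrative MRD1 values in mm (not data from the study):
manual = [3.1, 2.8, 4.0, 3.5, 2.2, 3.9]
semi = [3.0, 2.9, 4.2, 3.4, 2.1, 4.1]
print(bland_altman(manual, semi))
```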

  11. A Low Temperature Analysis of the Boundary Driven Kawasaki Process

    Science.gov (United States)

    Maes, Christian; O'Kelly de Galway, Winny

    2013-12-01

    Low temperature analysis of nonequilibrium systems requires finding the states with the longest lifetime and that are most accessible from other states. We determine these dominant states for a one-dimensional diffusive lattice gas subject to exclusion and with nearest neighbor interaction. They do not correspond to lowest energy configurations even though the particle current tends to zero as the temperature reaches zero. That is because the dynamical activity that sets the effective time scale, also goes to zero with temperature. The result is a non-trivial asymptotic phase diagram, which crucially depends on the interaction coupling and the relative chemical potentials of the reservoirs.

  12. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation on rubber shock absorber of machine tool was studied. The simple material model of rubber was obtained by through the finite element analysis software ABAQUS. The compression speed and the hardness of rubber material were considered to obtain the deformation law of rubber shock absorber. The location of fatigue were confirmed from the simulation results. The results shown that the fatigue position is distributed in the corner of shock absorber. The degree of deformation is increased with increasing of compress speed, and the hardness of rubber material is proportional to deformation.

  13. Rescattering processes for elliptical polarization: A quantum trajectory analysis

    Science.gov (United States)

    Kopold; Milosevic; Becker

    2000-04-24

    High-harmonic generation and high-order above-threshold ionization spectra calculated in the strong-field approximation are analyzed in terms of the complex space-time orbits that result from a saddle-point analysis of the underlying integrals. For elliptical polarization, the plateaus of the spectra of high-harmonic generation and high-order above-threshold ionization each turn into a staircase of very similar appearance. Each step of the staircase can be traced to a particular pair of orbits which are almost identical in both cases.

  14. Introducing uncertainty analysis of nucleation and crystal growth models in Process Analytical Technology (PAT) system design of crystallization processes.

    Science.gov (United States)

    Samad, Noor Asma Fazli Abdul; Sin, Gürkan; Gernaey, Krist V; Gani, Rafiqul

    2013-11-01

    This paper presents the application of uncertainty and sensitivity analysis as part of a systematic model-based process monitoring and control (PAT) system design framework for crystallization processes. For the uncertainty analysis, the Monte Carlo procedure is used to propagate input uncertainty, while for sensitivity analysis, global methods including the standardized regression coefficients (SRC) and Morris screening are used to identify the most significant parameters. The potassium dihydrogen phosphate (KDP) crystallization process is used as a case study, both in open-loop and closed-loop operation. In the uncertainty analysis, the impact on the predicted output of uncertain parameters related to the nucleation and the crystal growth model has been investigated for both a one- and two-dimensional crystal size distribution (CSD). The open-loop results show that the input uncertainties lead to significant uncertainties on the CSD, with appearance of a secondary peak due to secondary nucleation for both cases. The sensitivity analysis indicated that the most important parameters affecting the CSDs are nucleation order and growth order constants. In the proposed PAT system design (closed-loop), the target CSD variability was successfully reduced compared to the open-loop case, also when considering uncertainty in nucleation and crystal growth model parameters. The latter forms a strong indication of the robustness of the proposed PAT system design in achieving the target CSD and encourages its transfer to full-scale implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
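
    Both analysis layers described here can be prototyped compactly. The sketch below propagates an assumed +/-10% input uncertainty through a stand-in scalar model by Monte Carlo and then computes standardized regression coefficients (SRC); the model function and parameter values are illustrative placeholders for the paper's population balance model.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 2000

# Illustrative stand-in for the crystallization model: a scalar output
# (e.g. mean crystal size) as a function of nucleation-order (b) and
# growth-order (g) constants.
def model(b, g):
    return 50.0 / (1.0 + b) + 40.0 * g

# 1) Monte Carlo propagation of the assumed input uncertainty:
b = rng.uniform(0.9 * 2.0, 1.1 * 2.0, N)
g = rng.uniform(0.9 * 1.5, 1.1 * 1.5, N)
y = model(b, g)
print(f"output mean {y.mean():.2f}, 95% band "
      f"[{np.percentile(y, 2.5):.2f}, {np.percentile(y, 97.5):.2f}]")

# 2) Standardized regression coefficients: regress the standardized
#    output on the standardized inputs; |SRC| ranks parameter importance.
X = np.column_stack([(b - b.mean()) / b.std(), (g - g.mean()) / g.std()])
src, *_ = np.linalg.lstsq(X, (y - y.mean()) / y.std(), rcond=None)
print(dict(zip(["nucleation order", "growth order"], src.round(3))))
```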

  15. A Scalable Gaussian Process Analysis Algorithm for Biomass Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Chandola, Varun [ORNL]; Vatsavai, Raju [ORNL]

    2011-01-01

    Biomass monitoring is vital for studying the carbon cycle of earth's ecosystem and has several significant implications, especially in the context of understanding climate change and its impacts. Recently, several change detection methods have been proposed to identify land cover changes in temporal profiles (time series) of vegetation collected using remote sensing instruments, but they do not satisfy one or both of the two requirements of the biomass monitoring problem, i.e., operating in online mode and handling periodic time series. In this paper, we adapt Gaussian process regression to detect changes in such time series in an online fashion. While Gaussian processes (GPs) have been widely used as a kernel based learning method for regression and classification, their applicability to massive spatio-temporal data sets, such as remote sensing data, has been limited owing to the high computational costs involved. We focus on addressing the scalability issues associated with the proposed GP based change detection algorithm. This paper makes several significant contributions. First, we propose a GP based online time series change detection algorithm and demonstrate its effectiveness in detecting different types of changes in Normalized Difference Vegetation Index (NDVI) data obtained from a study area in Iowa, USA. Second, we propose an efficient Toeplitz matrix based solution which significantly improves the computational complexity and memory requirements of the proposed GP based method. Specifically, the proposed solution can analyze a time series of length t in O(t^2) time while maintaining an O(t) memory footprint, compared to the O(t^3) time and O(t^2) memory requirement of standard matrix manipulation based methods. Third, we describe a parallel version of the proposed solution which can be used to simultaneously analyze a large number of time series. We study three different parallel implementations: using threads, MPI, and a
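
    The Toeplitz trick is easy to demonstrate. For a stationary kernel on a regularly sampled series, the Gram matrix is Toeplitz, so solving K alpha = y needs only the first column of K. The sketch below uses scipy.linalg.solve_toeplitz (Levinson recursion, O(t^2) time and O(t) memory); the RBF kernel, length-scale and noise level are illustrative choices, not the paper's (a periodic kernel would be the natural choice for NDVI data).

```python
import numpy as np
from scipy.linalg import solve_toeplitz

rng = np.random.default_rng(0)
t = np.arange(500.0)
y = np.sin(2 * np.pi * t / 46) + 0.1 * rng.normal(size=t.size)  # NDVI-like

ell, sigma_n = 10.0, 0.1
col = np.exp(-0.5 * (t / ell) ** 2)   # first column of the stationary Gram matrix
col[0] += sigma_n ** 2                # noise variance on the diagonal

alpha = solve_toeplitz(col, y)        # alpha = K^{-1} y, without forming K

# GP mean at the training points; large |y - mean| flags candidate changes.
mean = np.array([np.exp(-0.5 * ((t - ti) / ell) ** 2) @ alpha for ti in t])
print(np.abs(y - mean).max())
```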

  16. Analysis of Evaporation and Condensation Processes in Complex Convective Flows.

    Science.gov (United States)

    Xu, Xun

    This dissertation has two parts. In Part I, a numerical model was developed to analyze the flow and cloud formation processes in a concurrent-flow cloud chamber that was recently designed by a group of researchers at Lawrence Berkeley Laboratory to examine the nucleation properties of smoke particles. This numerical model solves for the flow pattern and the distributions of temperature, water vapor, and liquid water droplets in the test chamber. Detailed information regarding these fields is difficult to obtain either by observation or by measurement during the experiment. The computational scheme uses a two-equation turbulence model (k-epsilon model), which has been modified to include the effects of buoyancy and droplet condensation. The turbulent transport of momentum, heat, species, and droplets is simultaneously determined. The model also incorporates a treatment of the droplet growth and sedimentation mechanisms during the cloud formation process. Streamlines, isotherms, and constant contours of the concentrations have been obtained for a matrix of running conditions. Results from this numerical model indicate that the wall of the cylindrical chamber (oriented vertically) has a very strong influence on the flow field and on the temperature distribution inside the chamber. In Part II of this thesis, an analytical model is presented which can be used to predict the heat transfer characteristics of film evaporation on a microgroove surface. The model assumes that the liquid flow along a 'V' shaped groove channel is driven primarily by the capillary pressure difference due to the receding of the meniscus toward the apex of the groove, and that the flow up the groove side wall is driven by the disjoining pressure difference. It also assumes that conduction across the thin liquid film is the dominant mechanism of heat transfer. A correlation between the Nusselt number and a non-dimensional parameter, Psi, is developed from this model which relates the

  17. Spectroscopy for amateur astronomers recording, processing, analysis and interpretation

    CERN Document Server

    Trypsteen, Marc F M

    2017-01-01

    This accessible guide presents the astrophysical concepts behind astronomical spectroscopy, covering both the theory and the practical elements of recording, processing, analysing and interpreting your spectra. It covers astronomical objects, such as stars, planets, nebulae, novae, supernovae, and events such as eclipses and comet passages. Suitable for anyone with only a little background knowledge and access to amateur-level equipment, the guide's many illustrations, sketches and figures will help you understand and practise this scientifically important and growing field of amateur astronomy, up to the level of Pro-Am collaborations. Accessible to non-academics, it benefits many groups from novices and learners in astronomy clubs, to advanced students and teachers of astrophysics. This volume is the perfect companion to the Spectral Atlas for Amateur Astronomers, which provides detailed commented spectral profiles of more than 100 astronomical objects.

  18. Application of simplified model to sensitivity analysis of solidification process

    Directory of Open Access Journals (Sweden)

    R. Szopa

    2007-12-01

    The sensitivity models of thermal processes proceeding in the casting-mould-environment system give essential information concerning the influence of physical and technological parameters on the course of solidification. Knowledge of the time-dependent sensitivity field is also very useful in the numerical solution of inverse problems. The sensitivity models can be constructed using the direct approach, that is, by differentiation of the basic energy equations and boundary-initial conditions with respect to the parameter considered. Unfortunately, the analytical form of the equations and conditions obtained can be very complex from both the mathematical and numerical points of view. An alternative approach, based on the differential quotient, can then be applied. In the paper the exact and approximate approaches to the modelling of sensitivity fields are discussed, and examples of computations are also shown.
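
    The differential-quotient approach amounts to a finite-difference approximation of the sensitivity field. A minimal sketch is shown below, using a simple exponential cooling curve as a stand-in for the solidification model; the model function and parameter values are illustrative assumptions, not the paper's.

```python
import numpy as np

def temperature(t, a, T0=1500.0, Tamb=20.0):
    """Illustrative cooling curve; 'a' stands in for a thermophysical
    parameter (e.g. a heat transfer coefficient over heat capacity)."""
    return Tamb + (T0 - Tamb) * np.exp(-a * t)

def sensitivity(t, a, h=1e-6):
    # Differential quotient: central difference with respect to 'a'.
    return (temperature(t, a + h) - temperature(t, a - h)) / (2 * h)

t = np.linspace(0.0, 600.0, 7)     # time, s
print(sensitivity(t, a=5e-3))      # dT/da along the cooling curve
```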

  19. Industrial process heat data analysis and evaluation. Volume 2

    Energy Technology Data Exchange (ETDEWEB)

    Lewandowski, A; Gee, R; May, K

    1984-07-01

    The Solar Energy Research Institute (SERI) has modeled seven of the Department of Energy (DOE) sponsored solar Industrial Process Heat (IPH) field experiments and has generated thermal performance predictions for each project. Additionally, these performance predictions have been compared with actual performance measurements taken at the projects. Predictions were generated using SOLIPH, an hour-by-hour computer code with the capability of modeling many types of solar IPH components and system configurations. Comparisons of reported and predicted performance resulted in good agreement when the field test reliability and availability were high. Volume I contains the main body of the work: objectives, model description, site configurations, model results, data comparisons, and summary. Volume II contains complete performance prediction results (tabular and graphic output) and computer program listings.

  20. Traffic analysis and signal processing in optical packet switched networks

    DEFF Research Database (Denmark)

    Fjelde, Tina

    2002-01-01

    …Gbit/s demultiplexing and 2x10 to 20 Gbit/s multiplexing. Lastly, the IWC's capabilities as an optical logic gate for enabling more complex signal processing are demonstrated and four applications hereof are discussed. Logic OR and AND are verified in full at 10 Gbit/s using PRBS sequences coupled into an MI. Moreover, logic XOR is demonstrated in an MZI at 10 and 20 Gbit/s with good results. Using an MI, the excellent performance of a novel scheme for MPLS label swapping exploiting logic XOR is demonstrated at 10 Gbit/s with a negligible 0.4 dB penalty. Finally, three novel schemes are described…

  1. Process analysis of an in store production of knitted clothing

    Science.gov (United States)

    Buecher, D.; Kemper, M.; Schmenk, B.; Gloy, Y.-S.; Gries, T.

    2017-10-01

    In the textile and clothing industry, global value-added networks are widespread for textile and clothing production. As a result of global networking, the value chain is fragmented and a great deal of effort is required to coordinate the production processes [1]. In addition, the planning effort regarding the quantity and design of the goods is high and risky. Today the fashion industry is facing increasing customer demand for individual and customizable products in addition to short delivery times [2]. These challenges are passed down to the textile and clothing industry in the form of decreasing batch sizes and production times. Conventional clothing production cannot fulfill these demands, especially when combined with more and more individual or customizable designs. Hence new production concepts have to be developed.

  2. Plug and Process Loads Capacity and Power Requirements Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sheppy, M.; Gentile-Polese, L.

    2014-09-01

    This report addresses gaps in actionable knowledge that would help reduce the plug load capacities designed into buildings. Prospective building occupants and real estate brokers lack accurate references for plug and process load (PPL) capacity requirements, so they often request 5-10 W/ft2 in their lease agreements. Limited initial data, however, suggest that actual PPL densities in leased buildings are substantially lower. Overestimating PPL capacity leads designers to oversize electrical infrastructure and cooling systems. Better guidance will enable improved sizing and design of these systems, decrease upfront capital costs, and allow systems to operate more energy efficiently. The main focus of this report is to provide industry with reliable, objective third-party guidance to address the information gap in typical PPL densities for commercial building tenants. This could drive changes in negotiations about PPL energy demands.

  3. Analysis and testing of propellant feed system priming process

    Science.gov (United States)

    Lin, T. Y.; Baker, D.

    1992-01-01

    This paper presents the analytical and the experimental results pertaining to the priming process of the propellant feed system with initial line pressures starting from 0 psia and greater. The analytical methods employ the method of characteristics to solve for the one-dimensional liquid transients in the liquid-full segments and the lumped inertia technique to model the dynamics of the partially filled segments or the two-phase segments. The advantages of these methods are (1) fluid compressibility and piping flexibility are accounted for in the solution; and (2) the characteristics method can be used in the solution of a complex system. The analytical results were compared with test results. Excellent correlation was obtained between predictions and test results verifying the essential aspects of the analytical modeling techniques and providing the foundation for analyzing a complicated network system.

  4. PROCESS MANAGEMENT: COMPARISON AND ANALYSIS BETWEEN METHODOLOGIES FOR THE IMPLEMENTATION OF MANAGEMENT BASED ON PROCESSES AND ITS MAIN CONCEPTS

    Directory of Open Access Journals (Sweden)

    Liane Mahlmann Kipper

    2011-12-01

    Nowadays, many organizations have questions about everything that the Process Management model encompasses, from how the deployment of this new management model actually happens to what performance the migration to this new model will add to the organization. Aware of these questions, this paper aims to contribute to the analysis of methodologies for Process Management, raising important issues relevant to the topic and comparing the main concepts discussed in each methodology studied, their similarities and their innovations. To develop this paper, literature searches on Process Management implementation and a case study analysis were performed. The findings indicate that an important aspect for achieving success in Process Management implementation is the pre-definition of a methodology to be used as a reference, which should be the one that best fits the profile of the organization. Moreover, it is highlighted that the correct identification of the main processes also exerts great influence on the success of the implementation of Process Management in an organization.

  5. Analysis of verbal interactions in tutorial groups: a process study.

    Science.gov (United States)

    Visschers-Pleijers, Astrid J S F; Dolmans, Diana H J M; de Leng, Bas A; Wolfhagen, Ineke H A P; van der Vleuten, Cees P M

    2006-02-01

    Collaborative learning, including problem-based learning (PBL), is a powerful learning method. Group interaction plays a crucial role in stimulating student learning. However, few studies on learning processes in medical education have examined group interactions, and most studies on collaboration within PBL used self-reported rather than observational data. We investigated the following types of interactions in PBL tutorial groups: learning-oriented interactions (exploratory questioning, cumulative reasoning and handling conflicts about knowledge), procedural interactions, and irrelevant/off-task interactions. The central question concerned how much time is spent on the different types of interaction during group sessions and how the types of interaction are distributed over the meeting. Four tutorial group sessions in Year 2 of the PBL undergraduate curriculum of Maastricht Medical School were videotaped and analysed. The sessions concerned the reporting phase of the PBL process. We analysed the interactions using a coding scheme distinguishing several verbal interaction types, such as questions, arguments and evaluations. Learning-oriented interactions accounted for 80% of the interactions, with cumulative reasoning, exploratory questioning and handling conflicts about knowledge accounting for about 63%, 10% and 7% of the interactions, respectively. Exploratory questioning often preceded cumulative reasoning. Both types occurred throughout the meeting. Handling conflicts mainly occurred after the first 20 minutes. Task involvement in the tutorial groups was high. All types of learning-oriented interactions were observed. Relatively little time was spent on exploratory questions and handling conflicts about knowledge. Problem-based learning training should pay special attention to stimulating discussion of contradictory information.

  6. A Methodology for Project Selection Using Economic Analysis and the Analytic Hierarchy Process

    Science.gov (United States)

    1992-09-01

    This thesis examines the applicability of a multiple criterion decision making (MCDM) method, known as the Analytic Hierarchy Process (AHP), to economic analysis decisions involving capital budgeting and the economic analysis of projects. Several MCDM methods were examined, and the AHP was found to be the most promising technique to rate projects on a ratio scale.
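
    As context for that combination, AHP derives ratio-scale weights from a pairwise comparison matrix via its principal eigenvector and checks judgment consistency with Saaty's consistency ratio. The sketch below shows the computation; the 3x3 judgment matrix is an illustrative assumption.

```python
import numpy as np

def ahp_weights(A):
    """Priority vector (principal eigenvector) and consistency ratio
    for a pairwise comparison matrix A (Saaty's AHP)."""
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    n = A.shape[0]
    ci = (vals.real[k] - n) / (n - 1)          # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
          6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45}[n]  # Saaty's random index
    return w, (ci / ri if ri else 0.0)

# Illustrative judgment matrix over three project criteria:
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
w, cr = ahp_weights(A)
print(w.round(3), f"CR={cr:.3f}")   # CR < 0.1 => acceptably consistent
```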

  7. PROCESS VALUE ANALYSIS AT THE CLINICAL PATHOLOGY LABORATORY UNIT OF RSUP DR. WAHIDIN SUDIROHUSODO MAKASSAR

    OpenAIRE

    FILDZAH, AUNI PRATIWI

    2017-01-01

    2017. ABSTRACT: Process Value Analysis at the Clinical Pathology Laboratory Unit of RSUP Dr. Wahidin Sudirohusodo Makassar. Fildzah Auni Pratiwi; Sri Sundari; Muallimin. Hospitals, as nonprofit organizations providing health services to the community, need to carry out continuous improvement…

  8. Techno-economic analysis of organosolv pretreatment process from lignocellulosic biomass

    DEFF Research Database (Denmark)

    Rodrigues Gurgel da Silva, Andrè; Errico, Massimiliano; Rong, Ben-Guang

    2018-01-01

    …data, we propose a feasible process flowsheet for organosolv pretreatment. Simulation of the pretreatment process provided mass and energy balances for a techno-economic analysis, and the values were compared with the most prevalent and mature pretreatment method: diluted acid. Organosolv pretreatment … in the sensitivity analysis turned into possible savings of 42.8% in the minimum ethanol selling price for organosolv pretreatment.

  9. Modeling and Analysis of Power Processing Systems. [use of a digital computer for designing power plants

    Science.gov (United States)

    Fegley, K. A.; Hayden, J. H.; Rehmann, D. W.

    1974-01-01

    The feasibility of formulating a methodology for the modeling and analysis of aerospace electrical power processing systems is investigated. It is shown that a digital computer may be used in an interactive mode for the design, modeling, analysis, and comparison of power processing systems.

  10. Analysis of delamination related fracture processes in composites

    Science.gov (United States)

    Armanios, Erian A.

    1992-01-01

    This is a final report that summarizes the results achieved under this grant. The first major accomplishment is the development of the sublaminate modeling approach and shear deformation theory. The sublaminate approach allows the flexibility of considering one ply or groups of plies as a single laminated unit with effective properties. This approach is valid when the characteristic length of the response is small compared to the sublaminate thickness. The sublaminate approach was validated comparing its predictions with a finite element solution. A shear deformation theory represents an optimum compromise between accuracy and computational effort in delamination analysis of laminated composites. This conclusion was reached by applying several theories with increasing level of complexity to the prediction of interlaminar stresses and strain energy release rate in a double cracked-lap-shear configuration.

  11. Environmental accounting in Spain: structured review process and theoretical analysis

    Directory of Open Access Journals (Sweden)

    Fabricia Silva da Rosa

    2012-12-01

    One way to perceive and understand the level of development of environmental accounting is to study the main features of its publications. Thus, the purpose of this paper is to identify and analyze the profile of Spanish publications in accounting journals. To this end, 15 journals were selected and 74 articles published in the period 2001 to 2010 were analyzed. The results show that the peak years of publication are 2001, 2003 and 2006, and the authors with the most articles in the sample are Moneva Abadía, Larrinaga González, Fernández Cuesta and Archel Domench. In terms of methodology, reviews, case studies and content analyses predominate, addressing standardization issues, fundamentals of environmental accounting, environmental sustainability indicators and reporting.

  12. NERSC-6 Workload Analysis and Benchmark Selection Process

    Energy Technology Data Exchange (ETDEWEB)

    Antypas, Katie; Shalf, John; Wasserman, Harvey

    2008-08-29

    This report describes efforts carried out during early 2008 to determine some of the science drivers for the "NERSC-6" next-generation high-performance computing system acquisition. Although the starting point was existing Greenbooks from DOE and the NERSC User Group, the main contribution of this work is an analysis of the current NERSC computational workload combined with requirements information elicited from key users and other scientists about expected needs in the 2009-2011 timeframe. The NERSC workload is described in terms of science areas, computer codes supporting research within those areas, and descriptions of key algorithms that comprise the codes. This work was carried out in large part to help select a small set of benchmark programs that accurately capture the science and algorithmic characteristics of the workload. The report concludes with a description of the codes selected and some preliminary performance data for them on several important systems.

  13. Systems analysis of N-glycan processing in mammalian cells.

    Directory of Open Access Journals (Sweden)

    Patrick Hossler

    2007-08-01

    N-glycosylation plays a key role in the quality of many therapeutic glycoprotein biologics. The biosynthesis reactions of these oligosaccharides form a network in which a relatively small number of enzymes give rise to a large number of N-glycans as reaction intermediates and terminal products. Multiple glycans appear on the glycoprotein molecules and give rise to a heterogeneous product. Controlling the glycan distribution is critical to the quality control of the product. Understanding N-glycan biosynthesis and the etiology of microheterogeneity would provide physiological insights and facilitate cellular engineering to enhance glycoprotein quality. We developed a mathematical model of glycan biosynthesis in the Golgi and analyzed the effect of various reaction variables on the resulting glycan distribution. The Golgi was modeled as four compartments in series. The mechanism of protein transport across the Golgi is still controversial. From the viewpoint of their holding time distribution characteristics, the two main hypothesized mechanisms, the vesicular transport and Golgi maturation models, resemble four continuous stirred tanks (4CSTR) and four plug-flow reactors (4PFR) in series, respectively. The two hypotheses were modeled accordingly and compared. The intrinsic reaction kinetics were first evaluated using a batch (or single PFR) reactor. A sufficient holding time is needed to produce terminally-processed glycans. Altering enzyme concentrations has a complex effect on the final glycan distribution, as the changes often affect many reaction steps in the network. Comparison of the glycan profiles predicted by the 4CSTR and 4PFR models points to the 4PFR system as more likely to be the true mechanism. To assess whether glycan heterogeneity can be eliminated in the biosynthesis of biotherapeutics, the 4PFR model was further used to assess whether a homogeneous glycan profile can be created through metabolic engineering. We demonstrate by…
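
    The residence (holding) time distributions behind this comparison are standard reactor-engineering results: n equal CSTRs in series give an Erlang-shaped distribution, while PFRs in series act as a single PFR whose material all exits at the mean residence time. The sketch below computes the 4CSTR distribution and its spread; the transit time value is an illustrative assumption.

```python
import numpy as np
from math import factorial

def cstr_series_rtd(t, n, tau):
    """E(t) for n equal CSTRs in series, total mean residence time tau."""
    return (n / tau) ** n * t ** (n - 1) * np.exp(-n * t / tau) / factorial(n - 1)

tau = 20.0                        # assumed Golgi transit time, min
t = np.linspace(0, 60, 601)
e4 = cstr_series_rtd(t, 4, tau)   # vesicular-transport-like (4CSTR)

# 4PFR: all material exits at t == tau (zero spread), so every glycan sees
# the same processing time, narrowing the predicted glycan profile.
spread = np.sqrt(np.trapz((t - tau) ** 2 * e4, t))
print("4CSTR RTD spread (std):", round(spread, 2), "min; 4PFR spread: 0")
```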

  14. Software Integration of Life Cycle Assessment and Economic Analysis for Process Evaluation

    DEFF Research Database (Denmark)

    Kalakula, Sawitree; Malakula, Pomthong; Siemanonda, Kitipat

    2013-01-01

    This study is focused on the sustainable process design of bioethanol production from cassava rhizome. The study includes process simulation, sustainability analysis, economic evaluation and life cycle assessment (LCA). A steady state process simulation is performed to generate a base case design of the bioethanol conversion process using cassava rhizome as a feedstock. The sustainability analysis is performed to analyze the relevant indicators in sustainability metrics and to define design/retrofit targets for process improvements. Economic analysis is performed to evaluate the profitability of the process. Also, simultaneously with the sustainability analysis, the life cycle impact on the environment associated with bioethanol production is assessed. Finally, candidate alternative designs are generated and compared with the base case design in terms of LCA, economics, waste, energy usage and environmental impact…

  15. A software tool for design of process monitoring and analysis systems

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2009-01-01

    A well designed process monitoring and analysis system is necessary to consistently achieve any predefined end product quality. Systematic computer aided methods and tools provide the means to design the necessary process monitoring and analysis systems and/or to validate any existing monitoring and analysis system. A software tool to achieve this has been developed. Two supporting tools for the design, a knowledge base (consisting of the process knowledge as well as the knowledge on measurement methods and tools) and a model library (consisting of the process operational models), have been extended rigorously and integrated with the user interface, which makes the software more generic and applicable to a wide range of problems. The software for the design of a process monitoring and analysis system is presented and illustrated with a tablet manufacturing process example.

  16. [A SAS macro program for batch processing of univariate Cox regression analysis for large databases].

    Science.gov (United States)

    Yang, Rendong; Xiong, Jie; Peng, Yangqin; Peng, Xiaoning; Zeng, Xiaomin

    2015-02-01

    To realize batch processing of univariate Cox regression analysis for large databases with a SAS macro program. We wrote a SAS macro program in SAS 9.2 which can filter and integrate the results and export P values to Excel. The program was used for screening survival-correlated RNA molecules in ovarian cancer. The SAS macro program could complete the batch processing of univariate Cox regression analyses and the selection and export of the results. The SAS macro program has potential applications in reducing the workload of statistical analysis and providing a basis for batch processing of univariate Cox regression analysis.
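
    The same batch idea can be sketched outside SAS. Below is a minimal Python analogue (assuming the lifelines package is available; the column names and synthetic data are illustrative): fit one univariate Cox model per candidate covariate, collect coefficients and P values, and optionally export them to Excel.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

def batch_univariate_cox(df, duration_col, event_col, covariates):
    """Fit one univariate Cox model per covariate; collect coef and p."""
    rows = []
    for cov in covariates:
        cph = CoxPHFitter()
        cph.fit(df[[duration_col, event_col, cov]],
                duration_col=duration_col, event_col=event_col)
        s = cph.summary.loc[cov]
        rows.append({"covariate": cov, "coef": s["coef"], "p": s["p"]})
    return pd.DataFrame(rows).sort_values("p")

# Tiny synthetic example (in practice: thousands of RNA columns):
rng = np.random.default_rng(0)
df = pd.DataFrame({"time": rng.exponential(10, 200),
                   "event": rng.integers(0, 2, 200),
                   "rna1": rng.normal(size=200),
                   "rna2": rng.normal(size=200)})
results = batch_univariate_cox(df, "time", "event", ["rna1", "rna2"])
# results.to_excel("univariate_cox.xlsx", index=False)  # needs openpyxl
print(results)
```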

  17. [Plasma spectral analysis of laser cleaning process in air].

    Science.gov (United States)

    Tong, Yan-Qun; Zhang, Yong-Kang; Yao, Hong-Bing; Meng, Chun-Mei; Guan, Hai-Bing

    2011-09-01

    Monitoring the sample condition on-line during laser cleaning by means of laser-induced plasma spectra in air is quick and accurate. In the present article, an echelle grating spectrometer was used to detect the plasma spectral lines induced by pulsed laser interaction with copper coin samples with or without contamination. The spectrogram showed clear Cu I lines and air atom lines of N I and O I. In order to eliminate the uncertainty of a single measurement, the statistical regularity of the N I and O I lines was analyzed. Their intensity distributions were consistent and their relative standard deviations were basically the same, so a single measured spectrum can be used to monitor the cleaning process. The spectra of contaminated copper samples consisted of atomic lines of many elements together with continuous spectra, whereas the spectra of clean copper samples show Cu I lines. As a result, the change in the spectral lines can be used to judge whether the laser-cleaned samples are clean.

  18. Comparative Analysis between Fuzzy and Traditional Analytical Hierarchy Process

    Directory of Open Access Journals (Sweden)

    Mulubrhan Freselam

    2014-07-01

    Analytic Hierarchy Process (AHP) is one of the techniques commonly used for prioritizing different alternatives using complex criteria. In real applications, conventional AHP assumes that expert judgments are exact and uses crisp numbers, thereby neglecting the uncertainty that comes from linguistic variables. Fuzzy logic deals with situations that are vague or ill defined and gives a quantified value. In this study a comparison is made between traditional AHP and fuzzy AHP for the case of selecting an effective oil refinery. The selection is conducted using system effectiveness as a criterion. The two approaches were compared on the same hierarchy structure and criteria set, and the results show that in both cases the dual drum scheme (DDS) has the highest priority but with different values, 0.51 and 0.36 for AHP and FAHP respectively. This suggests that if the expert opinion is certain, AHP should be used; if not, FAHP should be preferred.
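
    To make the contrast concrete, the sketch below implements one common fuzzy-AHP variant, Buckley's geometric-mean method with triangular fuzzy judgments, ending in a crisp weight vector that can be compared with the eigenvector weights of conventional AHP. The 3x3 fuzzy judgment matrix is an illustrative assumption, not the paper's refinery data.

```python
import numpy as np

# F[i][j] is a triangular fuzzy judgment (l, m, u); reciprocal entries
# are (1/u, 1/m, 1/l).
F = np.array([
    [[1, 1, 1],       [2, 3, 4],       [4, 5, 6]],
    [[1/4, 1/3, 1/2], [1, 1, 1],       [1, 2, 3]],
    [[1/6, 1/5, 1/4], [1/3, 1/2, 1],   [1, 1, 1]],
])

geo = np.prod(F, axis=1) ** (1.0 / F.shape[0])  # fuzzy geometric mean per row
tot = geo.sum(axis=0)                           # fuzzy total (L, M, U)
w_fuzzy = geo / tot[::-1]                       # divide by reversed (U, M, L)
w_crisp = w_fuzzy.mean(axis=1)                  # centroid defuzzification
print((w_crisp / w_crisp.sum()).round(3))       # crisp priority weights
```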

  19. Rate process analysis of thermal damage in cartilage

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Sergio H; Nelson, J Stuart; Wong, Brian J F [Beckman Laser Institute and Medical Clinic, University of California, Irvine, CA (United States)

    2003-01-07

    Cartilage laser thermoforming (CLT) is a new surgical procedure that allows in situ treatment of deformities in the head and neck with less morbidity than traditional approaches. While some animal and human studies have shown promising results, the clinical feasibility of CLT depends on preservation of chondrocyte viability, which has not been extensively studied. The present paper characterizes cellular damage due to heat in rabbit nasal cartilage. Damage was modelled as a first order rate process for which two experimentally derived coefficients, A = 1.2×10^70 s^-1 and E_a = 4.5×10^5 J mol^-1, were determined by quantifying the decrease in concentration of healthy chondrocytes in tissue samples as a function of exposure time to constant-temperature water baths. After immersion, chondrocytes were enzymatically isolated from the matrix and stained with a two-component fluorescent dye. The dye binds nuclear DNA differentially depending upon chondrocyte viability. A flow cytometer was used to detect differential cell fluorescence to determine the percentage of live and dead cells in each sample. As a result, a damage kinetic model was obtained that can be used to predict the onset, extent and severity of cellular injury due to thermal exposure.
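
    With the two coefficients quoted above, the first-order rate model is easy to evaluate: the damage integral is Omega(t) = A * Int exp(-E_a/(R T(t'))) dt', and the surviving chondrocyte fraction is exp(-Omega). The sketch below evaluates isothermal exposures; only A and E_a come from the study, while the bath temperatures and the exposure time are illustrative.

```python
import numpy as np

A = 1.2e70    # 1/s   (from the study)
Ea = 4.5e5    # J/mol (from the study)
R = 8.314     # J/(mol K)

def surviving_fraction(T_celsius, seconds):
    """Viable chondrocyte fraction after an isothermal exposure:
    Omega = A * exp(-Ea / (R*T)) * t, survival = exp(-Omega)."""
    T = T_celsius + 273.15
    omega = A * np.exp(-Ea / (R * T)) * seconds
    return np.exp(-omega)

for Tc in (50, 55, 60, 65):                      # illustrative bath temps
    print(Tc, "C:", surviving_fraction(Tc, 120).round(4))  # 2 min exposure
```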

  20. Porosity Measurements and Analysis for Metal Additive Manufacturing Process Control.

    Science.gov (United States)

    Slotwinski, John A; Garboczi, Edward J; Hebenstreit, Keith M

    2014-01-01

    Additive manufacturing techniques can produce complex, high-value metal parts, with potential applications as critical metal components such as those found in aerospace engines and as customized biomedical implants. Material porosity in these parts is undesirable for aerospace parts - since porosity could lead to premature failure - and desirable for some biomedical implants - since surface-breaking pores allows for better integration with biological tissue. Changes in a part's porosity during an additive manufacturing build may also be an indication of an undesired change in the build process. Here, we present efforts to develop an ultrasonic sensor for monitoring changes in the porosity in metal parts during fabrication on a metal powder bed fusion system. The development of well-characterized reference samples, measurements of the porosity of these samples with multiple techniques, and correlation of ultrasonic measurements with the degree of porosity are presented. A proposed sensor design, measurement strategy, and future experimental plans on a metal powder bed fusion system are also presented.

  1. Analysis of Trans Fat in Edible Oils with Cooking Process.

    Science.gov (United States)

    Song, Juhee; Park, Joohyeok; Jung, Jinyeong; Lee, Chankyu; Gim, Seo Yeoung; Ka, HyeJung; Yi, BoRa; Kim, Mi-Ja; Kim, Cho-Il; Lee, JaeHwan

    2015-09-01

    Trans fat is an unsaturated fatty acid with a trans configuration and separated double bonds. Analytical methods have been introduced to analyze trans fat content in foods, including infrared (IR) spectroscopy, gas chromatography (GC), Fourier transform-infrared (FT-IR) spectroscopy, reversed-phase silver ion high performance liquid chromatography, and silver nitrate thin layer chromatography. Currently, FT-IR spectroscopy and GC are the most widely used methods. Trans fat content in 6 vegetable oils was analyzed, and the effects of processing, including baking, stir-frying, pan-frying, and frying, on the formation of trans fat in corn oil were evaluated by GC. Among the tested vegetable oils, corn oil had 0.25 g trans fat/100 g, whereas the other oils, including rapeseed, soybean, olive, perilla, and sesame oils, did not have detectable amounts of trans fat. Among the cooking methods, stir-frying increased trans fat in corn oil, whereas baking, pan-frying, and frying did not change the trans fat content compared to untreated corn oil. However, the trans fat content was so low that the food label can declare '0' trans fat based on the regulation of the Ministry of Food and Drug Safety (MFDS) (< 2 g/100 g edible oil).

  2. Grey Relational Analysis Coupled with Principal Component Analysis for Optimization of Stereolithography Process to Enhance Part Quality

    Science.gov (United States)

    Raju, B. S.; Sekhar, U. Chandra; Drakshayani, D. N.

    2017-08-01

    The paper investigates optimization of the stereolithography process for SL5530 epoxy resin material to enhance part quality. The major performance characteristics selected to evaluate the process are tensile strength, flexural strength, impact strength and density, and the corresponding process parameters are layer thickness, orientation and hatch spacing. Since the process intrinsically involves tuning multiple parameters, grey relational analysis, which uses the grey relational grade as a performance index, is adopted to determine the optimal combination of process parameters. Moreover, principal component analysis is applied to evaluate the weighting values corresponding to the various performance characteristics so that their relative importance can be properly and objectively described. The results of confirmation experiments reveal that grey relational analysis coupled with principal component analysis can effectively acquire the optimal combination of process parameters. Hence, the proposed approach can be a useful tool to improve the process parameters in stereolithography, which is very useful information for machine designers as well as RP machine users.
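
    A compact numeric sketch of the combined method is given below: normalize the responses, form grey relational coefficients, take weights from the first principal component of the coefficient matrix, and rank parameter combinations by the weighted grey relational grade. The response matrix and the distinguishing coefficient zeta = 0.5 are illustrative choices, not the paper's measurements.

```python
import numpy as np

# rows = experimental runs, cols = responses (all larger-the-better here):
Y = np.array([[38.1, 61.0, 2.1, 1.18],
              [41.5, 65.2, 2.4, 1.20],
              [36.9, 58.7, 1.9, 1.17],
              [43.2, 67.8, 2.6, 1.21]])

# 1) Larger-the-better normalization and deviation sequences:
Z = (Y - Y.min(0)) / (Y.max(0) - Y.min(0))
delta = 1.0 - Z

# 2) Grey relational coefficients with distinguishing coefficient zeta:
zeta = 0.5
xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

# 3) PCA on the coefficient matrix; weights from the first component:
vals, vecs = np.linalg.eigh(np.cov(xi, rowvar=False))
w = vecs[:, -1] ** 2            # squared loadings of the first PC sum to 1

grade = xi @ w                  # grey relational grade per run
print("weights:", w.round(3), "grades:", grade.round(3))
print("best run:", grade.argmax())
```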

  3. SIX SIGMA BENCHMARKING OF PROCESS CAPABILITY ANALYSIS AND MAPPING OF PROCESS PARAMETERS

    Directory of Open Access Journals (Sweden)

    Jagadeesh Rajashekharaiah

    2016-12-01

    Inventory classification aims to ensure that business-driving inventory items are efficiently managed in spite of constrained resources. There are numerous single- and multiple-criteria approaches to it. We compare several approaches using a subset of a large spare parts inventory data set. Our objective is to improve resource allocation, leading to a focus on items that can lead to high equipment availability. This concern is typical of many service industries such as military logistics, airlines, amusement parks and public works. We find that a modified multi-criteria weighted non-linear optimization (WNO) technique is a powerful approach for classifying inventory, far outperforming traditional techniques such as ABC analysis and other methods available in the literature.

  4. Statistical analysis of the breaking processes of Ni nanowires

    Energy Technology Data Exchange (ETDEWEB)

    Garcia-Mochales, P [Departamento de Física de la Materia Condensada, Facultad de Ciencias, Universidad Autónoma de Madrid, c/ Francisco Tomás y Valiente 7, Campus de Cantoblanco, E-28049-Madrid (Spain); Paredes, R [Centro de Física, Instituto Venezolano de Investigaciones Científicas, Apartado 20632, Caracas 1020A (Venezuela); Pelaez, S; Serena, P A [Instituto de Ciencia de Materiales de Madrid, Consejo Superior de Investigaciones Científicas, c/ Sor Juana Inés de la Cruz 3, Campus de Cantoblanco, E-28049-Madrid (Spain)], E-mail: pedro.garciamochales@uam.es

    2008-06-04

    We have performed a massive statistical analysis on the breaking behaviour of Ni nanowires using molecular dynamic simulations. Three stretching directions, five initial nanowire sizes and two temperatures have been studied. We have constructed minimum cross-section histograms and analysed for the first time the role played by monomers and dimers. The shape of such histograms and the absolute number of monomers and dimers strongly depend on the stretching direction and the initial size of the nanowire. In particular, the statistical behaviour of the breakage final stages of narrow nanowires strongly differs from the behaviour obtained for large nanowires. We have analysed the structure around monomers and dimers. Their most probable local configurations differ from those usually appearing in static electron transport calculations. Their non-local environments show disordered regions along the nanowire if the stretching direction is [100] or [110]. Additionally, we have found that, at room temperature, [100] and [110] stretching directions favour the appearance of non-crystalline staggered pentagonal structures. These pentagonal Ni nanowires are reported in this work for the first time. This set of results suggests that experimental Ni conducting histograms could show a strong dependence on the orientation and temperature.

  5. N-terminal protein processing: A comparative proteogenomic analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bonissone, Stefano; Gupta, Nitin; Romine, Margaret F.; Bradshaw, Ralph A.; Pevzner, Pavel A.

    2013-01-01

    N-Terminal Methionine Excision (NME) is a universally conserved mechanism, with the same specificity across all life forms, that removes the first methionine in proteins when the second residue is Gly, Ala, Ser, Cys, Thr, Pro, or Val. In spite of its necessity for proper cell functioning, the functional role of NME remains unclear. In 1988, Arfin and Bradshaw connected NME with the N-end rule of protein degradation and postulated that the role of NME is to expose the stabilizing residues in order to resist protein degradation. While this explanation (which treats all 7 stabilizing residues in the same manner) has become the de facto dogma of NME, comparative proteogenomics analysis of NME tells a different story. We suggest that the primary role of NME is to expose only two (rather than seven) amino acids, Ala and Ser, for post-translational modifications (e.g., acetylation) rather than to regulate protein degradation. We argue that, contrary to the existing view, NME is not crucially important for proteins with the 5 other stabilizing residues at the 2nd position; these are merely bystanders (their function is not affected by NME) that become exposed to NME because their sizes are comparable to or smaller than those of Ala and Ser.

  6. Vehicle Lightweighting: Mass Reduction Spectrum Analysis and Process Cost Modeling

    Energy Technology Data Exchange (ETDEWEB)

    Mascarin, Anthony [IBIS Associates, Inc., Waltham, MA (United States); Hannibal, Ted [IBIS Associates, Inc., Waltham, MA (United States); Raghunathan, Anand [Energetics Inc., Columbia, MD (United States); Ivanic, Ziga [Energetics Inc., Columbia, MD (United States); Clark, Michael [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-03-01

    The U.S. Department of Energy's Vehicle Technologies Office, Materials area, commissioned a study to model and assess the manufacturing economics of alternative design and production strategies for a series of lightweight vehicle concepts. The first two phases of this effort examined combinations of strategies aimed at achieving strategic targets of 40% and 45% mass reduction relative to a standard North American midsize passenger sedan at an effective cost of $3.42 per pound (lb) saved. These results are reported in the Idaho National Laboratory report INL/EXT-14-33863, entitled Vehicle Lightweighting: 40% and 45% Weight Savings Analysis: Technical Cost Modeling for Vehicle Lightweighting, published in March 2015. The data for these strategies were drawn from many sources, including Lotus Engineering Limited and FEV, Inc. lightweighting studies, the U.S. Department of Energy-funded Vehma International of America, Inc./Ford Motor Company Multi-Material Lightweight Prototype Vehicle Demonstration Project, the Aluminum Association Transportation Group, many United States Council for Automotive Research/United States Automotive Materials Partnership LLC lightweight materials programs, and IBIS Associates, Inc.'s decades of experience in automotive lightweighting and materials substitution analyses.

  7. Genetic analysis of processed in-line mastitis indicator data

    DEFF Research Database (Denmark)

    Sørensen, Lars Peter; Løvendahl, Peter

    2013-01-01

    The aim of this study was to estimate heritability of elevated mastitis risk (EMR), a trait derived from in-line measurements of cell counts expressing risk of mastitis on a continuous scale, and its genetic correlation with in-line somatic cell counts. Log-transformed somatic cell counts (SCC; n...... on exponential smoothing of the SCC values followed by factor analysis for estimation of the latent variable EMR was used. Finally, EMR was expressed as a continuum on the interval [0;1] using sigmoid transformation. Thus, an EMR value close to zero indicates low risk of mastitis and a value close to one...... indicates high risk of mastitis. The EMR values were summarized for each cow using the log-transformed median EMR. A second trait was defined as the median of the log-transformed SCC values from 5 to 305 d in milk. A bivariate animal model was used for estimation of co-variance components for the 2 traits...
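
    The two transformation steps named above (exponential smoothing of log SCC, then a sigmoid mapping onto [0;1]) can be illustrated with a minimal sketch. The factor-analysis step is omitted, and the smoothing weight, toy SCC values and centering constant below are hypothetical choices, not the study's fitted values.

    ```python
    import numpy as np

    def exponential_smoothing(x, alpha=0.2):
        """Exponentially weighted moving average of a series."""
        s = np.empty_like(x, dtype=float)
        s[0] = x[0]
        for t in range(1, len(x)):
            s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
        return s

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Toy in-line SCC measurements; alpha is a hypothetical smoothing weight.
    scc = np.array([55., 60., 58., 240., 800., 650., 120., 70.])
    log_scc = np.log(scc)                    # log-transform, as in the study
    smoothed = exponential_smoothing(log_scc)
    # Map the smoothed signal onto [0, 1]; the centering constant is illustrative.
    emr = sigmoid(smoothed - np.median(smoothed))
    print(np.round(emr, 2))
    ```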

  8. Analysis of Fiber deposition using Automatic Image Processing Method

    Science.gov (United States)

    Belka, M.; Lizal, F.; Jedelsky, J.; Jicha, M.

    2013-04-01

    Fibers are a permanent threat to human health. They can penetrate deep into the human lung, deposit there, and cause health hazards, e.g. lung cancer. An experiment was carried out to gain more data about the deposition of fibers. Monodisperse glass fibers were delivered into a realistic model of human airways at an inspiratory flow rate of 30 l/min. The replica included the human airways from the oral cavity up to the seventh generation of branching. After the delivery, deposited fibers were rinsed from the model and placed on nitrocellulose filters. A novel method was established for deposition data acquisition, based on the principle of image analysis. The images were captured by a high-definition camera attached to a phase contrast microscope. Results of the new method were compared with the standard PCM method, which follows the NIOSH 7400 methodology, and a good match was found. The new method was found applicable for the evaluation of fibers, and deposition fraction and deposition efficiency were calculated afterwards.

  9. Social comparison processes and catastrophising in fibromyalgia: A path analysis.

    Science.gov (United States)

    Cabrera-Perona, V; Buunk, A P; Terol-Cantero, M C; Quiles-Marcos, Y; Martín-Aragón, M

    2017-06-01

    In addition to coping strategies, social comparison may play a role in illness adjustment. However, little is known about the role of contrast and identification in social comparison in adaptation to fibromyalgia. The aim was to evaluate, through a path analysis in a sample of fibromyalgia patients, the associations between identification and contrast in social comparison, catastrophising, and specific health outcomes (fibromyalgia illness impact and psychological distress). 131 Spanish fibromyalgia outpatients (mean age: 50.15, SD = 11.1) filled out a questionnaire. We present a model that explained 33% of the variance in catastrophising by direct effects of more use of upward contrast and downward identification. In addition, 35% of the variance in fibromyalgia illness impact was explained by less upward identification, more upward contrast and more catastrophising, and 42% of the variance in psychological distress by a direct effect of more use of upward contrast together with higher fibromyalgia illness impact. We suggest that intervention programmes with chronic pain and fibromyalgia patients should focus on enhancing the use of upward identification in social comparison, and on minimising the use of upward contrast and downward identification in social comparison.

  10. Microscopic Evaluation of Friction Plug Welds- Correlation to a Processing Analysis

    Science.gov (United States)

    Rabenberg, Ellen M.; Chen, Poshou; Gorti, Sridhar

    2017-01-01

    Recently an analysis of dynamic forge load data from the friction plug weld (FPW) process and the corresponding tensile test results showed that good plug welds fit well within an analytically determined processing parameter box. There were, however, some outliers that compromised the predictions. Here the microstructure of the plug weld material is presented in view of the load analysis with the intent of further understanding the FPW process and how it is affected by the grain structure and subsequent mechanical properties.

  11. Preparation and Cluster Analysis of Data from the Industrial Production Process for Failure Prediction

    Directory of Open Access Journals (Sweden)

    Németh Martin

    2016-12-01

    This article is devoted to the initial phase of the analysis of failure data from process control systems. Failure data can be used, for example, to detect weak spots in a production process, but also for failure prediction. To achieve these goals, data mining techniques can be used. In this article, we propose a method to prepare and transform failure data from process control systems for the application of data mining algorithms, especially cluster analysis.
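
    As a loose illustration of the prepare-then-cluster pipeline described above, the sketch below one-hot encodes and scales a hypothetical failure log before applying k-means. The column names, values, and choice of k are invented for demonstration and are not from the article.

    ```python
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    # Hypothetical failure log exported from a process control system.
    df = pd.DataFrame({
        "station":  ["A", "A", "B", "C", "B", "C"],
        "duration": [12.0, 15.5, 3.2, 45.0, 2.8, 50.1],   # downtime, minutes
        "hour":     [3, 4, 14, 22, 15, 23],               # hour of occurrence
    })

    # Transform: one-hot encode the categorical field, scale all features.
    features = pd.get_dummies(df, columns=["station"])
    X = StandardScaler().fit_transform(features)

    # Cluster the failure records; k=2 is purely illustrative.
    df["cluster"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
    print(df)
    ```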

  12. Preparation and Cluster Analysis of Data from the Industrial Production Process for Failure Prediction

    Science.gov (United States)

    Németh, Martin; Michaľčonok, German

    2016-12-01

    This article is devoted to the initial phase of the analysis of failure data from process control systems. Failure data can be used, for example, to detect weak spots in a production process, but also for failure prediction. To achieve these goals, data mining techniques can be used. In this article, we propose a method to prepare and transform failure data from process control systems for the application of data mining algorithms, especially cluster analysis.

  13. An empirical analysis of the distribution of overshoots in a stationary Gaussian stochastic process

    Science.gov (United States)

    Carter, M. C.; Madison, M. W.

    1973-01-01

    The frequency distribution of overshoots in a stationary Gaussian stochastic process is analyzed. The primary tools in this analysis are computer simulation and statistical estimation. Computer simulation is used to generate stationary Gaussian stochastic processes with selected autocorrelation functions. An analysis of the simulation results reveals a frequency distribution for overshoots with a functional dependence on the mean and variance of the process. Statistical estimation is then used to estimate the mean and variance of a process. It is shown that, given the autocorrelation function and estimates of the mean and variance of the process, a frequency distribution for the number of overshoots can be estimated.
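
    A minimal sketch of the simulation side of this approach: an AR(1) recursion generates a stationary Gaussian process with an exponential autocorrelation function, and overshoots of a level are counted as upcrossings. The threshold, AR coefficient, and sample size are arbitrary choices for illustration, not the report's settings.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate a stationary Gaussian AR(1) process (exponential autocorrelation).
    phi, n = 0.9, 100_000
    x = np.empty(n)
    x[0] = rng.normal(scale=1.0 / np.sqrt(1 - phi**2))  # start in stationarity
    for t in range(1, n):
        x[t] = phi * x[t - 1] + rng.normal()

    # Count overshoots of a level u, i.e. upcrossings of the threshold.
    u = 2.0 * x.std()
    upcross = np.sum((x[:-1] < u) & (x[1:] >= u))
    print(f"sample mean {x.mean():.3f}, variance {x.var():.3f}, "
          f"overshoots of u={u:.2f}: {upcross}")
    ```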

  14. Formal analysis of executions of organizational scenarios based on process-oriented models

    NARCIS (Netherlands)

    Popova, V.; Sharpanskykh, A.

    2007-01-01

    This paper presents various formal techniques for analysis of executions of organizational scenarios based on process-oriented models of organizations. Process-oriented models describe (prescribe) ordering and timing relations on organizational processes, modes of use of resources, allocations of

  15. The governance of higher education regionalisation: comparative analysis of the Bologna Process and MERCOSUR-Educativo

    NARCIS (Netherlands)

    Verger, A.; Hermo, J.P.

    2010-01-01

    The article analyses two processes of higher education regionalisation, MERCOSUR‐Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and the governance of both processes and, specifically, on the reasons for

  16. The Governance of Higher Education Regionalisation: Comparative Analysis of the Bologna Process and MERCOSUR-Educativo

    Science.gov (United States)

    Verger, Antoni; Hermo, Javier Pablo

    2010-01-01

    The article analyses two processes of higher education regionalisation, MERCOSUR-Educativo in Latin America and the Bologna Process in Europe, from a comparative perspective. The comparative analysis is centered on the content and the governance of both processes and, specifically, on the reasons for their uneven evolution and implementation. We…

  17. Lightning Detection Efficiency Analysis Process: Modeling Based on Empirical Data

    Science.gov (United States)

    Rompala, John T.

    2005-01-01

    A ground-based lightning detection system employs a grid of sensors which record and evaluate the electromagnetic signal produced by a lightning strike. Several detectors gather information on that signal's strength, time of arrival, and behavior over time. By coordinating the information from several detectors, an event solution can be generated. That solution includes the signal's point of origin, strength and polarity. Determination of the location of the lightning strike uses algorithms based on long-used techniques of triangulation. Determination of the event's original signal strength relies on the behavior of the generated magnetic field over distance and time. In general, the signal from the event undergoes geometric dispersion and environmental attenuation as it progresses. Our knowledge of that radial behavior, together with the strength of the signal received by detecting sites, permits an extrapolation and evaluation of the original strength of the lightning strike. It also limits the detection efficiency (DE) of the network. For expansive grids with a sparse density of detectors, the DE varies widely over the area served. This limits the utility of the network in gathering information on regional lightning strike density and applying it to meteorological studies. A network of this type is the grid of four detectors in the Rondonian region of Brazil. The service area extends over a million square kilometers, much of it covered by rain forest. Thus knowledge of lightning strike characteristics over the expanse is of particular value. I have been developing a process that determines the DE over the region [3]. In turn, this provides a way to produce lightning strike density maps, corrected for DE, over the entire region of interest. This report offers a survey of that development to date and a record of present activity.
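
    The extrapolation from received signal strength back to source strength can be sketched under a simple assumed model of 1/r geometric dispersion plus exponential environmental attenuation; the functional form and the attenuation length used here are illustrative assumptions, not the report's calibrated model.

    ```python
    import numpy as np

    def source_strength(s_received, r_km, attenuation_length_km=1000.0):
        """Extrapolate a stroke's source strength from the signal measured at
        one detector, assuming 1/r dispersion and exponential attenuation.
        The attenuation length is a hypothetical value."""
        return s_received * r_km * np.exp(r_km / attenuation_length_km)

    # A signal of strength 3.0 (arbitrary units) received 250 km away.
    print(round(source_strength(s_received=3.0, r_km=250.0), 1))
    ```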

  18. What cognitive processes drive response biases? A diffusion model analysis

    Directory of Open Access Journals (Sweden)

    Fabio P. Leite

    2011-10-01

    We used a diffusion model to examine the effects of response-bias manipulations on response time (RT) and accuracy data collected in two experiments involving a two-choice decision making task. We asked 18 subjects to respond "low" or "high" to the number of asterisks in a 10x10 grid, based on an experimenter-determined decision cutoff. In the model, evidence is accumulated until either a "low" or "high" decision criterion is reached, and this, in turn, initiates a response. We performed two experiments with four experimental conditions. In conditions 1 and 2, the decision cutoff between low and high judgments was fixed at 50. In condition 1, we manipulated the frequency with which low and high stimuli were presented. In condition 2, we used payoff structures that mimicked the frequency manipulation. We found that manipulating stimulus frequency resulted in a larger effect on RT and accuracy than did manipulating payoff structure. In the model, we found that manipulating stimulus frequency produced greater changes in the starting point of the evidence accumulation process than did manipulating payoff structure. In conditions 3 and 4, we set the decision cutoff at 40, 50, or 60 (Experiment 1) and at 45 or 55 (Experiment 2). In condition 3, there was an equal number of low and high stimuli, whereas in condition 4 there were unequal proportions of low and high stimuli. The model analyses showed that starting-point changes accounted for biases produced by changes in stimulus proportions, whereas evidence biases accounted for changes in the decision cutoff.
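
    The core mechanism described here, evidence accumulating from a (possibly biased) starting point toward one of two decision criteria, can be simulated directly. The sketch below is a generic Euler discretisation of such a diffusion process, not the authors' fitted model; the drift, boundary separation, starting point and step size are illustrative.

    ```python
    import numpy as np

    def diffusion_trial(drift, a=1.0, z=0.5, dt=1e-3, sigma=1.0, rng=None):
        """One diffusion-model trial: evidence starts at z*a and accumulates
        until it hits 0 ('low') or a ('high'). Returns (choice, RT)."""
        rng = rng or np.random.default_rng()
        x, t = z * a, 0.0
        while 0.0 < x < a:
            x += drift * dt + sigma * np.sqrt(dt) * rng.normal()
            t += dt
        return ("high" if x >= a else "low"), t

    rng = np.random.default_rng(1)
    # A starting point z > 0.5 biases responses toward "high",
    # mimicking a stimulus-frequency manipulation.
    trials = [diffusion_trial(drift=0.5, z=0.6, rng=rng) for _ in range(200)]
    p_high = np.mean([c == "high" for c, _ in trials])
    mean_rt = np.mean([t for _, t in trials])
    print(f"P(high) = {p_high:.2f}, mean RT = {mean_rt:.3f} s")
    ```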

  19. Morphometric analysis of the uncinate processes of the cervical vertebrae.

    Science.gov (United States)

    Kocabiyik, N; Ercikti, N; Tunali, S

    2017-01-01

    Uncinate processes (UPs) are distinct features unique to the cervical vertebrae. They are consistently found on the posterolateral aspect of the superior end plate of the 3rd to 7th cervical vertebrae. In this study, we investigated the morphology of the UPs with a particular emphasis on regional anatomy and clinical significance. The study included 63 vertebrae. The width, height and length of the UPs were measured with a digital calliper. We also assessed the inclination angle of the UP relative to the sagittal plane, the angle between the medial surface of the UP and the superior surface of the vertebra, the angle between the long axis of the UP and the frontal plane, and the angle between the long axis of the UP and the sagittal plane. Average width of the UPs ranged from 4.25 mm at C3 to 6.33 mm at T1; average height ranged from 4.88 mm at T1 to 7.54 mm at C4; and average length ranged from 6.88 mm at T1 to 11.46 mm at C4. We measured the inclination angle of the UP relative to the sagittal plane and found it to be relatively constant, with T1 having the largest value. The average angle was 41.39°, and the range was 17° to 85°. The angle between the long axis of the UP and the sagittal plane increased significantly from C5 to T1. The average angle was 20.74° and the range was 6° to 65°. The anatomy of the UPs is significant for surgeons who operate on the cervical spine. The information presented herein should help decrease complications during surgical approaches to the cervical spine.

  20. A framework for process-solution analysis in collaborative learning environments

    OpenAIRE

    Bravo Santos, Crescencio; Redondo Duque, Miguel Angel; Verdejo Maillo, María Felisa; Ortega Cantero, Manuel

    2008-01-01

    One of the most challenging aspects of computer-supported collaborative learning (CSCL) research is automation of collaboration and interaction analysis in order to understand and improve the learning processes. It is particularly necessary to look in more depth at the joint analysis of the collaborative process and its resulting product. In this article, we present a framework for comprehensive analysis in CSCL synchronous environments supporting a problem-solving approach to learning. This ...

  1. Formal concept analysis applied to the prediction of additives for galvanizing process

    Directory of Open Access Journals (Sweden)

    J. Klimeš

    2010-04-01

    Formal concept analysis is a new mathematical approach to data analysis, data mining and the discovery of patterns in data. The result of applying the formal concept analysis method to the behavior of the galvanizing of rimmed steel is presented. Effects of additives in the galvanizing process have been correlated to the chemical element properties of the additives. This model may also help to design new alloys as additives in the galvanizing process.

  2. Formal concept analysis applied to the prediction of additives for galvanizing process

    OpenAIRE

    J. Klimeš

    2010-01-01

    Formal concept analysis is a new mathematical approach to data analysis, data mining and the discovery of patterns in data. The result of applying the formal concept analysis method to the behavior of the galvanizing of rimmed steel is presented. Effects of additives in the galvanizing process have been correlated to the chemical element properties of the additives. This model may also help to design new alloys as additives in the galvanizing process.

  3. Cognitive Processes in Intelligence Analysis: A Descriptive Model and Review of the Literature

    Science.gov (United States)

    1979-12-01

    The descriptive model of cognitive processes in intelligence analysis presented in this report was developed as part of a study entitled 'Investigation of Methodologies and Techniques for Intelligence Analysis.' The approach to constructing the model is based on the investigation of analytical... investigation, intelligence analysis was defined as a spectrum of analytical and judgmental activities involved in the processing and production of

  4. The analysis of bottom forming process for hybrid heating device

    Science.gov (United States)

    Bałon, Paweł; Świątoniowski, Andrzej; Kiełbasa, Bartłomiej

    2017-10-01

    In this paper the authors present an unusual method for bottom forming, applicable to various industrial purposes including the manufacture of water heaters and pressure equipment. This method allows the bottom of a given piece of stainless steel to be formed into a pre-determined shape conforming to the DIN standard, which specifies the most advantageous dimensions for the bottom cross-section in terms of working pressure loading. The authors verified the method numerically and experimentally, producing a tool designed to form bottoms of the specified geometry. Many problems are encountered during the design and production of such parts, especially excessive sheet wrinkling over a large area of the part. The experiment showed that a lack of experience and of numerical analysis in the design of such elements would result in the production of highly wrinkled parts. This defect would render the parts impossible to assemble with the cylindrical part. Many tool shops employ a method for drawing elements with a spherical surface which involves additional spinning, stamping, and grading operations, greatly increasing the cost of parts production. The authors present and compare two forming methods for spherical and parabolic objects, and experimentally confirm the validity of the sheet reversing method with adequate pressure force. The applied method produces parts in one drawing operation and a following operation, based on laser or water cutting, to obtain a round blank. This reduces tooling costs by requiring just one tool, which can be placed on any hydraulic press with a minimum force of 2,000 kN.

  5. Detection of dominant runoff generation processes in flood frequency analysis

    Science.gov (United States)

    Iacobellis, Vito; Fiorentino, Mauro; Gioia, Andrea; Manfreda, Salvatore

    2010-05-01

    The investigation of hydrologic similarity represents one of the most exciting challenges faced by hydrologists in the last few years, in order to reduce uncertainty in flood prediction in ungauged basins (e.g., the IAHS Decade on Predictions in Ungauged Basins (PUB) - Sivapalan et al., 2003). In perspective, the identification of dominant runoff generation mechanisms may provide a strategy for catchment classification and the identification of hydrologically homogeneous regions. In this context, we exploited the framework of theoretically derived flood probability distributions in order to interpret the physical behavior of real basins. Recent developments on theoretically derived distributions have highlighted that in a given basin different runoff processes may coexist and modify or affect the shape of flood distributions. The identification of dominant runoff generation mechanisms represents a key signature of flood distributions, providing insight into hydrologic similarity. Iacobellis and Fiorentino (2000) introduced a novel distribution of flood peak annual maxima, the "IF" distribution, which exploited the variable source area concept coupled with a runoff threshold having scaling properties. More recently, Gioia et al. (2008) introduced the Two Component-IF (TCIF) distribution, generalizing the IF distribution, based on two different threshold mechanisms, associated with ordinary and extraordinary events, respectively. Indeed, ordinary floods are mostly due to rainfall events exceeding a threshold infiltration rate in a small source area, while the so-called outlier events, often responsible for the high skewness of flood distributions, are triggered by severe rainfalls exceeding a threshold storage in a large portion of the basin. Within this scheme, we focused on the application of both models (IF and TCIF) over a considerable number of catchments belonging to different regions of Southern Italy. In particular, we stressed, as a case of strong general interest in

  6. Mathematical analysis study for radar data processing and enhancement. Part 1: Radar data analysis

    Science.gov (United States)

    James, R.; Brownlow, J. D.

    1985-01-01

    A study is performed under NASA contract to evaluate data from an AN/FPS-16 radar installed to support flight programs at the Dryden Flight Research Facility of NASA Ames Research Center. The purpose of this study is to provide information necessary for improving post-flight data reduction and knowledge of the accuracy of derived radar quantities. Tracking data from six flights are analyzed. Noise and bias errors in the raw tracking data are determined for each of the flights. A discussion of an altitude bias error during all of the tracking missions is included. This bias error is defined by utilizing pressure altitude measurements made during survey flights. Four separate filtering methods, representative of the most widely used optimal estimation techniques for the enhancement of radar tracking data, are analyzed for suitability in processing both real-time and post-mission data. Additional information regarding the radar and its measurements, including typical noise and bias errors in the range and angle measurements, is also presented. The study is in two parts; this is part 1, an analysis of radar data.
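
    The abstract does not name the four filters, but the family of optimal estimation techniques it refers to can be illustrated with a minimal scalar Kalman filter that smooths noisy range samples. The noise variances and data below are invented for the sketch, not taken from the report.

    ```python
    import numpy as np

    def kalman_1d(z, q=1e-4, r=0.25, x0=0.0, p0=1.0):
        """Scalar constant-state Kalman filter smoothing measurements z.
        q = process noise variance, r = measurement noise variance."""
        x, p, out = x0, p0, []
        for zi in z:
            p += q                      # predict: state uncertainty grows
            k = p / (p + r)             # Kalman gain
            x += k * (zi - x)           # update with the innovation
            p *= (1 - k)
            out.append(x)
        return np.array(out)

    rng = np.random.default_rng(2)
    truth = 10.0
    z = truth + rng.normal(scale=0.5, size=50)   # noisy range samples
    est = kalman_1d(z)
    print(f"last estimate: {est[-1]:.3f} (truth {truth})")
    ```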

  7. Performance Analysis of the United States Marine Corps War Reserve Materiel Program Process Flow

    Science.gov (United States)

    2016-12-01

    2. Cost/Benefit Analysis of Maintaining Inventory... B. Transportation... Review — A. DOD Logistics Overview: Acquiring and supplying materiel to deployed forces is a complicated process. While we conducted our analysis on... Cost/Benefit Analysis of Maintaining Inventory: Additionally, should certain item types prove to be more prone to delays or incur proportionally

  8. Statistical process control methods allow the analysis and improvement of anesthesia care.

    Science.gov (United States)

    Fasting, Sigurd; Gisvold, Sven E

    2003-10-01

    Quality aspects of the anesthetic process are reflected in the rate of intraoperative adverse events. The purpose of this report is to illustrate how the quality of the anesthesia process can be analyzed using statistical process control methods, and to exemplify how this analysis can be used for quality improvement. We prospectively recorded anesthesia-related data from all anesthetics for five years. The data included intraoperative adverse events, which were graded into four levels according to severity. We selected four adverse events, representing important quality and safety aspects, for statistical process control analysis. These were: inadequate regional anesthesia, difficult emergence from general anesthesia, intubation difficulties and drug errors. We analyzed the underlying process using 'p-charts' for statistical process control. In 65,170 anesthetics, we recorded adverse events in 18.3%, mostly of lesser severity. Control charts were used to define statistically the predictable normal variation in problem rate, and then used as a basis for analysis of the selected problems, with the following results: inadequate plexus anesthesia: stable process, but unacceptably high failure rate; difficult emergence: unstable process, because of quality improvement efforts; intubation difficulties: stable process, rate acceptable; medication errors: methodology not suited because of low rate of errors. By applying statistical process control methods to the analysis of adverse events, we have exemplified how this allows us to determine whether a process is stable, whether an intervention is required, and whether quality improvement efforts have the desired effect.
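
    A p-chart of the kind described sets 3-sigma control limits around the pooled event proportion; points outside the limits signal special-cause variation. The sketch below computes those limits for hypothetical monthly counts (the numbers are invented, not the study's data).

    ```python
    import numpy as np

    def p_chart_limits(events, n):
        """3-sigma p-chart limits given event counts per period and the
        number of cases n in each period."""
        p_bar = events.sum() / n.sum()              # pooled proportion
        sigma = np.sqrt(p_bar * (1 - p_bar) / n)    # per-period std. error
        ucl = p_bar + 3 * sigma
        lcl = np.clip(p_bar - 3 * sigma, 0, None)   # proportions cannot be < 0
        return p_bar, lcl, ucl

    # Hypothetical monthly counts of one adverse-event type.
    events = np.array([14, 9, 17, 11, 30, 12])
    cases  = np.array([900, 850, 950, 880, 910, 870])
    p_bar, lcl, ucl = p_chart_limits(events, cases)
    rates = events / cases
    print("out-of-control months:", np.where((rates > ucl) | (rates < lcl))[0])
    ```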

  9. Post-processing tools for nonlinear fe analysis of concrete structures

    OpenAIRE

    Cervenka, Vladimir; Pukl, Radomir; Eligehausen, Rolf

    1990-01-01

    Finite element analysis of the cracking process in concrete structures brings new requirements for the post-processing environment. Crack direction and location are important for identification of the failure mode. The problem has been solved in the finite element program SBETA, which was developed by the authors for the simulation of failure processes in reinforced concrete structures. The post-processing system creates graphical images of crack patterns. Graphical sequences for simulati...

  10. An ontological knowledge based system for selection of process monitoring and analysis tools

    DEFF Research Database (Denmark)

    Singh, Ravendra; Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    Efficient process monitoring and analysis tools provide the means for automated supervision and control of manufacturing plants and therefore play an important role in plant safety, process control and assurance of end product quality. The availability of a large number of different process...... monitoring and analysis tools for a wide range of operations has made their selection a difficult, time consuming and challenging task. Therefore, an efficient and systematic knowledge base coupled with an inference system is necessary to support the optimal selection of process monitoring and analysis tools...... hand, it facilitates the selection of proper monitoring and analysis tools for a given application or process. On the other hand, it permits the identification of potential applications for a given monitoring technique or tool. An efficient inference system based on forward as well as reverse search...

  11. Quantitative analysis of geomorphic processes using satellite image data at different scales

    Science.gov (United States)

    Williams, R. S., Jr.

    1985-01-01

    When aerial and satellite photographs and images are used in the quantitative analysis of geomorphic processes, either through direct observation of active processes or by analysis of landforms resulting from inferred active or dormant processes, a number of limitations in the use of such data must be considered. Active geomorphic processes work at different scales and rates. Therefore, the capability of imaging an active or dormant process depends primarily on the scale of the process and the spatial-resolution characteristic of the imaging system. Scale is an important factor in recording continuous and discontinuous active geomorphic processes, because what is not recorded will not be considered, or even suspected, in the analysis of orbital images. If the geomorphic process, or the landform change it causes, is less than 200 m in the x-y dimension, then it will not be recorded. Although the scale factor is critical in the recording of discontinuous active geomorphic processes, the repeat interval of orbital-image acquisition of a planetary surface is also a consideration, in order to capture a recurring short-lived geomorphic process or to record changes caused by either a continuous or a discontinuous geomorphic process.

  12. 'Dialectical process' and 'constructive method': micro-analysis of relational process in an example from parent-infant psychotherapy.

    Science.gov (United States)

    Woodhead, Judith

    2004-04-01

    Jung defined experience that takes place between therapist and patient as 'dialectical process', achieved through 'constructive method'. Perspectives from attachment theory, neurobiology, cognitive science, systems thinking and infancy research confirm and extend his view of the centrality of relational process in the development of self. Interactional experiences are embedded within the history of the primary parent-infant relationship and structure within the mind implicit patterns of relating. These patterns influence capacities for managing a whole lifetime of affective relational experience within the self and with others. This paper shows how parent-infant psychotherapy seeks to intervene during the formation of disturbed relational patterns. I offer detailed micro-analysis of the moment-to-moment 'dialectical process' that a mother, her four-month-old infant and myself 'constructed' together.

  13. Biosignal processing and analysis using Mathcad--pedagogical and research issues--.

    Science.gov (United States)

    Sandham, Williams A; Hamilton, David J

    2006-01-01

    Biosignal processing and analysis is generally perceived by many students as a challenging topic to understand, and one in which it is hard to become adept with the necessary analytical skills. This is a direct consequence of the high mathematical content involved and the many abstract features of the topic. The Mathcad package offers an excellent algorithm development environment for teaching biosignal processing and analysis modules, and can also be used effectively in many biosignal, and indeed bioengineering, research areas. In this paper, traditional introductory and advanced biosignal processing (and analysis) syllabi are reviewed, and the use of Mathcad for teaching and research is illustrated with a number of examples.

  14. A model-based systems approach to pharmaceutical product-process design and analysis

    DEFF Research Database (Denmark)

    Gernaey, Krist; Gani, Rafiqul

    2010-01-01

    This is a perspective paper highlighting the need for systematic model-based design and analysis in pharmaceutical product-process development. A model-based framework is presented and the role, development and use of models of various types are discussed together with the structure of the models...... for the product and the process. The need for a systematic modelling framework is highlighted together with modelling issues related to model identification, adaptation and extension. In the area of product design and analysis, predictive models are needed with a wide application range. In the area of process...... analysis and model generation are presented....

  15. The process system analysis for advanced spent fuel management technology (I)

    Energy Technology Data Exchange (ETDEWEB)

    Lee, H. H.; Lee, J. R.; Kang, D. S.; Seo, C. S.; Shin, Y. J.; Park, S. W.

    1997-12-01

    Various pyrochemical processes were evaluated, and viable options were selected in consideration of proliferation safety, technological feasibility and compatibility with the domestic nuclear power system. Detailed technical analyses followed on the selected options, covering unit process flowsheets including the physico-chemical characteristics of the process systems, preliminary concept development, process design criteria and materials for equipment. Supplementary analyses were also carried out on the support technologies, including sampling and transport of molten salt, design criteria and equipment for glove box systems, and remote operation technologies. (author). 40 refs., 49 tabs., 37 figs.

  16. Scenario Object Model Based On-Line Safety Analysis for Chemical Process

    Directory of Open Access Journals (Sweden)

    Dong Gao

    2017-11-01

    HAZOP (Hazard and Operability Analysis) is a method of safety analysis which is widely used in chemical processes. Conventional methods for safety analysis consist of human-based safety analysis and computer-aided safety analysis. Both are off-line and qualitative, and it is difficult to carry out on-line safety analysis with them. On-line safety analysis based on a scenario object model is proposed here for chemical processes. The scenario object model was built using an ontology, by which safety information can be transferred, reused and shared effectively. Deviation degree and qualitative trend were added to the model. Based on the model and a new inference algorithm, on-line safety analysis can be implemented for chemical processes. Once a fault or abnormal event occurs, its causes can be traced and its consequences can be predicted. At the same time, semi-quantitative safety analysis is carried out. The resolution can be improved, helping operators handle problems promptly and effectively. The method was applied to the safety analysis of a reactor process, and its effectiveness was demonstrated.
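
    The cause-tracing and consequence-prediction step can be pictured as graph search over a deviation network. The sketch below is a generic illustration of that idea, not the paper's ontology or inference algorithm; the node names and edges are invented for a reactor-style example.

    ```python
    from collections import deque

    # Hypothetical deviation graph: each edge points from a deviation to a
    # downstream effect it can cause.
    EDGES = {
        "cooling water low flow":   ["reactor temperature high"],
        "reactor temperature high": ["pressure high", "runaway reaction"],
        "pressure high":            ["relief valve opens"],
    }

    def consequences(deviation):
        """Forward breadth-first search: predict downstream effects."""
        seen, queue = set(), deque([deviation])
        while queue:
            node = queue.popleft()
            for nxt in EDGES.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    print(consequences("cooling water low flow"))
    ```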

  17. Progress in Root Cause and Fault Propagation Analysis of Large-Scale Industrial Processes

    Directory of Open Access Journals (Sweden)

    Fan Yang

    2012-01-01

    In large-scale industrial processes, a fault can easily propagate between process units due to the interconnections of material and information flows. Thus, for these processes, the problem of fault detection and isolation is concerned more with the root cause and fault propagation before quantitative methods are applied in local models. Process topology and causality, as the key features of the process description, need to be captured from process knowledge and process data. The modelling methods from these two aspects are overviewed in this paper. From process knowledge, structural equation modelling, various causal graphs, rule-based models, and ontological models are summarized. From process data, cross-correlation analysis, Granger causality and its extensions, frequency domain methods, information-theoretical methods, and Bayesian nets are introduced. Based on these models, inference methods are discussed to find root causes and fault propagation paths under abnormal situations. Some future work is proposed at the end.
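
    Of the data-driven methods listed, Granger causality is straightforward to demonstrate: a variable x "Granger-causes" y if lagged values of x improve the prediction of y. The sketch below builds synthetic series with a known lagged coupling and tests it with statsmodels; the coefficients and lag order are arbitrary illustrative choices. (The call may also print a per-lag summary, depending on the statsmodels version.)

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(3)
    n = 500
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(2, n):
        # y depends on lagged x, so x should Granger-cause y.
        y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 2] + 0.1 * rng.normal()

    # Column order matters: the test asks whether the 2nd column
    # Granger-causes the 1st.
    data = np.column_stack([y, x])
    results = grangercausalitytests(data, maxlag=3)
    for lag, (tests, _) in results.items():
        print(f"lag {lag}: ssr F-test p = {tests['ssr_ftest'][1]:.4f}")
    ```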

  18. Formal analysis of executions of organizational scenarios based on process-oriented specifications

    NARCIS (Netherlands)

    Popova, V.; Sharpanskykh, O.

    2011-01-01

    This paper presents various formal techniques for analysis of executions of organizational scenarios based on specifications of organizations. Organizational specifications describe (prescribe) ordering and timing relations on organizational processes, modes of use of resources, allocations

  19. SPATIALLY ADAPTIVE SEMI-SUPERVISED LEARNING WITH GAUSSIAN PROCESSES FOR HYPERSPECTRAL DATA ANALYSIS

    Data.gov (United States)

    National Aeronautics and Space Administration — Spatially Adaptive Semi-Supervised Learning with Gaussian Processes for Hyperspectral Data Analysis, Goo Jun and Joydeep Ghosh. Abstract: A semi-supervised learning...

  20. About numerical analysis of electromagnetic field induce in gear wheels during hardening process

    Directory of Open Access Journals (Sweden)

    Gabriel Cheregi

    2008-05-01

    The paper presents the results of a numerical simulation using finite element analysis for a coupled magneto-thermal problem specific to induction hardening processes. The analysis takes into account the relative movement between the inductor and the heated part. Numerical simulation allows the thermal regime of the induction heating process, and the optimal parameters which offer maximum efficiency, to be determined accurately. Therefore the number of experiments in the design process can be decreased and a better knowledge of the process can be obtained.

  1. Computer-Aided Model Based Analysis for Design and Operation of a Copolymerization Process

    DEFF Research Database (Denmark)

    Lopez-Arenas, Maria Teresa; Sales-Cruz, Alfonso Mauricio; Gani, Rafiqul

    2006-01-01

    through the study of a copolymerization process, where operational problems due to their complex nonlinear behaviour are usually encountered, indicating thereby, the need for the development of an appropriate process model that can describe the dynamic behaviour over the complete range of conversion....... This will allow analysis of the process behaviour, contribute to a better understanding of the polymerization process, help to avoid unsafe conditions of operation, and to develop operational and optimizing control strategies. In this work, through a computer-aided modeling system ICAS-MoT, two first......, the process design and conditions of operation on the polymer grade and the production rate....

  2. Second-order analysis of structured inhomogeneous spatio-temporal point processes

    DEFF Research Database (Denmark)

    Møller, Jesper; Ghorbani, Mohammad

    Statistical methodology for spatio-temporal point processes is in its infancy. We consider second-order analysis based on pair correlation functions and K-functions for, first, general inhomogeneous spatio-temporal point processes and, second, inhomogeneous spatio-temporal Cox processes. Assuming......-temporal Gaussian process. Another concerns shot-noise Cox processes with a separable spatio-temporal covariance density. We propose diagnostic procedures for checking hypotheses of second-order spatio-temporal separability, which we apply to simulated and real data (the UK 2001 epidemic foot and mouth disease data).

  3. Analysis of the growth of concomitant nitride layers produced by a post-discharge assisted process

    Energy Technology Data Exchange (ETDEWEB)

    Oseguera, J. [ITESM-CEM, Carretera al Lago de Guadalupe km. 3.5 Atizapan, 52926 (Mexico)]. E-mail: joseguer@itesm.mx; Castillo, F. [ITESM-CEM, Carretera al Lago de Guadalupe km. 3.5 Atizapan, 52926 (Mexico); Gomez, A. [UFRO, Av. Francisco Salazar 01145, Temuco, Casilla 54-d (Chile); Fraguela, A. [BUAP, Rio Verde y Ave. San Claudio, San Manuel, Puebla, 72570 (Mexico)

    2006-11-23

    In the present work, the growth of concomitant nitride layers during a post-discharge process is studied. The analysis takes into account the similarities and differences between post-discharge nitriding processes and other nitriding processes, employing a mathematical simulation of nitrogen diffusion. The differences considered relate to the thermodynamic standard states, the nitrogen concentration at the surface, and the sputtering of the surface (the latter applying to plasma processes). Nitrogen diffusion and layer formation are described from the beginning of the process by means of a mathematical model.

  4. Comprehensive NMR analysis of compositional changes of black garlic during thermal processing.

    Science.gov (United States)

    Liang, Tingfu; Wei, Feifei; Lu, Yi; Kodani, Yoshinori; Nakada, Mitsuhiko; Miyakawa, Takuya; Tanokura, Masaru

    2015-01-21

    Black garlic is a processed food product obtained by subjecting whole raw garlic to thermal processing that causes chemical reactions, such as the Maillard reaction, which change the composition of the garlic. In this paper, we report a nuclear magnetic resonance (NMR)-based comprehensive analysis of raw garlic and black garlic extracts to determine the compositional changes resulting from thermal processing. (1)H NMR spectra with a detailed signal assignment showed that 38 components were altered by thermal processing of raw garlic. For example, the contents of 11 l-amino acids increased during the first step of thermal processing over 5 days and then decreased. Multivariate data analysis revealed changes in the contents of fructose, glucose, acetic acid, formic acid, pyroglutamic acid, cycloalliin, and 5-(hydroxymethyl)furfural (5-HMF). Our results provide comprehensive information on changes in NMR-detectable components during thermal processing of whole garlic.

  5. Classification and analysis of the fermenter with mechanical mixing devices in aerobic processes of biotechnology

    Directory of Open Access Journals (Sweden)

    Дмитро Миколайович Закоморний

    2015-05-01

    We propose a modern classification of industrial fermenters in which mechanical energy is introduced by mixing devices, together with an analysis of their specific multiphase flow systems. The classification, based on the analysis of oxygen mass transfer in aerobic cultivation processes, allows the basic parameters of typical groups of devices to be calculated.

  6. XbD Video 3, The SEEing process of qualitative data analysis

    DEFF Research Database (Denmark)

    2013-01-01

    This is the third video in the Experience-based Designing series. It presents a live classroom demonstration of a nine-step qualitative data analysis process called SEEing. The process is useful for uncovering or discovering deeper layers of 'meaning' and meaning structures in an experience

  7. Powder stickiness in milk drying: uncertainty and sensitivity analysis for process understanding

    DEFF Research Database (Denmark)

    Ferrari, Adrián; Gutiérrez, Soledad; Sin, Gürkan

    2017-01-01

    A powder stickiness model based on the glass transition temperature (Gordon–Taylor equations) was built for a production-scale milk drying process (including a spray chamber and internal/external fluid beds). To help process understanding, the model was subjected to sensitivity analysis (SA) o...
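
    The Gordon–Taylor relation underlying such a stickiness model predicts the glass transition temperature of a binary mixture from those of its components. A minimal sketch follows; the lactose/water values and the constant k are typical literature-style numbers used purely for illustration, not the paper's fitted parameters.

    ```python
    def gordon_taylor_tg(w2, tg1, tg2, k):
        """Gordon-Taylor glass transition temperature (K) of a binary mixture.
        w2 = mass fraction of component 2 (e.g., water as plasticizer)."""
        w1 = 1.0 - w2
        return (w1 * tg1 + k * w2 * tg2) / (w1 + k * w2)

    # Illustrative values: amorphous lactose (Tg ~ 374 K), water (Tg ~ 136 K),
    # k ~ 6.7 is a literature-style constant for the lactose/water pair.
    for w_water in (0.00, 0.02, 0.05, 0.10):
        tg = gordon_taylor_tg(w_water, tg1=374.0, tg2=136.0, k=6.7)
        print(f"water fraction {w_water:.2f}: Tg = {tg:.1f} K")
    ```

    The sharp drop of Tg with even small moisture fractions is what makes stickiness so sensitive to drying conditions.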

  8. Investigation of milling processes of semiconductor zinc oxide nanostructured powders by X-ray phase analysis

    Science.gov (United States)

    Pronin, I. A.; Averin, I. A.; Yakushova, N. D.; Vishnevskaya, G. V.; Sychov, M. M.; Moshnikov, V. A.; Terukov, E. I.

    2017-11-01

    The processes of mechanical activation of nanostructured zinc oxide powders are investigated by X-ray phase analysis. It was determined that the samples remain in a single-phase state during the milling process. The particle size decreases according to a linear time law, and microstrains grow parabolically.

  9. Stacker’s Crane Position Fixing Based on Real Time Image Processing and Analysis

    Directory of Open Access Journals (Sweden)

    Kmeid Saad

    2015-06-01

    This study illustrates the usage of stacker cranes and image processing in automated warehouse systems. The aim is to use real-time image processing and analysis for fixing a stacker crane's position, in order to use it as a pick-up and delivery (P/D) system controlled by a programmable logic controller (PLC) unit.

  10. Quantum theory analysis of triple photons generated by a χ(3) process

    Science.gov (United States)

    Dot, A.; Borne, A.; Boulanger, B.; Bencheikh, K.; Levenson, J. A.

    2012-02-01

    We present a quantum theoretical analysis of triple photons generated by a phase-matched third-order nonlinear process in a KTP crystal in a weak-interaction regime. We show that the quantum properties of the triple photons can be brought to light through optical second- and third-order sum frequency generation processes.

  11. A FRAMEWORK FOR DOCUMENT PRE-PROCESSING IN FORENSIC HANDWRITING ANALYSIS

    NARCIS (Netherlands)

    Franke, K.; Köppen, M.

    2004-01-01

    We propose an open layered framework which might be adapted to fulfill sophisticated demands in forensic handwriting analysis. Due to the contradictory requirements of processing a huge number of different document types as well as providing high-quality processed images of singular document

  12. Modeling a production scale milk drying process: parameter estimation, uncertainty and sensitivity analysis

    DEFF Research Database (Denmark)

    Ferrari, A.; Gutierrez, S.; Sin, Gürkan

    2016-01-01

    A steady state model for a production scale milk drying process was built to help process understanding and optimization studies. It involves a spray chamber and also internal/external fluid beds. The model was subjected to a comprehensive statistical analysis for quality assurance using sensitiv...

  13. The specifics of the aplication of social and structural approach to electoral processes analysis

    OpenAIRE

    V F Kovrov

    2009-01-01

    The analysis of a number of problems in the investigation of the electoral process, viewed as a social phenomenon, contributes to overcoming several theoretical and methodological obstacles in its sociological cognition. The complexity and delicacy of the electoral process entail the application of a set of distinct approaches and research and description techniques. The article provides the rationale for the most complete insight into the social component of the elector...

  14. High Performance Processing and Analysis of Geospatial Data Using CUDA on GPU

    OpenAIRE

    STOJANOVIC, N.; STOJANOVIC, D.

    2014-01-01

    In this paper, the high-performance processing of massive geospatial data on many-core GPU (Graphic Processing Unit) is presented. We use CUDA (Compute Unified Device Architecture) programming framework to implement parallel processing of common Geographic Information Systems (GIS) algorithms, such as viewshed analysis and map-matching. Experimental evaluation indicates the improvement in performance with respect to CPU-based solutions and shows feasibility of using GPU and CU...

  15. Integration of thermodynamic insights and MINLP optimization for synthesis, design and analysis of process flowsheets

    DEFF Research Database (Denmark)

    Gani, Rafiqul; Hostrup, Martin; Kravanja, Z.

    2001-01-01

    This paper presents an integrated approach to the solution of process synthesis, design and analysis problems. Integration is achieved by combining two different process synthesis techniques, one based on thermodynamic insights and the other based on structural optimisation, together...... with a simulation engine and a properties prediction package. Process flowsheets with or without reaction blocks are considered in this paper. Results from three illustrative case studies, highlighting different features of the integrated approach, are presented.

  16. Model Property Based Material Balance and Energy Conservation Analysis for Process Industry Energy Transfer Systems

    OpenAIRE

    Fumin Ma; Gregory M. P. O’Hare; Tengfei Zhang; Michael J. O’Grady

    2015-01-01

    Conventional historical data based material and energy balance analyses are static and isolated computations. Such methods cannot embody the cross-coupling effect of energy flow, material flow and information flow in the process industry; furthermore, they cannot easily realize the effective evaluation and comparison of different energy transfer processes by alternating the model module. In this paper, a novel method for material balance and energy conservation analysis of process industry en...

  17. A Techno-Economic Analysis of Chemical Processing with Ionizing Radiation

    OpenAIRE

    McConnaughy, Thomas B.; Shaner, Matthew R.; McFarland, Eric W.

    2017-01-01

    Photons and electrons with energies above the ionization potential of most atoms can be used to facilitate chemical reactions not otherwise possible thermochemically or under more preferable process conditions. An analysis and comparison of the economics of using sources of ultraviolet photons, high-energy electrons, γ-rays, and X-rays in a chemical conversion process is presented. In many processes where the penetration depth is sufficient, the overall production costs for equivalent product...

  18. Reflections on Practical Approaches to Involving Children and Young People in the Data Analysis Process

    Science.gov (United States)

    Coad, Jane; Evans, Ruth

    2008-01-01

    This article reflects on key methodological issues emerging from children and young people's involvement in data analysis processes. We outline a pragmatic framework illustrating different approaches to engaging children, using two case studies of children's experiences of participating in data analysis. The article highlights methods of…

  19. Semi-on-line analysis for fast and precise monitoring of bioreaction processes

    DEFF Research Database (Denmark)

    Christensen, L.H.; Marcher, J.; Schulze, Ulrik

    1996-01-01

    Monitoring of substrates and products during fermentation processes can be achieved either by on-line, in situ sensors or by semi-on-line analysis consisting of an automatic sampling step followed by an ex situ analysis of the retrieved sample. The potential risk of introducing time delays...

  20. Linear circuit analysis program for IBM 1620 Monitor 2, 1311/1443 data processing system /CIRCS/

    Science.gov (United States)

    Hatfield, J.

    1967-01-01

    CIRCS is a modification of the IBSNAP Circuit Analysis Program for use on smaller systems. This data processing system retains the basic dc, transient analysis, and FORTRAN 2 formats. It can be used on the IBM 1620/1311 Monitor I Mod 5 system, and solves a linear network containing 15 nodes and 45 branches.

  1. Designing discovery learning environments: process analysis and implications for designing an information system

    NARCIS (Netherlands)

    Pieters, Julius Marie; Limbach, R.; de Jong, Anthonius J.M.

    2004-01-01

    A systematic analysis of the design process of authors of (simulation based) discovery learning environments was carried out. The analysis aimed at identifying the design activities of authors and categorising knowledge gaps that they experience. First, five existing studies were systematically

  2. How qualitative data analysis software may support the qualitative analysis process

    NARCIS (Netherlands)

    Peters, V.A.M.; Wester, F.P.J.

    2007-01-01

    The last decades have shown large progress in the elaboration of procedures for qualitative data analysis and in the development of computer programs to support this kind of analysis. We believe, however, that the link between methodology and computer software tools is too loose, especially for a

  3. Operation, Modeling and Analysis of the Reverse Water Gas Shift Process

    Science.gov (United States)

    Whitlow, Jonathan E.

    2001-01-01

    The Reverse Water Gas Shift process is a candidate technology for water and oxygen production on Mars under the In-Situ Propellant Production project. This report focuses on the operation and analysis of the Reverse Water Gas Shift (RWGS) process, which has been constructed at Kennedy Space Center. A summary of results from the initial operation of the RWGS process, along with an analysis of these results, is included in this report. In addition, an evaluation of a material balance model developed from work performed previously under the summer program is included, along with recommendations for further experimental work.
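
    For orientation, the RWGS reaction is CO2 + H2 ⇌ CO + H2O, and a material balance at equilibrium reduces, for an equimolar feed, to a one-line expression for CO2 conversion. The sketch below works through that algebra; the equilibrium constants used are rough illustrative values, not measurements or models from the report.

    ```python
    import numpy as np

    # For an equimolar CO2/H2 feed the total moles are constant, so
    #   K = [CO][H2O] / ([CO2][H2]) = x**2 / (1 - x)**2
    # which solves to x = sqrt(K) / (1 + sqrt(K)),
    # where x is the fractional conversion of CO2.

    def rwgs_conversion(K):
        s = np.sqrt(K)
        return s / (1.0 + s)

    # Illustrative equilibrium constants; K grows with temperature because
    # the reaction is endothermic. The numbers are assumed for demonstration.
    for T, K in [(600, 0.05), (900, 0.4), (1200, 1.3)]:
        print(f"T = {T} K, K = {K}: CO2 conversion ~ {rwgs_conversion(K):.2f}")
    ```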

  4. Applications Associated With Morphological Analysis And Generation In Natural Language Processing

    Directory of Open Access Journals (Sweden)

    Neha Yadav

    2017-08-01

    Natural Language Processing is one of the fastest-developing research fields. In most applications related to Natural Language Processing, the findings of Morphological Analysis and Morphological Generation can be considered very important, since morphological study is the technique for recognising a word and its output can be used in later stages. In view of this importance, this paper describes how Morphological Analysis and Morphological Generation can prove an important part of various Natural Language Processing applications, such as spell checking and machine translation.
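
    A toy illustration of the analysis/generation pair discussed above: a suffix-stripping analyzer maps a surface word to candidate stem/feature pairs, and generation runs the mapping in reverse. The suffix table and feature labels are invented for the sketch and are far simpler than any real morphological analyzer.

    ```python
    # Hypothetical suffix table: (suffix, morphological feature).
    SUFFIXES = [("ing", "VERB+PROG"), ("ed", "VERB+PAST"),
                ("s", "NOUN+PL"), ("ly", "ADV")]

    def analyze(word):
        """Return (stem, feature) pairs the word could decompose into."""
        analyses = [(word, "STEM")]
        for suffix, feature in SUFFIXES:
            if word.endswith(suffix) and len(word) > len(suffix) + 2:
                analyses.append((word[: -len(suffix)], feature))
        return analyses

    def generate(stem, feature):
        """Inverse direction: attach the suffix matching a feature."""
        for suffix, feat in SUFFIXES:
            if feat == feature:
                return stem + suffix
        return stem

    print(analyze("walking"))             # [('walking', 'STEM'), ('walk', 'VERB+PROG')]
    print(generate("walk", "VERB+PAST"))  # 'walked'
    ```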

  5. Distributed data processing and analysis environment for neutron scattering experiments at CSNS

    Science.gov (United States)

    Tian, H. L.; Zhang, J. R.; Yan, L. L.; Tang, M.; Hu, L.; Zhao, D. X.; Qiu, Y. X.; Zhang, H. Y.; Zhuang, J.; Du, R.

    2016-10-01

    China Spallation Neutron Source (CSNS) is the first high-performance pulsed neutron source in China, and it will meet increasing demands for fundamental research and technical applications both domestically and overseas. A new distributed data processing and analysis environment has been developed, with generic functionalities for neutron scattering experiments. The environment consists of three parts: an object-oriented data processing framework adopting a data-centered architecture, a communication and data caching system based on the C/S paradigm, and data analysis and visualization software providing 2D/3D experimental data display. This environment will be widely applied at CSNS for live data processing.

  6. An introduction to audio content analysis applications in signal processing and music informatics

    CERN Document Server

    Lerch, Alexander

    2012-01-01

    "With the proliferation of digital audio distribution over digital media, audio content analysis is fast becoming a requirement for designers of intelligent signal-adaptive audio processing systems. Written by a well-known expert in the field, this book provides quick access to different analysis algorithms and allows comparison between different approaches to the same task, making it useful for newcomers to audio signal processing and industry experts alike. A review of relevant fundamentals in audio signal processing, psychoacoustics, and music theory, as well as downloadable MATLAB files are also included"--

  7. Functional analysis, harmonic analysis, and image processing: a collection of papers in honor of Björn Jawerth

    CERN Document Server

    Cwikel, Michael

    2017-01-01

    This volume is dedicated to the memory of Björn Jawerth. It contains original research contributions and surveys in several of the areas of mathematics to which Björn made important contributions. Those areas include harmonic analysis, image processing, and functional analysis, which are of course interrelated in many significant and productive ways. Among the contributors are some of the world's leading experts in these areas. With its combination of research papers and surveys, this book may become an important reference and research tool. This book should be of interest to advanced graduate students and professional researchers in the areas of functional analysis, harmonic analysis, image processing, and approximation theory. It combines articles presenting new research with insightful surveys written by foremost experts.

  8. Image analysis for ophthalmological diagnosis image processing of Corvis ST images using Matlab

    CERN Document Server

    Koprowski, Robert

    2016-01-01

    This monograph focuses on the use of analysis and processing methods for images from the Corvis® ST tonometer. The presented analysis is associated with the quantitative, repeatable and fully automatic evaluation of the response of the eye, eyeball and cornea to an air-puff. All the described algorithms were practically implemented in MATLAB®. The monograph also describes and provides the full source code designed to perform the discussed calculations. As a result, this monograph is intended for scientists, graduate students and students of computer science and bioengineering as well as doctors wishing to expand their knowledge of modern diagnostic methods assisted by various image analysis and processing methods.

  9. Human performance variation analysis: A process for human performance problem solving

    Directory of Open Access Journals (Sweden)

    Anerie Rademeyer

    2009-04-01

    Problem-solving ability is a much sought-after trait in executives, especially if it includes the ability to solve human performance problems. This paper proposes a systematic root cause analysis process that effectively and consistently uncovers the root causes of human performance problems and controls the causes in a way that prevents the problems from recurring. Applying action research, the study brings into being a Human Performance Variation Analysis (HPVA) process, which consists of three phases: (1) performance variation assessment, (2) performance variation analysis, and (3) performance variation resolution. The HPVA provides much-needed capability in solving human performance problems in organisations.

  10. Definition of distance for nonlinear time series analysis of marked point process data

    Energy Technology Data Exchange (ETDEWEB)

    Iwayama, Koji, E-mail: koji@sat.t.u-tokyo.ac.jp [Research Institute for Food and Agriculture, Ryukoku University, 1-5 Yokotani, Seta Oe-cho, Otsu-Shi, Shiga 520-2194 (Japan); Hirata, Yoshito; Aihara, Kazuyuki [Institute of Industrial Science, The University of Tokyo, 4-6-1 Komaba, Meguro-ku, Tokyo 153-8505 (Japan)

    2017-01-30

    Marked point process data are time series of discrete events accompanied by some values, such as economic trades, earthquakes, and lightning strikes. A distance for marked point process data allows us to apply nonlinear time series analysis to such data. We propose a distance for marked point process data which can be calculated much faster than the existing distance when the number of marks is small. Furthermore, under some assumptions, the Kullback–Leibler divergences between posterior distributions for neighbors defined by this distance are small. We performed numerical simulations showing that analysis based on the proposed distance is effective. - Highlights: • A new distance for marked point process data is proposed. • The distance can be computed fast enough for a small number of marks. • The method to optimize parameter values of the distance is also proposed. • Numerical simulations indicate that the analysis based on the distance is effective.

  11. [A novel image processing and analysis system for medical images based on IDL language].

    Science.gov (United States)

    Tang, Min

    2009-08-01

    Medical image processing and analysis systems are of great value in medical research and clinical diagnosis, and have been a focus of research in recent years. Interactive Data Language (IDL) has a vast library of built-in math, statistics, image analysis and information processing routines, which makes it an ideal environment for interactive analysis and visualization of two-dimensional and three-dimensional scientific datasets. A methodology is proposed to design a novel image processing and analysis system for medical images based on IDL. The system comprises five functional modules: image preprocessing, image segmentation, image reconstruction, image measurement and image management. Experimental results demonstrate that the system is effective and efficient, with the advantages of broad applicability, friendly interaction, convenient extension and favorable portability.

  12. Automics: an integrated platform for NMR-based metabonomics spectral processing and data analysis

    Directory of Open Access Journals (Sweden)

    Qu Lijia

    2009-03-01

    Background: Spectral processing and post-experimental data analysis are the major tasks in NMR-based metabonomics studies. While there are commercial and freely licensed software tools available to assist with these tasks, researchers usually have to use multiple software packages for their studies because packages generally focus on specific tasks. It would be beneficial to have a highly integrated platform in which these tasks can be completed within one package. Moreover, with an open source architecture, newly proposed algorithms or methods for spectral processing and data analysis can be implemented much more easily and accessed freely by the public. Results: In this paper, we report an open source software tool, Automics, which is specifically designed for NMR-based metabonomics studies. Automics is a highly integrated platform that provides functions covering almost all the stages of NMR-based metabonomics studies. It provides high-throughput automatic modules implementing recently proposed algorithms, as well as powerful manual modules for 1D NMR spectral processing. In addition to spectral processing functions, powerful features for data organization, data pre-processing, and data analysis have been implemented. Nine statistical methods can be applied, covering feature selection (Fisher's criterion), data reduction (PCA, LDA, ULDA), unsupervised clustering (K-means) and supervised regression and classification (PLS/PLS-DA, KNN, SIMCA, SVM). Moreover, Automics has a user-friendly graphical interface for visualizing NMR spectra and data analysis results. Its functionality is demonstrated with an analysis of a type 2 diabetes metabolic profile. Conclusion: Automics facilitates high-throughput 1D NMR spectral processing and high-dimensional data analysis for NMR-based metabonomics applications. Using Automics, users can in most cases complete spectral processing and data analysis within one software package.
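
    As a rough illustration of the kind of pipeline such a platform automates (Automics itself is a standalone tool; the names and sizes below are invented), the following Python sketch bins a set of 1D spectra and projects the samples with PCA:

        import numpy as np

        def bin_spectra(spectra, n_bins=100):
            """Reduce 1D NMR spectra (one per row) to equal-width bin
            integrals, a common pre-processing step before multivariate
            analysis."""
            n_samples, n_points = spectra.shape
            edges = np.linspace(0, n_points, n_bins + 1, dtype=int)
            return np.column_stack([spectra[:, lo:hi].sum(axis=1)
                                    for lo, hi in zip(edges[:-1], edges[1:])])

        def pca_scores(X, n_components=2):
            """PCA via SVD of the mean-centred data matrix."""
            Xc = X - X.mean(axis=0)
            U, s, _ = np.linalg.svd(Xc, full_matrices=False)
            return U[:, :n_components] * s[:n_components]

        rng = np.random.default_rng(0)
        spectra = rng.random((20, 4000))           # 20 mock 1D spectra
        scores = pca_scores(bin_spectra(spectra))  # 20 x 2 score matrix
        print(scores.shape)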

  13. SDI-based business processes: A territorial analysis web information system in Spain

    Science.gov (United States)

    Béjar, Rubén; Latre, Miguel Á.; Lopez-Pellicer, Francisco J.; Nogueras-Iso, Javier; Zarazaga-Soria, F. J.; Muro-Medrano, Pedro R.

    2012-09-01

    Spatial Data Infrastructures (SDIs) provide access to geospatial data and operations through interoperable Web services. These data and operations can be chained to set up specialized geospatial business processes, and these processes can give support to different applications. End users can benefit from these applications, while experts can integrate the Web services in their own business processes and developments. This paper presents an SDI-based territorial analysis Web information system for Spain, which gives access to land cover, topography and elevation data, as well as to a number of interoperable geospatial operations by means of a Web Processing Service (WPS). Several examples illustrate how different territorial analysis business processes are supported. The system has been established by the Spanish National SDI (Infraestructura de Datos Espaciales de España, IDEE) both as an experimental platform for geoscientists and geoinformation system developers, and as a mechanism to contribute to the Spanish citizens' knowledge about their territory.
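
    A WPS 1.0.0 Execute request is a plain HTTP call, so such operations can be scripted directly. The sketch below is hypothetical: the endpoint URL, process identifier and input are placeholders, not the actual IDEE service:

        import requests

        # Hypothetical WPS endpoint and process name; the IDEE service URL and
        # its actual process identifiers are not given in the record.
        WPS_URL = "https://example.idee.es/wps"

        params = {
            "service": "WPS",
            "version": "1.0.0",
            "request": "Execute",
            "identifier": "SlopeFromElevation",                   # hypothetical process
            "datainputs": "bbox=-3.8,40.3,-3.6,40.5,EPSG:4326",   # hypothetical input
        }

        response = requests.get(WPS_URL, params=params, timeout=30)
        print(response.status_code)
        print(response.text[:200])   # WPS responses are XML documents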

  14. Analysis of Seed Sorting Process by Estimation of Seed Motion Trajectories

    DEFF Research Database (Denmark)

    Buus, Ole Thomsen; Jørgensen, Johannes Ravn; Carstensen, Jens Michael

    2011-01-01

    Seed sorting is a mechanical process in which the goal is to achieve a high level of purity and quality in the final product. Prediction and control of such processes are generally considered very difficult. One possible solution is a systems identification approach in which the seeds and their movement are directly observed and data about important process parameters extracted. Image analysis was used to extract such data from the internal sorting process in one particular seed sorting device, the so-called “indented cylinder”. Twenty high-speed image sequences were recorded of the indented cylinder in operation.
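
    The core of such trajectory estimation is linking per-frame detections over time. A minimal greedy nearest-neighbour linker (an assumption for illustration, not the paper's method) might look like:

        import numpy as np

        def link_centroids(frames, max_jump=20.0):
            """Greedy nearest-neighbour linking of per-frame seed centroids
            into trajectories. `frames` is a list of (n_i, 2) arrays of
            centroid coordinates extracted from each video frame. This simple
            scheme ignores occlusions and lets two trajectories claim the
            same centroid; real sorting analysis would need a stricter matcher.
            """
            trajectories = [[tuple(c)] for c in frames[0]]
            for centroids in frames[1:]:
                for traj in trajectories:
                    dists = np.linalg.norm(centroids - np.array(traj[-1]), axis=1)
                    j = int(dists.argmin())
                    if dists[j] <= max_jump:   # otherwise the seed left the view
                        traj.append(tuple(centroids[j]))
            return trajectories

        # Mock data: three frames with two moving seeds each.
        frames = [np.array([[0.0, 0.0], [50.0, 50.0]]),
                  np.array([[5.0, 2.0], [52.0, 55.0]]),
                  np.array([[10.0, 4.0], [54.0, 60.0]])]
        for t in link_centroids(frames):
            print(t)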

  15. Integrated system for design and analysis of industrial processes with electrolyte system

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul

    1999-01-01

    An algorithm for design and analysis of crystallization processes with electrolyte systems is presented. The algorithm consists of a thermodynamic part, a synthesis part and a design part, integrated through a simulation engine. The main feature of the algorithm is the use of thermodynamic insights not only to generate process alternatives but also to obtain good initial estimates for the simulation engine and to visualize process synthesis/design. The main steps of the algorithm are highlighted through a case study involving an industrial crystallization process.

  16. Functional Unfold Principal Component Regression Methodology for Analysis of Industrial Batch Process Data

    DEFF Research Database (Denmark)

    Mears, Lisa; Nørregaard, Rasmus; Sin, Gürkan

    2016-01-01

    This work proposes a methodology utilizing functional unfold principal component regression (FUPCR) for application to industrial batch process data as a process modeling and optimization tool. The methodology is applied to an industrial fermentation dataset containing 30 batches of a production process. It is shown that the application of functional data analysis and the choice of variance scaling method have the greatest impact on prediction accuracy. Considering the vast amount of batch process data continuously generated in industry, this methodology can potentially contribute as a tool to identify desirable process operating conditions from complex industrial datasets.
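
    The unfold-then-regress idea can be sketched compactly. The sketch below is plain principal component regression on batch-wise unfolded data with invented dimensions; the published methodology adds functional data analysis and variance scaling on top of this:

        import numpy as np

        def unfold(batches):
            """Unfold a (batch x variable x time) array batch-wise into a
            (batch x [variable*time]) matrix, the usual first step in
            multiway batch-process analysis."""
            return batches.reshape(batches.shape[0], -1)

        def pcr_fit(X, y, n_components=3):
            """Principal component regression: project onto the leading PCs,
            then solve least squares in score space."""
            Xc = X - X.mean(axis=0)
            U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
            T = U[:, :n_components] * s[:n_components]          # scores
            coef = np.linalg.lstsq(T, y - y.mean(), rcond=None)[0]
            return Vt[:n_components], coef, X.mean(axis=0), y.mean()

        rng = np.random.default_rng(1)
        batches = rng.random((30, 5, 100))   # 30 batches, 5 variables, 100 time points
        quality = rng.random(30)             # end-of-batch quality measurements
        P, coef, x_mean, y_mean = pcr_fit(unfold(batches), quality)
        y_hat = ((unfold(batches) - x_mean) @ P.T) @ coef + y_mean
        print(np.round(y_hat[:5], 3))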

  17. Traffic characteristics analysis in optical burst switching networks with optical label processing

    Directory of Open Access Journals (Sweden)

    Edson Moschim

    2007-03-01

    An analysis is carried out of optical burst-switching networks that use label processing based on orthogonal optical codes (OOC), considering traffic characteristics such as the length/duration and arrival rate of bursts. The main results show that OOC label processing decreases the burst loss probability, especially for short-lived bursts. Short bursts that would be blocked in conventional electronic processing networks are transmitted when OOC label processing is used. This increases network utilization and decreases burst transmission latency, reaching a granularity close to that of packet networks.

  18. The UN peacebuilding process: an analysis of its shortcomings in Timor-Leste

    Directory of Open Access Journals (Sweden)

    Ramon Blanco

    2015-06-01

    This paper presents a comprehensive analysis of the peacebuilding process conducted by the UN in Timor-Leste. Drawing on fieldwork, interviews, and secondary sources, the paper sheds light on the main fragilities of this process. First, the paper briefly outlines the scholarly debate around the UN peacebuilding process. It then gives an overview of the UN missions deployed to Timor-Leste. Finally, it identifies the major limitations of this engagement. By highlighting the main flaws of the peacebuilding process, the paper opens space for (re)thinking alternative ways of building peace in post-conflict scenarios.

  19. Analysis the parameters of seed quality in ns sunflower hybrid after processing in gravity separator

    Directory of Open Access Journals (Sweden)

    Jokić Goran

    2016-01-01

    This paper analyzes the seed of five sunflower hybrids developed at the Institute of Field and Vegetable Crops in Novi Sad, before and after processing in a gravity separator. The cultivars were Pegaz, Duško, NS Fantazija, Sumo 1 PR and NS Oskar. The analysis was conducted on seed lots processed in 2015 and involved the following parameters: seed purity percentage, 1,000-seed weight, germination energy, germination, seed moisture, and number of sclerotia per 1,000 seeds. The results showed that all seed quality parameters of the sunflower hybrids improved after processing in the gravity separator.

  20. Design of a novel biomedical signal processing and analysis tool for functional neuroimaging.

    Science.gov (United States)

    Kaçar, Sezgin; Sakoğlu, Ünal

    2016-03-01

    In this paper, a MATLAB-based graphical user interface (GUI) software tool for general biomedical signal processing and analysis of functional neuroimaging data is introduced. Specifically, electroencephalography (EEG) and electrocardiography (ECG) signals can be processed and analyzed by the developed tool, which incorporates commonly used temporal and frequency analysis methods. In addition to common methods, the tool also provides nonlinear chaos analysis with Lyapunov exponents and entropies, multivariate analysis with principal and independent component analyses, and pattern classification with discriminant analysis. The tool can also be used for training in biomedical engineering education. This easy-to-use and easy-to-learn, intuitive tool is described in detail in this paper.
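
    As an example of the frequency-domain measures such a tool exposes (the tool itself is MATLAB-based; the parameters below are assumptions), this Python sketch estimates EEG band power with Welch's method:

        import numpy as np
        from scipy.signal import welch

        def band_power(signal, fs, band):
            """Average spectral power of `signal` within the frequency
            `band` (Hz), estimated from Welch's periodogram."""
            freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
            mask = (freqs >= band[0]) & (freqs < band[1])
            return np.sum(psd[mask]) * (freqs[1] - freqs[0])

        fs = 256                                  # typical EEG sampling rate, Hz
        t = np.arange(0, 10, 1 / fs)
        # Mock EEG: a 10 Hz (alpha) oscillation buried in noise.
        eeg = (np.sin(2 * np.pi * 10 * t)
               + 0.5 * np.random.default_rng(2).standard_normal(t.size))
        print("alpha (8-13 Hz) power:", band_power(eeg, fs, (8, 13)))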

  1. Laser apparatus and method for microscopic and spectroscopic analysis and processing of biological cells

    Science.gov (United States)

    Gourley, Paul L.; Gourley, Mark F.

    1997-01-01

    An apparatus and method for microscopic and spectroscopic analysis and processing of biological cells. The apparatus comprises a laser having an analysis region within the laser cavity for containing one or more biological cells to be analyzed. The presence of a cell within the analysis region in superposition with an activated portion of a gain medium of the laser acts to encode information about the cell upon the laser beam, the cell information being recoverable by an analysis means that preferably includes an array photodetector such as a CCD camera and a spectrometer. The apparatus and method may be used to analyze biomedical cells including blood cells and the like, and may include processing means for manipulating, sorting, or eradicating cells after analysis thereof.

  2. Safety analysis in process facilities: Comparison of fault tree and Bayesian network approaches

    Energy Technology Data Exchange (ETDEWEB)

    Khakzad, Nima [Process Engineering, Faculty of Engineering and Applied Science, Memorial University, St. John's, NL, A1B 3X5 (Canada); Khan, Faisal, E-mail: fikhan@mun.c [Process Engineering, Faculty of Engineering and Applied Science, Memorial University, St. John's, NL, A1B 3X5 (Canada); Amyotte, Paul [Department of Process Engineering and Applied Science, Dalhousie University, Halifax, NS, B3J 2X4 (Canada)

    2011-08-15

    Safety analysis in gas process facilities is necessary to prevent unwanted events that may cause catastrophic accidents. Accident scenario analysis with probability updating is the key to dynamic safety analysis. Although conventional failure assessment techniques such as fault trees (FT) have been used effectively for this purpose, they suffer from severe limitations in terms of static structure and uncertainty handling, both of which are of great significance in process safety analysis. The Bayesian network (BN) is an alternative technique with ample potential for application in safety analysis. BNs have a strong similarity to FTs in many respects; however, the distinct advantages making them more suitable are their ability to explicitly represent dependencies between events, update probabilities, and cope with uncertainties. The objective of this paper is to demonstrate the application of BNs in the safety analysis of process systems. The first part of the paper shows those modeling aspects that are common between FT and BN, giving preference to BN due to its ability to update probabilities. The second part is devoted to various modeling features of BN, helping to incorporate multi-state variables, dependent failures, functional uncertainty, and expert opinion, which are frequently encountered in safety analysis but cannot be considered by FT. The paper concludes that BN is a superior technique in safety analysis because of its flexible structure, allowing it to fit a wide variety of accident scenarios.
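
    The probability-updating advantage is easy to see on a toy example (an OR gate over two basic events, not the paper's case study): the fault tree yields a static top-event probability, while Bayes' rule updates a basic-event probability once the top event is observed:

        # Basic event probabilities (illustrative values).
        p_a, p_b = 0.01, 0.02

        # Fault-tree result: static top-event probability for an OR gate.
        p_top = 1 - (1 - p_a) * (1 - p_b)

        # BN-style diagnosis: posterior of A given the top event occurred,
        # P(A | top) = P(A) * P(top | A) / P(top); here P(top | A) = 1.
        p_a_given_top = p_a * 1.0 / p_top

        print(f"P(top) = {p_top:.4f}")
        print(f"P(A | top occurred) = {p_a_given_top:.3f}")   # probability updating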

  3. Subject analysis during the cataloging process: the case of academic libraries

    Directory of Open Access Journals (Sweden)

    Jelka Kos

    2011-01-01

    Purpose: The study tried to answer how catalogers determine the subject of a document. The aim of the research was to understand: (1) which parts of the document are most important in the subject analysis process, (2) which approaches to subject determination are involved, (3) which features of the document are most significant, (4) which stages and cognitive processes are present, and (5) whether and how the subject analysis process of catalogers at Slovenian academic libraries differs from the conceptions described in textbooks and the ISO 5963 standard. Methodology/approach: Ten catalogers from nine Slovenian academic libraries were included in this qualitative research. Observation, think-aloud procedures, diaries and follow-up discussions (non-structured interviews) were used for collecting the data. Results: Regarding the parts of the document, the participants mostly examine the title, the table of contents and the introduction/preface; they also check the entire document. The document-oriented approach to subject determination was predominant. Four stages in the process were noted: data input, data processing, text reduction and assignment of subject headings. Based on the results, two general models of the subject analysis process were constructed. Research limitation: The small convenience sample limits the generalization of the findings. Originality/practical implications: This is a rare empirical study taking a qualitative approach to researching subject analysis in Slovenia.

  4. Modeling of thinning process of structures in temperature analysis and its verification

    Energy Technology Data Exchange (ETDEWEB)

    Tsukimori, K.; Furuhashi, I. [Japan Nuclear Cycle Development Institute, JNC, Ibaraki-ken (Japan)

    2001-07-01

    It is important to consider the thinning process when analyzing the behavior of structures, including the change of their strength, when thinning is significant due to corrosion, melting, etc. The thinning process in a stress or strain analysis can be expressed, for example, by using artificial creep and a reduction of the elastic modulus. If the thinning process is accompanied by temperature change, a temperature analysis is also needed. If the structures are relatively thin, like thin plates or thin shells, the effect of the thinning process may be neglected in the temperature analysis. However, for thick structures, or structures in which the temperature gradient through the thickness is expected to be large due to thermal boundary conditions, the thinning process should be considered in the temperature analysis as well as in the stress or strain analysis. In this study, a model of the thinning process in temperature analysis has been developed. The detailed formulation is described and the function of the model is verified on a simple one-dimensional problem. As an applied example, a thinning heat tube is analyzed. (authors)
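
    A toy version of the idea, with assumed material values rather than the authors' formulation, is a 1D explicit finite-difference conduction model whose active domain shrinks as the wall thins:

        import numpy as np

        # Toy 1D transient conduction through a wall that thins from the outer
        # surface at a constant rate (assumed values, not the paper's model).
        alpha = 1e-5            # thermal diffusivity, m^2/s
        L0, rate = 0.10, 1e-6   # initial thickness (m) and thinning rate (m/s)
        n = 50
        dx = L0 / (n - 1)
        dt = 0.4 * dx ** 2 / alpha          # explicit stability limit
        T = np.full(n, 20.0)                # initial temperature, deg C
        T_hot, T_cold = 500.0, 20.0

        t = 0.0
        for _ in range(20000):
            thickness = L0 - rate * t
            active = max(int(round(thickness / dx)) + 1, 3)   # nodes still in the wall
            T[0] = T_hot                    # heated inner surface
            T[active - 1] = T_cold          # exposed, receding outer surface
            lap = T[2:active] - 2.0 * T[1:active - 1] + T[:active - 2]
            T[1:active - 1] += alpha * dt / dx ** 2 * lap
            t += dt

        print(f"wall thinned to {(L0 - rate * t) * 1e3:.1f} mm after {t:.0f} s")
        print(f"mid-wall temperature: {T[active // 2]:.0f} deg C")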

  5. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    Science.gov (United States)

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists the parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection, data normalisation, recognition of outliers, and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported; its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches, and it is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and support their regulatory use. The applicability of the TRF is ensured by its simplicity and generality: it can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools.
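
    A minimal sketch of an RBA-like chain, with steps chosen for illustration rather than the framework's prescribed method: normalisation, per-gene tests, and multiple-testing correction to call DEGs:

        import numpy as np
        from scipy import stats

        def reference_baseline(treated, control, alpha=0.05):
            """Toy RBA-like pipeline (assumed steps): log2 normalisation to a
            common median, per-gene Welch t-tests, and Benjamini-Hochberg
            correction. Rows are genes, columns are replicate arrays."""
            def normalise(x):
                logged = np.log2(x + 1.0)
                return logged - np.median(logged, axis=0) + np.median(logged)

            t_n, c_n = normalise(treated), normalise(control)
            _, p = stats.ttest_ind(t_n, c_n, axis=1, equal_var=False)

            # Benjamini-Hochberg step-up procedure.
            order = np.argsort(p)
            ranked = p[order] * len(p) / (np.arange(len(p)) + 1)
            passed = ranked <= alpha
            return order[: passed.nonzero()[0].max() + 1] if passed.any() else []

        rng = np.random.default_rng(3)
        control = rng.poisson(100, size=(500, 4)).astype(float)
        treated = control.copy()
        treated[:20] *= 3                    # spike in 20 "responding" genes
        print("DEGs found:", len(reference_baseline(treated, control)))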

  6. CRBLASTER: A Parallel-Processing Computational Framework for Embarrassingly-Parallel Image-Analysis Algorithms

    Science.gov (United States)

    Mighell, Kenneth John

    2011-11-01

    The development of parallel-processing image-analysis codes is generally a challenging task that requires complicated choreography of interprocessor communications. If, however, the image-analysis algorithm is embarrassingly parallel, then the development of a parallel-processing implementation of that algorithm can be a much easier task to accomplish because, by definition, there is little need for communication between the compute processes. I describe the design, implementation, and performance of a parallel-processing image-analysis application, called CRBLASTER, which does cosmic-ray rejection of CCD (charge-coupled device) images using the embarrassingly-parallel L.A.COSMIC algorithm. CRBLASTER is written in C using the high-performance computing industry standard Message Passing Interface (MPI) library. The code has been designed to be used by research scientists who are familiar with C as a parallel-processing computational framework that enables the easy development of parallel-processing image-analysis programs based on embarrassingly-parallel algorithms. The CRBLASTER source code is freely available at the official application website at the National Optical Astronomy Observatory. Removing cosmic rays from a single 800x800 pixel Hubble Space Telescope WFPC2 image takes 44 seconds with the IRAF script lacos_im.cl running on a single core of an Apple Mac Pro computer with two 2.8-GHz quad-core Intel Xeon processors. CRBLASTER is 7.4 times faster processing the same image on a single core of the same machine. Processing the same image with CRBLASTER simultaneously on all 8 cores of the same machine takes 0.875 seconds, a speedup factor of 50.3 relative to the IRAF script. A detailed analysis is presented of the performance of CRBLASTER using between 1 and 57 processors on a low-power Tilera 700-MHz 64-core TILE64 processor.
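
    The embarrassingly-parallel pattern itself is compact. A Python sketch with mpi4py follows (a stand-in median filter replaces L.A.COSMIC, and the strip-border halo exchange a real filter would need is ignored):

        import numpy as np
        from mpi4py import MPI
        from scipy.ndimage import median_filter

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        if rank == 0:
            image = np.random.default_rng(4).random((800, 800))
            strips = np.array_split(image, size, axis=0)
        else:
            strips = None

        strip = comm.scatter(strips, root=0)     # one strip per process
        cleaned = median_filter(strip, size=3)   # independent work, no communication
        result = comm.gather(cleaned, root=0)

        if rank == 0:
            print(np.vstack(result).shape)       # (800, 800)
        # run with: mpiexec -n 8 python crblaster_sketch.py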

  7. Process

    Energy Technology Data Exchange (ETDEWEB)

    Geenen, P.V.; Bennis, J.

    1989-04-04

    A process is described for minimizing the cracking tendency and uncontrolled dimensional change, and improving the strength of a rammed plastic refractory reactor liner comprising phosphate-bonded silicon carbide or phosphate-bonded alumina. It consists of heating the reactor liner placed or mounted in a reactor, prior to its first use, from ambient temperature up to a temperature of from about 490 °C to about 510 °C, the heating being carried out at a rate producing a temperature increase of the liner not greater than about 6 °C per hour.

  8. Designing Progressive and Interactive Analytics Processes for High-Dimensional Data Analysis.

    Science.gov (United States)

    Turkay, Cagatay; Kaya, Erdem; Balcisoy, Selim; Hauser, Helwig

    2017-01-01

    In interactive data analysis processes, the dialogue between the human and the computer is the enabling mechanism that can lead to actionable observations about the phenomena being investigated. It is of paramount importance that this dialogue is not interrupted by slow computational mechanisms that ignore the known temporal characteristics of human-computer interaction, which prioritize the perceptual and cognitive capabilities of the users. In cases where the analysis involves an integrated computational method, for instance to reduce the dimensionality of the data or to perform clustering, such interruptions are likely. To remedy this, progressive computations, where results are iteratively improved, are receiving increasing interest in visual analytics. In this paper, we present techniques and design considerations for incorporating progressive methods within interactive analysis processes that involve high-dimensional data. We define methodologies to facilitate processes that adhere to the perceptual characteristics of users and describe how online algorithms can be incorporated within them. A set of design recommendations, and corresponding methods to support analysts in accomplishing high-dimensional data analysis tasks, are then presented. Our arguments and decisions here are informed by observations gathered over a series of analysis sessions with analysts from finance. We document observations and recommendations from this study and present evidence on how our approach contributes to the efficiency and productivity of interactive visual analysis sessions involving high-dimensional data.
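
    One common way to make such a computation progressive is an online algorithm that refines its estimate chunk by chunk. A sketch using scikit-learn's IncrementalPCA (illustrative; the paper is not tied to this library):

        import numpy as np
        from sklearn.decomposition import IncrementalPCA

        # Progressive dimensionality reduction: feed the data in chunks and
        # refresh the 2-D view after every chunk, so the analyst sees a usable
        # (and steadily improving) projection instead of waiting for a full fit.
        rng = np.random.default_rng(5)
        data = rng.standard_normal((10000, 50))   # mock high-dimensional table

        ipca = IncrementalPCA(n_components=2)
        for chunk in np.array_split(data, 20):
            ipca.partial_fit(chunk)               # cheap incremental update
            embedding = ipca.transform(chunk)     # intermediate result to plot
            # ... hand `embedding` to the view layer here ...
        print(ipca.explained_variance_ratio_)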

  9. Rapid Automated Dissolution and Analysis Techniques for Radionuclides in Recycle Process Streams

    Energy Technology Data Exchange (ETDEWEB)

    Sudowe, Ralf [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program and Health Physics Dept.; Roman, Audrey [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Dailey, Ashlee [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program; Go, Elaine [Univ. of Nevada, Las Vegas, NV (United States). Radiochemistry Program

    2013-07-18

    The analysis of process samples for radionuclide content is an important part of current procedures for material balance and accountancy in the different process streams of a recycling plant. The destructive sample analysis techniques currently available necessitate a significant amount of time. It is therefore desirable to develop new sample analysis procedures that allow for a quick turnaround time and increased sample throughput with a minimum of deviation between samples. In particular, new capabilities for rapid sample dissolution and radiochemical separation are required. Most of the radioanalytical techniques currently employed for sample analysis are based on manual laboratory procedures. Such procedures are time- and labor-intensive, and not well suited for situations in which rapid sample analysis is required and/or a large number of samples need to be analyzed. To address this issue, we are currently investigating radiochemical separation methods based on extraction chromatography that have been specifically optimized for the analysis of process stream samples. The influence of potential interferences present in the process samples, as well as mass loading, flow rate and resin performance, is being studied. In addition, the potential to automate these procedures using a robotic platform is being evaluated. Initial studies have been carried out using the commercially available DGA resin. This resin shows an affinity for Am, Pu, U, and Th and also exhibits signs of possible synergistic effects in the presence of iron.

  10. Fit Gap Analysis – The Role of Business Process Reference Models

    Directory of Open Access Journals (Sweden)

    Dejan Pajk

    2013-12-01

    Enterprise resource planning (ERP) systems support standard business processes such as finance, sales, procurement and warehousing. In order to improve the understandability and efficiency of their implementation, ERP vendors have introduced reference models that describe the processes and underlying structure of an ERP system. To select and successfully implement an ERP system, the capabilities of that system have to be compared with a company's business needs. Based on this comparison, all of the fits and gaps must be identified and further analysed. This step usually forms part of ERP implementation methodologies and is called fit gap analysis. The paper gives a theoretical overview of methods for applying reference models and describes fit gap analysis processes in detail. The paper's first contribution is its presentation of a fit gap analysis using standard business process modelling notation. The second contribution is the demonstration of a process-based comparison approach between a supply chain process and an ERP system process reference model. In addition to its theoretical contributions, the results can also be applied in practice to projects involving the selection and implementation of ERP systems.

  11. Data-driven fault detection for industrial processes: canonical correlation analysis and projection based methods

    CERN Document Server

    Chen, Zhiwen

    2017-01-01

    Zhiwen Chen aims to develop advanced fault detection (FD) methods for the monitoring of industrial processes. With ever increasing demands on reliability and safety in industrial processes, fault detection has become an important issue. Although model-based fault detection theory has been well studied in the past decades, its application to large-scale industrial processes is limited because it is difficult to build accurate models. Furthermore, motivated by the limitations of existing data-driven FD methods, novel canonical correlation analysis (CCA) and projection-based methods are proposed from the perspectives of process input and output data, reduced engineering effort and wide application scope. For the performance evaluation of FD methods, a new index is also developed. Contents: A New Index for Performance Evaluation of FD Methods; CCA-based FD Method for the Monitoring of Stationary Processes; Projection-based FD Method for the Monitoring of Dynamic Processes; Benchmark Study and Real-Time Implementat...
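
    The CCA machinery behind such residual-based detection can be sketched with plain numpy. This is an illustrative residual design, not the book's exact algorithm:

        import numpy as np

        def cca(U, Y):
            """Canonical correlation analysis via SVD of the whitened
            cross-covariance (rows are samples)."""
            Uc, Yc = U - U.mean(0), Y - Y.mean(0)
            Su = Uc.T @ Uc / (len(U) - 1)
            Sy = Yc.T @ Yc / (len(Y) - 1)
            Suy = Uc.T @ Yc / (len(U) - 1)
            Wu = np.linalg.inv(np.linalg.cholesky(Su))   # whitening transforms
            Wy = np.linalg.inv(np.linalg.cholesky(Sy))
            J, s, Lt = np.linalg.svd(Wu @ Suy @ Wy.T)
            return Wu.T @ J, Wy.T @ Lt.T, s              # directions, correlations

        # Train on normal data, then flag samples whose residual
        # r = B'y - diag(s) A'u drifts from zero.
        rng = np.random.default_rng(6)
        U_train = rng.standard_normal((500, 3))
        Y_train = U_train @ rng.standard_normal((3, 2)) \
                  + 0.1 * rng.standard_normal((500, 2))
        A, B, s = cca(U_train, Y_train)

        u_new, y_new = U_train[0], Y_train[0] + 5.0      # simulated output fault
        r = B.T @ (y_new - Y_train.mean(0)) \
            - np.diag(s) @ (A.T @ (u_new - U_train.mean(0)))[:2]
        print("residual norm:", np.linalg.norm(r))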

  12. Hazard Analysis of Software Requirements Specification for Process Module of FPGA-based Controllers in NPP

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Sejin; Kim, Eui-Sub; Yoo, Junbeom [Konkuk University, Seoul (Korea, Republic of)]; Keum, Jong Yong; Lee, Jang-Soo [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)]

    2016-10-15

    Software in the PLCs and FPGAs used to develop I&C systems should also be analyzed for hazards and risks before use. NUREG/CR-6430 proposes a method for performing software hazard analysis. It suggests analysis techniques for software-affected hazards and points out that software hazard analysis should be performed at each phase of the software life cycle, such as requirements analysis, design, detailed design and implementation. It also provides guide phrases for applying software hazard analysis. HAZOP (hazard and operability analysis), one of the analysis techniques introduced in NUREG/CR-6430, is a useful technique built around guide phrases and is sometimes used to analyze the safety of software. The analysis method of NUREG/CR-6430 has been used for PLC software development in Korean nuclear power plants; in that work, appropriate guide phrases and analysis processes were selected for efficient application, and NUREG/CR-6430 was identified as providing applicable methods for software hazard analysis. We perform a software hazard analysis of an FPGA software requirements specification with two approaches, NUREG/CR-6430 and HAZOP with general guide words (GW), and we also perform a comparative analysis of the two. The NUREG/CR-6430 approach has several pros and cons compared with HAZOP using general guide words, and it is sufficiently applicable to analyzing the software requirements specification of an FPGA.

  13. Item Unique Identification Capability Expansion: Established Process Analysis, Cost Benefit Analysis, and Optimal Marking Procedures

    Science.gov (United States)

    2014-12-01

    ...proven to be a problem, however, since they can easily be peeled off the equipment or peel off due to wear and tear caused by extreme heat when... Cold, heat, steam, liquids, chemicals, or personnel peeling the labels off are also concerns for the current labelling process. While doing a site...method. Negatives: can be destroyed by temperature extremes, can fall off or be removed, are soft (abrasion problems), and are vulnerable to certain chemicals.

  14. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, a block scheme for the synthesis of the most efficient fuel system alternative using mathematical models, and a set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing in-house energy supply sources at RH processing facilities are provided.

  16. Computer Aided Design and Analysis of Separation Processes with Electrolyte Systems

    DEFF Research Database (Denmark)

    Takano, Kiyoteru; Gani, Rafiqul; Kolar, P.

    2000-01-01

    A methodology for computer aided design and analysis of separation processes involving electrolyte systems is presented. The methodology consists of three main parts. The thermodynamic part 'creates' the problem-specific property model package, which is a collection of pure component and mixture property models. The design and analysis part generates process (flowsheet) alternatives, evaluates/analyses the feasibility of separation and provides a visual operation path for the desired separation. The simulation part consists of a simulation/calculation engine that allows the screening and validation of process alternatives. For the simulation part, a general multi-purpose, multi-phase separation model has been developed and integrated into an existing computer aided system. Application of the design and analysis methodology is highlighted through two illustrative case studies.

  17. Analysis of parameters of coal gasification process for demand of clean coal technology

    Energy Technology Data Exchange (ETDEWEB)

    Zaporowski, B. (Technical University of Poznan, Poznan (Poland))

    1993-01-01

    The paper presents a comprehensive energy analysis of the total pressure coal gasification process. The basis of this analysis is a mathematical model of the coal gasification process, formulated so as to allow simulation of total pressure gasification of coal with the use of various gasifying media. The model constitutes a system of equations describing the chemical, physical and energy processes taking place in the gas generator. The laws of statistical quantum thermodynamics are used to formulate the equations describing the chemical and physical processes proceeding in the gas generator. On the basis of the mathematical model, a dedicated computer program was developed that allows multivariant calculations of the parameters of the coal gasification process. For each variant the following were calculated: the composition of the gas produced in the coal gasification process, the calorific value of the produced gas, the volume of gas obtained from 1 kg of coal, the consumption of gasifying medium per 1 kg of coal, and the chemical and energy efficiency of the coal gasification process. 4 refs., 14 figs.
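
    As a flavor of one equilibrium sub-calculation such a model involves, the sketch below solves the water-gas shift equilibrium, CO + H2O <-> CO2 + H2, using Moe's correlation for the equilibrium constant (an assumption; the paper's model covers the full reaction set via statistical thermodynamics):

        import numpy as np
        from scipy.optimize import brentq

        def k_wgs(T):
            """Water-gas shift equilibrium constant, Moe's correlation
            (assumed here); T in kelvin."""
            return np.exp(4577.8 / T - 4.33)

        def shift_extent(n_co, n_h2o, n_co2, n_h2, T):
            """Equilibrium extent x (mol) of the shift reaction, found by
            bracketing the root of the equilibrium condition."""
            K = k_wgs(T)
            def f(x):
                return (n_co2 + x) * (n_h2 + x) - K * (n_co - x) * (n_h2o - x)
            eps = 1e-9
            return brentq(f, -min(n_co2, n_h2) + eps, min(n_co, n_h2o) - eps)

        # Equimolar CO/steam at 1100 K (illustrative conditions).
        x = shift_extent(1.0, 1.0, 0.0, 0.0, 1100.0)
        print(f"CO conversion at 1100 K: {x:.2f}")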

  18. Forecast Value Added (FVA) Analysis as a Means to Improve the Efficiency of a Forecasting Process

    Directory of Open Access Journals (Sweden)

    Filip Chybalski

    2017-01-01

    A praxeological approach is proposed in order to improve the forecasting process through the employment of forecast value added (FVA) analysis; this may be interpreted as a manifestation of lean management in forecasting. The author discusses the concepts of the effectiveness and efficiency of forecasting. The former, defined in praxeology as the degree to which goals are achieved, refers to the accuracy of forecasts. The latter reflects the relation between the benefits accruing from the results of forecasting and the costs incurred in the process. Since measuring the benefits accruing from forecasting is very difficult, a simplification is proposed according to which the benefit is a function of forecast accuracy. This enables the efficiency of the forecasting process to be evaluated. Since improving this process may consist of either reducing forecast error or decreasing costs, FVA analysis, which expresses the concept of lean management, may be applied to reduce the waste accompanying forecasting. (original abstract)
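
    FVA itself is simple arithmetic: the error of each process step is compared with a naive benchmark. A sketch with invented numbers:

        import numpy as np

        def mape(actual, forecast):
            """Mean absolute percentage error."""
            return np.mean(np.abs((actual - forecast) / actual)) * 100

        # Forecast value added: accuracy gain of each process step over a naive
        # benchmark (here, the random walk: next value = last observed value).
        actual   = np.array([102.0, 98.0, 105.0, 110.0, 107.0])
        naive    = np.array([100.0, 102.0, 98.0, 105.0, 110.0])   # lagged actuals
        stat_fc  = np.array([101.0, 100.0, 103.0, 108.0, 108.0])  # statistical model
        judgment = np.array([105.0, 99.0, 101.0, 112.0, 104.0])   # manual override

        base = mape(actual, naive)
        for name, fc in [("statistical", stat_fc), ("judgmental", judgment)]:
            fva = base - mape(actual, fc)   # positive = the step adds value
            print(f"{name:>12}: FVA = {fva:+.1f} MAPE points")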

  19. Proposal for elicitation and analysis of environmental requirements into the construction design process: a case study

    Directory of Open Access Journals (Sweden)

    Camila Pegoraro

    2010-05-01

    Proposal: Among the new demands of sustainable development, environmental requirements arise as another challenge for design process management. It is well known that companies that design buildings are usually exposed to many managerial difficulties. Faced with environmental demands, these companies need new means to align environmental requirements with business goals and to include them properly in the design process. This paper is based on a case study in a construction company, developed through interviews and document analysis. It presents a procedure for the elicitation, organization and analysis of project environmental requirements, based on requirements engineering (RE) concepts. The study concludes that RE concepts are useful for integrating environmental requirements into the design process, and that strategic planning should give directions for effective adherence to environmental requirements. Moreover, a procedure for environmental requirements modeling is proposed. Key-words: Design process, Requirements management, Environmental requirements, Construction.

  20. Microreactors with integrated UV/Vis spectroscopic detection for online process analysis under segmented flow.

    Science.gov (United States)

    Yue, Jun; Falke, Floris H; Schouten, Jaap C; Nijhuis, T Alexander

    2013-12-21

    Combining reaction and detection in multiphase microfluidic flow is becoming increasingly important for accelerating process development in microreactors. We report the coupling of UV/Vis spectroscopy with microreactors for online process analysis under segmented flow conditions. Two integration schemes are presented: one uses a cross-type flow-through cell subsequent to a capillary microreactor for detection in the transmission mode; the other uses embedded waveguides on a microfluidic chip for detection in the evanescent wave field. Model experiments reveal the capabilities of the integrated systems in real-time concentration measurements and segmented flow characterization. The application of such integration for process analysis during gold nanoparticle synthesis is demonstrated, showing its great potential in process monitoring in microreactors operated under segmented flow.
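
    The concentration readout behind such online monitoring is the Beer-Lambert law, A = εlc. A sketch with assumed cell parameters (not values from the paper):

        import numpy as np

        # Online concentration readout from a UV/Vis flow cell via Beer-Lambert.
        epsilon = 2.7e3   # molar absorptivity, L mol^-1 cm^-1 (assumed value)
        path = 0.1        # optical path length of the flow cell, cm (assumed)

        def concentration(intensity, reference):
            """Absorbance from transmitted vs. reference intensity,
            then concentration in mol/L via c = A / (epsilon * l)."""
            A = -np.log10(intensity / reference)
            return A / (epsilon * path)

        # A slug passing the cell dims the transmitted light:
        print(f"{concentration(intensity=760.0, reference=1000.0):.2e} mol/L")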