WorldWideScience

Sample records for normalisation process model

  1. Random forest meteorological normalisation models for Swiss PM10 trend analysis

    Science.gov (United States)

    Grange, Stuart K.; Carslaw, David C.; Lewis, Alastair C.; Boleti, Eirini; Hueglin, Christoph

    2018-05-01

    Meteorological normalisation is a technique which accounts for changes in meteorology over time in an air quality time series. Controlling for such changes helps support robust trend analysis because there is more certainty that the observed trends are due to changes in emissions or chemistry, not changes in meteorology. Predictive random forest models (RF; a decision tree machine learning technique) were grown for 31 air quality monitoring sites in Switzerland using surface meteorological, synoptic scale, boundary layer height, and time variables to explain daily PM10 concentrations. The RF models were used to calculate meteorologically normalised trends which were formally tested and evaluated using the Theil-Sen estimator. Between 1997 and 2016, significantly decreasing normalised PM10 trends ranged between -0.09 and -1.16 µg m-3 yr-1 with urban traffic sites experiencing the greatest mean decrease in PM10 concentrations at -0.77 µg m-3 yr-1. Similar magnitudes have been reported for normalised PM10 trends for earlier time periods in Switzerland which indicates PM10 concentrations are continuing to decrease at similar rates as in the past. The ability for RF models to be interpreted was leveraged using partial dependence plots to explain the observed trends and relevant physical and chemical processes influencing PM10 concentrations. Notably, two regimes were suggested by the models which cause elevated PM10 concentrations in Switzerland: one related to poor dispersion conditions and a second resulting from high rates of secondary PM generation in deep, photochemically active boundary layers. The RF meteorological normalisation process was found to be robust, user friendly and simple to implement, and readily interpretable which suggests the technique could be useful in many air quality exploratory data analysis situations.
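
    A minimal sketch of the meteorological normalisation idea this abstract describes, assuming a pandas DataFrame of daily data with hypothetical column names (pm10, wind_speed, a decimal-date "trend" column, etc.). This is not the authors' implementation; it only illustrates the resampling logic: fit a random forest on weather plus a trend term, then average predictions over many weather shuffles so only the trend signal survives, and take a Theil-Sen slope of the result.

```python
import numpy as np
import pandas as pd
from scipy.stats import theilslopes
from sklearn.ensemble import RandomForestRegressor

MET_VARS = ["wind_speed", "wind_dir", "temperature", "boundary_layer_height"]

def normalise_pm10(df: pd.DataFrame, n_resamples: int = 200) -> pd.Series:
    """Average RF predictions over resampled meteorology so that only the
    emissions/chemistry (trend) signal remains in the mean."""
    features = MET_VARS + ["trend"]  # 'trend' = e.g. decimal date
    model = RandomForestRegressor(n_estimators=300, random_state=0)
    model.fit(df[features], df["pm10"])
    preds = np.empty((n_resamples, len(df)))
    for i in range(n_resamples):
        X = df[features].copy()
        # shuffle the weather across the whole record, keep the trend term
        X[MET_VARS] = df[MET_VARS].sample(frac=1, random_state=i).to_numpy()
        preds[i] = model.predict(X)
    return pd.Series(preds.mean(axis=0), index=df.index, name="pm10_normalised")

# Theil-Sen slope of the normalised series, e.g.:
# slope, intercept, lo, hi = theilslopes(norm.values, df["trend"].values)
```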

  2. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    LENUS (Irish Health Repository)

    Murray, Elizabeth

    2010-10-20

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  3. Normalisation process theory: a framework for developing, evaluating and implementing complex interventions

    Directory of Open Access Journals (Sweden)

    Ong Bie

    2010-10-01

    Abstract Background The past decade has seen considerable interest in the development and evaluation of complex interventions to improve health. Such interventions can only have a significant impact on health and health care if they are shown to be effective when tested, are capable of being widely implemented and can be normalised into routine practice. To date, there is still a problematic gap between research and implementation. The Normalisation Process Theory (NPT) addresses the factors needed for successful implementation and integration of interventions into routine work (normalisation). Discussion In this paper, we suggest that the NPT can act as a sensitising tool, enabling researchers to think through issues of implementation while designing a complex intervention and its evaluation. The need to ensure trial procedures that are feasible and compatible with clinical practice is not limited to trials of complex interventions, and NPT may improve trial design by highlighting potential problems with recruitment or data collection, as well as ensuring the intervention has good implementation potential. Summary The NPT is a new theory which offers trialists a consistent framework that can be used to describe, assess and enhance implementation potential. We encourage trialists to consider using it in their next trial.

  4. Supervised Object Class Colour Normalisation

    DEFF Research Database (Denmark)

    Riabchenko, Ekatarina; Lankinen, Jukka; Buch, Anders Glent

    2013-01-01

    Colour is an important cue in many applications of computer vision and image processing, but robust usage often requires estimation of the unknown illuminant colour. Usually, to obtain images invariant to the illumination conditions under which they were taken, colour normalisation is used. In this work, we develop such a colour normalisation technique, where true colours are not important per se but where examples of the same classes have photometrically consistent appearance. This is achieved by supervised estimation of a class-specific canonical colour space where the examples have minimal variation in their colours. We demonstrate the effectiveness of our method with qualitative and quantitative examples from the Caltech-101 data set and a real application of 3D pose estimation for robot grasping.
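
    The abstract describes supervised estimation of a class-specific canonical colour space. As a rough illustration only, the sketch below uses per-channel moment matching (Reinhard-style colour transfer) to pull examples of a class toward common colour statistics; it is a simplified stand-in, not the paper's method, and all function names are ours.

```python
import numpy as np

def class_colour_stats(images: list) -> tuple:
    """Pooled per-channel mean/std over all training images of one class
    (each image: HxWx3 float array)."""
    pixels = np.concatenate([im.reshape(-1, 3) for im in images], axis=0)
    return pixels.mean(axis=0), pixels.std(axis=0)

def normalise_to_class(im: np.ndarray, class_mu: np.ndarray,
                       class_sd: np.ndarray) -> np.ndarray:
    """Moment-match one image's channels to the class canonical statistics,
    reducing colour variation among examples of the same class."""
    flat = im.reshape(-1, 3)
    mu, sd = flat.mean(axis=0), flat.std(axis=0)
    return ((im - mu) / np.maximum(sd, 1e-6)) * class_sd + class_mu
```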

  5. ENEKuS--A Key Model for Managing the Transformation of the Normalisation of the Basque Language in the Workplace

    Science.gov (United States)

    Marko, Inazio; Pikabea, Inaki

    2013-01-01

    The aim of this study is to develop a reference model for intervention in the language processes applied to the transformation of language normalisation within organisations of a socio-economic nature. It is based on a case study of an experiment carried out over 10 years within a trade union confederation, and has pursued a strategy of a…

  6. Normalisation genes for expression analyses in the brown alga model Ectocarpus siliculosus

    Directory of Open Access Journals (Sweden)

    Rousvoal Sylvie

    2008-08-01

    Abstract Background Brown algae are plant multi-cellular organisms occupying most of the world coasts and are essential actors in the constitution of ecological niches at the shoreline. Ectocarpus siliculosus is an emerging model for brown algal research. Its genome has been sequenced, and several tools are being developed to perform analyses at different levels of cell organization, including transcriptomic expression analyses. Several topics, including physiological responses to osmotic stress and to exposure to contaminants and solvents, are being studied in order to better understand the adaptive capacity of brown algae to pollution and environmental changes. A series of genes that can be used to normalise expression analyses is required for these studies. Results We monitored the expression of 13 genes under 21 different culture conditions. These included genes encoding proteins and factors involved in protein translation (ribosomal protein 26S, EF1alpha, IF2A, IF4E) and protein degradation (ubiquitin, ubiquitin conjugating enzyme) or folding (cyclophilin), and proteins involved in both the structure of the cytoskeleton (tubulin alpha, actin, actin-related proteins) and its trafficking function (dynein), as well as a protein implicated in carbon metabolism (glucose 6-phosphate dehydrogenase). The stability of their expression level was assessed using the Ct range, and by applying both the geNorm and the Normfinder principles of calculation. Conclusion Comparisons of the data obtained with the three methods of calculation indicated that EF1alpha (EF1a) was the best reference gene for normalisation. The normalisation factor should be calculated with at least two genes, alpha tubulin, ubiquitin-conjugating enzyme or actin-related proteins being good partners of EF1a. Our results exclude actin as a good normalisation gene, and, in this, are in agreement with previous studies in other organisms.

  7. Learning from doing: the case for combining normalisation process theory and participatory learning and action research methodology for primary healthcare implementation research.

    Science.gov (United States)

    de Brún, Tomas; O'Reilly-de Brún, Mary; O'Donnell, Catherine A; MacFarlane, Anne

    2016-08-03

    The implementation of research findings is not a straightforward matter. There are substantive and recognised gaps in the process of translating research findings into practice and policy. In order to overcome some of these translational difficulties, a number of strategies have been proposed for researchers. These include greater use of theoretical approaches in research focused on implementation, and use of a wider range of research methods appropriate to policy questions and the wider social context in which they are placed. However, questions remain about how to combine theory and method in implementation research. In this paper, we respond to these proposals. Focussing on a contemporary social theory, Normalisation Process Theory, and a participatory research methodology, Participatory Learning and Action, we discuss the potential of their combined use for implementation research. We note ways in which Normalisation Process Theory and Participatory Learning and Action are congruent and may therefore be used as heuristic devices to explore, better understand and support implementation. We also provide examples of their use in our own research programme about community involvement in primary healthcare. Normalisation Process Theory alone has, to date, offered useful explanations for the success or otherwise of implementation projects post-implementation. We argue that Normalisation Process Theory can also be used to prospectively support implementation journeys. Furthermore, Normalisation Process Theory and Participatory Learning and Action can be used together so that interventions to support implementation work are devised and enacted with the expertise of key stakeholders. We propose that the specific combination of this theory and methodology possesses the potential, because of their combined heuristic force, to offer a more effective means of supporting implementation projects than either one might do on its own, and of providing deeper understandings of implementation processes.

  8. Infinitary Combinatory Reduction Systems: Normalising Reduction Strategies

    NARCIS (Netherlands)

    Ketema, J.; Simonsen, Jakob Grue

    2010-01-01

    We study normalising reduction strategies for infinitary Combinatory Reduction Systems (iCRSs). We prove that all fair, outermost-fair, and needed-fair strategies are normalising for orthogonal, fully-extended iCRSs. These facts properly generalise a number of results on normalising strategies in

  9. Normalised flood losses in Europe: 1970-2006

    Science.gov (United States)

    Barredo, J. I.

    2009-02-01

    This paper presents an assessment of normalised flood losses in Europe for the period 1970-2006. Normalisation provides an estimate of the losses that would occur if the floods from the past take place under current societal conditions. Economic losses from floods are the result of both societal and climatological factors. Failing to adjust for time-variant socio-economic factors produces loss amounts that are not directly comparable over time, but rather show an ever-growing trend for purely socio-economic reasons. This study has used available information on flood losses from the Emergency Events Database (EM-DAT) and the Natural Hazards Assessment Network (NATHAN). Following the conceptual approach of previous studies, we normalised flood losses by considering the effects of changes in population, wealth, and inflation at the country level. Furthermore, we removed inter-country price differences by adjusting the losses for purchasing power parities (PPP). We assessed normalised flood losses in 31 European countries. These include the member states of the European Union, Norway, Switzerland, Croatia, and the Former Yugoslav Republic of Macedonia. Results show no detectable sign of human-induced climate change in normalised flood losses in Europe. The observed increase in the original flood losses is mostly driven by societal factors.
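
    A worked sketch of the normalisation arithmetic described above: express a historical loss in 2006 terms by adjusting for inflation, population and real per-capita wealth. All factor names and numbers below are made up for illustration, and the study additionally corrects for purchasing power parities.

```python
def normalise_flood_loss(loss, cpi_then, cpi_2006, pop_then, pop_2006,
                         wealth_then, wealth_2006):
    """Express a historical flood loss under 2006 societal conditions."""
    inflation = cpi_2006 / cpi_then      # price-level adjustment
    population = pop_2006 / pop_then     # change in exposed population
    wealth = wealth_2006 / wealth_then   # change in real per-capita wealth
    return loss * inflation * population * wealth

# e.g. a 100 M EUR loss in 1980 under entirely hypothetical factors:
print(normalise_flood_loss(100e6, cpi_then=40, cpi_2006=100,
                           pop_then=9.5e6, pop_2006=10.5e6,
                           wealth_then=12e3, wealth_2006=20e3))
```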

  10. Guidelines for normalising Early Modern English corpora: Decisions and justifications

    Directory of Open Access Journals (Sweden)

    Archer Dawn

    2015-03-01

    Corpora of Early Modern English have been collected and released for research for a number of years. With large-scale digitisation activities gathering pace in the last decade, much more historical textual data is now available for research on numerous topics including historical linguistics and conceptual history. We summarise previous research which has shown that it is necessary to map historical spelling variants to modern equivalents in order to successfully apply natural language processing and corpus linguistics methods. Manual and semi-automatic methods have been devised to support this normalisation and standardisation process. We argue that it is important to develop a linguistically meaningful rationale to achieve good results from this process. In order to do so, we propose a number of guidelines for normalising corpora and show how these guidelines have been applied in the Corpus of English Dialogues.
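
    A toy illustration of the variant-to-modern mapping step the abstract refers to. The dictionary entries are examples only; real pipelines for this task (such as the VARD tool developed for Early Modern English) combine dictionaries, rules and context rather than a flat lookup.

```python
# Map Early Modern English spelling variants to modern equivalents before
# applying NLP/corpus-linguistics tooling. Entries are illustrative.
VARIANTS = {"vpon": "upon", "hee": "he", "doe": "do", "loue": "love"}

def normalise_tokens(tokens: list) -> list:
    return [VARIANTS.get(t.lower(), t) for t in tokens]

print(normalise_tokens("Hee did loue her".split()))
# -> ['he', 'did', 'love', 'her']
```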

  11. A comparison of parametric and nonparametric methods for normalising cDNA microarray data.

    Science.gov (United States)

    Khondoker, Mizanur R; Glasbey, Chris A; Worton, Bruce J

    2007-12-01

    Normalisation is an essential first step in the analysis of most cDNA microarray data, to correct for effects arising from imperfections in the technology. Loess smoothing is commonly used to correct for trends in log-ratio data. However, parametric models, such as the additive plus multiplicative variance model, have been preferred for scale normalisation, though the variance structure of microarray data may be of a more complex nature than can be accommodated by a parametric model. We propose a new nonparametric approach that incorporates location and scale normalisation simultaneously using a Generalised Additive Model for Location, Scale and Shape (GAMLSS, Rigby and Stasinopoulos, 2005, Applied Statistics, 54, 507-554). We compare its performance in inferring differential expression with Huber et al.'s (2002, Bioinformatics, 18, 96-104) arsinh variance stabilising transformation (AVST) using real and simulated data. We show GAMLSS to be as powerful as AVST when the parametric model is correct, and more powerful when the model is wrong. (c) 2007 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim
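
    For reference, the arsinh variance-stabilising transformation (AVST) mentioned above is h(x) = arsinh(a + b·x): approximately linear near zero intensity and approximately logarithmic for large intensities. In the real method a and b are calibration constants fitted per array; the values below are arbitrary.

```python
import numpy as np

def avst(x, a=0.0, b=1.0):
    """Huber-style arsinh transform: ~linear near zero, ~log for large x."""
    return np.arcsinh(a + b * x)

x = np.array([0.0, 10.0, 100.0, 1000.0])
print(avst(x, a=1.0, b=0.05))        # variance-stabilised intensities
print(np.log(2 * 0.05 * x[1:]))      # log comparison at high intensity
```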

  12. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10–14) in Wales, UK

    Directory of Open Access Journals (Sweden)

    Jeremy Segrott

    2017-12-01

    Conclusions: Extended Normalisation Process Theory provided a useful framework for assessing implementation and explaining variation by examining intervention-context interactions. Findings highlight the need for process evaluations to consider both the structural and process components of implementation to explain whether programme activities are delivered as intended and why.

  13. Rules of Normalisation and their Importance for Interpretation of Systems of Optimal Taxation

    DEFF Research Database (Denmark)

    Munk, Knud Jørgen

    Based on a representation of the general equilibrium conditions, we derive the rules of normalisation in standard optimal tax models. This allows us to provide an intuitive explanation of what determines the optimal tax system. Finally, we review a number of examples where lack of precision with respect to normalisation in otherwise important contributions to the literature on optimal taxation has given rise to misinterpretations of analytical results.

  14. A Bayesian MCMC method for point process models with intractable normalising constants

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2004-01-01

    To simulate from the "unknown distribution", perfect simulation algorithms become useful. We illustrate the method in cases where the likelihood is given by a Markov point process model. Particularly, we consider semi-parametric Bayesian inference in connection to both inhomogeneous Markov point process models and pairwise interaction point processes.

  15. The applicability of normalisation process theory to speech and language therapy: a review of qualitative research on a speech and language intervention.

    Science.gov (United States)

    James, Deborah M

    2011-08-12

    The Bercow review found a high level of public dissatisfaction with speech and language services for children. Children with speech, language, and communication needs (SLCN) often have chronic complex conditions that require provision from health, education, and community services. Speech and language therapists are a small group of Allied Health Professionals with a specialist skill-set that equips them to work with children with SLCN. They work within and across the diverse range of public service providers. The aim of this review was to explore the applicability of Normalisation Process Theory (NPT) to the case of speech and language therapy. A review of qualitative research on a successfully embedded speech and language therapy intervention was undertaken to test the applicability of NPT. The review focused on two of the collective action elements of NPT (relational integration and interaction workability) using all previously published qualitative data from both parents and practitioners' perspectives on the intervention. The synthesis of the data based on the Normalisation Process Model (NPM) uncovered strengths in the interpersonal processes between the practitioners and parents, and weaknesses in how the accountability of the intervention is distributed in the health system. The analysis based on the NPM uncovered interpersonal processes between the practitioners and parents that were likely to have given rise to successful implementation of the intervention. In previous qualitative research on this intervention where the Medical Research Council's guidance on developing a design for a complex intervention had been used as a framework, the interpersonal work within the intervention had emerged as a barrier to implementation of the intervention. It is suggested that the design of services for children and families needs to extend beyond the consideration of benefits and barriers to embrace the social processes that appear to afford success in embedding interventions in routine practice.

  16. Nuclear power 1984: Progressive normalisation

    International Nuclear Information System (INIS)

    Popp, M.

    1984-01-01

    The peaceful use of nuclear power is being integrated into the overall concept of a safe long-term power supply in West Germany. The progress of normalisation is shown particularly in the takeover of all stages of the nuclear fuel cycle by industry, with the exception of the final storage of radioactive waste, which is the responsibility of the West German Government. Normalisation also means the withdrawal of the state from financing projects after completion of the two prototypes SNR-300 and THTR-300 and the German uranium enrichment plant. The state will, however, support future research and development projects in the nuclear field. The expansion of nuclear power capacity is at present being slowed down by the state of the economy, i.e. only nuclear power projects already under construction are proceeding. (orig./HP)

  17. Implementation of the SMART MOVE intervention in primary care: a qualitative study using normalisation process theory.

    Science.gov (United States)

    Glynn, Liam G; Glynn, Fergus; Casey, Monica; Wilkinson, Louise Gaffney; Hayes, Patrick S; Heaney, David; Murphy, Andrew W M

    2018-05-02

    Problematic translational gaps continue to exist between demonstrating the positive impact of healthcare interventions in research settings and their implementation into routine daily practice. The aim of this qualitative evaluation of the SMART MOVE trial was to conduct a theoretically informed analysis, using normalisation process theory, of the potential barriers and levers to the implementation of an mHealth intervention to promote physical activity in primary care. The study took place in the West of Ireland with recruitment in the community from the Clare Primary Care Network. SMART MOVE trial participants and the staff from four primary care centres were invited to take part and all agreed to do so. A qualitative methodology was employed, combining focus groups (general practitioners, practice nurses and non-clinical staff from four separate primary care centres, n = 14) and individual semi-structured interviews (intervention and control SMART MOVE trial participants, n = 4), with purposeful sampling and analysis based on the principles of Framework Analysis. The Normalisation Process Theory was used to develop the topic guide for the interviews and also informed the data analysis process. Four themes emerged from the analysis: personal and professional exercise strategies; roles and responsibilities to support active engagement; utilisation challenges; and evaluation, adoption and adherence. It was evident that introducing a new healthcare intervention demands a comprehensive evaluation of the intervention itself and also the environment in which it is to operate. Despite certain obstacles, the opportunity exists for the successful implementation of a novel healthcare intervention that addresses a hitherto unresolved healthcare need, provided that the intervention has strong usability attributes for both disseminators and target users and coheres strongly with the core objectives and culture of the health care environment in which it is to operate.

  18. The implementation of medical revalidation: an assessment using normalisation process theory

    Directory of Open Access Journals (Sweden)

    Abigail Tazzyman

    2017-11-01

    Abstract Background Medical revalidation is the process by which all licensed doctors are legally required to demonstrate that they are up to date and fit to practise in order to maintain their licence. Revalidation was introduced in the United Kingdom (UK) in 2012, constituting significant change in the regulation of doctors. The governing body, the General Medical Council (GMC), envisages that revalidation will improve patient care and safety. This potential however is, in part, dependent upon how successfully revalidation is embedded into routine practice. The aim of this study was to use Normalisation Process Theory (NPT) to explore issues contributing to or impeding the implementation of revalidation in practice. Methods We conducted seventy-one interviews with sixty UK policymakers and senior leaders at different points during the development and implementation of revalidation: in 2011 (n = 31), 2013 (n = 26) and 2015 (n = 14). We selected interviewees using purposeful sampling. NPT was used as a framework to enable systematic analysis across the interview sets. Results Initial lack of consensus over revalidation’s purpose, and scepticism about its value, decreased over time as participants recognised the benefits it brought to their practice (coherence category of NPT). Though acceptance increased across time, revalidation was not seen as a legitimate part of their role by all doctors. Key individuals, notably the Responsible Officer (RO), were vital for the successful implementation of revalidation in organisations (cognitive participation category). The ease with which revalidation could be integrated into working practices varied greatly depending on the type of role a doctor held and the organisation they work for, and the provision of resources was a significant variable in this (collective action category). Formal evaluation of revalidation in organisations was lacking but informal evaluation was taking place.

  19. Using normalisation process theory to understand barriers and facilitators to implementing mindfulness-based stress reduction for people with multiple sclerosis.

    Science.gov (United States)

    Simpson, Robert; Simpson, Sharon; Wood, Karen; Mercer, Stewart W; Mair, Frances S

    2018-01-01

    Objectives To study barriers and facilitators to implementation of mindfulness-based stress reduction for people with multiple sclerosis. Methods Qualitative interviews were used to explore barriers and facilitators to implementation of mindfulness-based stress reduction, including 33 people with multiple sclerosis, 6 multiple sclerosis clinicians and 2 course instructors. Normalisation process theory provided the underpinning conceptual framework. Data were analysed deductively using normalisation process theory constructs (coherence, cognitive participation, collective action and reflexive monitoring). Results Key barriers included mismatched stakeholder expectations, lack of knowledge about mindfulness-based stress reduction, high levels of comorbidity and disability, and scepticism about embedding mindfulness-based stress reduction in routine multiple sclerosis care. Facilitators to implementation included introducing a pre-course orientation session and adapting mindfulness-based stress reduction to accommodate comorbidity and disability; participants suggested smaller, shorter classes, shortened practices, exclusion of mindful walking and more time with peers. Post-mindfulness-based stress reduction booster sessions may be required, and objective and subjective reports of benefit would increase clinician confidence in mindfulness-based stress reduction. Discussion Multiple sclerosis patients and clinicians know little about mindfulness-based stress reduction. Mismatched expectations are a barrier to participation, as is rigid application of mindfulness-based stress reduction in the context of disability. Course adaptations in response to patient needs would facilitate uptake and utilisation. Rendering access to mindfulness-based stress reduction rapid and flexible could facilitate implementation. Embedded outcome assessment is desirable.

  20. Use and misuse of temperature normalisation in meta-analyses of thermal responses of biological traits

    Directory of Open Access Journals (Sweden)

    Dimitrios - Georgios Kontopoulos

    2018-02-01

    There is currently unprecedented interest in quantifying variation in thermal physiology among organisms, especially in order to understand and predict the biological impacts of climate change. A key parameter in this quantification of thermal physiology is the performance or value of a rate, across individuals or species, at a common temperature (temperature normalisation). An increasingly popular model for fitting thermal performance curves to data—the Sharpe-Schoolfield equation—can yield strongly inflated estimates of temperature-normalised rate values. These deviations occur whenever a key thermodynamic assumption of the model is violated, i.e., when the enzyme governing the performance of the rate is not fully functional at the chosen reference temperature. Using data on 1,758 thermal performance curves across a wide range of species, we identify the conditions that exacerbate this inflation. We then demonstrate that these biases can compromise tests to detect metabolic cold adaptation, which requires comparison of fitness or rate performance of different species or genotypes at some fixed low temperature. Finally, we suggest alternative methods for obtaining unbiased estimates of temperature-normalised rate values for meta-analyses of thermal performance across species in climate change impact studies.
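
    A sketch of the simplified, high-temperature-deactivation form of the Sharpe-Schoolfield model discussed above. Parameter names and values are ours: B0 is the temperature-normalised rate at the reference temperature Tref, and the inflation the abstract describes arises when the deactivation denominator is not close to 1 at Tref.

```python
import numpy as np

K = 8.617e-5  # Boltzmann constant, eV/K

def sharpe_schoolfield(T, B0, E, Eh, Th, Tref=283.15):
    """Rate at temperature T (kelvin); E, Eh in eV; Th = temperature at
    which half the enzyme is high-temperature deactivated."""
    arrhenius = B0 * np.exp(-E / K * (1.0 / T - 1.0 / Tref))
    deactivation = 1.0 + np.exp(Eh / K * (1.0 / Th - 1.0 / T))
    return arrhenius / deactivation

# If Tref is too close to Th, deactivation > 1 there and the fitted B0
# (the "rate at Tref") is inflated to compensate:
print(sharpe_schoolfield(np.array([283.15, 300.0]), B0=1.0,
                         E=0.65, Eh=3.0, Th=305.0))
```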

  1. Attitudes to Normalisation and Inclusive Education

    Science.gov (United States)

    Sanagi, Tomomi

    2016-01-01

    The purpose of this paper was to clarify the features of teachers' image on normalisation and inclusive education. The participants of the study were both mainstream teachers and special teachers. One hundred and thirty-eight questionnaires were analysed. (1) Teachers completed the questionnaire of SD (semantic differential) images on…

  2. Reference gene identification for reliable normalisation of quantitative RT-PCR data in Setaria viridis.

    Science.gov (United States)

    Nguyen, Duc Quan; Eamens, Andrew L; Grof, Christopher P L

    2018-01-01

    Quantitative real-time polymerase chain reaction (RT-qPCR) is the key platform for the quantitative analysis of gene expression in a wide range of experimental systems and conditions. However, the accuracy and reproducibility of gene expression quantification via RT-qPCR is entirely dependent on the identification of reliable reference genes for data normalisation. Green foxtail (Setaria viridis) has recently been proposed as a potential experimental model for the study of C4 photosynthesis and is closely related to many economically important crop species of the Panicoideae subfamily of grasses, including Zea mays (maize), Sorghum bicolor (sorghum) and Saccharum officinarum (sugarcane). Setaria viridis (Accession 10) possesses a number of key traits as an experimental model, namely: (i) a small sized, sequenced and well annotated genome; (ii) short stature and generation time; (iii) prolific seed production; and (iv) amenability to Agrobacterium tumefaciens-mediated transformation. There is currently, however, a lack of reference gene expression information for Setaria viridis (S. viridis). We therefore aimed to identify a cohort of suitable S. viridis reference genes for accurate and reliable normalisation of S. viridis RT-qPCR expression data. Eleven putative candidate reference genes were identified and examined across thirteen different S. viridis tissues. Of these, the geNorm and NormFinder analysis software identified SERINE/THREONINE-PROTEIN PHOSPHATASE 2A (PP2A), 5'-ADENYLYLSULFATE REDUCTASE 6 (ASPR6) and DUAL SPECIFICITY PHOSPHATASE (DUSP) as the most suitable combination of reference genes for the accurate and reliable normalisation of S. viridis RT-qPCR expression data. To demonstrate the suitability of the three selected reference genes, PP2A, ASPR6 and DUSP were used to normalise the expression of CINNAMYL ALCOHOL DEHYDROGENASE (CAD) genes across the same tissues. This approach readily demonstrated the suitability of the three selected reference genes for normalisation of RT-qPCR expression data in S. viridis.
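
    A hedged sketch of the geNorm stability statistic used in studies like this one: for each candidate gene, M is the mean standard deviation of its pairwise log2 expression ratios against every other candidate, and lower M means more stable. This is our own minimal reading of the published algorithm, not the geNorm software itself; the input data below are random.

```python
import numpy as np

def genorm_m(expr: np.ndarray) -> np.ndarray:
    """expr: (n_samples, n_genes) array of relative expression quantities.
    Returns one geNorm-style M value per gene (lower = more stable)."""
    log_expr = np.log2(expr)
    n_genes = expr.shape[1]
    m = np.empty(n_genes)
    for j in range(n_genes):
        # log-ratios of gene j against every other candidate gene
        ratios = log_expr[:, [j]] - np.delete(log_expr, j, axis=1)
        m[j] = ratios.std(axis=0, ddof=1).mean()
    return m

expr = np.random.default_rng(1).lognormal(size=(13, 5))  # 13 tissues, 5 genes
print(genorm_m(expr))
```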

  3. An application of Extended Normalisation Process Theory in a randomised controlled trial of a complex social intervention: Process evaluation of the Strengthening Families Programme (10-14) in Wales, UK.

    Science.gov (United States)

    Segrott, Jeremy; Murphy, Simon; Rothwell, Heather; Scourfield, Jonathan; Foxcroft, David; Gillespie, David; Holliday, Jo; Hood, Kerenza; Hurlow, Claire; Morgan-Trimmer, Sarah; Phillips, Ceri; Reed, Hayley; Roberts, Zoe; Moore, Laurence

    2017-12-01

    Process evaluations generate important data on the extent to which interventions are delivered as intended. However, the tendency to focus only on assessment of pre-specified structural aspects of fidelity has been criticised for paying insufficient attention to implementation processes and how intervention-context interactions influence programme delivery. This paper reports findings from a process evaluation nested within a randomised controlled trial of the Strengthening Families Programme 10-14 (SFP 10-14) in Wales, UK. It uses Extended Normalisation Process Theory to theorise how interaction between SFP 10-14 and local delivery systems - particularly practitioner commitment/capability and organisational capacity - influenced delivery of intended programme activities: fidelity (adherence to SFP 10-14 content and implementation requirements); dose delivered; dose received (participant engagement); participant recruitment and reach (intervention attendance). A mixed methods design was utilised. Fidelity assessment sheets (completed by practitioners), structured observation by researchers, and routine data were used to assess: adherence to programme content; staffing numbers and consistency; recruitment/retention; and group size and composition. Interviews with practitioners explored implementation processes and context. Adherence to programme content was high - with some variation, linked to practitioner commitment to, and understanding of, the intervention's content and mechanisms. Variation in adherence rates was associated with the extent to which multi-agency delivery team planning meetings were held. Recruitment challenges meant that targets for group size/composition were not always met, but did not affect adherence levels or family engagement. Targets for staffing numbers and consistency were achieved, though capacity within multi-agency networks reduced over time. Extended Normalisation Process Theory provided a useful framework for assessing implementation and explaining variation by examining intervention-context interactions.

  4. Queer Literature in Spain: Pathways to Normalisation

    Directory of Open Access Journals (Sweden)

    Martínez-Expósito, Alfredo

    2013-06-01

    More than any other, the idea of normalisation has provoked deep divisions within queer activism, both at a philosophical and at a political level. At the root of these divisions lies the irreconcilable divergence between an agenda for social change, which advocates the need for society to accept all sexual behaviours and identities as normal, and an approach of radical resistance against social structures that can only offer a bourgeois and conformist normalisation. Literary fiction and homo-gay-queer themed cinema have explored these and other sides of the idea of normalisation and have thus contributed to the taking of decisive steps: from the poetics of transgression towards the poetics of celebration and social transformation. In this paper we examine two of these literary normalisation strategies: the use of humour and the proliferation of discursive perspectives, both in the cinema and in narrative fiction during the last decades.

  5. Oral benfotiamine plus alpha-lipoic acid normalises complication-causing pathways in type 1 diabetes.

    Science.gov (United States)

    Du, X; Edelstein, D; Brownlee, M

    2008-10-01

    We determined whether fixed doses of benfotiamine in combination with slow-release alpha-lipoic acid normalise markers of reactive oxygen species-induced pathways of complications in humans. Male participants with and without type 1 diabetes were studied in the General Clinical Research Centre of the Albert Einstein College of Medicine. Glycaemic status was assessed by measuring baseline values of three different indicators of hyperglycaemia. Intracellular AGE formation, hexosamine pathway activity and prostacyclin synthase activity were measured initially, and after 2 and 4 weeks of treatment. In the nine participants with type 1 diabetes, treatment had no effect on any of the three indicators used to assess hyperglycaemia. However, treatment with benfotiamine plus alpha-lipoic acid completely normalised increased AGE formation, reduced increased monocyte hexosamine-modified proteins by 40% and normalised the 70% decrease in prostacyclin synthase activity from 1,709 +/- 586 pg/ml 6-keto-prostaglandin F(1alpha) to 4,696 +/- 533 pg/ml. These results show that the previously demonstrated beneficial effects of these agents on complication-causing pathways in rodent models of diabetic complications also occur in humans with type 1 diabetes.

  6. Using Normalisation Process Theory to investigate the implementation of school-based oral health promotion.

    Science.gov (United States)

    Olajide, O J; Shucksmith, J; Maguire, A; Zohoori, F V

    2017-09-01

    Despite the considerable improvement in oral health of children in the UK over the last forty years, a significant burden of dental caries remains prevalent in some groups of children, indicating the need for more effective oral health promotion intervention (OHPI) strategies in this population. This study aimed to explore the implementation process of a community-based OHPI in the North East of England, using Normalisation Process Theory (NPT) to provide insights into how effectiveness could be maximised. Utilising a generic qualitative research approach, 19 participants were recruited into the study. In-depth interviews were conducted with relevant National Health Service (NHS) staff and primary school teachers, while focus group discussions were conducted with reception teachers and teaching assistants. Analyses were conducted using thematic analysis, with emergent themes mapped onto NPT constructs. Participants highlighted the benefits of OHPI and the need for evidence in practice. However, implementation of 'best evidence' was hampered by the lack of adequate synthesis of evidence from available clinical studies on the effectiveness of OHPI, as these generally have insufficient information on the dynamics of implementation and how effectiveness obtained in clinical studies could be achieved in 'real life'. This impacted on the decision-making process, levels of commitment, collaboration among OHP teams, resource allocation and evaluation of OHPI. A large gap exists between available research evidence and translation of evidence in OHPI in community settings. Effectiveness of OHPI requires not only an awareness of evidence of clinical effectiveness but also synthesised information about change mechanisms and implementation protocols. Copyright© 2017 Dennis Barber Ltd.

  7. Introducing carrying capacity-based normalisation in LCA: framework and development of references at midpoint level

    DEFF Research Database (Denmark)

    Bjørn, Anders; Hauschild, Michael Zwicky

    2015-01-01

    The purpose of this article is to present a framework for normalisation against carrying capacity-based references and to develop average normalisation references (NR) for Europe and the world for all those midpoint impact categories commonly included. A literature review was carried out to identify scientifically sound thresholds for each impact category. Carrying capacities were then calculated from these thresholds and expressed in metrics identical to midpoint indicators, giving priority to those recommended by ILCD. NR was expressed as the carrying capacity… Ozone formation and soil quality were found to exceed carrying capacities several times. The developed carrying capacity-based normalisation references offer relevant supplementary reference information to the currently applied references based on society's background interventions.

  8. Normalised quantitative polymerase chain reaction for diagnosis of tuberculosis-associated uveitis.

    Science.gov (United States)

    Barik, Manas Ranjan; Rath, Soveeta; Modi, Rohit; Rana, Rajkishori; Reddy, Mamatha M; Basu, Soumyava

    2018-05-01

    Polymerase chain reaction (PCR)-based diagnosis of tuberculosis-associated uveitis (TBU) in TB-endemic countries is challenging due to likelihood of latent mycobacterial infection in both immune and non-immune cells. In this study, we investigated normalised quantitative PCR (nqPCR) in ocular fluids (aqueous/vitreous) for diagnosis of TBU in a TB-endemic population. Mycobacterial copy numbers (mpb64 gene) were normalised to host genome copy numbers (RNAse P RNA component H1 [RPPH1] gene) in TBU (n = 16) and control (n = 13) samples (discovery cohort). The mpb64:RPPH1 ratios (normalised value) from each TBU and control sample were tested against the current reference standard i.e. clinically-diagnosed TBU, to generate Receiver Operating Characteristic (ROC) curves. The optimum cut-off value of mpb64:RPPH1 ratio (0.011) for diagnosing TBU was identified from the highest Youden index. This cut-off value was then tested in a different cohort of TBU and controls (validation cohort, 20 cases and 18 controls), where it yielded specificity, sensitivity and diagnostic accuracy of 94.4%, 85.0%, and 89.4% respectively. The above values for conventional quantitative PCR (≥1 copy of mpb64 per reaction) were 61.1%, 90.0%, and 74.3% respectively. Normalisation markedly improved the specificity and diagnostic accuracy of quantitative PCR for diagnosis of TBU. Copyright © 2018 Elsevier Ltd. All rights reserved.
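
    A sketch of how the discovery-cohort cut-off described above can be chosen via the Youden index on an ROC curve. The ratio values and labels below are toy numbers, not the study data.

```python
import numpy as np
from sklearn.metrics import roc_curve

# hypothetical mpb64:RPPH1 ratios and TBU status (1 = clinically diagnosed)
ratio = np.array([0.2, 0.05, 0.013, 0.9, 0.001, 0.002, 0.004, 0.03])
is_tbu = np.array([1, 1, 1, 1, 0, 0, 0, 1])

fpr, tpr, thresholds = roc_curve(is_tbu, ratio)
youden = tpr - fpr                       # sensitivity + specificity - 1
cutoff = thresholds[np.argmax(youden)]
print(f"optimal normalised cut-off ~= {cutoff:g}")
```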

  9. Normalising convenience food?

    DEFF Research Database (Denmark)

    Halkier, Bente

    2017-01-01

    The construction of convenience food as a social and cultural category for food provisioning, cooking and eating seems to slide between or across understandings of what is considered “proper food” in the existing discourses in everyday life and media. This article sheds light upon some of the social and cultural normativities around convenience food by describing the ways in which convenience food forms part of the daily life of young Danes. Theoretically, the article is based on a practice theoretical perspective. Empirically, the article builds upon a qualitative research project on food habits among Danes aged 20–25. The article presents two types of empirical patterns. The first is the degree to which, and the different ways in which, the use of convenience food is normalised among the young Danes. The second is the normative places of convenience food…

  10. A combination of low-dose bevacizumab and imatinib enhances vascular normalisation without inducing extracellular matrix deposition.

    Science.gov (United States)

    Schiffmann, L M; Brunold, M; Liwschitz, M; Goede, V; Loges, S; Wroblewski, M; Quaas, A; Alakus, H; Stippel, D; Bruns, C J; Hallek, M; Kashkar, H; Hacker, U T; Coutelle, O

    2017-02-28

    Vascular endothelial growth factor (VEGF)-targeting drugs normalise the tumour vasculature and improve access for chemotherapy. However, excessive VEGF inhibition fails to improve clinical outcome, and successive treatment cycles lead to incremental extracellular matrix (ECM) deposition, which limits perfusion and drug delivery. We show here, that low-dose VEGF inhibition augmented with PDGF-R inhibition leads to superior vascular normalisation without incremental ECM deposition thus maintaining access for therapy. Collagen IV expression was analysed in response to VEGF inhibition in liver metastasis of colorectal cancer (CRC) patients, in syngeneic (Panc02) and xenograft tumours of human colorectal cancer cells (LS174T). The xenograft tumours were treated with low (0.5 mg kg-1 body weight) or high (5 mg kg-1 body weight) doses of the anti-VEGF antibody bevacizumab with or without the tyrosine kinase inhibitor imatinib. Changes in tumour growth, and vascular parameters, including microvessel density, pericyte coverage, leakiness, hypoxia, perfusion, fraction of vessels with an open lumen, and type IV collagen deposition were compared. ECM deposition was increased after standard VEGF inhibition in patients and tumour models. In contrast, treatment with low-dose bevacizumab and imatinib produced similar growth inhibition without inducing detrimental collagen IV deposition, leading to superior vascular normalisation, reduced leakiness, improved oxygenation, more open vessels that permit perfusion and access for therapy. Low-dose bevacizumab augmented by imatinib selects a mature, highly normalised and well perfused tumour vasculature without inducing incremental ECM deposition that normally limits the effectiveness of VEGF targeting drugs.

  11. A normalised seawater strontium isotope curve. Possible implications for Neoproterozoic-Cambrian weathering rates and the further oxygenation of the Earth

    International Nuclear Information System (INIS)

    Shields, G.A.

    2007-01-01

    The strontium isotope composition of seawater is strongly influenced on geological time scales by changes in the rates of continental weathering relative to ocean crust alteration. However, the potential of the seawater 87Sr/86Sr curve to trace globally integrated chemical weathering rates has not been fully realised because ocean 87Sr/86Sr is also influenced by the isotopic evolution of Sr sources to the ocean. A preliminary attempt is made here to normalise the seawater 87Sr/86Sr curve to plausible trends in the 87Sr/86Sr ratios of the three major Sr sources: carbonate dissolution, silicate weathering and submarine hydrothermal exchange. The normalised curve highlights the Neoproterozoic-Phanerozoic transition as a period of exceptionally high continental influence, indicating that this interval was characterised by a transient increase in global weathering rates and/or by the weathering of unusually radiogenic crustal rocks. Close correlation between the normalised 87Sr/86Sr curve, a published seawater δ34S curve and atmospheric pCO2 models is used here to argue that elevated chemical weathering rates were a major contributing factor to the steep rise in seawater 87Sr/86Sr from 650 Ma to 500 Ma. Elevated weathering rates during the Neoproterozoic-Cambrian interval led to increased nutrient availability, organic burial and to the further oxygenation of Earth's surface environment. Use of normalised seawater 87Sr/86Sr curves will, it is hoped, help to improve future geochemical models of Earth System dynamics. (orig.)

  12. Normalisation and weighting in life cycle assessment: quo vadis?

    DEFF Research Database (Denmark)

    Pizzol, Massimo; Laurent, Alexis; Sala, Serenella

    2017-01-01

    Purpose: Building on the rhetorical question “quo vadis?” (literally “Where are you going?”), this article critically investigates the state of the art of normalisation and weighting approaches within life cycle assessment. It aims at identifying purposes, current practices, pros and cons, as well as…

  13. A novel approach to signal normalisation in atmospheric pressure ionisation mass spectrometry.

    Science.gov (United States)

    Vogeser, Michael; Kirchhoff, Fabian; Geyer, Roland

    2012-07-01

    The aim of our study was to test an alternative principle of signal normalisation in LC-MS/MS. During analyses, post column infusion of the target analyte is done via a T-piece, generating an "area under the analyte peak" (AUP). The ratio of peak area to AUP is assessed as assay response. Acceptable analytical performance of this principle was found for an exemplary analyte. Post-column infusion may allow normalisation of ion suppression not requiring any additional standard compound. This approach can be useful in situations where no appropriate compound is available for classical internal standardisation. Copyright © 2012 Elsevier B.V. All rights reserved.
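
    A toy illustration of the response definition described above: the analyte's chromatographic peak area divided by the "area under the analyte peak" (AUP) baseline produced by post-column infusion, which normalises for ion suppression in the same retention window. The signal shapes and magnitudes below are invented.

```python
import numpy as np

t = np.linspace(0.0, 60.0, 601)                  # s, hypothetical peak window
dt = t[1] - t[0]
# post-column infusion gives a constant baseline, dented by ion suppression:
baseline = 1e5 * (1.0 - 0.3 * np.exp(-((t - 30.0) ** 2) / 20.0))
peak = 4e4 * np.exp(-((t - 30.0) ** 2) / 4.0)    # chromatographic analyte peak

peak_area = peak.sum() * dt                      # simple rectangle integration
aup = baseline.sum() * dt                        # "area under the analyte peak"
print(peak_area / aup)                           # suppression-normalised response
```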

  14. Trends of air pollution in Denmark - Normalised by a simple weather index model

    International Nuclear Information System (INIS)

    Kiilsholm, S.; Rasmussen, A.

    2000-01-01

    This report is a part of the Traffic Pool projects on 'Traffic and Environments', 1995-99, financed by the Danish Ministry of Transport. The Traffic Pool projects included five different projects on 'Surveillance of the Air Quality', 'Atmospheric Modelling', 'Atmospheric Chemistry Modelling', 'Smog and ozone' and 'Greenhouse effects and Climate' [Rasmussen, 2000]. This work is a part of the project on 'Surveillance of the Air Quality', with the main objective of making trend analyses of levels of air pollution from traffic in Denmark. Other participants were the Road Directory, mainly focusing on measurement of traffic and trend analysis of the air quality utilising a nordic model for the air pollution in street canyons called BLB (Beregningsmodel for Luftkvalitet i Byluftgader) [Vejdirektoratet 2000]; the National Environmental Research Institute (NERI), mainly focusing on measurements of air pollution and trend analysis with the Operational Street Pollution Model (OSPM) [DMU 2000]; and the Copenhagen Environmental Protection Agency, mainly focusing on measurements. In this study a simpler statistical model has been developed for trend analysis of the air quality. The model filters out the influence of year-to-year variations in the meteorological conditions on the air pollution levels. The weather factors found most important are wind speed, wind direction and mixing height. Measurements of CO, NO and NO2 from three streets in Copenhagen have been used; these streets are Jagtvej, Bredgade and H. C. Andersen's Boulevard (HCAB). The years 1994-1996 were used for evaluation of the method, and an annual air pollution index dependent only on meteorological parameters, called WEATHIX, was calculated for the years 1990-1997 and used for normalisation of the observed air pollution trends. Meteorological data were taken from either the background station at the H.C. Oersted building, situated close to one of the street stations, or the synoptic…

  15. Understanding clinician attitudes towards implementation of guided self-help cognitive behaviour therapy for those who hear distressing voices: using factor analysis to test normalisation process theory.

    Science.gov (United States)

    Hazell, Cassie M; Strauss, Clara; Hayward, Mark; Cavanagh, Kate

    2017-07-24

    The Normalisation Process Theory (NPT) has been used to understand the implementation of physical health care interventions. The current study aims to apply the NPT model to a secondary mental health context, and test the model using exploratory factor analysis. This study will consider the implementation of a brief cognitive behaviour therapy for psychosis (CBTp) intervention. Mental health clinicians were asked to complete a NPT-based questionnaire on the implementation of a brief CBTp intervention. All clinicians had experience of either working with the target client group or were able to deliver psychological therapies. In total, 201 clinicians completed the questionnaire. The results of the exploratory factor analysis found partial support for the NPT model, as three of the NPT factors were extracted: (1) coherence, (2) cognitive participation, and (3) reflexive monitoring. We did not find support for the fourth NPT factor (collective action). All scales showed strong internal consistency. Secondary analysis of these factors showed clinicians to generally support the implementation of the brief CBTp intervention. This study provides strong evidence for the validity of the three NPT factors extracted. Further research is needed to determine whether participants' level of seniority moderates factor extraction, whether this factor structure can be generalised to other healthcare settings, and whether pre-implementation attitudes predict actual implementation outcomes.

  16. Normalisation et certification dans le photovoltaïque: perspectives juridiques.

    OpenAIRE

    Boy , Laurence

    2012-01-01

    Legal approach to standardisation and certification in the photovoltaic sector in France: sources of law, stakeholders' liabilities and competition aspects.

  17. Total body neutron activation analysis of calcium: calibration and normalisation

    Energy Technology Data Exchange (ETDEWEB)

    Kennedy, N.S.J.; Eastell, R.; Ferrington, C.M.; Simpson, J.D.; Strong, J.A. [Western General Hospital, Edinburgh (UK)]; Smith, M.A.; Tothill, P. [Royal Infirmary, Edinburgh (UK)]

    1982-05-01

    An irradiation system has been designed, using a neutron beam from a cyclotron, which optimises the uniformity of activation of calcium. Induced activity is measured in a scanning, shadow-shield whole-body counter. Calibration has been effected and reproducibility assessed with three different types of phantom. Corrections were derived for variations in body height, depth and fat thickness. The coefficient of variation for repeated measurements of an anthropomorphic phantom was 1.8% for an absorbed dose equivalent of 13 mSv (1.3 rem). Measurements of total body calcium in 40 normal adults were used to derive normalisation factors which predict the normal calcium in a subject of given size and age. The coefficient of variation of normalised calcium was 6.2% in men and 6.6% in women, with the demonstration of an annual loss of 1.5% after the menopause. The narrow range should make single measurements useful for diagnostic purposes.
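
    An illustrative regression sketch of the normalisation-factor idea described above: fit predicted "normal" total-body calcium from body size and age in a reference group, then express a measurement as a ratio to its prediction. The abstract derives corrections for height, depth and fat thickness; the predictors and helper names below are ours.

```python
import numpy as np

def fit_normal_calcium(height, weight, age, total_ca):
    """Least-squares prediction of normal total-body calcium from size/age
    (arrays over a reference group, e.g. the 40 normal adults)."""
    X = np.column_stack([np.ones_like(height), height, weight, age])
    coef, *_ = np.linalg.lstsq(X, total_ca, rcond=None)
    return coef

def normalised_calcium(coef, height, weight, age, ca_measured):
    ca_predicted = coef @ np.array([1.0, height, weight, age])
    return ca_measured / ca_predicted   # ~1.0 in health; <1 suggests deficit
```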

  18. Évolution de la normalisation dans le domaine des oléagineux et des corps gras

    Directory of Open Access Journals (Sweden)

    Quinsac Alain

    2003-07-01

    Full Text Available La normalisation joue un grand rôle dans les échanges économiques en participant à l’ouverture et à la transparence des marchés. La filière des Oléagineux et des Corps Gras a intégré depuis longtemps la normalisation dans sa stratégie. Élaborés à partir des besoins de la profession et notamment au niveau de la relation client-fournisseur, les programmes ont concerné principalement l’échantillonnage et l’analyse. Depuis quelques années, une forte évolution du contexte socio-économique et réglementaire (utilisation non-alimentaire, sécurité alimentaire, assurance qualité, a élargi le champ de la normalisation. La démarche normative adoptée dans le cas des bio-diesels et de la détection des OGM dans les oléagineux est expliquée. Les conséquences de l’évolution de la normalisation et les enjeux pour la profession des oléagineux dans le futur sont évoqués.

  19. Selection of reference genes for normalisation of real-time RT-PCR in brain-stem death injury in Ovis aries

    Directory of Open Access Journals (Sweden)

    Fraser John F

    2009-07-01

    Full Text Available Abstract Background Heart and lung transplantation is frequently the only therapeutic option for patients with end stage cardio respiratory disease. Organ donation post brain stem death (BSD is a pre-requisite, yet BSD itself causes such severe damage that many organs offered for donation are unusable, with lung being the organ most affected by BSD. In Australia and New Zealand, less than 50% of lungs offered for donation post BSD are suitable for transplantation, as compared with over 90% of kidneys, resulting in patients dying for lack of suitable lungs. Our group has developed a novel 24 h sheep BSD model to mimic the physiological milieu of the typical human organ donor. Characterisation of the gene expression changes associated with BSD is critical and will assist in determining the aetiology of lung damage post BSD. Real-time PCR is a highly sensitive method involving multiple steps from extraction to processing RNA so the choice of housekeeping genes is important in obtaining reliable results. Little information however, is available on the expression stability of reference genes in the sheep pulmonary artery and lung. We aimed to establish a set of stably expressed reference genes for use as a standard for analysis of gene expression changes in BSD. Results We evaluated the expression stability of 6 candidate normalisation genes (ACTB, GAPDH, HGPRT, PGK1, PPIA and RPLP0 using real time quantitative PCR. There was a wide range of Ct-values within each tissue for pulmonary artery (15–24 and lung (16–25 but the expression pattern for each gene was similar across the two tissues. After geNorm analysis, ACTB and PPIA were shown to be the most stably expressed in the pulmonary artery and ACTB and PGK1 in the lung tissue of BSD sheep. Conclusion Accurate normalisation is critical in obtaining reliable and reproducible results in gene expression studies. This study demonstrates tissue associated variability in the selection of these

  20. Normalisation of spot urine samples to 24-h collection for assessment of exposure to uranium

    International Nuclear Information System (INIS)

    Marco, R.; Katorza, E.; Gonen, R.; German, U.; Tshuva, A.; Pelled, O.; Paz-tal, O.; Adout, A.; Karpas, Z.

    2008-01-01

    For dose assessment of workers at Nuclear Research Center Negev exposed to natural uranium, spot urine samples are analysed and the results are normalised to 24-h urine excretion based on a 'standard' man urine volume of 1.6 l d^-1. In the present work, the urine volume, uranium level and creatinine concentration were determined in two or three 24-h urine collections from 133 male workers (319 samples) and 33 female workers (88 samples). Three volunteers provided urine spot samples from each voiding during a 24-h period and a good correlation was found between the relative level of creatinine and uranium in spot samples collected from the same individual. The results show that normalisation of uranium concentration to creatinine in a spot sample represents the 24-h content of uranium better than normalisation to the standard volume and may be used to reduce the uncertainty of dose assessment based on spot samples. (authors)
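
    The creatinine normalisation favoured above amounts to scaling the spot uranium-to-creatinine ratio by an expected daily creatinine output. A minimal sketch, with a hypothetical reference value (the paper derives its own reference data):

```python
def uranium_24h_estimate(u_spot_ug_per_l, creat_spot_g_per_l,
                         expected_creat_g_per_24h=1.7):
    """Estimate 24-h uranium excretion from a spot sample: scale the
    spot uranium-to-creatinine ratio by an expected daily creatinine
    output. The 1.7 g/24 h default is a typical adult value assumed
    here for illustration; the paper derives its own reference data."""
    return (u_spot_ug_per_l / creat_spot_g_per_l) * expected_creat_g_per_24h

# Spot sample: 0.05 ug/l uranium, 1.2 g/l creatinine.
print(f"{uranium_24h_estimate(0.05, 1.2):.3f} ug per 24 h")
```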

  1. OpenPrescribing: normalised data and software tool to research trends in English NHS primary care prescribing 1998-2016.

    Science.gov (United States)

    Curtis, Helen J; Goldacre, Ben

    2018-02-23

    We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing.
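
    The two-stage name matching described above (exact match to the current formulary, then similar names) can be sketched as follows; the use of difflib and the 0.85 similarity cutoff are assumptions for illustration, not the authors' implementation:

```python
import difflib

def map_to_current_formulary(historic_names, current_names):
    """Map each historic drug name to the current formulary: exact name
    match first, then the closest similar name above a cutoff.
    Returns {historic_name: (matched_name_or_None, method)}."""
    current_set = set(current_names)
    mapping = {}
    for name in historic_names:
        if name in current_set:
            mapping[name] = (name, "exact")
        else:
            close = difflib.get_close_matches(name, current_names,
                                              n=1, cutoff=0.85)
            mapping[name] = (close[0], "similar") if close else (None, "unmatched")
    return mapping

print(map_to_current_formulary(["Paracetamol", "Paracetomol", "Obsoletine"],
                               ["Paracetamol", "Ibuprofen"]))
```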

  2. Normalisation of body composition parameters for nutritional assessment

    International Nuclear Information System (INIS)

    Preston, Thomas

    2014-01-01

    Full text: Normalisation of body composition parameters to an index of body size facilitates comparison of a subject's measurements with those of a population. There is an obvious focus on indexes of obesity, but first it is informative to consider Fat Free Mass (FFM) in the context of common anthropometric measures of body size, namely height and weight. The contention is that FFM is a more physiological measure of body size than body mass. Many studies have shown that FFM relates to height^p. Although there is debate over the appropriate exponent, especially in early life, it appears to lie between 2 and 3. If 2, then FFM Index (FFMI; kg/m^2) and Fat Mass Index (FMI; kg/m^2) can be summed to give BMI. If 3 were used as exponent, then FFMI (kg/m^3) plus FMI (kg/m^3) gives the Ponderal Index (PI; weight/height^3). In 2013, Burton argued that a cubic exponent is appropriate for normalisation as it yields a dimensionless quotient. In 2012, Wang and co-workers repeated earlier observations showing a strong linear relationship between FFM and height^3. The importance of the latter study comes from the fact that a 4-compartment body composition model was used, which is recognised as the most accurate means of describing FFM. Once the basis of a FFMI has been defined, it can be used to compare measurements with those of a population, either directly, as a ratio to a norm or as a Z-score. FFMI charts could be developed for use in child growth. Other related indexes can be determined for use in specific circumstances, such as: body cell mass index (growth and wasting); skeletal muscle mass index (SMMI) or appendicular SMMI (growth and sarcopenia); bone mineral mass index (osteoporosis); extracellular fluid index (hydration). Finally, it is logical that the same system is used to define an adiposity index, so Fat Mass Index (FMI; kg/height^3) can be used as it is consistent with FFMI (kg/height^3) and PI. It should also be noted that the index FM/FFM describes an individual's fat mass relative to fat-free mass.
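
    With W = FFM + FM (body weight) and height h, the additive decomposition described above can be written compactly for the two candidate exponents:

```latex
\mathrm{FFMI}_p = \frac{\mathrm{FFM}}{h^{p}}, \qquad
\mathrm{FMI}_p = \frac{\mathrm{FM}}{h^{p}},
\]
\[
\mathrm{FFMI}_2 + \mathrm{FMI}_2 = \frac{W}{h^{2}} = \mathrm{BMI}, \qquad
\mathrm{FFMI}_3 + \mathrm{FMI}_3 = \frac{W}{h^{3}} = \mathrm{PI}.
```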

  3. What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model

    Directory of Open Access Journals (Sweden)

    Lankshear Annette J

    2010-02-01

    Full Text Available Abstract Background There is a considerable evidence base for 'collaborative care' as a method to improve quality of care for depression, but an acknowledged gap between efficacy and implementation. This study utilises the Normalisation Process Model (NPM) to inform the process of implementation of collaborative care in both a future full-scale trial, and the wider health economy. Methods Application of the NPM to qualitative data collected in both focus groups and one-to-one interviews before and after an exploratory randomised controlled trial of a collaborative model of care for depression. Results Findings are presented as they relate to the four factors of the NPM (interactional workability, relational integration, skill-set workability, and contextual integration) and a number of necessary tasks are identified. Using the model, it was possible to observe that predictions about necessary work to implement collaborative care that could be made from analysis of the pre-trial data relating to the four different factors of the NPM were indeed borne out in the post-trial data. However, additional insights were gained from the post-trial interview participants who, unlike those interviewed before the trial, had direct experience of a novel intervention. The professional freedom enjoyed by more senior mental health workers may work both for and against normalisation of collaborative care as those who wish to adopt new ways of working have the freedom to change their practice but are not obliged to do so. Conclusions The NPM provides a useful structure for both guiding and analysing the process by which an intervention is optimized for testing in a larger scale trial or for subsequent full-scale implementation.

  4. What work has to be done to implement collaborative care for depression? Process evaluation of a trial utilizing the Normalization Process Model.

    Science.gov (United States)

    Gask, Linda; Bower, Peter; Lovell, Karina; Escott, Diane; Archer, Janine; Gilbody, Simon; Lankshear, Annette J; Simpson, Angela E; Richards, David A

    2010-02-10

    There is a considerable evidence base for 'collaborative care' as a method to improve quality of care for depression, but an acknowledged gap between efficacy and implementation. This study utilises the Normalisation Process Model (NPM) to inform the process of implementation of collaborative care in both a future full-scale trial, and the wider health economy. Application of the NPM to qualitative data collected in both focus groups and one-to-one interviews before and after an exploratory randomised controlled trial of a collaborative model of care for depression. Findings are presented as they relate to the four factors of the NPM (interactional workability, relational integration, skill-set workability, and contextual integration) and a number of necessary tasks are identified. Using the model, it was possible to observe that predictions about necessary work to implement collaborative care that could be made from analysis of the pre-trial data relating to the four different factors of the NPM were indeed borne out in the post-trial data. However, additional insights were gained from the post-trial interview participants who, unlike those interviewed before the trial, had direct experience of a novel intervention. The professional freedom enjoyed by more senior mental health workers may work both for and against normalisation of collaborative care as those who wish to adopt new ways of working have the freedom to change their practice but are not obliged to do so. The NPM provides a useful structure for both guiding and analysing the process by which an intervention is optimized for testing in a larger scale trial or for subsequent full-scale implementation.

  5. Bounded real and positive real balanced truncation using Σ-normalised coprime factors

    NARCIS (Netherlands)

    Trentelman, H.L.

    2009-01-01

    In this article, we will extend the method of balanced truncation using normalised right coprime factors of the system transfer matrix to balanced truncation with preservation of half line dissipativity. Special cases are preservation of positive realness and bounded realness. We consider a half

  6. The one-dimensional normalised generalised equivalence theory (NGET) for generating equivalent diffusion theory group constants for PWR reflector regions

    International Nuclear Information System (INIS)

    Mueller, E.Z.

    1991-01-01

    An equivalent diffusion theory PWR reflector model is presented, which has as its basis Smith's generalisation of Koebke's Equivalence Theory. This method is an adaptation, in one-dimensional slab geometry, of the Generalised Equivalence Theory (GET). Since the method involves the renormalisation of the GET discontinuity factors at nodal interfaces, it is called the Normalised Generalised Equivalence Theory (NGET) method. The advantages of the NGET method for modelling the ex-core nodes of a PWR are summarized. 23 refs
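
    For orientation, in GET the interface condition on the homogeneous nodal fluxes is imposed through discontinuity factors; one plausible reading of the renormalisation referred to above (the paper's exact convention may differ) is that only the ratio of the two factors matters, so one side of each interface can be scaled to unity:

```latex
f^{-}\,\hat\phi^{-}_{\mathrm{hom}} = f^{+}\,\hat\phi^{+}_{\mathrm{hom}},
\qquad
f^{\pm} = \frac{\phi^{s}_{\mathrm{het}}}{\hat\phi^{s,\pm}_{\mathrm{hom}}},
\qquad
(f^{-},\, f^{+}) \;\longrightarrow\; \Bigl(1,\ \tfrac{f^{+}}{f^{-}}\Bigr).
```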

  7. From theory to 'measurement' in complex interventions: methodological lessons from the development of an e-health normalisation instrument.

    Science.gov (United States)

    Finch, Tracy L; Mair, Frances S; O'Donnell, Catherine; Murray, Elizabeth; May, Carl R

    2012-05-17

    Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. The developed instrument was pre-tested in two professional samples (N=46; N=231). Ratings of items representing normalisation 'processes' were significantly related to staff members' perceptions of whether or not e-health had become 'routine'. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement; (3) representation of multiple perspectives and collaborative nature of

  8. An efficient Markov chain Monte Carlo method for distributions with intractable normalising constants

    DEFF Research Database (Denmark)

    Møller, Jesper; Pettitt, A. N.; Reeves, R.

    2006-01-01

    Maximum likelihood parameter estimation and sampling from Bayesian posterior distributions are problematic when the probability density for the parameter of interest involves an intractable normalising constant which is also a function of that parameter. In this paper, an auxiliary variable method is introduced in which the intractable normalising constant cancels from the Metropolis-Hastings acceptance ratio.
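
    The cancellation at the heart of such auxiliary-variable schemes is easiest to see in the closely related exchange-algorithm form, shown here for illustration (the paper's exact construction differs in detail). With likelihood f(x|θ) = q_θ(x)/Z(θ) and a symmetric proposal for θ', drawing the auxiliary data y from the model at θ' gives an acceptance probability from which every Z cancels:

```latex
y \sim f(\,\cdot \mid \theta'\,), \qquad
\alpha = \min\!\left\{ 1,\;
\frac{\pi(\theta')\, q_{\theta'}(x)\, q_{\theta}(y)}
     {\pi(\theta)\,  q_{\theta}(x)\,  q_{\theta'}(y)} \right\}.
```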

  9. From theory to 'measurement' in complex interventions: Methodological lessons from the development of an e-health normalisation instrument

    Directory of Open Access Journals (Sweden)

    Finch Tracy L

    2012-05-01

    Full Text Available Abstract Background Although empirical and theoretical understanding of processes of implementation in health care is advancing, translation of theory into structured measures that capture the complex interplay between interventions, individuals and context remains limited. This paper aimed to (1) describe the process and outcome of a project to develop a theory-based instrument for measuring implementation processes relating to e-health interventions; and (2) identify key issues and methodological challenges for advancing work in this field. Methods A 30-item instrument (Technology Adoption Readiness Scale (TARS)) for measuring normalisation processes in the context of e-health service interventions was developed on the basis of Normalization Process Theory (NPT). NPT focuses on how new practices become routinely embedded within social contexts. The instrument was pre-tested in two health care settings in which e-health (electronic facilitation of healthcare decision-making and practice) was used by health care professionals. Results The developed instrument was pre-tested in two professional samples (N = 46; N = 231). Ratings of items representing normalisation ‘processes’ were significantly related to staff members’ perceptions of whether or not e-health had become ‘routine’. Key methodological challenges are discussed in relation to: translating multi-component theoretical constructs into simple questions; developing and choosing appropriate outcome measures; conducting multiple-stakeholder assessments; instrument and question framing; and more general issues for instrument development in practice contexts. Conclusions To develop theory-derived measures of implementation process for progressing research in this field, four key recommendations are made relating to (1) greater attention to underlying theoretical assumptions and extent of translation work required; (2) the need for appropriate but flexible approaches to outcomes measurement

  10. Identification of endogenous control genes for normalisation of real-time quantitative PCR data in colorectal cancer.

    LENUS (Irish Health Repository)

    Kheirelseid, Elrasheid A H

    2010-01-01

    BACKGROUND: Gene expression analysis has many applications in cancer diagnosis, prognosis and therapeutic care. Relative quantification is the most widely adopted approach whereby quantification of gene expression is normalised relative to an endogenously expressed control (EC) gene. Central to the reliable determination of gene expression is the choice of control gene. The purpose of this study was to evaluate a panel of candidate EC genes from which to identify the most stably expressed gene(s) to normalise RQ-PCR data derived from primary colorectal cancer tissue. RESULTS: The expression of thirteen candidate EC genes: B2M, HPRT, GAPDH, ACTB, PPIA, HCRT, SLC25A23, DTX3, APOC4, RTDR1, KRTAP12-3, CHRNB4 and MRPL19 was analysed in a cohort of 64 colorectal tumours and tumour-associated normal specimens. CXCL12, FABP1, MUC2 and PDCD4 were chosen as target genes against which the effect of each EC gene on normalised expression could be compared. Data analysis using descriptive statistics, geNorm, NormFinder and qBasePlus indicated significant difference in variances between candidate EC genes. We determined that two genes were required for optimal normalisation and identified B2M and PPIA as the most stably expressed and reliable EC genes. CONCLUSION: This study identified that the combination of two EC genes (B2M and PPIA) more accurately normalised RQ-PCR data in colorectal tissue. Although these control genes might not be optimal for use in other cancer studies, the approach described herein could serve as a template for the identification of valid ECs in other cancer types.
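
    Using two ECs in combination conventionally means dividing each target quantity by the geometric mean of the two reference quantities in the same sample (the geNorm convention, assumed here since the abstract does not spell out the formula):

```latex
\mathrm{NF}_s = \sqrt{Q_{\mathrm{B2M},s}\; Q_{\mathrm{PPIA},s}}, \qquad
\tilde{Q}_{\mathrm{target},s} = \frac{Q_{\mathrm{target},s}}{\mathrm{NF}_s}.
```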

  11. Normalisation of the peaceful use of nuclear energy - consequences for its legal regulation

    International Nuclear Information System (INIS)

    Birkhofer, A.; Lukes, R.

    1985-01-01

    The five reports in this book deal with the importance of the peaceful use of nuclear energy, as well as with several aspects of normalisation. The spectrum of the reports underlines the benefit for the support of the peaceful use of nuclear energy. (WG) [de]

  12. Volatility Determination in an Ambit Process Setting

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Graversen, Svend-Erik

    The probability limit behaviour of normalised quadratic variation is studied for a simple tempo-spatial ambit process, with particular regard to the question of volatility memorylessness.

  13. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin. METHODS: Data were collected retrospectively for a period of six months. For each patient, time in therapeutic range (TTR) was calculated and correlated with practice and patient characteristics using multilevel linear regression models. RESULTS: We identified 447 patients in warfarin treatment in the 20 practices using POCT.
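
    TTR is conventionally computed with the Rosendaal linear-interpolation method; the abstract does not detail the calculation, so the following Python sketch shows the standard approach under that assumption:

```python
import numpy as np

def ttr_rosendaal(days, inr, low=2.0, high=3.0):
    """Percentage of time in therapeutic range (TTR) by Rosendaal linear
    interpolation: INR is assumed to change linearly between consecutive
    measurements, and the fraction of interpolated person-days whose INR
    falls within [low, high] is returned."""
    in_range, total = 0.0, 0.0
    for i in range(len(days) - 1):
        span = days[i + 1] - days[i]
        # One interpolated INR value per day in the interval (the last
        # day belongs to the next interval).
        daily = np.linspace(inr[i], inr[i + 1], int(span) + 1)[:-1]
        in_range += np.sum((daily >= low) & (daily <= high))
        total += span
    return 100.0 * in_range / total if total else float("nan")

# Toy example: three INR measurements two weeks apart.
print(f"TTR = {ttr_rosendaal([0, 14, 28], [1.8, 2.5, 3.4]):.1f}%")
```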

  14. The contribution of online content to the promotion and normalisation of female genital cosmetic surgery: a systematic review of the literature.

    Science.gov (United States)

    Mowat, Hayley; McDonald, Karalyn; Dobson, Amy Shields; Fisher, Jane; Kirkman, Maggie

    2015-11-25

    Women considering female genital cosmetic surgery (FGCS) are likely to use the internet as a key source of information during the decision-making process. The aim of this systematic review was to determine what is known about the role of the internet in the promotion and normalisation of female genital cosmetic surgery and to identify areas for future research. Eight social science, medical, and communication databases and Google Scholar were searched for peer-reviewed papers published in English. Results from all papers were analysed to identify recurring and unique themes. Five papers met inclusion criteria. Three of the papers reported investigations of website content of FGCS providers, a fourth compared motivations for labiaplasty publicised on provider websites with those disclosed by women in online communities, and the fifth analysed visual depictions of female genitalia in online pornography. Analysis yielded five significant and interrelated patterns of representation, each functioning to promote and normalise the practice of FGCS: pathologisation of genital diversity; female genital appearance as important to wellbeing; characteristics of women's genitals are important for sex life; female body as degenerative and improvable through surgery; and FGCS as safe, easy, and effective. A significant gap was identified in the literature: the ways in which user-generated content might function to perpetuate, challenge, or subvert the normative discourses prevalent in online pornography and surgical websites. Further research is needed to contribute to knowledge of the role played by the internet in the promotion and normalisation of female genital cosmetic surgery.

  15. Preoperative mapping of cortical language areas in adult brain tumour patients using PET and individual non-normalised SPM analyses

    International Nuclear Information System (INIS)

    Meyer, Philipp T.; Sturz, Laszlo; Schreckenberger, Mathias; Setani, Keyvan S.; Buell, Udalrich; Spetzger, Uwe; Meyer, Georg F.; Sabri, Osama

    2003-01-01

    In patients scheduled for the resection of perisylvian brain tumours, knowledge of the cortical topography of language functions is crucial in order to avoid neurological deficits. We investigated the applicability of statistical parametric mapping (SPM) without stereotactic normalisation for individual preoperative language function brain mapping using positron emission tomography (PET). Seven right-handed adult patients with left-sided brain tumours (six frontal and one temporal) underwent 12 oxygen-15 labelled water PET scans during overt verb generation and rest. Individual activation maps were calculated for P<0.005 and P<0.001 without anatomical normalisation and overlaid onto the individuals' magnetic resonance images for preoperative planning. Activations corresponding to Broca's and Wernicke's areas were found in five and six cases, respectively, for P<0.005 and in three and six cases, respectively, for P<0.001. One patient with a glioma located in the classical Broca's area without aphasic symptoms presented an activation of the adjacent inferior frontal cortex and of a right-sided area homologous to Broca's area. Four additional patients with left frontal tumours also presented activations of the right-sided Broca's homologue; two of these showed aphasic symptoms and two only a weak or no activation of Broca's area. Other frequently observed activations included bilaterally the superior temporal gyri, prefrontal cortices, anterior insulae, motor areas and the cerebellum. The middle and inferior temporal gyri were activated predominantly on the left. An SPM group analysis (P<0.05, corrected) in patients with left frontal tumours confirmed the activation pattern shown by the individual analyses. We conclude that SPM analyses without stereotactic normalisation offer a promising alternative for analysing individual preoperative language function brain mapping studies. The observed right frontal activations agree with proposed reorganisation processes, but

  16. Relationships between the normalised difference vegetation index and temperature fluctuations in post-mining sites

    Czech Academy of Sciences Publication Activity Database

    Bujalský, L.; Jirka, V.; Zemek, František; Frouz, J.

    2018-01-01

    Vol. 32, No. 4 (2018), pp. 254-263 ISSN 1748-0930 R&D Projects: GA MŠk(CZ) LO1415 Institutional support: RVO:67179843 Keywords: temperature * normalised difference vegetation index (NDVI) * vegetation cover * remote sensing Subject RIV: DF - Soil Science Impact factor: 1.078, year: 2016

  17. Effect of food matrix and thermal processing on the performance of a normalised quantitative real-time PCR approach for lupine (Lupinus albus) detection as a potential allergenic food.

    Science.gov (United States)

    Villa, Caterina; Costa, Joana; Gondar, Cristina; Oliveira, M Beatriz P P; Mafra, Isabel

    2018-10-01

    Lupine is widely used as an ingredient in diverse food products, but it is also a source of allergens. This work aimed at proposing a method to detect/quantify lupine as an allergen in processed foods based on a normalised real-time PCR assay targeting the Lup a 4 allergen-encoding gene of Lupinus albus. Sensitivities down to 0.0005%, 0.01% and 0.05% (w/w) of lupine in rice flour, wheat flour and bread, respectively, and 1 pg of L. albus DNA were obtained, with adequate real-time PCR performance parameters using the ΔCt method. Both food matrix and processing negatively affected the quantitative performance of the assay. The method was successfully validated with blind samples and applied to processed foods. Lupine was estimated at between 4.12% and 22.9% in foods, with some results suggesting the common practice of precautionary labelling. In this work, useful and effective tools were proposed for the detection/quantification of lupine in food products.
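
    In its generic form, the ΔCt normalisation mentioned above compares the target cycle threshold with that of a reference amplicon (the paper's exact calibration model may differ):

```latex
\Delta C_t = C_t^{\mathrm{lupine}} - C_t^{\mathrm{reference}}, \qquad
\text{relative amount} \propto E^{-\Delta C_t},
```

    where E is the amplification efficiency (E = 2 for perfect doubling).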

  18. A normalisation for the four-detector system for gamma-gamma angular correlation studies

    International Nuclear Information System (INIS)

    Kiang, G.C.; Chen, C.H.; Niu, W.F.

    1994-01-01

    A normalisation method for the multiple-HPGe-detector system is described. The system consists of four coaxial HPGe detectors with a CAMAC event-by-event data acquisition system, enabling six angular gamma-gamma coincidence combinations to be measured simultaneously. An application to gamma-gamma correlation studies of Kr-82 is presented and discussed. 3 figs., 6 refs. (author)

  19. Assessing the facilitators and barriers of interdisciplinary team working in primary care using normalisation process theory: An integrative review.

    Science.gov (United States)

    O'Reilly, Pauline; Lee, Siew Hwa; O'Sullivan, Madeleine; Cullen, Walter; Kennedy, Catriona; MacFarlane, Anne

    2017-01-01

    Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. We need to know much more about the experiences of the full network of primary care professionals regarding all aspects of implementation work. International

  20. Confluence via strong normalisation in an algebraic λ-calculus with rewriting

    Directory of Open Access Journals (Sweden)

    Pablo Buiras

    2012-03-01

    Full Text Available The linear-algebraic lambda-calculus and the algebraic lambda-calculus are untyped lambda-calculi extended with arbitrary linear combinations of terms. The former presents the axioms of linear algebra in the form of a rewrite system, while the latter uses equalities. When given by rewrites, algebraic lambda-calculi are not confluent unless further restrictions are added. We provide a type system for the linear-algebraic lambda-calculus enforcing strong normalisation, which gives back confluence. The type system allows an abstract interpretation in System F.

  1. Quantification of tumour 18F-FDG uptake: Normalise to blood glucose or scale to liver uptake?

    Energy Technology Data Exchange (ETDEWEB)

    Keramida, Georgia [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Brighton and Sussex University Hospitals NHS Trust, Department of Nuclear Medicine, Brighton (United Kingdom); University of Sussex, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Dizdarevic, Sabina; Peters, A.M. [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom); Brighton and Sussex University Hospitals NHS Trust, Department of Nuclear Medicine, Brighton (United Kingdom); Bush, Janice [Brighton and Sussex Medical School, Clinical Imaging Sciences Centre, Brighton (United Kingdom)

    2015-09-15

    To compare normalisation to blood glucose (BG) with scaling to hepatic uptake for quantification of tumour 18F-FDG uptake, using the brain as a surrogate for tumours. Standardised uptake value (SUV) was measured over the liver, cerebellum, basal ganglia, and frontal cortex in 304 patients undergoing 18F-FDG PET/CT. The relationship between brain FDG clearance and SUV was theoretically defined. Brain SUV decreased exponentially with BG, with similar constants between cerebellum, basal ganglia, and frontal cortex (0.099-0.119 (mmol/l)^-1) and similar to values for tumours estimated from the literature. Liver SUV, however, correlated positively with BG. Brain-to-liver SUV ratio therefore showed an inverse correlation with BG, well fitted with a hyperbolic function (R = 0.83), as theoretically predicted. Brain SUV normalised to BG (nSUV) displayed a nonlinear correlation with BG (R = 0.55); however, as theoretically predicted, brain nSUV/liver SUV showed almost no correlation with BG. Correction of brain SUV using an exponential function of BG with constant 0.099 (mmol/l)^-1 also eliminated the correlation between brain SUV and BG. Brain SUV continues to correlate with BG after normalisation to BG. Likewise, liver SUV is unsuitable as a reference for tumour FDG uptake. Brain SUV divided by liver SUV, however, shows minimal dependence on BG. (orig.)
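
    The reported behaviour and the proposed correction can be summarised as follows; BG_ref is a reference glucose level introduced here for illustration (the abstract does not state one):

```latex
\mathrm{SUV}_{\mathrm{brain}}(\mathrm{BG}) \approx \mathrm{SUV}_0\, e^{-k\,\mathrm{BG}},
\quad k \approx 0.099\text{--}0.119\ (\mathrm{mmol/l})^{-1};
\qquad
\mathrm{SUV}_{\mathrm{corr}} = \mathrm{SUV}\cdot e^{\,k\,(\mathrm{BG}-\mathrm{BG}_{\mathrm{ref}})}.
```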

  2. Normalisation: ROI optimal treatment planning - SNDH pattern

    International Nuclear Information System (INIS)

    Shilvat, D.V.; Bhandari, Virendra; Tamane, Chandrashekhar; Pangam, Suresh

    2001-01-01

    Precise dose delivery to the target/ROI (region of interest) while respecting the tolerance doses of normal tissues is the aim of ideal treatment planning. This goal is achieved with advanced modalities such as micro-MLC, a simulator and a 3-dimensional treatment planning system. The SNDH pattern, however, uses the minimum available resources (ALCYON II telecobalt unit, CT scanner, MULTIDATA 2-dimensional treatment planning system) to their maximum utility and reaches the required precision, the same as that achieved with advanced modalities. Among the parameters used, normalisation to the ROI achieves the aim of the treatment planning most effectively. This is illustrated with an esophagus treatment plan modified according to the SNDH pattern. Results are attractive and self-explanatory. By implementing the SNDH pattern, the quality index of the treatment plan reaches greater than 90%, with a substantial reduction in dose to the vital organs. The aim is to utilise the minimum available resources efficiently to achieve the highest possible precision in delivering a homogeneous dose to the ROI while respecting the tolerance doses of vital organs.

  3. 18S rRNA is a reliable normalisation gene for real time PCR based on influenza virus infected cells

    Directory of Open Access Journals (Sweden)

    Kuchipudi Suresh V

    2012-10-01

    Full Text Available Abstract Background One requisite of quantitative reverse transcription PCR (qRT-PCR) is to normalise the data with an internal reference gene that is invariant regardless of treatment, such as virus infection. Several studies have found variability in the expression of commonly used housekeeping genes, such as beta-actin (ACTB) and glyceraldehyde-3-phosphate dehydrogenase (GAPDH), under different experimental settings. However, ACTB and GAPDH remain widely used in the studies of host gene response to virus infections, including influenza viruses. To date no detailed study has been described that compares the suitability of commonly used housekeeping genes in influenza virus infections. The present study evaluated several commonly used housekeeping genes [ACTB, GAPDH, 18S ribosomal RNA (18S rRNA), ATP synthase, H+ transporting, mitochondrial F1 complex, beta polypeptide (ATP5B) and ATP synthase, H+ transporting, mitochondrial Fo complex, subunit C1 (subunit 9) (ATP5G1)] to identify the most stably expressed gene in human, pig, chicken and duck cells infected with a range of influenza A virus subtypes. Results The relative expression stability of commonly used housekeeping genes was determined in primary human bronchial epithelial cells (HBECs), pig tracheal epithelial cells (PTECs), and chicken and duck primary lung-derived cells infected with five influenza A virus subtypes. Analysis of qRT-PCR data from virus and mock infected cells using NormFinder and BestKeeper software programmes found that 18S rRNA was the most stable gene in HBECs, PTECs and avian lung cells. Conclusions Based on the presented data from cell culture models (HBECs, PTECs, chicken and duck lung cells) infected with a range of influenza viruses, we found that 18S rRNA is the most stable reference gene for normalising qRT-PCR data. Expression levels of the other housekeeping genes evaluated in this study (including ACTB and GAPDH) were highly affected by influenza virus infection and are therefore less suitable as reference genes.

  4. Technology, normalisation and male sex work.

    Science.gov (United States)

    MacPhail, Catherine; Scott, John; Minichiello, Victor

    2015-01-01

    Technological change, particularly the growth of the Internet and smart phones, has increased the visibility of male escorts, expanded their client base and diversified the range of venues in which male sex work can take place. Specifically, the Internet has relocated some forms of male sex work away from the street and thereby increased market reach, visibility and access and the scope of sex work advertising. Using the online profiles of 257 male sex workers drawn from six of the largest websites advertising male sexual services in Australia, the role of the Internet in facilitating the normalisation of male sex work is discussed. Specifically we examine how engagement with the sex industry has been reconstituted in term of better informed consumer-seller decisions for both clients and sex workers. Rather than being seen as a 'deviant' activity, understood in terms of pathology or criminal activity, male sex work is increasingly presented as an everyday commodity in the market place. In this context, the management of risks associated with sex work has shifted from formalised social control to more informal practices conducted among online communities of clients and sex workers. We discuss the implications for health, legal and welfare responses within an empowerment paradigm.

  5. Assessing the facilitators and barriers of interdisciplinary team working in primary care using normalisation process theory: An integrative review

    Science.gov (United States)

    O’Reilly, Pauline; Lee, Siew Hwa; O’Sullivan, Madeleine; Cullen, Walter; Kennedy, Catriona; MacFarlane, Anne

    2017-01-01

    Background Interdisciplinary team working is of paramount importance in the reform of primary care in order to provide cost-effective and comprehensive care. However, international research shows that it is not routine practice in many healthcare jurisdictions. It is imperative to understand levers and barriers to the implementation process. This review examines interdisciplinary team working in practice, in primary care, from the perspective of service providers and analyses (1) barriers and facilitators to implementation of interdisciplinary teams in primary care and (2) the main research gaps. Methods and findings An integrative review following the PRISMA guidelines was conducted. Following a search of 10 international databases, 8,827 titles were screened for relevance and 49 met the criteria. Quality of evidence was appraised using predetermined criteria. Data were analysed following the principles of framework analysis using Normalisation Process Theory (NPT), which has four constructs: sense making, enrolment, enactment, and appraisal. The literature is dominated by a focus on interdisciplinary working between physicians and nurses. There is a dearth of evidence about all NPT constructs apart from enactment. Physicians play a key role in encouraging the enrolment of others in primary care team working and in enabling effective divisions of labour in the team. The experience of interdisciplinary working emerged as a lever for its implementation, particularly where communication and respect were strong between professionals. Conclusion A key lever for interdisciplinary team working in primary care is to get professionals working together and to learn from each other in practice. However, the evidence base is limited as it does not reflect the experiences of all primary care professionals and it is primarily about the enactment of team working. We need to know much more about the experiences of the full network of primary care professionals regarding all aspects

  6. The dynamics of the oesophageal squamous epithelium 'normalisation' process in patients with gastro-oesophageal reflux disease treated with long-term acid suppression or anti-reflux surgery.

    Science.gov (United States)

    Mastracci, L; Fiocca, R; Engström, C; Attwood, S; Ell, C; Galmiche, J P; Hatlebakk, J G; Långström, G; Eklund, S; Lind, T; Lundell, L

    2017-05-01

    Proton pump inhibitors and laparoscopic anti-reflux surgery (LARS) offer long-term symptom control to patients with gastro-oesophageal reflux disease (GERD). The aim was to evaluate the process of 'normalisation' of the squamous epithelium morphology of the distal oesophagus on these therapies. In the LOTUS trial, 554 patients with chronic GERD were randomised to receive either esomeprazole (20-40 mg daily) or LARS. After 5 years, 372 patients remained in the study (esomeprazole, 192; LARS, 180). Biopsies were taken at the Z-line and 2 cm above, at baseline, 1, 3 and 5 years. A severity score was calculated based on: papillae elongation, basal cell hyperplasia, intercellular space dilatations and eosinophilic infiltration. The epithelial proliferative activity was assessed by Ki-67 immunohistochemistry. A gradual improvement in all variables over 5 years was noted in both groups, at both the Z-line and 2 cm above. The severity score decreased from baseline at each subsequent time point in both groups. The refluxate seems to play the predominant role in restoring tissue morphology.

  7. Supporting the use of theory in cross-country health services research: a participatory qualitative approach using Normalisation Process Theory as an example.

    Science.gov (United States)

    O'Donnell, Catherine A; Mair, Frances S; Dowrick, Christopher; Brún, Mary O'Reilly-de; Brún, Tomas de; Burns, Nicola; Lionis, Christos; Saridaki, Aristoula; Papadakaki, Maria; Muijsenbergh, Maria van den; Weel-Baumgarten, Evelyn van; Gravenhorst, Katja; Cooper, Lucy; Princz, Christine; Teunissen, Erik; Mareeuw, Francine van den Driessen; Vlahadi, Maria; Spiegel, Wolfgang; MacFarlane, Anne

    2017-08-21

    To describe and reflect on the process of designing and delivering a training programme supporting the use of theory, in this case Normalisation Process Theory (NPT), in a multisite cross-country health services research study. Participatory research approach using qualitative methods. Six European primary care settings involving research teams from Austria, England, Greece, Ireland, The Netherlands and Scotland. RESTORE research team consisting of 8 project applicants, all senior primary care academics, and 10 researchers. Professional backgrounds included general practitioners/family doctors, social/cultural anthropologists, sociologists and health services/primary care researchers. Views of all research team members (n=18) were assessed using qualitative evaluation methods, analysed qualitatively by the trainers after each session. Most of the team had no experience of using NPT and many had not applied theory to prospective, qualitative research projects. Early training proved didactic and overloaded participants with information. Drawing on RESTORE's methodological approach of Participatory Learning and Action, workshops using role play, experiential interactive exercises and light-hearted examples not directly related to the study subject matter were developed. Evaluation showed the study team quickly grew in knowledge and confidence in applying theory to fieldwork. Recommendations applicable to other studies include: accepting that theory application is not a linear process, that time is needed to address researcher concerns with the process, and that experiential, interactive learning is a key device in building conceptual and practical knowledge. An unanticipated benefit was the smooth transition to cross-country qualitative coding of study data. A structured programme of training enhanced and supported the prospective application of a theory, NPT, to our work but raised challenges. These were not unique to NPT but could arise with the application of any theory.

  8. REPORTING SOCIETAL : LIMITES ET ENJEUX DE LA PROPOSITION DE NORMALISATION INTERNATIONALE " GLOBAL REPORTING INITIATIVE "

    OpenAIRE

    Michel Capron; Françoise Quairel

    2003-01-01

    International audience; Drawing on Anglo-Saxon accounting standard-setting, the Global Reporting Initiative (GRI) proposes a reference framework for the voluntary disclosure of societal information. Its transposition has limits that in practice render its principles inapplicable. Nevertheless, the framework is tending to establish itself, and large companies may find in it a means of avoiding binding regulation.

  9. The stories we tell: qualitative research interviews, talking technologies and the 'normalisation' of life with HIV.

    Science.gov (United States)

    Mazanderani, Fadhila; Paparini, Sara

    2015-04-01

    Since the earliest days of the HIV/AIDS epidemic, talking about the virus has been a key way affected communities have challenged the fear and discrimination directed against them and pressed for urgent medical and political attention. Today, HIV/AIDS is one of the most prolifically and intimately documented of all health conditions, with entrenched infrastructures, practices and technologies--what Vinh-Kim Nguyen has dubbed 'confessional technologies'--aimed at encouraging those affected to share their experiences. Among these technologies, we argue, is the semi-structured interview: the principal methodology used in qualitative social science research focused on patient experiences. Taking the performative nature of the research interview as a talking technology seriously has epistemological implications not merely for how we interpret interview data, but also for how we understand the role of research interviews in the enactment of 'life with HIV'. This paper focuses on one crucial aspect of this enactment: the contemporary 'normalisation' of HIV as 'just another' chronic condition--a process taking place at the level of individual subjectivities, social identities, clinical practices and global health policy, and of which social science research is a vital part. Through an analysis of 76 interviews conducted in London (2009-10), we examine tensions in the experiential narratives of individuals living with HIV in which life with the virus is framed as 'normal', yet where this 'normality' is beset with contradictions and ambiguities. Rather than viewing these as a reflection of resistances to or failures of the enactment of HIV as 'normal', we argue that, insofar as these contradictions are generated by the research interview as a distinct 'talking technology', they emerge as crucial to the normative (re)production of what counts as 'living with HIV' (in the UK) and are an inherent part of the broader performative 'normalisation' of the virus.

  10. Facilitating professional liaison in collaborative care for depression in UK primary care; a qualitative study utilising normalisation process theory.

    Science.gov (United States)

    Coupe, Nia; Anderson, Emma; Gask, Linda; Sykes, Paul; Richards, David A; Chew-Graham, Carolyn

    2014-05-01

    Collaborative care (CC) is an organisational framework which facilitates the delivery of a mental health intervention to patients by case managers in collaboration with more senior health professionals (supervisors and GPs), and is effective for the management of depression in primary care. However, there remains limited evidence on how to successfully implement this collaborative approach in UK primary care. This study aimed to explore to what extent CC impacts on professional working relationships, and whether CC for depression could be implemented as routine in the primary care setting. This qualitative study explored the perspectives of the 6 case managers (CMs), 5 supervisors (trial research team members) and 15 general practitioners (GPs) from practices participating in a randomised controlled trial of CC for depression. Interviews were transcribed verbatim and the data were analysed in two steps: an initial thematic analysis, followed by a secondary analysis using the Normalisation Process Theory concepts of coherence, cognitive participation, collective action and reflexive monitoring with respect to the implementation of CC in primary care. Supervisors and CMs demonstrated coherence in their understanding of CC, and consequently reported good levels of cognitive participation and collective action regarding delivering and supervising the intervention. GPs interviewed showed limited understanding of the CC framework, and reported limited collaboration with CMs: barriers to collaboration were identified. All participants identified the potential or experienced benefits of a collaborative approach to depression management and were able to discuss ways in which collaboration can be facilitated. Primary care professionals in this study valued the potential for collaboration, but GPs' understanding of CC and organisational barriers hindered opportunities for communication. Further work is needed to address these organisational barriers in order to facilitate collaboration between case managers and GPs.

  11. Implementing online consultations in primary care: a mixed-method evaluation extending normalisation process theory through service co-production.

    Science.gov (United States)

    Farr, Michelle; Banks, Jonathan; Edwards, Hannah B; Northstone, Kate; Bernard, Elly; Salisbury, Chris; Horwood, Jeremy

    2018-03-19

    To examine patient and staff views, experiences and acceptability of a UK primary care online consultation system and ask how the system and its implementation may be improved. Mixed-method evaluation of a primary care e-consultation system. Primary care practices in South West England. Qualitative interviews with 23 practice staff in six practices. Patient survey data for 756 e-consultations from 36 practices, with free-text survey comments from 512 patients, were analysed thematically. Anonymised patients' records were abstracted for 485 e-consultations from eight practices, including consultation types and outcomes. Descriptive statistics were used to analyse quantitative data. Analysis of implementation and the usage of the e-consultation system were informed by: (1) normalisation process theory, (2) a framework that illustrates how e-consultations were co-produced and (3) patient and staff touchpoints. We found different expectations between patients and staff on how to use e-consultations 'appropriately'. While some patients used the system to try and save time for themselves and their general practitioners (GPs), some used e-consultations when they could not get a timely face-to-face appointment. Most e-consultations resulted in either follow-on phone (32%) or face-to-face appointments (38%) and GPs felt that this duplicated their workload. Patient satisfaction with the system was high, but a minority were dissatisfied with practice communication about their e-consultation. Where both patients and staff interact with technology, it is in effect 'co-implemented'. How patients used e-consultations impacted on practice staff's experiences and appraisal of the system. Overall, the e-consultation system studied could improve access for some patients, but in its current form, it was not perceived by practices as creating sufficient efficiencies to warrant financial investment. We illustrate how this e-consultation system and its implementation can be improved.

  12. Analysis of a simulated microarray dataset: Comparison of methods for data normalisation and detection of differential expression (Open Access publication)

    Directory of Open Access Journals (Sweden)

    Mouzaki Daphné

    2007-11-01

    Full Text Available Abstract Microarrays allow researchers to measure the expression of thousands of genes in a single experiment. Before statistical comparisons can be made, the data must be assessed for quality and normalisation procedures must be applied, of which many have been proposed. Methods of comparing the normalised data are also abundant, and no clear consensus has yet been reached. The purpose of this paper was to compare those methods used by the EADGENE network on a very noisy simulated data set. With the a priori knowledge of which genes are differentially expressed, it is possible to compare the success of each approach quantitatively. Use of an intensity-dependent normalisation procedure was common, as was correction for multiple testing. Most variety in performance resulted from differing approaches to data quality and the use of different statistical tests. Very few of the methods used any kind of background correction. A number of approaches achieved a success rate of 95% or above, with relatively small numbers of false positives and negatives. Applying stringent spot selection criteria and elimination of data did not improve the false positive rate and greatly increased the false negative rate. However, most approaches performed well, and it is encouraging that widely available techniques can achieve such good results on a very noisy data set.
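
    The intensity-dependent normalisation that most teams applied is typically done on the MA scale: the smooth trend of the log-ratio M against the mean log-intensity A is estimated (for example by lowess) and subtracted. A generic Python sketch of that idea, not any specific team's pipeline:

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

def intensity_normalise(red, green, frac=0.3):
    """Intensity-dependent (MA/lowess) normalisation for two-channel
    arrays: compute M = log-ratio and A = mean log-intensity per spot,
    estimate the smooth trend of M against A with lowess, and subtract
    it, leaving intensity-independent normalised log-ratios."""
    m = np.log2(red) - np.log2(green)
    a = 0.5 * (np.log2(red) + np.log2(green))
    trend = lowess(m, a, frac=frac, return_sorted=False)
    return m - trend

# Toy two-channel data with a constant dye bias of 0.2 on the log2 scale.
rng = np.random.default_rng(1)
green = rng.lognormal(7.0, 1.0, 1000)
red = green * 2 ** (0.2 + rng.normal(0.0, 0.3, 1000))
print(intensity_normalise(red, green).mean())  # ~0 after normalisation
```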

  13. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    Science.gov (United States)

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.

  14. Repeated lysergic acid diethylamide in an animal model of depression: Normalisation of learning behaviour and hippocampal serotonin 5-HT2 signalling.

    Science.gov (United States)

    Buchborn, Tobias; Schröder, Helmut; Höllt, Volker; Grecksch, Gisela

    2014-06-01

    A re-balance of postsynaptic serotonin (5-HT) receptor signalling, with an increase in 5-HT1A and a decrease in 5-HT2A signalling, is a final common pathway multiple antidepressants share. Given that the 5-HT1A/2A agonist lysergic acid diethylamide (LSD), when repeatedly applied, selectively downregulates 5-HT2A, but not 5-HT1A receptors, one might expect LSD to similarly re-balance the postsynaptic 5-HT signalling. Challenging this idea, we use an animal model of depression specifically responding to repeated antidepressant treatment (olfactory bulbectomy), and test the antidepressant-like properties of repeated LSD treatment (0.13 mg/kg/d, 11 d). In line with former findings, we observe that bulbectomised rats show marked deficits in active avoidance learning. These deficits, similarly as we earlier noted with imipramine, are largely reversed by repeated LSD administration. Additionally, bulbectomised rats exhibit distinct anomalies of monoamine receptor signalling in hippocampus and/or frontal cortex; of these, only the hippocampal decrease in 5-HT2-related [35S]-GTP-gamma-S binding is normalised by LSD. Importantly, the sham-operated rats do not profit from LSD, and exhibit reduced hippocampal 5-HT2 signalling. As behavioural deficits after bulbectomy respond to agents classified as antidepressants only, we conclude that the effect of LSD in this model can be considered antidepressant-like, and discuss it in terms of a re-balance of hippocampal 5-HT2/5-HT1A signalling.

  15. Living under the influence: normalisation of alcohol consumption in our cities

    Directory of Open Access Journals (Sweden)

    Xisca Sureda

    2017-01-01

    Full Text Available Harmful use of alcohol is one of the world's leading health risks. A positive association between certain characteristics of the urban environment and individual alcohol consumption has been documented in previous research. When developing a tool to characterise the urban alcohol environment in the cities of Barcelona and Madrid, we observed that alcohol is ever present in our cities. Urban residents are constantly exposed to a wide variety of alcohol products, marketing and promotion and signs of alcohol consumption. In this field note, we reflect on the normalisation of alcohol in urban environments. We highlight the need for further research to better understand attitudes and practices in relation to alcohol consumption. This type of urban study is necessary to support policy interventions to prevent and control harmful alcohol use.

  16. ReadqPCR and NormqPCR: R packages for the reading, quality checking and normalisation of RT-qPCR quantification cycle (Cq) data

    Directory of Open Access Journals (Sweden)

    Perkins James R

    2012-07-01

    Full Text Available Abstract Background Measuring gene transcription using real-time reverse transcription polymerase chain reaction (RT-qPCR) technology is a mainstay of molecular biology. Technologies now exist to measure the abundance of many transcripts in parallel. The selection of the optimal reference gene for the normalisation of this data is a recurring problem, and several algorithms have been developed in order to solve it. So far nothing in R exists to unite these methods, together with other functions to read in and normalise the data using the chosen reference gene(s). Results We have developed two R/Bioconductor packages, ReadqPCR and NormqPCR, intended for a user with some experience with high-throughput data analysis using R, who wishes to use R to analyse RT-qPCR data. We illustrate their potential use in a workflow analysing a generic RT-qPCR experiment, and apply this to a real dataset. Packages are available from http://www.bioconductor.org/packages/release/bioc/html/ReadqPCR.html and http://www.bioconductor.org/packages/release/bioc/html/NormqPCR.html Conclusions These packages increase the repertoire of RT-qPCR analysis tools available to the R user and allow them to (amongst other things) read their data into R, hold it in an ExpressionSet-compatible R object, choose appropriate reference genes, normalise the data and look for differential expression between samples.

  17. Four weeks of near-normalisation of blood glucose improves the insulin response to glucagon-like peptide-1 and glucose-dependent insulinotropic polypeptide in patients with type 2 diabetes

    DEFF Research Database (Denmark)

    Højberg, P V; Vilsbøll, T; Rabøl, R

    2008-01-01

    of near-normalisation of the blood glucose level could improve insulin responses to GIP and GLP-1 in patients with type 2 diabetes. METHODS: Eight obese patients with type 2 diabetes with poor glycaemic control (HbA1c 8.6 ± 1.3%) were investigated before and after 4 weeks of near-normalisation of blood glucose (mean blood glucose 7.4 ± 1.2 mmol/l) using insulin treatment. Before and after insulin treatment the participants underwent three hyperglycaemic clamps (15 mmol/l) with infusion of GLP-1, GIP or saline. Insulin responses were evaluated as the incremental area under the plasma C-peptide curve. RESULTS: Before and after near-normalisation of blood glucose, the C-peptide responses did not differ during the early phase of insulin secretion (0-10 min). The late-phase C-peptide response (10-120 min) increased during GIP infusion from 33.0 ± 8.5 to 103.9 ± 24.2 (nmol/l) × (110 min)⁻¹...

  18. Quantitative seafloor characterization using angular backscatter data of the multi-beam echo-sounding system - Use of models and model free techniques

    Digital Repository Service at National Institute of Oceanography (India)

    Chakraborty, B.

    processing gain, bottom slope corrections, and bottom insonification area normalisation were proposed to generate angular backscattering strength for modelling to infer bottom roughness parameters. A software package (NORGCOR) for a similar purpose... bottom backscatter data from multibeam systems. For each seafloor area, processed backscatter strength values [presented in Fig. 1(c)] are binned at intervals of 1° from -45° to +45°, and averaged over the entire dataset (approximately 100...

  19. Attention training normalises combat-related post-traumatic stress disorder effects on emotional Stroop performance using lexically matched word lists.

    Science.gov (United States)

    Khanna, Maya M; Badura-Brack, Amy S; McDermott, Timothy J; Shepherd, Alex; Heinrichs-Graham, Elizabeth; Pine, Daniel S; Bar-Haim, Yair; Wilson, Tony W

    2015-08-26

    We examined two groups of combat veterans, one with post-traumatic stress disorder (PTSD) (n = 27) and another without PTSD (n = 16), using an emotional Stroop task (EST) with word lists matched across a series of lexical variables (e.g. length, frequency, neighbourhood size, etc.). Participants with PTSD exhibited a strong EST effect (longer colour-naming latencies for combat-relevant words as compared to neutral words). Veterans without PTSD produced no such effect, t  .37. Participants with PTSD then completed eight sessions of attention training (Attention Control Training or Attention Bias Modification Training) with a dot-probe task utilising threatening and neutral faces. After training, participants, especially those undergoing Attention Control Training, no longer produced longer colour-naming latencies for combat-related words as compared to other words, indicating normalised attention allocation processes after treatment.

  20. Calculation of normalised organ and effective doses to adult reference computational phantoms from contemporary computed tomography scanners

    International Nuclear Information System (INIS)

    Jansen, Jan T.M.; Shrimpton, Paul C.

    2010-01-01

    The general-purpose Monte Carlo radiation transport code MCNPX has been used to simulate photon transport and energy deposition in anthropomorphic phantoms due to the x-ray exposure from the Philips iCT 256 and Siemens Definition CT scanners, together with the previously studied General Electric 9800. The MCNPX code was compiled with the Intel FORTRAN compiler and run on a Linux PC cluster. A patch has been successfully applied to reduce computing times by about 4%. The International Commission on Radiological Protection (ICRP) has recently published the Adult Male (AM) and Adult Female (AF) reference computational voxel phantoms as successors to the Medical Internal Radiation Dose (MIRD) stylised hermaphrodite mathematical phantoms that form the basis for the widely-used ImPACT CT dosimetry tool. Comparisons of normalised organ and effective doses calculated for a range of scanner operating conditions have demonstrated significant differences in results (in excess of 30%) between the voxel and mathematical phantoms as a result of variations in anatomy. These analyses illustrate the significant influence of choice of phantom on normalised organ doses and the need for standardisation to facilitate comparisons of dose. Further such dose simulations are needed in order to update the ImPACT CT Patient Dosimetry spreadsheet for contemporary CT practice. (author)

  1. Living under the influence: normalisation of alcohol consumption in our cities.

    Science.gov (United States)

    Sureda, Xisca; Villalbí, Joan R; Espelt, Albert; Franco, Manuel

    Harmful use of alcohol is one of the world's leading health risks. A positive association between certain characteristics of the urban environment and individual alcohol consumption has been documented in previous research. When developing a tool characterising the urban environment of alcohol in the cities of Barcelona and Madrid, we observed that alcohol is ever-present in our cities. Urban residents are constantly exposed to a wide variety of alcohol products, marketing and promotion, and to signs of alcohol consumption. In this field note, we reflect on the normalisation of alcohol in urban environments. We highlight the need for further research to better understand attitudes and practices in relation to alcohol consumption. Urban studies of this type are necessary to support policy interventions to prevent and control harmful alcohol use. Copyright © 2016 SESPAS. Published by Elsevier España, S.L.U. All rights reserved.

  2. Normalised subband adaptive filtering with extended adaptiveness on degree of subband filters

    Science.gov (United States)

    Samuyelu, Bommu; Rajesh Kumar, Pullakura

    2017-12-01

    This paper proposes an adaptive normalised subband adaptive filtering (NSAF) scheme to improve NSAF performance. In the proposed NSAF, adaptiveness is extended beyond its variants in two ways: first, the step-size is made adaptive; second, the selection of subbands is made adaptive. Hence, the proposed NSAF is termed here the variable step-size-based NSAF with selected subbands (VS-SNSAF). Experimental investigations are carried out to demonstrate the performance (in terms of convergence) of the VS-SNSAF against the conventional NSAF and its state-of-the-art adaptive variants. The results report the superior performance of VS-SNSAF over the traditional NSAF and its variants. Its stability, robustness against noise and computational complexity are also analysed.
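
    The core of NSAF-type algorithms is a normalised gradient update. The following Python sketch shows the fullband special case (normalised LMS) applied to system identification; the subband analysis/synthesis filter banks and the paper's specific step-size and subband-selection rules are omitted, so the comment about adapting mu describes the general idea, not the authors' exact method, and the test system is hypothetical:

        import numpy as np

        def nlms(x, d, taps=16, mu=0.5, eps=1e-8):
            # Normalised LMS: the fullband special case of an NSAF-type update.
            w = np.zeros(taps)
            e = np.zeros(len(x))
            for n in range(taps - 1, len(x)):
                u = x[n - taps + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-taps+1]]
                e[n] = d[n] - w @ u
                # Dividing by the regressor energy makes the step scale-invariant;
                # in a variable step-size scheme, mu itself would additionally be
                # adapted over time.
                w += (mu / (eps + u @ u)) * e[n] * u
            return w, e

        rng = np.random.default_rng(0)
        h = rng.standard_normal(16) * 0.3          # unknown FIR system (hypothetical)
        x = rng.standard_normal(5000)              # excitation signal
        d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
        w, e = nlms(x, d)                          # w converges towards h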

  3. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... practices using INR POCT in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality....

  4. Good quality of oral anticoagulation treatment in general practice using international normalised ratio point of care testing

    DEFF Research Database (Denmark)

    Løkkegaard, Thomas; Pedersen, Tina Heidi; Lind, Bent

    2015-01-01

    INTRODUCTION: Oral anticoagulation treatment (OACT) with warfarin is common in general practice. Increasingly, international normalised ratio (INR) point of care testing (POCT) is being used to manage patients. The aim of this study was to describe and analyse the quality of OACT with warfarin...... in the management of patients in warfarin treatment provided good quality of care. Sampling interval and diagnostic coding were significantly correlated with treatment quality. FUNDING: The study received financial support from the Sarah Krabbe Foundation, the General Practitioners’ Education and Development Foundation...

  5. Identifying Stable Reference Genes for qRT-PCR Normalisation in Gene Expression Studies of Narrow-Leafed Lupin (Lupinus angustifolius L.).

    Directory of Open Access Journals (Sweden)

    Candy M Taylor

    Full Text Available Quantitative Reverse Transcription PCR (qRT-PCR) is currently one of the most popular, high-throughput and sensitive technologies available for quantifying gene expression. Its accurate application depends heavily upon normalisation of gene-of-interest data with reference genes that are uniformly expressed under experimental conditions. The aim of this study was to provide the first validation of reference genes for Lupinus angustifolius (narrow-leafed lupin), a significant grain legume crop, using a selection of seven genes previously trialled as reference genes for the model legume, Medicago truncatula. In a preliminary evaluation, the seven candidate reference genes were assessed on the basis of primer specificity for their respective targeted region, PCR amplification efficiency, and ability to discriminate between cDNA and gDNA. Following this assessment, expression of the three most promising candidates [Ubiquitin C (UBC), Helicase (HEL), and Polypyrimidine tract-binding protein (PTB)] was evaluated using the NormFinder and RefFinder statistical algorithms in two narrow-leafed lupin lines, both with and without vernalisation treatment, and across seven organ types (cotyledons, stem, leaves, shoot apical meristem, flowers, pods and roots) encompassing three developmental stages. UBC was consistently identified as the most stable candidate and has sufficiently uniform expression that it may be used as a sole reference gene under the experimental conditions tested here. However, as organ type and developmental stage were associated with greater variability in relative expression, it is recommended that UBC and HEL be used as a pair to achieve optimal normalisation. These results highlight the importance of rigorously assessing candidate reference genes for each species across a diverse range of organs and developmental stages. With emerging technologies, such as RNAseq, and the completion of valuable transcriptome data sets, it is possible that other

  6. Optimisation of hardness and tensile strength of friction stir welded ...

    African Journals Online (AJOL)

    DR OKE

    adopted to develop a mathematical model relating the response and the process parameters. .... Table 3 Normalised values and Deviational Sequence ... If the expectancy is smaller-the-better, then the original sequence should be normalised ...

  7. Inference of financial networks using the normalised mutual information rate

    Science.gov (United States)

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics. PMID:29420644

  8. Inference of financial networks using the normalised mutual information rate.

    Science.gov (United States)

    Goh, Yong Kheng; Hasim, Haslifah M; Antonopoulos, Chris G

    2018-01-01

    In this paper, we study data from financial markets, using the normalised Mutual Information Rate. We show how to use it to infer the underlying network structure of interrelations in the foreign currency exchange rates and stock indices of 15 currency areas. We first present the mathematical method and discuss its computational aspects, and apply it to artificial data from chaotic dynamics and to correlated normal-variates data. We then apply the method to infer the structure of the financial system from the time-series of currency exchange rates and stock indices. In particular, we study and reveal the interrelations among the various foreign currency exchange rates and stock indices in two separate networks, of which we also study their structural properties. Our results show that both inferred networks are small-world networks, sharing similar properties and having differences in terms of assortativity. Importantly, our work shows that global economies tend to connect with other economies world-wide, rather than creating small groups of local economies. Finally, the consistent interrelations depicted among the 15 currency areas are further supported by a discussion from the viewpoint of economics.
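
    As a rough illustration of the approach described above, the following Python sketch computes a pairwise normalised mutual information matrix from return series and thresholds it into a network. Note that the paper uses the mutual information *rate*, which additionally involves entropy-rate estimation; a plain histogram-based NMI on synthetic data, as here, is a simplification:

        import numpy as np

        def normalised_mi(x, y, bins=8):
            # Histogram estimate of mutual information, normalised to [0, 1].
            cxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = cxy / cxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz]))
            hx = -np.sum(px[px > 0] * np.log(px[px > 0]))
            hy = -np.sum(py[py > 0] * np.log(py[py > 0]))
            return mi / np.sqrt(hx * hy)           # one common normalisation

        # Hypothetical daily log-returns for 15 currency areas (T x N matrix).
        rng = np.random.default_rng(1)
        returns = rng.standard_normal((1000, 15))

        n = returns.shape[1]
        nmi = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                nmi[i, j] = nmi[j, i] = normalised_mi(returns[:, i], returns[:, j])

        # Keep only the strongest pairwise dependencies as network edges.
        adjacency = nmi > np.quantile(nmi[nmi > 0], 0.8)

    The structural properties discussed in the abstract (small-worldness, assortativity) would then be computed on the resulting adjacency matrix.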

  9. E-IMPACT - A ROBUST HAZARD-BASED ENVIRONMENTAL IMPACT ASSESSMENT APPROACH FOR PROCESS INDUSTRIES

    Directory of Open Access Journals (Sweden)

    KHANDOKER A. HOSSAIN

    2008-04-01

    Full Text Available This paper proposes a hazard-based environmental impact assessment approach (E-Impact) for evaluating the environmental impact during process design and retrofit stages. E-Impact replaces the normalisation step of the conventional impact assessment phase. This approach compares the impact scores for different options and assigns a relative score to each option. This eliminates the complexity of the normalisation step in the evaluation phase. The applicability of E-Impact has been illustrated through a case study of solvent selection in an acrylic acid manufacturing plant. E-Impact is used in conjunction with the Aspen-HYSYS process simulator to develop mass and heat balance data.

  10. Normalisation in product life cycle assessment: an LCA of the global and European economic systems in the year 2000.

    Science.gov (United States)

    Sleeswijk, Anneke Wegener; van Oers, Lauran F C M; Guinée, Jeroen B; Struijs, Jaap; Huijbregts, Mark A J

    2008-02-01

    In the methodological context of the interpretation of environmental life cycle assessment (LCA) results, a normalisation study was performed. Fifteen impact categories were accounted for, including climate change, acidification, eutrophication, human toxicity, ecotoxicity, depletion of fossil energy resources, and land use. The year 2000 was chosen as a reference year, and information was gathered on two spatial levels: the global and the European level. Of the 860 environmental interventions collected, 48 turned out to account for at least 75% of the impact scores of all impact categories. All non-toxicity-related, emission-dependent impacts are fully dominated by the bulk emissions of only 10 substances or substance groups: CO2, CH4, SO2, NOx, NH3, PM10, NMVOC and (H)CFC emissions to air, and emissions of N- and P-compounds to fresh water. For the toxicity-related emissions (pesticides, organics, metal compounds and some specific inorganics), the availability of information was still very limited, leading to large uncertainty in the corresponding normalisation factors. Apart from their usefulness as a reference for LCA studies, the results of this study stress the importance of efficient measures to combat bulk emissions and to promote the registration of potentially toxic emissions on a more comprehensive scale.
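
    The normalisation step itself is arithmetically simple: each characterised impact score is divided by the corresponding impact of a reference system, here hypothetical global totals for the reference year, as in this minimal Python sketch:

        # Hypothetical characterised impact scores for one product system.
        impact_scores = {"climate change": 420.0,        # kg CO2-eq
                         "acidification": 1.8,           # kg SO2-eq
                         "eutrophication": 0.6}          # kg PO4-eq

        # Hypothetical reference totals (e.g. global impacts in year 2000).
        reference_totals = {"climate change": 4.2e13,
                            "acidification": 3.1e11,
                            "eutrophication": 1.6e11}

        # Normalisation: express each category result as a fraction of the
        # reference system's total impact in that category.
        normalised = {cat: impact_scores[cat] / reference_totals[cat]
                      for cat in impact_scores}

    The resulting dimensionless scores make otherwise incommensurable impact categories comparable, which is exactly why uncertain normalisation factors (as for the toxicity categories above) matter so much.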

  11. No upward trend in normalised windstorm losses in Europe: 1970-2008

    Science.gov (United States)

    Barredo, J. I.

    2010-01-01

    On 18 January 2007, windstorm Kyrill battered Europe with hurricane-force winds, killing 47 people and causing US$10 billion in damage. Kyrill poses several questions: is Kyrill an isolated or exceptional case? Have there been events costing as much in the past? This paper attempts to put Kyrill into an historical context by examining large historical windstorm event losses in Europe for the period 1970-2008 across 29 European countries. It asks what economic losses these historical events would cause if they were to recur under 2008 societal conditions. Loss data were sourced from reinsurance firms and augmented with historical reports, peer-reviewed articles and other ancillary sources. Following the same conceptual approach outlined in previous studies, the data were then adjusted for changes in population, wealth, and inflation at the country level, and for inter-country price differences using purchasing power parity. The analyses reveal no trend in the normalised windstorm losses and confirm that increasing disaster losses are driven by societal factors and increasing exposure.
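
    Conceptually, the normalisation described amounts to scaling each historical loss by the growth in exposure since the event, as in the following Python sketch. The adjustment factors here are hypothetical; the study derives them from country-level population, wealth and inflation data, with purchasing power parity used for inter-country comparisons:

        def normalise_loss(loss_event_year, cpi_ratio, wealth_ratio, pop_ratio):
            # Express a historical loss under present-day societal conditions.
            return loss_event_year * cpi_ratio * wealth_ratio * pop_ratio

        # Hypothetical example: a 1990 windstorm loss of 2.0 billion (1990 USD),
        # brought to 2008 conditions.
        loss_2008 = normalise_loss(
            loss_event_year=2.0e9,
            cpi_ratio=1.64,      # inflation 1990 -> 2008 (hypothetical)
            wealth_ratio=1.35,   # growth in per-capita wealth (hypothetical)
            pop_ratio=1.10,      # population growth in affected area (hypothetical)
        )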

  12. Microstructural characterisation of a P91 steel normalised and tempered at different temperatures

    International Nuclear Information System (INIS)

    Hurtado-Norena, C.; Danon, C.A.; Luppo, M.I.; Bruzzoni, P.

    2015-01-01

    9%Cr-1%Mo martensitic-ferritic steels are used in power plant components with operating temperatures of around 600 deg. C because of their good mechanical properties at high temperature as well as good oxidation resistance. These steels are generally used in the normalised and tempered condition. This treatment results in a structure of tempered lath martensite where the precipitates are distributed along the lath interfaces and within the martensite laths. The characterisation of these precipitates is of fundamental importance because of their relationship with the creep behaviour of these steels in service. In the present work, the different types of precipitates found in these steels have been studied on specimens in different metallurgical conditions. The techniques used in this investigation were X-ray diffraction with synchrotron light, scanning electron microscopy, energy dispersive microanalysis and transmission electron microscopy. (authors)

  13. Bayesian analysis of Markov point processes

    DEFF Research Database (Denmark)

    Berthelsen, Kasper Klitgaard; Møller, Jesper

    2006-01-01

    Recently Møller, Pettitt, Berthelsen and Reeves introduced a new MCMC methodology for drawing samples from a posterior distribution when the likelihood function is only specified up to a normalising constant. We illustrate the method in the setting of Bayesian inference for Markov point processes...... a partially ordered Markov point process as the auxiliary variable. As the method requires simulation from the "unknown" likelihood, perfect simulation algorithms for spatial point processes become useful....
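
    The auxiliary-variable idea can be sketched on a toy model where exact simulation from the likelihood is easy. The following Python sketch is not the paper's point-process setting, but it shows how drawing an auxiliary dataset at the proposed parameter lets the unknown normalising constants cancel from the Metropolis-Hastings acceptance ratio:

        import numpy as np

        rng = np.random.default_rng(2)

        def log_f_tilde(y, theta):
            # Unnormalised log-likelihood: f(y|theta) proportional to exp(-theta*sum(y)).
            return -theta * np.sum(y)

        def exchange_sampler(y, n_iter=5000, theta0=1.0, step=0.3):
            # Flat prior on theta > 0 and a symmetric random-walk proposal.
            theta, samples = theta0, []
            for _ in range(n_iter):
                theta_prop = theta + step * rng.standard_normal()
                if theta_prop > 0:
                    # Auxiliary data simulated exactly from the model at theta_prop
                    # (Exponential here; perfect simulation in the point-process case).
                    x = rng.exponential(1.0 / theta_prop, size=len(y))
                    # The normalising constants Z(theta), Z(theta_prop) cancel here.
                    log_a = (log_f_tilde(y, theta_prop) - log_f_tilde(y, theta)
                             + log_f_tilde(x, theta) - log_f_tilde(x, theta_prop))
                    if np.log(rng.uniform()) < log_a:
                        theta = theta_prop
                samples.append(theta)
            return np.array(samples)

        y_obs = rng.exponential(1.0 / 2.5, size=50)   # data with true theta = 2.5
        posterior_draws = exchange_sampler(y_obs)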

  14. The moral experience of illness and its impact on normalisation: Examples from narratives with Punjabi women living with rheumatoid arthritis in the UK.

    Science.gov (United States)

    Sanderson, Tessa; Calnan, Michael; Kumar, Kanta

    2015-11-01

    The moral component of living with illness has been neglected in analyses of long-term illness experiences. This article attempts to fill this gap by exploring the role of the moral experience of illness in mediating the ability of those living with a long-term condition (LTC) to normalise. This is explored through an empirical study of women of Punjabi origin living with rheumatoid arthritis (RA) in the UK. Sixteen informants were recruited through three hospitals in UK cities and interviews conducted and analysed using a grounded theory approach. The intersection between moral experience and normalisation, within the broader context of ethnic, gender and socioeconomic influences, was evident in the following: disruption of a core lived value (the centrality of family duty), beliefs about illness causation affecting informants' 'moral career', and perceived discrimination in the workplace. The data illustrate the importance of considering an ethnic community's specific values and beliefs when understanding differences in adapting to LTCs and changing identities. © 2015 Foundation for the Sociology of Health & Illness.

  15. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Science.gov (United States)

    Rostami, Paryaneh; Ashcroft, Darren M; Tully, Mary P

    2018-01-01

    Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however, a number of

  16. A formative evaluation of the implementation of a medication safety data collection tool in English healthcare settings: A qualitative interview study using normalisation process theory.

    Directory of Open Access Journals (Sweden)

    Paryaneh Rostami

    Full Text Available Reducing medication-related harm is a global priority; however, impetus for improvement is impeded as routine medication safety data are seldom available. Therefore, the Medication Safety Thermometer was developed within England's National Health Service. This study aimed to explore the implementation of the tool into routine practice from users' perspectives. Fifteen semi-structured interviews were conducted with purposely sampled National Health Service staff from primary and secondary care settings. Interview data were analysed using an initial thematic analysis, and subsequent analysis using Normalisation Process Theory. Secondary care staff understood that the Medication Safety Thermometer's purpose was to measure medication safety and improvement. However, other uses were reported, such as pinpointing poor practice. Confusion about its purpose existed in primary care, despite further training, suggesting unsuitability of the tool. Decreased engagement was displayed by staff less involved with medication use, who displayed less ownership. Nonetheless, these advocates often lacked support from management and frontline levels, leading to an overall lack of engagement. Many participants reported efforts to drive scale-up of the use of the tool, for example, by securing funding, despite uncertainty around how to use data. Successful improvement was often at ward-level and went unrecognised within the wider organisation. There was mixed feedback regarding the value of the tool, often due to a perceived lack of "capacity". However, participants demonstrated interest in learning how to use their data and unexpected applications of data were reported. Routine medication safety data collection is complex, but achievable and facilitates improvements. However, collected data must be analysed, understood and used for further work to achieve improvement, which often does not happen. The national roll-out of the tool has accelerated shared learning; however

  17. Impact of particle density and initial volume on mathematical compression models

    DEFF Research Database (Denmark)

    Sonnergaard, Jørn

    2000-01-01

    In the calculation of the coefficients of compression models for powders either the initial volume or the particle density is introduced as a normalising factor. The influence of these normalising factors is, however, widely different on coefficients derived from the Kawakita, Walker and Heckel...... equations. The problems are illustrated by investigations on compaction profiles of 17 materials with different molecular structures and particle densities. It is shown that the particle density of materials with covalent bonds in the Heckel model acts as a key parameter with a dominating influence...
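
    For instance, in the Heckel model the particle density enters as the normalising factor that converts bulk density into the relative density D. The following Python sketch fits the Heckel equation ln(1/(1-D)) = kP + A to hypothetical compaction data:

        import numpy as np

        # Heckel model: ln(1 / (1 - D)) = k * P + A, where the relative
        # density D normalises bulk density by the particle (true) density.
        pressure = np.array([25., 50., 100., 150., 200.])      # MPa (hypothetical)
        bulk_density = np.array([0.9, 1.05, 1.2, 1.28, 1.33])  # g/cm3 (hypothetical)
        particle_density = 1.5                                 # g/cm3 (hypothetical)

        D = bulk_density / particle_density        # relative density, 0 < D < 1
        heckel = np.log(1.0 / (1.0 - D))

        # Linear fit: the slope k relates to the mean yield pressure (P_y = 1/k).
        k, A = np.polyfit(pressure, heckel, 1)
        print(f"Heckel slope k = {k:.4f} 1/MPa, mean yield pressure = {1/k:.1f} MPa")

    Because the particle density sits inside the logarithm, any error in it propagates non-linearly into the fitted coefficients, which is consistent with the dominating influence reported in the abstract.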

  18. Rational parametrisation of normalised Stiefel manifolds, and explicit non-'t Hooft solutions of the Atiyah-Drinfeld-Hitchin-Manin instanton matrix equations for Sp(n)

    International Nuclear Information System (INIS)

    McCarthy, P.J.

    1981-01-01

    It is proved that normalised Stiefel manifolds admit a rational parametrisation which generalises Cayley's parametrisation of the unitary groups. Applying (the quaternionic case of) this parametrisation to the Atiyah-Drinfeld-Hitchin-Manin (ADHM) instanton matrix equations, large families of new explicit rational solutions emerge. In particular, new explicit non-'t Hooft solutions are presented. (orig.)

  19. Normalised Mutual Information of High-Density Surface Electromyography during Muscle Fatigue

    Directory of Open Access Journals (Sweden)

    Adrian Bingham

    2017-12-01

    Full Text Available This study has developed a technique for identifying the presence of muscle fatigue based on the spatial changes of the normalised mutual information (NMI) between multiple high-density surface electromyography (HD-sEMG) channels. Muscle fatigue in the tibialis anterior (TA) during isometric contractions at 40% and 80% maximum voluntary contraction levels was investigated in ten healthy participants (age range: 21 to 35 years; mean age = 26 years; male = 4, female = 6). HD-sEMG was used to record 64 channels of sEMG using a 16 by 4 electrode array placed over the TA. The NMI of each electrode with every other electrode was calculated to form an NMI distribution for each electrode. The total NMI for each electrode (the summation of the electrode’s NMI distribution) highlighted regions of high dependence in the electrode array and was observed to increase as the muscle fatigued. To summarise this increase, a function, M(k), was defined and was found to be significantly affected by fatigue and not by contraction force. The technique discussed in this study has overcome issues regarding electrode placement and was used to investigate how the dependences between sEMG signals within the same muscle change spatially during fatigue.

  20. Aberrant brain responses to emotionally valent words is normalised after cognitive behavioural therapy in female depressed adolescents.

    Science.gov (United States)

    Chuang, Jie-Yu; J Whitaker, Kirstie; Murray, Graham K; Elliott, Rebecca; Hagan, Cindy C; Graham, Julia Me; Ooi, Cinly; Tait, Roger; Holt, Rosemary J; van Nieuwenhuizen, Adrienne O; Reynolds, Shirley; Wilkinson, Paul O; Bullmore, Edward T; Lennox, Belinda R; Sahakian, Barbara J; Goodyer, Ian; Suckling, John

    2016-01-01

    Depression in adolescence is debilitating with high recurrence in adulthood, yet its pathophysiological mechanism remains enigmatic. To examine the interaction between emotion, cognition and treatment, functional brain responses to sad and happy distractors in an affective go/no-go task were explored before and after Cognitive Behavioural Therapy (CBT) in depressed female adolescents and healthy participants. Eighty-two depressed and 24 healthy female adolescents, aged 12-17 years, performed a functional magnetic resonance imaging (fMRI) affective go/no-go task at baseline. Participants were instructed to withhold their responses upon seeing happy or sad words. Among these participants, 13 patients had CBT over approximately 30 weeks. These participants and 20 matched controls then repeated the task. At baseline, increased activation in response to happy relative to neutral distractors was observed in the orbitofrontal cortex in depressed patients, which was normalised after CBT. No significant group differences were found behaviourally or in brain activation in response to sad distractors. Improvements in symptoms (mean: 9.31, 95% CI: 5.35-13.27) were related at trend-level to activation changes in orbitofrontal cortex. In the follow-up section, a limited number of post-CBT patients were recruited. To our knowledge, this is the first fMRI study addressing the effect of CBT in adolescent depression. Although a bias toward negative information is widely accepted as a hallmark of depression, aberrant brain hyperactivity to positive distractors was found and normalised after CBT. Research, assessment and treatment focused on positive stimuli could be a future consideration. Moreover, a pathophysiological mechanism distinct from adult depression may be suggested and awaits further exploration. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  1. Does normalisation improve the diagnostic performance of apparent diffusion coefficient values for prostate cancer assessment? A blinded independent-observer evaluation

    International Nuclear Information System (INIS)

    Rosenkrantz, A.B.; Khalef, V.; Xu, W.; Babb, J.S.; Taneja, S.S.; Doshi, A.M.

    2015-01-01

    Aim: To evaluate the performance of normalised apparent diffusion coefficient (ADC) values for prostate cancer assessment when performed by independent observers blinded to histopathology findings. Materials and methods: Fifty-eight patients undergoing 3 T phased-array coil magnetic resonance imaging (MRI), including diffusion-weighted imaging (DWI; maximal b-value 1000 s/mm2), before prostatectomy were included. Two radiologists independently evaluated the images, unaware of the histopathology findings. Regions of interest (ROIs) were drawn within areas showing visually low ADC within the peripheral zone (PZ) and transition zone (TZ) bilaterally. ROIs were also placed within regions in both lobes not suspicious for tumour, allowing computation of normalised ADC (nADC) ratios between suspicious and non-suspicious regions. The diagnostic performance of ADC and nADC were compared. Results: For PZ tumour detection, ADC achieved significantly higher area under the receiver operating characteristic curve (AUC; p=0.026) and specificity (p=0.021) than nADC for reader 1, and significantly higher AUC (p=0.025) than nADC for reader 2. For TZ tumour detection, nADC achieved significantly higher specificity (p=0.003) and accuracy (p=0.004) than ADC for reader 2. For PZ Gleason score >3+3 tumour detection, ADC achieved significantly higher AUC (p=0.003) and specificity (p=0.005) than nADC for reader 1, and significantly higher AUC (p=0.023) than nADC for reader 2. For TZ Gleason score >3+3 tumour detection, ADC achieved significantly higher specificity (p=0.019) than nADC for reader 1. Conclusion: In contrast to prior studies performing unblinded evaluations, ADC was observed to outperform nADC overall for two independent observers blinded to the histopathology findings. Therefore, although strategies to improve the utility of ADC measurements in prostate cancer assessment merit continued investigation, caution is warranted when applying normalisation to improve diagnostic

  2. Process modelling on a canonical basis [Process modelling; Canonical modelling]

    Energy Technology Data Exchange (ETDEWEB)

    Siepmann, Volker

    2006-12-20

    Based on an equation-oriented solving strategy, this thesis investigates a new approach to process modelling. Homogeneous thermodynamic state functions represent consistent mathematical models of thermodynamic properties. Such state functions of solely extensive canonical state variables are the basis of this work, as they are natural objective functions in optimisation nodes to calculate thermodynamic equilibrium regarding phase interaction and chemical reactions. Analytical state function derivatives are utilised within the solution process as well as interpreted as physical properties. By this approach, only a limited range of imaginable process constraints is considered, namely linear balance equations of state variables. A second-order update of source contributions to these balance equations is obtained by an additional constitutive equation system. These equations are generally dependent on state variables and first-order sensitivities, and therefore cover practically all potential process constraints. Symbolic computation technology efficiently provides sparsity and derivative information of active equations to avoid performance problems regarding robustness and computational effort. A benefit of detaching the constitutive equation system is that the structure of the main equation system remains unaffected by these constraints, and a priori information allows an efficient solving strategy and a concise error diagnosis to be implemented. A tailor-made linear algebra library handles the sparse recursive block structures efficiently. The optimisation principle for single modules of thermodynamic equilibrium is extended to host entire process models. State variables of different modules interact through balance equations, representing material flows from one module to the other. To account for reusability and encapsulation of process module details, modular process modelling is supported by a recursive module structure. The second-order solving algorithm makes it

  3. A comprehensive, multi-level investigation of the implementation of a novel digital substance misuse intervention, Breaking Free Online: conceptualising implementation processes within services using the MRC framework and health psychology theory.

    Directory of Open Access Journals (Sweden)

    Stephanie Dugdale

    2015-08-01

    Methods: Semi-structured qualitative interviews were conducted with staff, peer mentors and service users to investigate initial diffusion and subsequent normalisation of BFO within CRI, and the impact of BFO on peer mentors’ own substance misuse recovery journeys. Thematic analyses were conducted, and models derived from health psychology and implementation science were used to conceptualise implementation processes from an organisational level. Further analyses using Interpretative Phenomenological Analysis (IPA) expanded the investigation down to the level of individual people within CRI, and the role of peer mentors delivering the programme within the organisation. Results: ‘Diffusion of innovation theory’ conceptualised initial implementation and diffusion of BFO throughout CRI. Although there were perceived barriers to implementation, such as lack of IT equipment, anxieties around staff and service user IT skills, and the impact on staff’s professional roles, intentions to continue using BFO were reported. Analyses investigating continued implementation processes of the programme used ‘normalisation process theory’ to demonstrate how a process of normalisation of the programme is underway following initial diffusion. Findings suggested that staff were beginning to take greater ‘ownership’ of BFO since it was initially introduced into the organisation, and that the programme was influencing changes to work-role responsibilities in delivering BFO. Data using the ‘trans-theoretical model’ also indicated that peer mentors benefited from implementing BFO to support others and assist their own recovery maintenance. Conclusion: Whilst the principal focus must always be on establishing clinical effectiveness when developing and evaluating complex behaviour change interventions, such as digital interventions, implementation process analysis is also key. This analysis is important in order for interventions to be translated into real-world outcomes, as without

  4. Symmorphosis through dietary regulation: a combinatorial role for proteolysis, autophagy and protein synthesis in normalising muscle metabolism and function of hypertrophic mice after acute starvation.

    Directory of Open Access Journals (Sweden)

    Henry Collins-Hooper

    Full Text Available Animals are imbued with adaptive mechanisms spanning from the tissue/organ to the cellular scale which ensure that processes of homeostasis are preserved in the landscape of size change. However, we and others have postulated that the degree of adaptation is limited and that once outside the normal range of size fluctuations, cells and tissues function in an aberrant manner. In this study we examine the function of muscle in the myostatin null mouse, which is an excellent model for hypertrophy beyond levels of normal growth, and the consequences of acute starvation to restore mass. We show that muscle growth is sustained through protein synthesis driven by Serum/Glucocorticoid Kinase 1 (SGK1) rather than Akt1. Furthermore, our metabonomic profiling of hypertrophic muscle shows that carbon from nutrient sources is being channelled for the production of biomass rather than ATP production. However, the muscle displays elevated levels of autophagy and decreased levels of muscle tension. We demonstrate that the myostatin null muscle is acutely sensitive to changes in diet and activates both the proteolytic and autophagy programmes, shutting down protein synthesis more extensively than is the case for wild-types. Poignantly, we show that acute starvation, which is detrimental to wild-type animals, is beneficial in terms of metabolism and muscle function in the myostatin null mice by normalising tension production.

  5. Analysis of experimental data: The average shape of extreme wave forces on monopile foundations and the NewForce model

    DEFF Research Database (Denmark)

    Schløer, Signe; Bredmose, Henrik; Ghadirian, Amin

    2017-01-01

    Experiments with a stiff pile subjected to extreme wave forces typical of offshore wind farm storm conditions are considered. The exceedance probability curves of the nondimensional force peaks and crest heights are analysed. The average force time histories, normalised with their peak values, are compared across the sea states. It is found that the force shapes show a clear similarity when grouped after the values of the normalised peak force F/(ρghR²), normalised depth h/(gTp²), and presented in a normalised time scale t/Ta. For the largest force events, slamming can be seen as a distinct ‘hat’... to the average shapes. For more nonlinear wave shapes, higher-order terms have to be considered in order for the NewForce model to be able to predict the expected shapes.

  6. Modelling and Control of TCV

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, A.S.; Limebeer, D.J.N.; Jaimoukha, I.M.; Lister, J.B

    2001-11-01

    A new approach to the modelling and control of tokamak fusion reactors is presented. A nonlinear model is derived using the classical arguments of Hamiltonian mechanics and a low-order linear model is derived from it. The modelling process used here addresses flux and energy conservation issues explicitly and self-consistently. The model is of particular value, because it shows the relationship between the initial modelling assumptions and the resulting predictions. The mechanisms behind the creation of uncontrollable modes in tokamak models are discussed. A normalised coprime factorisation controller is developed for the TCV tokamak using the verified linear model. Recent theory is applied to reduce the controller order significantly whilst guaranteeing a priori bounds on the robust stability and performance. The controller is shown to track successfully reference signals that dictate the plasma's shape, position and current. The tests used to verify this were carried out on linear and nonlinear models. (author)

  7. Modelling and Control of TCV

    International Nuclear Information System (INIS)

    Sharma, A.S.; Limebeer, D.J.N.; Jaimoukha, I.M.; Lister, J.B.

    2001-11-01

    A new approach to the modelling and control of tokamak fusion reactors is presented. A nonlinear model is derived using the classical arguments of Hamiltonian mechanics and a low-order linear model is derived from it. The modelling process used here addresses flux and energy conservation issues explicitly and self-consistently. The model is of particular value, because it shows the relationship between the initial modelling assumptions and the resulting predictions. The mechanisms behind the creation of uncontrollable modes in tokamak models are discussed. A normalised coprime factorisation controller is developed for the TCV tokamak using the verified linear model. Recent theory is applied to reduce the controller order significantly whilst guaranteeing a priori bounds on the robust stability and performance. The controller is shown to track successfully reference signals that dictate the plasma's shape, position and current. The tests used to verify this were carried out on linear and nonlinear models. (author)

  8. The normalisation of terror: the response of Israel's stock market to long periods of terrorism.

    Science.gov (United States)

    Peleg, Kobi; Regens, James L; Gunter, James T; Jaffe, Dena H

    2011-01-01

    Man-made disasters such as acts of terrorism may affect a society's resiliency and sensitivity to prolonged physical and psychological stress. The Israeli Tel Aviv stock market TA-100 Index was used as an indicator of reactivity to suicide terror bombings. After accounting for factors such as world market changes and attack severity and intensity, the analysis reveals that although Israel's financial base remained sensitive to each act of terror across the entire period of the Second Intifada (2000-06), sustained psychological resilience was indicated with no apparent overall market shift. In other words, we saw a 'normalisation of terror' following an extended period of continued suicide bombings. The results suggest that investors responded to less transitory global market forces, indicating sustained resilience and long-term market confidence. Future studies directly measuring investor expectations and reactions to man-made disasters, such as terrorism, are warranted. © 2011 The Author(s). Disasters © Overseas Development Institute, 2011.

  9. Modeling styles in business process modeling

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Zugal, S.; Weber, B.; Weidlich, M.; Fahland, D.; Reijers, H.A.; Mendling, J.; Bider, I.; Halpin, T.; Krogstie, J.; Nurcan, S.; Proper, E.; Schmidt, R.; Soffer, P.; Wrycza, S.

    2012-01-01

    Research on quality issues of business process models has recently begun to explore the process of creating process models. As a consequence, the question arises whether different ways of creating process models exist. In this vein, we observed 115 students engaged in the act of modeling, recording

  10. Process modeling style

    CERN Document Server

    Long, John

    2014-01-01

    Process Modeling Style focuses on aspects of process modeling beyond notation that are very important to practitioners. Many people who model processes focus on the specific notation used to create their drawings. While that is important, there are many other aspects to modeling, such as naming, creating identifiers, descriptions, interfaces, patterns, and creating useful process documentation. Experienced author John Long focuses on those non-notational aspects of modeling, which practitioners will find invaluable. Gives solid advice for creating roles, work produ

  11. Clinical, immunological and treatment-related factors associated with normalised CD4+/CD8+ T-cell ratio: effect of naïve and memory T-cell subsets.

    LENUS (Irish Health Repository)

    Tinago, Willard

    2014-01-01

    Although effective antiretroviral therapy (ART) increases CD4+ T-cell count, responses to ART vary considerably and only a minority of patients normalise their CD4+/CD8+ ratio. Although retention of naïve CD4+ T-cells is thought to predict better immune responses, relationships between CD4+ and CD8+ T-cell subsets and CD4+/CD8+ ratio have not been well described.

  12. Markov Chain Modelling for Short-Term NDVI Time Series Forecasting

    Directory of Open Access Journals (Sweden)

    Stepčenko Artūrs

    2016-12-01

    Full Text Available In this paper, an NDVI time series forecasting model has been developed based on the use of a discrete-time, continuous-state Markov chain of suitable order. The normalised difference vegetation index (NDVI) is an indicator that describes the amount of chlorophyll (the green mass) and shows the relative density and health of vegetation; therefore, it is an important variable for vegetation forecasting. A Markov chain is a stochastic process defined on a state space, undergoing transitions from one state to another with certain probabilities. A Markov chain forecast model is flexible in accommodating various forecast assumptions and structures. The present paper discusses the considerations and techniques in building a Markov chain forecast model at each step. The continuous-state Markov chain model is analytically described. Finally, the application of the proposed Markov chain model is illustrated with reference to a set of NDVI time series data.
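
    A minimal discrete-state version of such a forecast model can be sketched in a few lines of Python. The paper works with a continuous state space, so binning the NDVI values, as done here, is a simplification, and the series itself is synthetic:

        import numpy as np

        rng = np.random.default_rng(3)
        ndvi = np.clip(0.5 + 0.2 * np.sin(np.arange(200) / 8.0)
                       + 0.05 * rng.standard_normal(200), 0, 1)  # hypothetical series

        # Discretise NDVI values into k states.
        k = 10
        edges = np.linspace(ndvi.min(), ndvi.max(), k + 1)
        states = np.clip(np.digitize(ndvi, edges) - 1, 0, k - 1)

        # Estimate the first-order transition matrix from observed state pairs.
        P = np.ones((k, k)) * 1e-6                 # tiny prior avoids zero rows
        for s, s_next in zip(states[:-1], states[1:]):
            P[s, s_next] += 1
        P /= P.sum(axis=1, keepdims=True)

        # One-step forecast: expected bin centre given the current state.
        centres = (edges[:-1] + edges[1:]) / 2
        forecast = P[states[-1]] @ centres
        print(f"next NDVI forecast: {forecast:.3f}")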

  13. Transforming Collaborative Process Models into Interface Process Models by Applying an MDA Approach

    Science.gov (United States)

    Lazarte, Ivanna M.; Chiotti, Omar; Villarreal, Pablo D.

    Collaborative business models among enterprises require defining collaborative business processes. Enterprises implement B2B collaborations to execute these processes. In B2B collaborations the integration and interoperability of processes and systems of the enterprises are required to support the execution of collaborative processes. From a collaborative process model, which describes the global view of the enterprise interactions, each enterprise must define the interface process that represents the role it performs in the collaborative process in order to implement the process in a Business Process Management System. Hence, in this work we propose a method for the automatic generation of the interface process model of each enterprise from a collaborative process model. This method is based on a Model-Driven Architecture to transform collaborative process models into interface process models. By applying this method, interface processes are guaranteed to be interoperable and defined according to a collaborative process.

  14. Spatial generalised linear mixed models based on distances.

    Science.gov (United States)

    Melo, Oscar O; Mateu, Jorge; Melo, Carlos E

    2016-10-01

    Risk models derived from environmental data have been widely shown to be effective in delineating geographical areas of risk because they are intuitively easy to understand. We present a new method based on distances, which allows the modelling of continuous and non-continuous random variables through distance-based spatial generalised linear mixed models. The parameters are estimated using Markov chain Monte Carlo maximum likelihood, which is a feasible and a useful technique. The proposed method depends on a detrending step built from continuous or categorical explanatory variables, or a mixture among them, by using an appropriate Euclidean distance. The method is illustrated through the analysis of the variation in the prevalence of Loa loa among a sample of village residents in Cameroon, where the explanatory variables included elevation, together with maximum normalised-difference vegetation index and the standard deviation of normalised-difference vegetation index calculated from repeated satellite scans over time. © The Author(s) 2013.
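
    The distance-based idea behind the detrending step can be illustrated with a small Python sketch: Euclidean distances between covariate profiles are converted to principal coordinates (classical multidimensional scaling), which then serve as regressors in a logistic model for a binary prevalence outcome. This is a simplification of the authors' Markov chain Monte Carlo maximum likelihood approach, and all data here are synthetic:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)

        # Hypothetical covariates per village: elevation, mean NDVI, sd of NDVI.
        X = rng.standard_normal((120, 3))
        y = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] - 0.5 * X[:, 1]))))

        # Distance-based step: squared Euclidean distances between covariate profiles...
        sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)

        # ...converted to principal coordinates via classical MDS (double centring).
        J = np.eye(len(X)) - 1.0 / len(X)
        B = -0.5 * J @ sq @ J
        vals, vecs = np.linalg.eigh(B)
        idx = np.argsort(vals)[::-1][:2]               # keep leading coordinates
        coords = vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))

        # The principal coordinates act as regressors in a (logistic) GLM,
        # the binary-response analogue of the prevalence model in the paper.
        model = LogisticRegression().fit(coords, y)
        print(model.score(coords, y))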

  15. Standard Model processes

    CERN Document Server

    Mangano, M.L.; Aguilar-Saavedra, Juan Antonio; Alekhin, S.; Badger, S.; Bauer, C.W.; Becher, T.; Bertone, V.; Bonvini, M.; Boselli, S.; Bothmann, E.; Boughezal, R.; Cacciari, M.; Carloni Calame, C.M.; Caola, F.; Campbell, J.M.; Carrazza, S.; Chiesa, M.; Cieri, L.; Cimaglia, F.; Febres Cordero, F.; Ferrarese, P.; D'Enterria, D.; Ferrera, G.; Garcia i Tormo, X.; Garzelli, M.V.; Germann, E.; Hirschi, V.; Han, T.; Ita, H.; Jäger, B.; Kallweit, S.; Karlberg, A.; Kuttimalai, S.; Krauss, F.; Larkoski, A.J.; Lindert, J.; Luisoni, G.; Maierhöfer, P.; Mattelaer, O.; Martinez, H.; Moch, S.; Montagna, G.; Moretti, M.; Nason, P.; Nicrosini, O.; Oleari, C.; Pagani, D.; Papaefstathiou, A.; Petriello, F.; Piccinini, F.; Pierini, M.; Pierog, T.; Pozzorini, S.; Re, E.; Robens, T.; Rojo, J.; Ruiz, R.; Sakurai, K.; Salam, G.P.; Salfelder, L.; Schönherr, M.; Schulze, M.; Schumann, S.; Selvaggi, M.; Shivaji, A.; Siodmok, A.; Skands, P.; Torrielli, P.; Tramontano, F.; Tsinikos, I.; Tweedie, B.; Vicini, A.; Westhoff, S.; Zaro, M.; Zeppenfeld, D.; CERN. Geneva. ATS Department

    2017-06-22

    This report summarises the properties of Standard Model processes at the 100 TeV pp collider. We document the production rates and typical distributions for a number of benchmark Standard Model processes, and discuss new dynamical phenomena arising at the highest energies available at this collider. We discuss the intrinsic physics interest in the measurement of these Standard Model processes, as well as their role as backgrounds for New Physics searches.

  16. Visualizing the process of process modeling with PPMCharts

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Pinggera, J.; Reijers, H.A.; Weber, B.; Poels, G.; La Rosa, M.; Soffer, P.

    2013-01-01

    In the quest for knowledge about how to make good process models, recent research focus is shifting from studying the quality of process models to studying the process of process modeling (often abbreviated as PPM) itself. This paper reports on our efforts to visualize this specific process in such

  17. What Controls the Vertical Distribution of Aerosol? Relationships Between Process Sensitivity in HadGEM3-UKCA and Inter-Model Variation from AeroCom Phase II

    Science.gov (United States)

    Kipling, Zak; Stier, Philip; Johnson, Colin E.; Mann, Graham W.; Bellouin, Nicolas; Bauer, Susanne E.; Bergman, Tommi; Chin, Mian; Diehl, Thomas; Ghan, Steven J.

    2016-01-01

    same processes as the component mass profiles, plus the size distribution of primary emissions. We also show that the processes that affect the AOD-normalised radiative forcing in the model are predominantly those that affect the vertical mass distribution, in particular convective transport, in-cloud scavenging, aqueous oxidation, ageing and the vertical extent of biomass-burning emissions.

  18. The economic costs of natural disasters globally from 1900-2015: historical and normalised floods, storms, earthquakes, volcanoes, bushfires, drought and other disasters

    Science.gov (United States)

    Daniell, James; Wenzel, Friedemann; Schaefer, Andreas

    2016-04-01

    For the first time, a breakdown of natural disaster losses from 1900-2015 is given, based on over 30,000 event economic losses globally from increased analysis within the CATDAT Damaging Natural Disasters databases. Using country-CPI and GDP-deflator adjustments, over 7 trillion USD (2015-adjusted) in losses have occurred: over 40% due to flood/rainfall, 26% due to earthquake, 19% due to storm effects, 12% due to drought, 2% due to wildfire and under 1% due to volcano. Using construction cost indices, higher percentages of flood losses are seen. Depending on how dollars are adjusted to 2015 terms (CPI vs. construction cost indices), between 6.5 and 14.0 trillion USD (2015-adjusted) of natural disaster losses have been seen from 1900-2015 globally. Significant reductions in economic losses have been seen in China and Japan from 1950 onwards. An average annual loss (AAL) of around 200 billion USD has been seen in the last 16 years, equating to around 0.25% of global GDP or around 0.1% of net capital stock per year. Normalised losses have also been calculated to examine trends in vulnerability to economic losses through time. The global normalisation methodology, using the exposure databases within CATDAT and undertaken previously in papers for the earthquake and volcano databases, is used for this study. When the original event-year losses are adjusted directly by capital stock change, very high losses are observed with respect to floods over time (however, with improved flood control structures). This shows clear trends in the improvement of building stock against natural disasters and a decreasing trend in most perils for most countries.

  19. Business process model repositories : efficient process retrieval

    NARCIS (Netherlands)

    Yan, Z.

    2012-01-01

    As organizations increasingly work in process-oriented manner, the number of business process models that they develop and have to maintain increases. As a consequence, it has become common for organizations to have collections of hundreds or even thousands of business process models. When a

  20. Modeling Aspects of Activated Sludge Processes Part l l: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    Energy Technology Data Exchange (ETDEWEB)

    AbdElHaleem, H S [Cairo Univ.-CivlI Eng. Dept., Giza (Egypt); EI-Ahwany, A H [CairoUlmrsity- Faculty ofEngincering - Chemical Engineering Department, Giza (Egypt); Ibrahim, H I [Helwan University- Faculty of Engineering - Biomedical Engineering Department, Helwan (Egypt); Ibrahim, G [Menofia University- Faculty of Engineering Sbebin EI Kom- Basic Eng. Sc. Dept., Menofia (Egypt)

    2004-07-01

    Mathematical process modeling and the biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2 and ASM3, versioned by Henze et al., were evaluated, considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterised by treating the activated sludge process as one phase. In this type of model, the internal mass transfer inside the flocs is neglected; hence, the kinetic parameters produced can be considered inaccurate. The other type of model is the heterogeneous model. This type considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

1. Modeling Aspects of Activated Sludge Processes Part II: Mathematical Process Modeling and Biokinetics of Activated Sludge Processes

    International Nuclear Information System (INIS)

AbdElHaleem, H.S.; El-Ahwany, A.H.; Ibrahim, H.I.; Ibrahim, G.

    2004-01-01

Mathematical process modeling and the biokinetics of the activated sludge process were reviewed, considering different types of models. The task group models ASM1, ASM2 and ASM3, developed by Henze et al., were evaluated, considering the conditions of each model and the different processes of which every model consists. It is revealed that ASM1 contains some defects that are avoided in ASM3. Based on homogeneity, models can be classified into homogeneous models, characterized by treating the activated sludge process as one phase; in this type of model the internal mass transfer inside the flocs is neglected, hence the kinetic parameters produced can be considered inaccurate. The other type is the heterogeneous model, which considers the mass transfer operations in addition to the biochemical reaction processes; hence, the resulting kinetic parameters can be considered more accurate than those of the homogeneous type.

  2. Process correlation analysis model for process improvement identification.

    Science.gov (United States)

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses, and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the software process improvement effort as the basis. CMMI defines a set of process areas involved in software development and what is to be carried out in those process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of the required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
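    The paper defines its correlation model on CMMI and empirical improvement data; purely as an illustration of the underlying idea, a minimal sketch (hypothetical element names and scores, not the paper's model) that flags correlated process elements from assessment results could look like this:

    ```python
    import numpy as np

    # Hypothetical assessment scores (rows: assessed projects,
    # columns: CMMI process elements, e.g. specific practices).
    elements = ["REQM.SP1.1", "PP.SP1.2", "PMC.SP1.1"]
    scores = np.array([
        [3, 2, 3],
        [2, 2, 2],
        [4, 3, 4],
        [1, 1, 2],
    ], dtype=float)

    # Pairwise Pearson correlations between process elements.
    corr = np.corrcoef(scores, rowvar=False)

    # Report strongly correlated pairs that an improvement plan
    # should treat together rather than in isolation.
    for i in range(len(elements)):
        for j in range(i + 1, len(elements)):
            if abs(corr[i, j]) >= 0.8:
                print(f"{elements[i]} <-> {elements[j]}: r = {corr[i, j]:.2f}")
    ```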

  3. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers

    DEFF Research Database (Denmark)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel

    2016-01-01

...the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension... The ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  4. Product and Process Modelling

    DEFF Research Database (Denmark)

    Cameron, Ian T.; Gani, Rafiqul

This book covers the area of product and process modelling via a case study approach. It addresses a wide range of modelling applications with emphasis on modelling methodology and the subsequent in-depth analysis of mathematical models to gain insight via structural aspects of the models. These approaches are put into the context of life cycle modelling, where multiscale and multiform modelling is increasingly prevalent in the 21st century. The book commences with a discussion of modern product and process modelling theory and practice, followed by a series of case studies drawn from a variety... to biotechnology applications, food, polymer and human health application areas. The book highlights the important nature of modern product and process modelling in the decision making processes across the life cycle. As such it provides an important resource for students, researchers and industrial practitioners.

  5. Modeling of biopharmaceutical processes. Part 2: Process chromatography unit operation

    DEFF Research Database (Denmark)

    Kaltenbrunner, Oliver; McCue, Justin; Engel, Philip

    2008-01-01

Process modeling can be a useful tool to aid in process development, process optimization, and process scale-up. When modeling a chromatography process, one must first select the appropriate models that describe the mass transfer and adsorption that occur within the porous adsorbent. The theoret...

  6. The impact of working memory and the "process of process modelling" on model quality: Investigating experienced versus inexperienced modellers.

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R; Weber, Barbara

    2016-05-09

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling.

  7. Applying the Business Process and Practice Alignment Meta-model: Daily Practices and Process Modelling

    Directory of Open Access Journals (Sweden)

    Ventura Martins Paula

    2017-03-01

Background: Business Process Modelling (BPM) is one of the most important phases of information system design. Business Process (BP) meta-models allow capturing informational and behavioural aspects of business processes. Unfortunately, standard BP meta-modelling approaches focus just on process description, providing different BP models. It is not possible to compare and identify related daily practices in order to improve BP models. This lack of information implies that further research in BP meta-models is needed to reflect the evolution/change in BP. Considering this limitation, this paper introduces a new BP meta-model, the Business Process and Practice Alignment Meta-model (BPPAM meta-model). Our intention is to present a meta-model that addresses features related to the alignment between daily work practices and BP descriptions. Objectives: This paper intends to present a meta-model which integrates daily work information into coherent and sound process definitions. Methods/Approach: The methodology employed in the research follows a design-science approach. Results: The results of the case study are related to the application of the proposed meta-model to align the specification of a BP model with work practice models. Conclusions: This meta-model can be used within the BPPAM methodology to specify or improve business process models based on work practice descriptions.

  8. Technical Note: On methodologies for determining the size-normalised weight of planktic foraminifera

    Directory of Open Access Journals (Sweden)

    C. J. Beer

    2010-07-01

The size-normalised weight (SNW) of planktic foraminifera, a measure of test wall thickness and density, is potentially a valuable palaeo-proxy for marine carbon chemistry. As increasing attention is given to developing this proxy it is important that methods are comparable between studies. Here, we compare SNW data generated using two different methods to account for variability in test size, namely (i) the narrow (50 μm) sieve fraction range method and (ii) the individually measured test size method. Using specimens from the 200–250 μm sieve fraction range collected in multinet samples from the North Atlantic, we find that sieving does not constrain size sufficiently well to isolate changes in weight driven by variations in test wall thickness and density from those driven by size. We estimate that the SNW data produced as part of this study are associated with an uncertainty, or error bar, of about ±11%. Errors associated with the narrow sieve fraction method may be reduced by decreasing the size of the sieve window, by using larger tests and by increasing the number of tests employed. In situations where numerous large tests are unavailable, however, substantial errors associated with this sieve method remain unavoidable. In such circumstances the individually measured test size method provides a better means for estimating SNW because, as our results show, this method isolates changes in weight driven by variations in test wall thickness and density from those driven by size.
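    As a rough sketch of the individually measured test size method, and not the authors' exact formulation, one common approach scales each test weight by the ratio of the mean measured size to the individual size (all measurements below are hypothetical):

    ```python
    import numpy as np

    # Hypothetical measurements for tests from the 200-250 micrometre fraction:
    # individual silhouette diameters (micrometres) and weights (micrograms).
    diameters = np.array([212.0, 228.5, 241.3, 205.7, 236.9])
    weights = np.array([11.2, 13.0, 14.1, 10.6, 13.8])

    # Size-normalised weight: scale each weight to the mean test size,
    # isolating wall thickness/density from simple size variation.
    snw = weights * (diameters.mean() / diameters)
    print(f"mean SNW = {snw.mean():.2f} ug (+/- {snw.std(ddof=1):.2f})")
    ```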

  9. Modeling multiphase materials processes

    CERN Document Server

    Iguchi, Manabu

    2010-01-01

    ""Modeling Multiphase Materials Processes: Gas-Liquid Systems"" describes the methodology and application of physical and mathematical modeling to multi-phase flow phenomena in materials processing. The book focuses on systems involving gas-liquid interaction, the most prevalent in current metallurgical processes. The performance characteristics of these processes are largely dependent on transport phenomena. This volume covers the inherent characteristics that complicate the modeling of transport phenomena in such systems, including complex multiphase structure, intense turbulence, opacity of

  10. Semantic Business Process Modeling

    OpenAIRE

    Markovic, Ivan

    2010-01-01

    This book presents a process-oriented business modeling framework based on semantic technologies. The framework consists of modeling languages, methods, and tools that allow for semantic modeling of business motivation, business policies and rules, and business processes. Quality of the proposed modeling framework is evaluated based on the modeling content of SAP Solution Composer and several real-world business scenarios.

  11. Business Process Modeling: Perceived Benefits

    Science.gov (United States)

    Indulska, Marta; Green, Peter; Recker, Jan; Rosemann, Michael

    The process-centered design of organizations and information systems is globally seen as an appropriate response to the increased economic pressure on organizations. At the methodological core of process-centered management is process modeling. However, business process modeling in large initiatives can be a time-consuming and costly exercise, making it potentially difficult to convince executive management of its benefits. To date, and despite substantial interest and research in the area of process modeling, the understanding of the actual benefits of process modeling in academia and practice is limited. To address this gap, this paper explores the perception of benefits derived from process modeling initiatives, as reported through a global Delphi study. The study incorporates the views of three groups of stakeholders - academics, practitioners and vendors. Our findings lead to the first identification and ranking of 19 unique benefits associated with process modeling. The study in particular found that process modeling benefits vary significantly between practitioners and academics. We argue that the variations may point to a disconnect between research projects and practical demands.

  12. WWTP Process Tank Modelling

    DEFF Research Database (Denmark)

    Laursen, Jesper

The present thesis considers numerical modeling of activated sludge tanks on municipal wastewater treatment plants. Focus is aimed at integrated modeling, where the detailed microbiological model, the Activated Sludge Model 3 (ASM3), is combined with a detailed hydrodynamic model based on a numerical solution of the Navier-Stokes equations in a multiphase scheme. After a general introduction to the activated sludge tank as a system, the activated sludge tank model is gradually set up in separate stages. The individual sub-processes that are often occurring in activated sludge tanks are initially... hydrofoil shaped propellers. These two sub-processes deliver the main part of the supplied energy to the activated sludge tank, and for this reason they are important for the mixing conditions in the tank. For other important processes occurring in the activated sludge tank, existing models and measurements...

  13. Implementation of the Business Process Modelling Notation (BPMN) in the modelling of anatomic pathology processes.

    Science.gov (United States)

    Rojo, Marcial García; Rolón, Elvira; Calahorra, Luis; García, Felix Oscar; Sánchez, Rosario Paloma; Ruiz, Francisco; Ballester, Nieves; Armenteros, María; Rodríguez, Teresa; Espartero, Rafael Martín

    2008-07-15

Process orientation is one of the essential elements of quality management systems, including those in use in healthcare. Business processes in hospitals are very complex and variable. BPMN (Business Process Modelling Notation) is a user-oriented language specifically designed for the modelling of business (organizational) processes. No previous experiences of the use of this notation for process modelling within pathology, in Spain or in other countries, are known. We present our experience in the elaboration of conceptual models of pathology processes, as part of a global programmed surgical patient process, using BPMN. With the objective of analyzing the use of BPMN notation in real cases, a multidisciplinary work group was created, including software engineers from the Department of Technologies and Information Systems of the University of Castilla-La Mancha and health professionals and administrative staff from the Hospital General de Ciudad Real. The collaborative work was carried out in six phases: informative meetings, intensive training, process selection, definition of the work method, process description by hospital experts, and process modelling. The modelling of the processes of Anatomic Pathology using BPMN is presented. The subprocesses presented are those corresponding to the surgical pathology examination of samples coming from the operating theatre, including the planning and realization of frozen studies. The modelling of Anatomic Pathology subprocesses has allowed the creation of an understandable graphical model, where management and improvements are more easily implemented by health professionals.

  14. modeling grinding modeling grinding processes as micro processes

    African Journals Online (AJOL)


Industrial precision grinding processes are cylindrical, centerless and ... Several models have been proposed and used to study grinding ... the grinding force for the two cases were 9.07237 N/mm ... International Journal of Machine Tools & ...

  15. Modeling of column apparatus processes

    CERN Document Server

    Boyadjiev, Christo; Boyadjiev, Boyan; Popova-Krumova, Petya

    2016-01-01

    This book presents a new approach for the modeling of chemical and interphase mass transfer processes in industrial column apparatuses, using convection-diffusion and average-concentration models. The convection-diffusion type models are used for a qualitative analysis of the processes and to assess the main, small and slight physical effects, and then reject the slight effects. As a result, the process mechanism can be identified. It also introduces average concentration models for quantitative analysis, which use the average values of the velocity and concentration over the cross-sectional area of the column. The new models are used to analyze different processes (simple and complex chemical reactions, absorption, adsorption and catalytic reactions), and make it possible to model the processes of gas purification with sulfur dioxide, which form the basis of several patents.
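    For orientation, a generic one-dimensional convection-diffusion balance of the kind such models build on can be written as follows (a textbook form with assumed notation, not the book's exact equations); the average-concentration models then work with the cross-sectional mean of the concentration:

    ```latex
    % Convection-diffusion balance along the column axis z, with
    % concentration c, axial velocity u, diffusivity D and a
    % reaction/mass-transfer source term r(c):
    \frac{\partial c}{\partial t} + u\,\frac{\partial c}{\partial z}
      = D\,\frac{\partial^{2} c}{\partial z^{2}} + r(c),
    \qquad
    \bar{c}(z,t) = \frac{1}{A}\int_{A} c\,\mathrm{d}A .
    ```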

  16. Investigating the Process of Process Modeling with Eye Movement Analysis

    OpenAIRE

    Pinggera, Jakob; Furtner, Marco; Martini, Markus; Sachse, Pierre; Reiter, Katharina; Zugal, Stefan; Weber, Barbara

    2015-01-01

Research on quality issues of business process models has recently begun to explore the process of creating process models by analyzing the modeler's interactions with the modeling environment. In this paper we aim to complement previous insights on the modeler's modeling behavior with data gathered by tracking the modeler's eye movements when engaged in the act of modeling. We present preliminary results and outline directions for future research to triangulate toward a more comprehensive understanding...

  17. What makes process models understandable?

    NARCIS (Netherlands)

    Mendling, J.; Reijers, H.A.; Cardoso, J.; Alonso, G.; Dadam, P.; Rosemann, M.

    2007-01-01

Although formal and informal quality aspects are of significant importance to business process modeling, there is little empirical work reported on process model quality and its impact factors. In this paper we investigate understandability as a proxy for the quality of process models and focus

  18. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    NARCIS (Netherlands)

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

  19. Modeling the Object-Oriented Software Process: OPEN and the Unified Process

    OpenAIRE

    van den Berg, Klaas; Aksit, Mehmet; van den Broek, P.M.

    1999-01-01

A short introduction to software process modeling is presented, particularly object-oriented modeling. Two major industrial process models are discussed: the OPEN model and the Unified Process model. In more detail, the quality assurance in the Unified Process tool (formerly called Objectory) is reviewed.

20. Search for Standard Model H→τ⁺τ⁻ decays in the lepton-hadron final state in proton-proton collisions with the ATLAS detector at the LHC

    International Nuclear Information System (INIS)

    Ruthmann, Nils

    2014-01-01

This thesis presents a search for Standard Model (SM) Higgs boson decays to a pair of τ leptons in the lepton-hadron final state with the ATLAS detector at the Large Hadron Collider (LHC). The analysis is based on proton-proton collision data recorded during Run 1 of the LHC, corresponding to integrated luminosities of 4.5 fb⁻¹ and 20.3 fb⁻¹ at centre-of-mass energies of 7 TeV and 8 TeV, respectively. Background events from various SM processes contribute to the selected event sample at a high rate. Their contribution is efficiently separated from the expected Higgs boson signal by using boosted decision trees (BDT) in two analysis categories, which are enriched in events emerging from vector boson fusion and gluon fusion processes. The expected number of events from background processes is modelled using data-driven estimation techniques. The signal contribution is measured using a maximum likelihood fit of the BDT output distributions. An excess of events over the expected level of background events is found and corresponds to an observed (expected) significance of 2.3 (2.4) standard deviations at a Higgs boson mass hypothesis of 125 GeV. The signal strength normalised to the Standard Model expectation is measured to be 0.98 +0.5/−0.5. A combined analysis of all τ-τ final states rejects the background-only hypothesis at a level of 4.5 standard deviations at m_H = 125 GeV, while a significance of 3.5 standard deviations is expected. This provides evidence for the direct coupling of the recently discovered Higgs boson to tau leptons. The measured normalised signal strength of 1.4 +0.43/−0.37 is consistent with the predicted Yukawa coupling strength in the Standard Model.

  1. Application of process tomography in gas-solid fluidised beds in different scales and structures

    Science.gov (United States)

    Wang, H. G.; Che, H. Q.; Ye, J. M.; Tu, Q. Y.; Wu, Z. P.; Yang, W. Q.; Ocone, R.

    2018-04-01

Gas-solid fluidised beds are commonly used in particle-related processes, e.g. for coal combustion and gasification in the power industry, and for the coating and granulation processes in the pharmaceutical industry. Because the operation efficiency depends on the gas-solid flow characteristics, it is necessary to investigate the flow behaviour. This paper is about the application of process tomography, including electrical capacitance tomography (ECT) and microwave tomography (MWT), in multi-scale gas-solid fluidisation processes in the pharmaceutical and power industries. This is the first time that both ECT and MWT have been applied for this purpose at multiple scales and in complex structures. To evaluate the sensor design and image reconstruction, and to investigate the effects of sensor structure and dimension on the image quality, a normalised sensitivity coefficient is introduced. In addition, computational fluid dynamics (CFD) analysis based on a computational particle fluid dynamics (CPFD) model and a two-phase fluid model (TFM) is used. Part of the CPFD-TFM simulation results are compared with and validated against experimental results from ECT and/or MWT. By both simulation and experiment, the complex hydrodynamic flow behaviour at different scales is analysed. Time-series capacitance data are analysed in both the time and frequency domains to reveal the flow characteristics.

  2. Multi-enzyme Process Modeling

    DEFF Research Database (Denmark)

    Andrade Santacoloma, Paloma de Gracia

...are affected (in a positive or negative way) by the presence of the other enzymes and compounds in the media. In this thesis the concept of the multi-enzyme in-pot term is adopted for processes that are carried out by the combination of enzymes in a single reactor and implemented at pilot or industrial scale... features of the process and provides the information required to structure the process model by using a step-by-step procedure with the required tools and methods. In this way, this framework increases the efficiency of the model development process with respect to the time and resources needed (fast and effective)... In this way the model parameters that drive the main dynamic behavior can be identified, and thus a better understanding of this type of process gained. In order to develop, test and verify the methodology, three case studies were selected, specifically the bi-enzyme process for the production of lactobionic acid...

  3. Process model repositories and PNML

    NARCIS (Netherlands)

    Hee, van K.M.; Post, R.D.J.; Somers, L.J.A.M.; Werf, van der J.M.E.M.; Kindler, E.

    2004-01-01

    Bringing system and process models together in repositories facilitates the interchange of model information between modelling tools, and allows the combination and interlinking of complementary models. Petriweb is a web application for managing such repositories. It supports hierarchical process

  4. Model-based software process improvement

    Science.gov (United States)

    Zettervall, Brenda T.

    1994-01-01

    The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.

  5. Validation of nuclear models in Geant4 using the dose distribution of a 177 MeV proton pencil beam

    International Nuclear Information System (INIS)

    Hall, David C; Paganetti, Harald; Makarova, Anastasia; Gottschalk, Bernard

    2016-01-01

    A proton pencil beam is associated with a surrounding low-dose envelope, originating from nuclear interactions. It is important for treatment planning systems to accurately model this envelope when performing dose calculations for pencil beam scanning treatments, and Monte Carlo (MC) codes are commonly used for this purpose. This work aims to validate the nuclear models employed by the Geant4 MC code, by comparing the simulated absolute dose distribution to a recent experiment of a 177 MeV proton pencil beam stopping in water. Striking agreement is observed over five orders of magnitude, with both the shape and normalisation well modelled. The normalisations of two depth dose curves are lower than experiment, though this could be explained by an experimental positioning error. The Geant4 neutron production model is also verified in the distal region. The entrance dose is poorly modelled, suggesting an unaccounted upstream source of low-energy protons. Recommendations are given for a follow-up experiment which could resolve these issues. (note)

  6. Using Unified Modelling Language (UML) as a process-modelling technique for clinical-research process improvement.

    Science.gov (United States)

    Kumarapeli, P; De Lusignan, S; Ellis, T; Jones, B

    2007-03-01

The Primary Care Data Quality programme (PCDQ) is a quality-improvement programme which processes routinely collected general practice computer data. Patient data collected from a wide range of different brands of clinical computer systems are aggregated, processed, and fed back to practices in an educational context to improve the quality of care. Process modelling is a well-established approach used to gain understanding and systematic appraisal, and identify areas of improvement of a business process. Unified modelling language (UML) is a general purpose modelling technique used for this purpose. We used UML to appraise the PCDQ process to see if the efficiency and predictability of the process could be improved. Activity analysis and thinking-aloud sessions were used to collect data to generate UML diagrams. The UML model highlighted the sequential nature of the current process as a barrier for efficiency gains. It also identified the uneven distribution of process controls, lack of symmetric communication channels, critical dependencies among processing stages, and failure to implement all the lessons learned in the piloting phase. It also suggested that improved structured reporting at each stage - especially from the pilot phase, parallel processing of data and correctly positioned process controls - should improve the efficiency and predictability of research projects. Process modelling provided a rational basis for the critical appraisal of a clinical data processing system; its potential may be underutilized within health care.

  7. Developing engineering processes through integrated modelling of product and process

    DEFF Research Database (Denmark)

    Nielsen, Jeppe Bjerrum; Hvam, Lars

    2012-01-01

This article aims at developing an operational tool for integrated modelling of product assortments and engineering processes in companies making customer specific products. Integrating a product model in the design of engineering processes will provide a deeper understanding of the engineering activities as well as insight into how product features affect the engineering processes. The article suggests possible ways of integrating models of products with models of engineering processes. The models have been tested and further developed in an action research study carried out in collaboration with a major international engineering company.

  8. UML in business process modeling

    Directory of Open Access Journals (Sweden)

    Bartosz Marcinkowski

    2013-03-01

Selection and proper application of business process modeling methods and techniques have a significant impact on organizational improvement capabilities as well as on proper understanding of the functionality of information systems that shall support the activity of the organization. A number of business process modeling notations have been popularized in practice in recent decades. The most significant of these notations include the Business Process Modeling Notation (OMG BPMN) and several Unified Modeling Language (OMG UML) extensions. In this paper, it is assessed whether one of the most flexible and strictly standardized contemporary business process modeling notations, i.e. the Rational UML Profile for Business Modeling, enables business analysts to prepare business models that are all-embracing and understandable by all the stakeholders. After the introduction, the methodology of the research is discussed. Section 2 presents selected case study results. The paper is concluded with a summary.

  9. Automatic extraction of process categories from process model collections

    NARCIS (Netherlands)

    Malinova, M.; Dijkman, R.M.; Mendling, J.; Lohmann, N.; Song, M.; Wohed, P.

    2014-01-01

    Many organizations build up their business process management activities in an incremental way. As a result, there is no overarching structure defined at the beginning. However, as business process modeling initiatives often yield hundreds to thousands of process models, there is a growing need for

  10. Biosphere Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    J. Schmitt

    2000-05-25

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  11. Biosphere Process Model Report

    International Nuclear Information System (INIS)

    Schmitt, J.

    2000-01-01

To evaluate the postclosure performance of a potential monitored geologic repository at Yucca Mountain, a Total System Performance Assessment (TSPA) will be conducted. Nine Process Model Reports (PMRs), including this document, are being developed to summarize the technical basis for each of the process models supporting the TSPA model. These reports cover the following areas: (1) Integrated Site Model; (2) Unsaturated Zone Flow and Transport; (3) Near Field Environment; (4) Engineered Barrier System Degradation, Flow, and Transport; (5) Waste Package Degradation; (6) Waste Form Degradation; (7) Saturated Zone Flow and Transport; (8) Biosphere; and (9) Disruptive Events. Analysis/Model Reports (AMRs) contain the more detailed technical information used to support TSPA and the PMRs. The AMRs consist of data, analyses, models, software, and supporting documentation that will be used to defend the applicability of each process model for evaluating the postclosure performance of the potential Yucca Mountain repository system. This documentation will ensure the traceability of information from its source through its ultimate use in the TSPA-Site Recommendation (SR) and in the National Environmental Policy Act (NEPA) analysis processes. The objective of the Biosphere PMR is to summarize (1) the development of the biosphere model, and (2) the Biosphere Dose Conversion Factors (BDCFs) developed for use in TSPA. The Biosphere PMR does not present or summarize estimates of potential radiation doses to human receptors. Dose calculations are performed as part of TSPA and will be presented in the TSPA documentation. The biosphere model is a component of the process to evaluate postclosure repository performance and regulatory compliance for a potential monitored geologic repository at Yucca Mountain, Nevada. The biosphere model describes those exposure pathways in the biosphere by which radionuclides released from a potential repository could reach a human receptor.

  12. Conceptual models of information processing

    Science.gov (United States)

    Stewart, L. J.

    1983-01-01

    The conceptual information processing issues are examined. Human information processing is defined as an active cognitive process that is analogous to a system. It is the flow and transformation of information within a human. The human is viewed as an active information seeker who is constantly receiving, processing, and acting upon the surrounding environmental stimuli. Human information processing models are conceptual representations of cognitive behaviors. Models of information processing are useful in representing the different theoretical positions and in attempting to define the limits and capabilities of human memory. It is concluded that an understanding of conceptual human information processing models and their applications to systems design leads to a better human factors approach.

  13. Improved model management with aggregated business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Mans, R.S.; Toorn, van der R.A.

    2009-01-01

Contemporary organizations invest much effort in creating models of their business processes. This raises the issue of how to deal with large sets of process models that become available over time. This paper proposes an extension of Event-driven Process Chains, called the aggregate EPC (aEPC),

  14. Business process modeling in healthcare.

    Science.gov (United States)

    Ruiz, Francisco; Garcia, Felix; Calahorra, Luis; Llorente, César; Gonçalves, Luis; Daniel, Christel; Blobel, Bernd

    2012-01-01

The importance of the process point of view is not restricted to a specific enterprise sector. In the field of health, as a result of the nature of the service offered, health institutions' processes are also the basis for decision making which is focused on achieving their objective of providing quality medical assistance. In this chapter the application of business process modelling, using the Business Process Modelling Notation (BPMN) standard, is described. The main challenges of business process modelling in healthcare are the definition of healthcare processes, the multi-disciplinary nature of healthcare, the flexibility and variability of the activities involved in healthcare processes, the need for interoperability between multiple information systems, and the continuous updating of scientific knowledge in healthcare.

  15. Integrating textual and model-based process descriptions for comprehensive process search

    NARCIS (Netherlands)

    Leopold, Henrik; van der Aa, Han; Pittke, Fabian; Raffel, Manuel; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Documenting business processes using process models is common practice in many organizations. However, not all process information is best captured in process models. Hence, many organizations complement these models with textual descriptions that specify additional details. The problem with this

  16. Multi-Boson Simulation for 13 TeV ATLAS Analyses

    CERN Document Server

    The ATLAS collaboration

    2016-01-01

This note describes the Monte Carlo setup used by ATLAS to model multi-boson processes in 13 TeV $pp$ collisions. The baseline Monte Carlo generators are compared with each other in key kinematic distributions of the processes under study. Sample normalisation and the assignment of systematic uncertainties are discussed.

  17. Process modeling for Humanities: tracing and analyzing scientific processes

    OpenAIRE

    Hug , Charlotte; Salinesi , Camille; Deneckere , Rebecca; Lamasse , Stéphane

    2011-01-01

This paper concerns epistemology and the understanding of research processes in Humanities, such as Archaeology. We believe that to properly understand research processes, it is essential to trace them. The collected traces depend on the process model established, which has to be as accurate as possible to exhaustively record the traces. In this paper, we briefly explain why the existing process models for Humanities are not sufficient to represent traces. We then present...

  18. Modeling nuclear processes by Simulink

    Energy Technology Data Exchange (ETDEWEB)

    Rashid, Nahrul Khair Alang Md, E-mail: nahrul@iium.edu.my [Faculty of Engineering, International Islamic University Malaysia, Jalan Gombak, Selangor (Malaysia)

    2015-04-29

Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.
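    As an illustration of the reactor point kinetic equations with delayed neutrons mentioned above, the same model can be sketched outside Simulink as well; the following Python fragment (all parameter values are illustrative assumptions) integrates one effective delayed neutron group with SciPy:

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # Illustrative parameters for one effective delayed neutron group.
    beta, Lambda, lam = 0.0065, 1e-4, 0.08  # delayed fraction, generation time (s), decay constant (1/s)
    rho = 0.001                             # constant reactivity step

    def point_kinetics(t, y):
        n, c = y  # neutron density, delayed neutron precursor concentration
        dn = (rho - beta) / Lambda * n + lam * c
        dc = beta / Lambda * n - lam * c
        return [dn, dc]

    # Start from equilibrium: c0 = beta * n0 / (lam * Lambda).
    n0 = 1.0
    sol = solve_ivp(point_kinetics, (0.0, 10.0), [n0, beta * n0 / (lam * Lambda)],
                    t_eval=np.linspace(0.0, 10.0, 11))
    print(sol.y[0])  # neutron density response to the reactivity step
    ```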

  19. Modeling nuclear processes by Simulink

    International Nuclear Information System (INIS)

    Rashid, Nahrul Khair Alang Md

    2015-01-01

Modelling and simulation are essential parts of the study of dynamic system behaviour. In nuclear engineering, modelling and simulation are important for assessing the expected results of an experiment before the actual experiment is conducted, and in the design of nuclear facilities. In education, modelling can give insight into the dynamics of systems and processes. Most nuclear processes can be described by ordinary or partial differential equations. Efforts expended to solve the equations using analytical or numerical solutions consume time and distract attention from the objectives of modelling itself. This paper presents the use of Simulink, a MATLAB toolbox that is widely used in control engineering, as a modelling platform for the study of nuclear processes including nuclear reactor behaviours. Starting from the describing equations, Simulink models for heat transfer, the radionuclide decay process, the delayed neutron effect, the reactor point kinetic equations with delayed neutron groups, and the effect of temperature feedback are used as examples.

  20. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard, defined as the integrated intensity process. Instead, recent literature has shown a tendency towards specifying the cumulative hazard process directly. Within this framework we present a new model class where cumulative hazards are described by self-similar additive processes, also known as Sato processes. Furthermore we also analyze specifications obtained via a simple deterministic time-change of a homogeneous Levy process. While the processes in these two classes share the same average behavior over time, the associated intensities exhibit very different properties. Concrete specifications are calibrated to data on the single names included in the iTraxx Europe index. The performances are compared...
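    In standard reduced-form notation (assumed here, since the abstract does not spell it out), the link between intensity, cumulative hazard and survival probability reads:

    ```latex
    % Cumulative hazard as integrated intensity, and the survival
    % probability as its Laplace transform (doubly stochastic setting):
    \Lambda(t) = \int_{0}^{t} \lambda_{s}\,\mathrm{d}s,
    \qquad
    P(\tau > t) = \mathbb{E}\left[ e^{-\Lambda(t)} \right].
    ```

    Specifying the cumulative hazard directly, e.g. as a Sato process, therefore bypasses modeling the intensity itself.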

  1. Long-term performance of grid-connected photovoltaic plant - Appendix 2: normalised monthly statistics; Langzeitverhalten von netzgekoppelten Photovoltaikanlagen 2 (LZPV2). Anhang 2: Normierte Monatsstatistiken

    Energy Technology Data Exchange (ETDEWEB)

    Renken, C.; Haeberlin, H.

    2003-07-01

    This is the third part of a four-part final report for the Swiss Federal Office of Energy (SFOE) made by the University of Applied Sciences in Burgdorf, Switzerland. This report presents the findings of a project begun in 1992 that monitored the performance of around 40 photovoltaic (PV) installations in Switzerland. This extensive second appendix to the report describes the eight installations that were monitored in detail, including - amongst others - the demonstration installations on Mont Soleil in the Jura mountains and on the Jungfraujoch in the Alps as well as three test installations using modern thin-film technologies in Burgdorf. The normalised monthly specific performance of these installations was monitored. The report presents the various performance figures in graphical form.

  2. A framework model for water-sharing among co-basin states of a river basin

    Science.gov (United States)

    Garg, N. K.; Azad, Shambhu

    2018-05-01

A new framework model is presented in this study for the sharing of water in a river basin using certain governing variables, in an effort to enhance the objectivity of a reasonable and equitable allocation of water among co-basin states. The governing variables were normalised to bring the governing variables of the different co-basin states of a river basin onto the same scale. In the absence of objective methods for evaluating the weights to be assigned to co-basin states for water allocation, a framework was conceptualised and formulated to determine the normalised weighting factors of the different co-basin states as a function of the governing variables. The water allocation to any co-basin state was assumed to be proportional to its struggle for equity, which in turn was assumed to be a function of the normalised discontent, satisfaction, and weighting factors of each co-basin state. System dynamics was used effectively to represent and solve the proposed model formulation. The proposed model was successfully applied to the Vamsadhara river basin located in the south-eastern part of India, and a sensitivity analysis of the proposed model parameters was carried out to prove its robustness in terms of model convergence and validity over a broad spectrum of values of the model parameters. The solution converged quickly to a final allocation of 1444 million cubic metres (MCM) in the case of the Odisha co-basin state, and to 1067 MCM for the Andhra Pradesh co-basin state. The sensitivity analysis showed that the proposed model's allocation varied from 1584 to 1336 MCM for Odisha and from 927 to 1175 MCM for Andhra Pradesh, depending upon the importance weights given to the governing variables for the calculation of the weighting factors. The proposed model was thus found to be very flexible in exploring various policy options to arrive at a decision in a water sharing problem. It can therefore be effectively applied to any trans-boundary problem where
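    Purely to illustrate the normalisation step described above (the functional forms in the proposed model are more involved, and all numbers here are hypothetical), governing variables can be normalised across co-basin states and combined into weighting factors:

    ```python
    import numpy as np

    # Hypothetical governing variables per co-basin state (rows), e.g.
    # catchment area share, population (millions), irrigable land (Mha).
    states = ["Odisha", "Andhra Pradesh"]
    g = np.array([
        [0.62, 8.0, 1.9],
        [0.38, 6.5, 2.4],
    ])

    # Normalise each governing variable across states so that all
    # variables are on the same scale (here: shares summing to one).
    g_norm = g / g.sum(axis=0)

    # Combine with assumed importance weights and renormalise so the
    # weighting factors of all co-basin states sum to one.
    importance = np.array([0.4, 0.3, 0.3])
    w = g_norm @ importance
    w /= w.sum()

    available = 2511.0  # total allocable yield in MCM (illustrative)
    for state, wi in zip(states, w):
        print(f"{state}: weighting factor {wi:.2f} -> {wi * available:.0f} MCM")
    ```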

  3. Multiphysics modelling of manufacturing processes: A review

    DEFF Research Database (Denmark)

    Jabbari, Masoud; Baran, Ismet; Mohanty, Sankhya

    2018-01-01

Numerical modelling is increasingly supporting the analysis and optimization of manufacturing processes in the production industry. Even if mostly applied to multistep processes, single process steps may be so complex by nature that the models needed to describe them must include multiphysics... the diversity in the field of modelling of manufacturing processes as regards process, materials, generic disciplines as well as length scales: (1) modelling of tape casting for thin ceramic layers, (2) modelling the flow of polymers in extrusion, (3) modelling the deformation process of flexible stamps for nanoimprint lithography, (4) modelling manufacturing of composite parts and (5) modelling the selective laser melting process. For all five examples, the emphasis is on modelling results as well as describing the models in brief mathematical detail. Along with relevant references to the original work...

  4. Configurable multi-perspective business process models

    NARCIS (Netherlands)

    La Rosa, M.; Dumas, M.; Hofstede, ter A.H.M.; Mendling, J.

    2011-01-01

    A configurable process model provides a consolidated view of a family of business processes. It promotes the reuse of proven practices by providing analysts with a generic modeling artifact from which to derive individual process models. Unfortunately, the scope of existing notations for

  5. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  6. Identification of the most sensitive parameters in the activated sludge model implemented in BioWin software.

    Science.gov (United States)

    Liwarska-Bizukojc, Ewa; Biernacki, Rafal

    2010-10-01

In order to simulate biological wastewater treatment processes, data concerning wastewater and sludge composition, process kinetics and stoichiometry are required. Selection of the most sensitive parameters is an important step of model calibration. The aim of this work is to verify the predictability of the activated sludge model implemented in BioWin software and to select its most influential kinetic and stoichiometric parameters with the help of a sensitivity analysis approach. Two different measures of sensitivity are applied: the normalised sensitivity coefficient S(i,j) and the mean square sensitivity measure delta(j)(msqr). It turns out that 17 kinetic and stoichiometric parameters of the BioWin activated sludge (AS) model can be regarded as influential on the basis of the S(i,j) calculations. Half of the influential parameters are associated with growth and decay of phosphorus accumulating organisms (PAOs). The identification of the set of the most sensitive parameters should support the users of this model and initiate the elaboration of determination procedures for those parameters for which such procedures do not yet exist. Copyright 2010 Elsevier Ltd. All rights reserved.
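    For reference, the normalised sensitivity coefficient S(i,j) is typically estimated by perturbing one parameter at a time; a minimal finite-difference sketch (the toy model and all values are hypothetical, not BioWin internals) is:

    ```python
    def normalised_sensitivity(model, theta, j, rel_step=0.1):
        """S_ij = (dy/y) / (dtheta_j/theta_j), estimated by a one-sided
        finite difference with a relative parameter perturbation."""
        y0 = model(theta)
        perturbed = list(theta)
        perturbed[j] *= (1.0 + rel_step)
        y1 = model(perturbed)
        return ((y1 - y0) / y0) / rel_step

    # Toy stand-in for a model output, e.g. effluent ammonia as a
    # function of (mu_max, K_S): purely illustrative.
    def effluent_nh4(theta):
        mu_max, ks = theta
        return ks / (mu_max * 3.0)

    theta = [0.9, 1.2]
    for j, name in enumerate(["mu_max", "K_S"]):
        print(name, normalised_sensitivity(effluent_nh4, theta, j))
    ```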

  7. Syntax highlighting in business process models

    NARCIS (Netherlands)

    Reijers, H.A.; Freytag, T.; Mendling, J.; Eckleder, A.

    2011-01-01

Sense-making of process models is an important task in various phases of business process management initiatives. Despite this, there is currently hardly any support in business process modeling tools that adequately aids model comprehension. In this paper we adapt the concept of syntax

  8. Genetic Process Mining: Alignment-based Process Model Mutation

    NARCIS (Netherlands)

    Eck, van M.L.; Buijs, J.C.A.M.; Dongen, van B.F.; Fournier, F.; Mendling, J.

    2015-01-01

    The Evolutionary Tree Miner (ETM) is a genetic process discovery algorithm that enables the user to guide the discovery process based on preferences with respect to four process model quality dimensions: replay fitness, precision, generalization and simplicity. Traditionally, the ETM algorithm uses

  9. Understanding bicycling in cities using system dynamics modelling.

    Science.gov (United States)

    Macmillan, Alexandra; Woodcock, James

    2017-12-01

    Increasing urban bicycling has established net benefits for human and environmental health. Questions remain about which policies are needed and in what order, to achieve an increase in cycling while avoiding negative consequences. Novel ways of considering cycling policy are needed, bringing together expertise across policy, community and research to develop a shared understanding of the dynamically complex cycling system. In this paper we use a collaborative learning process to develop a dynamic causal model of urban cycling to develop consensus about the nature and order of policies needed in different cycling contexts to optimise outcomes. We used participatory system dynamics modelling to develop causal loop diagrams (CLDs) of cycling in three contrasting contexts: Auckland, London and Nijmegen. We combined qualitative interviews and workshops to develop the CLDs. We used the three CLDs to compare and contrast influences on cycling at different points on a "cycling trajectory" and drew out policy insights. The three CLDs consisted of feedback loops dynamically influencing cycling, with significant overlap between the three diagrams. Common reinforcing patterns emerged: growing numbers of people cycling lifts political will to improve the environment; cycling safety in numbers drives further growth; and more cycling can lead to normalisation across the population. By contrast, limits to growth varied as cycling increases. In Auckland and London, real and perceived danger was considered the main limit, with added barriers to normalisation in London. Cycling congestion and "market saturation" were important in the Netherlands. A generalisable, dynamic causal theory for urban cycling enables a more ordered set of policy recommendations for different cities on a cycling trajectory. Participation meant the collective knowledge of cycling stakeholders was represented and triangulated with research evidence. Extending this research to further cities, especially in low
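    To make the "safety in numbers" reinforcing loop concrete, a toy stock-and-flow simulation (all parameters assumed, far simpler than the paper's causal loop diagrams) can be written in a few lines:

    ```python
    # Toy system dynamics sketch: cycling mode share grows with perceived
    # safety, and safety improves with the number of people cycling
    # ("safety in numbers"), a reinforcing loop with a saturation limit.
    share, carrying_capacity = 0.02, 0.40  # initial and maximum mode share
    dt, attractiveness = 0.1, 0.8

    for step in range(int(20 / dt)):  # simulate 20 years
        safety = share / carrying_capacity          # crude safety proxy
        growth = attractiveness * safety * share * (1 - share / carrying_capacity)
        share += growth * dt

    print(f"mode share after 20 years: {share:.2%}")
    ```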

  10. The triconnected abstraction of process models

    OpenAIRE

    Polyvyanyy, Artem; Smirnov, Sergey; Weske, Mathias

    2009-01-01

Contents: Artem Polyvyanyy, Sergey Smirnov, and Mathias Weske, The Triconnected Abstraction of Process Models: 1 Introduction; 2 Business Process Model Abstraction; 3 Preliminaries; 4 Triconnected Decomposition (4.1 Basic Approach for Process Component Discovery; 4.2 SPQR-Tree Decomposition; 4.3 SPQR-Tree Fragments in the Context of Process Models); 5 Triconnected Abstraction (5.1 Abstraction Rules; 5.2 Abstraction Algorithm); 6 Related Work and Conclusions

  11. Modeling Suspension and Continuation of a Process

    Directory of Open Access Journals (Sweden)

    Oleg Svatos

    2012-04-01

This work focuses on the difficulties an analyst encounters when modeling suspension and continuation of a process in contemporary process modeling languages. As a basis, a general lifecycle of an activity is introduced and compared to the activity lifecycles supported by individual process modeling languages. The comparison shows that contemporary process modeling languages cover the defined general lifecycle of an activity only partially. Two popular process modeling languages are selected and a real example is modeled, reviewing how these languages cope with their lack of native support for suspension and continuation of an activity. Given the unsatisfying results of the contemporary process modeling languages in the modeled example, a new process modeling language is presented which, as demonstrated, is capable of capturing suspension and continuation of an activity in a much simpler and more precise way.

  12. Declarative modeling for process supervision

    International Nuclear Information System (INIS)

    Leyval, L.

    1989-01-01

Our work is a contribution to computer aided supervision of continuous processes. It is inspired by an area of Artificial Intelligence: qualitative physics. Here, supervision is based on a model which continuously provides operators with a synthetic view of the process; but this model is founded on general principles of control theory rather than on physics. It involves concepts such as high gain or small time response. It helps in temporally linking the evolution of various variables. Moreover, the model provides predictions of the future behaviour of the process, which allows action advice and alarm filtering. This should greatly reduce the well-known cognitive overload associated with any complex and dangerous evolution of the process.

  13. ECONOMIC MODELING PROCESSES USING MATLAB

    Directory of Open Access Journals (Sweden)

    Anamaria G. MACOVEI

    2008-06-01

To study economic phenomena and processes using mathematical modeling, and to determine the approximate solution to a problem, we need to choose a method of calculation and a numerical computer program, namely the MatLab package. Any economic process or phenomenon has a mathematical description of its behavior, from which an economic-mathematical model is drawn up through the following stages: formulation of the problem, analysis of the process to be modeled, production of the model, design verification, and validation and implementation of the model. This article presents an economic model and its modeling using mathematical equations and the MatLab software package, which helps us approximate an effective solution. As input data, the net cost, the direct cost and the total cost, and the links between them, are considered. The basic formula for determining the total cost is presented. The economic model calculations were made in the MatLab software package, with graphic representation and interpretation of the results achieved in terms of the specific problem.

  14. Styles in business process modeling: an exploration and a model

    NARCIS (Netherlands)

    Pinggera, J.; Soffer, P.; Fahland, D.; Weidlich, M.; Zugal, S.; Weber, B.; Reijers, H.A.; Mendling, J.

    2015-01-01

    Business process models are an important means to design, analyze, implement, and control business processes. As with every type of conceptual model, a business process model has to meet certain syntactic, semantic, and pragmatic quality requirements to be of value. For many years, such quality

  15. Composing Models of Geographic Physical Processes

    Science.gov (United States)

    Hofer, Barbara; Frank, Andrew U.

Processes are central for geographic information science; yet geographic information systems (GIS) lack capabilities to represent process related information. A prerequisite to including processes in GIS software is a general method to describe geographic processes independently of application disciplines. This paper presents such a method, namely a process description language. The vocabulary of the process description language is derived formally from mathematical models. Physical processes in geography can be described in two equivalent languages: partial differential equations or partial difference equations, where the latter can be shown graphically and used as a method for application specialists to enter their process models. The vocabulary of the process description language comprises components for describing the general behavior of prototypical geographic physical processes. These process components can be composed into basic models of geographic physical processes, as shown by means of an example.
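
As an illustration of the "partial difference equation" view of geographic physical processes mentioned above, the following Python sketch discretises a one-dimensional diffusion process on a grid. The grid size, rate constant and point-release scenario are assumptions for demonstration, not components of the paper's description language:

```python
import numpy as np

def diffuse(u0, r=0.2, steps=100):
    """Partial difference form of 1-D diffusion:
    u[t+1, i] = u[t, i] + r * (u[t, i-1] - 2*u[t, i] + u[t, i+1]).
    Stable for r <= 0.5 in this explicit scheme."""
    u = np.asarray(u0, dtype=float).copy()
    for _ in range(steps):
        u[1:-1] += r * (u[:-2] - 2 * u[1:-1] + u[2:])
    return u

u0 = np.zeros(50)
u0[25] = 1.0                 # point release of a substance (assumed scenario)
print(diffuse(u0).round(3))  # concentration spreads out along the grid
```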

  16. The Structured Process Modeling Theory (SPMT) : a cognitive view on why and how modelers benefit from structuring the process of process modeling

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2015-01-01

    After observing various inexperienced modelers constructing a business process model based on the same textual case description, it was noted that great differences existed in the quality of the produced models. The impression arose that certain quality issues originated from cognitive failures

  17. Integrated modelling in materials and process technology

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri

    2008-01-01

Integrated modelling of entire process sequences and the subsequent in-service conditions, and multiphysics modelling of the single process steps are areas that increasingly support optimisation of manufactured parts. In the present paper, three different examples of modelling manufacturing processes from the viewpoint of combined materials and process modelling are presented: solidification of thin walled ductile cast iron, integrated modelling of spray forming and multiphysics modelling of friction stir welding. The fourth example describes integrated modelling applied to a failure analysis...

  18. The impact of working memory and the “process of process modelling” on model quality: Investigating experienced versus inexperienced modellers

    Science.gov (United States)

    Martini, Markus; Pinggera, Jakob; Neurauter, Manuel; Sachse, Pierre; Furtner, Marco R.; Weber, Barbara

    2016-01-01

    A process model (PM) represents the graphical depiction of a business process, for instance, the entire process from online ordering a book until the parcel is delivered to the customer. Knowledge about relevant factors for creating PMs of high quality is lacking. The present study investigated the role of cognitive processes as well as modelling processes in creating a PM in experienced and inexperienced modellers. Specifically, two working memory (WM) functions (holding and processing of information and relational integration) and three process of process modelling phases (comprehension, modelling, and reconciliation) were related to PM quality. Our results show that the WM function of relational integration was positively related to PM quality in both modelling groups. The ratio of comprehension phases was negatively related to PM quality in inexperienced modellers and the ratio of reconciliation phases was positively related to PM quality in experienced modellers. Our research reveals central cognitive mechanisms in process modelling and has potential practical implications for the development of modelling software and teaching the craft of process modelling. PMID:27157858

  19. Detecting Difference between Process Models Based on the Refined Process Structure Tree

    Directory of Open Access Journals (Sweden)

    Jing Fan

    2017-01-01

Full Text Available The development of mobile workflow management systems (mWfMS) leads to a large number of business process models. In the meantime, the location restriction embedded in mWfMS may result in different process models for a single business process. In order to help users quickly locate the difference and rebuild the process model, detecting the difference between different process models is needed. Existing detection methods either provide a dissimilarity value to represent the difference or use a predefined difference template to generate the result, which cannot reflect the entire composition of the difference. Hence, in this paper, we present a new approach to solve this problem. Firstly, we parse the process models to their corresponding refined process structure trees (PSTs), that is, we decompose a process model into a hierarchy of subprocess models. Then we design a method to convert the PST to its corresponding task based process structure tree (TPST). As a consequence, the problem of detecting difference between two process models is transformed to detecting difference between their corresponding TPSTs. Finally, we obtain the difference between two TPSTs based on the divide and conquer strategy, where the difference is described by an edit script and we make the cost of the edit script close to the minimum. The extensive experimental evaluation shows that our method can meet the real requirements in terms of precision and efficiency.
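
To make the edit-script idea concrete, here is a toy Python sketch that walks two labelled trees (stand-ins for TPSTs) and emits insert/delete/relabel operations. Unlike the paper's method it makes no attempt to minimise edit cost, and the tuple encoding of trees is an assumption:

```python
def diff(a, b, path="root", script=None):
    """Toy edit script between two labelled trees.

    A tree is (label, [children]). Real TPST diffing minimises the cost
    of the edit script; this sketch just walks both trees in parallel
    and records relabel/insert/delete operations as it finds them."""
    if script is None:
        script = []
    if a is None:                                   # subtree only in b
        script.append(("insert", path, b[0]))
        for i, c in enumerate(b[1]):
            diff(None, c, f"{path}/{i}", script)
        return script
    if b is None:                                   # subtree only in a
        script.append(("delete", path, a[0]))
        for i, c in enumerate(a[1]):
            diff(c, None, f"{path}/{i}", script)
        return script
    if a[0] != b[0]:
        script.append(("relabel", path, a[0], b[0]))
    for i in range(max(len(a[1]), len(b[1]))):      # compare children pairwise
        ca = a[1][i] if i < len(a[1]) else None
        cb = b[1][i] if i < len(b[1]) else None
        diff(ca, cb, f"{path}/{i}", script)
    return script

t1 = ("seq", [("A", []), ("B", [])])
t2 = ("seq", [("A", []), ("C", []), ("D", [])])
print(diff(t1, t2))   # relabel B->C, insert D
```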

  20. Strategies to Automatically Derive a Process Model from a Configurable Process Model Based on Event Data

    Directory of Open Access Journals (Sweden)

    Mauricio Arriagada-Benítez

    2017-10-01

    Full Text Available Configurable process models are frequently used to represent business workflows and other discrete event systems among different branches of large organizations: they unify commonalities shared by all branches and describe their differences, at the same time. The configuration of such models is usually done manually, which is challenging. On the one hand, when the number of configurable nodes in the configurable process model grows, the size of the search space increases exponentially. On the other hand, the person performing the configuration may lack the holistic perspective to make the right choice for all configurable nodes at the same time, since choices influence each other. Nowadays, information systems that support the execution of business processes create event data reflecting how processes are performed. In this article, we propose three strategies (based on exhaustive search, genetic algorithms and a greedy heuristic that use event data to automatically derive a process model from a configurable process model that better represents the characteristics of the process in a specific branch. These strategies have been implemented in our proposed framework and tested in both business-like event logs as recorded in a higher educational enterprise resource planning system and a real case scenario involving a set of Dutch municipalities.
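
A minimal sketch of the genetic-algorithm strategy follows, assuming a binary choice per configurable node and a stubbed fitness function. In the paper, fitness is derived from replaying the event data against the configured model, which is not reproduced here:

```python
import random

def fitness(config, log=None):
    """Placeholder fitness: counts agreement with a hypothetical 'ideal'
    configuration. The paper instead scores how well the configured
    model reflects the branch's event log."""
    ideal = [1, 0, 1, 1, 0, 1]
    return sum(c == i for c, i in zip(config, ideal))

def genetic_configure(n_nodes=6, pop_size=20, generations=40, log=None):
    pop = [[random.randint(0, 1) for _ in range(n_nodes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda c: fitness(c, log), reverse=True)
        parents = pop[: pop_size // 2]              # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_nodes)      # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < 0.1:               # bit-flip mutation
                i = random.randrange(n_nodes)
                child[i] ^= 1
            children.append(child)
        pop = parents + children
    return max(pop, key=lambda c: fitness(c, log))

print(genetic_configure())   # best configuration vector found
```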

  1. Repairing process models to reflect reality

    NARCIS (Netherlands)

    Fahland, D.; Aalst, van der W.M.P.; Barros, A.; Gal, A.; Kindler, E.

    2012-01-01

Process mining techniques relate observed behavior (i.e., event logs) to modeled behavior (e.g., a BPMN model or a Petri net). Process models can be discovered from event logs and conformance checking techniques can be used to detect and diagnose differences between observed and modeled behavior.

  2. Business process modeling for processing classified documents using RFID technology

    Directory of Open Access Journals (Sweden)

    Koszela Jarosław

    2016-01-01

Full Text Available The article outlines the application of the processing approach to the functional description of a designed IT system supporting the operations of a secret office which processes classified documents. The article describes the application of the method of incremental modeling of business processes according to the BPMN model to the description of the processes currently implemented manually ("as is") and the target processes ("to be") using RFID technology for the purpose of their automation. Additionally, examples of applying the method of structural and dynamic analysis of the processes (process simulation) to verify their correctness and efficiency are presented. A possible extension of the process analysis method is the application of a process warehouse and process mining methods.

  3. Business Process Modelling for Measuring Quality

    NARCIS (Netherlands)

    Heidari, F.; Loucopoulos, P.; Brazier, F.M.

    2013-01-01

    Business process modelling languages facilitate presentation, communication and analysis of business processes with different stakeholders. This paper proposes an approach that drives specification and measurement of quality requirements and in doing so relies on business process models as

  4. Investigation of Mediational Processes Using Parallel Process Latent Growth Curve Modeling

    Science.gov (United States)

    Cheong, JeeWon; MacKinnon, David P.; Khoo, Siek Toon

    2010-01-01

    This study investigated a method to evaluate mediational processes using latent growth curve modeling. The mediator and the outcome measured across multiple time points were viewed as 2 separate parallel processes. The mediational process was defined as the independent variable influencing the growth of the mediator, which, in turn, affected the growth of the outcome. To illustrate modeling procedures, empirical data from a longitudinal drug prevention program, Adolescents Training and Learning to Avoid Steroids, were used. The program effects on the growth of the mediator and the growth of the outcome were examined first in a 2-group structural equation model. The mediational process was then modeled and tested in a parallel process latent growth curve model by relating the prevention program condition, the growth rate factor of the mediator, and the growth rate factor of the outcome. PMID:20157639
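
The mediational structure described above can be illustrated by simulating two parallel growth processes, where the program condition shifts the mediator's growth rate and that rate in turn shifts the outcome's growth rate. All coefficients in this Python sketch are invented for illustration and are not estimates from the ATLAS data:

```python
import numpy as np

rng = np.random.default_rng(0)
n, waves = 500, 4
t = np.arange(waves)
group = rng.integers(0, 2, n)        # 0 = control, 1 = prevention program

# Program raises the mediator's growth rate (path a); the mediator's
# growth rate raises the outcome's growth rate (path b): a*b mediation.
a, b = 0.5, 0.8
slope_m = 0.2 + a * group + rng.normal(0, 0.1, n)
slope_y = 0.1 + b * slope_m + rng.normal(0, 0.1, n)
M = 1.0 + slope_m[:, None] * t + rng.normal(0, 0.2, (n, waves))
Y = 2.0 + slope_y[:, None] * t + rng.normal(0, 0.2, (n, waves))

# Naive moment-based check of the mediated effect on the latent slopes:
a_hat = np.cov(group, slope_m)[0, 1] / np.var(group)
b_hat = np.cov(slope_m, slope_y)[0, 1] / np.var(slope_m)
print("indirect effect a*b ~", a * b, "estimated:", round(a_hat * b_hat, 3))
```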

  5. Behavioral conformance of artifact-centric process models

    NARCIS (Netherlands)

    Fahland, D.; Leoni, de M.; Dongen, van B.F.; Aalst, van der W.M.P.; Abramowicz, W.

    2011-01-01

The use of process models in business information systems for analysis, execution, and improvement of processes assumes that the models describe reality. Conformance checking is a technique to validate how well a given process model describes recorded executions of the actual process. Recently,

  6. Cost Models for MMC Manufacturing Processes

    Science.gov (United States)

    Elzey, Dana M.; Wadley, Haydn N. G.

    1996-01-01

Processes for the manufacture of advanced metal matrix composites are rapidly approaching maturity in the research laboratory and there is growing interest in their transition to industrial production. However, research conducted to date has almost exclusively focused on overcoming the technical barriers to producing high-quality material, and little attention has been given to the economic feasibility of these laboratory approaches and process cost issues. A quantitative cost modeling (QCM) approach was developed to address these issues. QCMs are cost analysis tools based on predictive process models relating process conditions to the attributes of the final product. An important attribute of the QCM approach is the ability to predict the sensitivity of material production costs to product quality and to quantitatively explore trade-offs between cost and quality. Applications of the cost models allow more efficient direction of future MMC process technology development and a more accurate assessment of MMC market potential. Cost models were developed for two state-of-the-art metal matrix composite (MMC) manufacturing processes: tape casting and plasma spray deposition. Quality and cost models are presented for both processes and the resulting predicted quality-cost curves are presented and discussed.

  7. Conceptual modelling of human resource evaluation process

    Directory of Open Access Journals (Sweden)

    Negoiţă Doina Olivia

    2017-01-01

Full Text Available Taking into account the highly diverse tasks which employees have to fulfil due to the complex requirements of today's consumers, the human resource within an enterprise has become a strategic element for developing and exploiting products which meet market expectations. Therefore, organizations encounter difficulties when approaching the human resource evaluation process. Hence, the aim of the current paper is to design a conceptual model of the aforementioned process, which allows enterprises to develop a specific methodology. In order to design the conceptual model, Business Process Modelling instruments were employed - the Adonis Community Edition Business Process Management Toolkit using the ADONIS BPMS Notation. The conceptual model was developed based on an in-depth secondary research regarding the human resource evaluation process. The proposed conceptual model represents a generic workflow (sequential and/or simultaneous activities), which can be extended considering the enterprise's needs regarding their requirements when conducting a human resource evaluation process. Enterprises can benefit from using software instruments for business process modelling as they enable process analysis and evaluation (predefined/specific queries) and also model optimization (simulations).

  8. Error analysis of short term wind power prediction models

    International Nuclear Information System (INIS)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco

    2011-01-01

The integration of wind farms in power networks has become an important problem. This is because the electricity produced cannot be stored, given the high cost of storage, and electricity production must follow market demand. Short- to long-range wind forecasting over different lengths/periods of time is becoming an important process for the management of wind farms. Time series modelling of wind speeds is based upon the valid assumption that all the causative factors are implicitly accounted for in the sequence of occurrence of the process itself. Hence time series modelling is equivalent to physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed in order to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different time prediction periods. This comparative analysis takes into account, for the first time, various forecasting methods and time horizons, together with a deep performance analysis focused upon the normalised mean error and its statistical distribution, in order to identify forecasting methods for which the error distribution is narrower and prediction errors are therefore less probable. (author)
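
The normalised error statistics at the heart of this analysis are straightforward to compute. The Python sketch below uses an assumed farm capacity and a naive persistence forecast standing in for the ARMA/ANN/ANFIS models, just to show the idea:

```python
import numpy as np

def normalised_mean_errors(actual, predicted, capacity):
    """Normalised mean error, normalised mean absolute error and error
    spread of a wind power forecast, expressed relative to capacity."""
    e = (np.asarray(predicted) - np.asarray(actual)) / capacity
    return {"NME": e.mean(), "NMAE": np.abs(e).mean(), "std": e.std()}

rng = np.random.default_rng(1)
actual = rng.uniform(0, 2000, 200)       # kW; capacity of the 3-turbine farm is assumed
persistence = np.roll(actual, 1)         # naive 1-step persistence forecast
print(normalised_mean_errors(actual[1:], persistence[1:], capacity=2000.0))
```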

  9. Error analysis of short term wind power prediction models

    Energy Technology Data Exchange (ETDEWEB)

    De Giorgi, Maria Grazia; Ficarella, Antonio; Tarantino, Marco [Dipartimento di Ingegneria dell' Innovazione, Universita del Salento, Via per Monteroni, 73100 Lecce (Italy)

    2011-04-15

The integration of wind farms in power networks has become an important problem. This is because the electricity produced cannot be stored, given the high cost of storage, and electricity production must follow market demand. Short- to long-range wind forecasting over different lengths/periods of time is becoming an important process for the management of wind farms. Time series modelling of wind speeds is based upon the valid assumption that all the causative factors are implicitly accounted for in the sequence of occurrence of the process itself. Hence time series modelling is equivalent to physical modelling. Auto Regressive Moving Average (ARMA) models, which perform a linear mapping between inputs and outputs, and Artificial Neural Networks (ANNs) and Adaptive Neuro-Fuzzy Inference Systems (ANFIS), which perform a non-linear mapping, provide a robust approach to wind power prediction. In this work, these models are developed in order to forecast the power production of a wind farm with three wind turbines, using real load data and comparing different time prediction periods. This comparative analysis takes into account, for the first time, various forecasting methods and time horizons, together with a deep performance analysis focused upon the normalised mean error and its statistical distribution, in order to identify forecasting methods for which the error distribution is narrower and prediction errors are therefore less probable. (author)

  10. Modeling Dynamic Systems with Efficient Ensembles of Process-Based Models.

    Directory of Open Access Journals (Sweden)

    Nikola Simidjievski

Full Text Available Ensembles are a well-established machine learning paradigm, leading to accurate and robust models, predominantly applied to predictive modeling tasks. Ensemble models comprise a finite set of diverse predictive models whose combined output is expected to yield an improved predictive performance as compared to an individual model. In this paper, we propose a new method for learning ensembles of process-based models of dynamic systems. The process-based modeling paradigm employs domain-specific knowledge to automatically learn models of dynamic systems from time-series observational data. Previous work has shown that ensembles based on sampling observational data (i.e., bagging and boosting) significantly improve the predictive performance of process-based models. However, this improvement comes at the cost of a substantial increase in the computational time needed for learning. To address this problem, the paper proposes a method that aims at efficiently learning ensembles of process-based models, while maintaining their accurate long-term predictive performance. This is achieved by constructing ensembles by sampling domain-specific knowledge instead of sampling data. We apply the proposed method to, and evaluate its performance on, a set of problems of automated predictive modeling in three lake ecosystems using a library of process-based knowledge for modeling population dynamics. The experimental results identify the optimal design decisions regarding the learning algorithm. The results also show that the proposed ensembles yield significantly more accurate predictions of population dynamics as compared to individual process-based models. Finally, while their predictive performance is comparable to that of ensembles obtained with the state-of-the-art methods of bagging and boosting, they are substantially more efficient.
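
The bagging baseline that the proposed method is compared against can be sketched compactly: fit one model per bootstrap resample of the series and average the predictions. In the Python sketch below, a polynomial fit stands in for the paper's process-based models, and all constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 10, 80)
y = np.exp(0.25 * t) / (1 + 0.05 * (np.exp(0.25 * t) - 1))  # logistic-like growth
y_obs = y + rng.normal(0, 0.5, t.size)                       # noisy observations

def fit_model(ts, ys, degree=4):
    # Polynomial stand-in for a learned process-based model.
    return np.polynomial.Polynomial.fit(ts, ys, degree)

# Bagging: one model per bootstrap resample, predictions averaged.
preds = []
for _ in range(25):
    idx = rng.integers(0, t.size, t.size)
    preds.append(fit_model(t[idx], y_obs[idx])(t))
ensemble = np.mean(preds, axis=0)

print("single-model MAE:", np.abs(fit_model(t, y_obs)(t) - y).mean())
print("bagged-ensemble MAE:", np.abs(ensemble - y).mean())
```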

  11. Information-Processing Models and Curriculum Design

    Science.gov (United States)

    Calfee, Robert C.

    1970-01-01

    "This paper consists of three sections--(a) the relation of theoretical analyses of learning to curriculum design, (b) the role of information-processing models in analyses of learning processes, and (c) selected examples of the application of information-processing models to curriculum design problems." (Author)

  12. Measuring similarity between business process models

    NARCIS (Netherlands)

    Dongen, van B.F.; Dijkman, R.M.; Mendling, J.

    2007-01-01

    Quality aspects become increasingly important when business process modeling is used in a large-scale enterprise setting. In order to facilitate a storage without redundancy and an efficient retrieval of relevant process models in model databases it is required to develop a theoretical understanding

  13. Model medication management process in Australian nursing homes using business process modeling.

    Science.gov (United States)

    Qian, Siyu; Yu, Ping

    2013-01-01

One of the reasons for end-user avoidance or rejection of health information systems is poor alignment of the system with the healthcare workflow, likely caused by system designers' lack of thorough understanding of the healthcare process. Therefore, understanding the healthcare workflow is the essential first step in the design of optimal technologies that will enable care staff to complete the intended tasks faster and better. The frequent use of multiple or "high risk" medicines by older people in nursing homes has the potential to increase the medication error rate. To facilitate the design of information systems with the most potential to improve patient safety, this study aims to understand the medication management process in nursing homes using the business process modeling method. The paper presents the study design and preliminary findings from interviewing two registered nurses, who were team leaders in two nursing homes. Although there were subtle differences in medication management between the two homes, major medication management activities were similar. Further field observation will be conducted. Based on the data collected from observations, an as-is process model for medication management will be developed.

  14. Simulation Modeling of Software Development Processes

    Science.gov (United States)

    Calavaro, G. F.; Basili, V. R.; Iazeolla, G.

    1996-01-01

    A simulation modeling approach is proposed for the prediction of software process productivity indices, such as cost and time-to-market, and the sensitivity analysis of such indices to changes in the organization parameters and user requirements. The approach uses a timed Petri Net and Object Oriented top-down model specification. Results demonstrate the model representativeness, and its usefulness in verifying process conformance to expectations, and in performing continuous process improvement and optimization.

  15. Social software for business process modeling

    NARCIS (Netherlands)

    Koschmider, A.; Song, M.S.; Reijers, H.A.

    2010-01-01

    Formal models of business processes are used for a variety of purposes. But where the elicitation of the characteristics of a business process usually takes place in a collaborative fashion, the building of the final, formal process model is done mostly by a single person. This article presents the

  16. Modeling business processes: theoretical and practical aspects

    Directory of Open Access Journals (Sweden)

    V.V. Dubininа

    2015-06-01

Full Text Available The essence of process-oriented enterprise management is examined in the article. The content and types of information technology are analyzed, given the complexity and differentiation of existing methods as well as the specificity of the language and terminology of enterprise business process modeling. The theoretical aspects of business process modeling are reviewed, and modern traditional modeling techniques that have found practical application in visualizing retailers' activity are studied. The theoretical analysis of modeling methods found that the UFO-toolkit method, developed by Ukrainian scientists, is the most suitable for structural and object analysis of retailers' business processes due to its integrated systemological capabilities. A visualized simulation model of the retailers' business process "sales as is" was designed using a combination of UFO-elements, with the aim of further practical formalization and optimization of the given business process.

  17. Qualitative simulation in formal process modelling

    International Nuclear Information System (INIS)

    Sivertsen, Elin R.

    1999-01-01

    In relation to several different research activities at the OECD Halden Reactor Project, the usefulness of formal process models has been identified. Being represented in some appropriate representation language, the purpose of these models is to model process plants and plant automatics in a unified way to allow verification and computer aided design of control strategies. The present report discusses qualitative simulation and the tool QSIM as one approach to formal process models. In particular, the report aims at investigating how recent improvements of the tool facilitate the use of the approach in areas like process system analysis, procedure verification, and control software safety analysis. An important long term goal is to provide a basis for using qualitative reasoning in combination with other techniques to facilitate the treatment of embedded programmable systems in Probabilistic Safety Analysis (PSA). This is motivated from the potential of such a combination in safety analysis based on models comprising both software, hardware, and operator. It is anticipated that the research results from this activity will benefit V and V in a wide variety of applications where formal process models can be utilized. Examples are operator procedures, intelligent decision support systems, and common model repositories (author) (ml)

  18. Modeling closed nuclear fuel cycles processes

    Energy Technology Data Exchange (ETDEWEB)

    Shmidt, O.V. [A.A. Bochvar All-Russian Scientific Research Institute for Inorganic Materials, Rogova, 5a street, Moscow, 123098 (Russian Federation); Makeeva, I.R. [Zababakhin All-Russian Scientific Research Institute of Technical Physics, Vasiliev street 13, Snezhinsk, Chelyabinsk region, 456770 (Russian Federation); Liventsov, S.N. [Tomsk Polytechnic University, Tomsk, Lenin Avenue, 30, 634050 (Russian Federation)

    2016-07-01

Computer models of processes are necessary for the determination of optimal operating conditions for closed nuclear fuel cycle (NFC) processes. Computer models can be quickly changed in accordance with new and fresh data from experimental research. Three kinds of process simulation are necessary. First, the VIZART software package is used for balance model development, calculating the material flow in technological processes. VIZART takes into account equipment capacity, transport lines and storage volumes. Secondly, it is necessary to simulate the physico-chemical processes that are involved in the closure of the NFC. The third kind of simulation is the development of software that allows the optimization, diagnostics and control of the processes, which implies real-time simulation of product flows in the whole plant or in separate lines of the plant. (A.C.)

  19. Eye Tracking Meets the Process of Process Modeling: a Visual Analytic Approach

    DEFF Research Database (Denmark)

    Burattin, Andrea; Kaiser, M.; Neurauter, Manuel

    2017-01-01

    Research on the process of process modeling (PPM) studies how process models are created. It typically uses the logs of the interactions with the modeling tool to assess the modeler’s behavior. In this paper we suggest to introduce an additional stream of data (i.e., eye tracking) to improve the ...

  20. Modeling biochemical transformation processes and information processing with Narrator.

    Science.gov (United States)

    Mandel, Johannes J; Fuss, Hendrik; Palfreyman, Niall M; Dubitzky, Werner

    2007-03-27

Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Narrator is a flexible and intuitive systems biology tool. It is

  1. Stochastic Evolution Dynamic of the Rock-Scissors-Paper Game Based on a Quasi Birth and Death Process.

    Science.gov (United States)

    Yu, Qian; Fang, Debin; Zhang, Xiaoling; Jin, Chen; Ren, Qiyu

    2016-06-27

Stochasticity plays an important role in the evolutionary dynamic of cyclic dominance within a finite population. To investigate the stochastic evolution process of the behaviour of boundedly rational individuals, we model the Rock-Scissors-Paper (RSP) game as a finite, state-dependent Quasi Birth and Death (QBD) process. We assume that boundedly rational players can adjust their strategies by imitating the successful strategy according to the payoffs of the last round of the game, and we then analyse the limiting distribution of the QBD process for the game's stochastic evolutionary dynamic. The numerical experiment results are exhibited as pseudo-colour ternary heat maps. Comparisons of these diagrams show that the convergence property of the long-run equilibrium of the RSP game in populations depends on the population size, the parameters of the payoff matrix and the noise factor. The long-run equilibrium is asymptotically stable, neutrally stable or unstable according to the normalised parameters in the payoff matrix. Moreover, the results show that the distribution probability becomes more concentrated with a larger population size. This indicates that increasing the population size also increases the convergence speed of the stochastic evolution process while simultaneously reducing the influence of the noise factor.
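
For a finite chain, the limiting-distribution computation reduces to finding the left eigenvector of the transition matrix for eigenvalue 1. The Python sketch below uses a toy three-state chain over the pure R/S/P strategies; the paper's QBD chain tracks full population compositions and is far larger, so this is only schematic:

```python
import numpy as np

def stationary(P):
    """Stationary distribution of a finite Markov chain: the left
    eigenvector of the row-stochastic matrix P for eigenvalue 1."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1))])
    return pi / pi.sum()

# Toy cyclic-dominance transition matrix (rows: from R, S, P), assumed.
P = np.array([[0.6, 0.3, 0.1],
              [0.1, 0.6, 0.3],
              [0.3, 0.1, 0.6]])
print(stationary(P))   # symmetric cycling -> uniform limit here
```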

  2. The Formalization of the Business Process Modeling Goals

    Directory of Open Access Journals (Sweden)

    Ligita Bušinska

    2016-10-01

Full Text Available In business process modeling, the de facto standard BPMN has emerged. However, applications of this notation use many subsets of its elements and various extensions. Also, BPMN still coexists with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the modeler's goal is a central notion in the choice of modeling languages and notations, most research that proposes guidelines, techniques, and methods for business process modeling language evaluation and/or selection does not formalize the business process modeling goal or take it into account transparently. To overcome this gap, and to explicate and help handle business process modeling complexity, an approach to formalizing the business process modeling goal, and a supporting three-dimensional business process modeling framework, are proposed.

  3. Pre-Processing and Modeling Tools for Bigdata

    Directory of Open Access Journals (Sweden)

    Hashem Hadi

    2016-09-01

Full Text Available Modeling tools and operators help the user/developer to identify the processing field at the top of the sequence and to send into the computing module only the data related to the requested result. The remaining data is not relevant and would only slow down the processing. The biggest challenge nowadays is to obtain high-quality processing results with reduced computing time and costs. To do so, we must review the processing sequence by adding several modeling tools. Existing processing models do not take this aspect into consideration and focus on raw calculation performance, which increases computing time and costs. In this paper we provide a study of the main modeling tools for BigData and a new model based on pre-processing.

  4. Kopernik : modeling business processes for digital customers

    OpenAIRE

    Estañol Lamarca, Montserrat; Castro, Manuel; Díaz-Montenegro, Sylvia; Teniente López, Ernest

    2016-01-01

    This paper presents the Kopernik methodology for modeling business processes for digital customers. These processes require a high degree of flexibility in the execution of their tasks or actions. We achieve this by using the artifact-centric approach to process modeling and the use of condition-action rules. The processes modeled following Kopernik can then be implemented in an existing commercial tool, Balandra.

  5. Specification of e-business process model for PayPal online payment process using Reo

    OpenAIRE

    Xie, M.

    2005-01-01

E-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process modeling languages have been used as tools. However, many existing business process modeling languages lack (a) formal semantics, (b) a formal computational model, and (c) an integrated view of the busi...

  6. Process mining using BPMN: relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; van der Aalst, W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2017-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining
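
For readers who want to experiment with the log-to-BPMN route described here, the open-source pm4py library offers a compact entry point. The sketch below assumes pm4py 2.x (function names may differ between versions) and a hypothetical XES log file; it is not the tooling used in the paper:

```python
# Minimal sketch using the open-source pm4py library (API as of pm4py 2.x).
import pm4py

log = pm4py.read_xes("orders.xes")           # hypothetical event log file
model = pm4py.discover_bpmn_inductive(log)   # mine a BPMN model from the log
pm4py.view_bpmn(model)                       # render the discovered diagram
```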

  7. Process mining using BPMN : relating event logs and process models

    NARCIS (Netherlands)

    Kalenkova, A.A.; Aalst, van der W.M.P.; Lomazova, I.A.; Rubin, V.A.

    2015-01-01

    Process-aware information systems (PAIS) are systems relying on processes, which involve human and software resources to achieve concrete goals. There is a need to develop approaches for modeling, analysis, improvement and monitoring processes within PAIS. These approaches include process mining

  8. A Comparative of business process modelling techniques

    Science.gov (United States)

    Tangkawarow, I. R. H. T.; Waworuntu, J.

    2016-04-01

In this era, there are many business process modeling techniques. This article presents research on the differences between business process modeling techniques; for each technique, the definition and the structure are explained. The paper presents a comparative analysis of some popular business process modelling techniques. The comparative framework is based on two criteria: notation and how the technique works when implemented in Somerleyton Animal Park. The discussion of each technique ends with its advantages and disadvantages. The final conclusion gives recommendations on business process modeling techniques that are easy to use and serves as the basis for evaluating further modelling techniques.

  9. Piecewise deterministic processes in biological models

    CERN Document Server

    Rudnicki, Ryszard

    2017-01-01

    This book presents a concise introduction to piecewise deterministic Markov processes (PDMPs), with particular emphasis on their applications to biological models. Further, it presents examples of biological phenomena, such as gene activity and population growth, where different types of PDMPs appear: continuous time Markov chains, deterministic processes with jumps, processes with switching dynamics, and point processes. Subsequent chapters present the necessary tools from the theory of stochastic processes and semigroups of linear operators, as well as theoretical results concerning the long-time behaviour of stochastic semigroups induced by PDMPs and their applications to biological models. As such, the book offers a valuable resource for mathematicians and biologists alike. The first group will find new biological models that lead to interesting and often new mathematical questions, while the second can observe how to include seemingly disparate biological processes into a unified mathematical theory, and...
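
A minimal simulation conveys the PDMP flavour: deterministic flow between random jump times. The Python sketch below models exponential decay punctuated by Poisson-timed bursts, loosely in the spirit of the book's gene-activity examples; all rates are invented:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_pdmp(T=50.0, decay=0.3, jump_rate=0.5, jump_size=1.0):
    """Piecewise deterministic Markov process: exponential decay between
    jumps that arrive at Poisson times (e.g., bursts of gene activity).
    All parameters here are illustrative."""
    t, x, path = 0.0, 0.0, [(0.0, 0.0)]
    while t < T:
        dt = rng.exponential(1.0 / jump_rate)   # waiting time to next jump
        t += dt
        x *= np.exp(-decay * dt)                # deterministic flow segment
        x += jump_size                          # random jump
        path.append((t, x))
    return path

print(simulate_pdmp()[:5])   # first few (time, state) points of one trajectory
```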

  10. Computer Aided Continuous Time Stochastic Process Modelling

    DEFF Research Database (Denmark)

    Kristensen, N.R.; Madsen, Henrik; Jørgensen, Sten Bay

    2001-01-01

    A grey-box approach to process modelling that combines deterministic and stochastic modelling is advocated for identification of models for model-based control of batch and semi-batch processes. A computer-aided tool designed for supporting decision-making within the corresponding modelling cycle...

  11. Neuroscientific Model of Motivational Process

    OpenAIRE

    Kim, Sung-il

    2013-01-01

Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub-processes: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub-processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Rewa...

  12. Measures of Quality in Business Process Modelling

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-06-01

Full Text Available Business process modelling and analysis is undoubtedly one of the most important parts of Applied (Business) Informatics. The quality of business process models (diagrams) is crucial for any purpose in this area. The goal of a process analyst's work is to create generally understandable, explicit and error-free models. If a process is properly described, the created models can be used as an input into deep analysis and optimization. It can be assumed that properly designed business process models (similarly to correctly written algorithms) contain characteristics that can be mathematically described, and that it should therefore be possible to create a tool that helps process analysts design proper models. As part of this review, a systematic literature review was conducted in order to find and analyse measures of business process model design and quality. It was found that the mentioned area had already been the subject of research investigation in the past. Thirty-three suitable scientific publications and twenty-two quality measures were found. The analysed scientific publications and existing quality measures do not reflect all important attributes of business process model clarity, simplicity and completeness. Therefore it would be appropriate to add new measures of quality.
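
Two of the simplest measures discussed in this literature, model size and a control-flow-complexity-like count of weighted decision points, can be computed directly from an adjacency representation of the model. The encoding and the metric definitions in this Python sketch are simplified assumptions, not the exact measures from the reviewed publications:

```python
def size(model):
    """Number of nodes in the process model."""
    return len(model)

def control_flow_complexity(model):
    """Decision points weighted by their fan-out: a rough analogue of
    the CFC metric discussed in the literature, not its exact form."""
    return sum(len(succ) for succ in model.values() if len(succ) > 1)

# Process model as an adjacency mapping: node -> successor nodes (assumed).
model = {
    "start": ["check order"],
    "check order": ["approve", "reject"],   # XOR split
    "approve": ["ship"],
    "reject": ["end"],
    "ship": ["end"],
    "end": [],
}
print("size:", size(model), "| CFC-like:", control_flow_complexity(model))
```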

  13. Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.

    Science.gov (United States)

    Tute, Erik; Steiner, Jochen

    2018-01-01

Literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means for that. The objective of this work was to support the management and maintenance of processes extracting, transforming and loading (ETL) data into CDWHs, and to ease the reuse of metadata between regular IT management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were conducted to find requirements and existing modeling techniques. An ETL modeling technique was developed by extending existing modeling techniques, and was evaluated by exemplarily modeling an existing ETL process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications. Six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in the management and maintenance of ETL processes and could serve as an interface between regular IT management, the CDWH and secondary data users.

  14. Simulation Models of Human Decision-Making Processes

    Directory of Open Access Journals (Sweden)

    Nina RIZUN

    2014-10-01

Full Text Available The main purpose of the paper is the presentation of a new concept for modeling the human decision-making process via an analogy with Automatic Control Theory. From the author's point of view this concept allows the theory of decision-making to be developed and improved in terms of the study and classification of the specificity of human intellectual processes in different conditions. It was shown that the main distinguishing feature between the Heuristic/Intuitive and Rational Decision-Making Models is the presence of the so-called phenomenon of "enrichment" of the input information with human propensities, hobbies, tendencies, expectations, axioms, judgments, presumptions or biases and their justification. In order to obtain additional knowledge about the basic intellectual processes, as well as the possibility of modeling the decision results under various parameters characterizing the decision-maker, a complex of simulation models was developed. These models are based on the following assumptions: the basic intellectual processes of the Rational Decision-Making Model can be adequately simulated and identified by the transient processes of a proportional-integral-derivative controller, while the basic intellectual processes of the Bounded Rationality and Intuitive Models can be adequately simulated and identified by the transient processes of nonlinear elements. A taxonomy of the most typical Automatic Control Theory elements and their correspondence to particular decision-making models, from the point of view of decision-making process specificity and decision-maker behavior during a certain time of professional activity, was obtained.
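
The PID analogy is easy to reproduce: the transient of a PID-controlled first-order plant is the kind of signal the author maps onto rational decision-making. The gains and plant in this Python sketch are illustrative choices, not the paper's parameters:

```python
import numpy as np

def pid_step_response(kp=1.2, ki=0.4, kd=0.05, dt=0.05, T=10.0):
    """Discrete PID controller driving a first-order plant (dy/dt = -y + u)
    towards a unit setpoint; the transient (overshoot, settling time) is
    what the paper likens to phases of a rational decision process."""
    n = int(T / dt)
    y, integral, prev_err = 0.0, 0.0, 1.0
    out = np.empty(n)
    for k in range(n):
        err = 1.0 - y                        # setpoint = 1
        integral += err * dt
        deriv = (err - prev_err) / dt
        u = kp * err + ki * integral + kd * deriv
        y += dt * (-y + u)                   # explicit Euler step of the plant
        prev_err = err
        out[k] = y
    return out

resp = pid_step_response()
print("final value:", round(resp[-1], 3))    # should settle near 1.0
```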

  15. Diabetic ketoacidosis in adult patients: an audit of factors influencing time to normalisation of metabolic parameters.

    Science.gov (United States)

    Lee, Melissa H; Calder, Genevieve L; Santamaria, John D; MacIsaac, Richard J

    2018-05-01

Diabetic ketoacidosis (DKA) is an acute, life-threatening metabolic complication of diabetes that imposes a substantial burden on our healthcare system. There is a paucity of published data in Australia assessing factors influencing time to resolution of DKA and length of stay (LOS). To identify factors that predict a slower time to resolution of DKA in adults with diabetes, a retrospective audit was conducted of patients admitted to St Vincent's Hospital Melbourne between 2010 and 2014 coded with a diagnosis of 'Diabetic Ketoacidosis'. The primary outcome was time to resolution of DKA based on normalisation of biochemical markers. Episodes of DKA within the wider Victorian hospital network were also explored. Seventy-one patients met biochemical criteria for DKA; the median age was 31 years (26-45 years), 59% were male and 23% had newly diagnosed diabetes. Insulin omission was the most common precipitant (42%). The median time to resolution of DKA was 11 h (6.5-16.5 h). Individual factors associated with slower resolution of DKA were lower admission pH (P < 0.001) and higher admission serum potassium level (P = 0.03). The median LOS was 3 days (2-5 days), compared to a Victorian state-wide LOS of 2 days. Higher comorbidity scores were associated with longer LOS (P < 0.001). Lower admission pH levels and higher admission serum potassium levels are independent predictors of slower time to resolution of DKA. This may assist in stratifying patients with DKA using markers of severity, to determine who may benefit from closer monitoring and to predict LOS. © 2018 Royal Australasian College of Physicians.

  16. User-guided discovery of declarative process models

    NARCIS (Netherlands)

    Maggi, F.M.; Mooij, A.J.; Aalst, van der W.M.P.; Chawla, N.; King, I.; Sperduti, A.

    2011-01-01

    Process mining techniques can be used to effectively discover process models from logs with example behaviour. Cross-correlating a discovered model with information in the log can be used to improve the underlying process. However, existing process discovery techniques have two important drawbacks.

  17. Deferred Action: Theoretical model of process architecture design for emergent business processes

    Directory of Open Access Journals (Sweden)

    Patel, N.V.

    2007-01-01

Full Text Available E-business modelling and e-business systems development assume fixed company resources, structures, and business processes. Empirical and theoretical evidence suggests that company resources and structures are emergent rather than fixed. Planning business activity in emergent contexts requires flexible e-business models based on better management theories and models. This paper builds and proposes a theoretical model of e-business systems capable of catering for emergent factors that affect business processes. Drawing on the development of theories of the 'action and design' class, the Theory of Deferred Action is invoked as the base theory for the theoretical model. A theoretical model of flexible process architecture is presented by identifying its core components and their relationships, and then illustrated with exemplar flexible process architectures capable of responding to emergent factors. Managerial implications of the model are considered and the model's generic applicability is discussed.

  18. Modelling Hospital Materials Management Processes

    Directory of Open Access Journals (Sweden)

    Raffaele Iannone

    2013-06-01

integrated and detailed analysis and description model for hospital materials management data and tasks, which is able to tackle information from patient requirements to usage, from replenishment requests to supplying and handling activities. The model takes account of medical risk reduction, traceability and streamlined processes perspectives. Second, the paper translates this information into a business process model and mathematical formalization. The study provides a useful guide to the various relevant technology-related, management and business issues, laying the foundations of an efficient reengineering of the supply chain to reduce healthcare costs and improve the quality of care.

  19. Regime-switching models to study psychological process

    NARCIS (Netherlands)

    Hamaker, E.L.; Grasman, R.P.P.P.; Kamphuis, J.H.

    2010-01-01

    Many psychological processes are characterized by recurrent shifts between different states. To model these processes at the level of the individual, regime-switching models may prove useful. In this chapter we discuss two of these models: the threshold autoregressive model and the Markov

  20. Modeling biochemical transformation processes and information processing with Narrator

    Directory of Open Access Journals (Sweden)

    Palfreyman Niall M

    2007-03-01

Full Text Available Abstract Background: Software tools that model and simulate the dynamics of biological processes and systems are becoming increasingly important. Some of these tools offer sophisticated graphical user interfaces (GUIs), which greatly enhance their acceptance by users. Such GUIs are based on symbolic or graphical notations used to describe, interact and communicate the developed models. Typically, these graphical notations are geared towards conventional biochemical pathway diagrams. They permit the user to represent the transport and transformation of chemical species and to define inhibitory and stimulatory dependencies. A critical weakness of existing tools is their lack of support for an integrative representation of transport, transformation as well as biological information processing. Results: Narrator is a software tool facilitating the development and simulation of biological systems as Co-dependence models. The Co-dependence Methodology complements the representation of species transport and transformation together with an explicit mechanism to express biological information processing. Thus, Co-dependence models explicitly capture, for instance, signal processing structures and the influence of exogenous factors or events affecting certain parts of a biological system or process. This combined set of features provides the system biologist with a powerful tool to describe and explore the dynamics of life phenomena. Narrator's GUI is based on an expressive graphical notation which forms an integral part of the Co-dependence Methodology. Behind the user-friendly GUI, Narrator hides a flexible feature which makes it relatively easy to map models defined via the graphical notation to mathematical formalisms and languages such as ordinary differential equations, the Systems Biology Markup Language or Gillespie's direct method. This powerful feature facilitates reuse, interoperability and conceptual model development. Conclusion: Narrator is a

  1. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  2. The (Mathematical) Modeling Process in Biosciences.

    Science.gov (United States)

    Torres, Nestor V; Santos, Guido

    2015-01-01

In this communication, we introduce a general framework and discussion on the role of models and the modeling process in the field of biosciences. The objective is to sum up the common procedures during the formalization and analysis of a biological problem from the perspective of Systems Biology, which approaches the study of biological systems as a whole. We begin by presenting the definitions of (biological) system and model. Particular attention is given to the meaning of mathematical model within the context of biology. Then, we present the process of modeling and analysis of biological systems. Three stages are described in detail: conceptualization of the biological system into a model, mathematical formalization of the previous conceptual model, and optimization and system management derived from the analysis of the mathematical model. Throughout this work the main features and shortcomings of the process are analyzed, and a set of rules that could help in the task of modeling any biological system are presented. Special regard is given to the formative requirements and the interdisciplinary nature of this approach. We conclude with some general considerations on the challenges that modeling is posing to current biology.

  3. Normalisation of cerebrospinal fluid biomarkers parallels improvement of neurological symptoms following HAART in HIV dementia – case report

    Directory of Open Access Journals (Sweden)

    Blennow Kaj

    2006-09-01

    Full Text Available Abstract Background Since the introduction of HAART the incidence of HIV dementia has declined and HAART seems to improve neurocognitive function in patients with HIV dementia. Currently, HIV dementia develops mainly in patients without effective treatment, though it has also been described in patients on HAART and milder HIV-associated neuropsychological impairment is still frequent among HIV-1 infected patients regardless of HAART. Elevated cerebrospinal fluid (CSF levels of markers of neural injury and immune activation have been found in HIV dementia, but neither of those, nor CSF HIV-1 RNA levels have been proven useful as diagnostic or prognostic pseudomarkers in HIV dementia. Case presentation We report a case of HIV dementia (MSK stage 3 in a 57 year old antiretroviral naïve man who was introduced on zidovudine, lamivudine and ritonavir boosted indinavir, and followed with consecutive lumbar punctures before and after two and 15 months after initiation of HAART. Improvement of neurocognitive function was paralleled by normalisation of CSF neural markers (NFL, Tau and GFAP levels and a decline in CSF and serum neopterin and CSF and plasma HIV-1 RNA levels. Conclusion The value of these CSF markers as prognostic pseudomarkers of the effect of HAART on neurocognitive impairment in HIV dementia ought to be evaluated in longitudinal studies.

  4. Model for Simulating a Spiral Software-Development Process

    Science.gov (United States)

    Mizell, Carolyn; Curley, Charles; Nayak, Umanath

    2010-01-01

    A discrete-event simulation model, and a computer program that implements the model, have been developed as means of analyzing a spiral software-development process. This model can be tailored to specific development environments for use by software project managers in making quantitative cases for deciding among different software-development processes, courses of action, and cost estimates. A spiral process can be contrasted with a waterfall process, which is a traditional process that consists of a sequence of activities that include analysis of requirements, design, coding, testing, and support. A spiral process is an iterative process that can be regarded as a repeating modified waterfall process. Each iteration includes assessment of risk, analysis of requirements, design, coding, testing, delivery, and evaluation. A key difference between a spiral and a waterfall process is that a spiral process can accommodate changes in requirements at each iteration, whereas in a waterfall process, requirements are considered to be fixed from the beginning and, therefore, a waterfall process is not flexible enough for some projects, especially those in which requirements are not known at the beginning or may change during development. For a given project, a spiral process may cost more and take more time than does a waterfall process, but may better satisfy a customer's expectations and needs. Models for simulating various waterfall processes have been developed previously, but until now, there have been no models for simulating spiral processes. The present spiral-process-simulating model and the software that implements it were developed by extending a discrete-event simulation process model of the IEEE 12207 Software Development Process, which was built using commercially available software known as the Process Analysis Tradeoff Tool (PATT). Typical inputs to PATT models include industry-average values of product size (expressed as number of lines of code
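
A toy discrete-event rendering of the spiral idea, where each iteration's risk assessment inflates effort and feeds rework into the next cycle, is sketched below in Python; it is a schematic stand-in for the PATT-based model, with invented parameters:

```python
import random

random.seed(4)

def simulate_spiral(iterations=4, base_effort=100.0, rework=0.25):
    """Crude discrete-event view of a spiral project: each iteration
    assesses risk, develops the current scope, then feeds changed
    requirements back as rework for the next cycle. Parameters are
    illustrative and do not come from the PATT model."""
    total = 0.0
    scope = base_effort
    for i in range(1, iterations + 1):
        risk = random.uniform(0.05, 0.2)      # risk-assessment outcome
        effort = scope * (1 + risk)           # development effort this cycle
        total += effort
        scope = base_effort * rework * random.uniform(0.5, 1.5)  # rework scope
        print(f"iteration {i}: effort {effort:6.1f}")
    return total

print("total effort:", round(simulate_spiral(), 1))
```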

  5. Software-Engineering Process Simulation (SEPS) model

    Science.gov (United States)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and perform post-mortem assessments.

  6. Metrics for Business Process Models

    Science.gov (United States)

    Mendling, Jan

    Up until now, there has been little research on why people introduce errors in real-world business process models. In a more general context, Simon [404] points to the limitations of cognitive capabilities and concludes that humans act rationally only to a certain extent. Concerning modeling errors, this argument would imply that human modelers lose track of the interrelations of large and complex models due to their limited cognitive capabilities and introduce errors that they would not insert in a small model. A recent study by Mendling et al. [275] explores to what extent certain complexity metrics of business process models have the potential to serve as error determinants. The authors conclude that complexity indeed appears to have an impact on error probability. Before we can test such a hypothesis in a more general setting, we have to establish an understanding of how we can define determinants that drive error probability and how we can measure them.
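
    For readers unfamiliar with such metrics, the sketch below computes two simple complexity measures often cited in this line of work, model size and the coefficient of network connectivity (CNC, arcs per node), on a toy process graph. The graph and the choice of metrics are illustrative assumptions, not the specific metric set of Mendling et al. [275].

```python
# Toy process model as an adjacency list: node -> successor nodes (invented).
process = {
    "start": ["check order"],
    "check order": ["xor split"],
    "xor split": ["approve", "reject"],
    "approve": ["xor join"],
    "reject": ["xor join"],
    "xor join": ["end"],
    "end": [],
}

nodes = len(process)
arcs = sum(len(succ) for succ in process.values())
cnc = arcs / nodes                      # coefficient of network connectivity
density = arcs / (nodes * (nodes - 1))  # arcs relative to the maximum possible

print(f"size (nodes) = {nodes}, arcs = {arcs}")
print(f"CNC = {cnc:.2f}, density = {density:.3f}")
```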

  7. Revising process models through inductive learning

    NARCIS (Netherlands)

    Maggi, F.M.; Corapi, D.; Russo, A.; Lupu, E.; Visaggio, G.; Muehlen, zur M.; Su, J.

    2011-01-01

    Discovering the Business Process (BP) model underpinning existing practices through analysis of event logs, allows users to understand, analyse and modify the process. But, to be useful, the BP model must be kept in line with practice throughout its lifetime, as changes occur to the business

  8. Diagnosing differences between business process models

    NARCIS (Netherlands)

    Dijkman, R.M.; Dumas, M.; Reichert, M.; Shan, M.-C.

    2008-01-01

    This paper presents a technique to diagnose differences between business process models in the EPC notation. The diagnosis returns the exact position of a difference in the business process models and diagnoses the type of a difference, using a typology of differences developed in previous work.

  9. AMFIBIA: A Meta-Model for the Integration of Business Process Modelling Aspects

    DEFF Research Database (Denmark)

    Axenath, Björn; Kindler, Ekkart; Rubin, Vladimir

    2007-01-01

    AMFIBIA is a meta-model that formalises the essential aspects and concepts of business processes. Though AMFIBIA is not the first approach to formalising the aspects and concepts of business processes, it is more ambitious in the following respects: Firstly, it is independent from particular...... modelling formalisms of business processes and it is designed in such a way that any formalism for modelling some aspect of a business process can be plugged into AMFIBIA. Therefore, AMFIBIA is formalism-independent. Secondly, it is not biased toward any aspect of business processes; the different aspects...... can be considered and modelled independently of each other. Moreover, AMFIBIA is not restricted to a fixed set of aspects; new aspects of business processes can be easily integrated. Thirdly, AMFIBIA does not only name and relate the concepts of business process modelling, as it is typically done...

  10. Model based process-product design and analysis

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    This paper gives a perspective on modelling and the important role it has within product-process design and analysis. Different modelling issues related to development and application of systematic model-based solution approaches for product-process design is discussed and the need for a hybrid...... model-based framework is highlighted. This framework should be able to manage knowledge-data, models, and associated methods and tools integrated with design work-flows and data-flows for specific product-process design problems. In particular, the framework needs to manage models of different types......, forms and complexity, together with their associated parameters. An example of a model-based system for design of chemicals based formulated products is also given....

  11. Understanding Quality in Process Modelling: Towards a Holistic Perspective

    Directory of Open Access Journals (Sweden)

    Jan Recker

    2007-09-01

    Full Text Available Quality is one of the main topics in current conceptual modelling research, as is the field of business process modelling. Yet, widely acknowledged academic contributions towards an understanding or measurement of business process model quality are limited at best. In this paper I argue that the development of methodical theories concerning the measurement or establishment of process model quality must be preceded by methodological elaborations on business process modelling. I further argue that existing epistemological foundations of process modelling are insufficient for describing all extrinsic and intrinsic traits of model quality. This in turn has led to a lack of holistic understanding of process modelling. Taking into account the inherent social and purpose-oriented character of process modelling in contemporary organizations I present a socio-pragmatic constructionist methodology of business process modelling and sketch out implications of this perspective towards an understanding of process model quality. I anticipate that, based on this research, theories can be developed that facilitate the evaluation of the ’goodness’ of a business process model.

  12. Animated-simulation modeling facilitates clinical-process costing.

    Science.gov (United States)

    Zelman, W N; Glick, N D; Blackmore, C C

    2001-09-01

    Traditionally, the finance department has assumed responsibility for assessing process costs in healthcare organizations. To enhance process-improvement efforts, however, many healthcare providers need to include clinical staff in process cost analysis. Although clinical staff often use electronic spreadsheets to model the cost of specific processes, PC-based animated-simulation tools offer two major advantages over spreadsheets: they allow clinicians to interact more easily with the costing model so that it more closely represents the process being modeled, and they represent cost output as a cost range rather than as a single cost estimate, thereby providing more useful information for decision making.
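
    The cost-range point can be illustrated without any specific simulation product. The sketch below is a minimal, hypothetical Monte Carlo costing of a clinical visit: each activity cost is drawn from a triangular distribution and the output is reported as percentiles rather than a single figure. All activities and dollar values are invented.

```python
import random

# Hypothetical clinical process: per-activity cost ranges (low, mode, high) in $.
activities = {
    "registration": (5, 8, 12),
    "nurse triage": (15, 20, 30),
    "physician exam": (40, 60, 110),
    "lab work": (20, 35, 80),
}

def one_visit_cost():
    return sum(random.triangular(lo, hi, mode)
               for lo, mode, hi in activities.values())

random.seed(0)
costs = sorted(one_visit_cost() for _ in range(10_000))
pct = lambda q: costs[int(q * len(costs))]
print(f"median ${pct(0.5):.0f}, 90% range ${pct(0.05):.0f}-${pct(0.95):.0f}")
```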

  13. How visual cognition influences process model comprehension

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2017-01-01

    Process analysts and other professionals extensively use process models to analyze business processes and identify performance improvement opportunities. Therefore, it is important that such models can be easily and properly understood. Previous research has mainly focused on two types of factors

  14. A general model for membrane-based separation processes

    DEFF Research Database (Denmark)

    Soni, Vipasha; Abildskov, Jens; Jonsson, Gunnar Eigil

    2009-01-01

    behaviour will play an important role. In this paper, modelling of membrane-based processes for separation of gas and liquid mixtures are considered. Two general models, one for membrane-based liquid separation processes (with phase change) and another for membrane-based gas separation are presented....... The separation processes covered are: membrane-based gas separation processes, pervaporation and various types of membrane distillation processes. The specific model for each type of membrane-based process is generated from the two general models by applying the specific system descriptions and the corresponding...

  15. APROMORE : an advanced process model repository

    NARCIS (Netherlands)

    La Rosa, M.; Reijers, H.A.; Aalst, van der W.M.P.; Dijkman, R.M.; Mendling, J.; Dumas, M.; García-Bañuelos, L.

    2011-01-01

    Business process models are becoming available in large numbers due to their widespread use in many industrial applications such as enterprise and quality engineering projects. On the one hand, this raises a challenge as to their proper management: how can it be ensured that the proper process model

  16. A model evaluation checklist for process-based environmental models

    Science.gov (United States)

    Jackson-Blake, Leah

    2015-04-01

    Mechanistic catchment-scale phosphorus models appear to perform poorly where diffuse sources dominate. The reasons for this were investigated for one commonly-applied model, the INtegrated model of CAtchment Phosphorus (INCA-P). Model output was compared to 18 months of daily water quality monitoring data in a small agricultural catchment in Scotland, and model structure, key model processes and internal model responses were examined. Although the model broadly reproduced dissolved phosphorus dynamics, it struggled with particulates. The reasons for poor performance were explored, together with ways in which improvements could be made. The process of critiquing and assessing model performance was then generalised to provide a broadly-applicable model evaluation checklist, incorporating: (1) Calibration challenges, relating to difficulties in thoroughly searching a high-dimensional parameter space and in selecting appropriate means of evaluating model performance. In this study, for example, model simplification was identified as a necessary improvement to reduce the number of parameters requiring calibration, whilst the traditionally-used Nash Sutcliffe model performance statistic was not able to discriminate between realistic and unrealistic model simulations, and alternative statistics were needed. (2) Data limitations, relating to a lack of (or uncertainty in) input data, data to constrain model parameters, data for model calibration and testing, and data to test internal model processes. In this study, model reliability could be improved by addressing all four kinds of data limitation. For example, there was insufficient surface water monitoring data for model testing against an independent dataset to that used in calibration, whilst additional monitoring of groundwater and effluent phosphorus inputs would help distinguish between alternative plausible model parameterisations. (3) Model structural inadequacies, whereby model structure may inadequately represent
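
    On point (1), the Nash-Sutcliffe efficiency mentioned above compares model error with the error of always predicting the observed mean. A minimal sketch of that statistic, together with percent bias as one possible complementary measure, is given below; the data values are invented.

```python
def nash_sutcliffe(obs, sim):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2).
    1 is a perfect fit; 0 means no better than predicting the observed mean."""
    mean_obs = sum(obs) / len(obs)
    sse = sum((o - s) ** 2 for o, s in zip(obs, sim))
    var = sum((o - mean_obs) ** 2 for o in obs)
    return 1.0 - sse / var

def percent_bias(obs, sim):
    # Systematic over- or under-prediction, as a percentage of the observed total.
    return 100.0 * sum(s - o for o, s in zip(obs, sim)) / sum(obs)

obs = [0.12, 0.30, 0.25, 0.80, 0.55, 0.20]   # e.g. daily P concentrations (invented)
sim = [0.15, 0.26, 0.30, 0.60, 0.50, 0.22]
print(f"NSE = {nash_sutcliffe(obs, sim):.2f}, PBIAS = {percent_bias(obs, sim):+.1f}%")
```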

  17. The evolution of process-based hydrologic models

    NARCIS (Netherlands)

    Clark, Martyn P.; Bierkens, Marc F.P.; Samaniego, Luis; Woods, Ross A.; Uijlenhoet, Remko; Bennett, Katrina E.; Pauwels, Valentijn R.N.; Cai, Xitian; Wood, Andrew W.; Peters-Lidard, Christa D.

    2017-01-01

    The diversity in hydrologic models has historically led to great controversy on the "correct" approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this

  18. Software Process Validation: Quantitatively Measuring the Correspondence of a Process to a Model

    National Research Council Canada - National Science Library

    Cook, Jonathan E; Wolf, Alexander L

    1997-01-01

    .... When process models and process executions diverge, something significant is happening. The authors have developed techniques for uncovering and measuring the discrepancies between models and executions, which they call process validation...

  19. Distillation modeling for a uranium refining process

    Energy Technology Data Exchange (ETDEWEB)

    Westphal, B.R.

    1996-03-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process.

  20. On estimation of the noise variance in high-dimensional linear models

    OpenAIRE

    Golubev, Yuri; Krymova, Ekaterina

    2017-01-01

    We consider the problem of recovering the unknown noise variance in the linear regression model. To estimate the nuisance (a vector of regression coefficients) we use a family of spectral regularisers of the maximum likelihood estimator. The noise estimation is based on the adaptive normalisation of the squared error. We derive the upper bound for the concentration of the proposed method around the ideal estimator (the case of zero nuisance).
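
    As a rough illustration of the general idea (not the authors' exact estimator), the sketch below uses ridge regression, one common spectral regulariser, and normalises the squared residual by the effective degrees of freedom of the ridge hat matrix. The choice of regulariser, the degrees-of-freedom correction and all data are assumptions made for this example.

```python
import numpy as np

def ridge_noise_variance(X, y, lam=1.0):
    """Estimate noise variance from ridge residuals, normalising the squared
    error by n minus the effective degrees of freedom tr(H), where
    H = X (X'X + lam*I)^{-1} X' is the ridge hat matrix."""
    n, p = X.shape
    G = X.T @ X + lam * np.eye(p)
    beta = np.linalg.solve(G, X.T @ y)
    df = np.trace(X @ np.linalg.solve(G, X.T))   # effective degrees of freedom
    resid = y - X @ beta
    return resid @ resid / (n - df)

rng = np.random.default_rng(0)
n, p, sigma = 200, 50, 0.5
X = rng.standard_normal((n, p))
beta = rng.standard_normal(p) / np.sqrt(p)       # the nuisance vector
y = X @ beta + sigma * rng.standard_normal(n)
print(f"estimated sigma^2 = {ridge_noise_variance(X, y):.3f} (true {sigma**2:.3f})")
```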

  1. Model Process Control Language

    Data.gov (United States)

    National Aeronautics and Space Administration — The MPC (Model Process Control) language enables the capture, communication and preservation of a simulation instance, with sufficient detail that it can be...

  2. Correctness-preserving configuration of business process models

    NARCIS (Netherlands)

    Aalst, van der W.M.P.; Dumas, M.; Gottschalk, F.; Hofstede, ter A.H.M.; La Rosa, M.; Mendling, J.; Fiadeiro, J.; Inverardi, P.

    2008-01-01

    Reference process models capture recurrent business operations in a given domain such as procurement or logistics. These models are intended to be configured to fit the requirements of specific organizations or projects, leading to individualized process models that are subsequently used for domain

  3. Process models and model-data fusion in dendroecology

    Directory of Open Access Journals (Sweden)

    Joel eGuiot

    2014-08-01

    Full Text Available Dendrochronology (i.e. the study of annually dated tree-ring time series) has proved to be a powerful technique to understand tree growth. This paper aims to show the value of using ecophysiological modeling not only to understand and predict tree growth (dendroecology) but also to reconstruct past climates (dendroclimatology). Process models have been used for several decades in dendroclimatology, but it is only recently that methods of model-data fusion have led to significant progress in modeling tree growth as a function of climate and in reconstructing past climates. These model-data fusion (MDF) methods, mainly based on the Bayesian paradigm, have been shown to be powerful for both model calibration and model inversion. After a rapid survey of tree-growth modeling, we illustrate MDF with examples based on series of Southern France Aleppo pines and Central France oaks. These examples show that if plants experienced CO2 fertilization, this would have a significant effect on tree growth, which in turn would bias the climate reconstructions. This bias could be extended to other environmental non-climatic factors directly or indirectly affecting annual ring formation and not taken into account in classical empirical models, which supports the use of more complex process-based models. Finally, we conclude by showing the value of the data assimilation methods applied in climatology to produce climate re-analyses.
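
    As a toy illustration of model-data fusion in the Bayesian sense described above (not the ecophysiological models actually used in the paper), the sketch below calibrates a single climate-sensitivity parameter of an invented linear ring-width model against simulated observations via a grid posterior. All numbers are assumptions.

```python
import numpy as np

# Toy model-data fusion: ring width w_t = a * temp_t + noise. We place a prior
# on the sensitivity a and fuse it with observed widths via Bayes' rule.
rng = np.random.default_rng(2)
temp = rng.normal(12.0, 2.0, size=40)            # growing-season temperature
a_true, sigma = 0.08, 0.05
widths = a_true * temp + rng.normal(0.0, sigma, size=40)

a_grid = np.linspace(0.0, 0.2, 401)
log_prior = -0.5 * ((a_grid - 0.1) / 0.05) ** 2  # Gaussian prior a ~ N(0.1, 0.05^2)
# Gaussian log-likelihood of all observations for each candidate a.
resid = widths[None, :] - a_grid[:, None] * temp[None, :]
log_like = -0.5 * np.sum((resid / sigma) ** 2, axis=1)
post = np.exp(log_prior + log_like - (log_prior + log_like).max())
post /= post.sum()

a_hat = np.sum(a_grid * post)                    # posterior mean of sensitivity
print(f"posterior mean a = {a_hat:.3f} (true {a_true})")
```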

  4. Catastrophe Insurance Modeled by Shot-Noise Processes

    Directory of Open Access Journals (Sweden)

    Thorsten Schmidt

    2014-02-01

    Full Text Available Shot-noise processes generalize compound Poisson processes in the following way: a jump (the shot) is followed by a decline (noise). This constitutes a useful model for insurance claims in many circumstances; claims due to natural disasters or self-exciting processes exhibit similar features. We give a general account of shot-noise processes with time-inhomogeneous drivers inspired by recent results in credit risk. Moreover, we derive a number of useful results for modeling and pricing with shot-noise processes. Besides this, we obtain some highly tractable examples that constitute a useful modeling tool for dynamic claims processes. The results can in particular be used for pricing Catastrophe Bonds (CAT bonds), a traded risk-linked security. Additionally, current results regarding the estimation of shot-noise processes are reviewed.
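
    A minimal simulation of the basic shot-noise mechanism may help: shots arrive as a Poisson process, each adds a random jump, and each jump decays exponentially afterwards. The constant arrival rate, jump law and decay rate below are assumptions made for illustration; the time-inhomogeneous drivers treated in the paper would replace the constant rate.

```python
import numpy as np

def shot_noise_path(T=10.0, rate=2.0, decay=1.5, mean_jump=1.0, seed=3):
    """Simulate S(t) = sum_{t_i <= t} Y_i * exp(-decay * (t - t_i)) on a grid,
    with Poisson(rate) arrival times t_i and Exp(mean_jump) jump sizes Y_i."""
    rng = np.random.default_rng(seed)
    n = rng.poisson(rate * T)                      # number of shots in [0, T]
    times = np.sort(rng.uniform(0.0, T, size=n))   # arrival times
    jumps = rng.exponential(mean_jump, size=n)     # shot sizes
    grid = np.linspace(0.0, T, 501)
    path = np.zeros_like(grid)
    for t_i, y_i in zip(times, jumps):
        mask = grid >= t_i                         # each shot decays after arrival
        path[mask] += y_i * np.exp(-decay * (grid[mask] - t_i))
    return grid, path

grid, path = shot_noise_path()
print(f"max level {path.max():.2f}, value at T {path[-1]:.2f}")
```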

  5. Modeling Aspects Of Activated Sludge Processes Part I: Process Modeling Of Activated Sludge Flocculation And Sedimentation

    International Nuclear Information System (INIS)

    Ibrahim, H. I.; EI-Ahwany, A.H.; Ibrahim, G.

    2004-01-01

    Process modeling of activated sludge flocculation and sedimentation is reviewed, considering activated sludge floc characteristics such as morphology, viable and non-viable cell ratio, density and water content. Bioflocculation and its kinetics were studied, considering the characteristics of bioflocculation and explaining the theory of Divalent Cation Bridging, which describes the major role of cations in bioflocculation. Activated sludge flocculation process modeling was studied considering mass transfer limitations, from Clifft and Andrews, 1981, and Benefield and Molz, 1983, through Henze, 1987, to Tyagi, 1996 and G. Ibrahim et al., 2002. Models of aggregation and breakage of flocs were studied by Spicer and Pratsinis, 1996, and Biggs, 2002. Size distribution of flocs influences mass transfer and biomass separation in the activated sludge process. Therefore, it is of primary importance to establish the role of specific process operation factors, such as sludge loading, dynamic sludge age and dissolved oxygen, on this distribution, with special emphasis on the formation of primary particles

  6. A Petri Net-Based Software Process Model for Developing Process-Oriented Information Systems

    Science.gov (United States)

    Li, Yu; Oberweis, Andreas

    Aiming at increasing flexibility, efficiency, effectiveness, and transparency of information processing and resource deployment in organizations to ensure customer satisfaction and high quality of products and services, process-oriented information systems (POIS) represent a promising realization form of computerized business information systems. Due to the complexity of POIS, explicit and specialized software process models are required to guide POIS development. In this chapter we characterize POIS with an architecture framework and present a Petri net-based software process model tailored for POIS development with consideration of organizational roles. As integrated parts of the software process model, we also introduce XML nets, a variant of high-level Petri nets as basic methodology for business processes modeling, and an XML net-based software toolset providing comprehensive functionalities for POIS development.

  7. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  8. Investigating nurse practitioners in the private sector: a theoretically informed research protocol.

    Science.gov (United States)

    Adams, Margaret; Gardner, Glenn; Yates, Patsy

    2017-06-01

    To report a study protocol and the theoretical framework normalisation process theory that informs this protocol for a case study investigation of private sector nurse practitioners. Most research evaluating nurse practitioner service is focused on public, mainly acute care environments where nurse practitioner service is well established with strong structures for governance and sustainability. Conversely, there is lack of clarity in governance for emerging models in the private sector. In a climate of healthcare reform, nurse practitioner service is extending beyond the familiar public health sector. Further research is required to inform knowledge of the practice, operational framework and governance of new nurse practitioner models. The proposed research will use a multiple exploratory case study design to examine private sector nurse practitioner service. Data collection includes interviews, surveys and audits. A sequential mixed method approach to analysis of each case will be conducted. Findings from within-case analysis will lead to a meta-synthesis across all four cases to gain a holistic understanding of the cases under study, private sector nurse practitioner service. Normalisation process theory will be used to guide the research process, specifically coding and analysis of data using theory constructs and the relevant components associated with those constructs. This article provides a blueprint for the research and describes a theoretical framework, normalisation process theory in terms of its flexibility as an analytical framework. Consistent with the goals of best research practice, this study protocol will inform the research community in the field of primary health care about emerging research in this field. Publishing a study protocol ensures researcher fidelity to the analysis plan and supports research collaboration across teams. © 2016 John Wiley & Sons Ltd.

  9. Properties of spatial Cox process models

    DEFF Research Database (Denmark)

    Møller, Jesper

    Probabilistic properties of Cox processes of relevance for statistical modelling and inference are studied. Particularly, we study the most important classes of Cox processes, including log Gaussian Cox processes, shot noise Cox processes, and permanent Cox processes. We consider moment properties...... and point process operations such as thinning, displacements, and superpositioning. We also discuss how to simulate specific Cox processes....

  10. Model for amorphous aggregation processes

    Science.gov (United States)

    Stranks, Samuel D.; Ecroyd, Heath; van Sluyter, Steven; Waters, Elizabeth J.; Carver, John A.; von Smekal, Lorenz

    2009-11-01

    The amorphous aggregation of proteins is associated with many phenomena, ranging from the formation of protein wine haze to the development of cataract in the eye lens and the precipitation of recombinant proteins during their expression and purification. While much literature exists describing models for linear protein aggregation, such as amyloid fibril formation, there are few reports of models which address amorphous aggregation. Here, we propose a model to describe the amorphous aggregation of proteins which is also more widely applicable to other situations where a similar process occurs, such as in the formation of colloids and nanoclusters. As first applications of the model, we have tested it against experimental turbidimetry data of three proteins relevant to the wine industry and biochemistry, namely, thaumatin, a thaumatin-like protein, and α-lactalbumin. The model is very robust and describes amorphous experimental data to a high degree of accuracy. Details about the aggregation process, such as shape parameters of the aggregates and rate constants, can also be extracted.

  11. Online Rule Generation Software Process Model

    OpenAIRE

    Sudeep Marwaha; Alka Aroa; Satma M C; Rajni Jain; R C Goyal

    2013-01-01

    For production systems like expert systems, rule generation software can facilitate faster deployment. The software process model for rule generation using a decision tree classifier refers to the various steps required to be executed for the development of a web-based software model for decision rule generation. Royce's final waterfall model has been used in this paper to explain the software development process. The paper presents the specific output of various steps of modified wat...

  12. Fermentation process diagnosis using a mathematical model

    Energy Technology Data Exchange (ETDEWEB)

    Yerushalmi, L; Volesky, B; Votruba, J

    1988-09-01

    Intriguing physiology of a solvent-producing strain of Clostridium acetobutylicum led to the synthesis of a mathematical model of the acetone-butanol fermentation process. The model presented is capable of describing the process dynamics and the culture behavior during a standard and a substandard acetone-butanol fermentation. In addition to the process kinetic parameters, the model includes the culture physiological parameters, such as the cellular membrane permeability and the number of membrane sites for active transport of sugar. Computer process simulation studies for different culture conditions used the model, and quantitatively pointed out the importance of selected culture parameters that characterize the cell membrane behaviour and play an important role in the control of solvent synthesis by the cell. The theoretical predictions by the new model were confirmed by experimental determination of the cellular membrane permeability.

  13. A Generic Modeling Process to Support Functional Fault Model Development

    Science.gov (United States)

    Maul, William A.; Hemminger, Joseph A.; Oostdyk, Rebecca; Bis, Rachael A.

    2016-01-01

    Functional fault models (FFMs) are qualitative representations of a system's failure space that are used to provide a diagnostic of the modeled system. An FFM simulates the failure effect propagation paths within a system between failure modes and observation points. These models contain a significant amount of information about the system including the design, operation and off nominal behavior. The development and verification of the models can be costly in both time and resources. In addition, models depicting similar components can be distinct, both in appearance and function, when created individually, because there are numerous ways of representing the failure space within each component. Generic application of FFMs has the advantages of software code reuse: reduction of time and resources in both development and verification, and a standard set of component models from which future system models can be generated with common appearance and diagnostic performance. This paper outlines the motivation to develop a generic modeling process for FFMs at the component level and the effort to implement that process through modeling conventions and a software tool. The implementation of this generic modeling process within a fault isolation demonstration for NASA's Advanced Ground System Maintenance (AGSM) Integrated Health Management (IHM) project is presented and the impact discussed.

  14. Deterministic geologic processes and stochastic modeling

    International Nuclear Information System (INIS)

    Rautman, C.A.; Flint, A.L.

    1992-01-01

    This paper reports that recent outcrop sampling at Yucca Mountain, Nevada, has produced significant new information regarding the distribution of physical properties at the site of a potential high-level nuclear waste repository. Consideration of the spatial variability indicates that there are a number of widespread deterministic geologic features at the site that have important implications for numerical modeling of such performance aspects as ground water flow and radionuclide transport. Because the geologic processes responsible for formation of Yucca Mountain are relatively well understood and operate on a more-or-less regional scale, understanding of these processes can be used in modeling the physical properties and performance of the site. Information reflecting these deterministic geologic processes may be incorporated into the modeling program explicitly using geostatistical concepts such as soft information, or implicitly, through the adoption of a particular approach to modeling

  15. Path modeling and process control

    DEFF Research Database (Denmark)

    Høskuldsson, Agnar; Rodionova, O.; Pomerantsev, A.

    2007-01-01

    and having three or more stages. The methods are applied to a process control of a multi-stage production process having 25 variables and one output variable. When moving along the process, variables change their roles. It is shown how the methods of path modeling can be applied to estimate variables...... be performed regarding the foreseeable output property y, and with respect to an admissible range of correcting actions for the parameters of the next stage. In this paper the basic principles of path modeling are presented. The mathematics is presented for processes having only one stage, having two stages...... of the next stage with the purpose of obtaining optimal or almost optimal quality of the output variable. An important aspect of the methods presented is the possibility of extensive graphic analysis of data that can provide the engineer with a detailed view of the multi-variate variation in data....

  16. User's manual for a process model code

    International Nuclear Information System (INIS)

    Kern, E.A.; Martinez, D.P.

    1981-03-01

    The MODEL code has been developed for computer modeling of materials processing facilities associated with the nuclear fuel cycle. However, it can also be used in other modeling applications. This report provides sufficient information for a potential user to apply the code to specific process modeling problems. Several examples that demonstrate most of the capabilities of the code are provided

  17. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    In reduced form default models, the instantaneous default intensity is classically the modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature has shown a tendency towards...

  18. [The dual process model of addiction. Towards an integrated model?].

    Science.gov (United States)

    Vandermeeren, R; Hebbrecht, M

    2012-01-01

    Neurobiology and cognitive psychology have provided us with a dual process model of addiction. According to this model, behavior is considered to be the dynamic result of a combination of automatic and controlling processes. In cases of addiction the balance between these two processes is severely disturbed. Automated processes will continue to produce impulses that ensure the continuance of addictive behavior. Weak, reflective or controlling processes are both the reason for and the result of the inability to forgo addiction. The aim of this paper is to identify features that are common to current neurocognitive insights into addiction and psychodynamic views on addiction. The picture that emerges from research is not clear. There is some evidence that attentional bias has a causal effect on addiction. There is no evidence that automatic associations have a causal effect, but there is some evidence that automatic action-tendencies do have a causal effect. Current neurocognitive views on the dual process model of addiction can be integrated with an evidence-based approach to addiction and with psychodynamic views on addiction.

  19. Modelling of the UO2 dissolution mechanisms in synthetic groundwater solutions. Dissolution experiments carried out under oxic conditions

    International Nuclear Information System (INIS)

    Cera, E.; Grive, M.; Bruno, J.; Ollila, K.

    2001-02-01

    The analytical data generated during the last three years within the 4th framework program of the European Community at VTT Chemical Technology concerning UO2 dissolution under oxidising conditions have been modelled in the present work. The modelling work has been addressed to perform a kinetic study of the dissolution data generated by Ollila (1999) under oxidising conditions by using unirradiated uranium dioxide as solid sample. The average of the normalised UO2 dissolution rates determined by using the initial dissolution data generated in all the experimental tests is (6.06 ± 3.64) × 10⁻⁷ mol m⁻² d⁻¹. This dissolution rate agrees with most of the dissolution rates reported in the literature under similar experimental conditions. The results obtained in this modelling exercise show that the same bicarbonate-promoted oxidative dissolution processes operate for uranium dioxide, as a chemical analogue of the spent fuel matrix, independently of the composition of the aqueous solution used. (orig.)

  20. Distillation modeling for a uranium refining process

    International Nuclear Information System (INIS)

    Westphal, B.R.

    1996-01-01

    As part of the spent fuel treatment program at Argonne National Laboratory, a vacuum distillation process is being employed for the recovery of uranium following an electrorefining process. Distillation of a salt electrolyte, containing a eutectic mixture of lithium and potassium chlorides, from uranium is achieved by a simple batch operation and is termed "cathode processing". The incremental distillation of electrolyte salt will be modeled by an equilibrium expression and on a molecular basis since the operation is conducted under moderate vacuum conditions. As processing continues, the two models will be compared and analyzed for correlation with actual operating results. Possible factors that may contribute to aberrations from the models include impurities at the vapor-liquid boundary, distillate reflux, anomalous pressure gradients, and mass transport phenomena at the evaporating surface. Ultimately, the purpose of either process model is to enable the parametric optimization of the process

  1. Specification of e-business process model for PayPal online payment process using Reo

    NARCIS (Netherlands)

    M. Xie

    2005-01-01

    E-business process modeling allows business analysts to better understand and analyze the business processes, and eventually to use software systems to automate (parts of) these business processes to achieve higher profit. To support e-business process modeling, many business process

  2. Global Earthquake and Volcanic Eruption Economic losses and costs from 1900-2014: 115 years of the CATDAT database - Trends, Normalisation and Visualisation

    Science.gov (United States)

    Daniell, James; Skapski, Jens-Udo; Vervaeck, Armand; Wenzel, Friedemann; Schaefer, Andreas

    2015-04-01

    Over the past 12 years, an in-depth database has been constructed for socio-economic losses from earthquakes and volcanoes. The effects of earthquakes and volcanic eruptions have been documented in many databases, however, many errors and incorrect details are often encountered. To combat this, the database was formed with socioeconomic checks of GDP, capital stock, population and other elements, as well as providing upper and lower bounds to each available event loss. The definition of economic losses within the CATDAT Damaging Earthquakes Database (Daniell et al., 2011a) as of v6.1 has now been redefined to provide three options of natural disaster loss pricing, including reconstruction cost, replacement cost and actual loss, in order to better define the impact of historical disasters. Similarly for volcanoes as for earthquakes, a reassessment has been undertaken looking at the historical net and gross capital stock and GDP at the time of the event, including the depreciated stock, in order to calculate the actual loss. A normalisation has then been undertaken using updated population, GDP and capital stock. The difference between depreciated and gross capital can be removed from the historical loss estimates, which have all been calculated without taking depreciation of the building stock into account. The culmination of time series from 1900-2014 of net and gross capital stock, GDP, direct economic loss data, use of detailed studies of infrastructure age, and existing damage surveys, has allowed the first estimate of this nature. The death tolls in earthquakes from 1900-2014 are presented in various forms, showing around 2.32 million deaths due to earthquakes (with a range of 2.18 to 2.63 million) and around 59% due to masonry buildings and 28% from secondary effects. For the death tolls from the volcanic eruption database, 98,000 deaths, with a range from around 83,000 to 107,000, are seen from 1900-2014. The application of VSL life costing from death and injury
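
    The normalisation arithmetic described can be illustrated with a minimal sketch using invented numbers: scale the event-year loss by the growth in exposure (here capital stock), and use a gross-to-net capital ratio to move between the actual-loss and replacement-cost views the authors distinguish. All figures below are assumptions, not CATDAT values.

```python
# Hypothetical normalisation of a historical earthquake loss to 2014 terms.
loss_event_year = 1.2e9        # reported direct loss (event-year US$), invented
capital_event_year = 4.0e11    # net capital stock in the event year, invented
capital_2014 = 2.4e12          # net capital stock in 2014, invented
gross_to_net = 1.35            # gross/net capital ratio, invented

# Scale by exposure growth: what would the same damage cost today?
normalised_net = loss_event_year * (capital_2014 / capital_event_year)
# The replacement-cost view uses gross (undepreciated) capital instead.
normalised_gross = normalised_net * gross_to_net

print(f"actual-loss basis:      ${normalised_net / 1e9:.1f} bn")
print(f"replacement-cost basis: ${normalised_gross / 1e9:.1f} bn")
```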

  3. Sato Processes in Default Modeling

    DEFF Research Database (Denmark)

    Kokholm, Thomas; Nicolato, Elisa

    2010-01-01

    In reduced form default models, the instantaneous default intensity is the classical modeling object. Survival probabilities are then given by the Laplace transform of the cumulative hazard defined as the integrated intensity process. Instead, recent literature tends to specify the cumulative haz...

  4. Dimensional modeling: beyond data processing constraints.

    Science.gov (United States)

    Bunardzic, A

    1995-01-01

    The focus of information processing requirements is shifting from the on-line transaction processing (OLTP) issues to the on-line analytical processing (OLAP) issues. While the former serves to ensure the feasibility of the real-time on-line transaction processing (which has already exceeded a level of up to 1,000 transactions per second under normal conditions), the latter aims at enabling more sophisticated analytical manipulation of data. The OLTP requirements, or how to efficiently get data into the system, have been solved by applying the Relational theory in the form of Entity-Relation model. There is presently no theory related to OLAP that would resolve the analytical processing requirements as efficiently as Relational theory provided for the transaction processing. The "relational dogma" also provides the mathematical foundation for the Centralized Data Processing paradigm in which mission-critical information is incorporated as 'one and only one instance' of data, thus ensuring data integrity. In such surroundings, the information that supports business analysis and decision support activities is obtained by running predefined reports and queries that are provided by the IS department. In today's intensified competitive climate, businesses are finding that this traditional approach is not good enough. The only way to stay on top of things, and to survive and prosper, is to decentralize the IS services. The newly emerging Distributed Data Processing, with its increased emphasis on empowering the end user, does not seem to find enough merit in the relational database model to justify relying upon it. Relational theory proved too rigid and complex to accommodate the analytical processing needs. In order to satisfy the OLAP requirements, or how to efficiently get the data out of the system, different models, metaphors, and theories have been devised. All of them are pointing to the need for simplifying the highly non-intuitive mathematical constraints found

  5. Model-based processing for underwater acoustic arrays

    CERN Document Server

    Sullivan, Edmund J

    2015-01-01

    This monograph presents a unified approach to model-based processing for underwater acoustic arrays. The use of physical models in passive array processing is not a new idea, but it has been used on a case-by-case basis, and as such, lacks any unifying structure. This work views all such processing methods as estimation procedures, which then can be unified by treating them all as a form of joint estimation based on a Kalman-type recursive processor, which can be recursive either in space or time, depending on the application. This is done for three reasons. First, the Kalman filter provides a natural framework for the inclusion of physical models in a processing scheme. Second, it allows poorly known model parameters to be jointly estimated along with the quantities of interest. This is important, since in certain areas of array processing already in use, such as those based on matched-field processing, the so-called mismatch problem either degrades performance or, indeed, prevents any solution at all. Third...
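
    The common building block of such Kalman-type recursive processors is the linear predict/update cycle. The sketch below implements that generic step and applies it to an invented scalar tracking problem; it is background illustration, not the monograph's array-processing formulation.

```python
import numpy as np

def kalman_step(x, P, z, F, H, Q, R):
    """One predict/update cycle of a linear Kalman filter.
    x, P: prior state estimate and covariance; z: new measurement."""
    # Predict through the state model.
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    # Update with the measurement (innovation form).
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy example: track a slowly drifting scalar source parameter from noisy data.
F = np.array([[1.0]]); H = np.array([[1.0]])
Q = np.array([[1e-4]]); R = np.array([[0.1]])
x, P = np.array([0.0]), np.array([[1.0]])
rng = np.random.default_rng(7)
for z in 1.0 + 0.3 * rng.standard_normal(50):
    x, P = kalman_step(x, P, np.array([z]), F, H, Q, R)
print(f"estimate {x[0]:.3f} (true 1.0)")
```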

  6. Neuroscientific Model of Motivational Process

    Science.gov (United States)

    Kim, Sung-il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment. PMID:23459598

  7. Neuroscientific model of motivational process.

    Science.gov (United States)

    Kim, Sung-Il

    2013-01-01

    Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three sub processes, a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous sub processes, namely reward-driven approach, value-based decision-making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area) in which basic stimulus-action association is formed, and is classified as an automatic motivation to which relatively less attention is assigned. By contrast, value-based decision-making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating the value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, the goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring the performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three sub processes interact with each other by sending reward prediction error signals through dopaminergic pathway from the striatum and to the prefrontal cortex. The neuroscientific model of motivational process suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  8. Aerospace Materials Process Modelling

    Science.gov (United States)

    1988-08-01

    Cooling Transformation diagram (CCT diagram). When an IT diagram is used in the heat process modelling, we suppose that a sudden cooling (instantaneous...processes. CE chooses instead to study thermo-mechanical properties referring to a CCT diagram. This is thought to be more reliable to give a true... This determination is however based on the following approximations: i) A CCT diagram is valid only for the

  9. Advanced oxidation processes: overall models

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez, M. [Univ. de los Andes, Escuela Basica de Ingenieria, La Hechicera, Merida (Venezuela); Curco, D.; Addardak, A.; Gimenez, J.; Esplugas, S. [Dept. de Ingenieria Quimica. Univ. de Barcelona, Barcelona (Spain)

    2003-07-01

    Modelling AOPs implies considering all the steps included in the process, that is, mass transfer, kinetic (reaction) and luminic (radiation) steps. In this way, recent works develop models which relate the global reaction rate to catalyst concentration and radiation absorption. However, the application of such models requires knowing what the controlling step for the overall process is. In this paper, a simple method is explained which allows one to determine the controlling step. Thus, it is assumed that the reactor is divided in two hypothetical zones (dark and illuminated), and according to the experimental results, obtained by varying only the reaction volume, it can be decided if reaction occurs only in the illuminated zone or in the whole reactor, including the dark zone. The photocatalytic degradation of phenol, by using titania Degussa P-25 as catalyst, is studied as reaction model. The preliminary results obtained are presented here, showing that it seems that, in this case, reaction only occurs in the illuminated zone of the photoreactor. A model is developed to explain this behaviour. (orig.)

  10. Modeling Business Processes in Public Administration

    Science.gov (United States)

    Repa, Vaclav

    During more than 10 years of its existence, business process modeling became a regular part of organization management practice. It is mostly regarded as a part of information system development or even as a way to implement some supporting technology (for instance, a workflow system). Although I do not agree with such a reduction of the real meaning of a business process, it is necessary to admit that information technologies play an essential role in business processes (see [1] for more information). Consequently, an information system is inseparable from a business process itself because it is a cornerstone of the general basic infrastructure of a business. This fact impacts on all dimensions of business process management. One of these dimensions is the methodology that postulates that the information systems development provide the business process management with exact methods and tools for modeling business processes. Also the methodology underlying the approach presented in this paper has its roots in the information systems development methodology.

  11. Group Contribution Based Process Flowsheet Synthesis, Design and Modelling

    DEFF Research Database (Denmark)

    d'Anterroches, Loïc; Gani, Rafiqul

    2004-01-01

    This paper presents a process-group-contribution method to model, simulate and synthesize a flowsheet. The process-group based representation of a flowsheet together with a process "property" model are presented. The process-group based synthesis method is developed on the basis of the computer...... aided molecular design methods and gives the ability to screen numerous process alternatives without the need to use the rigorous process simulation models. The process "property" model calculates the design targets for the generated flowsheet alternatives while a reverse modelling method (also...... developed) determines the design variables matching the target. A simple illustrative example highlighting the main features of the methodology is also presented....

  12. Software Engineering Laboratory (SEL) cleanroom process model

    Science.gov (United States)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  13. Business Model Process Configurations

    DEFF Research Database (Denmark)

    Taran, Yariv; Nielsen, Christian; Thomsen, Peter

    2015-01-01

    , by developing (inductively) an ontological classification framework, in view of the BM process configurations typology developed. Design/methodology/approach – Given the inconsistencies found in the business model studies (e.g. definitions, configurations, classifications) we adopted the analytical induction...

  14. Product/Process (P/P) Models For The Defense Waste Processing Facility (DWPF): Model Ranges And Validation Ranges For Future Processing

    Energy Technology Data Exchange (ETDEWEB)

    Jantzen, C. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Edwards, T. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2015-09-25

    Radioactive high level waste (HLW) at the Savannah River Site (SRS) has successfully been vitrified into borosilicate glass in the Defense Waste Processing Facility (DWPF) since 1996. Vitrification requires stringent product/process (P/P) constraints since the glass cannot be reworked once it is poured into ten foot tall by two foot diameter canisters. A unique “feed forward” statistical process control (SPC) was developed for this control rather than statistical quality control (SQC). In SPC, the feed composition to the DWPF melter is controlled prior to vitrification. In SQC, the glass product would be sampled after it is vitrified. Individual glass property-composition models form the basis for the “feed forward” SPC. The models transform constraints on the melt and glass properties into constraints on the feed composition going to the melter in order to guarantee, at the 95% confidence level, that the feed will be processable and that the durability of the resulting waste form will be acceptable to a geologic repository.

  15. Improving the process of process modelling by the use of domain process patterns

    NARCIS (Netherlands)

    Koschmider, A.; Reijers, H.A.

    2015-01-01

    The use of business process models has become prevalent in a wide area of enterprise applications. But while their popularity is expanding, concerns are growing with respect to their proper creation and maintenance. An obvious way to boost the efficiency of creating high-quality business process

  16. Research on Process-oriented Spatio-temporal Data Model

    Directory of Open Access Journals (Sweden)

    XUE Cunjin

    2016-02-01

    Full Text Available According to the analysis of the present status and existing problems of spatio-temporal data models developed in the last 20 years, this paper proposes a process-oriented spatio-temporal data model (POSTDM), aiming at representing, organizing and storing continuous and gradually changing geographical entities. The dynamic geographical entities are graded and abstracted into process object series from their intrinsic characteristics, which are process objects, process stage objects, process sequence objects and process state objects. The logical relationships among process entities are further studied and the structure of UML models and storage are also designed. In addition, through the mechanisms of continuous and gradual changes implicitly recorded by process objects, and the modes of their procedure interfaces offered by the customized ObjcetStorageTable, the POSTDM can carry out process representation, storage and dynamic analysis of continuous and gradual geographic entities. Taking a process organization and storage of marine data as an example, a prototype system (consisting of an object-relational database and a functional analysis platform) is developed for validating and evaluating the model's practicability.

  17. Markov Decision Process Measurement Model.

    Science.gov (United States)

    LaMar, Michelle M

    2018-03-01

    Within-task actions can provide additional information on student competencies but are challenging to model. This paper explores the potential of using a cognitive model for decision making, the Markov decision process, to provide a mapping between within-task actions and latent traits of interest. Psychometric properties of the model are explored, and simulation studies report on parameter recovery within the context of a simple strategy game. The model is then applied to empirical data from an educational game. Estimates from the model are found to correlate more strongly with posttest results than a partial-credit IRT model based on outcome data alone.
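
    As background, the core of any MDP-based model is the value recursion. The sketch below runs value iteration on an invented toy game; in the measurement-model setting, latent traits of interest would parameterize the rewards or transitions rather than being fixed as here.

```python
# Value iteration on a toy MDP; states, actions, P(s'|s,a) and rewards invented.
# transitions[s][a] = list of (probability, next_state, reward)
transitions = {
    "explore": {
        "cautious": [(1.0, "explore", 0.0)],
        "bold":     [(0.6, "win", 10.0), (0.4, "lose", -5.0)],
    },
    "win":  {"stay": [(1.0, "win", 0.0)]},
    "lose": {"stay": [(1.0, "lose", 0.0)]},
}

def value_iteration(gamma=0.9, tol=1e-6):
    V = {s: 0.0 for s in transitions}
    while True:
        delta = 0.0
        for s, actions in transitions.items():
            # Bellman backup: best expected discounted return over actions.
            best = max(sum(p * (r + gamma * V[s2]) for p, s2, r in outs)
                       for outs in actions.values())
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < tol:
            return V

print(value_iteration())   # expected discounted value of each state
```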

  18. Stochastic process corrosion growth models for pipeline reliability

    International Nuclear Information System (INIS)

    Bazán, Felipe Alexander Vargas; Beck, André Teófilo

    2013-01-01

    Highlights: • Novel non-linear stochastic process corrosion growth model is proposed. • Corrosion rate modeled as random Poisson pulses. • Time to corrosion initiation and inherent time-variability properly represented. • Continuous corrosion growth histories obtained. • Model is shown to precisely fit actual corrosion data at two time points. -- Abstract: Linear random variable corrosion models are extensively employed in reliability analysis of pipelines. However, linear models grossly neglect well-known characteristics of the corrosion process. Herein, a non-linear model is proposed, where corrosion rate is represented as a Poisson square wave process. The resulting model represents inherent time-variability of corrosion growth, produces continuous growth and leads to mean growth at less-than-one power of time. Different corrosion models are adjusted to the same set of actual corrosion data for two inspections. The proposed non-linear random process corrosion growth model leads to the best fit to the data, while better representing problem physics
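
    A minimal simulation of the proposed growth mechanism, under assumed parameters: after an initiation time (fixed here for simplicity, random in the paper's spirit) the corrosion rate follows a Poisson square wave, jumping to a new random level at Poisson epochs, and defect depth is its time integral, which yields continuous growth histories. All rates below are invented.

```python
import numpy as np

def corrosion_history(T=30.0, pulse_rate=0.5, mean_rate=0.1, t0=2.0, seed=5):
    """Corrosion depth (mm) over T years: after initiation time t0, the rate
    (mm/yr) is a Poisson square wave resampled at Poisson(pulse_rate) epochs."""
    rng = np.random.default_rng(seed)
    epochs, t = [t0], t0
    while True:
        t += rng.exponential(1.0 / pulse_rate)   # next pulse epoch
        if t >= T:
            break
        epochs.append(t)
    epochs.append(T)
    depth = 0.0
    for a, b in zip(epochs[:-1], epochs[1:]):
        rate = rng.exponential(mean_rate)        # new random rate level per pulse
        depth += rate * (b - a)                  # integrate the square-wave rate
    return depth

depths = [corrosion_history(seed=s) for s in range(1000)]
print(f"mean depth {np.mean(depths):.2f} mm, "
      f"95th pct {np.percentile(depths, 95):.2f} mm")
```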

  19. Model feedstock supply processing plants

    Directory of Open Access Journals (Sweden)

    V. M. Bautin

    2013-01-01

    Full Text Available A model of raw material supply for processing enterprises that belong to a vertically integrated structure for the production and processing of dairy raw materials is developed. The model is distinguished by its orientation towards achieving a cumulative effect for the integrated structure, which acts as the criterion function; this is maximised by optimising capacities, the volumes of raw material deliveries and their qualitative characteristics, the costs of industrial processing of the raw materials, and the demand for dairy products.

  20. Assessing healthcare process maturity: challenges of using a business process maturity model

    NARCIS (Netherlands)

    Tarhan, A.; Turetken, O.; van den Biggelaar, F.J.H.M.

    2015-01-01

    Doi: 10.4108/icst.pervasivehealth.2015.259105 The quality of healthcare services is influenced by the maturity of healthcare processes used to develop it. A maturity model is an instrument to assess and continually improve organizational processes. In the last decade, a number of maturity models

  1. Computer modeling of lung cancer diagnosis-to-treatment process.

    Science.gov (United States)

    Ju, Feng; Lee, Hyo Kyung; Osarogiagbon, Raymond U; Yu, Xinhua; Faris, Nick; Li, Jingshan

    2015-08-01

    We introduce an example of a rigorous, quantitative method for quality improvement in lung cancer care delivery. Computer process modeling methods are introduced for the lung cancer diagnosis, staging and treatment selection process. Two types of process modeling techniques, discrete event simulation (DES) and analytical models, are briefly reviewed. Recent developments in DES are outlined, and the data and procedures necessary to develop a DES model of the lung cancer diagnosis process, leading up to surgical treatment, are summarized. The analytical models include both Markov chain models and closed-form expressions. Markov chain models and their application in healthcare are introduced, and the approach to deriving a lung cancer diagnosis process model is presented. Similarly, the procedure for deriving closed-form expressions for evaluating diagnosis process performance is outlined. Finally, the pros and cons of these methods are discussed.
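
    A minimal sketch of the Markov chain ingredient, with hypothetical care stages and invented transition probabilities; the absorbing-chain fundamental matrix then gives the expected number of steps from referral to surgery. None of the states or numbers come from the paper:

```python
import numpy as np

# Hypothetical care stages; transition probabilities are invented.
stages = ["referral", "imaging", "biopsy", "staging", "surgery"]
P = np.array([
    [0.2, 0.8, 0.0, 0.0, 0.0],   # referral
    [0.0, 0.3, 0.7, 0.0, 0.0],   # imaging
    [0.0, 0.0, 0.2, 0.8, 0.0],   # biopsy
    [0.0, 0.0, 0.0, 0.3, 0.7],   # staging
    [0.0, 0.0, 0.0, 0.0, 1.0],   # surgery (absorbing)
])

# Fundamental matrix N = (I - Q)^-1 gives the expected number of visits
# to each transient stage; its row sums are expected steps to absorption.
Q = P[:4, :4]
N = np.linalg.inv(np.eye(4) - Q)
print("expected visits per stage:", dict(zip(stages, N[0].round(2))))
print("expected steps from referral to surgery:", round(N[0].sum(), 2))
```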

  2. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Full Text Available Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process it is difficult to develop actionable intelligence. There are disagreements about how the CI process should be structured. For CI professionals to focus on producing actionable intelligence, and to do so with simplicity, they need a common CI process model. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  3. Modeling and simulation of heterogeneous catalytic processes

    CERN Document Server

    Dixon, Anthony

    2014-01-01

    Heterogeneous catalysis and mathematical modeling are essential components of the continuing search for better utilization of raw materials and energy, with reduced impact on the environment. Numerical modeling of chemical systems has progressed rapidly due to increases in computer power, and is used extensively for analysis, design and development of catalytic reactors and processes. This book presents reviews of the state of the art in modeling of heterogeneous catalytic reactors and processes: reviews by leading authorities in the respective areas; up-to-date reviews of the latest techniques in modeling of catalytic processes; a mix of US and European authors, as well as academic, industrial and research-institute perspectives; and connections between computational and experimental methods in some of the chapters.

  4. Synergy of modeling processes in the area of soft and hard modeling

    Directory of Open Access Journals (Sweden)

    Sika Robert

    2017-01-01

    Full Text Available The high complexity of production processes results in more frequent use of computer systems for their modeling and simulation. Process modeling helps to find optimal solutions, verify assumptions before implementation and eliminate errors. In practice, modeling of production processes concerns two areas: hard modeling (based on the differential equations of mathematical physics) and soft modeling (based on existing data). In this paper the possibility of a synergistic connection of these two approaches is indicated: hard modeling supported by the tools used in soft modeling. The aim is to significantly reduce the time needed to obtain final results with hard modeling. Tests were carried out in the Calibrate module of the NovaFlow&Solid (NF&S) simulation system in the frame of thermal analysis (ATAS cup). The authors tested the forecasting of output values in the NF&S system (solidification time) on the basis of variable parameters of the thermal model (heat conduction, specific heat, density). The collected data were used as input to prepare a soft model using an MLP (Multi-Layer Perceptron) neural network regression model. The approach described above makes it possible to reduce the time of production process modeling with hard modeling and should encourage production companies to use it.
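
    The soft-model step can be sketched as follows: synthetic data stand in for the NF&S hard-model runs, and an MLP regressor learns the mapping from the thermal parameters (heat conduction, specific heat, density) to solidification time. The parameter ranges and the toy response function are assumptions made purely for illustration:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stand-in for hard-model runs: thermal parameters -> solidification time
X = rng.uniform([20.0, 500.0, 6500.0],    # conductivity, spec. heat, density
                [60.0, 900.0, 7800.0], size=(300, 3))
y = 1e5 * X[:, 1] * X[:, 2] / (1e7 * X[:, 0]) + rng.normal(0.0, 0.05, 300)

surrogate = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000, random_state=0),
)
surrogate.fit(X[:250], y[:250])
print("held-out R^2:", round(surrogate.score(X[250:], y[250:]), 3))
```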

  5. Self-exciting point process in modeling earthquake occurrences

    International Nuclear Information System (INIS)

    Pratiwi, H.; Slamet, I.; Respatiwulan; Saputro, D. R. S.

    2017-01-01

    In this paper, we present a procedure for modeling earthquake occurrences based on a spatial-temporal point process. The magnitude distribution is expressed as a truncated exponential, and the event frequency is modeled with a spatial-temporal point process that is characterized uniquely by its associated conditional intensity process. Earthquakes can be regarded as point patterns that have a temporal clustering feature, so we use a self-exciting point process for modeling the conditional intensity function. The choice of main shocks is conducted via the window algorithm of Gardner and Knopoff, and the model can be fitted by the maximum likelihood method for three random variables. (paper)
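
    For the temporal part, a self-exciting (Hawkes-type) conditional intensity with an exponential kernel, and its log-likelihood, look roughly as below. The spatial component and the truncated-exponential magnitude distribution used in the paper are omitted, and the exponential kernel is an assumption:

```python
import numpy as np

def intensity(t, events, mu, alpha, beta):
    """Conditional intensity lambda(t) = mu + alpha*sum_i exp(-beta(t - t_i))."""
    past = events[events < t]
    return mu + alpha * np.exp(-beta * (t - past)).sum()

def log_likelihood(events, T, mu, alpha, beta):
    """Log-likelihood of a temporal Hawkes process observed on [0, T]."""
    ll = 0.0
    for i, t in enumerate(events):
        ll += np.log(mu + alpha * np.exp(-beta * (t - events[:i])).sum())
    # subtract the compensator, i.e. the integral of lambda over [0, T]
    ll -= mu * T + (alpha / beta) * (1.0 - np.exp(-beta * (T - events))).sum()
    return ll
```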

  6. Computer Modelling of Dynamic Processes

    Directory of Open Access Journals (Sweden)

    B. Rybakin

    2000-10-01

    Full Text Available Results of numerical modeling of dynamic problems are summed up in the article. These problems are characteristic of various areas of human activity, in particular problem solving in ecology. The following problems are considered in the present work: computer modeling of dynamic effects on elastic-plastic bodies, calculation and determination of the performance of gas streams in gas-cleaning equipment, and modeling of biogas formation processes.

  7. Modeling process-structure-property relationships for additive manufacturing

    Science.gov (United States)

    Yan, Wentao; Lin, Stephen; Kafka, Orion L.; Yu, Cheng; Liu, Zeliang; Lian, Yanping; Wolff, Sarah; Cao, Jian; Wagner, Gregory J.; Liu, Wing Kam

    2018-02-01

    This paper presents our latest work on comprehensive modeling of process-structure-property relationships for additive manufacturing (AM) materials, including the use of data-mining techniques to close the cycle of design-predict-optimize. To illustrate the process-structure relationship, the multi-scale multi-physics process modeling starts from the micro-scale to establish a mechanistic heat source model, moves to meso-scale models of individual powder particle evolution, and finally to the macro-scale model to simulate the fabrication process of a complex product. To link structure and properties, a high-efficiency mechanistic model, self-consistent clustering analysis, is developed to capture a variety of material responses. The model incorporates factors such as voids, phase composition, inclusions, and grain structures, which are the differentiating features of AM metals. Furthermore, we propose data-mining as an effective solution for novel rapid design and optimization, motivated by the numerous influencing factors in the AM process. We believe this paper will provide a roadmap to advance fundamental understanding of AM and guide the monitoring and advanced diagnostics of AM processing.

  8. Efficient querying of large process model repositories

    NARCIS (Netherlands)

    Jin, Tao; Wang, Jianmin; La Rosa, M.; Hofstede, ter A.H.M.; Wen, Lijie

    2013-01-01

    Recent years have seen an increased uptake of business process management technology in industries. This has resulted in organizations trying to manage large collections of business process models. One of the challenges facing these organizations concerns the retrieval of models from large business

  9. [GSH fermentation process modeling using entropy-criterion based RBF neural network model].

    Science.gov (United States)

    Tan, Zuoping; Wang, Shitong; Deng, Zhaohong; Du, Guocheng

    2008-05-01

    The prediction accuracy and generalization of GSH fermentation process modeling are often deteriorated by noise in the corresponding experimental data. To avoid this problem, we present a novel RBF neural network modeling approach based on an entropy criterion. It considers the whole distribution structure of the training data set in the parameter learning process, in contrast with traditional MSE-criterion-based parameter learning, and thus effectively avoids weak generalization and over-learning. The proposed approach is then applied to GSH fermentation process modeling. Our results demonstrate that the proposed method has better prediction accuracy, generalization and robustness, offering potential application merit for GSH fermentation process modeling.

  10. Discovering Process Reference Models from Process Variants Using Clustering Techniques

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    2008-01-01

    In today's dynamic business world, success of an enterprise increasingly depends on its ability to react to changes in a quick and flexible way. In response to this need, process-aware information systems (PAIS) emerged, which support the modeling, orchestration and monitoring of business processes

  11. The Role(s) of Process Models in Design Practice

    DEFF Research Database (Denmark)

    Iversen, Søren; Jensen, Mads Kunø Nyegaard; Vistisen, Peter

    2018-01-01

    This paper investigates how design process models are implemented and used in design-driven organisations. The archetypical theoretical framing of process models describes their primary role as guiding the design process and assigning roles and deliverables throughout the process. We hypothesise that the process models also take on more communicative roles in practice, both in terms of creating an internal design rationale and in demystifying the black box of design thinking to external stakeholders. We investigate this hypothesis through an interview study of four major Danish design-driven organisations, and analyse the different roles their archetypical process models take in their organisations. The main contribution is the identification of three, often overlapping, roles which design process models were shown to assume in design-driven organisations: process guidance, adding transparency...

  12. Causally nonseparable processes admitting a causal model

    International Nuclear Information System (INIS)

    Feix, Adrien; Araújo, Mateus; Brukner, Caslav

    2016-01-01

    A recent framework of quantum theory with no global causal order predicts the existence of ‘causally nonseparable’ processes. Some of these processes produce correlations incompatible with any causal order (they violate so-called ‘causal inequalities’, analogous to Bell inequalities), while others do not (they admit a ‘causal model’, analogous to a local model). Here we show for the first time that bipartite causally nonseparable processes with a causal model exist, and give evidence that they have no clear physical interpretation. We also provide an algorithm to generate processes of this kind and show that they have nonzero measure in the set of all processes. We demonstrate the existence of processes which stop violating causal inequalities but are still causally nonseparable when mixed with a certain amount of ‘white noise’. This is reminiscent of the behavior of Werner states in the context of entanglement and nonlocality. Finally, we provide numerical evidence for the existence of causally nonseparable processes which have a causal model even when extended with an entangled state shared among the parties. (paper)

  13. Towards simplification of hydrologic modeling: Identification of dominant processes

    Science.gov (United States)

    Markstrom, Steven; Hay, Lauren E.; Clark, Martyn P.

    2016-01-01

    The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.

  14. Neuroscientific Model of Motivational Process

    Directory of Open Access Journals (Sweden)

    Sung-Il Kim

    2013-03-01

    Full Text Available Considering the neuroscientific findings on reward, learning, value, decision-making, and cognitive control, motivation can be parsed into three subprocesses: a process of generating motivation, a process of maintaining motivation, and a process of regulating motivation. I propose a tentative neuroscientific model of motivational processes which consists of three distinct but continuous subprocesses, namely reward-driven approach, value-based decision making, and goal-directed control. Reward-driven approach is the process in which motivation is generated by reward anticipation and selective approach behaviors toward reward. This process recruits the ventral striatum (reward area), in which basic stimulus-action associations are formed, and is classified as automatic motivation to which relatively little attention is assigned. By contrast, value-based decision making is the process of evaluating various outcomes of actions, learning through positive prediction error, and calculating value continuously. The striatum and the orbitofrontal cortex (valuation area) play crucial roles in sustaining motivation. Lastly, goal-directed control is the process of regulating motivation through cognitive control to achieve goals. This consciously controlled motivation is associated with higher-level cognitive functions such as planning, retaining the goal, monitoring performance, and regulating action. The anterior cingulate cortex (attention area) and the dorsolateral prefrontal cortex (cognitive control area) are the main neural circuits related to regulation of motivation. These three subprocesses interact with each other by sending reward prediction error signals through the dopaminergic pathway from the striatum to the prefrontal cortex. The neuroscientific model of motivational processes suggests several educational implications with regard to the generation, maintenance, and regulation of motivation to learn in the learning environment.

  15. Business Process Simulation: Requirements for Business and Resource Models

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2015-07-01

    Full Text Available The purpose of Business Process Model and Notation (BPMN) is to provide an easily understandable graphical representation of business processes. BPMN is therefore widely used and applied in various areas, one of them being business process simulation. This paper addresses some problems of BPMN-model-based business process simulation. The paper formulates requirements for business process and resource models to enable their use for business process simulation.

  16. Model of diffusers / permeators for hydrogen processing

    International Nuclear Information System (INIS)

    Jacobs, W. D.; Hang, T.

    2008-01-01

    Palladium-silver (Pd-Ag) diffusers are mainstays of hydrogen processing. Diffusers separate hydrogen from inert species such as nitrogen, argon or helium. The tubing becomes permeable to hydrogen when heated to more than 250 C and a differential pressure is created across the membrane. The hydrogen diffuses better at higher temperatures. Experimental or experiential results have been the basis for determining or predicting a diffuser's performance. However, the process can be mathematically modeled, and comparison to experimental or other operating data can be utilized to improve the fit of the model. A reliable model-based diffuser system design is the goal which will have impacts on tritium and hydrogen processing. A computer model has been developed to solve the differential equations for diffusion given the operating boundary conditions. The model was compared to operating data for a low pressure diffuser system. The modeling approach and the results are presented in this paper. (authors)

  17. Models of transport processes in concrete

    International Nuclear Information System (INIS)

    Pommersheim, J.M.; Clifton, J.R.

    1991-01-01

    An approach being considered by the US Nuclear Regulatory Commission for disposal of low-level radioactive waste is to place the waste forms in concrete vaults buried underground. The vaults would need a service life of 500 years. Approaches for predicting the service life of concrete of such vaults include the use of mathematical models. Mathematical models are presented in this report for the major degradation processes anticipated for the concrete vaults, which are corrosion of steel reinforcement, sulfate attack, acid attack, and leaching. The models mathematically represent rate controlling processes including diffusion, convection, and reaction and sorption of chemical species. These models can form the basis for predicting the life of concrete under in-service conditions. 33 refs., 6 figs., 7 tabs

  18. Toward Cognitively Constrained Models of Language Processing: A Review

    Directory of Open Access Journals (Sweden)

    Margreet Vogelzang

    2017-09-01

    Full Text Available Language processing is not an isolated capacity, but is embedded in other aspects of our cognition. However, it is still largely unexplored to what extent and how language processing interacts with general cognitive resources. This question can be investigated with cognitively constrained computational models, which simulate the cognitive processes involved in language processing. The theoretical claims implemented in cognitive models interact with general architectural constraints such as memory limitations. In this way, a model generates new predictions that can be tested in experiments, thus producing new data that can give rise to new theoretical insights. This theory-model-experiment cycle is a promising method for investigating aspects of language processing that are difficult to investigate with more traditional experimental techniques. This review specifically examines the language processing models of Lewis and Vasishth (2005), Reitter et al. (2011), and Van Rij et al. (2010), all implemented in the cognitive architecture Adaptive Control of Thought-Rational (Anderson et al., 2004). These models are all limited by the assumptions about cognitive capacities provided by the cognitive architecture, but use different linguistic approaches. Because of this, their comparison provides insight into the extent to which assumptions about general cognitive resources influence concretely implemented models of linguistic competence. For example, the sheer speed and accuracy of human language processing is a current challenge in the field of cognitive modeling, as it does not seem to adhere to the same memory and processing capacities that have been found in other cognitive processes. Architecture-based cognitive models of language processing may be able to make explicit which language-specific resources are needed to acquire and process natural language. The review sheds light on cognitively constrained models of language processing from two angles: we

  19. Beyond dual-process models: A categorisation of processes underlying intuitive judgement and decision making

    NARCIS (Netherlands)

    Glöckner, A.; Witteman, C.L.M.

    2010-01-01

    Intuitive-automatic processes are crucial for making judgements and decisions. The fascinating complexity of these processes has attracted many decision researchers, prompting them to start investigating intuition empirically and to develop numerous models. Dual-process models assume a clear

  20. Modelling of transport and biogeochemical processes in pollution plumes: Literature review of model development

    DEFF Research Database (Denmark)

    Brun, A.; Engesgaard, Peter Knudegaard

    2002-01-01

    A literature survey shows how biogeochemical (coupled organic and inorganic reaction processes) transport models are based on considering the complete biodegradation process as either a single- or a two-step process. It is demonstrated that some two-step process models rely on the Partial Equilibrium Approach (PEA). The PEA assumes that the organic degradation step, and not the electron acceptor consumption step, is rate limiting. This distinction is not possible in one-step process models, where consumption of both the electron donor and acceptor are treated kinetically. A three-dimensional, two-step PEA model is developed. The model allows for Monod kinetics and biomass growth, features usually included only in one-step process models. The biogeochemical part of the model is tested for a batch system with degradation of organic matter under the consumption of a sequence of electron acceptors...

  1. Dual processing model of medical decision-making

    Science.gov (United States)

    2012-01-01

    Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. Methods We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. Results We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefits and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may dramatically drop below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain the undertreatment that is also documented in current medical practice. Conclusions We have developed the first dual processing model of medical decision-making that has potential to enrich the current medical

  2. Dual processing model of medical decision-making.

    Science.gov (United States)

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-09-03

    Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefits and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may dramatically drop below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination toward higher diagnostic certainty may, in turn, explain the undertreatment that is also documented in current medical practice. We have developed the first dual processing model of medical decision-making that has potential to enrich the current medical decision-making field, which is still to the
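
    The normative 'system II' anchor of such models is the classical expected-utility treatment threshold: treat when the probability of disease exceeds p* = harm / (harm + benefit). A tiny sketch of that anchor only; the dual-processing moderation by system I described in the paper is not reproduced here:

```python
def treatment_threshold(benefit, harm):
    """System II (expected utility) threshold probability of disease
    above which treatment is preferred: p* = harm / (harm + benefit)."""
    return harm / (harm + benefit)

# If treating the diseased yields 9 times more net benefit than the net
# harm of treating the healthy, treatment is warranted above p = 0.1.
print(treatment_threshold(benefit=9.0, harm=1.0))  # -> 0.1
```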

  3. Modeling nonhomogeneous Markov processes via time transformation.

    Science.gov (United States)

    Hubbard, R A; Inoue, L Y T; Fann, J R

    2008-09-01

    Longitudinal studies are a powerful tool for characterizing the course of chronic disease. These studies are usually carried out with subjects observed at periodic visits giving rise to panel data. Under this observation scheme the exact times of disease state transitions and sequence of disease states visited are unknown and Markov process models are often used to describe disease progression. Most applications of Markov process models rely on the assumption of time homogeneity, that is, that the transition rates are constant over time. This assumption is not satisfied when transition rates depend on time from the process origin. However, limited statistical tools are available for dealing with nonhomogeneity. We propose models in which the time scale of a nonhomogeneous Markov process is transformed to an operational time scale on which the process is homogeneous. We develop a method for jointly estimating the time transformation and the transition intensity matrix for the time transformed homogeneous process. We assess maximum likelihood estimation using the Fisher scoring algorithm via simulation studies and compare performance of our method to homogeneous and piecewise homogeneous models. We apply our methodology to a study of delirium progression in a cohort of stem cell transplantation recipients and show that our method identifies temporal trends in delirium incidence and recovery.
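
    The idea can be sketched compactly: choose a parametric operational time L(t) (a power transform is assumed here purely for illustration), estimate a homogeneous generator Q on that scale, and obtain transition probabilities as a matrix exponential. The three-state generator below is invented:

```python
import numpy as np
from scipy.linalg import expm

def transition_matrix(Q, s, t, a):
    """P(s, t) = expm(Q * (L(t) - L(s))) for operational time L(t) = t**a;
    a = 1 recovers the time-homogeneous model."""
    return expm(Q * (t**a - s**a))

# Invented 3-state generator on the operational time scale
Q = np.array([[-0.4,  0.4,  0.0],
              [ 0.2, -0.5,  0.3],
              [ 0.0,  0.0,  0.0]])
print(transition_matrix(Q, s=1.0, t=2.0, a=0.7).round(3))
```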

  4. Mathematical modeling of biomass fuels formation process

    International Nuclear Information System (INIS)

    Gaska, Krzysztof; Wandrasz, Andrzej J.

    2008-01-01

    The increasing demand for thermal and electric energy in many branches of industry and municipal management leads to a drastic diminishing of natural resources (fossil fuels). Meanwhile, in numerous technical processes, a huge mass of wastes is produced. A segregated and converted combustible fraction of the wastes, with relatively high calorific value, may be used as a component of formed fuels. The utilization of the formed fuel components from segregated groups of waste in associated processes of co-combustion with conventional fuels brings significant savings resulting from the partial replacement of fossil fuels, and reduces environmental pollution directly by limiting the migration of waste into the environment (soil, atmospheric air, surface and underground water). The realization of technological processes utilizing formed fuel in associated thermal systems should be qualified by technical criteria, which means that elementary processes, as well as the factors of sustainable development viewed globally, must not be disturbed. The utilization of post-process waste should be preceded by detailed technical, ecological and economic analyses. In order to optimize the mixing process of fuel components, a mathematical model of the forming process was created. The model is defined as a group of data structures which uniquely identify a real process, together with the conversion of these data in algorithms based on a linear programming problem. The paper also presents the optimization of parameters in the fuel forming process using a modified simplex algorithm with polynomial run time. This model is a datum point in the numerical modeling of real processes, allowing a precise determination of the optimal elementary composition of the formed fuel components, with assumed constraints and decision variables of the task.
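
    Since the forming step reduces to a linear program, a toy blending version can be written down directly; the components, costs, calorific values, and the 18 MJ/kg constraint below are all invented for illustration and have no connection to the paper's data:

```python
from scipy.optimize import linprog

# Components: waste fraction, coal dust, binder (all values invented).
cost      = [10.0, 40.0, 25.0]   # cost per kg of component
calorific = [14.0, 28.0,  8.0]   # MJ per kg of component

res = linprog(
    c=cost,                           # minimise blend cost
    A_ub=[[-c for c in calorific]],   # -(sum cal_i * x_i) <= -18,
    b_ub=[-18.0],                     # i.e. at least 18 MJ/kg
    A_eq=[[1.0, 1.0, 1.0]],           # mass fractions sum to one
    b_eq=[1.0],
    bounds=[(0.05, 1.0), (0.0, 1.0), (0.0, 0.2)],
)
print(res.x.round(3), round(res.fun, 2))
```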

  5. Process Model for Friction Stir Welding

    Science.gov (United States)

    Adams, Glynn

    1996-01-01

    Friction stir welding (FSW) is a relatively new process being applied to the joining of metal alloys. The process was initially developed by The Welding Institute (TWI) in Cambridge, UK. The FSW process is being investigated at NASA/MSFC as a repair/initial weld procedure for fabrication of the super-lightweight aluminum-lithium shuttle external tank. The FSW investigations at MSFC were conducted on a horizontal mill to produce butt welds of flat plate material. The weldment plates are butted together and fixed to a backing plate on the mill bed. A pin tool is placed into the tool holder of the mill spindle and rotated at approximately 400 rpm. The pin tool is then plunged into the plates such that the center of the probe lies at one end of the line of contact between the plates and the shoulder of the pin tool penetrates the top surface of the weldment. The weld is produced by traversing the tool along the line of contact between the plates. A lead angle allows the leading edge of the shoulder to remain above the top surface of the plate. The work presented here is a first attempt at modeling a complex phenomenon. The mechanical aspects of conducting the weld process are easily defined, and the process itself is controlled by relatively few input parameters. However, in the region of the weld, plasticizing and forging of the parent material occur. These are difficult processes to model. The model presented here addresses only variations in the radial dimension outward from the pin tool axis. Examinations of the grain structure of the weld reveal that a considerable amount of material deformation also occurs in the direction parallel to the pin tool axis of rotation, through the material thickness. In addition, measurements of the axial load on the pin tool demonstrate that the forging effect of the pin tool shoulder is an important process phenomenon. Therefore, the model needs to be expanded to account for the deformations through the material thickness and the

  6. Towards a structured process modeling method: Building the prescriptive modeling theory

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2017-01-01

    In their effort to control and manage processes, organizations often create process models. The quality of such models is not always optimal, because it is challenging for a modeler to translate her mental image of the process into a formal process description. In order to support this complex human

  7. The Model of the Production Process for the Quality Management

    Directory of Open Access Journals (Sweden)

    Alot Zbigniew

    2017-02-01

    Full Text Available This article is a result of research on models of production processes for quality management and their identification. It discusses the classical model and the indicators for evaluating process capability, taking as its starting point the assumption of a normal distribution of the process characteristics. The division of process types proposed by the ISO 21747:2006 standard, which introduces models for non-stationary processes, is presented. A general process model that allows, in any real case, a precise description of the statistical characteristics of the process is proposed. In comparison to the model proposed by the ISO 21747:2006 standard, it gives the opportunity for a more detailed description of the process characteristics and for determining its capability. This model contains the type of process, its statistical distribution, and the method for determining the capability and performance (long-term capability) of the process. As one of the model elements, the authors propose their own classification and a resulting set of process types. The classification follows the recommendations of ISO 21747:2006 introducing models for non-stationary processes. However, beyond a more precise description of the process characteristics, the set of process types also allows its use for monitoring the process.

  8. Aspect-Oriented Business Process Modeling with AO4BPMN

    Science.gov (United States)

    Charfi, Anis; Müller, Heiko; Mezini, Mira

    Many crosscutting concerns in business processes need to be addressed already at the business process modeling level such as compliance, auditing, billing, and separation of duties. However, existing business process modeling languages including OMG's Business Process Modeling Notation (BPMN) lack appropriate means for expressing such concerns in a modular way. In this paper, we motivate the need for aspect-oriented concepts in business process modeling languages and propose an aspect-oriented extension to BPMN called AO4BPMN. We also present a graphical editor supporting that extension.

  9. Dual processing model of medical decision-making

    OpenAIRE

    Djulbegovic, Benjamin; Hozo, Iztok; Beckstead, Jason; Tsalatsanis, Athanasios; Pauker, Stephen G

    2012-01-01

    Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administe...

  10. Data-driven process decomposition and robust online distributed modelling for large-scale processes

    Science.gov (United States)

    Shu, Zhang; Lijuan, Li; Lijuan, Yao; Shipin, Yang; Tao, Zou

    2018-02-01

    With the increasing attention paid to networked control, system decomposition and distributed models are of significant importance in the implementation of model-based control strategies. In this paper, a data-driven system decomposition and online distributed subsystem modelling algorithm is proposed for large-scale chemical processes. The key controlled variables are first partitioned into several clusters by the affinity propagation clustering algorithm. Each cluster can be regarded as a subsystem. The inputs of each subsystem are then selected by offline canonical correlation analysis between all process variables and its controlled variables. Process decomposition is thus realised after the screening of input and output variables. Once the system decomposition is finished, online subsystem modelling can be carried out by recursively renewing the samples block-wise. The proposed algorithm was applied to the Tennessee Eastman process and its validity was verified.
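
    The decomposition step can be sketched with affinity propagation applied to a correlation-based similarity matrix; the synthetic 'controlled variable' profiles below stand in for real process data, and the subsequent canonical-correlation input selection is omitted:

```python
import numpy as np
from sklearn.cluster import AffinityPropagation

rng = np.random.default_rng(0)

# Synthetic stand-in: 12 controlled variables, each a 200-sample response
# profile; variables within a group share a latent signal.
latent = rng.normal(size=(3, 200))
profiles = np.vstack([latent[i // 4] + 0.3 * rng.normal(size=200)
                      for i in range(12)])

# Similarity = correlation between response profiles; each resulting
# cluster of controlled variables is treated as one subsystem.
S = np.corrcoef(profiles)
ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
print("subsystem label per controlled variable:", ap.labels_)
```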

  11. Ontological Analysis of Integrated Process Models: testing hypotheses

    Directory of Open Access Journals (Sweden)

    Michael Rosemann

    2001-11-01

    Full Text Available Integrated process modeling is achieving prominence in helping to document and manage business administration and IT processes in organizations. The ARIS framework is a popular example of a framework for integrated process modeling, not least because it underlies the 800 or more reference models embedded in the world's most popular ERP package, SAP R/3. This paper demonstrates the usefulness of the Bunge-Wand-Weber (BWW) representation model for evaluating modeling grammars such as those constituting ARIS. It reports some initial insights gained from pilot testing Green and Rosemann's (2000) evaluative propositions. Even when considering all five views of ARIS, modelers have problems representing business rules, the scope and boundary of systems, and decomposing models. However, even though it is completely ontologically redundant, users still find the function view useful in modeling.

  12. Computer-Aided Multiscale Modelling for Chemical Process Engineering

    DEFF Research Database (Denmark)

    Morales Rodriguez, Ricardo; Gani, Rafiqul

    2007-01-01

    Chemical processes are generally modeled through monoscale approaches, which, while not always adequate, serve a useful role in product-process design. In this case, the use of a multi-dimensional and multi-scale model-based approach has importance in product-process development. A computer-aided framework

  13. Guided interaction exploration in artifact-centric process models

    NARCIS (Netherlands)

    van Eck, M.L.; Sidorova, N.; van der Aalst, W.M.P.

    2017-01-01

    Artifact-centric process models aim to describe complex processes as a collection of interacting artifacts. Recent developments in process mining allow for the discovery of such models. However, the focus is often on the representation of the individual artifacts rather than their interactions. Based

  14. Towards simplification of hydrologic modeling: identification of dominant processes

    Directory of Open Access Journals (Sweden)

    S. L. Markstrom

    2016-11-01

    Full Text Available The Precipitation–Runoff Modeling System (PRMS), a distributed-parameter hydrologic model, has been applied to the conterminous US (CONUS). Parameter sensitivity analysis was used to identify: (1) the sensitive input parameters and (2) particular model output variables that could be associated with the dominant hydrologic process(es). Sensitivity values of 35 PRMS calibration parameters were computed using the Fourier amplitude sensitivity test procedure on 110 000 independent hydrologically based spatial modeling units covering the CONUS and then summarized to process (snowmelt, surface runoff, infiltration, soil moisture, evapotranspiration, interflow, baseflow, and runoff) and model performance statistic (mean, coefficient of variation, and autoregressive lag 1). Identified parameters and processes provide insight into model performance at the location of each unit and allow the modeler to identify the most dominant process on the basis of which processes are associated with the most sensitive parameters. The results of this study indicate that: (1) the choice of performance statistic and output variables has a strong influence on parameter sensitivity, (2) the apparent model complexity to the modeler can be reduced by focusing on those processes that are associated with sensitive parameters and disregarding those that are not, (3) different processes require different numbers of parameters for simulation, and (4) some sensitive parameters influence only one hydrologic process, while others may influence many.

  15. BUSINESS PROCESS MODELLING: A FOUNDATION FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Vesna Bosilj-Vukšić

    2006-12-01

    Full Text Available Knowledge management (KM) is increasingly recognised as a strategic practice of knowledge-intensive companies, becoming an integral part of an organisation's strategy to improve business performance. This paper provides an overview of business process modelling applications and analyses the relationship between business process modelling and knowledge management projects. It presents a case study of leading Croatian banks and an insurance company, discussing their practical experience in conducting business process modelling projects and investigating the opportunity for integrating a business process repository and organisational knowledge as the foundation for knowledge management system development.

  16. Towards a universal competitive intelligence process model

    Directory of Open Access Journals (Sweden)

    Rene Pellissier

    2013-08-01

    Background: Competitive intelligence (CI) provides actionable intelligence, which gives enterprises a competitive edge. However, without a proper process it is difficult to develop actionable intelligence, and there are disagreements about how the CI process should be structured. Objectives: The purpose of this research is to review the current literature on CI with the aim of identifying and analysing CI process models, and finally to propose a universal CI process model. Method: The study was qualitative in nature and content analysis was conducted on all identified sources establishing and analysing CI process models. To identify relevant literature, academic databases and search engines were used. Moreover, a review of references in related studies led to more relevant sources, the references of which were further reviewed and analysed. To ensure reliability, only peer-reviewed articles were used. Results: The findings reveal that the majority of scholars view the CI process as a cycle of interrelated phases, in which the output of one phase is the input of the next. Conclusion: The CI process is a cycle of interrelated phases in which the output of one phase is the input of the next. These phases are influenced by the following factors: decision makers, process and structure, organisational awareness and culture, and feedback.

  17. Pedagogic process modeling: Humanistic-integrative approach

    Directory of Open Access Journals (Sweden)

    Boritko Nikolaj M.

    2007-01-01

    Full Text Available The paper deals with some current problems of modeling the dynamics of the development of the individual's subject features. The term "process" is considered in the context of the humanistic-integrative approach, in which the principles of self-education are regarded as criteria for efficient pedagogic activity. Four basic characteristics of the pedagogic process are pointed out: intentionality reflects the logicality and regularity of the development of the process; discreteness (stageability) indicates the qualitative stages through which the pedagogic phenomenon passes; nonlinearity explains the crisis character of pedagogic processes and reveals inner factors of self-development; situationality requires a selection of pedagogic conditions in accordance with the inner factors, which would enable steering the pedagogic process. Two steps for singling out a particular stage, and an algorithm for developing an integrative model for it, are offered. The suggested conclusions might be of use for further theoretical research, analyses of educational practices and realistic prediction of pedagogical phenomena.

  18. Updating parameters of the chicken processing line model

    DEFF Research Database (Denmark)

    Kurowicka, Dorota; Nauta, Maarten; Jozwiak, Katarzyna

    2010-01-01

    A mathematical model of chicken processing that quantitatively describes the transmission of Campylobacter on chicken carcasses from slaughter to chicken meat product has been developed in Nauta et al. (2005). This model was quantified with expert judgment. The recent availability of data allows updating parameters of the model to better describe processes observed in slaughterhouses. We propose Bayesian updating as a suitable technique to update expert judgment with microbiological data. Berrang and Dickens's data are used to demonstrate the performance of this method in updating parameters of the chicken processing line model.
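
    A minimal Beta-Binomial instance of such Bayesian updating: an expert-elicited Beta prior on a transfer probability is updated with hypothetical inspection counts. Both the prior and the counts are invented; the actual processing-line model has a much richer parameter structure:

```python
from scipy import stats

# Expert judgment encoded as a Beta prior on a transfer probability
a_prior, b_prior = 2.0, 8.0        # prior mean 0.2 (invented)

# Hypothetical inspection data: k positive transfers out of n carcasses
k, n = 11, 40

posterior = stats.beta(a_prior + k, b_prior + n - k)
print("posterior mean:", round(posterior.mean(), 3))
print("95% credible interval:",
      [round(q, 3) for q in posterior.interval(0.95)])
```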

  19. Kinetic and thermodynamic modelling of TBP synthesis processes

    International Nuclear Information System (INIS)

    Azzouz, A.; Attou, M.

    1989-02-01

    The present paper deals with kinetic and thermodynamic modelling of tributylphosphate (TBP) synthesis processes. Its aim is a purely comparative study of two different synthesis routes, i.e. direct and indirect esterification of butanol. The methodology involves two steps. The first step consists in approximating the curves which describe the process evolution and their dependence on the main parameters. The results gave a kinetic model of the rate of TBP yield. Further, on the basis of thermodynamic data concerning the various compounds involved, a theoretical model was derived. The calculations were carried out in the Basic language, and an interpolation method was applied to approximate the kinetic curves. The thermodynamic calculations were performed on the basis of the Gibbs free energy using a VAX-type computer and a VT240 terminal. The accuracy of the calculations was reasonable and within the norms. For each process, the confrontation of the two models shows appreciable agreement. In the two processes, the thermodynamic models were similar, although the kinetic equations present different reaction orders. The reaction orders were determined by a mathematical method which consists in searching for the minimal difference between an empirical relation and a kinetic model of fixed order; this corresponds in fact to testing the proposed model at various reaction orders around the suspected value. The main conclusion of this work is that processes of this kind fit the model well without taking side-chain reactions into account. The behaviour of both processes is like that of a single reaction having a quasi-linear dependence of the yield rate on reaction time.

  20. On the correlation between process model metrics and errors

    NARCIS (Netherlands)

    Mendling, J.; Neumann, G.; Aalst, van der W.M.P.; Grundy, J.; Hartmann, S.; Laender, S.; Maciaszek, L.; Roddick, J.F.

    2007-01-01

    Business process models play an important role for the management, design, and improvement of process organizations and process-aware information systems. Despite the extensive application of process modeling in practice there are hardly empirical results available on quality aspects of process

  1. A generalized logarithmic image processing model based on the gigavision sensor model.

    Science.gov (United States)

    Deng, Guang

    2012-03-01

    The logarithmic image processing (LIP) model is a mathematical theory providing generalized linear operations for image processing. The gigavision sensor (GVS) is a new imaging device that can be described by a statistical model. In this paper, by studying these two seemingly unrelated models, we develop a generalized LIP (GLIP) model. With the LIP model being its special case, the GLIP model not only provides new insights into the LIP model but also defines new image representations and operations for solving general image processing problems that are not necessarily related to the GVS. A new parametric LIP model is also developed. To illustrate the application of the new scalar multiplication operation, we propose an energy-preserving algorithm for tone mapping, which is a necessary step in image dehazing. By comparing with results using two state-of-the-art algorithms, we show that the new scalar multiplication operation is an effective tool for tone mapping.
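
    For reference, the two basic LIP operations on gray-tone functions with upper bound M can be stated in a few lines (these are the classical definitions; the paper's GLIP generalisation, which replaces them with a parametric family, is not reproduced here):

```python
import numpy as np

M = 256.0  # gray-tone upper bound of the LIP model

def lip_add(f, g):
    """LIP addition: f (+) g = f + g - f*g/M."""
    return f + g - f * g / M

def lip_scalar(lam, f):
    """LIP scalar multiplication: lam (x) f = M - M*(1 - f/M)**lam."""
    return M - M * (1.0 - f / M) ** lam

f = np.array([50.0, 128.0, 200.0])
print(lip_add(f, f))       # stays below M, unlike ordinary addition
print(lip_scalar(0.5, f))  # a tone-mapping-style attenuation
```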

  2. A Companion Model Approach to Modelling and Simulation of Industrial Processes

    International Nuclear Information System (INIS)

    Juslin, K.

    2005-09-01

    Modelling and simulation provide huge possibilities if broadly taken up by engineers as a working method. However, when considering the launch of modelling and simulation tools in an engineering design project, they must be easy to learn and use. There is no time to write equations, to consult suppliers' experts, or to manually transfer data from one tool to another. The answer seems to lie in the integration of easy-to-use and dependable simulation software with engineering tools. Accordingly, the modelling and simulation software shall accept as input such structured design information on industrial unit processes and their connections as is provided by e.g. CAD software and product databases. The software technology, including the required specification and communication standards, is already available. Internet-based service repositories make it possible for equipment manufacturers to supply 'extended products', including such design data as needed by engineers engaged in process and automation integration. A market niche is evolving for simulation service centres, operating in co-operation with project consultants, equipment manufacturers, process integrators, automation designers, plant operating personnel, and maintenance centres. The companion model approach to the specification and solution of process simulation models, as presented herein, is developed from the above premises. The focus is on how to tackle real-world processes, which from the modelling point of view are heterogeneous, dynamic, very stiff, very nonlinear and only piecewise continuous, without extensive manual interventions by human experts. An additional challenge, to solve the arising equations fast and reliably, is dealt with as well. (orig.)

  3. From BPMN process models to BPEL web services

    NARCIS (Netherlands)

    Ouyang, C.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Feig, E.; Kumar, A.

    2006-01-01

    The Business Process Modelling Notation (BPMN) is a graph-oriented language in which control and action nodes can be connected almost arbitrarily. It is supported by various modelling tools but so far no systems can directly execute BPMN models. The Business Process Execution Language for Web

  4. Integrated Site Model Process Model Report

    International Nuclear Information System (INIS)

    Booth, T.

    2000-01-01

    The Integrated Site Model (ISM) provides a framework for discussing the geologic features and properties of Yucca Mountain, which is being evaluated as a potential site for a geologic repository for the disposal of nuclear waste. The ISM is important to the evaluation of the site because it provides 3-D portrayals of site geologic, rock property, and mineralogic characteristics and their spatial variabilities. The ISM is not a single discrete model; rather, it is a set of static representations that provide three-dimensional (3-D), computer representations of site geology, selected hydrologic and rock properties, and mineralogic-characteristics data. These representations are manifested in three separate model components of the ISM: the Geologic Framework Model (GFM), the Rock Properties Model (RPM), and the Mineralogic Model (MM). The GFM provides a representation of the 3-D stratigraphy and geologic structure. Based on the framework provided by the GFM, the RPM and MM provide spatial simulations of the rock and hydrologic properties, and mineralogy, respectively. Functional summaries of the component models and their respective output are provided in Section 1.4. Each of the component models of the ISM considers different specific aspects of the site geologic setting. Each model was developed using unique methodologies and inputs, and the determination of the modeled units for each of the components is dependent on the requirements of that component. Therefore, while the ISM represents the integration of the rock properties and mineralogy into a geologic framework, the discussion of ISM construction and results is most appropriately presented in terms of the three separate components. This Process Model Report (PMR) summarizes the individual component models of the ISM (the GFM, RPM, and MM) and describes how the three components are constructed and combined to form the ISM

  5. QUALITY IMPROVEMENT MODEL AT THE MANUFACTURING PROCESS PREPARATION LEVEL

    Directory of Open Access Journals (Sweden)

    Dusko Pavletic

    2009-12-01

    Full Text Available The paper presents the basis for an operational quality improvement model at the manufacturing process preparation level. Numerous appropriate related quality assurance and improvement methods and tools are identified. The main principles of manufacturing processes are investigated in order to scrutinize one general model of a manufacturing process and to define the manufacturing process preparation level. The development and introduction of the operational quality improvement model are based on research conducted into the application possibilities of methods and tools in real manufacturing processes in the shipbuilding and automotive industries. The basic model structure is described and presented by an appropriate general algorithm. The operational quality improvement model developed lays down the main guidelines for practical and systematic application of quality improvement methods and tools.

  6. Residence time modeling of hot melt extrusion processes.

    Science.gov (United States)

    Reitz, Elena; Podhaisky, Helmut; Ely, David; Thommes, Markus

    2013-11-01

    The hot melt extrusion process is a widespread technique to mix viscous melts. The residence time of material in the process frequently determines the product properties. An experimental setup and a corresponding mathematical model were developed to evaluate residence time and residence time distribution in twin screw extrusion processes. The extrusion process was modeled as the convolution of a mass transport process described by a Gaussian probability function, and a mixing process represented by an exponential function. The residence time of the extrusion process was determined by introducing a tracer at the extruder inlet and measuring the tracer concentration at the die. These concentrations were fitted to the residence time model, and an adequate correlation was found. Different parameters were derived to characterize the extrusion process including the dead time, the apparent mixing volume, and a transport related axial mixing. A 2³ design of experiments was performed to evaluate the effect of powder feed rate, screw speed, and melt viscosity of the material on the residence time. All three parameters affect the residence time of material in the extruder. In conclusion, a residence time model was developed to interpret experimental data and to get insights into the hot melt extrusion process. Copyright © 2013 Elsevier B.V. All rights reserved.
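
    The convolution of a Gaussian transport term with an exponential mixing term is the exponentially modified Gaussian, so a fit of this residence time model can be sketched with scipy's exponnorm distribution; the tracer data below are synthetic and all parameter values are illustrative:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic tracer residence times: Gaussian transport (dead time plus
# axial dispersion) convolved with exponential mixing.
dead_time, sigma, tau_mix = 60.0, 5.0, 20.0   # seconds (illustrative)
t = rng.normal(dead_time, sigma, 500) + rng.exponential(tau_mix, 500)

K, loc, scale = stats.exponnorm.fit(t)        # K = tau_mix / sigma
print(f"dead time ~ {loc:.1f} s, sigma ~ {scale:.1f} s, "
      f"mixing time ~ {K * scale:.1f} s")
```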

  7. Quasilinear Extreme Learning Machine Model Based Internal Model Control for Nonlinear Process

    Directory of Open Access Journals (Sweden)

    Dazi Li

    2015-01-01

    Full Text Available A new strategy for internal model control (IMC) is proposed, using a regression algorithm for a quasilinear model with an extreme learning machine (QL-ELM). Aimed at chemical processes with nonlinearity, the learning process of the internal model and the inverse model is derived. The proposed QL-ELM is constructed as a linear ARX model with a complicated nonlinear coefficient. It shows good approximation ability and fast convergence. The complicated coefficients are separated into two parts. The linear part is determined by recursive least squares (RLS), while the nonlinear part is identified through the extreme learning machine. The parameters of the linear part and the output weights of the ELM are estimated iteratively. The proposed internal model control is applied to a CSTR process. The effectiveness and accuracy of the proposed method are extensively verified through numerical results.
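
    The ELM ingredient on its own takes only a few lines: a random hidden layer followed by a single least-squares solve for the output weights. The sketch below omits the quasilinear ARX structure and the RLS iteration described in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y, n_hidden=50):
    """Extreme learning machine: random hidden weights, output weights
    obtained by a single least-squares solve."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

X = rng.uniform(-1.0, 1.0, (400, 2))
y = np.sin(3.0 * X[:, 0]) + 0.5 * X[:, 1] ** 2
W, b, beta = elm_fit(X[:300], y[:300])
err = np.abs(elm_predict(X[300:], W, b, beta) - y[300:]).mean()
print("mean absolute test error:", round(float(err), 4))
```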

  8. Mathematical model of seed germination process

    International Nuclear Information System (INIS)

    Gładyszewska, B.; Koper, R.; Kornarzyński, K.

    1999-01-01

    An analytical model of the seed germination process is described. The model, based on the proposed working hypothesis, leads, by analogy, to a law corresponding to the Verhulst-Pearl law known from the theory of population kinetics. The model was applied to describe the germination kinetics of tomato seeds of the Promyk field cultivar, biostimulated by laser treatment. Close agreement between experimental and model data was obtained [pl
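
    For readers unfamiliar with the Verhulst-Pearl (logistic) law invoked here, the following sketch fits such a curve to hypothetical cumulative germination data; the numbers are invented for illustration.

```python
# Sketch: logistic (Verhulst-Pearl) fit to cumulative germination fractions.
import numpy as np
from scipy.optimize import curve_fit

def verhulst_pearl(t, K, r, t0):
    """K: final germination fraction, r: rate, t0: inflection time (days)."""
    return K / (1.0 + np.exp(-r * (t - t0)))

t = np.arange(9, dtype=float)                        # days after sowing
g = np.array([0.00, 0.02, 0.10, 0.35, 0.62, 0.80, 0.88, 0.90, 0.91])
(K, r, t0), _ = curve_fit(verhulst_pearl, t, g, p0=[0.9, 1.0, 3.0])
print(K, r, t0)
```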

  9. The gut microbiota influence behavior in the subchronic PCP induced animal model of schizophrenia

    DEFF Research Database (Denmark)

    Jørgensen, Bettina Merete Pyndt; Redrobe, Paul; Brønnum Pedersen, Tina

    The gut microbiota has a major impact on the individual. Here we show that the gut microbiota influence behavior in the subchronic PCP-induced animal model of schizophrenia. The gut microbiota were changed in the group treated subchronically with PCP, and restoration coincided with normalisation of memory performance in Lister Hooded rats. Furthermore, the individual gut microbiota correlated with the individual behavior observed in the tests conducted. In conclusion, the results show an influence of the gut microbiota on behavior in this model, and therefore it might be relevant to include the information...

  10. A linear time layout algorithm for business process models

    NARCIS (Netherlands)

    Gschwind, T.; Pinggera, J.; Zugal, S.; Reijers, H.A.; Weber, B.

    2014-01-01

    The layout of a business process model influences how easily it can be understood. Existing layout features in process modeling tools often rely on graph representations, but do not take the specific properties of business process models into account. In this paper, we propose an algorithm that is

  11. Towards Model Checking Stochastic Process Algebra

    NARCIS (Netherlands)

    Hermanns, H.; Grieskamp, W.; Santen, T.; Katoen, Joost P.; Stoddart, B.; Meyer-Kayser, J.; Siegle, M.

    2000-01-01

    Stochastic process algebras have been proven useful because they allow behaviour-oriented performance and reliability modelling. As opposed to traditional performance modelling techniques, the behaviour- oriented style supports composition and abstraction in a natural way. However, analysis of

  12. A Comprehensive and Harmonized Digital Forensic Investigation Process Model.

    Science.gov (United States)

    Valjarevic, Aleksandar; Venter, Hein S

    2015-11-01

    Performing a digital forensic investigation (DFI) requires a standardized and formalized process. There is currently no international standard, nor does a global, harmonized DFI process (DFIP) exist. The authors studied existing state-of-the-art DFIP models and concluded that there are significant disparities pertaining to the number of processes, the scope, the hierarchical levels, and concepts applied. This paper proposes a comprehensive model that harmonizes existing models. An effort was made to incorporate all types of processes proposed by the existing models, including those aimed at achieving digital forensic readiness. The authors introduce a novel class of processes called concurrent processes. This is a novel contribution that should, together with the rest of the model, enable more efficient and effective DFI, while ensuring admissibility of digital evidence. Ultimately, the proposed model is intended to be used for different types of DFI and should lead to standardization. © 2015 American Academy of Forensic Sciences.

  13. Ontological Model of Business Process Management Systems

    Science.gov (United States)

    Manoilov, G.; Deliiska, B.

    2008-10-01

    The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at development time and run time of the system. In the article an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy a simple online thesaurus is created.

  14. Dual processing model of medical decision-making

    Directory of Open Access Journals (Sweden)

    Djulbegovic Benjamin

    2012-09-01

    Full Text Available Abstract Background Dual processing theory of human cognition postulates that reasoning and decision-making can be described as a function of both an intuitive, experiential, affective system (system I) and/or an analytical, deliberative (system II) processing system. To date no formal descriptive model of medical decision-making based on dual processing theory has been developed. Here we postulate such a model and apply it to a common clinical situation: whether treatment should be administered to a patient who may or may not have a disease. Methods We developed a mathematical model in which we linked a recently proposed descriptive psychological model of cognition with the threshold model of medical decision-making and show how this approach can be used to better understand decision-making at the bedside and explain the widespread variation in treatments observed in clinical practice. Results We show that physicians' beliefs about whether to treat at higher (lower) probability levels compared to the prescriptive therapeutic thresholds obtained via system II processing are moderated by system I and the ratio of benefit and harms as evaluated by both systems I and II. Under some conditions, the system I decision maker's threshold may drop dramatically below the expected utility threshold derived by system II. This can explain the overtreatment often seen in contemporary practice. The opposite can also occur, as in situations where empirical evidence is considered unreliable, or when the cognitive processes of decision-makers are biased through recent experience: the threshold will increase relative to the normative threshold value derived via system II using the expected utility threshold. This inclination towards higher diagnostic certainty may, in turn, explain the undertreatment that is also documented in current medical practice. Conclusions We have developed the first dual processing model of medical decision-making that has potential to
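
    The threshold model referenced in this abstract has a well-known normative (system II) form: treat when the probability of disease exceeds harm/(harm + benefit). The sketch below computes that threshold and adds a purely hypothetical system-I weighting factor to mimic the moderation the authors describe; the weighting form is an assumption, not the paper's equation.

```python
def treatment_threshold(benefit, harm):
    """Normative therapeutic threshold: treat when P(disease) > H / (H + B)."""
    return harm / (harm + benefit)

def moderated_threshold(benefit, harm, gamma):
    """Hypothetical system-I moderation: gamma < 1 deflates perceived harm,
    so the acting threshold drops (overtreatment); gamma > 1 the reverse."""
    return (gamma * harm) / (gamma * harm + benefit)

p_star = treatment_threshold(benefit=0.40, harm=0.05)       # ~0.11
print(p_star, moderated_threshold(0.40, 0.05, gamma=0.3))   # biased threshold
```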

  15. Workflow Patterns for Business Process Modeling

    NARCIS (Netherlands)

    Thom, Lucineia Heloisa; Lochpe, Cirano; Reichert, M.U.

    For their reuse advantages, workflow patterns (e.g., control flow patterns, data patterns, resource patterns) are increasingly attracting the interest of both researchers and vendors. Frequently, business process or workflow models can be assembled out of a set of recurrent process fragments (or

  16. A Model of Process-Based Automation: Cost and Quality Implications in the Medication Management Process

    Science.gov (United States)

    Spaulding, Trent Joseph

    2011-01-01

    The objective of this research is to understand how a set of systems, as defined by the business process, creates value. The three studies contained in this work develop the model of process-based automation. The model states that complementarities among systems are specified by handoffs in the business process. The model also provides theory to…

  17. The Formalization of the Business Process Modeling Goals

    OpenAIRE

    Bušinska, Ligita; Kirikova, Mārīte

    2016-01-01

    In business process modeling the de facto standard BPMN has emerged. However, applications of this notation use many subsets of its elements and various extensions. Also, BPMN still overlaps with many other modeling languages, forming a large set of available options for business process modeling languages and dialects. While, in general, the goal of modelers is a central notion in the choice of modeling languages and notations, in most research that proposes guidelines, techniques, and me...

  18. Process and Context in Choice Models

    DEFF Research Database (Denmark)

    Ben-Akiva, Moshe; Palma, André de; McFadden, Daniel

    2012-01-01

    The extended choice framework includes more behavioral richness through the explicit representation of the planning process preceding an action and its dynamics and the effects of context (family, friends, and market) on the process leading to a choice, as well as the inclusion of new types of subjective data in choice models. We discuss the key issues involved in applying the extended framework, focusing on richer data requirements, theories, and models, and present three partial demonstrations of the proposed framework. Future research challenges include the development of more comprehensive empirical tests...

  19. Business process modeling using Petri nets

    NARCIS (Netherlands)

    Hee, van K.M.; Sidorova, N.; Werf, van der J.M.E.M.; Jensen, K.; Aalst, van der W.M.P.; Balbo, G.; Koutny, M.; Wolf, K.

    2013-01-01

    Business process modeling has become a standard activity in many organizations. We start by going back into the history and explaining why this activity appeared and became of such importance for organizations in achieving their business targets. We discuss the context in which business process

  20. Modeling spatial processes with unknown extremal dependence class

    KAUST Repository

    Huser, Raphaë l G.; Wadsworth, Jennifer L.

    2017-01-01

    Many environmental processes exhibit weakening spatial dependence as events become more extreme. Well-known limiting models, such as max-stable or generalized Pareto processes, cannot capture this, which can lead to a preference for models

  1. Process modeling study of the CIF incinerator

    International Nuclear Information System (INIS)

    Hang, T.

    1995-01-01

    The Savannah River Site (SRS) plans to begin operating the Consolidated Incineration Facility (CIF) in 1996. The CIF will treat liquid and solid low-level radioactive, mixed and RCRA hazardous wastes generated at SRS. In addition to experimental test programs, process modeling was applied to provide guidance in the areas of safety, environmental regulation compliance, process improvement and optimization. A steady-state flowsheet model was used to calculate material/energy balances and to track key chemical constituents throughout the process units. Dynamic models were developed to predict the CIF transient characteristics in normal and abnormal operation scenarios. Predictions include the rotary kiln heat transfer, dynamic responses of the CIF to fluctuations in the solid waste feed or upsets in the system equipment, performance of the control system, air in-leakage in the kiln, etc. This paper reviews the modeling study performed to assist in the deflagration risk assessment

  2. Parameter identification in multinomial processing tree models

    NARCIS (Netherlands)

    Schmittmann, V.D.; Dolan, C.V.; Raijmakers, M.E.J.; Batchelder, W.H.

    2010-01-01

    Multinomial processing tree models form a popular class of statistical models for categorical data that have applications in various areas of psychological research. As in all statistical models, establishing which parameters are identified is necessary for model inference and selection on the basis

  3. Study on a Process-oriented Knowledge Management Model

    OpenAIRE

    Zhang, Lingling; Li, Jun; Zheng, Xiuyu; Li, Xingsen; Shi, Yong

    2007-01-01

    Now knowledge has become the most important resource of enterprises. Process-oriented knowledge management (POKM) is a new and valuable research field, and may be the most practical method for dealing with difficulties in knowledge management. The paper analyzes the background, hypotheses and purposes of POKM, defines process knowledge, and gives a process-oriented knowledge management model. The model integrates knowledge, process, human, and technology. It can improve the decision support capabili...

  4. Bariatric surgery in morbidly obese insulin resistant humans normalises insulin signalling but not insulin-stimulated glucose disposal.

    Directory of Open Access Journals (Sweden)

    Mimi Z Chen

    Full Text Available Weight-loss after bariatric surgery improves insulin sensitivity, but the underlying molecular mechanism is not clear. To ascertain the effect of bariatric surgery on insulin signalling, we examined glucose disposal and Akt activation in morbidly obese volunteers before and after Roux-en-Y gastric bypass surgery (RYGB), and compared this to lean volunteers. The hyperinsulinaemic euglycaemic clamp, at five infusion rates, was used to determine glucose disposal rates (GDR) in eight morbidly obese (body mass index, BMI = 47.3 ± 2.2 kg/m2) patients, before and after RYGB, and in eight lean volunteers (BMI = 20.7 ± 0.7 kg/m2). Biopsies of brachioradialis muscle, taken at fasting and at insulin concentrations that induced half-maximal (GDR50) and maximal (GDR100) GDR in each subject, were used to examine the phosphorylation of Akt-Thr308, Akt-Ser473, and PRAS40, in vivo biomarkers for Akt activity. Pre-operatively, insulin-stimulated GDR was lower in the obese compared to the lean individuals (P<0.001). Weight-loss of 29.9 ± 4 kg after surgery significantly improved GDR50 (P=0.004) but not GDR100 (P=0.3). These subjects still remained significantly more insulin resistant than the lean individuals (P<0.001). Weight loss increased insulin-stimulated skeletal muscle Akt-Thr308 and Akt-Ser473 phosphorylation (P=0.02 and P=0.03 respectively, MANCOVA) and Akt activity towards the substrate PRAS40 (P=0.003, MANCOVA), and, in contrast to GDR, these were fully normalised after the surgery (obese vs lean: P=0.6, P=0.35, P=0.46, respectively). Our data show that although Akt activity substantially improved after surgery, it did not lead to a full restoration of insulin-stimulated glucose disposal. This suggests that a major defect downstream of, or parallel to, Akt signalling remains after significant weight-loss.

  5. Analysis of a normalised expressed sequence tag (EST) library from a key pollinator, the bumblebee Bombus terrestris.

    Science.gov (United States)

    Sadd, Ben M; Kube, Michael; Klages, Sven; Reinhardt, Richard; Schmid-Hempel, Paul

    2010-02-15

    The bumblebee, Bombus terrestris (Order Hymenoptera), is of widespread importance. This species is extensively used for commercial pollination in Europe, and along with other Bombus spp. is a key member of natural pollinator assemblages. Furthermore, the species is studied in a wide variety of biological fields. The objective of this project was to create a B. terrestris EST resource that will prove to be valuable in obtaining a deeper understanding of this significant social insect. A normalised cDNA library was constructed from the thorax and abdomen of B. terrestris workers in order to enhance the discovery of rare genes. A total of 29'428 ESTs were sequenced. Subsequent clustering resulted in 13'333 unique sequences. Of these, 58.8 percent had significant similarities to known proteins, with 54.5 percent having a "best-hit" to existing Hymenoptera sequences. Comparisons with the honeybee and other insects allowed the identification of potential candidates for gene loss, pseudogene evolution, and possible incomplete annotation in the honeybee genome. Further, given the focus of much basic research and the perceived threat of disease to natural and commercial populations, the immune system of bumblebees is a particularly relevant component. Although the library is derived from unchallenged bees, we still uncover transcription of a number of immune genes spanning the principally described insect immune pathways. Additionally, the EST library provides a resource for the discovery of genetic markers that can be used in population level studies. Indeed, initial screens identified 589 simple sequence repeats and 854 potential single nucleotide polymorphisms. The resource that these B. terrestris ESTs represent is valuable for ongoing work. The ESTs provide direct evidence of transcriptionally active regions, but they will also facilitate further functional genomics, gene discovery and future genome annotation. These are important aspects in obtaining a greater

  6. Comparing single- and dual-process models of memory development.

    Science.gov (United States)

    Hayes, Brett K; Dunn, John C; Joubert, Amy; Taylor, Robert

    2017-11-01

    This experiment examined single-process and dual-process accounts of the development of visual recognition memory. The participants, 6-7-year-olds, 9-10-year-olds and adults, were presented with a list of pictures which they encoded under shallow or deep conditions. They then made recognition and confidence judgments about a list containing old and new items. We replicated the main trends reported by Ghetti and Angelini () in that recognition hit rates increased from 6 to 9 years of age, with larger age changes following deep than shallow encoding. Formal versions of the dual-process high threshold signal detection model and several single-process models (equal variance signal detection, unequal variance signal detection, mixture signal detection) were fit to the developmental data. The unequal variance and mixture signal detection models gave a better account of the data than either of the other models. A state-trace analysis found evidence for only one underlying memory process across the age range tested. These results suggest that single-process memory models based on memory strength are a viable alternative to dual-process models for explaining memory development. © 2016 John Wiley & Sons Ltd.
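
    As a point of reference for the models compared in this record, the simplest signal detection computation, equal-variance sensitivity from hit and false-alarm rates, is sketched below with made-up rates; the unequal-variance and mixture models fitted in the study generalise this.

```python
# Sketch: equal-variance signal detection, d' = z(hits) - z(false alarms).
from scipy.stats import norm

def d_prime(hit_rate, fa_rate):
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

print(d_prime(0.80, 0.30))   # shallow encoding (hypothetical rates)
print(d_prime(0.90, 0.30))   # deep encoding: higher sensitivity
```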

  7. Modelling of additive manufacturing processes: a review and classification

    Science.gov (United States)

    Stavropoulos, Panagiotis; Foteinopoulos, Panagis

    2018-03-01

    Additive manufacturing (AM) is a very promising technology; however, there are a number of open issues related to the different AM processes. The literature on modelling the existing AM processes is reviewed and classified. A categorization of the different AM processes in process groups, according to the process mechanism, has been conducted and the most important issues are stated. Suggestions are made as to which approach is more appropriate according to the key performance indicator desired to be modelled and a discussion is included as to the way that future modelling work can better contribute to improving today's AM process understanding.

  8. Integrated Process Modeling-A Process Validation Life Cycle Companion.

    Science.gov (United States)

    Zahel, Thomas; Hauer, Stefan; Mueller, Eric M; Murphy, Patrick; Abad, Sandra; Vasilieva, Elena; Maurer, Daniel; Brocard, Cécile; Reinisch, Daniela; Sagmeister, Patrick; Herwig, Christoph

    2017-10-17

    During the regulatory requested process validation of pharmaceutical manufacturing processes, companies aim to identify, control, and continuously monitor process variation and its impact on critical quality attributes (CQAs) of the final product. It is difficult to directly connect the impact of single process parameters (PPs) to final product CQAs, especially in biopharmaceutical process development and production, where multiple unit operations are stacked together and interact with each other. Therefore, we want to present the application of Monte Carlo (MC) simulation using an integrated process model (IPM) that enables estimation of process capability even in early stages of process validation. Once the IPM is established, its capability in risk and criticality assessment is furthermore demonstrated. IPMs can be used to enable holistic production control strategies that take interactions of process parameters of multiple unit operations into account. Moreover, IPMs can be trained with development data, refined with qualification runs, and maintained with routine manufacturing data, which underlines the lifecycle concept. These applications will be shown by means of a process characterization study recently conducted at a world-leading contract manufacturing organization (CMO). The new IPM methodology therefore allows anticipation of out-of-specification (OOS) events, identification of critical process parameters, and risk-based decisions on counteractions that increase process robustness and decrease the likelihood of OOS events.
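
    The core Monte Carlo idea behind an integrated process model can be sketched in a few lines: sample parameter variation in one unit operation, propagate it through the next, and count out-of-specification outcomes. The chain, the distributions and the specification limit below are all invented for illustration, not taken from the cited study.

```python
# Sketch: Monte Carlo propagation across two stacked unit operations.
import numpy as np

rng = np.random.default_rng(1)
N = 10_000                                            # Monte Carlo runs

titre = rng.normal(5.0, 0.4, N)                       # upstream titre (g/L)
load = titre * rng.normal(1.0, 0.02, N)               # capture-step load
step_yield = np.clip(0.95 - 0.03 * (load - 5.0), 0.0, 1.0)   # interaction term
purity = rng.normal(98.5, 0.5, N) - 2.0 * (1.0 - step_yield) # final CQA (%)

oos = np.mean(purity < 97.0)                          # out-of-specification rate
print(f"predicted OOS probability: {oos:.3%}")
```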

  9. Analysis of Using Resources in Business Process Modeling and Simulation

    Directory of Open Access Journals (Sweden)

    Vasilecas Olegas

    2014-12-01

    Full Text Available One of the key purposes of Business Process Model and Notation (BPMN) is to support the graphical representation of the process model. However, such models lack support for the graphical representation of the resources that process instances use during simulation or execution. The paper analyzes different methods and their extensions for resource modeling. Further, this article presents a selected set of resource properties that are relevant for resource modeling. The paper proposes an approach that explains how to use the selected set of resource properties to extend process modeling using BPMN and simulation tools. They are based on BPMN, where business process instances use resources concurrently.

  10. Social Models: Blueprints or Processes?

    Science.gov (United States)

    Little, Graham R.

    1981-01-01

    Discusses the nature and implications of two different models for societal planning: (1) the problem-solving process approach based on Karl Popper; and (2) the goal-setting "blueprint" approach based on Karl Marx. (DC)

  11. Process Models for Security Architectures

    Directory of Open Access Journals (Sweden)

    Floarea NASTASE

    2006-01-01

    Full Text Available This paper presents a model for an integrated security system, which can be implemented in any organization. It is based on security-specific standards and taxonomies such as ISO 7498-2 and Common Criteria. The functionalities are derived from the classes proposed in the Common Criteria document. In the paper we present the process model for each functionality and we also focus on the specific components.

  12. The Structured Process Modeling Method (SPMM) : what is the best way for me to construct a process model?

    NARCIS (Netherlands)

    Claes, J.; Vanderfeesten, I.T.P.; Gailly, F.; Grefen, P.W.P.J.; Poels, G.

    2017-01-01

    More and more organizations turn to the construction of process models to support strategical and operational tasks. At the same time, reports indicate quality issues for a considerable part of these models, caused by modeling errors. Therefore, the research described in this paper investigates the

  13. Quantitative Modeling of Earth Surface Processes

    Science.gov (United States)

    Pelletier, Jon D.

    This textbook describes some of the most effective and straightforward quantitative techniques for modeling Earth surface processes. By emphasizing a core set of equations and solution techniques, the book presents state-of-the-art models currently employed in Earth surface process research, as well as a set of simple but practical research tools. Detailed case studies demonstrate application of the methods to a wide variety of processes including hillslope, fluvial, aeolian, glacial, tectonic, and climatic systems. Exercises at the end of each chapter begin with simple calculations and then progress to more sophisticated problems that require computer programming. All the necessary computer codes are available online at www.cambridge.org/9780521855976. Assuming some knowledge of calculus and basic programming experience, this quantitative textbook is designed for advanced geomorphology courses and as a reference book for professional researchers in Earth and planetary science looking for a quantitative approach to Earth surface processes.

  14. Multivariate Product-Shot-noise Cox Point Process Models

    DEFF Research Database (Denmark)

    Jalilian, Abdollah; Guan, Yongtao; Mateu, Jorge

    We introduce a new multivariate product-shot-noise Cox process which is useful for modeling multi-species spatial point patterns with clustering intra-specific interactions and neutral, negative or positive inter-specific interactions. The auto and cross pair correlation functions of the process can be obtained in closed analytical forms and approximate simulation of the process is straightforward. We use the proposed process to model interactions within and among five tree species in the Barro Colorado Island plot.

  15. A simplified computational memory model from information processing.

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-11-23

    This paper proposes a computational model of memory from the information processing point of view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices based on biology and graph theory, and an intra-modular network is developed with a modeling algorithm that maps nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information processing view.

  16. Simple Models for Process Control

    Czech Academy of Sciences Publication Activity Database

    Gorez, R.; Klán, Petr

    2011-01-01

    Roč. 22, č. 2 (2011), s. 58-62 ISSN 0929-2268 Institutional research plan: CEZ:AV0Z10300504 Keywords: process models * PID control * second order dynamics Subject RIV: JB - Sensors, Measurement, Regulation

  17. Modelling the presence of myelin and oedema in the brain based on multi-parametric quantitative MRI

    Directory of Open Access Journals (Sweden)

    Marcel eWarntjes

    2016-02-01

    Full Text Available The aim of this study was to present a model that uses multi-parametric quantitative MRI to estimate the presence of myelin and oedema in the brain. The model relates simultaneous measurement of R1 and R2 relaxation rates and proton density to four partial volume compartments, consisting of myelin partial volume, cellular partial volume, free water partial volume and excess parenchymal water partial volume. The model parameters were obtained using spatially normalised brain images of a group of 20 healthy controls. The pathological brain was modelled in terms of the reduction of myelin content and the presence of excess parenchymal water, which indicates the degree of oedema. The method was tested on spatially normalised brain images of a group of 20 age-matched multiple sclerosis (MS) patients. Clear differences were observed with respect to the healthy controls: the MS group had a 79 mL smaller brain volume (1069 vs. 1148 mL), a 38 mL smaller myelin volume (119 vs. 157 mL) and a 21 mL larger excess parenchymal water volume (78 vs. 57 mL). Template regions of interest of various brain structures indicated that the myelin partial volume in the MS group was 1.6±1.5% lower for grey matter (GM) structures and 2.8±1.0% lower for white matter (WM) structures. The excess parenchymal water partial volume was 9±10% larger for GM and 5±2% larger for WM. Manually placed ROIs indicated that the results using the template ROIs may have suffered from loss of anatomical detail due to the spatial normalisation process. Examples of the application of the method on high-resolution images are provided for three individual subjects: a 45-year-old healthy subject, a 72-year-old healthy subject and a 45-year-old MS patient. The observed results agreed with the expected behaviour considering both age and disease. In conclusion, the proposed model may provide clinically important parameters such as the total brain volume, degree of myelination and degree of oedema, based on

  18. Functional Dual Adaptive Control with Recursive Gaussian Process Model

    International Nuclear Information System (INIS)

    Prüher, Jakub; Král, Ladislav

    2015-01-01

    The paper deals with the dual adaptive control problem, where the functional uncertainties in the system description are modelled by a non-parametric Gaussian process regression model. Current approaches to adaptive control based on Gaussian process models are severely limited in their practical applicability, because the model is re-adjusted using all the currently available data, which keeps growing with every time step. We propose the use of a recursive Gaussian process regression algorithm for a significant reduction in computational requirements, thus bringing Gaussian process-based adaptive controllers closer to practical applicability. In this work, we design a bi-criterial dual controller based on a recursive Gaussian process model for discrete-time stochastic dynamic systems given in an affine-in-control form. Using Monte Carlo simulations, we show that the proposed controller achieves comparable performance with the full Gaussian process-based controller in terms of control quality while keeping the computational demands bounded. (paper)

  19. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    International Nuclear Information System (INIS)

    Y.S. Wu

    2005-01-01

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on water and gas

  20. MOUNTAIN-SCALE COUPLED PROCESSES (TH/THC/THM) MODELS

    Energy Technology Data Exchange (ETDEWEB)

    Y.S. Wu

    2005-08-24

    This report documents the development and validation of the mountain-scale thermal-hydrologic (TH), thermal-hydrologic-chemical (THC), and thermal-hydrologic-mechanical (THM) models. These models provide technical support for screening of features, events, and processes (FEPs) related to the effects of coupled TH/THC/THM processes on mountain-scale unsaturated zone (UZ) and saturated zone (SZ) flow at Yucca Mountain, Nevada (BSC 2005 [DIRS 174842], Section 2.1.1.1). The purpose and validation criteria for these models are specified in ''Technical Work Plan for: Near-Field Environment and Transport: Coupled Processes (Mountain-Scale TH/THC/THM, Drift-Scale THC Seepage, and Drift-Scale Abstraction) Model Report Integration'' (BSC 2005 [DIRS 174842]). Model results are used to support exclusion of certain FEPs from the total system performance assessment for the license application (TSPA-LA) model on the basis of low consequence, consistent with the requirements of 10 CFR 63.342 [DIRS 173273]. Outputs from this report are not direct feeds to the TSPA-LA. All the FEPs related to the effects of coupled TH/THC/THM processes on mountain-scale UZ and SZ flow are discussed in Sections 6 and 7 of this report. The mountain-scale coupled TH/THC/THM processes models numerically simulate the impact of nuclear waste heat release on the natural hydrogeological system, including a representation of heat-driven processes occurring in the far field. The mountain-scale TH simulations provide predictions for thermally affected liquid saturation, gas- and liquid-phase fluxes, and water and rock temperature (together called the flow fields). The main focus of the TH model is to predict the changes in water flux driven by evaporation/condensation processes, and drainage between drifts. The TH model captures mountain-scale three-dimensional flow effects, including lateral diversion and mountain-scale flow patterns. The mountain-scale THC model evaluates TH effects on

  1. Welding process modelling and control

    Science.gov (United States)

    Romine, Peter L.; Adenwala, Jinen A.

    1993-01-01

    The research and analysis performed, software developed, and hardware/software recommendations made during 1992 in development of the PC-based data acquisition system for support of Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for the use of data acquisition and analysis. Although the WMS supports many of the functions associated with process control, it is not the intention for this system to be used for welding process control.

  2. Mechanistic Fermentation Models for Process Design, Monitoring, and Control

    DEFF Research Database (Denmark)

    Mears, Lisa; Stocks, Stuart M.; Albæk, Mads Orla

    2017-01-01

    Mechanistic models require a significant investment of time and resources, but their application to multiple stages of fermentation process development and operation can make this investment highly valuable. This Opinion article discusses how an established fermentation model may be adapted for application to different stages of fermentation process development: planning, process design, monitoring, and control. Although a longer development time is required for such modeling methods in comparison to purely data-based model techniques, the wide range of applications makes them a highly valuable tool for fermentation research and development. In addition, in a research environment, where collaboration is important, developing mechanistic models provides a platform for knowledge sharing and consolidation of existing process understanding.

  3. Thermochemical equilibrium modelling of a gasifying process

    International Nuclear Information System (INIS)

    Melgar, Andres; Perez, Juan F.; Laget, Hannes; Horillo, Alfonso

    2007-01-01

    This article discusses a mathematical model for the thermochemical processes in a downdraft biomass gasifier. The model combines the chemical equilibrium and the thermodynamic equilibrium of the global reaction, predicting the final composition of the producer gas as well as its reaction temperature. Once the composition of the producer gas is obtained, a range of parameters can be derived, such as the cold gas efficiency of the gasifier, the amount of dissociated water in the process and the heating value and engine fuel quality of the gas. The model has been validated experimentally. This work includes a parametric study of the influence of the gasifying relative fuel/air ratio and the moisture content of the biomass on the characteristics of the process and the producer gas composition. The model helps to predict the behaviour of different biomass types and is a useful tool for optimizing the design and operation of downdraft biomass gasifiers

  4. Innovation Process Planning Model in the Bpmn Standard

    Directory of Open Access Journals (Sweden)

    Jurczyk-Bunkowska Magdalena

    2013-12-01

    Full Text Available The aim of the article is to show the relations in the innovation process planning model. The relations argued here guarantee a stable and reliable way to achieve the result in the form of increased competitiveness through professionally directed development of the company. The manager needs to specify the intended effect when initiating the realisation of the process, and this has to be achieved through a system of indirect goals. The original model proposed here shows the standard dependences between the plans of the fragments of the innovation process which together make up the achievement of its final goal. The relations in the present article are shown using the standard Business Process Model and Notation. This enables the specification of interrelations between the decision levels at which subsequent fragments of the innovation process are planned, which gives the possibility of better coordination of the process and reduces the time needed to achieve its effect. The model has been compiled on the basis of practices followed in Polish companies. It is not, however, a reflection of these practices, but rather an idealised standard of proceeding which aims at improving the effectiveness of the management of innovation at the operational level. The model could be the basis for creating systems supporting decision making, knowledge management or communication in innovation processes.

  5. Towards Using Reo for Compliance-Aware Business Process Modeling

    Science.gov (United States)

    Arbab, Farhad; Kokash, Natallia; Meng, Sun

    Business process modeling and implementation of process supporting infrastructures are two challenging tasks that are not fully aligned. On the one hand, languages such as Business Process Modeling Notation (BPMN) exist to capture business processes at the level of domain analysis. On the other hand, programming paradigms and technologies such as Service-Oriented Computing (SOC) and web services have emerged to simplify the development of distributed web systems that underlie business processes. BPMN is the most recognized language for specifying process workflows at the early design steps. However, it is rather declarative and may lead to executable models which are incomplete or semantically erroneous. Therefore, an approach for expressing and analyzing BPMN models in a formal setting is required. In this paper we describe how BPMN diagrams can be represented by means of a semantically precise channel-based coordination language called Reo which admits formal analysis using model checking and bisimulation techniques. Moreover, since additional requirements may come from various regulatory/legislative documents, we discuss the opportunities offered by Reo and its mathematical abstractions for expressing process-related constraints such as Quality of Service (QoS) or time-aware conditions on process states.

  6. Concept of a cognitive-numeric plant and process modelizer

    International Nuclear Information System (INIS)

    Vetterkind, D.

    1990-01-01

    To achieve automatic modeling of plant disturbances and failure limitation procedures, first the system's hardware and the media present (water, steam, coolant fluid) are formalized into fully computable matrices, called topographies. Secondly, a microscopic cellular automaton model, using lattice gases and state transition rules, is combined with a semi-microscopic cellular process model and with a macroscopic model. In doing this, at the semi-microscopic level a cellular data compressor, a feature detection device and the Intelligent Physical Elements' process dynamics are acting. At the macroscopic level the Walking Process Elements, a process evolving module, a test-and-manage device and an abstracting process net are involved. Additionally, a diagnosis-coordinating device and a countermeasure-coordinating device are used. In order to automatically gain process insights, object transformations, elementary process functions and associative methods are used. Developments of optoelectronic hardware language components are under consideration.

  7. Modeling of Reaction Processes Controlled by Diffusion

    International Nuclear Information System (INIS)

    Revelli, Jorge

    2003-01-01

    Stochastic modeling is quite powerful in science and technology. The techniques derived from this process have been used with great success in laser theory, biological systems and chemical reactions. Besides, they provide a theoretical framework for the analysis of experimental results in the field of particle diffusion in ordered and disordered materials. In this work we analyze transport processes in one-dimensional fluctuating media, which are media that change their state in time. This fact induces changes in the movements of the particles, giving rise to different phenomena and dynamics that will be described and analyzed in this work. We present some random walk models to describe these fluctuating media. These models include state transitions governed by different dynamical processes. We also analyze the trapping problem in a lattice by means of a simple model which predicts a resonance-like phenomenon. We also study effective diffusion processes over surfaces due to random walks in the bulk, considering different boundary conditions and transition movements. We derive expressions that describe diffusion behaviors constrained by bulk restrictions and the dynamics of the particles. Finally, it is important to mention that the theoretical results obtained from the models proposed in this work are compared with Monte Carlo simulations; we find, in general, excellent agreement between the theory and the simulations.

  8. Mathematical models for sleep-wake dynamics: comparison of the two-process model and a mutual inhibition neuronal model.

    Directory of Open Access Journals (Sweden)

    Anne C Skeldon

    Full Text Available Sleep is essential for the maintenance of the brain and the body, yet many features of sleep are poorly understood and mathematical models are an important tool for probing proposed biological mechanisms. The most well-known mathematical model of sleep regulation, the two-process model, models the sleep-wake cycle by two oscillators: a circadian oscillator and a homeostatic oscillator. An alternative, more recent, model considers the mutual inhibition of sleep promoting neurons and the ascending arousal system regulated by homeostatic and circadian processes. Here we show there are fundamental similarities between these two models. The implications are illustrated with two important sleep-wake phenomena. Firstly, we show that in the two-process model, transitions between different numbers of daily sleep episodes can be classified as grazing bifurcations. This provides the theoretical underpinning for numerical results showing that the sleep patterns of many mammals can be explained by the mutual inhibition model. Secondly, we show that when sleep deprivation disrupts the sleep-wake cycle, ostensibly different measures of sleepiness in the two models are closely related. The demonstration of the mathematical similarities of the two models is valuable because not only does it allow some features of the two-process model to be interpreted physiologically but it also means that knowledge gained from study of the two-process model can be used to inform understanding of the behaviour of the mutual inhibition model. This is important because the mutual inhibition model and its extensions are increasingly being used as a tool to understand a diverse range of sleep-wake phenomena such as the design of optimal shift-patterns, yet the values it uses for parameters associated with the circadian and homeostatic processes are very different from those that have been experimentally measured in the context of the two-process model.
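
    A minimal numerical version of the two-process model discussed here is easy to write down: homeostatic pressure S builds during wake, decays during sleep, and sleep/wake switches occur at circadian-modulated thresholds. The constants below are illustrative, not the published parameter values.

```python
# Sketch: two-process model of sleep regulation (illustrative constants).
import numpy as np

dt = 0.05                              # time step (hours)
chi_w, chi_s = 18.2, 4.2               # wake build-up / sleep decay constants (h)
H_hi, H_lo, a = 0.60, 0.17, 0.11       # mean thresholds, circadian amplitude

S, awake, state = 0.5, True, []
for t in np.arange(0.0, 96.0, dt):
    C = np.sin(2 * np.pi * (t - 8.0) / 24.0)   # circadian oscillator
    if awake:
        S += dt * (1.0 - S) / chi_w            # saturating homeostatic build-up
        awake = S < H_hi + a * C               # hit upper threshold -> sleep
    else:
        S -= dt * S / chi_s                    # exponential decay during sleep
        awake = S <= H_lo + a * C              # hit lower threshold -> wake
    state.append(awake)

print(f"fraction of time awake: {np.mean(state):.2f}")
```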

  9. Determinantal point process models on the sphere

    DEFF Research Database (Denmark)

    Møller, Jesper; Nielsen, Morten; Porcu, Emilio

    We consider determinantal point processes on the d-dimensional unit sphere Sd. These are finite point processes exhibiting repulsiveness and with moment properties determined by a certain determinant whose entries are specified by a so-called kernel which we assume is a complex covariance function defined on Sd × Sd. We review the appealing properties of such processes, including their specific moment properties, density expressions and simulation procedures. Particularly, we characterize and construct isotropic DPP models on Sd, where it becomes essential to specify the eigenvalues and eigenfunctions in a spectral representation for the kernel, and we figure out how repulsive isotropic DPPs can be. Moreover, we discuss the shortcomings of adapting existing models for isotropic covariance functions and consider strategies for developing new models, including a useful spectral approach.

  10. Modelling energy spot prices by Lévy semistationary processes

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole; Benth, Fred Espen; Veraart, Almut

    This paper introduces a new modelling framework for energy spot prices based on Lévy semistationary processes. Lévy semistationary processes are special cases of the general class of ambit processes. We provide a detailed analysis of the probabilistic properties of such models and we show how they are able to capture many of the stylised facts observed in energy markets. Furthermore, we derive forward prices based on our spot price model. As it turns out, many of the classical spot models can be embedded into our novel modelling framework.
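
    As a concrete special case of the framework in this record, an Ornstein-Uhlenbeck process, corresponding to an exponential kernel within the Lévy semistationary class, can be simulated in a few lines; here a Gaussian driver and illustrative parameters stand in for a general Lévy driver.

```python
# Sketch: OU process (exponential-kernel special case) as a log-spot model.
import numpy as np

rng = np.random.default_rng(3)
lam, sigma, dt, n = 0.5, 0.8, 1.0 / 365.0, 3 * 365   # mean reversion, vol, step

x = np.zeros(n)
for i in range(1, n):
    # exact discretisation of dX = -lam * X dt + sigma dW
    std = sigma * np.sqrt((1.0 - np.exp(-2.0 * lam * dt)) / (2.0 * lam))
    x[i] = x[i - 1] * np.exp(-lam * dt) + std * rng.standard_normal()

spot = 30.0 * np.exp(x)        # spot price fluctuating around a level of 30
print(spot.min(), spot.max())
```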

  11. Dynamic modeling of ultrafiltration membranes for whey separation processes

    NARCIS (Netherlands)

    Saltik, M.B.; Ozkan, L.; Jacobs, M.; van der Padt, A.

    2017-01-01

    In this paper, we present a control-relevant rigorous dynamic model for an ultrafiltration membrane unit in a whey separation process. The model consists of a set of differential algebraic equations and is developed for online model-based applications such as model-based control and process

  12. The semantics of hybrid process models

    NARCIS (Netherlands)

    Slaats, T.; Schunselaar, D.M.M.; Maggi, F.M.; Reijers, H.A.; Debruyne, C.; Panetto, H.; Meersman, R.; Dillon, T.; Kuhn, E.; O'Sullivan, D.; Agostino Ardagna, C.

    2016-01-01

    In the area of business process modelling, declarative notations have been proposed as alternatives to notations that follow the dominant, imperative paradigm. Yet, the choice between an imperative or declarative style of modelling is not always easy to make. Instead, a mixture of these styles is

  13. Demand-based maintenance and operator support based on process models; Behovsstyrt underhåll och operatörsstöd baserat på process modeller

    Energy Technology Data Exchange (ETDEWEB)

    Dahlquist, Erik; Widarsson, Björn; Tomas-Aparicio, Elena

    2012-02-15

    There is a strong demand for systems that can give early warnings of upcoming problems in process performance or sensor measurements. In this project we have developed and implemented such a system on-line. The goal of the system is to give warnings about faults needing urgent action, as well as advice on roughly when service may be needed for specific functions. The use of process simulation models on-line can offer a significant tool for operators and process engineers to analyse the performance of the process and make the most correct and fastest decision when problems arise. In this project physical simulation models are used in combination with decision support tools. By using a physical model it is possible to compare the measured data to the data obtained from the simulation and give these deviations as input to a decision support tool with Bayesian networks (BN) that will result in information about the probability of wrong measurements in the instruments, process problems and maintenance needs. The application has been implemented in a CFB boiler at Mälarenergi AB. After tuning the model, the system was used online during September - October 2010 and May - October 2011, showing that the system works on-line with respect to running the simulation model, but with batch runs with respect to the BN. Examples have been made for several variables where trends of the deviation between simulation results and measured data have been used as input to a BN, where the probability of different faults has been calculated. Combustion up in the separator/cyclones has been detected several times, as have problems with fuel feed on both sides of the boiler, a moisture sensor not functioning as it should, and suspected malfunctioning temperature meters. Deeper investigations of the true cause of problems have been used as input to tune the BN
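
    The residual idea described in this record, deviations between simulated and measured values feeding a probabilistic fault indicator, can be sketched with a toy Bayesian update; the Gaussian residual models and the prior below are assumptions, far simpler than the Bayesian network used in the project.

```python
# Sketch: turning simulation-vs-measurement residuals into a fault probability.
import numpy as np

def fault_probability(residuals, sigma_ok=1.0, sigma_fault=3.0, prior=0.05):
    r = np.asarray(residuals, dtype=float)
    def loglik(sigma):                       # Gaussian log-likelihood of r
        return np.sum(-0.5 * (r / sigma) ** 2 - np.log(sigma))
    num = prior * np.exp(loglik(sigma_fault))
    den = num + (1.0 - prior) * np.exp(loglik(sigma_ok))
    return num / den                         # posterior P(fault | residuals)

print(fault_probability([0.2, -0.5, 0.3]))        # small deviations: low
print(fault_probability([2.5, 3.1, -2.8, 3.4]))   # sustained drift: high
```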

  14. A Queuing Model of the Airport Departure Process

    OpenAIRE

    Balakrishnan, Hamsa; Simaiakis, Ioannis

    2013-01-01

    This paper presents an analytical model of the aircraft departure process at an airport. The modeling procedure includes the estimation of unimpeded taxi-out time distributions and the development of a queuing model of the departure runway system based on the transient analysis of D/E/1 queuing systems. The parameters of the runway service process are estimated using operational data. Using the aircraft pushback schedule as input, the model predicts the expected runway schedule and takeoff ti...
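
    The queue described in this record can be mimicked with a simple single-server simulation: deterministic pushbacks and Erlang-distributed runway service, the D/E/1 structure mentioned above. All rates below are illustrative.

```python
# Sketch: D/E/1-style departure runway queue (deterministic arrivals,
# Erlang(k) service, one runway).
import numpy as np

rng = np.random.default_rng(2)
k, mean_service = 3, 50.0                  # Erlang shape; mean service time (s)
headway, n = 60.0, 200                     # pushback interval (s), departures

arrivals = headway * np.arange(1, n + 1)
service = rng.gamma(k, mean_service / k, n)   # Erlang(k), mean = mean_service
takeoff, free_at = np.zeros(n), 0.0
for i in range(n):
    start = max(arrivals[i], free_at)      # wait until the runway is clear
    takeoff[i] = start + service[i]
    free_at = takeoff[i]

print(f"mean taxi-out time: {(takeoff - arrivals).mean():.1f} s")
```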

  15. Computational Process Modeling for Additive Manufacturing (OSU)

    Science.gov (United States)

    Bagg, Stacey; Zhang, Wei

    2015-01-01

    Powder-Bed Additive Manufacturing (AM) through Direct Metal Laser Sintering (DMLS) or Selective Laser Melting (SLM) is being used by NASA and the aerospace industry to "print" parts that traditionally are very complex, high cost, or long schedule lead items. The process spreads a thin layer of metal powder over a build platform, then melts the powder in a series of welds in a desired shape. The next layer of powder is applied, and the process is repeated until, layer by layer, a very complex part can be built. This reduces cost and schedule by eliminating very complex tooling and processes traditionally used in aerospace component manufacturing. To use the process to print end-use items, NASA seeks to understand SLM material well enough to develop a method of qualifying parts for space flight operation. Traditionally, a new material process takes many years and high investment to generate statistical databases and experiential knowledge, but computational modeling can truncate the schedule and cost: many experiments can be run quickly in a model, whereas they would take years and a high material cost to run empirically. This project seeks to optimize material build parameters with reduced time and cost through modeling.

  16. Liveness and Reachability Analysis of BPMN Process Models

    Directory of Open Access Journals (Sweden)

    Anass Rachdi

    2016-06-01

    Full Text Available Business processes are usually defined by business experts who require intuitive and informal graphical notations such as BPMN (Business Process Model and Notation) for documenting and communicating their organization's activities and behavior. However, BPMN has not been provided with a formal semantics, which limits the analysis of BPMN models to using solely informal techniques such as simulation. In order to address this limitation and use formal verification, it is necessary to define a certain "mapping" between BPMN and a formal language such as Communicating Sequential Processes (CSP) or Petri Nets (PN). This paper proposes a method for the verification of BPMN models by defining a formal semantics of BPMN in terms of a mapping to Time Petri Nets (TPN), which are equipped with very efficient analytical techniques. After the translation of BPMN models to TPN, verification is done to ensure that some functional properties are satisfied by the model under investigation, namely liveness and reachability properties. The main advantage of our approach over existing ones is that it takes into account the time components in modeling business process models. An example is used throughout the paper to illustrate the proposed method.
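
    Ignoring the time component, the kind of reachability question the paper asks of a BPMN-derived net can be sketched as a token game over markings; the toy net below (an AND-split/join) and all names are invented for illustration.

```python
# Sketch: breadth-first reachability over markings of a safe Petri net.
from collections import deque

# transition name -> (places consumed, places produced)
transitions = {
    "split": ({"start"}, {"a", "b"}),          # AND-split
    "do_a":  ({"a"}, {"a_done"}),
    "do_b":  ({"b"}, {"b_done"}),
    "join":  ({"a_done", "b_done"}, {"end"}),  # AND-join
}

def reachable(initial, goal):
    seen = {frozenset(initial)}
    queue = deque(seen)
    while queue:
        m = queue.popleft()
        if goal <= m:                          # goal marking reached
            return True
        for pre, post in transitions.values():
            if pre <= m:                       # transition enabled
                m2 = frozenset((m - pre) | post)
                if m2 not in seen:
                    seen.add(m2)
                    queue.append(m2)
    return False

print(reachable({"start"}, {"end"}))           # True: the process can complete
```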

  17. Methodology for Modeling and Analysis of Business Processes (MMABP)

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces the methodology for modeling business processes. Creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary business process modeling approaches is identified and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were met during the project. Concluding from these problems, together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  18. Case Studies in Modelling, Control in Food Processes.

    Science.gov (United States)

    Glassey, J; Barone, A; Montague, G A; Sabou, V

    This chapter discusses the importance of modelling and control in increasing food process efficiency and ensuring product quality. Various approaches to both modelling and control in food processing are set in the context of the specific challenges in this industrial sector and latest developments in each area are discussed. Three industrial case studies are used to demonstrate the benefits of advanced measurement, modelling and control in food processes. The first case study illustrates the use of knowledge elicitation from expert operators in the process for the manufacture of potato chips (French fries) and the consequent improvements in process control to increase the consistency of the resulting product. The second case study highlights the economic benefits of tighter control of an important process parameter, moisture content, in potato crisp (chips) manufacture. The final case study describes the use of NIR spectroscopy in ensuring effective mixing of dry multicomponent mixtures and pastes. Practical implementation tips and infrastructure requirements are also discussed.

  19. A simplified computational memory model from information processing

    Science.gov (United States)

    Zhang, Lanhua; Zhang, Dongsheng; Deng, Yuqin; Ding, Xiaoqian; Wang, Yan; Tang, Yiyuan; Sun, Baoliang

    2016-01-01

    This paper proposes a computational model of memory from the information processing point of view. The model, called the simplified memory information retrieval network (SMIRN), is a bi-modular hierarchical functional memory network built by abstracting memory function and simulating memory information processing. First, meta-memory is defined to represent neurons or brain cortices based on biology and graph theory, and an intra-modular network is developed with a modeling algorithm that maps nodes and edges; the bi-modular network is then delineated with intra-modular and inter-modular connections. Finally, a polynomial retrieval algorithm is introduced. We simulate the memory phenomena and the functions of memorization and strengthening by information processing algorithms. The theoretical analysis and the simulation results show that the model is in accordance with memory phenomena from the information processing view. PMID:27876847

  20. Model-based internal wave processing

    Energy Technology Data Exchange (ETDEWEB)

    Candy, J.V.; Chambers, D.H.

    1995-06-09

    A model-based approach is proposed to solve the oceanic internal wave signal processing problem; it is based on state-space representations of the normal-mode vertical velocity and plane wave horizontal velocity propagation models. It is shown that these representations can be utilized to spatially propagate the modal (depth) vertical velocity functions, given the basic parameters (wave numbers, Brunt-Vaisala frequency profile, etc.) developed from the solution of the associated boundary value problem, as well as the horizontal velocity components. Based on this framework, model-based solutions to the signal enhancement problem for internal waves are investigated.

  1. Modelling Template for the Development of the Process Flowsheet

    DEFF Research Database (Denmark)

    Fedorova, Marina; Gani, Rafiqul

    2015-01-01

    Models are playing important roles in design and analysis of chemicals/bio-chemicals based products and the processes that manufacture them. Model-based methods and tools have the potential to decrease the number of experiments, which can be expensive and time consuming, and point to candidates...... in connection to other modelling tools within the modelling framework are forming a user-friendly system, which will make the model development process easier and faster and provide the way for unified and consistent model documentation. The modeller can use the template for their specific problem or to extend...... models systematically, efficiently and reliably. In this way, development of products and processes can be faster, cheaper and very efficient. The developed modelling framework involves three main parts: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which...

  2. Measures of quality of process models created in BPMN

    Directory of Open Access Journals (Sweden)

    Radek Hronza

    2015-12-01

    Full Text Available Description, documentation, evaluation and redesign of key processes during their execution should be an essential part of the strategic management of any organization. All organizations operate in a dynamically changing environment and must therefore adapt their internal processes to market changes. These processes must be described, and BPMN notation is a suitable way of describing them. Once processes are described in BPMN, they should be checked to ensure their expected quality. A system based on the mathematical expression of qualitative characteristics of process models (i.e. measures of quality of process models), which could be automated, can support such checks. The research team is trying to design such a tool and bring it into practical use. The aim of this publication is to describe this system, based on measures of the quality of process models, and to answer the associated scientific questions.
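
    The record above speaks of quality measures defined as mathematical expressions over process models. As a minimal sketch, assuming a BPMN model reduced to a directed graph, the following computes three measures commonly proposed in the literature (size, density, and Cardoso's control-flow complexity); the toy model and the choice of measures are illustrative, not the authors' metric set.

    ```python
    from collections import defaultdict

    def quality_measures(nodes, edges, gateway_splits):
        """nodes: set of node ids; edges: list of (src, dst) pairs;
        gateway_splits: dict mapping split-gateway id -> 'XOR' | 'OR' | 'AND'."""
        out_degree = defaultdict(int)
        for src, _ in edges:
            out_degree[src] += 1

        size = len(nodes)
        # Density: edges relative to the maximum possible in a directed graph.
        density = len(edges) / (size * (size - 1)) if size > 1 else 0.0

        # Control-flow complexity (Cardoso): an XOR split adds its fan-out,
        # an OR split adds 2**fan_out - 1 branch combinations, an AND adds 1.
        cfc = 0
        for gw, kind in gateway_splits.items():
            fan_out = out_degree[gw]
            cfc += {"XOR": fan_out, "OR": 2 ** fan_out - 1, "AND": 1}[kind]
        return {"size": size, "density": round(density, 3), "cfc": cfc}

    # Toy model: start -> XOR split -> (task a | task b) -> join -> end
    nodes = {"start", "xor", "a", "b", "join", "end"}
    edges = [("start", "xor"), ("xor", "a"), ("xor", "b"),
             ("a", "join"), ("b", "join"), ("join", "end")]
    print(quality_measures(nodes, edges, {"xor": "XOR"}))
    ```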

  3. Various Models for Reading Comprehension Process

    Directory of Open Access Journals (Sweden)

    Parastoo Babashamsi

    2013-11-01

    Full Text Available Reading can be viewed as a process, as a form of thinking, as a true experience, and as a tool subject. As a process, reading includes visual discrimination, independent recognition of words, rhythmic progression along a line of print, precision in the return sweep of the eyes, and adjustment of rate. Along these lines, the present paper considers the various models of the reading process. Moreover, the paper takes a look at factors, such as schema and vocabulary knowledge, which affect the reading comprehension process.

  4. Dynamic process model of a plutonium oxalate precipitator. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Miller, C.L.; Hammelman, J.E.; Borgonovi, G.M.

    1977-11-01

    In support of the LLL material safeguards program, a dynamic process model was developed which simulates the performance of a plutonium (IV) oxalate precipitator. The plutonium oxalate precipitator is a component in the plutonium oxalate process for making plutonium oxide powder from plutonium nitrate. The model is based on state-of-the-art crystallization descriptive equations, the parameters of which are quantified through the use of batch experimental data. The dynamic model predicts performance very similar to general Hanford oxalate process experience. The utilization of such a process model in actual plant operation could promote both process control and material safeguards by serving as a baseline predictor giving early warning of process upsets or material diversion. The model has been incorporated into a FORTRAN computer program and is also compatible with the DYNSYS 2 computer code, which is being used at LLL for process modeling efforts.

  6. Difference-based Model Synchronization in an Industrial MDD Process

    DEFF Research Database (Denmark)

    Könemann, Patrick; Kindler, Ekkart; Unland, Ludger

    2009-01-01

    Models play a central role in model-driven software engineering. There are different kinds of models during the development process, which are related to each other and change over time. Therefore, it is difficult to keep the different models consistent with each other. Consistency of different m...... model versions, and for synchronizing other types of models. The main concern is to apply our concepts to an industrial process, in particular keeping usability and performance in mind. Keyword: Model Differencing, Model Merging, Model Synchronization...

  7. Amalgamation of Marginal Gains (AMG) as a potential system to deliver high-quality fundamental nursing care: A qualitative analysis of interviews from high-performance AMG sports and healthcare practitioners.

    Science.gov (United States)

    Pentecost, Claire; Richards, David A; Frost, Julia

    2017-11-28

    To investigate the components of the Amalgamation of Marginal Gains (AMG) performance system to identify a set of principles that can be built into an innovative fundamental nursing care protocol. Nursing is urged to refocus on its fundamental care activities, but little evidence exists to guide practising nurses. Fundamental care is a combination of many small behaviours aimed at meeting a person's care needs. AMG is a successful system of performance management that focusses on small (or marginal) gains, and might provide a new delivery framework for fundamental nursing care. Qualitative interview study. We undertook in-depth interviews with healthcare and sports professionals experienced in AMG. We analysed data using open coding in a framework analysis, and then interrogated the data using Normalisation Process Theory (NPT). We triangulated findings with the AMG literature to develop an intervention logic model. We interviewed 20 AMG practitioners. AMG processes were as follows: focusing on many details to optimise performance, identification of marginal gains using different sources, understanding current versus optimum performance, monitoring at micro and macro levels, and strong leadership. Elements of normalisation were as follows: whole team belief in AMG to improve performance, a collective desire for excellence using evidence-based actions, whole team engagement to identify, choose and implement changes, and individual and group responsibility for monitoring performance. We have elicited the processes described by AMG innovators in health care and sport and have mapped the normalisation potential and work required to embed such a system into nursing practice. The development of our logic model based on AMG and NPT may provide a practical framework for improving fundamental nursing care and is ripe for further development and testing in clinical trials. © 2017 The Authors Journal of Clinical Nursing Published by John Wiley & Sons Ltd.

  8. Semantics of Temporal Models with Multiple Temporal Dimensions

    DEFF Research Database (Denmark)

    Kraft, Peter; Sørensen, Jens Otto

    ending up with lexical data models. In particular we look upon the representations by sets of normalised tables, by sets of 1NF tables and by sets of N1NF/nested tables. At each translation step we focus on how the temporal semantic is consistently maintained. In this way we recognise the requirements...... for representation of temporal properties in different models and the correspondence between the models. The results rely on the assumptions that the temporal dimensions are interdependent and ordered. Thus for example the valid periods of existences of a property in a mini world are dependent on the transaction...... periods in which the corresponding recordings are valid. This is not the normal way of looking at temporal dimensions and we give arguments supporting our assumption....

  9. Computer-Aided Modeling of Lipid Processing Technology

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel

    2011-01-01

    increase along with growing interest in biofuels, the oleochemical industry faces in the upcoming years major challenges in terms of design and development of better products and more sustainable processes to make them. Computer-aided methods and tools for process synthesis, modeling and simulation...... are widely used for design, analysis, and optimization of processes in the chemical and petrochemical industries. These computer-aided tools have helped the chemical industry to evolve beyond commodities toward specialty chemicals and ‘consumer oriented chemicals based products’. Unfortunately...... to develop systematic computer-aided methods (property models) and tools (database) related to the prediction of the necessary physical properties suitable for design and analysis of processes employing lipid technologies. The methods and tools include: the development of a lipid-database (CAPEC...

  10. A dynamic dual process model of risky decision making.

    Science.gov (United States)

    Diederich, Adele; Trueblood, Jennifer S

    2018-03-01

    Many phenomena in judgment and decision making are often attributed to the interaction of 2 systems of reasoning. Although these so-called dual process theories can explain many types of behavior, they are rarely formalized as mathematical or computational models. Rather, dual process models are typically verbal theories, which are difficult to conclusively evaluate or test. In the cases in which formal (i.e., mathematical) dual process models have been proposed, they have not been quantitatively fit to experimental data and are often silent when it comes to the timing of the 2 systems. In the current article, we present a dynamic dual process model framework of risky decision making that provides an account of the timing and interaction of the 2 systems and can explain both choice and response-time data. We outline several predictions of the model, including how changes in the timing of the 2 systems as well as time pressure can influence behavior. The framework also allows us to explore different assumptions about how preferences are constructed by the 2 systems as well as the dynamic interaction of the 2 systems. In particular, we examine 3 different possible functional forms of the 2 systems and 2 possible ways the systems can interact (simultaneously or serially). We compare these dual process models with 2 single process models using risky decision making data from Guo, Trueblood, and Diederich (2017). Using this data, we find that 1 of the dual process models significantly outperforms the other models in accounting for both choices and response times. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
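
    As a flavour of what such a formal model looks like, the sketch below simulates one possible serial variant: a random walk whose drift is supplied by a fast intuitive system before a switch time and by a slower, stronger deliberative system afterwards, yielding both a choice and a response time per trial. The functional form and all parameter values are invented for illustration and are not the authors' fitted model.

    ```python
    # Illustrative serial dual-process random walk (not the fitted model from
    # the paper): System 1 drives the drift before switch_t, System 2 after.
    import random

    def simulate_trial(drift1=0.08, drift2=0.25, switch_t=300, noise=1.0,
                       threshold=30.0, dt=1.0, max_t=5000):
        x, t = 0.0, 0.0
        while abs(x) < threshold and t < max_t:
            drift = drift1 if t < switch_t else drift2   # serial hand-over
            x += drift * dt + noise * random.gauss(0.0, dt ** 0.5)
            t += dt
        choice = 1 if x >= threshold else 0              # which bound was hit
        return choice, t

    random.seed(1)
    trials = [simulate_trial() for _ in range(2000)]
    p_choice = sum(c for c, _ in trials) / len(trials)
    mean_rt = sum(t for _, t in trials) / len(trials)
    print(f"P(option A) = {p_choice:.2f}, mean RT = {mean_rt:.0f} ms")
    ```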

  11. Task-specific visual cues for improving process model understanding

    NARCIS (Netherlands)

    Petrusel, Razvan; Mendling, Jan; Reijers, Hajo A.

    2016-01-01

    Context: Business process models support various stakeholders in managing business processes and designing process-aware information systems. In order to make effective use of these models, they have to be readily understandable. Objective: Prior research has emphasized the potential of visual cues to

  12. Integrated Main Propulsion System Performance Reconstruction Process/Models

    Science.gov (United States)

    Lopez, Eduardo; Elliott, Katie; Snell, Steven; Evans, Michael

    2013-01-01

    The Integrated Main Propulsion System (MPS) Performance Reconstruction process provides the MPS post-flight data files needed for post-flight reporting to the project integration management and key customers to verify flight performance. This process/model was used as the baseline for the currently ongoing Space Launch System (SLS) work. The process utilizes several methodologies, including multiple software programs, to model integrated propulsion system performance through space shuttle ascent. It is used to evaluate the performance of integrated propulsion systems, including propellant tanks, feed systems, rocket engines, and pressurization systems, throughout ascent based on flight pressure and temperature data. The latest revision incorporates new methods based on main engine power balance model updates to model higher mixture ratio operation at lower engine power levels.

  13. Modeling and Control of Multivariable Process Using Intelligent Techniques

    Directory of Open Access Journals (Sweden)

    Subathra Balasubramanian

    2010-10-01

    Full Text Available For nonlinear dynamic systems, first-principles-based modeling and control are difficult to implement. In this study, a fuzzy controller and a recurrent fuzzy controller are developed for a MIMO process. A fuzzy logic controller is a model-free controller designed based on knowledge about the process. Two types of rule-based fuzzy models are available: the linguistic (Mamdani) model and the Takagi-Sugeno (TS) model; of these two, the Takagi-Sugeno model has attracted the most attention. The application of fuzzy controllers is limited to static processes due to their feedforward structure, but most real-time processes are dynamic and require a history of input/output data. In order to store the past values a memory unit is needed, which is introduced by the recurrent structure. The proposed recurrent fuzzy structure is used to develop a controller for a two-tank heating process. Both controllers are designed and implemented in a real-time environment and their performance is compared.

  14. A security modeling approach for web-service-based business processes

    DEFF Research Database (Denmark)

    Jensen, Meiko; Feja, Sven

    2009-01-01

    a transformation that automatically derives WS-SecurityPolicy-conformant security policies from the process model, which in conjunction with the generated WS-BPEL processes and WSDL documents provides the ability to deploy and run the complete security-enhanced process based on Web Service technology.......The rising need for security in SOA applications requires better support for management of non-functional properties in web-based business processes. Here, the model-driven approach may provide valuable benefits in terms of maintainability and deployment. Apart from modeling the pure functionality...... of a process, the consideration of security properties at the level of a process model is a promising approach. In this work-in-progress paper we present an extension to the ARIS SOA Architect that is capable of modeling security requirements as a separate security model view. Further we provide...

  15. Modeling of Dielectric Heating within Lyophilization Process

    Directory of Open Access Journals (Sweden)

    Jan Kyncl

    2014-01-01

    Full Text Available A process of lyophilization of paper books is modeled. The process of drying is controlled by a dielectric heating system. From the physical viewpoint, the task represents a 2D coupled problem described by two partial differential equations for the electric and temperature fields. The material parameters are supposed to be temperature-dependent functions. The continuous mathematical model is solved numerically. The methodology is illustrated with some examples whose results are discussed.

  16. Modeling veterans healthcare administration disclosure processes

    Energy Technology Data Exchange (ETDEWEB)

    Beyeler, Walter E; DeMenno, Mercy B.; Finley, Patrick D.

    2013-09-01

    As with other large healthcare organizations, medical adverse events at the Department of Veterans Affairs (VA) facilities can expose patients to unforeseen negative risks. VHA leadership recognizes that properly handled disclosure of adverse events can minimize potential harm to patients and negative consequences for the effective functioning of the organization. The work documented here seeks to help improve the disclosure process by situating it within the broader theoretical framework of issues management, and to identify opportunities for process improvement through modeling disclosure and reactions to disclosure. The computational model will allow a variety of disclosure actions to be tested across a range of incident scenarios. Our conceptual model will be refined in collaboration with domain experts, especially by continuing to draw on insights from VA Study of the Communication of Adverse Large-Scale Events (SCALE) project researchers.

  17. MODELLING OF THE PROCESS OF TEACHING READING ENGLISH LANGUAGE PERIODICALS

    Directory of Open Access Journals (Sweden)

    Тетяна Глушко

    2014-07-01

    Full Text Available The article presents a scientifically substantiated process for teaching the reading of English language periodicals in all its components, which are consistently developed and interconnected as structural elements of the teaching process. This process is presented as several interconnected and interdetermined models: (1) models of the process of acquiring standard and expressive lexical knowledge; (2) models of the process of forming the skills to use such vocabulary; (3) models of the development of the skills to read texts of different linguistic levels.

  18. Knowledge-based inspection: modelling complex processes with the integrated Safeguards Modelling Method (iSMM)

    International Nuclear Information System (INIS)

    Abazi, F.

    2011-01-01

    Increased levels of complexity in almost every discipline and operation today raise the demand for knowledge in order to successfully run an organization, whether to generate profit or to attain a non-profit mission. The traditional way of transferring knowledge to information systems rich in data structures and complex algorithms continues to hinder the ability to swiftly turn concepts into operations. Diagrammatic modelling, commonly applied in engineering to represent concepts or reality, remains an excellent way of capturing knowledge from domain experts. The nuclear verification domain is a matter of ever greater importance to world safety and security. Demand for knowledge about nuclear processes and the verification activities used to offset potential misuse of nuclear technology will intensify with the growth of the subject technology. This doctoral thesis contributes a model-based approach for representing complex processes such as nuclear inspections; the work presented also applies to other domains characterized by knowledge-intensive and complex processes. Based on the characteristics of a complex process, a conceptual framework was established as the theoretical basis for creating a number of modelling languages to represent the domain. The integrated Safeguards Modelling Method (iSMM) is formalized through an integrated meta-model. The diagrammatic modelling languages represent the verification domain and relevant nuclear verification aspects. Such a meta-model conceptualizes the relation between practices of process management, knowledge management and domain-specific verification principles; this fusion is considered necessary in order to create quality processes. The study also extends the formalization achieved through the meta-model by contributing a formalization language based on Pattern Theory. Through the use of graphical and mathematical constructs of the theory, process structures are formalized enhancing

  19. Integrated modelling of near field and engineered barrier system processes

    International Nuclear Information System (INIS)

    Lamont, A.; Gansemer, J.

    1994-01-01

    The Yucca Mountain Integrating Model (YMIM), an integrated model of the Engineered Barrier System, has been developed to assist project managers at LLNL in identifying areas where research emphasis should be placed. The model was designed to be highly modular so that a model of an individual process can easily be modified or replaced without interfering with the models of other processes. The modules modelling container failure and the dissolution of nuclides include particularly detailed, temperature-dependent models of their corresponding processes.

  20. The two-process model: Origin and perspective

    NARCIS (Netherlands)

    Daan, S.; Hut, R. A.; Beersma, D.

    In the two-process model as developed in the early 1980s, sleep is controlled by a process S, representing the rise and fall of sleep demand resulting from prior sleep-wake history, interacting with a process C representing circadian variation in sleep propensity. S and C together optimize sleep
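
    The standard quantitative reading of this interaction is well established: S saturates exponentially toward an upper asymptote during wake and decays exponentially during sleep, with sleep onset and offset gated by circadian thresholds. The sketch below is a minimal implementation assuming commonly cited time constants (about 18.2 h for the rise, 4.2 h for the decay); the threshold levels and circadian amplitude are illustrative choices rather than the original paper's parameters.

    ```python
    # Minimal two-process sketch: homeostatic pressure S rises toward an
    # asymptote while awake and decays while asleep; sleep is triggered and
    # ended where S crosses circadian-modulated thresholds.
    import math

    def simulate(hours=72.0, dt=0.1):
        S, asleep, t = 0.5, False, 0.0
        log = []
        while t < hours:
            C = 0.1 * math.sin(2 * math.pi * (t - 16.0) / 24.0)  # circadian drive
            upper, lower = 0.85 + C, 0.15 + C                    # thresholds
            if asleep:
                S *= math.exp(-dt / 4.2)                 # decay during sleep
                if S <= lower:
                    asleep = False
            else:
                S = 1.0 - (1.0 - S) * math.exp(-dt / 18.2)  # saturating rise
                if S >= upper:
                    asleep = True
            log.append((t, S, asleep))
            t += dt
        return log

    for t, S, asleep in simulate()[::60]:  # print every 6 h
        print(f"t={t:5.1f} h  S={S:.2f}  {'sleep' if asleep else 'wake'}")
    ```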

  1. A geomorphology-based ANFIS model for multi-station modeling of rainfall-runoff process

    Science.gov (United States)

    Nourani, Vahid; Komasi, Mehdi

    2013-05-01

    This paper demonstrates the potential use of Artificial Intelligence (AI) techniques for predicting daily runoff at multiple gauging stations. The uncertainty and complexity of the rainfall-runoff process, due to its variability in space and time on the one hand and the lack of historical data on the other, cause difficulties in the spatiotemporal modeling of the process. In this paper, an Integrated Geomorphological Adaptive Neuro-Fuzzy Inference System (IGANFIS) model conjugated with the fuzzy C-means clustering algorithm was used for rainfall-runoff modeling at multiple stations of the Eel River watershed, California. The proposed model can be used for predicting runoff at stations lacking data, or for any sub-basin within the watershed, because it employs the spatial and temporal variables of the sub-basins as model inputs. This ability of the integrated model for spatiotemporal modeling of the process was examined through the cross-validation technique for a station. In this way, different ANFIS structures were trained using the Sugeno algorithm in order to estimate daily discharge values at different stations. In order to improve the model efficiency, the input data were then classified into clusters by means of the fuzzy C-means (FCM) method. The goodness-of-fit measures support the gainful use of the IGANFIS and FCM methods in spatiotemporal modeling of hydrological processes.
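
    The FCM step used to pre-classify the model inputs is a standard, well-documented algorithm and can be sketched compactly. The toy feature vectors below (standing in for sub-basin rainfall and flow inputs) and the fuzzifier m = 2 are illustrative; the study's actual inputs and settings may differ.

    ```python
    import numpy as np

    def fcm(X, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
        """Fuzzy C-means: returns cluster centres and membership matrix U (c x n)."""
        rng = np.random.default_rng(seed)
        U = rng.random((c, len(X)))
        U /= U.sum(axis=0)                        # memberships sum to 1 per point
        for _ in range(iters):
            Um = U ** m
            centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2)
            d = np.maximum(d, 1e-12)              # avoid division by zero
            U_new = d ** (-2.0 / (m - 1.0))       # u_ik ~ d_ik^(-2/(m-1))
            U_new /= U_new.sum(axis=0)
            if np.abs(U_new - U).max() < tol:
                return centers, U_new
            U = U_new
        return centers, U

    # Toy stand-ins for sub-basin input vectors (e.g. rainfall, upstream flow)
    X = np.array([[1.0, 0.2], [1.2, 0.3], [5.0, 2.1], [5.3, 2.4], [4.9, 1.9]])
    centers, U = fcm(X)
    print("centres:\n", centers.round(2))
    print("memberships:\n", U.round(2))
    ```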

  2. Testing process predictions of models of risky choice: a quantitative model comparison approach

    Science.gov (United States)

    Pachur, Thorsten; Hertwig, Ralph; Gigerenzer, Gerd; Brandstätter, Eduard

    2013-01-01

    This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or non-linear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter et al., 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called “similarity.” In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies. PMID:24151472

  3. Testing Process Predictions of Models of Risky Choice: A Quantitative Model Comparison Approach

    Directory of Open Access Journals (Sweden)

    Thorsten ePachur

    2013-09-01

    Full Text Available This article presents a quantitative model comparison contrasting the process predictions of two prominent views on risky choice. One view assumes a trade-off between probabilities and outcomes (or nonlinear functions thereof) and the separate evaluation of risky options (expectation models). Another view assumes that risky choice is based on comparative evaluation, limited search, aspiration levels, and the forgoing of trade-offs (heuristic models). We derived quantitative process predictions for a generic expectation model and for a specific heuristic model, namely the priority heuristic (Brandstätter, Gigerenzer, & Hertwig, 2006), and tested them in two experiments. The focus was on two key features of the cognitive process: acquisition frequencies (i.e., how frequently individual reasons are looked up) and direction of search (i.e., gamble-wise vs. reason-wise). In Experiment 1, the priority heuristic predicted direction of search better than the expectation model (although neither model predicted the acquisition process perfectly); acquisition frequencies, however, were inconsistent with both models. Additional analyses revealed that these frequencies were primarily a function of what Rubinstein (1988) called similarity. In Experiment 2, the quantitative model comparison approach showed that people seemed to rely more on the priority heuristic in difficult problems, but to make more trade-offs in easy problems. This finding suggests that risky choice may be based on a mental toolbox of strategies.

  4. A semantic approach for business process model abstraction

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Mouratidis, H.; Rolland, C.

    2011-01-01

    Models of business processes can easily become large and difficult to understand. Abstraction has proven to be an effective means to present a readable, high-level view of a business process model, by showing aggregated activities and leaving out irrelevant details. Yet, it is an open question how

  5. Business Process Modeling Notation - An Overview

    Directory of Open Access Journals (Sweden)

    Alexandra Fortiş

    2006-01-01

    Full Text Available BPMN is an industrial standard created to offer a common and user-friendly notation to all participants in a business process. The present paper briefly presents the main features of this notation, as well as an interpretation of some of the main patterns characterizing a business process modeled by means of workflows.

  6. Cognitive processes, models and metaphors in decision research

    Directory of Open Access Journals (Sweden)

    Ben Newell

    2008-03-01

    Full Text Available Decision research in psychology has traditionally been influenced by the homo oeconomicus metaphor, with its emphasis on normative models and deviations from the predictions of those models. In contrast, the principal metaphor of cognitive psychology conceptualizes humans as 'information processors', employing processes of perception, memory, categorization, problem solving and so on. Many of the processes described in cognitive theories are similar to those involved in decision making, and thus increasing cross-fertilization between the two areas is an important endeavour. A wide range of models and metaphors has been proposed to explain and describe 'information processing', and many models have been applied to decision making in ingenious ways. This special issue encourages cross-fertilization between cognitive psychology and decision research by providing an overview of current perspectives in one area that continues to highlight the benefits of the synergistic approach: cognitive modeling of multi-attribute decision making. In this introduction we discuss aspects of the cognitive system that need to be considered when modeling multi-attribute decision making (e.g., automatic versus controlled processing, learning and memory constraints, metacognition) and illustrate how such aspects are incorporated into the approaches proposed by contributors to the special issue. We end by discussing the challenges posed by the contrasting and sometimes incompatible assumptions of the models and metaphors.

  7. Modeling and simulation for process and safeguards system design

    International Nuclear Information System (INIS)

    Gutmacher, R.G.; Kern, E.A.; Duncan, D.R.; Benecke, M.W.

    1983-01-01

    A computer modeling and simulation approach that meets the needs of both the process and safeguards system designers is described. The results have been useful to Westinghouse Hanford Company process designers in optimizing the process scenario and operating scheme of the Secure Automated Fabrication line. The combined process/measurements model will serve as the basis for the design of the safeguards system. Integration of the process design and the safeguards system design should result in a smoothly operating process that is easier to safeguard.

  8. Asymptotic Poisson distribution for the number of system failures of a monotone system

    International Nuclear Information System (INIS)

    Aven, Terje; Haukis, Harald

    1997-01-01

    It is well known that for highly available monotone systems, the time to the first system failure is approximately exponentially distributed. Various normalising factors can be used as the parameter of the exponential distribution to ensure the asymptotic exponentiality. More generally, it can be shown that the number of system failures is asymptotically Poisson distributed. In this paper we study the performance of some of the normalising factors by using Monte Carlo simulation. The results show that the exponential/Poisson distribution gives, in general, very good approximations for highly available components. The asymptotic failure rate of the system gives the best results when the process is in steady state, whereas other normalising factors seem preferable when the process is not in steady state. From a computational point of view, the asymptotic system failure rate is most attractive.
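
    The Monte Carlo experiment described above can be reproduced in miniature. The sketch below simulates a two-component parallel (monotone) system with exponential failure and repair times, counts system failures over a fixed horizon, and checks the Poisson character of the count: the variance-to-mean ratio should be near 1 and P(N=0) near exp(-mean) when the components are highly available. The rates and horizon are illustrative, not the paper's cases.

    ```python
    import math
    import random

    def count_system_failures(T=1000.0, lam=0.1, mu=5.0, rng=random):
        """Two-component parallel system; system fails when both are down."""
        up = [True, True]
        next_ev = [rng.expovariate(lam), rng.expovariate(lam)]
        failures = 0
        while True:
            i = 0 if next_ev[0] <= next_ev[1] else 1     # next event
            t = next_ev[i]
            if t > T:
                return failures
            if up[i]:                                    # component i fails
                up[i] = False
                if not up[1 - i]:                        # other already down
                    failures += 1                        # -> system failure
                next_ev[i] = t + rng.expovariate(mu)     # schedule repair
            else:                                        # component i repaired
                up[i] = True
                next_ev[i] = t + rng.expovariate(lam)    # schedule next failure

    random.seed(2)
    counts = [count_system_failures() for _ in range(2000)]
    mean = sum(counts) / len(counts)
    var = sum((n - mean) ** 2 for n in counts) / (len(counts) - 1)
    p0 = sum(n == 0 for n in counts) / len(counts)
    print(f"mean={mean:.2f}  var/mean={var / mean:.2f}  "
          f"P(N=0)={p0:.3f} vs exp(-mean)={math.exp(-mean):.3f}")
    ```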

  9. Fuzzy model for Laser Assisted Bending Process

    Directory of Open Access Journals (Sweden)

    Giannini Oliviero

    2016-01-01

    Full Text Available In the present study, a fuzzy model was developed to predict the residual bending in a conventional metal bending process assisted by a high-power diode laser. The study focused on AA6082T6 aluminium thin sheets. In most dynamic sheet metal forming operations, the highly nonlinear deformation processes cause large amounts of elastic strain energy to be stored in the formed material. The novel hybrid forming process was thus aimed at inducing local heating of the mechanically bent workpiece in order to decrease or eliminate the related springback phenomena. In particular, the influence of laser process parameters, such as source power, scan speed and the starting elastic deformation of the mechanically bent sheets, on the extent of the springback phenomena was experimentally assessed. Consistent trends in the experimental response according to the operational parameters were found. Accordingly, 3D process maps of the extent of the springback phenomena according to the operational parameters were constructed. The effect of the inherent uncertainties on the predicted residual bending, caused by the approximation in the model parameters, was evaluated. In particular, a fuzzy-logic based approach was used to describe the model uncertainties, and the transformation method was applied to propagate their effect on the residual bending.

  10. Numerical modeling of atmospheric washout processes

    International Nuclear Information System (INIS)

    Bayer, D.; Beheng, K.D.; Herbert, F.

    1987-01-01

    For the washout of particles from the atmosphere by clouds and rain, one has to distinguish between processes at work in the first phase of cloud development, when condensation nuclei build up in saturated air (Nucleation Aerosol Scavenging, NAS), and processes at work during subsequent cloud development. In the second case, particles are collected by cloud droplets or by falling raindrops via collision (Collision Aerosol Scavenging, CAS). The physics of both processes is described. For the CAS process a numerical model is presented. The report contains a documentation of the mathematical equations and the computer programs (FORTRAN). (KW)

  11. Monte Carlo based toy model for fission process

    International Nuclear Information System (INIS)

    Kurniadi, R.; Waris, A.; Viridi, S.

    2014-01-01

    There are many models and calculation techniques for obtaining a visible image of the fission yield process. In particular, fission yield can be calculated using two approaches, namely a macroscopic approach and a microscopic approach. This work proposes another calculation approach, in which the nucleus is treated as a toy model; hence, the process does not completely represent the real fission process in nature. The toy model is formed by Gaussian distributions of random numbers that randomize distances, such as the distance between a particle and a central point. The scission process is started by smashing the compound nucleus central point into two parts, the left and right central points. These three points have different Gaussian distribution parameters, namely means (μ_CN, μ_L, μ_R) and standard deviations (σ_CN, σ_L, σ_R). By overlaying the three distributions, the numbers of particles (N_L, N_R) trapped by the central points can be obtained. This process is iterated until (N_L, N_R) become constant. The smashing process is repeated by changing σ_L and σ_R randomly.
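
    The sketch below implements one plausible reading of this abstract (the paper's exact trapping rule is not reproduced): nucleon positions are drawn from the compound-nucleus Gaussian, each nucleon is assigned to the left or right fragment centre by comparing the two fragment Gaussians, and the fragment widths are re-drawn until the counts (N_L, N_R) stabilise. All numbers are illustrative.

    ```python
    import math
    import random

    def gauss_pdf(x, mu, sigma):
        return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

    def split(A=236, mu_cn=0.0, sigma_cn=1.0, mu_l=-1.0, mu_r=1.0, seed=3):
        rng = random.Random(seed)
        xs = [rng.gauss(mu_cn, sigma_cn) for _ in range(A)]   # nucleon positions
        prev = None
        for _ in range(10000):
            sigma_l = rng.uniform(0.5, 1.5)                   # re-drawn widths
            sigma_r = rng.uniform(0.5, 1.5)
            # a nucleon is "trapped" by whichever fragment Gaussian is larger
            n_l = sum(gauss_pdf(x, mu_l, sigma_l) > gauss_pdf(x, mu_r, sigma_r)
                      for x in xs)
            if prev is not None and abs(n_l - prev) <= 1:     # counts stabilised
                break
            prev = n_l
        return n_l, A - n_l

    n_l, n_r = split()
    print(f"fragment sizes: N_L={n_l}, N_R={n_r}")
    ```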

  12. Mashup Model and Verification Using Mashup Processing Network

    Science.gov (United States)

    Zahoor, Ehtesham; Perrin, Olivier; Godart, Claude

    Mashups are defined to be lightweight Web applications aggregating data from different Web services, built using ad-hoc composition and not concerned with long-term stability and robustness. In this paper we present a pattern-based approach, called Mashup Processing Network (MPN). The idea is based on the Event Processing Network and is intended to facilitate the creation, modeling and verification of mashups. MPN provides a view of how different actors interact in mashup development, namely the producer, the consumer, the mashup processing agent and the communication channels. It also supports modeling transformations and validations of data, and offers validation of both functional and non-functional requirements, such as reliable messaging and security, which are key issues within the enterprise context. We have enriched the model with a set of processing operations and categorized them into data composition, transformation and validation categories. These processing operations can be seen as a set of patterns facilitating the mashup development process. MPN also paves the way for realizing a Mashup Oriented Architecture, where mashups along with services are used as building blocks for application development.

  13. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-01-01

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines.

  14. Measurement and modeling of advanced coal conversion processes

    Energy Technology Data Exchange (ETDEWEB)

    Solomon, P.R.; Serio, M.A.; Hamblen, D.G. (Advanced Fuel Research, Inc., East Hartford, CT (United States)); Smoot, L.D.; Brewster, B.S. (Brigham Young Univ., Provo, UT (United States))

    1991-09-25

    The objectives of this study are to establish the mechanisms and rates of basic steps in coal conversion processes, to integrate and incorporate this information into comprehensive computer models for coal conversion processes, to evaluate these models and to apply them to gasification, mild gasification and combustion in heat engines. (VC)

  15. Isothermal CFD-model of Peirce-Smith converting process

    Energy Technology Data Exchange (ETDEWEB)

    Vaarno, J.; Pitkaelae, J.; Ahokainen, T.; Jokilaakso, A.

    1997-12-31

    The Peirce-Smith converter has been the dominant copper and nickel matte refining process since 1905. Due to extremely difficult process conditions, very little measured data have been available for studying the interactions of the gas injection and the molten sulphide matte. Detailed information on the fluid dynamics of the gas injection is needed for solving gas injection related problems, such as refractory wear, accretion growth and tuyere blockage, as well as for optimising the efficiency of momentum and mass transfer created by the gas jets. A commercial CFD code, PHOENICS, was used to solve the isothermal flow field of gas and liquid in a Peirce-Smith converter. An Euler-Euler based algorithm was chosen for modelling the fluid dynamics and evaluating the controlling forces of submerged gas injection generally. Predictions were made with a κ-ε turbulence model in a body-fitted co-ordinate system. The model has been verified with a 1/4-scale water model, and a parametric study with the mathematical model of submerged gas injection was made for the PS process and ladle injection processes. The limits of the modelling technique used were recognised, but the calculated results indicate that the present model predicts the general flow field with reasonable accuracy and can be used as input for more detailed mathematical models of gas plumes. The predicted bubble distribution, flow field pattern and flow velocity magnitudes were also used to evaluate scaling factors for physical models and the general flow conditions of an industrial PS converter. (orig.) 28 refs.

  16. CFD Modeling and Simulation in Materials Processing 2018

    OpenAIRE

    Nastac, Laurentiu; Pericleous, Koulis; Sabau, Adrian S.; Zhang, Lifeng; Thomas, Brian G.

    2018-01-01

    This book contains the proceedings of the symposium “CFD Modeling and Simulation in Materials Processing” held at the TMS 2018 Annual Meeting & Exhibition in Phoenix, Arizona, USA, March 11–15, 2018. This symposium dealt with computational fluid dynamics (CFD) modeling and simulation of engineering processes. The papers published in this book were requested from researchers and engineers involved in the modeling of multiscale and multiphase phenomena in material processing systems. The sympos...

  17. A Typology for Modeling Processes in Clinical Guidelines and Protocols

    Science.gov (United States)

    Tu, Samson W.; Musen, Mark A.

    We analyzed the graphical representations that are used by various guideline-modeling methods to express process information embodied in clinical guidelines and protocols. From this analysis, we distilled four modeling formalisms and the processes they typically model: (1) flowcharts for capturing problem-solving processes, (2) disease-state maps that link decision points in managing patient problems over time, (3) plans that specify sequences of activities that contribute toward a goal, (4) workflow specifications that model care processes in an organization. We characterized the four approaches and showed that each captures some aspect of what a guideline may specify. We believe that a general guideline-modeling system must provide explicit representation for each type of process.

  18. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  19. Using CASE to Exploit Process Modeling in Technology Transfer

    Science.gov (United States)

    Renz-Olar, Cheryl

    2003-01-01

    A successful business will be one that has processes in place to run that business. Creating processes, reengineering processes, and continually improving processes can be accomplished through extensive modeling. Casewise(R) Corporate Modeler(TM) CASE is a computer-aided software engineering tool that will enable the Technology Transfer Department (TT) at NASA Marshall Space Flight Center (MSFC) to capture these abilities. After successful implementation of CASE, it could then be applied in other departments at MSFC and other centers at NASA. The success of a business process is dependent upon the players working as a team and continuously improving the process. A good process fosters customer satisfaction as well as internal satisfaction in the organizational infrastructure. CASE provides a method for business process success through functions consisting of system and process business models; specialized diagrams; matrix management; simulation; report generation and publishing; and linking, importing, and exporting documents and files. The software has an underlying repository or database to support these functions. The Casewise manual notes that dynamics modeling is a technique used in business design and analysis. Feedback is used as a tool for the end users and generates different ways of dealing with the process. Feedback on this project resulted from a collection of issues gathered through a systems-analyst interface approach of interviews with process coordinators and Technical Points of Contact (TPOCs).

  20. A generic Transcriptomics Reporting Framework (TRF) for 'omics data processing and analysis.

    Science.gov (United States)

    Gant, Timothy W; Sauer, Ursula G; Zhang, Shu-Dong; Chorley, Brian N; Hackermüller, Jörg; Perdichizzi, Stefania; Tollefsen, Knut E; van Ravenzwaay, Ben; Yauk, Carole; Tong, Weida; Poole, Alan

    2017-12-01

    A generic Transcriptomics Reporting Framework (TRF) is presented that lists parameters that should be reported in 'omics studies used in a regulatory context. The TRF encompasses the processes of transcriptome profiling, from data generation to a processed list of differentially expressed genes (DEGs) ready for interpretation. Included within the TRF is a reference baseline analysis (RBA) that encompasses raw data selection; data normalisation; recognition of outliers; and statistical analysis. The TRF itself does not dictate the methodology for data processing, but deals with what should be reported. Its principles are also applicable to sequencing data and other 'omics. In contrast, the RBA specifies a simple data processing and analysis methodology that is designed to provide a comparison point for other approaches, and is exemplified here by a case study. By providing transparency on the steps applied during 'omics data processing and analysis, the TRF will increase confidence in the processing of 'omics data and in its regulatory use. Applicability of the TRF is ensured by its simplicity and generality. The TRF can be applied to all types of regulatory 'omics studies, and it can be executed using different commonly available software tools. Crown Copyright © 2017. Published by Elsevier Inc. All rights reserved.
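
    The RBA steps named above (raw data selection, normalisation, outlier recognition, statistical analysis) can be sketched end-to-end on synthetic data. The sketch below uses quantile normalisation, a crude spread-based outlier filter and a per-gene t-test; these specific choices, like the data and thresholds, are illustrative and are not prescribed by the TRF.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    genes = [f"g{i}" for i in range(200)]
    control = rng.lognormal(5, 1, size=(200, 4))      # 4 control arrays
    treated = rng.lognormal(5, 1, size=(200, 4))
    treated[:10] *= 4.0                               # spike 10 true DEGs

    def quantile_normalise(m):
        """Force every column onto the shared rank-mean distribution."""
        ranks = m.argsort(axis=0).argsort(axis=0)
        means = np.sort(m, axis=0).mean(axis=1)
        return means[ranks]

    data = np.log2(quantile_normalise(np.hstack([control, treated])))
    ctrl, trt = data[:, :4], data[:, 4:]

    # crude outlier recognition: drop genes with extreme within-group spread
    keep = (ctrl.std(axis=1) < 2.0) & (trt.std(axis=1) < 2.0)

    t, p = stats.ttest_ind(trt[keep], ctrl[keep], axis=1)
    degs = [g for g, pv in zip(np.array(genes)[keep], p) if pv < 0.01]
    print(f"{len(degs)} DEGs at p<0.01, e.g. {degs[:5]}")
    ```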

  1. Edgar Schein's Process versus Content Consultation Models.

    Science.gov (United States)

    Rockwood, Gary F.

    1993-01-01

    Describes Schein's three models of consultation based on assumptions inherent in different helping styles: purchase of expertise and doctor-patient models, which focus on content of organization problems; and process consultation model, which focuses on how organizational problems are solved. Notes that Schein has suggested that consultants begin…

  2. Development of an equipment management model to improve effectiveness of processes

    International Nuclear Information System (INIS)

    Chang, H. S.; Ju, T. Y.; Song, T. Y.

    2012-01-01

    The nuclear industries are trying to create performance models to improve the effectiveness of the processes implemented at nuclear plants. Most high-performing nuclear stations seek to continually improve the quality of their operations by identifying and closing important performance gaps. Thus, many utilities have implemented performance models adjusted to their plant's configuration and have instituted policies for such models. KHNP is developing a standard performance model to integrate the engineering processes and to improve the inter-relations among processes. The model, called the Standard Equipment Management Model (SEMM), is under development, focusing first on engineering processes and performance improvement processes related to plant equipment used at the site. This model includes performance indicators for each process that allow evaluating and comparing process performance among the 21 operating units. The model will later be expanded to incorporate cost and management processes. (authors)

  3. Ingestion modelling in COSYMA

    International Nuclear Information System (INIS)

    Margeanu, Sorin; Angelescu, Tatiana

    2003-01-01

    One of the aims in the design of the COSYMA ingestion model was the ability to cope in a flexible manner with the various food chain related data and results at different stages of an accident consequence assessment. Since dynamic foodchain transport models are normally rather complex and require significant computation times, they are usually not included in ACA codes, but are used to calculate and tabulate the needed information in the form of data libraries. Such data files contain specific activity concentrations in the foodstuffs, and their time integrals, normalised to unit deposit or unit air concentration, for a series of times after the accident. They allow for calculations taking food restrictions into account. In an ACA run, the actual specific concentrations in the foodstuffs are obtained by multiplying the normalised concentrations taken from the data library by the ground or air concentrations at each grid point predicted with an atmospheric transport and deposition model. The paper presents the ingestion model: its structure, methods and the libraries used for a nuclear accident consequence assessment. (authors)
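
    The lookup-and-multiply scheme described above is simple enough to sketch directly: a pre-tabulated library of concentrations normalised to unit deposit is read (here at the nearest tabulated time) and scaled by the deposition at each grid point. The nuclide, foodstuff and library values below are invented for illustration and are not COSYMA data.

    ```python
    # Bq/kg per Bq/m2 deposited, tabulated per foodstuff and days after deposit
    NORMALISED = {
        ("Cs-137", "milk"):  {7: 1.2e-3, 30: 6.0e-4, 365: 8.0e-5},
        ("Cs-137", "grain"): {7: 4.0e-4, 30: 3.5e-4, 365: 1.0e-4},
    }

    def concentration(nuclide, food, days, deposit_bq_m2):
        """Specific activity (Bq/kg) at a grid point: library value x deposit."""
        table = NORMALISED[(nuclide, food)]
        t = min(table, key=lambda k: abs(k - days))   # nearest tabulated time
        return table[t] * deposit_bq_m2

    # deposition field from the atmospheric dispersion model (Bq/m2)
    for point, deposit in {"A": 5.0e4, "B": 1.2e3}.items():
        c = concentration("Cs-137", "milk", days=30, deposit_bq_m2=deposit)
        print(f"grid point {point}: milk at day 30 = {c:.1f} Bq/kg")
    ```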

  4. Mathematical modelling in economic processes.

    Directory of Open Access Journals (Sweden)

    L.V. Kravtsova

    2008-06-01

    Full Text Available The article considers a number of methods for the mathematical modelling of economic processes, and the possibilities of using Excel spreadsheets to obtain optimal solutions to tasks or to calculate financial operations with the help of the built-in functions.

  5. Modelling of innovative SANEX process mal-operations

    International Nuclear Information System (INIS)

    McLachlan, F.; Taylor, R.; Whittaker, D.; Woodhead, D.; Geist, A.

    2016-01-01

    The innovative (i-) SANEX process for the separation of minor actinides from PUREX highly active raffinate is expected to employ a solvent phase comprising 0.2 M TODGA with 5 v/v% 1-octanol in an inert diluent. An initial extract / scrub section would be used to extract trivalent actinides and lanthanides from the feed whilst leaving other fission products in the aqueous phase, before the loaded solvent is contacted with a low acidity aqueous phase containing a sulphonated bis-triazinyl pyridine ligand (BTP) to effect a selective strip of the actinides, so yielding separate actinide (An) and lanthanide (Ln) product streams. This process has been demonstrated in lab scale trials at Juelich (FZJ). The SACSESS (Safety of Actinide Separation processes) project is focused on the evaluation and improvement of the safety of such future systems. A key element of this is the development of an understanding of the response of a process to upsets (mal-operations). It is only practical to study a small subset of possible mal-operations experimentally and consideration of the majority of mal-operations entails the use of a validated dynamic model of the process. Distribution algorithms for HNO_3, Am, Cm and the lanthanides have been developed and incorporated into a dynamic flowsheet model that has, so far, been configured to correspond to the extract-scrub section of the i-SANEX flowsheet trial undertaken at FZJ in 2013. Comparison is made between the steady state model results and experimental results. Results from modelling of low acidity and high temperature mal-operations are presented. (authors)

  6. A question driven socio-hydrological modeling process

    Science.gov (United States)

    Garcia, M.; Portney, K.; Islam, S.

    2016-01-01

    Human and hydrological systems are coupled: human activity impacts the hydrological cycle, and hydrological conditions can, but do not always, trigger changes in human systems. Traditional modeling approaches with no feedback between hydrological and human systems typically cannot offer insight into how different patterns of natural variability or human-induced changes may propagate through this coupled system. Modeling of coupled human-hydrological systems, also called socio-hydrological systems, recognizes the potential for humans to transform hydrological systems and for hydrological conditions to influence human behavior. However, this coupling introduces new challenges, and existing literature does not offer clear guidance regarding model conceptualization. There are no universally accepted laws of human behavior as there are for the physical systems; furthermore, a shared understanding of important processes within the field is often used to develop hydrological models, but there is no such consensus on the relevant processes in socio-hydrological systems. Here we present a question-driven process to address these challenges. Such an approach allows modeling structure, scope and detail to remain contingent on and adaptive to the question context. We demonstrate the utility of this process by revisiting a classic question in water resources engineering on reservoir operation rules: what is the impact of reservoir operation policy on the reliability of water supply for a growing city? Our example model couples hydrological and human systems by linking the rate of demand decrease to past reliability, in order to compare the standard operating policy (SOP) with a hedging policy (HP). The model shows that reservoir storage acts both as a buffer for variability and as a delay, triggering oscillations around a sustainable level of demand. HP reduces the threshold for action, thereby decreasing the delay and the oscillation effect. As a result, per capita demand decreases during
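
    A minimal sketch of the coupled loop described above, under invented parameters: demand grows while supply is reliable and contracts after shortfall years, and a standard operating policy (release full demand while water lasts) is compared with a hedging policy (ration early when storage is low). None of the numbers below come from the paper.

    ```python
    import random

    def simulate(hedge, years=100, seed=4):
        rng = random.Random(seed)
        storage, capacity, demand = 500.0, 1000.0, 80.0
        shortfalls = 0
        for _ in range(years):
            inflow = max(0.0, rng.gauss(100.0, 40.0))
            target = demand
            if hedge and storage + inflow < 2 * demand:
                target = 0.7 * demand                 # ration before crisis
            release = min(target, storage + inflow)   # SOP: all you can
            storage = min(capacity, storage + inflow - release)
            reliable = release >= demand
            shortfalls += not reliable
            # demand rises when supply is reliable, contracts after shortfall
            demand *= 1.01 if reliable else 0.97
        return shortfalls, demand

    for policy in (False, True):
        s, d = simulate(hedge=policy)
        print(f"hedging={policy}: {s} shortfall years, final demand {d:.0f}")
    ```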

  7. How can Product Development Process Modelling be made more useful?

    DEFF Research Database (Denmark)

    Wynn, David C; Maier, Anja; Clarkson, John P

    2010-01-01

    and on the way they are applied. The paper draws upon established principles of cybernetic systems in an attempt to explain the role played by process modelling in operating and improving PD processes. We use this framework to identify eight key factors which influence the utility of modelling in the context...... of use. Further, we indicate how these factors can be interpreted to identify opportunities to improve modelling utility. The paper is organised as follows. Section 2 provides background and motivation for the paper by discussing an example of PD process modelling practice. After highlighting from......, and the process being modelled. Section 5 draws upon established principles of cybernetic systems theory to incorporate this view in an explanation of the role of modelling in PD process operation and improvement. This framework is used to define modelling utility and to progressively identify influences upon it...

  8. Dispersive processes in models of regional radionuclide migration. Technical memorandum

    International Nuclear Information System (INIS)

    Evenson, D.E.; Dettinger, M.D.

    1980-05-01

    Three broad areas of concern in the development of aquifer scale transport models will be local scale diffusion and dispersion processes, regional scale dispersion processes, and numerical problems associated with the advection-dispersion equation. Local scale dispersion processes are fairly well understood and accessible to observation. These processes will generally be dominated in large scale systems by regional processes, or macro-dispersion. Macro-dispersion is primarily the result of large scale heterogeneities in aquifer properties. In addition, the effects of many modeling approximations are often included in the process. Because difficulties arise in parameterization of this large scale phenomenon, parameterization should be based on field measurements made at the same scale as the transport process of interest or else partially circumvented through the application of a probabilistic advection model. Other problems associated with numerical transport models include difficulties with conservation of mass, stability, numerical dissipation, overshoot, flexibility, and efficiency. We recommend the random-walk model formulation for Lawrence Livermore Laboratory's purposes as the most flexible, accurate and relatively efficient modeling approach that overcomes these difficulties.
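
    The random-walk formulation recommended above has a compact standard form: each particle is advected by the mean velocity and given a Gaussian jump of variance 2*D*dt per step, which reproduces the advection-dispersion equation in the many-particle limit. The sketch below checks the plume mean and variance against theory; the velocity and dispersion coefficient are illustrative values.

    ```python
    import random

    def random_walk(n=20000, steps=200, dt=0.5, v=1.0, D=0.5, seed=5):
        sigma = (2.0 * D * dt) ** 0.5          # jump std dev per step
        rng = random.Random(seed)
        xs = [0.0] * n                         # all mass released at x = 0
        for _ in range(steps):
            xs = [x + v * dt + rng.gauss(0.0, sigma) for x in xs]
        return xs

    xs = random_walk()
    t = 200 * 0.5
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    # Theory: mean = v*t, variance = 2*D*t
    print(f"mean={mean:.1f} (theory {1.0 * t:.1f}), "
          f"variance={var:.1f} (theory {2 * 0.5 * t:.1f})")
    ```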

  9. Application of Statistical Model in Wastewater Treatment Process Modeling Using Data Analysis

    Directory of Open Access Journals (Sweden)

    Alireza Raygan Shirazinezhad

    2015-06-01

    Full Text Available Background: Wastewater treatment includes very complex and interrelated physical, chemical and biological processes which using data analysis techniques can be rigorously modeled by a non-complex mathematical calculation models. Materials and Methods: In this study, data on wastewater treatment processes from water and wastewater company of Kohgiluyeh and Boyer Ahmad were used. A total of 3306 data for COD, TSS, PH and turbidity were collected, then analyzed by SPSS-16 software (descriptive statistics and data analysis IBM SPSS Modeler 14.2, through 9 algorithm. Results: According to the results on logistic regression algorithms, neural networks, Bayesian networks, discriminant analysis, decision tree C5, tree C & R, CHAID, QUEST and SVM had accuracy precision of 90.16, 94.17, 81.37, 70.48, 97.89, 96.56, 96.46, 96.84 and 88.92, respectively. Discussion and conclusion: The C5 algorithm as the best and most applicable algorithms for modeling of wastewater treatment processes were chosen carefully with accuracy of 97.899 and the most influential variables in this model were PH, COD, TSS and turbidity.

  10. Evaluation of Models of the Reading Process.

    Science.gov (United States)

    Balajthy, Ernest

    A variety of reading process models have been proposed and evaluated in reading research. Traditional approaches to model evaluation specify the workings of a system in a simplified fashion to enable organized, systematic study of the system's components. Following are several statistical methods of model evaluation: (1) empirical research on…

  11. Two Undergraduate Process Modeling Courses Taught Using Inductive Learning Methods

    Science.gov (United States)

    Soroush, Masoud; Weinberger, Charles B.

    2010-01-01

    This manuscript presents a successful application of inductive learning in process modeling. It describes two process modeling courses that use inductive learning methods such as inquiry learning and problem-based learning, among others. The courses include a novel collection of multi-disciplinary complementary process modeling examples. They were…

  12. Spatial differentiation in characterisation modelling – what difference does it make?

    DEFF Research Database (Denmark)

    Hauschild, Michael Zwicky; Potting, José

    2004-01-01

    In the life cycle of a product, emissions take place at many different locations. The location of the sources and their surrounding conditions influence the fate of the emissions and the exposure they lead to, but this source of variation is currently neglected in life cycle impact assessment, although...... results from the Danish LCA Methodology Development and Consensus Creation Project address this issue and provide a framework for spatially differentiated characterisation modelling together with easily applicable site-dependent factors for each European country and normalisation references for those......

  13. Mathematical modelling of the laser processing of composite materials

    International Nuclear Information System (INIS)

    Gromyko, G.F.; Matsuka, N.P.

    2009-01-01

    Expansion of the scope of protective coatings has led to the necessity of working out lower-priced methods of treating machine elements. Constructing an adequate mathematical model, consistent with the features of the process, and developing effective methods for solving it are promising directions in this field. In this paper, a mathematical model of high-temperature laser treatment, via a moving source, of padding pre-sprayed with composite powder is developed. The presented model describes with sufficient accuracy the heat processes taking place during laser processing of machine elements. By varying the input parameters of the model (laser power, temperature and composition of the environment, characteristics and quantitative composition of the materials used, etc.) one obtains an inexpensive tool for preliminary estimates over a wide range of similar problems. A difference method, based on the physical features of the process and taking into account the main process-dependent parameters, has been developed for solving the resulting system of nonlinear equations. (authors)
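
    A minimal sketch of the difference-method idea: an explicit 1-D finite-difference solution of the heat equation with a moving Gaussian laser source. The real model is nonlinear and multidimensional; all material and beam parameters here are illustrative assumptions.

```python
import numpy as np

L, nx = 0.1, 201                       # bar length (m) and grid points
dx = L / (nx - 1)
alpha = 1e-5                           # thermal diffusivity (m^2/s), assumed
dt = 0.4 * dx**2 / alpha               # stable explicit time step (dt <= 0.5*dx^2/alpha)
q0, width, v_laser = 5e3, 2e-3, 0.01   # source strength (K/s), radius (m), speed (m/s)

x = np.linspace(0.0, L, nx)
T = np.full(nx, 300.0)                 # initial temperature (K)

t = 0.0
while t < 5.0:                         # 5 s of laser traverse
    source = q0 * np.exp(-((x - v_laser * t) / width) ** 2)
    # explicit update: diffusion plus volumetric heating by the moving source
    T[1:-1] += dt * (alpha * (T[2:] - 2 * T[1:-1] + T[:-2]) / dx**2 + source[1:-1])
    T[0], T[-1] = 300.0, 300.0         # fixed-temperature boundaries
    t += dt

print(f"peak temperature after 5 s: {T.max():.0f} K")
```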

  14. Modeling and knowledge acquisition processes using case-based inference

    Directory of Open Access Journals (Sweden)

    Ameneh Khadivar

    2017-03-01

    The acquisition and presentation of organizational process knowledge has been considered by many KM researchers. In this research, a model for process knowledge acquisition and presentation is presented using a case-based reasoning approach. The validity of the presented model was evaluated by an expert panel. Software was then developed based on the presented model and implemented in Eghtesad Novin Bank of Iran. There, following the stages of the presented model, the knowledge-intensive processes were first identified, and the process knowledge was then stored in a knowledge base in the format problem/solution/consequence. Knowledge retrieval was based on nearest-neighbour similarity. To validate the implemented system, its results were compared with the decisions made by the process experts.
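
    A minimal sketch of the retrieval step described above: cases stored as problem/solution/consequence records and fetched by nearest-neighbour similarity over a numeric encoding of the problem. The feature encoding and the cases themselves are invented for illustration, not taken from the bank's knowledge base.

```python
import numpy as np

# each case: problem features (numeric encoding), solution, consequence
case_base = [
    (np.array([0.9, 0.2, 0.1]), "escalate to credit committee", "loan approved"),
    (np.array([0.3, 0.8, 0.5]), "request extra collateral",     "risk reduced"),
    (np.array([0.1, 0.4, 0.9]), "automate document check",      "cycle time halved"),
]

def retrieve(problem, k=1):
    """Return the k most similar cases by Euclidean nearest neighbour."""
    dists = [np.linalg.norm(problem - feats) for feats, _, _ in case_base]
    order = np.argsort(dists)[:k]
    return [case_base[i] for i in order]

query = np.array([0.8, 0.3, 0.2])
for feats, solution, consequence in retrieve(query):
    print(f"solution: {solution} -> consequence: {consequence}")
```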

  15. Process modeling and control applied to real-time monitoring of distillation processes by near-infrared spectroscopy.

    Science.gov (United States)

    de Oliveira, Rodrigo R; Pedroza, Ricardo H P; Sousa, A O; Lima, Kássio M G; de Juan, Anna

    2017-09-08

    A distillation device that acquires continuous and synchronized measurements of temperature, percentage of distilled fraction and NIR spectra has been designed for real-time monitoring of distillation processes. As a process model, synthetic commercial gasoline batches produced in Brazil, which contain mixtures of pure gasoline blended with ethanol, have been analyzed. The information provided by this device, i.e., distillation curves and NIR spectra, has served as initial information for the proposal of new strategies of process modeling and multivariate statistical process control (MSPC). Process modeling based on PCA batch analysis provided global distillation trajectories, whereas multiset MCR-ALS analysis is proposed to obtain a component-wise characterization of the distillation evolution and distilled fractions. Distillation curves, NIR spectra or compressed NIR information in the form of PCA scores and MCR-ALS concentration profiles were tested as the seed information to build MSPC models. New on-line PCA-based MSPC approaches, some inspired by local rank exploratory methods for process analysis, are proposed and work as follows: a) MSPC based on individual process observation models, where multiple local PCA models are built considering only the information in each observation point; b) Fixed Size Moving Window MSPC, in which local PCA models are built considering a moving window of the current and a few past observation points; and c) Evolving MSPC, where local PCA models are built with an increasing window of observations covering all points from the beginning of the process until the current observation. The performance of the different approaches has been assessed in terms of sensitivity to fault detection and number of false alarms. The outcome of this work will be of general use in defining strategies for on-line process monitoring and control and, more specifically, in improving quality control of petroleum-derived fuels and other substances submitted
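
    A minimal sketch of the Fixed Size Moving Window idea (approach b): at each observation a local PCA model is fitted to a window of recent spectra and the new spectrum is flagged through its squared reconstruction error (a Q-type statistic). The window size, control limit and synthetic "NIR" data are all assumptions for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_obs, n_wavelengths, window = 120, 50, 20

# synthetic process spectra drifting smoothly, with a fault injected at t = 90
base = np.sin(np.linspace(0, 3, n_wavelengths))
spectra = base + 0.01 * rng.standard_normal((n_obs, n_wavelengths))
spectra[90:] += 0.2 * np.cos(np.linspace(0, 6, n_wavelengths))  # the fault

for t in range(window, n_obs):
    ref = spectra[t - window:t]             # moving window of past observations
    pca = PCA(n_components=2).fit(ref)
    recon = pca.inverse_transform(pca.transform(spectra[t:t + 1]))
    q = float(np.sum((spectra[t] - recon) ** 2))
    limit = 3.0 * np.mean(                  # crude empirical control limit
        np.sum((ref - pca.inverse_transform(pca.transform(ref))) ** 2, axis=1))
    if q > limit:
        print(f"t={t}: Q={q:.4f} exceeds limit {limit:.4f} -> possible fault")
```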

  16. Cutting force model for high speed machining process

    International Nuclear Information System (INIS)

    Haber, R. E.; Jimenez, J. E.; Jimenez, A.; Lopez-Coronado, J.

    2004-01-01

    This paper presents cutting-force-based models able to describe a high speed machining process. The models consider the cutting force as the output variable, essential for the physical processes taking place in high speed machining. Moreover, this paper shows the mathematical development used to derive the integral-differential equations, and the algorithms implemented in MATLAB to predict the cutting force in real time. MATLAB is a software tool for numerical computations with matrices and vectors; it can also display information graphically and includes many toolboxes for several research and application areas. Two end mill shapes are considered (i.e., cylindrical and ball-end mills) for real-time implementation of the developed algorithms. The developed models are validated in slot milling operations. The results corroborate the importance of the cutting force variable for predicting tool wear in high speed machining operations. The developed models are the starting point for future work related to vibration analysis, process stability and dimensional surface finish in high speed machining processes. (Author) 19 refs

  17. Regulatory framework for products and processes: regulation and standardisation in international trade

    Directory of Open Access Journals (Sweden)

    Morin Odile

    2003-07-01

    Products and processes are governed both by regulations and, at another level, by international trade standards. This presentation deals with regulatory texts at the Community and national levels. It should be recalled that the entry into force of a European regulation is followed by its transposition into the law of each member state, and that national regulation applies in the absence of Community provisions. With regard to international trade, the standardisation activities of the International Olive Council (COI) for olive oils and olive-pomace oils and those of the Codex Alimentarius for edible oils and fats are discussed. Taken together, these regulatory provisions form a framework covering production from upstream to downstream, both vertically (oilseeds, oils and fats, olive oils, margarines, refining processes) and transversally (volatile organic compounds, GMOs, extraction solvents, additives, contaminants, etc.). The case of olive oil is particular in that it is governed at the international (COI and Codex Alimentarius trade standards), European and national (regulation) levels. The Codex Alimentarius, for its part, establishes both vertical standards (vegetable oils, animal fats, olive oils, spreadable fats, etc.) and horizontal ones (additives, pesticide residues, etc.). The essentials of this framework are summarised in the tables that illustrate this contribution.

  18. Night-time restricted feeding normalises clock genes and Pai-1 gene expression in the db/db mouse liver.

    Science.gov (United States)

    Kudo, T; Akiyama, M; Kuriyama, K; Sudo, M; Moriya, T; Shibata, S

    2004-08-01

    An increase in PAI-1 activity is thought to be a key factor underlying myocardial infarction. Mouse Pai-1 (mPai-1) activity shows a daily rhythm in vivo, and its transcription seems to be controlled not only by clock genes but also by humoral factors such as insulin and triglycerides. Thus, we investigated daily clock gene and mPai-1 mRNA expression in the liver of db/db mice exhibiting high levels of glucose, insulin and triglycerides. Locomotor activity was measured using an infrared detection system. RT-PCR or in situ hybridisation methods were applied to measure gene expression. Humoral factors were measured using assay kits. The db/db mice showed attenuated locomotor activity rhythms. The rhythmic expression of mPer2 mRNA was severely diminished and the phase of mBmal1 oscillation was advanced in the db/db mouse liver, whereas mPai-1 mRNA was highly and constitutively expressed. Night-time restricted feeding led to a recovery not only of the diminished locomotor activity, but also of the diminished mPer2 and advanced mBmal1 mRNA rhythms. Expression of mPai-1 mRNA in db/db mice was reduced to levels far below normal. Pioglitazone treatment slightly normalised glucose and insulin levels, with a slight reduction in mPai-1 gene expression. We demonstrated that Type 2 diabetes impairs the oscillation of the peripheral oscillator. Night-time restricted feeding, rather than pioglitazone injection, led to a recovery of the diminished locomotor activity and of the altered oscillation of the peripheral clock and mPai-1 mRNA rhythm. Thus, we conclude that scheduled restricted food intake may be a useful form of treatment for diabetes.

  19. Lipid Processing Technology: Building a Multilevel Modeling Network

    DEFF Research Database (Denmark)

    Diaz Tovar, Carlos Axel; Mustaffa, Azizul Azri; Hukkerikar, Amol

    2011-01-01

    of a computer aided multilevel modeling network consisting of a collection of new and adopted models, methods and tools for the systematic design and analysis of processes employing lipid technology. This is achieved by decomposing the problem into four levels of modeling: 1. pure component properties; 2. mixtures...... and phase behavior; 3. unit operations; and 4. process synthesis and design. The methods and tools in each level include: for the first level, a lipid database of collected experimental data from the open literature, confidential data from industry and generated data from validated predictive property...... of these unit operations with respect to performance parameters such as minimum total cost, product yield improvement, operability etc., and process intensification for the retrofit of existing biofuel plants. In the fourth level the information and models developed are used as building blocks...

  20. Plasma Processing of Model Residential Solid Waste

    Science.gov (United States)

    Messerle, V. E.; Mossé, A. L.; Nikonchuk, A. N.; Ustimenko, A. B.; Baimuldin, R. V.

    2017-09-01

    The authors have tested a technology for processing model residential solid waste. They have developed and created a pilot plasma unit based on a plasma-chamber incinerator. The waste processing technology has been tested and prepared for commercialization.

  1. A Software Development Simulation Model of a Spiral Process

    OpenAIRE

    Carolyn Mizell; Linda Malone

    2009-01-01

    This paper will present a discrete event simulation model of a spiral development lifecycle that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process. There is a need for simulation models of software development processes other than the waterfall due to new processes becoming more widely used in order to overcome the limitations of the traditional waterfall lifecycle. The use of a spiral process can make the inherently difficult job of...

  2. Process Modelling of Curing Process-Induced Internal Stress and Deformation of Composite Laminate Structure with Elastic and Viscoelastic Models

    Science.gov (United States)

    Li, Dongna; Li, Xudong; Dai, Jianfeng

    2018-06-01

    In this paper, two kinds of transient models, a viscoelastic model and a linear elastic model, are established to analyze the curing deformation of thermosetting resin composites, and are solved with the COMSOL Multiphysics software. The two models consider the complicated coupling between physical and chemical changes during the curing process of the composites and the time-variant character of the material performance parameters. Subsequently, the two proposed models are each applied to a three-dimensional composite laminate structure, and a simple and convenient local-coordinate-system method is used to calculate the development of residual stresses, curing shrinkage and curing deformation for the composite laminate. The results show that the temperature, degree of curing (DOC) and residual stresses during the curing process are consistent with studies in the literature, so the curing shrinkage and curing deformation obtained on this basis have a certain reference value. Comparison of the two numerical results indicates that the residual stress and deformation calculated by the viscoelastic model are closer to the reference values than those of the linear elastic model.

  3. A Heuristic Approach for Discovering Reference Models by Mining Process Model Variants

    NARCIS (Netherlands)

    Li, C.; Reichert, M.U.; Wombacher, Andreas

    Recently, a new generation of adaptive Process-Aware Information Systems (PAISs) has emerged, which enables structural process changes during runtime while preserving PAIS robustness and consistency. Such flexibility, in turn, leads to a large number of process variants derived from the same model,

  4. Tropospheric ozone changes, radiative forcing and attribution to emissions in the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP

    Directory of Open Access Journals (Sweden)

    D. S. Stevenson

    2013-03-01

    Ozone (O3) from 17 atmospheric chemistry models taking part in the Atmospheric Chemistry and Climate Model Intercomparison Project (ACCMIP) has been used to calculate tropospheric ozone radiative forcings (RFs). All models applied a common set of anthropogenic emissions, which are better constrained for the present-day than the past. Future anthropogenic emissions follow the four Representative Concentration Pathway (RCP) scenarios, which define a relatively narrow range of possible air pollution emissions. We calculate a value for the pre-industrial (1750) to present-day (2010) tropospheric ozone RF of 410 mW m−2. The model range of pre-industrial to present-day changes in O3 produces a spread (±1 standard deviation) in RFs of ±17%. Three different radiation schemes were used; we find differences in RFs between schemes (for the same ozone fields) of ±10%. Applying two different tropopause definitions gives differences in RFs of ±3%. Given additional (unquantified) uncertainties associated with emissions, climate-chemistry interactions and land-use change, we estimate an overall uncertainty of ±30% for the tropospheric ozone RF. Experiments carried out by a subset of six models attribute tropospheric ozone RF to increased emissions of methane (44 ± 12%), nitrogen oxides (31 ± 9%), carbon monoxide (15 ± 3%) and non-methane volatile organic compounds (9 ± 2%); earlier studies attributed more of the tropospheric ozone RF to methane and less to nitrogen oxides. Normalising RFs to changes in tropospheric column ozone, we find a global mean normalised RF of 42 mW m−2 DU−1, a value similar to previous work. Using normalised RFs and future tropospheric column ozone projections, we calculate future tropospheric ozone RFs (mW m−2; relative to 1750) for the four future scenarios (RCP2.6, RCP4.5, RCP6.0 and RCP8.5) of 350, 420, 370 and 460 (in 2030), and 200, 300, 280 and 600 (in 2100). Models show some coherent responses of ozone to climate change
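
    A minimal worked example of how the normalised forcing is used: a future RF is estimated as the normalised RF from the abstract (42 mW m−2 DU−1) multiplied by a projected change in tropospheric column ozone. The 10 DU column change below is an assumed figure for illustration, not an ACCMIP result.

```python
# normalised-RF extrapolation, using the abstract's global mean value
normalised_rf = 42.0        # mW m-2 per Dobson unit, from the abstract
delta_column_o3 = 10.0      # DU, assumed projection for some scenario

rf = normalised_rf * delta_column_o3
print(f"estimated tropospheric ozone RF: {rf:.0f} mW m-2")  # -> 420 mW m-2
```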

  5. Comparative analysis of business rules and business process modeling languages

    Directory of Open Access Journals (Sweden)

    Audrius Rima

    2013-03-01

    When developing an information system, it is important to create clear models and choose suitable modeling languages. The article analyzes the SRML, SBVR, PRR, SWRL and OCL rule specification languages and the UML, DFD, CPN, EPC, IDEF3 and BPMN business process modeling languages. The article presents a theoretical comparison of business rule and business process modeling languages. According to selected modeling aspects, it compares the different business process modeling languages and business rule representation languages. Finally, the best-fitting set of languages is selected for a three-layer framework for business-rule-based software modeling.

  6. Modelling Of Monazite Ore Break-Down By Alkali Process Spectrometry

    International Nuclear Information System (INIS)

    Visetpotjanakit, Suputtra; Changkrueng, Kalaya; Pichestapong, Pipat

    2005-10-01

    A computer model has been developed for the calculation of the mass balance of monazite ore break-down by the alkali process at the Rare Earth Research and Development Center. The process includes the following units: ore digestion by concentrated NaOH, dissolution of the digested ore by HCl, uranium and thorium precipitation, and crystallization of Na3PO4, which is a by-product of this process. The model, named RRDCMBP, was prepared in the Visual Basic language. The modelling program can be run on a personal computer and is interactive and easy to use. The user is able to choose any equipment in each unit process and input data to obtain mass balance results. The model could be helpful in process analysis for further process adjustment and development

  7. Quantum mechanical Hamiltonian models of discrete processes

    International Nuclear Information System (INIS)

    Benioff, P.

    1981-01-01

    Here the results of other work on quantum mechanical Hamiltonian models of Turing machines are extended to include any discrete process T on a countably infinite set A. The models are constructed here by use of scattering phase shifts from successive scatterers to turn on successive step interactions. Also a locality requirement is imposed. The construction is done by first associating with each process T a model quantum system M with associated Hilbert space H_M and step operator U_T. Since U_T is not unitary in general, M, H_M, and U_T are extended into a (continuous time) Hamiltonian model on a larger space which satisfies the locality requirement. The construction is compared with the minimal unitary dilation of U_T. It is seen that the model constructed here is larger than the minimal one. However, the minimal one does not satisfy the locality requirement

  8. A production model and maintenance planning model for the process industry

    NARCIS (Netherlands)

    Ashayeri, J.; Teelen, A.; Selen, W.J.

    1995-01-01

    In this paper a model is developed to simultaneously plan preventive maintenance and production in a process industry environment, where maintenance planning is extremely important. The model schedules production jobs and preventive maintenance jobs, while minimizing costs associated with

  9. Graphene growth process modeling: a physical-statistical approach

    Science.gov (United States)

    Wu, Jian; Huang, Qiang

    2014-09-01

    As a zero-bandgap semiconductor, graphene is an attractive material for a wide variety of applications such as optoelectronics. Among the various techniques developed for graphene synthesis, chemical vapor deposition on copper foils shows high potential for producing few-layer and large-area graphene. Since fabrication of high-quality graphene sheets requires an understanding of the growth mechanisms, and methods for characterization and control of the grain size of graphene flakes, analytical modeling of the graphene growth process is essential for controlled fabrication. The graphene growth process starts with randomly nucleated islands that gradually develop into complex shapes, grow in size, and eventually connect together to cover the copper foil. To model this complex process, we develop a physical-statistical approach under the assumption of self-similarity during graphene growth. The growth kinetics is uncovered by separating island shapes from the area growth rate. We propose to characterize the area growth velocity using a confined exponential model, which not only has a clear physical explanation, but also fits the real data well. For the shape modeling, we develop a parametric shape model which can be well explained by the angular-dependent growth rate. This work can provide useful information for the control and optimization of the graphene growth process on Cu foil.
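
    A minimal sketch of fitting the confined exponential area-growth model named in the abstract, A(t) = A_max(1 - exp(-kt)), to coverage data. The data points below are synthetic placeholders, not measurements.

```python
import numpy as np
from scipy.optimize import curve_fit

def confined_exponential(t, a_max, k):
    """Island area growth that saturates as the copper foil is covered."""
    return a_max * (1.0 - np.exp(-k * t))

rng = np.random.default_rng(0)
t = np.linspace(0, 30, 25)                       # growth time (min), assumed
area = confined_exponential(t, 100.0, 0.12)      # "true" coverage (%)
area_noisy = area + rng.normal(0, 2.0, t.size)   # add measurement noise

popt, _ = curve_fit(confined_exponential, t, area_noisy, p0=[80.0, 0.1])
print(f"fitted A_max = {popt[0]:.1f} %, k = {popt[1]:.3f} 1/min")
```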

  10. Modelling a uranium ore bioleaching process

    International Nuclear Information System (INIS)

    Chien, D.C.H.; Douglas, P.L.; Herman, D.H.; Marchbank, A.

    1990-01-01

    A dynamic simulation model for the bioleaching of uranium ore in a stope leaching process has been developed. The model incorporates design and operating conditions, reaction kinetics enhanced by the Thiobacillus ferrooxidans present in the leaching solution, and transport properties. Model predictions agree well with experimental data, with an average deviation of about ±3%. The model is sensitive to small errors in the estimates of fragment size and ore grade. Because accurate estimates are difficult to obtain, a parameter estimation approach was developed to update the values of fragment size and ore grade using on-line plant information

  11. New methods for clinical pathways-Business Process Modeling Notation (BPMN) and Tangible Business Process Modeling (t.BPM).

    Science.gov (United States)

    Scheuerlein, Hubert; Rauchfuss, Falk; Dittmar, Yves; Molle, Rüdiger; Lehmann, Torsten; Pienkos, Nicole; Settmacher, Utz

    2012-06-01

    Clinical pathways (CP) are nowadays used in numerous institutions, but their real impact is still a matter of debate. The optimal design of a clinical pathway remains unclear and is mainly determined by the expectations of the individual institution. The purpose of the pilot project described here was the development of two CP (colon and rectum carcinoma) according to Business Process Modeling Notation (BPMN) and Tangible Business Process Modeling (t.BPM). BPMN is an established standard for business process modelling in industry and economy; in the broadest sense, it is a computer-supported notation which enables the description and relatively easy graphical imaging of complex processes. t.BPM is a modular construction kit of the BPMN symbols which enables the creation of an outline or raw model, e.g. by placing the symbols on a spread-out paper sheet. The outline thus created can then be transferred to the computer and further modified as required. CP for the treatment of colon and rectal cancer have been developed with the support of an external IT coach. The pathways were developed in an interdisciplinary and interprofessional manner (55 man-days over 15 working days). During this time, the necessary interviews with medical, nursing and administrative staff were conducted as well. Both pathways were developed in parallel. Subsequent analysis focussed on feasibility, expenditure, clarity and suitability for daily clinical practice. Familiarization with BPMN was relatively quick and intuitive. The use of t.BPM enabled the pragmatic, effective and results-directed creation of outlines for the CP. The development of both CP was completed, from diagnostic evaluation to the adjuvant/neoadjuvant therapy and rehabilitation phases. The integration of checklists, guidelines and important medical or other documents is easily accomplished. A direct integration into the hospital computer system is currently not possible for technical reasons. BPMN and t.BPM are sufficiently

  12. Modeling cancer registration processes with an enhanced activity diagram.

    Science.gov (United States)

    Lyalin, D; Williams, W

    2005-01-01

    Adequate instruments are needed to properly reflect the complexity of routine cancer registry operations in a business model. The activity diagram is a key instrument of the Unified Modeling Language (UML) for the modeling of business processes. The authors aim to improve descriptions of processes in cancer registration, as well as in other public health domains, through enhancements of the activity diagram notation within the standard semantics of UML. The authors introduce a practical approach to enhancing a conventional UML activity diagram, complementing it with the following business process concepts: timeline, duration of individual activities, responsibilities for individual activities within swimlanes, and descriptive text. The authors used an enhanced activity diagram for modeling surveillance processes in the cancer registration domain. A specific example illustrates the use of an enhanced activity diagram to visualize the process of linking cancer registry records with external mortality files. The enhanced activity diagram allows more business concepts to be added to a single diagram and can improve descriptions of processes in cancer registration, as well as in other domains. Its additional features advance the visualization of cancer registration processes, which in turn promotes the clarification of issues related to the process timeline, responsibilities for particular operations, and collaborations among process participants. Our first experiences in a cancer registry best-practices development workshop setting support the usefulness of such an approach.

  13. A dynamic uranium-leaching model for process-control studies

    International Nuclear Information System (INIS)

    Vetter, D.A.; Barker, I.J.; Turner, G.A.

    1989-01-01

    The modelling of the uranium-leaching process, and the logging of data from a plant for the evaluation of the model, are reported. A phenomenological approach was adopted in the development of the model. A set of eight chemical reactions was chosen to represent the complex chemistry of the process, and kinetic expressions for these reactions were incorporated in differential equations representing mass and energy balances. These equations were coded in FORTRAN to form a program that simulated the process, and that allowed averaged and continuous data from the plant to be compared with the model. This allowed the model to be 'tuned', and revealed a number of minor problems with the control infrastructure on the plant. 7 figs., 21 refs
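
    A minimal sketch of the phenomenological approach: kinetic rate expressions embedded in mass balances and integrated as ODEs. For brevity a single first-order dissolution reaction stands in for the paper's set of eight, and the rate constant is an assumption.

```python
import numpy as np
from scipy.integrate import solve_ivp

def leach(t, y, k):
    """Mass balance: solid uranium dissolves into solution at a kinetic rate."""
    u_solid, u_dissolved = y
    rate = k * u_solid          # first-order surface-controlled dissolution
    return [-rate, rate]

k = 0.05                        # rate constant (1/day), assumed
sol = solve_ivp(leach, (0.0, 100.0), [1.0, 0.0], args=(k,),
                t_eval=np.linspace(0.0, 100.0, 11))

for t, extracted in zip(sol.t, sol.y[1]):
    print(f"day {t:5.1f}: fraction leached = {extracted:.3f}")
```

    In a full model each reaction contributes a term to the balances, and parameters such as fragment size enter through the rate expressions, which is where the paper's on-line parameter estimation would act.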

  14. A Software Development Simulation Model of a Spiral Process

    Science.gov (United States)

    Mizell, Carolyn; Malone, Linda

    2007-01-01

    There is a need for simulation models of software development processes other than the waterfall because processes such as spiral development are becoming more and more popular. The use of a spiral process can make the inherently difficult job of cost and schedule estimation even more challenging due to its evolutionary nature, but this allows for a more flexible process that can better meet customers' needs. This paper will present a discrete event simulation model of spiral development that can be used to analyze cost and schedule effects of using such a process in comparison to a waterfall process.

  15. Three-dimensional model analysis and processing

    CERN Document Server

    Yu, Faxin; Luo, Hao; Wang, Pinghui

    2011-01-01

    This book focuses on five hot research directions in 3D model analysis and processing in computer science:  compression, feature extraction, content-based retrieval, irreversible watermarking and reversible watermarking.

  16. Modeling of Multicomponent Mixture Separation Processes Using Hollow fiber Membrane

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sin-Ah; Kim, Jin-Kuk; Lee, Young Moo; Yeo, Yeong-Koo [Hanyang University, Seoul (Korea, Republic of)

    2015-02-15

    So far, most research activities on the modeling of membrane separation processes have focused on binary feed mixtures. But in actual separation operations, binary feeds are hard to find and most separation processes involve multicomponent feed mixtures. In this work, models for membrane separation processes treating multicomponent feed mixtures are developed. Various model types are investigated and the validity of the proposed models is analysed based on experimental data obtained using hollow fiber membranes. The proposed separation models show quick convergence and exhibit good tracking performance.

  17. IT vendor selection model by using structural equation model & analytical hierarchy process

    Science.gov (United States)

    Maitra, Sarit; Dominic, P. D. D.

    2012-11-01

    Selecting and evaluating the right vendors is imperative for an organization's global marketplace competitiveness. Improper selection and evaluation of potential vendors can diminish an organization's supply chain performance. Numerous studies have demonstrated that firms consider multiple criteria when selecting key vendors. This research intends to develop a new hybrid model for the vendor selection process with better decision making. The proposed model provides a suitable tool for assisting decision makers and managers to make the right decisions and select the most suitable vendor. This paper proposes a hybrid model based on the Structural Equation Model (SEM) and the Analytical Hierarchy Process (AHP) for long-term strategic vendor selection problems. The five-step framework of the model was designed after a thorough literature study. The proposed hybrid model will be applied in a real-life case study to assess its effectiveness. In addition, what-if analysis will be used for model validation.
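
    A minimal sketch of the AHP step of such a hybrid model: criterion weights obtained as the principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio as a sanity check. The three criteria and the judgments in the matrix are invented for illustration, not the authors' data.

```python
import numpy as np

# pairwise comparisons for three hypothetical criteria: cost, quality, support
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)                 # principal eigenvalue
weights = np.abs(eigvecs[:, i].real)
weights /= weights.sum()                    # normalised priority vector

n = A.shape[0]
ci = (eigvals.real[i] - n) / (n - 1)        # consistency index
cr = ci / 0.58                              # Saaty's random index for n = 3
print(f"weights: {np.round(weights, 3)}, consistency ratio: {cr:.3f}")
```

    A consistency ratio below about 0.1 is conventionally taken to mean the judgments are acceptably coherent; otherwise the comparison matrix should be revisited.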

  18. Fundamentals of Numerical Modelling of Casting Processes

    DEFF Research Database (Denmark)

    Hattel, Jesper Henri; Pryds, Nini; Thorborg, Jesper

    Fundamentals of Numerical Modelling of Casting Processes comprises a thorough presentation of the basic phenomena that need to be addressed in numerical simulation of casting processes. The main philosophy of the book is to present the topics in view of their physical meaning, whenever possible......, rather than relying strictly on mathematical formalism. The book, aimed both at the researcher and the practicing engineer, as well as the student, is naturally divided into four parts. Part I (Chapters 1-3) introduces the fundamentals of modelling in a 1-dimensional framework. Part II (Chapter 4...

  19. Reversibility in Quantum Models of Stochastic Processes

    Science.gov (United States)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ ≥ Cq ≥ E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With sufficiently long code-words, the statistical complexity becomes time-symmetric - a feature apparently novel to this quantum representation. This result has ramifications for the compression of classical information in quantum computing and quantum communication technology.

  20. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  1. Modeling spatial processes with unknown extremal dependence class

    KAUST Repository

    Huser, Raphaël G.

    2017-03-17

    Many environmental processes exhibit weakening spatial dependence as events become more extreme. Well-known limiting models, such as max-stable or generalized Pareto processes, cannot capture this, which can lead to a preference for models that exhibit a property known as asymptotic independence. However, weakening dependence does not automatically imply asymptotic independence, and whether the process is truly asymptotically (in)dependent is usually far from clear. The distinction is key as it can have a large impact upon extrapolation, i.e., the estimated probabilities of events more extreme than those observed. In this work, we present a single spatial model that is able to capture both dependence classes in a parsimonious manner, and with a smooth transition between the two cases. The model covers a wide range of possibilities from asymptotic independence through to complete dependence, and permits weakening dependence of extremes even under asymptotic dependence. Censored likelihood-based inference for the implied copula is feasible in moderate dimensions due to closed-form margins. The model is applied to oceanographic datasets with ambiguous true limiting dependence structure.

  2. SPEEDUP modeling of the defense waste processing facility at the SRS

    International Nuclear Information System (INIS)

    Smith, F.G. III.

    1997-01-01

    A computer model has been developed for the dynamic simulation of batch process operations within the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). The DWPF chemically treats high level waste materials from the site tank farm and vitrifies the resulting slurry into a borosilicate glass for permanent disposal. The DWPF consists of three major processing areas: the Salt Processing Cell (SPC), the Chemical Processing Cell (CPC) and the Melt Cell. A fully integrated model of these process units has been developed using the SPEEDUP(TM) software from Aspen Technology. Except for glass production in the Melt Cell, all of the chemical operations within DWPF are batch processes. Since SPEEDUP is designed for dynamic modeling of continuous processes, considerable effort was required to devise batch process algorithms. This effort was successful and the model is able to simulate batch operations and the dynamic behavior of the process. The model also includes an optimization calculation that maximizes the waste content in the final glass product. In this paper, we will describe the process model in some detail and present preliminary results from a few simulation studies

  3. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    Energy Technology Data Exchange (ETDEWEB)

    E. Sonnenthal; N. Spycher

    2001-02-05

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the "Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report", Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and the "Technical Work Plan for Nearfield Environment Thermal Analyses and Testing" (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: (1) Performance Assessment (PA); (2) Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); (3) UZ Flow and Transport Process Model Report (PMR); and (4) Near-Field Environment (NFE) PMR. The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and the chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data

  4. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Sonnenthal, E.

    2001-01-01

    The purpose of this Analysis/Model Report (AMR) is to document the Near-Field Environment (NFE) and Unsaturated Zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrologic-chemical (THC) processes on unsaturated zone flow and transport. This is in accordance with the "Technical Work Plan (TWP) for Unsaturated Zone Flow and Transport Process Model Report", Addendum D, Attachment D-4 (Civilian Radioactive Waste Management System (CRWMS) Management and Operating Contractor (M&O) 2000 [153447]) and the "Technical Work Plan for Nearfield Environment Thermal Analyses and Testing" (CRWMS M&O 2000 [153309]). These models include the Drift Scale Test (DST) THC Model and several THC seepage models. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal loading conditions, and predict the chemistry of waters and gases entering potential waste-emplacement drifts. The intended use of this AMR is to provide input for the following: Performance Assessment (PA); Near-Field Environment (NFE) PMR; Abstraction of Drift-Scale Coupled Processes AMR (ANL-NBS-HS-000029); and UZ Flow and Transport Process Model Report (PMR). The work scope for this activity is presented in the TWPs cited above, and summarized as follows: continue development of the repository drift-scale THC seepage model used in support of the TSPA in-drift geochemical model; incorporate heterogeneous fracture property realizations; study sensitivity of results to changes in input data and mineral assemblage; validate the DST model by comparison with field data; perform simulations to predict mineral dissolution and precipitation and their effects on fracture properties and the chemistry of water (but not flow rates) that may seep into drifts; submit modeling results to the TDMS and document the models. The model development, input data, sensitivity and validation studies described in this AMR are

  5. Investigations on Temperature Fields during Laser Beam Melting by Means of Process Monitoring and Multiscale Process Modelling

    Directory of Open Access Journals (Sweden)

    J. Schilp

    2014-07-01

    Process monitoring and modelling can contribute to fostering the industrial relevance of additive manufacturing. Process-related temperature gradients and thermal inhomogeneities cause residual stresses and distortions and influence the microstructure. Variations in wall thickness can cause heat accumulations. These occur predominantly in filigree part areas and can be detected by utilizing off-axis thermographic monitoring during the manufacturing process. In addition, numerical simulation models on the scale of whole parts can enable an analysis of temperature fields upstream of the build process. In a microscale domain, modelling of several exposed single hatches allows temperature investigations at a high spatial and temporal resolution. Within this paper, FEM-based micro- and macroscale modelling approaches as well as an experimental setup for thermographic monitoring are introduced. By discussing and comparing experimental data with simulation results in terms of temperature distributions, both the potential of numerical approaches and the complexity of determining suitable, computation-time-efficient process models are demonstrated. This paper contributes to the vision of adjusting the transient temperature field during manufacturing in order to improve the resulting part's quality by simulation-based process design upstream of the build process and by inline process monitoring.

  6. An ecological process model of systems change.

    Science.gov (United States)

    Peirson, Leslea J; Boydell, Katherine M; Ferguson, H Bruce; Ferris, Lorraine E

    2011-06-01

    In June 2007 the American Journal of Community Psychology published a special issue focused on theories, methods and interventions for systems change which included calls from the editors and authors for theoretical advancement in this field. We propose a conceptual model of systems change that integrates familiar and fundamental community psychology principles (succession, interdependence, cycling of resources, adaptation) and accentuates a process orientation. To situate our framework we offer a definition of systems change and a brief review of the ecological perspective and principles. The Ecological Process Model of Systems Change is depicted, described and applied to a case example of policy driven systems level change in publicly funded social programs. We conclude by identifying salient implications for thinking and action which flow from the Model.

  7. Model based methods and tools for process systems engineering

    DEFF Research Database (Denmark)

    Gani, Rafiqul

    need to be integrated with work-flows and data-flows for specific product-process synthesis-design problems within a computer-aided framework. The framework therefore should be able to manage knowledge-data, models and the associated methods and tools needed by specific synthesis-design work...... of model based methods and tools within a computer aided framework for product-process synthesis-design will be highlighted.......Process systems engineering (PSE) provides means to solve a wide range of problems in a systematic and efficient manner. This presentation will give a perspective on model based methods and tools needed to solve a wide range of problems in product-process synthesis-design. These methods and tools...

  8. The avalanche process of the multilinear fiber bundles model

    International Nuclear Information System (INIS)

    Hao, Da-Peng; Tang, Gang; Xun, Zhi-Peng; Xia, Hui; Han, Kui

    2012-01-01

    In order to describe the smooth nonlinear constitutive behavior in the process of fracture of ductile micromechanical structures, the multilinear fiber bundle model was constructed, based on the bilinear fiber bundle model. In the multilinear fiber bundle model, the Young modulus of a fiber is assumed to decay K_max times before the final failure occurs. For the large K_max region, this model can describe the smooth nonlinear constitutive behavior well. By means of analytical approximation and numerical simulation, we show that the two critical parameters, i.e. the decay ratio of the Young modulus and the maximum number of decays, have substantial effects on the failure process of the bundle. From a macroscopic view, the model can provide various shapes of constitutive curves, which represent diverse kinds of tensile fracture processes. However, at the microscopic scale, the statistical properties of the model are in accord with the classical fiber bundle model. (paper)
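
    A minimal simulation sketch of a multilinear bundle under equal load sharing: each fiber's modulus decays by a fixed ratio at successive random thresholds and the fiber fails after K_max decays. Uniformly distributed thresholds and the parameter values below are simplifying assumptions, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
n_fibers, k_max, decay = 100_000, 4, 0.5

# each fiber draws K_max + 1 ordered thresholds; the last one is final failure
thresholds = np.sort(rng.uniform(0.0, 1.0, (n_fibers, k_max + 1)), axis=1)

strains = np.linspace(0.0, 1.2, 200)
stress = np.empty_like(strains)
for j, eps in enumerate(strains):
    n_decays = (thresholds < eps).sum(axis=1)     # decays suffered so far
    modulus = np.where(n_decays > k_max, 0.0, decay ** n_decays)
    stress[j] = np.mean(modulus * eps)            # equal-load-sharing bundle stress

peak = strains[np.argmax(stress)]
print(f"macroscopic strength {stress.max():.3f} at strain {peak:.2f}")
```

    Increasing k_max smooths the constitutive curve, which is the qualitative effect the abstract attributes to the maximum number of decays.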

  9. Effect of Linked Rules on Business Process Model Understanding

    DEFF Research Database (Denmark)

    Wang, Wei; Indulska, Marta; Sadiq, Shazia

    2017-01-01

    Business process models are widely used in organizations by information systems analysts to represent complex business requirements and by business users to understand business operations and constraints. This understanding is extracted from graphical process models as well as business rules. Prior...

  10. Automated Signal Processing Applied to Volatile-Based Inspection of Greenhouse Crops

    Science.gov (United States)

    Jansen, Roel; Hofstee, Jan Willem; Bouwmeester, Harro; van Henten, Eldert

    2010-01-01

    Gas chromatograph-mass spectrometers (GC-MS) have been used, and have shown utility, for volatile-based inspection of greenhouse crops. However, a widely recognized difficulty associated with GC-MS application is the large and complex data generated by this instrument. As a consequence, experienced analysts are often required to process these data in order to determine the concentrations of the volatile organic compounds (VOCs) of interest. Manual processing is time-consuming, labour intensive and may be subject to errors due to fatigue. The objective of this study was to assess whether or not GC-MS data can also be automatically processed in order to determine the concentrations of crop-health-associated VOCs in a greenhouse. An experimental dataset consisting of twelve data files was processed both manually and automatically to address this question. Manual processing was based on simple peak integration, while the automatic processing relied on the algorithms implemented in the MetAlign™ software package. Automatic processing of the experimental dataset yielded concentrations similar to those obtained by manual processing. These results demonstrate that GC-MS data can be automatically processed to accurately determine the concentrations of crop-health-associated VOCs in a greenhouse. When processing GC-MS data automatically, noise reduction, alignment, baseline correction and normalisation are required. PMID:22163594
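
    A minimal sketch of two of the steps the abstract lists as required for automatic processing: baseline correction (here a crude rolling minimum) and total-signal normalisation. The chromatogram is synthetic, and MetAlign's actual algorithms are considerably more refined.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 1000)                     # retention time (min)
peaks = 5 * np.exp(-(t - 3) ** 2 / 0.01) + 3 * np.exp(-(t - 7) ** 2 / 0.02)
signal = peaks + 0.5 * t + rng.normal(0, 0.05, t.size)   # drifting baseline

# rolling-minimum baseline estimate, subtracted from the raw trace
window = 50
baseline = np.array([signal[max(0, i - window):i + window + 1].min()
                     for i in range(signal.size)])
corrected = signal - baseline

tic = corrected.sum()                            # total signal ("TIC")
normalised = corrected / tic                     # comparable across samples
print(f"total signal before normalisation: {tic:.1f}; after: {normalised.sum():.1f}")
```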

  11. Near Field Environment Process Model Report

    Energy Technology Data Exchange (ETDEWEB)

    R.A. Wagner

    2000-11-14

    Waste emplacement and activities associated with construction of a repository system will potentially change environmental conditions within the repository system. These environmental changes principally result from heat generated by the decay of the radioactive waste, which elevates temperatures within the repository system. Elevated temperatures affect the distribution of water, increase the kinetic rates of geochemical processes, and cause stresses to change in magnitude and orientation from the stresses resulting from the overlying rock and from underground construction activities. The recognition of this evolving environment has been reflected in activities, studies and discussions generally associated with what has been termed the Near-Field Environment (NFE). The NFE interacts directly with waste packages and engineered barriers and potentially changes the fluid composition and flow conditions within the mountain. As such, the NFE defines the environment for assessing the performance of a potential Monitored Geologic Repository at Yucca Mountain, Nevada. The NFE evolves over time, and therefore is not amenable to direct characterization or measurement in the ambient system. Analysis or assessment of the NFE must rely upon projections based on tests and models that encompass the long-term processes of the evolution of this environment. This NFE Process Model Report (PMR) describes the analyses and modeling based on the current understanding of the evolution of the near-field within the rock mass extending outward from the drift wall.

  12. Modelling and control of dynamic systems using gaussian process models

    CERN Document Server

    Kocijan, Juš

    2016-01-01

    This monograph opens up new horizons for engineers and researchers in academia and in industry dealing with or interested in new developments in the field of system identification and control. It emphasizes guidelines for working solutions and practical advice for their implementation rather than the theoretical background of Gaussian process (GP) models. The book demonstrates the potential of this recent development in probabilistic machine-learning methods and gives the reader an intuitive understanding of the topic. The current state of the art is treated along with possible future directions for research. Systems control design relies on mathematical models and these may be developed from measurement data. This process of system identification, when based on GP models, can play an integral part of control design in data-based control and its description as such is an essential aspect of the text. The background of GP regression is introduced first with system identification and incorporation of prior know...
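
    A minimal sketch of GP-based system identification in the spirit of the monograph: a one-step-ahead NARX model y[k+1] = f(y[k], u[k]) learned by Gaussian process regression, which returns a predictive variance alongside each prediction. The simulated first-order plant and the kernel choice are illustrative assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# simulate a simple nonlinear first-order plant to obtain identification data
u = rng.uniform(-1, 1, 300)
y = np.zeros(301)
for k in range(300):
    y[k + 1] = 0.8 * y[k] + 0.4 * np.tanh(u[k]) + rng.normal(0, 0.01)

X = np.column_stack([y[:-1], u])        # regressors: current output and input
Y = y[1:]                               # target: next output

gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
gp.fit(X, Y)

# the GP gives both a prediction and a measure of its own uncertainty
mean, std = gp.predict(np.array([[0.5, 0.2]]), return_std=True)
print(f"predicted next output: {mean[0]:.3f} +/- {2 * std[0]:.3f}")
```

    The predictive variance is what makes GP models attractive for control design: a controller can be made cautious where the identified model admits it knows little.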

  13. Physical and mathematical modelling of extrusion processes

    DEFF Research Database (Denmark)

    Arentoft, Mogens; Gronostajski, Z.; Niechajowics, A.

    2000-01-01

    The main objective of the work is to study the extrusion process using physical modelling and to compare the findings of the study with finite element predictions. The possibilities and advantages of the simultaneous application of both of these methods for the analysis of metal forming processes...

  14. Bridging process-based and empirical approaches to modeling tree growth

    Science.gov (United States)

    Harry T. Valentine; Annikki Makela; Annikki Makela

    2005-01-01

    The gulf between process-based and empirical approaches to modeling tree growth may be bridged, in part, by the use of a common model. To this end, we have formulated a process-based model of tree growth that can be fitted and applied in an empirical mode. The growth model is grounded in pipe model theory and an optimal control model of crown development. Together, the...

  15. Drift-Scale Coupled Processes (DST and THC Seepage) Models

    International Nuclear Information System (INIS)

    Dixon, P.

    2004-01-01

    The purpose of this Model Report (REV02) is to document the unsaturated zone (UZ) models used to evaluate the potential effects of coupled thermal-hydrological-chemical (THC) processes on UZ flow and transport. This Model Report has been developed in accordance with the ''Technical Work Plan for: Performance Assessment Unsaturated Zone'' (Bechtel SAIC Company, LLC (BSC) 2002 [160819]). The technical work plan (TWP) describes planning information pertaining to the technical scope, content, and management of this Model Report in Section 1.12, Work Package AUZM08, ''Coupled Effects on Flow and Seepage''. The plan for validation of the models documented in this Model Report is given in Attachment I, Model Validation Plans, Section I-3-4, of the TWP. Except for variations in acceptance criteria (Section 4.2), there were no deviations from this TWP. This report was developed in accordance with AP-SIII.10Q, ''Models''. This Model Report documents the THC Seepage Model and the Drift Scale Test (DST) THC Model. The THC Seepage Model is a drift-scale process model for predicting the composition of gas and water that could enter waste emplacement drifts and the effects of mineral alteration on flow in rocks surrounding drifts. The DST THC model is a drift-scale process model relying on the same conceptual model and much of the same input data (i.e., physical, hydrological, thermodynamic, and kinetic) as the THC Seepage Model. The DST THC Model is the primary method for validating the THC Seepage Model. The DST THC Model compares predicted water and gas compositions, as well as mineral alteration patterns, with observed data from the DST. These models provide the framework to evaluate THC coupled processes at the drift scale, predict flow and transport behavior for specified thermal-loading conditions, and predict the evolution of mineral alteration and fluid chemistry around potential waste emplacement drifts. The DST THC Model is used solely for the validation of the THC

  16. Mathematical modeling of the voloxidation process. Final report

    International Nuclear Information System (INIS)

    Stanford, T.G.

    1979-06-01

    A mathematical model of the voloxidation process, a head-end reprocessing step for the removal of volatile fission products from spent nuclear fuel, has been developed. Three types of voloxidizer operation have been considered: co-current operation, in which the gas and solid streams flow in the same direction; countercurrent operation, in which the gas and solid streams flow in opposite directions; and semi-batch operation, in which the gas stream passes through the reactor while the solids remain in it and are processed batchwise. Because of the complexity of the physical and chemical processes which occur during the voloxidation process and the lack of currently available kinetic data, a global kinetic model has been adopted for this study. Test cases for each mode of operation have been simulated using representative values of the model parameters. To process 714 kg/day of spent nuclear fuel, using an oxidizing atmosphere containing 20 mole percent oxygen, it was found that a reactor 0.7 m in diameter and 2.49 m in length would be required for both co-current and countercurrent modes of operation, while for semi-batch operation a 0.3 m3 reactor and an 88,200 s batch processing time would be required

  17. Modeling of flash calcination process during clay activation

    International Nuclear Information System (INIS)

    Borrajo Perez, Ruben; Gonzalez Bayon, Juan Jose; Sanchez Rodriguez, Andy A.

    2011-01-01

    Pozzolanic activity in some materials can be increased by means of different processes, among which thermal activation is one of the most promising. The activation process, occurring at high temperatures and velocities, produces a material with better characteristics. In the last few years, pozzolans with high reactivity during the early days of curing have been produced. Temperature is an important parameter in the activation process and, as a consequence, activation units must allow for temperature variation so that different raw materials, each with different characteristics, can be used. Considering the high market price of kaolin, new materials are being tested, among them clayey soils, which after a sedimentation process yield a clay that has turned out to be a suitable raw material when the kinetics of the pozzolanic reaction is considered. Additionally, other materials with higher kaolin contents are being used with good results. This paper deals with the modeling of the thermal, hydrodynamic and dehydroxylation processes undergone by solid particles exposed to a hot gas stream. The models employed are discussed; the velocity and temperature of the particles are obtained as functions of the carrier gas parameters. The calculations include heat losses, and finally the model predicts the residence time needed to complete the activation process. (author)
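
    A minimal sketch of the particle-heating part of such a model: a lumped-capacitance energy balance for a clay particle carried by hot gas, giving the residence time needed to reach an assumed dehydroxylation temperature. All property values are illustrative assumptions.

```python
import numpy as np

d_p = 20e-6                    # particle diameter (m), assumed
rho_p = 2600.0                 # particle density (kg/m^3), assumed
cp_p = 900.0                   # particle heat capacity (J/kg K), assumed
h = 800.0                      # gas-particle heat transfer coefficient (W/m^2 K), assumed
T_gas, T0 = 900.0, 25.0        # gas and initial particle temperature (C)
T_target = 600.0               # approximate dehydroxylation onset (C), assumed

# lumped capacitance: dT/dt = h*A/(m*cp) * (T_gas - T), with A/V = 6/d for a sphere
tau = rho_p * cp_p * d_p / (6.0 * h)        # thermal time constant of the particle
t_needed = tau * np.log((T_gas - T0) / (T_gas - T_target))
print(f"time constant {tau*1e3:.2f} ms, time to {T_target:.0f} C: {t_needed*1e3:.2f} ms")
```

    For micron-scale particles the time constant is on the order of milliseconds, which is why flash calcination can activate clay during a very short residence in the hot gas stream.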

  18. Method for modeling social care processes for national information exchange.

    Science.gov (United States)

    Miettinen, Aki; Mykkänen, Juha; Laaksonen, Maarit

    2012-01-01

    Finnish social services include 21 service commissions of social welfare including Adoption counselling, Income support, Child welfare, Services for immigrants and Substance abuse care. This paper describes the method used for process modeling in the National project for IT in Social Services in Finland (Tikesos). The process modeling in the project aimed to support common national target state processes from the perspective of national electronic archive, increased interoperability between systems and electronic client documents. The process steps and other aspects of the method are presented. The method was developed, used and refined during the three years of process modeling in the national project.

  19. Guideline validation in multiple trauma care through business process modeling.

    Science.gov (United States)

    Stausberg, Jürgen; Bilir, Hüseyin; Waydhas, Christian; Ruchholtz, Steffen

    2003-07-01

    Clinical guidelines can improve the quality of care in multiple trauma. In our Department of Trauma Surgery a specific guideline is available in paper form as a set of flowcharts. This format is appropriate for use by experienced physicians but insufficient for electronic support of learning, workflow and process optimization. A formal and logically consistent version represented with a standardized meta-model is necessary for automatic processing. In our project we transferred the paper-based guideline into an electronic format and analyzed its structure with respect to formal errors. Several errors were detected, falling into seven error categories. The errors were corrected to reach a formally and logically consistent process model. In a second step the clinical content of the guideline was revised interactively using a process-modeling tool. Our study reveals that guideline development should be assisted by process-modeling tools, which check the content against a meta-model. The meta-model itself could support the domain experts in formulating their knowledge systematically. To assure the sustainability of guideline development, a representation independent of specific applications or providers is necessary. Clinical guidelines could then additionally be used for eLearning, process optimization and workflow management.
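
    As an illustration of the kind of formal structure check described above, the sketch below encodes a toy flowchart as a directed graph and flags two error categories (unreachable steps and dead ends); the node names and error categories are invented for the example and are not taken from the paper's meta-model.

    ```python
    # A small sketch of a formal flowchart check: detect structural errors
    # (unreachable steps, dead ends) in a guideline encoded as a directed
    # graph. Node names and error categories are illustrative only.
    from collections import deque

    flow = {
        "start": ["assess_airway"],
        "assess_airway": ["intubate", "monitor"],
        "intubate": ["monitor"],
        "monitor": [],            # dead end: no terminal marker
        "x_ray": ["monitor"],     # unreachable from start
    }

    def check(graph, start="start"):
        reachable, queue = {start}, deque([start])
        while queue:
            for succ in graph[queue.popleft()]:
                if succ not in reachable:
                    reachable.add(succ)
                    queue.append(succ)
        errors = [f"unreachable node: {n}" for n in graph if n not in reachable]
        errors += [f"dead end (no successor): {n}"
                   for n in reachable if not graph[n] and n != "end"]
        return errors

    for e in check(flow):
        print(e)
    ```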

  20. Business process model abstraction : a definition, catalog, and survey

    NARCIS (Netherlands)

    Smirnov, S.; Reijers, H.A.; Weske, M.H.; Nugteren, T.

    2012-01-01

    The discipline of business process management aims at capturing, understanding, and improving work in organizations by using process models as central artifacts. Since business-oriented tasks require different information from such models to be highlighted, a range of abstraction techniques has been

  1. On the suitability of BPMN for business process modelling

    NARCIS (Netherlands)

    Wohed, P.; Aalst, van der W.M.P.; Dumas, M.; Hofstede, ter A.H.M.; Russell, N.C.; Dustdar, S.; Fiadeiro, J.L.; Sheth, A.

    2006-01-01

    In this paper we examine the suitability of the Business Process Modelling Notation (BPMN) for business process modelling, using the Workflow Patterns as an evaluation framework. The Workflow Patterns are a collection of patterns developed for assessing control-flow, data and resource capabilities

  2. Process modeling for the Integrated Nonthermal Treatment System (INTS) study

    Energy Technology Data Exchange (ETDEWEB)

    Brown, B.W.

    1997-04-01

    This report describes the process modeling done in support of the Integrated Nonthermal Treatment System (INTS) study. This study was performed to supplement the Integrated Thermal Treatment System (ITTS) study and comprises five conceptual treatment systems that treat DOE contact-handled mixed low-level wastes (MLLW) at temperatures of less than 350 °F. ASPEN PLUS, a chemical process simulator, was used to model the systems. Nonthermal treatment systems were developed as part of the INTS study and include sufficient processing steps to treat the entire inventory of MLLW. The final result of the modeling is a process flowsheet with a detailed mass and energy balance. In contrast to the ITTS study, which modeled only the main treatment system, the INTS study modeled each of the various processing steps with ASPEN PLUS, release 9.1-1. Trace constituents, such as radionuclides and minor pollutant species, were not included in the calculations.

  3. Continuation-like semantics for modeling structural process anomalies

    Directory of Open Access Journals (Sweden)

    Grewe Niels

    2012-09-01

    Background: Biomedical ontologies usually encode knowledge that applies always, or at least most of the time, that is, in normal circumstances. But for some applications, like phenotype ontologies, it is becoming increasingly important to represent information about aberrations from a norm. These aberrations may be modifications of physiological structures, but also modifications of biological processes. Methods: To facilitate precise definitions of process-related phenotypes, such as delayed eruption of the primary teeth or disrupted ocular pursuit movements, I introduce a modeling approach that draws inspiration from the use of continuations in the analysis of programming languages and apply a similar idea to ontological modeling. This approach characterises processes by describing their outcome up to a certain point and the way they will continue in the canonical case. Definitions of process types are then given in terms of their continuations, and anomalous phenotypes are defined by their differences from the canonical definitions. Results: The resulting model is capable of accurately representing structural process anomalies. It allows distinguishing between different anomaly kinds (delays, interruptions), gives identity criteria for interrupted processes, and explains why normal and anomalous process instances can be subsumed under a common type, thus establishing the connection between canonical and anomalous process-related phenotypes. Conclusion: This paper shows how to give semantically rich definitions of process-related phenotypes. These allow the application areas of phenotype ontologies to be expanded beyond literature annotation and the establishment of genotype-phenotype associations to the detection of anomalies in suitably encoded datasets.
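
    To make the continuation idea more tangible, here is a small, hypothetical sketch in which a process state carries its outcome so far plus its canonical continuation, and an anomaly (a delay) is detected as a deviation from that continuation; the classes and the tooth-eruption example are illustrative only, not the paper's formalism.

    ```python
    # A small sketch of the continuation idea: a process type is
    # characterised by its outcome so far plus its canonical continuation,
    # and anomalies are defined as deviations from that continuation.
    from dataclasses import dataclass

    @dataclass
    class Continuation:
        event: str
        canonical_start_month: float   # when the continuation should begin

    @dataclass
    class ProcessState:
        completed: list[str]
        age_months: float
        continuation: Continuation

    def classify(state: ProcessState) -> str:
        c = state.continuation
        if c.event in state.completed:
            return "canonical"
        if state.age_months > c.canonical_start_month:
            return f"delayed: {c.event}"
        return "pending (still within canonical window)"

    eruption = ProcessState(
        completed=["gum swelling"], age_months=14.0,
        continuation=Continuation("eruption of primary incisor", 10.0))
    print(classify(eruption))   # -> delayed: eruption of primary incisor
    ```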

  4. Model reduction for dynamic real-time optimization of chemical processes

    NARCIS (Netherlands)

    Van den Berg, J.

    2005-01-01

    The value of models in process industries becomes apparent in practice and literature where numerous successful applications are reported. Process models are being used for optimal plant design, simulation studies, for off-line and online process optimization. For online optimization applications

  5. Modelling Of Flotation Processes By Classical Mathematical Methods - A Review

    Science.gov (United States)

    Jovanović, Ivana; Miljanović, Igor

    2015-12-01

    Flotation process modelling is not a simple task, mostly because of the process complexity, i.e. the presence of a large number of variables that (to a lesser or a greater extent) affect the final outcome of the separation of mineral particles based on differences in their surface properties. Attempts toward the development of a quantitative predictive model that would fully describe the operation of an industrial flotation plant started in the middle of the past century and continue to this day. This paper gives a review of published research directed toward the development of flotation models based on classical mathematical rules. The description and systematization of classical flotation models were performed according to the available references, with emphasis given exclusively to flotation process modelling, regardless of the model's application in a particular control system. In accordance with contemporary considerations, models were classified as empirical, probabilistic, kinetic and population-balance types. Each model type is presented through the aspects of flotation modelling at the macro and micro process levels.
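
    As a concrete example of the kinetic model family mentioned above, the sketch below evaluates the classical first-order flotation model R(t) = R_inf(1 - e^(-kt)); the ultimate recovery and rate constant are illustrative assumptions.

    ```python
    # A minimal sketch of the classical first-order kinetic flotation model,
    # R(t) = R_inf * (1 - exp(-k t)); parameter values are illustrative.
    import math

    def recovery(t_min: float, r_inf: float = 0.90, k_per_min: float = 0.55) -> float:
        """Cumulative recovery after t_min minutes of flotation."""
        return r_inf * (1.0 - math.exp(-k_per_min * t_min))

    for t in (1, 2, 5, 10):
        print(f"t = {t:2d} min -> recovery = {recovery(t):.1%}")
    ```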

  6. Analytical and regression models of glass rod drawing process

    Science.gov (United States)

    Alekseeva, L. B.

    2018-03-01

    The process of drawing glass rods (light guides) is being studied. The parameters of the process affecting the quality of the light guide have been determined. To solve the problem, mathematical models based on general equations of continuum mechanics are used. The conditions for the stable flow of the drawing process have been found, which are determined by the stability of the motion of the glass mass in the formation zone to small uncontrolled perturbations. The sensitivity of the formation zone to perturbations of the drawing speed and viscosity is estimated. Experimental models of the drawing process, based on the regression analysis methods, have been obtained. These models make it possible to customize a specific production process to obtain light guides of the required quality. They allow one to find the optimum combination of process parameters in the chosen area and to determine the required accuracy of maintaining them at a specified level.

  7. Detection and quantification of flow consistency in business process models

    DEFF Research Database (Denmark)

    Burattin, Andrea; Bernstein, Vered; Neurauter, Manuel

    2017-01-01

    Business process models abstract complex business processes by representing them as graphical models. Their layout, as determined by the modeler, may have an effect when these models are used. However, this effect is currently not fully understood. In order to systematically study this effect, a basic set of measurable key visual features is proposed, depicting the layout properties that are meaningful to the human user. The aim of this research is thus twofold: first, to empirically identify key visual features of business process models which are perceived as meaningful to the user and second, to show how such features can be quantified into computational metrics, which are applicable to business process models. We focus on one particular feature, consistency of flow direction, and show the challenges that arise when transforming it into a precise metric. We propose three different metrics…
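
    One plausible way to turn flow direction into a number, sketched below, is the share of edges that point in the diagram's dominant left-to-right direction; this is an illustrative metric and not necessarily one of the three metrics proposed in the paper.

    ```python
    # A sketch of one plausible flow-consistency metric: the share of edges
    # whose direction agrees with the dominant left-to-right axis of the
    # diagram. Node coordinates and edges are invented for the example.
    pos = {"a": (0, 0), "b": (1, 0), "c": (2, 0), "d": (1, 1)}
    edges = [("a", "b"), ("b", "c"), ("d", "b"), ("c", "a")]

    def flow_consistency(pos, edges):
        rightward = sum(1 for s, t in edges if pos[t][0] > pos[s][0])
        return rightward / len(edges)

    print(f"consistency of flow direction: {flow_consistency(pos, edges):.2f}")
    ```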

  8. Structured spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable for statistical analysis and are applied to a forest-fire dataset consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent.

  9. Structured Spatio-temporal shot-noise Cox point process models, with a view to modelling forest fires

    DEFF Research Database (Denmark)

    Møller, Jesper; Diaz-Avalos, Carlos

    2010-01-01

    Spatio-temporal Cox point process models with a multiplicative structure for the driving random intensity, incorporating covariate information into temporal and spatial components, and with a residual term modelled by a shot-noise process, are considered. Such models are flexible and tractable for statistical analysis and are applied to a forest-fire data set consisting of 2796 days and 5834 spatial locations of fires. The model is compared with a spatio-temporal log-Gaussian Cox point process model, and likelihood-based methods are discussed to some extent.
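
    For readers unfamiliar with the model class, the sketch below simulates a purely spatial shot-noise Cox process (a Thomas-type cluster process: Poisson cluster centres with a Gaussian shot-noise kernel) on the unit square; the parameters are illustrative, and the paper's models add temporal and covariate structure on top of this.

    ```python
    # Minimal sketch simulating a purely spatial shot-noise Cox process on
    # the unit square: Poisson cluster centres, Gaussian shot-noise kernel.
    import numpy as np

    rng = np.random.default_rng(0)
    kappa, mu, sigma = 20.0, 10.0, 0.02   # centre intensity, mean offspring, kernel sd

    n_centres = rng.poisson(kappa)
    centres = rng.random((n_centres, 2))
    points = []
    for c in centres:
        n_off = rng.poisson(mu)
        offspring = c + rng.normal(0.0, sigma, size=(n_off, 2))
        # keep only points falling inside the observation window
        points.extend(p for p in offspring if (0 <= p).all() and (p <= 1).all())

    print(f"{n_centres} clusters, {len(points)} points retained")
    ```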

  10. Development of climate data storage and processing model

    Science.gov (United States)

    Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.

    2016-11-01

    We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and analysis of the impact of climate change on socio-economic processes on local and regional scales. The model is based on a "shared nothing" distributed computing architecture and assumes a computing network where each computing node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing, their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
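
    A minimal sketch of the metadata lookup described above: a small catalogue maps dataset descriptions to node locations so that a query can be routed to the node holding the matching netCDF collection; the schema, dataset names and node URLs are invented for illustration.

    ```python
    # A minimal sketch of a metadata catalogue that routes a query to the
    # node storing the matching netCDF collection. Schema and entries are
    # invented for illustration.
    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("""CREATE TABLE datasets (
        name TEXT, variable TEXT, t_start TEXT, t_end TEXT, node_url TEXT)""")
    con.executemany("INSERT INTO datasets VALUES (?,?,?,?,?)", [
        ("ERA-Interim", "t2m", "1979-01-01", "2016-12-31", "ssh://node1"),
        ("ERA-Interim", "tp",  "1979-01-01", "2016-12-31", "ssh://node2"),
    ])

    row = con.execute(
        "SELECT node_url FROM datasets WHERE variable=? AND t_start<=? AND t_end>=?",
        ("t2m", "1990-01-01", "2000-12-31")).fetchone()
    print("route query to:", row[0])
    ```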

  11. Hydrothermal processing of Hanford tank wastes: Process modeling and control

    International Nuclear Information System (INIS)

    Currier, R.P.

    1994-01-01

    In the Los Alamos National Laboratory (LANL) hydrothermal process, waste streams are first pressurized and heated as they pass through a continuous flow tubular reactor vessel. The waste is maintained at reaction temperatures of 300–550 °C where organic destruction and sludge reformation occur. This report documents LANL activities in process modeling and control undertaken in FY94 to support hydrothermal process development. Key issues discussed include non-ideal flow patterns (e.g. axial dispersion) and their effect on reactor performance, the use and interpretation of inert tracer experiments, and the use of computational fluid mechanics to evaluate novel hydrothermal reactor designs. In addition, the effects of axial dispersion (and simplifications to rate expressions) on the estimated kinetic parameters are explored by non-linear regression to experimental data. Safety-related calculations are reported which estimate the explosion limits of effluent gases and the fate of hydrogen as it passes through the reactor. Development and numerical solution of a generalized one-dimensional mathematical model is also summarized. The difficulties encountered in using commercially available software to correlate the behavior of high temperature, high pressure aqueous electrolyte mixtures are summarized. Finally, details of the control system and experiments conducted to empirically determine the system response are reported
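
    To illustrate the axial-dispersion effect discussed above, the sketch below evaluates the classical Danckwerts closed-closed solution for a first-order reaction in a tubular reactor and compares it with plug flow; the Damkoehler and Peclet numbers are illustrative, not values fitted to the hydrothermal data.

    ```python
    # A short sketch of the axial-dispersion effect: outlet conversion of a
    # first-order reaction in a closed-closed tubular reactor (Danckwerts
    # solution), compared with plug flow. Da and Pe values are illustrative.
    import math

    def dispersion_conversion(da: float, pe: float) -> float:
        a = math.sqrt(1.0 + 4.0 * da / pe)
        num = 4.0 * a * math.exp(pe / 2.0)
        den = ((1.0 + a) ** 2 * math.exp(a * pe / 2.0)
               - (1.0 - a) ** 2 * math.exp(-a * pe / 2.0))
        return 1.0 - num / den

    da = 3.0  # Damkoehler number k*tau
    print(f"plug flow:        X = {1.0 - math.exp(-da):.4f}")
    for pe in (5.0, 20.0, 100.0):
        print(f"dispersion Pe={pe:5.0f}: X = {dispersion_conversion(da, pe):.4f}")
    ```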

  12. A model-based approach to on-line process disturbance management

    International Nuclear Information System (INIS)

    Kim, I.S.

    1988-01-01

    The methodology developed can be applied to the design of a real-time expert system to aid control-room operators in coping with process abnormalities. The approach encompasses diverse functional aspects required for an effective on-line process disturbance management: (1) intelligent process monitoring and alarming, (2) on-line sensor data validation, (3) on-line sensor and hardware (except sensors) fault diagnosis, and (4) real-time corrective measure synthesis. Accomplishment of these functions is made possible through the application of various models: goal-tree success-tree, process monitor-tree, sensor failure diagnosis, and hardware failure diagnosis models. The models used in the methodology facilitate not only the knowledge-acquisition process - a bottleneck in the development of an expert system - but also the reasoning process of the knowledge-based system. These transparent models and model-based reasoning significantly enhance the maintainability of real-time expert systems. The proposed approach was applied to the feedwater control system of a nuclear power plant, and implemented into a real-time expert system, MOAS II, using the expert system shell, PICON, on the LMI machine.

  13. Repairing business process models as retrieved from source code

    NARCIS (Netherlands)

    Fernández-Ropero, M.; Reijers, H.A.; Pérez-Castillo, R.; Piattini, M.; Nurcan, S.; Proper, H.A.; Soffer, P.; Krogstie, J.; Schmidt, R.; Halpin, T.; Bider, I.

    2013-01-01

    The static analysis of source code has become a feasible solution to obtain underlying business process models from existing information systems. Due to the fact that not all information can be automatically derived from source code (e.g., consider manual activities), such business process models

  14. Macro Level Simulation Model Of Space Shuttle Processing

    Science.gov (United States)

    2000-01-01

    The contents include: 1) Space Shuttle Processing Simulation Model; 2) Knowledge Acquisition; 3) Simulation Input Analysis; 4) Model Applications in Current Shuttle Environment; and 5) Model Applications for Future Reusable Launch Vehicles (RLVs). This paper is presented in viewgraph form.

  15. Three-dimensional model for fusion processes

    International Nuclear Information System (INIS)

    Olson, A.P.

    1984-01-01

    Active galactic nuclei (AGN) emit unusual spectra of radiation which are interpreted to signify extreme distance, extreme power, or both. The status of AGNs was recently reviewed by Balick and Heckman. It seems that the greatest conceptual difficulty with understanding AGNs is how to form a coherent phenomenological model of their properties. What drives the galactic engine? What and where are the mass flows of fuel to this engine? Is there more than one engine? Do the engines have any symmetry properties? Is the observed radiation isotropically emitted from the source? If it is polarized, what causes the polarization? Why is there a roughly spherical cloud of ionized gas about the center of our own galaxy, the Milky Way? The purpose of this paper is to discuss a new model, based on fusion processes which are not axisymmetric, uniform, isotropic, or even time-invariant. Then, the relationship to these questions will be developed. A unified model of fusion processes applicable to many astronomical phenomena will be proposed and discussed

  16. Experiences and Comparison Study of EPC & UML For Business Process & IS Modeling

    OpenAIRE

    Md. Rashedul Islam; Md. Rofiqul Islam; Md. Shariful Alam; Md. Shafiul Azam

    2011-01-01

    Business process modeling is an approach by which we can analyze and integrate business processes. Using business process modeling we can represent the current and future processes of a business/organization/enterprise. Business process modeling is a prerequisite and essential step in implementing a business or building any automation system. In this paper, we present our experience of business process modeling for an organization. This paper presents a detailed description of business process ...

  17. Process Cost Modeling for Multi-Disciplinary Design Optimization

    Science.gov (United States)

    Bao, Han P.; Freeman, William (Technical Monitor)

    2002-01-01

    For early design concepts, the conventional approach to cost is normally some kind of parametric weight-based cost model. There is now ample evidence that this approach can be misleading and inaccurate. By the nature of its development, a parametric cost model requires historical data and is valid only if the new design is analogous to those for which the model was derived. Advanced aerospace vehicles have no historical production data and bear little resemblance to the vehicles of the past. Using an existing weight-based cost model would only lead to errors and distortions of the true production cost. This report outlines the development of a process-based cost model in which the physical elements of the vehicle are costed according to a first-order dynamics model. This theoretical cost model, first advocated by early work at MIT, has been expanded to cover the basic structures of an advanced aerospace vehicle. Elemental costs based on the geometry of the design can be summed to provide an overall estimate of the total production cost for a design configuration. This capability to directly link any design configuration to a realistic cost estimate is a key requirement for high-payoff MDO problems. Another important consideration in this report is the handling of part or product complexity. Here the concept of cost modulus is introduced to take into account variability due to different materials, sizes, shapes, precision of fabrication, and equipment requirements. The most important implication of the development of the proposed process-based cost model is that different design configurations can now be quickly related to their cost estimates in a seamless calculation process easily implemented on any spreadsheet tool. In successive sections, the report addresses the issues of cost modeling as follows. First, an introduction is presented to provide the background for the research work. Next, a quick review of cost estimation techniques is made with the intention to
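
    The elemental summation with cost moduli can be illustrated with a short sketch; the part names, base costs and modulus values below are invented for the example and are not from the report.

    ```python
    # A minimal sketch of the elemental summation: each part's base
    # fabrication cost is scaled by cost moduli for material and precision,
    # and the elements are summed to a vehicle total. All values invented.
    parts = [
        # (name, base cost from process model ($), material modulus, precision modulus)
        ("wing skin panel", 120_000.0, 1.8, 1.2),
        ("fuselage frame",   95_000.0, 1.4, 1.0),
        ("cryo tank dome",  210_000.0, 2.3, 1.5),
    ]

    total = sum(base * m_material * m_precision
                for _, base, m_material, m_precision in parts)
    for name, base, m_mat, m_prec in parts:
        print(f"{name:16s}: ${base * m_mat * m_prec:>12,.0f}")
    print(f"{'total':16s}: ${total:>12,.0f}")
    ```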

  18. A functional-dynamic reflection on participatory processes in modeling projects.

    Science.gov (United States)

    Seidl, Roman

    2015-12-01

    The participation of nonscientists in modeling projects/studies is increasingly employed to fulfill different functions. However, it has not been well investigated whether, and how explicitly, modeling projects reflect these functions and the dynamics of a participatory process. In this review study, I explore participatory modeling projects from a functional-dynamic process perspective. The main differences among projects relate to the functions of participation (most often, more than one per project can be identified) and the degree of explicit reflection (i.e., awareness and anticipation) on the dynamic process perspective. Moreover, two main approaches are revealed: participatory modeling, covering diverse approaches, and companion modeling. It becomes apparent that the degree of reflection on the participatory process itself is not always explicit and clearly visible in the descriptions of the modeling projects. Thus, the use of common protocols or templates is discussed to facilitate project planning, as well as the publication of project results. A generic template may help, not in providing details of a project or model development, but in explicitly reflecting on the participatory process. It can serve to systematize the particular project's approach to stakeholder collaboration, and thus quality management.

  19. Retort process modelling for Indian traditional foods.

    Science.gov (United States)

    Gokhale, S V; Lele, S S

    2014-11-01

    Indian traditional staple and snack food is typically a heterogeneous recipe that incorporates varieties of vegetables, lentils and other ingredients. Modelling the retorting process of multilayer pouch-packed Indian food was achieved using a lumped-parameter approach. A unified model is proposed to estimate the cold-point temperature. Initial process conditions, retort temperature and % solid content were the independent variables with significant effects. A model was developed using a combination of vegetable solids and water, which was then validated using four traditional Indian vegetarian products: Pulav (steamed rice with vegetables), Sambar (south Indian style curry containing mixed vegetables and lentils), Gajar Halawa (carrot based sweet product) and Upama (wheat based snack product). The predicted and experimental temperature profiles matched within ±10 % error, which is good agreement considering the food was a multi-component system. Thus the model will be useful as a tool to reduce the number of trials required to optimize retorting of various Indian traditional vegetarian foods.
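
    As an illustration of what a lumped-parameter cold-point model looks like, the sketch below relaxes the cold-point temperature toward the retort temperature with a first-order time constant that grows with % solids; the time constant and the solids correction are illustrative assumptions, not the paper's fitted model.

    ```python
    # A minimal sketch of a lumped-parameter cold-point prediction: the
    # cold-point temperature relaxes toward the retort temperature with a
    # time constant that grows with % solids. All values illustrative.
    import math

    def cold_point(t_min, t_retort=121.0, t0=30.0, tau_base=18.0, solids_pct=40.0):
        """Cold-point temperature (deg C) after t_min minutes of retorting."""
        tau = tau_base * (1.0 + 0.01 * solids_pct)   # slower heating at higher solids
        return t_retort - (t_retort - t0) * math.exp(-t_min / tau)

    for t in (10, 20, 40, 60):
        print(f"t = {t:2d} min -> cold point = {cold_point(t):6.1f} C")
    ```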

  20. Fermentation process tracking through enhanced spectral calibration modeling.

    Science.gov (United States)

    Triadaphillou, Sophia; Martin, Elaine; Montague, Gary; Norden, Alison; Jeffkins, Paul; Stimpson, Sarah

    2007-06-15

    The FDA process analytical technology (PAT) initiative will result in a significant increase in the number of installations of spectroscopic instrumentation. However, to attain the greatest benefit from the data generated, there is a need for calibration procedures that extract the maximum information content. For example, in fermentation processes, the interpretation of the resulting spectra is challenging as a consequence of the large number of wavelengths recorded, the underlying correlation structure that is evident between the wavelengths and the impact of the measurement environment. Approaches to the development of calibration models have been based on the application of partial least squares (PLS) either to the full spectral signature or to a subset of wavelengths. This paper presents a new approach to calibration modeling that combines a wavelength selection procedure, spectral window selection (SWS), where windows of wavelengths are automatically selected and subsequently used as the basis of the calibration model. However, due to the non-uniqueness of the windows selected when the algorithm is executed repeatedly, multiple models are constructed and then combined using stacking, thereby increasing the robustness of the final calibration model. The methodology is applied to data generated during the monitoring of broth concentrations in an industrial fermentation process from on-line near-infrared (NIR) and mid-infrared (MIR) spectrometers. It is shown that the proposed calibration modeling procedure outperforms traditional calibration procedures, as well as enabling the identification of the critical regions of the spectra with regard to the fermentation process.
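
    The window-selection-plus-stacking idea can be sketched as below: PLS models are fitted on randomly chosen wavelength windows and their predictions averaged as a simple equal-weight stand-in for the paper's SWS algorithm and stacking weights; the data are synthetic, not fermentation spectra.

    ```python
    # A sketch of window selection plus stacking: PLS models fitted on
    # random wavelength windows, predictions averaged. Synthetic data only.
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(1)
    n_samples, n_wl = 60, 400
    X = rng.normal(size=(n_samples, n_wl))
    y = X[:, 100:120].sum(axis=1) + 0.1 * rng.normal(size=n_samples)  # informative band

    windows = [(start, start + 40) for start in rng.integers(0, n_wl - 40, size=8)]
    models = [(lo, hi, PLSRegression(n_components=3).fit(X[:, lo:hi], y))
              for lo, hi in windows]

    def stacked_predict(x_new):
        preds = [m.predict(x_new[:, lo:hi]).ravel() for lo, hi, m in models]
        return np.mean(preds, axis=0)   # equal-weight stacking for simplicity

    print("stacked prediction for first 3 samples:", stacked_predict(X[:3]).round(2))
    print("reference values:                      ", y[:3].round(2))
    ```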